Search Results

Search found 40294 results on 1612 pages for 'dot files'.


  • Project organization in Perforce

    - by Chupa
    Hello. I created several web applications that share the same static files (CSS, JS, images). When I used SVN for version control, I added the files to each project through an external repository (svn:externals), for example:

      - Project_1
          - webapp
              - static (external, pointing to the static files' repository)
      - Project_2
          - webapp
              - static (external, pointing to the static files' repository)

    This let me link to them from web pages with paths like /static/... But our company has now moved to Perforce. How can I keep the same structure? We also use Maven; one idea is to package these files as a JAR and use it as a dependency, but then my editor (IDEA) no longer recognizes that the dependency contains JS scripts and stylesheets, and I have to repackage and redeploy the JAR for every minor change. What is the correct way to handle this with Maven?
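
    One approach worth sketching here (it is not part of the original question, and the depot and workspace paths are invented) is to map a shared static depot path into each project's workspace through the Perforce client view, which plays roughly the role svn:externals did:

        # Hypothetical per-project client spec; //depot/static/... holds the shared files
        Client: project_1-client
        Root:   c:\work\project_1
        View:
            //depot/projects/project_1/...  //project_1-client/...
            //depot/static/...              //project_1-client/webapp/static/...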

    Read the article

  • How to convert a CHM file into a single HTML file?

    - by ruslan
    I have tried many different CHM-to-HTML utilities, but I am having a difficult time finding one that is able to produce a single HTML file. I can decompile a CHM file using hh.exe, but I don't know how to easily merge the resulting files into a single HTML file, all while preserving the correct order of pages. Is there a free tool which can do this? If not, how can I merge the HTML files in order?

    Read the article

  • Secure, efficient, version-preserving, filename-hiding backup implemented in this way?

    - by barrycarter
    I tried writing a "perfect" backup program (below), but ran into problems (also below). Is there an efficient/working version of this?: Assumptions: you're backing up from 'local', which you own and has limited disk space to 'remote', which has infinite disk space and belongs to someone else, so you need encryption. Network bandwidth is finite. 'local' keeps a db of backed-up files w/ this data for each file: filename, including full path file's last modified time (mtime) sha1sum of file's unencrypted contents sha1sum of file's encrypted contents Given a list of files to backup (some perhaps already backed up), the program runs 'find' and gets the full path/mtime for each file (this is fairly efficient; conversely, computing the sha1sum of each file would NOT be efficient) The program discards files whose filename and mtime are in 'local' db. The program now computes the sha1sum of the (unencrypted contents of each remaining file. If the sha1sum matches one in 'local' db, we create a special entry in 'local' db that points this file/mtime to the file/mtime of the existing entry. Effectively, we're saying "we have a backup of this file's contents, but under another filename, so no need to back it up again". For each remaining file, we encrypt the file, take the sha1sum of the encrypted file's contents, rsync the file to its sha1sum. Example: if the file's encrypted sha1sum was da39a3ee5e6b4b0d3255bfef95601890afd80709, we'd rsync it to /some/path/da/39/a3/da39a3ee5e6b4b0d3255bfef95601890afd80709 on 'remote'. Once the step above succeeds, we add the file to the 'local' db. Note that we efficiently avoid computing sha1sums and encrypting unless absolutely necessary. Note: I don't specify encryption method: this would be user's choice. The problems: We must encrypt and backup 'local' db regularly. However, 'local' db grows quickly and rsync'ing encrypted files is inefficient, since a small change in 'local' db means a big change in the encrypted version of 'local' db. We create a file on 'remote' for each file on 'local', which is ugly and excessive. We query 'local' db frequently. Even w/ indexes, these queries are slow, since we're often making one query for each file. Would be nice to speed this up by batching queries or something. Probably other problems that I've now forgotten.

    Read the article

  • File copying software to do this kind of work... in Windows 7 32 bit

    - by Senthil
    I need software (Windows 7, 32-bit) to help me with this process:

      - My documents, music, video clips, movies and pictures are on my hard disk. They are not scattered around the system; everything is inside C:\Senthil\.
      - At the end of every week, I want to plug in an external hard disk and run a program that makes sure whatever is inside C:\Senthil\ is also present on the external disk. Files deleted from C:\Senthil\ should be deleted there, new files should be copied, and so on. At the end of the process, every bit inside the source folder on my internal disk should also be on my external disk.

    A couple of important requirements and points:

      - I do NOT need multiple or historic versions of my files. I only want the latest copy to be present in my "backup".
      - Incremental backup makes sense: if files were not touched since the last backup, they need not be copied.
      - The folder currently runs into GBs and in a year or two will go into TBs, but I will make sure the external HDD is equal to or bigger than my source folder.
      - I do not want it to run automatically, because when I accidentally delete a file in my source, it would also delete the one in the backup (I know this is why we have versioning facilities). I just want to run it manually so that I am in control of when the backup is made and what is backed up, and I should be able to pick something from the backup and restore it to the source folder in that situation.

    Is there any software that will let me do exactly this? I don't want any other "smart" feature of the software to interfere with this process. I know what I want, and the software can keep its smartness to itself :D

    The main reason I am asking is that I am a software developer and could write this myself, but I am a little constrained by time at the moment and want to know whether an existing program can do it. Kindly don't worry about earthquakes, fires or snowstorms and bring up the "in case of a natural disaster your backup will also be in the damage zone and will be lost" argument, because: I will have bigger things to worry about than my holiday memories; I don't think I will digitally store any life-ruining documents; and this backup is only to avoid the inconvenience of obtaining a new copy of stuff I already have, not to protect it against the end of the world. I am more worried about power surges frying my system, hard disk failure, children who merrily hit Delete, teens who hit Shift + Delete, or myself getting a little careless at times!

    In short: is there a file/folder syncing program that listens to what I say and doesn't try to act smart? Please forgive me if I sound arrogant :)
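
    For what it's worth, a mirror like this can be sketched with robocopy, which ships with Windows 7; the external drive letter E: is an assumption:

        rem Mirror C:\Senthil onto the external disk. /MIR deletes files that no
        rem longer exist in the source, so run it only when the source is known good.
        robocopy C:\Senthil E:\Senthil /MIR /R:1 /W:1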

    Read the article

  • Get CruiseControl to talk to GitHub with the correct public key

    - by Danny Lister
    Hi all, has anybody installed Git and CruiseControl.NET and got CruiseControl to pull from GitHub on a Windows 2003 server? I keep getting public key errors (access denied), which is good, I suppose, as it confirms Git is talking to GitHub. What is not good is that I do not know where to install the RSA keys so that they will be picked up by the running process (Git in the context of CC.NET). Any help would save me a lot of hair! I have tried installing the keys into c:\Program Files\Git\.ssh, since running Git Bash and cd ~ takes me to c:\Program Files\Git. The current error from CC.NET is:

        Error Message: ThoughtWorks.CruiseControl.Core.CruiseControlException: Source control operation failed: Permission denied (publickey). fatal: The remote end hung up unexpectedly. Process command: C:\Program Files\Git\bin\git.exe fetch origin

    Thanks in advance
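
    A common culprit in this situation (an assumption here, not something the poster confirmed) is that the CC.NET service runs under an account with no usable HOME, so ssh never looks in the expected .ssh directory. One hedged fix is to give that account a HOME and put the key pair there; the path below is an example, and on Server 2003 setx comes from the Support Tools:

        rem Point HOME at a directory the service account can read, then place
        rem id_rsa, id_rsa.pub and known_hosts in %HOME%\.ssh
        setx HOME "C:\ccnet-home" /M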

    Read the article

  • Extract 100 GB of music from an iPod to hard disk

    - by user10826
    Hi, I have a 5th-generation iPod with 110 GB of music and a MacBook with iTunes. I would like to copy all the music files from my iPod to my hard disk and then select only some of them to add to the iTunes library, which will sync with the iPod. I tried Expod and similar programs, but they hang with more than 50 GB. Do you know any other approach? Thanks

    Read the article

  • Windows Server 2008 Software RAID 5 - Data integrity issues

    - by Fopedush
    I've got a server running Windows Server 2008 R2 with a (Windows-native) software RAID-5 array. The array consists of 7x 1 TB Western Digital RE3 and RE4 drives. I have offline backups of this array.

    The problem is this: I noticed a few days ago, after copying a large file to the disk, that there was an integrity issue with that file. It was a ~12 GB file that I had downloaded via uTorrent. After moving it to the RAID array, I used uTorrent to relocate the download location and performed a re-check so I could seed it from there. The re-check found that only 6308/6310 chunks of the copied file were intact.

    My next step was to write a quick PowerShell script that copies files to the array while computing a SHA1 hash of the original and the resulting file and comparing them. Smaller files (100-1000 MB) copied over just fine. When I started copying larger data (~15 GB), I found that the hash check failed about two thirds of the time. The corrupt files had very, very small inconsistencies: less than .01%. I further eliminated the possibility of networking or client issues by placing this large file on the C:\ drive of the server and copying it repeatedly from there to the array, with similar results. Copying the data via Explorer, PowerShell, or the standard Windows command prompt yields the same results. None of the copies fail or report any problems, and the RAID array itself is listed as healthy in Disk Management.

    After a few experiments, I shut down the server and ran memtest overnight; no errors were detected. A basic run of chkdsk found no problems, but I did not use the /R flag, as I was unsure how that might affect a software RAID-5 volume. I next ran CrystalDiskInfo to check the SMART data on the drives, but found that it only detected 5 out of the 7 disks in the array; I have no idea why. Nevertheless, it shows the following "caution" flags on a single one of the drives:

        05  199  199  140  000000000001  Reallocated Sectors Count
        C5  200  200  __0  000000000001  Current Pending Sector Count

    This is a little bit alarming, but I don't really know what to do with the information. I hardly feel like one reallocated sector could be causing this.

    At this point, I'm looking for guidance on what to do next. I need to determine the cause of this issue, but I'm hesitant to run chkdsk /R or any bootable disk health checkers because I'm afraid they might break the array. I've considered triggering a re-sync of the array, but I'm not actually sure how to do that without doing something silly like manually dropping a disk and then restoring it. Any advice that could help me ferret out the precise cause of this issue would be greatly appreciated.
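
    As a side note, the hash comparison described above can also be reproduced without PowerShell using certutil, which is built into Windows Server 2008 R2 (file names are placeholders):

        rem The two SHA1 sums should match if the copy on the array is intact.
        certutil -hashfile C:\testdata\big.iso SHA1
        certutil -hashfile D:\array\big.iso SHA1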

    Read the article

  • amavisd Net server pid file already exists after system crash and startup

    - by Simiyu
    Hi all, whenever I have an unclean shutdown, which is usually due to a power failure, I get problems with amavisd starting up. The error "amavisd Net server pid_file already exists for running process" appears when I start it in debug mode, so I always have to delete the amavisd.pid and amavisd.lock files manually before it will start. Is there a way to stop this from happening, or a way to delete the files during reboot after an unclean shutdown? Thanks
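
    One hedged workaround is to clear stale pid/lock files at boot, before amavisd starts; the paths and process name below are typical defaults and may differ on this system:

        # In the start) branch of the amavisd init script (or an earlier boot script):
        # remove leftovers only when no amavisd process is actually running.
        if ! pgrep -x amavisd >/dev/null; then
            rm -f /var/amavis/amavisd.pid /var/amavis/amavisd.lock
        fi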

    Read the article

  • What does this httpd directive do?

    - by alsciende
    Hello, I stumbled upon an httpd.conf block that I can't manage to understand:

        <Files ~ "^\.ht">
            Order allow,deny
            Deny from all
            Satisfy All
        </Files>

    According to the docs, I would say that Satisfy doesn't have any effect since there is no Allow directive. Am I wrong? What do you think this block does?

    Read the article

  • Using Lucene to query file properties in Windows

    - by sneha
    Hi all, I am planning to use Apache Lucene in one of my projects. I want to index files based on their file properties (I won't be indexing the contents), and I want Lucene to query the index so that I can quickly find a list of files based on those properties. E.g.: give me all files with an access time greater than 10/10/2005 and less than 10/04/2010 that were created by james. Can I use Lucene for this kind of project, or am I better off using Windows Search (whose footprint is heavy, almost 5 MB :( ), which I would also have to bundle as part of my application, which seems tough? Can you please suggest whether there are any better alternatives?

    Read the article

  • Other users deleting my folders

    - by Abhiram
    Hi, I'm working on a Solaris server machine. More than 10 people are logged in to the same server. My installation takes one hour to finish, but after it is over, some other user is deleting some of the directories and files in my installation folder. Is there any way to know who removed the files? Or at least, can I see the command history of other users?

    Read the article

  • Where can I get precompiled mod_perl, mod_python for Apache on Win64?

    - by Soumya92
    I have managed to set up pure 64-bit Apache, PHP, MySQL, and 64-bit distributions of Perl and Python. However, I cannot get Apache to automatically execute .pl files with Perl and .py files with Python. Looking around points to mod_perl and mod_python for Apache, which unfortunately fail to build. Is there any precompiled mod_perl or mod_python for Win64? Or is there any other way of getting .pl and .py files to work with Apache?
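
    As an aside, one way to run the scripts without mod_perl or mod_python is plain CGI, which only needs the interpreters already installed; a minimal, hypothetical httpd.conf fragment (directory path invented):

        # Hand .pl/.py to the cgi-script handler in one directory; each script's
        # shebang line (or ScriptInterpreterSource Registry) selects the interpreter.
        <Directory "C:/Apache/htdocs/cgi">
            Options +ExecCGI
            AddHandler cgi-script .pl .py
        </Directory>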

    Read the article

  • CD Burned in XP isn't readable in Vista

    - by RickMeasham
    I burned a CD on XP using the built-in burning software, and I can read the CD on that machine; but when I insert it into my Vista machine, I can't read the files. It shows the correct volume label and the correct free space, but I can't access the actual files. Am I missing something obvious? (Both systems are fully up to date.)

    Read the article

  • Implementing HTTP Live Streaming video from my webserver to iPhone - will I get rejected for bandwidth?

    - by yujean
    Apache webserver setup: added

        AddType application/x-mpegURL .m3u8
        AddType video/MP2T .ts

    to the httpd.conf file.

    Movie file preparation: I have 3 movie files (9 MB - 25 MB each). I used QuickTime to convert the movies into iPhone format, then used mediafilesegmenter to convert each .m4v into 10-second segments of .ts files with an accompanying .m3u8 file, and placed these in a folder on the webserver.

    iPhone app implementation: I created a UIWebView whose URL points to http://71.191.59.68/~yujean/stream.html. The simulator accesses the site and streams the movie files just fine.

    Question: Will I still get rejected by Apple for bandwidth issues over the 3G and/or EDGE network? Do I need to somehow check which network the end user is on first, and then provide a different movie accordingly? If so, how do I do that ...? Thank you in advance, Eugene
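
    For reference, the usual way to address Apple's cellular-bandwidth guideline is not to detect the network yourself but to publish a variant playlist listing streams at several bitrates and let the player choose; a hypothetical example (URIs and bandwidth figures are invented):

        #EXTM3U
        #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=150000
        low/prog_index.m3u8
        #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=600000
        high/prog_index.m3u8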

    Read the article

  • Most efficient approach for multilingual PHP website

    - by alexteg
    I am working on a large multilingual website and am considering different approaches for making it multilingual. The possible alternatives I can think of are:

      - the gettext functions, with generation of .po files;
      - one MySQL table with the translations and a unique string ID for each text;
      - PHP files with arrays containing the different translations, keyed by unique string IDs.

    As far as I understand, the gettext functions should be the most efficient, but my requirement is that it must be possible to change a text string in the original reference language (English) without the other translations of that string automatically reverting to English just because a couple of words changed. Is this possible with gettext? What is the least resource-demanding solution? Are the gettext functions and PHP files with arrays more or less equally resource-demanding? Any other suggestions for more efficient solutions?
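
    To make the third alternative concrete, a minimal sketch of ID-keyed translation arrays (file names and string IDs are invented for illustration):

        <?php
        // lang/en.php - one file per language, each returning the same keys
        return array(
            'greeting.hello'   => 'Hello',
            'checkout.confirm' => 'Confirm your order',
        );

        // a loader could then do:
        //   $strings = include "lang/{$locale}.php";
        //   echo $strings['greeting.hello'];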

    Read the article

  • Quarter-turn PDF documents

    - by Rogier
    We have created thousands of PDF files that are printed as labels on a special label printer. Printing these labels is fine, but some of the label paper is quarter-turned, so those PDFs print incorrectly. It is possible to rotate the page before printing, but is it possible to rotate a PDF file and save it again as a PDF? And since there are thousands of PDF files, can this be done as a batch job?
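
    One possible approach (not from the question) is pdftk, which can rotate every page in a batch loop; note that pdftk releases before 1.45 spell the rotation as 1-endE instead of 1-endeast:

        # Rotate every page of every PDF in the current directory 90 degrees
        # clockwise, writing the results to ./rotated/.
        mkdir -p rotated
        for f in *.pdf; do
            pdftk "$f" cat 1-endeast output "rotated/$f"
        done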

    Read the article

  • Associating files without extensions in Windows

    - by lego 69
    Hello everybody, I have some files without an extension (in my case, without .txt). How can I associate these files with WordPad without using extensions? I'm very tired of having to choose it from the "Open with" list every time. Thanks in advance
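
    One trick that is sometimes suggested for this (unverified here, so treat the whole snippet as an assumption) is to give extensionless files a ProgID in the registry whose open command launches WordPad; a .reg sketch:

        Windows Registry Editor Version 5.00

        ; Hypothetical: map files that have no extension to a ProgID opened by WordPad.
        [HKEY_CLASSES_ROOT\.]
        @="NoExtFile"

        [HKEY_CLASSES_ROOT\NoExtFile\shell\open\command]
        @="\"C:\\Program Files\\Windows NT\\Accessories\\wordpad.exe\" \"%1\""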

    Read the article

  • Performance impact of rewrite conditions?

    - by makeee
    My web framework (CakePHP) uses the following rewrite conditions and rule:

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ index.php?url=$1 [QSA,L]

    I serve up a lot of image files (~10 a second). I'm wondering if it would improve performance to have a rewrite rule that exempts requests for files in my images directory from even trying those rewrite conditions (checking whether the file exists). My traffic is constantly fluctuating, so this would be hard to benchmark, which is why I thought I'd ask here. If that would be beneficial, how might I exclude files in the /images directory from those conditions and the rewrite rule?
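
    For what it's worth, a short-circuit rule placed before the existing block stops mod_rewrite from evaluating the file-exists conditions for image requests; a sketch, assuming the directory is /images under the document root:

        # Pass /images/ requests through untouched; [L] ends rewrite processing for them.
        RewriteRule ^images/ - [L]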

    Read the article

  • Suppress "running out of disk space" Message (per drive) on Windows Server 2003

    - by Shoeless
    We have a database server with separate drives for the OS, the various data files, and the transaction log. Our transaction log spills over onto other volumes as well; this is expected behavior. The problem is that we constantly get pop-ups saying that the transaction log drive is running out of space (and that we can free space by deleting old or unnecessary files). Is there some way to prevent this message from popping up for this particular drive?
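
    One related setting, offered only as a hedge because it is all-or-nothing rather than per-drive, is the Explorer policy that disables the low-disk-space check entirely:

        Windows Registry Editor Version 5.00

        ; Suppresses Explorer's low disk space warnings for the current user
        ; (global, not per drive).
        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
        "NoLowDiskSpaceChecks"=dword:00000001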

    Read the article

  • Referencing the Views directory in ASP.NET MVC

    - by ooo
    I have some HTML files that are part of a regular website which has been ported over to ASP.NET MVC. In my code I need to read and write these HTML files and load them into a TinyMCE editor. To be able to read and write a file from disk, in the past I used a hard-coded path, but this doesn't seem to work in ASP.NET MVC unless I do something like this:

    Writing:

        string _urlDirectory = @"c:\hosting\MySite\Views\Members\newsletters\test.html";
        System.IO.File.WriteAllText(_urlDirectory, htmlData_);

    Reading:

        string url = @"c:\hosting\MySite\Views\Members\newsletters\test.html";
        var req = WebRequest.Create(url);
        var response = req.GetResponse();
        StreamReader sr = new StreamReader(response.GetResponseStream());
        string htmlData_ = sr.ReadToEnd();

    I am moving my site from one data center to another, and the directory structure is changing. Instead of just changing one hard-coded path to another, I wanted to see if there is a more relative way to reference these files.
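
    A common way to avoid the hard-coded path (a sketch, assuming the files stay in the same Views-relative location) is to resolve an application-relative path at runtime:

        // Inside a controller, Server.MapPath resolves "~/..." against the site's
        // physical root, wherever the application happens to be deployed.
        string path = Server.MapPath("~/Views/Members/newsletters/test.html");
        string htmlData_ = System.IO.File.ReadAllText(path);   // read
        System.IO.File.WriteAllText(path, htmlData_);          // write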

    Read the article

  • Infotips for Word Documents in Windows XP in Network Drives

    - by Knight Samar
    Hi, MS Word 2007 files have a property page for entering details like a summary and title, and these are displayed when you hover over the documents on the Desktop. On my Windows XP SP2 computer, Windows Explorer shows these special properties for such files on the Desktop, but not on network drives. This is a big problem when I have a large collection of Word documents in one folder. How can I display these special properties (infotips) for documents on my network drives? Thanks :)

    Read the article

  • VBScript - Object required for DateLastModified

    - by Kenny Bones
    I don't really know what's wrong here. I'm trying to create a VBScript that checks two folders for their files, compares the DateLastModified attribute of each, and copies the source files to the destination folder if the DateLastModified of the source file is newer than that of the existing one. I have this code:

        Dim strSourceFolder, strDestFolder
        Dim fso, objFolder, colFiles

        strSourceFolder = "c:\users\user\desktop\Source\"
        strDestFolder = "c:\users\user\desktop\Dest\"

        Set fso = CreateObject("Scripting.FileSystemObject")
        Set objFolder = fso.GetFolder(strSourceFolder)
        Set colFiles = objFolder.Files

        For Each objFile In colFiles
            Dim DateModified
            DateModified = objFile.DateLastModified
            ReplaceIfNewer objFile, DateModified, strSourceFolder, strDestFolder
        Next

        Sub ReplaceIfNewer (sourceFile, DateModified, SourceFolder, DestFolder)
            Const OVERWRITE_EXISTING = True
            Dim fso, objFolder, colFiles, sourceFileName, destFileName
            Dim DestDateModified, objDestFile
            Set fso = CreateObject("Scripting.FileSystemObject")
            sourceFileName = fso.GetFileName(sourceFile)
            destFileName = DestFolder & sourceFileName
            If fso.FileExists(destFileName) Then
                objDestFile = fso.GetFile(destFileName)
                DestDateModified = objDestFile.DateLastModified
                msgbox "File last modified: " & DateModified
                msgbox "New file last modified: " & DestDateModified
            End If
        End Sub

    And I get this error: on line 34, char 3, "Object required: 'objDestFile'". But objDestFile IS created?
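
    For readers hitting the same error: in VBScript an object reference must be assigned with Set, so the failing line would normally be written as below (the likely fix, not something stated in the question):

        ' Without Set, VBScript attempts a value assignment and the use of
        ' objDestFile then fails with "Object required".
        Set objDestFile = fso.GetFile(destFileName)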

    Read the article

  • Add directories to the root of a Zip file with Ionic Zip (C#)

    - by Movieboy
    I asked this a few months ago and only received one response, which didn't work. I've been tinkering with it for the past few weeks and I'm still completely lost, so I would appreciate anyone else's input, as I'm all out of ideas. What I'm trying to do is add a list of folders and files, all at the root of my Zip file, using the Ionic Zip library (C#). Here's what I have so far:

        string k = "B:/My Documents/Workspace";

        private void button1_Click(object sender, EventArgs e)
        {
            using (ZipFile zip = new ZipFile())
            {
                // add directory, give it a name
                zip.AddDirectory(k);
                zip.Save("t.zip");
            }
        }

    Now, I want my zip to look like this:

        t.zip
            - random files and folders

    But it looks like this:

        t.zip
            - t (folder)
                - random files and folders

    Any help would be appreciated. Thank you.
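
    For reference, DotNetZip's AddDirectory has an overload that chooses the path used inside the archive; passing an empty string is the usual way to put the directory's contents at the root (a sketch, not verified against the poster's exact setup):

        using (ZipFile zip = new ZipFile())
        {
            // second argument is the path inside the archive; "" means the root
            zip.AddDirectory(k, "");
            zip.Save("t.zip");
        }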

    Read the article

  • hg add -X not working

    - by Lo'oris
    I'd like to hg add excluding files that begin with ._ (or, even better, excluding all hidden files), but it's not working: the -X option is just completely ignored. I tried the following:

        hg add -n -X '._*'
        hg add -n -X '*._*'

    and, just to be sure, also:

        hg add -n -X '*.*'

    Nothing. It's just as if I hadn't passed -X at all. I've tried this with both hg 1.4.3 and hg 1.0.1.
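
    A hedged note for anyone with the same symptom: -X patterns default to glob syntax rooted where they are evaluated, so a pattern like '._*' only matches files at that level; recursive matching usually needs an explicit ** glob, for example:

        # dry run: exclude AppleDouble ._ files at the top level and in any subdirectory
        hg add -n -X "glob:._*" -X "glob:**/._*"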

    Read the article

  • emacsclient: create a frame if a frame does not exist

    - by Idlecool
    I start the Emacs server with emacs --daemon and then open files with emacsclient filename.ext, but the first file has to be opened with emacsclient -c filename.ext in order to create a new frame, which subsequent files can then reuse without the -c flag. I want to automate this: if there is no Emacs frame, emacsclient should create one, otherwise it should use the current frame. How can this be done?
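
    One way to script this (a sketch; the frame-count test relies on the daemon reporting a single internal frame when no client frame exists, which is an assumption) is a small wrapper that asks the server how many frames it has and adds -c only when none are visible:

        #!/bin/sh
        # Open the given file in an existing frame if one exists, otherwise create a frame.
        frames=$(emacsclient -e "(length (frame-list))" 2>/dev/null)
        if [ -z "$frames" ] || [ "$frames" -le 1 ]; then
            exec emacsclient -c -n "$@"
        else
            exec emacsclient -n "$@"
        fi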

    Read the article
