Search Results

Search found 40294 results on 1612 pages for 'dot files'.

Page 115 of 1612

  • Download web server structure with empty files

    - by golimar
    I want to make a mirror of a web server, but downloading the actual files would take too long. So I thought of fetching just the directory and file structure, and downloading an individual file only when I need its contents. I have tried wget --spider URL, and in a short time it created the directory structure on my local disk, with no files. But I've checked all of wget's and curl's switches and there is nothing quite like what I need. Can this be done with wget, curl or any other tool?
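    One way to get there is to let wget crawl the site in spider mode, log the URLs it finds, and then recreate the tree locally as empty placeholder files. A rough sketch only (it assumes GNU wget and uses example.com as a stand-in for the real host):

        # Crawl without downloading and collect the discovered URLs (wget logs to stderr).
        wget --spider -r --no-parent http://example.com/ 2>&1 \
            | grep -o 'http://example\.com/[^ ]*' | sort -u > urls.txt

        # Recreate the directory tree locally with zero-byte placeholders.
        while IFS= read -r url; do
            path="${url#http://example.com/}"
            case "$path" in ""|*/) mkdir -p "./$path"; continue;; esac
            mkdir -p "$(dirname "./$path")"
            touch "./$path"
        done < urls.txt

    A single file can later be fetched into the same local tree with wget -x URL.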


  • Excel macro to change the location of .cub files used by pivot tables (to allow .xls files that depend on them to be moved)

    - by Rory
    I often use Excel with pivot tables based on .cub files for OLAP-type analysis. This is great except when you want to move the .xls and realise it internally holds a non-relative reference to the location of the .cub file. How can we cope with this, i.e. make it convenient to move around .xls files that depend on .cub files? The best answer I could come up with is writing a macro that updates the pivot tables' reference to the .cub file location, so I'll pop that in an answer.


  • subversion: how to manage tweaked files

    - by punk4funk
    Our group is considering moving to SVN, but I can't seem to find a way to do the following: I need to make minor tweaks locally to about 20 files in the repository without SVN considering them "changed" and including them in the commit (changes like communication time-outs and logging levels). Ideally I would want to merge the tweaked files with newer versions in the repository, keeping the tweaked local files up to date with committed changes from other users. I can't imagine we're unique in wanting/needing this. Are there best practices around this type of use case? One thing I'm considering is putting all the tweaked files into a branched "tweaked" working copy, then merging my tweaked files into my "official" working copy, and then using a script that compares the "tweaked" and "official" working copies to update my ignore list. The script would also un-ignore and alert me to any files that had both tweaks and other changes that, presumably, needed to be committed to the repository. This seems kinda hacky and I can't imagine there's not a better way.
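    For what it's worth, Subversion's changelists can make this workflow less script-heavy: park the locally tweaked files under a named changelist and keep that set out of your commits. A rough sketch (the file names are placeholders):

        # Group the locally tweaked files under one label.
        svn changelist local-tweaks conf/timeouts.properties conf/logging.xml

        # Review what is in the "tweaked" set at any time.
        svn status --changelist local-tweaks

        # Commit the real work by naming the files (or your own changelist)
        # so the tweaked set stays out of the commit.
        svn commit -m "Feature work" src/Foo.java src/Bar.java

        # The tweaks themselves can be carried around as a patch if needed.
        svn diff --changelist local-tweaks > local-tweaks.patch

    Note that a changelist does not exclude files from a blanket "svn commit ." on its own; the commit has to name the files or a changelist explicitly.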


  • Problem with Xcode and iPhone preference files

    - by CyDummy
    Hi, I created four language files for my app: two for the preferences and two for the strings. I added them to Xcode and it works great on the phone, but the string files are shown as "Localizable.strings (de)" and "Localizable.strings (en)", while I have two files called "Root.plist" under "Settings.bundle" in the groups "de.lproj" and "en.lproj" (see screenshot). When I add the two "Root.plist" files directly instead of "Settings.bundle", I get what I want ("Root.plist (de)" and "Root.plist (en)") but then have no "Settings.bundle" in Xcode. Is there somebody out there who can help me get what I want?


  • iOS - How to iterate through list of files on website directory

    - by svjim
    On iOS, I am looking for the proper way to return the URLs for a list of files that are in a directory:

        /images
            1.jpg
            2.jpg
            3.jpg
            ...

    I know the URL for the directory and know all the files in the directory will be of type jpeg. It is not clear to me how to properly format the NSURLRequest to return the list of files in the images directory. Once the list is returned, I will iterate through it to return the individual images.
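    Worth noting: HTTP itself has no "list directory" verb, so there is nothing to format in the NSURLRequest; the server has to expose the list, either as an auto-generated index page or as a small manifest file published alongside the images. As a quick illustration of the first option only (it assumes the server has directory indexing enabled, and the URL is a placeholder), the index page can simply be scraped for the .jpg links:

        # Fetch the auto-index page and pull out the jpg hrefs.
        curl -s http://example.com/images/ \
            | grep -o 'href="[^"]*\.jpg"' \
            | sed 's/^href="//; s/"$//'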


  • Can I Store MediaWiki Files on the cloud?

    - by user219048
    I recently got a Chromebook, and I've been brainstorming different ways to put MediaWiki on it (with localhost, not a server). One way I've read about online is to go into developer mode to download and set up LAMP. I was wondering, couldn't I store the Apache, MySQL, PHP, and MediaWiki files in the cloud (Google Drive)? And if so, would anything prevent me from accessing my wiki on any other computer's localhost, assuming I could just log into Google Drive to access those files? Might there be any reduced performance when operating from the cloud?


  • How to use vim's syntax files in emacs to color the text

    - by Vijayender
    Are there any snippets to make emacs use the .vim syntax files found in /usr/share/vim/vimfiles/ for coloring text? Many applications, like conky, ship vim syntax files such as "conkyrc.vim", but nothing equivalent for emacs. So is there an easy way to use those files rather than writing a new language mode for each of the files available in the vimfiles directory?


  • Prevent IntelliJ from adding iml files

    - by ripper234
    I'm working with Maven pom files, and I don't wish to put iml files under source control. When I open a project, IntelliJ seems to add some of the iml files it creates to the SVN source control. How can I prevent this? I have "*.iml" in an "svn:ignore" property on the repository root, but it doesn't seem to prevent IntelliJ from adding the .iml files.
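    Note that svn:ignore only stops files from being picked up implicitly; if the IDE schedules them for addition explicitly, the property is bypassed. A client-side global ignore, plus un-scheduling anything already queued, is one way around that. A sketch:

        # Un-schedule any .iml files already queued for addition, keeping them on disk.
        svn status | awk '$1 == "A" && /\.iml$/ {print $2}' | xargs -r svn revert

        # Add *.iml to the client-wide ignore list in ~/.subversion/config:
        #   [miscellany]
        #   global-ignores = *.iml

    IntelliJ also keeps its own ignored-files list in its version control settings, which is worth checking so it stops offering the files in the first place.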


  • Search desktop files using a list of keywords stored in a text file

    - by Tod1d
    I have a list of 1285 keywords (database object names) that I have compiled into a TXT file; one keyword per line. I would like to search a directory of files (most have a .aspx or .cs extension) using this list of keywords. My main goal is find out which of the 1285 database objects are being referenced in these files. Can anyone recommend a tool that could accomplish this? Otherwise, I will just create my own. Thanks.
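    This is close to a one-liner with grep, since -f reads the search strings from a file. A sketch (assuming the keyword file is keywords.txt and the sources live under ./src):

        # Which source files mention any of the database objects?
        grep -rlF -f keywords.txt --include='*.aspx' --include='*.cs' ./src

        # Which of the 1285 objects are actually referenced somewhere?
        while IFS= read -r obj; do
            grep -rqF --include='*.aspx' --include='*.cs' "$obj" ./src && echo "$obj"
        done < keywords.txt > referenced-objects.txt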


  • Editing PHP files remotely on a CentOS server

    - by Alex2012
    I have an intranet web server (CentOS 6, Apache, PHP) to which I would like to give a developer access. He will connect by remote desktop from Windows 7 to Ubuntu 12.04, and from there by SSH to the /var/www/html folder, where he has to create and edit the files. This solution was chosen because:
    - I could not make a remote desktop connection from Windows to CentOS
    - the web developer needs an editor for PHP files and is not allowed to install software on the Windows 7 machine
    - it is more of a test solution (we are all learning to use Linux)
    When the developer is connected from Ubuntu to CentOS by SSH (SFTP), he can save his changes only if the account used to connect has ownership of that folder on CentOS. Can you please tell me how to grant all the required rights? I have tried different solutions found on the Internet, but without much success. Is there another way to connect to the CentOS server?
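    A common way to grant this without giving the developer root is a shared group on /var/www/html with the setgid bit, so new files inherit the group. A rough sketch, run as root on the CentOS box ("devuser" is a placeholder account name):

        groupadd webdev                      # shared group for web content
        usermod -aG webdev devuser           # put the developer's SSH account in it
        chgrp -R webdev /var/www/html        # hand the tree to the group
        chmod -R g+rw /var/www/html          # group members can read and write
        find /var/www/html -type d -exec chmod g+ws {} +   # new files inherit the group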


  • Find files that are completely commented out with /* */

    - by Dave
    Similar to this question, I would like to find all commented-out files, but in my case /* */ is a possibility. Apparently, when you write changes to a database project, dropped objects are only commented out rather than having their files deleted. I would like to remove all of these commented-out files from the project. Is it possible to find all files which start with /* and end with */?
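    A quick heuristic is to look at the first and last non-blank lines of each file: if the first starts with /* and the last ends with */, the file is very likely commented out wholesale. A shell sketch (it assumes the dropped objects are .sql scripts under the current directory; adjust the pattern as needed):

        find . -type f -name '*.sql' -print0 | while IFS= read -r -d '' f; do
            first=$(grep -m1 -v '^[[:space:]]*$' "$f" | sed 's/^[[:space:]]*//')   # first non-blank line
            last=$(grep -v '^[[:space:]]*$' "$f" | tail -n1 | sed 's/[[:space:]]*$//') # last non-blank line
            case "$first" in
                "/*"*) case "$last" in *"*/") echo "$f" ;; esac ;;
            esac
        done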


  • Dedicated hard disk for SE dbname.dbs files & dedicated ramdisk for /tmp files.

    - by Frank Computer
    INFORMIX-SE 7.2: I would like to dedicate a hard disk exclusively to my dbname.dbs directory, which holds all the .dat and .idx files, and create a ramdisk for my /tmp temporary files in order to improve performance. I would also like to strip the OS of any unnecessary files and processes to minimize overhead for my dedicated application. Is this a good idea, and are there any roadmaps for accomplishing this?
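    On Linux the mounting half of this is mostly an /etc/fstab exercise: mount the dedicated disk where the dbname.dbs directory will live and put /tmp on tmpfs. A sketch of the entries only (device name, size and mount point are placeholders, and whether SE actually benefits is something to measure):

        # /etc/fstab -- illustrative entries
        /dev/sdb1  /opt/informix/data  ext4   noatime              0 2   # dedicated disk for dbname.dbs
        tmpfs      /tmp                tmpfs  size=512m,mode=1777  0 0   # RAM-backed temporary space

        # apply without rebooting, then move dbname.dbs onto the dedicated mount
        mount -a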


  • minifying patched javascript files

    - by Stacia
    I'm writing a Rails app and I've partially integrated this nice little patch to the inline Ajax editor: http://inplacericheditor.box.re/ The problem is that on that page I have tinymce, prototype and scriptaculous included, and in Firefox at least there's a big lag when all this stuff loads. I was hoping to fix it by compressing the files, so I checked out a plugin for Rails called Smurf. It seemed to do what it was supposed to do nicely, but it choked on the little patch files that are included with the Ajax editor. The patch files look like this: Object.extend(Ajax.InPlaceEditor.prototype, { handleAJAXFailure: function(transport) ... Alternatively, should I just be caching them instead of worrying about minifying them? I know I'm running in development and that Apache would maybe handle serving the js files differently. It just seems like a lot of things to serve on one page.
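    If the Rails plugin chokes on the patch snippets, one workaround is to minify ahead of time with a standalone tool and serve the pre-built file, so the plugin never touches the patch. A sketch using the uglifyjs command-line tool (tool choice and file paths are only an example):

        # Concatenate the libraries and the patch, then minify the result once.
        cat public/javascripts/prototype.js \
            public/javascripts/scriptaculous.js \
            public/javascripts/inplace_richeditor_patch.js \
            > public/javascripts/all.js

        uglifyjs public/javascripts/all.js -c -m -o public/javascripts/all.min.js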


  • List files recursively and sort by modification time

    - by Problemaniac
    How do I list all files under a directory recursively and sort the output by modification time? I normally use ls -lhtc but it doesn't find all files recursively. I am using Linux and Mac. ls -l on Mac OS X can give

        -rw-r--r--  1 fsr  user  1928 Mar  1  2011 foo.c
        -rwx------  1 fsr  user  3509 Feb 25 14:34 bar.c

    where the date column isn't consistent or aligned, so a solution has to take this into account. A partial solution, stat -f "%m%t%Sm %N" ./* | sort -rn | head -3 | cut -f2-, works, but not recursively.
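    One way around the inconsistent date column is to print an epoch timestamp next to each path and sort on that. A sketch for both platforms (GNU find on Linux, BSD stat on Mac OS X), newest first:

        # Linux (GNU find)
        find . -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM %p\n' | sort -rn | cut -d' ' -f2-

        # Mac OS X (BSD stat)
        find . -type f -exec stat -f '%m %Sm %N' -t '%Y-%m-%d %H:%M' {} + | sort -rn | cut -d' ' -f2-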


  • Subversion: Adding files to the project

    - by Ran
    Hi, I am using library xyz, whose files live in folder xyz, and I want to update those files (e.g. an upgrade to a new version). Can I just copy the new xyz folder into my project using the file browser? The folder contains both files and directories. /Subversion noob
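    Copying the new release over the working copy is fine as long as Subversion is then told about the differences; it will not notice brand-new files on its own. A minimal sketch (paths are placeholders):

        # Overwrite the working copy's xyz/ with the new release
        # (copy *into* it so the existing .svn metadata is preserved).
        cp -R /path/to/new-xyz/. xyz/

        # Schedule any files the new release added.
        svn status xyz | awk '$1 == "?" {print $2}' | xargs -r svn add

        # Files the new release no longer ships have to be removed explicitly:
        #   svn rm xyz/obsolete-file

        svn commit -m "Upgrade xyz library to the new release" xyz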


  • Zip multiple database PDF blob files

    - by Michael
    I have a database table that contains numerous PDF blob files. I am attempting to combine all of the files into a single ZIP file that I can download and then print. Please help!

        <?php
        include '../config.php';
        include '../connect.php';

        $session = mysql_real_escape_string($_GET['session']);

        $query = "SELECT $tbl_uploads.username, $tbl_uploads.description,
                         $tbl_uploads.type, $tbl_uploads.size, $tbl_uploads.content
                  FROM $tbl_uploads
                  LEFT JOIN $tbl_members ON $tbl_uploads.username = $tbl_members.username
                  WHERE $tbl_members.session = '$session'";
        $result = mysql_query($query) or die('Error, query failed');

        // The PDFs only exist as blobs in the database, never on disk,
        // so they go into the archive with addFromString() rather than addFile().
        $zipname = 'file.zip';
        $zip = new ZipArchive;
        $zip->open($zipname, ZipArchive::CREATE | ZipArchive::OVERWRITE);

        while (list($username, $description, $type, $size, $content) = mysql_fetch_array($result)) {
            $zip->addFromString("$username-$description.pdf", $content);
        }
        $zip->close();

        header('Content-Type: application/zip');
        header('Content-Disposition: attachment; filename="' . $zipname . '"');
        header('Content-Length: ' . filesize($zipname));
        readfile($zipname);
        exit();
        ?>


  • Perl, FastCGI and writing uploaded files

    - by ibogdanov
    My upload function looks like:

        sub Upload_File {
            my ($file, $mime, $description) = @_;
            my $file_name = param('filename');
            my $data;

            $file = UnTaint($file);

            if ($mime =~ /text/) {
                sysopen(VAULT, "$path/$file", O_RDWR | O_EXCL | O_CREAT | O_TEXT)
                    or die "couldn't create $file for R/W: $!\n";
            } else {
                sysopen(VAULT, "$path/$file", O_RDWR | O_EXCL | O_CREAT | O_BINARY)
                    or die "couldn't create $file for R/W: $!\n";
            }
            my $upfh = \*VAULT;

            flock $upfh, 2;
            seek $upfh, 0, 0;
            select((select($upfh), $| = 1)[0]);

            # The upload handle also has to be in binary mode for non-text
            # uploads, otherwise newline translation mangles the data.
            binmode $file_name unless $mime =~ /text/;

            # Write only the number of bytes actually read on each pass.
            while (my $read = sysread($file_name, $data, 8192)) {
                syswrite($upfh, $data, $read)
                    or die "couldn't write $upfh: $!\n";
            }
            close $upfh;
        }

    When I use read and print in the FastCGI upload script, files are uploaded with corruption (including simple text files) because Perl uses buffered I/O. But when I use sysread and syswrite, i.e. unbuffered I/O, I get good text files, while binary files are still corrupted.


  • old .pst files on a networked drive still being accessed

    - by icecurtain
    Real quick question: a few months back I set up Exchange email accounts using AD on a domain, and all is fine (email, address books, calendars, etc.), but I left the old .pst files on a networked drive. For some reason the Outlook clients are still accessing the old .pst files, though not for email, address books or calendars. As a test I have removed one such account with no adverse effect. Can anyone tell me why Outlook is still accessing the old .pst files? I was going to delete all the old .pst files, until I noticed they all had today's timestamp on them.


  • Share files and catalog them across a server

    - by Ultan
    Hi all, here is what I am looking for: I use a number of software packages when creating online learning courses. I have a huge number of courses, and they use text, audio files, images and video files. I am looking for something that I can install on our server to upload the content, then index, tag and categorise it and enter a description, so that when a screenshot has to be changed, for example, I can find all the places where I used the image without having to check every file that I have created over the last year. An open source solution would be great, or something that isn't too expensive. Thank you for any suggestions you can make.


  • SFTP sending files between laptops on Ubuntu

    - by twigg
    I want to transfer files between two Ubuntu systems using SFTP. I have got it set up and I can connect to the other laptop, ping it, and see its file list using sftp> dir; I can see the files on the other system. But when I call get filename.deb it says

        Fetching /home/user/filename.deb to filename.deb
        0%   0     0.0KB/s   --:-- ETA

    and then drops back to the sftp command prompt without transferring anything. Have I missed something?
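    If sftp's get keeps stalling at 0%, it can be worth ruling out the transfer tool itself: scp or rsync over the same SSH connection copies the file with one command and tends to give more useful error messages. A quick sketch (host name and path are placeholders):

        # Straight copy over SSH
        scp user@other-laptop:/home/user/filename.deb .

        # Or rsync, which shows progress and can resume a partial transfer
        rsync -avP user@other-laptop:/home/user/filename.deb .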

