Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.

Page 188 of 1877

  • After deleting log files, Ubuntu server still says there is no space

    - by Mark
    My Ubuntu server has stopped due to a lack of disk space. I deleted some log files which had grown huge very quickly, but df -h still shows I have no space left. When I run du -sh /* I can see that I should have plenty of disk space left after deleting the logs. I ran lsof +L1 and it brought up two files: /var/log/mail.log and /var/log/mail.err. These are two of the logs I had deleted. I restarted apache, postfix and mysql (mysql won't restart because of the lack of disk space, I think), but df -h still shows no space.
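
    A deleted file keeps its disk blocks until every process holding it open closes it, which is why df and du can disagree here. A minimal sketch of how one might reclaim the space without a reboot, assuming the culprits are the processes lsof reports and that truncating their logs is acceptable (the PID/fd values below are illustrative):

        # list deleted-but-still-open files and the processes holding them
        sudo lsof +L1 | grep -i deleted

        # either restart the process that still holds the old log open...
        sudo service postfix restart

        # ...or truncate the still-open file through /proc without a restart
        sudo truncate -s 0 /proc/1234/fd/3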

    Read the article

  • Import data from multiple CSV files to an Excel sheet

    - by Chetan
    I need to import data from 50 similar CSV files into a single Excel sheet. Is there any way to get only selected columns from each file and put them together in one sheet? Structure of my CSV files: a few columns are exactly the same in all the files (I want those in Excel once); then there is one column with the same header in every file but different data, and I want those placed next to each other in the Excel sheet, each named after its CSV file. I do not want any of the remaining columns. In short: read all the CSV files, take the columns common to all of them and put them in the Excel sheet; then take the one column from each file that has the same header but different data and put them one after the other, named from the CSV file names; leave the rest of the columns out; finally write the sheet to an Excel file. Initially I thought it could be done easily, but with my programming skills still at the learning stage it is way too difficult for me. Please help.
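
    A rough command-line sketch of the reshaping step, using GNU coreutils; the file name and column numbers are illustrative and would need adjusting, and the result is a CSV that Excel can open directly rather than a native .xlsx:

        # assume the shared columns are 1-3 and the per-file column is 4
        cut -d, -f1-3 file01.csv > common.cols                  # shared columns, taken once
        for f in *.csv; do
            # per-file column, with its header replaced by the file name
            cut -d, -f4 "$f" | sed "1s/.*/${f%.csv}/" > "${f%.csv}.part"
        done
        paste -d, common.cols *.part > result.csv               # open result.csv in Excel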

    Read the article

  • .htaccess - deny downloading of files

    - by user317005
    I keep several fonts in the directory "/fonts/" on my server, which I then load into my CSS files via @font-face. However, I want to make sure that people cannot download a font simply by going to http://www.domain.com/fonts/fontname.ttf. Can I somehow prevent this and still be able to load the font files from my CSS? I think putting deny from all into the .htaccess file will prevent even the CSS from loading the fonts correctly. I hope this makes sense.

    Read the article

  • Unlock database files when SQL Server is idle

    - by Andy
    In my development/test environment on my laptop, I can't back up the SQL Server database files because the file handles are kept permanently open by SQL Server (VSS doesn't work because the drive is TrueCrypt-encrypted). I was hoping there might be some setting in SQL Server that makes it unlock the data files after a certain period of inactivity and automatically open them again on demand, but I can't find anything. I don't really want to dump the database out every night because it's only a development environment. Apart from stopping SQL Server before I do the backup, is there any other solution?
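
    For what it's worth, SQL Server has a per-database AUTO_CLOSE option that releases the file handles once the last connection closes (not merely on inactivity) and reopens the files on the next connection; a sketch, with illustrative server and database names:

        sqlcmd -S .\SQLEXPRESS -Q "ALTER DATABASE DevDb SET AUTO_CLOSE ON"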

    Read the article

  • Uploading files to a server that has a real-time antivirus scan running

    - by zecougar
    I need to allow users to upload files onto a server that has an antivirus program running with real-time scanning switched on. What would be a good design to ensure that infected files are not uploaded to the server? Questions: would large files be copied onto disk and then immediately scanned, or would they be scanned as they are copied and never allowed to appear on disk if infected? Should I build separate infrastructure around this to specifically invoke a scan on the copied file? That might be an issue if the file is deleted by the real-time scan.

    Read the article

  • About using assembly with C

    - by kristus
    Hi. I've sort of just finished a mandatory task at school, and I'm about to hand it in. But then I came across something that was unfamiliar: header files. What I've got:

        test-program.c  task_header.h  function1.s  function2.s  function3.s  function4.s

    test-program.c:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include "task_header.h"
        ...

    task_header.h:

        extern void function1(...);
        extern void function2(...);
        extern int function3(...);
        extern void function4(...);

    And then I use the command:

        gcc -m32 -o runtest test-program.c function1.s function2.s function3.s function4.s

    Is this a proper way to do it, or can it be modified so that I can just type:

        gcc -m32 -o runtest test-program.c
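
    For what it's worth, listing all the sources on one command line is perfectly normal. A sketch of two ways to shorten it, assuming the four .s files are the only assembly sources in the directory:

        # a shell glob keeps the single-command build but is shorter to type
        gcc -m32 -o runtest test-program.c function*.s

        # or assemble the .s files to objects once, then just relink after C changes
        gcc -m32 -c function*.s                       # produces function1.o ... function4.o
        gcc -m32 -o runtest test-program.c function*.o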

    Read the article

  • Git: submodule init and update from different folder

    - by jmccartie
    I'm trying to write a deployment script that works on a repo in a different path. The --git-dir flag seems to work fine for most commands, but not for submodule operations. Am I missing a path directive?

    Works: git --git-dir=/tmp/repo_path/.git log

    Doesn't work: git --git-dir=/tmp/repo_path/.git submodule init

    Error: No submodule mapping found in .gitmodules for path 'path_to/submodule'

    Many thanks for any help.
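
    A sketch of two workarounds, on the assumption that the submodule machinery needs the work tree and not just the .git directory (the path is the one from the question):

        # run the submodule commands from inside the work tree
        cd /tmp/repo_path && git submodule update --init

        # or, with git 1.8.5 or later, switch directories per command with -C
        git -C /tmp/repo_path submodule update --init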

    Read the article

  • Unable to move or delete files

    - by Erik
    Hi: Just today I got the following error while trying to move/delete several files: "The action can't be completed because the file is open in another program." The file wasn't open, but just in case, I closed all programs. When that failed to let me move or delete the file, I restarted the computer. When that failed too, I came here. Any suggestions? The files can be copied and pasted, but move/delete fails even after multiple restarts.

    Read the article

  • Update an app from another iTunes/computer without losing the Documents folder

    - by mongeta
    Hello, our in-house tech has installed our own app on some of our iPod 3.x devices. Now we have to update that app WITHOUT LOSING the data that has been entered by users. The problem is that the computer which installed those apps and synchronized the devices isn't working now, and when we try to update them from another computer, iTunes insists on deleting ALL apps and app documents from the iPods and adding the new ones. I suppose that this will erase ALL documents from the app, so we will lose the data. What can we do? Thanks, m. P.S. Those devices currently have an Ad Hoc profile.

    Read the article

  • Download web server structure with empty files

    - by golimar
    I want to make a mirror of a web server, but downloading the actual files would take too long. So I thought of having just the directory and file structure, and when I need the actual contents of a file I can download just that file. I have tried wget --spider URL, and in a short time it created the directory structure on my local disk, with no files. But I've checked all of wget's and curl's switches and there is nothing like what I need. Can this be done with wget, curl or any other tool?
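
    A rough sketch of one way to fake it: let wget spider the site, then recreate every URL it reports as an empty placeholder file. The site and the grep pattern are illustrative, and the parsing of wget's log output would likely need tweaking:

        base=http://example.com
        wget --spider -r -nv "$base/" 2>&1 \
          | grep -o "$base/[^ ]*" | sort -u \
          | while read -r url; do
                path=${url#$base/}                 # path relative to the site root
                [ -n "$path" ] || continue
                mkdir -p "$(dirname "$path")"      # recreate the directory structure
                touch "$path"                      # empty placeholder for the file
            done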

    Read the article

  • Edit write-protected files by breaking hard links

    - by Taymon
    A directory which I own and can write to contains hard links to files that I don't own and don't have write permission for. I want to open and edit these files in Emacs. When I save my changes, Emacs should rename the existing hard link by appending ~, then write my new version of the file as a new file owned by me. I was under the impression that Emacs could just do this (because of the way it does backups), but it's not working; when I save, it attempts to change the file's permissions in order to write to it (and fails because I don't own the file). How do I make this happen?
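
    One manual way to get the same effect from the shell, assuming the directory is writable by you and not sticky (the file name is illustrative): renaming keeps the old content reachable through the ~ name, and the copy becomes a brand-new file owned by you.

        mv somefile somefile~      # rename works because you own the directory
        cp somefile~ somefile      # the copy is a new inode, owned by you
        chmod u+w somefile         # now editable; the other hard link still points at the old inode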

    Read the article

  • Can I Store MediaWiki Files on the cloud?

    - by user219048
    I recently got a Chromebook, and I've been brainstorming different ways to put MediaWiki on it (with localhost, not a server). One way I've read about online is to go into developer mode to download and set up LAMP. I was wondering, couldn't I store the Apache, MySQL, PHP and MediaWiki files on the cloud (Google Drive)? And if so, would anything prevent me from accessing my wiki on any other computer's localhost, assuming I could just log into Google Drive to access these files? Might there be any reduced performance when operating from the cloud?

    Read the article

  • PHP (images folder) image Listing in Alphabetical Order?

    - by user338233
    Hello, I'm having problems with a PHP script when trying to list images alphabetically. I need this urgently and I don't have much knowledge of PHP. I was trying to use scandir() but I'm not succeeding. Thanks for your help! Here is the code:

        function listerImages($repertoire) {
            $i = 0;
            $repertoireCourant = opendir('./'.$repertoire);
            while ($fichierTrouve = readdir($repertoireCourant)) {
                $fichierTemp = "";
                if ($repertoire == '.')
                    $fichierTemp = $fichierTrouve;
                else
                    $fichierTemp = $repertoire.'/'.$fichierTrouve;
                if (estUneImageValide($fichierTemp)) {
                    echo afficherPhoto($fichierTemp, $i);
                    chmod($fichierTemp, 0700);
                }
                $i++;
            }
        }

    Read the article

  • minifying patched javascript files

    - by Stacia
    I'm writing a Rails app and I've partially integrated this nice little patch to the inline Ajax editor: http://inplacericheditor.box.re/ The problem is that on that page I have TinyMCE, Prototype and script.aculo.us included. In Firefox at least there's a big lag when all this stuff is loading. I was hoping to fix it by compressing the files, so I checked out a plugin for Rails called Smurf. It seemed to do what it was supposed to do nicely, but it choked on the little patch files that are included with the Ajax editor. The patch files look like this:

        Object.extend(Ajax.InPlaceEditor.prototype, {
            handleAJAXFailure: function(transport)

    Alternatively, should I just be caching them instead of worrying about minifying them? I know I'm running in development and that Apache would maybe handle serving the js files differently. It just seems like a lot of things to serve on one page.

    Read the article

  • Problem with passing folder path string to web service function via jQuery.ajax

    - by the_V
    Hello, I need to call an ASP.NET web service function via jQuery and pass the ASP.NET application path to it. This is the way I'm trying to do it (the code is located within an ASP.NET page, i.e. an .aspx file):

        var d = "{'str':'<%=System.DateTime.Now.ToString() %>', 'applicationPath':'<%=GetApplicationPath() %>'}";
        $.ajax({
            type: "POST",
            url: "http://localhost/testwebsite/TestWebService.asmx/Test",
            data: d,
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            error: function (xhr, status, error) {
                var err = eval("(" + xhr.responseText + ")");
                alert(err.Message);
            },
            success: function (msg) { }
        });

    This is what the GetApplicationPath method looks like:

        protected string GetApplicationPath()
        {
            return HttpUtility.HtmlEncode(Request.PhysicalApplicationPath);
        }

    And here is the header of the web-service function I'm trying to call:

        public void Test(string str, string applicationPath)

    The function call works, but the applicationPath parameter isn't passed correctly. When I debug it I see that the backslashes are removed: the function gets "C:ProjectsSamplesmytestwebsite" instead of "C:\Projects\Samples\mytestwebsite\". How can I overcome this?

    Read the article

  • How to have CVS files in a different directory than source files in NetBeans?

    - by Ondrej Slinták
    I have a project in NetBeans which hasn't used CVS until now. Let's say the directory with the source files is called /www/source_files and the directory with the project files /www/project_files. The module in the repository is named differently from the source files directory. When I try to check out from CVS, it forces me to create a directory named exactly like the module, which in fact is fine by me. Straight after that it asks if I want to create a new project, and here the problem begins: I don't want to do that, and I have no idea how to link the newly created directory with CVS and the checked-out files with my project. I'd like to end up with the following structure:

        /www
            /source_files
            /project_files
            /cvs_files

    Any ideas how to do this? I'm using NetBeans 6.8.
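
    Outside the IDE, the command-line client can at least put the checkout wherever you want; a sketch, with an illustrative repository string and module name (checkout -d names the target directory, so it need not match the module name):

        cd /www
        cvs -d :pserver:user@cvs.example.com:/cvsroot checkout -d cvs_files modulename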

    Read the article

  • Java: download multiple files at the same time?

    - by user319096
    Hi guys, is there any method for downloading multiple files at the same time? That is, after selecting multiple files, clicking the download button and choosing the destination directory, the selected files will all be downloaded at the same time. I googled it and didn't find any solutions. Does anybody know? I'm using Struts 1 and Spring 2.

    Read the article

  • Apache2 WebServer not allowing me to view website/files in /var/www

    - by CitadelCSAlum
    I used to be able to access websites/files that were stored in the directory /var/www. I have not used this for a while, but now I need to store media in this directory or in /var/www/images. I noticed that my Apache web server wasn't running correctly, so I did a complete package removal and then reinstalled, but I am still unable to access a test page index.html in /var/www by going to http://myipaddresshere/index.html. Is there some initial configuration I need to do to allow me to store HTML and media files in this directory and access them from the browser? I don't remember having to do anything before.
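
    A few quick checks from the command line, sketched for a stock Debian/Ubuntu Apache2 layout (paths and commands may differ on other setups):

        sudo apache2ctl -S                                    # which sites and DocumentRoot are actually enabled
        grep -R "DocumentRoot" /etc/apache2/sites-enabled/    # confirm something points at /var/www
        sudo chmod -R a+rX /var/www                           # make the files readable by the web server
        sudo service apache2 restart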

    Read the article

  • List files recursively and sort by modification time

    - by Problemaniac
    How do I list all files under a directory recursively and sort the output by modification time? I normally use ls -lhtc, but it doesn't find all files recursively. I am using Linux and Mac. ls -l on Mac OS X can give

        -rw-r--r-- 1 fsr user 1928 Mar  1  2011 foo.c
        -rwx------ 1 fsr user 3509 Feb 25 14:34 bar.c

    where the date part isn't consistent or aligned, so a solution has to take this into account. Partial solution:

        stat -f "%m%t%Sm %N" ./* | sort -rn | head -3 | cut -f2-

    works, but not recursively.
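
    A sketch using GNU find, which prints a sortable timestamp itself and so sidesteps the ls date-format problem (on macOS, GNU findutils from Homebrew provides the same switch, typically as gfind):

        # epoch time first for sorting, then a uniform date, then the path
        find . -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM %p\n' | sort -rn | cut -d' ' -f2-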

    Read the article

  • Zip multiple database PDF blob files

    - by Michael
    I have a database table that contains numerous PDF blob files. I am attempting to combine all of the files into a single ZIP file that I can download and then print. Please help!

        <?php
        include '../config.php';
        include '../connect.php';

        $session = $_GET['session'];
        $query = "SELECT $tbl_uploads.username, $tbl_uploads.description, $tbl_uploads.type,
                         $tbl_uploads.size, $tbl_uploads.content, $tbl_members.session
                  FROM $tbl_uploads
                  LEFT JOIN $tbl_members ON $tbl_uploads.username = $tbl_members.username
                  WHERE $tbl_members.session = '$session'";
        $result = mysql_query($query) or die('Error, query failed');

        $files = array();
        while (list($username, $description, $type, $size, $content) = mysql_fetch_array($result)) {
            $files[] = "$username-$description.pdf";
        }

        $zip = new ZipArchive;
        $zip->open('file.zip', ZipArchive::CREATE);
        foreach ($files as $file) {
            $zip->addFile($file);
        }
        $zip->close();

        header('Content-Type: application/zip');
        header('Content-disposition: attachment; filename=filename.zip');
        header('Content-Length: ' . filesize($zipfilename));
        readfile($zipname);
        exit();
        ?>

    Read the article
