Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • umask is being ignored on Gentoo while creating new files

    - by drcelus
    I have a server running Gentoo that hosts a Drupal installation. Whenever a Drupal update runs, the directory permissions of the updated module change from 755 to 744, preventing the application from accessing the files. The umask is defined as 022 in /etc/profile, and the Apache server runs as user and group nobody. I believe this has nothing to do with the Drupal installation: if I create a directory as root, the same thing happens; it is created with 744 permissions. Since the umask is 022, shouldn't it be created as 755? Why is the umask being ignored, and how do I tell the server to create directories with permission 755?
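    A quick way to separate the shell from the daemons here: a new directory is created with mode 777 & ~umask, so with 022 in effect you should indeed get 755. A minimal sanity check in an interactive shell:

        umask                                # print the mask in effect, e.g. 0022
        umask 022
        mkdir umask-test && touch umask-test/file
        ls -ld umask-test umask-test/file    # expect drwxr-xr-x and -rw-r--r--

    Bear in mind that /etc/profile is only read by login shells, so Apache (running as nobody), cron jobs and PHP processes can be running with a different umask, and an update script is free to chmod its files explicitly no matter what the umask says.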

    Read the article

  • Web log files analyzer

    - by Peter Štibraný
    I already use Google Analytics on my page, but I'd like to get additional info from log files. I've looked at various packages over the last few days, but nothing has impressed me so far. Some requirements:
      - must work at the log-file level (I use Apache combined logs, but can configure Apache to produce other formats)
      - can generate static reports (Windows/Linux) or use a GUI (Windows only)
      - should make it easy to add custom user agents and rerun the analysis
      - if it can recognize installation of Eclipse plugins from the log, that would be a big plus
      - understands the Google SERP position in the referer
      - should not require two days to set up (AWStats, I am looking at you)
      - should still be under active development (i.e. Analog isn't a good answer)
      - preferably free, or at least not very expensive :-)
    Any good analyzer programs out there?
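    On the SERP-position point, the relevant piece of work is small; assuming the old-style Google referers that carried the click position in a cd= parameter, something like this pulls the rank plus the referer out of a combined log (the log path is a placeholder):

        awk -F'"' '$4 ~ /google\./ && $4 ~ /[?&]cd=/ {
            match($4, /[?&]cd=[0-9]+/)
            print substr($4, RSTART + 4, RLENGTH - 4), $4
        }' /var/log/apache2/access.log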

    Read the article

  • Is there any way of preventing .csv files from being converted into Excel format

    - by Kevin Trainer
    I'm trying to work with an automated testing tool that can use .csv files as its data source. After saving a Notepad file containing a number of fields and data separated by commas as .csv, it appears to have been converted to an Excel file. When I run the test, only the first line of values is identified and can be used within the automated test. Not sure if this is expected with the testing product (www.badboy.co.au), but I just wondered if there was a way of preventing Excel from taking control of the .csv file? Any helpful feedback would be great.

    Read the article

  • rsync to ONLY keep files in destination that have been removed from source

    - by David Corley
    We use rsync to copy filesystem contents from one machine to another as a backup. We first run a MACHINE-X-to-MACHINE-Y rsync for a straight backup, with the --delete and --delete-excluded switches. We also run an internal rsync between the MACHINE-Y destination and another folder on MACHINE-Y, without either of the delete flags. This maintains a non-destructive copy in case someone inadvertently deletes a file on MACHINE-X. However, it also has the overhead of being a complete copy of what has already been synchronized. Ideally I want to be able to run the non-destructive rsync in such a way that the destination ONLY receives the deleted files, and so avoid unnecessary duplication. Is there any way to do this?
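    One way to get this without a second full copy is rsync's own --backup machinery on the destructive run: anything --delete would remove (or an incoming transfer would overwrite) is moved into the backup directory instead of being discarded, so that directory accumulates only deleted and changed files. A minimal sketch with placeholder paths:

        rsync -a --delete \
              --backup --backup-dir=/srv/archive/$(date +%Y-%m-%d) \
              machine-x:/data/ /srv/mirror/

    Keeping the backup dir outside the destination tree (or excluding it) avoids it being swept up by --delete on the next run.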

    Read the article

  • Search for files which will open with a certain application in Mac OS X

    - by Jacob Palme
    In Mac OS X, when you double-click on a file name, that file opens with the application that created it. So somewhere in the file metadata on a Mac OS X file there must be stored information about which application created the file. Note that this is not the file extension; the file can have any extension, or no extension at all. Two questions regarding this information: (1) How can I search for all files which will open with a specific application? (2) How can I see, and change, the application with which a certain file will open?
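    There is no classic creator code any more; the binding lives in the Launch Services database, plus an optional per-file extended attribute when you choose "Always Open With" in the Finder. A sketch of the command-line side, with the caveat that which Spotlight attributes are actually populated varies by file and by the app that wrote it:

        # what Spotlight recorded about a file, including the creating app (if any)
        mdls -name kMDItemCreator -name kMDItemContentType somefile

        # (1) search for files whose recorded creator is a given application
        mdfind 'kMDItemCreator == "Preview"'

        # (2) see whether a per-file "open with" override is set
        xattr -l somefile    # look for com.apple.LaunchServices.OpenWith

    For changing the binding, the Finder's Get Info panel is the supported route: Open with, then Change All to rebind every file of that type.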

    Read the article

  • find files whose name is smaller or greater than a given parameter

    - by Tzury Bar Yochay
    Say that in a given directory I've got:

        tzury@x200:~/Desktop/sandbox$ ls -l
        total 20
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P000
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P001
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P002
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P003
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N00.P004
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N01.P000
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N01.P001
        drwxr-xr-x 2 tzury tzury 4096 2011-03-09 10:19 N01.P002

    I am looking for a bash way to grab the list of files whose name is either greater or smaller than a given parameter. For instance, $ my_finder lt N00.P003 should return N00.P000, N00.P001 and N00.P002, while $ my_finder gt N00.P003 should return N00.P004, N01.P000, N01.P001 and N01.P002. I was thinking of iterating over for name in $(ls) and comparing with while $name != $2, but I believe there are more elegant ways of doing so.
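    A minimal sketch of such a my_finder, relying on the fact that < and > inside bash's [[ ]] compare strings lexicographically, so no ls parsing is needed:

        #!/bin/bash
        # my_finder (lt|gt) PIVOT - list names sorting before/after PIVOT
        op=$1 pivot=$2
        for name in *; do
            case $op in
                lt) [[ $name < $pivot ]] && printf '%s\n' "$name" ;;
                gt) [[ $name > $pivot ]] && printf '%s\n' "$name" ;;
            esac
        done

    Run it from inside the directory, e.g. my_finder lt N00.P003.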

    Read the article

  • Server unresponsive, messages shown on console but not in log files

    - by raistlin majere
    I'm using Ubuntu Server 10.04.4, and once in a while (about once a month) the server hangs and is totally unresponsive. The tty is flooded with messages like these. The problem is that these messages are not in my log files after a reboot. How can I log these messages so that I can analyze them later? In the current logs I can't see anything that would tell me why this is happening. I would also appreciate it if anybody could tell from those messages what's going on. This server is a guest virtual machine; the host is also Ubuntu Server 10.04, with KVM/QEMU.
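    When the kernel is wedged, the last and most interesting messages often never make it to disk, which is why the logs look clean after a reboot. The usual workaround is to ship kernel messages off the box as they happen, either over a serial console (convenient here, since the guest runs under KVM/QEMU and the host can log a virtual serial port) or with the netconsole module. A hedged sketch of the netconsole route (addresses, interface and MAC are placeholders):

        # on the guest: stream kernel messages as UDP packets to a log host
        modprobe netconsole netconsole=6665@192.168.1.10/eth0,6666@192.168.1.20/00:11:22:33:44:55

        # on the log host: capture them (flag syntax varies between netcat flavours)
        nc -l -u 6666 | tee -a netconsole.log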

    Read the article

  • Cannot access an application folder in Program Files

    - by GiddyUpHorsey
    I recently installed Windows 7 Professional 64-bit on a new machine and installed an application using a ClickOnce installer. The application runs fine, but I cannot access the application folder it created in C:\Program Files (x86); it bombs with access denied. I try to view the properties of the folder and it takes about a minute to display (other folders take a second). It says I cannot view any information because I'm not the owner. It doesn't say who the current owner is (instead: "Unable to display current owner.") but says I can take ownership. When I try, it fails again with Access Denied, even though I have administrative permissions. Why can't I access this folder or take ownership of it?

    Read the article

  • Why the huge discrepancy in size between two similar zip files

    - by twpc
    I use WinZip to zip entire directories of code and send them to a fellow programmer. He makes changes and sends the directories of code back to me. Ignoring the fact that this is not a good way to keep the code clean when we are both working on it, I notice that his zip files are far smaller than mine, with basically the same data inside (mine are around 36,000 KB, his 2,000 KB). I believe he is also using WinZip. What's going on here, and how can I make mine "more compressed"?
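    The usual suspects are the compression level and what is inside the archive rather than WinZip itself: already-compressed content (images, jars, nested zips, binaries) compresses no further, and a "store"/fast setting barely compresses at all. Comparing the per-file reports of the two archives shows where the bytes go; with Info-ZIP's unzip, for example:

        unzip -v mine.zip      # per-file size, method (Stored/Defl:N/X) and ratio
        unzip -v theirs.zip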

    Read the article

  • "Keep iTune music files organized" no longer works?

    - by user22105
    I remember iTunes used to have the option to "keep music files organized" (I know it still does in its advanced preferences). When I edited a track's artist, genre or album information, the mp3 file used to be updated as well. But lately I've been updating some mp3s, and if I open the updated mp3's Properties in Windows, under Details the artist, genre and album information is not available. How can I fix this? Added: you can try editing an mp3 file with iTunes; it didn't work on my PC. Here's a screenshot of what I'm seeing after the edit.

    Read the article

  • Migrate to SSD - NTFS mount point for Program Files

    - by Icode4food
    Here is my thought. I have a new computer that I just built and am considering migrating to an SSD. I have Windows all set up and my development environment configured, so I want to avoid having to re-install a bunch of stuff. My thought is to clone my OS (Win7) to the SSD and then mount an HDD partition to C:\Program Files (x86)\, with C: being my SSD. This way, as far as the programs are concerned, they still live on the C drive, but in reality they are physically located on the HDD. This seems to me like a good idea, but after searching around a bit and not finding anyone else with the same idea, I'm wondering why not. Maybe I am missing something that is obvious to everyone but me. Why is this a good or a bad idea?

    Read the article

  • xcopy files and directories

    - by user1044937
    I have a folder named "C:\Jobs\job#1" , "C:\Jobs\job#2" "C:\Jobs\job#3" etc and a lot of directories and sub-directories under it. I want to get the all the directories under Jobs and xcopy them to C:\backup. Then I want to xcopy all the files under each Job#1, 2 ,3 etc. to C:\backup\job#1\month\\*.* To make it clearer. Source dir = C:\Jobs\job#1\"myfiles&dir" Destination dir = C:\Backup\job#1\month\"myfiles&dir" then do the next folder Source dir = C:\Jobs\job#2\"myfiles&dir" Destination dir = C:\Backup\job#2\month\"myfiles&dir" ...until all folders are back-up. Since the job folder keep increasing, by doing it this way I don't have to add extra code on this script except modify the month. Thank you.

    Read the article

  • nginx php-fpm keeps downloading files

    - by Sam Williams
    vhost:

        server {
            listen *:8080;

            location / {
                root /var/www/default/pub;
                index index.php;
                # if file exists return it right away
                if (-f $request_filename) {
                    break;
                }
                if (!-e $request_filename) {
                    rewrite ^(.+)$ /index.php$1 last;
                    break;
                }
            }

            # serve static files directly
            location ~* \.(jpg|jpeg|gif|css|png|js|ico|html)$ {
                access_log off;
                expires max;
            }

            location ~* \.php$ {
                # By all means use a different server for the fcgi processes if you need to
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_param PATH_INFO $fastcgi_script_name;
                include /usr/local/nginx/conf/fastcgi_params;
            }

            location ~ /\.ht {
                deny all;
            }
        }

    http://192.168.135.128/index.php loads just fine... http://192.168.135.128/public_/html/index.php downloads...

    Read the article

  • Apache 2.0 is showing PHP files as plain text

    - by denonth
    I have a web application written with PHP, HTML and JavaScript. On my PC I have installed EasyPHP, which bundles Apache and everything else. But I wanted to put this web app on my server, where I have installed Apache 2.0, and my PHP files are displayed as text, or the browser starts to download them automatically. I have tried several things, one of them being to add this to my conf file:

        AddType application/x-httpd-php .php
        AddType application/x-httpd-php-source .phps

    But it's still not working. What else can I do? Thank you
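    AddType only maps the extension to a MIME type; if the PHP module itself was never loaded, Apache still has no handler for it and serves the source. A minimal sketch, assuming PHP was installed as mod_php (the module name and path depend on your PHP version and distribution):

        LoadModule php5_module modules/libphp5.so
        AddType application/x-httpd-php .php
        DirectoryIndex index.php index.html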

    Read the article

  • nginx deny directory and files to be downloaded

    - by YeppThat'sMe
    Gurus, I have a problem and I don't know how to solve it. I am working with Git and Compass/SASS on some projects, and I want to protect those directories. When I go to the folder itself it's all fine: I get what I expected, a 403 Forbidden.

        location ~ /\.git {
            deny all;
        }

    But when I try to use the full path to a config file from Git, the browser starts to download it. Same scenario with Compass: there is a config.rb file within the folder, which also starts to download. How can I prevent this behaviour? How can I deny downloading specific files?
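    Note that the \.git rule never matches a plain file like config.rb at all, and nginx uses the first matching regex location in file order, so a static-extension location declared earlier can win the match and serve the file. A hedged sketch of broader deny rules, placed above any other regex locations:

        location ~ (^|/)\.(git|svn|hg) { deny all; }
        location ~ /config\.rb$        { deny all; }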

    Read the article

  • mod_rewrite filenames from mod_pagespeed back to normal files

    - by British Sea Turtle
    I am hoping someone can help me with this problem. I am moving to a new server and not using mod_pagespeed any more. We have lots of external links to images on our site that use the strange mod_pagespeed filenames; that is not an issue in itself, but we do not want lots of 404 errors. So I have lots of links like the following:

        http://www.domain.com/images/150x150xlink.png.pagespeed.ic.pPXw45HSQm.png
        http://www.domain.com/images/paris_01.gif.pagespeed.ce.vfrkuKUaj0.gif
        http://www.doamin.com/images/1st2.gif.pagespeed.ce.OUg38q6VbZ.gif

    How can I redirect them to:

        http://www.domain.com/images/150x150xlink.png
        http://www.domain.com/images/paris_01.gif
        http://www.doamin.com/images/1st2.gif

    There are thousands of files like this, so I am hoping for a simple solution with mod_rewrite. I tried this, but it does not work:

        RewriteCond %{REQUEST_URI} \.gif\.pagespeed\. [NC]
        RewriteRule ^(.*?\.gif)\..*\.gif$ $1 [NC,L]

    Any help would be appreciated.
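    The attempted rule only matches names ending in .gif twice over, so the .png links (and most of the pagespeed names) never match it. A hedged generalisation, assuming the rewritten names always follow the name.ext.pagespeed.<filter>.<hash>.ext pattern (shown for a .htaccess; in server config, anchor the pattern with a leading slash):

        RewriteEngine On
        # e.g. 150x150xlink.png.pagespeed.ic.pPXw45HSQm.png -> /images/150x150xlink.png
        RewriteRule ^(.+\.(?:png|gif|jpe?g))\.pagespeed\.[a-z]{2}\.[^.]+\.(?:png|gif|jpe?g)$ /$1 [R=301,NC,L]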

    Read the article

  • Apache executes files when a trailing slash is added

    - by Francisc
    Hello! I am having a problem with Apache. What it does is this: take an /index.php file containing an <img> tag whose src is set to the relative path myimg.jpg, both in the root of my server. www.mysite.com shows the image, as does www.mysite.com/index.php. However, if I access www.mysite.com/index.php/ (with a trailing slash), Apache does the odd thing of executing the index.php code as if it were inside an index.php folder (e.g. /index.php/index.php), so the relative image path no longer resolves and the image is not shown. This is a simple example that's easy to solve with absolute addressing etc.; the problem I am getting from this is a security one that's not so easily fixed. So, how can I get Apache to give a 403 or 404 when files are accessed "as folders"? Thank you.
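    What you are describing is Apache's PATH_INFO handling: /index.php/ executes /index.php with a trailing path segment, and the browser then resolves relative URLs against the phantom directory. Apache can be told to refuse such requests for files outright; a minimal sketch:

        # reject requests that hang extra path segments off an existing file (404)
        AcceptPathInfo Off

    Some applications rely on PATH_INFO-style URLs, so this is best scoped to directories that don't need it.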

    Read the article

  • How to play SWF files without browser

    - by Mehper C. Palavuzlar
    I like to play some downloaded Shockwave Flash (.SWF) files without opening my internet browser. I remember that some time ago I could do it just by double-clicking on the SWF item; it would then open in a plain Shockwave window. Now XP won't play ball. I tried Folder Options > File Types but couldn't find the associated player exe. How can I get around this? Edit: There has to be some way to do this without third-party software, since I can already play SWFs in my browsers.

    Read the article

  • monitor a folder and send files via FTP to clients

    - by user73109
    I am looking for software that will monitor a specific folder and, when a file is created in it, send that file off via FTP to a client associated with that folder by the software. I have tried software such as SmartFTP and CuteFTP, and they don't seem to monitor folders very consistently. Some of the options with them were to write scripts to delete duplicated files from the transfer queue. I really don't want to have to write scripts for software I purchase. I am not opposed to scripting, or to writing it, but I feel I shouldn't have to write scripts to make software properly do something it says it does out of the box. I am currently trying to do this on a Windows XP box, though running it on Server 2003 is an option if it would make things easier. I really just want to be pointed in the correct direction; this is all fairly foreign to me.

    Read the article

  • Can't delete files in XP

    - by maaartinus
    On Windows XP I made a copy of my home directory. Now I want to remove it, but there's a directory with two files which I can't get rid of:

        N:\COPY-OF-HOME\Local Settings\Application Data\Microsoft\CardSpace

    The directory is read-only, and I can't change that (access denied). Cacls shows the following:

        Everyone:(special access:) READ_CONTROL SYNCHRONIZE FILE_READ_ATTRIBUTES
        BUILTIN\Administrators:(special access:) READ_CONTROL SYNCHRONIZE FILE_GENERIC_READ FILE_READ_DATA FILE_READ_EA FILE_READ_ATTRIBUTES

    and I can't change it either. I do have administrator privileges. For copying I didn't use any fancy tool, so I'd expect to be the owner of the copy. Why can't I delete it? Do I need to boot Linux?

    Read the article

  • Windows 8 Modern UI searching for files doesn't work

    - by Peter Jansen
    I have a problem with search in Windows 8. When I search for files through the Modern UI (Win+F), it won't return a single result from any of my drives. Searching via Windows Explorer works fine. I had the same problem in the Windows 8 Consumer Preview, but it worked in the Developer Preview. I have looked on the net for other users with similar problems, but I haven't found anything. Does anyone know what the problem might be?

    Read the article

  • Recursively copy only new or changed files

    - by Niklas
    I'm sure this must have been asked and answered before, but I just can't find it right now... I have a Visual Studio post-build action that currently does a recursive copy (using xcopy) of an output folder to a different folder. This takes longer than I'd like, and I'd prefer to copy only newly created and newer (changed) files each time (which xcopy doesn't seem to support). I cannot depend on any not-installed-by-default tool, since the solution is used by different developers on different machines. What would a superuser do?
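    For what it's worth, xcopy does have an incremental switch: /D with no date given copies only those files whose source time is newer than the destination copy. A hedged sketch of a post-build line (the deploy path is a placeholder):

        xcopy "$(TargetDir)." "C:\deploy" /D /E /I /Y

    On Vista and later, robocopy also ships with Windows and handles this kind of incremental mirroring natively.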

    Read the article

  • selecting files in Windows 7 Explorer just by hovering

    - by bortao
    One of the biggest annoyances for me in Windows 7 versus Windows XP is that you can't select files properly by hovering over them in Windows Explorer (or multi-select by holding Ctrl). When I hover over an item it turns light blue, not selected; only after a while, or after more hovering, does it get selected. I set the delay (MouseHoverTime) to 1. Windows seems to avoid accidental selection by requiring additional hover time or mouse movement before actually selecting the item. I tried playing with the values MouseHoverHeight, MouseHoverWidth, MouseSensitivity, MouseThreshold1 and MouseThreshold2 in HKEY_CURRENT_USER\Control Panel\Mouse, but they seem to have no effect on this (they are currently 20, 20, 16, 6 and 10, respectively). What I want: to immediately select an item whenever the mouse gets over its bounds.

    Read the article

  • Duplicating KeePass files instead of creating a new file

    - by BlakBat
    I'm currently using KeePass 2 and syncing my databases via Dropbox. I have a few KeePass files (one for websites, one to store software licenses, etc.). Every time I need a new KeePass file, I just create a copy of an existing .kdbx file, open it, remove all existing entries, and change the key transformation rounds to another pseudo-random value. I do not change the master password. I want to know whether this is unsafe practice or a security risk compared to just creating a new KeePass file via the File > New menu. The reason I don't use the menu: I'm lazy enough to not want to reconfigure the database settings every time.

    Read the article

  • Windows hiding other users' files?

    - by JoshJordan
    I had a hard drive whose Windows installation (Vista) became corrupt. I bought a new hard drive, installed Windows 7, and hooked up the old drive using an external enclosure. The Users folder on the old drive shows the users that existed on the machine, but it doesn't show any of their contents. I assume this is due to not having the permissions I need. I have "taken control" of the folders I'm interested in, but this didn't prompt me for the original owner's password as I expected, and I still can't see the file contents. I would guess that this is a fairly common issue, but I'm not sure what to Google here. How can I get access to the files in that drive's Users directory?

    Read the article
