Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • .htaccess to block by file name possible?

    - by Tiffany Walker
    I have a bunch of files named secure_xxxxxx.php. Is there a way to use .htaccess to block access to all the secure_* PHP files based on IP?

    EDIT: I've tried the following, but I get 500 errors:

        <FilesMatch "^secure_.*\.php$">
        order deny all
        deny from all
        allow from my ip here
        </FilesMatch>

    I don't see any errors in the Apache error logs either. Output of httpd -M:

        Loaded Modules: core_module (static) authn_file_module (static) authn_default_module (static) authz_host_module (static) authz_groupfile_module (static) authz_user_module (static) authz_default_module (static) auth_basic_module (static) include_module (static) filter_module (static) log_config_module (static) logio_module (static) env_module (static) expires_module (static) headers_module (static) setenvif_module (static) version_module (static) proxy_module (static) proxy_connect_module (static) proxy_ftp_module (static) proxy_http_module (static) proxy_scgi_module (static) proxy_ajp_module (static) proxy_balancer_module (static) ssl_module (static) mpm_prefork_module (static) http_module (static) mime_module (static) dav_module (static) status_module (static) autoindex_module (static) asis_module (static) info_module (static) suexec_module (static) cgi_module (static) dav_fs_module (static) negotiation_module (static) dir_module (static) actions_module (static) userdir_module (static) alias_module (static) rewrite_module (static) so_module (static) fastinclude_module (shared) auth_passthrough_module (shared) bwlimited_module (shared) frontpage_module (shared) suphp_module (shared)
        Syntax OK
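    A likely culprit for the 500 is the Order line: with the 2.2-style authz modules shown above (authz_host, mpm_prefork), Order takes a comma-separated pair rather than "deny all". A minimal sketch of that syntax, with a placeholder IP address:

        <FilesMatch "^secure_.*\.php$">
            # 203.0.113.5 is a placeholder for the real admin IP
            Order Deny,Allow
            Deny from all
            Allow from 203.0.113.5
        </FilesMatch>

    Running apachectl configtest (or httpd -t) after editing should confirm whether the directives parse before requests start returning 500s.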


  • Minidump folder cannot be found

    - by Saxtus
    Although I have enabled the creation of minidump files on my system, it appears that either Windows doesn't create them where the Startup and Recovery dialog points to (%SystemRoot%\Minidump), or at least I can't find them. Even the Minidump folder under the Windows directory is missing, and I have had numerous BSODs so far. I've searched all my hard drives for mini*.dmp files, only to find some old ones in a backup folder from my Vista x86 installation, from before I installed Windows 7 x64. Any thoughts on why this is happening and how to fix it? Pre-thanx!


  • How to specify file permission when putting a file using OpenSSH sftp command

    - by Adi Roiban
    I am using various SFTP clients for uploading files to an SFTP server, and I have a problem with the default permissions used when putting files. When requesting to put a file, an SFTP client like WinSCP or FileZilla will send the SSH_OPEN command without requesting any explicit file permissions. On the other side, it looks like the OpenSSH sftp command on Linux (Red Hat and Ubuntu) is sending the SSH_OPEN command together with the '640' mode. How can I configure the OpenSSH command to not explicitly set the file mode, or how can I configure it to send a mode other than 640? Many thanks!

    Update: I checked the OpenSSH sftp client source code, and it looks like OpenSSH sftp always tries to preserve the file mode even if -P is not set: http://www.koders.com/c/fidD3B20680F615B33ACCB42398FAAFEE1C007DF942.aspx?s=rsa#L986 To solve this problem I used the PuTTY SFTP client.
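    If the aim is just a particular mode on the server, rather than changing the client's default, one workaround sketch is to set the mode in the same session with sftp's built-in chmod command; the host name and paths below are placeholders:

        # Upload, then set the mode explicitly; -b - reads the batch from stdin.
        printf 'put report.csv /upload/report.csv\nchmod 644 /upload/report.csv\n' \
            | sftp -b - user@sftp.example.com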


  • File sync over LAN

    - by Jack
    At the moment, I'm using Dropbox on my computer and Dropsync on my Android phone (over WiFi) to one-way sync the files (such as photos/app backups) on my phone to the Dropbox servers, which are then synced to my computer. I like the features of Dropsync, which can be set to sync only when the phone is charging and WiFi is on. So every time I charge my phone, I know my files are being backed up. Now I'm wondering if there's a similar app/program combo or something that can do what I'm currently doing but remove the middle-man (the Dropbox servers)?
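    One middle-man-free pattern, assuming the phone runs an SSH/rsync server app and the computer has rsync (both assumptions), is a one-way pull over the LAN; a sketch with placeholder address and paths:

        # Pull new photos from the phone to a local backup folder over the LAN.
        # 192.168.1.50, the user name and the paths are all illustrative.
        rsync -av --partial --progress \
            user@192.168.1.50:/sdcard/DCIM/Camera/ \
            ~/PhoneBackup/Camera/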


  • User Permissions: Daemon and User

    - by Eddie Parker
    Hello: I often run into this issue on Linux, and I'd love to know the proper way of solving it. Say I have a daemon running. In my example, I'll use lighttpd, a web server. Some software, like WordPress, enjoys having read/write access to files for updating applications via a web interface, which I think is quite handy. At the same time, I enjoy being able to hack on my files using vim, using my local user account, 'eddie'. Herein lies the rub. Either I chown everything to lighttpd or eddie and a shared group between them both and chmod it 660, or I perpetually sudo to edit the damned things. The former isn't a bad solution, until I create a new file, in which case I have to remember to chmod it appropriately, or create some hack like a cron job that chmods for me. Is there an easier way of doing this? Have I overlooked something? Cheers, -e-
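    A sketch of the usual answer to the "new files lose the group" problem: a shared group plus setgid directories, optionally with a default ACL so group write survives each side's umask. Group, user and path names below are illustrative:

        # One-time setup; run as root. Names and paths are placeholders.
        groupadd webedit
        usermod -aG webedit eddie
        usermod -aG webedit lighttpd
        chgrp -R webedit /var/www/site
        chmod -R g+rwX /var/www/site
        # setgid on directories: new files/dirs inherit the webedit group.
        find /var/www/site -type d -exec chmod g+s {} +
        # Optional, needs an ACL-enabled filesystem: keep new files group-writable
        # regardless of the creator's umask.
        setfacl -R -d -m g:webedit:rwX /var/www/site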


  • Guests can't access KVM host server by name although nslookup and dig returns correct record

    - by user190196
    So I have a KVM host that also runs an Apache server with some yum repos. The VM guests are connected to the default virtual network, which is configured to offer DHCP and forwarding with NAT on virbr0 (192.168.122.1). The guests can successfully access the yum repos on the host by IP address, so for example curl 192.168.122.1/repo1 returns the content without problems. But I'd like the guests to be able to reach the web server on the host by name rather than by IP address. I added the desired name record to the host's /etc/hosts file, and libvirt's dnsmasq service seems to be serving that correctly to the guests, since nslookup and dig successfully resolve the name on the guests:

        [root@localhost ~]# nslookup repo
        Server:         192.168.122.1
        Address:        192.168.122.1#53

        Name:   repo
        Address: 192.168.122.1

        [root@localhost ~]# dig repo

        ; <<>> DiG 9.8.2rc1-RedHat-9.8.2-0.17.rc1.el6 <<>> repo
        ;; global options: +cmd
        ;; Got answer:
        ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 55938
        ;; flags: qr aa rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 0

        ;; QUESTION SECTION:
        ;repo.                          IN      A

        ;; ANSWER SECTION:
        repo.                   0       IN      A       192.168.122.1

        ;; Query time: 0 msec
        ;; SERVER: 192.168.122.1#53(192.168.122.1)
        ;; WHEN: Tue Sep 17 02:10:46 2013
        ;; MSG SIZE  rcvd: 38

    But curl/ping/etc. still fail:

        [root@localhost ~]# curl repo
        curl: (6) Couldn't resolve host 'repo'

    While a request via IP address works:

        [root@localhost ~]# curl 192.168.122.1
        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
        <html>
         <head>
          <title>Index of /</title>
        [...]

    Same with ping:

        [root@localhost ~]# ping repo
        ping: unknown host repo
        [root@localhost ~]# ping 192.168.122.1
        PING 192.168.122.1 (192.168.122.1) 56(84) bytes of data.
        64 bytes from 192.168.122.1: icmp_seq=1 ttl=64 time=0.110 ms
        64 bytes from 192.168.122.1: icmp_seq=2 ttl=64 time=0.146 ms
        64 bytes from 192.168.122.1: icmp_seq=3 ttl=64 time=0.191 ms
        ^C
        --- 192.168.122.1 ping statistics ---
        3 packets transmitted, 3 received, 0% packet loss, time 2298ms
        rtt min/avg/max/mdev = 0.110/0.149/0.191/0.033 ms

    I tried adding "repo 192.168.122.1" to the guests' /etc/hosts files, but still no dice. I also tried changing the guests' /etc/nsswitch.conf with both:

        hosts: files dns

    and

        hosts: dns files

    I've read the relevant libvirt documentation, and I'm not sure where else to learn more about this and be able to move forward with it.
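    Worth noting as a general point (not specific to this setup): nslookup and dig query the DNS server directly, while curl and ping resolve through glibc (nsswitch and /etc/resolv.conf), so the two paths can disagree. A quick way to test the same path the failing tools use, run on a guest:

        getent hosts repo      # resolves via nsswitch, like curl/ping do
        getent ahosts repo     # same lookup through getaddrinfo
        cat /etc/resolv.conf   # confirm the guest's resolver really is 192.168.122.1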


  • Remote file access.

    - by Rob Rob
    Hi, we need to provide remote (read/write) access to a number of files on our network to several users (some technical, some non-technical) who will be running Windows. The non-technical users will need to be able to access their files in an easy-to-use manner. From previous experience, we could do this with:

        (some sort of) VPN
        SSH and something like Dokan (I've only previously done this on Linux with sshfs)
        WebDAV
        FTP

    VPN and SSH access are more open than we need at present, so I'm leaning towards WebDAV; however, I only have limited experience of it (setting up an SVN server several years ago), but my understanding is that users can access it through Windows Explorer. FTP I haven't had much experience of, as I've always used SFTP via SSH - but I'd imagine we could make this work in a similar way to SSH. So my question is - have I missed any obvious candidates for this task, or if WebDAV is (or isn't) suitable, what are the security implications of using it for this (obviously HTTPS will be used for the transfers, etc.)? Thanks, Rob.


  • Does anyone know a fix or cause for USB drives that lose their connection

    - by Burch Kealey
    I have been having significant problems trying to copy files to a USB drive (GoFlex 1.5 TB). No matter what I use to try to copy the files (Python code, Windows copy and paste, or the Drobo Copy utility), I ultimately get a failure. The failure is always some indication that the device is not ready or not able to be written to. I am pretty sure the problem is that the drive is losing its connection. I have done everything I can find so far, including setting the USB root hub to never turn off the power, and I updated some USB drivers, etc. I have found references to this problem primarily with Win7 64-bit. I have also had USB connection problems with other devices - we kept losing the connection to our Bravo disc publisher when we went to Win7, and finally bought a newer model and have not had problems since. Any pointers about diagnosing and/or understanding the problem would be appreciated.


  • Settings on php.ini ignored

    - by bfavaretto
    I can't get my server to obey the settings from php.ini (I'm trying to change memory_limit and upload_max_filesize). As far as I can tell, I'm editing the correct file; phpinfo() gives:

        Loaded Configuration File    /etc/php.ini

    The file permission is 644. There are also some extra .ini files in /etc/php.d, but none include any of the keys I'm trying to change. No matter what I do, phpinfo reports the default values in both the "Local" and "Master" columns. I also scanned my Apache config files, but found nothing related to PHP (besides loading the PHP module). The only way I was able to change those settings was by adding some php_value lines to my .htaccess. Is there something obvious I'm missing? This is a virtual server, and I can perform root commands with sudo. I'm running Apache 2.1.3 and PHP 5.3.3. System info (from uname -a) is:

        Linux sesctbapp01 2.6.18-308.1.1.el5 #1 SMP Wed Mar 7 04:16:51 EST 2012 x86_64
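    A few shell checks that often narrow this kind of thing down; paths follow the /etc/php.ini and /etc/php.d layout mentioned above, and /etc/httpd is assumed for the Apache config tree, so treat this as a sketch:

        php -i | grep -i 'configuration file'            # which ini the CLI actually loads
        grep -Rn 'memory_limit\|upload_max_filesize' /etc/php.ini /etc/php.d/
        grep -Rni 'php_admin_value\|php_value' /etc/httpd/   # overrides hiding in Apache config
        sudo service httpd restart                       # mod_php only rereads php.ini on restart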


  • Why suddenly DOS-type hexadecimal file names?

    - by Marvin Nicholson
    One of the fairly recent folders on my XP SATA data drive suddenly shows DOS-type hexadecimal file names (i.e., eight characters with three-character extensions). I deleted them, and now my Recycle Bin shows them with a tilde (e.g., 194ABE~1.JPG). The images are all valid, but the file names I assigned are gone. (The 2-terabyte SATA data drive has no OS, if that matters.) The last time this happened, on an IDE drive, I was able to back up all the remaining files just before the drive died. Am I facing the same scenario now with my 2-terabyte SATA data drive? It is only a couple of years old. Should I quickly buy another one and back up 20 years of files to it before my current drive dies?


  • How can I re-encode H.264 video with minimal quality loss?

    - by SofaKng
    I have a lot of MPEG-TS files (.TS container but H.264 video), and playback is fine, except that when you skip forward/backward or fast forward it's very sluggish, gets pixelated, etc. I've been trying to do research, and I'm guessing that they were encoded with very few reference frames (i.e. it's a capture from a DVB-S satellite stream). When I re-encode them with HandBrake (.MP4 container) they play very, very well, and seeking in the video is instant, etc. Is it possible to transcode/re-encode my MPEG-TS files with minimal quality loss? If so, what is my best bet? They are each about 2 Mbps (i.e. 2 GB per hour), but I don't want to re-encode them if "minimal quality loss" requires 10+ GB per file. I'm hoping to keep the video at roughly the same size. Can anybody give me any advice?
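    If a full re-encode is acceptable, one common sketch uses ffmpeg with CRF-based x264 and a denser keyframe interval, copying the audio untouched; the file names and exact numbers are illustrative starting points rather than tuned values:

        # CRF 18 is usually close to visually transparent; -g 50 forces a keyframe
        # roughly every 2 seconds at 25 fps, which is what makes seeking snappy.
        ffmpeg -i input.ts -c:v libx264 -preset slow -crf 18 -g 50 -c:a copy output.mp4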


  • SQL 2008 R2 DEV install failed

    - by obulay
    I am trying to install SQL 2008 R2 Developer on Windows 2008 Standard. It previously had SQL 2008 Enterprise installed; I first uninstalled SQL 2008. I think the uninstall still leaves some crap in the registry and in Program Files. The installation fails in the last step of the process - installation - about a couple of minutes into it. I can't insert a picture for you because serverfault does not allow me to do so, but basically it's this error message: "Installation failed". I re-installed SQL 2008 Enterprise on this box, and we are not going to go to R2 on older servers that previously had SQL 2005 or SQL 2008 on them. Looking at the Windows log:

        Activation context generation failed for "C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\SQLServer2008R2\x64\Microsoft.SqlServer.Configuration.SqlServer_ConfigExtension.dll".
        Dependent Assembly Microsoft.VC80.CRT,processorArchitecture="amd64",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="8.0.50727.4027" could not be found.
        Please use sxstrace.exe for detailed diagnosis.


  • Bash - read as a fallback to $@

    - by user137369
    I have a working bash script (working on OSX) that takes files and directories as input and does something like:

        for inputFile in $@
        do
            [someStuff]
        done

    but I want to provide a “fallback”, meaning that if the script is started with no arguments (double-clicked, for example), it can take input at that time, by letting the user drop the files directly on the terminal (possibly through read, but not mandatory; I'm open to better/different solutions). I'm guessing I should use some kind of if statement, but I'm not sure how. I'd like to avoid essentially doubling the script's size by repeating [someStuff] for each case. Thank you.
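    A minimal sketch of the usual shape for this: normalise both cases into one array up front, so the loop (and [someStuff]) exists only once. The prompt text is made up, and leaving -r off read is a deliberate assumption - without -r, the backslash-escaped spaces that OS X's Terminal inserts when you drop a file should survive the word splitting:

        #!/bin/bash
        if [ "$#" -gt 0 ]; then
            inputFiles=("$@")
        else
            # No arguments: let the user drag-and-drop paths, then press Return.
            read -p "Drop files here, then press Return: " -a inputFiles
        fi

        for inputFile in "${inputFiles[@]}"; do
            # [someStuff] goes here
            echo "processing: $inputFile"
        done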


  • How do I use an internal SSD as a scratch disk for FCP X?

    - by andrewb
    I'm contemplating setting up my MacBook Air as a video editing machine. If I do this, I'll upgrade to a 256 GB SSD, and I should be able to keep around 100 GB or more free for video editing. The video files would of course be stored externally, but save purchasing some expensive Thunderbolt RAID device (which I suppose is gradually becoming more of an option), it will be slow for read/writes. How can I have a set up where I take advantage of my SSD's speed for a scratch disk/cache for FCP X, but still have the TB(s) of storage of externals? I don't want to have to be moving files constantly back and forth, this is about saving time not wasting it.


  • Problems accessing shared folder in Windows Server 2008

    - by Triynko
    In Windows Server 2008, I have a shared folder. For my username:

        NTFS permissions: read/modify
        Share permissions: read/modify

    Result when trying to access the share: I can traverse the directory and read files, but I cannot write files. When I try to examine my effective permissions, it says "Windows can't calculate the effective permissions for [My Username]". The folder is owned by the Administrators group (the default), and NTFS read/write permissions are granted to my username, which is a member of the Administrators group. I notice that making any changes to the folder locally requires me to acknowledge a UAC prompt. Why does that prompt appear? I also tried creating a new group, giving it full NTFS permissions and full control in the share permissions, and added my username to the group. The result is even worse... I cannot even traverse the shared folder directories or read anything at all.


  • Apache has a 2GB file limit on a CIFS network drive?

    - by netvope
    Setup:

        A Windows server and an Ubuntu server are hosted in VMware ESXi
        I have a 6GB file on a Windows share
        The Windows share is mounted on Ubuntu with smbmount
        A symlink pointing to the 6GB file is created in a public_html directory, which is readable by Apache

    The problem:

        wget gets an error "Connection closed at byte 2130706432. Retrying." after downloading 2130706432 bytes (exactly 2032 MiB, and it is the same every time)
        Apache returns 206 Partial Content without showing any errors in the log
        Same error even if I download from localhost
        Similar error when Firefox is used instead of wget
        No error if I md5sum or cp the file on Ubuntu, suggesting that smbmount and the Windows server are OK with 6GB files
        No error if Apache serves a 6GB file from the local disk, suggesting that Apache has no problem handling 6GB files

    Any ideas why Apache/symlink/smbmount/Windows would cause an error when used together? How can I fix the problem?

    Software used: VMware ESXi 4 Update 1, Windows Server 2008 R2, Ubuntu 8.04 Server (vmxnet3), Apache 2.2.8, mount.cifs 1.10-3.0.28a
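    One experiment frequently suggested for content served from network mounts is to disable sendfile and mmap so Apache falls back to plain read/write, which sidesteps several large-file-over-CIFS issues; a sketch, with the conf file name as a placeholder for wherever extra Apache config gets loaded on this box:

        # EnableSendfile/EnableMMAP are standard Apache 2.2 directives.
        printf 'EnableSendfile Off\nEnableMMAP Off\n' \
            | sudo tee /etc/apache2/conf.d/cifs-largefile.conf
        sudo apache2ctl configtest && sudo /etc/init.d/apache2 restart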


  • Excel Help: Userforms

    - by B-Ballerl
    I have developed a macro that does a whole bunch of things for me based on a few things (importing files). The file names are dated dd_mm_yyyy, and right now I enter the dates into a sheet where the macro can call the information. Not really wanting this, I designed a userform where the user can enter the "dd", "mm", "yyyy" and how many consecutive days of files there are. For example, with 28_06_2011.txt and 29_06_2011.txt there would be one consecutive day. I want to be able to call the information entered in the userform (day, month, year, and consecutive days) to use in the macro, and I have been unsuccessful because I don't know how to call that information. Is it similar to referring to a range in a worksheet? Thanks in advance for any help.


  • Encrypted passwords for better security on server

    - by Ke
    Hi, I use WordPress and other CMSs, and all of these have plain-text passwords in their config files, e.g. in wp-config.php. I wonder, is this the normal way an administrator would handle security? I realise it's possible to move wp-config.php outside of the root web directory, but still, if the server itself is compromised, it's possible to find the wp-config file and the password inside, and then the system is compromised. Is there a way to encrypt all passwords on the system, so that the web applications' config files use the encrypted password and not just plain text? Is there a sensible way of keeping plain-text passwords off the server? PS: I use Linux VPS Ubuntu servers. Cheers, Ke


  • Linux - Imaging backup solution?

    - by xperator
    I want to know: is there a way to make a snapshot-like backup of a Linux system into a single file and restore it on another system? You know, in Windows there are programs which make a copy of a drive (like C:\) into a single image file, so you can restore this file later in case you are infected or something happens. Every time I want to migrate my VPS to another host, I have to set up the new server from scratch and move the files manually. Can I just make a snapshot backup of the whole system and restore it somewhere else (or on the same server)? I am not familiar with Linux, and I have no idea if this is technically possible or not. Are the partitions, configs, system files, etc. individual to each system? I heard about rsync, but that's not what I am looking for.
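    Two plain-tool sketches of what such a single-file backup can look like, with placeholder device and paths; whether a block-level image restores cleanly on a different VPS host also depends on the hypervisor and partition layout, so these are illustrations rather than a recipe:

        # 1) File-level archive of the root filesystem, restorable onto a
        #    similarly partitioned target; best run from a rescue environment.
        tar --one-file-system -czpf /mnt/backup/vps-root.tar.gz \
            --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt /

        # 2) Block-level image of the whole disk (VPS offline or filesystems
        #    quiesced); /dev/vda is a placeholder for the actual disk.
        dd if=/dev/vda bs=4M conv=sync,noerror | gzip > /mnt/backup/vps-disk.img.gz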


  • AutoCAD 11 and network file shares

    - by gravyface
    Small network of perhaps half a dozen engineers, currently working on local copies of AutoCAD project files, which are then copied back up to the file server (2008 Standard, 1-2 year old Dell server hardware, RAID 5 SAS disks (10k? not positive)) at the end of the day. To me, this sounds horribly inefficient and error-prone; however, I've been told that "AutoCAD and network files = bad idea" and that this is gospel. The network is currently 10/100 (perhaps this is the reason for the "gospel"), but all the workstations are within 2 years old and have GbE NICs, so an upgrade of the core switch is long overdue. However, I know certain applications don't like network access at all, and any sign of latency or disruption brings the whole thing crashing down. Anyone care to chime in?


  • How to set maximum downloads or sockets or whatever, in Apache and/or PHP?

    - by Petruza
    I made a PHP script, running from my localhost, that streams files from a remote server and serves them. I do this so I can rename the files before the browser shows the dialog to save them, through header( "filename:..." ). Anyway, although the remote server allows many simultaneous file downloads at good rates, when they stream through my local Apache/PHP I can't get more than 6 at the same time. When I try to download the 7th, the save-as dialog appears as soon as the sixth download has finished. I'm almost sure this is some limit imposed by php.ini or Apache's httpd.ini, but I don't have a clue which one it is. Do you?


  • Logrotate not doing any rotation

    - by blizz
    I just set up logrotate on my RHEL 6 server so that it rotates my custom Apache log files. However, it doesn't do anything when I try running it manually. I expect it to rotate the log files "access.log" and "err.log". They have been there for a few days and need to be rotated. Here is the output:

        [root@pc1 httpd]# logrotate -d -f /etc/logrotate.d/apache
        reading config file /etc/logrotate.d/apache
        reading config info for /var/log/httpd/*log /var/www/html/NSLogs/access.log /var/www/html/NSErrorLogs/err.log

        Handling 1 logs

        rotating pattern: /var/log/httpd/*log /var/www/html/NSLogs/access.log /var/www/html/NSErrorLogs/err.log  forced from command line (no old logs will be kept)
        empty log files are rotated, old logs are removed
        considering log /var/log/httpd/access_log
          log needs rotating
        considering log /var/log/httpd/error_log
          log needs rotating
        considering log /var/www/html/NSLogs/access.log
          log needs rotating
        considering log /var/www/html/NSErrorLogs/err.log
          log needs rotating
        rotating log /var/log/httpd/access_log, log->rotateCount is 0
        dateext suffix '-20131023'
        glob pattern '-[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'
        glob finding old rotated logs failed
        fscreate context set to unconfined_u:object_r:httpd_log_t:s0
        renaming /var/log/httpd/access_log to /var/log/httpd/access_log-20131023
        disposeName will be /var/log/httpd/access_log-20131023.gz
        running postrotate script
        running script with arg /var/log/httpd/access_log: " /usr/bin/killall -HUP httpd "
        compressing log with: /bin/gzip
        removing old log /var/log/httpd/access_log-20131023.gz
        error: error opening /var/log/httpd/access_log-20131023.gz: No such file or directory
        rotating log /var/log/httpd/error_log, log->rotateCount is 0
        dateext suffix '-20131023'
        glob pattern '-[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'
        glob finding old rotated logs failed
        fscreate context set to unconfined_u:object_r:httpd_log_t:s0
        renaming /var/log/httpd/error_log to /var/log/httpd/error_log-20131023
        disposeName will be /var/log/httpd/error_log-20131023.gz
        running postrotate script
        running script with arg /var/log/httpd/error_log: " /usr/bin/killall -HUP httpd "
        compressing log with: /bin/gzip
        removing old log /var/log/httpd/error_log-20131023.gz
        error: error opening /var/log/httpd/error_log-20131023.gz: No such file or directory
        rotating log /var/www/html/NSLogs/access.log, log->rotateCount is 0
        dateext suffix '-20131023'
        glob pattern '-[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'
        glob finding old rotated logs failed
        fscreate context set to unconfined_u:object_r:httpd_sys_rw_content_t:s0
        renaming /var/www/html/NSLogs/access.log to /var/www/html/NSLogs/access.log-20131023
        disposeName will be /var/www/html/NSLogs/access.log-20131023.gz
        running postrotate script
        running script with arg /var/www/html/NSLogs/access.log: " /usr/bin/killall -HUP httpd "
        compressing log with: /bin/gzip
        removing old log /var/www/html/NSLogs/access.log-20131023.gz
        error: error opening /var/www/html/NSLogs/access.log-20131023.gz: No such file or directory
        rotating log /var/www/html/NSErrorLogs/err.log, log->rotateCount is 0
        dateext suffix '-20131023'
        glob pattern '-[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'
        glob finding old rotated logs failed
        fscreate context set to unconfined_u:object_r:httpd_sys_rw_content_t:s0
        renaming /var/www/html/NSErrorLogs/err.log to /var/www/html/NSErrorLogs/err.log-20131023
        disposeName will be /var/www/html/NSErrorLogs/err.log-20131023.gz
        running postrotate script
        running script with arg /var/www/html/NSErrorLogs/err.log: " /usr/bin/killall -HUP httpd "
        compressing log with: /bin/gzip
        removing old log /var/www/html/NSErrorLogs/err.log-20131023.gz
        error: error opening /var/www/html/NSErrorLogs/err.log-20131023.gz: No such file or directory
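    One detail worth keeping in mind when reading that output: logrotate's -d flag enables debug mode, which implies a dry run - nothing is actually renamed, compressed or removed, and the trailing "error opening ...gz: No such file or directory" lines are a side effect of that (the .gz files never really get created). A short sketch for forcing a real rotation and checking the state file on RHEL:

        logrotate -v -f /etc/logrotate.d/apache             # real run, verbose
        grep -i 'access\|err' /var/lib/logrotate.status     # when each log was last rotated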


  • Nginx location issue

    - by dave
    I'm trying to set a longer (30-day) 'expires' header for the images in the /misc-stuff/ directory. This is what I'm using for my site:

        # Serve static files directly from nginx
        location ~* \.(jpg|jpeg|gif|png|bmp|ico|pdf|flv|swf|exe|html|htm|txt|css|js) {
            add_header Cache-Control public;
            add_header Cache-Control must-revalidate;
            expires 7d;
        }

    I want to be able to keep that code in place to handle regular site images, but create a new block to handle the /misc-stuff/ directory. I have tried:

        location ^~ /misc-stuff/ { ... }

    The problem I'm having now is that my backup .php files in that directory show up as plain text if someone tries to access them. How do I set it up so ONLY .gif images in the /misc-stuff/ directory are affected?
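    A hedged sketch of one way to scope the longer expiry to .gif files only, so .php requests under /misc-stuff/ keep falling through to the normal PHP handling: a regex location limited to that path and extension. Since nginx tries regex locations in the order they appear, this block would need to sit above the existing static-files block inside the same server { } (untested against this exact config):

        # Place above the existing static-files location block.
        location ~* ^/misc-stuff/.*\.gif$ {
            add_header Cache-Control public;
            expires 30d;
        }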


  • pam_filter usage prevents passwd from working

    - by Henry-Nicolas Tourneur
    Hello everybody, I have PAM + LDAP over SSL running on Debian Lenny, and it works well. I always want to restrict who's able to connect; in the past I used pam_groupdn for that, but I recently ran into a situation where I have to accept 2 different groups. So I used pam_filter like this:

        pam_filter |(groupattribute=server)(groupattribute=restricted_server)

    The problem is that with this statement, passwd doesn't work anymore with LDAP accounts. Any idea why? Please find hereby some links to my config files: since serverfault.com only allows me to post 1 link, please find the link to the other conf files hereunder: http://pastebin.org/447148 Many thanks in advance :)


  • How can I free up some space in my C: drive?

    - by Faraaz
    Each time I try to save a file, I get a message from my computer (running Windows 7) asking me to free up some space on drive C before I can save the file. But the more I search for extraneous files to delete, the more frustrated I get. I simply can't find out what "extra" file(s) I have that are occupying about 20 GB of my C drive. As far as I know, I save all the downloadable stuff to my other drives, and most of what I do with my computer is just Internet browsing. Would you please help me find the file or files that have occupied so much space on my drive C so that I can remove them?

