Search Results

Search found 12017 results on 481 pages for 'no root'.

Page 227/481 | < Previous Page | 223 224 225 226 227 228 229 230 231 232 233 234  | Next Page >

  • Cannot access the EC2 server - permissions problems, ssh is dead

    - by user1494072
    One of our developers worked on a beta server and accidentally changed the permissions of the whole system to root (a recursive chmod on /). Because of that, services are unable to access files and we can't SSH into the machine (permission denied on the key). (UPDATE: SSH is dead after a reboot; it probably can't start.) Does Amazon have an option to browse the files or physically access the machine? Any other creative solution?
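
    Amazon doesn't give you physical or console file-browser access to the hardware, but if the instance is EBS-backed the root volume can be repaired from another instance. A rough sketch with the AWS CLI (the instance/volume IDs and device names are placeholders, and the exact permissions to restore depend on what the chmod changed):

        # Stop the broken instance and move its root volume to a healthy rescue instance
        aws ec2 stop-instances --instance-ids i-broken
        aws ec2 detach-volume --volume-id vol-root
        aws ec2 attach-volume --volume-id vol-root --instance-id i-rescue --device /dev/sdf

        # On the rescue instance: mount the volume and repair the pieces sshd cares about
        mount /dev/xvdf1 /mnt/rescue
        chmod 755 /mnt/rescue /mnt/rescue/etc /mnt/rescue/var
        chmod 600 /mnt/rescue/etc/ssh/ssh_host_*_key
        chmod 700 /mnt/rescue/root/.ssh && chmod 600 /mnt/rescue/root/.ssh/authorized_keys

    Then detach the volume, re-attach it to the original instance as its root device, and start the instance again.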

    Read the article

  • "Mail" command hangs and maillogs shows error

    - by harmony
    I tried this Linux command on my CentOS 5.x box:

        mail -s "mysubject" [email protected]

    Minutes pass and it doesn't finish; when I press Ctrl+C it says "(Interrupt -- one more to kill letter)". I also tried the command mail, which just reports "No mail for root". /var/log/maillog is empty, but /var/log/kloxo/maillog is full of messages like:

        Oct 25 17:28:17 vps qmail: 1382736497.255902 delivery 1246425: deferral: Uh-oh:_.qmail_has_prog_delivery_but_has_x_bit_set._(#4.7.0)/
        Oct 25 17:28:17 vps qmail: 1382736497.255915 status: local 1/10 remote 0/60

    I checked all my .qmail files and none of them has execute permissions. Any idea how to debug this, please?
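
    The deferral text "(#4.7.0) .qmail has prog delivery but has x bit set" is qmail refusing to run a program delivery (a line starting with |) from a .qmail file whose execute bit is set. Since the files already checked look clean, the offender may be a .qmail-default or a .qmail in a location that wasn't covered. A sketch for hunting them down (the search paths are guesses; adjust to wherever your .qmail files actually live):

        # List .qmail* files that have any execute bit set
        find /home /var/qmail -name '.qmail*' -perm /111 -ls 2>/dev/null

        # Clear the execute bit on whatever the search turns up
        find /home /var/qmail -name '.qmail*' -perm /111 -exec chmod a-x {} + 2>/dev/null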

    Read the article

  • How to allow only specific directories to use htaccess?

    - by DisgruntledGoat
    Currently in apache2.conf I have AllowOverride All set for /var/www, which simply allows .htaccess for all the sites on the server (Ubuntu 9.04). However, I'd rather only allow overrides in each site's root directory and nothing else. In other words, /var/www/site1, /var/www/site2, etc. can have an .htaccess, but all other directories, including /var/www and /var/www/site1/content, cannot. Is there a way to do this without having to write a rule for every site on the server?
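
    One possible approach (a sketch, not verified on 9.04's Apache): use wildcard <Directory> sections so the site roots get overrides while the parent and anything two levels deep do not. Because AllowOverride is inherited by subdirectories from the closest matching section, the third block below resets it for everything beneath the site roots:

        <Directory /var/www>
            AllowOverride None
        </Directory>

        <Directory /var/www/*>
            AllowOverride All
        </Directory>

        <Directory /var/www/*/*>
            AllowOverride None
        </Directory>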

    Read the article

  • Where does Chrome store its bookmarks in Ubuntu 11.10?

    - by Alan Wood
    I looked at all the other posts on this but can't find the directory mentioned (~/.config/google-chrome/Default/Bookmarks — it's a JSON file). Being a two-day newbie to Ubuntu/Linux, I would like to know whether the location has changed in the latest version, or if not, how I can locate the directory indicated. I have logged in as root and searched for the folder and can't find it, although I imported my bookmarks from an HTML file, so I know that they must be saved somewhere.
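
    ~/.config is a hidden directory, so it won't show up in a default file-manager view, and searching as root looks in root's home rather than yours. A quick check from a terminal as your normal user (the Chromium path is an assumption, in case Chromium rather than Google Chrome is installed):

        ls ~/.config/google-chrome/Default/Bookmarks   # Google Chrome
        ls ~/.config/chromium/Default/Bookmarks        # Chromium, if that is what is installed

        # Or search for it:
        find ~/.config -name Bookmarks 2>/dev/null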

    Read the article

  • Setting Up My Server to Do DNS On OpenSuse 11.3

    - by adaykin
    Hello, I am attempting to use my server as a DNS server. I am having trouble getting the domain set up. Here is what I have so far:

    /var/lib/named/master/andydaykin.com:

        $TTL 2d
        @ IN SOA andydaykin.com. root.andydaykin.com. (
            2011011000 ; serial
            0 ; refresh
            0 ; retry
            0 ; expiry
            0 ) ; minimum
        andydaykin.com. IN NS ns1.andydaykin.com.
        andydaykin.com. IN SOA ns1.andydaykin.com. hostmaster.andydaykin.com. (
        @.andydaykin.com. IN NS ns1.andydaykin.com.
        ns1.andydaykin.com. IN A 204.12.227.33
        www.andydaykin.com. IN A 204.12.227.33

    /etc/resolve.conf:

        search andydaykin.com
        nameserver 204.12.227.33

    /etc/named.conf:

        options {
            # The directory statement defines the name server's working directory
            directory "/var/lib/named";
            dump-file "/var/log/named_dump.db";
            statistics-file "/var/log/named.stats";
            listen-on port 53 { 127.0.0.1; };
            listen-on-v6 { any; };
            notify no;
            disable-empty-zone "1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.IP6.ARPA";
            include "/etc/named.d/forwarders.conf";
        };
        zone "." in { type hint; file "root.hint"; };
        zone "localhost" in { type master; file "localhost.zone"; };
        zone "0.0.127.in-addr.arpa" in { type master; file "127.0.0.zone"; };
        # Include the meta include file generated by createNamedConfInclude.
        # This includes all files as configured in NAMED_CONF_INCLUDE_FILES from /etc/sysconfig/named
        include "/etc/named.conf.include";
        zone "andydaykin.com" in {
            file "master/andydaykin.com";
            type master;
            allow-transfer { any; };
        };
        logging { category default { log_syslog; }; channel log_syslog { syslog; }; };

    What am I doing wrong?
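
    A few things stand out in the zone file as pasted: there are two SOA records, the second SOA's opening parenthesis is never closed, and "@.andydaykin.com." is not a valid owner name. Also, listen-on port 53 { 127.0.0.1; } means named only answers on loopback, so outside clients can't query it. A sketch of a minimal zone file (the timer values are conventional placeholders, not taken from the question; validate before use):

        $TTL 2d
        @    IN SOA ns1.andydaykin.com. hostmaster.andydaykin.com. (
                  2011011001 ; serial
                  2h         ; refresh
                  1h         ; retry
                  1w         ; expire
                  1d )       ; minimum
        @    IN NS  ns1.andydaykin.com.
        ns1  IN A   204.12.227.33
        www  IN A   204.12.227.33

    It may also help to validate the files before restarting named, e.g. with named-checkzone andydaykin.com /var/lib/named/master/andydaykin.com and named-checkconf /etc/named.conf.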

    Read the article

  • Need to open port 10000 for webmin and 21 for FTP in Centos?

    - by Abir Sepahvand
    Hi, how can I open these two ports in CentOS? I have used Webmin with Ubuntu before, but I never had to manually open any port. When I enter iptables -L I get output like this:

        Chain INPUT (policy ACCEPT)
        target     prot opt source               destination

        Chain FORWARD (policy ACCEPT)
        target     prot opt source               destination

        Chain OUTPUT (policy ACCEPT)
        target     prot opt source               destination
        [root@sachinvasudev test]#
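
    With empty chains and an ACCEPT policy, iptables isn't blocking anything, so if the ports are unreachable the services may simply not be listening (or an external firewall is in the way). A sketch of explicit rules anyway, harmless with the current policy (the save command assumes the classic CentOS 5/6 iptables init script):

        iptables -A INPUT -p tcp --dport 10000 -j ACCEPT   # Webmin
        iptables -A INPUT -p tcp --dport 21 -j ACCEPT      # FTP
        service iptables save

        # Check whether anything is actually listening on those ports:
        netstat -tlnp | grep -E ':(21|10000)\b'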

    Read the article

  • extracting files from tar

    - by shantanuo
    tar -xvf company_raw_2012-03-16.tgz --directory=/root/test --strip-components=4

    I am using the --strip-components option to remove the leading directories, and it works as expected:

        --strip-components NUMBER   strip NUMBER of leading components from file names before extraction

    However, it only works when I know that there are going to be 4 sub-directories. I have tar files where I do not know whether there will be 2, 3 or 4 folders inside. How do I strip the entire leading path and extract the files directly into the given --directory path?
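
    One possible sketch (GNU tar assumed): detect how many leading directories the archive has by looking at its first regular file, then pass that number to --strip-components. This works when all files sit under the same folder chain, which matches the "2, 3 or 4 folders inside" case; archives with files at mixed depths would need a different approach.

        archive=company_raw_2012-03-16.tgz
        # Count path components of the first non-directory entry in the listing
        depth=$(tar -tzf "$archive" | grep -v '/$' | head -n 1 | awk -F/ '{print NF-1}')
        tar -xzf "$archive" --directory=/root/test --strip-components="$depth"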

    Read the article

  • Nginx reverse proxy error page

    - by Lormayna
    I'm using nginx as a reverse proxy for a single machine. I would like to show an error page when the backend machine goes down. This is my configuration file:

        server {
            listen 80;
            access_log /var/log/nginx/access.log;
            root /var/www/nginx;
            error_page 403 404 500 502 503 504 /error.html;

            location / {
                proxy_pass http://192.168.1.78/;
                include /etc/nginx/proxy.conf;
            }
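
    When the backend is completely down, nginx generates its own 502 and the server-level error_page applies; but for error responses that the backend itself returns, nginx relays them unchanged unless told otherwise. A sketch of the usual pattern (assumes /var/www/nginx/error.html exists):

        location / {
            proxy_pass http://192.168.1.78/;
            include /etc/nginx/proxy.conf;
            proxy_intercept_errors on;      # let error_page handle upstream 4xx/5xx too
        }

        location = /error.html {
            root /var/www/nginx;
            internal;
        }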

    Read the article

  • openssh sftp chroot

    - by Zulakis
    I chrooted a user to the directory /var/www/upload using ChrootDirectory /var/www/upload in my /etc/ssh/sshd_config. The permissions of all the files in /var/www/upload are 755 and the owner is root:upload_user. However, I still cannot modify the files (I get a permission-denied error). I could create a subdirectory owned by upload_user:upload_user, but is it, by any means, possible to allow my chrooted user to write to his / directory?
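
    OpenSSH requires the ChrootDirectory itself (and every component of its path) to be owned by root and not group- or world-writable, so sshd will refuse the session if the chroot root is made writable by the user. The usual pattern is therefore a root-owned chroot with a writable subdirectory inside it. A sketch, using the paths from the question:

        chown root:root /var/www/upload
        chmod 755 /var/www/upload
        mkdir -p /var/www/upload/files
        chown upload_user:upload_user /var/www/upload/files

        # /etc/ssh/sshd_config (excerpt)
        # Match User upload_user
        #     ChrootDirectory /var/www/upload
        #     ForceCommand internal-sftp

    The user then lands in a read-only / and uploads into /files.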

    Read the article

  • Slow Transfer Speeds from KVM host to client

    - by indian maiden
    I am trying to isolate the root cause of slow transfer speeds from my host OS to a KVM client. Both are Linux.

    Rsync on the host (192.168.1.72):

        rsync -auv --progress rut3.img /tmp/                 [54.09MB/s]

    Rsync to the client:

        rsync -auv --progress rut3.img 192.168.1.80:/tmp/    [25.52MB/s]

    I realize that there will be some TCP overhead on the transfer, but over 50%? Can someone enlighten me on what could be slowing down the transfers so much?
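
    Two quick checks that may help narrow it down (a sketch; the guest name below is a placeholder): measure raw network throughput separately from the ssh/rsync overhead, and see whether the guest is using an emulated NIC instead of virtio, which is a common cause of roughly halved throughput.

        # On the guest:
        iperf -s
        # On the host:
        iperf -c 192.168.1.80

        # Inspect the guest's NIC model (e1000/rtl8139 = emulated, virtio = paravirtualised)
        virsh dumpxml GUEST_NAME | grep -A3 '<interface'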

    Read the article

  • Windows 7 booting and startup repair issues

    - by aardvark
    I have an MSI FR720 laptop with Windows 7 and Lubuntu partitions. For quite a while (6 months or so) I've been having issues booting from my hard drive; it could take between 5 minutes and several hours for it to recognize the hard drive as a bootable device. I did several disk checks on it and my hard drive seems in perfect condition, and the fact that booting would usually only work after removing the hard drive and re-seating it in its slot, or lightly shaking it, makes me think it had something to do with the connection in the hard drive slot as opposed to the hard drive itself.

    I was having particular issues with it detecting the hard drive today, so I decided to try booting it from an external hard drive dock. It detected it on the first try and so far has had no problems finding the bootable partitions on my hard drive. When I selected my Windows 7 partition from the boot menu it said that it hadn't been shut down properly last time and needed startup repair. I've done this several times over the last 6 months, so this is hardly unusual. I do startup repair, it fails, and then I try to do a system restore. The system restore also failed, and it says that no files were changed. I restart and try it again.

    However, this time when I get to the startup repair it's not detecting a Windows OS. I tried clicking next and doing a startup repair, but the repair always fails. If I ignore the startup repair option and instead select "Launch Windows normally", it gets to the Windows animation, stops halfway through and then crashes into a BSoD. I can't read the error on the screen because it immediately switches to black and tries to restart. This is my first time asking a question like this online, so let me know if I need to provide any extra information and I'll do my best to give it.

    I tried using diskpart to find the list of partitions and see if one is labelled as an active partition, but it says that no disks were detected. I can run Lubuntu just fine. I can also see all of my Windows 7 files from it.

    EDIT: The startup repair diagnosis and repair log is this:

        --
        Number of repair attempts: 1

        Session details
        System Disk =
        Windows directory =
        AutoChk Run = 0
        Number of root causes = 1

        Test Performed:
        Name: Check for updates
        Result: Completed successfully. Error code = 0x0
        Time taken = 15ms

        Test Performed:
        Name: System disk test
        Result: Completed successfully. Error code = 0x0
        Time taken = 31ms

        Root cause found.
        If a hard disk is installed, it is not responding.
        --

    Any chance that this is a result of me doing this through an external dock over USB?

    Read the article

  • mod_secdownload in lighttpd support subdirectories for secure stream?

    - by zomail
    I want to know whether lighttpd's mod_secdownload supports secure streaming for subdirectories. I want to secure the subdirectories within a directory, but it does not seem to work on subdirectories. I want to secure the subdirectories within the download-area directory given below:

        secdownload.secret        = "MySecretSecurePassword"
        secdownload.document-root = "/home/lighttpd/download-area/"
        secdownload.uri-prefix    = "/dl/"
        secdownload.timeout       = 3600
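
    mod_secdownload computes the token over the path relative to secdownload.document-root, so subdirectories are supported as long as the full relative path (including the subdirectory) goes into both the hash and the URL. A sketch of generating such a URL in shell, assuming lighttpd 1.4's classic scheme of md5(secret + path + hex-timestamp); check the mod_secdownload docs for your version before relying on it:

        secret="MySecretSecurePassword"
        rel_path="/videos/clip.mp4"          # path below document-root, leading slash included (example name)
        ts=$(printf '%x' "$(date +%s)")      # timestamp in lowercase hex
        token=$(printf '%s%s%s' "$secret" "$rel_path" "$ts" | md5sum | cut -d' ' -f1)
        echo "/dl/$token/$ts$rel_path"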

    Read the article

  • How to append to a file as sudo? [closed]

    - by obvio171
    Possible Duplicate: sudo unable to write to /etc/profile

    I want to do:

        echo "something" >> /etc/config_file

    But since only the root user has write permission to this file, I can't do that. This:

        sudo echo "something" >> /etc/config_file

    also doesn't work. Is there any way to append to a file in that situation without having to first open it with a sudo'd editor and then append the new content by hand?
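
    The reason the second command fails is that the >> redirection is performed by your own (non-root) shell before sudo ever runs, so only echo is elevated. Two common workarounds:

        # Run the redirection inside a root shell:
        sudo sh -c 'echo "something" >> /etc/config_file'

        # Or pipe through tee with the append flag (tee runs as root, the shell does not):
        echo "something" | sudo tee -a /etc/config_file > /dev/null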

    Read the article

  • "apache2ctl command not found" appears when invoking apache2ctl

    - by OC2PS
    I am running Apache 2.4.6 on my CentOS 6.4 server. Having some trouble with rewrite rules, so I was trying to check the loaded modules:

        apache2ctl -M

    But that returns "apache2ctl command not found". So I tried which apache2ctl and I get:

        no apache2ctl in (/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)

    I am sure Apache is installed and running. How do I find apache2ctl / check the loaded modules now?
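
    apache2ctl is the Debian/Ubuntu name for the control script; on a Red Hat style system, or an Apache built from source, the same script is usually called apachectl and the daemon binary is httpd. A few things to try (the /usr/local/apache2 path is only an assumption about a typical source-compiled 2.4.x layout):

        apachectl -M
        httpd -M
        /usr/local/apache2/bin/apachectl -M

        # If you still can't find it:
        find / -name 'apachectl' -o -name 'apache2ctl' 2>/dev/null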

    Read the article

  • scp to remote server with sudo

    - by NP
    I have a file on server A which needs to get to server B, into an area where I only have sudo access (i.e. I have a user account that has root privileges via sudo). What is the syntax for the scp command?
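
    scp itself has no sudo option, so the usual approach is either a two-step copy (to somewhere writable, then sudo mv) or streaming the file over ssh into sudo tee. A sketch, with hostnames and paths as placeholders:

        # Two-step: copy somewhere writable on server B, then move it with sudo.
        scp /path/on/A/file.tar.gz user@serverB:/tmp/
        ssh -t user@serverB 'sudo mv /tmp/file.tar.gz /restricted/area/'

        # One-step: stream into sudo tee on the remote side
        # (assumes passwordless sudo, since stdin is taken by the pipe).
        cat /path/on/A/file.tar.gz | ssh user@serverB 'sudo tee /restricted/area/file.tar.gz > /dev/null'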

    Read the article

  • Best practice guide to install to Program Files

    - by Cold T
    I have seen quite a few questions on Server Fault and Super User, but none that specifically answers my question. We have an application that is being provided and installed by a third-party company. They are charging a market-rate 'consultancy' fee to do this. They installed the majority of the folders in the root of the C: drive, to my shock. Are there any official Microsoft best-practice guides out there that say applications should be installed in Program Files?

    Read the article

  • File permissions on a dedicated server [duplicate]

    - by Niet the Dark Absol
    This question already has an answer here: What permissions should my website files/folders have on a Linux webserver? (4 answers)

    I have a dedicated server for my website. There are no other users, and no other websites on the same machine. Is there any risk in setting 777 permissions on my site's public_html folder, bearing in mind that configuration files with passwords and access keys are stored outside that root?
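
    Even with a single user, 777 lets the web-server process (and any compromised script it runs) create or overwrite anything under public_html. The conventional tighter defaults are 755 for directories and 644 for files, with write access granted only where the application actually needs it. A sketch (the paths, and the www-data user, are examples to adapt):

        find /home/user/public_html -type d -exec chmod 755 {} +
        find /home/user/public_html -type f -exec chmod 644 {} +

        # Grant write access only where the app genuinely needs it, e.g. an uploads directory
        chown -R www-data:www-data /home/user/public_html/uploads
        chmod 775 /home/user/public_html/uploads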

    Read the article

  • How to give user read/write access to folders?

    - by Will
    I'm running a certain script that uses a non-root user to do the following:

        mkdir: cannot create directory `/srv/www/example.com/releases'
        *** [err :: 12.23.45.789] : Permission denied

    How would I give user xyz permanent permission to do this while still keeping the web server secure? Also, is it possible to make it recursive for all subfolders? I know it's probably chmod something, but I'm not that Linux-savvy. Thanks.
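
    The usual options are to make the deploy user the owner of the site tree, or to use a shared group if the web server also needs access. A sketch, using the user and path from the error message (the group name is hypothetical):

        # Option 1: give the deploy user ownership of the tree (recursive).
        chown -R xyz:xyz /srv/www/example.com

        # Option 2: shared group, so both the deploy user and the web server can use it.
        groupadd deployers
        usermod -aG deployers xyz
        chgrp -R deployers /srv/www/example.com
        chmod -R g+rwX /srv/www/example.com
        chmod g+s /srv/www/example.com    # new files/dirs inherit the group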

    Read the article

  • unable to connect site to different port

    - by JohnMerlino
    I have a domain registered at GoDaddy named http://mysite.com/. I logged into GoDaddy and went to All Products > Domains > Domain Management. I clicked on the appropriate domain and it took me to the Domain Details page. I clicked Launch under DNS Manager and it took me to the Zone File Editor. I noticed that notify.mysite.com was pointing to the IP address of a dead server, so I switched it to an operating server. Then I pinged the domain to see where it was pointing, and it was correctly pointing to the working server.

    So I copied the default configuration under sites-available (sudo cp default notify.mysite.com) and then edited it to point to a different document root and serve files on a different port:

        Listen 1740
        Listen 64.135.xx.xxx:1740
        # I also tried this as well:
        NameVirtualHost 64.135.xx.xxx:1740

        <VirtualHost 64.135.xx.xxx:1740>
            ServerAdmin [email protected]
            ServerName notify.mysite.com

            DocumentRoot /var/www/test/public
            <Directory /var/www/test/public>
                Order allow,deny
                allow from all
            </Directory>

            ErrorLog ${APACHE_LOG_DIR}/error.log
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    Then I enabled the virtual host, went to the document root, added an index.html file with some text in it, and restarted Apache. The restart gave no errors. Then I typed the correct domain in the URL bar, http://notify.mysite.com:1740/, and I get:

        Oops! Google Chrome could not connect to notify.mysite.com:1740

    Somehow it took out all my other sites. Now even the ones that were responding on port 80 are no longer responding, even though I did not touch their virtual hosts. I get this message now:

        Oops! Google Chrome could not connect to mysite.com

    However, ping responds:

        ping mysite.com
        PING mysite.com (64.135.12.134): 56 data bytes
        64 bytes from 64.135.12.134: icmp_seq=0 ttl=49 time=20.839 ms
        64 bytes from 64.135.12.134: icmp_seq=1 ttl=49 time=20.489 ms

    The result of telnet:

        $ telnet guarddoggps.com 80
        Trying 64.135.12.134...
        telnet: connect to address 64.135.12.134: Connection refused
        telnet: Unable to connect to remote host
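
    Connection refused on every port, while ping still answers, usually means Apache is no longer running at all rather than a name-based vhost problem. A common cause is the new Listen/NameVirtualHost lines conflicting with the Listen directives already in ports.conf, so the restart fails after the old process has been stopped; a firewall dropping 1740 is another possibility. A hedged set of checks (Ubuntu-style paths assumed, since the config uses sites-available):

        # Did the config parse, and is Apache actually running?
        sudo apache2ctl configtest
        sudo service apache2 restart
        sudo netstat -tlnp | grep apache        # should list both :80 and :1740

        # Why did the restart fail?
        sudo tail -n 50 /var/log/apache2/error.log

        # Is anything filtering the new port?
        sudo iptables -L -n | grep 1740

    Keeping the Listen/NameVirtualHost directives in ports.conf, rather than duplicating them in the vhost file, avoids the "Address already in use" style of startup failure.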

    Read the article
