Search Results

Search found 14709 results on 589 pages for 'root permission'.


  • Exclude a process from being listed in `top`

    - by warren
    Is it possible to exclude some processes from being reported by top? For example, I would like top to exclude itself from its listing (i.e., I don't want top to show up in the process list). I would also like to be able to exclude processes that do not belong to the user running top (except for root). Is this possible? If so, how? If not, is there a similar tool that will do what I want (one that does not involve running something like ps repeatedly)?
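    A sketch of what stock procps top can do (it has no "exclude PID" option that I know of, but batch mode makes its output filterable):

    ```sh
    # One snapshot to stdout; grep -v then drops any line mentioning top
    # itself (this also removes the "top - ..." summary header line):
    top -b -n 1 | grep -vw top

    # Showing only one user's processes is built in:
    top -b -n 1 -u someuser
    ```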

  • How to downgrade a kernel?

    - by JATMON
    I need to downgrade the kernel from 2.6.32-358.6.2.el6.centos.plus.x86_64 to 2.6.32-220.el6.x86_64. I am unable to install the older version using yum/rpm, as it gives the following error:

        [root@localhost kernels]# rpm -i --ignoreos kernel-2.6.32-220.el6.x86_64.rpm
        warning: kernel-2.6.32-220.el6.x86_64.rpm: Header V4 DSA/SHA1 Signature, key ID 192a7d7d: NOKEY
        package kernel-2.6.32-279.el6.x86_64 (which is newer than kernel-2.6.32-220.el6.x86_64) is already installed
        package kernel-2.6.32-358.6.1.el6.centos.plus.x86_64 (which is newer than kernel-2.6.32-220.el6.x86_64) is already installed
        package kernel-2.6.32-358.6.2.el6.centos.plus.x86_64 (which is newer than kernel-2.6.32-220.el6.x86_64) is already installed

    I can't remove the currently running kernel, so what's the way out? A yum search doesn't even find this old version, so I had to get the rpm from the web. Any help is much appreciated.
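    A sketch of the usual way out (hedged: --oldpackage is the flag that waives rpm's "newer ... already installed" check, and kernel packages are designed to coexist, so nothing has to be removed first):

    ```sh
    # Install the older kernel alongside the running one:
    rpm -ivh --oldpackage kernel-2.6.32-220.el6.x86_64.rpm

    # Point default= in /boot/grub/grub.conf at the new entry (entries are
    # counted from 0), reboot into it, and only then remove newer kernels:
    rpm -e kernel-2.6.32-358.6.2.el6.centos.plus.x86_64
    ```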

  • Joomla on Windows is intercepting access to a virtual directory where I placed my MVC application

    - by Romias
    In our Windows hosting we use the root (wwwroot) folder to host a Joomla website as the public website. This is running IIS 7. Then we created a virtual directory called "App" to host an ASP.NET MVC4 application. When I enter www.mydomain.com it shows the Joomla website correctly. When I enter www.mydomain.com/App/ it somehow does reach my MVC app, as I see the URL changing to www.mydomain.com/App/Account/LogOn?ReturnUrl=%2fApp%2f, BUT it shows a 404 Joomla error as if it were looking up that URL in Joomla. BTW, the hosting has two ASP.NET IIS setup options: 4.0 Classic and 4.0 Integrated. Using the Integrated one, it displays a blank page; using the Classic one, it shows the 404 Joomla page. Any idea where to look for this?

  • Can a Camel Jetty proxy URL share the web application context?

    - by user1750353
    I am struggling with Camel Jetty proxy routes. At times the routes exhibit inconsistent behavior. My proxy app is deployed with the context root "Proxy"; however, if I give that as the context path for my proxy URL, I get a "service not found" error. If I change "Proxy" to an arbitrary context such as "Dummy", then the route works. Is that how the camel-jetty component works?

        From path: jetty:http://0.0.0.0:6080/Proxy/PurchaseOrder/?matchOnUriPrefix=true&disableStreamCache=true&traceEnabled=true
        To path:   jetty:http://localhost:7001/Provider/PurchaseOrder/?bridgeEndpoint=true&throwExceptionOnFailure=false

    Another issue I noticed with Jetty: if I deploy both Proxy and Provider on the same app container (same listening port), then the route completely stops working, saying "Provider/PurchaseOrder/" service not found. The only way the routes work is if the two routes run on different ports and the from-route doesn't share the web context path. I have a requirement that, if needed, I should be able to run both Proxy and Provider on the same container. Any help appreciated. Thanks,

  • What TLDs should I use for my NS records for redundancy? (DNSSEC support required)

    - by makerofthings7
    Question: As a general practice, is it a good idea to use multiple TLDs for the name servers? How should I choose which TLD would be a good candidate for hosting my NS names?

    More info: I am switching over 800 DNS zones to an outsourced DNS provider. I originally planned on naming the name servers nsX.company.com, but think it would be best to spread them over multiple TLDs such as .net, .org and .info. Since I plan on supporting DNSSEC at company.com, I think all the first-tier name servers must support it as well. Part of the inspiration for this question came from our provider, UltraDNS. In the configuration screen for our domains, they actively verify and alert us if our name servers aren't exactly:

        pdns1.ultradns.net
        pdns2.ultradns.net
        pdns3.ultradns.org
        pdns4.ultradns.org
        pdns5.ultradns.info
        pdns6.ultradns.co.uk
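    A quick way to check candidate TLDs from the shell (standard BIND dig): a TLD that supports DNSSEC has a DS record in the root zone, and the same query verifies your own delegation once the registrar publishes your DS. Empty output means the zone is not securely delegated:

    ```sh
    for tld in com net org info co.uk; do
        printf '%-6s DS: ' "$tld"
        dig +short DS "$tld." | head -n 1
    done
    dig +short DS example.com.   # your own zone, once a DS is published
    ```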

  • What options exist when the vendor does not supply an ADB driver for an Android device?

    - by STATUS_ACCESS_DENIED
    So I bought an Android phone and the vendor does not offer any drivers whatsoever. The Android SDK and the drivers that come with it don't seem to work with the device, but the device itself reports as Android 2.2.1. Other users have reported that the Nook Color drivers worked for them, but after trying I cannot confirm this. What options do I have to connect to the device (and ultimately to root it)? Is it truly just the .inf file that I need to manipulate in order to make the device ID known to Windows? After all, there are tools to figure out those strings while the device is connected (although "unknown").
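    One option, plainly a platform swap rather than the Windows .inf route: on a Linux box adb needs no vendor driver, only a udev rule carrying the device's USB vendor ID (18d1 below is a placeholder; read the real ID from lsusb):

    ```sh
    lsusb    # note the "ID vvvv:pppp" pair of the unknown device
    echo 'SUBSYSTEM=="usb", ATTR{idVendor}=="18d1", MODE="0666"' | \
        sudo tee /etc/udev/rules.d/51-android.rules
    sudo udevadm control --reload-rules
    adb kill-server && adb devices    # the device should now be listed
    ```

    On Windows, the equivalent information is the hardware ID shown in Device Manager, which is what gets added to the driver's .inf file.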

  • Ideal permissions scheme for multiple Apache/PHP sites...

    - by Omega
    I'm hosting multiple sites from one server, where each site has its own user and a www directory in its home dir. Currently our web server runs as user nobody(99). We're noticing that several popular scripts and engines require write access to their own files in order to run. As the home directory is owned by the user, not nobody(99), what is the best policy or change in hosting configuration that would (1) make it so that all the various engines and platforms work, and (2) still allow us to work with files and edit them without having to fiddle with permissions as root? Thanks for the advice!
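    One common compromise, sketched with example names (site1/webgrp are assumptions; per-site PHP processes via suexec or FastCGI are the longer-term fix): put the web server user and each site user in a shared group, and let new files inherit it via setgid directories:

    ```sh
    groupadd webgrp
    usermod -a -G webgrp nobody     # the Apache user from the question
    usermod -a -G webgrp site1
    chown -R site1:webgrp /home/site1/www
    find /home/site1/www -type d -exec chmod 2775 {} \;   # rwxrwsr-x, setgid
    find /home/site1/www -type f -exec chmod 0664 {} \;   # rw-rw-r--
    ```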

  • Two sites running same code and different config files?

    - by Gen
    I have a Windows 2008 R2 server with IIS, running one site (ASP.NET 4.5). All the parameters are written in the web.config file. I have to add a new site that will run on the same code (same root folder) as the first one, but will read parameters like SQL connection strings etc. from its own config file, not from the first site's web.config. How can I do that? Is it possible to run the second site in a different app pool? Both sites will run the same .NET version, of course. Thank you!

  • Configuration of an alert on Windows Server 2003

    - by Ferre06
    I'm trying to configure an alert for low disk space on Windows Server 2003, and I have already followed Microsoft's step-by-step tutorial. The alert should execute a .bat file I created, located in the home folder of the user I'm using. I set it to trigger when free space falls below 6 GB, and the "Sample data interval" is the default (5 seconds). The problem is that the alert isn't triggered even when the disk has less than 6 GB free. One more thing: the user configured for the alert isn't the Administrator account itself, but it does have administrative privileges. Thanks in advance

  • How to set up Apache 2 to serve only subdirectories

    - by Lynden Shields
    I have 3 sites which need to be hosted on a web server (Apache 2 from the repo, running on Ubuntu 12.04). They are each in their own subdirectory within /var/www/. I would like Apache to serve files from the relevant directories only when the directory name is given in the URL, but not serve the /var/www/ directory itself. E.g.: http://1.2.3.4/site1/ should work and serve the index from /var/www/site1/index.html, but http://1.2.3.4/ should not serve anything. Currently I can't get the URLs to point to the directories: either http://1.2.3.4/ serves everything within /var/www/ (including /var/www/site2/secretstuff/), or the root http://1.2.3.4/ serves one of the subdirectories (/var/www/site1/). This is unacceptable: site1 needs Indexes enabled but the others must not have it. I just want site1's config to respond only to requests of the form http://1.2.3.4/site1/* and not handle requests of the form http://1.2.3.4/. I do not have a domain name set up, so I can't use subdomains.
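    A config sketch for Apache 2.2 as shipped on Ubuntu 12.04: deny the document root itself and open each subdirectory explicitly; more specific <Directory> sections override the parent. The include path and site names are assumptions; it is written as a here-document so it stays one shell block:

    ```sh
    cat > /etc/apache2/conf.d/split-sites.conf <<'EOF'
    <Directory /var/www>
        Order deny,allow
        Deny from all
    </Directory>
    <Directory /var/www/site1>
        Options +Indexes
        Order allow,deny
        Allow from all
    </Directory>
    <Directory /var/www/site2>
        Options -Indexes
        Order allow,deny
        Allow from all
    </Directory>
    EOF
    apache2ctl graceful
    ```

    With this in place, http://1.2.3.4/ returns 403 while http://1.2.3.4/site1/ is served with indexes; a third <Directory> block covers site3 the same way as site2.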

  • Pop current directory until specific file is found

    - by edarroyo
    I'm interested in writing a script with the following behavior: (1) see if a file, build.xml, exists in the current directory, and if so, run a command supplied through arguments to the script; (2) if not, pop the current directory and look at the parent; (3) go to (1). The script would end once we find the file or we reach the root. Also, once the script finishes and control comes back to the user, I want the current directory to be the one where the script was initially launched. I'm not very familiar with shell scripting, but any help/guidance will be much appreciated.
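    A minimal sketch of such a script (call it, say, upfind.sh): because it runs as its own process, the caller's working directory is untouched when it exits:

    ```sh
    #!/bin/sh
    # Walk up from the current directory until build.xml is found,
    # then run the command given as arguments in that directory.
    while [ ! -f build.xml ]; do
        if [ "$PWD" = / ]; then
            echo "build.xml not found in any parent directory" >&2
            exit 1
        fi
        cd ..
    done
    exec "$@"    # e.g.: ./upfind.sh ant compile
    ```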

  • Pointing a subdomain at a file

    - by Seva Alekseyev
    I switched hosting recently (Linux, cPanel, WHM). At the old host there was a subdomain that had a file (instead of a directory) as its root. The file was a CGI script. That subdomain was created via cPanel by me a while ago. At the new host I'm trying to recreate this subdomain, and I get the following error: "The directory, /home/(...)/cgi-bin/guest.cgi could not be created." Is there a tweak somewhere that enables this functionality? EDIT: I tried to repeat the trick on the old site, and I could not. Maybe a cPanel update broke it?

  • PDF Icon changes to blank in Dropbox folder

    - by Windows8Fanatic
    Strangely enough, the PDF icon gets corrupted in my Dropbox folder on my Windows 8 machine. I am using Windows 8 x64 Pro. If I change "open with" to some other reader and then back to Adobe Acrobat Reader, it shows the PDF icon and a preview of the PDF file. But somehow it magically corrupts again sometime later and the PDF file gets a blank icon. Possibly corruption of thumbs.db in Dropbox during synchronization? The affected files are in the root of my Dropbox folder.

  • Unable to SSH into ESXi 4.1 host - "Access Denied"

    - by Andrew White
    I am unable to SSH into an existing ESXi server. I have a user which is in the "root" and "users" groups and is able to connect via vSphere. However, after enabling "Remote support (SSH)", attempting to connect with PuTTY and entering my username/password when prompted, I am presented with an "Access Denied" message. I have run through the options presented in this KB article to no avail. I have temporarily taken down the firewall on the machine (it is remote) to check if this helped; no change. The username/password are definitely correct, and I obviously have connectivity if I am presented with the username/password prompt. I am at a bit of a loss what else I can try. Thanks all
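    Two diagnostic steps worth trying (the second is hedged, from memory of the 4.x vSphere client):

    ```sh
    # -v separates a rejected password from a session that is refused
    # after authentication succeeds:
    ssh -v username@esxi-host
    ```

    If authentication itself succeeds, check that the account has "Grant shell access" enabled in the vSphere client's Users & Groups tab on the host; group membership alone is reportedly not enough on 4.1.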

  • Squid randomly stops serving requests. How can I resolve this issue?

    - by Vijay
    The Squid (2.7) proxy that I have running on Ubuntu 8.10 stops accepting new requests after being online for a while, for reasons that I can't discover. However, doing a squid -k reload resolves the problem immediately. For now I run this command manually: I monitor the log, and if I don't see any activity for 5 minutes I reload the config. On my quest for a solution I had several ideas: (1) diagnose the root cause and eliminate it; (2) set up a script to automatically reload the config if there are no new entries in access.log for the past 3 minutes; (3) painstakingly upgrade the server to a newer Ubuntu version, keeping the network offline or working during off hours to minimize downtime. So I thought I would turn to you for solutions to option (2), as I do not understand Squid well enough for (1), and I'm avoiding (3) as long as I can. Any ideas?
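    A sketch of option (2), not from the original thread; the log path assumes the Ubuntu squid package defaults:

    ```sh
    #!/bin/sh
    # Watchdog: reload squid whenever access.log has not grown for 3 minutes.
    LOG=/var/log/squid/access.log
    while true; do
        sleep 180
        # find prints the path only if the file's mtime is older than
        # 3 minutes, i.e. no request was logged in that window
        if [ -n "$(find "$LOG" -mmin +3)" ]; then
            squid -k reload
            logger "squid watchdog: reloaded config after 3 idle minutes"
        fi
    done
    ```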

  • Windows/Samba connection error

    - by Gomibushi
    I have a Linux file server serving up /home for Linux and Windows users. I was able to connect from my Windows client, but not from a DC; then suddenly I could connect from the DC too. The Linux servers run Centrify clients and as such are part of the domain. All machines are on the same subnet. This is what log.smbd says, repeatedly:

        [2010/02/11 11:25:57, 0] lib/util_sock.c:read_data(534)
        read_data: read failure for 4 bytes to client 192.168.200.3. Error = Connection reset by peer

    On Windows it appeared as an "unknown error". EDIT: the error code is "0x80004005". We are developing a system that depends on the Samba share and are worried this will appear again. It would be nice to pinpoint the root of this. Any ideas what this might be? Places to look?
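    Some data worth capturing when it recurs (a sketch; all standard Samba tooling):

    ```sh
    testparm -s                          # validate smb.conf
    smbstatus -b                         # list live sessions
    tail -f /var/log/samba/log.smbd &    # watch for the reset as it happens
    smbclient -L //fileserver -U user    # reproduce the connection from a client
    ```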

  • Default document not working after installing SP1 on Windows 2008 R2 x64

    - by boredgeek
    We have a web site that should only be available to authorized users, so we deny anonymous access to the site. However, we do allow anonymous access to the default page and the login page. When we installed SP1 the behavior of the server changed: now, if a user tries to access the root of the site, say http://mysite.com, she is redirected to the login page rather than the default page. Is there a hotfix to bring back the previous behavior?

  • How to compile gcc-4.0 on Mountain Lion

    - by Frizlab
    So far I've successfully run the configure step, but when I type make I get the following error after some time (a lot compiles successfully first):

        ld: unknown/unsupported architecture name for: -arch i686
        /usr/bin/libtool: internal link edit command failed
        make[2]: *** [libgcc_s.dylib] Error 1
        make[1]: *** [libgcc.a] Error 2
        make: *** [all-gcc] Error 2

    Is there a way to tell gcc not to compile itself for the i686 architecture? Here's my uname -a in case it helps:

        Darwin Frizlabs-Computer.local 12.2.0 Darwin Kernel Version 12.2.0: Sat Aug 25 00:48:52 PDT 2012; root:xnu-2050.18.24~1/RELEASE_X86_64 x86_64

    PS: I know gcc-4.0 is ancient, but I do need it.
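    One thing worth trying (hedged; I have not verified it against a gcc-4.0 tree on Mountain Lion): the failing step is the 32-bit multilib build of libgcc, and gcc's top-level configure has a switch to skip multilibs entirely. A clean build directory is safest:

    ```sh
    mkdir build && cd build
    ../configure --prefix=/usr/local/gcc-4.0 --disable-multilib
    make
    ```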

  • How to tell rsync not to touch the destination directory permissions?

    - by Sorin Sbarnea
    I am using rsync to sync a directory from one machine to another, but I have encountered the following problem: the destination directory's permissions are altered.

        rsync -ahv defaults/ root@hostname:~/

    The problem is that the permissions and ownership of the defaults folder itself get applied to the destination folder. I do want to keep the permissions for the files and subdirectories, but I don't want the source directory's own permissions applied to the destination directory. Also, I do not want to remove any existing files from the destination (only update them if needed), but I think the current settings are already OK in that regard. How can I do this?
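    Two workaround sketches (rsync has no single flag for exactly this, as far as I know):

    ```sh
    # 1) Transfer the *contents*, so the attributes of defaults/ itself are
    #    never in the file list; the second glob catches dotfiles (drop it
    #    if defaults/ has none, or the shell passes it through unmatched):
    rsync -ahv defaults/* defaults/.??* root@hostname:~/

    # 2) Keep the original command and restore the destination afterwards
    #    (0700/root:root are examples; use whatever ~ had before):
    rsync -ahv defaults/ root@hostname:~/ \
      && ssh root@hostname 'chmod 700 ~ && chown root:root ~'
    ```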

  • Cannot start Apache 2.2.22 on Fedora 15

    - by Roderik
    I am trying to start Apache 2.2.22 under Fedora 15 on my local machine. After fixing some errors related to missing modules, httpd -t just gives me "Syntax OK". However, when I try to start Apache as the root user with service httpd start, it still returns:

        Starting httpd (via systemctl): Job failed. See system logs and 'systemctl status' for details.
        [FAILED]

    When checking systemctl I don't see any extra information other than:

        httpd.service loaded failed failed LSB: start and stop Apache HTTP Server

    So I wonder where to look next to get this back up and running.
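    Places where the real error usually shows up (systemctl's one-line status hides it):

    ```sh
    systemctl status httpd.service        # full status, not just loaded/failed
    tail -n 50 /var/log/httpd/error_log   # module and vhost errors land here
    httpd -X                              # foreground, single process: prints
                                          # startup errors straight to the terminal
    ```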

  • Java and Sendmail HELO requires domain address

    - by ealgestorm
    I am trying to set up emailing from a Java web application hosted under Apache on a Linux server (CentOS). Sendmail works fine from the command line as root on localhost, but when the Java web app (on the same server, also via localhost) tries to send email, the following Java exception is thrown:

        501 5.0.0 HELO requires domain address

    EDIT: I have read that some people have found this is due to an incorrect hosts entry. Currently the hosts file contains:

        127.0.0.1 Centos-VPS localhost.localdomain localhost

    I'm not sure what the Centos-VPS bit at the start is for, but this is a client's hosted server, so I don't really want to break stuff. EDIT: see, the RFC is helpful... "501 Syntax error in parameters or arguments". Now I know what the problem is! (Note the sarcasm, people.)
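    A sketch of the usual fix: the JavaMail client announces the machine's own hostname in HELO, and the local sendmail rejects it because that name has no domain part. Giving the box a fully-qualified name fixes the lookup (example.com is a placeholder for the client's real domain):

    ```sh
    hostname Centos-VPS.example.com
    # and make /etc/hosts agree, keeping the short name as an alias
    # (shown as a comment, not scripted):
    #   127.0.0.1  Centos-VPS.example.com  Centos-VPS  localhost.localdomain  localhost
    ```

    Alternatively, JavaMail can be told what name to announce without touching the host at all, via the session property mail.smtp.localhost.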

  • How to automatically copy a file uploaded by a user via FTP on Linux (CentOS)?

    - by Buttle Butkus
    An outside contractor says they need read/write/execute permissions on part of the filesystem so they can run a script. I'm OK with that, but I want to know what they're running, in case it turns out there is some nefarious code. I assume they are going to upload the file, run it, and then delete it to prevent me from finding out what they've done. How can I find out exactly what they've done? My question specifically asks for a way of automatically copying the file, which would be one way, but if you have another solution, that's fine. For example, it would be awesome if the file could be automatically copied to /home/root/uploaded_files/.
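    One way to do the automatic copy (a sketch using inotify-tools, which CentOS can install from EPEL; paths are examples, not from the question):

    ```sh
    #!/bin/sh
    # Copy every file the contractor writes, as soon as it is closed,
    # before they have a chance to delete it.
    WATCH=/home/contractor/upload      # the tree they can write to (assumed)
    DEST=/home/root/uploaded_files     # must live outside $WATCH
    mkdir -p "$DEST"
    inotifywait -m -r -e close_write --format '%w%f' "$WATCH" |
    while read -r f; do
        # prefix a timestamp so repeated uploads don't overwrite each other
        cp -p "$f" "$DEST/$(date +%s)_$(basename "$f")"
    done
    ```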

  • Unable to synchronize local and remote directories ("set times: Operation not permitted")

    - by Tom Auger
    I'm running into FTP errors using software like NetBeans or WinSCP: whenever I attempt to synchronize or update files from local to server, I get errors on the client saying "set times: Operation not permitted". This is clearly an issue with the way I've configured my Fedora installation. The user that I'm logging in with cannot touch -t any of these files, though he IS part of a group that has r/w access to them. I do have root/sudo access to this server. What I would like to know is: (a) is it likely that this problem would be solved by allowing my FTP user to touch -t these files, and (b) how do I enable a certain user to set timestamps on files without giving them ownership of the files (certain of these files need to be owned by Apache, for instance, so I don't want to chown them)? Thanks in advance.
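    The failure reproduces in two commands, which also shows why it is not an FTP setting: writing content needs only the group w bit, but setting an arbitrary timestamp requires owning the file (or being root); that is kernel policy for utimes():

    ```sh
    echo test >> somefile             # succeeds: group write permission
    touch -t 201001010000 somefile    # "Operation not permitted" unless you own it
    ```

    So for (a): yes, ownership would fix it; for (b): there is no mode bit that grants timestamp-setting without ownership. A practical alternative is turning off timestamp preservation in the client (WinSCP has a "Preserve timestamp" transfer setting) for the Apache-owned files.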

  • Server 2008 R2 file access permissions

    - by Napster100
    I'm finding it awkward to sort out permissions for file sharing and access on my LAN. I've created an account on the server node (as a normal user) and shared a drive that has two folders at the root: one for personal file storage and the other for shared files. If I connect to the shared area from a workstation running Windows 7 and log in using the account I created on the server, I can look through directories but can't look in some of them (which is what I wanted, as I changed the permissions to make that happen). My problem is that although the permissions give this user account full control of a specific folder, I can't create a folder in that area or upload files to it. Could someone explain why this is? Thanks in advance

  • Can I install fresh Linux across partitions (LUKS & LVM) and preserve/use the existing home user?

    - by xtian
    With an existing hard disk set up with LUKS-encrypted logical volumes and a dual boot of Windows and Linux (Fedora 15), is it necessary to "start over" with the LUKS setup when upgrading the system? I recall a note saying that dividing the Linux installation over different partitions would help preserve the home data in a future upgrade (I can't find it now). Before I try it, is this a possible and intended use case for partitioning a Linux installation?

        # lsblk -fa
        NAME FSTYPE LABEL MOUNTPOINT
        sda [80G]
        +-sda1 [system W95 FAT 32] vfat
        +-sda2 ext4 /boot
        +-sda3 [52.4G] crypto_LUKS
          +-luks-de25ac97-6a32-4b79-a6a0-296a39376b3b (dm-0) LVM2_member
            +-cryptVG-root (dm-1) [21.5G] ext4 /
            +-cryptVG-swap (dm-2) [5.4MB] swap [SWAP]
            +-cryptVG-data (dm-3) [25.6G] ext4 /home
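    A rescue-mode sketch using the names from the lsblk output above: the existing container opens normally, only root and swap are recreated, and cryptVG-data carries /home across the reinstall (most installers can do the equivalent if you mark only / and swap for formatting):

    ```sh
    cryptsetup luksOpen /dev/sda3 cryptolv   # prompts for the LUKS passphrase
    vgchange -ay cryptVG                     # activate the logical volumes
    mkfs.ext4 /dev/cryptVG/root              # recreate / only
    mkswap /dev/cryptVG/swap
    mount /dev/cryptVG/data /mnt             # /home contents are intact
    ```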
