Search Results

Search found 2346 results on 94 pages for 'dedicated'.

Page 24/94

  • Determining which database instance makes biggest IO

    - by user2008937
    Assume I have a dedicated server running multiple MySQL and PostgreSQL instances. Without iotop, how do I determine which instance is generating the most IO (and so driving up IOWAIT) at a particular moment? /proc/<pid>/io only shows counters accumulated over time. When lots of people are working on a database I can clearly see which instance is creating the load from its high CPU usage, but I had a situation where CPU usage was normal yet a very high iowait was putting a huge load on the server, and I had trouble finding the process responsible for the outstanding IO.
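
    One way to get a point-in-time view is to sample /proc/<pid>/io twice and rank the per-process deltas. A minimal Python 3 sketch along those lines, assuming root access and that the daemons are named mysqld and postgres (adjust NAMES and INTERVAL to suit):

        #!/usr/bin/env python3
        """Sample /proc/<pid>/io twice and rank matching processes by bytes
        hitting the block layer during the interval."""
        import glob
        import time

        NAMES = ("mysqld", "postgres")   # assumed process names to watch
        INTERVAL = 5                     # seconds between the two samples

        def snapshot():
            stats = {}
            for path in glob.glob("/proc/[0-9]*"):
                pid = path.rsplit("/", 1)[-1]
                try:
                    with open(path + "/comm") as f:
                        comm = f.read().strip()
                    if comm not in NAMES:
                        continue
                    io = {}
                    with open(path + "/io") as f:
                        for line in f:
                            key, value = line.split(":")
                            io[key] = int(value)
                    # read_bytes/write_bytes count IO that really hit the disk
                    stats[(pid, comm)] = io["read_bytes"] + io["write_bytes"]
                except (FileNotFoundError, PermissionError):
                    continue          # process exited or file not readable
            return stats

        before = snapshot()
        time.sleep(INTERVAL)
        after = snapshot()
        deltas = {k: v - before[k] for k, v in after.items() if k in before}
        for (pid, comm), delta in sorted(deltas.items(), key=lambda kv: -kv[1]):
            print(f"{comm:10s} pid={pid:>7s} {delta / INTERVAL / 1024:10.1f} KiB/s")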

    Read the article

  • Why is my server using so much memory?

    - by Qasim
    I haven't even set up my website on my dedicated server yet, so I'm the only one using it at the moment, and yet my system info already shows high memory usage (screenshot omitted). I installed a bunch of security software today, so I'm wondering if that could be the reason: DDoS Deflate, CSF firewall, mod_security, SIM, Logwatch, etc. My server's details: CentOS, Intel Xeon X3220 CPU @ 2.39 GHz, 4.00 MB cache, 2 GB DDR2 RAM.
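
    Before blaming any one package, it helps to separate memory the page cache is borrowing from memory applications have actually allocated, since many panels report cache as "used". A minimal Python 3 sketch that reads /proc/meminfo and makes that split (these field names are standard on CentOS):

        #!/usr/bin/env python3
        """Split 'used' memory into reclaimable page cache and real allocations."""

        def meminfo():
            info = {}
            with open("/proc/meminfo") as f:
                for line in f:
                    key, value = line.split(":")
                    info[key] = int(value.split()[0])   # values are reported in kB
            return info

        m = meminfo()
        cache = m["Buffers"] + m["Cached"]
        used = m["MemTotal"] - m["MemFree"]
        print(f"total         {m['MemTotal'] / 1024:8.1f} MB")
        print(f"used          {used / 1024:8.1f} MB (includes cache)")
        print(f"cache/buffers {cache / 1024:8.1f} MB (reclaimable)")
        print(f"really used   {(used - cache) / 1024:8.1f} MB")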

    Read the article

  • Small website on Amazon EC2 Linux: a single large instance or more small instances in load balancing?

    - by Enrico Detoma
    I need to run a small website with a JSON web service on Amazon EC2 Linux. Most of the requests hit the JSON web service, which generates some load in the form of MySQL queries. I'm trying to decide between two setups: (1) a single large instance (Ubuntu 12.04 64-bit) with the full LAMP stack, or (2) one or two small instances (Ubuntu 12.04 64-bit) running Apache/PHP only, plus one small instance dedicated to MySQL (or RDS). Which setup would you expect to perform better?

    Read the article

  • Maximise network transfer speed of various applications

    - by Alex
    When using nc, scp or wget to transfer files between two machines on a dedicated 2 Mbps link, I get speeds between 0.5 and 1 Mbps. However, when I use iperf -c 10.0.1.4 -t 20 -P 12 (for example) I can max out the link (a stable 2 Mbps). Is there a way to make single-stream transfers (such as those done by scp) utilise all or most of the link? Some kind of TCP settings, or iptables?
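
    If the gap really is down to TCP windowing (which matters most on high-latency links), the usual knob is larger socket buffers, sized roughly to bandwidth times round-trip time and capped by the kernel's net.core.rmem_max / wmem_max limits. A small, purely illustrative Python 3 sketch of how an application requests bigger buffers; the 4 MiB figure is an assumption, and stock scp does not expose an equivalent command-line option:

        #!/usr/bin/env python3
        """Request larger TCP send/receive buffers and show what the kernel granted."""
        import socket

        BUF = 4 * 1024 * 1024   # assumed target: size to bandwidth * RTT

        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, BUF)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, BUF)
        print("send buffer:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
        print("recv buffer:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
        sock.close()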

    Read the article

  • PHP ftp_connect

    - by Dude Lebowski
    I'm trying to use the PHP ftp_connect function on my dedicated server and I'm unable to establish a connection: $conn_id = ftp_connect($ftp_server, 21) or die("Unable to connect to $ftp_server"); I'm sure the function is available, as I tested with function_exists('ftp_connect') and it returns true. When I connect to the server with the ftp command from the shell I can reach it, so I guess it's not a firewall issue. Am I missing something else? Thanks for your advice.
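
    A quick, language-neutral sanity check is to confirm that the machine running PHP can actually open the FTP control port, since a shell test can succeed from a different user or environment than the web server process uses. A minimal Python 3 sketch (ftp.example.com is a placeholder for the real host):

        #!/usr/bin/env python3
        """Open the FTP control port and print the server banner, or the error."""
        import socket

        host, port = "ftp.example.com", 21   # placeholder host

        try:
            with socket.create_connection((host, port), timeout=10) as sock:
                banner = sock.recv(128).decode(errors="replace").strip()
                print("reachable, banner:", banner)
        except OSError as exc:
            print("connection failed:", exc)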

    Read the article

  • How much faster is using an internal IP address instead of an external one?

    - by user349603
    I have a mailing list application that sends emails through several dedicated SMTP servers (running Debian 5 and Postfix) on the same network at a hosting company. However, the application connects to them over SMTP using the servers' external IP addresses, and I was wondering what kind of improvement I would get if the application used the servers' internal IP addresses instead. Thank you in advance for your insight.
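
    One way to quantify the difference is simply to measure it: any gain is likely to show up in per-connection latency and in bypassing external firewall/NAT hops rather than in raw bandwidth. A small Python 3 sketch that times SMTP session setup against both addresses (the IPs below are placeholders for one relay reachable both ways):

        #!/usr/bin/env python3
        """Time how long it takes to open an SMTP session and exchange EHLO."""
        import smtplib
        import time

        def smtp_connect_time(host, port=25, attempts=3):
            total = 0.0
            for _ in range(attempts):
                start = time.monotonic()
                with smtplib.SMTP(host, port, timeout=10) as smtp:
                    smtp.ehlo()
                total += time.monotonic() - start
            return total / attempts

        # Placeholder addresses: swap in the real external and internal IPs.
        for label, addr in (("external", "203.0.113.10"), ("internal", "10.0.0.10")):
            print(f"{label:8s} {smtp_connect_time(addr):.4f} s per session")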

    Read the article

  • Resource consumption of FreeBSD's jails

    - by Juan Francisco Cantero Hurtado
    Just out of curiosity. Example machine: a dedicated amd64 server running the latest stable version of FreeBSD with UFS partitions. How many resources does FreeBSD consume for each empty jail? I don't want to know the resource consumption of a jailed server or whatever, just the overhead of each jail itself. I'm especially interested in CPU, memory and IO. For a few jails the overhead is negligible, but imagine a server with 100 jails.

    Read the article

  • Setting up a subdomain SSL with custom port

    - by Webnet
    I'm setting up a subdomain on a dedicated server that I'm going to use for SVN services. The SVN server is up and running; I just need to set up the subdomain. HTTPS has been switched to a custom port because of a conflict with a port forward pointing to another server. Should I do this through GoDaddy or Apache?

    Read the article

  • DNS Update 'stuck'

    - by Postus
    I have a dedicated server with Debian and ISPConfig on it, and a domain example.it. I made an error when setting the DNS for the first time: I tried to set it to ns1.example.it IP.IP.IP.IP and myDedicnumber.kimsufi.com IP.IP.IP.IP, but it got stuck and now shows "Current Status: on hold". It's been 7 days already. I've tried contacting OVH, but they just told me to set my DNS to something different, which I can't do because this pending operation is blocking any changes to the DNS records. Is there anything I can do except keep bugging OVH?

    Read the article

  • Internal Server Error on HTTPS SSL URL

    - by spike5792
    I am running cPanel/WHM on an Apache server and have just installed an SSL certificate for a single domain. The domain/server is on a fixed dedicated IP address. I get the 'successfully installed' message when installing the SSL certificate, but when I try to visit the domain over https the 500 Internal Server Error page appears: "The server encountered an internal error or misconfiguration and was unable to complete your request. Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request."

    Read the article

  • Server-to-server replication, CPU and 32K/corrupt docs

    - by nick wall
    Summary: if a database contains a doc with the 32K issue, or a corrupt doc, server-to-server replication causes a marked increase in CPU in the nserver.exe task, which effectively causes our server(s) to slow right down.

    We have a 5-server cluster (1 "hub" and 4 HTTP servers accessed via reverse proxy and SSO for load balancing and redundancy). All are physically located next to each other on the network; they don't have dedicated network ports for cluster or replication (I realise the IBM recommendation is a dedicated port for the cluster). Cluster queues are in tolerance, and under heavy application user load, i.e. when the maximum number of documents are being created, edited and deleted, the replication times between servers are negligible. Normally, all is well. Of the servers in the cluster, 1 is considered the "hub", and it initiates a PUSH-PULL replication with its cluster mates every 60 minutes, so that the replication load is taken by the hub and not the cluster mates.

    The problem we have: every now and then we get a slow replication from the hub to a cluster mate, sometimes up to 30 minutes. This maxes out the nserver.exe task on the cluster mate, which causes it to respond to HTTP requests very slowly. In the past we have found that a corrupt document in the DB can have this effect, but on those occasions the server log shows the corrupt doc's noteId, we run fixup, and all is well. We are not now seeing any record of corrupt docs. What we have noticed is that if a doc with the 32K issue is present, the same thing can happen. Our only solution in that case is to run fixup mydb.nsf -V, which shows it purging a 32K doc. Luckily we run a reverse proxy, so we can shut HTTP servers down without users noticing, but users do notice when a server has the problem!

    Has anyone else seen this occur? I have set up DDM event handlers for many of the replication events, and I have set the replication time-out limit to 5 minutes (the max we usually see under full user load is 0.1 min) to prevent it replicating for 30 minutes as before, but this is only a temporary workaround. Does anyone know of a DDM event to trap the 32K issue? We could at least then send an alert.

    Regarding the 32K issue: this probably needs another thread, but we are finding it relatively hard to track down the source, as the 32K event is fairly rare. Our app is fairly complex, interacting with various other external web services with two-way data transfer. But if we do encounter a 32K doc, we can't look at its field properties, so we can't work out which field has the issue, which would give us a clue as to which process is the culprit. As above, we run a fixup -V. Any help or comments would be gratefully received.

    Read the article

  • Is there a way to backup AD and DNS in Windows 2008 without backing up the whole Volume?

    - by EtherDragon
    I would like to know if there is a (cost-free) way to back up Windows Server 2008 Active Directory and DNS settings without using Windows Server Backup. The problem stems from not having a separate volume available to store the resulting backup from Windows Server Backup. I examined the command-line options of wbadmin, and it also expects the destination to be a dedicated volume for the backup. ~ED

    Read the article

  • WHMCS on main website [closed]

    - by Miskone
    I ordered a Dedicated Root Server from Hetzner. I have WHM and cPanel, and now I want to build the front site for the hosting company, where users can manage their accounts, the nameservers of their registered domains, register new domains and order new hosting accounts. My question is: do I need to buy WHMCS for that, or something else? I need to put a client zone on the front site and I want to find the best solution for it. Sorry if this is not on topic for this site, but I just don't know where else to ask this question :(

    Read the article

  • How to limit traffic on a router based on IP?

    - by YuriKolovsky
    How can I limit internet traffic on a router based on LAN IP, so that, for example, on a 10 Mb/s internet connection I can give an IP camera a dedicated 1 Mb/s, two computers 3 Mb/s, and two computers 6 Mb/s? As far as I know this is called something like traffic shaping. I'm really not sure what it's all called, so please show me or point me at some guide for dummies. :)

    Read the article

  • VPN connection over apache mod_proxy

    - by This is it
    Hi. We have several virtual machines connected to a private virtual network. Internet access for these machines is provided via a dedicated virtual machine running an Apache proxy server (they all use this machine as their proxy). The problem is that from several machines we need to connect to an external VPN server, but it seems that VPN connections don't work over the Apache proxy. Any suggestions on how to enable VPN connections over the Apache proxy (or some other proxy)? Some other solution? Thanks

    Read the article

  • How to change IP Location information?

    - by jimm
    Hello. When viewing a website's whois through whois.domaintools.com, under the Server Stats tab there is IP Location information that looks like this: IP Location: - Netherlands - Dedicated/collocated/various Hosting Customers. What do I have to do to change the IP Location information shown after the word Netherlands? Thanks in advance

    Read the article

  • Linux: permission to read a file from /sys/kernel/debug without root privileges

    - by gogowitczak
    I'd like to read a small text file, located at /sys/kernel/debug/vgaswitcheroo/switch, without root permissions. It contains information about the graphics cards (a dedicated ATI Radeon and an integrated Intel HD 3000). I wrote a simple script that displays the information I need, but it only works with root privileges. I have already tried changing the file and folder permissions, but when I restart my computer the problem comes back. Is it possible to permanently change that file's permissions or owner?

    Read the article

  • Tracking bandwidth usage per person

    - by deepak
    We have an office of 15 people and all of us share one connection, so one person can hog the connection and consume all the bandwidth. Which of these router firmwares would allow me to (1) restrict access to certain MAC addresses and (2) track bandwidth usage per MAC address or some other personal identity? I do not want to track the actual websites visited, only the total bandwidth used. I also want to use a router, not a dedicated computer.

    Read the article
