Search Results

Search found 9715 results on 389 pages for 'servers'.


  • Website occasionally does not load on first click

    - by tfe
    Today I noticed that my website, hosted on a virtual server, occasionally does not load on the first click. I click some link, the browser starts loading the page, but nothing loads and no error message appears (like "connection reset by peer", etc). Nothing. When I click the same link again, the page loads immediately. The same thing happens on 2 computers in different browsers. It doesn't happen every time, maybe on every 20th or 30th click. Sites on other servers load without this problem. Any ideas what can cause this problem?
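
    One way to make the failure visible outside the browser is a timed request loop. This is a minimal diagnostic sketch, assuming curl is available; the URL is a placeholder for the affected page:

      # Fire 50 requests and log status and timing for each; a hung
      # first click should show up as a timeout or a long connect time.
      for i in $(seq 1 50); do
        curl -s -o /dev/null --max-time 10 \
             -w "attempt $i: code=%{http_code} connect=%{time_connect}s total=%{time_total}s\n" \
             "http://example.com/some-page" \
          || echo "attempt $i: failed (timeout or connection error)"
        sleep 1
      done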

    Read the article

  • In-House DropBox

    - by beardedlinuxgeek
    Dropbox is perfect, but as a company, we can't host anything worthwhile on servers that we don't control. So I've been tasked with coming up with a Dropbox alternative, something in-house. GlusterFS is nice, but has no offline access. SparkleShare uses Git, which isn't great for large files; it also doesn't have a Windows port. Any other options? If I were to roll my own from scratch, what do you think the best way to go about it would be?
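
    For the roll-your-own route, one common starting point is a watch-and-push loop built from inotify-tools and rsync over SSH. This is only a sketch of the idea (paths and host are placeholders), with none of Dropbox's conflict handling or versioning:

      #!/bin/bash
      # Push local changes to a central server whenever anything in
      # the watched directory changes.
      WATCH_DIR="$HOME/shared"
      REMOTE="syncuser@fileserver.internal:/srv/shared/"

      while inotifywait -r -e modify,create,delete,move "$WATCH_DIR"; do
        rsync -az --delete "$WATCH_DIR/" "$REMOTE"
      done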

    Read the article

  • Server Sizing Methodology

    - by adbrpc
    Our development environment consists of JBoss 5.0.1, a SQL Server 2008 database server, and Oracle IDM. Hardware is Windows 2008 32-bit with 4GB RAM. We have reached a stage where our environment cannot handle the application, resulting in JBoss shutting down with out-of-memory errors and CPU usage reaching 90%. I am looking for a methodology to calculate correct server sizing, where I input TPS, max number of concurrent users, max CPU utilization, etc., and get back the number of servers, RAM size, and number of cores. I expect the application to grow 10% annually. Load balancing and failover should also be taken into account in the sizing.
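
    There is no single formula, but the usual shape of the calculation can be sketched. Every constant below is a made-up assumption to be replaced with numbers measured in a load test:

      #!/bin/bash
      # Back-of-envelope sizing sketch; all inputs are assumptions.
      PEAK_TPS=500            # measured or projected peak transactions/sec
      TPS_PER_CORE=40         # throughput of one core at acceptable latency
      CORES_PER_SERVER=8
      TARGET_UTIL=0.6         # leave 40% headroom for spikes
      GROWTH=1.10             # 10% annual growth
      YEARS=3                 # sizing horizon

      awk -v tps="$PEAK_TPS" -v pc="$TPS_PER_CORE" -v cps="$CORES_PER_SERVER" \
          -v util="$TARGET_UTIL" -v g="$GROWTH" -v y="$YEARS" 'BEGIN {
        future_tps = tps * g ^ y
        cores = future_tps / (pc * util)
        n = cores / cps
        servers = int(n) + (n > int(n)) + 1   # round up, +1 for N+1 failover
        printf "future TPS: %.0f, cores needed: %.1f, servers (N+1): %d\n",
               future_tps, cores, servers
      }'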

    Read the article

  • Will learning to use Fedora also teach me my way around Redhat (CentOS)?

    - by Matt Untsaiyi
    I want to dive into the open source world and start using a Linux distro while learning to program. I've looked over the options and it pretty much boils down to Fedora or CentOS. The reasoning behind it is I'm hoping to kill two birds with one stone... Red Hat seems to be "the choice" for servers, so I figure as I learn to program, I can also learn my way around Linux... or Red Hat more specifically... and get that under my belt too. I want to use Fedora and be on the frontier of new software (since I'm not doing anything critical), but if it's completely different from Red Hat I'd rather just use CentOS. So is it? Or can I use one and know the other?

    Read the article

  • Poor disk performance with high disk capacity usage

    - by GoldenNewby
    I've heard numerous times in the web hosting industry that using "too much" disk space on a drive is bad for performance. Is this just a myth? Can someone explain why this is an issue, even in a situation where the amount of I/O done to the drive would be the same at 10% full as at 90%? I'm especially curious about the case of virtual servers. If I set up 10 logical volumes as the virtual disks for some VMs, will they run better if I "waste" 20% of the disk space?
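
    One piece of this is measurable: on a spinning disk, the outer tracks (the start of the device) deliver noticeably higher sequential throughput than the inner tracks. A rough fio sketch, assuming roughly a 1 TB drive at /dev/sdb (device and offset are placeholders; --readonly keeps it to reads, but benchmark raw devices with care):

      # Sequential read at the start of the disk (outer tracks)...
      fio --name=outer --filename=/dev/sdb --readonly --direct=1 \
          --rw=read --bs=1M --size=1G --offset=0
      # ...versus near the end of the disk (inner tracks)
      fio --name=inner --filename=/dev/sdb --readonly --direct=1 \
          --rw=read --bs=1M --size=1G --offset=900g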

    Read the article

  • What would Stack Exchange's yearly expenses be if it were to be using a third party host?

    - by abel
    StackExchange manages its own servers, as it should, but if SE were to be hosted on third-party "cloud" hosting (like Amazon's), what would its monthly/yearly expenses be (keeping everything else the same)? A detailed answer comparing it to the bills that StackExchange foots currently (including power/property/staff) would help. (PS: I know that the blog is a good resource. I also understand that managing your own hosting is almost the same as setting up a hosting company and using it for your own needs. Plus, is this a question for meta or does it fit within serverfault's purview?)

    Read the article

  • Performance Drop Lingers after Load [closed]

    - by Charles
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Databases
    I'm noticing a drop in performance after successive load tests. Although our CPU and RAM numbers look fine, performance seems to degrade over time as sustained load is applied to the system. If we allow more time between the load tests, response time gets back to about 1,000 ms, but if we apply load every 3 minutes or so, it degrades to the point where requests take 12,000 ms. None of the application servers are showing lingering Apache processes, and the number of database connections cools down to about 3 (from a sustained 20). Is there anything else I should be looking out for here?

    Read the article

  • Does a modern PC require a graphics card to run?

    - by ArtM
    As far as I can remember, on old systems (Pentium II or III) it was not possible to boot and run the PC if the graphics card was missing (AGP cards were used in those days). For many years since then I've been using motherboards with integrated graphics, so I have no experience with this subject; the "graphics card" was always present. Currently I intend to build a home/private "server" for my own purposes, and most of the motherboards I want to buy have no integrated graphics (AMD 870 or 970). I can borrow a normal graphics card from my friends for a few hours/days and use it while installing the necessary software. The question is: can I boot and run the PC without problems after I install everything I need and the graphics card is removed? If a general answer cannot be given, at least some examples of manufacturers/MB series/MB models would be helpful. I think it's obvious, but for completeness: I mean cheap desktop components, not real servers.

    Read the article

  • Uptime concerns in case of AWS outage

    - by Aditya Patawari
    I am running an Elastic Load Balancer backed by 2 instances in different Availability Zones in US East, and I am using Multi-AZ RDS as well. Ideally this should ensure that if one AZ goes down, it does not affect the app, because everything is spread across multiple AZs. But the recent AWS outage took the app down for a long time. I am not sure how this can happen; it would be great if someone could point out what went wrong. The major question I have is: how can I avoid this in the future? I can set up app servers across different regions or even providers and use DNS for load balancing, but what do I do with MySQL? Read replicas will introduce some lag, which I would want to avoid.
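
    Whether that lag is actually a problem can at least be measured. A small sketch for watching it on a prospective read replica (hostname and credentials are placeholders); Seconds_Behind_Master is the field to watch:

      # Poll replication delay every 5 seconds (credentials would
      # normally come from an option file, not the command line).
      while true; do
        mysql -h replica.example.com -u monitor -psecret \
          -e "SHOW SLAVE STATUS\G" | grep Seconds_Behind_Master
        sleep 5
      done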

    Read the article

  • Exclude localhost from http_proxy

    - by Chilloutman
    I'm trying to access a server that is running locally on my machine (localhost). I'm using the wget command to download the server's HTTP response. I've also tried using the curl command to do this, but both wget and curl try to go through my proxy server and fail at it:

      --2010-05-04 09:05:34--  http://localhost:8080/api/getplist
      ...
      Proxy request sent, awaiting response... 503 Service Unavailable
      2010-05-04 09:05:35 ERROR 503: Service Unavailable.

    Obviously they shouldn't need to go through the proxy, right? So I disabled the http_proxy:

      export http_proxy=""

    And then it worked fine. Disabling http_proxy every time, or permanently, is not an option. How can I set it to ignore the proxy settings when accessing localhost?
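
    A likely fix, sketched below: both wget and curl honor the no_proxy environment variable, so localhost can be exempted without clearing http_proxy:

      # Exempt local addresses from the proxy for this shell (put the
      # export in ~/.bashrc or /etc/environment to make it permanent)
      export no_proxy="localhost,127.0.0.1"
      wget http://localhost:8080/api/getplist

      # One-off alternatives exist on both tools as well:
      wget --no-proxy http://localhost:8080/api/getplist
      curl --noproxy localhost http://localhost:8080/api/getplist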

    Read the article

  • node.js server not running

    - by CMDadabo
    I am trying to learn node.js, but I'm having trouble getting the simple server to run on localhost:8888. Here is the code for server.js:

      var http = require("http");
      http.createServer(function(request, response) {
        response.writeHead(200, {"Content-Type": "text/plain"});
        response.write("Hello World");
        response.end();
      }).listen(8888);

    server.js runs without errors, and running netstat -an | grep 8888 from a terminal returns:

      tcp4       0      0  *.8888                 *.*                    LISTEN

    However, when I go to localhost:8888 in a browser, it says the page cannot be found. I've looked at all the related questions, and nothing has worked so far. I've tried different ports, etc. I know that my router blocks incoming traffic on port 8888, but shouldn't that not matter if I'm trying to access it locally? I've run Tomcat servers on this port before, for example. Thanks so much for your help!
    node.js version: v0.6.15
    OS: Mac OS 10.6.8
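
    One thing worth ruling out, given that the netstat output shows a tcp4 (IPv4-only) listener: on some OS X setups, localhost resolves to the IPv6 address ::1 first. This is only a sketch of a check, not a confirmed fix:

      # Hit the IPv4 loopback directly, bypassing localhost resolution;
      # if this works while http://localhost:8888 does not, the browser
      # is resolving localhost to ::1.
      curl -v http://127.0.0.1:8888/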

    Read the article

  • Proxy between data centers [closed]

    - by dstarh
    Possible Duplicate: Can IIS be configured to forward requests to another web server?
    We are switching data centers (actually data center to EC2, to be specific) and some customers have not yet made DNS changes to point their domains to the new load balancers. We are thinking of leaving the existing servers up and just using a proxy server to forward the requests to the new load balancer. Can anyone recommend a good proxy server for doing this? I've got Squid installed, but it seems fairly easy to end up with a wide-open proxy server, and we don't want that. I want all requests coming in on port 80 to be sent to port 80 at a specific domain (the Elastic Load Balancer). The data center environment is Windows 2k3 and the EC2 environment will all be Linux, but the EC2 environment should be irrelevant.
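
    If a small Linux relay can sit in the old data center (an assumption; the old environment itself is Windows 2k3), a plain TCP forwarder avoids the open-proxy problem entirely, since it relays only to one fixed destination. A sketch with socat (the ELB hostname is a placeholder):

      # Accept connections on port 80 and relay each one to the new
      # load balancer; fork spawns one handler per connection.
      socat TCP-LISTEN:80,fork,reuseaddr \
            TCP:my-elb-123456.us-east-1.elb.amazonaws.com:80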

    Read the article

  • GPO errors filling up event viewer

    - by burntehsky
    There have been a few issues with the server I have been working on. I checked the Event Viewer and it is filled with the errors below. I was not sure how to go about fixing this; I looked in the path where the file is supposed to be, and it is there.

      Windows cannot access the file gpt.ini for GPO
      CN={31B2F340-016D-11D2-945F-00C04FB984F9},CN=Policies,CN=System,DC=ISPHOME,DC=NET.
      The file must be present at the location
      <\\isphome.net\sysvol\ISPHOME.NET\Policies\{31B2F340-016D-11D2-945F-00C04FB984F9}\gpt.ini>.
      (The network location cannot be reached. For information about network
      troubleshooting, see Windows Help.) Group Policy processing aborted.

      C:\Documents and Settings\Dimitri>ipconfig /all

      Windows IP Configuration
        Host Name . . . . . . . . . . . . : ispserver
        Primary Dns Suffix  . . . . . . . : ISPHOME.NET
        Node Type . . . . . . . . . . . . : Unknown
        IP Routing Enabled. . . . . . . . : No
        WINS Proxy Enabled. . . . . . . . : No
        DNS Suffix Search List. . . . . . : ISPHOME.NET

      Ethernet adapter Local Area Connection 3:
        Connection-specific DNS Suffix  . :
        Description . . . . . . . . . . . : Intel(R) PRO/100 VE Network Connection #2
        Physical Address. . . . . . . . . : 00-07-E9-AA-3E-C3
        DHCP Enabled. . . . . . . . . . . : No
        IP Address. . . . . . . . . . . . : 192.168.1.50
        Subnet Mask . . . . . . . . . . . : 255.255.255.0
        Default Gateway . . . . . . . . . : 192.168.1.1
        DNS Servers . . . . . . . . . . . : 127.0.0.1

    The relevant part of dcdiag /c /v is below:

      Summary of test results for DNS servers used by the above domain controllers:

        DNS server: 192.168.1.1 (<name unavailable>)
          All tests passed on this DNS server
          This is a valid DNS server

        DNS server: 192.168.1.50 (<name unavailable>)
          All tests passed on this DNS server
          This is a valid DNS server
          Name resolution is functional. _ldap._tcp SRV record for the forest
          root domain is registered

      Summary of DNS test results:
                                Auth Basc Forw Del  Dyn  RReg Ext
        Domain: ISPHOME.NET
          ispserver               PASS FAIL PASS PASS PASS PASS n/a
        ......................... ISPHOME.NET failed test DNS

    Read the article

  • mongodb replication: no primary elected

    - by Max
    I have three servers with mongod installed, running as a replica set. Suddenly the two secondaries became unavailable (the mongod process died); I think because they were too stale. The problem is that the original PRIMARY is now a SECONDARY, and my application doesn't work because it can't connect to a PRIMARY. I mean, in what way does that help me, if the replica set can't fail over? Am I missing something? Furthermore, I am asking myself: why did the SECONDARIES die / why were they too stale? What can I do about it? FYI: my database is quite big (40GB on disk).
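
    What is described is, as far as I know, expected behavior rather than a bug: a replica set member only stays PRIMARY while it can see a majority of voting members, and one survivor out of three is not a majority, so it steps down. The member states can be inspected from a shell on the surviving node; a small sketch (hostname is a placeholder):

      # Print the replica set status, including each member's state
      # (PRIMARY/SECONDARY/DOWN) and last optime, via the mongo shell.
      mongo --host db1.example.com --eval 'printjson(rs.status())'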

    Read the article

  • KVM guest storage difference with NBD and NFS

    - by WojonsTech
    I am setting up my own little private cloud for my own use, maybe for a project or two. I am using Linux KVM on Debian 6. I have 3 servers: 2 of them compute nodes and 1 a storage node. I have already installed KVM, made a few test machines, and got my networking set up. I have 2 NICs on each server: 1 NIC is for web traffic, the other NIC is for internal network traffic. My first idea was to use NFS for storing the guest images, which can range in size: maybe 8GB, maybe 100GB, it just depends. I have heard of NBD before; it seems like it could work, but I don't know what the performance differences are and whether they will affect my environment. NFS looks like it will be easier to use.
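
    For comparison purposes, the NBD side can be stood up in two commands with the classic nbd-server/nbd-client pair. A minimal sketch (path, port, and hostname are placeholders); unlike NFS, each export is a raw block device attached to exactly one client:

      # On the storage node: export one guest image per port
      nbd-server 10809 /srv/images/vm01.img

      # On a compute node: load the client module, attach the export as
      # a local block device, then point the KVM guest at /dev/nbd0
      modprobe nbd
      nbd-client storage.internal 10809 /dev/nbd0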

    Read the article

  • Activating ssl on tomcat

    - by toom
    I want to encrypt the HTTP traffic on a Tomcat instance via SSL, so I followed the most simplistic approach described on various webpages. But it simply does not work. Here is what I did:
    1. Ran keytool -genkey -alias tomcat -keyalg RSA and entered "changeit" as the password (since this is the default chosen by Tomcat).
    2. Altered $CATALINA_HOME/conf/server.xml by uncommenting the following connector:

      <Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true"
                 maxThreads="150" scheme="https" secure="true"
                 clientAuth="false" sslProtocol="TLS" />

    3. Restarted Tomcat.
    Entering https://localhost:8443 does not work. However, I can still access the page via normal HTTP like http://localhost:8080. The logfile does not contain any suspicious information. What is going wrong here?
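
    Two quick checks worth running (a sketch, with the stock defaults assumed): the connector above names no keystoreFile, so Tomcat falls back to .keystore in the home directory of the user it runs as, and a silent startup failure of the connector leaves nothing listening on 8443:

      # Confirm the default keystore exists for the Tomcat user and
      # that the password matches what the connector expects
      keytool -list -keystore ~/.keystore -storepass changeit

      # Confirm something is actually listening on 8443 after startup
      netstat -an | grep 8443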

    Read the article

  • Sticking with Ubuntu 12.04 while heavily using PPA for newest software updates (Apache 2.4, PHP 5.5)

    - by MechaStorm
    I am wondering whether it is worthwhile to stick with Ubuntu 12.04 LTS until 14.04 comes out, or whether I should switch to the latest Ubuntu server version, 13.10. My server needs are not enterprise-heavy, and the original thinking behind staying on LTS was simply to get security updates without having to upgrade the servers every couple of months. But as we move forward with our software development, I have found that a lot of the default package versions in 12.04 are way out of date, forcing me to update via PPA or from source instead of the default apt-get, e.g. PHP is 5.3 on 12.04 and I'd like to get it to 5.5. Is it worthwhile to simply move to 13.10 in that situation, with the idea of moving to 14.04 when it comes out?
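
    For what it's worth, the PPA route on 12.04 is short. A sketch (the PPA name is an assumption; ondrej/php5 was a widely used third-party source for newer PHP at the time, and is not supported by Canonical):

      # 12.04 needs python-software-properties for add-apt-repository.
      # Add the third-party PHP PPA, refresh the index, and upgrade PHP
      # from the 12.04 default (5.3) to the PPA's version.
      sudo add-apt-repository ppa:ondrej/php5
      sudo apt-get update
      sudo apt-get install php5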

    Read the article

  • Can a RAID disk setup crash if only 1 hard disk fails?

    - by Steve Rodrigue
    I am a web developer and don't have much experience with hardware; for this reason, I use managed servers. This morning, one of the drives in our setup failed, but the full site went down. I asked my web host what happened, and he replied that the hard disk failed in such a way that the RAID controller couldn't work properly. Have you guys ever seen that before? Is it possible? I need to know if my web host is being honest with me. Thanks for any help on this, guys. Steve

    Read the article

  • Controlling access to my API using SSH public key (not SSL)

    - by tharrison
    I have the challenge of implementing an API to be consumed by relatively non-technical clients -- pasting some sample code into their WordPress or homegrown PHP site is probably as much as we can ask. Asking them to install SSL on their servers ain't happening. So I am seeking a simple yet secure way to authenticate API clients. OAuth is the obvious solution, but I don't think it passes the "simple" test. Adding a client ID and hashed secret as a parameter to the requests is closer -- it's not hard to do md5($secret . $client_id) or whatever the PHP would be. It seems to me that if client requests could use the same approach as SSH public keys (clients give us a key from their servers), there should be some existing magic to make all of the subsequent transactions transparently work just like regular HTTP API requests. I am still working this out (obviously :-), so if I am being an idiot, it would be nice to know why. Thanks!
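
    The hashed-secret idea can be tightened into an HMAC signature without getting much harder to paste into a PHP site. A sketch of the client side in shell (every name, header, and endpoint here is hypothetical), signing a timestamp along with the body so captured requests cannot be replayed indefinitely:

      # Compute an HMAC-SHA256 over client id + timestamp + body using
      # the shared secret; the secret itself never goes on the wire.
      CLIENT_ID="acme123"
      SECRET="s3cret"
      BODY='{"action":"list"}'
      TS=$(date +%s)
      SIG=$(printf '%s' "$CLIENT_ID$TS$BODY" \
              | openssl dgst -sha256 -hmac "$SECRET" | awk '{print $NF}')

      curl -s "http://api.example.com/v1/things" \
           -H "X-Client-Id: $CLIENT_ID" \
           -H "X-Timestamp: $TS" \
           -H "X-Signature: $SIG" \
           -d "$BODY"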

    Read the article

  • Windows file Sync

    - by Deane Venske
    So I have a big problem at the moment: trying to find a reliable solution for syncing 2 Windows IIS servers. I need to keep the web content mirrored on both. I have been trying to use rsync up to this point, but unfortunately file permission errors are a nightmare to manage this way. I'm testing out Dropbox, but the performance sucks. I'm more familiar with Linux stuff and I've used rsync in the past, but isn't there a native Windows solution that will work?

    Read the article

  • xenserver: xe command never returns?

    - by ethrbunny
    I'm trying to move a XenServer 6.2 pool to a new IP address range. I've got three servers total: 2 currently at their new IPs but no longer in the pool, and one remaining. I'm trying to set IP address information on the two disconnected ones using the xe command and all of its variants. Oddly enough, it never returns any values:

      xe host-list

    It just sits there until I Ctrl-C it. The server is still awake and responding, though; I can enter other commands (e.g. ifconfig) and they work fine. If I enter this same command on the remaining server in the pool, it works OK. I've tried restarting the toolstack and even rebooting. No change. What am I doing wrong?
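
    One plausible explanation (an assumption, not a diagnosis): on a pool member, xe talks to the pool master, so if the host still thinks the master lives at the old address, every xe command blocks. XenServer ships emergency commands for exactly this case; a sketch of the usual sequence on the disconnected host:

      # Make this host answer xe commands as its own master again
      xe pool-emergency-transition-to-master

      # Find the management PIF, then reapply the management interface
      xe pif-list params=uuid,device,IP,management
      xe host-management-reconfigure pif-uuid=<uuid-of-management-pif>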

    Read the article

  • Subnetting design for a new building?

    - by Zombie
    A building has 4 floors; each floor is divided as follows: 15 users for accounting, 15 users for finance, and 15 users for marketing (i.e. 45 users on each floor). The data center is located on the ground floor, with 45 servers to be divided into 15 for all the accounting users on the four floors, another 15 for finance, and the last 15 for marketing (i.e. each group of 15 servers is separated from the others). What is the proper subnetting design for such a scenario, given that we are allowed to use anything we want?
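
    One worked sketch of the arithmetic (one of many valid designs): 15 hosts do not fit in a /28 (14 usable addresses), so a /27 (30 usable) is the smallest subnet per group, and encoding floor and department in the octets keeps the plan readable. The plan below is an illustration, not the answer:

      # 10.<floor>.<dept>.0/27, dept 1=accounting 2=finance 3=marketing,
      # floor 0 = ground-floor data center (server subnets):
      #   floor 1:  10.1.1.0/27   10.1.2.0/27   10.1.3.0/27
      #   floor 2:  10.2.1.0/27   10.2.2.0/27   10.2.3.0/27
      #   floor 3:  10.3.1.0/27   10.3.2.0/27   10.3.3.0/27
      #   floor 4:  10.4.1.0/27   10.4.2.0/27   10.4.3.0/27
      #   servers:  10.0.1.0/27   10.0.2.0/27   10.0.3.0/27
      # ipcalc prints the host range of any one subnet to double-check:
      ipcalc 10.1.1.0/27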

    Read the article

  • XP clients can't copy to network share

    - by chewbacca76
    I have a Windows 2003 domain where I have a strange problem. One of our file shares is on a 2003 R2 domain controller. XP clients trying to copy files to the share always get the error:

      Error copying file or folder: filename could not be copied. Path too long.

    while Windows 7 clients work fine. Nothing unusual is found in the event log on either the server or the client. It doesn't matter if I access the share by FQDN or IP, and the path including the filename is shorter than 20 characters, i.e. \\path\share\file.txt. Copying files to other servers is fine, and reading from the shares is OK too. It happened from one day to the next; one Windows update that was installed that day (KB2736233) was removed, but nothing changed. Thanks for any tips.

    Read the article

  • Apache with mod_perl eating memory when idle

    - by syneticon-dj
    An Apache webserver running a mod_perl application is exhibiting abnormal memory usage: after the daytime load ceases, the system's memory is exhausted by the Apache processes and the oom_killer is invoked. As the load returns the following morning, the memory usage normalizes, probably because Apache workers get recycled periodically if a sufficient number of hits is generated. [Graph: system memory usage over time.] Here is the graph of Apache hits per second to correlate: [Graph: Apache hits per second.] The remaining 2 hits per second throughout the night are induced by HAProxy checks: it runs HEAD http://mydomain.example.com/running HTTP/1.0 requests against the server every half second, with "running" being a static file (i.e. not invoking any Perl code). It also seems that disabling these checks remedies the memory usage problem, but that obviously cannot be a solution. All three similarly configured servers (behind HAProxy) exhibit this behavior. The OS is Ubuntu 10.10, Apache version 2.2.16. This looks like a memory leak, but I have no idea how to start debugging it. Any hints?
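
    Since idle mod_perl workers are the suspects, one low-tech starting point is logging each worker's resident memory overnight and seeing which processes grow while only the HAProxy checks are arriving. A sketch (the log path is a placeholder; on Ubuntu the processes are named apache2):

      # Append a timestamped snapshot of every Apache worker's RSS
      # every 5 minutes; compare snapshots from evening and 3 a.m.
      while true; do
        date >> /var/log/apache-rss.log
        ps -C apache2 -o pid,rss,etime,comm >> /var/log/apache-rss.log
        sleep 300
      done

    If the same PIDs grow steadily, a nonzero MaxRequestsPerChild would at least cap the growth by recycling workers after a fixed request count; whether that masks or fixes the leak is a separate question.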

    Read the article

  • How to get a windows domain server to recognize a linux machine by its name?

    - by CaCl
    In my company I ran into an issue where we have a Linux machine that serves up a Subversion repository. It's hooked up to Active Directory via LDAP. We got an account set up for an application, and they configured Limited Workstations on it so it doesn't have full access to the network. The problem is that even though the hostname for our machine resolves correctly for me, the credentials for the application account come back as not allowed based on the name (the error was related to authorized workstations). I don't have access to any of the domain servers, but it might be helpful to come to management or the high-level techs with some ideas; they don't seem to have a solution besides allowing all workstations for the user. Does anyone have any idea how to get my Linux machine to properly identify itself to the domain machine by name?
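
    For gathering evidence before approaching the domain admins, the restriction itself can be read from AD over LDAP. A sketch with ldapsearch (host, bind account, base DN, and account name are all placeholders); the attribute behind the "Log On To..." workstation list is userWorkstations, and it matches against NetBIOS computer names, which a Linux client typically does not present:

      # Read the workstation restriction set on the application account
      ldapsearch -H ldap://dc1.example.com \
        -D 'binduser@EXAMPLE.COM' -W \
        -b 'dc=example,dc=com' \
        '(sAMAccountName=appaccount)' userWorkstations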

    Read the article
