Search Results

Search found 9816 results on 393 pages for 'blade servers'.


  • How to NFSv4 share a ZFS file system on FreeBSD?

    - by Sandra
    Using FreeBSD 9, I created a ZFS file system like so:

        zfs create tank/project1
        zfs set sharenfs=on tank/project1

    There are many howtos on setting up NFSv3 on FreeBSD on the net, but I can't find any on NFSv4, or for the case where the NFS share is defined through ZFS. E.g. one howto says I have to restart the (NFSv3) server with nfsd -u -t -n 4, but I don't even have nfsd. When I do

        # echo /usr/ports/*/*nfs*
        /usr/ports/net-mgmt/nfsen /usr/ports/net/nfsshell /usr/ports/net/pcnfsd /usr/ports/net/unfs3 /usr/ports/sysutils/fusefs-chironfs /usr/ports/sysutils/fusefs-funionfs /usr/ports/sysutils/fusefs-unionfs

    I don't see any NFSv4 servers that I could install with pkg_add. Question: how do I install and set up NFSv4 so that I can mount the share from e.g. a Linux host?
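
    For reference, a minimal sketch of one way to do this on FreeBSD 9, assuming the base system's NFS server (nothing needs to be installed from ports; the network, paths and Linux mount point below are illustrative): enable NFSv4 in /etc/rc.conf, declare the v4 root in /etc/exports, and start the services.

        # /etc/rc.conf
        rpcbind_enable="YES"
        nfs_server_enable="YES"
        nfsv4_server_enable="YES"
        nfsuserd_enable="YES"
        mountd_enable="YES"

        # /etc/exports -- the V4: line defines the NFSv4 root;
        # the ZFS sharenfs property still controls the individual shares
        V4: /tank -network 192.168.1.0/24

        # start everything
        service nfsd start
        service mountd restart

        # on a Linux client
        mount -t nfs4 freebsd-host:/project1 /mnt/project1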

    Read the article

  • rsync generates far more traffic than expected

    - by user109459
    I use rsync for backing up one of my servers, which holds about 4 GB of files. When I transfer these files, the traffic is not the expected 4 GB; it is a lot higher, about 60 GB. I checked the traffic on my server, on the backup server and on the router, and all three report about 60 GB. Yet at the end rsync says that it only transferred 4 GB. Another problem is that I can't debug it, because the problem occurs randomly.
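
    To narrow it down, a hedged diagnostic sketch (host names and paths are placeholders): compare what rsync believes it has to send with what actually crosses the wire. The --stats summary separates literal data from matched data, and overlapping or interrupted runs (for example a cron job that starts again before the previous run has finished) are a common cause of inflated totals.

        # dry run: list what rsync thinks has changed, without transferring anything
        rsync -avn --itemize-changes /data/ backup@backuphost:/backups/data/

        # real run with transfer statistics (literal data vs. matched data)
        rsync -av --stats --partial /data/ backup@backuphost:/backups/data/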

    Read the article

  • Is it possible as an Administrator to gain access to a SQL Server 2008 instance without changing any passwords?

    - by adhocgeek
    I have administrative access on our network, but I don't manage the installation of all servers or software. On some of our machines, instances of SQL Server 2008 have been installed which I need to be able to access, but since my account hasn't been explicitly granted a login, I can't get in. Is there a way to get into the database without changing anyone's password? (E.g. I could solve this by changing the password of the user who installed the instance, assuming they've set themselves up as admin, and then logging on as them, but I don't want to have to do that.)
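
    One documented route, sketched here on the assumption that you are a local Administrator on the box and that it runs a default instance (instance name and account are placeholders): start SQL Server in single-user mode, in which members of the local Administrators group can connect with sysadmin rights, grant yourself a login, then restart the service normally.

        net stop MSSQLSERVER
        net start MSSQLSERVER /m

        sqlcmd -S . -E
        1> CREATE LOGIN [DOMAIN\youraccount] FROM WINDOWS;
        2> EXEC sp_addsrvrolemember 'DOMAIN\youraccount', 'sysadmin';
        3> GO

        net stop MSSQLSERVER
        net start MSSQLSERVER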

    Read the article

  • Application running as a service is not able to create the same number of processes as when it runs from the command line

    - by Pini Reznik
    I have a Windows application which creates up to 35 processes, and it works fine when it is run from cmd. But when it is executed as a service on the same machine, it is only able to create 20 processes, and all the others are killed because of some kind of resource exhaustion. The problem is persistent on one Windows 2003 server but not reproducible on other servers. Can it be because the system has run out of desktop heap? http://support.microsoft.com/kb/184802 How can I check this?
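
    A small sketch of how the service-side desktop heap can be inspected on Windows Server 2003 (the example value is the default, not a recommendation): the third number in SharedSection is the heap size, in KB, given to non-interactive (service) desktops, and Microsoft's Desktop Heap Monitor (Dheapmon.exe) can report how much of it is actually in use.

        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems" /v Windows

        REM look for SharedSection=1024,3072,512 in the output:
        REM the third value (512 KB by default) is the desktop heap for services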

    Read the article

  • What factors can affect the performance of an HTTP server written in C#? [on hold]

    - by Yousaf
    I am having trouble handling huge databases. I have many clients, around 100-300 (the clients are basically servers running e.g. Windows and SQL Server). Each client may have 38 thousand rows/listings of data, and each row has 10-12 fields. I cannot afford to keep JSON files of each client and handle them on the main server, because of memory issues. What if I had an HTTP server written in C or C# installed on each client, and it returned 250 rows in each response to the main server? How would factors like speed, memory or other issues affect us? What exactly am I asking? In short: if a server written in C# sends 250 rows per request, what factors can affect the performance of that server? For example: speed, processing, the operating system, the server's algorithm and implementation? How do these factors really affect performance at large scale?

    Read the article

  • Need help setting up a VPN for remote computer connections

    - by Chowdan
    I am on a low budget right now. I am currently in the process of starting a computer company. I need a VPN so I can run DameWare tools for working on customers' and partners' computers remotely. I will be working with Windows and some Apple and Linux machines. I have a desktop with an AMD Phenom II 965BE (currently running stable at 3.8 GHz), 8 GB of RAM, a Radeon HD 6870 (I know graphics aren't too useful here) and about 1.5 TB of HDD space. I am attempting to build, on this one machine in my office, a network that would also let me connect securely to my partners' computers, so that when they have issues I can do the diagnosing and repairs remotely. What types of servers besides a VPN server would I need to create this? I have access to all Microsoft products, so I can run Windows Server 2012, Windows Server 2008 R2, or any other Microsoft software. Thanks for the help, all.
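
    For the VPN piece itself, a minimal sketch of one option, assuming Windows Server 2012 with the Remote Access (RRAS) role and an elevated PowerShell prompt (treat this as a starting point rather than a recipe, since parameters can differ between builds):

        Install-WindowsFeature DirectAccess-VPN -IncludeManagementTools
        Install-RemoteAccess -VpnType Vpn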

    Read the article

  • Speeding up the fix of an OpenSSL bug with 8192-bit keys [on hold]

    - by rubo77
    This is related to this bug report: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=747453 OpenSSL contains a set of arbitrary limitations on the size of accepted key parameters that make unrelated software fail to establish secure connections. The problem was found while debugging an XMPP s2s connection issue where two servers with long certificate keys (8192-bit RSA) failed to establish a secure connection because OpenSSL rejected the handshake. This seems like a small problem to fix, but although an easy patch is available in that bug report, there has been no reaction so far. The last patch that broke the 2048-bit barrier took two years to be implemented and only resulted in an increase to 4096 bits, which seems like a bad joke. Where would we have to report this to speed up the fix for such an issue?

    Read the article

  • Nightly backups (and maybe other tasks) causing server alerts

    - by J. Pablo Fernández
    I have two independent alert notification systems for my servers. The server is a virtual machine on Linode, and one set of alerts comes from Linode. The other monitoring system we use is New Relic. They are both watching IO utilization. Every night I get alerts from both of them saying the server is using too much IO. I run quite a few tasks in the middle of the night, but the one I have confirmed can cause the IO warnings is the backup. The backup is done by s3cmd sync. I tried ionice, but it still generates the warnings. Getting warnings every night reduces the value of warnings when they happen for real. For Linode I could raise the level at which a warning is issued, but that might make the whole thing useless if the threshold ends up too high. What would be the proper solution for this?
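
    One thing worth checking, sketched here with a typical Linode device name and placeholder paths: ionice's idle class only takes effect when the disk uses the CFQ I/O scheduler, which is often not the case on Xen guests, so it may silently be doing nothing; combining it with nice and, where possible, throttling the transfer itself tends to work better.

        # ionice has no effect unless the scheduler is cfq
        cat /sys/block/xvda/queue/scheduler

        # run the backup in the idle I/O class and at low CPU priority
        ionice -c3 nice -n19 s3cmd sync /var/backups/ s3://mybucket/backups/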

    Read the article

  • Has anyone seen an HTTP 500 error when HTTPS traffic going through Pound Proxy forwards to an HTTP page?

    - by scientastic
    We use Varnish as our load balancer and reverse proxy cache for normal HTTP traffic. For HTTPS traffic, we use Pound to unwrap the SSL and forward to Varnish, which then forwards to the back-end servers. This is used for our checkout process, to encrypt credit card info in transit. However, on the last stage of checkout, users always get an HTTP 500 (Internal Server) error. By all the tests I've tried, it doesn't seem to be caused by our back-end app server. Does anyone know anything about how that transition works (the step back from HTTPS to HTTP, and the interaction between Pound and Varnish) and why it might cause 500 errors?
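
    A common culprit, offered as an assumption rather than a diagnosis: once Pound has terminated the SSL, the backend no longer sees the request as HTTPS, and some checkout/payment code responds with a 500 when it cannot confirm a secure connection. Pound can pass that information along in a header the application can check (the addresses below are placeholders):

        ListenHTTPS
            Address 0.0.0.0
            Port    443
            Cert    "/etc/pound/site.pem"
            # tell the backend that the original request arrived over HTTPS
            AddHeader "X-Forwarded-Proto: https"
            Service
                BackEnd
                    Address 127.0.0.1
                    Port    6081    # Varnish
                End
            End
        End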

    Read the article

  • Some Domain Clients unable to access certain websites

    - by Shaunie
    I have a small domain with around 20 clients and a 2003 R2 SP2 DC. Most of my clients can browse the internet freely and don't have a problem. However, a couple are reporting problems accessing certain sites, e.g. Hotmail, Skyscanner and BBC News. They can browse those sites sometimes, then at other times they get 408/409 errors. Other machines in the domain can access these sites fine. I have cleared the DNS cache on the affected machines and changed the external DNS servers on the DC, still to no avail. The main issue is that the person who cannot access Skyscanner uses it several times a day to book flights for employees going on leave or returning to work. Both clients are running XP SP3, though one machine is being replaced with one running Windows 7 shortly. Any advice greatly appreciated. Thanks.
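
    Intermittent timeouts on only some sites from only some machines is a classic MTU/fragmentation symptom, so one cheap check, sketched here from an affected XP client, is to find the largest packet that gets through without fragmentation (1472 bytes of payload plus 28 bytes of headers corresponds to a standard 1500-byte MTU) and compare the result with a working client:

        ping -f -l 1472 www.bbc.co.uk

        REM if that reports "Packet needs to be fragmented", step the size
        REM down (e.g. -l 1452, -l 1432) until the ping succeeds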

    Read the article

  • PPTP VPN Server issue : server = centOS & client = windows 7

    - by jmassic
    I have a CentOS server configured as a PPTP VPN server. The client is a Windows 7 machine with "Use default gateway on remote network" enabled in the advanced TCP/IPv4 properties. He can connect to the CentOS server without any problem and can access: the box of his ISP (http://192.168.1.254/), the CentOS server itself, and the website hosted on the server (over http://). But he canNOT access any other web service (google.com or 74.125.230.224). I am a beginner with web servers, so I do not know what can cause this problem. Note 0: the Windows 7 user must be able to access the whole internet through the CentOS PPTP proxy. Note 1: with "Use default gateway on remote network" UNCHECKED it is the same problem. Note 2: with "Use default gateway on remote network" UNCHECKED AND "disable class based route addition" CHECKED, Windows 7 can reach Google, but via the ISP IP (so not through the VPN...); see screenshot. Note 3: I have run echo 1 > /proc/sys/net/ipv4/ip_forward and iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
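
    Given that the tunnel comes up and local addresses are reachable but the wider internet is not, two things are worth checking, sketched here with eth0 and Google's resolvers as assumptions: that the client is handed working DNS servers, and that the TCP MSS is clamped to the tunnel's smaller MTU (an unclamped MSS typically lets small responses through while larger pages hang).

        # clamp the MSS for traffic forwarded over the tunnel
        iptables -t mangle -A FORWARD -p tcp --tcp-flags SYN,RST SYN \
                 -j TCPMSS --clamp-mss-to-pmtu

        # in /etc/ppp/options.pptpd: give the client usable DNS servers
        ms-dns 8.8.8.8
        ms-dns 8.8.4.4

        # in /etc/sysctl.conf: make forwarding survive a reboot
        net.ipv4.ip_forward = 1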

    Read the article

  • Question about conditions of a VPS hosting provider

    - by baobeiii
    I am looking into buying a VPS from a company. In their terms of service it says users may not: "a) Use 25% or more of system resources for longer than 90 seconds. There are numerous activities that could cause such problems; these include: CGI scripts, FTP, PHP, HTTP, etc." So basically you're only allowed to use a quarter of what you're paying for? Does anyone know if this is a standard restriction for most hosting providers? It seems a bit ridiculous, but I don't know what's normal in the server world. And the weird thing is that they only sell Xen servers, so why can't I use my allotted resources, since no one else can use them? Thanks.

    Read the article

  • Can't connect to memcached

    - by DMClark
    We currently have memcached running on CentOS. None of our PHP applications can connect; we have tried multiple applications trying to establish access. The most informative PHP error we get is: "Memcache::get() [function.Memcache-get]: Server 127.0.0.1 (tcp 11211) failed with: Permission denied (13) in /var/www/.." memcached 1.4.5, PECL 2.25. We can telnet to it and that works. iptables allows full access from lo to lo. We've tried this on two different servers, with both the compiled version and the RPM on CentOS 5.5, and get the same result. Is there anything fairly obvious that we are missing?
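
    "Permission denied (13)" from Apache/PHP while a telnet from a shell works is the classic SELinux signature on CentOS: the httpd policy forbids outbound network connections by default. A sketch of the check and the fix (the memcached-specific boolean only exists in newer policy versions, hence the more general fallback):

        # is SELinux enforcing, and what is httpd currently allowed to do?
        getenforce
        getsebool -a | grep httpd

        # allow Apache/PHP to make network connections (persists across reboots)
        setsebool -P httpd_can_network_memcache on    # if the boolean exists
        setsebool -P httpd_can_network_connect on     # otherwise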

    Read the article

  • SSH not working after installing SVN server on Debian

    - by sLIDe
    Today I had to install an SVN server on my Debian server. I used this tutorial (except that I didn't do anything to allow connecting to SVN through file://, http:// or https://, only svn:// and svn+ssh://). After I installed the SVN server and configured it following that tutorial, I tried to connect to it. I could connect using the svn:// protocol, but when I tried to connect using svn+ssh://, my server's SSH stopped responding. Even after I stopped the SVN server and restarted the SSH server, I can't connect to it.
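
    Nothing in a typical svnserve setup should touch sshd, so before digging further it is worth confirming whether sshd is still running and where the connection stalls; a generic diagnostic sketch, assuming Debian paths:

        # on the server (via console access)
        /etc/init.d/ssh status
        netstat -tlnp | grep ':22 '
        /usr/sbin/sshd -t              # syntax-check /etc/ssh/sshd_config
        tail -f /var/log/auth.log

        # from the client, watch where the negotiation hangs
        ssh -vvv user@server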

    Read the article

  • Please explain how to set this up using VMware and AD [closed]

    - by user552585
    In my organisation we have more than 100 PCs and three high-spec IBM servers. The scenario is 300 employees, different kinds of programmers (.NET, Java, PHP etc.), who use these systems in different shifts without stopping their work. I want all the applications they require to be available on every system, with each user having their own ID and password to log in, and I have to secure the organisation's data and the users' data against tampering by other users. Please explain how to set this up using VMware and AD in a Microsoft environment, in a fully secured manner. Please give a brief explanation. Please help me.

    Read the article

  • Multi server management

    - by user788721
    We are running a website that allows users to create their own content and then share it through an iframe. We would like to add more servers to host the user content, keeping the main one for the website itself. Each user has a link like xxxx.com/content989856 or xxxx.com/content45454545. We were thinking of two options: using an htaccess rule on the main server that redirects to the right server, but the problem is that if the main server is down, then all the content is down as well; or using subdomains depending on where the content is hosted, but then if we move a user's content from one domain to another, we will have to change his links as well. Do you know a better option, or are those really the only two available? I am wondering how big websites like YouTube handle this problem. Thank you very much for your help.
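
    For the first option, a minimal sketch of what the redirect on the main server could look like, assuming mod_rewrite is available (the mapping of a content ID to a host is hard-coded here purely for illustration; in practice it would come from a RewriteMap or from the application itself):

        # .htaccess on the main server
        RewriteEngine On
        # send this content ID to the server that actually hosts it
        RewriteRule ^content(45454545)$ http://media2.xxxx.com/content$1 [R=302,L]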

    Read the article

  • Best Practice for upgrading PHP On Production Systems

    - by Demic
    We have two load-balanced web servers running PHP 5.3. I've been asked by our dev team to upgrade PHP to 5.4 because they need certain functionality it will bring. The main issue is that 5.3 is the latest version in the distro's repository, so to upgrade using the package manager I'll need to add a third-party repo. I don't have a problem with this per se, but I'm concerned about using a package from a "non-official" source. The other option is to compile PHP from source, but I guess this will prevent me from using the package manager to upgrade at any stage in the future? So I'm just looking for some guidance on which way to go. Compile from source, or install from any old repo that purports to supply PHP 5.4? Or perhaps there's a third option I haven't considered? Thanks in advance, Demic
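
    If compiling from source ends up being the choice, one middle ground (a sketch of the idea, not a recommendation for any particular repo) is to build into a separate prefix so it never collides with the distro's PHP packages, can be enabled per pool or per vhost, and can be rolled back by switching a path:

        ./configure --prefix=/opt/php-5.4 \
                    --with-config-file-path=/opt/php-5.4/etc \
                    --enable-fpm
        make && make install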

    Read the article

  • Windows Server 2003 - Give User Full Admin Privileges

    - by APShredder
    I am running Windows Server 2003. There are a couple of user accounts that I would like to promote to administrator accounts. I've tried several ways to do so, but I am still relatively new to setting up a server. If anyone has any ideas on how to go about promoting these users, I thank you in advance. EDIT: I should probably mention that this is a domain controller. I didn't realize that this changed the answer I was looking for. I apologize; as I said before, I am new to the world of servers. EDIT #2: I've added the users to the Administrators group like most of the answers recommended, but the users don't seem to have admin rights yet. I think this might be because they are also in the Domain Users group, which I can't seem to be able to remove them from.
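
    For context, a domain controller has no local accounts or local Administrators group in the usual sense, so the membership has to be granted in the domain, and it only takes effect after the user logs off and back on (Domain Users membership is normal and is not what is blocking them). A sketch with a placeholder username:

        net group "Domain Admins" someuser /add /domain

        REM or, for admin rights on the domain controllers only:
        net localgroup Administrators DOMAIN\someuser /add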

    Read the article

  • Where is the iSCSI and volume information located on ESX 4.0?

    - by sec_goat
    Let me start by saying I am somewhat of a VMware novice; I know just enough to manage and create servers at a basic level from the vSphere client. I have a VMware server, ESX 4.0, connected over iSCSI to a SAN. The SAN is being used as storage for both RDMs and VMFS volumes. The SAN has died and I have ordered a new one. Where can I see the settings related to the iSCSI and volume configuration, so that I can try to replicate them on the new SAN when it arrives?
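
    In the vSphere Client the relevant screens are under the host's Configuration tab: Storage Adapters (iSCSI initiator name, targets, dynamic/static discovery) and Storage (datastores/VMFS volumes). From the ESX 4.0 service console the same information can be pulled out; a rough sketch, with the caveat that exact flags can vary between builds:

        # the persistent host configuration, including iSCSI settings
        grep -i iscsi /etc/vmware/esx.conf

        # VMFS volume to device mappings
        esxcfg-scsidevs -m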

    Read the article

  • Cisco ASA Multiple Public IP

    - by KGDI
    I have a Cisco ASA 5510, and the articles I have found about the ASA and multiple public IPs say this can't be done. My question is how to best solve a scenario like this: I have three zones, Outside, Inside and DMZ. Outside is the internet, Inside is the client machines, and DMZ is a zone for servers providing external and internal services. My real scenario is a bit more complex, but to keep things simple this will do: I want to place an Exchange server and a web server (both externally reachable) in the DMZ zone. The web server uses TCP 80/443, and the Exchange server uses 443. So to the problem: with the ASA only having one public IP, how would you DNAT port 443 to both internal hosts behind that one public IP? Usually, when I do this kind of scenario with Linux boxes, I use alias interfaces like eth0:0 and eth0:1 and set one public IP on each. To me this must be a pretty common scenario; any ideas on how to solve it with the ASA? /KGDI
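
    For what it's worth, a single outside IP can only forward a given port to one inside host, so with one public address the usual compromises are a second public IP, a different outside port for the second service, or a reverse proxy in the DMZ that fans 443 out to both. A sketch of static PAT in pre-8.3 ASA syntax (addresses are placeholders; 8.3 and later use object NAT instead):

        ! web server gets 80 and 443 on the outside interface address
        static (DMZ,outside) tcp interface 80  192.168.10.5 80  netmask 255.255.255.255
        static (DMZ,outside) tcp interface 443 192.168.10.5 443 netmask 255.255.255.255
        ! Exchange has to live on a different outside port, e.g. 8443
        static (DMZ,outside) tcp interface 8443 192.168.10.6 443 netmask 255.255.255.255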

    Read the article

  • Unexpected "waiting for localhost"?

    - by Tenaar
    I ran into something today that kind of worried me. Lately my computer has been kind of slow, and I'm dealing with that, but today when I opened Facebook in Google Chrome, I noticed a message in the bottom left corner while the site was loading that said "Waiting for localhost". It was brief, and I only managed to notice it because my computer is slower than it used to be, which caused Chrome to hang just long enough for me to read it. As I'm quite confident that Facebook isn't running on my localhost, I'm wondering what could make Chrome wait for localhost while I'm loading webpages from external servers. Is there malware of some kind that I should be worrying about? Unfortunately I have no other information than this to go on, and I have no idea how to investigate further or whether it generated any logs. I'd appreciate any help in figuring this out!
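
    Before assuming malware, two mundane explanations are worth ruling out: a proxy on 127.0.0.1 (antivirus suites, debugging proxies and, yes, some malware insert one into the browser's connection settings) and an extension or hosts-file entry that points a resource at localhost. A quick sketch of what to look at, assuming a Windows machine:

        REM anything listening on localhost, and which process owns it (run elevated)
        netstat -abno | findstr LISTENING

        REM any unexpected redirects in the hosts file
        type C:\Windows\System32\drivers\etc\hosts

        REM in Chrome itself: check chrome://extensions and the proxy settings
        REM reachable via Settings -> Show advanced settings -> Network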

    Read the article

  • Get SMTP to work

    - by user664408
    We upgraded to Exchange 2010, and this broke an old Java-based script that connected and sent out e-mail messages. Many hours later we still can't get Exchange to work the way Exchange 2003 did. We abandoned that hope and decided to create a Linux postfix server to forward the e-mail from the old system to Exchange, taking Exchange out of the picture on the Java side. This still doesn't work, with similar errors. I need help figuring out what is different between Exchange 2003 with SSL and authentication and the new servers, both the Linux box and Exchange 2010. My guess is that both offer TLS, and for some reason the Java code won't fall back to the older version of SSL; it just fails. Can someone help me either set up Exchange 2010 to work like 2003 used to, OR set up postfix to insist on SSL 2.0 instead of TLS? Unfortunately no one knows anything about the Java code, and apparently it can't be decompiled. Any help is appreciated.
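
    One possibility, offered as a hedge rather than a diagnosis: old mail clients often expect "wrapped" SSL on port 465 rather than STARTTLS on 25/587, which is what Exchange 2010 advertises by default. Postfix can be told to offer a wrapped-SSL listener; a sketch of the relevant pieces, with the relay host name a placeholder:

        # /etc/postfix/master.cf -- SSL-wrapped SMTP on port 465
        smtps     inet  n       -       n       -       -       smtpd
          -o smtpd_tls_wrappermode=yes
          -o smtpd_sasl_auth_enable=yes

        # /etc/postfix/main.cf -- hand everything off to Exchange
        relayhost = [exchange.example.local]:25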

    Read the article

  • Hyper-V Manager - Host Access During a Catastrophe

    - by LonnieBest
    How can I ensure that I always have Hyper-V Manager access to a Hyper-V server, even when the Active Directory server is down (in a domain-login environment)? Background: the one that came before me set up the company's servers as virtual machines on a host running Hyper-V Server 6.1 (7601) Service Pack 1. For managing Hyper-V, he installed Windows 7 onto a virtual machine (running on the same host) with Hyper-V Manager installed. When the (virtual) Active Directory server (running on this same host) is rebooted, I'm unable to RDP into the Windows 7 virtual machine during that reboot, and therefore unable to reach Hyper-V Manager while the Active Directory server is down. I suspect I can't log in because I can't authenticate against the Active Directory server. I'm going to install Hyper-V Manager onto some additional managers' workstations, but how can I ensure they'll have access in a catastrophe where Active Directory authentication isn't possible?
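
    One way to keep a management path that does not depend on AD, sketched with placeholder names: create a local administrator account on the Hyper-V host itself and store its credentials on each management workstation, so Hyper-V Manager (and RDP to the host) can authenticate directly against the host even when no domain controller is reachable.

        REM on each management workstation, run once from an elevated prompt
        cmdkey /add:HYPERVHOST /user:HYPERVHOST\LocalAdmin /pass

        REM then connect Hyper-V Manager to HYPERVHOST by name as usual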

    Read the article

  • How often does Linux check its /etc/localtime file?

    - by DarkSheep
    I am trying to troubleshoot a problem, and some information would help me get to a solution: how often does Linux check the /etc/localtime file? Additionally, if there is a configuration setting to increase the frequency (for testing), that would be helpful. Normally I would assume that it is checked when a call to the NTP server is executed, but NTP is not installed. It can't possibly be checked every time the date function is called, as this would cause problems on busy web servers. The server is Ubuntu 12.04, but I don't think that is relevant to the question.
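
    For what it's worth, glibc generally reads /etc/localtime once, the first time a process needs the timezone (via tzset()/localtime()), and caches the result, so short-lived commands pick up a change immediately while long-running daemons usually don't see it until they are restarted. A quick way to observe this on a test box:

        # a fresh process opens the file
        strace -e trace=open date 2>&1 | grep localtime

        # a long-running process generally will not re-open it;
        # attach to a daemon's PID and change the timezone to verify
        strace -e trace=open -p <pid>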

    Read the article

  • Use server git installation in GitHub for Windows

    - by Lg102
    We are using Git as the version control system for our website development. I work from a laptop, which is connected to the internal network via WiFi, and I've mapped the server drives as network drives in Windows. Commands such as git status take significantly longer for me than they do for my co-workers on wired connections. When I connect to the server over SSH and run the commands on the git installation there, performance is even better. Is there a way to configure GitHub for Windows to use the server-installed git (with my credentials)? Note: while our production servers have a proper user/permission setup, the development server has only one root user.
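
    As far as I know, GitHub for Windows always runs its bundled git locally against whatever working copy it is given, so it cannot delegate to a git binary on the server. The practical workaround, sketched with a placeholder host and paths, is to stop pointing it at the mapped drive: keep a local clone on the laptop, use the server copy over SSH as the remote, and only touch the network on fetch/push.

        # one-time: clone from the dev server over SSH instead of the mapped drive
        git clone ssh://youruser@devserver/var/www/site.git C:/work/site

        # day to day: status and commits are local and fast
        git -C C:/work/site status
        git -C C:/work/site push origin master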

    Read the article
