Search Results

Search found 2291 results on 92 pages for 'webserver'.

Page 13/92 | < Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >

  • How can I optimize ubuntu desktop to run my webserver

    - by Parry
    Hi, I am using Ubuntu desktop edition to run my Drupal website on an intranet. I know that the best choice for running a web server is Ubuntu Server edition, but due to some constraints I am using the desktop edition. I installed XAMPP on my machine and my website is up and running. I want to know how I can optimize the machine. Since I will use very few of the desktop edition's features, are there any packages I can remove or services I can stop to free up memory and CPU, and are there any packages I should install to improve Ubuntu's performance?
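
    A hedged sketch of the usual first steps, assuming a stock Ubuntu desktop with GDM: stop the graphical desktop (the single biggest memory consumer) and boot to a text console from now on. The package names at the end are examples only, not a tested list.

      # Free the RAM held by X/GNOME immediately
      sudo service gdm stop

      # Boot to a console by default: put "text" on the kernel command line
      # (assumes the stock "quiet splash" default is still in place)
      sudo sed -i 's/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"/GRUB_CMDLINE_LINUX_DEFAULT="text"/' /etc/default/grub
      sudo update-grub

      # Remove desktop-only packages you are certain you never use
      sudo apt-get remove --purge gnome-games evolution
      sudo apt-get autoremove
      free -m    # confirm how much memory was reclaimed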

    Read the article

  • Windows 2003 - Isolate mailserver on webserver with VMware Server

    - by user43279
    Hi, I have a Virtual Private Server with Windows 2003 and root access. The server mainly acts as a web hosting machine (IIS, Apache); additionally, it is used as a mail server. Is it possible to isolate the mail server (for example hMailServer) by running it under VMware Server on Windows 2003, so that potential viruses cannot move from the guest into the host system? Is this a good direction for protecting the web server from viruses? Kind regards, Jakub

    Read the article

  • Can ping between Host and Guest, but can't access webserver with VirtualBox

    - by Gastoni
    How come I can ping back and forth between host and guest using VirtualBox, but cannot reach the web server installed in the guest from the host? I'm using a host-only network.

    Host: Ubuntu 10.10, vboxnet0 - 192.168.56.1
      ping to self: works
      ping to guest: works
      access to web server in guest: FAILS

    Guest: Fedora 13, eth1 - 192.168.56.101
      ping to self: works
      ping to host: works
      access to web server on host: works
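
    Ping succeeding while TCP fails usually points at the guest's firewall: Fedora's default iptables rules allow ICMP but block inbound TCP on port 80. A hedged sketch of the usual checks, run on the Fedora guest:

      # Is the web server listening on all interfaces (0.0.0.0), not just 127.0.0.1?
      netstat -tlnp | grep :80

      # Open port 80 and persist the rule (Fedora 13 era, iptables service)
      iptables -I INPUT -p tcp --dport 80 -j ACCEPT
      service iptables save

      # Then verify from the host
      curl -I http://192.168.56.101/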

    Read the article

  • Extracting httpdocs from Plesk Panel 9.5.4 Webserver backup file

    - by Paddington
    Good day, I am having problems manually extracting domains from a Plesk 9.5 backup that was FTPed onto my backup server. I have followed the article http://kb.parallels.com/en/1757 using method 2. The problem is at this step: zcat DUMP_FILE.gz > DUMP_FILE. My backup file CP_1204131759.tar is a tar archive, so zcat does not work with it. I proceeded to run: cat CP_1204131759.tar > CP_1204131759. But when I try # cat CP_1204131759 | munpack I get an error that munpack did not find anything to read from standard input. I went on to extract the tar backup file using the xvf flags and got about 20 files similar to these: CP_sapp-distrib.7686-0_1204131759.tgz CP_sapp-distrib.7686-35_1204131759.tgz CP_sapp-distrib.7686-6_1204131759.tgz How best can I extract the httpdocs of a domain from this server-wide Plesk 9.5.4 backup?
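
    A hedged sketch of one way to locate the content, not Plesk documentation: each of those .tgz fragments is itself an archive, so listing their contents and grepping for httpdocs shows which fragment holds the domain you want before anything is extracted. THE_RIGHT_FILE.tgz below is a placeholder.

      mkdir extracted && tar xvf CP_1204131759.tar -C extracted
      cd extracted
      for f in *.tgz; do
          echo "== $f =="
          tar tzf "$f" | grep -i 'httpdocs'
      done

      # once the right fragment is identified, pull out just the web content
      tar xzf THE_RIGHT_FILE.tgz --wildcards '*httpdocs*'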

    Read the article

  • Safely remove webserver from nginx backend pool?

    - by Dean Hiller
    We have 4 webservers behind nginx handling 262 events/second, and I would like to tell nginx to stop sending requests to one of those servers. If I simply remove the server from the config file and reload, aren't all requests in flight to that server dropped on the floor, since nginx no longer knows about it? What can I add to the config so that it drains the server from the pool gradually and I can wait for current requests to complete?
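
    A sketch of the standard drain procedure, assuming an ordinary upstream block (the addresses are placeholders): mark the server down instead of deleting it, then reload. A reload starts new workers under the new configuration while the old workers finish the requests they already hold, so nothing in flight is dropped.

      upstream backend {
          server 10.0.0.1:8080;
          server 10.0.0.2:8080 down;   # drained: nginx routes no new requests here
          server 10.0.0.3:8080;
          server 10.0.0.4:8080;
      }

      # then, on the nginx host:
      nginx -s reload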

    Read the article

  • Windows 2003 Server Suddenly Drops Connections to everything but PING and port 80 (webserver)

    - by Urda
    Hi, I'm running a Windows 2003 dedicated server, and the box has been rock solid for more than two years. Just recently I have had an issue where, after a certain amount of time, the server stops accepting connections on all ports except 80, though it still responds to pings. If you browse to a static HTML page on the box, it displays; if you pull up a PHP file that makes an outside database connection, it fails. RDP refuses to connect, and LogMeIn reports "The connection was reset". The only fix at this time is to have the provider physically reboot the box. The event log doesn't leave any helpful messages, and I am at a loss as to what is causing this, since I have not installed any new software on the machine in months. Any tips or help tracking down this issue would be appreciated. I don't mind updating the question as responses come in; I just do not know where to start troubleshooting because it seems random at this time.

    Read the article

  • Apache2 WebServer not allowing me to view website/files in /var/www

    - by CitadelCSAlum
    I used to be able to access websites/files stored in the directory /var/www. I have not used this setup for a while, but now I need to store media in that directory (or in /var/www/images). I noticed that my Apache web server wasn't running correctly, so I did a complete package removal and then reinstalled, but I am still unable to access a test page index.html at /var/www/index.html by going to http://myipaddresshere/index.html. Is there some initial configuration I need to do so that I can store HTML and media files in this directory and access them from the browser? I don't remember having to do anything before.
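
    For comparison, a minimal sketch of the stock Debian/Ubuntu default site that normally serves /var/www; the site name "default" and the Apache 2.2-era directives are assumptions about that layout. If the reinstall did not recreate it, something like this in /etc/apache2/sites-available/default, enabled and reloaded, restores the behavior:

      <VirtualHost *:80>
          DocumentRoot /var/www
          <Directory /var/www>
              Options Indexes FollowSymLinks
              AllowOverride None
              Order allow,deny
              Allow from all
          </Directory>
      </VirtualHost>

      # enable it, reload, and check that the vhost is actually mapped
      sudo a2ensite default
      sudo /etc/init.d/apache2 reload
      apache2ctl -S

    The files themselves also need to be readable by the Apache user (e.g. chmod 644).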

    Read the article

  • View Webserver From Internal and External

    - by Just4Net
    We have a web server hosted in our office, and it works fine when people access the site from outside the office. The problem is that when people inside the office go to the website (www.example.com), it is very slow, because the traffic goes out over the internet and comes back in. Our LAN is Windows with Active Directory, and our web server is CentOS with its own internet connection. How can I make requests from users in my office open the website over the local network, instead of going out through one internet connection and back in through the other?
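
    The usual cure is split-horizon DNS: since the LAN already runs Active Directory DNS, publish an internal zone for the site's name that resolves to the server's LAN address, so office clients never leave the building. A hedged sketch on the domain controller; the zone name and 192.168.1.50 are placeholders, and note that once the internal zone exists, every public record for that domain (mail and so on) must be duplicated in it:

      rem create an internal copy of the zone and point www at the LAN IP
      dnscmd /zoneadd example.com /primary /file example.com.dns
      dnscmd /recordadd example.com www A 192.168.1.50
      rem clients pick up the change once their caches expire (or ipconfig /flushdns)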

    Read the article

  • PHP file_get_contents() does not work after uploading to the webserver

    - by Ethan
    Sample code:

      $html = file_get_contents('http://www.google.com');
      echo $html;

    It works on localhost, but after uploading to the webserver it gives me a warning: file_get_contents(): php_network_getaddresses: getaddrinfo failed: Temporary failure in name resolution. If I replace the domain name with Google's IP address, it instead warns: failed to open stream: Connection timed out. And ini_get("allow_url_fopen") returns 1.
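
    The getaddrinfo failure means the web server itself cannot resolve host names (a broken resolver configuration or a firewall blocking DNS), and the timeout on a raw IP suggests outbound connections are filtered too, so this looks like a hosting configuration problem rather than a PHP one. A small diagnostic sketch along those lines; www.google.com is just a test host:

      <?php
      // If gethostbyname() returns the host name unchanged, DNS resolution is
      // failing on the server itself; the fix belongs in the server's resolver
      // or firewall configuration, not in PHP code.
      $host = 'www.google.com';
      $ip = gethostbyname($host);
      if ($ip === $host) {
          echo "DNS lookup failed for $host\n";
      } else {
          echo "$host resolves to $ip\n";
          $html = @file_get_contents('http://' . $host . '/');
          echo $html === false ? "HTTP fetch failed\n" : strlen($html) . " bytes fetched\n";
      }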

    Read the article

  • Converting .docx to pdf (or .doc to pdf, or .doc to odt, etc.) with libreoffice on a webserver on the fly using php

    - by robertphyatt
    Ok, so I needed to convert .docx files to .pdf files on the fly, but none of the free PHP libraries available let me do it on my server (a web service was not good enough). Basically, either I had to pay for a library (and have it maybe suck) or just deal with the free ones that didn't convert the formatting well enough. Not good enough!

    I found that LibreOffice (OpenOffice's successor) allows command-line conversion using the LibreOffice conversion engine (which DID preserve the formatting like I wanted and generally worked great). I loaded the latest version of Ubuntu (http://www.ubuntu.com/download/ubuntu/download) onto my VirtualBox (https://www.virtualbox.org/wiki/Downloads) and found that I was able to easily convert files from the command line like this:

      libreoffice --headless -convert-to pdf fileToConvert.docx -outdir output/path/for/pdf

    I thought: sweet... but I don't have admin rights on my host's web server. I tried to use a "portable" version of LibreOffice that I obtained from http://portablelinuxapps.org/, but I was unable to get it to work on my host's webserver, because the webserver didn't have all the dependencies (Dependency Hell! http://en.wikipedia.org/wiki/Dependency_hell).

    I was at a loss for how to make it work until I ran across a cool project made by a Ph.D. student (Philip J. Guo) at Stanford called CDE: http://www.stanford.edu/~pgbovine/cde.html. I will let you look at his explanations of how it works (I followed what he did in http://www.youtube.com/watch?feature=player_embedded&v=6XdwHo1BWwY, starting at about 32:00, as well as the directions on his site), but in short, it allows one to avoid dependency hell by copying all the files used when you run certain commands, recreating the Linux environment where the command worked. I was able to use this to run LibreOffice without resorting to someone's portable version of it, and it worked just like it did on Ubuntu with the command above, with one tweak: I needed to run the LibreOffice wrapper that CDE generated.

    So, below is my PHP code that calls it. In this code snippet, the filename to be converted is passed in as $_POST["filename"]. I copy the file to the same spot where I originally converted the file, convert it, copy it back, and then delete all the files (so that it doesn't start growing exponentially). I did it this way because I wasn't able to make it work otherwise on the webserver. If there is a Linux + webserver ninja out there who can figure out how to make it work without doing this, I would be interested to know what you did. Please post a comment or something if you did that.

      <?php
      // first copy the file to the magic place where we can convert it to a pdf on the fly
      copy($time.$_POST["filename"], "../LibreOffice/cde-package/cde-root/home/robert/Desktop/".$_POST["filename"]);

      // change to that directory
      chdir('../LibreOffice/cde-package/cde-root/home/robert');

      // the magic command that does the conversion
      $myCommand = "./libreoffice.cde --headless -convert-to pdf Desktop/".$_POST["filename"]." -outdir Desktop/";
      exec($myCommand);

      // copy the file back
      copy("Desktop/".str_replace(".docx", ".pdf", $_POST["filename"]),
           "../../../../../documents/".str_replace(".docx", ".pdf", $_POST["filename"]));

      // delete all the files out of the magic place where we convert on the fly;
      // my generated files all happened to start with a number
      $files1 = scandir('Desktop');
      $pattern = '/^[0-9]/';
      foreach ($files1 as $value) {
          preg_match($pattern, $value, $matches);
          if (count($matches) > 0) {
              unlink("Desktop/".$value);
          }
      }

      // changing the header to the location of the file makes it work well on androids
      header('Location: '.str_replace(".docx", ".pdf", $_POST["filename"]));
      ?>

    And here is the tar.gz file I generated with CDE. To duplicate what I did exactly, put the tar.gz file in a folder somewhere; I will call that folder the "root". Make a new folder called "documents" in the "root" folder. Unpack the tar.gz and run the PHP script above from the "documents" folder. Success! I made a truly portable version of LibreOffice that can convert files on the fly on a webserver using 100% free, open source software!

    Read the article

  • How can I close the output stream after a JSP has been included?

    - by stu
    I have a webpage that makes an ajax call to get some data. That data takes a long time to calculate, so what I did was have the first ajax call return "loading..." while the thread goes on to calculate the data and store it in a cache; meanwhile the client javascript checks back every few seconds with another ajax call to see if the cache has been loaded yet.

    Here's my problem, and it might not be a problem. After the initial ajax call to the server, I do a ...getServletContext().getRequestDispatcher(jsppath).include(request, response); and then use that thread to do the calculations. I don't mind tying up the webserver thread with this, but I want the browser to get the response and not wait for the server to close the socket. I can't tell whether the server is closing the socket after the include, but I'm guessing it's not. So how can I forcibly close the stream after I've written out my response, before starting my long calculations? I tried o = response.getOutputStream(); o.close(); but I get an IllegalStateException saying that the output stream has already been obtained (presumably by the JSP I'm including).

    So my questions: 1) is the webserver closing the socket? (I'm guessing not, because you could always include another JSP.) 2) if, as I assume, it is not closing the socket, how do I do that?
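
    The container owns that socket (and keep-alive may hold it open for the next request), so forcing the stream closed is the wrong lever; the include has also already claimed the output stream, hence the IllegalStateException. A sketch of an alternative pattern, assuming the Servlet 2.x API: flush the small "loading..." response so the browser receives it immediately, return from doGet, and push the slow work onto a background thread that the polling ajax calls can check. computeExpensiveData is a hypothetical stand-in for the real calculation.

      import java.io.IOException;
      import javax.servlet.ServletContext;
      import javax.servlet.http.HttpServlet;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;

      public class LoadingServlet extends HttpServlet {
          protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                  throws IOException {
              resp.setContentType("text/plain");
              resp.getWriter().print("loading...");
              resp.flushBuffer(); // bytes reach the browser now; no socket juggling needed

              final ServletContext ctx = getServletContext();
              new Thread(new Runnable() { // pre-Servlet-3.0 style, matching the era
                  public void run() {
                      Object result = computeExpensiveData();  // hypothetical stand-in
                      ctx.setAttribute("resultCache", result); // later ajax polls read this
                  }
              }).start();
              // returning from doGet completes the response; the thread keeps working
          }

          private Object computeExpensiveData() {
              return "expensive result"; // placeholder for the real calculation
          }
      }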

    Read the article

  • LinkDemand error on webserver when using TraceSource

    - by robertpnl
    Hi, on a webserver (shared hosting provider) I published a website with an ADO.NET Framework model that uses MySQL Connector 6.3.1. When I request a page, a SecurityException occurs with this error message: "LinkDemand. The type of the first permission that failed was: System.Security.Permissions.SecurityPermission. The Zone of the assembly that failed was: MyComputer". The exception is raised where the code collects the listeners of a TraceSource:

      public class MySqlTrace
      {
          private static TraceSource source = new TraceSource("mysql");

          static MySqlTrace()
          {
              foreach (TraceListener listener in source.Listeners) // <-- exception thrown here
              {
                  // ...
              }
          }
      }

    The web.config doesn't contain any trace data or a system.diagnostics section. My question is: why do I get a LinkDemand security exception while collecting the source listeners? What might be wrong here?
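
    Shared hosts typically run ASP.NET in medium trust, where enumerating trace listeners demands a SecurityPermission the code does not have, which is why the LinkDemand fails even with an empty system.diagnostics section. If the trace setup were under your own control, one guard pattern looks like the sketch below; note that a LinkDemand is checked when the calling method is JIT-compiled, so the demanding call has to live in its own method with the catch around the call site. For the connector itself, a version with partial-trust support (or asking the host for full trust) is the usual route.

      using System.Diagnostics;
      using System.Security;

      public static class SafeTraceSetup
      {
          public static void TryConfigure(TraceSource source)
          {
              try
              {
                  ConfigureListeners(source); // the demanding code sits one call away
              }
              catch (SecurityException)
              {
                  // Partial-trust host (typical shared hosting): skip tracing.
              }
          }

          private static void ConfigureListeners(TraceSource source)
          {
              foreach (TraceListener listener in source.Listeners)
              {
                  // configure the listener here...
              }
          }
      }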

    Read the article

  • Symfony Rewrite rules on Zeus webserver

    - by Ben
    I would like to run a symfony project on a Zeus webserver; however, I cannot get the rewrite rules to work. Has anyone done this successfully? The symfony .htaccess is as follows:

      Options +FollowSymLinks +ExecCGI

      <IfModule mod_rewrite.c>
        RewriteEngine On

        # uncomment the following line, if you are having trouble
        # getting no_script_name to work
        #RewriteBase /

        # we skip all files with .something
        #RewriteCond %{REQUEST_URI} \..+$
        #RewriteCond %{REQUEST_URI} !\.html$
        #RewriteRule .* - [L]

        # we check if the .html version is here (caching)
        RewriteRule ^$ index.html [QSA]
        RewriteRule ^([^.]+)$ $1.html [QSA]
        RewriteCond %{REQUEST_FILENAME} !-f

        # no, so we redirect to our front web controller
        RewriteRule ^(.*)$ index.php [QSA,L]
      </IfModule>

    From what I can tell, the following should work on Zeus:

      match URL into $ with (^(.*)$)
      if matched then set URL = index.php

    But it doesn't... I can only load the home page from /; all other pages just 404. Thanks.
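
    One observation: the attempted Zeus rule rewrites every request unconditionally and has no equivalent of the .htaccess check RewriteCond %{REQUEST_FILENAME} !-f, so requests for real assets also land on index.php. A hedged adaptation of the rewrite script commonly posted for running Drupal on Zeus is sketched below; directive spelling can vary between ZWS versions, so treat it as a starting point rather than a verified rule:

      match URL into $ with ^/(.*)$
      look for file at %{URL}
      if exists then goto END
      look for dir at %{URL}
      if exists then goto END
      set URL = /index.php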

    Read the article

  • How to access the system.webServer web.config node in .NET 2

    - by JK
    Are there any .NET APIs that can read/update the system.webServer node in web.config? I know I can do it by reading/parsing the web.config file as XML, but that's awkward. To read/update the system.web node in .NET 2 I can use:

      HttpModulesSection httpModulesSection =
          (HttpModulesSection)configuration.GetSection("system.web/httpModules");

    But is there any API-based way of accessing system.webServer/modules using .NET 2? I have to reference the .NET 2 version of System.Web.Configuration because I don't know in advance whether my web app will run on a server with .NET 2 or 3.5, so I am limited to .NET 2 API calls only. Thanks
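
    One reason GetSection() has no typed view here is that ASP.NET declares <system.webServer> as an IgnoreSection (it is consumed by IIS 7 itself, not by the .NET configuration system). If that holds for this setup, the raw-XML route can at least be wrapped once; a minimal sketch, where the XPath and the use of HttpContext for path mapping are assumptions about a typical site:

      using System.Web;
      using System.Xml;

      public static class WebServerSection
      {
          // Reads the <modules> entries of <system.webServer> as plain XML.
          public static XmlNodeList GetIis7Modules()
          {
              XmlDocument doc = new XmlDocument();
              doc.Load(HttpContext.Current.Server.MapPath("~/web.config"));
              return doc.SelectNodes("/configuration/system.webServer/modules/add");
          }
      }

    For updates, the same XmlDocument can be modified and written back with doc.Save(), bearing in mind that touching web.config recycles the application.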

    Read the article

  • stub webserver for integration testing

    - by Frank Schwieterman
    I have some integration tests where I want to verify that certain requests are made against a third-party webserver. I was thinking I would replace the third-party server with a stub server that simply logs the calls made to it. The calls do not need to succeed, but I do need a record of the requests made (mainly just the path + querystring). I was considering just using IIS for this. I could 1) set up an empty site, 2) modify the system's hosts file to redirect requests to that site, and 3) parse the log file at the end of each test. This is problematic because IIS log files are not written immediately, and they are written to continuously; I would need to locate the file, read the contents before the test, wait a nondeterministic amount of time after the test, read the updated contents, etc. Can someone think of a simpler way?
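
    A simpler route, sketched below: skip IIS and its log pipeline entirely, self-host a stub with System.Net.HttpListener, and record each path + querystring in memory where the test can assert on it immediately. The prefix URL and the "ok" body are arbitrary choices.

      using System;
      using System.Collections.Generic;
      using System.Net;
      using System.Text;

      public class StubServer : IDisposable
      {
          private readonly HttpListener listener = new HttpListener();
          public readonly List<string> Requests = new List<string>();

          public StubServer(string prefix) // e.g. "http://localhost:8099/"
          {
              listener.Prefixes.Add(prefix);
              listener.Start();
              listener.BeginGetContext(OnRequest, null);
          }

          private void OnRequest(IAsyncResult ar)
          {
              if (!listener.IsListening) return; // shutting down
              HttpListenerContext ctx = listener.EndGetContext(ar);
              lock (Requests) { Requests.Add(ctx.Request.Url.PathAndQuery); }
              byte[] body = Encoding.UTF8.GetBytes("ok");
              ctx.Response.OutputStream.Write(body, 0, body.Length);
              ctx.Response.Close();
              listener.BeginGetContext(OnRequest, null); // keep accepting
          }

          public void Dispose() { listener.Stop(); }
      }

    A test can then construct the stub, point the code under test at the prefix (via the hosts file or configuration, as planned), and assert against stub.Requests as soon as the call returns.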

    Read the article

  • Problem getting a response from the webserver

    - by Amarpreet
    Hi guys, I am sending data to the webserver in the query string by building the URL dynamically. When I view that URL in a UIAlertView it shows the correct one, but when I try to get the response using the NSURL code below, it does not respond:

      NSString *uu = @"http://www.zenhomeenergy.com/ZenIphoneServUpdate.aspx?CustomerID=12&FirstName=Andrew&LastName=Turner&State=SA&Street=60 Highfiled Drive Hillbank no phone number&PostCode=5112&Email= &Mobile= &HomePhone= &WorkPhone= &PrimaryResidence=True&HomeOwner=True";
      NSString *text = [NSString stringWithContentsOfURL:[NSURL URLWithString:uu]];
      if (text) {
          if ([text isEqualToString:@"Success."]) {
              textView.text = [textView.text stringByAppendingString:@"Success.\n"];
          } else {
              textView.text = [textView.text stringByAppendingString:@"Failed.\n"];
          }
      }

    If you put the above URL into a browser it says "Success.", but the code above does not work. Please help.
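
    The literal spaces in the query string are the likely culprit: a browser percent-escapes them automatically, but +URLWithString: returns nil for a string containing raw spaces, so the request is never made. A sketch of the era-appropriate fix; the shortened URL stands in for the full one above:

      NSString *uu = @"http://www.zenhomeenergy.com/ZenIphoneServUpdate.aspx?CustomerID=12&FirstName=Andrew";
      // Percent-escape the raw string before building the NSURL; spaces become %20.
      NSString *escaped = [uu stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
      NSError *error = nil;
      NSString *text = [NSString stringWithContentsOfURL:[NSURL URLWithString:escaped]
                                                encoding:NSUTF8StringEncoding
                                                   error:&error];
      if (text == nil) {
          NSLog(@"Request failed: %@", error); // surfaces the real error instead of silence
      }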

    Read the article

  • More memory usage for IIS 6 / ASP.NET 2.0 on Windows Server 2003

    - by Alan King
    Running Windows Server 2003 SP2 (x86) with IIS 6 and ASP.NET 2.0. The box mostly serves dynamic ASP.NET pages connecting to a SQL Server 2008 server. At any given time there is over 1 GB of memory available out of the 2 GB in the box, and it seems like there should be a way to make better use of the free memory. The server is using a default machine.config and default http.sys settings. I would like to maximize incoming internet connections and database connections. Is there something I can do to make better use of the available memory?

    Read the article

  • How can I retrieve cookies for webserver A when my project is deployed on webserver B?

    - by medopal
    The project consists of multiple modules, each deployed to a separate webserver, all on the same mainframe (same IP address). I have a main menu where I log in and then list all the available modules on all the servers; from there I can click through to any of them. I send cookies in the response when logging in (say, on Server A); then on Server B (one of the modules), when I want to go back to the main menu, I check the cookies to see if the user is logged in. The problem is that Server B isn't seeing the cookies generated by Server A, so each time I return to the main menu the user is logged out. Is there any way to store cookies so they can be used by multiple virtual webservers (on the same IP), or any other idea?
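
    Browsers scope cookies by host name and path, never by IP address: a cookie set by one module defaults to that module's own context path, so the other servers never receive it. Setting the path to "/" (and, if the modules are reached under different host names, setting the domain to a shared parent) makes it visible everywhere. A sketch assuming the servlet API; the cookie name and the ".example.com" domain are hypothetical:

      import javax.servlet.http.Cookie;
      import javax.servlet.http.HttpServletResponse;

      public final class SharedCookies {
          public static void addSharedLoginCookie(HttpServletResponse response,
                                                  String token) {
              Cookie auth = new Cookie("authToken", token);
              auth.setDomain(".example.com"); // all hosts under example.com can read it
              auth.setPath("/");              // every context path, not just this module's
              response.addCookie(auth);
          }
      }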

    Read the article
