Search Results

Search found 18715 results on 749 pages for 'website attack'.


  • Website occasionally does not load on first click

    - by tfe
    Today I noticed that my website, hosted on a virtual server, occasionally does not load on the first click. I click a link, the browser starts loading the page, but nothing loads and no error message appears (such as "connection reset by peer"). Nothing. When I click the same link again, the page loads immediately. The same thing happens on two computers in different browsers. It does not happen every time, maybe once every 20 or 30 clicks. Sites on other servers load without this problem. Any ideas what could be causing it?
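    One way to pin down how often this actually happens is to hit the page repeatedly and record which requests stall or fail. The sketch below is a minimal, hypothetical probe; the URL, attempt count and timeout are placeholders, not values from the question.

    ```python
    import time
    import urllib.request

    URL = "http://example.com/"   # placeholder for the affected page
    ATTEMPTS = 100
    TIMEOUT = 10                  # seconds before an attempt counts as "hung"

    failures = 0
    for i in range(ATTEMPTS):
        start = time.time()
        try:
            with urllib.request.urlopen(URL, timeout=TIMEOUT) as resp:
                resp.read()
                status = resp.status
            print(f"{i:3d} ok     {time.time() - start:6.2f}s  status={status}")
        except OSError as exc:  # URLError and socket timeouts are both OSError subclasses
            failures += 1
            print(f"{i:3d} FAILED {time.time() - start:6.2f}s  {exc}")
        time.sleep(1)  # small gap so the probe does not hammer the server

    print(f"{failures}/{ATTEMPTS} requests failed or timed out")
    ```

    Run from an affected client, a failure rate of roughly 1 in 20-30 would match the observation and give something concrete to correlate with the server logs.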

    Read the article

  • Emulate and debug a BlackBerry mobile website

    - by Pennf0lio
    I'm developing a website that needs to be compatible with mobile phones (especially BlackBerry). How can I debug its CSS? I have already installed the Default User Agent plug-in for Firefox; the problem is I just don't know how to add the BlackBerry agent. The next problem is how I can properly emulate the mobile site without actually having the device. Thanks! I'm using Windows 7, Firefox + Firebug + Web Developer Toolbar. I know XHTML, CSS and WordPress.

    Read the article

  • Hosting website when port 80 is taken?

    - by cinqoTimo
    A few months ago, we purchased an R-HUB unit to replace WebEx for remote support. The device operates on port 80, which doesn't appear to be configurable. I know that in IIS you can specify a port other than 80, but the problem is in the port forwarding. On our router, we have to map an incoming port to a forward port, which then directs traffic to the node (webserver). However, the incoming port for both the webserver and the R-HUB is 80, and the setup seems to be getting confused, as I can only get to the R-HUB, not the website. How can I expose both devices? Host headers? DNS config?

    Read the article

  • hosting website on a private network

    - by razor
    I'm currently running a website off three Linux servers. I'd like to set up a private network and only allow port 80 traffic to one of the servers. I'd also like to set up a VPN so that only I can access the servers via SSH, or on any port, for developing/debugging. How hard is this to set up and what do I need to get? Do enterprise/commercial routers have VPN functionality built in? How do I handle DNS? E.g., www.mydomain.com would need to point to the router, which forwards traffic to the webserver. Do I set the A record to the router, and somehow tell the router which server to send the HTTP request to? And how would I make server1.mydomain.com resolve to server1 within the private network (without editing hosts files)? Would I need to run my own DNS (e.g. PowerDNS) to do this?

    Read the article

  • Copying Tables from a Website

    - by amemus
    I have difficulty making an Excel-readable file from a table on a website. The constraints specific to my question are: I have to use IE 7 to access the site; Excel is installed on another computer; and the site does not let me view the HTML of the table. Normally, I would simply select the table I want and drag and drop it into Excel, or I would view the page source and copy the HTML data. Neither works in this case. Is there any handy tool out there?

    Read the article

  • Allow access only to one website

    - by Alex
    Hey. I'd like to allow a computer connected directly to the internet to access one website ONLY. The solution of IE's "Content Advisor" or Firefox's "FoxFilter" isn't good enough, because it actually downloads the data and just doesn't display it. I want to block the traffic before the requests are sent. How is this possible? Thanks. Edit: The OS is Windows XP. The browser can be Firefox, Internet Explorer, Chrome... it doesn't matter. The computer is connected directly to the modem.

    Read the article

  • Best way to administer a website remotely

    - by Mark Szymanski
    I have a Windows computer running an intranet website with IIS, and I was wondering what the best way is to administer it from another computer, in this case a Mac. What I want to do: be able to edit pages from my Mac, and VNC into the server because it is 'headless' (I already have this set up). My current file syncing setup: I have Dropbox set up to sync files between the computers and then use PureSync to sync the files in the Dropbox folder into the wwwroot folder. Is there a better (faster/easier) way I could do all this? Thanks in advance!

    Read the article

  • Authentication through mod_auth_kerb should provide website with no user if no TGT provided

    - by loomi
    Users are authenticated by mod_auth_kerb, which works great. To do that I need to set Require valid-user. If there is no valid user, Apache fails with a 401 Authorization Required. I would like Apache to deliver the website anyway, just without providing a REMOTE_USER to the underlying script. This is related to "How to tell mod_auth_kerb to do its job despite no 'require valid-user'", but with the important difference that for a whole subdirectory a Kerberos negotiation should be initiated on every URL, and if it fails the content should be delivered anyway.

    Read the article

  • Using an hg repository as a web site

    - by Tex
    This is somewhat related to my security question here. Is it a bad idea to use an hg / Mercurial repository for a live website? If so, why? Furthermore, we have dev, test and production installations of our website, like dev.example.com, test.example.com and www.example.com. If it's a bad idea to use a repository for a live/production website, would it be OK to use an hg repository for the dev and test sites? I'm also concerned about ease of deployment. We have technical and less technical co-workers who will be working with the site. The technical guys (software engineers) won't have any problem working with the command line or TortoiseHG. I'm more concerned about the less technical guys (web designers). They won't be comfortable working on the command line, and may even find TortoiseHG daunting. These guys mostly upload .css files and images to the server. I'd like for these files (at least the .css files) to be under version control, but I want this to be as transparent as possible for the non-technical guys. What's the best way to achieve this? Edit: Our 'site' is actually a multi-site CMS setup with a main repository and several subrepositories. Mock-up of the repository structure:
        /root           [main repository containing core files and subrepositories]
          /modules      [modules subrepository]
          /sites/global [subrepository for global .css and .php files]
          /sites/site1  [site1 subrepository]
          ...
          /sites/siteN  [siteN subrepository]
    Software engineers would work in the root, modules and sites/global repositories. Less technical guys (web designers) would work only in the site1 ... siteN subrepositories.
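    One common pattern for keeping deployment transparent is a server-side Mercurial changegroup hook that refreshes the served working copy on every push, so designers only ever push from TortoiseHG. Below is a minimal sketch under that assumption; the file paths and the hgrc stanza shown in the comments are hypothetical, not taken from the question.

    ```python
    # deploy.py -- minimal server-side push-to-deploy sketch for Mercurial.
    # Hypothetical wiring in the served repository's .hg/hgrc:
    #   [hooks]
    #   changegroup = python:/srv/hooks/deploy.py:on_push
    import os
    import subprocess

    def on_push(ui, repo, hooktype=None, **kwargs):
        """After every push, refresh the working copy the web server serves."""
        # -C discards stray edits made directly on the server, so the checkout
        # always matches what was pushed.
        result = subprocess.run(["hg", "update", "-C"], cwd=os.fsdecode(repo.root))
        # By Mercurial convention, a truthy return value marks the hook as failed.
        return result.returncode != 0
    ```

    Whether serving a working copy directly from the repository is wise is a separate question; the hook only addresses the "transparent for designers" part.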

    Read the article

  • Website does not resolve in browser but traceroute is successful

    - by Colum
    I am trying to figure out an issue. My internet is working fine, but this one website is not resolving. It works via a proxy, and traceroute works:
         1  192.168.1.1 (192.168.1.1)  4.205 ms  0.568 ms  0.510 ms
         2  * * *
         3  67.59.255.13 (67.59.255.13)  10.583 ms  7.949 ms  7.557 ms
         4  67.59.255.61 (67.59.255.61)  10.256 ms  9.576 ms  13.083 ms
         5  64.15.8.126 (64.15.8.126)  9.943 ms  11.929 ms  11.452 ms
         6  64.15.0.217 (64.15.0.217)  14.655 ms  14.092 ms  13.771 ms
         7  64.15.0.118 (64.15.0.118)  33.201 ms  34.875 ms  36.544 ms
         8  xe-6-0-3.ar1.ord1.us.nlayer.net (69.31.111.169)  34.027 ms  34.957 ms  34.231 ms
         9  ae1-30g.cr1.ord1.us.nlayer.net (69.31.111.133)  82.683 ms  35.138 ms  37.592 ms
        10  xe-3-0-0.cr2.iad1.us.nlayer.net (69.22.142.26)  41.657 ms  34.063 ms  34.519 ms
        11  ae2-30g.ar2.iad1.us.nlayer.net (69.31.31.186)  35.780 ms  36.361 ms  33.968 ms
        12  as33597.xe-3-0-7.ar2.iad1.us.nlayer.net (69.31.30.230)  35.086 ms  as33597.xe-3-0-7.ar2.iad1.us.nlayer.net (69.31.30.234)  38.031 ms  as33597.xe-3-0-7.ar2.iad1.us.nlayer.net (69.31.30.230)  36.833 ms
        13  cr1.iad2.inforelay.net (66.231.176.246)  32.595 ms  cr2.iad1.inforelay.net (66.231.176.10)  31.771 ms  cr1.iad2.inforelay.net (66.231.176.246)  32.622 ms
        14  cr1.iad2.inforelay.net (66.231.176.246)  32.956 ms  33.625 ms !X  41.058 ms
        15  * cr1.iad2.inforelay.net (66.231.176.246)  35.312 ms !X  *
        16  * cr1.iad2.inforelay.net (66.231.176.246)  32.814 ms !X  *
        17  cr1.iad2.inforelay.net (66.231.176.246)  35.459 ms !X  *  53.137 ms !X
    Ping returns this:
        Request timeout for icmp_seq 0
        Request timeout for icmp_seq 1
        Request timeout for icmp_seq 2
        Request timeout for icmp_seq 3
        Request timeout for icmp_seq 4
        Request timeout for icmp_seq 5
        Request timeout for icmp_seq 6
    But what I cannot figure out is why my browsers (Firefox, Safari, Opera) cannot resolve the domain. I am on a WiFi connection. What could be the problem? BTW, I am on a Mac (10.6.5).
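    The !X marks on the final hops usually mean "communication administratively prohibited", which points at filtering near the destination rather than at DNS. A quick way to separate name resolution from reachability is to test each independently of the browser; the sketch below uses a placeholder hostname.

    ```python
    import socket

    HOST = "www.example.com"   # placeholder for the site that will not load

    # Step 1: does the name resolve locally at all?
    try:
        addrs = {info[4][0] for info in socket.getaddrinfo(HOST, 80)}
        print("resolves to:", ", ".join(sorted(addrs)))
    except socket.gaierror as exc:
        print("DNS lookup failed:", exc)
        raise SystemExit(1)

    # Step 2: can we actually open a TCP connection to port 80?
    for addr in sorted(addrs):
        try:
            with socket.create_connection((addr, 80), timeout=5):
                print(f"{addr}: TCP port 80 reachable")
        except OSError as exc:
            print(f"{addr}: TCP connection failed ({exc})")
    ```

    If step 1 succeeds but step 2 fails, the problem is not resolution but a blocked or reset connection, which would also explain why the site loads through a proxy.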

    Read the article

  • restrict access to IIS virtual directory from root website

    - by Senthil
    I have two domains (domain1.com and domain2.com). Both of them use the same Windows hosting server with IIS7. One of the domains is being called the "primary domain" by my hosting provider (GoDaddy), and it always points to the root folder that I was given. For the other domain, I have created a virtual directory in IIS and pointed it there. The folder structure is like this:
        root/
          Default.aspx
          SomeFile.aspx
          domain2folder/
            Default.aspx
            Domain2SomeFile.aspx
    So, if I type domain1.com, I see the regular Default.aspx. But if I type domain2.com, I am shown the contents of domain2folder as if it were a separate web application, which I think is what an IIS virtual directory is meant for. Well and good. But the problem is, when I type http://domain1.com/domain2folder, I see domain2's website! I don't want that to be shown when the path is reached that way from domain1; only visitors using domain2.com should be able to see those contents. How can I do that? Hope I am making sense. Thanks.

    Read the article

  • managing a high traffic media sharing website

    - by Jordan Westerman
    I'm in the process of developing a website that I predict will generate a lot of traffic. The site will be similar to many other sites offering free media streaming: MP3s. We are going to start with a pretty minimal amount of media to share, but the basic idea is that artists will set up a profile page with the music they have made available, and consumers will visit the page and listen to it. We are starting with just a handful of artists, but I think this project will generate more and more artist pages. Eventually I'd like to set it up so consumers can create personalized playlists. How can I best prepare server space and bandwidth capabilities? I have a small team of web designers and programmers working on the site, as I am pretty illiterate when it comes to site management. As the ringleader of this organization, I am more or less looking for financial requirements and monthly burn rate estimates. I don't have a ton of capital to start with; I am putting together a business plan and seeking investments. I have a game plan to grow fast enough to be successful, and slow enough to manage the financial growth requirements. Any questions I may have failed to ask myself? Is it realistic to start this project on a shared server and upgrade later? Any financial advice you think I can use? I really appreciate any advice given, as this is my first business venture. Thank you all in advance. Jordan Westerman, D.B.A. Badfish Productions, LLC

    Read the article

  • Hiding a Website from Search Engine Bots and Viewers by Disabling Default VirtualHost

    - by Basel Shishani
    When staging a website on a remote VPS, we would like it to be accessible to team members only, and we would also like to keep search engine bots off until the site is finalized. Access control by host, whether in iptables or Apache, is not desirable, as the accessing hosts can vary. After some reading of the Apache config and other SF postings, I settled on the following design, which relies on restricting access to specific domain names only. The default virtual host would be disabled in the Apache config as follows, relying on Apache's behavior of using the first virtual host as the site default:
        <VirtualHost *:80>
            # Anything matching this should be silently ignored.
        </VirtualHost>
        <VirtualHost *:80>
            ServerName secretsiteone.com
            DocumentRoot /var/www/secretsiteone.com
        </VirtualHost>
        <VirtualHost *:80>
            ServerName secretsitetwo.com
            ...
        </VirtualHost>
    Then each team member can add the domain names in their local /etc/hosts:
        xx.xx.xx.xx secrethostone.com
    My question is: is the above technique good enough to achieve the stated goals, especially restricting SE bots, or is it possible that bots would work around it? Note: I understand that mod_rewrite rules can be used to achieve a similar effect, as discussed here: How to disable default VirtualHost in apache2?, so the same question would apply to that technique too. Also please note: the content is not highly secretive. The idea is not to devise something that is hack proof, so we are not concerned about traffic interception or the like. The idea is to keep competitors and casual surfers from viewing the content before it's released, and to prevent SE bots from indexing it.

    Read the article

  • How to run a website domain without redirecting if IP is already used for another website? [duplicate]

    - by SSpoke
    This question already has an answer here: Hosting multiple distinct folders for distinct domains. I bought a VPS host that gave me only one IP address, which I used for my first domain name, and it works without any problems. For my second domain name I can't use the same IP address, as it points to the first domain name. So I figured my only option was to use a GoDaddy-hosted iframe redirection, which redirects to a subfolder on my first domain; that has worked so far. Now I'm trying to load PayPal from <?php headers() ?> and I get a permission error because of that iframe: Refused to display 'https://www.paypal.com/cgi-bin/webscr?notify_url=&cmd=_cart&upload=1&business=removed&address_override=1' in a frame because it set 'X-Frame-Options' to 'SAMEORIGIN'. How do I avoid the iframe solution for my second domain without messing up my first domain? Somebody once told me it doesn't matter if you have one IP address, you can host multiple websites on it; how is that possible? DNS doesn't seem to work off ports, as far as I know. Yes, I could host multiple websites in different folders, but that's not what I call hosting a real website; it has to be pointed to by a domain name, so that this iframe issue doesn't happen. My server configuration is httpd (Apache) as it comes with the CentOS 6 (Linux) operating system.

    Read the article

  • Scanning a website for vulnerabilities

    - by Kristen
    I have found that the local school's website has a Perl calendar installed. This was years ago and it has not been used for ages, but Google has it indexed (which is how I found it) and it is full of Viagra links and the like. The program was by Matt Kruse; here are details of the exploit: http://www.securiteam.com/exploits/5IP040A1QI.html I've got the school to remove that, but I think they also have MySQL installed, and I'm aware that out of the box there have been some exploits of the admin tools / login in old versions. For all I know they also have phpBB and the like installed. The school is just using some cheap, shared hosting; the HTTP response header I get is: Apache/1.3.29 (Unix) (Red-Hat/Linux) Chili!Soft-ASP/3.6.2 mod_ssl/2.8.14 OpenSSL/0.9.6b PHP/4.4.9 FrontPage/5.0.2.2510 I'm looking for some means of checking whether they have other junk installed (quite possibly from way back, and now unused) that might put the site at risk. I'm more interested in something that can scan for things like the MySQL admin exploit rather than open ports etc. My guess is that they have little control over the hosting space that they have, but I'm a Windows dev, so this *nix stuff is all Greek to me. I found http://www.beyondsecurity.com/ which looks like it might do what I want (within their evaluation :) ), but I worry about how to find out whether they are well known / honest; otherwise I will be tipping them a wink about a domain name that may be at risk! Many thanks.
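    For a first, very rough pass (with the school's permission) you can simply probe a handful of well-known legacy script locations and see which ones respond. This is no substitute for a real scanner, and the path list below is only an illustrative guess, as is the hostname.

    ```python
    import urllib.request
    from urllib.error import HTTPError, URLError

    SITE = "http://school.example.org"   # placeholder for the school's site
    # A few classic locations of old admin tools / scripts; extend as needed.
    PATHS = [
        "/cgi-bin/calendar.pl",
        "/phpmyadmin/",
        "/phpMyAdmin/",
        "/mysql/",
        "/phpbb/",
        "/forum/",
        "/_vti_bin/shtml.exe",   # FrontPage server extensions
    ]

    for path in PATHS:
        url = SITE + path
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                print(f"{resp.status}  {url}")
        except HTTPError as exc:   # 404, 403 etc. are still informative
            print(f"{exc.code}  {url}")
        except URLError as exc:
            print(f"ERR  {url}  ({exc.reason})")
    ```

    Anything that answers 200 is worth investigating and, if unused, removing; a hosted scanning service would then cover the things a path list like this cannot guess.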

    Read the article

  • SQL Server Reporting Services - website blank, builder works

    - by Keith
    We have a few reports in SQL Server Reporting Services. For some reason, when we run a report from the website, it doesn't return any data. When I run the same report from Report Builder, it returns data. I looked in the logs and the only errors I could find are:
        ReportingServicesService!library!8!6/15/2012-08:12:33:: i INFO: Current DB Version Unknown, Instance Version C.0.8.54.
        ReportingServicesService!library!8!6/15/2012-08:12:33:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportServerDatabaseException: The version of the report server database is either in a format that is not valid, or it cannot be read. The found version is 'Unknown'. The expected version is 'C.0.8.54'. To continue, update the version of the report server database and verify access rights., ;Info: Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportServerDatabaseException: The version of the report server database is either in a format that is not valid, or it cannot be read. The found version is 'Unknown'. The expected version is 'C.0.8.54'. To continue, update the version of the report server database and verify access rights.
        ReportingServicesService!library!8!6/15/2012-08:12:33:: e ERROR: Exception caught while starting service. Error: Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportServerDatabaseException: The version of the report server database is either in a format that is not valid, or it cannot be read. The found version is 'Unknown'. The expected version is 'C.0.8.54'. To continue, update the version of the report server database and verify access rights.
    I'm not really sure why it would report a different version. It's all SQL Server 2008 R2 and I haven't made any changes to it since it's been running.

    Read the article

  • Copy a website and preserve the file & folder structure

    - by DrStalker
    I have an old web site running on an ancient version of Oracle Portal that we need to convert to a flat HTML structure. Due to damage to the server we are not able to access the administrative interface, and even if we could, there is no export functionality that works with modern software versions. It would be enough to crawl the website and have all the pages and images saved to a folder, but the file structure needs to be preserved; that is, if a page is located at http://www.oldserver.com/foo/bar/baz/mypage.html then it needs to be saved to /foo/bar/baz/mypage.html so that the various JavaScript bits will continue to function. None of the web crawlers I've found have been able to do this; they all want to rename the pages (page01.html, page02.html, etc.) and break the folder structure. Is there any crawler out there that will recreate the site structure as it appears to a user accessing the site? It doesn't need to redo any of the content of the pages; once rehosted, the pages will all have the same names they did originally, so links will continue to work.
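    The core of what such a mirror has to do is map each URL's path onto the identical path on disk. A minimal, hypothetical sketch of that mapping for a single page (no link extraction) is below; the output directory name is an assumption.

    ```python
    import os
    import urllib.request
    from urllib.parse import urlparse

    def save_mirrored(url: str, output_root: str = "mirror") -> str:
        """Fetch one URL and save it under output_root, preserving its path."""
        path = urlparse(url).path
        if path.endswith("/") or path == "":
            path += "index.html"          # give directory URLs a file name
        local_path = os.path.join(output_root, path.lstrip("/"))
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        with urllib.request.urlopen(url) as resp:
            data = resp.read()
        with open(local_path, "wb") as fh:
            fh.write(data)
        return local_path

    # Example: /foo/bar/baz/mypage.html ends up at mirror/foo/bar/baz/mypage.html
    print(save_mirrored("http://www.oldserver.com/foo/bar/baz/mypage.html"))
    ```

    A full solution would also parse each fetched page for links and queue them, which is essentially what mirroring tools such as wget's mirror mode already do while keeping original file and folder names.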

    Read the article

  • New Static Website with Hosted DNS alternating 502, 503 and Page Does Not Exist Errors

    - by Dave
    This has become an increasingly frustrating ordeal. I'm mostly a web developer, so forgive me if I am using improper terminology here. I have a client who had purchased a domain at JustHost. We built him a website and have it on our own server space. Now, I'm mostly used to dealing with GoDaddy, and there it is simple enough to manage DNS records and point the A record to our server IP, where Apache on our end deals with the domains via name-based virtual hosts. But for some reason, in setting this up with JustHost, when attempting to go to the domain name I either get a 502 or 503 error or "webpage does not exist". Now, I know that the basic functionality of the webpage must be working because I can access the index etc. straight through my server's www data (i.e. [server-ip]/website_folder). I was on the phone with JustHost technical support for over three hours yesterday and the best I could get was "That's really weird..." I've checked my logs and there doesn't seem to be anything coming through to my end. Does anybody have an idea of what's going on here? I would love for it to be a problem on my end, because JustHost doesn't seem capable of helping further. Any help is greatly appreciated, thanks. I forgot to mention that we have several other sites up and running and completely accessible.

    Read the article

  • Client unable to access OWA website after temporarily changing SSL certificate on the server

    - by Lorenz Meyer
    I have the following issue: one client computer (Windows XP) cannot access the OWA website. All other client computers can (except another one in the same remote office). How this happened: I temporarily changed the SSL certificate on the Exchange server yesterday. After a few minutes I reverted back, and now the same certificate that was installed for years is in place again. During those few minutes, they were in OWA on this computer and got a certificate error. What exactly happens: Internet Explorer displays the error "Internet Explorer cannot display the webpage", Firefox displays "The connection was reset" and Chrome shows "This webpage is not available. The connection to ... was interrupted." What I have already done to try to get this working: restarted the client computer; restarted the Exchange server; deleted the Internet Explorer browsing history; in IE, under Internet Options, tab Content, Certificates, cleared the SSL cache; restored Internet Explorer to its default settings. I looked into certmgr.msc, but did not find a certificate related to the issue. What else could I do to narrow down the origin of this problem (or better: resolve it)? Can you give any advice?

    Read the article

  • Server 2003 R2 - IIS 6 - granting access to website via IP with subnet range

    - by John
    We are trying to allow a client to connect to our website. By default we deny all access except for the specified IPs we have configured, and until now every entry has been a single IP address. Now we must allow a range of IPs, and rather than input thousands of records we want to use the "group of computers" option on the Grant Access page. We have it configured with the network IP 72.21.192.0 and a subnet mask of 255.255.224.0, but they are unable to connect. Looking over our IIS logs, they are receiving a 302, which is the same behavior anyone who is unauthorized to view the page gets. The IP address coming in is 72.21.217.2, so it should be well within the range of acceptable IP addresses. I'm at a loss, as everything I look up tells me to do what we are already doing, so any insight would be appreciated. Especially because I'm a software guy, not a hardware guy. Thanks!
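    As a sanity check on the subnet math itself: 255.255.224.0 is a /19, and 72.21.217.2 does fall inside 72.21.192.0/19, so the range definition covers the client. A small sketch using Python's ipaddress module confirms it:

    ```python
    import ipaddress

    # The "group of computers" entry: network IP plus subnet mask (equivalent to /19).
    network = ipaddress.ip_network("72.21.192.0/255.255.224.0")
    client = ipaddress.ip_address("72.21.217.2")

    print(network)                                              # 72.21.192.0/19
    print(network.network_address, "-", network.broadcast_address)
    print(client in network)                                    # True: client is inside the range
    ```

    Since the math checks out, the problem is more likely in how the restriction entry is applied (site vs. directory level, proxy addresses in the logs, etc.) than in the range itself.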

    Read the article

  • Redirect from folder containing website

    - by Sam
    I have a website reached from this URL: http://www.mysite.com/cms/index.php, being served from this directory: public_html/cms/index.php. In public_html I have this .htaccess rule:
        RewriteRule (.*) cms/$1 [L]
    which lets me get to the site like this: http://www.mysite.com/index.php. But now, if I reference the 'old' address, I'd like to redirect to the rewritten address with a permanent redirect code. For example, http://www.mysite.com/cms/?q=node/1 should be redirected to http://www.mysite.com/?q=node/1. How can I make this happen? Edit: The .htaccess file supplied with Drupal (the CMS) also contains the following. I've tried enabling it, but it doesn't seem to have any effect:
        # Modify the RewriteBase if you are using Drupal in a subdirectory or in a
        # VirtualDocumentRoot and the rewrite rules are not working properly.
        # For example if your site is at http://example.com/drupal uncomment and
        # modify the following line:
        # RewriteBase /drupal
    Edit: Including more of my .htaccess file - seems relevant:
        # Block access to "hidden" directories whose names begin with a period.
        RewriteRule "(^|/)\." - [F]

        # Strip cms folder from url
        RewriteRule (.*) cms/$1

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !=/favicon.ico
        RewriteRule ^ index.php [L]

        # Rules to correctly serve gzip compressed CSS and JS files.
        # Requires both mod_rewrite and mod_headers to be enabled.
        <IfModule mod_headers.c>
            # Serve gzip compressed CSS files if they exist and the client accepts gzip.
            RewriteCond %{HTTP:Accept-encoding} gzip
            RewriteCond %{REQUEST_FILENAME}\.gz -s
            RewriteRule ^(.*)\.css $1\.css\.gz [QSA]

            # Serve gzip compressed JS files if they exist and the client accepts gzip.
            RewriteCond %{HTTP:Accept-encoding} gzip
            RewriteCond %{REQUEST_FILENAME}\.gz -s
            RewriteRule ^(.*)\.js $1\.js\.gz [QSA]

            # Serve correct content types, and prevent mod_deflate double gzip.
            RewriteRule \.css\.gz$ - [T=text/css,E=no-gzip:1]
            RewriteRule \.js\.gz$ - [T=text/javascript,E=no-gzip:1]

            <FilesMatch "(\.js\.gz|\.css\.gz)$">
                # Serve correct encoding type.
                Header append Content-Encoding gzip

                # Force proxies to cache gzipped & non-gzipped css/js files separately.
                Header append Vary Accept-Encoding
            </FilesMatch>

    Read the article

  • Strange mysql problem moving website from Ubuntu server to Mac server

    - by evan
    I'm moving a website (PHP/MySQL) from an Ubuntu server to an OS X 10.6 server. I've set up Apache to run PHP scripts and installed the newest version of MySQL on the Mac. I just copied all of the PHP files and dumped/imported all of the MySQL databases (including the mysql users database). When I visit the page being served on the Mac, the page is able to connect to the database, but not query it. Specifically, the function mysql_error() is returning this error message: NO SUCH FILE OR DIRECTORY. The reason it's strange is that I'm able to change the PHP connection strings for MySQL on the Ubuntu server so that they point to the Mac server, and the page works correctly (so it seems MySQL is correctly set up on the Mac and definitely contains all of the users and tables it should). Thinking it was something to do with file permissions on the Mac, I've changed all of the files to 755, and it hasn't helped. Any ideas? Thanks!! UPDATE: I've found this error, which I'm relatively certain is related to the problem, in /var/log/apache2/error_log: PHP Warning: mysql_query(): A link to the server could not be established

    Read the article

  • repeated entries in website log file

    - by Reza
    I am writing an ad hoc log analyser for my website log file. The following excerpt from the log file shows file1.pdf being downloaded twice. Looking carefully, the timestamp and IP address are exactly the same in both entries. How can it be possible to have two downloads at the same time by the same person? Should I count it as 2 in my programme or as 1? Any reply is appreciated.
        name_of_subdomain xxx.xxx.xx.xx - - [02/Apr/2012:09:13:31 +0100] "GET /file1.pdf HTTP/1.1" 206 3706 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; CMDTDF)"
        name_of_subdomain xxx.xxx.xx.xx - - [02/Apr/2012:09:13:31 +0100] "GET /file1.pdf HTTP/1.1" 206 425462 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; CMDTDF)"
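    Note that both entries carry status 206 (Partial Content) with different byte counts (3706 and 425462), so they are most likely two byte-range chunks of one download rather than two downloads. A sketch of how an analyser might group such entries follows; the regex is a simplified assumption about this log format, not a general Apache log parser.

    ```python
    import re
    from collections import defaultdict

    # Simplified pattern for these combined-log-style lines (an assumption,
    # adjust to the real format): vhost, ip, two dashes, timestamp, request, status, bytes.
    LINE_RE = re.compile(
        r'^(?P<vhost>\S+) (?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\d+|-)'
    )

    def count_downloads(lines):
        """Count downloads per (client IP, request), folding 206 chunks into one."""
        downloads = defaultdict(int)
        for line in lines:
            m = LINE_RE.match(line)
            if not m:
                continue
            key = (m["ip"], m["request"])
            if m["status"] == "206":
                # Byte-range chunk: count the (client, file) pair at most once.
                downloads[key] = max(downloads[key], 1)
            elif m["status"] == "200":
                downloads[key] += 1
        return downloads
    ```

    Fed the two lines above, this returns a count of 1 for that IP and file, which is probably the more meaningful number for a download report.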

    Read the article
