Search Results

Search found 7625 results on 305 pages for 'scraper sites'.


  • How to configure a static wildcard subdomain with dnsmasq.

    - by Prody
    I have a network behind a NAT with a few machines: a router (NAT, dnsmasq, forwarding, directly connected to the Internet), a server (which runs ssh, www and some other stuff), and clients (which do stuff on the server). I also have mydomain.com. server.mydomain.com points to my connection's single IP, which is the router, which forwards ports to the server. The server has an httpd running which serves different sites based on vhosts, so I have site1.server.mydomain.com, site2, and so on. The problem is that all the traffic goes through the router, and when I check the logs I always see the router's IP for everything (so it's hard to see who is running the script with the while(1)). I would just ServerAlias site1.server.local, but most of the sites have a root URL saved somewhere on top of which other URLs are built, so I can't do that. The solution for me would be telling dnsmasq somehow to answer *.mydomain.com with the server's LAN IP. Is this possible?
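
    A minimal sketch of what such a dnsmasq rule might look like, assuming the server's LAN address is 192.168.1.2 (an example value): the address= directive answers for a domain and every subdomain under it.

        # /etc/dnsmasq.conf on the router
        # Answer mydomain.com and *.mydomain.com with the server's LAN IP
        address=/mydomain.com/192.168.1.2

    LAN clients that use the router as their resolver would then reach site1.server.mydomain.com (and any other subdomain) at 192.168.1.2 directly, so the httpd logs show the real client addresses instead of the router's.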

    Read the article

  • OpenVPN: Single certificate authority, multiple VPNs

    - by darwish
    The company in which I work has a single site (I'll refer to it as "Site A"). There are several private networks within Site A. We have a running instance of OpenVPN which allows some employees to connect to one of the private networks in Site A. We're planning to extend our facilities to another site (which I'll refer to as "Site B") and we wish to connect both sites using OpenVPN. The VPN which will connect Site A to Site B will be a trunk link, meaning it will have access to all networks. If we use the same certificate authority for both VPN servers, this will allow the employees, who can only connect to one of the private networks within Site A, to connect to the site-to-site link as well, which would give them access to all networks. Of course this is undesirable. Using two different certificate authorities seems like the obvious solution, but it doesn't feel right. I wonder if there's a way to maintain permission control within a single certificate authority.
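
    One hedged sketch of per-server permission control with a single CA: OpenVPN's tls-verify hook can reject any certificate whose common name is not on that server's own whitelist, so the site-to-site server accepts only the gateway certificate even though every certificate chains to the same CA. The script path and CN below are made-up examples.

        # site-to-site server's OpenVPN config:
        #   tls-verify "/etc/openvpn/check-cn.sh"

        #!/bin/sh
        # /etc/openvpn/check-cn.sh -- OpenVPN passes: $1 = cert depth, $2 = X.509 subject
        [ "$1" -ne 0 ] && exit 0            # only check the leaf (client) certificate
        case "$2" in
            *CN=siteB-gateway*) exit 0 ;;   # accept the Site B gateway
            *)                  exit 1 ;;   # reject every other certificate signed by the CA
        esac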

    Read the article

  • Why aren't my old DLLs running with my app pool in 32-bit mode?

    - by brokkalen
    I am moving my websites from a Server 2003 x86 environment to Server 2008 x64. The 2008 server is running IIS 7.5, and the app pool I am using is configured for 32-bit mode. I get an error: 'Server object error 'ASP 0177 : 800401f3' Server.CreateObject failed.' I believe the problem is in the DLLs that all the ASP sites point to. My programmers, as usual, say it isn't the code or the DLLs. Am I missing something to make these old DLLs work? By the way, these sites connect to a SQL 2000 database.
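
    A hedged checklist in command form, assuming this is the usual "class not registered" case for classic ASP COM components (the pool name and DLL path are placeholders):

        rem Confirm the app pool really runs 32-bit worker processes
        %windir%\system32\inetsrv\appcmd set apppool "MyClassicAspPool" /enable32BitAppOnWin64:true

        rem Register the old COM DLL with the 32-bit regsvr32, not the 64-bit one
        %windir%\SysWOW64\regsvr32.exe C:\inetpub\com\LegacyComponent.dll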

    Read the article

  • Turn off / disable the performance cache

    - by jessie
    OK, I run a streaming website and my CMS is giving me an error when uploading videos: "Failed To Find Flength File". I did some research, and the answer I got from the coder is below. I did all of that, but the one thing I could not do is turn off what he refers to as the performance cache, covered at the end. I am on CentOS.

    Assuming the script is set up properly, you are probably dealing with some kind of write-caching. Some servers perform write-caching which prevents writing out the flength file or the entire CGITemp file during the upload. The flength file or the CGITemp file do not actually hit the disk until the upload is complete, making it worthless for reporting on progress during the upload. This may be fixed using a .htaccess file, assuming your host supports them. Here is a link to an excellent tutorial on using .htaccess files. I strongly recommend giving it a quick read before attempting to install your own .htaccess file.

    1. A mod_security module for Apache. To fix it, just create a file called .htaccess (that's a period followed by "htaccess") and put the following lines in that file. Upload the file into the directory where the Uber-Uploader CGI ".pl" scripts reside, or into some directory above it (like your server's DOCUMENT_ROOT, i.e. the top level of your webspace). htaccess files must be uploaded in ASCII mode, not BINARY. You may need to CHMOD the htaccess file to 644 (rw-r--r--).

        # Turn off mod_security filtering.
        SecFilterEngine Off
        # The below probably isn't needed,
        # but better safe than sorry.
        SecFilterScanPOST Off

    If the above method does not work, try putting the following lines into the file instead:

        SetEnvIfNoCase Content-Type \
            "^multipart/form-data;" "MODSEC_NOPOSTBUFFERING=Do not buffer file uploads"
        mod_gzip_on No

    2. "Performance Cache" enabled on OS X Server. If you're running OS X Server and the progress bar isn't working, it could be because of "performance caching." Apparently if ANY of your hosted sites are using performance caching, then by default all sites (domains) will attempt to. The fix then is to disable the performance cache on all hosted sites.

    Read the article

  • Can you have a staging and production slot in Azure Websites?

    - by Barry King
    I'm looking at hosting 3 websites (they will all use the same linked database resource, but I think I have to use 3 websites within Azure for this): www.website.com, provider.website.com and admin.website.com. Using Windows Azure Websites, can you have a Staging and a Production slot? I think this feature is only available for Azure Cloud Services, but there is little documentation on this. If it's not possible, is there another way, other than spinning up 3 more sites to act as the staging sites? I want the ability to "swap" from staging to production.
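
    For reference, on what has since become Azure App Service, deployment slots can be created and swapped from the command line. A hedged sketch using the modern az CLI (site and resource group names are placeholders):

        # Create a staging slot alongside the production site
        az webapp deployment slot create --name www-website --resource-group my-rg --slot staging

        # Swap staging into production once it has been verified
        az webapp deployment slot swap --name www-website --resource-group my-rg --slot staging --target-slot production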

    Read the article

  • Under what circumstances might an IIS6 website be automatically deleted?

    - by E. Anderson
    Late last week my colleagues did some hardware maintenance on one of our VMware ESXi servers. One of the guests is a Windows Server 2003 Web Edition system that runs our low-traffic web sites. We discovered this morning that one of those websites was no longer working, with what appeared to be an SSL error. After logging in, I found that the web site in question had been deleted from IIS! Is it possible for this to happen without a user actually going in and deleting that single web site? All of the other sites were fine, and the files for the site in question had not been touched. I just re-created the web site, assigned the SSL cert, and everything was working again. When I logged in, I did see the 'Unexpected Shutdown' dialog.
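
    A hedged diagnostic idea: IIS 6 keeps versioned copies of the metabase when metabase history is enabled (which it is by default), so comparing those copies can show when the site's entry actually disappeared:

        rem Older metabase snapshots live here on IIS 6
        dir %windir%\system32\inetsrv\History
        rem Compare an older MetaBase_*.xml against the current metabase.xml and look
        rem for the missing site's <IIsWebServer> entry.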

    Read the article

  • DNS changes take too long to show up on my PC; could it be a router misconfiguration?

    - by Borek
    I administer a few sites and need to update their DNS entries from time to time, e.g., adding an A record pointing a certain subdomain to a certain IP. When I check sites like http://www.opendns.com/support/cache/, I can clearly see the DNS change taking effect throughout the world - it seems it's just my PC that can't see the change (ping newsubdomain.example.org says it cannot resolve the host name). The network "map" is like this: My PC -> my router -> my ISP's router -> internet. On my PC, the DNS is set automatically, which means that if I run ipconfig /all, my router is returned as the DNS server (192.168.1.1). On my router, the DNS is set to what my ISP provided. Is this correct? What can I do to see new hostnames resolved more quickly?
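
    A hedged way to narrow down which cache is stale (the hostname is the question's placeholder): flush the local resolver cache, then query the router's DNS and a public resolver directly and compare the answers.

        :: Flush the Windows DNS client cache
        ipconfig /flushdns

        :: Ask the default resolver (the router) and then a public resolver directly
        nslookup newsubdomain.example.org
        nslookup newsubdomain.example.org 8.8.8.8

    If only the first nslookup fails, the router's forwarding cache is the one lagging behind, and pointing the PC's DNS at the ISP's servers or a public resolver should make new names show up sooner.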

    Read the article

  • Advice for an EC2 Architecture and Deployment Strategy

    - by Mark
    My company is currently migrating several websites and PHP web applications (standard LAMP stack) from three in-house servers to Amazon EC2. Because we had only three servers, we clustered several low-traffic websites with perhaps one high-traffic web application, and served them from the same server. The server admin has pretty much copied the previous architecture wholesale onto the EC2 instances, simply upping the instance size to account for the highest traffic client that occupies that particular instance. This architecture might be okay if it wasn't for deployment. Any time one of these sites/apps changes, it means redeploying the entire instance, along with the 30 sites/apps it hosts, instead of just updating one. How can we architect our cloud in a more modular fashion? Should each app get its own appropriately-sized instance? What is the best strategy for deployment in this type of situation?

    Read the article

  • Building intranet search

    - by gmkv
    At work, we have lots of information squirreled away in many different sites -- wikis, product docs, ticketing system, etc -- many of which require authentication. I'm very interested in having a single way to search all our various silos, and in my spare time have looked at Nutch, Grub, Django + Haystack, etc. None of these is a complete solution a la Google Mini or Google Search Appliance. Has anybody built a basic intranet search engine out of a mixture of these tools? Would you have recommendations about how to go about it? I like Django, and Haystack seems to be a mildly popular search solution for it, but I'd need to wire up a crawler that can support crawling authenticated sites to it.

    Read the article

  • Web Content Filtering for Windows Clients

    - by djoyce
    I'm working with a small business to solve a bunch of problems. One is that their Windows 7 POS registers need to have web access restricted to only three remote support sites, while the back office machine needs an unfiltered connection. I'd like something I can install and configure on the few registers to block all but those few sites. In a perfect world this would restrict the normal register user, but the admin user would not be filtered. Free is best, if it works, but a small fee would be alright too. Microsoft's Family Safety filter is close, but requires a Windows Live account, which isn't ideal, though it may be alright. Has anyone used this in a small business environment? I'd prefer something easily managed at the local machines. K9 Web Protection is interesting and I'm going to look into it more. Are there other options? It seems like someone would have made something simple like this as an open source project, but maybe not.
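
    If no dedicated filtering product fits, one hedged fallback sketch is the built-in Windows Firewall with a default-deny outbound policy and explicit allow rules for the support sites (the addresses are placeholders; this filters by IP rather than URL, blocks every other outbound program on the register unless it also gets an allow rule, and doesn't distinguish the admin user):

        rem Default-deny all outbound traffic on every profile
        netsh advfirewall set allprofiles firewallpolicy blockinbound,blockoutbound

        rem Allow DNS plus web traffic to the three support sites only
        netsh advfirewall firewall add rule name="Allow DNS" dir=out action=allow protocol=UDP remoteport=53
        netsh advfirewall firewall add rule name="Allow support sites" dir=out action=allow protocol=TCP remoteport=80,443 remoteip=203.0.113.10,203.0.113.20,203.0.113.30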

    Read the article

  • Very frequent "Server not found" messages

    - by Village
    Recently, while browsing and clicking or typing an address, I get "Server not found", "Connection was reset", or a half-loaded page. Usually the page loads instantly after clicking reload, but this means I must reload on every page. Images on one site but stored at a different server don't load until I reload a second time. Clicking "submit" on many sites frequently doesn't work unless I reload many times. Sometimes sites load, but without colors and formatting, appearing as they would in Lynx. This seems to happen with every web site. My Internet service provider claims everything on their end is fine. This happened a day after running an update in aptitude. I have not updated any hardware. I have tried clearing Iceweasel's cache. I do not have any router or other equipment. What could be going on? How can I troubleshoot this? PPPoE connection, Iceweasel 3.5.16, Debian 6.
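
    A hedged first pass at narrowing this down on the Debian box, since intermittent resets and half-loaded pages on PPPoE are often DNS- or MTU-related (example.com stands in for any site that misbehaves):

        # Is it name resolution? Compare the configured resolver with a public one.
        dig example.com
        dig example.com @8.8.8.8

        # Is it MTU? PPPoE usually caps the MTU at 1492; 1464 data bytes + 28 bytes of headers probes that limit.
        ping -c 4 -M do -s 1464 example.com
        ip link show dev ppp0 | grep mtu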

    Read the article

  • Windows 7 / Internet Explorer 8

    - by Rene
    I am a shop owner at zazzle.com. About six weeks ago, when my computer was running Windows XP/IE7, my sites, as well as Zazzle's homepages, went out on me: I can only see part of each page. Since that time, I have a new computer running Windows 7/IE8, thinking that would solve the issue. It did not. Zazzle's emails told me to download Firefox and/or Internet Explorer 7. I tried Firefox and got a different problem at the Zazzle site: I was getting only the 'view source' pages on Zazzle's homepages and my own shop sites as well. Question: can I download IE7 onto my IE8 computer? Can this be done without loading that compilation of Internet Explorer 1 through 8? What do you think is the best solution to this problem?

    Read the article

  • Mozilla nonsense. Page changes size by itself

    - by Browser Madness
    I have never intentionally changed the font size in the latest Mozilla Firefox install on my Windows machine. For example, the Google site is now at 200% zoom, and I did nothing to make this happen. What's worse is that it does not change back but remembers this! Similarly, other sites are too small, and it remembers this per site. What is going on here? I mean, what nonsense! How can I undo this? And for extra points, who came up with this absurd behavior at Mozilla? Not making this up, folks. Firefox 15.0.1. It's not at all clear why it changes size or how to go back to the default size for these sites. Actually, it just happened again while editing this entry: the icon changes and then the font size is too small.
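
    For what it's worth, a hedged pointer: the per-site memory comes from Firefox's site-specific zoom, which can be reset per page and switched off entirely.

        Ctrl+0                                               reset zoom on the current page (Ctrl+- and Ctrl+= change it)
        about:config -> browser.zoom.siteSpecific = false    stop remembering zoom per site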

    Read the article

  • IPv6 feature in Network Adaptor is Slowing Internet

    - by Teknophilia
    The past few days, my internet browsing has become very poor. It's not a matter of speed, as a speed test gives at least 15 Mbps; it seems as if my laptop has a hard time actually connecting to sites. I've found a possible culprit, but don't know why it would affect anything: going to the adapter settings and disabling IPv6, but leaving IPv4, brings my browsing back to normal. Re-enabling IPv6 brings back the issue. This is strange, though, because I have always had IPv6 enabled. Moreover, using sites that test IPv6 compatibility, I fail with IPv6 enabled on my adapter and pass when it's disabled. Any ideas about why this is happening, and how to fix it?
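
    A hedged middle ground, rather than unticking IPv6 on the adapter: Windows can be told to prefer IPv4 over IPv6 through the documented DisabledComponents registry value (0x20 = prefer IPv4 in prefix policies; a reboot is needed, and this is a sketch rather than a recommendation):

        reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip6\Parameters" /v DisabledComponents /t REG_DWORD /d 0x20 /f

    That keeps IPv6 available while avoiding the long timeouts that can occur when the laptop keeps trying a broken IPv6 path first.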

    Read the article

  • Should I be using www. when setting up virtual hosts on apache?

    - by MAZUMA
    Does it matter whether or not I include the www subdomain when naming new virtual host files on Apache? So, is this: /etc/apache2/sites-available/www.example.com better than this: /etc/apache2/sites-available/example.com? I assume I'd need to a2ensite either www.example.com or example.com, depending on which name is used. This might be a fairly basic question, but I have no one else to ask and I want to do it right.
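
    A hedged sketch of the usual pattern: the filename under sites-available is only a label that a2ensite symlinks, while the names Apache actually matches come from ServerName and ServerAlias inside the vhost, so one file can serve both forms (paths and domain are examples):

        # /etc/apache2/sites-available/example.com
        <VirtualHost *:80>
            ServerName example.com
            ServerAlias www.example.com
            DocumentRoot /var/www/example.com
        </VirtualHost>

        # then: a2ensite example.com && service apache2 reload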

    Read the article

  • How can I restore stored passwords in Firefox 15.0.1 that were deleted by mistake?

    - by Bob Legringe
    I accidentally deleted my stored passwords using the "Wise disc cleaner 7" program. As I saw on another thread, the passwords are stored in two files: signons.sqlite and the encryption key file key3.db. When I open signons.sqlite in a text editor, I can see that the web addresses of the sites belonging to the passwords are still there. They have not been deleted by the "Wise disc cleaner 7" program, and adding a stored password in Firefox just modifies the file. However, Firefox will not display my old stored passwords or their respective sites. Is there any way to "undelete" the passwords?
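
    A hedged way to check whether the login rows themselves survived, as opposed to leftover strings in unused pages of the SQLite file: open a copy of signons.sqlite with the sqlite3 shell and look at the moz_logins table, which is where Firefox of that era kept the encrypted entries.

        sqlite3 /path/to/profile/signons.sqlite "SELECT hostname FROM moz_logins;"

    If that query returns no rows, the entries really were deleted, and only a backup copy of signons.sqlite (restored alongside the matching key3.db) would bring the passwords back.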

    Read the article

  • Personal Browsing Monitor Software [closed]

    - by jmadden93
    Does anyone know of any personal browsing monitor software? I'd like to be able to monitor my own browsing habits and the time I spend on entertainment vs. work vs. educational sites - something that offers more than simply looking at the history feature built into browsers. It would be nice if it gave you a breakdown of how much time you spend on certain categories of sites, like social media vs. video vs. news vs. productivity, etc. I think it would be useful to know how one spends one's time.

    Read the article

  • Why am I having trouble viewing HTTPS websites only in Chrome and only on my employer's network?

    - by user1742777
    I'm using Google Chrome on the new MacBook Pro my employer provided. Many of the HTTPS sites I visit do not work when I visit them using Google Chrome while I am connected to my employer's network - for example, www.facebook.com. These same sites work perfectly fine if I use a different browser (like Safari), or even with Chrome when my MacBook is connected to my home WiFi network. Chrome reports the error: "The certificate was signed by an unknown authority". See attached screenshots. How can I resolve this problem? I really want to use Chrome, but not having access to numerous important work and outside websites is unacceptable.
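
    A hedged way to confirm the usual suspect, an SSL-inspecting proxy on the employer's network: look at who actually signs the certificate Chrome receives, and if it is a corporate root, import that root into the System keychain so it is trusted (the .cer file name is a placeholder):

        # Show the issuer of the certificate as seen from the work network
        openssl s_client -connect www.facebook.com:443 -showcerts </dev/null | openssl x509 -noout -issuer

        # If the issuer is the company's own CA, trust its root certificate system-wide
        sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain corporate-root-ca.cer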

    Read the article

  • How can I set up clean URLs (enable rewrite) if I don't have a domain?

    - by Patrick
    In order to enable clean URLs in Drupal, I add the lines below to the lighttpd configuration file. However, I'm now working on a local server and I don't have a domain available, so I need to work with an address like http://local.ip/Sites/mywebsite. I've tried replacing ["host"] with ["socket"] and replacing the domain with the IP and subfolders (see the address above), but without success. How can I set up the configuration file for clean URLs even though I don't have a domain? Thanks.

        $HTTP["host"] =~ "(^|\.)mywebsite\.com" {
            server.document-root = "/var/www/sites/mywebsite"
            server.errorlog = "/var/log/lighttpd/mywebsite/error.log"
            server.name = "mywebsite.com"
            accesslog.filename = "/var/log/lighttpd/mywebsite/access.log"
            include_shell "./drupal-lua-conf.sh mywebsite.com"
            url.access-deny += ( "~", ".inc", ".engine", ".install", ".info", ".module", ".sh", "sql", ".theme", ".tpl.php", ".xtmpl", "Entries", "Repository", "Root" )
            # "Fix" for Drupal SA-2006-006, requires lighttpd 1.4.13 or above
            # Only serve .php files of the drupal base directory
            $HTTP["url"] =~ "^/.*/.*\.php$" {
                fastcgi.server = ()
                url.access-deny = ("")
            }
            magnet.attract-physical-path-to = ("/etc/lighttpd/drupal-lua-scripts/p-.lua")
        }
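
    A hedged sketch of one workaround: the $HTTP["host"] conditional does not have to match a registered domain; it matches whatever appears in the Host header, so the block can key on the LAN IP instead (192.168.1.10 is a placeholder for local.ip, and the Drupal site's base_url/subdirectory settings would still need to agree with /Sites/mywebsite):

        $HTTP["host"] == "192.168.1.10" {
            server.document-root = "/var/www/sites/mywebsite"
            include_shell "./drupal-lua-conf.sh mywebsite.com"
            magnet.attract-physical-path-to = ("/etc/lighttpd/drupal-lua-scripts/p-.lua")
        }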

    Read the article

  • Remote additional domain controllers

    - by user125248
    Is it possible to set up several additional domain controllers at remote locations, connected via medium-bandwidth DSL (2-10 Mbit) WAN connections, for a single domain (intranet.example.com)? And would it be a good idea? We have five sites and would like extremely high availability if any of the sites were to lose its Internet connection. However, each site is very small, and all are within a fairly small geographical area in the same region, so it would seem strange to have a PDC for each of the sites. If it were possible to have an additional domain controller for each site, would the clients use it, or would they just use the PDC whenever it's available to them?
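
    A hedged sketch of the piece that makes clients prefer their local DC: define an Active Directory site and subnet per physical location, and clients in that subnet will authenticate against the domain controller in their own AD site, falling back to the others only when it is unreachable. The cmdlets below are from the newer ActiveDirectory PowerShell module, and the names and subnet are placeholders; the same can be done in the Sites and Services MMC.

        Import-Module ActiveDirectory

        # One AD site per branch, with its client subnet attached
        New-ADReplicationSite -Name "Branch-01"
        New-ADReplicationSubnet -Name "10.1.1.0/24" -Site "Branch-01"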

    Read the article

  • Why are my favorite websites becoming slower, over months?

    - by Wolfpack'08
    I spend a lot of my time at sites for watching online videos: YouTube, gorillavid, thedailyshow.com, etc. I used to watch the videos in full-screen mode, and then that became very laggy. So I started watching them with full-browser zooming. Then that became laggy. Recently, I've had to actually zoom out; otherwise, the video will lag so much that my PC locks up. Could this be a symptom of my processor, RAM, or motherboard going bad? Or does it perhaps have something to do with software like Chrome, or the players the sites use, being updated?

    Read the article

  • How to block a website completely?

    - by user37076
    I want to block some sites (e.g. YouTube and news sites), because I have a problem with procrastination and I find these websites hurt my productivity a lot. I used to block them by adding them to the HOSTS file. However, every time I want to take a break, I end up opening the hosts file and commenting out my block entries again. Is there any way I can block the websites so that I cannot unblock them, or at least make it a little bit hard to do (e.g. so that I have to reboot my PC)? I have no access to the router or any firewall besides the ones on my computer. I just want to FORCE myself to work without any chance to procrastinate.
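
    A hedged sketch of one way to add friction, assuming Windows: keep the HOSTS entries, then deny your own account write access to the file so lifting the block takes a deliberate elevated command. An administrator can always undo this, so it is friction rather than enforcement (adjust the account name as needed).

        rem %SystemRoot%\System32\drivers\etc\hosts contains lines such as:
        rem   127.0.0.1  youtube.com
        rem   127.0.0.1  www.youtube.com

        rem Deny the current account write access to the hosts file
        icacls %SystemRoot%\System32\drivers\etc\hosts /deny %USERNAME%:(W)

        rem To undo later, from an elevated prompt:
        rem   icacls %SystemRoot%\System32\drivers\etc\hosts /remove:d %USERNAME%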

    Read the article

  • Windows Server 2003 seems to pick the 'outgoing' IP address at random from all the ones configured in IIS, how can I make it just use one?

    - by ioSamurai
    We have multiple sites in IIS with different IP addresses. This is fine - we want different IPs to all go to this server and serve the proper site. However, I discovered an issue: when the server makes an outgoing connection, I cannot predict which IP it will use. I had to have one client add ALL the IPs to their firewall so that a certain service could communicate with their server. Well, now the time has come to add another IP/site to IIS, but I had told them they would not need to add any more IPs. So the question is: how can I make Windows Server 2003 use only ONE specific IP for outgoing calls instead of it being unpredictable? If this is not a good enough description: when I was RDPed into the server, opened IE and went to a 'what is my IP' site, the answer was sometimes different, which is how I discovered why the one client's firewall was suddenly refusing the connections. How can I make outgoing calls originate from a single static IP yet still allow multiple IPs pointing to different sites in IIS?
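
    For context, a hedged note: later Windows versions (Server 2008 R2 and up, not Server 2003) expose a documented way to keep extra addresses out of source-address selection, by adding them with the skipassource flag, e.g.:

        netsh interface ipv4 add address "Local Area Connection" 192.0.2.15 255.255.255.0 skipassource=true

    On Server 2003 there is no equivalent switch, so the practical options are pinning the outbound service's own bind address in its configuration, or placing the extra IPs behind a NAT device that rewrites the source.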

    Read the article

  • Apache mod_rewrite not working properly on Mac OS X 10.6 (Snow Leopard)

    - by DashRantic
    Hello all, I'm trying to create a PHP website with clean URLs using Apache's mod_rewrite and a .htaccess file. mod_rewrite seems to be working; however, it claims it cannot find files on my server that do exist. Just as a basic test, this is what my .htaccess file looks like at the moment - going to [mysite]/page should rewrite to the index.php file:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteRule ^page$ index.php

    As far as I know, I have set up the .conf file appropriately as well:

        <Directory "/Users/myuser/Sites/">
            Options Indexes MultiViews
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    However, when I try accessing the URL set up via mod_rewrite (localhost/~myuser/mysite/page), I get this:

        Not Found
        The requested URL /Users/myuser/Sites/mysite/index.php was not found on this server.

    However, that file does exist, and that is the proper location! The site works fine otherwise; if I go to localhost/~myuser/mysite/index.php, everything works fine - minus any sort of clean URLs, of course. Has anyone seen this before, or have any ideas as to what I'm doing wrong?
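
    A hedged guess at the missing piece: in per-directory .htaccess context under a userdir, mod_rewrite often needs RewriteBase so the rewritten target is treated as a URL under /~myuser/mysite/ rather than resolved as a filesystem path. For example:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteBase /~myuser/mysite/
        RewriteRule ^page$ index.php [L]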

    Read the article

  • Open Source or Low Cost Layer 7 ("Content") Switch?

    - by Rob
    I have several web servers that host a number of different applications and web sites. I want to make it easy to host apps or parts of web sites on different servers (e.g. example.com/foo might be on one physical server and example.com/bar might be on another). We do this with Apache redirects right now, but that gets messy fast, and in any case we have other problems we want to solve, such as throttling requests from individual clients and reducing dependency on specific physical hosts. Is there an open source or low-cost layer 7 switch that would be suitable for this sort of task? I was hoping to find something like a stripped-down Linux VMware guest/appliance built for this purpose, but haven't seen anything suitable out there so far.
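
    For illustration, a hedged sketch of the kind of path-based routing an open-source proxy such as HAProxy can do (backend addresses are placeholders); nginx with location blocks and proxy_pass would be a comparable option, and both can also rate-limit individual clients:

        frontend www
            bind *:80
            acl is_foo path_beg /foo
            acl is_bar path_beg /bar
            use_backend foo_servers if is_foo
            use_backend bar_servers if is_bar
            default_backend main_servers

        backend foo_servers
            server foo1 10.0.0.11:80 check

        backend bar_servers
            server bar1 10.0.0.12:80 check

        backend main_servers
            server web1 10.0.0.10:80 check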

    Read the article
