Search Results

Search found 7625 results on 305 pages for 'scraper sites'.


  • Apply SharePoint template to existing site?

    - by johnnyb10
    I have several similar SharePoint sites (running on WSS 3) and I have saved one of the sites as a template. I now want to make a different site (which already exists) have the same structure as this site--the same lists, document libraries, views, etc. I know I can delete the existing site and then recreate it based on this template, but is there a way to apply this template to my existing site, so that it gets rid of its existing lists, etc., and replaces them with the ones from the template? I don't have any content in the site, and I don't want to keep any of the existing structures, so I don't care if anything gets swept away. I may need to do this with a bunch of sites in the future, so being able to apply the template rather than recreating from scratch might be very helpful.
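
    If an in-place apply turns out not to be available in WSS 3, scripting the delete-and-recreate route the asker already mentions keeps it cheap across many sites. A heavily hedged stsadm sketch, assuming the saved .stp has been uploaded to the central template gallery and nothing in the existing site needs to survive (URLs and the template title are placeholders; whether -sitetemplate accepts a custom template's title varies by build, so verify on a test web first):

        rem Make the saved template available server-wide
        stsadm -o addtemplate -filename StandardSite.stp -title "StandardSite"
        rem Remove the existing web, then recreate it from the template
        stsadm -o deleteweb -url http://server/sites/dept/existingsite
        stsadm -o createweb -url http://server/sites/dept/existingsite -sitetemplate "StandardSite"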

  • Perl throwing 403 errors!

    - by Jamie
    When I first installed Perl in my WAMP setup, it worked fine. Then, after installing ASP.net, it began throwing 403 errors. Here's my ASP.net config:

        # Load asp.net module
        LoadModule aspdotnet_module "modules/mod_aspdotnet.so"

        # Set asp.net extensions
        AddHandler asp.net asp asax ascx ashx asmx aspx axd config cs csproj licx rem resources resx soap vb vbproj vsdisco webinfo

        # Mount application
        AspNetMount /asp "c:/users/jam/sites/asp"

        # ASP directory alias
        Alias /asp "c:/users/jam/sites/asp"

        # Directory setup
        <Directory "c:/users/jam/sites/asp">
            # Options
            Options Indexes FollowSymLinks Includes +ExecCGI
            # Permissions
            Order allow,deny
            Allow from all
            # Default pages
            DirectoryIndex index.aspx index.htm
        </Directory>

        # aspnet_client files
        AliasMatch /aspnet_client/system_web/(\d+)_(\d+)_(\d+)_(\d+)/(.*) "C:/Windows/Microsoft.NET/Framework/v$1.$2.$3/ASP.NETClientFiles/$4"

        # Allow ASP.net scripts to be executed in the temp folder
        <Directory "C:/Windows/Microsoft.NET/Framework/v*/ASP.NETClientFiles">
            Options FollowSymLinks
            Order allow,deny
            Allow from all
        </Directory>

    Also, what are the code tags for this site?
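
    One common cause of a 403 in this situation is the Perl/CGI directory losing its execute permissions when the new handler and directory blocks were added. A minimal sketch of what the CGI section should still look like (the cgi-bin path is a placeholder, not taken from the question):

        # Perl scripts need ExecCGI and a cgi-script handler to keep working
        ScriptAlias /cgi-bin/ "c:/wamp/bin/apache/cgi-bin/"
        <Directory "c:/wamp/bin/apache/cgi-bin/">
            Options +ExecCGI
            AddHandler cgi-script .pl .cgi
            Order allow,deny
            Allow from all
        </Directory>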

  • Google Chrome Automatic Page Load

    - by WDC
    Google Chrome keeps preloading the next page of certain websites. That's helpful on sites like Netflix: instead of clicking a link for page 2, I can just keep scrolling and the next page is already loaded. But on online clothing sites it gets backed up loading all the following pages, misplacing links and constantly replacing the page I'm actually trying to view. Since clothing sites often have anywhere from 20 to 200 pages, this is really annoying, because Chrome tries to load all of them. How do I turn off the automatic page load?

  • Can't access some websites with any browser

    - by Charles Kingsmill
    I'm running Windows 7 64-bit on a new Samsung laptop, connected to the internet via an ethernet cable to my university's ISP. Some sites work fine (e.g. google.com), but I can't access others at all (microsoft.com, topshop.com). I can't connect to those sites in safe mode with networking, and ping and tracert both fail. There's no proxy. Other users can connect successfully to these sites using my cable and socket. I've tried all the following with no success:

      - using various browsers (IE9, FF, Chrome)
      - creating a new user
      - updating drivers
      - clearing the DNS cache
      - using OpenDNS and Google's DNS
      - turning off Avast
      - tweaking the MTU
      - running the MS malicious software removal tool
      - running Spybot S&D
      - reviewing the hosts file
      - disabling the IPv6 options
      - repairing / resetting winsock settings
      - disabling advanced javascript options

    I have run out of ideas... can anyone see anything I've missed?

  • Single sign-on for SharePoint to MySite?

    - by Chris W
    I've got a fairly simple SharePoint 2010 farm set up: 2 WFE servers with Network Load Balancing hosting the main portal site. As per Microsoft's best practice recommendations, I've set up My Sites in a separate web application. As some of the user base are not using domain-joined PCs, they have to log in once for the portal (http://portal) and then again when they access My Sites, since they're crossing into a separate web application on a separate host (http://mysite). Portal and MySite are both hosted on the same physical WFE servers. Is there an easy way to stop this happening and have them log in just once? I understand there are plans for us to deploy ISA in the not too distant future - could we use ISA to manage authentication to the two sites so that the users only need to log in once?

  • Printing to shared printers across VPN

    - by CYMR0
    I have a program that prints labels at five remote sites. Two sites aren't working, but the rest are, with an identical (as far as I can tell) setup. Using Wireshark, I have determined that the handshaking all goes well, but after the "Open Print File Response" the packet sent from the server doesn't reach the client. I'm a bit at a loss as to where to go from here. I know the port the packet was sent on (445) isn't being blocked; the RST packet gets sent on the same port and arrives fine. It's also odd that three of the five sites work fine. This has been up and running for years without issue; all that has changed is our connectivity (from DSL to bonded DSL). But this traffic goes over a VPN, so it can't be the ISP interfering either, can it? I'm totally stuck, and any help would be much appreciated. Thanks!
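
    Since the only change was the connectivity (DSL to bonded DSL), a path-MTU problem on the VPN is worth ruling out: small packets like the handshake and the RST get through, while the large data packet following "Open Print File Response" would be silently dropped. A quick test from a client at a failing site (the hostname is a placeholder):

        rem Send a don't-fragment ping across the VPN; if 1400 bytes fails,
        rem shrink the size until it succeeds to find the usable MTU.
        ping -f -l 1400 printserver.example.local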

  • Group Policy for IE Security Zones

    - by Doug Luxem
    We are currently using the following Group Policy to control the Internet Explorer security zones:

        User Configuration\Administrative Templates\Windows Components\Internet Explorer\Internet Control Panel\Security Page

    We then populate the "Site to Zone Assignment List" with values from the following chart:

        Value   Setting
        -----   ---------------------
        0       My Computer
        1       Local Intranet Zone
        2       Trusted Sites Zone
        3       Internet Zone
        4       Restricted Sites Zone

    This works well; however, users are then unable to edit (or especially add) to their zone settings. Is there a way to lock in our custom zone settings while still giving users the ability to add their own sites to the security zones? Yes, I do realize the slight security risk in opening this up.
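
    For reference, that policy writes its mappings under the ZoneMap registry key. A hand-written .reg sketch that adds one site to the Trusted Sites zone (the domain is a placeholder) looks like this; per-user entries of exactly this shape are what the policy locks out:

        Windows Registry Editor Version 5.00

        ; Map https://www.example.com into zone 2 (Trusted Sites)
        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\example.com\www]
        "https"=dword:00000002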

  • Two Routers, Two Internet (1 Open, 1 PPTP) - Routing?

    - by SomeUser
    Hi there, I'm trying to set up two routers: one to route specific sites over an always-on PPTP VPN connection, the other for open internet access with a firewall. The first router is connected to the internet and has a built-in firewall. The second router is connected to a PPTP VPN. I was going to run a wire between the routers and would like some insight on how to get the two groups of systems (connected to each router) to talk to each other automatically. Even better would be to use one gateway for certain sites and another for general internet traffic. The alternative is to route everything to the internet by default and send specific sites to the VPN gateway, or vice versa... Any insight so I can get a better grasp of this? Thanks!
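
    One way to sketch the routing, assuming the open-internet router is everyone's default gateway and the VPN router sits on the same LAN (all addresses below are placeholders): leave the default route alone and add persistent static routes for the VPN-only destinations that point at the VPN router.

        rem Default gateway (open internet) is 192.168.1.1; VPN router is 192.168.1.2.
        rem Send one specific destination network via the VPN router instead:
        route -p add 203.0.113.0 mask 255.255.255.0 192.168.1.2

    If the routers' firmware supports static routes, configuring the same route on the router itself saves touching every machine.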

  • Configuring a Unified Communications Certificate for many virtual hosts running in Jetty

    - by rrc7cz
    I have a single IP with Jetty serving up X sites on port 80. Basically you can sign up for our service, then point your domain www.mycompany.com to that IP, and Jetty will serve up your custom site. I would like to add SSL support for all sites. To simplify things, I've looked at getting a single Unified Communications Certificate to plug into Jetty and have it work for all sites. Is this possible? Has anyone done this before? Does Jetty only support traditional, single-domain certs? What issues might I run into compared to a single-domain cert?
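
    Assuming the UCC (multi-SAN) certificate lists every customer domain in its subjectAltName field, Jetty should treat it like any other single keystore; the work is getting the issued cert and key into that keystore. A hedged sketch (file names and the alias are placeholders):

        # Bundle the issued cert, its key and the CA chain into PKCS12
        openssl pkcs12 -export -in ucc-cert.pem -inkey server-key.pem \
            -certfile ca-chain.pem -name jetty -out jetty.p12
        # Convert to the JKS keystore referenced by Jetty's SSL connector
        keytool -importkeystore -srckeystore jetty.p12 -srcstoretype PKCS12 \
            -destkeystore keystore.jks -deststoretype JKS

    The main operational catch with this approach: adding a new customer domain later means reissuing the certificate with an extended SAN list.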

  • Mac internet problems

    - by Bradley Herman
    Our office is set up with mostly Macs (7 of them), but we also have a Windows laptop and a Windows desktop on the network. The network is configured with a modem feeding a switch/router, which runs throughout the office to the computers, along with a wireless router. Everything runs fine most of the time, but periodically certain sites stop loading and time out repeatedly. This usually lasts 20 minutes or so and can be incredibly annoying. Resetting the modem/router and/or rebooting the computer never helps. The weirdest part is that in almost every case, the websites are fine on our Windows machines. I frequently use GitHub, Google, Stack Overflow, and the jQuery reference, and I can count on those sites being unavailable to me at least once a day. While I can't get them to load, I can spin my chair around to the Windows machine behind me and load the sites just fine. Any idea what the hell could be going on here?
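
    When the Macs and the Windows boxes disagree about the same sites at the same moment, comparing what each side resolves to is a cheap first check. A sketch from a Mac terminal during an outage (8.8.8.8 is just a well-known public resolver):

        # What does this Mac currently resolve github.com to?
        dig github.com
        # Does a public resolver agree?
        dig github.com @8.8.8.8
        # Flush the Mac's DNS cache (10.5/10.6-era syntax)
        dscacheutil -flushcache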

  • IIS site hacked with ww.robint.us malware

    - by sucuri
    A bunch of IIS sites got hacked with JavaScript malware pointing to ww.robint.us/u.js. Google's cache suggests more than 1,000,000 pages were affected: http://www.google.com/#hl=en&source=hp&q=http%3A%2F%2Fww.robint.us%2Fu.js http://blog.sucuri.net/2010/06/mass-infection-of-iisasp-sites-robint-us.html My question is: did anyone here get hacked by this and still have logs (or a network dump) available for analysis? If you do, have you spotted anything interesting in there? Sites as big as wsj.com got hacked, and some people are saying that a zero-day in IIS/ASP.NET may be in the wild...
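
    For anyone checking their own IIS boxes, a rough first pass is simply searching the W3C logs for the injected domain around the infection window (the path below is the IIS 7 default; older installs keep logs under system32\LogFiles):

        findstr /s /i "robint" C:\inetpub\logs\LogFiles\*.log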

  • Tomcat 5.5 - multiple contexts using same path

    - by ctn8iv
    Is it possible to set up multiple contexts using the same path? For example:

        <Context docBase="/www/websites/site1/java/base" path="/base" reloadable="true"/>
        <Context docBase="/www/websites/site2/java/base" path="/base" reloadable="true"/>

    I have two sites that use the same path, both running on the same server/IP. The sites use different virtual hosts and different ServerNames, but I have no control over the directory structure of the sites because they are maintained by a client. Until now, they have been content with only allowing one site to run at a time, but this is a major hassle, so I need to know if there's a workaround.
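
    Since the two sites already use different virtual hosts, one workaround consistent with Tomcat 5.5's model is to give each site its own <Host> element in server.xml and nest its Context inside it; the same path can then exist once per host. A sketch with placeholder hostnames and appBase directories:

        <!-- server.xml: one Host per site, each with its own /base context -->
        <Host name="site1.example.com" appBase="webapps-site1" autoDeploy="false">
          <Context docBase="/www/websites/site1/java/base" path="/base" reloadable="true"/>
        </Host>
        <Host name="site2.example.com" appBase="webapps-site2" autoDeploy="false">
          <Context docBase="/www/websites/site2/java/base" path="/base" reloadable="true"/>
        </Host>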

  • Apache configuration: choose a site to display according to visitor's IP address

    - by user64294
    Hi all. I would like to set up a special configuration on our Apache web server: serving different sites to users according to their IP addresses. We plan to upgrade our web sites, and during the upgrade we'll put up a maintenance site, so every user who connects to our web sites will get that site. But in order to test the upgrade, I need to configure Apache to let only my IP address reach the real site. If my IP address is a.b.c.d and I ask for test.com, I want to see it; all other users, having a different IP address, should get the maintenance site even if they ask for test.com. Is there a way to do this? Thank you.
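
    A hedged mod_rewrite sketch of that idea: every visitor except one address gets rewritten to the maintenance page (a.b.c.d stands in for the real testing IP, and the maintenance page name is a placeholder):

        RewriteEngine On
        # Let the tester's address through to the real site
        RewriteCond %{REMOTE_ADDR} !^a\.b\.c\.d$
        # Avoid a redirect loop on the maintenance page itself
        RewriteCond %{REQUEST_URI} !^/maintenance\.html$
        RewriteRule ^ /maintenance.html [R=302,L]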

  • Rewrite rule to redirect all subpages to a single page?

    - by user784637
    I have two files, /etc/apache2/sites-available/foo and /etc/apache2/sites-available/foo_maintenance. The rewrite rule I use in /etc/apache2/sites-available/foo is

        <Directory /var/www/public_html>
            Options +FollowSymlinks
            RewriteOptions inherit
            RewriteEngine on
            # RewriteCond %{HTTP_HOST} ^mysite\.com [NC]
            RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        </Directory>

    so that all mysite.com/* requests redirect to www.mysite.com. After I take my site down for maintenance, if a user navigates to a subpage of the site like mysite.com/subdir/something.php, I would like to redirect them to www.mysite.com so the index.html of the maintenance page is displayed. What is the rewrite rule to redirect all traffic from any subpage to www.mysite.com?
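
    A catch-all of roughly this shape in the maintenance vhost sends every request to the front page; a 302 rather than a 301 keeps browsers from caching the redirect after the site comes back (this is a sketch, not a tested config):

        RewriteEngine On
        # Serve the maintenance index itself without redirecting again
        RewriteCond %{REQUEST_URI} !^/index\.html$
        # Everything else, including mysite.com/subdir/something.php, goes home
        RewriteRule ^ http://www.mysite.com/ [R=302,L]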

  • /etc/hosts.deny ignored in Ubuntu 14.04

    - by Matt
    I have Apache2 running on Ubuntu 14.04 LTS. To begin securing network access to the machine, I want to start by blocking everything, then add specific allow statements so particular subnets can browse to sites hosted in Apache. Ubuntu Server was installed with no packages selected during install; the only packages added afterwards were installed with apt-get: apache2, php5 (with additional php5 modules), openssh-server, and mysql-client. My settings are as follows:

        /etc/hosts.deny:
            ALL: ALL

        /etc/hosts.allow has no allow entries at all.

    I would expect all network protocols to be denied. The symptom is that I can still browse to sites hosted on the Apache web server, even though there is a deny-all statement in /etc/hosts.deny. The system was rebooted after the deny entry was added. Why would /etc/hosts.deny with ALL:ALL be ignored, allowing HTTP browsing to sites hosted on the Apache web server?
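
    The likely explanation: /etc/hosts.deny is read by TCP wrappers (libwrap), and Apache is not linked against libwrap, so HTTP traffic never consults those files; openssh typically is linked against it, which is why the mechanism can appear to half-work. For port 80 a packet filter is the right tool. A minimal iptables sketch (the allowed subnet is a placeholder):

        # Let one subnet reach Apache, drop everyone else on port 80
        iptables -A INPUT -p tcp --dport 80 -s 192.0.2.0/24 -j ACCEPT
        iptables -A INPUT -p tcp --dport 80 -j DROP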

  • HTTP 401 error in Windows Authentication disappears after swapping Providers

    - by Ray Cheng
    IIS 7 on Windows 2008 R2 is acting really weird. We deploy our web apps as web sites with appcmd.exe. After they are deployed, if I browse to them, I get HTTP 401 errors. The web sites have only Windows Authentication enabled, and the providers are Negotiate and NTLM, in that order. But if I swap the providers, the HTTP 401 error goes away. Even if I swap them back, the errors are still gone. So the order of the providers doesn't seem to matter; what matters is the swapping. It must trigger something. Even if we delete the web site and application pool and reinstall the web sites, the errors are still gone. So far we can't reproduce it reliably, since it happens randomly. Has anyone experienced this? How do I go about troubleshooting it?
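
    One thing worth capturing while reproducing this: dump the effective provider list before and after the swap, since the symptom suggests the applied configuration differs from what the deployment originally wrote. A hedged appcmd sketch (the site name is a placeholder):

        rem Show the effective Windows Authentication settings for the site
        appcmd.exe list config "Default Web Site" /section:system.webServer/security/authentication/windowsAuthentication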

  • How can I cache more than one web site on the same backend server with Varnish?

    - by Kerberos
    I have one IIS web server sitting behind Varnish. There are several web sites on IIS, all published on port 80 and distinguished by their host headers. Can I cache all the web sites with Varnish using code like the following?

        backend cachewebsite {
            .host = "192.168.0.1";
            .port = "80";
        }

        sub vcl_recv {
            if (req.http.host == "www.example1.com") {
                set req.backend = cachewebsite;
            }
            if (req.http.host == "www.example2.com") {
                set req.backend = cachewebsite;
            }
            if (req.http.host == "www.example3.com") {
                set req.backend = cachewebsite;
            }
        }

    I can't test this code; it's just a scenario. Thank you in advance for your help.

  • What is the difference between a plain Amazon ec2 instance and beanstalk?

    - by Alex Ford
    I am a solo developer, and the sites I'm deploying are very small, usually hobby sites, so I have a few questions about the Amazon services. Is there a reason for me to use Beanstalk, or should I just stick with a single EC2 instance? Should I use RDS for the database? I heard someone say that I could just install a database on my EC2 instance, making it cheaper. I'm trying to keep everything as cheap as possible. I need to point custom domains to my sites; pretty sure that means I have to deal with Elastic IPs. Do those work with Beanstalk, or only with individual EC2 instances? Thanks in advance!

  • Running "ipconfig /displaydns" in cmd prompt still displays results even after I run "ipconfig /flushdns"

    - by 400_THE_CAT
    Whenever I run "ipconfig /displaydns" I get a long list of sites, even right after running "ipconfig /flushdns". I thought the results should be empty after a flush. Is this normal behavior? I also noticed that after I run /flushdns and browse the internet for a couple of hours, my list of cached sites doesn't really change: google.com, for example, isn't on the list, but a bunch of sites I've never visited do show up in my DNS cache. Can someone explain this?

  • Apache not directing to correct VHost

    - by BANANENMANNFRAU
    I have set up the following virtual host:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerName mysite.com
            ServerAlias www.mysite.com
            DocumentRoot /var/www/homepage/public_html
            ErrorLog ${APACHE_LOG_DIR}/error.log
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    When I hit my URL, Apache still shows the default page, not the index I've created in the given DocumentRoot. In my domain's DNS I have set the A record to the IP of my VPS. Output of apache2ctl -S:

        VirtualHost configuration:
        *:80    is a NameVirtualHost
                default server xxxxxx.stratoserver.net (/etc/apache2/sites-enabled/000-default.conf:1)
                port 80 namevhost xxxxxxx.stratoserver.net (/etc/apache2/sites-enabled/000-default.conf:1)
                port 80 namevhost mysite.com (/etc/apache2/sites-enabled/homepage.conf:1)
                        alias www.mysite.com
        ServerRoot: "/etc/apache2"
        Main DocumentRoot: "/var/www"
        Main ErrorLog: "/var/log/apache2/error.log"
        Mutex default: dir="/var/lock/apache2" mechanism=fcntl
        Mutex mpm-accept: using_defaults
        Mutex watchdog-callback: using_defaults
        PidFile: "/var/run/apache2/apache2.pid"
        Define: DUMP_VHOSTS
        Define: DUMP_RUN_CFG
        User: name="www-data" id=33 not_used
        Group: name="www-data" id=33 not_used

    How do I need to set up my virtual host so that Apache serves the correct site for the domain the request arrives on?
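
    A quick way to separate DNS trouble from vhost matching is to hit the server's IP directly and supply the Host header by hand (the IP below is a placeholder for the VPS address):

        # Which vhost answers: 000-default or homepage.conf?
        curl -H "Host: mysite.com" http://203.0.113.7/
        curl -H "Host: www.mysite.com" http://203.0.113.7/

    If the Host-header requests return the right site, the A record (or its propagation) is the problem rather than the Apache configuration.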

  • How can I audit a Linux filesystem for files which have been changed or added within a specific time frame?

    - by Bcos
    We are a website design/hosting company running several sites on a Linux server using Joomla 1.5.14. Recently someone was able to exploit a vulnerability in the RW Cards component to write arbitrary files to, and modify existing files on, our filesystem, enabling them to do some nasty things to our customers' sites. We have removed the vulnerable modules from all sites but are still seeing some problems. We suspect that they still have some scripts installed, and we need a way to audit anything that has been changed or added in the last 10 days. Is there a command or script we can run to do this?
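
    A GNU find sketch that lists files whose contents (mtime) or metadata (ctime) changed in the last 10 days, newest first (narrow the start directory to the web roots to cut noise):

        find /var/www -type f \( -mtime -10 -o -ctime -10 \) -printf '%T@ %p\n' \
            | sort -rn | less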

  • In SharePoint, What is the best way to move a subsite to its own site collection?

    - by user7862
    I am currently running a SharePoint 2007 Farm. I have a subsite (http://server/sites/hr/finance) that I wish to move to its own site collection (http://server/sites/finance). I exported the subsite using stsadm -o export. Then I created the new site collection (http://server/sites/finance). Then I attempted to import the site using stsadm -o import. However, I'm getting the following error: "The file cannot be imported because its parent web does not exist" I am running as the Site Collection administrator.
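
    For comparison, a hedged end-to-end sketch of the same move; a common cause of that error is the target site collection not existing (or lacking a root web) when the import runs, so creating it first with a blank template is worth trying (owner account and email are placeholders):

        stsadm -o export -url http://server/sites/hr/finance -filename finance.cmp
        rem The destination site collection must exist before the import;
        rem STS#1 (Blank Site) avoids clashes with imported lists.
        stsadm -o createsite -url http://server/sites/finance -owneremail admin@example.com -ownerlogin DOMAIN\admin -sitetemplate STS#1
        stsadm -o import -url http://server/sites/finance -filename finance.cmp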

  • Transferring NS records to a new server

    - by lanemiller
    I feel like the title was not worded well, but here is my current predicament. I recently had a GoDaddy dedicated server and decided, after their customer support failed to do anything but disappoint, to switch to Rackspace. We have two NS records that point to our GoDaddy server, and a few sites left on that server rely on it for their DNS zones; the owners of those domains fail to respond to us. So, the question is: since I need to transfer the sites off the old GoDaddy nameservers, can I point the A records for ns1.domain.com and ns2.domain.com at the IP addresses of the Rackspace nameservers? Or do I CNAME my NS records to match the Rackspace ones? I do know that neither method is advised, but I need to get these sites moved before GoDaddy tries charging another $2k for the server.
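
    For what it's worth, the target of an NS record should not be a CNAME (resolvers may reject it), so updating the A records is the workable path. Hypothetical zone entries (the IPs are documentation addresses, not Rackspace's real nameservers):

        ns1.domain.com.    IN  A    203.0.113.10
        ns2.domain.com.    IN  A    203.0.113.11

    Remember that the matching glue records at the registrar also need updating, or resolvers will keep using the old GoDaddy addresses.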

  • Broadband Traffic Question

    - by rutherford
    I have a broadband ADSL line with plus.net in the UK. Having checked the modem, there is no firewall or other unusual feature enabled. But since I arrived at the apartment (the broadband already being installed), I cannot log into Twitter, nor update any of my WordPress blogs (I can browse them and log in, but cannot save any edits or new posts). It only seems to affect these two sites, each in its own way. If I take the netbook I use in this place out to, say, a McDonald's or some other wifi access point, these sites work fine again. Does anyone know what could possibly be preventing access to the pages in question? The only thing common to these pages is the POST request they expect; yet POST form submission works fine on other sites...
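
    POSTs failing while ordinary browsing works is worth probing with a verbose client from the affected line; if the transfer stalls right after the request body is sent, an MTU/fragmentation problem on the ADSL connection becomes a plausible suspect (the URL is a placeholder):

        # Watch where the transaction hangs
        curl -v -d "field=value" http://www.example.com/submit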

  • WGet from one site on a server to another site on the same server

    - by JoshReedSchramm
    Hey all, I've recently been asked to administer a couple of Ubuntu boxes running web servers. I'm a dev by trade, so if this question is fairly noob, please forgive. We have about a dozen sites running on this box. Two of our sites need to talk back and forth over a RESTful API. Unfortunately we are having issues with the sites' connections to each other via wget. When we run wget manually from the command line on the server, pointing at a site also hosted on that server, it hangs and eventually times out. If we do the same thing from outside the server to the same site, it works. Is there something that could be preventing sites on the same server from communicating with each other? The same thing happens when pinging the site from the server.
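
    This smells like hairpin NAT: the site's public name resolves to the external IP, and the router in front of the server won't loop traffic from the inside back in. A hedged workaround is pinning the name to a local address in /etc/hosts on the calling box (the name and address are placeholders):

        # /etc/hosts on the server that makes the API calls:
        # keep requests to the sister site on the local machine
        127.0.0.1   api.example.com

        # then verify:
        wget -O- http://api.example.com/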
