Search Results

Search found 8020 results on 321 pages for 'webcenter sites'.

Page 26 of 321

  • Windows 7 PC browsers having trouble opening certain sites but not others

    - by user55345
    Hi, my Windows 7 PC has been acting very strangely lately. For example: it cannot open bing.com, alexa.com or msdn.microsoft.com (blank page or "page cannot be found"); it can open zdnet.com, but all CSS layouts and pictures are missing; it can open Stack Overflow and Google with no problem. I tried all three browsers (IE8, Chrome, Firefox 3.6) with the same result. I haven't changed anything on this PC in days, and it has anti-virus software installed and updated. I'm also fairly sure it's not a network problem, because my other laptops are fine at the same time. The only thing I can think of is that some automatic update happened in the background without my knowledge, but I have no idea where to start troubleshooting. Can anyone shed some light on this? Thanks a lot.

    Read the article
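
    For a symptom like the one above, where only certain hostnames fail in every browser, the usual first suspects are name resolution and a tampered hosts file; a few standard checks from an elevated command prompt (nothing here is specific to this particular machine, so treat it as a sketch):

        ipconfig /flushdns
        nslookup bing.com
        nslookup msdn.microsoft.com
        rem Look for entries that point the failing names somewhere odd
        notepad %SystemRoot%\System32\drivers\etc\hosts
        rem Undo LSP/proxy tampering left behind by updaters or malware, then reboot
        netsh winsock reset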

  • Windows FTP client to mirror ftp sites?

    - by user15318
    I need a Windows application that works like *nix's lftp. The Windows host will download the latest changes made on FTP Server #1 and upload those changes to FTP Server #2. Server #2 can't pull files directly from Server #1 because it's a shared www host with no shell account. The requirements are: 1. a Windows app available for XP/Vista/W7; 2. it must run either as an icon or as a service, since I don't want an extra icon open in the task bar; 3. it must be reliable, so I don't have to worry about it. Is there an application you would recommend? Thank you.

    Read the article
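
    One possible approach to the mirroring question above is WinSCP's scripting interface driven by Task Scheduler; the hostnames, credentials and paths below are placeholders, so treat this as a rough sketch rather than a ready-made job:

        # mirror.txt -- pull the latest changes from server 1, then push them to server 2
        option batch abort
        option confirm off
        open ftp://user:password@ftp1.example.com/
        synchronize local C:\mirror /public_html
        close
        open ftp://user:password@ftp2.example.com/
        synchronize remote C:\mirror /public_html
        exit

    Run as winscp.com /script=C:\mirror.txt /log=C:\mirror.log from a scheduled task, it leaves no icon open in the task bar, which covers the icon/service requirement.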

  • Plesk and Apache configuration gives me 403 on all sites

    - by Michael Stark
    My friend's server runs Plesk 9.2 with Apache. There were some problems over the last few days, and he can't tell me exactly what happened. The current situation is this: the server hosts a lot of domains, and visiting any of them now returns a 403. I checked the logs and found the problem: [Sun Jun 24 08:24:47 2012] [error] [client XX.XX.XX.XX] script '/srv/www/htdocs/index.php' not found or unable to stat. Apache should be routing to '/srv/www/vhosts/domain.tld/htdocs/index.php' instead of /srv/www/htdocs/index.php, and it's doing this on all of the domains. Can you tell me what's wrong and how to fix it?

    Read the article
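
    The symptom above (every domain falling through to the server-wide default DocumentRoot) usually means the per-domain vhost definitions are no longer being loaded. A quick way to check, plus a sketch of what each Plesk-generated vhost should roughly contain (paths taken from the question):

        # List every vhost Apache actually knows about; all the Plesk domains should appear here
        apachectl -S        # on some distributions: httpd -S or apache2ctl -S

        # Roughly what a per-domain vhost is expected to declare
        <VirtualHost *:80>
            ServerName   domain.tld
            ServerAlias  www.domain.tld
            DocumentRoot /srv/www/vhosts/domain.tld/htdocs
        </VirtualHost>

    If the domains are missing from that listing, the Include line that pulls Plesk's generated vhost files into httpd.conf has probably been lost or commented out.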

  • Running multiple sites with multiple domains in Apache

    - by PsychoData
    I am having a rough time running Apache with multiple domain names; here is a snippet of my config file. I keep getting an error saying that NameVirtualHost has no VirtualHosts. I want them both running on the same IP and I'm not sure why this doesn't work. I've been digging through the documentation for VirtualHost, NameVirtualHost, and Apache's page about name-based virtual hosting. The example on the name-based page is almost exactly my config! What am I doing wrong?

        Listen *:80
        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName www.sample1.net
            DocumentRoot /var/www/sample1-net
        </VirtualHost>
        <VirtualHost *:80>
            ServerName www.example2.net
            DocumentRoot /var/www/example2-net
        </VirtualHost>

    Read the article
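
    The snippet above looks fine on its own; in practice this warning very often comes from a second NameVirtualHost *:80 line lurking in another included file, so that one of the two declarations ends up with no matching vhosts. A couple of checks worth running (paths assume a typical /etc/httpd or /etc/apache2 layout):

        # Find every NameVirtualHost / VirtualHost declaration across all included config files
        grep -Rn "NameVirtualHost\|<VirtualHost" /etc/httpd/ /etc/apache2/ 2>/dev/null

        # Ask Apache how it actually resolved the name-based vhosts
        apachectl -S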

  • Safari Plugin/Extension Sites?

    - by Kaji
    Useful as it was when I was running Tiger, given the number of broken links and the general lack of information on what's compatible with version 4.0+, I've decided it's time to give up on PimpMySafari.com and look elsewhere. Can anyone recommend a similar site that actually keeps its content up to date?

    Read the article

  • 389 DS Architecture for Multiple Sites

    - by Kyle Flavin
    I'm looking to deploy 389 Directory Server in my environment to replace an existing iPlanet installation. I would be using it primarily to store user account data for authentication purposes. I have two physically separate data centers that I would like to share the same directory tree. My initial thinking is to set up 389 DS as follows: a master/consumer pair in data center A; a master/consumer pair in data center B; and a replication agreement between both masters, to mirror the directory tree in both environments. Does this sound like a reasonable approach? Is there a better way to do it (e.g. four masters)? Is there documentation on best practices for setting up 389 DS in situations like this? Thanks.

    Read the article
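
    For the two-data-center layout sketched above, each master ends up holding a replication agreement pointing at the other. Very roughly, one such agreement is an entry like the following (suffix, hostname, bind DN and password are placeholders, and each server first needs its changelog and cn=replica entry configured, as covered in the 389 DS replication documentation):

        dn: cn=dcA-to-dcB,cn=replica,cn="dc=example,dc=com",cn=mapping tree,cn=config
        objectClass: top
        objectClass: nsds5replicationagreement
        cn: dcA-to-dcB
        nsds5replicahost: master-b.example.com
        nsds5replicaport: 389
        nsds5replicabinddn: cn=replication manager,cn=config
        nsds5replicabindmethod: SIMPLE
        nsds5replicacredentials: secret
        nsds5replicaroot: dc=example,dc=com
        description: Replication from data center A to data center B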

  • Mirror DFS configuration data between 2 servers/sites

    - by Retro69
    I have one Windows 2008 R2 server in Site A running domain-integrated DFS in 2008 mode, with a single namespace and a large number of DFS targets, all configured to point at shares on our NetApp SAN. Step 1: I want to copy this configuration across to a 2012 server in Site A, preserving all the configuration data. Step 2: I need to mirror this configuration to a second server in Site B so we don't have a single point of failure for the DFS namespace. For example, a user in Site B would "connect" to the DFS server in Site B, but if that site were down, it would attempt to connect to the server in Site A, and vice versa. Note that I'm not interested in replicating actual data here, just the configuration; our NetApp SANs have mirroring which takes care of that. Is this possible? Many thanks.

    Read the article
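
    Because the namespace described above is domain-integrated, its configuration already lives in Active Directory, so the usual approach is to add the new 2012 server (and later the Site B server) as additional namespace servers for the same root rather than copying anything by hand. A sketch with placeholder names, assuming the DFS Namespaces cmdlets that ship with Server 2012:

        # On the new server: install the DFS Namespaces role and management tools
        Install-WindowsFeature FS-DFS-Namespace -IncludeManagementTools

        # Register this server as an additional root target for the existing domain-based namespace
        New-DfsnRootTarget -Path "\\corp.example.com\Files" -TargetPath "\\SITEB-DFS01\Files"

        # Confirm every namespace server that now answers for the root
        Get-DfsnRootTarget -Path "\\corp.example.com\Files"

    With both root targets registered, clients receive referrals ordered by Active Directory site, which gives the local-server-first, fail-over-to-the-other-site behaviour the question asks about.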

  • Different external IP addresses from different sites

    - by user630286
    My router runs ClearOS 6 (CentOS 6) and has two external (internet) connections from two ISPs: the primary connection is eth2, connected to a cable modem, and the secondary is ppp0, connected to a DSL modem. I have assigned eth2 as the primary connection (with a high metric value); this was done through ClearOS's multi-WAN web interface. I have a Nagios test that monitors whether the primary connection is in use, based on the result of curl ifconfig.me. But ifconfig.me always reports the IP address of my secondary connection. I tested it in a browser: ifconfig.me gives the secondary connection's (ppp0) IP address, yet whatismyipaddress.[com|org] gives my primary address (eth2). I checked the default route on the router with ip route list 0/0, which also shows the primary connection (eth2) as the default route. traceroute www.google.com and traceroute ifconfig.me both seem to go out through the primary connection (eth2). As our secondary internet connection has only a limited download allowance, I don't want to end up paying a large sum at the end of the month. Does anybody have an idea why ifconfig.me shows my secondary address? What is the best way to ensure that my router (and thus the LAN) uses the right internet connection?

    Read the article
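
    One way to narrow down the behaviour described above is to pin the check to each WAN link explicitly and compare that with what the kernel picks for the destination; a small sketch (interface names taken from the question, dig assumes bind-utils is installed):

        # Which link does the routing table actually choose for this destination right now?
        ip route get $(dig +short ifconfig.me | head -1)

        # Public IP as seen when the request is forced out each WAN interface
        curl --interface eth2 ifconfig.me
        curl --interface ppp0 ifconfig.me

    If the request pinned to eth2 shows the expected address, the discrepancy is likely ClearOS's multi-WAN connection marking sending that particular flow out ppp0, rather than anything wrong with ifconfig.me itself.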

  • DansGuardian: content regexp list for exact sites, how?

    - by Sergey
    We have a contentregexplist file where we can write substitutions like "source regex"-"dest string", and they ALL run on every page. Is it possible to define a domain name (or names) for which certain regexps should be applied, so that a rule runs only for one domain rather than for every page? To be clear: how do I replace "Google" with "garbage" in the page source only for the host host.example.org? Maybe some other content filtering system can do this? If so, which one?

    Read the article

  • How to publish a page to two sites?

    - by George2
    Hello everyone, I am using SharePoint 2007 Enterprise + the Publishing Portal template + Windows Server 2008. I have a root site and a sub-site, and I want to enable the following function: when the sub-site administrator publishes a page, they can choose to publish it to the sub-site only, or to both the root site and the sub-site. Any ideas how to implement this? I am not sure whether there is any ready-to-use solution that doesn't require writing code. Thanks in advance, George

    Read the article

  • Xen virtual host can reach some sites but not others

    - by Tun H S Lee
    Okay, this is killing me. Debian Squeeze, Xen 4.0, brand new install. No iptables rules whatsoever except for the ones added by the default Xen bridge script. Dom0 can reach the entire world, no problems. DomU can receive packets from some hosts, but not from others. For instance, if I ping Host A, it works fine; if I ping Host B, the DomU reports 100% packet loss. The hosts are random but consistent (even after reboots), and I can see no pattern in why some work and others don't. In fact, in some cases different virtual hosts on the same server (another server at a different data center) are split: some work and others do not. I can reboot (the DomU, or the Dom0 too) and the same hosts will work or fail as before. If I tcpdump on Host B while pinging from the DomU, everything looks fine: it sees the echo request coming in and says it's sending one back. However, if I tcpdump peth0 on the Dom0, it never sees the echo reply. Any ideas what could be happening? I'm tearing my hair out here.

    Read the article
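
    Since the reply leaves Host B but never appears on peth0, it's worth watching each hop inside Dom0 and confirming nothing is filtering bridged frames; a diagnostic sketch (interface names are the Xen bridge defaults, the host address is a placeholder):

        # Watch for the echo reply at the physical interface and at the guest's vif
        tcpdump -ni peth0 icmp and host <Host B address>
        tcpdump -ni vif1.0 icmp and host <Host B address>

        # Check whether bridged traffic is being pushed through iptables/ebtables at all
        sysctl net.bridge.bridge-nf-call-iptables
        iptables -vnL FORWARD
        ebtables -L

    If the reply never even reaches peth0, the cause may also sit upstream of the box entirely (an ARP/IP conflict or provider-side filtering), which the same captures on the Dom0 help confirm or rule out.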

  • xampp - can access control panel, cannot access projects/sites on local network

    - by Peter O.
    I've configured XAMPP and my firewall so that I can access the desktop PC's localhost over my local network through the desktop PC's IP, but I'm not able to access the actual projects. I can access http://192.168.x.x/xampp or http://192.168.x.x/phpMyAdmin, but I cannot access http://192.168.x.x/myWebsite/. I get error 500: "Server error. We're sorry! The server encountered an internal error and was unable to complete your request. Please try again later."

    Read the article
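
    A 500 on the project URLs while the XAMPP pages still load means Apache itself is reachable and the project is failing server-side, so the error log is the first place to look; a minimal check, assuming XAMPP's default install path on Windows:

        rem Show Apache's error log for the exact cause of the 500
        type C:\xampp\apache\logs\error.log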

  • Certain sites not working in Firefox, working in IE

    - by PSU_Kardi
    A totally weird thing is happening on my PC since I came back from the holiday shutdown. My homepage by default is google.com/ig, but when it opens (in Firefox) the Gmail panel does not display and eventually times out. I then try to navigate to https://www.gmail.com, but that also times out. Thinking maybe work decided to drop the ban-hammer on Gmail, I tried it in Internet Explorer, and oddly enough it works in IE. Any idea why it works fine in one browser but not the other?

    Read the article

  • Javascript loading never completes on many sites

    - by Joe
    I recently moved country and have found that on many websites the page never finishes loading. In some cases, no content is ever displayed, but the loading will never time out. Loading Developer Tools in Chrome shows me that it is the Javascript files which never load. For example, this BBC article will never load compatability.js, though will load all the other JS files perfectly. Google Maps often fails to finish loading, meaning it's impossible to make searches. There seems to be no pattern to which files will fail to load (i.e. they don't come from the same CDN). I have tried Chrome, Safari and Firefox on OSX 10.8, and Chrome on my girlfriend's OSX 10.7. I have similar issues on the iPad. In many cases, if I can go to the mobile version of the page that seems to load fine. I have run the browsers in private mode, disabled plugins, updated flash, cleared the cache, flushed the DNS cache - though it would seem that if this is happening on other devices, none of this would work anyway. Is this an ISP issue? And if so, why would it be limited to certain JS files and not all? JS files from the same domain work fine, so I'm not really sure what I should be looking for.

    Read the article

  • Why only Google-related sites load on a wireless network

    - by gansai
    I have a wireless internet connection through BSNL, I run Windows 7 Ultimate, and I use the latest Google Chrome. During night time, i.e. from about 12:30 am until around 5:30 am, I have the following problem: only the following web pages load: Google Search, Google Images, and related Google product pages like those on the Google home page, plus Blogspot pages, which are served through Google. Loading any other site gets stuck with the status message "Waiting for www.thaturl.com...", and it goes on and on. Why does this happen? I have googled around for this but found no solution at all. By the way, I have also tried changing my DNS addresses from automatic to OpenDNS and then to Google Public DNS. I need a solution that will let me load other websites as well. Thanks in advance.

    Read the article
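
    Since switching resolvers was already tried, it helps to record, during the problem window, whether names for the failing sites still resolve and where the route out of BSNL stalls; a small sketch of checks to run at, say, 1 am (the domain is a placeholder):

        nslookup www.example.com
        nslookup www.example.com 8.8.8.8
        tracert  www.example.com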

  • List all documents (web parts) and sites using a certain solution in SharePoint 2007

    - by tnolan
    I would like to uninstall a SharePoint application template (GroupBoard Workspace, to be exact), but I want to make sure nothing currently relies on it. I don't see any stsadm operations that will tell me this, and I have even tried SPM, which would work, but with such a huge site it's tedious to go through every single web and page to see which features are in use. Is there a way (probably with SQL, using the id from stsadm -o enumsolutions) to list everything that relies on a template within a given solution, including web parts on custom pages? If this is not possible, what is the best way to check dependencies prior to uninstalling a solution (especially since GBW is not the only one on my axe list)? Note: I know that stsadm -o deletesolution will stop me from removing something that is in use, but I want to see all of the things that are using a given solution.

    Read the article

  • Creating Test Sites

    - by Robert
    I have a website running off-site. When we hire someone, I would like to create a test site (a copy of the live site) for the new employee to tinker with. I will need to take fresh copies of the files and database (basically a snapshot) and let them access those copies, so they can edit and upload files and see their changes as if it were the live site. Basically, what is the best practice for creating a copy of a website for testing? The server is running Linux, PHP and MySQL.

    Read the article
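
    One common way to take the snapshot described above on a Linux/PHP/MySQL stack is a file copy plus a database dump loaded into a separate test database served from its own vhost; the paths, names and credentials here are placeholders, so treat it as a sketch:

        # Copy the live document root into a per-employee test root
        rsync -a /var/www/livesite/ /var/www/test-alice/

        # Snapshot the live database and load it into a dedicated test database
        mysqldump -u root -p live_db > /tmp/live_db.sql
        mysql -u root -p -e "CREATE DATABASE test_alice"
        mysql -u root -p test_alice < /tmp/live_db.sql

    Point the copied site's config at the test database and serve it from its own vhost (e.g. test-alice.example.com) so the new hire's uploads never touch live data.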

  • Strange issue with WordPress sites, is it PHP memory?

    - by Drai
    This has happened to me twice with the same host, and I want to know the real cause. I have multiple WordPress sites hosted on a shared server. One day, when I attempt to visit any of the sites, the browser simply downloads the index.php file. It happens on all the WordPress sites but not on the static sites hosted there. I understand this is a PHP issue on the server, but what could be happening specifically? The only thing I could find when searching was something to do with memory limits. Is this common? Should I be worried about this host?

    Read the article

  • Adding users to multiple/all sites in Google Webmaster Tools?

    - by Christian
    I didn't find an answer to this, but maybe I didn't use the right terms for my search. So I'm sorry if this is a duplicate. Anyway, my situation is this: the company I work at manages a lot of sites (100+), and we've recently put them all into Google Webmaster Tools under my Google account, which was tedious enough. Now two coworkers are supposed to be added as users for each site, so they can see the data and manage stuff there as well. But I can only find an option to add users for a single site, not for all sites that are currently associated with my account. Do I really have to go through more than a hundred sites one by one and add the two users to each of them, or is there some way to add both users to all/multiple sites at once?

    Read the article

  • Apache httpd VirtualHost config - multiple sites

    - by DaFoot
    [Advised to post here from StackExchange] I have a site to work on; because of the way the URLs are built, the application seems to have been created on the assumption that it will sit at the server root (the only app). On my dev server I have other projects, and up to now a simple symlink has been working for me, but that's no longer the case, because this new app wants to sit at the root and process all URLs arriving on :80. Hopefully this snippet from httpd.conf will explain what I'm trying to achieve:

        # default for any not matched elsewhere
        <VirtualHost *:80>
            ServerName localhost
            DocumentRoot /var/www/html/newproject
        </VirtualHost>
        # now try to pick out specific URLs
        <VirtualHost localhost/webdev>
            DocumentRoot /var/www/html/existingProject
            ServerName localhost/project
        </VirtualHost>

    I also need to get the same effect from wherever I'm accessing the httpd instance. Hope that makes sense.

    Read the article
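
    A VirtualHost can only match on address/port and hostname, never on a URL path, so the second block above can never be selected. One hedged alternative that keeps the new app at the root while carving out the other project by path is an Alias inside the single vhost (paths taken from the question; the /webdev prefix is an assumption):

        <VirtualHost *:80>
            ServerName   localhost
            DocumentRoot /var/www/html/newproject

            # Requests under /webdev are served from the other project instead of the root app
            Alias /webdev /var/www/html/existingProject
            <Directory /var/www/html/existingProject>
                Require all granted    # on Apache 2.2: Order allow,deny / Allow from all
            </Directory>
        </VirtualHost>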

  • Attackers could find out which sites you have visited in IE, Chrome and Firefox, according to research by a Belgian engineer

    Attackers could find out which sites you have visited in IE, Chrome and Firefox, according to research by a Belgian engineer. In 2010, researchers at the University of Southern California analysed more than 40 sites that were spying on users' browsing habits. Among them was YouPorn.com, which prides itself on being the YouTube of its field and used JavaScript code to find out whether visitors had recently visited competing sites. By combining...

    Read the article

  • Unable to connect to FTP sites

    - by Fariborz Navidan
    Since a few days ago I have been unable to connect to any FTP site, either through Windows's FTP client or through other software like WinSCP, but I can log in through the command-line FTP client. I am using Windows 7. The ESET firewall is disabled, and I did not find any blocking rule in Windows Firewall. One symptom: when I enter ftp://my-server.com in the Windows Explorer address bar, it says "Windows cannot access the folder", and there is no further information in the Details section. Please help!

    Read the article
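
    The pattern above (command-line ftp works, Explorer and WinSCP fail) is consistent with passive-mode FTP being blocked somewhere, since the command-line client defaults to active mode while the others default to passive. Two things worth trying, offered as guesses rather than a diagnosis (the netsh command assumes Windows Firewall is the component inspecting FTP):

        rem Let Windows Firewall track passive FTP data connections
        netsh advfirewall set global StatefulFTP enable

        rem As a second test, switch the WinSCP session's connection settings to active mode
        rem and see whether the login then succeeds; no command needed for that.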

  • Virtual folder for multiple sites

    - by Cups
    I am creating a very simple flat-file CMS for small (multilingual) websites. The little file writing that goes on is handled by 4 scripts in a publicly available folder named /edit in each site. Given that I now have two websites working on that simple system (websiteA/index.php etc. plus websiteA/edit/, and websiteB/index.php etc. plus websiteB/edit/), what is the best way of making that /edit folder "virtual", so that these and each subsequent website owner can log in to their own view of /edit while the code exists in only one place? I do not want the website owners to log in from a central website, but from their own /edit directory. I have already read about different solutions, apparently using the <Directory> directive in my httpd.conf declaration for each website, and also using straight mod_rewrite, but I admit to becoming confused by some of the terminology. Each website has its own config file which contains path settings and so on. What, in your opinion, is the best way to handle this? EDIT: In light of a reply, given a virtual host directive such as this:

        <VirtualHost 00.00.00.00:80>
            DocumentRoot /var/www/html/websitea.com
            ServerName www.websitea.com
            ServerAlias websitea.com
            DirectoryIndex index.htm index.php
            CustomLog logs/websitea combined
        </VirtualHost>

    is it possible to create an alias inside that directive for the folder websitea.com/edit?

    Read the article
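
    Yes: an Alias inside each vhost can map /edit onto one shared directory of scripts, which is the sort of thing the question is after. A sketch, under the assumption that the shared code lives in /var/www/shared/edit:

        <VirtualHost 00.00.00.00:80>
            DocumentRoot /var/www/html/websitea.com
            ServerName   www.websitea.com
            ServerAlias  websitea.com

            # Every site gets the same /edit code from one central location
            Alias /edit /var/www/shared/edit
            <Directory /var/www/shared/edit>
                Require all granted    # on Apache 2.2: Order allow,deny / Allow from all
            </Directory>
        </VirtualHost>

    Each site's own config file can still tell the shared scripts which site is being edited, for example by inspecting the Host header.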

  • How to enable WordPress to serve multiple hostnames without a redirect

    - by user57039
    I'm using WordPress to manage my site, and when the site does a redirect, it hurts performance. For example, WordPress allows you a single default site, www.mycompany.com; if a user goes to mycompany.com, WP will redirect them to www.mycompany.com. Is there a way to configure WP so that it will listen on both www.mycompany.com and mycompany.com without redirects? The redirects are causing performance hits on the site.

    Read the article
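
    WordPress issues that redirect because its canonical URL check compares the request host against the single configured home URL. One hedged workaround is to derive the URLs from the incoming host in wp-config.php, so both hostnames match and no redirect fires; this is a sketch with trade-offs (it trusts the Host header and splits any page caching across the two hostnames):

        // In wp-config.php, above the "That's all, stop editing!" line
        define('WP_HOME',    'http://' . $_SERVER['HTTP_HOST']);
        define('WP_SITEURL', 'http://' . $_SERVER['HTTP_HOST']);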
