Search Results

Search found 26123 results on 1045 pages for 'multiple sites'.

Page 40 of 1045

  • Windows FTP client to mirror FTP sites?

    - by user15318
    I need a Windows application that works like *nix's lftp. The Windows host will download the latest changes made to FTP Server #1 and upload those changes to FTP Server #2. Server #2 can't pull files directly from Server #1 because it's a shared web host with no shell account. The requirements are:
    1. A Windows app available for XP/Vista/W7.
    2. It must run either as a tray icon or as a service; I don't want an extra icon open in the task bar.
    3. Reliable, so I don't have to worry about it.
    Is there an application you would recommend? Thank you.
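    If no off-the-shelf tool turns up, one fallback is a small script run by the Windows Task Scheduler. Below is a minimal one-way relay sketch using Python's ftplib; the hostnames and credentials are hypothetical placeholders, and it assumes both accounts land in a single flat directory of files (no subdirectories, no deletions).

      # Minimal one-way FTP relay sketch (hypothetical hosts and credentials).
      # Copies files that exist on server #1 but are missing on server #2.
      import io
      from ftplib import FTP

      SRC = dict(host="ftp1.example.com", user="user1", passwd="secret1")  # hypothetical
      DST = dict(host="ftp2.example.com", user="user2", passwd="secret2")  # hypothetical

      def relay_missing_files():
          src, dst = FTP(**SRC), FTP(**DST)
          try:
              existing = set(dst.nlst())                     # names already on server #2
              for name in src.nlst():
                  if name in existing:
                      continue
                  buf = io.BytesIO()
                  src.retrbinary("RETR " + name, buf.write)  # download from server #1
                  buf.seek(0)
                  dst.storbinary("STOR " + name, buf)        # upload to server #2
                  print("copied", name)
          finally:
              src.quit()
              dst.quit()

      if __name__ == "__main__":
          relay_missing_files()

    Scheduled every few minutes, something like this covers the "no extra icon in the task bar" requirement, though a dedicated mirroring tool would handle timestamps, deletions and retries far better.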

    Read the article

  • Plesk and Apache configuration gives me 403 on all sites

    - by Michael Stark
    My friend's server runs Plesk 9.2 with Apache. There were some problems over the last few days, and he couldn't tell me exactly what happened. The situation now is this: he has a lot of domains on the server, and when somebody visits any of them, a 403 shows up. I checked the logs and saw the problem:
      [Sun Jun 24 08:24:47 2012] [error] [client XX.XX.XX.XX] script '/srv/www/htdocs/index.php' not found or unable to stat
    Apache should route to '/srv/www/vhosts/domain.tld/htdocs/index.php' instead of /srv/www/htdocs/index.php. It's doing that on all of the domains. Can you tell me what's wrong and how to fix it?

    Read the article

  • Can a dual monitor setup cause boot problems?

    - by kriszpontaz
    I have a two-monitor setup, one 22" (1680x1050, 16:10) and another 19" (1280x1024, 5:4). I installed Ubuntu 11.10 beta2 x86; the installation worked fine and the system booted successfully. I then upgraded Ubuntu from the main server, and after a restart, booting with kernel 3.0.0-13, my system hangs on a purple screen and then nothing happens (the system boots successfully with kernel image 3.0.0-8). The current Nvidia drivers are not installed, and if I install them the situation is the same. I have an Nvidia 9600GT. I tried to boot with one screen attached, and I've tried each port, but no luck at all. With kernel image 3.0.0-8 the system boots successfully with each display attached, but the later kernels (3.0.0-11, 3.0.0-12, etc.) all freeze, whether one display or several are attached. I have two systems with Ubuntu installed, and the other one (with an Ati HD 2400XT and the latest closed drivers) doesn't have any issues like the one I described. Update: The problem was solved by reinstalling the operating system with only one monitor attached and without automatically installing updates during the install. After completing the installation and a clean reboot, I installed the closed nVidia drivers. After all that, I found it's safe to connect another monitor to the system; it isn't causing any problems. The situation will probably stay like this.

    Read the article

  • Safari Plugin/Extension Sites?

    - by Kaji
    Useful as it was when I was running Tiger, between the number of broken links and the general lack of information on what's compatible with version 4.0+, I've decided it's time to give up on PimpMySafari.com and look elsewhere. Can anyone recommend a similar site that actually keeps its content up-to-date?

    Read the article

  • Deploy multiple emails to email providers, but without showing favouritism

    - by Ardman
    We are currently developing a new email deployment system. At the moment the system reads a record from the database, loads the email content, and deploys it to the target. Now we want to move this over to multiple threads. That is easily done, except we then hit the email providers returning SMTP codes such as "Too many connections" or "Deferred connection". The solution is to have a thread open a connection to the email provider, deploy n emails, and then disconnect; we have already configured the application to support these session-based deployments. The problem is this: the database table contains multiple email addresses, and they aren't grouped by email provider because that would show favouritism. We need to be able to retrieve a set number of, say, Hotmail emails (@hotmail.com, @hotmail.co.uk, @live.co.uk) so that we reduce the number of connections to Hotmail and reduce the risk of getting the "Too many connections" error. We have gone round and round in circles trying to find a solution, so I thought I'd throw it out there and see if anyone has any ideas. EDIT: I would like to stress that this application is not used for spamming purposes.
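    One way to read that requirement: keep the table ungrouped, but bucket the addresses by provider in memory as they are read, and hand each worker thread a batch of at most n addresses for a single provider per SMTP session. The sketch below only illustrates that batching idea; the provider/domain mapping, batch size and sample addresses are hypothetical, not the poster's actual schema or code.

      # Sketch: bucket addresses by provider, then emit per-provider batches of
      # at most BATCH_SIZE addresses - one batch per SMTP connection/thread.
      from collections import defaultdict

      BATCH_SIZE = 50  # hypothetical per-connection limit

      PROVIDER_DOMAINS = {  # hypothetical mapping of domains to one provider key
          "hotmail.com": "hotmail", "hotmail.co.uk": "hotmail", "live.co.uk": "hotmail",
          "gmail.com": "gmail", "googlemail.com": "gmail",
      }

      def provider_of(address):
          domain = address.rsplit("@", 1)[-1].lower()
          return PROVIDER_DOMAINS.get(domain, domain)  # unknown domains stand alone

      def batches(addresses):
          """Yield (provider, [addresses]) batches; the source table stays ungrouped."""
          buckets = defaultdict(list)
          for addr in addresses:
              bucket = buckets[provider_of(addr)]
              bucket.append(addr)
              if len(bucket) == BATCH_SIZE:
                  yield provider_of(addr), bucket[:]
                  bucket.clear()
          for provider, rest in buckets.items():  # whatever is left at the end
              if rest:
                  yield provider, rest

      # Each yielded batch goes to one worker thread, which opens a single
      # connection to that provider, deploys the batch, and disconnects.
      for provider, batch in batches(["a@hotmail.com", "b@live.co.uk", "c@gmail.com"]):
          print(provider, batch)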

    Read the article

  • If you develop on multiple operating systems, is it better to have multiple computers + displays?

    - by dan
    I develop for iOS and Linux. My preferred OS is Ubuntu. Now my software shop (me and a partner) is developing for Windows too. The question is: is it more efficient to have multiple workstations, one for each target OS? Efficiency and productivity are a higher priority than saving money. I have a 3.4GHz i7 desktop workstation running Ubuntu and virtualized Windows with two displays, and I'm putting together an even more powerful i7 Hackintosh with 16GB RAM (to replace my weak 2.2GHz i5 MacBook Pro). My specific dilemma is whether I should sell the first computer and triple-boot on the second one, or buy two more displays and run both desktop systems simultaneously. I would appreciate answers from developers who write software for multiple OSes. Running guest OSes in VirtualBox on one system is not ideal, because in my experience performance is seriously degraded under virtualization. So the choice is between dual/triple booting on one system and having two systems, one for OSX+iOS/Windows (dual boot) and the other for Ubuntu (which I prefer to use as my main OS). For much of our work, I write a server-side application for Linux and a client for iOS (or for Windows or OS X) simultaneously.

    Read the article

  • 389 DS Architecture for Multiple Sites

    - by Kyle Flavin
    I'm looking to deploy 389 Directory in my environment to replace an existing iPlanet installation. I would be using it primarily to store user account data for authentication purposes. I have two physically separate data centers that I would like to share the same directory tree. My initial thinking is to set up 389 DS as follows:
    - A master/consumer in DataCenter A
    - A master/consumer in DataCenter B
    - A replication agreement between both masters, to mirror the directory tree in both environments
    Does this sound like a reasonable approach? Is there a better way to do it (for example, four masters)? Is there documentation for best practices when setting up 389 DS in situations such as this? Thanks.

    Read the article

  • Mirror DFS configuration data between 2 servers/sites

    - by Retro69
    I have one Windows 2008 R2 server in Site A running domain-integrated DFS in 2008 mode, with a single namespace and a large number of DFS targets, all configured to point to a share on our NetApp SAN. Step 1: I want to initially copy this configuration data across to a 2012 server in Site A, preserving all the configuration data. Step 2: I need to mirror this configuration to a second server in Site B, so we don't have a single point of failure for the DFS namespace. For example, a user in Site B would "connect" to the DFS server in Site B, but if that site was down, it would attempt to connect to the server in Site A, and vice versa. Note that I'm not interested in replicating the actual data here, just the configuration; our NetApp SANs have mirroring, which takes care of that. Is this possible? Many thanks.

    Read the article

  • Different external IP addresses from different sites

    - by user630286
    My router is ClearOS 6 (CentOS 6). In my router I have two external (internet) network connections from two ISPs. The primary connection is eth2, connected to a cable modem, and the second one is ppp0, connected to a DSL modem. I have assigned eth2 as the primary connection (with a high metric value); in fact this is done through ClearOS's multi-WAN web interface. I have a test in Nagios to monitor the primary connection, based on the result of curl ifconfig.me. But it seems that ifconfig.me is always giving the IP address of my secondary connection. I tested it through a browser: yes, ifconfig.me gives the secondary connection's (ppp0) IP address, but whatismyipaddress.[com|org] gives my primary IP address (eth2). I checked the default route on the router through ip route list 0/0, which also shows the primary connection (eth2) as the default route. traceroute www.google.com and traceroute ifconfig.me both seem to trace through the primary connection (eth2). As our secondary internet connection has only a limited download allowance, I don't want to end up having to pay a large sum at the end of the month. Has somebody got an idea why ifconfig.me shows my secondary address? What is the best way to ensure that my router (and thus the LAN) uses the right internet connection?
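    Multi-WAN setups typically steer traffic with policy routing (multiple routing tables and rules) rather than the single default route, so ip route list 0/0 on its own may not show which uplink a given outbound connection actually uses. A quick way to compare the two uplinks, sketched here with standard curl and iproute2 commands (worth double-checking on ClearOS 6 itself):

      # Ask the same external service from each uplink explicitly
      curl --interface eth2 ifconfig.me
      curl --interface ppp0 ifconfig.me
      # Inspect the policy-routing rules and any per-WAN tables they point at
      ip rule list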

    Read the article

  • DansGuardian: content regexp list for specific sites only, how?

    - by Sergey
    We have a contentregexplist file where we can write all substitutions like "source regex"-"dest string", and they ALL run for each page. Is it possible somehow to define a domain name (or names) for which certain regexps should be applied - only for that one domain, not for every page? To be clear: how can I replace "Google" with "garbage" in the page source only for the host host.example.org? Maybe some other content filtering system can do this? If so, which one?

    Read the article

  • Multiple domain links on Google from one WordPress site

    - by user557318
    At present, when I google the domain name of the WordPress sites I have worked on, I receive at least three listings (often the top three). The first listing is the only one I am interested in seeing; the others come from individual pages of that WordPress site, i.e. 1st hit: www.domain.com, 2nd hit: www.domain.com/about, 3rd hit: www.domain.com/designers. Does anybody know if it's possible to remove all the links but the absolute www.domain.com?

    Read the article

  • How to publish a page to two sites?

    - by George2
    Hello everyone, I am using SharePoint 2007 Enterprise + Publishing portal template + Windows Server 2008. I have a root site and a sub-site. I want to enable the following function: when the sub-site administrator publishes a page, the administrator can choose to publish it to the sub-site only, or to publish it to both the root site and the sub-site. Any ideas how to implement this? I am not sure whether there is any ready-to-use solution that avoids writing code. Thanks in advance, George

    Read the article

  • Xen virtual host can reach some sites but not others

    - by Tun H S Lee
    Okay, this is killing me. Debian Squeeze, Xen 4.0, brand new install. No iptables rules whatsoever except for the ones added by the default Xen bridge script. Dom0 can reach the entire world, no problems. DomU can receive packets from some hosts, but not from others. For instance, if I ping Host A, it works fine. If I ping Host B, the DomU reports 100% packet loss. The hosts are random, but consistent (even after reboots). I can see no pattern to why some work and others don't. In fact, in some cases, different virtual hosts on the same server (or on another server at a different data center) are divided: some work and others do not. I can reboot (the DomU, or the Dom0 too) and the same hosts will work or fail as before. If I tcpdump on Host B while pinging from the DomU, everything looks fine: it sees the echo request coming in and says it's sending one back. However, if I tcpdump peth0 on the Dom0, it never sees the echo reply. Any ideas what could be happening? I'm tearing my hair out here.

    Read the article

  • XAMPP - can access control panel, cannot access projects/sites on local network

    - by Peter O.
    I've configured XAMPP and the firewall so I can access the desktop PC's localhost over my local network through the desktop PC's IP, but I'm not able to access actual projects.
    I can access: http://192.168.x.x/xampp or http://192.168.x.x/phpMyAdmin
    But I cannot access: http://192.168.x.x/myWebsite/ - I get an error:
      Server error
      We're sorry! The server encountered an internal error and was unable to complete your request. Please try again later.
      error 500

    Read the article

  • Certain sites not working in Firefox, working in IE

    - by PSU_Kardi
    A totally weird thing is happening on my PC after I came back from the holiday shutdown. My homepage by default is google.com/ig, but when it opens (in Firefox) the Gmail panel does not display and eventually times out. I then try to navigate to https://www.gmail.com, but that also times out. Thinking maybe work decided to drop the ban-hammer on Gmail, I decided to try it in Internet Explorer. Oddly enough, it works in IE. Any idea why it works fine in one browser but not the other?

    Read the article

  • JavaScript loading never completes on many sites

    - by Joe
    I recently moved country and have found that on many websites the page never finishes loading. In some cases, no content is ever displayed, but the loading will never time out. Loading Developer Tools in Chrome shows me that it is the JavaScript files which never load. For example, this BBC article will never load compatability.js, though it will load all the other JS files perfectly. Google Maps often fails to finish loading, meaning it's impossible to make searches. There seems to be no pattern to which files will fail to load (i.e. they don't come from the same CDN). I have tried Chrome, Safari and Firefox on OSX 10.8, and Chrome on my girlfriend's OSX 10.7. I have similar issues on the iPad. In many cases, if I go to the mobile version of the page, that seems to load fine. I have run the browsers in private mode, disabled plugins, updated Flash, cleared the cache and flushed the DNS cache - though it would seem that if this is happening on other devices, none of this would work anyway. Is this an ISP issue? And if so, why would it be limited to certain JS files and not all? JS files from the same domain work fine, so I'm not really sure what I should be looking for.

    Read the article

  • External Full HD monitor and Virtual Desktop Size

    - by Stefan
    I have two Full HD monitors attached to my ATI graphics card [2]. The resolution of both of them is detected properly without any modifications to /etc/X11/xorg.conf, and I can run both of them in clone mode. However, when I try to run them next to each other, I get the following error: "The selected configuration for displays could not be applied." I tried to fix this according to [1]. My xorg.conf now looks like this:
      Section "Module"
          Load "glx"
      EndSection
      Section "Screen"
          Identifier "Default Screen"
          DefaultDepth 24
          SubSection "Display"
              # 1088 is the smallest multiple of 32 >= 1080, see the man pages
              Virtual 1920 1088
          EndSubSection
      EndSection
    This does not seem to be parsed properly. After restarting X, I cannot set resolutions beyond 1600 or so any more. /var/log/Xorg.0.log gives:
      [ 15.676] (II) fglrx(0): Not using mode "1920x1080" (width too large for virtual size)
      [ 15.676] (II) fglrx(0): Not using mode "1680x1050" (width too large for virtual size)
    Are my modifications syntactically incorrect? According to the man page, they should be fine. Any ideas?
    OS: Ubuntu 11.10 64bit
    [1] http://askubuntu.com/a/75546/5023
    [2] 01:00.0 VGA compatible controller: ATI Technologies Inc Juniper [Radeon HD 5700 Series]

    Read the article

  • Why only Google-related sites load using the wireless network

    - by gansai
    I have a wireless internet connection through BSNL. I'm on Windows 7 Ultimate, using the latest Google Chrome. During night time, i.e. from 12:30 am onwards till around 5:30 am, I have the following problem - only these webpages load:
    - Google Search
    - Google Images and all the Google product pages, like you see them on the Google home page
    - Blogspot webpages, as supported through Google
    Loading any other site gets stuck with the status message "Waiting for www.thaturl.com ...", and it goes on and on. Why does this happen? I googled around for this but found no solution at all. By the way, I also tried changing my DNS addresses a few times, from automatic to OpenDNS and then to Google Public DNS. I need a solution that might help me load the other websites too. Thanks in advance.

    Read the article

  • List all documents (webparts) and sites using a certain solution in SharePoint 2007

    - by tnolan
    I would like to uninstall a SharePoint application template (GroupBoard Workspace, to be exact), but I want to make sure nothing currently relies on it. I don't see any functions within stsadm that will tell me this information, and I have even tried SPM, which would work, but with such a huge site it's tedious to go through every single web and page to see which features are in use. Is there a way (probably with SQL, using the id from stsadm -o enumsolutions) to list everything that relies on a template within a given solution, including webparts on custom pages? If this is not possible, what is the best way to check dependencies prior to uninstalling a solution (especially since GBW is not the only one on my axe list)? Note: I know that stsadm -o deletesolution will stop me from removing something that is in use, but I want to see all of the things that are using a given solution.

    Read the article

  • Working with Git on multiple machines

    - by Tesserex
    This may sound a bit strange, but I'm wondering about a good way to work in Git from multiple machines networked together in some way. It looks to me like I have two options, and I can see benefits on both sides:
    1. Use Git itself for sharing: each machine has its own repo and you have to fetch between them. You can work on either machine even if the other is offline, which by itself is pretty big, I think.
    2. Use one repo that is shared over the network between machines. There is no need to do git pulls every time you switch machines, since your code is always up to date, and you never have to worry that you forgot to push code from your other, non-hosting machine, which is now out of reach, since you were working off a fileshare on this machine.
    My intuition says that everyone generally goes with the first option. But the downside I see is that you might not always be able to access code from your other machines, and I certainly don't want to push all my WIP branches to GitHub at the end of every day. I also don't want to have to leave my computers on all the time so I can fetch from them directly. Lastly, a minor point: all the git commands to keep multiple branches up to date can get tedious. Is there a third way to handle this situation? Maybe some third-party tools are available that make this process easier? If you deal with this situation regularly, what do you suggest?
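    For what the first option typically looks like in practice, here is a minimal sketch; the machine name (desktop), user and path are hypothetical, and it assumes SSH access between the machines:

      # On the laptop, register the desktop's repo as a remote (run once):
      git remote add desktop ssh://me@desktop/home/me/code/project

      # Later, pull in the desktop's work without it ever pushing anywhere:
      git fetch desktop
      git checkout -b feature-x desktop/feature-x   # continue a WIP branch locally

      # With both repos on a shared folder or NAS, a plain path works as a remote too:
      git remote add share /mnt/share/project.git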

    Read the article

  • Strange issue with WordPress sites - is it PHP memory?

    - by Drai
    This has happened to me twice with the same host, and I want to know the real cause. I have multiple WordPress sites hosted on a shared server. One day, when I attempt to visit any of the sites, the webpage simply downloads the index.php file. It happens on all the WordPress sites but not on the static sites hosted there. I understand this is a PHP issue on the server, but what could be happening specifically? The only thing I could find when searching was something to do with memory limits. Is this common? Should I be worried about this host?

    Read the article

  • Creating Test Sites

    - by Robert
    I have a website running offsite. When we hire someone, I would like to create a test site (a copy of the live site) for the new employee to tinker with. I will need to take fresh copies of the files and database (basically a snapshot) and allow them to access these copied files and the database, so they can edit and upload them and see the changes they made as if it were the live site. Basically, what is the best practice for creating a copy of a website for testing? The server is running Linux, PHP, and MySQL.
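    For a basic snapshot on a Linux/PHP/MySQL host, the usual ingredients are a copy of the document root plus a dump of the live database loaded into a separate test database (with the test site's config pointed at that test DB and at restricted test credentials). Below is a minimal sketch of that idea; every path, database name and user in it is a hypothetical placeholder.

      # Snapshot sketch: copy the files, dump the live DB, load it into a test DB.
      import shutil
      import subprocess
      from datetime import date

      LIVE_DOCROOT = "/var/www/livesite"       # hypothetical
      TEST_DOCROOT = "/var/www/testsite"       # hypothetical
      LIVE_DB, TEST_DB = "live_db", "test_db"  # hypothetical
      DB_USER = "snapshot_user"                # hypothetical

      dump_path = "/var/backups/%s-%s.sql" % (LIVE_DB, date.today())

      # 1. Copy the files the new employee will edit and upload.
      shutil.copytree(LIVE_DOCROOT, TEST_DOCROOT, symlinks=True, dirs_exist_ok=True)

      # 2. Dump the live database (mysqldump will prompt for the password) ...
      with open(dump_path, "wb") as dump:
          subprocess.run(["mysqldump", "-u", DB_USER, "-p", LIVE_DB],
                         stdout=dump, check=True)

      # 3. ... and load it into the separate test database.
      with open(dump_path, "rb") as dump:
          subprocess.run(["mysql", "-u", DB_USER, "-p", TEST_DB],
                         stdin=dump, check=True)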

    Read the article

  • Adding users to multiple/all sites in Google Webmaster Tools?

    - by Christian
    I didn't find an answer to this, but maybe I didn't use the right terms for my search. So I'm sorry if this is a duplicate. Anyway, my situation is this: the company I work at manages a lot of sites (100+), and we've recently put them all into Google Webmaster Tools under my Google account, which was tedious enough. Now two coworkers are supposed to be added as users for each site, so they can see the data and manage stuff there as well. But I can only find an option to add users for a single site, not for all sites that are currently associated with my account. Do I really have to go through more than a hundred sites one by one and add the two users to each of them, or is there some way to add both users to all/multiple sites at once?

    Read the article

  • Apache httpd VirtualHost config - multiple sites

    - by DaFoot
    [Advised to post here from StackExchange] I have a site to work on; because of the way the URLs are built, the application seems to have been created on the assumption that it will be at the server root (the only app). On my dev server I have other projects, and up to now a simple symlink has been working for me, but that's not the case now, because this new app wants to sit at the root and process all URLs arriving on :80. Hopefully this snippet from httpd.conf will help explain what I'm trying to achieve:
      # default for any not matched elsewhere
      <VirtualHost *:80>
          ServerName localhost
          DocumentRoot /var/www/html/newproject
      </VirtualHost>
      # now try to pick out specific URLs
      <VirtualHost localhost/webdev>
          DocumentRoot /var/www/html/existingProject
          ServerName localhost/project
      </VirtualHost>
    I also need to get the same effect from wherever I'm accessing the httpd instance. Hope that makes sense.
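    For reference, VirtualHost only matches on an address and port, never on a URL path, so <VirtualHost localhost/webdev> can't work as written. One common way to get the intended layout is to keep the new project as the default DocumentRoot and map the other project onto a path with Alias; a minimal sketch (paths taken from the question, the rest illustrative):

      <VirtualHost *:80>
          ServerName localhost
          DocumentRoot /var/www/html/newproject

          # Keep the existing project reachable at http://<any-hostname>/webdev
          Alias /webdev /var/www/html/existingProject
          <Directory /var/www/html/existingProject>
              # Apache 2.4 syntax; on 2.2 use "Order allow,deny" plus "Allow from all"
              Require all granted
          </Directory>
      </VirtualHost>

    Because the vhost matches on *:80 regardless of hostname, this behaves the same no matter where the httpd instance is reached from.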

    Read the article
