Search Results

Search found 28303 results on 1133 pages for 'multi site'.


  • Backup Exec 10 - Network connection to the remote agent has been lost

    - by jherlitz
    Okay, so I have four remote offices, all running off of a 3 Mb Ethernet connection. Two sites are part of a WAN, and two sites use 3 Mb connections over a site-to-site tunnel. I am using Backup Exec 2010 and have the remote agent installed on all the remote servers. For the past few weeks, backups of the two sites running over the site-to-site tunnel have been failing with the following error message: "The network connection to the Backup Exec Remote Agent has been lost. Check for network errors." We used to run the site-to-site tunnel over a DSL connection and have since changed to the 3 Mb Ethernet connection. I still have to find out whether it has been failing ever since we changed or only recently. Backup Exec support is telling me it is a network issue, but my connection to the server is solid; we don't have any issues or outages. So I am baffled about why this continues to fail, and why just those two sites. Any advice?

    Read the article

  • Configure one IIS site to handle two separate SSL certificates using external Load Balancing or SSL Acceleration Servers

    - by bmccleary
    I have one web application on our server that needs to be referenced by two different domain names, both of which have their own SSL certificates. The application is exactly the same for both domains, but we have to keep the two domain names for legal reasons. The problem is that, since both domains need their own SSL certificate, our IIS 7.5 configuration has to contain two separate IIS applications (both pointing to the same physical location), each with its own unique IP address and SSL certificate installed. Now, I know that, due to the nature of SSL communications, this is by design and that you can't assign more than one SSL certificate per IP address and domain name. My question is: is there any way around this limitation so that I can keep one web application in IIS and have it service two SSL certificates based on host name? I know that with the basic IIS configuration this is not possible, but I was thinking that with some combination of external load balancing and/or SSL acceleration servers/services we could have those servers process the SSL requests and leave IIS clean with one single application. I am not familiar at all with these technologies, hence the reason I am asking whether it is theoretically possible. If not, does anyone else know how to achieve this?
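
    A minimal sketch of the SSL-offloading idea, assuming an external proxy such as HAProxy terminates SSL in front of IIS (the certificate directory and backend address below are illustrative, not taken from the question):

        frontend https_in
            # with a certificate directory, HAProxy picks the matching certificate
            # per connection (by SNI), so both domains can be served on one address
            bind *:443 ssl crt /etc/haproxy/certs/
            default_backend iis_app

        backend iis_app
            # the single IIS application, reached over plain HTTP behind the proxy
            server iis1 192.168.10.20:80 check

    Whether SNI is acceptable depends on the clients you need to support; if not, the proxy can listen on two IPs with one certificate each and still forward to the same single IIS application.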

    Read the article

  • Any way to stop people from img "framing" your site?

    - by Yegor
    Someone was trying to get cute with me by "iframing" my search result page via an IMG tag with 0 width and 0 height, in hopes of killing my server resources. My searches are cached, so it doesn't do much damage, since it's just a static file being served, but I was wondering if there is anything I can do to "fight back". I know you can use a frame breaker had it been an iframe. Is there anything to do in the case of an image?
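
    One possible approach, sketched here on the assumption that the embedding page sends a Referer header (the offending domain and the search path are made-up placeholders): refuse such requests with mod_rewrite in .htaccess.

        RewriteEngine On
        # requests embedded from the offending site arrive with its Referer
        RewriteCond %{HTTP_REFERER} ^https?://(www\.)?evil-embedder\.example/ [NC]
        # "^search" is a placeholder for your search result path; [F] returns 403
        RewriteRule ^search - [F,L]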

    Read the article

  • What is the best way to archive (spider) a site that is going to be removed?

    - by Guy
    Three different blogs that I read have recently announced that they are going to be discontinued and removed from the web. Although the archived pages will probably stay in Google's cache for a few weeks after they're gone, and some of the pages will be in the Wayback Machine, I'd like to archive those sites to my hard disk for future reference. What is the best way to do this? Is there any software that transforms a blog (e.g. Blogspot) into a chronological PDF?
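
    A minimal sketch using GNU wget's standard mirroring flags (the URL is a placeholder); this produces a browsable local copy rather than a PDF:

        wget --mirror --convert-links --adjust-extension --page-requisites \
             --no-parent --wait=1 http://example.blogspot.com/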

    Read the article

  • Why can't I download the JDK from the Oracle web site directly, without AuthParam?

    - by hugemeow
    That is, why does downloading with the following command fail?

        wget http://download.oracle.com/otn-pub/java/jdk/6u35-b10/jdk-6u35-linux-i586.bin

    The following command works, but the AuthParam value stops working after a while. Why?

        wget http://download.oracle.com/otn-pub/java/jdk/6u35-b10/jdk-6u35-linux-i586.bin?AuthParam=1346955572_27e44512fe8ef5cb920c4c329e5f0fd8

    How is this AuthParam option implemented? Why can't I download without this parameter, and why can I only obtain it with a browser? Is a rewrite rule used on the Oracle server to handle the wget request? Why does the same command stop working after an hour? Has the AuthParam value expired, and if so, how does the server check whether it has expired?

        wget http://download.oracle.com/otn-pub/java/jdk/6u35-b10/jdk-6u35-linux-i586.bin?AuthParam=1346955572_27e44512fe8ef5cb920c4c329e5f0fd8
        --2012-09-07 03:51:01-- http://download.oracle.com/otn-pub/java/jdk/6u35-b10/jdk-6u35-linux-i586.bin?AuthParam=1346955572_27e44512fe8ef5cb920c4c329e5f0fd8
        Resolving download.oracle.com... 23.67.251.50, 23.67.251.57
        Connecting to download.oracle.com|23.67.251.50|:80... connected.
        HTTP request sent, awaiting response... 403 Forbidden
        2012-09-07 03:51:01 ERROR 403: Forbidden.

    @KJ-SRS Is it some kind of CGI program that judges whether AuthParam is right? Is it possible to download the JDK package purely with the wget command, without getting the AuthParam from a browser?
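
    A hedged sketch of a possible workaround: Oracle's download protection has, for some releases, accepted a license-acceptance cookie in place of the browser-generated AuthParam token. Whether this works for this particular JDK build is an assumption, since Oracle changes the mechanism over time:

        wget --no-cookies \
             --header "Cookie: oraclelicense=accept-securebackup-cookie" \
             http://download.oracle.com/otn-pub/java/jdk/6u35-b10/jdk-6u35-linux-i586.bin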

    Read the article

  • My site was recently attacked. What do I do?

    - by ChrisH
    This is a first for me. One of the sites I run was recently attacked. Not at all an intelligent attack: pure brute force, hitting every page and every non-page with every extension possible, posting garbage data to every form, and trying to post to some random URLs too. All told, 16,000 requests in one hour. What should I do to prevent this kind of behavior or be alerted to it? Is there a way to limit the requests per hour for a given IP/client? Is there a place I should report the user to? They appear to be from China and did leave what seems like a valid e-mail.
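
    One way to rate-limit by source address, sketched with the iptables "recent" match (the thresholds are illustrative assumptions; the module's default limits cap the hit count around 20):

        # track new connections to port 80 per source IP
        iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --set --name HTTP
        # drop further new connections from an IP that opened 20 or more in the last minute
        iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
                 -m recent --update --seconds 60 --hitcount 20 --name HTTP -j DROP

    Tools such as fail2ban or mod_evasive work along the same lines at the log or web-server level.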

    Read the article

  • Is it possible to rsync your web site to another backup server and use the same .htaccess files?

    - by stephenmm
    I am trying to use rsync to replicate all the files from one web server to another server that could act as a backup if the first one went down. The problem I am having is that the .htaccess file requires the AuthUserFile directive to contain the fully qualified path to the .htpasswd file, and I cannot make the paths the same on the two machines. Does anyone know how I might use the same .htaccess file on two different servers? Thanks for any help that can be provided.
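
    A minimal sketch, assuming the only difference between the machines is the web root path: add a symlink on the backup host so the path named in AuthUserFile exists on both machines (the directory names below are placeholders):

        # on the backup server, whose real web root is /srv/www instead of /var/www
        ln -s /srv/www /var/www
        # the same .htaccess line then resolves on both hosts:
        #   AuthUserFile /var/www/mysite/.htpasswd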

    Read the article

  • This web site needs a different Google Maps API key. A new key can be generated at http://code.googl

    - by MJI
    Apologies in advance if this is the wrong place to post. I tried searching for this issue, and all that seemed to come up were questions posted by people who had this issue with their own web pages; I couldn't find questions about it from a layperson's perspective. I'm not a developer. I have no domain, nor do I wish to have one at this time. Rather, I'm just a regular person who likes to upload photos to some photo-related sites. My uploading process constantly gets interrupted by one of these annoying API errors. I get it at least twice: once when I click the page to upload, and again right after the photo has uploaded. It also pops up if I go to edit a photo or delete it. This interrupts my browsing until I click OK. I just want a fix for the annoyance without having to register for a key. I tried before, and it required a web domain; I'd rather not have to create a domain and jump through such hoops just to fix this. Is there a solution to this problem that doesn't require registration? Another thing to note: I have used two computers. One has the message pop up and the other doesn't. What is different about the two computers?

    Read the article

  • Will my internet address for my internal site cause my traffic to go external?

    - by Toby Allen
    If I have two domain names pointing to the same machine, but one resolves to an internal address and the other to my Internet-facing router, will there be any difference in the route taken to my machine (primarily in terms of performance)? E.g.: internal.mydomain.com resolves to 192.168.1.200, external.mydomain.com resolves to A.Web.External.IP, and both eventually resolve back to the same machine. For a client on the network, will using the external address give a performance penalty?
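
    A quick way to see the path actually taken from a LAN client, assuming standard dig/traceroute tooling is available (the host names are taken from the question):

        dig +short internal.mydomain.com
        dig +short external.mydomain.com
        traceroute internal.mydomain.com
        # extra hops here usually mean the router is hairpinning the traffic
        traceroute external.mydomain.com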

    Read the article

  • How would you change a home wireless router with a self-signed admin site certificate to be more secure?

    - by jldugger
    littleblackbox is publishing "private keys" that are accessible in publicly available firmwares. Debian calls these "snake-oil" certs. Most of these routers secure their HTTPS admin pages with them, and as I think about it, I've never seen one of these internal admin websites with a cert that wasn't self-signed. Given a web server on IP 192.168.1.1, how do you secure it to the point that Firefox doesn't offer warnings (and it is still secure)?
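
    A minimal sketch of one approach, assuming the router's firmware lets you replace its key and certificate: generate a private CA, sign a certificate for the router's address, and import the CA into Firefox (file names are illustrative):

        # private CA
        openssl genrsa -out myca.key 2048
        openssl req -new -x509 -days 3650 -key myca.key -out myca.crt -subj "/CN=Home Router CA"
        # key and certificate for the router's admin interface
        openssl genrsa -out router.key 2048
        openssl req -new -key router.key -out router.csr -subj "/CN=192.168.1.1"
        openssl x509 -req -in router.csr -CA myca.crt -CAkey myca.key -CAcreateserial \
                -days 3650 -out router.crt
        # install router.key/router.crt on the router, then import myca.crt into
        # Firefox under Preferences -> Certificates -> Authorities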

    Read the article

  • FTP upload for a PHP file hosting site: how to connect ProFTPD to a MySQL database?

    - by Igor
    I'm running a file upload service, and users have requested FTP upload features. Basically, I need to allow users to log in via FTP to an FTP daemon (say, ProFTPD), and they should be able to use their username and password (stored in a MySQL database) to log in there. After logging in, I'll take care of the files with a cron job. I'm stuck on how to make ProFTPD get users and passwords from my database. Any ideas?
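
    A hedged sketch of the relevant proftpd.conf directives for mod_sql/mod_sql_mysql; the database name, credentials, and the table/column names are assumptions that must match your existing users table:

        SQLBackend       mysql
        SQLEngine        on
        SQLAuthTypes     Crypt Plaintext
        # database@host, then the MySQL user and password proftpd connects with
        SQLConnectInfo   uploads@localhost ftpuser ftppassword
        # table and columns: name, password, uid, gid, home directory, shell
        SQLUserInfo      users username passwd uid gid homedir shell
        SQLAuthenticate  users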

    Read the article

  • I am starting to think that Prevx.com isn't a legit site... but here's my long-winded question

    - by cop1152
    I apologize in advance for the long-winded post. I posted it all because I believe it's informative and may be useful; my actual question is at the end. Moments ago I was connected via RDC to a file server in my home (from inside my home). I had opened Firefox and Googled for a manufacturer's website. Immediately after clicking the link, Firefox abruptly closed. This seemed odd to me, so I checked the running processes and discovered d.exe, e.exe, and f.exe running. I Googled these processes on a different machine and found them belonging to a key-logger/screen-capturer/trojan called defender.exe, which according to Prevx lives in c:\documents and settings\user\local settings\temp. (Prevx link: http://www.prevx.com/filenames/147352809685142526-X1/DEFENDER32.EXE.html) Simultaneously, an obviously spoofed Windows Firewall popup appeared on the server asking me to click 'yes' to update Windows Firewall. At this point I ended all rogue processes, emptied the temp folder, removed defender.exe from startup, and checked my registry and a few other locations. Before deleting defender.exe I noted that it was created moments ago, just before Firefox crashed. I believe that I was 'almost' infected with this malware: I believe it needed me to click the phony popup in order to complete the infection, because it wasn't allowed to execute processes from the temp folder. After cleaning the machine, I restarted it and have been monitoring it for over an hour. I am debating whether to restore the Windows partition (a separate physical drive from the data) or to just watch it for a while. I should mention that, because of the specs on this machine, I do not run antivirus software, but I know it well and inspect it regularly. It is a very old Compaq with a 400 MHz processor and 512 MB of RAM. I have a static IP, and the server is in the DMZ running an FTP client and some HTTP server software. All files transferred to and stored on this machine are scanned for malware before transferring. Usually the machine only runs 19 processes and performs pretty well for its intended purpose. I posted the story so that you could be aware of a possible new piece of malware and how it acts, but I also have a question or two. First, over the last few months I have noticed that Prevx is listed at the top of most of my Google searches when researching malware, especially new or obscure malware, and they always want you to purchase something. I don't think they are one of the top AV companies, so it seems odd that they are always the top Google result. Does anyone have any experience with any of their products? Also, what sites do you rely on for malware research? Recently, I have found it difficult to find good info because HijackThis logs and other dead-end info clutter up my searches. And lastly, besides antivirus, a third-party firewall, etc., what settings would you use to lock down a machine to make it more secure in instances where a stubborn admin like myself refuses to run AV? Thanks.

    Read the article

  • Web site kills hard disk I/O; how do I prevent it?

    - by Taras Voynarovsky
    The situation: I have a server on which we run two or three projects. Not long ago, the server started hanging up: we could not connect to it by SSH, and connected clients had to wait 20 minutes for top to give results. Earlier today I managed to execute gstat while it was in this state and saw that it stays at 100% on da0, da0s1 and da0s1f. I don't quite know what those IDs mean, but I understand that some process is just killing the disk by bombarding it with requests. I'm asking for suggestions; I don't know how to find the culprit and can't prevent this. The server runs FreeBSD.
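
    A small sketch of how the culprit could be identified on FreeBSD, using standard base-system tools (exact output formats vary by release):

        # switch top into I/O mode and sort by total operations per process
        top -m io -o total
        # per-device view of the same load (da0, da0s1, ... are the disk and its slices)
        gstat -a
        iostat -x 5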

    Read the article

  • How can HAProxy improve availibility, or "how can I prevent my site from going down"? [closed]

    - by Joe Hopfgartner
    I am aware of what HAProxy does, but what if my HAProxy goes down? Or what if my DNS servers go down? Yes, DNS is less of a problem. However, DNS only resolves to an IP, and an IP is announced via BGP to be routed over some router. What if that router goes down? Of course, if I had complicated application servers that were likely to fail, HAProxy could significantly improve uptime. But my application isn't like that; in fact, it may very well just be delivering a small static HTML file via HTTP. Basically, if any user anywhere types in MYDOMAIN.COM, I want the user to get SOMETHING on the screen other than a timeout or DNS resolution error. How can I do that? The point of entry is the difficult part, the so-called "initial closure mechanism".
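
    One common way to take a single HAProxy box out of the failure path, sketched with keepalived/VRRP (the interface name, router id, and addresses are assumptions): two HAProxy hosts share a floating IP, and the survivor takes the address over when the other dies.

        vrrp_instance HAPROXY_VIP {
            state MASTER              # BACKUP on the second node
            interface eth0
            virtual_router_id 51
            priority 101              # lower (e.g. 100) on the backup node
            advert_int 1
            virtual_ipaddress {
                203.0.113.10/24
            }
        }

    Failure of the router or of the whole site is a separate problem; that is where multiple DNS records, anycast, or a second location come in.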

    Read the article

  • Why are my DNS Lookups so long (300+ms) when accessing my web site?

    - by Travis
    I'm running a Fedora 11 server with Apache 2. I'm trying to optimize so things are as fast as possible from the server side, and I'm noticing (via Firebug for Firefox) that upon loading the homepage of one of the sites on the web server, for every file it loads (HTML, CSS, JavaScript, GIF, PNG, JPG, etc.) it does a DNS lookup. All of the files it is looking up are local to the server, so I'm surprised to see it do a DNS lookup at all. Also, each of these lookups is in the 150-450 ms range, which is way too high for my liking. I've tried adjusting /etc/resolv.conf to use Google's Public DNS servers; I restarted the network service and loaded the page again, but the numbers didn't go down. I've since reverted to the default DNS servers because I didn't see any gain. Any ideas on what is causing it to a) do the DNS lookup in the first place, and b) take so long when doing the actual lookup? Thanks in advance.
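
    A quick, hedged check to see whether the slowness is in DNS resolution itself: time the lookup outside the browser, on the client that shows the slow timings (the host name is a placeholder):

        dig www.example-site.com | grep "Query time"
        time getent hosts www.example-site.com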

    Read the article

  • Is there a way to make IETab always open a site in Internet Explorer?

    - by reg
    Hello, IETab offers a setting that lets you define sites that should always be opened using IE's rendering engine. This is great and I use it all the time. There are a few sites, though, with which I encounter problems when I access them this way (needless to say that these have some IE-only features, otherwise I would simply use Firefox without IETab). So my question is: is there a way to make IETab (or any other extension that does the same thing) automatically open a specific list of sites in IE? To be clear: I want a way to specify a list of sites that will open an external IE process from Firefox, when clicking on those links or bookmarks from within Firefox.

    Read the article

  • Scientific calculator app/site that saves calculations for Win 7?

    - by verve
    Are there any scientific calculator apps that aren't browser-specific, are easy to use, clearly display whatever is entered, and can be driven entirely by keyboard or entirely by mouse? I also want one that saves the history of calculations and lets me paste it into a document at another time. Also, please don't recommend ones from sketchy-looking sites. Freeware or paidware; I prefer paidware if it's the only way to guarantee accurate calculations. Win 7 Pro.

    Read the article

  • How should I interpret site analytics with 11 pageviews in a 3-second visit?

    - by Juank
    I'm using Google Analytics, and recently I've noticed some weird trends. I have a lot of visits that last mere seconds but record several page views, more than a normal human can see in that amount of time. A specific case: the only visitor I've had from Ireland recorded 11 pageviews in a 3-second visit. Are these crawlers? Shouldn't Google Analytics filter those out?

    Read the article

  • If I let Google handle my emails for my domain, my Wordpress site won't send out emails anymore

    - by Fulvio
    Since I decided to let Google handle all email for my domain, while the domain is hosted on a third-party server, emails sent out by a WordPress installation no longer work. My guess is that, since all email is being routed to Google, the account on that server for that domain is unable to send out emails. I definitely want to keep using Google services for my email, since it comes with all the advantages of a Google account, but I need my WordPress installation to send out administrative emails. I run my server with cPanel. How do I configure that account and/or WordPress to keep it able to send out emails? I don't need people to answer the emails sent from the server (I might set a reply-to address eventually). Thanks
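
    One commonly used approach, sketched as a hedged example: have WordPress send through Gmail's SMTP servers via the standard phpmailer_init hook, placed in a small plugin or mu-plugin (the credentials and address below are placeholders):

        <?php
        // route all wp_mail() traffic through Gmail SMTP instead of the local MTA
        add_action( 'phpmailer_init', function ( $phpmailer ) {
            $phpmailer->isSMTP();
            $phpmailer->Host       = 'smtp.gmail.com';
            $phpmailer->Port       = 587;
            $phpmailer->SMTPSecure = 'tls';
            $phpmailer->SMTPAuth   = true;
            $phpmailer->Username   = 'admin@yourdomain.example';
            $phpmailer->Password   = 'app-specific-password';
        } );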

    Read the article

  • Something keeps changing the default permissions of my settings.php file for a web site on Linux

    - by JrSysAdmin
    I keep changing the file permissions for /var/www/html/websitename/settings.php to 775, and within 15 minutes or so they automatically change back to 555. The owner of the file is "apache" and the group owner is our Linux developers group, just like all of the other files, which are not having this issue. Obviously there must be some process running that is automatically changing the file permissions (Apache, maybe?), but I haven't been able to figure out what. Any help would be greatly appreciated.
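
    One way to catch the process responsible, sketched with the Linux audit framework (assumes auditd is installed and running; the key name is arbitrary):

        # watch the file for writes and attribute (permission) changes
        auditctl -w /var/www/html/websitename/settings.php -p wa -k settings_perms
        # after the permissions flip back, see which process did it
        ausearch -k settings_perms -i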

    Read the article

  • Multiple SSL certificates on Apache using multiple public IPs - not working

    - by St. Even
    I need to configure multiple SSL certificates on a single Apache server. I already know that I need multiple external IP addresses, as I cannot use SNI (I'm only running Apache 2.2.3 on this server). I assumed that I had everything configured correctly, but unfortunately things are not working as they should (or maybe I should say, as I expected them to work)... In my httpd.conf I have:

        NameVirtualHost *:80
        NameVirtualHost *:443

    Let's say my public IP is 12.0.0.1 and my private IP is 192.168.0.1. When I use the public IP in my vhost, my default website is shown instead of the one defined in the vhost, e.g.:

        <VirtualHost 12.0.0.1:443>
            ServerAdmin [email protected]
            ServerName blablabla.site.com
            DocumentRoot /data/sites/blablabla.site.com
            ErrorLog /data/sites/blablabla.site.com-error.log
            #CustomLog /data/sites/blablabla.site.com-access.log common
            SSLEngine On
            SSLCertificateFile /etc/httpd/conf/ssl/blablabla.site.com.crt
            SSLCertificateKeyFile /etc/httpd/conf/ssl/blablabla.site.com.key
            SSLCertificateChainFile /etc/httpd/conf/ssl/blablabla.site.com.ca-bundle
            <Location />
                SSLRequireSSL On
                SSLVerifyDepth 1
                SSLOptions +StdEnvVars +StrictRequire
            </Location>
        </VirtualHost>

    When I use the private IP in my vhost, everything works as it should (the website defined in the vhost is shown), e.g.:

        <VirtualHost 192.168.0.1:443>
            ...same as above...
        </VirtualHost>

    My server is listening on all interfaces:

        [root@grbictwebp02 httpd]# netstat -tulpn | grep :443
        tcp   0   0 0.0.0.0:443   0.0.0.0:*   LISTEN   5585/httpd

    What am I doing wrong? If I cannot get this to work, I cannot continue to add the second SSL certificate on the other public IP... If more information is required, just let me know!
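
    A hedged observation, sketched below on the assumption that 12.0.0.1 exists only on a NAT router in front of the server: in that case the packets arrive addressed to 192.168.0.1, so a <VirtualHost 12.0.0.1:443> block can never match and Apache falls back to the default site. Binding each certificate to a distinct local address that the router forwards is one way to keep one certificate per vhost (192.168.0.2 and the second site name here are invented placeholders):

        <VirtualHost 192.168.0.1:443>
            ServerName blablabla.site.com
            SSLEngine On
            SSLCertificateFile    /etc/httpd/conf/ssl/blablabla.site.com.crt
            SSLCertificateKeyFile /etc/httpd/conf/ssl/blablabla.site.com.key
        </VirtualHost>

        <VirtualHost 192.168.0.2:443>
            ServerName othersite.example.com
            SSLEngine On
            SSLCertificateFile    /etc/httpd/conf/ssl/othersite.example.com.crt
            SSLCertificateKeyFile /etc/httpd/conf/ssl/othersite.example.com.key
        </VirtualHost>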

    Read the article

  • How to block a specific directory of a web site in Windows?

    - by Creedy
    How can I block a specific directory of a website, e.g. http://example.com/someSite? Unfortunately the hosts file is not an option, since you can only block whole domains there and any "/" breaks those rules. This is just for my personal protection against visiting some sites too often, while I still have to be able to get to the other parts of that domain (e.g. example.com/someOtherSite). It would be great if someone knows a solution for this.
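
    One possible approach, sketched as a proxy auto-config (PAC) file: unlike the hosts file, a PAC script sees the full URL, so a single path can be routed to a dead proxy while the rest of the domain loads normally. Point Windows' "use automatic configuration script" setting at a file like this (a hedged example; note that for HTTPS pages many browsers only pass the host, not the path):

        function FindProxyForURL(url, host) {
            // send only this path to a proxy that doesn't exist, so the request fails
            if (shExpMatch(url, "*example.com/someSite*")) {
                return "PROXY 127.0.0.1:9";
            }
            return "DIRECT";
        }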

    Read the article
