Search Results

Search found 19155 results on 767 pages for 'url redirection'.


  • Rsync --backup-dir seems to be ignored

    - by Patrik
    I want to use rsync to back up a directory from a local location to a remote location, and store changed files in another remote location. I used: rsync -rcvhL --progress --backup --backup-dir=[email protected]:/home/user/Changes/`date +%Y.%m.%d` . [email protected]:/home/user/Files/ The --backup-dir stays empty when it should be getting filled. Is what I'm trying to accomplish possible, and am I doing something wrong? Thanks
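    A minimal sketch of the usual arrangement, offered as an assumption since the original hosts are obfuscated: --backup-dir takes a plain path, which rsync resolves on the receiving side, so it cannot point at a second remote host. A relative path is taken relative to the destination directory.

        # push local files to the remote, parking replaced files in a dated
        # directory on that same remote (host and paths are placeholders)
        rsync -rcvhL --progress --backup \
            --backup-dir=/home/user/Changes/$(date +%Y.%m.%d) \
            ./ user@remote:/home/user/Files/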


  • How can I make Firefox do this?

    - by gylns
    I'm using Vimperator with Firefox, and I find I always type the same URL. For example, I type "http://gigapedia.info/1/@title C++" to search for an ebook with "C++" in the title. How can I just type "C++" and have it expand to the complete URL?
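    A sketch using Firefox's built-in keyword bookmarks, which also work from Vimperator's :open prompt; the keyword "gp" here is just an example. Create a bookmark whose location contains %s where the query should go, and assign it a keyword:

        Location: http://gigapedia.info/1/@title %s
        Keyword:  gp

    Typing "gp C++" in the location bar then expands to the complete search URL.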


  • Copy and paste twice

    - by Shane
    Is there a way to copy and paste twice? For example, is there a way for me to copy one URL, store it, then copy another URL, and then have the URLs pasted respectively? I read somewhere that this is possible, but I have not been able to figure it out.


  • Restart browser on error

    - by billfredtom
    Is there a way to restart a browser (Internet Explorer, Firefox, etc.) automatically upon receiving an error such as a DNS error or Page Not Found? Background: this is for a web-based signage application that has intermittent network drop-outs. If there's no internet when the sign tries to go to the next page, an error appears in the browser, and we get one big ugly sign. We need a way to either restart the browser, or customise the displayed error messages to use JavaScript to try redirecting back to the live sign.
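    One rough watchdog sketch, assuming a Linux-based sign running Firefox; the URL, browser command and timings are all placeholders to adapt:

        #!/bin/sh
        # Hypothetical watchdog: poll the sign URL and relaunch the browser
        # whenever the network (or the page) stops responding.
        SIGN_URL="http://sign.example.com/current"
        while true; do
            if ! curl -fsS --max-time 10 "$SIGN_URL" >/dev/null 2>&1; then
                pkill firefox                # drop the browser showing the error page
                sleep 5
                firefox "$SIGN_URL" &        # relaunch pointing at the live sign
            fi
            sleep 30
        done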


  • programmatically check if a domain is available?

    - by acidzombie24
    Using this solution http://serverfault.com/questions/98940/bot-check-if-a-domain-name-is-availible/98956#98956 I wrote a quick script (pasted below) in C# to check if a domain MIGHT be available. A lot of the results come up as taken domains. It looks like all two- and three-letter .com domains are taken (not counting names containing numbers, many of which are available). Is there a command or website that can take my list of domains and check whether each one is registered or available?

        using System;
        using System.Collections.Generic;
        using System.Diagnostics;
        using System.IO;

        namespace domainCheck
        {
            class Program
            {
                static void Main(string[] args)
                {
                    var sw = (TextWriter)File.CreateText(@"c:\path\aviliableUrlsCA.txt");
                    int countIndex = 0;
                    int letterAmount = 3;
                    char[] sz = new char[letterAmount];
                    for (int z = 0; z < letterAmount; z++)
                        sz[z] = '0';

                    List<string> urls = new List<string>();
                    int i = 0;
                    while (i < letterAmount)
                    {
                        // advance the candidate name like an odometer
                        // through the digits, 'a'-'z' and '-'
                        if (sz[i] == '9')
                            sz[i] = 'a';
                        else if (sz[i] == 'z')
                        {
                            if (i != 0 && i != letterAmount - 1)
                                sz[i] = '-';
                            else
                            {
                                sz[i] = 'a';
                                i++;
                                continue;
                            }
                        }
                        else if (sz[i] == '-')
                        {
                            sz[i] = 'a';
                            i++;
                            continue;
                        }
                        else
                            sz[i]++;

                        string uu = new string(sz);
                        string url = uu + ".ca";
                        Console.WriteLine(url);

                        // shell out to nslookup and scan stderr for a failed lookup
                        Process p = new Process();
                        p.StartInfo.UseShellExecute = false;
                        p.StartInfo.RedirectStandardError = true;
                        p.StartInfo.RedirectStandardOutput = true;
                        p.StartInfo.FileName = "nslookup"; // no trailing space, or the process won't start
                        p.StartInfo.Arguments = url;
                        p.Start();
                        var res = p.StandardError.ReadToEnd();
                        if (res.IndexOf("Non-existent domain") != -1)
                        {
                            sw.WriteLine(uu);
                            if (++countIndex >= 100)
                            {
                                sw.Flush();
                                countIndex = 0;
                            }
                            urls.Add(uu);
                            Console.WriteLine("Found domain {0}", url);
                        }
                        i = 0;
                    }
                    Console.WriteLine("Writing out list of urls");
                    foreach (var u in urls)
                        Console.WriteLine(u);
                    sw.Close();
                }
            }
        }
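    A hedged side note on the approach: whois queries registration data directly, whereas nslookup only proves that a name has no DNS records, so a registered but unused domain would wrongly look available. The "not found" marker varies by registry, so the grep pattern below is an assumption:

        # read candidate names (one per line) and report likely-unregistered ones
        while read -r name; do
            if whois "$name.ca" | grep -qiE "not found|no match"; then
                echo "$name.ca appears unregistered"
            fi
            sleep 2    # be polite; whois servers rate-limit aggressive querying
        done < domains.txt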


  • google.com redirecting to local HTML copy

    - by Sneha
    When I type http://google.com and press Enter in any of my browsers (Mozilla, Chrome), the URL bar suddenly transforms into this: file:///C:/Users/Administrator/AppData/Roaming/Google_Toolbar/Google_Toolbar/1.0.0.0/MyGoogle.html Even after this I can continue to search in Google, but the URL bar still shows the local address. Surprisingly, this is happening only for google.com and not for other sites.


  • Windows 2008 IIS 7.0 HTTP to HTTPS Redirect -- Versus IIS 6.0 Mechanism

    - by Dan7el
    This topic, creating a mechanism for redirection from HTTP to HTTPS on a Windows 2008 server running IIS 7.0, is a much-written-about topic on the Internet. How this is done is not really my issue. My issue is more about explaining why this can't be done with the standard HTTP Redirect module that ships with IIS 7.0 on Windows 2008, and why other, more arduous methods are needed instead. First, the IIS 6.0 method requires no externally available modules, nor does it require any modifications to web.config or any other development effort. It's outlined here: http://blogs.microsoft.co.il/blogs/dorr/archive/2009/01/13/how-to-force-redirection-from-http-to-https-on-iis-6-0.aspx As you can see, the basic steps are to run the snap-in, open the properties of the site, and make some modifications. Presto, you have the HTTP to HTTPS redirect set up. On the IIS 7.0 platform, it doesn't seem this simple. An initial search found the following site: http://www.sslshopper.com/iis7-redirect-http-to-https.html which has two separate approaches: 1. Install the separately available Microsoft URL Rewrite Module, then add XML to web.config. 2. Use a custom error page. There might be other methods, but these are the basic ones, and the first is listed as the primary method. But wait: IIS 7.0 does ship with an HTTP Redirect module. So why can't I use the HTTP Redirect module to do this very thing? That is really my big question. I need to know because my management is going to insist that I use the HTTP Redirect module and set up the HTTP to HTTPS redirect the same way we do it in IIS 6.0. Can someone please explain, in clean, simple, easy-to-understand terms that both I and my management can follow, why I need to install the URL Rewrite Module on the server and make the web.config changes suggested by the article, instead of simply using the HTTP Redirect module that's already installed on the site? Thanks a bunch.
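    For reference, the web.config rule involved in the URL Rewrite approach is short. This is a sketch of the commonly published form, assuming the module is installed:

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="HTTP to HTTPS" stopProcessing="true">
                <match url="(.*)" />
                <conditions>
                  <!-- only fire for plain-HTTP requests -->
                  <add input="{HTTPS}" pattern="off" />
                </conditions>
                <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>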


  • Unextending Sharepoint 2007 Web Application from a zone

    - by dunxd
    When our SharePoint installation was migrated from SharePoint 2003 to SharePoint 2007 (both fully paid versions), the consultants who carried it out extended each web application into two IIS sites/zones (e.g. for the original web app http://intranet, both http://newintranet and http://intranet would be created for SharePoint 2007, each with its own IIS site). The idea was that during the migration period we would set up DNS to point the old URL to the SP2003 servers and the new one to SP2007; then, once the migration was complete, make a DNS change so that SP2007 would receive the requests to the http://intranet type URLs. Unfortunately the contractors did not tidy up the application extensions and IIS sites after the migration, and for some time both URLs were in use, resulting in many document links pointing to the http://newintranet type URLs. This means I need to maintain these URLs. Due to a rejig of our organisation structure we now need to relocate some SharePoint sites, and I'd like to use the RDA Collaboration SharePoint URL Redirector feature. However, a limitation is that it doesn't work for web applications which have been extended into multiple zones. So I need to tidy up the situation our consultants left behind. I think the right thing to do is use the "Remove SharePoint from IIS Web Site" page in Central Admin to remove the zone for the newintranet type sites, and select the option to also delete the IIS site. That should result in no IIS sites listening for http://newintranet type URLs. Is this the right procedure? Once I have done that, I need to set up SharePoint to receive requests sent to the http://newintranet type URLs so they continue to work. I am not sure whether I should do this: using Alternate Access Mappings; or by adding a host header to the IIS site; or by creating a non-SharePoint IIS site for each http://newintranet type URL and using IIS redirection to forward the requests to the new URL, using variables to pass the path to the SharePoint site. Does anyone have any thoughts on these options, or any other way of achieving this? SharePoint 2007 is running on Windows 2003 with IIS 6. We don't currently have plans/budget to upgrade to SharePoint 2010.


  • Multiple Instances Of The Same Computer Under Network

    - by Reafidy
    Can anyone tell me why we have multiple instances of the same computer (SALLY) under Network in the Open File dialog? Please see the image below. This is not an issue in itself, but I am wondering if it is related to some file corruption issues we have been having lately. All PCs are Windows 7. The server is Windows Server 2008 R2. We are using folder redirection, roaming profiles and offline files.


  • IE Behaviors in my system

    - by Dharani
    When I type a URL, say www.google.com, in IE (installed and tested with IE 6.0/7.0/8.0), it does not recognize the URL on the first attempt; when I type it a second time, it goes to the page. But when I test in other browsers like Firefox and Chrome, it works without problems. I scanned my system with Norton and Kaspersky, and they do not report any virus. I am using Windows XP Service Pack 2. Is my system affected by something like the DoubleClick virus?


  • Force caching of handler output which actively resists caching

    - by deceze
    I'm trying to force caching of a very obnoxious piece of PHP script which actively resists caching, for no good reason, by setting all the anti-cache headers:

        Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
        Content-Type: text/html; charset=UTF-8
        Date: Thu, 22 May 2014 08:43:53 GMT
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Last-Modified:
        Pragma: no-cache
        Set-Cookie: ECSESSID=...; path=/
        Vary: User-Agent,Accept-Encoding
        Server: Apache/2.4.6 (Ubuntu)
        X-Powered-By: PHP/5.5.3-1ubuntu2.3

    If at all avoidable I do not want to modify this 3rd-party piece of code; instead I just want Apache to cache the page for a while. I'm doing this very selectively, for only a few specific pages which have no real impact on session cookies or the like, i.e. which do not contain any personalised information.

        CacheDefaultExpire 600
        CacheMinExpire 600
        CacheMaxExpire 1800
        CacheHeader On
        CacheDetailHeader On
        CacheIgnoreHeaders Set-Cookie
        CacheIgnoreCacheControl On
        CacheIgnoreNoLastMod On
        CacheStoreExpired On
        CacheStoreNoStore On
        CacheLock On
        CacheEnable disk /the/script.php

    Apache is caching the page all right:

        [cache:debug] AH00698: cache: Key for entity /the/script.php?(null) is http://example.com:80/the/script.php?
        [cache_disk:debug] AH00709: Recalled cached URL info header http://example.com:80/the/script.php?
        [cache_disk:debug] AH00720: Recalled headers for URL http://example.com:80/the/script.php?
        [cache:debug] AH00695: Cached response for /the/script.php isn't fresh. Adding conditional request headers.
        [cache:debug] AH00750: Adding CACHE_SAVE filter for /the/script.php
        [cache:debug] AH00751: Adding CACHE_REMOVE_URL filter for /the/script.php
        [cache:debug] AH00769: cache: Caching url: /the/script.php
        [cache:debug] AH00770: cache: Removing CACHE_REMOVE_URL filter.
        [cache_disk:debug] AH00737: commit_entity: Headers and body for URL http://example.com:80/the/script.php? cached.

    However, it always insists that the "cached response isn't fresh" and never serves the cached version. I guess this has to do with the Expires header, which marks the document as expired (but I don't know whether that's the correct assumption). I've tried to overwrite and unset headers using mod_headers, but this doesn't help; whatever combination I try, the cache is not impressed at all. I'm guessing that the order of operations is wrong and the headers are being rewritten after the cache sees them. "Early" header processing doesn't help either. I've experimented with CacheQuickHandler Off and with setting explicit filter chains, but nothing is helping. I'm really mostly poking in the dark, as I do not have much experience configuring Apache filter chains. Is there a straightforward solution for caching this obnoxious piece of code?


  • Question about RewriteRule and HTTP_HOST server variable

    - by SeancoJr
    In evaluating a rewrite rule that redirects to a specific URL when the rewrite condition is met, would it be possible to use HTTP_HOST as part of the URL being redirected to? Example in question: RewriteRule .*\.(jpg|jpe?g|gif|png|bmp)$ http://%{HTTP_HOST}/no-leech.jpg [R,NC] The motive behind this question is a desire to create a single .htaccess file that matches against an addon domain (on a shared hosting account) and an unlimited number of subdomains below it, to prevent hotlinking of images.
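    Yes, %{HTTP_HOST} is expanded in the substitution of a RewriteRule. One caveat in a sketch like this: the placeholder image must be excluded, or the rule redirects requests for no-leech.jpg to itself in a loop. The referer-to-host comparison below is a common mod_rewrite idiom, shown as an assumption to adapt:

        RewriteEngine On
        # never re-match the placeholder itself (avoids a redirect loop)
        RewriteCond %{REQUEST_URI} !no-leech\.jpg [NC]
        # allow empty referers (direct requests, some proxies)
        RewriteCond %{HTTP_REFERER} !^$
        # block referers whose host does not match the requested host
        RewriteCond %{HTTP_REFERER}@@%{HTTP_HOST} !^https?://(?:www\.)?([^/:]+).*@@(?:www\.)?\1$ [NC]
        RewriteRule \.(jpe?g|gif|png|bmp)$ http://%{HTTP_HOST}/no-leech.jpg [R,NC,L]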


  • run browser from remote machine

    - by cathy02
    Hello, I can connect to a remote machine using ssh, and I would like to open a URL from the remote machine. I can do this using lynx (a command-line browser), but then JavaScript etc. gets messed up. Is there a way I can open the URL and see it in my local machine's Firefox browser, for example? Thanks
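    A sketch of two standard ssh options, with hostnames as placeholders: -X runs the remote machine's Firefox but displays it locally (requires X11 forwarding to be enabled on the server), while -D keeps the browser local and only routes its traffic through the remote machine.

        # Option 1: run the remote browser, display it on the local X server
        ssh -X user@remotehost firefox http://example.com

        # Option 2: open a local SOCKS proxy through the remote machine, then
        # point local Firefox at SOCKS host 127.0.0.1, port 1080
        # (Preferences -> Network -> Manual proxy configuration)
        ssh -D 1080 user@remotehost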


  • What is the recommended approach to add static subdomains to a website?

    - by shg
    I would like to create a few static subdomains like mycategory.mydomain.com on a rather small website, and have each point to the folder mydomain.com/mycategory without showing the redirection in the browser address bar. What is the easiest way to achieve this? I can do it in IIS settings, ASP.NET, C# code, etc. I guess there are better ways than creating a separate site in IIS for each subdomain.
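    One common single-site approach, sketched under the assumption that the IIS URL Rewrite module is installed and the site has bindings for each subdomain: a Rewrite action (unlike Redirect) leaves the address bar untouched.

        <rule name="mycategory subdomain" stopProcessing="true">
          <match url="^(.*)$" />
          <conditions>
            <!-- only requests arriving on this subdomain -->
            <add input="{HTTP_HOST}" pattern="^mycategory\.mydomain\.com$" />
          </conditions>
          <action type="Rewrite" url="/mycategory/{R:1}" />
        </rule>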


  • MaxRequestLen error when I use https

    - by david
    When I got MaxRequestLen errors on a file upload page, I set MaxRequestLen=31457280 using:

        <IfModule mod_fcgid.c>
            MaxRequestLen 31457280
            FcgidIOTimeout 90
        </IfModule>

    Now the file upload works when I use an http URL. I have recently configured SSL for my site, and when I use the https URL for the same upload page, I get the same error: HTTP request length 131073 (so far) exceeds MaxRequestLen (131072) Is there a different setting for https? Please help. Thank you.
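    One plausible cause, offered as an assumption to check rather than a confirmed fix: mod_fcgid settings placed in the main server or plain-HTTP virtual host do not necessarily apply to a separately defined SSL virtual host, so the same block may need to be repeated there.

        <VirtualHost *:443>
            # ... existing SSL configuration ...
            <IfModule mod_fcgid.c>
                MaxRequestLen 31457280
                FcgidIOTimeout 90
            </IfModule>
        </VirtualHost>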


  • Exclude certain files or directories from redirected folders

    - by jao
    We have a Windows 2003 AD and are using folder redirection to redirect users' My Documents to a share. Is there a way to keep certain file types (*.mp3, *.avi) or folders (My Music, My Pictures) on the user's hard disk instead of saving them to the network share? I'm aware of the GPO setting 'Exclude directories in roaming profile', but I'm not sure it will do what I want (we're using redirected folders).


  • Overwriting output to a text file

    - by Naveen Gamage
    I'm trying to write wget's output to a text file, but it always appends to the file.

        #!/bin/sh

        download() {
            local url=$1
            echo -n " "
            wget --progress=dot "$url" 2>&1 | grep --line-buffered "%" | \
                sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
            echo " DONE"
        }

        file="$1"
        echo -n "Downloading $file:"
        download "$file" > file.log

    I tried using >, but it won't work. Where am I going wrong?
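    A hedged side note: in sh, > truncates file.log on each run, so genuine appending would be unexpected; the backspace characters emitted by the awk progress trick can make the log look accumulated or garbled. If the goal is simply a clean, overwritten log, wget can write one itself without the pipeline (-o truncates, -a appends):

        wget --progress=dot -o file.log "$url"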


  • Domain Forwarding | A Magento Store

    - by WaZ
    I have installed Magento inside a folder called magento. The URL of the site currently looks like this: http://gios.azamdevelopment.co.uk/magento/ We want our domain forwarded to the above URL, and moreover, any relative links should work as well: e.g. http://gios.azamdevelopment.co.uk/magento/customer/account/login/ should ideally be www.giosconcept.com/customer/account/login, and so forth. Thanks very much.
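    A sketch of the usual non-forwarding setup, with the filesystem path as a placeholder: point the real domain's document root directly at the magento folder, then update Magento's unsecure/secure base URLs (System > Configuration > Web in the Magento 1 admin) so the links it generates use the new domain.

        <VirtualHost *:80>
            ServerName www.giosconcept.com
            DocumentRoot /var/www/site/magento
        </VirtualHost>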



  • Redirect all traffic to specified hosts behind NAT

    - by biesiad
    Is there a way to redirect all traffic to specified hosts behind NAT? For example, I have a server, a domain "mydomain.com", and 3 hosts behind NAT. I wish to configure 3 subdomains: host1.mydomain.com host2.mydomain.com host3.mydomain.com and have each of them redirect all ports to the corresponding host on the local network. That redirection should provide functionality like this: http://host1.mydomain.com (can be achieved using Apache) ssh [email protected] (???) and other protocols on different ports. Thanks for any help.
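    A sketch of the usual port-based workaround, with addresses as placeholders: NAT routes on IP and port, not hostname, so only protocols that carry the hostname (like HTTP, via an Apache/nginx reverse proxy) can be split by subdomain; everything else needs a distinct external port per internal host.

        # on the gateway: forward distinct external ports to ssh on each host
        iptables -t nat -A PREROUTING -p tcp --dport 2201 -j DNAT --to-destination 192.168.0.101:22
        iptables -t nat -A PREROUTING -p tcp --dport 2202 -j DNAT --to-destination 192.168.0.102:22

        # clients then connect with an explicit port:
        #   ssh -p 2201 user@mydomain.com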


  • Can I tell Chrome not to redirect mistyped URLs?

    - by Nathan Long
    If I mistype a URL, Chrome sometimes redirects me to a search. For instance, typing "example_url_i_sometimes_mistype.conm" into the location bar gets me: Your search - example_url_i_sometimes_mistype.conm - did not match any documents. Not only is this annoying, but if the mistyped URL was one on private DNS, I've now just told Google that the domain exists. (A small concern, but bad in principle.) Can I configure Chrome to just show an error and not blab to Google Search about it?

