Search Results

Search found 1083 results on 44 pages for 'mysite'.

  • Zyxel P-2602HW-1DA - LAN to WAN routing problems

    - by Garrett
    Hi. I got a new router yesterday (due to a new internet supplier), and now all requests for my own server on the local LAN are routed to the router itself instead of the server when using DNS. For example: I have a website, www.mysite.org, running on my server at home (local LAN). From work I can access it via www.mysite.org, which is great. But from home (local LAN), my requests for www.mysite.org get rerouted to the router's web admin interface. My last router didn't do this. My new router is a Zyxel P-2602HW-1DA; my old one was a Linksys WRT-54GC v2.0. There's a rather weird WAN-LAN, WAN-WAN setup interface which I can't really comprehend yet, and the docs are rather vague. Has anyone had the same problem, and can anyone guide me to a solution? It would be nice not to have to write the IP address every time I need to access the server on the local LAN. :) Kind regards, Garrett
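    A common workaround when a router lacks NAT loopback (hairpinning) is a static hosts entry on each LAN machine, so the name resolves straight to the server's LAN address instead of going through the router at all. A minimal sketch, assuming the server sits at 192.168.1.10 (a placeholder address):

        # /etc/hosts on Linux/Mac, C:\Windows\System32\drivers\etc\hosts on Windows
        192.168.1.10    www.mysite.org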

    Read the article

  • How can I view my IIS-hosted sites on other machines on my network?

    - by Truegilly
    Hello. At home I have a simple network setup containing two machines and one Belkin router. On one machine I have a site hosted with IIS7. Rather than the standard localhost/index.htm address, I have added an entry in the HOSTS file pointing the loopback IP (127.0.0.1) to the domain www.mysite.dev, and I can access the site at www.mysite.dev with no problem. What I would like to do is view this site from my other machine on the network. Initially I assumed this could be done with a URL like MACHINE-NAME/www.mysite.dev, but the connection always times out, even though I can ping MACHINE-NAME without problems. For testing purposes I have disabled the Windows firewall on both machines, but to no joy. Like a typical web developer, my techy/network skills are pretty poor. Can anyone see where I'm going wrong? Thank you for your time. Truegilly :)
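    The hosts trick has to be repeated on the second machine, pointing the name at the IIS machine's LAN address rather than at 127.0.0.1 (which always means "this machine"). A sketch, assuming the IIS machine's LAN IP is 192.168.2.10 (a placeholder):

        # C:\Windows\System32\drivers\etc\hosts on the second machine
        192.168.2.10    www.mysite.dev

    IIS also needs a binding for the host name www.mysite.dev so the request is routed to the right site.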

    Read the article

  • Not able to receive mail in my mailbox

    - by jestges
    Hi. I've configured Google Apps (Google services) on my domain, to access mail at something like mail.mysite.com. I've successfully configured all the accounts, including admin and users. But the surprising thing is that I can't receive any mail sent to [email protected], while I can send mail from that same account ([email protected]) to any other email address. Does anybody know the reason? I've been working on this for a week. Thanks in advance.
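    Sending working while receiving fails usually points at the domain's MX records, which have to be switched from the old mail host to Google's. A quick check with dig (the ASPMX hosts shown are Google's published MX targets):

        dig MX mysite.com +short
        # should list Google's servers, e.g.:
        #   1 aspmx.l.google.com.
        #   5 alt1.aspmx.l.google.com.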

    Read the article

  • IIS6 won't respond to a request for a JS file after accessing it through a subdomain

    - by James
    I have a site running at www.mysite.com, for example. There is a JS file I'm accessing: www.mysite.com/packages.js. The first and subsequent times I access that packages.js file cause no problems... until I access a sub-site like this: sub-site.mysite.com. That naturally makes a request for the same packages.js, but the site hangs, as it just keeps waiting and waiting for that JS file. Going back to the main site, the problem persists there. If I then rename packages.js to, say, packages2.js, the same pattern repeats: I can access the file on the main site, but after I try to access it through a sub-site, IIS fails to respond to requests for that file. I realise this explanation is a little vague, but has anyone seen this sort of behaviour before? Thanks very much, James.
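    It can help to take the browser out of the equation and watch the raw responses for the stuck file from both host names. A diagnostic sketch with curl:

        curl -v http://www.mysite.com/packages.js -o /dev/null
        curl -v http://sub-site.mysite.com/packages.js -o /dev/null
        # compare the status codes and whether the second response ever starts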

    Read the article

  • How can I mitigate DNS Server outages?

    - by Eric Belair
    Let's say I have a root domain, mysite.com. That domain and its sub-domains have DNS served by an external service - let's call them Setwork Nolutions. If this external company is hit with a DDoS attack, my internally-hosted websites under this domain are no longer accessible at mysite.com or *.mysite.com, even though the websites themselves are fully up and operational. How can I mitigate such a problem so as to keep end users happy? The only solution others at my company have come up with is to register a second domain - i.e. mysite2.com - host its DNS at another company, and then tell all end users that this is the website they should use. I think this is ridiculous and just leads to a bunch of other problems. I'd like a solution where we can point to the same website with the same URL without the original DNS host being operational. Any thoughts?
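    The usual mitigation is to delegate the zone to name servers at two independent providers, so that one provider's outage still leaves working resolvers. A sketch of what the delegation could look like (provider names are placeholders):

        mysite.com.    IN NS    ns1.provider-a.example.
        mysite.com.    IN NS    ns2.provider-a.example.
        mysite.com.    IN NS    ns1.provider-b.example.
        mysite.com.    IN NS    ns2.provider-b.example.

    Both providers then have to serve identical zone data, typically kept in sync with standard AXFR zone transfers from a hidden master.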

    Read the article

  • Different file locations for HTTP vs. HTTPS on IIS?

    - by Jeremy Morgan
    We have a server running IIS with some folders served under HTTPS, but most are open. The problem I'm having is that when someone follows a relative link from a page in the secure section of the site, the link stays on HTTPS. For example, a link to /pictures normally goes to http://www.mysite.com/pictures. But if someone is on a secured part of the site, https://www.mysite.com/shoppingcart, and then clicks through to /pictures, they get https://www.mysite.com/pictures - the pictures directory is requested under HTTPS. My problem is, they get a 404 Not Found when this happens. I could not find anything in the settings to indicate that secured connections pull files from anywhere different than non-secured ones. If I type http or https on the main page of the site, both come up fine, but if I add https:// at a folder level, I get a 404. Any ideas why this might be happening?
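    A 404 that appears only under HTTPS suggests the HTTPS binding is served from a different site or home directory, so it is worth comparing the physical paths behind the two bindings. If this is IIS7 or later, a quick check from an elevated prompt (a sketch):

        %windir%\system32\inetsrv\appcmd list site
        %windir%\system32\inetsrv\appcmd list vdir
        rem compare bindings and physicalPath for the HTTP and HTTPS entries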

    Read the article

  • SharePoint Web Analytics not tracking usage for main application

    - by Chris W
    My SharePoint 2010 setup is two separate web applications - one for the main portal and one for MySite. While Web Analytics is tracking usage of MySite, it's not showing any stats for the main portal: the only thing it lists is the number of site collections, with no page views etc. The Web Analytics service is clearly running, since it picks up data for MySite. In "Configure web analytics and health data collection" everything is ticked, and I can't find any obvious settings that differ between the two applications. Where should I look to get usage tracking working correctly?

    Read the article

  • Redirecting to a different exe for download based on user agent

    - by Ra
    I own a Linux/Apache site where I host exe files for download. When a user clicks this link to my site (published on another site): http://mysite.com/downloads/file.exe, I need to dynamically check their user agent and redirect them to either http://mysite.com/downloads/file-1.exe or http://mysite.com/downloads/file-2.exe. It seems to me that I have two options: (1) put a .htaccess rule in place stating that .exe files should be treated as scripts, then write a script that checks the user agent and redirects to a real exe placed in another folder, and call this script file.exe; or (2) use Apache mod_rewrite to point file.exe to redirect.php. Which of these is better? Any other considerations? Thanks.
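    A third option is to let mod_rewrite branch on the user agent directly, with no PHP involved. A minimal .htaccess sketch (the "Windows" pattern is a placeholder for whatever rule is actually needed):

        RewriteEngine On
        # matching visitors get the first build...
        RewriteCond %{HTTP_USER_AGENT} "Windows" [NC]
        RewriteRule ^downloads/file\.exe$ /downloads/file-1.exe [L]
        # ...everyone else gets the second
        RewriteRule ^downloads/file\.exe$ /downloads/file-2.exe [L]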

    Read the article

  • Basic clarification about limited FTP/SFTP users

    - by mattewre
    I would like some clarification about the correct way to create limited users with access to my VPS web server running Nginx. I'm used to NOT installing FTP and allowing SFTP access only - is that an OK setup in general? This is what I usually do to create a limited user called "admin" that should have SFTP access to the folder with the website data:

        mkdir -p /var/www/mysite.com/
        adduser admin
        adduser admin www-data
        chown -R root:root /var/www
        chmod -R 755 /var/www
        chmod -R 755 /var/www/mysite.com
        chown -R admin:www-data /var/www/mysite.com/

    It seems not to be the correct way; I always have permission problems when I upload files (for example with WordPress in general). I would like to create a user that works exactly like the one a hosting provider gives its clients when they buy a hosting service (that is FTP; I would prefer SFTP access). It is for personal use, but I think a limited user is a lot safer to work with than root over SFTP.
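    The WordPress-style upload problems usually come down to group permissions and the group of newly created files. One common pattern is setgid directories plus group-writable files, assuming admin owns the tree and Nginx/PHP runs as www-data (a sketch, not the only valid layout):

        chown -R admin:www-data /var/www/mysite.com
        # setgid on directories keeps new files in the www-data group
        find /var/www/mysite.com -type d -exec chmod 2775 {} \;
        find /var/www/mysite.com -type f -exec chmod 664 {} \;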

    Read the article

  • Changing Windows 'hosts' file in guest OS under Parallels Desktop 6 on Mac

    - by ling
    I am running Win7 in Parallels Desktop 6 on a Mac. On the Mac I use an /etc/hosts entry, 127.0.0.1 mysite, and it works fine: I can type mysite in the URL bar and the website displays. What I want is to reproduce the same thing on Win7, with IE8 for example. Another guy did it successfully here, but I can't: Parallels: How to see a Mac-hosted website from Windows? On the Mac, ifconfig -a shows vnic0 at 10.211.55.2. On Win7 I tried this in C:\Windows\System32\drivers\etc\hosts: 10.211.55.2 mysite. What am I missing?
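    The hosts entry looks right for Parallels shared networking; a frequent catch is that the web server on the Mac is bound to the loopback interface only, so it never sees traffic arriving via vnic0. If the server is Apache, the relevant httpd.conf change would be (a sketch):

        # Listen 127.0.0.1:80    <- loopback only, invisible from the VM
        Listen 80                # listen on all interfaces instead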

    Read the article

  • sudoers security

    - by jetboy
    I've set up a script to do Subversion updates across two servers - the localhost and a remote server - called by a post-commit hook run by the www-data user. /srv/svn/mysite/hooks/post-commit contains:

        sudo -u cli /usr/local/bin/svn_deploy

    /usr/local/bin/svn_deploy is owned by the cli user and contains:

        #!/bin/sh
        svn update /srv/www/mysite
        ssh cli@remotehost 'svn update /srv/www/mysite'

    To get this to work I've had to add the following to the sudoers file:

        www-data ALL = (cli) NOPASSWD: /usr/local/bin/svn_deploy
        cli ALL = NOEXEC:NOPASSWD: /usr/local/bin/svn_deploy

    Entries for both www-data and cli were necessary to avoid the error:

        post commit hook failed: no tty present and no askpass program specified

    I'm wary of giving any kind of elevated rights to www-data. Is there anything else I should be doing to reduce or eliminate the security risk?
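    A further hardening step is to lock down the SSH key that cli uses for the remote update, so that even a stolen key can only run that one command. A sketch for ~cli/.ssh/authorized_keys on remotehost (key material elided):

        command="svn update /srv/www/mysite",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA... cli@webserver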

    Read the article

  • htaccess: how to rewrite to clean URLs and redirect old URLs to the new clean ones?

    - by Sebastian
    With .htaccess I'm trying to make my site's URLs clean. I use very basic URLs like www.mysite.com/pagename.php ("pagename" is variable), and I want www.mysite.com/pagename to display the content of /pagename.php. So this is in my .htaccess file now:

        Options +FollowSymlinks
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^([^\.]+)$ $1.php [NC,L]

    But I also want my old URLs (/pagename.php), when called, to be redirected to www.mysite.com/pagename. How do I do this? I can't figure it out (I get loops all the time)... Thanks in advance!
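    The loop happens because the internal rewrite to .php and the external redirect away from .php keep feeding each other. The standard escape is to key the redirect on %{THE_REQUEST}, which only ever contains the original client request line, never the rewritten URL. A sketch of the combined rules:

        RewriteEngine On
        # external 301 only when the client itself asked for the .php form
        RewriteCond %{THE_REQUEST} \s/([^.?\s]+)\.php[\s?]
        RewriteRule ^ /%1 [R=301,L]
        # internal rewrite back to the real file
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^([^\.]+)$ $1.php [L]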

    Read the article

  • rsync Windows to Linux: permission denied

    - by user64908
    Using the command

        rsync -avzP --delete --omit-dir-times ../../ [email protected]:/var/www/mysite/

    I'm getting:

        rsync: mkstemp "/var/www/mysite/.." failed: Permission denied (13)

    If ext is in the www-data group, should I still set all the files to be owned by the www-data user? I am trying to publish the files with rsync and then set the permissions using

        sudo chown -R www-data doc
        sudo chgrp -R www-data doc

    but I can't even rsync because of the permission denied. SSH works fine, and so does rsync, except when it tries to write over or update some of the files in /var/www.

    Client: Windows 7, Cygwin 1.7.16 (GNU bash 4.1.10(4)-release, i686-pc-cygwin), rsync 3.0.9, protocol version 30.
    Server: Ubuntu 12.04, Apache2, root accounts [ubuntu, ext], groups [www-data]; sudo vigr has www-data:x:33:ubuntu,ext.

    I have already configured this: http://stackoverflow.com/questions/2124169/cwrsync-ignores-nontsec-on-windows-7 - and this article has also managed to confuse me: http://unix.stackexchange.com/questions/41687/how-should-i-rsync-files-in-var-www-if-i-want-them-to-be-owned-by-www-data. What is the right procedure?
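    One workable split: make the tree group-writable for www-data on the server, and stop rsync from trying to set ownership, which only root may do. Dropping -o and -g from -a leaves -rlptD. A sketch:

        # on the server, once
        sudo chown -R ext:www-data /var/www/mysite
        sudo chmod -R g+w /var/www/mysite

        # on the client: -a minus owner/group, so no chown attempts
        rsync -rlptDvzP --delete --omit-dir-times ../../ [email protected]:/var/www/mysite/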

    Read the article

  • How to remove an entry from Chrome's remembered URLs in the URL bar?

    - by cmcculloh
    I've got a URL in Chrome, local.mysite.com, that auto-populates when I start typing "local.my" into the URL bar. Note that this URL DOES NOT EXIST in my browser history (at chrome://history/#e=1&p=0), because it isn't a real site and therefore could never be successfully visited, and so never shows up in my history. The URL I want is local.mysite.com/subdir/. That URL is about three down in the suggested results, because I keep accidentally hitting Enter when Chrome auto-suggests the unwanted first URL, thus reinforcing its assumption that that is the one I want. How do I get rid of the local.mysite.com entry in Chrome's memory?

    Read the article

  • Apache: trailing slash added to file URLs

    - by Francisc
    Hello! I am having a problem with Apache. What it does is this: take an /index.php file containing an <img> tag whose src is set to the relative path myimg.jpg, both in the root of my server. www.mysite.com shows the image, as does www.mysite.com/index.php. However, if I access www.mysite.com/index.php/ (with a trailing slash), Apache does the odd thing of executing the index.php code as if it lived inside an index.php folder (e.g. /index.php/index.php), thus no longer showing the image. This is a simple example that's easy to solve with absolute addressing etc.; the real problem I get from this is a security one that's not so easily fixed. So, how can I get Apache to return a 403 or 404 when files are accessed "as folders"? Thank you.
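    What Apache is doing is treating the trailing segment as PATH_INFO. The directive that controls this is AcceptPathInfo; turning it off makes such requests fail with 404 instead. In the server config, a vhost, or .htaccess (a sketch):

        # requests like /index.php/anything now return 404
        AcceptPathInfo Off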

    Read the article

  • Replacing an IP address with a hostname [closed]

    - by fede
    I have a web server (XAMPP) on a particular PC on my LAN, and I added the following line in the 'hosts' file: 127.0.0.1 mysite. Then, on another PC connected to the LAN, I try to access that web server by typing http://mysite/index.php in the web browser, with no luck. But if I type the server computer's IP (http://192.168.2.87/index.php), I am able to access the web site. What should I configure so that typing http://mysite/index.php gives the same result as http://192.168.2.87/index.php? Thanks!
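    Hosts files are strictly per-machine, so the 127.0.0.1 entry on the server does nothing for the other PCs; each client needs its own entry pointing at the server's LAN address. A sketch for the client PC's hosts file:

        # e.g. C:\Windows\System32\drivers\etc\hosts on the client
        192.168.2.87    mysite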

    Read the article

  • Resolve local subdomains on Apache for paths within a user dir

    - by MaoPU
    On Apache 2.2.x I've activated mod_userdir with the default setup, so that http://localhost/~name/ maps to ~name/public_html/, and a path within public_html, e.g. ~name/public_html/mySite, can be reached through http://localhost/~name/mySite. How can I make the same path reachable through http://mySite.name.localhost/? I don't want a manual approach like the one suggested in other SF questions (such as http://serverfault.com/q/133921/53624), but rather an automatic mapping of all available paths to the corresponding URLs. I think several steps would be needed: change the mod_userdir configuration so that the subdomain of localhost is connected with all available user names on the machine; the second step would maybe involve mod_rewrite, so that the sub-subdomain could be matched to the path within ~name/public_html. What would be your preferred way?
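    mod_vhost_alias can do the whole mapping in one step: VirtualDocumentRoot interpolates parts of the Host header into the filesystem path, so no per-site configuration is needed. A sketch, assuming user homes under /home and wildcard resolution for *.localhost (hosts files cannot hold wildcards, so something like dnsmasq is needed for that part):

        UseCanonicalName Off
        <VirtualHost *:80>
            ServerAlias *.localhost
            # %1 = first hostname part (mySite), %2 = second part (name)
            VirtualDocumentRoot /home/%2/public_html/%1
        </VirtualHost>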

    Read the article

  • Sites now not responding on port 80 [closed]

    - by JohnMerlino
    Possible Duplicate: unable to connect site to different port

    I was trying to resolve an issue with getting a site running on a different port (unable to connect site to different port), but somehow that took out all my other sites. Now even the ones that were responding on port 80 no longer respond, even though I did not touch their virtual hosts. I now get this message:

        Oops! Google Chrome could not connect to mysite.com

    However, ping responds:

        ping mysite.com
        PING mysite.com (64.135.12.134): 56 data bytes
        64 bytes from 64.135.12.134: icmp_seq=0 ttl=49 time=20.839 ms
        64 bytes from 64.135.12.134: icmp_seq=1 ttl=49 time=20.489 ms

    The result of telnet:

        $ telnet guarddoggps.com 80
        Trying 64.135.12.134...
        telnet: connect to address 64.135.12.134: Connection refused
        telnet: Unable to connect to remote host
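    "Connection refused" while ping works means nothing is listening on port 80 any more - typically the web server failed to come back up after the configuration change. Two standard checks on the server, assuming Apache:

        apachectl configtest            # reports the config syntax error, if any
        sudo netstat -tlnp | grep :80   # shows whether anything is bound to port 80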

    Read the article

  • Can't verify my site on Google (error 403 Forbidden); other sites on the same host have no problems whatsoever

    - by Rosamunda Rosamunda
    I can't verify my site on Google. I've done this several times for several sites, all on the same host. I've tried the HTML tag method, HTML file upload, domain name provider (I can't find the options Google tells me to activate...), and Google Analytics. I always get this response:

        Verification failed for http://www.mysite.com/ using the Google Analytics method (1 minute ago). Your verification file returns a status of 403 (Forbidden) instead of 200 (OK).

    I've checked the server headers, and I get this result:

        REQUESTING: http://www.mysite.com
        GET / HTTP/1.1
        Connection: Keep-Alive
        Keep-Alive: 300
        Accept: */*
        Host: www.mysite.com
        Accept-Language: en-us
        Accept-Encoding: gzip, deflate
        User-Agent: Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 6.0)

        SERVER RESPONSE: HTTP/1.1 403 Forbidden
        Date: Wed, 19 Sep 2012 03:25:22 GMT
        Server: Apache/2.2.19 (Unix) mod_ssl/2.2.19 OpenSSL/0.9.8e-fips-rhel5 mod_bwlimited/1.4 PHP/5.2.17
        Connection: close
        Content-Type: text/html; charset=iso-8859-1

        Final Destination Page: (it shows my actual homepage)

    What can I do? The hosting is the very same as for my other sites, where I didn't have any issue at all! Thanks for your help! Note: as I have a Drupal 7 site, I tried a "Drupal solution" first, but haven't found any that solved this issue... How can it be forbidden when I can access the link perfectly well? Is there any solution to this? Thanks!
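    A page that loads fine in a browser but returns 403 to the checker often means something (mod_security, a bot-blocking rule, an .htaccess deny) filters on the User-Agent. Reproducing with curl under two different agents can confirm it (the verification file name is a placeholder):

        curl -I http://www.mysite.com/google1234567890.html
        curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.mysite.com/google1234567890.html
        # 200 for the first but 403 for the second points at a user-agent rule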

    Read the article

  • Homepage 301 Redirect to SSL Homepage

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have one site where we implemented a 301 redirect on the homepage from HTTP to HTTPS. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our webmaster account, I notice we are not being provided with any webmaster information (search queries, backlinks) related to our homepage under SSL. I performed a Fetch as Google on the homepage, and the information it returned is:

        HTTP/1.1 301 Moved Permanently
        Date: Fri, 08 Nov 2013 17:26:24 GMT
        Server: Apache/2.2.16 (Debian)
        Location: https://mysite.com/
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 242
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="https://mysite.com/">here</a>.</p>
        <hr>
        <address>Apache/2.2.16 (Debian) Server at mysite.com</address>
        </body></html>

    I am worried that Fetch as Google is not getting the correct title tags and meta information from our homepage, and that this is hurting our search results. Additionally, I am worried that we need to do something specific with the sitemap to ensure that Google correctly indexes all our pages and can flow from the HTTPS to the HTTP pages without issues. Does anybody have any advice on how we can correctly set this up, or be sure that Google is fetching the correct information?
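    For what it's worth, that fetch output is what a correct 301 looks like: Google is expected to follow the Location header and read the title and meta tags from the redirect target, not from the redirect response itself. Two things that commonly help are verifying the https:// version as its own property in the webmaster account, and declaring the canonical URL on the page. A sketch of the tag for the homepage's head:

        <link rel="canonical" href="https://mysite.com/" />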

    Read the article

  • Best solution for getting referral information in PHP

    - by absentx
    I am currently redoing some link structuring on a website. In the past we used specific PHP files on the last step to direct the user to the proper place. Example: www.mysite.com/action/go-to-blue.php, or www.mysite.com/action/short/go-to-red.php and www.mysite.com/action/tall/go-to-red.php. We are now restructuring to eliminate the /short/ and /tall/ directories, which means go-to-blue.php will now do some extra processing to make sure it sends the visitor to the proper place. The static method of the past was quite effective because, well, if they left from that page, we knew we had it right. Now, since we are 301-redirecting action/short/go-to-red.php to just action/go-to-red.php, it is quite important on go-to-red.php to know whether a user was redirected from /short/ or /tall/. Right now I am using HTTP_REFERER, and of course in my testing that works fine, but after a lot of reading it is clear that this is not a solid solution, so I started brainstorming other ways to make sure we get the proper referral information. If we could check HTTP_REFERER plus some other test, I would feel confident we have a pretty good system in place to send the visitor to the right place. Some questions/comments: Could I use a session variable or a cookie to accomplish this goal? If so, would it be maintained through the 301 redirect? I don't see why it wouldn't be. Passing the URL in the URL is not an option in this case.
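    Cookies do survive a 301: the browser stores a Set-Cookie from the redirect response and presents the cookie on the follow-up request. mod_rewrite can even set one at redirect time with the CO flag, which keeps the variant out of the URL entirely. A sketch (cookie name and domain are placeholders):

        RewriteEngine On
        RewriteRule ^action/(short|tall)/(go-to-.+\.php)$ /action/$2 [R=301,L,CO=variant:$1:.mysite.com]

        # then in go-to-red.php:
        # $variant = isset($_COOKIE['variant']) ? $_COOKIE['variant'] : 'unknown';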

    Read the article

  • JavaScript widgets: do links count as SEO backlinks? [closed]

    - by j0nes
    Possible Duplicate: How good is it for SEO if you have a widget that lives on other sites?

    On my website I offer an option to let users embed information from my site with a kind of homepage widget. If a user wants to embed it in his website, he basically has to add one line of JavaScript to his HTML files, like this:

        <script src="http://mysite.com/myscript.php?some_options_here"></script>

    Inside the widget I export some content from my website and, of course, create a link back to my website. This is done in JavaScript with document.write:

        document.writeln("My great exported content");
        document.writeln('<a href="http://mysite.com?ref=widget">Check mysite.com</a>');

    I have Google Analytics set up to track whether the links in there get clicked, and they do. Now I am asking myself whether Google recognizes these links as valid backlinks from the embedding domain. I know that Googlebot can parse and execute JavaScript, but I have not found any references on whether these links also count as "normal" backlinks.
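    A common way to avoid depending on Googlebot's JavaScript execution is to ship a plain-HTML fallback with the embed code, so a crawlable link exists even without script. A sketch of what the embed snippet could look like:

        <script src="http://mysite.com/myscript.php?some_options_here"></script>
        <noscript>
          <a href="http://mysite.com?ref=widget">Check mysite.com</a>
        </noscript>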

    Read the article

  • 301 redirect from HTTP to HTTPS - how to be sure Google is fetching the correct information?

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have one site where we implemented a 301 redirect on the homepage from HTTP to HTTPS. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our Webmaster Tools account, I notice that we are not being provided with any webmaster information (e.g. search queries, backlinks, etc.) related to our homepage under SSL. I performed a Fetch as Google on the homepage, and the information it returned is:

        HTTP/1.1 301 Moved Permanently
        Date: Fri, 08 Nov 2013 17:26:24 GMT
        Server: Apache/2.2.16 (Debian)
        Location: https://mysite.com/
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 242
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="https://mysite.com/">here</a>.</p>
        <hr>
        <address>Apache/2.2.16 (Debian) Server at mysite.com</address>
        </body></html>

    I am worried by the fact that Fetch as Google is not getting the correct title tags and meta information from our homepage, and that this is hurting our search results. Additionally, I am worried that we need to do something specific with the sitemap to ensure that Google is correctly indexing all our pages and can flow from the HTTPS to the HTTP pages without issues. Does anybody have any advice on how we can correctly set this up, or be sure that Google is fetching the correct information?
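    On the sitemap question: the usual practice is to list every URL under exactly the scheme you want indexed, so the homepage entry carries the https:// form while the rest stay on http://. A sketch of the relevant sitemap.xml fragment (the second URL is a placeholder):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>https://mysite.com/</loc></url>
          <url><loc>http://mysite.com/some-page.html</loc></url>
        </urlset>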

    Read the article

  • Embed a jQuery script after jQuery is loaded by a widget

    - by matthew k
    http://stackoverflow.com/a/6065421 was helpful for seeing how to confirm jQuery has been loaded. My widget needs a plugin class that was written using jQuery; may I have some assistance embedding this other class built on jQuery? Thank you. Below is the snippet from the above link, with my code added in the final portion as noted in the comments:

        (function(window, document, version, callback) {
            var j, d;
            var loaded = false;
            if (!(j = window.jQuery) || version > j.fn.jquery || callback(j, loaded)) {
                var script = document.createElement("script");
                script.type = "text/javascript";
                script.src = "/media/jquery.js";
                script.onload = script.onreadystatechange = function() {
                    if (!loaded && (!(d = this.readyState) || d == "loaded" || d == "complete")) {
                        callback((j = window.jQuery).noConflict(1), loaded = true);
                        j(script).remove();
                    }
                };
                document.documentElement.childNodes[0].appendChild(script)
            }
        })(window, document, "1.3", function($, jquery_loaded) {
            // my code added below
            var script_tag = document.createElement('script');
            script_tag.setAttribute("type", "text/javascript");
            script_tag.setAttribute("src", "http://mysite.com/widget/slides.jquery.js");
            (document.getElementsByTagName("head")[0] || document.documentElement).appendChild(script_tag);
            $('#slides').slides({}); // this line gives an error
        });

    Right now I am trying the following, based on the response(s) provided to this question (the line that throws the error is noted with a comment; the {mysite} placeholder is nonexistent in the actual code):

        // this function is called after the embedded jQuery has been confirmed
        function main() {
            jQuery(document).ready(function($) {
                var css_link = $("<link>", {
                    rel: "stylesheet",
                    type: "text/css",
                    href: "http://mysite/widget/widget.css"
                });
                css_link.appendTo('head');
                $('#crf_widget').after('<div id="crf_widget_container"></div>');

                /******* Load HTML *******/
                var jsonp_url = "http://mysite/widget.php?callback=?";
                $.getJSON(jsonp_url, function(data) {
                    $('#crf_widget_container').html(data);
                    $('#category_sel').change(function() {
                        alert(this.value);
                    });
                    $.getScript("http://mysite/widget/slides.jquery.js", function(data, textStatus, jqxhr) {
                        alert(1);                // fires OK
                        $('#slides').slides({}); // errors
                    });
                });
            });
        }
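    In the first snippet the error is expected: the plugin script tag is appended and .slides() is called on the very next line, before slides.jquery.js has had a chance to load. Waiting for the script's load event fixes the ordering; a sketch:

        var script_tag = document.createElement('script');
        script_tag.src = "http://mysite.com/widget/slides.jquery.js";
        script_tag.onload = function() {
            // the plugin is only guaranteed to exist from here on
            $('#slides').slides({});
        };
        (document.getElementsByTagName("head")[0] || document.documentElement).appendChild(script_tag);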

    Read the article
