Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 161/216

  • Is it OK to have 2 sitemaps on 1 website?

    - by user615041
    Do I have to have a sitemap page linked from my index page for bots to read it, or can I just have it anywhere on my server? I have a phpbb/WordPress integration, so I need two sitemap mods, one for each (or I need them somehow integrated together into one XML sitemap). Is this possible? What's my best option? I would have the phpbb one at something like http://www.example.com/phpbb/sitemap.html and the WordPress one at something like http://www.example.com/wordpress/sitemap.html, and then I would submit both, but not put the links in my footer so as not to confuse anyone; the sitemaps would strictly be for search engines. Is this a good idea? What are your thoughts?
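
    A sitemap index file is the standard way to tie two sitemaps together, and robots.txt may also list several Sitemap: lines. A minimal sketch of the index (assuming both mods can emit XML at these paths, which are placeholders):

      <?xml version="1.0" encoding="UTF-8"?>
      <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <sitemap>
          <loc>http://www.example.com/phpbb/sitemap.xml</loc>
        </sitemap>
        <sitemap>
          <loc>http://www.example.com/wordpress/sitemap.xml</loc>
        </sitemap>
      </sitemapindex>

    With this, only the one index file needs to be submitted, and nothing has to be linked from the page footer.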

    Read the article

  • Place to host a simple PHP socket server

    - by bonusbox
    I am running a small project that occasionally requires me to run a PHP socket script through SSH. It uses a few bytes of bandwidth (just some text), which activates my art installation project. I tried a simple web hosting plan, but that didn't support sockets, so now I am using a $32 VPS plan on namecheap.com just to run a simple PHP script. I don't really need to host anything else on it, and I find it kind of excessive for such a simple thing. Is there a place I can run my script for a lower cost? Any servers that support PHP sockets and SSH?
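
    For scale, the whole job can be a dozen lines of PHP, so any host that permits long-running CLI processes would do. A minimal sketch of such a listener (the port and the trigger_installation() handler are placeholder assumptions):

      <?php
      // Bind a plain TCP listener; run from an SSH session:  php server.php
      $server = stream_socket_server('tcp://0.0.0.0:8090', $errno, $errstr);
      if ($server === false) {
          die("Could not bind: $errstr ($errno)\n");
      }
      // Accept connections indefinitely; each carries a few bytes of text.
      while ($conn = stream_socket_accept($server, -1)) {
          $text = trim(fread($conn, 1024));
          trigger_installation($text); // hypothetical hook into the art piece
          fclose($conn);
      }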

    Read the article

  • Sharing password-protected videos on social media

    - by PaulJ
    We are developing a site where users will be able to watch and download videos that they've recorded of themselves at a public event. The videos will be password protected, and will be available only to users who have paid for them at the event... ...But on the other hand, we also want users to share those videos on social media, since they make attractive publicity for our events. Having people log into our site with their password, download the video and then re-upload it to YouTube/Facebook would be too cumbersome, and I suspect few users would be willing to do that. So the obvious alternative is to have one of those convenient "share" buttons, but the problems with that approach are: The video will be physically hosted (and linked to) on our site. What happens if those videos go viral and our bandwidth cost explodes? The video is password protected. The solution I've thought of is: Upload the user's video to our password-protected site and to YouTube at the same time, as an unlisted video. The user can access our site with his password and download his video (to watch on his TV or whatever). If the user hits the "share" button, we show him the YouTube link... and we turn the video into a listed one. This seems in line with the ideas in Using YouTube as a CDN, and there didn't seem to be any objections in that question. I'm posting this to confirm that my idea doesn't violate any YouTube TOS, and also to see whether it is a good one or there might be better alternatives.
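
    If the plan above is adopted, the "turn it into a listed video" step maps onto the YouTube Data API v3 videos.update call with part=status. A hedged PHP sketch (the OAuth2 access token and video ID are assumed to come from elsewhere in the app):

      <?php
      // Flip an unlisted video to public when the user clicks "share".
      function makeVideoPublic(string $videoId, string $accessToken): bool
      {
          $payload = json_encode([
              'id'     => $videoId,
              'status' => ['privacyStatus' => 'public'],
          ]);
          $ch = curl_init('https://www.googleapis.com/youtube/v3/videos?part=status');
          curl_setopt_array($ch, [
              CURLOPT_CUSTOMREQUEST  => 'PUT',
              CURLOPT_POSTFIELDS     => $payload,
              CURLOPT_RETURNTRANSFER => true,
              CURLOPT_HTTPHEADER     => [
                  'Authorization: Bearer ' . $accessToken,
                  'Content-Type: application/json',
              ],
          ]);
          curl_exec($ch);
          $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
          curl_close($ch);
          return $ok;
      }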

    Read the article

  • Redirecting 2 or more domains to the same hosting server

    - by mtk
    I have the domains A.com, A.co.in and A.in, purchased from site X. I have a hosting space/account purchased from site Y, which has provided me with 2 DNS (nameserver) entries that are to be entered into the account at the site where I purchased the domains. I have successfully changed the nameserver entries of A.com to these 2 entries, and I am able to see my index.html page when I hit A.com. Problem: Along the same lines, I have changed the entries for A.co.in and A.in to the same 2 nameservers, but hitting those sites in a browser gives no response, and the browser's 'Site not found' page is shown. Please let me know how to set this up so that when I hit any of the domains, the website is served from the hosting server. What am I doing wrong here? Note: It has been more than 3 days since I changed the entries, so I don't think this is a DNS propagation problem, which I have heard about from some people. Please provide some detailed explanation, as I am very, very new to this. This is my first hosting ;) -Thanks
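
    A quick way to see whether the .in registry has actually published the new nameservers is to query them directly (dig is assumed to be available; on Windows, nslookup -type=NS gives similar output). If the working and non-working domains return different nameservers, the change never took effect at the registrar:

      dig +short NS A.com
      dig +short NS A.co.in
      dig +short NS A.in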

    Read the article

  • Duplicate pages

    - by Mert
    I made a small coding mistake and Google indexed my site wrongly. This is the correct form: https://www.foo.com/urunler/171/TENGA-CUP-DOUBLE-HOLE but Google indexed pages like this: https://www.foo.com/urunler/171/cart.aspx First I fixed the problem and made a sitemap with only the correct links in it. Now I check Webmaster Tools and I see this: Total indexed 513, Not selected 544, Blocked by robots 0. I think the "Not selected" count is caused by the duplicate indexed pages. I want to know how to fix the "https://www.foo.com/urunler/171/cart.aspx" links: should I fix this in code, or should I ask Google to reindex my site? If I should redirect the wrong/duplicate links to the correct ones, what is the right way to do it? Thanks for your time in advance.
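
    Redirecting the duplicates with a 301 is the usual fix, since the wrong URLs are already indexed. Given the .aspx extension, a hedged IIS sketch (assumes the URL Rewrite module is installed; the redirect target is a placeholder, since the real product URLs include a slug):

      <configuration>
        <system.webServer>
          <rewrite>
            <rules>
              <!-- Send the mistakenly indexed cart.aspx URLs back to the product page -->
              <rule name="FixDuplicateCartUrls" stopProcessing="true">
                <match url="^urunler/([0-9]+)/cart\.aspx$" />
                <action type="Redirect" url="/urunler/{R:1}/" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>
      </configuration>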

    Read the article

  • Google SEO - Migrating website from a sub-directory of another website to its own domain name

    - by DarioP
    At the present moment I have a website hosted at example.com/myWebsite, where example.com hosts a different website in its root directory. I have the domain example.net, which redirects to example.com/myWebsite. The point is that right now, when somebody accesses example.net, they are redirected to example.com/myWebsite and consequently to example.com/myWebsite/dirA, example.com/myWebsite/dirB, etc. I am now thinking about upgrading my account so that example.net hosts the site directly instead of redirecting to example.com. I was wondering, though: since Google currently shows search results in terms of example.com/myWebsite, how would this move affect my rankings?
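
    The usual way to preserve rankings in such a move is a per-URL 301 redirect from the old paths to the new domain. A minimal sketch, assuming Apache and an .htaccess in example.com's root:

      RewriteEngine On
      # Map example.com/myWebsite/dirA -> example.net/dirA, one URL at a time
      RewriteRule ^myWebsite/(.*)$ http://example.net/$1 [R=301,L]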

    Read the article

  • Google Analytics content experiment vs. funnel visualization

    - by Full Decent
    I am running an experiment on one of the pages in my funnel (the "Choose shipping options" page). But the numbers on the different reports do not correspond. First, I am expecting the 70 entrances in the funnel to equal the 131 experiment visits. Also, I expect the 23 conversions in the funnel to match the 21 transactions below. But they do not. How should I read this information to make good decisions?

    Read the article

  • Where to buy a domain for my local server [closed]

    - by Pradyut Bhattacharya
    I have made a website and hosted it on my local computer using a static IP. Where can I buy a domain name such as www.something.com and point it at my static IP, so that a page like http://localhost/index.jsp can be accessed as http://www.something.com/index.jsp? Does it matter if I run the server locally, or should I buy a managed web hosting plan from a big company even though traffic on my site is low?
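
    Pointing a purchased name at a static IP is done with an A record in the registrar's DNS panel. A sketch of the records involved (203.0.113.10 is a documentation placeholder for the real static IP; the router must also forward port 80 to the local machine):

      something.com.       3600  IN  A      203.0.113.10
      www.something.com.   3600  IN  CNAME  something.com.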

    Read the article

  • GitHub feed affecting my WordPress installation? [on hold]

    - by saul
    Any idea how this fork is affecting my site? I went to check my website log stats and realized this may be the cause of a strange redirect constantly happening on my WordPress installation. Here's a line I found in my log: 54.81.91.95 - - [07/May/2014:22:52:08 -0400] "GET /category/selfie/feed/ HTTP/1.1" 200 1826 "-" "feedzirra http://github.com/pauldix/feedzirra/tree/master" And this is the GitHub fork (or whatever these are called): https://github.com/feedjira/feedjira/tree/master Basically, I think every time I update my categories (selfie in this case), I get redirected to install.php, probably by triggering some GET on that feed. To the best of my knowledge, this library fetches all URLs with this structure, blocking them, kind of like a DDoS attack?? Any ideas how to go about it?
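
    If the goal is simply to stop that client from reaching WordPress at all, a user-agent block in the site's .htaccess is one option (a hedged sketch, Apache assumed):

      RewriteEngine On
      # Return 403 to anything identifying itself as feedzirra
      RewriteCond %{HTTP_USER_AGENT} feedzirra [NC]
      RewriteRule .* - [F,L]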

    Read the article

  • Is there any way to simulate a slow connection between my server and an iPad (without installing anything on the server)?

    - by Clay Nichols
    Some of our webapp users have difficulty on slower connections. I'm trying to get a better idea of what that "speed barrier" is, so I'd like to be able to test a variety of connection speeds. I've found ways to do this on Windows but not on the iPad, so I'm looking for some sort of proxy service that will work with any device (i.e. not running ON that device). I did find an article about using Charles Proxy and providing its connection to another device, but I was hoping for something simpler (it need not be free). Constraints:
    - We are on a shared server, so we can't install anything, and we are limited in our control over that server.
    - I'd like to test an iPad, an Android tablet, and a Windows PC.
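
    One device-agnostic setup is a spare Linux box acting as the network's gateway or proxy, shaped with tc. A hedged sketch (netem's rate option needs a reasonably recent kernel; older ones pair netem with a tbf qdisc instead):

      # Add 300 ms latency and cap throughput at 256 kbit/s on eth0
      sudo tc qdisc add dev eth0 root netem delay 300ms rate 256kbit
      # Remove the shaping when done
      sudo tc qdisc del dev eth0 root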

    Read the article

  • How to change my website's appearance in a Facebook wall post?

    - by Lode
    When posting a website link in a Facebook wall post, Facebook fetches some content (title, text and image) from the website to show it to readers. Is there a way I can adjust / propose which content is used / preferred by Facebook? I found someone saying to use <meta property="og:image" content="image.jpg">, but this doesn't seem to have any effect. But maybe Facebook caches these results for a while?
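
    A common stumbling block is that og:image must be an absolute URL. A sketch of the usual tag set (the contents are placeholders):

      <meta property="og:title"       content="Page title for the wall post" />
      <meta property="og:description" content="Short summary shown under the title" />
      <meta property="og:image"       content="http://www.example.com/images/share.jpg" />

    Facebook does cache its scrape of a page; running the URL through Facebook's Debugger tool (developers.facebook.com/tools/debug) forces a fresh fetch.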

    Read the article

  • Good Compression for Slow-mo Video

    - by marienbad
    What's the best way to deliver super slow-motion video to the browser? This seems to me to be a special case, because with super slow-mo video (such as footage shot at 10,000 frames per second) the visual difference from frame to frame is minimal. As such, it should compress extremely well. Please suggest codecs, as well as encoding software, backend software, configuration tips, and services like YouTube. My goal is to get about 100 frames of QVGA video to the browser in 500KB. By the way, remember that Radiohead In Rainbows site?
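
    As a rough budget: 100 frames played at 25 fps is 4 seconds, so 500KB (about 4,000 kilobits) allows close to 1,000 kbit/s, which is generous for QVGA with tiny frame-to-frame deltas. A hedged ffmpeg/x264 sketch (filenames and the exact bitrate are placeholders, with headroom left for container overhead):

      ffmpeg -i slowmo.mov -vf scale=320:240 -c:v libx264 -b:v 900k -an out.mp4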

    Read the article

  • Which tool can tidy/indent HTML5?

    - by user2534
    I've already spent about 2 hours searching Google and trying various tools, but either they don't indent tags per se, or they aren't HTML5 compatible, such as the famous HTML Tidy, which was last updated 3 years ago… N.B. I don't want to apply this to the served source of a page (like PHP or JS would) but to the code in the editor, so that a clear hierarchy appears between tags. Ideally I'd like a Mac OS X tool, but I'll take any online tool, and as a last resort a Wine-compatible one. P.S. At the moment I use Coda from Panic.
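
    One possibility, hedged: the tidy-html5 fork of HTML Tidy understands the HTML5 elements the abandoned original chokes on, builds on Mac OS X, and can rewrite a file in place:

      # Indent tags, wrap at column 100, and modify the file itself
      tidy -indent -wrap 100 -modify page.html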

    Read the article

  • Website .htaccess file for WordPress subfolder

    - by ubique
    I developed a Flash website for a client and added the following .htaccess file in the root directory, and the non-www to www redirect works perfectly:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^website.com [NC]
      RewriteRule ^(.*)$ http://www.website.com/$1 [L,R=301]

    I was also asked to add a WordPress blog, so I put it in a new directory (as opposed to a subdomain), so the URL is www.website.com/blog. Does Google now see the main site and blog as two different websites? Do I need to link them together using another .htaccess file in the WordPress root so Google automatically crawls the whole domain? Any help appreciated....
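
    For reference, this is the rule set WordPress itself generates for a subdirectory install; it lives in /blog/.htaccess and coexists with the redirect above, which stays in the site root:

      RewriteEngine On
      RewriteBase /blog/
      RewriteRule ^index\.php$ - [L]
      # Hand anything that is not a real file or folder to WordPress
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule . /blog/index.php [L]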

    Read the article

  • PayPal - Account management

    - by Tom
    I'm running an app that gets small donations (micropayments up to ~11 USD), and I'm also doing some freelancing where I receive larger payments over PayPal (~900 USD a month). Is it possible to have 2 accounts on PayPal? (I'm asking because when someone sends me money for my freelancing, they see the contact information from the app - like [email protected] instead of [email protected].) Thanks.

    Read the article

  • Avoid pagination pages appearing higher than real content in SERPs

    - by WordPress Developer
    I have a gallery-type website with about 20,000 pages, and naturally it uses pagination. However, sometimes /page/2 appears higher in search results than /post/201339, for example. I'd like to give emphasis to the actual content (posts, videos, whatever the site is about) and not to pages that merely list this content in a paginated manner. What is the best way to avoid this issue? Maybe a NOINDEX,FOLLOW meta tag on the paginated pages?
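
    The tag suggested at the end of the question would look like this on /page/2 and deeper (noindex drops the listing page from results, while follow still lets crawlers reach the posts it links to):

      <meta name="robots" content="noindex, follow">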

    Read the article

  • Redirect Google crawler to different robots.txt via .htaccess

    - by user3474818
    I have googled for the answer all day and still couldn't find one. I have a virtual subdomain, www.static.example.com, which is a mirror of www.example.com; that means I have just one root folder for the subdomain and the domain alike. I want to redirect crawlers to a different robots file - robots_static.txt - when they see .static in the URL, and in that file I will forbid indexing via a Disallow rule. I want to do this because I have duplicate content in Google search results: the subdomain shows exactly the same content as the main domain. Does anyone know how I can make crawlers see robots_static.txt instead of robots.txt? What I have managed to find so far is this:

      RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
      RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
      RewriteRule ^robots\.txt /robots_static.txt [NC,L]

    but when I check in Webmaster Tools, it still shows robots.txt as my robots file instead of robots_static.txt, so everything is crawled and indexed twice. What did I do wrong? Thanks

    EDIT: This is my .htaccess file:

      ##
      # @package Joomla
      # @copyright Copyright (C) 2005 - 2013 Open Source Matters. All rights reserved.
      # @license GNU General Public License version 2 or later; see LICENSE.txt
      ##

      ##
      # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
      #
      # The line just below this section: 'Options +FollowSymLinks' may cause problems
      # with some server configurations. It is required for use of mod_rewrite, but may already
      # be set by your server administrator in a way that dissallows changing it in
      # your .htaccess file. If using it causes your server to error out, comment it out (add # to
      # beginning of line), reload your site in your browser and test your sef url's. If they work,
      # it has been set by your server administrator and you do not need it set here.
      ##

      ## Can be commented out if causes errors, see notes above.
      Options +FollowSymLinks

      ## Mod_rewrite in use.
      RewriteEngine On

      RewriteEngine On
      RewriteCond %{HTTP_HOST} !^www\.
      RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

      RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
      RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
      RewriteRule ^robots\.txt /robots_static.txt [NC,L]

      ## Begin - Rewrite rules to block out some common exploits.
      # If you experience problems on your site block out the operations listed below
      # This attempts to block the most common type of exploit `attempts` to Joomla!
      #
      # Block out any script trying to base64_encode data within the URL.
      RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
      # Block out any script that includes a <script> tag in URL.
      RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
      # Block out any script trying to set a PHP GLOBALS variable via URL.
      RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
      # Block out any script trying to modify a _REQUEST variable via URL.
      RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
      # Return 403 Forbidden header and show the content of the root homepage
      RewriteRule .* index.php [F]
      #
      ## End - Rewrite rules to block out some common exploits.

      ## Begin - Custom redirects
      #
      # If you need to redirect some pages, or set a canonical non-www to
      # www redirect (or vice versa), place that code here. Ensure those
      # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
      #
      ## End - Custom redirects

      ##
      # Uncomment following line if your webserver's URL
      # is not directly related to physical file paths.
      # Update Your Joomla! Directory (just / for root).
      ##
      # RewriteBase /

      RewriteCond %{THE_REQUEST} ^GET.*index\.php [NC]
      RewriteCond %{THE_REQUEST} !/system/.*
      RewriteRule (.*?)index\.php/*(.*) /$1$2 [R=301,L]
      RewriteCond %{THE_REQUEST} ^GET

      ## Begin - Joomla! core SEF Section.
      #
      RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
      #
      # If the requested path and file is not /index.php and the request
      # has not already been internally rewritten to the index.php script
      RewriteCond %{REQUEST_URI} !^/index\.php
      # and the request is for something within the component folder,
      # or for the site root, or for an extensionless URL, or the
      # requested URL ends with one of the listed extensions
      RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
      # and the requested path and file doesn't directly match a physical file
      RewriteCond %{REQUEST_FILENAME} !-f
      # and the requested path and file doesn't directly match a physical folder
      RewriteCond %{REQUEST_FILENAME} !-d
      # internally rewrite the request to the index.php script
      RewriteRule .* index.php [L]
      #
      ## End - Joomla! core SEF Section.

      <FilesMatch "\.(ico|pdf|flv|jpg|ttf|jpg|jpeg|png|gif|js|css|swf)$">
      Header set Expires "Wed, 15 Apr 2020 20:00:00 GMT"
      Header set Cache-Control "public"
      </FilesMatch>

      <ifModule mod_headers.c>
      Header set Connection keep-alive
      </ifModule>

      ########## Begin - Remove Etags
      #
      FileETag none
      #
      ########## End - Remove Etags
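
    For what it's worth, the rule itself can be simpler: the THE_REQUEST condition is redundant, and the host pattern's unescaped dots make it looser than intended. A hedged sketch, placed above the non-www redirect so it fires first for the mirror host:

      RewriteCond %{HTTP_HOST} ^www\.static\. [NC]
      RewriteRule ^robots\.txt$ /robots_static.txt [L]

    Note also that Webmaster Tools reports whatever it fetched from /robots.txt for the verified property, so the file name it displays proves little; fetching http://www.static.example.com/robots.txt directly in a browser shows whether the rewrite is actually serving the robots_static.txt content.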

    Read the article

  • Web host announced downtime - how does it affect forwarded domain names?

    - by maple_shaft
    Our web hosting provider, which holds our forwarded domain names, announced that at some point in the next couple of weeks they will be migrating servers, and that this will cause 5-10 minutes of downtime at some point during that week - which happens to include our core business hours. They say that for technical reasons it is impossible to give an exact date or time when this downtime will occur. My questions are: If my domains are set to FORWARD to a static IP on servers not hosted by the web hosting provider in question, will this affect the DNS servers correctly routing to my website? Are there legitimate technical reasons for such a wide window of time, or could this just be a blanket statement covering a lack of organization in their server migrations? Are such downtimes normal for web hosting providers, or should I start to consider other providers?

    Read the article

  • Using Google Maps as an input method

    - by Vasiliy Sharapov
    I'm wondering if it's possible to use Google Maps pretty much as a type of input. Normally for a location a user has to enter: street address, town/city, state/province, postal code, and country. This seems too clunky to me; does anyone here know of a more elegant process? My ideal address-entering interface is a Maps search bar with a map under it. When the user clicks a marker, instead of a speech bubble containing location information they get one containing a submit button or equivalent. I'm not sure anything like this has been implemented before.

    Read the article

  • Poor backlink profile - search rankings not updated for 2+ months

    - by fistameeny
    I am carrying out some work on a website that is a PR2, with a few good quality, relevant backlinks (PR4-6). It has a presence on Twitter that is updated regularly, a Google Places listing, and listings in some decent directories (Qype etc.). The site was rebuilt in Drupal 7 two months ago, with all the basics done - URL rewriting, an XML sitemap submitted to Google, and most importantly good quality, structured content. I've noticed that Google is still showing "old" URLs from the previous version of the site that was ditched 8 weeks ago. I think the site may be penalised under the Penguin update, as a previous SEO company created many low quality links from link farms/directories. My question is: what is the correct way to deal with this? Bing Webmaster Tools can "disavow" links, and I guess I can attempt to contact the link farms to have the links removed. I've already submitted a request to Google asking to have the penalty removed, as we're trying to tidy up a bad history. We submit updated sitemaps to Google and Bing daily, and have built some further decent quality, relevant links. Is there anything further I can do?
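
    Google gained its own disavow tool around the same era as Penguin, and it takes a plain text file, one URL or domain per line. A sketch of the format (the domains below are placeholders for the actual link farms):

      # Directories a previous SEO company submitted the site to
      domain:spammy-directory.example
      domain:link-farm.example
      # A single page rather than a whole domain
      http://old-seo-network.example/links/page7.html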

    Read the article

  • .htaccess redirect to subfolder in different domain, maintaining old domain in the URL

    - by Naoise Golden
    Redirects have been widely discussed and most problems solved, so I am sorry for opening yet another post about this, but none of the approaches I have tried work. I have a WordPress site hosted at http://mydomain.com/clientsdomain.com/wordpress I would like to temporarily redirect http://clientsdomain.com/ to the abovementioned URL, maintaining the clientsdomain.com domain in the URL. So, for example, http://clientsdomain.com/some/page would point to http://mydomain.com/clientsdomain.com/wordpress/some/page Is this even possible with .htaccess? Or maybe with some configuration or plugin option in WordPress?
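
    It is possible with .htaccess, provided clientsdomain.com's DNS already points at the same server. A hedged sketch for the .htaccess in mydomain.com's document root:

      RewriteEngine On
      # Internally rewrite (not redirect) requests arriving for the client's host
      RewriteCond %{HTTP_HOST} ^(www\.)?clientsdomain\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/clientsdomain\.com/wordpress/
      RewriteRule ^(.*)$ /clientsdomain.com/wordpress/$1 [L]

    WordPress would also need WP_HOME and WP_SITEURL in wp-config.php set to http://clientsdomain.com, so the links it generates keep the client's domain.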

    Read the article

  • Problems with Developer [closed]

    - by Concerned Client
    I engaged a developer who is building a website for me. I am not happy with him, and once the website is ready I would like to transfer the duties of further development, SEO and web administration to another developer. What do I need to be aware of, and what information do I need in terms of passwords etc.? The website has been developed in WordPress and I have access to the CMS, but I am not technical, so I am not sure whether there are separate access levels for the more technical people. Thanks.

    Read the article

  • One site being in a subdirectory of another: does Google count this against you?

    - by Mick
    I have created two similar websites (relating to monetary systems). So far, one appears to be loved by Google and the other hated, and I'm struggling to work out why. This is a mystery to me because both sites were created by me with the same design philosophy, both in pure HTML. Both are packed to the rafters with references to, and information about, their respective subjects. One issue I'm worried may be the cause is the location of the sites. I bought a web hosting package from hostmonster.com for the successful one, but the less liked one is just an "add-on" which sits in a subdirectory of the successful one. I wonder if Google somehow detects this and treats it as a less significant website? EDIT: Just to clarify, even though one site is an add-on sitting in a subdirectory of the other, the URL is arranged to look like a root: the unpopular site can be accessed directly with a simple www.myunpopularsite.com address, without specifying any subdirectory.

    Read the article

  • Software for an online collaborative bilingual/trilingual dictionary [closed]

    - by user537488
    I am looking for software that I can host on popular, general shared web hosting services (online software like WordPress, MediaWiki, Drupal etc.) and that can do the following:
    - allow users to create an account
    - allow users or anonymous visitors to add words to the dictionary (there will be English as the base language, plus other languages)
    - offer an easy way to import all the words from an English dictionary
    - let users write that language's equivalent of each English word
    - give every word its own address and page, like www.namesomething.com/word/en/software containing the word "software" and the other language's word for it
    - search should be fast and should find near matches
    - it should be able to list related words: if the user is looking at "software", other words starting with "s" like "softcopy" etc. should appear alphabetically on that page
    - anyone should be able to comment on a word, with comments not shown on the main page but on a separate page similar to a wiki talk page
    - anyone should be able to contribute
    - a clean interface, unlike a wiki (MediaWiki and all the others) - just for the words
    I tried MediaWiki and other wiki software, but they are overloaded and unclean. I am looking for an interface similar to oed.com but clean and minimal, as we are not going to have that much information - just words in English and their equivalents in the other language. We are talking here about a language which is not yet on the Internet, and it should be collaborative.

    Read the article

  • Caching preventing users seeing site updates

    - by Timmeh
    I'm experiencing a caching issue I can't explain. This is happening across browsers, IPs and ISPs. If a user force-refreshes, they see the new content. If they then refresh or return to the page, the old one displays. I've tried using headers via PHP such as:

      header( 'Expires: Sat, 26 Jul 1997 05:00:00 GMT' );
      header( 'Last-Modified: ' . gmdate( 'D, d M Y H:i:s' ) . ' GMT' );
      header( 'Cache-Control: no-store, no-cache, must-revalidate' );
      header( 'Cache-Control: post-check=0, pre-check=0', false );
      header( 'Pragma: no-cache' );

    laid out correctly, at the very beginning of the file. The problem persists. A pan-ISP proxy is unlikely. Suggestions?
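
    One quick check worth doing: request the page from the command line and inspect the headers that actually arrive, since an intermediate cache (a host-side proxy, for instance) may be rewriting or dropping them (the URL below is a placeholder):

      curl -I http://www.example.com/affected-page.php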

    Read the article
