Search Results

Search found 9717 results on 389 pages for 'pro metedor'.


  • Meta tags again. Good or bad to use them as page content?

    - by Guandalino
    From an SEO point of view, is it wise to use exactly the same page title value and keyword/description meta tag values not only as meta information, but also as page content? The example below illustrates what I mean. Thanks for any answer; best regards.

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
        <html>
        <head>
          <title>Meta tags again. Good or bad to use them as page content?</title>
          <meta name="DESCRIPTION" content="Why it is wise to use (or not) page title, meta tags description and keyword values as page content.">
          <meta name="KEYWORDS" content="seo,meta,tags,cms,content">
        </head>
        <body>
          <h1>Meta tags again. Good or bad to use them as page content?</h1>
          <h2>Why it is wise to use (or not) page title, meta tags description and keyword values as page content.</h2>
          <ul>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/seo">seo</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/meta">meta</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/tags">tags</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/cms">cms</a>
            <li><a href="http://webmasters.stackexchange.com/questions/tagged/content">content</a>
          </ul>
          <p>Read the discussion on <a href="#">webmasters.stackexchange.com</a>.
        </body>
        </html>

    Read the article

  • Can I remove visible referer from link?

    - by Andreas
    I use referer info to track which of my campaigns works best. So instead of <a href="someweb.com">someweb</a> I have a link like <a href="http://someweb.com?utm_source=john&utm_medium=email&utm_content=NAME&utm_campaign=campaign">someweb</a>. Now when a user clicks "someweb", the whole URL string is shown in the address bar. Is it possible to mask or hide this somehow, maybe via .htaccess? Thanks in advance.
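
    One hedged approach, assuming you control the landing page and the analytics snippet runs first: let the tagged URL load normally, so the campaign parameters are recorded, then tidy the address bar with history.replaceState. (An .htaccess redirect would also clean the URL, but it would strip the parameters before analytics ever saw them.)

        <script>
        // Run after the analytics snippet: swap the visible URL for the bare path.
        if (window.history && history.replaceState) {
            history.replaceState(null, document.title, window.location.pathname);
        }
        </script>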

    Read the article

  • How is this site so fast?

    - by user8628
    How is the website http://dftba.com/ so fast? When I click a link, it loads instantly. What makes it work like this, and how do I make my site work like this? Some of the objects on the site are hosted by a site called ecogeek-cdn.net. Who is this company, and why do they host the images for this site? I have been looking into this site for some time because I want my site to be like it. What I know so far:
    - the site uses Apache
    - the site uses Python (the developer told me this when asked)
    - the site uses jQuery and jQuery UI
    - the site is custom built, not WordPress
    - the site is owned/hosted by LiquidWeb
    - the site gets a million users a month
    - the site launched in January
    - the site uses cPanel
    - the site does have SSH and FTP, but only allowed from their own addresses (I tried to connect but was denied)
    Please excuse my English; it is not as good as yours.

    Read the article

  • Confirm that a DNS zone is served by a nameserver

    - by adam
    We currently have a domain with custom nameservers, and our host has their own nameservers. I'd like to switch our domain to use our host's nameservers for a while. Our host tells me that their nameservers hold a replica of our DNS zone, but I'd like to confirm this before I switch. Is there a command-line tool I can use to answer the question "does this nameserver know the DNS zone of this domain?" Hope that makes sense! Thanks, Adam
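
    A hedged sketch of one way to check: ask the host's nameserver directly for the zone's SOA record with dig (the nameserver name below is a placeholder):

        dig @ns1.yourhost.example yourdomain.com SOA +norecurse

    If the reply carries the aa (authoritative answer) flag and a sensible SOA record, that server is serving the zone; comparing the SOA serial against your current nameservers also confirms the replica is up to date.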

    Read the article

  • Passing data to PHP in Ajax [closed]

    - by MertMETIN
    I am trying to pass my data to PHP. The checkboxes carry data in their ids, and I try to read them, like this:

        <input align="right" type=checkbox name="checkArtist[]" class="checkClass" id="{$movieId}-{$mCast.id}"></li>

    Don't worry about {$movieId}-{$mCast.id}; they are Template Lite tags. I successfully read the checkbox data on the Ajax side. The problem is that I cannot send this data to my PHP. The code below is my Ajax, but on the PHP side $artistIds is always empty:

        var artistIds = new Array();
        $(".p16 input:checked").each()(function(){
            artistIds.push($(this).attr('id'));
        });
        $.post('/json/crewonly/deleteDataAjax2', { 'artistIds': artistIds }, function(response){
            if (response == 'ok')
                alert("ok");
        });

    I found a topic on Stack Overflow (http://stackoverflow.com/questions/5571646/how-to-pass-a-javascript-array-via-jquery-post-so-that-all-its-contents-are-acce) which says how to pass JS arrays:

        $.post('/url/to/page', {'someKeyName': variableName});

    This is the same as my code. What's going wrong?
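
    Two hedged observations. First, .each()(function(){ ... }) in the snippet above calls .each() with no callback and then tries to invoke its return value, so artistIds is never filled; .each(function(){ ... }) is presumably what was meant. Second, jQuery serializes the array as artistIds[], which PHP collects into a real array automatically. A minimal receiving sketch (the action path comes from the question; the delete logic is a placeholder):

        <?php
        // jQuery's $.post sends the array as artistIds[]=..., so PHP sees an array.
        $artistIds = isset($_POST['artistIds']) ? $_POST['artistIds'] : array();
        foreach ($artistIds as $id) {
            // Each id has the form "{movieId}-{castId}", per the checkbox markup.
            list($movieId, $castId) = explode('-', $id, 2);
            // ... perform the delete for this movie/cast pair ...
        }
        echo 'ok';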

    Read the article

  • 301 URL rewrite loop

    - by anyvendetta
    I need to do a 301 rewrite to force all URLs to lowercase. I put this in .htaccess (with RewriteMap lc int:tolower in httpd.conf):

        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:{REQUEST_URI}} [R=301,L]

    Everything works just fine except for URLs with subcategories, which in this case look like /category-1256-Product-page-example.html, where the number 1256 refers to a "subcategory". So when I try to access /category-1256-Product-page-example.html I get a loop error message. I think other redirect rules are causing the loop, but I don't know how to fix it, because it is just these rewrite rules that don't work with the one above:

        Rewriterule ^main-site-url/category-([0-9]*)-([-_a-zA-Z0-9]*)\.html$ /subcategories.php?idcategory_main=1&idcategory=$1&category=$2 [L]
        Rewriterule ^main-site-url/([0-9]*)-([-_a-zA-Z0-9]*)-([0-9]*)\.html$ /file.php?idcategory_main=1&idsubcategory=$1&product=$2&idproduct=$3 [L]
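
    A hedged sketch of a fix, assuming the loop comes from the lowercase redirect firing again after the internal rewrites to subcategories.php and file.php: Apache sets the REDIRECT_STATUS environment variable once an internal redirect has happened, so requiring it to be empty restricts the rule to the original client request. Note also the % before {REQUEST_URI} inside the map lookup, which the snippet above omits:

        # lowercase only the original request, never an internally rewritten one
        RewriteCond %{ENV:REDIRECT_STATUS} ^$
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:%{REQUEST_URI}} [R=301,L]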

    Read the article

  • 1 Google Analytics account for top-level domain + profiles for sub-domains vs. 1 account for each sub-domain

    - by Eric Nguyen
    We have the following websites:
    - An online magazine, Singapore edition: sg.abc.com
    - The same online magazine, Malaysia edition: my.abc.com
    - Forums around the same subjects as the online magazine, but functioning independently: forums.abc.com
    - A classifieds site, also around the same subjects, but functioning independently: directory.abc.com
    Each of the above websites currently has its own Google Analytics account; abc.com has a separate Google Analytics account too. sg.abc.com has the most traffic and generates the most revenue. Are there any practical benefits to merging all the above sub-domains under abc.com? I can think of more reliable analytics and consistency, for sure. Are there more? Cross-sales?

    Read the article

  • Domain masking (and simple page links)

    - by Halik
    How do you set up a domain (I'm using GoDaddy) to mask the server URL but still append the sub-page path? I'm thinking of something like Wikipedia's en.wikipedia.org/wiki/something (or, if that would require httpd.conf access, setting it to append the default sub-page link, e.g. '?page_id=2'). Currently I can set up the domain either to be masked completely, without showing any sub-page links, or simply to redirect my domain to my web server.
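
    A hedged .htaccess sketch of the rewrite half, assuming the domain's DNS points straight at the web server (registrar-level masked forwarding cannot expose sub-page paths, since it frames the site) and assuming a front controller that accepts the page_id parameter mentioned above (hypothetical parameter handling):

        RewriteEngine On
        # hand pretty paths to the front controller; the path stays visible in the address bar
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.+)$ /index.php?page_id=$1 [L,QSA]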

    Read the article

  • What liability concerns do advertising vendors raise, and how can I address them?

    - by Beofett
    One of the websites I administer wants to provide free advertising in the form of direct links to vendors at an event they are running. Up until now, there has been no advertising whatsoever on the site (or any of our other sites). The site is for a for-profit business. The idea of implicit endorsement of any vendors we advertise has been raised, which brought up the question of what we need to do, if anything, to protect ourselves from any potential problems such endorsement might create. I know that many sites have clauses in their Terms of Service that state that (in a nutshell) they are not responsible for any problems or grievances between the visitors to the site and any vendor advertised or linked. Are there other steps that a website typically takes when considering advertising, such as getting the advertiser to provide some sort of certification that their ad will not violate any trademarks or copyrighted material?

    Read the article

  • Should I rely on externally-hosted services?

    - by Mattis
    I am wondering about the dangers and difficulties of using external services like Google Chart on my production website. By external services I mean ones that you can't download and host on your own server.
    (-) Potentially the Google service can be down while my site is up.
    (+) I don't have to develop those particular systems for new browser technologies; hopefully Google will do that for me.
    (-) Extra latency while my site fetches the data from the Google servers.
    What else? Is it worth spending time and money to develop my own systems to be more in control of things?
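
    On the outage and latency points, a defensive-loading sketch (the element id and fallback text are placeholders; the loader URL is Google's current charts loader, which may differ from what existed when this was asked):

        <div id="chart">Loading chart...</div>
        <script>
        // Try to load the external charting library; degrade gracefully if it is unreachable.
        var s = document.createElement('script');
        s.src = 'https://www.gstatic.com/charts/loader.js';
        s.onerror = function () {
            document.getElementById('chart').textContent = 'Chart temporarily unavailable';
        };
        document.head.appendChild(s);
        </script>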

    Read the article

  • What is the best way to construct a "remove multiple items" area (ASP.NET VB) [on hold]

    - by Darkcat Studios
    Let's say, for example, I have a (variable-length) two-dimensional array of product names and their unique product codes. I can display this list in a DataGrid, table, etc. (imagine a standard shopping-basket scenario). What I need is to be able to tick multiple items, then, on clicking a submit button, fire an action. The bits I'm struggling with are:
    A: how do I programmatically display asp:CheckBoxes for each item (and give them a unique ID)?
    B: how do I know which are ticked when firing the final action?
    (Not sure if this question is best suited to the main Stack, but there's so much activity there that questions just get lost now!)
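
    A minimal WebForms sketch in the question's own stack (VB); the control names and the Code/Name fields are hypothetical. Binding a Repeater answers A, since ASP.NET generates a unique ClientID for each row's CheckBox; iterating Repeater.Items with FindControl on postback answers B:

        <asp:Repeater ID="rptProducts" runat="server">
          <ItemTemplate>
            <asp:CheckBox ID="chkSelect" runat="server" />
            <asp:HiddenField ID="hfCode" runat="server" Value='<%# Eval("Code") %>' />
            <%# Eval("Name") %>
          </ItemTemplate>
        </asp:Repeater>
        <asp:Button ID="btnRemove" runat="server" Text="Remove selected" />

        ' Code-behind: collect the product code of every ticked row on postback.
        Protected Sub btnRemove_Click(sender As Object, e As EventArgs) Handles btnRemove.Click
            For Each item As RepeaterItem In rptProducts.Items
                Dim chk = CType(item.FindControl("chkSelect"), CheckBox)
                If chk IsNot Nothing AndAlso chk.Checked Then
                    ' The unique product code travels in the paired HiddenField.
                    Dim code = CType(item.FindControl("hfCode"), HiddenField).Value
                    ' ... remove the product with this code from the basket ...
                End If
            Next
        End Sub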

    Read the article

  • How to create a good sitemap for a dynamic website

    - by Saif Bechan
    I have a website with dynamic content and different kinds of pages. Some pages rarely change, and some, like blogs, change often. The blog pages also have links for sorting, for example sorting on date, ascending and descending. On some of the pages I also have links to different tabbed content, and links that are just anchor links. When I use an XML sitemap generator, all of these links are thrown into the sitemap, and I don't think they are all really relevant. All the blog posts to date are also included. Is this really necessary? I think the links to the blog posts can be indexed just fine on their own. Is the best way to make a sitemap to assign just the main menu links manually, or is including everything really recommended?
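
    For comparison, a minimal hand-maintained sitemap along those lines (URLs are placeholders) lists only canonical pages and leaves out sort orders, tabs and anchor links, since those are just alternate views of the same content:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://example.com/</loc>
            <changefreq>monthly</changefreq>
          </url>
          <url>
            <loc>http://example.com/blog/</loc>
            <changefreq>daily</changefreq>
          </url>
        </urlset>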

    Read the article

  • SEM & AdWords: How many clicks without a sale before I should pause a keyword?

    - by Thomas Jönsson
    I wonder how many clicks I should optimally let pass through every new keyword I try in AdWords before I conclude that it's not making a profit and should be paused. It's actually four questions:
    1: At what likelihood percentile should I pause a word?
    2: How many clicks should I let through before I pause a word that does not generate any lead?
    3: How many clicks should I let through after one sale before considering the word unprofitable?
    4: Does the likelihood of the word becoming profitable affect the above?
    Conditions:
    - The clicks are normally distributed (correct?)
    - A CR of 1% is break-even; everything above is profit (1 sale / 100 clicks = break-even)
    - Cost per click (CPC) = $4
    - Margin (profit per sale) = $400
    - Payback time = 1 year
    - Average clicks per word = 0.333 per day (121 2/3 per year)
    Example: after 1 click and no sale, the keyword still has a high probability of being profitable. After 500 clicks and no sale, it has almost no likelihood of being profitable and should probably be paused. Thanks in advance!
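
    A worked sketch for question 2, using only the break-even figure above and treating each click as an independent Bernoulli trial (so the sale count is binomial rather than normal at these rates): if a keyword's true conversion rate were exactly the 1% break-even, the chance of seeing no sale at all in N clicks is

        P(no sale in N clicks | CR = 1%) = 0.99^N
        0.99^N = 0.05  =>  N = ln(0.05) / ln(0.99) ≈ 298

    So after roughly 300 clicks with no sale, you can pause with about 95% confidence that the keyword converts at or below break-even; by that standard, waiting 500 clicks is conservative.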

    Read the article

  • How do I redirect www and non-www but not the IP?

    - by Chad T Parson
    I am trying to redirect www.domain.com or domain.com to www.domain.com/temp.html. I am using the following code:

        RewriteCond %{HTTP_HOST} ^.*$
        RewriteRule ^/?$ "http\:\/\/www\.domain\.com\/temp\.html" [R=301,L]

    That works; however, I do not want to redirect the IP. If someone types in the static IP of the domain, I do not want them redirected to www.domain.com/temp.html. Does anyone have the code to take care of this?
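
    A hedged variant of the same rule: the ^.*$ condition above matches every host, so it can simply be replaced with a negative check for a bare IPv4 address in the Host header (dotted-quad only; adjust if you also serve IPv6):

        # skip requests made directly to the server's IP address
        RewriteCond %{HTTP_HOST} !^\d+\.\d+\.\d+\.\d+$
        RewriteRule ^/?$ http://www.domain.com/temp.html [R=301,L]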

    Read the article

  • Displaying Google Analytics data on my website

    - by anon-user0
    I sell ad space on my websites directly to the advertisers. I added a page where I want to show Google Analytics information that updates automatically, without me having to update it manually every day or every month. Something like this: http://wstats.net/en/website/riverplate.com#stat_trafic. I don't want to use embedded or iframed third-party services. I know Google has a public API, and you can connect it to the Google Chart API to show pretty graphs. There is a tutorial by Google on how to do it: https://developers.google.com/analytics/resources/articles/gdataAnalyticsCharts. A few problems:
    - I don't know much JavaScript.
    - The JavaScript seems to prompt for authentication as opposed to logging in automatically (from my understanding of the comments on the code).
    Does anyone know of a ready-made script that does what I am looking for, or how I can fix this code so it displays analytics info without authenticating? Thanks.

    Read the article

  • Will ranking be affected with a mobile XML sitemap for a mobile site with the same URLs as the desktop site?

    - by Emil Rasmussen
    We have a site with both a desktop version and a mobile version. Most of the content is the same, and both versions have the same URLs, but the HTML generated is device-specific. Looking at Google's recommendations for smartphone-optimized sites, one could get the impression that the mobile XML sitemap is only for sites with different URLs. Will ranking be affected, negatively or positively, if we add a mobile XML sitemap that will effectively be a duplicate of the desktop sitemap?

    Read the article

  • Web Hosting Backup/Disaster Recovery Plan - Which Company?

    - by Harry Muscle
    I've been asked to look after consolidating all of our various company websites onto one host, and also to provide a disaster recovery plan in case the chosen host goes down, goes out of business, etc. We're most likely going to go with HostGator as our chosen host; however, I'm not sure who to pick as our backup host. HostGator uses cPanel and can provide regular full backups (i.e. including configuration) of all the sites we host. Ideally I'm looking for a solution where we can provide these backups to another company, and within a short period of time they restore all the sites onto their servers and we're back up and running. The whole disaster recovery process has to be fairly straightforward in terms of what needs to be done in case I am unavailable to assist and no one else overly technical is available (i.e. "take these backup files, send them to this company, and ask them to do this"). Any suggestions on which company would be a good choice for this backup solution would be highly appreciated. Thanks, Harry

    Read the article

  • Help deploying using Capistrano to HostGator

    - by Kyle Macey
    My company uses HostGator to host our websites, and I'm having a heck of a time figuring out the final steps to get a functioning RoR app up there. I've gotten all the way to configuring Mongrel (I think?) and being able to run deploy:cold without any errors. However, I can't seem to get the app to show up in the designated cPanel area (HG says the name "current" is already reserved for another application), and I'm not sure which port I'm allowed to use. I've opened tickets with customer support, only to be told "You can't access the database with root"... totally unrelated to my question. So I think I'm in the final stretch, and if anyone has any insight or experience with HostGator, please clue me in.

    Read the article

  • Meta description not displaying in custom site search results page

    - by Stephen Connolly
    We have Google Custom Site Search implemented on our company website. Looking at the results page, I noticed that the meta description is not being displayed; it just seems to be reading the link titles from our drop-down menu and using those as a description. When I search for the same page via google.com, the meta description is pulled in correctly. Any thoughts on why this might be happening? I can't see anything in the Custom Site Search settings.

    Read the article

  • Why are Facebook profiles Google-searchable?

    - by Jose
    Facebook has around 1B user profiles, which can be found by searching in Google. However, I don't think these profiles are linked from anywhere, so how could Google discover them? As far as I know, sitemaps are not enough for that (http://webmasters.stackexchange.com/a/5151), as all URLs should be crawlable anyway. I ask because I also have a site with user profiles and would like to make them discoverable.

    Read the article

  • Highly SEO-optimised forum posts

    - by Tom Gullen
    Given the following forum post:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much into it. So I know my way around programming etc... My question is, how does construct work internally? I know it allows python scripting, which itself is "technically" interpreted, though python is pretty fast as far as being interpreted goes. But what about the rest? Is the executable that gets cre...

    The forum software will take the first 150 chars of the first post as the page meta description, and the title will be the thread title. All OK. So in Google it will appear as:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    Now the problem is (not so much with this thread, but with others) that the first 150 chars don't always make the best meta description. Is it worth my time to cherry-pick threads and manually set their description/title tags so they read like:

        Internal workings of Construct 2
        Events aren't converted to any other language. The runtime is a standalone compiled EXE application, which is optimised and actually very fast. Your events...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    The H1 on the page is still the original title, but we have overridden the title and description to look more friendly in search results. Is this advantageous, ignoring the obvious time cost?

    Read the article

  • Why does Google reject my title tag change?

    - by Michal P.
    I made a simple webpage, http://pundaquitboat.michaelspages.com/, giving it the title tag "Boat – Pundaquit", and I submitted it to the Googlebot via Google Webmaster Tools. Then I decided to change the title of the same page to "Anawangin trip" and submitted my webpage again in the same way. The result was that the new title coexisted with the old title of the same webpage in the SERPs for maybe two days. After that the new title was rejected, and if I enter site:pundaquitboat.michaelspages.com/ I can see that Google still has the old copy of my webpage with the old title in its database. This problem doesn't occur in Bing, where I can enjoy a high position for the phrase "Anawangin trip". (In Bing I hadn't submitted the old version of the title.)

    Read the article

  • Prewritten App for Used Car Dealer?

    - by Shawn Eary
    Is there somewhere I can find a prewritten web app (with database) for a used car dealer? The application would need to support the following:
    - Easy setup on a low-cost shared or cloud host
    - An easy way for potential customers to browse current inventory (cars on the lot) with suggested prices
    - An easy way for the dealership to log in and update inventory and suggested prices
    - An easy way for potential customers to send the dealership an inquiry about a specific vehicle on the lot, with CAPTCHA-style spam protection
    I prefer ASP.NET MVC and Microsoft SQL Server, but I might consider other technologies such as WebForms and LightSwitch (HTML5). I am reasonably comfortable with MVC and WebForms, but I really don't want to waste a bunch of time writing an application that might already exist. I did find a few interesting templates via Bing that seem to control CSS and layout, but I'm not sure whether they contain any business logic or whether they would integrate well into an MVC app.

    Read the article

  • Different robots.txt for two domains pointing to the same folder

    - by Ali
    Hi, I have the following two domains: domain.com and test.domain.com. Both point to the same folder, "public_html". What I want is a different robots.txt file for each domain, so that browsing domain.com/robots.txt shows one file and test.domain.com/robots.txt shows another. How can I do this using URL rewriting in .htaccess? Thanks
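
    A minimal sketch, assuming both files sit in public_html: the real robots.txt serves domain.com, and robots-test.txt (a hypothetical name) is swapped in for the subdomain:

        RewriteEngine On
        # serve an alternate robots file when the request arrives via the test subdomain
        RewriteCond %{HTTP_HOST} ^test\.domain\.com$ [NC]
        RewriteRule ^robots\.txt$ /robots-test.txt [L]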

    Read the article

  • Unindexing my Tumblr blog's content and moving it to another Tumblr blog

    - by sam
    I've been writing a Tumblr blog for the past year or so and have written about 300 articles, but now I need to move the blog to another site (before, it was running under blog.mysite.com, and I now want it to run under blog.my*new*site.com). I want to keep the archived articles and have them on the new site, so what I was hoping to do was export the blog from Tumblr, go into Google Webmaster Tools and remove all the blog's indexed URLs, then make a new Tumblr blog and import the posts. Would Google see this as new content, since I've deleted its indexed copy? Could I just move the mapping of the Tumblr blog to the new subdomain? In doing that I would lose all the PR, and it would still look like duplicate content. What's the best way to approach this?

    Read the article
