Search Results

Search found 806 results on 33 pages for 'bourax webmaster'.

Page 11/33 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >

  • On which page(s) to add canonical?

    - by user6211
    I have two pages with the same content and the same meta title and meta description. They also have very similar URLs: http://www.mysite.com/new-york and http://www.mysite.com/new_york. I need the first link to be the "official" one. To avoid duplicate pages, I want to add a canonical tag in the header... but on which page? Does it have to be on both of them, or only on the second? Or only on the first? Can you give me some advice, please?

    Read the article

  • How to improve a single-paged site search result [closed]

    - by Trisism
    Possible Duplicate: How to SEO a Single-Page website. I created an online CV a couple of weeks ago and it has had quite a few visits. Now I want to improve the chance that it will appear in Google search results; however, my web CV is a one-page site and it contains only internal links (those with a hash #), so I can't really submit a sitemap. I could have changed the internal links to normal links processed on the server side, but there's no point in doing so. I'm very new to web SEO, so I would really appreciate it if somebody could show me what I should do for a single-page site with internal links to be effectively indexed by crawlers.

    Read the article

  • How do I make the home page of the website rank higher than the internal pages? [closed]

    - by Shahab
    Possible Duplicate: What are the best ways to increase your site's position in Google? Suppose I have a website, e.g. www.example.com, that comes in at number 6 in the Google search rankings, but internal pages of the website, i.e. www.example.com/index.php?a=1&b=2 or www.example.com/index.php, come in at number 2. How would I make my prime domain name, www.example.com, come out at the top of the list? Any guidance would be appreciated.

    Read the article

  • Fault-tolerant uploading tool

    - by andersonbd1
    I'm setting up a WordPress site for a friend who has some fairly large audio files (150 MB). He's on a bad connection, and I'd guess it'll take him a while to upload those files the normal WordPress way. I'm looking for a tool that I can install on the server that allows uploads and is also fault tolerant... for example, if he loses his connection, or power, or whatever, it'll pick up where it left off. I realize web technologies probably don't do that, but perhaps Flash or something? Any ideas?
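
    For illustration, a minimal sketch in Python of the resumable, chunked upload the question is after. The endpoint URL and the Upload-Offset header are assumptions (they mimic how the tus resumable-upload protocol reports how many bytes the server already holds); this is not an existing WordPress API.

        import os
        import time
        import requests  # third-party HTTP client

        CHUNK = 1024 * 1024  # send 1 MB at a time

        def resumable_upload(path, url):
            size = os.path.getsize(path)
            # Ask the (hypothetical) endpoint how much it already has, so an
            # interrupted upload resumes instead of starting over.
            offset = int(requests.head(url).headers.get("Upload-Offset", 0))
            with open(path, "rb") as f:
                while offset < size:
                    f.seek(offset)              # always restart from the last confirmed offset
                    chunk = f.read(CHUNK)
                    headers = {"Content-Range": "bytes %d-%d/%d" % (offset, offset + len(chunk) - 1, size)}
                    try:
                        requests.put(url, data=chunk, headers=headers, timeout=30).raise_for_status()
                        offset += len(chunk)    # chunk confirmed, move on to the next one
                    except requests.RequestException:
                        time.sleep(5)           # connection dropped: wait, then retry the same chunk

        resumable_upload("episode.mp3", "https://example.com/uploads/episode.mp3")

    In practice a ready-made resumable-upload protocol (tus, for example) or SFTP with resume support covers the same ground without custom code.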

    Read the article

  • Does a large (hidden) submenu count towards site content in terms of determining page similarities?

    - by Name
    Basically, I have this site that recently lost a lot of traffic after I optimized the HTML; the exact reasons are uncertain. The graph of impressions (the number of times a page appears in search listings) keeps going down like an e^-x function. Because the content, which previously occupied five pages of tables, now fits within a few paragraph tags, the menu now occupies about 80% of the live HTML code, and I am starting to wonder whether this affects the "similar pages" factor that Google punishes. Questions: As far as I know, Google ignores invisible material, and the submenus are only visible when hovered over. Has anything at all changed in this area? If I AJAX in the submenus, leaving only the main eight menu items to load, will I be punished for "hiding" information? Is the idea worth testing, or is it frankly a non-starter?

    Read the article

  • Googlebot DNS errors with HostPapa

    - by Gravy
    I received a message from Google Webmaster Tools: "Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 40.0%. You can see more details about these errors in Webmaster Tools. Recommended action: [...]" I contacted HostPapa and they deny that there is any issue with the site or server! Support in terms of what I can actually do to resolve this issue is non-existent! The site is currently online, and I don't know much about DNS... so any advice about what I can do to resolve this problem would be much appreciated.
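
    For illustration, a quick way to check from another machine whether the domain resolves consistently, a minimal sketch in Python (the domain is a placeholder). Lookups that fail intermittently here, while the host's own control panel looks fine, point at the authoritative DNS servers rather than the web server itself.

        import socket
        import time

        DOMAIN = "example.com"  # placeholder: the affected site's domain

        # A handful of lookups spaced out over time; an intermittent DNS problem
        # (like the 40% error rate Googlebot reports) often only shows up this way.
        for attempt in range(10):
            try:
                print(attempt, socket.gethostbyname(DOMAIN))
            except socket.gaierror as err:
                print(attempt, "lookup failed:", err)
            time.sleep(5)

    Running the same check from a couple of different networks, or querying the domain's authoritative name servers directly with a tool such as dig, helps show whether the problem sits with the DNS hosting rather than the site.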

    Read the article

  • I am the Webmaster now. Where do I start? [closed]

    - by John C
    I just changed jobs and will soon be in charge of a custom-built ASP.NET CMS and website for a fairly large corporation with global offices. I have IT and developer FTE resources available to me, but I am trying to build a list of branding, project, and functionality points to review. What guides or lists can or should I use to evaluate this website before I begin adding features, creating new projects, or even redesigning and redeveloping the site? (Background: I have been a webmaster/designer/developer for small WordPress/Drupal sites for 10 years, and an unofficial webmaster (director/content manager) for a large site for 3 years, with no direct control over SharePoint administration, IIS, or hosting, but responsibility for everything else: analytics, email, advertising, social, SEO, etc.) Thank you!

    Read the article

  • Google says: Sort parameters in URLs are problematic

    - by feklee
    From Google's recommendations for URL structure: "Sorting parameters. Some large shopping sites provide multiple ways to sort the same items, resulting in a much greater number of URLs. For example: http://www.example.com/results?search_type=search_videos&search_query=tpb&search_sort=relevance&search_category=25" When linking from outside, having URLs that differ only by sort parameters is obviously a bad idea: Google will not understand that these links point to the same item, i.e. that the item is popular, so the ranking will be lower than it should be. But what's the alternative? Using a fragment identifier (#) and then doing the sorting in JavaScript? What else? Some setting in Webmaster Tools?
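
    For illustration, one common approach is to keep the sort parameters in the URL but declare a rel="canonical" link that points at the URL with those parameters stripped, so all sort orders consolidate on one address. A minimal sketch of the stripping step in Python; treating search_sort as the only sort-only parameter comes from the example above, and which parameters qualify is up to the site:

        from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

        SORT_PARAMS = {"search_sort"}  # parameters that only change ordering, not content

        def canonical_url(url):
            """Return the URL with sort-only query parameters removed."""
            parts = urlparse(url)
            query = [(k, v) for k, v in parse_qsl(parts.query) if k not in SORT_PARAMS]
            return urlunparse(parts._replace(query=urlencode(query)))

        print(canonical_url("http://www.example.com/results?search_type=search_videos"
                            "&search_query=tpb&search_sort=relevance&search_category=25"))
        # -> http://www.example.com/results?search_type=search_videos&search_query=tpb&search_category=25

    Webmaster Tools also offers a URL Parameters setting where a parameter can be marked as not changing page content, which tackles the same issue from the crawler's side.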

    Read the article

  • Why do 410 pages show as errors in Google Webmaster Tools?

    - by ElHaix
    To remove links from our site, we return a 410 status code for the links we want removed and show the message "The page you requested was removed." In Webmaster Tools, I see all the 410 pages under Crawl Errors / Not Found. I'm worried that, because they appear in Crawl Errors, they could be negatively affecting SEO rankings. Is that the case, and if so, should I change the return code from 410 to something else?

    Read the article

  • Is it ever a bad idea to publish a sitemap for a blog?

    - by mipadi
    I have a blog, and I have been considering publishing a sitemap for it, which would include the index page, archives page, and an entry for each individual blog post. Is this ever a bad idea? Is it a good (or useful) idea? I'm particularly interested in the <changefreq> element: I edit posts from time to time, and while that's not a common occurrence, I don't want to set a particularly infrequent change frequency that prevents search engines like Google from indexing the edits. (The sitemaps protocol says that search engines may still crawl the pages more frequently, but has no further details on the matter.)
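
    For illustration, a single sitemap <url> entry built with Python's standard library; the URL, date, and the monthly value are placeholders. As the question notes, the protocol treats <changefreq> as a hint only, so even an infrequent value does not stop a search engine from recrawling an edited post sooner; keeping <lastmod> accurate on edits is generally the more useful signal.

        import xml.etree.ElementTree as ET

        urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = "http://blog.example.com/2012/03/some-post/"  # placeholder URL
        ET.SubElement(url, "lastmod").text = "2012-03-14"   # date of the last edit
        ET.SubElement(url, "changefreq").text = "monthly"   # a hint, not a limit

        ET.ElementTree(urlset).write("sitemap.xml", xml_declaration=True, encoding="UTF-8")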

    Read the article

  • How can I insert a new sitemap with the Google GData API? It returns 400 Bad Request

    - by wingoo
    I am trying to insert a new sitemap into Google using the API, but I can't do it successfully. This is the method:

        var fullDomainUrl = "http://www.example.com/";
        var entry = new SitemapsEntry();
        entry.Id = new AtomId(fullDomainUrl + "sitemap.xml");
        entry.Categories.Add(new AtomCategory("http://schemas.google.com/webmasters/tools/2007#site-info", new AtomUri("http://schemas.google.com/g/2005#kind")));
        entry.SitemapType = "WEB";
        myService.Insert(new Uri(string.Format("https://www.google.com/webmasters/tools/feeds/{0}/sitemaps/", HttpUtility.UrlEncode(fullDomainUrl))), entry);

    This returns a 400 Bad Request, so I tried another method:

        var settings = new RequestSettings("TesterApp1", domain.GoogleAuthToken, CommonService.GetRsaPrivateKey(Context));
        var request = new WebmasterToolsRequest(settings);
        var sitemap = new Sitemap();
        sitemap.Id = fullDomainUrl + "sitemap.xml";
        sitemap.Categories.Add(new AtomCategory("http://schemas.google.com/webmasters/tools/2007#site-info", new AtomUri("http://schemas.google.com/g/2005#kind")));
        sitemap.SitemapType = "WEB";
        //request.AddSitemap(fullDomainUrl, sitemap);
        request.Insert(new Uri(string.Format("https://www.google.com/webmasters/tools/feeds/{0}/sitemaps/", HttpUtility.UrlEncode(fullDomainUrl))), sitemap);

    This also returns a 400 Bad Request. I then tried using HttpWebRequest to POST the Atom entry to Google, but that also returns a 400 Bad Request. I can insert and update a site successfully, but I can't insert a new sitemap. Can anyone share working .NET code for this?

    Read the article

  • Can preventing directory listings in WordPress upload folders cause Google ranking drops when they cause 403 errors in Webmaster Tools?

    - by Kelly
    I recently moved to a new host that blocks crawling of my uploads folders but (hopefully) still allows the files in those folders to be crawled. I now see many 403 errors in my Webmaster Tools, one for each folder under the uploads folder. For example, http://www.rewardcharts4kids.com/wp-content/uploads/2013/07/ shows a 403 error, yet I can access a file inside it, http://www.rewardcharts4kids.com/wp-content/uploads/2013/07/lunch-box-notes.jpg, just not the folder itself. My rankings went down after I moved to this host and I am wondering: could this be the reason? Is this how files and folders are supposed to be set up?
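
    For illustration, a minimal sketch in Python that confirms the pattern described above using the URLs from the question: the bare folder URL should return 403 (directory listing disabled) while the file inside it still returns 200 and stays crawlable. When the files themselves answer with 200, blocking directory listings is normal hardening, and the 403s reported for the folder URLs are generally nothing a search engine needs to index anyway.

        import requests  # third-party HTTP client

        folder = "http://www.rewardcharts4kids.com/wp-content/uploads/2013/07/"
        file_in_folder = folder + "lunch-box-notes.jpg"

        for url in (folder, file_in_folder):
            # swap in requests.get if the server answers HEAD differently
            r = requests.head(url, allow_redirects=True, timeout=10)
            print(r.status_code, url)
        # Expected: 403 for the folder (listing blocked), 200 for the file (still reachable).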

    Read the article

  • How to add the sitemap of a Blogger blog to Google Webmaster Tools?

    - by Chankey Pathak
    I have two blogs, let's say http://name.blogspot.com/ and http://name2.blogspot.com/. For blog 1: while submitting the sitemap to Google Webmaster Tools, I selected the option and then appended rss.xml at the end (http://name.blogspot.com/rss.xml). The sitemap was added successfully. For blog 2: I followed the same procedure, but it didn't work. Then I tried opening the URL (http://name.blogspot.com/rss.xml), and it showed the Atom feed. I tried the same with the other blog, but that URL redirects to the FeedBurner feed. I think this is the reason why the other blog's sitemap is not getting accepted. Help me with it.

    Read the article

  • Postfix: relay access denied

    - by kfa
    Since I can't find a solution that works with my config, I'm leaning on you guys to help me out with this. I've installed Postfix and Dovecot on a CentOS server. Everything is running well, but when I try to send an e-mail from Outlook to a TLD that is not .com, the server returns: Relay access denied. Here's the result of the postconf -n command:

        alias_database = hash:/etc/aliases
        alias_maps = hash:/etc/aliases
        command_directory = /usr/sbin
        config_directory = /etc/postfix
        daemon_directory = /usr/libexec/postfix
        data_directory = /var/lib/postfix
        debug_peer_level = 2
        home_mailbox = Maildir/
        html_directory = no
        inet_protocols = all
        mailbox_size_limit = 104857600
        mailq_path = /usr/bin/mailq.postfix
        manpage_directory = /usr/share/man
        message_size_limit = 20971520
        mydestination = $myhostname, $mydomain, localhost, localhost.$mydomain
        newaliases_path = /usr/bin/newaliases.postfix
        readme_directory = /usr/share/doc/postfix-2.6.6/README_FILES
        sample_directory = /usr/share/doc/postfix-2.6.6/samples
        sendmail_path = /usr/sbin/sendmail.postfix
        setgid_group = postdrop
        smtp_tls_loglevel = 3
        smtpd_tls_auth_only = yes
        smtpd_tls_cert_file = /etc/postfix/mailserver.pem
        smtpd_tls_key_file = /etc/postfix/mailserver.pem
        smtpd_tls_received_header = yes
        smtpd_tls_security_level = encrypt
        smtpd_tls_session_cache_timeout = 3600s
        tls_random_source = dev:/dev/urandom
        unknown_local_recipient_reject_code = 550

    Here's the maillog error:

        Nov 23 13:26:24 website_name postfix/smtpd[16391]: extract_addr: input: <mrm@website_name.com>
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: smtpd_check_addr: addr=mrm@website_name.com
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: ctable_locate: move existing entry key mrm@website_name.com
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: extract_addr: in: <mrm@website_name.com>, result: mrm@website_name.com
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: fsspace: .: block size 4096, blocks free 23679665
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: smtpd_check_queue: blocks 4096 avail 23679665 min_free 0 msg_size_limit 20971520
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: > unknown[178.193.xxx.xxx]: 250 2.1.0 Ok
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: < unknown[178.193.xxx.xxx]: RCPT TO:<[email protected]>
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: extract_addr: input: <[email protected]>
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: smtpd_check_addr: [email protected]
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: ctable_locate: move existing entry key [email protected]
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: extract_addr: in: <[email protected]>, result: [email protected]
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: >>> START Recipient address RESTRICTIONS <<<
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: generic_checks: name=permit_sasl_authenticated
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: generic_checks: name=permit_sasl_authenticated status=0
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: generic_checks: name=reject_unauth_destination
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: reject_unauth_destination: [email protected]
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: permit_auth_destination: [email protected]
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: ctable_locate: leave existing entry key [email protected]
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: NOQUEUE: reject: RCPT from unknown[178.193.xxx.xxx]: 554 5.7.1 <[email protected]>: Relay access denied; from=<mrm@website_name.com> to=<[email protected]> proto=ESMTP helo=<[192.168.1.38]>
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: generic_checks: name=reject_unauth_destination status=2
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: > unknown[178.193.xxx.xxx]: 554 5.7.1 <[email protected]>: Relay access denied
        Nov 23 13:26:24 website_name postfix/smtpd[16391]: smtp_get: EOF

    What's wrong with this?

    UPDATE: added to main.cf:

        broken_sasl_auth_clients = yes
        smtpd_recipient_restrictions = permit_mynetworks permit_sasl_authenticated
        smtpd_sasl_auth_enable = yes
        smtpd_sasl_path = private/auth
        smtpd_sasl_security_options = noanonymous noplaintext
        smtpd_sasl_tls_security_options = $smtpd_sasl_security_options
        smtpd_sasl_type = dovecot

    UPDATE: EHLO output:

        EHLO mail.perflux.com
        250-perflux.com
        250-PIPELINING
        250-SIZE 20971520
        250-VRFY
        250-ETRN
        250-STARTTLS
        250-ENHANCEDSTATUSCODES
        250-8BITMIME
        250 DSN

    Read the article

  • Postfix certificate verification failed for smtp.gmail.com

    - by Andi Unpam
    I have a problem: my email server uses Postfix with the Gmail SMTP relay (a Google Apps account), but it keeps failing with "SASL authentication failed". I send email from a PHP script; the error logs suggest a wrong password, but after I open the URL from the browser and pass the CAPTCHA verification, Postfix can send again. Then, 2-3 days later, the same thing happens. This is my Postfix config:

        #myorigin = /etc/mailname
        smtpd_banner = Hostingbitnet Mail Server
        biff = no
        append_dot_mydomain = no
        readme_directory = no
        myhostname = webmaster.hostingbitnet.com
        alias_maps = hash:/etc/aliases
        alias_database = hash:/etc/aliases
        myorigin = /etc/mailname
        mydestination = localhost, webmaster.hostingbitnet.com, localhost.localdomain, 103.9.126.163
        relayhost = [smtp.googlemail.com]:587
        relay_transport = relay
        relay_destination_concurrency_limit = 1
        mynetworks = 127.0.0.0/8, 192.168.0.0/16, 172.16.0.0/16, 10.0.0.0/8, 103.9.126.0/24
        mailbox_size_limit = 0
        recipient_delimiter = +
        inet_interfaces = all
        default_transport = smtp
        relayhost = [smtp.gmail.com]:587
        smtp_sasl_auth_enable = yes
        smtp_sasl_password_maps = hash:/etc/postfix/google-apps
        smtp_sasl_security_options = noanonymous
        smtp_use_tls = yes
        smtp_sender_dependent_authentication = yes
        tls_random_source = dev:/dev/urandom
        default_destination_concurrency_limit = 1
        smtp_tls_CAfile = /etc/postfix/tls/root.crt
        smtp_tls_cert_file = /etc/postfix/tls/cert.pem
        smtp_tls_key_file = /etc/postfix/tls/privatekey.pem
        smtp_tls_session_cache_database = btree:$data_directory/smtp_tls_session_cache
        smtp_tls_security_level = may
        smtp_tls_loglevel = 1
        smtpd_tls_CAfile = /etc/postfix/tls/root.crt
        smtpd_tls_cert_file = /etc/postfix/tls/cert.pem
        smtpd_tls_key_file = /etc/postfix/tls/privatekey.pem
        smtpd_tls_session_cache_database = btree:$data_directory/smtpd_tls_session_cache
        smtpd_tls_security_level = may
        smtpd_tls_loglevel = 1
        #secure
        smtpd_recipient_restrictions = permit_mynetworks,permit_sasl_authenticated,check_client_access hash:/var/lib/pop-before-smtp/hosts,reject_unauth_destination

    Log from mail.log:

        Oct 30 14:51:13 webmaster postfix/smtp[9506]: Untrusted TLS connection established to smtp.gmail.com[74.125.25.109]:587: TLSv1 with cipher RC4-SHA (128/128 bits)
        Oct 30 14:51:15 webmaster postfix/smtp[9506]: 87E2739400B1: SASL authentication failed; server smtp.gmail.com[74.125.25.109] said: 535-5.7.1 Please log in with your web browser and then try again. Learn more at?535 5.7.1 https://support.google.com/mail/bin/answer.py?answer=78754 ix9sm156630pbc.7
        Oct 30 14:51:15 webmaster postfix/smtp[9506]: setting up TLS connection to smtp.gmail.com[74.125.25.108]:587
        Oct 30 14:51:15 webmaster postfix/smtp[9506]: certificate verification failed for smtp.gmail.com[74.125.25.108]:587: untrusted issuer /C=US/O=Equifax/OU=Equifax Secure Certificate Authority
        Oct 30 14:51:16 webmaster postfix/smtp[9506]: Untrusted TLS connection established to smtp.gmail.com[74.125.25.108]:587: TLSv1 with cipher RC4-SHA (128/128 bits)
        Oct 30 14:51:17 webmaster postfix/smtp[9506]: 87E2739400B1: to=<[email protected]>, relay=smtp.gmail.com[74.125.25.108]:587, delay=972, delays=967/0.03/5.5/0, dsn=4.7.1, status=deferred (SASL authentication failed; server smtp.gmail.com[74.125.25.108] said: 535-5.7.1 Please log in with your web browser and then try again. Learn more at?535 5.7.1 https://support.google.com/mail/bin/answer.py?answer=78754 s1sm3850paz.0)
        Oct 30 14:51:17 webmaster postfix/error[9508]: B3960394009D: to=<[email protected]>, orig_to=<root>, relay=none, delay=29992, delays=29986/5.6/0/0.07, dsn=4.7.1, status=deferred (delivery temporarily suspended: SASL authentication failed; server smtp.gmail.com[74.125.25.108] said: 535-5.7.1 Please log in with your web browser and then try again. Learn more at?535 5.7.1 https://support.google.com/mail/bin/answer.py?answer=78754 s1sm3850paz.0)

    By the way, I created the certificate following the guide at http://koti.kapsi.fi/ptk/postfix/postfix-tls-cacert.shtml and it worked, but after 2-3 days the invalid-SASL problem comes back: I'm required to log in with a browser and enter the CAPTCHA there, and after that the server can send emails again from telnet or a PHP script, but it runs into the same trouble 2-3 days later. My question is: how do I make the certificate setup permanent? Thanks and greetings.

    Read the article

  • Send email from server to Google Apps email address (same domains)

    - by Orlando
    I'm sending email from a server, let's say domain.com. I also have Google Apps email set up for hosted email on the same domain, domain.com. If mail is sent to me from anywhere else, I receive it just fine. However, if the email originates from my server, it just ends up in /var/mail/root as a delivery error saying the user is unknown. I created a user on the server for the name that is having trouble, [email protected]. Retried sending, and it sends, but not to my hosted email at Google Apps; I just receive it in /var/mail/webmaster now. I'm using sendmail. I messed around with /etc/aliases, but adding webmaster: [email protected] looked useless (and I was right). Any help?

    Read the article

  • Handling time delays in an HTTP request until a response arrives

    - by bourax webmaster
    I have an application that calls a function to send a JSON object to a REST API. My problem is: how can I handle time delays and repeat this function until I have a response from the server, in case of an interrupted network connection? I tried to use a Handler, but I don't know how to stop it when I get a response. Here's my function, which is called when the button is clicked:

        protected void sendJson(final String name, final String email, final String homepage,
                                final Long unixTime, final String bundleId) {
            Thread t = new Thread() {
                public void run() {
                    Looper.prepare(); // For preparing the message pool for the child thread
                    HttpClient client = new DefaultHttpClient();
                    HttpConnectionParams.setConnectionTimeout(client.getParams(), 10000); // Timeout limit
                    HttpResponse response;
                    JSONObject json = new JSONObject();
                    // creating meta object
                    JSONObject metaJson = new JSONObject();
                    try {
                        HttpPost post = new HttpPost("http://util.trademob.com:5000/cards");
                        metaJson.put("time", unixTime);
                        metaJson.put("bundleId", bundleId);
                        json.put("name", name);
                        json.put("email", email);
                        json.put("homepage", homepage);
                        // add the meta in the root object
                        json.put("meta", metaJson);
                        StringEntity se = new StringEntity(json.toString());
                        se.setContentType(new BasicHeader(HTTP.CONTENT_TYPE, "application/json"));
                        post.setEntity(se);
                        String authorizationString = "Basic " + Base64.encodeToString(
                                ("tester" + ":" + "tm-sdktest").getBytes(), Base64.NO_WRAP); // Base64.NO_WRAP flag
                        post.setHeader("Authorization", authorizationString);
                        response = client.execute(post);
                        String temp = EntityUtils.toString(response.getEntity());
                        Toast.makeText(getApplicationContext(), temp, Toast.LENGTH_LONG).show();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                    Looper.loop(); // Loop in the message queue
                }
            };
            t.start();
        }
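
    For illustration, the retry-with-backoff pattern the question is asking about, sketched in Python (requests stands in for the Apache HttpClient used above; the payload shown is a placeholder). The same structure, a bounded loop around client.execute(post) that sleeps and doubles the delay on each network exception, answers the "how do I stop it" part: returning as soon as any response arrives is the stop.

        import time
        import requests  # third-party HTTP client; stands in for the Android HttpClient above

        def post_with_retry(url, payload, max_attempts=5):
            """POST payload as JSON, retrying with exponential backoff until a response arrives."""
            delay = 1  # seconds before the first retry
            for attempt in range(1, max_attempts + 1):
                try:
                    return requests.post(url, json=payload, timeout=10)  # any response ends the loop
                except requests.RequestException:
                    if attempt == max_attempts:
                        raise  # give up after the last attempt
                    time.sleep(delay)
                    delay *= 2  # back off: 1 s, 2 s, 4 s, ...

        # Placeholder usage; the same loop would wrap client.execute(post) inside the thread above.
        response = post_with_retry("http://util.trademob.com:5000/cards",
                                   {"name": "tester", "meta": {"time": 0}})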

    Read the article

  • Website still blocked after hack

    - by dotman14
    I manage a website that was hacked a few months ago (I wasn't the webmaster then); it was running on Joomla. I have managed to redo the website with custom code (PHP/MySQL), but some visitors still complain that their antivirus blocks them from viewing the website. I have also cleared the former database and anything related to it, contents and the like. My website is here. I have looked for malware in Google Webmaster Tools, but it says there is none, and I have also checked with Google Safe Browsing. Please, what could the problem be?

    Read the article

  • The Best Way to Build Backlinks - A List of 36 Sites to Get Backlinks

    Every webmaster understands the importance of backlinks. We need backlinks to rank our sites higher in Google and other search engines: search engines count the number of backlinks to a web page and assign it a rank in search results accordingly. Hence, every webmaster is always looking to get as many backlinks as possible. In this article I explain a few free methods of getting links.

    Read the article

  • Adobe releases Project Parfait, an online tool to extract the CSS from your PSD

    Adobe Project Parfait: extract the CSS from your PSD. Adobe has produced a fine tool that promises to be of real service to webmasters, making it possible to extract the CSS code from a PSD file in the browser. In the browser? That means the current (beta) version is free and does not require owning Photoshop. Picture a webmaster who has received a page mockup as a PSD: they simply upload it to Project Parfait to extract the CSS. The intuitive interface...

    Read the article

  • No description for any page on the website is available in Google despite robots.txt allowing crawling

    - by Abhijit
    I seem to have the weirdest issue with search engine optimization. I have asked the IT folks at my university, I have asked people on the Joomla forums, and I have been trying to sort this out with Google Webmaster Tools for more than 2 months, to little avail. I want to know if I have some blatantly wrong configuration somewhere that is causing search engines to be unable to index this site. I noticed a similar issue with another website I searched for online (ECEGSA - The University of British Columbia at gsa.ece.ubc.ca), making me believe this might be a concern other people are looking for an answer to. Here are the details: the website in question is http://gsa.ece.umd.edu/. It runs the latest Joomla 2.5.x. The site has been up since around mid December of 2013, and I noticed right from the get-go that it was not being indexed correctly on Google. Specifically, I see the following message when I search for the website on Google: "A description for this result is not available because of this site's robots.txt – learn more." The thing is, from December until around March I used the default Joomla robots.txt file, which is:

        User-agent: *
        Disallow: /administrator/
        Disallow: /cache/
        Disallow: /cli/
        Disallow: /components/
        Disallow: /images/
        Disallow: /includes/
        Disallow: /installation/
        Disallow: /language/
        Disallow: /libraries/
        Disallow: /logs/
        Disallow: /media/
        Disallow: /modules/
        Disallow: /plugins/
        Disallow: /templates/
        Disallow: /tmp/

    Nothing there should stop Google from searching my website. Even more confusingly, in Google Webmaster Tools, under the "Blocked URLs" tab, when I try many of the links on the site, they all show up as "Allowed". I then tried adding a sitemap and putting it in the robots.txt file. That did not help: same exact search result, same behavior in the "Blocked URLs" tab. Additionally, the "Sitemaps" tab reports an error for several links saying "URL is robotted out"; I tried those exact links in "Blocked URLs" and they are allowed! I then tried deleting the robots.txt file. No use: same exact problem. Here is an example screenshot from Google's Webmaster Tools (screenshot not shown). At this point I cannot give a rational explanation of why this is happening, and neither can anyone in the IT department here. No one on the Joomla forums seems to understand what is going on. Based on what I explained, does it seem that I have somehow configured something in robots.txt, .htaccess, or somewhere else incorrectly?
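
    For illustration, a quick cross-check with Python's standard library that asks the same question the "Blocked URLs" tab does: whether the robots.txt actually being served blocks Googlebot from a given URL. If this prints True while Google still shows the "blocked by robots.txt" snippet, one common explanation is that Google is still working from a cached copy of an older robots.txt and the snippet will only update after a recrawl.

        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser()
        rp.set_url("http://gsa.ece.umd.edu/robots.txt")  # the live robots.txt from the question
        rp.read()

        # Does the served robots.txt allow Googlebot to fetch the front page?
        print(rp.can_fetch("Googlebot", "http://gsa.ece.umd.edu/"))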

    Read the article

  • Structured data: Field missing: price [on hold]

    - by Handi Occasion
    I just set up my site to improve it with structured data (microdata). After finishing the markup, I checked the result of my work via Webmaster Tools, and at first the data appeared to be taken into account (see here). Today I had a look through Webmaster Tools - Structured Data, and then, surprise: around 50 of my ads show the error "Field missing: price", even though the price is there! Any idea? Thank you.

    Read the article
