Search Results


  • SEO for site with 301 redirect on root domain to subfolder

    - by Kim
    I've been asked to do SEO for a site. The site is built with WordPress and PrestaShop, and because of this the root domain has a 301 redirect to a subfolder - domain/shop/. For my SEO submission work, I know it's not good practice to submit URLs that have redirects on them, and a lot of the time it's not allowed. After searching the net, I think my best bet is to do all my site submissions using the URL domain/shop/, even though it will take a lot more listings to get them up in the rankings compared to using the root domain. I'm still not sure how it will work: the root domain carries the most rank and then passes rank to the rest of the site, so if I'm targeting the subfolder, will it still work?
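    For reference, a root-to-subfolder redirect like the one described is usually just a single rule in the site's Apache config or .htaccess; this is a minimal sketch assuming Apache with mod_alias, with the domain name as a placeholder:

        # Send requests for the bare root to the shop subfolder with a permanent (301) redirect
        RedirectMatch 301 ^/$ http://www.domain.com/shop/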


  • Which kind of sitemap directory should I build for a search based navigation site

    - by Noam
    I have a search-based navigation website. Each query has filters as well as a sort-by option. The search results point to end pages inside the site, and each of those pages has many outlinks to other end pages. Currently I have an XML sitemap which directs crawlers to all the end pages. I'm trying to add a silo sitemap directory to improve SEO. Assuming this is the right direction, I have a couple of options: end pages sorted alphabetically; pages grouped by the major search filters and then divided alphabetically; or pages for every filter plus the cross-combinations between them and the sort-by. Which would you recommend and why?
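    Whichever grouping is chosen, a silo sitemap directory is usually implemented as a sitemap index file that points to one child sitemap per silo; a minimal sketch with hypothetical filter-based sitemap names:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <!-- one child sitemap per silo, e.g. per major search filter -->
          <sitemap>
            <loc>http://www.example.com/sitemaps/filter-location.xml</loc>
            <lastmod>2012-06-01</lastmod>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemaps/filter-category.xml</loc>
            <lastmod>2012-06-01</lastmod>
          </sitemap>
        </sitemapindex>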


  • Whois status "pending delete" with expiration date in November 2011???

    - by Sylver
    A friend of mine is in the process of being scammed by a domain registrar and I am trying to sort out the mess, but I could use a hand understanding some of the details. He paid for 2 years of domain name registration on 6 November 2009. The whois record reads:
        Domain ID:XXXXXXXXXX
        Domain Name:XXXXXXXXX.ORG
        Created On:06-Nov-2009 09:23:12 UTC
        Last Updated On:17-Dec-2010 00:15:10 UTC
        Expiration Date:06-Nov-2011 09:23:12 UTC
        Sponsoring Registrar:OnlineNIC Inc. (R64-LROR)
        Status:CLIENT TRANSFER PROHIBITED
        Status:HOLD
        Status:PENDING DELETE SCHEDULED FOR RELEASE
        Registrant ID:ONLC-XXXXXXX-X
        Registrant Name:My friend's name
        ...
        Registrant Email:Old email
    The registrar charged a renewal fee a week ago and is now asking an extra $150 to "reclaim" the domain name, even though the domain name is apparently still in my friend's name and there are still about 10 months left before the expiry date. The expiration date on the whois record looks right (Nov 2011), so I don't understand why the domain status says "PENDING DELETE SCHEDULED FOR RELEASE". Can someone explain what the deal is and what I need to do to get the domain name transferred to a more honest registrar? I already have a registrar for my own domain names - I've been using them for 10 years without problems - so I know where to transfer the domain names to, I just don't know how to proceed.


  • How to verify real people?

    - by Gerben Jacobs
    For a music community, I want bands to be able to verify themselves. What is the best way to do this? For example I could let the record label mail me, but some bands are independent. I could also ask them to put a 'code' on their website or Facebook page and then check it manually. I'm not necessarily looking for a watertight solution, so no scanning of real-life documents, and I'm okay with doing the checks manually. In other words, how can you verify real people through their virtual presence?
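    One low-effort pattern for the manual check is to issue each band a random code, ask them to paste it somewhere on their official site or Facebook page, and then confirm it is actually there. A rough Python sketch of that idea (the URL and the surrounding workflow are hypothetical, not a finished system):

        import secrets
        import urllib.request

        def issue_code():
            # Generate a short random token to hand to the band out-of-band
            return secrets.token_hex(8)

        def code_is_published(page_url, code):
            # Fetch the band's page and check whether the issued code appears in it
            html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "ignore")
            return code in html

        # Usage sketch:
        # token = issue_code()           -> send this to the band
        # code_is_published("http://example-band.com/about", token)  -> verify later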


  • How would I tell if a prospective client website is under DDoS attack?

    - by artlung
    I have a person asking me whether the DDoS mitigation service they're using is worth it. This is outside my expertise, but clearly at some point someone sold this service to the client. Assuming I don't have anything but a domain name, what information can I gather about whether they are indeed under attack and/or how well the DDoS mitigation service is working? Assume I don't have any administrative access to the site/server(s) in question.
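    With nothing but the domain name, about all an outsider can do is watch the site from the outside: resolve the hostname (to see whether DNS points at the mitigation provider's network) and sample response times over a long period, looking for spikes and timeouts. A rough Python sketch of such a probe, offered purely as one assumed approach:

        import socket
        import time
        import urllib.request

        HOST = "example.com"  # hypothetical client domain

        print("Resolves to:", socket.gethostbyname(HOST))

        for _ in range(10):
            start = time.time()
            try:
                urllib.request.urlopen("http://" + HOST + "/", timeout=10).read(1024)
                print("OK   %.2fs" % (time.time() - start))
            except Exception as exc:
                print("FAIL after %.2fs: %s" % (time.time() - start, exc))
            time.sleep(30)  # sample every 30 seconds; run for hours or days to see trends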


  • Which token from a long User-Agent should I use in robots.txt?

    - by Gaia
    The definition of User-Agent states that several tokens can be included, as deemed necessary by the client. I want to block certain bots via robots.txt and I am confused as to which part of the User-Agent string to use, especially for the more obscure bots. For example:
        Mozilla/5.0 (compatible; uMBot-LN/1.0; mailto: [email protected])"
        JS-Kit URL Resolver, http://js-kit.com/
        Mozilla/5.0 (compatible; SEOkicks-Robot +http://www.seokicks.de/robot.html
    Do I use the second token? Can tokens contain spaces, or did the SEOkicks folks forget a semicolon after SEOkicks-Robot? I don't actually intend to make my question specific to a couple of bots - I want to know the guideline: which part of the UA do I place in robots.txt for these exotic bots with UAs as long as a haiku?
        User-agent: uMBot-LN/1.0
        Disallow: /
    PS: Thank you, but I do not need to hear that undesirable bots are better blocked with mod_security. I already have commercial mod_sec rules in place.
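    For what it's worth, the usual convention is to match on the bot's product token (the short name, not the full User-Agent string), and robots.txt matching is a simple substring check, so the version number can normally be dropped. A sketch for the bots quoted above, assuming their product tokens are uMBot-LN and SEOkicks-Robot:

        User-agent: uMBot-LN
        Disallow: /

        User-agent: SEOkicks-Robot
        Disallow: /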


  • Symbolic link not allowed or link target not accessible: /var/www on Ubuntu 11.04

    - by Jamie Hutber
    I am getting a 403 when I access http://mayfieldafc.local/. Looking in the Apache logs I see:
        [Wed Nov 16 12:32:59 2011] [error] [client 127.0.0.1] Symbolic link not allowed or link target not accessible: /var/www
    I have what I believe to be the correct permissions set on /var/www: hutber (my user) can create and delete files and can also execute as program on this folder. In Mayfield's vhost it's:
        <Directory /var/www/mayfieldafc/docroot>
            Options +FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>
    I am pulling my hair out not being able to work on my sites with my work Ubuntu install. I know of nothing else that could be affecting this. Any ideas?
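    A common cause of this exact error is that FollowSymLinks is only enabled for the docroot while the symlink (or one of its parent directories) lives higher up under /var/www, or that the Apache user cannot traverse the path to the link target. A hedged sketch of the usual fixes, assuming a stock Ubuntu Apache layout:

        # In the main config or a vhost: allow symlinks for the parent directory as well
        <Directory /var/www>
            Options +FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

        # In a shell: make sure every directory in the path is traversable by the Apache user
        # (adjust the paths to wherever the symlink target actually lives)
        chmod o+x /var /var/www /var/www/mayfieldafc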


  • How do I get Google to crawl my content when it's only displayed when you fill in a form?

    - by Sarang Patil
    I have a webpage. It has a form, and the "results" section is initially blank. When the user searches for items, a list pops up; he/she chooses one option from the list and the corresponding results are displayed in the results section. I decided to log the IP, URL and time of every visit to my page. One IP was 66.249.73.26, and a quick lookup told me it belongs to Googlebot. When I looked at the links this IP visited, it was like this: search?id=100, search?id=110, ... search?id=200, ... and afterwards it incremented in steps of 1, like 400, 401, ... But people search for strings, not numbers. Because Googlebot only requests numeric ids like this, I think the corresponding content is never displayed, so my page content is never indexed even though it is rich content. So my question is: in order to show Googlebot all the content that the webpage has, should I list all the results on an index page and ask users to enter a string to filter the results?
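    If you go the crawlable-index route, the idea is simply a plain HTML page (linked from your navigation and listed in the sitemap) that links to every result page with a normal anchor, so Googlebot can reach the content without filling in the form. A minimal sketch with hypothetical URLs and labels:

        <!-- /all-items.html : a static, crawlable index of every result page -->
        <ul>
          <li><a href="/search?id=100">Item one hundred</a></li>
          <li><a href="/search?id=101">Item one hundred and one</a></li>
          <!-- ... one link per item, or split across several paginated index pages ... -->
        </ul>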


  • SEO indexing with dynamic titles, keywords and description

    - by Andrea Turri
    I'm working on a worldwide website (all on one single domain), so I'm considering creating dynamic titles, descriptions, keywords and headings for each location. What I'm doing is taking information from the user's IP and showing, for example, a dynamic title:
        var userCity = codeToGetCityFromIP;
        <title>Welcome to userCity</title>
        // and the same for description, keywords and headings...
    Obviously the real code is different... I'd like to know whether this is a good way to get indexed for multiple cities. I'm also using geolocation and do the same with the values it returns. Am I doing this right, or are there more effective ways to get indexed in different countries and cities without creating a separate website for each city in the world? Thanks.


  • Free forum engine with good anti-attack mechanisms

    - by macias
    I am looking for a forum engine (for discussions) with good attack countermeasures built in. Windows (preferably) or Linux, and free (as in beer). I'm thinking of registration flooding and account-lockout attacks. For registration, such an engine should have at least: a captcha; blocking of multiple registrations from the same IP; and separate credentials for a login (for logging in) and a user name (for displaying the author of the posts). For logging in: no lockout on multiple failed tries - instead, after X tries, a token is sent by mail as the third piece needed for the next login; without it, logging in is impossible (similar to an activation process). The engine should be designed with two ideas in mind: protecting the engine against attacks, and zero penalty for decent users. Thank you in advance for your help and recommendations.


  • general questions about link spam

    - by hen3ry
    Hello. A CMS-based site I manage is suffering from a small but ominously growing number of almost certainly bot-emplaced, invisible spam links placed in registered-user-only shoutboxes and user forums. "Link spam", yes? Until recently I've kept my eyes on narrow tech issues, and I'm having trouble understanding what's going on. I understand that we need to tighten up our registration procedures, but more generally... Do I understand correctly that our primary interest in combating link spam on our site is that major search engines reduce or zero the search visibility of sites that contain it? Although we're non-commercial, we don't want to be at the bottom of the rankings, or eliminated altogether. Are the linked-to sites the direct beneficiaries of the spam links, or is there some kind of indirection? What is the likely relationship between the link spammers and the owners of the (directly or indirectly) linked-to sites? Are the owners of the linked-to sites paying the link spammers for higher visibility? Are the owners aware that this method is being used? It is my impression that major search engines are capable these days of detecting that given sites are being promoted by link spam, and that these sites may consequently be reduced in search rank or dropped altogether. Do these sanctions occur frequently? Is there any potential value in sending notifications to the owners of the linked-to sites that their visibility is at risk? TIA, hen3ry


  • Self hosted PHP shopping cart with no storefront?

    - by Question
    I am looking for a shopping cart to implement on a simple website instead of the default PayPal cart that is used with their add-to-cart buttons (I don't like the non-styleable new tab/window cart). However, I really like the ability to simply add the buttons to existing pages. I do not have a lot of products and do not want to deal with a storefront and complex templates. The main features I need: self-hosted; easy to implement on an existing website (copy-and-paste button code, etc.); the ability to have variations on one button with different prices (dropdown with sizes, colors, etc.); the ability to track inventory and disallow out-of-stock orders; and the ability to pass cart details to PayPal Website Payments Standard. I have seen most of the large storefront options - osCommerce, Zen Cart, CubeCart, OpenCart, PrestaShop, Magento, CS-Cart, LemonStand, etc. - but these are way more than I need. I don't need the storefront or customer accounts or templated pages, etc. I have seen E-junkie, which is not far off from what I would like, but it is not self-hosted, and I would prefer an in-site cart (or dynamic overlay cart) rather than a lightbox or new tab/window cart. I also love the PayPal minicart and its implementation, but there is no way to track inventory. So, does anyone have any recommendations that might meet these requests?
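    For comparison, this is roughly what a PayPal Website Payments Standard add-to-cart button with a size dropdown looks like when pasted into an existing page (all values are placeholders); it illustrates the copy-and-paste style of integration being asked for, though on its own it cannot track inventory:

        <form action="https://www.paypal.com/cgi-bin/webscr" method="post">
          <input type="hidden" name="cmd" value="_cart">
          <input type="hidden" name="add" value="1">
          <input type="hidden" name="business" value="seller@example.com">
          <input type="hidden" name="item_name" value="T-shirt">
          <input type="hidden" name="amount" value="20.00">
          <input type="hidden" name="currency_code" value="USD">
          <!-- option field: the option name goes in on0, the chosen value is posted as os0 -->
          <input type="hidden" name="on0" value="Size">
          <select name="os0">
            <option value="Small">Small</option>
            <option value="Large">Large</option>
          </select>
          <input type="submit" value="Add to cart">
        </form>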


  • Best way to block "comment spam" postings to web forms? [closed]

    - by David Jones
    Possible Duplicate: Make your site anti-bot? I have a custom web form on my PHP-based site. Recently it has been getting a regular stream of comment-spam postings from a few specific IP addresses. Question: What is a good way to block a small set of blacklisted IP addresses from accessing my site? I was thinking it should be possible to use .htaccess to respond with status code 403 (Forbidden) for all HTTP requests from the blacklisted IP addresses, but I am not sure exactly how to do that. If anyone knows the .htaccess syntax needed to accomplish this, please let me know. Thanks in advance.
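    A minimal .htaccess sketch that does exactly that on Apache 2.2 (mod_authz_host): every request from a listed address gets a 403 Forbidden while everyone else is let through. The addresses below are placeholders:

        # Block a small blacklist of IPs with a 403, allow everyone else
        Order Allow,Deny
        Allow from all
        Deny from 203.0.113.45
        Deny from 198.51.100.0/24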


  • Is Movable Type among the most secure PHP blogs? How secure are the various PHP blog applications?

    - by user6025
    Basically I'm trying to find a blog platform for a website, and security is the highest priority in our case. We don't need any features that I would imagine are special. WordPress was our first idea, but its reputation precedes it, and though it may have cleaned up its act lately, I'm not seeing much solid evidence. I get the impression that Movable Type (at least the Perl version) has a much better reputation for security than WordPress (historically at least). I'm not sure I want to take a chance with WordPress at this point, but is there some objective source I can go to in order to back up (or counter) the notion that MT is at least among the best? Secunia doesn't recommend using their stats for comparisons, and securityfocus.com doesn't have stats at all that I can see. Searching http://web.nvd.nist.gov makes MT look way better than WP (at least in 2007), but this site was referenced by MT's own page boasting about their security, so I don't know how relevant it is or how seriously people take it. Any suggestions on sites where I could/should make a somewhat objective comparison?


  • Why are intermittent pages disappearing from SERPs?

    - by Beebee
    I have a very basic informational site with about 50+ pages. Specifically, there is a page about each site. I have been tracking pages daily for months and they had been steadily rising in the SERPs. Suddenly, about 3-4 weeks ago, about 20 pages went completely missing from the SERPs, though they were still indexed (I could Google a specific piece of text from the page in quotes and it would appear). Since then, pages have been added to and removed from the SERPs continually, cutting my traffic by about 50%. All I have been adding is information about the site's topic. I haven't stolen content or done anything else that seems like it would cause this.


  • Creating Google sitemap.xml, is it okay for the images to be wrapped in url tags?

    - by AzizAG
    I'm using a tool to generate the sitemap.xml file for me. It crawled my website and picked up the pages and all the images, but when I exported the file and reviewed the XML (to make sure nothing was wrong), I noticed that the images on my website are wrapped in url tags (I think they should be in image tags). See this:
        <url><loc>http://mywebsite.com/images/12.jpg</loc><lastmod>2012-05-23T13:39:02+00:00</lastmod><changefreq>weekly</changefreq><priority>0.50</priority></url>
    Shouldn't each image be wrapped in an image tag (just like videos are wrapped in a video tag)? Thanks.
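    For reference, Google's image sitemap extension puts each image inside an image:image element under the url entry for the page that displays it, rather than giving the image file its own url entry. A minimal sketch (the page URL is a placeholder):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
          <url>
            <loc>http://mywebsite.com/some-page.html</loc>
            <image:image>
              <image:loc>http://mywebsite.com/images/12.jpg</image:loc>
            </image:image>
          </url>
        </urlset>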


  • Proper way to create and work with a subdomain?

    - by Genadinik
    My site got affected by Panda, and I am trying to see if moving to a subdomain would help. The site is comehike.com, and I created a subdomain, currently empty, at hiking.comehike.com. I have a directory /outdoors that has some high-quality, hand-written articles. I want to put those into the new subdomain to see what happens. My questions are: Should I just copy and paste the files for those pages into the new subdomain's folder and change all the links in all my pages from the original domain to the new subdomain? Should I instead do a 301 redirect to the new subdomain? And since test.site.com and www.site.com are different hosts, will the new pages have to start from scratch in terms of PageRank and their rankings in the SERPs?
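    If the 301 route is chosen, the usual way to move a directory to a subdomain on Apache is a single permanent redirect rule kept on the main domain; a hedged sketch, assuming the paths map one-to-one onto the subdomain:

        # In the .htaccess of www.comehike.com: send the old directory to the subdomain
        RedirectMatch 301 ^/outdoors/(.*)$ http://hiking.comehike.com/$1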


  • Seeking .htaccess help: Converting multiple subdomains (both HTTP and HTTPS) to www.domain.com using .htaccess

    - by Joshua Dorkin
    I've been trying to get an answer to this question on other forums (the folks at SuperUser thought this was the place I needed to post) and via my connections, but I haven't gotten very far. Hopefully you guys can help me find an answer. I've got a dozen old subdomains that have been indexed by Google, and they have been indexed as both HTTP and HTTPS. I've managed to redirect all the subdomains properly as long as they are not HTTPS, but can't get any of the HTTPS subdomains to redirect properly. Here's the code I'm using:
        RewriteCond %{HTTP_HOST} ^subdomain1.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain2.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain3.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
    This works great until someone goes to https://subdomain2.mysite.com, which is not redirected back to http://www.mysite.com. How can I get this to work? Additionally, I'm guessing there is an easier way to make it happen than setting up a dozen pairs of RewriteCond/RewriteRule? Is there any way to do this in just a few lines, including one where I list all the subdomains? I'd also like to redirect everything on https://www.mysite.com to http://www.mysite.com except for 3 folders: mysite.com/secure, mysite.com/store, and mysite.com/user. Is there any good way to add this to the .htaccess file?
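    One way to collapse the dozen pairs into a single rule is to match all the subdomain names in one RewriteCond. For the HTTPS variants, the same rules have to be reachable from the port-443 virtual host, which in turn needs a certificate covering those hostnames - otherwise the browser fails before mod_rewrite ever runs. A sketch under those assumptions, with placeholder subdomain names:

        # One rule for all old subdomains (must apply to both the HTTP and HTTPS vhosts)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(subdomain1|subdomain2|subdomain3)\.mysite\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

        # Force https://www.mysite.com back to http:// except for the three folders
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(secure|store|user)(/|$)
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]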


  • Website restyle, SEO migration plan?

    - by Goboozo
    I am currently in a project for one of my biggest clients. We have built a website that will replace the old website. When it comes to the actual content it is largely the same; however, the presentation of the content has changed drastically, and from our point of view it is much more user-friendly (the main reason for updating the site). Since the site's presentation has changed, we have some major changes in: HTML & CSS, to change the presentation of the content; URLs, to make them easier to understand (301 redirects have been taken care of and are in place); breadcrumbs, to enhance navigation (we have made the breadcrumbs match the URLs exactly); pagination, which was added to enable content browsing; and title tags, with descriptive title tags added to the major links and buttons. Basically all user content, including meta tags, has remained the same. Since this company is rather successful and 90% of its clients come from Google's organic results, I am obliged to take all necessary precautions. People tell me I need a migration plan to prevent the site from being hurt in Google, but I have never worked with such a plan... So, based on the above, would you consider a migration plan necessary, and what precautions/actions would you recommend to prevent us from dropping in our SERP positions? Many thanks in advance for your answers.


  • iFrame content pageviews not matching parent page pageviews

    - by surfbird0713
    I have a page with content hosted in an iFrame, both using the same GA account ID. When I look at the pages report, the parent page has about 9000 unique views, but the iFrame content only has 3700. Anyone have an idea what could cause that kind of discrepancy? My only guess is that it would be caused by people moving on before the iFrame content has a chance to load, but the average time on page for the host page is 56 seconds, so that doesn't seem possible. This is the page in question: http://cookware.lecreuset.com/cookware/content_le-creuset-lid_10151_-1_20002 The flipbook is hosted in the iFrame on a separate domain. I have each page of the flipbook triggering a virtual pageview to try to evaluate engagement with the book - when the flipbook loads, it fires a pageview for the page it is on, so that is the page I'm using for the 3700 number. I also looked at the source of the iFrame in the pages report, and that number just about matches the virtual pageviews so that piece is consistent. Any ideas on this are much appreciated. Thanks!
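    For reference, with classic (ga.js / _gaq) tracking, a flipbook page is usually reported by pushing a virtual pageview with a made-up path each time the reader turns a page, something like the sketch below (the path is a placeholder):

        // Fire a virtual pageview for the flipbook page the reader just opened
        _gaq.push(['_trackPageview', '/virtual/flipbook/page-3']);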


  • How does the keyword order in a domain name affect SEO?

    - by Sushil
    From the Google keyword research tool, I see "chuck norris jokes" has 246,000 global searches, and searching for "jokes chuck norris" shows the same figure. But as we can see, the order of keywords in a search term matters: "hello how are you" and "how are you hello" give clearly different results. Now, instead of the search term (assuming "chuck norris jokes"), I was wondering: if I were to register chucknorrisjokes.com and jokeschucknorris.com, would it affect the ranking in the search results, or would it be the same? As we see here, both domains have the same keywords, just in a different order. What effect would that have? I hope what I am trying to say is clear.


  • DNS query re website Status: inactive

    - by Matthew Brookes
    There is a website that I am assisting with which, when you do a DNS lookup on Who.is, returns a Website Status of "inactive". I also noticed the server type is incorrectly reported. This is not a site I generally use for DNS queries, so I am unsure if it is reliable; using other DNS checking services reports what I would expect, and the site is functioning correctly. Research I have done on the "Website Status: inactive" value suggests an issue with the DNS configuration. I am looking for help understanding whether this is something to be concerned about and, if possible, how to update this value or how it gets set in the first place.


  • Assigning Static Public IP Address to Windows Server 2008

    - by Neeti
    Please help a newbie. I am new to Windows Server. I have an IBM server and I have installed Windows Server 2008 R2 on it. I have been provided with a static IP address by my ISP. How can I assign that to my server? I have a web application hosted on the server which I need to access from the outside world using an internet browser. How can this be achieved? Please let me know if there are any tutorials or step-by-step guides for achieving what I am trying to do.
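    On Windows Server 2008 R2 this can be done either in the adapter's TCP/IPv4 properties dialog or from an elevated command prompt with netsh; a sketch with placeholder values (the interface name, address, subnet mask, gateway and DNS server must be replaced with the ones from your ISP):

        rem Assign the ISP-provided static address to the network adapter
        netsh interface ipv4 set address name="Local Area Connection" static 203.0.113.10 255.255.255.0 203.0.113.1
        rem Point the adapter at a DNS server
        netsh interface ipv4 set dns name="Local Area Connection" static 8.8.8.8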


  • What is the proper way to setup Google Apps email accounts for a subdomain?

    - by binaryorganic
    Let's say I've registered domain.com at enom and set it up to use Google Apps for email by rerouting DNS to enom's servers and editing the MX records there. That works flawlessly. Now let's say I want to have email at a subdomain for that same site. I already have a working subdomain at the host, but I want to catch email traffic at enom before it gets that far. I've set up Google Apps as a new account for the subdomain, successfully verified domain ownership, and now they want me to update MX records. What's the right format? For domain.com, I just put @ for the hostname, and then provided the Address and Pref values that Google gave me. I tried putting subdomain.domain.com as new values under hostname for the subdomain, but that doesn't seem to work. What am I doing wrong?
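    In zone-file terms, the records the subdomain needs are the standard Google Apps MX set, attached to the subdomain label rather than to @; at most registrar panels the hostname field typically takes just the bare label. A sketch under that assumption:

        ; MX records for subdomain.domain.com (hostname field usually just "subdomain")
        subdomain   IN  MX  1   ASPMX.L.GOOGLE.COM.
        subdomain   IN  MX  5   ALT1.ASPMX.L.GOOGLE.COM.
        subdomain   IN  MX  5   ALT2.ASPMX.L.GOOGLE.COM.
        subdomain   IN  MX  10  ASPMX2.GOOGLEMAIL.COM.
        subdomain   IN  MX  10  ASPMX3.GOOGLEMAIL.COM.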


  • Why is "www.mysite.com" different from "mysite.com"?

    - by sapeish
    In any browser, if I use www.mysite.com or just mysite.com the web page is correctly retrieved, but I am having trouble with Google Analytics and my Facebook App. Facebook: to be able to get Likes, I created the required Facebook App and set the site URL to http://mysite.com/. Using their tool http://developers.facebook.com/tools/debug/, when I test my page using http://mypage.com it works, but using http://www.mypage.com fails with the message: Object at URL 'http://www.mysite.com/' of type 'website' is invalid because the domain 'www.mysite.com' is not allowed for the specified application id. Google Analytics: to be able to get traffic statistics, I created a Property and a Profile, both with the URL http://www.mypage.com, and no statistics were gathered in a week; when I changed the configured URL to http://mypage.com, statistics were available a few hours later. What should I do to have statistics and Likes for both www.mysite.com and mysite.com?
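    Independent of the Facebook and Analytics settings, the usual fix is to pick one canonical hostname and 301 the other to it, so every tool sees a single URL. A minimal Apache sketch that forces the non-www form (swap the two hostnames if www is preferred):

        RewriteEngine On
        # Redirect www.mysite.com permanently to mysite.com
        RewriteCond %{HTTP_HOST} ^www\.mysite\.com$ [NC]
        RewriteRule ^(.*)$ http://mysite.com/$1 [R=301,L]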

