Search Results

Search found 10444 results on 418 pages for 'macbook pro retina'.


  • Google shows subdomain of main site instead of add-on domain URL

    - by Welsher
    I have my host (Lunarpages) set up with a few add-on domains to my main account. These show up as sub-domains of my main account, but they can be reached by using the new domain I've created. So: subdomain1.domain.com -- www.mynewsite.com, subdomain2.domain.com -- www.myothersite.com, etc. The problem is, mynewsite.com shows up in Google with that domain, but myothersite.com shows up as subdomain2.domain.com. I don't have a clue what might be causing this to happen. If anyone has any advice or can point me in the right direction, I'd really appreciate it! Thanks.
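
    One common fix for this, offered here as a sketch rather than the poster's own solution, is to 301-redirect the subdomain form of each URL to the add-on domain so search engines only ever see one host. A minimal .htaccess example, assuming mod_rewrite is available and reusing the placeholder names subdomain2.domain.com and www.myothersite.com from the question:

        # Hypothetical .htaccess in the add-on domain's document root.
        # Requests arriving under the subdomain alias get a permanent
        # redirect to the canonical add-on domain.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^subdomain2\.domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.myothersite.com/$1 [R=301,L]

    If redirects are not an option on this host, a rel="canonical" link on each page pointing at the www.myothersite.com URL serves much the same purpose.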

    Read the article

  • Mass emailing bouncebacks - SendBlaster

    - by Matt
    I am currently using a mass emailer called SendBlaster; if anyone has experience using this program for mass emails, any help would be fantastic. The program has a feature that allows you to track reads and opens of emails sent; however, the problem I have is with delivery failures/bouncebacks. The "manage bouncebacks" feature is very confusing and appears to be incapable of showing which email addresses have bounced. For some reason the sender address does not receive delivery failures as with other mass email programs that I've used. If anyone knows a way to efficiently manage the delivery failures/bouncebacks using this program, please help! Thanks

    Read the article

  • Copy website content from WebsiteA.com/file.html to WebsiteB.com/file.html at a regular interval

    - by Jimbo Mombasa
    I want to copy a page from http://stats.pingdom.com/file... to http://mywebsite.com/file every 10 minutes. Then with purple-include I want to do a transclusion and display it on http://mywebsite.com/page.html. So the task is to download http://stats.pingdom.com/file to http://mywebsite.com/file. I figured out the transclusion part, but I do not know how to copy a webpage from A to B. Is there a script for this, or how can I do it?
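
    An alternative worth noting, and not something the poster asked about, is to skip the periodic copy entirely and reverse-proxy the remote file under your own domain, so that http://mywebsite.com/file always returns whatever stats.pingdom.com currently serves. A minimal Apache sketch, assuming mod_proxy and mod_proxy_http are enabled and reusing the /file path from the question:

        # Hypothetical fragment for the mywebsite.com virtual host.
        # /file is fetched from stats.pingdom.com on each request,
        # so there is no local copy to refresh every 10 minutes.
        ProxyRequests Off
        ProxyPass        /file http://stats.pingdom.com/file
        ProxyPassReverse /file http://stats.pingdom.com/file

    The trade-off is that every hit goes upstream; a scheduled download (cron plus wget or curl) keeps a local copy instead, at the cost of the content being up to 10 minutes stale.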

    Read the article

  • 301 redirect www to non-www [duplicate]

    - by Claudiu
    This question already has an answer here: Removing non-www support. I understand that it is better to make a 301 redirect to make sure that all your links are seen the same by Google. Until now I have always used erasmus-plus.ro, without the www, for my website. Is it OK to make a redirect from www to non-www? From my search on Google, all users spoke about it the other way around. And somewhere I read that redirects are not good for SEO. Is a 301 an exception?
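
    Either direction is fine as long as one host is picked and the other is redirected to it consistently; a permanent (301) redirect is exactly the tool for this and is not the kind of redirect that hurts SEO. A minimal .htaccess sketch, assuming mod_rewrite and using the erasmus-plus.ro domain from the question:

        # Hypothetical .htaccess for erasmus-plus.ro: send every www
        # request to the bare domain with a permanent redirect.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.erasmus-plus\.ro$ [NC]
        RewriteRule ^(.*)$ http://erasmus-plus.ro/$1 [R=301,L]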

    Read the article

  • clicktale.com alternative that works with https and ajax

    - by Alexey Ivanov
    I need to record users' actions on a site for analytics purposes. The way clicktale.com does it is just fine, but unfortunately it has problems working over HTTPS and recording Ajax events. Is there some service, or a script/library that I can host, that can do this task? Non-free ones are OK too. Clarification: the ClickTale function that I want to reproduce is the recording of separate user sessions and their replay, so you can see a video of all the user's interactions with the page: where he clicks first, which link opens, etc. Usually such services replay the user's actions by reproducing them with JavaScript (and here comes the Ajax problem: external sites can't use Ajax because of cross-domain scripting). So I'm looking for a tool (possibly a script that I host on the site to allow cross-domain scripting) that can record actions in Ajax blocks.

    Read the article

  • Webmaster tools, Duplicate Meta Descriptions, and Short Meta Descriptions [closed]

    - by Watsy91
    Possible Duplicate: Do meta keywords have any impact on ranking algorithms? I am fairly new to the whole Webmaster Tools concept. I have been looking at all the different options, such as crawl errors, HTML improvements etc. I have been looking at the Duplicate Meta Descriptions and Short Meta Descriptions reports and was wondering if anyone could suggest ideas on how to go about improving them. It seems that all the duplicates come from the URL title and the short description. It would seem to me that most people would have information regarding the page with the same keywords as their titles. Here's an example of one: "These are the ultimate hampers in taste, quality and value. Amongst this range of luxury hampers ar" shared by /food-hampers/food-hampers-over-100.html and /thank-you-gifts/large-gifts-over-100.html. To get to the point, I just want to know: do these things really matter? Would they have a real consequence on my site's rankings? My sites have been falling down the rankings since early this year and I have really started to look at Google Analytics and Webmaster Tools to try and identify certain problems. I have researched the Internet and it seems that some people don't bother and others do!! I know that Stack Overflow has hundreds of people who have gone through the above, and I would really appreciate it if they could give me some tips. Or in the END does it really matter?? :D

    Read the article

  • SEO Suggestion For My Blog [closed]

    - by Rana
    I have a programming tutorial blog which has decent traffic. However, I am interested in doing some basic SEO for my blog to get it optimized, and I want to do it myself by learning. I was wondering if the experts here can suggest how I should proceed, please? Also, if you could review my blog and suggest the most common SEO concerns that come to mind first, that would be helpful as well. My blog's URL is as follows: http://codesamplez.com/ Looking forward to your feedback soon. Thanks.

    Read the article

  • Facebook like button for a Facebook feed item

    - by jverdeyen
    Here is what I do: when the admin of a website I'm creating adds a new 'news' item to his website, it directly posts the same item on the Facebook 'fanpage' of the website. This is done by a Facebook app with the permissions to post an item on the Facebook fanpage. So far so good; I'm getting the new post id from the Facebook Graph. Here is what I want to achieve: I'm adding a 'like' button on the original website, but when this is clicked it should reference the same post item on the Facebook page. So if someone likes the news post on the website, he should also like the corresponding Facebook feed post, which is exactly the same. Can this be done? I've been playing around with the href etc. of the like button, without any success. When I hit the like button it likes the given URL, but it doesn't recognize that this is a feed post. Any idea? Thx in advance.

    Read the article

  • Did anyone experience negative SERP movement after implementing rel=author?

    - by raam86
    I am not interested in why I don't see the picture in SERPs. So I know this is borderline off-limits, but I have turned every stone on the web (including DuckDuckGo) trying to find anybody experiencing a worse position in SERPs after implementing rel=author tags. In Google Webmaster Tools everything seems fine, but the first result dropped 14 places in SERPs in the past two days. The original landing page went down from the first page to the 5th page in a few days. It is a useful site with original content concerning marriage laws. This specific page is nowhere to be found, and now the first result leads to the home page. Assuming everything else is the same, with no changes made to the site at all, is there a reason the rel=author tag would cause such a plummet? Additional info that might be useful: the Google+ account is as dead as a Palm Pilot.

    Read the article

  • Restrict SSL access for some paths on an apache2 server

    - by valmar
    I wanted to allow access to www.mydomain.com/login through SSL only. E.g.: whenever someone accessed http://www.mydomain.com/login, I wanted him to be redirected to https://www.mydomain.com/login so it's impossible for him/her to access that page without SSL. I accomplished this by adding the following lines to the virtual host for www.mydomain.com on port 80 in /etc/apache2/sites-available/default: RewriteEngine on RewriteCond %{SERVER_PORT} ^80$ RewriteRule ^/login(.*)$ https://%{SERVER_NAME}/login$1 [L,R] RewriteLog "/var/log/apache2/rewrite.log" Now, I want to restrict the use of SSL on www.mydomain.com. That means, whenever someone accesses https://www.mydomain.com, I want him to be redirected to http://www.mydomain.com (for performance reasons). I tried this by adding the following lines to the virtual host of www.mydomain.com on port 443 in /etc/apache2/sites-available/default-ssl: RewriteEngine on RewriteCond %{SERVER_PORT} ^443$ RewriteRule ^/(.*)$ http://%{SERVER_NAME}/$1 [L,R] RewriteLog "/var/log/apache2/rewrite.log" But when I now try to access www.mydomain.com/login, I get an error message that the server has caused too many redirects. That does make sense: obviously, the two RewriteRules are playing ping-pong against each other. How could I work around this?
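
    A common way out of this loop, offered as a sketch rather than the poster's own fix, is to make the HTTPS virtual host skip the /login path, so the port 80 and port 443 rules can no longer bounce the same request back and forth. Reusing the names from the question, the rules in the port 443 virtual host would become something like:

        # Hypothetical rules for the port 443 vhost: everything except
        # /login is pushed back to plain HTTP; /login stays on SSL.
        RewriteEngine on
        RewriteCond %{SERVER_PORT} ^443$
        RewriteCond %{REQUEST_URI} !^/login
        RewriteRule ^/(.*)$ http://%{SERVER_NAME}/$1 [L,R]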

    Read the article

  • Will Google penalize subdomains if content is nearly identical

    - by John Pham
    I have created a subdomain for a town in San Diego that's ranking very well for its keywords: http://carmelvalleymortgage.loanrebateinc.com/ I want to replicate this subdomain's content for another town in San Diego: http://sandiego.mortgage.loanrebateinc.com/ I will edit the text, tags, and image files specific to each town; otherwise the verbiage will be identical. Questions: Will Google penalize the main site? Will Google penalize the subdomains and list the content as spam? If yes to either 1 or 2, what strategies can I implement to prevent this? I'm using WordPress.

    Read the article

  • Appropriate response when a client empowered with a CMS destroys content at will

    - by dukeofgaming
    So, I just recently closed a website project that pretty much was The Oatmeal's Design Hell, but with content. The client loved the site at the beginning but started getting other people involved and mercilessly bombarding us with their opinions. We served a carefully thought-out content strategy (which the client approved) and extremely curated copywriting that took us four months, after at least 5 requirement changes (new content, new objectives for the business, changed offerings, new whims, etc.) that required us to rewrite the content about 3 times. The client never gave timely feedback even though we kept the process open for him and his people to see (content being developed transparently in Google Docs). Near the end of the project he still wanted to make changes but wanted us to finish already (there are not enough words in the world to even try to make sense of this). So I explained to him the obvious implications of the never-ending requirement changes and advised him to take the time to gather his thoughts with his own team and see the new content introduced as a new content maintenance project. He happily accepted, but on the day of training/delivery things went very wrong and we have no idea why. The client didn't even allow the site to be up for a week with the content we developed for him and quickly replaced us with a Joomla-savvy intern so that he could completely destroy the content with shallow, unstructured, tasteless and plain wordsmithing (and I'm not even being visceral). Worst insult of all, he revoked our access from his server and the deployed CMS not even 10 minutes after being given his administrator account (we realized the day after that he did it in our own office, the nerve!). Everybody involved in the team is enraged and insulted. I never want to see this happen again. So, to try to make sense of this situation and avoid it in the future with new clients, I have two concrete questions: Is there even an appropriate course of action with a client like this? Or is he just not worth the trouble of analyzing (blindly hoping this never repeats again)? In the exercise of trying to blame ourselves instead of the client and take this as a lesson of... something, how should we set expectations for new clients about the working terms, process and final product so that they are discouraged from mauling the content to their heart's content once they get the codes to the nukes (access to the CMS)?

    Read the article

  • Resolving a cname using different DNS

    - by Sandeep Singh Rawat
    I have a domain name (e.g. abc.com) registered with GoDaddy, and I have a few subdomains (mail, blog) correctly set up to point to different hosts. Now I want to park my domain with a parking host (seohosting.com), which asked me to change my nameservers to their DNS. What I want is to redirect DNS queries only for the (www or @) CNAME to seohosting.com, while still being able to use my other CNAMEs for my own purposes. Is there a way to do this? I don't have the host IP address for the parking host.

    Read the article

  • How do I ask Google not to index certain parts of my page?

    - by Gavin Mannion
    I was searching for an old review on my site today and I noticed that Google is indexing the headline text in my latest-article list on every page where it appears, obviously I guess. The problem is, if I search for my Dragon's Lair review specifically on my site like this: http://www.google.co.za/search?sugexp=chrome,mod=9&sourceid=chrome&ie=UTF-8&q=site%3Alazygamer.net+dragons+lair+review then it returns a ton of pages that aren't appropriate, as they aren't related to the review at all. The reason why I care is that I have a second Dragon's Lair review that was posted years ago and now I can't find it. Is there a way to hint to Google that certain text isn't relevant to the actual content on the page? Or is it a terrible idea?

    Read the article

  • trouble setting up dns for vps

    - by Zenph
    I registered a domain at Namecheap and pointed the DNS to my host at vps.net. The strange thing is, when I did that the site was showing up; I even uploaded files and everything was displaying correctly on my new domain. Now it is just the Namecheap holding page again. I have no idea why this is happening, as I haven't touched the configuration since it was working. Could anyone point me in the right direction? When I enter http://domain.com it redirects to http://www.domain.com and the Namecheap holding page is shown. Prior to this, domain.com was showing what the host was serving. I am completely lost and have no idea where to start, so I'd appreciate any help I can get.

    Read the article

  • website address points to localhost

    - by munir ahmad
    Today in Firefox, while surfing the internet, I opened a website and it warned "This site may harm your computer"; to open the website anyway, I had to add it to the exception list. I added the website to the exception list and trusted it. Since then, whenever I open this website, it always points to localhost while the internet is connected. I have set up localhost through Apache (XAMPP server). If the internet is not connected, the website does not open anything, but localhost still works as usual. How can I fix this so that the website no longer points to localhost? I am using WinXP SP3. The same problem now appears in all browsers too.

    Read the article

  • Help with Apache mod_rewrite rules

    - by Brian Neal
    I want to change some legacy URLs like this: /modules.php?name=News&file=article&sid=600 to this: /news/story/600/ This is what I have tried: RewriteEngine on RewriteCond %{QUERY_STRING} ^name=News&file=([a-z_]+)&sid=([0-9]+) RewriteRule ^modules\.php /news/story/%2/ [R=301,L] However, I still get 404s on the old URLs. I do have some other rewrite rules working, so I am pretty sure mod_rewrite is enabled and functioning. Any ideas? Thanks.
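
    One frequent cause of this, offered as a guess since the question does not say where the rules live, is context: in a server or virtual-host configuration the matched path starts with a slash, so ^modules\.php never matches, and the old query string is also carried over to the new URL unless it is dropped explicitly. A variant that allows for both, reusing the rule from the question:

        # Hypothetical version that matches with or without a leading
        # slash; the trailing "?" drops the old query string from the
        # redirect target.
        RewriteEngine on
        RewriteCond %{QUERY_STRING} ^name=News&file=([a-z_]+)&sid=([0-9]+)
        RewriteRule ^/?modules\.php$ /news/story/%2/? [R=301,L]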

    Read the article

  • remap an xml feed to the address of a wordpress rss feed

    - by cboettig
    I used to have a blog based on WordPress and moved to one based on Jekyll. I can create a new feed in Jekyll by building an Atom page in XML with a bit of Liquid code, like this. The trouble is, the location of the new feed is http://carlboettiger.info/atom.xml, while the old feed from the WordPress site is http://carlboettiger.info/feed, with no extension. How can I configure the Jekyll-created feed so that followers who have pointed their readers at the old WordPress feed address will start to get the new content? (Site's Jekyll source here.)
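
    Jekyll only generates static files, so the usual answer is a redirect at the web-server level. Assuming the site is served by Apache (the question does not say), a minimal .htaccess sketch that keeps the old WordPress address working:

        # Hypothetical rule: readers still polling the old WordPress
        # feed at /feed get a permanent redirect to the Jekyll feed.
        RedirectMatch 301 ^/feed/?$ /atom.xml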

    Read the article

  • Strategies for very fast delivery of webpages

    - by Cherian
    I run a website, Cucumbertown, with an initial payload of nearly 9KB zipped. All my JS is delay-loaded with RequireJS; Modernizr is the only exception. All my webpages are cached by Nginx and only 10-15% of hits go to the backend proxy, with the cache bypassed for logged-in users via proxy_cache_bypass. So for an anonymous user it's nearly always a cache hit. I have some basic OS tuning: default via ip dev eth0 initcwnd 15 and net.ipv4.tcp_slow_start_after_idle 0. Despite everything being cached and the large initcwnd, my pages still take 2.5-3 seconds. (YSlow and PageSpeed score screenshots omitted.) Are there strategies that can help deliver webpages even faster than this? Deliver pages in around 1 second for a 10KB payload? Notes: my servers run out of a fairly good data center, at Linode in Fremont.

    Read the article

  • How to know my free disk space on web hosting server?

    - by Abu
    I have got some work from my friend updating his website. His website was originally made by some other person, who used to maintain everything. Now that developer has given only the FTP username and password to my friend, who has asked me to update the site. The problem is I don't know how to access the things for this particular web hosting account, like the available free space, the email account, etc. I asked him about a website control panel, but he says he doesn't know about one. Is there any other site, client program or control panel that I can use to manage that website? Can anyone help me out?

    Read the article

  • Can't get heroku site updated on custom domain

    - by Joseph Brown
    I have hosayif.com registered at GoDaddy, and I set up a CNAME for rails.hosayif.com pointing to my Heroku app at sharp-meadow-6535.herokuapp.com. I set this up with a previous app, and it worked. I made a new app, renamed the old one, and then renamed the new one so that it is sharp-meadow-6535.herokuapp.com, in hopes of not having to change anything at GoDaddy. In theory, rails.hosayif.com and sharp-meadow-6535.herokuapp.com should be the same site, but they are not. Can someone tell me what I am doing wrong?

    Read the article

  • Friendly URLs: is there a max length for search engines?

    - by Olivier Pons
    People from Stack Overflow have been working closely with the Google team to help them make the Panda algorithm more efficient, so I guess they've learned a lot from the Google team. Thus they may have designed very clever friendly URLs to maximize page rank. I've seen very long URLs from time to time (can't find where) on Stack Overflow, but after a certain number of characters there were only numbers, as if to say "OK, past this length SEOs will ignore the rest, so let's put only numbers". I've done a huge amount of work on my framework to make very friendly URLs, and my website can come up with URLs like: http://www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/categorie-de-metiers/paramedical/ It's very long and I'm wondering whether the previous URL will get mixed up with, say, this one: http://www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/categorie-de-metiers/art/

    Read the article

  • How should I structure my urls for both SEO and localization?

    - by artlung
    When I set up a site in multiple languages, how should I set up my URLs for search engines and usability? Let's say my site is www.example.com, and I'm translating into French and Spanish. What is best for usability and SEO? Directory option: http://www.example.com/sample.html http://www.example.com/fr/sample.html http://www.example.com/es/sample.html Subdomain option: http://www.example.com/sample.html http://fr.example.com/sample.html http://es.example.com/sample.html Filename option: http://www.example.com/sample.html http://www.example.com/sample.fr.html http://www.example.com/sample.es.html Accept-Language header: or should I simply parse the Accept-Language header and generate content server-side to suit it? Is there another way to do this? If the different language versions don't have different URLs, what do I do about the search engines?
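
    As an aside that is not part of the original question: the Accept-Language approach maps directly onto Apache's built-in content negotiation. With MultiViews enabled, a request for /sample (no extension) is answered with sample.fr.html or sample.es.html according to the browser's Accept-Language header. A minimal .htaccess sketch, assuming the filename-option files above exist:

        # Hypothetical .htaccess: let Apache negotiate the language
        # variant (sample.html, sample.fr.html, sample.es.html) for a
        # request to /sample based on Accept-Language.
        Options +MultiViews
        AddLanguage en .en
        AddLanguage fr .fr
        AddLanguage es .es
        LanguagePriority en fr es
        ForceLanguagePriority Fallback

    The caveat the poster already raises still applies: with negotiation every language shares one URL, so crawlers generally see only one version, which is why the directory or subdomain options are usually preferred for SEO.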

    Read the article
