Search Results

Search found 5380 results on 216 pages for 'webmasters'.


  • Is AdWords ad blocked from top spots of SERPs until it is reviewed?

    - by Omeoe
    I have an AdWords ad and a keyword with a Quality Score of 10. Judging by the CPC of actual clicks, my max CPC is set well above that of the third advertiser in the SERP for this keyword (there are three ads at the top). Still, the ad is shown in the 4th spot, which is located either on the right or at the bottom of the SERP. The only catch is that the ad's status is "under review". Is that the reason why it's blocked from the top spots?

    Read the article

  • Migrating Ruby on Rails Website to New Server (Linux)

    - by GarytheWorm
    I have an existing website that is a Ruby on Rails project. I have another server I need to transfer the existing website to. The server I wish to transfer to originally hosted the website, so the necessary gems and configuration are already installed. I tarred the current, releases and shared directories from the old server and transferred them over to the new server. I then unpacked the tar in the apps directory at the new location, which is a different URL path. My problem, as you can see below, is that the current symlink still points to the old path (I ran ls -la to check ownership). How can I change current so that it points to my new web address?

    current  releases  shared  sitepack.tar
    root@server1:/var/www/clients/client1/NEWSITE.com/web/apps# ls -la
    current -> /var/www/OLDSITE.com/web/apps/releases/20120130171636
    root@server1:/var/www/clients/client1/NEWSITE.com/web/apps#

    Read the article

  • SEO Keyword Research Help

    - by user5857
    I'm new at SEO and keyword research. I am using Market Samurai as my research tool, and I was wondering if I could ask for your help to identify the best keyword to target for my niche. I do plan on incorporating all of them into my site, but I wanted to start with one. If you could give me your input on these keywords, I would appreciate it. This is all new to me :) I'm too new to post pictures, but here are my keywords (Searches, SEO Traffic, PBR, SEO Value / Day, and the average PR / backlinks of the current top 10):

    Keyword | Searches | SEO Traffic | PBR | SEO Value | Avg PR / Backlinks of Current Top 10
    1       | 730      | 307         | 20% | 2311.33   | 1.9 / 7k-60k
    2       | 325      | 137         | 24% | 822.94    | 2.3 / 7k-60k
    3       | 398      | 167         | 82% | 589.79    | 1.6 / 7k-60k

    I'm wondering if the PBR (phrase-to-broad ratio) value of #1 is too low. It seems like the best value because the SEOV is crazy high; that is something like $70k a month. #3 has the highest PBR, but also the lowest SEOV. #2 doesn't seem worth it because of the PR competition; it might be a little too hard to get onto the top page of Google. I'm wondering which keyword to target, and whether I should be looking at any other metric to see if this is a profitable niche to jump into. Thanks.

    Read the article

  • Shared Hosting Provider [closed]

    - by Garry
    Possible Duplicate: How to find web hosting that meets my requirements? I've been with Dreamhost for 5 years, but the amount of downtime I have experienced over the last 6 months has been outrageous. As of now (2012), which hosting provider would you recommend? Most of my sites are small to medium readership blogs running WordPress. I've been looking at Inmotion and Hostgator. Reliability is paramount. Thanks

    Read the article

  • Pointing domain away from google

    - by redbeard
    I've already asked this question on Google Apps support but haven't got an answer yet, and I am in a bit of a hurry: I need to have the website up by Monday. Question: we have a domain registered at 101domain. We used the domain just for Google Apps (email and some services), and the domain was never used for a website. Now we have pointed the domain to a local hosting service, set everything up the way it should be, and pointed mail back to Google from the hosting service, and Gmail works perfectly. The problem is that we tried to set up a website at the hosting service, but the domain still points to the Google Apps login. Everything looks like it should. We recently bought 2 more domains that are set up the same way (we use Google Apps on them too); the only difference that I can think of is that we didn't use them for Apps first. Could this be because the CNAME change lags behind the DNS server change?

    Read the article

  • Can Mailchimp APIs be used to send templated transaction email triggered by actions on my website?

    - by HenryW
    I am currently playing around with Mailchimp's APIs, but the documentation is not very clear to me. Here is what I actually want:

    1. Have the templates I created on Mailchimp be visible on my own server.
    2. Assign each template I made to a specific action (logged in, subscribed, created order, or new password).

    This is functionality that I already tested with Mandrill, but the template exists on Mandrill's account. If option 1 is not possible, can I still make my own template in my own environment and send that template out over Mailchimp or Mandrill? Should I use Mailchimp's services for this, or send the email directly from my own server? The function I currently use:

    function tep_mandrill_mail($to_name, $to_email_address, $email_subject, $email_text, $from_email_name, $from_email_address) {
        if (SEND_EMAILS != 'true') return false;

        $uri = 'https://mandrillapp.com/api/1.0/messages/send-template.json';

        // JSON payload for Mandrill's send-template endpoint, built by string concatenation.
        $postString = '{
            "key": "xxxxxxxxxxx",
            "template_name": "sometemplatename",
            "template_content": [
                { "name": "header", "content": "*|HEADERSTUFF|*" },
                { "name": "main",   "content": "*|CONTENTSTUFF|*" },
                { "name": "footer", "content": "*|FOOTERSTUFF|*" }
            ],
            "message": {
                "subject":    "'.$email_subject.'",
                "from_email": "'.$from_email_address.'",
                "from_name":  "'.$from_email_name.'",
                "to": [
                    { "email": "'.$to_email_address.'", "name": "'.$to_name.'" }
                ],
                "important": false,
                "track_opens": true,
                "merge": true,
                "merge_vars": [
                    {
                        "rcpt": "'.$to_email_address.'",
                        "vars": [
                            { "name": "HEADERSTUFF",  "content": "'.$email_subject.'" },
                            { "name": "CONTENTSTUFF", "content": "'.$email_text.'" },
                            { "name": "FOOTERSTUFF",  "content": "paulvale-foot" }
                        ]
                    }
                ],
                "tags": [ "password_forgotten" ]
            },
            "async": false,
            "ip_pool": "Main Pool"
        }';

        // Send the request to the Mandrill API and clean up the handle.
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $uri);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postString);
        $result = curl_exec($ch);
        curl_close($ch);
        return $result;
    }
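
    For context, here is a sketch of how the function above might be wired to one of those actions, e.g. a "new password" handler; the names, addresses and copy are purely illustrative placeholders:

        // Hypothetical call from a password-reset action; all values are placeholders.
        tep_mandrill_mail(
            'Jane Doe',                                      // recipient name
            'jane@example.com',                              // recipient address
            'Your password has been reset',                  // fills *|HEADERSTUFF|*
            'Use the link below to choose a new password.',  // fills *|CONTENTSTUFF|*
            'Example Shop',                                  // sender name
            'noreply@example.com'                            // sender address
        );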

    Read the article

  • Redirects in .htaccess to avoid crawl errors

    - by user71698
    I am getting a lot of errors in Webmaster Tools, and basically there are a lot of links ending like this: mydomainname.com/links.php How can I redirect these links to shave off this part at the end? For instance, there is a link in Google: http://www.onlineglobalbiz.com/article-marketing/www.onlineglobalbiz.com/links.php This should be: http://www.onlineglobalbiz.com/article-marketing/ Using .htaccess, how can I redirect the incorrect links?

    Read the article

  • On-Site Factors that Affect CPC

    - by ashes999
    I have a few websites on various niche topics, all running AdSense. The most promising one currently has a CPC that hovers around $1; the rest have CPCs of $0.25-$0.50. I'm curious to know what on-site factors affect CPC, that is to say, what I can do, legally (in white-hat compliance), to increase my CPC. Some factors that affect CPC but are not within my control (and are therefore beyond the scope of my question; they're just examples) include: what advertisers are paying for keywords on my site, what pages people are landing on, etc.

    Read the article

  • Can't get into the admin console after migrating to new server

    - by Emerson
    I migrated my WordPress blog to a new server, and everything seemed to be working fine until it started giving me this error when entering the admin area:

    Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 4864 bytes) in /home/neworder/public_html/blog/wp-admin/includes/plugin.php on line 729

    Line 729 has:

    $protected = array( '_wp_attached_file', '_wp_attachment_metadata', '_wp_old_slug', '_wp_page_template' );

    I had installed the maintenance-mode plugin, and I suspect that this is what broke it. If I remove the plugin it then gives another error:

    Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 19456 bytes) in /home/neworder/public_html/blog/wp-admin/includes/post.php on line 1158

    And that line has:

    $content .= '<p class="hide-if-no-js">' . esc_html__( 'Remove featured image' ) . '</p>';

    I tried restoring the blog's file system from the old server and also restoring the database from the old server (twice), but it still gives me the same error. The blog itself seems to be working fine: http://blog.antinovaordemmundial.com/
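
    If it is simply that 32 MB is no longer enough for the admin screens after the migration, one hedged workaround (assuming the new host allows PHP to use more memory) is to raise WordPress's limit in wp-config.php, above the /* That's all, stop editing! */ line:

        // Hypothetical wp-config.php additions; both constants are standard WordPress settings.
        define('WP_MEMORY_LIMIT', '64M');       // memory available to front-end requests
        define('WP_MAX_MEMORY_LIMIT', '128M');  // higher ceiling used by the admin area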

    Read the article

  • Best way to setup multiple sites' emails in my Gmail

    - by John
    I have a dozen sites and I want all of their email to come to my one Gmail ID, and I want to reply centrally from Gmail only. I've also added all of those addresses to the "send email as:" list in Gmail. I could add email forwarders in my cPanel, but in that case I won't be able to send email from addresses whose inboxes haven't been created (for example [email protected]). If I create an email account, then I'd receive emails in that inbox as well as forwarded by the forwarder (to my Gmail ID). Otherwise I can set up Gmail for my domain, but for a dozen emails I'm not sure if that'd be fine. I see at http://www.google.com/enterprise/apps/business/pricing.html that for up to 10 emails it is free. But then, to send email from the web hosting, the PHP code will need SMTP login details, and leaving my important Gmail account details in my web hosting account is very risky, given my sites have been compromised twice. What is the best way to centralize all my emails so that I can read/reply/search from a single place?
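
    On the SMTP-credentials worry: a minimal sketch of how site code could send through SMTP without embedding the main Gmail password, assuming PHPMailer is installed via Composer and each site gets its own dedicated sending address with an app password; every name and address below is a placeholder:

        <?php
        require 'vendor/autoload.php';

        use PHPMailer\PHPMailer\PHPMailer;

        $mail = new PHPMailer();
        $mail->isSMTP();
        $mail->Host       = 'smtp.gmail.com';
        $mail->SMTPAuth   = true;
        $mail->Username   = 'notify@example-site-1.com';   // dedicated sending account, not the main Gmail ID
        $mail->Password   = getenv('SMTP_APP_PASSWORD');   // app password kept out of the code base
        $mail->SMTPSecure = PHPMailer::ENCRYPTION_STARTTLS;
        $mail->Port       = 587;

        $mail->setFrom('notify@example-site-1.com', 'Example Site 1');
        $mail->addAddress('visitor@example.com');
        $mail->Subject = 'Your enquiry';
        $mail->Body    = 'Thanks for getting in touch.';

        if (!$mail->send()) {
            error_log('Mail error: ' . $mail->ErrorInfo);  // log failures rather than exposing them
        }

    If that sending account is ever compromised, only the low-value app password needs revoking, and the main Gmail inbox stays untouched.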

    Read the article

  • Is it time to add IPv6 access to my websites?

    - by Rob Hoare
    I have several dedicated servers and VPS servers, and some of those are at companies that have provided me with native IPv6 blocks (in addition to the IPv4 IP addresses). Does it currently make sense to point an AAAA record to an IPv6 address on my server, in addition to the A record pointing to the IPv4 address? This would be for (for example) the www subdomain. (the networking and web server software would be set up on the server to respond appropriately). A while ago I read that a small percentage of users (1 in a thousand?) would have slow or no access if a subdomain had both A and AAAA records because their networking software asked for one and got the other. Is that still the case, will adding an AAAA record inconvenience some users, or is the percentage already smaller and falling? In other words, is now the time to get around to adding native IPv6 support for a busy website aimed at the general public, or is it still too early?

    Read the article

  • Can other domain registrars view non-public whois information?

    - by user3188544
    If my domains are hosted at a registrar (let's take Gandi, for example) and it has privacy protection on the whois information, can another ICANN-accredited registrar (GoDaddy, for example) still view my actual information behind the privacy guard? I.e., I don't have a GoDaddy account, but since they are ICANN-accredited, could they access the real whois info without the privacy protection?

    Read the article

  • Directing from a 1und1 hosting solution, with urls intact

    - by Jelmar
    I have done this before on GoDaddy without a hitch, but I cannot seem to figure out this particular case. I have web space with the temporary URL http://yogainun.mysubname.com/ and the domain name that is to be applied to it is hosted at 1und1.de. Right now I have set it up so that, from the 1und1 domain hosting, the address http://www.yoga-in-unternehmen.de/ is frame-redirected to the subdomain I just referred to. But this is not what I want: http://www.yoga-in-unternehmen.de/ is to be the domain. With the frame redirect, URLs like http://www.yoga-in-unternehmen.de/example-article do not show up, but that is what I want them to do. With GoDaddy in a similar case, I just turned on DNS and changed the name servers. That worked without a problem, but with 1und1 it does not. Is there something I am missing?

    Read the article

  • Guidelines for promoting a web forum or portal

    - by Hafiz
    I am a web application developer and have developed a lot of sites, portals and apps for clients. Now I want to have such sites of my own, with which I can do business, but I don't know what the steps are. What I know is how to make a site or portal, but what comes after that? There are a lot of people getting huge traffic on sites built with simple open-source software, and many of them are forums. I also want to start a forum and a web portal; I can develop them myself or use open source. But what after that? Content entry and SEO? Is that all it takes to promote a portal or site? Does SEO still work nowadays, or is it all about marketing and advertising? I have no idea, so please tell me what you suggest. Thanks in advance for everyone's opinions.

    Read the article

  • How can I exclude content in my notifications bar from being indexed?

    - by Liam E-p
    Of course I want my content to be indexed quickly by search engines, but not my notifications bar. My notifications bar contains the last 30 changes to content on the site, and I don't want this to show up in my search snippets. The notifications are generic, so they rarely provide any relevant information: if an article named "123" was created, a notification is generated that says "Article "123" was created by xxx at 12:00AM". I'm now wondering if this is a content design problem, as only a third of that information is actually relevant to users (the title and what happened). Basically, what I am wondering is how I can optimise this so that search engines don't show this generic notification text in my results.
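
    One hedged option, assuming the bar is rendered by a PHP template and the goal is only to keep it out of Google's snippets rather than out of the index entirely, is Google's data-nosnippet attribute: markup inside the wrapped element is excluded from the search result snippet. The variable name here is illustrative:

        <?php
        // Hypothetical template fragment: $recent_notifications is assumed to hold
        // the last 30 change messages. data-nosnippet keeps this markup out of
        // Google's snippets while the rest of the page remains indexable.
        echo '<aside id="notifications" data-nosnippet>';
        foreach ($recent_notifications as $note) {
            echo '<p>' . htmlspecialchars($note) . '</p>';
        }
        echo '</aside>';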

    Read the article

  • Unable to download Microsoft Excel files from an IIS SSL site

    - by Jeffrey
    The webmaster at my corporation added SSL to the web site, and now none of my users can download the Microsoft Word and Excel files the site generates. According to Microsoft, the following must be done: "Web sites that want to allow this type of operation should remove the no-cache header or headers." Typical of MS, they don't tell you what to do, how to do it, or what the best practice is. The webmaster says it's a web.config setting, but all I can find is:

    <configuration>
      <appSettings/>
      <connectionStrings/>
      <system.web>
        <httpRuntime sendCacheControlHeader="false"/>

    and I don't know if this is the best way to achieve the result. I would greatly appreciate some advice on this subject.

    Read the article

  • Google still has record of my old site URL - what to do?

    - by Mayeenul Islam
    I had a blog site, i.e. http://example2.com, then I bought a new domain, i.e. http://example.com, and 301 (permanent) redirected example2.com to example.com. But in Google Webmaster Tools, when I get a 404 and click into the link to look at the "Linked from" tab, it shows links like:

    http://example.com/post-1
    http://example2.com/feed
    http://example2.com/post-1

    According to Google, if you change your domain you should keep the redirection in place for at least 4-6 months, and that period has almost passed. So why does Google still have traces of my old site? The issue matters because I don't want to pay for the old domain anymore. I tried deleting my existing sitemap.xml and recreating it from the new site, but such links are still stored. What can I do?

    Read the article

  • Do PHP-FPM (and other PHP handlers) need execute permissions on the PHP files they're serving?

    - by Andrew Cheong
    I read in a post at Server Fault that PHP-FPM needs execute permissions. However, the answer in When creating a website, what permissions and directory structure? only grants read and write permissions to PHP-FPM. Maybe I don't quite understand how PHP handlers (or CGI in general) work, but the two claims seem contradictory to me. As I understand it, when Apache / Nginx gets a request for foobar.php, it "passes" the file to an appropriate handler. That is, I imagine it's as if www-root (or apache, or whoever the web server is running as) were to run some command like /usr/sbin/php-fpm foobar.php. Actually, no, that's naive, I just realized: PHP-FPM must be a running instance (if it's to be performant, cache, etc.), so probably PHP-FPM is just being told, "Hey, quick, process this file for me!" In either case, I don't see why execute permissions are necessary; it's not as if the web server needs to literally execute the file, i.e. ./foobar.php. Is the Server Fault answer simply mistaken?

    Read the article

  • Hijax == sneaky Javascript redirects? Will I get banned from Google?

    - by Chris Jacob
    Question: Will I get penalised for "sneaky JavaScript redirects" by Google if I have the following Hijax setup (which requires a JavaScript redirect on the page indexed by Google)?

    Goal: I want to implement Hijax so that AJAX content is accessible to non-JavaScript users and search engine crawlers.

    Background: I'm working on a static file server (GitHub Pages). No server-side tricks are allowed (so Google's #! "hash bang" solution is not an option). I'm trying to keep my files DRY: I don't want to repeat the common OUTER template in all my files, i.e. header, navigation menu, footer, etc. They will live in the main index.html.

    The Hijax setup: index.html contains all the OUTER html/css/js, i.e. the site's template. index.html has a <div id="content"> which defaults to containing the "homepage" html. index.html has a navigation menu, with a Hijax link to an "about" page. With JavaScript disabled (e.g. a crawler) it follows the link to /about.html. With JavaScript enabled (e.g. most people) the link updates the url hash fragment to /#about and jQuery replaces the <div id="content"> innerHTML with $("#content").load("about.html #inner-container");.

    AJAX content: about.html does not contain anything extra to try and cloak content for crawlers. The about.html file contains enough HTML / CSS / JavaScript to display /about.html as a standalone page with its own META data, e.g. <html><head><title>About</title>...</head><body></body></html>. about.html has NO OUTER HTML template (i.e. header, navigation menu, footer, etc). The about.html <body> contains a <div id="inner-container"> which holds the content that is injected into index.html. about.html has a <noscript> tag as the first child of <body> which explains to non-JavaScript users that they are viewing the about page "inner content", with a link to the index.html page to get the full page layout with the menu.

    The (sneaky?) redirect: Google indexes the /about.html page. However, when a person with JavaScript enabled visits that page there is no OUTER html template (e.g. header, navigation menu, footer, etc). So I need to do a JavaScript redirect to get the person over to the /#about page (deep-linking to the "about" page "state" in index.html). I'm thinking of doing a "redirect on click or after 10 seconds". The end result is that the user ends up on an "enhanced" page back on index.html with all its OUTER template, but the core "page" content is practically identical.

    Known issue with inbound links (e.g. share / bookmarking): It seems that if a user shares the URL /#about on their blog, then when allocating inbound links to my site Google ignores everything after the # and allocates the value to the / page. See: http://stackoverflow.com/questions/5028405/hashbang-vs-hijax/5166665#5166665. I can only try and minimise this issue by offering "share" buttons on the page with the appropriate urls, i.e. /about.html.

    Duplicate: Sorry, I posted this same question over on http://stackoverflow.com/questions/5561686/hijax-sneaky-javascript-redirects-will-i-get-banned-from-google and then realised it probably belongs more on this Stack Exchange site. Not sure if I should delete the Stack Overflow question or just leave it on both sites? Please leave a comment.

    Read the article

  • Can I test my affiliate ID on a dummy webpage without it being suspended?

    - by user359650
    I've recently applied for the Amazon affiliate program (and was accepted), as I'm planning on advertising books I read on my website. Before going live with my website, I would like to:

    1. Test the whole affiliate program to make sure it's working properly.
    2. Buy the books I will review and promote on my website through my own affiliate links, in order to get some cash back and therefore save money.

    To do so, I thought about setting up a simple HTML page (on the actual domain I applied for) which will just list the products I will buy before going live. That way I test, get some cash back, and don't expose my website (brand, content...) before going live. Can I do this without having my account suspended by Amazon? I.e. won't Amazon think I only applied to the program to get some cash back, and will Amazon be happy receiving affiliate traffic from an almost empty website?

    Read the article

  • Using Disqus for a website (question on SERPs and backlinks)

    - by Homunculus Reticulli
    I am building a website and am trying to decide between writing my own commenting system and using Disqus. One of the main deciding factors is that (obviously) I want comments left on my page to be shown in SERPs. However, I remember reading somewhere that Disqus loads comments into a page using AJAX, and therefore the comments are "invisible" as far as Googlebot and other search engine crawlers are concerned. Could someone confirm or refute this? The other question I have is: as a commenter, when I place a comment on another website using Disqus (including any links I may add to my comment), do the links in my comment count as backlinks (in other words, are they "dofollow" or "nofollow" links)?

    Read the article

  • How frequently does Googlebot fetch sitemaps? Does it depend on PageRank?

    - by JITHIN JOSE
    How frequently does Google fetch sitemaps? I am working with a high-traffic website that normally gets 30 new posts per minute, but currently it provides a sitemap that links only to the newest 100 posts (about 3 minutes' worth). Is this method enough? Do bots fetch sitemaps every 3 minutes? Do I need to change the sitemaps to list all 5M posts (using a sitemap index)? How would this change affect traffic and PageRank? Does Googlebot remove URLs that were previously listed in the sitemap but are not listed any more?
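
    For the 5M-post case, a minimal sketch of a sitemap index, assuming the posts are split into chunked sitemap files of at most 50,000 URLs each (the sitemap protocol's per-file limit); the domain and file names are placeholders:

        <?php
        // Hypothetical generator: writes a sitemap index that points at one
        // chunked sitemap per 50,000 posts, so every post stays discoverable
        // while the frequently-changing "newest posts" sitemap stays small.
        $totalPosts = 5000000;   // assumed post count
        $perSitemap = 50000;     // protocol limit per sitemap file
        $chunks     = (int) ceil($totalPosts / $perSitemap);

        $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        $xml .= '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        for ($i = 1; $i <= $chunks; $i++) {
            $xml .= "  <sitemap>\n";
            $xml .= "    <loc>http://example.com/sitemaps/posts-{$i}.xml</loc>\n";
            $xml .= "  </sitemap>\n";
        }
        $xml .= '</sitemapindex>' . "\n";

        file_put_contents('sitemap_index.xml', $xml);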

    Read the article

  • Mail Hosting That Will Allow Outbound Bulk Mail?

    - by user249493
    No, I'm not a spammer! I do volunteer work for a non-profit social services agency. They send out daily email with several hundred recipients on each message. Their web hosting company has been flagging the email as spam due to the volume. So I'm looking for an email hosting provider that won't do that. (I can separate out the web hosting function; we just need mail hosting right now.) They can't use something like MailChimp, Constant Contact, or Vertical Response because some of the mail is just inbound emails they aggregate and send out, and they don't want the overhead of "rebuilding" it in a "newsletter" service. I think that Google Apps for Business might be a good solution, but the pricing is just too high for this under-funded non-profit. I've applied for the non-profit discount but haven't heard back yet. Is there mail hosting service that might fit their needs? Thanks in advance.

    Read the article

  • SEO and new site - visibility best practices

    - by Ispuk
    Since launching a new web site, I have been wondering what the best practices are to make the site's visibility grow faster than it would by just leaving the site online. I mean, which internet channels are good for speeding up the visibility of a new site? Can anyone share some tricks they use when launching a new site? I'm not talking about spam, advertising or SEO tech tips (the site is well built with all the main SEO tech tricks in place).

    Read the article

  • Impressions and traffic dropped by 70 %

    - by Louise
    Can anyone advise why my impressions and traffic dropped? I used to have very generic keywords, such as: anti aging, anti wrinkle, face cream, eye cream. I thought they were bad and made the keywords more specific: anti wrinkle eye cream, anti aging face cream, etc. Following that change, my impressions and traffic dropped dramatically! I used to get 45+ visitors a day; now I get fewer than 15 visitors a day. What is the way forward? I thought what I did to the keywords was good.

    Read the article
