Search Results

Search found 9727 results on 390 pages for 'llblgen pro'.

Page 177/390 | < Previous Page | 173 174 175 176 177 178 179 180 181 182 183 184  | Next Page >

  • How can I clone or mirror a site without SEO penalties for duplicate content?

    - by Amanda
    I am a web developer and I want to keep clones of the sites I've developed for clients, so that I have an "original copy" on a subdomain of my own website and can showcase my work to new clients. What is the best way to avoid getting my clients' original websites penalised for duplicate content? I am planning to have a robots.txt file that disallows all robots, as well as using <link href="http://www.client-canonical-site.com/" rel="canonical" /> in the <head> of the pages. Is that sufficient? Should I use rel=nofollow on all the links as well?
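
    For context, a minimal sketch of the setup described above, using placeholder names: the robots.txt on the showcase subdomain blocks all crawlers, and each mirrored page declares the client's live page as canonical.

        # robots.txt on the showcase subdomain: block all crawlers
        User-agent: *
        Disallow: /

        <!-- in the <head> of each mirrored page -->
        <link rel="canonical" href="http://www.client-canonical-site.com/" />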

    Read the article

  • SMTP server to deliver mail to Rails app, how?

    - by Gunchars
    all, this is my first question and I hope I chose the right place to post it. Here's what I need help with: I've been looking all day and I'm having a hard time finding an SMTP mail server that fits the following criteria: it should be lightweight (do one thing and do it well), and it should be able to route and deliver local mail to a Rails application. The second point could be accomplished in any number of ways. I'm running a VPS, so I have full freedom in how to implement this. It could, for example, put messages straight into the db, pipe them to a helper program that would then process them accordingly, or save messages in an mbox file and run a script after every received message. I'm building a small site, so traffic is not going to be a problem. If there are alternative ways to deliver messages to a Rails app, I'd gladly hear about them. Thank you. EDIT: After long searching, I think I've found what I was looking for. Exim is a mail server that can deliver local mail to pipes. Also, Rails 3 and ActionMailer can make it really easy to process incoming mail. More info here: http://www.exim.org/exim-html-current/doc/html/spec_html/ch29.html http://guides.rubyonrails.org/action_mailer_basics.html#receiving-emails
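
    As a concrete illustration of the Exim route mentioned in the edit, here is a minimal, untested sketch of a router/transport pair; the domain, script path, and user are hypothetical placeholders:

        # in the routers section: accept mail for the app's domain
        rails_app:
          driver = accept
          domains = myrailsapp.example
          transport = rails_pipe

        # in the transports section: pipe each message to a handler script
        rails_pipe:
          driver = pipe
          command = /srv/myrailsapp/script/handle_incoming_mail
          user = deploy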

    Read the article

  • Htaccess 301 redirect dynamic URL

    - by Jarede
    I don't know a whole lot about .htaccess rules, so forgive me and help me ask the correct question. Currently I have an .htaccess rule like:

        RewriteRule ^surveys/(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L]

    and I want URLs matching the old pattern to 301 redirect to the new pattern, handled by:

        RewriteRule ^(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L]

    I'm unsure of the correct syntax to make these redirect correctly.
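
    For what it's worth, an untested sketch of the 301 itself, placed above the internal rules: it strips the old surveys/ prefix with a redirect, after which the new rule can rewrite the request internally.

        RewriteRule ^surveys/((\S+)/directory/\d+/\d+/entry/\d+/?)$ /$1 [R=301,NC,L]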

    Read the article

  • Is Google DFP a replacement for ad rotate plugin?

    - by EPQRS
    I'm currently using the Ad-Rotate WordPress plugin on my WordPress site, and I recently came to know of Google DFP. I'm currently adding 1-5 ads per day, which will increase soon, and I'm wondering if Google DFP is an alternative to the Ad-Rotate plugin. I mainly want to show our own and our clients' ads, not AdSense. I'm just looking for an ad manager and was wondering if Google DFP is the right alternative. Where can I find a tutorial on how to use Google DFP (i.e., how to add ads)? (I already have an AdSense account.)

    Read the article

  • Joomla Backend running slow on localhost

    - by boothe
    I made a local backup of a Joomla site a few months ago to test changes before updating the live site. Everything worked fine. Today I checked the local version after a while, but when I open the backend (/administrator) it takes a long time for the site to load. I tried different things and accidentally disconnected my network connection; after that, everything loaded as fast as before. But when I reconnect the network connection, the problem reappears. I am running Joomla 1.5.14 on XAMPP 1.7.0.

    Read the article

  • Will adsense 'use' comments between the adsense opening and closing tags for ad targeting?

    - by SuperSpy
    I've got a simple website on which students of my school can see their timetable. I set up AdSense three weeks ago and everything is working fine. But I would like to help AdSense: since only a timetable and a few instructions are shown, there is scarcely any content for ads to relate to. The idea:

        <!-- google_ad_section_start -->
        <!-- Students like clothing ... and bargains, etc. -->
        <!-- google_ad_section_end -->

    Will this work? Or are there other ways to help AdSense?

    Read the article

  • Google Webmaster Tools Index dropped to Zero [closed]

    - by Brian Anderson
    Earlier this year I rebuilt my website using ZenCart. Immediately I saw the index drop from 59 pages to 0. I then signed up for Google Webmaster Tools and noticed the index status took a dramatic drop and has never recovered. I have worked to add content, and I know I am not done, but I have not seen any recovery of the index since. What confuses me is that when I look at the sitemap status under Optimization, it shows 1239 pages submitted and 1127 indexed. Most of my pages have fallen off page one for relevant search terms, and some are as far back as page 7 or 8 where they used to be on the first page. I have made some changes in the past week to robots.txt and sitemap.xml but have not seen any improvement. Can anyone tell me what might be going on here? My website is andersonpens.net. Thanks! Brian

    Read the article

  • How to prevent Google Analytics from adding a second slash between domain and page specific URL when viewing a page?

    - by Jeromy Anglim
    I have a blog, http://foo.tumblr.com. I sometimes go to Site Content - All Pages in Google Analytics, navigate to the page listing, and then click the icon to open that page on my blog. However, instead of opening http://foo.tumblr.com/post/1234/blah.html, Google Analytics opens http://foo.tumblr.com//post/1234/blah.html (i.e., it adds a second slash between the domain and the page-specific component of the URL). How can I stop Google Analytics from doing this?

    Read the article

  • Drop down menu for blogger with automatic link update

    - by Payo
    My mom has a recipe blog on Blogger with many categories ("Cakes", "Celebration Day", etc.), each filled with numerous recipe posts. Now I want to organize the blog in a better way and make a drop-down menu for all the categories (I already have labels), but I don't want to keep updating the menu whenever a new post is added. I want it to update automatically. Is there any way this can be done with CSS or HTML?
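
    Because Blogger serves every label at a stable URL of the form /search/label/Name, a plain HTML drop-down of label links never needs editing: new posts appear under their label automatically. A minimal sketch with example label names:

        <select onchange="if (this.value) location.href = this.value;">
          <option value="">Browse recipes...</option>
          <option value="/search/label/Cakes">Cakes</option>
          <option value="/search/label/Celebration%20Day">Celebration Day</option>
        </select>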

    Read the article

  • wget not respecting my robots.txt. Is there an interceptor?

    - by Jane Wilkie
    I have a website where I post csv files as a free service. Recently I have noticed that wget and libwww have been scraping pretty hard, and I was wondering how to curb that, even if only a little. I have implemented a robots.txt policy, posted below:

        User-agent: wget
        Disallow: /

        User-agent: libwww
        Disallow: /

        User-agent: *
        Disallow: /

    Issuing wget from my totally independent Ubuntu box shows that the robots.txt doesn't stop wget at all: wget http://myserver.com/file.csv still fetches the file. Anyway, I don't mind people grabbing the info; I just want to implement some sort of flood control, like a wrapper or an interceptor. Does anyone have a thought about this, or could you point me in the direction of a resource? I realize it might not even be possible. Just after some ideas. Janie
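
    For what it's worth, robots.txt is purely advisory, so a polite-crawler policy won't stop a determined client. One common server-side interceptor, sketched here for Apache, is to refuse requests whose User-Agent matches the offending tools:

        # .htaccess sketch: deny the named user agents outright
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (wget|libwww) [NC]
        RewriteRule .* - [F,L]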

    Read the article

  • Broken Links in different browsers

    - by kdorival
    Hi, I'm having problems with our website, http://www.accessiblehomehealthcare.com, which is a WordPress 2.7 site. All of a sudden our RSS links broke on the right side, which has happened before and I fixed it within 5 minutes. Now, when I fix it, it doesn't look right in different versions of IE or Firefox. I have IE 8 and Firefox 3.6.15, and it looks good for the most part, but there are a few places where the links are broken. In one browser the links look fine, but go to another page and the links or logos are broken. Certain parts of the website should be static (identical) across pages of the site, yet a link that is broken on one page renders perfectly on another. I was wondering whether there is a trick to keeping a WordPress site compatible with all browser versions, or whether this is a bigger issue. Any help or suggestions would be appreciated.

    Read the article

  • Tumblr Custom URL Not Working

    - by user3177012
    I have bought a domain that I want to use for a standalone Tumblr site, but I can't get the URL to be picked up in Tumblr's settings. The domain is a .com registered with 123-reg. I've set the CNAME to the correct Tumblr URL and also set the A record. When I visit the URL I get the Tumblr error page, so I know the domain is pointing; however, when I go to Settings in Tumblr and "Test" the URL, it says the domain is not pointing, and I can't save it. What could be the problem?
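
    For reference, the DNS shape Tumblr documented for custom domains around this time (the values may have changed, so treat these as an illustration rather than gospel): the bare domain gets an A record to Tumblr's IP, and the www host gets a CNAME.

        ; example zone entries with a placeholder domain
        example.com.      A      66.6.44.4
        www.example.com.  CNAME  domains.tumblr.com.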

    Read the article

  • What are the Consequences for using Relative Location Headers?

    - by Alan Storm
    According to the spec, Location headers used in a redirect require a server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: http://example.com/foo/baz/bar

    However, in 2012, most web browsers will recognize a relative path and redirect you to the new location using the original server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: /foo/baz/bar

    Are there any negative/surprising consequences to using relative URLs in Location headers? My particular concern is how Google/search engines will interpret this, but if there's anything else I'm not thinking about, I'd love to hear it.

    Read the article

  • No description for any page on the website is available in Google despite robots.txt allowing crawling

    - by Abhijit
    I seem to have the weirdest issue with search engine optimization. I have asked the IT folks at my university, asked people on the Joomla forums, and been trying to sort it out with Google Webmaster Tools for more than 2 months, to little avail. I want to know if I have some blatantly wrong configuration somewhere that is causing search engines to be unable to index this site. I noticed a similar issue with another website I searched for online (ECEGSA - The University of British Columbia at gsa.ece.ubc.ca), making me believe this might be a problem others are looking for an answer to. Here are the details. The website in question is http://gsa.ece.umd.edu/. It runs Joomla 2.5.x (latest). The site has been up since around mid December of 2013, and I noticed right from the get-go that it was not being indexed correctly on Google. Specifically, I see the following message when I search for the website on Google: "A description for this result is not available because of this site's robots.txt – learn more." The thing is, from December until around March I used the default Joomla robots.txt file, which is:

        User-agent: *
        Disallow: /administrator/
        Disallow: /cache/
        Disallow: /cli/
        Disallow: /components/
        Disallow: /images/
        Disallow: /includes/
        Disallow: /installation/
        Disallow: /language/
        Disallow: /libraries/
        Disallow: /logs/
        Disallow: /media/
        Disallow: /modules/
        Disallow: /plugins/
        Disallow: /templates/
        Disallow: /tmp/

    Nothing there should stop Google from indexing my website. Even more confusingly, in Google Webmaster Tools, under the "Blocked URLs" tab, when I test many of the links on the site, they all show up as "Allowed". I then tried adding a sitemap and referencing it in the robots.txt file. That did not help: same exact search result, same behavior in the "Blocked URLs" tab. Additionally, the "Sitemaps" tab reports an error for several links saying "URL is robotted out"; I tried those exact links under "Blocked URLs" and they are allowed! I then tried deleting the robots.txt file. No use; same exact problem. [Screenshot from Google's Webmaster Tools omitted.] At this point neither I nor anyone in the IT department here can give a rational explanation of why this is happening, and no one on the Joomla forums seems to understand what is going on. Based on what I have explained, does it seem that I have somehow set something incorrectly in robots.txt, or in .htaccess, or somewhere else?

    Read the article

  • Github Feed affecting my WordPress installation? [on hold]

    - by saul
    Any idea how this fork is affecting my site? I went to check my website log stats and realized it may be the cause of a strange redirect that keeps happening on my WordPress installation. Here's a line I found in my log:

        54.81.91.95 - - [07/May/2014:22:52:08 -0400] "GET /category/selfie/feed/ HTTP/1.1" 200 1826 "-" "feedzirra http://github.com/pauldix/feedzirra/tree/master"

    And this is the GitHub fork (or however these are called): https://github.com/feedjira/feedjira/tree/master. Basically, I think every time I update my categories ("selfie" in this case), I get redirected to install.php, probably by triggering some GET request against that feed. To the best of my knowledge, this feed parser fetches all URLs with this structure, tying them up, kind of like a DDoS attack? Any ideas how to go about this?

    Read the article

  • Using photoshop actions to decide if an image needs to be rotated

    - by voxobscuro
    I have Photoshop CS3 and I need to run a batch on a lot of pictures before I upload them. The pictures need to fit in a 600x800 box, yet be as big as possible within that box. Some of them are much wider than they are tall, and others are taller than they are wide. I am trying to put together a Photoshop action that will rotate, resize, and fill pictures as needed to make them as big as possible while staying within the 600x800 box. The only thing I haven't sorted out is how to tell Photoshop to rotate the image 90 degrees if that would allow the picture to be bigger within the constraints. Any ideas?

    Read the article

  • Glue Records creation

    - by FFrewin
    I need some information on the following issue, as I would like to have it clear in my mind. I have a VPS server. All the sites hosted on this VPS use nameservers on a .gr domain, like ns1.greekdomain.gr and ns2.greekdomain.gr. The .gr domain is one I own with a Greek registrar. Now I want to move two websites with .co.uk domain names to my VPS. The .co.uk domain names are registered with a UK-based registrar. When I went into the domain management panel and changed the nameservers of my domains to my ns1/ns2.greekdomain.gr nameservers, the panel returned an error about invalid nameservers. After digging, I found that my nameservers are not valid because they do not exist as records in the .uk registry. And here my big trouble started. The .co.uk registrar tells me I have to ask my hosting provider / .gr registrar to create a new record in the .uk registry for my nameservers. The .gr registrar tells me my UK registrar needs to create the record. At Nominet (the .uk registry), one employee tells me I need to ask my UK registrar; another employee (who seemed not to understand what I was asking) told me they cannot change my nameservers for me, and to contact anyone else (old hosting provider, UK registrar, .gr registrar) for help. I can't get help from anybody. I have been trying to transfer my websites to my VPS since last week and I can't. So, the question is: who is responsible, and who is able, to create glue records for my nameservers?
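
    As an aside on checking this yourself: a delegation-style lookup shows whether glue (or at least a resolvable A record) exists for a nameserver host. A quick sketch with dig, using the hostnames above:

        # follow the delegation from the root; any glue A records appear
        # in the referrals along the way
        dig +trace ns1.greekdomain.gr A

        # or simply confirm the nameserver host resolves at all
        dig +short ns1.greekdomain.gr A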

    Read the article

  • .htaccess redirect to subfolder in different domain, maintaining old domain in the URL

    - by Naoise Golden
    Redirects have been widely discussed and most problems solved, so I am sorry for opening yet another post about this, but none of the approaches I have tried work. I have a WordPress site hosted at http://mydomain.com/clientsdomain.com/wordpress. I would like to temporarily point http://clientsdomain.com/ at the above URL while keeping clientsdomain.com in the address bar. So, for example, http://clientsdomain.com/some/page would serve http://mydomain.com/clientsdomain.com/wordpress/some/page. Is this even possible with .htaccess? Or maybe with some configuration or plugin option in WordPress?
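
    Assuming clientsdomain.com can be pointed at the same server, one untested sketch is an internal rewrite keyed on the Host header, which keeps clientsdomain.com in the address bar:

        # .htaccess in mydomain.com's document root (sketch)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?clientsdomain\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/clientsdomain\.com/wordpress/
        RewriteRule ^(.*)$ /clientsdomain.com/wordpress/$1 [L]

    WordPress's own siteurl/home settings would also have to agree with the public domain, which may be where a plugin or a wp-config.php override comes in.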

    Read the article

  • Is a 302 redirect to a random URL from the homepage an SEO problem?

    - by CookieMonster
    I originally posted this on Stack Overflow, but I believe this is a better place to ask. My web application is very similar to notepad.cc, which redirects to a randomly generated URL upon access, e.g. http://myapp.com/roTr94h4Gd. (Please note that notepad.cc is not my site.) Probably because of this redirect feature, when I do "Fetch as Google" or "Fetch as Bingbot", I get a 302 and no HTML content, not even an <html></html> tag:

        HTTP/1.1 302 Moved Temporarily
        Server: nginx/1.4.1
        Date: Tue, 01 Oct 2013 04:37:37 GMT
        Content-Type: text/html
        Transfer-Encoding: chunked
        Connection: keep-alive
        X-Powered-By: PHP/5.4.17-1~dotdeb.1
        Set-Cookie: PHPSESSID=vp99q5e5t5810e3bnnnvi6sfo2; expires=Thu, 03-Oct-2013 04:37:37 GMT; path=/
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
        Pragma: no-cache
        Location: /roTr94h4Gd

    How should I avoid the 302 in this case? I suppose I could modify my site to prevent the redirect, but generating a random URL on each access is a necessary feature of my web app. I added a <meta name="fragment" content="!"> tag to my index page and set it to return a static snapshot of the page when the flag is set, but this still returns a 302. I also added a header to return 200 before redirecting, but that had no effect either. Could someone give me a good suggestion to solve this problem?

    Read the article

  • Google nofollow, Disavow and Link Removal Requests

    - by PsychoDad
    I am the owner of http://www.YouReview.net, and I constantly get requests from people asking me to remove links to their sites, threatening that otherwise they will disavow the links and I will face Google penalties. All of this is a bit frustrating because, first, I use nofollow on every link pointing outside the YouReview.net domain, and second, I've never heard of Google penalizing a site for linking to other websites. My question is twofold: do disavowed links penalize the site that was disavowed? And does the nofollow attribute on anchor tags absolutely guarantee that the link is not followed and not counted for search engine ranking? Why don't more people know about nofollow?

    Read the article

  • XMLHttpRequest not working, trying to test database connection [closed]

    - by Frederick Marcoux
    I'm currently creating my own CMS for personal use, but I'm stuck on one piece of code. I'm writing an installation script, and the AJAX request that tests whether the database connection works doesn't work. Here's my JS code:

        function testDB() {
            "use strict";
            var host = document.getElementById('host').value;
            var username = document.getElementById('username').value;
            var password = document.getElementById('password').value;
            var db = document.getElementById('db_name').value;
            var xmlhttp = new XMLHttpRequest();
            var url = "test_db.php";
            var params = "host=" + host + "&username=" + username + "&password=" + password + "&db=" + db;
            xmlhttp.open("POST", url, true);
            xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
            xmlhttp.setRequestHeader("Content-length", params.length);
            xmlhttp.setRequestHeader("Connection", "close");
            xmlhttp.send(params);
            $('#loader').removeAttr('style');
            if (xmlhttp.responseText !== '') {
                if (xmlhttp.readyState === 4 && xmlhttp.status === 200) {
                    $('#next').removeAttr('disabled');
                    $('#test').attr('disabled', 'disabled');
                    $('#test').text('Connection Successful!');
                    $('#test').addClass('btn-success');
                    $('#login').addClass('success');
                    $('#login1').addClass('success');
                    $('#db').addClass('success');
                    $('#loader').attr('style', 'display: none;');
                } else {
                    $('#next').attr('disabled', 'disabled');
                    $('#test').removeClass('btn-success');
                    $('#test').removeAttr('disabled');
                    $('#test').text('Test Connection');
                    $('#login').removeClass('success');
                    $('#login1').removeClass('success');
                    $('#db').removeClass('success');
                    $('#loader').attr('style', 'display: none;');
                }
            } else {
                $('#next').attr('disabled', 'disabled');
                $('#next').attr('disabled', 'disabled');
                $('#test').removeClass('btn-success');
                $('#test').removeAttr('disabled');
                $('#test').text('Test Connection');
                $('#login').removeClass('success');
                $('#login1').removeClass('success');
                $('#db').removeClass('success');
                $('#loader').attr('style', 'display: none;');
            }
        }

    And here's my PHP code:

        <?php
        $link = mysql_connect($_POST['host'], $_POST['username'], $_POST['password']);
        if (!$link) {
            echo '';
        } else {
            if (mysql_select_db($_POST['db'])) {
                echo 'Connection Successful!';
            } else {
                echo '';
            }
        }
        mysql_close($link);
        ?>

    I don't know why it doesn't work; I also tried jQuery's $.ajax, $.get and $.post, but nothing works...

    Read the article

  • Splitting a sitemap by content type

    - by James
    I am currently tasked with submitting our website's sitemap to the search engines every week. We have a module which offers sitemap generation, but we find it does not work very well: not all pages are included, and it does not split the sitemap by content type. I've used various (online and offline) tools to generate the sitemaps, which is not the problem. The problem is that after every generation (which takes most of each Monday) I have to manually go through the sitemap and categorise the links into products, pages, categories and subcategories. I've experimented successfully with XSL to split the sitemap, but it is still a labour-intensive process. Does anyone know of a good method to split the sitemap? Currently there are around 20,000 links (iirc) in total.
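
    For what it's worth, the same split can be scripted outside XSL. A minimal PHP sketch follows; the bucket names and URL patterns are assumptions about the site's URL scheme, and only <loc> is carried across (lastmod etc. would be copied the same way):

        <?php
        // Hedged sketch: split one big sitemap into per-type sitemaps
        // by matching each <loc> against a URL pattern.
        $buckets = array(
            'products'   => '~/product/~',
            'categories' => '~/category/~',
            'pages'      => '~.~',  // catch-all for everything else
        );

        $in  = simplexml_load_file('sitemap.xml');
        $out = array();
        foreach (array_keys($buckets) as $name) {
            $out[$name] = new SimpleXMLElement(
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>'
            );
        }

        foreach ($in->url as $url) {
            $loc = (string) $url->loc;
            foreach ($buckets as $name => $pattern) {
                if (preg_match($pattern, $loc)) {
                    // htmlspecialchars: addChild does not escape ampersands
                    $out[$name]->addChild('url')->addChild('loc', htmlspecialchars($loc));
                    break;  // first matching bucket wins
                }
            }
        }

        foreach ($out as $name => $doc) {
            $doc->asXML("sitemap-$name.xml");
        }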

    Read the article

  • How do I find information on who links to my sites?

    - by bobdobbs
    I'm trying to figure out whether there's a free way to get information on backlinks to my site. I've had Webmaster Tools and Google Analytics set up for years, but I can't find backlink data in either toolset. Webmaster Tools, under 'Traffic' - 'Links to your site', gives me the same message for all of my sites: "No data available". I haven't been able to find anything in GA that reports backlinks. I've heard of using "link:" as an operator in Google search, but for each of my sites this returns either zero or very few results, in cases where I know I have many backlinks; most of the links simply aren't shown. My thinking is that Google maintains a graph of who links to my site, so I figured they might let me see it, but I can't figure out how. I've found this tool on a spammy website: http://www.backlinkwatch.com. It offers more data on my backlinks than Google, and more results in exchange for a paid subscription. The free data looks good, but the results are limited and the site has popups and obnoxious ads. So, in short: how do I get data on who links to me? Is there a free way?

    Read the article

  • Google Analytics Campaigns Not Tracking E-Commerce

    - by Paul
    I am running email campaigns via MailChimp and tracking their success in Google Analytics. I can successfully see data being tracked for:

    Reporting > Conversions > Ecommerce (receiving data)
    Reporting > Traffic Sources > Campaigns (receiving data)

    However, I am not receiving any Ecommerce data for the individual campaigns:

    Reporting > Traffic Sources > Campaigns > Ecommerce (no data)

    So I see data like: Visits: 18,501; Revenue: $0.00. Everything I have read leads me to believe this should just "work" if Ecommerce is set up. Is there some additional action I need to take for this to work? Any help would be appreciated!

    Read the article

  • Redirect public traffic to a different subfolder, while local traffic remains unchanged

    - by ecnepsnai
    I would like to have local (intranet) HTTP traffic served from the /var/www/html folder while any public traffic is served from the subfolder /var/www/html/public. I've tried this configuration, with some variation, in httpd.conf:

        <VirtualHost PRIVATE-IP>
            DocumentRoot /var/www/html
            ServerName ecn
            ErrorLog /var/www/logs/error/private
            CustomLog /var/www/logs/access/private common
        </VirtualHost>

        <VirtualHost PUBLIC-IP>
            DocumentRoot /var/www/html/public
            ServerName PUBLIC-DOMAIN-NAME
            ErrorLog /var/www/logs/error/public
            CustomLog /var/www/logs/access/public common
        </VirtualHost>

    PUBLIC-IP, PRIVATE-IP, and PUBLIC-DOMAIN-NAME are all replaced with the correct values in the actual file. The problem is that local traffic can browse fine, but remote traffic is directed to the root folder and gets a 403 (because I have that folder blocked off through my .htaccess file). If I append /public to the URL, it works fine.

    Read the article
