Search Results


  • Tips for managing internal and external links using WordPress [closed]

    - by keruilin
    So I'm looking for ways to optimize my site for both users and search engines. I've read several articles and looked at several different plugins, and to say the least, I'm thoroughly confused as to what the best practices for managing internal and external links are. Here is a list of some of my questions: Which internal links should be set to "nofollow"? Which external links should be set to "nofollow"? To what degree does actively managing links contribute to your PageRank? Should you use "nofollow" blindly on all links in comments? If a link to an external site is broken (404 or whatever), should you "nofollow" that link? What about "noindex"? As you can see, lots of questions. I'm hoping that you experienced webmasters can give a newb some best-practice advice.
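
    For reference, the markup side of this is just a rel attribute on the anchor; a minimal HTML example (the URLs are placeholders):

        <!-- internal link, followed as normal -->
        <a href="/about/">About us</a>
        <!-- external or comment link carrying the nofollow hint -->
        <a href="http://example.com/some-page" rel="nofollow">some page</a>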

  • Where to store users' consent (EU cookie law)

    - by Mantorok
    We are legally obliged in a few months to obtain consent from users before we store any cookies on their PCs. My query is: what would be the most effective way of storing this consent, to ensure that users don't get repeat requests for it in the future? For authenticated users I can obviously store consent against their profile, but what about non-authenticated users? My initial thought, ironically, was to store the given consent in a cookie...
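
    A minimal sketch of that cookie-based approach for non-authenticated users, assuming PHP; the cookie name and one-year lifetime are arbitrary choices:

        <?php
        // Remember consent for anonymous visitors in a long-lived first-party
        // cookie (the name and one-year lifetime are arbitrary choices).
        if (isset($_POST['cookie_consent']) && $_POST['cookie_consent'] === 'yes') {
            setcookie('cookie_consent', '1', time() + 365 * 24 * 60 * 60, '/');
            $_COOKIE['cookie_consent'] = '1'; // make it visible on this request too
        }
        if (!isset($_COOKIE['cookie_consent'])) {
            // No consent recorded yet: show the banner and set no other cookies.
        }
        ?>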

  • Google Analytics Campaigns Not Tracking E-Commerce

    - by Paul
    I am running email campaigns via MailChimp and tracking the success of my campaigns via Google Analytics. I can successfully see data being tracked for Reporting > Conversions > Ecommerce (receiving data) and Reporting > Traffic Sources > Campaigns (receiving data). However, I am not receiving any Ecommerce data for the individual campaigns: Reporting > Traffic Sources > Campaigns > Ecommerce (no data). So I see data like: Visits: 18,501; Revenue: $0.00. Everything I have read leads me to believe this should just "work" if Ecommerce is set up. Is there some additional action I need to take to make this work? Any help would be appreciated!
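
    For context, a sketch of the e-commerce calls a receipt page fires, assuming the classic ga.js tracker (all values are placeholders; Universal Analytics has an equivalent ecommerce plugin). If these fire site-wide, e-commerce data is being collected, which matches what the asker sees:

        // ga.js e-commerce calls on the receipt page (placeholder values)
        _gaq.push(['_addTrans',
          '1234',        // transaction ID
          'My Store',    // affiliation
          '29.99',       // total
          '1.29',        // tax
          '5.00',        // shipping
          'San Jose',    // city
          'California',  // state
          'USA'          // country
        ]);
        _gaq.push(['_addItem',
          '1234',        // transaction ID (links item to transaction)
          'SKU-1',       // SKU
          'T-Shirt',     // product name
          'Apparel',     // category
          '11.99',       // unit price
          '2'            // quantity
        ]);
        _gaq.push(['_trackTrans']);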

  • Why do 410 pages show as errors in Google Webmaster Tools?

    - by ElHaix
    To remove links from our site, we return a 410 status code for the links we want removed, and the page shows "The page you requested was removed." In Webmaster Tools, I see all the 410 pages under Crawl Errors / Not Found. I'm worried that their appearance in Crawl Errors could be negatively affecting our SEO rankings. Is that the case, and if so, should I change the return code from 410 to something else?
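
    For reference, two common ways of emitting the 410s in Apache, assuming that's the stack (paths are placeholders):

        # mod_alias: respond 410 Gone for a single removed URL
        Redirect gone /removed-page.html
        # mod_rewrite: respond 410 Gone for a whole removed section
        RewriteRule ^old-section/ - [G,L]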

  • Htaccess 301 redirect dynamic URL

    - by Jarede
    I don't know a whole lot about .htaccess rules, so forgive me and help me ask the correct question. Currently I have a .htaccess rule like:

        RewriteRule ^surveys/(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L]

    which I want to 301-redirect to the URLs matched by:

        RewriteRule ^(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L]

    I'm unsure of the correct syntax to make these redirect correctly.
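
    One hedged sketch of that reading, reusing the asker's own patterns: externally 301 the old /surveys/... URLs to the same path without the prefix, then keep the internal rewrite for the new form. Rule order and flags may need adjusting for the real site:

        # 301 old /surveys/... URLs to the same path without the /surveys prefix
        RewriteRule ^surveys/(\S+/directory/\d+/\d+/entry/\d+)/?$ /$1 [R=301,NC,L]
        # then map the new URLs internally, as before
        RewriteRule ^(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L]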

  • Is there any good reason I would want my website to be framed?

    - by minitech
    I'm building a website that's not security-critical in any way at all, so having somebody put a page in an <iframe> is not particularly dangerous to its users. However, as my website doesn't have script plugins that will be used anywhere else, is there any reason why I shouldn't just apply: X-Frame-Options: Deny to every page on my website? Is there any valid reason for any other website to embed mine? I've seen plenty of content-stealing ones and attempts to hijack user accounts, but never an actual good usage of frames that's not an explicit feature of the website.
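
    For reference, the one-liner that applies this site-wide in Apache, assuming mod_headers is enabled (IIS and nginx have equivalents):

        # Send X-Frame-Options: DENY on every response (requires mod_headers)
        Header always set X-Frame-Options "DENY"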

  • Source code not matching uploaded HTML file

    - by benhowdle89
    I'm not sure if this is the right place to ask, but I'm having a hugely frustrating problem with Coda and my website (I'm not sure which one is causing the issue). I'm using Coda to make changes to my website; Coda uses built-in FTP to save changes to your web page, so when you hit Save, it uploads the new file. I've been using Coda for months and never had a problem until now. I am making changes in the HTML of my index.php and hitting Save. It successfully uploads the file, but no changes are reflected in the source code in ANY browser. I even logged into cPanel on my website (i.e. www.example.com:2082) and looked at the file: the changes have been made successfully. But in the actual web page's source code in the browser, no changes?? I have tried adding [...] which made no difference. Interestingly, when I make changes to style.css the changes are instant. I have emptied the cache on all of my browsers but I'm still having the issue. Does this sound like a Coda problem, or has anyone heard of such a thing?
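
    One way to rule out an intermediate cache (a guess, but cheap to check) is to compare what the server actually sends with what the browser shows; a shell sketch, with the hostname as a placeholder:

        # Headers only: Age, X-Cache, or long Cache-Control values hint at a proxy
        curl -I http://www.example.com/index.php
        # Fetch with a cache-busting query string and eyeball the first lines
        curl -s "http://www.example.com/index.php?nocache=$(date +%s)" | head -40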

  • SEO issue - External links rel="nofollow" or NOT!

    - by Mary Melody
    I previously created a website for promo codes and coupons. It has hundreds of external links to the retailers' websites, and I used rel="nofollow" on them. But my site's SEO ranking was very, very bad, especially on Google. I then removed the rel="nofollow" attribute, but there was no improvement. The only difference between this site and my other sites is the external links; my other sites rank well on Google. Now I'm creating a site for reviews, so I'm in a similar situation again. I just want to know: how does SEO interact with external links, and what could have happened in my case?

  • Make Apache encode or replace quotes instead of escaping them?

    - by mplungjan
    In the documentation I read:

        Format Notes
        For security reasons, starting with version 2.0.46, non-printable and other special characters in %r, %i and %o are escaped using \xhh sequences, where hh stands for the hexadecimal representation of the raw byte. Exceptions from this rule are " and \, which are escaped by prepending a backslash, and all whitespace characters, which are written in their C-style notation (\n, \t, etc). In versions prior to 2.0.46, no escaping was performed on these strings so you had to be quite careful when dealing with raw log files.

    This is a problem for Analog, which is still the handiest analyser I use. I get

        .... "GET /somerequest?q=\"quoted string\"&someparm=bla"

    in the logfile, and it is of course flagged as corrupt, since Analog expects

        .... "GET /somerequest?q=%22quoted string%22&someparm=bla"

    or similar. I realise I can pre-process using something like

        perl -p -i.bak -e 's/\\"/%22/g' logfile

    but I'd rather not have to add this step for these files, which are 50-90MB zipped per day. Thanks for any pointers.
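
    One way to avoid a separate rewrite pass over the archives is to stream the conversion, reusing the substitution above; a sketch with placeholder file names, modifying nothing on disk:

        # Decompress and normalize on the fly, then point Analog at the result
        zcat access_log.gz | perl -pe 's/\\"/%22/g' > /tmp/access_log.analog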

  • What is the best shopping cart or implementation for unlimited users posting unlimited products? [closed]

    - by Matt
    I've been working with X-Cart a lot lately, and I was thinking about using it for a much larger site, but I don't know if it can handle what I'm looking for. I need a platform or strategy that can allow for as many users as possible, where each can post multiple products (hopefully up to a hundred, but that's less important) in their own private catalogs. So what am I looking for? With X-Cart, I'm used to customizing it with jQuery, Smarty, and PHP, so I can handle that much.

  • Why do different browsers return different search results at Google and how can I prevent it?

    - by Sei
    I am running some websites and am constantly checking keyword rankings on google.com. It is really important for us to see the organic search results without logging in or setting a specific location. This morning my colleague and I checked the same ranking on both IE and Firefox, and the results are surprisingly very different (it almost feels like IE is logged in, because the ranking is much higher, while in reality it is not). I have changed computers and the same problem occurred. It did not happen before. Can anyone tell me why this is?

  • How to create email request forms and an auto-responder?

    - by mfc
    I'm building a site in CSS and I'm pretty new to any code or script other than HTML and CSS. I'm trying to create a landing page that requires an email address from visitors, and to set up an auto-responder that sends to that newly submitted address. This would also serve as a signup for email newsletters. I have some idea of how to create the form and have looked into it a bit, but I don't know how to make it a requirement for getting past the landing page and into the actual website, or how to set up the auto-responder. Any help would be much appreciated, or if someone knows of a source that explains how to do this in particular, that would be wonderful. I tried lynda.com, but everything is so general and I can't seem to find info on exactly how to do this, even though I know it's quite common. Thanks!
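
    A minimal sketch of the pattern, assuming PHP (file names like landing.php and home.php are hypothetical): validate the address, remember it in the session so the landing page can gate the rest of the site, and send a simple auto-response:

        <?php
        // landing.php - require a valid email before letting the visitor past
        // the landing page, then send a simple auto-response.
        session_start();
        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            $raw   = isset($_POST['email']) ? $_POST['email'] : '';
            $email = filter_var($raw, FILTER_VALIDATE_EMAIL);
            if ($email !== false) {
                $_SESSION['newsletter_email'] = $email; // the "gate" flag
                // TODO: also append $email to the newsletter list here
                mail($email,
                     'Thanks for signing up',
                     "Welcome! You're now on our newsletter list.",
                     'From: news@example.com');
                header('Location: /home.php');
                exit;
            }
        }
        // Otherwise (re)display the signup form, e.g. include 'form.html';

    Every other page would then check for the session flag and bounce visitors back to the landing page if it is missing.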

  • Switching hosting from Google Apps to GoDaddy

    - by Sahir M.
    I am in a pickle here. I just bought a hosting service from GoDaddy and I already have a domain registered with Google Apps. So I went into the domain control center from Google Apps and I changed the nameservers to point to GoDaddy's and then I deleted all the A records and also removed the www and main CNAME. Then I created an A record with host "@" which points to the IP address I got from GoDaddy (this is the server address I see under the category Server in the GoDaddy Hosting Control Center). Also, I see that in this domain center, the domain forwarding is set to forward to my address "www.domainsite.com". So right now when I go to my website I see the error "This website is temporarily unavailable, please try again later." Does anyone know how to set this up properly or know what problem I could be having? Thanks!
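
    Once the new records have had time to propagate, a quick check from a shell can confirm what the rest of the world sees (the domain is a placeholder); note, too, that domain forwarding usually needs to be switched off once an A record points directly at the hosting server:

        # Confirm the nameservers and the A records now point at GoDaddy
        dig +short NS domainsite.com
        dig +short A domainsite.com
        dig +short A www.domainsite.com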

  • How can I turn on compression for my IIS 7 web sites?

    - by Richard A
    I am using IIS 7 and trying to optimize as much as possible. I had one suggestion about compression, but I am not sure how to turn it on. I am familiar with making changes to Web.config but not with making IIS 7 changes. What makes it more difficult is that I am using Windows Azure, where new images are created every time I publish. Can someone explain whether there's more than one way to turn on compression, and how I can do it?
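
    For reference, one way is in Web.config itself, which suits Azure's re-imaged instances since the setting deploys with the site; a minimal sketch, assuming the IIS compression modules are installed:

        <!-- Web.config sketch: enable static and dynamic compression in IIS 7 -->
        <configuration>
          <system.webServer>
            <urlCompression doStaticCompression="true" doDynamicCompression="true" />
          </system.webServer>
        </configuration>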

  • Doing an affiliate program with shops that don't already have one set up

    - by Jacobo Polavieja
    I am developing an online shop and have managed to agree on a commission per sale with other shops. Now, the problem is that these other shops don't have any kind of affiliate system. So my question is: is there any way we could arrange an easy solution for this? They don't plan to develop anything, as they are small shops, so my only guess right now is to track on my site how many times the links to them have been clicked, to get an estimate of potential clients; but I don't know how they can tell that a user came through my site and purchased something. Thank you very much for your help!
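
    A minimal sketch of the click-logging half, assuming PHP (the file name out.php, the shop key, and the tagged URL are all hypothetical): route outbound links through a redirect script that records the click and tags the URL, so the shop can at least see the referral in its own analytics:

        <?php
        // out.php?shop=shopname - outbound redirect that logs each click
        $shops = array(
            'shopname' => 'http://www.partnershop.example/?utm_source=mysite',
        );
        $key = isset($_GET['shop']) ? $_GET['shop'] : '';
        if (isset($shops[$key])) {
            $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-';
            // one line per click: timestamp, shop key, referring page
            error_log(date('c') . "\t$key\t$ref\n", 3, __DIR__ . '/clicks.log');
            header('Location: ' . $shops[$key], true, 302);
        } else {
            header('HTTP/1.1 404 Not Found');
        }
        exit;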

  • Google Analytics Export API - nextPagePath data

    - by Btibert3
    I am probably missing something obvious, but I do not understand. When I query:

        start.date  = DATE_START,
        end.date    = DATE_END,
        dimensions  = c("ga:pagePath", "ga:previousPagePath"),
        metrics     = c("ga:pageviews"),
        filters     = mypageofinterest,
        table.id    = "ga:mytable",
        max.results = RESULTS

    the data return as expected: all of the previous pages, including (entrance). However, when I modify the code to use nextPagePath:

        start.date  = DATE_START,
        end.date    = DATE_END,
        dimensions  = c("ga:pagePath", "ga:nextPagePath"),
        metrics     = c("ga:pageviews"),
        filters     = mypageofinterest,
        table.id    = "ga:mytable",
        max.results = RESULTS

    only one row of data is returned, and pagePath and nextPagePath are identical to each other. I replicated this result using the Query Explorer. What am I missing or doing wrong? I was expecting to see a large number of "next" pages, including (exit). Thanks in advance.
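
    A hedged workaround sketch in the same R style as the question: instead of requesting ga:nextPagePath, filter ga:previousPagePath down to the page of interest; each resulting ga:pagePath row is then a page viewed immediately after it (names like DATE_START and "ga:mytable" follow the question and are assumptions):

        # "Next pages" of page X are the rows where previousPagePath == X
        start.date  = DATE_START,
        end.date    = DATE_END,
        dimensions  = c("ga:previousPagePath", "ga:pagePath"),
        metrics     = c("ga:pageviews"),
        filters     = "ga:previousPagePath==/my/page/of/interest",
        table.id    = "ga:mytable",
        max.results = RESULTS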

  • Getting the user back where they came from with mod_auth_form

    - by bmargulies
    Using the mod_auth_form module in Apache HTTPD 2.4.3, I am looking for a way to have the user redirected to their original desired target after completing a login. That is, if I have

        <Location /protected>
            ... form auth config here ...
        </Location>

    the user might browse to /protected/a or to /protected/b. In either case, they will be presented with the login form. However, as far as I can see, I must specify a single 'success' URL. I'm wondering if I'm missing some Apache feature that would allow me to, for example, make the redirect to the login form go to something like

        /login.html?origTarget=/protected/a

    via some syntax on the AuthFormLoginRequiredLocation statement?
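
    For context, a sketch of the configuration in question (paths, realm, and file names are placeholders); AuthFormLoginRequiredLocation names one fixed login URL, which is exactly the limitation being asked about:

        <Location /protected>
            AuthType form
            AuthName "Protected Area"
            AuthFormProvider file
            AuthUserFile /etc/httpd/passwd
            AuthFormLoginRequiredLocation /login.html
            Session On
            SessionCookieName session path=/
            Require valid-user
        </Location>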

  • Is this safe? <a href=http://javascript:...>

    - by KajMagnus
    I wonder if href and src attributes on <a> and <img> tags are always safe w.r.t. XSS attacks if they start with http:// or https://. For example, is it possible to append javascript: ... to the href or src attribute in some manner, to execute code? (Disregarding whether or not the destination page is e.g. a phishing site, or whether the <img src=...> triggers a terribly troublesome HTTP GET request.) Background: I'm processing text with markdown and then sanitizing the resulting HTML (using Google Caja's JsHtmlSanitizer). Some sample code in Google Caja assumes all hrefs and srcs that start with http:// or https:// are safe; I wonder if it's safe to use that sample code. Kind regards, Kaj-Magnus
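
    The usual belt-and-braces check is a scheme whitelist rather than trusting any particular library sample; a minimal JavaScript sketch (not Caja's actual implementation):

        // Accept an href/src only if the raw string starts with an http(s)
        // scheme. A string that begins with "http://" has scheme http; a
        // "javascript:" scheme cannot also occupy that position, so the
        // prefix test is sound for the scheme itself. Trimming guards
        // against leading whitespace/control characters.
        function isSafeUrl(url) {
          return /^https?:\/\//i.test(String(url).trim());
        }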

  • GA tracking utm query params after hashbang

    - by hybrid9
    We currently use a hashbang for the portion of our site that generates dynamic content, which can also be deep-linked. Our analytics team wants to use utm parameters to track referral traffic from social networks. We are using Universal Analytics (analytics.js) as well as GTM. Will GA pick up the query parameters after the hashbang, or do they always have to go before it? For example:

        1. example.com/#!/some/content?utm_source=foo&utm_campaign=bar
        2. example.com?utm_source=foo&utm_campaign=bar/#!/some/content

    In #1, I'm concerned that the utm params won't be recorded, and in #2 the page will break or the URL could be incorrectly written. How does GA pull in those parameters: location.search? A regex? Can I get away with using either?
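
    A hedged sketch of a manual fallback for form #1, assuming analytics.js reads campaign parameters from location.search only: copy the utm values out of the fragment into the campaign fields before sending the pageview:

        // Lift utm_source/utm_campaign out of the #! fragment and hand them
        // to analytics.js, which otherwise only parses location.search.
        function utmFromHash(name) {
          var m = location.hash.match(new RegExp('[?&]' + name + '=([^&]*)'));
          return m ? decodeURIComponent(m[1]) : null;
        }
        var src = utmFromHash('utm_source');
        var cmp = utmFromHash('utm_campaign');
        if (src) ga('set', 'campaignSource', src);
        if (cmp) ga('set', 'campaignName', cmp);
        ga('send', 'pageview');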

  • Leveraging a hosted web font service from a local development server?

    - by Tom Auger
    There are a number of popular web font services on the market today who "host" the fonts and serve them to your web page via javascript or CSS pointing to remote locations. For example http://webfonts.fonts.com or http://typekit.com However, there seems to be an issue when you're developing on a local testing server - the remote font services don't validate the font and return 403 access denied errors and the like. What workarounds are there for using remote services such as a hosted font service, on a local development server?
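
    One common workaround, assuming the service allows whitelisting additional domains for your kit or project, is to develop under a hostname you control instead of localhost; a sketch with a placeholder hostname:

        # /etc/hosts - point a real-looking dev hostname at the local server,
        # then register dev.mysite.com in the font service's allowed domains
        127.0.0.1   dev.mysite.com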

  • Google doesn't index a subdomain. What can be the problem and what can be done?

    - by fudge
    Hi! I have a domain, let's call it example.com, which has a subdomain, games.example.com. I maintain a games forum using phpbbseo, located at games.example.com/forum. The problem is that the forum is not being crawled. I used Google's Webmaster Tools and verified that the page is seen by Google. P.S. There is a link from games.example.com to games.example.com/forum. What can I do? How can I make Google crawl my forum?

  • How to combat negative SEO?

    - by Perturbed
    Someone has decided to create a hate blog on a hosted blogging service (wordpress.com) that bashes my company. The blog contains posts that completely flame me and my service, and it contains complete falsehoods about how I run my business. Without going into details, I'm pretty sure the author of this blog is an owner of a competing service (although it is authored completely anonymously). Frankly, I'm not sure whether the content would qualify as defamation or not, but I really don't like the idea of spending money on a lawyer to even attempt to prove it. I also have no interest in retorting or even replying to the blog in any way; I feel this would lend credibility to the ludicrous claims that have been posted. Unfortunately, whoever wrote the blog was pretty smart about using the keywords that people commonly use to search for my service. Because my customer base is relatively small and local, our PageRank is not incredibly high. As a result, when someone Googles our business name, this blog is usually within the top five results (thankfully, it's never above the business' actual website, but it's usually within eyeshot). It's incredibly frustrating to hear from customers who have seen the link (luckily, most of the time they think the author is crazy). Is there anything I can do to combat this? Would it be worthwhile to set up my own hosted wordpress.com branded blog, in an effort to push this one down with a more active blog of my own? TL;DR: Someone made a hate blog using wordpress.com and it is now on the first page of my business' search results. What are my options?

  • Can Mailchimp APIs be used to send templated transactional email triggered by actions on my website?

    - by HenryW
    I am currently playing around with Mailchimp's APIs, but the documentation is not very clear to me. Here is what I actually want:

    1. Have the templates I created on Mailchimp be visible on my own server.
    2. Assign each template I made to a specific action (logged in, subscribed, created order, or new password). This is functionality that I have already tested with Mandrill, but there the template exists in Mandrill's account.
    3. If option 1 is not possible, can I still make my own template in my own environment and send that template out via Mailchimp or Mandrill?
    4. Should I use Mailchimp's services for this, or send the email directly from my own server?

    Currently used function:

        // NOTE: the original used an undefined $from_email_adress (typo);
        // fixed below to $from_email_address, and the curl handle is now
        // closed and the API response returned.
        function tep_mandrill_mail($to_name, $to_email_address, $email_subject,
                                   $email_text, $from_email_name, $from_email_address) {
            if (SEND_EMAILS != 'true') return false;

            $uri = 'https://mandrillapp.com/api/1.0/messages/send-template.json';

            // Build the JSON payload for Mandrill's send-template endpoint.
            $postString = '{
                "key": "xxxxxxxxxxx",
                "template_name": "sometemplatename",
                "template_content": [
                    { "name": "header", "content": "*|HEADERSTUFF|*" },
                    { "name": "main",   "content": "*|CONTENTSTUFF|*" },
                    { "name": "footer", "content": "*|FOOTERSTUFF|*" }
                ],
                "message": {
                    "subject": "' . $email_subject . '",
                    "from_email": "' . $from_email_address . '",
                    "from_name": "' . $from_email_name . '",
                    "to": [
                        { "email": "' . $to_email_address . '", "name": "' . $to_name . '" }
                    ],
                    "important": false,
                    "track_opens": true,
                    "merge": true,
                    "merge_vars": [
                        {
                            "rcpt": "' . $to_email_address . '",
                            "vars": [
                                { "name": "HEADERSTUFF",  "content": "' . $email_subject . '" },
                                { "name": "CONTENTSTUFF", "content": "' . $email_text . '" },
                                { "name": "FOOTERSTUFF",  "content": "paulvale-foot" }
                            ]
                        }
                    ],
                    "tags": [ "password_forgotten" ]
                },
                "async": false,
                "ip_pool": "Main Pool"
            }';

            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $uri);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, $postString);
            $result = curl_exec($ch);
            curl_close($ch);
            return $result;
        }

  • Copying an SSH Public Key to a Server

    - by Nathan Arthur
    I'm attempting to set up a git repository on my Dreamhost web server by following the "Setup: For the Impatient" instructions here. I'm having difficulty setting up public key access to the server. After successfully creating my public key, I ran the following command:

        cat ~/.ssh/[MY KEY].pub | ssh [USER]@[MACHINE] "mkdir ~/.ssh; cat >> ~/.ssh/authorized_keys"

    ...replacing the appropriate placeholders with the correct values. Everything seemed to go through fine: the server asked for my password and, as far as I can tell, executed the command. There is indeed a ~/.ssh/authorized_keys file on the server. The problem: when I try to SSH into the server, it still asks for my password. My understanding is that it shouldn't be asking for my password anymore. What am I missing?
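
    A common culprit (an assumption here, but cheap to check) is permissions on the remote side; sshd quietly ignores keys in group- or world-writable locations:

        # On the server: tighten permissions so sshd will trust the key file
        chmod 700 ~/.ssh
        chmod 600 ~/.ssh/authorized_keys
        # Then watch the verbose client output to see which keys are offered
        ssh -v [USER]@[MACHINE]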

  • Virtual Pageview Goal Funnel Not Tracking Correctly

    - by cphill
    I have an AJAX form that has three stages:

    1. The landing page, where a user fills out a form, selects between three question sets, and clicks Begin Assessment.
    2. The assessment page, where users fill out questions relating to the question set that they selected on the landing page.
    3. The results page, which shows whether they are at High Risk or Low Risk.

    Since this is an AJAX form that does not open a new page for each step of the process, I implemented a virtual pageview that fires on the load of each step of the form:

    1. /form/begin-assessment
    2. /form/assessment/* (* = three different virtual pageviews depending on the user's selection of the three sets of questions: /one, /two, /three)
    3. /form/finished-assessment

    I have set up three separate goals to track user progress through each step of the form assessment. Here is my goal setup:

        Goal Description:
            - Goal Type: Destination
        Goal Details:
            - Destination: /form/finished-assessment
            - Funnel: On
                Step 1: /form/begin-assessment (Required: Yes)
                Step 2: /form/assessment/one
        (Replace /one in Step 2 with /two or /three and you have my two other goals.)

    Now my goals are recording the correct data in the first step and show the completions in the destination, but the second step does not show any drop-offs; it shows the same data as the destination. Any ideas on what I set up wrong?
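
    For reference, a sketch of the virtual pageviews such a funnel expects, assuming Universal Analytics (analytics.js), with each call fired when the corresponding step of the AJAX form is shown:

        // Fired when the matching step of the AJAX form loads
        ga('send', 'pageview', '/form/begin-assessment');     // step 1
        ga('send', 'pageview', '/form/assessment/one');       // step 2 (or /two, /three)
        ga('send', 'pageview', '/form/finished-assessment');  // destination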
