Search Results



  • Is there a way to setup Clicktale tag in Google Tag Manager?

    - by Cubius
    Since GTM doesn't support the document.write() method, the standard ClickTale code doesn't work. Is there a workaround for this? A ClickTale employee sent me these instructions: Replace the document.write JS line above with the following: document.body.appendChild(externalScript); Example: <!-- ClickTale Bottom part --> <script type='text/javascript'> var externalScript = document.createElement('script'); var scrSrc = document.location.protocol=='https:'? 'https://clicktalecdn.sslcs.cdngc.net/': 'http://cdn.clicktale.net/'; scrSrc += 'www11/ptc/xxx-xxx-xxx-xxx.js'; externalScript.src = scrSrc; externalScript.type = 'text/javascript'; document.body.appendChild(externalScript); </script> <!-- ClickTale end of Bottom part --> I am not sure what to do with this. Has anyone tried something like this?

    Read the article

  • Somehow Google treats a properly 301'd URL as 200 and is still indexing the new content under the old page?

    - by user2178914
    We redirected all the old URLs to the new ones properly using htaccess. The problem is that Google is somehow still finding content in the old page (which it shouldn't) and stores it in the cache under the old URL rather than the new one. For example: Old page: http://www.natures-energies.com/iching.htm New page: http://www.natures-energies.com/index.php?option=com_content&view=article&id=760 If you type the old URL into the browser, it redirects. If you fetch the old URL as Googlebot in Webmaster Tools, the header says 301/permanently redirected. If I crawl it as any other bot, it still says 301 redirected. Even if you click the old link in Google, it redirects to the new URL. Only in its cache does it show the old URL, and moreover it shows the new content there! I am stumped as to how Google manages to grab the new content and put it under the old URL instead of the new one. One more interesting thing: if I request a cache of the new page, it shows a cache of the new content with the old URL. Any help would be appreciated; I am at my wits' end and think I have tried almost everything. Is there anything I'm missing? You can use this search to find the old URLs - maybe you'll see some patterns that I missed: site:www.natures-energies.com inurl:htm -inurl:https|index
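
    For reference, a minimal sketch of the kind of per-URL 301 rule this setup implies (mod_alias syntax; whether the site actually uses this form or mod_rewrite rules isn't stated in the question):

        Redirect 301 /iching.htm http://www.natures-energies.com/index.php?option=com_content&view=article&id=760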

    Read the article

  • Rewrite rule for 403

    - by Jitesh Tukadiya
    I have an .htaccess file that redirects to index.php when a file or directory is not found. My code is as below: <IfModule mod_rewrite.c> RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php?/$1 [L] </IfModule> Everything is working fine with this code. Now, when I get a Forbidden error (403), I would like it to redirect to index.php as well. Do you have any idea how to write an .htaccess rule for this purpose?
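
    One possible approach, sketched here rather than confirmed by the question, is Apache's ErrorDocument directive, which serves a local URL whenever the given status code would be returned:

        ErrorDocument 403 /index.php

    This line can sit in the same .htaccess file alongside the rewrite block above.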

    Read the article

  • Could multiple uses of the same keywords in image alt attributes hurt SEO?

    - by saratogahiker
    Let's say that on an e-commerce site that sells unique pens, on a particular pen's product page, the image of the pen has an alt attribute value of "unique red-striped pen", another product has "unique blue-spotted pen", and so on, so the keywords shared across all products are "unique" and "pen", which would also be helpful when it comes to SEO. However, if a person just goes to the general "unique pens" category page and sees a list of thumbnail images, each with the words "unique" and "pen" in the alt attribute, would that potentially have a negative impact on SEO by repeating the same keywords too many times?

    Read the article

  • Google not showing any pages from my site in the index after three months [on hold]

    - by Alex Coisman
    Despite having a sitemap and using Google Webmaster Tools, my site has not been added to the Google index at all in over 3 months. Here's the site: www.famouslefthandedpeople.com. As far as I know, I have done everything correctly. However, there must be something I am overlooking that is preventing Google from indexing the site. I do not have a robots.txt file, so allow/disallow isn't the issue. Although the content of the site is sparse, it is original and not duplicated internally or externally, so Panda/Penguin should not be a problem. I have reviewed the answers at "Why isn't my website in Google search results?" and I don't think they apply here. If it matters, I am using WordPress to create the site. What other factors should I be looking at in order to troubleshoot this?

    Read the article

  • Making a calendar tool with Google API. Should I use OAuth Service Account?

    - by Goluxas
    I'm creating a calendar tool for a client. He has his work calendars in a Google account, which I have access to. I'll be using the Google Calendar API for the tool, but I'm new to Google APIs and the OAuth stuff. I'll be using PHP. Do I need to use an OAuth2 Service Account, or should I make a regular OAuth2 Web Application Client ID from that Google account? In case this information is helpful, the tool I will be making will do this: 1) Allow my client to fill out a form which will create a new calendar and populate it with several events. 2) Send out an email to a mailing list twice a week listing changes that have been made to these calendars. I'll also be making a page that displays the iframes for each calendar, so his clients may see them even if they do not have a Google account.
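
    In case it helps frame the choice, here is a rough sketch of the service-account flow with the google-api-php-client library (the key-file path and calendar name are placeholders, and this is only an illustration of the approach, not code from the question; a service account is a separate Google identity, so the client's existing calendars would also need to be shared with the service account's email address):

        <?php
        require_once 'vendor/autoload.php'; // google/apiclient installed via Composer

        // Authenticate as the service account using its downloaded JSON key file.
        $client = new Google_Client();
        $client->setAuthConfig('service-account.json');          // placeholder path
        $client->setScopes([Google_Service_Calendar::CALENDAR]);

        $service = new Google_Service_Calendar($client);

        // Create a new calendar owned by the service account (step 1 of the tool).
        $calendar = new Google_Service_Calendar_Calendar();
        $calendar->setSummary('Client work calendar');           // placeholder name
        $created = $service->calendars->insert($calendar);

        echo 'Created calendar: ' . $created->getId() . PHP_EOL;

    A regular OAuth2 Web Application client ID, by contrast, requires the account owner to grant access interactively at least once, which is usually the deciding factor between the two.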

    Read the article

  • Comparisons of Javascript 'data grids'?

    - by Joe
    I've found plenty of questions between here and StackExchange of people asking for the 'best' data grid / data table, or one that has a particular feature, and plenty of lists out there (of various ages) listing the various data grid implementations ... but is anyone aware of any matrix of what features the various solutions implement? (eg, allow shift-click to select multiple; support checkboxes for selection; can update a regular table in-place; allow editing of cells; support websql or indexeddb for local caching; which browsers they support; infinite scroll; etc.) There's a generic 'javascript framework' comparison on wikipedia, which would be the sort of thing I'm looking for, but it doesn't go into detail on data grids. (which makes sense, as so many are extensions, not core features of those frameworks, and in the case of jQuery, there's lots of 'em.)

    Read the article

  • Why is my site cached 2 times per day?

    - by clarawood
    I have read the FAQs and checked for similar issues: YES. My site's URL (web address) is: www.adultxdating.com. Description (including timeline of any changes made): I have lost my top search listings over the last 4 months. I am still working on this but not getting proper guidance. The site is being cached 2 times in 24 hours. Sometimes the site comes back into the top 10 search listings for hundreds of keywords; sometimes it drops out beyond position 1000. Can anybody help me understand why this is happening? I have 200K+ incoming links and update the site regularly. Please help. Thanks, clara wood

    Read the article

  • Improving FAQ SEO with multiple pages?

    - by asdfasdf
    I have a client who has over 200 question/answer style content blocks. Neither the questions nor the answers are very long, and most of the questions are almost identical, with only a word or two differentiating them from the rest. Would SEO be helped or hurt if I were to put each Q&A on its own page, with the title of the page being the question asked, etc.? Or would that be considered "farming"? If not, what would be the best way (in the SEO world) to present all these Q&As? Thanks for any advice.

    Read the article

  • Problem with the domain name of a site in Google search [on hold]

    - by Jayadratha Mondal
    My domain provider doesn't support advanced DNS at the moment, so I have used an iframe to forward to my web server. Suppose my domain abc.com has a simple HTML page which opens xyz.com via an iframe. Google was showing abc.com in the search results, but for the last two days it has been showing xyz.com; other things like the description and the name of the site are OK, only the domain has changed to xyz.com, and I want it to show abc.com. When I type the full domain, Google shows the domain as I want. But if I type part of it or any keyword, it shows xyz.com. Does anyone have any idea how to get abc.com shown again? EDIT: I don't know who the domain name provider is because it's not mine. When I try to set a new A record it says "you need advanced DNS to do this; currently you are using simple DNS". I'm doing this from cPanel. Normally, as far as I know, there are 3 sections: 1 domain name, 2 A record, 3 CNAME. But I don't have any domain name section.

    Read the article

  • How do I forward/redirect a website from a folder in a subdomain to another server?

    - by dozza
    I have a client with a site at: subdomain.theirdomain.com/folder It's a 50GB gallery site that I've now cloned and have running locally in MAMP. Once I've made some changes to it I need to host it on alternative physical server/hosting (I intend to use a dedicated server I have access to, currently with a technical domain name). However, the client would ideally like to keep the existing URL, as it has been used extensively in marketing. I've done HTTP redirects, forwards and 301 redirects in the past, but I'm not sure how, or even if, I can do what the client wants. How can I achieve this, possibly using .htaccess and DNS entries? Caveats: I can't have the site at a 3rd-party domain, and the client isn't able/allowed to register any additional domains.
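
    As a sketch only (the new hostname below is a placeholder, and it assumes the old server stays up and runs Apache with mod_rewrite): an .htaccess rule on subdomain.theirdomain.com could send every request under the folder to the new server, e.g.

        RewriteEngine On
        # 301 everything under /folder/ to the same path on the new host
        RewriteRule ^folder/(.*)$ http://new-technical-domain.example/$1 [R=301,L]

    A redirect changes the URL shown in the browser, though; to keep subdomain.theirdomain.com/folder visible, the old server would instead have to reverse-proxy the folder (mod_proxy), which depends on server access the question doesn't confirm.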

    Read the article

  • A drop in SERP after following webmaster guidelines [on hold]

    - by digiwig
    So here's a puzzle for all you SEO gurus out there. I recently launched my own site. My target keywords were ranking very well for about 1 month, within the top five and even appearing in first place. In an attempt to maintain good positioning, I followed the guidelines: I added a robots.txt and an XML sitemap, redirected non-www to www, redirected index.php to the root domain, added htaccess 301 redirects for old pages, added rich snippets, and created a Google+ account with my picture verified to appear. I went through each of the Webmaster Tools issues with duplicate titles and meta descriptions, improved the header-tag document outlines, and even created a few more blog posts to keep the content fresh and moving. So now my website appears on page 2 for my target keywords - and all after I followed the guidelines. What is happening? I see competitors with stagnant content superglued to position 1.

    Read the article

  • Are generic keywords in url bad for SEO? [closed]

    - by user1661479
    Possible Duplicate: Squeezing all the SEO out of a URL as possible Need help with URL structure. Let's say I'm a manufacturer of Wire EDM machines. Is it bad for me to put the keywords wire-edm in my URL to help raise my SEO ranking? For example: mywebsite.com/wire-edm/machine/model-xxxx mywebsite.com/wire-edm/customer-service mywebsite.com/wire-edm/contact Or should I leave it as the following, because the gains are fairly insignificant and it doesn't help users understand my site structure: mywebsite.com/machine/model-xxxx mywebsite.com/customer-service mywebsite.com/contact I'd like to hear what everyone's thoughts are on this, and please provide some sources for which method is better.

    Read the article

  • Free web management control panel

    - by Thorn007
    Hey guys, I need some help. I also apologize for not being able to be more specific. I'm looking for a specific web admin panel that uses a login page via port 2222 or 4444. This is not Vanilla Forum or any forum. So the only way I know how to make this a legit question is to ask: what "free" control panels do you use to manage your web sites (meaning files and domains)? Why do you use it? Where is it located?

    Read the article

  • Simple mod_rewrite Question

    - by user5358
    Hello, I want everything that looks like this: /1/2/3/4/5/[...] to redirect to this: /index.php?u=/1/2/3/4/5/[...] unless the requested string is a specific file. So anything that doesn't have a "." in it, I want to redirect to "index.php?u=[...]". I'll then parse the URI segments in PHP to determine what the user is requesting. I've been looking around for how to do this, but I have only a very rough understanding of regular expressions and have been unable to find an example of how to do it. Thanks!
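
    A minimal, untested sketch of one way to express this rule (it assumes mod_rewrite is enabled and takes the "no dot means not a specific file" condition literally from the question):

        RewriteEngine On
        # Only rewrite requests whose path contains no "." (i.e. not a specific file)
        RewriteCond %{REQUEST_URI} !\.
        RewriteRule ^(.*)$ index.php?u=/$1 [L,QSA]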

    Read the article

  • Redirect subdomain to local pc

    - by user1188570
    I have a home web server which is constantly running. Is it possible to create a subdomain which would redirect traffic to another local PC? For example, I have one server and one notebook (with a web server installed, for development). At the moment I can access the notebook only from the local network by IP. The server is also hosting the domain example.com. I would like to visit laptop.example.com and have it reach my laptop.
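
    One common pattern, sketched here with placeholder values (it assumes the main server runs Apache with mod_proxy, which the question doesn't state): point the laptop.example.com DNS record at the server and let the server reverse-proxy that hostname to the notebook's LAN address.

        <VirtualHost *:80>
            ServerName laptop.example.com
            ProxyPreserveHost On
            # 192.168.1.50 is a placeholder for the notebook's LAN IP
            ProxyPass        / http://192.168.1.50/
            ProxyPassReverse / http://192.168.1.50/
        </VirtualHost>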

    Read the article

  • Google analytics campaign advice

    - by Drewsdesign
    I am buying traffic from a broker (not a single source) and sending it to various landing pages. I would like to know the best way to structure a campaign so I can find which referring site/URL is performing the best (time on site, bounce rate, etc.). Should utm_campaign be the broker name and utm_source be the landing page name, or should it be the other way around? Also, what would be the best way to create a custom report that shows all the referrer metrics for each landing page? Thanks guys, I really appreciate any help on this.
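
    Whichever way round the labels go, the mechanics are just query parameters appended to each landing-page URL; a hypothetical tagged URL (all values are placeholders) might look like:

        http://www.example.com/landing-page-a/?utm_source=broker-name&utm_medium=cpc&utm_campaign=landing-page-test

    In Google Analytics, utm_source populates the Source dimension and utm_campaign the Campaign dimension, so whichever one you want to slice a custom report by should carry the value that varies.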

    Read the article

  • Making HTML5 videos stored on AWS S3 **difficult** to download (because I can't make it impossible)

    - by Jimmery
    I am building a website that hosts videos stored on AWS's S3 service. The videos are played through an HTML5 player we have built. I've just been asked to make sure "nobody can steal our videos". Now, I know that if you really don't want something stolen, you shouldn't put it up on the internet. However, I just need to secure these videos as well as possible; at the very least they need to resist someone going through the source code and trying to download them manually. One option available to me is to completely rebuild the video player in Flash. This is not ideal, for several reasons, notably because I would then also have to build an app for mobile devices to be able to view the site. So I am looking for other options. I have heard about using a token to make the file available only during certain times. I have heard of using a separate file that sits between the HTML5 page and the video file to serve the videos. I am also having a look at IAM, AWS's access control service, in the hope that AWS can solve this problem for me. Can anyone here recommend any of these options, or perhaps suggest other options available to me? Any help would be greatly appreciated.
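
    On the token idea specifically, S3 supports time-limited pre-signed URLs; here is a rough sketch using the AWS SDK for PHP's documented pre-signing pattern (bucket, key, region and expiry are all placeholders, not details from the question):

        <?php
        require 'vendor/autoload.php'; // aws/aws-sdk-php installed via Composer

        use Aws\S3\S3Client;

        $s3 = new S3Client([
            'version' => 'latest',
            'region'  => 'us-east-1',           // placeholder region
        ]);

        // Build a GetObject command for the video and pre-sign it for 20 minutes.
        $cmd = $s3->getCommand('GetObject', [
            'Bucket' => 'my-video-bucket',      // placeholder bucket
            'Key'    => 'videos/episode-1.mp4', // placeholder key
        ]);
        $request   = $s3->createPresignedRequest($cmd, '+20 minutes');
        $signedUrl = (string) $request->getUri();

        // Hand this short-lived URL to the HTML5 player instead of the raw S3 URL.

    This only raises the bar - anyone who can watch the video within the window can still save it - but it stops people lifting a permanent link straight out of the page source.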

    Read the article

  • How to edit thousands of html pages at once? [on hold]

    - by Johnsy Omniscient
    I need to edit thousands of pages for a website whose content was added manually by the owner over 3 years. It has thousands of pages, and I'm sure there is a better way to edit them than spending hours opening each one. I know it would be easy to just edit styles.css, but page elements like the positions of the Google ad boxes are edited individually inside the HTML of each page, so there is no way to solve this through CSS. Is there some sort of code, script or macro that can edit all the pages at once?
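
    As an illustration only (the directory and the old/new markup below are placeholders, since the actual change isn't specified): a short PHP script can walk the site root and apply the same replacement to every .html file, which is usually the kind of "script" reached for here. Back the site up first - a blind string replacement across thousands of files is unforgiving.

        <?php
        // Placeholder values - adjust the directory and the old/new markup to suit.
        $root = '/path/to/site';
        $old  = '<div class="ad-box-old">';
        $new  = '<div class="ad-box-new">';

        $files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
        foreach ($files as $file) {
            if ($file->isFile() && strtolower($file->getExtension()) === 'html') {
                $html = file_get_contents($file->getPathname());
                file_put_contents($file->getPathname(), str_replace($old, $new, $html));
            }
        }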

    Read the article

  • How to set up a local WebDAV server for use with GoodReader on the iPad? [migrated]

    - by confused-about-webdav
    I need to know how to set up a local WebDAV server on my PC so that GoodReader on the iPad can automatically sync with it over the local Wi-Fi network. I am really a rookie when it comes to setting up a web server and have tried various guides on the internet. I tried setting up a WebDAV server using IIS, forwarded the required ports and enabled WebDAV publishing, but GoodReader can neither find it on the local Wi-Fi network automatically nor connect to it even after I manually enter the credentials. So I'll be really grateful if someone who has successfully set up a WebDAV server for use with GoodReader can point me to how to do it.

    Read the article

  • Is mixing 'Adsense' banners and content okay on a Pinterest style layout?

    - by Theodores
    I was under the impression that Google likes to have their adverts clearly separated from content, so that people don't accidentally click on the adverts thinking they are articles. For a Pinterest-style layout, where you only see the one page and a few pop-ups over that one page, you could mix the adverts in with the content, as demonstrated by the two adverts slap in the middle of this site. Clearly this can be done and it exists in the wild, with Google adverts being supplied to the site. However, is that against the spirit and/or the letter of what one signs up to with AdSense?

    Read the article

  • Two different domains for specific languages pointing to one site

    - by user25599
    I'm developing a client's blog and he needs it to be bilingual (English and Spanish). What he wants is for users to get to the content based on the domain, e.g. Jhon enters www.domain.com and gets the English version, and Juan enters www.elsenordominio.com to get the Spanish version. All content will be selected by PHP so that users and search engines only read the language related to the domain. What do I need to use, a header redirect or a 301? Is it bad for SEO? Will Google penalize me? I hope you guys can help me, and forgive me if my English is not good.
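
    The domain-based selection itself is simple; a minimal sketch of the idea in PHP (the host check uses the domain from the question, and load_posts_for_language is a hypothetical helper, not code from the project):

        <?php
        // Pick the language from the requested host name.
        $host = strtolower($_SERVER['HTTP_HOST']);
        $lang = (strpos($host, 'elsenordominio.com') !== false) ? 'es' : 'en';

        // Load only the posts written in that language (hypothetical helper).
        $posts = load_posts_for_language($lang);

    With this approach neither domain redirects to the other, so no 301 is involved; each domain simply serves its own language.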

    Read the article

  • Google Analytics Funnel problems

    - by Alex
    I have a problem with the funnels in Google Analytics. I have an e-commerce website and I want to track the user's path to a purchase. I want GA to track whether a user goes through these steps: [Item page] - [Purchase] - [Checkout]. I thought this could be done with funnels, and my setup currently consists of: Step 1: [Item page] (Required), Step 2: [Purchase], Goal: [Checkout]. But when I go to the Funnel Visualization report it shows the following: [Item page] Visits: 150, [Purchase page] Visits: 170, [Checkout] Visits: 32. How can the [Purchase page] be higher than the [Item page]? I searched the whole internet and found something called Horizontal Funnels, but this doesn't show the correct numbers either; again, the purchase and checkout steps are higher than the item page. So somehow step 1 isn't required to fulfil the funnel/goal. What am I doing wrong?

    Read the article

  • Google Scholar Related Question

    - by Art
    I have just asked Google Scholar to collect papers from my personal web site: http://cs.uic.edu/~asmirnov/publications.html. I was wondering if I did everything right: I submitted a request on the form provided on the Scholar web site, and I published the papers in PDF on my web site. Is there anything else needed for Google to index my site? Other questions: 1. The first paper's link points not just to the paper but to the whole issue. 2. Are there any tags to be added to my web site? If so, which ones, and how do I add them? 3. What are the export options available on the Google Scholar web site and how do they work? Thank you very much for being patient with me and my questions.
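
    On question 2, Google Scholar's inclusion guidelines describe bibliographic meta tags (the Highwire Press citation_* style) that can go in the head of each paper's page; roughly along these lines, with placeholder values rather than details from the site in question:

        <meta name="citation_title"            content="Title of the Paper">
        <meta name="citation_author"           content="Lastname, Firstname">
        <meta name="citation_publication_date" content="2012">
        <meta name="citation_pdf_url"          content="http://example.edu/papers/paper1.pdf">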

    Read the article

  • A relatively new blog seems to be getting very poor Google indexing

    - by Genadinik
    I have a new blog that is 2 months old. In the first few weeks it was getting indexed nicely, and my Google Webmaster Tools reports showed that it was being crawled and had begun ranking for some terms. Then, as I kept writing, the Webmaster Tools report thinned out and showed fewer and fewer terms that the blog ranks for. Now there are only 4 terms, one of them being my name. Is there something I need to do to keep the old posts indexed and crawled? Thanks, Alex

    Read the article
