Search Results

Search found 9717 results on 389 pages for 'gkt pro'.


  • wordpress woocommerce php variable usage %1$s

    - by tech
I am using WordPress with WooCommerce, and I am trying to manipulate a copy of myaccount.php. The default code uses variables of some sort that I am not familiar with, nor have I been able to find documentation on them. The variables in question are %1$s, %2$s and %s:

        <p class="myaccount_user">
            <?php printf( __( 'Hello <strong>%1$s</strong> (not %1$s? <a href="%2$s">Sign out</a>).', 'woocommerce' ) . ' ', $current_user->display_name, wp_logout_url( get_permalink( wc_get_page_id( 'myaccount' ) ) ) ); ?>
            <?php printf( __( 'From this page you can view your recent orders, manage your shipping and billing addresses and <a href="%s">edit your password and account details</a>.', 'woocommerce' ), wc_customer_edit_account_url() ); ?>
        </p>

    How can I identify these variables, find out what they represent, and learn how to use them? Thank you.
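
    For reference, these tokens are PHP printf/sprintf format specifiers rather than variables. A minimal standalone sketch of how they behave (the name and URL here are invented for illustration):

        <?php
        // %s is filled by the arguments in order; %1$s and %2$s are positional,
        // explicitly picking the first and second argument, which also lets the
        // same argument be reused (as %1$s is, twice, in the WooCommerce string)
        printf(
            'Hello %1$s (not %1$s? <a href="%2$s">Sign out</a>).',
            'Alice',                       // fills every %1$s
            'https://example.com/logout'   // fills %2$s
        );
        // prints: Hello Alice (not Alice? <a href="https://example.com/logout">Sign out</a>).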


  • What sort of phone numbers are allowed as the WHOIS contact?

    - by billpg
I'm getting a non-trivial number of scam phone calls to the contact phone number listed in WHOIS. Could I change it to a premium-rate line? If the scammers want to talk to me so much, make them pay for the privilege! Seriously though, are there any restrictions on the type of phone number I can give as my WHOIS contact, beyond it being a phone number at which the domain holder can be contacted? In the UK, cell phones are more expensive to call than landlines, so I suspect a significant number of people are already listing a "premium-rate" phone number.


  • Google search does not show sub-pages from my website

    - by Chang
My website appears in Google search, but only the first page. Of course I have sub-pages linked from the first page, but the sub-pages do not show up in Google search, nor in Yahoo or Bing. What should I do? The sub-pages have not shown up for three years. (I tried searching site:mydomain.com and clicked the 'repeat the search with the omitted results included' link.) What would you suspect the reason to be? My website addresses were like xxx.php?yy=zzz, so I changed them to /yy/zzz using mod_rewrite, as sketched below. I also thought it might be (X)HTML standard violations, so I have now fixed those. I hope Google will soon index my entire website, but I am a little bit pessimistic. Do you have any thoughts?
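
    (For context, a sketch of the kind of mod_rewrite rule such a change usually involves; xxx.php and yy are the question's own placeholder names, not real paths:)

        RewriteEngine On
        # map the clean /yy/zzz form back onto the original script
        RewriteRule ^yy/([^/]+)/?$ xxx.php?yy=$1 [L,QSA]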


  • Should Site Title be Before or After Page Title?

    - by NickAldwin
Apologies if this is a dupe; I tried searching but didn't find anything specifically addressing this concern. When creating a large(ish) site, page titles usually reference both the site name and the current page name. However, there seem to be two main conventions: "Bob's Awesome Site - Contact Page" and "Contact Page - Bob's Awesome Site". I've looked around, and pages usually use one of the two variants above. Is there any reason to use one over the other, for SEO, readability, usability, etc.? I've thought about it and have only come up with:

        Page first - differentiates the tab when the browser is crowded with lots of tabs
        Site first - immediately shows the "parent" site, so to speak; a more cohesive experience


  • .htaccess language redirects with seo-friendly urls

    - by jlmmns
How do I set up my .htaccess file to detect several languages and redirect them to specific SEO-friendly URLs? Basically every URL needs to go to index.php?lang=(...). So, for English language detection, http://mysite.com has to go to http://mysite.com/en/ (index.php?lang=en). My .htaccess as of now (not working):

        RewriteEngine On
        RewriteCond %{HTTP:HOST} http://mysite.com/
        RewriteCond %{HTTP:Accept-Language} ^en [NC]
        RewriteRule ^$ http://mysite.com/en/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^de [NC]
        RewriteRule ^$ http://mysite.com/de/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^nl [NC]
        RewriteRule ^$ http://mysite.com/nl/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^fr [NC]
        RewriteRule ^$ http://mysite.com/fr/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^es [NC]
        RewriteRule ^$ http://mysite.com/es/ [L,R=301]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-l
        RewriteRule ^(en|de|nl|fr|es)$ index.php?lang=$1 [L,QSA]
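
    A hedged sketch of how rules like these are often corrected (untested here): the host condition is dropped, since the rules already apply only to this site, and the final rule accepts a trailing slash so the /en/ URLs the redirects produce actually match:

        RewriteEngine On

        # redirect only the bare root, based on the browser's preferred language
        RewriteCond %{HTTP:Accept-Language} ^en [NC]
        RewriteRule ^$ /en/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^de [NC]
        RewriteRule ^$ /de/ [L,R=301]
        # ...repeat for nl, fr and es...

        # map the language prefix back onto index.php
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(en|de|nl|fr|es)/?$ index.php?lang=$1 [L,QSA]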


  • Grapeshot crawler ignoring robots.txt

    - by QF_Developer
Has anyone come across a crawler called Grapeshot? It is hammering the same page repeatedly on our website. I believe it is looking for ad-related keywords, based on previous content ad campaigns. The odd thing is we never ran any such campaign on the page it is so interested in. We do have a few pages running AdSense; is this what has attracted Grapeshot? I've added the following declaration to my robots.txt, but it doesn't seem to be honouring it:

        User-agent: grapeshot
        Disallow: /

    Any ideas on how to block this nuisance crawler? I'm starting to think the best way is setting up IP rules in IIS.
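
    If it comes to that, a sketch of the IIS approach (the address below is a placeholder from the documentation range, and the IP and Domain Restrictions feature must be installed for this to work):

        <configuration>
          <system.webServer>
            <security>
              <ipSecurity allowUnlisted="true">
                <!-- deny the crawler's address range; 203.0.113.0/24 is hypothetical -->
                <add ipAddress="203.0.113.0" subnetMask="255.255.255.0" allowed="false" />
              </ipSecurity>
            </security>
          </system.webServer>
        </configuration>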


  • Website cookie scanner

    - by user359650
I'm in charge of a relatively big corporate website (circa 95K pages) and need to perform a cookie audit. I can see cookies issued on a per-page basis with the Chrome or Firefox console, but given the number of pages I need a tool to automate the process. I googled for a website cookie scanner, but my search was unfruitful: I found either online tools which only scan the home page, or paid services (ex1, ex2). Do any of you know of a tool that can scan an entire website and generate a report showing which cookies are being used and which pages set them?
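
    In case it helps frame answers, a minimal sketch of the mechanism such a tool needs (plain PHP with cURL; urls.txt is a hypothetical list of pages to check, and this only sees cookies set via HTTP headers, not via JavaScript):

        <?php
        // fetch headers for each URL and record any Set-Cookie names it issues
        $urls   = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        $report = [];
        foreach ($urls as $url) {
            $ch = curl_init($url);
            curl_setopt_array($ch, [
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_HEADER         => true,  // include response headers
                CURLOPT_NOBODY         => true,  // HEAD request: headers only
            ]);
            $headers = curl_exec($ch);
            curl_close($ch);
            if ($headers && preg_match_all('/^Set-Cookie:\s*([^=]+)=/mi', $headers, $m)) {
                $report[$url] = array_unique($m[1]);
            }
        }
        foreach ($report as $url => $cookies) {
            echo $url, ': ', implode(', ', $cookies), "\n";
        }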


  • Page displaying sections using opacity in CSS3 but without navigating or scrolling down [closed]

    - by Senthil Kumaran
Here is my app: http://www.shalgreetings.com/. I am trying to override the scroll bar jumping down to an image section in CSS, so that the whole app is visible, with logo, header and other controls, at all times while people navigate through the different #sections. I am not sure where in the CSS I am making the mistake, as clicking on #sections scrolls the page. Here is this app's original inspiration code, which gets this right. Can anyone point out where the problem seems to be in the above app?
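
    For comparison, a generic sketch of the pattern being attempted (hypothetical selectors; a sketch of the technique, not a diagnosis of the app above): stack the panels in the same place and fade in the targeted one, so following a #section link changes what is visible instead of where the page scrolls.

        #sections { position: relative; }
        #sections section {
          position: absolute;        /* all panels occupy the same spot */
          top: 0; left: 0;
          opacity: 0;
          transition: opacity 0.5s;
        }
        #sections section:target { opacity: 1; }  /* fade in the panel named in the URL hash */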


  • Webserver on a rotating server with NAT IP or changing IPs

    - by hpsoftware
I will have to elaborate on my question, so please have patience. Explaining the logic: if you are familiar with LogMeIn, it installs a client on your computer and then keeps track of where that computer is, as long as it's connected to the internet. So you can always access your computer no matter where it is or what its IP is; you just go to logmein.com and access it. Now what I am asking: assume I have a website hosted on my laptop; call it a webserver. As I move around I get a new IP, sometimes even on a hotel network. Is it possible to do something like what LogMeIn does, so I can keep moving my webserver to new IPs, with some local client that keeps updating my IP? I am sure I would need a gateway server somewhere, connected to my domain name via DNS, so that somebody accessing www.mywebsite.com reaches my main server and then gets routed to my laptop, which could be anywhere, as long as the gateway server can communicate with my webserver. I will keep updating the description based on comments to make more sense. Please have patience with me. Regards
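
    One common mechanism for this, sketched under the assumption of a publicly reachable gateway at gateway.example.com (a hypothetical host): the laptop dials out and holds open a reverse SSH tunnel, so the gateway can route visitors back to it wherever it currently is.

        # run on the laptop: expose the local web server (port 80) as port 8080
        # on the gateway; re-run (or supervise) whenever the connection drops
        ssh -N -R 8080:localhost:80 user@gateway.example.com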


  • 250 k 404 & 410 errors in Webmaster Tools. Bad backlinks?

    - by Natália
Our Webmaster Tools account is showing 250,000 errors related to weird links from other sites. These URLs come mostly from non-existent sites, or are generated directly by our website. Here are some examples:

        oursite.com/&q=videos+caseros+sexo+pornos+gratis&sa=X&ei=R638T8eTO8WphAfF2vG8Bg&ved=0CCAQFjAC%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F5%2Fpage%2F4/page/3

    Our site is a popular Spanish adult site, yet we don't have the keywords mentioned in this URL. Apparently this link comes from our site. Some more examples:

        oursite.com/&q=losmejoresvideosporno&sa=X&ei=U__8T-BnqK7RBdjmhYsH&ved=0CBUQFjAA%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3/page/4

    Once again: not our queries, not our URLs.

        oursite/tag/tetonas

    We think another site may have a policy of extremely bad SEO based on other sites' branding and keyword usage:

        thirdsite/buscador/tetonas-oursite

    The question is: if other sites are generating these URLs, how can we prevent it? Why is the tag being generated if no link was added to the other site? What should we do with these errors: 301, or 410 Gone? I have read all the similar Q&As here, but none of them seems to solve our problem. It is unlikely to be a bad ad (I inspected them all). Maybe some old content which Google suddenly decided to recrawl? Maybe a third party's bad SEO policy? Maybe all of these? Any help will be highly appreciated.
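
    For the 410 option, a hedged sketch of an Apache rule (assuming the junk URLs are recognisable by the leading &q= seen in the examples; tighten the pattern to taste):

        RewriteEngine On
        # answer 410 Gone for the machine-generated junk paths
        RewriteRule ^&q= - [G]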


  • Is there a way to add Google Docs-like comments to any web page?

    - by Sean
You know the comments on Google Docs word-processing documents, and how they create a little discussion over in the right-hand margin? I love it; it's great for collaboration. I want to free it from Google Docs so I can use it with clients to discuss mock-ups or scaffolded websites. Searching Google for "add comments [or discussions] to any website" only gets you results for adding blog-like comments (Disqus, JS-Kit, etc.). Does anyone know of a solution for what I'm after here?


  • Garbled text in server logs

    - by Glenn Dayton
I recently looked over my server's logs and found a bunch of garbled text. Here is a link to the full log, and here is a snapshot of what it looks like:

        ¹^œÌÓûFF™ÃŒ-ôÚÏàÃÒNRs§cÝi ~F#J"|³Ôq0ã~QQbA ¼¹¦’š¶É3œßå<ú€Ç©XAwdL?R°ÝbÒt©ôÇ·Æ…÷q˜ÇѺ| Þ,߯¡Êr yR¤Q¹Jêlš‘AzP\ ¦ÂY„ÉÉ,æ™ U™»ì³ÔÝáCÿ42‹Ö.nŽÉ2%ÓN8i4Œ®¿‘•"-se•äŽ¿ÊÁ§€þ 8åv%'#Äpžs/ÙÍ:¡1ÑÖÃå ºu|Q®!ÏyÆ,­NR@¶ËȯRDkã=ÿÀܸ ›¼Ô ’ð>ÓÌBftdÃ8–é}‰[øbãÝÁ嘲b¾W n´tT­œpäNëëÔ ·RUÓP+ÅuKÁ£¬\âÌ®:J<ÍÁ0:Q%ª(Œ˜E-ÁI:ï™4®hæœT†«);°Çda@´#èì}‡£ü•{57ý]¼|øÓñð÷ÈÌð‡MkŠâ•C~$Óô#ÙV¾Núå.#Á]vôžóæ» V&8)%øVSž“±ÔQLåÓý1–ŽÃßQ$¹ýž")ÈûQcÄý_ÔüGP=s‹vq#Pmoo.tigertutorialscomµÐOKÃ0ð»Ÿâ‘ØH“

    What is this, and is someone trying to do something to my website?


  • How can I pass referrer header from my https domain to http domains?

    - by nutcracker
My website is 100% HTTPS, and I have links to other HTTP domains. The referrer header is not set when linking from an HTTPS page to an HTTP page. From http://en.wikipedia.org/wiki/HTTP_referrer: "If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent." I would prefer that other domains can see the referrer, so that they know the traffic comes from my domain. Is there a way to force this header, or is there another solution? Update: I've done some basic testing using a redirect:

        http page  -- link to http  --> 301 redirect --> http page = referrer intact
        https page -- link to https --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 302 redirect --> http page = referrer blank

    The referrer is lost when linking from an HTTPS page to an HTTP redirect page on my own domain, so there is no referrer after the redirect.
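
    One mechanism modern browsers support, sketched here for Apache with mod_headers enabled (an assumption about the stack): a Referrer-Policy response header asking them to keep sending the referrer even across HTTPS-to-HTTP links.

        # send the full URL as referrer regardless of the protocol downgrade
        Header always set Referrer-Policy "unsafe-url"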


  • Why does 301 redirect work for http but not for https?

    - by Tom G
Through my domain registrar I have set up a domain, essayme.co.uk, to automatically forward to https://google.com. If I go to http://essayme.co.uk it works as expected and redirects me to https://google.com:

        $ curl -i http://essayme.co.uk
        HTTP/1.1 301 Moved Permanently
        Cache-Control: max-age=900
        Content-Type: text/html
        Location: https://google.com
        Server: Microsoft-IIS/7.5
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET
        Date: Sat, 07 Jun 2014 11:14:16 GMT
        Content-Length: 0
        Age: 0
        Connection: keep-alive

    However, if I go to https://essayme.co.uk it just freezes and times out:

        $ curl -i https://essayme.co.uk
        curl: (7) Failed connect to essayme.co.uk:443; Operation timed out

    What is happening in the second case? (And, if possible, how can I get the redirect to work for HTTPS?) Problem background/clarification: I don't have an SSL certificate for the essayme.co.uk domain above, but I do for my live domain (let's call it mywebsite.com), and I was seeing the exact same problem on that domain (hence why I'm trying to debug it). Unfortunately I can't experiment with the live domain (as it's live), and I would like to avoid having to buy a second certificate for essayme.co.uk just for debugging (unless absolutely necessary). The problem I was seeing: my live domain, mywebsite.com (not its real name), has a valid SSL certificate, and visiting https://www.mywebsite.com displayed the webpage as expected. I had set up forwarding (as in the question above) from the naked domain (mywebsite.com) to https://www.mywebsite.com. Visiting http://mywebsite.com redirected to https://www.mywebsite.com as expected; however, visiting https://mywebsite.com would freeze and time out (as in the question above). I also tried forwarding it to http://www.otherwebsite.com as an experiment (i.e. forwarding to another site that does not use SSL), but the result was the same: visiting http://mywebsite.com redirected to http://www.otherwebsite.com as expected, while visiting https://mywebsite.com would freeze and time out again. So I set up essayme.co.uk as an experiment to try to understand why it doesn't work.
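
    (A common way to see what, if anything, answers on port 443 is to attempt the TLS handshake directly. If nothing is listening, or a firewall silently drops the packets, this stalls before any HTTP redirect can ever be sent, which would match the timeout above.)

        openssl s_client -connect essayme.co.uk:443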


  • What is the best website width? [on hold]

    - by Salvis Dišlers
What is the best website width? I can't decide between a 1200px and a 1236px max-width, or between a 300px and a 320px sidebar. I personally like a wide website; my current one is 1236px, but I hear that is considered very wide for eyesight, and there are all those gadgets to keep in mind... The website is for reading; mostly articles of around 3000 words per page on average, so readers will be scrolling down no matter how wide it looks. Also, what should I decide about the banner size for the sidebar? On a 300px sidebar I can put a 300px banner, but on a 320px sidebar I could put a 336px banner if necessary...


  • Best URL for cars related website? [duplicate]

    - by Claudio ??is Mulas
This question already has an answer here: What is the best structure of an SEO-friendly URL? (3 answers) If this was your website, what would be the URLs for each car on sale? http://www.autoscout24.eu/Details.aspx?id=247572735&asrc=ha I'm working on a car dealership website. What should the URL be? Consider also that the company can have several models of the same car. I'm not asking for a URL scheme; there are a lot of similar questions. My question is: on a car dealership website, what is the best URL for a car? Which variables would you put in the URL: brand, model, year, location, color, miles/km, etc.? For this website, that URL, this particular case: what would you choose (even something not in the following list)?

        audi_q5_2009.html
        audi_q5_2009_used.html
        audi_q5_2009_used_in_alcobendas.html
        audi_q5_2009_used/247572735.html


  • Will using two different tracking codes affect my SERP

    - by Danny Hefer
Hello everyone and thanks for your time! I am facing a problem after a site migration. The new site is basically an improved version of the old site, with the same content and some extras. After pointing the domain name to the new site, the old site was still online for a while but didn't get any traffic. The new site has its own tracking code. So the old tracking code has age (something like 7 years) but no visitors for a month, while the new tracking code is a month old with acceptable traffic. How do you think Google will react if I add the old tracking code to the new site? Thanks in advance!


  • How to easily delete all email forwarders in cPanel?

    - by psoft
I know that I can import a list of email forwarders using cPanel, but how can I delete a list? I want to manage 300+ addresses as a membership list for my organization, and I want to be able to delete that many without clicking 'Delete' and then 'Confirm' (or whatever it is) 300 times. Even if I am only able to delete ALL forwarders and then upload a modified list, that's fine with me. Note: I'm using a shared hosting package through SiteGround. The tech support rep informed me that I can't use cPanel scripts in a shared environment. Any suggestions?


  • Paid Website Code Review

    - by clifgray
I have written a pretty extensive web app that is going live in the next few weeks, and before I really publicize it I want to get some professionals to review it for optimization and best practices. Is there any online service, or a way to find local software engineers, who would be willing to do this? To give some specifics that may be helpful: my site is on Google App Engine and written in Python, and it is tough to find someone with extensive experience in that area.


  • Is this DFP error message the reason my ad won't show?

    - by Eric
I'm setting up DFP to display ads, and I have an ad tag (JavaScript) from adtechus.com. The tag looks like this:

        <script src="http://adserver.adtechus.com/?adrawdata/1.0/1111/11111/0/0/ADTECH;loc=100;noperf=1;"></script>

    When I paste that tag into DFP, I get an error message saying it doesn't recognize the tag format (screenshot omitted), and, more importantly, my ad isn't showing on the page. DFP seems to be taking my adtechus ad tag regardless and working with it, despite the error message. But could that be the reason my ad isn't showing? And how can I fix it?


  • Are you aware of any client-side malware that sends lots of junk requests for .gifs?

    - by Matt Sherman
I am getting dozens of 404 errors on my site from requests for GIFs with apparently random names, like 4273uaqa.gif and 5pwowlag.gif. I see that most of them are coming from one user. I assume something is happening in the background on her machine without her knowledge: a malware thing on the client. Have you seen this behavior before, and do you know what sort of malware might cause it? I'd love to advise my customer that s/he has an issue, and I'd also like to stop getting these 404 reports. (Reposted from main Stack Overflow.)


  • mod_rewrite works within directory not on root

    - by Anvesh Saxena
I am having a problem with my RewriteRules for the tags portion. What I have been able to debug is that the rules are being triggered, because the page tags.php is rendered, but without the URL parameters. The .htaccess file with these rules is in the root of my sub-domain and has the following content for the tags portion:

        # Rewrite rules for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ tags.php?tag_name=$1&tag_id=$2
        RewriteRule ^tags/(\w+)/?$ tags.php?tag_name=$1
        RewriteRule ^tags/?$ tags.php?tag_name=

    What I haven't been able to debug is why a similar .htaccess file in a directory within my sub-domain works as expected, with the URL parameters available. That file reads as follows:

        # Rewrite rules for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ restAPI.php?type=tags&tag_name=$1&tag_id=$2
        RewriteRule ^tags/(\w+)/?$ restAPI.php?type=tags&tag_name=$1
        RewriteRule ^tags/?$ restAPI.php?type=tags&tag_name=

    Could anyone point out the problem in my rewrite rules? I am also sometimes seeing an Internal Server Error, which I am second-guessing is due to the same problem. Note: I have Apache 2.2.23 on shared hosting.
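
    A hedged guess worth checking (a commonly suggested fix, not verified against this host): with MultiViews enabled, Apache's content negotiation can map /tags/... straight to tags.php before mod_rewrite runs, so the query parameters never get attached.

        # put at the top of the root .htaccess
        Options -MultiViews
        RewriteEngine On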


  • Google indexed page a day before also reflecting in search but today everything vanish

    - by ganesh
We had a robots.txt that disallowed all robots while we were in development. We are live now, and we changed robots.txt accordingly a day before. We submitted indexes using the Google Webmaster Tools index status, and after that we could see proper results in search; Google image search was also working as expected. Suddenly, today, all of these vanished from Google search, and I can again see the old result, i.e. the under-construction message. I checked robots.txt in Google Webmaster Tools; it's OK, with no crawling errors. Kindly let me know what exactly happened. How can I report this issue to Google?
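
    (For reference, the two states of the file in question: a development robots.txt that blocks everything versus a live one that allows everything.)

        # development: block all crawlers
        User-agent: *
        Disallow: /

        # live: allow everything
        User-agent: *
        Disallow: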


  • Pagination for product listing, what to use? "canonical" or "rel-prev-next" or do nothing?

    - by Jayapal Chandran
I want to make sure about my product listing, which shows 10 products per page that are not in a series. The guidelines explain how to use canonical or rel="prev" for pagination when a long page has been divided into multiple pages that form a series, whereas my situation is not that: these are unique listings that are not related to one another, and all the listing links lead to a product profile page. So let's say my site is all about cars and I have a Used Audi page with 1000 Audis for sale. There are 10 used Audi cars on each page, so there are 100 pages in the series. If I start to utilise rel="prev" and rel="next", should I set page 2 onwards to index,follow or noindex,follow? The content from page 2 all the way to 100 changes only ever so slightly, as different cars are for sale on different pages, but from a "Panda" point of view the pages are incredibly similar: they hold the same meta data as page 1 in the series, along with duplicate reviews, news, etc. I want page 1 in the series to be the main page for Google to send users to, and I don't see the point in Google indexing pages 2 to 100. What's everyone's view on this? Lastly, with the rel="canonical" tag, should pages 2 to 100 all point back to page 1 in the series, or to the individual page itself? E.g. /used-audi/page-3/.
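
    For concreteness, a minimal sketch of the series markup under discussion, shown for page 3 of the hypothetical /used-audi/ listing (the self-referencing canonical is one of the two options being weighed):

        <link rel="prev" href="http://example.com/used-audi/page-2/">
        <link rel="next" href="http://example.com/used-audi/page-4/">
        <link rel="canonical" href="http://example.com/used-audi/page-3/">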


  • Wordpress .htaccess preventing subfolder access

    - by John K.
This is sort of a goofy setup, but it's not in my power to reconfigure it at this time. I'm running in a shared hosting environment. The domain is example.com. This is an add-on domain on the host side, with example.com being redirected to the www/example.com sub-directory. That directory houses a standard WordPress site which acts as the main site when you visit example.com. The .htaccess file within that directory is:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^wp-admin/profile\.php$ /ssm/welcome [R]
        </IfModule>

    I have a subdirectory at the root level, alongside the /example.com subdirectory, that houses a CakePHP application: /tracker. My problem is that when I attempt to browse to example.com/tracker, I get a 404 from WordPress because permalinks are on. What I think I need is a rewrite rule in the WordPress .htaccess file that short-circuits the existing rewrite rules and permits example.com/tracker to work independently of the WordPress install, or a rewrite rule at the root level that short-circuits the redirect to the /example.com directory in the first place. To summarize, the www/ directory structure is:

        example.com/
        tracker/

    with the add-on domain www.example.com redirecting to the /example.com directory (running WordPress), and a tracker/ directory running CakePHP which I would like to access via www.example.com/tracker. If you need further info or clarification, let me know!
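
    A sketch of the first idea (untested against this particular setup): short-circuit WordPress's catch-all before it runs, so /tracker requests fall through untouched.

        # add above the "# BEGIN WordPress" block
        RewriteEngine On
        RewriteRule ^tracker(/.*)?$ - [L]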

