Search Results


  • What options are there for integrating with payment gateways? [closed]

    - by Rowland Shaw
    It seems that there are only two types of payment gateway service out there at the moment: either the entire cart logic is handled offsite (with something like PayPal's Standard option), or you go through the certification for handling credit card numbers and do pretty much everything yourself. Ideally, for the project I'm working on, I'm after a bit of middle ground, such that I can handle the cart on-site and only pass over to a payment gateway (with an order amount, billing & delivery details, and order ref) for them to handle the card details, before passing back. I'm sure that I've used e-commerce sites following this pattern before, but I cannot find any payment providers out there that offer this sort of option, so are there any? The only other requirement we have at present is that it must accept orders in Sterling.
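
    For context, the pattern described is commonly called a "hosted payment page": the cart stays on-site, and the customer is handed off to the gateway with the order amount, reference and return URL, then sent back after entering card details on the gateway's pages. A rough PHP sketch of the hand-off, with an entirely hypothetical gateway endpoint and field names:

        <?php
        // Hypothetical hosted-payment-page hand-off; the gateway URL and
        // every field name below are invented for illustration only.
        $fields = array(
            'merchant_id' => 'YOUR-MERCHANT-ID',
            'amount'      => '49.99',
            'currency'    => 'GBP',  // orders in Sterling, per the requirement
            'order_ref'   => 'ORDER-1234',
            'return_url'  => 'https://www.example.com/checkout/complete',
        );
        header('Location: https://gateway.example.com/pay?' . http_build_query($fields));
        exit;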

    Read the article

  • Does the EU cookie law apply to an EU site that is hosted outside of the EU?

    - by mickburkejnr
    I have been reading up about this EU cookie law, and have also had in-depth conversations about it with my girlfriend, who is a solicitor/lawyer, and with colleagues while building websites. While we are now working towards implementing a way to abide by the EU law, I have thought of something which no one really knows the answer to, and it has caused a few arguments. It's my understanding that any website in the EU must abide by these cookie laws, which is understandable. However, say I were to have a .co.uk or .eu domain name pointing to a website which is hosted in America, for example: do I still need to abide by the EU laws even though the website is hosted outside of the EU? One person I have asked says that because the domain name is .co.uk or .eu (a European TLD), the website is still accountable under EU law. Another person says that because the actual website is hosted outside of the EU, it doesn't have to bother with this law.

    Read the article

  • CDN virtual subdomain causes duplicated content

    - by user3474818
    I have created a subdomain and a CNAME record which points to the domain root. The subdomain www.static.example.com is actually a copy of the entire website www.example.com, and it is supposed to act as a CDN and serve static content in order to improve speed. However, all of my content can be accessed via the subdomain as well, so Google has indexed it all and now I am dealing with duplicate content. How can I deny crawlers access to the subdomain, bearing in mind that I do not have a separate subfolder for the subdomain, so I can't create a separate robots.txt file?
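
    One workaround sketch, assuming Apache with mod_rewrite at the origin (whether your CDN setup passes requests through to the same docroot this way is an assumption): serve a different robots file when the request arrives via the static host name.

        RewriteEngine On
        # When crawled via the static subdomain, answer robots.txt
        # requests with a file that disallows everything
        RewriteCond %{HTTP_HOST} ^www\.static\.example\.com$ [NC]
        RewriteRule ^robots\.txt$ /robots-static.txt [L]

    where robots-static.txt contains:

        User-agent: *
        Disallow: /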

    Read the article

  • Google Analytics Views - Why Use Them?

    - by pee2pee
    I've been reading about Google Analytics views but am still not sure why I would use them. I'm the only person in the company who understands and uses Google Analytics, and we have no subdomains. Is there any reason why I would want to use views? Our Google Analytics account has been collecting data for some years now, and I just created a copy of the original view, but the copy has zero data, so I can't see how it would benefit me.

    Read the article

  • Setting a folder to be writable by Apache/PHP in Windows?

    - by Chris Sobolewski
    I have a local test server, and I am attempting to write a file with PHP. I am getting a message that the folder (../uploads/) does not exist or I do not have permission. My directory structure is:

        D:\xampp\htdocs\website\          <-- root
        D:\xampp\htdocs\website\library   <-- where the script runs
        D:\xampp\htdocs\website\uploads   <-- where I'd like to save

    I know that on a *nix server I can just chmod the permissions to 0777. What do I need to set on my Windows box to give Apache the ability to write a file?
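
    A minimal sketch of one way to grant this from an elevated Windows command prompt; the "Users" group here is an assumption, since the right principal depends on which identity the Apache service runs under:

        rem Grant modify (M) rights on the uploads folder, inherited by
        rem new subfolders (CI) and files (OI)
        icacls "D:\xampp\htdocs\website\uploads" /grant "Users:(OI)(CI)M"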

    Read the article

  • What is cloudfront.net and what does it do?

    - by JK01
    I frequently have messages like this in my website's error log:

        "Script error." URL: https://e3m4drct5m1ays.cloudfront.net/items/loaders/loader_21.js?pid=21&systemid=13504281c5a501837196c23300f84e66&aoi=1327214632&zoneid=16620&cid=HK&rid=Hong%20Kong%20(general)&ccid=Kowloon&dma=0 Line number: 0 Error name: Stack:

    Now I don't actually know what CloudFront is or what it does, and I do not refer to this script in my site. So why would I be getting a JS error logged as if it were a script running on my own site? The errors are logged with ELMAH.

    Read the article

  • PHP accessible shared content between two websites on the same VPS on different domains/IPs

    - by Lee Fentress
    I have two ecommerce websites, selling music digital downloads, on the same VPS, currently using cPanel/WHM (but I'm thinking of switching to Virtualmin). They have separate domains and IPs, of course. They both sell from the same set of music files, so I have duplicate copies in each website's directory, which takes up a lot of disk space. How might I go about sharing the same set of music files across both sites, while still allowing PHP access, so that it doesn't break my shopping cart's functionality of serving customers the downloads after they have paid for them? I thought of maybe using symlinks or something, but I don't know if that's possible, or whether it would have to somehow circumvent built-in security features of the server. I'm new to VPS management.
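
    A minimal sketch of the symlink idea, assuming shell access and made-up account paths; both sites then serve the same physical files:

        # keep one real copy outside either site's docroot
        ln -s /home/shared/music /home/site1/public_html/music
        ln -s /home/shared/music /home/site2/public_html/music

    One caveat: if PHP's open_basedir restriction is enabled for each account, it must include the shared path, or PHP will refuse to read the files even though the symlink resolves.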

    Read the article

  • How to point one sub-domain to another sub-domain so that they can be used interchangeably

    - by Talon
    I'm trying to do this: secure.domain2.com loads content from secure.domain1.com. So if somebody goes to secure.domain2.com, it will load the content of secure.domain1.com. Note that I don't want a redirect: if someone goes to secure.domain2.com, the address bar will still say secure.domain2.com even though it's loading content from secure.domain1.com. I've read that it's possible with a CNAME or something like that; what is the best way to do it?
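
    Worth noting: a CNAME record only maps the name in DNS; the web server still chooses what to serve from the Host header, so the server behind secure.domain1.com must be configured to answer for secure.domain2.com as well (and, for HTTPS, hold a certificate covering that name). An alternative is a reverse proxy in front of domain2; a minimal Apache sketch under the assumption that mod_proxy/mod_proxy_http are available:

        <VirtualHost *:80>
            ServerName secure.domain2.com
            # Fetch content from domain1 and rewrite redirects back to domain2
            ProxyPass        / http://secure.domain1.com/
            ProxyPassReverse / http://secure.domain1.com/
        </VirtualHost>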

    Read the article

  • How can I get cross-browser consistent behavior for TR heights within a table with a set height? [migrated]

    - by Dan
    I have an arbitrary number of tables with an arbitrary number of rows in each, and all tables are the same height. My initial approach was to just set the overall height of the table and hope the rows were smart enough to distribute themselves appropriately. That's not the case. I get 4 different behaviors from 4 browsers, but I need them all to render in at least a similar way.

    Safari & Chrome (WebKit): All rows are equal height, creating scroll bars as needed and fitting within the table height.

    Firefox: All rows are the height necessary to fit their content, with the remaining rows overflowing out of the table. Additionally, if the content of the rows does not take up all of the height, only the part of the table with content in it takes the background (though it seems, from inspecting with Firebug, that the actual table [and TR] extend to the bottom of the proper table height).

    IE: All rows are the height necessary to fit their content, with the remaining rows overflowing out of the table.

    Obviously this only covers one version of each browser, and additional variation would likely appear with more being tested. Ideally, I want the browser to render TRs with less content smaller than those with more content, while still scrolling within the variable-height TRs when the overall height of the table is not enough. I could potentially see a solution that achieves that with JS, but can it be done with CSS? Or, if not, can the behavior that WebKit displays be made to work across the browsers? Thanks! PS: An example can be found here.
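
    One workaround sketch (an assumption-laden approach, not a verified cross-browser fix): cap each row's content with a scrollable wrapper div, so row heights are controlled by CSS instead of each browser's own row-distribution algorithm. The class names and pixel values below are made up:

        table.fixed-height {
            height: 400px;         /* assumed overall table height */
            width: 100%;
        }
        table.fixed-height td div.cell-scroll {
            max-height: 120px;     /* assumed per-row cap */
            overflow-y: auto;      /* scroll overly long content inside the row */
        }

    Each cell's markup would then wrap its content in <div class="cell-scroll">...</div>.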

    Read the article

  • Separate URLs for a set of pages sharing 80% duplicate content

    - by user131003
    Issue: Currently my site has one particular page which has country-specific data, so I have URLs like:

        mysite.com/sale-united-states
        mysite.com/sale-united-kingdom
        mysite.com/sale-sweden

    All these pages have 80-90% common content and 10-20% country-specific content, and currently they all canonically point to mysite.com/sale-united-states. The problem is that when someone searches for "sale Sweden", Google correctly shows the mysite.com/sale-united-states page, which does not feel right, as it shows the US page instead of the Sweden one. Now I'm thinking of dropping the canonical URL so that the country-specific URLs show up in Google search, but I'm not sure how 80% duplicate content is going to affect SEO. What is the recommended approach for this situation? A friend of mine suggested a "separate subdomain per country" approach, but that seems overkill for one page.
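
    One mechanism worth looking at here (a sketch, not a guaranteed fix for the ranking issue) is rel="alternate" hreflang markup, which tells Google which URL targets which audience, so each country page can surface in its own country's results. The hreflang codes below are assumptions about the intended audiences:

        <link rel="alternate" hreflang="en-US" href="http://mysite.com/sale-united-states" />
        <link rel="alternate" hreflang="en-GB" href="http://mysite.com/sale-united-kingdom" />
        <link rel="alternate" hreflang="sv-SE" href="http://mysite.com/sale-sweden" />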

    Read the article

  • Two different websites in one remote hosting

    - by Kor
    My client asked me to make a website that is hosted on one server (and pointed at by one domain) also accessible, as a specific directory, from another domain that does not point there. For example: http://www.foo.com, hosted at GoDaddy, contains the full website; http://www.bar.com, hosted at Bluehost, needs to serve http://www.foo.com/bar as if it were http://www.bar.com's root. So if anybody enters through http://www.bar.com, it should internally load http://www.foo.com/bar without visibly changing the URL. I am not sure if this is possible using .htaccess or anything like that. Could anybody shed some light? Thanks in advance.
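
    A hedged sketch of one way to attempt this in the .htaccess on the Bluehost side, assuming mod_proxy is available there (many shared hosts disable it, so that is a real assumption):

        RewriteEngine On
        # The [P] flag proxies the request internally, so the visitor's
        # address bar keeps showing bar.com
        RewriteRule ^(.*)$ http://www.foo.com/bar/$1 [P,L]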

    Read the article

  • Anyone know a way to find out the number of Twitter followers for a particular account on any given date?

    - by mejpark
    I manage two corporate Twitter accounts and two personal Twitter accounts, and it would be really useful to know how many followers each account has at the end of each week. I'm using TweetStats.com to gather statistics at the moment, but the follower stats functionality isn't granular enough to determine the precise number of followers. Does anyone know of any useful and free tools that can provide the exact number of followers for a specific Twitter account on any given date? Thank you, Mike.

    Read the article

  • Is there a class or id that will cause an ad to be blocked by most major adblockers?

    - by Moak
    Is there a generic class or ID for an HTML element that a high majority of popular ad blockers will block on a website they have no specific filters for? My intention is to have my advertisement blocked; avoiding automatic blocking is easy enough. I was thinking of maybe borrowing some ids or classes from big advertising companies that are already being fought off quite actively. Right now my HTML is:

        <ul id="partners">
            <li class="advertisment">
                <a href="#" class="sponsor"><img class="banner"></a>
            </li>
        </ul>

    Will this work, or is there a more solid approach?
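
    For what it's worth, a hedged pointer: the generic filter lists shipped with popular blockers hide elements by common ad-like class names, and a bare div with the class "adsbox" is what many adblock-detection scripts create precisely because the default EasyList rules hide it. Whether any particular class keeps working across list updates is an open assumption:

        <div class="adsbox">
            <!-- markup expected to be hidden when an ad blocker is active -->
        </div>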

    Read the article

  • Static HTML to WordPress Migration SEO Implications?

    - by Kayle
    Recently, I migrated a client's site to a new server and a new home within WordPress so they could more easily edit their website and start a blog section. The static site was 10 years old and, according to my client, was consistently showing up at place #3 for its primary keyword; it has dropped to rank #6-8 following the migration. At launch, we made sure the URLs were identical (save the removal of ".htm", which we used 301 redirects to compensate for), and we generated a new XML sitemap and pinged Google with the new site. We keep a 404 log to make sure we're not losing any incoming links. We also have Google Webmaster Tools on this site and have zero errors/suggestions; everything seems OK. I was told by numerous sources that Google would not penalize us for the use of 301s, but it's the only thing I can think of right now that is different about the site, other than the platform. Any ideas about what we could be getting docked for?
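
    For reference, a minimal sketch of the kind of .htm-stripping 301 rule the migration describes, assuming Apache and trailing-slash WordPress permalinks (the permalink structure is an assumption):

        RewriteEngine On
        # Permanently redirect old static pages like /about.htm to /about/
        RewriteRule ^(.+)\.htm$ /$1/ [R=301,L]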

    Read the article

  • High quality/performance shared hosting (in northern Europe)

    - by Bente
    I work as a web developer on almost all levels. However, my typical customer is a 1-5 person shop running some sort of consulting business. They have (or want) a web page with some kind of CMS so they can perform most (or all) editing themselves. I normally opt for Concrete5 as my default CMS because it's the most user-friendly (and free) CMS I have found. My good recurring customers I host on my own server as a service, but I need a good host for the customers where I want to deliver a product and not be responsible for whatever may happen in the future. However, I still struggle with hosting! Experience shows that the typical ~$1 shared hosting is way too slow to run Concrete5 smoothly, and a VPS is out of the question because I don't want to maintain it. So, where can I find a fast (from northern Europe), reliable shared host where I can put a site and not have to worry about the server going down or being unmaintained? I expect this to cost around $10-$20, but I'm open to all kinds of suggestions because different customers have different budgets.

    Read the article

  • After changing web host, I get a 'file does not exist' error

    - by Jordan
    I run a WordPress blog and have recently changed web hosts. When changing hosts, I copied all the files and exported/imported the database etc., as explained by lots of tutorials easily found on Google. The blog home page works fine. What goes wrong: when I click on any link from the home page, the browser gets stuck in a redirect loop. Looking at the error log, I see: File does not exist: /usr/local/apache/htdocs/index.php. The directory /usr doesn't even exist for my website, so perhaps this is looking for a file that was present with my old web host and is no longer present with my new one? What is going on, and how might I resolve it?
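
    One common culprit after a host move (a hedged guess, not a diagnosis): WordPress stores its own site address in the database, and a stale value can produce exactly this kind of loop. A sketch of the check/fix, assuming the default wp_ table prefix and with example.com standing in for the real address:

        UPDATE wp_options
           SET option_value = 'http://www.example.com'
         WHERE option_name IN ('siteurl', 'home');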

    Read the article

  • Redirect error in Google Webmaster Tools report

    - by Aurelio De Rosa
    I built a CMS and used it to create the website http://www.tkdmontecatini.com. After some days, Google Webmaster Tools started giving me several "Redirect error" reports on pages like the following:

        http://www.tkdmontecatini.com/it/photogallery
        http://www.tkdmontecatini.com/it/pagina/9/Informazioni/Corsi/Chi-Siamo
        http://www.tkdmontecatini.com/it/pagina/2/Informazioni/Eventi/Eventi

    The funny things are: if I access those links from a browser, everything is fine and I don't get redirect loops or other similar issues; and if I use the "Fetch as Googlebot" function, I get a "Success" result. Question: any idea why this happens, and how can I fix it?
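
    One way to inspect the full redirect chain roughly as a crawler would see it (a diagnostic sketch; the user-agent string is just an example):

        curl -s -I -L -A "Googlebot/2.1 (+http://www.google.com/bot.html)" \
             http://www.tkdmontecatini.com/it/photogallery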

    Read the article

  • Google Webmaster Tools: changing address from domain name to subdomain

    - by Charliz
    We originally had our blog on our main domain (for example, www.example.com). Now we have moved it to http://blog.example.com. My question is: how do we change the address from www.example.com to blog.example.com? I read http://www.google.com/support/webmasters/bin/answer.py?answer=83106, and it said to make sure your site is on a main domain, not a subdomain, but I'm trying to move the site to a subdomain. Help.
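
    Independently of what Webmaster Tools supports, the usual mechanism for such a move is a per-URL 301 redirect from the old addresses to the new ones. A minimal Apache sketch, assuming the blog previously lived at the root of www.example.com (an assumption; if the main site stays on www, only the old blog paths should be redirected):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://blog.example.com/$1 [R=301,L]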

    Read the article

  • Usefulness of the Backlinks shown in Webmaster Tools

    - by Ewan Heming
    Is the list of links for a site shown in Google Webmaster Tools a complete list or just a sample? I've noticed that the links in there appear to be all the ones I didn't think would have any real value, either because they were nofollow or from irrelevant sites. The few I did think would be of some use have never shown up, and there are also links that are sometimes there and sometimes not (such as my LinkedIn profile). Does this mean that the missing links don't (or no longer) carry any value? It almost appears that the list is there for Google to either inform you about problems (there was a useful list there when someone tried to spam my site) or misinform you about which link-building strategies work (to keep people guessing about what works or not).

    Read the article

  • Should I prevent search engines indexing tag/category pages?

    - by Macha
    On my site, I currently have no special rules for search engines. It is a blog, statically generated by a Python program. When I search for some of my articles on Google, a tag or category page is usually included in the results; sometimes it even ranks ahead of the article itself. Obviously, as these links aren't always going to have the article on them, they aren't the results I want people to click on. So I'm thinking of setting noindex on these pages. Is there any possible downside to doing so? And is this possible via robots.txt, or do I have to add it to all the relevant templates? All I can find for robots.txt are ways to stop search engines crawling those pages, which isn't what I want: while I don't want them indexed, crawling them is still the only surefire way for a search engine to find all my blog posts.
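
    For reference, noindex is expressed per page rather than in robots.txt; a minimal line for the tag/category templates, where "follow" keeps crawlers traversing the links to the articles:

        <meta name="robots" content="noindex, follow">

    The equivalent can also be sent as an HTTP response header (X-Robots-Tag: noindex, follow) if editing the templates is awkward.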

    Read the article

  • Chrome trims the last <li> element in a row [closed]

    - by Paul
    Ok guys, I give up. Here's the code I'm struggling with:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
            <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
            <title>Blah</title>
            <style type="text/css">
                #container {
                    margin: 0 auto;
                    width: 350px;
                    border: 1px solid #ccc;
                }
                ul {
                    list-style-type: none;
                    margin: 0;
                    padding: 0;
                    text-align: center;
                }
                ul li {
                    display: inline;
                    padding: 5px;
                    margin: 0 1px;
                    background-color: lime;
                    line-height: 2em;
                    /* border: 1px solid red; */
                }
            </style>
        </head>
        <body>
            <div id="container">
                <ul>
                    <li>Element A</li>
                    <li>Element B</li>
                    <li>Element C</li>
                    <li>Element D</li>
                    <li>Element E</li>
                    <li>Element F</li>
                </ul>
            </div>
        </body>
        </html>

    Why the heck does Chrome trim the right side of "Element D" (even though there is enough space to display the whole item), while Firefox and even Internet Explorer render this code properly? It becomes more visible when the commented-out border is applied. In other words, is there a way to tell the browser that I want every single <li> element treated as a single unit, and thus moved to the next row if it doesn't fit entirely in the previous one? Can't wait to see the solution, thanks in advance.
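
    A hedged pointer rather than a confirmed fix: an element with display: inline may be fragmented across line breaks, whereas display: inline-block wraps as a single unit, which is exactly the "move the whole item to the next row" behavior asked about. So one candidate change is:

        ul li {
            display: inline-block;  /* each item wraps as one unbreakable unit */
        }

    with the caveat that inline-block brings its own quirks (whitespace gaps between items, and poor support in old IE versions without workarounds).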

    Read the article

  • At what point should data be sent back to server?

    - by whamsicore
    A good example would be the Stack Exchange "vote" button. When a post is upvoted, the arrow changes color immediately. However, there is a grace period in which one can change one's vote decision (oops! voted by mistake?). Is the upvote action sent to the server immediately, or is it only processed after a set time period, or when the user leaves the page? How exactly is this rating processed? What is the standard for handling dynamic page edits (e.g. Stack Exchange ratings, Facebook posts)?
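
    For illustration, one possible client-side pattern (a sketch of the grace-period idea, not how Stack Exchange actually implements it; the endpoint and delay are made up):

        var pendingVote = null;

        function onVoteClick(postId, value) {
            // Reflect the vote in the UI immediately, but delay the request
            // so a quick change of mind cancels it before it is ever sent.
            clearTimeout(pendingVote);
            pendingVote = setTimeout(function () {
                var xhr = new XMLHttpRequest();
                xhr.open('POST', '/vote', true);
                xhr.setRequestHeader('Content-Type',
                                     'application/x-www-form-urlencoded');
                xhr.send('post=' + postId + '&value=' + value);
            }, 5000); // hypothetical 5-second grace period
        }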

    Read the article

  • International TLDs vs. duplicate content

    - by Litso
    Hey all, I currently work at a pretty big website that has visitors from around the globe. My job is to help out with the SEO, and one thing we've been discussing lately is the use of international TLDs. The ones we use fall into three groups: (partly) translated websites like .es and .de that serve most of the content in the country's language; non-translated (English) websites for non-English-speaking countries (due to a lack of translations), like .ro and .cz; and English websites for English-speaking countries with localized TLDs (.co.nz, .co.uk). On one hand, I really have the feeling this is causing a lot of duplicate content, especially for the last two groups of TLDs. On the other hand, country-specific TLDs seem to score a lot better in that country's Google. Would it be advisable to keep using these domains, or should we canonicalize them all to the .com version?

    Read the article

  • Are the famous websites handmade? [closed]

    - by Mithun Chuckraverthy
    I'm a newbie in web design. I've always wanted to build a professional-quality website by myself, so I started learning HTML/XHTML and CSS for presentation, and JavaScript and PHP/MySQL for scripting. I wonder, do the developers of famous websites design them by hand? Or have they found a better approach using software? If so, can you tell me what it is? (By "famous" I mean any website that is liked by millions of people all over the world, like Google, Facebook, etc.) Thanks in advance!

    Read the article

  • Google Maps API: Premier License or excess map loads?

    - by j0nes
    I am currently looking for a way to deal with the Google Maps API usage limits. I am planning a redesign of our page that will probably get around 2 million map loads per month. This will surely break the usage limit of 750,000 map loads per month available in the free version. If we pay for excess map loads (about 1.25 million of them), that means we would have to pay $5,000 per month. The other option would be to use a Premier license; however, there is very little information available on its usage limits and price. I have filled in the request form to get a custom offer from Google, but I have not had any response yet. Can any of the Premier license holders tell me which option would be cheaper for my usage pattern: paying for the Premier license, or paying for excess map loads?

    Read the article
