Search Results

Search found 9727 results on 390 pages for 'llblgen pro'.


  • What options are there for integrating with payment gateways? [closed]

    - by Rowland Shaw
    It seems that there are only two types of payment gateway service out there at the moment: either the entire cart logic is handled off-site (with something like PayPal's Standard option), or you have to go through certification for handling credit card numbers and do pretty much everything yourself. Ideally, for the project I'm working on, I'm after a bit of middle ground where I can handle the cart on-site and only pass over to a payment gateway (with an order amount, billing and delivery details, and an order reference) for them to handle the card details before passing back. I'm sure I've used e-commerce sites built on this pattern before, but I cannot find any payment providers out there that offer this sort of option, so are there any? The only other requirement we have at present is that it must accept orders in Sterling.

    Read the article

  • How to track registrations that come from Google AdWords ads in Google Analytics?

    - by automatix
    I created a campaign in Google AdWords, created some ads in it, and gave them URLs like mydomain.tld/registration/?utm_campaign=mycampaing&ad=x, mydomain.tld/registration/?utm_campaign=mycampaing&ad=y and mydomain.tld/registration/?utm_campaign=mycampaing&ad=z. All ads lead to the registration page. A registration is a visit to the page mydomain.tld/registration-complited/?user={ID}, so I can track registrations in Google Analytics: I just go to Behavior -> Site Content -> All Pages and filter the pages to registration-complited. But how can I see how many and which users registered after they came from an ad of a campaign (e.g. utm_campaign)? And how can I also track this for a single ad of the campaign, e.g. x?
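
    As a hedged sketch of what the ad URLs could look like if tagged with the standard Google Analytics campaign parameters (Analytics needs utm_source, and normally utm_medium, before it attributes a visit to a campaign, and utm_content can carry the per-ad label instead of a custom ad parameter):

        mydomain.tld/registration/?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=ad-x
        mydomain.tld/registration/?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=ad-y
        mydomain.tld/registration/?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=ad-z

    With that tagging in place, the registration-complited page can be defined as a goal, and the goal completions segmented by campaign and ad content in the standard campaign reports.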

    Read the article

  • Subdomain still times out after being set up a month ago

    - by user8137
    I'm a newbie at this, and this has probably been asked already, but the answers I found online were close yet too vague, so I've probably really messed this up. I would really appreciate specific step-by-step instructions. This is what I'd like to do: have the subdomain www.high-res.domain.com accessed by external customers with specific permissions to access the site (like FTP). We use Network Solutions to house domain.com. We recently added a new IP address to point to www.high-res.domain.com, and I gave that IP address to the company that hosts our website. Pinging www.high-res.domain.com resolves to the correct IP address, but it still times out. It's been a few weeks now and it still times out: C:\> ping XXX.XXX.X.XXX Pinging XXX.XXX.X.XXX with 32 bytes of data: Request timed out. Request timed out. Request timed out. Request timed out. Ping statistics for XXX.XXX.X.XXX: Packets: Sent = 4, Received = 0, Lost = 4 (100% loss). Tracert times out as well. I even went to DNS tools and a few other checking sites, and they show the same thing. I recently went into the DNS management console on our server (Win2k3 SP1) and created an A record under DomainDnsZones, which shows as a CNAME when you look at it. Under the domain it has two entries, one for the subdomain and the other for the website host, each with a separate IP address. Is this correct? The website people are too busy on another project to research it further and my friends haven't gotten back to me. Please help. Thanks, KK
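
    For comparison, a minimal sketch of what the subdomain's record might look like in a BIND-style zone file; the record belongs in whichever zone is authoritative for domain.com (here presumably the one managed at Network Solutions), and the IP below is a placeholder:

        ; in the zone for domain.com
        www.high-res    IN  A    203.0.113.10    ; placeholder for the web host's IP address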

    Read the article

  • Question on business connections and page rank?

    - by Viveta
    I just want to ask this to get a yes/no answer on something I've been wondering about lately. With countless sites now using nofollow, it's harder to get ranking benefit for your page even when your content is useful and draws traffic. What I'm trying to find out is whether having a bunch of "likes" and connections on your Facebook page passes any benefit through to your main website, or whether you end up competing in the SERPs with your own Facebook page against your home page. Am I correct that if your Facebook page starts doing really well in terms of connections and likes (bumping up its PageRank), links on that page with optimized keywords still pass no benefit to your website, other than people reaching your Facebook page and then being more likely to click through to your site? I hope I've explained what I'm asking well enough; I just want a better picture of this so I know what to focus on in how I link to my desired landing pages in the future.

    Read the article

  • How come Indiegogo links shared on G+ link to their page instead of displaying the URL?

    - by Ivan Vucica
    If an Indiegogo link, such as this one, gets shared on G+, their G+ page is displayed in the post in the place where the URL would commonly be displayed. I've tried analyzing the HTML but came up empty-handed: there's Twitter Cards metadata, there's OpenGraph, there is a G+ button -- but I found nothing that links to Indiegogo's page, not even rel="publisher". So, how does Indiegogo achieve this?

    Read the article

  • Does the EU cookie law apply to an EU site that is hosted outside of the EU?

    - by mickburkejnr
    I have been reading up about this EU cookie law and have also had in-depth conversations with my girlfriend, who is a solicitor/lawyer, and with colleagues while building websites. While we are now working towards implementing a way to abide by the EU law, I have thought of something which no one really knows the answer to and which has caused a few arguments. It's my understanding that any website in the EU must abide by these cookie laws, which is understandable. However, say I were to have a .co.uk or .eu domain name pointing to a website which is hosted in America, for example: do I still need to abide by the EU laws even though the website is hosted outside of the EU? One person I have asked has said that because the domain name is .co.uk or .eu (a European TLD), the website is still accountable under EU law. Another person has said that because the actual website is hosted outside of the EU, it doesn't have to bother with this law at all.

    Read the article

  • Tracking Search Filter Parameters Using Google Analytics

    - by Petra Barus
    I'm just wondering if there is a way to do this using Google Analytics. Let's say I have a search filter like the one used on Trulia.com: there is a text search for the location, with other drop-downs for filtering by bedrooms, land size, property type (apartments, houses) etc. Is there a way to track the filter and obtain a report answering questions like the ones below using Google Analytics? What are the most popular property types (houses, apartments) searched for in the New York area? What is the most common maximum price among users who are looking for apartments in San Francisco? (Or is Google Analytics actually not suitable for this kind of thing?)
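
    One hedged approach, assuming the classic asynchronous ga.js snippet is installed: push an event each time a visitor runs a filtered search, so the selections show up under the Events reports and can be segmented further from there (the category, action and function names below are made up for illustration):

        // sketch only: call this from whatever handler runs the search
        function trackSearchFilter(location, propertyType, maxPrice) {
          _gaq.push(['_trackEvent', 'Search Filter', 'Property Type - ' + location, propertyType]);
          _gaq.push(['_trackEvent', 'Search Filter', 'Max Price - ' + location, propertyType, parseInt(maxPrice, 10)]);
        }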

    Read the article

  • Creating a backup - Rsync - Connection refused (111)

    - by pablofiumara
    I am trying to create a backup of my website for free. I just want a backup of the website, including not only all files and the configuration but also the databases; a full backup, in other words. If it can be done automatically, even better. I feel there are better ways than using cPanel to achieve this (and some web hosts do not provide any control panel at all). I read the following on how to do it: "Automatically mirror the entire contents and configuration of your main server to a secondary backup server on a completely separate network in a different data centre. Use RSync, FXP, cPanel voodoo, or whatever method you wish to automate syncing." That is why I installed the rsync daemon, which is an alternative to SSH for remote backups. I configured it, but the test went wrong. The terminal is showing me this: pablofiumara@pablofiumara-Lenovo-G470:~$ sudo rsync [email protected]::share [sudo] password for pablofiumara: rsync: failed to connect to pablofiumara.com (50.87.147.75): Connection refused (111) rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9] pablofiumara@pablofiumara-Lenovo-G470:~$ sudo rsync [email protected]::share failed to connect to 50.87.147.7 (50.87.147.7): Connection refused (111) rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9] What should I do? Is there a better or easier way to achieve what I described in the first paragraph?
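
    "Connection refused" usually means nothing is listening on the rsync daemon port (873) on the server, or the host blocks it; many shared hosts only allow SSH. A minimal sketch of pulling the site over SSH instead, which needs no daemon at all (the username and paths are assumptions):

        # pull the remote web root into a local backup folder over SSH
        rsync -avz -e ssh user@pablofiumara.com:public_html/ /home/pablofiumara/backups/site/

    Note that rsync only copies files; the databases would still need a scheduled mysqldump (or equivalent) written somewhere rsync can pick up.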

    Read the article

  • Google shows "search instead for" when searching for our website

    - by Athanatos
    Our website is new, and its name is only one letter different from that of another website, though it is a completely different type of site and company. Searching for xxxxxA works OK in Google and we find relatively good results. However, searching for xxxxxA.com finds results for the other website and gives us the following options: "Showing results for xxxxxE.com. Search instead for xxxxxA.com" (a hyperlink which, when clicked, correctly searches for our site). Questions: Do we need to contact Google to correct this, and if yes, how? If not, will it be corrected automatically when the site becomes more popular, and what is the process? How do we make the process quicker?

    Read the article

  • Is there a class or id that will cause an ad to be blocked by most major adblockers?

    - by Moak
    Is there a general class or ID for an HTML element that a high majority of popular adblockers will block on a website they have no information for? My intention is to have my advertisement blocked; avoiding automatic blocking is easy enough. I was thinking of maybe borrowing some IDs or classes from big advertising companies that are already being fought off quite actively. Right now my HTML is: <ul id="partners"> <li class="advertisment"><a href="#" class="sponsor"><img class="banner"></a></li> </ul> Will this work, or is there a more solid approach?

    Read the article

  • Build my own CDN service

    - by user5332
    I have two servers, each with its own domain: the first is www.myexample1.com and the second is www.myexample2.com. I would now like to use www.myexample2.com as a CDN for www.myexample1.com, but I don't know how to set up DNS or Apache so that both servers serve files for www.myexample1.com requests. I don't need to solve databases, sessions or anything else; I just need to know how to make both servers available as www.myexample1.com.
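
    One hedged sketch of how this is sometimes done without any special CDN software: give www.myexample1.com two A records (simple DNS round robin) and add a vhost on the second server that answers for that name, assuming the static files are copied or rsynced over to it (IPs and paths below are placeholders):

        ; DNS zone for myexample1.com -- the same name resolves to both servers
        www    IN  A    198.51.100.10    ; server 1 (placeholder)
        www    IN  A    198.51.100.20    ; server 2 (placeholder)

        # Apache vhost on the second server
        <VirtualHost *:80>
            ServerName www.myexample1.com
            DocumentRoot /var/www/myexample1    # a copy of server 1's files
        </VirtualHost>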

    Read the article

  • WordPress plugin installation error

    - by Steve
    I'm trying to upload secure-wordpress.1.0.6, and I receive the following error: Warning: touch() [function.touch]: open_basedir restriction in effect. File(/abs_path/wordpress/tmp/secure-wordpress.tmp) is not within the allowed path(s): (/abs_path/:/abs_path/:/usr/local/lib/php:/tmp/php_upload) in /abs_path/public/www/wordpress/wp-admin/includes/file.php on line 199 Download failed. Could not create Temporary file. The /wp-content folder and all its subfolders have 777 permissions. I've added the following two lines to wp-config.php: putenv('TMPDIR='.ini_get('upload_tmp_dir') ); define('WP_TEMP_DIR', ABSPATH .'wp-content/uploads/'); What else should I try? I am using WordPress 3.04 in a PHP 4.49 environment.
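
    One more thing that may be worth a try, as a sketch only: the failing temp file is being created under a path that open_basedir rejects, while /tmp/php_upload is explicitly listed as allowed, so pointing the existing WP_TEMP_DIR define at an allowed, writable directory might get past the restriction:

        /* sketch for wp-config.php: use a directory open_basedir already permits and PHP can write to */
        define('WP_TEMP_DIR', '/tmp/php_upload/');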

    Read the article

  • Hide folder names or such?

    - by Miller
    Okay, I have a cPanel account with unmetered everything (I pay a bit per month), so I want to host my forum on it, among other things. Let's say I have the domains money.com, wordpress.com and forum.com. I'll have to put everything in different folders; for instance money will be in /m/, wordpress in /w/ and forum in /forum/ or something. What I'm asking is: how do I hide the folder so that money.com/m/ actually looks like money.com? I need to hide the folder name the contents are in so I can host multiple sites, with each site looking like it's the only site on the host, and without having to add a redirect to send it to the folder. Thanks guys, been trying for a while!
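
    One common approach, sketched here for the money.com example (untested, and the folder names follow the example above): a rewrite in the account's top-level .htaccess that maps the host name onto its folder internally, so the folder never appears in the URL:

        RewriteEngine On
        # serve money.com from /m/ without exposing the folder
        RewriteCond %{HTTP_HOST} ^(www\.)?money\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/m/
        RewriteRule ^(.*)$ /m/$1 [L]

    With cPanel specifically, adding each extra domain as an addon domain pointed at its own folder achieves much the same thing without hand-written rules.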

    Read the article

  • Link shortener with advanced reporting?

    - by Qualcuno
    I am searching for a script (preferably in PHP) or an external service which lets me create a URL shortener with advanced reports. We have been using Google Short Links for a while: it works really well, but it lacks reporting (it only displays a counter with the total number of redirects). Our setup is as follows: "go.mydomain.com" points to the web service, and we can create links such as "go.mydomain.com/product1". What I'm looking for is a similar service (or self-hosted solution) but with advanced reports, so we can track redirects by day, month, etc., distinguish between mobile and desktop users (very important!) and so on.
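
    If a self-hosted route is acceptable, the core of it is small. A minimal PHP sketch (the table layout, file name and the mobile check are all assumptions) that records each hit before redirecting, so reports by day, month or device can later be built from the hits table:

        <?php
        // redirect.php -- sketch of a logging redirect for go.mydomain.com/product1
        $slug = trim($_SERVER['REQUEST_URI'], '/');                    // e.g. "product1"
        $pdo  = new PDO('mysql:host=localhost;dbname=shortener', 'user', 'pass');

        $stmt = $pdo->prepare('SELECT target FROM links WHERE slug = ?');
        $stmt->execute(array($slug));
        $target = $stmt->fetchColumn();

        if ($target) {
            // crude mobile detection, illustration only
            $mobile = preg_match('/Mobile|Android|iPhone/i', $_SERVER['HTTP_USER_AGENT']) ? 1 : 0;
            $log = $pdo->prepare('INSERT INTO hits (slug, hit_at, is_mobile) VALUES (?, NOW(), ?)');
            $log->execute(array($slug, $mobile));
            header('Location: ' . $target, true, 301);
        } else {
            header('HTTP/1.0 404 Not Found');
        }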

    Read the article

  • URL Rewrite http to https EXCEPT files in a specific subfolder

    - by BrettRobi
    I am trying to force all traffic on my web site to use HTTPS, using the URL Rewrite 2.0 module added to IIS 7.5. I got that working and now need to exclude a couple of pages from using SSL, so I need a rule that rewrites all URLs to HTTPS except those referencing this folder. I've been banging my head against the wall on this and am hoping someone can help. I tried creating a rule to match all URLs except those in a nossl subfolder, as in this example: <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true"> <match url="(/nossl/.*)" negate="true" /> <conditions logicalGrouping="MatchAll" trackAllCaptures="false"> <add input="{HTTPS}" pattern="off" /> </conditions> <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Found" /> </rule> But this doesn't work. Can anyone help?
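
    One possible explanation, offered as a sketch rather than a confirmed fix: the url attribute of match is tested against the request path without its leading slash, so "(/nossl/.*)" never matches and the negation excludes nothing; dropping the slash and redirecting to {REQUEST_URI} (a negated pattern can't supply a useful capture) would look roughly like this:

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
          <match url="^nossl/" negate="true" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{HTTPS}" pattern="off" />
          </conditions>
          <action type="Redirect" url="https://{HTTP_HOST}{REQUEST_URI}" redirectType="Found" />
        </rule>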

    Read the article

  • Is it possible for a web server to send more files than were requested, and have the browser accept them?

    - by Osiris
    I've created a basic web server for a school project, and it serves static content without a problem. I thought of having the server parse all htm/html files for links to .js/.css/image files, and send these files to the client without the client having to request them later. E.g. the browser requests index.htm and the server responds with index.htm and image.jpg. I modified the server to send two distinct HTTP responses for a "GET /index.html HTTP/1.1" (one for the HTML page and one for the image), but the browser ended up requesting the image when it was good and ready. Is there any way to bypass this (use a multipart response, perhaps)? Will these files be accepted by most browsers, or will they be rejected for security reasons?

    Read the article

  • Robots meta tag with "noimageindex"

    - by jimy
    I have a doubt regarding the noimageindex value in the robots meta tag. Say I add this tag to my page http://www.example.com/somepage/someaction.php, and the images on that page are served from another domain, say http://www.exampleimg.com. Will the tag still have any effect? I mean to say, will the images be ignored by bots? Or is exampleimg.com not affected by the tag, so all the images will be indexed anyway? Note: we want to stop indexing of the images on that particular page.
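
    For reference, the tag itself is the ordinary robots meta tag carrying the noimageindex value; as Google documents it, it asks crawlers not to index images on the page that carries the tag, so a sketch for that page would be:

        <meta name="robots" content="noimageindex" />

    The same images can still be indexed if other pages without the tag embed or link to them, so it limits indexing via this page rather than blocking the image files themselves.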

    Read the article

  • Transferring local site to shared hosting

    - by Pete
    I'm looking to set up a simple online text processing tool similar to the Clang demo. The processing program itself is a C++ program which I can modify to produce the desired output. Since I use Linux+Perl daily and have used Apache in the past, I'd like to get this working locally first. My two questions are: Is it possible to do this with only Apache and Perl? I've looked into frameworks for doing this and quickly ran into The Paradox Of Choice. Will I be able to easily transfer a working local site to a shared hosting service? I want to administer as little as possible. My understanding is that, since this needs to run a C++ program, CGI is a requirement and thus I need to administer the httpd server. Hopefully this doesn't mean a VPS. Thanks
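
    On the first question: plain Apache with mod_cgi plus a small Perl wrapper around the compiled binary is enough for a demo like this. A minimal sketch (the script name, the ./processor binary and the 'source' parameter are made up for illustration; it assumes the binary reads from stdin and writes to stdout):

        #!/usr/bin/perl
        # process.cgi -- hand the submitted text to the C++ program and return its output
        use strict;
        use warnings;
        use CGI;
        use IPC::Open2;

        my $q = CGI->new;
        my $source = $q->param('source');
        $source = '' unless defined $source;

        my $pid = open2(my $out, my $in, './processor');
        print $in $source;
        close $in;
        my $result = do { local $/; <$out> };
        waitpid($pid, 0);

        print $q->header('text/plain');
        print $result;

    On the second question: most shared plans allow CGI scripts, but running an arbitrary compiled C++ binary depends on it being built for the host's platform and permitted by their policy, which is worth confirming before ruling a VPS in or out.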

    Read the article

  • How to use JavaScript functions defined in an external *.js file in the browser's JavaScript console?

    - by Dmytro Tsiniavsky
    I would like to know whether it is possible to save a file, for example simplemath.js, containing a simple function like function ADD(a, b) { return a + b; }, open Opera's or some other browser's JavaScript console, somehow include this simplemath.js file, call ADD(2, 5), and get the result in the console, or execute JavaScript code on the current web page and manipulate its content. How can I do that? How can I use JavaScript functions from external files in a web browser's JavaScript console?
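
    One way that generally works from any browser's console, sketched here (the script URL is a placeholder; the file just has to be reachable over HTTP from the page):

        // inject the external file into the current page, then use its functions
        var s = document.createElement('script');
        s.src = 'http://example.com/simplemath.js';          // placeholder URL for simplemath.js
        s.onload = function () { console.log(ADD(2, 5)); };  // prints 7 once the file has loaded
        document.getElementsByTagName('head')[0].appendChild(s);

    Alternatively, the function definition can simply be pasted into the console, after which ADD(2, 5) is callable on the current page.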

    Read the article

  • Strategy for hosting 700+ domains, each with static HTML site

    - by jonschlinkert
    I have a portfolio of more than 700 domain names, and ideally I'd like to put up a single-page HTML/CSS/JavaScript site for each domain. Is there a system/strategy/workflow that will allow me to: Automate the deployment of new websites, quickly and easily, without having to manually initiate each new website in an admin panel. For instance, I've seen Dropbox-based solutions that claim to make it simple to set up new websites on your Dropbox account, but you still have to set each one up in an admin interface first. It would be so much easier to have a folder naming convention that allowed the user to easily clone/copy/duplicate sites inside their Dropbox App folder (https://www.dropbox.com/developers/blog/23) to create new ones. Sounds interesting, however... It's easy to manage CNAMEs on the registrar side; is there a way to quickly associate CNAMEs with new websites, maybe gh-pages-style (https://help.github.com/articles/setting-up-a-custom-domain-with-pages)? With GitHub's gh-pages, all you have to do is drop a file called CNAME into your repo, with the domain name you want associated with the repo inside the file. Unfortunately, gh-pages isn't a good solution for what I'm doing, though. I'm also a front-end developer specializing in rapid web development and front-end build systems, so building and maintaining static assets for hundreds of sites is no problem. It's the hosting side that I really struggle with. Any suggestions?
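
    On the hosting side, one pattern worth considering, sketched under the assumption of a single Apache server (with mod_vhost_alias) that every domain's DNS points at: use one folder per domain, named after the host header, and a new site then needs nothing more than a new folder plus a DNS record:

        # httpd.conf sketch -- %0 is the full Host header, so
        # /var/www/sites/example.com/ serves http://example.com/
        UseCanonicalName Off
        <VirtualHost *:80>
            ServerAlias *
            VirtualDocumentRoot /var/www/sites/%0
        </VirtualHost>

    Deployment then reduces to copying a built folder into /var/www/sites/, which fits a front-end build workflow well.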

    Read the article

  • Access denied 403 errors after migrating my site

    - by AgA
    I've recently migrated my Joomla site from one shared host to another with HostGator. GWT notified me about many 403 access-denied pages. I've checked with Firebug too: even though the browser displays the full page correctly, the HTTP response is 403. I've checked the home page and it is correctly returning a 200 response, and the same is shown by Fetch as Google in GWT (pasted at the bottom). The site is 3 years old and I regularly do such migrations. I've copied the files and database as-is, and I've even cleared all the caches, but no luck. There is only one change: previously the site was the primary domain but now it's an add-on one. What could be the issue? This is how Googlebot fetched the page. Fetch as Google URL: http://MYSITE.COM/-----------------REMOVED.html Date: Thursday, June 20, 2013 at 10:32:14 PM PDT Googlebot Type: Web Download Time (in milliseconds): 3899 HTTP/1.1 403 Forbidden Date: Fri, 21 Jun 2013 05:32:15 GMT Server: Apache P3P: CP="NOI ADM DEV PSAi COM NAV OUR OTRo STP IND DEM" Expires: Mon, 1 Jan 2001 00:00:00 GMT Cache-Control: post-check=0, pre-check=0 Pragma: no-cache Set-Cookie: 0e4f6b53991c80cf39d57a6db58bb58d=ee2d880e8db0f1fc03c5612ea5a77004; path=/ Last-Modified: Fri, 21 Jun 2013 05:32:19 GMT Keep-Alive: timeout=5, max=75 Connection: Keep-Alive Transfer-Encoding: chunked Content-Type: text/html; charset=utf-8 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en-gb" lang="en-gb" > <head> <base href="http://www.mysite.com/-----------------rajiv-yuva-shakthi-programme-finance-planning.html" /> <meta http-equiv="content-type" content="text/html; charset=utf-8" /> <meta name="robots" content="index, follow" /> <meta name="keywords" content="" /> <<<<<<TRIMMED>>>>>>>>>>>>>>

    Read the article

  • What's the difference between my nameserver and CNAME settings?

    - by Josh Mcquiston
    I have purchased a domain name (mxsoup.net) through GoDaddy, and it is just parked. In order to set up my custom URL for my SourceAudio site, they give me the following instructions: "In order to host your site at your own URL, we need you to set up some DNS records to point your URL to us. Specifically, we need two CNAME references, one for 'www.mxsoup.net' and one for 'secure.mxsoup.net', both of which should point to 'web2.sourceaudio.com'." But the rep on the phone at GoDaddy said that my site is hosted at HostMonster.com, and that I therefore need to talk to them to accomplish this (which is possibly true, but my business owner says he hasn't purchased hosting for this particular domain, though he does have some other sites in his HostMonster hosting account). My GoDaddy account shows that my nameservers are pointing at NS1.HOSTMONSTER.COM and NS2.HOSTMONSTER.COM, and I can edit those. But is this the same as setting up the CNAMEs as described above? Any help would be greatly appreciated!
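
    They are not the same thing: the nameserver entries at GoDaddy only say which DNS service answers for mxsoup.net (currently HostMonster), while the requested CNAME records live inside whatever zone those nameservers serve. As a sketch, the two records SourceAudio is asking for would look like this in that zone, added through whichever panel manages it:

        ; zone for mxsoup.net
        www       IN  CNAME  web2.sourceaudio.com.
        secure    IN  CNAME  web2.sourceaudio.com.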

    Read the article

  • PHP accessible shared content between two websites on the same VPS on different domains/IPs

    - by Lee Fentress
    I have two e-commerce websites selling digital music downloads on the same VPS, currently using cPanel/WHM (but thinking of switching to Virtualmin). They have separate domains and IPs, of course. They both sell from the same set of music files, so I have duplicate copies in each website's directory, which takes up a lot of disk space. How might I go about sharing the same set of music files across both sites, with PHP access, so that it does not break my shopping cart's functionality of serving customers the downloads after they have paid for them? I thought of maybe using symlinks or something, but I don't know if that is possible, or whether it would have to somehow circumvent built-in security features of the server. I'm new to VPS management.
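
    A minimal sketch of the symlink idea (the paths are assumptions based on a typical cPanel layout): keep one real copy under the first account and link it into the second site's tree:

        # keep the real files with site 1, link them into site 2's directory
        ln -s /home/site1/public_html/music /home/site2/public_html/music

    Whether PHP on the second site can then read through the link depends on the server's open_basedir and ownership/permission setup, so that is worth checking before deleting the duplicate copy.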

    Read the article

  • Google PageRank not showing after redirecting www to non-www?

    - by muhammad usman
    I have a fashion website. I had redirected my domain from http:// (non-www) to the http://www domain, and my preferred domain in Google Webmaster Tools was http://www. Now I have redirected http://www to the http:// domain and have changed my preferred domain as well. Now Google PageRank is not showing for even a single page. Would anybody please help me and let me know if I have done something wrong? Below is my htaccess redirect code: RewriteBase / RewriteRule ^index\.php$ - [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] RewriteCond %{HTTP_HOST} ^www\.deemasfashion\.com$ RewriteRule ^deemasfashion\.com/?(.*)$ http://deemasfashion.com/$1 [R=301,L] RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/ RewriteRule ^index\.html$ http://deemasfashion.com/ [R=301,L] RewriteRule ^index\.htm$ http://deemasfashion.com/ [R=301,L]
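
    For comparison, a hedged sketch of how a www-to-non-www redirect is usually written: the host name can only be matched in a RewriteCond (a RewriteRule pattern never sees the domain, so the rule above can't fire), and the redirect has to come before the catch-all rule that hands everything to index.php:

        RewriteEngine On
        RewriteBase /

        # redirect www to non-www before any other rules run
        RewriteCond %{HTTP_HOST} ^www\.deemasfashion\.com$ [NC]
        RewriteRule ^(.*)$ http://deemasfashion.com/$1 [R=301,L]

        # existing rules (the index.php catch-all etc.) follow here

    Separately, toolbar PageRank is only refreshed a few times a year, so even a correct redirect may show no PageRank until the next public update.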

    Read the article

  • Alternative to nofollow: custom 302 URL shortener?

    - by Dogweather
    Here's the scenario: lots of blogging platforms make it tedious to insert nofollow into links within the post content; i.e., you need to edit the HTML, format it correctly, etc. I have a client who posts lots of content with links that should be nofollowed, and I thought of a novel way to handle this, since the blogging platform they're using makes it hard: I install a URL shortener web app on the client's domain. The shortener works as normal, except it redirects via 302 instead of 301. The PageRank will therefore stay at the shortener's domain, and not flow on to the target site. Part 2: In order to get the PageRank to collect meaningfully, say on the site's home page, the shortened URLs would be generated like this: /link?12345 instead of /link/12345. And then the path /link would 301 to the home page. This way, the id is a query parameter, not a path element, and thus all the incoming shortened links point to one path, which transfers PageRank to the home page. So that's my idea. I wanted to see if anybody could find problems with it. Thanks!
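
    A minimal PHP sketch of the /link endpoint described above (the lookup array stands in for whatever store a real shortener would use): the id arrives as the query string, known ids get a 302 to their target, and a bare /link falls through to a 301 to the home page:

        <?php
        // link.php -- handles /link?12345
        $targets = array(
            '12345' => 'http://www.example-target.com/some-page',   // placeholder mapping
        );

        $id = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';

        if ($id !== '' && isset($targets[$id])) {
            header('Location: ' . $targets[$id], true, 302);        // temporary redirect, per the scheme above
        } else {
            header('Location: http://www.clientdomain.example/', true, 301);  // bare /link goes to the home page
        }
        exit;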

    Read the article
