Search Results

Search found 9717 results on 389 pages for 'pro'.


  • SSL certificates - best bang for your buck [closed]

    - by Dunnie
    I am in the process of setting up an online store. This is the first project I have attempted which will require a good level of security, so I recognise that a decent SSL certificate is a must. My current (albeit admittedly basic) understanding of the options is:
    DV SSL - more or less pointless, as it provides no verification.
    OV SSL - better, as it provides a basic level of organisational verification.
    EV SSL - 'better, full' verification, but much more expensive.
    As this is a new business venture, and budgets are tight, which option provides the best bang for my buck? I have read questions such as "EV SSL Certificates - does anyone care?" which suggest that EV certificates are a bit of a con. Does this mean that OV certificates offer better value for money, especially for new businesses with shallow pockets?


  • What is the best shopping cart or implementation for unlimited users posting unlimited products? [closed]

    - by Matt
    I've been working with X-Cart a lot lately, and I was thinking about using it for a much larger site, but I don't know if it can handle what I'm looking for. I need a platform or strategy that allows for as many users as possible, where each can post multiple products (hopefully up to a hundred, but that's less important) in their own private catalogs. So what am I looking for? With X-Cart, I'm used to customizing it with jQuery, Smarty, and PHP, so I can handle that much.


  • Will having multiple domains improve my SEO?

    - by Anonymous12345
    Let's say I have a domain already, for example www.automobile4u.com (not mine), with a website fully up and running. The title of my website says: <title>Used cars - buy and sell your used cars here</title> Also, let's say I have fully SEO'd the website, so when people search for the term "buy used cars", I end up on the second or first page. Now, I want to end up higher, so I go to the Google AdWords page where you can check how many searches are made for specific terms. Let's say the term "used cars" has 20 million searches each month. Here comes the question: could I just go and buy the domain matching that search term, in this case www.usedcars.com, and make a redirect to my original page, so that when people search for "used cars", my newly bought domain name comes up, redirecting people to my original website (www.automobile4u.com)? The reason I believe this benefits me is that search engines seem to check website addresses matching the search first of all, so the query "used cars" would automatically bring www.usedcars.com up as the first result, right? What are the downsides of this? I already know about Google spiders not liking redirects, but there are many methods of redirecting... Is this a good idea generally?
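
    For reference, the redirect itself is the easy part; a minimal mod_rewrite sketch for Apache, assuming the extra domain points at the same server (domain names taken from the question):

        RewriteEngine On
        # Send anything arriving on the extra domain to the main site
        RewriteCond %{HTTP_HOST} ^(www\.)?usedcars\.com$ [NC]
        RewriteRule ^(.*)$ http://www.automobile4u.com/$1 [R=301,L]

    Note that search engines treat the target of a 301 as the canonical site, so the redirecting domain itself generally does not keep ranking on its own.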


  • Does redirecting old site's URLs to new site's front page hurt a page's ranking?

    - by Kaivosukeltaja
    An old site that is being rewritten needs to have its URLs redirected to the new site. There are a few hundred pages that may or may not have corresponding pages on the new site, probably with different slugs, and adding mappings manually will require more hours than we can spare. It was suggested that all old URLs be redirected to the new front page, but I remember reading somewhere that this carries a penalty in page rank because it's what link farmers do. Is this true, or can we take the easy way out?
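
    For the subset of old pages that do have counterparts, a middle ground is to map just the important URLs and let the rest fall through to the front page; a hedged sketch with Apache's mod_alias (slugs hypothetical; specific rules must come before the catch-all):

        # Individually map the old URLs worth preserving
        Redirect 301 /old-products.html http://example.com/products
        Redirect 301 /old-about.html http://example.com/about
        # Everything else: the front page
        RedirectMatch 301 ^/.+$ http://example.com/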


  • Photoshop: why does a low-quality JPG saved at high quality increase in file size?

    - by Alex Angelico
    I have a low-quality background image of about 85 KB. If I open the file in Photoshop and save it at 100% quality, the file size increases to 640 KB. This doesn't make sense: the image is already compressed, and the quality cannot be better than the source, so saving at 100% quality should produce the same file that was opened. The problem is, I have this background and want to add a logo image over it, and then save it. But if I do so, I have to save it at 10% quality, and then the logo looks horrible. If I save it at 100% quality, the file size is huge... How can I achieve this?
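
    One way to sidestep the Photoshop re-save entirely is to composite and re-encode once on the command line; a sketch with ImageMagick (filenames hypothetical), where a mid-range quality setting usually keeps the logo crisp without the 640 KB blow-up:

        # Overlay the logo on the background, then write one JPEG at
        # moderate quality; a single re-encode at ~85 is far smaller
        # than a "100%" save.
        convert background.jpg logo.png -gravity southeast -composite \
                -quality 85 combined.jpg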


  • Alexa indexing browsing history?

    - by Haluk
    We have this test.php sitting around in a forgotten folder. It is a script which just sends an email to our site admin. We never had a page linking to it. It is not indexed by Google. It does not exist in the Internet Archive Wayback Machine. But every now and then it gets crawled by ia_archiver. I wonder how it was ever found. Could it be because of the Alexa toolbar installed on our computer? Does Alexa index our personal browsing history?


  • Copying an SSH Public Key to a Server

    - by Nathan Arthur
    I'm attempting to set up a git repository on my Dreamhost web server by following the "Setup: For the Impatient" instructions here. I'm having difficulty setting up public key access to the server. After successfully creating my public key, I ran the following command:
        cat ~/.ssh/[MY KEY].pub | ssh [USER]@[MACHINE] "mkdir ~/.ssh; cat >> ~/.ssh/authorized_keys"
    ...replacing the appropriate placeholders with the correct values. Everything seemed to go through fine. The server asked for my password and, as far as I can tell, executed the command. There is indeed a ~/.ssh/authorized_keys file on the server. The problem: when I try to SSH into the server, it still asks for my password. My understanding is that it shouldn't be asking for my password anymore. What am I missing?
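
    A common culprit when the key is in place but the password prompt persists: sshd silently ignores authorized_keys when the permissions are too loose. A hedged checklist, run on the server:

        # sshd (with the default StrictModes) refuses keys when any of
        # these are group- or world-writable
        chmod go-w ~
        chmod 700 ~/.ssh
        chmod 600 ~/.ssh/authorized_keys

    Connecting with ssh -v also shows whether the client is offering the key at all.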


  • Where to store users' consent (EU cookie law)

    - by Mantorok
    We are legally obliged in a few months to obtain consent from users before storing any cookies on their PC. My query is: what would be the most effective way of storing this consent, to ensure that users don't get repeat requests to give it in the future? For authenticated users I can obviously store consent against their profile, but what about non-authenticated users? My initial thought, ironically, was to store the given consent in a cookie...
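
    Storing the consent itself in a cookie is the usual way out of the irony: a consent cookie is generally regarded as "strictly necessary", since it exists only to honour the visitor's choice. A minimal sketch in plain JavaScript (cookie name hypothetical):

        // True if this visitor has already given consent.
        function hasConsent() {
          return document.cookie.indexOf('cookieConsent=1') !== -1;
        }

        // Call when the visitor clicks "I agree"; persists for one year.
        function storeConsent() {
          var expires = new Date(Date.now() + 365 * 24 * 60 * 60 * 1000);
          document.cookie = 'cookieConsent=1; expires=' +
              expires.toUTCString() + '; path=/';
        }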


  • Poor backlink profile - search rankings not updated for 2+ months

    - by fistameeny
    I am carrying out some work on a website that is a PR2 with a few good-quality, relevant backlinks (PR4-6). It has a presence on Twitter that is updated regularly, a Google Places listing, and listings in some decent directories (Qype etc). The site was rebuilt in Drupal 7 two months ago, with all the basics done: URL rewriting, an XML sitemap submitted to Google, and most importantly, good-quality, structured content. I've noticed that Google is still showing "old" URLs from the previous version of the site that was ditched 8 weeks ago. I think the site may be penalised under the Penguin update, as a previous SEO company created many low-quality links from link farms/directories. My question is: what is the correct way to deal with this? Bing Webmaster Tools can "disavow" links, and I guess I can attempt to contact the link farms to have them removed. I've already submitted a request to Google asking that the penalty be removed, as we're trying to tidy up a bad history. We submit updated sitemaps to Google and Bing daily, and have built some further decent-quality, relevant links. Is there anything further I can do?


  • Google Analytics set up with wrong domain

    - by Tom
    I have recently embedded Google Analytics into a site using the default embed code:
        <script>
        (function (i, s, o, g, r, a, m) {
          i['GoogleAnalyticsObject'] = r;
          i[r] = i[r] || function () {
            (i[r].q = i[r].q || []).push(arguments)
          }, i[r].l = 1 * new Date();
          a = s.createElement(o), m = s.getElementsByTagName(o)[0];
          a.async = 1;
          a.src = g;
          m.parentNode.insertBefore(a, m)
        })(window, document, 'script', '//www.google-analytics.com/analytics.js', 'ga');
        ga('create', 'UA-XXXXXXXX-1', 'MYDOMAIN.COM');
        ga('send', 'pageview');
        </script>
    However, I had MYDOMAIN.COM set to an incorrect domain. The views for the site seem very low; however, I can see myself there as a visitor in the real-time scanner. What effect would setting the domain incorrectly have had? How does Google use this parameter?
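
    As to the parameter itself: in analytics.js, the third argument to ga('create', ...) is the cookie domain, and a value that doesn't match the actual host can prevent the _ga cookie from being written, which skews the reported figures. A hedged fix, using the library's documented option, is to let it resolve the domain automatically:

        ga('create', 'UA-XXXXXXXX-1', 'auto');  // 'auto' picks the cookie domain at runtime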


  • Extracting meta tag attributes using wget [migrated]

    - by Amit
    I have a file with some URLs, one per line. I need to extract the "keywords" present in the meta tags, i.e. if there is a meta tag for "keywords", I want to get its "content" value. Example: if the web page has the meta tag <meta name="keywords" content="wikipedia,encyclopedia">, then for that URL I want "wikipedia,encyclopedia" to be extracted. One approach is to download the web page using wget and then parse it with some standard HTML parser. I was wondering whether there is any better way to do this without downloading the entire web page.
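
    A hedged sketch of the "without downloading the entire page" idea: ask the server for only the first few kilobytes (meta tags live near the top of the <head>) and grep out the content attribute. Assumes curl and GNU grep; attribute order and quoting vary in the wild, so a real HTML parser remains the robust route:

        #!/bin/sh
        # Fetch only the first 8 KB of each URL (servers are free to ignore
        # the Range request) and pull out the keywords meta tag content.
        while read -r url; do
          curl -s --range 0-8191 "$url" |
            grep -ioP '<meta\s+name="keywords"[^>]*content="\K[^"]*' |
            head -n 1
        done < urls.txt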


  • How can I redirect everything but the index as 410?

    - by Mikko Saari
    Our site shut down and we need to give a 410 response to users. We have a small one-page replacement site set up on the same domain and a custom 410 error page. We'd like all page views to be answered with 410 and directed to the error page, except for the front page, which should point to the new index.html. Here's what's in the .htaccess:
        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]
    This works, except for one thing: if I type the domain name, I get the 410 page. With www.example.com/index.html I see the index page as I should, but just www.example.com gets 410. How could I fix this?
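
    The bare domain fails because, in a per-directory .htaccess, the path mod_rewrite tests for "/" is an empty string, which the negated pattern happily matches. A hedged rework that handles "/" explicitly and uses mod_rewrite's [G] flag (410 Gone); the error-page path is hypothetical:

        ErrorDocument 410 /410.html
        RewriteEngine on
        # Bare domain: serve the replacement index
        RewriteRule ^$ index.html [L]
        # The index itself passes through untouched
        RewriteRule ^index\.html$ - [L]
        # Everything else that isn't a real file (keeps the error page and
        # its assets reachable) answers 410 Gone
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule .* - [G]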


  • A couple of links to our products and 10 pages of crack/keygen/torrent/etc.

    - by devdept
    If you try searching for our company and product name, you'll get two useful links and 10 pages of hacker sites where you can eventually download cracked versions of our products. How can we clean out the hacker links and leave only useful links to our product pages? We already checked the Google URL Removal Tool, but there is nothing meaningful to specify within its 'Removal Type' options in this case. Shall we proceed all the same? Thanks.


  • Which mobile device is appropriate as a utility tool for a web master?

    - by Kayle
    Basically, I'm looking for a device to use on the road, and I would prefer not to have to sit down for the majority of tasks (which rules out netbooks, in my mind). I'm also hoping to spend less than $500. This is what I'd like to be able to do "capably" on the device:
    Browse the web in non-mobile format; Flash is a plus
    Email, chat, etc.
    Have access to a decent text editor and FTP, or a browser that supports Bespin/ACE
    Some sort of SSH support
    I'm looking at rooted Android phones and the iPhone/iPad... though the phone aspect is only icing (it would be cool to consolidate the two devices and have net access through cell networks, but I'm not married to the idea). Are there cheap Linux tablets that are ready for prime time yet? I suppose that would be ideal. All suggestions welcome!


  • Free hosting solution for a very low-traffic website [duplicate]

    - by user966939
    I run a very low-traffic website (about 40 users, basically all of whom are active daily on the site). I don't see that changing anytime soon either, as there is no way to sign up on the site right now. Until now I have just been using a sub-directory on a friend's shared host for the website. But in only a few weeks from now, his subscription will end, and he has no plans to renew it. So of course this means I'll have to move on to something else. But I don't think I'll find someone willing to share a shared host with me again. And besides, the software on that server is ancient (PHP 4.4.9 + MySQL 4.1.22). There's one obvious solution that comes to mind, I guess: choose a better host and pay for it myself. The problem here is that I have no real fixed income, as I'm only a student. So even if the pricing is dirt cheap, I just can't be certain I will be able to afford it, every single month, for... at least 2 years maybe? So I've looked at free hosting solutions instead. The minimum requirement I had was that it be completely free of ads. But no matter where I look, I always find something in a corner or two ("what can you expect from a free host?" - yeah I know, but I guess it was worth a shot). For example, on Byethost (one of the free hosts I tried), if you trigger a PHP error while error reporting is set to E_ALL, you will spawn some hidden ad... Besides Byethost, I've tried 000Webhost, x10Hosting, 2Freehosting/1Freehosting and Wink.ws, and they are only worse. Okay, I'm running low on ideas. But! What if I just hosted the site myself, on my own computer? That could work. I actually do have my computer on practically 24/7. But not really: sometimes I need to reboot it, and sometimes we even have power outages. And what if the hardware needs an upgrade? It's not such a big deal for me if the site goes down, because I know what's going on, but what about the users? If I do decide to host it myself, is there some way to show users an alternate page instead of them just seeing a generic "server not found" page in the browser when the site is not accessible? Or is there something I have been missing out on? Is there a different kind of "web hosting" solution out there that I haven't heard of? Here is what I'm really looking for:
    Free (as in, no costs)
    NO ads
    Bandwidth enough for a low-traffic forum with roughly 40 users
    (Semi-)up-to-date PHP and MySQL (at least not older than a year)
    No standard (non-extension) PHP functions turned off, such as sleep()
    The mbstring extension enabled
    Disk space: at least 5 MB
    At least one MySQL database
    Some bonus points would be:
    Max execution time of PHP scripts can be set
    Remote access to the MySQL database
    What would be the best solution for me? Is there one?


  • Validation Meta tag for Bing [closed]

    - by Yannis Dran
    Author's note: I did my research before posting; the old "duplicate" was generated over a year ago, and things have changed since then. In addition, it was a generic question, while mine targets only ONE search engine: Bing. The FAQ is not clear about how we should deal with these "duplicate" cases. The "duplicate" can be found here: Validation Meta tags for search engines
    Should I remove the validation meta tag for the Bing search engine after I validate the website?
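
    For context, the tag in question is Bing's site-verification meta tag (the token is a placeholder):

        <meta name="msvalidate.01" content="YOUR-VERIFICATION-TOKEN" />

    Search engines typically re-check ownership from time to time, so the cautious default is to leave the tag in place after validating.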


  • Upload ICS (calendar) file to my server

    - by IEnumerable
    I want to upload my ICS file to my server; well, that's the easy part. What I really want is to be able to share this *.ics file with 2-3 people so they can edit it. I have currently uploaded the file to the www/ (root) of my server, at mydomain.com.au/myfile.ics, and added it to my email/calendar client. It was all looking OK until I tried to save changes. Can someone please direct me to some documentation on how to do this properly? The easy solution would be to upload it to Google, but I would rather learn how to manage the file myself. Thank you


  • Google is not treating two Australian schools as separate sites when both are subdomains of qld.edu.au

    - by LuckySpoon
    My question relates to two websites, each of which is a "Calvary Christian College", but in two totally different locations and entirely unrelated to each other (except by name, and thus domain). All schools in the state are issued a <school-name>.qld.edu.au subdomain, in this case calvary.qld.edu.au and calvarycc.qld.edu.au. Now what's interesting is that these domains are crossing each other in sitelinks for searches such as "calvary christian college townsville". The green data here is for one school (the Townsville school, as per the search term), and the red data is for the other school. I put a demotion in for this 6 months ago (we control calvary.qld.edu.au), but we're seeing no change on the results page. I have been able to get the owners of calvarycc.qld.edu.au to submit demotions for our domain, which should go in sometime in the next few days. What can we do to tell Google that these websites are not interchangeable, despite both appearing as "subdomains" of qld.edu.au? We can possibly open channels of communication with the administrators of qld.edu.au, but we will need to tell them what to change, and at this point I'm out of ideas.


  • Lazyloading images and SEO

    - by surpr
    Lazyloading images with a noscript fallback: should I expect any damage in the SERPs? The site is completely thumbnail-based. Also, should I put a smaller image size in the noscript fallback to increase crawlability? We have nearly a million thumbnails, so it's a decision I'm hesitant about. The reason I'm thinking about it in the first place is that we're upping thumbnail size by about 50%, which will add about 10% to the page size.
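
    A minimal sketch of the pattern being weighed, with a smaller image in the noscript branch as suggested (file names and the data-src convention are hypothetical; a script swaps data-src into src as thumbnails scroll into view):

        <img src="placeholder.gif" data-src="thumb-large.jpg" alt="thumbnail">
        <noscript><img src="thumb-small.jpg" alt="thumbnail"></noscript>

    Crawlers that don't execute JavaScript would then pick up the smaller file, which is exactly the crawlability trade-off in the question.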


  • Accented characters representation in the URL

    - by Dan
    We have support for various languages on our website, including Spanish, French and Swedish. For now, the links on the site are NOT encoded before being sent to the browser: we send the real accented chars where they exist (i.e. href="www.example.com/héllo.html") and not their hex representation. This works and looks good on all browsers, including Chrome, FF and IE. However, we care a great deal about SEO. We got a tip that encoding the links before sending them to the browser (so instead of linking to http://www.example.com/héllo, we would link to http://www.example.com/h%E9llo) will improve the way search engines 'understand' the links and the keywords in the URL. This involves some work on our side, so we wanted to know if there's truth in that tip, but couldn't find anything addressing this issue.
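
    One caution on the tip as quoted: %E9 is the Latin-1 byte for é, while browsers and current URL standards percent-encode the UTF-8 bytes instead. A quick check in JavaScript:

        encodeURIComponent('héllo');  // "h%C3%A9llo" (UTF-8), not "h%E9" (Latin-1)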


  • Google Analytics Visitors drop-off for certain region of site only

    - by crmpicco
    I have an issue with the tracking on my site: I have seen a dramatic drop-off of visitors to the site from a certain region. I have four regions on the site at the moment: UK, EU, US and RoW (Rest of the World). The UK, EU and US regions are unaffected; only the RoW region suffers this drop-off. I have included a screenshot from my GA account which shows this effect. My GA code, which is included on every page of the site, is below; I have changed the UA account number intentionally for this example. There have been no changes made to the GA account or the tracking code in the live environment for some considerable time, but for some reason I am seeing the drop-off for this region only. In the code below I am not tracking page views on certain pages, as I have event tracking set up for those pages.
        <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-18721873-5']);
        _gaq.push(['_setCookiePath', '/row/']);
        if ( typeof(p_page) != 'undefined') {
          // do nothing if user is on above pages
          // N.B. there are a series of conditions in this if statement
          // checking that we are not on a particular page
        } else {
          _gaq.push(['_trackPageview']);
        }
        </script>


  • Custom domain pointing to a Tumblr blog

    - by Julius
    My domain mydomain.com is registered with GoDaddy. I wish to host my Tumblr blog on this domain with NearlyFreeSpeech.NET hosting. My active nameservers at GoDaddy already point to my authoritative ones at NFS.net, and that is working. However, I'm baffled as to the correct configuration to point to my Tumblr. Preferably I'd like (A) my domain http://mydomain.com to host the blog and have http://www.mydomain.com redirect to http://mydomain.com. If this is too difficult, my next preference is (B) to have http://www.mydomain.com host the blog while http://mydomain.com redirects to http://www.mydomain.com. My third preference is (C) to have a sub-domain like http://tumblr.mydomain.com host the blog, and I guess have http://mydomain.com and http://www.mydomain.com both redirect to it. I've tried having two aliases, mydomain.com and www.mydomain.com, pointing to my permanent NFS IP at mydomain.nfshost.com, and when I try to add: (1) an A record pointing mydomain.com to the IP 66.6.44.4 as per Tumblr's instructions, it tells me I already have the bare domain as an alias, so I can't do that; (2) the A record on the www.mydomain.com alias - I can do this with www.mydomain.com set as an alias or not, but when I tried it with mydomain.com set as the canonical name, the result when visiting either mydomain.com or www.mydomain.com was the two continually redirecting to each other until an error was thrown. So I was wondering if there is a ninja who could save me some hair-pulling and tell me the correct way to configure A, or else B, or else C.
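
    For reference, a hedged sketch of what preference (A) usually looks like at the DNS level, using the A-record IP from Tumblr's instructions quoted above; the CNAME target for www is an assumption based on Tumblr's setup docs of the era, not something NFS.net prescribes:

        ; preference (A): bare domain hosts the blog
        mydomain.com.      IN  A      66.6.44.4
        ; www hands off to Tumblr, which redirects to the configured bare domain
        www.mydomain.com.  IN  CNAME  domains.tumblr.com.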


  • Looking for free, specific Ip2Location Database

    - by Andresch Serj
    I am searching for a free database (like a regularly updated XML or CSV file) that relates IP addresses to specific locations. I want more information than just the country: I want some sort of region or city reference, even if that ends up being a number that makes no sense to me. It doesn't have to be super accurate or always up to date either. It is just to distinguish between user groups, not to monitor or spy on them.


  • Usual Suspects: Typical 3rd Party Entities in E-Commerce [closed]

    - by zharvey
    I am doing some requirements analysis for a web app that I'd like to build (Ruby/Java developer here). This web app would have a store front and shopping cart, and would need to be totally compliant with all e-commerce best practices. It's amazing how much non-technical info comes up when you search for phrases like "how does e-commerce work", but very little comes up in the way of technical details. As such, I'm having extreme frustration finding answers to what I consider pretty straightforward questions. I came here because I believe this question is not off-topic; if it is, please leave a comment as to why it does not belong here and I will happily remove it myself (upvotes if your comment can point me to the correct place for this question!). So then:
    What 3rd parties will I need to work with to have a modern, web-compliant e-commerce site? So far I can account for a payment gateway provider like Authorize.net and an SSL certificate provider like Trustwave. Any others?
    What other standards besides PCI compliance will I be held to (besides governing laws, of course!)?
    Vulnerability scans: PCI compliance requires quarterly scans. If I'm a "Level 4" (low-volume) merchant, does that still apply to me? Regardless, my backend architecture is quite large, with web servers, app servers, a database, message brokers and more. Does each of these servers need to be scanned? If not, which servers do need these quarterly scans?
    I usually hate to ask micro-questions inside of one large one, but these are so closely related that asking them all separately felt like spamming the site with too many petty questions. Thanks in advance!


  • Help redirecting an IP address

    - by Alice
    Google has indexed the IP address of my site rather than the domain, so now I'm trying to set up a 301 redirect that will redirect the IP address and all subsequent pages to the domain. I currently have something like this in my .htaccess file (however, I don't think it's working correctly):
        RewriteCond %{HTTP_HOST} ^12.34.567.890
        RewriteRule (.*) (domain address)/$1 [R=301,L]
    I've used various redirect checker tools and keep getting the message: "... not redirecting to any URL or the redirect is NOT SEARCH ENGINE FRIENDLY". Am I doing something wrong, or is there something else I should be trying? Thanks! Alice
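
    For what it's worth, a hedged correction of the snippet above: the dots in the IP should be escaped, the pattern anchored, and the substitution must be an absolute URL for the redirect to be external (example.com stands in for the real domain; the placeholder IP is kept from the question):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^12\.34\.567\.890$
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]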

