Search Results

Search found 5380 results on 216 pages for 'webmasters'.


  • Using only password to authenticate user (no "username" field)

    - by Guy
    I am creating a client access system that allows clients to manage invoices, make payments, access information about their products, and similar functionality. There are expected to be fewer than 1000 clients. Would there be any security threat in using only a password (a UUID v4 string) to authenticate a user? My thoughts: there is virtually no probability of a collision or of a successful brute-force attack ( http://en.wikipedia.org/wiki/UUID#Random%5FUUID%5Fprobability%5Fof%5Fduplicates ); it is user friendly (one click and go); and it is not intended to be remembered.
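
    A minimal sketch of how such a token could be generated and checked in PHP, assuming PHP 7+ for random_bytes(); the verification helper and how the token is stored are hypothetical details:

        <?php
        // Sketch only: generate a version-4 UUID from a CSPRNG (random_bytes needs PHP 7+).
        function uuid_v4() {
            $b = random_bytes(16);                    // 122 random bits after versioning
            $b[6] = chr((ord($b[6]) & 0x0f) | 0x40);  // set version to 4
            $b[8] = chr((ord($b[8]) & 0x3f) | 0x80);  // set RFC 4122 variant
            return vsprintf('%s%s-%s-%s-%s-%s%s%s', str_split(bin2hex($b), 4));
        }

        // Hypothetical check: compare the submitted token against the stored one
        // in constant time, so timing differences leak nothing.
        function token_matches($storedToken, $submittedToken) {
            return hash_equals($storedToken, $submittedToken);
        }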


  • What options are there for integrating with payment gateways? [closed]

    - by Rowland Shaw
    It seems there are only two types of payment gateway service out there at the moment: either the entire cart logic is handled off-site (with something like PayPal's Standard option), or you go through certification for handling credit card numbers and do pretty much everything yourself. Ideally, for the project I'm working on, I'm after a bit of middle ground: I handle the cart on-site and only pass over to a payment gateway (with an order amount, billing & delivery details, and an order ref) for them to handle the card details, before passing back. I'm sure I've used e-commerce sites following this pattern before, but I cannot find any payment providers that offer this sort of option, so are there any? The only other requirement we have at present is that it must accept orders in sterling.
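
    For what it's worth, this middle ground is commonly called a hosted payment page: the order is collected on-site, then POSTed to the gateway, which handles card entry and sends the shopper back. A purely hypothetical PHP sketch of the handoff — the gateway URL, field names and signing scheme below are invented for illustration, not any real provider's API:

        <?php
        // Hypothetical hosted-payment-page handoff; none of these names are a real API.
        $fields = array(
            'merchant_id' => 'MY-MERCHANT',    // assigned by the (hypothetical) gateway
            'order_ref'   => 'ORDER-1001',
            'amount'      => '49.99',
            'currency'    => 'GBP',            // orders in sterling
            'return_url'  => 'https://www.example.com/payment/complete',
        );
        // Sign the payload so the gateway can verify it was not tampered with in transit.
        $fields['signature'] = hash_hmac('sha256', implode('|', $fields), 'shared-secret');
        ?>
        <form action="https://gateway.example/hosted-page" method="post">
        <?php foreach ($fields as $name => $value): ?>
          <input type="hidden" name="<?php echo htmlspecialchars($name); ?>"
                 value="<?php echo htmlspecialchars($value); ?>" />
        <?php endforeach; ?>
          <input type="submit" value="Pay securely" />
        </form>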


  • PHP W3 Validator API, Is this good? [closed]

    - by Josh Purcell
    I was trying to find a way to check whether my site's code was valid, but I kept going over to the W3 Validator by hand, so I decided to make an "API" (although it really isn't one!). I just want to know if anybody can find a better solution than the one I have made. This is what I currently use, called with ?uri=http://www.mydomain.com :

        <?php
        if (empty($_GET['uri'])) {
            echo "No URI!";
        } else {
            $CheckURI = "http://validator.w3.org/check?uri=" . $_GET['uri'];
            $URL      = file_get_contents($CheckURI);
            // pull the validator's verdict out of the result page's <title>
            $Start = strpos($URL, "<title>") + 7;
            $End   = strpos($URL, "</title>");
            $Title = substr($URL, $Start, $End - $Start);
            if (preg_match('/Invalid/', $Title)) {
                // code is INVALID
                echo "<a href='$CheckURI' title='This is not good!' target='_blank'>INVALID Source</a>";
            } elseif (preg_match('/Valid/', $Title)) {
                // code is VALID
                echo "<a href='$CheckURI' title='Check It Yourself!' target='_blank'>Valid Source</a>";
            } else {
                // something went wrong
                echo "";
            }
        }
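
    One possible improvement rather than scraping the page <title>: the W3C validator service has also exposed its verdict in HTTP response headers (X-W3C-Validator-Status, with values like Valid, Invalid or Abort). A sketch under that assumption — worth verifying the header name against the live service:

        <?php
        // Sketch: read the validator's verdict from its response headers instead of the HTML.
        $uri = 'http://www.mydomain.com';
        $headers = get_headers('http://validator.w3.org/check?uri=' . urlencode($uri), 1);
        $status = isset($headers['X-W3C-Validator-Status'])
                ? $headers['X-W3C-Validator-Status']   // "Valid", "Invalid" or "Abort"
                : 'Unknown';
        echo $status;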


  • Redirect error in Google Webmaster Tools report

    - by Aurelio De Rosa
    I built a CMS and used it to create the website http://www.tkdmontecatini.com . After a few days, Google Webmaster Tools started reporting several "Redirect error"s on pages such as http://www.tkdmontecatini.com/it/photogallery , http://www.tkdmontecatini.com/it/pagina/9/Informazioni/Corsi/Chi-Siamo and http://www.tkdmontecatini.com/it/pagina/2/Informazioni/Eventi/Eventi . The funny things are: if I access those links from a browser, everything is fine and there are no redirect loops or similar issues; and if I use the "Fetch as Googlebot" function, I get a "Success" result. Question: any idea why this happens, and how can I fix it?


  • Deny access to a folder on hosting server but serve the pages

    - by Sourav
    My hosting server allows hosting multiple websites. The directory structure is like this:

        root
        |_ www.a.com
        |_ www.b.com
        |_ www.c.com
        |_ www.d.com

    I want to put some PHP files in the www.d.com folder so that anyone browsing the site from a web browser can use them, but no one can get their source code (even by logging in to the root folder). Is there any way of doing so? There is a feature called "password protect folder" or some such; can it help in this case?


  • Apache DVB HTTP video streaming bandwidth or priority problem

    - by igino manfre'
    I'm streaming a few precompressed DVB videos from the cloud. The streams are generated by VLC on "impossible" ports (such as 64085, 64086, etc.) and reverse proxied by Apache on ports 80 and 8080. All the generated streams are listed at http://95.110.164.61/indexv.html . From an ADSL connection with enough downlink bandwidth, requesting a stream directly from VLC (such as http://95.110.164.61:64087/mpg2_6.4 ) plays fluently. Requesting the same stream proxied by Apache ( http://95.110.164.61/mpg2_6.4 ), the stream stops and goes. The only situation in which the Apache-proxied streams flow regularly is from a site connected through a 64 Mbps guaranteed-bandwidth link with an RTT to the server of less than 10 ms. Note that streams below 2 Mbps are proxied fluently. The system is a single-core Xeon running Windows 2008 R2, with 4 GB of RAM and 1 Gbps of network bandwidth. The drain on computational and bandwidth resources is negligible, with RAM usage always below 50%. On the system I run many VLC streamers; each of them uses a variable amount of RAM (from about 25 to 70 MB). By contrast, the couple of httpd.exe processes use no more than 7 MB. Using Wireshark (on the server) I see that VLC directly sends the client many more packets than Apache does, and that the stream is fragmented across many frames. I'm not a programmer, and a newbie with Apache. Can anyone please point me to the relevant portion of Apache's huge documentation? Thank you. igino


  • With a dash or not in the domain? [closed]

    - by menardmam
    Possible Duplicate: Is it better to put hyphens in a domain name? I have to buy a domain; it can be either somebigcompany.com or some-big-company.com. I know it's not a simple answer, but I'd like to know your point of view. The client wants it with no dash; I think that for Google, and for readability, the dash is better. I have talked to an SEO expert who said that since the Panda update of the Google algorithm, a dashed domain will be penalized. I don't believe it! After your answers, I'll go to GoDaddy to buy it. Thanks in advance.


  • Alternative to nofollow: custom 302 url shortener?

    - by Dogweather
    Here's the scenario: lots of blogging platforms make it tedious to insert nofollow into links within the post content, i.e. you need to edit the HTML, format it correctly, etc. I have a client who posts lots of content with links that should be nofollowed, and I thought of a novel way to handle this, since the blogging platform they're using makes it hard: I install a URL-shortener web app on the client's domain. The shortener works as normal, except that it redirects via 302 instead of 301. The PageRank will therefore stay at the shortener's domain and not flow on to the target site. Part 2: in order to get the PageRank to collect somewhere meaningful, say on the site's home page, the shortened URLs would be generated like this: /link?12345 instead of /link/12345. Then the path /link would 301 to the home page. This way, the id is a query parameter, not a path element, and thus all the incoming shortened links go to one path, which transfers PageRank to the home page. So that's my idea; I wanted to see if anybody could find problems with it. Thanks!
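
    A minimal PHP sketch of the scheme, assuming /link is routed to this script; the lookup table below is a hypothetical stand-in for whatever storage the shortener app uses:

        <?php
        // Sketch of the 302 shortener described above.
        $targets = array(
            '12345' => 'http://some-target-site.example/article',  // hypothetical entry
        );
        // With URLs like /link?12345 the id arrives as the raw query string.
        $id = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';
        if ($id !== '' && isset($targets[$id])) {
            header('Location: ' . $targets[$id], true, 302);  // temporary: PageRank stays here
        } else {
            header('Location: http://client-blog.example/', true, 301);  // bare /link -> home page
        }
        exit;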


  • When acquiring a domain name for product xyz, is it still important to buy .net and .org versions too?

    - by Borek
    I am buying a domain name for service xyz, and obviously I have bought the .com in the first place. In the past it was automatic to also buy the .net and .org versions, but I've been asking myself why I would do that. To serve customers who mistakenly enter a different TLD? (Would someone accidentally do that these days?) To avoid the chance that a competitor will acquire those TLDs and play some dirty game on my customers? If there is a good reason, or a few, to buy the .net and .org versions these days, I'd like to see them listed. Thanks.


  • .htaccess correct, but Apache logs still show the evil visitors with a 200 code

    - by bulgin
    I hope someone can help me. Please take a look at the following snippet from my Apache logs:

        95-169-172-157.evilvisitor.com - - [12/Nov/2012:09:46:02 -0500] "GET /the-page-I-dont-want-to-deliver.html HTTP/1.1" 200 9171 "http://hackers.ru/" "Mozilla/4.0 (MSIE 6.0; Windows NT 5.1; Search)"

    I have the following in the .htaccess for the root directory of the website, and there are no other .htaccess files anywhere that would affect this:

        RewriteEngine On
        Options +FollowSymLinks
        ServerSignature Off
        ErrorDocument 403 "Nothing Interesting Here"
        order allow,deny
        deny from evilvisitor.com
        deny from hackers.ru
        deny from anonymouse.org
        allow from all

    I also have GeoIP functioning properly, with this included there:

        # for requests from different countries
        RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^(UA|TR|RU|RO|LV|CZ|IR|HR|KR|TW|NO|NL|NO|IL|SE)
        RewriteRule ^(.*)$ - [F,L]

    I know this works because whenever I attempt to access the website from a proxy in, say, Spain, I get the error message. I also know it works because when accessing the website from anonymouse.org, the proper error code page is displayed. So why am I still getting these visitors successfully accessing the page I don't want them to see, with an Apache 200 code, when it should be an error code?


  • Country specific content vs global content

    - by Ando
    I have a global product presentation website, myproduct.com. For certain countries I also own the country domain: myproduct.co.uk, myproduct.com.au, myproduct.es, myproduct.de, etc. The presentation website is translated into multiple languages, and I set up redirects: myproduct.es redirects to myproduct.com/es/, myproduct.de redirects to myproduct.com/de/, and so on. The content so far is the same, just translated into different languages. The advantage is that it's easy to keep the content aligned: everything is managed from one centralized dashboard (I'm using WordPress with qTranslate). Now I'm running into trouble, because for different countries I want localized content: for the UK I want to run different promotions and use a different reseller than for .com.au, so I would like users coming from myproduct.co.uk to see something different from those coming from myproduct.com.au (and not be redirected to myproduct.com as they are right now). How can I achieve this? I could duplicate the whole main website and modify only certain parts, but then I would have a lot of duplicate content (e.g. info about how the product works) and pages that are likely to change (the FAQ page) that I would have to keep updated across all the websites. Or I could duplicate the main website only partially: the localized website would have only the pages that are different, and all other links would point to the .com site. This would solve the duplication problem, but would confuse users, who would navigate from .co.uk to .com without noticing and then wonder how to get back. Is there another, better option?
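
    A minimal sketch of a third option, assuming all the country domains can point at the single shared install: choose only the country-specific blocks by host name, so the shared pages (FAQ, how the product works) stay maintained once. The include file names below are hypothetical:

        <?php
        // Sketch: one shared install, per-country blocks chosen by host name.
        $host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';
        $overrides = array(
            'myproduct.co.uk'  => 'promos/uk.php',   // UK promotions and reseller
            'myproduct.com.au' => 'promos/au.php',   // AU promotions and reseller
        );
        if (isset($overrides[$host])) {
            include $overrides[$host];
        } else {
            include 'promos/default.php';            // everyone else sees the .com version
        }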


  • How frequently does Googlebot fetch sitemaps? Does it depend on PageRank?

    - by JITHIN JOSE
    How frequently does Google fetch sitemaps? I am working with a high-traffic website that normally gets 30 new posts per minute. Currently it provides a sitemap linking to the newest 100 posts (about 3 minutes' worth). Is this method enough? Do bots fetch sitemaps every 3 minutes? Do I need to change the sitemaps to list all 5M posts (a sitemap index)? How would this change affect traffic and PageRank? And does Googlebot remove URLs that were previously listed in the sitemap but no longer are?
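
    For reference, the sitemaps protocol caps each file at 50,000 URLs, so 5M posts means a sitemap index pointing at about 100 child sitemaps. A minimal PHP sketch of such an index — the URL layout is a hypothetical example:

        <?php
        // Sketch: emit a sitemap index for 5M posts split into 50,000-URL child
        // sitemaps (the protocol's per-file cap).
        header('Content-Type: application/xml; charset=utf-8');
        $totalPosts = 5000000;
        $perFile    = 50000;
        $files      = (int) ceil($totalPosts / $perFile);   // 100 child sitemaps
        echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        echo '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        for ($i = 1; $i <= $files; $i++) {
            echo "  <sitemap><loc>http://example.com/sitemaps/posts-$i.xml</loc></sitemap>\n";
        }
        echo "</sitemapindex>\n";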


  • Check when a page was first seen by Google

    - by sam
    Is there a way to see when a page was first published, by checking when it was first cached by Google? (Obviously it's not 100% foolproof, because in some cases there's a couple of days' delay, but it would give you a good idea.) The only other way I could think of to check a published date is if the page/post had a publicly visible timestamp on it, but in the case I'm looking at, it doesn't have one.


  • How to point one sub-domain to another so they can be used interchangeably

    - by Talon
    I'm trying to do this: secure.domain2.com loads content from secure.domain1.com. So if somebody goes to secure.domain2.com, it will load the content of secure.domain1.com. Note that I don't want a redirect: if someone goes to secure.domain2.com, the address bar should still say secure.domain2.com even though it's loading content from secure.domain1.com. I've read that it's possible with a CNAME or something like that. What is the best way to do it?
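
    A CNAME alone only points the name at the other host; whatever answers for secure.domain2.com still has to serve (or proxy) domain1's content. As a crude illustration of the non-redirect behaviour, here is a hypothetical PHP pass-through — it assumes allow_url_fopen, copies no headers, and is no substitute for a real reverse proxy:

        <?php
        // Crude sketch: fetch the same path from domain1 and emit it under domain2's name.
        $path = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';
        readfile('https://secure.domain1.com' . $path);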


  • WordPress plugin installation error

    - by Steve
    I'm trying to upload secure-wordpress.1.0.6, and I receive the following error:

        Warning: touch() [function.touch]: open_basedir restriction in effect.
        File(/abs_path/wordpress/tmp/secure-wordpress.tmp) is not within the allowed path(s):
        (/abs_path/:/abs_path/:/usr/local/lib/php:/tmp/php_upload)
        in /abs_path/public/www/wordpress/wp-admin/includes/file.php on line 199
        Download failed. Could not create Temporary file.

    The /wp-content folder and all its subfolders have 777 permissions. I've added the following two lines to wp-config.php:

        putenv('TMPDIR=' . ini_get('upload_tmp_dir'));
        define('WP_TEMP_DIR', ABSPATH . 'wp-content/uploads/');

    What else should I try? I am using WordPress 3.04 in a PHP 4.49 environment.


  • Build my own CDN service

    - by user5332
    I have two servers, each with its own domain: the first is www.myexample1.com and the second is www.myexample2.com. I would now like to set up www.myexample2.com as a CDN for www.myexample1.com, but I don't know how to set up DNS or Apache so that both servers serve files for www.myexample1.com requests. I don't need to solve databases, sessions or anything like that; I just need to know how to make both servers available as www.myexample1.com.


  • How do I word my URL so that it doesn't get blocked or appear spammy?

    - by user18681
    I'm creating a fairly large site. Will my links appear spammy if I use the same word in the directory as in the page slug? For example:

        www.example.com/apples/great-apple-recipes
        www.example.com/apples/fresh-apple-pie
        www.example.com/apples/delicious-apple-turnovers

    I do not want my links to appear spammy, but is it OK if the keyword in the slug is almost always the same as the directory name on a huge site? Does the directory name count as part of the keyword? Also, how many words in total should a URL (directory and slug included) be?


  • Webserver insists on opening "blog1.php" instead of "index.php"

    - by pepoluan
    I'm at my wits' end. I have just ripped out a website and am in the process of rebuilding everything. Previously, the 'home page' of the website was a blog, with the address www.mydomain.com/blog1.php. After exporting everything, I deleted the whole directory and, based on a request, immediately created a blog/ directory. The idea is to get the blog back up as soon as possible and temporarily redirect people accessing www.mydomain.com to the blog. Accessing the blog via http://www.mydomain.com/blog/ works. So I put in an index.php file containing a (temporary) redirect to the blog's address. The problem: the server insists on opening blog1.php instead of index.php, even after we deleted all the files (including .htaccess). Even putting in a new .htaccess file with the single line DirectoryIndex index.php doesn't work; the server stubbornly wants blog1.php. Now, the server is actually shared webhosting, so I have no direct access to it; I have to do my work via cPanel. Currently, I work around the issue by creating blog1.php, but I really want to know why the server does not revert to opening index.php. Did I perhaps miss some important setting in the byzantine cPanel menu pages?


  • Will my site containing duplicate content be accepted into AdSense?

    - by user5858
    I have a new site, just over 6 months old, with 50 unique visitors daily. It has a good amount of duplicate pages, which are not copyrighted; for example, I've copied related companies' product FAQs "as is" onto the site (and I'm not supposed to modify a company's product FAQs). I fear my login may be banned by AdSense if I submit it, so I want to know: 1) whether I can submit the site for an AdSense account; 2) whether Google can penalize me, and in what way; 3) how Google would come to know that the duplicate content on my site is not copyrighted.


  • Indexing and Page Ranking Issues

    - by user631249
    Hi all. I am on the first page of Google for keywords concerned with MOVING, but I can't seem to break into the carpet-cleaning rankings. I have made changes and additions which haven't been indexed yet. Should I wait for the next crawl, or can someone please give me pointers on the carpet-cleaning indexing? Also, I have 53 pages submitted and only 38 indexed; where could the problem be? Is there software to check indexing hiccups? Thanks.


  • Access denied 403 errors after migrating my site

    - by AgA
    I've recently migrated my Joomla site from one shared host to another with HostGator. GWT notified me about many 403 "access denied" pages. I've checked with Firebug too: even though the browser displays the full page correctly, the HTTP return code is 403. I've checked the home page, and it correctly returns a 200 response. The same is shown by Fetch as Google in GWT (pasted below). The site is 3 years old and I regularly do such migrations. I copied the files and database "as is", and I've even cleared all the caches, but no luck. There is only one change: previously the site was the primary domain, but now it's an add-on one. What could be the issue? This is how Googlebot fetched the page:

        Fetch as Google
        URL: http://MYSITE.COM/-----------------REMOVED.html
        Date: Thursday, June 20, 2013 at 10:32:14 PM PDT
        Googlebot Type: Web
        Download Time (in milliseconds): 3899

        HTTP/1.1 403 Forbidden
        Date: Fri, 21 Jun 2013 05:32:15 GMT
        Server: Apache
        P3P: CP="NOI ADM DEV PSAi COM NAV OUR OTRo STP IND DEM"
        Expires: Mon, 1 Jan 2001 00:00:00 GMT
        Cache-Control: post-check=0, pre-check=0
        Pragma: no-cache
        Set-Cookie: 0e4f6b53991c80cf39d57a6db58bb58d=ee2d880e8db0f1fc03c5612ea5a77004; path=/
        Last-Modified: Fri, 21 Jun 2013 05:32:19 GMT
        Keep-Alive: timeout=5, max=75
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/html; charset=utf-8

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en-gb" lang="en-gb" >
        <head>
        <base href="http://www.mysite.com/-----------------rajiv-yuva-shakthi-programme-finance-planning.html" />
        <meta http-equiv="content-type" content="text/html; charset=utf-8" />
        <meta name="robots" content="index, follow" />
        <meta name="keywords" content="" />
        <<<<<<TRIMMED>>>>>>>>>>>>>>


  • Keeping a Rackspace vserver alive

    - by mit
    It appears to me that Rackspace somehow freezes cloud VMs after some idle time. This means the first request to a PHP page takes much longer to respond than the subsequent requests. In some cases this is fine; in others it is not acceptable. I am currently querying the machine with wget from a different host to keep it "alive", but I wonder what frequency is necessary. Does anyone know the time period after which they send a VM to "sleep"? I guess it would be some minutes. EDIT: There is absolutely no caching involved on the PHP site. It just recently moved from another vhost, and there was never such latency on the first request.
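
    One way to estimate the necessary frequency empirically: probe the VM after increasingly long idle pauses and watch when the first-response latency jumps. A throwaway PHP CLI sketch, with a hypothetical target URL:

        <?php
        // Sketch: measure first-request latency after various idle intervals.
        $url = 'http://my-cloud-vm.example/some-page.php';   // hypothetical target
        foreach (array(60, 120, 300, 600, 1200) as $pause) {
            sleep($pause);                                   // let the VM go idle
            $t0 = microtime(true);
            file_get_contents($url);
            printf("after %4ds idle: %.2fs response\n", $pause, microtime(true) - $t0);
        }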


  • Opinions on .gr (Greek) registrars?

    - by Marc Bollinger
    None of the previous questions tackle the less common country registries beyond .co.uk, .it, et al., or else I'd have found an answer myself. I'm just looking for information for a vanity domain, so obviously I'm alright without an answer, but it's an unasked (or at least unanswered) question, and I'm not exactly in a hurry to give my credit card information across borders, sight unseen.


  • Using an old penalized domain for a new website

    - by MiladSafaei
    I had a website with two domains: firstdomain.com and first-domain.com. The main domain was first-domain.com; the other one was 301-redirected to it. The main domain received a Google Penguin penalty some months ago. I uploaded the site to a new domain and removed the old domain from Google's index using the Remove URL tool in Webmaster Tools. Now I want to use firstdomain.com (which was redirected to the penalized domain) for a new and fresh website with new and perfect content. Is it probable that the history of this domain will affect the new website and harm its ranking?


  • What should a page's minimum word count be in order to be effectively indexed?

    - by ZakGottlieb
    I'm seeding a new site with hundreds of (high-quality) posts, but since I am paying per word written, I'm wondering whether anybody in the community has any anecdotal evidence as to how many words of content a page should have to be weighted the same as a 700+ word post, for example. I know there are always examples of pages ranking well with, for instance, 50 words or less of content, but does anyone have strong evidence on what the minimum count should be, or has anyone read anything very informative on this issue? Thanks a lot in advance!

