Search Results

Search found 9728 results on 390 pages for 'meysam pro'.


  • Password protect an aliased virtual directory

    - by Jason
    I have a main domain hosted through cPanel. I also have a sub-domain that I would like to appear as a path under the main domain instead of as a sub-domain. So I have http://example.com/ pointing to the main hosted files, and http://example.com/mydir pointing to the sub-domain files. This is achieved by an httpd.conf include from the main domain section that sets an alias:

        Alias /mydir /path/to/subdomain/files/

    Now, that works fine so far. The problem is that if a .htaccess file under /path/to/the/subdomain/files/ contains an error, the alias is completely skipped, and /mydir goes instead to the main host files. That is kind of surprising to me; I would expect an error to return an error instead.

    Now the killer: if I try to password protect /path/to/subdomain/files/, then trying to access http://example.com/mydir will again deliver from under the main hosted files and not from /path/to/subdomain/files/. I am not seeing any errors reported for the .htaccess file in the Apache error log, so I am assuming the .htaccess is valid:

        AuthUserFile /path/to/valid/readable/.htpasswd
        AuthName "Secure Access"
        AuthType Basic
        Require valid-user

    This kind of behaviour does not seem right to me. Is there something obvious that could be causing it? Or is this just the way it works? Perhaps using an alias is the wrong way to go?
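
    One direction worth trying, sketched here under the assumption that the httpd.conf include is editable: move the auth directives into a <Directory> block next to the Alias, and disable .htaccess processing for that tree entirely, so a broken or ignored .htaccess can no longer knock out the alias. All paths are placeholders.

        Alias /mydir /path/to/subdomain/files/
        <Directory "/path/to/subdomain/files/">
            # Auth lives in the server config, not in .htaccess
            AuthType Basic
            AuthName "Secure Access"
            AuthUserFile /path/to/valid/readable/.htpasswd
            Require valid-user
            # No .htaccess parsing at all for this tree, so a broken
            # .htaccess file cannot silently disable the alias
            AllowOverride None
        </Directory>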


  • Is the Facebook Like JavaScript related to a "Time Spent Downloading a Page" increase in GWT?

    - by donaldthe
    Hi, I installed the Facebook Like button (JavaScript version) on my website on December 15th. Take a look at this report from Google Webmaster Central: Crawl stats, Googlebot activity in the last 90 days. The crawl stats are from Googlebot, which as far as I know doesn't execute JavaScript. Could the Facebook Like JavaScript code, "the XFBML version", be related to the large spike in Time spent downloading a page? (By the way, the huge spike in November was caused by a mistake where every image request was getting a 301.) I'm not sure what caused the spike to go down by half somewhere in December. It may have been related to a faulty setting in web.config. I'm at a loss as to what I can do about this, or even how to tell whether this is my problem or Googlebot's crawl problem. Here is the Facebook code I am using to create the like button. It is right after the opening body tag:

        <div id="fb-root"></div>
        <script>
          window.fbAsyncInit = function() {
            FB.init({appId: 'xxxxx', status: true, cookie: true, xfbml: true});
          };
          (function() {
            var e = document.createElement('script');
            e.async = true;
            e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
            document.getElementById('fb-root').appendChild(e);
          }());
        </script>

    and this creates the like box:

        <fb:like show_faces="false"></fb:like>

    If the JavaScript can't be the problem, any ideas on where to start looking would be appreciated.


  • Why does googling for keycaptcha give results on reCAPTCHA? [closed]

    - by vgv8
    EDIT: I'd like to change this title to: How to STOP Google's manipulation of the Google search results presented to the general public?

    I google frequently, and more and more often, when searching for a particular software product, I am given results on Google's own products instead. For example, if I google for the keyword keycaptcha for the "Past 24 hours" (after clicking "Show search tools" and then "Past 24 hours" in the left sidebar of the browser), Google's search results show only results on reCAPTCHA. (Image uploaded later.) Though, if I confine keycaptcha in quotes, the results are "correct" (well, kind of, since they are still distorted in comparison with other search engines). I checked this over a few months, from different domains, at different ISPs, on different operating systems, and from a dozen browsers. The results are the same. Why is this, and how can it possibly be corrected? My related posts: "How Gmail spam filter works?", IP addresses blacklisting.

    Update: It is impossible for me to directly use google.com, as I am always redirected from google.com to google.ru by Google's ip-address "auto-detect location" convenience. Google's help says that my location auto-detection cannot be switched off, because it is a very helpful feature. There is a workaround: use google.com/ncr to reach google.com and prevent the redirection (does anybody know what "ncr" means?). But all results are exactly the same. OK, I can search with a quoted "keycaptcha"; I am already accustomed to these Google quirks. But the question arises: why the heck burn time promoting a product if GOOGLE shows its own brand (reCAPTCHA) in place of other product brands, and what can be done about it? The general user will not understand that he was cheated, and will just pick up the first (wrong) results.

    Update 2: Note that this googling behaviour is independent of whether I am logged in to (or logged out of) a Google account, of which account, of the browser (I tried Opera, Chrome, Firefox, IE of different versions, Safari), of the OS, and even of the domain. There are many such cases, but I targeted one concrete, restricted example specifically to prevent wandering among unrelated details and peculiarities. @Michael, first, that is not true, and this text contains 2 links to real and significant results. I also wrote that this is just one concrete example of many, based on many months of experience. These distortions happen upon clicking Past 24 hours, Past week, Past month, or Past year, with many other keywords, and in other occasions/configurations of searches, etc. Second, the absence of results is itself a result, and there is no point in sneakily substituting another, unsolicited one for it; that is the definition of spam and scam. Third, the question is not about workarounds such as how to write search queries or how to use other search engines. The question is how to straighten out Google's results in order to stop disorienting the general public.

    Update: I could not understand: can nobody reproduce the behaviour I described (i.e. when I click the "Past 24 hours" link in a Google search for keycaptcha, the results presented are only on reCAPTCHA)?

    Update: And for the "Past week":


  • IE8 HTTPS Download Issue

    - by Jon Egerton
    I have a problem with a system I develop, related to IE8 downloading over SSL (i.e. on sites using https://...), as described in this MS KB article: http://support.microsoft.com/kb/323308. We use the HttpCacheability.NoCache option, as the data being downloaded is sensitive and is downloaded from a secured site. I don't want that data to be cached on any of the proxies etc. that the response passes through on its way back to the client. The article describing the issue details a client-side fix: a registry change to a BypassSSLNoCacheCheck setting. I don't want to loosen the system security just for IE8, as the system works fine on anything more up to date. Getting all the clients to apply the hotfix is difficult at best and impossible at worst. We need to support IE8 in the system, at least for now. So:
    1. Does the detailed hotfix have any implications for security at the browser end in IE8? Does it mean the file will be cached (in a place other than where the user saves the file)?
    2. Is there some way I can get these files downloadable with a change at the server end that doesn't break the security side of things?
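
    One commonly suggested server-side angle, sketched here under the assumption that an ASP.NET handler streams the download (the file name is a placeholder): IE8 has to be allowed to write the file to its local cache before it can hand it to the Save dialog, so the trick is to drop the no-cache headers on the download response only, using Cache-Control: private to keep shared proxies out of the picture.

        // Hypothetical ASP.NET download handler (a sketch, not the original code).
        // HttpCacheability.Private stops shared proxies from caching the
        // response but still lets IE8 write the file locally, avoiding the
        // KB 323308 failure triggered by no-cache over SSL.
        Response.ClearHeaders();
        Response.Cache.SetCacheability(HttpCacheability.Private);
        Response.ContentType = "application/octet-stream";
        Response.AppendHeader("Content-Disposition", "attachment; filename=report.pdf");
        Response.WriteFile(filePath);
        Response.End();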


  • How to get a new site indexed by Alexa

    - by JohnMerlino
    When I search for my site on Alexa, it says "Alexa Traffic Rank: No Data". So I googled the issue and came across this page: http://www.loudable.com/my-website-data-not-showing-in-alexa-get-your-website-crawled-by-alexasolution.html. It says that to get a site indexed, you click "Crawl my site" on the webmasters page. However, there is no longer a link that says "Crawl my site". So, as of now, does anyone know how to get a site indexed by Alexa so that my traffic rank will display in the Alexa index?


  • Chrome trims the last <li> element in a row [closed]

    - by Paul
    Ok guys, I give up. Here's the code I'm struggling with:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
          <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
          <title>Blah</title>
          <style type="text/css">
            #container {
              margin: 0 auto;
              width: 350px;
              border: 1px solid #ccc;
            }
            ul {
              list-style-type: none;
              margin: 0;
              padding: 0;
              text-align: center;
            }
            ul li {
              display: inline;
              padding: 5px;
              margin: 0 1px;
              background-color: lime;
              line-height: 2em;
              /* border: 1px solid red; */
            }
          </style>
        </head>
        <body>
          <div id="container">
            <ul>
              <li>Element A</li>
              <li>Element B</li>
              <li>Element C</li>
              <li>Element D</li>
              <li>Element E</li>
              <li>Element F</li>
            </ul>
          </div>
        </body>
        </html>

    Why the heck does Chrome trim the right side of "Element D" (even though there is enough space to display the whole item), while Firefox and even Internet Explorer render this code properly? It becomes more visible when we apply the commented border. In other words, is there a way to tell the browser that I want every single <li> element to be autonomous, and thus to move it to the next row if it doesn't fit entirely in the previous one? Can't wait to see the solution, thanks in advance.
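
    One direction that is often suggested for exactly this symptom (a sketch, not a verified fix): with display: inline, the horizontal padding hangs off the line box and can be clipped at a line break, whereas display: inline-block makes each <li> an indivisible box that wraps to the next line as a whole.

        ul li {
          display: inline-block;   /* each item wraps as one unbreakable box */
          padding: 5px;
          margin: 0 1px;
          background-color: lime;
          line-height: 2em;
        }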


  • How to filter traffic coming to a particular page from another page?

    - by BishKopt
    I've got page A linking to page B. There are also other pages linking to B. How can I see the traffic that is coming to page B ONLY from page A? I can somehow do it via Behavior flow:
    1. Behavior > Behavior flow
    2. [Right click on anything] Explore traffic through here
    3. [Click edit icon] Define a page group
    4. [Right click] Group details
    5. [Dropdown] Incoming traffic
    But how do I do it in normal reports? Is there any way to filter out only the traffic coming from a particular page?


  • Need suggestions on how to create a website with an encrypted database.

    - by SFx
    Hi guys, I want to create a website where a user enters content (say, a couple of sentences) which eventually gets stored in a backend database (maybe MySQL). But before the content leaves the client side, I want it to be encrypted on the client, using something like JavaScript. The data will travel over the web encrypted but, more importantly, will also be permanently stored in the backend database encrypted. Is JavaScript appropriate to use for this? Would 256-bit encryption take too long? Also, how do you query an encrypted database later on if you want to pull down the content that a user may have submitted over the past 2 months? I'm looking for tips, suggestions and any pointers you guys may have on how to go about learning about and accomplishing this. Thanks!
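
    A minimal client-side sketch, assuming a modern browser with the Web Crypto API (an assumption; the key handling below is purely illustrative). The last question largely answers itself here: if the key never leaves the client, the server sees only ciphertext and cannot query the content, so lookups have to go through unencrypted metadata such as user id and submission date.

        // Encrypt a string with AES-GCM before it leaves the browser.
        // The key stays client-side (derived from a passphrase, kept in
        // local storage, etc.), so the server only ever stores ciphertext.
        async function encryptText(plainText, key) {
          const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh IV per record
          const bytes = new TextEncoder().encode(plainText);
          const cipher = await crypto.subtle.encrypt({ name: 'AES-GCM', iv: iv }, key, bytes);
          return { iv: Array.from(iv), data: Array.from(new Uint8Array(cipher)) };
        }

        // Illustrative 256-bit key generation (in practice, derive it from
        // a user passphrase, e.g. with PBKDF2, so it can be recreated on login):
        crypto.subtle.generateKey({ name: 'AES-GCM', length: 256 }, true, ['encrypt', 'decrypt'])
          .then(function (key) { /* use encryptText(text, key) */ });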


  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files in my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. They obviously don't meet HTML standards (no body, head, etc.) as independent entities. I would like to prevent search engines from indexing these pages, but do not have access to /robots.txt (which would be much more ideal). My question is, could I include the following at the top of these partial HTML files and get the desired results?

        <meta name="robots" content="noindex, noarchive">

    I guess there are two parts to this question:
    1. Will this cause any rendering issues in any browsers?
    2. Will search engines (at least Google & Bing) interpret this as intended?
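
    For what it's worth, there is a header-based alternative that sidesteps the markup question entirely, sketched here on the assumptions that .htaccess (unlike /robots.txt) is writable and that the fragments share a name pattern (the pattern below is made up): the X-Robots-Tag response header carries the same noindex signal without touching the file contents.

        # .htaccess: serve the robots directive as an HTTP response header
        # for every AJAX fragment; requires mod_headers.
        <FilesMatch "\.fragment\.html$">
            Header set X-Robots-Tag "noindex, noarchive"
        </FilesMatch>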


  • Desktop Software to monitor online status of web site and web-based application

    - by pansp
    I'm basically looking for desktop-based software which can monitor my company's website and our web application's online availability. I know there are a few online applications, like Uptime Robot, which do the same work, but I have been asked to find a desktop-based application that can run in the system tray and notify of any downtime. Free software would be great. Any help would be appreciated. Thanks!


  • URL hex characters in .htaccess

    - by Steve
    There is an old page with a space in the filename, and this page is no longer found on the website, so I need to redirect it to another page using a 301 redirect in .htaccess. If I place the filename directly into .htaccess (Bouquets%20%26%20Loose.html), the redirect does not work. If I escape the % signs like this (Bouquets\%20\%26\%20Loose.html), the redirect still does not work. How do I get this redirect to work in .htaccess? Thanks.
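
    A sketch of the mod_rewrite route that usually resolves this, assuming the old file was literally named "Bouquets & Loose.html" (the target URL below is a placeholder): mod_rewrite matches against the already-decoded URL path, so the pattern has to contain the literal space and ampersand rather than the %20/%26 sequences.

        # .htaccess: match the decoded filename, not the percent-escapes;
        # quoting the pattern keeps the embedded space intact.
        RewriteEngine On
        RewriteRule "^Bouquets & Loose\.html$" /new-page.html [R=301,L]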


  • Buying country-specific domain names: .fr, .co.uk, .de, .com.au, .sg, etc.

    - by user700580
    I already bought a .com domain name on GoDaddy for my company. I would like to reserve the same name with country-specific domain extensions, but I'm not sure where to buy them or how to do it. Here are the ones I would like to buy:
    Europe: .fr, .co.uk, .de, .ch, .es, .it, .nl, .se, .no, .ru
    Australia: .com.au
    Asia: .sg
    GoDaddy has all except one in Europe, Australia and Singapore. Should I find a website that sells all of them, or should I buy some of them on GoDaddy and others elsewhere? Any suggestions on where to buy them? Until now I've only ever bought .com domain names, so I'm not sure how to do it. Thanks


  • Hosting advice for a write-heavy dynamic website

    - by Rahul Rawat
    I have built a website using PHP and MySQL and now I am looking for a hosting service. I am expecting about 1,000 users registering and about 5-10k pageviews/day within a week's time. So which host should I opt for? The site will let users submit content stored as blobs and upload around 10 pictures per user. I hope that traffic will increase, so can JustHost's or BlueHost's shared hosting serve that purpose, or should I go for something more dedicated? Basically, the site is write-heavy, there are on average 2-3 MySQL queries per page, and it is quite dynamic. Depending on these requirements, which web hosting would be optimal for me?


  • SEO for images: can I use a different (cookieless) domain?

    - by Oliver
    Hello, We want to increase the value of some of our important images by means of SEO, and we want to start serving them from a different, i.e. cookieless, domain. We want to go from http://www.example.com/images/1234.jpg to http://www.example.com/germany/bavaria/landscape.jpg which can easily be done via URL rewriting. Then on the other hand, we would like to serve the image from a completely different domain, let's say http://www.examplestatic.com/germany/bavaria/landscape.jpg, to save the overhead of sending the cookie from www.example.com. Somehow I feel that this is not a good idea because I move the image away from the content by putting it on a different domain. Can anyone shed some light on this problem? Naturally, I would just use a different subdomain, e.g. img.example.com, but we already use subdomains for languages and our cookies are valid for all subdomains of example.com, so this won't help. I'd really appreciate any hints. Cheers,


  • Web Safe Area (optimal resolution) for web app design?

    - by M.A.X
    I'm in the process of designing a new web app, and I'm wondering what "Web Safe Area" I should optimize the app layout and design for. By Web Safe Area I mean the actual area available to display the website in the browser (which is influenced by monitor resolution as well as the space taken up by the browser and OS). I did some investigation and thinking on my own, but wanted to share this to see what the general opinion is. Here is what I found.

    Optimal Display Resolution:
    - w3schools web stats seems to be the most referenced source (however, they state that these are results from their own site and are biased towards tech-savvy users)
    - http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services)
    - StatCounter Global Stats Display Resolution (stats based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected from across the StatCounter network of more than 3 million websites)
    - NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from the browsers of site visitors to their on-demand network of live stats customers; the data is compiled from approximately 160 million visitors per month)

    Display Resolution Summary: There is a bit of variation between the above sources, but in general, as of Jan 2011, it looks like 1024x768 is about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these, with 15-20% of the total web depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers (the app I'm designing is Flash based, and Apple mobile devices don't support Flash, so iPad support isn't a concern). My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS Constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space consuming) and the OS (assuming WinXP-Win7):
    - Win7 has the biggest taskbar footprint, at a height of 40px (XP's and Vista's is 30px)
    - The default IE8 view uses up 25px at the bottom of the screen with the status bar, and a further 120px at the top of the screen with the window title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would instead be 91px without the favorites toolbar)
    - Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline
    This means that we are left with 583px of vertical space and 1276px of horizontal. In other words, a Web Safe Area of 1276 x 583. Is this a correct line of thinking? I'm really surprised that I couldn't find this type of investigation anywhere on the web. Lots of websites talk about designing for 1024x768, but that's only half the equation! There is no mention of browser/OS influences on the actual area you have to display the site/app. Any help on this would be greatly appreciated! Thanks.

    EDIT: Another caveat to my line of thinking above is that different browsers actually take up different numbers of pixels depending on the OS they're running on. For example, under WinXP, IE8 takes up 142px at the top of the screen (instead of the aforementioned 120px for Win7), because the file menu shows up by default on XP, while in Win7 the file menu is hidden by default. So it looks like on WinXP + IE8 the Web Safe Area would be a mere 572px (768 - 142 - 30 - 24 = 572).
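
    As a cross-check on all this arithmetic, the usable area can simply be measured in the field; a small sketch (the stats endpoint named in the comment is hypothetical):

        // Log the viewport actually available to the page; this already
        // accounts for the visitor's taskbar, browser toolbars and scrollbars.
        var safeWidth = document.documentElement.clientWidth;
        var safeHeight = document.documentElement.clientHeight;
        // Report it, e.g. via a beacon to a (hypothetical) stats endpoint:
        // new Image().src = '/stats.gif?w=' + safeWidth + '&h=' + safeHeight;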


  • Tracking first visit date in Google Analytics

    - by dusan
    I want to track a site's visitor retention using Google Analytics, to see whether unregistered users are returning to it within a time frame of 2+ months from now. This blog post seems to be on the right track, but I want to track unregistered users, so I don't have a "join date" or similar variable at my disposal. This other blog post suggests using all 5 GA custom variables, using the first variable slot on the first week, variable 2 on week 2, etc. This method will only let me track 5 weeks of visitors. I want to track more than 5 weeks of visitors, so I was thinking of using two custom variables in GA: the visitor's first visit date, and the visitor's last visit date. How can I save the first visit date? Because if I save another value in the same slot, the old value will be overwritten, and I don't know how reliable it is to save that variable conditionally (reading the __utmv cookie to check whether the "first visit date" is set, and if it isn't set, saving the current date).
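
    A sketch of the conditional write with the classic ga.js API (the guard cookie name and slot numbers are illustrative): rather than parsing __utmv by hand, a small first-party cookie records whether the visitor-scoped slot has already been written, so the first visit date is set exactly once.

        // Write the "first visit date" custom variable only once per visitor.
        if (document.cookie.indexOf('first_visit_set=1') === -1) {
          var firstVisit = new Date().toISOString().slice(0, 10); // e.g. "2012-05-01"
          _gaq.push(['_setCustomVar', 1, 'First Visit', firstVisit, 1]); // scope 1 = visitor
          document.cookie = 'first_visit_set=1; max-age=63072000; path=/'; // ~2 years
        }
        // Slot 2 holds the "last visit date" and is refreshed on every pageview:
        _gaq.push(['_setCustomVar', 2, 'Last Visit', new Date().toISOString().slice(0, 10), 1]);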


  • My Xmap generated sitemap is not being submitted

    - by user2014989
    I'm using the Joomla Xmap component for creating a sitemap. Here is the URL of my Xmap-generated sitemap: http://www.acethehimalaya.com/index.php?option=com_xmap&sitemap=1&view=xml. I tried to submit my sitemap to Google, but the problem I'm facing is that the URL doesn't get submitted: it says the sitemap is empty. Can Xmap-generated sitemaps not be submitted, or am I doing something wrong?


  • Is it illegal to use content in such a way?

    - by MHZ
    I have a couple of questions about the legality of the content of some websites. I am currently working on two websites, and I would like to make sure I am not breaking any laws by using some content like I am...
    1. Do I need to get a license to use images from the Internet (such as google.images.com) on my site, assuming they aren't a company logo belonging to another company? If not, am I allowed to use an image after I modify it with image editing software?
    2. If content such as phone numbers, e-mail addresses, website addresses, and text from websites can be found for free online, and I gather this information for a search-engine-based site that I am working on and offer this information on a paid basis (similar to Google, but more specialized), is that legal? Note: I am not 'copying' or redirecting business from anywhere to my site. The exact opposite: the site I am working on actually helps advertise businesses and makes it easier for customers to find them.


  • Can I use a list of blog ping services for a portal?

    - by Ivanhoe123
    I'm setting up a list of ping services for a portal. It has a blog, forum, articles, restaurants, hotels and much other information, so it is far beyond a blog. I have a list of standard ping services for WP blogs, but I do not know if these are literally only for blogs. My questions are:
    1. Is it recommended to ping blog services from a portal, such as http://blogsearch.google.com/ping/RPC2?
    2. Are there any penalties for sites that are not recognized as blogs?
    3. Is there some list of ping services for regular websites and not only blogs?
    Thanks!


  • How to find the first cached date and time of a website?

    - by John Sanjay
    I found that some of my webpage content has been copied to another website. I need to check the cache date and time of both pages, so that I can compare them and guess which page is the original and which one is the duplicate according to Google. I know that if Google finds two webpages with the same content, it considers the page first cached by Googlebot to be the unique one. Is there any online tool for checking that, or any other way?


  • Does having a Google "stop word" in a domain name have less SEO benefit than not having it?

    - by Dan
    Let me explain. Let's say the keyword I want to optimize for is "green giraffes", but the domain greengiraffes.com (singular, plural, no hyphen, hyphen, etc.) is not available. I know that the search results for "green giraffes" and "about green giraffes" are essentially the same, because "about" is a "stop word". Does that therefore also mean that the domain name "aboutgreengiraffes.com" is as good as "greengiraffes.com" in terms of SEO value? Are all stop words equal in that regard, or is a shorter one (such as "e" or "z") better?


  • How can I set parameters in Google Webmaster Tools so that my dynamic content is indexed?

    - by Werewolf
    I have read the questions about URL parameters in Google Webmaster Tools on this site and in the Google Webmaster Help Center, but I have a problem. My site searches in the database and shows some information. These two URLs display some data:
    http://mydomain.com/index.aspx?category=business
    http://mydomain.com/index.aspx?category=graphic&City=Paris
    In the URL parameter section, I can only define the parameter category; how can Google detect the proper values (business, graphic, real estate...)? Not every word is a valid value for the search. If my page name is default.aspx or anything else, where should I define it? If I use URL rewriting, like http://mydomain.com/search/category/business, must my settings change?


  • Should the English website use hreflang="x-default" when it doesn't auto-redirect to the user's language or country?

    - by Noam
    For each URL on my site, I auto-redirect according to the Accept-Language header. The site architecture is:
    English version: http://mydomain.com/page
    Spanish version: http://es.mydomaina.com/page
    etc. The English version is displayed unless the header specifies a language other than en that I support, in which case a redirect occurs. Google says this: "For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value 'x-default' as well." My pages aren't language selectors, nor are they the homepage, but I am auto-redirecting. My question is: should my English version be hreflang="x-default", hreflang="en", or both?
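
    For reference, a sketch of the annotation set this would produce on each English URL if both values are used (the common reading of Google's guidance for auto-redirecting defaults; the URLs mirror the ones above):

        <link rel="alternate" hreflang="en" href="http://mydomain.com/page" />
        <link rel="alternate" hreflang="es" href="http://es.mydomaina.com/page" />
        <link rel="alternate" hreflang="x-default" href="http://mydomain.com/page" />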


  • Gaming Community CMS, with forum integration [closed]

    - by Tillman32
    Possible Duplicate: Which Content Management System (CMS) should I use?
    I've had a simple website that I coded myself for a while now; the site is a gaming community. It's very forum- and news-driven. It was a HORRIBLE idea to take on coding this thing myself. Although we've used it for about a year now, we're just getting too big, and I need to streamline our work. I need writers to post news, etc. I've been doing it through code (a year ago I thought it would be a cool idea). Anyway, I've been messing with just about every CMS out there, and I'm struggling to find something that I really like. The main issue I'm facing is a good news system and good forum integration. I'm sort of picky when it comes to looks; it's a curse. Reading on here, I see a lot of people saying Drupal is the best for the 3 things I need: news, community interaction, and forums. I think the main issue I ran into with Drupal was ease of use, and themes. I am not a web designer, and I need a good theme. For an idea of what I'm looking for, go check out http://www.clgaming.net: they have forums integrated, a nice news area on the home page/news section, and nice user accounts. It looks very professional, and I doubt I'll get close to that with a free theme, but their functionality is exactly what I need. Any ideas would be greatly appreciated.


  • navigation menus and SEO

    - by Rodolfo
    I've always had my doubts about the effect of navigation menus on SEO. You know, the vertical menus at the top that show on every page of the site, linking to the main sections and subsections. My issue is that, if not done dynamically (i.e. after the page is loaded or something), from a search engine's point of view it probably looks like a whole bunch of links at the beginning of the page, links that probably have nothing to do with the page being analyzed. So it's probably not only confusing the engine, but also giving link 'juice' to the wrong pages, or reducing its value. When I've asked SEO people about this, I usually get a "Google is smart, they'll recognize it as a menu and ignore it" response, but I'm not convinced (and the 'Google is smart' argument sounds almost like a religious discussion to me). So does it affect SEO negatively or not? Are there any official posts on this topic?

