Search Results

Search found 11896 results on 476 pages for 'smart pro'.


  • Why doesn't Firefox cache my images and CSS

    - by Richard A
    I am using IIS7 and I have already set up the following, but when I run Firefox it seems not to cache any of my images even with "remember history" set.

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <staticContent>
              <clientCache cacheControlCustom="public" cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
            </staticContent>
          </system.webServer>
        </configuration>

    However, Firebug still suggests that Firefox is not caching my images and CSS. Response headers:

        Cache-Control: public,max-age=604800
        Content-Type: text/css
        Content-Encoding: gzip
        Last-Modified: Mon, 27 Jun 2011 03:53:22 GMT
        Accept-Ranges: bytes
        Etag: "507968c27d34cc1:0"
        Vary: Accept-Encoding
        Server: Microsoft-IIS/7.5
        X-Powered-By: ASP.NET
        Date: Mon, 27 Jun 2011 13:06:41 GMT
        Content-Length: 5067

    Request headers:

        Host: www.xx.com
        User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1
        Accept: text/css,*/*;q=0.1
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip, deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://www.xx.com/
        Cookie: __utma=62996397.135679654.1309106351.1309159743.1309164158.8; __utmz=62996397.1309106351.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); __utmc=62996397
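
    A quick way to separate server-side from client-side behaviour is to revalidate the asset by hand. A minimal sketch using Python's requests library (the asset URL is hypothetical): a 304 on the second request means the Cache-Control/ETag configuration is working and the problem lies with the browser.

        import requests

        url = "http://www.xx.com/styles.css"  # hypothetical asset URL

        # First request: note the caching headers the server sends back.
        first = requests.get(url)
        print(first.status_code, first.headers.get("Cache-Control"))

        # Second request: revalidate with the validators from the first response.
        second = requests.get(url, headers={
            "If-None-Match": first.headers.get("ETag", ""),
            "If-Modified-Since": first.headers.get("Last-Modified", ""),
        })
        print(second.status_code)  # 304 means the server-side caching setup is fine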

    Read the article

  • Oversizing images to produce better looking pages?

    - by Joannes Vermorel
    In the past, improper image resizing used to be a big no-no of web design (not to mention improper compression formats). Hence, for years I stuck to the policy of resizing images (PNG or JPG) on the server to match, pixel for pixel, the resolution they would have on the rendered page. Recently, I hastily designed an HTML draft with oversized images, using inline CSS such as width:123px and height:123px to scale them down. To my (slight) surprise, the page looked much better that way. Indeed, with today's higher screen resolutions, some people (like me) tend to browse with some level of zoom (125% or even 150%), otherwise fonts are just too small on screen. If the image is strictly sized, the enlarged image appears blurry (a pixel-interpolation effect), but if the image is oversized the result is much better. Obviously, oversizing images is not an acceptable pattern if your website is intended for mobile browsing, but are there cases where it would be considered acceptable, especially if the extra page weight is small anyway?
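
    Concretely, the pattern in question is serving an image at (say) twice its layout size and letting CSS scale it down; a minimal sketch with hypothetical file names:

        <!-- A 246px-wide source displayed at 123px stays sharp at 150% browser zoom. -->
        <img src="photo-246w.jpg" style="width:123px;height:123px" alt="product photo">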

    Read the article

  • How can I use domain masking without getting self-referrals in Google Analytics

    - by Cdore
    I have one old domain that points to a website's server IP (let's call it www.oldsite.com). I have a new one, www.newsite.com, that is set up to be forwarded to a specific page on the website. Because the host of newsite.com places the website in a frame, Google Analytics lists newsite.com as a source rather than the source visitors actually came from, causing a self-referral. One solution I looked up is to edit the code of the iframe, but of course there's no way to edit the host's masking source code. Another thing I tried was pointing www.newsite.com at the address that www.oldsite.com pointed to. That solved the Analytics problem, but in exchange the URL masking no longer worked: the address bar showed www.oldsite.com. Is there a way to keep the URL masking and still have Google Analytics attribute referrals correctly? The website is hosted on a cloud server, if that is any more information.

    Read the article

  • My colleague can't visit our website through her provider after long downtime

    - by Peter Westerlund
    We did a front-page update some days ago that caused the site to crash, and it was down for several hours. After troubleshooting, we concluded that we needed to cache more content; it had been running too many queries. After fixing that and rebooting the server, we here in Sweden and Norway were again able to visit the site, but a colleague in Tunisia couldn't. It seems to work from another internet provider, just not her own. What could have happened, and what should we do? Edit: I should add that she is able to visit the site through the tunnel at anonymouse.org.
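
    The tunnel working while the direct route fails often points at something stale at her provider, e.g. a DNS resolver or transparent proxy still holding on to data from the outage. A minimal sketch for comparing what different resolvers return, assuming the third-party dnspython package and with the hostname as a placeholder:

        import dns.resolver  # third-party: pip install dnspython

        for ns in ["8.8.8.8", "1.1.1.1"]:  # public resolvers to compare against hers
            resolver = dns.resolver.Resolver()
            resolver.nameservers = [ns]
            answer = resolver.resolve("www.example.com", "A")
            print(ns, [rr.to_text() for rr in answer])

    If her ISP's resolver returns something different, waiting out the record's TTL (or asking the ISP to flush its cache) is usually the only fix.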

    Read the article

  • Webhosting with custom database choice [closed]

    - by churchill614
    Possible Duplicate: How to find web hosting that meets my requirements? I am trying to find somewhere to host a website which uses OrientDB as its database. My budget doesn't stretch to a dedicated server where I can configure everything as I need it. Rather, I am hoping to find a host, ideally UK-based, of the normal shared-server variety that will allow me to install OrientDB (or will install it for me) on their server. Is anybody able to point me in a good direction for this, please? (UK is preferable but not essential.)

    Read the article

  • Is it ok for a canonical link to point to itself?

    - by Tom Gullen
    I've got the canonical: <link href="http://www.Site.com/Blog/how-to-know-when-this" rel="canonical" /> Is it OK if this is on the very page it is pointing to? Also, I'm putting it on all of these pages:
    http://www.Site.com/Blog/how-to-know-when-this
    http://www.Site.com/Blog/how-to-know-when-this/
    http://www.Site.com/Blog.aspx?ID=1
    http://www.Site.com/Blog/how-to-know-when-this/?q=
    Is this correct usage?

    Read the article

  • Photo gallery software - reads from a local directory - watches a folder - user and group permissions

    - by Darkflare
    Use case:
    - Photos are organised in a folder structure by date (by software such as Lightroom or Picasa).
    - I want to run a local web server to host a gallery from them (I already know how to run LAMP etc.).
    - I want to attach metadata to the photos (probably via a database residing outside the photos folder) so they can be tagged, categorised and put into albums without affecting the original files.
    - I want to assign permissions on different albums to specific users.
    - I want the software to watch the photo source folder for changes, so that new photos are indexed ready for metadata and albums (see the folder-watching sketch below).
    - I'd like the software to handle rendering of numerous file types (photo formats) as well as video formats.
    I am language agnostic (PHP, Python, heck even C#); I just want software that fulfils the requirements. The main reason I am asking here is that I'm unsure what this kind of software is even called, which makes searching quite difficult. Thanks for reading.
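
    For the folder-watching requirement specifically, a minimal sketch of the usual approach, assuming the third-party Python watchdog package; the photo root and the indexing step are placeholders, not part of any real gallery package:

        import time
        from watchdog.observers import Observer
        from watchdog.events import FileSystemEventHandler

        PHOTO_ROOT = "/srv/photos"  # hypothetical source folder

        class NewPhotoHandler(FileSystemEventHandler):
            def on_created(self, event):
                if not event.is_directory:
                    # A real gallery would insert a row into its metadata DB here.
                    print("queue for indexing:", event.src_path)

        observer = Observer()
        observer.schedule(NewPhotoHandler(), PHOTO_ROOT, recursive=True)
        observer.start()
        try:
            while True:
                time.sleep(1)
        except KeyboardInterrupt:
            observer.stop()
        observer.join()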

    Read the article

  • Can I use a 302 redirect to serve up static content from a url with escaped_fragment?

    - by Starfs
    We would like to serve up SEO-friendly, AJAX-driven content, and we are following this documentation. Has anyone ever tried to write a 302 redirect into the htaccess file that takes the '_escaped_fragment_=' string and sends it to a static page, for example /snapshot/yourfilename/? How will Google react to this? I've gone through the documentation and it's not very clear. The quote below is what I find on Google's site; I'm not sure whether they are saying you can redirect the _escaped_fragment_ URL to a different static page, or that this is for redirecting the hashtag URL to static content. Thoughts? From Google's site: Question: Can I use redirects to point the crawler at my static content? Redirects are okay to use, as long as they eventually get you to a page that's equivalent to what the user would see on the #! version of the page. This may be more convenient for some webmasters than serving up the content directly. If you choose this approach, please keep the following in mind:
    - Compared to serving the content directly, using redirects will result in extra traffic because the crawler has to follow redirects to get the content. This will result in a somewhat higher number of fetches/second in crawl activity.
    - Note that if you use a permanent (301) redirect, the url shown in our search results will typically be the target of the redirect, whereas if a temporary (302) redirect is used, we'll typically show the #! url in search results. Depending on how your site is set up, showing #! may produce a better user experience, because the user will be taken straight into the AJAX experience from the Google search results page.
    - Clicking on a static page will take them to the static content, and they may experience avoidable extra page load time if the site later wants to switch them to the AJAX experience.
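
    The rewrite itself is straightforward; a minimal mod_rewrite sketch, assuming the fragment maps directly onto a snapshot path (the /snapshot/ location is the hypothetical one from the question):

        RewriteEngine On
        # Catch requests like /?_escaped_fragment_=yourfilename
        RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.*)$
        # 302 to the static snapshot; the trailing ? drops the query string.
        RewriteRule ^$ /snapshot/%1/? [R=302,L]

    Per the quoted guidance, using a 302 (rather than a 301) is what keeps the #! URL in Google's search results.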

    Read the article

  • On-Site Factors that Affect CPC

    - by ashes999
    I have a few websites on various niche topics, all running AdSense. The most promising one currently has a CPC that hovers around $1; the rest have CPCs of $0.25-$0.50. I'm curious to know what on-site factors affect CPC; that is to say, what can I do, legitimately (in white-hat compliance), to increase it? Some factors that affect CPC but are not within my control (and are therefore beyond the scope of my question; they're just examples) include:
    - What advertisers are paying for keywords on my site
    - What pages people are landing on
    - etc.

    Read the article

  • Pros and cons of separate hosting accounts versus using an addon domain

    - by hen3ry
    Folks: for historical reasons, I have "Site A" on "Hosting Account A" and "Site B" on "Account B", totally independent accounts with the same vendor, Bluehost. Both are primary domains. Now that Hosting Account B is about to expire, I'm considering letting it lapse and moving Site B to an addon domain on Account A. Both sites are non-commercial, narrow-interest and very low-traffic: hundreds of page views per month. The file weights of the sites are non-trivial, especially as I like to install specialized CMSs in subdomains. Since Bluehost allows unlimited hosting space there should be no issue with the file load, except that I've seen hints of a limit on total file count, maybe 50k files, which I'm not currently close to hitting but might eventually. My question: what are the pros and cons of using separate accounts versus hosting Site B as an addon domain? Obviously, using a single account is cheaper by half, and I know that my authoring environment (Dreamweaver CS5) complains when it detects nested source trees, telling me "synchronization" might fail in such cases, but I don't depend on that feature. What other factors should I consider? TIA

    Read the article

  • Adding SPF records in GoDaddy

    - by Mayank swami
    I have GoDaddy hosting and send mail using the following code:

        $to = "[email protected]";
        $subject = "Test mail";
        $message = "Hello! This is a simple email message.";
        $from = "[email protected]";
        $headers = "From: " . $from;
        mail($to, $subject, $message, $headers);
        echo "Mail Sent.";

    When the mail arrives at its destination I see the following (in red outline). I don't want "via server" to show, and the usual way to address that is to add an SPF record. To do this I followed the instructions on this page: Managing DNS for Your Domain Names, but it's not working. After that I tried v=spf1 include:_spf.google.com ~all, as described in http://support.google.com/a/bin/answer.py?hl=en&answer=178723, but I still get the same result. How can I solve this and stop "via server" from showing?
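
    The _spf.google.com include only helps when mail is actually relayed through Google's servers; mail sent with PHP's mail() goes out from the hosting box itself, so the SPF record has to authorize that server. A minimal sketch of such a TXT record, with a placeholder address from the documentation range rather than a real GoDaddy IP:

        v=spf1 a mx ip4:203.0.113.10 ~all

    Here a and mx authorize the hosts behind the domain's A and MX records, ip4: explicitly authorizes the web server's address, and ~all soft-fails everything else.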

    Read the article

  • .com.au backordered domain: Do I have to return it if the original owner asks for it?

    - by vDog
    I was contacted by the original owner of a domain, asking me to give him back the domain I backordered a few weeks ago. The domain had been abandoned for about two months before I bought it, on behalf of my client, to eliminate competition; now I am faced with a threat that he will take the matter to court and to auDA (.au Domain Administration Ltd). Am I obliged to hand over a domain that I bought legally? I would like to know my rights in this situation.

    Read the article

  • Is box-shadow (CSS3) really not ready to use? (according to "Can I Use")

    - by mechdeveloper
    I am currently building a website on HTML5 and CSS3, and every feature I'd like to use I first check in "Can I Use". The feature I use most is box-shadow, and I've already made some great things with it, but I have a doubt about the percentage of browsers that don't support it: around 17.12%. If you look at the conclusions (show options = other options = show conclusions), they say the feature isn't ready yet because they are "Waiting for Opera Mini 5.0-6.0 to expire". I personally think the best thing we can do to get people to update their browsers is to stop supporting the old ones, but... am I right to think like this? Will there be bad consequences if I don't support older browsers? Is it worth working twice as hard just to support them? Should I keep working with box-shadow?
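
    Worth noting that box-shadow degrades gracefully: a browser that doesn't understand the property simply draws no shadow, so nothing breaks. A minimal sketch of the belt-and-braces declaration common at the time (the selector is hypothetical):

        .panel {
            -webkit-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3); /* older WebKit */
            -moz-box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3);    /* older Gecko */
            box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3);         /* the standard property */
        }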

    Read the article

  • How does URL redirection affect SEO?

    - by Costa
    The following paragraph is from Google's SEO guide: "Google is good at crawling all types of URL structures, even if they're quite complex, but spending the time to make your URLs as simple as possible for both users and search engines can help. Some webmasters try to achieve this by rewriting their dynamic URLs to static ones; while Google is fine with this, we'd like to note that this is an advanced procedure and if done incorrectly, could cause crawling issues with your site." What makes a URL-rewriting implementation incorrect for Googlebot? I am using the ASP.NET 3.5 framework. Thanks

    Read the article

  • Why isn't my website ranked by Alexa? [closed]

    - by arshen
    I created a website with WordPress and posted 10+ articles over a period of two months, but Alexa doesn't rank it. I tried changing my theme, URL and other related things, and submitted my website URL manually to the Alexa dashboard; I get around 200 page views a day, but it is still not ranked. My website URL: http://daskaht.ir Robots file: http://daskhat.ir/robots.txt Alexa page: www.alexa.com/siteinfo/daskhat.ir Domain whois: whois.domaintools.com/daskhat.ir Website SEO rank: www.woorank.com/en/www/daskhat.ir

    Read the article

  • Google Analytics - real-time user stats vs Audience Overview user stats

    - by udog
    When looking at the real-time analytics reporting for our app, it shows around 150-180 users at around 10 AM, our peak usage time. When I look at the Audience Overview report for the same day (hourly breakdown), the number of users shown for the 10 AM hour is over 1000. I'm sure this has to do with some sort of aggregation, but I would like to know how these two numbers are calculated in order to understand the difference.

    Read the article

  • Strange SEO Problems

    - by Davey
    I have a Twitter account linked to my WordPress site so that each tweet becomes a new post. I was wondering why my SEO was hurting, and when I looked at the source I was seeing page titles like this: Los Conchita&#8217;s on Prince has V&#8230; That is what the source lists as the title of the page. Has anyone else had this problem and know why it is occurring? Thanks! The site is reviewathens.com
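
    Those are numeric HTML character references left encoded in the title instead of being decoded into the characters they stand for. A quick sketch to see what they decode to (Python standard library):

        import html

        title = "Los Conchita&#8217;s on Prince has V&#8230;"
        print(html.unescape(title))  # -> "Los Conchita’s on Prince has V…" (a ’ and an ellipsis)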

    Read the article

  • Is there a weight for an HTML link?

    - by Questions
    I would like to know if there is any weight associated with an HTML link (as a backlink) when Google does crawling/indexing. Will 1, 2 and 3 ever be considered backlinks by Google?
    1. <a href="xyz.com">1</a> // one-character link
    2. <a href="xyz.com"> </a> // blank-space link
    3. <a href="xyz.com">!</a> // special-character link
    4. <a href="xyz.com">keyword</a> // meaningful-word link
    I hope my question is understandable, and I guess this is the right forum; I don't know how to put it in other words. Thanks in advance.

    Read the article

  • SharePoint 2010 as a framework for a website?

    - by Kenny Bones
    So I'm looking at several solutions for our new website, and we've looked at ExpressionEngine first and foremost. During brainstorming today, one person said "why don't we build the site on SharePoint 2010?", and it doesn't seem like a horrible idea: we're based around Office anyway, we use Lync, and we already have an intranet based on SharePoint 2010. Does anyone have any thoughts on this? Would it cost more to develop a public website on SharePoint 2010 as opposed to using ExpressionEngine?

    Read the article

  • SSL Certificate is Untrusted... sometimes

    - by dragonmantank
    The web designer I'm working with signed up a new client that needed an SSL certificate. We went to namecheap.com and purchased one from Comodo, got all the needed files and set it up in ISPConfig. To test, we used Windows 7 running IE8, Firefox 3.6 and Chrome 12, and then OS X with Firefox 4, Safari 5 and Chrome 13; all of them worked fine. The client, however, is getting "This connection is untrusted" in Firefox 4 and 5, while Safari works fine on their machine. On my machines and the designer's machines everything works with no errors. I had the client forward me the certificate info that Firefox shows, and the fingerprints match up. I have an old Windows 2000 VM with IE6 and Chrome, and those work just fine as well. Any ideas on what else to check or do? The server is running Debian 5.0, up to date, with Apache 2 and ISPConfig 3.3.
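
    A classic cause of "untrusted on some machines only" is a missing intermediate certificate: browsers that have already cached Comodo's intermediate validate the chain anyway, while fresh profiles can't. One way to inspect exactly what the server sends (standard OpenSSL CLI; substitute the real hostname):

        openssl s_client -connect www.example.com:443 -showcerts

    If only the site certificate comes back, adding Comodo's CA bundle to the vhost (SSLCertificateChainFile in Apache 2.2) would be the usual fix.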

    Read the article

  • How should a site respond to automated login attempts with phony usernames?

    - by qntmfred
    For the last couple of weeks I've been seeing a consistent stream of 15-30 invalid login attempts per hour on my site. Many of them use nonsensical usernames that nobody would ever register for real, and they often contain typical spam-related keywords. They all come from different IP addresses, so I can't just IP-block or throttle the requests. I'm not worried about unauthorized access to real accounts, since they aren't using real usernames. And if it were a member of my site trying to brute-force logins, they could easily scrape the valid usernames from the site, so I'm not worried about that kind of malicious behavior either. But what's the point of this type of activity? What would whoever is operating these bots have to gain by attempting all these logins?

    Read the article

  • How long does it take for Google Webmasters to index site after submitting sitemap? [closed]

    - by Venkatesh Hodavdekar
    Possible Duplicate: Why isn't my website in Google search results? I submitted my website to Google today through Google Webmaster Tools, using a sitemap. The sitemap status says OK and shows that 12 URLs have been recognized. I was wondering how long it takes for the links to get indexed, as the indexed-URL option says "No data available. Please check back soon." I am not sure whether it is showing this message due to some error, or whether everything is fine.

    Read the article

  • What is a good robots.txt for WordPress?

    - by Steven
    What is the "best" setup for robots.txt? I'm using the following permalink structure in WordPress: /%category%/%postname%/. My robots.txt currently looks like this (copied from somewhere a long time ago):

        User-agent: *
        Disallow: /cgi-bin
        Disallow: /wp-admin
        Disallow: /wp-includes
        Disallow: /wp-content/plugins
        Disallow: /wp-content/cache
        Disallow: /wp-content/themes
        Disallow: /trackback
        Disallow: /comments
        Disallow: /category/*/*
        Disallow: */trackback
        Disallow: */comments

    I want my comments to be indexed, so I can remove those lines, right? Do I want to disallow indexing of categories because of my permalink structure? An article can have several tags and be in multiple categories, which may cause duplicates in Google. How should I work around this? Would you change anything else here?
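
    For what it's worth, a minimal sketch of the file with the comment rules dropped and the trackback rules consolidated; this is one possible trade-off, not the canonical WordPress answer:

        User-agent: *
        Disallow: /cgi-bin
        Disallow: /wp-admin
        Disallow: /wp-includes
        Disallow: /wp-content/plugins
        Disallow: /wp-content/cache
        Disallow: /wp-content/themes
        Disallow: */trackback

    Duplicate content across categories and tags is generally better handled with canonical link tags than with robots.txt, since a page blocked from crawling can still end up indexed via external links.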

    Read the article

  • Convert Google Video URL [on hold]

    - by user3716328
    When I download a video from YouTube (Google Docs or Google Plus) with a download manager, this is what I get: referer: http://www.youtube.com/watch?v=71zlOWbEoe8 address: http://r4---sn-5hnezn76.googlevideo.com/videoplayback?id=o-AAyCp6dqoOj_sUwGtSwE9J27TU7-iKHf4d209xDnuCee&signature=2EB6338E125E84AAA3736DBCEF6AC35451A2104A.1C29A8A9B830CDC9E4493D93A25BA62FF3672E4A&key=cms1&ipbits=0&fexp=3300091%2C3300111%2C3300130%2C3300137%2C3300164%2C3310366%2C3310648%2C3310698%2C3312169%2C907050%2C908562%2C913434%2C923341%2C924203%2C930008%2C932617%2C935501%2C939936%2C939937&ratebypass=yes&ip=41.230.222.50&upn=DO_DxWpYrpg&expire=1402102715&sparams=expire,gcr,id,ip,ipbits,itag,ratebypass,source,upn&itag=18&source=youtube&gcr=tn&redirect_counter=1&req_id=8a4c0009cb74c822&cms_redirect=yes&ms=nxu&mt=1402081178&mv=m&mws=yes I want to know: if I have the address, is there a way to get the referer? I mean, if I have this: http://r4---sn-5hnezn76.googlevideo.com/videoplayback?id...... how do I get http://www.youtube.com/watch?v=.........?

    Read the article

  • Does Submit to Index on a page with new content update Content Keywords for the site?

    - by Dan Kanze
    Using Google Webmaster Tools, I'm trying to update the Content Keywords of my site, and I'm confused about the relationship between Submit to Index and Content Keywords.
    - Does Fetch as Google -- Submit to Index on a previously indexed page containing new content expedite updating the Content Keywords crawled by the real Googlebot?
    - Does Submit to Index only submit new URLs, so that previously indexed URLs still point to the older cached version until Google crawls the new content on its own?
    - Does Submit to Index have anything to do with Content Keywords, or with crawling new content, whether the page was previously indexed or never indexed?

    Read the article
