Search Results

Search found 3419 results on 137 pages for 'browsers'.

Page 125/137 | < Previous Page | 121 122 123 124 125 126 127 128 129 130 131 132  | Next Page >

  • Detecting 'stealth' web-crawlers

    - by Jacco
    What options are there to detect web-crawlers that do not want to be detected? (I know that listing detection techniques will allow the smart stealth-crawler programmer to make a better spider, but I do not think that we will ever be able to block smart stealth-crawlers anyway, only the ones that make mistakes.) I'm not talking about the nice crawlers such as Googlebot and Yahoo! Slurp. I consider a bot nice if it identifies itself as a bot in the user agent string and reads robots.txt (and obeys it). I'm talking about the bad crawlers, hiding behind common user agents, using my bandwidth and never giving me anything in return.

    There are some trapdoors that can be constructed (updated list, thanks Chris, gs):

    - adding a directory that is only listed (and marked as disallow) in robots.txt,
    - adding invisible links (possibly marked as rel="nofollow"?), either with style="display: none;" on the link or its parent container, or placed underneath another element with a higher z-index,
    - detecting who doesn't understand CaPiTaLiSaTioN,
    - detecting who tries to post replies but always fails the Captcha,
    - detecting GET requests to POST-only resources,
    - detecting the interval between requests,
    - detecting the order of pages requested,
    - detecting who (consistently) requests https resources over http,
    - detecting who does not request image files (in combination with a list of user agents of known image-capable browsers, this works surprisingly well).

    Some traps would be triggered by both 'good' and 'bad' bots. You could combine those with a whitelist: it triggered a trap, it requested robots.txt, and it did not trigger another trap because it obeyed robots.txt. One other important thing here: please consider blind people using screen readers; give people a way to contact you, or a (non-image) Captcha to solve so they can continue browsing.

    So, what methods are there to automatically detect the web crawlers trying to mask themselves as normal human visitors?

    Update: The question is not "How do I catch every crawler?" but "How can I maximize the chance of detecting a crawler?" Some spiders are really good, and actually parse and understand HTML, XHTML, CSS, JavaScript, VBScript etc. I have no illusions: I won't be able to beat them. You would, however, be surprised how stupid some crawlers are, with the best example of stupidity (in my opinion) being: cast all URLs to lower case before requesting them. And then there is a whole bunch of crawlers that are just 'not good enough' to avoid the various trapdoors.
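
    A minimal sketch of the robots.txt honeypot from the list above, assuming a Node.js/Express server (the /bot-trap/ path, the in-memory flag store and the ban response are hypothetical names for illustration, not part of the original question):

        // robots.txt disallows /bot-trap/ and an invisible link on every page points there;
        // anything that requests it either never read robots.txt or chose to ignore it.
        const express = require('express');
        const app = express();

        const flagged = new Set(); // hypothetical store of suspicious client IPs

        app.get('/robots.txt', (req, res) => {
          res.type('text/plain').send('User-agent: *\nDisallow: /bot-trap/\n');
        });

        app.get('/bot-trap/', (req, res) => {
          flagged.add(req.ip);                 // this client stepped into the trap
          res.status(404).send('Not found');
        });

        // every later route checks the flag and offers the human escape hatch
        app.use((req, res, next) => {
          if (flagged.has(req.ip)) {
            return res.status(403).send('Please contact us or solve the captcha to continue.');
          }
          next();
        });

        app.get('/', (req, res) => {
          res.send('<a href="/bot-trap/" style="display: none;">never follow this</a> ...page content...');
        });

        app.listen(3000);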

    Read the article

  • WebKit "Refused to set unsafe header 'content-length'"

    - by Paul
    I am trying to implement a simple XHR abstraction, and am getting this warning when trying to set the headers for a POST. I think it might have something to do with setting the headers in a separate .js file, because when I set them in the <script> tag in the .html file, it worked fine. The POST request is working fine, but I get this warning and am curious why. I get this warning for both the Content-Length and Connection headers, but only in WebKit browsers (Chrome 5 beta and Safari 4). In Firefox I don't get any warnings, the Content-Length header is set to the correct value, but Connection is set to keep-alive instead of close, which makes me think it is also ignoring my setRequestHeader calls and generating its own. I have not tried this code in IE. Here is the markup and code:

    test.html:

        <!DOCTYPE html>
        <html>
        <head>
          <script src="jsfile.js"></script>
          <script>
            var request = new Xhr('POST', 'script.php', true, 'data=somedata', function(data) {
              console.log(data.text);
            });
          </script>
        </head>
        <body>
        </body>
        </html>

    jsfile.js:

        function Xhr(method, url, async, data, callback) {
          var x;
          if (window.XMLHttpRequest) {
            x = new XMLHttpRequest();
            x.open(method, url, async);
            x.onreadystatechange = function() {
              if (x.readyState === 4) {
                if (x.status === 200) {
                  var data = { text: x.responseText, xml: x.responseXML };
                  callback.call(this, data);
                }
              }
            }
            if (method.toLowerCase() === "post") {
              x.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
              x.setRequestHeader("Content-Length", data.length);
              x.setRequestHeader("Connection", "close");
            }
            x.send(data);
          } else {
            // ... implement IE code here ...
          }
          return x;
        }
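
    For context, the XMLHttpRequest spec treats Content-Length and Connection as browser-controlled headers, so WebKit refuses to let script set them and computes them itself. A minimal sketch of the same abstraction with those two calls simply dropped (behaviour otherwise unchanged; this is not the asker's final code):

        function Xhr(method, url, async, data, callback) {
          var x = new XMLHttpRequest();
          x.open(method, url, async);
          x.onreadystatechange = function() {
            if (x.readyState === 4 && x.status === 200) {
              callback.call(this, { text: x.responseText, xml: x.responseXML });
            }
          };
          if (method.toLowerCase() === "post") {
            // Content-Type is safe to set; Content-Length and Connection are set by the browser.
            x.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
          }
          x.send(data);
          return x;
        }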

    Read the article

  • Windows Azure: Server Error, 404 - File or directory not found.

    - by veda
    I want to upload some files of size 35 MB to the blob container. I have written code that splits the data into blocks, uploads them to the blob container and forms a blob using PUT. I tested the code with some files of around 2 MB and it worked well. But when I tried it with a larger file, it gives me this error:

        Server Error
        404 - File or directory not found.
        The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.

    When I tried it with files of size 6 MB, it gives me this error:

        Server Error in '/' Application.
        Runtime Error
        Description: An application error occurred on the server. The current custom error settings for this application prevent the details of the application error from being viewed remotely (for security reasons). It could, however, be viewed by browsers running on the local server machine.
        Details: To enable the details of this specific error message to be viewable on remote machines, please create a <customErrors> tag within a "web.config" configuration file located in the root directory of the current web application. This <customErrors> tag should then have its "mode" attribute set to "Off".

        <!-- Web.Config Configuration File -->
        <configuration>
          <system.web>
            <customErrors mode="Off"/>
          </system.web>
        </configuration>

        Notes: The current error page you are seeing can be replaced by a custom error page by modifying the "defaultRedirect" attribute of the application's <customErrors> configuration tag to point to a custom error page URL.

        <!-- Web.Config Configuration File -->
        <configuration>
          <system.web>
            <customErrors mode="RemoteOnly" defaultRedirect="mycustompage.htm"/>
          </system.web>
        </configuration>

    Can anyone tell me how to solve this?

    Read the article

  • jQuery dialog momentarily displayed on page load

    - by Kevin Won
    I created a page that has a jQuery-based dialog using the standard jQuery UI function. I do this with out-of-the-box functionality of jQuery... nothing special at all. Here is my HTML for the dialog:

        <div id="myDialog">
          <!-- ... more html in here for the dialog -->
        </div>

    Then the jQuery, called in JavaScript, that transforms the <div> into a dialog:

        // pruned .js as an example of kicking up a jQuery dialog
        $('#myDialog').dialog({
          autoOpen: false,
          title: 'Title here',
          modal: true
        });

    Again, plain-vanilla jQuery. So you start this wizard by clicking on a link on the parent page, and it then spawns a jQuery dialog which has a significant chunk of HTML that includes images, etc.

    As I continued developing this page, I started to notice that when I loaded the page in the browser, the <div> tags that jQuery transforms into dialogs would very briefly be displayed. Then the page would act as expected. In other words, the dialog would not be hidden; it would be displayed briefly in-line in the page. Quite ugly and unprofessional looking! But after a split second, the page would render correctly and look just as I expected/wanted. Over time, as the page size grew, the time the page would remain incorrectly rendered grew.

    My guess is that the rendering engine of the browser renders the page as it loads, then at the end kicks off the jQuery that transforms the <div> into a dialog. That jQuery function transforms the simple <div> into a jQuery dialog and hides it (since I have the autoOpen property set to false). Some browsers <cough>IE</cough> display it longer than others. My large-ish dialog now causes the page to render incorrectly for about 1 second... YUCK! I came up with a resolution to this problem which works OK, but I'm wondering if someone knows of a better way.
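
    One generic way to avoid that brief flash (offered as a sketch, not necessarily the fix the asker settled on) is to hide the dialog markup before the rest of the page renders, instead of waiting for the document-ready handler, and then let .dialog() manage visibility from there:

        // Placed in a <script> element immediately after the #myDialog div, so it runs
        // while the page is still parsing, before the dialog markup is painted in-line.
        document.getElementById('myDialog').style.display = 'none';

        // The usual document-ready setup stays as it was:
        $(function () {
          $('#myDialog').dialog({ autoOpen: false, title: 'Title here', modal: true });
        });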

    Read the article

  • IE7 relative/absolute positioning bug with dynamically modified page content

    - by Matthias Hryniszak
    Hi, I was wondering if anyone has an idea how to tackle the following problem in IE7:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
          <title>IE7 absolute positioning bug</title>
          <style type="text/css">
            #panel { position: relative; border: solid 1px black; }
            #spacer { height: 100px; }
            #footer { position: absolute; bottom: 0px; }
          </style>
          <script type="text/javascript">
            function toggle() {
              var spacer = document.getElementById("spacer");
              var style = "block";
              if (spacer.style.display == "block" || spacer.style.display == "") {
                style = "none";
              }
              spacer.style.display = style;
            }
          </script>
        </head>
        <body>
          <div id="panel">
            <button onclick="toggle();">Click me</button>
            <br /><br /><br />
            <div id="spacer"></div>
            <div id="footer">This is some footer</div>
          </div>
        </body>
        </html>

    When you run this in IE7, you'll see that the "footer" element stays where it was after modifying the CSS for "panel". The same example tested in IE8, FF and Chrome behaves exactly as expected. I've already tried updating the element's class, but this does not work if the browser window has been opened maximized and no further size changes were made to the window (which is about 90% of the use cases we have for our product... :( ). I'm stuck with a CSS-based solution; however, I think I can make an exception in this case if it can easily be made IE7-specific (which means that other browsers will behave in a standard way with this). Please help!
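
    A workaround sometimes used for this family of IE7 repaint problems (offered only as a hedged sketch; it has not been verified against this exact test case) is to nudge the absolutely positioned element after the toggle so IE7 recomputes its position:

        function toggle() {
          var spacer = document.getElementById("spacer");
          spacer.style.display =
            (spacer.style.display == "none") ? "block" : "none";

          // Hypothetical IE7 nudge: touch the footer's style and read a layout property
          // to push the browser into recalculating the absolute position.
          var footer = document.getElementById("footer");
          footer.style.bottom = "1px";
          var force = footer.offsetHeight;  // forces a layout pass
          footer.style.bottom = "0";
        }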

    Read the article

  • CSS layout mystery

    - by selfthinker
    Among the many two (or three) column layout techniques, I sometimes use the following one:

        <div class="variant1">
          <div class="left1">
            <div class="left2"> left main content </div>
          </div>
          <div class="right1">
            <div class="right2"> right sidebar </div>
          </div>
        </div>

    together with:

        .variant1 .left1 { float: left; margin-right: -200px; width: 100%; }
        .variant1 .left1 .left2 { margin-right: 200px; }
        .variant1 .right1 { float: right; width: 200px; }

    This works in all major browsers. But for some very strange reason, exactly the same technique reversed doesn't work:

        <div class="variant2">
          <div class="left1">
            <div class="left2"> left main content </div>
          </div>
          <div class="right1">
            <div class="right2"> right sidebar </div>
          </div>
        </div>

    with

        .variant2 .left1 { float: left; width: 200px; }
        .variant2 .right1 { float: right; margin-left: -200px; width: 100%; }
        .variant2 .right1 .right2 { margin-left: 200px; }

    In the second variant, none of the text in the sidebar can be selected and none of the links can be clicked. This is at least true for Firefox and Chrome. In IE7 the links can at least be clicked, and Opera seems completely fine. Does anyone know the reason for this strange behaviour? Is it a browser bug?

    Please note: I am not looking for a working two-column CSS layout technique; I know there are loads of them. And I don't necessarily need this technique to work. I would only like to understand why the second variant behaves the way it does. Here is a link to a small test page which should illustrate the problem: http://selfthinker.org/stuff/css_layout_mystery.html

    Read the article

  • Exception thrown in YUI: "Sizzle" is not defined!

    - by nanobyt3
    Hi, we are using HtmlUnit v2.6 with Web-Harvest and have extended its functionality to create a new element:

        <web session="sess1" browser="firefox2">
          <web-getpage url="https://www.linkedin.com/secure/login"/>
          <web-setinput name="uname">username</web-setinput>
          <web-setinput name="pwd">password</web-setinput>
          <web-clickinput name="login"/>
        </web>

    When we run this, we get an exception while the element loads the specified URL. The details are as below:

        Caused by: net.sourceforge.htmlunit.corejs.javascript.EcmaError: TypeError: Cannot find function hasOwnProperty in object net.sourceforge.htmlunit.corejs.javascript.EcmaError: ReferenceError: "Sizzle" is not defined.
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.constructError(ScriptRuntime.java:3651)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.constructError(ScriptRuntime.java:3629)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.typeError(ScriptRuntime.java:3657)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.typeError2(ScriptRuntime.java:3676)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.notFunctionError(ScriptRuntime.java:3740)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.getPropFunctionAndThisHelper(ScriptRuntime.java:2249)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.getPropFunctionAndThis(ScriptRuntime.java:2216)
            at net.sourceforge.htmlunit.corejs.javascript.Interpreter.interpretLoop(Interpreter.java:1501)
            at net.sourceforge.htmlunit.corejs.javascript.Interpreter.interpret(Interpreter.java:845)
            at net.sourceforge.htmlunit.corejs.javascript.InterpretedFunction.call(InterpretedFunction.java:164)
            at net.sourceforge.htmlunit.corejs.javascript.ContextFactory.doTopCall(ContextFactory.java:427)
            at com.gargoylesoftware.htmlunit.javascript.HtmlUnitContextFactory.doTopCall(HtmlUnitContextFactory.java:263)
            at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.doTopCall(ScriptRuntime.java:3058)
            at net.sourceforge.htmlunit.corejs.javascript.InterpretedFunction.exec(InterpretedFunction.java:175)
            at com.gargoylesoftware.htmlunit.javascript.JavaScriptEngine$5.doRun(JavaScriptEngine.java:415)
            at com.gargoylesoftware.htmlunit.javascript.JavaScriptEngine$HtmlUnitContextAction.run(JavaScriptEngine.java:520)
            ... 42 more

    It appears that 'Sizzle' (present in YUI3) is causing this. We then checked the same page in Firefox and IE, but neither browser showed any error about 'Sizzle' not being defined. We also tried the latest snapshot of HtmlUnit, but had the same issue. Is this a limitation (bug) of the HtmlUnit JavaScript engine? Or is there any way to configure HtmlUnit to handle this exception? If anyone has already had such an issue, please do let us know. Any help is very much appreciated. Thanks in advance!

    Read the article

  • jquery hover not working in safari and chrome

    - by Nik
    I'm developing a site and I am implementing a jquery hover effect on some list items. It works perfectly in all browser except safari and chrome (mac and pc). For some reason the hover effect doesnt work on those to browsers. Here is the link link text I thought I would add the code just in case it helps (it also uses the color_library.js file that can be found in the head of the document). $(document).ready(function() { var originalBG = $("#menu li#Q_01","#menu li#Q_03","#menu li#Q_05","#menu li#Q_07","#menu li#Q_09","#menu li#Q_11","#menu li#Q_11").css("background-color"); var originalBG1 = $("#menu li").css("color"); var originalBG2 = $("#menu li#Q_02","#menu li#Q_04","#menu li#Q_06","#menu li#Q_08","#menu li#Q_10","#menu li#Q_12").css("background-color"); var fadeColor = "#009FDD"; var fadeColor1 = "#FFF"; var fadeColor2 = "#623A10"; $("#menu li#Q_01").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_03").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_05").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_07").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_09").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_11").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_13").hover( function () { $(this).animate( { backgroundColor:fadeColor2,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_02").hover( function () { $(this).animate( { backgroundColor:fadeColor,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_04").hover( function () { $(this).animate( { backgroundColor:fadeColor,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_06").hover( function () { $(this).animate( { backgroundColor:fadeColor,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_08").hover( function () { $(this).animate( { backgroundColor:fadeColor,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_10").hover( function () { $(this).animate( { backgroundColor:fadeColor,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); $("#menu li#Q_12").hover( function () { $(this).animate( { backgroundColor:fadeColor,color:fadeColor1}, 380 ) }, function () { $(this).animate( {color:"#666",backgroundColor:"#fff"}, 380 ) } ); }); Thanks for any advice ;)
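
    Two things worth noting, offered as hedged suggestions rather than a confirmed diagnosis: jQuery core only animates numeric properties, so animating background-color depends on the color plugin (here color_library.js) being loaded before these handlers run, which is worth verifying in Safari/Chrome's console. Independently of that, the thirteen nearly identical .hover() blocks above can be collapsed into one handler; a sketch that assumes the same ids, colors and timings as the code above:

        $(document).ready(function () {
          var fadeColor  = "#009FDD"; // even-numbered items (Q_02, Q_04, ...)
          var fadeColor1 = "#FFF";
          var fadeColor2 = "#623A10"; // odd-numbered items (Q_01, Q_03, ...)

          $("#menu li[id^='Q_']").hover(
            function () {
              var odd = parseInt(this.id.replace("Q_", ""), 10) % 2 === 1;
              $(this).animate({ backgroundColor: odd ? fadeColor2 : fadeColor, color: fadeColor1 }, 380);
            },
            function () {
              $(this).animate({ color: "#666", backgroundColor: "#fff" }, 380);
            }
          );
        });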

    Read the article

  • How does this ajax call persist DOM changes in the browser cache?

    - by Greg
    For the purpose of the question I need to create a simple fictitious scenario. I have the following trivial page with one link, call it page A:

        <a class="red-anchor" onclick="change_color(event);" href="http://mysite.com/b/">B</a>

    with the associated JavaScript function:

        function change_color(e) {
          var event = e || window.event;
          var link = event.target;
          link.className = "green-anchor";
        }

    And I have the appropriate CSS to make the anchor red or green based on the class name. This is working. That is, when I click the anchor it changes color from red to green, which is briefly visible before the browser loads page B. But if I then use the BACK button to return to page A, I get different behavior in different browsers:

    - in Safari, the anchor is still green (the desired behavior);
    - in Firefox it reverts to red.

    I imagine that Safari is somehow updating its cached version of the page, whereas Firefox isn't. So my first question is: is there any way to get Firefox to update the cached page, or is something else happening here?

    Secondly: I have a different implementation where I use an ajax call. In this I set the class of the anchor using a session variable, something like:

        <a class="<?php echo $_SESSION["color"]; ?>" ...[snip]... >B</a>

    and the JavaScript function makes an additional ajax call that changes the "color" session variable. In this case both Safari and Firefox work as expected: when going back from B to A, the color is still green. But I can't for the life of me figure out why it should be different from the non-ajax case. I have tried many different permutations, and for it to work in Firefox the "color" session variable MUST change (i.e. the ajax call itself is not somehow reloading the cache). But on coming BACK, the page is being reloaded from the cache (verified in Firebug), so how is the page even accessing this session variable if it isn't reprocessing the page and running that fragment of PHP in the anchor? I figure there must be something fundamental here that I am not understanding. Any insight would be much appreciated.
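
    For what it's worth, the back-button behaviour described here is governed by the browser's back-forward cache. A generic way to re-apply client-side state when a page is restored from that cache (a sketch, not necessarily what was done here; the sessionStorage key is a hypothetical name) is the pageshow event:

        function change_color(e) {
          var event = e || window.event;
          var link = event.target;
          link.className = "green-anchor";
          sessionStorage.setItem('anchorColor', 'green-anchor'); // remember the change
        }

        window.addEventListener('pageshow', function (e) {
          // e.persisted is true when the page came back out of the back-forward cache
          var color = sessionStorage.getItem('anchorColor');
          if (e.persisted && color) {
            document.querySelector('a.red-anchor, a.green-anchor').className = color;
          }
        }, false);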

    Read the article

  • ASP.net Ajax tab container not appearing

    - by Eyla
    I created a new web project using VS 2008 with the Ajax-enabled template, C# and Framework 3.5. I added the Ajax reference to the project and I can see the whole Ajax toolkit in my toolbox. The problem is that when I add a tab container with tab panels and then run the project, nothing appears in the browser; I tried a few browsers. I'm including my code and hope that someone can help me. Regards.

    My code:

        <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="Contacts._Default" %>
        <%@ Register assembly="AjaxControlToolkit" namespace="AjaxControlToolkit" tagprefix="asp" %>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml" >
        <head runat="server">
          <title>Untitled Page</title>
        </head>
        <body>
          <form id="form1" runat="server">
            <asp:ScriptManager ID="ScriptManager1" runat="server" />
            <div>
              <asp:TabContainer ID="TabContainer1" runat="server" ActiveTabIndex="0">
                <asp:TabPanel runat="server" HeaderText="TabPanel1" ID="TabPanel1">
                  <ContentTemplate> tab 1 </ContentTemplate>
                </asp:TabPanel>
                <asp:TabPanel runat="server" HeaderText="TabPanel2" ID="TabPanel2">
                  <ContentTemplate> tab 2 </ContentTemplate>
                </asp:TabPanel>
                <asp:TabPanel runat="server" HeaderText="TabPanel3" ID="TabPanel3">
                  <ContentTemplate> tab 3 </ContentTemplate>
                </asp:TabPanel>
              </asp:TabContainer>
            </div>
          </form>
        </body>
        </html>

    Read the article

  • jQuery Cycle plugin IE6/7 issues

    - by Aaron Moodie
    I've implemented a slideshow using the Cycle plugin, which is working in all browsers except IE6 and 7, where the images just show up in a list and the #page_copy div is not hiding. I've been going through the code all day without any luck, and I'm not exactly sure what I should be looking for. What would be the best way to go about debugging this issue? I know that the #page_copy div does hide when I remove the rest of the code, and I've tried the reverse (which had no result).

        <script type="text/javascript" charset="utf-8">
          jQuery.fn.fadeToggle = function(speed, easing, callback) {
            return this.animate({opacity: 'toggle'}, speed, easing, callback);
          };

          $(document).ready(function() {
            $('#page_copy').hide();
            $('a#info_close_button').click(function() {
              $('#page_copy').fadeOut(200);
              return false;
            });
            $('a#info_button').click(function() {
              $('#page_copy').fadeToggle(200);
              return false;
            });
          });

          $(window).load(function() {
            // vertically center single image
            var $image_cnt = $("#images > img").size();
            if($image_cnt < 2) {
              var $single_img = $("#images").children(':first-child');
              var h = $single_img.height();
              $single_img.css({
                marginTop: (620 - h) / 2,
              });
              $(".next").css("display","none");
              $(".prev").css("display","none");
            }
          });

          // wait until images have loaded before starting cycle
          $(window).load(function() {
            // front image rotator
            $('#images').cycle({
              fx: 'fade',
              speed: 300,
              next: '.next',
              prev: '.prev',
              containerResize: 0,
              timeout: 0,
              delay: -2000,
              before: onBefore
            });
          });

          // hide all but the first image when page loads
          $(document).ready(function() {
            $('#images img:gt(0)').hide();
          });

          // callback fired when each slide transition begins
          function onBefore(curr,next,opts) {
            var $slide = $(next);
            var w = $slide.width();
            var h = $slide.height();
            $slide.css({
              marginTop: (620 - h) / 2,
              marginLeft: (650 - w) / 2
            });
          };
        </script>
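
    One concrete thing worth checking when old IE chokes on otherwise working jQuery: IE6/7 throw "Expected identifier, string or number" on trailing commas in object literals, and the single-image branch above has one, which would stop the whole script. A corrected version of just that spot (the rest of the code unchanged):

        $(window).load(function() {
          // vertically center single image
          var $image_cnt = $("#images > img").size();
          if ($image_cnt < 2) {
            var $single_img = $("#images").children(':first-child');
            var h = $single_img.height();
            // no trailing comma after the last property -- IE6/7 treat
            // "{ marginTop: ..., }" as a syntax error and abort the script
            $single_img.css({ marginTop: (620 - h) / 2 });
            $(".next").css("display", "none");
            $(".prev").css("display", "none");
          }
        });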

    Read the article

  • multiple stateful iframes per page will overwrite JSESSIONID?

    - by Nikita
    Hello, I'm looking for someone to either confirm or refute my theory that deploying two iframes pointing to two different stateful pages on the same domain can lead to JSESSIONIDs being overwritten. Here's what I mean.

    Setup:

    - Suppose you have two pages that require HttpSession state (session affinity) to function correctly, deployed at http://www.foo.com/page1 and http://www.foo.com/page2.
    - Assume www.foo.com is a single host running Tomcat (6.0.20, fwiw) that uses JSESSIONID for session ids.
    - Suppose these pages are turned into two iframe widgets to be embedded on 3rd-party sites (pointing at http://www.site.com/page1 and /page2 respectively).
    - Suppose a 3rd-party site wishes to place both widgets on the same page at http://www.bar.com/foowidgets.html.

    Can the following race condition occur?

    1. A new visitor goes to http://www.bar.com/foowidgets.html.
    2. The browser starts loading the URLs in foowidgets.html, including the two iframe 'src' URLs. Because browsers open multiple concurrent connections against the same host (afaik up to 6 in the Chrome/FF case), the browser happens to simultaneously issue requests for http://www.foo.com/page1 and http://www.foo.com/page2.
    3. The Tomcat at foo.com receives both requests at about the same time, calls getSession() for the first time (on two different threads) and lazily creates two HttpSessions and, thus, two JSESSIONIDs, with values $Page1 and $Page2. The requests also stuff data into the respective sessions (that data will be required to process subsequent requests).
    4. Assume that the browser first receives the response to the page1 request. The browser sets cookie JSESSIONID=$Page1 for host www.foo.com.
    5. Next, the response to the page2 request is received and the browser overwrites the JSESSIONID cookie for host www.foo.com with $Page2.
    6. The user clicks on something in the 'page1' iframe on foowidgets.html; the browser issues a second request to http://www.foo.com/page1?action=doSomethingStateful. That request carries JSESSIONID=$Page2 (and not $Page1, because the cookie value was overwritten).
    7. When foo.com receives this request it looks up the wrong HttpSession instance (because the JSESSIONID key is $Page2 and NOT $Page1). Foobar!

    Can the above happen? I think so, but would appreciate a confirmation. If the above is clearly possible, what are some solutions, given that we'd like to support multiple iframes per page? We don't have a firm need for the iframes to share the same HttpSession, though that would be nice. In the event that the solution still stipulates a separate HttpSession per iframe, it is, of course, mandatory that iframe 1 does not end up referencing the HttpSession state for iframe 2 instead of its own. Off the top of my head I can think of: mapping page1 and page2 to different domains (ops overhead), or using URL rewriting and never cookies (messes up analytics). Anything else? Thanks a lot, -nikita

    Read the article

  • IE6 and fieldset background color?

    - by codemonkey613
    Hey, I'm having some difficulty with CSS and IE6 compatibility. URL: http://bit.ly/dlX7cS

    Problem #1: I put a background image on the fieldset around Canada and United States. In IE6 and IE7, the background bleeds above the border-top of the fieldset. So I found a fix. It is applied only to IE browsers, and moves the legend up a few pixels, aligning the background correctly:

        <!-- Fix: IE6/IE7, Legends -->
        <!--[if lte IE 7]>
        <style type="text/css">
          fieldset { position: relative; }
          fieldset legend { position: absolute; top: -0.5em; left: 0; }
        </style>
        <![endif]-->

    This fixes IE7. But in IE6, it seems to make my legend for Canada vanish completely. Does anyone have a copy of IE6 with which they can open my site and tell me whether they see the Canada label? (I am testing with a multi-IE program, and it keeps crashing; my copy might not be accurate.) If it's not there, any suggestions on how to fix it? Also, any suggestion on where I can download a working copy of IE6?

    Problem #2: I have a Google Map embedded using an iframe. The width of that iframe is 515px. In Firefox, Chrome and IE7 that is the correct alignment, but in IE6 it gets pushed (as if by a <br/>) underneath the Just Energy paragraph beside it. It doesn't fit. I have to change the width to 513px for it to fit. Uhm, anyone know where those 2px of difference happen? I removed border, padding and margin from the iframe, but still something is happening.

        <!-- Google Maps -->
        <iframe class="gmap" src="http://maps.google.com/maps/ms?hl=en&amp;ie=UTF8&amp;msa=0&amp;msid=100146512697135839835.000481e2a2779e8865863&amp;ll=42,-100&amp;spn=20,80&amp;output=embed" frameborder="0" marginheight="0" marginwidth="0" scrolling="no"></iframe>
        <!-- / Google Maps -->

    Er, big headache. lol

    Read the article

  • html widget communicating with server

    - by Nikita Rybak
    I'm making an HTML widget for websites. Let's say it will display current stock indexes. In short, an arbitrary website owner takes a code snippet from me and includes it on his webpage http://website.com/index.html. When an arbitrary user opens http://website.com/index.html, my code sends a request to my server (provider.com), which performs the necessary operations and returns information to the user's browser. When the response has arrived, the user will see the relevant stock value on http://website.com/index.html. In index.html the service could be called like this:

        <script type="text/javascript" src="provider.com/service.js"> </script>
        <div id="target_area"></div>
        <script type="text/javascript">
          service.show("target_area", options);
        </script>

    Now, the problem is the same-origin policy: I can't just send an ajax request from website.com to provider.com and return HTML to embed in the client's webpage. I see several solutions, which I list below, but none quite satisfy me. I wonder if you could suggest something, especially if you have some relevant experience.

    1) An iframe, plain and simple. Disadvantage: it must have fixed dimensions, plus stupid scroll bars appearing in some browsers. That can be fixed with JavaScript, but all this browser-specific tinkering doesn't sound good to me.

    2) JSONP. Problem: I can't return a whole chunk of HTML, only data. Then, on the browser side, I'll have to use JavaScript to embed the data into an HTML snippet placed statically in index.html. Doesn't sound nice, because the data format is not very simple and may even change later.

    3) Use a hidden iframe to do the ajax requests. A bit tricky, but sounds like the way to go.

    Well, those are my thoughts on the subject. Are there any better ways? BTW, I tried to check some existing widgets too, but didn't find much useful information. All domain names used in this text are fictional and any resemblance is purely coincidental :)
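
    A minimal sketch of option 2 (JSONP), with hypothetical endpoint, callback and field names; the server at provider.com would have to wrap its JSON response in the callback named in the query string:

        // service.js (sketch), loaded from provider.com by the host page
        var service = {
          show: function (targetId, options) {
            var callbackName = 'svc_cb_' + Math.floor(Math.random() * 1e9);
            window[callbackName] = function (data) {
              // data is plain JSON; the markup is built on the browser side
              var el = document.getElementById(targetId);
              el.innerHTML = '<span class="stock">' + data.index + ': ' + data.value + '</span>';
              window[callbackName] = undefined;   // clean up the temporary global
            };
            var s = document.createElement('script');
            s.src = 'http://provider.com/stocks?callback=' + callbackName;
            document.getElementsByTagName('head')[0].appendChild(s);
          }
        };

        // the server's reply would then look like: svc_cb_123456789({"index": "DAX", "value": 6100.5});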

    Read the article

  • jquery ui tabs close button beneath the text

    - by Pradyut Bhattacharya
    Hi, I'm using jQuery UI tabs, with the ability to close them dynamically. On the example page here, clicking on the 'add tab' link adds tabs to the tab panel. Now, in Firefox the close buttons are displayed beneath the text of the tab, which leads to garbled text in the tab panel / the body of the tabs. How can I display them on the same line, like the other browsers do? The CSS I'm using is:

        .ui-tabs { padding: .20em; zoom: 1; }
        .ui-tabs .ui-tabs-nav { list-style: none; position: relative; padding: .2em .2em 0; height: 27px; }
        .ui-tabs .ui-tabs-nav li { position: relative; float: left; border-bottom-width: 0 !important; margin: 0 .2em -1px 0; padding: 0; font-size: 63.5%; }
        .ui-tabs .ui-tabs-nav li a { float: left; text-decoration: none; padding: .5em 1em; }
        .ui-tabs .ui-tabs-nav li.ui-tabs-selected { padding-bottom: 1px; border-bottom-width: 0; }
        .ui-tabs .ui-tabs-nav li.ui-tabs-selected a, .ui-tabs .ui-tabs-nav li.ui-state-disabled a, .ui-tabs .ui-tabs-nav li.ui-state-processing a { cursor: text; }
        .ui-tabs .ui-tabs-nav li a, .ui-tabs.ui-tabs-collapsible .ui-tabs-nav li.ui-tabs-selected a { cursor: pointer; font: 62.5%; }
        .ui-tabs .ui-tabs-panel { padding: 1em 1.4em; display: block; border-width: 0; background: black; color: white; font-size: 12px; }
        .ui-tabs .ui-tabs-hide { display: none !important; font: 62.5%; }
        #tabs .ui-tabs-nav li a:hover { float: left; text-decoration: none; padding: .5em 1em; background-color: #868472; }
        #tabs-profile .ui-tabs-nav li { position: relative; float: left; border-bottom-width: 0 !important; margin: 0 .2em -1px 0; padding: 0; font-size: 75%; }

    Please help. Thanks, Pradyut (India)

    Read the article

  • Why does filter: blur(0) still cause text to blur under Webkit?

    - by johnkavanagh
    I've come across a bug today that's taken far longer than I would like to admit to identify. Essentially, setting filter: blur(0) (or the vendor-specific -webkit-filter) on an element should, I believe, mean that no form of blur is applied. However, having tested this today, it would appear that WebKit-based browsers still blur the text within any element with either blur(0) or blur(0px) assigned to it. I've knocked together a quick Fiddle here: http://jsfiddle.net/f9rBE/

    These are three identical divs containing text (no custom fonts):

        This has absolutely nothing assigned
        Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aliquam facilisis orci in quam venenatis, in tempus ipsum sagittis. Suspendisse potenti. Donec ullamcorper lacus vel odio accumsan, vel aliquam libero tempor. Praesent nec libero venenatis, ultrices arcu non, luctus quam. Morbi scelerisque sit amet turpis sit amet tincidunt. Praesent semper erat non purus pretium consequat. Aenean et iaculis turpis. Curabitur diam tellus, consectetur non massa et, commodo venenatis metus.

    One has no styles at all assigned, the other two have blur(0) and blur(0px):

        .no-blur {}

        .zero-px-blur {
          -webkit-filter: blur(0px);
          -moz-filter: blur(0px);
          -o-filter: blur(0px);
          -ms-filter: blur(0px);
          filter: blur(0px);
        }

        .zero-blur {
          -webkit-filter: blur(0);
          -moz-filter: blur(0);
          -o-filter: blur(0);
          -ms-filter: blur(0);
          filter: blur(0);
        }

    If you preview this under Chrome/Safari you'll see that the text in the second two is still blurred. A few things worth noting:

    - this unintentional blurring occurs in Safari on iOS 7 devices (both iPhones and iPads);
    - it also occurs in Chrome and Safari under OS X;
    - it doesn't happen under Firefox on OS X.

    Of course, this isn't supported at all in Firefox just yet, so it's hard to tell whether the behaviour I'm seeing is intentional/expected, or whether this is a bug in WebKit. Is it possible that this is only prevalent on higher-density devices (i.e. retina MacBook/iPhone/iPad)? With this in mind, how do you actually override an element that has blur applied to it to set it back to non-blurred?

    Read the article

  • dojo/dijit ContentPane setting content

    - by Kitson
    I am trying to append some XML retrieved via dojo.xhrGet to a dijit.layout.ContentPane. Everything works OK in Firefox (3.6), but in Chrome I only get back 'undefined' in the particular ContentPane. My code looks something like this:

        var cp = dijit.byId("mapDetailsPane");
        cp.destroyDescendants(); // there are some existing Widgets/content I want to clear
                                 // and replace with the new content
        var xhrData = {
          url: "getsomexml.php",
          handleAs: "xml",
          preventCache: true,
          failOk: true
        };
        var deferred = new dojo.xhrGet(xhrData);
        deferred.addCallback(function(data) {
          console.log(data.firstChild);        // get a DOM object in both Firebug
                                               // and Chrome Dev Tools
          cp.attr("content", data.firstChild); // get the XML appended to the doc in Firefox,
                                               // but "undefined" in Chrome
        });

    Because in both browsers I get back a valid Document object, I know xhrGet is working fine, but there seems to be some sort of difference in how the content is being set. Is there a better way to handle the return data from the request? There was a request to see my XML, so here is part of it:

        <?xml version="1.0" encoding="UTF-8" standalone="no"?>
        <svg xmlns:svg="http://www.w3.org/2000/svg" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" version="1.1" width="672" height="1674">
          <defs>
            <style type="text/css">
              <![CDATA[ ...bunch of CSS... ]]>
            </style>
            <marker refX="0" refY="0" orient="auto" id="A00End" style="overflow: visible;">
            ...bunch more defs...
          </defs>
          <g id="endpoints">
            ...bunch of SVG with some...
            <a xlink:href="javascript:gotoLogLine(16423,55);" xlink:type="simple">...more svg...</a>
          </g>
        </svg>

    I have run the output XML through the W3C validator to verify it is valid. Like I said before, it works in Firefox 3.6. I tried it in Safari and I got the same "undefined", so it seems to be related to WebKit.
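
    One approach that may sidestep the browser difference (a hedged sketch only, since the original page can't be tested here): serialize the XML node to a markup string before handing it to the ContentPane, instead of passing the DOM node directly:

        deferred.addCallback(function (data) {
          // Turn the XML node into text first; the asker sees "undefined" in Chrome/Safari
          // when the node itself is passed to the pane.
          var markup = (typeof XMLSerializer !== "undefined")
            ? new XMLSerializer().serializeToString(data.firstChild)
            : data.firstChild.xml;   // old-IE fallback for XML DOM nodes
          cp.attr("content", markup);
        });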

    Read the article

  • CSS sliding doors technique for buttons, IE8 problem

    - by Kelvin
    Hello All! I used the sliding doors technique explained here: http://www.oscaralexander.com/tutorials/how-to-make-sexy-buttons-with-css.html, with only one exception: I decided to add one more image for the "hover" effect. My code works well in all browsers except IE8 (and maybe earlier versions): a.submit-button:active and a.submit-button:active span are simply blocked by "hover" and never work. Does anyone know a solution for this? Thanks a lot in advance!

        <style type="text/css">
          .clear { /* generic container (i.e. div) for floating buttons */
            overflow: hidden;
            width: 100%;
          }
          a.submit-button {
            background: transparent url('images/button-1b.png') no-repeat scroll top right;
            color: #fff;
            display: block;
            float: left;
            font: bold 13px sans-serif, arial;
            height: 28px;
            margin-right: 6px;
            padding-right: 18px; /* sliding doors padding */
            text-decoration: none;
            outline: none;
          }
          a.submit-button span {
            background: transparent url('images/button-1a.png') no-repeat;
            display: block;
            line-height: 14px;
            padding: 6px 0 8px 24px;
          }
          a.submit-button:hover {
            background-position: right -28px;
            outline: none; /* hide dotted outline in Firefox */
            color: #fff;
          }
          a.submit-button:hover span {
            background-position: 0px -28px;
          }
          a.submit-button:active {
            background-position: right -56px;
            color: #e5e5e5;
            outline: none;
          }
          a.submit-button:active span {
            background-position: 0px -56px;
            padding: 7px 0 7px 24px; /* push text down 1px */
          }
        </style>

    And this is the button:

        <div class="clear">
          <a class="submit-button" href="#" onclick="this.blur();"><span>Hello All</span></a>
        </div>

    Read the article

  • Why does this CSS example use "height: 1%" with "overflow: auto"?

    - by Lawrence Lau
    I am reading an HTML and CSS book. It has sample code for a two-column layout:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html>
        <head>
          <style>
            #main {height: 1%; overflow: auto;}
            #main, #header, #footer {width: 768px; margin: auto;}
            #bodycopy { float: right; width: 598px; }
            #sidebar {margin-right: 608px; }
            #footer {clear: both; }
          </style>
        </head>
        <body>
          <div id="header" style='background-color: #AAAAAA'>This is the header.</div>
          <div id="main" style='background-color: #EEEEEE'>
            <div id="bodycopy" style='background-color: #BBBBBB'>
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
              This is the principal content.<br />
            </div>
            <div id="sidebar" style='background-color: #CCCCCC'>
              This is the sidebar.
            </div>
          </div>
          <div id="footer" style='background-color: #DDDDDD'>This is the footer.</div>
        </body>
        </html>

    The author mentions that the use of overflow: auto and a 1% height will make the main area expand to encompass the computed height of its content. I tried removing the 1% height and tested in different browsers, but they don't show a difference. I am quite confused about its use. Any idea?

    Read the article

  • ASP.NET: Large number of Session_Start with same session id

    - by Jaap
    I'm running an ASP.NET website on my development box (.NET 2.0 on Vista/IIS7). The Session_Start method in global.asax.cs logs every call to a file (log4net). The Session_End method also logs every call. I'm using InProc session state and have set the session timeout to 5 minutes (to avoid waiting for 20 minutes).

    I hit the website and wait for 5 minutes until I see the Session_End logging. Then I F5 the website. The browser still has the session cookie and sends it to the server. Session_Start is called and a new session is created using the same session id (btw: I need this to be the same session id, because it is used to store data in the database).

    Result: every time I hit F5 on a previously ended session, the Session_Start method is called. When I open a different browser, the Session_Start method is called just once; then, after 5 minutes and the Session_End, each F5 causes the Session_Start method to execute. Can anyone explain why this is happening?

    Update: After the session timeout, all subsequent requests have a Session_Start and a Session_End. So in the end my question is: why are the sessions on these subsequent requests closed immediately?

        2010-02-09 14:49:08,754 INFO Global.asax[7486] [(null)] - Session started. SID=nzponumvf1hbaniverffp4mq host=127.0.0.1
        2010-02-09 14:49:08,754 INFO Global.asax[7486] [nzponumvf1hbaniverffp4mq] - Request start: GET http://localhost:80/js/settings.js
        2010-02-09 14:49:08,756 INFO Global.asax[7486] [(null)] - Session ended. SID=nzponumvf1hbaniverffp4mq
        2010-02-09 14:49:08,760 INFO Global.asax[7486] [(null)] - Session started. SID=nzponumvf1hbaniverffp4mq host=127.0.0.1
        2010-02-09 14:49:08,760 INFO Global.asax[7486] [nzponumvf1hbaniverffp4mq] - Request start: GET /css/package.aspx?name=core
        2010-02-09 14:49:08,761 INFO Global.asax[7486] [(null)] - Session ended. SID=nzponumvf1hbaniverffp4mq
        2010-02-09 14:49:08,762 INFO Global.asax[7486] [(null)] - Session started. SID=nzponumvf1hbaniverffp4mq host=127.0.0.1
        2010-02-09 14:49:08,762 INFO Global.asax[7486] [nzponumvf1hbaniverffp4mq] - Request start: GET /js/package.aspx?name=all
        2010-02-09 14:49:08,763 INFO Global.asax[7486] [(null)] - Session ended. SID=nzponumvf1hbaniverffp4mq
        2010-02-09 14:49:08,763 INFO Global.asax[7486] [(null)] - Session started. SID=nzponumvf1hbaniverffp4mq host=127.0.0.1
        2010-02-09 14:49:08,763 INFO Global.asax[7486] [nzponumvf1hbaniverffp4mq] - Request start: GET /css/package.aspx?name=rest
        2010-02-09 14:49:08,764 INFO Global.asax[7486] [(null)] - Session ended. SID=nzponumvf1hbaniverffp4mq
        2010-02-09 14:49:08,764 INFO Global.asax[7486] [(null)] - Session started. SID=nzponumvf1hbaniverffp4mq host=127.0.0.1
        2010-02-09 14:49:08,765 INFO Global.asax[7486] [nzponumvf1hbaniverffp4mq] - Request start: GET /css/package.aspx?name=vacation
        2010-02-09 14:49:08,765 INFO Global.asax[7486] [(null)] - Session ended. SID=nzponumvf1hbaniverffp4mq

    The relevant web.config section:

        <system.web>
          <compilation debug="true" />
          <sessionState timeout="2" regenerateExpiredSessionId="false" />
        </system.web>

    Read the article

  • Safari/Chrome problem with ajaxsubmit?

    - by Jan
    Hi, I'm currently having some weird issues with ajaxSubmit (http://jquery.malsup.com/form/#ajaxSubmit), which I'm using in a project. I have a flow where I need to open a form in a modal window. I'm using fancybox for that and it works like a charm. Once the form has been opened in the fancybox window, two things can happen:

    1) If the user who is about to submit the form is logged in, she should see a confirmation in the modal box that her input was successfully submitted.
    2) If the user is not logged in, a login form should be loaded once she hits the submit button.
    2.1) When the user has logged in, she should receive a confirmation in the modal box.

    This is working like a charm in Firefox, IE8 and IE7, but not in Safari or Chrome. The weird part is that it seems like Safari and Chrome are completely ignoring my ajaxSubmit form. To force the first form to be opened I use the following script; this part is working in both Safari and Chrome:

        $(".klikEnPrisForm").ajaxForm({
          success: function(data) {
            $.fancybox({'content': data});
          }
        });

    My ajaxSubmit script looks like this:

        var options = {
          url: '/?altTemplate=XmlProxyKlikEnPris',
          dataType: 'xml',
          data: $(this).serializeArray(),
          success: function(data) {
            if ($(data).find('loggetind').text() == 'true') {
              $("#klikenpris").hide();
              $('<div id="fancybox-inner-klik"></div>').appendTo('#fancybox-inner');
              $('#fancybox-inner-klik').load('/KlikEnPrisAccept?tilKvittering=1&sagsno=' + $(data).find('sagsnummer').text() + '&pris=' + $(data).find('pris').text() + '&klik-comment=' + $(data).find('kommentar').text() + '&klik-telefon=' + $(data).find('tlf').text() + '&klik-maeglerkontakt=' + $(data).find('maakontakte').text()).stop(true, true);
            } else {
              $("#klikenpris").hide();
              $("#fancybox-wrap").css({ 'width': '480px', 'height': '220px' });
              $("#fancybox-inner").css({ 'width': '460px', 'height': '220px' });
              $('<div id="fancybox-inner-klik"></div>').appendTo('#fancybox-inner');
              $('#fancybox-inner-klik').load('/login.aspx?loginklikpris=0&klikpris=1&sagsno=' + $(data).find('sagsnummer').text() + '&pris=' + $(data).find('pris').text() + '&klik-comment=' + $(data).find('kommentar').text() + '&klik-telefon=' + $(data).find('tlf').text() + '&klik-maeglerkontakt=' + $(data).find('maakontakte').text()).stop(true, true);
            }
          }
        };

        // bind to the form's submit event
        $('#klikenprisform').submit(function() {
          // inside event callbacks 'this' is the DOM element so we first
          // wrap it in a jQuery object and then invoke ajaxSubmit
          $(this).ajaxSubmit(options);
          // !!! Important !!!
          // always return false to prevent standard browser submit and page navigation
          return false;
        });

    I have tried inserting an alert in the success callback function, but it seems it is never called. It seems like the default action is not being overruled by the URL given in the ajaxSubmit options. I'm really puzzled about this, since it's working nicely in other browsers, and I'm completely lost on how I should approach debugging this in Safari/Chrome. I hope all the above makes sense and I'm looking forward to hearing any suggestions. Cheers!

    Read the article

  • Is this asking too much of a browser?

    - by Matt Ball
    I'm embedding a large array in <script> tags in my HTML, like this (nothing surprising):

        <script>
          var largeArray = [/* lots of stuff in here */];
        </script>

    In this particular example, the array has 210,000 elements. That's well below the theoretical maximum of 2^31, by 4 orders of magnitude. Here's the fun part: if I save the JS source for the array to a file, that file is 44 megabytes (46,573,399 bytes, to be exact). If you want to see for yourself, you can download it from my Dropbox. (All the data in there is canned, so much of it is repeated. This will not be the case in production.)

    Now, I'm really not concerned about serving that much data. My server gzips its responses, so it really doesn't take all that long to get the data over the wire. However, there is a really nasty tendency for the page, once loaded, to crash the browser. I'm not testing at all in IE (this is an internal tool). My primary targets are Chrome 8 and Firefox 3.6. In Firefox, I can see a reasonably useful error in the console:

        Error: script stack space quota is exhausted

    In Chrome, I simply get the sad-tab page.

    Cut to the chase, already: Is this really too much data for our modern, "high-performance" browsers to handle? Is there anything I can do* to gracefully handle this much data? Incidentally, I was able to get this to work (read: not crash the tab) on-and-off in Chrome. I really thought that Chrome, at least, was made of tougher stuff, but apparently I was wrong...

    Edit 1
    @Crayon: I wasn't looking to justify why I'd like to dump this much data into the browser at once. Short version: either I solve this one (admittedly not-that-easy) problem, or I have to solve a whole slew of other problems. I'm opting for the simpler approach for now.
    @various: right now, I'm not especially looking for ways to actually reduce the number of elements in the array. I know I could implement Ajax paging or what-have-you, but that introduces its own set of problems for me in other regards.
    @Phrogz: each element looks something like this:

        {dateTime:new Date(1296176400000), terminalId:'terminal999',
         'General___BuildVersion':'10.05a_V110119_Beta', 'SSM___ExtId':26680,
         'MD_CDMA_NETLOADER_NO_BCAST___Valid':'false', 'MD_CDMA_NETLOADER_NO_BCAST___PngAttempt':0}

    @Will: but I have a computer with a 4-core processor, 6 gigabytes of RAM, over half a terabyte of disk space ...and I'm not even asking for the browser to do this quickly; I'm just asking for it to work at all!

    *other than the obvious: sending less data to the browser
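
    One mitigation sometimes suggested for this kind of crash (a sketch only, not a verified fix for this exact page; the endpoint and the initGrid consumer are hypothetical): ship the 210,000 elements as JSON and parse them with JSON.parse, so the browser treats them as data rather than evaluating a 44 MB script literal. The dateTime values would then need to travel as plain timestamps and be revived after parsing.

        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/data/largeArray.json', true);       // hypothetical endpoint returning plain JSON
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            var largeArray = JSON.parse(xhr.responseText);    // parsed as data, not evaluated as script
            for (var i = 0; i < largeArray.length; i++) {
              largeArray[i].dateTime = new Date(largeArray[i].dateTime); // revive timestamps
            }
            initGrid(largeArray);                             // hypothetical consumer of the data
          }
        };
        xhr.send(null);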

    Read the article

  • Google Chrome audit on caching

    - by Álvaro G. Vicario
    If I run an audit on my sites with Google Chrome, I get this message in the "Leverage browser caching" section:

        The following resources are missing a cache expiration. Resources that do not specify an expiration may not be cached by browsers:

    A list of all the pictures follows. I get a similar notice in "Leverage proxy caching":

        Consider adding a "Cache-Control: public" header to the following resources:

    Apart from pictures, I also get a notice about HTML, CSS and JavaScript files:

        The following resources are explicitly non-cacheable. Consider making them cacheable if possible:

    It's funny because I've worked hard to cache all static content (except for pictures, where I just left Apache's default settings). Firefox does indeed store all these items in its cache. Is there anything I should improve in my HTTP headers? Here's the complete header set of some items as loaded after clearing the browser cache. Pictures use default settings I didn't really check before; the rest should be cached for three hours. I can set headers with both .htaccess and PHP.

    PNG

        HTTP/1.1 200 OK
        Date: Sat, 31 Jul 2010 12:46:14 GMT
        Server: Apache
        Last-Modified: Thu, 18 Mar 2010 21:40:54 GMT
        Etag: "c48024-230-4821a15d6c580"
        Accept-Ranges: bytes
        Content-Length: 560
        Keep-Alive: timeout=4
        Connection: Keep-Alive
        Content-Type: image/png

    HTML

        HTTP/1.1 200 OK
        Date: Sat, 31 Jul 2010 12:46:13 GMT
        Server: Apache
        X-Powered-By: PHP/5.2.11
        Expires: Sat, 31 Jul 2010 15:46:13 GMT
        Cache-Control: max-age=10800, s-maxage=10800, must-revalidate, proxy-revalidate
        Content-Encoding: gzip
        Vary: Accept-Encoding
        Last-Modified: Wed, 24 Mar 2010 20:30:36 GMT
        Keep-Alive: timeout=4
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/html; charset=ISO-8859-15

    CSS

        HTTP/1.1 200 OK
        Date: Sat, 31 Jul 2010 12:48:21 GMT
        Server: Apache
        X-Powered-By: PHP/5.2.11
        Expires: Sat, 31 Jul 2010 15:48:21 GMT
        Cache-Control: max-age=10800, s-maxage=10800, must-revalidate, proxy-revalidate
        Content-Encoding: gzip
        Vary: Accept-Encoding
        Last-Modified: Thu, 18 Mar 2010 21:40:12 GMT
        Keep-Alive: timeout=4
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/css

    JavaScript

        HTTP/1.1 200 OK
        Date: Sat, 31 Jul 2010 12:48:21 GMT
        Server: Apache
        X-Powered-By: PHP/5.2.11
        Expires: Sat, 31 Jul 2010 15:48:21 GMT
        Cache-Control: max-age=10800, s-maxage=10800, must-revalidate, proxy-revalidate
        Content-Encoding: gzip
        Vary: Accept-Encoding
        Last-Modified: Thu, 18 Mar 2010 21:40:12 GMT
        Keep-Alive: timeout=4
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: application/x-javascript

    Update: I've tested Jumby's suggestion and set my CSS's expiry to 1 year:

        Cache-Control:max-age=31536000, s-maxage=31536000, must-revalidate, proxy-revalidate
        Connection:Keep-Alive
        Content-Encoding:gzip
        Content-Length:4198
        Content-Type:text/css
        Date:Mon, 02 Aug 2010 20:48:56 GMT
        Expires:Tue, 02 Aug 2011 20:48:56 GMT
        Keep-Alive:timeout=5, max=99
        Last-Modified:Thu, 18 Mar 2010 20:40:12 GMT
        Server:Apache/2.2.14 (Win32) PHP/5.3.1
        Vary:Accept-Encoding
        X-Powered-By:PHP/5.3.1

    However, Chrome still claims "explicitly non-cacheable".

    Read the article

  • Web Safe Area (optimal resolution) for web app design

    - by M.A.X
    I'm in the process of designing a new web app and I'm wondering what 'web safe area' I should optimize the app layout and design for. I did some investigation and thinking on my own, but wanted to share this to see what the general opinion is. Here is what I found.

    Optimal display resolution:

    - w3schools web stats seems to be the most referenced source (however, they state that these are results from their own site and are biased towards tech-savvy users)
    - http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services)
    - StatCounter Global Stats Display Resolution (stats are based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected from across the StatCounter network of more than 3 million websites)
    - NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live-stats customers. The data is compiled from approximately 160 million visitors per month)

    Display resolution summary: There is a bit of variation between the above sources, but in general, as of Jan 2011 it looks like 1024x768 is about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these with 15-20% of the total web, depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers. My recommendation would be to optimize around the 1280x768 constraint (*note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space-consuming) and the OS (assuming WinXP-Win7):

    - Win7 has the biggest taskbar footprint at a height of 40px (XP's and Vista's is 30px).
    - The default IE8 view uses up 25px at the bottom of the screen with the status bar, and a further 120px at the top of the screen with the window title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would be 91px without the favorites toolbar).
    - Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline.

    This means that we are left with 583px of vertical space and 1276px of horizontal. In other words, a web safe area of 1276 x 583. Is this a correct line of thinking? I tried to Google some design best practices, but most still talk about designing around 1024x768, which seems to be quickly disappearing. Any help on this would be greatly appreciated! Thanks.

    Read the article

  • CSS: Chrome and Safari seem to 'add' border to width, while IE, Firefox & Opera don't

    - by Michiel
    Hey guys, I'm trying to achieve cross-browser consistency for my website, but I have been trying all day now and it's driving me nuts (0.38 am here in Europe now..). It's about this page: http://www[insert-dot-here]geld[insert-dash-here]surfen[insert-dot-here]nl/uitbetalingen.html (please note that I prefer this URL not to be made crawlable for seo-bots).

    If you view this page in IE, Firefox or Opera, everything is fine, but in Chrome and Safari the tables are a little out of line (as you'll probably clearly notice). What seems to be the problem: it appears to me that in Chrome and Safari the left and right borders (2px in total) are added to the set table width, while in the other browsers the border is considered part of the width. The (most) relevant CSS lines are the following ones (from the tabel.css file, also available through the page's source):

        table.uitbetaling {
          margin: 11px 18px 10px 19px;
          border: 1px solid #8ccaee;
          width: 498px;
          padding: 0;
        }

        table.uitbetaling img, table.uitbetaling td {
          margin: 0;
          border: 0;
          padding: 0;
          width: 496px;
        }

        table.uitbetaling tr {
          margin: 0;
          border: 0;
          padding: 0 1px 0 0;
        }

    So basically I have used a table structure to organize images, like this (the class of the table is 'uitbetaling'):

        <table>
          <tr><td><img /></td></tr>
          <tr><td><img /></td></tr>
          ...
          <tr><td><img /></td></tr>
        </table>

    If I set the width of 'table.uitbetaling' and 'table.uitbetaling img, table.uitbetaling td' to the same value (e.g. both 496 or 498), the 'problem' in Chrome and Safari is solved; however, in Firefox the right-side border is then blank, because the right-side border can't 'fit' in anymore. 'img' and 'td' must be at least 2px narrower than 'table.uitbetaling' for the right-side border to be visible in Firefox. Is there any way to solve this? Thanks so much in advance for your insights!!

    Read the article
