Search Results

Search found 57458 results on 2299 pages for 'http response codes'.


  • How to restrict http access to video files?

    - by Tharases
    I want to let only the "right" people watch those videos. In other words, only registered users that are allowed (by other users, i.e. friends) should see the videos. I have a hard restriction on CPU usage in my shared environment, so I can use things like PHP's readfile().
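    One common way to keep the access check in PHP while letting the web server do the heavy lifting is an internal-redirect header. The sketch below assumes Apache with the third-party mod_xsendfile module (the nginx equivalent is X-Accel-Redirect); user_may_view() and the storage paths are placeholders, not existing code.

      <?php
      // video.php - minimal sketch; user_may_view() and the paths are assumptions.
      session_start();
      $video = basename($_GET['v']);                    // basename() blocks path traversal
      if (!isset($_SESSION['user_id']) || !user_may_view($_SESSION['user_id'], $video)) {
          header('HTTP/1.1 403 Forbidden');
          exit;
      }
      header('Content-Type: video/mp4');
      header('X-Sendfile: /var/private/videos/' . $video);   // Apache streams the file, not PHP
      // nginx variant: header('X-Accel-Redirect: /protected/' . $video);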

    Read the article

  • Interrupted Upgrade from 11.10 to 12.04

    - by Tamil
    My upgrade using alternative iso from 11.10 to 12.04 got interrupted and I had to hard restart my machine. Now I feel that everything is recovered except my already installed packages like vim. How do I backup my home folder for fresh installation of ubuntu? Following are the errors I'm facing I couldn't mark any package for re-installation in synaptic or remove and install too. output of sudo apt-get install vim Building dependency tree Reading state information... Done Package vim is not available, but is referred to by another package. This may mean that the package is missing, has been obsoleted, or is only available from another source E: Package 'vim' has no installation candidate If I try installing it from synaptic I get apache2.2-common: Package apache2.2-common has no available version, but exists in the database. This typically means that the package was mentioned in a dependency and never uploaded, has been obsoleted or is not available with the contents of sources.list my sources.list file # added by the release upgrader # deb cdrom:[Ubuntu 12.04.1 LTS _Precise Pangolin_ - Release amd64 (20120822.4)]/ precise main restricted # added by the release upgrader # # deb cdrom:[Ubuntu 12.04.1 LTS _Precise Pangolin_ - Release amd64 (20120822.4)]/ precise main restricted # deb cdrom:[Ubuntu 11.04 _Natty Narwhal_ - Release amd64 (20110427.1)]/ natty main restricted # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://archive.ubuntu.com/ubuntu precise main restricted deb-src http://archive.ubuntu.com/ubuntu precise main restricted ## Major bug fix updates produced after the final release of the ## distribution. deb http://archive.ubuntu.com/ubuntu precise-updates main restricted deb-src http://archive.ubuntu.com/ubuntu precise-updates main restricted ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://archive.ubuntu.com/ubuntu precise universe deb-src http://archive.ubuntu.com/ubuntu precise universe deb http://archive.ubuntu.com/ubuntu precise-updates universe deb-src http://archive.ubuntu.com/ubuntu precise-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://archive.ubuntu.com/ubuntu precise multiverse deb-src http://archive.ubuntu.com/ubuntu precise multiverse deb http://archive.ubuntu.com/ubuntu precise-updates multiverse deb-src http://archive.ubuntu.com/ubuntu precise-updates multiverse ## Uncomment the following two lines to add software from the 'backports' ## repository. ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. 
# deb http://us.archive.ubuntu.com/ubuntu/ natty-backports main restricted universe multiverse # deb-src http://us.archive.ubuntu.com/ubuntu/ natty-backports main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu precise-security main restricted deb-src http://archive.ubuntu.com/ubuntu precise-security main restricted deb http://archive.ubuntu.com/ubuntu precise-security universe deb-src http://archive.ubuntu.com/ubuntu precise-security universe deb http://archive.ubuntu.com/ubuntu precise-security multiverse deb-src http://archive.ubuntu.com/ubuntu precise-security multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. deb http://archive.canonical.com/ubuntu precise partner # deb-src http://archive.canonical.com/ubuntu natty partner ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. deb http://extras.ubuntu.com/ubuntu precise main deb-src http://extras.ubuntu.com/ubuntu precise main # deb http://tamil.3758_gmail.com:[email protected]/free unstable main # disabled on upgrade to oneiric # deb http://debian.datastax.com/natty oneiric main # disabled on upgrade to oneiric sudo apt-get update Err http://archive.ubuntu.com precise InRelease Err http://archive.canonical.com precise InRelease Err http://archive.ubuntu.com precise-updates InRelease Err http://archive.ubuntu.com precise-security InRelease Err http://extras.ubuntu.com precise InRelease Err http://archive.canonical.com precise Release.gpg Unable to connect to 172.16.140.249:3142: Err http://archive.ubuntu.com precise Release.gpg Unable to connect to 172.16.140.249:3142: Err http://archive.ubuntu.com precise-updates Release.gpg Unable to connect to 172.16.140.249:3142: Err http://extras.ubuntu.com precise Release.gpg Unable to connect to 172.16.140.249:3142: Err http://archive.ubuntu.com precise-security Release.gpg Unable to connect to 172.16.140.249:3142: W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise/InRelease
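    A hedged sketch of where I would start, based on the "Unable to connect to 172.16.140.249:3142" lines above (3142 is the default apt-cacher-ng port, so a stale APT proxy setting is the likely suspect); the 01proxy file name and the backup mount point are assumptions:

      # find and remove the stale proxy entry, then retry the update
      grep -r "172.16.140.249" /etc/apt/apt.conf /etc/apt/apt.conf.d/ 2>/dev/null
      sudo sed -i '/172.16.140.249:3142/d' /etc/apt/apt.conf.d/01proxy
      sudo apt-get update

      # back up the home folder before a fresh install (backup drive mounted at /media/backup assumed)
      rsync -aAX --progress /home/$USER/ /media/backup/home-$USER/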

    Read the article

  • Lighttpd 403 Errors on HTML and PHP pages

    - by Brian
    I installed lighttpd on CentOS 5.5 64-bit. Everything seems fine and running except I cannot get past 403 errors on both HTML and PHP pages. I have used CHMOD and CHOWN, changed ownership in the config file, done everything possible and have been stuck for 2 days. Appreciate any help, and here's hoping to a stupid error on my part. Here is the log file with debug options on: 2011-02-21 11:23:13: (request.c.304) fd: 7 request-len: 408 GET /index.html HTTP/1.1 Host: 10.0.1.8 User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: en-us,en;q=0.5 Accept-Encoding: gzip,deflate Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive: 115 Connection: keep-alive Cache-Control: max-age=0 2011-02-21 11:23:13: (response.c.241) run condition 2011-02-21 11:23:13: (response.c.300) -- splitting Request-URI 2011-02-21 11:23:13: (response.c.301) Request-URI : /index.html 2011-02-21 11:23:13: (response.c.302) URI-scheme : http 2011-02-21 11:23:13: (response.c.303) URI-authority: 10.0.1.8 2011-02-21 11:23:13: (response.c.304) URI-path : /index.html 2011-02-21 11:23:13: (response.c.305) URI-query : 2011-02-21 11:23:13: (response.c.349) -- sanatising URI 2011-02-21 11:23:13: (response.c.350) URI-path : /index.html 2011-02-21 11:23:13: (response.c.470) -- before doc_root 2011-02-21 11:23:13: (response.c.471) Doc-Root : /srv/www/lighttpd 2011-02-21 11:23:13: (response.c.472) Rel-Path : /index.html 2011-02-21 11:23:13: (response.c.473) Path : 2011-02-21 11:23:13: (response.c.521) -- after doc_root 2011-02-21 11:23:13: (response.c.522) Doc-Root : /srv/www/lighttpd 2011-02-21 11:23:13: (response.c.523) Rel-Path : /index.html 2011-02-21 11:23:13: (response.c.524) Path : /srv/www/lighttpd/index.html 2011-02-21 11:23:13: (response.c.541) -- logical -> physical 2011-02-21 11:23:13: (response.c.542) Doc-Root : /srv/www/lighttpd 2011-02-21 11:23:13: (response.c.543) Rel-Path : /index.html 2011-02-21 11:23:13: (response.c.544) Path : /srv/www/lighttpd/index.html 2011-02-21 11:23:13: (response.c.561) -- handling physical path 2011-02-21 11:23:13: (response.c.562) Path : /srv/www/lighttpd/index.html 2011-02-21 11:23:13: (response.c.608) -- access denied 2011-02-21 11:23:13: (response.c.609) Path : /srv/www/lighttpd/index.html 2011-02-21 11:23:13: (response.c.128) Response-Header: HTTP/1.1 403 Forbidden Content-Type: text/html Content-Length: 345 Date: Mon, 21 Feb 2011 16:23:13 GMT Server: lighttpd/1.4.28 Here is the directory listing. I used CHOWN to set to lighttpd:lighttpd [root@localhost lighttpd]# ls -al total 40 drwxrwxrwx 2 lighttpd lighttpd 4096 Feb 21 10:48 . drwxrwxrwx 3 lighttpd lighttpd 4096 Feb 21 10:57 .. -rwxrwxrwx 1 lighttpd lighttpd 10 Feb 20 08:32 index.html -rwxrwxrwx 1 lighttpd lighttpd 20 Feb 21 10:48 index.php -rwxrwxrwx 1 lighttpd lighttpd 20 Feb 21 10:39 info.php [root@localhost lighttpd]# Requested Commands: [root@localhost lighttpd]# ls -ld / /srv /srv/www drwxr-xr-x 22 root root 4096 Feb 21 04:39 / drwxrwxrwx 3 lighttpd lighttpd 4096 Feb 20 07:38 /srv drwxrwxrwx 3 lighttpd lighttpd 4096 Feb 21 10:57 /srv/www [root@localhost lighttpd]# ps auxZ | grep lighttpd root:system_r:httpd_t lighttpd 3842 0.0 0.2 48368 896 ? S 12:24 0:00 /usr/sbin/lighttpd -f /etc/lighttpd/lighttpd.conf root:system_r:unconfined_t:SystemLow-SystemHigh root 3845 0.0 0.2 61152 764 pts/0 R+ 12:24 0:00 grep lighttpd
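    The ps output above shows lighttpd running confined in the SELinux httpd_t domain, so on CentOS a wrong file context on the docroot is a likely cause of 403s even with 777 permissions. A sketch of how to check and relabel (directory path taken from the config above):

      ls -Zd /srv/www/lighttpd /srv/www/lighttpd/index.html   # look for httpd_sys_content_t
      sudo chcon -R -t httpd_sys_content_t /srv/www/lighttpd
      # or rule SELinux in or out temporarily:
      sudo setenforce 0    # if the 403s disappear, labeling (or an SELinux boolean) was the cause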

    Read the article

  • Apache HTTP Server+Tomcat: Which file generates mod_jk.conf, how to modify generated stuff, and how does httpd reach it?

    - by Sk8erPeter
    I'm using XAMPP with Apache HTTP Server and the Tomcat add-on installed. There's a default mod_jk.conf which is generated by Tomcat when it starts. But which file generates this mod_jk.conf file? How can I modify the default values? By default, it looks like this: pastebin - mod_jk.conf. How does Apache HTTP Server reach this file? I can't see any reference to it when looking into httpd.conf. When I put a VirtualHost in my httpd.conf file and add the line JkMount /* ajp13 to it, the Apache HTTP Server service can't start (it causes a 7024 event ID error in Event Viewer, with error code 1 but nothing specific, and puts no error messages into error.log). The VirtualHost looks like this: pastebin - VirtualHost + JkMount. This way Apache HTTP Server cannot start. If I comment out the line JkMount /* ajp13, it starts without a problem. BUT if I put the following lines, which are the same as in mod_jk.conf, before the mentioned VirtualHost again, the service can start!
      <IfModule !mod_jk.c>
          LoadModule jk_module "C:/xampp/tomcat/xampp/apache/modules/mod_jk.so"
      </IfModule>
    Why do I have to put this in again? Why is it that http://localhost/example works (the request is redirected to AJP13) only when I put the LoadModule line in again in another file? EDIT: I don't have a clue why, I surely modified something, but now /example doesn't work either... And the config above gives a 500 Internal Server Error... :S Thanks!
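    The behaviour described above is what you'd expect if httpd.conf never loads mod_jk itself: JkMount is only a recognized directive after LoadModule jk_module has run, and the generated mod_jk.conf is only merged in if something Include-s it. A sketch of a self-contained httpd.conf arrangement (module path and worker name follow the XAMPP layout mentioned above; the workers.properties and log locations are assumptions):

      # load the connector once, globally, before any JkMount is evaluated
      LoadModule jk_module "C:/xampp/tomcat/xampp/apache/modules/mod_jk.so"
      JkWorkersFile "C:/xampp/tomcat/conf/workers.properties"
      JkLogFile "C:/xampp/apache/logs/mod_jk.log"

      <VirtualHost *:80>
          ServerName localhost
          JkMount /examples/* ajp13
      </VirtualHost>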

    Read the article

  • How do I turn on basic HTTP-auth for a page in jboss?

    - by Electrons_Ahoy
    I'm setting up a jboss server for testing some java code that talks to http servers. That's pretty easy. However one of the things I'm testing is interfacing with classic "old-school" HTTP-Auth protected pages, and for the life of me I can't figure out how to turn that on in jboss (and my google-fu seems to have let me down.) So, how do I add a basic username and password to a single html (or jsp) file in jboss using http Basic Access Authentication?
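    Basic authentication is configured in the web application rather than in JBoss itself: a security-constraint plus a BASIC login-config in WEB-INF/web.xml, with users and roles coming from whatever security realm the server maps the realm name to (in older JBoss releases typically a UsersRolesLoginModule backed by users.properties and roles.properties, wired up through jboss-web.xml). A minimal web.xml sketch with placeholder names:

      <security-constraint>
          <web-resource-collection>
              <web-resource-name>Protected page</web-resource-name>
              <url-pattern>/secret.jsp</url-pattern>
          </web-resource-collection>
          <auth-constraint>
              <role-name>testrole</role-name>
          </auth-constraint>
      </security-constraint>
      <login-config>
          <auth-method>BASIC</auth-method>
          <realm-name>TestRealm</realm-name>
      </login-config>
      <security-role>
          <role-name>testrole</role-name>
      </security-role>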

    Read the article

  • Can I get my http back in the address bar in Firefox?

    - by Abel
    Since Firefox 7 (or 6?) they silently removed "http" from the address bar. Previously, when you clicked it, "http" showed up again, allowing you to easily change http into https. So far I have found two ways to get it back, but isn't there a setting I can use to have it permanently visible again, or at least when I click it?
    Way 1: copy the selection and paste it again
    Way 2: type the whole scheme again in front of the address ...

    Read the article

  • Google is re-indexing pages after redirecting URLs from HTTP to HTTPS incorrectly

    - by SLIM
    I upgraded my site so that all pages have gone from using HTTP to HTTPS. I didn't consider that Google treats HTTPS pages differently than HTTP. I recreated my sitemap so that all links now reflect the new HTTPS URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the HTTPS pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new HTTPS pages. The problem is that Google sees all of these as brand new pages when many of them have a long HTTP history. Of course most of you will recognize the problem: I didn't set up a 301 from the old HTTP URLs to the new HTTPS URLs. Is it too late to do this? Should I switch my sitemap back to HTTP URLs and then 301 redirect to the new HTTPS URLs? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure if Google is trying to reach the HTTP site anymore. Currently the site is doing 303 redirects (from HTTP to HTTPS), although I haven't figured out why yet.
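    It isn't too late: replacing the 303s with permanent redirects lets Google consolidate the old HTTP history onto the HTTPS URLs, and the sitemap can stay on HTTPS. A minimal sketch for Apache with mod_rewrite (adjust if the site runs on another server):

      RewriteEngine On
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]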

    Read the article

  • Cookie access within a HTTP Class

    - by James Jeffery
    I have a HTTP class that has a Get, and Post, method. It's a simple class I created to encapsulate Post and Get requests so I don't have to repeat the get/post code throughout the application. In C#: class HTTP { private CookieContainer cookieJar; private String userAgent = "..."; public HTTP() { this.cookieJar = new CookieContainer(); } public String get(String url) { // Make get request. Return the JSON } public String post(String url, String postData) { // Make post request. Return the JSON } } I've made the CookieJar a property because I want to preserve the cookie values throughout the session. If the user is logged into Twitter with my application, each request I make (be it get or post) I want to use the cookies so they remain logged in. That's the basics of it anyway. But, I don't want to return a string in all instances. Sometimes I may want the cookie, or a header value, or something else from the request. Ideally I'd like to be able to do this in my code: Cookie cookie = http.get("http://google.com").cookie("g_user"); String g_user = cookie.value; or String source = http.get("http://google.com").body; My question - To do this, would I need to have a Get class, and a Post class, that are included within the HTTP class and are accessible via accessors? Within the Get and Post class I would then have the Cookie method, and the body property, and whatever else is needed. Should I also use an interface, or create a Request class and have Post and Get extend it so that common methods and properties are available to both classes? Or, am I thinking totally wrong?
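    One way to get the chained syntax above without separate Get and Post classes is to have both methods return a small result object that exposes the body and the response cookies. A minimal sketch (class and property names are placeholders, not an existing API):

      using System.IO;
      using System.Net;

      class HttpResult
      {
          public string Body { get; set; }
          public CookieCollection Cookies { get; set; }
          public Cookie Cookie(string name) { return Cookies[name]; }
      }

      class Http
      {
          private readonly CookieContainer cookieJar = new CookieContainer();

          public HttpResult Get(string url)
          {
              var req = (HttpWebRequest)WebRequest.Create(url);
              req.CookieContainer = cookieJar;            // session cookies persist across calls
              using (var resp = (HttpWebResponse)req.GetResponse())
              using (var reader = new StreamReader(resp.GetResponseStream()))
              {
                  return new HttpResult { Body = reader.ReadToEnd(), Cookies = resp.Cookies };
              }
          }
          // Post(url, postData) would build the same HttpResult after writing the request body.
      }

    With that in place, string g_user = http.Get("http://google.com").Cookie("g_user").Value; and string source = http.Get("http://google.com").Body; both read naturally, and a shared base class or interface only becomes worthwhile once Get and Post stop sharing this common result shape.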

    Read the article

  • URL Rewrite – Protocol (http/https) in the Action

    - by OWScott
    IIS URL Rewrite supports server variables for pretty much every part of the URL and http header. However, there is one commonly used server variable that isn’t readily available.  That’s the protocol—HTTP or HTTPS. You can easily check if a page request uses HTTP or HTTPS, but that only works in the conditions part of the rule.  There isn’t a variable available to dynamically set the protocol in the action part of the rule.  What I wish is that there would be a variable like {HTTP_PROTOCOL} which would have a value of ‘HTTP’ or ‘HTTPS’.  There is a server variable called {HTTPS}, but the values of ‘on’ and ‘off’ aren’t practical in the action.  You can also use {SERVER_PORT} or {SERVER_PORT_SECURE}, but again, they aren’t useful in the action. Let me illustrate.  The following rule will redirect traffic for http(s)://localtest.me/ to http://www.localtest.me/. <rule name="Redirect to www"> <match url="(.*)" /> <conditions> <add input="{HTTP_HOST}" pattern="^localtest\.me$" /> </conditions> <action type="Redirect" url="http://www.localtest.me/{R:1}" /> </rule> The problem is that it forces the request to HTTP even if the original request was for HTTPS. Interestingly enough, I planned to blog about this topic this week when I noticed in my twitter feed yesterday that Jeff Graves, a former colleague of mine, just wrote an excellent blog post about this very topic.  He beat me to the punch by just a couple days.  However, I figured I would still write my blog post on this topic.  While his solution is a excellent one, I personally handle this another way most of the time.  Plus, it’s a commonly asked question that isn’t documented well enough on the web yet, so having another article on the web won’t hurt. I can think of four different ways to handle this, and depending on your situation you may lean towards any of the four.  Don’t let the choices overwhelm you though.  Let’s keep it simple, Option 1 is what I use most of the time, Option 2 is what Jeff proposed and is the safest option, and Option 3 and Option 4 need only be considered if you have a more unique situation.  All four options will work for most situations. Option 1 – CACHE_URL, single rule There is a server variable that has the protocol in it; {CACHE_URL}.  This server variable contains the entire URL string (e.g. http://www.localtest.me:80/info.aspx?id=5)  All we need to do is extract the HTTP or HTTPS and we’ll be set. This tends to be my preferred way to handle this situation. Indeed, Jeff did briefly mention this in his blog post: … you could use a condition on the CACHE_URL variable and a back reference in the rewritten URL. The problem there is that you then need to match all of the conditions which could be a problem if your rule depends on a logical “or” match for conditions. Thus the problem.  If you have multiple conditions set to “Match Any” rather than “Match All” then this option won’t work.  However, I find that 95% of all rules that I write use “Match All” and therefore, being the lazy administrator that I am I like this simple solution that only requires adding a single condition to a rule.  The caveat is that if you use “Match Any” then you must consider one of the next two options. Enough with the preamble.  Here’s how it works.  Add a condition that checks for {CACHE_URL} with a pattern of “^(.+)://” like so: How you have a back-reference to the part before the ://, which is our treasured HTTP or HTTPS.  
In URL Rewrite 2.0 or greater you can check the “Track capture groups across conditions”, make that condition the first condition, and you have yourself a back-reference of {C:1}. The “Redirect to www” example with support for maintaining the protocol, will become: <rule name="Redirect to www" stopProcessing="true"> <match url="(.*)" /> <conditions trackAllCaptures="true"> <add input="{CACHE_URL}" pattern="^(.+)://" /> <add input="{HTTP_HOST}" pattern="^localtest\.me$" /> </conditions> <action type="Redirect" url="{C:1}://www.localtest.me/{R:1}" /> </rule> It’s not as easy as it would be if Microsoft gave us a built-in {HTTP_PROTOCOL} variable, but it’s pretty close. I also like this option since I often create rule examples for other people and this type of rule is portable since it’s self-contained within a single rule. Option 2 – Using a Rewrite Map For a safer rule that works for both “Match Any” and “Match All” situations, you can use the Rewrite Map solution that Jeff proposed.  It’s a perfectly good solution with the only drawback being the ever so slight extra effort to set it up since you need to create a rewrite map before you create the rule.  In other words, if you choose to use this as your sole method of handling the protocol, you’ll be safe. After you create a Rewrite Map called MapProtocol, you can use “{MapProtocol:{HTTPS}}” for the protocol within any rule action.  Following is an example using a Rewrite Map. <rewrite> <rules> <rule name="Redirect to www" stopProcessing="true"> <match url="(.*)" /> <conditions trackAllCaptures="false"> <add input="{HTTP_HOST}" pattern="^localtest\.me$" /> </conditions> <action type="Redirect" url="{MapProtocol:{HTTPS}}://www.localtest.me/{R:1}" /> </rule> </rules> <rewriteMaps> <rewriteMap name="MapProtocol"> <add key="on" value="https" /> <add key="off" value="http" /> </rewriteMap> </rewriteMaps> </rewrite> Option 3 – CACHE_URL, Multi-rule If you have many rules that will use the protocol, you can create your own server variable which can be used in subsequent rules. This option is no easier to set up than Option 2 above, but you can use it if you prefer the easier to remember syntax of {HTTP_PROTOCOL} vs. {MapProtocol:{HTTPS}}. The potential issue with this rule is that if you don’t have access to the server level (e.g. in a shared environment) then you cannot set server variables without permission. First, create a rule and place it at the top of the set of rules.  You can create this at the server, site or subfolder level.  However, if you create it at the site or subfolder level then the HTTP_PROTOCOL server variable needs to be approved at the server level.  This can be achieved in IIS Manager by navigating to URL Rewrite at the server level, clicking on “View Server Variables” from the Actions pane, and added HTTP_PROTOCOL. If you create the rule at the server level then this step is not necessary.  Following is an example of the first rule to create the HTTP_PROTOCOL and then a rule that uses it.  The Create HTTP_PROTOCOL rule only needs to be created once on the server. 
<rule name="Create HTTP_PROTOCOL"> <match url=".*" /> <conditions logicalGrouping="MatchAll" trackAllCaptures="false"> <add input="{CACHE_URL}" pattern="^(.+)://" /> </conditions> <serverVariables> <set name="HTTP_PROTOCOL" value="{C:1}" /> </serverVariables> <action type="None" /> </rule>   <rule name="Redirect to www" stopProcessing="true"> <match url="(.*)" /> <conditions logicalGrouping="MatchAll" trackAllCaptures="false"> <add input="{HTTP_HOST}" pattern="^localtest\.me$" /> </conditions> <action type="Redirect" url="{HTTP_PROTOCOL}://www.localtest.me/{R:1}" /> </rule> Option 4 – Multi-rule Just to be complete I’ll include an example of how to achieve the same thing with multiple rules. I don’t see any reason to use it over the previous examples, but I’ll include an example anyway.  Note that it will only work with the “Match All” setting for the conditions. <rule name="Redirect to www - http" stopProcessing="true"> <match url="(.*)" /> <conditions logicalGrouping="MatchAll" trackAllCaptures="false"> <add input="{HTTP_HOST}" pattern="^localtest\.me$" /> <add input="{HTTPS}" pattern="off" /> </conditions> <action type="Redirect" url="http://www.localtest.me/{R:1}" /> </rule> <rule name="Redirect to www - https" stopProcessing="true"> <match url="(.*)" /> <conditions logicalGrouping="MatchAll" trackAllCaptures="false"> <add input="{HTTP_HOST}" pattern="^localtest\.me$" /> <add input="{HTTPS}" pattern="on" /> </conditions> <action type="Redirect" url="https://www.localtest.me/{R:1}" /> </rule> Conclusion Above are four working examples of methods to call the protocol (HTTP or HTTPS) from the action of a URL Rewrite rule.  You can use whichever method you most prefer.  I’ve listed them in the order that I favor them, although I could see some people preferring Option 2 as their first choice.  In any of the cases, hopefully you can use this as a reference for when you need to use the protocol in the rule’s action when writing your URL Rewrite rules. Further information: Viewing all Server Variable for a site. URL Parts available to URL Rewrite Rules Further URL Rewrite articles

    Read the article

  • Gvim on Windows 7: ALT codes not working

    - by John Sonderson
    I would like to be able to enter ALT codes in Gvim on Windows 7 as documented on the following site: Alt Codes On Windows (Windows 7 in my case), to generate a character via an ALT code you make sure that the NumLock key on your keypad is toggled on, hold down the ALT key, enter the keycode on the numeric keypad, and then release the ALT key. However this does not work in Gvim on Windows (which ignores the fact that I am pressing the ALT key and just prints to entered keypad key directly onto the screen). How can I get these keystroke combinations to work in Gvim as well? Thanks. EDIT: As the answer below points out, the way to insert non-ASCII characters for which you do not have entries on your keyboard without changing the keyboard layout is as follows: Make sure you are in insert mode, and then type CTRL-V followed by the Unicode character code of interest, for instance: CTRL-V u00E0 (generates à) CTRL-V u00C8 (generates È) CTRL-V u00E8 (generates è) CTRL-V u00E9 (generates é) CTRL-V u00EC (generates ì) CTRL-V u00F2 (generates ò) etc... See for instance http://unicode-table.com/ for a full list of Unicode character codes. The following list of Unicode characters by language may also be useful: http://en.wikipedia.org/wiki/List_of_Unicode_characters In some cases such as this one, though, there might be an easier way to enter special characters (see :help digraphs and :digraphs). For example, while in insert mode you may be able to type the following: CTRL-K E! (yields É) CTRL-K a' (yields á) Note that as the following page shows: http://code.google.com/p/vim/source/browse/runtime/doc/digraph.txt Gvim 7.4 contains an even wider set of default digraphs than Gvim 7.3, thus providing convenience to an even broader set of languages. Regards.

    Read the article

  • wget hangs at "HTTP request sent, awaiting response..." on some sites

    - by gkr
    Using Ubuntu 12.04. wget hangs in http request sent, awaiting response... in some sites. Browser's are not opening sites that are failed in wget. But in WinXP everything works. This works gkr@gkr-desktop:~/Documents/curl$ wget google.com --2012-06-12 21:29:37-- http://google.com/ Resolving google.com (google.com)... 74.125.236.174, 74.125.236.160, 74.125.236.161, ... Connecting to google.com (google.com)|74.125.236.174|:80... connected. HTTP request sent, awaiting response... 301 Moved Permanently Location: http://www.google.com/ [following] --2012-06-12 21:29:38-- http://www.google.com/ Resolving www.google.com (www.google.com)... 74.125.236.179, 74.125.236.180, 74.125.236.176, ... Connecting to www.google.com (www.google.com)|74.125.236.179|:80... connected. HTTP request sent, awaiting response... 302 Found Location: http://www.google.co.in/ [following] --2012-06-12 21:29:38-- http://www.google.co.in/ Resolving www.google.co.in (www.google.co.in)... 74.125.236.184, 74.125.236.191, 74.125.236.183, ... Connecting to www.google.co.in (www.google.co.in)|74.125.236.184|:80... connected. HTTP request sent, awaiting response... 200 OK Length: unspecified [text/html] Saving to: `index.html.3' [ ] 13,383 --.-K/s in 0.04s 2012-06-12 21:29:39 (308 KB/s) - `index.html.3' saved [13383] gkr@gkr-desktop:~/Documents/curl$ This site just stops/hangs in awaiting response. gkr@gkr-desktop:~/Documents/curl$ wget grooveshark.com --2012-06-12 21:27:29-- http://grooveshark.com/ Resolving grooveshark.com (grooveshark.com)... 8.20.213.76 Connecting to grooveshark.com (grooveshark.com)|8.20.213.76|:80... connected. HTTP request sent, awaiting response... ^C gkr@gkr-desktop:~/Documents/curl$ Thanks
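    A few things worth trying (a sketch, since a hang at "awaiting response" can have several causes - a stale proxy variable, IPv6 resolution, or an MTU/firewall issue on the path):

      env | grep -i proxy                     # rule out a leftover http_proxy setting
      wget --no-proxy http://grooveshark.com  # bypass any configured proxy
      wget -4 http://grooveshark.com          # force IPv4 (--inet4-only)
      wget -d http://grooveshark.com          # debug output shows exactly where it stalls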

    Read the article

  • My jQuery code is suspected to fail on IE 7

    - by Kyle
    I have received numerous calls from users lately, stating that they are not able to access the conference sites with IE7. These sites are created from a template, and they are managed on Joomla. Previously on other sites, there have no problems or complaints. However, with the recent complaints , I suspect that the culprit is my simple jQuery codes since the sites that have been reported have been created recently and incorporated with jQuery features. Site A (does not contain any jQuery): digitalmediaroi.net Site B (With recent complaints that fails to load on certain IE7): http://brownfieldscanada.com/ These are the jQuery codes that are running concurrently on a page. Are they using too much memory, therefore causing a problem on IE 7 ? <span id="alertTxt" style="text-align:center;display:none"><span style="color:#CC0000; font-weight:bold;">ALERT:</span> Municipalities, Developers, Owners, QPs, Consultants, Lawyers, Service Providers</span> <span id="alertTxt2" style="text-align:center; font-weight:bold; display:none">This high-level summit is specifically designed for YOU!</span> <span id="alertTxt3" style="text-align:center; font-weight:bold; display:none; color:#184b26;">Don't miss our Ground Water Protection, Shallow Soil and Waterfront Properties Workshop</span> <span id="alertTxt4" style="text-align:center; font-weight:bold; display:none"><a href="register/registeronline.html" title="Register for the Transforming &amp; Revitalizing Downtowns Summit!" style="font-family:ariel, helvetica, san-serif; color:#000099; text-decoration:underline;">Online registration now available!</a></span> <script type="text/javascript"> function animateTxt() { $j("#alertTxt").fadeIn(2000).delay(6000).fadeOut(1500, function() { $j("#alertTxt2").fadeIn(2000).delay(3000).fadeOut(1500,function(){ $j("#alertTxt3").fadeIn(2000).delay(6000).fadeOut(1500,function(){ $j("#alertTxt4").delay(500).fadeIn(2000).delay(4000).fadeOut(1500,function(){ animateTxt();}); }); }); }); } animateTxt(); </script> <script type="text/javascript">// <![CDATA[ var imgs1 = new Array("http://www.brownfieldscanada.com/images/brown-images/sponsors/intrinsik.jpg", "http://www.brownfieldscanada.com/images/brown-images/sponsors/stantec.jpg"); var imgs1_alt = new Array("Intrinsik - Sponsor of Ontario Brownfields Regulatory Summit", "Stantec - Sponsor of Ontario Brownfields Regulatory Summit"); var sponsor_names = new Array("Sponsor:","Sponsor:"); var lnks1 = new Array("http://www.intrinsikscience.com/", "http://www.stantec.com/"); var currentAd1 = 0; var imgCt1 = imgs1.length; function cycle1() { if (currentAd1 == imgCt1) { currentAd1 = 0; } var banner1 = document.getElementById('adBanner1'); var link1 = document.getElementById('adLink1'); banner1.src=imgs1[currentAd1]; banner1.alt=imgs1_alt[currentAd1]; link1.href=lnks1[currentAd1]; document.getElementById('sponsorheader').innerHTML = sponsor_names[currentAd1]; $j("#adBanner1").fadeIn(2000).delay(5000).fadeOut(1500, function(){ currentAd1++; cycle1(); }); } cycle1(); // ]]></script> <script type="text/javascript">// <![CDATA[ var partner_img = new Array("http://www.brownfieldscanada.com/images/brown-images/partners/BuildingLogo-2.jpg", "http://www.brownfieldscanada.com/images/brown-images/partners/NRU-Publishing_logo.jpg", "http://www.brownfieldscanada.com/images/brown-images/partners/haz_mat.jpg", "http://www.brownfieldscanada.com/images/brown-images/partners/oppi_logo_blue_with_tag.jpg", "http://www.brownfieldscanada.com/images/brown-images/partners/renew_logo.jpg", 
"http://www.brownfieldscanada.com/images/brown-images/partners/DCN.jpg"); var partner_lnks = new Array("http://www.building.ca/", "http://www.nrupublishing.com/", "http://www.hazmatmag.com/", "http://www.ontarioplanners.on.ca/", "http://renewcanada.net/", "http://www.dailycommercialnews.com/"); var partner_alt = new Array("Building.ca - Parter for Ontario Brownfields Regulatory Summit", "NRU Publishing - Partner for Ontario Brownfields Regulatory Summit", "HazMat Management Magazine - Partner for Ontario Brownfields Regulatory Summit", "The Ontario Professional Planners Institute - Partner for Ontario Brownfields Regulatory Summit", "Renew Canada - Partner for Ontario Brownfields Regulatory Summit", "Daily Commercial News and Construction Record - Partner for Ontario Brownfields Regulatory Summit"); var partner_title = new Array("Real Estate Development • Construction • Architecture", "NRU Publishing", "HazMat Management Magazine", "The Ontario Professional Planners Institute", "ReNew Canada", "Daily Commercial News and Construction Record"); var partner_name = new Array("Partner:","Partner:","Partner:","Partner:","Partner:", "Partner:"); var partner_num = 0; var partner_total = 6; function partnerCycle() { if (partner_num == partner_total) { partner_num = 0; } var partnerBanner = document.getElementById('partnerBanner'); var link1 = document.getElementById('partnerLink'); partnerBanner.src=partner_img[partner_num]; partnerBanner.alt=partner_alt[partner_num]; document.getElementById('partnerLink').href=partner_lnks[partner_num]; document.getElementById('partnerLink').title=partner_title[partner_num]; document.getElementById('partnerheader').innerHTML="<strong>"+partner_name[partner_num]+"</strong>"; $j("#partnerBanner").fadeIn(2000).delay(3000).fadeOut(1500, function(){ partner_num++; partnerCycle(); }); } partnerCycle(); // </script>

    Read the article

  • HTTP 2.0 reportedly blocked by the addition of Google's SPDY; a FreeBSD engineer calls the project a "fiasco" and asks for it to be abandoned in favor of HTTP 3.0

    HTTP 2.0 reportedly blocked by the integration of Google's SPDY. A FreeBSD engineer calls the project a "fiasco" and asks for it to be abandoned in favor of HTTP 3.0. The IETF (Internet Engineering Task Force) working group on version 2.0 of the HTTP (Hypertext Transfer Protocol) standard is facing a crisis that could delay its publication. As a reminder, HTTP 2.0 is designed to let browsers load web pages faster. The release of the version...

    Read the article

  • Sending Parameters with the BizTalk HTTP Adapter

    - by Christopher House
    I've never had occaison to use the BizTalk HTTP adapter since I've always needed SOAP rather than just POX (plain old XML).  Yesterday we decided that we're going to expose some data via a Java servlet that will accept an HTTP post and respond with POX.  I knew BizTalk had an HTTP adapter but I had no idea what it's capabilities were. After a quick read through the BizTalk docs, it was apparent that the HTTP send adapter does in fact do posts.  The concern I had though was how we were going to supply parameters to the servlet.  The examples I had seen using the HTTP adapter all involved posting an XML message to some HTTP location.  Our Java guy, however didn't want to take that approach.  He wanted us to provide a query string via post, much like you'd expect to see on an HTTP get.  I decided to put together a little test scenario and see what I could come up with.  We didn't have a test servlet I could go against and my Java experience is virtually nill, so I decided to put together an ASP.Net project to act as the servlet.  It didn't need to be fancy, just one HttpHandler that accepts a post, reads a parameter and returns XML.  With the HttpHandler done, I put together a simple orchestration to send a message to the handler.  I started by having the orch send a message of type System.String to see what it would look like when the handler received it. I set a breakpoint in my handler and kicked off the orchestration.  Below is what I saw: As I suspected, because of BizTalk's XML serialization, System.String was not going to work.  I thought back to my BizTalk 2004 days and I project I worked on that required sending HTML formatted emails via the SMTP adapter.  To acomplish that, I had used a .Net class with a custom serialization formatter that I got from a Microsoft sample.  The code for the class, RawString can be found here. I created a new class library with the RawString class as well as a static factory class, referenced that in my orchestration project and changed my message type from System.String to RawString.  Below is what the code in my message construction looks like: After deploying the updated orchestration, I fired it off again and checked the breakpoint in my HttpHandler.  This is what I saw: And there you have it.  The RawString message type allowed me to pass a query string in the HTTP post without wrapping it in XML.

    Read the article

  • Distribution upgrade problem "No new release found"

    - by fefe
    I'm using Ubuntu 11.04. The update manager once found the new release 'oneiric', and still shows up this screen when I log on use ssh: Welcome to Ubuntu 11.04 (GNU/Linux 2.6.38-14-generic x86_64) * Documentation: https://help.ubuntu.com/ 0 packages can be updated. 0 updates are security updates. New release 'oneiric' available. Run 'do-release-upgrade' to upgrade to it. Last login: Wed Apr 25 16:22:48 2012 from *** But I didn't upgrade then, and change my apt sources. And now I cannot upgrade to 'oneiric'. do-relase-upgrade shows: $ sudo do-release-upgrade Checking for a new ubuntu release No new release found $ And apt-get dist-upgrade shows: $ sudo apt-get dist-upgrade Reading package lists... Done Building dependency tree Reading state information... Done Calculating upgrade... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. $ I can successfully update all my packages. File contents of source.list: $ cat /etc/apt/sources.list ## See sources.list(5) for more information, especialy # Remember that you can only use http, ftp or file URIs deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty main universe restricted multiverse deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty main universe restricted multiverse deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-security universe main multiverse restricted deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-security universe main multiverse restricted deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-updates universe main multiverse restricted deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-updates universe main multiverse restricted deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-backports universe main multiverse restricted deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-backports universe main multiverse restricted # deb http://ubuntu.dormforce.net/ubuntu/ lucid main universe restricted multiverse # deb-src http://ubuntu.dormforce.net/ubuntu/ lucid main universe restricted multiverse # deb http://ubuntu.dormforce.net/ubuntu/ lucid-security universe main multiverse restricted # deb-src http://ubuntu.dormforce.net/ubuntu/ lucid-security universe main multiverse restricted # deb http://ubuntu.dormforce.net/ubuntu/ lucid-updates universe main multiverse restricted # deb-src http://ubuntu.dormforce.net/ubuntu/ lucid-updates universe main multiverse restricted # CDROMs are managed through the apt-cdrom tool. 
# deb http://archive.canonical.com lucid partner # deb http://archive.canonical.com lucid-security partner # deb http://archive.canonical.com lucid-updates partner # deb-src http://archive.canonical.com lucid partner # deb-src http://archive.canonical.com lucid-security partner # deb-src http://archive.canonical.com lucid-updates partner #medibuntu repo # deb http://packages.medibuntu.org/ lucid free non-free # deb-src http://packages.medibuntu.org/ lucid free non-free # deb http://extras.ubuntu.com/ubuntu maverick main #Third party developers repository deb http://mirrors.sohu.com/ubuntu/ natty main restricted multiverse universe deb-src http://mirrors.sohu.com/ubuntu/ natty main universe restricted multiverse #Added by software-properties deb http://security.ubuntu.com/ubuntu/ natty-security universe main multiverse restricted deb-src http://mirrors.sohu.com/ubuntu/ natty-security universe main multiverse restricted deb http://mirrors.sohu.com/ubuntu/ natty-updates universe main multiverse restricted deb-src http://mirrors.sohu.com/ubuntu/ natty-updates universe main multiverse restricted File contents of /etc/update-manager/meta-release: $ cat /etc/update-manager/meta-release # default location for the meta-release file [METARELEASE] URI = http://changelogs.ubuntu.com/meta-release URI_LTS = http://changelogs.ubuntu.com/meta-release-lts URI_UNSTABLE_POSTFIX = -development URI_PROPOSED_POSTFIX = -proposed What may be the problem of this?
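    Since apt-get update succeeds, a sketch of what I would check next: do-release-upgrade decides what "new release" means from /etc/update-manager/release-upgrades and the meta-release file named in /etc/update-manager/meta-release (shown above), so the prompt policy and the reachability of changelogs.ubuntu.com are the usual suspects:

      cat /etc/update-manager/release-upgrades          # should contain: Prompt=normal
      wget -qO- http://changelogs.ubuntu.com/meta-release | grep -A1 "Dist: oneiric"
      sudo do-release-upgrade                           # retry once both look sane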

    Read the article

  • How do I force the www subdomain on both HTTPS and HTTP?

    - by Brian Perin
    For whatever reason I can't seem to get this right; I've looked at many examples on here and on Apache's website. I'm trying to force www.domain.com instead of domain.com on EITHER http or https, but I am not trying to force https over http. The following code seems to work for all https connections, but http will not redirect to www:
      RewriteEngine On
      RewriteCond %{HTTPS} on
      RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
      RewriteRule ^ https://www.domain.com%{REQUEST_URI} [R=301]

      RewriteEngine On
      RewriteCond %{HTTPS} off
      RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
      RewriteRule ^ http://www.domain.com%{REQUEST_URI} [R=301]

    Read the article

  • http-equiv=content-language alternative - the way of specifying document language

    - by tugberk
    Lots of web sites use the following meta tag to specify the default language of the document: <meta http-equiv="content-language" content="es-ES"> When I go to the W3C site http://www.w3.org/TR/2011/WD-html-markup-20110113/meta.http-equiv.content-language.html#meta.http-equiv.content-language I get this: "Using the meta element to specify the document-wide default language is obsolete. Consider specifying the language on the root element instead." What is the current way of specifying the document language?
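    The replacement the W3C text is pointing at is the lang attribute on the root element, for example:

      <!DOCTYPE html>
      <html lang="es-ES">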

    Read the article

  • Err http://extras.ubuntu.com precise Release.gpg

    - by bell
    Updating gives the following:
      Ign cdrom://Ubuntu 11.10 _Oneiric Ocelot_ - Release amd64 (20111012) oneiric InRelease
      Ign cdrom://Ubuntu 11.10 _Oneiric Ocelot_ - Release amd64 (20111012) dists/oneiric/main/binary- / InRelease
      Ign cdrom://Ubuntu 11.10 _Oneiric Ocelot_ - Release amd64 (20111012) dists/oneir
      Err http://archive.canonical.com oneiric Release.gpg
        Unable to connect to archive.canonical.com:http
      Err http://security.ubuntu.com precise-security/universe Translation-en_US
        Unable to connect to security.ubuntu.com:http:

    Read the article

  • Google Analytics HTTP vs HTTPS

    - by Pelangi
    I want to use Google Analytics on a website that uses both HTTP and HTTPS and works as explained below: secure pages accessed through https://mydomain.com/secure/* are always on HTTPS, and any access to these pages over HTTP is redirected to HTTPS; all other pages are accessible over both HTTP and HTTPS. I have a Google Analytics profile whose URL uses HTTPS. Will it cover all traffic? Do I need to create another profile using HTTP, and if so, how should I apply the other profile?
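    As far as I know a single profile is enough, because the profile's default URL mainly affects how reports link back to pages; the classic asynchronous tracking snippet already picks the matching protocol on every page it runs on (UA-XXXXX-X below is the placeholder property ID):

      <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXX-X']);
        _gaq.push(['_trackPageview']);
        (function() {
          var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
        })();
      </script>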

    Read the article

  • Opening the Internet Settings Dialog and using Windows Default Network Settings via Code

    - by Rick Strahl
    Ran into a question from a client the other day that asked how to deal with Internet Connection settings for running  HTTP requests. In this case this is an old FoxPro app and it's using WinInet to handle the actual HTTP connection. Another client asked a similar question about using the IE Web Browser control and configuring connection properties. Regardless of platform or tools used to do HTTP connections, you can probably configure custom connection and proxy settings in your application to configure http connection settings manually. However, this is a repetitive process for each application requires you to track system information in your application which is undesirable. Often it's much easier to rely on the system wide proxy settings that Windows provides via the Internet Settings dialog. The dialog is a Control Panel applet (inetcpl.cpl) and is the same dialog that you see when you pop up Internet Explorer's Options dialog: This dialog controls the Windows connection properties that determine how the Windows HTTP stack connects to the Internet and how Proxy's are used if configured. Depending on how the HTTP client is configured - it can typically inherit and use these global settings. Loading the Settings Dialog Programmatically The settings dialog is a Control Panel applet with the name of: inetcpl.cpl and you can use any Shell execution mechanism (Run dialog, ShellExecute API, Process.Start() in .NET etc.) to invoke the dialog. Changes made there are immediately reflected in any applications that use the default connection settings. In .NET you can simply do this to bring up the Internet Settings dialog with the Connection tab enabled: Process.Start("inetcpl.cpl",",4"); In FoxPro you can simply use the RUN command to execute inetcpl.cpl: lcCmd = "inetcpl.cpl ,4" RUN &lcCmd Using the Default Connection/Proxy Settings When using WinInet you specify the Http connect type in the call to InternetOpen() like this (FoxPro code here): hInetConnection=; InternetOpen(THIS.cUserAgent,0,; THIS.chttpproxyname,THIS.chttpproxybypass,0) The second parameter of 0 specifies that the default system proxy settings should be used and it uses the settings from the Internet Settings Connections tab. Other connection options for HTTP connections include 1 - direct (no proxies and ignore system settings), 3 - explicit Proxy specification. In most situations a connection mode setting of 0 should work. In .NET HTTP connections by default are direct connections and so you need to explicitly specify a default proxy or proxy configuration to use. The easiest way to do this is on the application level in the config file: <configuration> <system.net> <defaultProxy> <proxy bypassonlocal="False" autoDetect="True" usesystemdefault="True" /> </defaultProxy> </system.net> </configuration> You can do the same sort of thing in code specifying the proxy explicitly and using System.Net.WebProxy.GetDefaultProxy(). So when making HTTP calls to Web Services or using the HttpWebRequest class you can set the proxy with: StoreService.Proxy = WebProxy.GetDefaultProxy(); All of this is pretty easy to deal with and in my opinion is a way better choice to managing connection settings than having to track this stuff in your own application. 
    Plus if you use default settings, most of the time it's highly likely that the connection settings are already properly configured, making further configuration rare. © Rick Strahl, West Wind Technologies, 2005-2011. Posted in Windows, HTTP, .NET, FoxPro

    Read the article

  • Playing with http page cycle using JustMock

    - by mehfuzh
    In this post , I will cover a test code that will mock the various elements needed to complete a HTTP page request and  assert the expected page cycle steps. To begin, i have a simple enumeration that has my predefined page steps: public enum PageStep {     PreInit,     Load,     PreRender,     UnLoad } Once doing so, i  first created the page object [not mocking]. Page page = new Page(); Here, our target is to fire up the page process through ProcessRequest call, now if we take a look inside the method with reflector.net,  the call trace will go like : ProcessRequest –> ProcessRequestWithNoAssert –> SetInstrinsics –> Finallly ProcessRequest. Inside SetInstrinsics ,  it requires calls from HttpRequest, HttpResponse and HttpBrowserCababilities. With this clue at hand, we can easily know the classes / calls  we need to mock in order to get through the expected call. Accordingly, for  HttpBrowserCapabilities our required test code will look like: Mock.Arrange(() => browser.PreferredRenderingMime).Returns("text/html"); Mock.Arrange(() => browser.PreferredResponseEncoding).Returns("UTF-8"); Mock.Arrange(() => browser.PreferredRequestEncoding).Returns("UTF-8"); Now, HttpBrowserCapabilities is get though [Instance]HttpRequest.Browser. Therefore, we create the HttpRequest mock: var request = Mock.Create<HttpRequest>(); Then , add the required get call : Mock.Arrange(() => request.Browser).Returns(browser); As, [instance]Browser.PerferrredResponseEncoding and [instance]Browser.PreferredResponseEncoding  are also set to the request object and to make that they are set properly, we can add the following lines as well [not required though]. bool requestContentEncodingSet = false; Mock.ArrangeSet(() => request.ContentEncoding = Encoding.GetEncoding("UTF-8")).DoInstead(() =>  requestContentEncodingSet = true); Similarly, for response we can write:  var response = Mock.Create<HttpResponse>();    bool responseContentEncodingSet = false;  Mock.ArrangeSet(() => response.ContentEncoding = Encoding.GetEncoding("UTF-8")).DoInstead(() => responseContentEncodingSet = true); Finally , I created a mock of HttpContext and set the Request and Response properties that will returns the mocked version. 
var context = Mock.Create<HttpContext>();   Mock.Arrange(() => context.Request).Returns(request); Mock.Arrange(() => context.Response).Returns(response); As, Page internally calls RenderControl method , we just need to replace that with our one and optionally we can check if  invoked properly: bool rendered = false; Mock.Arrange(() => page.RenderControl(Arg.Any<HtmlTextWriter>())).DoInstead(() => rendered = true); That’s  it, the rest of the code is simple,  where  i asserted the page cycle with the PageSteps that i defined earlier: var pageSteps = new Queue<PageStep>();   page.PreInit +=      delegate      {          pageSteps.Enqueue(PageStep.PreInit);      }; page.Load +=      delegate      {          pageSteps.Enqueue(PageStep.Load);      };   page.PreRender +=      delegate      {          pageSteps.Enqueue(PageStep.PreRender);      };   page.Unload +=      delegate      {          pageSteps.Enqueue(PageStep.UnLoad);      };   page.ProcessRequest(context);    Assert.True(requestContentEncodingSet);  Assert.True(responseContentEncodingSet);  Assert.True(rendered);    Assert.Equal(pageSteps.Dequeue(), PageStep.PreInit);  Assert.Equal(pageSteps.Dequeue(), PageStep.Load);  Assert.Equal(pageSteps.Dequeue(), PageStep.PreRender);  Assert.Equal(pageSteps.Dequeue(), PageStep.UnLoad);    Mock.Assert(request);  Mock.Assert(response);   You can get the test class shown in this post here to give a try by yourself with of course JustMock :-).   Enjoy!!

    Read the article

  • WinInet Apps failing when Internet Explorer is set to Offline Mode

    - by Rick Strahl
    Ran into a nasty issue last week when all of a sudden many of my old applications that are using WinInet for HTTP access started failing. Specifically, the WinInet HttpSendRequest() call started failing with an error of 2, which when retrieving the error boils down to: WinInet Error 2: The system cannot find the file specified Now this error can pop up in many legitimate scenarios with WinInet such as when no Internet connection is available or the HTTP configuration (usually configured in Internet Explorer’s options) is misconfigured. The error typically means that the server in question cannot be found or more specifically an Internet connection can’t be established. In this case the problem started suddenly and was causing some of my own applications (old Visual FoxPro apps using my own wwHttp library) and all Adobe Air applications (which apparently uses WinInet for its basic HTTP stack) along with a few more oddball applications to fail instantly when trying to connect via HTTP. Most other applications – all of my installed browsers, email clients, various social network updaters all worked just fine. It seems it was only WinInet apps that were failing. Yet oddly Internet Explorer appeared to be working. So the problem seemed to be isolated to those ‘classic’ applications using WinInet. WinInet’s base configuration uses the Internet Explorer options dialog. To check this out I typically go to the Internet Explorer options and find the Connection tab, and check out the LAN Setup. Make sure there are no rogue proxy settings or configuration scripts that are invalid. Trying with Auto-configuration on and off also can often fix ‘real’ configuration errors. This time however this wasn’t a problem – nothing in the LAN configuration was set (all default). I also played with the Automatic detection of settings which also had no effect. I also tried to use Fiddler to see if that would tell me something. Fiddler has a few additional WinInet configuration options in its configuration. Running Fiddler and hitting an HTTP request using WinInet would never actually hit Fiddler – the failure would occur before WinInet ever fired up the HTTP connection to go through the Fiddler HTTP proxy. And the Culprit is: Internet Explorer’s Work Offline Option The culprit in this situation was Internet Explorer which at some point, unknown to me switched into Offline Mode and was then shut down: When this Offline mode is checked when IE is running *or* if IE gets shut down with this flag set, all applications using WinInet by default assume that it’s running in offline mode. Depending on your caching HTTP headers and whether the page was cached previously you may or may not get a response or an error. For an independent non-browser application this will be highly unpredictable and likely result in failures getting online – especially if the application forces requests to always reload by disabling HTTP caching (as I do on most of my dynamic HTTP clients). What makes this especially tricky is that even when IE is in offline mode in the browser, you can still browse around the Web *if* you have a connection. IE will try to load anything it has cached from the local cache, but as soon as you hit a URL that isn’t cached it will automatically try to access that URL and uncheck the Work Offline option. Conversely if you get knocked off the Internet and browse in IE 9, IE will automatically go into offline mode. 
I never explicitly set offline mode – it just automatically sets itself on and off depending on the connection. Problem is if you’re not using IE all the time (as I do – rarely and just for testing so usually a few commonly used URLs) and you left it in offline mode when you exit, offline mode stays set which results in the above head scratcher. Ack. This isn’t new behavior in IE 9 BTW – this behavior has always been there, but I think what’s different is that IE now automatically switches between online and offline modes without notifying you at all, so it’s hard to tell when you are offline. Fixing the Issue in your Code If you have an application that is using WinInet, there’s a WinInet option called INTERNET_OPTION_IGNORE_OFFLINE. I just checked this out in my own applications and Internet Explorer 9 and it works, but apparently it’s been broken for some older releases (I can’t confirm how far back though) – lots of posts seem to suggest the flag doesn’t work. However, in IE 9 at least it does seem to work if you call InternetSetOption before you call HttpOpenRequest with the Http Session handle. In FoxPro code I use: DECLARE INTEGER InternetSetOption ;    IN WININET.DLL ;    INTEGER HINTERNET,;    INTEGER dwFlags,;    INTEGER @dwValue,;    INTEGER cbSize lnOptionValue = 1   && BOOL TRUE pass by reference   *** Set needed SSL flags lnResult=InternetSetOption(this.hHttpSession,;    INTERNET_OPTION_IGNORE_OFFLINE ,;  && 77    @lnOptionValue ,4)   DECLARE INTEGER HttpOpenRequest ;    IN WININET.DLL ;    INTEGER hHTTPHandle,;    STRING lpzReqMethod,;    STRING lpzPage,;    STRING lpzVersion,;    STRING lpzReferer,;    STRING lpzAcceptTypes,;    INTEGER dwFlags,;    INTEGER dwContextw     hHTTPResult=HttpOpenRequest(THIS.hHttpsession,;    lcVerb,;    tcPage,;    NULL,NULL,NULL,;    INTERNET_FLAG_RELOAD + ;    IIF(THIS.lsecurelink,INTERNET_FLAG_SECURE,0) + ;    this.nHTTPServiceFlags,0) …  And this fixes the issue at least for IE 9… In my FoxPro wwHttp class I now call this by default to never get bitten by this again… This solves the problem permanently for my HTTP client. I never want to see offline operation in an HTTP client API – it’s just too unpredictable in handling errors and the last thing you want is getting unpredictably stale data. Problem solved but this behavior is – well ugly. But then that’s to be expected from an API that’s based on Internet Explorer, eh?© Rick Strahl, West Wind Technologies, 2005-2011Posted in HTTP  Windows  

    Read the article

  • How do I configure Jetty (via jettyrunner) so that it names a character set in the Content-Type response header?

    - by Pointy
    I use Jetty (via the oh-so-handy Jetty Runner) for day-to-day web application testing. One thing I've recently stumbled on is the fact that I don't get a character set called out in the "Content-Type" response header all the time. I do get it in response to my application's XMLHttpRequest transactions, but not for plain old pages loaded by <a> links or whatever. I've read a little bit about how to set up a Jetty config file, but I've never been able to completely understand that; all servlet containers are complicated, and while Jetty is pretty simple it's just weird enough that I don't grok the overall idea. Thus, all I do to launch my app is to run the Jetty Runner .jar file with a couple of simple arguments to set up the port number and logfile path, and then I just give it the .war file to run. It works great — except for the missing character set :-) Anybody have a quick sample config file that might fix this? edit — oh if it matters, I'm running Jetty 7.0.0 RC3; I've also tried with a slightly newer version (still 7.something) with exactly the same issue. All my testing is on Ubuntu.
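    If a Jetty-specific config tweak proves elusive, one portable workaround is to set the encoding from inside the webapp, so the container appends the charset to the Content-Type header it sends. A minimal servlet-filter sketch (register it in web.xml for /*; whether this beats a Jetty config change is a judgment call):

      import java.io.IOException;
      import javax.servlet.*;

      public class CharsetFilter implements Filter {
          public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                  throws IOException, ServletException {
              res.setCharacterEncoding("UTF-8");   // must happen before the response is committed
              chain.doFilter(req, res);
          }
          public void init(FilterConfig cfg) {}
          public void destroy() {}
      }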

    Read the article

  • Right div pushing center div further down

    - by Chase
    I cannot get this last div to go up properly in my layout and have tried countless things. I'm not sure what's going on with my css? Here is a screenshot: http://img291.imageshack.us/img291/5377/screenshot20100528at123.png #events { float: left; width: 420px; margin:0 0 5px 0; font-family:Helvetica, Arial, sans-serif; font-size: 16px; background-image: url(images/lastfmhead.jpg); background-repeat: no-repeat; height:360px; overflow:hidden; display: inline; } #events table { width:419px; } #events th, td { padding: 3px 3px; } #whatsup ul, #citywhatsup ul { margin:0 5px 0 5px; text-align:left; font-family:Helvetica, Arial, sans-serif; } #whatsup ul li, #citywhatsup ul li{ list-style-type:none; list-style: none; } #whatsup hr, #citywhatsup hr{ border: none 0; border-top: 1px dashed #990000;/*the border*/ width: 100%; height: 1px; margin: 1px auto 5px auto;/*whatever the total width of the border-top and border-bottom equal*/ } #events ul { margin:5px 5px 0 5px; } #events ul li{ list-style-type:none; list-style: none; } #attending ul { display: inline-block; margin: 0; width:200px; } #attending ul li { display: inline-block; list-style-image:none; margin:0; padding:2px 5px 2px 5px; } #attending { width: 230px; margin:0 13px 5px 12px; float: left; display: inline; background-image: url(images/otherhead.jpg); background-repeat: no-repeat; text-align:center; height:360px; overflow:hidden; } #whatsup { width: 230px; margin:0 0 5px 0; float: left; display:inline; background-image: url(images/otherhead.jpg); background-repeat: no-repeat; text-align:center; } #eventtitle{ margin: 3px 0 -3px 0; } #eventtitle { color: #900; margin-left: 5px; font-size:16px; } #tweetit { color: #487B96 !important; font-size:16px; margin: 3px 0 -4px 0; } #photos { background-image: url(images/flickrheader.jpg); background-repeat: no-repeat; width: 665px; clear:both; } <div id="cityevents"> <h2> Events </h2> <table> <th>Date </th><th> Who's Playing </th><th> Venue </th><th> City </th><th> Tickets </th> <tr><td>May 28</td><td><a href='http://www.songkick.com/concerts/5384486?utm_source=1121&utm_medium=partner' target='_blank'>Jill King</a></td><td>Open Eye Cafe</td><td>Carrboro</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 28</td><td><a href='http://www.songkick.com/concerts/5281141?utm_source=1121&utm_medium=partner' target='_blank'>Ahleuchatistas</a></td><td>Nightlight</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 28</td><td><a href='http://www.songkick.com/concerts/4970896?utm_source=1121&utm_medium=partner' target='_blank'>Sam Quinn</a></td><td>Local 506</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/5303661?utm_source=1121&utm_medium=partner' target='_blank'>Cagematch Mayhem, Champion Vs Au Jus, Heartbreaker Vs Au Jus</a></td><td>DSI Comedy Theater</td><td>Carrboro</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/5303661/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/4722066?utm_source=1121&utm_medium=partner' target='_blank'>Lewd Acts, Converge, Gaza, Black Breath</a></td><td>Cat's Cradle</td><td>Carrboro</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/4722066/tickets?utm_source=1121&utm_medium=partner' target='_blank'> 
Find </a></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/4647076?utm_source=1121&utm_medium=partner' target='_blank'>Nate Currin</a></td><td>Broad Street Cafe</td><td>Durham</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/5580211?utm_source=1121&utm_medium=partner' target='_blank'>International Night</a></td><td>Serena Rtp</td><td>Durham</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/5580211/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/4770241?utm_source=1121&utm_medium=partner' target='_blank'>Jill King</a></td><td>Caffe Driade</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/5406411?utm_source=1121&utm_medium=partner' target='_blank'>Sunbears!</a></td><td>Local 506</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 29</td><td><a href='http://www.songkick.com/concerts/4924136?utm_source=1121&utm_medium=partner' target='_blank'>Studio Gangsters</a></td><td>The Reservoir</td><td>Carrboro</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 30</td><td><a href='http://www.songkick.com/concerts/5252161?utm_source=1121&utm_medium=partner' target='_blank'>She Wants Revenge</a></td><td>Cat's Cradle</td><td>Carrboro</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>May 30</td><td><a href='http://www.songkick.com/concerts/4436326?utm_source=1121&utm_medium=partner' target='_blank'>Unheard Radio Battle of the Bands</a></td><td>Mansion 462</td><td>Chapel Hill</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/4436326/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>May 30</td><td><a href='http://www.songkick.com/concerts/4924141?utm_source=1121&utm_medium=partner' target='_blank'>Studio Gangsters</a></td><td>The Cave</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>Jun 2</td><td><a href='http://www.songkick.com/concerts/5252881?utm_source=1121&utm_medium=partner' target='_blank'>Jeanne Jolly</a></td><td>Caffe Driade</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>Jun 2</td><td><a href='http://www.songkick.com/concerts/4628026?utm_source=1121&utm_medium=partner' target='_blank'>James Husband, Of Montreal</a></td><td>Cat's Cradle</td><td>Carrboro</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/4628026/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>Jun 2</td><td><a href='http://www.songkick.com/concerts/5019466?utm_source=1121&utm_medium=partner' target='_blank'>Camera Obscura</a></td><td>Duke Gardens</td><td>Durham</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/5019466/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>Jun 3</td><td><a href='http://www.songkick.com/concerts/4226511?utm_source=1121&utm_medium=partner' target='_blank'>Reverend Horton Heat, Cracker, Legendary Shack Shakers</a></td><td>Cat's Cradle</td><td>Carrboro</td><td 
style='text-align:center;'><a href='http://www.songkick.com/concerts/4226511/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>Jun 3</td><td><a href='http://www.songkick.com/concerts/5253371?utm_source=1121&utm_medium=partner' target='_blank'>American Aquarium</a></td><td>Local 506</td><td>Chapel Hill</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>Jun 4</td><td><a href='http://www.songkick.com/concerts/4285251?utm_source=1121&utm_medium=partner' target='_blank'>Laurence Juber</a></td><td>The ArtsCenter</td><td>Carrboro</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr><tr><td>Jun 4</td><td><a href='http://www.songkick.com/concerts/5642566?utm_source=1121&utm_medium=partner' target='_blank'>Community Jam, Pt Scarborough Is a Movie, Armageddon'it</a></td><td>DSI Comedy Theater</td><td>Carrboro</td><td style='text-align:center;'><a href='http://www.songkick.com/concerts/5642566/tickets?utm_source=1121&utm_medium=partner' target='_blank'> Find </a></td></tr><tr><td>Jun 4</td><td><a href='http://www.songkick.com/concerts/4676216?utm_source=1121&utm_medium=partner' target='_blank'>Big Bill Morganfield</a></td><td>Papa Mojos Roadhouse</td><td>Durham</td><td style='text-align:center;'><span style='color: #999'> Find </span></td></tr> </table> </div> <!-- Events --> <div id="citywhatsup"> <h2> What's Up? <div id="tweetit"><a class="btn-slide">Tell em'</a> </div></h2> <div id="twitpanel"></div> <script type="text/javascript"> twttr.anywhere(function (T) { T("#twitpanel").tweetBox({ height: 100, width: 215, label: '', defaultContent: "" }); }); </script> <script type="text/javascript"> twttr.anywhere(function (T) { T("#whatsup").linkifyUsers(); }); </script> <ul> <li><img src='http://a1.twimg.com/profile_images/898693876/4604414396_0464180430_b_normal.jpg' alt='kaiten_keiku' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @kaiten_keiku: <span style='text-align:justify;'>@Charlotte_Nao ????????~?????????!</span> - <span class='twittertime'>May 28 12:37AM</span></li><hr/><li><img src='http://a3.twimg.com/profile_images/612153581/bowdown_normal.jpg' alt='bugn' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @bugn: <span style='text-align:justify;'>@Bravotv (sitc2 as rhony) Bethenny-Carrie, Sonja-Samantha, Alex-Miranda, Ramona-Charlotte</span> - <span class='twittertime'>May 28 12:36AM</span></li><hr/><li><img src='http://a1.twimg.com/profile_images/844630278/mj_normal.jpg' alt='Myra_Jones' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @Myra_Jones: <span style='text-align:justify;'>@t_weet123 If you're still in Charlotte then you need to head to Whiskey River...they say Luke B. 
just walked in and started drinking.</span> - <span class='twittertime'>May 28 12:36AM</span></li><hr/><li><img src='http://a1.twimg.com/profile_images/936667468/110971230_normal.jpg' alt='THEORACLE2' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @THEORACLE2: <span style='text-align:justify;'>@MsKamilah08 are yall in charlotte?</span> - <span class='twittertime'>May 28 12:36AM</span></li><hr/><li><img src='http://a1.twimg.com/profile_images/767244842/7AM_normal.jpg' alt='mtollefsrud' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @mtollefsrud: <span style='text-align:justify;'>@vosler09 thinks I'm Charlotte.</span> - <span class='twittertime'>May 28 12:36AM</span></li><hr/><li><img src='http://a3.twimg.com/profile_images/936496517/DSCF0317_-_Copy_normal.JPG' alt='Thasian' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @Thasian: <span style='text-align:justify;'>I like #CharMeck #Charlotte | Atlanta = #No #FAIL #EPICFAIL</span> - <span class='twittertime'>May 28 12:36AM</span></li><hr/><li><img src='http://a3.twimg.com/profile_images/695551715/NASCAR_logo_flag_normal.jpg' alt='NascarNewsNow' height=40px; width=40px; style='border:0px; float: left; padding-right:4px;'/> @NascarNewsNow: <span style='text-align:justify;'>#NASCAR #RACING News from the track: Charlotte Motor Speedway | Nascar Leath: Ahh, the waiting is... http://bit.ly/b2DToq #NHRA #DAYTONA500</span> - <span class='twittertime'>May 28 12:35AM</span></li> </ul> </div> <div id="photos"> <h2> Recent Photos </h2> <ul> <li><a href='http://farm5.static.flickr.com/4029/4646832962_980f936db9.jpg' target='_blank' rel='lightbox-photos' title='05 23 10 Jamie's Baby Shower 097'><img src='http://farm5.static.flickr.com/4029/4646832962_980f936db9_t.jpg' alt='05 23 10 Jamie's Baby Shower 097' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm4.static.flickr.com/3176/4646218481_d06829a778.jpg' target='_blank' rel='lightbox-photos' title='summer'><img src='http://farm4.static.flickr.com/3176/4646218481_d06829a778_t.jpg' alt='summer' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4032/4646833312_7b1de5390a.jpg' target='_blank' rel='lightbox-photos' title='100_0064'><img src='http://farm5.static.flickr.com/4032/4646833312_7b1de5390a_t.jpg' alt='100_0064' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4008/4646832834_784a0a9ed1.jpg' target='_blank' rel='lightbox-photos' title=''><img src='http://farm5.static.flickr.com/4008/4646832834_784a0a9ed1_t.jpg' alt='' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4066/4646218735_b37d8fd9e5.jpg' target='_blank' rel='lightbox-photos' title='DSC05524'><img src='http://farm5.static.flickr.com/4066/4646218735_b37d8fd9e5_t.jpg' alt='DSC05524' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4054/4646830604_97afd54623.jpg' target='_blank' rel='lightbox-photos' title='DTLA graff'><img src='http://farm5.static.flickr.com/4054/4646830604_97afd54623_t.jpg' alt='DTLA graff' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4066/4646833048_7a9ab28733.jpg' target='_blank' rel='lightbox-photos' title='100_0243.jpg'><img src='http://farm5.static.flickr.com/4066/4646833048_7a9ab28733_t.jpg' alt='100_0243.jpg' height=100px; width=100px; 
style='border:0px;'/></a></li><li><a href='http://farm4.static.flickr.com/3399/4646832626_de89d0fb0e.jpg' target='_blank' rel='lightbox-photos' title='6W????'><img src='http://farm4.static.flickr.com/3399/4646832626_de89d0fb0e_t.jpg' alt='6W????' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4005/4646832826_0d9e8afe19.jpg' target='_blank' rel='lightbox-photos' title='IMG_9381'><img src='http://farm5.static.flickr.com/4005/4646832826_0d9e8afe19_t.jpg' alt='IMG_9381' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4046/4646830752_dfc32b1740.jpg' target='_blank' rel='lightbox-photos' title='11'><img src='http://farm5.static.flickr.com/4046/4646830752_dfc32b1740_t.jpg' alt='11' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4010/4646215929_80b78f0007.jpg' target='_blank' rel='lightbox-photos' title='GEDC8592'><img src='http://farm5.static.flickr.com/4010/4646215929_80b78f0007_t.jpg' alt='GEDC8592' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4064/4646832384_7fc8d31e11.jpg' target='_blank' rel='lightbox-photos' title='2010 Advanced Grappling'><img src='http://farm5.static.flickr.com/4064/4646832384_7fc8d31e11_t.jpg' alt='2010 Advanced Grappling' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4049/4646218143_c108276325.jpg' target='_blank' rel='lightbox-photos' title='P1270352'><img src='http://farm5.static.flickr.com/4049/4646218143_c108276325_t.jpg' alt='P1270352' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4009/4646217767_3900f39475.jpg' target='_blank' rel='lightbox-photos' title='P1270351'><img src='http://farm5.static.flickr.com/4009/4646217767_3900f39475_t.jpg' alt='P1270351' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4027/4646831284_30b6e6da36.jpg' target='_blank' rel='lightbox-photos' title='Image245'><img src='http://farm5.static.flickr.com/4027/4646831284_30b6e6da36_t.jpg' alt='Image245' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm4.static.flickr.com/3399/4646218295_63a899d322.jpg' target='_blank' rel='lightbox-photos' title='IMG_0037'><img src='http://farm4.static.flickr.com/3399/4646218295_63a899d322_t.jpg' alt='IMG_0037' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4059/4646218159_01d5b02c3f.jpg' target='_blank' rel='lightbox-photos' title='DooDah2010-6665'><img src='http://farm5.static.flickr.com/4059/4646218159_01d5b02c3f_t.jpg' alt='DooDah2010-6665' height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://farm5.static.flickr.com/4053/4646834404_615e09b715.jpg' target='_blank' rel='lightbox-photos' title='IMG_2668'><img src='http://farm5.static.flickr.com/4053/4646834404_615e09b715_t.jpg' alt='IMG_2668' height=100px; width=100px; style='border:0px;'/></a></li> </ul> </div> <div id="videos"> <h2> Recent Videos </h2> <ul> <li><a href='http://www.youtube.com/watch?v=_oc5-0yFoYg&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/_oc5-0yFoYg/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=1pRXKeYVYHc&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/1pRXKeYVYHc/default.jpg'height=100px; 
width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=yzRQUu-ZBdw&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/yzRQUu-ZBdw/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=mud-A76nLro&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/mud-A76nLro/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=TLOboW19_OA&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/TLOboW19_OA/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=PcYja2jjvi0&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/PcYja2jjvi0/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=KKjklCEMrPk&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/KKjklCEMrPk/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=AyUwY6PRX0Y&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/AyUwY6PRX0Y/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=8Sf2-7RjVYs&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/8Sf2-7RjVYs/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=xGMayoJmhE8&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/xGMayoJmhE8/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=paseBeB6Cb8&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/paseBeB6Cb8/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li><li><a href='http://www.youtube.com/watch?v=l_FSRUtMMik&feature=youtube_gdata' target='_blank'><img src='http://i.ytimg.com/vi/l_FSRUtMMik/default.jpg'height=100px; width=100px; style='border:0px;'/></a></li> </ul> </div> </div><!-- #content -->

    Read the article
