Search Results

Search found 7249 results on 290 pages for 'https everywhere'.

Page 30/290 | < Previous Page | 26 27 28 29 30 31 32 33 34 35 36 37  | Next Page >

  • Can you use gzip over SSL? And Connection: Keep-Alive headers

    - by magenta
    I'm evaluating the front-end performance of a secure (SSL) web app here at work and I'm wondering if it's possible to compress text files (HTML/CSS/JavaScript) over SSL. I've done some googling around but haven't found anything specifically related to SSL. If it's possible, is it even worth the extra CPU cycles, since responses are also being encrypted? Would compressing responses hurt performance? Also, I want to make sure we're keeping the SSL connection alive so we're not making SSL handshakes over and over. I'm not seeing Connection: Keep-Alive in the response headers. I do see Keep-Alive: 115 in the request headers, but that only asks to keep the connection alive for 115 seconds (it seems like the app server is closing the connection after a single request is processed?). Wouldn't you want the server to be setting that response header for as long as the session inactivity timeout? I understand browsers don't cache SSL content to disk, so we're serving the same files over and over on subsequent visits even though nothing has changed. The main optimization recommendations are reducing the number of HTTP requests, minification, moving scripts to the bottom, image optimization, and possible domain sharding (though we need to weigh the cost of another SSL handshake), things of that nature.
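
    Compression and keep-alive are both plain server configuration and work the same whether or not the connection is encrypted; compression is applied before encryption, so the two features don't conflict. The question doesn't say which web server is in use, so the following is only a sketch for Apache, assuming mod_deflate and mod_ssl are enabled:

        # Inside the HTTPS virtual host -- sketch only
        SSLEngine On

        # Compress the text responses (HTML/CSS/JS) before they are encrypted
        AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript

        # Persistent connections, so the TLS handshake is not repeated for every request
        KeepAlive On
        KeepAliveTimeout 15
        MaxKeepAliveRequests 100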

    Read the article

  • Who sells the cheapest EV SSL certificate?

    - by Zack Peterson
    I want an SSL certificate for my web site that will not only be accepted without warning by all popular browsers (at least by Firefox and Internet Explorer), but will also give my visitors the green address bar. Which certificate authority sells the least expensive extended validation SSL certificates?

    Read the article

  • cURL - Weird HTTPS error

    - by Vincent
    All, I am having trouble requesting info from an HTTPS site using cURL and PHP on Solaris 10. It so happens that sometimes it works and sometimes it doesn't, and I am not sure what the cause is. If it doesn't work, this is the entry recorded in the verbose log:

        * About to connect() to 10.10.101.12 port 443 (#0)
        * Trying 10.10.101.12... * connected
        * Connected to 10.10.101.12 (10.10.101.12) port 443 (#0)
        * error setting certificate verify locations, continuing anyway:
        *   CAfile: /etc/opt/webstack/curl/curlCA
            CApath: none
        * error:80089077:lib(128):func(137):reason(119)
        * Closing connection #0

    If it works, this is the entry recorded in the verbose log:

        * About to connect() to 10.10.101.12 port 443 (#0)
        * Trying 10.10.101.12... * connected
        * Connected to 10.10.101.12 (10.10.101.12) port 443 (#0)
        * error setting certificate verify locations, continuing anyway:
        *   CAfile: /etc/opt/webstack/curl/curlCA
            CApath: none
        * SSL connection using DHE-RSA-AES256-SHA
        * Server certificate:
        *   subject: C=CA, ST=British Columnbia, L=Vancouver, O=google, OU=FDN, CN=g.googlenet.com, [email protected]
        *   start date: 2007-07-24 23:06:32 GMT
        *   expire date: 2027-09-07 23:06:32 GMT
        *   issuer: C=US, ST=California, L=Sunnyvale, O=Google, OU=Certificate Authority, CN=support, [email protected]
        *   SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
        > POST /gportal/gpmgr HTTP/1.1^M
        Host: 10.10.101.12^M
        Accept: */*^M
        Accept-Encoding: gzip,deflate^M
        Content-Length: 1623^M
        Content-Type: application/x-www-form-urlencoded^M
        Expect: 100-continue^M
        ^M
        < HTTP/1.1 100 Continue^M
        < HTTP/1.1 200 OK^M
        < Date: Wed, 28 Apr 2010 21:56:15 GMT^M
        < Server: Apache^M
        < Cache-Control: no-cache^M
        < Pragma: no-cache^M
        < Vary: Accept-Encoding^M
        < Content-Encoding: gzip^M
        < Content-Length: 1453^M
        < Content-Type: application/json^M
        < ^M
        * Connection #0 to host 10.10.101.12 left intact
        * Closing connection #0

    My cURL options are as follows:

        $ch = curl_init();
        $devnull = fopen('/tmp/curlcookie.txt', 'w');
        $fp_err = fopen('/tmp/verbose_file.txt', 'ab+');
        fwrite($fp_err, date('Y-m-d H:i:s')."\n\n");
        curl_setopt($ch, CURLOPT_STDERR, $devnull);
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_URL, $desturl);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
        curl_setopt($ch, CURLOPT_HEADER, false);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 120);
        curl_setopt($ch, CURLOPT_AUTOREFERER, true);
        curl_setopt($ch, CURLOPT_ENCODING, 'gzip,deflate');
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
        curl_setopt($ch, CURLOPT_VERBOSE, 1);
        curl_setopt($ch, CURLOPT_FAILONERROR, true);
        curl_setopt($ch, CURLOPT_STDERR, $fp_err);
        $ret = curl_exec($ch);

    Does anybody have any idea why it works sometimes but fails most of the time? Thanks

    Read the article

  • What are the attack vectors for passwords sent over http?

    - by KevinM
    I am trying to convince a customer to pay for SSL for a web site that requires login. I want to make sure I correctly understand the major scenarios in which someone can see the passwords that are being sent. My understanding is that anyone at any of the hops along the way can use a packet analyzer to view what is being sent. This seems to require that any hacker (or their malware/botnet) be on the same subnet as one of the hops the packet takes to arrive at its destination. Is that right? Assuming some flavor of this subnet requirement holds true, do I need to worry about all the hops or just the first one? The first one I can obviously worry about if they're on a public Wi-Fi network, since anyone could be listening in. Should I be worried about what's going on in the subnets that packets travel across beyond that? I don't know a ton about network traffic, but I would assume it's flowing through data centers of major carriers and there aren't a lot of juicy attack vectors there, but please correct me if I am wrong. Are there other vectors to be worried about besides someone listening with a packet analyzer? I am a networking and security noob, so please feel free to set me straight if I am using the wrong terminology in any of this.

    Read the article

  • Secure REST API for running user "apps" in an iframe

    - by Brian Armstrong
    I want to let users create "apps" (like Facebook apps) for my website, and I'm trying to figure out the best way to make it secure. I have a REST API, and I want to run the user apps in an iframe on my own site (not in a safe markup language like FBML). I first looked at OAuth, but this seems like overkill for my solution. The "apps" don't need to run on external sites or in desktop apps or anything; the user would stay on my site at all times but see the user-submitted "app" through the iframe. So when I call the app the first time through the iframe, I can pass it some variables so it knows which logged-in user is using it on my site. It can then use this user session in its own API calls to customize the display. If the call is passed in the clear, I don't want someone to be able to intercept the session and impersonate the user. Does anyone know a good way to do this, or a good write-up on it? Thanks!
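
    One pattern that fits this description (a sketch only, not necessarily what was ultimately used): have your site sign the values it passes into the iframe with a per-app shared secret and an expiry time, so the app can verify them but an eavesdropper cannot forge them or replay them indefinitely. The class, secret and parameter names below (APP_SECRET, user_id, expires, sig) are made up for illustration; example in Java:

        import javax.crypto.Mac;
        import javax.crypto.spec.SecretKeySpec;

        public class SignedParams {
            // Hypothetical secret shared between your site and the app developer.
            private static final String APP_SECRET = "s3cr3t";

            /** HMAC over the values passed to the app iframe, hex-encoded. */
            public static String sign(String userId, long expiresAtEpochSeconds) throws Exception {
                String payload = userId + "|" + expiresAtEpochSeconds;
                Mac mac = Mac.getInstance("HmacSHA256");
                mac.init(new SecretKeySpec(APP_SECRET.getBytes("UTF-8"), "HmacSHA256"));
                StringBuilder hex = new StringBuilder();
                for (byte b : mac.doFinal(payload.getBytes("UTF-8"))) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            }

            public static void main(String[] args) throws Exception {
                long expires = System.currentTimeMillis() / 1000 + 300; // valid for five minutes
                String sig = sign("user42", expires);
                // The iframe URL carries user_id, expires and sig; the app recomputes the HMAC
                // with its copy of the secret and rejects mismatches or expired requests.
                System.out.println("...?user_id=user42&expires=" + expires + "&sig=" + sig);
            }
        }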

    Read the article

  • Telling Java to accept a self-signed SSL certificate

    - by Nikita Rybak
    It looks like a standard question, but I couldn't find clear directions anywhere. I have Java code trying to connect to a server with a probably self-signed (or expired) certificate. It gives something like this:

        [HttpMethodDirector] I/O exception (javax.net.ssl.SSLHandshakeException) caught when processing request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

    As I understand it, I have to play around with keytool and tell Java that it's OK to allow this connection. But all the comments I've found assume I'm fully proficient with keytool, like "generate a private key for the server and import it into the keystore", and I'm not. Is there anybody who could post detailed instructions? I'm running Unix, so a bash script would be best. Not sure if it's important, but the code is executed in JBoss. Thanks a lot!
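
    For reference, the usual recipe as a hedged sketch (the host name, alias, truststore path and password below are placeholders): export the server's certificate, import it into a truststore with keytool, and point the JVM at that truststore.

        public class TrustSelfSigned {
            public static void main(String[] args) throws Exception {
                // 1. Export the certificate, e.g.:
                //      openssl s_client -connect myhost:443 </dev/null | openssl x509 > server.crt
                // 2. Import it into a truststore:
                //      keytool -importcert -alias myserver -file server.crt \
                //              -keystore /opt/myapp/truststore.jks -storepass changeit -noprompt
                // 3. Point the JVM at that truststore before the connection is opened
                //    (under JBoss these two properties are usually passed as -D JVM options):
                System.setProperty("javax.net.ssl.trustStore", "/opt/myapp/truststore.jks");
                System.setProperty("javax.net.ssl.trustStorePassword", "changeit");

                new java.net.URL("https://myhost/").openStream().close(); // handshake should now succeed
            }
        }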

    Read the article

  • Making sure a web page is not cached across all browsers

    - by Edward Wilde
    Our investigations have shown us that not all browsers respect the HTTP cache directives in a uniform manner. For security reasons we do not want certain pages in our application to be cached, ever, by the web browser. This must work for at least the following browsers: Internet Explorer versions 6-8, Firefox versions 1.5-3.0, Safari version 3, and Opera 9. Our requirement came from a security test: after logging out from our website you could press the back button and view cached pages.
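
    The question doesn't say what the pages are generated with, so the following is only a Java servlet sketch of the belt-and-braces header combination commonly used to cover both HTTP/1.1 and HTTP/1.0 caches:

        import javax.servlet.http.HttpServletResponse;

        public final class NoCacheHeaders {
            /** Sets the usual combination of anti-caching headers on a response. */
            public static void apply(HttpServletResponse response) {
                response.setHeader("Cache-Control", "no-cache, no-store, must-revalidate"); // HTTP/1.1
                response.setHeader("Pragma", "no-cache");                                    // HTTP/1.0
                response.setDateHeader("Expires", 0);                                        // a date in the past
            }
        }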

    Read the article

  • Secure login on your domain with Google App Engine

    - by mhost
    Hi, we are starting a very large web-based service project and are trying to decide what hosting environment to use. We would really like to use Google App Engine for scalability reasons and to eliminate the need to deal with servers ourselves. Secure logins/registrations are very important to us, as is using our own domain. Our target audience is not very computer savvy, so we don't want to make users sign up with OpenID, as this can't be done within our site, and we also do not want to force our customers to sign up with Google. As far as I can see, I am out of luck. I am hoping for a definite answer to this question: can I have an encrypted login to our site, accessed via our domain, without having to send the customers to another site for the login (OpenID/Google)? Thanks.

    Read the article

  • Programmatically log on to a site

    - by Shaihi
    I am not sure if this fits better on Stack Overflow, but here goes: I want to programmatically log on to http://wrds-web.wharton.upenn.edu/wrds/index.cfm?true. I tried capturing the log-on URL using Fiddler2 and HttpFox, but to no avail. Is this a server-side script that I cannot capture? If so, how can I do the log-on?
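
    For reference, a minimal sketch (in Java, purely as an illustration; the form action URL and field names are hypothetical and must be taken from the site's actual login form) of what a programmatic form log-on usually looks like -- POST the form fields and keep the session cookie from the response:

        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.net.URLEncoder;

        public class FormLogin {
            public static void main(String[] args) throws Exception {
                // Hypothetical form action and field names -- read them from the login page's HTML.
                URL url = new URL("https://example.com/login");
                String body = "username=" + URLEncoder.encode("me", "UTF-8")
                            + "&password=" + URLEncoder.encode("secret", "UTF-8");

                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
                OutputStream out = conn.getOutputStream();
                out.write(body.getBytes("UTF-8"));
                out.close();

                // A successful form login usually answers with a session cookie and a redirect.
                System.out.println("Status: " + conn.getResponseCode());
                System.out.println("Set-Cookie: " + conn.getHeaderField("Set-Cookie"));
            }
        }

    If the site logs in through a server-side script (as the question suspects), the POST target is whatever URL the form's action attribute points at, which a proxy such as Fiddler should still show once the form is actually submitted.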

    Read the article

  • How to use cookies with HttpsURLConnection in Android

    - by sajjoo
    Hello guys, I am new to Android and now I have to add cookies to my project. I am using HttpsURLConnection. This is how I make the request and get the response from the web server, and now I have to add cookies as well:

        URL url = new URL(strUrl);
        HttpsURLConnection connection = (HttpsURLConnection) url.openConnection();
        connection.setRequestMethod("POST");
        connection.setRequestProperty("Content-Type", "application/soap+xml; charset=utf-8");
        connection.setRequestProperty("Content-Length", "" + Integer.toString(request.getBytes().length));
        connection.setUseCaches(false);
        connection.setDoInput(true);
        connection.setDoOutput(true);

        // send request...
        DataOutputStream wr = new DataOutputStream(connection.getOutputStream());
        wr.writeBytes(request);
        wr.flush();
        wr.close();

        // get response...
        DataInputStream is = new DataInputStream(connection.getInputStream());
        String line;
        StringBuffer response = new StringBuffer();
        while ((line = is.readLine()) != null) {
            response.append(line);
        }
        is.close();

        FileLogger.writeFile("Soap.txt", "RESPONSE: " + methodName + "\n" + response);

        HashMap<String, String> parameters = null;
        try {
            parameters = SoapRequest.responseParser(response.toString(), methodName);
        } catch (ParserConfigurationException e) {
            e.printStackTrace();
        } catch (SAXException e) {
            e.printStackTrace();
        }
        return parameters;

    Any help will be appreciated, thanks.
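
    No cookie handler is installed by default on these Android versions, so the usual approach is to carry the cookie yourself. A sketch that continues the code above (it keeps only the name=value part of the first Set-Cookie header; the cookie name depends on the server):

        // After reading the first response, remember the cookie the server set:
        String setCookie = connection.getHeaderField("Set-Cookie");
        String sessionCookie = (setCookie != null) ? setCookie.split(";", 2)[0] : null; // e.g. "JSESSIONID=..."

        // ...and present it on the next request, before anything is written:
        HttpsURLConnection next = (HttpsURLConnection) new URL(strUrl).openConnection();
        if (sessionCookie != null) {
            next.setRequestProperty("Cookie", sessionCookie);
        }

    If the server sends several Set-Cookie headers, connection.getHeaderFields() returns all of them, and the name=value pairs can be joined with "; " into a single Cookie header.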

    Read the article

  • cURL PHP Proper SSL between private servers with self-signed certificate

    - by PolishHurricane
    I originally had a connection between my two servers running with CURLOPT_SSL_VERIFYPEER set to "false", with no Common Name in the SSL cert, to avoid errors. The following is the client code that connected to the server with the certificate:

        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);

    However, I recently changed this code (set it to true) and specified the computer's certificate in PEM format:

        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, TRUE);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
        curl_setopt($ch, CURLOPT_CAINFO, getcwd().'/includes/hostcert/Hostname.crt');

    This worked great on the local network from a test machine, as the certificate is signed with its hostname as the CN. How can I set up the PHP code so that it only trusts the host machine and maintains a secure connection? I'm well aware you can just set CURLOPT_SSL_VERIFYHOST to "0" or "1" and CURLOPT_SSL_VERIFYPEER to "false", but these are not valid solutions, as they break the SSL security.

    Read the article

  • Accessing a Secured Web Service

    - by Xstahef
    Hi, I need to connect to a provider's web service from a Windows Forms application. They gave me a certificate to access it, but I have a security problem. I have done the following steps: added the certificate to the personal store (in IE & Firefox), generated a proxy from the remote WSDL (no problem), and used this code to call a method:

        using (service1.MessagesService m = new service1.MessagesService())
        {
            X509Certificate crt = new X509Certificate(@"C:\OpenSSL\bin\thecert.p12", string.Empty);
            m.ClientCertificates.Add(crt);
            var result = m.AuthoriseTransaction(aut);
            this.textBox1.AppendText(result.id.ToString());
        }

    I get the following error: "The underlying connection was closed: Could not establish trust relationship for the SSL/TLS channel." Thanks for your help.

    Read the article

  • SSL with external static content server

    - by SirMoreno
    I have a .NET web application that, for performance reasons, gets all the static data (CSS, images, JS) from an external server that is in a different location and with a different hosting company. I want to enable SSL on my site without the users getting the message "Page contains both secure and insecure elements". Does this mean I'll have to get two SSL certificates, one for each server? If I want the users to continue getting the static content from the external server, what other options do I have? Thanks.

    Read the article

  • How do I sign an HTTP request with an X.509 certificate in Java?

    - by Rune
    How do I perform an HTTP request and sign it with an X.509 certificate using Java? I usually program in C#. Now, what I would like to do is something similar to the following, only in Java:

        private HttpWebRequest CreateRequest(Uri uri, X509Certificate2 cert)
        {
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(uri);
            request.ClientCertificates.Add(cert);
            /* ... */
            return request;
        }

    In Java I have created a java.security.cert.X509Certificate instance, but I cannot figure out how to associate it with an HTTP request. I can create an HTTP request using a java.net.URL instance, but I don't seem to be able to associate my certificate with that instance (and I'm not sure whether using java.net.URL is even appropriate).
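
    For HTTPS, the closest Java equivalent of ClientCertificates.Add is to load the certificate together with its private key into a KeyStore, build an SSLContext from it, and hand that context's socket factory to the connection. A sketch, assuming the certificate and key are available as a PKCS#12 file (the path, password and URL are placeholders):

        import java.io.FileInputStream;
        import java.net.URL;
        import java.security.KeyStore;
        import javax.net.ssl.HttpsURLConnection;
        import javax.net.ssl.KeyManagerFactory;
        import javax.net.ssl.SSLContext;

        public class ClientCertRequest {
            public static void main(String[] args) throws Exception {
                char[] password = "changeit".toCharArray();              // placeholder
                KeyStore keyStore = KeyStore.getInstance("PKCS12");
                FileInputStream in = new FileInputStream("client.p12");  // placeholder path
                keyStore.load(in, password);
                in.close();

                // The key managers present the client certificate during the TLS handshake.
                KeyManagerFactory kmf = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
                kmf.init(keyStore, password);

                SSLContext ctx = SSLContext.getInstance("TLS");
                ctx.init(kmf.getKeyManagers(), null, null); // default trust managers

                HttpsURLConnection conn = (HttpsURLConnection) new URL("https://example.com/").openConnection();
                conn.setSSLSocketFactory(ctx.getSocketFactory());
                System.out.println("Status: " + conn.getResponseCode());
            }
        }

    If the goal is instead a detached signature over the request body (WS-Security style signing) rather than TLS client authentication, that is done with java.security.Signature and the certificate's private key, independently of the HTTP layer.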

    Read the article

  • How to connect to a third-party website in classic ASP, using JavaScript for password encryption, without giving away the password

    - by Abbi
    Hi, I have to make changes to a classic ASP website where, once a button is clicked, it automatically logs in to a third-party website, with an intermediate page that warns you are logging in to a third-party site. The third party is providing us with a username and password and gave us an example JavaScript function to encode the password before sending it to them. Now, where do I store the user ID and password? I cannot execute the JavaScript on the server side; it has to go to the client. If the ASP page that has the encryption JavaScript goes to the client side, then the source can be viewed and the username and password are given away. Is there a way that I can have a hidden ASP page whose only job is to encrypt the password, create a new URL, and auto-redirect to that new URL? So when the user clicks OK on the intermediate warning page, I redirect to this hidden ASP page, which does the encryption, creates a URL for the GET method, and redirects to that page. I am a novice as far as JavaScript and classic ASP are concerned. Any ideas/advice will be appreciated. Thanks, --Abbi

    Read the article

  • HttpsURLConnection failing intermittently to the same URL

    - by Arkaitz Jimenez
    I think I'm experiencing the same issue as http://groups.google.com/group/android-developers/msg/9d37d64aad0ee357. This is the Android 1.5 SDK. I call the code below (which is in a method) several times with the same URL, and it fails intermittently. When it fails there is no exception, the stream is empty so readConnection fails, and getResponseCode returns -1. Global caching is disabled with setDefaultUseCaches(false). I suppose there must be some kind of URL connection object pool somewhere. Any idea how I can work around this?

        HttpURLConnection connection = null;
        try {
            URL url = new URL(this.url);
            connection = (HttpURLConnection) url.openConnection();
            connection.setRequestProperty("Authorization",
                    "basic " + Base64Coder.encodeString(user + ":" + password));
            connection.setRequestProperty("User-Agent", userAgent);
            connection.connect();
            readConnection(connection.getInputStream());
            connection.disconnect();
        } catch (IOException ex) {
            reportException(ex, connection.getResponseCode());
        } catch (ParserException ex) {
            reportException(ex, connection.getResponseCode());
        }
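
    A workaround that is often suggested for the spurious -1 response codes on pre-Froyo Android (a sketch, not a confirmed fix for this particular report) is to disable HTTP keep-alive, so that a stale pooled connection is never reused:

        // Call once at startup, e.g. in Application.onCreate().
        // Older Android releases have known problems with reused keep-alive connections;
        // disabling them trades connection reuse for reliability.
        if (Integer.parseInt(android.os.Build.VERSION.SDK) < 8) { // 8 == Froyo (2.2)
            System.setProperty("http.keepAlive", "false");
        }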

    Read the article

  • Strange problem on some clients' browsers

    - by Gustavo Cardoso
    We are fighting a strange problem at the company where I work. We created a site for a promotion for a client, where the client's consumers can register product barcodes to win prizes. The site was created using PHP and MySQL, and it uses SSL on every form. However, some consumers report to the client's call center that they were not able to register on the site. We have tried everything, but we cannot reproduce the problem by any means. Consumers have reported the problem on several browsers, ranging from IE8 to Firefox, and it is the same on every one of them. A co-worker was able to catch this same bug this weekend on his wife's notebook and brought her computer to the company so we could test. However, here at the company the problem didn't happen and we could register normally. We suppose this problem could be a matter of encoding and special characters like ã and ç, but we are sure that all source files are UTF-8 with BOM. We also suspect the MSXML version, but we are not sure anymore. Because of legal impediments, the client cannot ask the consumers to install anything on their computers to test or fix the problem. Sorry, but due to compliance rules we also cannot share the URL of the site, which is a pity. I know this is all very vague, but perhaps you have come across something similar. Thank you

    Read the article

  • Why are 20% of keywords still provided when Google is using HTTPS across the board?

    - by Rajesh Magar
    Most of the searches that appear in my analytics are "not provided" because Google has encrypted all of its searches. However, if all searches are now made over the HTTPS protocol, how is Google Analytics still able to track some (20%) of the organic keyword details? There are still some keywords appearing in my organic keywords section. So how does Google Analytics do this tracking? Does it bypass the HTTPS restriction on the referrer?

    Read the article

  • Problems calling web services through an HTTPS connection

    - by shivaji123
    I have made an application for BlackBerry that takes a username and password together with a URL, which is the link to a server, and then calls some web services over HTTPS. When I enter the username, password and URL and hit the login button, it calls a web service, but then the application keeps connecting to the web service forever, and after some time I get an error message along the lines of "unreported exception: the application is not responding", and then the application crashes out. I am also using the SOAP client library. This is the piece of code:

        synchronized (this) {
            try {
                _httpconn = (HttpConnection) Connector.open(url, Connector.READ_WRITE);
                _httpconn.setRequestMethod(HttpConnection.POST);
                _httpconn.setRequestProperty("SOAPAction", Constants.EXIST_STR);
                _httpconn.setRequestProperty("Content-Type", "text/soap+xml");
                _httpconn.setRequestProperty("User-Agent", "kSOAP/1.0");
                String clen = Integer.toBinaryString(input.length());
                _httpconn.setRequestProperty("Content-Length", clen);

                _out = _httpconn.openDataOutputStream();
                _out.write(input.getBytes());
                _out.flush(); // may or may not be needed.

                int rc = _httpconn.getResponseCode();
                if (rc == HttpConnection.HTTP_OK) {
                    isComplete = true;
                    _in = _httpconn.openInputStream();
                    msg = new StringBuffer();
                    byte[] data = new byte[1024];
                    int len = 0;
                    int size = 0;
                    while (-1 != (len = _in.read(data))) {
                        msg.append(new String(data, 0, len));
                        size += len;
                    }
                    responsData = msg.toString();
                    System.out.println("----------- responsData " + responsData);
                }
                if (responsData != null) isSuccessful = true;
                stop();
            } catch (InterruptedIOException interrIO) {
                System.out.println(interrIO);
                UiApplication.getUiApplication().invokeLater(new Runnable() {
                    public void run() {
                        Status.show("Network Connection hasn't succedded. Please try again later.");
                    }
                });
                isComplete = true;
                stop();
            } catch (IOException interrIO) {
                System.out.println("----------- IO EXCEPTION --------- " + interrIO);
                UiApplication.getUiApplication().invokeLater(new Runnable() {
                    public void run() {
                        Status.show("Network Connection hasn't succedded. Please try again later.");
                    }
                });
                isComplete = true;
                stop();
            } catch (Exception e) {
                System.out.println(e);
                UiApplication.getUiApplication().invokeLater(new Runnable() {
                    public void run() {
                        Status.show("Unable to connect to the internet at this time. Please try again later.");
                    }
                });
                isComplete = true;
                stop();
            } finally {
                try {
                    if (_httpconn != null) { _httpconn.close(); _httpconn = null; }
                    if (_in != null) { _in.close(); _in = null; }
                    if (_out != null) { _out.close(); _out = null; }
                } catch (Exception e) {
                    System.out.println(e);
                    UiApplication.getUiApplication().invokeLater(new Runnable() {
                        public void run() {
                            Status.show("Unable to connect to the internet at this time. Please try again later.");
                        }
                    });
                }
            }
        }

    Can anybody help me out? Thanks in advance.

    Read the article

  • Returning a 404 page when a folder is accessed from one domain, but allowing access from other domains and IP addresses

    - by okw
    Situation: I want to return a 404 page ("404.php") when a folder ("hidden") is accessed from the example.com domain. I want the same folder to be accessible from a subdomain ("hidden.example.com") or from a different domain ("hidden.com"), which are both configured in a single VirtualHost entry. The server has multiple IP addresses that it listens on; each IP address serves identical content from the example.com domain (sharing a VirtualHost entry), and I want the folder to be accessible from the IP addresses. The server is configured to use SSL/TLS/HTTPS. HTTPS is optional on example.com, but HTTPS is enforced in the .htaccess file for the hidden folder using the rewrite rule shown below.

    /www/hidden/.htaccess:

        RewriteCond %{HTTPS} !=on
        RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R,L]

    I know that %{SERVER_ADDR} gives the server's IP address, but does it return the one that the client is requesting from? I'm also starting to think that something in the VirtualHosts file would be more appropriate. Any thoughts on this?

    What should be allowed:

        http://87.65.43.21/hidden/
        https://87.65.43.21/hidden/
        http://12.34.56.78/hidden/
        https://12.34.56.78/hidden/
        http://hidden.example.com/
        https://hidden.example.com/
        http://hidden.com/
        https://hidden.com/
        http://www.hidden.com/
        https://www.hidden.com/

    What should be 404-ed with 404.php:

        http://example.com/hidden/
        https://example.com/hidden/
        http://www.example.com/hidden/
        https://www.example.com/hidden/
        http://example.com/hidden/hiddenfile.php
        https://example.com/hidden/hiddenfile.php
        etc.

    Thanks.
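
    One possible direction (a sketch only, untested; it assumes Apache with mod_rewrite and a mod_rewrite version that accepts a non-redirect status on the R flag): match the Host header in the folder's .htaccess and force a 404 only for the example.com names, leaving the other hostnames and the bare IP addresses alone.

        # /www/hidden/.htaccess -- sketch, untested
        ErrorDocument 404 /404.php

        RewriteEngine On
        # Block only the example.com names; hidden.example.com does not match this pattern.
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
        RewriteRule ^ - [R=404,L]

        # Keep the existing HTTPS enforcement for everything that is allowed through.
        RewriteCond %{HTTPS} !=on
        RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R,L]

    On an Apache version where R=404 is not honoured, an alternative is to rewrite those requests internally to 404.php and have the script send the 404 status itself.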

    Read the article

  • Local SSL connections are causing redirect loop (after Ubuntu update)

    - by codeinthehole
    Following a recent Ubuntu update, my local websites are no longer serving their pages over SSL. For example, my .htaccess file attempts to ensure /sign-in is always served over HTTPS:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} /sign-in
        RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

    However, when I make a request to /sign-in on the domain site2-local, I get the error "The page isn't redirecting properly" with the following in /var/log/apache2/error.log:

        [Tue Jun 08 12:20:57 2010] [info] [client 127.0.1.1] Connection to child 0 established (server site1-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Seeding PRNG with 656 bytes of entropy
        [Tue Jun 08 12:20:57 2010] [info] Initial (No.1) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.2) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.3) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.4) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.5) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.6) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.7) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.8) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.9) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:20:57 2010] [info] Subsequent (No.10) HTTPS request received for child 0 (server site2-local:443)
        [Tue Jun 08 12:21:12 2010] [info] [client 127.0.1.1] (70007)The timeout specified has expired: SSL input filter read failed.
        [Tue Jun 08 12:21:12 2010] [info] [client 127.0.1.1] Connection closed to child 0 with standard shutdown (server site2-local:443)

    There is a connection to site1-local (another site on my machine which shares the certificate), which I don't understand. Anyone know what is causing this issue?

    Read the article
