Search Results

Search found 24209 results on 969 pages for 'site'.


  • Does data mining qualify as abuse?

    - by Hybryd
    Hi all, today I had a strange experience with my ISP. They disabled my password for the internet connection, and when I called them they enabled it again, but they didn't say why it had happened. Over the last couple of days I had been running a data-mining script I wrote against one forum, to gather some useful information about the business I'm in. So I'm wondering whether my ISP decided that 10,000 page requests to the same site within a couple of hours looked like some kind of attack. What do you think: does it qualify as an attack? Is it even OK to data mine in that way?
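
    If the concern is tripping abuse heuristics with sheer request volume, one common mitigation is to throttle the crawl and respect robots.txt. The sketch below is only an illustration in Python; the forum URL, user-agent name, delay, and page range are hypothetical placeholders, and it assumes the third-party requests package is installed.

      import time
      import urllib.robotparser

      import requests  # assumption: the requests package is available

      BASE = "https://forum.example.com"   # hypothetical forum being mined
      DELAY_SECONDS = 5                    # ~720 requests/hour instead of thousands

      robots = urllib.robotparser.RobotFileParser(BASE + "/robots.txt")
      robots.read()

      def fetch(path):
          """Fetch one page politely: honour robots.txt and pause between requests."""
          url = BASE + path
          if not robots.can_fetch("my-forum-miner", url):
              return None
          response = requests.get(url, headers={"User-Agent": "my-forum-miner"}, timeout=30)
          time.sleep(DELAY_SECONDS)        # fixed pause keeps the request rate modest
          return response.text

      if __name__ == "__main__":
          for page in range(1, 11):        # hypothetical page numbers to mine
              html = fetch(f"/threads?page={page}")

    Spreading the same 10,000 requests over a day or two, and identifying the client honestly in the User-Agent header, makes the traffic look far less like an attack.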


  • Is there a USB ethernet (wired) adapter that is really compatible with Windows 7 64-bit?

    - by nbolton
    I've checked the Windows 7 compatibility site, and it lists a fair few USB ethernet (wired, not wireless) adapters that should work with Windows 7 64-bit. However, whenever I Google a model number together with Windows 7 64-bit, there are many forum posts claiming that the device actually doesn't work with 64-bit (but does work with 32-bit). I've found the same with the LUPO USB ethernet adapter; it works with 32-bit Win7, but not with 64-bit (no drivers available). So is there anyone out there who is 100% certain of, and has actually used successfully, a USB ethernet adapter that works with 64-bit Win7?


  • Why is it good to have website content files on a separate drive other than system (OS) drive?

    - by Jeffrey
    I am wondering what benefits I would gain from moving all website content files from the default inetpub directory (C:) to something like D:\wwwroot. By default IIS creates a separate application pool for each website, and I am using the built-in user and group (IUSR/IIS_IUSRS) as the authentication method. I've made sure each site directory has the appropriate permission settings, so I am not sure what benefits I would gain. Some of the environment settings are as follows: VMware, Windows 2008 R2 64-bit, IIS 7.5, C:\inetpub\site1, C:\inetpub\site2. Also, as this article (moving the iis7 inetpub directory to a different drive) points out, I'm not sure it's worth the trouble to migrate files to a different drive: PLEASE BE AWARE OF THE FOLLOWING: WINDOWS SERVICING EVENTS (I.E. HOTFIXES AND SERVICE PACKS) WOULD STILL REPLACE FILES IN THE ORIGINAL DIRECTORIES. THE LIKELIHOOD THAT FILES IN THE INETPUB DIRECTORIES HAVE TO BE REPLACED BY SERVICING IS LOW BUT FOR THIS REASON DELETING THE ORIGINAL DIRECTORIES IS NOT POSSIBLE.


  • Equivalent of LogRotate for Windows?

    - by mfinni
    We have a huge logfile being written by a vendor's application. Let's assume the vendor won't do anything that we ask. Is there any way of rotating that logfile somehow? We're looking at about 300 MB an hour being written - I'd much rather chunk that into 10 MB pieces, and let anything older than a day or beyond 1000 files fall off a cliff. (I know, I know, possible duplicate of "How do you rotate apache logs on windows without interrupting service?") Aha - the Chomp log was dead, but searching for "chomp logrotate" brought me to its new site. I'll give it a try tomorrow and reply if I like it. I'd still like to hear about software anyone else is using that works for this.
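
    In the meantime, the rotation policy described above (10 MB chunks, drop anything older than a day or beyond 1000 files) is simple enough to script. The following Python sketch is only an illustration: the paths are hypothetical, and the copy-then-truncate approach works only if the vendor's application keeps appending to the same open file handle rather than reopening the file.

      import os
      import time
      import shutil
      from pathlib import Path

      LOG_FILE = Path(r"C:\vendor\app.log")      # hypothetical path to the vendor log
      CHUNK_DIR = Path(r"C:\vendor\rotated")     # where rotated chunks are kept
      MAX_CHUNK_BYTES = 10 * 1024 * 1024         # 10 MB per chunk
      MAX_AGE_SECONDS = 24 * 60 * 60             # drop chunks older than a day
      MAX_CHUNKS = 1000                          # or beyond 1000 files

      def rotate_if_needed():
          """Copy-then-truncate rotation: safe only if the writer keeps appending
          to the same open handle rather than re-creating the file."""
          if LOG_FILE.stat().st_size < MAX_CHUNK_BYTES:
              return
          CHUNK_DIR.mkdir(exist_ok=True)
          chunk = CHUNK_DIR / time.strftime("app-%Y%m%d-%H%M%S.log")
          shutil.copy2(LOG_FILE, chunk)          # snapshot the current contents
          with open(LOG_FILE, "r+b") as f:       # then truncate the live file in place
              f.truncate(0)

      def prune_old_chunks():
          chunks = sorted(CHUNK_DIR.glob("app-*.log"), key=os.path.getmtime)
          now = time.time()
          for i, chunk in enumerate(chunks):
              too_old = now - os.path.getmtime(chunk) > MAX_AGE_SECONDS
              too_many = len(chunks) - i > MAX_CHUNKS
              if too_old or too_many:
                  chunk.unlink()

      if __name__ == "__main__":
          rotate_if_needed()
          prune_old_chunks()

    Scheduled every few minutes from Task Scheduler, something like this approximates logrotate's copytruncate mode; whether that is actually safe depends entirely on how the vendor's application writes its log.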


  • Set preferred language in Chrome and other Google services

    - by Super Chicken
    Whenever I'm abroad and access Google search (via the Chrome browser, on my own laptop) or other Google services, they are presented to me in the local language. How can I get Google services displayed in English and instruct Chrome to use google.com (instead of the country-specific site)? My language setting in Windows is English, so Chrome should already use this by default, and I've also set my language preference in iGoogle to English (U.S.), yet if I'm in France, for example, my searches take place on google.fr and sites like Google News are in French. Chrome tries to be helpful by offering to translate these pages for me, but it would be far better to be directed to the original English versions of these sites in the first place. How do I fix this?


  • Interface to collect the status of remote backups

    - by Aseques
    I would like to deploy into our infrastructure a web interface that can register when the copies have finished, and flag when for some reason they haven't. The current situation is that we do on-site backups for customers, and for each backup a mail is sent at the end of the run. The problem is that sometimes the mail isn't sent, for a variety of reasons: the system doesn't have internet access, the backup system crashed before sending the mail, etc. What I'd like is a web interface that the backup software can visit after finishing the backup (whether it succeeded or failed) to acknowledge that the backup has completed; after some time, I'd like to receive a report of the machines that haven't checked in. Is there anything remotely similar to this that I could use or adapt to our environment? UPDATE: I just found this (paessler.com), which seems to be a proprietary solution for what I intended.
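
    The "backup job visits a URL" part can be prototyped with a very small check-in endpoint. The sketch below is a rough illustration using only the Python standard library; the port, state-file location, and the host/status query parameters are all assumptions, not part of any existing product.

      import json
      import time
      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs

      STATE_FILE = "backup_checkins.json"    # hypothetical location for the check-in state

      class CheckinHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              # Expected call from the backup job: /checkin?host=NAME&status=ok|fail
              query = parse_qs(urlparse(self.path).query)
              host = query.get("host", ["unknown"])[0]
              status = query.get("status", ["ok"])[0]
              try:
                  with open(STATE_FILE) as f:
                      state = json.load(f)
              except FileNotFoundError:
                  state = {}
              state[host] = {"time": time.time(), "status": status}
              with open(STATE_FILE, "w") as f:
                  json.dump(state, f)
              self.send_response(200)
              self.end_headers()
              self.wfile.write(b"acknowledged")

      if __name__ == "__main__":
          HTTPServer(("0.0.0.0", 8080), CheckinHandler).serve_forever()

    A companion scheduled job would read the same JSON file and report any host whose last check-in is older than the expected backup interval, which covers the "mail never arrived" failure modes described above.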


  • Is there a precedent for the license on a compiler restricting the kind of development you can use it for?

    - by Jim McKeeth
    It was recently let slip that the new EULA for Delphi XE3 will prohibit client/server development with the Professional edition unless an additional Client/Server license pack is purchased. This is not to say the Professional version will lack the features, but the license will specifically prohibit the developer from using the compiler for a specific class of development, even with third-party or home-grown solutions. So my question is whether there is a precedent for a compiler or similar creative tool prohibiting the class of work you can use it for - specifically a commercially licensed "professional" tool like Delphi XE3. Also, would such a restriction be legally enforceable? I know there have been educational or starter editions of tools in the past that restricted their use for commercial purposes, but those were not sold as "professional" tools. I also know that a lot of computing software and equipment carries a disclaimer that it is not for use in "life support equipment" or "nuclear power", but that is more about avoiding liability than prohibiting activity. I seem to recall Microsoft putting a restriction in FrontPage that you couldn't use it to create a web site that reflected poorly on Microsoft, but they pulled that restriction before it could be tested legally.


  • Stupid question: prevent a file from being changed under Linux

    - by Josh
    OK, stupid question. I have forgotten what this is called, and without remembering the name the search function on this site and on Google is failing me. What's the command under Linux to mark a file as "locked", i.e. to prevent any changes from being made to it? I'm not talking about chmod. There's an attribute that can be set (again, the name escapes me at the moment) which prevents even processes running as root from changing a file. What is this called and how do I set it?


  • Unable to set NTFS permissions for ApplicationPoolIdentity on Windows 2008 SP2

    - by Kev
    On Windows 2008 R2 I am able to set NTFS permissions for an application pool's synthesised ApplicationPoolIdentity account thus: ICACLS d:\websites\site1\www /grant "IIS AppPool\site1":(CI)(OI)(M) The website's application pool is named site1 and is configured to run as ApplicationPoolIdentity. The site's authentication is also configured to authenticate as ApplicationPoolIdentity. I've done this a thousand times on Windows 2008 Standard Edition R2 with never a hitch. However if I try to do the same in Windows 2008 Standard Edition SP2 I get the error: IIS AppPool\site1: No mapping between account names and security IDs was done. Successfully processed 0 files; Failed processing 1 files I also notice that this fails if I try to set permissions for the application pool identity via the security GUI as well. I've seen this before and a reboot has cleared this issue but I'd like to know why this happens periodically. Googling around suggests other folks have hit this problem but there's never a satisfactory explanation. Why would this be?


  • Using HTML5 Today part 4 – What happened to XHTML?

    - by Steve Albers
    This is the fourth entry in a series of descriptions & demos from the “Using HTML5 Today” user group presentation. For practical purposes, the original XHTML standard is a historical footnote, although XHTML Transitional will probably live on forever in the default web page templates of old web page editors. The original XHTML spec was released in 2000, on the heels of the HTML 4.01 spec. The plan was to move web development away from HTML to the more formal, rigorous approach that XHTML offered, but it was built on a principle that conflicts with the history and culture of the Internet: XHTML introduced the idea of Draconian Error Handling, which essentially means that invalid XML markup on a page will cause the page to stop rendering. There is a transitional mode offered in the original XHTML spec, but the goal was to move to D.E.H. You can see the result by serving a document with the MIME type “application/xhtml+xml” - for my class example we change this setting in the web.config file: <staticContent> <remove fileExtension=".html" /> <mimeMap fileExtension=".html" mimeType="application/xhtml+xml" /> </staticContent> With the new strict syntax a simple error, in this case a duplicate </td> tag, can cause a critical page error. While XHTML became very popular in the ensuing decade, the Strict form of XHTML never achieved widespread use. Draconian Error Handling was one of the factors that led in time to the creation of the WHATWG, or Web Hypertext Application Technology Working Group. The WHATWG contributed to the eventual disbanding of the XHTML 2.0 working group and the W3C’s move to embrace the HTML5 standard. For developers who long for XML markup, the W3C HTML5 standard includes an XHTML5 syntax. For a longer, more definitive look at what happened to XHTML and how HTML5 came to be, check out the Dive Into HTML mirror site or Bruce Lawson’s “HTML5: Who, What, When, Why” talk.
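
    The difference between the two error-handling philosophies is easy to demonstrate outside the browser. The short Python sketch below is only an illustration of the concept (it is not part of the original presentation): an XML parser rejects the whole fragment on the first well-formedness error, while a forgiving HTML parser keeps going.

      import xml.etree.ElementTree as ET
      from html.parser import HTMLParser

      # A fragment with the kind of mistake mentioned above: a stray duplicate </td>.
      BROKEN = "<table><tr><td>cell</td></td></tr></table>"

      # The XML/XHTML (draconian) model: one error and the whole document is refused.
      try:
          ET.fromstring(BROKEN)
      except ET.ParseError as err:
          print("XML parser refused the document:", err)

      # The HTML (tag-soup) model: the parser shrugs and carries on.
      class Collector(HTMLParser):
          def handle_data(self, data):
              print("HTML parser still extracted:", data)

      Collector().feed(BROKEN)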


  • Blocking apache access via user agent string

    - by Tchalvak
    I've got a scripter who is using a proxy to attack a website I'm serving. I've noticed that they tend to access the site via software with a certain common user-agent string (i.e. http://www.itsecteam.com/en/projects/project1_page2.htm "Havij advanced sql injection software" with a user-agent string of Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727) Havij). I'm aware that any cracking software worth its salt will probably be able to modify its user-agent string, but I'm fine with the scripter having to deal with that at some point. So, is there any software out there for automatically blocking access and permanently blacklisting by matching user-agent strings?
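
    Absent a ready-made tool, a fail2ban-style log watcher is one way to get the "match the user agent, blacklist the IP permanently" behaviour. The Python sketch below is a rough illustration only: the log location, combined log format, and the Apache 2.2-style "Deny from" output file are all assumptions about the environment.

      import re
      from pathlib import Path

      ACCESS_LOG = Path("/var/log/apache2/access.log")     # assumed log location
      BLACKLIST = Path("/etc/apache2/ua-blacklist.conf")   # file Include'd by the vhost (assumption)
      BAD_AGENT = re.compile(r"Havij", re.IGNORECASE)      # user-agent fragment to match

      # Combined log format: client IP is the first field, user agent is the last quoted field.
      LINE = re.compile(r'^(\S+) .* "([^"]*)"$')

      def scan():
          banned = set()
          if BLACKLIST.exists():
              banned.update(re.findall(r"\d+\.\d+\.\d+\.\d+", BLACKLIST.read_text()))
          for line in ACCESS_LOG.read_text(errors="replace").splitlines():
              match = LINE.match(line)
              if match and BAD_AGENT.search(match.group(2)):
                  banned.add(match.group(1))
          # Emit Apache 2.2 "Deny from" lines; the blacklist persists across runs.
          BLACKLIST.write_text("".join(f"Deny from {ip}\n" for ip in sorted(banned)))

      if __name__ == "__main__":
          scan()

    Run from cron, this approximates what fail2ban does with a custom filter; fail2ban itself can also be pointed at the access log with a filter that matches the user-agent field, which may be less work than maintaining a homegrown script.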


  • HTTP Redirect from www.mydomain.com to my amazon ec2 account (instance)?

    - by fabius
    Hello! I have a domain that is registered with one service provider, but my site (a WordPress blog) is hosted in a shared account with a friend at another hosting service. I want to separate from this friend because I'm tired of bothering him with my blog's downtime. Now, my problem is that I signed up for the Amazon EC2 service and created an instance (a virtual machine) to host my WordPress blog, and I'd like to point mydomain.com at this instance at Amazon EC2, but I don't know how to proceed to achieve that. The instance at Amazon EC2 is up and running (it's a 64-bit Linux machine), but I couldn't redirect mydomain.com to the instance from my hosting service's web panel. Could someone help me, please?


  • How do I elevate privileges when running appcmd from a nant task?

    - by Rune
    We are using a Windows 7 box as a build server. As part of our continuous integration process I would like to stop and start an IIS 7 website. I have tried doing this from the command line using appcmd: appcmd start site "my website". However, this only works if I start the console window by choosing "Run as Administrator", so it won't work out of the box from NAnt etc. How do I script appcmd so that it runs with elevated privileges (or am I going about this in the wrong way)? Thank you.


  • Completely unable to install python2.7-dev

    - by gath
    I'm totally stuck! I'm completely unable to install python2.7-dev on Ubuntu 13.04. I have tried all the tricks mentioned on this site (askubuntu.com) and many other sites on the web, but still nothing. I'm running Ubuntu 13.04 64-bit in a VirtualBox, but every time I run sudo apt-get install python2.7-dev, I get the following error: python2.7-dev : Depends: python2.7 (= 2.7.3-0ubuntu3.5) but 2.7.4-2ubuntu3 is to be installed Depends: libpython2.7 (= 2.7.3-0ubuntu3.5) but 2.7.4-2ubuntu3 is to be installed Depends: libexpat1-dev but it is not going to be installed Depends: libssl-dev but it is not going to be installed E: Unable to correct problems, you have held broken packages. I've tried apt-get update, but still nothing. I've even tried installing Python 2.7 from source, but nothing works. Is there a single package with all the dependencies I can download that will just install everything (python2.7-dev), or is there another trick I can use to get the Python dev headers installed on my machine? Hint: I've noticed that when I run sudo apt-get update, somewhere along the updates I see some errors: ... Get:1 http://us.archive.ubuntu.com precise-updates Release.gpg [198 B] Ign http://us.archive.ubuntu.com raring Release Ign http://us.archive.ubuntu.com raring-updates Release Ign http://us.archive.ubuntu.com raring-backports Release Ign http://us.archive.ubuntu.com raring-security Release Get:2 http://us.archive.ubuntu.com precise-updates Release [98.7 kB] Err http://extras.ubuntu.com raring/main Sources 404 Not Found Err http://extras.ubuntu.com raring/main amd64 Packages 404 Not Found Err http://extras.ubuntu.com raring/main i386 Packages 404 Not Found Ign http://extras.ubuntu.com raring/main Translation-en_US Ign http://extras.ubuntu.com raring/main Translation-en ... My precise-updates.list file contains a single entry: deb http://us.archive.ubuntu.com/ubuntu/ precise-updates main restricted I don't know if that might help. Help! Paul


  • Suggestions for a self-serve advertising service

    - by Mystere Man
    I am seeking a self-serve advertising service for my websites, but I have a few requirements that seem to make what I'm looking for hard to find. Specifically, I want to place "advertise here" links on my pages and allow end users to purchase advertising for that site, page, and location. These ads will not be part of a national network. Supports multi-tenancy - that is, I have a number of domains using the same "web application" but with customized content per domain; when a customer wants to advertise on a given domain, the ads will only appear on that domain and on that page of the domain (even though the page name may be the same across multiple domains). Supports fixed ad prices, not just CPC - I need monthly and quarterly pricing regardless of performance. Integrates with OpenX and other ad networks, so that if there is no self-serve ad in a given zone, it will fall back to national or direct advertising. Shiny Ads has much of this, but I'm looking for alternatives, as their prices are a bit crazy (20%) and they can only do PayPal.


  • Can't connect back to the wireless network after the password was changed

    - by 7777
    Family changed the network password and some other network settings after new computers were brought into the house because apparently they wouldn't work with what we had. Actually an off-site tech remotely changed it, and I have no idea what he did. My laptop detects the network (it shows up under available networks) but whenever I try to connect it says: Windows is unable to connect to the selected network. The network may no longer be in range. Please refresh the list of available networks, and try to connect again. I wish I could give more details, config settings, but frankly I have no idea what I'm looking for. This is XP (also, not a password issue, I know the password, it's just that I have no idea where to enter it, etc.)


  • Is it possible to use a VB master page to cover an entirely separate directory written in C#?

    - by Jason Weber
    I have a company website written in VB.NET. There are 5 master pages. I recently began using a forum application, also ASP.NET 4.0, but this one is written in C#. My forum directory is domain.com/knowledgebase/. Is there any possible way to take one of my VB.NET master pages and somehow integrate it into the /knowledgebase/ directory? Here's what's currently in place. This is what's at the top of every page in my site: <%@ Page Title="USS Vision Inc." Language="VB" MasterPageFile="~/homepage.master" AutoEventWireup="false" CodeFile="default.aspx.vb" Inherits="_default" culture="auto" meta:resourcekey="PageResource1" uiculture="auto" Debug="true" %> This is what's in my /knowledgebase/ directory: <%@ Page Language="C#" AutoEventWireup="true" ValidateRequest="false" Inherits="YAF.ForumPageBase" culture="auto" uiculture="auto" %> <%@ Register TagPrefix="YAF" Assembly="YAF" Namespace="YAF" %> <script runat="server"> Is it somehow possible to use, for instance, homepage.master in the /knowledgebase/ directory? If so, how would I accomplish this? Thanks for any guidance anybody can offer!


  • Likelihood of obtaining the same IP address after restarting a router

    - by Raffael
    My actual objective is to simulate the logged IPs of website users who are all assumed to use dynamically assigned IPs. There will be two kinds of users: good users, who only change IP when the ISP assigns a new one, and bad users, who will restart their router to obtain a new IP. So what I would like to understand is what assignment mechanics are usually at work here: from what pool of IPs one is chosen, and whether the probability is uniformly distributed. I know there is no definite and global answer, as this process can be adjusted by the ISP, but maybe there is something like a technological frame and common process that allows some plausible assumptions. UPDATE: A bad user will restart the router as often as necessary. So here the central question is how many IP changes, on average, are needed to end up with a previously used IP.
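
    The "how many restarts until a previously used IP comes back" question can at least be explored with a toy model. The Python sketch below assumes a uniform draw from a fixed pool, which is a simplification - real DHCP/PPPoE servers often prefer to re-issue the previous lease - so the pool size and the uniformity are explicitly assumptions.

      import random

      POOL_SIZE = 4096        # hypothetical size of the ISP's dynamic address pool
      TRIALS = 10_000

      def changes_until_repeat(pool_size):
          """Draw addresses uniformly from the pool until one repeats; return the count."""
          seen = set()
          changes = 0
          while True:
              ip = random.randrange(pool_size)
              changes += 1
              if ip in seen:
                  return changes
              seen.add(ip)

      if __name__ == "__main__":
          results = [changes_until_repeat(POOL_SIZE) for _ in range(TRIALS)]
          print("average changes until a repeat:", sum(results) / len(results))
          # Under the uniform assumption this tends toward roughly sqrt(pi * n / 2)
          # (the birthday problem), i.e. about 80 changes for a pool of 4096 addresses.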


  • apache/debian squeeze server loading directory listing instead of website

    - by Diego
    When you navigate to mywebsite.com/ you see an Apache index page showing a folder called mywebsite.com/; clicking there then takes me to mywebsite.com/mywebsite.com, which doesn't exist, so WordPress shows me a 404 error. I'm trying to host a WordPress site at mywebsite.com/ but I think I have some kind of directory listing wrong somewhere, though I'm pretty sure I've set up my /etc/apache2/sites-available/mywebsite.com correctly:

      <VirtualHost *:80>
          ServerName mywebsite.com
          ServerAdmin [email protected]
          DocumentRoot /var/www/mywebsite.com/
          <Directory />
              Options FollowSymLinks
              AllowOverride All
          </Directory>
          ErrorLog /var/log/apache2/error.log
          CustomLog /var/log/apache2/access.log combined
          LogLevel warn
      </VirtualHost>


  • Best practice, or generally best way to set up web-hosting server, permissions, etc.

    - by Jagot
    Hi, I'm about to set up a server on which a friend and I will be hosting web sites, and I'll be using Debian. I've set up a LAMP stack many times for local testing purposes, but never for actual production use. I was wondering what the best practices are in terms of setting the server up, specifically with reference to access to the web root directory. A couple of the options I have seen: set up a single user account on the server for us both to use, and use a virtual host that points to somewhere in the home directory, e.g. /home/webdev/www; or set up a user account for each of us and grant permissions in some way to /var/www (what would be the best way - set up a new group?). I want to get this right when I first set it up, as there won't be any going back for a while once our first site is up and running. I appreciate any guidance in advance.


  • Install/import SSL certificate on Windows Server 2003/IIS 6.0

    - by ChristianSparre
    Hi. A couple of months ago we ordered an SSL certificate for a client's server using the request guide in IIS 6.0. This worked fine, and the guide was completed when we received the certificate. But about 2 weeks ago the server crashed and had to be restored. Now I can't seem to get the site running. I have the .cer file, but what is the correct procedure to import the certificate? I hope some of you can help me. -- Christian


  • Any mobile-friendly Credit Card billing solutions for mobile sites similar to Bango?

    - by Programmer
    Are there any mobile-friendly credit card billing solutions for mobile sites similar to Bango? The advantages of Bango I have seen, compared to regular credit card solutions, that make it considerably "mobile-friendly" are: 1) It does not require the user to enter their full name and billing address to make a payment. The user is only required to enter their credit card number, expiration date, and CVC code (if they are in the U.S., they will also have to enter their Zip Code). That is significantly less input than is normally required for credit card payments, which is a big plus on small mobile keypads. 2) After a user makes an initial credit card payment, their details are stored by Bango, and the next time the user needs to make a payment with the same credit card, they just have to click a single link and it processes the payment against their stored card. Needless to say, this is very convenient for mobile users, as it is analogous to direct carrier billing as far as the user is concerned, since they won't need to input any details. The downside with Bango is that their fees are higher than others, all payments must be processed via their site and branding, there is a high minimum ($1.99) and a low maximum ($30) on how much you can charge users, and you need to pay a monthly fee on top of the high transaction costs. It is due to the downsides mentioned above that I am looking for an alternative solution that also offers advantages 1) and 2) above. Is there anything like that? I looked at JunglePay and they do neither 1) nor 2).


  • Can't access websites

    - by LiveEn
    Recently I have had problems accessing some websites. When I try to access them it says "This webpage is not available." I tried accessing the sites through Firefox, Internet Explorer and Chrome. I also tried using a web proxy, but still the same problem. This problem only occurs on my desktop PC; all the websites work fine on my laptop. Currently I can't access yahoo.com, download.com, bing.com, proxy.org, daniweb.com, aol.com and many forums. I checked the hosts file, but nothing is blocked in it. Can someone please suggest what is wrong? Thanks
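
    One quick way to narrow this down is to check, from the affected desktop, whether the failure happens at DNS resolution or at the TCP connection. The Python sketch below is just a diagnostic illustration; the host list is taken from the question and port 80 is an assumption.

      import socket

      HOSTS = ["yahoo.com", "download.com", "bing.com", "aol.com"]  # sites from the question

      for host in HOSTS:
          try:
              ip = socket.gethostbyname(host)                      # does DNS resolution work?
          except socket.gaierror as err:
              print(f"{host}: DNS lookup failed ({err})")
              continue
          try:
              with socket.create_connection((ip, 80), timeout=5):  # can we reach port 80?
                  print(f"{host}: resolved to {ip}, TCP connect OK")
          except OSError as err:
              print(f"{host}: resolved to {ip}, but connect failed ({err})")

    DNS failures point at the resolver settings on the desktop, while resolve-then-connect failures point more toward a firewall, proxy, or routing problem.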


  • What should you do when presented with a horrible design?

    - by plua
    Our firm makes websites. We also design websites. But sometimes a client brings his or her own design. This is often made by an in-house designer, or it is the same design they used for something else. However, sometimes these designs look awful. And I am talking really unprofessional, unbalanced, uncool. But the client really wants this design. I really do not like working with a design that is so awful. It takes away all the pleasure in coding. You code. You check the demo. Works great. Looks awful. It's just not fun. And ultimately the client might be happy, but 1) I do not feel proud of the final product and 2) the community sees you 'develop' ugly websites, which is bad for your image. Anybody experiencing this kind of stuff? What do you recommend? I've been thinking: Blocking these clients - if somebody has their 'own' design, ask to see it first, then somehow politely decline. Drawback: you lose a client. Creating a new design - have our in-house designers work on something really cool. Drawbacks: the client would need to pay for this (without asking for it), or it will be declined and the company loses time = money. And it might come across as an insult if you propose a new design out of the blue; THEIR designer won't like it for sure. Putting a clear disclaimer at the bottom of the site: Website design by XXXXX, website development by US. Helps with the community impact (if people pay attention), but not with the uneasy feeling.


  • curl can't verify cert using capath, but can with cacert option

    - by phylae
    I am trying to use curl to connect to a site using HTTPS, but curl is failing to verify the SSL cert.

      $ curl --verbose --capath ./certs/ --head https://example.com/
      * About to connect() to example.com port 443 (#0)
      *   Trying 1.1.1.1... connected
      * Connected to example.com (1.1.1.1) port 443 (#0)
      * successfully set certificate verify locations:
      *   CAfile: none
          CApath: ./certs/
      * SSLv3, TLS handshake, Client hello (1):
      * SSLv3, TLS handshake, Server hello (2):
      * SSLv3, TLS handshake, CERT (11):
      * SSLv3, TLS alert, Server hello (2):
      * SSL certificate problem, verify that the CA cert is OK. Details:
        error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
      * Closing connection #0
      curl: (60) SSL certificate problem, verify that the CA cert is OK. Details:
      error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
      More details here: http://curl.haxx.se/docs/sslcerts.html

      curl performs SSL certificate verification by default, using a "bundle" of
      Certificate Authority (CA) public keys (CA certs). If the default bundle file
      isn't adequate, you can specify an alternate file using the --cacert option.
      If this HTTPS server uses a certificate signed by a CA represented in the
      bundle, the certificate verification probably failed due to a problem with the
      certificate (it might be expired, or the name might not match the domain name
      in the URL).
      If you'd like to turn off curl's verification of the certificate, use the -k
      (or --insecure) option.

    I know about the -k option, but I do actually want to verify the cert. The certs directory has been properly hashed with c_rehash . and it contains: a Verisign intermediate cert, and two self-signed certs. The above site should be verified with the Verisign intermediate cert. When I use the --cacert option instead (and point directly to the Verisign cert) curl is able to verify the SSL cert.

      $ curl --verbose --cacert ./certs/verisign-intermediate-ca.crt --head https://example.com/
      * About to connect() to example.com port 443 (#0)
      *   Trying 1.1.1.1... connected
      * Connected to example.com (1.1.1.1) port 443 (#0)
      * successfully set certificate verify locations:
      *   CAfile: ./certs/verisign-intermediate-ca.crt
          CApath: /etc/ssl/certs
      * SSLv3, TLS handshake, Client hello (1):
      * SSLv3, TLS handshake, Server hello (2):
      * SSLv3, TLS handshake, CERT (11):
      * SSLv3, TLS handshake, Server finished (14):
      * SSLv3, TLS handshake, Client key exchange (16):
      * SSLv3, TLS change cipher, Client hello (1):
      * SSLv3, TLS handshake, Finished (20):
      * SSLv3, TLS change cipher, Client hello (1):
      * SSLv3, TLS handshake, Finished (20):
      * SSL connection using RC4-SHA
      * Server certificate:
      *    subject: C=US; ST=State; L=City; O=Company; OU=ou1; CN=example.com
      *    start date: 2011-04-17 00:00:00 GMT
      *    expire date: 2012-04-15 23:59:59 GMT
      *    common name: example.com (matched)
      *    issuer: C=US; O=VeriSign, Inc.; OU=VeriSign Trust Network; OU=Terms of use at https://www.verisign.com/rpa (c)10; CN=VeriSign Class 3 Secure Server CA - G3
      * SSL certificate verify ok.
      > HEAD / HTTP/1.1
      > User-Agent: curl/7.19.7 (x86_64-pc-linux-gnu) libcurl/7.19.7 OpenSSL/0.9.8k zlib/1.2.3.3 libidn/1.15
      > Host: example.com
      > Accept: */*
      >
      < HTTP/1.1 404 Not Found
      HTTP/1.1 404 Not Found
      < Cache-Control: must-revalidate,no-cache,no-store
      Cache-Control: must-revalidate,no-cache,no-store
      < Content-Type: text/html;charset=ISO-8859-1
      Content-Type: text/html;charset=ISO-8859-1
      < Content-Length: 1267
      Content-Length: 1267
      < Server: Jetty(7.2.2.v20101205)
      Server: Jetty(7.2.2.v20101205)
      <
      * Connection #0 to host example.com left intact
      * Closing connection #0
      * SSLv3, TLS alert, Client hello (1):

    In addition, if I try hitting one of the sites using a self-signed cert and the --capath option, it also works. (Let me know if I should post an example of that.) This implies that curl is finding the cert directory, and that it is properly hashed. Finally, I am able to verify the SSL cert with openssl, using its -CApath option.

      $ openssl s_client -CApath ./certs/ -connect example.com:443
      CONNECTED(00000003)
      depth=3 /C=US/O=VeriSign, Inc./OU=Class 3 Public Primary Certification Authority
      verify return:1
      depth=2 /C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=(c) 2006 VeriSign, Inc. - For authorized use only/CN=VeriSign Class 3 Public Primary Certification Authority - G5
      verify return:1
      depth=1 /C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3
      verify return:1
      depth=0 /C=US/ST=State/L=City/O=Company/OU=ou1/CN=example.com
      verify return:1
      ---
      Certificate chain
       0 s:/C=US/ST=State/L=City/O=Company/OU=ou1/CN=example.com
         i:/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3
      ---
      Server certificate
      -----BEGIN CERTIFICATE-----
      <cert removed>
      -----END CERTIFICATE-----
      subject=/C=US/ST=State/L=City/O=Company/OU=ou1/CN=example.com
      issuer=/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3
      ---
      No client certificate CA names sent
      ---
      SSL handshake has read 1563 bytes and written 435 bytes
      ---
      New, TLSv1/SSLv3, Cipher is RC4-SHA
      Server public key is 2048 bit
      Secure Renegotiation IS NOT supported
      Compression: NONE
      Expansion: NONE
      SSL-Session:
          Protocol  : TLSv1
          Cipher    : RC4-SHA
          Session-ID: D65C4C6D52E183BF1E7543DA6D6A74EDD7D6E98EB7BD4D48450885188B127717
          Session-ID-ctx:
          Master-Key: 253D4A3477FDED5FD1353D16C1F65CFCBFD78276B6DA1A078F19A51E9F79F7DAB4C7C98E5B8F308FC89C777519C887E2
          Key-Arg   : None
          Start Time: 1303258052
          Timeout   : 300 (sec)
          Verify return code: 0 (ok)
      ---
      QUIT
      DONE

    How can I get curl to verify this cert using the --capath option?
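
    As a cross-check that the hashed directory itself is usable, the same capath-style lookup can be exercised from Python's ssl module; the sketch below is only a diagnostic illustration (the hostname and directory come from the question, everything else is an assumption).

      import socket
      import ssl

      HOST = "example.com"        # host from the question
      CA_PATH = "./certs/"        # the c_rehash'ed directory of CA certs

      # PROTOCOL_TLS_CLIENT enables hostname checking and CERT_REQUIRED, and it does
      # not load the system default CA bundle, so only certificates found via the
      # capath directory can satisfy verification.
      context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
      context.load_verify_locations(capath=CA_PATH)

      with socket.create_connection((HOST, 443), timeout=10) as sock:
          with context.wrap_socket(sock, server_hostname=HOST) as tls:
              print("negotiated:", tls.version())
              print("peer subject:", tls.getpeercert()["subject"])

    If this succeeds while curl with --capath still fails, the problem is more likely in how that particular curl/OpenSSL build locates or reads the hashed directory (for example, a hash-format mismatch from the OpenSSL version used by c_rehash) than in the certificates themselves.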

