Old March 28, 2009, 12:16 PM   #1
tyme
Staff
 
Join Date: October 13, 2001
Posts: 3,355
TFL and SSL ("invalid certificate" messages)

Recently we've gotten more than a few complaints about the SSL certificate in use at TFL.

As the certificate's validity dates show, we created the current self-signed cert over two years ago, before the recent hysterical Firefox changes. The self-signed cert costs nothing and allows anyone to use SSL to access TFL if they're worried about their boss/neighbor/whoever snooping. This use of SSL for non-sensitive purposes is not futile or stupid, IMO. Furthermore, with proper diligence -- checking the certificate's hash (fingerprint), ideally from multiple locations and over time -- it is possible to achieve reasonable confidence in the legitimacy of a self-signed certificate.
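For anyone curious what "checking the hash" looks like in practice, here's a minimal sketch using the standard openssl tools. The file paths and CN are placeholders (this just generates a throwaway self-signed cert locally and prints its fingerprint); against a live site you'd fetch the served cert with `openssl s_client -connect host:443` instead and compare the fingerprint you see from several networks over time.

```shell
# Create a throwaway self-signed certificate (placeholder name, not TFL's real cert)
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo.key \
    -out /tmp/demo.crt -days 365 -subj "/CN=example.invalid"

# Print the fingerprint -- the value visitors would compare across
# locations and over time to gain confidence in a self-signed cert
openssl x509 -in /tmp/demo.crt -noout -fingerprint -sha256
```

If the fingerprint you see today from home matches the one you saw last month from work, a man-in-the-middle would have had to intercept both connections consistently, which is what makes the check worthwhile.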

I don't want to go into the details of Firefox's obnoxious behavior, or the reasons for it, here. (Google "firefox self-signed certificates" for heated opinions on both sides of the argument.) I will point out that MSIE 7, Google Chrome, and Opera all handle self-signed certificates in a much more reasonable way. There is also a Firefox extension, Perspectives, that deals with self-signed SSL certificates in a saner manner.

"So what?"

Google and perhaps some other search engines are indexing SSL versions of pages/URLs at TFL. This causes some innocent visitors to get sucked into the https version of TFL, and if they're using Firefox, they get a scary warning and no easy way to get around it. That's very bad.

Disabling SSL might work in the long run, but in the short term it would break every incoming https link, which I don't think is a good idea.

Free SSL certificates: not viable, as far as I can tell. Neither Firefox 3, Opera (10 beta), IE7, nor IE8 ships with StartCom's class 1 signing certificate. I just got a cert for TFL and tested it, and since there's no significant difference in the warning messages, I'm leaving the current self-signed cert in place for the time being. At least this way, people who have already accepted the cert don't have to accept a new one.

What might work: one of the email complaints recommended sending Googlebot a different robots.txt that denies access when it crawls over SSL. I've started doing that, for all bots, not just Googlebot. (After further review, some people seem to think that Google doesn't differentiate https://site/robots.txt from http://site/robots.txt, so denying everything over https:// could remove all hits for the site from Google.) TFL is also now redirecting every https Googlebot request to the non-SSL version of whatever page it wanted. I don't know how this will affect search indexing, but since it only affects SSL pages, which aren't the majority of incoming links to TFL, I'm willing to experiment. Unfortunately, changes targeted at Google's indexing take time to go into effect.
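For the curious, both measures can be sketched with Apache's mod_rewrite. This is a hypothetical configuration, not TFL's actual server config (which isn't shown here); the `robots-ssl.txt` filename and the bot user-agent list are placeholders.

```apache
RewriteEngine On

# Over SSL, answer robots.txt requests with a deny-all copy
# (robots-ssl.txt assumed to contain "User-agent: *" / "Disallow: /")
RewriteCond %{HTTPS} =on
RewriteRule ^robots\.txt$ robots-ssl.txt [L]

# Redirect bot requests arriving over SSL to the plain-http page
# (example user-agents from the era; adjust to taste)
RewriteCond %{HTTPS} =on
RewriteCond %{HTTP_USER_AGENT} (Googlebot|msnbot|Slurp) [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Fetching the site's robots.txt over https and over http (e.g. with `curl -k`) would confirm which file each protocol serves.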

So... ideas? I'm not very familiar with search engines' behavior, so I can only guess how Googlebot et al. will react.
__________________
“The egg hatched...” “...the egg hatched... and a hundred baby spiders came out...” (blade runner)
“Who are you?” “A friend. I'm here to prevent you from making a mistake.” “You have no idea what I'm doing here, friend.” “In specific terms, no, but I swore an oath to protect the world...” (continuum)
“It's a goal you won't understand until later. Your job is to make sure he doesn't achieve the goal.” (bsg)