Three students at the Massachusetts Institute of Technology (MIT) were ordered by a federal court judge over the weekend to cancel their scheduled presentation at Defcon, which centered on vulnerabilities in Boston's transit fare payment system. The temporary restraining order (TRO) came at the behest of the Massachusetts Bay Transportation Authority (MBTA).
The story got a good bit of attention, and the lawyers at the Electronic Frontier Foundation (EFF) took up the cause of the three students, going so far as to represent them as part of the Coders' Rights Project.
Zack Anderson, R.J. Ryan and Alessandro Chiesa were issued a TRO to prevent them from talking at Defcon about vulnerabilities in the Massachusetts Bay Transportation Authority's Boston fare cards, known as the 'CharlieCard' and 'CharlieTicket'.
Specifically, the MBTA claimed the students had violated the Computer Fraud and Abuse Act (CFAA) by delivering information to conference attendees that could be used to defraud the MBTA of its transit fares.
"The court's order is an illegal prior restraint on legitimate academic research in violation of the First Amendment," said EFF Civil Liberties Director Jennifer Granick.
"The court has adopted an interpretation of the statute that is blatantly unconstitutional, equating discussion in a public forum with computer intrusion," she continued. "Security and the public interest benefit immensely from the free flow of ideas and information on vulnerabilities. More importantly, squelching research and scientific discussion won't stop the attackers. It will just stop the public from knowing that these systems are vulnerable and from pressuring the companies that develop and implement them to fix security holes."
The irony in this legal battle is that the research suppressed at Defcon was already in the hands of thousands of people, as it had been burned to the Defcon CD -- the CD is given away to those who attend Defcon and contains slides, tools and other little goodies. Not to mention that the slides from the talk are now online for all to see.
The Tech Herald posed the following to various security experts and researchers:
Recently, MIT students were issued a TRO (Temporary Restraining Order) to prevent them from talking at Defcon about vulnerabilities in the Massachusetts Bay Transportation Authority's (MBTA) Boston fare cards, known as the 'CharlieCard' and 'CharlieTicket'. However, while the talk was prevented, the details of their work are now online and available to anyone who wants them. Does security by obscurity work? Do government bodies have the right to defend their purchased technology? And were they correct in blocking the talk? Please explain why or why not.
The vulnerable technology in the CharlieCard relates to known weaknesses in the MiFare Classic RFID chips made by NXP. The CharlieTicket security issues, explained in the slides, deal with cloning.
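For illustration only -- this is not the MBTA's actual scheme, and every name and value below is invented -- the core problem with protecting stored value by an obscure checksum, versus a keyed MAC where the algorithm is public and only the key is secret, can be sketched as:

```python
import hmac
import hashlib

# An "obscure" checksum offers nothing once reverse-engineered:
# anyone who learns the algorithm can mint valid-looking tickets.
def obscure_checksum(data: bytes) -> int:
    return sum(data) & 0xFFFF  # trivially reproducible by a forger

# Kerckhoffs's principle: the algorithm (HMAC-SHA-256) is public;
# only the key is secret. Forgery requires the key, not the design.
SECRET_KEY = b"issuer-only-key"  # hypothetical issuer-held secret

def sign(data: bytes) -> bytes:
    return hmac.new(SECRET_KEY, data, hashlib.sha256).digest()

def verify(data: bytes, tag: bytes) -> bool:
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(sign(data), tag)

balance = b"stored_value=5.00"
tag = sign(balance)
assert verify(balance, tag)                     # genuine data accepted
assert not verify(b"stored_value=500.00", tag)  # altered value rejected
```

The point several of the experts below make is exactly this: publishing the checksum design breaks the first scheme entirely, while publishing the MAC design costs the second scheme nothing.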
What follows is commentary from various researchers and security experts. The comments were solicited from various sources by The Tech Herald and published as they were given.
If you have a thought you want to share on this topic, drop us a line in the comments area. If you are a security researcher or vendor and want to make a comment, please e-mail the security address on this site.
"Sounds like the MBTA should be offering the MIT students a job rather than taking legal action against them. There is an unwritten code of conduct that security-savvy individuals, such as the MIT students, notify the affected party, such as the MBTA, of discovered security vulnerabilities and then give the party a reasonable amount of time to patch the vulnerability before taking the issue public.
"I'm not sure if that process happened, [ED: It did happen, parts of the talk were removed to prevent an audience member from repeating the research successfully.] but one thing I know for sure is that as security threats continue to escalate throughout the world and hackers continue to become more sophisticated in their methods, organizations and government bodies need to find ways to work with these types of individuals.
"We have to be careful about trying to over-govern such activities, such as issuing the restraint order, or these activities will go underground, and the security industry and customers will no longer benefit from individuals' findings and methods." – Charlotte Dunlap, Senior Analyst, Information Security at ESG
"It's been shown time and time again that security through obscurity sets the stage for embarrassing discoveries of a system's flaws. For example, commercial DVDs are crackable today because the system relies on an approach that was not subject to broad scrutiny.
"Governments and other system purchasers want to protect their systems, but they should do this by keeping their private encryption keys secret, and by purchasing a system that relies on widely studied security and on third party security evaluations. At this point, the cat is out of the bag.
"Instead of focusing on blocking public disclosure, the MBTA and other affected customers should be focused on what remedial actions they can take, and on ensuring that their subsequent IT purchases contain strong security that has been properly evaluated." – Eric Skinner, CTO at Entrust
"MBTA made the worst-possible decision by suing researchers, for two reasons. First, the lawsuit disrupts all trust that has been built between researchers and industry and shatters progress we have recently made towards more responsible disclosure of vulnerabilities. Second, MBTA attracts attention to its weak system, while discouraging researchers from finding solutions.
"When will industry stop fighting unreasonable lawsuits and start building secure systems? The research community is ready to help build more secure systems. But as long as hacking results in more attention for serious security issues than constructive work, we will keep hacking." – Karsten Nohl, Security researcher
"The argument against 'security through obscurity' is generally associated with MIT. It is somewhat more nuanced than 'security through obscurity is always bad'. Obscurity does provide security, but achieving obscurity is very hard and maintaining obscurity is harder. Withholding the details of an architecture can make the job of an attacker somewhat harder but this comes at the cost of limiting the scope of the security review.
"Organizations such as the NSA that have world-class security specialists in-house can employ security through obscurity safely because they have access to the necessary expertise. In civilian applications security through obscurity has consistently been found to produce 'brittle' security. If all you rely on for security is obscurity your system is likely to fail and when it does fail it is likely to fail catastrophically.
"It is generally unhelpful to think of security in terms of normative statements, particularly when governments are concerned. Rather than talking about 'rights' it is more productive to ask if an action is likely to have its intended effect. For better or worse, the Internet security community considers itself a community and when a member of that community comes under attack the natural response is to defend. The legal case will be argued by the EFF, but whatever the outcome of that case, nothing will change the fact that the MBTA system is now known to be flawed. If criminals can work out a way to exploit that flaw for personal gain they will.
"The question of how and when to reveal security weaknesses has been debated at considerable length for more than twenty years. My personal policy has been that I never reveal a vulnerability I discover except to the party responsible for the affected system. But that is not necessarily the best approach. When I discovered a flaw in the Netscape random number generator I informed the security specialist at Netscape. Unfortunately due to a communication failure at their end they failed to fix the bug and it was rediscovered a year later.
"According to the accounts I have read, it appears that the MIT students did the responsible thing and informed the MBTA of their discovery before giving the talk. This is not going to be sustainable if researchers fear that their research will be suppressed by litigation." – Phillip Hallam-Baker, Principal Scientist at VeriSign