A prominent US Senator has called on the Federal Trade Commission to investigate Microsoft for “gross cybersecurity negligence,” citing the company’s continued reliance on the obsolete and vulnerable RC4 encryption cipher, which Windows uses by default. Senator Ron Wyden went on to liken Microsoft to an “arsonist selling firefighting services to their victims.”
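To make concrete why RC4 is called obsolete and vulnerable, here is a minimal, from-scratch sketch of the textbook cipher (not Microsoft's code) that empirically checks the well-known Mantin–Shamir bias: across random keys, the second keystream byte is zero about twice as often as a uniform cipher would allow. The 16-byte key length and trial count are arbitrary choices for the demo.

```python
# Textbook RC4 (KSA + PRGA), written only to demonstrate a known keystream bias.
import os
from collections import Counter

def rc4_keystream(key: bytes, n: int) -> list[int]:
    # Key-scheduling algorithm (KSA): permute S under the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): emit n keystream bytes.
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

trials = 100_000
counts = Counter(rc4_keystream(os.urandom(16), 2)[1] for _ in range(trials))
# An unbiased cipher would give ~1/256 here; RC4 gives roughly 2/256.
print(f"P(second keystream byte == 0) = {counts[0] / trials:.4f} (uniform: {1/256:.4f})")
```

The point is only that RC4's output is measurably distinguishable from random, which is one of several reasons the cipher has been deprecated.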
-
@dangoodin This is a losing argument. Microsoft and other companies have to be concerned with many issues, such as not breaking existing systems. The business judgment rule, assuming that decisions are made beyond mere concern for "shareholder value", provides a fairly decent standard for measuring whether negligence or reckless behavior has occurred.
One could extend Wyden's argument to other places - like automobiles. It is well known that helmets, fire suits, and especially five-point seat harnesses enhance driver safety. So do we hold VW and GM and Ford liable for not putting those into cars?
We've already condemned a large number of people to lifetime pain by our near-ban on opioid pain relief (on the grounds of preventing abuse by some others).
We have a balance in all things - and my sense is that in this case "gross negligence" is an inappropriate accusation.
(I would say that clear notice of the issue on the product would be appropriate.)
-
@karlauerbach @dangoodin Sorry, no; as I noted, the problem was known long before Active Directory, so there wasn’t a backwards-compatibility issue.
-
Geez, talk about painting yourself into a corner.
-
@dangoodin @SteveBellovin I've spent many an hour in the corner, often deservedly.
I think the greater issue here is not the use of an algorithm known to be vulnerable, but rather that we have too often used the "crunchy on the outside, soft on the inside" model of security instead of building layers of protection and adhering to the principle of "least privilege".
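As one concrete reading of "least privilege": a service should hold elevated rights only for the single operation that needs them and then give them up. Below is a minimal, Unix-only sketch (not tied to Windows or to any particular Microsoft service) of a daemon that uses root solely to bind a privileged port and then permanently drops to an unprivileged account; the "nobody" user and port 443 are arbitrary choices for illustration.

```python
# Minimal least-privilege sketch: keep root only long enough to bind a low port,
# then permanently drop to an unprivileged user so later bugs do less damage.
import os
import pwd
import socket

def bind_privileged_port(port: int) -> socket.socket:
    # The only step that actually requires root: binding a port below 1024.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("0.0.0.0", port))
    s.listen()
    return s

def drop_privileges(username: str = "nobody") -> None:
    # Irreversibly shed root: supplementary groups, then group, then user.
    user = pwd.getpwnam(username)
    os.setgroups([])
    os.setgid(user.pw_gid)   # must come before setuid, or it will fail
    os.setuid(user.pw_uid)

if __name__ == "__main__":
    listener = bind_privileged_port(443)  # run as root for this line only
    drop_privileges()
    # ...accept and handle connections here with no remaining root authority...
```

Layers of protection then come from stacking such measures (separate accounts, filesystem permissions, network segmentation) so that a single broken layer, such as a weak cipher, is not the whole defense.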
Microsoft may well be culpable for not keeping up with the often staggering rate of change of security risks, methods, and algorithms.
But what is the standard that we use to measure that culpability? Are we to go to a strict product-liability standard? (i.e. they made it, they are liable, no excuses - essentially an insurance system.)
I bring up self-driving vehicles as an example of the fuzziness of the standards. We want to encourage innovation, but we also want to block crazed deployment such as Tesla's "Full Self-Driving" representations. The real question is who bears the risk and costs of the damage?
-