Avoid 512-Bit RSA Keys
August 9, 2024 11:24 AM   Subscribe

200 MW of Power (sorry Doc) This article was interesting to me, as it took me back to the days when I finished my math degree with some study of crypto, and I enjoyed their explanation of how the 512-bit key ended up in use.

Also: be extremely wary of attempting the kind of research mentioned in the article without approval from the company, and maybe some legal advice.

Also Also: not quite 1.2 “jigga” watts.
posted by teece303 (20 comments total) 12 users marked this as a favorite
 
Oof. Thank you. Bookmarked for human-factors-infosec, because "it was in the [programming] library so we assumed it was safe" is discussion-worthy.
posted by humbug at 11:47 AM on August 9


Time was, if you were using OpenSSL as a developer you were supposed to be enough of a crypto expert to know what size keys to use. 512 bits has been deprecated forever. Longer than some kids now writing code have been alive, probably.

"It was in the library so we assumed it was safe" is not a very good defense. But there are so many ways that the market incentivizes shitty security. Not having devs knowing what they are doing is just an example of that.
posted by Aardvark Cheeselog at 12:03 PM on August 9 [1 favorite]


Just to give a feeling for how bad 512 is, the guidance* for how big a key you need to sell products to certain parts of the US government doesn't even list the security strength of anything less than 1024 (which they also don't accept).

* It's on page 54 if you're curious.
posted by atbash at 12:03 PM on August 9 [3 favorites]


This is a nice case study in how to handle a situation like this, and a good lesson in what we should do going forward.

I'm not a fan of blaming devs for being stupid in this situation, because there's so damn much to know to be a developer that there will always be knowledge gaps.

Yeah, devs should always be upping their game, but there's no good reason for insecure algorithms and key choices to be kept in libraries. It's far more reasonable for a smaller set of library maintainers to remove those than for every developer using the library to know not to use the thing that's right in front of them.
posted by Ickster at 12:10 PM on August 9 [1 favorite]


It occurs to me that it's sort of understandable that someone would pronounce gigawatts as "jiggawatts", because the prefix giga- comes from the same root as giant / gigantic.
posted by tclark at 12:14 PM on August 9 [3 favorites]


Yeah, devs should always be upping their game, but there's no good reason for insecure algorithms and key choices to be kept in libraries.

I mean, backwards compatibility is a thing, but at a minimum access to these things should be hidden behind a feature flag with a name like --I_AM_KNOWINGLY_USING_AN_INSECURE_ALGORITHM or something.

If you wanted to be particularly aggressive about it you'd use

--I_AM_KNOWINGLY_USING_AN_INSECURE_ALGORITHM=2024

and the program would break and force you to manually recheck that when it stopped being 2024.
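
Something like this, as a rough Python sketch (the flag name is from the idea above; the helper and the environment-variable mechanism are my own invention, not any real library's API):

    import datetime
    import os

    # Hypothetical opt-in flag, per the suggestion above.
    INSECURE_FLAG = "I_AM_KNOWINGLY_USING_AN_INSECURE_ALGORITHM"

    def require_insecure_opt_in() -> None:
        """Refuse to enable a deprecated algorithm unless the caller has
        re-acknowledged the risk during the current calendar year."""
        acknowledged = os.environ.get(INSECURE_FLAG, "")
        current_year = str(datetime.date.today().year)
        if acknowledged != current_year:
            raise RuntimeError(
                f"Set {INSECURE_FLAG}={current_year} to use this deprecated "
                "algorithm; the acknowledgement expires every January 1st."
            )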
posted by mhoye at 12:15 PM on August 9 [4 favorites]


I'm not a fan of blaming devs for being stupid in this situation, because there's so damn much to know to be a developer that there will always be knowledge gaps.

I keep coming back to the industrial paper cutter example. Those things will shear through a stack of hardcovers like they're made of warm butter, and through the magic of standard industrial design they're still almost impossible to hurt yourself with.

We really need software that meets that minimum bar.
posted by mhoye at 12:18 PM on August 9 [6 favorites]


I read a lot about critical failures (such as aviation accidents or criticality accidents) and there's almost always a point where someone should've known better, but a lot of other failures happened upstream of the critical moment.

Stealing from Admiral Cloudberg's summary: "According to the swiss cheese model of safety, an accident happens when the holes in the stacked swiss cheese slices align, allowing a hazard to pass straight through unhindered."

Yeah, devs should know, but one layer of swiss cheese isn't enough.
posted by Ickster at 12:18 PM on August 9 [3 favorites]


1024-bit was officially deprecated in 2013. 512-bit has been widely known as crack-able since I was just learning cryptography to begin with back in the 1990s. And yet here we are.

But it also makes me think about all the IT professionals at my current job, who may know even less than these devs, and might easily choose such a bad key if the software let them.

(I regularly have to update some software that’s wanted me to make sure I’ve upgraded to 2048-bit keys for something, and I wonder how old that blurb is… [yes, the upgrade happened sometime before I took over this job])
posted by teece303 at 12:22 PM on August 9 [1 favorite]


Sometime in between “we had only two relatively junior engineers” and “we sell a product with an API offering potential access to equipment installed in sixty thousand customer homes” a professional third-party security audit should have occurred.

On a minor positive note, it is refreshing that the dev did this, they reported it, and the company fixed it promptly without trying to bring down the full wrath of the legal system on their head.
posted by gelfin at 12:23 PM on August 9 [7 favorites]


*goes off grid*
posted by HearHere at 1:13 PM on August 9 [1 favorite]


I am reminded of working on a project which required some "light" security (I am not nor will I ever be a security guy) and even my very light understanding made it clear the "main" devs were using a woefully outdated security protocol and could not be convinced to fix it. They knew even less than me, and all they knew was "security" was a synonym for "getpwnedinfiveseconds" encryption and by god they were going to use that.

I refused to use it and quit the project rather than continuing to pound my head against that wall.
posted by maxwelton at 1:28 PM on August 9


strength of anything less than 1024 (which they also don't accept).

😭
posted by 1024 at 2:02 PM on August 9 [3 favorites]


In the US there is a certification called FIPS compliance and if you are contracting with the US government, you need to assure that all your code and all the libraries you use are FIPS compliant. To that end, for many platforms there are automated tools for verifying that libraries meet the standards. I was working on a library that could read and write PDF files. One of our customers wanted to use it and ran it through the FIPS tools and it failed. And it was not a surprise. My code had to handle ALL versions of PDF, including their first approach at encryption, which was either 40-bit or 128-bit. So that required all kinds of variances and promises, etc, all of which came down to "for the love of dog, don't encrypt PDF documents with this setting ever."

In theory, you should be updating your libraries continually as well as maintaining a threat model and checking it as part of your release process, but I have personally been stuck in a "can't go back, can't go forward" situation because the library had a bug serious enough that we couldn't ship with it, but the new version either removed a feature that we were depending on or included brand spanking new bugs that were as bad or worse than the one that had been fixed.

I love software.
posted by plinth at 2:44 PM on August 9 [1 favorite]


The operating system we used in college had a system library function that anyone could call, which would allow you to change the password of any account. Even root. You had to know how to link a C library into a Pascal executable, but that was the only “security” around it.
posted by funkaspuck at 2:55 PM on August 9 [1 favorite]


Fantastic writeup by Ryan. The company's response and writeup are also impressive (once you filter out the shoulder-patting that they include).

Compare to the Netgear NTP debacle (of 2003, OMG I'm old), where it took three days to even get Netgear's initial attention, and a full month to develop new firmware and start deploying it.
posted by intermod at 3:21 PM on August 9


It only gets one sentence in the article, but:

"GivEnergy introduced a fix within 24 hours of Castellucci privately disclosing the weakness."

That's an incredible turnaround, particularly considering GivEnergy is not what we'd normally think of as a software company. Google's Project Zero has a 90-day policy, and there are a lot, a lot of software companies out there that can't or won't hit that and sometimes struggle to renegotiate it.
posted by mhoye at 6:01 PM on August 9


Almost twenty years ago teece303 (then teece) and I participated in a thread involving giga-pronunciation. Time, man. And here's an updated working Wikipedia link.
posted by cgc373 at 6:02 PM on August 9


The real flaw here is their relying solely on key signing for authentication.

Since no one but GivEnergy is supposed to be able to generate tokens, if they kept a whitelist of tokens they've issued (which they probably store anyway) and checked against it, this hack wouldn't work and the quality of their cryptography would matter much less.
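
A rough sketch of that belt-and-braces check (the names are made up for illustration; this is not GivEnergy's actual API):

    def authenticate(token: str, signature_is_valid: bool, issued_tokens: set[str]) -> bool:
        """Accept a token only if its signature verifies AND it appears in the
        server-side record of tokens that were actually issued."""
        if not signature_is_valid:
            return False
        # Even a token signed with a factored/compromised key fails here,
        # because the server never actually handed it out.
        return token in issued_tokens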
posted by grahamparks at 4:15 AM on August 10


Great big RSA keys are so 20th century. All the cool kids are using ed25519 now, because (a) the keys are nice and short at 256 bits and make authorized_keys much less nasty to look at and (b) most current distributions of OpenSSH make ssh-keygen generate ed25519 keys by default.
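
For a sense of how compact that is, here's a quick illustration using Python's cryptography package (my example, nothing to do with OpenSSH internals):

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ed25519

    private_key = ed25519.Ed25519PrivateKey.generate()
    public_bytes = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    )
    print(len(public_bytes) * 8)  # 256 bits, versus 2048+ for a modern RSA key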
posted by flabdablet at 1:28 PM on August 11 [1 favorite]



