Another year, another Pi Day! March 14th marks my annual check-in with my favourite leaderboard on the Internet: y-cruncher.
y-cruncher is the software that everyone uses to compute interesting constants to ludicrous precision. In 2022, Emma Haruka Iwao’s team at Google used it to compute π to 100 trillion digits. The point of the stunt was not that Google needs 100 trillion digits of π (nobody does), but rather to show that Google Cloud’s computers are seriously fast.
If you must know, the last ten digits of π up to the 100 trillion mark are 3095295560.
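For context, the workhorse behind these π records is the Chudnovsky series, which y-cruncher uses and which gains about 14 correct digits per term. Here is a toy sketch in Python; a real record run uses binary splitting and disk-backed arbitrary-precision arithmetic rather than `Decimal`, so treat this as an illustration of the series, not of y-cruncher itself:

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits: int) -> Decimal:
    """Toy Chudnovsky-series computation of pi to about `digits` digits."""
    getcontext().prec = digits + 10          # guard digits for rounding
    C = 426880 * Decimal(10005).sqrt()
    # Running values: M = (6k)!/((3k)!(k!)^3), L = 13591409 + 545140134k,
    # X = (-262537412640768000)^k, S = partial sum of M*L/X.
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(M * L) / X
    for k in range(1, digits // 14 + 2):     # ~14 digits per term
        M = M * (K**3 - 16 * K) // k**3
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return C / S

print(str(chudnovsky_pi(50))[:12])   # 3.1415926535
```

At 100 trillion digits, this naive term-by-term loop would be hopeless; the binary-splitting trick that record software uses restructures the same sum so that almost all the work happens in large integer multiplications.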
Checking in on the leaderboard, it seems that the record… still stands! No new digits of π have been announced since last Pi Day. However, a certain Jordan Ranous has been busy breaking other records. He and his team at StorageReview now hold the titles for many of our favourite MC Stairwell constants:
- 35 trillion digits of e
- 20 trillion digits of √2
- 20 trillion digits of the golden ratio
as well as many other constants that aren’t on the stairway (yet).
So why not break the record for π? Was it simply too many digits to beat? After all, Google’s seriously fast computers still had to churn for over 157 days to reach 100 trillion. But Jordan Ranous didn’t think so, and his team put together a beast of a computer with
- 96 cores
- 1.5 TB of RAM
- 530.1 TB of SSD storage (the digits themselves will take 100 TB)
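As a sanity check on that 100 TB figure, here is the back-of-envelope arithmetic, assuming (my assumption, not a stated spec) that the output is stored as plain text at one byte per decimal digit:

```python
# Back-of-envelope storage estimate for the digit output.
# Assumes plain-text storage at 1 byte per decimal digit.
digits = 100 * 10**12            # 100 trillion digits
bytes_needed = digits * 1        # 1 byte each
terabytes = bytes_needed / 10**12
print(terabytes)                 # 100.0
```

That leaves the rest of the 530.1 TB for y-cruncher’s working space, which during a computation is far larger than the final output.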
On February 9, 2023, they set it to compute 100 trillion digits of π using y-cruncher, just like Iwao did a year before. This computation was well underway by Pi Day 2023, when I reported Iwao’s record in mathNEWS. And then, on April 10, the 59th day, it finished, beating Google by almost 100 days!
So why not break the record for π? The frustrating answer is that they simply didn’t try to go any further. StorageReview would argue that the point was not to break the record, but to beat Google at their own benchmark, which, to be fair, they absolutely did. They pulled off such a feat because the components they used were all connected locally instead of spread across the cloud. Then, they took an 80 TB AWS Snowball Edge and copied the digits onto it (compressed). It turns out the fastest way to get that much data onto the Internet is via UPS shipping.
But, from a research perspective, it’s a pity. You could say that Jordan Ranous has completed one of the most redundant computations in the history of computing, by going all the way up to the world record, and not a single digit more. We still don’t know what comes after 3095295560. It’s not even true that they have validated Google’s digits, because each record is already validated as it is computed.
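That on-the-fly validation is possible because of digit-extraction formulas: the Bailey–Borwein–Plouffe (BBP) formula produces hexadecimal digits of π at an arbitrary position without computing any of the digits before it, so the tail of a freshly computed record can be spot-checked independently. A toy float-precision sketch follows; it is only trustworthy for small positions, whereas real record verification does the same kind of check with high-precision arithmetic at positions in the trillions:

```python
def pi_hex_digit(n: int) -> str:
    """Hex digit of pi at 0-indexed fractional position n, via BBP.

    Float-precision toy: reliable only for small n.
    """
    def partial(j: int) -> float:
        # Fractional part of sum over k of 16^(n-k) / (8k + j).
        s = 0.0
        for k in range(n + 1):
            # Modular exponentiation keeps the numerator small.
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        k = n + 1
        while (t := 16.0 ** (n - k) / (8 * k + j)) > 1e-17:
            s = (s + t) % 1.0
            k += 1
        return s

    frac = (4 * partial(1) - 2 * partial(4) - partial(5) - partial(6)) % 1.0
    return "%X" % int(frac * 16)

# pi = 3.243F6A88... in hexadecimal
print("".join(pi_hex_digit(i) for i in range(8)))   # 243F6A88
```

The catch, of course, is that BBP only confirms a handful of digits near a chosen position; it proves the computation landed in the right place, not that you learned anything new past the end of it.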
Fortunately, there is some hope. In their video about the whole endeavour, StorageReview hints that they might try to go beyond 100 trillion in the future. However, 10 months later, we haven’t heard anything. So if they are churning away at the next record, it better be something huge.
Personally, I’d love to see 314,159,265,358,979 digits.