The hyperscalers are buying huge amounts of AMD; they figured out how to do inference on them without major software bugs. But on average Nvidia is still beating them out! Google is on a different level, though.
Let's see... 2^241 or so possible 256-bit numbers, so that's 256 bits × 2^241 = 2^249 bits ≈ 2^246 bytes, so that's... about 10^50 yottabytes. Obviously we're gonna need cloud storage for all this, so let's say that's about 2 cents per gigabyte/month, so that's... 2.2614 × 10^63 dollars per month?
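For anyone who wants to check my arithmetic, here's the same back-of-envelope as a few lines of Python (the 2¢/GB/month rate is just the ballpark I assumed above, not any particular provider's price):

```python
# Back-of-envelope: cost of storing 2^241 keys at 256 bits each.
NUM_KEYS = 2**241          # rough keyspace size from above
BITS_PER_KEY = 256
PRICE_PER_GB_MONTH = 0.02  # USD; assumed ballpark cloud rate

total_bytes = NUM_KEYS * BITS_PER_KEY // 8
print(f"~{total_bytes / 1e24:.3g} yottabytes")                      # ~1.13e+50
print(f"~${total_bytes / 1e9 * PRICE_PER_GB_MONTH:.4g} per month")  # ~$2.262e+63
```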
Actually, why does the site list the odds as ~1 in 5.27 × 10⁷²? That's roughly 2^241.6, but it's picking random 256-bit numbers, so I'd have expected 1 in 2^256. Is it because there are so many valid hits?
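If that guess is right, the site's own numbers imply how many valid hits there must be; this is just arithmetic on the figures quoted above:

```python
import math

KEYSPACE = 2**256  # random 256-bit numbers
ODDS = 5.27e72     # "1 in 5.27e72" as listed by the site

print(f"~{KEYSPACE / ODDS:.3g} implied valid hits")  # ~2.2e+04
print(f"odds are about 2^{math.log2(ODDS):.1f}")     # ~2^241.6
```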
While you're at it, if you're also curious: what would be the energy cost of trying all of them, given the average power draw of a typical computer today? Are we looking at something like the total energy output of an average quasar?
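I have no real numbers for a "typical" computer, so here's a sketch with made-up round figures: a 200 W machine checking a million keys per second, grinding through the expected 5.27 × 10^72 tries implied by the odds above. Both rates are pure assumptions, so read the output as order-of-magnitude at best:

```python
# Wild-guess assumptions, not measurements:
POWER_W = 200.0           # assumed average PC power draw
KEYS_PER_SEC = 1e6        # assumed guess rate per machine
EXPECTED_TRIES = 5.27e72  # "1 in 5.27e72" odds from above

joules = EXPECTED_TRIES / KEYS_PER_SEC * POWER_W
print(f"~{joules:.3g} J")  # ~1.05e+69 J

# For scale: the Sun radiates ~3.8e26 W, i.e. ~1.2e34 J per year,
# and a luminous quasar is on the order of 1e40 W.
print(f"~{joules / (3.8e26 * 3.15e7):.3g} years of total solar output")  # ~8.8e+34
print(f"~{joules / 1e40:.3g} seconds of a ~1e40 W quasar")               # ~1.05e+29
```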
If Advent of Code has taught me anything, it's that interval ranges can be really useful for this kind of thing. I mean, at least twice in ten years. We just need to figure out how to coordinate individuals' attempts to make it storage-efficient.
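In case anyone wants to play with the coordination idea: a minimal sketch where each participant reports a half-open [start, end) range of keys they've tried, and we keep only merged, non-overlapping intervals. The merge logic is the standard sort-and-sweep; everything else (the sample ranges, the setup) is made up:

```python
def merge_intervals(intervals):
    """Collapse half-open [start, end) ranges into a minimal sorted list."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:  # touches or overlaps the last one
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return [tuple(r) for r in merged]

# Three participants report which key ranges they've already tried:
tried = [(0, 2**20), (2**19, 2**21), (2**40, 2**40 + 1000)]
print(merge_intervals(tried))
# [(0, 2097152), (1099511627776, 1099511628776)]
```

Storage stays proportional to the number of disjoint ranges rather than the number of keys tried, which is the whole point.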