Note this is more than just a cooking video. If you don't have 24 minutes to spare, I recommend watching from the 16:26 timestamp for the cost analysis or even 21:50 for the social commentary about what this video proves vs. what it doesn't.
I think it might be a Xorg vs. Wayland thing rather than browser version. IIRC Wayland does not report window coordinates relative to screen space.
I wonder whether there’s an upper bound on the largest number that can be expressed in the observable universe.
All digital representations rely on discrete states in hardware, and there's a finite number of those in the observable universe, so there should be a finite maximum number for computers.
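A crude back-of-the-envelope for that bound, assuming (very loosely) ~10^80 atoms in the observable universe and one bit of state per atom — both numbers are hand-wavy assumptions, not measurements:

```python
import math

# Hypothetical model: ~10^80 atoms, each storing exactly one bit.
bits = 10**80

# With n bits the largest representable integer is 2^n - 1.
# That number is far too big to materialize, but we can count its
# decimal digits: digits(2^n - 1) ~= n * log10(2).
digits = bits * math.log10(2)

print(f"{digits:.2e}")  # roughly 3.01e+79 decimal digits
```

So even writing the "largest computable number" down in decimal would take vastly more states than the universe has — which is the point of the parent comment.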
I'll have to think about this. I want a very physical meaning of the term using values rather than references. One where the largest number that can be expressed using up/down fingers with two hands is 1024, not a sign language reference to a googolplex.
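A tiny sketch of the finger-counting bound, treating each of ten fingers as one bit (whether the "largest number" comes out as 1023 or 1024 just depends on whether you start counting at zero or one):

```python
# Ten fingers, each either up or down: 2^10 distinct hand states.
fingers = 10
states = 2 ** fingers       # 1024 configurations
largest = states - 1        # counting from 0, the largest value is 1023

# A hand position is just a 10-bit pattern, e.g. all fingers up:
assert format(largest, "010b") == "1" * fingers

print(states, largest)  # 1024 1023
```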
I wonder if it would be helpful to restrict the question a bit, maybe something like: what is the largest number N such that N, and every integer of smaller magnitude, can be expressed?
Just as a quick example of why it's a bit absurd: I name that number you just defined zeta. Now I make zeta' = zeta^zeta. Or whatever manipulation you like.
Adding constraints is addressed in the link.
And zeta' cannot be expressed by any state of the visible universe.
The GP question was not about encoding, and thus is not subject to compression. The largest number we can measure of anything is a pretty well-defined concept.
I suppose pi has a cheat code: if you made the largest possible circle in the universe you only need enough digits to distinguish points on its circumference that are a Planck distance apart. Then you can either ignore any digits beyond that or even simply make them up, as there would be no way to measure the difference.
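Rough arithmetic for that, using commonly quoted estimates (~8.8e26 m for the diameter of the observable universe, ~1.6e-35 m for the Planck length — both are assumptions plugged in here, not exact values):

```python
import math

# Hedged estimates:
diameter = 8.8e26    # observable-universe diameter in meters
planck = 1.616e-35   # Planck length in meters

circumference = math.pi * diameter
# Number of Planck-length segments around that circle:
segments = circumference / planck
# Decimal digits of pi needed to resolve one such segment:
digits_needed = math.ceil(math.log10(segments))

print(digits_needed)  # ~63
```

So a few dozen digits of pi already pin down the largest possible circle to within a Planck length; everything past that is unmeasurable, as the comment says.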