Technically we don't know how it was first measured and probably never will. The earliest evidence of the Earth's measurements is the Pyramids and good luck finding out how those were built.
I just found a great recap on this topic that also fits nicely with the points your link brings up, shedding a better light on the number 43,200 in other cultures (the scaling factor between the pyramid and the earth dimensions for those who TL;DR).
It also addresses the fact that the speed of light in m/s is encoded three times in the pyramid: twice in the dimensions, in cubits and meters, and a third time via the GPS coordinates (yes, 3 fucking times!). But how would they know about the duration of a second? I don't know! What I know is that 43,200 * 2 = 86,400, the number of seconds in a day.
The thing goes deeper with this man and his investigations of the first edition of Shakespeare's Sonnets.
I'm not saying this with sarcasm. Maybe you could take over my attempts at coming up with a custom set of equations for a pyramid, or any other Platonic solid, using symbolic regression / genetic algorithms.
"The pyramids give us the dimensions of our planet on a scale defined by the planet itself." - Hancock
It's a bit as if we had never seen a poem and stumbled upon some text that harbors rhymes. You'd argue there's no intrinsic link between, say, pickle and tickle; you'd dig into each word's etymology to show they wouldn't rhyme in their past forms. And beyond that you would deny that the words rhyme because they are not in succession, and when I'd point out it's because the rhymes are crossed, you'd laugh it off.
It's not about convincing you (of what? Of some secret intent? I'm not even sure this is the case). It's about how convincing it is. For that you need to model the cognitive process that interprets these coincidences.
Algorithmic Simplicity and Relevance - Jean-Louis Dessalles
4.1 First-order Relevance
Relevance cannot be equated with failure to anticipate [16]: white noise is ‘boring’, although it is impossible to predict and is thus always ‘surprising’, even for an optimal learner. Our definition of unexpectedness, given by (1), correctly declares white noise uninteresting, as its value s at a given time is hard to describe but equally hard to generate (since white noise amounts to a uniform lottery), and therefore U(s) = 0.
Following definition (1), some situations can be ‘more than expected’. For instance, if s is about the death last week of a 40-year-old woman who lived in a far-off place hardly known to the observer, then U(s) is likely to be negative, as the minimal description of the woman will exceed in length the minimal parameter settings that the world requires to generate her death. If death is compared with a uniform lottery, then Cw(s) is the number of bits required to ‘choose’ the week of her death: Cw(s) = log2(52×40) ≈ 11 bits. If we must discriminate the woman among all currently living humans, we need C(s) = log2(7×10^9) ≈ 33 bits, and U(s) = 11 – 33 = –22 is negative. Relevant situations are unexpected situations.
s is relevant if U(s) = Cw(s) – C(s) > 0 (2)
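The worked example above can be checked directly. This is a minimal sketch of the relevance test, using the paper's own figures (52 weeks × 40 years for the generation side, ~7×10^9 living humans for the description side); the variable names are my own:

```python
from math import log2

# Generation complexity Cw(s): bits to 'choose' the week of death
# out of 52 weeks x 40 years of the woman's life.
C_w = log2(52 * 40)   # ~11 bits

# Description complexity C(s): bits to single out one person
# among roughly 7 billion currently living humans.
C = log2(7e9)         # ~33 bits

# Unexpectedness, definition (1): U(s) = Cw(s) - C(s)
U = C_w - C           # ~ -22 bits, so negative

# Relevance criterion, definition (2): s is relevant iff U(s) > 0
is_relevant = U > 0

print(round(C_w), round(C), round(U), is_relevant)
```

As the text notes, the verdict flips only if the description side shrinks, e.g. if the woman can be singled out cheaply because she is a neighbor, an acquaintance, or a celebrity.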
Relevant situations are thus simpler to describe than to generate. In our previous example, this would happen if the dying woman lives in the vicinity, or is an acquaintance, or is a celebrity. Relevance is detected either because the world generates a situation that turns out to be simple for the observer, or because the situation that is observed was thought by the observer to be ‘impossible’ (i.e. hard to generate).
In other contexts, some authors have noticed the relation between interestingness and unexpectedness [9, 16], or suggested that the originality of an idea could be measured by the complexity of its description using previous knowledge ([10], p. 545). All these definitions compare the complexity of the actual situation s to some reference, which represents the observer’s expectations. For instance, the notion of randomness deficiency ([8], ch. 4 p. 280) compares the actual situation to the output of a uniform lottery. The present proposal differs by making the notion of expectation (here: generation) explicit, and by contrasting its complexity Cw(s) with description complexity C(s).