
60 Hz sure makes it easy to keep clocks on time.


As far as clocks are concerned, 60 Hz and 50 Hz are very similar; just make sure the number of teeth on the gears matches the frequency.


The two clocks I reference the most - my wrist watch and my mobile phone - don't really benefit from the alleged advantage of having mains power cycle every 0.0167 seconds.

Perhaps you could expound further on this hypothesis?

I'd always assumed people who spent 3+ years studying electrical engineering had solved this problem. Certainly in Australia (~240V / 50Hz) we don't seem to have a problem with all our clocks being 20% wrong all the time.


1/60th of a second isn't a common unit of time though


It's convenient to count 60 pulses to make a seconds pulse, then 60 of those to make a minute pulse, then 60 of those to make an hour pulse. Then 60 of those to make 2 and a half days :P
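
A toy sketch of that divide-by-60 chain in Python (the 60 Hz supply and the pulse count are just illustrative):

    def count_pulses(total_pulses, pulses_per_second=60):
        """Turn a count of mains cycles into h:m:s, assuming a 60 Hz supply."""
        total_seconds = total_pulses // pulses_per_second
        total_minutes, seconds = divmod(total_seconds, 60)
        hours, minutes = divmod(total_minutes, 60)
        return hours, minutes, seconds

    # One day of 60 Hz cycles: 60 * 60 * 60 * 24 = 5,184,000 pulses
    print(count_pulses(60 * 60 * 60 * 24))   # -> (24, 0, 0)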


Not sure it's precise enough though. In 2018, many clocks in Europe were off because the grid frequency had drifted, due to (as I understood it) the network being out of balance across various countries. Some here might actually understand the details of this.


In the US, we modulate (or used to) the grid frequency specifically for these analog clocks, such that over a 24-hour period it averages to exactly 60 Hz.

It doesn't really matter on a second-to-second timescale how accurate the grid frequency is. If you can keep the average frequency right, all your clocks will speed up and slow down in sync, and average out to 24 hours per day.
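
A toy numerical sketch of that point (synthetic frequencies, not real grid data): the instantaneous frequency can wander, but a clock that just counts cycles ends the day on time as long as the daily average is exactly 60 Hz.

    import random

    random.seed(0)
    nominal = 60.0
    seconds_per_day = 24 * 3600

    # A frequency that wanders around 60 Hz, one sample per second...
    freqs, f = [], nominal
    for _ in range(seconds_per_day):
        f += random.uniform(-0.005, 0.005)
        freqs.append(f)

    # ...nudged so the daily mean comes out at exactly 60 Hz.
    offset = nominal - sum(freqs) / len(freqs)
    freqs = [f + offset for f in freqs]

    cycles = sum(freqs)                     # total cycles delivered over the day
    clock_seconds = cycles / nominal        # what a cycle-counting clock displays
    print(clock_seconds - seconds_per_day)  # ~0: the clock ends the day on time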


The frequency drifts up and down whenever demand doesn't exactly match supply. Higher demand slows the frequency down, higher supply speeds it up. This is actually the main way power companies know if supply and demand match, and if power stations have to ramp up or down.

The frequency changes are pretty small in normal operation, but on a clock that uses the frequency to keep time they accumulate. Such clocks only work reliably because power companies know about this and occasionally, deliberately, generate a bit more or less than demand to bring the average frequency back in line.
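
A back-of-the-envelope sketch of that correction (illustrative numbers, not any particular grid operator's actual procedure):

    nominal = 60.0

    def accumulated_error(schedule):
        """schedule: list of (actual_hz, duration_s). Returns how far ahead (+)
        or behind (-) a cycle-counting clock ends up, in seconds."""
        return sum((f - nominal) / nominal * t for f, t in schedule)

    # Two hours running slightly low, then two hours deliberately slightly high.
    low_stretch = [(59.98, 2 * 3600)]
    with_correction = low_stretch + [(60.02, 2 * 3600)]

    print(accumulated_error(low_stretch))       # about -2.4 s: clocks fall behind
    print(accumulated_error(with_correction))   # back to ~0 after the payback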


Fun fact: there are databases of the exact frequency vs. time, and they can be used to accurately timestamp audio/video recordings by correlating the ~50/60 Hz noise in the recording against the database. Good writeup on the technique and how it has been used in court cases: https://robertheaton.com/enf/
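
The matching step is easy to sketch (synthetic data below; the hard part, extracting the hum frequency from a real recording, is skipped): slide the recording's frequency track along the grid operator's log and find the offset where they line up best.

    import random

    random.seed(1)
    grid_log = [50.0 + random.uniform(-0.05, 0.05) for _ in range(3600)]  # one hour, 1 sample/s

    true_start = 1234                                        # when the "recording" was made
    recording = grid_log[true_start:true_start + 120]        # a two-minute clip's hum track

    def best_offset(track, log):
        def mismatch(offset):
            window = log[offset:offset + len(track)]
            return sum((a - b) ** 2 for a, b in zip(track, window))
        return min(range(len(log) - len(track) + 1), key=mismatch)

    print(best_offset(recording, grid_log))                  # 1234: timestamp recovered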


In 2018 the European grid lost a cumulative 6 minutes due to a Serbia/Kosovo dispute.


Which has been corrected since.


This is fascinating, didn't know. Why does higher demand lower the frequency?


With a lot of simplification: consuming electricity acts as a brake on the giant wheels inside power plants, which normally spin in step with the mains frequency. The plants accelerate those same wheels, so with too much demand the braking wins and with too little the acceleration wins.


There is conservation of energy. Energy in strictly equals energy out.

The electrical grid is a bunch of heavy spinning motor-generators that are electrically connected to heavy spinning motor-generators and other loads like lightbulbs. The motor-generators are electrically identical, except that we expect to add energy to one side and extract energy on the other*.

So what happens if the energy added by power plants is less than the energy extracted by lightbulbs and the loads on the motor-generators? Conservation of energy means that we must get the energy by slowing down the generators, extracting their kinetic energy. That lowers the grid frequency.

The same thing can happen in reverse to increase the grid frequency. Too much power generation must increase the kinetic energy of the motor-generators.

* Many of the loads on the grid are intentional or unintentional flywheels, so they may actually add energy to the grid if the grid is slowing, increasing stability.
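
A toy Euler-step version of this picture, using the textbook swing equation df/dt = f0 * (P_gen - P_load) / (2 * H), with powers in per-unit of the system rating and H the inertia constant in seconds (values are illustrative, and governor response is ignored):

    f0 = 50.0          # nominal frequency, Hz
    H = 5.0            # aggregate inertia constant, s
    dt = 0.1           # simulation step, s

    p_gen, p_load = 0.98, 1.00   # generation falls 2% short of demand
    f = f0
    for _ in range(int(10 / dt)):                 # ten seconds of imbalance
        f += f0 * (p_gen - p_load) / (2 * H) * dt
    print(round(f, 2))   # ~49.0 Hz: the shortfall was paid out of kinetic energy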


Because generators -- where virtually _all_ AC power is created -- start running slower with high demand. They catch up, via increased power input through governors, but changes in load will necessarily have some impact on speed.


The frequency is generated by rotating electrical generators. Higher electrical demand increases the mechanical load on the generator, making it rotate more slowly, producing a lower frequency.


I don't think 60 vs 50 Hz matters wrt this.

The only thing that matters is that a clock that expects a certain frequency gets that frequency and not 1% more or 1% less.


I'm sure it makes the gear ratios easier to calculate for the power-frequency-synced gearing on those old electric clocks.

But now? It's pretty much just an implementation detail.
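
For what it's worth, the arithmetic is simple either way; a sketch assuming a two-pole synchronous motor (real clock motors often use many-pole rotors that spin far slower, but the calculation is the same):

    def synchronous_rpm(freq_hz, poles=2):
        """Synchronous speed of an AC motor: 120 * f / poles."""
        return 120 * freq_hz / poles

    second_hand_rpm = 1   # the second hand turns once per minute
    for freq in (60, 50):
        ratio = synchronous_rpm(freq) / second_hand_rpm
        print(f"{freq} Hz motor: {ratio:.0f}:1 reduction to the second hand")
    # 60 Hz motor: 3600:1 reduction to the second hand
    # 50 Hz motor: 3000:1 reduction to the second hand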


think again



