This question is about the clock that keeps current time, not about the signal that sequences the circuitry (the wall clock, not the metronome).
The first computers were pure calculators, just very large, so I’m pretty sure they did not have a system clock (simply because that’s a lot of extra circuitry). Modern computers use the system clock for so many things that there’s even an API for obtaining the current time, which is used when dealing with emails and IM messages, creating new log records, creating and changing filesystem objects, etc.
So clearly at some point someone decided that a system clock was useful for something inside the computer, and a clock was added.
What was the earliest use case for the system clock?
It’s interesting to separate out system clocks, which provide the basic drumbeat for keeping gating circuits in synch with each other, and calendar clocks, which are used for tracking elapsed time over longer periods.
The simplest form of calendar clock is driven off the alternating current power supply. This provides 60 cycles per second in the US, generally 50 in Europe. The circuitry for this is simple and inexpensive, and it’s suitable for a lot of purposes, such as providing the time last modified for a file. It’s reasonably accurate because the power grid requires all power providers to be in synch with each other. AC clocks go back at least to the UNIVAC computers, and probably earlier.
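To make the idea concrete, here is a minimal sketch (the interrupt hook and names are invented for illustration, not taken from any real machine or API) of how a line-frequency calendar clock amounts to nothing more than counting AC cycles:

```c
/* Illustrative sketch only: keeping calendar time by counting AC-line cycles.
 * The interrupt hook and all names here are hypothetical, not a real API. */
#include <stdint.h>

#define LINE_HZ 60u                     /* 60 cycles/second in the US, 50 in Europe */

static volatile uint32_t cycles  = 0;   /* pulses seen in the current second */
static volatile uint32_t seconds = 0;   /* seconds elapsed since power-on    */

/* Assume the hardware calls this once per zero-crossing of the AC supply. */
void ac_line_tick(void)
{
    if (++cycles == LINE_HZ) {
        cycles = 0;
        seconds++;   /* long-term accuracy comes from the power grid itself */
    }
}
```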
The speed and accuracy of the system clock depend somewhat on the technology used to drive your gates. Relays, vacuum tubes, transistors, integrated circuits, and microchips operate at vastly different speeds, and require very different clocks.
In addition, many early systems were completely synchronous, and a component had to deliver the correct result on a given clock tick, or else it was out of spec. Other systems were designed to be somewhat or completely asynchronous, where a signal would be provided whenever it was ready, and the receiver of that signal had to check on some control circuit to tell it whether to use the signal or to wait.
In the heyday of timesharing systems (late 1960s, early 1970s), the AC clock provided a source of interrupts that would allow the operating system to reconsider its scheduled jobs, if necessary, on each interrupt. Timesharing system designers wanted to keep scheduling overhead within limits, and also to make the system responsive to changing loads and demands.
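A hedged sketch of what that looks like (the tick counter, slice length, and reschedule() are invented names, not drawn from any particular timesharing system): the line interrupt drives a counter, and the scheduler is consulted only once per time slice, which bounds the overhead.

```c
/* Hypothetical sketch: using the AC-line interrupt as a scheduling tick. */
static unsigned long ticks = 0;

#define TICKS_PER_SLICE 6      /* e.g. 6 ticks = 100 ms at a 60 Hz line */

/* Placeholder: a real system would examine the run queue and switch jobs. */
static void reschedule(void) { /* pick the next job to run, if any */ }

/* Called on every AC-line interrupt; overhead stays bounded because the
 * scheduler runs once per time slice, not on every tick. */
void clock_interrupt(void)
{
    if (++ticks % TICKS_PER_SLICE == 0)
        reschedule();
}
```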
Today’s desktops typically use the AC clock for calendar purposes, except when power is cut off, when they fall back on a button battery that may last five to ten years. The battery-powered clock gains or loses up to a few seconds a day.
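As a rough sanity check on that figure (my own back-of-the-envelope arithmetic, not part of the answer above), a drift of two seconds per day works out to roughly 23 parts per million, which is in the ballpark of an inexpensive crystal oscillator:

```c
/* Back-of-the-envelope: convert "seconds per day" of drift into parts per million. */
#include <stdio.h>

int main(void)
{
    const double drift_s_per_day = 2.0;                   /* assumed drift       */
    const double seconds_per_day = 24.0 * 60.0 * 60.0;    /* 86,400 seconds      */
    printf("%.1f ppm\n", drift_s_per_day / seconds_per_day * 1e6);  /* ~23.1 ppm */
    return 0;
}
```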
Many early time-sharing services would charge you by the millisecond for things like CPU time and memory time, and even dedicated data-processing systems (think company payroll and accounting systems) would have batch runs scheduled to run at certain times. I think large computer systems started having real-time clocks around the time they stopped being experimental toys and began to be used for “real work”.
Yes, adding a clock was complex for the first computers, that’s right. But some of the first computers simply WERE clocks: http://en.wikipedia.org/wiki/Antikythera_mechanism.
Interestingly, the first universal computers were not afraid of using a clock.
ENIAC (1946) had a special cycling unit for synchronizing all of its operations. It was not called a “system clock”, but it was a separate unit and it counted time, so it was a clock. And the whole machine could work with it, not through an API, of course (there were no APIs in those days).
ENIAC was one of the first universal programmable machines, but not the first… Even the German Z1 (1941) had its own clock, too, for the same use: synchronization.