I need to add a feature involving time calculation to a module in a given project. For this specific case I’m using Java, and reading through the documentation of the Date class I found out that time is represented as milliseconds elapsed since January 1, 1970, 00:00:00 GMT. I think it’s safe to assume there is a similar “starting date” in other languages, so I guess the specific implementation in Java doesn’t matter.
How is the time calculation performed by the computer? How does it know exactly how many milliseconds have passed from that given “starting date and time” to the current date and time?
> I think it’s safe to assume there is a similar “starting date” in other languages so I guess the specific implementation in Java doesn’t matter.
Pretty much all computers use this or a similar variation of Unix time.
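In Java, for instance, you can see that count directly. A minimal illustration (note that the printed date is rendered in your local time zone):

```java
import java.util.Date;

public class EpochDemo {
    public static void main(String[] args) {
        // A Date built from 0 milliseconds is exactly the "starting date":
        // January 1, 1970, 00:00:00 GMT (printed in the local time zone).
        System.out.println(new Date(0L));

        // The current time is simply the number of milliseconds
        // elapsed since that instant.
        System.out.println(System.currentTimeMillis());
    }
}
```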
> How is the time calculation performed by the computer? How does it know exactly how many milliseconds have passed from that given “starting date and time” to the current date and time?
Typically, a computer includes one or more hardware devices that count time, usually based on a crystal oscillator (the same as used in most modern clocks). There are several variations/standards for such devices:
- Programmable Interval Timer
- Real-Time Clock
- High Precision Event Timer
These timer devices usually have a separate battery (the real-time clock in particular) that allows them to continue running while the computer is switched off. The computer interacts with them in two ways:
- When it boots up, it reads the current time from the device.
- At regular intervals, the device raises a hardware interrupt that the OS handles to perform any time-dependent tasks (see the sketch after this list).
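As a rough illustration of the two kinds of time source this ends up exposing, Java gives you both a wall clock (maintained by the OS as described above) and a monotonic counter backed by a high-resolution hardware timer:

```java
public class ClockSources {
    public static void main(String[] args) throws InterruptedException {
        // Wall-clock time: milliseconds since the Unix epoch, kept by the OS
        // from the battery-backed clock plus periodic timer interrupts.
        long wallStart = System.currentTimeMillis();

        // Monotonic time: an arbitrary-origin counter backed by a
        // high-resolution hardware timer; meant for measuring intervals.
        long monoStart = System.nanoTime();

        Thread.sleep(100);

        System.out.println("Wall clock elapsed: "
                + (System.currentTimeMillis() - wallStart) + " ms");
        System.out.println("Monotonic elapsed:  "
                + (System.nanoTime() - monoStart) / 1_000_000 + " ms");
    }
}
```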
However, the crystal oscillators used in these hardware timers have limited accuracy and over time drift away from the correct time. For applications where it’s important to have an accurate clock (or just to avoid the hassle of adjusting it manually), computers regularly synchronize their time via the Network Time Protocol (NTP) with a time server, which is connected (directly or indirectly) to a high-precision atomic clock.
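As a sketch of what such a query looks like at the lowest level, here is a minimal SNTP client in plain Java. The server name pool.ntp.org is just a common public pool, and real NTP clients do much more (round-trip compensation, gradual clock slewing), so treat this as illustrative only:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class SntpExample {
    // Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
    private static final long NTP_TO_UNIX_SECONDS = 2_208_988_800L;

    public static void main(String[] args) throws Exception {
        byte[] buf = new byte[48];
        buf[0] = 0x1B; // LI = 0, Version = 3, Mode = 3 (client request)

        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setSoTimeout(3000);
            InetAddress server = InetAddress.getByName("pool.ntp.org");
            socket.send(new DatagramPacket(buf, buf.length, server, 123));
            socket.receive(new DatagramPacket(buf, buf.length));
        }

        // Transmit timestamp: seconds since 1900, big-endian, at bytes 40-43
        // (we ignore the fractional part, so this is second-level precision).
        long ntpSeconds = 0;
        for (int i = 40; i < 44; i++) {
            ntpSeconds = (ntpSeconds << 8) | (buf[i] & 0xFF);
        }
        long serverMillis = (ntpSeconds - NTP_TO_UNIX_SECONDS) * 1000L;

        System.out.println("Server time: " + new java.util.Date(serverMillis));
        System.out.println("Local time:  " + new java.util.Date());
    }
}
```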
There are 60 seconds in a minute, 60 minutes in an hour, and 24 hours in a day. That’s 86400 seconds in a day. Multiply that by 1000, and you’ve got the number of milliseconds in a day.
You should now be able to work out the number of whole days since the epoch (January 1, 1970, 00:00:00 GMT). Multiply that by the number of milliseconds in a day, add the milliseconds elapsed so far today, and you have the internal number that the computer clock stores to represent the current date/time.
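The same arithmetic in Java, splitting the stored millisecond count back into those two parts:

```java
public class EpochMath {
    public static void main(String[] args) {
        long millisPerDay = 24L * 60 * 60 * 1000; // 86,400,000

        long now = System.currentTimeMillis();
        long daysSinceEpoch = now / millisPerDay;  // whole days since 1970-01-01
        long millisIntoToday = now % millisPerDay; // remainder: time of day (UTC)

        // Reassembling the two parts gives back the stored value.
        System.out.println(daysSinceEpoch * millisPerDay + millisIntoToday == now); // true
    }
}
```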
The BIOS keeps the time for you on the system clock. The OS provides a system call to retrieve the time in its own format. On Unix systems this matches what Java expects; Windows uses a different “epoch” (the start date), but the Java runtime translates the result to match.
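As an illustration of that kind of epoch translation (a sketch of the idea, not the JVM’s actual implementation): Windows’s FILETIME counts 100-nanosecond ticks since January 1, 1601 UTC, so converting to Unix milliseconds is just a division plus a fixed, well-known offset:

```java
public class EpochTranslation {
    // Milliseconds between 1601-01-01 (FILETIME epoch) and 1970-01-01 (Unix epoch).
    private static final long FILETIME_EPOCH_OFFSET_MS = 11_644_473_600_000L;

    static long fileTimeToUnixMillis(long fileTime) {
        // 10,000 ticks of 100 ns each make one millisecond.
        return fileTime / 10_000L - FILETIME_EPOCH_OFFSET_MS;
    }

    public static void main(String[] args) {
        // A FILETIME value of exactly the offset maps to the Unix epoch.
        long fileTimeAtUnixEpoch = FILETIME_EPOCH_OFFSET_MS * 10_000L;
        System.out.println(fileTimeToUnixMillis(fileTimeAtUnixEpoch)); // 0
    }
}
```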