# Unix Epoch
The Unix epoch (January 1st, 1970 at 00:00:00 UTC) was chosen by early Unix engineers as a convenient and uniform starting point for representing time. All Unix timestamps are calculated relative to this moment.

## Examples
| Date / Time (UTC) | Unix Time (seconds) |
|---|---|
| January 1st, 1970 00:00:00 | 0 |
| June 6th, 1983 00:00:00 | 423705600 |
| June 6th, 1983 16:00:00 | 423763200 |
| September 9th, 2001 01:46:40 | 1000000000 |
| July 20th, 1969 20:17:40 | -14182940 |
Dates before 1970 are represented as negative numbers, counted as seconds before the epoch.
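The values in the table above can be reproduced with Python's standard `datetime` module, which treats a timezone-aware datetime as an offset from the epoch (`to_unix` is a helper defined here for illustration):

```python
from datetime import datetime, timezone

def to_unix(year, month, day, hour=0, minute=0, second=0):
    """Return the Unix timestamp (in seconds) for a UTC date and time."""
    dt = datetime(year, month, day, hour, minute, second, tzinfo=timezone.utc)
    # timestamp() on an aware datetime is the signed offset from the epoch.
    return int(dt.timestamp())

print(to_unix(1970, 1, 1))               # 0
print(to_unix(1983, 6, 6, 16))           # 423763200
print(to_unix(2001, 9, 9, 1, 46, 40))    # 1000000000
print(to_unix(1969, 7, 20, 20, 17, 40))  # -14182940 (before the epoch)
```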
## Limitations

### Year 2038 Problem
Systems that store Unix time as a signed 32-bit integer will encounter errors after 03:14:07 UTC on January 19, 2038, when timestamps exceed the maximum 32-bit value. This is similar in spirit to the Y2K issue. Modern systems using 64-bit integers are not affected.
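The overflow can be sketched by reinterpreting a timestamp as a signed 32-bit field with Python's `struct` module (`as_int32` is an illustrative helper, not a real API):

```python
import struct
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit integer can hold

# The last moment representable in signed 32-bit Unix time:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

def as_int32(t):
    """Reinterpret a timestamp modulo 2**32 as a signed 32-bit integer."""
    # Pack the low 32 bits as unsigned ('I'), then unpack as signed ('i').
    return struct.unpack('<i', struct.pack('<I', t & 0xFFFFFFFF))[0]

# One second past the limit, a 32-bit field wraps to a large negative
# number, which decodes to a date in 1901.
print(as_int32(INT32_MAX + 1))  # -2147483648
print(datetime.fromtimestamp(as_int32(INT32_MAX + 1), tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```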
### Leap Seconds

Unix time assumes each day has exactly 86,400 seconds and does not account for leap seconds. This means Unix time can drift slightly from true UTC, though the difference is negligible for most applications.

## Variations
### Unix Time with Decimals

For sub-second precision, Unix time can include decimal places. For example, 1609459200.5 represents a moment 500 milliseconds after midnight UTC on January 1st, 2021.
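Python represents such timestamps as floats (this is what `time.time()` returns); decoding one keeps the fractional part as microseconds on the resulting datetime:

```python
from datetime import datetime, timezone

# The .5 in the timestamp becomes 500,000 microseconds when decoded.
dt = datetime.fromtimestamp(1609459200.5, tz=timezone.utc)
print(dt)  # 2021-01-01 00:00:00.500000+00:00
```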
### Unix Time in Milliseconds

Narrative uses this format, representing time as the number of milliseconds since the Unix epoch. This provides greater precision for timestamp data; to convert, multiply standard Unix time by 1,000. For example:

- Unix time (seconds): 1609459200
- Unix time (milliseconds): 1609459200000
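A minimal sketch of converting in both directions (the helper names are my own, not part of any library):

```python
from datetime import datetime, timezone

def seconds_to_millis(ts):
    """Convert a Unix timestamp in seconds to milliseconds."""
    return int(round(ts * 1000))

def millis_to_datetime(ms):
    """Decode a millisecond Unix timestamp to an aware UTC datetime."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

print(seconds_to_millis(1609459200))      # 1609459200000
print(millis_to_datetime(1609459200000))  # 2021-01-01 00:00:00+00:00
```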
## Additional Resources
- Wikipedia: Year 2038 Problem
- Wikipedia: Unix time
- Epoch Converter for practical conversions between Unix time and standard dates

