Why are dates calculated from January 1st, 1970?
Solution 1
It is the standard of Unix time.
Unix time, or POSIX time, is a system for describing points in time, defined as the number of seconds elapsed since midnight UTC (Coordinated Universal Time) on January 1, 1970, not counting leap seconds.
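A minimal sketch of that definition in Python (one of the languages the question mentions): the count-since-epoch of the epoch itself is zero, and `time.time()` returns the seconds elapsed since that instant.

```python
import time
from datetime import datetime, timezone

# The Unix epoch: 1970-01-01 00:00:00 UTC.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Its count-since-epoch is, by definition, zero.
print(epoch.timestamp())  # → 0.0

# time.time() returns the seconds elapsed since that same instant.
print(time.time())
```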
Solution 2
> using date(January 1st, 1970) as default standard
The question makes two false assumptions:
- All time-tracking in computing is done as a count-since-1970.
- Such tracking is standard.
Two Dozen Epochs
Time in computing is not always tracked from the beginning of 1970 UTC. While that epoch reference is popular, various computing environments over the decades have used close to two dozen different epochs. Some are from other centuries. They range from year 0 (zero) to 2001.
Here are a few:

- January 0, 1 BC
- January 1, AD 1
- October 15, 1582
- January 1, 1601
- December 31, 1840
- November 17, 1858
- December 30, 1899
- December 31, 1899
- January 1, 1900
- January 1, 1904
- December 31, 1967
- January 1, 1980
- January 6, 1980
- January 1, 2000
- January 1, 2001
Unix Epoch Common, But Not Dominant
The beginning of 1970 is popular, probably because of its use by Unix. But by no means is it dominant. For example:

- Countless millions (billions?) of Microsoft Excel & Lotus 1-2-3 documents use January 0, 1900 (December 31, 1899).
- The world now has over a billion iOS/OS X devices using the Cocoa (NSDate) epoch of January 1, 2001, GMT.
- The GPS satellite navigation system uses January 6, 1980, while the European alternative Galileo uses August 22, 1999.
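Because multiple epochs coexist, converting between them is a matter of adding a fixed offset. As a sketch (the function name `cocoa_to_unix` is my own, for illustration): the gap between the Cocoa/NSDate epoch (2001) and the Unix epoch (1970) is exactly 978,307,200 seconds.

```python
from datetime import datetime, timezone

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
COCOA_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

# Fixed offset between the two epochs, in seconds.
OFFSET = (COCOA_EPOCH - UNIX_EPOCH).total_seconds()  # 978307200.0

def cocoa_to_unix(cocoa_seconds: float) -> float:
    """Convert seconds-since-2001 (NSDate style) to seconds-since-1970."""
    return cocoa_seconds + OFFSET

# A Cocoa timestamp of 0 is the Cocoa epoch itself: 2001-01-01 UTC.
print(datetime.fromtimestamp(cocoa_to_unix(0), tz=timezone.utc))
# → 2001-01-01 00:00:00+00:00
```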
ISO 8601
Assuming that a count-since-epoch uses the Unix epoch opens a big vulnerability for bugs. Such a count is impossible for a human to instantly decipher, so errors won't be easily flagged when debugging and logging. Another problem is the ambiguity of granularity, explained below.

I strongly suggest instead serializing date-time values for data interchange as unambiguous ISO 8601 strings rather than as an integer count-since-epoch: `YYYY-MM-DDTHH:MM:SS.SSSZ`, such as `2014-10-14T16:32:41.018Z`.
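A sketch of that round trip in Python's standard library (note: `datetime.fromisoformat` only accepts a trailing `Z` from Python 3.11 onward, hence the substitution):

```python
from datetime import datetime, timezone

# A fixed instant, for a reproducible example.
moment = datetime(2014, 10, 14, 16, 32, 41, 18000, tzinfo=timezone.utc)

# Serialize as ISO 8601 with millisecond precision and a 'Z' suffix.
text = moment.isoformat(timespec="milliseconds").replace("+00:00", "Z")
print(text)  # → 2014-10-14T16:32:41.018Z

# Round-trip: parse the string back to the same instant.
parsed = datetime.fromisoformat(text.replace("Z", "+00:00"))
assert parsed == moment
```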
Count Of What Since Epoch
Another issue with count-since-epoch time tracking is the time unit, with at least four levels of resolution commonly used:

- Seconds: The original Unix facilities used whole seconds, leading to the Year 2038 Problem when we reach the limit of seconds-since-1970 stored as a 32-bit integer.
- Milliseconds: Used by older Java libraries, including the bundled java.util.Date class and the Joda-Time library.
- Microseconds: Used by databases such as Postgres.
- Nanoseconds: Used by the java.time package in Java 8.
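The same instant therefore yields four different integers depending on the convention, which is why the granularity must never be guessed. A small sketch, using the Year 2038 rollover instant as the example:

```python
from datetime import datetime, timezone

# The Year 2038 Problem instant: the largest value a signed
# 32-bit count of seconds-since-1970 can hold.
t = datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)
seconds = int(t.timestamp())
assert seconds == 2**31 - 1  # 2147483647

# One instant, four common epoch resolutions:
millis = seconds * 1_000            # e.g. java.util.Date / Joda-Time
micros = seconds * 1_000_000        # e.g. Postgres timestamps
nanos = seconds * 1_000_000_000     # e.g. java.time in Java 8+
print(seconds, millis, micros, nanos)
```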
Solution 3
Why is it always 1st January 1970? Because '1st January 1970', usually called the "epoch date", is the date when time started for Unix computers, and that timestamp is marked as '0'. Any time since that date is calculated based on the number of seconds elapsed. In simpler words: the timestamp of any date is the difference, in seconds, between that date and '1st January 1970'. The timestamp is just an integer which started from '0' at midnight on 1st January 1970 and increments by '1' as each second passes. For conversion of Unix timestamps to readable dates, PHP and other open-source languages provide built-in functions.
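As a sketch of such a built-in conversion (shown here in Python rather than PHP), turning a raw timestamp into a readable date:

```python
from datetime import datetime, timezone

# Timestamp 0 is the epoch itself.
print(datetime.fromtimestamp(0, tz=timezone.utc))
# → 1970-01-01 00:00:00+00:00

# One billion seconds after the epoch:
print(datetime.fromtimestamp(1_000_000_000, tz=timezone.utc))
# → 2001-09-09 01:46:40+00:00
```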
Solution 4
Is there any reason behind using date(January 1st, 1970) as standard for time manipulation?
No reason that matters.
Python's time
module wraps the C standard library. Ask Ken Thompson why he chose that date for an epochal date. Maybe it was someone's birthday.
Excel uses two different epochs. Any reason why different version of excel use different dates?
Except for the actual programmer, no one else will ever know why those kinds of decisions were made.
And...
It does not matter why the date was chosen. It just was.
Astronomers use their own epochal date: http://en.wikipedia.org/wiki/Epoch_(astronomy)
Why? A date has to be chosen to make the math work out. Any random date will work.
A date far in the past avoids negative numbers for the general case.
Some of the smarter packages use the proleptic Gregorian year 1. Any reason why year 1?
There's a reason given in books like Calendrical Calculations: it's mathematically slightly simpler.
But if you think about it, the difference between 1/1/1 and 1/1/1970 is just 1969 years, a trivial mathematical offset.
Solution 5
January 1st, 1970, 00:00:00 UTC is the zero-point of POSIX time.
Vijay Shanker Dubey
Updated on July 08, 2022

Comments
-
Vijay Shanker Dubey almost 2 years
Is there any reason behind using date(January 1st, 1970) as default standard for time manipulation? I have seen this standard in Java as well as in Python. These two languages I am aware of. Are there other popular languages which follow the same standard?
Please describe.
-
Greg K over 14 years: Another popular language following the same standard is PHP; it's a fairly common time start point.
-
dmckee --- ex-moderator kitten over 14 years: Do you know if Kernighan and Thompson ever expressed a reason for choosing that moment beyond "It's a round number slightly before we started building the thing."?
-
Donal Fellows over 14 yearsIt's the start of a year, it's in the zero timezone (Zulu). Both of those make the date formatting code simpler.
-
keturn over 14 yearsDoesn't count leap seconds? I did not know that detail. After thinking about it for a few moments, I can see why you'd do it that way, but man. my world is shattered. by 24 seconds.
-
Chris Nava over 14 yearsIf 1/1/1 had been chosen we would have run out of seconds (2^31) by now. As it stands, we face a Y2K like issue in 2038 for 32 bit operating systems. en.wikipedia.org/wiki/Year_2038_problem
-
user1066101 over 14 years@Chris Nava: The folks that use 1/1/1 count days, not seconds. 2 billion days is about 5 million years. Often they keep a (day,time) pair to maximize time resolution; there are only 86400 seconds in most days.
-
Chris Nava over 14 years: @S.Lott: Yes. I was just pointing out that since most software counts seconds (not minutes) since the epoch, 1/1/1 was too far in the past to be a reasonable start date. Therefore, a more recent date was chosen as the computer epoch (and, by association, the start of the IT revolution). ;-)
-
user1066101 over 14 years@Chris Nava: "most"? I assume by "most" you mean "Linux". Other OS's don't work the same way Linux does. The issue is that "reasonable" and "why 1/1/1970?" aren't easy questions to answer; most importantly, the answer doesn't matter. "reasonable" is true, but it's not the reason why. The reason why is something only Ken Thompson can answer.
-
PascalVKooten almost 6 yearsI wonder what the dominant epoch is at this moment... did you base it on data?
-
Basil Bourque almost 6 years@PascalVKooten Many different epochs are used in many different environments and software systems. So there is no one dominant epoch. My point here is to never assume the epoch. Know your data source. Best approach is for a data source to avoid the epoch problem entirely and just use ISO 8601 strings, IMHO.
-
PascalVKooten almost 6 yearsThanks for your reply. I understand that there are many, but I'm curious to know if e.g. POSIX got more popular over time.
-
Unknow0059 over 3 yearsWhy are dates calculated from January 1st 1970? Because dates are calculated from January 1st 1970. This is circular logic that doesn't answer anything.
-
Unknow0059 over 3 yearsThis is interesting and all but doesn't answer why 1970 was chosen to be the Unix epoch.