Time in Linux: 1970

Why does Unix time start at 1970-01-01?

I wouldn’t have known the answer, but Google was there for me:

From Here (needs free subscription):

Linux is following the tradition set by Unix of counting time in seconds since its official "birthday" — called "epoch" in computing terms — which is Jan. 1, 1970.

A more complete explanation can be found in this Wired News article. It explains that the early Unix engineers picked that date arbitrarily, because they needed to set a uniform date for the start of time, and New Year’s Day, 1970, seemed most convenient.

The Unix epoch is midnight on January 1, 1970. It’s important to remember that this isn’t Unix’s "birthday" — rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early ’70s only because it was convenient to do so, according to Dennis Ritchie, one of the engineers who worked on Unix at Bell Labs at its inception.

@ChrisHalcrow: what would you have chosen as time 0 if you were dmr? And how is the choice inconvenient for developers? The inconvenience is because measuring time in "human" terms (years, months, days, hours/minutes/seconds, time zones, daylight time) is complicated, not because some (arbitrary) $t=0$ instant was chosen.

@NickD, good prompt for an explanation, and good point! I would choose 00:00:00 of CE 0 though, as I’m confident that would make things a little easier to calculate. Please explain: what is ‘dmr’? Also, ironically, the fact that the OP requires an explanation of why this date was chosen shows that it is inherently confusing for someone to understand the usage of 01/01/70 as a reference date!

@ChrisHalcrow: dmr = Dennis Ritchie. Did you calculate the number of seconds from your chosen origin to today? How many bits does it require? The PDP-11 had 16-bit registers and words, but it allowed you to group together two registers and two words to make 32-bit registers and double-words for some operations. That gives you +/- 68 years from your 0 time (or +136 years if your time was unsigned — but dmr chose signed). His choice may be a bit mystifying the first time you see it, but it’s a pretty obvious decision given the above.
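A minimal C sketch of that arithmetic (not from the original thread; it uses the average Gregorian year length, so the figures are approximate):

    #include <stdio.h>

    int main(void) {
        /* A 32-bit signed counter holds +/- 2^31 seconds around time 0 */
        double seconds_per_year = 365.2425 * 24 * 60 * 60; /* ~31,556,952 */
        double years = 2147483648.0 / seconds_per_year;
        printf("signed:   +/- %.1f years\n", years);    /* ~68.0 */
        printf("unsigned: +%.1f years\n", 2.0 * years); /* ~136.1 */
        return 0;
    }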

@NickD — great explanation! This should be part of the accepted answer — why not move your comment there, and we can delete ours from here?

Let me attempt to answer it (of course, source: the internet).

Unix Time is represented by a 32-bit whole number (an integer) that can be positive or negative (signed). Unix was originally developed in the 60s and 70s, so the "start" of Unix Time was set to January 1st 1970 at midnight GMT (Greenwich Mean Time); this date/time was assigned the Unix Time value of 0. This is what is known as the Unix Epoch.


A 32-bit signed integer can represent whole numbers between -2147483648 and 2147483647. Since Unix Time starts at 0, negative Unix Time values go back in time from the Epoch and positive numbers go forward in time. This means that Unix Time spans from a value of -2147483648, or 20:45:52 GMT on December 13th 1901, to a value of 2147483647, or 3:14:07 GMT on January 19th 2038. These dates represent the beginning, the pre-history, and the end of Unix Time.

The end of Unix Time will occur on January 19, 2038 03:14:07 GMT. On January 19, 2038 03:14:08 GMT, all computers that still use 32-bit Unix Time will overflow. This is known as the "Year 2038 problem". Some believe this will be a more significant problem than the "Year 2000 problem". The fix for the Year 2038 problem is to store Unix Time in a 64-bit integer. This is already underway in most 64-bit operating systems, but many systems may not be updated by 2038.
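To see those boundary dates concretely, you can push the 32-bit limits through gmtime(). A minimal sketch, assuming a platform whose time_t is 64-bit (so the values themselves don't overflow) and whose gmtime() accepts pre-epoch values, as glibc's does:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    static void show(int64_t v) {
        time_t t = (time_t)v;
        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
        printf("%11lld -> %s UTC\n", (long long)v, buf);
    }

    int main(void) {
        show(INT32_MIN); /* 1901-12-13 20:45:52 */
        show(0);         /* 1970-01-01 00:00:00 */
        show(INT32_MAX); /* 2038-01-19 03:14:07 */
        return 0;
    }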

Only one paragraph of this actually addresses the question, and it’s somewhat inaccurate (the epoch was originally in 1971; it was moved later).

That’s right, Michael. From Wikipedia: The earliest versions of Unix time had a 32-bit integer incrementing at a rate of 60 Hz, which was the rate of the system clock on the hardware of the early Unix systems. The value 60 Hz still appears in some software interfaces as a result. The epoch also differed from the current value. The first edition Unix Programmer’s Manual, dated 3 November 1971, defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".

@Nikhil I still don’t get why 1970. Is it only because Unix was developed around that time? Why not 1960? Or a different month and day?

@Nikhil Or does it not really matter? The first day of the first month just looks better, and since it was made in 1971, 1970 would look better too?

Psst! Dennis Ritchie is on the record about this, to Poul-Henning Kamp, Warren Toomey, and Wired. Warner Losh has also reported on this. Find out what dmr actually told people about this.

Therefore, here is Dennis Ritchie’s comment about this, as well as a brief explanation of the overflow that he mentions.

The Unix epoch is midnight on January 1, 1970. It’s important to remember that this isn’t Unix’s "birthday" — rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early ’70s only because it was convenient to do so, according to Dennis Ritchie, one [of] the engineers who worked on Unix at Bell Labs at its inception.

"At the time we didn’t have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, ‘Let’s pick one thing that’s not going to overflow for a while.’ 1970 seemed to be as good as any."

There are approximately 32 million seconds in a year, which means that it takes about 31 years for a billion seconds to pass. Apparently, earlier this year, some mathematically inclined provocateurs discovered that the year 2001 marked 31 years since 1970, and some of them assumed that this might represent an "overflow" — the date buffer filling with digits, causing the computer to go wacky.
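A quick check of the quoted arithmetic: a year is roughly $3.16\times10^7$ seconds, so $10^9$ seconds is about $10^9 / (3.16\times10^7) \approx 31.7$ years. Unix time duly crossed one billion in September 2001 (the value 1000000000 corresponds to 2001-09-09 01:46:40 UTC). Note that this was only the decimal rendering of the counter gaining a tenth digit, not the counter itself overflowing.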

In addition, and with a bit more historical detail, Warner Losh wrote in an email ("Re: [TUHS] The 2038 bug.") on 4 Jan 2021:

My understanding is that it’s been 1st Jan 1970 since at least Ed5, if not Ed6.

It’s been that way since the 4th edition.

In the 3rd edition it was the number of 60 Hz ticks since 1972, along with this note: "This guarantees a crisis every 2.26 years."

Rebasing the epoch would be… tricky. Lots of math is done assuming an origin of 1970, and not all of it is obvious even to concerted analysis.

Читайте также:  Linux raid member смонтировать

Less ugly would be to declare time_t to be unsigned instead of signed; it would break less code. Making time_t 64 bits also breaks code, even if you declare you don’t care about binary compatibility, since many older apps know time_t is 32 bits.

Notable dates

  • V1 released November 1971
  • V2 released June 1972
  • V3 released February 1973
  • V4 released November 1973
  • V5 released June 1974
  • V6 released May 1975
  • V7 released January 1979
  • V8 released 1985
  • …
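The third-edition note quoted above is easy to verify: a 32-bit counter holds $2^{32}$ ticks, and at 60 ticks per second that is $2^{32}/60 \approx 7.16\times10^7$ seconds, about 828 days, or roughly 2.27 years between overflows, which matches the manual's "crisis every 2.26 years" up to rounding.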


Why is 1/1/1970 the "epoch time"?

Early versions of Unix measured system time in 1/60 s intervals. This meant that a 32-bit unsigned integer could only represent a span of time less than 829 days. For this reason, the time represented by the number 0 (called the epoch) had to be set in the very recent past. As this was in the early 1970s, the epoch was set to 1971-01-01.

Later, the system time was changed to increment every second, which increased the span of time that could be represented by a 32-bit unsigned integer to around 136 years. As it was no longer so important to squeeze every second out of the counter, the epoch was rounded down to the nearest decade, thus becoming 1970-01-01. One must assume that this was considered a bit neater than 1971-01-01.

Note that a 32-bit signed integer using 1970-01-01 as its epoch can represent dates up to 2038-01-19, on which date it will wrap around to 1901-12-13.
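A minimal C sketch (not from the original answer) of the arithmetic behind the 829-day and 136-year figures, using unsigned counters in both cases:

    #include <stdio.h>

    int main(void) {
        double ticks = 4294967296.0; /* 2^32 */
        double day = 86400.0;
        double year = 365.2425 * day;

        /* counting sixtieths of a second, as early Unix did */
        printf("60 Hz ticks: %.1f days\n", ticks / 60.0 / day); /* ~828.5 */

        /* counting whole seconds, as Unix does now */
        printf("1 Hz ticks:  %.1f years\n", ticks / year);      /* ~136.1 */
        return 0;
    }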

It’s the frequency of one of the oscillators on the system boards used at the time. It wasn’t necessary for the oscillator to be 60 Hz, since the machine ran on DC, but it was probably cheap to use whatever was most common at the time, and TVs were being mass-produced then.

Actually, at the time, it was very common for computer clocks as well as RTCs to be synchronised with the US mains waveform because it was (is?) very reliable. It was multiplied to get the processor clock, and divided to get seconds for the RTC.


@JediKnight This is speculation based on my own experiences as a developer: changing a standard takes time, and if your change doesn’t take hold then you end up with competing standards. The real solution to the epoch problem is 64-bit integers, not moving the epoch forward in time.

The earliest versions of Unix time had a 32-bit integer incrementing at a rate of 60 Hz, which was the rate of the system clock on the hardware of the early Unix systems. The value 60 Hz still appears in some software interfaces as a result. The epoch also differed from the current value. The first edition Unix Programmer’s Manual, dated November 3, 1971, defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".

Epoch reference date

An epoch reference date is a point on the timeline from which we count time. Moments before that point are counted with a negative number, moments after are counted with a positive number.

Many epochs in use

Why is 1 January 1970 00:00:00 considered the epoch time?

No, not the epoch, an epoch. There are many epochs in use.

This choice of epoch is arbitrary.

Major computer systems and libraries use any of at least a couple dozen different epochs. One of the most popular epochs is commonly known as Unix Time, using the 1970 UTC moment you mentioned.

While popular, Unix Time’s 1970 may not be the most common. Also in the running for most common would be January 0, 1900, used by countless Microsoft Excel and Lotus 1-2-3 spreadsheets, or January 1, 2001, used by Apple’s Cocoa framework on over a billion iOS/macOS machines worldwide in countless apps. Or perhaps January 6, 1980, used by GPS devices?
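Converting between such second-counting epochs is just a fixed offset (Excel's day-based serial numbers aside). A hedged C sketch; the offset constants are the documented distances of those epochs from the Unix epoch, and real GPS time additionally drifts from UTC by accumulated leap seconds, which this ignores:

    #include <stdio.h>
    #include <time.h>

    /* Offsets of other epochs, expressed in Unix seconds */
    #define COCOA_EPOCH 978307200L /* 2001-01-01 00:00:00 UTC */
    #define GPS_EPOCH   315964800L /* 1980-01-06 00:00:00 UTC */

    int main(void) {
        time_t unix_now = time(NULL);
        printf("Unix  seconds: %lld\n", (long long)unix_now);
        printf("Cocoa seconds: %lld\n", (long long)(unix_now - COCOA_EPOCH));
        printf("GPS   seconds: %lld\n", (long long)(unix_now - GPS_EPOCH));
        return 0;
    }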

Many granularities

Different systems use different granularity in counting time.

Even the so-called "Unix Time" varies, with some systems counting whole seconds and some counting milliseconds. Many databases, such as Postgres, use microseconds. Some, such as the modern java.time framework in Java 8 and later, use nanoseconds. Others use still other granularities.
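The counts differ only by powers of ten, as this small C sketch shows (the sample value is an arbitrary illustration, not from the original answer):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* One instant since the epoch, at the granularities above */
        int64_t nanos = 1548246896123456789LL; /* arbitrary example */

        printf("seconds:      %lld\n", (long long)(nanos / 1000000000LL));
        printf("milliseconds: %lld\n", (long long)(nanos / 1000000LL));
        printf("microseconds: %lld\n", (long long)(nanos / 1000LL));
        printf("nanoseconds:  %lld\n", (long long)nanos);
        return 0;
    }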

ISO 8601

Because there is so much variance in the choice of epoch and in granularity, it is generally best to avoid communicating moments as a count-from-epoch. Given the ambiguity of both epoch and granularity, plus the inability of humans to read meaning into raw counts (and therefore to spot buggy values), use plain text instead of numbers.

The ISO 8601 standard provides an extensive set of practical well-designed formats for expressing date-time values as text. These formats are easy to parse by machine as well as easy to read by humans across cultures.

  • Date-only: 2019-01-23
  • Moment in UTC: 2019-01-23T12:34:56.123456Z
  • Moment with offset-from-UTC: 2019-01-23T18:04:56.123456+05:30
  • Week of week-based-year: 2019-W23
  • Ordinal date (1st to 366th day of year): 2019-234
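For instance, a count-from-epoch can be rendered in the second format above (a moment in UTC, here without fractional seconds) with standard C alone; a minimal sketch using strftime():

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        /* Render the current moment as an ISO 8601 UTC timestamp */
        time_t now = time(NULL);
        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", gmtime(&now));
        printf("%s\n", buf); /* e.g. 2019-01-23T12:34:56Z */
        return 0;
    }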

