In computer science and computer programming, system time represents a computer system's notion of the passage of time. In this sense, time also includes the passing of days on the calendar.
System time is measured by a system clock, which is typically implemented as a simple count of the number of ticks that have transpired since some arbitrary starting date, called the epoch. For example, Unix and POSIX-compliant systems encode system time ("Unix time") as the number of seconds elapsed since the start of the Unix epoch at 1 January 1970 00:00:00 UT, with exceptions for leap seconds. Systems that implement the 32-bit and 64-bit versions of the Windows API, such as Windows 9x and Windows NT, provide the system time in two forms: as `SYSTEMTIME`, represented as a year/month/day/hour/minute/second/milliseconds value, and as `FILETIME`, represented as a count of the number of 100-nanosecond ticks since 1 January 1601 00:00:00 UT as reckoned in the proleptic Gregorian calendar.
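These representations differ only in epoch and tick length, so translating between them is plain arithmetic. The following C sketch illustrates this, assuming a POSIX-style `time()` that returns seconds since the Unix epoch; the conversion constant is the 11,644,473,600-second gap between 1 January 1601 and 1 January 1970:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Offset between the Windows epoch (1 January 1601) and the Unix
 * epoch (1 January 1970), expressed in 100-nanosecond ticks:
 * 11,644,473,600 seconds separate the two epochs. */
#define EPOCH_DIFF_100NS 116444736000000000ULL

int main(void)
{
    time_t unix_seconds = time(NULL);   /* seconds since 1 Jan 1970 UT */

    /* Convert the Unix count into a FILETIME-style tick count:
     * 1 second = 10,000,000 ticks of 100 ns. */
    uint64_t windows_ticks =
        (uint64_t)unix_seconds * 10000000ULL + EPOCH_DIFF_100NS;

    printf("Unix time:      %lld s since 1 Jan 1970\n",
           (long long)unix_seconds);
    printf("FILETIME-style: %llu ticks of 100 ns since 1 Jan 1601\n",
           (unsigned long long)windows_ticks);
    return 0;
}
```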
System time can be converted into calendar time, which is a form more suitable for human comprehension. For example, the Unix system time 1000000000 seconds since the beginning of the epoch translates into the calendar time 9 September 2001 01:46:40 UT. Library subroutines that handle such conversions may also deal with adjustments for time zones, daylight saving time (DST), leap seconds, and the user's locale settings. Library routines are also generally provided that convert calendar times into system times.
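In C, for example, such library routines are `gmtime()`, which splits a system time into calendar fields, and `strftime()`, which formats them; `mktime()` performs the reverse conversion. A minimal sketch reproducing the conversion above:

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t t = 1000000000;          /* one billion seconds after the epoch */
    struct tm *cal = gmtime(&t);    /* broken-down calendar time in UT */

    char buf[64];
    strftime(buf, sizeof buf, "%d %B %Y %H:%M:%S UT", cal);
    printf("%s\n", buf);            /* prints: 09 September 2001 01:46:40 UT */
    return 0;
}
```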
Many implementations that currently store system times as 32-bit integer values will suffer from the impending Year 2038 problem. These time values will overflow ("run out of bits") after the end of their system time epoch, leading to software and hardware errors. These systems will require some form of remediation, similar to efforts required to solve the earlier Year 2000 problem. This will also be a potentially much larger problem for existing data file formats that contain system timestamps stored as 32-bit values.
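The failure mode can be demonstrated by emulating a 32-bit signed seconds counter on a host whose own `time_t` is wider (a sketch; the exact wrapped value of the out-of-range cast is implementation-defined, but on common two's-complement platforms it is the familiar 1901 date):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* Largest value a signed 32-bit seconds counter can hold. */
    int32_t last = INT32_MAX;               /* 2147483647 */

    time_t ok = (time_t)last;
    printf("Last representable moment: %s", asctime(gmtime(&ok)));
    /* Tue Jan 19 03:14:07 2038 */

    /* One more tick wraps a 32-bit counter around to a large negative
     * number, i.e. a date back in 1901 on two's-complement systems. */
    int32_t wrapped = (int32_t)((uint32_t)last + 1u);
    time_t bad = (time_t)wrapped;
    printf("After overflow:            %s", asctime(gmtime(&bad)));
    /* Fri Dec 13 20:45:52 1901 */
    return 0;
}
```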
Closely related to system time is process time, which is a count of the total CPU time consumed by an executing process. It may be split into user and system CPU time, representing the time spent executing user code and system kernel code, respectively. Process times are a tally of CPU instructions or clock cycles and generally have no direct correlation to wall time.
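The distinction is visible with only standard C and POSIX calls: `clock()` reports accumulated process CPU time (user and system combined on most implementations; separating the two requires a facility such as POSIX `times()` or `getrusage()`), while `time()` reports wall time. In the sketch below, a sleeping process accrues wall time but almost no CPU time:

```c
#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
    clock_t cpu_start  = clock();     /* process CPU time consumed so far */
    time_t  wall_start = time(NULL);  /* wall-clock time */

    sleep(2);                         /* sleeping burns wall time, not CPU */
    for (volatile long i = 0; i < 100000000L; i++)
        ;                             /* busy loop burns CPU time */

    double cpu  = (double)(clock() - cpu_start) / CLOCKS_PER_SEC;
    double wall = difftime(time(NULL), wall_start);

    /* CPU time comes out well below wall time because of the sleep. */
    printf("CPU time:  %.2f s\n", cpu);
    printf("Wall time: %.0f s\n", wall);
    return 0;
}
```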
File systems keep track of the times that files are created, modified, and/or accessed by storing timestamps in the file control block (or inode) of each file and directory.
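On POSIX systems these timestamps can be read with `stat()`; note that the classic inode stores access, modification, and status-change times (`st_atime`, `st_mtime`, `st_ctime`) rather than a true creation time, which only some file systems record. A short sketch, using the hypothetical path `example.txt`:

```c
#include <stdio.h>
#include <time.h>
#include <sys/types.h>
#include <sys/stat.h>

int main(void)
{
    struct stat sb;
    /* "example.txt" is a placeholder path for illustration. */
    if (stat("example.txt", &sb) != 0) {
        perror("stat");
        return 1;
    }
    /* Timestamps kept in the file's inode (file control block). */
    printf("Last access:       %s", ctime(&sb.st_atime));
    printf("Last modification: %s", ctime(&sb.st_mtime));
    printf("Status change:     %s", ctime(&sb.st_ctime));
    return 0;
}
```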
Most first-generation personal computers did not keep track of dates and times. These included systems that ran the CP/M operating system, as well as early models of the Apple II, the BBC Micro, and the Commodore PET, among others. Add-on peripheral boards that included real-time clock chips with on-board battery back-up were available for the IBM PC and XT, but the IBM AT was the first widely available PC that came equipped with date/time hardware built into the motherboard. Prior to the widespread availability of computer networks, most personal computer systems that did track system time did so only with respect to local time and did not make allowances for different time zones.
With current technology, most modern computers keep track of local civil time, as do many other household and personal devices such as VCRs, DVRs, cable TV receivers, PDAs, pagers, cell phones, fax machines, telephone answering machines, cameras, camcorders, central air conditioners, and microwave ovens.
Microcontrollers operating within embedded systems (such as the Raspberry Pi, Arduino, and other similar systems) do not always have internal hardware to keep track of time. Many such controller systems operate without knowledge of the external time. Those that require such information typically initialize their base time upon rebooting by obtaining the current time from an external source, such as from a time server or external clock, or by prompting the user to manually enter the current time.
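One common arrangement, sketched below in C, is to record the externally obtained time once and thereafter derive the current time from a free-running counter. The names are illustrative, not a real API; `read_tick_counter()` stands in for a 1 Hz hardware counter and is simulated here so the example runs:

```c
#include <stdio.h>
#include <time.h>

/* Simulated 1 Hz hardware tick counter. */
static unsigned long simulated_ticks;

static unsigned long read_tick_counter(void)
{
    return simulated_ticks;
}

static time_t        base_time;   /* set once, e.g. from a time server */
static unsigned long base_ticks;  /* tick count when base_time was set */

static void set_base_time(time_t external_time)
{
    base_time  = external_time;   /* from NTP, an RTC chip, or the user */
    base_ticks = read_tick_counter();
}

static time_t current_time(void)
{
    /* Current time = base time + seconds elapsed since it was set. */
    return base_time + (time_t)(read_tick_counter() - base_ticks);
}

int main(void)
{
    set_base_time(1000000000);            /* pretend a server sent this */
    simulated_ticks += 90;                /* 90 seconds pass */
    time_t now = current_time();
    printf("%s", asctime(gmtime(&now))); /* 9 Sep 2001 01:48:10 */
    return 0;
}
```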
The system clock is typically implemented as a programmable interval timer that periodically interrupts the CPU, which then starts executing a timer interrupt service routine. This routine typically adds one tick to the system clock (a simple counter) and handles other periodic housekeeping tasks (preemption, etc.) before returning to the task the CPU was executing before the interruption.
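The same structure can be mimicked in user space on a POSIX system: in the sketch below, `setitimer()` plays the programmable interval timer and a `SIGALRM` handler plays the interrupt service routine, doing nothing but incrementing a tick counter before control returns to the interrupted code:

```c
#include <stdio.h>
#include <string.h>
#include <signal.h>
#include <unistd.h>
#include <sys/time.h>

static volatile sig_atomic_t ticks;  /* the "system clock" counter */

static void timer_isr(int signum)
{
    (void)signum;
    ticks++;                         /* add one tick, then return to the task */
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_handler = timer_isr;
    sigaction(SIGALRM, &sa, NULL);

    /* Program a 10 ms period, i.e. a 100 Hz tick rate. */
    struct itimerval tv;
    tv.it_interval.tv_sec  = 0;
    tv.it_interval.tv_usec = 10000;
    tv.it_value = tv.it_interval;
    setitimer(ITIMER_REAL, &tv, NULL);

    while (ticks < 100)              /* the interrupted "task" */
        pause();

    printf("%d ticks counted (~1 s at 100 Hz)\n", (int)ticks);
    return 0;
}
```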
The following tables illustrate methods for retrieving the system time in various operating systems, programming languages, and applications. Values marked by (*) are system-dependent and may differ across implementations. All dates are given as Gregorian or proleptic Gregorian calendar dates.
The resolution of an implementation's measurement of time does not imply the same precision of such measurements. For example, a system might return the current time as a value measured in microseconds, but actually be capable of discerning individual clock ticks with a frequency of only 100 Hz (10 ms).
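On POSIX systems this difference can be probed directly: `clock_getres()` reports the claimed resolution, while spinning on `clock_gettime()` until the reading changes reveals the granularity actually delivered (a sketch; results vary by kernel and hardware):

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec res;
    clock_getres(CLOCK_REALTIME, &res);
    long claimed = res.tv_sec * 1000000000L + res.tv_nsec;
    printf("Reported resolution: %ld ns\n", claimed);

    /* Spin until the clock value changes and report the step size. */
    struct timespec a, b;
    clock_gettime(CLOCK_REALTIME, &a);
    do {
        clock_gettime(CLOCK_REALTIME, &b);
    } while (a.tv_sec == b.tv_sec && a.tv_nsec == b.tv_nsec);

    long step = (b.tv_sec - a.tv_sec) * 1000000000L
              + (b.tv_nsec - a.tv_nsec);
    printf("Observed step:       %ld ns\n", step);
    return 0;
}
```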
| Operating system | Command or function | Resolution | Epoch or range |
|---|---|---|---|
| Android | `java.lang.System.currentTimeMillis()` | 1 ms | 1 January 1970 |
| BIOS (IBM PC) | `INT 1Ah AH=00h`[1] | 54.9254 ms (18.2065 Hz) | Midnight of the current day |
|  | `INT 1Ah AH=02h`[2] | 1 s | Midnight of the current day |
|  | `INT 1Ah AH=04h`[3] | 1 day | 1 January 1980 to 31 December 1999 or 31 December 2079 (system dependent) |
| CP/M Plus | System Control Block:[4] Days since 31 December 1977, Hour (BCD), Minute (BCD), Second (BCD) | 1 s | 31 December 1977 to 5 June 2157 |
|  | BDOS function 105 (`T_GET`):[5] Days since 1 January 1978, Hour (BCD), Minute (BCD), Second (BCD) |  |  |
| DOS (Microsoft) | `C:\> DATE`[6], `C:\> TIME`[7] | 10 ms | 1 January 1980 to 31 December 2099 |
| iOS (Apple) | `CFAbsoluteTimeGetCurrent()`[8] | < 1 ms | 1 January 2001 ±10,000 years |
| macOS | `CFAbsoluteTimeGetCurrent()`[9] | < 1 ms[10][11] | 1 January 2001 ±10,000 years |
| OpenVMS | `SYS$GETTIM()` | 100 ns[12] | 17 November 1858 to 31 July 31,086[13] |
|  | `gettimeofday()` | 1 μs[14] | 1 January 1970 to 7 February 2106[15] |
|  | `clock_gettime()` | 1 ns |  |
| z/OS | `STCK`[16] | 2⁻¹² μs (244.14 ps) | 1 January 1900 to 17 September 2042 UT[17] |
|  | `STCKE` |  | 1 January 1900 to AD 36,765[18] |
| Unix, POSIX (see also C date and time functions) | `time()` | 1 s | (*) 1 January 1970 to 19 January 2038 (prior to Linux 5.9) or to 2 July 2486 (since Linux 5.10); 1 January 1970 to 4 December AD 292,277,026,596 (64-bit) |
|  | `gettimeofday()` | 1 μs |  |
|  | `clock_gettime()` | 1 ns |  |
| OS/2 | `DosGetDateTime()` | 10 ms | 1 January 1980 to 31 December 2079[19] |
| Windows | `GetSystemTime()` | 1 ms | 1 January 1601 to 14 September 30828, 02:48:05.4775807 |
|  | `GetSystemTimeAsFileTime()` | 100 ns |  |
| Language/Application | Function or variable | Resolution | Epoch or range |
|---|---|---|---|
| Ada | `Ada.Calendar.Clock` | 100 μs to 20 ms (*) | 1 January 1901 to 31 December 2099 (*) |
| AWK | `systime()` | 1 s | (*) |
| BASIC, True BASIC | `DATE`, `DATE$`, `TIME`, `TIME$` | 1 s | (*) |
| Business BASIC | `DAY`, `TIM` | 0.1 s | (*) |
| C (see C date and time functions) | `time()` | 1 s (*)[20] | (*) |
| C++ | `std::time()`, `std::chrono::system_clock::now()` | 1 s (*), 1 ns (C++11, OS dependent) | (*) |
| C# | `System.DateTime.Now`[21], `System.DateTime.UtcNow`[22] | 100 ns[23] | 1 January 0001 to 31 December 9999 |
| CICS | `ASKTIME` | 1 ms | 1 January 1900 |
| COBOL | `FUNCTION CURRENT-DATE` | 1 s | 1 January 1601 |
| Common Lisp | `(get-universal-time)` | 1 s | 1 January 1900 |
| Delphi (Borland) | `date`, `time` | 1 ms (floating point) | 1 January 1900 |
| Delphi (Embarcadero Technologies)[24] | `System.SysUtils.Time()`[25] | 1 ms | 0/0/0000 0:0:0:000 to 12/31/9999 23:59:59:999 [sic] |
|  | `System.SysUtils.GetTime()`[26] (alias for `Time()`) |  |  |
|  | `System.SysUtils.Date()`[27] |  | 0/0/0000 0:0:0:000 to 12/31/9999 0:0:0:000 [sic] |
|  | `System.SysUtils.Now()`[28] |  |  |
|  | `System.SysUtils.Tomorrow()`[29] |  |  |
|  | `System.SysUtils.Yesterday()`[30] |  |  |
|  | `System.SysUtils.DateTimeToUnix()`[31] | 1 s | 0/0/0000 0:0:0:000 to 12/31/9999 23:59:59:000 [sic] |
|  | `System.SysUtils.DayOfWeek()`[32] | 1 day | 1 to 7 |
|  | `System.SysUtils.CurrentYear()`[33] | 1 year | (*) |
| Emacs Lisp | `(current-time)` | 1 μs (*) | 1 January 1970 |
| Erlang | `erlang:system_time()`, `os:system_time()`[34] | OS dependent, e.g. 1 ns on Linux | 1 January 1970 |
| Excel | `date()` | ? | 0 January 1900[35] |
| Fortran | `SYSTEM_CLOCK` | (*)[36][37] | 1 January 1970 |
|  | `DATE_AND_TIME` | 1 μs |  |
| Go | `time.Now()` | 1 ns | 1 January 0001 |
| Haskell | `Time.getClockTime` | 1 ps (*) | 1 January 1970 (*) |
|  | `Data.Time.getCurrentTime` | 1 ps (*) | 17 November 1858 (*) |
| Java | `java.lang.System.currentTimeMillis()` | 1 ms | 1 January 1970 |
|  | `java.lang.System.nanoTime()`[38] | 1 ns | arbitrary |
|  | [39] | 1 ns | arbitrary[40] |
| JavaScript, TypeScript | `Date.now()` | 1 ms | 1 January 1970 |
| Matlab | `now` | 1 s | 0 January 0000[41] |
| MUMPS | `$H` (short for `$HOROLOG`) | 1 s | 31 December 1840 |
| LabVIEW | `Tick Count` | 1 ms | 00:00:00.000 1 January 1904 |
|  | `Get Date/Time In Seconds` | 1 ms | 00:00:00.000 1 January 1904 |
| Objective-C | `[NSDate timeIntervalSinceReferenceDate]`[42] | < 1 ms | 1 January 2001 ±10,000 years |
| OCaml | `Unix.time()` | 1 s | 1 January 1970 |
|  | `Unix.gettimeofday()` | 1 μs |  |
| Extended Pascal | `GetTimeStamp(ts)` | 1 s | (*) |
| Turbo Pascal | `GetTime(h,m,s,c)`, `GetDate(y,m,d,dow)` | 10 ms | (*) |
| Perl | `time()` | 1 s | 1 January 1970 |
|  | `Time::HiRes::time`[43] | 1 μs |  |
| PHP | `time()`, `mktime()` | 1 s | 1 January 1970 |
|  | `microtime()` | 1 μs |  |
| PureBasic | `Date()` | 1 s | 1 January 1970 to 19 January 2038 |
| Python | `time.time()` | 1 μs (*) | 1 January 1970 |
| RPG | `CURRENT(DATE)`, `%DATE`, `%TIME` | 1 s | 1 January 0001 to 31 December 9999 |
|  | `CURRENT(TIMESTAMP)`, `%TIMESTAMP` | 1 μs |  |
| Ruby | `Time.now`[44] | 1 μs (*) | 1 January 1970 (to 19 January 2038 prior to Ruby 1.9.2[45]) |
| Scheme | `(current-time)`[46] | 1 s | 1 January 1900 |
| Smalltalk | `Time now`, `Time microsecondClock`, `Time totalSeconds` | 1 s, 1 μs, 1 s (respectively) | 1 January 1901 (*) |
| SQL | `CURRENT_TIMESTAMP` or `GETDATE()` or `NOW()` or `SYSDATE` or `TODAY` | 3 ms | 1 January 1753 to 31 December 9999 (*) |
|  |  | 60 s | 1 January 1900 to 6 June 2079 |
| Standard ML | `Time.now()` | 1 μs (*) | 1 January 1970 (*) |
| TCL | `[clock seconds]` | 1 s | 1 January 1970 |
|  | `[clock milliseconds]` | 1 ms |  |
|  | `[clock microseconds]` | 1 μs |  |
|  | `[clock clicks]` | 1 μs (*) | (*) |
| Windows PowerShell | `Get-Date`[47], `[DateTime]::Now`[48] | 100 ns | 1 January 0001 to 31 December 9999 |
| Visual Basic .NET | `System.DateTime.Now` | 100 ns | 1 January 0001 to 31 December 9999 |