The Unix epoch is the moment at which Unix time begins: 00:00:00 UTC on Thursday, 1 January 1970. An epoch timestamp is the number of seconds (or milliseconds) elapsed since that moment. This converter translates between epoch integers and human-readable dates in both directions, supporting second and millisecond precision.
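As a minimal sketch of both directions in JavaScript (the example values are illustrative, not output from this tool):

```
// Epoch seconds -> human-readable UTC string
const seconds = 1700000000;
const asDate = new Date(seconds * 1000);   // the Date constructor takes milliseconds
console.log(asDate.toISOString());         // "2023-11-14T22:13:20.000Z"

// Human-readable UTC date -> epoch seconds
const backToSeconds = Math.floor(Date.parse("2023-11-14T22:13:20Z") / 1000);
console.log(backToSeconds);                // 1700000000
```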
A 10-digit number like 1700000000 is a second-precision Unix timestamp. A 13-digit number like 1700000000000 is millisecond-precision — common in JavaScript (Date.now()), Java (System.currentTimeMillis()), and many REST APIs. Microsecond timestamps (16 digits) appear in high-performance logging and profiling tools. The tool auto-detects the precision from the digit count.
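One way to implement that detection — a heuristic sketch, not necessarily the tool's exact logic — is a simple digit-count check:

```
// Guess the precision of a numeric timestamp from its digit count.
// Heuristic thresholds: ~10 digits = seconds, ~13 = milliseconds, ~16 = microseconds.
function detectPrecision(ts) {
  const digits = String(Math.trunc(Math.abs(ts))).length;
  if (digits >= 16) return "microseconds";
  if (digits >= 13) return "milliseconds";
  return "seconds";
}

console.log(detectPrecision(1700000000));        // "seconds"
console.log(detectPrecision(1700000000000));     // "milliseconds"
console.log(detectPrecision(1700000000000000));  // "microseconds"
```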
Systems that store Unix time as a 32-bit signed integer overflow at 2,147,483,647 seconds, which corresponds to 03:14:07 UTC on 19 January 2038 (the Year 2038 problem). JavaScript represents numbers as 64-bit floating point, so this tool correctly handles dates well beyond 2038.
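You can see the boundary directly in JavaScript:

```
// The largest value a 32-bit signed integer can hold, read as epoch seconds.
const INT32_MAX = 2147483647;
console.log(new Date(INT32_MAX * 1000).toISOString());
// "2038-01-19T03:14:07.000Z" — the Year 2038 rollover point

// JavaScript stores the millisecond value as a 64-bit float,
// so timestamps past that point still convert correctly.
console.log(new Date((INT32_MAX + 1) * 1000).toISOString());
// "2038-01-19T03:14:08.000Z"
```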
UTC (Coordinated Universal Time) is the primary time standard. Unix timestamps are always in UTC — they count seconds from the 1970-01-01 00:00:00 UTC moment regardless of your local timezone. Your local date/time is derived from UTC by adding the timezone offset.
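For example, the same epoch value renders differently in UTC and in your local timezone, while the underlying number never changes:

```
const ts = 1700000000;            // seconds, always counted from the UTC epoch
const d = new Date(ts * 1000);

console.log(d.toISOString());        // UTC: "2023-11-14T22:13:20.000Z"
console.log(d.toString());           // local rendering, e.g. "... GMT+0100 ..."
console.log(d.getTimezoneOffset());  // minutes between UTC and local time
```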
To get the current Unix timestamp — JavaScript: Date.now() (milliseconds) or Math.floor(Date.now() / 1000) (seconds). Python: import time; int(time.time()) (time.time() returns fractional seconds as a float). Unix shell: date +%s. PHP: time().
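Put together, the JavaScript variants are:

```
// Current Unix time in the two common precisions.
const ms  = Date.now();             // milliseconds since the epoch
const sec = Math.floor(ms / 1000);  // whole seconds since the epoch
console.log(ms, sec);
```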
An epoch is a reference point — the moment from which a clock starts counting. The Unix epoch (1970-01-01 00:00:00 UTC) is the most common, but other systems use different epochs: Windows FILETIME starts from 1601-01-01; NTP from 1900-01-01; GPS from 1980-01-06.
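As an illustration, converting a Windows FILETIME (100-nanosecond ticks since 1601-01-01 UTC) to Unix seconds only requires rescaling the ticks and subtracting the offset between the two epochs; filetimeToUnixSeconds below is a hypothetical helper name, and 11644473600 is the number of seconds between 1601-01-01 and 1970-01-01:

```
// Sketch: convert a Windows FILETIME value to Unix seconds.
const EPOCH_DIFF_SECONDS = 11644473600n;   // seconds between the 1601 and 1970 epochs

function filetimeToUnixSeconds(filetime) {
  // BigInt avoids precision loss on the 100-ns tick count.
  return Number(filetime / 10000000n - EPOCH_DIFF_SECONDS);
}

// A FILETIME of 0 is 1601-01-01, i.e. -11644473600 in Unix time.
console.log(filetimeToUnixSeconds(0n)); // -11644473600
```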
See also the Unix Timestamp Converter, Date Calculator, and the World Clock.