About Unix time
Unix time counts the seconds elapsed since 1970-01-01 00:00:00 UTC, the Unix epoch. Modern systems usually store it as a 64-bit signed integer, which covers dates up to the year 292,277,026,596. JavaScript and many web APIs use milliseconds instead.
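For example, in JavaScript/TypeScript the clock natively returns milliseconds, and you divide by 1000 to get classic Unix seconds (a minimal sketch):

```ts
// Current Unix time in milliseconds (JavaScript's native unit).
const ms: number = Date.now();

// Convert to whole seconds, the classic Unix-time unit.
const seconds: number = Math.floor(ms / 1000);

console.log(seconds, ms); // e.g. 1700000000 1700000000000
```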
The converter auto-detects the unit: 10 digits or fewer = seconds, 13 digits = milliseconds, 16 digits = microseconds, 19 digits = nanoseconds.
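A sketch of how such digit-count detection might work. The function name is illustrative, and the handling of in-between lengths (11-12 digits, etc.) is an assumption, not the converter's actual code:

```ts
type Unit = "seconds" | "milliseconds" | "microseconds" | "nanoseconds";

// Hypothetical helper: guess the unit of a timestamp from its digit count.
// Assumption: in-between lengths round up to the next unit.
function detectUnit(input: string): Unit {
  const digits = input.replace(/^-/, "").length;
  if (digits <= 10) return "seconds";
  if (digits <= 13) return "milliseconds";
  if (digits <= 16) return "microseconds";
  return "nanoseconds";
}

console.log(detectUnit("1700000000"));    // "seconds" (10 digits)
console.log(detectUnit("1700000000000")); // "milliseconds" (13 digits)
```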
Common timestamps
- 0 — 1970-01-01 00:00:00 UTC (the epoch)
- 1000000000 — 2001-09-09 01:46:40 UTC (1 billion seconds)
- 1234567890 — 2009-02-13 23:31:30 UTC (the digits 1 through 9 in sequence, celebrated online as "1234567890 day")
- 2147483647 — 2038-01-19 03:14:07 UTC (32-bit signed overflow — the "Year 2038 problem")
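You can check any of these values with JavaScript's built-in Date, remembering that it expects milliseconds:

```ts
// Multiply by 1000 because Date expects milliseconds, not seconds.
for (const ts of [0, 1_000_000_000, 1_234_567_890, 2_147_483_647]) {
  console.log(ts, "->", new Date(ts * 1000).toISOString());
}
// 0 -> 1970-01-01T00:00:00.000Z
// 1000000000 -> 2001-09-09T01:46:40.000Z
// 1234567890 -> 2009-02-13T23:31:30.000Z
// 2147483647 -> 2038-01-19T03:14:07.000Z
```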
FAQ
Are seconds or milliseconds standard?
Both are widely used. Linux, MySQL, PHP, and most APIs default to seconds. JavaScript (Date.now(), the Date constructor) works in milliseconds. The auto-detect picks the right unit by digit count.
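The practical consequence: when an API hands you a seconds timestamp, multiply by 1000 before passing it to JavaScript's Date (a minimal sketch; apiTimestamp is a made-up example value):

```ts
// A seconds-based timestamp, as most server-side APIs return it.
const apiTimestamp = 1234567890; // hypothetical example value

// Wrong: Date treats the argument as milliseconds,
// so this lands in January 1970.
const wrong = new Date(apiTimestamp);

// Right: scale seconds up to milliseconds first.
const right = new Date(apiTimestamp * 1000);

console.log(wrong.toISOString()); // 1970-01-15T06:56:07.890Z
console.log(right.toISOString()); // 2009-02-13T23:31:30.000Z
```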