Epoch & Unix Timestamp Converter
Convert Unix timestamps to human-readable dates. Auto-detects seconds vs milliseconds. Outputs ISO 8601, UTC, local, SQL datetime, and more.
What is a Unix timestamp?
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds elapsed since January 1, 1970 00:00:00 UTC — the Unix epoch. It's timezone-independent, monotonically increasing, and trivially sortable, which makes it the de facto standard for storing and transmitting time in APIs and databases.
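As a quick illustration, here is a minimal Python sketch (standard library only) that turns a raw epoch value into a UTC datetime:

```python
from datetime import datetime, timezone

# 1,700,000,000 seconds after the Unix epoch (1970-01-01T00:00:00Z)
ts = 1_700_000_000
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00
```

Passing an explicit `tz=timezone.utc` keeps the result timezone-aware rather than silently using the machine's local zone.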
Seconds vs milliseconds
The single most common source of timestamp bugs is a unit mismatch. JavaScript's Date.now() returns milliseconds, while most Unix system calls return seconds. Quick rules:
- 10 digits → seconds (e.g., 1700000000)
- 13 digits → milliseconds (e.g., 1700000000000)
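The digit-count rule above translates directly into a magnitude check. A minimal Python sketch (the threshold of 10^12 is the assumption here, matching the 13-digit heuristic):

```python
from datetime import datetime, timezone

def detect_and_convert(ts: int) -> datetime:
    """Heuristic: values with 13+ digits are milliseconds, 10-digit values are seconds."""
    if ts >= 1_000_000_000_000:  # 13 or more digits -> treat as milliseconds
        ts = ts / 1000
    return datetime.fromtimestamp(ts, tz=timezone.utc)

print(detect_and_convert(1_700_000_000))      # seconds input
print(detect_and_convert(1_700_000_000_000))  # milliseconds input, same instant
```

Both calls resolve to the same instant, which is exactly the auto-detection behavior described above.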
Common timestamp mistakes
Passing ms to a seconds API: Your server returns 1700000000000 and you store it, then later use it as seconds — you now have a date in the year 55000+.
Ignoring timezone offsets: Storing a UTC timestamp is correct, but displaying it without converting to the user's local time leads to off-by-hours bugs.
32-bit overflow (Y2K38): Signed 32-bit integers can only store timestamps up to 2³¹ − 1 seconds, which is 03:14:07 UTC on January 19, 2038. Systems using a 32-bit time_t will overflow at that moment. Use 64-bit time values.
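Two of these mistakes can be demonstrated in a few lines of Python. Note that Python's `datetime` tops out at year 9999, so the ms-as-seconds value can't even be constructed; rough year arithmetic (using the Julian-year length of 31,557,600 seconds, an approximation chosen here for illustration) shows how far off it lands:

```python
from datetime import datetime, timezone

ms = 1_700_000_000_000  # a millisecond timestamp

# Mistake: treating it as seconds. datetime.fromtimestamp would raise
# (year out of range), so approximate the year arithmetically instead:
print(1970 + ms // 31_557_600)  # roughly year 55,839

# Y2K38: the largest value a signed 32-bit time_t can hold
y2k38 = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(y2k38)  # 2038-01-19 03:14:07+00:00
```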
Frequently Asked Questions
What is a Unix timestamp?
A Unix timestamp (also called epoch time) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC. It's a timezone-independent way to represent a point in time, widely used in APIs, databases, and system programming.
How do I tell if a timestamp is in seconds or milliseconds?
A 10-digit number is almost always seconds (covers years 2001–2286). A 13-digit number is almost always milliseconds (covers years 2001–2286 at millisecond precision). This converter auto-detects based on magnitude, but you can override with the unit toggle.
Why does my converted date look wrong?
If you're passing milliseconds to a function that expects seconds (or vice versa), the result will be wildly wrong — often near epoch zero (Jan 1, 1970) or far in the future. Always check the digit count: 10 digits = seconds, 13 digits = milliseconds.
What is ISO 8601?
ISO 8601 is the international standard for date and time representation. It looks like 2023-11-14T22:13:20.000Z — always UTC when ending in Z. It's the safest format for storing and transferring timestamps between systems because it's unambiguous and timezone-explicit.
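A minimal Python sketch of producing and round-tripping that exact format (the `.replace("+00:00", "Z")` step is one simple way to get the Z suffix from the standard library):

```python
from datetime import datetime

dt = datetime.fromtimestamp(1_700_000_000, tz=__import__("datetime").timezone.utc)
iso = dt.isoformat(timespec="milliseconds").replace("+00:00", "Z")
print(iso)  # 2023-11-14T22:13:20.000Z

# Round-trip back to an epoch value
back = datetime.fromisoformat("2023-11-14T22:13:20.000+00:00")
print(int(back.timestamp()))  # 1700000000
```

(`datetime.fromisoformat` also accepts the trailing Z directly in Python 3.11 and later; the explicit offset form above works on older versions too.)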
How do I get the current Unix timestamp in code?
JavaScript: Math.floor(Date.now() / 1000). Python: import time; int(time.time()). C#: DateTimeOffset.UtcNow.ToUnixTimeSeconds(). Go: time.Now().Unix(). The code snippets section below shows these with full context.
Can Unix timestamps be negative?
Yes. Negative timestamps represent dates before January 1, 1970. For example, -86400 represents December 31, 1969 00:00:00 UTC. Not all systems support negative timestamps — notably 32-bit systems may not.
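The -86400 example above can be checked with a couple of lines of Python (using an explicit UTC zone avoids platform-specific localtime quirks with negative values):

```python
from datetime import datetime, timezone

# -86400 seconds = exactly one day before the epoch
dt = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(dt.isoformat())  # 1969-12-31T00:00:00+00:00
```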