What Is a Unix Timestamp? Everything Developers Need to Know
Unix timestamps explained: the epoch, UTC vs local time, milliseconds vs seconds, the Year 2038 problem, and working with timestamps in code.
ToolNest Team
December 10, 2025
What Is a Unix Timestamp?
A Unix timestamp (also called Unix time, POSIX time, or epoch time) is a system for representing a point in time as a single number: the number of seconds that have elapsed since the Unix epoch, January 1, 1970, at 00:00:00 UTC.
For example, the timestamp 1700000000 represents November 14, 2023 at 22:13:20 UTC.
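You can verify this conversion yourself; here is a minimal sketch using only Python's standard library:

```python
from datetime import datetime, timezone

# Interpret the integer as seconds since the Unix epoch, in UTC
dt = datetime.fromtimestamp(1700000000, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00
```

Passing an explicit `tz=` keeps the result unambiguous; without it, `fromtimestamp` would return a naive datetime in the machine's local timezone.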
Why January 1, 1970?
When Unix was developed in the late 1960s and early 1970s at Bell Labs, the creators needed a simple, universal reference point for time. January 1, 1970 was chosen somewhat arbitrarily โ it was a recent round date. There's nothing mathematically special about it; it's just a convention that the entire tech industry has adopted.
Why Use Timestamps Instead of Date Strings?
Timestamps have significant advantages for software systems:
- No timezone ambiguity – "March 15 3:00 PM" is ambiguous without a timezone. 1710500400 has exactly one meaning regardless of where the server or client is located.
- Easy math – adding and subtracting seconds is trivial. "What time is it 7 days from now?" = now + 7 * 24 * 60 * 60.
- Compact storage – a 32-bit or 64-bit integer is smaller than a datetime string.
- Database indexing – integers index faster than string-based dates.
- Universal – works across languages and systems without parsing format strings.
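The "easy math" point above can be sketched in a few lines of Python; the variable names are illustrative:

```python
import time

SECONDS_PER_DAY = 24 * 60 * 60

now = int(time.time())                    # current Unix timestamp, in seconds
one_week_later = now + 7 * SECONDS_PER_DAY

# The difference is plain integer arithmetic: no calendar logic needed
print(one_week_later - now)  # 604800
```

Compare this with date-string manipulation, which would require parsing, calendar arithmetic, and re-serialization for the same result.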
Seconds vs Milliseconds
This is a common source of bugs. Two conventions exist:
- Unix timestamp (seconds): 1700000000 – used by Unix, POSIX, Python's time.time(), many databases
- JavaScript timestamp (milliseconds): 1700000000000 – used by JavaScript's Date.now() and most browser/Node.js APIs
Always check which one an API expects. If your timestamp has 13 digits, it's likely milliseconds. If it has 10 digits, it's likely seconds.
JavaScript:
// JavaScript uses milliseconds
Date.now() // e.g., 1700000000000
new Date().getTime() // same, milliseconds
Math.floor(Date.now() / 1000) // convert to seconds
Python:
import time
time.time() # seconds with decimal (e.g., 1700000000.123)
int(time.time()) # integer seconds
int(time.time() * 1000) # convert to integer milliseconds
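The 10-vs-13-digit heuristic can be wrapped in a small helper. This is a sketch, and `to_seconds` is a hypothetical name, not a standard library function:

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp to seconds, guessing by magnitude.

    Values of 1_000_000_000_000 or more (13+ digits) are assumed to be
    milliseconds. The guess only holds for dates after 2001, when
    second-based timestamps crossed into 10 digits.
    """
    if ts >= 1_000_000_000_000:   # 13+ digits: almost certainly milliseconds
        return ts // 1000
    return ts

print(to_seconds(1700000000000))  # 1700000000 (was milliseconds)
print(to_seconds(1700000000))     # 1700000000 (already seconds)
```

A heuristic like this is a last resort for untrusted input; when you control both ends of an API, document the unit explicitly instead.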
UTC vs Local Time
Unix timestamps always represent UTC (Coordinated Universal Time). This is critical: the same timestamp means the same moment in time everywhere on Earth.
When you convert a timestamp to a human-readable date, the result depends on the timezone:
- 1700000000 – "2023-11-14 22:13:20 UTC"
- 1700000000 – "2023-11-14 14:13:20 PST (UTC-8)"
- 1700000000 – "2023-11-15 03:43:20 IST (UTC+5:30)"
Always work in UTC internally and convert to local time only for display.
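In Python, that "UTC internally, local only for display" rule might look like the following sketch, using the standard-library zoneinfo module; the America/Los_Angeles zone is just an example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1700000000  # stored and transmitted as UTC seconds

# Internal representation: always UTC
utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)

# Display layer only: convert to the viewer's timezone
local_dt = utc_dt.astimezone(ZoneInfo("America/Los_Angeles"))

print(utc_dt.isoformat())    # 2023-11-14T22:13:20+00:00
print(local_dt.isoformat())  # 2023-11-14T14:13:20-08:00
```

Note that `astimezone` changes only the representation; `utc_dt` and `local_dt` identify the same instant and compare equal.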
The Year 2038 Problem
Early Unix systems stored timestamps as a 32-bit signed integer, which can represent values from -2,147,483,648 to 2,147,483,647.
The maximum 32-bit timestamp is 2147483647, which corresponds to January 19, 2038 at 03:14:07 UTC. After this point, a 32-bit timestamp overflows and wraps around to a large negative number, representing December 13, 1901.
This is the "Year 2038 Problem" (Y2K38), analogous to Y2K.
The solution is to use 64-bit integers for timestamps, which can represent dates billions of years in the future. Most modern systems already use 64-bit timestamps. However, embedded systems and older software running on 32-bit architectures may still be vulnerable. Linux kernel 5.6 (released in 2020) added 64-bit time support on 32-bit architectures.
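You can simulate the wraparound in Python by masking a value to 32 bits and reinterpreting the sign bit. This is a sketch; Python's own integers are arbitrary-precision and never overflow:

```python
from datetime import datetime, timezone

def as_int32(n: int) -> int:
    """Reinterpret an integer as a 32-bit signed value (two's complement)."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

max_ts = 2147483647                # January 19, 2038 at 03:14:07 UTC
wrapped = as_int32(max_ts + 1)     # one second later, in 32-bit arithmetic

print(wrapped)                                            # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))   # 1901-12-13 20:45:52+00:00
```

The same one-second step that a 64-bit counter absorbs silently sends a 32-bit counter back to 1901, which is exactly the failure mode Y2K38 mitigation work targets.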
Working with Timestamps
JavaScript:
// Current timestamp
const nowMs = Date.now(); // milliseconds
const nowSec = Math.floor(nowMs / 1000); // seconds
// Convert timestamp to Date object
const date = new Date(1700000000 * 1000); // multiply by 1000 if the timestamp is in seconds
// Format for display
date.toISOString(); // "2023-11-14T22:13:20.000Z"
date.toLocaleDateString(); // locale-specific format
Python:
import datetime
import time
# Current timestamp
ts = int(time.time())
# Convert to datetime
dt = datetime.datetime.fromtimestamp(ts, tz=datetime.timezone.utc)
# From datetime to timestamp
ts = int(dt.timestamp())
SQL (PostgreSQL):
-- Current timestamp
SELECT EXTRACT(EPOCH FROM NOW())::BIGINT;
-- Convert to timestamp
SELECT TO_TIMESTAMP(1700000000);
Common Timezone Pitfalls
- Never use local time in databases โ Store UTC timestamps, convert to local time in the application layer
- DST transitions โ Daylight Saving Time means local time can go backward (November) or skip forward (March). Timestamps handle this transparently since they're always UTC.
- Server timezone โ If your server is in a different timezone than your users, local time calculations on the server will be wrong. Always use UTC.
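The DST point can be seen concretely: in the Python sketch below, two timestamps one hour apart straddle the November 2023 "fall back" transition, so the local clock reads 1:30 twice while the timestamps themselves stay unambiguous (the America/New_York zone is just an example):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")

# On 2023-11-05, New York clocks fell back from 02:00 EDT to 01:00 EST
# at 06:00 UTC. These two instants are exactly one hour apart.
before = datetime.fromtimestamp(1699162200, tz=timezone.utc)  # 05:30 UTC
after = datetime.fromtimestamp(1699165800, tz=timezone.utc)   # 06:30 UTC

print(before.astimezone(ny).isoformat())  # 2023-11-05T01:30:00-04:00 (EDT)
print(after.astimezone(ny).isoformat())   # 2023-11-05T01:30:00-05:00 (EST)
```

Both local times read "1:30", so a local-time string cannot tell them apart, but the underlying timestamps differ by exactly 3600 seconds.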
Use our free Timestamp Converter to convert between Unix timestamps and human-readable dates instantly.