Millisecond Timestamps Explained
A Unix timestamp counts time since January 1, 1970 00:00:00 UTC. The original definition uses seconds, but many modern systems use milliseconds instead. If you have ever seen a 13-digit number where you expected 10, or gotten a date in January 1970 when you expected today, you have encountered this distinction firsthand.
Why Millisecond Timestamps Exist
Second precision is sufficient for many use cases -- log rotation, cron jobs, certificate expiry. But some operations require finer granularity:
- Event ordering -- In a distributed system processing thousands of events per second, two events can share the same second-level timestamp. Milliseconds reduce (but do not eliminate) the chance of collisions.
- Performance measurement -- Measuring API response times or page load speed requires sub-second precision. A 150ms response and a 950ms response both round to the same 1-second bucket.
- Animation and UI -- Browser rendering operates on 16ms frames (60 FPS). Second-level timestamps are useless for tracking frame timing.
- Financial systems -- Trade execution timestamps often need millisecond or microsecond precision for audit trails and ordering.
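The bucketing problem behind the ordering and measurement points above can be sketched in a few lines (the timestamps are hypothetical):

```javascript
// Two responses finishing 150 ms and 950 ms after the same request start.
// At second precision they collapse into one bucket; at millisecond
// precision they remain distinct and ordered.
const t0 = 1711545600000;      // hypothetical request start, in milliseconds
const fast = t0 + 150;         // 150 ms response
const slow = t0 + 950;         // 950 ms response

const toSecondBucket = (ms) => Math.floor(ms / 1000);

console.log(toSecondBucket(fast) === toSecondBucket(slow)); // true: same second
console.log(slow - fast);                                   // 800: distinguishable in ms
```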
How to Tell Seconds from Milliseconds
The simplest rule: count the digits.
| Unit | Digits | Current-era example | Range |
|---|---|---|---|
| Seconds | 10 | 1711545600 | 2001 -- 2286 |
| Milliseconds | 13 | 1711545600000 | 2001 -- 2286 |
| Microseconds | 16 | 1711545600000000 | 2001 -- 2286 |
| Nanoseconds | 19 | 1711545600000000000 | 2001 -- 2286 |
If the number has 10 digits and starts with 1, it is almost certainly seconds in the current era (roughly 2001 to 2033). If it has 13 digits, it is milliseconds. Beyond 13, you are looking at microseconds (16 digits) or nanoseconds (19 digits).
A programmatic check:
```javascript
function detectUnit(ts) {
  if (ts < 1e9) return "seconds (9 digits or fewer - pre-2001)";
  if (ts < 1e12) return "seconds";
  if (ts < 1e15) return "milliseconds";
  if (ts < 1e18) return "microseconds";
  return "nanoseconds";
}
```
Converting Between Units
Conversion is straightforward multiplication or division:
| From | To | Operation |
|---|---|---|
| Seconds | Milliseconds | multiply by 1,000 |
| Seconds | Microseconds | multiply by 1,000,000 |
| Seconds | Nanoseconds | multiply by 1,000,000,000 |
| Milliseconds | Seconds | divide by 1,000 |
| Microseconds | Seconds | divide by 1,000,000 |
| Nanoseconds | Seconds | divide by 1,000,000,000 |
In Code
```python
# Python
seconds = 1711545600
ms = seconds * 1000           # 1711545600000
us = seconds * 1_000_000      # 1711545600000000
ns = seconds * 1_000_000_000  # 1711545600000000000

# Back to seconds
seconds = ms // 1000
```
```javascript
// JavaScript
const seconds = 1711545600;
const ms = seconds * 1000;
const us = seconds * 1000000;
const ns = BigInt(seconds) * 1000000000n; // use BigInt for nanoseconds

// Back to seconds
const sec = Math.floor(ms / 1000);
```
Note on JavaScript and large numbers: Nanosecond timestamps exceed Number.MAX_SAFE_INTEGER (9,007,199,254,740,991). Use BigInt for nanosecond arithmetic to avoid precision loss.
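A quick way to see that precision loss: two nanosecond values one tick apart become indistinguishable once coerced to Number, while BigInt keeps them distinct.

```javascript
// Adjacent nanosecond timestamps collide when coerced to Number,
// because they sit far beyond Number.MAX_SAFE_INTEGER.
const a = 1711545600000000001n;
const b = 1711545600000000002n;

console.log(Number(a) === Number(b)); // true: distinct values collapse
console.log(a === b);                 // false: BigInt preserves every digit
```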
Which Systems Use Which Unit
| System / API | Unit | Example |
|---|---|---|
| JavaScript Date.now() | milliseconds | 1711545600000 |
| Java System.currentTimeMillis() | milliseconds | 1711545600000 |
| Java Instant.now().toEpochMilli() | milliseconds | 1711545600000 |
| Python time.time() | seconds (float) | 1711545600.123 |
| Ruby Time.now.to_i | seconds | 1711545600 |
| Go time.Now().Unix() | seconds | 1711545600 |
| Go time.Now().UnixMilli() | milliseconds | 1711545600000 |
| Go time.Now().UnixNano() | nanoseconds | 1711545600000000000 |
| Unix date +%s | seconds | 1711545600 |
| PostgreSQL EXTRACT(EPOCH FROM ...) | seconds (float) | 1711545600.000 |
| Snowflake IDs (Discord, Twitter) | custom (ms-based) | embedded in ID |
| Kafka message timestamps | milliseconds | 1711545600000 |
| Elasticsearch @timestamp | milliseconds | 1711545600000 |
The pattern: APIs inherited from Unix tradition (Python's time.time(), Ruby, Go's Unix(), the date command) default to seconds, while JavaScript and Java default to milliseconds. Always check the documentation for the specific API you are calling.
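In JavaScript, for example, you get milliseconds natively and derive seconds when a second-based API from the table is on the other side:

```javascript
// Date.now() returns milliseconds: 13 digits in the current era.
const ms = Date.now();
const seconds = Math.floor(ms / 1000); // what Ruby's Time.now.to_i or date +%s would give

console.log(String(ms).length);      // 13
console.log(String(seconds).length); // 10
```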
The "January 1970" Bug
The most common symptom of a unit mismatch is seeing dates near January 1, 1970. This happens when a second-precision timestamp is passed to an API that expects milliseconds, such as JavaScript's Date constructor:

```javascript
// Date expects milliseconds
new Date(1711545600000)     // Correct: Wed Mar 27 2024

// Accidentally passing seconds where milliseconds are expected
new Date(1711545600)        // Wrong: Tue Jan 20 1970

// Fix: multiply by 1000 first
new Date(1711545600 * 1000) // Correct: Wed Mar 27 2024
```
The fix is always the same: check the digit count and multiply or divide by 1000 accordingly.
Going the other direction, if you see dates far in the future (like the year 56000), you probably multiplied a millisecond timestamp by 1000 when it was already in the right unit.
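A defensive normalizer based on the digit-count heuristic from the table above catches both failure modes. The helper name is illustrative, not a standard API:

```javascript
// Normalize an epoch timestamp of unknown unit to milliseconds,
// using the same thresholds as the detection function above.
function toMillis(ts) {
  if (ts < 1e12) return ts * 1000;            // seconds → milliseconds
  if (ts < 1e15) return ts;                   // already milliseconds
  if (ts < 1e18) return Math.floor(ts / 1e3); // microseconds → milliseconds
  return Math.floor(ts / 1e6);                // nanoseconds → milliseconds (prefer BigInt at this scale)
}

console.log(new Date(toMillis(1711545600)).toISOString());    // 2024-03-27T13:20:00.000Z
console.log(new Date(toMillis(1711545600000)).toISOString()); // 2024-03-27T13:20:00.000Z
```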
DevToolBox Auto-Detection
DevToolBox's Epoch Converter automatically detects the timestamp unit based on digit count. Paste 1711545600 or 1711545600000 and it will correctly identify whether you are dealing with seconds or milliseconds, showing the conversion in both units so you never have to guess.
This is especially useful when working with unfamiliar APIs or debugging logs from systems you did not build. Paste the number, see the date, and move on.
Try it on DevToolBox -- paste any timestamp and the unit is detected automatically.