Unix Timestamp Converter
This Unix timestamp converter translates between Unix epoch timestamps and human-readable dates. Convert seconds or milliseconds since January 1, 1970 to formatted dates, or convert any date and time to its Unix timestamp equivalent with timezone support.
Frequently Asked Questions
What is a Unix timestamp?
A Unix timestamp (also called Unix time, POSIX time, or epoch time) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC, known as the Unix epoch. It provides a simple, timezone-independent way to represent a specific moment in time as a single integer. For example, the timestamp 1700000000 represents November 14, 2023 at 22:13:20 UTC.
How do you convert Unix timestamp to date?
To convert a Unix timestamp to a human-readable date, pass it to your language's date API in the unit that API expects. JavaScript dates use milliseconds, so multiply a seconds timestamp by 1000: new Date(timestamp * 1000). Python's datetime.fromtimestamp(timestamp) takes seconds directly. Our converter handles this automatically and displays the result in multiple formats including ISO 8601, RFC 2822, and local time.
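As a minimal, runnable sketch of the Python path (the timestamp and expected date are the ones from the example above):

```python
from datetime import datetime, timezone

# Convert the example timestamp to a date; passing tz=timezone.utc
# avoids the local-timezone dependence of the one-argument form.
dt = datetime.fromtimestamp(1700000000, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00
```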
What is epoch time?
Epoch time refers to the starting point from which time is measured in a computing system. The Unix epoch is January 1, 1970 at 00:00:00 UTC. This date was chosen as a convenient reference point when Unix was being developed in the early 1970s. All Unix timestamps are calculated as the number of seconds elapsed since this epoch, making it a universal time reference across systems.
Why does Unix time start from 1970?
Unix time starts from January 1, 1970 because that date was chosen by the early Unix developers at Bell Labs as a convenient, recent reference point. The original Unix system was developed in the late 1960s, and 1970 was a round number close to the system's creation date. Using a fixed epoch makes time calculations simple: just add or subtract seconds from this reference point.
What is the Year 2038 problem?
The Year 2038 problem (Y2K38) occurs because many systems store Unix timestamps as 32-bit signed integers, which can hold a maximum value of 2,147,483,647. This corresponds to January 19, 2038 at 03:14:07 UTC. After this moment, the integer overflows and wraps to a negative number, interpreted as December 13, 1901. Most modern systems now use 64-bit integers, which extend the range to about 292 billion years.
How do you get the current Unix timestamp?
In JavaScript: Math.floor(Date.now() / 1000). In Python: import time; int(time.time()). In PHP: time(). In Bash: date +%s. In Java: System.currentTimeMillis() / 1000. All these return the current number of seconds since the Unix epoch. Note that Date.now() in JavaScript returns milliseconds, so divide by 1000 for the standard seconds-based timestamp.
What is the difference between Unix timestamp in seconds and milliseconds?
A Unix timestamp in seconds is a 10-digit number (e.g., 1700000000) counting seconds since the epoch. A millisecond timestamp is a 13-digit number (e.g., 1700000000000) counting milliseconds since the epoch. JavaScript's Date.now() returns milliseconds, while most server-side languages use seconds. To convert: milliseconds = seconds × 1000; seconds = milliseconds / 1000.
What is a 10-digit vs 13-digit Unix timestamp?
A 10-digit Unix timestamp represents time in seconds since January 1, 1970 (e.g., 1710936000 = March 20, 2024). A 13-digit timestamp represents time in milliseconds (e.g., 1710936000000 = the same moment with millisecond precision). Most programming languages and databases use seconds (10 digits), while JavaScript's Date.now() returns milliseconds (13 digits). This converter auto-detects the format based on the number of digits.
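A sketch of the digit-count heuristic such converters use (the 13-digit threshold is an assumption of this illustration, not a standard):

```python
def to_seconds(ts: int) -> int:
    """Treat values with 13 or more digits as milliseconds, shorter ones as seconds."""
    if len(str(abs(ts))) >= 13:
        return ts // 1000
    return ts

print(to_seconds(1710936000))     # 10 digits, already seconds: 1710936000
print(to_seconds(1710936000000))  # 13 digits, milliseconds:    1710936000
```

Note that the heuristic is ambiguous for dates far in the future: a seconds timestamp will itself reach 13 digits in the year 33658.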
How do I convert a Unix timestamp in Excel?
To convert a Unix timestamp to a date in Excel, use the formula =(A1/86400)+DATE(1970,1,1), where A1 contains the timestamp in seconds. For milliseconds, use =(A1/86400000)+DATE(1970,1,1). Format the cell as a date/time to see the readable result. To convert a date back to a Unix timestamp: =(A1-DATE(1970,1,1))*86400. Note that these formulas work in UTC: Excel dates carry no timezone, so the result may differ from your local clock by your timezone offset.
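The Excel formula is plain arithmetic (seconds divided by 86,400 seconds per day, added to the epoch), so it can be sanity-checked in Python:

```python
from datetime import datetime, timedelta

ts = 1710936000  # March 20, 2024 12:00:00 UTC
# Mirrors =(A1/86400)+DATE(1970,1,1): convert seconds to days, add to the epoch
excel_style = datetime(1970, 1, 1) + timedelta(days=ts / 86400)
print(excel_style)  # 2024-03-20 12:00:00
```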
Can Unix timestamps be negative?
Yes, negative Unix timestamps represent dates before January 1, 1970 (the Unix epoch). For example, -86400 represents December 31, 1969 (one day before the epoch). Negative timestamps are fully supported in most programming languages: Python's datetime.fromtimestamp(-86400) returns 1969-12-31. This is useful for historical dates, though very early dates (before ~1901) may exceed the 32-bit signed integer range.
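In Python, for example (using an explicit UTC zone, since the platform-dependent one-argument form can reject negative values on some systems):

```python
from datetime import datetime, timezone

# One day before the epoch
dt = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(dt)  # 1969-12-31 00:00:00+00:00
```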
Unix Timestamp Converter Guide
What Is a Unix Timestamp?
A Unix timestamp (also known as Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC, not counting leap seconds. This date is known as the Unix epoch. Unix timestamps are widely used in computing because they provide a simple, unambiguous way to represent a specific moment in time regardless of time zone.
How to Use This Tool
Timestamp to Date:
- Enter a Unix timestamp in seconds (10 digits) or milliseconds (13 digits)
- The tool auto-detects whether the input is in seconds or milliseconds
- Click "Now" to populate with the current timestamp
- Results are shown in multiple formats for convenience
Date to Timestamp:
- Select a date using the date picker
- Optionally set a specific time
- The corresponding Unix timestamp is shown in both seconds and milliseconds
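The date-to-timestamp direction can be sketched in Python; the example date is arbitrary, and the explicit UTC zone keeps the result independent of the local machine:

```python
from datetime import datetime, timezone

dt = datetime(2024, 3, 20, 12, 0, 0, tzinfo=timezone.utc)
seconds = int(dt.timestamp())
millis = seconds * 1000
print(seconds, millis)  # 1710936000 1710936000000
```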
Output Formats
- ISO 8601: The international standard format, e.g., 2024-01-15T10:30:00.000Z
- RFC 2822: Common email and HTTP header format, e.g., Mon, 15 Jan 2024 10:30:00 GMT
- Local Time: Formatted according to your browser's locale settings
- UTC: Coordinated Universal Time representation
Seconds vs. Milliseconds
Unix timestamps can be represented in two common units:
- Seconds (10 digits): The traditional Unix format used by most command-line tools and server-side languages (e.g., 1700000000)
- Milliseconds (13 digits): Used by JavaScript (Date.now()), Java, and many APIs (e.g., 1700000000000)
This converter automatically detects which unit your input uses based on the number of digits.
Common Timestamps
| Timestamp | Date |
|---|---|
| 0 | January 1, 1970 00:00:00 UTC (Unix Epoch) |
| 1000000000 | September 9, 2001 01:46:40 UTC |
| 1700000000 | November 14, 2023 22:13:20 UTC |
| 2000000000 | May 18, 2033 03:33:20 UTC |
| 2147483647 | January 19, 2038 03:14:07 UTC (Y2K38 problem) |
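The table entries can be cross-checked with a few lines of Python:

```python
from datetime import datetime, timezone

def utc(ts):
    # Convert a seconds timestamp to an aware UTC datetime
    return datetime.fromtimestamp(ts, tz=timezone.utc)

print(utc(0))           # 1970-01-01 00:00:00+00:00
print(utc(1000000000))  # 2001-09-09 01:46:40+00:00
print(utc(2147483647))  # 2038-01-19 03:14:07+00:00
```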
Use Cases
- Debugging API responses that return timestamps
- Converting log file timestamps to human-readable dates
- Setting up scheduled tasks (cron jobs) with specific timestamps
- Working with databases that store dates as Unix timestamps
- Comparing dates across different time zones without ambiguity
Unix Timestamp in Programming Languages
Here are examples of how to get the current Unix timestamp and convert a timestamp to a date in popular programming languages:
Python

```python
import time
import datetime

# Get current Unix timestamp
timestamp = int(time.time())

# Convert timestamp to date
dt = datetime.datetime.fromtimestamp(timestamp)
print(dt)  # e.g., 2026-03-20 12:00:00
```

JavaScript

```javascript
// Get current Unix timestamp (seconds)
const timestamp = Math.floor(Date.now() / 1000);

// Convert timestamp to date
const date = new Date(timestamp * 1000);
console.log(date.toISOString());
```

PHP

```php
// Get current Unix timestamp
$timestamp = time();

// Convert timestamp to date
echo date('Y-m-d H:i:s', $timestamp);
```

Java

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.LocalDateTime;

// Get current Unix timestamp
long timestamp = System.currentTimeMillis() / 1000;

// Convert timestamp to date
LocalDateTime dt = Instant.ofEpochSecond(timestamp)
        .atZone(ZoneId.systemDefault())
        .toLocalDateTime();
```

SQL (MySQL)

```sql
-- Get current Unix timestamp
SELECT UNIX_TIMESTAMP();

-- Convert timestamp to date (uses the session time zone)
SELECT FROM_UNIXTIME(1710936000);
-- Result: '2024-03-20 12:00:00' in a UTC session
```

Go

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Get current Unix timestamp
	ts := time.Now().Unix()

	// Convert timestamp to date
	t := time.Unix(ts, 0)
	fmt.Println(t.Format("2006-01-02 15:04:05"))
}
```

Bash

```bash
# Get current Unix timestamp
date +%s

# Convert timestamp to date (Linux/GNU date)
date -d @1710936000

# Convert timestamp to date (macOS/BSD date)
date -r 1710936000
```

Common Unix Timestamp Values
A quick reference table of notable Unix timestamps for developers:
| Date | Unix Timestamp (seconds) |
|---|---|
| January 1, 1970 (Unix Epoch) | 0 |
| January 1, 2000 (Y2K) | 946684800 |
| January 1, 2025 | 1735689600 |
| January 1, 2026 | 1767225600 |
| January 1, 2027 | 1798761600 |
| September 9, 2001 (1 billion) | 1000000000 |
| November 20, 2286 (10 billion) | 10000000000 |
| January 19, 2038 (Y2K38 overflow) | 2147483647 |
The Year 2038 Problem (Y2K38)
The Year 2038 Problem, often called Y2K38, is a computing limitation caused by storing Unix timestamps as 32-bit signed integers. A signed 32-bit integer can hold a maximum value of 2,147,483,647, which corresponds to January 19, 2038 at 03:14:07 UTC. One second after this moment, the integer overflows and wraps around to a negative number, which would be interpreted as a date in December 1901.
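The wraparound can be simulated in Python by forcing the arithmetic through a signed 32-bit representation (struct is used here purely for illustration):

```python
import struct
from datetime import datetime, timezone

max32 = 2**31 - 1  # 2147483647, the largest signed 32-bit value
print(datetime.fromtimestamp(max32, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# One second later, reinterpreted as a signed 32-bit integer, goes negative
wrapped = struct.unpack('<i', struct.pack('<I', (max32 + 1) & 0xFFFFFFFF))[0]
print(wrapped)  # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```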
This problem is similar to the Y2K bug that affected systems at the turn of the millennium. However, Y2K38 is potentially more impactful because Unix timestamps are deeply embedded in operating systems, file systems, databases, and network protocols.
Modern systems have largely mitigated this issue by adopting 64-bit timestamps. A signed 64-bit integer can represent dates up to approximately 292 billion years in the future, effectively eliminating the overflow concern. Most modern Linux kernels, macOS, and Windows versions already use 64-bit time internally. However, legacy embedded systems, older databases, and some file formats may still be vulnerable.