Unix Timestamp Converter

This Unix timestamp converter translates between Unix epoch timestamps and human-readable dates. Convert seconds or milliseconds since January 1, 1970 to formatted dates, or convert any date and time to its Unix timestamp equivalent with timezone support.

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp (also called Unix time, POSIX time, or epoch time) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC, known as the Unix epoch. It provides a simple, timezone-independent way to represent a specific moment in time as a single integer. For example, the timestamp 1700000000 represents November 14, 2023 at 22:13:20 UTC.

How do you convert Unix timestamp to date?

To convert a Unix timestamp to a human-readable date, construct a date object from it. In JavaScript, the Date constructor expects milliseconds, so multiply a seconds-based timestamp by 1000: new Date(timestamp * 1000). In Python, pass seconds directly: datetime.fromtimestamp(timestamp). Our converter handles this automatically and displays the result in multiple formats including ISO 8601, RFC 2822, and local time.
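The two conversions above can be sketched in Python. Note that datetime.fromtimestamp without a tz argument interprets the timestamp in the machine's local timezone, so passing timezone.utc gives an unambiguous result:

```python
from datetime import datetime, timezone

# Convert a seconds-based Unix timestamp to a UTC datetime.
ts = 1700000000
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00

# For a millisecond-based timestamp, divide by 1000 first.
ts_ms = 1700000000000
dt_ms = datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc)
print(dt_ms.strftime("%a, %d %b %Y %H:%M:%S GMT"))  # Tue, 14 Nov 2023 22:13:20 GMT
```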

What is epoch time?

Epoch time refers to the starting point from which time is measured in a computing system. The Unix epoch is January 1, 1970 at 00:00:00 UTC. This date was chosen as a convenient reference point when Unix was being developed in the early 1970s. All Unix timestamps are calculated as the number of seconds elapsed since this epoch, making it a universal time reference across systems.

Why does Unix time start from 1970?

Unix time starts from January 1, 1970 because that date was chosen by the early Unix developers at Bell Labs as a convenient, recent reference point. The original Unix system was developed in the late 1960s, and 1970 was a round number close to the system's creation date. Using a fixed epoch makes time calculations simple: just add or subtract seconds from this reference point.

What is the Year 2038 problem?

The Year 2038 problem (Y2K38) occurs because many systems store Unix timestamps as 32-bit signed integers, which can hold a maximum value of 2,147,483,647. This corresponds to January 19, 2038 at 03:14:07 UTC. After this moment, the integer overflows and wraps to a negative number, interpreted as December 13, 1901. Most modern systems now use 64-bit integers, which extend the range to about 292 billion years.
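The wraparound can be demonstrated by emulating 32-bit signed arithmetic. Python integers never overflow, so the two's-complement wrap is simulated explicitly here:

```python
from datetime import datetime, timezone

def to_int32(n: int) -> int:
    """Simulate two's-complement wraparound of a 32-bit signed integer."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

last_ok = 2147483647             # INT32_MAX, the last representable second
wrapped = to_int32(last_ok + 1)  # overflows to -2147483648
print(datetime.fromtimestamp(last_ok, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```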

How do you get the current Unix timestamp?

In JavaScript: Math.floor(Date.now() / 1000). In Python: import time; int(time.time()). In PHP: time(). In Bash: date +%s. In Java: System.currentTimeMillis() / 1000. All these return the current number of seconds since the Unix epoch. Note that Date.now() in JavaScript returns milliseconds, so divide by 1000 for the standard seconds-based timestamp.
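In Python, both units are available from the standard time module; a minimal sketch:

```python
import time

seconds = int(time.time())             # current Unix timestamp in seconds
millis = time.time_ns() // 1_000_000  # current timestamp in milliseconds
print(seconds, millis)
```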

What is the difference between Unix timestamp in seconds and milliseconds?

A Unix timestamp in seconds is a 10-digit number (e.g., 1700000000) counting seconds since the epoch. A millisecond timestamp is a 13-digit number (e.g., 1700000000000) counting milliseconds since the epoch. JavaScript's Date.now() returns milliseconds, while most server-side languages use seconds. To convert: milliseconds = seconds × 1000; seconds = milliseconds / 1000 (discard the fractional part).
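The two conversions can be sketched as a pair of helpers; integer division is used so the result is truncated, matching Math.floor(Date.now() / 1000) in JavaScript:

```python
def seconds_to_millis(ts: int) -> int:
    """Convert a seconds-based Unix timestamp to milliseconds."""
    return ts * 1000

def millis_to_seconds(ts_ms: int) -> int:
    """Convert a millisecond timestamp to seconds, truncating the remainder."""
    return ts_ms // 1000

print(seconds_to_millis(1700000000))     # 1700000000000
print(millis_to_seconds(1700000000123))  # 1700000000
```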

Unix Timestamp Converter Guide

What Is a Unix Timestamp?

A Unix timestamp (also known as Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC, not counting leap seconds. This date is known as the Unix epoch. Unix timestamps are widely used in computing because they provide a simple, unambiguous way to represent a specific moment in time regardless of time zone.

How to Use This Tool

Timestamp to Date:

  • Enter a Unix timestamp in seconds (10 digits) or milliseconds (13 digits)
  • The tool auto-detects whether the input is in seconds or milliseconds
  • Click "Now" to populate with the current timestamp
  • Results are shown in multiple formats for convenience

Date to Timestamp:

  • Select a date using the date picker
  • Optionally set a specific time
  • The corresponding Unix timestamp is shown in both seconds and milliseconds
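The date-to-timestamp direction can be sketched in Python. The tzinfo argument here assumes the picked date and time are meant as UTC; substitute another timezone to match a different picker setting:

```python
from datetime import datetime, timezone

# A picked date and time, interpreted as UTC.
dt = datetime(2024, 1, 15, 10, 30, 0, tzinfo=timezone.utc)
seconds = int(dt.timestamp())
millis = seconds * 1000
print(seconds)  # 1705314600
print(millis)   # 1705314600000
```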

Output Formats

  • ISO 8601: The international standard format, e.g., 2024-01-15T10:30:00.000Z
  • RFC 2822: Common email and HTTP header format, e.g., Mon, 15 Jan 2024 10:30:00 GMT
  • Local Time: Formatted according to your browser's locale settings
  • UTC: Coordinated Universal Time representation
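The first two formats can be produced from a timestamp with the Python standard library; a minimal sketch (the .replace call swaps isoformat's "+00:00" suffix for the "Z" shorthand shown above):

```python
from datetime import datetime, timezone
from email.utils import formatdate

ts = 1705314600  # 2024-01-15 10:30:00 UTC
dt = datetime.fromtimestamp(ts, tz=timezone.utc)

iso = dt.isoformat(timespec="milliseconds").replace("+00:00", "Z")
rfc = formatdate(ts, usegmt=True)
print(iso)  # 2024-01-15T10:30:00.000Z
print(rfc)  # Mon, 15 Jan 2024 10:30:00 GMT
```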

Seconds vs. Milliseconds

Unix timestamps can be represented in two common units:

  • Seconds (10 digits): The traditional Unix format used by most command-line tools and server-side languages (e.g., 1700000000)
  • Milliseconds (13 digits): Used by JavaScript (Date.now()), Java, and many APIs (e.g., 1700000000000)

This converter automatically detects which unit your input uses based on the number of digits.
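The digit-count heuristic can be sketched as follows. The 13-digit threshold is an assumption based on the ranges quoted above, and like any such heuristic it misclassifies timestamps from before September 2001, which have fewer than 10 digits:

```python
def detect_unit(ts: int) -> str:
    """Guess whether a timestamp is in seconds or milliseconds from its digit count."""
    return "milliseconds" if len(str(abs(ts))) >= 13 else "seconds"

print(detect_unit(1700000000))     # seconds
print(detect_unit(1700000000000))  # milliseconds
```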

Common Timestamps

Timestamp     Date
0             January 1, 1970 00:00:00 UTC (Unix Epoch)
1000000000    September 9, 2001 01:46:40 UTC
1700000000    November 14, 2023 22:13:20 UTC
2000000000    May 18, 2033 03:33:20 UTC
2147483647    January 19, 2038 03:14:07 UTC (Y2K38 problem)

Use Cases

  • Debugging API responses that return timestamps
  • Converting log file timestamps to human-readable dates
  • Setting up scheduled tasks (cron jobs) with specific timestamps
  • Working with databases that store dates as Unix timestamps
  • Comparing dates across different time zones without ambiguity