
How to Find the Century of a Year in Python?

Published in Python Programming · 2 min read

To find the century of a given year in Python, you can use integer division by 100 and handle the edge case where the year is exactly divisible by 100.

Here's how you can do it:

def find_century(year):
  """
  Calculates the century of a given year.

  Args:
    year: The year (integer).

  Returns:
    The century (integer).
  """
  year = int(year) # Ensure the input is an integer

  # A year exactly divisible by 100 (e.g., 1900, 2000) belongs to the
  # century given by the quotient itself; every other year needs + 1.
  if year % 100 == 0:
    century = year // 100
  else:
    century = (year // 100) + 1

  return century

# Helper so the output reads "21st" and "1st" rather than "21th" and "1th".
def ordinal(n):
  """Formats an integer as an ordinal string, e.g. 21 -> '21st'."""
  if 10 <= n % 100 <= 20:
    suffix = "th"
  else:
    suffix = {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
  return f"{n}{suffix}"

# Example usage:
year1 = 2023
century1 = find_century(year1)
print(f"The year {year1} is in the {ordinal(century1)} century.")

year2 = 1900
century2 = find_century(year2)
print(f"The year {year2} is in the {ordinal(century2)} century.")

year3 = 2000
century3 = find_century(year3)
print(f"The year {year3} is in the {ordinal(century3)} century.")

year4 = 1
century4 = find_century(year4)
print(f"The year {year4} is in the {ordinal(century4)} century.")

Explanation:

  1. Integer division (//): The core logic uses integer division by 100, which gives the quotient the century is based on. For example, 2023 // 100 results in 20.

  2. Adding 1: You add 1 to that quotient because the first century covers years 1-100, the second covers years 101-200, and so on. So 20 + 1 gives the 21st century for 2023.

  3. Edge case (year divisible by 100): When a year is exactly divisible by 100 (e.g., 1900, 2000), adding 1 would overshoot by one century. In that case the century is simply the quotient itself, which is what the if year % 100 == 0 branch handles. The short prompt session below walks through each of these steps.
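
To see each step in isolation, here is the same arithmetic at the Python prompt:

>>> 2023 // 100        # step 1: integer division gives the quotient
20
>>> 20 + 1             # step 2: shift into 1-based centuries
21
>>> 1900 % 100 == 0    # step 3: is the year an exact century boundary?
True
>>> 1900 // 100        # if so, the quotient itself is the century
19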

Example Breakdown:

  • Year 2023: 2023 // 100 = 20, then 20 + 1 = 21. Result: 21st century.
  • Year 1900: 1900 // 100 = 19, and 1900 % 100 == 0 is true, so the century is just 19. Result: 19th century.
  • Year 2000: 2000 // 100 = 20, and 2000 % 100 == 0 is true, so the century is just 20. Result: 20th century.
  • Year 1: 1 // 100 = 0, then 0 + 1 = 1. Result: 1st century. The assert checks below confirm all four results.
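
The same four results can be checked programmatically with assert statements against find_century as defined above; they pass silently when the function is correct:

assert find_century(2023) == 21
assert find_century(1900) == 19
assert find_century(2000) == 20
assert find_century(1) == 1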

This approach is clear, concise, and handles every positive (CE) year correctly, including exact century boundaries like 1900 and 2000. It is a straightforward, Pythonic way to determine the century of a given year; note that it assumes year >= 1, so year 0 or BCE years would need a different convention.
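
If you prefer to avoid the explicit branch, the +1 and the edge case can be folded into a single ceiling-style division. The name find_century_compact below is just an illustrative choice, and this sketch assumes year >= 1:

def find_century_compact(year):
  # (year + 99) // 100 rounds the year up to the next multiple of 100,
  # which is exactly the 1-based century for any positive year.
  return (int(year) + 99) // 100

For example, 1900 becomes 1999 // 100 = 19 and 2023 becomes 2122 // 100 = 21, matching the branch-based version above.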
