France
noun
UK /frɑːns/
US /fræns/
Definitions of France (noun)
A country in Western Europe.
France is renowned for its cuisine, art, and culture.
Many tourists travel to France to visit the Eiffel Tower in Paris.
France plays a significant role in the European Union.