Namibia
noun
UK /nəˈmɪb.i.ə/
US /nəˈmɪb.i.ə/
Definitions of Namibia noun
A country in Southern Africa, on the Atlantic coast; its capital is Windhoek.
The desert landscapes of Namibia are spectacular.
Many tourists travel to Namibia to view its wildlife.
Namibia's economy relies heavily on the mining and fishing industries.