Namibia

noun

UK /nəˈmɪb.i.ə/
US /nəˈmɪb.i.ə/

Definitions of Namibia noun

  1. A country located in Southern Africa.

    • The desert landscapes of Namibia are truly spectacular.

    • Many tourists travel to Namibia to view its wildlife.

    • Namibia's economy relies heavily on the mining and fishing industries.