Before calling Africa a “Third-World” continent, think again.
It’s Western imperialism and capitalism that strangled Africa.
Why do you think white people can’t keep their hands off of it?
Ultimately, Africa is called a “Third-World country” (even though it’s a continent) to justify white colonialism and the invasion of its land in order to “civilize” it and provide “resources” and “freedom,” all while Africa is raped and pillaged, literally and figuratively.