As a result of early Hollywood films, U.S. government propaganda, white Christian missionary reports, newspaper coverage, and broader societal conditioning, many Negroes in the early twentieth century looked down upon Africa as a pariah, a "dark continent" of unspeakable embarrassments.