History

What did African Americans gain in ww2?


Answers

lilybilly762

2 years ago

African Americans gained land, but it was in unfarmable territory.

ElizbethLiddle208

2 years ago

During World War II, some African American leaders were wary of the conflict, but African Americans still served in large numbers. That involvement helped them gain ground in the civil rights movement. They were also admitted to the Marine Corps and to the Army Air Forces as pilots for the first time.