Black Culture?
Stupid question: is this representative of black culture in the US? I mean in general? I'm sure there are lawyers and doctors in that community, but this documentary probably reflects what it means to be black in America better than any affirmative action TV show does. Right?
I hope this is not an offensive assumption. I am not a US citizen. I am from Germany, to be specific. So I'm really not too familiar with everything. Just wondering if this is a crazy exception or mostly how things are.