Salon ran this article on the Oscars earlier today. It’s interesting reading, and while I agree with some of it (I really do think Denzel deserved that award; I can’t comment on Halle Berry as I haven’t seen her performance in that movie), I don’t get how anyone can realistically think the awards were given simply to cover up the fact that Hollywood is racist. I could be very wrong, but….