Are our schools being 'feminized'? If so, does it matter?
Katha Pollitt, who herself went to Radcliffe (the girls' wing of Harvard), has an interesting perspective on the whole concept that American education is being "feminized." Alarmists, she says, point to the overwhelming majority of female teachers in elementary and high schools, "too much sitting quietly, not enough sports and a feminist-friendly curriculum that forces boys to read--oh no!--books by women. Worse--books about women" as reasons that America's schools are no longer boy-friendly.
She argues that it isn't feminized schools that are turning boys off education: it's the boys' own choices, and their belief that they don't really have to work that hard to get the opportunities, and jobs, they'll eventually need. Says Pollitt, "sex discrimination in employment is alive and well: Maybe boys focus less on school because they think they'll come out ahead anyway."
I don't totally agree with her point of view (although I don't totally disagree with it, either). For one, I take issue with her anecdotal evidence about which books her daughter read in public school: I read far more writing by and about women in middle and high school, which I attended in the late eighties. I also think that preschools and elementary schools are feminized, and that this is the source of much of the problem, which ends up affecting colleges.
What do you think? Are America's schools being feminized? And if they are, is that OK?