Tuesday, August 14, 2012

Even MORE Things You Can't Talk About At U.S. Colleges

Five Questions Your College Doesn't Want You To Ask...Primarily Because They Can't Answer Them.



Earlier this year, I posted an entry about questions that were verboten to ask at most U.S. colleges. Since then, I’ve actually graduated from my university, and I’ve had a good three months to reflect on the totality of my campus stay - and now that it's that time of year when scores and scores of freshmen are ambling into university lecture halls for the first time, I figured it was an issue well worth revisiting. With that in mind, there are a few worthwhile addenda that I think need to be annexed to my original listing - and for all my successors, please feel free to use these to completely derail any and all class discussions, if you need time to sneak out and play some Xbox or something.

Question Number One
“If race DOESN'T exist, then does that mean the NAACP are racists?”

Race, no matter the venue, is almost always a discussion nobody wants to involve themselves in. It’s too easy to step on someone’s toes, and because everybody is afraid of appearing unlearned or even insensitive, worthwhile questions are almost never brought up when people of variant skin tones actually get together and discuss the notion. In academia, you’ll hear quite a few lectures regarding the social import of race and ethnicity…well, more accurately, a few lectures regarding how UNIMPORTANT race and ethnicity are to the modern world.

Unless you’re going to school in North Dakota or something, perhaps you’ve noted that your campus environment is a fairly multicultural one. In fact, if you’re a freshman, you may even find yourself forced to take an introductory course where you’re indoctrinated…I mean, enlightened…with the concept of diversity. At my alma mater, freshmen were required to read a book about a Georgia town that became a refugee settlement center in the 1990s, ultimately creating a pan-racial, pan-ethnic utopian society that today, only has one-fifth of its total population living underneath the poverty line. Unless you’re going to a school founded by Bob Jones, there’s about a 100 percent chance that the information you’re going to be fed about “multiculturalism” is going to be overwhelmingly positive - a message that, at least partially, kind of conflicts with most colleges’ perspectives on race and ethnicity.

Here’s the paradox: while colleges want you to celebrate racial and ethnic identity, they also want students to shy far, far away from believing that those same racial and ethnic identities are anything OTHER than social constructs. In other words, race and ethnicity, as GENETIC REALITIES, simply do not exist in higher education.

The problem here is that, well, race and ethnicity do seem to kind of exist, as genetic, scientifically-proven facts. I had a professor who once made my class write a report on a Discovery Channel special about genetic research - the message of the assignment, I suppose, was that since MOST of us are a biological hodgepodge of differing racial and ethnic genes, the whole idea of “race” and “ethnicity” as scientific constructs was pointless. It’s a nice gesture and all, but that still doesn’t do away with the obvious here: that genetically, people of varying races and ethnicities are INHERENTLY different from one another.

I understand what educators are trying to do: since a lot of college freshmen may still be porting about some insensitive ideas about others, college officials feel the obligation to address just how arbitrary and meaningless racial and ethnic identities truly are, so that students can interact more efficiently and sympathetically with others - and also, reduce the potentiality of the school getting slapped with a discrimination lawsuit while they’re at it.

This should strike you as more than just a little strange, since colleges make a lot of profit from associating with groups and organizations that make race and ethnicity not only CENTRAL components of their structures but, in many cases, their SOLE emphasis. So, while it’s generally not cool to place an emphasis on race and ethnicity when discussing the merits of people, that doesn’t stop colleges from encouraging students to find a sense of identity in racially-and-ethnically-driven clubs and organizations. The message here is just a tad perplexing; categorizing people based SOLELY on racial and ethnic identity markers is unacceptable, but CELEBRATING one’s racial and ethnic identity is not only A-OK, but championed by just about every college in the U.S.

Earlier this year, my college hosted a week-long African American Studies symposium. Outside the liberal arts college, several salesmen and saleswomen set up booths, and not surprisingly, a lot of their goods seemed to emphasize the fact that, yeah, racial differences EXIST. I recall finding one book that said melanin was the “chemical key to black greatness” - a title that, to me, would indicate that some people of color seem to think that there’s SOME element of genetic science (or pseudo-science, perhaps?) to the whole “race” and “ethnicity” thing.

The way many schools circumvent this is by coming up with a rather ingenious postulate: that race and ethnicity are LEARNED BEHAVIORS and not GENETICALLY-instilled traits. That’s right, kids: “race” and “ethnicity” have nothing at all to do with pigment or chromosomes or DNA or any of that other claptrap; they’re actually a “social inheritance” as opposed to a biological one. It’s an interesting idea, to be sure, but one that completely overlooks the scientific reality that race and genetics have some considerable overlap. While genetics may play an absolutely microscopic role in what shade of brown you look like, those same genetics explain a whole host of differences between the varying races and ethnic groups - like why certain peoples seem to be more susceptible to certain disorders and diseases than others. And if you REALLY want to get a shit storm brewing, try bringing up the fact that not only has modern science found the genetic key to racial variation…they’ve found the PRECISE protein within the precise gene that accounts for skin tone variance.

Admittedly, it’s an uphill battle trying to explain how genetic factors may have just a smidge of influence - in spite of being 100 percent supported, validated and proven by genetic analysis research - to people that are CONVINCED that only social influences constitute such identity markers. But as you will see, that’s a logical conundrum that can just as easily be inverted in the vestibules of higher education…

Question Number Two
“So if gender is learned and sexuality is genetic…then why does science tell us the EXACT opposite?”

The legendary scene in “Kindergarten Cop” occurs when a precocious youngster tells Ahnold the primary difference between the sexes: evidently, the big variable is that boys have a penis, and girls have a vagina.

Admittedly, it’s not the most inclusive definition of gender, but it’s pretty close - empirically, women and men have differing reproductive systems and secondary sex characteristics that, for the most part, indicate a pretty big chasm between the two sexes. Now, if we wanted to get technical and end the debate right friggin’ now, you could say that women are exclusively carriers of XX chromosomes (more specifically, karyotype 46,XX, but occasionally karyotype 47,XXX or karyotype 45,X), while men generally lug about karyotype 46,XY. Additionally, only males lug about this intronless gene called SRY - which encodes a protein generally referred to as the sex-determining region Y protein. Even the aberrational female that ends up with XY chromosomes will not be able to produce the SRY protein, so it’s generally considered THE biological means of separating the sexes (in fact, it was the official means of determining gender implemented by the International Olympic Committee, until people started calling it, without the remotest sense of irony, “sexist.”)

With that in mind, it would seem like any and all arguments about gender would be pretty fruitless and arbitrary, since we have a scientifically-proven algorithm for determining what constitutes male and female staring us in the face. As soon as you get into college, though…

First off, you’re going to get hit with the thesis that human gender isn’t necessarily a biological construct, but a social construct reinforced by one’s culture and early experiences as a child. This is a theory that’s pretty sound, if not entirely airtight, as just about every culture has some sort of expected traits and characteristics that are tied to one’s sexual assignment. Where things take a nosedive into absurdity is when your professors start telling you that not only does the “social construct” of gender have more influence than one’s biological assignment, but that the “social construct” of gender COMPLETELY overrides the concept of biological sex assignment. In sum, this means all of that hubbub about karyotypes and SRY gets chunked out the window, because only cultural dictates can determine what “sex” ultimately is. The more extreme lecturers will even tell you that the idea of “gender” simply does not EXIST as a scientific reality, generally writing off all of that chromosomal nonsense as “oppressive cultural constructs.” (And the guffaws to be had: nine times out of ten, these are the same kind of educators that regularly deride religious folks for “discrediting” science to meet their own agendas.)

Peculiarly, this “nature vs. nurture” argument gets COMPLETELY flip-flopped when we switch gears from human sex to human sexuality. While many academics will tell you that sex is determined by cultural influences, just about EVERY instructor I had in college was absolutely vehement that sexual orientation was an immutable, biological condition. By the time you graduate, this notion will be pounded into your skull: “gender is learned, but sexual preference is TOTALLY biological.” To argue the contrary, I might add, isn’t just frowned upon - it’s enough to get you tossed out of school for promoting “hate rhetoric.”

This, despite the fact that the American Psychological Association says that “although much research has examined the possible genetic, hormonal, developmental, social, and cultural influences on sexual orientation, no findings have emerged that permit scientists to conclude that sexual orientation is determined by any particular factor or factors.”

That, and the fact that many of the most frequently cited reports “confirming” the existence of biological components to human sexual orientation were authored by gay researchers - Dean Hamer, Simon LeVay and Richard Pillard among them - who seem to have just a teensy bit of bias working against the authenticity of their findings.

And whatever you do, DON’T bring up reports like this one from the Archives of Sexual Behavior, published in 2001, which claims that 46 percent of gay respondents reported childhood molestation, while only 7 percent of “straight” test subjects reported the same.

Or this 2010 report, which states that, for some reason, children raised by homosexual parents/guardians seem to be more likely to report GLBT identities than children raised in oppressive, heteronormative households. And please refrain from noting that this report from Rutgers’ Department of Biological Sciences - which states that EVERY single study claiming a genetic basis for sexual orientation released prior to 1995 fails as a clinical study - exists.

Even so, don’t be surprised if your professors STILL vouch for the genetic explanation of sexual orientation after being presented with such data. And if anybody attempts to label you as a “bigot” or a “gay-basher” for taking a contrarian standpoint - just remind them that there’s scientific research out there that confirms “homophobic” genes exist, too.


Question Number Three
“For the last time: is 'majority rules' inherently a GOOD thing, or a BAD thing?”

So, I was sitting in class one day, listening to a lecture about populism. The discussion was about William Jennings Bryan and the Free Silver Movement - admittedly, not the most exciting period in American history, but an interesting little epoch, nonetheless. My professor went on and on about how popular support - the desires of the masses and all that shit - was such a great thing, and something that should dictate national policy. Now, fast forward a couple of weeks, and we’re having a discussion about that Hitler fellow. That same professor talks about how Der Führer had popular support (the dude was adamant about having his moves backed by plebiscites, after all) and now, all of a sudden, populism is a freaking horrible thing. Whether or not majority opinion should decide federal policy seems to be one of those golden mechanisms that swings back and forth on the Morality-Meter…essentially, if it’s a majority opinion that your professor LIKES, populism is the “beating heart” of democracy, and if it’s something your professor doesn’t like, it’s an example of “the tyranny of the masses” being foisted upon undeserving minority populations.

You’d think that, a good two hundred-plus years after the U.S. was founded, we’d have ironed out whether or not that which meets the greatest needs of the greatest number of people is a positive or a negative. As before, it seems to be something that’s morally ambiguous as a construct, UNTIL it proves pivotal in something like a federal election or the passing of new legislation. In other words, populism is both the primary cog of freedom AND despotism - and from there, it’s an argument that just gets worse and worse and worse.

Case in point: let’s say that in a popular election, the party your professor doesn’t like wins a majority come voting day. At this juncture, your professor will almost certainly come up with some sort of excuse that describes how “institutional prejudices” work against a majority of the voting population, which means the results of said election are actually skewed toward a minority representation of the public. The big, fat beefy meat in this philosophical hamburger is that in representative democracies, not everybody has the ability to have their voices heard, and if they did, we’d get an entirely different outcome. But if the party of your professor DOES win, guess what happens to all of that “populism is bullshit” rhetoric? That’s right, popular civic support now becomes the hallmark of American exceptionalism and a vital moral component of society itself. It’s a completely arbitrary, vacant notion, this “populism” stuff - which means as soon as you hear the word in class, it’s high time to just tune out for the rest of the lecture.

Question Number Four
“Even educators admit that statistics is a bunch of BS, so why is there such a dependency upon it in academia?”

Remember earlier, when we were comparing research about whether or not human sexuality has a genetic component? You may be wondering how so many tests were able to VALIDATE the notion and how so many other tests were able to absolutely DISCONFIRM the notion. Apparently, this “Scientific Method” stuff is built upon the shakiest of foundations, no?

In higher education, you will be living, breathing, sleeping and pooping statistics. Everything you write will probably involve linking to a study conducted by a bunch of people you’ve never heard of, and a lot of times, that’s ALL you have to do to pass a class. I’ve always said that professors grade you not on content and composition, but on just how well you’re able to cite and source others…because as we all know, an enlightened citizenry is BUILT on a steady diet of thoughtless acceptance of facts and unoriginal observations, ain’t it?

The inherent problem with research and studies and clinical trials, I am afraid, is a quite pressing one: all in all, they don’t mean shit. Every week, you’ll hear about new research concluding that [XX] in [YY] people are [ZZ]. You look at it, you nod your head and you go “all right.” Hey, it’s peer-reviewed, so it HAS to be accurate. The thing is, if you took the time to look beyond the abstract, you would come to a decisively different conclusion - that reports of the like have about as much real-world authenticity as toilet paper.

Here’s a good example. Earlier this year, the CDC released a report that said one in 88 children (eight year olds, to be specific) in America had an autism-spectrum-disorder. Since it’s a report from the Centers for Disease Control, and since most people never read the report, they would simply take up the numbers as real-world estimates. I mean, the federal government runs the CDC, and the federal government runs the Census. Shit, you’d figure they’d be able to hook up the percentages through Access or something, and the end results would be pretty damn close to perfect.

The thing is, the report didn’t evaluate EVERY single child in the United States. In fact, the report evaluated only about eight percent of ALL eight year olds in the United States circa 2008, leaving out entire juvenile populations in 36 states. Additionally, most of the children were evaluated at ADDM (Autism and Developmental Disabilities Monitoring Network) surveillance locations, which gives us a VERY GOOD reason to read the fine print on the CDC’s report:

“Second, the surveillance areas were not selected to be representative of the United States as a whole, nor were they selected to be representative of the states in which they are located. Limitations regarding population size, surveillance areas, and the consistency of these attributes were considered when analysts evaluated comparisons across multiple time points. Although the two ADDM sites reporting the highest prevalence estimates in 2008 also reported among the highest prevalence estimates in 2002, the most recent results from New Jersey and Utah are based on subregions of their 2002 surveillance areas, with smaller populations compared with those areas and with most other ADDM sites. The estimated prevalence in these subregions possibly was influenced by factors unique to these smaller communities and might not reflect the number and characteristics of children with ASDs in the larger areas covered by these ADDM sites in 2002. Similarly, five other ADDM sites covered different surveillance areas in 2008 compared with 2002 and/or 2006. Although comparisons with earlier surveillance years were carefully restricted to comparable surveillance areas, caution is advised when interpreting results. For example, the addition of one North Carolina county in 2008 resulted in a nearly 15% increase in the overall prevalence of ASDs in that site compared with their findings when this new county was excluded from the prevalence estimate. Although this county was excluded from calculations when the 2008 results were compared with those from earlier surveillance years, the impact of this single county highlights the relative differences across subregions of any given ADDM site.”

Now, I’m not quite sure if you caught that last part, but the CDC basically told us that part of the reason why the tallied autism rate increased MAY be because, well, the CDC simply included more study sites this time around. My advice to you college kids is that EVERY TIME you crack open a report, you flip straight to the part of the study titled “Limitations,” because that’s pretty much the part of the study where the researchers ADMIT that all of this crap is flawed and potentially pointless.
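For the arithmetically inclined, here's a minimal Python sketch of that effect. The site names and counts below are completely made up - they are NOT the CDC's actual figures - but they show how folding one small, high-prevalence county into a pooled estimate can bump the headline rate by double digits without a single additional kid being any different:

```python
# Illustrative sketch only: the sites and counts are invented, NOT actual CDC data.
# It demonstrates how adding one high-prevalence surveillance site can inflate a
# pooled prevalence estimate - the effect the "Limitations" passage describes.

def pooled_prevalence(sites):
    """Pooled prevalence = total identified cases / total children surveilled."""
    cases = sum(s["cases"] for s in sites)
    children = sum(s["children"] for s in sites)
    return cases / children

# Hypothetical surveillance sites.
original_sites = [
    {"name": "Site A", "children": 20_000, "cases": 180},  # ~0.90% prevalence
    {"name": "Site B", "children": 15_000, "cases": 150},  # ~1.00% prevalence
    {"name": "Site C", "children": 25_000, "cases": 240},  # ~0.96% prevalence
]

# A newly added county with an unusually high identified prevalence.
new_county = {"name": "New county", "children": 6_000, "cases": 150}  # 2.50%

before = pooled_prevalence(original_sites)
after = pooled_prevalence(original_sites + [new_county])

print(f"Pooled prevalence without the new county: {before:.2%}")
print(f"Pooled prevalence with the new county:    {after:.2%}")
print(f"Relative increase from one added site:    {after / before - 1:.1%}")
```

Run it and the pooled rate jumps by roughly 15 percent - about the same magnitude the CDC attributes to adding a single North Carolina county - even though nothing changed at the original three sites.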

The major problem is that in academia, there’s such a reliance upon statistics as PROOF of one’s assertions that debates about social sciences sound more like a game of “Math Blaster” than intellectual discourse. The major, MAJOR problem that arises here is two-fold:

ONE: Statistics can EASILY be skewed to fit one’s hypothesis or agenda (here’s a SHORT LIST of the many biases - intentional or otherwise - that can affect research findings); and

TWO: As a general rule, even if statistics are mostly accurate, the findings can only be applied as broad, sweeping generalizations and have no bearing on assessing or analyzing INDIVIDUAL SUBJECTS (a quick illustration of this follows below). In the humanities, that’s kind of something that may have an influence of sorts. I think.
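To put a toy number on that second point, here's a quick Python illustration - the groups and rates are invented for the sake of the example - of how a dramatic-sounding group-level statistic can still tell you next to nothing about any individual subject:

```python
# Invented numbers: a group-level difference in rates says very little
# about any one individual drawn from either group.

group_a_rate = 0.02  # 2% of Group A has some trait (hypothetical)
group_b_rate = 0.01  # 1% of Group B has the same trait (hypothetical)

relative_risk = group_a_rate / group_b_rate
print(f"Headline statistic: Group A is {relative_risk:.0f}x as likely to have the trait.")

# Flip it around to the individual level:
print(f"Share of Group A WITHOUT the trait: {1 - group_a_rate:.0%}")
print(f"Share of Group B WITHOUT the trait: {1 - group_b_rate:.0%}")

# Betting that a randomly chosen member of Group A has the trait makes you
# wrong 98 times out of 100, despite the "2x as likely" headline.
```

In other words, a relative comparison between groups is not a prediction about a person - which is precisely why waving a citation around in a humanities seminar settles a lot less than people think it does.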

Question Number Five
“Seriously, how can I EVER become a critical thinker when all I ever do is give you the answers YOU want to hear?”

And at the end of the day, THIS is the biggest slight I can think of against liberal education in America. We wonder why our colleges churn out so many aimless, brainless and consumption-obsessed knuckleheads, and I reckon this is the root cause of our nation’s intellectual leakage.

When you go to college, you have one fundamental priority: getting OUT OF COLLEGE, so that you can actually do what you want to with your life. That’s the way America is set up, and nobody has the time or patience to actually retain things like “skill sets” or “knowledge.” This, I might add, is just as big a problem with educators and administrators as it is with students.

For the most part, students never have the time to learn, and teachers never have the time to teach. There’s so much non-academic stuff going on in college - from unions to athletics to lobbying to special interests committees to externally commissioned research - that there isn’t just a disconnect between students and educators, there’s a complete rift between the two. Honestly, the only time the teacher and the student connect is via assignments, and since both educator and educatee just want to get the hell out of the classroom and move on to more lucrative obligations, the hierarchical emphasis is shifted from education to completion. The student does what he or she has to do to pass the class, and the teacher does enough actual teaching to not piss off his or her higher-ups, and that’s that. If you’re wondering where the part about producing a more knowledgeable citizenry comes in…well, I’m right there with you.

Long story short, this is the mindset of every college student in America: “I’m going to do what I have to do to pass. I don’t have the time or resources to effectively assess and analyze the information given to me, so it looks like I’m just going to have to memorize large chunks of bullshit, which I will immediately discard five minutes after the final exam. I will repeat this 60 times, until these assholes give me a diploma.”

Admission time, kids. Every single essay I wrote in college, I intentionally skewed to best meet what I considered my professors’ personal biases to be. If they were liberal, I gave them a left-wing answer. If they were conservative, I gave them a right-wing response. If they were gay, I gave them a gay-rights answer, and if they were a woman, I gave them a women’s-liberation answer. In essence, for four years straight, I did NOTHING but give my teachers the responses they wanted to hear, and not what I personally thought about the material (especially if it ran contrary to what my professor thought). And as a result, I walked out of college with a Bachelor of Science…and Magna Cum Laude honors, to boot.

If you CAN’T see the problem here, let me elaborate a tad. You see, to make it through college - that social construct which allows you to partake of higher-salary, higher-influence employment - the most effective means of doing so is becoming an expert in the field of ass-kissing and yes-manning. You’re told to analyze and assess things, but always through a particular lens or perspective - and since you want that diploma and a life that doesn’t suck, you run with that little perspective until there’s at least a “C” on your academic transcript.

In short, that means critical thinking in American colleges is D-E-A-D. Nobody feels the urge (and most certainly, nobody is given the encouragement) to rock the boat and call into question ANYTHING. Nobody challenges the authoritative ideologies of the professor, because that might keep them from passing the class and therefore, moving on to the lives they want to lead. As such, contrarian ideas are never promoted (and in some places, hardly even tolerated), and a general ennui kicks in across the university. The last time I checked, you can only gain wisdom from asking questions - and in most U.S. colleges, there are far, far too many questions that are considered off-limits.

Simply put, we need more critical thinking, and we need more ambiguous (read: for a change, unbiased) takes on cultural matters in academia. Instead of looking at major social drivers like religion, race, sexuality, ethnicity and politics as absolutes, wouldn’t it behoove EVERYBODY if we studied in climates where these issues were openly discussed instead of being relegated to niche recesses of the university, or barred altogether?

It’s an unlikely prospect, I know, but as long as these institutional impositions are in place, be prepared for the quo to remain status for quite some time…
