Why You Should Be Supporting a Universal Living Wage

I currently work as a freelance editor and writer, making almost all my income from these efforts. It’s the remote worker dream, one that I would be reluctant to give up, even for higher pay. I like setting my own schedule so that I can, for instance, write a random blog post when an idea hits me. This morning, at around 6:45 a.m., an idea hit me.

One of my most recent gigs is as a translator of textbooks, translating Spanish to English. Now, to be clear, despite the fact that I have lived in Madrid for nearly 4.5 years, my Spanish is quite mediocre. As in, my reading level might, might, be B1, and my speaking level is even worse. I survive in Spain in large part due to having a partner whose Spanish is much better than mine and the fact that most of my day-to-day interactions can be done in English or remedial Spanish.

I write all of that to say there is no one more surprised than me that I am a Spanish-to-English translator. What qualifications do I have? Well, for my clients, the main one is that I am a native English speaker/writer with an above-average writing and editing ability (I type this fully aware that there will probably be three typos in this thing after I post it). And that, it turns out, is just as valuable for this particular gig as someone with fluency in both languages.

I am able to do my job because of the existence of DeepL and, to a lesser degree, Google Translate. Years ago, it was common to joke about how these translation services mangled language. There was a common ritual that involved the translation of an English phrase through multiple languages and then back to English to see what kind of word salad you ended up with. These days, such a meme is a relic of a bygone era.

That’s not to say these translators are now perfect (far from it, which is why I have a job), but they certainly are vastly better than they used to be even just a few years ago. As an experiment, I tried translating the English phrase “I love you to death” with DeepL, running it through Spanish, then Dutch, and then Japanese. When I translated it back to English, it returned “I love you to death.” That’s just an anecdote, but the point is, these translation tools have gotten far more sophisticated in a short span of time.
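For the curious, that round trip is easy to reproduce. Here’s a minimal sketch using DeepL’s official Python client (the deepl package); the auth key is a placeholder you’d swap for your own, and the language chain is the one from my experiment.

```python
import deepl  # DeepL's official API client: pip install deepl

translator = deepl.Translator("YOUR_AUTH_KEY")  # placeholder key

phrase = "I love you to death"
# Spanish -> Dutch -> Japanese -> back to (US) English
for lang in ["ES", "NL", "JA", "EN-US"]:
    phrase = translator.translate_text(phrase, target_lang=lang).text

print(phrase)  # my run came back unchanged: "I love you to death"
```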

That’s largely due to AI. Artificial Intelligence is being used in basically every business and science field imaginable, mostly in ways far less sexy or menacing than decades of science fiction have led us to believe it would be. AI is the future, but also, it’s the present. While the kind of AI we’re used to seeing in films like I, Robot is quite possibly a century or more out, its use on a smaller, more workmanlike scale is already universal.

Now, as you can tell by the title of this post, I’m not here to write about AI (the little I do discuss AI owes a great deal to the excellent book by Hannah Fry, Hello World; pick it up). I only bring it up because it’s intricately linked to the work I do now. Without its existence and the improvements it has brought to translation technology, I would be ineligible for my current gig. Someone who was actually bilingual and a good writer/editor would be required for the job, and they would be able to ask for a far higher wage for their efforts. Aye, there’s the rub.

For the last year or so, I have been asking people, “Could your job be done by a machine?” Some people reply, unequivocally, yes, while others say probably. And still others state that parts of their job can be done by a machine, but it would lack the “human” element. Few if any people have ever said absolutely not.

As a writer, I like to think that I bring something to the table that AI (or Robby the Robot) couldn’t. Creativity, life experience, emotion, faulty logic – the “human” element. But the reality is that AI is already being used to write books and if that technology improves at even a fraction of the rate translation has improved, we’re going to see a completely AI-written novel top the New York Times Bestseller list within the decade.

(If you’re dubious, read about Douglas Hofstadter’s experiment in AI-generated classical music, which took place all the way back in 1997.)

I, too, would like to think my “humanness” (my specific talent and imagination) brings something to my work that is valuable. Also, I like to get paid for my work. But I know the inevitable reality is that, at some point down the line, my value – and your value – as a worker will be next to nil. That process has already begun.

I have value as a writer and editor because I am pretty good at both skill sets and, frankly, way better than the average person. And for now, that means that I can make a living doing this thing that I love doing. But I have no delusion that I couldn’t be replaced by an algorithm at some point down the line. I can’t help but think about my nephews and nieces and wonder what types of job opportunities will exist for them in the future. (When I accidentally transposed the ‘i’ and ‘e’ in nieces just now, my word processor automatically corrected it. Thanks, technology!)

There’s currently much discussion of self-driving cars and how those will put truck drivers out of work. I think that fear is a little premature because fully self-driving cars are probably a lot further off in the future than people like Elon Musk would lead you to believe (a topic covered thoroughly in Hello World). In my novel, Yahweh’s Children, I make a throwaway joke about a character 40 years in the future still waiting for flying cars. The point being that sometimes the promises of “visionaries” don’t pan out when matched with the pragmatic roadblocks of reality. But I digress.

The truth is that self-driving cars will be a nightmare for truck drivers, but not because it will eliminate all truck driving jobs. What it’s going to eliminate is the need for skilled truck drivers, the type of people who have highly specialized training and can thus demand a higher wage (usually with the help of a union, but, again, I digress). A self-driving truck will still need a human driver (for the foreseeable future), but not one who needs to operate the truck with anything more than the most rudimentary knowledge. So, what happens then? It’s basic economics: far more people will be able to do that job, which means their labor will be worth less, which means they’ll be paid less. But somebody will still take that job; a job’s a job, as they say.

Truck driving is perhaps the most high-profile example of a job potentially being overtaken by technology, but it’s hardly the only profession that is at risk (just read up on how restaurants are looking at tech to replace workers). It’s also not just AI that is making jobs obsolete. The former US President won over some voters by promising to bring back coal mining jobs. It was one of his most transparent lies (as time proved), but also maybe one of the most telling. Coal mining is dying, and though advances in technology are playing a part in accelerating the decline in jobs, the reality is that an industry built on digging up a finite resource was always going to have an expiration date. But a chunk of the world wants to deny reality by putting their heads in the ground, and they will happily support someone who sells them a shovel.

Whatever your job is, whatever amount of humanity you bring to it, just know that at some point – in a few years, in a generation, in four generations – AI and related technology will take much of the skill and individuality out of it. Your position, as it exists now, will be replaced by a machine, possibly with a human to keep things running, but a human who is far less trained and experienced than you. A human who will get paid less than you get paid now, which is already probably lower (in real, inflation-adjusted dollars) than what someone a generation ago got paid to do your job.

Let me be clear: I’m not an anti-tech prophet of doom. I think technology is great, and even if I didn’t, I’d still know its progress is inevitable. The question isn’t if, it’s when, and all that.

What’s not inevitable (at least, yet) is how society adapts to technology. Anyone who tells you this current economic model of hourly wages and salaries is sustainable either has their head in a hole or is selling shovels. In not too many generations, we will either have a society that provides for its population (its entire population), or we’ll have one where wealth inequality is so astronomical, the concept of a “first-world country” will be meaningless. In both scenarios, let me assure you, the rich will be absolutely fine.

In many ways, the fight over increasing the minimum wage in the US (which I wholeheartedly support) is a sideshow, because at some point it won’t be about finding jobs that pay well, it’ll be about finding any jobs at all. If we acknowledge that technology can do some jobs completely and other jobs partially, we have to accept the math that there will be fewer jobs available (certainly fewer jobs that require skill). Considering that the global population is going to still be growing for the next four decades, at least, the decline in jobs that pay a true living wage is a problem that is only going to worsen.

And that’s why you should support a universal living wage*. You, the teacher; you, the doctor; you, the truck driver; you, the computer programmer; you, the writer. It’s not about Communism or Socialism (or any other poorly understood ‘-ism’). If anything, it’s probably the most capitalist idea possible: if you ensure the entire population has enough money to buy food and shelter and clothes and iPhones and Netflix subscriptions, business will thrive. Billionaires will still be billionaires, and the Jeff Bezoses and Elon Musks of the world can continue to shoot their phallic rockets into the moon.

Even if you adamantly believe that your job could never be fully replaced by a machine because of that intangible human factor, you have to at least acknowledge that parts of your job could be automated. Which means that at some point, the Capitalist Overlords or Job Creators (whichever term you prefer) will realize they can pay less money. And anybody who thinks that increasing the minimum wage is enough to staunch the wound is as much a victim of head-in-the-ground thinking as those coal miners.

~

Anyway, those are my thoughts on the matter. Now I have to get back to my day job. I’ve only got a few more years before WALL-E replaces me.

* I’m using the term “living wage”, but I understand it might be better termed a universal basic income. But the UBI that has recently been proposed in the US by people like Andrew Yang has always fallen short of what I’m talking about. I mean a true living wage, i.e., not just a bare minimum, but something that allows for people to do whatever they like (say, for instance, a decade-long travel project). Your “wage” is what you “earn” simply by being alive and producing whatever you produce.

Why Do We Seek Labels?

It’s almost a daily occurrence now. On Facebook or Twitter, in an article or mind-numbing listicle, someone is discussing the traits, burdens and/or pleasures of being an introvert. Based on the unscientific sampling of my personal feed, 90% of the narcissistic self-promoters in the world are actually meek and shy introverts.

When we loners aren’t breathlessly talking about how weird it is that we prefer books to people (haha, I’m soooo crazy!), we’re posting the results of a Myers-Briggs personality test (or some generic knockoff).

“I’m totally an INFP.”

“Well, I’m an ENFJ.”

“Oh, I could definitely see that. I guess that’s because I’m an ENTP.”

“I kind of figured all of you were CUNTs.”

And when we get bored with scientific classifications that mostly mean nothing, we fall back on the original sugar pill of personality labels: The Zodiac.

What’s Your Sign?

How is it that a generation raised on “Be Yourself” entertainment is so obsessed with conforming to labels? How can we, on one hand, talk so much about how our race, gender and sexuality don’t define our potential and value, and then turn around and say without a trace of irony, “Oh, I’m a Cancer, that’s why I’m so emotional”? Why are we so in need of being sorted?

Hate to break it to you, but you aren’t Harry Potter.

I realize that most people who read horoscopes don’t put much faith in them. They read them for entertainment, they read them because they’re bored, or they read them because it’s fun to see how they match up with their lives. But, like the lapsed Catholic who still crosses himself before entering a scary, black basement, there is an ounce of belief in these people.

It’s not faith in the Zodiac (though, obviously, there are people who truly and fully believe in the bunk), but rather a kind of desperate hope that there could be some truth in the predictions. If the horoscope is true, if the Myers-Briggs is accurate, if Muhammad is the Prophet, then there is understandable order in the universe.

There isn’t. At least, not in the way you want.

BleakIntroPerversion

Being an introvert can be great. (Except when it’s not.) There is no question that I fit the label. I fit many labels. I’m shy, I’m pensive, I’m a Wallflower, I’m serious, melancholy, calculating. (Except when I’m not.) I am many things that are really just synonyms for the same trait, which is [fill-in-the-blank].

I imagine, though, that being an extrovert must be pretty great, too.

Except when it’s not.

We have an unhealthy compulsion to be categorized. I don’t like to say, “Because of the internet…” since that’s a very myopic way of looking at the world. I don’t think the internet is fundamentally changing us so much as it’s allowing us to more fully reveal our truest nature. That said, because of the internet, we are becoming more and more obsessed with telling other people what our label is, presumably so they’ll better understand and accept us.

Am I too quiet? Well, don’t be mad at me, I’m just an introvert.

Am I not assertive enough? That’s just my personality trait, I’d rather create than control.

Am I bad in bed? Must be because you’re a Gemini.

We’re so scared of admitting our failings – of admitting that being less than perfect isn’t a quirk but a reality – that we seek a label for every single possible human personality.

I’m guilty, as well. I am bipolar. Mostly it’s an affliction I have to deal with to get through day-to-day life, and I don’t tend to talk about it in real life.* I rarely tell co-workers and don’t bring it up with friends and roommates unless I think they’re someone who will appreciate the conversation, usually because they have a similar struggle. I don’t want to be defined by my condition.

Except when I do. Because at times I do want to wrap myself up in the label. I want to use it as an excuse so that all my worst behaviors and traits can be written off and forgiven. I want permission to be weak.

We are all weak at times, and in those low moments we seek the comforting reassurance that it’s not our fault, not our responsibility. It’s just our nature.

That might be true, but what of it? There are positive traits and there are negative traits. There are qualities to be celebrated and qualities to be corrected. And then there are traits and qualities that just exist, neither good nor bad. I think of it (as I do most things) in evolutionary terms: There is no right or wrong way to be, generally, but there are traits that are more beneficial for particular circumstances.

I have great qualities for being a writer. I have lousy qualities for being a pop singer (even if I do have an ass like Nicki Minaj).

Our incessant need to label ourselves speaks to a great insecurity within us. Maybe it’s because of the constant bombardment of celebrity news and digitally manipulated images, or maybe that insecurity always existed and the internet is just allowing us to admit it.

Either way, it’s a sickness. Not the insecurity. Insecurities can actually be a gift, a reminder that we are not done yet, that we can always do more to improve ourselves. No, the sickness is the desire to label ourselves, to say, “This is who I am so deal with it.” There are more than 7 billion people on this planet. If the world doesn’t want to “deal with it” they’ll just ignore you, like I imagine most of your friends on Facebook already do.

Nobody cares if you’re an introvert, an INTJ or a Taurus.

We only care what you actually do. So close the Buzzfeed quiz and go create something of lasting value.

*I use this space to talk about my condition on occasion because I want people to understand it, and I want people who are dealing with the same problem to have a place to feel less alone.

Writers Versus Content Creators

I am a writer.

It used to embarrass me to say that because it comes across as so utterly pretentious. Anybody who’s published a poem on Poetry.com can call themselves a writer, which pretty much dilutes the word. I’ve only felt comfortable calling myself a writer in the last few years, partially because I’ve published nationally and some of my stories and poems have appeared in journals. But the more basic reason that I feel comfortable using the ‘W’ term for myself is because I work damn hard at it.

I edit. I edit like a motherfucking professional. Not a single post goes up on this site that hasn’t been read and re-read and edited for typos and grammatically confusing phrases and then rewritten again to make sure that it isn’t all just one big rambling mess. If an article goes up and I spot a typo after the fact, I pretty much can’t do anything until I’ve fixed it. And that’s just for blog posts. You can’t imagine how much time I spend on short stories and the longer pieces I work on. I’ve been editing a completed novel for years. It’s been finished, I’ve submitted it to agents (no takers so far), and yet still I return to it in hopes of improvement.

Editing is only one part of being a writer. A very, very, very important part of it, but still not the whole shebang. A writer should also care about craftsmanship, the interplay of words and sounds. One needn’t look far to see that very little of what is written online has been crafted in any manner. Even if we’re ignoring the gibberish that gets posted in the name of SEO and Google analytics, publication on the internet is largely about filling space. Websites don’t employ writers, they employ content creators.

CONTENT IS KING(?)

“Content Creator” is this era’s greatest Orwellian euphemism, presenting the mindless sputum of the half-literate as ‘content’ and declaring the banging of one’s head against a keyboard as ‘creativity.’ Internet content is, by various definitions, valuable, even when it only exists to point the reader to the work of a superior thinker or artist. Unfortunately, the chained-up monkeys who type this stuff, while still unable to reproduce Shakespeare, have learned how to market their smeared shit so effectively that we all stop and look.

A great many articles published online contain barely 100 words’ worth of original content, all in reference to someone else’s video, photographs or article copied wholesale from another website or news source. So content-less has content creation become that the only real purpose of any creator is to slap up an attention-grabbing headline to bring in the hits. With headlines like “This Video Will Change Your Mind About Everything” and a screenshot strategically frozen to reveal cleavage (yes, Upworthy, I see what you’re doing), sites get your clicks and your shares, spreading their empty content like the mental herpes it truly is.

A content creator might push back and say, “You’re just bitter because you’ve failed as a writer.” To which I say, yeah, probably. But what is a writer if not someone who has failed at everything else in life?

WRITERS WRITE RIGHT

I am not criticizing the Internet. I have no qualms saying that the World Wide Web is the greatest scientific achievement in all of human history. Yes, even beating sliced bread. Counter to common belief, I don’t think the Internet is making us worse people, or even less social. The Internet didn’t turn us into assholes, we already were assholes (slavery, anyone?). This tool is transformative and quite often magnificent in the way that it brings together ideas, cultures, experiences and, most importantly, people. Blaming the Internet for our shortcomings as a species is like blaming the automobile for car crashes. In a certain light, it’s vaguely true, but it’s obviously missing the larger picture.

I know a lot of writers personally. Some I like and some I don’t, while some like me and most… tolerate me. Most of the writers I have known over the years have, at some point or another, stopped writing. At least, in a serious way. They may toss out a poem here or there, or loosely maintain a blog. Many of these writers have attempted to get their writing published and found out the hard way, like I have, that it is really, really hard to get published in this age, especially if you’re not writing erotic fan-fiction based on someone else’s creation.

It’s… disheartening. I’m not saying it was ever easy to be a writer, but I don’t think anyone would dispute that this is the hardest age for a writer to find a faithful audience and make a living by it. The Internet is, somewhat, to blame for that. The other party at fault is us, the writers. We have grown to accept the truism that no one will pay us for our writing, like we’re all part of one global internship and our bosses are waiting for their coffee. I’m not saying this isn’t true, just that it’s a self-fulfilling prophecy. Of course no one’s going to pay for what they’re getting for free. Remember what your mama said about buying the cow? Yep, we’re all sluts.

This is truly a shame because nobody has changed and shaped history more than writers. Great ideas and revolutionary movements spread through the written word. As much as Twitter gets a bad name for its 140-character limit and seemingly frivolous content, it actually serves a tremendous function because it helps spread messages. It lets us share the word.

Writing has value. Content doesn’t.

EXTRA! EXTRA! READ ALL ABOUT IT!

We’re a headline culture, so it’s no wonder that we believe all human knowledge can be reduced to a series of bullet points for easy consumption. The epidemic of scientific illiteracy that has created the Anti-Vaxxers, the Climate Change Deniers and the Intelligent Design Movement is largely based on these various groups believing that if they read a couple of headlines, a Wikipedia article and a science study abstract, they’re suddenly as informed as a person who has devoted their life to the field. You can’t reduce hundreds of years of research to an afternoon of reading and then call yourself an expert.

The more reductive we become, the harder it is to convey anything meaningful. Even the flashy content creators are shoving extra information into their headlines (“#16 Will Blow You Away” “#3 Will Literally Get You Pregnant” “#10 !!!!!!!!!!!!!!!!!!!!!!”) because the fire-hose torrent of hyperbole is losing its ability to draw eyes. Everybody is screaming with ALL CAPS that what they have to show you is worth your 5-second attention span, and in reality almost none of it is.

Which is why it’s time for writers to fight back.

Don’t give in to the easy pull of content creation. Don’t aim for the lowest common denominator. Don’t over-hype your work with misleading, exclamation-filled headlines. Be a writer. Craft your words with care, edit them to perfection, and if the world doesn’t care, do it again. And again, and again. The world doesn’t owe you an audience. As a writer, though, you owe it to yourself and to your work to actually give a damn about the quality of your writing. The word will remain long after all the content has been banished to the unlit alleyways of internet obscurity.

So what are you? Content Creator, or Writer?

#WritersVsContentCreators

You Are Not A Genius

Let’s start with a very basic fact: If there is an average intelligence, somebody has to be below it. An average, or mean, is not the number that is most common (that’s the mode), or the number that is smack dab in the middle of all the numbers (that’s the median). No, the average is the value we get when all numbers are added and divided by the number of numbers. In this case, those numbers are I.Q. points.

Theoretically, if there were just one massive, industrial-strength moron on the planet, and everyone else were of an astronomically higher degree of intelligence, everyone (but that one) could be higher than the average I.Q. But that isn’t the case. Without any practical way of giving the whole planet an intelligence test, we can be fairly sure that the average and the mode for I.Q. points are damn near the same.
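If the mean/median/mode distinction is still hazy, a few lines of Python (using nothing but the standard library’s statistics module) make it concrete. The scores below are made up for illustration: six ordinary ones plus one industrial-strength outlier.

```python
from statistics import mean, median, mode

# six ordinary scores plus one industrial-strength outlier
scores = [100, 100, 105, 110, 95, 100, 40]

print(mean(scores))    # ~92.9: the outlier drags the average below most scores
print(median(scores))  # 100: the middle value once sorted
print(mode(scores))    # 100: the most common value
```

With the outlier in the mix, six of the seven scores sit above the average, which is the one-moron scenario described above in miniature.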

I don’t care how good you are at Minecraft (whatever that is), you, my dear reader, are, with high statistical likelihood, not a genius.

[Image: the fake Einstein “everybody is a genius” fish quote]

Uh, Fish Are Pretty Dumb, You Ninny

Have you seen this fish quote? It’s the quintessential quote for the internet age. First of all, it’s frequently attributed to Albert Einstein but was never said by him (basically, if you have some banal sentiment to express, claim Einstein said it). Secondly, it doesn’t really make any sense (expecting humans to have basic reasoning and problem-solving abilities isn’t the same as expecting a fish to climb a tree). Thirdly, if everyone is a genius, then being a genius is suddenly not special. Who cares?

And fourthly, fuck the guy who did say this. I get that we’re worried about self-esteem and people being made to feel bad about themselves, but telling everyone they’re special isn’t the solution, it’s the problem. While you’re assuring your kids that no matter what they do, they’re a success, reality is waiting in the wings to show them that you can’t buy lunch with a glowing sense of self-worth. People fail. That’s how they learn, and grow. Ever met an adult who was coddled their entire childhood and never made to work for anything? They’re the worst.

There is a kernel of truth in the idea that judging everybody on the same scale fails to truly appreciate a variety of skills. A musician shouldn’t be judged on his ability to do spreadsheets, nor would you reject a doctor if she wasn’t good at watercolors. I’ve known intelligent businessmen who couldn’t write an intelligible literary essay to save their lives. We all have a limited amount of space in our brains (as I’ve noted before, the 10% idea is a myth), so we prudently save room for the knowledge and skill sets that most benefit our profession.

That’s what separates us (I include myself) from the geniuses. Geniuses have minds that are capable of functioning at a level beyond the grasp of us mere average schmoes. A genius isn’t just someone who is a talented guitarist or knows how to program a computer or write an enjoyable book. Those are all excellent skills to have, particularly if your line of work is guitarist, programmer or writer. But they don’t elevate you to the level of genius.

Well, What is a Genius?

After being so adamant that you are not a genius, I’m going to admit that defining a genius is kind of difficult. If we’re talking about I.Q. points, there doesn’t seem to be one consistent metric, though anything above 140-150 is generally considered genius or gifted. I’m not sure how common I.Q. testing is anymore, especially since the tests have often been accused of having a cultural bias. I’ve never taken a test (not a real one; I’ve done the online ones, but those aren’t legitimate gauges of anything), and I don’t know of many people who have. 100 is generally considered average, and most people fall somewhere around there, which is why I.Q. points are often represented with a bell curve.
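To put rough numbers on that bell curve: assuming the usual norming (mean 100, standard deviation 15; some scales use 16), the standard library can estimate how rare those cutoffs are.

```python
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)  # typical norming assumption

for cutoff in (140, 150):
    print(f"Share above {cutoff}: {1 - iq.cdf(cutoff):.2%}")

# Share above 140: 0.38%
# Share above 150: 0.04%
```

In other words, even the looser cutoff admits fewer than 4 people in 1,000.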

But when we use the term genius in casual conversation, whether referring to Steve Jobs, Vince Gilligan, David Bowie or some other public figure, we’re not concerned with their intelligence quotient, we’re referring to their achievements. Which is why the term genius is hard to define, and why it’s becoming so overused. We should guard against conflating our personal admiration of someone with objective acclaim. Which is not to say that Jobs, Gilligan and Bowie aren’t geniuses, only that when we’re basing a judgment on a person’s output, it’s really only the historians who can make the call.

Indeed, the old adage is true: Genius is never truly appreciated in its own time. Except, that’s not a lament, it’s a recipe. Achievement can only truly be appreciated with perspective.

The World’s (Not) Full of Idiots

The flip side of the fact that not everyone is a genius is that not everyone is an idiot.* I hear it all the time, on average once a day: “The world is full of idiots!” I had a roommate who pretty much peppered that phrase into every discussion he had (though, when I called him out on it, he denied any memory of ever saying it). Read any political site or article and you’ll learn that Republicans are idiots, and so are Democrats. Liberals and conservatives, all idiots.

It’s not just politics, though. Fans of The Big Bang Theory are idiots, as is anyone who listens to Dave Matthews Band or reads Twilight. Basically, if someone does or enjoys something that you don’t, they’re an idiot.

There have been studies that show correlations between intelligence or success and musical and literary tastes, but no such study could ever hope to prove causation, and bias almost inevitably enters into such surveys. Comparing the fan base of The Big Bang Theory, which is the most-watched sitcom on TV, with that of, say, Community, which is poorly rated but critically adored, is a fool’s errand. As a huge fan of Community (and a person who has next to no interest in TBBT), I would love to believe that my preference reflects some sort of mental superiority. In truth, it just speaks to my sense of humor.

You Are Not A Genius. Deal With It.

Be content with your average-ness. What choice do you have? You’re certainly not going to read books on new and difficult subjects to expose yourself to original ideas and educate yourself. Who’s got time for that? Accept that you will always be somewhere in the middle, with the vast majority of the population. At least you won’t be lonely.

And learn to deal with the mind-blowing notion that people who hold different beliefs, have different tastes and enjoy different experiences aren’t lesser than you.

Or, you know, don’t. Idiot.

*Just as there really are geniuses in this world, there are idiots, too. They’re just not as numerous as you think, and most of them are probably refusing to get their children vaccinated for fear of autism, so evolution might weed them out anyway.

The Age of Balance

We live in an age of wonders.

We live in an age of turmoil.

We live in an age of transformation.

We live in an age of destruction.

Each one of those sentences is true, 4 strands that co-exist in the DNA of our reality. It’s easy to handpick a number of examples to validate each assertion. Space travel and the curative power of modern medicine are certainly wonders. Turmoil rages in the Middle East as well as in the corridors of Washington, D.C. The internet and new technology are transforming the world into something never before seen in human history. And our technology, whether bombs or byproducts like pollution, is capable of destruction on a scale no other historical era could have even imagined.

Optimists and pessimists can cherry pick a bushel’s worth of evidence to support their personal predisposition, and sometimes the same piece of evidence can be used by both to support their respective arguments (the internet is both the greatest achievement of humanity, and our likely undoing, depending on the messenger). Everything is going to kill us. Everything is going to save us. Just you wait and see.

Not Optimists Vs. Pessimists. Optimists and Pessimists

Place me firmly in the optimist camp. I’ve lived through enough faux-apocalypses and read about enough historical ones to cast a wary glance at anyone whose predictions are death, destruction and doom. Like cockroaches, we survive.

I’m also a pragmatist, though, so I understand that oftentimes it’s people freaking out about a potential catastrophe that helps us avert it. In recent memory, the ‘Y2K’ problem is probably one of the most famous cataclysmic events that never happened, now nothing more than a punchline. What gets lost in the discussion of Y2K’s uneventful arrival is that people had been working on addressing the potential problems for months and years before January 1st, 2000. Was all the doomsaying for naught? Perhaps, but we can never know what would have happened if some people hadn’t taken the threat seriously and sought solutions.

Society needs optimists and pessimists. Just like society needs liberals and conservatives. As an optimistic liberal, pessimistic conservatives annoy the living hell out of me quite frequently, but that doesn’t mean I think they shouldn’t exist. I’m predisposed to roll my eyes when someone predicts destruction ahead, but I think it’s good that people take such prophecies seriously enough to address them and hopefully find solutions. A pessimist with solutions is mighty handy to have around.

Pessimists without solutions, on the other hand, are dangerous.

This is why the government Shutdown/Obamacare kerfuffle is so nails-on-a-chalkboard aggravating to me. Ted Cruz and his compatriots claim that the Affordable Care Act is hurting people, not helping. This may or may not be true (the evidence suggests that there are some negative effects being felt now, but time will hopefully rectify those issues), but defunding the program doesn’t solve any problems for 2 reasons: Obamacare still exists, defunded or not, and even if you eradicate it, no Republicans are offering any alternative that will help fix the increasingly unsustainable healthcare crisis in this country.

Ted Cruz and John Boehner are pessimists without a solution.

The Balance

American history is the story of finding balance, swinging too far one way and swinging back, always in search of the sweet spot. We are constantly attempting to maintain a balance between liberty and control that doesn’t collapse into anarchy or succumb to tyranny. We enjoy the progress of liberalism, but conservatism attempts to uphold a recognizable society. We embrace technology but maintain a constant vigil against its more dangerous and excessive applications.

Every election cycle in this country brings about op-eds about how our 2-Party system is bad for democracy, keeping out the smaller voices. The assumption, presumably, is that if only the Green Party or the Libertarians were given the same platform as the Republicans and Democrats, they would help change the conversation. This belief is, to be blunt, stupid. The Green Party is just the Democratic Party as your crotchety, conservative father would describe it. And Libertarianism is the worst idea since unsliced bread. Neither one of these parties is ever going to capture a substantial minority in the House or the Senate, and certainly not a majority nor the White House.

Democrats and Republicans represent generic versions of the most common political stances. If you want a representative 3rd party, you don’t go to an extreme, you go to the middle.

We actually have a 3-Party system now, and it’s a disaster. The Tea Party is called a wing of the Republican party, but the divisions within the GOP reveal how erroneous that description is. The Tea Party and the Republicans might agree on a great deal of policies (number 1: destroy Obama), but as a party they mix about as well as oil and water. That’s not to suggest that gridlock hasn’t always existed in our government, but this is probably the first time in our nation’s history where such an impasse can only be broken by a majority of 1 party siding with the opposition party against their supposed allies.

Our political system is unbalanced right now and it’s a disaster.

The Pendulum

While I believe it is important to maintain balance, I don’t think it’s something that requires concentrated effort to accomplish. We swing left and right, back and forth, and an equilibrium results, albeit one that never quite settles into stasis (which, I would argue, is vital for the continued growth of our society and species). History’s pendulum is a perpetual motion machine, the engine for all of our advancements.

We will never achieve a perfect middle ground, nor should that be our goal. Instead, we should continue to seek a society and world that allows room for opposition. I’m never going to be anything but a card-carrying Liberal, but that doesn’t mean I want Conservatives to be silenced. Quite the contrary, a nation built on nothing but unbridled liberalism sounds just as terrible as one built solely around conservatism. The promise of America has always been that it’s a land where the pendulum swings freely.

As long as that remains the case, consider me an optimist.

Nothing upon another’s word

Nullius in Verba
~ The Royal Society

Yet another tattoo.

Count it as either 13 or 14; it’s my 2nd in New Orleans. The tattoos I get each year are generally meant to sum up something about the year leading up to the inking, but because I’ve already gotten 1 tattoo here in the city, I decided to get a phrase that was less about marking a moment in time and more just part of my personal philosophy.

“Nothing Upon Another’s Word” (in the original Latin) is the motto of the Royal Society of London, one of the oldest (if not the oldest) organizations dedicated to science. It has existed since 1660. There are religions that are younger than that.

This motto is the essential heart of science, and the hallmark of a skeptical mind (note: skeptical, not cynical). Every atheist has the spirit of this phrase running through their veins, even if they’ve never read it. Of course, you don’t have to be an atheist to respect this basic tenet of the scientific pursuit (there are, after all, scientists who are religious), but to live it in your day-to-day life is to refute the very notions of ‘blind faith’ and ‘authority.’

There are those who will claim ‘science’ is just another ‘faith,’ revealing that they don’t understand either word. The phrase “Nothing upon another’s word” is what sets science apart from religion. Being an atheist or admiring science doesn’t mean one lacks the ability to believe, it only means that we don’t believe based on someone’s word or assurances. If a scientist makes a claim, s/he has to provide evidence to support that claim. Once that has been done, a portion of faith (used in the sense of “good faith” not “blind faith”) is allotted to that person, so long as each additional claim is supported with additional evidence.

Science builds on what has been established. The Theory of Evolution by Natural Selection didn’t just appear in Darwin’s head; he built it on observations and well-established facts. These observations and facts were so well established that Darwin wasn’t even the only person to come up with the theory. He was just the first to get it published and widely disseminated.

Religion doesn’t work that way. It makes a huge claim (an omnipotent God, a Holy Prophet who speaks for Him, Heaven and Hell) and works backwards, demanding that the believers accept the most outlandish claims first (with no evidence) and then everything else they say is pretty easy to swallow in comparison.

When religious people attack science by claiming that the Big Bang Theory or String Theory are just matters of faith, they’re displaying the very mindset that makes them susceptible to religious faith. They are used to thinking about the big and working small, whereas science takes the small and builds up to the big. Those religious people who dismiss scientific theories don’t understand that such theories are built on smaller observations and well-documented facts, because their personal “theories” (God) have no such foundations.

When I say I believe in the Big Bang Theory or the Theory of Natural Selection, I’m not saying I have faith in someone else’s word. I’m saying that there has been enough research, study and established facts to make each theory believable. The theory could be proven wrong, but if that’s the case the base facts won’t change any. On the other hand, if God is disproved (obviously this will never happen), every religion will suddenly be meaningless (I mean, more so).

When someone proclaims faith in a particular religion’s God, their belief is built upon accepting the unproven claims of another. When I state that the only thing I believe in is science, I’m plainly saying, “Nothing upon another’s word.”
