Musings on the Wild World of Writing & Editing

I’ve been neglecting this blog for quite some time now–I’ve gotten distracted by the siren call of tumblr, Twitter, and Pinterest. I return to offer freelance copyediting services to anyone interested.

My résumé is available here, but I’ll provide the highlights:

  • Graduated from Texas Christian University in December 2012 after 3.5 years of study
  • Bachelor of Arts in Writing with a minor in French Language Studies
  • 4.0 GPA, summa cum laude
  • Phi Beta Kappa Honor Society
  • Graduated from John V. Roach Honors College with departmental honors in English
  • Attended TCU on a full-tuition scholarship
  • Copyedited for TCU Press during a 6-month internship
  • Acted as web editor-in-chief and copyeditor for eleven40seven: TCU’s Student Journal of the Arts
  • Edited and offered writing assistance to TCU students as a writing associate at the William L. Adams Center for Writing, where I worked with APA, MLA, and Chicago/Turabian styles across all subject areas (ranging from creative nonfiction to nursing to psychology to business)

I will copyedit in any genre: poetry, nonfiction (any subject area welcome), creative nonfiction, and fiction. I have worked across the curriculum at the Center for Writing, and I have worked with all the creative genres by virtue of specializing in craft within my writing major.

I will charge $15-$20 per hour based on the following criteria (I accept payment via PayPal):

  • Copyediting level desired (heavy, medium, or light)
  • Difficulty of the text (I will charge more for dense or academic texts)
  • Citations (the more citations and the more in-depth the style, the more I will charge)
  • Style guide
  • Content editing (art, charts, graphs, fact-checking, etc.)
  • Turnaround time desired

I am happy to copyedit in hardcopy or online, but for hardcopy I will request that you pay for printing and shipping costs.

I can provide quotes for your manuscript if you send a brief summary, one or two sample chapters, desired time for completion, and level of editing.

I’m also able to send basic workshop feedback for any creative pieces on which you may want comments, suggestions, or ideas for improvement.

Samples of my copyediting are available on my external portfolio, and I can be contacted at rachelk.spurrier@gmail.com to answer any questions.


The Digital Chill Pill for Kids

Because I’m not a parent, I don’t spend a lot of time on blogs devoted to raising children. I sometimes read Motherlode on the New York Times, particularly if the article concerns education or children’s health. The only reason I came across this article was that it showed up in my Publishers Weekly daily email. I was intrigued because of my interest in cyberliteracy and in how children are more adapted to constant use of the Internet than adults (even me, despite the fact that I’m 22). True, millennials and younger people are technology natives while older adults are technology immigrants: younger people grew up with technology and have little to no trouble using it, while adults have to acclimate and adjust to a digital, technology-heavy world. But even millennials are behind the young children who are growing up with cell phones and iPads (I didn’t have a cell phone until I was in my teens, and even then it was a clunker). I see kids on iPads in the subway, in restaurants, in stores, in strollers. I’m mostly fascinated by this trend–all a parent has to do is hand over an iPad or iPhone, and a child is entertained for hours, far longer than with the activity or coloring books you did on road trips as a kid with a box of crayons. It’s a digital pacifier.

Is handing a kid an iPad any different from handing over a coloring book or an interactive game or puzzle? The goal is to distract the child, so what difference does it make whether it’s on a screen or not? We’ve been developing games to improve children’s education for ages, and that technology is becoming more and more sophisticated as time goes on. As a kid, I had a “laptop” loaded with educational games, and my brother and I often worked on brain teasers, puzzles, and logic games. So, were my brother and I any better off?

The Ads Say This: Tablets Teach Your Kids to Read, Write, and Play

You can’t escape the bombardment of tablet apps and ads geared toward children. I see them every day on billboards and in the subway. They’re ubiquitous, and the message is clear: an iPad can teach your kid to write, to read, and to play (by the way, play is a valuable form of learning, but that’s a different topic for a different day). Because parents want their kids to get ahead of the curve, they’ve invested in video and handheld education games for years. As a kid, I remember seeing ads for pre-tablet handheld learning games, something like a Nintendo Game Boy but built around interactive learning. These games and handheld devices have only become more sophisticated and diversified, with thousands of games and apps to choose from.

In my opinion, yes, iPads are different from old-school, non-digital games. As someone who had poor fine motor skills, I found that coloring with a regular crayon improved my handwriting, and practicing my letters with a pencil improved my grip on writing utensils. Using my finger to trace letters on a screen or tapping spaces on a coloring “page” would have done little to solve the problem of my poor fine motor skills. In addition, you interact differently with paper and physical games like Rubik’s Cubes, mancala, or even checkers than you do with a screen. I’m not faulting parents who use iPads to get their kids to chill while they try to shop or enjoy a meal–I understand that, even from a non-parent’s perspective. But should iPads and tablets be our default digital education tool?

Alone Together

There’s another perspective on this–giving kids iPads as a constant distraction prevents them from interacting with other people and developing interpersonal skills. Additionally, going without constant visual and auditory stimulation from a screen teaches children how to sit in quiet, to be alone, to be by themselves in their heads. Children need to react to external stimuli, but they also need to know how to really talk to other people and to value alone time.

Interestingly enough, the article mentions an expert whose TED Talk I watched during my cyberliteracy class. Sherry Turkle explains that the more time we spend in front of screens, interacting with others through instant messaging and email, the more we isolate ourselves–and not in a good way. Having a good old-fashioned conversation and actually engaging another person are far more valuable than learning your letters on a tablet. Knowing how to keep in touch with people beyond Facebook wall posts and photo sharing is an important skill, and we inhibit it in our children when we emphasize digital technology that cannot–and should not–replace basic human interaction. We are social creatures, herd animals, and we benefit enormously from human touch, from mimicking others’ facial expressions and body posture, from learning basic social skills.

Maybe iPads are better than what I had–how do we know? We don’t yet have long-term data on how children who grow up on iPads turn out neurologically, psychologically, and socially. Only time will tell.

What About Books?

So what does this have to do with books, you ask? Well, publishers and developers alike are creating books that are more like games than books. Just like those books where you have a choice (“If you enter the cave, turn to page 50; if you decide to take an alternative route, turn to page 12”), these new books embed games, video, sound, and more into the text. In a way, I think it’s pretty cool that books are becoming social, interactive applications. Integrating media into text is important–multimedia and multimodal texts are becoming de rigueur.

But, as a traditionalist, I have to put up a fight for the old-school book. Digital, interactive textbooks will probably be a good thing–kids will be able to check answers quickly, engage in learning, and explore new learning techniques. But as for literature with a capital “L,” the value of the old-school paper book cannot be overstated. These books teach concentration, focus, and valuable lessons on the human condition. Books as art shouldn’t be reduced to fun and games–“serious” reading improves critical thinking skills and engages themes of what it is to be human.

So please–use an iPad, but still get kids hooked on reading with picture books, then chapter books, then young adult books, then books for adults. Books can still be illustrated, fun, entertaining, and exciting, even without bells and whistles and buttons and gadgets.

When the Greeks began to write, Plato lamented the new technology, claiming it would ruin learning and memory. Instead of having to memorize everything, people could store knowledge externally in written form. If you have some time to read more about how writing is a technology, check out this article (warning: it’s long and dense). In the long run, people still retained knowledge and added to our knowledge base over the millennia in art, science, math, literature, and so on. We live in an age where one newspaper may contain more information than the average citizen a few centuries ago might have come across in a lifetime. But how well do we remember these days, in this new era? Nick Carr explores how we treat memory and attention span differently now that we are constantly on the Internet and can store vast amounts of information in cyberspace. Why remember when Marie Antoinette was beheaded when you can look it up in seconds on Wikipedia? Why memorize your friends’ phone numbers when they’re stored on a mobile device?

Out With the Old, In With the New 

I’m getting off topic, but what I mean is that whenever we develop a new way to store and disseminate information, we ask whether the new technology will affect our way of thinking, analyzing, and remembering. It usually does, although we cannot fully analyze the long-term effects for decades. Along with multitasking and web surfing affecting our train of thought and our ability to concentrate, the e-book is changing the way we read. I read an article that explores this topic. I’ve already talked about how e-books change reading comprehension for people who’ve grown up on print books (though not for children). I’ve also talked a little bit about how some publishers and retailers like Amazon want to make reading social with shared underlinings and annotations (Amazon, stop sharing my notes! It’s creepy); some applications like Riffle try to make reading and recommendations a social media experience.

But apart from sharing our highlights and notes, e-readers gather information about our reading habits–how quickly we read, where we stop and start reading, how often we read, and so on. What you read and how you read it is no longer your private information. This is obvious when we get book recommendations from retailers, but publishers might also use this gathered information to push writers to edit. For example, if readers on average stop around page 50, the publisher might recommend that the writer shorten the exposition (a toy example of this kind of aggregation appears below). What if a book you buy is automatically tailored to your tastes via algorithms that know your buying habits and your preferences? What if readers have the option to group-edit a text? Of course, publishers have been coming out with new editions of books for years, but usually a new edition takes a while to produce and is widely publicized. What if the edition is specific to you, or you never know that what you’re reading isn’t what came out originally? What if the accessibility of a book depends on other readers? All of a sudden, that quiet, private afternoon curled up with a book feels far less quiet and private.
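
To make that concrete, here is a toy sketch (in Python) of the kind of aggregate a retailer or publisher could compute from e-reader logs. It does not reflect any actual vendor’s system; the page numbers, threshold, and variable names are all invented for illustration.

    from statistics import median

    # Hypothetical "last page reached" values reported by a handful of e-readers.
    last_page_reached = [48, 52, 310, 47, 55, 51, 298, 49]
    book_length = 320

    # Call a read "abandoned" if the reader never got within 90% of the end.
    abandoned = [page for page in last_page_reached if page < book_length * 0.9]
    finished = len(last_page_reached) - len(abandoned)

    print(f"Finished the book: {finished} of {len(last_page_reached)} readers")
    if abandoned:
        # If most readers stall in roughly the same spot, an editor might take note.
        print(f"Typical drop-off around page {median(abandoned):.0f}")

In this made-up sample, most readers stall around page 50, which is exactly the kind of signal that could prompt the “shorten the exposition” note described above.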

Is Sharing Caring? 

Mikhail Bakhtin theorized about the relationship between writer, audience, and genre. From what I can remember, a writer writes a book and publishes it, but its reception and genre are dependent on the audience. For example, Sherman Alexie’s The Absolutely True Diary of a Part-Time Indian was intended as an adult novel, but it is now widely considered YA fiction, though much of the content is memoir. Now, that relationship might make books more malleable and changeable than ever before: the writer publishes a book, the audience reads it on an e-reader, the publisher gathers information and edits the book, and the book comes out again in a new, revised form based on readers’ preferences and tastes.

I have no idea if this is a good or bad thing. If publishers really do start changing books to appeal to the audience’s preferences, both the author’s autonomy and the reader’s choice will be limited. However, as long as the original print version is still available, I guess I’ll just switch back to print instead of risking reading a book that is not what the author intended. I would like to interpret and make reading decisions for myself, thank you very much.

The E-Reader Wars

Okay, fine, that title is a little incendiary. But the wars over e-readers are varied: libraries getting mad at publishers because publishers limit the number of e-rentals before the libraries have to pay again; price collusion on e-books leading the DOJ to sue five of the Big Six; self-published e-books taking on the traditional publishing industry. Yes, the battles are varied and many, but I want to go back to a topic I discussed a little while back: a new meaning of “ownership.” I read another article on another e-reader war, this one about digital rights management (DRM). The Big Six publishers require that their e-books be sold with DRM protection so that readers cannot make copies, and because of DRM requirements, a book you buy for Kindle can be read only on other Kindle devices or apps (or Nook books on Nook devices and apps, etc.). Many consumers hate DRM because if they decide to switch e-readers, there is no way for them to convert the file to read on another device. The issue goes further, though: most non-DRM books can’t be read on Kindles (some can, but relatively few). So, if you have a Kindle, you’re pretty much stuck getting your e-content from Amazon. An “easy” way to circumvent this problem is to get a tablet with multiple e-reader apps, but a Kindle e-book still has to stay in the Kindle app, and an iBook has to stay in the iBooks library.

Independent booksellers want to sell e-books without DRM so that customers can buy whatever books they like, no matter which e-reader or tablet they own. Some imprints of major publishers are yielding to this trend and allowing non-DRM content to be sold. Hopefully other publishers will come around and let independent bookstores fight Amazon’s growing share of the e-reader and e-book market.

As a writer, I’m pretty torn about copyright law and piracy. On the one hand, I respect intellectual property and do not want my work stolen; on the other, I feel that in some ways copyright law is outdated, overly strict, and stifling. When a music label sues a mom for using a song in a YouTube video, that label comes across as out of touch and stingy. I know, I know–the music industry is struggling right now, but the woman didn’t intend to break the law, just to add a cute soundtrack to her video. And don’t get me wrong–I’m super against piracy. Unless a friend gave me the song or CD, I’ve bought every song on my iPod and every book on my Kindle.

I’m just not sure that ruthlessly cracking down on every possibility of copyright infringement is really in the creator’s or the consumer’s best interest. I’m glad that the resources in the creative commons are growing, but we are a long way from recognizing that copyright laws might be getting in their own way. I’ll go into more detail on this in my next post, so stay tuned.

I grew up reading strong female characters: Hermione Granger from Harry Potter and Liesel Meminger from The Book Thief. They may have been bookish and awkward and shy, but they had an internal combustion that fueled them on (and of course, later on, Katniss Everdeen literally burned on). I admired these characters for their pluck and tenacity; as a teenage girl, I saw in them both myself and what I wished I could be. Throughout high school, I basically ignored Twilight and only considered it while reading a chapter in How to Read Literature Like a Professor, because that chapter essentially argues that vampirism is almost always a metaphor for sex. After graduation, I decided to go ahead and read the books and see what the fuss was all about–and I wanted to be able to legitimately say they were awful.

I’m not going to spend an entire blog post on how bad the writing in those books is; if you’d like to see an entire website about it, click here. Quick disclaimer: I did enjoy reading them, in the way that you enjoy eating an entire bag of Cheetos Puffs in one sitting or chewing on gummy worms during a bad movie. But the greater messages of the books, in their treatment of young women, upset me.

Again, I will spare you the full comparison of Hermione Granger and Katniss Everdeen versus Bella Swan, because it’s just too easy: Hermione keeps fighting to defeat the Dark Lord even when the love of her life walks out, and Katniss helps lead a revolution while her love is being tortured (granted, Katniss does have one too many breakdowns in Mockingjay, but whatever). Bella Swan curls up in a ball on the forest floor when Edward leaves her, and she lives in a nigh-catatonic state until she begins flirting with another boy. Her only true happiness comes from Edward or Jacob, never from within. Message received: life has no meaning without a boy in it to tell you you’re special.


The reason I’m taking the time to write about this issue is that I read this article today. In it, Tara Isabella Burton confirms what I’d worried about all along: most, if not all, books in this category preach the same message–that you only have value if a boy loves you. And this is a genre that has its own section in Barnes & Noble. Burton lays out the formula for these books, and Bella Swan (ugh, that name) fits all the categories:

  • Ambiguous in description but always “intelligent”: Bella Swan is described as good in school and pretty, but apart from knowing that she has dark hair and eyes and is clumsy, she could basically look like anyone. This is convenient because it allows the reader to imagine herself in Bella’s place–Bella is so damn bland that the fantasy is easier to complete. Edward isn’t necessarily telling Bella how much he loves her; he’s telling you.
  • Vampirism is a safe way for Bella to explore her sexuality without actually going all the way (despite that awful baseball scene in book 1, she and Edward don’t hit home base until three agonizing books later). These books provide a proxy for sexuality through vampirism or magic or some other fantastical world.
  • Bella is loved not for her intelligence, wit, or charm. Edward loves her because he cannot read her mind and her blood smells special. And yes, I get how creepy that sounds, and we even get to learn the Italian for it: la tua cantante, or something like that. She is loved because she is “unique” to Edward, not for common interests or her personality. She is inherently special to one person, and only one man can see it. She is not special on her own merit or to herself.
  • Bella’s female friends are basically treated as annoying, cumbersome, or irritating. She only has a mind for Edward. I get it–young love, whatever–but so many girls are too quick to throw away friends to hang out with a boy. And instead of showing that that behavior is dangerous and detrimental, Meyer makes it seem okay in the end because Bella marries Edward at the age of eighteen (or maybe nineteen, but what difference does that make?). Yes, this is the message we are sending girls: get married right out of high school; it’s okay if you don’t go to college, because you’ll be a wife and mother and that’s all that matters.


I’m glad that these books get teenage girls to read; I really am. But I cannot fully support an entire genre that teaches girls that their greatest value comes only from outside themselves, that they are only worth something when a boy is validating them. This is dangerous and untrue. The truest lesson is that only when you love yourself can you love others or let others love you. I used to not believe this, but I do now, and I know this much to be true: you will never truly be happy until you believe you deserve someone healthy and whole who will treat you well. Emma Watson plays Hermione in Harry Potter and Sam in The Perks of Being a Wallflower, which leaves us with this quote: “We accept the love we think we deserve.” If we teach girls that they only deserve love from other people and not from themselves, they will only evaluate themselves by how others see them. Their only sense of self will come from external sources rather than from an identity built on introspection. We cannot and must not let this be the lesson girls take away when they turn the last page. We owe them more than that. Please read, but please know that your value lies beyond what a boy thinks of you. A girl needs to know she is a person who deserves love and affection from herself above all others.

Like Herding Cats

From elementary school through most of high school, I believed the rules of grammar were fixed–immutable, inexorable. Apart from creative writing, the conventions of grammar were stable and set; I liked how definitive they were. Illogical and idiosyncratic though they were, the rules of grammar made sense to me–I used to diagram sentences in my head when I spoke or read. I no longer diagram in my head, but I’ll freely admit my nerdiness, unabashedly and unashamedly. Grammar came easily to me, perhaps because I read so often and internalized syntax and usage. The more you read, the more you can tell when something “sounds” wrong (as a grammarian, your reason for something being incorrect should never be that it “sounds” wrong, but I catch the error because it sounds weird and then identify the technical issue). Spelling is similar but more visual: the more you see words and patterns of spelling, the better your spelling will be. Lesson here, kids: read, and read often.

Le français, c’est plus qu’une langue

That was the motto of my high school French club: French is more than a language. When I began taking French in high school, I took to its grammar almost effortlessly. Part of the reason is that while English is not a Romance language, many of its words have Latin roots, and some of the word order is similar. Because I felt so grounded in English, French wasn’t too much of a stretch. Plus, a mind for grammar will presumably do well with any language. Many people find the French language maddening because of its odd pronunciation, its ridiculous subjunctive, its ludicrous number of verb tenses. But I liked it. I liked conjugation tables, and I liked learning new patterns of spelling, and I liked putting together the pieces of a sentence. For me, English and French grammar and spelling exercises were like candy in homework form. When I was about to vomit from a plateful of trigonometry, I would sit back, relax, and work through the future and conditional tenses. It was relaxing (if you’re still with me at this point, I applaud you). I’ve basically been waxing nostalgic about my love of grammar because of what I’m about to say next: English grammar is contradictory, antiquated, and nonsensical. And there’s no such thing as “standard” English.

It All Comes Back to the Nazis

For those of you who don’t waste time on the Internet, Godwin’s Law of Nazi Analogies states that as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1. Basically, the longer an online forum thread goes on, the more likely someone is to compare someone else to Hitler or the Nazis. So, I’m going to go ahead and fulfill Godwin’s Law and start talking about grammar Nazis.

Part of the reason I’ve been so absent from posting is that my life has been in a state of massive change. I found a job in New York, NY, and I moved at the end of January. I’ve been settling into my new job and recovering from culture shock (my Southern/Midwestern sensibilities sometimes clash with the New York ethos), and my daily exhaustion has kept me from posting. My job is in client support, so most of my day is spent speaking with clients and helping them use our product. Although I was grateful for the job and liked the work environment, part of me began to worry that I wouldn’t use my training as a writer and editor. I took my first chance to prove my skills when an internal memo came around with an FAQ about the company (we just launched, so we are still familiarizing ourselves with how to address issues in the system and answer questions). Although it was internal, I marked it up, copyedited it, and sent it to my boss, who passed it on to our Chief Marketing Officer. He was impressed enough to start sending me material to edit and write. I got to choose the style guide for our client support team as well as media relations–I chose Chicago, because the Oxford comma needs a defender from the evil AP. Speaking of which, I once took a course that included peer editing. One of my peers was a journalism major, and she kept crossing out all my commas. I kept adding commas to hers. Both of our papers were technically correct; we merely had different philosophies of punctuation.

Which brings me to the fact that I am now the resident grammar Nazi of my department. I get questions about grammar, and it’s so hard not to qualify my responses with, “Well, in this case it would be this, but sometimes, if you want, it could be this.” I also have to dispel a lot of what gets incorrectly taught in schools: you can’t begin a sentence with a conjunction; you can’t end a sentence with a preposition; and so on. If you ever need help definitively addressing these issues, I recommend Miss Thistlebottom’s Hobgoblins: The Careful Writer’s Guide to the Taboos, Bugbears and Outmoded Rules of English Usage by Theodore M. Bernstein or The Transitive Vampire by Karen Elizabeth Gordon.

I digress. When I’m feeling particularly feisty, I go on long rants about prescriptivist versus descriptivist grammar, open versus closed punctuation, and the punctuation and spelling variations between UK and US usage (I’m looking at you, realise). I have a nice, sturdy set of reference books to dispel ambiguity when questions of grammar arise, and these books only add fuel to the flames of my grammatical passion: Webster’s Collegiate Dictionary, Webster’s Usage Dictionary, the Chicago Manual of Style, and The Copyeditor’s Handbook by Amy Einsohn. For anyone who wants a full and spirited debate on all things grammatical, Einsohn is your gal. She discusses various schools of grammatical thought, the words that cause the most heated debate, and the tiny nuances of language that dictate how we edit and write.

How in the hell did we get here? 

How, you might ask, did we end up with so many conflicting grammar rules and individual pet peeves? To summarize briefly: blame Latin. A lot of our ridiculous grammar rules come from British elites several centuries ago who wished to “perfect” English by making it imitate Latin, the language of the great Roman Empire. An immediate issue becomes obvious: Latin is the basis of the Romance languages, but English is a Germanic language. Both are Proto-Indo-European languages, but they come from different branches. The biggest example of this issue is the old-school rule that you cannot split infinitives. The basis for this rule is that in Latin you literally cannot split an infinitive, because an infinitive is one word. The other is the old standby that you cannot end a sentence with a preposition. Neither of these rules is strictly enforced these days, depending on whom you ask. I am sure that my grandfather will insist you cannot split an infinitive until the day he dies.

My grandfather is an example of a prescriptivist, a person who wants to preserve standard written English; his less obsessive counterpart would be a descriptivist, someone more concerned with language as it is actually used. If a writer were to follow every bizarre rule in the prescriptivist handbook, he or she would be reduced to ridiculous and ambiguous wording that obscures meaning. In all honesty, placing a preposition at the end of a sentence to communicate meaning is much more effective than obsessively avoiding it with odd or wordy phrasing.

One of the best birthday cards I ever received showed two girls sitting at a restaurant. One girl said to the other, “Where’s your birthday party at?” The other girl responded, “You shouldn’t end a sentence with a preposition.” When you opened the card, the first girl said, “Where’s your birthday party at, bitch?” We should not always write the way we speak, of course, but there is something to be said for writing in a style that is easily understood.

Please don’t misunderstand me; certain grammar rules exist for a damn good reason. You may not confuse there, their, and they’re; you may not mix up its and it’s; you may not create a plural with an apostrophe. These rules exist because if you make those mistakes, you obscure the meaning of the sentence; however, a sentence that does not follow the screwball rules of English but is perfectly understandable is preferable to the alternative.

Some people might want to scream that I am promoting the corruption of the language and wish to do away with all standards of grammar. Not at all. I will still maintain that the active voice is better than the passive, that vague pronoun references are confusing, that incorrect comma usage is maddening, and so on. I’m merely proposing that we let go of some of the ridiculous grammar rules that have sprouted over the years and stuck around, like bits of urban myth and folklore. They no longer serve any real purpose apart from making some writers feel superior to others (even if those writers’ prose is nigh incomprehensible).

Retreat to Move Forward

I work in an office, so I often hear “corporate speak,” which often takes incorrect word usage to the extreme. For example, I’m currently preparing for battle over the use of “clean” to describe accurate information or data. 30 Rock often mocks this corporate speak and the overuse of acronyms in the corporate environment–watch “Retreat to Move Forward” or the fourth-season Christmas episode for some of the best examples. I admire the corporate world’s creativity with language, but please, please, please, let’s avoid the overuse of acronyms and the obsession with catchy phrases. It’s not helping anybody. We already have words for these things–we don’t need to reach for a new word whose denotation and connotation are nowhere near what we mean.

So as I move forward in the corporate world, I will do battle for those rules of English that help create clarity and meaning, but I might let a few fall by the wayside–those that are outdated, confusing, and impractical. They’ve long since worn out their welcome in the grammar books.

 

The More Things Change

In my AP classes in high school, my teachers would prep students for the essay questions we might come across on test day. Fortunately, the College Board had set particular categories for each exam (e.g., in my English Language course, we had to prepare for an ADQ essay–Agree/Disagree/Qualify on any given subject based on primary and secondary sources). In World History and European History, I distinctly remember the category of “Change Over Time.” Basically, we had to write what changed in a certain area over a certain time period and what stayed the same. In European History, that was pretty easy: the middle class is always rising. In World History, many students felt like being clever and would say that the only constant was change. The answer, though smart-alecky, was often correct, though that’s not what the College Board was looking for. (I vaguely remember writing an essay about China’s loss of independence and self-determination in the nineteenth century.) Although my classmates and I often whined and complained about having to write another CoT essay, in truth, we humans are quite adept at looking back at events and finding patterns. (Oh, and don’t worry, folks–I fully plan on writing a rant against teaching essay writing to the test.) The challenge of the AP CoT essay was fairly simple, but when we turn it inward, the task becomes exponentially more difficult. Looking at ourselves poses a deeper set of issues than studying a series of facts for a standardized test.

The Only Constant Is…

I’m reminded of a quote by Nelson Mandela that I have on my Facebook page under “Favorite Quotations”: “There is nothing like returning to a place that remains unchanged to find the ways in which you yourself have altered.” I chose that quote as one of my favorites because at the time I was feeling nostalgic and kept going back to old sites from my childhood. Indeed, my middle school baseball field remained unchanged, and the park where I first held hands with my first boyfriend had stayed the same. But in many ways these places were unrecognizable to me because, of course, I had grown up and left behind the teenage girl who lived those moments. I was now viewing them through the lens of a young woman, merely looking back on the reminiscences of a 13-year-old girl.

I think I might have chosen creative nonfiction because I’m introspective and have an unfortunate tendency to navel-gaze. I’m always looking inward, analyzing my thoughts and feelings, and I’m always looking back, picking apart the past for clues as to what went wrong (or right, as it were). I think most people are fairly skilled at looking back and realizing how they’ve changed. We all chuckle a little at the follies of middle school or shake our heads at the dramatic events of high school. We can easily see how much we’ve changed from age 10 to 20 or age 30 to 40. But we are spectacularly bad at realizing how much we will change. I came across an article in the New York Times that gave some scientific insight into what I had already surmised might be true: although right now I can tell how much I’ve changed in the past ten years, I will more than likely make an inaccurate prediction of how I will be in another ten.

I think this article is another one of those instances where science attempts to give a logical explanation for something many people already assumed to be true. For example, the field of evolutionary psychology has provided scientific explanations for attractiveness in the case of the hourglass figure in women and the square jaw and broad shoulders in men. I wouldn’t have thought much about the article except that I’ve been doing a lot of thinking about the process of reflecting. Partly this reflection comes from a desire to “sort myself out” before moving to New York, and partly it comes from the time of year. New year, new beginnings, new life.

Reflections

One of my favorite paintings is The Penitent Magdalene by Georges de La Tour. To view a couple of versions, click here and here. I’m not a Biblical scholar, and I abandoned my art history major long ago, so the reason I’m attracted to this painting has nothing to do with religious symbolism or de La Tour’s skill with indirect lighting. It’s the look on her face as she stares into the mirror, skull in her lap or on the table, dark and lustrous hair falling down her back. She is beautiful; she is pensive; she is contemplative. There are many versions of the Penitent Magdalene in artwork, much like the Pietà or the Madonna, but the versions that draw me in are the ones where Mary Magdalene stares not upward at the heavens but downward into the mirror, where she realizes that the answers to her salvation come not only from above but also from purging herself of the (figurative) demons within.

I connected this painting’s theme of reflection to one of my favorite poems, “Mirror” by Sylvia Plath:

I am silver and exact. I have no preconceptions.
Whatever I see I swallow immediately
Just as it is, unmisted by love or dislike.
I am not cruel, only truthful—
The eye of a little god, four-cornered.
Most of the time I meditate on the opposite wall.
It is pink, with speckles. I have looked at it so long
I think it is a part of my heart. But it flickers.
Faces and darkness separate us over and over.
Now I am a lake. A woman bends over me,
Searching my reaches for what she really is.
Then she turns to those liars, the candles or the moon.
I see her back, and reflect it faithfully.
She rewards me with tears and an agitation of hands.
I am important to her. She comes and goes.
Each morning it is her face that replaces the darkness.
In me she has drowned a young girl, and in me an old woman
Rises toward her day after day, like a terrible fish.

I’m not going to go into a literary analysis of Plath’s poem; I did a project on it in the ninth grade, and that was enough academic study for me. The reason I’m going on and on about reflections is that I recently came across an article about New Year’s resolutions for the publishing industry. I’ve often told people that publishing isn’t dying, merely undergoing a transformation and a series of growing pains. The best way, in my opinion, for it to survive and flourish is to do some serious reflection. Jeremy Greenfield gives succinct advice on some of the more pressing problems in publishing today: the cost of e-books, librarians versus publishers, intellectual property, and so on.

Resolutions

I haven’t yet written about how antiquated and ridiculous copyright laws are (I fully intend to once I’ve done some more research), but I have gotten a bit into new meanings of ownership in the digital era. This question of ownership over digital media has become a problem between librarians and publishers. Unlike print books, for which a library pays a one-time flat fee and can then loan the copy out as often as it likes, e-books are problematic to publishers, who haven’t taken a fancy to digital borrowing. I suppose the objection is that multiple people could borrow a digital copy at once, although that problem is easily remedied with software and a simple user interface that allow only one patron at a time to check out an e-book. Publishers, to offset the potential loss, want libraries to pay fees after a certain number of loans in order to keep lending the e-book. Libraries are, understandably, a bit annoyed, especially as their budgets shrink and their patrons disappear. Greenfield argues that librarians should focus on bigger fish for the time being. He recommends letting the publishers come around in their own time while librarians shift their focus to more worthwhile issues. (He doesn’t name which issues, so whatever.)
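
For anyone curious what that one-patron-at-a-time rule might look like under the hood, here is a minimal sketch in Python. It is not any real library vendor’s software; the class and method names are hypothetical, made up purely to illustrate the single-loan model.

    class EBookLicense:
        """One licensed copy of an e-book, loanable to a single patron at a time."""

        def __init__(self, title):
            self.title = title
            self.current_borrower = None  # None means the copy is back on the digital shelf

        def check_out(self, patron):
            # Enforce the single-loan rule: refuse a second simultaneous borrower.
            if self.current_borrower is not None:
                raise RuntimeError(f"'{self.title}' is already checked out")
            self.current_borrower = patron

        def check_in(self, patron):
            if self.current_borrower != patron:
                raise RuntimeError(f"{patron} does not have '{self.title}' checked out")
            self.current_borrower = None

    # Usage: a second patron has to wait until the first returns the copy.
    copy = EBookLicense("The Book Thief")
    copy.check_out("Patron A")
    # copy.check_out("Patron B")  # would raise an error: only one loan at a time
    copy.check_in("Patron A")
    copy.check_out("Patron B")

Enforcing one loan at a time is the easy part; the fees publishers want after a set number of loans are the real sticking point.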

I could comment on his other recommendations to publishers, agents, and writers, but the only other thing I wanted to fully discuss is his recommendation to readers about complaints concerning the prices of e-books. Many people do not understand why a digital file should cost $14.99, and Greenfield explains the value of a book, no matter the format. However, I tend to favor a more practical approach in defending the cost of an e-book. In a course I took last semester, a classmate complained about the cost of e-textbooks. I’d recently read an article about how some universities are pairing with publishers to cut costs, so I explained to my classmate that the cost of producing a book is much more than the cost of printing. The ink, binding, and paper are only one aspect of publishing: there are also the sales team, the acquisitions team, editors, copyeditors, typesetters, and overhead costs; royalties to the author; and marketing, including advance reader copies and travel expenses for book signings and appearances; the list goes on. Plus, converting a book into a digital file has its own set of inherent costs.

The long road of traditional publishing is an expensive one; it’s no wonder many writers are turning to self-publishing. The costs of hiring a freelance copyeditor and a freelance book artist, plus some fees, seem minimal compared with keeping (almost) all of your own profits. As the new year begins to unfold, it’s time for everyone in publishing–writers, readers, agents, and editors–to look in the mirror and reflect on both the goals of our individual parts and the sum of the whole. After all, change is the only constant, and only with a collective effort will we remain successful.
