A blog post after my own heart!

A coupla hundred years ago, our good ol’ mate Darwin thought up this totally radical idea called evolution. This had two noticeable results: the church got their knickers in a twist, and it revolutionised our understanding – of biology, humanity, pretty much EVERYTHING-y. Since then, all manner of disciplines have co-opted the evolution concept: stellar evolution! Directed […]

via The Evolution of Language: Three Things We’ve Learnt — Mother Tongues

Taming the’Saurus – Dangerous in the Wrong Hands

or, Etymology Won’t Teach You About Beetles

“Its very variety, subtlety, and utterly irrational, idiomatic complexity makes it possible to say things in English which simply cannot be said in any other language.”
― Robert A. Heinlein, Stranger in a Strange Land

It’s apparently the thing to make fun of English on Facebook: memes abound. Today I saw a quote (attributed to David Burge): “Yes, English can be weird. It can be understood through tough thorough thought, though.” My buddy sent me a set of proper English teacher mugs with helpful reminders like, “They’re there for their afternoon tea” and “I’m going to add two sugars, too.”
(Along about second grade I prided myself on coming up with the grammatically correct “That ‘that’ that that girl used was incorrect.” Four ‘that’s! Count them. WHOA! I just did one better: “I always said that that ‘that’ that that girl used was incorrect”. Guess that’s why I went to grad school for English, huh? Top that.)
Then everyone on FB comments something about English having no rules and English being the hardest language to learn as a result. . .Ha, ha, I laugh along with them, but part of me wants to proclaim, “NO! EVERYTHING IN ENGLISH HAPPENS FOR A REASON, AND IT’S FASCINATING!”
Whoa, sorry I shouted just there; that’s about three decades of pent-up enthusiasm for one of my favorite subjects of all: the History of the English Language. No other language in the world developed quite like ours, with a unique political history allowing a variety of languages in a variety of time periods each to dump a whole new vocabulary into the mix. And though for most people this results in laughable pronunciations and spellings, the far more important and brilliant fact about English gets lost: we have more word choices than any other language!

He went up the stairs.
He mounted the stairs.
He ascended the stairs.

. . .And we have three different guys who need to get up the stairs, right? The first is a plumber, the second is a cheap romance-novel hero, the third is a bishop or something.

The plumber reached the second floor through English’s origins as a Germanic language (to go, to wend). Dirk Kirkwood benefited from the French influence on the language following the Norman Conquest. His Eminence’s usual mode of transportation derives from the Latin influences our writers in English actively imported during the Renaissance in order to “beautify” our language for better-sounding poetry.

Three different sentences (which would usually all translate into the same single sentence in another language) that technically do mean the exact same thing in English as well–and yet, they don’t. In English-teacher speak, the three sentences have the same DENOTATION but not the same CONNOTATION.

And we writers are always all about connotation: we’re always seeking to say things in a new way, yet we still want it to be the correct way, and we really really want to achieve that subtlety of expression that Heinlein celebrates and which “word-inventors” or word-importers like Shakespeare worked so hard to make possible. It’s so much easier to express ourselves with so many words to pick from–and that’s before you even employ colorful idioms and cultural allusions. Thank St. Francis de Sales for the Thesaurus.

Hold your jets.

The Saurus-Trap

All right, I admit it, while my vocabulary is astoundingly prodigious, I AM on the wrong side of 50 and more often than not these days I’m too slow to come up with the question on Jeopardy! –even though I know that I know it, I just can’t think of the word. So on my laptop, where I do most of my writing, the Firefox homepage is a thesaurus.

A thesaurus is wonderful for people like me who already know the words, who already understand the exquisitely subtle connotations of a given word; we just need reminding. The same tool in the hands of a beginning writer becomes as laughable as the English language is to an internet memester.

“If Facebook has taught us anything, it’s that a lot of you, are not quite ready for a Spelling Bee.” –OH, NO THEY DIDN’T! Someone actually made a meme that complained about spelling at the exact same time s/he made the egregious grammatical error of separating the subject from the verb with a glaring comma. Hah!

So, yes, imagine the creative writing results someone like this would produce if you handed him/her a thesaurus. I could always spot the student who had gone out and bought a Roget paperback a mile off:

“We need to be fastidious not to under-estimate the perilous effects of global warming.” –Fine, I’ll be sure to bring along my hand sanitizer and St. George.

So where’s the good news? First, anyone reading an obscure blog like mine purporting to address “matters of peripheral interest to writers” isn’t the kind of person to fall into the leaf-covered Saurus-trap. You’re aware that denotation is not connotation.

Second, back up to where I (and also some famous writers, you can look them up) proclaimed English the best language on the planet for exquisite precision of expression. I’d like to add another assertion, the implications of which are even more glorious: those of you who have grown up with a love of words and have grown up with English have had your very brains shaped by vocabulary to perceive reality with exacting complexity. Well, potentially. One always hopes.

It’s like the old paradigm presented in the saying that Eskimos have 50 different words for snow (which, incidentally, is not an urban myth… 2013 update). If your mind can comprehend fifty different kinds of snow and ice conditions, then you literally do see your environment differently than someone who has only a couple of words at play during winter.

Etymology will teach you about beetles and scarabs and coleopterans. . .

One of the best ways to tap into your English-speaking cultural advantage is to have a good look at the history of that language, how and when the words evolved, which in turn will give you greater understanding of why those various connotations exist. It’s true that most people wouldn’t tell you their plumber ascended the stairs while at the same time describing the Pope on TV going up some steps for his inauguration ceremony. Writers, however, can give a lot more depth as well as precision to their writing by studying word origins.

Navigating the requirements of graduate school can be treacherous.
Negotiating the requisites of graduate school can be tricky.

…This one’s easy to explore without having to consult your American Heritage dictionary. (Which is, btw, the best dictionary for Americans interested in etymology; I won mine in a 5th grade spelling bee and the rest is history.)

“Treacherous” has connotations of betrayal, with its origin meaning “to cheat,” whereas “navigating” brings to mind a sailor managing unexpected obstacles on the sea. Both imply that a graduate student is met with much that is out of her control.

“Negotiating” instead sounds like a person who is at least on equal footing with the task at hand, with its implicit meaning “to conduct business” and a further overtone of something that is actively worked at, not passively achieved. Combine that with “tricky,” the seeming synonym for “treacherous,” and we have an additional sense that the student is empowered to act upon grad school just as much as grad school springs surprises on the student.

Two sentences, but only in the second does the student seem more powerful. I might add that a “requirement” is something issued by one entity to another, whereas a “requisite” is a static or neutral condition of need–but I’m probably just over-thinking it now.

The next time you find yourself torn between two “big” words, have a look at their etymologies–and you might just be surprised. Usually that alone is enough for you to decide. But along the way you’ll find yourself exploring the origin of another word, and yet another. . .Soon you’ll be reading and writing untold layers of meaning in every new choice of words.


How many ways can you tell a (hi)story?

I’m not biased, am I? –You’d better be!

So why do people choose to write historic fiction instead of straight-out fiction? Or instead of fantasy fiction, where you don’t actually need to do the historic research and a few anachronisms may be expected, even encouraged? All right, strange question to begin a blog post that isn’t actually about historic fiction but rather the writing of history. I would posit that historic fiction and history books are not two distinct entities, but rather two ends of a spectrum whose middle ground is rather more trodden than you might expect.

The Powers That Be Still Decide What We Think Is Our History

Once, all written history was trusted as absolutely true, the way an article on snopes.com is today. And those writings almost always had the “official seal of approval” of the state or the Church or whoever held the greater sway. If the illustrious powers that were in all their wisdom didn’t want you to hear about it, you didn’t hear about it. The bulk of European histories before the Renaissance were shaped by the unshakable belief that history was ordained by God: rulers and Popes and bishops and noblemen, of course, carried out the actions that led to historic events and trends, but it was God who put them on their thrones.

“But I know all that,” I hear you saying. Yet consider: how do we learn our history today? It really isn’t that much different than once-upon-a-time because most of the populace is a bit lazy, and even the critical readers among us–if they don’t have an interest in learning about a particular time and place–are going to fall back on the same sources as the non-readers: what they learned in school and what they see on the big screen. Look how many people believe that all the whites at the first Thanksgiving were Pilgrims–and Puritans at that–the way they were in their school play, or how many folks buy wholesale the conspiracy-theory depicted in the Oliver Stone film JFK.

And We Seem To Be Happy To Think That History Is Etched In Stone

Textbooks in school are always notoriously behind the times, but most parents and educators don’t seem to mind much: for one thing, books can be expensive, and it’s expected that they can’t always be current. For another, most of society is usually content if history class gets the general idea right, and along the way maybe teaches our kids a little something about the good guys and the bad guys and some skills in reading and remembering dates and writing reports. I’ve followed my niece’s primary and secondary school career in the cyber-age, and it’s surprisingly similar to when I grew up: you have access to information, but knowing what to do with that information makes all the difference. You can only learn so much actual fact as a youngster. A student like me from the information dark ages of the ’70s who has been taught to read and think and argue critically is a hundred times better off than a student today who has all the information in the world at her fingertips through her laptop.

Information and education are NOT inter-changeable commodities. –Yeh, I said it, you can quote me.

And yes, I know you know that, but where did it all start, the idea that we could even question what our government or our church or Mr. Pendleton our sixth-grade history teacher wanted us to know about history? For the most part, we have Niccolo Machiavelli to thank, the guy whose name has somewhat unfairly become shorthand for the scheming, manipulating political villain. Before he pointed out in The Prince that history was created not by God or by God-appointed rulers but by fallible human beings with a variety of motivations, there was no (official) notion that there was more than one way to interpret history. Suddenly it was all the rage to depict the events of history as caused by ambitious, smart or otherwise influential persons, through charisma or manipulation or sheer luck.

Machiavelli Says. . .Consider the Source

Post-Machiavelli we are able even to have the concept of “historiography”–that is, not the study of history so much as the study of the ways history is depicted. History can be interpreted through a moral lens, where the good guy comes out on top, or events can be depicted to bolster up a particular ideology, or they can be construed to re-affirm the superiority of one nation over another. (No, really?) These are the bold strokes; there are far more subtle ones at play that have taken all of us in at one time or another. Even me. And I did my dissertation in this neck of the woods. (What follows is quite an original thesis; here’s stipulating the caveat that if you take it and run with it, I’ll call it plagiarism, thanks!)

Shakespeare’s histories, the next-gen of the ideas of Machiavelli, gave us in the guise of individual characters a whole pageant of these “theories of historiography”, though they weren’t known then by this name. Check out the first tetralogy of history plays–that’s a fancy way of saying the three parts of Henry VI followed by Richard III. You have Henry VI, the pious but weak ruler wandering around in his own medieval church pageant, and his practical, down-to-earth statesman and all-around good guy Gloucester depicting a more secular or humanist approach to governing, and in comes York (Richard III’s father) as the ambitious Machiavel. By the time you get to Richard III’s antics you see a control-freak so manipulative that he psychologically starts to cave in on himself–really quite a modern idea in the sense that I could only label it with blatant anachronisms drawn from Freud and Jung and Engel’s biopsychosocial ideas. (But hey, Oedipus Rex existed quite a long time before the Oedipal complex, right?)

Look At All the Amazing New Things We’re Learning About Richard III!*

Ironically, the injustices done to the image of Richard III as a notorious Machiavel at the hands of Shakespeare and others in relating his history illustrate the same sort of Machiavellian “singly-motivated” telling made possible by the originator of the archetype himself. We’ve long known that here is a history ripe for revision–but whether that revision will be in the hands of an artist (like me, and the idea for a new Richard III play now-fermenting) or a historian, it is quite simply waiting for one thing: a new motivation for the telling of the story.

So here we are, right back where we started. I write historical fantasies because I want to deliver a certain message, a lesson or a truth that I personally believe. Those who write historic fiction are generally drawn to a “truth”, a topic associated with a particular time period, and the way they can drive home that idea to the reader. Writers of fantasy, those who have a message, have a more wide-open field upon which to play, but they can still draw from specific time periods and places to handle subjects that mesh well with the events of that part of history. And those who write the history books, well. . .they, too, have an ax to grind.

All of us writers, though, will probably fall into just one or two theories of historiography, vindicating Machiavelli, but perhaps our consciousness of the fact is what will make all the difference. We each have our own truth. We each have a story to tell. Just look at all the ways we can choose to tell it.


*The skeleton in the parking lot has captured my imagination this past couple years, and the story continues to unfold:  http://www.bbc.com/news/uk-england-leicestershire-21063882


Revisit, Review, Revise–But Don’t Reverse History

Why a post about Thanksgiving in February? Well, my blog purports to address revisions of history: fictional stories that convey more truth than fact, genealogical accounts that shift the focus of history from huge international events to the simple truths of everyday lives of ordinary people, and non-fiction work that blows us away with new discoveries in history, in anthropology and DNA, in science and how we comprehend the world and the universe. I’d like to comment on a quintessential part of American history that is absolutely ripe for revisionists–and not for the better.

Here is my how-NOT-to-revise-history polemic.

I believe that re-writes of history are a constant, as we discover more facts and as we find it desirable to emphasize certain aspects of that history to further our interests and beliefs in the present day. No single interpretation of history is ever going to be flawless or final, but that doesn’t negate the necessity of examining the past in new ways–on the contrary, I heartily believe in and encourage several versions of history existing and circulating at any one time.

HOWEVER. Having said that, I don’t believe we need to allow free-for-all revisions of our heritage to suit the zeitgeist of any particular group. On the contrary, a modern sensitivity to a pluralist view of what history’s lessons are or should be should make us that much more critical and careful about being as truthful and honest as possible. We have an age-old problem wherein our cultural attempts to redress the wrongs of the past sometimes lead us to allow the formerly-repressed voice to assert anything at all it wants–perhaps motivated by guilt or ignorance or just not wanting to be left out of the protest/party happening out in the street.

How not to celebrate a Quatercentenary

The year 2020 approaches, and with it, a 400th-anniversary commemoration of the foundation of Plymouth Colony. Already we have groups–some of them with a seeming-stamp of “official approval”–completely in denial of the notable values practiced in this Colony (but rarely in any other early European settlement attempts) and attempting to subvert what it is we’ve always celebrated about Thanksgiving, turning it instead into a day of mourning (for atrocities against Native Americans) and vilifying people like Governor Bradford (come on, do you even know what he generally stood for?!) –Strange, I genuinely like people who attempt to “subvert” anything! And I’m usually the champion of the politically-correct–at least the original goals of PC, before it became the free-for-all I decry above.

It’s become apparent that groups such as Plymouth 400 don’t speak for me; this isn’t political correctness, it’s misguided (all right, even malicious) distortion of history. We have nothing to learn from either extreme’s version of history, particularly when the purveyors of such versions behave so badly in the present. (You’ll have to Google around for some of the incidents at play in and around Plymouth, Massachusetts.) And we have much to learn from a historical look at why Thanksgiving became and stayed a national holiday in the first place. Hint: it wasn’t actually about blithely giving thanks and throwing around pumpkins and turkey drumsticks.

Give the earlier Americans who selected Thanksgiving as a worthy holiday some credit: we don’t celebrate Plymouth Colony because it was the earliest European colony; it wasn’t. And we certainly don’t celebrate it because it epitomized white/native relations. Rather we still celebrate because it was the exception to the norm, an example of how integration should have been, and how it still could be if we only learn the lessons of history.

Why did the 1800s “discover” the Thanksgiving holiday?

Remember, when the holiday was established and took hold later in the 1800s, our country was in the midst of increasingly ugly race relations; people knew it. Lincoln was responding to the ultimate human degradation, slavery. Out west whites were at war with Native Americans and atrocities were committed on both sides. The notion of genocide even became a possible vision for some. And often-violent prejudice met the most recent immigrants: Slavs, Chinese, Jews.

Given that backdrop we can see that celebrating the first Thanksgiving was NOT a quaint hypocrisy; it was a fervent wish by people weary with all the hate and violence that surrounded them.

Were there ulterior motives at play when Massasoit showed up with twice as many of his folks to the Plymouth colonists’ harvest bash? Sure, they wanted to let it be known they were watching. But hey, they brought food. Was there political maneuvering on both sides that first year, a search for alliances and allies? Of course, and that’s precisely the point: these people believed–and went on believing in spite of occasional individual mishaps–that this whole living-together thing could be accomplished without bloodshed.

Cut to the past century, as we continue not to learn the lessons from Plymouth Colony: Japanese internment, backlash against the latest immigrants, SE Asian or Hispanic or Muslim. . .You’d better believe I’m going to point out how ridiculous it is to equate Governor Bradford with the KKK or to turn Thanksgiving into a national day of mourning.

Well, given that there are some groups of people still fomenting dissent among the races, I guess I can see why they’re threatened by the vision of the First Thanksgiving after all.


Disclosure: I descend not only from Governor Wm. Bradford but also the mythological American love story, John Alden & Priscilla Mullins.
Disclaimer: I felt this way about Thanksgiving even before I found out I had ancestors on the Mayflower.


Reading suggestion:  Nathaniel Philbrick provides a good balanced example of all the voices preserved surrounding the impact of Plymouth Colony on the New World, and vice-versa:  2003. The Mayflower and the Pilgrims’ New World. G.P. Putnam’s Sons. That’s the young person’s version; I found it quite challenging enough!


The Aphasic Horror – or, the Flip Side of Hypergraphia

When, literally, there are no words, there is no self.

Last time, I wrote about the phenomenon of hypergraphia as part of a collection of possible attributes that comprise Geschwind Syndrome, itself sometimes a feature of temporal lobe epilepsy. Having an over-abundance of words can be a wonderful “symptom” for a writer to have. But in my case TLE has on occasion led to a frightening, albeit brief, type of seizure producing aphasia: the complete loss of words. There are TLE seizures that are enjoyable when I’m given the leisure to entertain them: euphoric, dreamlike, dreaming-while-wide-awake, all-encompassing deja-vu states that hyper-stimulate both memory and creativity. But of all the unpleasant seizures that go along with TLE, none that I experience is anywhere near as horrible and frightening as aphasia.

I must be forgiven for complaining, because my seizures are relatively mild (and after Cymbalta was invented for my fibromyalgia I began to get quality sleep and the seizures have all but disappeared) and, being temporal lobe seizures, are usually undetectable to the outside world. But appearances most absolutely and certainly can be deceiving.
I know that the dreaded sensation I’m about to describe lasts probably only a second or even less, but believe me, it feels like an eternity while I’m “in” it.

It begins with a metaphorical “aphasia cattle prod” that hits me right between the eyes and it’s almost as if I’m being physically struck backwards; in the same moment my mind is wiped utterly blank. You can’t even imagine how blank. I suddenly have NO WORDS–none, nothing to think with. And the sensations are overwhelming: I feel as if I’m falling in total blackness, my hands clutching the air and finding nothing to hang onto, and with that/because of that I’m feeling utter terror, just about the worst terror I’ve ever known. (You can’t see me right now, but as I try to put myself back into that place, my hands are actually clawing at the air.)

During this eternity-moment I feel like I’m not even alive anymore; it feels as though I’ve been instantaneously relegated to a strange nether-world or limbo where life and death are the measure of nothing. I have no words so I can’t think! I may as well be a non-sentient slug in a petri dish. You can poke me and I might have an autonomic reaction, but I won’t be able to think about my past or my future or why I’m being poked. I can’t even think about thinking. I can’t think about myself as being apart from the petri dish or the poke or the pok-er or the pain. There is no form or structure for my consciousness; I’m not human when I have no words.

Of course my entire description of the experience is possible only in after-thought; during the actual seizure, any sense of what the experience is (or is not) reaches me only as raw sensation. Falling and flailing in blackness and in terror is as much as I know at the time. It’s more than enough.

Afterwards I have a residual pressure (absence of pressure?) fogging me up right between and behind the eyes, a bit of fatigue after the aura, but a profound sense of relief to be standing on firm ground again. Sensory input flows through my mind again and the little talker in my head assigns words and descriptions to what I experience, and I exist once more.

The capacity for aphasia even for a moment scares the daylights out of me, and I can only begin to imagine the horror for people whose neurological impairments are more long-term. Oliver Sacks, the neurologist and writer, has described in multiple books some of the stranger and more painful variations brain damage can exhibit. There are more familiar conditions like Alzheimer’s, Parkinson’s, and the catatonic conditions seen in the film Awakenings, based on Dr. Sacks’ work. These are conditions which, like Geschwind Syndrome, can challenge one’s very definition of personhood and sense of self. I appreciate every day not being on that end of the spectrum.

So hard-wired wordiness aside, I honestly don’t take for granted my reliance on words, my love of words, my very self-perception and definition through words. Rather I know that it is a gift, this indicator of sentience and human-ness. I see words for the magic they are and I’ve understood for as long as I’ve used them the drive to tell stories with them, to make something that wasn’t there before I took pen to paper.


I’m hard-wired to be a writer? Whoa

Driven to write…

As it turns out, finding out you’re neurologically “hard-wired” to be a writer isn’t necessarily a good thing.

Hypergraphia, the urge to write excessively, is one of a cluster of characteristics forming Geschwind Syndrome, which is itself a personality-affecting phenomenon that often goes along with having TLE, or Temporal Lobe Epilepsy. Unlike better-known epilepsies affecting the motor-control areas of the brain, TLE influences the sensory-input processing part of the brain, also the part that controls language and memory access. (You may have heard how the “smell” part of the brain resides close to the “memory” part making smell one of the strongest senses that can evoke memory–well, TLE seizures can entail sensory hallucinations such as smell and complex deja-vu hallucinations.)

I had infant febrile seizures–they put me in a coma for about a day. But it wasn’t until about 15-20 years ago I began to learn about TLE and the fact that a scar on one or both of my temporal lobes had left its mark on my very personality for as long as I could remember. An obsession with language, the drive to write, a strong sense of social justice and the tendency to become mildly-obsessed with worlds of the imagination (cartoons, TV, movies, but most often books) could all be attributed to my neurological irregularities. I wrote stories and sometimes when I was too impatient to wait for inspiration I simply copied my favorite books from the library by hand. The thing with hypergraphia is, you want to write a lot; it isn’t necessarily good. Some people write loads of crap. Some who have little imagination just doodle their name all over everything.

But I tried to be a better writer. I began my first novel at age twelve, and by then was probably an even better artist than writer (abstract creativity sort of swirls out of Geschwind Syndrome and related conditions). I studied art in high school and AP English and French, then began a major in art and a minor in music that eventually gave way to a writing major (the music minor stayed). This writing thing just wasn’t going away. I proved to be a pretty good abstract thinker and was sort of flattered into graduate school in Medieval/Renaissance literature. I adored Anglo-Saxon and the histories of Shakespeare–having already written quite a few political-themed stories myself.

By young adulthood, though, I began to be aware that my mini-obsessions, my intensely-curious and driven scholarship (and lack of interest in dating) and overly-developed “inner life” (I think Wikipedia calls it “intensified mental life”) were unusual, and there were some aspects of my routine I had to keep to myself. Color-coded notes and index cards, the tendency to live inside interesting stories and photographs, and always, always the story-writing. I figured it was just me. Er, that is, it is me. Only. . .IS IT?

This is the sinking-feeling reaction and subsequent specific despair that followed my diagnosis. It didn’t matter that I was in good company; many notable artists had TLE (Van Gogh, Dostoyevsky). So did many mystical saints and shamans–Joan of Arc. None of them were very happy in their personal lives, were they?! Destined to be misunderstood outsiders at best.

Far from being elated that I was quite literally hard-wired to be philosophical and detail-oriented and imaginative and a writer, I felt like my entire life of writing was. . .a symptom! It felt like everything that made me ME was simply a medical phenomenon. I walked around feeling like nothing more than a diagnosis. And though it wasn’t a conscious decision, I gradually. . .stopped writing. Whenever I looked at my latest novel I saw it the way a clinical scientist views experimental data: clear evidence of a mind gripped by disease. If I didn’t have a normal mind, how could my observations and opinions ever be valuable?!

And then what happened was, I became Not-A-Writer. For FIFTEEN YEARS. That was the extent of my despair. I had a degree in writing and a Master’s in literature; I had top grade-point averages and won every scholarship and assistantship and fellowship that had come my way. I wrote three doctoral degree comprehensive exams for twelve perfect scores and I had taught writing successfully at the college level for ten years–and still I somehow convinced myself it was all a fluke.

Of course the symptoms still found release during those years; I still told myself some pretty elaborate stories in my head. I had learned to compose and memorize scenes in my head verbatim since those days in high school when staying up late to get just one-last-thought written down became impractical, and I self-imposed a strict “lights-off” schedule. During the writing hiatus I had fantasies after lights-out as lengthy and complex and coherent as any novel I’d ever written–if not more so, because now I was conscious of the need for concise story-telling, since wordiness was a symptom of my “disease”. But I stubbornly refused to write any of it down. I silenced myself more effectively than childhood shyness or low self-esteem ever had.

Still, I read a lot, as many people do, and often I read novels that were okay but I said to myself, “This got published? I’m pretty sure I can do better than that.” Social media netted me some writer friends, then last November 2nd I learned about NaNoWriMo–the online challenge to write a novel in a month. Just draft it. And something about that challenge appealed to me and freed up my self-imposed writer’s block; no one had to see it, you just had to write it. For some reason the time was right, and this was the thing that got my butt in the chair and really writing again: finding the joy in writing again and not worrying about why. That is, I began to feel like a person again some time ago, I found my life, I continued to apply my talents researching and engaging my curiosity. But actually writing every day didn’t happen till last November.

So three months later I had a fairly coherent first draft of 144,000+ words and I guess the dam has burst, the die is cast, I’ve crossed the Rubicon et cetera. Laptops and the Internet writing community make for a very different world since last I was a writer. I don’t know where I’m going from here. But I don’t think I’m going to go quietly.


*Nothing in this post or the reading list should be construed as a basis for a medical diagnosis. If you suspect your symptoms go beyond those of your average neurotic head-in-the-clouds workaholic writer, go say “Hey” to the family doctor. She’s probably happy to see you ’cause it’s been awhile!

…Before my writing break-through, I have to say that I credit Oliver Sacks with finally making me feel human again. His writings about neurologically-based disorders were the first I read that asserted his patients were human beings first, beyond any medical problems they might have. He has a belief in the transcendent wonder of life and the abilities of people, probably based in his faith. Alice Weaver Flaherty, furthermore, writes about hypergraphia as both a neurologist and a writer–and a patient.

Altogether this makes for a pretty fascinating beginner reading list:

Flaherty, Alice Weaver. 2005. The Midnight Disease: The Drive to Write, Writer’s Block, and the Creative Brain. Mariner Books.

Jamison, Kay Redfield. 1993. Touched With Fire (Manic-Depressive Illness and the Artistic Temperament). The Free Press/Simon & Schuster.

LaPlante, Eve. 1993. Seized (Temporal Lobe Epilepsy as a Medical, Historical, and Artistic Phenomenon). HarperCollins.

Ornstein, Robert. 1991. The Evolution of Consciousness (The Origins of the Way We Think). Touchstone/Simon & Schuster.

Sacks, Oliver. 1987. The Man Who Mistook His Wife for a Hat and Other Clinical Tales. Harper & Row.