Remaking Our Brains

April 15, 2015

This was the weekend of our annual Conifer Kiwanis Reading Celebration for the third-graders who attend six mountain elementary schools here in the Colorado Rockies, as well as a large consortium of homeschoolers.

Before we honored the kids for their reading improvement, I gathered close to 90 third-graders on the floor around me and urged them to make reading central to their lives. I had poured thirty years of observation and research into my 1992 book, TV on Trial; one of my main doctoral concentrations had to do with the relationship between reading and writing; and those areas have remained central to me throughout my academic teaching career. So I felt this occasion offered me a golden opportunity to plant seeds in these young minds.

I pointed out to them that there are two ways they can feed their brains: Reading and Electronic Imagery. Reading has been with us clear back to ancient times, but most significantly since the advent of printing, some six centuries ago. Electronic imagery is much more recent: around the turn of the twentieth century with the advent of moving pictures.

Today, electronic imagery has become so ubiquitous it increasingly has pushed reading onto the ropes, with some even questioning whether it can survive at all.

So, I pointed out to the third-graders that there are two significant differences between reading and electronic media: reading is a creative process, while electronic imagery tends to be creative only for those who create it; and reading is connotative. In other words, every time a person opens a book and begins reading, something exciting happens: that person’s brain shifts into its creative gear as the reader cranks out non-stop inner imagery that has the potential to actually change the brain into a powerhouse.

I introduced two contrasting ways of processing words: “denotative” and “connotative.” Denotative has to do with the dictionary definition of a word. Let’s take, for instance, the word “father”; the dictionary definition is “a man who has begotten a child.” That’s all there is to it.

But the connotative process is so explosive it borders on the mind-numbing, for it has the potential, over time, to remake the brain. I pointed out that as you read the word “father,” if you have a loving father you adore, the mental image you create will tend to mirror that; but what if you have an abusive father? That would contribute to a much darker mental image. And no two readers ever create exactly the same mental imagery from the same words! For each individual is one-of-a-kind. That is why cloning would be such a terrible thing. As a person reads, word after word after word triggers the creation of mental imagery in the reader’s brain. So much so that just one book has the potential to create seismic differences in the reader’s outlook on life. But that’s not all, by any means. Each author writes in a different way from other authors; this is why Google enables teachers to catch plagiarists so easily, and why it borders on the impossible that an anonymous writer can long remain anonymous. The reader reads works by Alcott, Tolkien, Blume, Milne, Seuss, Hemingway, Tolstoy, Twain, or Martin Luther King, Jr.; those stylistic differences are stored in inner templates, each of which may be drawn from when the reader begins to write herself/himself.

Much depends on whether the reader ranges through a wide variety of books, stories, essays, etc. written by authors worth reading, or stalls out on mental pablum; the former reader is likely to develop into a powerhouse, the latter into straitjacketed narrowness.

* * *

But what if individuals read no books and little of anything else, and instead feed their minds with electronic imagery (the norm for untold millions today)? What happens to their minds? When one is watching television, cinema, video, or other electronic genres, whether one person is watching a given source or a billion people are watching it, every last one is internalizing the same picture! Reason being: the receiver’s brain has had nothing to do with the image’s creation—someone else did that. In fact, the receiver’s brain is completely bypassed: BAM! The image is blasted into the receiver’s brain. But it is not truly internalized, for it is a foreign object, a self-standing entity that just sits there. Over time, as these foreign objects take up more and more space in the receiver’s brain, that person all but loses the creative potential he or she was born with.

In the collegiate freshman composition classes I’ve taught over the years, I’ve seen these two species replayed again and again. I tell a class, “Take out a blank piece of paper. We are going to write. . . . Now write!” It matters little whether I give them a subject to write about or let them choose; the results are the same each time. The reader, having the internalized imagery of many authors’ books and stories, stylistic templates too, synthesized into the memory banks, can hardly wait to start writing, and then the pen races across the page. The non-reader, almost invariably, just sits there glassy-eyed, like Bambi on ice. Since there is precious little in such students’ brains that wasn’t created by someone else, there isn’t much they can draw from. And since they don’t read, they don’t know how to write either. Structurally, they are equally at sea: because electronic imagery explodes at them from all directions, little of it structured, their thought-processes tend to be equally unstructured and disjointed. This is also true when they speak in public.

Furthermore, even in the business world, non-readers are handicapped. Studies have shown that when CEOs test applicants to see which would be the best fit for a job, the applicants are often given a task composed of, say, five steps to carry through to completion. Deliberately and without announcement, the CEO leaves out a step. So an applicant is meant to move from step to step: A to B, B to C, C to D, D to E, and E to F—only the D-to-E step has been left out. The reader reaches this abyss, is puzzled, but doesn’t give up. Since the reader has developed a part of the brain scholars call the “library,” in which the brain talks to itself, the applicant, much like a spider, launches filaments out into the void, seeking a terminus on the other side. Sooner or later, one of the filaments touches solid ground; the applicant now bridges to the other side, moves from E to F, and completes the task. The non-reader never can complete the task. Even when both applicants are college graduates with 4.0 grade-point averages, the results are still the same. A neighbor of mine, an executive himself and a veteran administrator and employer, exclaimed when I shared this study with him, “So that’s it! I’ve long wondered why some top graduates could problem-solve and others failed so dismally. It makes sense!”

* * * * *

Sadly, our society has yet to recognize just how essential reading is to life and career success, even in areas not generally considered to demand a strong reading background.



August 20, 2014

Why indeed?

This was the question Washington Post columnist Brigid Schulte tackled in her August 10 column in The Denver Post.

Schulte notes that “We Americans work hard. Weekends are more like workends. We sleep with our smart phones. And we think vacations are for wimps. So we don’t take them. Or take work along with us if we do.”

We are indeed a nation of workaholics; in fact, we are the only advanced economy with no national vacation policy. One in four workers, typically in low-wage jobs, has no paid vacation at all. Those who do get, on average, only ten to fourteen days a year. Europeans enjoy twenty to thirty days of paid vacation every year.

Terry Hartig, an environmental psychologist at Uppsala University in Sweden, maintains that “when people go on a relaxing vacation, they tend to return happier and more relaxed. . . . And those mellow, good vibes spread like ‘a contagion’ to everyone you come in contact with. . . . Send everyone away on vacation at the same time [as is true in Europe], and that contagion takes off through the population like a viral happiness pandemic.”

Hartig and his colleagues conducted a major study based on the incidence of anti-depressant prescriptions in Sweden from 1993 through 2005. They discovered that the more people took vacations at the same time, the more sharply prescriptions dropped. True for men, women, workers, and retirees alike. Since 1977, Swedish law has mandated that every worker be given five weeks of paid vacation each year (and workers may take four of them during the summer months). “The benefits,” maintains Hartig, “are huge. Not only is the society measurably happier, but workers are more rested and productive, relationships are closer and people are healthier. And depression is a very costly disease.”

Depression alone costs the U.S. economy an estimated $23 billion a year in lost productivity.

* * * * *

We were not created to run non-stop, but rather to take time off from work at least once a week. Scripture mandates Sabbaths during which we may regenerate. Longer Sabbaths were also mandated periodically. Multiple studies have confirmed one universal truth: Those who work non-stop soon reach the point of diminishing returns. The more hours they put in on the job the less effective they are, the staler their ideas are. So employers who work their employees to death end up losing even more than their employees do.

Furthermore, unless you frequently get out of your workplace squirrel cage, you never gain fresh ideas at all, but merely recycle increasingly outdated concepts and methods.

So back to Hartig who notes that, in Sweden, “It’s like there’s this national agreement that it’s vacation time, and work will be left aside. So instead of working and being distracted and busy, people get outside. They do things they like and enjoy. They see friends, visit their aging parents, or finally have time for that cup of tea with a friend who has been blue.”

* * * * *

America continues to pay a terrible price for our workaholism. The current epidemic of depression and suicides ought to be a wake-up call for us.

We must take time to live!

Don’t Make New Year’s Resolutions This Year

January 1, 2014

Yes, I mean it: skip that annual ritual, for the obvious reason that we rarely keep our resolutions anyway; we can no more keep them than we can stick to a diet long-term. Just think backward through time: how long has it been since you actually followed through on a list of New Year’s resolutions?

Recently, Success magazine postulated that perhaps we ought to take a different approach: zero in on one perceived deficiency in our lives at a time. For instance, we’re all aware that Americans are killing themselves at an unprecedented rate thanks to our sedentary lifestyle. Studies have confirmed that the more we sit, the sooner we die; the longer you stare at a screen without getting up out of your chair or off the couch, the shorter your lifespan. We know this, but most of us fail to act on it.

So, how about, this year, scrapping the resolution list and concentrating on a small thing? Just one small thing. One thing you could concentrate on, and stay with long enough to make habitual. If you make it habitual, you will have conquered the problem. Suppose, for instance, you determine not to sit down anywhere for longer than thirty minutes at a stretch (ideally, it ought to be fifteen minutes, but it is wise not to settle on anything you’re unlikely to follow through on).

Let’s say the phone rings. Rather than reach for it, you stand up, retrieve it, and walk around until the conversation is over. That would represent one huge way to deal with the issue.

How about determining to take control of electronic intrusions that are gobbling up your creativity, and not coincidentally weakening your job performance? Electronic email and text-messaging beeps have become the new tyrants in our lives. What if you relegated all but the essential ones to a back-up holding pattern, to be dealt with when the day’s main demands and opportunities had been met?

Such single determinations may seem small, but in reality they are anything but! Reason being: there is no small anything in life. Tackling deficiencies in our lives, just one at a time, has the potential to revolutionize our life’s journey and dramatically increase our creative potential. If you are interested in a fuller exploration of this subject, I suggest you pick up a copy of my new book, Christmas in My Heart #22 (Pacific Press, 2013), and study my novelette-length story, “Yesterday, Today, and Tomorrow.” In it, in a fictional format, I deal with the fullest exploration of motivational stories, mantras, poems, quotes, etc., that I have ever discovered. It is extremely unlikely that I will ever take the time to do this again, given that the gestation of the story stretched over virtually my entire lifetime (with one year concentrated on the evolution of the 20,000-word story itself). Oh yes, all you romantics, the story is also a Christmas love story.

* * * * *

So welcome to a new year, all you cherished members of my extended family. I’m honored that so many of you take the time to check out the latest blog each Wednesday of your lives. I do not take this weekly decision on your part lightly, and will continue to do my level best to be worthy of it.

A blessed 2014 to each of you!

Why Are We Americans Becoming So Dumb?

December 11, 2013

It was while listening to the “Sunday Morning” broadcast that I was jolted into shock by one segment. In it, a program regular admitted how traumatized he was to discover that his increasing use of electronic gadgetry such as smartphones and apps was destroying his brain. The catalyst was his rueful discovery that he couldn’t even remember his wife’s phone number without retrieving it from an electronic index. Even more horrifying: he realized he could no longer remember how to spell common words such as “spatula,” no matter how much time he took to probe his mental memory banks. The same was true with mathematics: the electronic crutch ends up crippling the ability to do even simple math. Witness the number of individuals at restaurant and store checkout stands who are incapable of making correct change unless the machinery does it for them!

Every time I look at new lists of intelligence rankings (by nation), I wince as the U.S. continues to slip ever further down. Long gone are the days when we led the world.

In November and December of every year, I spend a large percentage of my time at book-signing tables, often with fellow Kiwanians at my side because of our literacy program for area elementary schools, which for eleven years now has targeted third-graders. Reason being: studies reveal that unless a child falls in love with reading by the third grade, it’s not likely ever to happen at all. Because of this program, we are often permitted to set up our tables in large supermarkets, so we have plenty of time to watch people, young and old, as they come into and leave these chain stores. More and more often we are noticing a disturbing new phenomenon: children who are connected to electronic gadgetry tend to pay no attention to the books on our tables—or anything else, for that matter. And even when electronic gadgetry is not a variable, we’ve noticed that it has almost become the norm: when an approaching child’s eyes light up at the sight of books, almost invariably it turns out that the child is a homeschooler.

Even as I was watching this most recent “Sunday Morning” broadcast while simultaneously signing complete sets of Christmas in My Heart books, I belatedly realized that the books were taking twice as long as normal to inscribe, that my memory was fogging over, and that my accuracy was continuing to deteriorate. Finally, I had to leave the room where the TV set was on so that I could complete my signings in the time I had allocated for them.

All this causes me to question many of the so-called benefits of technology: if electronic gadgetry continues to erode our abilities to read, comprehend, articulate, write, understand, and effectively utilize abstract thought, then might we as a nation be paying way too high a price for so-called progress? Might it also turn out to be that it is not progress at all? But rather, the reverse?

In the thirty years of research poured into my 1993 book, Remote Controlled, I discovered that the more time an individual (of any age) spent watching TV, the dumber that person proved to be, and the more muddled the brain. By extension, might it not also be true that overexposure to electronic imagery other than television will dumb down receivers’ brains even further? Reason being: those who receive pre-fab imagery (created by someone other than the receiver) rather than connotatively creating imagery through reading, being read to, radio, or live drama end up incapable of communicating effectively in either oral or written form. Only the reader, it turns out, is capable of writing coherent sentences and paragraphs. Non-readers, having little in their brains that is original to them to draw from, find it almost impossible to write anything creative or coherent at all!


In last Wednesday’s blog, I touched on a number of things about formal education that are good, positive, and helpful growth-wise. In this week’s, we’ll deal with formal education’s downside. Since I’m a product of homeschooling, parochial education, state-university education, and Ivy League education; since I’ve taught in junior high, senior high, junior college, college/university, and adult education; and since I’ve worked in independent research, editing, and writing, I feel I can now approach formal education objectively.

First and foremost, formal education is not the real world; each segment of it is a self-propelled entity bordering on virtual reality. Thus it is a grave mistake to assume that academic success will equate with real-world career success. In fact, the two are not very compatible. Let me explain:

Once your parents enroll you in formal education—let’s say kindergarten—it’s like an assembly line or car wash: your own engine is left on a siding, for it won’t be needed for a long time. Year after year, your teachers and administrators will be your engineers; all you have to do is follow orders. Over time, you become ever more subservient to these academic demi-gods who have such awesome power over you; if they dislike you, they can cripple your future career by lowering your grade or failing you outright, for grading is one of the most subjective and least-understood things on earth—paradoxically, even among educators themselves.

But what happens when you graduate at last and enter the job market? What all too many discover is that their own engine has remained on a siding for so long, it’s all rusted out. They no longer know how to be self-propelling. Many never do get the old engine up and running again; in such cases, they either accept other-directedness or find some job position in academia, the only world they understand. And some (a real serendipity to school administrators and business managers) become perpetual students: always learning but never putting their learning into practice.

Also, in degree areas that ostensibly equate with the real world (such as business, management, economics, technology, and engineering), there is invariably a significant gap between cutting-edge developments in the real world and academic catch-up. For instance, schools of business are now reeling because the template on which they based their degrees has dramatically revealed its obsolescence in today’s plunging, roller-coasterish stock market, in recessionary times when no one is perceived to have the answers any more: not Wall Street, not economists, not pundits, not talking heads, not overseeing bureaucrats—not even that erstwhile golden boy of investors, Warren Buffett. Least of all, academia.

Another weakness of formal education is that it is so stratified and straitjacketed by regulations that it more often than not fails to adequately challenge eager learners. All too often, especially in elementary and secondary education, it degenerates into a form of social homogenization and control. If a teacher has 25 to 35 squirming bodies in a given class, s/he cannot possibly do justice to each one; therefore administrators will, more often than not, judge teacher performance by classroom discipline (which is far easier to measure).

One significant weakness of formal, regimented education is that it makes no room for side trips. You are told to study certain things, and if you regurgitate them according to the teacher’s expectations and demands, you may be awarded an A. Thus, if I am taking a literature course and am told to study only one play by Shakespeare—say, King Lear—there is no incentive for me to also read Hamlet or Richard II. But if I am taking only one literature class at a time, or being homeschooled, or reading on my own, I can, while I’m at it, read Shakespeare clear through. Which I’ve done. But not while taking a full load in an academic institution. Actually, I’ve experienced far more mental growth taking just one class at a time than I ever have taking a full load, where I have to rush just to keep up with the teacher’s reading demands.

Also, formal education is hard on individual creativity. In the vast majority of instances, you are rewarded not for creativity but for conformity to the demands of the teacher or the system. Mavericks are tolerated at best. Those who tend to think outside the box are not generally popular in academia—unless you’re a MacArthur or Fulbright scholar, of course.

I guess what I’m getting at in this blog is this: I am not suggesting that we throw out the proverbial baby with the bathwater. What I am suggesting is that we realize up front that academia cannot be considered “real world”—that is not its function. Thus, if you wish to be truly successful in real life, you must continue to keep your own engine in good running order, with plenty of independent side trips to give it exercise. Parallel to your formal education ought to be a major emphasis on personal growth (based on such things as voracious reading and journaling from books, magazines, and newspapers; judicious use of the media; travel; lectures; personal inquiry; research; writing; etc.). If you do these things, you will have a counterbalance to the dependence that invariably results from grade-dominated formal education, and you may end up with the best of both worlds.


It is both wondrously simple and unbelievably complex: this mind God gave us—that either works, or doesn’t; sings—or sputters.

Famed obstetrician Dr. Frederic Loomis, in his memorable books The Bond Between Us and Consultation Room, noted that in his long medical career (early to mid twentieth century), delivering over 3,000 babies in California’s Bay Area, the new mother’s first two questions were invariably “What is it?” (in those pre-ultrasound days, the baby’s sex was unknown until its birth) and “Is it . . . all right?” Invariably the second question was the one asked with the most trepidation.

So it is that if the answer is positive, and the baby is “all right,” the stage is set for the most incredibly rapid rate of brain growth the child will ever experience in life: not sipping life but swallowing it in gulps, gallon by gallon. It is during this period of life that the child’s non-stop fusillade of questions about everything drives parents crazy. This period doesn’t ebb until around the age of six, by which time, it is said, we’ve learned half of what we will learn in life. I must qualify that assumption by adding: half of what we need to know in order to function as human beings.

But there is no valid reason why this learning curve should not continue throughout life—unless. . . . And it is this “unless” that is a tragedy for our nation. The tragedy has to do with the disconnect between parents who are so euphoric that their babies are “all right” and those same parents impatiently squelching, if not outright suffocating, the learning process once it has begun. How? By responding to the little question-machines with, “Don’t bother Mommy! Can’t you see I’m busy? Go bug Daddy!” Or, “You and your interminable questions—you’re driving me crazy!” “Give us some peace, and shut up, for Pete’s sake!” Or, the most deadly cop-out of all, “Oh, go watch TV, and leave me alone! No, I don’t care what you watch, just get out of my hair!”

And so that God-given creativity is blighted, and begins to shrivel up and die.

It’s that simple.

That dying of the once-aspiring mind is accelerated by another tragedy: the wholesale annihilation of print in the home. No books, magazines, or newspapers to be found anywhere—only an impressive stack of electronic gadgetry that attaches its tentacles to the child like so many octopus suction cups, draining away what creativity is left.


How does electronic imagery do its damage? By bypassing the receiver’s brain and blasting in, like so many moment-by-moment howitzer shells of pre-fab information created and packaged by someone else. Just a few of those result in little damage to the receiver’s creative process; the problem in today’s electronically obsessed society is that the bombardment continues day after day, week after week, month after month, year after year, and decade after decade. And the more of this pre-fab imagery that stacks up in the mental archives of the receiver, the less likelihood that anything created by the receiver’s own brain will be left. Over time that person becomes what sociologists label “other-directed” rather than “inner-directed” and ceases to function as a creative force at all.

I built a foundation for this new series of blogs on education and creativity with Blog #16 (March 10) – “Little Boy Blue”; Blog #17 (March 17) – “Non-reader’s Doomsday”; Blog #18 (March 24) – “Miracle in Silver Spring”; and Blog #19 (March 31) – “The Child is Father of the Man.” In coming weeks, we shall continue to explore this vital subject.

* * * * *

Every young man and woman is now a sower of seed on the field of life. Every thought of your intellect, every emotion of your heart, every word of your tongue, every principle you adopt, every act you perform, is a seed, whose good or evil fruit will prove the bliss or bane of your afterlife.

—Stephen S. Wise