Tuesday, October 25, 2011

Finding ways to mark the time, as it flies by

Something's wrong down at my dentist's office. Somebody or something has slipped up. Maybe their database got erased or a virus infected their computers ... or, in their case, an abscess. Regardless, I haven't heard a word.

For years, as regular as, well, as a trip to the dentist is supposed to be, I have received a little reminder in the mail, maybe even a phone call, to tell me it is time for my regular six-month visit. We are going on 11 months or so now, and not a word. Nada. Zilch.

And those visits were one of the ways I marked time, so it has kind of thrown me off. I like marking time and noting those things that help us do so.

Marking time is using little benchmarks to monitor how fast the time is flying by, or how slow, or what is coming next. Sure, a calendar is fine, but there are so many other subtle ways that I have found that help me keep track of time, and myself.

Lugging yellow sacks of water-conditioning salt down to the basement, for example. About once a month, Spouse will ask me if we are low on salt. She, being the one with sensitive skin, notices these things. And about two weeks later, I do the lugging. So, I mark off about every six weeks with a conditioning salt exercise.

The beginning of a new school year. Commencement. Not getting nearly as much done during the summer as you were hoping to. These are the ways we mark time, tick off the seasons of our lives.

After a doctor's visit, Spouse felt compelled to buy me one of those week-long pill boxes, with space for all the pills needed each day. (Yes, unfortunately they are starting to add up.) I'm not kidding you — all I do is fill up those little mini-boxes with pills, I swear. I no sooner get all seven spaces filled, plus the ones in my hand for that evening's gulp, and I'm doing it all over again. If I ever get sentenced to 20 years in a Turkish prison or have to spend two weeks at my in-laws', I am going to insist on having one of those pill minders — that's the perfect way to make the time just fly by.

Little pill boxes have become one of my main ways for marking time. Dang it all.

You used to be able to mark the time by the beginning of the new television season — remember that? There was a time when it was an organized moment in time for three (three — ha!) networks. Or how about "the new car season," which always happened at the same time each year? Now anniversaries of the bailout will be the only way to keep track of time relative to the auto companies, I suppose.

I had to buy a new blow-up pool for the backyard a few weeks back. Boy, you can set your clock to that one. Once a year, steady as she goes.

The wildfire season. Now there's one that rolls around regularly. Set your calendar in your brain by the ominous tones of the 6 o'clock anchor warning us that this could be a bad wildfire season. Well, duh. Yes it could. Or not.

Or how about those stupid end-of-the-newscast video clips of the running of the bulls? Or of big wheels of cheese being rolled down hills in England, while idiots run after them, falling head over tea kettle? Regular as fireworks on the Fourth. And a way to mark time.

When I was a youngster, I could mark time by getting a new pair of eyeglasses. It happened every, um, late August, just before school started. We made the trip to Logan for updated school clothes and new glasses. With a sharper prescription, I was ready for the school year and could now see clearly the tether ball as it smacked me in the face, breaking the glasses at the bridge of the nose, and necessitating the white-adhesive-tape look that graced my face most of the year.

Lately, I have been marking time by how often I get to Pickleville Playhouse, by watching the level of Bear Lake go up or down, by how few movies I get to that aren't animated and by that crick in my back that comes and goes. We could all mark time by the number of newspapers going out of business or by how many trillions the deficit is in today.

I even caught myself marking time recently as I counted how many U.S. presidents I could remember. Hint: the rising generation doesn't get the old "where were you when JFK was assassinated" bit. Don't even go there.

I'm starting to see sons and daughters of students I once taught come through my office, for heck sakes.

Which just goes to show that regardless of how you mark your time, you've lost it.

Rules of thumb? They work most of the time ... as a rule of thumb.

No one can pin down exactly where and when the phrase “rule of thumb” was first used and what it was intended to mean. It first showed up in the written word in 1692. Some think it comes from cooks and brewers using their thumbs much like a new mother uses her elbow to test the temperature of cooking liquids. Probably not.

Horse heights have long been measured by “hands.” An old tailor might have once said “twice around the thumb is once around the wrist.” Many people learn to make a stride that is approximately a yard. So using the space from the tip of the thumb to the first knuckle as “about an inch” is probably the best origin for a rule of thumb.

For you and me, it is a metaphor for some day-to-day tidbit that comes more from experience than from science. It’s one of those things that nine times out of 10 is dependably accurate — a ballpark estimate, a likelihood.

The first year I was in Logan, an aging neighbor told me not to plant tomatoes as long as you could see snow on the Wellsville Mountains. Well, my experience has been that that advice is a tad conservative — some nooks of snow can be seen long after the last frost that might threaten a garden — but, you know, it probably is a good rule of thumb.

Some say that to get a rough estimate for the temperature outdoors, count the number of times a lone cricket chirps in 15 seconds and add 37. Try it.

Here are some more rules of thumb that I am noticing or have heard others express recently:

— Meetings will always last longer than expected.

— Textbook publishing, oil companies and McDonald’s are recession-proof. I’m just guessing on the first one based on what I see in my profession, but as far as the final two go, it’s true. You know what the oil companies are doing to us, but did you know that, through good business management, McDonald’s profits went up during the current recession? Higher profits for them than in 2005 or 2006. You might not like iced coffees, but somebody apparently does.

— National health care will be impossible, because of the cost. When Medicare was introduced in the 1960s, health care costs ballooned. Why do some think that just throwing more money at it will solve the problem this time?

— College sports teams always recruit “the best group ever.” Year after year.

— Hollywood will never have your best interests in mind. Oh, once in a blue moon you may catch yourself saying, “That one wasn’t so bad,” but don’t think for a moment that the popular entertainment media want to match your values and interests to theirs. We’ve been lulled into handing Hollywood the keys to our cars with basically no regulatory ropes to slow them down. Our standard of measurement has become a comparison to how bad it could have been, rather than how good it should be.

— It’s been said that the puppy to pick from the litter is the one whose tail wags in sync with its stride, a sign of calmness.

— If a person is recalling a real situation, his or her eyes tend to go up and left. You might consider that the person is making something up if his or her eyes go down and right.

— State-owned buildings will find themselves lacking in some building code or earthquake code or fire code, within five years of completion.

— Too much money and fame at too early an age screws up your head. Just look at the Spears sisters, Miley (the fall is inevitable, trust me) and other Disney/Nickelodeon pinups for an understanding.

— Gay rights issues will always seem to dominate the news because, well, no one wants to make a fuss and become the next target, including the media.

— Alcohol makes people do stupid things.

— Fish have vertical fins, while mammals of the sea have horizontal ones.

— Ten people will raise the temperature of a medium-size room one degree per hour. (That must be why I get so warm sitting in church ... or perhaps it is the message hitting home. Hmmm.)

— For every 10 people on a committee, three will do nothing, three will say they will do something and never do it and three will do the work of the committee. The final one will never attend another meeting.

— Stock market advice should be ignored ... at least 50 percent of the time.

The same, some would suggest, could be said for reading random rules of thumb. The problem is choosing which half to ignore.

Technology changes often widen generation gaps

It's a topic that I could almost be accused of obsessing over. When I hear a fact or even an off-hand comment that relates to it, I perk up and get sweaty and start thinking way too much about it.

I'm doing my best to come to grips with the whole technology gap between generations and, more specifically, how technology is affecting/changing the rising generation.

Just this past week or so, for example, three things in this regard slapped me in the face. First, an associate scanned the demographic data of incoming college freshmen and noted, "Hmm. It's kind of weird to see that a lot of students coming in were born in the '90s and not just the '80s." Then I saw some statistics regarding the exponentially fast infusion of so-called new technology and social media-related technology into our culture.

Then I heard a student make an off-handed and clearly naive comment regarding 9/11 to another student: "Oh, it's September 11. That's the day that changed everything, haven't you heard?" And he said it with a tilt of his head, a lilt in his voice, a smirk sending it on its way, as if to say that he wasn't buying.

These, you see, are my worries, my near-obsessions.

It's not like I crossed the plains or anything. I'm not from "the old country" and couldn't be called out-of-touch. But I analyze the current culture shapers and see that they have never known a world without online predators, identity theft or Osama bin Laden and the Al Jazeera network. This is common knowledge and a common occurrence to them.

They've never known another definition of gay, assuming it has always been about sexuality, and are beginning to similarly lose the old-fashioned definition of pride. They would snicker to hear "The Flintstones" theme song, assigning the wrong inference to some of the words. They can't imagine that their parents once used the word thong to describe an innocent piece of footwear and are aghast when we slip and do so.

They've never not worn their hat backwards, and the vast majority of their "friends" were made online and live at one Web site. They've missed the experience of watching a television in a cabinet about the size of a refrigerator, with a tube about 15 inches across. Their preference is a screen about the size of a large postage stamp.

They assume music has always had warnings to parents about lyrics. They've missed the sound of a newspaper hitting the front porch. They've never popped popcorn with hot oil and a pan rather than in a bag in the microwave. To be without a phone is death in boiling oil.

And while there have always been cultural gaps between generations, technology seems to be driving this gap — and I think this is the crux of my obsessive thoughts — to new, wider proportions. Heck, the gap between my parents and me started with hair and ended with The Beatles. That's about it. But from the vantage point of where I was going, I could still see where they were coming from. While there have been generation gaps since the dawn of time, the electricity-like speed of change now occurring in our lives has widened and deepened that gap, and without real effort it will be harder to close and communicate across than ever before. I'm convinced of it.

Some have compared the current evolution of high-tech, quick-turnaround social media, for example, to the development of the printing press or air travel. And they're not kidding.

Currently, 96 percent of 18- to 30-year-olds are on an Internet social network. It took 13 years for television to reach the 50 million audience mark. It took Facebook eight months to reach 100 million. There were 1 billion (that's with a "b") downloads of iPod applications in the first nine months of their availability. If Facebook were a country, it would be the fourth largest country by population in the world. It is routine for 18- to 24-year-olds to skip watching something on TV, because they can see it on their computer, iPod or phone the next day instead, 70 percent saying they regularly watch TV on the Web. And — are you sitting down? — the average 21-year-old American has played 10,000 hours of video games and received or sent 250,000 text messages. The number of text messages sent daily exceeds the population of the planet.

Are you obsessing yet?

And I'm OK with technology. All of this doesn't worry me until the smirks and rolling eyes widen the gap even further by perpetuating this concept that anyone not on Facebook has nothing to offer; that anyone who prefers to read a book rather than listen to one on a podcast or skim the highlights on their phone must be an idiot.

The strong glimmer of hope in all of this for me is the thought that, well, I turned out OK. Even with all the gaps and conflicts and changes, my generation is pretty dang capable. We didn't mess the whole world up too bad. And I can be at peace and take a breath and whisper to myself that if we could do it, these naive yay-whos can do it, too. Hearts of children will always eventually turn to their fathers and vice versa, drawing out the best of each in the process.

I hope.

Patch me up — I need some repair

It was just a quick one-liner on the radio news the other day: Some improvements had been made on a dementia patch, a slap-on medical patch, not unlike a nicotine patch, to slow down the onset of dementia.

It got me to thinking — yes, those of us with the early signs of dementia can still occasionally think — about what untapped uses might still be out there for patches, officially known as transdermal drug delivery.

The first transdermal patch was approved by the FDA in 1979 for the treatment of motion sickness. Since then, pharmaceutical companies have found many uses for this drug-delivery method, with nicotine patches (to help smokers quit the habit) being the most prevalent.

Other patches include estrogen for menopause issues, nitroglycerin for angina, lidocaine for shingles, drugs for chronic pain and even some deliveries for treatment of attention deficit disorders.

The skin is actually a good barrier, though, something designed to prevent things from entering the body. Some medications don't pass through the skin and into the body well. So not everything can utilize a patch-type delivery.

But, hey, what if there were a jerk patch? Something we could slap on rappers that jump up on stage and claim others should not have won an award or for that kid around the block who is convinced the road is made for his skateboard, not your car — you know, things like that.

I suspect someday we will see a true, working obesity patch. Someone will come up with some hunger-deadening drug that can be delivered all day to keep you away from Chuck-a-Rama. Mark my word. It is coming and you would be wise to put a few bucks in that stock, I'm sure. Combined with the vitamin patch, the sugar-nullifying patch and the V-8 vegetable patch, you'll be on the path to good health. It will work because we won't have to make conscious decisions any more. That's got to help.

Road rage patch. This might take years of development, but it would work like this: Any time someone is going more than 10 miles over the speed limit and they still insist on flashing their lights at the car in front of them because their life is more important than anyone else's on the road that day, a shock from the patch would jolt them back to reality and twitch their right foot backward a bit.

A vanity patch. This patch would sense the amount of makeup, perfume and jewelry being applied and would force a slowdown when it reaches toxic levels. A close cousin to it would be the "You're not as important as you think you are" patch, which would be used exclusively in Hollywood and the film-making industry, as well as by those who make more than $100,000 a year but have never owned a lunch bucket, dispensing liberal doses of reality.

A tattoo patch. This patch increases the pain felt by the recipient of the tattoo, perhaps suggesting to their brain that "one more tattoo" is not as good of an idea as one less might be.

An overspending patch. Not sure how it could work, but I could sure use it.

A patch for news junkies. For example, I'm surrounded by supposedly up-to-date individuals — I mean they twitter and tweet and everything — who don't know who the secretary of state is, who couldn't pass the new immigration naturalization test if they had to, who have no idea how to listen to several opposing points of view with the understanding that it is now their responsibility to make the best choice for society. So, being outnumbered, I must be wrong. Those like me who actually think current events have a place in their life might use the news junkie patch to calm themselves down and walk through life with fewer concerns about the future. The patch could also cause numbness in the ears during the local Sunday night television news, sparing the wearer blood pressure issues when stories of cats, squirrels and viral videos are actually presented as news, stretching the newscast for a dozen more commercials.

Imagine patches for gambling addiction, pornography addiction and alcohol addiction (some are actually being developed and tested for this), patches that actually worked when people began sliding down slippery slopes of self-control.

Wait a minute, here. Self-control? Who would have thought of that? Will that actually work?

Seems my dementia-slowing patch is still working. I can still muster up a clear thought or two.

Whew! Let's get this decade over with

From December 2009, which, ironically, is not really the end of the decade, if you stop and figure it out ... but that's another column, I suppose.


I was trying to figure out how to delicately approach the subject when Time magazine did it for me. Did you see their cover story two weeks ago?

The Decade From Hell.

Yes, this one. The one about to end.

Time -- and others -- have collectively decided that the 10 years from Jan. 2000 to Dec. 2009 are "the most dispiriting and disillusioning decade Americans have lived through" since World War II. Others have called it the Decade of Broken Dreams or the Lost Decade.

I agree. I was halfway through a column of grinching and grumping (Note to self: New Year's resolution ought to be less grinching and grumping) about the past 10 years when Time reminded me of "what went down," as the kids say. Let me just throw out a few nouns, proper and otherwise, and see what memories they spark for you. (You can put this to music if you want, maybe Billy Joel's "We Didn't Start the Fire.")

Y2K. U.S.S. Cole. Hanging chads. Bridgestone/Firestone tires. Chandra Levy. Elian Gonzalez. Concorde crash. Do you remember how many days the Florida recount went on, with the Supreme Court deliberating as well, before "W" was declared president? Thirty-six long days.

iPod. Steve Jobs. Osama bin Laden. Shanksville, Penn. Kabul. Bunker busters. Al-Jazeera. Patriot Act. Anthrax. Harry Potter. Enron.

John Walker Lindh. Taliban. Daniel Pearl. Chechen terrorists/Russian theater. WorldCom. Simon Cowell and Paula Abdul.

Space Shuttle Columbia. Colin Powell. Shock and Awe. Martha Stewart in "prison." CIA and WMDs. Mad-cow disease. Saddam found in a hole. John Kerry, Howard Dean.

Wardrobe malfunction. Abu Ghraib. Scott Peterson. Chechen terrorists/Russian school. Indonesia tsunami, 200,000 killed in a dozen countries. MySpace and Facebook. Peter Jackson and "Lord of the Rings." Madrid bombings. Gay marriage. Viktor Yushchenko poisoned for his politics. Ringtones. Swift Boats.

New pope. Steroids and sports. Katrina. FEMA. Dikes and levees. John Roberts. Avian flu. Amish school killing. Donald Rumsfeld. Benazir Bhutto assassinated. Virginia Tech shooting. Larry Craig. California wildfires. Sarah Palin. Stem cell research. Wii. Nancy Pelosi. Myanmar monks. Botox. Michael Phelps.

Rod Blagojevich. Dow Jones drops 34 percent in one year, your retirement vaporized. Ahmadinejad. H1N1. Bailouts and stimulus.

And that's just the big stuff. Throw in stuff from Utah, your community, and that dang ingrown toenail of mine that won't go away and it has been a tough 10.

But I didn't need Time to tell me and neither did you. I knew it was the decade from Hell when I saw that "Big Brother" had been on TV for 11 seasons. I've not seen an episode, but I know enough to know that if that reality show is your reality, you're in big trouble.

I knew it when Pres. Bush opted for an all-out invasion of a non-threatening country -- led by a dictator, it's true -- rather than opting for a simple cyanide pill or the death-in-old-age option to dispose of the mad man.

I knew it was the decade from hell when I heard that 5 percent of deaths in the United States were from heart attacks 40 years ago, but 35 percent of adult deaths are from heart attacks in 2009. Yikes.

I knew it before Time when I read that the newspaper industry -- the Fourth Estate, watchdog of government, your source for well-rounded understanding of the day's events, you remember, don't you? -- lost more than 15,000 jobs in 2009.

I knew it when Tiger Woods -- new name "Cheetah" -- was named Athlete of the Decade the same month that it was revealed he had as many mistresses as there are holes at Pebble Beach. Somehow he thought birdies were chicks, it seems, and he needed a bag full. And the president of Nike, one of his major sponsors, tells us that in a dozen years, all this stuff we now see as a problem in his character will only be "a blip on the screen" of his legacy. Aah, that's nice. I feel much better about him as a role model now knowing what the future holds for him.

I knew it when I heard a deep thinker describing the national debt in terms of dollar amounts per American per year … since the birth of Christ.

Do we have reason to hope, though? A new year brings renewed possibilities, doesn't it?

Well, the stock market seems to have learned its lesson and is only half-crazy most weeks. Mortgage rates are good, if you can get one. The International Space Station has got to be working right -- seems like all they do is send up folks to repair it. And I think people have caught on to Brian David Mitchell -- yeah, he's faking it.

Lance Armstrong has not been found to be doing anything underhanded. The top three movies of 2009 -- again, according to Time magazine -- are rated G or PG. My wife is getting a computer for Christmas (Shhh!) and my library card has not been revoked.

So a salute to Time: May your 2019 December issue not have an "s" on the word "Decade."

Is it time to change our lives?

Holiday season '09-10, I think

I got caught in one of those "use 'em or lose 'em" predicaments over the holiday -- you know, personal leave days that have to be taken or the folks in Human Resources shake their crooked fingers at you. Sitting home between Christmas and New Year's and buried under an inversion, I got caught in the "what's on TV in the daytime" trap.
I can tell you now what's on daytime TV. Ads. Period.
In the course of one made-for-TV movie that I drearily watched, three consecutive commercials caught my attention. One was for a weight-loss program, and they promised to "change my life." Right after that one came a diabetic supply source that had a testimonial that said using this service "changed my life." The third ad in the block was a technical school specializing in high-tech web-geeky stuff, even for those who didn't finish high school, and, sure enough, the guy said he "changed his life" and so could I.
Listen close and see how often this phrase is overused in promotion. In the course of three minutes, I was told to change my life three times. I had no idea people's lives needed so much changing. I wondered for a moment if I did two of the three things -- say, I went to tech school and then lost weight -- if my life would actually be the same; whether two about-face changes would bring me back to the same spot and cancel each other out, like Yogi Berra used to explain when he changed course of action: "I did a complete 360."
Change your life. Is this really a good plan? How many of us would change who we are -- really change -- if we could?
I've read that often when a passenger train in Europe has an accident or a severe derailment and when numbers and lists and tickets and such are compared, often a person will "come up missing," the assumption finally being made that they took advantage of the situation to "change their life"-- to start over, to leave the scene and try a "do-over." As only one example, just last year a Rice University student's car was found in California, with books on the front seat about "assuming a new identity," and he's still "missing."
Changing one's name is just one aspect of changing who you are. I've known people who legally changed their name -- not all that easy, really -- but they didn't change their identity by doing so. The New York Times recently reported that the rate of Iraqis changing their names has jumped significantly recently. It's mostly Iraqis who want to drop "Saddam" as a first name (kind of tainted now, even though quite common) and others whose tribal name pinpoints them as Sunni or Shiite and they want to avoid sectarian violence or revenge. The illegal ID business is bustling in Baghdad.
There are Web sites -- believe it or not -- that will help you assume a whole new identity. The opening paragraph on one site says, "Many people are changing their identity because they know that it's the only really effective way to walk away from your past financial and personal problems…" They promise a full set of identity documents and a 191-page guide to six "identity changing systems."
Another site is titled "How to Reinvent yourself and change your identity." Interestingly enough, both talk about the importance of finding and using a local cemetery. This is called paper-tripping, I learned. One site warns prospects to be careful with paper-tripping, stating, "… problem is, just how many other new identity seekers have visited that same gravestone before you?"
Give me a break! Wandering in cemeteries to get a new lease on life? What's wrong with this picture? If, in fact, this happens, and if it were to happen to me and my headstone, allow me to warn all prospective paper-trippers that I will get special license from St. Peter himself to haunt the everliving heck out of you. I will make it my life's, er, ah, my death's work to torment what little soul you might have. Got it?
As you approach a new year, I hope you feel the same as I do -- yes, we all need a few changes, a few improvements in our lives. But changes are not re-inventing or bailing out or causing deep pain to those close to you by disappearing, or paying for a new name from Guido on his Web site.
If you want to change your identity, start small. Join a gym, get a new hairdo, visit with someone in a confessional, get your medications checked and balanced, buy a DayTimer, drive to work along a new route every day, don't vote straight-party ticket next time, just to see what it feels like. Get to know your neighborhood better. Give up smoking. Ride a bike. Stop drinking pop. No snacks after 8:30 (that one's for me). Send more thank-you cards. Listen to something besides Celine Dion. Put a quarter in a jar every time you say a naughty word -- and let someone else decide what your naughty words are.
And, maybe, just maybe, we can all get comfortable enough in our own skin that we would fight the notion of changing our lives, regardless of where we get our diabetic supplies.

Making middle age snicker-proof

Before we go any further, you have to understand this: I’m not a dinosaur. I don’t have one foot in the grave or anything like unto it. Yes, a while back I was introduced to a group as “seasoned,” but, hey, I can forgive that. Salt and pepper I can live with.
But, boy, did I have one of those moments I wish I could take back. I forgot, just for a second, who I really was, or at least who I was surrounded by.
I was with a group of 20-somethings, the brightest and bubbliest of our rising generation, and they were planning a social activity. Well, let’s do this, some would say. No this or this, others argued. “Why not a ‘50s dance?” I chimed in.
For the next several moments, you could have tasted, cut and sliced, boxed and packaged the deafening silence. It was so quiet I could hear a faucet dripping down the hall, behind a double door. The ticking of clocks was almost overwhelming. Somewhere in the distance a dog barked. And members of this group kindly and slowly pushed their jaws back up into normal position.
(Now, understand my thinking, won’t you? I was reviewing in my mind successful activities of this ilk that I had been involved in and it was only a couple years ago that my church group had a wonderful ‘50s dance … for people my age. This was the little fact that I somehow forgot, the fact that didn’t work its way through my gray matter before my tongue took over.
(A ‘50s dance to this group would have been like saying to my age group, “Hey, let’s have a ‘20s Dance.” A ‘20s Dance? Why would we do that? That’s, like, crazy. Yup, and, thus, the deafening silence and blank stares.)
I was reminded of this painful embarrassment — but learning experience — while sitting in a doc’s waiting room two weeks ago. In a well-worn magazine there, there was an article entitled “How Not to Act Old.” The author, Pamela Satran, said, as she introduced a list of suggestions: “The point isn’t to behave like a 26-year-old. It’s to learn how not to act like someone a 26-year-old might snicker at.” She believes folks like me ought to avoid doing things that are snicker-worthy, and to a certain degree, she has a point.
She suggested things like “unstrapping that Rolex.” No one wears watches anymore, she says, as a naked wrist is emblematic of youth. This I had not thought of. Folks like me, she said, need to practice flipping cell phones open with one hand, too.
She suggested never leaving messages on voice mail or recorders. Young people don’t leave messages. That’s an “old” thing to do. Twenty-somethings just figure that the other person will see their number in a list of missed calls and if they want to reach out they will. Urgent message? Send a text.
Not worrying about exact change is another of her biggies. If you are digging through your purse to get that penny or nickel or a couple of dimes you know is there, well, you’re acting old. Who knew, eh?
Cooking roast, according to Pam, puts you right square into the “old” category. I’m glad my mother and grandmother never knew this, by the way.
Never spout history. Anything that happened before 2001, she says, really doesn’t matter to the average youth or young adult. I’m slowly coming to grips with this one. If you are starting any conversation with “where were you when (something important) happened …” or “I remember when…”, well, give it up. Save it for someone who cares.
Her little summary list got me to thinking about suggestions I might make to, well, my generation as to how not to appear old to those who have no idea what the term “long in the tooth” even means. For all I know she covers some of these in her book — I haven’t read it — but maybe we can learn together.
Here’s the one I have to work on: Don’t yell into the cell phone. Those of us who didn’t grow up with the dang things still don’t like the sound of them, don’t like the echo and gap in timing while talking, still consider them more of a walkie-talkie than a phone. For whatever reason I can’t keep a normal voice while talking in them. I have a friend — my age — who nearly chases you out of the room with his yelling when he answers his cell phone. We have got to learn to tone it down. Shhh.
Don’t call that thing hanging in their ears a “Walkman.”
Don’t try to explain that episode of M*A*S*H that something just reminded you of. M*A*S*H was a big part of your life, fine, but is absolutely nothing now. Same goes for Johnny Carson and Columbo. There is no connection whatsoever. Don’t even try.
Don’t call it “tin foil.” In fact, you might want to get over your need to wrap and cover and even use aluminum foil. This group is way past that. Besides not believing in leftovers, they have little individual thingies for this activity. Oh, and don’t call them “Tupperware.” That’s so 20th century.
I know you’ve done this one: They are not “records.” Yes, they are round and they play music, but don’t make this slip. They are not even “albums.”
Be patient. These kids that don’t care will soon have a generation behind them that doesn’t care. And then they will come armed with tape recorders — or whatever magical thing will exist then to hold memories — to tap into you and me. Because in everyone’s life — even Pam Whatshername’s — there’s a time when old is important and not to be snickered at.

Can still see plenty, even with a deficiency

A mixture of humor/personal experience and politics. Sometimes it works and sometimes, well....


I’m depressed and confused. It has to do with a recent doctor’s visit. The ophthalmologist confirmed what my wife has long suspected — I’m color blind. I’m fighting the diagnosis, going through the steps of acceptance. Still in denial, I think.
But in the wake of this news, I have learned a lot about color blindness. "Color vision deficiency" is the more correct term for the condition, and it is even correct to refer to it as Daltonism, as it was first brought to the forefront by an English scientist named John Dalton. Eight percent of white males have it, less in minority men; less than 1 percent of women are touched by it. It has a strong genetic pass-along factor.
The most common tests to evaluate color blindness are pseudoisochromatic plates, the cards or pages of colored dots. The most frequently used type is the Ishihara color test, developed by Shinobu Ishihara in 1917 and still prevalent today. I think that was the one I saw ... or didn't see.
The most distressing part of this diagnosis -- aside from Spouse continually saying things like, "Which accessory/color of paint/shirt do you like best between those two? Oh, you can't tell, can you?" -- is worrying about what I'm missing. Am I not really getting the full impact of that glorious sunset? Is your red, white and blue better than mine? Am I not seeing all the layered nuances in the glorious flowers of the new rose bush I just purchased? Are my son's eyes green or hazel or turquoise or Pantone 452? Is everyone else -- well, 92 percent of everyone else -- seeing a crisper, cleaner, more pure vision of life?
I've seen deep green alfalfa fields topped by blue-on-blue skies, with purple clouds gathering in the northwest, and punctuated by diving blue and white gulls, complete with canary-colored beaks, just like you. And now that I've said that, I'm worried that everyone else sees orange and red gulls, while someone else is saying, "Who ever heard of purple clouds?" See, I'm nervous and confused.
But even if I can't see the "12" or the "7" or the "301" hidden amongst the myriad of dots in that blasted circle on that dang card in the weenie eye doctor's office, designed, it seems, to weed out us lesser folks (I'm not bitter, mind you), there is plenty of stuff I can still see.
I can see the green tea of the new populist party, as well as the tainted red and blue of the GOP. And I can see why Bob Bennett was dissed by his own party after serving in the Senate for nearly 20 years. It's been said that it was because of his recent votes for incentive and entitlement programs, as well as voting for the so-called Obamacare health care. No it wasn't.
This was a clear and ringing statement regarding term limits in Congress. People -- and not just in Utah -- are fed up with the concept of "career politician." Those are two words that should never be said together. George Washington was asked to be a career politician and he politely said, "Nay. For it shall be one of the downfalls of this new republic." Well, I'm paraphrasing, but I'm sure it was something like that. He turned down the opportunity to keep serving. The early founders believed in a citizens' army and a citizens' legislative branch. They believed that new blood just might be the better blood. The whole lamentation about "loss of seniority" is a crock.
We limit the president to two terms; governors are limited in terms. But we let our representatives in both houses in Washington stay way too long. It is impossible to not become "part of the problem" when the first thing a newly elected congressman does is set up his or her committee for re-election. Imagine what good things could get done if an official knew they were going home in two years, instead of worrying about where their next campaign dollar was coming from. It's impossible to not be part of the problem when your livelihood comes from wielding so much power and courting those who would influence that power. Washington is not real life, and after only a few years, real life fades fast and they forget how much a billion dollars really is.
If ours is truly a representative government, we have to keep a fresh flow of ideas and ideologies and new blood flowing to the capital. It should be a citizen's Washington, much in the image of Washington's citizen army -- leaving the fields to fight for a season. Let officials leave their fields and serve with focus for a season, not a lifetime. That's why Utah turned on Bennett.
Even with my failing Ishihara numbers, I can still see that overindulgent parenting is becoming a problem that spawns under-functioning kids. Overindulgence usually brings up images of spending too much on kids -- which is still happening, mind you -- but I am suggesting that the so-called "helicopter parent" will be looked back on as a problem parent. The helicopter parent is always hovering nearby to help make the decision of what's best for the child or at the ready to bail them out if they stumble. Universities are overrun with them at the beginning of quarters, as parents pick classes for their students, choose their housing and then stand in line with their VISAs to pay for it all.
It shows a lack of trust in the child and results in poor decision-making skills, social skills and money management abilities. Wait 10 years and we'll all see how easy it was to make wimps of the rising generation.
See. I can still see a few things without squinting too hard. And maybe, just maybe, the 8 percent is seeing it correctly and the 92 percent have the weak version. Ha! Take that, Ishihara.

It's natural that we tire of some words

So are we tired of the word “green” yet?
No, I don’t mean the green, green grass of home. That we could all take more of. It’s the green houses, green cars, green factories, green politicians and even green computer monitors — that are white, by the way — to which I refer.
While the intent may be as noble as Prince Valiant, it won’t be long before the term “green” will have no impact — it won’t mean a thing in the same sense that it does today, don’t you think? It’s already happening in advertising. We no longer even twitch when something is described as “green.” That’s what happens with overuse.
How about “carbon footprint?” Man, that one has become so cliched to me that it would be absolutely refreshing to hear a bureaucrat say they wanted to reduce the amount of electricity they use and that they will drive 30 miles fewer each week, instead of this mumbo-jumbo. That I can understand. Carbon footprint? Sorry, too vague — not to mention trendy — for me.
A similar thing happened to the word “gay,” you know. I was reminded of this a while back when I slipped into the phrase “We’ll have a gay ol’ time,” reminiscent of the old “Flintstones” theme song. Stares, giggles and disbelief were the only responses I got from a room full of young ’uns. That word and its meaning have been absolutely hijacked.
I fear the same for “pride,” by the way. What used to be “gay pride festivals” are now simply “Pride Festivals”; what is now the Pride Office used to be the Gay-Lesbian Office or something like that.
I began worrying about this same diluting or misconstruing of precise words when I went grocery shopping the other night. It seems to be an already-overused trend to call a product “natural” or “organic.” But do they mean what they are supposed to mean, and over time, will their overuse no longer have the impact originally intended?
At the side of my desk right now is a bag of “natural” potato chips — made with “all-natural potatoes” as it says right on the front of the package. Well, the cynic in me says, what else could a potato be? Pretend? Plastic? Chemically manufactured? On the back of the package it goes on to explain: “We start with farm-grown potatoes …” Well, that’s good. Those ones grown in the coal mines and found along the seashore just aren’t as good.
And do they put on pounds just like the potato chips of my youth? Well, naturally.
What makes an item “organic?” How about “sustainable” or “free-range?” I’m sure these words spawn all sorts of different images in people’s minds. Would an egg from an organically fed free-range chicken on a sustainable chicken farm (I have this vision of chasing the smiling chicken around green, safe pastures and just nabbing the egg with a butterfly net right before it hits the ground, a smile on the beak of the chicken) be better for you than that regular egg you had this morning?
Turns out there are federal “guidelines” as to what can be called organic. Since 2002, the USDA says, organic food is produced by farmers who “emphasize” the use of renewable (there’s one of those words again) resources and conservation of soil and water … “avoid” the use of chemical pesticides, fungicides and fertilizers … avoiding use of bioengineering, sewage sludge or ionizing radiation. Before a product can be labeled organic a USDA-approved certifier must also inspect the farm, looking for non-organic stuff, I suppose.
“Natural” foods are not to contain any artificial ingredients such as coloring or chemical preservatives. Meat from animals treated with artificial hormones can be labeled natural, however, as can meat injected with saline solutions to add flavor. Food can also contain processed proteins that are harmful to some sensitive allergy-sufferers and still legally sport the label “natural.” Basically, the meaning behind the word “natural” is up to the manufacturer or producer.
Speaking of labels, most perishable items have to show the country of origin on the label now. My daughter brought by some frozen hamburger patties for grilling the other night, purchased at a warehouse store which requires a membership. The label said the meat inside was from the United States, Canada, New Zealand, Mexico and Australia. Can you imagine the size of that processing plant if the end product could have possibly come from one of, or a combination of, five different countries? And the label has to list all five. Just to be safe.
Or should I say unsafe?
Worrying about where my natural, organic, sustainable, free-trade, grass-fed, cholesterol-free, reduced sugar, free-roaming food is coming from is enough to turn me green.

Handling disappointment is measure of a man

One from the baseball season of 2010:

I had a column half-written, you should know, with complete and perfect answers to the country’s energy crisis, solutions to the Gulf oil mess and the national debt and even some tea party uprisings mixed in.
But I set it aside because something really neat happened earlier this month. Yes, much has been written about it, clips have been shown and many pundits have done their best to underline the hidden themes. But a few more words, if you please, about Armando Galarraga and Jim Joyce.
And maybe some thoughts about disappointment, the theme I see woven into this split second of history and its aftermath.
In a nutshell, you may recall, Galarraga pitched a perfect game — well, it's tempting to call it a perfect game, even though the record books will never list it as one of the 20 perfect games in major league history — until the final out of the final inning, when an umpire — that would be Joyce — badly missed a call at first base, giving a runner a hit and taking away Galarraga’s place in history.
Or did he?
In what has now become a well-chronicled saga, Joyce apologized and Galarraga accepted the misstep with a smile on his face. The pair talked that night as Joyce explained he was just doing his best and Galarraga accepted his fate of being just an asterisk in baseball history. The meeting between the two the next day at home plate is an MLB treasure, right up there with Lou Gehrig’s famous last speech, I think.
Even before the 90-second highlight of this life-in-miniature lesson had ended on the Whywitness News two weeks ago, my mind was pricked and went on a quick review of Harvey Haddix and what most MLB historians consider to be the greatest game ever pitched.
Speaking of asterisks.
Haddix pitched a perfect game into the 13th inning of a game against the Milwaukee Braves on May 26, 1959. He retired 36 consecutive batters before a fielding error allowed a Brave to reach first base. After a sacrifice bunt and an intentional walk to Hank Aaron, Joe Adcock hit a home run, ending what would have then been a no-hitter. On this strangest of nights, though, Aaron left the base path on his run to home and was actually passed by Adcock. The umpires huddled and finally gave the Braves the win over the Pirates, 1-0.
Rewind. Yes, I said a perfect game for 12 and two-thirds innings. And he will never be on the list of 20. Talk about disappointment. If 27 batters up and 27 down makes a perfect game, shouldn’t 36?
His widow — Haddix died in 1994 at age 64, he being a pack-a-day smoker stricken with emphysema — said he never did grinch about the snub and “never talked about the game unless someone brought it up.” He was often heard to say of not having his name on the list: “It’s OK. I know what I did.”
Now there’s more to the story, really — a lot of baseball intrigue and minutia, including the stealing of signs from the Braves’ bench, who was in the lineup that night and who wasn’t, what pitches Haddix used — but what I am really fascinated with is Haddix’s and Galarraga’s handling of disappointment.
We’ve all seen friends or family members or know of those who have not handled disappointment well. We’ve seen celebs and athletes melt in the stew of disappointment. Many people make decisions while wallowing in disappointment that have life-altering results, often on more than just themselves.
On a smaller scale, just this past week, I spent an afternoon at Lagoon and watched two grandchildren handle disappointment differently. Playing the midway games — $2 for three tosses at something to get a mini-stuffed animal — both little boys did not win. Shocked, aren’t you?
One immediately fell into deep sadness and despair. Tears were shed and howling heard for an extended period. The other? He immediately wanted to fight back, take another turn, beat this game, regardless of what it took — and as it turned out, what it took was grandpa’s money.
Some deep thinkers of the psychological genre suggest that there are really only about four or five primary emotions, usually listing joy, anger, sadness, fear and love as those. Others, like nervousness, irritation, frustration and, yes, disappointment are subsets of those primary emotions. So, is disappointment a subset of sadness or of anger? Which was it for my grandkids? I saw one as a subset of anger and another of sadness.
How do we deal with disappointment? That answer alone tells us a lot about ourselves.
Galarraga and Haddix, near as I can tell, handled it perfectly. Joyce was, in retrospect, so smart to be open and humble about his error, but Galarraga was, well, unnatural in the way he worked through his disappointment. As I look at the layers of learning here, I think there are only three keys to overcoming disappointment: Look at a bigger picture; don’t look for someone to blame; and examine how selfishness is motivating your actions.
Haddix often said winning a World Series and a World Series Game 7 was his biggest achievement. Galarraga must believe that there will be more chances to get his name on a list, must understand that when you look at the big picture, baseball at any level really is still just a game. Haddix never was heard to blame teammates, stolen signs or the commissioner. And neither was heard to cry, “What about me?”
Disappointment is inevitable. Appropriate responses to it are enviable and noteworthy.

Growing older should teach us something

With almost each passing day I come to realize what an idyllic life I had growing up. In my little one-cafe, no traffic-signal town, I roamed the streets as a preteen without a worry in the world. I rode my bike, caught frogs and tossed a ball until the sun went down and then I went home. Or at least that’s the way I remember it now, and I honestly don’t think that is too far off the truth.
Suggest to Spouse that our 9-year-old ought to be able to wander the streets until dark today, and, well, you’ve got a fight on your hands.
It was simpler then, yes, but a trio of truths permeated the rural culture of my childhood and my hometown: Sports were a big deal, you couldn’t grow a tomato, and old people were important, they were to be trusted and were a valuable resource.
I obviously define “old people” slightly differently now, with my current definition being anyone two, maybe three years older than me. Then it was gray hair and bib overalls and wrinkles that defined the group.
But my reverence for oldsters is starting to be shaken and I long for the idyllic utopian image that aging — growing older — used to hold for me. I was shaken again this past week when an 81-year-old Murray man was sentenced for abusing a 10-year-old girl who was delivering cookies previously ordered. This on the heels of a 57-year-old Roosevelt man caught in a drugs for sex scandal.
And I have also seen, as you have, news reports this summer of 60-plus characters nabbed for drug distribution, pornography involvement and other sex crimes.
Eighty-one! How dumb do you have to be to have not caught on by the time you are 81 what appropriate behavior is? Losing their keys, driving too slow, enjoying buffets just a bit too much — these are the crimes “the elderly” ought to be dodging, not sex abuse of children, drug distribution and pornography production and peddling. And yet we see quite often that those who fall clearly into a category which I will call “old enough to know much, much better” are paired with these problems. It’s sickening to me. It shakes my faith in society as much as any other single thing to see oldsters — those who should be on someone’s pedestal and distributing wisdom — acting like doped-up dummies. Hasn’t life taught them anything? Have they really been fast asleep for four score years?
Perhaps it has always been this way. Maybe I really have had my head in the clouds. But deep inside me are feelings that shout: “Growing older should teach you something, you fool!” It makes me want to grind out a neck-sized millstone. And use it.
Those on the downward slope of life — and in the case of the 81-year-old with one foot firmly in the grave and the other on a banana peel — the aging should see clearly that they are mortal; that to get this far is, well, because of luck as much as anything else. Those who are one good case of the flu away from being six feet under should realize more than most that small, daily decisions determine long-term happiness. Why at this point of living would anyone want to destroy another life besides their own?
The aging should teach those of the rising generation — that’s one of their major obligations. They should find specific examples of life’s blessings and share them with their families and younger associates. They should tell stories about Okinawa and Germany and Korea and that bittersweet mixture of pain and patriotic pleasure. They should embarrass their children with stories told to the grandchildren. They should, without any shy hesitation, talk about times of no TV, no cell phone and Saturday-night dances. Picture albums should be dog-eared from overuse.
They should teach how to plant, weed and harvest, even if it is only a tomato and an eggplant. They should wear ridiculous hats at the beach and try to throw a Frisbee as well as a 7-year-old. They should learn to play at least one instrument in their life — be it a piano or violin, banjo or harmonica — just so they can bark out a few goofy old songs to the delight of those younger. Even if it is just once. And yes, it will be delightful.
They should be taking pledges to protect, not harm, the young. If aging has not done any other thing, it has surely pointed out the potholes and bear traps that aim to dent and snare. The aging should be a light in the dark, not someone’s reason for life-long therapy.
And, yes, I am trying to deal with my anger, trying to wrestle with the modern realities that are at cross-purposes with my idyllic memories. I get it now that there is evil out there, even rarely — thankfully rarely — in the elderly. I understand now that there are other things besides sports.
But it is still true that you can’t grow a tomato at my old house.

The 10 Nails in the Coffin of Newspapers

A more recent one... written just after Black Friday at the DesNews.

One of the larger news stories of the past couple weeks was the elimination of 43 percent of the staff of The Deseret News and the resultant intermingling of what’s left of The News with some other related-by-marriage media. Forty-three percent, mind you. These were real people, too.
I am not going to examine this specific intermingling and the ethical, journalistic and messy questions that it spawns, but rather take a larger look at how this came to be. One of the editorial writers for The Deseret News, for example, explained it this way, about three days after nearly half of his fellow employees were laid off: “… the problem is that the Internet has sapped ad sales.”

Holy Simplified Answer, Batman!

It’s much more layered, much more complex than that, rest assured. To examine all of the nails that have been and are now being pounded into the coffin of newspapers, you have to go back a generation or more. Here are those nails, in more or less chronological order.

Nail 1. There used to be a TV show on Saturdays in Utah where high school kids — in teams — were quizzed by an editor of the Tribune about current events that had been found in the past week’s issue of the paper. This was not uncommon; throughout the nation, papers were often used as a current events tool. Then came Channel One.

Channel One? Yup, a TV channel sent directly to junior and senior high schools — we’re talking early to mid-’80s here — to be shown during “home room” periods, a slam-bam TV news look at current events and popular culture. The pervasive presence of Channel One did a couple of things: it made newspapers irrelevant as a learning tool for a rising generation of students, and it conditioned millions of students to receive news via quick, flashy edits and tiny sound bites. It also fed into and, in fact, was a catalyst in reducing the attention spans of teenagers.

Nail 2. As it began being an part of day-to-day lives — not even yet an important part, but a part, nonetheless — the Internet was championed as “free.” All content on it was supposed to be free. There were bumper stickers stating the fact; there were student clubs at colleges in the ‘80s with that as their mission statement. I remember having an argument with a web-savvy technogeek college student who was absolutely aghast that a website would charge for anything. With the slogan “The Internet must never be owned by Bill Gates” on their lips, the Internet had at its beginning a strong, earthy undercurrent that said all “information” must be free.
And newspapers bought into that, as did most every similar research source on the web. As news and opinion began being archived or repeated online from print sources, it was free. Given that choice again, newspaper owners would dip themselves in boiling ink before setting out on that road.

Nail 3. Many thought the Internet could actually make money. Remember the so-called “dot.com boom”? One of the basic premises of the dot.com surge was that the Internet could make money … for everyone involved. The “dot.com bust,” you might remember, quickly followed the boom. Realistically, you can count the legitimate businesses making money solely with a presence on the Web on your fingers and toes.

Nail 4. The Channel One generation now becomes the decision-makers. I once heard a group of well-paid executives, all of them a couple of decades younger than me, describe their marketing campaign for the unveiling of a new video game, a game designed for high school- and college-aged students. They used little videos and little hints strategically placed throughout the web (a la “The Blair Witch Project”) and techno-this and unconventional-that, spending thousands to get the word out. When I asked if they had considered a college newspaper to reach college students, they looked blankly at each other, shrugged and said, “Nah, dude. Didn’t even think of that.” The irrelevancy of newspapers in the lives of decision-makers who had never read one — and who wear their hats backwards — had, by this point, grown geometrically.

Nail 5. The Perfect Storm. Around October of 2008, several newspapers (Seattle, Los Angeles and Detroit, to name a few) announced, in varying forms and to varying degrees, their failure. At almost the exact same moment, or so it seemed, the stock market died, the so-called housing and banking crises hit and businesses went to the edge of the cliff. Peering over, they saw newspaper advertising as the first expense that could be pulled back as they circled the wagons. Black October now created an advertising crisis, too.

Nail 6. News by theft or imitation becomes more common. “News” websites that merely cut and paste from other sites and consider all information on the Web “public” begin popping up everywhere. Most are built on the premise that they can sell ads around these tidbits and make money. See Nail 3. Add to these the legitimate, thoughtful blogs and alternative (non-newspaper) sites — yes, there are some! — and there begins a significant diluting of the niche newspapers once thought was theirs alone. Suddenly newspapers and newspaper sites become only one voice in a chorus.

Nail 7. It becomes obvious that Internet advertising doesn’t work any better than print advertising. If you are reading this column on the ‘net, quick: what ad was at the top of the page when it loaded? Can’t remember, can you? The Internet generation — as much as they love technology — is absolutely immune to Internet advertising. They are so used to buttons and banners and a little flashing this and that on web pages that they pass over them quicker than, well, quicker than a newspaper ad can be ignored, that’s for sure. So the concept of trading one advertising model — ads surrounded by printed news — for a new model — Internet news surrounded by blips and bleeps — showed its flaws. And business owners began to question both models. Newspapers that were hoping to supplement their loss of print advertising dollars with dollars from the Web are left hoping.

Nail 8. Phones and other hand-held information gatherers get really, really good. It’s not difficult for anyone, regardless of generation or former favorite source of news, to leave print in the driveway.

Nail 9. You’ve heard the sad statistics about how many WWII veterans — the Greatest Generation — we lose every day in the United States. The same can be said for newspaper subscribers; it’s largely the same group. In fact, the same will soon be said of those who watch the evening news. Watch the CBS or NBC news at 5:30 p.m. and see what is being advertised. That should tell you who is watching, and whom the networks will soon be losing as viewers. The students I deal with are much more likely to watch TV shows on a computer — or a phone — than on a TV.

Nail 10. To bring us full circle — due in part to the shortened attention span and the million-edits-a-minute fed to the minds of the rising generation — people now consider a conversation with a maximum of 140 characters (not words, but characters) normal. Twitter and Facebook have reduced social interaction and discussion to less than a sound bite — to a mere tease. Reduced to “stories” of 140 characters, heck, who of the rising generation has time to read anything of substance, as important or good as it may be?

Even the Deseret News.

You like Halloween why, exactly?

A column published the week after Halloween several years ago. Got some great -- agreeing with me, that's always the best -- feedback, too.

I confess. I had another column all ready to go, polished and shiny. But very recent events have forced me to come out of the closet. I need to come clean and admit I am one of “them.” It is high time -- well, the perfect time, really -- to admit that I am a full-fledged, flaming, card-carrying Halloween Hater.

Boy, what a stupid holiday. And it is not even a real holiday. Did anyone get the day off? Well, I mean anyone besides those guys and gals who wandered around their respective worksites last Thursday in silly get-ups, forcing everyone around them to take notice of their delight at looking like Harry Potter or a mummy or Fidel Castro. I grow so weary of forcing a smile until my face hurts at the folks who wander into my office on Halloween and stand there waiting for me to guess who it is. Take a Dum-Dum and leave, please. I read where 35 percent of all costumes are for adults, not kids. Get a life, folks.

I guess it is more of a celebration than a holiday. But celebrate what? Witches, witchcraft? Ghosts, dying, decomposition? Devils, Lucifer? I never have quite caught the vision of this alter-ego insanity.

Not that others haven’t, mind you. Halloween is now our second-most-decorated “holiday” in the United States. One CNN report says $765 million is spent annually on Halloween decorations. They say that $45 per household is average for Halloween preparations and activities.
Boy, I hope you spent my $45 well, because I didn’t touch it.

There are some homes in my neighborhood that are lighted up like Christmas, with orangish lights and spotlights on ghosts hanging in trees, ad nauseam. And for what? To work their kids into a fever pitch about being greedy?

See, I told you I was a Halloween Hater. And I think I am not alone. There’s more than one insightful Grinch regarding Oct. 31, I’m sure.

I have grown weary of kids taller than I am walking around with a pillowcase scamming for candy bars. I have grown weary of little kids sticking their hands in my dish of Tootsie Rolls like Curious George in the coconut. I have grown weary of parents saying they hope nothing “happens to their house” if they run out of candy. I have grown weary of teens and young adults being urged to dress up and be quasi-anonymous -- all in the name of Halloween fun -- so that they feel comfortable doing things that they normally wouldn’t dream of doing. I’ve grown weary of Halloween being an excuse for bad behavior.

I have grown weary of hearing local police indicate that more alcohol arrests are made on Halloween (give or take a day or two, depending on the exact timing of the big Halloween bashes, especially on campuses) than on any other night besides New Year’s Eve.

Luckily I married a fellow Halloween Hater, so we have worked on this together. We have had Halloweens where we tried to turn the event into a service project -- you know, like our children taking decorated sugar cookies to the old folks’ home and “trick or treating” there. Or traveling a couple of hours to grandma’s house and surprising her ... and it really was a pleasant surprise all around. This year, my daughter went a day early and “trick or treated” around the neighborhood for a charity.

We were known in our last home as the goofy house that gave goofy treats for Halloween. For several years we gave little boxes of raisins instead of candy (you get some interesting responses from little devils at the door with those, I assure you). For several more years, we gave nickels. They opened their bags; we dropped a nickel in. We have given coupons a number of times, for something besides candy. And we always, always trade out the sack of goodies from our kids and give them a present instead. “Yes, that bag of chocolate, artificial coloring and corn syrup is nice, sweetheart, but give it to daddy and get this present instead! It’ll be kind of like Christmas!”

In case you can’t tell, my wife has a “thing” about candy. And for good reason. Anyone who doesn’t buy into that “bad food coloring/Red dye No. 5 stuff” has never met my children. Also, most kids get constipated for days after Halloween ... after you pull them down off the ceiling. Take note, will you, honest note.

So I ask, again, why? Halloween celebrates all the wrong things, it breeds all the wrong traits, it provides all the wrong excuses and thinking.

So don’t look for me at your door next year. I’ll have something else to do on the last day of the month next October. It’s called home teaching.

Thursday, January 27, 2011

This year's flu has confusion as a symptom

So why can't I get excited about this swine flu pandemic? Or is it epidemic?

As I look up the definitions of both, I wonder whether my hesitation to panic is a result of the safe culture in which I have grown up -- free from just about every disease except chicken pox and fear of asparagus -- or a result of having observed many other overreactions by government and media.

Ahh, here it is: "An epidemic occurs when a disease affects a greater number of people than is usual for the locality, or one that spreads to areas not usually associated with the disease. A pandemic is an epidemic of worldwide proportions."

So apparently, since it started in Mexico and moved across borders, the H1N1 virus could develop into a pandemic. That is the correct word to use.

I wonder, though, if we took away the word "swine" from its description, would this outbreak be getting so much attention? Don't people get the flu, well, all the time? According to the CDC, about 10 percent of the population of the United States gets influenza every year, 600,000 are hospitalized and 36,000 die annually from complications of the flu. Three people, at last count, had died in the United States from this new strain of the flu. About 600 people die every day from diabetes and its complications, just as a point of comparison.

So I am confused. Why has it appeared to be more deadly in Mexico than elsewhere? Are there inherent genetic differences or in-body resistances due to vaccination programs that make those north of the border more hardy and healthy when faced with a flu virus?

Most experts say that's not true, and many are now suspecting that there may be secondary health issues present in Mexico -- it might even be Mexico City's poor air quality or a second virus strain more prevalent in Mexico -- that have combined with the swine flu to make it more deadly there. There is also undoubtedly a reporting issue. The current number of cases reported in Mexico and the deaths therefrom show a 6 percent death rate from this flu. That's way out of line. The total number of cases certainly must have missed thousands of milder cases like those that have been reported in the United States, which would lower the percentage of deaths in Mexico. We in America are a reporting people, I think, and we like others to commiserate with us -- thus the higher pool of total cases.

I'm also confused about how it is spread. Some reactions have included cancelling soccer games and public events. So, it is all right to house three dozen snot-nosed second-graders in an enclosed room for the day, but cancel outdoor activities where people might come in contact with fresh air, and maybe the flu? Did anyone shut down Wal-Mart? That's about as "public" as it comes.

One benefit that might come from this overreaction, though, is better immunizations and treatment for all influenzas, as remedies are developed.

There are lots of epidemics that scare me more than the swine flu, in case you can't tell. While I am glad there are those who are quick to jump at the mere mention of the word "influenza," I wish we could stop the rising tide of bad, bad advertisements. Even Geico, the king of advertising, has stumbled. After the effective and memorable caveman and gecko series, no one, but no one, gets the "money with eyes." And those "Therm Guy" talk show ads? Hamburger Helper in an elevator? Stop it!

Can we stop the epidemic of video distributors kicking up the porn and language a notch from their theatrical releases and calling the DVDs "unrated" or "director's cuts"? This is a nasty trend. And while we are thinking about it, do we really believe the swine flu will cause more harm down the line than the current epidemic of pornography? Now there's a pandemic to make plans against.

Can we stop the epidemic-like tide of steroid use in professional sports? The statistics, the records and the Hall of Fame itself mean absolutely nothing now. Unless Major League Baseball were to stop for a week, test every player, kick users out of the league permanently with no pay (they did break their contracts, after all) and basically start over, we'll never believe these players are for real. Bonds and McGwire and Clemens have taken the fun out of being a fan of the game.

Can we slow the epidemic of government bailouts? It is no longer comprehensible how far in debt our federal government is. The numbers wash over us without making an impact, except to make us feel that all is lost. The pig trough is officially empty.

If you or one of yours has been diagnosed with the swine flu, I'm sorry. I hope it is of short duration and that you get to finish that novel while you convalesce at home. But I hate to tell you that by this time next year, we will have forgotten all about you.

The bird flu is coming back, haven't you heard?