Monthly Archives: August 2022

I just learned about the Tommy Westphall Hypothesis

And I enjoyed learning about it, however belatedly.

I had never heard of it, possibly because I never watched a minute of something called “St. Elsewhere” back in the ’80s. Nor do I feel compelled to go find it and binge it, as interesting as the hypothesis is. After all, the hypothesis itself tells me the show and its fictional universe are ephemeral things, with which I need not concern myself.

But I do read Alexandra Petri’s humor columns, which I’ve mentioned before. And Alexandra taught me about Tommy Westphall. And she did it in a cool, offhand sort of way. Did she say, “Brad, I’m about to tell you about something interesting, something everyone else already knows, something you will be grateful to have learned”? She did not. She wrote a fun column about the stunning lack of originality of our film and television industries, as evidenced by some of the silly “prequel” shows that keep coming out.

The column was headlined “Our new fantasy show is definitely a prequel to something you love.” And as I say, it was fun. But then, she slipped in the reference. It was just a passing reference, in the course of mocking the prequel madness:

Could we theoretically just make a totally original show and then zoom in on a little grain of sand and watch it get heated and cooled and become glass and zoom out and reveal that, yes, this was the origin story of the iconic “Friends” apartment window? You know, that’s a possibility. Lots of things are possible; most TV takes place inside Tommy Westphall’s snow globe…

Which makes no sense unless you know about Tommy Westphall. And, of course, his snow globe. So I started looking into it. And it was very cool. I learned that Tommy was the young autistic son of one of the lead characters on the show, a physician named Donald Westphall.

The reference is to the end of the last episode of the series. Wikipedia describes it this way:

Tommy Westphall enters the office and runs to the window, where he looks at the snow falling outside St. Eligius.[3] An exterior camera shot of the hospital cuts to Tommy Westphall sitting in the living room of an apartment building alongside his grandfather, now being portrayed by Norman Lloyd (aka “Daniel Auschlander”). Tommy’s father, still being portrayed by Ed Flanders (aka “Donald Westphall”) arrives at the apartment wearing a hard hat.[3][4]

Wearing a hard hat, you see. So suddenly, he’s not a doctor. And he starts talking, and says to the grandfather, “I don’t understand this autism thing, Pop. Here’s my son, I talk to him, I don’t even know if he can hear me. He sits there, all day long, in his own world, staring at that toy. What’s he thinking about?” Then, Wikipedia continues:

Tommy, who is shaking a snow globe,[5] is told by his father to come and wash his hands. As they leave the living room, Tommy’s father places the snow globe upon a television set. The camera slowly zooms in on the snow globe, which is revealed to contain a replica of St. Eligius hospital inside of it.[3][1]

The foremost interpretation of this scene is that the entire series of events in St. Elsewhere were dreamt by Tommy Westphall, and thus, products of his imagination…

So… kind of a cool, creative ending to a TV show, and one that ticked off a lot of fans. Because it told them, Ya know this was all made up, right? But that’s just the beginning of what it means.

As another website explains:

St. Elsewhere didn’t exist in a bubble. Like most shows, there is some degree of crossover between it and various series. Some of these series ran along side of it, some of them ended before it even began, but most simply call back to it, well after St. Elsewhere comes to an end.

Here is where the Tommy Westphall Universe Hypothesis really kicks in. The concept is simple: if St. Elsewhere is all in the mind of Tommy, then every show connected to it could also be just in his mind. So, taking that into consideration, what all has Tommy dreamed up?

How many such shows are there? Yet another site counts 441. How does that work? Well, think about the overlap between, say, “Cheers” and “Frasier.” Or “The Andy Griffith Show” and “Gomer Pyle, USMC.” And while I never saw the show, I read that “The doctors had visited the bar on Cheers in one St. Elsewhere episode.” And we’re off…

A huge portion of the connections comes from fictional characters who appear in multiple shows — as did storekeeper Sam Drucker in “Petticoat Junction,” “Green Acres” and “The Beverly Hillbillies.” Which I noticed at the time and thought was interesting, but people didn’t go around blogging about such silly things back then, because there were no blogs. And no social media.

A lot of that Sam Drucker stuff goes on. Richard Belzer has portrayed cop John Munch on 11 different series — one of them being “Homicide: Life on the Street,” which included some characters from “St. Elsewhere.” So he’s sort of a superspreader of this snow globe virus. So are the guys who played Cliff Clavin and Norm Peterson on “Cheers.” John Ratzenberger and George Wendt appeared as those characters on seven series each, one being, of course, “St. Elsewhere.”

So among the shows that exist only in Tommy’s imagination are “Breaking Bad,” “The Office” (both versions), “Supernatural,” “The Andy Griffith Show,” “Firefly,” the old ’60s “Batman,” and about 400 or so more. Including, maybe, this.

You’re probably scoffing at me right now, because I suspect everyone on the planet except me knew all about this. In fact, there seems to be a bit of an industry in other series paying homage to Tommy’s snow globe. I’m sure I saw some of those, and didn’t get them until now.

I may be late to the game, but I’m digging it….

Richard Belzer as John Munch.

What would we do if we had REAL inflation?

Yeah, I know we have real inflation now. Of course, unless the economy has come to a halt and is in danger of sliding into deflation, as during the Depression, we always have inflation. It’s just that it’s somewhat higher right now, more like what we lived with in the early ’80s. It feels familiar, unless you’re very young.

Oh, and before you think I’m shrugging it off, not only the young are feeling the pinch. My wife, who is the one in the family who has to make our modest income stretch to feed and house us (this is not a task you would want to assign to me), reminds me of it frequently. She did so multiple times when we were shopping together yesterday, and that was at Walmart. She normally shops at Aldi.

But what I mean is, what if we really had the kind of inflation — commonly called “hyperinflation” — that really shows your country is messed up and falling apart? You know, the kind that means your whole system, or your leadership, needs to be replaced? I mean, the kind that you’d think we were having now, if you listened to Republican politicians. And for that matter, some Democrats.

Including some Democrats I really like, such as Abigail Spanberger, who’s in a tough race for reelection to her congressional seat up in Virginia. There was an update on that race on the front page of The Boston Globe today (see above), and it said in part:

Spanberger and her Republican opponent, Yesli Vega, agreed that inflation is the most pressing issue for voters.

“We’re facing a time when people have to decide whether they’re going to pump gas or buy groceries,” said Vega, a member of the Prince William Board of County Supervisors and a former law enforcement officer who still serves as an auxiliary sheriff’s deputy. “I do believe that we’re in the condition we are right now because of President Biden’s failed policies and representatives like Abigail Spanberger enabling him every step of the way.”…

“I have certainly found that people want to talk about gas prices, they want to talk about grocery prices, they want to talk about the challenges they’re facing,” Spanberger said after a recent Fredericksburg event highlighting the bipartisan infrastructure law enacted last year that she supported.

“I’m acknowledging the problem and trying to fix it,” she said. “Your other option is somebody who’s just trying to cast blame for the problem.”…

Anyway, I look at this situation in which polls keep showing that voters care more about inflation than anything — as this story states, “ahead of abortion rights, an increase in violent crime during the pandemic, a war in Europe, and attacks on voting rights.” And, presumably, global climate change.

The worst problem in the world? Presumably, you don’t think that if you live, say, in Ukraine. But America is apparently full of people who, at this moment at least, think 8.5 percent inflation is our biggest problem.

They might have had a point, if they were living in the Weimar Republic 100 years ago.

I met a guy named John Toland in 1976. I gave him a ride from the airport to the book festival that had brought him to Memphis. I wasn’t really there to talk to him. I wanted to talk to Mary Hemingway about her new book, being a huge fan of her late husband. The publicists set me up to have lunch with her, but asked me to pick up Toland, who had just come out with a weighty tome about Hitler. I hadn’t read his book, wasn’t planning to read his book, but I gave him a ride, and enjoyed chatting with him.

Years later, I finally read the book, and it left an impression. (I recommend it.) Burned into my memory in particular is an anecdote it related about the night of the Beer Hall Putsch. Hitler and a couple of his boys were hanging out in the beer hall, waiting for the time to make their move. They decided they would blend a bit better if they all were holding beers. So one of his boys went and bought three brews.

They cost three billion marks.

Not having the book at hand — I’m not sure where it is now — I looked up “Hyperinflation in the Weimar Republic” in Wikipedia. It stated in part:

A loaf of bread in Berlin that cost around 160 Marks at the end of 1922 cost 200,000,000,000 Marks by late 1923.[14]

By November 1923, one US dollar was worth 4,210,500,000,000 German marks.[16]

The line about the cost of bread reminded me of another anecdote I read somewhere years ago. I can’t remember whether it was in Toland’s book or somewhere else. Anyway, a woman was on the way to buy a loaf of bread. She had a laundry basket overflowing with paper money to pay for it. Some emergency came up, and she had to put down the basket and go deal with it.

When she came back, someone had dumped out the money and stolen her basket.

Now that’s inflation.

But you don’t have to go back to Weimar to find examples of serious, profound inflation problems. As I’ve often mentioned, I lived in Ecuador when I was a kid. I lived there longer than I lived anywhere growing up — two years, four-and-a-half months. I’ve never been back there since leaving in 1965. But I became aware of the fact that at some point, the currency that we used there in my day — the Sucre — had been ditched, and the U.S. dollar adopted in its place.

One day, I decided to look that up — also on Wikipedia. In my day in Guayaquil, a Sucre was worth a nickel — it took 20 to make a dollar. I didn’t realize it had been declining in value for years. In 1946, it had taken only 13 to make a dollar. After I left, things sped up. In 1970, the dollar was worth 25 Sucres. In 1983, it took 42. In 1990, it was 800 Sucres, and it plunged to 3,000 in 1995.

Just before the switch to the dollar standard in 2000, you needed 25,000 Sucres to buy what the dollar would buy.
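For the curious, those quoted figures imply a startling rate of compounding. Here’s a rough back-of-the-envelope sketch in Python (the years and exchange rates are just the approximate numbers cited above, so treat the results as ballpark):

```python
# Rough arithmetic on the Sucre-per-dollar figures quoted above.
# Dates and rates are the blog's approximate data points, not official series.

def annual_depreciation(start_per_dollar, end_per_dollar, years):
    """Average compound annual rate at which the currency weakened vs. the dollar."""
    return (end_per_dollar / start_per_dollar) ** (1 / years) - 1

# From roughly 1962 (20 Sucres per dollar) to 2000 (25,000 per dollar): ~38 years
overall = annual_depreciation(20, 25_000, 38)

# The final plunge: 1995 (3,000 per dollar) to 2000 (25,000 per dollar)
final_stretch = annual_depreciation(3_000, 25_000, 5)

print(f"Average over ~38 years: {overall:.0%} per year")
print(f"1995 to 2000:           {final_stretch:.0%} per year")
```

Run it and the averages come out to roughly a fifth of the currency’s value lost every year over the whole span, and more than half per year in the final stretch — which puts 8.5 percent in some perspective.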

That, too, is real inflation, even if not quite on the billion-for-a-cerveza level. I can see how someone living under those conditions might see it as the biggest problem of the moment.

But 8.5 percent? You’d think a country that saw that as its biggest problem didn’t have any real problems.

And yet, we do — and inflation is one of those problems, although not the worst. For the first time in my life, the first time in our 246-year history, our republic is in profound danger. It could really, truly be falling apart. Look at the number of people who are outraged — our senior senator suggests we’re on the verge of riots in the street (again) — that the government thought it ought to go take back those classified documents you-know-who stole and hid in his place down in Florida.

Also, many of the same people, and others, think — and I’m using the word “think” very loosely here — that we ought to turn fine people like Rep. Spanberger out of office over something that is in no rational way her fault — inflation. Note the comments in that Globe story from a guy who voted for Biden in 2020, but says maybe he’d vote for Trump next time, “because in Donald Trump’s time, we didn’t have these issues.” (How’s that for steel-trap, cause-and-effect logic? As we all know, the condition of the U.S. economy depends entirely on who happens to be in the White House, right?)

These are serious problems, and considerably more disturbing than that other actual but more transitory problem, inflation.

Remember, Germany came up with a “solution” to their Weimar problems.

That solution was Hitler…

Adolf and his posse sitting in prison after the Putsch, all hoping someone else offers to buy the next round of beers.

I suggest we follow the Wally Schirra approach

If we must exercise, let’s do it Wally’s way.

First, a complaint that’s unrelated to the subject: For some time, I’ve been meaning to write something about the sudden death of the newspaper headline. I’m still going to write it, but I’ll just touch on it here.

Back when there were real newspapers everywhere, journalists had an important ethic — to tell their readers everything they needed (or might want) to know about the subject at hand as quickly as possible. Do it in the headline if possible. Then, if you couldn’t do it in the hed, you did it in the lede. People should be able to read nothing but the hed and the lede and move on, and know the most important facts about what the story was about. If the story was a tad too complicated for that, certainly you finished telling the basics in the next couple of grafs — then, assuming you were writing in the classic inverted-pyramid form, the importance of the information you related diminished with each paragraph.

You did this for two reasons. First, those rabid lunatics on the copy desk (no offense to copy editors; I’m just describing them the way a reporter would) were likely to end your story randomly wherever they felt like ending it, in order to cram it into inadequate space, so you needed to get the best stuff up top. Second, you saw it as your sacred duty to inform the busy reader as well as you could. A reader who didn’t have the time to sit down and read the stories should be able to glance over the headlines on the front page and at least have a rough, overall idea of the important news of the day. A reader with a little more time should be able to get a somewhat deeper understanding just by reading the front, without having to follow the stories to the jump pages. And so forth.

But no more. Now, the point is to get readers to click on the story. So you get “headlines” that say things like, and I am not making this up, “What you need to know about X.” When there was room in the headline to just tell you what you needed to know. Or they make it clear that the story is about a particular person, but don’t name the person. The idea being that if you aren’t willing to click, then you can just take a flying leap. (There’s another, even more absurd, reason why the person is often not named, but I’ll get into that another time.)

Different ethic — if you want to call it that.

But you see what I just did? I wrote 414 words without getting to the point of this post. See what writing for an online audience, without the discipline enforced by the limited space of a dead-tree newspaper, can do to you?

I went on that tangent, though, because I was irritated by a story headlined, “What Types of Exercise Reduce Dementia Risk?” That grabbed me on account of knowing someone — a good friend, you see — who will soon be 69. And he might care to know. But did the story tell me? No. At least, not in the first 666 words. After that, it finally gave me a subhed that said, “Start by doing what you like best.”

Which meant we were getting somewhere, but not exactly. Still, I forgive this writer and her editors, because she had an excuse: She doesn’t know the answer. Nobody knows the answer. At least not an answer that would satisfy me — or rather, my friend.

So, in a way, my long digression about bad headlines was even less relevant than it seemed. Oh, well. At least I got some of that out of my system. But I’ll return to the subject in another post, with examples.

Back to the exercise thing — while there are no answers, there are… indications, such as those from three recently published “major long-term studies” that “confirm that regular physical activity, in many forms, plays a substantial role in decreasing the risk of developing dementia,” and further tell us that “Vigorous exercise seems to be best, but even non-traditional exercise, such as doing household chores, can offer a significant benefit.”

That’s good. But I went into this hoping — that is, my friend went into it hoping — that the stories would endorse the Wally Schirra approach.

Did you read The Right Stuff? Well, you should have, and if you haven’t, go read it right now, and return to this point in the post when you’re done…

Did you enjoy it? It’s awesome, isn’t it? Well, I always liked the part where Wolfe is telling about how the people in charge of the Mercury program encouraged our nation’s first seven astronauts to engage in frequent exercise. And John Glenn, demonstrating what a Harry Hairshirt he was, would go out and run laps around the parking lot of the BOQ. But most of the guys agreed with Wally Schirra, “who felt that any form of exercise that wasn’t fun, such as waterskiing or handball, was bad for your nervous system.”

Nothing against John Glenn. He’s a hero of mine, as for most Americans alive in that time. I was really disappointed that he didn’t do better in his bid for the presidency in 1984. I was definitely ready to vote for him.

But I like Wally’s approach to exercise. And while the data may not all be in on precisely the best exercise for keeping one’s nervous system functioning properly, it seems a good idea to “Start by doing what you like best.”

At least that way, maybe you’ll keep doing it…

Open Thread on Technology for Tuesday, August 23, 2022

The Singularity hasn’t arrived, but we’re all pretty obsessed with the Matrix, as it currently exists…

Editor’s note: I wrote this on Tuesday, but didn’t post it because I thought it wasn’t very good. But today — Friday — I decided not to waste that time I spent typing it. So here it is, with only slight editing. But I didn’t take the time to edit all the places where it said “today,” which at the time meant Tuesday.

I have to be careful here. After all, there are already those who see me as an old guy (the insolent puppies). I don’t want to give them any additional reason to see me as Uncle Ben in “Spider-Man,” who, looking in the physical, dead-tree newspaper for a job (which shows you how long ago 2002 was) and seeing a help-wanted ad for a computer analyst, moans, “My Lord, even the computers need analysts these days!”

All my adult life, I was always on the leading edge of technology — when newspapers went from typewriters to mainframe, and then from mainframe to PCs, I was one of the people who learned it first and taught the others. I paginated the editorial pages before the rest of the newspaper followed. When I got canned in 2009, I was the only person at the paper actively blogging and regularly interacting with readers online.

But lately I’ve been noticing something a bit unsettling. Gradually, the news I read is less about what people do, and more about what their technology does. I’m not saying the singularity is imminent — artificial intelligence is still too stupid — but we’re moving in that direction, in terms of what we pay attention to. Maybe it’s because we’ve spent too much time observing stupid people, and no longer notice the intellectual limitations in the tech.

Anyway, these were all in The Washington Post today:

  • You’re charging wrong: 5 ways to make gadget batteries last longer — Hey, I love my iPhone and my iPad, and am on decent terms with my PC. But I’ll respect them all more — especially the iPhone — when the batteries are better. Or at least, more reasonable. Here’s what reasonable would look like: When I take off my phone and am not using it — which means when I’m sleeping — it should be charging, and without damaging the battery. And please, don’t do this thing where you take all fricking night to charge. Ever since that started, I’ll wake up in the night and reach over to unplug it, because it’s been a couple of hours and should be charged — but it’s nowhere near done, because it’s aiming to finish around 5 a.m. I’ve tried turning off this “convenient” feature in the past, but failed. So it charges all night, but gradually. But what if I needed to grab it and go in the middle of the night?
  • How a photo of a woman yelling in a guy’s ear became a viral meme — That sounds stupid, doesn’t it? That’s because it is. Not as stupid, say, as “haul videos” were, but pretty dumb. Apparently, it’s news because as a meme, it is somehow evocative of other memes, and has meaning to someone who spends all his or her time thinking about memes instead of, say, great literature. It’s an actual international sensation, apparently.
  • Strangers rallied worldwide to help this Maryland mom find where she parked her car — In this case, the amazing part isn’t about the technology. The amazing thing is the way this lady managed to lose the car she had hurriedly parked on the way to take a child to the doctor. Which is reasonable to anyone who has had to spend a little time remembering exactly where in the lot, or the garage, the car was parked. That I get. What blows my mind is that she didn’t even know in which nearby parking garage she had parked it. Which means she arrived at the doctor so flustered that she didn’t know how she’d gotten there, even roughly. So after unsuccessfully searching, she posted something about it on social media, and went home, defeated. And people around the world jumped in to solve the mystery, and two days later, someone found it. Which is cool, and even nice. But how did this happen to begin with?
  • Down and out and extremely online? No problem: Just enter a new ‘era.’ — You’ll have to read a few grafs of the story even to understand what it’s about. But when you do, you may react as I did, wondering how anyone could become this lost in narcissism. (Which is really something, coming from a guy who blogs.) And then, you’ll wonder about something even more perplexing: Who would actually watch such a thing? Compared to this, haul videos actually made sense.
  • Former security chief claims Twitter buried ‘egregious deficiencies’ — I put this last, but this morning, this was actually the lede story on the app. So Elon Musk isn’t the only one complaining. But then, he’s looking for something in Twitter other than what I see, and enjoy. I use it all the time, and it works great. I post something, and it shows up, and people interact with it. Yeah, lying to regulators is a bad thing and all, but if you want to go after a social medium that really sucks, take on Facebook. Or Instagram. Or Snapchat. Twitter remains my fave.

This saturation in tech news today reminded me of another story about something I want to complain about, from last week:

How to send text messages from the comfort of your computer — The only reason I read this was because I use an iPhone for my phone, and a PC for my computer. Which means I’m up the creek, unlike people who use all Apple products — their texts are shared smoothly on all their platforms. So I started reading, thinking that maybe, just maybe, I won’t have to shell out a fortune to get a Mac when my Dell gives out. And I read on even though the subhed warned me what was coming: “The process ranges from ‘surprisingly simple’ to ‘ugh’ depending on your mix of devices.” Of course, they save the “iPhone + Windows” scenario for the end, at which point they say that it’s technically possible, but…

So I kind of wasted my time there…

This is more MY kind of quiz — but I still blew it

I think I got a little overexcited, and hurried a bit too much. How else do I explain missing the one that asked, “The Pantheon, rebuilt during the reign of Emperor Hadrian, is a major landmark in which European capital city?”

That was really, really stupid. If only I’d read it a tad more carefully. But I was going to miss a couple of others anyway. People who concoct these tests all seem to think to themselves, Let’s throw in a football one, so Brad misses at least that one. So they do. And I did, because I’d never heard of any of the four people I had to choose from.

I had a similar problem with this: “Which song is the highest-charting single on the Billboard Hot 100 for the band Panic! at the Disco?” Really? That’s a band?

But still, I appreciate the shift to a more general trivia test — since I read less news now, and never read some of the things Slate counts as “news” — and was really enjoying it for the first few questions, thinking I was going to ace it.

Notice that they didn’t go with a staffer as the “ringer” on this one. They went with a “Slate Plus Member,” which is really unfair. We’ve established in the past that the average Slate reader is often smarter than the average Slate editor (and smarter than yours truly, but let’s not get into that).

Anyway, I’ll be interested to see how some of y’all like it…

DeMarco: A New Confederate Statue?

The Op-Ed Page

Florence County Museum.

By Paul V. DeMarco
Guest Columnist

Casting a likeness in bronze and setting it on public property establishes a long-term relationship between a community and the person being honored. Some communities, spurred by an awakened consciousness of the messages Confederate statues send, have chosen to remove them. Others have added markers to provide a broader historical context than the monument alone provides.

But few are placing new statues to honor Confederates. Enter Florence County Council, which has decided by a 5-4 vote that 2022 was finally the time for Florence to do so. “This guy (William Wallace Harllee) formed the reason the town is here,” Council member and statue supporter Kent Caudle told The Post and Courier. “I don’t think that has anything to do with racism.”

Placing a statue because it acknowledges a historical person or event is not reason enough. Those who argue that statues teach us history misunderstand their purpose. There is not enough bronze in the world to properly convey a complete picture of Florence’s 150 years of history. Learning that history requires reading, walking the streets, visiting the museum, and talking with those whose families have lived there for generations.

Statues accomplish a different objective. The best statues are about our values and our future. They capture someone whose life embodies important and timeless principles, ones that can continue to guide us. The worst statues point only backwards, evincing nostalgia for a romanticized version of the past.

Weighing a person’s life is an uncomfortable but critical part of the process. The key is to determine the person’s primary legacy. Lincoln had disabling bouts of depression and, although he always opposed slavery, whether he truly believed blacks were the equals of whites is a question historians still debate. But in summing up Lincoln’s life, these are just footnotes. He was the Great Emancipator and Commander-in-Chief in the war that preserved the Union.

The County Council should apply a similar rubric to their decision to place a statue of Harllee at the Florence County Museum. Here is how I would encapsulate his life: He was a lawyer, businessman, military officer, and legislator from the Pee Dee who was lieutenant governor from 1860-1862, during the time South Carolina seceded from the Union. The fact that Florence is named after his daughter is a footnote in his story.

It seems strange that the County Council would want to honor this man, even stranger that it would override the museum board’s unanimous vote rejecting displaying the statue on museum property.

Perhaps if Gen. Harllee had a strong connection to Florence or had been an important part of the city’s development, it might make more sense. Gen. Harllee did found the Wilmington and Manchester Railroad in 1852, which was the first railroad to locate a depot near what would become Florence. However, Harllee resigned from the company in 1855. Florence was not established until 1872, and Harllee did not live there until 1889. Florence Harllee’s obituary from 1925 states that the railroad construction superintendent, Colonel Fleming, gave the depot the name Florence during its construction circa 1853.

The statue, which is titled “This Place Will Be Called In Your Name, Florence” and shows a larger-than-life Harllee standing beside a railroad track with his left hand on Florence’s shoulder, is deceiving. It invites us to believe we are seeing Gen. Harllee sharing with his daughter a vision of the great metropolis into which her namesake city will grow. However, it appears that Gen. Harllee had no such vision; it was someone else who suggested the name.

The lives of Gen. Harllee and Florence are well documented in the museum as well as online. The sculpture, in the vein of other Lost Cause memorials, attempts to rewrite and idealize the city’s history. Some cities are named after giants. Florence is named after the daughter of a secessionist who oversaw South Carolina’s decision to go to war for the right to continue to enslave. This is a history to be overcome, not to be celebrated.

I do not intend to besmirch the name of the daughter, Florence. She was a devout woman who was proud of her city. She lived more than three decades in Florence, and served the community as a teacher. At one point, Florence was her town’s librarian.

It’s doubtful that Florence would have enjoyed all the fuss we are currently making. According to an article in the Florence News Journal in 2015, she was “quiet and unassuming.” In 1923, when she was seventy-four, she was invited to an elaborate celebration marking the opening of a bridge spanning the Great Pee Dee River to connect Florence and Marion counties. Seats for her and several other family members were reserved, and she was to be publicly recognized. The article reports that Florence said “The very idea of being willing to make a spectacle of ourselves!” and wrote back to the planning committee to politely decline their invitation.

Harllee’s descendants and other admirers had every right to commission this sculpture. But it is a private homage and up to them to find private property on which to display it (although I would urge them not to display it at all). No public funds should be spent on it nor should it be displayed on public property, because it doesn’t do what public sculpture must do: ignite a sense of shared purpose, reminding us of those in our past whose values can propel us into the future.

Paul DeMarco is a physician who resides in Marion, SC. Reach him at pvdemarco@bellsouth.net. A version of this article appeared in the Florence Morning News on 8/17/22.

Postscript: On 8/18, the members of the Florence County Council voted unanimously to reverse their decision after receiving a letter on 8/15 from the Harllee Memorial Statue Committee asking them to do so. The letter stated “It was never the intent of the Harllee Memorial Sculpture Committee to cause any division in this great and prosperous community where we live, work, play, learn and enjoy life.” The Florence branch of the NAACP deserves the credit for mobilizing the community. The council had already received the letter by the time my column was published, so it likely played no role in their decision. I’m just glad they came to their senses so quickly.

Thank goodness I didn’t try eating haggis

Nor did I make myself watch “Braveheart,” on the off chance I would like it better this time.

In fact, I made no effort to acclimate myself to being Scottish, in spite of Ancestry’s bold claim that I was 52 percent thataway. Oh, when my wife and I were discussing where in the world we should travel to next, I mentioned that maybe I had a sort of ancestral obligation to try out Scotland — but I didn’t push it. Frankly, I’d rather go back to England or Ireland — or maybe Wales.

Bottom line, though, I never really believed it. And in spite of Ancestry’s long disinformation campaign of declaring me more and more Scottish — boosting me from a negligible amount to 40 percent, then 48 percent, and then, earlier this year, to 53 percent! — I retained my doubts. And I hoped Ancestry would realize its mistake, and start dialing it back.

Which they have now done, to a rather dramatic degree:

So now, I’m allegedly somewhat more Scots than anything else, but not mostly Scottish. I now await the next adjustment, which should get us back down to something based more in fact. Which means more English, and a good bit more Irish.

Nothing against being Scottish, mind you. It’s just that I don’t think it’s accurate, based on my family tree. Near as I can tell, I’m mostly English, with Welsh, Irish and Scottish all vying for a distant second.

Of course, as I’ve acknowledged before, this may just be because the English managed to keep better records — while busy lording it over those other three groups (and likely destroying a lot of those records). It’s particularly difficult tracing ancestors once they get back to Ireland. I can get them back there, but once in Ireland, they seem to have had no parents or any other antecedents.

But this latest assessment seems closer to reality…

The Ned Stark gimmick

Apparently, a prequel to “Game of Thrones” is about to air, and some folks are very excited about it.

Perhaps you are among them. I am not, although I confess I made a point of watching the original series. Each year that a new season appeared, I signed up for HBO Now (later succeeded by HBO Max) for a few weeks to watch it — and catch up with such things as “Barry.”

I found it entertaining in its own weird way, but was not a fan in the original sense of a fanatic. For instance, I wasn’t the sort to sign petitions demanding that the final season be reshot with a different ending. I thought the ending was fine. I mean, come on — Daenerys needed to go, and if you can’t see that, I suspect you might be one of those who believes the 2020 presidential election was stolen. And the ways the writers tied up the other loose ends were, I suppose, satisfactory. Time to move on, people.

Now the prequel is about to start, which I know because this morning The Washington Post went on and on about it, in five separate stories by my count. You see four of them in the screengrab above. And no, I’m not planning to sign up for HBO Max to watch it. I did skim through some of the stories, though.

For instance, this one, which tries to parse the alleged 6,887 deaths that occurred in the series, began with this (I’d say SPOILER ALERT here, but if you don’t know this, you obviously don’t care about the topic, and therefore haven’t read this far):

The season that started it all. When Ned Stark, the main hero and character supposedly least at risk, was beheaded, viewers everywhere realized that no one was safe.

Exactly. And this reminds me why, from the very beginning, I would never love this series. I don’t like being manipulated that way.

And this was major-league manipulation. You have a bewilderingly numerous cast of actors you’ve never seen before (with the possible exception of Aidan Gillen, if you’re a fan of “The Wire”), but you know Sean Bean, right? And he’s the hero, right? So at the end of the first season, he gets killed off, so that two things will happen:

  1. You’ll get more invested in the other characters, whom you’ve sort of gotten to know over the course of the first season.
  2. You’ve been shocked into believing, with all your heart, that anybody can get killed at any time, which adds suspense during every subsequent second of the rest of the series. (Which only makes the Red Wedding slightly less shocking.)

(And no, this was not a big surprise to those who had read the books, I suppose, but I’m not a member of that set.)

Anyway, I had seen this before, and the first time, I was more impressed by it. Remember the opening scene of “The Hurt Locker”? It starts with Guy Pearce, as a bomb-disposal specialist, getting suited up to approach and disarm an IED. Every little detail of the scene persuades you that he will be the star of the show. He’s obviously the central character of this scene, suiting up for his task with a certain heroic elan. And you know him, from “L.A. Confidential” and, more impressively, from “Memento.” He’s the only then-famous actor in the whole movie, with the exception of the brilliant David Morse, whose later scene as a wound-too-tight colonel pretty much steals the movie.

And then, in that very first scene (SPOILER ALERT, although you’ve certainly seen this coming), he gets blown up. And the “star” of the rest of the movie is Jeremy Renner, whom, at this point in his career, you’ve probably never seen before. (Really. Check out IMDb for any major flicks in which he was the star before this one.)

And you watch the rest of the film thinking, “This nobody could get blown up any second. Hey, they killed off Guy Pearce at the very beginning!”

This is such an obvious and effective gimmick that I’m sure Hollywood had used it before. Maybe you can give me a Top Five list of previous films that did the same thing. (In fact, here’s such a list on which Guy Pearce shows up as No. 6.) But this was the first time I really noticed it, and identified all the elements. It was quite well done. And it impressed me.

When I saw it again in “Game of Thrones,” I was far less impressed. In fact, I was kind of ticked, particularly since they didn’t hit me with it until I had watched a whole season.

Next time I see it, I’ll probably just stop watching…

Guy Pearce, in the opening scene of “The Hurt Locker.”

Well, I had a leg up on THAT question, anyway…

And I did really well — 10 out of 12 questions right!

But it wasn’t good enough. My score of 40 on the Slate News Quiz was edged out by Bill Carey, who is the editorial director for strategy (whatever that is) at Slate, and Mr. Average just squeaked by at 408.

Of course, I wouldn’t have done even that well if not for the gimme question you see above. And I admit I got lucky on guesses on a couple of others. Educated guesses, of course.

Here’s hoping you do better. As practically everyone does, time after time…


I’m a bit obsessed with my iPad, and my iPhone knows it

I hardly go anywhere without my iPad.

Certainly not if I’m going somewhere work-related — a meeting or an interview or whatever — because it’s easy to carry and can perform most work-related functions.

But I don’t go on vacation without it, either. And in the past, I haven’t even left the hotel or B&B without it. When we went to Thailand and Hawaii several years ago, I carried it in a drawstring bag strapped across my chest (I long ago outgrew the trying-to-look-cool thing) or back. See the embarrassing image below.

But by the time we went to Ireland and then to Boston, I’d decided if I absolutely had to do something while walking about on vacation, my phone would do. If I can keep the blasted thing charged.

Still, the iPad goes with me nearly everywhere.

And my iPhone has noticed. Lately, it’s been acting a bit sarcastic about it. Every time I leave the house now — for a walk, or to go to the grocery — I’ve started getting these notifications, like the one above, as soon as I’m a few blocks from the house.

They’re like, “Hey, you — it looks like you left your baby behind! Don’t you want to run back home and get it?”

OK, so maybe this isn’t petulance on the part of the phone. It seems to have started when I allowed the iPad to update its operating system recently. And there seems to be an easy way to turn off such notifications.

But… maybe one of these times, I really would want to go back and get it. So I’m leaving it on for now. I’ll just have to see how much it bugs me going forward…

Here I am with me mate Mark, whom I met on the road to Kanchanaburi in 2015. He’s a retired roofer from England. Note that in addition to the drawstring bag, I’m wearing my tropical-weight travel vest. So I’m really not kidding when I say I’ve outgrown trying to look cool on the road.


There go the trees…

There they go…

I’m sitting here uneasily listening to the buzzing of chainsaws right behind me. Followed by booms.

The trees are coming down today, having been struck by lightning last month.

It happened the first night we were in Boston. July 7.

It had been a long day, flying in from Columbia, with a brief stop at LaGuardia. We had ridden around Boston to get our bearings, then checked into our B&B in Newton before walking around that area — checking out the location of the dance studio where the twins had their training with Boston Ballet, and other local features. We’d been up since 3:30 a.m., had put in 18,515 steps, and were ready to relax when we went to have dinner at O’Hara’s Food & Spirits on Walnut St.

My wife had ordered a chicken pot pie, while I had chosen the broiled steak tips, with a Guinness to wash it down. The food had arrived, and we had just started enjoying it when… I got a text from our neighbor in West Columbia.

A shot my neighbor took that night. Note the Harry-Potter-scar effect…

He was checking on us to see if we were still alive. He was wondering why we hadn’t come out to see what had happened in our front yard — on account of the loud, booming crash. He had had a perfect view of it. He’d been in his garage watching the storm we knew nothing about, and his garage door perfectly framed the spectacle.

He said it was really something — the light flashed around multiple trees in the yard in various colors. He said it seemed to go on for about 10 seconds. He wished he’d shot video. Then, as soon as he said that, he apologized for seeming to speak of the event as a source of entertainment. I said no, don’t concern yourself — I wished he’d gotten video, too. I’d like to have seen it.

He did his best, though, going over and shooting some pictures of the damage. It was pretty spectacular, without the light show. There were streaks of stripped-away bark on at least three of the trees — a couple of pines and a sweetgum.

After we got back, we watched the pine nearest our house start going brown. I had an arborist out to look the situation over, and he said that one would definitely have to go, and most likely a couple of others as well. In the end, we decided to say goodbye to five of them, and they’re coming down now.

Mind you, this was not an isolated incident. We’ve had a rash of this sort of thing on our street lately. A neighbor across the other street (we’re on a corner) had trees behind the house hit a week or so before we did, and those came down just recently. They also had some damage in the house, from the charge running to it underground.

What greeted us when we got home…

We had a bit of that ourselves. At the bottoms of the streaks down the trees were pits where earth had been blasted away, and then trenches dug toward the house where the roots ran. But the damage was minimal. It put two HD TV antennas — the kind you put in a window — out of action, without harming the TVs. It also, I just realized to my sorrow a couple of days ago, destroyed the electronics of my elliptical trainer — which was plugged into the same outlet as one of the antennas, on the same side of the house as the trees.

The worst damage, the thing that worried us the whole time we were in Boston, was that our upstairs air-conditioning went out. When my son checked on things the day after the storm, he found it was 105 degrees up here, so he turned the system off.

But we lucked out there. The only damage was to a valve, the loss of which had confused the system so that it was trying to cool the house by blowing heat. It was easy, and cheap, to fix.

The biggest deal is what’s happening now — the felling of the trees.

After this, the view from our house will be radically different. See that house across the street in the picture below (this is the other street, not the one from which my neighbor witnessed the event)? We’ll have a perfect, unobstructed view of that house after today, since all five trees between us and it will go.

Maybe we’ll plant something in place of them. I’m thinking a Japanese maple. Those are pretty cool…

We’ll have a perfectly clear view of my neighbor’s house when we’re done.

A quick follow-up on that Will column…

Oh my, look! THERE’S an attractive candidate…

If you read the piece that inspired the previous post, you know that Will launched into his topic about the debasement of our politics and our political journalism with the observation that for the likes of Josh Hawley, it’s not about getting anything done, or saying anything meaningful (in his case, certainly not!) — it’s about getting attention.

Like an infant feeling ignored and seeking attention by banging his spoon on his highchair tray, Sen. Josh Hawley (R-Mo.) last week cast the only vote against admitting Finland and Sweden to NATO. He said adding the two militarily proficient Russian neighbors to NATO would somehow weaken U.S. deterrence of China.

Sen. Tom Cotton (R-Ark.), who is an adult and hence not invariably collegial, said: “It would be strange indeed for any senator who voted to allow Montenegro or North Macedonia into NATO to turn around and deny membership to Finland and Sweden.” That evening, Hawley appeared on Fox News to receive Tucker Carlson’s benediction….

Which, for someone like Hawley, is the point.

Anyway, I was reminded of that point this morning when I saw, and reacted to, this: “Cunningham wants end of ‘geriatric’ politicians. Will it cost him help from Biden, Clyburn?”

My response was to say:

But I decided to post this to take it to the next step, which is to point out the connection to what Will was saying yesterday. You don’t have to look far. It’s the lede of The State‘s story:

Joe Cunningham made national headlines when he suggested an end to the “geriatric oligarchy” in political office and said on national television that President Joe Biden should step aside in 2024 and let someone younger run…

You see, he “made national headlines!” He got “on national television!” How terribly exciting. What more could he want?

Well, I’ll tell you what more I want, as a voter. I’d like him to step up and tell me why he, Joe Cunningham, would be a better governor than Henry McMaster. That shouldn’t be hard, if he has anything to say in his behalf at all.

And no, being younger doesn’t cut it. Just as it wouldn’t if you boasted that you are white, and male. I’m looking for something a tad more substantial than that. Yeah, I’m picky…

The nonreading public, and the media that serve it

George Will had a good piece yesterday. It offered multiple levels upon which you could enjoy it.

There was the headline, of course: “Josh Hawley, senator-as-symptom of a broken news business.” But it’s not really about that insufferable little twerp — although you may enjoy the link Will provides toward the end (rendered above in gif form), showing him skedaddling away from his good friends in the Jan. 6 mob. (Frank Bruni had some fun with that as well, in light of Hawley’s new book, which is, hilariously, about being a man.)

It was more about… well, here’s the most appropriate excerpt:

… (T)oday’s journalism has a supply-side problem — that is, supplying synthetic controversies:

“What did Trump say? What did Nancy Pelosi say about what Trump said? What did Kevin McCarthy say about what Pelosi said about what Trump said? What did Sean Hannity say about what Rachel Maddow said about what McCarthy said about what Pelosi said about what Trump said?”

But journalism also has a demand-side problem: Time was, journalists assumed that news consumers demanded “more information, faster and better.” Now, instantaneous communication via passive media — video and television — supplies what indolent consumers demand.

More than half of Americans between ages 16 and 74 read below the sixth-grade level. Video, however, requires only eyes on screens. But such passive media cannot communicate a civilization defined by ideas. Our creedal nation, Stirewalt says, “requires written words and a common culture in which to understand them.”…

The first part of that provides a certain understanding of what is wrong with today’s political journalism, and we can talk about that all day. Will employs the analogy I’ve used a gazillion times in recent decades about how reporters cover politics the way they would sports — there are only two sides to anything, and all we care about is which of the two wins, to the extent that we care.

Of course, that’s an insult to sports, the more I think about it. Actual sports contain far more nuance, variety, color and humanity than the ones-and-zeroes coverage we get of politics these days.

But the thing that really grabbed me was this one sentence:

More than half of Americans between ages 16 and 74 read below the sixth-grade level.

It grabbed me not simply because such low levels of literacy are distressing in themselves. It’s because of the larger point Will is making, which is that in an atmosphere of such plunging intellectual engagement, we’ve seen political journalism change “from reporting what had happened to reporting what was happening, and now to giving passive news consumers the emotional experience of having their political beliefs ratified.”

And that’s the essential problem, or at least one of the essential problems. To engage with politics meaningfully and constructively requires the active mental process of reading. The passive mob, engaged only to the extent of its members’ sense of identity with one of the two sides (and there can only be two, under the current rules of the stupid game), cannot possibly maintain a healthy, vital republic of the sort our Founders established.

To be a citizen, you can’t just twitch. Nor can you merely go about feeling strongly about this or that. You have to think. And of course, we don’t see very much of that anymore…

 

The Hero’s Journey

Sometimes in this distracted age, our myths let us down.

I got to thinking about that this morning:

OK, I remember that Obi-Wan let Darth win. It was a deliberate sacrifice, which I’m sure means a great deal in the theology of the Force, or would if there were such a theology. For us caught up in the film, I suppose the point was that it was so important to let the guys rescue Princess Leia, and even more importantly, destroy the Death Star (remember what it did to Alderaan), that he was willing to give his life to make it happen. (I’m not entirely sure why he couldn’t do all that and beat Darth, too, but I suppose Darth needed to live so there could be another movie, and so Anakin could be redeemed in the end.)

But anyway, he lost. And in this case, I’d rather see Rep. Cheney win and You-Know-Who lose. But I guess we can’t have everything.

My point, if I have one, is that this reminded me of something I’ve thought about a good bit lately. Actually, I’ve been thinking about it for several years, but I’m not asking you to be impressed — I suppose others have thought about it for millennia. It was when I was reading Rubicon by Tom Holland.

And as always, when I read about those days, I’m struck by how much the Trojan War comes up. Over and over and over again. It’s like the Greeks just had this one story they kept going back to, and of course, the Romans — as industrious as they were in so many other ways — couldn’t be bothered even to come up with one story of their own, so they stole the Greeks’. Which was their way.

If they came up with another story — like the one about Odysseus/Ulysses — they couldn’t even separate that new one from the big one. Sure, that’s about him and his boys being lost for years on the way home — but they were on the way home from… the Trojan War.

It even comes into the Romulus and Remus story, although I’m always forgetting how exactly.

Seems like they could have come up with some other stories. But they didn’t. They liked that one, and they stuck with it. Sort of makes me feel bad that I’ve never read the originals — not the Iliad, or for that matter the Aeneid. But you see, I have no Greek beyond Kyrie Eleison, and my Latin — despite the best efforts of the legendary Mrs. Sarah T. Kinney of Bennettsville High School — remains inadequate to tackling literature. I mean, I know that Gallia est omnis divisa in partes tres, but I don’t know what comes next.

And yes, I know millions of people over the ages — or a lot of them, anyway — have contented themselves with translations, but it just seems that after all this time, I could have made myself learn Greek. But I didn’t, so I leave it alone. I know the basic story, though — that horndog Paris caused a heap of trouble, and it went on for a bunch of years, and ended with a fake horse. I content myself with that. At least I don’t have to study Communism or Nazism or anything to get what the war was about. Pretty basic, really, even though it’s a bit hard for a modern mind to fully grasp why most of those other people went along with having a war over it.

That’s not my point, though. My point is that I started thinking about it again lately when I read a piece in The Wall Street Journal headlined, “The Power of Our New Pop Myths.”

Yeah, I know — the paywall. Actually, it’s getting in my way at the moment, too — some problem with my password I’ve had for about 20 years. Which I’m not going to change. But anyway, the subhed is “Marvel, Star Wars and other franchises have become central to our culture by returning to a primal form of storytelling,” and it begins like this:

And so forth. It’s sort of related to a complaint I frequently voice about Hollywood being unable to come up with fresh stories. They just keep recycling the same yarns. (How many Spider-Man origin movies have we had in the past few years?)

Kind of like with the ancient Greeks and Romans, but at least we have more than one story. There’s Marvel, there’s Harry Potter, there’s Bilbo Baggins, and Dune if you like. There’s the Matrix. All of which are at least entertaining, the first time you hear them.

And of course, between the Trojan War and Peter Parker, we Westerners who have at least paid some attention to the actual bases of our culture have had, with the help of the ancient Hebrews, the rich stories of the Bible, and a religion that speaks to me and many others of eternal verities, which if you’ll forgive me, I find even more meaningful than learning about the Kwisatz Haderach.

Which brings me back to Bishop Barron, who as you know continues to impress me with the power of his Sunday sermons.

He had a good one this week, in which he got all Jungian on the way to teaching an important lesson about what God wants from us.

His title was “Go on a Hero’s Journey,” and in it he gets into such stories as “The Hobbit.” It’s about how comfortable Bilbo was in his Hobbit hole, as hobbits tend to be, and beyond that about the inconvenient fact that that’s not what God wants. Like the dwarves who invade Bilbo’s sanctuary, and like Gandalf, he wants us to get out there and have an adventure, one that actually matters.

Anyway, I’m not going to recite the whole sermon to you; you can watch it below. I recommend it highly…

Sometimes, history is quite disappointing

I’ve remarked a number of times recently, I think, on the fact that no matter how much history I think I know, I keep getting slapped in the face by the fact that I don’t know squat about it.

Even when you limit it to a certain period I’ve obsessed over, I keep learning things that you would have thought anyone would have known. But I didn’t. Makes me humble — almost. I wish it would make those people on both sides of the CRT battles — who all think they know everything they need to know about what went before, and what it means — humble. Or at least quiet them down a bit. Because they get tiresome.

I had this happen again a few minutes ago. For reasons having nothing to do with this post, I happened to look up a town called Jeannette, Pennsylvania. A guy named it for his wife. It’s a pretty new town, only founded in 1888. You’d think it was out West or something, but no. Near Pittsburgh, which is only out West if you’re in Philadelphia.

Anyway, I read that its 2010 population was 9,654. Which made me think of my hometown, Bennettsville. Y’all know, of course, that I use that term “hometown” loosely, as only a Navy brat can. I grew up in America — mostly — rather than one bit of it. But I was born there, and it was the place I returned to in the summers, and I spent the entire 9th grade at Bennettsville High School, back when there was one (go, Green Gremlins!). I feel a great fondness for the place, but as I’ve said repeatedly, I could walk all the way through downtown on Main Street and not be recognized by anyone, unless I got lucky.

So I looked up B’ville on Wikipedia as well, and found that as I thought, the population was close to the same — 9,069 in 2010.

But then I read on, and got to this:

The city of Bennettsville was founded in 1819 on the Great Pee Dee River and named after Thomas Bennett, Jr., then governor of South Carolina….

I’d never thought about it before, but I guess I’d always assumed it had been named for, you know, somebody who lived there in the early days. Some plucky pioneer who was among the first Europeans to turn the sod on the banks of the Pee Dee, or who operated a ferry, or some such.

But no, this guy was just — the governor. Some guy from Charleston. It appears he raised some questions about the conduct of the Denmark Vesey investigation, trial and executions. Perhaps the points he raised were to his credit. It’s a bit hard to tell, because the article isn’t very well written.

But that’s all irrelevant to the point that, aside from having it named for him, I don’t see anything that indicates he had anything to do with Bennettsville. Or Marlboro County, for that matter. Or the Pee Dee, even.

Which is rather disappointing. It’s like founding a town and naming it for Henry McMaster, even though he’d never been there. Don’t you think that’s kind of lame? I’d think it was lame even if Henry were a more interesting and distinguished governor. Which, as we know, wouldn’t take much.

I’m not lobbying to change it, of course, even though B’ville has plenty of more interesting sons and daughters — Hugh McColl, Marian Wright Edelman, or if you want someone more recent, Aziz Ansari. I mean, come on — it was the home and base of operations of Sen. Jack Lindsey! Why, my Uncle Woody embodies the town, far as I’m concerned, and could entertain you enormously telling stories about it. But it’s not named for him, either.

But again, I love the name “Bennettsville,” and wouldn’t change it. It has a certain warm, rounded feel. It’s part of my own deepest identity, one of the essential “B” names and words for which I’ve always felt such a keen comfort and affection. (Have you seen me in my new B hat?) Like the color blue.

I just wish we had a better reason for the name. Maybe there is one, and it didn’t make Wikipedia. I’ll have to ask Walter Edgar, next time I see him. Being a real historian, he knows stuff like that…