Did y’all see the “news” the other day — ironically, the day before my grandson was born — that white babies are no longer the majority? I first heard it on NPR’s Talk of the Nation. Quoth Neal Conan:
We’ve known for years this day would come, but here it is. The Census Bureau announced today that nonwhite births now make up a majority in the United States. Data gathered in 2011 show that nonwhite, Hispanic, African-American, Asian, Native American, mixed race and others combined for 50.4 percent. That’s the first time that white births were not a majority in U.S. history, and that raises some questions about policy – from education to social services programs – and about how we see ourselves as a nation….
Perhaps this is a good time to inject a bit of historical perspective…
I’m still off-and-on gradually making my way through Charles C. Mann’s 1493, while reading several other books at varying rates, and every time I read a stretch in it, I learn something startling. For instance, I refer you to the fact that for most of the history of Europeans in the Americas — up to the mid-nineteenth century — there were far more people of African descent in the Western Hemisphere than there were whites. Way more. An excerpt (I hope the publishers will excuse the length of this quote. I share it within the context of urging you to run out and buy this book; there are many other things in it that will surprise you, and enlarge your understanding of our world.):
I used to think of that as anomalous. I thought of it as helping explain the fact that South Carolina slaveholders were more fanatically devoted to their Peculiar Institution than the white elites anywhere else. Hence that firing on Fort Sumter thing.
But as it turns out, if you look at ALL post-Columbian immigration across the hemisphere, not just English, you see that far, far more were brought here as slaves from Africa than came here, either free or indentured, from all Europe. By 1860, this balance had changed in many places (thereby making SC somewhat anomalous), but for most of the time from 1492 until then, a larger black population had been the norm. (Of course, for that same period, there remained more Indians than whites or blacks, in spite of the way native populations had been decimated by European and African diseases.)
I also found it surprising because I spent part of my childhood in Latin America, and it did not prepare me for this statistic — even though I studied history in Spanish in school (Mann’s references to Columbus as Cristóbal Colón seem very natural to me). In Ecuador, where I lived for two-and-a-half years, it was very unusual to see anyone who looked at all African. I knew that Brazil had imported vast numbers of slaves during the colonial period, and that you could see the results on the streets of Rio. I would have said the same of the islands of the Caribbean.
But for there to be that many more blacks than whites across all the Americas? I had no idea. We all are aware that black labor largely built this country, but I guess I thought that was because those workers were owned by a white majority. I was wrong. At least from a hemisphere-wide perspective.
In any case… whites not being the majority? Nothing new about that on this side of the world.
House unanimously passes “born alive” bill (?????) — Note the question marks, and the lack of a link. I haven’t seen an MSM story on this. Will Folks is running a press release saying this happened (and as he notes, “The above communication is an email from a politically active organization”), and I received a release from Senate Republicans congratulating the House. No independent news coverage yet. But it seemed like something that would generate controversy, and I didn’t want to ignore it.
They Moved a Robot With Their Minds (NYT) — “Scientists said a tiny brain implant allowed two people who are virtually paralyzed below the neck to maneuver a robotic arm.” Wow.
I just thought I’d add some elaboration on the Medal of Honor presentation. From the AP story:
Obama presented the Medal of Honor to Sabo’s widow, Rose Mary, and said doing so helps right the wrongs done to a generation that served freedom’s cause but came home to a brooding and resentful nation.
“Instead of being celebrated, our Vietnam veterans were often shunned,” Obama said in a hushed East Room. “They were called many things when there was only one thing that they deserved to be called and that was American patriots.”
Spec. Leslie H. Sabo Jr. of Ellwood City, Pa., was serving with U.S. forces near the village of Se San in eastern Cambodia in May of 1970 when his unit was ambushed and nearly overrun by North Vietnamese forces.
Comrades testified that the rifleman charged up from the rear, grabbed an enemy grenade and tossed it away, using his body to shield a fellow soldier. And shrugging off his own injuries, Sabo advanced on an enemy bunker that had poured fire onto the U.S. troops — and then, pulled the pin on his own grenade.
“It’s said he held that grenade and didn’t throw it until the last possible moment, knowing it would take his own life but knowing he could silence that bunker,” Obama recounted. “And he did. He saved his comrades, who meant more to him than life.”
Sabo was 22 years old when he gave his life for his comrades.
The P.M. flashes his famous V-for-Victory sign. We can't tell, from this photograph, whether he was flashing an "E" sign with the other hand.
After The Times played down its advance knowledge of the Bay of Pigs invasion, President Kennedy reportedly said he wished we had published what we knew and perhaps prevented a fiasco. Some of the reporting in The Times and elsewhere prior to the war in Iraq was criticized for not being skeptical enough of the Administration’s claims about the Iraqi threat. The question we start with as journalists is not “why publish?” but “why would we withhold information of significance?” We have sometimes done so, holding stories or editing out details that could serve those hostile to the U.S. But we need a compelling reason to do so.
— Bill Keller, then-executive editor, The New York Times, June 2006
The apology came a bit late for Kennedy, the AP correspondent, who died in a traffic accident in 1963.
This is not to confuse him with another Kennedy who also died in 1963, and had earlier persuaded The New York Times to back off the Bay of Pigs story for national security reasons. (See Keller quote above.)
The icing on this tale came today, when we learned that the AP knew last week that the United States was closing in on Underwear Bomber II in Yemen — but withheld the news at the request of the government. That’s what the WSJ reported this morning, anyway:
U.S. officials had known about the plot for about a month, and President Barack Obama was briefed on the plot in April. White House officials had persuaded the Associated Press, which had an account of the plot in hand as early as last week, to hold off on publishing because the intelligence operation was still under way.
This is fascinating. It was one thing to stand shoulder-to-shoulder with government censors in 1945, when the entire country was united in the all-out effort to win WWII, and cooperation with the censors was reflexive.
But today? When it is fashionable to call the War on Terror the “so-called War on Terror”? When, as Keller mentions above, the leftward side of the political spectrum persists in excoriating the media for not being skeptical enough prior to the Iraq invasion? For a major media entity to respond with a snappy salute to a government request to be discreet is decidedly remarkable.
This will no doubt spark dark rumblings — and probably already has; I’m not bothering to look — among Republicans about whether the AP would have agreed to this request if it had come from the Bush administration.
Interesting question.
What do you think about all of this? Oh, you want to know what I think. Well, I don’t know enough to have an opinion yet. I’d like to know what the AP knew, and what it was told before it made the decision to hold back on the story. The default position for a journalist is to report a story when you know it’s true, as Keller noted above. But this sounds like it’s one of those rare cases in which lives may have been saved by holding back — which would justify the decision to wait.
President Barack Obama makes a point during one in a series of meetings in the Situation Room of the White House discussing the mission against Osama bin Laden, May 1, 2011. National Security Advisor Tom Donilon is pictured at right. (Official White House Photo by Pete Souza)
OK, my Corleone metaphor aside, let’s address the actual political question before us: Does Barack Obama deserve any particular credit for “getting” Osama bin Laden, or would “anyone have done what he did?”
This is actually a very important question. When deciding who should be one’s president going forward, there is no more important question than whether he would be an effective commander in chief (or in the case of the incumbent, whether he is an effective commander in chief).
Republicans, including some who should know better, are essentially saying Obama did nothing that anyone else wouldn’t have done. They are wrong. I initially thought as they did — not that I wanted to take anything away from the president, but because I thought it was true — but as I read and learned more about the decision-making process leading to the raid on Abbottabad, I changed my mind.
Last night, I inadvertently saw a few seconds of TV “news.” John McCain was saying that of course Mitt Romney would have done the same thing, or something along those lines.
Well, as it happens, we have strong reason to believe that Jimmy Carter would have ordered such an operation. He actually did order a roughly comparable one. It failed, as military operations sometimes do. (The one Obama ordered could have failed, too, at a number of critical points. That’s one reason he deserves credit for having the guts to give the order.) But he ordered it. It was a big deal that he ordered it. His secretary of state resigned over it.
But would “anyone else” have done the same? There is little reason to think so. It would have been Bill Clinton’s M.O., for instance, to have flipped a couple of cruise missiles in that direction. And as we saw in Kosovo, he had a predilection for air power rather than boots on the ground. But… and this is a huge “but”… is it fair to make the assumption that the real-life Bill Clinton of the 1990s would have been as reluctant, as cautious, post-9/11? It’s impossible to say.
What we do know is that in real life, there was sharp disagreement and debate in the Obama administration over how to proceed — whether to believe the assumptions based on incomplete intelligence (for doing that, George W. Bush earned the never-ending “Bush lied” canard), whether to act on them at all, whether to send in troops at all or simply bomb the compound, whether to send a joint force or a coherent Navy team, whether to notify the Pakistanis or just go in, whether to try to capture bin Laden or go in intending to kill him, whether to bring back his body or send it to sleep with the fishes.
And when I say debate within the administration, I don’t mean between what the Republicans would characterize as the Democratic sissy politicos, but among the professionals — the generals and admirals and Sec. Gates.
And at critical stages, the president and the president alone seems to have made very tough calls. And the right ones. Most importantly, he decided to send in men rather than just bombs. That way, he could be certain of the result, he could minimize collateral damage — and the U.S. could reap an intelligence bonanza.
That took nerves not everyone would have. So many things could have gone wrong doing it this way — and nearly did. In what had to feel like a replay of Jimmy Carter’s debacle, we lost a helicopter. But having learned that lesson, we had backups.
Some Republicans would have you believe that giving Obama credit would take away somehow from the superb, almost superhuman job that the SEALs and the rest of the military and CIA team did. Nothing could be further from the truth. It stands as one of the most amazing coup de main operations of the past century. They performed as brilliantly as the Israelis did at Entebbe, for instance. But they had their roles to play, and the commander in chief had his. And all involved did their jobs remarkably well.
I refer you to two posts I wrote last year, as I came to the conclusion that Barack Obama personally deserved credit for the leadership calls that led to our killing bin Laden. Here they are:
I invite you to go back and read them, to see how I reached a conclusion very different from the line we’re hearing from Republicans now.
There is no way of knowing whether Mitt Romney would have made the same calls. I suspect that he might have erred on the side of caution, but I could be completely wrong about that. He might have acted in exactly the same manner. But what I know is that Barack Obama did — and that what he did is not just “what anyone would have done.”
President Barack Obama and Vice President Joe Biden, along with members of the national security team, receive an update on the mission against Osama bin Laden in the Situation Room of the White House, May 1, 2011. Seated, from left, are: Brigadier General Marshall B. “Brad” Webb, Assistant Commanding General, Joint Special Operations Command; Deputy National Security Advisor Denis McDonough; Secretary of State Hillary Rodham Clinton; and Secretary of Defense Robert Gates. Standing, from left, are: Admiral Mike Mullen, Chairman of the Joint Chiefs of Staff; National Security Advisor Tom Donilon; Chief of Staff Bill Daley; Tony Blinken, National Security Advisor to the Vice President; Audrey Tomason, Director for Counterterrorism; John Brennan, Assistant to the President for Homeland Security and Counterterrorism; and Director of National Intelligence James Clapper. Please note: a classified document seen in this photograph has been obscured. (Official White House Photo by Pete Souza)
An old friend sent me the above video. When I got home last night, I asked my wife to watch it without telling her why. She looked at it only a second before saying “Michael!”
Yep. The guy playing the “English teacher” at the beginning is Michael Mercer. Michael and I started out as copy editors together at The Jackson (TN) Sun in 1975, soon after I graduated from Memphis State. Michael got out of the business long before I did, taking a teaching gig at Auburn. Now he’s at another college in San Antonio, as he explained when I asked about the video:
The young lady featured in the film is one of my student-advisees at the University of the Incarnate Word in San Antonio. She’s a communication arts major concentrating in journalism — although broadcast like most of them do today. She asked me at the last minute to be in the video that she and the filmmaker — her boyfriend from another school in San Antonio — were doing for a video contest promoting San Antonio park recreation.
They didn’t win the contest but I thought they, too, did an excellent job. I was only familiar with the classroom scene where they asked me to mouth a few words as an “English teacher.” Those other students in the video in the classroom are UIW students but not any of mine. We spent about an hour shooting various takes, angles, short bites. I was told it would only take about five minutes. Then a week later, my student asked me to wear the same shirt and pants for a scene they wanted to shoot minus the class in that same classroom showing me walking out the door after that “chill” comment.
No, most of my South Carolina readers won’t know Michael, but some of our former colleagues will see this, which is why I share it here. The fact that I can do so so easily — the fact that a student could even produce something like this — is testament to how the world has changed since Michael and I started out.
In those days, the copy desk was still a big horseshoe (or “elephant’s commode,” as one of my Tennessee colleagues referred to it), with the slot man (or woman) sitting in the center, distributing copy to those on the rim who would edit it and write headlines as assigned by the slot. The copy and headlines would then be passed back to the slot for checking before being sent to the composing room. Except that the editing wasn’t done on paper at this point. The text had been scanned and output onto a paper punch tape, which was clipped to the hard copy with a clothespin (without clothespins, we couldn’t have gotten the paper out). After an editor received the copy with attached tape from the slot, he or she would take it over to a Harris 1100 editing machine, and feed the punch tape into it. The copy would appear on a CRT screen, and the editor would use a keyboard to edit it. When done, the edited text would be output to another punch tape of a different color, which the editor would roll up (using a little electrical device that was sort of like one of those handheld, flashlight-sized fans) and clip back to the copy. That bundle is what the editor would pass back to the slot, along with a headline written in pencil on a hand-torn strip of paper.
A couple of months after I joined The Sun, I was pulling shifts in the slot, and I found I liked it so much that by the time I moved on from the desk, I was doing it most days. The job entailed what would have been three to five jobs at a paper the size of The State in those days. The slot not only supervised the editing process, but laid out the entire A section, monitored the wires and selected all wire copy, and oversaw the production process in the composing room. If a page was late, it was the slot’s fault. And in those days, things were so loose and informal at The Sun that an assertive slot (which, I confess, I was) could pretty much decide how all of the news in the paper was played, including local copy.
The day started at 5:30 a.m., and the whole first edition (which was more pages than you find today in The State) had to be out at 11. Then we’d grab a quick lunch before having the city edition out by 1:30.
Doing that job at the age of 22 gave me a lot of confidence that stood me in good stead in the years to come. And it gave me a taste for calling the shots. Which is why after that gig, I only spent a couple of years as a reporter before becoming a supervising editor. You can learn a lot by starting out in a small pond.
Michael followed a similar path, without being quite as power-mad as I was (you can probably tell in his brief appearance that Michael is a nicer guy than I am — which I’m betting is why he was cast in this film; I’m sure he’s the sort of teacher who might be students’ favorite). He was one of my assistant editors over the news reporters at The Sun in later years.
And now there he is, playing the “English teacher.”
I just finished reading Mutiny on the Bounty, for the first time — I think. I initially had this vague idea that I had read it as a child. Yet most of it seemed fresh to me. Of course, I knew at every step of the way what was to happen next. Who doesn’t know the general outline of the story? Who hasn’t seen at least one of the Hollywood versions? But the actual words seemed fresh as I read them, and certain things about it — such as the fact that, bizarrely, the English sailors refer to the people of Tahiti as “Indians” throughout — seemed totally unfamiliar.
In any case, I’m certain I’ve never read either of the sequels, Men Against the Sea or Pitcairn Island. That is to say, I’ve never read the “chapter book” versions. I have a clear memory of reading the Classics Illustrated version of Men Against the Sea. What sticks out in my mind is the desperate men in the open boat managing to kill a seagull, and Captain Bligh rigidly serving out portions of its blood to the neediest men on board. (Or do I remember Charles Laughton doing that in the 1935 film?)
Anyway, now that I’ve finished the first book, I’m wondering whether it’s worth my while to read the second. I know what happened — Bligh, a tyrant of a captain but an extraordinary seaman, manages to get himself and 17 others safely to Timor, 3,500 miles away, in an open boat with practically no provisions. It stands as one of the most extraordinary feats in seafaring history.
But I’ve got to think it’s not much fun to read. It’s a tale of horrific suffering, day after day. And the main protagonist is a guy who’s hard to like. I mean, Mutiny on the Bounty had gorgeous topless Tahitian girls. (No pictures, but still…) What’s this got to recommend it?
Perhaps the fact that it’s told mainly through the experience of Thomas Ledward, acting surgeon, helps you root for these guys a bit more than you otherwise might. I’m pretty sure I wouldn’t want to read it in Bligh’s voice.
Anyway, has anyone out there read it? Did you like it? And if so, why?
Christians Debate: Was Jesus For Small Government?
What would Jesus do with the U.S. economy?
That’s a matter of fierce debate among Christians — with conservatives promoting a small-government Jesus and liberals seeing Jesus as an advocate for the poor.
After the House passed its budget last month, liberal religious leaders said the Republican plan, which lowered taxes and cut services to the poor, was an affront to the Gospel — and particularly Jesus’ command to care for the poor.
Not so, says Wisconsin Republican Rep. Paul Ryan, who chairs the House Budget Committee. He told Christian Broadcasting Network last week that it was his Catholic faith that helped shape the budget plan. In his view, the Catholic principle of subsidiarity suggests the government should have little role in helping the poor.
“Through our civic organizations, through our churches, through our charities — through all of our different groups where we interact with people as a community — that’s how we advance the common good,” Ryan said.
The best thing that government can do, he said, is get out of the way.
But Stephen Schneck, a political scientist at Catholic University, says he thinks Ryan is “completely missing the boat and not understanding the real heart, the real core, of Catholic social teaching.”…
At the time, I zeroed in on Ryan’s (rather restrictive and misleading) use of “subsidiarity.” What I didn’t get into was the bigger subject: What would Jesus do politically? What sort of government would he advocate?
In a sense, it’s a stupid question, in that it really can’t be answered authoritatively.
We are hobbled by the fact that Jesus wasn’t into politics. In his day, that simply wasn’t in the hands of the people, and therefore there could be no moral imperative to shape one’s society. He taught people how they should live their lives in the world as they found it.
Such issues as “the size of government” (which has always seemed like a ridiculous thing to talk about, as though there could be an objectively ideal “size” — of course, that’s me talking, not Jesus) simply were not anything an average person had any control over. That was up to Caesar. Or the Senate. Or on the more local level, the Tetrarch or Pilate. Or the Sanhedrin. In His day, government actually was what libertarians imagine it to be today. It was “they,” something outside of and apart from the individual.
One of the tough things about applying moral teachings from the Bible to our own time and place is that our relationship to government today is so radically different. For the first time in human history most people (in Western countries, at least) now have a moral responsibility for the world around them, because they have a say in it. They elect the leaders who make the laws. That was unthinkable in Jesus’ day.
Jesus had a live-and-let-live attitude toward government. Unlike his apostle the Zealot, he wasn’t interested in revolution. And if you tried to engage him in a discussion of the morality of taxation, he said render unto Caesar — that was Caesar’s business, not his.
The challenge that Christians have today is what to do in a world in which they have a say in the government. But they don’t get all that much guidance from the Bible, which is why Christians run the gamut from left to right on the political spectrum.
There’s no question, for instance, that we are called upon to care for the poor. But both left and right can make cases for their positions. The left will insist that government must do that job; the right will insist that it must be done by private entities.
The weakness in the left’s argument is that, in this country at least, what the government does is by definition done outside the Christian framework. Government can’t say, “What would Jesus do?” and act accordingly, on account of the way we currently interpret the First Amendment.
The weakness in the right’s argument is that since a Christian today does have responsibility for his government, he should advocate that his government act in accord with his beliefs. If we are enjoined to minister to the poor, then we should vote accordingly. Our vote should be an instrument of Christian charity just as our tithe at Church is.
Ironically, it is so often people on the left who object to anyone trying to make the government an agent of any sort of religious agenda. (I point you to liberals’ horror at what they perceived Rick Santorum as being about.)
In the end, Christians on the left and on the right will tend to imagine what a “Citizen Jesus” would do if he lived in a modern liberal democracy in terms of what they themselves believe politically.
When, of course, we know he would have voted UnParty…
On a previous thread, young Kathryn scoffed at people my age, suggesting that we think the music that was popular when we were in high school (and I would add, college) was great just because it came along when we were young.
I think people of any age are going to have a special feeling for music that was played when their hormones were raging at their peak. But while I hesitate to invoke an “objective” standard, I think you can demonstrate with some degree of detachment that the period in question for, say, Burl and me (roughly 1965-1975) was one of extraordinary creativity on many popular fronts.
There were so many genres just exploding:
British pop groups and their American imitators (what everyone thinks of first). And I’m not going to bother splitting this into its many sub-genres.
Folk, evolving from acoustic to electric, in numerous directions (Dylan and Simon and Garfunkel are very different)
Varieties of soul, from Motown to Memphis
Burt Bacharach — He gets his own category. If you want to create a 60s feel in a movie, you’re as likely to turn to Bacharach as the Beatles — if not more so
Latin (Spanish variety), spanning a broad spectrum from Herb Alpert to Trini Lopez to Jose Feliciano (Alpert is as essential as Bacharach to a 60s soundtrack)
Latin (Brazilian variety), from “The Girl from Ipanema” through Sergio Mendes and Brasil ’66
Old folks/Rat Pack-style — Dean Martin and others reached broadest audiences ever on TV
Crossover country — spanning a wide spectrum from Glen Campbell to Johnny Cash, enjoying wide popularity not seen before or since
West coast beach music (surf music) — Yes, it came along earlier, but there was still a lot going on in the early part of this period (“Wipeout,” the later Beach Boys stuff)
East coast beach music — This movement started in the 40s, but some of the big hits came along in the early part of this period (“Can’t Help Myself”)
Even Broadway show tunes — Almost every show tune I’m familiar with was sung repeatedly on the 60s TV variety shows
White blues — big overlap with British groups here (The Animals, Cream, early Led Zeppelin), but Paul Butterfield and others sort of stand alone
Then there are all those bands and individuals that can’t be easily categorized — Warren Zevon, Randy Newman (late in the process), David Bowie, The Band, Creedence Clearwater Revival, Janis Joplin, Linda Ronstadt, Elton John, Blood, Sweat and Tears…
I just can’t think of a time when so many kinds of music were so huge, and reaching such a diverse audience (in the pre-cable age, everyone was exposed to pretty much the same cultural influences — if it got on TV, the audience was immense), with so much energy and creativity exploding out of every one of them.
Can you?
It wasn't just about guitar groups -- not by a long shot.
I was intrigued by this question that The Washington Post posed on Twitter today: “Could the Web generation uncover a Watergate-type scandal?”
I followed the link and saw that the piece was based on a panel discussion featuring Bob Woodward and Carl Bernstein. They had their doubts:
“One of the colleges asked students in a journalism class to write a one-page paper on how Watergate would be covered now,” said Bob Woodward, “and the professor — ”
“Why don’t you say what school it was,” suggested Carl Bernstein, sitting to Woodward’s left in a session titled “Watergate 4.0: How Would the Story Unfold in the Digital Age?”
“Yale,” Woodward said. “He sent the one-page papers that these bright students had written and asked that I’d talk to the class on a speakerphone afterward. So I got them on a Sunday, and I came as close as I ever have to having an aneurysm, because the students wrote that, ‘Oh, you would just use the Internet and you’d go to “Nixon’s secret fund” and it would be there.’ ”
“This is Yale,” Bernstein said gravely.
“That somehow the Internet was a magic lantern that lit up all events,” Woodward said. “And they went on to say the political environment would be so different that Nixon wouldn’t be believed, and bloggers and tweeters would be in a lather and Nixon would resign in a week or two weeks after Watergate.”
A small ballroom of journalists — which included The Washington Post’s top brass, past and present — chuckled or scoffed at the scenario…
I also enjoyed the way the piece, written by Dan Zak, characterized the Woodstein legacy:
Tuesday’s panel briefly reunited the pair, whose untangling of the Nixon administration inspired a generation of journalists who have since been laid off or bought out in large numbers. Woodward and Bernstein’s main point was evocative of a previous, plentiful era: Editors gave them the time and encouragement to pursue an intricate, elusive story, they said, and then the rest of the American system (Congress, the judiciary) took over and worked. It was a shining act of democratic teamwork that neither man believes is wholly replicable today — either because news outlets are strapped or gutted, or because the American people have a reduced appetite for ponderous coverage of a not-yet-scandal, or because the current Congress would never act as decisively to investigate a president.
For the record, while I may indeed be one of those “who have since been laid off or bought out in large numbers,” I didn’t get the idea to go into journalism from these two guys — however much their example may have encouraged me. I was already working as a copy boy at The Commercial Appeal when I first heard of them…
Slate has put up a really interesting photo slide show invoking the “Mad Men” era, to help us all get psyched up for the season premiere coming Sunday.
I read something that surprised me this morning, in a book review in The Wall Street Journal. As is fairly typical in opinion pieces in the Journal, the reviewer repeatedly expressed disdain for the author of a book about Irish politics in Boston whenever he failed to be sufficiently conservative (praising him for not dwelling on the Kennedys, castigating him for insufficiently respecting the Southies who fought busing for integration). But I was startled by this revelation:
Unfortunately, Mr. O’Neill has produced a rather straightforward recapitulation of Irish politics in the Hub, sticking to the well-established narrative of mustache-twisting Brahmins (or “Yankee overlords,” in Mr. O’Neill’s phrasing) doing battle against spirited, rascally Irish politicians. Indeed, “Rogues and Redeemers” doesn’t so much upend myths as reinforce them. In Irish America, tales of rampant employment discrimination by Yankee businessmen, who posted signs warning “No Irish need apply” are accepted as gospel. Such anti-Irish bias, writes Mr. O’Neill, was “commonly found in newspapers” and became “so commonplace that it soon had an acronym: NINA.”
But according to historian Richard Jensen, there is almost no proof to support the claim that NINA was a common hiring policy in America. Mr. Jensen reported in the Journal of Social History in 2002 that “the overwhelming evidence is that such signs never existed” and “evidence from the job market shows no significant discrimination against the Irish.” The tale has been so thoroughly discredited that, in 2010, the humor magazine Cracked ranked it No. 2 on a list of “6 Ridiculous History Myths (You Probably Think Are True).” Mr. O’Neill doesn’t inspire confidence by faithfully accepting NINA as fact…
I spent a few moments just now checking to see to what extent it is true that the NINA phenomenon is a “myth” of victimization. What I found kept directing me to the aforementioned Mr. Jensen, whose article on the subject is much cited.
But even Jensen documents that some (although not many) ads saying “No Irish Need Apply” appeared in American newspapers during the period. And no one disputes that such prejudice against the Irish was common in Britain; the only debate has to do with the extent of the practice in this country.
The NINA slogan seems to have originated in England, probably after the 1798 Irish rebellion. Throughout the 19th and 20th centuries it was used by English to indicate their distrust of the Irish, both Catholic and Protestant. For example the Anglican bishop of London used the phrase to say he did not want any Irish Anglican ministers in his diocese. By the 1820s it was a cliché in upper and upper middle class London that some fussy housewives refused to hire Irish and had even posted NINA signs in their windows. It is possible that handwritten NINA signs regarding maids did appear in a few American windows, though no one ever reported one. We DO have actual newspaper want ads for women workers that specifies Irish are not wanted; they will be discussed below. In the entire file of the New York Times from 1851 to 1923, there are two NINA ads for men, one of which is for a teenager. Computer searches of classified help wanted ads in the daily editions of other online newspapers before 1923 such as the Brooklyn Eagle, the Washington Post and the Chicago Tribune show that NINA ads for men were extremely rare–fewer than two per decade. The complete absence of evidence suggests that probably zero such signs were seen at commercial establishments, shops, factories, stores, hotels, railroads, union halls, hiring halls, personnel offices, labor recruiters etc. anywhere in America, at any time. NINA signs and newspaper ads for apartments to let did exist in England and Northern Ireland, but historians have not discovered reports of any in the United States, Canada or Australia. The myth focuses on public NINA signs which deliberately marginalized and humiliated Irish male job applicants. The overwhelming evidence is that such signs never existed.
Irish Americans all have heard about them—and remember elderly relatives insisting they existed. The myth had “legs”: people still believe it, even scholars. The late Tip O’Neill remembered the signs from his youth in Boston in the 1920s; Senator Ted Kennedy reported the most recent sighting, telling the Senate during a civil rights debate that he saw them when growing up. Historically, physical NINA signs could have flourished only in intensely anti-Catholic or anti-Irish eras, especially the 1830–1870 period. Thus reports of sightings in the 1920s or 1930s suggest the myth had become so deeply rooted in Irish-American folk mythology that it was impervious to evidence…
Make of this what you will.
Personally, I think it unlikely that NO such signs existed. Given what we can see even today of nativist sentiment, and knowing the nation’s history of suspicion and even hostility toward Catholics, it seems almost certain that back in a day when the “n-word” invited no social ostracism, such alienation toward an outside group would have been expressed quite openly and without embarrassment. But I’m just extrapolating from known facts here. Jensen is right — neither I nor anyone else can produce physical evidence of such signs at worksites.
I suspect that the truth lies somewhere between the utter dismissal of the reviewer, and the deep resentment of alleged widespread practices that runs through the history of Southie politics.
I’m beginning this post at 11:19 a.m. on the Ides of March. That is, it’s 11:19 in real time, sun time. According to every time-keeping device within reach of me, including this laptop, it’s 12:19 p.m. (OK, 12:21 now, as I stopped to look something up.) But that’s because every time-keeping device in my vicinity lies. They are required to do so by law.
The law is the Energy Policy Act of 2005, which extended the lying practice of observing Daylight Saving Time for four weeks more out of the year. You know why? Because Wyoming Senator Michael Enzi and Michigan Representative Fred Upton thought it would be a fine idea to move the end of it later in the fall so that kids could go trick-or-treating in daylight. Really. (As if any self-respecting spook would venture forth before darkness has fallen.) I don’t know the excuse for moving the start from April to before the middle of March, but I’m sure it is also a doozy.
I had no idea that my church’s bishops were against it, but of course that makes perfect sense, as all right and moral people would be.
There are few measurements of time that are based in the natural world. There is the day, and the year, which both make sense as long as one is earthbound. Divorced from the cycles of the moon, months are nonsensical — just arbitrary devices we’ve agreed to pretend are real. The hours of the day make sense in only one way — if noon occurs at the height of the sun. In the days of sail, in the Royal Navy at least, noon was the occasion of some ceremony — the official beginning of the naval day. The captain would assemble his midshipmen on the quarterdeck and they would all shoot the sun with their sextants, and when there was agreement that indeed it was noon, the captain would say to the quartermaster, “Make it noon,” and a marine would strike the bell, and the foremast jacks would be piped to their dinner. Noon was real, it was grounded, and it provided a reference point for giving every other hour of the day meaning.
Now, the time of day is arbitrary, and I see little reason to respect it. Particularly when it robs me of an hour of sleep on my weekend, then causes me to rise before the sun every day for most of the year. Then — and this is the thing that bugs me more — it completely eliminates any enjoyment of the evening. I don’t know about you, but I am completely uninterested in eating my last meal of the day while the sun still shines. I’m a busy guy, and I continue being busy until the setting of the sun tugs at my attention. (This is rooted, I suppose, in all those years of newspaper work, when the climax of the long working day occurred in the evening.)
So the sun goes down, and we eat supper, and… it’s time to go to bed. No relaxing evening. No downtime. It’s all over. And I know I’ll have to get up an hour early in the morning. Which I resent.
I’m feeling this with particular force this week because I recently started working out every day (I have a new elliptical trainer at home), and this week was when I started trying extra hard to do my workout in the morning rather than at night. I get that initial boost of energy from the workout, then I eat breakfast and about mid-morning I crash, and feel tired the rest of the day. I blame this on having to do my workout before the sun is up.
Some say it’s just an adjustment. Even people who don’t hate DST say the first few days are hard. I say stuff to that. I’ll hate it until the first week in November arrives.
You know, it’s not inevitable. Since DST is a false construct of man, it can be undone by man (arrogant man, who thinks he can revoke the movement of the spheres). They don’t put up with this tyranny in Arizona:
Arizona observed DST in 1967 under the Uniform Time Act because the state legislature did not enact an exemption statute that year. In March 1968, the DST exemption statute was enacted and the state of Arizona has not observed DST since 1967. This is in large part due to energy conservation: Phoenix and Tucson are hotter than any other large U.S. metropolitan area during the summer, resulting in more power usage from air conditioning units and evaporative coolers in homes and businesses. An extra hour of sunlight while people are active would cause people to run their cooling systems longer, thereby using more energy. Local residents remember the summer of 1967, the one year DST was observed. The State Senate Majority leader at the time owned drive-in movie theaters and was nearly bankrupted by the practice. Movies could not start until 10:00 PM (2200) at the height of summer: well past normal hours for most Arizona residents. There has never been any serious consideration of reversing the exemption.
Did you read that? They’ve figured out in Arizona that it costs more money, because it makes you run air-conditioning longer. Well, duh. DST might, just might, make some sense if you live in Minnesota. Or back in 1918, before air-conditioning.
But it makes no kind of sense now, in South Carolina. Where are all these neo-Confederates who want to nullify every sensible act of the Congress when it comes to a useless act such as DST? How dare those damnyankees tell us to build our entire days upon a lie against God’s creation? Why, it offends all decent sensibilities.
People just accept things, as though they were sheep. Are there no men among us anymore?
I don’t know, but I wish somebody would do something. I would, but I’m too blamed tired…
Well, I’m back. I had some sort of crud yesterday that made me leave the office about this time yesterday: upset stomach, weakness, achiness. It lasted until late last night. When I got up this morning, I was better, but puny. So I went back to bed, and made it to the office just after noon. Much better now.
Anyway, instead of reading newspapers over breakfast at the Capital City Club the way I usually do, I read a few more pages in my current book, 1493: Uncovering the New World Columbus Created, by Charles C. Mann. Remember how I was all in a sweat to read it several months ago after reading an excerpt in The Wall Street Journal? Well, having read the prequel, 1491, I’m finally well into this one.
And I’m reading about how settlement by Europeans in many parts of the New World established “extraction societies.” At least, I think that was the term. (It’s one I’ve seen elsewhere, related to “extraction economy” and, less closely, to “plunder economy.” The book is at home, and Google Books won’t let me see the parts of the book where the term was used.) But the point was this: Settlements were established that existed only to extract some commodity from a country — say, sugar in French Guiana. Only a few Europeans dwelt there, driving African slaves in appalling conditions. Profits went to France, and the institutions and infrastructure were never developed, or given a chance to develop.
Neither a strong, growing economy with opportunities for all individuals, nor its attendant phenomenon, democracy, can thrive in such a place. (Which is related to something Tom Friedman often writes about, having to do with why the Israelis were lucky that their piece of the Mideast is the only one without oil.)
Here are some excerpts I was able to find on Google Books, to give the general thrust of what I’m talking about:
There are degrees of extraction societies, it would seem. South Carolina developed as such a society, but in modified form. There were more slaves than free whites, and only a small number even of the whites could prosper in the economy. But those few established institutions and infrastructure that allowed something better than the Guianas to develop. Still, while we started ahead of the worst extraction societies, and have made great strides since, our state continues to lag, having started so far back in comparison to other states.
It is also inhibited by a lingering attitude among whites of all economic classes, who do not want any of what wealth exists to be used on the kind of infrastructure that would enable people on the bottom rungs to better themselves. This comes up in the debate over properly funding public transit in the economic community of Columbia.
Because public transit doesn’t pay for itself directly, any more than roads do, there is a political reluctance to invest in it, which holds back people on the lower rungs who would like to better themselves — by getting to work as an orderly at a hospital, or to classes at Midlands Tech.
It’s a difficult thing to overcome. Other parts of the country, well out of the malarial zones (you have to read Mann to understand my reference here), have no trouble ponying up for such things. But here, there’s an insistent weight constantly pulling us down into the muck of our past…
To say that people of faith have no role in the public square? You bet that makes you throw up. What kind of country do we live in that says only people of non-faith can come into the public square and make their case? That makes me throw up…
— Rick Santorum
But since Bud mentioned it today on a previous post, and I read it again in The New Yorker while eating my lunch today, I thought I’d go ahead and say something that’s occurred to me several times in the last few days.
This sort of thing keeps happening. Someone running for president says something that I wouldn’t say, but I understand what he means, and what he means isn’t that awful — and the Chattersphere goes nuts over it, day after day, as though it were the most outrageous thing said in the history of the world.
It happened with Mitt Romney saying he wasn’t concerned about the poor. Obviously, he meant that there were mechanisms in place to help the poor, and that people like him didn’t need any help, but he was worried about the middle class. Not the best way to say it — and if he thinks the safety net makes it OK to be poor, he’s as wrong as he can be. But he was right to express worry about the state of the middle class, whatever he may imagine the remedies to be.
As for Santorum and the “throw-up” line. Well, to start with, I would recommend that no one running for president ever say that something someone else says or believes makes him want “to throw up.” It makes him seem… overwrought. Not at all cool. How can we trust him with that 3 a.m. phone call, with having his finger on the button, when he keeps running to the john to, in a memorable phrase I heard several years ago, “call Roark on the Big White Phone?”
That said, I get what he’s trying to say about the JFK speech to the Greater Houston Ministerial Association. I used to have a similar response to it, although I was never in danger of losing my lunch. Matter of degree, I suppose. In any case, it put me off. Because, far from being an assertion of the legitimate difference between church and state, I had taken it as an assertion that JFK would not bring his deepest values into the public sphere. I further saw it as a sop to bigotry. It offended me to think of a Catholic giving the time of day to anyone so small-minded as to suppose that a mackerel-snapper couldn’t be a good president, much less trying to tell them what they wanted to hear. Altogether a shameful instance of a candidate putting winning ahead of everything. Or so I thought.
My reaction was somewhat like that of Santorum when he addressed the subject a couple of years ago:
Let me quote from the beginning of Kennedy’s speech: ‘I believe in an America where the separation of church and state is absolute.’
The idea of strict or absolute separation of church and state is not and never was the American model. …
That’s correct. There is no such “absolute” separation, and none was intended, except perhaps by Thomas Jefferson (who was not one of the Framers of our Constitution, FYI). Kennedy’s choice of the word “absolute” was unfortunate. Santorum went on:
Kennedy continued: ‘I believe in an America … where no Catholic prelate would tell the president — should he be Catholic — how to act … where no public official either requests or accepts instructions on public policy from the pope, the National Council of Churches or any other ecclesiastical source; where no religious body seeks to impose its will directly or indirectly upon the general populace or the public acts of its officials.’
Of course no religious body should ‘impose its will’ on the public or public officials, but that was not the issue then or now. The issue is one that every diverse civilization like America has to deal with — how do we best live with our differences.
There, I can really identify with what he’s saying. The paranoia toward the Church that Kennedy was addressing is so idiotic, so offensive, that one hates even to see it dignified with an answer.
As for the overall point — was JFK’s performance offensive or not? I once thought it was, although as I say, it didn’t make me physically ill. But that’s because I had never read the speech in its entirety, or heard it. I had simply relied on characterizations of it by others, and the way they presented it made it sound as though Kennedy were kowtowing to anti-Catholic prejudice in a way that bothered me. Worse, there was this suggestion that he was pushing his faith away from him, suggesting that he would conduct himself in office as though he had no beliefs.
Implicit in all of it was the suggestion that faith had no place in the public sphere, which, like Santorum, I reject.
The speech itself is so well-rounded, so erudite, so articulate, so thoughtful about the relationship between faith and political power in this country, that I find myself won over to a candidate who could give such a speech…
I then quoted an excerpt:
Finally, I believe in an America where religious intolerance will someday end, where all men and all churches are treated as equals, where every man has the same right to attend or not to attend the church of his choice, where there is no Catholic vote, no anti-Catholic vote, no bloc voting of any kind, and where Catholics, Protestants, and Jews, at both the lay and the pastoral levels, will refrain from those attitudes of disdain and division which have so often marred their works in the past, and promote instead the American ideal of brotherhood.
That is the kind of America in which I believe. And it represents the kind of Presidency in which I believe, a great office that must be neither humbled by making it the instrument of any religious group nor tarnished by arbitrarily withholding its occupancy from the members of any one religious group. I believe in a President whose views on religion are his own private affair, neither imposed by him upon the nation, nor imposed by the nation upon him as a condition to holding that office.
I would not look with favor upon a President working to subvert the first amendment’s guarantees of religious liberty; nor would our system of checks and balances permit him to do so. And neither do I look with favor upon those who would work to subvert Article VI of the Constitution by requiring a religious test, even by indirection. For if they disagree with that safeguard, they should be openly working to repeal it.
I want a Chief Executive whose public acts are responsible to all and obligated to none, who can attend any ceremony, service, or dinner his office may appropriately require of him to fulfill; and whose fulfillment of his Presidential office is not limited or conditioned by any religious oath, ritual, or obligation.
I went on to wax nostalgic for a time when political candidates had the respect for the American people to speak to them that way. This was far, far from the simple “separation of church and state” speech that I had heard about.
Even before I read the speech, there was never a time that mention of it made me want to throw up. The worst thing I said about it was that “I don’t much like the way Kennedy did it.” But I did, like Santorum, have a negative conception of it.
The thing was, I didn’t know what I was talking about.
Speak out, you got to speak out
against the madness,
you got to speak your mind,
if you dare.
But don’t no don’t try to
get yourself elected
If you do you had better cut your hair.
— Crosby, Stills and Nash
Pursuant to our previous discussion of ponytails and their relationship to credibility (in a specific context, not in general), Kathryn Fenner shares this article:
Hair style and dress sense are the only issues where politicians present a narrower range of options for voters than policies. Their political conservatism is reflected, and possibly shaped by, their follicular safeness. If you like, you can research this yourself. But you will find, after inspecting candidates’ heads at the local, state and federal level, there are very few afros, perms, ducktails, beehives, streaks, mop-tops, hi-top fades, curtains, asymmetrical fringes, Mohicans, pony-tails, dreadlocks, cornrows, Jheri curls, devilocks, liberty spikes, rat tails, bowl-cuts, under-cuts or mullets.
Tony Blair BEFORE thinking up New Labour.
If you are one of the thousands or millions of men with one of these things on your head, voting can be a lonely and frustrating process.
Today’s politicians don’t actually have a thing against long hair per se, since a lot of them are deserters from the long-haired community. Look at old pictures of Barack Obama with an afro, Bill Clinton’s shaggy mop and Tony Blair in his Mick Jagger phase. But they visited the barber before they ran for office because politics is an annex of the banking, legal, military and other notoriously short-haired professions.
The political establishment and its associated industries simply use a candidate’s appearance as a means of weeding out people who don’t act in their interests. So we end up with phrases like “presidential hair,” which means, on a more subtextual level, that the man underneath it won’t be out of place pressing flesh at a Wall Street dinner or engaging in bonhomie with military personnel. In short, these industries want to make sure the candidate is one of their guys, and in their antiquated world of alpha masculinity, something approaching a buzz cut is essential. Considering their election campaigns — especially the fundraising part — are essentially a series of job interviews with a panel of generals, bankers and super-rich lawyers, it’s not surprising that candidates scissor themselves as soon as their name gets near a ballot paper.
On a previous post, we got into a discussion of the importance of character in political candidates. (I have come over time to believe that it is paramount, to the point of paying far less attention to policy proposals by comparison. And of course, as you know, I am positively inimical to ideologies.)
We had a good discussion, and achieved some degree of synthesis. Along the way to that, Phillip happened to mention the fact that many in politics use military service or the lack thereof as a shorthand marker for character. This is certainly true. But as we discussed the relationship of such service to character, I went on a tangent… and decided it would be worth a separate post, as follows…
I believe that our politics started becoming dysfunctional, in the ways that I decry (hyperpartisanship, adamant refusal to listen to, much less work with, the “other side”), when we ended the draft.
Before that, you didn’t find many men (most officeholders today are men, and it was more true then) who had not spent at least a portion of their youth in the military. That certainly exposed them to having to work with all sorts of people from different backgrounds (as Phillip noted here), but it did something else: it forged them into something larger than those differences.
The WWII generation in particular may have had its political differences, but those guys understood that as a country, we all share interests. They may have been (in fact, were) liberals or conservatives or Northerners or Southerners or what have you, but they understood that they were Americans first. For those who served after the war, when the military was on the cutting edge of integration, it helped give black and white a sense of shared identity as well. (Indeed the shared experience of the war, even though it was in segregated units, helped lay the groundwork for the next generation’s gains toward social justice.)
As the first wave of young men who had NOT served (starting with those who were of an age to have served, but had not, such as Bill Clinton and Newt Gingrich) arrived in the top echelons of political power in the country, they brought with them a phenomenon that we hadn’t seen among their elders… a tendency to see fellow Americans who disagreed with them politically as the OTHER, even as “the enemy,” and a practically dehumanized enemy — one that must be opposed at all costs.
That said, Bill Clinton does deserve credit for rising above that new partisanship in many cases (welfare reform, deficit reduction) in order to accomplish things. And Newt Gingrich often worked with him to accomplish such goals.
But below them, among the young guys coming up in politics — the ones hustling around statehouses and working in campaigns — there was a generation rising that really could not think of the OTHER SIDE as someone to be communicated with, much less worked with.
I really believe that if those young guys had had the experience of being thrown together outside of their communities, their cliques and their comfort zones, having their heads shaved, being put into uniforms, and being required to work together in a disciplined manner toward common goals — THEY would be different, and consequently our politics would be different.
Mind you, I’m not saying we should reinstitute the draft in order to make our politics more civil (although there may be other reasons to have one). But I am saying that I believe today’s extreme polarization is in part an unintended function of that development in our history.
Maybe you consider the end of the draft to have been a good thing. What I’m asking you to do is consider that even good things can have unintended ill effects. The opposite is true as well. Y’all know how deeply opposed I am to abortion on demand. But it seems reasonable that it would have the effect claimed in Freakonomics of reducing crime over time (by instituting a sort of pre-emptive capital punishment of unwanted children, who are more likely than the wanted to become criminals). Just as it has had the undesirable effect in parts of Asia of drastically reducing the number of females in society.
Good actions have good and bad consequences; so do bad ones. It’s a complicated world.
When I started reading the story on the front page of The State this morning about a proposal to change the name of the denomination from “Southern Baptist,” I assumed that the reason would be the convention’s roots in the pro-slavery cause.
So I was taken aback when the reason given in the AP story was concerns “that their name is too regional and impedes the evangelistic faith’s efforts to spread the Gospel worldwide.” That seemed an awfully vanilla way to put it.
I read on, expecting to find the part that dealt with the convention’s founding in 1845… and it wasn’t there at all. No mention of why Southern Baptists had split from other Baptists.
Then, when I went to find the story online to link to it in this post, I found the missing passage:
The Southern Baptist Convention formed in 1845 when it split with northern Baptists over the question of whether slave owners could be missionaries. Draper said that history has left some people to have negative associations with the name.
Well, yeah.
AP stories are generally written in the “inverted pyramid” style, to make it easy for copy editors to cut from the bottom in making a story fit on a print page. But sometimes that doesn’t work. Sometimes a copy editor needs to read the whole story and think about what parts the reader can’t do without if he or she is to understand what’s going on. This is one of those cases.
The omission is all the more startling given that someone thought to add a paragraph at the end telling how many Southern Baptists there are in South Carolina.
Of course, the blame doesn’t accrue entirely to the editor or page designer. This was a badly written AP story. The origins of the “Southern” identity should have been up top, rather than in the 14th graf. It was essential to understanding what the story was about.
Now, let me add that I don’t say any of this to condemn the convention, or the independent churches that belong to it. I do not mean to besmirch today’s Southern Baptists. My parents are Southern Baptists; I was baptized in Thomas Memorial Baptist Church.
But to fail to mention where the convention’s name came from in a story about a discussion of changing the name is like writing a history of Spanish Catholicism without mentioning the Inquisition, or the persecution of Jews and Muslims under Their Most Catholic Majesties Ferdinand and Isabella. Actually, you could even say it’s worse than that in terms of relevance, since the story was specifically about the name.
Given The State‘s usual interest in the history of slavery and Jim Crow (particularly during Black History Month), I was surprised by this omission.
Time now flies to the point that it’s achieved escape velocity.
Today, it is 50 years since my 3rd-grade class was herded into the auditorium to watch John Glenn take off in Friendship 7 to orbit the Earth.
And look how far we’ve come… today, Glenn marks the anniversary by chatting with American astronauts who are … visiting our moon colony? landing on Mars? pushing to the outer edges of the solar system?… no, merely orbiting the Earth in a space station. And not a cool, elegantly-revolving-wheel space station like in “2001: A Space Odyssey,” but something that looks like a cross between a bunch of tin cans fitted together and the kind of TV antenna we used to have affixed to our houses in 1962.
So, in other words, we haven’t come very far at all. Looking at our sad, tentative little foothold in space, we haven’t really moved at all. If anything, we’ve regressed.
Sure, it was primitive to seal an astronaut into that little nonaerodynamic capsule like “Spam in a can” and throw it into space, but today we don’t even have a functioning capsule. The United States doesn’t have a single spacecraft of any kind in service. Remember the terrible Russkies who we feared would drop atom bombs on us from Sputnik like rocks from a highway overpass? We have to hitch rides with them now.
When Glenn made his flight, anything and everything seemed possible — and because it seemed so, it was so. We were going to the moon, even though many technical barriers remained to doing so. We weren’t entirely sure when we started that it could be done, but we were going to do it. And we did. And then we pulled back to Glenn-like riding around the block. And then we even quit doing that.
No one could possibly have predicted, 50 years ago, that we would be so earthbound now. It was impossible to conceive. Back then, Robert Heinlein assumed we would have made two expeditions to Mars by the end of the century (even with a third World War delaying us), and that was totally doable. Of course we would! If we could go to the moon in a decade, surely we could make it to Mars in four!
But today, people make fun of Newt Gingrich for even talking about it. And between the left and its preference for social programs and the right with its not wanting government to do anything, it’s hard even to remember a time when we knew, for a fact, that we could do it all. And did it.
Now, we’re all about what we can’t do, or don’t want to do, which amounts to much the same thing.
It’s just not as exciting to be an earthling now as it was that day 50 years ago.
Here’s something that has frustrated me, and maybe some of y’all can advise me.
Several times, I’ve wondered what a word cloud of my whole blog — since I started it in 2009 — would show in terms of what has obsessed me over these last three tumultuous years. Or, more practically, what verbal habits I need to dial back on.
But all I can ever get when I enter my URL into one of those word-cloud generators is a cloud made of the last few posts, as you can see above. That’s pretty useless. I mean, I know what I’ve written about today. What I want to see is what sort of result I get over time. That might actually tell me something.
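For the technically inclined, here is a rough sketch of what I’m after, assuming a WordPress blog and a full export file downloaded via Tools > Export in the dashboard. The file name, the stop-word list and the four-letter cutoff are illustrative placeholders, not anything this blog actually uses; the result is a word-frequency list that could be fed into any cloud-making tool, rather than a finished picture.

```python
# Rough sketch: count word frequencies across every post in a WordPress
# export file (WXR). "blog-export.xml", the stop words and the 4-letter
# minimum are assumptions made for illustration.
import re
import xml.etree.ElementTree as ET
from collections import Counter

# WordPress stores each post's body in <content:encoded> under this namespace.
CONTENT_TAG = "{http://purl.org/rss/1.0/modules/content/}encoded"
STOP_WORDS = {"that", "with", "this", "have", "they", "about",
              "from", "were", "their", "would", "there", "which"}

def post_texts(export_path):
    """Yield the plain text of each post in the export file."""
    tree = ET.parse(export_path)
    for item in tree.iter("item"):
        body = item.findtext(CONTENT_TAG) or ""
        yield re.sub(r"<[^>]+>", " ", body)  # crude HTML tag stripping

def word_counts(export_path):
    """Tally lower-cased words of four or more letters, minus stop words."""
    counts = Counter()
    for text in post_texts(export_path):
        words = re.findall(r"[a-z']{4,}", text.lower())
        counts.update(w for w in words if w not in STOP_WORDS)
    return counts

if __name__ == "__main__":
    # Print the 50 most common words; paste these into any cloud maker.
    for word, n in word_counts("blog-export.xml").most_common(50):
        print(word, n)
```

In other words, the trick would be to count words across the whole archive yourself, instead of letting the cloud generator scrape whatever happens to be on the front page that day.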
Elie Wiesel, the Holocaust survivor who has devoted his life to combating intolerance, says Republican presidential candidate Mitt Romney “should speak to his own church and say they should stop” performing posthumous proxy baptisms on Jews.
The Nobel Peace Prize winner spoke to The Huffington Post Tuesday soon after HuffPost reported that according to a formerly-Mormon researcher, Helen Radkey, some members of the Church of Jesus Christ of Latter-day Saints had submitted Wiesel’s name to a restricted genealogy website as “ready” for posthumous proxy baptism. Radkey found that the name of Wiesel had been submitted to the database for the deceased, from which a separate process for proxy baptism could be initiated. Radkey also said that the names of Wiesel’s deceased father and maternal grandfather had been submitted to the site…
To which I can only say, Proxy baptism? Really? That doesn’t sound kosher to me, somehow.
Anyway, the Mormons are saying they didn’t really “baptize” Wiesel, even though his name pops up in their records. Nor did they intend to sorta, kinda baptize Simon Wiesenthal’s parents:
SALT LAKE CITY — Mormon church leaders apologized to the family of Holocaust survivor and Jewish rights advocate Simon Wiesenthal after his parents were posthumously baptized, a controversial ritual that Mormons believe allows deceased people a way to the afterlife but offends members of many other religions.
Wiesenthal died in 2005 after surviving the Nazi death camps and spending his life documenting Holocaust crimes and hunting down perpetrators who remained at large. Jews are particularly offended by an attempt to alter the religion of Holocaust victims, who were murdered because of their religion, and the baptism of Holocaust survivors was supposed to have been barred by a 1995 agreement…
The church immediately apologized, saying it was the actions of an individual member of the church — whom they did not name — that led to the submission of Wiesenthal’s name…
Hey, it could happen to anybody, right? Right?
I don’t want to cast any aspersions, but this seems kind of… out there. I mean, we baptize babies who don’t know what’s going on, but dead people? Dead people who are not of your persuasion?