Losing That Old-Time Religion: Political Christianity in America

Dr. Dan Druckermann, post-cyber historian of the 22nd century, happened across something enormously old. It was stuffed into the wall of another empty church undergoing remodeling into an apartment. It survived probably because it had not seen air or light for a hundred years. It was a newspaper, a product once obtained from dead trees through a complex series of tasks that involved many workers and enormous machines.

The newspaper was dated June 13, 2012. It was named after an ancient Roman debating spot called “The Forum.”

Not all old news is interesting, but this particular issue attracted Dr. Dan’s attention. It reported results from a primary election the day before. While even a regional history specialist like Dr. Dan had hardly any knowledge of obscure early 21st-century issues such as “abolishing the property tax” or “keeping the Fighting Sioux,” he knew plenty about one thing reported: “protecting religious freedoms.”

This was an issue that drew support mostly from Christians. Many conservative churches, and particularly the Catholic Church, had preached its approval from pulpits and promoted it on political signs.

Yet the measure had been defeated. Soundly. “Of course it was,” thought Dr. Dan. “Not at all surprising.”

Dr. Dan’s research specialty, early 21st century American history, took him inevitably into the expanding effort of American religious leaders to aggressively inject themselves into United States political debates. Historians traced the movement to the late 1970s. This was when a conservative Christian minister named Jerry Falwell founded a group called the Moral Majority.

The movement was frankly political; Falwell was determined to reverse the country’s supposed moral decay by recourse to politics. We must, he said, defeat the liberal agenda that’s destroying America. Before Falwell, American Christian churches had traditionally shied away from promoting political agendas.

Some 30 years later American Christians who agreed with Falwell’s efforts found new energy in the election of a liberal president, Barack Obama. Even the Roman Catholics—not natural allies of evangelicals who traditionally detested Catholicism—came on board when the president proposed to protect women’s rights to contraception through a new health care law.

“It was a weird, quixotic quest,” mused Dr. Dan. Hardly any women at that time, Catholic or not, opposed contraception. And yet it formed a springboard to Catholic bishops joining aggressive political campaigns.

The “Tea Party Movement” of this period was an attempt of the religious right to create an alignment of church and state. “A fusion,” as writers in Foreign Affairs at the time called it.

By mid-21st century Christian leaders had formed their own conservative “Christian Capitalists for Jesus” (CCJ) political party, and fielded politicians in most elections. Yes, their tax-exempt status had been revoked in response to their frankly political activities. But Christian leaders considered the loss worth the price, as political contributions might make up the difference.

“But American Christian leaders had forgotten their history,” Dr. Dan wrote in one of his research articles on the topic. “Or maybe they refused to learn it, because it was European history. Conservatives of the era detested European approaches to improving their societies. These Americans called it ‘socialism,’ and belittled anything European.”

But what European Christian leaders found out was this: if you turn Christianity into a political movement, you lose your authority as spiritual guide. Britain in 2014 dumped the Church of England after its archbishop in 2012 refused to accept gay equality. And democratic France in 1905 had violently separated itself from the politically royalist Catholic church, confiscating property of church leaders discredited by political machinations.

In France and England of the early 2000s, people who called themselves practicing Christians formed a tiny minority.

Dr. Dan thought American Christians, at the least, should have seen this coming. Because in 30 years of the Moral Majority, none of its key political issues had advanced. Abortion, same-sex marriage, smaller government, religious freedom—not one of these issues, or any others promoted by the original movement, had been successful in changing American minds by 2012. In fact, surveys showed the opposite, that a new generation of Americans was becoming more liberal.

And also more suspicious of organized Christian religion. Polls showed young Americans in 2012 perceived Christianity as being too political, and too conservative. The next generation was apparently saying, “If religion is just about conservative politics, I’m outta here.”

By the time this antiquated newspaper hit the streets in 2012, American Christianity already was doomed. Its fateful decision to become political had been followed by a sustained decline, seen in almost every sort of statistic, from number of practitioners to influence over culture. The number of self-identified Christians in America was dropping dramatically—about a percentage point a year—and confidence in Christian institutions had sunk to an all-time low.

The truth, as Dr. Dan and everybody else in 22nd-century America knew, was that when Christian leaders became political, they discredited their power to persuade Americans that they stood for Jesus.

By Dr. Dan’s time, Christianity in America still existed, just as it had existed in a European country such as France in 2012. And similar to France, those Americans who still described themselves as practicing Christians stood at about 4.5 percent.

Lobsters, long life, and a grasshopper

Dr. Dan Druckermann, post-cyber historian of the 22nd century, was thinking about grasshoppers.

More specifically, cicadas. Those insects that once created a shrill din in southern climes but now, as he looked out his office window at the flat Dakota landscape, had moved north with global warming.

But it was not the warm weather that occupied the thoughts of Dr. Dan. It was mortality.

Dr. Dan had reached his 70s, and so by standards of the 22nd century was middle-aged. Based on his last robo-checkup—more accurate and less expensive than an actual doctor—he was expected to live the average 22nd-century lifespan of 140.

“Not too much time left,” Dr. Dan mused, “and a lot to do. Learn tae kwon do! Try to fix that antique iPad from the antique shop!” But Dr. Dan was a specialist in 21st-century history. He knew that such self-talk would have sounded like science fiction a century ago.

In 2012 the average lifespan was nearing 80. People thought that to be fairly routine. But to the generation of a century before that, such a lifespan would have sounded as amazing as 140 might have sounded to the millennials. Because in 1900 the average life expectancy world-wide was about 40.

It had nearly doubled in that century. Why? No single reason. It was a matter of incremental advances in medicine and hygiene. Many were based on the ability of medicine to conquer infectious diseases that in ancient times through the 1800s carried off most people young or old. But medicine also had discovered ways to produce safer food and water, ways to engineer new blood vessels, organs and joints, and even ways to better treat the once-mysterious scourge of cancer.

“People in 2012 still thought cancer was a death sentence,” mused Dr. Dan. That was a vestige of 1970s mindset. People didn’t realize that cancer survival rates in 40 years had slowly inched up, year after year. By 2010 they had doubled. But as Dr. Dan the historian knew, people tend to discount incremental progress, and forget to reset their understanding of the world made new by slow change.

And that brought Dr. Dan back to grasshoppers. As medicine and public policy produced incremental gains in lifespan throughout the 21st century, people who once considered a century of life to be truly astounding—because only six-tenths of one percent of the population in 2012 lived that long—grew to accept 100 as a modest achievement. They expected to live much longer. But Dr. Dan and every 22nd century human knew what was discovered long ago: one’s second century was not something to anticipate with enthusiasm. It was something to worry about, possibly fear, even dread.

The ancient Greek myth of Tithonus told the story of Eos, goddess of the dawn, who fell in love with a mortal. According to the Homeric Hymn, Eos begged Zeus to make her lover Tithonus immortal. Zeus granted her request. But Eos forgot that eternal life is not the same thing as eternal youth.

Tithonus as immortal did not die. But he continued to age. More and more decrepit, soon he had no strength to even move. As related by Alfred, Lord Tennyson, Tithonus laments,

The woods decay, the woods decay and fall,
The vapours weep their burthen to the ground,
Man comes and tills the field and lies beneath,
And after many a summer dies the swan.
Me only cruel immortality
Consumes; I wither slowly in thine arms.

A shrunken, desiccated, ancient Tithonus finally becomes—a cicada.

Ponce de Leon

“We in the 22nd century have similarly extended our lives,” mused Dr. Dan. But the triumphs of medicine that made long life possible missed the point of Ponce de Leon’s search. He did not look for a fountain of long life. He looked for a fountain of perpetual youth.

One of the big problems of the early 2100s was longevity’s curse, something people had already begun to discover a century before. The longer people lived, the more they suffered the effects of old age.

The nursing home business was booming, sure, but Dr. Dan didn’t think that was necessarily a plus for society. “What we needed a century ago,” thought Dr. Dan, “was a medical science that worked not to make our lives longer, but to keep us eternally youthful until the day we drop dead. If only I were a lobster with legs!”

Know more! “Radical Life Extension Is Already Here, But We’re Doing It Wrong.” 

Finding your career in the mass media

Careers in mass media flow chart.

What were the mass media all about in 2012? Dr. Dan Druckermann, post-cyber historian of the 22nd century, could explain it. After all, he was a specialist in 21st-century mass media history. But instead of describing it all to his history students, he decided a visual representation would make it clearer.

What could mass media students of a century ago do, and why would they decide to do it? Dr. Dan offered this flow chart for the young and confused.

Lefse, vegans, and a video

Lefse and rolling pin.

Does anybody eat lefse anymore? Dr. Dan Druckermann, post-cyber historian of the 22nd century, wanted to know, because, well, historians are a curious sort. In 2112 America, the old Norwegian food seemed all but extinct.

Why? Veganism. It had swept the Midwest like a spring snowstorm in the late 21st century. Americans began to take seriously the cost of the obesity epidemic blamed on habits developed in the late 20th century. People just had to lose weight. One way to do that was to eliminate “junk food,” the highly processed and calorie-dense fare of most American restaurants at that time.

But Americans tend to believe anything worth doing is worth doing to extremes. So many people did not stop at hamburgers and donuts. They moved on to eliminate all meat, then all fish, then all dairy. And lefse featured cream and butter.

But it took Dr. Dan only a little research time to find out that, indeed, lefse still found favor among a small group of good Norwegian stock living in the Upper Midwest. That area of the United States always loved its lefse, a food hard to explain to the rest of the country. Sort of a soft flour tortilla, but made with potatoes, was Dr. Dan’s best effort.

As dairy products became less popular, a few intrepid Scandinavians moved to find substitutes for the milk and butter. And they succeeded! This lefse was almost as good as the old original. But more than that: Dr. Dan should have known. His indefatigable predecessor, Ross Collins, had actually shot a video of experts making lefse. And it included not only the traditional recipe, but the vegan recipe. “Good ol’ Ross,” thought Dr. Dan. “He was a true historian, always trying to conserve the present as a gift for those of us in the future.”

Watch the video.

Get the recipe.

Dr. Dan Druckermann, a specialist in 21st century history, writes about America and the world. His writing partner, Ross F. Collins, preceded him as a professor at North Dakota State University.

Madagascar. Like Nowhere Else.


Historians may be given to occasional bouts of longing for the distant past they study. Research, facts, a steely-eyed approach to writing history: that’s the training of the professional historian. But sometimes on a grey winter day random feelings sneak into the idle academic’s mind and can’t be denied. For Dr. Dan Druckermann, post-cyber historian of the 22nd century, the memory jog was a video his predecessor had produced featuring Madagascar. Ross Collins had visited the country over holiday break 2011-2012.

Madagascar, Dr. Dan recalled, is the world’s fourth largest island. It intrudes into the Indian Ocean about 200 miles east of Mozambique. At the beginning of the 21st century, it was the focus of a world-wide biodiversity debate.

Lemur.

Madagascar was unique as an extraordinary cradle for animals and plants found nowhere else on earth. About 80 percent of its plant species were endemic in 2012; that is, they were found no place else. Every one—100 percent—of its animals not introduced from elsewhere was found only in Madagascar. Most famous were the lemurs, monkey-like primates that thrived on the island but elsewhere had been outcompeted by monkeys.

The charming, friendly chameleons: about 60 percent of species were endemic, and biologists thought the animal probably originated on the island. And those amazing baobabs! Dr. Dan had seen the photos.

But the sad feelings of longing and loss washing over his thoughts came not from the statistics of a century ago. Even then, deforestation and unwise farming techniques had driven so many of Madagascar’s unique species to extinction’s brink. Even then all the lemurs were considered endangered, and much of the flora.

Worldwide organizations tried to help. But political instability and world indifference could not be overcome. By 2100 most of the lemurs had become extinct. Only two species of chameleons remained. The famous baobabs, enormous trees that had stood for more than 500 years, had fallen as the habitat around them was converted into cropland.

Baobabs.

Dr. Dan could know these great vanished sentinels only in photographs—just as his ancestors of a century ago knew only from pictures the passenger pigeon, Barbary lion and Yunnan box turtle.

Madagascar had become a weedy collection of second-growth forest overrun with rats, rattlesnakes and kudzu—aggressive species able to overcome and replace endemic animals and plants. But what humans now knew was even more unsettling: most medications found to treat the terrible diseases of humanity had been discovered in diverse species of plants and animals. And in Dr. Dan’s time, as diversity diminished, so did the hope that a new discovery might save human lives as well as preserve the immense diversity of life on earth. If only his ancestors had cared enough to do something.

When the Great Yellow Father Ruled the World

It was a profoundly moving moment for Dr. Dan Druckermann, post-cyber historian of the 22nd century. His burrowing into the archives had produced rare videos of a process thought lost to history.

Dr. Dan loved researching photojournalism during its golden age of the later 20th century. (See “The Rise and Decline of Straight Photography,” posted Sept. 29, 2011.)

Photography was a human activity driven by technological advances. The original chemical-based photographic processes had seen revolution after revolution, until reaching a pinnacle of achievement with Kodak’s T-Max black-and-white film in the 1990s. But that pinnacle came to symbolize the last hurrah for a process based on precious silver and finicky liquids. By the first decade of the 21st century, film-based photography was pretty much dead.

“And so was the Great Yellow Father,” mused Dr. Dan. As a historian of the early 21st century, he was one of the few who knew what photographers of yore fondly nicknamed the once world leader with the iconic yellow logo. By 2011, recalled Dr. Dan, Eastman Kodak was considering bankruptcy.

Kodak products.

That wasn’t the loss that particularly bothered Dr. Dan. He could understand the nostalgia of his predecessor, Ross Collins, who grew up working in a home darkroom stocked with familiar yellow containers. But Dr. Dan knew this much: great and famous companies come and go. Heck, who in 2111 remembered a once-powerful company called Microsoft? It was now a half century since technology advanced to the point where electrical brain waves could access the cloud to produce most of what used to require that antique MS operating system. Like Royal typewriters, or the U.S. Postal Service, Microsoft evaporated into Dr. Dan’s history books.

So be it. Capitalism at work. And capitalism was very much at work in the early 21st century. By about 2005, digital imaging had nearly replaced film. Why? For newspaper publishers, it began as a matter of money. Film and darkroom chemicals were expensive. Digital images cost nothing to produce and required only a ubiquitous computer to edit.

Digital photography ushered in a new golden age. But this time the age belonged to the amateurs. The old home darkroom required space, skill and money. But everyone could produce pretty good digital images without much special skill at all.

“And that,” Dr. Dan had written once in a scholarly article, “marked the beginning of the dark ages for historians of the visual image.”

This was because the old chemical-based processes offered proven longevity. Chemical-based black-and-white prints properly stored lasted centuries. But digital photos printed on those clumsy ink-jet machines had no such proven longevity. Some survived the century. Most did not. And the digital files became corrupted or unreadable.

Did people care? No, not really. Historical study had not yet become the centerpiece of higher education, as it was in Dr. Dan’s time. People in 2011 lived in the present. They thought little about leaving a historical trace.

So how did people actually go about that obsolete work of producing photos with chemicals? Dr. Dan knew hobbyists often set up darkrooms in their homes. But he had no idea what they might have looked like, or how photographers might have used them. Until he found a treasure among his predecessor’s pack-ratty archives. Ross had actually made videos of the process in his home darkroom! They weren’t very professional, Dr. Dan observed. But they were accessible. And so Dr. Dan sat back in his office ergonomic posterior support structure, called up a cup of Darjeeling from the food replicator, and began to watch.

Developing film: How to develop film in a home darkroom.

Printing photos: How to print pictures in a home darkroom.

 

Feature video: Rolling a film onto a developing reel.

Learn more! A short history of the home darkroom.

The children’s wars

Boy with machine gun.

Should children be involved in war? Dr. Dan Druckermann, post-cyber historian of the 22nd century, thought not, not ever. But thinking does not make it so—particularly for historians. And Dr. Dan realized that children a century ago often were warriors. Children killed and were killed, tortured and were tortured, brutalized and were beastly to their enemies. How did such a thing become acceptable in 2011? In an article found in the pack-rattish archives of Ross Collins, Dr. Dan found his long-gone predecessor had once taken a closer look at that question.

The children’s wars
It was a mere detail mostly ignored in the long frustrating story of America’s war in Afghanistan. Children, war reporter Sebastian Junger observed, found part-time paid work attacking United States military bases. They would harass troops with a few rounds, then hopefully scamper away before soldiers responded with mortars. For this the Taliban reportedly paid $5.

Perhaps this attracted little notice because such stories have become so familiar. At the beginning of the 21st century, the lives of children had been militarized throughout the world. One-quarter of the world’s military recruited children under 15. Nearly one-fifth welcomed children under 12. Some were as young as 6. In the last decade of the 20th century, war deaths included 2 million children. In Iran alone, during its war with Iraq, the Ayatollah Khomeini delighted in the “children’s sacrifice”: 100,000 boys died on the front lines.

But militarization of childhood does not happen only in brutal dictatorships, and governments run by religious zealots or hate-filled sects. Children have been drawn into a militarized life in many countries whose people strongly supported war and clearly believed war could transmit positive virtues to kids. Countries such as the United States.

United States children did not participate in America’s recent wars. But they did participate in World Wars I and II, in a big way. Propaganda aimed at children worked to persuade the youngsters that war was good for them. It could build manly virtues in boys. It could encourage a sense of duty in girls. It could offer all kinds of valued benefits: physical fitness, obedience, thrift, teamwork, loyalty, sacrifice, respect, generosity, and practical skills. In fact, authorities believed, world war could create the kind of admirable traits parents had longed to develop in their children—a miraculous transformation.

The U.S. Government took keen interest in fashioning war propaganda for children. Educators, youth groups and editors of child magazines responded. The child made virtuous by war was expected to fill free time with all sorts of war duties. Many Americans who were children during World War II remember collecting scrap metal. That was just one of a multitude of jobs, from building airplane models to helping out at farms.

Children were also expected to sustain morale by becoming propagandists themselves. Junior speakers’ bureaus canvassed the country to promote the war. Juvenile war bond sales agents fanned through neighborhoods. These small salesmen were sometimes encouraged to become spies as well—to report to the police those neighbors who refused to buy.

Toy manufacturers turned war into a game. Coaches touted sports as good preparation. America’s child life during the world wars was thoroughly militarized.

American authorities during the world wars hoped to establish a home-front army of children for reasons practical as well as virtuous. Children needed to be kept busy while their parents were fighting or working in war industry. Propaganda served to remind kids that soldiers were fighting not for the present, but for the future. The future was the children. And to whom much—and maybe all—was given, much was required.

The grooming process to bring children into war had never been undertaken at such a massive level before World War I. Yet authorities during these wars had no evil intent in encouraging children to adopt a wartime frame of mind. They worried about children fearful of war and anxious for their parents and family. They found an answer by fashioning war as part of the family, commonplace, something not to be feared. “In wartime, war is a way of life,” educator Angelo Patri in 1943 told parents. “We must adjust our thinking and our behavior to its demands.”

Did this militarization of childhood—all world war belligerent nations shared this country’s philosophy—have unintended consequences? Today’s world has become more brutal, death more acceptable. Since 1945 war deaths have been double those of the 19th century, seven times those of the 18th century. America’s soldiers in World War I were slightly older men. In World War II a voracious need for troops drew the U.S. Army into accepting 17-year-old boys. The Nazis’ Hitler Youth were thrown at Allied troops late in the war. Many perished pointlessly. In World War I it became acceptable to groom children for the home front. In World War II, it became acceptable to groom them for the actual front. At the beginning of the new millennium, soldiers fighting in three-fourths of the world’s wars included children.

Find out more: Children, War and Propaganda.


 

Just in time! The Black Friday Workout

Arnold shopping.

“History,” mused Dr. Dan Druckermann, “is like an onion.” The post-cyber historian of the 22nd century was thinking about his research as the university neared its Thanksgiving Day break. “You begin with a wrinkled, weathered peel that everybody recognizes. You peel the outer layer to find another. You keep peeling through more and more layers, looking into more and more sources. And as you reach deeper and deeper, you discover things few people know. Maybe things nobody knows.”

Dr. Dan particularly found that to be true as he shuffled through the crumbling archives of his predecessor, Ross Collins. A century before, Ross had donated many boxes of documents (what a pack rat he was) to the university archives. They sat unexplored for a century; after all, who was interested in an obscure professor from 2011? But now and then during a free moment Dr. Dan took a look. And sometimes he did find something interesting. Today, for instance, he discovered that Ross had a sideline interest: he actually was a Certified Fitness Trainer and Group Fitness Instructor!

What a weird sideline for a historian, thought Dr. Dan. Another layer of the onion. And Ross now and then produced articles and videos about fitness, such as the one below, just in time for “Black Friday,” the merchandising frenzy that dominated post-Thanksgiving U.S. culture a century ago. The shops became “black with people,” perhaps a translation of the familiar French phrase “noir de monde.” Hence the slightly sinister-sounding “Black Friday.”

The Black Friday Workout.

As “Black Friday” shopping starts earlier and earlier, many Americans will perhaps begin to worry. In your haste to snag the best deals are you neglecting your post-Thanksgiving feast workout?

No need to despair! You can turn your mall marathon from pathetic to powerful by just building on what we already do so well: shop. Functional training is the hot topic among fitness pros. But you don’t have to go to the gym to function. Just build your body as you buy! Below are 10 exercises designed to turn traditional after-Thanksgiving shopping into a great fitness workout.

1. Lateral hanger raise.

We shoppers prefer to try on multiple items without leaving the changing room. That’s just more efficient. It can also be good exercise! Grab four or more attractive ensembles by their hangers. Hold clothes out to the side, half in each hand. Arms straight, no drooping, please! Wend all the way to the back-of-the-store changing rooms. You can increase the intensity of this exercise by making your way around the entire crowded store, maybe with some added pushing, as you’ll need to create plenty of space shopping with your arms stretched out.

Major muscles worked: deltoids.

2. Shop squats.

You want to take a closer look at that pair of cute shoes on the bottom rack. Instead of bending over and picking them up, squat. Weight through your heels, please! Now the shoes are about at eye level. Hold that pose. Examine the shoes while continuing to squat. Stand up, repeat. Go for the burn!

Build on this exercise by bounding up escalators and stairs two steps at a time. You may have to push past other shoppers to do this, making the shop squat a good cardio challenge as well.

Major muscles worked: quadriceps, hamstrings, glutes, full lower body. A good squat works up to 260 muscles!

3. Top shelf calf raises. 

Can’t quite reach that pair of jeans on the high shelf? Rise to your tiptoes. Hold it! Examine the item from the shelf while staying on your toes. Lower and raise several times.

Major muscles worked: calf (gastrocnemius and soleus).

4. Plyometric get-’em-downs.

Okay, so the pair of jeans is actually higher than you can reach on tiptoe. Instead of relying on a (probably non-existent) sales clerk, snatch the pants yourself. Ease down to a half squat. Jump. Reach out and grab that garment! Don’t want it after all? Jump again to return it to the shelf.

Major muscles worked: full body for speed and strength.

Warning: Not recommended for retrieving breakable items.

5. Bottle curls.

Liquid is heavier than you think! One gallon of water weighs 8.3 pounds. Buy a couple of jumbo-sized bottles of shampoo or lotion at the beginning of your expedition. As you move from store to store, a bottle in each hand for weight, flex your arms biceps-curl style.

Major muscles worked: biceps.

6. Quad door sweeps.

You can open doors with a mere push, sure. That’s a little bit of exercise, but our upper body is used to it. Why not challenge your lower body instead? Push at the bottom to open the door with your foot. You’ll feel it in your quadriceps and, in fact, in your whole lower body, challenged by this surprisingly difficult exercise.

Major muscles worked: quads, abdominals.

7. Checkout line tree pose.

One of the worst parts of Black Friday shopping: waiting in those interminable checkout lines. Why not make them part of your workout? Try standing on one leg. If you can, bring the other foot up until it rests against the inside of your standing leg, above the knee. Stay in “tree pose” as long as you can. Switch to the other leg.

Major muscles worked: core, lower body. Balance challenge.

8. Penny-drop lunges.

Will the government ever dispense with those annoying pennies? Worth little or nothing, except as part of your shopping exercise routine! We inevitably drop these coins we get in change. Instead of just letting them go, use the opportunity to do a lunge. One leg back, one leg in front, back straight. Bend the front leg and pick up that coin. Add a couple more lunges for good measure.

Major muscles worked: quads, hamstrings, gluteals, core.

9. Weighted mall walk.

Got a few heavy purchases? Don’t ask the kids to haul them off to the car! Get nice carry bags, and lug them around with you throughout your shopping. Get benefits of the gym-time “farmer’s walk” exercise without the dumbbells. Remember: back straight!

Major muscles worked: whole body. Walking fast, it also can be a reasonable cardio exercise.

10. Food court dips.

Find a sturdy and stable plastic chair (that is, any of them, as food court chairs have to handle a heavy American shopping population). Ease yourself into the chair. Place your hands at your sides, holding the front of the seat. Slide your butt off the front of the chair. Still holding the seat, your legs out in front, dip up and down.

Major muscles worked: triceps, abs.

Dr. Dan and the cyber-gym: A video motivation

 

We’ve made fitness too hard.

On Fitness and Health: A Journey.

Dr. Dan Druckermann, post-cyber historian of the 22nd century, reflected that while many things had changed in a century, one thing had not: the human body. Despite 22nd-century cyber medicine, despite collision avoidance systems, despite anti-stupidity brain implants for teen-agers, good health still required sufficient exercise and good nutrition.

Dr. Dan, of course, was able to make better nutrition choices in 2111. McTofu’s fast-food restaurants since the 2070s had been manufacturing French-fry-tasting fries out of, well, tofu. And the popularity of the healthy Japanese diet, already growing a century ago, had flowered to become the standard choice of most Americans.

Seaweed: it’s what was for dinner. And everyone lived better for it.

But exercise? A stubborn challenge. A century of fitness trainers and repeated government pronouncements had not managed to move many Americans to move. Gyms had become more sophisticated, of course. Nearly every modern cyber-gym featured a holodeck somewhat like the one in the retro “Star Trek” television series. Fitness was more fun when you could get your core workout through swordplay against evil monsters from history, such as the infamous Col. Gaddafi.

But free time still was short in the 22nd century, and Dr. Dan was like most Americans: he had gained a few pounds since college days. So as he looked through the antique video collection created by his predecessor, Ross Collins, he made an unlikely discovery: Ross was a fitness fan. In fact, he actually taught group fitness classes at the university wellness center. “That crabby old duffer was a fitness trainer?” he marveled.

Dr. Dan knew a lot of people a century ago were overweight by quite a few pounds. While a healthy diet kept most 22nd-century Americans from actually becoming obese, Dr. Dan wondered how the health choices of 2011 affected his ancestors. He decided to consult the archives. What he found was that he’d dramatically underestimated the problem:

  • Overweight adults in America in 2010: 97.1 million. Obese: 39.8 million.
  • Percentage increase in obesity 1991-2000: 61%.

“I had always been curious,” Dr. Dan thought, “about America’s 20th-century preoccupation with pills and surgery in a quest for youth and beauty. Clearly what they actually wanted was the true fountain of youth: fitness.”

But Dr. Dan had to admit the journey to fitness could not be undertaken from a chair, then as now. After watching Ross’s video, he concluded: “I will get back to the cyber-gym. Beginning tomorrow.”


 

Photography: Ways to See

A photograph is an abstraction of reality. Learn to visualize. Learn to see.

Dr. Dan Druckermann, post-cyber historian of the 22nd century, was sifting through his predecessor’s videos. A hundred years ago it was still not possible to stream information directly from computer through a wireless neurological connection to the brain. People actually looked at a screen: television, computer or iPhone. And what about those ludicrous 3-D glasses? A fad from about 2011 that never failed to cause chuckles among the next generation. It was like those leg warmers and shoulder pads from about 1985.

In any case, images in 2011 were made using a digital process still called photography (“writing with light”). And while Americans had pretty much moved away from the stone-age chemical process, they had not moved into the realm of image as electrical impulse from the brain. And that meant that Ross Collins still taught photography.

Ways to see.

Ross fancied himself a photographer. Dr. Dan had to smirk about that. But his predecessor did try to teach through visual means, including video. And Dr. Dan had discovered this video from a photography class of long-ago 2011.