Wednesday, December 06, 2017

No, Renewables Are Not Taking Over The World Anytime Soon

We have spent the last two centuries getting off renewables because they were mostly weak, costly and unreliable. Half a century ago, in 1966, the world got 15.6% of its energy from renewables. By 2016, that share had fallen to 13.8%.

With our concern for global warming, we are ramping up the use of renewables. Mainstream reporting would have you believe that renewables are just about to power the entire world. But this is flatly wrong.

The new World Energy Outlook report from the International Energy Agency shows how much renewables will increase over the next quarter century, to 2040. In its New Policies Scenario, which rather optimistically assumes all nations live up to their Paris climate promises, it projects the renewable share to rise by less than 6 percentage points, from 13.8% to 19.4%. More realistically, the increase will be just 2 percentage points, to 15.8%.

Most renewable energy today is not solar PV and wind. Almost 10 percentage points come from the world’s oldest fuel: wood. Hydropower provides another 2.5 percentage points, and all other renewables provide just 1.6 percentage points, of which solar PV and wind account for 0.8 percentage points.

Nor will most renewable energy in 2040 come from solar PV and wind, as breathless reporting tends to make you believe. Wood will still provide 10 percentage points. Hydropower will provide another 3 percentage points, and all other renewables 6 percentage points, of which solar PV and wind will (very optimistically) account for 3.7 percentage points.

Oh, and to achieve this 3.7% of energy from solar PV and wind, you and I and the rest of the world will pay – according to the IEA – a total of $3.6 trillion in subsidies from 2017 to 2040 to support these uncompetitive energy sources. (Of course, if they were competitive, they wouldn’t need subsidies, and then they would be most welcome.)

Most people tend to think of renewables in terms of electricity, but the world uses plenty of energy that is not electricity (heat, transport, manufacturing and industrial processes).

Actually, even if the world could miraculously make the *entire* global electricity sector 100% green without emitting a single ton of greenhouse gases, we would have solved just a third of the total global greenhouse gas problem.

As Al Gore’s climate adviser, Jim Hansen, put it bluntly: “Suggesting that renewables will let us phase rapidly off fossil fuels in the United States, China, India, or the world as a whole is almost the equivalent of believing in the Easter Bunny and [the] Tooth Fairy.”

We need to get real on renewables. Only if green energy becomes much cheaper – and that requires lots of green R&D – will a renewables transition be possible.

Friday, November 17, 2017

The Fragile Generation - Bad policy and paranoid parenting are making kids too safe to succeed



One day last year, a citizen on a prairie path in the Chicago suburb of Elmhurst came upon a teen boy chopping wood. Not a body. Just some already-fallen branches. Nonetheless, the onlooker called the cops.

Officers interrogated the boy, who said he was trying to build a fort for himself and his friends. A local news site reports the police then "took the tools for safekeeping to be returned to the boy's parents."

Elsewhere in America, preschoolers at the Learning Collaborative in Charlotte, North Carolina, were thrilled to receive a set of gently used playground equipment. But the kids soon found out they would not be allowed to use it, because it was resting on grass, not wood chips. "It's a safety issue," explained a day care spokeswoman. Playing on grass is against local regulations.

And then there was the query that ran in Parents magazine a few years back: "Your child's old enough to stay home briefly, and often does. But is it okay to leave her and her playmate home while you dash to the dry cleaner?" Absolutely not, the magazine averred: "Take the kids with you, or save your errand for another time." After all, "you want to make sure that no one's feelings get too hurt if there's a squabble."

The principle here is simple: This generation of kids must be protected like none other. They can't use tools, they can't play on grass, and they certainly can't be expected to work through a spat with a friend.

And this, it could be argued, is why we have "safe spaces" on college campuses and millennials missing adult milestones today. We told a generation of kids that they can never be too safe—and they believed us.

Safety First
We've had the best of intentions, of course. But efforts to protect our children may be backfiring. When we raise kids unaccustomed to facing anything on their own, including risk, failure, and hurt feelings, our society and even our economy are threatened. Yet modern child-rearing practices and laws seem all but designed to cultivate this lack of preparedness. There's the fear that everything children see, do, eat, hear, and lick could hurt them. And there's a newer belief that has been spreading through higher education that words and ideas themselves can be traumatizing.
How did we come to think a generation of kids can't handle the basic challenges of growing up?
Beginning in the 1980s, American childhood changed. For a variety of reasons—including shifts in parenting norms, new academic expectations, increased regulation, technological advances, and especially a heightened fear of abduction (missing kids on milk cartons made it feel as if this exceedingly rare crime was rampant)—children largely lost the experience of having large swaths of unsupervised time to play, explore, and resolve conflicts on their own. This has left them more fragile, more easily offended, and more reliant on others. They have been taught to seek authority figures to solve their problems and shield them from discomfort, a condition sociologists call "moral dependency."

This poses a threat to the kind of open-mindedness and flexibility young people need to thrive at college and beyond. If they arrive at school or start careers unaccustomed to frustration and misunderstandings, we can expect them to be hypersensitive. And if they don't develop the resources to work through obstacles, molehills come to look like mountains.

This magnification of danger and hurt is prevalent on campus today. It no longer matters what a person intended to say, or how a reasonable listener would interpret a statement—what matters is whether any individual feels offended by it. If so, the speaker has committed a "microaggression," and the offended party's purely subjective reaction is a sufficient basis for emailing a dean or filing a complaint with the university's "bias response team." The net effect is that both professors and students today report that they are walking on eggshells. This interferes with the process of free inquiry and open debate—the active ingredients in a college education.

And if that's the case already, what of the kids still in grammar school, constantly reminded they might accidentally hurt each other with the wrong words? When today's 8-year-olds become the 18-year-olds starting college, will they still view free speech as worthy of protecting? As Daniel Shuchman, chairman of the free speech-promoting Foundation for Individual Rights in Education (FIRE), puts it, "How likely are they to consider the First Amendment essential if they start learning in fifth grade that you're forbidden to say—or even think—certain things, especially at school?"
Parents, teachers, and professors are talking about the growing fragility they see. It's hard to avoid the conclusion that the overprotection of children and the hypersensitivity of college students could be two sides of the same coin. By trying so hard to protect our kids, we're making them too safe to succeed.

Children on a Leash
If you're over 40, chances are good that you had scads of free time as a child—after school, on weekends, over the summer. And chances are also good that, if you were asked about it now, you'd go on and on about playing in the woods and riding your bike until the streetlights came on.
Today many kids are raised like veal. Only 13 percent of them even walk to school. Many who take the bus wait at the stop with parents beside them like bodyguards. For a while, Rhode Island was considering a bill that would prohibit children from getting off the bus in the afternoon if there wasn't an adult waiting to walk them home. This would have applied until seventh grade.

As for summer frolicking, campers don't just have to take a buddy with them wherever they go, including the bathroom. Some are now required to take two—one to stay with whoever gets hurt, the other to run and get a grown-up. Walking to the john is treated like climbing Mt. Kilimanjaro.
After school, kids no longer come home with a latchkey and roam the neighborhood. Instead, they're locked into organized, supervised activities. Youth sports are a $15 billion business that has grown by 55 percent since just 2010. Children as young as third grade are joining traveling teams—which means their parents spend a lot of time in the car, too. Or they're at tutoring. Or they're at music lessons. And if all else fails, they are in their rooms, online.

Even if parents want to shoo their kids outside—and don't come home till dinner!—it's not as easy as it once was. Often, there are no other children around to play with. Even more dishearteningly, adults who believe it's good for young people to run some errands or play kickball down the street have to think twice about letting them, because busybodies, cops, and social workers are primed to equate "unsupervised" with "neglected and in danger."

You may remember the story of the Meitivs in Maryland, investigated twice for letting their kids, 10 and 6, walk home together from the park. Or the Debra Harrell case in South Carolina, where a mom was thrown in jail for allowing her 9-year-old to play at the sprinkler playground while she worked at McDonald's. Or the 8-year-old Ohio boy who was supposed to get on the bus to Sunday school, but snuck off to the Family Dollar store instead. His dad was arrested for child endangerment.
These examples represent a new outlook: the belief that anytime kids are doing anything on their own, they are automatically under threat. But that outlook is wrong. The crime rate in America is back down to what it was in 1963, which means that most of today's parents grew up playing outside when it was more dangerous than it is today. And it hasn't gotten safer because we're hovering over our kids. All violent crime is down, including against adults.

Danger Things
And yet it doesn't feel safer. A 2010 study found "kidnapping" to be the top parental fear, despite the fact that merely being a passenger in a car is far more dangerous. Nine kids were kidnapped and murdered by strangers in 2011, while 1,140 died in vehicles that same year. While Harvard psychologist Steven Pinker writes in 2011's The Better Angels of Our Nature that life in most countries is safer today than at any time in human history, the press keeps pushing paranoia. This makes stepping back feel doubly risky: There's the fear of child kidnappers and the fear of Child Protective Services.

At times, it seems like our culture is conjuring dangers out of thin air, just to have something new to worry about. Thus, the Boulder Public Library in Colorado recently forbade anyone under 12 to enter without an adult, because "children may encounter hazards such as stairs, elevators, doors, furniture, electrical equipment, or other library patrons." Ah, yes, kids and library furniture. Always a lethal combo.

Happily, the library backed off that rule, perhaps thanks to merciless mocking in the media. But saner minds don't always prevail. At Mesa Elementary School, which also happens to be in Boulder, students got a list of the items they could not bring to the science fair. These included "chemicals," "plants in soil," and "organisms (living or dead)." And we wonder why American children score so low on international tests.

But perhaps the single best example of how fantastically fearful we've become occurred when the city of Richland, Washington, got rid of all the swings on its school playgrounds. The love of swinging is probably older than humanity itself, given our arboreal origins. But as a school district spokesman explained, "Swings have been determined to be the most unsafe of all the playground equipment on a playground."

You may think your town has avoided such overkill, but is there a merry-go-round at your local park, or a see-saw? Most likely they, too, have gone the way of lawn darts. The Consumer Product Safety Commission even warns parks of "tripping hazards, like…tree stumps and rocks," a fact unearthed (so to speak) by Philip Howard, author of 2010's Life Without Lawyers.

The problem is that kids learn by doing. Trip over a tree stump and you learn to look down. There's an old saying: Prepare your child for the path, not the path for your child. We're doing the opposite.
Ironically, there are real health dangers in not walking, or biking, or hopping over that stump. A Johns Hopkins study this summer found that the typical 19-year-old is as sedentary as a 65-year-old. The Army is worried that its recruits don't know how to skip or do somersaults.

But the cost of shielding kids from risks goes well beyond the physical, as a robust body of research has shown.

Of Trophies and Traumas
A few years ago, Boston College psychology professor emeritus Peter Gray was invited by the head of counseling services at a major university to a conference on "the decline in resilience among students." The organizer said that emergency counseling calls had doubled in the last five years. What's more, callers were seeking help coping with everyday problems, such as arguments with a roommate. Two students had dialed in because they'd found a mouse in their apartment. They also called the police, who came and set a mousetrap. And that's not to mention the sensitivity around grades. To some students, a B is the end of the world. (To some parents, too.)

Part of the rise in calls could be attributed to the fact that admitting mental health issues no longer carries the stigma it once did, an undeniably positive development. But it could also be a sign, Gray realized, that failing at basic "adulting" no longer carries the stigma it once did. And that is far more troubling.

Is this outcome the apotheosis of participation-trophy culture? It's easy to scoff at a society that teaches kids that everything they do deserves applause. But more disturbing is the possibility that those trophies taught kids the opposite lesson: that they're so easily hurt, they can't handle the sad truth that they're not the best at something.

Not letting your kid climb a tree because he might fall robs him of a classic childhood experience. But being emotionally overprotective takes away something else. "We have raised a generation of young people who have not been given the opportunity to…experience failure and realize they can survive it," Gray has said. When Lenore's son came in eighth out of nine teams in a summer camp bowling league, he got an eighth-place trophy. The moral was clear: We don't think you can cope with the negative emotions of finishing second-to-last.

Of course, it's natural to want to see kids happy. But the real secret to happiness isn't more high fives; it's developing emotional resilience. In our mania for physical safety, coupled with our recent tendency to talk about "emotional safety," we have systematically deprived our children of the thousands of challenging—and sometimes upsetting—experiences that they need in order to learn that resiliency. And in our quest to protect them, we have stolen from children the best resilience training known to man: free play.

Play's the Thing
All mammals play. It is a drive installed by Mother Nature. Hippos do backflips in the water. Dogs fetch sticks. And gazelles run around, engaging in a game that looks an awful lot like tag.

Why would they do that? They're wasting valuable calories and exposing themselves to predators. Shouldn't they just sit quietly next to their mama gazelles, exploring the world through the magic of PBS Kids?

It must be because play is even more important to their long-term survival than simply being "safe." Gray's main body of research is on the importance of free play, and he stresses that it has little in common with the "play" we give kids today. In organized activities—Little League, for example—adults run the show. It's only when the grown-ups aren't around that the kids get to take over. Play is training for adulthood.

In free play, ideally with kids of mixed ages, the children decide what to do and how to do it. That's teamwork, literally. The little kids desperately want to be like the bigger kids, so instead of bawling when they strike out during a sandlot baseball game, they work hard to hold themselves together. This is the foundation of maturity.

The older kids, meanwhile, throw the ball more softly to the younger ones. They're learning empathy. And if someone yells, "Let's play on just one leg!"—something they couldn't do at Little League, with championships (and trophies!) on the line—the kids discover what it means to come up with and try out a different way of doing things. In Silicon Valley terms, they "pivot" and adopt a "new business model." They also learn that they, not just grown-ups, can collectively remake the rules to suit their needs. That's called participatory democracy.

Best of all, without adults intervening, the kids have to do all the problem solving for themselves, from deciding what game to play to making sure the teams are roughly equal. Then, when there's an argument, they have to resolve it themselves. That's a tough skill to learn, but the drive to continue playing motivates them to work things out. To get back to having fun, they first have to come up with a solution, so they do. This teaches them that they can disagree, hash it out, and—perhaps with some grumbling—move on.

These are the very skills that are suddenly in short supply on college campuses.

"Free play is the means by which children learn to make friends, overcome their fears, solve their own problems and generally take control of their own lives," Gray writes in 2013's Free to Learn (Basic Books). "Nothing we do, no amount of toys we buy or 'quality time' or special training we give our children, can compensate for the freedom we take away. The things that children learn through their own initiatives, in free play, cannot be taught in other ways."

Unstructured, unsupervised time for play is one of the most important things we have to give back to kids if we want them to be strong and happy and resilient.

Where Have All the Paperboys Gone?
It's not just that kids aren't playing much on their own. These days, they're not doing much of anything on their own. In an article in The Atlantic, Hanna Rosin admits that "when my daughter was 10, my husband and I suddenly realized that in her whole life, she had probably not spent more than 10 minutes unsupervised by an adult."

In earlier generations, this would have seemed a bizarre and wildly overprotective upbringing. Society had certain age-related milestones that most people agreed on. Kids might be trusted to walk to school by first grade. They might get a latchkey at 8, take on a newspaper route around 10, start babysitting at 12. But over the past generation or so, those milestones disappeared—buried by fears of kidnapping, the rise of supervised activities, and the pre-eminence of homework. Parents today know all about the academic milestones their kids are supposed to reach, but not about the moments when kids used to start joining the world.

It's not necessarily their fault. Calls to eight newspapers in North Carolina found none that would take anyone under the age of 18 to deliver papers. A police chief in New Albany, Ohio, went on record saying kids shouldn't be outside on their own till age 16, "the threshold where you see children getting a little bit more freedom." A study in Britain found that while just under half of all 16- to 17-year-olds had jobs as recently as 1992, today that number is 20 percent.

The responsibility expected of kids not so long ago has become almost inconceivable. Published in 1979, the book Your 6-Year-old: Loving and Defiant includes a simple checklist for what a child entering first grade should be able to do: Can he draw and color and stay within the lines of the design being colored? Can he ride a small two-wheeled bicycle without helper wheels? Can he travel alone in the neighborhood (four to eight blocks) to a store, school, playground, or friend's home?
Hang on. Walk to the store at 6—alone?

It's tempting to blame "helicopter parents" for today's less resilient kids. But when all the first-graders are walking themselves to school, it's easy to add yours to the mix. When your child is the only one, it's harder. And that's where we are today. Norms have dramatically changed. The kind of freedom that seemed unremarkable a generation ago has become taboo, and in some cases even illegal.

A Very Hampered Halloween
In Waynesboro, Georgia, "trick or treaters" must be 12 or younger; they must be in a costume; and they must be accompanied by an adult at least 21 years of age. So if you have kids who are 15, 10, and 8, you can't send them out together. The 15-year-old is not allowed to dress up, yet she won't be considered old enough to supervise her siblings for another six years. And this is on the one night of the entire year we traditionally let children pretend to be adults.

Other schools and community centers now send letters home asking parents not to let their children wear scary costumes. Some even organize "trunk or treats"—cars parked in a circle, trunks open and filled with candy, thus saving the kids from having to walk around the neighborhood or knock on doors. (That would be tiring and terrifying.) If this is childhood, is it any wonder college kids also expect to be micromanaged on Halloween?

At Yale in 2015, after 13 college administrators signed a letter outlining appropriate vs. inappropriate costume choices for students, the childhood development expert and campus lecturer Erika Christakis suggested that it would be better to allow kids to think for themselves. After all, Halloween is supposed to be about pushing boundaries. "Is there no room anymore for a child or young person to be a little obnoxious…or, yes, offensive?" she wrote. "Have we lost faith in young people's capacity—your capacity—to ignore or reject things that trouble you?"

Apparently, yes. Angry students mobbed her husband, the professor Nicholas Christakis, surrounding him in the courtyard of the residential college where he served as master. They screamed obscenities and demanded he apologize for believing, along with his wife, that college students are in fact capable of handling offensive costumes on Halloween. "Be quiet!" a student shouted at him at one point. "As master, it is your job to create a place of comfort and home for the students!" She did not take kindly to his response that, to the contrary, he sees it as his job to create a space where students can grow intellectually.

As it turns out, Halloween is the perfect Petri dish for observing what we have done to childhood. We didn't think anything was safe enough for young people. And now we are witnessing the results.

No Fun and No Joy
When parents curtail their kids' independence, they're not just depriving the younglings of childhood fun. They are denying themselves the grown-up joy of seeing their kids do something smart, brave, or kind without parental guidance.

It's the kind of joy described by a Washington Post columnist who answered the phone one day and was shocked to find her 8-year-old son on the other end. He'd accidentally gone home when he was supposed to stay after school. Realizing she wasn't there, he decided to walk to the store a few blocks away—his first time. The mom raced over, fearing God knows what, and rushed in only to find her son happily helping the shopkeeper stock the shelves with meat. He'd had a snack and done his homework, too. It was an afternoon he'd never forget, and neither would his very proud mother.

When we don't let our kids do anything on their own, we don't get to see just how competent they can be—and isn't that, ultimately, the greatest reward of parenting? We need to make it easier for grown-ups to let go while living in a society that keeps warning them not to. And we need to make sure they won't get arrested for it.

What Is To Be Done?
By trying to keep children safe from all risks, obstacles, hurt feelings, and fears, our culture has taken away the opportunities they need to become successful adults. In treating them as fragile—emotionally, socially, and physically—society actually makes them so.

To combat this problem, we have established a new nonpartisan nonprofit, the Let Grow Foundation. Our goal is to restore resilience by overthrowing the culture of overprotection. We teamed up with Gray, the professor whose research we highlighted above, and FIRE's Shuchman, a New York investment fund manager who is now our chairman.

We are building an organization that seeks to change the social norms, policies, and laws that pressure and intimidate parents, schools, and towns into coddling their kids. We will research the effects of excessive caution, study the link between independence and success, and launch projects to give kids back some free time and free play. Most of all, the Let Grow Foundation will reject the assumption of fragility and promote intellectual, physical, and emotional resilience.

Children know that their parents had more freedom to roam than they do, and more unscheduled time to read or tinker or explore. They also realize that older generations were trusted to roll with some punches, at school and beyond. We hope kids today will start demanding that same independence and respect for themselves. It's their freedom that has been chiseled away, after all.

We want them to insist on their right to engage not just with the physical world, but also with the world of ideas. We want them to hear, read, and voice opinions that go against the grain. We want them to be insulted by the assumption that they and their classmates are so easily hurt that arguments must stop before they start. To this end, we hope to encourage their skepticism about the programs and policies that are ostensibly there to "protect" them from discomfort.

If this effort is successful, we'll soon see kids outside again. Common setbacks will be considered "resilience moments" rather than traumas. Children will read widely, express themselves freely, and work through disagreements without automatically calling on authority figures to solve their problems for them. The more adults step back, the more we believe kids will step up, growing brave in the face of risk and just plain happy in their independence.

Children today are safer and smarter than this culture gives them credit for. They deserve the freedom we had. The country's future prosperity and freedom depend on it.

Lenore Skenazy is author of the book and blog Free-Range Kids, and president of the nonprofit Let Grow Foundation.
Jonathan Haidt is the Thomas Cooley Professor of Ethical Leadership at New York University's Stern School of Business, author of The Righteous Mind (Pantheon Books), and a co-founder and board member of Let Grow.

 http://reason.com/archives/2017/10/26/the-fragile-generation

Sunday, October 22, 2017

The Rise of Identity Politics: An Audit of History Teaching at Australian Universities

The Australian newspaper:
Janet Albrechtsen, Columnist

"Bad ideas flourish in dark places. The Rise of Identity Politics: An Audit of History Teaching at Australian Universities in 2017, released on Monday by the Institute of Public Affairs, exposes the dirty little secret about history teaching in Australian universities. Rather than rigorous learning about important historical events that underpin our democracy, history teaching in this country is drenched in identity politics.

Worse, this distortion of history into political ideology is a bellwether of a more profound political disorder that threatens the future of our Australian liberal project.

In a healthy liberal democracy, we contest ideas and we know our democracy is in good shape when the best ideas triumph and the bad ones are sent packing. The Berlin Wall wasn’t dismantled by soldiers but by ideas about individual freedom that appealed more than communism. Today a different menace threatens our democratic health, one that seeks to dismantle our tool for trouncing bad ideas. We’re not just quibbling over different ideas; we’re also arguing over the value of having a healthy contest of ideas. Skewed history teaching is symptomatic of a contest that will determine the future of our democratic project.

The audit by Bella d’Abrera — director of the IPA’s Foundations of Western Civilisation Program, who has a PhD in history from Cambridge University — of 746 history subjects taught across 35 Australian universities found that more subjects (244) focus on the politics of indigenous issues, other race topics, questions of gender, environment and identity than on the story of Western civilisation. More history subjects mention race than the Enlightenment, by a factor of four to one. The Reformation is cited in only 12 of the 746 subjects, and liberalism is mentioned only seven times. More subjects reference Islam than Christianity.

Drawing on work done by British historian Niall Ferguson, who is professor of history at Harvard University, the IPA prepared a list of 20 core topics in the history of Western civilisation. They include ancient Rome, the Renaissance, the Enlightenment, the Reformation, any period of British history, the US revolution, the industrial revolution, Nazism, fascism, communism and the Cold War, and more. The IPA audit found a strong focus on ancient Greece and ancient Rome and the 20th century, “while the events of the intervening millennia are relatively neglected”. In other words, the great historical heritage that built our liberal democracy is not offered by many history departments.
Writing online for The Conversation on Thursday, Trevor Burnard, head of school and professor of history at the University of Melbourne, dismissed the audit as misguided, arguing that history departments faced two problems: limited funding and students who weren’t interested in Western civilisation. Referring to his own speciality, Burnard wrote: “The reason British history is less taught now than it once was has little to do with politics, and everything to do with student preferences. I would love for students to be fascinated in what I am interested in. Some are. But most aren’t.”

If students arrive at university with little curiosity about the historical triumph of freedom, it’s because we haven’t passed on that legacy to them. Students aren’t taught the astonishing story of Western civilisation at school or university. And the adult realm of politics is equally useless.
As IPA executive director John Roskam writes in Audit of History Teaching, we’re in trouble when a senior Liberal MP, federal Treasurer Scott Morrison, waves away the most fundamental freedom in a liberal democracy, freedom of expression, as something that “doesn’t create a single job (and) doesn’t open a business”.

When Gillian Triggs, the former boss of the Australian Human Rights Commission entrusted to defend fundamental freedoms, scolded Australia as a country where “Sadly, you can say what you like around the kitchen table at home”, we’re in double trouble.

And taxpayer-funded public broadcaster ABC, committed to all kinds of diversity except a diversity of voices, signals a preference for ideological homogeneity, not a healthy contest of ideas that emerged from the Enlightenment.

As Roskam says, this contest of ideas gave us a "legacy of liberty, of inquiry, of toleration, of religious plurality, and of social and economic freedom. Western civilisation pioneered the recognition of universal human rights." He quotes Rufus Black, the master of Ormond College at the University of Melbourne, in The Importance of a Liberal and Sciences Education: "The triumph of freedom and reason is not a law of physics, it is just an idea that has captured our minds for a tiny period of human history. There is no certainty it will continue to do so unless we choose to argue for its values and ensure that we pass it on as it was passed on to us, hard won from authoritarian rule of many forms."
The historic battles, physical and metaphysical, that shaped our modern liberal project, where we are all equal, regardless of skin colour, creed, sex or sexuality, should be the foundation stone of every history department across Australian campuses. Instead, history teaching is mired in the politics of race, sex, sexuality and identity.

This intellectual regression has its roots in postmodernism, and identity politics has become its political arm. Under the dishonest rubric of “progressive” politics, postmodernism cemented into universities the notion that history and language are corrupted by those who hold power. Ergo history needs to be told through the lens of oppression and language needs to be proscribed to protect victims of the oppressors.

Under the same sham of protecting people, universities are now cottonwool campuses. Last week at Cambridge University, students were given a trigger warning about Shakespeare's play Titus Andronicus. The Bard's work has been added to a growing list of literature — F. Scott Fitzgerald, Virginia Woolf, Ovid and Euripides — now deemed offensive by coddled students and muddled academics. Last week To Kill a Mockingbird was removed from the reading list of a school district in Mississippi because it, too, offends students. A decade ago, this anti-intellectualism would have been unthinkable.
Determined to police words and speech, proponents of identity politics label opponents as racists, sexists, misogynists, homophobes and Nazis. The aim is to drive a spoke into that critical piece of intellectual machinery known as the marketplace of ideas, because critical thinking threatens their regressive ideas.

Worse, the demand of identity politics that people be treated differently according to race, sex, sexuality and other forms of identity threatens the core premise of our liberal project: that all individuals are of equal moral worth. It's a staggering inversion of the great civil rights battles of the past century, and a reminder that when people are ill-informed about the past, they are likelier to embrace a less liberal future. The latest Lowy Institute Poll, in which only 52 per cent of people aged 18 to 29 believe that "democracy is preferable to any other kind of government", is not shocking; it's inevitable.

Halting the momentum of regressive identity politics depends on an intellectual army of iconoclasts who understand that the story of our liberal project must be learned, defended and passed on to the next generation. We need free thinkers such as Camille Paglia, the feminist who, this month, exposed how women's and gender studies departments came to be "frozen at a certain point of ideology in the 1970s". Only a radical will ask: what has gender studies contributed to the sum of human knowledge? And rebels such as Jonathan Haidt, the American social psychologist leading the push for universities to reclaim their positions as places of intellectual curiosity. And Lionel Shriver, too, the American author who exposed the fundamental flaw of identity politics during a visit to Australia: "I don't believe that membership of a larger group constitutes identity. I don't think being female provides me with an identity. I don't think it means that I have a character. That's not my idea of what character is."

In his recent book, The Strange Death of Europe, British author Douglas Murray traces the triumph of cultural masochists — “only the nations of Europe and their descendants allow themselves to be judged by their lowest moments”. This pathology of guilt has led to a “guilty, jaded and dying culture” in Europe and this virus is spreading across the West.

And let’s not mince words. When the heritage of Western civilisation is devalued in Australian schools and university history departments, debased by our political parties and human rights ­bureaucracies, and snubbed by sections of the media too, it ­becomes a numbers game. I joined the IPA years ago because the ­voices of freedom need critical mass so that the virtues of freedom can be nurtured, defended and passed on to the next generation to do the same. The way forward is to instil in each generation an understanding that our great inheritance comes from the story of Western civilisation. That’s why Roskam and his team at the IPA are ­engaged in this critical contest of ideas that must not be dismantled by the self-loathing politics of identity. Consider this a call to arms."

Wednesday, October 04, 2017

The Happiest Graph on Earth - The decline of extreme poverty

The Happiest Graph on Earth

Board of Contributors, Stratfor

No one would expect sunshine and smiles from an organization called the National Intelligence Council. One of its main tasks is to prepare a document called "Global Trends" once every four years for the new or re-elected U.S. president, laying out likely scenarios for how the world will develop over the coming decade or two. The most recent version, published in January, is every bit as intense as you might anticipate. Its three-page preface warns that we are facing "rising tensions between countries" at a time when "Global growth will slow, just as increasingly complex global challenges impend." Worse still, while "regional aggressors and nonstate actors will see openings to pursue their interests ... Nor is the picture much better on the home front for many countries." And these are just the headings in bold face: The fine print is even more alarming.

But the National Intelligence Council is not all doom and gloom. I gave a talk a few years ago in its offices in Langley, Virginia. The council's members were a lively, thoughtful and witty group, whose ability to see multiple sides to every story — perhaps the defining feature of the true strategist — was extraordinary. Fittingly, their latest Global Trends report is subtitled "Paradox of Progress." The heart of the paradox they describe is summed up by the graph reproduced here, showing the change between 1820 and 2015 in the number of humans living in absolute poverty and the number of people living above that miserable level.
The decline of extreme poverty
When I'm teaching, I like to call this "the happiest graph on Earth." Inevitably, because my background is in premodern economic history, I have some quibbles with the council's data. I strongly suspect that by 1820, only three-quarters of the world's population had incomes below the equivalent of $1.90 per day. Most of the other quarter were merely poor, rather than extremely poor, living it up on as much as $2.50 per day (for comparison, the global average is now over $25 per day) while a tiny minority made much more (Rome's first emperor, Augustus, had personal assets worth something like $4.6 trillion in today's terms). This recalculation, based more on archaeological and anthropological observations than on the kind of data the council's sources used, would slightly diminish the graph's impact — but only very slightly. The graph remains miraculous, and hardly anyone living before 1820 would have believed it possible.

Discussions of poverty in recent years have focused relentlessly on inequality, and gloomy tomes analyzing its rise within most nations during our lifetimes — Thomas Piketty's book Capital in the Twenty-First Century and my colleague Walter Scheidel's The Great Leveler — have deservedly become depressing best-sellers. Up to a point, this is as it should be: Inequality is important, and a series of psychological studies has shown that among both chimpanzees and humans, those who rise to the top of steep hierarchies tend to be happier, have better health and live longer than those who do not.

However, the council's graph shows the limits of the focus on inequality. The global economy has lifted well over a billion people out of extreme poverty since the 1980s. Never in human history have so many people experienced such rapid gains in health, lifespans and perhaps happiness too. In fact, so many people have escaped from extreme poverty that the World Bank felt forced to redefine what counted as misery in 2015, raising the bar from an older limit of $1.25 per day to the current one of $1.90. This has been the post-Cold War world's greatest triumph. Its greatest tragedy, of course, is that the better part of another billion people still languishes in extreme poverty, but everything suggests that number really will dwindle to zero over the coming decades. We know how to eradicate extreme poverty.

The Makings of a Wealthier World

Two big things went into creating the happiest graph on Earth: globalization and energy capture. Since Adam Smith worked on The Wealth of Nations in the 1770s, there has been a long-term trend toward more open, interconnected markets, lower barriers to trade and higher volumes of long-distance commerce. The trend certainly hasn't been consistent: Napoleon's Continental System set it back for nearly a decade between 1806 and 1814, the Great Depression did so even more dramatically in the 1930s, and there have been plenty more times when protectionism temporarily triumphed. However, despite constant pushback from conservatives, more and more of the world has joined an increasingly global market, raising incomes (but also inequality) enormously.
Despite the astonishing gains of the past two centuries, many experts believe that a lot of money is still lying on the globalizing table. According to the World Economic Forum, if every country reduced trade barriers by improving the performance of its border administration and transport infrastructure just halfway to global best practice, the world's GDP would grow by $2.6 trillion (4.7 percent), with the gains flowing disproportionately to the poorest places in the world. Sub-Saharan Africa, for instance, would see GDP grow by 12 percent. If these calculations are anywhere near correct, more globalization can play a huge part in ending extreme poverty.
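The dollar and percentage figures quoted above can be cross-checked with simple arithmetic; a minimal sketch (the implied world-GDP base is my back-calculation, not a number from the article):

```python
# Cross-check: a $2.6 trillion gain is quoted as 4.7 percent of world GDP.
# Dividing one by the other recovers the GDP base the figures imply.

gain_trillions = 2.6   # projected gain from halving trade frictions, USD trillions
gain_share = 0.047     # the same gain expressed as a share of world GDP

implied_world_gdp = gain_trillions / gain_share  # roughly 55 trillion USD
print(f"Implied world GDP: ${implied_world_gdp:.1f} trillion")
```

If the two figures are mutually consistent, they imply a world-GDP base of roughly $55 trillion.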

Increasing energy capture, though, historically has been even more important than globalization in buoying the poorest of society. Before the Industrial Revolution, even the most developed agrarian economies the world had ever seen — Rome in the first century or two A.D. and China in the 11th and 12th centuries — hadn't raised per capita energy consumption above about 30,000 kilocalories per day. Typically, adults each directly consumed less than 2,000 kilocalories per day as food; the rest went toward fuel, transport and the myriad other tasks an agrarian economy needs to accomplish.
Learning how to harness the energy trapped in fossil fuels annihilated these limits. By 1820, when the council's graph begins, British energy budgets had already risen above 40,000 kilocalories per person per day. They reached 92,000 in 1900, and in the 2010s Americans consume roughly 230,000 kilocalories per person per day (an unhealthy 3,770 of them as food, and most of the rest to power our airplanes, SUVs and laptops). Along the way, engineers learned to use new kinds of fossil fuel, advancing from coal to oil and natural gas, to transmit the energy in new forms (above all, electricity) and to convert energy into almost anything we wanted. Between 1500 and 1900, Western farmers roughly doubled wheat yields per hectare, largely by increasing the amount of energy invested in agriculture by adding more manure and draft animals. Between 1900 and 2000, by contrast, American farmers quadrupled output but did so by investing 80 times as much energy per hectare, now in the forms of gasoline-powered machinery, chemical fertilizers and electric pumps. Without the explosion in energy, the right-hand side of the happiest graph on Earth would still look a lot like the left-hand side.

Even more than in the case of globalization, vast gains in energy capture are still available. Huge reserves of fossil fuel remain in the ground, and new extraction techniques are making them accessible. However, most geologists and climatologists think that the massive use of coal, oil and natural gas since 1800 has already damaged the atmosphere and oceans so severely that exploiting these reserves to the full would be a very bad idea. Fortunately, though, engineers are beginning to deliver extraordinary advances in capturing clean energy and storing it in new kinds of batteries. Estimates of the likely impact vary (and the National Intelligence Council has done a commendable job in pulling many of them together), but increasing numbers of engineers and investors expect the 21st century to experience an energy bonanza even wilder than that of the 19th and 20th centuries.
If these interpretations of the council's graph are anywhere near accurate, we can realistically expect the combination of globalization and the ongoing energy revolution to eliminate extreme poverty altogether within the lifetimes of many of the people reading this column — or, at the very least, to force the World Bank to raise the bar for extreme poverty again, this time to a much higher level.

Putting America's Poor First

The most obvious way to falsify my prediction would be to slam the brakes on globalization and new energy research, which is precisely what many governments seem keen to do. U.S. President Donald Trump, often seen as a spokesman for both economic nationalism and traditional fossil fuel industries, recently told the United Nations, "We have it in our power, should we so choose, to lift millions from poverty," but he went on to argue that globalization was not the way to do this. "As President of the United States," he said, to applause, "I will always put America first, just like you, the leaders of your countries, will always, and should always, put your countries first."

If enriching the poor begins at home, though, driving down the number of people living in extreme poverty is likely to become less of a priority for most governments. Not only do fewer than 10 percent of the world's population now live on less than $1.90 per day, but they are also concentrated in a shrinking number of places. Kathryn Edin, a sociologist, and H. Luke Shaefer, a professor of social work, argued in $2.00 a Day: Living on Almost Nothing in America that many Americans do in fact live in extreme poverty but reached this conclusion by excluding all non-cash transfers and benefits. No one can deny that life is very hard indeed for the poorest Americans, and Edin and Shaefer's stories are truly harrowing. But with a federal minimum wage of $7.25 per hour — 30 times higher than the World Bank's cutoff point for extreme poverty — and the likelihood that by 2022 one-sixth of Americans will live in states or cities with a $15 per hour minimum wage, the struggle against poverty in America is not the same as that in the Central African Republic, where the International Monetary Fund reports the average income to be just $1.75 per day.
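The "30 times higher" comparison above folds in a unit conversion the column doesn't spell out; a minimal sketch, assuming an eight-hour working day to turn the hourly wage into a daily figure (that assumption is mine, not the article's):

```python
# Comparing the US federal minimum wage (per hour) with the World Bank
# extreme-poverty line (per day). The eight-hour working day is an
# assumption used here to put both numbers on the same daily basis.

min_wage_hourly = 7.25        # US federal minimum wage, USD per hour
hours_per_day = 8             # assumed working day
poverty_line_daily = 1.90     # World Bank extreme-poverty line, USD per day

daily_wage = min_wage_hourly * hours_per_day   # 58.0 USD per day
ratio = daily_wage / poverty_line_daily        # about 30.5

print(f"Daily wage: ${daily_wage:.2f}, ratio: {ratio:.1f}x")
```

On that basis a day at the federal minimum wage comes to about 30 times the extreme-poverty cutoff, which is the multiple the column cites.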

Lifting up the "left behind" in rich countries and eradicating the last stubborn pockets of global extreme poverty are very different projects, and in the 2010s electorates in Western democracies have increasingly pursued what they see as solutions to the first problem at the expense of solutions to the second. Economists predicted that by 2030 the Trans-Pacific Partnership would be adding $492 billion (0.6 percent) to global GDP, $130 billion (0.5 percent of GDP) of which would flow to the United States while proportionately more would go to poor countries such as Malaysia and Vietnam. Both the Democratic and Republican nominees for the U.S. presidency in 2016 felt compelled to reject this deal. Similarly, American and European politicians alike rushed to distance themselves from the Trans-Atlantic Trade and Investment Partnership, even though economists expected it to raise GDP on both sides of the Atlantic by between 0.3 and 0.7 percent.

American political reservations about pursuing an energy revolution have also grown. Trump's campaign platform promised to roll back regulations on carbon dioxide emissions, which would help coal and oil compete against renewables, and his administration's draft budget for 2018 included cuts of 6 percent for the Department of Energy and a whopping 31 percent for the Environmental Protection Agency. In June, the president announced the United States' intention to withdraw from the Paris climate change agreement, though more recently his administration's position seems to have softened.

If I am interpreting the history behind the happiest graph on Earth correctly, and globalization and the search for new energy sources are indeed the main forces that drive down extreme poverty, much of this sounds like unhappy news indeed. But even so, I remain optimistic that the World Bank will raise its bar yet again in the next few years. William F. Buckley famously defined a conservative as "someone who stands athwart history, yelling Stop," but standing athwart the happiest graph on Earth just seems quixotic. Those who try are likely merely to surrender control of the 21st century's two greatest forces to a rival.

For neither force is the identity of this rival much of a mystery. China's opening to global markets since the 1970s has marked the single biggest step in reducing the number of the world's extremely poor, and at this year's World Economic Forum in Davos, Switzerland, President Xi Jinping pledged that "China will keep its doors wide open" in the "hope that other countries will also keep their doors open to Chinese investors and maintain a level playing field for us." Putting his words into practice, Beijing has successfully promoted its Asian Infrastructure Investment Bank as an alternative to the Western-dominated World Bank and its Regional Comprehensive Economic Partnership in place of the crippled Trans-Pacific Partnership.

At the same time, renewable energy is already growing faster than total energy demand in China, leading the consumption of fossil fuels to fall by 1.4 percent in 2015. That year, China also accounted for 28 percent of global electric vehicle sales, 32 percent of solar panel installations and 47 percent of wind installations. In 2016 it overtook Europe in wind power, and by 2020 it will probably surpass the Continent in solar power, too.

Historians sometimes like to say that mastering coal allowed Britain to dominate the 19th century, and mastering oil allowed the United States to dominate the 20th. If mastering the wind and sun allows China to dominate the 21st, the happiest graph on Earth will keep getting happier — but at a terrible price for the West.

Wednesday, August 30, 2017

A 19th Century Lesson from Friedrich Nietzsche on the Use and Abuse of History

A Lesson on the Use and Abuse of History
Reva Goujon
VP of Global Analysis, Stratfor

Friedrich Nietzsche's observations about the misuse of history to sow chaos in the present hold as true today as they did in the 19th century when he wrote them.

A lot of history is being casually tossed around these days. We see it from energized segments of the "alt-right" throwing up Nazi salutes, calling for a "revolution" against "the Bolsheviks" and marching to chants like "Jews will not replace us." We see it from their anti-fascist adversaries on the left, branding themselves antifa, a movement that draws its roots from the Antifaschistische Aktion resistance from 1930s Germany. We see it from world leaders when Turkish President Recep Tayyip Erdogan brazenly calls his German and Dutch counterparts Nazis and fascists and when U.S. President Donald Trump ardently defends Confederate statues as symbols of "heritage not hate." We see it from jihadist groups like the Islamic State when a member of the Barcelona attack cell calling himself Abu Lais of Cordoba spookily reminds Spanish Christians to remember "the Muslim blood spilled" during the Spanish Inquisition as the group fights to reincorporate "Al Andalus" into a revived caliphate.

The Third Reich. The Bolshevik Revolution. The American Civil War. The Spanish Inquisition. This is heavy, heavy history. Yet in the words of some actors, some of the darkest days of our historical memory seem to take on a weightless form in today's angst-ridden political discourse. While jarring to observe, this kind of historical levity is to be expected whenever the world moves through a major inflection point. When more wretched periods of history lie just beyond the horizon of the current generation, they become fuzzy, impersonal anecdotes rather than visceral memories that impress upon everyday lives. And when the future looks especially bleak to that same generation, a raucous few can capture the minds of many by plunging deep into the depths of a blemished history to conjure up leaders and legends that, with a bit of polishing and dusting off, can serve as the unadulterated icons of a new world order.
Today's antifa movement models itself after Germany's anti-fascists in the 1930s.

The reasoning behind the tactic is fairly simple. Leaders and movements that try to implement a revisionist agenda in the world crave two things: credibility and power. If the masses can be seduced by a narrative with historical legs (however flimsy those legs may be), then power will presumably follow. And if credibility does not come naturally, then the amassing of state power enables the distortion of facts and silencing of critics. After all, mimicking history is a far easier strategy to pursue than trying to understand and innovate yourself out of the deeper problems of the present.

At the same time, history is just as important to internalize in tense times like these. How else will current and future generations learn from and avoid the mistakes of their past?
Tackling the Dilemma of History

It is this very dilemma of how to responsibly treat history that German philosopher Friedrich Nietzsche tried to tackle in an essay titled "On the Use and Abuse of History for Life", part of his Untimely Meditations series. When Nietzsche published this work in 1873, just two years after German unification under Otto von Bismarck, he described himself as "out of touch with the times." While most everyone around him was celebrating the music of Richard Wagner and erecting monuments to the heroes of the Franco-Prussian War in the act of knitting together a unified German state, Nietzsche was a bit of a wreck thinking about all the ways this exercise in German nationalism could go terribly awry. His musings on the role history should play in audacious political times are particularly apt in today's "historical fever."

Nietzsche argues that the fundamental purpose of history should be for life. And as we live, we must understand our past without becoming enslaved to it. To underscore the importance of living in the present, he describes with envy the "unhistorical" beast, grazing among a herd in a valley with finite horizons on all sides, living in an eternal state of forgetfulness. In contrast to the human, ever-burdened by the past, the honest beast lives blissfully, as a child does, freeing the mind to think and achieve great things. "The person who cannot set himself down on the crest of the moment, forgetting everything from the past, who is not capable of standing on a single point, like a goddess of victory, without dizziness or fear, will never know what happiness is. … And no artist would achieve his picture, no field marshal his victory, and no people its freedom, without previously having desired and striven for them in that sort of unhistorical condition."

The opposite of the unhistorical mind is the superhistorical being, one who takes history much too seriously and thus feels little point to living in the present. In between these two minds are the more even-minded and optimistic historical beings — those who use history to serve the living. They look to the past with reverence to understand the present and to frame a vision for the future.

Nietzsche then goes on to describe the three approaches to history: Monumentalism, Antiquarianism and Criticism. No single approach is the right one; each can be used in combination and at an appropriate time and context. If misused, however, "destructive weeds" will sprout and pull society down into chaos.

The Monumentalists are on a historical search for glory. This can be a dangerous exercise, for if the past is viewed as something that can be imitated and reinterpreted into something more beautiful, then it can easily fall into a trap of "free poeticizing" by people serving up mythic fictions to "weakly cultured nations" craving hope and direction. In a prophetic peering into the Third Reich, Nietzsche warns that the Monumentalists are most dangerous as superhistorical people when they are not anchored to a particular place. When saturated in a myth of cultural superiority and lacking in geopolitical boundaries, "less favored races and people" roam around, "looking for something better in foreign places," and competition and warfare ensue. While such thinking in the past has fueled foreign adventurism and destructive wars, the Monumentalists today are more concerned with a nativist agenda on their home turf. Chants by the alt-right like "our blood, our soil" and campaigns against "cultural Marxism" taking over the world stem from a stubbornly anchored place and people who believe their white, European-derived race is being diluted by "the other," who belong on the other side of the fence.
The Antiquarian, like the Monumentalist, tends to look at history through rose-colored glasses, but wants to preserve the past in a traditionalist, literalist sense rather than use history to generate life. "Man envelopes himself in a moldy smell" and steeps himself so deeply in the past that he will take little interest in what lies beyond the horizon. The modern jihadist drawn to the Islamic State banner can be seen through this lens, believing that a return to a seventh-century Islamic way of life will restore virtue in man and glory to the Islamic world.

The third approach to history that Nietzsche describes is the critical one. He writes that a person must have the power to break with the past in order to live. This, he argues, can be done by "dragging the past before a court of justice" to look at history with a critical eye. In describing a "first and second nature," he describes how a generation that looks at its past and can recognize its ills alongside its virtues sets the stage for the next generation to study its past with a healthier, more critical view of history, and thus society is better off in the long run. In this thought, Nietzsche may have been a tad idealistic. In reality, as we can see today, several decades is long enough to wear down the critical eye and blur our historical memory for the worse.
The Limitations of the Critical Approach

The critical method may appear like the most pragmatic and reasoned approach to history, but here, too, we must be careful. To make the point that we can never fully internalize the time, place and conditions of our past to perfectly understand it, Nietzsche dramatically asserts that "objectivity and justice have nothing to do with each other." One can "interpret the past only on the basis of the highest power of the present."

This lesson in critical history is an important one for the modern-day globalist fighting to extinguish resurgent flames of nationalism. While the 20th century horrors of unhinged nationalism need to be represented in starkly honest terms to prevent a repeat of the past, it is just as important to remember the economic, social and security conditions that give rise to those ideological currents in the first place. Nationalism is a natural product of the human condition and need not be vilified wholesale. While some German integrationists want to remain committed to a docile existence in the European Union, German students may tire of history books that overly fixate on the catastrophe of the Third Reich. While the message "Never forget!" matters, there comes a time when rational people looking for purpose and seeing problems with the current order want a fuller historical understanding of their past to comprehend who they are and where they come from instead of drowning themselves in guilt.

Nietzsche also warns against taking science and a purely empirical approach to life too far. The German thinker, best known for the declaration "God is dead," was of course speaking at a time in European history when rationalism and scientific thought were celebrated by philosophers as the great escape from religion's straitjacket. The nuance to Nietzsche's argument is often lost, however. Nietzsche himself was an atheist, but he warned that if we went too far down the critical path and didn't leave any room for the Antiquarian memory, then we would strip society of religion, art and other enigmatic instincts that humans need to make sense of the inexplicable. As he put it, "all living things need an atmosphere around them, a secret circle of darkness" or else European culture would face a catastrophe in suffering the perils of nihilism. If Nietzsche were alive today, he would probably argue that overzealous technologists intent on erasing borders in an ever-globalizing world were setting it up for a violent clash.
Heeding Nietzsche's Warning

Nietzsche would also see the jumbled historical references feeding into today's political discourse as a gross abuse of history warning of much bigger problems ahead. In every camp of Monumentalists and superhistorians, there are some whose egos and visions are so big that they believe that the entire passage of history is to serve their modern agenda. Nietzsche saw this type in his day as well:

"Arrogant European of the nineteenth century, you are raving! Your knowledge does not complete nature, but only kills your own. For once, measure your height as a knower against your depth as a person who can do something. Of course, you clamber on the solar ray of knowledge upward towards heaven, but you also climb downward to chaos."

In fact, chaos is exactly what many modern-day radical Monumentalists are seeking. If one assumes an apocalyptic view of the world, as many among the alt-right and some within a rapidly thinning camp of ultranationalists in the White House have openly espoused, then their energy and focus will be on tearing down the current order at whatever cost to energize the masses to engage in their so-called revolution. To achieve this aim, enemies need to be invented, the intelligentsia needs to be condemned and history needs to be revised. And whether they know it or not, the motto of the modern-day Monumentalist, in Nietzsche's words, will be the following: "Let the dead bury the living."