The SCARF Model

Business/Psychology churns out a lot of intriguing ideas which are ill-supported by actual evidence, but this one looks interesting.
In part, I began thinking about this because my colleagues at the Foundation for Economic Education (FEE) and I have recently started to import some ideas from David Rock's "SCARF" model for the media we produce, and I became curious about how this might apply to society at large.
For those unfamiliar with the idea, “SCARF” is built around the concept that people are constantly seeking to avoid pain and get more rewards for their activities and that most of these behaviors center around five specific motivations (which together make up the acronym "SCARF"):
Status
Certainty
Autonomy
Relatedness
Fairness

For Dr. Rock, who often acts as a management consultant to large corporations, the idea is that by remembering that these are the emotions that motivate employees, managers can better understand how to get top-quality work out of their people.
It makes a lot of sense. If you know that people are often motivated by improving their relative status among their peers, then you know that offering public praise bolsters their sense of pride in their work and helps them achieve their status-seeking goals. On the flip side, you know that berating them publicly smashes their sense of status among their peers, and that embarrassment may ruin their desire to continue working effectively.
The thing about all this is that, as far as we know from neuroscience and psychological studies, these motivations are real, consistent across essentially all people, and somewhat immutable.
To one degree or another, they’re just things that we’re all concerned about. We can’t just wipe these underlying motivations away by wishing they didn’t exist. As a result, we can’t ignore them. Instead, we can either bolster them—giving people the tools to improve their status, gain certainty about the world, gain more control over their lives, feel related to others and build community, and maintain a sense of fairness about their interactions with other people—or we can try to negate them and make people feel worse about themselves.
Thursday, August 16, 2018
Wednesday, August 15, 2018
Forming coalitions around scientific or factual questions is disastrous, because it pits our urge for scientific truth-seeking against the nearly insuperable human appetite to be a good coalition member
Every human—not excepting scientists—bears the whole stamp of the human condition. This includes evolved neural programs specialized for navigating the world of coalitions—teams, not groups. (Although the concept of coalitional instincts has emerged over recent decades, there is no mutually-agreed-upon term for this concept yet.) These programs enable us and induce us to form, maintain, join, support, recognize, defend, defect from, factionalize, exploit, resist, subordinate, distrust, dislike, oppose, and attack coalitions. Coalitions are sets of individuals interpreted by their members and/or by others as sharing a common abstract identity (including propensities to act as a unit, to defend joint interests, and to have shared mental states and other properties of a single human agent, such as status and prerogatives).
Why do we see the world this way? Most species do not and cannot. Even those that have linear hierarchies do not. Among elephant seals, for example, an alpha can reproductively exclude other males, even though beta and gamma are physically capable of beating alpha—if only they could cognitively coordinate. The fitness payoff is enormous for solving the thorny array of cognitive and motivational computational problems inherent in acting in groups: Two can beat one, three can beat two, and so on, propelling an arms race of numbers, effective mobilization, coordination, and cohesion.
Ancestrally, evolving the neural code to crack these problems supercharged the ability to successfully compete for access to reproductively limiting resources. Fatefully, we are descended solely from those better equipped with coalitional instincts. In this new world, power shifted from solitary alphas to the effectively coordinated down-alphabet, giving rise to a new, larger landscape of political threat and opportunity: rival groups or factions expanding at your expense or shrinking as a result of your dominance.
And so a daunting new augmented reality was neurally kindled, overlying the older individual one. It is important to realize that this reality is constructed by and runs on our coalitional programs and has no independent existence. You are a member of a coalition only if someone (such as you) interprets you as being one, and you are not if no one does. We project coalitions onto everything, even where they have no place, such as in science. We are identity-crazed.
The primary function that drove the evolution of coalitions is the amplification of the power of its members in conflicts with non-members. This function explains a number of otherwise puzzling phenomena. For example, ancestrally, if you had no coalition you were nakedly at the mercy of everyone else, so the instinct to belong to a coalition has urgency, preexisting and superseding any policy-driven basis for membership. This is why group beliefs are free to be so weird. Since coalitional programs evolved to promote the self-interest of the coalition’s membership (in dominance, status, legitimacy, resources, moral force, etc.), even coalitions whose organizing ideology originates (ostensibly) to promote human welfare often slide into the most extreme forms of oppression, in complete contradiction to the putative values of the group. Indeed, morally wrong-footing rivals is one point of ideology, and once everyone agrees on something (slavery is wrong) it ceases to be a significant moral issue because it no longer shows local rivals in a bad light. Many argue that there are more slaves in the world today than in the 19th century. Yet because one’s political rivals cannot be delegitimized by being on the wrong side of slavery, few care to be active abolitionists anymore, compared to being, say, speech police.
Moreover, to earn membership in a group you must send signals that clearly indicate that you differentially support it, compared to rival groups. Hence, optimal weighting of beliefs and communications in the individual mind will make it feel good to think and express content conforming to and flattering to one’s group’s shared beliefs and to attack and misrepresent rival groups. The more biased away from neutral truth, the better the communication functions to affirm coalitional identity, generating polarization in excess of actual policy disagreements. Communications of practical and functional truths are generally useless as differential signals, because any honest person might say them regardless of coalitional loyalty. In contrast, unusual, exaggerated beliefs—such as supernatural beliefs (e.g., god is three persons but also one person), alarmism, conspiracies, or hyperbolic comparisons—are unlikely to be said except as expressive of identity, because there is no external reality to motivate nonmembers to speak absurdities.
This raises a problem for scientists: Coalition-mindedness makes everyone, including scientists, far stupider in coalitional collectivities than as individuals. Paradoxically, a political party united by supernatural beliefs can revise its beliefs about economics or climate without revisers being bad coalition members. But people whose coalitional membership is constituted by their shared adherence to “rational,” scientific propositions have a problem when—as is generally the case—new information arises which requires belief revision. To question or disagree with coalitional precepts, even for rational reasons, makes one a bad and immoral coalition member—at risk of losing job offers, one's friends, and one's cherished group identity. This freezes belief revision.
Forming coalitions around scientific or factual questions is disastrous, because it pits our urge for scientific truth-seeking against the nearly insuperable human appetite to be a good coalition member. Once scientific propositions are moralized, the scientific process is wounded, often fatally. No one is behaving either ethically or scientifically who does not make the best case possible for rival theories with which one disagrees.
Tuesday, August 14, 2018
There's a perception that we sit way more than any other culture out there — or even any culture throughout time. For the first time in human history, we sit for these long stretches, day after day.
Anthropologist David Raichlen at the University of Arizona says that is not accurate.
"No. Not from our data," says Raichlen.
Raichlen studies modern hunter-gatherers called the Hadza, in Tanzania. They live primarily off wild foods, such as tubers, honey and barbecued porcupines. And to acquire this food, there's no doubt they are active.
They climb and chop trees to get honey. They dig for tubers and pound nuts.
"They do a lot of upper body work," Raichlen says. "And they spend quite a bit of time walking — at a pretty high rate of speed."
On average, Hadza adults spend about 75 minutes each day exercising, Raichlen says. That amount is way more than most Americans exercise. Many of us can't muster a measly 2.5 hours each week, as recommended by the Centers for Disease Control and Prevention. So there's no doubt the Hadza are in better cardiovascular health than most Americans.
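That gap is easy to make concrete. A minimal sketch, using only the two figures quoted above (75 minutes per day for the Hadza, 2.5 hours per week recommended by the CDC):

```python
# Weekly exercise comparison; the two constants restate figures from the
# text above, not independently verified data.
HADZA_MIN_PER_DAY = 75        # average daily activity for Hadza adults
CDC_HOURS_PER_WEEK = 2.5      # CDC weekly recommendation cited in the text

hadza_hours_per_week = HADZA_MIN_PER_DAY * 7 / 60   # 525 minutes -> hours
ratio = hadza_hours_per_week / CDC_HOURS_PER_WEEK

print(f"Hadza: {hadza_hours_per_week:.2f} h/week ({ratio:.1f}x the CDC minimum)")
# -> Hadza: 8.75 h/week (3.5x the CDC minimum)
```

In other words, the Hadza average roughly three and a half times the CDC's recommended weekly minimum.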
But do the Hadza actually sit less than we do?
A few years ago, Raichlen and colleagues decided to find out. They strapped heart-rate monitors onto nearly 50 Hadza adults for eight weeks and measured how often each day they were just, well ... sitting around. The results shocked Raichlen.
"The Hadza are in resting postures about as much as we Americans are," he says. "It's about 10 hours a day."
By comparison, Americans sit about nine to 13 hours each day, on average, according to a study reported in 2016.
I therefore feel no hesitation in rejecting the validity and utility of the entire body of anthropological theory
When I characterise the concepts of culture and social system as 'myths', I do not imply that they bear no relation to reality, for they are obviously derived from observations in the real world. I mean merely that, as reified abstractions, they cannot legitimately be used to explain human behaviour. Culture and social aggregates are explainable as derivates of behaviour, but not vice versa. All systems of theory which are based on the alleged or inferred characteristics of aggregates are consequently inherently fallacious. They are, in short, mythology, not science, and are to be rejected in their entirety - not revised or modified.
This conclusion is supported by a variety of evidence. In any established science, for example, there is substantial agreement among its leading practitioners on the essential core of its body of theory, whereas in anthropology there is virtually no such consensus. In analysing the recent volume by Fortes I discovered - to my astonishment in view of my great respect for his work - that it contained scarcely a single theoretical assumption, postulate, generalisation, or conclusion which I could accept as valid without serious qualification. I had had a similar reaction once before - in reading the theoretical work of Leslie White. And I have since experienced it a third time when, stimulated by Fortes, I reviewed the theoretical writings of Alfred Kroeber. Having known all three men fairly intimately, I am aware that none of them has found my own views any more acceptable than I have found theirs, and that each of them has felt an equally profound scepticism regarding the views of the others. It is inconceivable that four men of comparable standing in any established field of science, such as astronomy, nuclear physics, or genetics, could differ so radically from one another on basic theoretical issues. One can only conclude from this that what Fortes, White, Kroeber, and I have been producing is not scientific theory in any real sense but something much closer to the unverifiable dogmas of differing religious sects.
I therefore feel no hesitation in rejecting the validity and utility of the entire body of anthropological theory, including the bulk of my own work, which derives from the reified concepts of either culture or social system, and in consigning it to the realm of mythology rather than science. Some of the fragments of existing theory which escape such stigmatisation will engage our attention toward the end of this paper.
In conclusion, I would like to relate an anecdote which is famous in the unwritten history of the Department of Anthropology at Yale. Almost exactly forty years ago, when the late Edward Sapir was conducting a seminar on primitive religion, he had a student who came from the society later studied by John Beattie, the Banyoro of Uganda. This student, in reading a rather pedestrian paper on the religion of his own people, happened to mention that in his country the shrines of the war god were tended exclusively by priestesses. At this, Sapir pricked up his ears and interrupted to comment that, since war is the most masculine of all occupations, it seemed remarkable that the cult of the war god should be conducted by women only.
'Why should this be?' he inquired, and proceeded, on the spur of the moment, to propound a possible interpretation, highly complex and liberally seasoned with Freudian and other symbolism. The students sat upright in fascinated attention. As he was concluding, an alternative explanation occurred to him - equally brilliant, equally complex, and equally symbolic - and he developed this in like fashion, while the students perched on the edge of their chairs, utterly entranced by this double demonstration of his virtuosity. When he came to the end, he turned to the African, intending to inquire the extent to which either hypothesis accorded with Banyoro culture, but, flushed with enthusiasm at his own performance, asked him instead which interpretation was the correct one.
'Actually,' replied the student, 'neither is correct. The explanation is really quite simple. You see, when war occurs in my country, all the men go out to fight, and no one is left except women to tend the cult of the war god.'
This anecdote might well stand as an allegory of both the fascination and the falsity of all forms of anthropology's mythology.
Picture in your mind a political debate between acquaintances, perhaps on social media, or in meatspace. You make your point, your opponent makes his. Demands for evidence are made. Your opponent cites a media piece. Perhaps an article on CNN, or a reference to a study in The Atlantic. The onus is now on you to prove that the item is incorrect. Yet you cannot do so, for the citations within it are true, even though the spin has rendered it into something it really is not. How do you articulate that?

I encounter, not many people, but more people than in the past, with whom basic discourse has become close to untenable. They have adopted unworldly beliefs which, if you are to respect them (in the sense of avoiding refuting them), preclude constructive engagement. I don't like the situation, but it is unclear to me what can be done.
Consider this CNN headline: Children found in New Mexico compound were training for school shootings, prosecutors say.
What is wrong with it? The headline is true. The children were indeed in a compound in New Mexico, and were indeed training to commit school shootings. Ah, but it omits that this was linked to Islamic terror. The article itself sort of admits this in its last section.
Hogrefe said FBI analysts told him the suspects appeared to be “extremist of the Muslim belief.”
Compare this to how the same event is reported on Fox News: Investigators raided New Mexico compound on tip from terror-tied New York City imam, cleric claims.
Note the difference in spin. One emphasizes ‘school shootings’ and the other ‘terror-tied’ and ‘imam’. This is how the tone of a thing is subtly changed, depending on the journalist’s preferred viewpoint. Of course, aside from Fox News, most media outlets are Left-leaning. So the spin is much more weighted toward the Left, and furthermore Fox News is usually casually dismissed by any Leftist. It is, in essence, banned from the court of polite opinion. And yet, both articles are fundamentally true.
I’ve been on a Tolkien kick of late, for which I blame my friend Francis. And so I caught the connection quite readily when I read the above headlines:
The Stones of Seeing do not lie, and not even the Lord of Barad-dûr can make them do so. He can, maybe, by his will choose what things shall be seen by weaker minds, or cause them to mistake the meaning of what they see. Nonetheless it cannot be doubted that when Denethor saw great forces arrayed against him in Mordor, and more still being gathered, he saw that which truly is.

Denethor was shown nothing but truth by the palantir. It could not be made to lie to him. But Sauron could spin what was shown, and cause Denethor to mistake the meaning of the things he saw. This tactic is readily employed by the media, and in the past it has been extremely effective. The journalist, if confronted on his spin, could escape with the excuse “but everything I have said is true!” We know there is a wrong here, we can sense it, but to prove it unequivocally is difficult, and essentially impossible if the instances are few enough.
I live in an area that is deep blue and have many deep blue friends. Critical Theory Social Justice versus Classical Liberal is not the issue. We share common aspirations and experiences even if there is variance in how we interpret things or the significance we attach. There is sufficient mutual respect that we can politely navigate around points of discord to find plenty of common ground and in the process, and over time, perhaps move each other's dial just a bit. No dramatic conversions of fundamental belief perhaps, but an increasing awareness of nuance.
But every now and then I encounter someone whose premises are beyond reach. They believe with deep and abiding conviction that President Trump is a fifth column Russian colluder. They believe that all whites must inherently be prejudiced against blacks and that blacks can do no wrong against whites because of history. They believe that there are whole classes of scientific controversy which can have only one interpretation.
And I don't mean that they are taking a position for rhetorical effect and are doggedly adhering to it. I mean they believe. As in Eric Hoffer's True Believers.
The point isn't that they are a True Believer. The point is that there is no bridge by which to reach them. If you do not accept their predicates wholesale, there is no means by which to share an interpretation of an event. The usual bromides of walk in their shoes, see the world through their eyes, etc. have no application. You can see their view but they will not see anything but their own. They cannot understand alternate interpretations.
The efficient response is not to engage with people whose predicates are unassailable. But that is a bleak position and encourages one to ignore that which might possibly, no matter how improbably, be true. But if there is no mutuality, there is little benefit and much effort.
I guess I need to go back and reread Hoffer and see if he has any advice.
Monday, August 13, 2018
One of the challenges in forecasting around climate change is all the moving parts. These are complex, dynamic, interrelated, loosely coupled systems. As an example, rising CO2 levels in the atmosphere likely have some negative consequences, possibly on heat retention (though we cannot be certain because our knowledge is still fragmentary). On the other hand, while rising CO2 might cause some problems, it might also be beneficial by accelerating plant growth (CO2 being plant food).

This article does not answer any of these questions. It illustrates the yeoman's work being done in labs across the world, trying to understand the complex interactions of many dynamic systems, in any of which our theoretical understanding is partial and our empirical body of knowledge is limited. There are surprises all around.
Earlier this year, a study published in the prestigious journal Science shook up the biology world by turning an accepted paradigm of plant growth on its head.

The rest of the article is about respectful but divergent interpretations by different teams.
Now a pair of researchers is calling the findings into question, but the authors of the original work are standing their ground.
In comment pieces in the current issue of Science, the two sides face off over issues of soil conditions, plant biology and experimental design.
The original work, led by Peter Reich at the University of Minnesota, US, sought to understand how plant growth is affected by long-term high carbon dioxide levels. For 20 years, the team compared two groups of plants that employ slightly different methods of photosynthesis: the C3 pathway and the C4 pathway. (Read our report of the study here.)
Prevailing plant biology dogma states that C3 plants are more sensitive to atmospheric carbon dioxide levels than are C4 plants. Therefore, plants using the C3 pathway should produce more biomass as levels rise.
Much to the researchers’ surprise, the C3 plants grew well initially, but then lost their edge after 12 years. Instead, the C4 plants showed accelerated growth in the last eight years of the study, with increases in biomass outstripping the control plants by as much as 24%.
To explain the switch in growth rates, Reich and colleagues hypothesised that long-term high carbon dioxide levels triggered changes in soil microbes and nutrient cycling — changes that favoured C4 plant growth but hampered that of C3 plants.
The findings suggested that it may be difficult to predict with certainty just how much atmospheric carbon can be captured by plants in the future. As the effects of anthropogenic climate change continue to unspool, Reich cautioned, “We shouldn’t be as confident [that] we’re right about the ability of … ecosystems to save our hides.”
Reminds me of that line by Friedrich August von Hayek in The Fatal Conceit
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.
Historians tend to reason backwards from their ideological priors unconstrained by reason, logic or standards of causal inference
Bernstein's post is perhaps in some ways less about authoritarian social justice historians and their shenanigans and more about differences in argumentative traditions. A lot of historians are, at best, trained in persuasive rhetoric: they try to persuade people that their arguments are right. Lawyers, by contrast, are trained in formal logic and in proving that an opposing argument is wrong.
No, this isn't another post about that horrible Nancy MacLean book, but it is related. As an early, vociferous critic of the book, I wound up in email, blog, and Twitter debates with some of her defenders among fellow historians, especially those who purport to specialize in intellectual history. And what I learned from this was troubling. While I'm sure there are many excellent historians around, I found that the historians I interacted with not only tended to reason backwards from their political priors, but that their standards of how one makes an appropriate inference from existing evidence are such that they would be laughed out of any decent philosophy or law school academic workshop.

The five examples Bernstein provides are compelling.
The historians I've discussed and debated with are not fringe-y. One of my interlocutors is a chaired professor at a major state university. Others are junior professors or grad students or post-docs or think tank fellows with degrees from some of our most reputable history programs.
Again, I'm not saying that these folks represent all historians, all American historians, or even all intellectual historians who specialize in the U.S. Nevertheless, the fact that all of these arguments (and more) have been made with a straight face by well-credentialed historians suggests something is amiss in the profession.
Heraclitus
by William Johnson Cory
They told me, Heraclitus, they told me you were dead,
They brought me bitter news to hear and bitter tears to shed.
I wept, as I remembered, how often you and I
Had tired the sun with talking and sent him down the sky.
And now that thou art lying, my dear old Carian guest,
A handful of grey ashes, long long ago at rest,
Still are thy pleasant voices, thy nightingales, awake;
For Death, he taketh all away, but them he cannot take.
I can't remember whether it was NPR or the NYT but one of them had a piece in which they were repeatedly making a big play of Putin and Trump "meeting behind closed doors." They were pathetically trying to denormalize that which is normal in all such global leadership events. You have the public presentation - the ribbon cutting, the photo ops, the smiles and handshakes - and then you have the meetings where the real work is done. Behind closed doors. Sometimes it is beneficial. Sometimes it is nefarious. Sometimes it is well-intended but disastrous. But it is always behind closed doors.
Trying to make "closed doors" into a substantive issue was, in my view, both revealing of the mindset of the journalists and strategically disastrous for their professional brand. Not everyone notices what journalists are doing all the time, but more are doing so. Now is the time for journalists to be more discreet and cunning, not ham-fisted and clumsy.
Roger Kimball touches on this in his Could The New York Times’s abortion coverage be any more one-eyed? Forget the abortion issue, it is the observations on communication that are illustrative.
Regardless of one's personal stance on abortion, or any other moral issue, it is easy to see that the Times signals its editorial positions in other ways than simply in its editorials.

A writer friend recently told my son about an exercise he was given in a high school composition class. The idea was to show how word choice affects the mood and emotional weather of your prose. He recalled an example from TIME magazine. (For younger readers: TIME used to be — long, long ago — an important news outlet; that TIME is not to be confused with the virtue-signaling enterprise of the same name that has taken its place.) Consider the different rhetorical implications of these two sentences:
Truman slunk from the back room to huddle with his cronies.

vs.

Eisenhower strode from the chamber to consult with his advisers.

Would you rather “slink” or “stride”? Do you frequent “back rooms” or occupy “chambers”? Is it, outside the precincts of American football, more dignified to “huddle” or “consult”? And those with whom you do parley: are they “cronies” or “advisers”?
You know the answers to all of these questions and you can see how those different choices of words reveal very different attitudes about the subjects under discussion.
According to my friend, the point of the exercise was admonitory. It was to caution novice writers to be careful lest their prejudices infect their word choice and thereby spoil their reliability as accurate, dispassionate reporters.
I suspect many writers for The New York Times absorbed something like the amusing example my friend retailed without, however, taking on board the moral about preserving one’s reputation for accuracy and (so far as is humanly possible) bias-free reporting.
Consider the opening sentence of the Times’s report on Ireland’s vote to legalise abortion last May: “Ireland voted decisively to repeal one of the world’s more restrictive abortion bans, sweeping aside generations of conservative patriarchy and dealing the latest in a series of stinging rebukes to the Roman Catholic Church.”
Striding through chambers — decisively striding through chambers — to consult with one’s advisers, right?
Compare that with the Times’s reporting on Argentina’s defeat last week of a bill to legalise abortion. “Argentina’s Senate on Thursday narrowly rejected a bill to legalize abortion, dealing a stinging defeat to a grass-roots movement that pushed reproductive rights to the top of the country’s legislative agenda and galvanized activist groups throughout Latin America.”
To paraphrase Jim Treacher, modern journalism is all about deciding which facts the public shouldn't know because they reflect badly on the narrative of critical theory social justice. Or as David Burge says, "Journalism is about covering important stories. With a pillow, until they stop moving."
Earlier in the week, I posted twice about the NYT lying about their own past headlines and reporting (Who deserves credit is one thing. Lying about the recent past is another and It's like claiming you're eighteen while handing over a driver's license which proves you are fourteen).
They capped the week of lying about their past reporting with something that approaches lying about the future.
It has apparently become important for the establishment media to push the claim that America is an inherently racist nation, brimming with knuckle-dragging, mouth-breathing deplorables. They really wanted the first anniversary of Charlottesville to be a thing but it came up empty.
Their fallback was some putative White Nationalist march in Washington, D.C., planned for the weekend. The NYT spent the week building up expectations that this event would manifest that incipient putsch of which they are so terrified. Forget that the only overt racism is coming from their ideology of social justice critical theory. Forget that these events are few and far between. Forget that every time a handful of attention-seeking knuckleheads gets together, the countermarches are orders of magnitude greater in number. Forget that Implicit Association Tests show that there is no systematic racial prejudice. In the movie the NYT is watching, we are always trembling on the brink of establishing racial apartheid. Again ignoring that the only people who sanction apartheid are the social justice critical theory people.
Sometime around Wednesday or Thursday, the NYT had to check its enthusiasm just a mite when they reported that the march application to the police anticipated only 400 participating white nationalists. To put that in perspective, even the SPLC, which wildly exaggerates numbers for commercial purposes, estimates only some 5,000-10,000 KKK members/Aryan Brotherhood types out there. The number on the police radar screen is dramatically smaller than that. In comparison, violent gangs such as MS-13, the Bloods, the Crips, and the drug cartels each command some 10,000-30,000 members, and the FBI is tracking some 1.4 million violent gang members (though I am guessing that the real hardcore number is more like 250,000-500,000).
According to the FBI, about 20% of all homicides are committed by gangs. In gang centers such as Los Angeles and Chicago, it is as high as 50%.
In the US in 2016, 17,250 people were murdered. Of those 17,250 murders, 9 fell into the category of hate crimes. Not 9%. Nine of the murders were racially motivated: six were murders of blacks by whites and three were murders of whites by blacks.
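A minimal sketch makes the scale of these figures concrete. The constants simply restate the numbers cited above (the text's figures, not independently verified):

```python
# Side-by-side arithmetic on the homicide figures cited in the text.
TOTAL_MURDERS_2016 = 17250    # total US murders in 2016, per the text
HATE_CRIME_MURDERS = 9        # racially motivated murders, per the text
GANG_SHARE = 0.20             # FBI figure cited: ~20% of homicides are gang-related

gang_murders_est = TOTAL_MURDERS_2016 * GANG_SHARE
hate_share_pct = HATE_CRIME_MURDERS / TOTAL_MURDERS_2016 * 100

print(f"Estimated gang-related murders: {gang_murders_est:.0f}")
print(f"Racially motivated murders as a share of all murders: {hate_share_pct:.3f}%")
```

By these figures, gang-related murders number in the thousands, while racially motivated murders are roughly five hundredths of one percent of the total.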
I am not ignoring all the tragedy these antiseptic numbers represent; I am simply putting the NYT critical theory social justice obsession into perspective.
The NYT wants reality to be different than it actually is. The US is a more peaceful nation and a more racially integrated and adjusted nation than the NYT can stand. Racial violence is rare, it comes from all sources, and white nationalists are a vestigial concern.
But the establishment media does a good job of misdirecting. Look at the results from Google Trends.
Based on Google Searches, you'd think that the KKK was alive and well despite its near complete collapse. The mainstream media are accomplishing their goal of telling a different story than the one revealed by the numbers.
So how did the white supremacist march in Washington turn out? The one the NYT was profiling and showcasing?
Well, it is hard to keep track because the NYT keeps rewriting its reporting. First we had the anticipation of a major white supremacist march. Then they had to reset expectations for perhaps a 400-person march. Then they were reporting some dozens of march participants. Then it became a couple of dozen. As of 9:05am Monday, August 13th, we are down to:
The right-wing agitator Jason Kessler and perhaps 20 fellow members of the far right

"Perhaps twenty"? Talk about weasel words. When numbers are this low, the journalist should be able to count on their fingers and toes. Give it another 24 hours and the trend line suggests that there will not have been a march at all. Or possibly we will be in existential territory with negative numbers of marchers.
The NYT tried to build this up to be the highlight of the critical theory/social justice social calendar - the event that would show the till-now masked face of racist America. With the near complete absence of racists at the heralded racist march, the NYT has done an about-face.
After weeks of hype, white supremacists managed to muster just a couple of dozen supporters on Sunday in the nation’s capital for the first anniversary of their deadly rally in Charlottesville, Va., finding themselves greatly outnumbered by counterprotesters, police officers and representatives of the news media.

Let's remember just who was doing the hyping.
The numbers are always small, and the counterprotesters always outnumber them. What kind of wool is the NYT trying to pull over its readers' eyes?
Grasping for straws, the NYT notes:
But even with the low turnout, almost no one walked away with the sense that the nation’s divisions were any closer to healing.

The nation's divisions are being manufactured by the social justice critical theorists of the mainstream media. Out here in the real world, people get along, even respect one another. People are happy about a booming economy. People work hard to make the lives of their nearest and dearest better. They worry about their mortgages and put in some overtime. They volunteer in their community to solve real and persistent problems. They interact with their neighbors and colleagues as individuals, not as identity groups.
But none of that has anything to do with the hothouse hallucinations, delirium tremens, and rank hysteria of the mainstream media.
Sunday, August 12, 2018
The founding story is that of the death match of Arrhichion of Phigalia. Arrhichion was a champion wrestler in the ancient Olympic Games. He died while successfully defending his wrestling championship in the 54th Olympiad (564 BC).
For when he was contending for the wild olive with the last remaining competitor, whoever he was, the latter got a grip first, and held Arrhachion, hugging him with his legs, and at the same time he squeezed his neck with his hands. Arrhachion dislocated his opponent's toe, but expired owing to suffocation; but he who suffocated Arrhachion was forced to give in at the same time because of the pain in his toe. The Eleans crowned and proclaimed victor the corpse of Arrhachion.

Jump forward twenty centuries and there is a similar pyrrhic victory in the scene in William Shakespeare's Romeo and Juliet, wherein Mercutio describes his mortal wound as "Ay, ay, a scratch, a scratch."
Another four centuries forward and we have John Cleese as the Black Knight in Monty Python and the Holy Grail.
From Wikipedia describing the Black Knight role in Monty Python and the Holy Grail.
According to the DVD audio commentary by Cleese, Palin, and Idle, the sequence originated in a story told to Cleese when he was attending an English class during his school days. Two Roman wrestlers were engaged in a particularly intense match and had been fighting for so long that the two combatants were doing little more than leaning into one another. It was only when one wrestler finally tapped out and pulled away from his opponent that he and the crowd realised the other man was, in fact, dead and had effectively won the match posthumously. The moral of the tale, according to Cleese's teacher, was "if you never give up, you can't possibly lose" – a statement that, Cleese reflected, always struck him as being "philosophically unsound". The story would have been a deformed (or misremembered) description of the death of Arrichion of Phigalia.

The Black Knight's line "'Tis but a scratch" has become a sarcastic description of any dramatically failed endeavor. I wager that few know that they are quoting Shakespeare or paying unconscious tribute to Arrhichion, but the cultural line zig-zags all the way back.
People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.

As if in illustration, this passage from Leviathan: The History of Whaling in America by Eric Jay Dolin. Page 113.
By 1760 the candle manufacturers realized that they had a serious problem. Despite the expansion of the colonial whaling fleet and the rise in the number of sperm whales captured, the demand for spermaceti candles was so strong that it far outstripped the supply of head matter. Just a handful of the largest candle manufacturers, working at full tilt, had the capacity quickly to process all the head matter the colonial whale fishery could provide. As historian James Hedges pointed out, the manufacturers’ ability to respond to this situation was constrained. They couldn’t afford to get into a bidding war with one another over the limited amount of head matter, for that would only cause its price, and hence the price of the candles, to rise. And while the wealthy were willing to pay more for spermaceti candles, they were willing to pay only so much. Were the price of spermaceti candles to rise too high, all but the richest customers would replace them with other illuminants, such as right whale oil, seal oil, or tallow candles.
Although the manufacturers couldn’t charge more for their product, there existed an alternative to keep the price of candles in check while still remaining profitable. Namely, if they could keep the cost of the head matter from rising too fast, then their goal could be achieved. At first, only four of the candle manufacturers joined forces, and they told the main supplier of head matter, Nantucketer Joseph Rotch, that they would only pay a certain amount, and no more, for his product. The folly of this plan, however, quickly became apparent. The four manufacturers could agree all they wanted on a ceiling price for head matter, but there was nothing to stop their competition from paying more, and therefore causing the price to rise rapidly. On November 5, 1761, the eight largest candle manufacturers in the colonies attempted to solve this problem by banding together in oligarchic fashion to form the “United Company of Spermaceti Chandlers,” which came to be known as the Spermaceti Trust. Under the terms of their agreement, the manufacturers established a maximum price that they would pay per ton of head matter, which was six pounds higher than the price that “common merchantable spermaceti body brown oil” was fetching in London. They also agreed to “use all fair and honorable means” to stop potential rivals from building new candleworks. And if the trust’s members were not able to keep the price of head matter from rising above the agreed-upon cap, they vowed personally to “fit out at least 12 vessels” to secure the head matter themselves. The trust was scheduled to remain in force for seventeen months, and during that time the members were to meet twice annually, "at the best tavern in Taunton” (Massachusetts), where they could share intelligence and measure the progress of their endeavor.
The Spermaceti Trust, one of the earliest industrial monopolies in the colonies and, according to one author, the “world’s first energy cartel,” was plagued by problems from the start. Trust members accused one another of breaches of their agreement, the most egregious of which was paying more than the stipulated price for head matter. Rather than let the trust sink under an accumulating list of grievances, the members chose to regroup and clarify the terms of their association, signing revised articles of agreement on April 13, 1763. The new maximum price for head matter was now set at ten pounds above the price for “Brown oil,” and members were allowed to buy only from a specific list of suppliers, thereby minimizing the likelihood of secret purchases at inflated prices. The members remained committed to stifling the competition, and instructed their agents to provide them with “the most early notice of any attempt to set up other spermaceti works,” so that they could take steps to keep the interlopers from obtaining the expertise and tools, mainly the screw press, needed to launch the business. The most critical element of the new articles of agreement was the division of head matter among members. Originally the members were free to buy as much head matter as they could secure at a given price. Now members would receive a set allotment from the entire stock of head matter caught in the colonies, which was divided into one hundred parts. Thus, Nicholas Brown and Company, the largest of the manufacturers, received twenty out of every one hundred barrels, while the others got smaller amounts.
For the next dozen years the Spermaceti Trust endured, and so did its problems. The members’ faith in one another and the trust itself was repeatedly shaken. Efforts to quash new candleworks were largely ineffectual, and by 1774 there were twenty-four different spermaceti candle manufacturers, all of whom had joined the trust. Worse still was the growing gap between the price of head matter and the price of candles.
Despite these problems, the members remained committed to maintaining the trust, believing that if it were to dissolve, the situation for all of them would be much worse. Better to have an imperfect union than to open the floodgates to uncontrolled competition, which could easily push the price of spermaceti candles beyond what even the luxury market was willing to bear. As long as the trust’s members could obtain head matter on reasonable terms, they could stay in business. And for many years the rise in the price for head matter notwithstanding, the trust managed to keep that price low enough so that they could still make a profit.
Saturday, August 11, 2018
Within economics there is a long-established challenge: the lead time between new policies and visible outcomes can be measured in years rather than months. So of course, policies pursued in one administration have a tendency to spill over, for good or ill, into the following administration. The new administration, of course, is in its turn launching new policies, and so any beneficial outcomes from the earlier administration get snaffled up as short-term outcomes of new policies which are yet half-baked. It's the nature of the beast.
It is also true, though, that sometimes either circumstances have changed or policies can be better targeted/executed in a way that can lead to dramatic improvements.
We won't know for a long time where the balance will fall between what carried over from the Obama administration into the Trump administration and what the new administration has itself set in motion. Obama pursued fiscal and monetary policies that frequently were subversive of growth, leading to multi-year predictions of "recovery summers" when growth was finally going to take off and never did. Towards the end of Obama's administration, the received wisdom in many policy circles was that 0.5-2.0% productivity growth was the new normal for the US and for developed nations worldwide. Obama's policies were not to blame, it was the new reality.
The investment, growth, labor force participation rate, unemployment rate, and stock market numbers are all so positive that it puts the lie to those claiming the new normal is less than 2% growth. But what is causing this windfall of growth? New policies? The promise of new policies not yet delivered? Changes in global economic tailwinds?
In order: maybe; probably; almost certainly not. I suspect that perhaps the biggest part is simply a clearly understood change in goals. The old administration wanted to focus on fairness and redistribution, manifested in micro-managing the economy, focusing on growth-suppressing policies, especially with regard to energy, ensuring that the productive were paying more taxes ("their fair share"), building up government consumption programs, etc.
The new administration is pursuing different goals with great simplicity and clarity. The message is simply that economic growth is the priority, on the assumption that increasing economic growth will eventually benefit everyone. I suspect that clarity, and the expectations that go with it, are perhaps the dominant driver of the new administration's growth.
The IBD editors are dwelling on that stale argument of how to slice the credit cake between old and new administrations. But they call attention to something I think is even more interesting: the willingness of the media to rewrite history that is recent and accessible.
Growth: The stronger the economy gets under President Trump, the more desperate his critics are to hand credit over to Obama. Even if that entails changing the past.

This is the second time this week that I have noticed the New York Times making a big claim that is factually inconsistent with their own reporting: reporting that is a matter of record. I posted earlier "It's like claiming you're eighteen while handing over a driver's license which proves you are fourteen," in which a NYT reporter makes claims about the state of knowledge about climate change in 1979 which are easily refuted by the NYT's own reporting and headlines in 1979.
A recent New York Times story says it all: "An Economic Upturn Begun Under Obama Is Now Trump's To Tout."
The article begins by admitting that "by nearly every standard measure, the American economy is doing well," then spends the next 1,400 words arguing that the current good times have nothing to do with Trump's economic agenda.
The economy, reporter Patricia Cohen declares, "is following the upward trajectory begun under President Barack Obama."
We seem to recall that the economy was stagnating in 2016 after the weakest recovery from a recession since the Great Depression.
In fact, The New York Times itself described Obama's economy this way in August 2016: "For three quarters in a row, the growth rate of the economy has hovered around a mere 1%. In the last quarter of 2015 and the first quarter of 2016, the economy expanded at feeble annual rates of 0.9% and 0.8%, respectively. The initial reading for the second quarter of this year, released on Friday, was a disappointing 1.2%."
GDP growth decelerated in each of the last three quarters of 2016.
And on January 27, 2017, after the government reported that GDP growth for all 2016 was a mere 1.6% — the weakest in five years — the Times announced that "President Trump's target for economic growth just got a little more distant."
That same month, the nonpartisan Congressional Budget Office forecast growth this year would be just 1.9%.
There were other signs of stagnation as well. Stocks had flatlined in 2016, with major indexes down slightly. Real median household income dropped that year, according to Sentier Research.
Growth had been so worrisomely slow throughout Obama's two terms in office that journalists started warning about "secular stagnation." They said the country was in a period of long, sustained, slow growth resulting from slow population and productivity growth.
In August 2016, the Times declared that "the underlying reality of low growth will haunt whoever wins the White House."
Predictions of Slow Growth
The next month, CBS News reported that "with U.S. economic growth stuck in low gear for several years, it's leading many economists to worry that the country has entered a prolonged period where any expansion will be weaker than it has been in the past."
In short, there was no upward trajectory to the economy on anyone's radar when Trump took office.
Why lie about a past that you documented yourself and which everyone can so easily find? They are not stupid people but they are sure acting in a way that makes little sense.
At the Bomb Testing Site
From: The Way It Is: New and Selected Poems (Graywolf Press, 1998)
by William E. Stafford
At noon in the desert a panting lizard
waited for history, its elbows tense,
watching the curve of a particular road
as if something might happen.
It was looking at something farther off
than people could see, an important scene
acted in stone for little selves
at the flute end of consequences.
There was just a continent without much on it
under a sky that never cared less.
Ready for a change, the elbows waited.
The hands gripped hard on the desert.
The Future History, by Robert A. Heinlein, describes a projected future of the human race from the middle of the 20th century through the early 23rd century. The term Future History was coined by John W. Campbell, Jr. in the February 1941 issue of Astounding Science Fiction. Campbell published an early draft of Heinlein's chart of the series in the March 1941 issue.

This is the timeline for Heinlein's Future History.
Heinlein wrote most of the Future History stories early in his career, between 1939 and 1941 and between 1945 and 1950. Most of the Future History stories written prior to 1967 are collected in The Past Through Tomorrow, which also contains the final version of the chart. That collection does not include Universe and Common Sense; they were published separately as Orphans of the Sky.
Groff Conklin called Future History "the greatest of all histories of tomorrow". It was nominated for the Hugo Award for Best All-Time Series in 1966, along with the Barsoom series by Edgar Rice Burroughs, the Lensman series by E. E. Smith, the Foundation series by Isaac Asimov, and The Lord of the Rings series by J. R. R. Tolkien, but lost to Asimov's Foundation series.
You can see the notes in the upper right, where the period closest to Heinlein in time (basically the second half of the 20th century) is labeled The Crazy Years, described as:
Considerable technical advance during this period, accompanied by a gradual deterioration of mores, orientation, and social institutions, terminating in mass psychoses in the sixth decade, and the interregnum.

Sounds pretty accurate, though the Crazy Years are lasting longer than Heinlein anticipated. I am not sure that we are yet at the stage of mass psychoses, but certainly perhaps a quarter of the population is exhibiting some fairly persistent manifestations of derangement.
SEATTLE AIRPORT MECHANIC, 29, HIJACKS EMPTY 76-SEAT COMMERCIAL PLANE, PERFORMS MID-AIR STUNTS AND CRASHES IN A BALL OF FLAMES ON AN ISLAND 25 MILES AWAY WHILE BEING CHASED BY F-15 FIGHTER JETS.

From the Daily Mail
Friday, August 10, 2018
I have always considered it a priceless advantage to have been born as an economist prior to 1936 and to have received a thorough grounding in classical economics. It is quite impossible for modern students to realize the full effect of what has been advisably called "The Keynesian Revolution" upon those of us brought up in the orthodox tradition. What beginners today often regard as trite and obvious was to us puzzling, novel, and heretical.
To have been born as an economist before 1936 was a boon - yes. But not to have been born too long before!
Bliss was it in that dawn to be alive,
But to be young was very heaven!

The General Theory caught most economists under the age of thirty-five with the unexpected virulence of a disease first attacking and decimating an isolated tribe of south sea islanders. Economists beyond fifty turned out to be quite immune to the ailment. With time, most economists in-between began to run the fever, often without knowing or admitting their condition.
I must confess that my own first reaction to the General Theory was not at all like that of Keats on first looking into Chapman's Homer. No silent watcher, I, upon a peak in Darien. My rebellion against its pretensions would have been complete, except for an uneasy realization that I did not at all understand what it was about. And I think I am giving away no secrets when I solemnly aver, upon the basis of vivid personal recollection, that no one else in Cambridge, Massachusetts, really knew what it was about for some twelve to eighteen months after its publication. Indeed, until the appearance of the mathematical models of Meade, Lange, Hicks, and Harrod, there is reason to believe that Keynes himself did not truly understand his own analysis.
I am sure all dedicated museum and history site visitors have experienced some, if not all, of these circumstances. A substantial extract from his post.
1. There will be a mix-up based on similar names – such as Lorch and Lorsch. You will go to the wrong one first, assuming you can find the second one at all.

I would add:
2. The thing you really want to see will be either a) on the road on loan or b) removed for conservation. OR
3. Not on display that day for other reasons (like a water leak in the room next door…)
4. If the stars and planets do align, the items will have been permanently removed from display and are now only available to select, vetted researchers who have valid reasons to see the originals. And there are no copies on display. [Yes, Albertina, I am looking at you.]
5. Natural history museums will have at least four school groups present, ranging in size from “small and easy to trip over” to “view-blockingly tall.”
6. You arrive the day after, or leave the day before, “National Free Museum Day!”
7. All the objects from a certain time period, let’s say Paleolithic and early Neolithic, have just been moved to a new state of the art, separate museum. Two miles from the last trolley or bus stop. And outside the cab ring.
8. The museum will be in between exhibits. Large portions will be closed so that the new stuff can be assembled without patrons tripping over curators and vice versa.
9. The museum/site is closed owing to a labor dispute.

The one time I had the money, the time, the flexibility, and was close to Pompeii, I arrived to find everything locked up owing to a strike. I have had the same thing happen elsewhere but that was among the most disappointing.
On a positive note, there are instances of the very opposite of the above laws. I have had the great pleasure to visit many sites (Chaco Canyon or Kolomoki Indian Mounds) or museums (Naturhistoriska Riksmuseet in Stockholm) when they were virtually empty. Nothing quite as inspiring as wandering pathways and hallways of great history and wonder with it all to yourself.
There are vested interests on all sides of the debate. There are True Believers who allow for no debate at all. And we are left with a lot of cognitive pollution and little clarity.
Such has been the uproar over Colony Collapse Disorder (CCD) beginning some ten years ago. Honey bee colonies, it was claimed, were collapsing all over the world. Then the claim was extended to all bees, not just honey bees.
I ignored it. It had all the hallmarks of a manufactured crisis, whether for commercial profit or communication clicks was unclear. Whether intentional or simply emergent order was unclear.
I saw something today that made me think about CCD and made me wonder, whatever became of that crisis?
The answer - hard to tell. Cognitive pollution everywhere. As usually happens, it seems the claims were overstated, the natural conditions misunderstood, and the data sets patchy.
This blogger, echoing many such articles I have seen in the past couple of years, claims Bees are not in Danger, and never were.
Wikipedia goes with an article that is closer to the peak hysteria.
This summary from the EPA circa 2015 has much more moderate claims.
Once thought to pose a major long term threat to bees, reported cases of CCD have declined substantially over the last five years. The number of hives that do not survive over the winter months – the overall indicator for bee health – has maintained an average of about 28.7 percent since 2006-2007 but dropped to 23.1 percent for the 2014-2015 winter. While winter losses remain somewhat high, the number of those losses attributed to CCD has dropped from roughly 60 percent of total hives lost in 2008 to 31.1 percent in 2013; in initial reports for 2014-2015 losses, CCD is not mentioned.

I could spend a lot of time researching this but almost certainly the data is incomplete and of variable quality, the vested interests (commercial or True Believer) are strong, the robust research minimal. I am not sure that there was ever, in empirical terms, a problem in the first place, i.e. any disorder outside of the normal variance in a natural system. I am not sure we know what the real empirical picture looks like for any longitudinal data set. And I am pretty sure we still do not fully comprehend all the variables in this complex, dynamic, evolving, power-law-driven system.
So, for the time being, I will continue monitoring but with an increasing assumption that there was never a problem in the first place.
Thursday, August 9, 2018
The article is How a question is phrased can drastically alter the answer from Knowledge@Wharton. With due respect to the researchers from my alma mater, do tell. The underlying research is Eliciting the truth, the whole truth, and nothing but the truth: The effect of question phrasing on deception by Julia A. Minson, et al.
Anyone who has ever done even a modicum of survey design knows this. Context matters, the wording of questions matters, the sequence of questions matters. A halfway competent survey designer can get exactly the answers they want based on how the questions are posed. That is why surveys are so challenging and so frequently wrong.
In strategic information exchanges (such as negotiations and job interviews), different question formulations communicate information about the question asker, and systematically influence the veracity of responses. We demonstrate this function of questions by contrasting Negative Assumption questions that presuppose a problem, Positive Assumption questions that presuppose the absence of a problem, and General questions that do not reference a problem. In Study 1, Negative Assumption questions promoted greater disclosure of undesirable work-related behaviors than Positive Assumption or General questions did. In Study 2, Negative Assumption questions increased disclosure of undesirable information in face-to-face job recruitment meetings, relative to Positive Assumption questions and General questions. Study 3 demonstrated that the relationship we identify between question type and the veracity of responses is driven by inferences of assertiveness and knowledgeability about the question asker. Finally, in Study 4, asking assertive questions with regard to uncommon behaviors led the question asker to be evaluated more negatively.

Fair enough. But what is new here? The rest is behind a paywall so I cannot see what insight they are adding that hasn't been surfaced in the past fifty years of research into survey design.
Which is too bad because there are a lot of intensely interesting and useful issues in the topic.
Every conversation is a dance where the two parties are trying to understand and forecast the other. What are their respective bodies of knowledge, what are their assumptions, what are their interests, what are their desires, what is important to them, what are appropriate topics of conversation? The music of a conversation usually starts slow and builds, especially if it is between strangers. There is an ebb and flow, tentative assertions and modest retreats. A mutual seeking of a common ground.
Much the same is true of survey design. You have to understand the audience or you have to have the time to discover them. As with a conversation, and this may be the point of Minson and her colleagues, there is a two-way flow of intentional and unintentional information. Much is revealed of one to the other, which in turn causes an evolution of expectations.
If they are taking their research in a useful direction, Minson et al. will begin to build a structure by which we can understand how there is a meeting of minds between surveyor and surveyed. How do you reach a pertinent alignment of information, values, and assumptions with a salient population in whom you are interested?
It is a lot more complicated and fallible than is usually assumed.
Why and how is translation so hard? Here's a little non-comparative case study to help make the process more visible. 9 words from the very start of the Odyssey, lines 1-2: ὃς μάλα πολλὰ / πλάγχθη, ἐπεὶ Τροίης ἱερὸν πτολίεθρον ἔπερσεν. Syntactically easy.— Emily Wilson (@EmilyRCWilson) August 6, 2018
Cold Missouri Waters
by Cry Cry Cry
My name is Dodge but then you know that
It's written on the chart there at the foot end of the bed
They think I'm blind that I can't read it
I've read it every word and every word it says is death
So, confession, is that the reason that you came
Get it off my chest before I check out of the game
Since you mention it well there's thirteen things I'll name
Thirteen crosses high above the cold Missouri waters
August '49, North Montana
The hottest day on record the forest tinder dry
Lightning strikes in the mountains
I was crew chief at the jump base
I prepared the boys to fly
Pick the drop zone C47 comes in low
Feel the tap upon your leg that tells you go
See the circle of the fire down below
Fifteen of us dropped above the cold Missouri waters
Gauged the fire I'd seen bigger
So I ordered them to side hill we'd fight it from below
We'd have our backs to the river
We'd have it licked by morning even if we took it slow
But the fire crowned jumped the valley just ahead
There was no way down headed for the ridge instead
Too big to fight it we'd have to fight that slope instead
Flames one step behind above the cold Missouri waters
Sky had turned red smoke was boiling
Two hundred yards to safety
Death was fifty yards behind
I don't know why I just thought it
I struck a match to waist high grass running out of time
Tried to tell them step into this fire I've set
We can't make it this is the only chance you'll get
But they cursed me
Ran for the rocks above instead
I lay face down and prayed above the cold Missouri waters
Then when I rose like the phoenix
In that world reduced to ashes
There was none but two survived
I stayed that night and one day after
Carried bodies to the river
Wondering how I stayed alive
Thirteen stations of the cross to mark their fall
I've had my say I'll confess to nothing more
I'll join them now those that they left me long before
Thirteen crosses high above the cold Missouri waters
Wednesday, August 8, 2018
From Let There Be More Than Light by Bjørn Lomborg. Worth reading in its entirety. This is the part that leapt out at me, for purely personal reasons.
Worldwide, fossil fuels produce two-thirds of all electricity, with nuclear and hydro producing another 27%. According to the International Energy Agency (IEA), solar, wind, wave, and bio-energy produce just 9.8% of electricity in the OECD, and this is possible only because of huge subsidies, cumulatively totaling more than $160 billion this year. Even ultra-environmentally aware Germany still produces more than half its electricity with fossil fuels.

Then he gets to the ugly rub.
Yet there is a disturbing movement in the West to tell the 1.1 billion people who still lack these myriad benefits that they should go without. A familiar refrain suggests that instead of dirty, coal-fired power plants, poor countries should “leapfrog” straight to cleaner energy sources like off-grid solar technology. Influential donors – including even the World Bank, which no longer funds coal energy projects – endorse this view.
The underlying motivation is understandable: policymakers must address global warming. Eventually moving away from fossil fuels is crucial, and innovation is required to make green energy cheap and reliable. But this message to the world’s poor is hypocritical and dangerous. While fossil fuels contribute to global warming, they also contribute to prosperity, growth, and wellbeing.
There is a strong, direct connection between power and poverty: the more of the former, the less of the latter. A study in Bangladesh showed that grid electrification has significant positive effects on household income, expenditure, and education. Electrified households experienced a jump of up to 21% in income and a 1.5% reduction in poverty each year.
Over the past 16 years, nearly every person who gained access to electricity did so through a grid connection, mostly powered by fossil fuels. And yet donors say that many of the 1.1 billion people who are still without electricity should instead try solar panels.
So many of our well-intentioned do-gooders in the West chase after theoretically plausible solutions primarily because they want to imagine their way out of having to make hard choices. You can alleviate poverty for tens of millions with coal plants, or you can condemn them to lives solitary, poor, nasty, brutish, and short in order to possibly have a cleaner environment a hundred years from now. Don't like that hard trade-off? OK, we'll talk about make-believe solar-cell alternatives in order to morally preen and pretend that our heated imaginings will make a difference.
Compared with expensive grid expansion, providing an off-grid solar cell is very cheap. But for the recipient, it is a poor substitute. It offers just enough power to keep a lightbulb going, and to recharge a mobile phone, which is better than nothing – but only barely. The IEA expects that each of the 195 million people with off-grid solar will get just 170kWh per year – or half of what one US flat-screen TV uses in a year.
Perhaps not surprisingly, the first rigorous test published on the impact of solar panels on the lives of poor people found that while they got a little more electricity, there was no measurable impact on their lives: they did not increase savings or spending, they did not work more or start more businesses, and their children did not study more.
Little wonder: 170kWh is not what most of us would consider real access to electricity. Off-grid energy at this level will never power a factory or a farm, so it cannot reduce poverty or create jobs. And it will not help fight the world’s biggest environmental killer: indoor air pollution, which is mostly caused by open fires fueled by wood, cardboard, and dung, and claims 3.8 million lives annually. This is not a concern in rich countries, where stoves and heaters are hooked up to the grid; but because solar is too weak to power stoves and ovens, recipients of off-grid solar panels will continue suffering.
Meanwhile, someone has to make hard decisions in the real world, and you either choose to make people's lives better by means you do not like or you condemn them to grueling, grunting poverty. As Pete Seeger asks, "Which side are you on?"
As an undergraduate majoring in International Economics with a focus in International Economic Development, I was exposed to the Appropriate Technology ideas of Schumacher and his ilk. Developing nations, rich in labor and poor in capital, ought to avoid the stresses and strains of modern civilization by only adopting technology appropriate to their stage of development.
Sure, sounds pretty. Being a callow youth, I could parrot the ideas without having to actually think them through. A couple of years later in graduate school, I had to suffer the brutal exposure that awaits all shallow thinking. In an international business class, with an Indian professor, I suggested a solution based on Appropriate Technology thinking. He, with kind gentleness, then forced me to play out the assumptions behind my parroted words. I could string out the logical integrity of the argument, but once I started examining the underlying assumptions it became very uncomfortable. Why should people in developing nations have to go through the same steps as earlier pioneers? Why not leap to the end state if at all possible?
Basically, why would you not treat people in developing nations as if they had the same aspirations as those in the developed nations? Why would you hold them back?
I suspect everyone must eventually have gotten hoisted on that unpleasant petard because you don't hear serious thinking about Appropriate Technology anymore. That said, in some ways, the new-found infatuation with Sustainability is merely the old failed Appropriate Technology in new clothing. Just as Diversity is the rebranding of the old failed Affirmative Action.
Rich NGOs advocating for solar panels for the poor in developing nations are the Appropriate Technology apostles of the current day. They don't want to accept other people's agency and the decisions other people make to optimize their own productivity. The feel-gooders would rather tell them what to do, or force them through financial inducements to do the things the rich want them to do even though it does not help the poor.
All That is Gold Does Not Glitter
by J.R.R. Tolkien
All that is gold does not glitter,
Not all those who wander are lost;
The old that is strong does not wither,
Deep roots are not reached by the frost.
From the ashes, a fire shall be woken,
A light from the shadows shall spring;
Renewed shall be blade that was broken,
The crownless again shall be king.
Judge of individuals from your own knowledge of them, and not from their sex, profession, or denomination.
17. TO HIS SON
As good a refutation of the identitarian creed as any. The individual is not the average. Populations always have a distribution. Focus on the individual, not the average.
Dublin Castle, April 5, 1746
Before it is very long, I am of opinion that you will both think and speak more favourably of women than you do now. You seem to think that from Eve downwards they have done a great deal of mischief. As for that Lady, I give her up to you: but, since her time, history will inform you, that men have done much more mischief in the world than women; and, to say the truth, I would not advise you to trust either, more than is absolutely necessary.
But this I will advise you to, which is, never to attack whole bodies of any kind; for, besides that all general rules have their exceptions, you unnecessarily make yourself a great number of enemies, by attacking a corps collectively. Among women, as among men, there are good as well as bad; and it may be full as many, or more, good than among men. This rule holds as to lawyers, soldiers, parsons, courtiers, citizens, etc. They are all men, subject to the same passions and sentiments, differing only in the manner, according to their several educations; and it would be as imprudent as unjust to attack any of them by the lump. Individuals forgive sometimes; but bodies and societies never do.
Many young people think it very genteel and witty to abuse the Clergy; in which they are extremely mistaken; since, in my opinion, parsons are very like other men, and neither the better nor the worse for wearing a black gown. All general reflections, upon nations and societies, are the trite, thread-bare jokes of those who set up for wit without having any, and so have recourse to commonplace. Judge of individuals from your own knowledge of them, and not from their sex, profession, or denomination. Though at my return, which I hope will be very soon, I shall not find your feet lengthened, I hope I shall find your head a good deal so, and then I shall not much mind your feet. In two or three months after my return, you and I shall part for some time; you must go to read men as well as books, of all languages and nations. Observation and reflection will then be very necessary for you. We will talk this matter over fully when we meet; which I hope will be in the last week of this month; till when, I have the honour of being
Your most faithful servant.
If only they were teaching Chesterfield rather than identity politics in school today, the world would be a better place.
Tuesday, August 7, 2018
A $1.6 trillion debt for 182 years to be borne by the whole nation to support a new principle - universal human rights
What is unique is that it is only in the modern Age of Enlightenment era with its simultaneous beliefs in universal human rights, and rule of law, and property rights, that we have seen the virtual elimination of slavery as an institution. The idea that Age of Enlightenment, classical liberalism required the elimination of slavery is well documented and very traceable to extensive debates and specific legislation, starting in the US, Britain, and France and spreading from there. It is a fascinating history.
I think there were essentially four pulses of freedom that marked the elimination of global slavery. The first three came out of the three heartlands of Age of Enlightenment thinking and institutions - Great Britain, France, and the United States from 1794 to 1865. This was followed by Russia in 1861-66. There was a lot of mopping up to be done here and there over those decades.
But the biggest assault on slavery began in Britain.
The marriage of Christian beliefs and Age of Enlightenment universalism gave the effort its earliest and most powerful boost in Britain. One of the key challenges, though, was the reconciliation of simultaneous beliefs marking a cultural transition. The belief in universal human rights (which precludes slavery) had to work through the other Age of Enlightenment precepts - rule of law and property rights. The nut of the problem was how to convert slaves (then a form of property) into free humans while adhering to the law and observing property rights.
Of course the whole process, new in global affairs, was messy.
In the United States, despite the Age of Enlightenment ethos of the Revolution and the inspiration of Thomas Paine's Common Sense, there was a false start. The Founding Fathers were unable to bridge the gap between universal human rights, property rights, and rule of law. Unable to reach agreement in an already fragile set of circumstances, having no credit, and unwilling to fight yet another war, they incorporated slavery as it then existed into the Constitution, a problem to be resolved later. Jefferson made a small down payment against this blight by supporting legislation that ended up outlawing the importation of slaves to the US in 1808.
France was the first to abolish slavery by legal edict in 1794 with the promise that slave owners would be compensated for their property. The mechanics of making that happen fell victim to the turmoil of the French Revolution. Seeking to avoid too many wars on too many fronts, Napoleon promulgated a new law in 1802, reestablishing slavery in French colonies which continued until 1848. This was the second false start.
Britain was the most effective in its abolitionist efforts, which began circa the 1750s and gained momentum in the 1770s. One of the key sticking points was the cost of abolition. If slave-owners were to be compensated for their usurped property (as rule of law and property rights demanded), how much, and who was to pay? This was no small issue. The great majority of the British population still lived perilously close to poverty and starvation in the late eighteenth century. The capital value of the total population of slaves was immense.
Serendipitously, national credit markets also came into existence with the Age of Enlightenment, first in the Netherlands and then across northern Europe. Britain was only able to sustain its freedoms and fighting capabilities during the Napoleonic wars through massive national debt accumulation. This was new, unprecedented and contentious. Despite moral and pragmatic objections to debt, however, national debt did work to accomplish things otherwise impossible to achieve.
The Slave Trade Act, passed by Parliament in 1807, laid the first foundation for the eventual global elimination of slavery by outlawing the trade in slaves in British territories. With both Britain and the United States outlawing the traffic in slaves, and with the might of the British Navy behind it, and given the extent of Britain's colonial holdings, this, to some degree, eliminated the trade of slaves globally but without addressing the core issue of slave ownership. That had to wait until 1833.
From The Power of Ideas and the Great Emancipation of 1834.
In the Slavery Abolition Act of 1833, Parliament committed the huge sum of 20 million pounds sterling to compensate slave owners for the loss of their “assets.” That was equivalent to 40 percent of the entire national budget (and five percent of Britain’s GDP at the time), requiring the government to borrow most of the 20 million from private sources. It finally paid the loan off just three years ago, in 2015. The former slaves themselves received no payment for their suffering—a lamentable aspect of the law that freed them.
Britain squared the circle of liberty, property, and rule of law by undertaking the modern equivalent of a $1.6 trillion debt for 182 years, borne by the whole nation. Remarkable.
France eventually abolished slavery in its colonies (with compensation) in 1848.
The United States ultimately paid for the abolition of slavery in blood: 620,000 dead and cultural scars for another two or three generations. The $1.6 trillion equivalent the British paid was a steep price, but a bargain in comparison.
Russia abolished serfdom in 1861-66, freeing some twenty-five million serfs. The cost of freedom was borne by the serfs themselves, the government imposing taxes on the newly freed serfs in order to compensate their former owners.