Author: chasdarwin

How playing Wittgensteinian language-games can set us free

Source: Aeon

Author: Sandy Grant – is a philosopher who works at the University of Cambridge. Following a distinguished academic career, in which she had both a Professorial Chair and a baby by the age of forty, she has begun writing philosophy for public audiences. Sandy has recently enjoyed writing for Aeon, Quartz and The New European. She still teaches in Tripos and for the Kings-Pembroke Faculty.

Emphasis Mine

We live out our lives amid a world of language, in which we use words to do things. Ordinarily we don’t notice this; we just get on with it. But the way we use language affects how we live and who we can be. We are as if bewitched by the practices of saying that constitute our ways of going on in the world. If we want to change how things are, then we need to change the way we use words. But can language-games set us free?

It was the maverick philosopher Ludwig Wittgenstein who coined the term ‘language-game’. He contended that words acquire meaning by their use, and wanted to see how their use was tied up with the social practices of which they are a part. So he used ‘language-game’ to draw attention not only to language itself, but to the actions into which it is woven. Consider the exclamations ‘Help!’ ‘Fire!’ ‘No!’ These do something with words: soliciting, warning, forbidding. But Wittgenstein wanted to expose how ‘words are deeds’, that we do something every time we use a word. Moreover, what we do, we do in a world with others.

This was not facile word-nerdery. Wittgenstein was intent on bringing out how ‘the “speaking” of language is part of an activity, or form of life’. In Philosophical Investigations (1953), he used the example of two builders. A brickie calls ‘Slab!’ and his helper brings it. What’s going on here? The helper who responds is not like a dog reacting to an order. We are humans, the ones who live together in language in the particular way that we do, a way that involves distinctive social practices.

With this spotlight on language-games, Wittgenstein asks readers to try to see what they are doing. But if we are entranced by our linguistic practices, can we even see what we’re doing? Wittgenstein’s attempts to see met with the charge that he was stopping us from seeing anything else, from perceiving new possibilities: his linguistic obsessions were a distraction from real politics. The chief accuser was Herbert Marcuse, who in his blockbuster One-Dimensional Man (1964) declared that Wittgenstein’s work was reductive and limiting. It could not be liberatory, for the close focus on how we use words misses what’s really going on.

These objections are serious. But do they succeed?

Marcuse claims that Wittgenstein is reductive, seeing only language, and poorly at that. Wittgenstein strives to bring language-games to light: Marcuse says this is stupid. Well, is it? Yes and no. In Culture and Value (1977), Wittgenstein admits: ‘How hard I find it to see what is right in front of my eyes.’ All too often, he says, we miss the obvious. That which is close is the most difficult to see for what it is. When we use words, we partake of everyday understandings and carryings-on. Wittgenstein looks to these everyday usages, and remarks upon them.

One remark that Marcuse ridicules is Wittgenstein’s example, ‘My broom is in the corner…’ Marcuse is super-snarky about this, and denounces ‘the almost masochistic reduction of speech to the humble and common’. But, amid the bluster, Marcuse misses the point. The mundane example is apt given the everyday practices at issue. Moreover, if you look closely, even a statement so banal is not quite what it seems. There are numerous other examples of Wittgenstein’s that Marcuse ignores, for example on reading or the aroma of coffee.

This all-too-human stupidity is deep-seated. Wittgenstein is calling attention to the ways in which, by our everyday language-games, we entrap ourselves. So he looks closely at what he is doing and saying. He sees work in philosophy as therapeutic, in the sense of ‘a work on oneself’. And there is an intense self-scrutiny in Philosophical Investigations. It is quite remarkable, questioning the ways we use language to do mundane things such as telling the time, doing sums, or hoping that someone will come. This is not something to which we are accustomed. We can be resistant, not wanting to see things for what they are. Is this ‘masochistic’? It is a subjection of oneself to self-scrutiny, but surely only painful or humiliating for those who stand to lose from finding that they are not so clever after all. So, if we are to change, we must first face up to an imperative to ‘be stupid’, and to know ourselves to be. Marcuse could have welcomed this, for he gets that it is in everyday practices that we are unwittingly subjected: ‘magic, witchcraft, and ecstatic surrender are practised in the daily routine of the home, the shop, and the office’. In short, the lady doth protest too much.

Does Marcuse’s second objection fare any better? This is the claim that Wittgenstein is confining, ensnaring us only further within language. Marcuse says that Wittgenstein’s take on language is one-dimensional. But this is not borne out by a reading of Wittgenstein’s book, where we find a view of language as irreducibly multi-dimensional. Wittgenstein painstakingly shows how the basis for what we use as language is provided by shifting patterns of communal activity. Language is contingent and provisional, so language-games can’t but be open to change, in numerous ways. One arises from recognising that we can choose to see something as this, or as that. One of Wittgenstein’s most famous passages involves the duck-rabbit, an ambiguous picture-puzzle that can be seen as either animal.

Look at the picture, and you can see it as a duck. Look again, and you can see it as a rabbit. Because language-games are played by humans, we can notice what is going on when we see things as this, or as that. A contemporary example is the controversy over all-male speaker events. You can look at the line-up and say ‘a panel of experts’, or you can say ‘manel’. But is it only a manel if you choose to see it that way? These examples invite us to question what we take to be given in everyday uses of language. But Marcuse doesn’t mention the duck-rabbit, or discuss its implications.

So language usage admits contestation and change, in virtue of what it is. Marcuse, on the other hand, denies this, and even says that societal processes close the universe of discourse. We don’t get from him anything like Wittgenstein’s suggestion that there is in language usage itself something recalcitrant to fixity.

Indeed, Wittgenstein’s position is rather more radical than Marcuse cares to notice. He says ‘something new (spontaneous, “specific”) is always a language-game’. This cryptic remark might suggest that we need to play language-games differently if we are to change anything. What of this prospect? Notably, on Wittgenstein’s account, we don’t play language-games solo. They arise through communal uses of language. One game is polari, the secret language used among gay men in Wittgenstein’s time. Language-games, with their beguiling snares, raise a collective action problem. We can’t extricate ourselves from them if acting alone. But this raises a further question, given how profoundly we are ensnared. It is one that Wittgenstein anticipates:

[T]his language grew up as it did because human beings had – and have – the tendency to think in this way. So you can only succeed in extricating people who live in an instinctive rebellion against language; you cannot help those whose entire instinct is to live in the herd which has created this language as its own proper mode of expression.

The rebels live in a state of dissatisfaction with language. They feel their alienation, cut off from others and themselves within language. But the contented are untroubled, and humans are inclined to think that way. Reading Wittgenstein brings us to such questions.

So Marcuse’s objections are unfounded. He fails to show that Wittgenstein’s astonishing scrutiny of language-games is either pointlessly stupid or enslaving. In fact, his efforts only heighten regard for Wittgenstein’s relevance in the darkness of these times.

Using language is an integral part of the human condition. We live within language, yet our way of life is something we find hard to see. Wittgenstein is not peddling ready answers to this predicament. Indeed, as long as there is language it will bewitch us, and we will face the temptation to misunderstand. And there is no vantage point outside it. There is no escape from language-games then, but we can forge a kind of freedom from within them. We might first need to ‘be stupid’ if we are to see this.


Merry Christmas, Donald Trump!

Source: Tablet Magazine

Author: Adam Kirsch

Emphasis Mine

TO: America  FROM: The Jews

“If I become president, we’re all going to be saying Merry Christmas again, that I can tell you,” Donald Trump promised more than a year ago. Well, he is not president quite yet; but the first Christmas of the Trump era is just around the corner, and, so far, this looks like one campaign promise that is not going to be kept. The use of “Happy Holidays” as an all-purpose December greeting is just too habitual in America to be banished by presidential edict. Indeed, Trump himself recently sent out a card to his supporters that contains the dreaded greeting.

Still, as so often with Trump, what matters is not the performance but the rhetoric; and by coming out so strongly against “Happy Holidays,” he was signaling his support for a certain vision of America. This is not so much a pious Christian vision—Trump himself is famously cavalier in matters of faith—as it is an ideal of homogeneity. The implied reasoning is that Americans stopped saying “Merry Christmas” and started using “Happy Holidays” because of the unwelcome arrival of people who did not celebrate the Christian holiday—people who forced Christian Americans to abandon a religious custom in order to cater, in politically correct fashion, to their alien sensitivities.

Theoretically, it might be possible to think of Muslims or Hindus as the guilty party here. But historically, of course, it is the Jews who were the first major immigrant group to change the complexion of Christian America. For a long time, this change was minimized by the adoption of “Judeo-Christian” as a new adjective for American religion. Jews, in this view, might not actually celebrate Christmas, but they could be comfortably grandfathered in as honorary members of the Christian tradition. But in recent years, this tolerance has been eroding as the notion of a “war on Christmas” gains traction, to the point that even so benign a figure as Garrison Keillor could complain about the Jewish conspiracy to replace Christmas carols with non-denominational holiday songs, like “White Christmas.” (This was written by Irving Berlin, who also gave us the all-purpose nondenominational hymn “God Bless America.”)

(N.B.: What Do the Writers of “White Christmas,” “Rudolph the Red-Nosed Reindeer,” and “Chestnuts Roasting on an Open Fire” All Have in Common? Check out pretty much any list of the most iconic Christmas songs and about half of them were written by Jewish people. Johnny Marks may be the most prolific: he wrote “Rudolph the Red-Nosed Reindeer,” “Rockin’ Around the Christmas Tree,” and “A Holly Jolly Christmas.” In addition to the songs listed above, you can also credit Jewish songwriters with “Silver Bells,” “Let It Snow,” “Santa Baby” and plenty more. So how do you explain this religious contradiction? According to Emmy winner Michael Feinstein, “The Christmas songs that are popular are not about Jesus, but they’re about sleigh bells and Santa and the trappings of Christmas.” In other words, Christmas songs are really just about winter and family and being “Home for the Holidays” (also written by a Jewish person).)

Talk of a war on Christmas is, then, at least implicitly anti-Jewish, and sometimes quite explicitly so. Donald Trump’s promise to restore “Merry Christmas” was a coded message about reducing the Jewish influence on and presence in American culture—just as his notorious campaign ad about the “global power structure” robbing “our working class” made the same promise in economic and political code.

There are good reasons, however, to believe that “Happy Holidays” is here to stay as a public December greeting, especially in commercial and official contexts. This has nothing to do with sparing Jewish feelings, or even Muslim and Hindu and atheist feelings—though taken together non-Christians make up a growing minority in American life. It is, rather, because American public discourse lacks the ability to discuss religion in any kind of substantive way. Commerce, not religion, is the tie that binds Americans of many different faiths, including the various Christian denominations; it is what we all have in common, like it or not. That is why American Christmas, to the despair of many religious Christians, long ago became a holiday whose public expressions are not about the birth of Jesus, but about buying things and giving gifts. (In a sense, this represents a deep continuity with the ancient roots of the holiday as a pagan winter festival, in which a season of deprivation was symbolically banished by feasting.)

Indeed, the transformation of Christmas into a holiday of consumption tinged by humanitarianism is not just an American phenomenon; it happened across Europe as well, in tandem with the rise of capitalism. A splendid way to see this process at work is to read the new Penguin Christmas Classics, a charmingly designed box set of six stories about the holiday, each in its own slim, stocking-size volume. These range from Dickens’ A Christmas Carol and E.T.A. Hoffmann’s The Nutcracker to lesser-known tales by Louisa May Alcott, Anthony Trollope, Nikolai Gogol, and L. Frank Baum, the creator of Oz.

These are mostly 19th-century works, and though they come from several different countries and in a variety of languages, they have a remarkable amount in common. Most notable is that none of them is about Jesus, and few even mention the birth of the Christian savior except in a pro forma way. In the Dickens story, Scrooge’s nephew dispenses with the theological meaning of the holiday in a parenthesis: “I am sure I have always thought of Christmastime, when it has come around—apart from the veneration due to its sacred name and origin, if anything belonging to it can be apart from that—as a good time: a kind, forgiving, charitable, pleasant time.” The language is significantly ambiguous: Christmas is about “having a good time” as much as it is a time for doing good, and indeed, for Dickens, the two are inseparable. At the end of the story, Scrooge’s reformation is signaled by his finally accepting his nephew’s invitation to a Christmas party, as if the ability to be jolly were itself a sign of moral grace.

This union or confusion of virtue and enjoyment is one reason why Dickens’ story has become the classic of capitalist Christmas. Scrooge is, of course, a famous symbol of miserliness, of the capitalist ethic run amok—his only purpose in life is accumulating money. But the opposite of miserliness is not only generosity; it is also consumption, the joyous, free-spending consumption that for Dickens is essential to the Christmas spirit. Just after Scrooge’s transformation, Dickens writes, “His hands were busy with his garments all this time: turning them inside out, putting them on upside down, tearing them, mislaying them, making them parties to every kind of extravagance.”

Extravagance is the key, and it is very Dickensian to transfer this word from the realm of emotion and behavior to the realm of physical objects, as though clothes themselves could embody it. But extravagance is itself a key capitalist virtue, because it is the drive to spend and consume that keeps the economy in motion. Scrooge’s miserliness can, in fact, be seen as a vestige of the heroic age of the Protestant work ethic, the time when capital accumulation was necessary for the first stage of industrialism to take off. In a later day, in a consumer society, it is maladaptive, and Scrooge must learn to spend as well as earn—just as a capitalist economy needs demand as well as supply if it is to avoid a depression.

Extravagance, then, is one meaning of Christmas in the modern world. The Nutcracker, as readers familiar with the Tchaikovsky ballet will remember, is all about the voluptuous pleasure of getting presents. Here, again, consumption is seen as a blessed activity, as Hoffmann writes: “The children, who kept whispering about the expected presents … added that it was now also the Holy Christ, who, through the hands of their dear parents, always gave them whatever real joy and pleasure He could bring them.” It is not until later, when the dolls and candy assume nightmarish proportions in young Marie’s fevered dreams, that there comes to seem something ominous about accumulating luxuries. But even then, the uncanniness of Hoffmann’s tale reads like a distant homage to the original uncanniness of the Christian incarnation, in which the eternal breaks into the temporal. Christmas is a time when the usual laws—not just of economics, but of nature—are momentarily suspended.

Taken to the extreme of banality, as it is in Anthony Trollope’s Christmas stories, this means that Christmas is a time for lucky breaks and funny coincidences, of the kind familiar from sitcoms or romantic comedies. Trollope’s tales, such as “Christmas at Thompson Hall” and “The Mistletoe Bough,” were seasonal commodities produced for magazines, and what they show is that even in Victorian England, readers wanted Christmas stories with as little Christianity in them as possible. In these tales, a wife mistakes her hotel room number and accidentally applies a mustard plaster to a strange man—who turns out to be her future brother-in-law; or else a young man wins the love of a young woman after overcoming the slightest of misunderstandings. Christmas has no role to play except as a generally happy and benevolent atmosphere, which ensures that everything will turn out for the best.

Reading these classic Christmas tales helps to explain how American Jews could develop Hanukkah, previously a fairly minor winter holiday, into such a successful counterpart to Christmas. Religiously and ideologically, Hanukkah is just about the worst holiday possible for such a purpose—it is, after all, a story about Jews resisting assimilation by violence. But if Christmas is civically celebrated mainly as a day of consumption tinged by benevolence, as it is by Dickens and Trollope and Hoffmann, then it presents no obstacle for Jews who want to enter into its spirit. Singing songs and giving gifts requires no particular theological commitment, and we can all share in the secular magic of “the Christmas season,” even if we do it under the stern aegis of the Maccabees. Still, we should remember that the transformation of Merry Christmas into Happy Holidays predates the entry of Jews into Anglo-American society, and it happened as a response to Christian, not Jewish, cultural and economic needs. If it smoothed the entry of Jews into American society, that was only a side effect, though a wonderful one.

Speaking for myself, I knew perfectly well as a child that Christmas was not my holiday; but I never felt that it was wrong for me to attend friends’ Christmas parties or to enjoy their trees. I did have some compunction about singing Christmas carols, which are explicitly religious. But what matters is that I was invited to do it, by friends and at school, and I could do it in a spirit of friendliness and participation, rather than religious affirmation. This inclusiveness seemed only natural to me, and it is only as an adult, having learned much more about Jewish history, that I realize what a truly extraordinary thing it is. American Jews celebrate Christmas Eve by going out to the movies or eating Chinese food, making common cause with another non-Christian minority; and Christian America accepts this as a kind of endearing oddity.

Compare this to the way Jews used to “observe” Christmas Eve in Eastern Europe, on what they referred to as Nittel Nacht: by holding vigil all night and refraining from Torah study, both for theological reasons and to avoid incurring the wrath of celebrating Christians. (The custom of playing dreidel may originate in the games Jews played to pass the time while locked in their houses on Nittel Nacht.) The fact that most American Jews today have never heard of this tradition is a sign of how completely our relationship to Christians and Christianity has changed for the better. That makes Christmas a holiday worth celebrating for Jews, and other non-Christians, as well.





Young adults agree more with Karl Marx than the Bible, a new study finds

time heals a lot of problems…


Author: Dan Arel

Emphasis Mine

Socialism is on the rise in the U.S. thanks in part to Senator Bernie Sanders’ recent run for president and the spotlight he put on the once-taboo word.

Now, thanks to a new survey of 2,300 people conducted by YouGov and the Washington, D.C.-based Victims of Communism Memorial Foundation, we know that young adults, referred to as millennials in this study, agree more with the words of Karl Marx than they do with passages from the Bible.

When asked if they agree with what Marx said, “From each according to his abilities, to each according to his need,” 64 percent of the millennials polled said, “yes,” while only 53 percent said they agree with the statement in the Bible that “if any would not work, neither should he eat.”

The study also shows, in my opinion, that millennials are better educated about what communism and socialism are. They are not stuck behind Red Scare, McCarthyist views of the USSR or Cuba.

While 57 percent of Americans overall say they have a “very unfavorable” view of Communism, only 37 percent of millennials say the same.

The study also found that 45 percent of those aged 16 to 20 said they would vote for a Socialist, and 21 percent said they would vote for a Communist.

Only 42 percent of young adults had a favorable view of capitalism.

“The study found a growing acceptance of Socialist and Marxist viewpoints among a younger generation of Americans who did not grow up during the Cold War,” the report said. “When considered alongside the broad support among millennials for Bernie Sanders and his ideals — the poll, for example, found more support for quotes of Sanders than Milton Friedman and the Bible — Socialism has growing support in America.”

Ironically, the study was meant to shed a negative light on communism and socialism: it implies that Stalin, Castro, and Mao were communist leaders, instead of looking at the authoritarian systems they created, which don’t meet even the basic requirements of communism laid out by Marx.

“One of the concerns the Victims of Communism Memorial Foundation has had since its establishment is that an emerging generation of Americans have little understanding of the collectivist system and its dark history,” Marion Smith, the foundation’s executive director, said in the report. “Unfortunately, this report, which we intend to release on an annual basis, confirms this worrisome impression.”

The truth is, these young adults likely do understand it, and realize that what those regimes built was not socialism or communism. They are happy to work toward a system that actually helps all of its citizens rather than continue trying to fix a broken capitalist system that they understand only helps the wealthy.


Study Links Religious Belief To Poor Understanding Of Physical World


Author: Michael Stone

Emphasis Mine

Ignorance and defective thinking styles lead to religious superstition.

A poor understanding of the physical world is linked to religious and paranormal beliefs in a new study.

A recent study published June 2016 in Applied Cognitive Psychology connects belief in the supernatural (religious and paranormal beliefs) with poor reasoning skills, low information about basic physics and biology, and a propensity to assign intention and mentality to non-mental phenomena (magical thinking).

PsyPost reports the study shows that religious and paranormal (supernatural) beliefs are correlated with “poor intuitive physics skills, poor mechanical ability, poor mental rotation, low school grades in mathematics and physics, poor common knowledge about physical and biological phenomena, intuitive and analytical thinking styles, and in particular, with assigning mentality to non-mental phenomena.”

The following excerpt is from the summary of the article “Does Poor Understanding of Physical World Predict Religious and Paranormal Beliefs?”:

The results showed that supernatural beliefs correlated with all variables that were included, namely, with low systemizing, poor intuitive physics skills, poor mechanical ability, poor mental rotation, low school grades in mathematics and physics, poor common knowledge about physical and biological phenomena, intuitive and analytical thinking styles, and in particular, with assigning mentality to non-mental phenomena.

PsyPost reports that researchers conclude that “Nonscientific ways of thinking are resistant to formal instruction…” adding that this can “affect individuals’ ability to act as informed citizens to make reasoned judgments in a world that is increasingly governed by technology and scientific knowledge.”

In other words, low information coupled with defective thinking styles and limited cognitive abilities can not only lead to religious and supernatural beliefs but can also hinder the ability of individuals to “make reasoned judgments.”

Bottom line: The study results are not particularly surprising, and merely confirm what many others have long suspected: Religious and supernatural beliefs are often associated with poor reasoning skills and low information about the natural world.



What would Einstein do?

Source: Aeon

Author: Thomas Levenson, edited by Corey S Powell

Emphasis Mine

Albert Einstein understood the power of science and scientific metaphors, and their ability to provide perspective on everyday experiences. Here is his oft-retold description of his best-known idea: ‘When you sit with a nice girl, an hour seems like a minute. When you sit on a hot stove, a minute seems like an hour. That’s relativity.’ The joke hardly captures the precise physics involved, but it brings home the reality that our experience of time is malleable. Particularly relevant today, his explanatory approach offers a lesson to journalists struggling to cover complicated topics in a polarised media world. Thinking like Einstein – thinking relativistically – can help to decode stories on topics as far removed from science as power, love or money.

Einstein’s relativity was born in 1905, often called his ‘miracle year’. The paper in which he lays out the theory is a rarity within the scientific literature, clear and citation-free. Some of Einstein’s celebrated thought experiments are there to help the reader grasp the deep ideas within the paper’s seeming simplicity. The most famous of these concerns his shocking redefinition of the idea of simultaneity. Einstein breaks down the old conception and introduces a new one, using the scenario of a train being struck by lightning, observed both on board and from track-side. From the start, he connected his relativistic thinking to the familiar world.

But there’s another important thought problem in Einstein’s paper, one that is mostly overlooked. It focuses on an oddity in the way that physics was understood at the time. Scientists knew well that if a magnet and a wire coil (or any conductor) move with respect to each other, a current flows through the wire. But Einstein noted that, in turn-of-the-century theory, the description of the event differed depending on whether the magnet moved and the coil remained at rest, or the coil moved and the magnet stayed. That duality, Einstein realised, shouldn’t be. Either way, the relative motion was the same, and the outcome was also the same – yet the way in which physicists grappled with the events was different. Einstein deduced that his colleagues were missing the broader, unifying context.

There is a direct correspondence between Einstein’s emphasis on the need to come up with a consistent picture of an event as seen by any observer (in this case, from the coil’s perspective and the magnet’s) and a critical demand for journalistic rigour. For example, consider the ruling made by President Barack Obama’s administration this May that made more than 4 million workers eligible for paid overtime if they work more than 40 hours a week. Peter O’Dowd, in a piece for the National Public Radio programme Here and Now, told listeners that the workers had just received ‘a raise’. Many other journalists offered the same interpretation. In one sense, that’s true: people earning overtime will take home more money than those who don’t. But it could also be said that there was no raise at all. A newly overtime-eligible employee receives the exact same base salary rate as before, but will now get paid for all the hours worked.
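The two framings rest on simple arithmetic, which can be sketched with hypothetical figures (the function, wage, and hours below are illustrative, not from the ruling; the calculation assumes the standard US time-and-a-half rule for hours beyond 40):

```python
def weekly_pay(base_hourly, hours, overtime_eligible):
    """Weekly pay: straight time up to 40 hours; 1.5x beyond that if eligible.

    An exempt salaried worker is modeled, simplistically, as being paid
    for 40 hours regardless of how many are actually worked.
    """
    regular = base_hourly * min(hours, 40)
    extra_hours = max(hours - 40, 0)
    if overtime_eligible:
        return regular + 1.5 * base_hourly * extra_hours
    return regular  # exempt: the extra hours are unpaid

# Same employee, same $20/hour base rate, same 50-hour week:
before = weekly_pay(20, 50, overtime_eligible=False)  # 800.0  (exempt)
after = weekly_pay(20, 50, overtime_eligible=True)    # 1100.0 (newly eligible)
```

The base rate never changes, yet the weekly paycheck grows: from the boss’s frame it reads as a raise, from the worker’s frame as finally being paid for hours already worked.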

Here are two distinct and yet internally consistent descriptions of the same event. For a supervisor, spending more on wages feels like a raise. To subordinates, getting paid for all of their time on the job is just getting back to level. So how can a reporter get the story right? Think like Einstein, this time accounting for the categories of boss and worker instead of coil and magnet.

The special theory of relativity gives the physicist a tool that allows her to reconcile different descriptions of the same event. Einstein’s answer in his 1905 paper turns on the concept of reference frames, the coordinates and clock ticks that mark where and when each observer views a given event. Observers in separate reference frames that are in constant motion with respect to each other (the coil or the magnet, the train passenger or someone watching from the embankment) will make different measurements of the same event. In the latter half of the paper, Einstein supplies the mathematical framework that connects those two views, but even the conceptual version is enormously powerful. Relativity is a misleading name, one that Einstein himself didn’t love; the key to special relativity is that it reconciles the differing, ‘relative’ interpretations of a single, invariant event.
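That connecting framework is the Lorentz transformation. The essay does not spell it out, but its standard form for motion along one axis at velocity $v$ is:

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\!\left(t - \frac{vx}{c^{2}}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

Here $(x, t)$ and $(x', t')$ are the same event as measured in the two reference frames; as $v \to 0$, $\gamma \to 1$ and the two descriptions coincide, which is exactly the sense in which the observers disagree about coordinates but not about the event itself.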

Moving from physics to daily life: here, again, special relativity accepts the critical importance of point of view, the way observers interpret what they’ve just seen. At the same time, it affirms the unique reality of the event being observed. For a journalist, that sense of a formal relationship between interpretation – even spin – and the underlying event or action is vital. Relativistic thinking is especially helpful in any area that has accumulated a dominant narrative frame. The economics beat, for instance, naturally lends itself to the corporate perspective. Relativistic journalism would help to ensure that no story about a change in employment rules talked only about raises and not about work hours.

There is no mathematical transformation that can precisely align the boss’s view with that of her workers, but the idea of reference frames maps directly from physics on to the shop floor. It does so, too, for many other stories that hinge on disparities of power. It helps journalists to hear the silent ‘…as much as everyone else’s’, after the slogan ‘Black Lives Matter’, and hew more closely to Einstein’s own generous views on racial equality. It sharpens the reporting of medical stories; in the recent debate over different countries’ mammogram frequency recommendations, for instance, it clarifies that the issue is at least as much about communicating risk as it is about performing accurate diagnostics. Frames of reference certainly impinge on stories about politics and policy.

To be clear, I am not calling for mere ‘both-sides’ journalism. We already have too much of that. Not every fact has two distinct, equivalent meanings. Human-driven global warming and disease-reduction from vaccination are real, and the complaints of a handful of dissenters don’t alter that reality. But many more stories exist in which a commitment to one perspective blinds the reporter – and the audience – to the alternatives. Not every reporter can be as smart as Einstein. But it is possible, and a damn good idea, to think at least a little bit like him.

Thomas Levenson is professor of science writing at the Massachusetts Institute of Technology. His most recent book is The Hunt for Vulcan: … And How Albert Einstein Destroyed a Planet, Discovered Relativity, and Deciphered the Universe (2015). He lives in Massachusetts.


How the Christian Right’s Sex Hangups Turn Zika Into a Bigger Crisis

Photo Credit: parinyabinsuk / Shutterstock

Source: AlterNet

Author: Valerie Tarico

Emphasis Mine

Zika could have been an ordinary epidemic, like the ever-changing influenza that emerges each winter and spreads across the Northern Hemisphere with sad but rare complications. But the Religious Right’s antagonism to birth control and abortion—and honest conversation about sex in general—has transformed the Zika epidemic into a nightmare that will devastate lives for an entire generation.

In the absence of pregnancy, Zika usually isn’t a big deal. Only one in five people who contract Zika experience symptoms, and those who do mostly feel like they’ve gotten the flu. This is not to say Zika never does lasting harm to adults, just that, like the flu, those cases appear to be rare.

The difference, as most people now know, is that getting Zika while pregnant is really, really bad. The virus attacks the fetal nervous system, eating brain structures that have already developed and blocking development of others. Even babies who look normal may be damaged for life.

Unlike with the flu, when it comes to Zika, pregnancy prevention or timing is everything.

Three Ways to Safeguard Families

Even if Zika spreads across its potential range of 41 states, a quick and targeted response could make lasting harm rare, at least within U.S. borders. The solution is simple and relatively cheap, but it consists of policies that the sex-obsessed, patriarchy-protecting Religious Right has been opposing for decades:

  • Information. Launch a huge public education campaign so all couples know how to prevent mistimed or unwanted pregnancy and can delay parenthood till the time is safe. Currently a third of pregnancies globally and almost half in the U.S. are accidents, with some of the highest rates where Zika-carrying mosquitos live.
  • Contraception. Make state-of-the-art birth control available to all free of charge, including the very best IUDs and implants, which drop the accidental pregnancy rate below 1 in 500. (With the Pill that’s 1 in 11; with condoms 1 in 6; with the rhythm method it’s closer to 1 in 4.)
  • Abortion. Ensure that couples who discover microcephaly and other fetal defects in utero can, if they prefer, abort a diseased pregnancy and start over. Millions of healthy children exist in this world only because their parents received the mercy of a fresh start (like I did).

Each of these steps is easier and cheaper than trying to eradicate mosquitos, prevent people from getting bitten, or develop and distribute a vaccine. With existing contraceptive knowledge and technologies, birth defects from Zika could drop to near zero. The problem is not a lack of means; it’s a lack of will brought on by religious teachings that generate resistance and controversy around anything that has to do with sex, gender roles or reproduction.

You Reap What You Sow

No matter what, tragic birth defects from Zika would have hit some families as the virus spreads out of Africa where it is endemic (and where most women appear to have immunity before they reach reproductive age). But without relentless promotion of ignorance and falsehood by priests and pastors—without anti-contraception campaigning by the Vatican in particular—birth defects from Zika would be a small fraction of what humanity now faces.

Religious conservatives claim to love women and babies, especially unborn babies, but this claim is pure self-deception by biblical standards. The writer of Matthew warns of men who claim to speak for God but actually don’t. He says,

“By their fruit ye shall know them.”

What are the fruits of conservative Christian hostility toward judicious, planned, intentional parenthood? For generations, humanity has been battered by preventable harms from ill-timed and unwanted pregnancy: children bearing children in hopeless poverty, education foregone, abuse and neglect, family conflict triggered by stress, armed conflict triggered by population pressures and resource depletion; and starvation, illness and death.

If the church hadn’t thrown its wealth and weight against family planning programs in the 1960s and every decade since, who knows how different life on Earth might be right now. Zika merely ups the ante.

And the conservative Christian solution to it all? More prayer and less sex. If God’s self-proclaimed messengers actually loved women and children more than they love power and tradition, they would admit they have been wrong and would do what’s best for healthy families:

  • Stop using the political clout of the church to make birth control expensive and hard to get, especially for poor people and those at risk of Zika.
  • Stop goading conservative politicians to waste millions on bogus, indefensible anti-abortion laws, and work instead to make abortion less necessary.
  • Stop teaching young people that they should “let go and let God” determine how many kids they have (whether infected or starving or not). Start teaching that the ability to plan our families is a precious gift.
  • Stop pretending that vows of abstinence work for more than a few odd individuals. Start providing real information about healthy, respectful, responsible pleasure and intimacy.
  • Stop forcing doctors and nurses to follow anti-contraception, anti-abortion religious directives bordering on malpractice; and instead ensure that hospitals and clinics controlled by religious institutions provide model family planning care.

The Zika wave will sweep over the Americas, and as immunity grows rates of infection will likely drop off. In that case, the suffering caused by church hostility to sexuality education and family planning will drop back to more familiar levels. But right now Zika presents a rare opportunity for religious leaders to show that they are not, as they often appear, so busy defending dogma that they have become morally bankrupt.

Valerie Tarico is a psychologist and writer in Seattle, Washington, and the founder of Wisdom Commons. She is the author of “Trusting Doubt: A Former Evangelical Looks at Old Beliefs in a New Light” and “Deas and Other Imaginings.” Her articles can be found at



Must science be testable?


Author: Massimo Pigliucci

Emphasis Mine

The general theory of relativity is sound science; ‘theories’ of psychoanalysis, as well as Marxist accounts of the unfolding of historical events, are pseudoscience. This was the conclusion reached a number of decades ago by Karl Popper, one of the most influential philosophers of science. Popper was interested in what he called the ‘demarcation problem’, or how to make sense of the difference between science and non-science, and in particular science and pseudoscience. He thought long and hard about it and proposed a simple criterion: falsifiability. For a notion to be considered scientific it would have to be shown that, at least in principle, it could be demonstrated to be false, if it were, in fact, false.

Popper was impressed by Einstein’s theory because it had recently been spectacularly confirmed during the 1919 total eclipse of the Sun, so he proposed it as a paradigmatic example of good science. Here is how in Conjectures and Refutations (1963) he differentiated between Einstein on one side, and Freud, Adler and Marx on the other:

Einstein’s theory of gravitation clearly satisfied the criterion of falsifiability. Even if our measuring instruments at the time did not allow us to pronounce on the results of the tests with complete assurance, there was clearly a possibility of refuting the theory.

The Marxist theory of history, in spite of the serious efforts of some of its founders and followers, ultimately adopted [a] soothsaying practice. In some of its earlier formulations … their predictions were testable, and in fact falsified. Yet instead of accepting the refutations the followers of Marx re-interpreted both the theory and the evidence in order to make them agree. In this way they rescued the theory from refutation … They thus gave a ‘conventionalist twist’ to the theory; and by this stratagem they destroyed its much advertised claim to scientific status.

The two psycho-analytic theories were in a different class. They were simply non-testable, irrefutable. There was no conceivable human behaviour which could contradict them … I personally do not doubt that much of what they say is of considerable importance, and may well play its part one day in a psychological science which is testable. But it does mean that those ‘clinical observations’ which analysts naively believe confirm their theory cannot do this any more than the daily confirmations which astrologers find in their practice.

As it turns out, Popper’s high regard for the crucial experiment of 1919 may have been a bit optimistic: when we look at the historical details we discover that the earlier formulation of Einstein’s theory actually contained a mathematical error, predicting only half as much bending of light by large gravitational masses like the Sun – the very thing that was tested during the eclipse – as the final version did. And if the theory had been tested in 1914 (as was originally planned), it would have been (apparently) falsified. Moreover, there were some significant errors in the 1919 observations, and one of the leading astronomers who conducted the test, Arthur Eddington, may actually have cherry-picked his data to make them look like the cleanest possible confirmation of Einstein. Life, and science, are complicated.

This is all well and good, but why should something written near the beginning of the last century by a philosopher – however prominent – be of interest today? Well, you might have heard of string theory. It’s something that the fundamental physics community has been playing around with for a few decades now, in their pursuit of what Nobel physicist Steven Weinberg grandly called ‘a theory of everything’. It isn’t really a theory of everything, and in fact, technically, string theory isn’t even a theory, not if by that name one means mature conceptual constructions, such as the theory of evolution, or that of continental drift. In fact, string theory is better described as a general framework – the most mathematically sophisticated one available at the moment – to resolve a fundamental problem in modern physics: general relativity and quantum mechanics are highly successful scientific theories, and yet, when they are applied to certain problems, like the physics of black holes, or that of the singularity that gave origin to the universe, they give us sharply contrasting predictions.

Physicists agree that this means that one or both theories are wrong or incomplete. String theory is one attempt at reconciling the two by subsuming both into a broader theoretical framework. There is only one problem: while some in the fundamental physics community confidently argue that string theory is not only a very promising scientific theory, but pretty much ‘the only game in town,’ others scornfully respond that it isn’t even science, since it doesn’t make contact with the empirical evidence: vibrating superstrings, multiple folded dimensions of space-time and other features of the theory are impossible to test experimentally, and they are the mathematical equivalent of metaphysical speculation. And metaphysics isn’t a complimentary word in the lingo of scientists. Surprisingly, the ongoing, increasingly public and acerbic diatribe often centres on the ideas of one Karl Popper. What, exactly, is going on?

I had a front row seat at one round of such, shall we say, frank discussions last year, when I was invited to Munich to participate in a workshop on the status of fundamental physics, and particularly on what some refer to as ‘the string wars’. The organiser, Richard Dawid, of the University of Stockholm, is a philosopher of science with a strong background in theoretical physics. He is also a proponent of a highly speculative, if innovative, type of epistemology that supports the efforts of string theorists and aims at shielding them from the accusation of engaging in flights of mathematical fancy decoupled from any real science. My role there was to make sure that participants – an eclectic mix of scientists and philosophers, with a Nobel winner thrown in the mix – were clear on something I teach in my introductory course in philosophy of science: what exactly Popper said and why, since some of those physicists had hurled accusations at their critical colleagues, loudly advocating the ejection of the very idea of falsification from scientific practice.

In the months preceding the workshop, a number of high profile players in the field had been using all sorts of means – from manifesto-type articles in the prestigious Nature magazine to Twitter – to pursue a no-holds-barred public relations campaign to wrestle, or retain, control of the soul of contemporary fundamental physics. Let me give you a taste of the exchange, to set the mood: ‘The fear is that it would become difficult to separate such ‘science’ from New Age thinking, or science fiction,’ said George Ellis, chastising the pro-string party; to which Sabine Hossenfelder added: ‘Post-empirical science is an oxymoron.’ Peter Galison made crystal clear what the stakes are when he wrote: ‘This is a debate about the nature of physical knowledge.’ On the other side, however, cosmologist Sean Carroll tweeted:

‘My real problem with the falsifiability police is: we don’t get to demand ahead of time what kind of theory correctly describes the world,’ adding ‘[Falsifiability is] just a simple motto that non-philosophically-trained scientists have latched onto.’ Finally (but there is more, much more, out there), Leonard Susskind mockingly introduced the neologism ‘Popperazzi’ to label an extremely naive (in his view) way of thinking about how science works.

This surprisingly blunt – and very public – talk from prestigious academics is what happens when scientists help themselves to, or conversely categorically reject, philosophical notions that they plainly have not given sufficient thought to. In this case, it was Popper’s philosophy of science and its application to the demarcation problem. What makes this particularly ironic for someone like me, who started his academic career as a scientist (evolutionary biology) and eventually moved to philosophy after a constructive midlife crisis, is that a good number of scientists nowadays – and especially physicists – don’t seem to hold philosophy in particularly high regard. Just in the last few years Stephen Hawking has declared philosophy dead, Lawrence Krauss has quipped that philosophy reminds him of that old Woody Allen joke, ‘those that can’t do, teach, and those that can’t teach, teach gym,’ and science popularisers Neil deGrasse Tyson and Bill Nye have both wondered loudly why any young man would decide to ‘waste’ his time studying philosophy in college.

Loud debates on social media and in the popular science outlets define how much of the public perceives physics.

This is a rather novel, and by no means universal, attitude among physicists. Compare the above contemptuousness with what Einstein himself wrote to his friend Robert Thornton in 1944 on the same subject: ‘I fully agree with you about the significance and educational value of methodology as well as history and philosophy of science. So many people today – and even professional scientists – seem to me like somebody who has seen thousands of trees but has never seen a forest. A knowledge of the historic and philosophical background gives that kind of independence from prejudices of his generation from which most scientists are suffering. This independence created by philosophical insight is – in my opinion – the mark of distinction between a mere artisan or specialist and a real seeker after truth.’ By Einstein’s standard then, there are a lot of artisans but comparatively few seekers of truth among contemporary physicists!

To put things in perspective, of course, Einstein’s opinion of philosophy may not have been representative even then, and certainly modern string theorists are a small group within the physics community, and string theorists on Twitter are an even smaller, possibly more voluble subset within that group. The philosophical noise they make is likely not representative of what physicists in general think and say, but it matters all the same precisely because they are so prominent; those loud debates on social media and in the popular science outlets define how much of the public perceives physics, and even how many physicists perceive the big issues of their field.

That said, the publicly visible portion of the physics community nowadays seems split between people who are openly dismissive of philosophy and those who think they got the pertinent philosophy right but their ideological opponents haven’t. At stake isn’t just the usually tiny academic pie, but public appreciation of and respect for both the humanities and the sciences, not to mention millions of dollars in research grants (for the physicists, not the philosophers). Time, therefore, to take a more serious look at the meaning of Popper’s philosophy and why it is still very much relevant to science, when properly understood.

As we have seen, Popper’s message is deceptively simple, and – when repackaged in a tweet – has in fact deceived many a smart commentator in underestimating the sophistication of the underlying philosophy. If one were to turn that philosophy into a bumper sticker slogan it would read something like: ‘If it ain’t falsifiable, it ain’t science, stop wasting your time and money.’

But good philosophy doesn’t lend itself to bumper sticker summaries, so one cannot stop there and pretend that there is nothing more to say. Popper himself changed his mind throughout his career about a number of issues related to falsification and demarcation, as any thoughtful thinker would do when exposed to criticisms and counterexamples from his colleagues. For instance, he initially rejected any role for verification in establishing scientific theories, thinking that it was far too easy to ‘verify’ a notion if one were actively looking for confirmatory evidence. Sure enough, modern psychologists have a name for this tendency, common to laypeople as well as scientists: confirmation bias.

Nonetheless, later on Popper conceded that verification – especially of very daring and novel predictions – is part of a sound scientific approach. After all, the reason Einstein became a scientific celebrity overnight after the 1919 total eclipse is precisely because astronomers had verified the predictions of his theory all over the planet and found them in satisfactory agreement with the empirical data. For Popper this did not mean that the theory of general relativity was ‘true,’ but only that it survived to fight another day. Indeed, nowadays we don’t think the theory is true, because of the above mentioned conflicts, in certain domains, with quantum mechanics. But it has withstood a very good number of high stakes challenges over the intervening century, and its most recent confirmation came just a few months ago, with the first detection of gravitational waves.

Scientific hypotheses need to be tested repeatedly and under a variety of conditions before we can be reasonably confident of the results.

Popper also changed his mind about the potential, at least, for a viable Marxist theory of history (and about the status of the Darwinian theory of evolution, concerning which he was initially skeptical, thinking – erroneously – that the idea was based on a tautology). He conceded that even the best scientific theories are often somewhat shielded from falsification because of their connection to ancillary hypotheses and background assumptions. When one tests Einstein’s theory using telescopes and photographic plates directed at the Sun, one is really simultaneously putting to the test the focal theory, plus the theory of optics that goes into designing the telescopes, plus the assumptions behind the mathematical calculations needed to analyse the data, plus a lot of other things that scientists simply take for granted and assume to be true in the background, while their attention is trained on the main theory. But if something goes wrong and there is a mismatch between the theory of interest and the pertinent observations, this isn’t enough to immediately rule out the theory, since a failure in one of the ancillary assumptions might be to blame instead. That is why scientific hypotheses need to be tested repeatedly and under a variety of conditions before we can be reasonably confident of the results.

Popper’s initial work pretty much single-handedly put the demarcation problem on the map, prompting philosophers to work on the development of a philosophically sound account of both what science is and is not. That lasted until 1983, when Larry Laudan published a highly influential paper entitled ‘The demise of the demarcation problem,’ in which he argued that demarcation projects were actually a waste of time for philosophers, since – among other reasons – it is unlikely to the highest degree that anyone will ever be able to come up with small sets of necessary and jointly sufficient conditions to define ‘science,’ ‘pseudoscience’ and the like. And without such sets, Laudan argued, the quest for any principled distinction between those activities is hopelessly Quixotic.

‘Necessary and jointly sufficient’ is logical-philosophical jargon, but it is important to see what Laudan meant. He thought that Popper and others had been trying to provide precise definitions of science and pseudoscience, similar to the definitions used in elementary geometry: a triangle, for instance, is whatever geometrical figure has the internal sum of its angles equal to 180 degrees. Having that property is both necessary (because without it the figure in question is not a triangle) and sufficient (because that’s all we need to know in order to confirm that we are, indeed, dealing with a triangle). Laudan argued – correctly – that no such solution is ever going to be found to the demarcation problem, simply because concepts like ‘science’ and ‘pseudoscience’ are complex, multidimensional, and inherently fuzzy, not admitting of sharp boundaries. In a sense, physicists complaining about ‘the Popperazzi’ are making the same charge as Laudan: Popper’s criterion of falsification appears to be far too blunt an instrument not only to discriminate between science and pseudoscience (which ought to be relatively easy), but a fortiori to separate sound from unsound science within an advanced field like theoretical physics.

Yet Popper wasn’t quite as naive as Laudan, Carroll, Susskind, and others make him out to be. Nor is the demarcation problem quite as hopeless as all that. Which is why a number of authors – including myself and my longtime collaborator, Maarten Boudry – have more recently maintained that Laudan was too quick to dismiss the demarcation problem, and that perhaps Twitter isn’t the best place for nuanced discussions in the philosophy of science.

The idea is that there are pathways forward in the study of demarcation that become available if one abandons the requirement for necessary and jointly sufficient conditions, which was never strictly enforced even by Popper. What, then, is the alternative? To treat science, pseudoscience, etc. as Wittgensteinian ‘family resemblance’ concepts instead. Ludwig Wittgenstein was another highly influential 20th century philosopher, who hailed, like Popper himself, from Vienna, though the two could not have been more different in terms of socio-economic background, temperament, and philosophical interests. (If you want to know just how different, check out the delightful Wittgenstein’s Poker (2001) by journalists David Edmonds and John Eidinow.)

Wittgenstein never wrote about philosophy of science, let alone fundamental physics (or even Marxist theories of history). But he was very much interested in language, its logic, and its uses. He pointed out that there are many concepts that we seem to be able to use effectively, and that yet are not amenable to the sort of clear definition that Laudan was looking for. His favorite example was the deceptively simple concept of ‘game.’ If you try to arrive at a definition of games of the kind that works for triangles, your effort will be endlessly frustrated (try it out, it makes for a nice parlour, ahem, game). Wittgenstein wrote: ‘How should we explain to someone what a game is? I imagine that we should describe games to him, and we might add: ‘This and similar things are called games.’ And do we know any more about it ourselves? Is it only other people whom we cannot tell exactly what a game is? […] But this is not ignorance. We do not know the boundaries because none have been drawn […] We can draw a boundary for a special purpose. Does it take that to make the concept usable? Not at all!’

The point is that in a lot of cases we don’t discover pre-existing boundaries, as if games and scientific disciplines were Platonic ideal forms that existed in a timeless metaphysical dimension. We make up boundaries for specific purposes and then we test whether the boundaries are actually useful for whatever purposes we drew them. In the case of the distinction between science and pseudoscience, we think there are important differences, so we try to draw tentative borders in order to highlight them. Surely one would give up too much, as either a scientist or a philosopher, if one were to reject the strongly intuitive idea that there is something fundamentally different between, say, astrology and astronomy. The question is where, approximately, the difference lies.

Rather than laying into each other in the crude terms, scientists should work together not just to forge a better science, but to counter true pseudoscience.

Similarly, many of the participants in the Munich workshop, and the ‘string wars’ more generally, did feel that there is an important distinction between fundamental physics as it is commonly conceived and what string theorists are proposing. Richard Dawid objects to the (admittedly easily derisible) term ‘post-empirical science,’ preferring instead ‘non-empirical theory assessment’, but whatever one calls it, he is aware that he and his fellow travellers are proposing a major departure from the way we have done science since the time of Galileo. True, the Italian physicist himself largely engaged in theoretical arguments and thought experiments (he likely never did drop balls from the leaning tower of Pisa), but his ideas were certainly falsifiable and have been, over and over, subjected to experimental tests (most spectacularly by David Scott on the Apollo 15 Moon landing).

The broader question then is: are we on the verge of developing a whole new science, or is this going to be regarded by future historians as a temporary stalling of scientific progress? Alternatively, is it possible that fundamental physics is reaching an end not because we’ve figured out everything we wanted to figure out, but because we have come to the limits of what our brains and technologies can possibly do? These are serious questions that ought to be of interest not just to scientists and philosophers, but to the public at large (the very same public that funds research in fundamental physics, among other things).

What is weird about the string wars and the concomitant use and misuse of philosophy of science is that both scientists and philosophers have bigger targets to jointly address for the sake of society, if only they could stop squabbling and focus on what their joint intellectual forces may accomplish. Rather than laying into each other in the crude terms sketched above, they should work together not just to forge a better science, but to counter true pseudoscience: homeopaths and psychics, just to mention a couple of obvious examples, keep making tons of money by fooling people, and damaging their physical and mental health. Those are worthy targets of critical analysis and discourse, and it is the moral responsibility of a public intellectual or academic – be they a scientist or a philosopher – to do their best to improve as much as possible the very same society that affords them the luxury of discussing esoteric points of epistemology or fundamental physics.



Sit Down and Shut Up: Pulling Mindfulness Up By Its (Buddhist) Roots

Source: Religion Dispatches

Author: Max Zahn

Emphasis Mine

“Wisdom 2.0?” meditation teacher Kenneth Folk has famously pondered. “That’s a networking opportunity with a light dressing of Buddhism.”

If mindfulness has gone corporate, then Wisdom 2.0 is its annual shareholder meeting. The yearly conference hosts tech royalty alongside media mavens like Arianna Huffington and Russell Simmons. Eyes may close for a meditation session, but a different ritual, perhaps even older, is perpetually enacted: see and be seen.

A couple of years ago, in a headline-making action, protesters leapt on the conference stage and unfurled a banner demanding an “eviction-free San Francisco.” The protesters were talking about physical displacement, but I’d argue that a kind of spiritual gentrification was also getting underway at Wisdom 2.0. As the recent glut of best-selling books, trend pieces, and celebrity testimonials attest, the mindfulness industry shows no sign of a slowdown. Apple has even built a mindfulness tool—reminding users to “breathe”—for its newest Apple Watch.

The rise of corporate mindfulness has rendered Buddhism far whiter and wealthier than it has ever been. For some immigrant Asian Buddhists and other politically engaged practitioners, the trend is reminiscent of the divorce of yoga from its religious roots. Viewed this way, the adoption of Buddhist practices into executive suites and government offices seems like a textbook case of cultural appropriation.

Proponents of mindfulness, of course, don’t see it this way. They claim that mindfulness is simply the tradition’s newest iteration. But as mindfulness saturates the culture, it has become the public face of Buddhism for many Americans.

What, if anything, gets lost in translation? And to whom does it matter?

The roots of mindfulness

Any conversation about the appropriation of Buddhist practices is difficult, because as the tradition spread—from the Indian subcontinent, and across the globe—it always adapted to host cultures. Buddhism absorbed Chinese religion when it spread to China, Tibetan religion in Tibet, and so on.

But mindfulness—what we think of as “meditation,” as opposed to prayer or ceremonial observance, for example—does have deep roots in the tradition. The Pali word sati, which can be translated as mindfulness, is frequently used in Buddhist scripture. (The noun comes from the verb sarati, meaning “to remember,” and alternative translations for sati include “remembrance” or even “collectedness.”)

Sati is just one part of a much larger set of practices and traditions. Specifically, it’s the seventh element of the Noble Eightfold Path, which is itself one part of the body of teachings known as Buddhist dharma, or religious doctrine. And dharma is just one of what are called Buddhism’s three jewels; the other two are the Buddha and the sangha, or monastic community.

As such, mindfulness makes up a small segment of the immense ethical, philosophical, institutional and ritual latticework that constitutes the totality of Buddhist practices, which themselves vary widely from place to place and era to era.

The modern emphasis on mindfulness and meditation is a fairly recent phenomenon. Historian Erik Braun notes that lay meditation did not begin in earnest until an early-twentieth-century anti-colonial movement among Theravada Buddhists in Burma. Before then, meditation teachings remained the province of monks and nuns. The monastic setting ensured that meditation practices took place in a community of aspirants abiding by a common code of conduct and observing a shared set of rituals. While deemed necessary for liberation, meditation practices like mindfulness were not sufficient. The accompanying rules and relationships were just as important.

Modern-day mindfulness takes this twentieth-century shift to lay meditation one step further, sanctioning meditation as a technique developed in the midst of worldly life. In previous generations, mindfulness “was surrounded by lots of other things,” said Mushim Ikeda, a Buddhist teacher at the East Bay Meditation Center in Oakland, California. “We’ve taken out one thing and we’re applying it to reduce stress, to increase performance.”

Bodhi to Brooklyn

Once extracted from this communal context, mindfulness becomes a catch-all (and a cure-all). Want to relieve stress, get more sleep, perform better at work, have better sex, and actually pay attention to that elaborate breakup story your friend is telling you? Just meditate. Mindfulness-Based Stress Reduction (MBSR) techniques, pioneered by medical professor and former Buddhist practitioner Jon Kabat-Zinn, have spread from hospitals to offices to the halls of Capitol Hill.

But these practices can diverge, at times in very deep ways, from traditional understandings of sati.

Thanissaro Bhikkhu, an American Buddhist monk who deploys encyclopedic knowledge of the sutras in his cultural commentary, has pointed out crucial differences between traditional and modern (Western) mindfulness. “The Buddha himself defined sati as the ability to remember,” he writes, whereas contemporary teaching vivifies the breath as something continually discovered anew. The traditional focus on remembrance draws upon a practitioner’s experience in cultivating the technique. But for modern-day meditators, instructed in the benefit of practicing just five minutes per day, this appeal to long-cultivated expertise means little.

The aims of contemporary mindfulness are very different, too. “The way mindfulness is practiced, it’s not necessarily Buddhism,” said Reverend Toshikazu Kenjitsu Nakagaki, president of the Buddhist Council of New York. “It’s used to improve business. So the purpose is fundamentally different.”

Do these differences matter? Things change. In fact, there may be no better two-word summary of Buddhist thought and history than that. “Buddhism has gone through many, many transformations from India to China to Japan,” said David McMahan, a scholar of Buddhism at Franklin and Marshall College. McMahan’s 2008 book, The Making of Buddhist Modernism, analyzes how Buddhism changed as it took hold in the West during the nineteenth and twentieth centuries. “Now it’s changing again,” he said.

This change, though, comes within a context of colonial expansion and widespread cultural theft, argued Katie Loncke, the co-director of the Bay Area’s Buddhist Peace Fellowship (BPF), a social justice-oriented organization. “In the legacy of colonialism in which the U.S. was founded and still very much find ourselves, it’s not surprising that we see repeated patterns of cultural appropriation,” she said. “Even within these very cherished contemplative practices.”

When it comes to contemporary mindfulness, this appropriation aligns with the dominant neoliberal mode of economic, social, and political life. Instead of challenging the status quo, mindfulness merely enables its preoccupation with social climbing and career advancement.

In other words, mindfulness is a technique that asks Americans to quite literally sit down and shut up.

Zen Buddhist teacher David Loy and management professor Ron Purser call this form of the technique “McMindfulness.” “Rather than applying mindfulness as a means to awaken individuals and organizations from the unwholesome roots of greed, ill will, and delusion,” they write, “it is usually being refashioned into a banal, therapeutic, self-help technique that can actually reinforce those roots.”

McMahan, however, warns against the idealization of a mythic, pristine form of the religion. “I would be cautious about contrasting mindfulness with some kind of absolutely pure, past Buddhism in which there was no conception or concern with material well-being or financial reward,” he said.

A divided community

It’s important to keep in mind that this conversation is unfolding within a divided American Buddhist community. Specific numbers are hard to pin down, and estimates vary widely, but roughly three-quarters of American Buddhists are Asian. The remainder are predominantly white converts. Practitioners in the Asian diaspora typically join communities that are aligned with sects popular in their origin countries.

Convert Buddhists, on the other hand, tend to congregate at meditation-oriented Buddhist centers founded in the United States, such as the Insight Meditation Society or those of the Shambhala Buddhist community.

The secularized mindfulness technique foregrounds this overwhelmingly white set of practitioners, even though they are probably a minority of American practitioners. “So much of the Asian diasporic and Asian-American face of Buddhism has been erased and dismissed from the mainstream versions of the dharma in the U.S.,” said the BPF’s Katie Loncke. The outsized visibility of white celebrities and CEOs practicing mindfulness highlights the longstanding tension between the immigrant Asian Buddhist community and the convert one. (There are exceptions, most notably Chade-Meng Tan, the Singapore-born founder of Google’s in-house meditation program, whom I interviewed for RD earlier this year.)

This split within American Buddhism raises the question of who owns and defines those practices going forward. In his 2014 book, Mindful America, scholar Jeff Wilson holds the media partially accountable: “The vast majority of information about mindfulness is disseminated by white people, in media venues controlled by white people, for the primary consumption of white people.”

Loncke agrees, pointing to “glossy Buddhist magazine versions of the dharma.” She notes a recent cover of the Buddhist* magazine Lion’s Roar, which featured a photo of Buddhist teachers, many of whom teach mindfulness and three of whom are Asian, below a headline calling them “The New Face of Buddhism.” The cover’s spotlight on a fresh set of predominantly non-Asian, convert Buddhist teachers seemed to overlook—or worse yet, intentionally downplay—the enduring role of Asian immigrant teachers.

“It’s an unfortunate element of the pressures of U.S. marketing to sell things by announcing them new or fresh or interesting,” added Loncke. Marketers and practitioners gravitate toward this newness, rather than “doing the sometimes tedious work of giving credit where credit is due.”

Loncke and other mindfulness critics belong to an engaged Buddhist movement that itself is open to criticism for the appropriation of Buddhism, hitching the tradition to a progressive political agenda. “There’s nothing necessarily inherent in the Buddhist tradition that would lend itself to leftist politics,” said McMahan. “I’m sympathetic to engaged Buddhism,” he added. “But that’s more about me and my politics than it is about Buddhism.” The Buddha wasn’t mindfully coding apps—but he wasn’t scrawling lefty placards, either.

Loncke acknowledges that concerns over cultural appropriation in engaged Buddhism are “hella real,” adding that “it’s not very helpful” when practitioners “look back to the life of the Buddha or his teachings for . . . guidelines about what policy choices to make.” Such appeals risk the same willful neglect of the tradition’s origins that engaged Buddhists often counsel against. Both a corporate mindfulness practitioner and an engaged Buddhist run the risk of cherry-picking from the tradition.

Perhaps the corrective for appropriation, in either case, is proper citation. That’s what Ikeda, the East Bay Meditation Center teacher, wants, demanding of Western Buddhists “not only awareness” of source cultures “but attribution and acknowledgment.”

Such attribution comes sparingly from teachers of corporate mindfulness, many of whom stress the secular nature of the practice. Kabat-Zinn, for instance, chooses not to identify as a Buddhist despite his decades of experience as a student of the tradition. A 24-page instructional manual on Mindfulness-Based Stress Reduction—released in 2014 by Kabat-Zinn’s Center for Mindfulness in Medicine, Healthcare, and Society—does not mention Buddhism once. The manual does, however, include a regimen of carefully referenced “hatha yoga.” Even when they do invoke Buddhism’s longstanding traditions, mindfulness advocates often do so as a superficial means of establishing credibility for the practice.

Engaged Buddhists, meanwhile, tend to link broad Buddhist notions like nonviolence (ahimsa) or generosity (dana) to contemporary political issues. Ikeda warned against such “cherry-picking through Buddhist sources” or “finding nuggets of things and saying ‘aha, this supports my point.’” She contrasted this approach with a “logical and thoughtful evolution” of the religion, which she says engaged Buddhism can provide.

Media coverage of mindfulness has increased dramatically in the past few years, along with a new wave of criticism. But quantity isn’t quality. And the quality of this conversation depends on the depth of its analysis and the diversity of its voices, with special attention paid to the least powerful and well-represented among them.

Then again, I’m a white dude living in Brooklyn who has concerns about cultural appropriation. And Chade-Meng Tan, one of the foremost advocates of corporate mindfulness, is an immigrant from Singapore. The axes of ethnicity, power, and critique get muddled. The future of American Buddhism depends on a collective willingness to investigate them, respectfully.

Awkward moments will certainly result. They comprise, perhaps, the challenge for which all this mindfulness has been preparing us.


*Due to an editorial error, this article originally stated that Lion’s Roar is a Shambhala Buddhist magazine, rather than an independent publication covering a range of Buddhist traditions. 

Max Zahn is a freelance writer based in Brooklyn, New York.



Belief in supernatural beings is totally natural – and false

Source: Aeon

Author: Stephen Law

Emphasis Mine

Human beings are remarkably prone to supernatural beliefs and, in particular, to beliefs in invisible agents – beings that, like us, act on the basis of their beliefs and desires, but that, unlike us, aren’t usually visible to the naked eye. Belief in the existence of such person-like entities is ubiquitous. As Steven Pinker notes in ‘The Evolutionary Psychology of Religion’ (2004), in all human cultures people believe that illness and calamity ‘are caused and alleviated by a variety of invisible person-like entities: spirits, ghosts, saints, evils, demons, cherubim or Jesus, devils and gods’. In the United States, for example, a 2013 Harris Poll found that around 42 per cent believe in ghosts, 64 per cent in survival of the soul after death, 68 per cent in heaven, and 74 per cent in God.

Why are we drawn to such beliefs? The answer cannot be simply that they are true. Clearly, most aren’t. We know many beliefs are false because they contradict other similar beliefs. Take god-type beliefs. Some believe there’s one god; others (such as the Manicheans) that there are two gods; others: pantheons of gods. People also hold dramatically differing beliefs about the characteristics of these divine beings, ascribing to them incompatible attributes and actions. But it’s not just disagreement between believers that reveals many of these beliefs to be false. Science has also demonstrated that many of these beliefs are false: for example, diseases are produced not by demonic beings but by entirely natural causes. And of course, supposed evidence for such beings – sightings of ghosts, fairies, angels, gods and their miraculous activities – is regularly debunked by investigators.

When people are asked to justify their belief in such invisible beings, they often appeal to two things. First, to testimony: to reports of sightings, miraculous events supposedly caused by such beings, and so on. Any New Age bookshop will be able to provide numerous testimonies regarding invisible agency that might seem hard to account for naturalistically in terms of hallucination, self-deception, misidentified natural phenomena, trickery, and so on. Second, many will also claim a subjective sense of presence: they ‘just know’ their dead Auntie is in the room with them, or that they have a guardian angel, by means of some sort of extra sense: a spirit sense. The Delphic oracle believed she received communications from the god Apollo while perched on her tripod. Many contemporary religious folk believe they can sense divinity by means of some sort of sensus divinitatis or god-sense.

If there really are no good grounds for believing such beings exist, however, why do people believe in them? There’s much scientific speculation about that but, as yet, no definitive answer.

One obvious advantage of positing invisible agents is that they can account for what might otherwise be baffling. I could swear I left my keys on the table, but there they are under the sofa. How on Earth did that happen? If I believe in gremlins – invisible beings living in my house that have the desire to cause mischief and the power to do so – then the mystery is immediately solved. Invisible agents provide quick, convenient explanations for events that might otherwise strike us as deeply mysterious and, in so far as these beings can be appeased or persuaded, belief in them can also create the illusion of control, which can be comforting in an otherwise uncertain and dangerous world.

Scientists working in the cognitive science of religion have offered other explanations, including the hyperactive agency-detecting device (HADD). This tendency explains why a rustle in the bushes in the dark prompts the instinctive thought: ‘There’s someone there!’ We seem to have evolved to be extremely quick to ascribe agency – the capacity for intention and action – even to inanimate objects. In our ancestral environment, mistakenly detecting an agent that isn’t there costs little in terms of survival and reproduction, but failing to detect an agent that is there can be very costly: fail to detect a sabre-toothed cat, and it’ll likely take you out of the gene pool. The evolution of a HADD can account for the human tendency to believe in the presence of agents even when none can actually be observed – hence the human belief in invisible person-like beings, such as spirits or gods. There are also forms of supernatural belief that don’t fit the ‘invisible person-like being’ mould but merely posit occult forces – feng shui, supernaturally understood, for example – and the HADD doesn’t account for those.

In fact, I doubt that any single mechanism accounts for the human tendency to hold such supernatural beliefs. Certainly nothing as crude as ‘wishful thinking’ really does the job. What is believed is not always to the liking of the believer; sometimes, as in the case of night visits by demonic beings, it’s absolutely terrifying. In any case, the appeal to wishful thinking just postpones the mystery, as we then require an explanation for why humans are so attracted to believing in invisible beings.

Whatever the correct explanation for the peculiar human tendency to believe falsely in invisible person-like beings, the fact that we’re so prone to false positive beliefs, particularly when those beliefs are grounded in some combination of testimony and subjective experience, should give pause to anyone who holds a belief in invisible agency on that basis.

Suppose I see a snake on the ground before me. Under most circumstances, it’s then reasonable for me to believe there is indeed a snake there. However, once presented with evidence that I’d been given a drug to cause vivid snake hallucinations, it’s no longer reasonable for me to believe I’ve seen a snake. I might still be seeing a real snake but, given the new evidence, I can no longer reasonably suppose that I am.

Similarly, if we possess good evidence that humans are very prone to false belief in invisible beings when those beliefs are based on subjective experience, then I should be wary of such beliefs. And that, in turn, gives me good grounds for doubting that my dead uncle, or an angel, or god, really is currently revealing himself to me, if my only basis for belief is my subjective impression that this is so. Under such circumstances, those who insist ‘I just know!’ aren’t being reasonable.


Lawrence Krauss: ‘All Scientists Should Be Atheists’

Source: Patheos

Author: Michael Stone

Emphasis Mine

“Scientists have an obligation not to lie about the natural world.” – Lawrence Krauss.

In an essay for The New Yorker titled “All Scientists Should Be Militant Atheists,” Lawrence Krauss makes a powerful argument for science and against the urge to protect religious superstition from scrutiny.

The essay, published last September, begins with a discussion concerning conservative culture warrior Kim Davis using her Christian religious beliefs to deny wedding licenses to gays and lesbians in Kentucky. Commenting on the controversy, Krauss notes:

The Kim Davis controversy exists because, as a culture, we have elevated respect for religious sensibilities to an inappropriate level that makes society less free, not more. Religious liberty should mean that no set of religious ideals are treated differently from other ideals.

Krauss dismisses the demand that many make for respecting religious superstitions by noting the obvious:

The problem, obviously, is that what is sacred to one person can be meaningless (or repugnant) to another.

Krauss is correct. What is a sacred commandment or belief for one is another’s moral abomination. One need only be reminded of the sexism and misogyny woven into the fabric of all three of the Abrahamic religions to understand why many would find the supposedly sacred to be morally repugnant. The refusal by Kim Davis to issue marriage licenses to gays and lesbians is another example, and there are of course many more.

Krauss goes on to move from a discussion of Davis to a discussion of science, opining:

In science, of course, the very word “sacred” is profane. No ideas, religious or otherwise, get a free pass. The notion that some idea or concept is beyond question or attack is anathema to the entire scientific undertaking. This commitment to open questioning is deeply tied to the fact that science is an atheistic enterprise.

Krauss observes that science is inherently dangerous to religion because scientific understanding often draws people away from religion:

Because science holds that no idea is sacred, it’s inevitable that it draws people away from religion.

Yet because science so often exposes religious superstitions as irrational and ultimately untenable beliefs about the world, the culture of science often panders to the faithful by sugarcoating the truth about the natural world:

Even so, to avoid offense, they sometimes misleadingly imply that today’s discoveries exist in easy harmony with preexisting religious doctrines, or remain silent rather than pointing out contradictions between science and religious doctrine.

Krauss rejects the misleading fabrication that science and religious dogma are compatible, at one point declaring:

Scientists have an obligation not to lie about the natural world.

In concluding, Krauss sees a direct link “between the ethics that guide science and those that guide civic life.” Arguing that honesty should take priority over religious dogma, Krauss says “we owe it to ourselves and to our children not to give a free pass” to those “that endorse, encourage, enforce, or otherwise legitimize the suppression of open questioning in order to protect ideas that are considered ‘sacred.’”

Bottom line: Krauss is right. All scientists, and all thinking people, should be atheists.

Lawrence Krauss is a physicist and director of the Origins Project at Arizona State University. He is also the author of The Physics of Star Trek and A Universe from Nothing: Why There Is Something Rather than Nothing.