Showing posts with label Language.

Thursday, February 7, 2019

Great moments in trend-setting

It's rare that I'm ahead of the curve in very much. But the latest Steve Sailer column had the following puzzling claim:
Nobody can deny Lindsay, Boghossian, and Pluckrose one historic accomplishment: They’ve permanently affixed the name Grievance Studies to their targets.
Before last fall, there were a variety of self-designations that only their smartest critics could keep track of. For example, Steven Pinker tweeted,
Is there any idea so outlandish that it won’t be published in a Critical/PoMo/Identity/‘Theory’ journal?
But if you aren’t quite up to Pinker’s level of brainpower, it’s hard to remember that “Critical/PoMo/Identity/‘Theory’” are all more or less the same moonshine.
But now we don’t need to. They are all just Grievance Studies.
Google searches show that the term “grievance studies” appeared only 85 times in the history of the internet before they announced their hoax last October, but 89,700 times since then.

To which my first thought was: huh? Hasn't everyone been using this term for ages?

No, it turns out it's just me who's been using it for ages. I couldn't easily reproduce Sailer's "85 results" number. But this post of mine from May 2013 features the phrase. Though, hilariously, it doesn't seem to show up in my Google search, and since I'm John Q. Nobody, read by nobody, I contributed almost zero to the currency of the phrase.

I have no idea whether I just picked it up from someone else, or whether it independently seemed like a good description. To slightly paraphrase Moldbug, the great thing about the truth is that, being true, anybody is free to notice it at any time.

Come to Chateau Holmes for fresh social commentary, or be one of the herd reading about it at Sailer's blog six years later!

(I kid - Steve Sailer is a national treasure, and the best journalist of his generation. The fact that he writes for donations at the Unz Review, instead of having major newspapers fight to hire him, tells you everything you need to know about the clown world we live in.)

Saturday, July 28, 2018

On the Cultural Aggression of the Quebecois

I was in Quebec recently. It's an odd place. Some parts, like the old part of Quebec City, feel like you've somehow set foot in a European town, with a walled city, cobblestone streets, and statues of local heroes from four hundred years prior. But then when you drive out of that part, especially between the major cities, it feels like you're in Anytown, USA, except that everything has been run through Google Translate.

The attitude of the Quebecois towards their history is an interesting one. They very much celebrate their "Frenchness", but they're regarded mostly as an object of humour and curiosity by the actual French themselves. In this regard, they share some similarities with the Northern Ireland Protestants, who are similarly ignored or viewed with mild embarrassment by the English. The Quebecois' conception of France is also quite different from France's own. One aspect of this, which seems obvious and striking in hindsight but is actually quite easy to overlook, is that the French Revolution never came to Quebec. The British took over in 1763, and so the Quebecois' conception of French rule dates back to the French Royal period. This is why you see the Fleur De Lis, the symbol of the French monarchy, everywhere. As I've written about before, you will walk around a long time in Paris before seeing many of these, or indeed any other celebration of the French Kings. But in Quebec, and indeed in Louisiana, the Fleur De Lis just means "French", not "Royal".

The immediate aspect that strikes all tourists is of course the language. For a long time, I had wondered about their stubborn intransigence on issues of language and history. Not only do they insist on speaking French, but if anything they appear to have gotten more aggressive on the subject over time, not less. This includes the endless language police (an Uber driver recounted how a company he worked for was scrambling to replace all the keyboards and telephones with French versions in advance of a language-police visit). It also includes clamping down on English-language education.

Not only that, but the Quebecois seem to have had a remarkable ability to shoot themselves in the foot with their endless hand-wringing about independence. They've managed to pick the worst possible outcome - neither becoming independent, nor being committed to staying part of Canada. Indeed, if you want a metric of just how much this screwed over Quebec, and Montreal specifically, consider the following: how many countries can you think of where the most important city in the country has changed in the past hundred years? London was the most important city a century ago and is today, Paris was the most important a century ago and is today, Moscow was the most important a century ago and is today, and so on. Not so in Canada. Montreal was the most important city for most of the 20th century. Then a wave of independence agitation, starting with the formation of the Parti Quebecois in 1968 (of course! when else?), put paid to all that. You know who loves that kind of endless uncertainty? Businesses! Where do you think the operational headquarters of the Bank of Montreal are? Did you guess "Toronto"? They have been there since 1977, when the bank decided to beat the rush ahead of the first referendum, in 1980, on moves towards independence.

I had just put all this down to the French generally being stubborn socialist assholes, and especially resenting Anglo-Saxons. Charles de Gaulle could never, ever forgive the British and Americans for kicking out the Nazis. It was almost easier to forgive the Nazis themselves. A recipient of charity nearly always hates his benefactor, as Orwell wisely observed. Quebec wasn't exactly in the same position, but resentment of the Anglos has a long, long history, dating back at least to the Plains of Abraham in 1759. It's not for nothing that the license plates read "Je Me Souviens" - "I remember". It's hard not to detect a vague note of sullen hostility in that, a determined insistence on bearing a grudge. There are plenty of ways to remind people to celebrate their French heritage, and most of them sound more upbeat, like "Vive le Canada français". Instead, it always seemed to me to imply "I remember when this used to be France".

But in any conflict, it's nearly always a useful exercise to ask: how does the other side think about the reasons behind the conflict? One doesn't need to go full moral equivalence to think that if the Quebecois resent the Anglos, it's worth at least pondering why this might be. Actually, that's not quite sufficient, because it tends to lead one back to self-serving explanations. No, the better question is: what explanations might they have that, if true, would be unflattering to our own self-image? This is nearly always the blind spot. "Why do they dislike us?" tends to produce answers like "they're assholes", "they're confused or misled", etc. "What did we do to provoke this?", even when asked in earnest, tends to produce answers like "we're too noble, too generous, too successful". You only get to the heart of the matter by asking "How might I be the asshole here?". Or as Mitchell and Webb put it: are we the baddies?


Of course, the great irony in that skit is that while it's very funny, Mitchell and Webb could only jokingly portray Nazis asking this question of themselves, thereby displaying quite a high level of introspection. This is compounded by the fact that in the scene depicted, they appear to be fighting Stalin, who was a monster of the highest order. One does not have to be a Nazi sympathiser to reflect that on the Eastern Front, "good guys" were pretty damn thin on the ground.

But in the skit, there's no suggestion whatsoever that you, the audience member, should actually ponder the same question, even if just on the small scale of some pretty morally dubious choices. The point is not whether you or they are right overall, though that is surely important, and probably the most important question. But the other point is: do you know why the other side thinks you are in the wrong?

And one of the recurring themes that comes up in such honest questioning is that a lot of actions that look like aggressive, offensive campaigns are perceived by those who wage them as actually defensive. Because we all live in America, the elephant in the room that we are all apt to leave out of the re-telling is America itself, the Vampire of the World. The ways in which the West may provoke things are rarely thought of, except to the extent that leftists claim that America provokes violence by being insufficiently progressive.

I remember The War Nerd talking about this in the context of the Middle East. To America, jihad seems like an outrageous, insane form of unprovoked attack. But he makes a quite convincing case that many of the jihadis in the Middle East actually perceive it as a defensive war. How could that be?
American exceptionalism is always just American provincialism, no matter how benevolent it seems. Not everyone is like us, and a lot of people are actively trying not to become like us. Jihadis are, roughly speaking, the armed wing of that group.
The truth about the clash of civilizations you hear people discussing is that it’s all the other way: The Mall is invading Islam, the Mall is taking over. There isn’t any Sharia Law in North Carolina, but there damn well are US-style malls in even the most conservative Islamic countries.
...
The Mutaween (“Society for the Promotion of Virtue and the Suppression of Vice”) has hundreds of men, and even a few women, working in Najran. Some wear the big beards and special headdress, but others are in disguise. And what these undercover morality police do, mostly, is patrol HyperPanda to see if boys are talking to girls, or looking at girls, or throwing girls little folded-up slips of paper with their cell phone numbers. That last one is perhaps the greatest threat to morality in town, and HyperPanda is the scene of most such crimes. The Mutaween mount multi-cop surveillance routines, with some disguised as Malays or Filipinos, to detect any instances of heterosexual contact at the mall.
The culture, the law, are very clear. No pre-marital fooling around, and that includes flirting at HyperPanda. Mall rules are very clear too: It’s an obvious place for boys and girls to check each other out. When mall meets culture, hijinks ensue—and murders sometimes follow, with the male relatives of the girl who’s been compromised at HyperPanda hunting down and killing the boy who accosted her.
And again:

The vectors for contagion in Najran are legion, starting with the usual suspects: Facebook, where daughters of respectable families maintain private accounts which feature “risqué” photos of young women without the niqab (face veil), hijab (head scarf), or abaya (black robe). These accounts also allow girls to “like” one professional footballer over another, an expression of preference in male appearance which violates every marriage norm in the rural-Arabian book.
Then there’s the cellphone itself, Ooredoo’s trademark product. Cellphones are lethal for traditional female prohibitions. In Najran, girls can’t leave the house without a male relative, even to visit female friends. But with a cellphone, they can jump outside the compound without breaking a sweat, texting unrelated males to say God knows what in that krazy lingo you kidz are using these days. And because the older generation in Najran grew up in a world without telephones of any kind, let alone cellphone culture, they’re hopeless at monitoring this coded, corrosive language.
And in a way, the most corrosive of all the alien influences attacking Najran were the most seemingly innocuous: K-Pop and Korean Soap Operas. It’s amazing that there are still people in the old countries, like the US, who don’t realize yet that Korea has taken over world culture. They don’t need your stinkin’ American pop no more. They’ve got Sistar and they’re humming “Can’t Go to Sinchon.”
The Korean dramas Najran girls watch on their computers are intensely romantic—and “romantic” is a Western, alien import, a very dangerous one in a world where marriage is between or within families, and where young women expect to feel little or no affection for their husbands. When you’re stuck in your room—and your room’s windows have been boarded up to prevent heterosexual gazes from passing in or out—it’s quite a trip to be suddenly transported to a Korean beach, where two young lovers are strolling, having a heart-to-heart on a program called “Autumn in My Heart.”
Read both those articles; they're eye-opening. Again, the point is not that jihad is justified. Brecher's implication that there isn't any genuinely aggressive component to Muslim cultural expansion in the West (Europe in particular) seems, shall we say, naively optimistic. But that's not the question. The question is: how many Americans could think of any reasons why people might dislike the West that a) aren't entirely self-serving, and b) aren't just progressive talking points? I don't think jihad was exactly the example that Moldbug had in mind, but if you want to understand how someone might consider America the Vampire of the World, you could do far worse.

Which brings us back to the Quebecois.

To wit: you simply cannot tell the story of Quebec's cultural aggressiveness without discussing America.

Because it doesn't take much pondering to realise that in the case of language and cultural preservation, the Quebecois almost certainly view their actions as entirely defensive.

And it doesn't take much more pondering to realise that they're almost certainly correct.

English is essentially like the Borg. If you're in North America, it just tends to creep in, with a thousand vectors of attack. The tourists come to Montreal and Quebec City with their US dollars (or even their Yen or Renminbi), which bring with them enormous incentives to speak English. If you're a shop owner in Montreal, how do you greet your customers? They seem to have settled on "Hello, Bonjour", an expression that must surely annoy the French-speaking locals. All the major movies and pop songs come in English. Educated French-speaking parents start thinking that it's important for their child to learn good English, so maybe they decide to send them to an English school, figuring they'll get the French at home anyway. Slowly, bit by bit, if you don't do anything, the degree of French-speaking gets chipped away.

This isn't even just hypothetical. We already have examples of what happens if you don't actively fight these trends.

[Image: map of the territorial extent of New France]

New France extended over a huge territory, not just Quebec. So the question is: how much French is still spoken in these areas? Even in the cities that were the biggest at the time, like New Orleans? To ask is to laugh. "French" becomes limited to street signs, a small section of historical architecture for the tourists, and the Fleur De Lis around the place. That's what happens when you aren't willing to aggressively insist on French being spoken in every official capacity. English slowly grinds you down until it's taken over, at which point it's probably there for good (or at least until the Chinese invade).

Even in this decayed age, America is still a great country. Yet it is one of the tragedies of our era that gradually everywhere is slowly turning into America. I like America, but I don't want everywhere to be America. To add to the tragedy, the main parts that seem to be most contagious are mass market consumer culture and humourless political correctness. But the vector of attack for all of this is the spread of the English language. It's no coincidence to me that, among first world countries, the Japanese have not only the least embrace of open borders diversity nonsense, but also the least embrace of spoken English.

The Quebecois seem to have little interest in fighting the message itself, at least that I've seen. But to the extent that they don't want to simply end up as American, I can entirely sympathise.

Monday, July 17, 2017

The fastest way in

"Nature," as Mr Emerson once noted, "has made up her mind that what cannot defend itself shall not be defended."

A similar principle operates with public policies.

To wit, policies that the average person is not willing to openly and publicly defend, under their own name, will eventually be dismantled.

This may sound like a tautology, until you realize that there are lots of policies that exist partially out of inertia. But when you try to explain why they exist, suddenly the explanations sound awkward.

And the awkwardness comes because the mind instinctively feels that they jar with a broader principle that has been enunciated, but not yet everywhere applied. They are, in other words, Larry Auster’s famous unprincipled exceptions. And they are strong candidates when guessing where the liberal zeitgeist might head next, as I’ve written about before.

Auster seems mostly to have had in mind exceptions that get made deliberately, out of a sense by those in power that being consistent would lead to bad practical consequences. While this is true, there are other instances where inertia seems to explain a lot. Steve Sailer’s quip that the Eye of Soros is powerful, but can’t be everywhere at once, seems quite apt. Sometimes the discussion almost has the flavor of gradually pushing the boundary of the Overton window leftward until the mainstream instinctively feels the boundary catching up to them, and moves accordingly.

The most glaring instances of these apply to immigration.

Modern liberalism, if taken at face value, deems it the height of evil and injustice to grant people special privileges and status based on:

-Their skin color when they were born

-Their genitalia when they were born

-Their sexual preference (according to current fashion, also decided when they were born, though it doesn’t matter much if it happens later).

But so far, it is still acceptable to grant people special privileges based on where their mother was standing when they were born.

Why is this the case?

More importantly, suppose you were asked to justify why this is the case, in an essay that would be printed under your name around your workplace.

How many people would be comfortable doing so? I suspect not very many.

Because there are approximately two defenses:

i) F*** you, it’s ours, and we don’t owe anybody anything.

ii) As a practical matter, we can’t let in everyone.

Version i) has a variety of flavors, most of which hinge upon variations of the definition of “us”, ranging from a particular ethnic/religious group (e.g.: Israel), to the current citizens (however they got here), to the current citizens plus whoever we choose to invite at our sole discretion (though this mostly punts the question to that of who we should invite).

I suspect, however, that the vast majority of public defenses would be made along version ii).

But version ii) inexorably leads to the current situation – the west absorbs the third world, but at a slightly slower rate. Everyone who is let in, stays in. The ratchet moves gradually, but never moves back.

There are, however, a variety of ways to chip away at the current immigration policies.

You can make the frontal assault on the idea itself – denounce the very idea of citizenship as racist. We may yet end up there, but the frontal assault runs into too many problems of seeming to go against things that normal people love, like the American flag and national anthem.

You can make the bait and switch – citizenship is so important and beneficial that we must make the important act of generosity and grant it to anyone who wants it. In other words, citizenship changes from something based on lineage (where your parents must be American) to something merely based on assenting to American propositions, with those whose parents are American merely being presumed to assent to them automatically. At that point, it seems like mere mean-spiritedness not to let anyone who wants to assent to the ideas become American, provided they’re not a criminal. Because, after all, people are all the same, so there could be no differences in anything to consider, none whatsoever, no siree.

Or you can make a circuitous attack. Maintain the idea of citizenship, but find another reason to let people in anyway.

This seems to be the most likely outcome to me. And the vector that seems the most potent here is the expansion of refugee programs.

The notion of refugees circumvents people’s ideas of how immigration should normally work. Sure, we normally screen immigrants carefully and don’t let just anybody move here, but these people are refugees! To send them back to where they came from would be to return them to certain death.

And this association has been built up so strongly that people mostly don’t seem to scrutinize any of the policies being snuck in as a consequence. The clearest sign of this is the fact that the major war being used to justify large migration flows into Europe is the conflict in Syria. However, a cursory glance at either a) immigration statistics, or b) photos of the refugees themselves reveals that a large fraction of them are coming from Africa (or, more recently, Bangladesh!), from regions where there is either no war at all, or conflict at a sufficiently low level that what they are fleeing is simply everyday life in these places. Suddenly, the definition of a potential refugee has expanded to anyone in a sufficiently crappy country. Which, at last count, covers most of the people of the world.

Not only that, but a second bait-and-switch has taken place, also without much discussion. Refugees went from being people who were taken in temporarily for the duration of a conflict, to people who were settled permanently. This process is left quite mysterious to the general public, which appears to be by design. It’s good if it happens by a court. It’s better if it happens by a permanent civil servant, or a whole lot of them. It’s best if the average person who is annoyed by the idea doesn’t even know whose decision it was.

I write the above sentence with snark, but then honest humility forces me to admit that I also have no idea exactly whose decision it was to have the vast shifts in immigration for most of the west.

Even if asylum is not immediately granted on a permanent basis, the refugee can always claim the risk of violence if he goes back. Like all such claims, they are incredibly difficult to evaluate from thousands of kilometres away with little, if any, documentary evidence available. One either has to accept most of the claims, or reject most of the claims. The idea that one will be able, through careful scrutiny, to determine the facts of each case seems fanciful. Of course, now that they’ve arrived *we* would be killing them by sending them back, and you don’t want that on your hands, do you, prole?

The other great benefit, to the progressive, is that part of the process happens through bodies like the UN (certainly in the case of places like Australia). Phrases like ‘international treaty obligations’ get thrown around, which are another way of telling the rubes that they have no say in the matter.

Refugees have become the ideal motte and bailey of the immigration world. The motte is the idea that we're temporarily helping people who would literally die without our help. The bailey is large-scale, permanent immigration of people from some of the most dysfunctional parts of the planet, via a mechanism mostly shielded from the political process.

You may wonder why this is a bailey, but that's probably just because you're insufficiently educated in the benefits of diversity, comrade. A little time in reeducation camp will sort you out. Or more practically, knowing how holiness spirals work.

I expect that mass immigration will mostly be legitimized by a systematic expansion of the practical, though not necessarily the formal, definition of refugee. In a world where policy is determined by feelz, it will come to mean "anyone who might plausibly be portrayed as an object of sympathy".

Which in practice means that anyone is allowed to come, as long as they're some kind of approved minority. In theory, there's probably an additional requirement that they haven't yet been proven to have committed a crime, though the process for evaluating that clause becomes essentially nugatory.

And so the ultimate aim gets accomplished. The other routes might have got there too, but I suspect this one will effectively dismantle immigration systems with the least resistance. The unprincipled exception becomes semi-principled consistency.

Tuesday, September 20, 2016

On the Decline of Wisdom

The Dissenting Sociologist began a post recently with a quite striking sentence:
The principle that “the wise shall govern the strong” is a law of Nature so basic that human society is inoperable and indeed altogether inconceivable without it. Democracy as such is an illogical Utopian fiction that doesn’t exist anywhere and cannot. In human society anywhere we find it, men in the physical flower of their youth allow themselves to be bossed around by senior men they could easily overwhelm, and legitimate authority assumes the form of a pyramid such that positions of authority, by definition, are fewer to the extent that the scope of authority attached to them is greater.
And my immediate thought was: this is a fascinating idea. Is it actually true?

The second sentence is definitely true. Society would definitely be better ordered if the first sentence were also true. But the universe isn’t usually ordered the way we would like it.

So what would be the similar, purely positive version of the same idea that might be closer to being true? I’d say that the elite will always rule over the masses. Like most, if not all, seemingly universal truths in the social sciences, it has a somewhat tautological aspect – the elites are defined as the ruling class, because ruling itself confers status. Sometimes the rulers are priests, or warriors, or kings, or judges, or bureaucrats. But everywhere there are the leaders, and the led.

Power is always jealously sought, even if not actively contested at every point in time. And so any elite must be savvy enough to at least maintain their own supremacy against other contenders for power. If you are incompetent enough, you probably won’t stay in power that long. Strictly speaking, you don’t need to be competent at any task other than maintaining your own power. You can run your country into ruin and beggary, as many long-lived dictators have done, as long as you maintain your own power. So you can definitely have an evil, psychopathic elite. But a sufficiently incompetent elite is a fragile equilibrium, at risk of collapsing. This is also the strongest evidence against frequent claims that some or other presidential candidate is a moron – Trump, Bush, Kerry, whoever. There are simply too many other people viciously vying for the presidency for any true moron to get that close to succeeding.

Of course, the number of true psychopaths is rather small. So most leaders will have at least some regard for their people. And so if there is a general quality of intelligence and good judgment needed to maintain power, that will hopefully flow over into competent administration of the rest of the country (perhaps one of the biggest mercies the world provides, actually). The main hitch here, of course, is that psychopaths (though numerically few) are disproportionately attracted to power, and ruthless in the methods they are willing to use to obtain it. Hence the horror of the many dictators of the 20th century, from Mao to Mugabe.

A lot of elites will have a need to occasionally augment their ranks with competent administrators who can help them secure their rule. And this is where the starting quote is quite interesting, particularly with regard to exactly what qualities are being sought. What is needed is competence. But this can come from a number of different base qualities.

Reactionaries are generally drawn to old ideas, and wisdom is one such concept. Wisdom connotes judgment, nuance, experience, and a sense of doing what is right. It is related to its less lofty and less mystical relative, good judgment (of which wisdom is in some sense the pinnacle). It is not surprising that these are also associated with age – if someone is wise beyond their years, it is because wisdom is generally thought to be more likely to reside in the elders of a society.

Wisdom, dear reader, is a quality whose heyday has largely passed. The thoroughly brilliant Google NGram viewer charts the decline for us.

[Google Ngram chart: the frequency of 'wisdom' in decline]

It should not therefore come as a surprise to find that modern society, which places relatively less emphasis on wisdom, should also come to have less respect for the elderly relative to the young.

So if the elites aren’t selecting on wisdom, but have to select on competence (broadly defined), what else are they selecting on?

Here’s one answer:

[Google Ngram chart: 'clever' rising, then 'smart' overtaking it]

First ‘clever’, then ‘smart’.

‘Wise’ has been more or less declining as an idea since 1820 or so. Its decline was also marked by the rise of ‘clever’ – more intellectual, but in a way that seemed to prioritise shrewdness and savvy behavior, as opposed to good judgment.

But the big rise of late has been ‘smart’. This goes mostly to intelligence, raw cognitive firepower. This is a trait that (at least at an individual level) is generally considered to be inherited at birth, and which displays itself more in youth than old age.

The modern ideal of innovative success is the young tech CEO. Mark Zuckerberg is assuredly smart, and often described as such. I have yet to hear anyone praise him as wise.

The other striking aspect of this perception is that if good decisions are thought to come mostly from being smart, then they are something that one is either just born with, or can acquire merely by turning one’s gigantic brain to the subject at hand. And since every man flatters himself that he is smart, he is thereby largely relieved of the obligation of humble study at the feet of those that have come before him. Hence the modern progressive wet dream of the show ‘The West Wing’ – brilliant young minds elevated straight from their Harvard Political Science undergrad education to being White House advisors, solving the world’s problems as understudies to a Nobel Prize Winner in Economics (or at least Hollywood writers’ limited conception of one).

Intellect alone is presumed to be able to solve the world’s problems, from Syria to Washington.

Good judgment, by comparison, is considered far too prosaic a quality to be encouraged, and wisdom seems almost archaic.

I am far from convinced that this shift in emphasis has been for the good.

Thursday, February 12, 2015

A good heuristic for a certain type of BS

One phrase that in practice means almost the exact opposite of what it claims is the expression 'scientifically proven'.

I have known a good number of scientists, both social and physical, and I've never once heard them use this expression non-ironically to describe either their own or anyone else's work. Mathematics proves things, by formal theorems. Science, on the other hand, provides evidence that supports some hypotheses and rejects others. But even when a null hypothesis is formally rejected, knowledge in the sciences is contingent. Your theory makes falsifiable predictions that are so far consistent with the data, but that might be overturned at any time.

And even in places like economics, theory models, which do use formal mathematical proofs of particular results and thus might loosely justify talk of 'proof', almost never attract the term when economists reference the broad idea they're trying to advance. Economists will say 'I solve a model which shows how information asymmetry affects trading volume', not 'Information asymmetry is scientifically proven to decrease trading volume'. What has been solved is one particular model, but there are many other competing models that may be consistent with the data too. Nobody would dream of saying that science proved their theoretical result.
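
To make the contrast concrete, here's a minimal sketch (Python, with invented data) of how evidence actually gets reported: you test a null hypothesis, and you either reject it or fail to reject it. You never 'prove' the alternative.

```python
# A minimal sketch of hypothesis-testing language, on made-up data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treated = rng.normal(loc=1.2, scale=1.0, size=50)  # hypothetical sample
control = rng.normal(loc=1.0, scale=1.0, size=50)  # hypothetical sample

t_stat, p_value = stats.ttest_ind(treated, control)
if p_value < 0.05:
    print(f"p = {p_value:.3f}: reject the null hypothesis of equal means at the 5% level.")
else:
    print(f"p = {p_value:.3f}: fail to reject the null; the data are consistent with no difference.")
```

Note that even the "reject" branch doesn't prove anything; it just says the data would be surprising if the null were true.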

'Oh sure', you might say, 'we understand that there's a distinction among the finer points of philosophy of science. But in practice, saying science has proved something just means there's lots of evidence consistent with it. Why be such a purist?'

A good question, since you asked.

The reason my heuristic works, however, is that most people who perform actual science do understand the distinction, and are likely to use the right language. By contrast, people who like the phrase 'scientifically proven' are almost always sneaking in an appeal to authority in order to paper over either a) their lack of understanding of the complexity of the issue, or b) the annoyingly inconclusive evidence for the particular proposition that they think it would be politically desirable for more people to believe.

The claim in the above paragraph, of course, is a hypothesis. In the name of science, we should see whether the evidence supports the hypothesis or not.

To check, here are the top 5 results that come up when I type the phrase 'reject the null hypothesis' into Google News:

1. Do Teams Undervalue European Skaters in the Draft?
2. Hypothesis Testing in Finance: Concept & Examples
3. Culture war in the deep blue sea: Science’s contentious quest to understand whales and dolphins
4. WaPo Climate Fail on Missouri
5. Using a fund manager? You'd get the same results at a casino

So that may not sound stellar, but they're all somewhat related to formal evaluation of evidence for and against ideas in the social or physical sciences. Now compare it with what comes up for 'scientifically proven':

1. Scientifically proven herbal aphrodisiacs
2. Writing Exercises Scientifically Proven To Redirect Your Life
3. 10 scientifically proven ways love can heal!
4. Emojis Are Now Scientifically Proven To Help You Get Lucky
5. Ryan Gosling’s Face Has Been Scientifically Proven To Make Men More Supportive Of Feminism

In other words, worthless clickbait. Colour me shocked.

The results, while not subjected to formal statistical testing, directionally support the hypothesis that 'scientifically proven' is a brain-dead appeal to authority by lazy English majors who wish to unjustifiably associate their claims with the patina of scientific credibility.

Thursday, September 25, 2014

A thing I did not know until recently

The word 'se'nnight'. It's an archaic word for 'week', being a contraction of 'seven night(s)'. The most interesting thing is that it makes it immediately clear where 'fortnight' comes from, being a similar contraction of 'fourteen night(s)'. The more you know.

Via the inimitable Mark Steyn.

Saturday, July 19, 2014

Snappy responses you weren't hoping for that nonetheless answer the question quite well

In the last few years, unable to hold a list of just four grocery items in my head, I’d begun to fret a bit over my literal state of mind. So to reassure myself that nothing was amiss, just before tackling French I took a cognitive assessment called CNS Vital Signs, recommended by a psychologist friend. The results were anything but reassuring: I scored below average for my age group in nearly all of the categories, notably landing in the bottom 10th percentile on the composite memory test and in the lowest 5 percent on the visual memory test.
All this means that we adults have to work our brains hard to learn a second language. But that may be all the more reason to try, for my failed French quest yielded an unexpected benefit. After a year of struggling with the language, I retook the cognitive assessment, and the results shocked me. My scores had skyrocketed, placing me above average in seven of 10 categories, and average in the other three. My verbal memory score leapt from the bottom half to the 88th — the 88th! — percentile and my visual memory test shot from the bottom 5th percentile to the 50th. Studying a language had been like drinking from a mental fountain of youth.
What might explain such an improvement?
Regression toward the mean.
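
For the unconvinced, here's a minimal simulation sketch (Python, all numbers invented) of the mechanism: score in the bottom decile on a noisy test, and your retest will look dramatically better on average, even though nothing about you has changed.

```python
# Regression toward the mean in miniature: fixed ability, noisy tests.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
ability = rng.normal(0, 1, n)          # stable "true" ability, never changes
test1 = ability + rng.normal(0, 1, n)  # first noisy measurement
test2 = ability + rng.normal(0, 1, n)  # retest: same ability, fresh noise

bottom = test1 < np.quantile(test1, 0.10)   # bottom decile on the first test
print(f"bottom decile, first test:  mean score {test1[bottom].mean():+.2f}")
print(f"same people, on the retest: mean score {test2[bottom].mean():+.2f}")
# The retest mean sits roughly halfway back toward average: a dramatic
# "improvement", no French lessons required.
```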

Friday, November 1, 2013

The war on dying is going poorly, but at least the war on "dying" is succeeding.

We live in an age where people go to enormous lengths to not contemplate mortality – either their own, or anyone else’s, really.

In the current, present-tense culture, people don’t even realise how much this shift has occurred. Today's moral fashions are not only correct, but self-evidently so. This mindset is imbibed so deeply, in fact, that most people don't even find it necessary to contemplate why things weren't always the way they are now. To the extent that other cultures and periods felt differently, the aberration is all on their side. The past is a foreign country, all right. And people's attitudes towards it resemble those they hold towards real-life foreign countries: namely, we'd rather read the latest news about Miley Cyrus than give a rat's @$$ what's going on there or why.

But every now and again, subtle language choices creep in to remind us how recently the current attitudes came about.

Take, for instance, the use of the present participle verb form ‘dying’.

It used to be common to say that an old person was ‘dying’. This would refer to known terminal illnesses (‘Dad is dying of cancer’), as well as people just in very poor health and probably going to eventually lose out to something or other. To the extent that it wasn’t known exactly when things would happen, sometimes there was uncertainty (‘I think he’s dying’).

Now, to apply the term to someone with a serious illness but not currently on life support is sufficiently rare that it sounds jarring, even anachronistic. Which is ironic, because death sure ain't getting any more anachronistic. (The disappearance of the concept of 'dying of old age' is highly related).

People will still use the phrase occasionally, of course. But usually only at the absolute last possible moment when it's absolutely clear that nothing can be done. The medical profession aids and abets this view, not wanting to deliver bad news any earlier than necessary, focusing instead just on the treatment options.

The part that stands out today is just how far away from the moment of death people used to be willing to make this observation (months, typically), and how matter of fact the whole thing was. Certainly in the case of terminal diseases, where the end result was known. The only time I’ve read this expression used in any sort of recent memory was the opening sentence of Mencius Moldbug’s pre-death eulogy to Larry Auster:
In case you haven't heard, Larry is dying.
Not coincidentally, Moldbug has perhaps the strongest sense of historical perspective of almost any writer around today. The second most honest description was from John Derbyshire. Make of this what you will.

The reason nobody uses the word any more, I suspect, is that people will do everything in their power to deny the possibility of death until the last minute, when the Titanic is already half-submerged and the orchestra has fallen into the ocean. We can fight this thing! The cancer has metastasized, but they're trying a new treatment! There's still a chance!

I would wager that there are plenty of people who will never be willing to use the word 'dying' to refer to a loved one. Dad is never dying, he's just going along, right up until he's 'dead'. That bit they'll acknowledge, if only for the absurdity of the alternative. To highlight the Nelsonian artificiality of all this, doctors and nurses have very little difficulty telling when a patient is nearing the end - that's how they know to tell you to call the relatives. I have little doubt that if you asked them three months ahead of time, they'd be able to give similarly accurate prognoses, but nobody ever does ask them.

The only circumstances where modern man will rouse himself to use the present participle form are in metaphorical circumstances that have nothing to do with mortality. So Mum may be 'dying of boredom', 'dying of laughter', 'dying of embarrassment', but she's never just 'dying'.

This online diary has an entire label dedicated to 'mortality'. I have no doubt that in the scheme of modern society, this makes me morbid and weird. 

I maintain, however, that the strangeness is not mine, but that of today's world.

Nevertheless, it ends.

Tuesday, January 15, 2013

Fake Accents

One of my hobbies is trying to imitate foreign accents. It's often convenient for humor purposes to be able to portray a generic person of some nationality - Yank, Irish, Brit, whatever. You need to get it good enough that it doesn't devolve into 'half-assed Indian accent', which is the death rattle of any impersonation.

Fake accents are also great as examples of the power of suggestion. The easiest trick is to find a few words that suggest the place in question according to stereotypes, learn to do them well, and just sprinkle them in liberally. So if you needed to suggest Irishness, you could just learn Irish-sounding versions of:
'Guinness'
'Taters'
'County Cork'
'Fookin' English'.
and just use them in some combination.
'I love Guinness with me 'taters, 'specially in County Cork. But not with the fookin' English'.
etc.

If you need to actually give a randomly chosen dialogue in a foreign accent, it's considerably harder, since you can't just pick your own words. The chance of being able to convince people depends greatly on their own familiarity with the accent. The hardest is to convince native speakers, since they'll know immediately what sounds wrong. The gold standard for all this is of course Hugh Laurie - Americans who watched House are constantly surprised to find out that his normal speaking voice is strongly English. This is the real Hugh Laurie voice. You can hear his House accent here and here.

My fake American accent is marginal at best. By which I mean, it's pretty good by the standard of most people's fake accents, but put me next to a native-speaking American and you can clearly tell where my flubs and weird vowel sounds are. Cf. Hugh Laurie: my American friends generally find mine painful to listen to. So if the test is 'If you suspect it might be fake, can you quickly find evidence to confirm this hypothesis?', then I flunk it by a mile.

But most of the time, this isn't actually the test. The real test is 'If you didn't know in advance that it was fake, is it bad enough to raise in your mind the possibility that it might be an impersonation?'. It turns out that this is a much easier standard to beat, because most of the time people aren't on the lookout for someone using a fake accent.

Being a man of science, I decided to try this in the wild. For the first 40 minutes of meeting new Americans, I'd use my fake American accent, then switch to Australian. I'd then ask the person if they suspected that it was fake. Based on a pretty big sample, the percentage who suspected it was fake was between about 5 and 10%. And this is for an accent so bad that people who know me find it gratingly unpleasant to listen to. But people who don't know me just interpret the mistakes as being some sort of regional variation - the slightly Australian 'r' sounds were forgiven as being some sort of East coast/Boston twang.
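
As an aside on what a "pretty big sample" buys you: here's a rough back-of-the-envelope sketch (the counts below are hypothetical, for illustration only) of the uncertainty around an estimated detection rate.

```python
# Approximate 95% interval for a sample proportion, with invented counts.
from math import sqrt

detected, n = 7, 100          # hypothetical: 7 of 100 listeners suspected the accent
p = detected / n
se = sqrt(p * (1 - p) / n)    # standard error of a sample proportion
print(f"{p:.0%} +/- {1.96 * se:.0%} (approximate 95% interval)")
```

With a hundred or so subjects, an estimate in the single digits comes with a margin of a few percentage points either way, which is roughly the "between about 5 and 10%" range.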

It's really an example of the curse of knowledge - people who know some information are typically very bad at putting themselves in the position of someone who didn't know the information. If you know my accent is fake, you suspect that everyone will be able to tell that it's fake. But it doesn't work that way.

The other funny observation on this came from my friend SH, who watched one of my recent attempts. He said that my body language became somewhat forced. It was like, he said, watching me trying to perform a difficult calculation. I'd totally believe it - some significant part of your brain is devoted to making the words come out in a different way, and this is actually pretty hard work.

Convincing them that you're not weird after you switch accents, however, is considerably harder. Nobody said science was easy.

Monday, November 19, 2012

Predicting if someone is Brazilian by how they speak English

One of my minor hobbies is trying to guess where people were born based on small details about them.

A fun way of doing this is with language. When people speak English (or any other language), they often subconsciously import assumptions about pronouncing words from their original tongue. Certain sounds will get pronounced in ways that sound slightly odd to a native English speaker, but are often correlated among people who grew up speaking a particular tongue, or from a particular region. The great OKH informed me that the study of this area is called 'phonotactics', so you might call me an amateur phonotactician.

The latest one I came across is a diagnostic for Brazilians. Like all linguistic tics, it's not universal, but it's reasonably predictive - it's neither necessary nor sufficient, but it's closer to being sufficient than it is to being necessary. It's the following:

With past-tense verbs (i.e. words that end in 'ed'), they will sometimes pronounce the 'ed' as a hard, separate sound.

So, for instance, the word 'combined' they'll sometimes pronounce as 'combine-ed', with the last sound pronounced as at the start of 'education'.

I noticed this first in two Brazilians that I know, and confirmed it out of sample this weekend with another guy - he had dark brown hair and pale-ish skin with an accent that I couldn't easily place when I heard him giving a talk. He did the hard 'ed' sound in a talk, so I googled him and sure enough he was from Brazil.

The previous one (which I noted in the comments here, but which deserves its own post) is the following:

A strong diagnostic for Turkish people speaking English is that with words that end in a hard 'r', they sometimes combine the 'r' with a 'zh' sound afterwards (think of Dr Zhivago, or 'Jean-Claude' in the French pronunciation). So the word 'cover' they'll pronounce almost like 'coverj', if that makes sense. They won't do it all the time, so you often have to listen for a while before they'll do it. It's not uniquely Turkish - I've also come across it in one or two Eastern European groups, although I forget which. But it's a pretty strong predictor.

I've confirmed this across a few people, but I'll report to you soon an out of sample test - I heard my tailor say it the other day when I took in a suit to get adjusted. I'm going to ask him when I return, and we'll see if I'm right.

[Update]: Confirmed - he is indeed Turkish.

Correlations, baby. Though you throw them out with a pitchfork, yet they return.

Sunday, May 13, 2012

Taking the Power Narrative Back

With small semantic differences.

From a conversation earlier today.
DG: Man, you like taking long showers.
Shylock: Sure do. Why be in a hurry?
DG: So you like wasting water, then?
Shylock: Not "wasting". "Spending."
The longer rationale, of course, is here.

Friday, January 27, 2012

Insight of the Day That I Was Most Pleased With

I was listening to a talk by this Greek girl today.

I was speaking to The Greek afterwards, and asked him the following: "Hey, does the Greek language have any words that end in either 't' or 'p'?"

Sure enough, it doesn't. Which I knew it wouldn't.

How did I know this?

Listening to the girl talk, I noticed certain words where she would add half an extra vowel at the end, particularly words that ended in 't' or 'p'. So the word 'treatment' became something almost like 'treatmenta', and 'group' became 'groupa'. Not with a strong emphasis on the 'a' at the end, but noticeable.

My hunch, which it seems was right, is that this came from the fact that she wasn't used to words ending in 't' and 'p' - she was used to these letters being followed by a vowel. And this was so subconscious that she was adding it in slightly in English, even though it wasn't there. This would only seem to work if words ending in these letters were completely absent.

Bam! It makes you look like Sherlock (not Shylock) Holmes when you can spot these kinds of obscure connections.

There are few things as satisfying as correctly identifying something random about the world based on correlations that most people aren't paying attention to.

Saturday, December 3, 2011

Why Smart People Are Getting Worse At Spelling

This is not a post about why dumb people are getting worse at spelling. There are lots of obvious culprits for that - the spread of teenage text-message LOLspeak, declining educational quality, feelgood 'everybody wins a prize!' teaching methods that decline to correct too many mistakes, lunatic academics who argue that insisting on correct spelling and grammar is horribly racist and elitist, etc., etc., etc.

No, what's less remarked on is the hidden decline in spelling knowledge among educated people. But you'll only see this in a very particular context - if you ask them to hand-write something that requires big words. Their typewritten work is getting better and better.

Once upon a time, people used to need to know how words were spelled. To spell something wrong in a letter was embarrassing, and every correction you made was obvious too. The benefits of knowing the correct spelling the first time were significant.

Now, we instead train people to know that they have to use spell-check. This requires them to know how to have a good stab at the word, and to diligently check that their document doesn't have any red squiggly lines under any words. But this doesn't actually drill spelling.

The reality is that bad spelling in a document these days is a sign only of laziness or complete illiteracy. Grammar is still more of a filter, as grammar checkers are less sophisticated. But the test of 'does this document contain typos?' is now only a very weak signal of actual spelling ability.

Five minutes ago, I had to type the word 'accelerate', and I couldn't remember if it had one or two "c"s, and whether it had one or two "l"s. No worries! Just have a stab, and keep going through the combinations until you hit it.

But here's the problem - within 5 seconds, I'd forgotten what the answer was. And next time, I'm going to do the same thing. It's like using a GPS instead of a map - in theory, the more efficient system could be used as a tool to improve the learning process. In practice, it gets used as a substitute for the learning process.
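
The workflow, reduced to a toy sketch (the word list here is a tiny stand-in for a real spell-checker's dictionary):

```python
# "Have a stab and cycle through candidates" - the spell-check-era workflow.
DICTIONARY = {"accelerate", "necessary", "embarrass"}  # stand-in word list

def stab_until_it_sticks(candidates):
    """Return the first candidate the spell-checker accepts."""
    for guess in candidates:
        if guess in DICTIONARY:
            return guess  # no red squiggly line: move on, learn nothing
    return None

print(stab_until_it_sticks(["accellerate", "acelerate", "accelerate"]))
# -> 'accelerate' (and five seconds later, which one worked is forgotten)
```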

Don't believe me? Try writing a hand-written letter to someone, and see how many times you find yourself stumbling over the correct spelling of a word. And those are only the mistakes you know you're making, let alone the ones you don't! It's a sure-fire way to cure yourself of any hints of snobbery about how eloquent and precise your writing is.

Overall, I'm okay with this process - it's not like men are about to be thrust into the wilds of nature where no spell-checkers are available. This is certainly less problematic than the decline in mental arithmetic skills with the ubiquity of calculators. There are a lot more situations where it's valuable to be able to do fast mental mathematics than to always have correct spelling.

Since nobody writes handwritten letters any more, there is only one case where you really see how bad people's spelling has gotten - handwritten signs. Very few people who are going to write a protest sign tend to type it in Word first. But they should:

Stop Vandaling Education

(image credit)

[Image: another misspelled protest sign, via Know Your Meme]
(when it's on 'know your meme', the time for image credits is pretty much over)

Everyone looks at these signs and thinks these people are unspeakable idiots. But this is the wrong lesson. I'm sure if either one had to send an email, it would be spelled just fine.

Technology giveth, and technology taketh away.

Thursday, September 8, 2011

Metaphors for No Free Lunches

From the excellent 'The House Wins', by OK Go, a wonderful metaphor for the strong form of the No Free Lunches principle*:
Ice age upon catastrophic ice age of selection and only one result has trickled in...
The house wins.
Oh the house always wins.
If evil were a lesser breed than justice after all these years the righteous would have freed the world of sin.
The house wins.
Oh the house always wins.
The house wins, and you lose. No matter the game, no matter the circumstance, no matter if you're sure you've figured out a system - doesn't matter, the house will win.

I love it! It's almost as good as Bob Dylan's metaphor for opportunity cost.



*Granted 'No Free Lunches' is already a metaphor, so this is more of a meta-metaphor.

Thursday, July 21, 2011

Same Same, but Different

A properly functioning I.E.D. and an improperly functioning I.U.D. can both really mess up your day.

Monday, June 20, 2011

Trendy Job Titles

You learn a lot about trends in popular perceptions of the economy (the corporate zeitgeist, if you will) by looking at what job titles people choose to give themselves.

In the late '90s and early 2000s, the buzzword was 'consulting'. Everyone was a consultant of some form. Usually, it was left unspecified (until the impolite pushed the point) as to
A) what was the subject matter being consulted on
B) what the person's relevant qualifications or experience were, if any, and
C) whether they actually had any clients, or had received any meaningful remuneration in their chosen profession.

In fact, it is precisely these vague aspects that make the term so appealing - the unemployed programmers get to lump themselves in with McKinsey, and hope nobody spots the difference. They're just waiting for a company to hire them so it can hear all about the mistakes it's making.

Somewhere along the line, consulting became passé. The new hot job title, it seems, is 'working at a startup'. This has the same benefits as before. What is conjured up is 'founding the next Facebook' or 'CFO of Groupon'. The reality might be anywhere from working at a company making napkins, to being unemployed and toying with the idea of writing an iPhone app to track navel lint (or whatever), even though you have no programming experience.

For a while, you observed something similar going on with the phrase 'I work at a non-profit'. For better or worse, I don't meet enough people who would be in a position to claim this, so I can't tell you if it still has the same cachet (at least relative to "I work for a Catholic charity" or "I volunteer for the Sea Shepherds").

My guess is that the time that people stop saying that they work at startups will roughly coincide with the time that technology startups start trading at reasonable price-to-earnings ratios, and I might think about buying shares in them.

Tuesday, June 14, 2011

Stopping By Woods on a Snowy Evening

Apropos nothing, the great Robert Frost.

Stopping By Woods on a Snowy Evening

Whose woods these are I think I know.
His house is in the village though;
He will not see me stopping here
To watch his woods fill up with snow.

My little horse must think it queer
To stop without a farmhouse near
Between the woods and frozen lake
The darkest evening of the year.

He gives his harness bells a shake
To ask if there is some mistake.
The only other sound's the sweep
Of easy wind and downy flake.

The woods are lovely, dark and deep.
But I have promises to keep,
And miles to go before I sleep,
And miles to go before I sleep.



I love this poem a lot. It manages to say much, even though most of it is merely describing the scene. The point, made explicit only at the end, seems to me to be partly about the short time we have on earth and the relation of man to nature. Nature is ambiguous in the poem - beautiful, but somewhat lonely and foreboding. The poem notes that the duties of the world we live in usually stop us from really noticing this, and instead we rush on along the long road that ends in sleep, with the repetition suggesting the second meaning of the long sleep we all face eventually.

Serious poetry fans eschew Frost, because he is too common and accessible, and thus affords few opportunities for snobbery and condescension. And while it would be easy to mock this motivation as being stupid (and it is), I think it is also unnecessary, since there is no danger of appearing too common by liking any sort of poetry these days (as opposed to, say, Lady Gaga). Frost, like Kipling, is popular because he is great - both of them are on the efficient frontier of 'profound' and 'accessible' - there are greater poets, and more accessible poets, but there are no poets who are both greater and more accessible.

Wednesday, June 1, 2011

In Praise of Monolingualism

One of the standard markers of being sophisticated is learning a second language. This is regarded as an unadulterated "good thing", and the multilingual sophisticates look down on the mouth-breathing, Walmart-shopping, non-passport-owning plebs that never bothered to learn a language other than English. Don't they know what they're missing? The chance to speak to people in other countries! The chance to learn about the assumptions of one's own language at a deeper level! The chance to read great books in their original language!

Now, gentle reader, I must confess to once being drawn towards such logic. Several times I slogged away through my teach-yourself-Spanish mp3s, usually in the lead-up towards a trip to some Spanish speaking country, and out of a sense that it would be cool.

What I would inevitably find once I got to said country is that knowing a little bit of a language is basically no better than not knowing anything. In particular, the range of questions you can ask and understand the response to is almost the same as what you can manage with pointing and gestures. You can ask what stuff costs, as long as you know numbers. You can ask for directions (e.g. to the bathroom), but anything that's not immediately visible will get an answer too complicated to understand. You can maybe read a menu, but even that can be done (and I did it once) just by pointing and making animal noises. The simple reality is that a wad of money that you're trying to spend, and possibly a phrasebook, is about as useful as a year or two of learning a language.

The main reason to learn a second language is when your first language isn't English. English has become what Esperanto fanboys always claimed to want - a common lingua franca that everyone can speak and understand. Strangely, the Esperanto folks aren't celebrating this fact.

The reality is that learning a language is one of those things that always seems great, as long as you don't consider the opportunity cost. If you force kids to learn a language at school, that's time they're not spending on maths, history or science. However the argument is always phrased as 'learning a language is important!', not 'learning a language is more important than spending the time on science', even though that's the relevant comparison. Sounds a bit less convincing the second way, doesn't it?

In my case, the opportunity cost was the fact that I didn't get to listen to music while driving to work, and had to concentrate hard the whole drive. What a trivial cost! Who wouldn't give that up?

Well, in the end, me. After noting this discrepancy between my stated and revealed preference, eventually I just became comfortable with what revealed preference was telling me - I didn't actually want to learn Spanish, and I did actually enjoy listening to my music. The only change was that I stopped feeling remotely bad about the fact that I don't speak anything other than English.

In the meantime, the Google Translate app, which now speaks sentences in dozens of different languages and can be carried around on your phone, has done little to modify my earlier views.

Saturday, May 21, 2011

Mr Brown


My latest Pandora find is the thoroughly excellent 'Mr Brown', by Styles of Beyond.

This falls into two categories: the current musical corner solution, and the metaphor of the day - in this case, a metaphor for getting shot between the eyes:
'[To] catch a 40-calibre case of glaucoma'
Classic!

This song also exemplifies the fact that the chord progression of 'Tonic, Sub-Dominant, Dominant, Sub-Dominant, Repeat' (e.g. C, F, G, F) is excellent and among the best four-chord riffs. It works really well in this song, it works well here, it works well here, it works well here, it pretty much works well everywhere.
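
For the curious, the progression can be derived mechanically from the major scale - degrees 1, 4, 5, 4 of any key give the tonic, subdominant, dominant, subdominant roots. A small sketch (with note-spelling simplified to sharps only):

```python
# Derive the I-IV-V-IV progression in any major key from scale intervals.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_SCALE_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitones above the tonic

def progression(key, degrees=(1, 4, 5, 4)):
    """Root notes of the given scale degrees in a major key."""
    root = NOTES.index(key)
    return [NOTES[(root + MAJOR_SCALE_STEPS[d - 1]) % 12] for d in degrees]

print(progression("C"))  # -> ['C', 'F', 'G', 'F']
print(progression("G"))  # -> ['G', 'C', 'D', 'C']
```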

Monday, April 25, 2011

"They"

One of the laziest rhetorical devices in song-writing is the use of the oppositional 'they'.

Usually this is done in relationship songs. For some reason, 'they' are always opposed to any given relationship. Don't ask me why. We live in a world where large impersonal forces are aligned to prevent couples who were always 'meant to be' from getting together. (The only exception I can think of is 'the old folks', with their "C'est la vie ... it goes to show you never can tell" attitude.)

A good example of this is 'Check Yes Juliet', by 'We the Kings'. Catchy and boppy, but inane:



(If the Vevo clip doesn't work, you can also try here)
"Run baby run
Don't ever look back
They'll tear us apart if you give them the chance"
It is an immense but common conceit of juvenile relationships that the world, as personified by the mysterious 'they', has deep interests in making sure that you and your girlfriend don't stay together.

The world, of course, is very rarely troubled by such matters. Your relationships end because you found someone else, or because you weren't actually suited to each other, or any other number of mundane but important reasons. Rarely do they end because 'they' chose to 'tear you apart'.

To give 'We the Kings' credit, they at least get a little more specific about who is opposed, in this case (implicitly) the parents.
They can change the locks, don't let them change your mind
You can tell how serious the parents are, because the Dad in the film clip keeps looking on in a vaguely disapproving manner while never actually saying anything. He's probably thinking about the possibility of illegitimate red-headed grandchildren, which frankly would concern me too.

Forget the parents. The 'they' that the narrator should actually be worried about is some other smooth-talking guy at school who also wants to hook up with the cute chick in the film clip. But that doesn't work so well as a rhetorical device, because for this particular 'they' to succeed in 'tearing them apart', the girl would have to want to go along. Which makes the narrative a little more awkward.

(Other examples of the mysterious 'they' can be found here, or here, or here.)

You know who could honestly write this song without it being self-indulgent?

Eva Braun, maybe? Okay, so 'they' had less interest in ending the relationship specifically, and more in ending one party to the relationship, but still.

Edward VIII is about the only one that springs to mind. Yes, large impersonal forces really were opposed to that relationship.

And I'll give a pass to anyone living in areas where "honour" killings are practiced.

Other than that? Justify your relationship without pretending it's so important that an entire conspiracy is being organised against it.