Tuesday, September 27, 2016

Steel Man (opposite of Straw Man) Rationality

I often tell people on my team who are arguing for a particular position to also be ready and able to summarize the best argument against their position.

This might be something to keep in mind while thinking about the election and the debate last night.
Steel man: The term "steel man" refers to the improved form of a position or argument. Where a straw man is a misrepresentation of someone's position or argument that is easy to defeat, a steel man is a strengthened version of that position or argument, one harder to defeat than what was originally stated.

John Stuart Mill: "He who knows only his own side of the case knows little of that. His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side, if he does not so much as know what they are, he has no ground for preferring either opinion..."
While the questions below may have right and wrong answers, I doubt the analysis leading to those answers is in any way simple and I doubt that most people with strong opinions on them could give the strongest arguments either for or against their own position.

Has global trade helped the average American over the last 30 years? Link.

When elites and the larger population disagree on a policy issue, which group generally gets their way? Link.

Can a nation with a $20 trillion national debt and large trade deficit maintain an aggressive military and foreign policy stance? Link.

At what threshold of economic productivity does a net "taker" become a "maker"? What is the threshold relative to the current minimum wage? Link.

Should the US accept more low-skill immigrants? What are the long term consequences? Link.

Is the total economic return to university education, either to the individual or to the nation, positive regardless of the ability level of the individual? Link.

Is the US winning or losing the cyberwar? Are China and Russia also listening to Merkel's cell phone conversations? Are PCs in Chinese and Russian government ministries actually secure and better maintained than ours? Link.

Is it possible that much of economic growth in future decades depends on technical innovations in areas such as quantum physics, machine learning, AI, genomics, advanced materials, robotics that only the top few percent of the population are cognitively equipped to understand? Has this already been true for some time? Link.


Thursday, September 22, 2016

Annals of Reproducibility in Science: Social Psychology and Candidate Gene Studies

Andrew Gelman offers a historical timeline of the reproducibility crisis in Social Psychology, along with some juicy insight into the one-funeral-at-a-time manner in which academic science often advances.
OK, that was a pretty detailed timeline. But here’s the point. Almost nothing was happening for a long time, and even after the first revelations and theoretical articles you could still ignore the crisis if you were focused on your research and other responsibilities. ...

Then, all of a sudden, the world turned upside down.

If you’d been deeply invested in the old system, it must be pretty upsetting to think about change. Fiske is in the position of someone who owns stock in a failing enterprise, so no wonder she wants to talk it up. The analogy’s not perfect, though, because there’s no one for her to sell her shares to. What Fiske should really do is cut her losses, admit that she and her colleagues were making a lot of mistakes, and move on. She’s got tenure and she’s got the keys to PPNAS, so she could do it. Short term, though, I guess it’s a lot more comfortable for her to rant about replication terrorists and all that.

... Why do I go into all this detail? Is it simply mudslinging? Fiske attacks science reformers, so science reformers slam Fiske? No, that’s not the point. The issue is not Fiske’s data processing errors or her poor judgment as journal editor; rather, what’s relevant here is that she’s working within a dead paradigm. A paradigm that should’ve been dead back in the 1960s when Meehl was writing on all this, but which in the wake of Simonsohn, Button et al., Nosek et al., is certainly dead today. It’s the paradigm of the open-ended theory, of publication in top journals and promotion in the popular and business press, based on “p less than .05” results obtained using abundant researcher degrees of freedom. It’s the paradigm of the theory that in the words of sociologist Jeremy Freese, is “more vampirical than empirical—unable to be killed by mere data.”

... In her article that was my excuse to write this long post, Fiske expresses concerns for the careers of her friends, careers that may have been damaged by public airing of their research mistakes. Just remember that, for each of these people, there may well be three other young researchers who were doing careful, serious work but then didn’t get picked for a plum job or promotion because it was too hard to compete with other candidates who did sloppy but flashy work that got published in Psych Science or PPNAS. It goes both ways. ...
An old timer who has seen it all before comments.
ex-social psychologist says:
September 21, 2016 at 5:36 pm

Former professor of social psychology here, now happily retired after an early buyout offer. If not so painful, it would almost be funny at how history repeats itself: This is not the first time there has been a “crisis” in social psychology. In the late 1960s and early 1970s there was much hand-wringing over failures of replication and the “fun and games” mentality among researchers; see, for example, Gergen’s 1973 article “Social psychology as history” in JPSP, 26, 309-320, and Ring’s (1967) JESP article, “Experimental social psychology: Some sober questions about some frivolous values.” It doesn’t appear that the field ever truly resolved those issues back when they were first raised–instead, we basically shrugged, said “oh well,” and went about with publishing by any means necessary.

I’m glad to see the renewed scrutiny facing the field. And I agree with those who note that social psychology is not the only field confronting issues of replicability, p-hacking, and outright fraud. These problems don’t have easy solutions, but it seems blindingly obvious that transparency and open communication about the weaknesses in the field–and individual studies–is a necessary first step. Fiske’s strategy of circling the wagons and adhering to a business-as-usual model is both sad and alarming.

I took early retirement for a number of reasons, but my growing disillusionment with my chosen field was certainly a primary one.
Geoffrey Miller also contributes
Geoffrey Miller says:
September 21, 2016 at 8:43 pm

There’s also a political/ideological dimension to social psychology’s methodological problems.

For decades, social psych advocated a particular kind of progressive, liberal, blank-slate ideology. Any new results that seemed to support this ideology were published eagerly and celebrated publicly, regardless of their empirical merit. Any results that challenged it (e.g. by showing the stability or heritability of individual differences in intelligence or personality) were rejected as ‘genetic determinism’, ‘biological reductionism’, or ‘reactionary sociobiology’.

For decades, social psychologists were trained, hired, promoted, and tenured based on two main criteria: (1) flashy, counter-intuitive results published in certain key journals whose editors and reviewers had a poor understanding of statistical pitfalls, (2) adherence to the politically correct ideology that favored certain kinds of results consistent with a blank-slate, situationist theory of human nature, and derogation of any alternative models of human nature (see Steven Pinker’s book ‘The blank slate’).

Meanwhile, less glamorous areas of psychology such as personality, evolutionary, and developmental psychology, intelligence research, and behavior genetics were trundling along making solid cumulative progress, often with hugely greater statistical power and replicability (e.g. many current behavior genetics studies involve tens of thousands of twin pairs across several countries). But do a search for academic positions in the APS job ads for these areas, and you’ll see that they’re not a viable career path, because most psych departments still favor the kind of vivid but unreplicable results found in social psych and cognitive neuroscience.

So, we’re in a situation where the ideologically-driven, methodologically irresponsible field of social psychology has collapsed like a house of cards … but nobody’s changed their hiring, promotion, or tenure priorities in response. It’s still fairly easy to make a good living doing bad social psychology. It’s still very hard to make a living doing good personality, intelligence, behavior genetic, or evolutionary psychology research.

In the title of this post I mention Candidate Gene Studies. Forget, for the moment, about goofy Social Psychology experiments conducted on undergraduates. Much more money was wasted in the early 21st century on under-powered genomics studies that looked for gene-trait associations using small samples. Researchers, overconfident in their vaunted biological or biochemical intuition, performed studies using p < 0.05 thresholds that produced (ultimately false) associations between candidate genes and a variety of traits. According to Ioannidis, almost none of these results replicate (more). When I first became aware of GWAS almost a decade ago, the field was in disarray: some journals were still publishing results at the p < 0.05 threshold, while others had adopted the corrected p < 5E-08 = 0.05 x 1E-06 "genome wide significance" threshold (based on multiple testing correction for ~1E06 SNPs). The latter results routinely replicate, as expected.

Clearly, many researchers fundamentally misunderstood basic statistics, or at least were grossly overconfident in their priors for no good reason. But as of today, genomics has corrected its practices and although no one wants to dwell on the 5+ years worth of non-replicable published results, science is at least moving forward. I hope Social Psychology and other problematic areas (such as in biomedical research) can self-correct their practices as genomics has.
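The genome-wide significance threshold arithmetic above amounts to a Bonferroni correction of the familywise alpha over the number of tests. A minimal sketch (the SNP names and p-values are hypothetical, for illustration only):

```python
def bonferroni_threshold(alpha=0.05, n_tests=1_000_000):
    """Per-test p-value threshold controlling familywise error at alpha."""
    return alpha / n_tests

threshold = bonferroni_threshold()  # ~5e-08, the "genome-wide significance" level

# Hypothetical p-values from a toy association scan:
pvals = {"rs0001": 3e-9, "rs0002": 2e-5, "rs0003": 4.9e-8}

# Only hits below the corrected threshold count as genome-wide significant.
# Note rs0002 would have passed the old p < 0.05 convention but fails here.
hits = [snp for snp, p in pvals.items() if p < threshold]
```

This is why the corrected-threshold results "routinely replicate": the per-test bar is strict enough that chance associations across a million SNPs rarely clear it.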

See also One funeral at a time?


Bonus Feature!
Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature

Denes Szucs, John PA Ioannidis
doi: http://dx.doi.org/10.1101/071530

We have empirically assessed the distribution of published effect sizes and estimated power by extracting more than 100,000 statistical records from about 10,000 cognitive neuroscience and psychology papers published during the past 5 years. The reported median effect size was d=0.93 (inter-quartile range: 0.64-1.46) for nominally statistically significant results and d=0.24 (0.11-0.42) for non-significant results. Median power to detect small, medium and large effects was 0.12, 0.44 and 0.73, reflecting no improvement through the past half-century. Power was lowest for cognitive neuroscience journals. 14% of papers reported some statistically significant results, although the respective F statistic and degrees of freedom proved that these were non-significant; p value errors positively correlated with journal impact factors. False report probability is likely to exceed 50% for the whole literature. In light of our findings the recently reported low replication success in psychology is realistic and worse performance may be expected for cognitive neuroscience.
From the paper. FRP = False Report Probability = the probability that the null hypothesis is true when we get a statistically significant finding.
... In all, the combination of low power, selective reporting and other biases and errors that we have documented in this large sample of papers in cognitive neuroscience and psychology suggest that high FRP are to be expected in these fields. The low reproducibility rate seen for psychology experimental studies in the recent Open Science Collaboration (Nosek et al. 2015a) is congruent with the picture that emerges from our data. Our data also suggest that cognitive neuroscience may have even higher FRP rates, and this hypothesis is worth evaluating with focused reproducibility checks of published studies. Regardless, efforts to increase sample size, and reduce publication and other biases and errors are likely to be beneficial for the credibility of this important literature.
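The FRP defined above can be computed directly from alpha, power, and the prior probability that a tested effect is real. A small sketch using the paper's median power of 0.12 for small effects; the 1-in-6 prior is an illustrative assumption, not a number from the paper:

```python
def false_report_probability(alpha, power, prior_true):
    """P(H0 true | significant result), given the test's alpha, its power,
    and the prior probability that a tested effect is real."""
    false_pos = alpha * (1 - prior_true)   # rate of significant results under H0
    true_pos = power * prior_true          # rate of significant results under H1
    return false_pos / (false_pos + true_pos)

# Median power of 0.12 for small effects (from the abstract), with a
# hypothetical 1-in-6 prior that a tested effect is real:
frp_small = false_report_probability(alpha=0.05, power=0.12, prior_true=1 / 6)
# frp_small is roughly 0.68, i.e. well above 50%, consistent with the
# authors' conclusion for the literature as a whole.
```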

Wednesday, September 21, 2016

The death of Sol (Soylent Green)



I first saw Soylent Green on television when I was a kid. (CBS Late Night Movie of the Week, or something like that :-) It was terrifying -- the sweaty dystopian desperation, the riot scenes, but most of all Sol's final moments at the euthanasia center.
Backstage: ... it is unlikely that an actor will ever give a last performance with the stunning emotional resonance of Edward G. Robinson’s work in 1973’s “Soylent Green.”

The film is an environmental parable and, along with “Planet of the Apes,” could be considered the granddaddy of the “dystopian future” genre. But in 1973, the notion of a world destroyed by pollution, overpopulation, and food shortages was frightening and fresh. The fact that every ill depicted in “Soylent Green” (set in the then-distant world of 2022) is actually coming to pass has only made the film seem prescient.

Robinson plays Sol Roth, the partner, friend, and father figure to Charlton Heston’s Detective Thorn, a cop tasked with saving the world. As a man who remembers the Earth’s beauty before it was compromised, Robinson’s Sol symbolizes nothing less than humanity itself, and is given scene after scene where he conveys the wonder and longing for a world that exists only in his memory. These moments—Robinson remembering, for example, how food used to taste when he was young—are both chilling and touching. He breaks your heart, but Robinson is never sentimental.

That Robinson was dying of cancer during filming makes his rich performance even more psychologically intricate. In real life, Robinson died two weeks after filming ended—and he dies in the film itself. “Soylent Green” imagines Sol Roth’s elective euthanasia scene as the final word in personalized shopping. After making his AV preferences for his final journey, Roth is escorted to a large room where he lies on a gurney, imbibes some sort of (presumably) fatal drink, and begins what can only be described as one of the most poetic and powerful death scenes in film history.

As the soundtrack plays an assortment of elegiac Beethoven and Tchaikovsky, Sol watches a 1973 version of an IMAX screen project the breathtaking beauty of the vanished world: sunsets, birds, oceans, plains, flowers. Director Richard Fleischer gives the actor a series of wonderful close-ups, and what Robinson is able to convey with only his eyes is stunning in both its precision and economy. In these close-ups, the actor is able to wordlessly communicate a great many things: the brilliance of the late silent period; the scope of his personal struggles; the totality of his expansive body of work—all with an incredibly light, unmannered touch.

Though he was inexplicably overlooked for an Oscar nomination for “Soylent Green” (and, indeed, for his entire career), he was awarded a richly deserved posthumous Oscar for lifetime achievement. Here’s to you, Mr. Robinson.
See also Soylent is for People.

The death of Fermi


From Stan Ulam's Adventures of a Mathematician. More Ulam. More Fermi.
His illness progressed rapidly. I went to Chicago to visit him. In the hospital I found him sitting up in bed with tubes in the veins of his arms. But he could talk. Seeing me, he smiled as I came in and said: "Stan, things are coming to an end." It is impossible for me to describe how shattering it was to hear this sentence. I tried to keep composed, made a feeble attempt at a joke, then for about an hour we talked about many subjects, and all along he spoke with serenity, and under the circumstances really a superhuman calm.

He mentioned that Teller had visited him the previous day, and joked that he had "tried to save his soul." Normally it is the priest who wants to save the soul of the dying man; Fermi put it the other way round, alluding to the public hullabaloo about Teller and the H-bomb. Perhaps their conversation had an effect, for shortly after Fermi died Teller published an article entitled "The Work of Many People," toning down the assertions of Shepley and Blair. During my visit to Fermi Laura dropped in and I was amazed at the ordinary nature of their conversation about some household appliance.

We talked on and I remember his saying that he believed he had already done about two-thirds of his life's work, no matter how long he might have lived. He added that he regretted a little not having become more involved in public affairs. It was very strange to hear him evaluating his own activity—from the outside, as it were. Again I felt that he achieved this super-objectivity through sheer will power. ...

... Then half seriously I raised the question whether in a thousand years so much progress will be made that it may be possible to reconstruct people who had lived earlier by tracing the genes of the descendants, collecting all the characteristics that make up a person and reconstructing them physically. Fermi agreed, but he added: "How about the memory? How will they put back in the brain all the memories which are the makeup of any given individual?" This discussion now seems rather unreal and even weird, and it was partly my fault to have put us on such a subject, but at the time it came quite naturally from his super-detachment about himself and death.

I paid him one more visit, this time with Metropolis; when we came out of his room I was moved to tears. Only Plato's account of the death of Socrates could apply to the scene, and paraphrasing some of the words of Krito I told Nick, "That now was the death of one of the wisest men known."

Fermi died shortly after.




Lunch with Fermi at Los Alamos. Feynman is looking at the camera.



Ulam with Feynman and Von Neumann.


See also Passing the Torch.
From Fermi Remembered. The insightful biographical sketch at the beginning of the book, by Emilio Segrè, includes details of Fermi's early (self-)education and entry into the Scuola Normale Superiore.

Murray Gell-Mann: When Fermi lay dying in Billings Hospital, I realized how much I cared for this brilliant, funny, difficult man. I was on leave in the East, and I invited Frank (C.N.) Yang to come with me to Chicago to see him. When we got to the bedside, Enrico kept telling us not to be downcast. "It is not so bad," he said. He told of a Catholic priest who had visited him and whom he had had to comfort. And Frank reminded me a few years ago of what Enrico said when we left, never to see him again. "Now, it is up to you."

Friday, September 16, 2016

Genomic prediction of adult life outcomes using SNP genotypes


Genomic prediction of adult life outcomes using SNP genotypes is very close to a reality. This was discussed in an earlier post The Tipping Point. The previous post, Prenatal and pre-implantation genetic diagnosis (Nature Reviews Genetics), describes how genotyping informs the Embryo Selection Problem which arises in In Vitro Fertilization (IVF).

The Adult-Attainment factor in the figure above is computed using inputs such as occupational prestige, income, assets, social welfare benefit use, etc. See Supplement, p.3. The polygenic score is computed using estimated SNP effect sizes from the SSGAC GWAS on educational attainment (i.e., a simple linear model).

A genetic test revealing that a specific embryo is, say, a -2 or -3 SD outlier on the polygenic score would probably give many parents pause, in light of the results in the figure above. The accuracy of this kind of predictor will grow with GWAS sample size in coming years.
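As a rough illustration of the "simple linear model" mentioned above, a polygenic score is just a weighted sum of allele counts. All effect sizes and genotypes below are hypothetical, not the actual SSGAC estimates:

```python
# Hypothetical per-allele effect sizes (the beta-hats a GWAS would estimate):
beta = [0.02, -0.01, 0.015]

# Rows: individuals; entries: minor-allele counts (0, 1, or 2) at each SNP.
genotypes = [
    [0, 1, 2],
    [2, 2, 0],
    [1, 0, 1],
]

# Polygenic score: dot product of effect sizes with allele counts.
raw_scores = [sum(b * g for b, g in zip(beta, row)) for row in genotypes]

# Scores are typically standardized so outliers can be read off in SD units,
# e.g. the "-2 or -3 SD" embryos mentioned above.
mean = sum(raw_scores) / len(raw_scores)
sd = (sum((s - mean) ** 2 for s in raw_scores) / len(raw_scores)) ** 0.5
z_scores = [(s - mean) / sd for s in raw_scores]
```

In practice the sum runs over hundreds of thousands of SNPs, and predictor accuracy improves as larger GWAS samples shrink the noise in each estimated effect size.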

Via Professor James Thompson. See also discussion by Stuart Ritchie.
The Genetics of Success: How Single-Nucleotide Polymorphisms Associated With Educational Attainment Relate to Life-Course Development

Psychological Science 2016, Vol. 27(7) 957–972
DOI: 10.1177/0956797616643070

A previous genome-wide association study (GWAS) of more than 100,000 individuals identified molecular-genetic predictors of educational attainment. We undertook in-depth life-course investigation of the polygenic score derived from this GWAS using the four-decade Dunedin Study (N = 918). There were five main findings. First, polygenic scores predicted adult economic outcomes even after accounting for educational attainments. Second, genes and environments were correlated: Children with higher polygenic scores were born into better-off homes. Third, children’s polygenic scores predicted their adult outcomes even when analyses accounted for their social-class origins; social-mobility analysis showed that children with higher polygenic scores were more upwardly mobile than children with lower scores. Fourth, polygenic scores predicted behavior across the life course, from early acquisition of speech and reading skills through geographic mobility and mate choice and on to financial planning for retirement. Fifth, polygenic-score associations were mediated by psychological characteristics, including intelligence, self-control, and interpersonal skill. Effect sizes were small. Factors connecting DNA sequence with life outcomes may provide targets for interventions to promote population-wide positive development.

Thursday, September 15, 2016

Prenatal and pre-implantation genetic diagnosis (Nature Reviews Genetics)


An IVF cycle can produce multiple (e.g., 5-10 for younger mothers) viable embryos. This leads to an inevitable Embryo Selection Problem. Genomic advances allow for better-informed selection, raising complex ethical issues.
Prenatal and pre-implantation genetic diagnosis

Nature Reviews Genetics 17, 643–656 (2016) doi:10.1038/nrg.2016.97
Published online 15 September 2016

The past decade has seen the development of technologies that have revolutionized prenatal genetic testing; that is, genetic testing from conception until birth. Genome-wide single-cell arrays and high-throughput sequencing analyses are dramatically increasing our ability to detect embryonic and fetal genetic lesions, and have substantially improved embryo selection for in vitro fertilization (IVF). Moreover, both invasive and non-invasive mutation scanning of the genome are helping to identify the genetic causes of prenatal developmental disorders. These advances are changing clinical practice and pose novel challenges for genetic counseling and prenatal care.
From the paper:
Whole-genome analysis of pre-implantation embryos provides information about not only the disorder tested for, but the whole genomic make-up of the embryo. This not only allows for improved selection, but also provides information on genetic variants that are associated with several non-health-related traits. These prospects raise difficult ethical questions. Some people may see this as the slippery slope towards the ‘designer child’ (REF. 136), whereas a different perspective is that it enables prospective parents and professionals to take into account the welfare of the future child. Following the principle of procreative beneficence, it is common practice to rank embryos and select the embryo with the highest chance of resulting in a healthy individual137. This raises questions as to whether prospective parents have the right to select for the best embryo and how to define ‘best’, especially in the context of genome-wide analysis.

...

With further technological improvements and increasing success rates, prenatal and pre-implantation diagnosis of genetic disorders will become commonplace, and with increasing public acceptance a continued growth in their implementation can be anticipated. This implementation, in turn, will reduce the frequency of rare severe inherited genetic diseases. Increasingly, more common genetic variants causing late-onset disorders (for example, BRCA1 and BRCA2) or recessive disorders (for example, cystic fibrosis) could also be selected against and will eventually become rare. In the future, new diagnostic technologies will not only provide a tool to give parents the option of an informed choice, but they will also lead towards fetal personalized medicine ...

Truth and Remembrance at ASEAN: Duterte remarks

What did Philippine President Duterte really say at the recent ASEAN meeting?
Asia Times: Truth and Duterte in media crosshairs

... An actual listen to the full press conference is enlightening in terms of Duterte’s issues with the United States.

At the 6:40 mark, Duterte goes off on a Reuters reporter who, in Duterte’s view, accepts the premise that he needs to answer questions President Obama and others might raise on extrajudicial killings and human rights issues in the drug war.

Duterte is infuriated because in his view the United States is devoid of the moral stature to question him on human rights, given its bloody history of “Moro pacification” in Duterte’s homeland of Mindanao.

CNN helpfully (or hopelessly) glossed the human cost of the US intervention for its readers as a matter of about 600 dead:

Duterte was referring to the US’s history as a colonial power in the Philippines, and specifically to one infamous massacre in the southern Philippines — the 1906 Battle of Bud Dajo — in which hundreds of Filipinos, including women and children, were killed.

Actually, he wasn’t, which CNN would have discovered if they had listened past Duterte’s first agitated reference to his fuller statement about “600” at the ten-minute mark. Duterte is referring to 600,000 dead, not 600. Even more shockingly, Duterte’s number is actually one of the more conservative estimates (the upper end is 1.4 million) of Moro deaths at the hand of the US military.

Yes, American friends, Duterte is referring to one of the most brutal and shameful chapters in the history of American imperialism, the brutal subjugation of the Muslim population of Philippines’ Mindanao over 30 years of formal war and informal counterinsurgency from 1898 into the 1920s.

Mindanao is where the United States first applied the savage lessons of its Indian war to counterinsurgency in Asia—including massacre of civilians, collective punishment, and torture. Waterboarding entered the US military toolkit in Mindanao, as immortalized on the May 22, 1902 front cover of Life magazine.

And the war never ended. After the Philippines shed its colonial status, the Manila Roman Catholic establishment continued the war with US help. Today, the Philippines is locked in a cycle of negotiation and counterinsurgency between the central government and the Moro Islamic Liberation Front (MILF) —a cycle that Duterte as president hopes to bring to its conclusion with a negotiated peace settlement.

This is not ancient history to Duterte, who emphatically stated in his press conference that the reason Mindanao is “on the boil” today is because of the historical crimes of the United States.

Duterte has additional reasons for his choler.

As I wrote previously at Asia Times, Duterte suspects US spooks of orchestrating a deadly series of bombings in his home city of Davao in 2002, with the probable motive of creating a pretext for the central government to declare martial law on Mindanao to fight the MILF. The 2002 Davao bombings form the foundation of Duterte’s alienation from the United States and his resistance to US-Philippine joint exercises on Mindanao, as he declared upon the assumption of his presidency.

And, though it hasn’t received a lot of coverage in the United States, last week, on September 2, another bomb ripped through a marketplace in Davao, killing fourteen people. It was suspected of being part of an assassination plot against Duterte, who was in town at the time, and the Communist Party of the Philippines (which is also engaged in peace talks with Duterte) accused the United States of being behind it.

...

At the ASEAN gathering in Laos, Duterte apparently tried to explain the roots of his indignation ... :

“The Philippine president showed a picture of the killings of American soldiers in the past and the president said: ‘This is my ancestor they killed. Why now we are talking about human rights,'” an Indonesian delegate said. The Philippines was an American colony from 1898 to 1946.

The delegate described the atmosphere in the room as “quiet and shocked.”
As I wrote here (substitute Filipinos for Chinese below):
Most Chinese are incredulous that European colonialists and imperialists, many inhabiting the lands of indigenous people exterminated or displaced only a few centuries ago, would think to assume the moral high ground.
Someone quipped that a transcript of Duterte's remarks might be mistaken for something written by Howard Zinn or a post-colonial theorist. Obama, of all US Presidents, is most likely to understand Duterte's perspective. Obama initially responded by calling Mr. Duterte a “colorful guy” :-)

Saturday, September 10, 2016

Speed, Balding, et al.: "for a wide range of traits, common SNPs tag a greater fraction of causal variation than is currently appreciated"

I recently blogged about a nice lecture by David Balding at the 2015 MLPM (Machine Learning for Personalized Medicine) Summer School: Machine Learning for Personalized Medicine: Heritability-based models for prediction of complex traits. In that talk he discussed some results concerning heritability estimation and potential improvements over GCTA. A new preprint on bioRxiv has the details:
Re-evaluation of SNP heritability in complex human traits

Doug Speed, Na Cai, The UCLEB Consortium, Michael Johnson, Sergey Nejentsev, David Balding
http://dx.doi.org/10.1101/074310

SNP heritability, the proportion of phenotypic variance explained by SNPs, has been estimated for many hundreds of traits, and these estimates are being used to explore genetic architecture and guide future research. To estimate SNP heritability requires strong assumptions about how heritability is distributed across the genome, but the assumptions in current use have not been thoroughly tested. By analyzing imputed data for 42 human traits, we empirically derive an improved model for heritability estimation. It is commonly assumed that the expected heritability of a SNP does not depend on its allele frequency; we instead identify a more realistic relationship which reflects that heritability tends to decrease with minor allele frequency. Two methods for estimating SNP heritability, GCTA and LDAK, make contrasting assumptions about how heritability varies with linkage disequilibrium; we demonstrate that the model used by LDAK better reflects the properties of real data. Additionally, we show how genotype certainty can be incorporated in the heritability model; this enables the inclusion of poorly-imputed SNPs, which can capture substantial extra heritability. Our revised method typically results in substantially higher estimates of SNP heritability: for example, across 19 traits (mainly diseases), the estimates based on common SNPs (minor allele frequency >0.01) are on average 40% (SD 3) higher than those obtained using original GCTA, and 25% (SD 2) higher than those from the recently-proposed extension GCTA-LDMS. We conclude that for a wide range of traits, common SNPs tag a greater fraction of causal variation than is currently appreciated. When we also include rare SNPs (minor allele frequency <0.01), we find that across 23 quantitative traits, estimates of SNP heritability increase by on average 29% (SD 12), and that rare SNPs tend to contribute about half the heritability of common SNPs.
In contrast to GCTA, which assumes a uniform Gaussian distribution of effect sizes for each SNP, this paper considers effect sizes that depend on a weight w_j capturing the local linkage disequilibrium around SNP j, as well as a SNP quality score r_j. (See equation 1 of the paper.) The intuition behind w_j is that if there are n SNPs in a small region which are all highly correlated, they are likely all proxies for the actual causal variant, and hence one might overcount its contribution by assigning nearly equal effects to each of the SNPs. Instead, the method proposed in this paper (roughly) splits the effect size among the SNPs (Figure 1 below). Their model also allows the effect size distribution to depend on the MAF of SNP j: SNPs at lower frequency in the population contribute less to heritability than under the GCTA default assumption.



The resulting heritability estimates tend to be higher than those from GCTA, so if this method is an improvement (as the authors argue), the amount of missing heritability is even smaller than that found using GCTA.
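The MAF- and LD-dependent weighting described above can be sketched numerically. The exact functional form and the default alpha below are my assumptions based on the paper's description, not the authors' code; see equation 1 of the paper for the real model.

```python
import numpy as np

def snp_heritability_weights(maf, ld_weights, info, alpha=-0.25):
    """Relative expected heritability contribution per SNP.

    Hypothetical sketch of an LDAK-style model:
        weight_j ~ w_j * r_j * [2 f_j (1 - f_j)]^(1 + alpha)
    where w_j down-weights SNPs in high-LD regions, r_j is a
    genotype-certainty (imputation quality) score, and f_j is MAF.
    alpha = -1 recovers the GCTA-style assumption that expected
    heritability is independent of allele frequency.
    """
    maf = np.asarray(maf, dtype=float)
    var = 2.0 * maf * (1.0 - maf)  # variance of allele counts
    w = np.asarray(ld_weights) * np.asarray(info) * var ** (1.0 + alpha)
    return w / w.sum()             # normalize to fractions of total h^2

# Two SNPs with equal LD weights and imputation quality, different MAF:
maf = [0.05, 0.50]
print(snp_heritability_weights(maf, [1, 1], [1, 1], alpha=-1.0))   # equal shares
print(snp_heritability_weights(maf, [1, 1], [1, 1], alpha=-0.25))  # rare SNP down-weighted
```

With alpha = -1 the MAF term cancels and every SNP is expected to contribute equally; with alpha closer to zero, low-MAF SNPs contribute less, which is the qualitative behavior the abstract describes.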

Supplement Figure 21 (p.26) provides yet more criticism of Kumar et al., a paper we discussed previously here. [Kumar, S., Feldman, M., Rehkopf, D. & Tuljapurkar, S. Limitations of GCTA as a solution to the missing heritability problem, PNAS 113, E61–E70 (2016).]

Friday, September 09, 2016

Defense Science Board report on Autonomous Systems


US DOD Defense Science Board report on Autonomy (autonomous systems).
... This report provides focused recommendations to improve the future adoption and use of autonomous systems.

... While difficult to quantify, the study concluded that autonomy—fueled by advances in artificial intelligence—has attained a ‘tipping point’ in value. Autonomous capabilities are increasingly ubiquitous and are readily available to allies and adversaries alike. The study therefore concluded that DoD must take immediate action to accelerate its exploitation of autonomy while also preparing to counter autonomy employed by adversaries.

... The primary intellectual foundation for autonomy stems from artificial intelligence (AI), the capability of computer systems to perform tasks that normally require human intelligence (e.g., perception, conversation, decision-making). Advances in AI are making it possible to cede to machines many tasks long regarded as impossible for machines to perform. ...

Countering adversary use of autonomy (p.42)

As has become clear in the course of the study, the technology to enable autonomy is largely available anywhere in the world and can—both at rest and in motion—provide significant advantage in many areas of military operations. Thus, it should not be a surprise when adversaries employ autonomy against U.S. forces. Preparing now for this inevitable adversary use of autonomy is imperative.

This situation is similar to the potential adversary use of cyber and electronic warfare. For years, it has been clear that certain countries could, and most likely would, develop the technology and expertise to use cyber and electronic warfare against U.S. forces. Yet most of the U.S. effort focused on developing offensive cyber capabilities without commensurate attention to hardening U.S. systems against attacks from others. Unfortunately, in both domains, that neglect has resulted in DoD spending large sums of money today to “patch” systems against potential attacks. The U.S. must heed the lessons from these two experiences and deal with adversary use of autonomy now.

While many policy and political issues surround U.S. use of autonomy, it is certainly likely that many potential adversaries will have less restrictive policies and CONOPs governing their own use of autonomy, particularly in the employment of lethal autonomy. Thus, expecting a mirror image of U.S. employment of autonomy will not fully capture the adversary potential.

The potential exploitations the U.S. could face include low observability throughout the entire spectrum from sound to visual light, the ability to swarm with large numbers of low-cost vehicles to overwhelm sensors and exhaust the supply of effectors, and maintaining both endurance and persistence through autonomous or remotely piloted vehicles.

...

The U. S. will face a wide spectrum of threats with varying kinds of autonomous capabilities across every physical domain—land, sea, undersea, air, and space—and in the virtual domain of cyberspace as well.

Figure 9 (photo on left) is a small rotary-wing drone sold on the Alibaba web site for $400. The drone is made of carbon fiber; uses both GPS and inertial navigation; has autonomous flight control; and provides full motion video, a thermal sensor, and sonar ranging. It is advertised to carry a 1 kg payload with 18 minutes endurance.

Figure 9 (photo on right) shows a much higher end application of autonomy, a UUV currently being used by China. Named the Haiyan, in its current configuration it can carry a multiple sensor payload, cruise up to 7 kilometers per hour (4 knots), range to 1,000 kilometers, reach a depth of 1,000 meters, and endure for 30 days. Undersea testing was initiated in mid-2014. The unit can carry multiple sensors and be outfitted to serve a wide variety of missions, from anti-submarine surveillance, to anti-surface warfare, underwater patrol, and mine sweeping. The combat potential and applications are clear.

Harvard to Release Six Years of Admissions Data for Lawsuit

This amounts to "comprehensive data" on almost 200k applicants! I imagine the legal team could use some good data scientists...
Crimson: Harvard to Release Six Years of Admissions Data for Lawsuit

Harvard must produce “comprehensive data” from six full admissions cycles for use in the pending admissions lawsuit between the University and anti-affirmative action group Students for Fair Admissions following a court order filed Tuesday.

Students for Fair Admissions launched the lawsuit in 2014, alleging that the University’s admissions process discriminates against Asian American applicants by setting quotas. ...
See also 20 years @15 percent: does Harvard discriminate against Asian-Americans? and much more.

Does this graph look like "soft-quotas" to you?

Wednesday, September 07, 2016

Daimler investing 500 million euros in drone delivery (video)



WSJ: Daimler to Work With Matternet to Develop Delivery Van Drones

Auto maker investing $562.75 million to design electric vans that can host aerial deliveries

Daimler AG said on Wednesday it would join with U.S. startup Matternet to develop drones for its delivery vans and invest €500 million ($562.7 million) over the next five years in designing electric, networked vans.

Daimler, the maker of Mercedes-Benz cars and trucks, acquired a minority stake in Menlo Park, Calif.-based Matternet as part of the partnership, a spokeswoman said. Daimler’s overall investment in the initiative, called adVANce, will go to vehicle digitization, automation, robotics and mobility solutions technologies.

“We are looking beyond the vehicle to the whole value chain and the entire environment of our clients,” said van division chief Volker Mornhinweg. The goal is to turn vans into “intelligent, interconnected data centers,” he said.

SMPY in Nature


No evidence of diminishing returns in the far tail of the cognitive ability distribution.
How to raise a genius: lessons from a 45-year study of super-smart children (Nature)

A long-running investigation of exceptional children reveals what it takes to produce the scientists who will lead the twenty-first century.

Tom Clynes 07 September 2016

On a summer day in 1968, professor Julian Stanley met a brilliant but bored 12-year-old named Joseph Bates. The Baltimore student was so far ahead of his classmates in mathematics that his parents had arranged for him to take a computer-science course at Johns Hopkins University, where Stanley taught. Even that wasn't enough. Having leapfrogged ahead of the adults in the class, the child kept himself busy by teaching the FORTRAN programming language to graduate students.

Unsure of what to do with Bates, his computer instructor introduced him to Stanley, a researcher well known for his work in psychometrics — the study of cognitive performance. To discover more about the young prodigy's talent, Stanley gave Bates a battery of tests that included the SAT college-admissions exam, normally taken by university-bound 16- to 18-year-olds in the United States.

Bates's score was well above the threshold for admission to Johns Hopkins, and prompted Stanley to search for a local high school that would let the child take advanced mathematics and science classes. When that plan failed, Stanley convinced a dean at Johns Hopkins to let Bates, then 13, enrol as an undergraduate.

Stanley would affectionately refer to Bates as “student zero” of his Study of Mathematically Precocious Youth (SMPY), which would transform how gifted children are identified and supported by the US education system. As the longest-running current longitudinal survey of intellectually talented children, SMPY has for 45 years tracked the careers and accomplishments of some 5,000 individuals, many of whom have gone on to become high-achieving scientists. The study's ever-growing data set has generated more than 400 papers and several books, and provided key insights into how to spot and develop talent in science, technology, engineering, mathematics (STEM) and beyond.

...

At the start, both the study and the centre were open to young adolescents who scored in the top 1% on university entrance exams. Pioneering mathematicians Terence Tao and Lenhard Ng were one-percenters, as were Facebook's Mark Zuckerberg, Google co-founder Sergey Brin and musician Stefani Germanotta (Lady Gaga), who all passed through the Hopkins centre.

“Whether we like it or not, these people really do control our society,” says Jonathan Wai, a psychologist at the Duke University Talent Identification Program in Durham, North Carolina, which collaborates with the Hopkins centre. Wai combined data from 11 prospective and retrospective longitudinal studies2, including SMPY, to demonstrate the correlation between early cognitive ability and adult achievement. “The kids who test in the top 1% tend to become our eminent scientists and academics, our Fortune 500 CEOs and federal judges, senators and billionaires,” he says.

Such results contradict long-established ideas suggesting that expert performance is built mainly through practice — that anyone can get to the top with enough focused effort of the right kind. SMPY, by contrast, suggests that early cognitive ability has more effect on achievement than either deliberate practice or environmental factors such as socio-economic status.

...

The study's first four cohorts range from the top 3% to the top 0.01% in their SAT scores. The SMPY team added a fifth cohort of the leading mathematics and science graduate students in 1992 to test the generalizability of the talent-search model for identifying scientific potential.

“I don't know of any other study in the world that has given us such a comprehensive look at exactly how and why STEM talent develops,” says Christoph Perleth, a psychologist at the University of Rostock in Germany who studies intelligence and talent development.

...

Monday, September 05, 2016

A secret map of the world (Venkatesh Rao / Ribbonfarm)

This is Venkatesh Rao's conceptual map of the world (as seen from Silicon Valley / the internet). Details in the video and this blog post.



In case you can't make out all the features on the map, here is a hi-res version. See also this other map.

Some places of note:

Isle of Deep Learning
Isle of Physics
Moldbug's Lair
Alt-Right Hills
Dark Enlightenment Volcano
Paleo Crossing
Satoshi Mines
Secret Cloud Empire of Amazon
Fjords of Sisu
Algomonopolia (Google, Facebook, ...)
a16z Unicorn Hunting Ground
Lean Startup Town
SJW Cathedral
Manosphere Tar Pit
Global Bro-Science Laboratory
NSA
Academia
Efficient Market Temple
Graveyard of Boomer Dreams
Ghost of Industrial Past

If these memes are unfamiliar, you need to spend more time on the internet or in the bay area :-)


World's fastest supercomputer: Sunway TaihuLight (41k nodes, 11M cores)



Jack Dongarra, professor at UT Knoxville, discusses the strengths and weaknesses of the Sunway TaihuLight, currently the world's fastest supercomputer. The fastest US supercomputer, Titan (#3 in the world), is at Oak Ridge National Lab, near UTK. More here and here.

MSU's latest HPC cluster would be ranked ~150 in the world.
Top 500 Supercomputers in the world

Sunway TaihuLight, a system developed by China’s National Research Center of Parallel Computer Engineering & Technology (NRCPC) and installed at the National Supercomputing Center in Wuxi, in China's Jiangsu province, is the No. 1 system with 93 petaflop/s (Pflop/s) on the Linpack benchmark. The system has 40,960 nodes, each with one SW26010 processor, for a combined total of 10,649,600 computing cores. Each SW26010 processor is composed of 4 MPEs and 4 clusters of 64 CPEs (a total of 260 cores), 4 Memory Controllers (MC), and a Network on Chip (NoC) connected to the System Interface (SI). Each of the four MPEs, CPE clusters, and MCs has access to 8GB of DDR3 memory. The system is based on processors exclusively designed and built in China. The Sunway TaihuLight is almost three times as fast and three times as efficient as Tianhe-2, the system it displaces in the number one spot. The peak power consumption under load (running the HPL benchmark) is 15.371 MW, or 6 Gflops/W. This allows the TaihuLight system to hold one of the top spots on the Green500 in terms of the Performance/Power metric. [ IIRC, these processors are inspired by the old Digital Alpha chips that I used to use... ]
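The core-count and efficiency figures quoted above are internally consistent, as a quick arithmetic check shows:

```python
# 40,960 nodes x 260 cores per SW26010 processor:
assert 40_960 * 260 == 10_649_600

# 93 Pflop/s Linpack at 15.371 MW of power:
rmax_gflops = 93e6      # 93 Pflop/s expressed in Gflop/s
power_watts = 15.371e6  # 15.371 MW expressed in W
print(round(rmax_gflops / power_watts, 2))  # ~6.05 Gflops/W
```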

...

The number of systems installed in China has increased dramatically to 167, compared to 109 on the last list. China is now the No. 1 user of HPC. Additionally, China now holds the No. 1 position in performance share, thanks to the big contribution of the systems at No. 1 and No. 2.

The number of systems installed in the USA has declined sharply and is now at 165 systems, down from 199 in the previous list. This is the lowest number of systems installed in the U.S. since the list was started 23 years ago.

...

The U.S., the leading consumer of HPC systems since the inception of the TOP500 list, is now second behind China for the first time, with 165 of the 500 systems. China now leads both the systems and performance categories, thanks to the No. 1 and No. 2 systems and a surge in industrial and research installations registered over the last few years. The European share (105 systems compared to 107 last time) has fallen and is now lower than the dominant Asian share of 218 systems, up from 173 in November 2015.

Dominant countries in Asia are China with 167 systems (up from 109) and Japan with 29 systems (down from 37).

In Europe, Germany is the clear leader with 26 systems followed by France with 18 and the UK with 12 systems.

Thursday, September 01, 2016

Balance of power in the western Pacific (Hugh White ANU video)

More Hugh White (see The Pivot and American Statecraft in Asia). In the first video below @38min: he gives what I consider a realistic assessment of the current and near future military balance of power in the Pacific, including the fact that both sides have significant uncertainty in their evaluation of opponent capability. If you just have a few minutes that part is worth a listen.

Defeating A2AD (Anti-Access Area Denial) requires ASB (Air Sea Battle), which is highly escalatory. See also A2AD fait accompli?
Hugh White AO is Professor of Strategic Studies at the Australian National University. His work focuses primarily on Australian strategic and defence policy, Asia-Pacific security issues, and global strategic affairs especially as they influence Australia and the Asia-Pacific. He has served as an intelligence analyst with the Office of National Assessments, as a journalist with the Sydney Morning Herald, as a senior adviser on the staffs of Defence Minister Kim Beazley and Prime Minister Bob Hawke, and as a senior official in the Department of Defence, where from 1995 to 2000 he was Deputy Secretary for Strategy and Intelligence, and as the first Director of the Australian Strategic Policy Institute (ASPI). In the 1970s he studied philosophy at Melbourne and Oxford Universities.





In the second video, see the syllogism expressed @25 min. @44min: surface ships are toast.

Monday, August 29, 2016

A2AD fait accompli?



As I mentioned previously, Australian strategists are a good source of analysis on China-US defense issues in the western Pacific because they are caught in the middle and have to think realistically about the situation.

@3:30 min: A2AD by DF21 / DF26 ASBM a fait accompli? 9 dash line to become a reality? Is containment broken?

See earlier post The Pivot and American Statecraft in Asia.

I linked to the report below some time ago.
China’s Constellation of Yaogan Satellites & the Anti-Ship Ballistic Missile – An Update

Professor S. Chandrashekar and Professor Soma Perumal
International Strategic & Security Studies Programme (ISSSP)
National Institute of Advanced Studies (NIAS)
December 2013

With the recent launch of the Yaogan 19 satellite China has in place an advanced space capability to identify, locate and track an Aircraft Carrier Group (ACG) on the high seas. This space capability is an important component of an Anti-Ship Ballistic Missile (ASBM) System that China has set up.

The current 19 satellite constellation consists of ELINT satellites, satellites carrying Synthetic Aperture Radar (SAR) sensors as well as satellites carrying optical imaging sensors. Based on the orbit characteristics, their local time of equatorial crossing and other related parameters, these satellites can be grouped into different categories that perform the various functions for identifying, locating and tracking the ACG.

Yaogan 9 (Yaogan 9A, 9B, 9C), Yaogan 16 (16A, 16B, 16C) and Yaogan 17 (17A, 17B, 17C) are the three clusters that are equipped with ELINT sensors that provide broad area surveillance over the oceans. With a coverage radius of about 3500 km, they provide the first coarse fix for identifying and locating an ACG in the Pacific Ocean.

Yaogan 13, Yaogan 10, Yaogan 18 and Yaogan 14 are the satellites carrying a SAR sensor. With local times of crossing of 02:00, 06:00, 10:00 and 14:00 hours and a resolution of 1 to 3 m, they provide all-weather as well as day and night imaging capabilities over the regions of interest.

Yaogan 11, Yaogan 4, Yaogan 2 and Yaogan 7 constitute the high resolution optical satellites in the current constellation. The sensors they carry may have resolutions of between 1 and 3 m.

... 
The analysis and the simulation results suggest that China has in place an operational ASBM system that can identify, locate, track and destroy an Aircraft Carrier in the Pacific Ocean.

This seems to be an important component of a larger Chinese Access and Area Denial Strategy focused around a conflict over Taiwan.
Over the summer I bumped into a micro-satellite startup guy at the Googleplex and we got onto the subject of imaging aircraft carriers from space. He thought a carrier group would be easy to image and couldn't possibly survive a serious conflict in the Pacific.

Sunday, August 28, 2016

Geoffrey Miller on Virtue Signalling (audio + slides)



This talk was given at the meeting Effective Altruism Global 2016. Includes a good warning about IQ signaling and some advice on how to market a movement to neurotypicals. Slides.


More Geoffrey Miller :-)

Friday, August 26, 2016

GWAS: Multiple Loci Influencing Normal Human Facial Morphology


These are not surprising results, given that identical twins raised apart tend to have nearly identical facial morphology. It's implausible that most of this heritability is due to rare variants. If large GWASes took photos and video of the individuals in the study, genomic prediction of facial morphology could advance dramatically using face recognition algorithms.

See also HLI and genomic prediction of facial morphology  (source of image above of Craig Venter).
Genome-Wide Association Study Reveals Multiple Loci Influencing Normal Human Facial Morphology

http://dx.doi.org/10.1371/journal.pgen.1006149

Numerous lines of evidence point to a genetic basis for facial morphology in humans, yet little is known about how specific genetic variants relate to the phenotypic expression of many common facial features. We conducted genome-wide association meta-analyses of 20 quantitative facial measurements derived from the 3D surface images of 3118 healthy individuals of European ancestry belonging to two US cohorts. Analyses were performed on just under one million genotyped SNPs (Illumina OmniExpress+Exome v1.2 array) imputed to the 1000 Genomes reference panel (Phase 3). We observed genome-wide significant associations (p < 5 × 10⁻⁸) for cranial base width at 14q21.1 and 20q12, intercanthal width at 1p13.3 and Xq13.2, nasal width at 20p11.22, nasal ala length at 14q11.2, and upper facial depth at 11q22.1. Several genes in the associated regions are known to play roles in craniofacial development or in syndromes affecting the face: MAFB, PAX9, MIPOL1, ALX3, HDAC8, and PAX1. We also tested genotype-phenotype associations reported in two previous genome-wide studies and found evidence of replication for nasal ala length and SNPs in CACNA2D3 and PRDM16. These results provide further evidence that common variants in regions harboring genes of known craniofacial function contribute to normal variation in human facial features. Improved understanding of the genes associated with facial morphology in healthy individuals can provide insights into the pathways and mechanisms controlling normal and abnormal facial morphogenesis.

Wednesday, August 24, 2016

Tyler Cowen on Efficient Markets (video)



Tyler Cowen explains the basics of the Efficient Market Hypothesis. For a deeper exploration, see Tyler Cowen and rationality, which links to his paper How economists think about rationality.
Tyler Cowen and rationality [my comments]: ... The excerpt below deals with rationality in finance theory and strong and weak versions of efficient markets. I believe the weak version; the strong version is nonsense. (See, e.g., here for a discussion of limits to arbitrage that permit long-lasting financial bubbles. In other words, capital markets are demonstrably far from perfect, as defined below by Cowen.)

Although you might think the strong version of EMH is only important to traders and finance specialists, it is also very much related to the idea that markets are good optimizers of resource allocation for society. Do markets accurately reflect the "fundamental value of corporations"? See related discussion here.

...

As you can tell from my comments, I do not believe there is any unique basis for "rationality" in economics. Humans are flawed information processing units produced by the random vagaries of evolution. Not only are we different from each other, but these differences arise both from genes and the individual paths taken through life. Can a complex system comprised of such creatures be modeled through simple equations describing a few coarse grained variables? In some rare cases, perhaps yes, but in most cases, I would guess no. Finance theory already adopts this perspective in insisting on a stochastic (random) component in any model of security prices. Over sufficiently long timescales even the properties of the random component are not constant! (Hence, stochastic volatility, etc.)

Tuesday, August 23, 2016

Bayesian large-scale multiple regression with summary statistics from genome-wide association studies

This is an interesting idea, which has also been advocated to me over the years by my collaborator Carson Chow (blog). I'm optimistic that we're entering an era of large data sets that can be analyzed in situ using more sophisticated algorithms than simple regression. However, it will always be useful to have a better method for combining multiple data sets using only aggregate statistics.

Xiang Zhu's slides.
Bayesian large-scale multiple regression with summary statistics from genome-wide association studies

Xiang Zhu, Matthew Stephens
doi: http://dx.doi.org/10.1101/042457

Bayesian methods for large-scale multiple regression provide attractive approaches to the analysis of genome-wide association studies (GWAS). For example, they can estimate heritability of complex traits, allowing for both polygenic and sparse models; and by incorporating external genomic data into the priors they can increase power and yield new biological insights. However, these methods require access to individual genotypes and phenotypes, which are often not easily available. Here we provide a framework for performing these analyses without individual-level data. Specifically, we introduce a "Regression with Summary Statistics" (RSS) likelihood, which relates the multiple regression coefficients to univariate regression results that are often easily available. The RSS likelihood requires estimates of correlations among covariates (SNPs), which also can be obtained from public databases. We perform Bayesian multiple regression analysis by combining the RSS likelihood with previously-proposed prior distributions, sampling posteriors by Markov chain Monte Carlo. In a wide range of simulations RSS performs similarly to analyses using the individual data, both for estimating heritability and detecting associations. We apply RSS to a GWAS of human height that contains 253,288 individuals typed at 1.06 million SNPs, for which analyses of individual-level data are practically impossible. Estimates of heritability (52%) are consistent with, but more precise than, previous results using subsets of these data. We also identify many previously-unreported loci that show evidence for association with height in our analyses. Software implementing RSS is available at https://github.com/stephenslab/rss.
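If I understand the paper correctly, the RSS likelihood takes the form betahat ~ N(S R S^{-1} beta, S R S), where betahat and S = diag(se) hold the univariate GWAS effect estimates and their standard errors, and R is the SNP correlation (LD) matrix. A minimal numerical sketch (my notation, not the authors' software):

```python
import numpy as np

def rss_loglik(beta, betahat, se, R):
    """Log-density of an RSS-style likelihood:
        betahat ~ N(S R S^{-1} beta, S R S),
    relating multiple-regression coefficients beta to the
    single-SNP summary statistics (betahat, se) and LD matrix R.
    """
    se = np.asarray(se, dtype=float)
    S = np.diag(se)
    Sinv = np.diag(1.0 / se)
    mu = S @ R @ Sinv @ np.asarray(beta, dtype=float)
    Sigma = S @ R @ S
    resid = np.asarray(betahat, dtype=float) - mu
    _, logdet = np.linalg.slogdet(Sigma)
    p = len(resid)
    return -0.5 * (p * np.log(2 * np.pi) + logdet
                   + resid @ np.linalg.solve(Sigma, resid))

# With no LD (R = identity) and unit standard errors, this reduces
# to a product of independent N(beta_j, 1) likelihoods:
R = np.eye(2)
print(rss_loglik([0.1, -0.2], [0.1, -0.2], [1.0, 1.0], R))  # = -log(2*pi)
```

In practice a prior on beta (polygenic, sparse, or a mixture) would be combined with this likelihood and sampled by MCMC, as the abstract describes.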

Monday, August 22, 2016

Wikileaks, Julian Assange, and an October(?) Surprise



In recent interviews Julian Assange more or less claims to have the goods on Hillary. If I had to guess, I suppose he might have email traffic showing that she lied to congress, for example in the exchange above with Rand Paul, or perhaps in some of her answers to questions about her private email server. The most impactful time to release this information is probably just before one of the debates.
Paul: "It’s been in news reports that ships have been leaving from Libya and that they may have weapons. And what I’d like to know is, that [CIA] annex that was close by [the State Department facility], were they involved with procuring, buying, selling, obtaining weapons, and were any of these weapons being transferred to other countries? Any countries, Turkey included?"

Clinton: “I don’t know. I don’t have any information on that.”
If we lived in a country where rule of law applied, there might be serious consequences for this sort of thing. In the 21st century USA, we'll be lucky if any mainstream media outlets cover the story ;-) The NYTimes will probably just blame the Russians.

Wednesday, August 17, 2016

The Pivot and American Statecraft in Asia

Hugh White, Professor of Strategic Studies at the Australian National University, critiques the Obama administration's so-called pivot to Asia. Australian strategists are a good source of analysis on this issue because they are caught in the middle and have to think realistically about the situation.

Whenever I see a book or article on this topic I quickly search for terms like DF-21, ASBM, ASCM, cruise missiles, satellite imaging, submarines, etc. The discussion cannot be serious or deep without an understanding of current military and technological capabilities of both sides. (See High V, Low M.)
Book review: 'The Pivot: The Future of American Statecraft in Asia', by Kurt Campbell: As Assistant Secretary of State for Asia in Barack Obama's first term, Kurt Campbell has a respectable claim to being the principal architect of the president's Pivot to Asia. Not surprisingly, then, his new book The Pivot: The Future of American Statecraft in Asia argues that the Pivot is the right policy for America in Asia over coming years, and explains how it should be elaborated and extended under the next president.

... Washington has never clearly identified or analysed the problem which the Pivot is supposed to solve, and The Pivot doesn't either. And yet there is no mystery here. America's problem in Asia today is that China seeks to take its place as the primary power in Asia, and the shift in relative power between the two countries over recent decades makes China's challenge very formidable indeed. This simple fact must be at the centre of any serious analysis of America's policy options in Asia.

The Pivot mentions China a lot, but does not plainly acknowledge the centrality of its challenge to America's predicament in Asia today, and nowhere seriously assesses the power and ambition that drive China's challenge. Nor is the book clear about America's objectives. In places it says America's aims include preventing Asia falling under someone else's hegemony, but elsewhere that the Pivot is all about preserving Asia's geopolitical 'operating system', by which it plainly means preserving the status quo based on US primacy.

Thus the book, like the policy itself, is based on evasions about both China's and America's aims, and therefore avoids acknowledging how directly those aims conflict, and how stark and serious the resulting confrontation between them has already become.

... The practical steps taken under the Pivot have always been far too modest to meet the challenge America faces in Asia. Indeed, it is hard to imagine that they were ever intended to have more than a symbolic effect. The Pivot's architects apparently assumed that a merely symbolic reassertion of US power and resolve would be enough to make China back off and abandon its challenge. China's assertive posture in the East and South China Seas today is strong evidence that they were wrong.

... In particular, The Pivot has nothing to say about the most important single question facing America in Asia today: is it willing to go to war with China to preserve US primacy? This question, more than anything else, will determine the shape of future Asian order and America's role in it. China's recent conduct strongly suggests that it will only abandon its challenge to American primacy if it is really convinced that the answer is 'yes'. But nothing Beijing has seen or heard from Washington in recent years has convinced it of that, which is why it has been acting so boldly. Unless that changes, the chances of facing down Beijing's challenge are very low.

That will not change until an American president is willing to stand up and explain to America's people why US primacy in Asia is so important to them that they should be willing to go to war with China to preserve it. The answer to that question must encompass the fact that China is a nuclear-armed power with the capacity to destroy US cities. This is an issue which The Pivot entirely avoids. I found no substantive reference to China's nuclear forces in the entire book, nor to extended nuclear deterrence as the foundation of America's key alliances, and hence to its position in Asia. No analysis that evades these hard questions can address the future of America's Asia strategy effectively.

So Kurt Campbell's new book reinforces the impression that important elements of America's foreign policy establishment still haven't begun either to take China's rise seriously or to consider the momentous choices America faces in response to it. Until that changes, America's response to China is unlikely to become much more effective than it has been for the five years since Barack Obama launched the Pivot in Canberra. And so it becomes more and more likely that American power in Asia will continue to dwindle.
See also Red Star over the Pacific and The Thucydides Trap.

I added the following in the comments. These questions of military/technological capability stand prior to the prattle of diplomats, policy analysts, or political scientists. Perhaps just as crucial is whether top US and Chinese leadership share the same beliefs on these issues.
It's hard to war game a US-China Pacific conflict, even a conventional one. How long before the US surface fleet is destroyed by ASBM/ASCM? How long before forward bases are? How long until the US has to strike targets on the mainland? How long do satellites survive? How long before the conflict goes nuclear? I wonder whether anyone knows the answers to these questions with high confidence -- even very basic ones, like how well asymmetric threats like ASBM/ASCM will perform under realistic conditions. These systems have never been tested in battle.

The stakes are so high that China can simply continue to establish "facts on the ground" (like building new island bases), with some confidence that the US will hesitate to escalate. If, for example, both sides secretly believe (at the highest levels; Xi seems to be behaving as if he might) that ASBM/ASCM are very effective, then sailing a carrier group through the South China Sea becomes an act of symbolism with meaning only to those who are not in the know.
This Aug 2016 RAND report delves into some of the relevant issues (see Appendix A, p.75). But it is not clear whether the 2025 or the 2015 scenarios explored will be more realistic over the next few years. A weakness of the report is that it assumes US forces will undertake a large-scale conventional attack on the Chinese mainland (referred to as Air Sea Battle by US planners) relatively early in the conflict, without fear of nuclear retaliation. A real decision maker could not confidently make that assumption, the PRC's "no first use" declaration notwithstanding.

See also Future Warfare in the Western Pacific (International Security, Summer 2016) for a detailed analysis of A2AD capability, potentially practiced by both sides. I disagree with the authors' claim that the effectiveness of A2AD in 2040 will be limited to horizon distances (they assume all satellites have been destroyed). The authors neglect the possibility of large numbers of stealthy drone radar platforms (or micro-satellites) which are hard to detect until they activate to provide targeting data to incoming missiles.

This article by Peter Lee gives a realistic summary of the situation, including the role of nuclear weapons. As a journalist, Lee is not under the same political restrictions as RAND or others funded by the US military / defense industry. The survivability of the surface fleet (=aircraft carriers) and the escalatory nature of what is known as Air Sea Battle (=ASB) are both highly sensitive topics.

Sunday, August 14, 2016

Cheng Li on elite Chinese politics (Sinica podcast)

Excellent podcast interview with Cheng Li of Brookings. Li has both a long historical perspective on Chinese politics (having lived through the Cultural Revolution) and a detailed understanding of current developments. He addresses topics such as technocracy, rule of law, Xi Jinping, corruption, princelings vs grassroots party members, etc.
Sinica podcast: One of the most prominent international scholars of elite Chinese politics speaks about the past, present and future of factionalism, reform and technocracy in China and the nation's direction under Xi Jinping.
Li grew up in Shanghai during the Cultural Revolution. In 1985 he came to the United States, where he received a master's in Asian studies from the University of California, Berkeley and a doctorate in political science from Princeton University. From 1993 to 1995, he worked in China as a fellow sponsored by the Institute of Current World Affairs in the U.S., observing grassroots changes in his native country. Based on this experience, he published a nationally acclaimed book, "Rediscovering China: Dynamics and Dilemmas of Reform" (1997).

Li is also the author or editor of numerous books, including "China’s Leaders: The New Generation" (2001), "Bridging Minds Across the Pacific: The Sino-U.S. Educational Exchange 1978-2003" (2005), "China’s Changing Political Landscape: Prospects for Democracy" (2008), "China’s Emerging Middle Class: Beyond Economic Transformation" (2010), "The Road to Zhongnanhai: High-Level Leadership Groups on the Eve of the 18th Party Congress" (in Chinese, 2012), "The Political Mapping of China’s Tobacco Industry and Anti-Smoking Campaign" (2012), "China's Political Development: Chinese and American Perspectives" (2014), "Chinese Politics in the Xi Jinping Era: Reassessing Collective Leadership" (2016), and "The Power of Ideas: The Rising Influence of Thinkers and Think Tanks in China" (forthcoming). He is currently completing a book manuscript with the working title "Middle Class Shanghai: Pioneering China’s Global Integration." He is the principal editor of the Thornton Center Chinese Thinkers Series published by the Brookings Institution Press.
