Friday, July 29, 2016

The True Believers

Beliefs are either objective, derived ultimately from evidence, or subjective, derived from some internal revelation or from the belief that some other person has had a true internal revelation and communicated it honestly. While not wanting to denigrate people’s beliefs in subjective truths, the success of Science and the triumphs of our modern world rest entirely on objective truths. Yet the voting public has partly lost faith in the people claiming to be purveyors of objective truth. The water is muddied by people lying for fun and profit, and by Science shooting itself in the foot with overconfident deduction and speculation that has little or no support.
There is an urgent need to restore the public’s faith in Science and objective truth, which are the foundations of our civilization. Can we perhaps get some intelligent supporters of objective truth to make some sacrifices that will bring the matter to the public’s attention in a good way? Take the following as an example of the sort of thing that might work. Better ideas are welcome.
The proposal is to create a Monastery (or hopefully a network of many Monasteries) dedicated to determining the objective truth, and to discovering and “excommunicating” those attempting to pervert the search for the truth. Those who had merely strayed, making bad deductions or trusting the wrong people, would be formally forgiven and “born again”.
The monks would put their wealth in a financial Trust, and wear some modest uniform. Trusted folk from outside could act as lay advisers. There would be investigations, in which evidence is collected, and then trials in which the two sides are debated. When nobody is available to support one side, devil’s advocates would be appointed. Monks clinging to views that most monks think have been completely disproved could be demoted back to applicant status and thus expelled from the monastery.
And, of course, all this is live streamed on the Internet. Viewers can participate to varying degrees based on their level. The highest levels are applicants (wanting to become monks) and lay advisors (who typically are unable to become monks for some reason). Next are trainees learning to evaluate objective truth and preparing for tests that will get them up to that higher level. Finally there are supporters. All these lower levels can make themselves available to help the monks. The general public have a ringside seat on this vigorous search for the truth and the truthful, and the key objective is to get them to understand and support it.
Finally, a little riff on the desirability of the truth. The truth can’t lead us astray if properly understood. However, properly understanding it is not always easy when our culture has implanted so many subjective truths so firmly in our brains.
Let’s take a simple case: bullying. You will pardon my expressing personal, non-expert opinion as if it were truth; it is purely illustrative, so it doesn’t matter if it is wrong. Humans used to form groups of several hundred related individuals divided into families. The families have a status order, like a pecking order in chickens, and it exists for the same reason as all status hierarchies: without it there would need to be a fight over each conflicting intention. The human situation is similar to that of (some species of) macaques, with family status passed from mother to daughter. Everyone needs to know their place, and one of the ways this is established is by bullying. If A bullies B and B’s relatives don’t rush in, then B knows that A and A’s family have higher status. In its natural setting bullying is only necessary when there is doubt.
Of course, in our city lifestyle this all breaks down. Status is a mess, with endless struggles leading to a lot of bullying. Now I would say that we need to understand this to make good decisions about dealing with bullying (which doesn’t even do its job in our society). But a lot of people would say, “You can’t say that bullying is natural. That condones it.” Actually it only condones it if you hold the subjective idea that human nature is good by default and departures from the good represent some malfunction. The correct view is that human evil, such as bullying, needs to be dealt with in human ways, not by trying to find and fix a malfunction.
Still we need to accept that the general public is not going to easily accept things which contradict their firmly implanted subjective truths. Such matters need to be dealt with carefully, and avoided as much as possible.

[update: This interacts with the proposed Truth and Expertise Network (previous post), because people can just say "I trust the monastery" and get good feedback on the material they read.]

The Truth and Expertise Network

While specific facts and resulting deductions are at the core, in the main we are interested in identifying those who are, and those who are not, good sources of the truth. Most particularly we want to identify those who are lying for their own advantage, since those lies are much more likely to cause harm than merely mistaken beliefs. We are mainly concerned with facts that are relevant to public policy, but even there we come to issues where well-meaning folk would say “you shouldn’t say that, even if it is true”. We’ll leave such delicate considerations until the next blog post.
For those who don’t want the technical details, the general idea is this: Individuals can specify their level of belief about claims made, about the motivation for claims, and about the trustworthiness of other individuals. Software can then warn you about claims based on the claim itself, or the people making it. The software would only follow links from the people you trust (and that they trust, etc). This might need some social engineering to actually work, and that is described in the following blog post.
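The “follow links from the people you trust (and that they trust, etc.)” rule can be sketched concretely. Here is a minimal Python illustration, with made-up names and no specified protocol behind it: trust assertions form a weighted directed graph, confidence decays multiplicatively along a chain, and links whose accumulated confidence falls below a threshold are simply not followed.

```python
from collections import deque

def transitive_trust(edges, root, threshold=0.1):
    """Compute how much `root` trusts each reachable key.

    `edges` maps a key to {trusted_key: weight}, where weight is the
    asserted probability (0..1) that the trusted key is reliable.
    Confidence decays multiplicatively along a chain; keys whose best
    chain falls below `threshold` are ignored, so the software only
    follows links from people you trust, and the people they trust.
    """
    best = {root: 1.0}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for neighbour, weight in edges.get(node, {}).items():
            score = best[node] * weight
            # Keep only the strongest chain to each key, and prune
            # anything that has decayed below the threshold.
            if score >= threshold and score > best.get(neighbour, 0.0):
                best[neighbour] = score
                queue.append(neighbour)
    del best[root]  # your trust in yourself is not informative
    return best

# Illustrative: alice trusts bob at 0.9; bob trusts carol at 0.8 and
# dave at only 0.05, so dave's chain (0.045) is pruned.
network = {"alice": {"bob": 0.9}, "bob": {"carol": 0.8, "dave": 0.05}}
trusted = transitive_trust(network, "alice")
```

The multiplicative decay is just one plausible choice; any scheme where long, weak chains count for less than short, strong ones would serve the same purpose.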
So the idea is that participants in the scheme will have one or more public-private keypairs. These will be used to sign assertions of various sorts, discussed below. They will be of no use unless (a) people link to those keys in various ways; and (b) the assertions are made public (at the very least to some of the linking people).
People can make their main public key publicly available in places they are known to control, or give it to specific people. They can also hold keypairs that they don’t advertise as their own, but endorse as reliable, as though they belonged to some other, unknown person. With these they can make assertions that can’t be attributed to them but can still be used by people who trust them and by the people they trust.
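To show the shape of a signed assertion, here is a deliberately simplified sketch. A real deployment would use asymmetric signatures (e.g. Ed25519), so that anyone holding the public key can verify without being able to forge; Python’s standard library has no asymmetric signing, so HMAC-SHA256 with a per-key secret stands in here purely for illustration, and the field names are my own invention.

```python
import hashlib
import hmac
import json

def sign_assertion(secret: bytes, assertion: dict) -> str:
    """Sign a canonical JSON encoding of an assertion.

    Stand-in only: HMAC-SHA256 keyed with a per-key secret. A real
    scheme would use a public-key signature so verification does not
    require sharing the secret. Sorting keys makes the encoding
    canonical, so the same assertion always signs identically.
    """
    payload = json.dumps(assertion, sort_keys=True).encode()
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify_assertion(secret: bytes, assertion: dict, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_assertion(secret, assertion), signature)

# Illustrative assertion: a prior of 30% on a claim at some URL.
assertion = {"claim": "X causes Y", "prior": 30,
             "url": "http://example.com/article"}
signature = sign_assertion(b"monastery-key", assertion)
```

Any tampering with the claim text, the prior, or the URL invalidates the signature, which is what lets third parties rely on assertions relayed through the network.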
I’ll list (some of) the assertions that can be made. Software running on the user’s machine, on the machines of those she trusts, and on central servers will cooperate to warn the user about false claims, claimants lacking the expertise they claim, and claimants seeking to mislead. Perhaps most important will be information about internal contradictions in the trust network. If your trust network supports incompatible claims, that indicates a problem, such as people in your trust network being overly confident about an uncertain matter, or infiltration of the network by incompetent or bad actors. Tracking these things down will help everybody who wants to get a good handle on the truth.
  • “My prior (belief pending further evidence) for this claim to be true is P%”, where P is between 0 and 100. The claim should be a block of text, optionally with a wider block of text from the same source providing context, plus a URL giving the location.
  • “My prior that this claim is honestly believed by the claimant is …”
  • “I believe the claimant is … [with probability] acting on behalf of … [with probability]”
  • “I trust this person to only express honestly held beliefs”, giving a public key.
  • “I believe this person is an expert on …”
  • “I trust this person to choose others who are trustworthy” (thus allowing an extended network of trust).
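The assertion types above could be carried as structured records. Here is one hypothetical encoding of the first assertion, together with the contradiction check described earlier: flag any claim to which people inside your trust network assign widely divergent priors. The record fields, threshold, and function names are all illustrative assumptions, not a worked-out format.

```python
from dataclasses import dataclass

@dataclass
class ClaimPrior:
    """'My prior for this claim to be true is P%' -- one signer's view."""
    signer: str     # fingerprint of the signer's public key (illustrative)
    claim_url: str  # location of the claimed text
    prior: float    # P, between 0 and 100

def contradictions(assertions, trusted, spread=50.0):
    """Flag claims where trusted signers' priors diverge by more than `spread`.

    A large divergence inside your own trust network suggests either
    genuine uncertainty, someone being overly confident about an
    uncertain matter, or an incompetent or bad actor to track down.
    """
    by_claim = {}
    for a in assertions:
        if a.signer in trusted:  # ignore signers outside the trust network
            by_claim.setdefault(a.claim_url, []).append(a.prior)
    return [url for url, priors in by_claim.items()
            if max(priors) - min(priors) > spread]

# Illustrative: trusted signers "a" and "b" disagree sharply about u1
# (95% vs 10%), while u2 draws broadly compatible priors.
views = [ClaimPrior("a", "u1", 95), ClaimPrior("b", "u1", 10),
         ClaimPrior("a", "u2", 60), ClaimPrior("c", "u2", 55),
         ClaimPrior("z", "u1", 0)]
flagged = contradictions(views, trusted={"a", "b", "c"})
```

In practice the set of trusted signers would itself come from the trust-link assertions, so the two mechanisms compose: first compute whom you transitively trust, then look for contradictions among them.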
Systematizing all that (and more) is a tough job. It is similar to the jobs done by the IETF (Internet Engineering Task Force), and maybe we need an OTETF (Objective Truth Engineering Task Force).

[update: Naturally browser plugins and other user software will make it as easy as possible for users to participate in this scheme.]

The Truth and Expertise Problem

Katherine Viner, writing in The Guardian, gives a comprehensive overview of the way social media and tailored news is disrupting the truth (https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth). It used to be that most of the population saw a common set of reasonably authoritative information from newspapers and TV. Now the real experts struggle to be heard over the cacophony, the truth is mixed up with falsehoods, and the way people get their news is highly likely to reinforce their biases.
But things are not all bad. Previously many lies got firmly established, and Viner gives the example of the Hillsborough tragedy. Now the population has learnt not to trust everything that is written down. This is a necessary step to not being misled. The problem is to get them to take the next step: to collect and evaluate the evidence like a scientist, then to think through the implications like a mathematician.
Well obviously that’s unrealistic. All of us, even the greatest experts in some field, are forced to identify experts that we trust when it comes to areas in which we lack expertise. The problem is that our faith in experts has been seriously eroded. In many ways this is a good thing. We now know that experts have put many people in prison through highly exaggerated claims about DNA and fingerprint identification. We know that many of the medical treatments that doctors have used, and advice they have given, are not supported by the evidence or results. We needed to take this step of treating experts with caution, but not go to the current extreme where expert advice is often completely ignored.
Experts need to be evaluated in some way. They won’t like that, but we need to achieve two things: making sure that the experts we trust are not abusing that trust, and finding the real experts to trust amid so many competing claims. We also need a process the public can respect, so that they and the media are inclined to select real and honest experts for their understanding of important matters.
This is the first of three blog posts. The second will discuss a technological solution for identifying what is true, and why, and for identifying who is an expert and who is trying to mislead. The third will propose a rather wild idea for a social experiment that might get the message to the public that there is an objective truth worth pursuing.
[There is a relevant new book “A Survival Guide to the Misinformation Age” by David Helfand, and here’s a review: http://physicsworld.com/cws/article/indepth/2016/jul/28/between-the-lines].