Friday, September 7, 2018

My Crackpot Theory

There is matter and antimatter, but luckily for us there is more matter, otherwise the matter and antimatter would annihilate each other leaving nothing but photons. Why is there more matter? I have an answer! Since I'm not an expert in the field, and haven't done due diligence on it, it is, by definition, a crackpot theory. But I like it.

The great Richard Feynman said that antimatter behaves, for computational purposes, like ordinary matter travelling backwards in time. Let's take that literally.

Our Universe is expanding from the big bang in accordance with Einstein's equations. Another solution of those equations is a universe contracting to a big crunch. It doesn't take too much imagination to suppose that immediately before the big bang another universe was collapsing to a big crunch. You can regard that as an earlier part of our Universe, but I want to imagine it as separate. I'll call it the negative universe, since it is in negative time if the big bang is at zero.

Particles don't have well defined positions in our quantum universe, which lets them tunnel through apparently insurmountable barriers. So it is possible for some particles moving forward in time from the negative universe to tunnel through to our Universe. And similarly some of our particles moving backward in time (i.e. antimatter) can tunnel through to the negative universe.

So naturally we end up with an excess of matter, and the negative universe ends up with an excess of antimatter. I suggest that in the negative universe, the arrow of time defined by increasing entropy would point backwards. So folk living there would perceive that antimatter as moving forward in time, and perceive the universe as expanding.

It's such a beautiful symmetric picture, it just has to be.

It makes a prediction. In the very early universe particles don't survive long before being annihilated by antiparticles. In the current model where matter is assumed to be slightly different from antimatter, the preponderance of matter happens later. In my model it would appear earlier, so that there is net matter even when the energy level is very high and particles don't last long.

Sunday, April 29, 2018

Bi-quaternions and 4d Clifford algebra

You took a shortcut weeks ago in the program you're writing and now it's biting you, and you know you have to go back and rewrite stuff. But to put that off you start reading Conway and Smith's "On Quaternions and Octonions", and then you wake up in the middle of the night thinking about bi-quaternions. What is the Clifford algebra way of thinking about them?

In Clifford algebra, rotations are given by elements of the even sub-algebra (acting on the vectors to be rotated by the sandwich product). It forms a sub-algebra because the grade of a Clifford product differs from the sum of the grades by an even number, so even times even stays even. Scalar multiples of the same even element give the same rotation under the sandwich product, so the number of degrees of freedom of a rotation is one less than the dimension of the even subalgebra.

The grade dimensions of the n-dimensional Clifford algebra follow a row of Pascal's triangle and give a total dimension of 2ⁿ. The 0-grade is the scalars. The highest grade is 1-dimensional, so its element is called the pseudo-scalar. The 1-grade is the vectors of the base vector space. Orthogonal to each vector is an (n-1)-grade element, a pseudo-vector, which behaves a lot like a vector.
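The Pascal's-triangle count is easy to check with a few lines of illustrative Python:

```python
from math import comb

def grade_dims(n):
    """Dimensions of the grade-k subspaces of the Clifford algebra on an
    n-dimensional vector space: the n-th row of Pascal's triangle."""
    return [comb(n, k) for k in range(n + 1)]

for n in (2, 3, 4):
    dims = grade_dims(n)
    print(n, dims, sum(dims))  # the grades always sum to 2**n
```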

So let's start with 2-d (1-d is left as an exercise). We have a 2-d vector space. The Clifford algebra consists of: scalars (1-d); vectors (2-d); oriented area elements (1-d, the pseudo-scalar). 1+2+1=4. The square, by Clifford multiplication, of the unit area element is the scalar -1. That's suggestive! The even sub-algebra is the scalars plus the area elements. Yes: it is the complex numbers. Because the complex numbers and the vectors are both 2-d it is easy to get them confused.

In 3-d the Clifford algebra consists of: scalars (1-d); vectors (3-d); bivectors (pseudo-vectors, so also 3-d); oriented volume element (1-d pseudo-scalar). 1+3+3+1=8. The even sub-algebra is the scalars and the bivectors. Yes: it is the quaternions. Once again there is potential confusion, this time because the vectors and the bivectors have the same dimension.
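To make the -1 concrete, here is a small illustrative Python sketch. The (sign, index-tuple) encoding of basis blades is my own, and the signature is assumed Euclidean (every eiei = +1):

```python
def blade_mul(a, b):
    """Product of basis blades, encoded as (sign, strictly increasing
    tuple of basis indices); e.g. (1, (1, 2)) is e1e2. Assumes eiei = +1."""
    sign, idx = a[0] * b[0], list(a[1] + b[1])
    i = 0
    while i < len(idx) - 1:
        if idx[i] > idx[i + 1]:
            idx[i], idx[i + 1] = idx[i + 1], idx[i]
            sign = -sign            # eiej = -ejei for i != j
            i = max(i - 1, 0)
        elif idx[i] == idx[i + 1]:
            del idx[i:i + 2]        # eiei = +1 drops out
            i = max(i - 1, 0)
        else:
            i += 1
    return (sign, tuple(idx))

# 2-d: the unit area element squares to the scalar -1, just like i.
print(blade_mul((1, (1, 2)), (1, (1, 2))))   # (-1, ())

# 3-d: each unit bivector also squares to -1, like the quaternion units.
for bv in [(1, (1, 2)), (1, (2, 3)), (1, (1, 3))]:
    print(blade_mul(bv, bv))                 # (-1, ()) each time
```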

The product of n vectors is called a versor. Up to 3-d there are no Clifford algebra elements that aren't versors. The square of a versor is a scalar. So it makes sense to take the sum of squares of the components of an even subalgebra element, then take the square root to get a norm. With a bit of other calculation we find that they form normed division algebras. This breaks down in 4-d where there are bivectors which are not versors.

In 4-d we have: scalars (1-d); vectors (4-d); bivectors (6-d); trivectors (pseudo-vectors, 4-d); oriented hypervolume (1-d pseudo-scalar). 1+4+6+4+1=16. The even subalgebra is the scalars, the bivectors and the pseudo-scalar. But we are told that rotations in 4-d can also be represented by a pair of quaternions. Here's a way to see two quaternions in the even subalgebra:

We like Clifford algebras because so much can be done without picking distinguished directions as basis vectors. But somehow we keep finding it convenient to specify a basis, as we will here. Let e1, e2, e3 and e4 be a basis of our 4-d vector space. Now consider just the 3-d subspace spanned by e1, e2 and e3. Its even subalgebra is the quaternions, consisting of the scalars and the bivectors e1e2, e2e3 and e3e1.

That leaves, from the bivector basis, e1e4, e4e2 and e3e4, plus the pseudoscalar e1e2e3e4. Now if we define a new multiplication as the Clifford product times (or divided by) e1e2e3e4, we get another model of the quaternions, this time with the pseudoscalar playing the role of the scalar. So the even subalgebra has been divided into 2 copies of the quaternions.
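A little basis-blade arithmetic (illustrative Python, Euclidean signature assumed, with a sign-tracking encoding of my own) can check this claim: the pseudoscalar squares to +1, and multiplying the three leftover bivectors by e1e2e3e4 lands back on the first copy's generators:

```python
def blade_mul(a, b):
    """Product of basis blades (sign, increasing index tuple); eiei = +1."""
    sign, idx = a[0] * b[0], list(a[1] + b[1])
    i = 0
    while i < len(idx) - 1:
        if idx[i] > idx[i + 1]:
            idx[i], idx[i + 1] = idx[i + 1], idx[i]
            sign = -sign            # eiej = -ejei
            i = max(i - 1, 0)
        elif idx[i] == idx[i + 1]:
            del idx[i:i + 2]        # eiei = +1
            i = max(i - 1, 0)
        else:
            i += 1
    return (sign, tuple(idx))

I = (1, (1, 2, 3, 4))                  # the pseudoscalar e1e2e3e4
print(blade_mul(I, I))                 # (1, ()): I squares to +1
print(blade_mul((1, (1, 4)), I))       # (-1, (2, 3)): e1e4 * I = -e2e3
print(blade_mul((-1, (2, 4)), I))      # (-1, (1, 3)): e4e2 * I = e3e1
print(blade_mul((1, (3, 4)), I))       # (-1, (1, 2)): e3e4 * I = -e1e2
```

So multiplication by the pseudoscalar is a bijection between the two triples of bivectors, which is why the even subalgebra splits into two quaternion copies.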

Well I should check this out more carefully, but I'd better get back to my program that needs fixing.

Monday, June 12, 2017

Logic Programming in Functional Style

[This was also posted on the WombatLang blog, but it is self-contained and might have wider interest.]
[N.B. There is code. I have hand-compiled the example below to an AST, and written a pretty-printer to check it is right. Next step is to write an interpreter for the AST. How hard can that be :-).]

Wombat was designed to be a (mostly) functional programming language with some logic programming capabilities. But it turned out that you can't be half-hearted about logic programming. However the functional roots shine through, giving Wombat a familiar look for most programmers. But behind the scenes, unification is everywhere.

Procedures are also ubiquitous in Wombat. They always have exactly one input and one output. Even things that aren't procedures can act like procedures. In normal functional programming the input is specified, and the output starts out as a hole that the result gets put into. In Wombat both the output and the input are passed to the procedure. Either can be a hole. One or both might be structures which include holes. Consider

(x,y) = f(1,2)

Here "=" causes unification. One might think that the function f will be called, it will return a pair, and unification will then happen. But this is not how Wombat works. Instead (x,y) is unified with f's output, (1,2) is unified with f's input, and execution of f then starts.
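Wombat's holes can be mimicked, crudely, in an ordinary language. Here is a minimal illustrative Python sketch (a one-way toy, not Wombat's real unifier; the function f and its body are invented): the caller hands f its output holes up front, and f fills them rather than returning a value that is matched afterwards.

```python
class Hole:
    """A write-once logic variable (a 'hole')."""
    def __init__(self):
        self.bound, self.value = False, None
    def fill(self, v):
        if self.bound:
            assert self.value == v, "unification failure"
        else:
            self.bound, self.value = True, v

def unify(pattern, value):
    """One-way unification, enough for this example:
    fill holes in `pattern` from `value`."""
    if isinstance(pattern, Hole):
        pattern.fill(value)
    elif isinstance(pattern, tuple) and isinstance(value, tuple):
        assert len(pattern) == len(value), "unification failure"
        for p, v in zip(pattern, value):
            unify(p, v)
    else:
        assert pattern == value, "unification failure"

# (x, y) = f(1, 2): f receives its output holes before it runs.
def f(out, args):          # hypothetical f: computes (sum, product)
    a, b = args
    unify(out, (a + b, a * b))

x, y = Hole(), Hole()
f((x, y), (1, 2))
print(x.value, y.value)    # 3 2
```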

Before we look at a more interesting example, some relevant features of Wombat are:
  • An identifier is preceded by backquote when used for the first time. It starts life as a hole, and like all holes it can only be filled in once. `x:Int; x=3 (Explicit typing is optional.);
  • An explicit procedure (closure) is just an expression in braces -- { x+1 } ;
  • A closure's input is $ and its output is `$. The input is commonly a tuple which is unpacked immediately, and $ is never mentioned again -- { $ = (`x,`y); x+y } ;
  • If `$ isn't explicitly unified, then it is unified with the whole expression: {$+1} means {`$=$+1}.
  • A list is given by elements in square brackets separated by spaces. The +> operator adds an element to the head of the list and is invertible.

Here is the classic list append program (using the caseP procedure, rather than the almost identical case syntactic sugar):

`append = {
   $ = (`x,`y); # 2 input lists
   caseP [
       { x=[]; y }
       { x = `hdx +> `tlx;
         hdx +> append(tlx,y) }
   ] ()
}

print( append([1 2],[3 4])); # [1 2 3 4]
[1 2 3 4] = append([1 2],print(`a)); # [3 4] -- print returns its argument
[1 2 3 4] = append(print(`b),[3 4]); # [1 2]

Consider the last line. Execution proceeds concurrently:
  • x is unified with print(`b) and y with [3 4];
    • print is called with its `$ set to the hole x, and its input set to the hole `b. Since it is going to have an effect it has to stall waiting for one or other to be filled. If there were any later effects they would also stall, even if ready to go, because of a lexical ordering requirement.
  • At the same time caseP is called with input set to unit (=()), and output set to the output of the whole procedure (i.e. [1 2 3 4]) since it is the last expression. Now caseP calls all procedures in its list expecting precisely one to succeed. In this case:
    • Closures execute in a sandbox where unifications with holes from outside are tentative and only make it out if the procedure doesn't fail. If the outside hole gets filled in while the closure is executing then the unification is made firm if it agrees with the tentative binding, or the closure fails if it doesn't.
    • So when we look at the first procedure in the caseP, it tentatively unifies x with [], then tries to unify y=[3 4] with `$=[1 2 3 4]. This fails, so that closure fails.
    • At the same time we start the more complex 2nd closure. The first line creates a relationship between the 3 holes: x, hdx and tlx. The 2nd line then unifies [1 2 3 4] with (hdx +> append(tlx,y)). This sets hdx=1 and unifies [2 3 4] with append(tlx,y). So we call append recursively with `$=[2 3 4] and $=(tlx,y).
    • The next time append is called we have `$=[3 4], and then the first closure succeeds (while the 2nd fails). When it returns it establishes its parent's y as [3 4], tlx as [] and hdx as 2. This resolves the previous line, giving x=[2].
    • When this returns the output of print(`b) is unified with [1 2] which in turns sets b to [1 2] and allows the print to proceed.
    • If we weren't going to use b subsequently we could have just written print(_) because _ is always a new hole.
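The backward runs of append can be seen in executable form with a toy relational append in Python, in the style of a miniature Kanren interpreter. It is purely illustrative and far simpler than Wombat's concurrent machinery (no occurs check; lists encoded as nested pairs):

```python
class Var:
    """A logic variable (a hole)."""

def walk(t, s):
    while isinstance(t, Var) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    """Return an extended substitution, or None on failure."""
    a, b = walk(a, s), walk(b, s)
    if a is b or a == b:
        return s
    if isinstance(a, Var):
        return {**s, a: b}
    if isinstance(b, Var):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def appendo(x, y, z, s):
    """Relational append: yield substitutions making x ++ y == z."""
    s1 = unify(x, (), s)            # case 1: x = [], y = z
    if s1 is not None:
        s2 = unify(y, z, s1)
        if s2 is not None:
            yield s2
    h, t, r = Var(), Var(), Var()   # case 2: x = h:t, z = h:r
    s1 = unify(x, (h, t), s)
    if s1 is not None:
        s2 = unify(z, (h, r), s1)
        if s2 is not None:
            yield from appendo(t, y, r, s2)

def to_pairs(xs):
    return (xs[0], to_pairs(xs[1:])) if xs else ()

def from_pairs(p, s):
    out = []
    p = walk(p, s)
    while p != ():
        out.append(walk(p[0], s))
        p = walk(p[1], s)
    return out

# run "backwards": [1 2 3 4] = append(b, [3 4]) fills in b = [1 2]
b = Var()
sol = next(appendo(b, to_pairs([3, 4]), to_pairs([1, 2, 3, 4]), {}))
print(from_pairs(b, sol))  # [1, 2]
```

The same appendo runs forwards too, which is the whole point of doing everything by unification.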

Saturday, May 6, 2017

Networks of People

When you want to build a network, such as the Internet or the national road network, you don't put in a lot of random links. Instead you have a mixture of local links forming local clusters, and long distance links between clusters. This can continue for more levels.
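A toy sketch (illustrative Python, all sizes invented) shows why this layout is efficient: dense local links plus a handful of long-distance links give a small diameter with far fewer edges than linking everyone to everyone:

```python
from collections import deque
from itertools import combinations

def build_network(n_clusters=5, cluster_size=6):
    """Toy two-level network: dense local links inside each cluster,
    one long-distance link between each pair of clusters."""
    clusters = [list(range(c * cluster_size, (c + 1) * cluster_size))
                for c in range(n_clusters)]
    edges = set()
    for members in clusters:
        edges.update(combinations(members, 2))         # local links
    for ca, cb in combinations(range(n_clusters), 2):
        edges.add((clusters[ca][0], clusters[cb][0]))  # one long link per pair
    return n_clusters * cluster_size, edges

def diameter(n, edges):
    """Longest shortest path, by BFS from every node."""
    nbrs = {v: set() for v in range(n)}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    worst = 0
    for src in range(n):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            v = queue.popleft()
            for w in nbrs[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        worst = max(worst, max(dist.values()))
    return worst

n, edges = build_network()
print(len(edges), diameter(n, edges))  # 85 edges, diameter 3 (a complete graph on 30 nodes needs 435)
```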

Now we come to the interesting observation that people divide, corresponding roughly to the right/left divide in politics, into those who want friendship and support to extend only to their local group (or groups), and those who want to extend help and support more broadly, with some extending that to all humanity. This division is just what you need to efficiently build networks of people.

A key feature of humans is our ability to move from simple things to multi-level recursively defined things. We see that most obviously in human language compared to the simpler communications of our related species. And indeed it is notable that nations relate to each other in ways that seem similar to the way individual humans interact.

Friday, July 29, 2016

The True Believers

Beliefs are either objective, derived ultimately from evidence, or subjective and derived from some internal revelation or the belief that some other person has had a true internal revelation and communicated it honestly. While not wanting to denigrate people’s beliefs in subjective truths, the success of Science and the triumphs of our modern world rest entirely on objective truths. Yet the voting public has partly lost faith in the people claiming to be purveyors of objective truth. The water is muddied by people lying for fun and profit, and by Science shooting itself in the foot with a lot of overconfident deduction and speculation having little or no support.
There is an urgent need to restore the public’s faith in Science and objective truth, which are the foundations of our civilization. Can we perhaps get some intelligent supporters of objective truth to make some sacrifices that will bring the matter to the public’s attention in a good way? Take the following as an example of the sort of thing that might work. Better ideas are welcome.
The proposal is to create a Monastery (or hopefully a network of many Monasteries) dedicated to determining the objective truth, and to discovering and “excommunicating” those attempting to pervert the search for the truth. Those who had merely strayed, making bad deductions or trusting the wrong people, would be formally forgiven and “born again”.
The monks would put their wealth in a financial Trust, and wear some modest uniform. Trusted folk from outside can act as lay advisers. There are investigations, in which evidence is collected. Then there are trials in which the two sides are debated. When nobody is available to support one side, devil’s advocates are appointed. Monks clinging to views that most monks think have been completely disproved can be demoted back to applicant status and thus expelled from the monastery.
And, of course, all this is live streamed on the Internet. Viewers can participate to varying degrees based on their level. The highest levels are applicants (wanting to become monks) and lay advisors (who typically are unable to become monks for some reason). Next are trainees learning to evaluate objective truth and preparing for tests that will get them up to that higher level. Finally there are supporters. All these lower levels can make themselves available to help the monks. The general public have a ringside seat on this vigorous search for the truth and the truthful, and the key objective is to get them to understand and support it.
Finally a little riff on the desirability of the truth. The truth can’t lead us astray if properly understood. However properly understanding it is not always easy when our culture has implanted so many subjective truths so firmly in our brain.

Let’s take a simple case: bullying. You will pardon my expression of personal non-expert opinion as if it were truth. It is purely illustrative, so it doesn’t matter if wrong. Humans used to form groups of several hundred related individuals divided into families. The families have a status order, like a pecking order in chickens, and it is there for the same reason as all status hierarchies: because without it there would need to be a fight over each conflicting intention. The human situation is similar to (some species of) macaques, with family status passed from mother to daughter. Everyone needs to know their place and one of the ways this is done is by bullying. If A bullies B and B’s relatives don’t rush in then B knows that A and A’s family have higher status. In its natural setting bullying is only necessary when there is doubt.

Of course in our city lifestyle this all breaks down. Status is a mess, with endless struggles leading to a lot of bullying. Now I would say that we need to understand this to make good decisions about dealing with bullying (which doesn’t even do its job in our society). But a lot of people would say “You can’t say that bullying is natural. That condones it.” Actually it only condones it if you have the subjective idea that human nature is good by default and departures from the good represent some malfunction. The correct view is that human evil, such as bullying, needs to be dealt with in human ways, not by trying to find and fix a malfunction.
Still we need to accept that the general public is not going to easily accept things which contradict their firmly implanted subjective truths. Such matters need to be dealt with carefully, and avoided as much as possible.

[update: This interacts with the proposed Truth and Expertise Network (previous post) because people can just say "I trust the monastery" and get good feedback on the stuff they read.]

The Truth and Expertise Network

While specific facts and resulting deductions are at the core, in the main we are interested in identifying those who are, and those who are not, good sources of the truth. Most particularly we want to identify those who are lying for their own advantage, since those lies are much more likely to cause harm than merely mistaken beliefs. We are mainly concerned with facts that are relevant to public policy, but even there we come to issues where well-meaning folk would say “you shouldn’t say that, even if it is true”. We’ll leave such delicate considerations until the next blog post.
For those who don’t want the technical details, the general idea is this: Individuals can specify their level of belief about claims made, about the motivation for claims, and about the trustworthiness of other individuals. Software can then warn you about claims based on the claim itself, or the people making it. The software would only follow links from the people you trust (and that they trust, etc). This might need some social engineering to actually work, and that is described in the following blog post.
So the idea is that participants in the scheme will have one or more public-private keypairs. These will be used to sign assertions of various sorts, discussed below. They will be of no use unless (a) people link to those keys in various ways; and (b) the assertions are made public (at the very least to some of the linking people).
People can make their main public key publicly available in places they are known to control, or give it to specific people. They can also have keypairs that they don’t advertise as belonging to them, but endorse as reliable as if they belonged to some other, unknown person. They can then make assertions that can’t be attributed to them but can still be used by those who trust them and the people they trust.
I’ll list (some of) the assertions that can be made. Software running on the user’s machine, on the machines of those she trusts, and on central servers will cooperate to provide the user with warnings of false claims, of claimants lacking the expertise they claim, and of claimants seeking to mislead. Perhaps the most important output will be information about internal contradictions in the trust network. If your trust network supports incompatible claims then that is an indication of a problem, such as people in your trust network being overly confident about an uncertain matter, or infiltration of the trust network by incompetent or bad actors. Tracking these things down will help everybody who wants to get a good handle on the truth.
  • “My prior (belief pending future evidence) for this claim to be true is P%” where P is between 0 and 100. The claim should be a block of text, [+ optionally a wider block of text from the same source providing context], + a URL giving the location.
  • “My prior that this claim is honestly believed by the claimant is …”
  • “I believe the claimant is … [with probability] acting on behalf of … [with probability]”
  • “I trust this person to only express honestly held beliefs”, giving a public key.
  • “I believe this person is an expert on …”
  • “I trust this person to choose others who are trustworthy” (thus allowing an extended network of trust).
Systematizing all that (and more) is a tough job. It is similar to the jobs done by the IETF (Internet Engineering Task Force), and maybe we need an OTETF (Objective Truth Engineering Task Force).
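As a taste of what the software side might look like, here is an illustrative Python sketch of following the "I trust this person to choose others who are trustworthy" links. The product-of-links scoring rule, the threshold, and all the names are my own inventions; the post deliberately leaves the real rule open:

```python
import heapq

def transitive_trust(direct, root, threshold=0.1):
    """Best-chain transitive trust from `root`. `direct[a][b]` is a's
    direct trust in b, in [0, 1]; trust along a chain is the product of
    the links (a hypothetical rule), and the best chain wins. Chains
    that decay below `threshold` are ignored."""
    best = {root: 1.0}
    heap = [(-1.0, root)]
    while heap:
        neg, person = heapq.heappop(heap)
        score = -neg
        if score < best.get(person, 0.0):
            continue                      # stale heap entry
        for other, t in direct.get(person, {}).items():
            s = score * t
            if s >= threshold and s > best.get(other, 0.0):
                best[other] = s
                heapq.heappush(heap, (-s, other))
    return best

web = {                                    # made-up keys and weights
    "you":   {"alice": 0.9, "bob": 0.5},
    "alice": {"carol": 0.8},
    "carol": {"mallory": 0.9},
    "bob":   {"mallory": 0.2},
}
print(transitive_trust(web, "you"))
# carol via alice scores 0.9 * 0.8 = 0.72; mallory's best chain gives about 0.648
```

Because each link multiplies by a factor of at most 1, a best-first (Dijkstra-style) search finds the strongest chain to each person without revisiting settled nodes.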

[update: Naturally browser plugins and other user software will make it as easy as possible for users to participate in this scheme.]

The Truth and Expertise Problem

Katherine Viner, writing in The Guardian, gives a comprehensive overview of the way social media and tailored news are disrupting the truth. It used to be that most of the population saw a common set of reasonably authoritative information from newspapers and TV. Now the real experts struggle to be heard over the cacophony, the truth is mixed up with falsehoods, and the way people get their news is highly likely to reinforce their biases.
But things are not all bad. Previously many lies got firmly established, and Viner gives the example of the Hillsborough tragedy. Now the population has learnt not to trust everything that is written down. This is a necessary step to not being misled. The problem is to get them to take the next step: to collect and evaluate the evidence like a scientist, then to think through the implications like a mathematician.
Well obviously that’s unrealistic. All of us, even the greatest experts in some field, are forced to identify experts that we trust when it comes to areas in which we lack expertise. The problem is that our faith in experts has been seriously eroded. In many ways this is a good thing. We now know that experts have put many people in prison through highly exaggerated claims about DNA and fingerprint identification. We know that many of the medical treatments that doctors have used, and advice they have given, are not supported by the evidence or results. We needed to take this step of treating experts with caution, but not go to the current extreme where expert advice is often completely ignored.
Experts need to be evaluated in some way. They won’t like that, but we need to achieve two things: making sure that the experts we trust are not abusing that trust; and finding the real experts to trust in the midst of so many competing claims. We also need a process that can be respected by the public, so that they and the media are inclined to select real and honest experts for their understanding of important matters.
This is the first of 3 blog posts. The 2nd will discuss a technological solution for identifying what is true, and why, and for identifying who is an expert and who is trying to mislead. The 3rd will provide a rather wild idea for a social experiment that might get the message to the public that there is an objective truth worth pursuing.
[There is a relevant new book “A Survival Guide to the Misinformation Age” by David Helfand, and here’s a review:].

Sunday, June 12, 2016

Gene compatibility for making babies

I presume that the common reason why multiple variants of a gene persist is because variants of one gene work variably with variants of other genes.

To take the simplest case, suppose there are 2 relevant genes, and the first has variants A and X, and the second has variants B and Y. And suppose that either combination AB or XY is good and will make a successful individual, but combinations AY or XB work badly. Even though AY and XB people don’t have many offspring, still they keep coming.

The way evolution tries to handle this is for gene combinations to also cause detectable changes, and to cause detection systems that respond to those changes. In humans this is done in part by smell. So perhaps B people will smell different from Y people, and A people will prefer the smell of B people and X people will prefer the smell of Y people. But obviously there are a lot of genes and this has to be a rough process.
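Some arithmetic on a deliberately over-simplified model (one gene copy per individual, the two loci inherited independently) shows how big the cost of purely random mating would be:

```python
def incompatible_fraction(pA, pB):
    """Fraction of bad AY or XB offspring when gametes carry variant A
    with probability pA and variant B with probability pB, independently.
    A toy haploid, free-recombination model, purely illustrative."""
    return pA * (1 - pB) + (1 - pA) * pB

# With both variants at 50%, half of all offspring get a bad combination --
# strong pressure for some detection system like smell-based partner choice.
print(incompatible_fraction(0.5, 0.5))  # 0.5
print(incompatible_fraction(0.9, 0.9))  # about 0.18
```

Perfect assortment (A-carriers pairing with B-carriers, X with Y) drives that fraction toward zero, which is the service the rough smell-based process, or a future gene-matching one, provides.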

We now have the opportunity to help people find compatible partners, so that no AY or XB babies are born. We also have the opportunity to make some big mistakes. For example if AB people are actually better than XY people (according to some evaluation), then it might seem like a good idea to get rid of X and Y completely so there is no problem. But having multiple variants gives us robustness. Suppose a new beneficial variant of a 3rd gene arises. It may be that it works best with XY individuals rather than AB individuals. If we get stuck with only AB people then maybe we lose the option of benefiting from that 3rd gene variant.

It is not impossible that our 3rd gene variant solves the problems that AY people have and actually works best with them, but is that a good reason not to help people find compatible partners?

Friday, June 10, 2016

The rise of tribalism (and how it helps the rich)

We observe that the Democratic Party in the USA is more likely to take actions that help poor people, while the Republicans like to help the rich. Yet a lot of poorer white folk vote Republican, so that the poorer states often have Republicans in power, amplifying the bad outcomes in those states. This is typically perceived as voter stupidity, which seems unlikely given hundreds of thousands of years of human evolution favouring intelligence. We need to look deeper.

Humans have triumphed in the boom and bust cycles of the Earth's last few million years. What is likely to be a good strategy? When things are good we have lots of kids, and send some of them off to explore and settle. When things are bad the population has to shrink. What do our genes want us to do then? The higher status individuals are most likely to survive and reproduce. So the genes for low status individuals will like it (metaphorically) if it is closely related high status individuals who survive, as opposed to more distantly related individuals. So we should have an increased willingness for low status individuals to make sacrifices to promote the interests of high status individuals in their tribe against other tribes.

But how do we identify relatively more closely related individuals in large societies? In addition to the standard cultural identification (similar language, accent, facial expressions, attire, hair style), it is natural that people take notice of similar appearance.

The rich in America, at least the visible ones, seem to be mostly of European origins. So perhaps there is a reason for poor whites, feeling (mostly incorrectly) that things are going bad, to support the Republican party and its support for hereditary wealth. And perhaps it is instinctive for the high status individuals to promote tribalism. It's certainly working for them today, but as we saw in Europe in the 30s, it can easily get out of hand.

So it would be possible to attack Trump's support by identifying wealthy foreigners and those of non-European heritage who will profit by Trump's and wider Republican plans. Of course this contributes to inflaming social tensions, and should not be used unless there seems to be some chance of Trump winning. After the election we hope Democrats in Congress will move vigorously to address the decline of the middle class, most particularly by printing money and spending it on much needed infrastructure.

Friday, April 15, 2016

David MacKay and individual action

I know that "it rains on the just and on the unjust", but the early death of Prof David MacKay seems to take this too far. What a great guy.

I first learned about him when I was trying to understand modern probability and inference. He wrote an excellent book on this, and made it freely available. It starts out with a nicely self-deprecating story about trying to solve a physics problem that becomes trivial when looked at using Bayesian thinking. Then he used his understanding of inference to build software that enabled highly disabled people to type English on a computer with reasonable efficiency by just steering towards various options with the most probable being the biggest target. And he then went and worked with disabled people to make it work for them. More about these things on his web page (which I hope is preserved for posterity):

He became interested in global warming. He was particularly concerned that politicians were making token plans that had no hope of addressing the problem. As, indeed, they still are. So he wrote a book on sustainable energy. I was already following his blog, so I saw early drafts. In the early drafts nuclear power was classified as unsustainable because of incorrectly low estimates of the amount of available uranium. But in the end the book includes nuclear power in the options for ending the use of fossil fuel. Though the book is written in a balanced way, its effect on me was to convince me that nuclear power is the only answer, and that wind and solar are a distraction from the mammoth investment needed in research and development of next generation nuclear power.

The effect of his beautifully balanced and logical and well researched book was that the British Labour government gave him a big public service job as scientific advisor to the Energy ministry. This enabled him to build a tool for future energy planning that anybody could use. The end result is that Britain is continuing to expand nuclear power while other countries are talking wind and solar, closing down nuclear power and very quietly building ever more coal and gas power plants.

MacKay was big on individual action. He made his house as energy efficient as possible and turned the thermostat down. He rode a bicycle. I think that this sort of thing is counterproductive. Most people are struggling with life and this just puts them in "I don't want to think about that stuff" mode. And, just like turning your phone charger off when not using it, it isn't going to get the job done.

It is a sobering thought that when David MacKay is my age he'll have been dead for 20 years (a modified Tom Lehrer quote).

Saturday, February 27, 2016

21st Century Money

On Google+, I wrote:

Nick Szabo argues, convincingly, that commodities (like gold, oil, and even real estate) become (in part) money, when people use them to store value rather than entirely for their immediate practical value. He then shows how this leads to big swings in commodity prices as people change their mind about which things will be the best place to store value. His belief in gold and silver as a good single store of value didn't convince me. He also fails to note that it is a good thing if people store things which will be more valuable in the future than they are now. That has the effect of driving prices, and hence supply, up sooner rather than leaving it to the last minute. Speculators and hoarders provide a useful service! Information technology allows us to manage multiple currencies simultaneously allowing a more coherent attack on the issue than just hoping gold will do the job.

To expand on that last sentence: Banks should make it easy for their customers to hold their balances in a mix of currencies. When payments are made then conversion is done into the agreed exchange currency (such as the local central currency) according to a prescription provided by the customer. And inversely for incoming payments. Presumably banks will compete to cover the currencies that customers want and to provide reasonable conversion rates. 
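Here is an illustrative Python sketch of how a payment against a mixed-currency balance might follow the customer's prescription. The currencies, rates and the pro-rata rule are all invented for illustration:

```python
def pay(balances, amount, rates, prescription):
    """Draw `amount` (denominated in the agreed exchange currency) from a
    multi-currency balance, funding it according to the customer's
    prescribed ratios. `rates[c]` is units of exchange currency per unit
    of currency c. All names and numbers here are made up."""
    assert abs(sum(prescription.values()) - 1.0) < 1e-9
    for cur, frac in prescription.items():
        needed = amount * frac / rates[cur]   # units of `cur` to convert
        if needed > balances[cur]:
            raise ValueError("insufficient " + cur)
        balances[cur] -= needed
    return balances

balances = {"gold_g": 10.0, "local": 500.0}
rates = {"gold_g": 50.0, "local": 1.0}        # assumed spot rates
prescription = {"gold_g": 0.2, "local": 0.8}  # fund 20% of payments from gold
print(pay(balances, 100.0, rates, prescription))  # gold down to 9.6 g, local to 420
```

Incoming payments would run the same prescription in reverse, buying into the customer's preferred mix.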

Banks or governments could provide a currency that covers a (slowly changing) basket of valuable commodities. It would be important to do this in a very open and transparent way, so that there is confidence that the backing commodities do exist and can all be recovered by the currency owners (given large enough amounts of the currency). But banks should also allow customers to keep a balance of commodities according to the customer's prescribed ratios.

Monday, July 27, 2015

Village Clan Tribe

The anonymous nature of the Internet is one of the things that makes Internet security difficult to get right. The anonymous nature of modern daily life affects real life security too. Consider the way loners can accumulate weaponry and then cause a sudden massacre. But the following proposal isn’t intended to address these issues. It’s just meant to be fun. It is also meant to allow experiments with social structure on the Internet that may help with practical concerns in the future.
The idea is to get people to self-organize into a hierarchy and let the participants try using that for various social purposes. The levels of the hierarchy are: Village, Clan, Tribe, Nation, Empire.
Participants join some Village that will take them, or they get together with others to create one. A Village has from 64 to 128 people, at most ⅔ of one sex. The only rule is that every member must have a brief face-to-face meeting, perhaps in a small group, either live or by video link, with more than half the other members of the Village on some regular basis (and covering every other member on some longer time scale). The intention is that everyone knows everyone else, and a little bit about them. Belonging to more than one Village is not allowed, but is not necessarily detectable.
Once there are 32 Villages then every Village belongs to a Clan. A Clan has 32 to 64 Villages. Once there are 32 Clans they belong to a Tribe. And so on upwards, but it can’t go much further.
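The membership rules above are crisp enough to check mechanically. Here is a minimal sketch; the member-record format (a list of name/sex pairs) is my assumption for illustration.

```python
from collections import Counter

def village_ok(members):
    """Check the Village rules: 64 to 128 people, at most
    two-thirds of one sex. `members` is a list of (name, sex) pairs."""
    n = len(members)
    if not 64 <= n <= 128:
        return False
    by_sex = Counter(sex for _, sex in members)
    # "at most 2/3 of one sex": no count may exceed 2n/3
    # (written as 3*count <= 2*n to avoid fractions)
    return all(3 * count <= 2 * n for count in by_sex.values())

def clan_ok(villages):
    """Check the Clan rule: 32 to 64 Villages, each itself valid."""
    return 32 <= len(villages) <= 64 and all(village_ok(v) for v in villages)
```

For example, a Village of 42 men and 22 women passes (42 × 3 = 126 ≤ 128), while 43 and 21 fails (43 × 3 = 129 > 128).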
What the Village does is to be decided democratically, and Clan matters by representative democracy. Every Village appoints 2 representatives to the Clan, not of the same sex. These representatives are then in contact with the other representatives on a regular basis, and thus know a little bit about the other Villages in the Clan. And so on up the line.
When Villages (or Clans or Tribes) get too big they need to reorganize. It is just, barely, possible for them to get to 128 and split into two 64s. A more interesting option is for 2 Villages to get together and emerge as three. (Clans and above are perhaps more likely to take the split option.)
Individuals have unique names or pseudonyms in a Village. Village names are unique in a Clan, and so on. So this gives every participant a unique identification.
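That naming scheme is just a tree in which each level enforces uniqueness among its children, making the full dotted path globally unique. A small sketch, with all names invented:

```python
class Node:
    """One level of the hierarchy (person, Village, Clan, Tribe, ...).
    A parent rejects duplicate child names, which is what makes the
    fully qualified path unique."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}
        if parent is not None:
            if name in parent.children:
                raise ValueError("name '%s' already taken at this level" % name)
            parent.children[name] = self

    def full_id(self):
        """Dotted path from individual up to the top level."""
        parts, node = [], self
        while node is not None:
            parts.append(node.name)
            node = node.parent
        return ".".join(parts)
```

So a participant's identification might read like `alice.riverview.bearclan.northtribe`, and registering a second "alice" in the same Village is rejected.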
Villages will, in general, have additional future members who are not yet qualified (minors who have a parent in the village, or candidates who haven’t yet met other villagers). Similarly Clans can have incipient “future” Villages that don’t yet meet the rules, and villages are demoted to that status when they break the rules. Things will presumably be more stable higher up.
I imagine that a significant activity will be jokey us-vs-them cultural activity. I.e. Villages will have special, perhaps partly secret, dress rules, handshakes, idioms, and musical activities. Villages will socialize with other Villages in some ways. The aim will be to grow Villages by attracting compatible people, then perhaps get to the mock-serious business of getting together with another Village to produce an offspring.
Note that there is a presumption of privacy beyond the Village level, apart from the reps at the next level up, and village activities that people have agreed to take part in. A high level objective of this proposal is to experiment with cryptographic support for making this work in ways that combine security with the medium level of privacy which we all enjoyed until recent decades.

Tuesday, July 14, 2015

Multiplicity of currencies

The application of computers to economic life is about managing complexity. Think of the systems that keep planes full and airfares cheap, compared to the old paper ticketing system. Handling complexity means that you can get closer to an optimal system. When you can't handle complexity you have to accept inefficiencies to simplify the process.

Fiat currencies, like the Euro and national currencies, are just such a simplification that leads to a non-optimal solution. We see that non-optimality starkly in Greece (and Puerto Rico, and Detroit) where a semi-independent government body gets into financial trouble, and the resulting exodus of the best people from that area exacerbates the disaster.

We don't need fiat currencies any more. We can handle the complexity of multiple proof-of-work currencies. You go to a shop that wants payment in gold. You point your phone at the displayed price and it displays the conversion in the currencies you hold. Online purchases are even easier. Of course we are assuming a mutually trusted body to hold and track the ownership of that physical gold. This can be augmented with cryptographic currencies (like bitcoin) and by cryptographic proof of transfer of physical stuff (like gold) held centrally by one or more trusted parties.
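The phone-side conversion is a one-line lookup. In this sketch the rate table (units of each currency per gram of gold) and the currency codes are invented for illustration:

```python
# Hypothetical rates: units of each currency per gram of gold.
RATES_PER_GOLD_G = {
    "EUR": 55.0,
    "BTC": 0.006,
    "AG_G": 75.0,   # grams of silver
}

def show_price(price_gold_g, holdings):
    """Re-express a price quoted in grams of gold in every
    currency the customer actually holds a balance of."""
    return {cur: price_gold_g * RATES_PER_GOLD_G[cur]
            for cur in holdings if cur in RATES_PER_GOLD_G}
```

A 2-gram price shown to someone holding euros and bitcoin would come back as both an EUR figure and a BTC figure, and they pick which balance to spend.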

For small payments fiat currencies will continue to be important, and we can presume that people selling small amounts of stuff will continue to be required to accept them.

One of the advantages of fiat currencies is that they enable the government to raise money easily in an emergency (such as a war) by simply printing it. This is a tax on people who happen to hold the currency or who are owed money denominated in the currency. This will likely continue to work for small items, while for large items IOUs denominated in various currencies can be required to be accepted in a proclaimed emergency.

There is a really big advantage of moving to a proof-of-work currency that is backed by physical stuff held by multiple trusted central services: The physical stuff doesn't have to be useless (like gold or bitcoin). Instead it can be non-perishable stuff that might be useful at some future time. It might, in other words, double as actual preparation for some difficult future circumstances.

Sunday, July 12, 2015

3 Climate-related news items

The first news item contains the key quote: "for the past 1 million years, carbon dioxide levels never rose above about 300 ppm". This means we're in uncharted territory, and even if we didn't know anything about the probable effects we should still stop pumping CO2 ASAP. But we do know some highly probable effects: climate change and surface ocean acidification (more H+ ions, fewer OH- ions, even though the water stays on the basic side of neutral).

Two other items remind us of things that nature might throw at us.

The 20th century had few major geological events. The 21st century has started with two huge earthquakes. What might happen if we have some big volcanic eruptions? Recent analysis of ice cores and tree rings is very interesting.

There is also the possibility of a Maunder Minimum in the sun's activity. Some modellers claim to have an improved model of the sun's sunspot cycle, and that model predicts a Maunder Minimum in the next 20 years. The last one was associated with, and may have been a contributing cause of, the Little Ice Age (around 1700).

It would be interesting to see the climate models of a couple of big volcanoes going off close to each other in time, during a maunder minimum of solar activity, combined with a lot of open ocean in the Arctic pumping moisture into the atmosphere to fall as snow...

These scientific investigations act as an antidote to the impression non-scientist Greens try to give that all bad things come from humans and all natural things are good. But they don't affect the imperative to stop pumping CO2. Maybe we've lucked out by making it a bit warmer before a cold snap hits, but after the cold snaps we'll race quickly back to the trend line of warming. And, of course, acidification will be an added stress on ocean surface life during any cold snaps.

The way to prepare for nature's cold snaps is to have lots of spare capacity of cheap carbon-free energy, so that we can do a lot of farming indoors with artificial light. The way to get that is with advanced nuclear power. Needless to say: solar power is not going to work during the next "year without a summer".

Monday, July 6, 2015

Countries can save without causing financial disasters

Germany and China continue to demand the right to export more than they import. How do they do that? Obviously there must be more money coming in than going out. This has to be balanced by money going out that does not have anything coming back immediately, which is foreign investment in various forms.
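The accounting identity in that paragraph can be shown as a toy calculation (invented numbers): a trade surplus must be matched, one-for-one, by money flowing back out as foreign investment or lending.

```python
# Toy balance-of-payments arithmetic with invented figures.
exports = 1200.0   # money coming in for goods sold abroad
imports = 900.0    # money going out for goods bought abroad
trade_surplus = exports - imports   # net inflow from trade

# For payments to balance, the same amount must leave as foreign
# investment, loans, or purchases of foreign assets (e.g. US bonds):
net_foreign_investment = trade_surplus
assert exports == imports + net_foreign_investment
```

In other words, a country that insists on exporting more than it imports is also, whether it likes it or not, insisting on accumulating claims on foreigners.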

The Chinese used to do this by buying US bonds. Then when the US decided to print money to buy bonds (and thus keep the value of bonds low), the Chinese realized they'd lost control of their money. We now see a lot more real investment in external real world assets. They still don't ultimately control that. Let's hope they never try to.

The Germans have a multinational currency. So all they need to do is find a sucker in the EU to lend it to. This gets harder and harder, so they've been lending to increasingly poor credit risks. Then, when it blows up in their faces, they claim that the suckers must go to the equivalent of debtors' prison.

Repeatedly down the years we hear China and Germany claim the moral high ground: "Why doesn't everyone live within their means and export more than they import, like us?" When people point out that this is totally moronic, they go quiet for a while. But they don't change what they're doing. And then they say it again.

Countries can, and should, prepare for the future, but they can't do it by saving money as ordinary folk do. They need to actually acquire real-world stuff that will make the future safer. A good choice is to build energy generation systems that are relatively expensive to build (high capital cost) but then have relatively low running costs. If you have cheap energy you can then make other stuff. Another option is to stockpile raw materials that are not easily available within your borders, though that can go wrong if future production switches away from that particular material.