Sapiens: A Brief History of Humankind

Jon Jagger from less code, more software

is an excellent book by Yuval Noah Harari (isbn 978-0-099-59008-8)
As usual I'm going to quote from a few pages.

Whereas chimpanzees spend five hours a day chewing raw food, a single hour suffices for people eating cooked food.
Since long intestines and large brains are both massive energy consumers, it's hard to have both. By shortening the intestines and decreasing their energy consumption, cooking inadvertently opened the way to the jumbo brains of Neanderthals and Sapiens.
Large numbers of strangers can cooperate successfully by believing in common myths.
There is some evidence that the size of the average Sapiens brain has actually decreased since the age of foraging. Survival in that era required superb mental abilities from everyone.
What characterises all these acts of communication is that the entities being addressed are local beings. They are not universal gods, but rather a particular deer, a particular tree, a particular stream, a particular ghost.
The extra food did not translate into a better diet or more leisure. Rather, it translated into population explosions and pampered elites.
The new agricultural tasks demanded so much time that people were forced to settle permanently next to their wheat fields.
This is the essence of the Agricultural Revolution: the ability to keep more people alive under worse conditions.
One of history's few iron laws is that luxuries tend to become necessities and to spawn new obligations.
Evolution is based on difference, not on equality.
There is not a single organ in the human body that only does the job its prototype did when it first appeared hundreds of millions of years ago.
The mere fact that Mediterranean people believed in gold would cause Indians to start believing in it as well. Even if Indians still had no real use for gold, the fact that Mediterranean people wanted it would be enough to make the Indians value it.
The first religious effect of the Agricultural Revolution was to turn plants and animals from equal members of a spiritual round table into property.
The monotheist religions expelled the gods through the front door with a lot of fanfare, only to take them back in through the side window. Christianity, for example, developed its own pantheon of saints, whose cults differed little from those of the polytheistic gods.
Level two chaos is chaos that reacts to predictions about it, and therefore can never be predicted accurately.
In many societies, more people are in danger of dying from obesity than from starvation.
Each year the US population spends more money on diets than the amount needed to feed all the hungry people in the rest of the world.
Throughout history, the upper classes always claimed to be smarter, stronger and generally better than the underclasses... With the help of new medical capabilities, the pretensions of the upper classes might soon become an objective reality.


The Culture Code

Jon Jagger from less code, more software

is an excellent book by Daniel Coyle (isbn 978-1-847-94127-5)
As usual I'm going to quote from a few pages.

Much of the connection happens around the dinner table, as Popovich is obsessed with food and wine.
One misconception about highly successful cultures is that they are happy, lighthearted places. This is mostly not the case. They are energized and engaged, but at their core their members are oriented less around achieving happiness than around solving hard problems together.
Allen could find none that played a meaningful role in cohesion. Except for one. The distance between their desks.
For the vast majority of human history, sustained proximity has been an indicator of belonging.
It's important to avoid interruptions. The smoothness of turn taking, as we've seen, is a powerful indicator of cohesive group performance.
The groups I studied had extremely low tolerance for bad apple behaviour.
The groups I visited were uniformly obsessed with design as a lever for cohesion and interaction.
He also had the company buy nicer coffee machines and install them in more convenient gathering places.
Merely replacing four-person tables with ten-person tables has boosted productivity by 10 percent.
Kauffman decreed that every aspect of training be team-based.
It's very hard to be empathic when you're talking.
The road to success is paved with mistakes well handled.
We should have made the hallways wider. We should have made the cafe bigger.


Memory capacity growth: a major contributor to the success of computers

Derek Jones from The Shape of Code

The growth in memory capacity is the unsung hero of the computer revolution. Intel’s multi-decade, annual billion-dollar marketing spend has ensured that cpu clock frequency dominates our attention (a lot of people don’t know that memory is available at different frequencies, and this can have a larger impact on performance than cpu frequency).

In many ways memory capacity is more important than clock frequency: a program won’t run unless enough memory is available but people can wait for a slow cpu.

The growth in memory capacity of customer computers changed the structure of the software business.

When memory capacity was limited by a 16-bit address space (i.e., 64k), commercially saleable applications could be created by one or two very capable developers working flat out for a year. There was no point hiring a large team, because the resulting application would be too large to run on a typical customer computer. Very large applications were written, but these were bespoke systems consisting of many small programs that ran one after the other.

Once the memory capacity of a typical customer computer started to regularly increase it became practical, and eventually necessary, to create and sell applications offering ever more functionality. A successful application written by one developer became rarer and rarer.

Microsoft Windows is the poster child application that grew in complexity as computer memory capacity grew. Microsoft’s MS-DOS had lots of potential competitors because it was small (it was created in an era when 64k was a lot of memory). In the 1990s the increasing memory capacity enabled Microsoft to create a moat around their products, by offering an increasingly wide variety of functionality that required a large team of developers to build and then support.

GCC’s rise to dominance was possible for the same reason as Microsoft Windows. In the late 1980s gcc was just another one-man compiler project; others could not make significant contributions because the resulting compiler would not run on a typical developer computer. Once memory capacity took off, it was possible for gcc to grow from the contributions of many, something that other one-man compilers could not do (without hiring lots of developers).

How fast did the memory capacity of computers owned by potential customers grow?

One source of information is the adverts in Byte (the magazine); lots of pdfs are available, and perhaps one day a student with some time will extract the information.

Wikipedia has plenty of articles detailing cpu performance, e.g., Macintosh models by cpu type (a comparison of Macintosh models does include memory capacity). The impact of Intel’s marketing dollars on the perception of computer systems is a PhD thesis waiting to be written.

The SPEC benchmarks have been around since 1988, recording system memory capacity since 1994, and SPEC make their detailed data public :-) Hardware vendors are more likely to submit SPEC results for their high-end systems than for their run-of-the-mill systems. However, if we are looking at rate of growth, rather than absolute memory capacity, the results may be representative of typical customer systems.

The plot below shows memory capacity against date of reported benchmarking (which I assume is close to the date a system first became available). The lines are fitted using quantile regression, with 95% of systems being above the lower line (i.e., these systems all have more memory than those below this line) and 50% being above the upper line (code+data):

Memory reported in systems running the SPEC benchmark on a given date.

The fitted models show the memory capacity doubling every 845 or 825 days. The blue circles are memory that comes installed with various Macintosh systems, at time of launch (memory doubling time is 730 days).
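
A minimal sketch of the fit described above (not the blog’s own code+data, which is linked from the post): given a table of SPEC results with a test date and reported memory, regress log2(memory) on days elapsed at the 5% and 50% quantiles, then convert each slope into a doubling time. The file and column names here are assumptions for illustration.

    # Sketch: estimate memory-capacity doubling time from SPEC results
    # using quantile regression (file and column names are assumptions).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("spec_results.csv", parse_dates=["test_date"])  # hypothetical file
    df["days"] = (df["test_date"] - df["test_date"].min()).dt.days
    df["log2_mem"] = np.log2(df["memory_mb"])                        # hypothetical column

    # Lower line: 95% of systems sit above it (5% quantile); upper line: median.
    for q in (0.05, 0.50):
        fit = smf.quantreg("log2_mem ~ days", df).fit(q=q)
        slope = fit.params["days"]               # log2(memory) gained per day
        print(f"q={q:.2f}: doubling time = {1.0 / slope:.0f} days")

Because growth is roughly exponential, a slope of s in log2 units per day corresponds to a doubling time of 1/s days, which is how figures like the 845 and 825 days quoted above fall out of the fit.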

How did applications’ minimum required memory grow over time? I have patchy data for a smattering of products, extracted from Wikipedia. Some vendors probably required customers to have a fairly beefy machine, while others went for a wider customer base. Data on the memory requirements of the various versions of products launched in the 1990s is very hard to find. Pointers very welcome.

Bad Luck Comes In Ks – a.k.

a.k. from thus spake a.k.

Lately we have been looking at Bernoulli processes, which are sequences of independent experiments, known as Bernoulli trials, whose successes or failures are given by observations of a Bernoulli distributed random variable. Last time we saw that the number of failures before the first success was governed by the geometric distribution, which is the discrete analogue of the exponential distribution and, like it, is a memoryless waiting time distribution in the sense that the distribution for the number of failures before the next success is identical no matter how many failures have already occurred whilst we've been waiting.
This time we shall take a look at the distribution of the number of failures before a given number of successes, which is a discrete version of the gamma distribution which defines the probabilities of how long we must wait for multiple exponentially distributed events to occur.
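
As a concrete illustration of the two distributions being discussed (a sketch, not code from the post), scipy parameterises the number of failures before the r-th success as a negative binomial, with r = 1 giving the geometric-on-failures case, so both the memoryless property and the "failures before a given number of successes" distribution can be checked numerically:

    # Sketch: failures before successes in a Bernoulli process, via scipy.
    from scipy.stats import nbinom

    p = 0.3                          # probability of success on each trial

    # Failures before the FIRST success: the geometric distribution on {0, 1, 2, ...}
    geom_failures = nbinom(1, p)

    # Memorylessness: P(K >= m+n | K >= m) equals P(K >= n).
    m, n = 4, 3
    lhs = geom_failures.sf(m + n - 1) / geom_failures.sf(m - 1)
    rhs = geom_failures.sf(n - 1)
    print(abs(lhs - rhs) < 1e-12)    # True

    # Failures before the r-th success: the negative binomial distribution,
    # the discrete analogue of the gamma distribution mentioned above.
    r = 5
    print(nbinom(r, p).pmf(10))      # P(exactly 10 failures before the 5th success)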

The big mistake with Platform Product Owners and what to do about it

AllanAdmin from Allan Kelly Associates

From time to time I come across software platform teams – also called infrastructure teams. Such teams provide software which is used by other teams rather than end customers; as such, they are one step, or even more, removed from customers.

Now I will admit part of me doesn’t want these teams to exist at all, but let’s save that conversation for another day. I acknowledge that in creating these teams organisations act with the best intentions, and there is a logic to the creation of such teams.

It is what happens with the Product Owners that concerns me today.

Frequently these teams struggle with product owners.

Sometimes the teams don’t have product owners at all: after all, these teams don’t have normal customers; they exist to do work which will enhance the common elements and therefore benefit other teams, who in turn benefit customers. So, the thinking goes, coders should just do what they think is right because they know the technology best.

Sometimes an architect is given the power of product ownership: again, the thinking is that as the team is delivering technology to technologists, someone who understands the technology is the best person to decide what will add value.

And sometimes a product owner exists but they are a developer; they may even still have development responsibilities and have to split their time between the two roles. Such people obtain their role not because of their marketing skills, their knowledge of customers or because they are good at analysing user needs. Again, it is assumed that they will know what is needed because they know the technology.

In my book all three positions are wrong, very wrong.

A platform team absolutely needs a customer-focused product owner: a product owner who can appreciate that the team has two tiers of customers. First, other technology teams, and then, beyond them, actual paying customers. This means understanding the benefit to be delivered is more difficult; it should not be a case of ducking the issue, it should be a case of working harder.

If the platform team are to deliver product enhancements that allow other teams to deliver benefit to customers then it is not a case of “doing what the technology needs.” It is, more than ever, a case of doing things that will deliver customer benefit.

Therefore, platform teams need the strongest and best product owners who have the keenest sense of customer understanding and the best stakeholder management skills because understanding and prioritising the work of the platform team is a) more difficult and b) more important.

A platform team that is not delivering what other teams need does more damage to more teams and customers – in terms of benefit not delivered – than a regular team that just delivers to customers. Sure, the PO will need to understand the technology and the platform, but that is always the case.

So, to summarise and to be as clear as possible: Platform teams need the best Product Owners you have available; making a technical team member, one without marketing and/or product ownership experience, the product owner is a mistake.
