Science will follow a sigmoid curve, not an exponential curve

This is what a sigmoid curve looks like.

[image: a sigmoid curve]

All kinds of things naturally follow sigmoid curves - neurons in the brain, populations of animals (including humans) - and I think it will soon be agreed that scientific progress follows such a curve as well.
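For concreteness, this is the generic logistic form I have in mind - the parameters here are purely illustrative, not fit to any data:

```latex
% Generic logistic (sigmoid) curve: L is the ceiling, k the growth rate,
% t_0 the midpoint. All three are illustrative parameters, not fit to data.
f(t) = \frac{L}{1 + e^{-k(t - t_0)}}
% Well below the midpoint (t << t_0) the exponential term dominates the
% denominator, so the curve is indistinguishable from exponential growth:
f(t) \approx L\, e^{k(t - t_0)}
```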

For the past few decades, we have all been convinced that we’re on an exponential growth curve. If you look at the left half of a sigmoid curve, it certainly looks exponential. Moore’s law, for instance, appeared to be exponential for quite a while, but it has been “slowing down”. Sure, we can keep stacking cores together, but check out this graph:

Frequency and single-core performance have already topped out. Why is this? One big reason is that we have reached the physical limits of the underlying materials: transistors are now so small that we have to account for quantum effects such as electron tunneling.

At the same time, we’re seeing the costs of all kinds of research skyrocket. One of the most profound examples is the cost of drug research. We have picked much of the low-hanging fruit, so innovation becomes more expensive and, by extension, much slower.

I used to think that Star Wars and Star Trek technology, only a few hundred years or so ahead of our own, would surely underestimate the future. I figured that we would ultimately be living like hypercosmic gods - such as the Q Continuum - within a few thousand years. Surely, with the current exponential growth of our understanding, wouldn’t we inevitably arrive at a state of being that is utterly incomprehensible to our present minds? Perhaps that’s not our future after all…

We’re seeing this cost inflation, these diminishing returns, in all sorts of fields. Hubble cost $1.5B, while James Webb cost $10B (and climbing). Even archaeology is becoming slower, more difficult, and more expensive: as we exhaust the discoveries close to the surface, we must dig deeper and use more expensive ground-penetrating sensors. On top of that, there are social and political pressures against some kinds of progress.

It is true that we have enjoyed the most explosive growth in all of human history over the last few decades, but I think we will soon see rising costs and other sources of friction slow progress down. What does this mean for us? I now believe that the singularity will never happen. I think we’re probably near the steepest part of the sigmoid curve, and that within a decade or two we will see progress across the board slow down drastically.

What does this mean for AI? Why am I writing this here?

I think that it’s important to have an accurate set of expectations when thinking about the future. If we believe that everything will continue to accelerate indefinitely, that creates certain pressures, certain needs. For instance, if we are barreling towards AGI and then a superintelligence that will rapidly outstrip human intelligence, then we must take certain precautions.

However, I think that human brains are, in many respects, always going to outperform machines. Evolution has created a 5 petaflop computer that runs on 10W of energy. The NVIDIA DGX A100 runs at about 5 petaflops and uses a minimum of 250W - that makes it roughly 25x less energy-efficient than our brains (and that’s assuming the relative computing power is actually accurate - it’s possible that human brains operate in the exaflop range). Thermodynamics might be the greatest restriction on our progress in computational horsepower. If not, then we absolutely need to be worried about Skynet.
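To make that comparison explicit, here is the back-of-the-envelope arithmetic using the figures quoted above (5 petaflops at 10W for the brain, 5 petaflops at 250W for the DGX A100 - these are the numbers I’m assuming here, and the EDIT below revises the brain estimate upward):

```python
# Back-of-the-envelope efficiency comparison using the figures quoted above.
# These are the post's assumed numbers, not independently measured values.
PETA = 1e15

brain_flops = 5 * PETA   # assumed brain throughput (see the EDIT below for a revision)
brain_watts = 10         # assumed brain power draw

dgx_flops = 5 * PETA     # quoted DGX A100 throughput
dgx_watts = 250          # quoted power figure from the post

brain_efficiency = brain_flops / brain_watts   # FLOPS per watt
dgx_efficiency = dgx_flops / dgx_watts         # FLOPS per watt

print(f"Brain:    {brain_efficiency:.2e} FLOPS/W")
print(f"DGX A100: {dgx_efficiency:.2e} FLOPS/W")
print(f"Ratio:    {brain_efficiency / dgx_efficiency:.0f}x")   # -> 25x
```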

If, instead, we believe that progress will be slowing down, and that there are certain thresholds that will be exorbitantly expensive to cross, that will drastically change our outlook and our precautions. Perhaps we need to worry more about mundane issues, such as privacy and climate impact, than about malevolent AGI.

What say you? Are science and technology forever on an exponential growth curve, or are we on a sigmoid curve? If it’s exponential, how would that change our risk calculations? If it’s sigmoid, what does that mean for safety and precautions?

EDIT: I was wrong about the brain being 5 petaflops - current estimates are closer to 100 to 1000 petaflops. So yeah, the human brain is an exascale computer. By comparison, Aurora runs on 60MW of juice.
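Running the same comparison with the revised numbers - treating the brain as roughly exaflop-class at the 10W figure used above, and Aurora as roughly exaflop-class at 60MW (all rough assumptions) - the gap only widens:

```python
# Same FLOPS-per-watt comparison with the revised, exascale-level estimates.
# All figures are rough assumptions from this thread, not measurements.
EXA = 1e18

brain_flops = 1 * EXA    # revised brain estimate (100-1000 PFLOPS, call it ~1 EFLOPS)
brain_watts = 10         # power figure used earlier in the post

aurora_flops = 1 * EXA   # treating Aurora as roughly an exaflop-class machine
aurora_watts = 60e6      # the 60MW figure from the post

ratio = (brain_flops / brain_watts) / (aurora_flops / aurora_watts)
print(f"Brain is roughly {ratio:,.0f}x more energy-efficient")   # -> 6,000,000x
```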


I think Peter Thiel proposes that the problem is not that the low-hanging fruit has been picked, but rather that the problem is more cultural in nature, i.e. people are in senseless competition with each other, which makes them more imitative of one another instead of thinking for themselves and being innovative. This is exacerbated by the current “Lean Startup” paradigm, which encourages endless feedback loops with consumers who might not know what they want.

Thiel argues that the fruit has always been “intermediate-hanging”, but people used to be less prone to imitating each other because they were more isolated; now everyone is connected via the internet and social media, which makes it easier to just ape each other.

Not sure if Thiel is right or if the low-hanging fruit argument is correct.


Given that many people consider Peter Thiel to be a charlatan, and that his company is/was being investigated by the FBI, it’s probably best not to hold his ideas up as solid gold. He’s also not a scientist, not a researcher, and perhaps not even math-literate. His education is in philosophy and law.

So you be the judge - who is right? One shill celebrity, or literally the rest of the world’s data? The rising cost of clinical trials can in no sane way be explained by “senseless competition with each other”. There have been snake oil salesmen throughout history, and Thiel is likely to be recorded as one of them. Even the people who like his message roll their eyes at how ridiculous he is.