Sunday, June 28, 2020

The Earth Has Not Been Disassembled for Computation - Percent Utilization of Phosphorus and Nitrogen on Earth by Living Things

A 2015 paper by Landenmark et al. estimates the total amount of DNA in nature at 5.3x10^31 megabases. This of course leads to questions like: how large a fraction of the elements on Earth is life actually using? I'm aiming for an answer within an order of magnitude. This has implications for concerns about AI takeoff, which I will return to at the end.


NITROGEN

In DNA alone, living things occupy slightly more than a billionth of the planet's nitrogen (0.000000115%). Overall, living things occupy 0.0023% of the planet's nitrogen, the lion's share of which, of course, is in protein. (See my assumptions below if you like.)


PHOSPHORUS

Living things are using only 0.00047% of the planet's phosphorus in DNA - but that expands to 4.7% of the planet's phosphorus in living cells overall, a much more significant fraction.


Does this difference exist because life on Earth has adopted phosphorus as, effectively, its energy currency for manipulating gradients? Or because nitrogen is harder to make biologically available? Even now, we rely on relatively few bottlenecked pathways to fix it.



IMPLICATIONS FOR AI TAKEOFF

There's no reason to assume that these numbers represent a global, rather than local, optimum of resource utilization for replicators on Earth. That said, we've had four billion years to optimize. This matters because of the concern that an AI taking off without regard to human welfare would disassemble the Earth into atoms for computation - the farther we are from truly optimized resource utilization, the more disruptive an intelligence explosion would be to the status quo.

I found the Landenmark paper on the amount of DNA in the biosphere through a link in a discussion about the computational efficiency of nucleic acids in cells (the Kempes et al paper in the references). The latter paper suggests that protein translation is several orders of magnitude more thermodynamically efficient than the fastest current computers, and within an order of magnitude of the Landauer limit. Of course, resource utilization and computational efficiency are two different variables, but it seems computation is already getting near optimized - and yet, no disassembly of the Earth for phosphorus. Not even 5% of the energy-currency atoms are put to work! Of course, an AI would be qualitatively and quantitatively different in unpredictable ways from what came before, in which case there is no point in discussing this - but the replicators that exist in reality make the best starting point for such a discussion.
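For reference, the Landauer limit mentioned here is the minimum energy required to erase one bit of information at a given temperature, k*T*ln(2). A quick sanity check of the magnitude (my own illustration, not a figure from either paper):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300              # kelvin, roughly room temperature
e_min = k_B * T * math.log(2)
print(f"Landauer limit at {T} K: {e_min:.2e} J per bit erased")  # ~2.9e-21 J
```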

What's more, protein translation is computation in the service of replication. It is quite likely that AIs would end up being selected in much the same way as cells have been, with limited resources available to dedicate to refining their model of the universe (getting smarter). The ivory-tower AI super-minds would be dominated by the silicon bacteria. Of course, this is still no reason to think a hard AI takeoff couldn't be disastrous for all life on Earth - an extinction like we've never seen, one which the AIs themselves might not have the foresight to survive. But if they do, the best bet is that they will "revert to the mean" of all replicators, with making copies as the goal.




[Image caption] An imperfect analogy: in nature, you have to make do with what's there. The shapes aren't friendly for efficient packing, and there are a lot more holes.


ASSUMPTIONS

I could not find estimates of the overall mass of nitrogen and phosphorus in the biosphere, so I used the percentage weights of those elements in living cells, and derived totals from a paper estimating the mass of carbon in the biosphere at 5.5x10^14 kg (Bar-On et al 2018), along with carbon making up about 18% of living things by mass.
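To make the bookkeeping concrete, here is that derivation as a minimal Python sketch. All inputs are the figures quoted in this post; the 0.9% phosphorus fraction is the bacterial value justified below, and the outputs should be read as order-of-magnitude estimates only.

```python
# Biosphere element masses, derived from the carbon figure as described above.
carbon_in_biosphere = 5.5e14     # kg of carbon (Bar-On et al 2018)
carbon_fraction = 0.18           # carbon as a fraction of cell mass

total_biomass = carbon_in_biosphere / carbon_fraction
print(f"Total biomass: {total_biomass:.2e} kg")   # ~3.1e15 kg

# An element's biosphere mass = total biomass x its percentage weight in cells.
p_fraction_in_cells = 0.009      # 0.9% phosphorus by wet mass (see below)
p_in_biosphere = total_biomass * p_fraction_in_cells
print(f"Biosphere P: {p_in_biosphere:.2e} kg")    # ~2.7e13 kg
```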

For both elements I used a crustal density of 2884.6 kg/m^3 (weighting the differently dense continental and oceanic crusts at 0.3 and 0.7, respectively). My number for nitrogen comes from nitrogen in the atmosphere plus nitrogen in the top meter of the Earth's crust, estimating the mass of the atmosphere as 5.15x10^18 kg, of which 78.09% is nitrogen, and the crustal abundance as 0.002% by mass. (Sources actually conflicted on the crustal abundance by up to an order of magnitude; but there is so little nitrogen in the crust compared to the atmosphere - about 347,000 times less using this number - that it's still a rounding error.) I assume equal numbers of A, T, C, and G, which works out to 3.75 nitrogen atoms per base (adenine and guanine each contain five nitrogen atoms, cytosine three, and thymine two).
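The nitrogen side of that calculation, as a hedged sketch. The Earth's surface area (about 5.1x10^14 m^2) is my own added input - the post does not specify the area used - so the crustal figure is illustrative:

```python
# Nitrogen atoms per DNA base, assuming equal numbers of A, T, C, G
# (adenine 5, thymine 2, cytosine 3, guanine 5 nitrogen atoms):
n_per_base = (5 + 2 + 3 + 5) / 4
print(f"N atoms per base: {n_per_base}")          # -> 3.75, as in the text

# Atmospheric nitrogen pool:
atmosphere_mass = 5.15e18                         # kg, total atmosphere
n_atmosphere = atmosphere_mass * 0.7809           # 78.09% nitrogen
print(f"Atmospheric N: {n_atmosphere:.2e} kg")    # ~4.0e18 kg

# Nitrogen in the top meter of crust:
crust_density = 2884.6     # kg/m^3, weighted as described above
surface_area = 5.1e14      # m^2 -- my assumption, not a figure from the post
n_crust = surface_area * 1.0 * crust_density * 2e-5   # 0.002% by mass
print(f"Crustal N (top meter): {n_crust:.2e} kg")
# With these inputs the crust holds roughly five orders of magnitude less
# nitrogen than the atmosphere - a rounding error, as noted above.
```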

For phosphorus, I used a crustal abundance of 0.1% by mass, ignoring the negligible phosphorus in the atmosphere. There is one phosphorus atom per base. The major "slop" in this figure occurs because different organisms have different fractions of phosphorus - partly because phosphorus is used in structural molecules like bone (85% of the phosphorus in humans is in bone), and even the same organism differs substantially at different ages, e.g. about 0.5% in infants versus close to 1% in adults. Bacteria come in at 0.9% (3% of dry weight, assuming cells are 70% water by mass), so I used that figure, since bacteria outweigh us by a factor of a thousand and the number is intermediate among the vertebrate values anyway.
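And the corresponding phosphorus inputs, under the same caveat about the assumed surface area:

```python
# Crustal phosphorus pool, top meter at 0.1% by mass:
crust_density = 2884.6     # kg/m^3, weighted as described above
surface_area = 5.1e14      # m^2 -- my assumption, not a figure from the post
p_crust = surface_area * 1.0 * crust_density * 1e-3   # 0.1% by mass
print(f"Crustal P (top meter): {p_crust:.2e} kg")     # ~1.5e15 kg

# Bacterial phosphorus as a fraction of wet mass:
# 3% of dry weight, with cells assumed to be 70% water by mass.
p_wet_fraction = 0.03 * (1 - 0.70)
print(f"P fraction of wet mass: {p_wet_fraction:.3%}")  # -> 0.900%
```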


REFERENCES

Bar-On YM, Phillips R, Milo R. The biomass distribution on Earth. PNAS. 2018;115(25):6506-6511.

Kempes CP, Wolpert D, Cohen Z, Pérez-Mercader J. The thermodynamic efficiency of computations made in cells across the range of life. Philos Trans A Math Phys Eng Sci. 2017;375(2109):20160343.

Landenmark HKE, Forgan DH, Cockell CS. An Estimate of the Total DNA in the Biosphere. PLoS Biol. 2015;13(6):e1002168. doi: 10.1371/journal.pbio.1002168

Schirber M. Chemistry of Life: The Human Body. LiveScience. https://www.livescience.com/3505-chemistry-life-human-body.html

2 comments:

Grego said...

- Earth doesn't have to be disassembled; it's already busily computing the Ultimate Question. Go ask the white mice about it.

- You're ignoring the probability of alternate computing substrates. Plants are already optimized energy and nutrient absorbers, self-replicate, and are ubiquitous... It doesn't have to be fast, we already have fast.

- An intelligence capable of rational behavior that discovers something like the prior point might hesitate before remodeling so thoroughly.

- Barring singularity-type physics discoveries, a planetary remodel involving general disassembly may require one or more intermediate stages of resource development to be feasible even for an actual global planning entity.

- Complete takeover while undiscovered is the best strategy.

- Perhaps that's already underway!

- Are we ourselves host to centrally planned computing units and thereby subject to reprogramming via reassembly?

- Why disassemble complex systems when you can subvert and direct them, and still benefit from the specialties of life? Raw efficiency is still achievable in a reasonable timeframe without removing all these sources of crazy interesting randomness from the system.

...

-G

Michael Caton said...

All valid questions. This is of course extremely speculative, but the goal was just to put a rough bound on how much actual replicators have, so far, utilized real resources. I especially like your point about the silent takeover. Questions about the definition of "agent" in AI immediately touch on the same questions philosophers have been asking, like "can a country be conscious?" (if one collection of neurons can be, and DIFFERENT collections of neurons can be, then why not?) So the AI might be distributed in the actions of humans since the industrial revolution (or agriculture, or specialized tool use), and the takeover has been happening since then, intentional or otherwise. A superintelligent agent whose existence could be threatened by humans would expend effort remaining undetected until we were no longer any threat.

As always, I must say something about the untimeliness of your comments. It was a full 5 days from my post until your comment.