The Genomic Code: The genome instantiates a generative model of the organism (arxiv.org)
svnt 165 days ago [-]
It’s an intriguingly framed paper, and others have written about how gene regulatory networks store weights, etc., but it seems to me to put too much emphasis on a direct mapping of the developmental process onto a generative process just because the latter is popular at the moment.

The encoder being evolution is an idea that has been developed by Sui Huang and numerous others.

The genetic decoder being creative/generative is an idea that has been put forth by Richard Watson and others.

What I’m more interested in at this point is how do the layers interact? Rather than waving at energy landscapes, stable attractors, and catastrophe theory, why can’t we put this to use producing alife simulations that deliver open-ended evolution?

sam537 165 days ago [-]
Popular and incompletely understood. That analogy is unhelpful for understanding the processes involved; it's attempting to explain one incompletely understood phenomenon with another. As someone else said below, framing the genome as the 'compressed representation' is reductive, since genomes do not exist in isolation in an organism. Genomes were environmentally selected over millennia, and many unicellular organisms share genomic material with other organisms or with the environment. I think the framework is useful, but it does not scale.
samuell 165 days ago [-]
I find the analogy itself super interesting and thought-provoking, and it might be quite useful in studying biological systems.

The idea that we need to move away from understanding the genetic code as static machinery also aligns well with recent understanding of biology, as summarized in the highly praised book "How Life Works" by Philip Ball [1].

But in terms of evolution, I don't see how the proposed analogy/model gets around the fact that natural selection operates on the individual level (either you survive or you don't), while the genomic information is packaged as a (depending on the organism) humongous number of genes, not to mention base pairs or individual loci (and that's not even mentioning diploidy, which means two different copies of each locus).

With the known proportions and numbers of positive vs. slightly deleterious mutations (many more of the latter, counted in the hundreds per individual), selection for positive mutations cannot avoid accumulating slightly deleterious ones.

I don't get how the proposed model is supposed to solve that.

I see that a smartly designed system could probably have processes that change multiple loci in parallel in a beneficial way (along the lines of the theory of facilitated variation by Gerhart & Kirschner; see their papers or [2]), but that would only explain how such a fine-tuned system could be effective at adaptation, not how the system itself - including its processes - could arise from a state in which those processes were not yet in place.

To connect it to ML: in natural selection you don't have gradients or the ability to update individual parameters based on detailed per-parameter feedback. You can only provide a single binary signal (survive, 1, or not, 0) to the whole set of parameters, whether each of them is slightly deleterious or possibly positive. The resolution is simply lacking here.
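
To make the contrast concrete, here is a toy sketch (the quadratic "fitness" and all the numbers are made up, nothing from the paper): gradient descent gets per-parameter feedback, while the selection-only loop gets one survive/don't-survive bit for the entire genome per generation.

    import random

    N = 50  # number of "loci" / parameters
    TARGET = [1.0] * N

    def loss(params):
        return sum((p - t) ** 2 for p, t in zip(params, TARGET))

    # Gradient descent: per-parameter feedback, d(loss)/d(p_i) = 2 * (p_i - t_i)
    params = [random.uniform(-1, 1) for _ in range(N)]
    for _ in range(200):
        params = [p - 0.1 * 2 * (p - t) for p, t in zip(params, TARGET)]
    print("gradient descent loss:", round(loss(params), 4))

    # Selection only: mutate the whole genome, keep the mutant iff it "survives"
    # (here: survival = not worse overall). One bit of feedback per generation,
    # applied to all loci at once.
    genome = [random.uniform(-1, 1) for _ in range(N)]
    for _ in range(200):
        mutant = [g + random.gauss(0, 0.1) for g in genome]
        if loss(mutant) <= loss(genome):
            genome = mutant
    print("selection-only loss:", round(loss(genome), 4))

The gradient run converges quickly; the selection-only run crawls, which is the resolution problem described above (real population genetics is of course far richer than this toy).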

- [1] https://www.amazon.com/How-Life-Works-Users-Biology/dp/02268...

- [2] https://www.amazon.com/Plausibility-Life-Resolving-Darwins-D...

bubblyworld 165 days ago [-]
I think it's more useful to think of natural selection as acting (probabilistically) on populations of genomes, not individuals. The feedback is individual, but the "gradients" are at the population level. It's not a perfect analogy but e.g. there are formal correspondences like this one: https://www.nature.com/articles/s41467-021-26568-2
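
For intuition, here is a minimal replicator-dynamics sketch (the two genotypes and their fitness values are made up, and this is only loosely related to the linked paper's formal correspondence): each individual's outcome is all-or-nothing, but genotype frequencies in the population shift in proportion to relative fitness, which is the gradient-like signal.

    # Discrete-time replicator dynamics for two genotypes A and B.
    # Frequencies update in proportion to relative fitness -- a population-level
    # "gradient" even though each individual just survives or doesn't.
    fitness = {"A": 1.05, "B": 1.00}   # made-up selection advantage for A
    freq = {"A": 0.01, "B": 0.99}

    for generation in range(300):
        mean_fitness = sum(freq[g] * fitness[g] for g in freq)
        freq = {g: freq[g] * fitness[g] / mean_fitness for g in freq}

    print({g: round(f, 3) for g, f in freq.items()})  # A sweeps toward fixation
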
RaftPeople 165 days ago [-]
An interesting fact that I wasn't aware of until I read about it recently is that our genes constrain the chance of mutations in critical regions of the genome, which shifts the landscape.
stonogo 165 days ago [-]
Mutation resistance is itself the result of mutation (i.e. evolution), and isn't anything particularly special among humans. And it's not just critical areas; every cell in your body has enzymes that prevent mutation, both before and after a given replication.
RaftPeople 165 days ago [-]
> And it's not just critical areas

True, but the resistance to mutations is increased in critical areas compared to other areas. Not all changes are equally likely.

bubblyworld 165 days ago [-]
Yeah, that's interesting, I guess it's also a simplification to think of an organism as a single genome. The reality is much more complicated! (lichens come to mind as examples of even more genetic diversity housed in a single organism, or even gut bacteria in humans maybe?)
logicchains 165 days ago [-]
No matter how low the probability of life arising by chance, from the perspective of life the probability that it happened is 100%, because if it didn't happen then we wouldn't be around to observe it. We're operating from a massive selection bias.
mistermann 165 days ago [-]
The probability of it happening by chance alone is not known, though, in the technical sense of the word. In the colloquial sense it is, which makes for a very interesting situation here in 2024, and prior.
evv555 165 days ago [-]
>The encoder being evolution is an idea that has been developed by Sui Huang and numerous others.

Parallels between evolutionary systems and hill-climbing algorithms have been floating around for a long time at this point.

jhbadger 165 days ago [-]
Although normally in the other direction -- genetic algorithms and genetic programming directly mimic evolution by natural selection, for example.
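
For anyone who hasn't seen one, the skeleton of a textbook genetic algorithm really is just selection, crossover and mutation over bit strings; the OneMax fitness below is the standard toy example, not anything from the paper.

    import random

    L, POP, GENS = 32, 40, 60

    def fitness(bits):               # OneMax: just count the 1-bits
        return sum(bits)

    def tournament(pop):             # keep the fitter of two random individuals
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
    for _ in range(GENS):
        nxt = []
        for _ in range(POP):
            p1, p2 = tournament(pop), tournament(pop)
            cut = random.randrange(L)
            child = p1[:cut] + p2[cut:]                             # one-point crossover
            child = [b ^ (random.random() < 1 / L) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt

    print(max(fitness(ind) for ind in pop), "of", L)
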
evv555 165 days ago [-]
It was just the less controversial direction. You would be accused of anthropomorphizing and teleological thinking if you suggested evolution is a search process towards some optimum. Good luck talking to an orthodox biologist about evolutionary processes with constructs like agents and intelligence.
bjornsing 165 days ago [-]
> why can’t we put this to use producing alife simulations that deliver open-ended evolution?

Has it been tried? I’ve said for ages: show me a representation where a random bit flip generally results in a different but viable entity, and I’ll show you artificial life. The latent space of a VAE could well have those properties.

It’s not open-ended, though (at least in its obvious form), since the VAE would have to be trained on various complex life forms and would probably not extrapolate well outside the support of the training distribution.
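
A rough sketch of the bit-flip test, with a stand-in decoder (the random linear map, the viability check, and all the constants here are placeholders for a real trained VAE decoder and a real viability criterion):

    import math, random

    random.seed(0)
    LATENT, FORM = 8, 64

    # Stand-in for a trained decoder: a fixed random linear map plus squashing.
    # In the real idea this would be the decoder of a VAE trained on many life forms.
    W = [[random.gauss(0, 1) for _ in range(LATENT)] for _ in range(FORM)]

    def decode(z):
        return [math.tanh(sum(w * zi for w, zi in zip(row, z))) for row in W]

    def viable(form):
        # Placeholder viability test: enough strongly expressed features.
        return sum(1 for x in form if abs(x) > 0.5) >= FORM // 4

    z = [random.gauss(0, 1) for _ in range(LATENT)]
    parent = decode(z)

    # The "bit flip" analogue: a small random nudge to one latent coordinate.
    z2 = list(z)
    z2[random.randrange(LATENT)] += random.gauss(0, 0.3)
    child = decode(z2)

    diff = sum(abs(a - b) for a, b in zip(parent, child)) / FORM
    print("parent viable:", viable(parent), "| child viable:", viable(child),
          "| mean form change:", round(diff, 3))

Because the decoder is smooth, small latent perturbations give small, usually-still-viable changes in form, which is exactly the property a raw bit-string genome typically lacks.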

RaftPeople 165 days ago [-]
> show me a representation where a random bit flip generally results in a different but viable entity

It's a really interesting problem I pondered quite a bit when doing some a-life hobby stuff.

I never came up with a good solution, but you can kind of "feel" that the solution needs to be more analog-ish in the way info is represented. As you say, a small change in data (bit flip) probably needs to produce a small change in the resulting form. Possibly the binary representation points to a vector space of form "primitives" (drivers of form) such that adjacent points have similar form.
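
One way to make that "feel" concrete (everything below is invented for illustration: the primitive table, the gene widths, the summation): let the binary genome index into a table of form primitives built so that adjacent indices hold similar vectors, then look at how far a single bit flip moves the resulting form.

    import random

    random.seed(1)
    BITS_PER_GENE, GENES, DIM = 4, 6, 3
    N_PRIMITIVES = 2 ** BITS_PER_GENE

    # Table of "form primitives" built as a smoothed random walk, so that
    # adjacent indices point to similar vectors.
    primitives = [[0.0] * DIM]
    for _ in range(N_PRIMITIVES - 1):
        primitives.append([x + random.gauss(0, 0.1) for x in primitives[-1]])

    def develop(genome_bits):
        # Hypothetical genotype -> phenotype map: each gene's bits pick a
        # primitive, and the "form" is the sum of the chosen primitives.
        form = [0.0] * DIM
        for g in range(GENES):
            bits = genome_bits[g * BITS_PER_GENE:(g + 1) * BITS_PER_GENE]
            idx = int("".join(map(str, bits)), 2)
            form = [f + p for f, p in zip(form, primitives[idx])]
        return form

    genome = [random.randint(0, 1) for _ in range(GENES * BITS_PER_GENE)]
    base = develop(genome)

    for i in range(len(genome)):           # flip each bit and measure the change
        mutant = list(genome)
        mutant[i] ^= 1
        d = sum(abs(a - b) for a, b in zip(base, develop(mutant)))
        print(f"bit {i:2d} flipped -> form change {d:.3f}")

Low-order bits move the index a little and the form a little; high-order bits still jump further, which is roughly why the representation itself, not just the mutation operator, has to carry the smoothness.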

svnt 165 days ago [-]
> But it’s not open-ended though (in its obvious form) since the VAE would have to be trained on various complex life forms and will probably not extrapolate well outside the support of the training distribution.

That is always the issue in alife: we discover processes that help us explore bounded information spaces, but only that.

exe34 165 days ago [-]
> why can’t we put this to use producing alife simulations that deliver open-ended evolution?

you might like this: https://www.gregegan.net/DIASPORA/01/Orphanogenesis.html

theGnuMe 165 days ago [-]
Do you have any references?
svnt 165 days ago [-]
Here is a readable article by Watson that is probably among the more relevant:

https://www.richardawatson.com/songs-of-life-and-mind

There are many, and the paper itself cites plenty of good sources, including several by Huang and Watson, in its references section.

danwills 165 days ago [-]
I have only read the abstract so far, but this seems to align pretty well with how the relationship between genes and tissues/organs is framed in Michael Levin's group's research: genes mostly encode the molecular hardware, this helps set up the initial state of the 'software' during morphogenesis, and the cells primarily follow the software, but within the bounds of what the hardware supports.

The 'software' of biology in this framework is described as pattern memories stored in "Vmem" voltage-gradient patterns between cells in the tissue, analogously to how neurons store information. I think the analogy breaks down slightly here, because the memory is more like a remembered target than something that 'can be executed' the way software can.

The Vmem 'memory' of what 'shape' to grow into can be altered (by adding chemicals that open or close specific gap junctions) such that any regrowth or development can target a different location in morphospace (i.e. grow an eye instead of epithelial tissue, as in the tadpole example from Levin's research).

Fascinating and I hope to have a read of the whole paper soon!

visarga 165 days ago [-]
DNA "generates" the body, which generates behaviour, which affects gene survival, closing the loop.

<rant> It's a syntactic process with the ability to update syntax based on outcomes in the environment. I think this proves that syntax is sufficient for semantics, given the environment.

Wondering why Searle affirmed the opposite. Didn't he know about compilers, functional programming, lambda calculus, homoiconicity? Syntax can operate on syntax; it can modify or update it. Rules can create rules because they have a dual status, as behaviour and as data. They can be both "verbs" and "objects". Gödel's incompleteness theorems use arithmetization to encode math statements as data, making math available to itself as an object of study.

So syntax is not fixed; it has unappreciated depth and adaptive capability. In neural nets both the forward and backward passes are purely syntactic, yet they affect the behaviour/rules/syntax of the model. Can we say AlphaZero and AlphaProof don't really understand, even if they are better than most of us in non-parroting situations? </>
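
A toy illustration of rules being both "verbs" and "objects" (nothing biological here, and the 4-state environment is invented): one rule table is executed as behaviour and simultaneously edited as data by another, equally mechanical rule, using only outcomes.

    import random

    # The "rule" is data: a lookup table from observed state to action.
    rules = {s: random.choice(["left", "right"]) for s in range(4)}

    def environment(state, action):
        # Hidden regularity the rules know nothing about:
        # "right" is correct for even states, "left" for odd ones.
        return 1 if (action == "right") == (state % 2 == 0) else 0

    # A second, purely mechanical rule that rewrites the first based on outcomes.
    for _ in range(100):
        s = random.randrange(4)
        outcome = environment(s, rules[s])                        # rule as behaviour ("verb")
        if outcome == 0:
            rules[s] = "left" if rules[s] == "right" else "right" # rule as data ("object")

    print(rules)  # ends up mirroring the environment's regularity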

meroes 165 days ago [-]
But in the first instance of syntax, what does that bare syntax mean? 1+1=2 as a physical instantiation (say, written on paper) only has any relevant causal powers because of humans. DNA is a physical, causally interacting thing with or without anything else built on top of it or from it. A mathematical sentence sits on a piece of paper just like any random scribbling of pencil lead, absent a consciousness. DNA at its barest is much, much more causally interactive. And syntax is always like this. DNA, however, is morphing itself.

Remember, they are just symbols, whereas DNA is chemically highly interactive. We could all change conventions and obsolete the "+" back into nothingness. We can't do that for a chemical in DNA.

visarga 165 days ago [-]
I think of syntax as applying a rule. A program is just syntactic operations. The idea is that it has two aspects: "program in execution" and "program as data", i.e. rules expressing behavior and rules expressed as data, in which case they can be processed by other rules.

One concrete example is a bootstrapped compiler. It is both data and execution: it can build itself, feeding its own output back in as input. Another example is in math: Gödel's arithmetization, which encodes math statements as numbers, processing math syntax with math operations. And of course neural nets: you can describe them as purely syntactic (mechanical) operations, but they also update rules and learn. In the backward pass, the model becomes input to the gradient update, so it is both rule and data. DNA too.

These systems, whose rules or syntax are adaptive, make the leap to semantics, I think, by grounding in the outside environment. The idea that syntax is shallow and fixed is wrong; in fact syntax can be deep and self-generative. Syntax is just a compressed model of the environment, and that is how it comes to reflect semantics.

This was an argument against the Stochastic Parrots and Chinese Room maxim (syntax is not sufficient for semantics). I aimed to show that purely mechanical or syntactic operations carry more depth than originally thought.
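
The neural-net case in miniature (a single weight and a made-up linear "environment", nothing more): the forward and backward passes are plain arithmetic, and the model is an ordinary value that the update rule takes as input and returns as output.

    # One-parameter "network" y = w * x with squared-error loss.
    # Everything below is mechanical symbol shuffling, yet w ends up encoding
    # the regularity of its "environment" (the x -> target mapping).
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # environment: target = 2 * x
    w, lr = 0.0, 0.05

    for _ in range(100):
        for x, target in data:
            y = w * x                     # forward pass: the rule applied as behaviour
            grad = 2 * (y - target) * x   # backward pass: the rule treated as data
            w = w - lr * grad             # the model is input (and output) of the update

    print(round(w, 3))  # ~2.0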

theGnuMe 165 days ago [-]
I thought syntax was finite... wasn't that the Turing thing discussed here the other day? Also, if your idea is true, you'd be limited to the recursively enumerable.
alexpopow 165 days ago [-]
Very nice perspective. I used to joke about the missing comment lines in the genetic code: if "God" had been a good, conscientious programmer, he would have left a reasonable amount of comments in the code to make it maintainable for all the next developers - but it seems the task went over his head, and now we have to reverse-engineer all that chaos...
mistermann 165 days ago [-]
Alternatively, maybe all of the hints we need are available, but not where we "think" they are.

https://en.m.wikipedia.org/wiki/Streetlight_effect

dekhn 165 days ago [-]
yes, when I was a kid, already interested in computers, and then learned about molecular biology (late 80s, early 90s), I definitely used that analogy to motivate my work. It really reminded me of the early MIT hackers exploring systems that were somewhat opaque to people not in the "priesthood".
amelius 165 days ago [-]
If you're holding a hammer, everything looks like a nail ...
UncleOxidant 165 days ago [-]
Always happens when there's some new technology that dominates the imagination. The Universe was like a clock, the brain was like a computer, etc.
silverc4t 165 days ago [-]
> “Malcolm sat back in his seat. ‘And fractals…fractal patterns are everywhere in nature. Trees, clouds, shells, lightning. Everything in nature is fractal, in the sense that nothing can be broken down into simple shapes. In fact, I now think that animals, and perhaps especially large animals like the dinosaurs, are more efficient than we ever imagined. Their bulk allows them to utilize fractal designs in their biological systems, which means that larger animals have greater efficiency of scale than smaller ones.’”

I had a similar idea in university, when I was fascinated by fractal designs and graphics, which let you generate complex and varied structures from just an algorithm and a seed.

Jurassic Park and the quote above helped too, because it played with the idea that in nature the bigger a system is, the more efficient it is, instead of less efficient, as in an organization.

This paper kind of supports my idea that DNA is just a seed for our biological system to produce a specific output.
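
The classic small example of "complex structure from an algorithm plus a seed" is an L-system; the rewrite rule below is the textbook plant-like one, nothing derived from the paper.

    # A Lindenmayer system: a seed ("axiom") plus one rewrite rule, iterated.
    # Read as turtle graphics (F = draw, +/- = turn, [] = branch), a few
    # iterations already describe a branching, plant-like structure.
    axiom = "F"
    rule = {"F": "F[+F]F[-F]F"}

    s = axiom
    for _ in range(3):
        s = "".join(rule.get(c, c) for c in s)

    print(len(s))          # the description grows roughly 5x per iteration
    print(s[:80] + "...")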

jkingsman 165 days ago [-]
While this paper is perhaps a bit of an overfitting of the concept of generative weights, another fun take is Evo-Devo[0], an a cappella overview of evolutionary developmental biology.

[0]: https://www.youtube.com/watch?v=ydqReeTV_vk

stainablesteel 165 days ago [-]
it's a very poetic interpretation

i think i would take it a step further, most organisms alive today operate at the level of a generative model for a generative model (continue umpteen times) until you arrive at the level of physiology that assembles nerves and organs to work at the scale they do

and i would also comment on the impeccability of the feedback mechanisms across each layer: that every message eventually gets into a cell and binds to a protein, which probably cascades into the binding of a hundred different proteins at some point, and eventually sends a message to the tiny nucleus to wrap or unwrap a specific segment of DNA, is quite a beautiful way of thinking about it

mistermann 165 days ago [-]
I wonder if this extends to consciousness, including the impeccable part.
stainablesteel 164 days ago [-]
a real adrenaline rush can give you amazing physical power (at the cost of a lot of pain the next day)

so in a way, yeah, because that would go through consciousness first and then signal back downstream

keepamovin 165 days ago [-]
Maybe… but if you take the view that it instantiates automata that create the morphology then i think it’s like, “well.. duh!”
billybones 165 days ago [-]
ChatGPT's response to "can you summarize this in lay terms":

In studies of the "RNA world," a theoretical early stage in the origin of life where RNA molecules played a crucial role, researchers have observed that parasitism is a common phenomenon. This means that some molecules can exploit others for their own benefit, which could lead to the extinction of those being exploited unless certain protective measures are in place, such as separating the molecules into compartments or arranging them in specific patterns.

By thinking of RNA replication as a kind of active process, similar to a computer running a program, researchers can explore various strategies that RNA might use to adapt to challenges in its environment. The study uses computer models to investigate how parasitism emerges and how complexity develops in response.

Initially, the system starts with a designed RNA molecule that can copy itself and occasionally makes small mistakes (mutations) during this process. Very quickly, shorter RNA molecules that act as parasites appear. These parasites are copied more rapidly because of their shorter length, giving them an advantage. In response, the original replicating molecules also become shorter to speed up their own replication. They develop ways to slow down the copying process, which helps reduce the advantage parasites have.

Over time, the replicating molecules also evolve more complex methods to distinguish between their own copies and the parasites. This complexity grows as new parasite species keep arising, not from evolving existing parasites, but from mutations in the replicating molecules themselves.

The process of evolution changes as well, with increases in mutation rates and the emergence of new mutation processes. As a result, parasitism not only drives the evolution of more complex replicators but also leads to the development of complex ecosystems. In summary, the study shows how parasitism can be a powerful force that promotes complexity and diversity in evolving systems.

endlessvoid94 165 days ago [-]
I believe you've commented on the wrong thread
doctoboggan 165 days ago [-]
While I am sure some people will roll their eyes at the idea, I thought it was pretty interesting. I wish they were able to make some predictions using their model though.