2023 Biophysics Symposium

February 3, 2023 in SPL 57

8:00 a.m. Coffee

8:30 a.m. Marianne Bauer (TU Delft) Sense and Decide: Information processing during gene regulation

Abstract: Cells express genes in order to respond to environmental changes, differentiate or decide their fates, and develop into a healthy organism. Gene expression is regulated by cues, such as changes in transcription factor concentrations, which are often low. Fundamentally, regulating an output (gene expression) in response to a cue (a changing concentration) can be viewed as a type of decision-making process that can be analyzed within an information-theoretic framework. In this talk, I will show, using the example of early fly embryo development, how such an information-theoretic inference approach can help us understand features of a complex transcriptional apparatus that may be difficult to model due to the complexity of the contributing regulatory factors. I will compare the inferred optimal sensor to realistic, microscopic models for regions on the DNA that respond to transcription factors, and, finally, relate their architecture to features commonly found in efficient computing systems.
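
As a toy illustration of this information-theoretic framing (not taken from the talk), the sketch below computes the mutual information between a transcription factor concentration and a noisy gene-expression readout, assuming a Hill-function input-output relation, Gaussian readout noise, and a uniform input distribution; all parameters are illustrative.

```python
import numpy as np

# Toy sketch (illustrative parameters, not from the talk): mutual information
# between a transcription factor concentration c and a noisy expression
# readout g, with a Hill input-output relation and Gaussian readout noise.

def hill(c, K=1.0, n=2.0):
    """Mean expression level as a Hill function of TF concentration."""
    return c**n / (K**n + c**n)

def mutual_information(c_grid, g_grid, sigma=0.1):
    """I(c; g) in bits, for a uniform prior over the discretized c values."""
    mean = hill(c_grid)[:, None]                       # shape (Nc, 1)
    p_g_given_c = np.exp(-0.5 * ((g_grid[None, :] - mean) / sigma) ** 2)
    p_g_given_c /= p_g_given_c.sum(axis=1, keepdims=True)
    p_c = np.full(len(c_grid), 1.0 / len(c_grid))      # uniform input prior
    p_g = p_c @ p_g_given_c                            # output marginal
    ratio = np.where(p_g_given_c > 0, p_g_given_c / p_g[None, :], 1.0)
    return np.sum(p_c[:, None] * p_g_given_c * np.log2(ratio))

c = np.linspace(0.01, 4.0, 200)    # input TF concentrations (arbitrary units)
g = np.linspace(-0.5, 1.5, 400)    # discretized expression readout
print(f"I(c; g) = {mutual_information(c, g):.2f} bits")
```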

9:15 a.m. Kamesh Krishnamurthy (Princeton University) Towards a physics of neural computation

Abstract: I will give an overview of how tools from theoretical physics can be used to understand computations in neural circuits. In the first part of the talk, I will focus on a fundamental question in neural computation: how do the microscopic mechanisms in neural networks influence their collective computation? In this context, I will present results on how a mechanism termed “gating” shapes computation in recurrent neural networks. Gating is not only ubiquitous in neurons, but is also the central driver of performance gains in modern machine learning models. Among other benefits, I will show how gating robustly allows the generation of long timescales and makes models more easily trainable by taming their gradients. Second, I will build on these insights about gating to address another salient issue in biophysics and neuroscience: the challenge of implementing graded/continuous memories in biological systems without fine-tuning parameters. I will propose a general principle of “Frozen-Stabilization”, which allows a wide variety of systems to self-organize to a critical state and thereby robustly implement continuous memory without fine-tuning. This state also robustly exhibits a wide range of relaxation timescales – something that has been challenging to achieve theoretically. I will end the talk by laying out some broader problems in neural and biological computation upon which these versatile techniques from physics could be brought to bear.
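
The following minimal sketch (my illustration, not the speaker's model) shows how a multiplicative update gate can set an effective timescale in a random recurrent network: the state is a convex mix of its old value and a nonlinear drive, so a small gate value slows the dynamics without any fine-tuning of the recurrent weights.

```python
import numpy as np

# Illustrative sketch (not the speaker's model): a leaky RNN whose state is a
# convex mix of its old value and a nonlinear drive, weighted by a gate z.
# Small z means the state barely changes per step, so the effective
# relaxation timescale (~1/z) grows without tuning the recurrent weights.

rng = np.random.default_rng(0)
N = 100
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # random recurrent weights

def step(h, z):
    """One gated update: h <- (1 - z) * h + z * tanh(W h)."""
    return (1 - z) * h + z * np.tanh(W @ h)

for z in (1.0, 0.1, 0.01):                      # assumed gate values
    h = rng.normal(size=N)
    h0 = h.copy()
    for _ in range(50):
        h = step(h, z)
    overlap = h @ h0 / (np.linalg.norm(h) * np.linalg.norm(h0))
    print(f"z = {z:5.2f}: overlap with initial state after 50 steps = {overlap:+.3f}")
```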

10:00 a.m. Christopher Lynn (Princeton University and the City University of New York) Statistical physics of emergence and dynamics in neural systems

Abstract: The brain is immensely complex, with microscopic interactions building upon one another to produce macroscopic behaviors and impressive feats of information processing. Yet basic questions remain: How do large-scale patterns of activity emerge from networks of fine-scale interactions? And how do these patterns dynamically evolve to process information?

Here, I will discuss how principles from statistical mechanics can shed light on these questions. I will begin with a high-level overview of my research, before focusing on two representative projects. First, building upon recent breakthroughs in non-equilibrium physics, I will show how the irreversibility – or the distance from equilibrium – of complex systems can be systematically decomposed into simpler parts. Applying our framework to populations of neurons, we find that this irreversibility arises primarily from pairs of cells, thus giving a simplified picture of the non-equilibrium dynamics. Second, combining ideas from information theory and network science, I will present a framework for inferring the optimally informative interactions in large networks of neurons. Together, these results – and the principled techniques developed along the way – have the potential to shape our understanding of emergence and dynamics not only in the brain, but broadly in complex living systems.
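
As a small, self-contained example of quantifying irreversibility (a toy version, not the talk's decomposition framework), one can estimate the distance from equilibrium of a discrete time series as the KL divergence between forward and time-reversed transition statistics; the driven three-state cycle below is a made-up illustration.

```python
import numpy as np
from collections import Counter

# Toy sketch (not the talk's framework): estimate irreversibility of a time
# series as the KL divergence between forward and reversed transition
# statistics, sigma = sum_{x,y} p(x,y) * log[p(x,y) / p(y,x)].

rng = np.random.default_rng(1)

# A biased three-state cycle (0 -> 1 -> 2 -> 0): driven, hence irreversible.
P = np.array([[0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8],
              [0.8, 0.1, 0.1]])

T, x = 100_000, 0
traj = np.empty(T, dtype=int)
for t in range(T):                  # simulate a long trajectory
    traj[t] = x
    x = rng.choice(3, p=P[x])

pairs = Counter(zip(traj[:-1], traj[1:]))   # empirical joint p(x_t, x_{t+1})
sigma = 0.0
for (a, b), n in pairs.items():
    p_fwd = n / (T - 1)
    p_rev = pairs.get((b, a), 0) / (T - 1)
    if p_rev > 0:
        sigma += p_fwd * np.log(p_fwd / p_rev)
print(f"estimated irreversibility: {sigma:.3f} nats per step")
```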

11:00 a.m. Gautam Reddy Nallamala (NTT Research and Harvard University) Towards a physics of learning and decision-making

Abstract: Advances in machine learning and the ability to track animal movements for long periods of time have opened the possibility of precisely quantifying behavior and developing new theory. In this talk, I will present an attempt at developing one such framework, rooted in control theory and reinforcement learning. I will touch on two examples: (1) odor trail tracking and (2) rodent navigation in mazes, where theory explains phenomenology and provides new insights into how complex behaviors emerge from the interaction between learning rules, task structure, and the physics of the environment.
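
For readers unfamiliar with the reinforcement-learning framing, here is a minimal, purely illustrative sketch: tabular Q-learning on a one-dimensional corridor "maze" with a reward at the far end. The environment and parameters are assumptions for illustration, not the tasks analyzed in the talk.

```python
import numpy as np

# Minimal, purely illustrative sketch: tabular Q-learning on a 1-D corridor
# "maze" with a reward at the right end. Parameters are assumptions.

rng = np.random.default_rng(2)
n_states, actions = 8, (-1, +1)     # positions 0..7; move left or right
Q = np.zeros((n_states, len(actions)))
alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != n_states - 1:        # the reward state ends the episode
        a = rng.integers(2) if rng.random() < eps else int(Q[s].argmax())
        s_next = min(max(s + actions[a], 0), n_states - 1)
        r = 1.0 if s_next == n_states - 1 else 0.0
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# After training, the greedy policy should move right from every state.
print([actions[int(Q[s].argmax())] for s in range(n_states - 1)])
```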

11:45 a.m. Armita Nourmohammad (University of Washington) Learning the shape of the immune and protein universe

Abstract: The adaptive immune system consists of highly diverse B- and T-cell receptors, which can recognize a multitude of diverse pathogens. Immune recognition relies on molecular interactions between immune receptors and pathogens, which in turn are determined by the complementarity of their 3D structures and amino acid compositions, i.e., their shapes. Immune shape space has previously been introduced as an abstraction of molecular recognition in the immune system. However, the relationships between immune receptor sequence, protein structure, and specificity are very difficult to quantify in practice. In this talk, I will discuss how the growing amount of immune repertoire sequence data, together with protein structures, can shed light on the organization and encoding of information in the adaptive immune system. I will introduce physically motivated machine learning approaches to learn representations of protein micro-environments in general, and of immune receptors in particular. The learned models reflect the relevant biophysical properties that determine a protein’s stability and function, and could be used to predict immune recognition and to design novel immunogens, e.g., for vaccines.
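
As a cartoon of what learning representations of protein micro-environments can mean in practice (an assumption-laden sketch, not the speaker's method), one could featurize a residue's neighborhood as rotation-invariant counts of amino-acid types in radial shells around its C-alpha atom and feed such fixed-size vectors to a downstream model:

```python
import numpy as np

# Hypothetical featurization (an assumption, not the speaker's model):
# rotation-invariant counts of neighboring amino-acid types in radial shells
# around a residue's C-alpha atom, yielding a fixed-size vector that a
# downstream classifier of, e.g., stability or binding could consume.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {a: i for i, a in enumerate(AMINO_ACIDS)}

def microenvironment_features(coords, seq, center, shells=(4.0, 8.0, 12.0)):
    """coords: (N, 3) C-alpha positions; seq: length-N string.
    Returns a (len(shells) * 20,) vector, invariant to rotations."""
    d = np.linalg.norm(coords - coords[center], axis=1)
    feats = np.zeros((len(shells), 20))
    lo = 0.0
    for k, hi in enumerate(shells):
        for i in np.flatnonzero((d > lo) & (d <= hi)):
            if i != center:
                feats[k, AA_INDEX[seq[i]]] += 1
        lo = hi
    return feats.ravel()

# Demo on a random synthetic "structure".
rng = np.random.default_rng(3)
n = 60
coords = rng.normal(scale=8.0, size=(n, 3))
seq = "".join(rng.choice(list(AMINO_ACIDS), size=n))
print(microenvironment_features(coords, seq, center=0).shape)   # (60,)
```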

12:30 p.m. Suraj Shankar (Harvard University) Active Hydraulics: Physics and Physiology

Abstract: Hydraulics, i.e., the study of the motive power of fluids, plays an important role in controlling rapid movements in plants. But in animals, where muscles are the primary effectors of behavior and locomotion, are there similar physical principles that dictate the limits of muscular performance? By adopting a spatiotemporally integrated and multiscale view of muscle, I will show how a minimal description of muscle as a soft, active, and wet solid (an ‘active sponge’) is sufficient to describe its mechanical, dynamic, and energetic properties. By reanalyzing existing data, I will highlight the presence of intracellular fluid flow, which, in conjunction with the kinetic cycling of molecular motors, ultimately dictates the limits of muscular contraction across the animal kingdom. Furthermore, I will demonstrate that muscle naturally exhibits an unusual mechanical response that is nonreciprocal (or ‘odd’), uncovering a new mode of muscular power generation from periodic strain cycles alone. I will conclude by highlighting the consequences of this work for physiology.
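
A minimal sketch of the ‘odd’ (nonreciprocal) response mentioned above, with made-up moduli: if stress depends on strain through a matrix with an antisymmetric part, a closed loop in strain space extracts net work each cycle, whereas a symmetric (passive, reciprocal) matrix yields none.

```python
import numpy as np

# Toy sketch with made-up moduli: stress = K @ strain. A symmetric K stores
# and returns energy (zero net work over any closed strain loop); adding an
# antisymmetric ("odd") part lets the material do net work each cycle.

def extracted_work(K, radius=0.1, steps=2000):
    """Net work extracted over one circular loop in 2-D strain space."""
    theta = np.linspace(0.0, 2 * np.pi, steps)
    strain = radius * np.stack([np.cos(theta), np.sin(theta)], axis=1)
    stress = strain @ K.T
    d_strain = np.diff(strain, axis=0)
    # Trapezoidal line integral of sigma . d(strain); the minus sign converts
    # work done on the material into work extracted from it.
    return -np.sum(0.5 * (stress[1:] + stress[:-1]) * d_strain)

K_even = np.array([[2.0, 0.5],
                   [0.5, 1.0]])                    # symmetric: passive
K_odd = K_even + np.array([[0.0, 1.0],
                           [-1.0, 0.0]])           # adds an odd part

print(f"symmetric K: {extracted_work(K_even):+.4f} per cycle")   # ~0
print(f"odd K:       {extracted_work(K_odd):+.4f} per cycle")    # ~ +2*pi*r^2
```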

Contact: Paul Tipton (PhysBiosearch@mailman.yale.edu)