Feed aggregator

Evolving higher-order synergies reveals a trade-off between stability and information integration capacity in complex systems

Complexity Digest - Tue, 01/30/2024 - 14:22

Thomas F. Varley, Joshua Bongard

There has recently been an explosion of interest in how “higher-order” structures emerge in complex systems. This “emergent” organization has been found in a variety of natural and artificial systems, although at present the field lacks a unified understanding of what the consequences of higher-order synergies and redundancies are for systems. Typical research treats the presence (or absence) of synergistic information as a dependent variable and reports changes in the level of synergy in response to some change in the system. Here, we attempt to flip the script: rather than treating higher-order information as a dependent variable, we use evolutionary optimization to evolve Boolean networks with significant higher-order redundancies, synergies, or statistical complexity. We then analyse these evolved populations of networks using established tools for characterizing discrete dynamics: the number of attractors, average transient length, and Derrida coefficient. We also assess the capacity of the systems to integrate information. We find that high-synergy systems are unstable and chaotic, but with a high capacity to integrate information. In contrast, evolved redundant systems are extremely stable, but have negligible capacity to integrate information. Finally, the complex systems that balance integration and segregation (known as Tononi-Sporns-Edelman complexity) show features of both chaosticity and stability, with a greater capacity to integrate information than the redundant systems while being more stable than the random and synergistic systems. We conclude that there may be a fundamental trade-off between the robustness of a system's dynamics and its capacity to integrate information (which inherently requires flexibility and sensitivity), and that certain kinds of complexity naturally balance this trade-off.
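
For readers unfamiliar with the Derrida coefficient mentioned above, here is a minimal sketch (with illustrative parameters, not the paper's experimental setup) of how it can be estimated for a random Boolean network: perturb one bit of a random state, apply one synchronous update to both copies, and measure how far the trajectories diverge.

import random

def random_boolean_network(n, k, rng):
    # Each node reads k random inputs through a random truth table.
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def update(state, inputs, tables):
    new_state = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for j in ins:                      # encode the k input bits as a table index
            idx = (idx << 1) | state[j]
        new_state.append(table[idx])
    return new_state

def derrida_coefficient(n=50, k=2, trials=2000, seed=1):
    # Average Hamming distance after one update, starting from pairs of states
    # that differ in a single bit: >1 suggests chaotic dynamics, <1 ordered.
    rng = random.Random(seed)
    inputs, tables = random_boolean_network(n, k, rng)
    total = 0
    for _ in range(trials):
        a = [rng.randint(0, 1) for _ in range(n)]
        b = list(a)
        b[rng.randrange(n)] ^= 1
        fa, fb = update(a, inputs, tables), update(b, inputs, tables)
        total += sum(x != y for x, y in zip(fa, fb))
    return total / trials

print(derrida_coefficient())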

Read the full article at: arxiv.org

WOSC 19th Congress 2024

Complexity Digest - Tue, 01/30/2024 - 11:23

September 11-13, 2024 in Lady Margaret Hall, Oxford, UK

Shaping collaborative ecosystems for tomorrow

The complexity of interactions and relationships in our world has consistently surpassed our ability to fully comprehend and govern. The presence of intelligent tools, both in the digital and physical realms, is progressively enhancing our capacities to act on personal, organizational, national, and international levels, leading to both intended and unintended consequences. Collectively, these changes are reshaping our primary habitat—the planet Earth—at a speed and scale that necessitate earnest consideration. In the midst of uncertainty, the development and utilization of these new capabilities would greatly benefit from CyberSystemic approaches and methods of learning. This advancement is crucial for fostering a sustainable understanding and taking actions to avert major threats to our civilization.

More at: wosc.world

Defining Complex Adaptive Systems: An Algorithmic Approach

Complexity Digest - Tue, 01/30/2024 - 09:18

Ahmad, M.A.; Baryannis, G.; Hill, R.

Systems 2024, 12(2), 45

Despite a profusion of literature on complex adaptive system (CAS) definitions, it is still challenging to definitively answer whether a given system is or is not a CAS. The challenge generally lies in deciding where the boundaries lie between a complex system (CS) and a CAS. In this work, we propose a novel definition for CASs in the form of a concise, robust, and scientific algorithmic framework. The definition allows a two-stage evaluation of a system to first determine whether it meets complexity-related attributes before exploring a series of attributes related to adaptivity, including autonomy, memory, self-organisation, and emergence. We demonstrate the appropriateness of the definition by applying it to two case studies in the medical and supply chain domains. We envision that the proposed algorithmic approach can provide an efficient auditing tool to determine whether a system is a CAS, while also providing insights for the relevant communities to optimise their processes and organisational structures.
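
A minimal sketch of the two-stage evaluation described above, with hypothetical attribute names standing in for the paper's actual criteria: a system must first satisfy the complexity-related attributes before the adaptivity-related ones are even considered.

COMPLEXITY_ATTRIBUTES = ["many_interacting_components", "nonlinearity", "openness"]
ADAPTIVITY_ATTRIBUTES = ["autonomy", "memory", "self_organisation", "emergence"]

def classify(system_attributes):
    """Return 'not complex', 'complex system (CS)', or 'complex adaptive system (CAS)'."""
    if not all(a in system_attributes for a in COMPLEXITY_ATTRIBUTES):
        return "not complex"
    if all(a in system_attributes for a in ADAPTIVITY_ATTRIBUTES):
        return "complex adaptive system (CAS)"
    return "complex system (CS)"

# Example: a supply chain showing all complexity attributes plus memory and
# self-organisation, but neither autonomy nor emergence, classifies as a CS.
example = set(COMPLEXITY_ATTRIBUTES) | {"memory", "self_organisation"}
print(classify(example))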

Read the full article at: www.mdpi.com

How Much of the World Is It Possible to Model?

Complexity Digest - Sat, 01/20/2024 - 10:36

Dan Rockmore

Mathematical models power our civilization—but they have limits.

Read the full article at: www.newyorker.com

Is the Emergence of Life an Expected Phase Transition in the Evolving Universe?

Complexity Digest - Sat, 01/20/2024 - 04:41

Stuart Kauffman, Andrea Roli

We propose a novel definition of life in terms of which its emergence in the universe is expected, and its ever-creative open-ended evolution is entailed by no law. Living organisms are Kantian Wholes that achieve Catalytic Closure, Constraint Closure, and Spatial Closure. We here unite for the first time two established mathematical theories, namely Collectively Autocatalytic Sets and the Theory of the Adjacent Possible. The former establishes that a first-order phase transition to molecular reproduction is expected in the chemical evolution of the universe where the diversity and complexity of molecules increases; the latter posits that, under loose hypotheses, if the system starts with a small number of beginning molecules, each of which can combine with copies of itself or other molecules to make new molecules, over time the number of kinds of molecules increases slowly but then explodes upward hyperbolically. Together these theories imply that life is expected as a phase transition in the evolving universe. The familiar distinction between software and hardware loses its meaning in living cells. We propose new ways to study the phylogeny of metabolisms, new astronomical ways to search for life on exoplanets, new experiments to seek the emergence of the most rudimentary life, and the hint of a coherent testable pathway to prokaryotes with template replication and coding.
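
A minimal sketch of the "slow, then hyperbolic" growth described above, using a pairwise-only simplification of the Theory of the Adjacent Possible recursion with illustrative parameters (not the paper's formulation): when every pair of existing kinds can create a new kind, diversity stays nearly flat for a long time and then blows up in finite time.

def tap_pairwise(m0=10.0, alpha=2e-3, steps=120, cap=1e6):
    # dM/dt ~ alpha * M * (M - 1) / 2: a long plateau followed by a finite-time blow-up.
    m, history = m0, [m0]
    for _ in range(steps):
        m = m + alpha * m * (m - 1) / 2.0
        history.append(m)
        if m > cap:
            break
    return history

trajectory = tap_pairwise()
for t in range(0, len(trajectory), 10):
    print(t, round(trajectory[t], 1))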

Read the full article at: arxiv.org

Math’s ‘Game of Life’ Reveals Long-Sought Repeating Patterns

Complexity Digest - Fri, 01/19/2024 - 12:38

John Conway’s Game of Life, a famous cellular automaton, has been found to have periodic patterns of every possible length.
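
A minimal sketch of what a periodic pattern (oscillator) means here, using the standard Game of Life rules rather than anything specific to the article: evolve a pattern and report when it first returns to a configuration it has already visited.

from collections import Counter

def step(live):
    # B3/S23: a cell is alive next step if it has exactly 3 live neighbours,
    # or 2 live neighbours and is currently alive.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

def period(pattern, max_steps=100):
    seen, state = {frozenset(pattern): 0}, set(pattern)
    for t in range(1, max_steps + 1):
        state = step(state)
        key = frozenset(state)
        if key in seen:
            return t - seen[key]
        seen[key] = t
    return None

blinker = {(0, 0), (1, 0), (2, 0)}   # the classic period-2 oscillator
print(period(blinker))               # -> 2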

Read the full article at: www.quantamagazine.org

Evolution of a Theory of Mind

Complexity Digest - Wed, 01/17/2024 - 11:35

Tom Lenaerts, Marco Saponara, Jorge M. Pacheco, Francisco C. Santos

iScience

Even though Theory of Mind in upper primates has been under investigation for decades, how it may evolve remains an open problem. We propose here an evolutionary game theoretical model where a finite population of individuals may use reasoning strategies to infer a response to the anticipated behaviour of others within the context of a sequential dilemma, i.e., the centipede game. We show that strategies with bounded reasoning evolve and flourish under natural selection, provided they are allowed to make reasoning mistakes and a temptation for higher future gains is in place. We further show that non-deterministic reasoning co-evolves with an optimism bias that may lead to the selection of new equilibria, closely associated with average behaviour observed in experimental data. This work reveals both a novel perspective on the evolution of bounded rationality and a co-evolutionary link between the evolution of ToM and the emergence of misbeliefs.
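
For context, a minimal sketch of backward induction in a linear centipede game with illustrative payoffs (not the paper's evolutionary model): fully rational reasoning predicts taking at the very first node, the benchmark against which the more cooperative behaviour seen in experiments, and reproduced by the evolved bounded reasoners, stands out.

def solve(node, n_nodes, pot):
    """Return (mover payoff, other payoff, mover takes?) under optimal play."""
    # Taking gives the mover 80% of the current pot; passing doubles the pot
    # and hands the move (and the next take option) to the other player.
    take = (0.8 * pot, 0.2 * pot)
    if node == n_nodes - 1:
        return take + (True,)
    next_mover, next_other, _ = solve(node + 1, n_nodes, 2 * pot)
    passed = (next_other, next_mover)    # the current mover is the next node's "other"
    if take[0] >= passed[0]:
        return take + (True,)
    return passed + (False,)

mover, other, takes_now = solve(0, 6, 2.0)
print("mover takes at the first node:", takes_now, "payoffs:", (mover, other))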

Read the full article at: www.sciencedirect.com

Universal Complexity Science and Theory of Everything: Challenges and Prospects

Complexity Digest - Tue, 01/16/2024 - 13:58

Srdjan Kesić

Systems 2024, 12(1), 29

This article argues that complexity scientists have been searching for a universal complexity in the form of a “theory of everything” since some important theoretical breakthroughs such as Bertalanffy’s general systems theory, Wiener’s cybernetics, chaos theory, synergetics, self-organization, self-organized criticality and complex adaptive systems, which brought the study of complex systems into mainstream science. In this respect, much attention has been paid to the importance of a “reductionist complexity science” or a “reductionist theory of everything”. Alternatively, many scholars strongly argue for a holistic or emergentist “theory of everything”. The unifying characteristic of both attempts to account for complexity is an insistence on one robust explanatory framework to describe almost all natural and socio-technical phenomena. Nevertheless, researchers need to understand the conceptual historical background of “complexity science” in order to understand these longstanding efforts to develop a single all-inclusive theory. In this theoretical overview, I address this underappreciated problem and argue that both accounts of the “theory of everything” seem problematic, as they do not seem to be able to capture the whole of reality. This realization could mean that the idea of a single omnipotent theory falls flat. However, the prospects for a “holistic theory of everything” are much better than a “reductionist theory of everything”. Nonetheless, various forms of contemporary systems thinking and conceptual tools could make the path to the “theory of everything” much more accessible. These new advances in thinking about complexity, such as “Bohr’s complementarity”, Morin’s Complex thinking, and Cabrera’s DSRP theory, might allow the theorists to abandon the EITHER/OR logical operators and start thinking about BOTH/AND operators to seek reconciliation between reductionism and holism, which might lead them to a new “theory of everything”.

Read the full article at: www.mdpi.com

Defining a city — delineating urban areas using cell-phone data

Complexity Digest - Tue, 01/16/2024 - 11:32

Lei Dong, Fabio Duarte, Gilles Duranton, Paolo Santi, Marc Barthelemy, Michael Batty, Luís Bettencourt, Michael Goodchild, Gary Hack, Yu Liu, Denise Pumain, Wenzhong Shi, Vincent Verbavatz, Geoffrey B. West, Anthony G. O. Yeh & Carlo Ratti

Nature Cities (2024)

What is a city? Researchers use different criteria and datasets to define it—from population density to traffic flows. We argue there is one dataset that could serve as a proxy of the temporal and spatial connections that make cities what they are: geolocated data from the world’s more than 7 billion cell-phone users. Cell-phone data are a proxy of people’s presence in a given area and of their movement between areas. Combined with computational methods, these data can support city delineations that are dynamic, responding to multiple statistical and administrative requirements, and tailored to different research needs, thus accelerating ongoing work in urban science.
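
A minimal sketch of one way geolocated pings could support such a delineation (an illustration only, not the authors' method): bin points on a grid, keep cells above a density threshold, and merge adjacent dense cells into contiguous urban areas.

from collections import defaultdict, deque
import random

def delineate(points, cell_size=0.01, min_pings=50):
    density = defaultdict(int)
    for lon, lat in points:
        density[(int(lon / cell_size), int(lat / cell_size))] += 1
    dense = {cell for cell, n in density.items() if n >= min_pings}
    # Merge 4-connected dense cells into contiguous areas (BFS flood fill).
    areas, seen = [], set()
    for start in dense:
        if start in seen:
            continue
        queue, area = deque([start]), set()
        seen.add(start)
        while queue:
            cx, cy = queue.popleft()
            area.add((cx, cy))
            for nb in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if nb in dense and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        areas.append(area)
    return areas

# Two synthetic "cities" of pings; the procedure should delineate two areas.
random.seed(0)
city_a = [(random.gauss(0.0, 0.02), random.gauss(0.0, 0.02)) for _ in range(5000)]
city_b = [(random.gauss(0.5, 0.01), random.gauss(0.5, 0.01)) for _ in range(3000)]
print("delineated areas:", len(delineate(city_a + city_b)))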

Read the full article at: www.nature.com

Information decomposition and the informational architecture of the brain

Complexity Digest - Sun, 01/14/2024 - 11:40

Andrea I. Luppi, Fernando E. Rosas, Pedro A.M. Mediano, David K. Menon, Emmanuel A. Stamatakis

Trends in Cognitive Sciences

To explain how the brain orchestrates information-processing for cognition, we must understand information itself. Importantly, information is not a monolithic entity. Information decomposition techniques provide a way to split information into its constituent elements: unique, redundant, and synergistic information. We review how disentangling synergistic and redundant interactions is redefining our understanding of integrative brain function and its neural organisation. To explain how the brain navigates the trade-offs between redundancy and synergy, we review converging evidence integrating the structural, molecular, and functional underpinnings of synergy and redundancy; their roles in cognition and computation; and how they might arise over evolution and development. Overall, disentangling synergistic and redundant information provides a guiding principle for understanding the informational architecture of the brain and cognition.
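
A minimal sketch of the decomposition mentioned above, using the Williams-Beer redundancy measure for two binary sources and a target, computed from an explicit joint distribution; XOR is the canonical example of purely synergistic information.

from collections import defaultdict
from math import log2

def marginal(joint, keep):
    out = defaultdict(float)
    for outcome, pr in joint.items():
        out[tuple(outcome[i] for i in keep)] += pr
    return out

def mutual_information(joint, sources):
    # I(sources ; target), with the target stored at index 2 of each outcome.
    pst = marginal(joint, list(sources) + [2])
    ps = marginal(joint, list(sources))
    pt = marginal(joint, [2])
    return sum(pr * log2(pr / (ps[key[:-1]] * pt[key[-1:]]))
               for key, pr in pst.items() if pr > 0)

def specific_information(joint, source, t):
    # Information a single source carries about the particular outcome T = t.
    pt = marginal(joint, [2])[(t,)]
    pst = marginal(joint, [source, 2])
    ps = marginal(joint, [source])
    return sum((pr / pt) * log2(pr / (ps[(s,)] * pt))
               for (s, tt), pr in pst.items() if tt == t and pr > 0)

def pid(joint):
    redundancy = sum(pt * min(specific_information(joint, 0, t),
                              specific_information(joint, 1, t))
                     for (t,), pt in marginal(joint, [2]).items())
    i1 = mutual_information(joint, [0])
    i2 = mutual_information(joint, [1])
    i12 = mutual_information(joint, [0, 1])
    return {"redundant": redundancy,
            "unique_1": i1 - redundancy,
            "unique_2": i2 - redundancy,
            "synergistic": i12 - i1 - i2 + redundancy}

# XOR target: ~1 bit of purely synergistic information, none redundant or unique.
xor = {(s1, s2, s1 ^ s2): 0.25 for s1 in (0, 1) for s2 in (0, 1)}
print(pid(xor))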

Read the full article at: www.cell.com

Bundling by volume exclusion in non-equilibrium spaghetti

Complexity Digest - Fri, 01/12/2024 - 11:37

I. Bonamassa, B. Ráth, M. Pósfai, M. Abért, D. Keliger, B. Szegedy, J. Kertész, L. Lovász, A.-L. Barabási

In physical networks, like the brain or metamaterials, we often observe local bundles, corresponding to locally aligned link configurations. Here we introduce a minimal model for bundle formation, modeling physical networks as non-equilibrium packings of hard-core 3D elongated links. We show that growth is logarithmic in time, in stark contrast with the algebraic behavior of lower dimensional random packing models. Equally important, we find that this slow kinetics is metastable, allowing us to analytically predict an algebraic growth due to the spontaneous formation of bundles. Our results offer a mechanism for bundle formation resulting purely from volume exclusion, and provide a benchmark for bundling activation and growth during the assembly of physical networks.

Read the full article at: arxiv.org

Fireflies, brain cells, dancers: new synchronisation research shows nature’s perfect timing is all about connections

Complexity Digest - Thu, 01/11/2024 - 13:13

Joseph Lizier

Getting in sync can be exhilarating when you’re dancing in rhythm with other people or clapping along in an audience. Fireflies too know the joy of synchronisation, timing their flashes together to create a larger display to attract mates.

Synchronisation is important at a more basic level in our bodies, too. Our heart cells all beat together (at least when things are going well), and synchronised electrical waves can help coordinate brain regions – but too much synchronisation of brain cells is what happens in an epileptic seizure.

Sync most often emerges spontaneously rather than through following the lead of some central timekeeper. How does this happen? What is it about a system that determines whether sync will emerge, and how strong it will be?
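
A minimal sketch of spontaneous synchronisation using the standard Kuramoto model (an illustration of the phenomenon, not the article's analysis): oscillators coupled on a network pull one another's phases together, and the order parameter r measures how synchronised they become (0 incoherent, 1 fully in sync).

import cmath
import math
import random

def kuramoto_order_parameter(adjacency, coupling=1.5, dt=0.05, steps=2000, seed=0):
    rng = random.Random(seed)
    n = len(adjacency)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    omega = [rng.gauss(0.0, 0.5) for _ in range(n)]        # natural frequencies
    for _ in range(steps):
        dtheta = [omega[i]
                  + coupling * sum(math.sin(theta[j] - theta[i]) for j in adjacency[i])
                  / max(len(adjacency[i]), 1)
                  for i in range(n)]
        theta = [t + dt * d for t, d in zip(theta, dtheta)]
    return abs(sum(cmath.exp(1j * t) for t in theta)) / n  # order parameter r

# All-to-all network of 20 oscillators: with coupling above the critical value,
# r settles well above the incoherent baseline.
n = 20
all_to_all = [[j for j in range(n) if j != i] for i in range(n)]
print(round(kuramoto_order_parameter(all_to_all), 2))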

Read the full article at: theconversation.com

Antifragility as a complex system’s response to perturbations, volatility, and time

Complexity Digest - Wed, 01/10/2024 - 13:09

Cristian Axenie, Oliver López-Corona, Michail A. Makridis, Meisam Akbarzadeh, Matteo Saveriano, Alexandru Stancu, Jeffrey West

Antifragility characterizes the benefit of a dynamical system derived from the variability in environmental perturbations. Antifragility carries a precise definition that quantifies a system’s output response to input variability. Systems may respond poorly to perturbations (fragile) or benefit from perturbations (antifragile). In this manuscript, we review a range of applications of antifragility theory in technical systems (e.g., traffic control, robotics) and natural systems (e.g., cancer therapy, antibiotics). While there is a broad overlap in methods used to quantify and apply antifragility across disciplines, there is a need for precisely defining the scales at which antifragility operates. Thus, we provide a brief general introduction to the properties of antifragility in applied systems and review relevant literature for both natural and technical systems’ antifragility. We frame this review within three scales common to technical systems: intrinsic (input-output nonlinearity), inherited (extrinsic environmental signals), and interventional (feedback control), with associated counterparts in biological systems: ecological (homogeneous systems), evolutionary (heterogeneous systems), and interventional (control). We use this common framing in designing systems that exhibit antifragile behavior across scales, and we guide the reader along the spectrum of fragility-adaptiveness-resilience-robustness-antifragility, the principles behind it, and its practical implications.
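
A minimal sketch of one convexity-based diagnostic in the spirit of the definition above (the paper's formal treatment differs in detail): a response is fragile if input variability hurts it on average and antifragile if variability helps, i.e. the sign of the Jensen gap E[f(x + noise)] - f(x).

import random

def jensen_gap(response, x0, noise_std=1.0, samples=100_000, seed=0):
    rng = random.Random(seed)
    avg = sum(response(x0 + rng.gauss(0, noise_std)) for _ in range(samples)) / samples
    return avg - response(x0)

fragile = lambda x: -x ** 2        # concave response: volatility lowers average output
antifragile = lambda x: x ** 2     # convex response: volatility raises average output
print(round(jensen_gap(fragile, 1.0), 2))       # negative -> fragile
print(round(jensen_gap(antifragile, 1.0), 2))   # positive -> antifragile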

Read the full article at: arxiv.org

A tiny fraction of all species forms most of nature: Rarity as a sticky state

Complexity Digest - Wed, 01/10/2024 - 08:21

Egbert H. van Nes, Diego G. F. Pujoni, Sudarshan A. Shetty, Gerben Straatsma, Willem M. de Vos, Marten Scheffer

PNAS 121 (2) e2221791120

Data from the human microbiome as well as communities of flies, rodents, fish, trees, plankton, and fungi suggest that consistently a tiny fraction of the species accounts for most of the biomass. We suggest that this may be due to an overlooked phenomenon that we call “stickiness” of rarity. This can arise in groups of species that are equivalent in resource use but differ in their response to stochastic stressors such as weather extremes and disease outbreaks. Stickiness is not absolute though. In our simulations, as well as natural time series from microbial communities, rare species occasionally replace dominant ones that collapse, supporting the insurance theory of biodiversity. Rare species may play an important role as backups stabilizing ecosystem functioning.
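
A minimal sketch (an illustration, not the paper's model) of how competitors that are equivalent in resource use but hit by independent stochastic stressors can end up with a tiny fraction of species holding most of the biomass.

import random

def simulate(n_species=100, steps=300, stress=0.15, seed=1):
    rng = random.Random(seed)
    abundance = [1.0] * n_species
    for _ in range(steps):
        # Independent multiplicative stressors, then renormalisation to a fixed
        # total biomass (zero-sum competition among equivalent species).
        abundance = [a * rng.lognormvariate(0.0, stress) for a in abundance]
        total = sum(abundance)
        abundance = [a / total for a in abundance]
    return sorted(abundance, reverse=True)

ranked = simulate()
print("biomass share of the 5 most abundant species (out of 100):", round(sum(ranked[:5]), 3))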

Read the full article at: www.pnas.org

Infodynamics, a Review

Complexity Digest - Tue, 01/09/2024 - 15:17

Klaus Jaffe

A review of studies on the interaction of information with the physical world found no fundamental contradiction between the eight authors promoting Infodynamics. Each one emphasizes different aspects. The fact that energy requires information in order to produce work and that the acquisition of new information requires energy, triggers synergistic chain reactions producing increases of negentropy (increases in Useful Information or decreases in Information Entropy) in living systems. Infodynamics aims to study feasible balances between energy and information using empirical methods. Getting information requires energy and so does separating useful information from noise. Producing energy requires information, but there is no direct proportionality between the energy required to produce the information and the energy unleashed by this information. Energy and information are parts of two separate realms of reality that are intimately entangled but follow different laws of nature. Infodynamics recognizes multiple forms and dimensions of information. Information can be the opposite of thermodynamic entropy (Negentropy), a trigger of Free Energy (Useful or Potentially Useful), a reserve (Redundant Information), Structural, Enformation, Intropy, Entangled, Encrypted Information or Noise. These are overlapping functional properties focusing on different aspects of Information. Studies on information entropy normally quantify only one of these dimensions. The challenge of Infodynamics is to design empirical studies to overcome these limitations. The working of sexual reproduction and its evolution through natural selection and its role in powering the continuous increase in information and energy in living systems might teach us how.
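
A minimal sketch of one of the quantities named above: Shannon entropy of a distribution and its negentropy, the shortfall from the maximum-entropy (uniform) case. This covers only the information-theoretic bookkeeping, not the energetic side of the argument.

from math import log2

def entropy(p):
    return -sum(x * log2(x) for x in p if x > 0)

def negentropy(p):
    return log2(len(p)) - entropy(p)     # zero for a uniform distribution

print(negentropy([0.25, 0.25, 0.25, 0.25]))   # 0.0 bits: no structure
print(negentropy([0.7, 0.1, 0.1, 0.1]))       # >0 bits: the distribution carries structure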

Read the full article at: www.qeios.com

Critical phenomena in complex networks: from scale-free to random networks

Complexity Digest - Tue, 01/09/2024 - 13:06

Alexander Nesterov & Pablo Héctor Mata Villafuerte

The European Physical Journal B, Volume 96, Article number 143 (2023)

Within the conventional statistical physics framework, we study critical phenomena in configuration network models with hidden variables controlling links between pairs of nodes. We obtain analytical expressions for the average node degree, the expected number of edges in the graph, and the Landau and Helmholtz free energies. We demonstrate that the network’s temperature controls the average node degree in the whole network. We also show that phase transition in an asymptotically sparse network leads to fundamental structural changes in the network topology. Below the critical temperature, the graph is completely disconnected; above the critical temperature, the graph becomes connected, and a giant component appears. Increasing temperature changes the degree distribution from a power-law degree distribution at lower temperatures to a Poisson-like distribution at high temperatures. Our findings suggest that temperature might be an inalienable property of real networks.
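
A minimal sketch of the qualitative picture described above, using a generic hidden-variable ensemble with a Fermi-Dirac-style connection probability p_ij = 1 / (1 + exp((e_i + e_j - mu) / T)); the hidden-variable distribution and parameter values here are illustrative assumptions, not the paper's exact model. Sweeping the temperature shows the giant component appearing.

import math
import random

def giant_component_fraction(n=400, temperature=0.3, mu=-0.5, seed=0):
    rng = random.Random(seed)
    e = [rng.random() for _ in range(n)]     # hidden variables in [0, 1]
    parent = list(range(n))                  # union-find over connected components

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]    # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            p = 1.0 / (1.0 + math.exp((e[i] + e[j] - mu) / temperature))
            if rng.random() < p:
                parent[find(i)] = find(j)

    sizes = {}
    for i in range(n):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / n

for t in (0.1, 0.2, 0.3, 0.5, 1.0):
    print(t, round(giant_component_fraction(temperature=t), 2))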

Read the full article at: link.springer.com

SHEEP, a Signed Hamiltonian Eigenvector Embedding for Proximity

Complexity Digest - Mon, 01/08/2024 - 13:02

Shazia’Ayn Babul & Renaud Lambiotte 

Communications Physics volume 7, Article number: 8 (2024)

Signed network embedding methods allow for a low-dimensional representation of nodes and primarily focus on partitioning the graph into clusters, hence losing information on continuous node attributes. Here, we introduce a spectral embedding algorithm for understanding proximal relationships between nodes in signed graphs, where edges can take either positive or negative weights. Inspired by a physical model, we construct our embedding as the minimum energy configuration of a Hamiltonian dependent on the distance between nodes and locate the optimal embedding dimension. We show through a series of experiments on synthetic and empirical networks, that our method (SHEEP) can recover continuous node attributes showcasing its main advantages: re-configurability into a computationally efficient eigenvector problem, retrieval of ground state energy which can be used as a statistical test for the presence of strong balance, and measure of node extremism, computed as the distance to the origin in the optimal embedding.
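
A minimal sketch of the general idea (not the paper's exact Hamiltonian or algorithm): embed the nodes of a signed graph in one dimension by minimising x^T L x with the signed Laplacian L = D - A, where D holds the row sums of |A|, so that positive edges pull coordinates together and negative edges push them apart.

import numpy as np

def signed_spectral_embedding(A):
    D = np.diag(np.abs(A).sum(axis=1))
    L = D - A
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, 0]          # minimum-energy (lowest-eigenvalue) configuration

# Two internally friendly factions joined by hostile edges: the 1D embedding
# separates the factions by sign.
A = np.array([
    [ 0,  1,  1, -1, -1,  0],
    [ 1,  0,  1,  0, -1, -1],
    [ 1,  1,  0, -1,  0, -1],
    [-1,  0, -1,  0,  1,  1],
    [-1, -1,  0,  1,  0,  1],
    [ 0, -1, -1,  1,  1,  0],
], dtype=float)
print(np.round(signed_spectral_embedding(A), 2))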

Read the full article at: www.nature.com

Complex Systems Summer School | Santa Fe Institute

Complexity Digest - Mon, 01/08/2024 - 10:04

Complex Systems Summer School (CSSS) offers an intensive four-week introduction to complex behavior in mathematical, physical, living, and social systems. CSSS brings together graduate students, postdoctoral fellows, and professionals to transcend disciplinary boundaries, take intellectual risks, and ask big questions about complex systems. The residential program comprises a series of lectures and workshops devoted to theory and tools, applications-focused seminars, and discussions with faculty and fellow participants. CSSS participants put what they learn from these didactic sessions into practice through group research projects, conducted throughout the program and often extending into manuscripts and longer-term collaborations. CSSS provides an unparalleled opportunity for early-career researchers to expand their professional networks, produce a novel research product, and gain valuable experience working in transdisciplinary teams.

More at: www.santafe.edu

Emergence and Causality in Complex Systems: A Survey on Causal Emergence and Related Quantitative Studies

Complexity Digest - Sun, 01/07/2024 - 17:02

Bing Yuan, Zhang Jiang, Aobo Lyu, Jiayun Wu, Zhipeng Wang, Mingzhe Yang, Kaiwei Liu, Muyun Mou, Peng Cui

Emergence and causality are two fundamental concepts for understanding complex systems. They are interconnected. On one hand, emergence refers to the phenomenon where macroscopic properties cannot be solely attributed to the cause of individual properties. On the other hand, causality can exhibit emergence, meaning that new causal laws may arise as we increase the level of abstraction. Causal emergence theory aims to bridge these two concepts and even employs measures of causality to quantify emergence. This paper provides a comprehensive review of recent advancements in quantitative theories and applications of causal emergence. Two key problems are addressed: quantifying causal emergence and identifying it in data. Addressing the latter requires the use of machine learning techniques, thus establishing a connection between causal emergence and artificial intelligence. We highlight that the architectures used for identifying causal emergence are shared by causal representation learning, causal model abstraction, and world model-based reinforcement learning. Consequently, progress in any of these areas can benefit the others. Potential applications and future perspectives are also discussed in the final section of the review.
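
A minimal sketch of the "measure causality to quantify emergence" idea: effective information (EI) is the mutual information between a uniform (maximum-entropy) intervention on the current state and the next state, and causal emergence occurs when a coarse-grained macro description has higher EI than the micro one. The toy transition matrices below are a standard illustrative example, not taken from this survey.

from math import log2

def effective_information(tpm):
    n = len(tpm)
    # Distribution over next states induced by a uniform intervention on the current state.
    avg = [sum(row[j] for row in tpm) / n for j in range(n)]
    ei = 0.0
    for row in tpm:
        ei += sum(p * log2(p / avg[j]) for j, p in enumerate(row) if p > 0) / n
    return ei

# Micro: three noisy, interchangeable "off" states and one absorbing "on" state.
micro = [
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]
# Macro: group the three noisy states into one; the dynamics become deterministic.
macro = [
    [1.0, 0.0],
    [0.0, 1.0],
]
print("EI micro:", round(effective_information(micro), 3))   # ~0.81 bits
print("EI macro:", round(effective_information(macro), 3))   # 1.0 bit -> causal emergence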

Read the full article at: arxiv.org

Scalable network reconstruction in subquadratic time

Complexity Digest - Sun, 01/07/2024 - 12:58

Tiago P. Peixoto

Network reconstruction consists in determining the unobserved pairwise couplings between N nodes given only observational data on the resulting behavior that is conditioned on those couplings — typically a time-series or independent samples from a graphical model. A major obstacle to the scalability of algorithms proposed for this problem is a seemingly unavoidable quadratic complexity of O(N^2), corresponding to the requirement of each possible pairwise coupling being contemplated at least once, despite the fact that most networks of interest are sparse, with a number of non-zero couplings that is only O(N). Here we present a general algorithm applicable to a broad range of reconstruction problems that achieves its result in subquadratic time, with a data-dependent complexity loosely upper bounded by O(N^(3/2) log N), but with a more typical log-linear complexity of O(N log^2 N). Our algorithm relies on a stochastic second neighbor search that produces the best edge candidates with high probability, thus bypassing an exhaustive quadratic search. In practice, our algorithm achieves a performance that is many orders of magnitude faster than the quadratic baseline, allows for easy parallelization, and thus enables the reconstruction of networks with hundreds of thousands and even millions of nodes and edges.
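
A minimal sketch of the "second neighbor search" principle (an illustration of the general idea, not the paper's algorithm): instead of scoring all O(N^2) pairs, keep a best-k candidate list per node and repeatedly propose neighbours of current neighbours, plus a little random exploration, which tends to surface the strongest couplings with far fewer pairwise evaluations.

import random

def similarity(x, y):
    # Pearson correlation between two equal-length series: the pairwise score
    # whose exhaustive O(N^2) evaluation the candidate search tries to avoid.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy) if vx > 0 and vy > 0 else 0.0

def candidate_search(series, k=2, rounds=10, seed=0):
    rng = random.Random(seed)
    n = len(series)
    # Start from random candidates per node.
    best = {i: {j: similarity(series[i], series[j])
                for j in rng.sample([m for m in range(n) if m != i], k)}
            for i in range(n)}
    for _ in range(rounds):
        for i in range(n):
            # Propose second neighbours (neighbours of current neighbours) plus
            # a couple of random nodes, then keep only the top-k by score.
            proposals = {c for j in best[i] for c in best[j] if c != i}
            proposals.update(j for j in rng.sample(range(n), 2) if j != i)
            for c in proposals:
                if c not in best[i]:
                    best[i][c] = similarity(series[i], series[c])
            best[i] = dict(sorted(best[i].items(), key=lambda kv: -kv[1])[:k])
    return best

# Tiny demo: 12 series in 4 groups of 3, where series in the same group share a
# common signal; the search typically recovers each node's two group-mates.
rng = random.Random(1)
signals = [[rng.gauss(0, 1) for _ in range(200)] for _ in range(4)]
series = [[s + 0.3 * rng.gauss(0, 1) for s in signals[g]] for g in range(4) for _ in range(3)]
found = candidate_search(series)
print({i: sorted(found[i]) for i in range(12)})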

Read the full article at: arxiv.org
