Complexity Digest

Networking the complexity community since 1999

Beehive scale-free emergent dynamics

Sat, 06/29/2024 - 09:14

Ivan Shpurov, Tom Froese & Dante R. Chialvo 

Scientific Reports volume 14, Article number: 13404 (2024)

It has been repeatedly reported that the collective dynamics of social insects exhibit universal emergent properties similar to other complex systems. In this note, we study a previously published data set in which the positions of thousands of honeybees in a hive are individually tracked over multiple days. The results show that the hive dynamics exhibit long-range spatial and temporal correlations in the occupancy density fluctuations, despite the characteristically short-range mutual interactions among bees. The variations in occupancy reveal a non-monotonic relation between density and bee flow, reminiscent of car traffic dynamics near a jamming transition, at which system performance is optimized to achieve the highest possible throughput. Overall, these results suggest that the beehive's collective dynamics are self-adjusted towards a point near the optimal density.

Read the full article at: www.nature.com

Assembly Theory and its Relationship with Computational Complexity

Fri, 06/28/2024 - 19:10

Christopher Kempes, Sara I. Walker, Michael Lachmann, Leroy Cronin

Assembly theory (AT) quantifies selection using the assembly equation and identifies complex objects that occur in abundance based on two measurements, assembly index and copy number. The assembly index is determined by the minimal number of recursive joining operations necessary to construct an object from basic parts, and the copy number is how many of the given object(s) are observed. Together, these allow one to define a quantity, called Assembly, which captures the amount of causation required to produce the observed objects in the sample. AT's focus on how selection generates complexity offers an approach distinct from that of computational complexity theory, which focuses on minimum descriptions via compressibility. To explore the formal differences between the two approaches, we present several simple and explicit mathematical examples demonstrating that the assembly index, itself only one piece of the theoretical framework of AT, is formally not equivalent to other commonly used complexity measures from computer science and information theory, including Huffman encoding and Lempel–Ziv–Welch compression.
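
As a toy illustration of the assembly index idea (not the molecular algorithms used in the AT literature), the sketch below computes the minimal number of joining operations for short character strings by iterative-deepening search, where any previously built substring may be reused in later joins. The string targets, the substring pruning, and the brute-force search are illustrative assumptions; the search cost grows quickly with string length.

```python
def assembly_index(target: str) -> int:
    """Minimal number of joining (concatenation) operations needed to build
    `target` from its single characters, reusing any previously built piece."""
    basics = frozenset(target)  # single characters are the free building blocks

    def search(pool: frozenset, depth: int) -> bool:
        if target in pool:
            return True
        if depth == 0:
            return False
        for a in pool:
            for b in pool:
                new = a + b
                # in a minimal pathway every intermediate is a substring of the target
                if new in target and new not in pool:
                    if search(pool | {new}, depth - 1):
                        return True
        return False

    for d in range(len(target)):  # iterative deepening: try the smallest depth first
        if search(basics, d):
            return d
    return len(target) - 1

print(assembly_index("ABAB"))    # 2: A+B -> AB, AB+AB -> ABAB
print(assembly_index("BANANA"))  # 4: AN, AN+AN -> ANAN, ANAN+A -> ANANA, B+ANANA
```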

Read the full article at: arxiv.org

Hidden citations obscure true impact in science

Fri, 06/28/2024 - 15:07

Xiangyi Meng, Onur Varol, Albert-László Barabási

PNAS Nexus, Volume 3, Issue 5, May 2024, pgae155

References, the mechanism scientists rely on to signal previous knowledge, lately have turned into widely used and misused measures of scientific impact. Yet, when a discovery becomes common knowledge, citations suffer from obliteration by incorporation. This leads to the concept of hidden citation, representing a clear textual credit to a discovery without a reference to the publication embodying it. Here, we rely on unsupervised interpretable machine learning applied to the full text of each paper to systematically identify hidden citations. We find that for influential discoveries hidden citations outnumber citation counts, emerging regardless of publishing venue and discipline. We show that the prevalence of hidden citations is not driven by citation counts, but rather by the degree of the discourse on the topic within the text of the manuscripts, indicating that the more a discovery is discussed, the less visible it is to standard bibliometric analysis. Hidden citations indicate that bibliometric measures offer a limited perspective on quantifying the true impact of a discovery, raising the need to extract knowledge from the full text of the scientific corpus.
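
The paper identifies hidden citations with unsupervised interpretable machine learning over full text; the sketch below illustrates only the underlying idea with naive string matching: a discovery receives textual credit (its canonical phrase appears in the body) but no bibliographic credit (its origin paper is absent from the reference list). The phrases, reference strings, and matching rule are illustrative assumptions, not the authors' method.

```python
def find_hidden_citations(full_text: str, references: list[str],
                          discoveries: dict[str, str]) -> list[str]:
    """Flag discoveries credited in the text but missing from the references.
    `discoveries` maps a canonical phrase (e.g. "scale-free network") to an
    identifying string for its origin paper (e.g. "Barabasi 1999")."""
    text = full_text.lower()
    refs = " ".join(references).lower()
    return [phrase for phrase, origin in discoveries.items()
            if phrase.lower() in text and origin.lower() not in refs]

paper_text = "We model the web as a scale-free network and study its resilience."
bibliography = ["Watts & Strogatz 1998, Collective dynamics of small-world networks"]
print(find_hidden_citations(paper_text, bibliography,
                            {"scale-free network": "Barabasi 1999"}))
# ['scale-free network']: credited in the text, absent from the reference list
```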

Read the full article at: academic.oup.com

Feedback: How to Destroy or Save the World — Péter Érdi

Fri, 06/28/2024 - 13:46

The book offers an exciting, non-technical intellectual journey through the application of feedback control to emerging local and global crises and their management, thus keeping the world on a sustainable trajectory. There is a narrow border between destruction and prosperity: to ensure reasonable growth but avoid existential risk, we must find the fine-tuned balance between positive and negative feedback. This book addresses readers belonging to various generations: young people growing up in a world where everything seems to be falling apart; people in their 30s and 40s who are thinking about how to live a fulfilling life; readers in their 50s and 60s thinking back on life; and Baby Boomers reflecting on their past successes and failures.

Read the full article at: link.springer.com

François Chollet on Deep Learning and the Meaning of Intelligence

Fri, 06/28/2024 - 12:54

Which is more intelligent, ChatGPT or a 3-year-old? Of course, this depends on what we mean by “intelligence.” A modern LLM is certainly able to answer all sorts of questions that require knowledge far beyond the capacity of a 3-year-old, and even to perform synthetic tasks that seem remarkable to many human grown-ups. But is that really intelligence? François Chollet argues that it is not, and that LLMs are never going to be truly “intelligent” in the usual sense — although other approaches to AI might get there.

Listen at: www.preposterousuniverse.com

Heinz von Foerster’s operational epistemology: orientation for insight into complexity

Fri, 06/28/2024 - 12:21

Arantzazu Saratxaga Arregi
Kybernetes

Purpose

Based on the reception of the principle of self-organization, the core of Heinz von Foerster’s operational theories, I hypothesize that Heinz von Foerster’s theory can serve as an orientation model for the epistemological problem of complexity. I have chosen this study to demonstrate that complexity is an epistemological problem, because the question of how order arises – the core problem of complexity – is an epistemological question for which Heinz von Foerster developed an epistemology of self-organization. I do not present new research, since HvF already had the complex organization of systems in mind; rather, I build a critical approach to complexity on his research and work on operational epistemology.

Design/methodology/approach

This article aims to provide an orientation for a philosophical and epistemological understanding of complexity through a reading of Heinz von Foerster’s operational theory. The article attempts to establish complexity as an epistemological phenomenon through the following method: (1) a conceptual description of the science of complexity based on the turn to thermodynamic time, (2) a genealogy of complexity going back to the systemic method, and (3) Heinz von Foerster’s cybernetic approach to self-organization.

Findings

Based on the reception of the principle of self-organization, the core of Heinz von Foerster’s operational theories, the conclusion is drawn that complexity as a description is based on language games.

Research limitations/implications

The results present complexity not as an object of science, but as a description that stands for the understanding of complex description.

Social implications

The hypothesis that complexity is a question of description or observation – i.e. of the description that language serves – has enormous social implications, in that the description of complexes and the recognition of their orders (patterns) cannot be left to algorithmic governmentality, but must be carried out by a social agency.

Originality/value

HvF’s operational epistemology can serve as an epistemological model for critical complexity theory.

Read the full article at: www.emerald.com

Unveiling the reproduction number scaling in characterizing social contagion coverage

Fri, 06/28/2024 - 11:09

Xiangrong Wang, Hongru Hou, Dan Lu, Zongze Wu, Yamir Moreno

Chaos, Solitons & Fractals

Volume 185, August 2024, 115119

The spreading of diseases depends critically on the reproduction number, which gives the expected number of new cases produced by infectious individuals during their lifetime. Here we reveal a widespread power-law scaling relationship between the variance and the mean of the reproduction number across simple and complex contagion mechanisms on various network structures. This scaling relation is verified on an empirical scientific collaboration network and analytically studied using generating functions. Specifically, we explore the impact of the scaling law of the reproduction number on the expected size of cascades of contagions. We find that the mean cascade size can be inferred from the mean reproduction number, albeit with limitations in capturing spreading variations. Nonetheless, insights derived from the tail of the distribution of the reproduction number contribute to explaining cascade size variation and allow the distinction between simple and complex contagion mechanisms. Our study sheds light on the intricate dynamics of spreading processes and cascade sizes in social networks, offering valuable insights for managing contagion outbreaks and optimizing responses to emerging threats.
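
The variance–mean scaling of the reproduction number can be probed numerically. Below is a minimal sketch (not the paper's methodology, and covering only a simple contagion) that simulates independent transmission with probability beta on a Barabási–Albert graph using networkx, records the number of secondary cases produced by each infected node, and estimates the scaling exponent between the variance and the mean of that offspring distribution on a log–log scale. The network model, parameter ranges, and fitting procedure are illustrative assumptions.

```python
import random
import numpy as np
import networkx as nx

def secondary_cases(G, beta, seed_frac=0.01):
    """One simple-contagion cascade: each infected node transmits to each
    still-susceptible neighbour with probability beta; returns the number of
    secondary cases ("offspring") produced by every node that got infected."""
    seeds = random.sample(list(G.nodes), max(1, int(seed_frac * G.number_of_nodes())))
    infected, frontier = set(seeds), list(seeds)
    offspring = {}
    while frontier:
        new_frontier = []
        for u in frontier:
            children = [v for v in G.neighbors(u)
                        if v not in infected and random.random() < beta]
            offspring[u] = len(children)
            infected.update(children)
            new_frontier.extend(children)
        frontier = new_frontier
    return list(offspring.values())

G = nx.barabasi_albert_graph(5000, m=3)
means, variances = [], []
for beta in np.linspace(0.02, 0.3, 12):
    counts = []
    for _ in range(20):                       # average over independent cascades
        counts.extend(secondary_cases(G, beta))
    means.append(np.mean(counts))
    variances.append(np.var(counts))

# power-law check: log Var(R) vs log E[R] should be roughly linear
slope, intercept = np.polyfit(np.log(means), np.log(variances), 1)
print(f"estimated scaling exponent: {slope:.2f}")
```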

Read the full article at: www.sciencedirect.com

2025 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2025) – Trondheim, Norway, 17–20 March 2025

Wed, 06/26/2024 - 05:56

IEEE SSCI is widely recognized for cultivating the interchange of state-of-the-art theories and sophisticated algorithms within the broad realm of Computational Intelligence Applications. The Symposia provide for cross-pollination of research concepts, fostering an environment that facilitates future collaborations both within and across research communities.

The 2025 event marks a significant milestone in the evolution of IEEE SSCI, launching the newly restructured biennial Symposia Series featuring ten dedicated Applied Computational Intelligence Symposia.

More at: ieee-ssci.org

Fundamental Constraints to the Logic of Living Systems

Sun, 06/16/2024 - 13:52

Solé, R.; Kempes, C. P.; Corominas-Murtra, B.; De Domenico, M.; Kolchinsky, A.; Lachmann, M.; Libby, E.; Saavedra, S.; Smith, E.; Wolpert, D.

Preprints 2024, 2024060891

It has been argued that the historical nature of evolution makes it a highly path-dependent process. Under this view, the outcome of evolutionary dynamics could have resulted in organisms with different forms and functions. At the same time, there is ample evidence that convergence and constraints strongly limit the domain of the potential design principles that evolution can achieve. Are these limitations relevant in shaping the fabric of the possible? Here, we argue that fundamental constraints are associated with the logic of living matter. We illustrate this idea by considering the thermodynamic properties of living systems, the linear nature of molecular information, the cellular nature of the building blocks of life, multicellularity and development, the threshold nature of computations in cognitive systems, and the discrete nature of the architecture of ecosystems. In all these examples, we present available evidence and suggest potential avenues towards a well-defined theoretical formulation.

Read the full article at: www.preprints.org

Experimental Measurement of Assembly Indices are Required to Determine The Threshold for Life

Sun, 06/16/2024 - 09:51

Sara I. Walker, Cole Mathis, Stuart Marshall, Leroy Cronin

Assembly Theory (AT) was developed to help distinguish living from non-living systems. The theory is simple: it posits that the amount of selection or Assembly is a function of the number of complex objects, where their complexity can be objectively determined using assembly indices. The assembly index of a given object relates to the number of recursive joining operations required to build that object and can not only be rigorously defined mathematically but also experimentally measured. In previous work we outlined the theoretical basis as well as extensive experimental measurements that demonstrated the predictive power of AT. These measurements showed that there is a threshold in assembly indices for organic molecules whereby abiotic chemical systems could not randomly produce molecules with an assembly index greater than or equal to 15. In a recent paper, Hazen et al. [1] not only confused the concept of AT with the algorithms used to calculate assembly indices, but also attempted to falsify AT by calculating theoretical assembly indices for objects made from inorganic building blocks. A fundamental misunderstanding made by the authors is that the threshold is a requirement of the theory, rather than an experimental observation. This means that exploration of inorganic assembly indices similarly requires experimental observation, correlated with the theoretical calculations. Then and only then can the exploration of complex inorganic molecules be done using AT and the threshold for living systems, as expressed with such building blocks, be determined. Since Hazen et al. [1] present no experimental measurements of assembly theory, their analysis is not falsifiable.

Read the full article at: arxiv.org

Higher-order correlations reveal complex memory in temporal hypergraphs

Sat, 06/15/2024 - 07:05

Luca Gallo, Lucas Lacasa, Vito Latora & Federico Battiston 
Nature Communications volume 15, Article number: 4754 (2024)

Many real-world complex systems are characterized by interactions in groups that change in time. Current temporal network approaches, however, are unable to describe group dynamics, as they are based on pairwise interactions only. Here, we use time-varying hypergraphs to describe such systems, and we introduce a framework based on higher-order correlations to characterize their temporal organization. The analysis of human interaction data reveals the existence of coherent and interdependent mesoscopic structures, thus capturing aggregation, fragmentation and nucleation processes in social systems. We introduce a model of temporal hypergraphs with non-Markovian group interactions, which reveals complex memory as a fundamental mechanism underlying the emerging pattern in the data.
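
As a minimal illustration of working with time-varying hypergraphs (not the authors' higher-order correlation framework), the sketch below stores each snapshot as a set of hyperedges (frozensets of nodes) and measures temporal correlation as the average Jaccard overlap between the hyperedge sets at a given lag. The data structure, the toy snapshots, and the overlap measure are illustrative assumptions.

```python
def hyperedge_autocorrelation(snapshots: list[set[frozenset]], lag: int) -> float:
    """Crude temporal-correlation proxy: average Jaccard overlap between the
    sets of hyperedges (group interactions) present at times t and t + lag."""
    overlaps = []
    for t in range(len(snapshots) - lag):
        a, b = snapshots[t], snapshots[t + lag]
        if a or b:
            overlaps.append(len(a & b) / len(a | b))
    return sum(overlaps) / len(overlaps) if overlaps else 0.0

# a temporal hypergraph as a list of snapshots; each hyperedge is a frozenset of nodes
snapshots = [
    {frozenset({1, 2, 3}), frozenset({4, 5})},
    {frozenset({1, 2, 3}), frozenset({4, 5, 6})},
    {frozenset({1, 2}), frozenset({4, 5, 6})},
]
for lag in (1, 2):
    print(lag, round(hyperedge_autocorrelation(snapshots, lag), 3))
```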

Read the full article at: www.nature.com

Tenure-track Research Professor in Data Science and Artificial Intelligence (Investigador Asociado C de Tiempo Completo en el área de “Ciencia de Datos e Inteligencia Artificial”)

Sat, 06/15/2024 - 03:24

At the Centro de Ciencias de la Complejidad (C3), Universidad Nacional Autónoma de México (UNAM)

Deadline: June 28th

More at: www.c3.unam.mx

Measuring Complexity using Information

Fri, 06/14/2024 - 05:32

Klaus Jaffe

Measuring complexity in multidimensional systems with high degrees of freedom and a variety of types of information remains an important challenge. The complexity of a system is related to the number and variety of its components, the number and type of interactions among them, the degree of redundancy, and the degrees of freedom of the system. Examples show that different disciplines of science converge on complexity measures for low- and high-dimensional problems. For low-dimensional systems, such as coded strings of symbols (text, computer code, DNA, RNA, proteins, music), Shannon’s Information Entropy (the expected amount of information in an event drawn from a given distribution) and Kolmogorov’s Algorithmic Complexity (the length of the shortest algorithm that produces the object as output) are used for quantitative measurements of complexity. For systems with more dimensions (ecosystems, brains, social groupings), network science provides better tools for that purpose. For complex, highly multidimensional systems, none of the former methods are useful. Useful Information Φ, as proposed by Infodynamics, can be related to complexity. It can be quantified by measuring the thermodynamic Free Energy F and/or the useful Work it produces. Complexity measured as Total Information I can then be defined as the information of the system, which includes Φ, useless information or Noise N, and Redundant Information R. Measuring one or more of these variables allows quantifying and classifying complexity.
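
For the low-dimensional (string) case, the two measures mentioned above can be computed, or in the algorithmic case approximated, in a few lines. The sketch below uses the empirical symbol distribution for Shannon entropy and a zlib compression ratio as a crude, standard stand-in for Kolmogorov complexity; the example strings are illustrative.

```python
import math
import zlib
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Expected information per symbol (bits) under the empirical distribution."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_complexity(s: str) -> float:
    """Compressor-based proxy for algorithmic complexity: compressed size
    relative to original size (values near 1 mean nearly incompressible)."""
    raw = s.encode()
    return len(zlib.compress(raw, 9)) / len(raw)

for text in ["AAAAAAAAAAAAAAAA", "ABABABABABABABAB", "AQZMXKWPRTBYVCLN"]:
    print(text, round(shannon_entropy(text), 3), round(compression_complexity(text), 3))
```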

Read the full article at: www.qeios.com

Evidence Mounts That About 7% of US Adults Have Had Long COVID

Wed, 06/12/2024 - 12:12

Zhengyi Fang; Rebecca Ahrnsbrak; Andy Rekito

JAMA Data Brief

New data from the Medical Expenditure Panel Survey (MEPS) Household Component support prior findings that about 7% of US adults have had post–COVID-19 condition, also known as long COVID. The household survey of the US civilian noninstitutionalized population, sponsored by the Agency for Healthcare Research and Quality, found that an estimated 6.9% of adults—17.8 million—had ever had long COVID as of early 2023.

This nationally representative survey included a sample of 17 418 adults aged 18 years or older, representing 259 million adults. A total of 8275 adults reported having had COVID-19, of whom 1202 reported having had long COVID symptoms.
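
A quick back-of-the-envelope check of the reported figures (the published 6.9% and 17.8 million come from survey-weighted estimates, so this unweighted calculation is only a consistency check):

```python
sample_adults = 17_418        # MEPS adult respondents
long_covid_cases = 1_202      # respondents who reported long COVID
represented_adults = 259e6    # adult population the sample represents

share = long_covid_cases / sample_adults
print(f"unweighted share: {share:.1%}")                                  # about 6.9%
print(f"implied adults:   {share * represented_adults / 1e6:.1f} million")  # about 17.9 million
```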

Read the full article at: jamanetwork.com

Irruption Theory in Phase Transitions: A Proof of Concept With the Haken-Kelso-Bunz Model

Wed, 06/12/2024 - 11:18

Javier Sánchez-Cañizares

Adaptive Behavior

Many theoretical studies defend the existence of ongoing phase transitions in brain dynamics that could explain the brain's enormous plasticity in coping with the environment. However, tackling the ever-changing landscapes of brain dynamics seems a hopeless task with complex models. This paper uses a simple Haken-Kelso-Bunz (HKB) model to illustrate how phase transitions that change the number of attractors in the landscape for the relative phase between two neural assemblies can occur, helping to explain a qualitative agreement with empirical decision-making measures. Additionally, the paper discusses the possibility of interpreting this agreement with the aid of Irruption Theory (IT). Because it arises from symmetry breaking and the emergence of non-linearities in the fundamental equations, the order parameter governing phase transitions may not have a complete microscopic determination. Hence, many requirements of IT, particularly the Participation Criterion, could be fulfilled by the HKB model and its extensions. Briefly stated, triggering phase transitions in brain activity could thus be conceived of as a consequence of actual motivations or free will participating in decision-making processes.
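
The basic HKB model describes the relative phase φ between the two assemblies by dφ/dt = -a sin φ - 2b sin 2φ, the gradient flow of the potential V(φ) = -a cos φ - b cos 2φ. The sketch below (a minimal numerical illustration, not the paper's analysis, with illustrative parameter values) counts the stable fixed points as the control ratio b/a is lowered, showing the phase transition in which the anti-phase attractor disappears and only the in-phase state remains.

```python
import numpy as np

def stable_phases(a: float, b: float, n: int = 4000) -> list[float]:
    """Stable fixed points of the HKB relative-phase equation
    d(phi)/dt = -a*sin(phi) - 2*b*sin(2*phi), located as downward
    zero crossings of the right-hand side over one period."""
    h = 2 * np.pi / n
    phi = (np.arange(n) + 0.5) * h                 # grid offset so 0 and pi are not grid points
    f = -a * np.sin(phi) - 2 * b * np.sin(2 * phi)
    stable = []
    for i in range(n):
        j = (i + 1) % n                            # wrap around the periodic domain
        if f[i] > 0 and f[j] < 0:                  # positive-to-negative crossing -> attractor
            p = phi[i] + h * f[i] / (f[i] - f[j])  # linear interpolation of the root
            if p > np.pi:                          # map to (-pi, pi] for readability
                p -= 2 * np.pi
            stable.append(round(float(p), 2) + 0.0)  # + 0.0 normalises a possible -0.0
    return stable

# lowering b/a (as when movement frequency increases) removes the anti-phase attractor,
# leaving only the in-phase state: the change in the number of attractors discussed above
for b_over_a in (1.0, 0.3, 0.1):
    print(f"b/a = {b_over_a}: stable relative phases {stable_phases(a=1.0, b=b_over_a)}")
```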

Read the full article at: journals.sagepub.com

ICTP-SAIFR School on Active Matter

Tue, 06/11/2024 - 11:16

Date: September 30 – October 4, 2024
Venue: IFT-UNESP, São Paulo, Brazil

Active matter describes systems whose constituent elements consume energy locally in order to move or to exert mechanical forces. As such, active matter systems are intrinsically out of thermodynamic equilibrium. Examples include flocks or herds of animals, collections of cells, components of the cellular cytoskeleton and even artificial microswimmers. Active matter is a rapidly growing field involving diverse scientific communities in physics, biology, computational sciences, applied mathematics, chemistry, and engineering. Numerous applications of active matter are constantly arising in biological systems, smart materials, precision medicine, and robotics.

This school is intended for graduate students and researchers interested in the physics of active matter. The lectures will cover well-tested and successful theoretical approaches as well as a discussion of experimental results. To achieve this purpose, leading experts will present lectures on fundamental aspects of active matter and a pedagogical exposition of its recent trends.

Applicants are invited to submit abstracts for poster presentations.

There is no registration fee and limited funds are available for travel and local expenses.

Lecturers:
  • Julia M Yeomans (University of Oxford, UK): From Active Nematics to Mechanobiology
  • Rodrigo Soto (Universidad de Chile, Chile): Computational Modeling of Active Systems
  • Aparna Baskaran (Brandeis University, USA): Theoretical Foundations of Active Matter: Lessons from Ideal Microscopic Models
  • Francesco Ginelli (University of Insubria, Italy): Physics of Flocking
Application deadline: July 27, 2024

More information: https://www.ictp-saifr.org/sam2024/

Anatomy of an AI-powered malicious social botnet

Mon, 06/10/2024 - 07:59

Yang, K., & Menczer, F. (2024).

Journal of Quantitative Description: Digital Media 4

Large language models (LLMs) exhibit impressive capabilities in generating realistic text across diverse subjects. Concerns have been raised that they could be utilized to produce fake content with a deceptive intention, although evidence thus far remains anecdotal. This paper presents a case study about a Twitter botnet that appears to employ ChatGPT to generate human-like content. Through heuristics, we identify 1,140 accounts and validate them via manual annotation. These accounts form a dense cluster of fake personas that exhibit similar behaviors, including posting machine-generated content and stolen images, and engage with each other through replies and retweets. ChatGPT-generated content promotes suspicious websites and spreads harmful comments. While the accounts in the AI botnet can be detected through their coordination patterns, current state-of-the-art LLM content classifiers fail to discriminate between them and human accounts in the wild. These findings highlight the threats posed by AI-enabled social bots.

Read the full article at: journalqd.org

Is the Emergence of Life an Expected Phase Transition in the Evolving Universe?

Sun, 06/02/2024 - 10:27

Stuart Kauffman and Andrea Roli

We propose a novel definition of life in terms of which its emergence in the universe is expected, and its ever-creative open-ended evolution is entailed by no law. Living organisms are Kantian Wholes that achieve Catalytic Closure, Constraint Closure, and Spatial Closure. We here unite for the first time two established mathematical theories, namely Collectively Autocatalytic Sets and the Theory of the Adjacent Possible. The former establishes that a first-order phase transition to molecular reproduction is expected in the chemical evolution of the universe where the diversity and complexity of molecules increases; the latter posits that, under loose hypotheses, if the system starts with a small number of beginning molecules, each of which can combine with copies of itself or other molecules to make new molecules, over time the number of kinds of molecules increases slowly but then explodes upward hyperbolically. Together these theories imply that life is expected as a phase transition in the evolving universe. The familiar distinction between software and hardware loses its meaning in living cells. We propose new ways to study the phylogeny of metabolisms, new astronomical ways to search for life on exoplanets, new experiments to seek the emergence of the most rudimentary life, and the hint of a coherent testable pathway to prokaryotes with template replication and coding.
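
The hyperbolic growth described by the Theory of the Adjacent Possible is often written, in one simple mean-field form, as M_{t+1} = M_t + Σ_{i=2}^{M_t} α^i C(M_t, i), where M_t is the number of kinds of molecules. The sketch below iterates this recursion to show the slow-then-explosive growth the abstract describes; the choice α_i = α^i, the parameter values, and the stopping cap are assumptions for illustration, not taken from the paper.

```python
def adjacent_possible_increment(n: int, alpha: float) -> float:
    """sum_{i=2}^{n} alpha**i * C(n, i), accumulated term by term in floats so
    that huge binomial coefficients never need to be materialised exactly."""
    term = alpha * n                        # the i = 1 term, used only to seed the recursion
    total = 0.0
    for i in range(2, n + 1):
        term *= alpha * (n - i + 1) / i     # alpha**i * C(n, i) from the previous term
        total += term
        if term < 1e-12 * total:            # remaining terms are negligible
            break
    return total

def tap_growth(m0: float = 10.0, alpha: float = 0.05, steps: int = 400, cap: float = 1e5):
    """Iterate M_{t+1} = M_t + sum_{i>=2} alpha**i * C(M_t, i): slow growth in the
    number of kinds M, followed by a hyperbolic (finite-time) explosion."""
    m, history = m0, [m0]
    for _ in range(steps):
        m += adjacent_possible_increment(int(m), alpha)
        history.append(m)
        if m > cap:                         # stop once the combinatorial explosion sets in
            break
    return history

hist = tap_growth()
for t in range(0, len(hist), max(1, len(hist) // 10)):
    print(t, f"{hist[t]:,.1f}")
print(len(hist) - 1, f"{hist[-1]:,.1f}")
```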

Read the full article at: osf.io

Self-Improvising Memory: A Perspective on Memories as Agential, Dynamically Reinterpreting Cognitive Glue

Fri, 05/31/2024 - 09:21

Michael Levin

Entropy 2024, 26(6), 481

Many studies on memory emphasize the material substrate and mechanisms by which data can be stored and reliably read out. Here, I focus on complementary aspects: the need for agents to dynamically reinterpret and modify memories to suit their ever-changing selves and environment. Using examples from developmental biology, evolution, and synthetic bioengineering, in addition to neuroscience, I propose that a perspective on memory as preserving salience, not fidelity, is applicable to many phenomena on scales from cells to societies. Continuous commitment to creative, adaptive confabulation, from the molecular to the behavioral levels, is the answer to the persistence paradox as it applies to individuals and whole lineages. I also speculate that a substrate-independent, processual view of life and mind suggests that memories, as patterns in the excitable medium of cognitive systems, could be seen as active agents in the sense-making process. I explore a view of life as a diverse set of embodied perspectives—nested agents who interpret each other’s and their own past messages and actions as best as they can (polycomputation). This synthesis suggests unifying symmetries across scales and disciplines, which is of relevance to research programs in Diverse Intelligence and the engineering of novel embodied minds.

Read the full article at: www.mdpi.com

Antifragility of stochastic transport on networks with damage

Thu, 05/30/2024 - 15:21

L. K. Eraso-Hernandez, A. P. Riascos

A system is called antifragile when damage acts as a constructive element improving the performance of a global function. In this letter, we analyze the emergence of antifragility in the movement of random walkers on networks with modular structures or communities. The random walker hops considering the capacity of transport of each link, whereas the links are susceptible to random damage that accumulates over time. We show that in networks with communities and high modularity, the localization of damage in specific groups of nodes leads to a global antifragile response of the system improving the capacity of stochastic transport to more easily reach the nodes of a network. Our findings give evidence of the mechanisms behind antifragile response in complex systems and pave the way for their quantitative exploration in different fields.
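
One way to make "capacity of stochastic transport" concrete (not necessarily the measure used in the paper) is Kemeny's constant of the capacity-weighted random walk: the expected number of steps to reach a target node drawn from the stationary distribution, independent of the starting node, so smaller values mean faster global transport. The sketch below builds a two-community network with networkx, accumulates random damage on the links of one community only, and compares the constant before and after; the graph model, the damage rule, and all parameters are illustrative assumptions, and whether the response turns out antifragile depends on the structure, as the paper argues.

```python
import numpy as np
import networkx as nx

def kemeny_constant(G, weight="capacity"):
    """Kemeny's constant of the random walk whose hop probabilities are
    proportional to link capacities; smaller means better global transport."""
    W = nx.to_numpy_array(G, weight=weight)
    P = W / W.sum(axis=1, keepdims=True)            # capacity-weighted transition matrix
    eig = np.sort(np.linalg.eigvals(P).real)[::-1]  # eigenvalues, largest (= 1) first
    return float(np.sum(1.0 / (1.0 - eig[1:])))

# toy modular network: two dense communities of 50 nodes joined by sparse bridges
G = nx.planted_partition_graph(2, 50, p_in=0.3, p_out=0.01, seed=1)
nx.set_edge_attributes(G, 1.0, "capacity")
before = kemeny_constant(G)

# accumulate random damage on links inside community 0 only (nodes 0-49)
rng = np.random.default_rng(1)
internal = [(u, v) for u, v in G.edges if u < 50 and v < 50]
for _ in range(300):
    u, v = internal[rng.integers(len(internal))]
    G[u][v]["capacity"] *= 0.5                      # each damage event halves a link's capacity

after = kemeny_constant(G)
print(f"Kemeny constant before damage: {before:.1f}, after localized damage: {after:.1f}")
```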

Read the full article at: arxiv.org
