Hannah E. Davis, Lisa McCorkell, Julia Moore Vogel & Eric J. Topol
Nature Reviews Microbiology (2023)
Long COVID is an often debilitating illness that occurs in at least 10% of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections. More than 200 symptoms have been identified with impacts on multiple organ systems. At least 65 million individuals worldwide are estimated to have long COVID, with cases increasing daily. Biomedical research has made substantial progress in identifying various pathophysiological changes and risk factors and in characterizing the illness; further, similarities with other viral-onset illnesses such as myalgic encephalomyelitis/chronic fatigue syndrome and postural orthostatic tachycardia syndrome have laid the groundwork for research in the field. In this Review, we explore the current literature and highlight key findings, the overlap with other conditions, the variable onset of symptoms, long COVID in children and the impact of vaccinations. Although these key findings are critical to understanding long COVID, current diagnostic and treatment options are insufficient, and clinical trials must be prioritized that address leading hypotheses. Additionally, to strengthen long COVID research, future studies must account for biases and SARS-CoV-2 testing issues, build on viral-onset research, be inclusive of marginalized populations and meaningfully engage patients throughout the research process.
Read the full article at: www.nature.com
Josué Ely Molina, Jorge Flores, Carlos Gershenson, Carlos Pineda
A recent increase in data availability has made it possible to perform a range of statistical linguistic studies. Here we use the Google Books Ngram dataset to analyze word flow among English, French, German, Italian, and Spanish. We study what we define as “migrant words”, a type of loanword that does not change its spelling. We quantify migrant words from one language to another across different decades, and find that most migrant words can be aggregated into semantic fields and associated with historical events. We also study the statistical properties of accumulated migrant words and their rank dynamics. We propose a measure of the use of migrant words that could serve as a proxy for cultural influence. Our methodology is not free of caveats, but our results are encouraging enough to promote further studies in this direction.
Read the full article at: arxiv.org
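As a rough illustration of the paper's definition (not the authors' actual Ngram pipeline), words spelled identically in two languages' vocabularies can be flagged as candidate "migrant words"; the tiny word lists below are invented for the sketch, whereas the real study uses per-decade Google Books Ngram frequencies.

```python
def migrant_candidates(vocab_a, vocab_b):
    """Words that appear with identical spelling in both vocabularies,
    i.e., candidate loanwords that did not change their spelling."""
    return sorted(set(vocab_a) & set(vocab_b))

# Illustrative (made-up) vocabularies; real input would be decade-level
# word-frequency lists from the Google Books Ngram dataset.
english = {"computer", "internet", "ballet", "pizza", "software"}
french = {"ordinateur", "internet", "ballet", "pizza", "logiciel"}

shared = migrant_candidates(english, french)
print(shared)
```

Counting such shared words per decade, as the paper does, then allows grouping them into semantic fields and tracking their rank dynamics over time.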
Giorgio Matassi and Pedro Martinez
Front. Ecol. Evol., 13 January 2023 Sec. Models in Ecology and Evolution
In this review essay, we give a detailed synopsis of the twelve contributions collected in a Special Issue of Frontiers in Ecology and Evolution, based on the research topic “Current Thoughts on the Brain-Computer Analogy—All Metaphors Are Wrong, But Some Are Useful.” The synopsis is complemented by a graphical summary: a matrix linking articles to selected concepts. As first identified by Turing, all authors in this Special Issue recognize semantics as a crucial concern in the brain-computer analogy debate, and consequently address a number of semantic issues. What is missing, we believe, is the distinction between metaphor and analogy, which we reevaluate, describe in some detail, and offer a definition for the latter. To enrich the debate, we also deem it necessary to elaborate on evolutionary theories of the brain, of which we provide an overview. The article closes with thoughts on creativity in science, for we concur with the stance that metaphors and analogies, and their esthetic impact, are essential to the creative process, in the Sciences as well as in the Arts.
Read the full article at: www.frontiersin.org
Sean Kelty, Raiyan Abdul Baten, Adiba Mahbub Proma, Ehsan Hoque, Johan Bollen, Gourab Ghoshal
Academic success is distributed unequally; a few top scientists receive the bulk of attention, citations, and resources. However, do these “superstars” foster leadership in scientific innovation? We introduce three information-theoretic measures that quantify novelty, innovation, and impact from scholarly citation networks, and compare the scholarly output of scientists who are either not connected or strongly connected to superstar scientists. We find that while connected scientists do indeed publish more, garner more citations, and produce more diverse content, this comes at a cost of lower innovation and higher redundancy of ideas. Further, once one removes papers co-authored with superstars, the academic output of these connected scientists diminishes. In contrast, authors that produce innovative content without the benefit of collaborations with scientific superstars produce papers that connect a greater diversity of concepts, publish more, and have comparable citation rates, once one controls for transferred prestige of superstars. On balance, our results indicate that academia pays a price by focusing attention and resources on superstars.
Read the full article at: arxiv.org
Arsham Ghavasieh, Manlio De Domenico
Complex network states are characterized by the interplay between system’s structure and dynamics. One way to represent such states is by means of network density matrices, whose von Neumann entropy characterizes the number of distinct microstates compatible with given topology and dynamical evolution. In this Letter, we propose a maximum entropy principle to characterize network states for systems with heterogeneous, generally correlated, connectivity patterns and non-trivial dynamics. We focus on three distinct coalescence processes, widely encountered in the analysis of empirical interconnected systems, and characterize their entropy and transitions between distinct dynamical regimes across distinct temporal scales. Our framework allows one to study the statistical physics of systems that aggregate, such as in transportation infrastructures serving the same geographic area, or correlate, such as inter-brain synchrony arising in organisms that socially interact, and active matter that swarm or synchronize.
Read the full article at: arxiv.org
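The density-matrix formalism the abstract builds on can be sketched in a few lines: following the general recipe from earlier work on network density matrices, one takes ρ = exp(−τL)/Tr exp(−τL) for the graph Laplacian L and a diffusion time τ, and computes the von Neumann entropy from its spectrum (the parameter names and the toy graph here are our assumptions, not taken from this Letter).

```python
import numpy as np

def von_neumann_entropy(adjacency, tau=1.0):
    """Von Neumann entropy of the diffusion density matrix
    rho = exp(-tau * L) / Tr exp(-tau * L), via the Laplacian spectrum."""
    laplacian = np.diag(adjacency.sum(axis=1)) - adjacency
    mu = np.linalg.eigvalsh(laplacian)   # Laplacian eigenvalues
    p = np.exp(-tau * mu)
    p /= p.sum()                         # eigenvalues of rho, summing to 1
    p = p[p > 1e-12]                     # drop numerical zeros
    return float(-(p * np.log(p)).sum())

# Demo: a 4-node ring. As tau -> 0 the entropy approaches log(N)
# (maximally mixed state); larger tau means more diffusion and lower entropy.
ring = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]], float)
print(von_neumann_entropy(ring, tau=1.0))
```

The entropy counts, in this sense, the number of distinct diffusion microstates compatible with the topology at scale τ, which is the quantity the Letter's maximum entropy principle constrains.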
Tomasz Korbak
Adaptive Behavior 31(1)
The notion of self-organisation plays a major role in enactive cognitive science. In this paper, I review several formal models of self-organisation that various approaches in modern cognitive science rely upon. I then focus on Rosen’s account of self-organisation as closure to efficient cause and his argument that models of systems closed to efficient cause – (M, R) systems – are uncomputable. Despite sometimes being relied on by enactivists, this argument is problematic because it rests on assumptions unacceptable to enactivists: that living systems can be modelled as time-invariant and material-independent. I then argue that there exists a simple and philosophically appealing reparametrisation of (M, R) systems that accounts for the temporal dimensions of life but renders Rosen’s argument invalid.
Read the full article at: journals.sagepub.com
Koen Siteur, Quan-Xing Liu, Vivi Rottschäfer, Tjisse van der Heide, Max Rietkerk, Arjen Doelman, Christoffer Boström, and Johan van de Koppel
PNAS 120 (2) e2202683120
Human-induced environmental changes push ecosystems worldwide toward their limits. Therefore, there is a growing need for indicators to assess the resilience of ecosystems against external changes and disturbances. We highlight a novel class of spatial patterns in ecosystems for which resilience indicators are lacking and introduce a new indicator framework for these ecosystems, akin to the physics of phase separation. Our work suggests that aerial imagery can be used to monitor patchy ecosystems and highlights a link between physics and ecosystem resilience.
Read the full article at: www.pnas.org
Economic complexity methods have become popular tools in economic geography, international development, and innovation studies. Here, I review economic complexity theory and applications, with a particular focus on two streams of literature: the literature on relatedness, which focuses on the evolution of specialization patterns, and the literature on metrics of economic complexity, which uses dimensionality reduction techniques to create metrics of economic sophistication that are predictive of variations in income, economic growth, emissions, and income inequality.
Watch at: www.youtube.com
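The dimensionality-reduction metrics mentioned above can be illustrated with the standard eigenvector-based Economic Complexity Index of Hidalgo and Hausmann: from a binary country-by-product matrix, build a row-stochastic country-country matrix and take the eigenvector of its second-largest eigenvalue. The tiny matrix below is made up for illustration; real analyses use trade data with revealed comparative advantage.

```python
import numpy as np

# Toy country-by-product matrix (rows: countries, cols: products).
M = np.array([[1, 1, 1, 1],
              [1, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 1, 0, 0]], float)

diversity = M.sum(axis=1)   # k_c,0: products per country
ubiquity = M.sum(axis=0)    # k_p,0: countries per product

# Row-stochastic country-country matrix; its second eigenvector is the ECI.
Mcc = (M / diversity[:, None]) @ (M / ubiquity).T
vals, vecs = np.linalg.eig(Mcc)
order = np.argsort(-vals.real)
eci = vecs[:, order[1]].real              # second eigenvector
eci = (eci - eci.mean()) / eci.std()      # standardized ECI
if np.corrcoef(eci, diversity)[0, 1] < 0: # eigenvector sign is arbitrary;
    eci = -eci                            # fix it by correlating with diversity
```

The resulting scores are the kind of "economic sophistication" metric that, in the literature reviewed here, predicts variation in income, growth, emissions, and inequality.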
You will produce a simulation program demonstrating self-organizing logistic networks that become more circular and sustainable over time.
You will create novel research breakthroughs and contribute to the ambitious ERC Advanced Investigator Grant on “Co-Evolving City Life” (CoCi) in subject areas connected to smart cities and digital societies. Your research focus will be on “Sustainable Cities and Coordination”. Given recent digital technologies such as the Internet of Things (sensor and communication networks), Artificial Intelligence, and blockchain technology, one can expect that production, logistics, and even waste, are becoming increasingly smart. Ideally, you will study how the convergence of these technologies can be used to fuel new approaches towards more sustainable production and logistics in an urban context.
The research question we would like to answer is how self-organized, federated, learning, networked multi-agent systems can be used to create socio-economic incentives that promote the emergence of closed loops in a material supply network and thereby boost the formation of a circular and sharing economy. We also want to study how a multi-dimensional real-time measurement, feedback, and coordination system would have to be designed and operated in order to reach this goal.
Together with our team, you will work on the mechanisms and effects of multi-dimensional real-time coordination, perform related agent-based simulations, and work towards demonstrating the approach in an application project. It will be great to couple the simulation program with a sensor-based environment (Raspberry Pi or Arduino, or other) that responds to measurements, flexibly adapts, and self-organizes. You will be the key researcher addressing these challenges or a subset of them (please specify), collaborating with a highly motivated team.
More at: www.jobs.ethz.ch
Joshua Bongard, Michael Levin
The applicability of computational models to the biological world is an active topic of debate. We argue that a useful path forward results from abandoning hard boundaries between categories and adopting an observer-dependent, pragmatic view. Such a view dissolves the contingent dichotomies driven by human cognitive biases (e.g., tendency to oversimplify) and prior technological limitations in favor of a more continuous, gradualist view necessitated by the study of evolution, developmental biology, and intelligent machines. Efforts to re-shape living systems for biomedical or bioengineering purposes require prediction and control of their function at multiple scales. This is challenging for many reasons, one of which is that living systems perform multiple functions in the same place at the same time. We refer to this as “polycomputing” – the ability of the same substrate to simultaneously compute different things. This ability is an important way in which living things are a kind of computer, but not the familiar, linear, deterministic kind; rather, living things are computers in the broad sense of computational materials as reported in the rapidly-growing physical computing literature. We argue that an observer-centered framework for the computations performed by evolved and designed systems will improve the understanding of meso-scale events, as it has already done at quantum and relativistic scales. Here, we review examples of biological and technological polycomputing, and develop the idea that overloading of different functions on the same hardware is an important design principle that helps understand and build both evolved and designed systems. Learning to hack existing polycomputing substrates, as well as evolve and design new ones, will have massive impacts on regenerative medicine, robotics, and computer engineering.
Read the full article at: arxiv.org
Randall D. Beer, Ezequiel A. Di Paolo
Biosystems
Volume 223, January 2023, 104823
Enaction is an increasingly influential approach to cognition that grew out of Maturana and Varela’s earlier work on autopoiesis and the biology of cognition. As with any relatively new scientific discipline, the enactive approach would benefit greatly from a careful analysis of its theoretical foundations. Here we initiate such an analysis for one of the core concepts of enaction, precariousness. Specifically, we consider three types of fragility: systemic, processual and thermodynamic. Using a glider in the Game of Life as a toy model, we illustrate each of these fragilities and examine the relationships between them. We also argue that each type of fragility is characterized by which aspects of a system are hardwired into its definition from the outset and which aspects are emergent and hence vulnerable to disintegration without ongoing maintenance.
Read the full article at: www.sciencedirect.com
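The paper's toy model is easy to reproduce: a glider in Conway's Game of Life persists only by continuously regenerating its own pattern (translating one cell diagonally every four generations), and knocking out a single cell can make it disintegrate entirely. The sketch below is our illustration of that precariousness, not the authors' code.

```python
import numpy as np

def life_step(grid):
    """One synchronous Game of Life update on a toroidal grid."""
    neighbors = sum(np.roll(np.roll(grid, di, 0), dj, 1)
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

grid = np.zeros((12, 12), int)
for i, j in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:  # standard glider
    grid[i + 1, j + 1] = 1

after = grid.copy()
for _ in range(4):
    after = life_step(after)
# After 4 steps the glider reappears translated by (1, 1): the "same"
# pattern exists only through ongoing regeneration.

perturbed = grid.copy()
perturbed[3, 3] = 0          # remove one of the glider's five cells
for _ in range(4):
    perturbed = life_step(perturbed)
# The perturbed configuration dies out completely within a few steps,
# illustrating the fragility of a process sustained by its own dynamics.
```

This makes concrete why a glider is a useful minimal case for distinguishing systemic, processual, and thermodynamic fragility.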
Diffusion models generate incredible images by learning to reverse the process that, among other things, causes ink to spread through water.
Read the full article at: www.quantamagazine.org
Cecile Philippe, Yaneer Bar-Yam, Stephane Bilodeau, Carlos Gershenson, Sunil K. Raina, Shu-Ti Chiou, Gunhild A. Nyborg, Matthias F. Schneider
The Lancet Regional Health – Europe
Volume 25, February 2023, 100574
After a period in which many countries have let the SARS-CoV-2 virus spread more or less freely, individuals and communities are now grappling with the many negative health effects and economic ramifications of high levels of illness over long periods. As evidence of the detrimental long-term effects of the virus mounts, it is increasingly clear that the policy vacuum comes at an unacceptable price both in the short and long term; its only justification would be if there were no alternative that did not come at an even greater cost. Entering the cold season, the number of infections will most likely increase significantly in Europe (by roughly one to two orders of magnitude, as in 2021). While the world awaits and hopes for new and more effective vaccines, we need tools in the toolbox that can effectively control transmission of rapidly spreading new variants, especially if they are more pathogenic. Otherwise, we may face significant disruptions and enormous costs due to repeated waves of illness, with each wave increasing the number of workers forced out of the workforce by long-term health effects. Lockdowns, due to their social restrictions and high short-term economic costs, are no longer the best available option. We here point out that mass testing (regular asymptomatic screening of the general population) is an alternative approach that can dramatically reduce cases and quickly restore economic and social activity.
Read the full article at: www.sciencedirect.com
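The arithmetic behind the mass-testing argument can be sketched with a back-of-the-envelope branching calculation (our illustration, not the authors' model): if regular screening catches and isolates a fraction of infectious people before they transmit, the effective reproduction number scales down accordingly, and a growing epidemic can be turned into a shrinking one. The parameter values below are assumptions for illustration only.

```python
def cases_over_generations(r0, detect_fraction, generations, initial=100):
    """Cases per generation when a fraction of infections is detected and
    isolated before transmitting, so r_eff = r0 * (1 - detect_fraction)."""
    r_eff = r0 * (1 - detect_fraction)
    cases, series = float(initial), []
    for _ in range(generations):
        series.append(cases)
        cases *= r_eff
    return series

no_testing = cases_over_generations(1.3, 0.0, 8)    # r_eff = 1.3: growth
mass_testing = cases_over_generations(1.3, 0.4, 8)  # r_eff = 0.78: decline
```

With 40% of infections caught before onward transmission, the same epidemic that grows sixfold over eight generations instead shrinks to a fraction of its starting size.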
In high-income countries, the COVID-19 pandemic fostered the generation of surveillance data at spatial and temporal resolution unseen before, providing comprehensive and accurate estimates of cases, detection capability, hospitalizations and deaths. At the same time, data describing behavioral response, mobility, mixing and compliance with public health measures have also become available with a similar level of detail. Such an exhaustive picture of the unfurling of a pandemic was a first in human history, made possible because we live in the digital age. It does not imply that epidemiological surveillance will remain this way in the future. As COVID-19 becomes less virulent with vaccination and acquired immunity, political pressure is shifting away from comprehensive detection of cases, and individual willingness to get tested may also be declining. At the same time, corporate commitment to make proprietary data on human behavior available to scientific research (e.g., mobile phone data) is waning. This underpins the main scientific goal of this project: can we use the experience of “wartime” COVID-19 surveillance during the years 2020–2022 to improve epidemic understanding in the future “peacetime” period? Typical data available for surveillance in peacetime are scarcer, for example syndromic surveillance for influenza and other respiratory viruses as reported in networks of general practitioners (GPs), with limited virological confirmation. Other data sources, including participatory surveillance and drug sales, may complement such reports, but are less specific. Importantly, during the first two years of COVID-19, the aforementioned high-resolution data and the scarcer traditional data sources were observed together. We wish to exploit this overlap to build statistical and mathematical models that will extract more and better information from peacetime surveillance data.
Specifically, we aim to generate estimates of incidence, severe cases, and the reproductive number that are better than those previously available in terms of spatial resolution, temporal resolution, and predictive power (the ability to make short-term forecasts and mid-term projections of epidemic activity). We will use AI/ML techniques to build models that transfer knowledge, for example from the dynamics of COVID-19 to that of influenza, from drug sales data to influenza, or from mobility to infectious spread, in order to improve the accuracy of influenza incidence estimates and short-term predictions. The impact of this project will thus be twofold. First, we will improve the knowledge and predictability of seasonal epidemic waves of airborne, directly transmitted pathogens. Second, we will provide policymakers with new tools to inform the public health response to seasonal acute respiratory illness.
More at: soundai.sorbonne-universite.fr
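The core "wartime-to-peacetime" transfer idea can be caricatured in a few lines (our sketch, not the project's methodology): during the overlap period both high-resolution incidence and a scarce, noisy proxy (GP syndromic reports) are observed together, so one can fit the mapping between them and later invert it when only the proxy remains. All numbers below are synthetic.

```python
import math, random

random.seed(3)
# "Wartime": high-resolution incidence and GP reports observed together.
incidence = [1000 + 800 * math.sin(t / 8.0) for t in range(80)]
gp_reports = [0.3 * y + random.gauss(0, 20) for y in incidence]  # noisy proxy

# Fit gp ≈ a * incidence by least squares (closed form for one coefficient).
a = (sum(g * y for g, y in zip(gp_reports, incidence))
     / sum(y * y for y in incidence))

# "Peacetime": only a GP report is available; invert the fitted mapping.
peacetime_gp = 0.3 * 1500          # proxy generated by an unseen true incidence of 1500
estimate = peacetime_gp / a        # reconstructed incidence
```

Real models would of course be far richer (spatial structure, multiple data streams, mechanistic dynamics), but the principle of learning the proxy-to-incidence mapping during the overlap is the same.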
What is the nature of intelligence in social insect societies, adaptive matter, groups of cells like brains, sports teams, and AI, and how does it arise in these seemingly different kinds of collectives?
The Symposium & Short Course will search for unifying principles in collective intelligence by tackling its foundations, and explore radical ideas for harnessing collective potential. The event will begin with a broad discussion of first-principles approaches from the physical and natural sciences for deriving group performance from microscopic, individual-level behavior and interactions. Participants will debate the most promising measures of intelligence across systems and consider the dynamics of collective intelligence in changing environments. Finally, we will explore radical ideas for harnessing collective intelligence in human and hybrid systems and invite scholars, artists, writers, musicians, actors, directors, dancers, and inventors, in addition to scientists, to participate in this discussion.
APPLY by February 1st, 2023 for priority review.
More at: www.santafe.edu
Lucas Böttcher, Mason A. Porter
In many scientific applications, it is common to use binary (i.e., unweighted) edges in the study of networks to examine collections of entities that are either adjacent or not adjacent. Researchers have generalized such binary networks to incorporate edge weights, which allow one to encode node–node interactions with heterogeneous intensities or frequencies (e.g., in transportation networks, supply chains, and social networks). Most such studies have considered real-valued weights, despite the fact that networks with complex weights arise in fields as diverse as quantum information, quantum chemistry, electrodynamics, rheology, and machine learning. Many of the standard approaches from network science that originated in the study of classical systems and are based on real-valued edge weights cannot be applied directly to networks with complex edge weights. In this paper, we examine how standard network-analysis methods fail to capture structural features of networks with complex weights. We then generalize several network measures to the complex domain and show that random-walk centralities provide a useful tool to examine node importances in networks with complex weights.
Read the full article at: arxiv.org
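A small example shows the failure mode the abstract describes (our construction, not necessarily the paper's exact measures): with complex edge weights, the naive node strength (row sum) can vanish by phase cancellation even for a well-connected node, whereas a random walk built from the weight magnitudes still ranks that node sensibly.

```python
import numpy as np

A = np.array([[0, 1 + 1j, 0],
              [1 + 1j, 0, -1 - 1j],
              [0, -1 - 1j, 0]])     # node 1's two edges have opposite phases

naive_strength = A.sum(axis=1)      # complex row sums: node 1 cancels to zero

M = np.abs(A)                       # magnitudes define the walk
P = M / M.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
P_lazy = 0.5 * (np.eye(3) + P)      # lazy walk: same stationary distribution,
                                    # avoids period-2 oscillation on this path graph
pi = np.full(3, 1 / 3)
for _ in range(200):                # power iteration to the stationary state
    pi = pi @ P_lazy
# pi is a random-walk centrality: node 1 comes out most central, matching
# its connectivity, even though its naive complex strength is zero.
```

This is the kind of structural feature, invisible to measures that sum complex weights directly, that motivates generalizing network measures to the complex domain.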
Marco Pangallo, Alberto Aleta, R. Maria del Rio Chanona, Anton Pichler, David Martín-Corral, Matteo Chinazzi, François Lafond, Marco Ajelli, Esteban Moro, Yamir Moreno, Alessandro Vespignani, J. Doyne Farmer
The potential tradeoff between health outcomes and economic impact has been a major challenge in the policy making process during the COVID-19 pandemic. Epidemic-economic models designed to address this issue are either too aggregate to consider heterogeneous outcomes across socio-economic groups, or, when sufficiently fine-grained, not well grounded by empirical data. To fill this gap, we introduce a data-driven, granular, agent-based model that simulates epidemic and economic outcomes across industries, occupations, and income levels with geographic realism. The key mechanism coupling the epidemic and economic modules is the reduction in consumption demand due to fear of infection. We calibrate the model to the first wave of COVID-19 in the New York metropolitan area, showing that it reproduces key epidemic and economic statistics, and then examine counterfactual scenarios. We find that: (a) both high fear of infection and strict restrictions similarly harm the economy but reduce infections; (b) low-income workers bear the brunt of both the economic and epidemic harm; (c) closing non-customer-facing industries such as manufacturing and construction only marginally reduces the death toll while considerably increasing unemployment; and (d) delaying the start of protective measures does little to help the economy and worsens epidemic outcomes in all scenarios. We anticipate that our model will help design effective and equitable non-pharmaceutical interventions that minimize disruptions in the face of a novel pandemic.
Read the full article at: arxiv.org
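The coupling mechanism named in the abstract, consumption demand falling with fear of infection, can be caricatured with a toy SIR model in which demand shrinks as prevalence rises (all parameter values here are illustrative assumptions, not the paper's calibrated ones).

```python
def epi_econ(beta=0.3, gamma=0.1, fear=20.0, days=200):
    """Discrete-time SIR with consumption demand = 1 / (1 + fear * prevalence)."""
    s, i, r = 0.999, 0.001, 0.0
    demand_path = []
    for _ in range(days):
        demand_path.append(1.0 / (1.0 + fear * i))  # fear suppresses demand
        new_inf = beta * s * i
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
    return demand_path

path = epi_econ()
trough = min(path)   # deepest consumption contraction, near the epidemic peak
```

Even this caricature reproduces the qualitative point of finding (a): demand collapses during the wave whether the suppression comes from voluntary fear or imposed restrictions, and recovers once prevalence falls.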
Alvaro Diaz-Ruelas
Chaos 32, 123136 (2022)
The incorporation of stochastic ingredients in models describing phenomena across all disciplines is now standard in scientific practice. White noise is one of the most important such stochastic ingredients. Although tools for identifying white and other types of noise exist, there is a permanent demand for reliable and robust statistical methods for analyzing data in order to distinguish noise and filter it from signals in experiments, or, in hypothesis tests, to assess the plausibility that the outcome of an experiment is the result of randomness rather than a significant, controllable effect. Due to its ubiquity in experiments and its mathematical simplicity, white noise is very often the most convenient stochastic component for adding realism to a dynamic model, commonly regarded as the noise polluting observations. It can be continuous or discrete both in time and in distribution, so it can be applied to many scenarios. It is a stationary, independent and identically distributed process, all relatively simple properties for a stochastic process. Here, we present a combinatorial perspective for studying white noise inspired by the concept of ordinal patterns.
Read the full article at: aip.scitation.org
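The ordinal-pattern idea is simple to demonstrate (a minimal sketch of the general technique, not the paper's specific statistics): slide a window of length d over a time series and record the permutation that sorts each window. For white noise all d! patterns are asymptotically equally likely, so the permutation entropy approaches its maximum log(d!), whereas a deterministic trend collapses onto a single pattern.

```python
import math, random
from collections import Counter

def ordinal_patterns(series, d=3):
    """Count order patterns of embedding dimension d."""
    counts = Counter()
    for k in range(len(series) - d + 1):
        window = series[k:k + d]
        pattern = tuple(sorted(range(d), key=window.__getitem__))
        counts[pattern] += 1
    return counts

def permutation_entropy(series, d=3):
    counts = ordinal_patterns(series, d)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(20000)]
h = permutation_entropy(noise)                      # close to log(3!) = log 6
monotone = permutation_entropy(list(range(1000)))   # single pattern: entropy 0
```

Deviations of the observed pattern frequencies from uniformity are exactly the kind of combinatorial signature one can test white noise against.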
Yang Tian, Zeren Tan, Hedong Hou, Guoqi Li, Aohua Cheng, Yike Qiu, Kangyu Weng, Chun Chen, Pei Sun
Network Neuroscience (2022) 6 (4): 1148–1185.
The brain criticality hypothesis is one of the most intensely studied and controversial topics in neuroscience and biophysics. This research develops a unified framework that reformulates the physics theories of four basic types of brain criticality, ordinary criticality (OC), quasi-criticality (qC), self-organized criticality (SOC), and self-organized quasi-criticality (SOqC), into more accessible and neuroscience-related forms. We also present comprehensive explanations of the statistical techniques used to validate the brain criticality hypothesis, summarize their error-prone details, and suggest possible solutions. This framework may help resolve potential controversies in studying the brain criticality hypothesis, especially those arising from misconceptions about the theoretical foundations of brain criticality.
Read the full article at: direct.mit.edu
Ivan Shpurov and Tom Froese
Entropy 2022, 24(12), 1840
Social insects such as honey bees exhibit complex behavioral patterns, and their distributed behavioral coordination enables decision-making at the colony level. It has, therefore, been proposed that a high-level description of their collective behavior might share commonalities with the dynamics of neural processes in brains. Here, we investigated this proposal by focusing on the possibility that brains are poised at the edge of a critical phase transition and that such a state enables increased computational power and adaptability. We applied mathematical tools developed in computational neuroscience to a dataset of bee movement trajectories that were recorded within the hive during the course of many days. We found that certain characteristics of the activity of the bee hive system are consistent with the Ising model when it operates at a critical temperature, and that the system’s behavioral dynamics share features with the human brain in the resting state.
Read the full article at: www.mdpi.com
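The reference point invoked here, the Ising model near its critical temperature Tc = 2/ln(1 + √2) ≈ 2.269, is easy to simulate with the Metropolis algorithm; the sketch below (an illustrative toy, not the authors' analysis of the bee data) contrasts the ordered and disordered phases between which the critical regime sits.

```python
import math, random

def ising_magnetization(temp, size=16, sweeps=400, seed=1):
    """Mean |magnetization| of a 2D Ising model (J = 1, periodic boundaries)
    after Metropolis equilibration at temperature temp."""
    rng = random.Random(seed)
    spins = [[1] * size for _ in range(size)]
    for _ in range(sweeps):
        for _ in range(size * size):
            i, j = rng.randrange(size), rng.randrange(size)
            nb = (spins[(i - 1) % size][j] + spins[(i + 1) % size][j]
                  + spins[i][(j - 1) % size] + spins[i][(j + 1) % size])
            dE = 2 * spins[i][j] * nb          # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / temp):
                spins[i][j] *= -1
    m = sum(sum(row) for row in spins) / size ** 2
    return abs(m)

cold = ising_magnetization(1.0)   # ordered phase: |m| near 1
hot = ising_magnetization(5.0)    # disordered phase: |m| near 0
# Near Tc ≈ 2.269, between these regimes, fluctuations become correlated
# across the whole lattice, the signature the paper looks for in hive activity.
```

Comparing empirical statistics (e.g., correlation lengths and fluctuation distributions) against such simulations at, above, and below Tc is the standard way to argue a system is poised near criticality.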