The 2024 Colloquium series has now finished; we will be restarting in February 2025 after a winter holiday. Thank you to all the speakers and participants throughout the year, and we wish you all a peaceful break.
The Topos Institute Colloquium is an expository virtual talk series on topics relevant to the Topos mission and community. We usually meet over Zoom on Thursdays at 17:00 UTC, though this may vary depending on the speaker's time zone. Talks are recorded and remain available on our YouTube channel.
If you wish to be subscribed to the mailing list, then simply send an email to seminars+subscribe@topos.institute. For any queries about the colloquium, contact tim+colloquium@topos.institute.
The colloquium centres around the following four themes, with many talks fitting multiple themes.
Topos aims to shape technology for public benefit via creating and deploying a new mathematical systems science. What does this mean, and how do we do it? How does technology shape people’s lives today? What are the risks that future technologies may bring? What is the role of mathematicians and computer scientists in shaping how their work has been deployed? What lessons can we learn from the past and present about successes and failures? This theme aims to spark discussion among mathematicians and computer scientists about these questions, and bring them into contact with experts on these subjects.
Thinking clearly about today’s most pressing scientific, technical, and societal questions requires finding the right abstractions. Category theory can help with this by offering design principles for structuring how we account for phenomena in a specific domain, as well as how we translate problems and solutions between different domains. This theme aims to highlight recent developments in applied category theory, in domains such as computation, neuroscience, physics, artificial intelligence, game theory, and robotics.
As in many parts of pure mathematics, results initially seen as purely theoretical may not be ripe for application until much later. At Topos we foster the entire pipeline, from the creation of elegant theory to the development of its effective application. The goal of this theme is to foster the pure side of this pipeline: to take in the most beautiful results in category theory, logic, type theory, and related fields, as well as to scout for not-yet-categorical work that appears ripe for "categorification".
Beyond its intrinsic interest, the purpose of applied mathematics is to provide a foundation for new capabilities and technologies that benefit the public. In this theme, we highlight work, at the intersection of research and engineering, that transfers ideas from applied category theory and other parts of mathematics into viable technologies, with an emphasis on software systems and tools. Topics may include scientific modeling, functional programming, differential programming, probabilistic programming, quantum computing, formal verification, and software and systems engineering.
To see the talks from previous years, use the links to the archives at the top of the page.
Robert Paré (11th of January)
As a tool for studying the structure of endofunctors F of Set, we introduce the difference operator △:
△[F](X) = F(X + 1) \ F(X).
This is analogous to the classical difference operator for real valued functions, a discrete form of the derivative.
The \ above is set difference and can't be expected to be functorial, but it is for a large class of functors, the taut functors of Manes, which include polynomial functors and many more.
We obtain combinatorial versions of classical identities, often "improved". Many examples will be given.
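To make the analogy concrete, here is a worked instance (our own illustration, not taken from the abstract) for the polynomial functor F(X) = X²:

```latex
% For $F(X) = X^{2}$, write $X + 1$ for $X \sqcup \{\ast\}$. Then
% $F(X+1) \setminus F(X)$ consists of the pairs using $\ast$ at least once:
\[
  \triangle[F](X) \;=\; (X+1)^{2} \setminus X^{2}
  \;\cong\; X + X + 1 \;=\; 2X + 1,
\]
% mirroring the classical finite difference $(x+1)^{2} - x^{2} = 2x + 1$.
```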
The talk should be accessible to everyone. The only prerequisite is some very basic category theory.
John Cartmell (18th of January)
I will discuss significant aspects of a theory of data and what may be achieved by representing data specifications as sketches of Range Categories with additional structure. I will discuss the distinction between relational and non-relational physical data specifications, and contrast physical and logical data specifications. I will discuss goodness criteria for such specifications and define some specific criteria which generalise the classic relational goodness criteria, i.e. the so-called normal forms of Codd, Fagin, Ling and Goh, and others.
My goal for a fully elaborated Mathematical Theory of Data is to effect a change in what is considered best practice for the way in which data is specified and programmed, so as to shift best practice from the level at which data is physically represented and communicated to the more abstract level of its logical structure.
Juan Pablo Vigneaux (25th of January)
This talk will discuss the cohomological aspects of information functions within the framework of information cohomology (first introduced by Baudot and Bennequin in 2015). Several known functionals can be identified as cohomology classes in this framework, including the Shannon entropy of discrete probability measures and the differential entropy and underlying dimension of continuous measures. I'll try to provide an accessible overview of the foundations of the theory, which should require only a basic familiarity with category theory and homological algebra, and survey the main known results. Finally, I'll discuss some perspectives and open problems: firstly in connection with Rényi's information dimension and other (possibly geometric) invariants of laws taking values on manifolds, and secondly with notions of entropy for categories (akin to Leinster's diversity of metric spaces).
Susan Niefield (2nd of February)
In his 1973 paper (TAC Reprints, 2002), Lawvere observed that a metric space Y is a category enriched in the extended reals, and showed that Y is Cauchy complete if and only if every bimodule (i.e., profunctor) with codomain Y has a right adjoint. More recently, Paré (2021) considered adjoints and Cauchy completeness in double categories, and showed that an (S,R)-bimodule M has a right adjoint in the double category of commutative rings if and only if it is finitely generated and projective as an S-module. It is well known that the latter property characterizes the existence of a left adjoint to tensoring with M on the category of S-modules, and this was generalized to rigs and quantales in a 2017 paper by Wood and the speaker.
This talk consists of two parts. First, after recalling the relevant definitions, we present examples of Cauchy complete objects in some "familiar" double categories. Second, we incorporate the two above-mentioned projectivity results into a version of the 2017 theorem with Wood, which we then apply to (not-necessarily-commutative) rings, rigs, and quantales.
André Joyal (15th of February)
Whitman's theory of free lattices can be extended to free lattices enriched over a quantale, to free bicomplete enriched categories, and even to free bicomplete enriched ∞-categories. It has applications to the semantics of Linear Logic (joint work with Hongde Hu).
Joachim Kock (22nd of February)
In their simplest form, polynomial functors are endofunctors of the category of sets built from sums and products. At first they can be considered a categorification of the notion of polynomial function, but it has turned out that the theory of polynomial functors is more generally an efficient toolbox for dealing with induction, nesting, and substitution. The talk will highlight some of these aspects in combinatorics, logic, and homotopy theory.
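For readers who like code, here is a minimal Haskell sketch (our own, with illustrative names) of how a polynomial functor built from sums and products generates an inductive type and its induction principle:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- P(X) = 1 + X^2: a polynomial functor built from sums and products.
data TreeF x = Leaf | Node x x
  deriving Functor

-- Taking the least fixed point of a polynomial functor yields an
-- inductive datatype; here, binary trees.
newtype Fix f = In { out :: f (Fix f) }

type Tree = Fix TreeF

-- Structural recursion (the fold) comes from the functor structure.
cata :: Functor f => (f a -> a) -> Fix f -> a
cata alg = alg . fmap (cata alg) . out

-- Example: counting leaves by folding over the tree.
leaves :: Tree -> Int
leaves = cata $ \t -> case t of
  Leaf     -> 1
  Node l r -> l + r
```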
Steven Clontz (28th of March)
The National Science Foundation defines "cyberinfrastructure" as "the hardware, software, networks, data and people that underpin today's advanced computing technology", particularly technologies that advance scientific discovery. In particular, this infrastructure is incomplete without its "people", leading some to prefer the terminology "sociotechnical infrastructure" to emphasize the importance of how these technologies connect human researchers, and how human researchers in turn use and develop these technologies in order to create new knowledge. In mathematics research, even theoretical mathematics, we use many technologies, and engage with many different communities, but there is little scholarship on the ad hoc research infrastructure itself that we implicitly rely on from day to day. This talk will provide an overview of some of the work I've done as part of my spring 2024 sabbatical dedicated to the research and development of improved sociotechnical infrastructure for mathematics research.
Po-Shen Loh (4th of April)
One of the central challenges of beyond-standard-curriculum instruction (such as "gifted" education) is how to achieve equitably-distributed scale. Making matters worse, generative AI such as ChatGPT is increasingly adept at solving standard curricular tasks, so it is urgent to scalably deliver teaching that goes beyond current standards. Fortunately, there is an area close to math which devises solutions in which problems solve themselves even under self-serving human behavior: Game Theory.
The speaker will describe his recent work, which uses Game Theory to create a novel alignment of incentives, which concurrently solves pain points in disparate sectors. At the heart of the innovation is a new, mutually-beneficial cooperation between high school math stars and professionally trained actors and comedians. This creates a highly scalable community of extraordinary coaches with sufficient capacity to teach large numbers of middle schoolers seeking to learn critical thinking and creative analytical problem solving (https://live.poshenloh.com). At the same time, it creates a new pathway for high school math stars to significantly strengthen their emotional intelligence. The whole program is conducted virtually, so it reaches through geographical barriers. The speaker will also share his experience extending this work to build talent development pipelines in underprivileged communities, identifying and supporting highly motivated middle school students who otherwise did not have access to coaching.
Edward Lee (18th of April)
Mathematical models can yield certainty, as can probabilistic models where the probabilities degenerate. The field of formal methods emphasizes developing such certainty about engineering designs. In safety critical systems, such certainty is highly valued and, in some cases, even required by regulatory bodies. But achieving reasonable performance for sufficiently complex environments appears to require the use of AI technologies, which resist such certainty. This talk suggests that certainty and intelligence may be fundamentally incompatible.
Gioele Zardini (25th of April)
When designing complex systems, we need to consider multiple trade-offs at various abstraction levels and scales, and choices of single components need to be studied jointly. For instance, the design of future mobility solutions (e.g., autonomous vehicles, micromobility) and the design of the mobility systems they enable are closely coupled. Indeed, knowledge about the intended service of novel mobility solutions would impact their design and deployment process, while insights about their technological development could significantly affect transportation management policies. Optimally co-designing sociotechnical systems is a complex task for at least two reasons. On one hand, the co-design of interconnected systems (e.g., large networks of cyber-physical systems) involves the simultaneous choice of components arising from heterogeneous natures (e.g., hardware vs. software parts) and fields, while satisfying systemic constraints and accounting for multiple objectives. On the other hand, components are connected via collaborative and conflicting interactions between different stakeholders (e.g., within an intermodal mobility system). In this talk, I will present a framework to co-design complex systems, leveraging a monotone theory of co-design and tools from applied category theory. The framework will be instantiated in the task of designing future mobility systems, all the way from the policies that a city can design, to the autonomy of vehicles as part of an autonomous mobility-on-demand service. Through various case studies, I will show how the proposed approaches allow one to efficiently answer heterogeneous questions, unifying different modeling techniques and promoting interdisciplinarity, modularity, and compositionality. I will then discuss open challenges for compositional systems design optimization, and present my agenda to tackle them.
Filippo Bonchi (2nd of May)
We introduce the calculus of neo-Peircean relations, a string diagrammatic extension of the calculus of binary relations that has the same expressivity as first order logic and comes with a complete axiomatisation. The axioms are obtained by combining two well known categorical structures: cartesian and linear bicategories.
Eugenio Moggi (9th of May)
In 1998 Manes introduced the notion of collection monad on the category of sets as a suitable semantics for collection types. The canonical example of collection monad is the finite powerset monad.
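For concreteness, here is a small Haskell sketch (our encoding, not Manes' definition) of the finite powerset monad, with finite sets represented as ordered, duplicate-free lists:

```haskell
import Data.List (nub, sort)

-- Finite sets as ordered, duplicate-free lists (an assumption of
-- this sketch, so that equal sets have equal representations).
newtype FinSet a = FinSet { elems :: [a] }
  deriving Show

fromList :: Ord a => [a] -> FinSet a
fromList = FinSet . nub . sort

-- Monad unit: form a singleton set.
unit :: a -> FinSet a
unit x = FinSet [x]

-- Monad bind: apply a set-valued function and take the union.
bind :: Ord b => FinSet a -> (a -> FinSet b) -> FinSet b
bind (FinSet xs) f = fromList (concatMap (elems . f) xs)
```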
In order to account for the algorithmic aspects, the category of sets should be replaced with other categories, whose arrows are maps computable by "low complexity" algorithms.
We extend Manes' definition of collection monad to models for weak versions of Algebraic Set Theory (AST). AST was proposed by Joyal and Moerdijk in the 1990s as a category-theoretic counterpart of Bernays' set theory based on classes.
We give a systematic way to construct such models, which include categories whose arrows are "low complexity" functions between countable sets.
Elena Di Lavore (16th of May)
Effectful streams are a coinductive semantic universe for effectful dataflow programming and traces. As an example, we formalise the stream cipher cryptographic protocol. In monoidal categories with conditionals and ranges, effectful streams particularize to families of morphisms satisfying a causality condition. Effectful streams allow us to develop notions of trace and bisimulation for effectful Mealy machines; bisimulation implies effectful trace equivalence.
This is recent joint work with Filippo Bonchi and Mario Román.
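As orientation, here is a minimal Haskell rendering (ours, not the paper's definition) of an effectful Mealy machine and its trace semantics:

```haskell
-- An effectful Mealy machine: each input a triggers an effect in the
-- monad m and yields an output b together with the next machine.
newtype Mealy m a b = Mealy { step :: a -> m (b, Mealy m a b) }

-- Running a machine on a finite input trace collects its outputs;
-- trace equivalence compares machines by such observable runs.
run :: Monad m => Mealy m a b -> [a] -> m [b]
run _ []       = return []
run m (a : as) = do
  (b, m') <- step m a
  bs      <- run m' as
  return (b : bs)
```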
Nicola Gambino (23rd of May)
The aim of this talk is to present bicategorical counterparts of the notions of a linear exponential comonad, as considered in the study of linear logic, and of a codereliction transformation, introduced in the study of differential linear logic via differential categories. As an application, the differential calculus of Joyal's analytic functors will be extended to analytic functors between presheaf categories, in a way that is analogous to how ordinary calculus extends from a single variable to many variables. This is based on joint work with Marcelo Fiore and Martin Hyland (https://arxiv.org/abs/2405.05774).
Steve Vickers (30th of May)
Point-free topology is known in various guises (locales, formal topology), but can be boiled down to a procedure of defining the points of a space not ("point-set") as elements of a set, but as models of a logical theory. This is for a constrained "geometric" logic, so that the opens of the space correspond to propositional formulae derived from the theory: thus the theory defines both the points and the topology. Then continuity of maps just means that they are constructed in accordance with the constraints of the logic.
Why bother to do topology that way? After all, the logic is even more constrained than constructive reasoning, and we still don’t know how far it reaches.
The first reason is that it quite painlessly extends to toposes, viewed as generalized spaces. This uses the machinery of classifying toposes, but only in an unobtrusive way [1]. Many proper classes can then be viewed as point-free spaces - just write down a geometric theory whose models are the elements of the class.
The second reason follows from the first but can then be applied back to ungeneralized spaces in a very natural treatment of bundles. Theory presentations can themselves be described as the models of a geometric theory, and this allows us to view bundles as continuously mapping base points to spaces (the fibres). For physics in particular, this invites exploration of how much can be done using geometric methods. See eg [2].
Meanwhile, there is the question of how much ordinary mathematics, and in particular real analysis, can be done in this style. I present as a case study the Fundamental Theorem of Calculus [3]. This illustrates some typically geometric features of the reasoning, such as attention paid to one-sided reals and the use of hyperspaces and their analogues, and some exploitation of the geometric fact that everything is continuous, as well as a cute new trick using uniform probability measures.
[1] Steven Vickers "Topical categories of domains", Mathematical Structures in Computer Science 9 (1999)
[2] Bas Spitters, Steven Vickers and Sander Wolters "Gelfand spectra in Grothendieck toposes using geometric mathematics", Electronic Proceedings in Theoretical Computer Science 158 (2014)
[3] Steven Vickers "The Fundamental Theorem of Calculus point-free, with applications to exponentials and logarithms", arXiv:2312.05228
Alex Simpson (6th of June)
I shall give a brief guided tour of three toposes that have arisen in a research programme to model aspects of probability and randomness from a topos perspective. The first topos of "probability sheaves" supports a synthetic style of probabilistic reasoning about random variables. The second "random topos" makes sense of the notion of "random element" and models a world in which all sets are measurable. The third topos of "random probability sheaves" combines the previous two and provides a home for a more radical style of "synthetic probability theory" expunged of all concerns about sigma-algebras, measurability and the like.
Bartosz Milewski (13th of June)
Locally graded categories and parametric optics provide a compositional model of neural networks. I will show how to generalize this approach to preoptics. I'll introduce a parametric profunctor representation of preoptics and use it to implement a perceptron in Haskell.
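For orientation, here is a minimal Haskell sketch of the parametric-lens shape underlying parametric optics (our illustration; the talk's preoptics and profunctor representation generalise this):

```haskell
-- A parametric lens: a forward pass from parameter and input to
-- output, and a backward pass turning feedback on the output into
-- updates on the parameter and input.
data ParaLens p s t = ParaLens
  { fwd :: (p, s) -> t
  , bwd :: (p, s, t) -> (p, s)
  }

-- Sequential composition: parameters pair up, and feedback flows
-- backwards through both layers (the shape of backpropagation).
compose :: ParaLens p s t -> ParaLens q t u -> ParaLens (p, q) s u
compose f g = ParaLens
  { fwd = \((p, q), s) -> fwd g (q, fwd f (p, s))
  , bwd = \((p, q), s, u) ->
      let t        = fwd f (p, s)
          (q', t') = bwd g (q, t, u)
          (p', s') = bwd f (p, s, t')
      in  ((p', q'), s')
  }
```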
Mason Porter (27th of June)
I will discuss topological data analysis (TDA), which uses ideas from topology to quantify the "shape" of data. I will focus in particular on persistent homology (PH), which one can use to find "holes" of different dimensions in data sets. I will briefly introduce these ideas and then discuss a series of examples of TDA of spatial systems. The examples that I'll discuss include voting data, the locations of polling sites, and the webs of spiders under the influence of various drugs.
Fabio Gadducci (11th of July)
It is now folklore that cartesian categories can be viewed as symmetric monoidal ones equipped with two natural transformations, modelling diagonals and projections. In this talk we explore the taxonomy obtained by relaxing the naturality requirement, from gs-monoidal/cd-categories to restriction and Markov ones. We show how these possibly order-enriched categories are related by suitable commutative monads, and we describe the shape of the arrows of the free categories generated by an algebraic signature.
Kathrin Stark (1st of August)
For performance and verification in machine learning, new methods have recently been proposed that optimise learning systems to satisfy formally expressed logical properties. Among these methods, differentiable logics (DLs) are used to translate propositional or first-order formulae into loss functions deployed for optimisation in machine learning. At the same time, recent attempts to give programming language support for verification of neural networks showed that DLs can be used to compile verification properties to machine-learning backends. This situation is calling for stronger guarantees about the soundness of such compilers, the soundness and compositionality of DLs, and the differentiability and performance of the resulting loss functions. In this talk, I report on recent work to 1) give uniform semantics to existing DLs and 2) formalise them using the Mathematical Components library in the Coq proof assistant. This work is meant as a stepping stone for the development of programming language support for verification of machine learning.
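To fix intuitions, here is a small Haskell sketch (our own illustration, not one of the specific DLs from the talk) of how a differentiable logic can map propositional formulae to values usable as a loss, here via product fuzzy logic:

```haskell
-- Formulae over atoms that already carry a truth value in [0, 1]
-- (e.g. a normalised margin produced by a network).
data Form = Atom Double | And Form Form | Or Form Form | Not Form

-- Product-logic interpretation: smooth in the atoms' values, so the
-- result can be differentiated and optimised as a loss term.
eval :: Form -> Double
eval (Atom v)  = v
eval (And a b) = eval a * eval b
eval (Or  a b) = let x = eval a
                     y = eval b
                 in  x + y - x * y
eval (Not a)   = 1 - eval a

-- A loss that is zero exactly when the property is fully satisfied.
loss :: Form -> Double
loss f = 1 - eval f
```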
Arthur J Parzygnat (8th of August)
Bayes' rule has recently been given a categorical definition in terms of string diagrams due to Cho and Jacobs. This definition of Bayesian inversion, however, is not robust enough for categories that include reasoning about quantum systems due to the no-cloning theorem. In this talk, I will explain how semi-cartesian categories (which have less structure than Markov categories) provide a suitable framework to define Bayesian inversion categorically. In particular, I will provide axioms for such an abstract form of Bayesian inversion. It remains an open question whether these axioms characterize Bayesian inversion for quantum systems.
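For readers who prefer equations to string diagrams, one standard discrete instance of the Cho–Jacobs definition (stated here as background, not as the talk's axioms) reads:

```latex
% Given a channel $f \colon X \to Y$ and a prior state $\sigma$ on $X$,
% a Bayesian inverse $f^{\dagger}_{\sigma} \colon Y \to X$ satisfies
\[
  \sigma(x)\, f(y \mid x)
  \;=\;
  (f_{*}\sigma)(y)\, f^{\dagger}_{\sigma}(x \mid y)
  \qquad \text{for all } x \in X,\ y \in Y,
\]
% where $f_{*}\sigma$ is the pushforward (marginal) of $\sigma$ along $f$.
```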
Rodrigo Ochigame (15th of August)
This talk will examine some aspects of the ongoing automation of mathematical research from an anthropological perspective. It will first address the diversity of practical standards of proof (across fields, places, and times), the longstanding tension between "understanding" and mechanical "checking," and the persistent need for social adjudication even for computer-encoded ("formal") proofs. It will then discuss how automation might affect divisions of labor, economies of credit, institutional hierarchies, and structures of incentive and reward in professional research. The talk will also draw comparisons to current debates about automation and machine learning in other fields of science, such as experimental physics.
Will Crichton (22nd of August)
Tools for formalized mathematics (FM), such as proof assistants and model checkers, are increasingly capable of handling the real-world problems of both mathematicians and software developers. Yet, these tools are only as effective as the people who use them. The FM community clearly needs to invest in better education and better tooling. But... which curricula are actually effective for learners? What tooling will actually make users more productive? In this talk, I will lay out some preliminary ideas for how to systematically investigate these questions, i.e., develop a science of human factors for FM. My core proposal is to combine experimental psychological methods (e.g., lab studies, IDE telemetry) and cognitive theories (e.g., working memory, mental models) to study how people use FM tools. Then that understanding can be applied to make principled predictions about the efficacy of curricula, tooling, and language design.
Cyrus Omar (29th of August)
This talk will review progress on the Hazel programming environment and its underlying theoretical developments. Hazel is the first totally live typed general-purpose programming environment, meaning that it deploys error localization and recovery mechanisms, rooted in language-theoretic developments, that ensure that every editor state is syntactically well-structured and statically and dynamically meaningful. The talk will review the underlying theory and include a live demonstration of various Hazel features, including its editor, training mode, and its stepper, which is forming the basis for ongoing work on a Hazel-based theorem prover. The talk will also discuss various other ongoing and future directions of interest to the community, including our vision for a "computational commons" that operates like a planetary-scale live program.
Spencer Breiner (5th of September)
I will argue that applied category theory would benefit, both practically and strategically, from greater attention to user interface. Starting from some very basic examples, I will discuss some of the conceptual challenges in this area and explain how they map onto categorical structures like monads, comonads and polynomial functors. I will close with a discussion of the Semagrams user interface library, its goals and future prospects.
Christine Tasson (12th of September)
Synchronous languages are now a standard industry tool for critical embedded systems. Designers write high-level specifications by composing streams of values. These languages have been extended with Bayesian reasoning to program state-space models which compute a stream of distributions given a stream of observations [1].
This talk aims at describing semantics for probabilistic synchronous languages, based on a joint work with Guillaume Baudart and Louis Mandel [2]. The key idea is to interpret probabilistic expressions as a stream of un-normalized density functions, which map random-variable values to a result and a positive score. Two equivalent semantics are presented: the co-iterative semantics is executable, while the relational semantics is easy to use for proving program equivalence. The semantical framework is then applied to prove the correctness of a program transformation required to run an optimized inference algorithm.
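As a type-level sketch of that key idea (our own Haskell rendering, not the paper's definitions):

```haskell
-- An un-normalized density maps an assignment w of random-variable
-- values to a result together with a positive score.
type Density w a = w -> (a, Double)

-- A probabilistic synchronous node: at each step it consumes an
-- input and yields a density over the output and its continuation.
newtype PStream w i o = PStream (i -> Density w (o, PStream w i o))
```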
Chris Fields (19th of September)
The idea that objects have associated identity morphisms, or operators, is fundamental to mathematics, physics, and the theory of Active Inference. All of these, moreover, support the idea that identity can be maintained via multiple paths. We will explore the relationship between the idea of identity and the notions of space, time, and memory, with an emphasis on how these latter three relate to the idea of an object in both physical theory and active inference.
Seth Frey (26th of September)
The best citizens of a large-scale democracy are those who have built and broken several small ones to see how they work. By empowering people to build any kind of community together, the Internet has become a laboratory for self-governance experimentation. Groups who start online communities must overcome the challenges of recruiting finite resources around difficult common goals. Fortunately, they can draw on a growing range of support technologies, peer networks, and scholarship. With their transparency, the Internet's millions of online communities can be surveyed for insights into their design and functioning. Looking at three large platforms for small self-governing online communities, we will pose several questions of institutional processes at the population level, as drawn from the literatures on common-pool resource management and institutional analysis and design.
Amélia Liao (3rd of October)
Cubical type theory is an extension of dependent type theory designed to make the univalence principle *provable*, rather than an axiom. Then, by what is essentially a happy coincidence, it also provides a design for working with higher inductive types and with coinductive types.
However, the tradeoff for these features is a hit to usability: In practice, the user of cubical type theory is directly exposed to the complicated primitive operations that let us implement these higher features, even if they're working on set-level mathematics. Worse, any implementation of HoTT imposes additional proof obligations to stay in the realm of sets rather than escaping off into coherence purgatory.
This talk discusses the experience of using cubical type theory to build the 1Lab, in particular the automation we've been building so that the end-user of the library does not have to memorise the cubical type theory papers if all they want is to formalise traditional, low-homotopy-level mathematics.
Thomas Powell (16th of October)
It is an elementary fact from real analysis that any monotone bounded sequence of real numbers converges. It turns out that the monotone convergence theorem can be given an equivalent finitary formulation: roughly, that any sufficiently long monotone bounded sequence experiences long regions where it is metastable. This so-called "finite convergence principle" is carefully motivated and discussed by Terence Tao in a 2007 blog post ('Soft analysis, hard analysis, and the finite convergence principle'), but it was already known to proof theorists, for whom the use of logical methods both to finitize infinitary statements and to provide uniform quantitative information for the finitary versions plays a central role in the so-called proof mining program.
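For reference, one common way to state the finitary formulation (following Tao's post; the precise version in the talk may differ) is:

```latex
% A sequence $(x_n)$ is metastable when
\[
  \forall \varepsilon > 0 \;\;
  \forall F \colon \mathbb{N} \to \mathbb{N} \;\;
  \exists N \;\;
  \forall i, j \in [N, N + F(N)] \;\;
  |x_i - x_j| \le \varepsilon .
\]
% Each instance (fixed $\varepsilon$ and $F$) is a finite statement,
% unlike the infinitary convergence it collectively recovers.
```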
The goal of my talk is to discuss how methods from proof mining can be applied to the theory of stochastic processes, an area of mathematics hitherto unexplored from this perspective. I will begin by discussing the simple monotone convergence principle, and will then focus on how everything I mentioned above can be generalised to the analogous result in the stochastic setting: Doob's martingale convergence theorems. This then sets the groundwork for new applications of proof theory in stochastic optimization, and I will give a high level overview of some work in progress in this direction, including a quantitative analysis of the famous Robbins-Siegmund lemma. This is all joint work with my PhD student Morenikeji Neri.
In giving the talk I will not assume any knowledge of proof theory, just some elementary analysis and probability theory.
Jason Gross (24th of October)
The field of mechanistic interpretability – techniques for reverse-engineering model weights into human-interpretable algorithms – seeks to compress explanations of model behavior. By studying tiny transformers trained to perform algorithmic tasks, we can make rigorous the extent to which various understandings of a model permit compressing an explanation of its behavior.
In this talk, I'll discuss how we prototyped this approach in our paper, where we formally proved lower bounds on the accuracy of 151 small transformers trained on a Max-of-K task, creating 102 different computer-assisted proof strategies and assessing the length and the tightness of bound of each strategy on each of our models. Using quantitative metrics, we found that shorter proofs seem to require and provide more mechanistic understanding. Moreover, we found that more faithful mechanistic understanding leads to tighter performance bounds.
We identified compounding structureless noise as the leading obstacle to generating more compact proofs of tighter performance bounds. I plan to discuss ongoing work to address this challenge, either by relaxing the worst-case constraint that proofs enforce, or by fine-tuning partially interpreted models to align more closely with our explanations.
I’ll conclude by discussing the roadmap I see to scaling the compact proofs approach to rigorous mech interp up to frontier models.
Mario Román (31st of October)
Partial Markov categories are an algebra and syntax for Bayesian inference. They use a string diagrammatic syntax—with a formal correspondence to programs—to reason about continuous and discrete probability, decision problems (Monty Hall, Newcomb's), the compositional properties of normalization, and an abstract Bayes' theorem.
Partial Markov categories are a careful blend of Markov categories (from categorical probability theory) and cartesian restriction categories (from the algebraic theory of partial computations). We will discuss the construction, theory, and applications of partial Markov categories.
This is joint work with Elena Di Lavore. It is based on "Evidential Decision Theory via Partial Markov Categories", presented at LiCS'23 (arXiv:2301.12989).
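As a small Haskell companion to the abstract (our illustration; the talk works with string diagrams rather than code), the partiality-plus-probability blend can be glimpsed in discrete subdistributions, where normalization is a separate, partial step:

```haskell
-- Discrete subdistributions: weighted outcomes whose total mass may
-- be below 1; the missing mass models failure (partiality).
newtype SubDist a = SubDist { weights :: [(a, Double)] }

-- Kleisli-style sequencing: weights multiply along each branch.
bind :: SubDist a -> (a -> SubDist b) -> SubDist b
bind (SubDist xs) f =
  SubDist [ (b, w * v) | (a, w) <- xs, (b, v) <- weights (f a) ]

-- Normalization is partial: it fails when all mass has been lost,
-- which is exactly where compositional care is needed.
normalize :: SubDist a -> Maybe (SubDist a)
normalize (SubDist xs)
  | total <= 0 = Nothing
  | otherwise  = Just (SubDist [ (a, w / total) | (a, w) <- xs ])
  where
    total = sum (map snd xs)
```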
Carl Miller (7th of November)
Mathematical security proofs are crucial in the field of cryptography: they allow us to base the theoretical security of a particular protocol on a set of simply-stated assumptions. However, a major challenge (in both education and research) is managing the complexity of these proofs. Fully rigorous security proofs can occupy a lot of space even when the underlying ideas are relatively simple. In the subfield of quantum cryptography, which incorporates quantum states and processes, this challenge has additional complications.
Visual reasoning helps illuminate proofs in cryptography, and in some cases, visual reasoning can actually serve as a replacement for symbolic reasoning. In this talk I will discuss how picture-proofs, based on categorical quantum mechanics, can be incorporated into quantum cryptography.
Arezoo Islami (14th of November)
The Applicability Problem is the problem of explaining why mathematics is applicable to the empirical sciences. This problem is revived and reformulated by the physicist Eugene Wigner under the striking title "The Unreasonable Effectiveness of Mathematics in the Natural Sciences". In this influential work, Wigner argues that the applicability of mathematics is a miracle, "a wonderful gift which we neither understand nor deserve". Responses to this problem range from metaphysical claims about the mathematical structure of the universe to epistemic claims about the nature of human cognition, as well as formalist views that characterize mathematics as a type of game.
In my view, to find an explanation for this relationship, we must first understand the explanandum itself. More fundamental than the why-question (why is mathematics applicable in the natural sciences) is the how-question (how is mathematics applicable in the natural sciences). By examining how mathematics has been used across different eras and fields within the natural sciences, we can begin to understand the relationship between mathematics and the sciences. More importantly, this exploration allows us to address questions regarding the nature of mathematics as it is used and practiced. By distinguishing pseudo-problems from the genuine problems of applicability, we open new paths for philosophical reflections on the nature of mathematics and the sciences.
Matthew Akamatsu (5th of December)
Scientific research teams face persistent challenges in coordinating their work and synthesizing collective knowledge into models across projects and labs. Discourse graphs offer a solution by breaking scientific research into its atomic components - questions, claims, and evidence - and connecting them in a semantic graph. In our cell biology lab, we have implemented discourse graphs as a coordination layer for collective research and model building. Lab members use the graphs to identify open research questions, maintain focus on their chosen hypotheses while documenting new results, and share discrete findings with clear provenance and context. Researchers report streamlined thinking, better orientation toward their target question, and a sense of accomplishment from contributing tangible, reusable results. This collaborative approach has allowed the lab to organically evolve its scientific models through bottom-up knowledge synthesis. Our new user pilot aims to extend these benefits through accessible tooling and interoperable graphs, thereby enabling collective grassroots knowledge generation and sensemaking across research communities.
Eswaran Subrahmanian and Yves Keraron (12th of December)
In this talk, we will illustrate the role of information structures in engineering practices. For a particular case, we will instantiate an example of the required structures and their interaction in the systems engineering process. We will then identify categorical representations that could formalize the models embedded in systems engineering standards and support their composition. The talk is not mathematical, but an attempt to map the needs of systems engineering modeling to potential categorical representations that serve the purpose of engineering practices.