This is the homepage of the Bilkent Math Graduate Seminars. For the Zoom link visit researchseminars.org.
Previous reading seminars
Nov 25: $2$-categories for the working graduate student by Redi Haderi
An ordinary ($1$-)category consists of a collection of objects, morphisms between them, and composition of morphisms satisfying some rules. A $2$-category is a structure which incorporates an extra layer of data: morphisms between morphisms. Examples of morphisms between morphisms arise naturally: we have homotopies between continuous maps, conjugations between group homomorphisms, natural transformations between functors, and so on. We will briefly illustrate how this extra layer of data allows us to “weaken” usual categorical concepts. In particular we discuss equivalences between objects in a $2$-category.
Dec 2: $2$-categories for the working graduate student by Redi Haderi
Part II
Dec 9: Euler characteristics of Morita equivalent categories by Mustafa Akkaya
In this talk, the definition of the Euler characteristic of finite categories will be given and some of its properties will be discussed. We will show that Leinster’s Euler characteristic is invariant under equivalence of categories, as expected, but not under Morita equivalence.
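As a small hedged aside (not part of the talk, and the example categories are my own choice): when the matrix of hom-set cardinalities of a finite category is invertible, Leinster’s Euler characteristic equals the sum of the entries of its inverse. A minimal sketch:

```python
import numpy as np

# Hedged sketch: for a finite category whose "zeta matrix"
# Z[i][j] = number of morphisms from object i to object j is invertible,
# Leinster's Euler characteristic is the sum of the entries of Z^{-1}.

def euler_characteristic(zeta):
    return np.linalg.inv(np.array(zeta, dtype=float)).sum()

# The arrow category (two objects, one non-identity morphism):
# its classifying space is contractible, so we expect chi = 1.
arrow = [[1, 1],
         [0, 1]]

# The discrete category on two objects: chi = 2.
discrete = [[1, 0],
            [0, 1]]

print(np.isclose(euler_characteristic(arrow), 1))     # True
print(np.isclose(euler_characteristic(discrete), 2))  # True
```

For the arrow category this agrees with the topological Euler characteristic of its (contractible) classifying space, which is one reason the invariant is considered well-behaved under equivalence.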
Dec 16: Sheaf-theoretic approach to quantum contextuality by Cihan Okay
I will talk about how sheaf theory can be used to capture the notion of non-locality and contextuality in quantum mechanics following the paper https://arxiv.org/abs/1102.0264. For this I will introduce the distribution monad, the notions of empirical model and hidden-variable model. I will discuss the Bell scenario in this framework.
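As a concrete illustration in the spirit of the abstract (the encoding below is my own, not taken from the paper): the Popescu-Rohrlich box is an empirical model with no deterministic hidden-variable explanation, because no single global assignment of outcomes is consistent with the support of every measurement context.

```python
from itertools import product

# Hedged illustration: the Popescu-Rohrlich (PR) box as an empirical model.
# Measurement settings a, b in {0, 1} for Alice and Bob, outcomes
# x, y in {0, 1}; the support of context (a, b) is
# {(x, y) : x XOR y == a AND b}.

def in_support(a, b, x, y):
    return (x ^ y) == (a & b)

# A deterministic hidden-variable model would fix an outcome for every
# measurement at once: a tuple (x0, x1, y0, y1).
global_assignments = list(product([0, 1], repeat=4))

consistent = [
    g for g in global_assignments
    if all(in_support(a, b, g[a], g[2 + b]) for a in (0, 1) for b in (0, 1))
]

# Strong contextuality: no global assignment is consistent with all
# four contexts, so the model admits no hidden-variable explanation.
print(len(consistent))  # 0
```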
Dec 23: Sheaves by Pejman Parsizadeh
Part I: We will cover basics of sheaf theory: definition of the category of sheaves, the sheafification functor, and pull-back/push-forward sheaves.
Jan 13: Torsors and first cohomology set of torsors by Pejman Parsizadeh (moved to Feb 17)
Part II
Jan 20: Euler characteristics of Morita equivalent categories by Mustafa Akkaya
Part II: Quillen’s Theorem A
Jan 27: Categorical logic I: Structures in Categories by Kristof Kanalas
The model $M$ of a theory $T$ consists of an underlying set $X$ together with functions $f:X^n \to X$ and subsets $R\subseteq X^m$ which correspond to the function and relation symbols of the language, in such a way that the axioms of $T$ are valid in $M$. If we replace “set” with “object”, “function” with “arrow” and “subset” with “subobject” we get the notion of a structure in the category $\mathcal{C}$, and if this category has enough structure we can also interpret formulas (and decide their validity), hence define the models of $T$ inside $\mathcal{C}$. As a first application we can define the internal language (and theory) of a category: it turns out that “from inside every category looks like $\mathbf{Set}$”, e.g. an arrow $f:A\to B$ of $\mathcal{C}$ is a monomorphism iff the formula $f(a)=f(a') \rightarrow a=a'$ is valid in the internal theory, etc.
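To make the first sentence concrete, here is a standard small example (my choice, not necessarily the speaker’s): a structure for the algebraic theory of monoids in a category $\mathcal{C}$ with finite products, where the function symbols become arrows and the axioms become equations of arrows.

```latex
% A monoid object in a category $\mathcal{C}$ with finite products is an
% object $M$ equipped with arrows interpreting the function symbols:
\[
  m \colon M \times M \to M, \qquad e \colon 1 \to M,
\]
% subject to the monoid axioms, expressed as commuting diagrams:
\[
  m \circ (m \times \mathrm{id}_M) = m \circ (\mathrm{id}_M \times m),
  \qquad
  m \circ (e \times \mathrm{id}_M) = \mathrm{id}_M
  = m \circ (\mathrm{id}_M \times e),
\]
% where the unit laws implicitly use $1 \times M \cong M \cong M \times 1$.
```

In $\mathbf{Set}$ this recovers the ordinary notion of a monoid; in $\mathbf{Top}$ it gives a topological monoid.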
Feb 3: Introduction to Double Categories by Redi Haderi
Double categories are a way to do two-dimensional category theory. We will present these structures and some of the main examples, and observe how the theory of double categories is much richer than that of $2$-categories. In particular we present the concept of double colimit and (time permitting) equipments.
Feb 10: Defining second cohomology group of a group by using group extensions by Zilan Akbas
In this talk, we will first define group extensions and introduce split and general extensions. By defining factor sets (cocycles), we will obtain several properties which lead to the second cohomology group. Finally, we will show that there exists a bijection between the second cohomology group and the set of equivalence classes of extensions of a group.
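As a hedged miniature of the above (names and example mine): the “carry” factor set realizing $\mathbb{Z}/4$ as an extension of $\mathbb{Z}/2$ by $\mathbb{Z}/2$ satisfies the $2$-cocycle condition but is not a coboundary, reflecting the fact that this extension does not split.

```python
from itertools import product

# Hedged sketch: elements of Z/2 are 0, 1 with addition mod 2, and the
# action is trivial, so the 2-cocycle condition reads
#   f(b, c) + f(a, (b+c) % 2) == f(a, b) + f((a+b) % 2, c)   (mod 2).

def carry(a, b):
    # f(a, b) = 1 exactly when adding a and b overflows in Z/2,
    # i.e. when a = b = 1.  This is the factor set of Z/4.
    return a & b

Z2 = (0, 1)

def is_cocycle(f):
    return all(
        (f(b, c) + f(a, (b + c) % 2)) % 2 == (f(a, b) + f((a + b) % 2, c)) % 2
        for a, b, c in product(Z2, repeat=3)
    )

def is_coboundary(f):
    # f is trivial in H^2 iff f(a, b) = g(a) + g(b) - g(a+b) (mod 2)
    # for some g with g(0) = 0; brute-force the two candidates for g.
    for g1 in Z2:
        g = {0: 0, 1: g1}
        if all(f(a, b) == (g[a] + g[b] - g[(a + b) % 2]) % 2
               for a, b in product(Z2, repeat=2)):
            return True
    return False

print(is_cocycle(carry))     # True
print(is_coboundary(carry))  # False: the extension Z/4 does not split
```

The nontrivial class of this cocycle distinguishes $\mathbb{Z}/4$ from the split extension $\mathbb{Z}/2 \times \mathbb{Z}/2$.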
Feb 17: Torsors and first cohomology set of torsors by Pejman Parsizadeh
Part II
Feb 24: Čech cohomology and its relation with torsors by Pejman Parsizadeh (moved to Apr 7)
Part III
Mar 3: Operads and $E_n$-algebras by Calista Bernard
Suppose we have a space with a multiplication that is not strictly associative, so $(xy)z$ and $x(yz)$ are not equal. If we want to multiply $n$ elements together, we now have many ways to do so depending on where we choose to put our parentheses. Often we have some additional data, such as paths between $(xy)z$ and $x(yz)$ and some coherence between these paths, and we would like to keep track of this data to see how to relate the different ways of multiplying $n$ elements. Operads provide a concise way of encoding this type of data of operations and relations between them. In this talk I will define operads and give examples that determine to what extent a multiplication is associative or commutative up to homotopy. In particular, I will discuss $E_n$-algebras, which are homotopy-commutative objects, and their relationship to $n$-fold loop spaces.
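A small side computation (mine, not part of the abstract): the number of ways to fully parenthesize a product of $n$ elements is the Catalan number $C_{n-1}$, which can be counted by splitting at the outermost multiplication.

```python
from functools import lru_cache

# Hedged side computation: count full parenthesizations of a product of
# n elements by choosing where the outermost multiplication splits the
# word into a left block of k factors and a right block of n - k.

@lru_cache(maxsize=None)
def parenthesizations(n):
    if n <= 2:
        return 1          # x and (xy) each admit a single bracketing
    return sum(parenthesizations(k) * parenthesizations(n - k)
               for k in range(1, n))

print([parenthesizations(n) for n in range(1, 7)])  # [1, 1, 2, 5, 14, 42]
```

These counts are exactly the vertices of the associahedra, the polytopes whose structure the associativity operad encodes.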
Mar 10: Quantum state certification by Gautam Gopal Krishnan
Quantum state certification deals with the problem of testing whether a quantum mixed state is equal to some unknown mixed state or else is $\epsilon$-far from it. Definitions of quantum mixed states and measurement schemes will be given in this talk, and we will discuss some recent developments concerning quantum state certification. We will then illustrate how Schur-Weyl duality and the double commutant theorem in representation theory can be exploited to generalize these results.
Mar 17: Higher order Toda brackets by Azez Kharoof
This will be a preparation for the topology seminar talk on Monday.
Apr 7: An overview of Hochschild cohomology by Pablo Sanchez Ocal (5pm UTC+3)
Taking Hochschild cohomology is a way of algebraically encoding infinitesimal information about an associative algebra. In this talk we will give an unpretentious introduction to this cohomology, we will justify its importance by computing some of the lower degrees, and we will then give explicit applications that advance the understanding of quantum symmetries.
Apr 28: Various Quantum Relative Entropies and some Applications by Sarah Chehade
In Quantum Information Theory, one of the most famous inequalities is the Data Processing Inequality (DPI). The inequality states that two quantum states become harder to distinguish after they pass through a noisy quantum channel. This inequality holds for a number of distinguishability measures, the most basic one being the Umegaki relative entropy, otherwise known as the quantum relative entropy. We are interested in the case of saturation of DPI for these measures. One generalization of the quantum relative entropy is a two-parameter family called the $\alpha$-$z$ Rényi relative entropy. Recently, the set of parameters for which the $\alpha$-$z$ Rényi relative entropy satisfies DPI has been completely characterized.
In this talk, I will present a necessary and a sufficient condition for saturation of DPI for the $\alpha$-$z$ Rényi relative entropy. Both conditions are similar to the original recoverability condition for the quantum relative entropy, and they coincide when $\alpha=z$, the case of the so-called sandwiched Rényi relative entropy.
Other applications of relative entropy, such as coherence and entanglement measures will also be mentioned and discussed as a future direction of work.
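As a hedged numerical aside (the example states are mine; the formula below is the standard definition of the $\alpha$-$z$ Rényi relative entropy from the literature): for commuting states the quantity is independent of $z$ and reduces to the classical Rényi divergence of the spectra.

```python
import numpy as np

# Hedged numerical sketch.  For positive definite density matrices the
# alpha-z Renyi relative entropy is
#   D_{a,z}(rho || sigma)
#     = 1/(a-1) * log Tr[(sigma^{(1-a)/2z} rho^{a/z} sigma^{(1-a)/2z})^z].

def mpow(h, p):
    # Fractional power of a Hermitian positive definite matrix.
    w, v = np.linalg.eigh(h)
    return (v * w**p) @ v.conj().T

def alpha_z_divergence(rho, sigma, a, z):
    s = mpow(sigma, (1 - a) / (2 * z))
    inner = s @ mpow(rho, a / z) @ s
    return np.log(np.trace(mpow(inner, z)).real) / (a - 1)

# For commuting (here: diagonal) states the quantity is independent of z
# and equals the classical Renyi divergence of the eigenvalue vectors.
rho = np.diag([0.7, 0.3])
sigma = np.diag([0.4, 0.6])
a = 1.5
classical = np.log(0.7**a * 0.4**(1 - a) + 0.3**a * 0.6**(1 - a)) / (a - 1)

for z in (0.5, 1.0, a):   # z = a is the sandwiched Renyi relative entropy
    print(abs(alpha_z_divergence(rho, sigma, a, z) - classical) < 1e-8)
```

The genuinely quantum (non-commuting) regime is where the dependence on $z$ appears and where the DPI saturation question becomes subtle.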
May 3-4-5-10: Mini workshop on Quantum Computing
See the schedule of talks for more information.
Jun 23: Plethysm, Segal groupoids and operads by Alex Cebrian
Plethysm is a substitution operation in the ring of formal power series in infinitely many variables. It was introduced in the context of unlabelled enumeration (Pólya, 1937) in combinatorics and in representation theory (Littlewood, 1944).
Operads have long been a standard tool in topology, algebra and category theory, and they are becoming increasingly important also in combinatorics.
Decomposition spaces, certain simplicial spaces introduced by Gálvez, Kock, and Tonks (and independently by Dyckerhoff and Kapranov under the name 2-Segal spaces), provide a general framework for incidence coalgebras in objective combinatorics. It was shown by Kock and Weber that operads give rise to Segal groupoids, a special kind of decomposition spaces.
I will begin by reviewing some of the notions mentioned above, and then I will explain the main contributions of my thesis: a Segal groupoid whose incidence coalgebra encodes the combinatorics of plethysm, and a construction on operads which allows one to systematically obtain Segal groupoids for several variations of plethysm.
Jun 26: Quivers and 2-Calabi-Yau categories by Jie Ren
The framework of Calabi-Yau categories is appropriate for the theory of motivic Donaldson-Thomas invariants. I will give an introduction to the Calabi-Yau categories associated to quivers. If time permits I will also talk about Donaldson-Thomas invariants.
Aug 12: Quantum Mechanics From Inductive Inference: Applications to Quantum Fields in Curved Spacetime by Selman Ipek
Despite its overwhelming empirical successes, there have been many sharp criticisms of the quantum formalism, dating back to the early insights of Schrödinger, as well as Einstein, Podolsky, and Rosen (EPR). Building off of these initial efforts were the pioneering works of J. Bell, as well as S. Kochen and E. Specker (BKS), which highlighted the nonclassical features of quantum mechanics that have now been subject to stringent experimental tests. In recent years there has thus been a renaissance in the foundations of quantum theory, brought about by a surge of interest in quantum information technologies seeking to take advantage of such nonclassical resources.
Complementary to these recent developments is a line of thought, first popularized by John Wheeler’s “it from bit”, that the notion of information may itself play a key role in clarifying the structure of quantum theory. Here we discuss Entropic Dynamics (ED), one recent proposal for deriving the structure of quantum mechanics that utilizes the methods of inductive inference. Following the example set by Jaynes’ statistical mechanics, in ED we attempt to distinguish between those aspects of the quantum formalism that are physical in nature and those which follow from straightforward statistical inference. Such an approach not only sheds light on issues in the foundations of quantum theory, but it also provides a general setting in which to explore alternative “post-quantum” models of mechanics. This latter possibility is explored by explicitly constructing an ED of quantum fields interacting with a dynamical spacetime. The result is a hybrid ED model that approaches quantum field theory in one limit and classical general relativity in another, but is not fully described by either. A particularly significant prediction of this ED model is that the coupling of quantum fields to gravity implies violations of the quantum superposition principle. We conclude by discussing the implications for foundations and information processing.
Nov 26 | 4 pm | SA 141: The Basics of Measurement-Based Quantum Computing by Selman Ipek
In quantum computing quintessential quantum effects are harnessed to provide speedups over standard, or “classical” computers in certain computational tasks. Many conventional approaches to quantum computing utilize the so-called circuit model, which draws heavily on gate-based designs familiar from classical computation. An alternative to this standard model is Measurement-Based Quantum Computation (MBQC), an approach that instead takes full advantage of the peculiar features of quantum measurements. In this seminar we will discuss MBQC from a broad perspective that is suitable for an audience with only an exposure to circuit-based quantum computation. In particular, here we outline the basic premise of MBQC; briefly demonstrate its sufficiency for universal quantum computation; highlight some differences with the circuit-based model; and discuss its advantages. We close with some brief comments on the intimate connection of MBQC to foundational issues in quantum theory.
Dec 3 | 4 pm | SA 141: Contextuality as a resource in MBQC by Markus Frembs
The Kochen-Specker theorem is a key result in quantum foundations. It proves that quantum theory does not admit an underlying classical description. More precisely, there exists no state space whose elements assign sharp values to all physical quantities simultaneously. In the first part of the talk I will give a brief overview of this result and the resulting concept of quantum contextuality, focussing on its foundational significance. What is more, contextuality has recently been proposed as a resource for quantum computation. The measurement-based model, in particular, provides a concrete manifestation of contextuality as a computational resource: if local measurements on a multi-qubit state can be used to evaluate nonlinear Boolean functions with only linear side processing, then this computation constitutes a proof of strong contextuality—the possible local measurement outcomes cannot all be pre-assigned. In the second part, I will review this result and prove a generalisation to the case when the locally measured systems are qudits.
Dec 10 | 4 pm | SA 141: Contextuality and the fundamental theorems of quantum mechanics by Markus Frembs
Contextuality is a key concept in quantum theory. https://arxiv.org/abs/1910.09591 reveals just how important it is by demonstrating that quantum theory builds on contextuality in a fundamental way: many key theorems in quantum foundations—Wigner, Kochen-Specker, Gleason and Bell—can be given a unified presentation in terms of presheaves over the partial order of contexts. In this talk, I will introduce these objects, review their conceptual motivation, and discuss the reformulation of the above results in this framework.
Feb 23 | 5:30 pm | SA 141: Introduction to quantum channels (part I) by Melin Okandan
A quantum state is a density operator on some complex Euclidean space. In this talk, we will briefly touch on the definition of pure states and then prove that every positive semidefinite operator $P$ admits a purification. Next, we will establish that any two purifications of $P$ are unitarily equivalent. Our proofs will involve facts from linear algebra, which will be reviewed during the talk, and we will conclude our discussion with an introduction to quantum channels.
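A minimal numerical sketch of the purification statement (the example state is mine): from a spectral decomposition $\rho = \sum_i p_i \lvert v_i\rangle\langle v_i\rvert$, the vector $\lvert\psi\rangle = \sum_i \sqrt{p_i}\,\lvert v_i\rangle \otimes \lvert i\rangle$ purifies $\rho$, since tracing out the second system recovers $\rho$.

```python
import numpy as np

# Hedged sketch: purifying a density operator.

rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])          # a valid (mixed) qubit state

p, v = np.linalg.eigh(rho)              # spectral decomposition of rho
d = len(p)

# Build the purifying vector |psi> = sum_i sqrt(p_i) |v_i> (x) |i>
# in C^d (x) C^d.
psi = sum(np.sqrt(p[i]) * np.kron(v[:, i], np.eye(d)[i]) for i in range(d))

# Partial trace of |psi><psi| over the second factor.
full = np.outer(psi, psi.conj()).reshape(d, d, d, d)
reduced = np.einsum('ikjk->ij', full)

print(np.allclose(reduced, rho))  # True: |psi> purifies rho
```

Unitary equivalence of purifications then amounts to the freedom of rotating the auxiliary basis $\lvert i\rangle$.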
Mar 2 | 5:30 pm | SA 141: Introduction to quantum channels (part II) by Melin Okandan
A quantum channel is a trace-preserving and completely positive linear map between vector spaces of operators. In this talk, we will provide different representations of quantum channels on complex Euclidean spaces, and our aim is to show that these representations are equivalent. To do this, we will present alternative characterizations of trace-preserving and completely positive maps linked to these representations, and our proofs will again involve facts from linear algebra.
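One of the representations in question is the Kraus form $\Phi(\rho) = \sum_k K_k \rho K_k^{\dagger}$, in which trace preservation is the operator identity $\sum_k K_k^{\dagger} K_k = I$. A hedged sketch (the channel and parameter are chosen by me) checking this for the qubit depolarizing channel:

```python
import numpy as np

# Hedged sketch: the qubit depolarizing channel in Kraus form.

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

p = 0.3
kraus = [np.sqrt(1 - 3 * p / 4) * I] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]

def apply_channel(ks, rho):
    # Phi(rho) = sum_k K_k rho K_k^dagger
    return sum(K @ rho @ K.conj().T for K in ks)

# Trace preservation: sum_k K_k^dagger K_k = I.
print(np.allclose(sum(K.conj().T @ K for K in kraus), I))  # True

# On trace-one states the channel shrinks the Bloch vector:
# Phi(rho) = (1 - p) rho + p I/2.
rho = np.array([[0.9, 0.1], [0.1, 0.1]], dtype=complex)
print(np.allclose(apply_channel(kraus, rho), (1 - p) * rho + p * I / 2))  # True
```

The equivalence results discussed in the talk relate this form to, among others, the Choi and Stinespring representations.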
May 16 | 4:00 pm | Zoom: Generalizing the quantification of information by Bilal Canturk
A statistical approach to the second law of thermodynamics leads to an expression in terms of probabilities, $H_{\text{BGS}} = - k_{B}\sum_{k = 1}^{N}p_{k}\ln p_{k}$ (the Boltzmann-Gibbs-Shannon entropy). On the other hand, a rational approach to the quantification of information led Shannon to a function which is formally equal to this entropy up to a constant factor. From these two facts a deep question arises: what is the relationship between information and energy? This question can be divided into two parts: (1) What is the general quantification of information? (2) What is the general relationship between information and energy?
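A quick numerical illustration (in units where $k_B = 1$; the example distributions are mine): $H_{\text{BGS}}$ is maximized, at $\ln N$, by the uniform distribution, and vanishes on a deterministic one.

```python
import numpy as np

# Hedged numerical aside: the Boltzmann-Gibbs-Shannon entropy
# H = -k_B * sum_k p_k ln p_k, here in units where k_B = 1.

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * ln 0 is taken to be 0
    return -np.sum(p * np.log(p))

uniform = np.full(4, 0.25)
print(np.isclose(shannon_entropy(uniform), np.log(4)))  # True: maximum ln N
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))            # zero uncertainty
```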
Regarding the first question, $H_{\text{BGS}}$ is derived uniquely from the four Shannon-Khinchin (SK) axioms as the quantification of information. Over the course of time, several generalized entropies were proposed by relaxing the fourth SK axiom, in order to accommodate newly recorded phenomena that do not obey the distribution derived from the maximization of $H_{\text{BGS}}$ (Rényi, 1961; Tsallis, 1988; Tempesta, 2016). Be that as it may, any admissible entropy must satisfy certain physical conditions (Bento et al., 2015; Canturk et al., 2017, 2018).
In addition, entropy can also be used to quantify the uncertainty principle and in quantum cryptography (Maassen & Uffink, 1988; Canturk & Gedik, 2021). Regarding the second question, some reasonable arguments in quantum information theory require that any function quantifying information be invariant under unitary transformations (Brukner & Zeilinger, 1999).
I will present a survey of the first question and some links to the second question in the scope of the quantum measurements that are provided by mutually unbiased bases (MUBs) (Wootters & Fields, 1989) and symmetric informationally complete positive operator-valued measures (SIC-POVMs) (Renes et al., 2004).
May 23 | 4:00 pm | Zoom: Differential Geometry of Contextuality by Sidiney B. Montanhano
Contextuality has many approaches, each built for a specific purpose or strategy to exploit its characteristics. All of them start from a codification of physical systems in some mathematical structure that cannot be represented by another compatible structure called classical. The main idea of this talk, based on (Montanhano, 2022), will be to identify states, effects, and transformations in the framework of generalized contextuality as vectors living in a tangent space, and the non-contextuality conditions as discrete closed paths implying null vertical phases. Two equivalent interpretations hold: the geometrical or realistic view, where flat space is imposed, so that contextual behavior becomes equivalent to the curvature of the probabilistic functions, thus a modification of the valuation function; and the topological or anti-realistic view, where the valuation functions must be preserved, so that contextual behavior can be translated as topological failures. This formalism allows the study of a set of concepts: interference, non-commutativity, signed measures, non-embeddability, the Vorob'ev theorem, the contextual fraction, and disturbance in ontic models.
June 6 | 12:00 pm | Zoom: The XP Stabiliser Formalism: a Generalisation of the Pauli Stabiliser Formalism with Arbitrary Phases by Mark Webster
Mark Webster works in the field of quantum error correction, and he will be discussing a generalisation of the Pauli stabiliser formalism. The new XP stabiliser formalism allows us to represent a much wider set of states, and XP codes have a much richer logical operator structure compared to the Pauli stabiliser formalism. In addition, XP codes cannot be classically simulated, which suggests that they capture some aspects of quantum advantage.
June 20 | 4:00 pm | Zoom: Polytope theory and classical simulation of quantum computation with magic states by Michael Zurel
Polytopes come up in several areas of quantum information science. They appear in the foundations of quantum theory, for example through Bell inequalities and noncontextuality inequalities. They are also useful tools in the study of quantum information processing tasks like quantum computation and quantum communication. Here they can describe separations between the capabilities of classical theories, quantum theory, and beyond-quantum theories like the no-signalling polytope. In this talk I will give an overview of some examples of where polytopes are used in quantum computation. In particular, I will focus on a few families of polytopes that provide useful descriptions for a universal model of quantum computation and I will describe how these families of polytopes can be used to characterize the quantum computational advantage over classical computation. In addition, I will review some of the algorithms and tools used for studying these polytopes.
July 4 | 4:00 pm | Zoom: Contextuality as a resource in shallow quantum circuits by Sivert Aasnæss
August 1 | 4:00 pm | Zoom: Neither Contextuality nor Nonlocality Admits Catalysts by Martti Karvonen
In this talk, I will give an overview of https://arxiv.org/abs/2102.07637 , showing that the resource theory of contextuality does not admit catalysts, i.e., there are no correlations that can enable an otherwise impossible resource conversion and still be recovered afterward. As a corollary, we observe that the same holds for nonlocality. As entanglement allows for catalysts, this adds a further example to the list of “anomalies of entanglement,” showing that nonlocality and entanglement behave differently as resources. On the way, I will explain the construction of the resource theories of contextuality and nonlocality, and discuss some categorical aspects of these. Time permitting, we will also show that catalysis remains impossible even if, instead of classical randomness, we allow some more powerful behaviors to be used freely in the free transformations of the resource theory.
August 8 | 4:00 pm | Zoom: The Topology and Geometry of Causality by Nicola Pinzani
In my talk I am going to present a unified operational framework for the study of causality, non-locality and contextuality, in a fully device-independent and theory-independent setting. Our investigation proceeds from two complementary fronts: a topological one, using tools from sheaf theory, and a geometric one, based on polytopes and linear programming. From the topological perspective, we understand experimental outcome probabilities as bundles of compatible contextual data over certain topological spaces, encoding causality constraints. From the geometric perspective, we understand the same experimental outcome probabilities as points in high-dimensional causal polytopes, which we explicitly construct and fully characterise. Our work is a significant extension of both the established Abramsky-Brandenburger framework for contextuality and the current body of work on indefinite causality. We provide definitions of causal fraction and causal separability for empirical models relative to a broad class of causal constraints: this allows us to construct and characterise novel examples which explicitly connect causal inseparability to non-locality and contextuality. In particular, we clearly demonstrate the existence of “causal contextuality”, a phenomenon where causal structure is explicitly correlated to the classical inputs and outputs of local instruments, so that contextuality of the associated empirical model directly implies causal inseparability.