Theses

Information is physical

Landauer's principle pertains to the lower limit of energy consumption of a computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit, must be accompanied by a corresponding entropy increase". Typically this increase takes the form of energy imported into the computer, converted to heat, and dissipated into the environment.
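For orientation, the bound itself takes a simple form (a standard statement of the principle, with k_B the Boltzmann constant and T the temperature of the surrounding environment):

    E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\mathrm{J} \quad \text{per erased bit at } T \approx 300\,\mathrm{K}.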
The aim of this thesis is to write an introduction to Landauer's principle for computer scientists. It may also cover related topics such as Maxwell's demon or reversible computation.

Circuit Complexity

Circuit complexity is a branch of computational complexity theory that classifies functions according to the size of the Boolean circuits that compute them. This is in contrast to Turing-machine-based measures, which consider the amount of time or space used for the computation. The aim of this thesis is to write an introduction to circuit complexity that could be given to interested students in the "Berechenbarkeit und Komplexität" lecture.
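As a small illustrative example (not tied to any particular source): the two-bit XOR function can be computed by the circuit

    x \oplus y \;=\; (x \land \lnot y) \lor (\lnot x \land y),

which uses five gates over the basis {AND, OR, NOT}; the circuit complexity of a Boolean function is the size of the smallest circuit of this kind that computes it.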

Proof Complexity

Complexity theory usually measures the cost of solving a computational problem. In the same spirit, proof complexity studies the cost of proving a statement in a formal proof system. This cost is measured as the minimum length of a proof of the statement. In this project we will give an introduction to proof complexity and its connections to the P vs NP problem.
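A classical example that gives the flavour of such results (cited here only as an illustration): propositional encodings of the pigeonhole principle, stating that n+1 pigeons cannot be placed injectively into n holes, are tautologies whose shortest resolution refutations have size

    2^{\Omega(n)}

(Haken, 1985), even though the statement itself is obviously true; stronger proof systems, such as Frege systems, admit polynomial-size proofs of the same formulas.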

Bioinspired Models of Computation

A model of computation is a mathematical object that allows us to study the limits and cost of computing. The ubiquitous model of computation is the Turing machine, proposed by Alan Turing after an exhaustive analysis of what "computing" means. However, there are other models of computation whose inspiration comes from biology. Examples of these models are membrane computing, inspired by the inner structure of cells; cellular automata, inspired by life itself; and virus machines, inspired by the transmission and replication of viruses. In this project we will study these biology-inspired models from a mathematical point of view.
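As a minimal sketch of one of these models (a toy program for illustration only, not part of the project description), a single update step of an elementary one-dimensional cellular automaton can be written in a few lines of Haskell:

    -- One synchronous update step of an elementary (1D, two-state) cellular automaton.
    -- Cells beyond the two ends of the row are treated as dead (False).
    import Data.Bits (testBit)

    type Rule = Int      -- Wolfram rule number, e.g. 110
    type Row  = [Bool]   -- a finite row of cells

    -- Apply the rule to one neighbourhood (left, centre, right).
    localRule :: Rule -> Bool -> Bool -> Bool -> Bool
    localRule r l c x = testBit r (4 * fromEnum l + 2 * fromEnum c + fromEnum x)

    -- One global step: every cell is updated from its own neighbourhood.
    step :: Rule -> Row -> Row
    step r cells =
      [ localRule r l c x
      | (l, c, x) <- zip3 (False : cells) cells (drop 1 cells ++ [False]) ]

    -- A single live cell evolving under Rule 110 for five steps.
    example :: [Row]
    example = take 5 (iterate (step 110) (replicate 7 False ++ [True] ++ replicate 7 False))

Starting from a single live cell, iterating "step 110" already produces the intricate patterns for which Rule 110 is famous (it is even Turing complete).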

Programming Language Semantics

Assume we are given an algorithm A in a programming language designed to fulfill a task T. How do we know that after executing A the task T has been correctly fulfilled? Algorithms are purely syntactical objects (pieces of text), so to understand their execution better we can assign them a meaning (a semantics). In this project we will define multiple semantic models for programming languages and show that they are equivalent.
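As a hedged sketch of what comparing two semantics can look like (a toy expression language, chosen here only for illustration), the same language can be given a denotational semantics (a direct evaluation function) and a small-step operational semantics, and the equivalence of the two is then a statement one can prove by induction:

    -- A toy expression language with two semantics.
    data Expr = Lit Int | Add Expr Expr | Mul Expr Expr
      deriving (Show, Eq)

    -- Denotational semantics: the meaning of an expression is a number.
    denote :: Expr -> Int
    denote (Lit n)   = n
    denote (Add a b) = denote a + denote b
    denote (Mul a b) = denote a * denote b

    -- Small-step operational semantics: rewrite one redex at a time.
    step :: Expr -> Maybe Expr
    step (Lit _)               = Nothing                  -- values do not step
    step (Add (Lit m) (Lit n)) = Just (Lit (m + n))
    step (Mul (Lit m) (Lit n)) = Just (Lit (m * n))
    step (Add a b)             = case step a of
                                   Just a' -> Just (Add a' b)
                                   Nothing -> fmap (Add a) (step b)
    step (Mul a b)             = case step a of
                                   Just a' -> Just (Mul a' b)
                                   Nothing -> fmap (Mul a) (step b)

    -- Run the small-step semantics to completion.
    run :: Expr -> Int
    run (Lit n) = n
    run e       = case step e of
                    Just e' -> run e'
                    Nothing -> error "stuck (impossible for this language)"

The equivalence "denote e == run e" for every expression e is exactly the kind of claim that, for realistic languages, the project would establish.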

Degrees of unsolvability

The halting problem is the canonical example of a problem that is unsolvable. But how complex is it compared to other unsolvable problems? Turing reductions allow us to compare the difficulty of unsolvable problems to each other. This allows us to sort all problems into Turing degrees (or degrees of unsolvability), each consisting of problems of the same difficulty. The resulting structure of (reducibility) relations between these equivalence classes is quite complicated and has been a fruitful area of study. In this project we will describe some of the properties of this structure of unsolvability degrees, including the fact that there exists a computably enumerable degree strictly between that of the computable sets and that of the halting problem (Post's problem).
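Spelled out with the standard notation (for sets A and B of natural numbers):

    A \le_T B \iff A \text{ is decidable by a Turing machine with oracle access to } B, \qquad A \equiv_T B \iff A \le_T B \text{ and } B \le_T A,

and the Turing degree of A is its equivalence class under \equiv_T; the degree of the computable sets is the least element of this ordering.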

Modeling and Comparing Auction Mechanisms Using Epistemic Logic

Auctions are widely used to allocate goods and resources, from selling artwork and concert tickets to distributing online advertisements and radio spectrum. Different auction formats—such as English, Dutch, first-price sealed-bid, and second-price sealed-bid—reveal and conceal information in very different ways, shaping how participants reason about others’ bids and values. This thesis will use methods from epistemic logic—for example, Dynamic Epistemic Logic or runs-and-systems semantics—to formally model these auctions and compare them in terms of how knowledge evolves, what remains private, and how transparency or strategic behavior emerges. The aim is to better understand the role of information and knowledge in auctions by analyzing them with logical tools.
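To give a flavour of the intended comparison (a purely schematic example; the atoms and the exact modelling are placeholders): in an open English auction every participant i eventually knows the final price p, which one might write as K_i(price = p), whereas in a sealed-bid format a losing bidder j need not learn the winner's exact bid, \lnot K_j(b_w = v) for any particular value v.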

Modeling the Swiss Data Protection Act in Deontic-Temporal Logic

The goal of this thesis is to explore whether deontic-temporal logic—a formal framework for reasoning about obligations, permissions, prohibitions, and time—can be used to model selected rules from the revised Swiss Federal Act on Data Protection (FADP, 2023). By focusing on a small set of provisions (such as data breach notifications or data deletion duties), the project aims to show how legal norms with deadlines and exceptions can be represented formally, and to reflect on the opportunities and limits of using logical methods to understand and analyze data protection law.
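For instance, a breach notification duty with a deadline might be rendered schematically as

    \mathsf{G}\bigl(\mathit{breach} \rightarrow \mathsf{O}\,\mathsf{F}_{\le d}\,\mathit{notify}\bigr),

read as: it is always the case that if a breach occurs, then there is an obligation to notify within d time units. The atoms and the deadline d are placeholders for illustration; pinning down what the statute actually requires is precisely part of the modelling work.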

Model Checking Social Network Logic

Model checking tools are traditionally used for the automatic verification of the correctness of (e.g. software) programs. The behaviour of a given program is evaluated by checking whether the desired properties hold of a structure representing the program; by expressing the properties as logical formulas and the program as a model, the problem is reduced to evaluating whether the model satisfies the formulas. Model checkers have already been developed for various temporal, dynamic epistemic, and spatial logics. In this project, we will implement a Haskell-based model checker for social network logic: a family of logics that model the dynamics of social influence and friendship selection in social networks.
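As a minimal sketch of the core of such a tool (a toy logic with only atomic features, Boolean connectives, and a single "all my friends" modality; the social network logics in the literature are considerably richer):

    -- A minimal model checker for a toy social network logic.
    import qualified Data.Map as Map
    import qualified Data.Set as Set

    type Agent = String

    -- A model: who is connected to whom, and which features each agent has.
    data Model = Model
      { friends  :: Map.Map Agent (Set.Set Agent)
      , features :: Map.Map Agent (Set.Set String)
      }

    -- Formulas are evaluated at a given agent.
    data Form
      = Has String        -- the agent has this feature
      | Not Form
      | And Form Form
      | AllFriends Form   -- the formula holds at every friend of the agent

    check :: Model -> Agent -> Form -> Bool
    check m a (Has p)        = p `Set.member` Map.findWithDefault Set.empty a (features m)
    check m a (Not f)        = not (check m a f)
    check m a (And f g)      = check m a f && check m a g
    check m a (AllFriends f) =
      all (\b -> check m b f) (Set.toList (Map.findWithDefault Set.empty a (friends m)))

    -- Example query: alice follows the trend, and so do all of her friends.
    example :: Bool
    example = check net "alice" (And (Has "trend") (AllFriends (Has "trend")))
      where
        net = Model
          { friends  = Map.fromList [("alice", Set.fromList ["bob"]), ("bob", Set.empty)]
          , features = Map.fromList [ ("alice", Set.fromList ["trend"])
                                    , ("bob",   Set.fromList ["trend"]) ]
          }

The dynamic part of social network logic (agents adopting behaviour from their friends, or changing their friendships) would be added on top of such a static core.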

Logical Verification with Lean

As rigorous as we try to be when writing proofs, human error is nearly unavoidable. One solution to this problem is to use a proof assistant for the automated verification of the correctness of our proofs. Lean is such a proof assistant: a functional programming language designed for the implementation of proofs. Proof construction usually proceeds interactively and backwards: starting from the goal formula and the available premises and assumptions, the user applies "tactics" that simplify the goal and iteratively break it down into subgoals, until no goals are left. The aim of this project is to get familiar with Lean and to use it to verify the proof of a mathematical theorem.
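As a first taste (Lean 4 syntax; a deliberately tiny example compared to the theorem that would be verified in the project), here is a tactic proof that conjunction is commutative, using the tactics mentioned above:

    theorem and_swap (p q : Prop) : p ∧ q → q ∧ p := by
      intro h            -- assume the premise p ∧ q, call it h
      constructor        -- split the goal q ∧ p into two subgoals
      · exact h.right    -- first subgoal: q, taken from h
      · exact h.left     -- second subgoal: p, taken from h

Once no goals remain, Lean has checked every step of the proof mechanically.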

Formal Verification of Communication Protocols in Lean

The field of distributed computing studies distributed systems: networks of computers, or agents, that need to communicate and collaborate to achieve a common goal. Several protocols have been designed to ensure efficient and/or reliable communication between these agents. The choice of protocol influences the solvability of the task at hand. For a number of tasks and protocols, the necessary and sufficient conditions for a given protocol to solve a given task have been formally proved. In this project, we will formally verify such statements using the proof assistant Lean: a functional programming language designed for the implementation and verification of proofs.

Theories of Vagueness and Indeterminacy

Roughly speaking, a concept may be considered vague if it gives rise to borderline cases, i.e., cases in which something neither clearly belongs to the concept nor clearly fails to belong to it. A typical example is the term "tall": although there are definite cases in which someone is clearly tall or clearly not tall, one cannot specify an exact height that determines, for an arbitrary person, whether they are tall or not. Over the years, philosophers and logicians have developed numerous formal frameworks for dealing with vague sentences. Examples include many-valued logics, supervaluationism, strict-tolerant logics and fuzzy logic. The project aims to discuss and evaluate these frameworks from a philosophical and logical point of view. Students are also welcome to consider other types of indeterminacy or to develop their own solution to the problem of vagueness.
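To illustrate just one of these approaches (the particular numbers are arbitrary and serve only as an example): fuzzy logic replaces the two classical truth values by degrees in the interval [0, 1], so the predicate "tall" might be interpreted by a membership function such as

    \mu_{\mathrm{tall}}(h) = \min\bigl(1, \max\bigl(0, \tfrac{h - 170}{20}\bigr)\bigr),

which assigns degree 0 to a height of 170 cm, degree 0.5 to 180 cm, and degree 1 to 190 cm and above, so that borderline cases receive intermediate truth values rather than being forced into "tall" or "not tall".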

Questions and Dependency in Logic

Inquisitive logic is a flourishing area of research that has received much attention over the past ten years. It is based on inquisitive semantics, a semantic framework that makes it possible to formalize the meaning of both statements and questions in a uniform way. A characteristic feature of inquisitive semantics is that logical formulas are not evaluated with respect to objects representing states of affairs (such as, e.g., possible worlds), but with respect to sets of such objects (usually referred to as "information states" or "teams"). A closely related framework, known as propositional dependence logic, is concerned with the semantic analysis of dependencies between propositional variables. The project aims to explore the properties and applications of team-based logics and to engage with various open problems, especially in the field of inquisitive modal logic.
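A concrete example of team-based evaluation is the dependence atom of propositional dependence logic: a team T (a set of valuations) supports the atom =(p, q) exactly when the value of p determines the value of q throughout the team,

    T \models {=}(p, q) \iff \text{for all } v, w \in T:\; v(p) = w(p) \text{ implies } v(q) = w(q).

Such conditions cannot be expressed by evaluating formulas at single valuations, which is part of what makes the team-based setting interesting.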


Other topics in the area of logic and theoretical computer science are also possible. If you would like to propose a topic for a thesis, please contact us.