PhD courses 2015

Perspectives in parallel programming

Title: Perspectives in parallel programming

Lecturer: Marco Danelutto, Dipartimento di Informatica


May 13, 11-13, Seminari W
May 15, 11-13, Seminari W
May 22, 11-13, Seminari W

(the schedule of the next lectures will be decided on May 13)

The course will introduce several programming environments used to implement parallel applications on modern parallel architectures. A detailed introduction to the available structured parallel programming environments will be given. Optimization problems related to different non-functional concerns (performance, power management, fault tolerance) will also be described, and typical experiences in different application domains will be discussed. Overall, the course will give the student a good perspective on the kinds of problems to be solved when tackling the efficient programming of (massively) parallel systems, as well as a basic knowledge of state-of-the-art parallel programming frameworks.
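To make the idea of a structured parallel programming environment concrete, here is a minimal, thread-based sketch of the "map" skeleton, one of the basic structured patterns such environments provide. The helper name `parallel_map` and the worker count are illustrative choices, not taken from any particular framework discussed in the course.

```python
# A minimal, thread-based sketch of the "map" parallel skeleton:
# apply the same function independently to every element of a collection.
from concurrent.futures import ThreadPoolExecutor

def parallel_map(f, data, workers=4):
    """Apply f to every element of data using a pool of worker threads;
    results come back in input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(f, data))

def square(x):
    return x * x

print(parallel_map(square, range(8)))  # prints [0, 1, 4, 9, 16, 25, 36, 49]
```

Real structured environments add the concerns the course covers on top of such skeletons: scheduling, mapping to cores, and handling of non-functional requirements.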

Course slides:

Bayesian Machine Learning

Title: Bayesian Machine Learning

Lecturer: Guido Sanguinetti, University of Edinburgh

Place:      Computer Science Department, University of Pisa


  • 23 February, seminar room west, 15-17;
  • 24 / 25 / 26 February, seminar room west, 9-11;
  • 27 February, Laboratorio didattico Polo Fibonacci “I”, 9-11;
  • 2 March, seminar room west, 16-18;
  • 3 / 4 / 5 March, seminar room west, 9-11;
  • 6 March, Laboratorio didattico Polo Fibonacci “I”, 9-11.

This is a rough summary of the Bayesian Machine Learning course. The main reference is D. Barber’s book, Bayesian Reasoning and Machine Learning; the numbers in brackets refer to Barber’s book unless explicitly stated otherwise. The book is available online.

  • Lecture 1: Statistical basics. Probability refresher, probability distributions, entropy and KL divergence (Ch 1, 8.2, 8.3). Multivariate Gaussian (8.4). Estimators and maximum likelihood (8.6 and 8.7.3). Supervised and unsupervised learning (13.1). Lecture 1
  • Lecture 2: Linear models. Regression with additive noise and logistic regression (probabilistic perspective): maximum likelihood and least squares (18.1 and 17.4.1). Duality and kernels (17.3). Lecture 2
  • Lecture 3: Bayesian regression models and Gaussian Processes. Bayesian models and hyperparameters (18.1.1, 18.1.2). Gaussian Process regression (19.1-19.4; see also Rasmussen and Williams, Gaussian Processes for Machine Learning, MIT Press, Ch 2, available for download online). Lecture 3
  • Lecture 4: Active learning and Bayesian optimisation. Active learning, basic concepts and types of active learning (B. Settles, Active Learning Literature Survey, sections 2 and 3). Bayesian optimisation and the GP-UCB algorithm (Brochu et al.). Lecture 4
  • Lab 1: GP regression and Bayesian Optimisation.
  • Lecture 5: Latent variables and mixture models. Latent variables and the EM algorithm (11.1 and 11.2.1). Gaussian mixture models and mixture of experts (20.3, 20.4). Lecture 5
  • Lecture 6: Graphical models. Belief networks and Markov networks (3.3 and 4.2). Factor graphs (4.4). Lecture 6
  • Lecture 7: Exact inference in trees. Message passing and belief propagation (5.1 and 28.7.1). Lecture 7
  • Lecture 8: Approximate inference in graphical models. Variational inference: Gaussian and mean field approximations (28.3, 28.4). Sampling methods and Gibbs sampling (27.4 and 27.3). Lecture 8
  • Lab 2: Bayesian Gaussian mixture models.
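As a pointer to the kind of material covered in Lab 1, here is a minimal Gaussian Process regression sketch in the spirit of Rasmussen and Williams, Ch 2. The squared-exponential kernel, its length-scale, and the noise level are illustrative choices, not prescribed by the course.

```python
# Minimal GP regression: posterior mean and covariance of a zero-mean GP
# with a squared-exponential (RBF) kernel, conditioned on training data.
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """Posterior mean and covariance of the GP at the test points."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    alpha = np.linalg.solve(K, y_train)          # K^{-1} y
    mean = Ks.T @ alpha
    cov = rbf(x_test, x_test) - Ks.T @ np.linalg.solve(K, Ks)
    return mean, cov

x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
mean, cov = gp_posterior(x, y, x)  # at the training points the mean recovers y
```

With a small noise term the posterior mean interpolates the observations and the posterior variance collapses at the training points, which is the behaviour explored in the lab.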

Verifiable voting systems and secure protocols: from theory to practice

Title: Verifiable voting systems and secure protocols: from theory to practice

Lecturer: Peter Y. A. Ryan, Université du Luxembourg

Period: 6-10 July 2015 — Sala Seminari W, 9-12

Coinductive Methods in Computer Science (and beyond)

Title: Coinductive Methods in Computer Science (and beyond)

Lecturer: Filippo Bonchi and Damien Pous, ENS Lyon

Period: 13 — 24 April — all lectures will be in Sala Seminari W

The induction principle is ubiquitous in computer science. For instance it is used to prove properties of programs involving finite data structures, such as natural numbers, lists and trees. For reasoning about concurrent programs or infinite data structures, like streams and infinite trees, one needs a dual principle — coinduction — which is the main topic of this course.
The relevance of coinductive reasoning has usually been associated with bisimilarity, a notion of equivalence that emerged at the end of the seventies in three different fields: non-well-founded set theory, modal logics and concurrency theory. In recent years, coinductive methods have been studied in more and more areas of computer science, such as automata and type theory. Perhaps unexpectedly, applications to game theory and economics have been proposed recently.
We will introduce the coinductive definition and proof principles, several techniques to enhance coinductive proofs, and some algorithms to automate such proofs. During the course, we will encounter examples stemming from different areas, with particular focus on automata and concurrency theory. If time allows, we will also give an overview of the theory of coalgebras, which is the abstract framework encompassing all these different examples.
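A concrete taste of the algorithmic side of the course is Hopcroft and Karp's procedure for DFA language equivalence, which the lectures revisit through the lens of bisimulations up-to equivalence. The sketch below is a standard rendering of the classical algorithm; the automaton encoding (one transition table shared by both automata) is an illustrative choice.

```python
# Hopcroft-Karp equivalence check for DFAs: try to build a bisimulation
# up to equivalence, merging states into classes with a union-find.
def equivalent(init1, init2, delta, accepting):
    """Language equivalence of two DFA states.
    delta[state][symbol] -> state; accepting is the set of final states."""
    parent = {}
    def find(s):
        parent.setdefault(s, s)
        while parent[s] != s:
            parent[s] = parent[parent[s]]   # path compression
            s = parent[s]
        return s
    todo = [(init1, init2)]
    while todo:
        p, q = todo.pop()
        rp, rq = find(p), find(q)
        if rp == rq:
            continue                        # already in the same class
        if (p in accepting) != (q in accepting):
            return False                    # a distinguishing word exists
        parent[rp] = rq                     # merge classes ("up to equivalence")
        for a in delta[p]:
            todo.append((delta[p][a], delta[q][a]))
    return True

# Two DFAs over {a}, both accepting words with an even number of a's:
delta = {"s0": {"a": "s1"}, "s1": {"a": "s0"},
         "t0": {"a": "t1"}, "t1": {"a": "t2"},
         "t2": {"a": "t3"}, "t3": {"a": "t0"}}
print(equivalent("s0", "t0", delta, {"s0", "t0", "t2"}))  # prints True
```

The "up to equivalence" step is the merge: pairs already related through the union-find need not be re-examined, which is exactly the enhancement the course generalises with up-to techniques.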

Monday 13th 10-13
Motivations; Automata

Tuesday 14th 10-12
Hopcroft and Karp’s algorithm; Bisimulations up-to equivalence

Wednesday 15th 10-12
Knaster and Tarski theorem; A lattice-theoretic perspective on up-to techniques

Thursday 16th 10-13
Relational Algebra; Kleene Algebra; Kleene Algebra with Test

Monday 20th 10-12
Bisimulations up-to congruence; Hopcroft and Karp up-to congruence

Tuesday 21st 10-12
Weighted Automata; Streams

Wednesday 22nd 10-12
CCS; GSOS specifications

Thursday 23rd 10-12
Algebras; Coalgebras; Bialgebras

Friday 24th 10-12
Partition Refinement algorithm (as terminal sequence)
Brzozowski’s algorithm (as duality of observability and reachability)


Additional readings:

1) D. Sangiorgi. Introduction to Bisimulation and Coinduction. Cambridge University Press, 2012
2) D. Sangiorgi, J. Rutten. Advanced Topics in Bisimulation and Coinduction, Cambridge University Press, 2012

High Dynamic Range Imaging: theory and applications

Title: High Dynamic Range Imaging: theory and applications

Lecturer: Francesco Banterle, Visual Computing Laboratory, CNR Pisa

Period: 15-26 June 2015 — Sala Seminari Ovest, 15-17

Reference page:

  • Introduction to modern imaging – 2 hrs
  • Introduction to high dynamic range imaging – 1 hr
  • Capturing and generating real-world lighting – 4 hrs
  • Tone mapping HDR images for low dynamic range displays – 4 hrs
  • Tone mapping HDR videos for low dynamic range displays – 2 hrs
  • Inverse tone mapping – 2 hrs
  • HDR content formats and compression – 2 hrs
  • High dynamic range imaging applications – 4 hrs

Searching by Similarity on a Very Large Scale

Title: Searching by Similarity on a Very Large Scale

Lecturer: Giuseppe Amato, CNR Pisa

Period: end of September/beginning of October 2015

  • Foundations of Metric Space Searching
    • Distance Searching Problem
    • Metric Distance Measures
    • Similarity Queries
    • Basic Partitioning Principles
    • Principles of Similarity Query Execution
    • Policies for avoiding Distance Computations
    • Metric Space Transformations
    • Principles of Approximate Similarity Search
    • Advanced issues: Statistics, Proximity, Performance Prediction, Tree quality measures
    • Advanced issues: Choosing reference points
  • Exact Similarity Search
    • Vantage Point Trees
    • GNAT
  • The M-Tree Family
    • The M-Tree
    • Bulk-Loading Algorithm
    • Multi-Way Insertion Algorithms
    • The Slim Tree
    • Slim-Down Algorithm
    • Pivoting M-Tree
  • Hash Based Methods
    • D-Index
    • Approximate Similarity Search with M-Tree
      • Relative Error Approximation
      • Good Fraction Approximation
      • Small Chance Improvement Approximation
      • Proximity-Based Approximation
      • PAC Nearest Neighbor Searching
      • Performance tests
    • Other Approximate Similarity Search Methods
    • Permutation Based Methods
      • Permutation Spearman Rho
      • PP-Index
      • MI-File
    • Locality Sensitive Hashing (LSH)
      • LSH based on p-stable distributions
      • LSH with Hamming Distance

Type systems for process and session calculi

Title: Type systems for process and session calculi

Lecturer: Ilaria Castellani, Rosario Pugliese

Period: October 2015

Place: Università di Firenze, Viale Morgagni

Objective: The course aims to present some formal techniques, based on type systems, for the analysis of security properties in a variety of process and session calculi.

Contents: The course will start by introducing some type systems for security for CCS, together with an overview of several security properties, with particular attention to noninterference. It will then deepen the study of type systems for process calculi more expressive than CCS, such as the pi-calculus. Milner's "sorting", the input/output typing of Pierce and Sangiorgi, the "behavioural" typing of Kobayashi, and some type systems for information-flow security will be introduced. Finally, session calculi will be studied, which allow processes to organize their communications according to a predefined protocol. Some type systems will be presented for variants of such calculi with binary sessions with delegation, with n-ary sessions, with n-ary sessions with delegation and security, and with sessions with monitoring and adaptability.

Scientific writing in English

Title: Scientific writing in English

Lecturer: Steven Shore, Department of Physics

Period: February 3, 4, 5, 6

Sala Seminari Ovest, 9-12 (lectures) and 15-17 (exercises); on the 5th, 9-11.

A book is available on demand (about 4 euro per copy).

Attention: this English course does not count as a course or seminar, but as cultural enrichment.

Other courses

In addition, each student can attend a course of the Master programme in Computer Science, for instance: