Perspectives in parallel programming
Title: Perspectives in parallel programming
Lecturer: Marco Danelutto, Dipartimento di Informatica
Period:
May 13, 11-13, Sala Seminari W
May 15, 11-13, Sala Seminari W
May 22, 11-13, Sala Seminari W
(the schedule of the next lectures will be decided on May 13)
The course will introduce several programming environments used to implement parallel applications on modern parallel architectures. A detailed introduction to the available structured parallel programming environments will be given. Optimization problems relative to different non-functional concerns (performance, power management, fault tolerance) will also be described, and typical experiences in different application domains will be discussed. Overall, the course will give the student a good perspective on the kind of problems to be solved when tackling the efficient programming of (massively) parallel systems, as well as a basic knowledge of state-of-the-art parallel programming frameworks.
Course slides:
http://backus.di.unipi.it/~marcod/wiki/doku.php?id=pdp2015
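Many structured parallel programming environments of the kind covered in the course are built around patterns (skeletons) such as map, farm and pipeline. As a rough illustration of the idea only, not tied to any specific framework from the course, a map pattern can be sketched with Python's standard library; the function names and worker count are illustrative, and threads are used for simplicity where a real framework would target processes or accelerators:

```python
# Minimal sketch of the "map" structured parallel pattern (skeleton):
# the programmer supplies only the sequential worker function, while
# the pattern takes care of work distribution and result collection.
from concurrent.futures import ThreadPoolExecutor

def worker(x):
    # sequential business-logic code (illustrative)
    return x * x

def par_map(worker, data, n_workers=4):
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        # results come back in input order, as the pattern guarantees
        return list(pool.map(worker, data))

squares = par_map(worker, range(8))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point of the pattern is the separation of concerns: the non-functional aspects (scheduling, degree of parallelism, collection) live in `par_map`, not in the worker code.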
Bayesian Machine Learning
Title: Bayesian Machine Learning
Lecturer: Guido Sanguinetti, University of Edinburgh
Place: Computer Science Department, University of Pisa
Period:
23 February, seminar room west, 15-17;
24/25/26 February, seminar room west, 9-11;
27 February, Laboratorio didattico Polo Fibonacci “I”, 9-11;
2 March, seminar room west, 16-18;
3/4/5 March, seminar room west, 9-11;
6 March, Laboratorio didattico Polo Fibonacci “I”, 9-11.
This is a rough summary for the Bayesian Machine Learning course. The main reference is D. Barber’s book, Bayesian Reasoning and Machine Learning; the numbers in brackets refer to Barber’s book unless explicitly stated otherwise. The book is available online from
http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=Brml.HomePage
Lecture 1: Statistical basics. Probability refresher, probability distributions, entropy and KL divergence (Ch 1, Ch 8.2, 8.3). Multivariate Gaussian (8.4). Estimators and maximum likelihood (8.6 and 8.7.3). Supervised and unsupervised learning (13.1).
Lecture 2: Linear models. Regression with additive noise and logistic regression (probabilistic perspective): maximum likelihood and least squares (18.1 and 17.4.1). Duality and kernels (17.3).
Lecture 3: Bayesian regression models and Gaussian Processes. Bayesian models and hyperparameters (18.1.1, 18.1.2). Gaussian Process regression (19.1-19.4; see also Rasmussen and Williams, Gaussian Processes for Machine Learning, MIT Press, 2006, Ch 2. Available for download at http://www.gaussianprocess.org/gpml/).
Lecture 4: Active learning and Bayesian optimisation. Active learning, basic concepts and types of active learning (B. Settles, Active learning literature survey, sections 2 and 3, available from http://burrsettles.com/pub/settles.activelearning.pdf). Bayesian optimisation and the GP-UCB algorithm (Brochu et al., see http://arxiv.org/abs/1012.2599).
Lab 1: GP regression and Bayesian Optimisation.
Lecture 5: Latent variables and mixture models. Latent variables and the EM algorithm (11.1 and 11.2.1). Gaussian mixture models and mixture of experts (20.3, 20.4).
Lecture 6: Graphical models. Belief networks and Markov networks (3.3 and 4.2). Factor graphs (4.4).
Lecture 7: Exact inference in trees. Message passing and belief propagation (5.1 and 28.7.1).
Lecture 8: Approximate inference in graphical models. Variational inference: Gaussian and mean field approximations (28.3, 28.4). Sampling methods and Gibbs sampling (27.4 and 27.3).
 Lab 2: Bayesian Gaussian mixture models.
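The core computation behind Lecture 3 and Lab 1, the Gaussian Process posterior, can be sketched in a few lines of numpy. The squared-exponential kernel, length scale and noise level below are illustrative choices in the spirit of Rasmussen and Williams, Ch 2:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 ell^2))
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * ell ** 2))

def gp_posterior(X, y, Xs, noise=0.1):
    # posterior mean and pointwise variance of a zero-mean GP at
    # test points Xs, given noisy observations (X, y)
    K = rbf(X, X) + noise ** 2 * np.eye(len(X))
    Ks = rbf(X, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

X = np.array([-1.0, 0.0, 1.0])
y = np.sin(X)                      # toy observations
mean, var = gp_posterior(X, y, np.array([0.0, 2.0]))
# near the data (x = 0) the posterior variance is small; far from the
# data (x = 2) it grows back towards the prior variance of 1
```

This is the standard ingredient reused by Bayesian optimisation in Lecture 4: the GP-UCB acquisition trades off the posterior mean against the posterior variance.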
Verifiable voting systems and secure protocols: from theory to practice
Title: Verifiable voting systems and secure protocols: from theory to practice
Lecturer: Peter Y. A. Ryan, Université du Luxembourg
Period: 6-10 July 2015, Sala Seminari W, 9-12
Coinductive Methods in Computer Science (and beyond)
Title: Coinductive Methods in Computer Science (and beyond)
Lecturer: Filippo Bonchi and Damien Pous, ENS Lyon
Period: 13-24 April; all lectures will be in Sala Seminari W
The induction principle is ubiquitous in computer science. For instance, it is used to prove properties of programs involving finite data structures, such as natural numbers, lists and trees. For reasoning about concurrent programs or infinite data structures, like streams and infinite trees, one needs a dual principle, coinduction, which is the main topic of this course.
The relevance of coinductive reasoning has usually been associated with bisimilarity, a notion of equivalence that emerged at the end of the seventies in three different fields: non-well-founded set theory, modal logics and concurrency theory. In recent years, coinductive methods have been studied in more and more areas of computer science, such as automata and type theory. Perhaps unexpectedly, applications to game theory and economics have been proposed recently.
We will introduce the coinductive definition and proof principles, several techniques to enhance coinductive proofs, and some algorithms to automate such proofs. During the course, we will encounter examples stemming from different areas, with particular focus on automata and concurrency theory. If time allows, we will also give an overview of the theory of coalgebras, which is the abstract framework encompassing all these different examples.
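To make the proof principle concrete, here is a small illustrative sketch (not course material): bisimilarity on a finite labelled transition system, computed as a greatest fixpoint by starting from the full relation and removing pairs whose moves cannot be matched. The states and transitions are made-up examples:

```python
# Bisimilarity on a finite labelled transition system, computed as a
# greatest fixpoint: start from the full relation and repeatedly remove
# pairs whose transitions cannot be matched within the current relation.
def bisimilarity(states, trans):
    # trans maps each state to a set of (label, successor) pairs
    rel = {(p, q) for p in states for q in states}
    changed = True
    while changed:
        changed = False
        for (p, q) in list(rel):
            matched = (
                all(any(a == b and (s, t) in rel for (b, t) in trans[q])
                    for (a, s) in trans[p]) and
                all(any(a == b and (t, s) in rel for (b, t) in trans[p])
                    for (a, s) in trans[q]))
            if not matched:
                rel.discard((p, q))
                changed = True
    return rel

# a one-state loop and a two-state loop exhibit the same behaviour,
# even though their runs are infinite and induction on them fails
trans = {"p": {("coin", "p")},
         "q1": {("coin", "q2")},
         "q2": {("coin", "q1")}}
rel = bisimilarity(["p", "q1", "q2"], trans)
```

The dual reading is exactly the coinductive one: a pair survives the refinement precisely when it is contained in some bisimulation.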
Monday 13th, 10-13
Motivations; Automata
Tuesday 14th, 10-12
Bisimulations
Hopcroft and Karp’s algorithm; Bisimulations up to equivalence
Wednesday 15th, 10-12
Knaster-Tarski theorem; A lattice-theoretic perspective on up-to techniques
Thursday 16th, 10-13
Relational Algebra; Kleene Algebra; Kleene Algebra with Tests
Monday 20th, 10-12
Bisimulations up to congruence; Hopcroft and Karp up to congruence
Tuesday 21st, 10-12
Weighted Automata; Streams
Wednesday 22nd, 10-12
CCS; GSOS specifications
Thursday 23rd, 10-12
Algebras; Coalgebras; Bialgebras
Friday 24th, 10-12
Partition Refinement algorithm (as terminal sequence)
Brzozowski’s algorithm (as duality of observability and reachability)
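As a taste of the algorithmic side of the lectures (an illustrative sketch, with a made-up automaton): Hopcroft and Karp's algorithm decides language equivalence of two DFA states by lazily building a bisimulation up to equivalence, using union-find to skip pairs of states already known equivalent.

```python
# Sketch of Hopcroft and Karp's algorithm: decide whether two DFA
# states accept the same language by building a bisimulation up to
# equivalence, with union-find merging classes of equivalent states.
def hk_equiv(delta, final, x, y):
    # delta: dict (state, symbol) -> state; final: set of accepting states
    parent = {}
    def find(s):
        parent.setdefault(s, s)
        while parent[s] != s:
            parent[s] = parent[parent[s]]   # path halving
            s = parent[s]
        return s
    symbols = {a for (_, a) in delta}
    todo = [(x, y)]
    while todo:
        p, q = todo.pop()
        rp, rq = find(p), find(q)
        if rp == rq:
            continue                        # already known equivalent
        if (p in final) != (q in final):
            return False                    # one accepts, the other rejects
        parent[rp] = rq                     # merge the two classes
        for a in symbols:
            todo.append((delta[(p, a)], delta[(q, a)]))
    return True

# x0 accepts words with an even number of a's; u0 does the same,
# but via a four-state cycle
delta = {("x0", "a"): "x1", ("x1", "a"): "x0",
         ("u0", "a"): "u1", ("u1", "a"): "u2",
         ("u2", "a"): "u3", ("u3", "a"): "u0"}
final = {"x0", "u0", "u2"}
```

The "up to equivalence" part is the `rp == rq` shortcut: without it the algorithm would have to explore every reachable pair of states.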
Additional readings:
1) D. Sangiorgi. Introduction to Bisimulation and Coinduction. Cambridge University Press, 2011.
2) D. Sangiorgi, J. Rutten (eds). Advanced Topics in Bisimulation and Coinduction. Cambridge University Press, 2012.
High Dynamic Range Imaging: theory and applications
Title: High Dynamic Range Imaging: theory and applications
Lecturer: Francesco Banterle, Visual Computing Laboratory, CNR Pisa
Period: 15-26 June 2015, Sala Seminari Ovest, 15-17
Reference page: http://www.banterle.com/francesco/courses/2015/hdri/index.php
Introduction to modern imaging – 2 hrs
Introduction to high dynamic range imaging – 1 hr
Capturing and generating real-world lighting – 4 hrs
Tone mapping HDR images for low dynamic range displays – 4 hrs
Tone mapping HDR videos for low dynamic range displays – 2 hrs
Inverse tone mapping – 2 hrs
HDR content formats and compression – 2 hrs
High dynamic range imaging applications – 4 hrs
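As a concrete taste of the tone mapping topic, a global operator in the style of Reinhard's can be sketched in a few lines; the luminance values and the key value `a` below are illustrative:

```python
import math

def tonemap_global(luminances, a=0.18):
    # Reinhard-style global operator: scale each luminance by the key
    # value 'a' over the log-average luminance of the image, then
    # compress with L / (1 + L), which maps [0, inf) into [0, 1)
    eps = 1e-6
    log_avg = math.exp(sum(math.log(eps + L) for L in luminances)
                       / len(luminances))
    scaled = [a * L / log_avg for L in luminances]
    return [L / (1.0 + L) for L in scaled]

# HDR luminances spanning almost six orders of magnitude
hdr = [0.01, 0.5, 10.0, 5000.0]
ldr = tonemap_global(hdr)  # every value now fits a low dynamic range display
```

A real operator would work per pixel on an image and handle colour; the sketch only shows the dynamic range compression at the core of the global approach.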
Searching by Similarity on a Very Large Scale
Title: Searching by Similarity on a Very Large Scale
Lecturer: Giuseppe Amato, CNR Pisa
Period: end of September/beginning of October 2015
 Foundation of Metric Space Searching
 Distance Searching Problem
 Metric Distance Measures
 Similarity Queries
 Basic Partitioning Principles
 Principles of Similarity Query Execution
 Policies for avoiding Distance Computations
 Metric Space Transformations
Principles of Approximate Similarity Search
 Advanced issues: Statistics, Proximity, Performance Prediction, Tree quality measures
 Advanced issues: Choosing reference points
 Exact Similarity Search
 Vantage Point Trees
 AESA/LAESA
 GNAT
The M-Tree Family
The M-Tree
Bulk-Loading Algorithm
Multi-Way Insertion Algorithms
The Slim Tree
Slim-Down Algorithm
Pivoting M-Tree
Hash-Based Methods
D-Index
Approximate Similarity Search with M-Tree
 Relative Error Approximation
 Good Fraction Approximation
 Small Chance Improvement Approximation
 ProximityBased Approximation
 PAC Nearest Neighbor Searching
 Performance tests
 Other Approximate Similarity Search Methods
 Permutation Based Methods
 Permutation Spearman Rho
PP-Index
MI-File
 Locality Sensitive Hashing (LSH)
LSH based on p-stable distributions
 LSH with Hamming Distance
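The "avoiding distance computations" principle behind several of the structures above (AESA/LAESA in particular) can be sketched with pivot-based filtering via the triangle inequality; the dataset, pivots and query below are illustrative:

```python
# Pivot filtering (LAESA-style sketch): precompute the distance from
# every object to a few pivots; the triangle inequality then yields a
# lower bound |d(q, p) - d(o, p)| <= d(q, o) that lets the search skip
# objects without computing their actual distance to the query.
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def range_search(data, pivots, q, radius):
    piv_dist = {o: [euclid(o, p) for p in pivots] for o in data}
    q_dist = [euclid(q, p) for p in pivots]
    result, computed = [], 0
    for o in data:
        lb = max(abs(qd - od) for qd, od in zip(q_dist, piv_dist[o]))
        if lb > radius:
            continue            # pruned: d(q, o) never computed
        computed += 1
        if euclid(q, o) <= radius:
            result.append(o)
    return result, computed

data = [(0, 0), (1, 0), (5, 5), (9, 9)]
pivots = [(0, 0), (10, 10)]
result, computed = range_search(data, pivots, (1, 1), 1.5)
```

In a real index the object-to-pivot distances are computed once offline; at query time only a handful of actual distance evaluations survive the filter.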
Type systems for process and session calculi
Title: Type systems for process and session calculi
Lecturer: Ilaria Castellani, Rosario Pugliese
Period: October 2015
Place: Università di Firenze, Viale Morgagni
Objective: The course aims to present some formal techniques, based on type systems, for the analysis of security properties in a variety of process and session calculi.
Contents: The course will start by introducing some type systems for security for CCS, together with an overview of various security properties, with particular attention to non-interference. It will then deepen the study of type systems for process calculi more expressive than CCS, such as the pi-calculus. Milner’s sorting and the input/output typing of Pierce and Sangiorgi will be introduced, as well as Kobayashi’s behavioural typing and some type systems for information flow security. Finally, session calculi will be studied, which allow processes to organise their communications according to a predefined protocol. Some type systems will be presented for variants of these calculi with binary sessions with delegation, with n-ary sessions, with n-ary sessions with delegation and security, and with monitored and adaptable sessions.
Scientific writing in English
Title: Scientific writing in English
Lecturer: Steven Shore, Department of Physics
Period: February 3, 4, 5, 6
Sala Seminari Ovest, 9-12 (lectures) and 15-17 (exercises); on the 5th, 9-11.
A book is available on demand (about 4 euro per copy).
Attention: this English course carries no credit as a course or seminar; it is offered as cultural enrichment.
Other courses
In addition, each student can attend a course of the Master programme in Computer Science, for instance:
 Pierpaolo Degano: Static analysis techniques
 Ugo Montanari: Semantics and type theory
 Nadia Pisanti: Algorithms for bioinformatics