Courses

Held in cooperation with the University of Chicago Department of Computer Science

All courses held in room 530 unless otherwise noted.



Complete Course List

TTIC 31000 - Research at TTIC

TBD

Weekly lectures and discussions by TTIC researchers introducing their research and research problems. Provides a broad view of the research carried out at TTIC. The course is taken for pass/fail credit and satisfies one quarter of credit for the Research at TTIC Series Requirement. (See the Academic Program Guide for details.)

TTIC 31010 - Algorithms

Makarychev, Yury

This is a graduate level course on algorithms with the emphasis on central combinatorial optimization problems and advanced methods for algorithm design and analysis. Topics covered include asymptotic analysis, greedy algorithms, dynamic programming, amortized analysis, randomized algorithms and probabilistic methods, combinatorial optimization and approximation algorithms, linear programming, and advanced data structures.

The course textbook is "Algorithm Design" by Kleinberg and Tardos.

Lecture Plan

1. Greedy algorithms (1 week)

2. Dynamic programming (1 week)

3. Amortized analysis (1 week)

4-6. Max flow, min-cut, bipartite matching and their applications (3 weeks)

7. Linear programming, LP-duality (1 week)

8. NP-hardness (1 week)

9. Approximation algorithms (1 week)

10. Randomized algorithms (1 week)


Tutorial: Wednesdays, 3:30-4:20, Ryerson, room 276

Assumes familiarity with proofs and asymptotic notation. Some basic knowledge of the notion of NP-hardness is also required.

Expected outcomes:

  • Ability to design and rigorously analyze algorithms using paradigms such as greedy or dynamic programming (see the sketch after this list).
  • Understand the use of linear programming in optimization. Be able to formulate problems as linear programs.
  • Understand linear programming duality and applications to problems such as max-flow/min-cut. Be able to write duals for linear programs.
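
As a concrete taste of the dynamic-programming week, here is a minimal Python sketch of weighted interval scheduling, a standard example from the Kleinberg and Tardos textbook; the function name and input format are illustrative, not taken from the course materials.

    import bisect

    def max_weight_schedule(intervals):
        # Weighted interval scheduling via dynamic programming.
        # intervals: list of (start, finish, weight) tuples.
        # Returns the maximum total weight of a set of pairwise
        # non-overlapping intervals, in O(n log n) time.
        intervals = sorted(intervals, key=lambda iv: iv[1])  # by finish time
        finishes = [iv[1] for iv in intervals]
        n = len(intervals)
        opt = [0] * (n + 1)  # opt[j] = best weight using the first j intervals
        for j in range(1, n + 1):
            start, finish, weight = intervals[j - 1]
            # Latest interval (among the first j-1) finishing by `start`.
            p = bisect.bisect_right(finishes, start, 0, j - 1)
            # Either skip interval j, or take it plus the best compatible schedule.
            opt[j] = max(opt[j - 1], weight + opt[p])
        return opt[n]

    # The middle interval conflicts with both others; either choice scores 4.
    print(max_weight_schedule([(0, 3, 2), (2, 5, 4), (4, 7, 2)]))  # -> 4

The recurrence opt(j) = max(opt(j-1), w_j + opt(p(j))) is exactly the pattern the course analyzes: a small set of subproblems, each solved once and reused.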

TTIC 31020 - Introduction to Statistical Machine Learning

Shakhnarovich, Greg

A systematic introduction to machine learning, covering theoretical as well as practical aspects of the use of statistical methods. Topics include linear models for classification and regression, support vector machines, regularization and model selection, and an introduction to graphical models. Application examples are taken from areas such as information retrieval, natural language processing, and computer vision.

Prerequisites: Probability, Linear Algebra, Undergraduate Algorithms.

Lecture Plan

I will focus on supervised learning and only talk about unsupervised settings when necessary (e.g., mixture models and density estimation for generative methods for classification). So, no clustering. There will be twenty 1.5-hour lectures; the numbers in parentheses are the estimated number of lectures per topic.

  • intro to ML, motivation etc. (1)
  • refresher on probability and algebra (1)
  • statistical framework for learning; loss/risk; least squares regression (1)
  • noise models; error decomposition; bias/variance and overfitting (1)
  • estimation theory; ML/MAP, overfitting, bias/variance (1)
  • model complexity, sparsity (L1/L2) in regression; stepwise methods for L0 sparsity (1)
  • classification; Fisher's LDA, logistic regression and softmax (1)
  • ensemble methods, boosting (1)
  • generative models, Naive Bayes, multivariate Gaussians (1)
  • mixture models; EM (2)
  • SVM and kernels (2)
  • nonparametric methods; nearest neighbors, density estimation (1)
  • multilayer neural networks (1)
  • information theory and learning; information criteria, MDL and their connections to regularization (1)
  • experiment design and evaluation in ML (1)
  • advanced topics TBD (1)
  • wrap-up and review (1)

Assumes knowledge of basic linear algebra, probability and calculus.

Expected outcomes:

  • Understand the notion of fitting a model to data and concepts such as model complexity, overfitting and generalization, and bias-variance tradeoff in estimation.
  • Learn and be able to apply some of the fundamental learning methods, such as logistic regression, support vector machines, boosting, decision trees, neural networks.
  • Learn the basics of optimization techniques such as gradient descent and the general EM algorithm (a gradient-descent sketch follows this list).
  • Familiarity with multivariate Gaussians and mixtures of Gaussians.
  • Understand fundamental concepts in information theory (entropy, KL-divergence) and their relationship to machine learning.
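
To make the gradient-descent outcome concrete, here is a minimal NumPy sketch of logistic regression fit by batch gradient descent; the learning rate, iteration count and function name are illustrative choices, not from the course.

    import numpy as np

    def logistic_regression(X, y, lr=0.1, n_steps=1000):
        # Fit weights w and bias b by gradient descent on the
        # mean cross-entropy (logistic) loss.
        # X: (n, d) feature matrix; y: (n,) labels in {0, 1}.
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(n_steps):
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(y = 1 | x)
            w -= lr * (X.T @ (p - y)) / n           # gradient of the loss w.r.t. w
            b -= lr * np.mean(p - y)                # gradient of the loss w.r.t. b
        return w, b

    # Toy 1-D problem: the positive class sits at larger x,
    # so the learned weight should come out positive.
    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([0.0, 0.0, 1.0, 1.0])
    print(logistic_regression(X, y))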

TTIC 31030 - Mathematical Foundations

McAllester, David

This course covers the foundations of mathematics from a classical (nonconstructive) type-theoretic perspective, the general notion of a mathematical structure, the general notion of isomorphism, and the role of axiomatizations and constructions in mathematical definitions. The definition of the real numbers is used as a fundamental example. The course also covers the notion of definability in well-typed formalisms; a primary example is the non-definability of a linear bijection between a vector space and its dual. Ontologies (types) relevant to machine learning are emphasized, such as the type theory of PCA, CCA and Banach spaces (norms and dual norms).

Lecture Plan

There are two lectures per week, beginning with an introduction and course outline.

Part I: Logic with Abstraction Barriers.

Sequent inference rules and proofs. Types and variable declarations. Free and bound variables. Structures and Isomorphism.

Isomorphism as equivalence under an abstract interface.

Part II: Case Studies in Abstraction

The natural numbers, integers, rationals and reals. Vector spaces. A formal treatment of the non-existence of a canonical basis (coordinate system), canonical inner product, or canonical isomorphism with the dual space. Coordinate-free treatment of matrix algebra.

Equivalences between matrix (operator) types. The fact that least squares regression does not require an ambient inner product but regularization does. Gradient descent requires an ambient inner product; Newton's method does not. The covariance matrix of a probability distribution on a vector space. The fact that the multivariate central limit theorem does not require an ambient inner product. Canonical correlation analysis likewise does not require an ambient inner product, while PCA does.
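
The contrast between gradient descent and Newton's method can be stated in one line (a sketch of the point in standard notation, not the course's own formalism). The gradient at x is naturally a dual vector, an element of V*, so the update

    x_{t+1} = x_t - \eta\,\nabla f(x_t) \qquad \text{(needs an inner product to identify } V^* \text{ with } V\text{)}

only type-checks once an inner product is chosen, whereas the Newton update

    x_{t+1} = x_t - \left[\nabla^2 f(x_t)\right]^{-1}\,\nabla f(x_t) \qquad \text{(} [\nabla^2 f]^{-1} : V^* \to V \text{)}

composes the gradient with the inverse Hessian, a canonical map from V* to V, and is therefore independent of any choice of inner product or basis.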

Norms and Banach spaces. Dual norms. Measure spaces. Hilbert space. Differentiable manifolds. Information geometry. The Jeffreys prior. Natural gradient descent.

Expected outcomes:

  • Be able to write rigorous proofs and reason formally about various mathematical notions.
  • Understand basics of linear algebra, eigenvalues and eigenvectors from the viewpoint of operators.
  • Understand various algorithms such as SVMs, PCA, CCA and gradient descent from the operator viewpoint.

TTIC 31040 - Introduction to Computer Vision

Shakhnarovich, G.; Urtasun, R.

Introduction to techniques in computer vision, with emphasis on fundamental principles and efficient algorithms. Topics include: digital image formation and processing; detection and analysis of visual features; representation of two- and three-dimensional shape; recovery of 3D information from images and video; analysis of motion. Applications covered in depth include stereo, structure from motion, segmentation, instance and category level object detection and recognition.

Lecture Plan

Topics:

  • Image formation, camera models and geometry
  • Digital image formation, color and compression
  • Edge detection and relevant signal processing (see the sketch after this list)
  • Hough transform and its variants
  • Features: interest point operators, descriptors, and matching
  • Stereo and multi-view geometry and reconstruction
  • Camera calibration
  • Structure from motion
  • Shape-from-X (in particular shape from shading)
  • 2D shapes and contours: representation, analysis and recognition
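
As a minimal illustration of the edge-detection topic above, here is a Python sketch of Sobel gradient-magnitude edges using SciPy; the kernels are the standard Sobel pair, but the function name and test image are illustrative.

    import numpy as np
    from scipy.signal import convolve2d

    def sobel_edges(image):
        # Gradient-magnitude edge map from the two Sobel filters.
        # image: 2-D array of grayscale intensities.
        kx = np.array([[-1, 0, 1],
                       [-2, 0, 2],
                       [-1, 0, 1]], dtype=float)  # horizontal derivative
        ky = kx.T                                  # vertical derivative
        gx = convolve2d(image, kx, mode="same", boundary="symm")
        gy = convolve2d(image, ky, mode="same", boundary="symm")
        return np.hypot(gx, gy)                    # sqrt(gx^2 + gy^2)

    # A vertical step edge: the response peaks along the step.
    img = np.zeros((8, 8))
    img[:, 4:] = 1.0
    print(sobel_edges(img).round(1))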

TTIC 31050 - Introduction to Bioinformatics and Computational Biology

Xu, J.

Lecture Plan

This course will focus on the application of mathematical models and computer algorithms to the study of molecular biology. In particular, this course will cover the following topics.

  1. homology search (1 week)
  2. sequence alignment and motif discovery (1 week; see the alignment sketch after this list)
  3. next-gen sequencing and genome assembly (1 week)
  4. protein sequence/structure analysis including alignment, classification, structure and function prediction (2 weeks)
  5. RNA sequence/structure analysis including alignment, classification and prediction (1 week)
  6. gene expression analysis (1 week)
  7. biological network analysis (1 week)
  8. phylogeny (1 week)
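
As a sketch of the alignment topic in item 2, here is a minimal Python implementation of the Needleman-Wunsch global alignment recurrence; the scoring parameters are illustrative defaults, not values used in the course.

    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
        # Global alignment score of strings a and b under a linear
        # gap penalty; O(len(a) * len(b)) time and space.
        n, m = len(a), len(b)
        # dp[i][j] = best score aligning a[:i] with b[:j].
        dp = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            dp[i][0] = i * gap
        for j in range(1, m + 1):
            dp[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                sub = match if a[i - 1] == b[j - 1] else mismatch
                dp[i][j] = max(dp[i - 1][j - 1] + sub,  # (mis)match
                               dp[i - 1][j] + gap,      # gap in b
                               dp[i][j - 1] + gap)      # gap in a
        return dp[n][m]

    print(needleman_wunsch("GATTACA", "GCATGCT"))  # -> 0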

TTIC 31060 - Computability and Complexity Theory

Razborov, A.

Part one consists of models for defining computable functions: primitive recursive functions, (general) recursive functions, and Turing machines; the Church-Turing Thesis; unsolvable problems; diagonalization; and properties of computably enumerable sets. Part two deals with Kolmogorov (resource bounded) complexity: the quantity of information in individual objects. Part three covers functions computable with time and space bounds on the Turing machine: polynomial time computability, the classes P and NP, NP-complete problems, the polynomial time hierarchy, and PSPACE-complete problems.

Lecture Plan

The tentative plan for this course is to do the following three parts.

Part I. Models for defining computable functions: primitive recursive functions, (general) recursive functions, and Turing machines; the Church-Turing Thesis; unsolvable problems; diagonalization; and properties of computably enumerable sets.

Part II. Kolmogorov (resource bounded) complexity: the quantity of information in individual objects.

Part III. Functions computable with time and space bounds on the Turing machine: polynomial time computability, the classes P and NP, NP-complete problems, the polynomial time hierarchy, and PSPACE-complete problems.

Here is the web page of the previous version of this course: http://people.cs.uchicago.edu/~razborov/teaching/winter10.html. I will be happy to adjust the speed, style and content depending on the background of the students enrolled in the course.

TTIC 31070 - Convex Optimization

Orabona, Francesco

The course will cover techniques in unconstrained and constrained convex optimization and give a practical introduction to convex duality. The course will focus on (1) formulating and understanding convex optimization problems and studying their properties; (2) understanding and using the dual; and (3) presenting and understanding optimization approaches, including interior point methods and first order methods for non-smooth problems. Examples will be drawn mostly from data fitting, statistics and machine learning.

Prerequisites: Linear Algebra, Multidimensional Calculus, Undergraduate Algorithms
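
As a minimal example of the primal-dual pairing the course develops, consider a linear program in standard form and its dual (standard material, not specific to these lectures):

    \text{(P)}\quad \min_x \; c^\top x \ \ \text{s.t.}\ \ Ax \ge b,\; x \ge 0
    \text{(D)}\quad \max_y \; b^\top y \ \ \text{s.t.}\ \ A^\top y \le c,\; y \ge 0

Weak duality, b^\top y \le c^\top x for every feasible pair, means any dual-feasible y certifies a lower bound on the primal optimum; when either program is feasible and bounded, the two optima coincide (strong duality).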

TTIC 31080 - Approximation Algorithms

Chuzhoy, J.

This is a basic course on approximation algorithms, with the main focus on approximation algorithms for central combinatorial optimization problems. We will mostly focus on classical algorithmic results, but will also present some state-of-the-art results and challenges in the area of approximation. The course will cover major algorithmic techniques, including LP-rounding, the primal-dual schema, metric methods, SDP rounding, and so on. While the main focus of the course is on algorithms, we will also discuss lower bounds on approximation and connections between algorithm design and lower bound proofs.

Assumes knowledge of the material covered in the Algorithms course.

Expected outcomes:

  • Understand concepts such as constant factor approximation, polynomial time approximation schemes and hardness of approximation.
  • Understand applications of linear programs (LPs) to the design of approximation algorithms. Learn to analyze rounding algorithms for LPs and understand integrality gaps. Be able to apply LP duality (a rounding sketch follows this list).
  • Understand semi-definite programming and its applications to approximation.
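
As a small worked instance of LP rounding, here is a hedged Python sketch of the classic factor-2 approximation for (unweighted) vertex cover via the LP relaxation; the use of SciPy's linprog and the variable names are illustrative choices.

    import numpy as np
    from scipy.optimize import linprog

    def vertex_cover_2approx(n, edges):
        # Solve the LP relaxation  min sum_v x_v
        # s.t. x_u + x_v >= 1 for each edge, 0 <= x_v <= 1,
        # then round up every vertex with x_v >= 1/2.
        c = np.ones(n)
        # x_u + x_v >= 1  <=>  -x_u - x_v <= -1 (linprog uses <=).
        A_ub = np.zeros((len(edges), n))
        for i, (u, v) in enumerate(edges):
            A_ub[i, u] = A_ub[i, v] = -1.0
        b_ub = -np.ones(len(edges))
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)
        return [v for v in range(n) if res.x[v] >= 0.5]

    # Triangle plus a pendant edge; the printed set covers every edge.
    print(vertex_cover_2approx(4, [(0, 1), (1, 2), (0, 2), (2, 3)]))

Every edge has an endpoint with LP value at least 1/2, so the rounded set is a cover, and its size is at most twice the LP optimum, hence at most 2 · OPT; the integrality gap of this relaxation is correspondingly 2 - o(1).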

TTIC 31090 - Signals, Systems and Random Processes

Livescu, K.

Introduction to analysis of signals and linear time-invariant systems at a graduate level. Topics include: Continuous and discrete-time transforms (Fourier and others); linear filtering; sampling and aliasing; random processes and their interaction with linear systems; applications in areas such as speech and image processing and robotics.

Assumes familiarity with basic linear algebra, notions of eigenvalues and eigenvectors, and (multivariate) Gaussian distributions.

Expected outcomes:

  • Understand the properties and eigenfunctions of linear time-invariant (LTI) systems (see the identity after this list).
  • Understand Fourier Series (discrete- and continuous-time), Fourier transform, and convolutions.
  • Be able to analyze random processes, understand their stationarity properties and their interaction with LTI systems.
  • Learn about sampling and aliasing, signal estimation minimizing mean square error and parameter estimation for random processes. Understand the Karhunen-Loeve transform.
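
The eigenfunction property in the first bullet is the one-line identity (standard discrete-time notation, included for orientation):

    x[n] = e^{j\omega n} \;\longrightarrow\; y[n] = \sum_k h[k]\, e^{j\omega(n-k)} = H(e^{j\omega})\, e^{j\omega n},
    \qquad H(e^{j\omega}) = \sum_k h[k]\, e^{-j\omega k}.

That is, a complex exponential passes through an LTI system with impulse response h unchanged except for a scaling by the frequency response, which is why Fourier analysis diagonalizes LTI systems.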

TTIC 31100 - Computational and Metric Geometry

Chuzhoy, J.; Makarychev, Y.; Sidiropoulos, A.

The course covers fundamental concepts and algorithms in computational geometry. Topics covered include: convex hulls, polygon triangulations, range searching, segment intersection, Voronoi diagrams, Delaunay triangulations, motion planning, binary space partitions, geometric polynomial-time approximation schemes, locality-sensitive hashing, metric embeddings and applications.

The course textbooks are "Computational Geometry" by M. de Berg, O. Cheong, M. van Kreveld, and M. Overmars, and "Lectures on Discrete Geometry" by J. Matousek.
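
As a small taste of the algorithmic side, here is a minimal Python sketch of Andrew's monotone chain convex hull, a standard O(n log n) construction; the function name and test points are illustrative.

    def convex_hull(points):
        # Convex hull of 2-D points (tuples) in O(n log n);
        # returns vertices in counter-clockwise order.
        pts = sorted(set(points))
        if len(pts) <= 2:
            return pts

        def cross(o, a, b):
            # z-component of (a - o) x (b - o); > 0 means a left turn.
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

        lower, upper = [], []
        for p in pts:  # lower hull, left to right
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):  # upper hull, right to left
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]  # endpoints are shared

    # The interior point (1, 1) is discarded.
    print(convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))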

TTIC 31110 - Topics in Artificial Intelligence: Speech Technologies

Livescu, Karen

This course will introduce techniques used in speech technologies, mainly focusing on speech recognition. Speech recognition is one of the oldest and most complex structured sequence prediction tasks to receive significant research and commercial attention, and it therefore provides a good case study for many techniques used in other areas of artificial intelligence that involve sequence modeling. It is also a good example of the effectiveness of combining statistics and learning with domain knowledge. The course will include practical homework exercises using Matlab and speech toolkits.

Assumes a good background in basic probability.

Expected outcomes:

  • Understand and apply tools for analyzing speech time series, such as Fourier analysis and dynamic time warping (a DTW sketch follows this list).
  • Understand and apply hidden Markov models, Gaussian mixtures, and the EM algorithm for speech problems.
  • Understand and apply n-gram language models, smoothing techniques, and their application to speech recognition.
  • Understand generative and discriminative structured prediction approaches for speech problems.
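
To make the dynamic time warping bullet concrete, here is a minimal NumPy sketch of the classic DTW recurrence for 1-D sequences; the step pattern and local cost are common textbook choices, not necessarily those used in the homework.

    import numpy as np

    def dtw_distance(x, y):
        # dp[i][j] = cost of the best warping path aligning x[:i]
        # with y[:j]; local cost is the absolute difference.
        n, m = len(x), len(y)
        dp = np.full((n + 1, m + 1), np.inf)
        dp[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(x[i - 1] - y[j - 1])
                dp[i, j] = cost + min(dp[i - 1, j - 1],  # match
                                      dp[i - 1, j],      # deletion
                                      dp[i, j - 1])      # insertion
        return dp[n, m]

    # The second sequence is a time-stretched copy of the first,
    # so DTW aligns them at zero cost despite the length mismatch.
    print(dtw_distance([0, 1, 2, 1], [0, 1, 1, 2, 2, 1]))  # -> 0.0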

TTIC 31120 - Statistical and Computational Learning Theory

Srebro, N.

The course discusses classic results and recent advances in statistical learning theory (mostly under the agnostic PAC model), touches on computational learning theory, and also explores the relationship with stochastic optimization, the oracle model of optimization, and online regret analysis. Our emphasis will be on developing a rigorous quantitative understanding of machine learning and acquiring techniques for analyzing and proving performance guarantees for learning methods.

Prerequisites: mathematical maturity, as obtained, e.g., in a rigorous analysis course; discrete math (specifically combinatorics and asymptotic notation); probability theory; introduction to machine learning; algorithms; basic complexity theory (NP-hardness). Familiarity with convex optimization, computational complexity, and a background in statistics can be helpful but are not required.

Course Webpage

TTIC 31130 - Visual Recognition

Urtasun, R.; Shakhnarovich, G.

Developing autonomous systems that are able to assist us in everyday tasks is one of the grand challenges in modern computer science. While a variety of novel sensors have been developed in the past few years, in this class we will focus on the extraction of knowledge from visual information alone. One of the most remarkable examples of a successful recognition system is our own visual system, which is able to extract high-level information from very noisy and ambiguous data. Unfortunately, despite decades of research effort, machines still perform far below human level. In this class we will study why this is the case.

The goal of this graduate class is to understand the different visual recognition tasks as well as the techniques employed to solve them. A strong component of the course will be statistical learning, as it plays a key role in almost every modern visual recognition system. We will cover all stages of the recognition pipeline: low-level (e.g., features), mid-level (e.g., segmentation), and high-level reasoning (e.g., scene understanding).

Lecture Plan

Knowledge of machine learning and computer vision is not required, but is highly recommended. The theoretical aspects of visual recognition will be covered during the lectures. The class will have a strong practical component, as the students will build the different recognition components during the homework sessions. A list of topics includes:

  1. Classification: features, bag of words (BOW), similarity between images, learning features, as well as hashing schemes and retrieval.
  2. Detection: sliding window approaches, branch and bound, structured prediction, Hough voting and NN approaches, hierarchical models.
  3. Segmentation: classical approaches (e.g., watershed) as well as modern structured prediction approaches, including message passing and graph cuts for inference, and CRFs and structured SVMs for learning.
  4. Modern 3D geometry and 3D scene understanding: stereo, scene layout (e.g., 3D box for indoor scenes, road layout for outdoor scenes).
  5. Pose estimation: pictorial structures (2D) as well as 3D pose estimation, including particle filter-based approaches.

TTIC 31140 - Learning and Inference in Graphical Models

Urtasun, R.

A graphical model is a probabilistic model in which the conditional dependencies between the random variables are specified via a graph. Graphical models provide a flexible framework for modeling large collections of variables with complex interactions, as evidenced by their wide range of applications, including machine learning, computer vision, speech and computational biology. This course will provide a comprehensive survey of learning and inference methods in graphical models, including variational methods, primal-dual methods and sampling techniques.

TTIC 31150 - Mathematical Toolkit

Tulsiani, Madhur

The course is aimed at graduate students starting research in theoretical computer science, at advanced undergraduates, and at graduate students working in other areas but interested in learning some of the mathematical tools used in theory research. The goal of the course is to collect and present important mathematical tools used in different areas of theoretical computer science. We intend to cover the following topics and examples:

  1. The Probabilistic Method: Markov’s inequality, Chebyshev and Chernoff bounds. Thresholds for finding subgraphs in random graphs. Randomized routing in the hypercube. Balls and bins, load balancing. Power of two choices (if possible).
  2. Gaussian Probability: Introduction to Gaussians, measure concentration. Dimension reduction via Johnson-Lindenstrauss (a sketch follows this list).
  3. Graph Eigenvalues and Expanders: Two notions of expansion (combinatorial and spectral) and Cheeger’s inequality. Expansion in random graphs. Eigenvalues of the cycle and hypercube. Application of expanders to derandomization (if possible).
  4. Fourier Analysis on the Hypercube: Fourier coefficients as eigenvalues of a Cayley graph. Expanders from epsilon-biased sets. Sparsest cut on the hypercube and definition of influence. Linearity testing.
  5. LP Relaxations and Duality: Max-flow, min-cut example. Min-max theorem for zero-sum games. Upper bounds for codes using duality (if possible).
  6. Multiplicative updates method: The basic problem of “learning from experts” and regret bounds. Applications to developing algorithms for linear programs. Examples of algorithms for covering-packing problems.
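
As a sketch of the dimension-reduction item above, here is a minimal NumPy demonstration of the Johnson-Lindenstrauss phenomenon; the sizes n, d, k and the seed are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 50, 1000, 200   # n points in d dimensions, projected to k dimensions

    X = rng.normal(size=(n, d))
    # A random Gaussian map scaled by 1/sqrt(k) preserves all pairwise
    # distances up to a small multiplicative error, with high probability,
    # once k = Omega(log(n) / eps^2).
    G = rng.normal(size=(d, k)) / np.sqrt(k)
    Y = X @ G

    def pairwise_dists(Z):
        diffs = Z[:, None, :] - Z[None, :, :]
        return np.sqrt((diffs ** 2).sum(-1))

    i, j = np.triu_indices(n, 1)
    ratios = pairwise_dists(Y)[i, j] / pairwise_dists(X)[i, j]
    print(ratios.min(), ratios.max())  # both close to 1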

Assumes familiarity with proofs, basic linear algebra and eigenvalues/eigenvectors of matrices.

Expected outcomes:

  • Ability to analyze various random algorithms and apply concentration bounds for discrete and Gaussian random variables.
  • Understand the eigenvalues/eigenvectors of graphs and use them to analyze combinatorial properties like expansion.
  • Learn about Fourier analysis over finite spaces and apply it to studying pseudorandomness or learning of Boolean functions like decision trees.
  • Learn to apply LP duality as a proof technique.

TTIC 31160 - Topics in Bioinformatics

Xu, Jinbo

TTIC 31160 will focus on the application of mathematical models and computer algorithms to the study of structural biology, in particular protein, RNA and DNA molecule structures.

Here is a list of topics that I am going to cover in this course.

  1. Introduction to molecular structures (1 week)
  2. Bioinformatics for biological sequence analysis (1 week)
  3. Algorithms for molecule structure comparison and alignment (1 week)
  4. Algorithms for protein secondary structure prediction (0.5 week)
  5. Algorithms for protein tertiary structure prediction (1 week)
  6. Algorithms for RNA secondary structure prediction (1 week)
  7. Algorithms for RNA tertiary structure prediction (0.5 week)
  8. Algorithms for protein docking (1 week)
  9. Algorithms for protein-protein and protein-RNA interaction prediction (1 week)
  10. Algorithms for chromatin structure determination (1 week)

There will be both homework and final projects.

TTIC 31200 - Information and Coding Theory

Tulsiani, Madhur

The course will be focused on explaining and connecting the following topics and different views of information theory:

- Information theory as a theoretical basis for coding theory.

- Information theory as a mathematical framework for understanding and analyzing machine learning algorithms.

- Information theory as a proof technique for areas such as communication complexity.

- Basics of algorithmic coding theory. Unique and list-decoding algorithms for Reed-Solomon and Reed-Muller codes.

- (Time permitting) A basic idea of new notions such as local decoding, which are motivated by applications in theory of computing.
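
The quantities tying these views together are the entropy and the KL divergence (standard definitions, stated here for orientation):

    H(X) = -\sum_x p(x) \log p(x),
    \qquad
    D(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)}.

Entropy measures the average number of bits an optimal code needs per symbol (the basis of source coding), while the divergence measures the coding penalty for modeling a source p with a mismatched distribution q; both reappear as proof tools in information-theoretic arguments such as those used in communication complexity.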

TTIC 55000 - Independent Research

Advisor

Student engages in research under the guidance of a research advisor.

TTIC 56000 - Independent Reading

Advisor and/or other TTIC Faculty

Student engages in reading and research assignments under the guidance of faculty.

TTIC 57000 - Computer Science Internship

Advisor

In-depth involvement in areas of computer science in a research lab, university, or business setting. Internship activities and objectives must be related to the student's program objectives. Enrollment is required for F-1 CPT internships. Advisor's consent required.