Course Descriptions
E0 202 (3:1)  Automated Software Engineering with Machine Learning  Aditya Kanade 
Engineering high-quality software requires mastering advanced programming concepts and dealing with large and complex code. This course will introduce program analysis and machine/deep learning techniques to help developers in this quest. It will focus on concurrency and security analysis of smartphone and web applications. There is a growing realization in the software community that we can learn useful program properties from large codebases by treating code as data, and augmenting program analysis with machine learning. This course will introduce machine/deep learning techniques to build probabilistic models of source code, and discuss how they can be used to solve novel problems in software engineering. Programming Language Processing: tokenization, parsing and semantic analysis, graph representations, syntactic transformations. Smartphone and Web Programming: multithreading, asynchronous event-handling, permissions. Program Analysis: static and dynamic analysis of concurrent programs, model checking, information flow analysis for security, random testing. Probabilistic Models of Source Code: program embeddings, probabilistic grammars, statistical language models, structural models. Applications of Machine Learning (e.g., code completion, software testing and debugging).
References:
 Zigurd Mednieks, Laird Dornin, G. Blake Meike, and Masumi Nakamura. Programming Android. O’Reilly, 2011.
 David Herman. Effective JavaScript. Addison-Wesley, 2012.
 Steve Souders. Even Faster Web Sites. O’Reilly, 2009.
 Ian Goodfellow and Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press, 2016.
 Research papers.
Prerequisites
 None.
E0 203 (3:1)  Spectral Algorithms  Anand Louis / Ambedkar Dukkipati 
Spectral graph algorithms are very popular in theoretical computer science and machine learning, as they provide polynomial-time approximations to several hard computational problems. This course will cover some basic topics in spectral graph theory and algorithms with some applications to network analysis. This course emphasizes rigorous analysis of algorithms. Overview of Linear Algebra and Matrix Theory, Perron-Frobenius theory, Rayleigh ratios, Laplacians, spectral graph partitioning algorithm, Cheeger's inequality, Davis-Kahan theorem and perturbation analysis, community detection in networks and stochastic block models, SVD, mixture models, probabilistic spectral clustering, recursive spectral clustering, optimization via low-rank approximation.
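To make the spectral partitioning topic concrete, here is a minimal illustrative sketch (not part of the official syllabus) that builds a graph Laplacian with NumPy, extracts the Fiedler vector, and uses its signs as a two-way cut; the example graph is an arbitrary choice:

```python
import numpy as np

def spectral_cut(adj):
    """Two-way spectral partition: split vertices by the sign of the
    Fiedler vector (eigenvector of the second-smallest Laplacian eigenvalue)."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                   # unnormalized Laplacian L = D - A
    _, eigvecs = np.linalg.eigh(lap)  # eigh returns eigenvalues in ascending order
    fiedler = eigvecs[:, 1]
    return fiedler >= 0

# Two triangles {0,1,2} and {3,4,5} joined by the single bridge edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for u, v in edges:
    A[u, v] = A[v, u] = 1
side = spectral_cut(A)  # the sign cut separates the two triangles
```

Sign cuts of the Fiedler vector are the idea behind Cheeger's inequality: the sparsest such cut has conductance within a quadratic factor of the optimum.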
References:
 Fan Chung, Spectral Graph Theory, AMS 1997.
 Ravindran Kannan, Santosh Vempala, Spectral Algorithms, NOW Publishers, 2009.
 Andries E. Brouwer, Willem H. Haemers, Spectra of graphs, Springer 2011.
Prerequisites
 Any course in Linear Algebra or Matrix Theory.
E0 205 (3:1)  Mathematical Logic and Theorem Proving  Deepak D’Souza / Kamal Lodaya 
Motivation and objectives of the course:
This course is about mathematical logic with a focus on automated reasoning techniques that are useful in reasoning about programs. In the first part of the course we cover Propositional and First Order logic and some of the classical results like sound and complete proof systems, compactness, and decidability of the satisfiability/validity problems. In the second part we focus on decision procedures for various theories that arise while reasoning about assertions in programs.
E0 206 (3:1)  A Theorist’s Toolkit  Arindam Khan / Anand Louis 
Motivation and objectives of the course: This course is intended to equip a student interested in studying theoretical computer science with some of the fundamental tools commonly used in this area.
Tentative Syllabus : The topics covered are likely to be a subset of the following.
 Probabilistic methods: Linearity of expectations, alterations, second moment, Lovász local lemma, martingales, random graphs, Johnson-Lindenstrauss lemma, etc.
 Streaming algorithms: Hash functions, pairwise independence, heavy hitters in data streams, p-stable distributions, counting distinct elements, etc.
 Information theory: Shearer's lemma, entropy and compression, Pinsker's inequality, KL-divergence, applications in bandits and streaming algorithms, etc.
 Linear algebra based algorithms: Courant-Fischer theorem, SVD, Cheeger's inequality, expanders, etc.
 Discrete Fourier analysis: Boolean function and Fourier expansion, applications in property testing, etc.
 Multiplicative weights update: Hedge algorithm, applications in packing/covering LPs, online convex optimization, etc.
 Combinatorial optimization: Farkas lemma, matroids, total unimodularity, total dual integrality, etc.
References: Since this is a “toolkit” course, we will be teaching material from multiple books/sources. Some of them are the following.
 Michael Mitzenmacher and Eli Upfal. Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis. Cambridge University Press, 2017.
 Ryan O'Donnell. Analysis of Boolean Functions. Cambridge University Press, 2014.
 Alexander Schrijver. Combinatorial Optimization: Polyhedra and Efficiency, 2003.
 Alexander Schrijver. Theory of Linear and Integer Programming. John Wiley & Sons, 1998.
 Avrim Blum, John Hopcroft, and Ravindran Kannan. Foundations of Data Science.
 Sanjeev Arora, Elad Hazan, and Satyen Kale. "The multiplicative weights update method: a meta-algorithm and applications." Theory of Computing 8.1 (2012): 121-164.
 Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. John Wiley & Sons, 2012.
 S. Muthukrishnan. Data Streams: Algorithms and Applications. Now Publishers Inc, 2005.
 Various surveys and lecture notes.
Prerequisites
 Students should have completed or also be registered for E0 225 (Design and Analysis of Algorithms).
E0 207 (3:1)  Computational Topology: Theory and Applications  Vijay Natarajan / Gugan Thoppe 
Motivation and objectives of the course:
Topology is a branch of mathematics that studies how spaces are connected. Computational topology, specifically, concerns algorithms for computing the topological properties of a given space. This is a young and rapidly growing field, fueled by interesting data analysis applications in domains such as medicine, cosmology, and materials science. This growing interest is not surprising, since the present-day need to deal with large and high-dimensional data sets necessitates tools that help visualize objects in dimensions four and beyond. The fascination also stems from the fact that some topological tools are robust to random noise and do not depend extensively on the exact numerical coordinates used to represent the given data. In line with the above view, this course will focus on topics that lie at the intersection of topology, algorithms, probability, and data analysis.
Course Outline:
 Introduction to topological data analysis via recent applications.
 Mathematical preliminaries from group theory and linear algebra: group homomorphism and isomorphism, quotient group, classification of finitely generated Abelian groups, linear transformations, matrix representations.
 Complexes: Clique, Delaunay, Čech, Rips, random complexes, algorithms for constructing complexes
 Simplicial homology: chains, cycles, the boundary operator, the homology group, simplicial maps, Betti numbers, Euler-Poincaré characteristic, nerve theorem, matrix reduction algorithms
 Persistent Homology: filtrations, persistence diagrams, barcodes, spanning acycles, algorithms
 Morse functions: Morse Lemma, Morse-Smale complex, contour tree, Reeb graph, algorithms for construction and simplification, hierarchical representation
 Random topology: random complexes, Morse inequalities, limiting distribution of Betti numbers and persistence diagrams
 Software: TDA on R, Gudhi, Ripser, Javaplex, TTK
References:
 Edelsbrunner, Herbert, and John Harer. Computational topology: an introduction. American Mathematical Soc., 2010.
 Hatcher, Allen. Algebraic Topology, 2001.
 Current Literature
Prerequisites: Undergraduate Probability and Linear Algebra, E0 225: Design and Analysis of Algorithms
E0 208 (3:1)  Computational Geometry  Satish Govindarajan / Rahul Saladi 
Motivation and objective of the course: Computational Geometry is an area of computer science that studies the computational aspects of geometric problems, such as the running time of an algorithm, the space occupied by a data structure, and the design of polynomial-time approximation algorithms. This area has been well studied over the last four decades and has found applications in computer graphics, computer-aided design, geographic information systems, robotics, etc. This course will focus on the theoretical aspects of algorithms and data structures for various geometric problems.
Syllabus:
 Convex hulls: 2D and higher dimensional convex hulls, output sensitive algorithms, randomized incremental construction
 Intersection detection: Segment intersection, plane sweep technique
 Geometric data structures for range searching and point location: Segment and interval trees, range trees, kd-tree, persistence.
 Proximity problems: Voronoi diagram, Delaunay triangulation, well-separated pair decomposition, locality sensitive hashing.
 Arrangements: Arrangements of lines and hyperplanes, sweepline and incremental algorithms, lower envelopes, levels, zones.
 Geometric sampling: Random sampling and epsilon-nets, epsilon-approximation and discrepancy, coresets.
 Geometric optimization: Linear programming, geometric set cover, geometric independent set, clustering.
References:
 [Main textbook] M. de Berg, O. Cheong, M. van Kreveld, and M. Overmars, Computational Geometry: Algorithms and Applications. Springer-Verlag, 3rd ed., 2008.
 Lecture notes on Computational Geometry by David Mount: https://www.cs.umd.edu/class/spring2012/cmsc754/Lects/cmsc754-lects.pdf
 [Additional reference] Sariel Har-Peled. Geometric Approximation Algorithms (Mathematical Surveys and Monographs). American Mathematical Society (2011).
Prerequisites: E0 225: Design and Analysis of Algorithms
E0 209 (3:1)  Principles of Distributed Software  K V Raghavan 
Motivation and objectives of the course :
Development of distributed software applications is a very important activity, accelerated in recent years by the increasing predominance of cloud computing. The typical requirements from a modern-day distributed application are continuous availability even in the presence of software and hardware faults, ability to scale up or down on the fly based on input load (i.e., elasticity), ease of development and maintenance, and ease of continuous integration and deployment. This course will introduce the principles, programming models, and frameworks for distributed applications that help meet these requirements. It will also cover representative modern languages and technologies that are used to develop and deploy such applications.
Syllabus :
Main features of cloud applications, such as availability, fault tolerance, elasticity, and microservices. The challenges in building cloud applications. Introduction to different types of concurrent and distributed programming models. Introduction to actors, a message-driven programming model that enables large-scale concurrency, distribution, and fault tolerance. Programming actor-based systems using the Akka toolkit. Achieving availability and elasticity by distributing the application over multiple nodes using Akka Cluster. Using Kubernetes to deploy, scale, and monitor distributed applications. Consistency of data in distributed programs, eventual consistency, and implementing these features using Conflict-free Replicated Data Types in Akka. Practical application of these principles in a substantial course project.
References:
 Distributed and Cloud Computing, by K. Hwang, G. C. Fox, and J. J. Dongarra, Morgan Kaufmann Publishing, 2012
 Designing Distributed Systems, by Brendan Burns, O’Reilly, 2018
 Online documentation for Akka and Kubernetes
 Selected research papers
 https://azure.microsoft.com/en-us/resources/designing-distributed-systems/
 https://www.amazon.in/Designing-Distributed-Systems-Patterns-Paradigms-ebook/dp/B079YTM4FC
Prerequisites: Undergraduate level data structures and algorithms. Programming experience required, preferably in Java.
E0 210 (3:1)  Dynamic Program Analysis: Algorithms and Tools  K. Gopinath 
Motivation and objectives of the course:
The design and implementation of scalable, reliable and secure software systems is critical for many modern applications. Numerous program analyses are designed to aid the programmer in building such systems, and significant advances have been made in recent years. The objectives of the course include introducing the practical issues associated with programming for modern applications, the algorithms underlying these analyses, and the applicability of these approaches to large systems. There will be special emphasis on practical issues found in modern software. The course project will be geared towards building the programming skills required for implementing large software systems.
Syllabus:
The course will introduce the students to the following topics: bytecode instrumentation; profiling (BL profiling, profiling in the presence of loops, preferential path profiling, memory profiling); software bloat; lock-free data structures; memoization; MapReduce programming model; approximate computing; multithreading; fuzzing techniques; record and replay; memory models; data races (lockset algorithm, happens-before relation, causally-precedes relation); atomicity violations; deadlocks; linearizability; symbolic execution; concolic testing; directed program synthesis; constraint solving; deterministic/stable multithreaded systems; floating-point problems; security (SQL injection, cross-site scripting, return-oriented programming, obfuscation); malware detection.
References:
 Course material available from the webpage; research papers
Prerequisites
 Basic knowledge of programming in C/C++/Java.
E0 213 (3:0)  Quantum Safe Cryptography  Sanjit Chatterjee 
Syllabus:
Introduction to cryptography and communication security; Symmetric key and asymmetric key cryptosystems for data encryption and authentication; Impact of quantum computing on currently deployed cryptosystems; Some candidate post-quantum public key encryption and digital signature schemes: Error correcting codes, Lattices, Isogenies over elliptic curves, Multivariate polynomials over finite fields, Cryptographic hash functions; Protocols for quantum-safe secure communications.
References:
 Bernstein, D.J., Buchmann, J. and Dahmen, E. (Eds.): Post-Quantum Cryptography, Springer, 2010.
 Galbraith, S.D., Mathematics of Public Key Cryptography, Cambridge University Press, 2012.
 Menezes, A.J., van Oorschot, P.C. and Vanstone, S.A., Handbook of Applied Cryptography, CRC Press, 1996.
E0 214 (3:0)  Applied Linear Algebra and Optimization  Shirish Shevade 
References:
 Charu C Aggarwal, Linear Algebra and Optimization for Machine Learning, Springer, 2020
 Recent Literature
Prerequisites
 Undergraduate level Linear Algebra and Differential Calculus
E0 215 (3:1)  Algorithms under Uncertainty  Siddharth Barman and Arindam Khan 
Syllabus:
Online Algorithms: Ski-rental problem and paging, Yao's minimax lemma and lower bounds. Online bin packing and scheduling. Online matching (ranking, application of linear programming via dual-fitting and primal-dual schema). Online set cover.
Dynamic Algorithms: Dynamic matching and set cover.
Streaming Algorithms: Finding frequent elements, Misra-Gries algorithm. Finding moments of the data stream. Finding frequent items by sketching: Count-Min Sketch. AMS estimator. Communication complexity and connection with streaming algorithms.
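As a small illustration of the frequent-elements topic, here is a sketch of the Misra-Gries summary (an illustrative implementation with an arbitrary example stream, not course material):

```python
def misra_gries(stream, k):
    """One-pass Misra-Gries summary using at most k-1 counters.

    Any element with frequency > len(stream)/k is guaranteed to appear
    among the returned candidate keys (counts are underestimates)."""
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # No free counter: decrement all, dropping any that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

stream = ["a", "b", "a", "c", "a", "b", "a", "d", "a"]
candidates = misra_gries(stream, k=3)  # elements with frequency > 9/3 must survive
```

A second pass over the stream can then compute exact counts for the surviving candidates.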
Online Learning and Bandits: Multiplicative-weights algorithm, Follow the Leader, Regularization, Online convex optimization, Gradient Descent and Mirror Descent. Upper confidence bound algorithm and median elimination. Combining competitive ratio and regret.
Stochastic Settings: Markov Decision Processes, Index Policies, Gittins Index. Prophet Inequalities.
The course will have a significant project component.
References:
 Allan Borodin and Ran El-Yaniv. Online Computation and Competitive Analysis. Cambridge University Press, 2005.
 Blum, Hopcroft, and Kannan. Foundations of Data Science.
 Muthukrishnan. Data streams: Algorithms and applications. Now Publishers Inc, 2005.
 Lattimore and Szepesvari. Bandit Algorithms. Cambridge University Press, 2020.
 Hazan. Introduction to Online Convex Optimization. Now Publishers Inc., 2016.
 Roughgarden (Ed.). Beyond the WorstCase Analysis of Algorithms. Cambridge University Press, 2021.
 Various surveys and lecture notes.
Prerequisites: E0 225 (Design and Analysis of Algorithms) is a soft prerequisite. Alternatively, a foundational understanding of algorithms and sufficient mathematical maturity will suffice.
E0 219 (3:1)  Linear Algebra and Applications  M. Narasimha Murty 
Vector Spaces: Subspaces, Linear independence, Basis and dimension, Orthogonality. Matrices: Solutions of linear equations, Gaussian elimination, Determinants, Eigenvalues and Eigenvectors, Characteristic polynomial, Minimal polynomial, Positive definite matrices and Canonical forms. Singular Value Decomposition, Applications.
References:
 G. Strang, Linear Algebra and Applications, Thomson Brooks/Cole, 4th edition, 2006.
E0 220 (3:1)  Graph Theory  L. Sunil Chandran 
Vertex cover, matching, path cover, connectivity, hamiltonicity, edge colouring, vertex colouring, list colouring; Planarity, Perfect graphs; other special classes of graphs; Random graphs, Network flows, Introduction to Graph minor theory
References:
 Reinhard Diestel, “Graph Theory”, Springer (2010)
 Douglas B. West, “Introduction to Graph Theory”, Prentice Hall (2001)
 A. Bondy and U. S. R. Murty, “Graph Theory”, Springer (2008)
 B. Bollobás, “Modern Graph Theory”, Springer (1998)
E0 221 (3:1)  Discrete Structures  Arpita Patra / Bhavana Kanukurthi / Dilip P. Patil 
Basic Mathematical Notions: Logic, Sets, Relations, Functions, Proofs; Abstract Orders: Partial Orders, Lattices, Boolean Algebra, Well Orders; Counting & Combinatorics: Pigeonhole Principle, The Principle of Inclusion and Exclusion, Recurrence Relations, Permutations and Combinations, Binomial Coefficients and Identities; Number Theory: Mathematical Induction, Divisibility, The Greatest Common Divisor, The Euclidean Algorithm, Prime Numbers, Integers, Fundamental Theorem of Arithmetic, Modular Arithmetic, Arithmetic with a Prime Modulus, Arithmetic with an Arbitrary Modulus, The RSA Algorithm; Groups and Fields: Basics, Isomorphism theorems, Chinese Remainder Theorem, Finite Fields; Graph Theory: Graph Terminology and Special Types of Graphs, Bipartite Graphs and Matching, Representation of Graphs, Connectivity, Euler and Hamilton Paths and Cycles, Planar Graphs, Graph Coloring, Trees.
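The number-theory thread above (Euclidean algorithm, modular arithmetic, RSA) can be illustrated with a short sketch; the tiny primes below are purely illustrative and offer no security:

```python
def egcd(a, b):
    """Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a mod m, via the extended Euclidean algorithm."""
    g, x, _ = egcd(a, m)
    if g != 1:
        raise ValueError("no inverse: gcd(a, m) != 1")
    return x % m

# Toy RSA with tiny primes (illustration only).
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                    # public exponent, coprime to phi
d = modinv(e, phi)        # private exponent
m = 65
c = pow(m, e, n)          # encryption: modular exponentiation
assert pow(c, d, n) == m  # decryption recovers the message
```

The same `egcd` routine also drives the constructive form of the Chinese Remainder Theorem mentioned in the syllabus.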
References:
 Laszlo Lovasz, Jozsef Pelikan, Katalin L. Vesztergombi: Discrete Mathematics, Springer 2003.
 Graham, R. L., Knuth, D. E. and Patashnik, O.: Concrete Mathematics: A Foundation for Computer Science, Addison-Wesley Professional, 2nd edition, 1994.
 Herstein, I. N.: Topics in Algebra, 2nd ed., Wiley India, 1975.
E0 222 (3:1)  Automata Theory and Computability  Deepak D’Souza / Viraj Kumar 
Finite-state automata, including the Myhill-Nerode theorem, ultimate periodicity, and Büchi's logical characterization. Pushdown automata and context-free languages, including deterministic PDAs, Parikh's theorem, and the Chomsky-Schützenberger theorem. Turing machines and undecidability, including Rice's theorem and Gödel's incompleteness theorem.
References:
 Hopcroft J.E. and Ullman J.D.: Introduction to Automata, Languages and Computation. Addison Wesley, 1979.
 Dexter Kozen: Automata and Computability. Springer 1999.
 Wolfgang Thomas: Automata on infinite objects, in Handbook of Theoretical Computer Science, Volume B, Elsevier, 1990.
E0 223 (3:1)  Automated Verification  Aditya Kanade 
(1) Formal models of systems: labelled state transition diagrams for concurrent processes and protocols, timed and hybrid automata for embedded and real-time systems. (2) Specification logics: propositional and first-order logic; temporal logics (CTL, LTL, CTL*); fixpoint logic: mu-calculus. (3) Algorithmic analysis: model checking, data structures and algorithms for symbolic model checking, decision procedures for satisfiability and satisfiability modulo theories. (4) Tools: Student projects and assignments involving model checking and satisfiability tools, e.g., zChaff, SPIN, NuSMV, Uppaal.
References:
 Michael Huth, Mark Ryan: Logic in Computer Science: Modelling and Reasoning about Systems, Cambridge University Press, 2004.
 Edmund M. Clarke, Orna Grumberg, Doron Peled: Model Checking, MIT Press, 2001.
 Daniel Kroening, Ofer Strichman: Decision Procedures: An Algorithmic Point of View, Springer, 2008.
Prerequisites
 Basics of data structures, algorithms, and automata theory.
E0 224 (3:1)  Computational Complexity Theory  Chandan Saha 
Computational complexity theory is the fundamental subject of classifying computational problems based on their 'complexities'. In this context, the 'complexity' of a problem is a measure of the amount of resource (time/space/random bits, or queries) used by the best possible algorithm that solves the problem. The aim of this course is to give a basic introduction to this field. Starting with the basic definitions and properties, we intend to cover some of the classical results and proof techniques of complexity theory.
Introduction to basic complexity classes; notions of 'reductions' and 'completeness'; time hierarchy theorem and Ladner's theorem; space-bounded computation; polynomial time hierarchy; Boolean circuit complexity; complexity of randomized computation; interactive proofs; complexity of counting.
References:
 The book titled 'Computational Complexity: A Modern Approach' by Sanjeev Arora and Boaz Barak.
 Lecture notes of similar courses as and when required.
Prerequisites
 Basic familiarity with undergraduate level theory of computation and data structures & algorithms would be helpful.
 More importantly, some mathematical maturity with an inclination towards theoretical computer science.
E0 225 (3:1)  Design and Analysis of Algorithms  Anand Louis / Arindam Khan 
Greedy algorithms, divide and conquer strategies, dynamic programming, max-flow algorithms and applications, randomized algorithms, linear programming algorithms and applications, NP-hardness, approximation algorithms, streaming algorithms.
References:
 Kleinberg and Tardos, Algorithm Design, Addison Wesley, 2005.
 Cormen, Leiserson, Rivest, and Stein, Introduction to Algorithms, 3rd Edition, MIT Press, 2009.
E0 226 (3:1)  Linear Algebra and Probability  M. Narasimha Murty / Shalabh Bhatnagar 
Linear Algebra: System of Linear Equations, Vector Spaces, Linear Transformations, Matrices, Polynomials, Determinants, Elementary Canonical Forms, Inner Product Spaces, Orthogonality. Probability: Probability Spaces, Random Variables, Expectation and Moment generating functions, Inequalities, Some Special Distributions. Limits of sequence of random variables, Introduction to Statistics, Hypothesis testing.
References:
 Gilbert Strang, Linear Algebra and its Applications, Thomson Brooks/Cole, 4th edition, 2006.
 Hoffman and Kunze, Linear Algebra, Prentice Hall, 2nd edition, 1971.
 Kishor S. Trivedi, Probability and Statistics with Reliability, Queueing, and Computer Science Applications, Wiley, 2nd edition, 2008.
 Vijay K. Rohatgi, A. K. Md. Ehsanes Saleh, An Introduction to Probability and Statistics, Wiley, 2nd edition, 2000.
 Kai Lai Chung, Farid Aitsahlia, Elementary Probability Theory, Springer, 4th edition, 2006.
E0 227 (3:1)  Program Analysis and Verification  K.V. Raghavan / Deepak D Souza 
Dataflow analysis: Lattices, computing join-over-all-paths information as the least solution to a set of equations that model the program statements, termination of dataflow analysis, analysis of multi-procedure programs. Abstract interpretation of programs: Galois connections, correctness of dataflow analysis. Pointer analysis of imperative programs. Program dependence graphs, and program slicing. Assertional reasoning using Hoare logic. Type Systems: Monomorphic and polymorphic type systems, the Hindley-Milner type inference algorithm for functional programs.
References:
 Flemming Nielson, Hanne Riis Nielson, and Chris Hankin: Principles of Program Analysis, Springer, (Corrected 2nd printing, 452 pages, ISBN 3-540-65410-0), 2005.
 Benjamin Pierce: Types and Programming Languages, Prentice-Hall India, 2002.
 Research papers
Prerequisites
 Exposure to programming, and the basics of mathematical logic and discrete structures.
E0 228 (3:1)  Combinatorics  L. Sunil Chandran 
Basic combinatorial numbers, selection with repetition, pigeonhole principle, Inclusion-Exclusion Principle, Double counting; Recurrence Relations, Generating functions; Special combinatorial numbers: Stirling numbers of the first and second kind, Catalan numbers, Partition numbers; Introduction to Ramsey theory; Combinatorial designs, Latin squares; Introduction to Probabilistic methods, Introduction to Linear algebra methods.
References:
 R. P. Grimaldi, B. V. Ramana, “Discrete and Combinatorial Mathematics: An applied introduction”, Pearson Education (2007)
 Richard A Brualdi, “Introductory Combinatorics”, Pearson Education, Inc. (2004)
 Miklos Bona, “Introduction to Enumerative Combinatorics”, McGraw Hill (2007)
 Miklos Bona, “A walk through Combinatorics: An introduction to enumeration and graph theory”, World Scientific Publishing Co. Pvt. Ltd. (2006)
 J. H. van Lint, R. M. Wilson, “A Course in Combinatorics”, Cambridge University Press (1992, 2001)
 Stasys Jukna, “Extremal Combinatorics: With Applications in Computer Science”, Springer-Verlag (2001)
 Noga Alon, Joel H. Spencer, P. Erdős, “The Probabilistic Method”, Wiley-Interscience Publication
 Laszlo Babai and Peter Frankl, “Linear Algebra Methods in Combinatorics, with Applications to Geometry and Computer Science” (Unpublished Manuscript, 1992)
Prerequisites
 None. (A very basic familiarity with probability theory and linear algebra is preferred, but not a must. The required concepts will be introduced quickly in the course.)
E0 229 (3:1)  Foundations of Data Science  Ravi Kannan / Siddharth Barman 
High Dimensional Geometry, SVD and applications, Random Graphs, Markov Chains, Algorithms in Machine Learning, Clustering, Massive data and Sampling on the fly.
References:
 Foundations of Data Science by Hopcroft and Kannan
Prerequisites
 Basic Linear Algebra, Basic Probability.
E0 230 (3:1)  Computational Methods of Optimization  Chiranjib Bhattacharyya 
Need for unconstrained methods in solving constrained problems. Necessary conditions of unconstrained optimization, structure of methods, quadratic models. Methods of line search, Armijo-Goldstein and Wolfe conditions for partial line search. Global convergence theorem, steepest descent method. Quasi-Newton methods: DFP, BFGS, Broyden family. Conjugate-direction methods: Fletcher-Reeves, Polak-Ribière. Derivative-free methods: finite differencing. Restricted step methods. Methods for sums of squares and nonlinear equations. Linear and Quadratic Programming. Duality in optimization.
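A minimal sketch of steepest descent with an Armijo backtracking line search, for illustration only (the objective function and starting point are arbitrary choices, not from the course):

```python
def steepest_descent(f, grad, x0, alpha0=1.0, c=1e-4, tol=1e-8, max_iter=500):
    """Steepest descent with Armijo backtracking:
    accept step alpha once f(x - alpha*g) <= f(x) - c*alpha*||g||^2."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 < tol:          # gradient (nearly) zero: stationary point
            break
        alpha = alpha0
        while f([xi - alpha * gi for xi, gi in zip(x, g)]) > f(x) - c * alpha * gnorm2:
            alpha *= 0.5          # backtrack until the Armijo condition holds
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# Minimize the quadratic f(x, y) = (x - 3)^2 + 2*(y + 1)^2, minimum at (3, -1).
f = lambda v: (v[0] - 3) ** 2 + 2 * (v[1] + 1) ** 2
grad = lambda v: [2 * (v[0] - 3), 4 * (v[1] + 1)]
xmin = steepest_descent(f, grad, [0.0, 0.0])
```

The Armijo condition is the sufficient-decrease half of the Wolfe conditions; a full Wolfe line search adds a curvature condition on the new gradient.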
References:
 Fletcher R., Practical Methods of Optimization, John Wiley, 2000.
E0 231 (3:1)  Algorithmic Algebra  Ambedkar Dukkipati 
Basic algebraic notions: Integers, Euclidean algorithm, division algorithm, rings and polynomial rings, abstract orders and Dickson's lemma; Introduction to Gröbner bases: Term orders, multivariate division algorithm, Hilbert basis theorem, Gröbner bases and the Buchberger algorithm, computation of syzygies, basic algorithms in ideal theory, universal Gröbner bases; Algebraic applications: Hilbert's Nullstellensatz, implicitization, decomposition, radicals and zeros of ideals; Other applications: Toric ideals and integer programming, applications to graph theory, coding, cryptography, statistics.
References:
 Ideals, Varieties and Algorithms by D. Cox, J. Little, and D. O’Shea, Springer, 2nd ed., 1997.
 Algorithmic Algebra by Bhubaneswar Mishra, Springer, 1993.
E0 232 (3:1)  Probability and Statistics  Shalabh Bhatnagar 
Probability spaces and continuity of probability measures, random variables and expectation, moment inequalities, multivariate random variables, sequence of random variables and different modes of convergence, law of large numbers, Markov chains, statistical hypothesis testing, exponential models, introduction to large deviations.
References:
 An Introduction to Probability and Statistics by Vijay K. Rohatgi, A. K. Md. Ehsanes Saleh, Wiley, 2nd edition 2000.
 An Intermediate Course in Probability, by Allan Gut, Springer, 2008.
E0 233 (3:1)  Information Theory, Inference and Learning Algorithms  Ambedkar Dukkipati 
Data compression and Kraft's inequality, source coding theorem and Shannon entropy, Kullback-Leibler divergence and maximum entropy, I-projections and Sanov's theorem, Kullback-Csiszár iteration and iterative scaling algorithms, Fisher information and the Cramér-Rao inequality, quantization and introduction to rate distortion theory, generalized information measures and power-law distributions.
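The entropy and Kullback-Leibler divergence central to this syllabus can be computed directly from their definitions; a small illustrative sketch (the example distributions are arbitrary):

```python
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits; terms with p_i = 0 contribute 0."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

H = entropy([0.5, 0.25, 0.25])             # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
D = kl_divergence([0.5, 0.5], [0.9, 0.1])  # positive: KL divergence is nonnegative
```

By the source coding theorem, H lower-bounds the expected code length per symbol; D is zero exactly when the two distributions coincide.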
References:
 Elements of Information Theory, by T. M. Cover and J. A. Thomas, John Wiley & Sons, 2nd edition, 2006.
 Information Theory, Inference, and Learning Algorithms by D.J.C. MacKay, Cambridge University Press, 2003.
E0 234 (3:1)  Introduction to Randomized Algorithms  Siddharth Barman / Arindam Khan 
Motivation and objectives of the course :
The use of randomness in algorithm design is an extremely powerful paradigm. Often, it makes algorithm design (and analysis) easier; however, there are some problems for which we only know randomized algorithms and no deterministic algorithms. Furthermore, depending on the model of computation, randomization is often essential: it provably does better than all deterministic algorithms. In this course, we will introduce the basic techniques of designing randomized algorithms, although at times we will dive into state-of-the-art topics. Students are expected to have taken an introductory course in algorithm design and analysis; some familiarity with probability, although not essential, is desirable.
Syllabus:
Basics of probability, events, discrete random variables, expectation, moments, deviations, Chernoff and Hoeffding bounds, balls and bins, random graphs, probabilistic methods, Markov chains and random walks, Monte Carlo method, coupling, martingales, sample complexity, VC dimension, Rademacher complexity, balanced allocation, power of two choices, cuckoo hashing, information and entropy, online algorithms, random-order model, streaming algorithms.
References:
 “Probability and Computing” by Mitzenmacher and Upfal.
 “Randomized Algorithms” by Motwani and Raghavan.
Prerequisites: E0 225 (Design and Analysis of Algorithms) and E0 226 (Linear Algebra and Probability).
E0 235 (3:1)  Cryptography  Sanjit Chatterjee / Arpita Patra 
Elementary number theory, Finite fields, Arithmetic and algebraic algorithms, Secret key and public key cryptography, Pseudorandom bit generators, Block and stream ciphers, Hash functions and message digests, Public key encryption, Probabilistic encryption, Authentication, Digital signatures, Zero knowledge interactive protocols, Elliptic curve cryptosystems, Formal verification, Cryptanalysis, Hard problems.
References:
 D. Stinson, Cryptography: Theory and Practice.
 A. Menezes et al., Handbook of Applied Cryptography.
E0 236 (3:1)  Information Retrieval  M. Narasimha Murty 
Information retrieval using the Boolean model. The dictionary and postings lists. Tolerant retrieval. Index construction and compression. Vector space model and term weighting. Evaluation in information retrieval. Relevance feedback and query expansion. Probabilistic information retrieval. Language models for information retrieval. Text classification and clustering. Latent semantic indexing. Web search basics. Web crawling and indexes. Link analysis.
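The vector space model with term weighting can be illustrated in a few lines. A minimal tf-idf and cosine-similarity sketch over a hypothetical toy corpus (whitespace tokenization is a simplifying assumption):

```python
import math
from collections import Counter

docs = {
    "d1": "information retrieval ranks documents by relevance",
    "d2": "boolean retrieval matches documents exactly",
    "d3": "the cat sat on the mat",
}

def tfidf_vectors(docs):
    """Build tf-idf vectors: tf = raw term count, idf = log(N / df)."""
    N = len(docs)
    tokenized = {d: text.split() for d, text in docs.items()}
    df = Counter()
    for toks in tokenized.values():
        df.update(set(toks))          # document frequency per term
    return {
        d: {t: tf[t] * math.log(N / df[t]) for t in tf}
        for d, tf in ((d, Counter(toks)) for d, toks in tokenized.items())
    }

def cosine(u, v):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

vecs = tfidf_vectors(docs)
```

Documents sharing weighted terms (d1 and d2 share "retrieval" and "documents") score higher than unrelated ones, which is the core ranking idea behind the vector space model.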
References:
 C. D. Manning, P. Raghavan, and H. Schutze, Introduction to Information Retrieval, Cambridge University Press, 2008.
 Recent Literature
E0 237 (2:1)  Intelligent Agents  V. Susheela Devi 
Concepts of Agency and Intelligent Agents. Actions of Agents, Percepts to Actions. Structure of Intelligent Agents, Agent Environments, Communicating, Perceiving, and Acting. Concepts of Distributed AI, Cooperation, and Negotiation. Applications: Web-based Agents, Database Applications. Agent Programming.
References:
 S. Russell and P. Norvig, Artificial Intelligence – A Modern Approach, Prentice Hall, 1995.
 Recent Papers
E0 238 (3:1)  Intelligent Agents  V. Susheela Devi 
Introduction to Artificial Intelligence, Problem solving, knowledge and reasoning, Logic, Inference, Knowledge based systems, reasoning with uncertain information, Planning and making decisions, Learning, Distributed AI, Communication, Web based agents, Negotiating agents, Artificial Intelligence Applications and Programming.
References:
 S. Russell and P. Norvig, Artificial Intelligence – A Modern Approach, Prentice Hall, 1995.
 George F. Luger, Artificial Intelligence, Pearson Education, 2001.
 Nils J. Nilsson, Artificial Intelligence – A New Synthesis, Morgan Kaufmann Publishers, 2000
E0 239 (3:1)  Software Reliability Techniques  Aditya Kanade 
Models of concurrency: multithreading, synchronization, event-based dispatch. Model checking: model checking abstractions, context bounding, partial order reduction. Static analysis: type systems for proving deadlock and race freedom, rely-guarantee framework for compositional reasoning. Security vulnerabilities/attacks: attacks targeting spatial and temporal memory safety violations, injection and scripting attacks. Vulnerability detection: overflow, heap, and string analyses; information flow.
References:
 M. Ben-Ari, “Principles of concurrent and distributed programming”, Addison-Wesley, 2006
 “Handbook of model checking”, Springer, 2014
 Brian Chess and Jacob West, “Secure programming with static analysis”, Addison Wesley, 2007
 Additional research papers.
E0 240 (3:1)  Modeling and Simulation  Chiranjib Bhattacharyya / Matthew Jacob Thazhuthaveetil 
Introduction to Probability theory, Random variables, commonly used continuous and discrete distributions. Introduction to Stochastic Processes, Poisson process, Markov chains, steady state and transient analysis. Pseudo-random numbers: methods of generation and testing. Methods for generating continuous and discrete distributions. Methods for generating a Poisson process. Building blocks of simulation, data structures and algorithms. Introduction to probabilistic modelling, maximum likelihood. Variance reduction techniques: antithetic variates, control variates, common random numbers, importance sampling. Analysis of simulation results: confidence intervals, design of experiments. Markov Chain Monte Carlo techniques.
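As an illustration of the variance reduction techniques listed above, a small Monte Carlo sketch estimating E[e^U] for U uniform on (0,1) (true value e - 1), comparing plain sampling with antithetic variates; sample sizes and seed are illustrative:

```python
import math
import random
import statistics

def plain_mc(f, n, rng):
    """Plain Monte Carlo: n independent evaluations of f(U), U ~ Uniform(0,1)."""
    return [f(rng.random()) for _ in range(n)]

def antithetic_mc(f, n, rng):
    """Antithetic variates: pair each U with 1-U. For monotone f, f(U) and
    f(1-U) are negatively correlated, so each pair average has lower variance."""
    samples = []
    for _ in range(n // 2):
        u = rng.random()
        samples.append(0.5 * (f(u) + f(1.0 - u)))
    return samples

f = math.exp                        # estimate E[e^U] = e - 1
rng = random.Random(42)
plain = plain_mc(f, 10_000, rng)
anti = antithetic_mc(f, 10_000, rng)   # same budget: 10,000 evaluations of f

# Estimated variance of each estimator of the mean.
var_plain = statistics.variance(plain) / len(plain)
var_anti = statistics.variance(anti) / len(anti)
```

For the same evaluation budget, the antithetic estimator's variance is dramatically smaller here because e^U and e^(1-U) are strongly negatively correlated.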
References:
 Sheldon M. Ross: Introduction to Probability Models, 7th Edition, Academic Press, 2002
 Donald E. Knuth: The Art of Computer Programming – Volume 2: Seminumerical Algorithms, 2nd Edition, Addison-Wesley, Reading MA, USA, 2000
 Sheldon M. Ross: Simulation, 3rd Edition, Academic Press, 2002
 A. M. Law and W. D. Kelton: Simulation Modeling and Analysis, 3rd Edition, McGraw-Hill, New York, USA, 1998
 Raj Jain: The Art of Computer Systems Performance Analysis, John Wiley and Sons, New York, USA, 1991
E0 241 (3:1)  Computer Communication Networks  Shalabh Bhatnagar 
Introduction to computer networks; telephone networks, networking principles; switching – circuit switching, packet switching; scheduling – performance bounds, best effort disciplines, naming and addressing, protocol stack, SONET/SDH; ATM networks – AAL, virtual circuits, SSCOP; Internet – addressing, routing, end point control; Internet protocols – IP, TCP, UDP, ICMP, HTTP; performance analysis of networks – discrete and continuous time Markov chains, birth-death processes, time reversibility, queueing/delay models – M/M/1, M/M/m, M/M/m/m, M/G/1 queues, infinite server systems; open and closed queueing networks, Jackson’s theorem, Little’s law; traffic management – models, classes, scheduling; routing algorithms – Bellman-Ford and Dijkstra’s algorithms; multiple access, frequency and time division multiplexing; local area networks – Ethernet, token ring, FDDI, CSMA/CD, Aloha; control of networks – QoS, window and rate congestion control, open and closed loop flow control, large deviations of a queue and network, control of ATM networks.
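The M/M/1 queue in the syllabus has simple closed-form steady-state metrics; a small sketch that also checks Little's law L = lambda * W (the arrival and service rates are illustrative):

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue with Poisson arrivals at
    rate lam and exponential service at rate mu; requires lam < mu."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu                 # server utilization
    L = rho / (1.0 - rho)          # mean number in system
    W = 1.0 / (mu - lam)           # mean time in system
    Lq = rho ** 2 / (1.0 - rho)    # mean number waiting in queue
    Wq = rho / (mu - lam)          # mean waiting time in queue
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Illustrative rates: 8 arrivals/sec against a 10 jobs/sec server.
m = mm1_metrics(lam=8.0, mu=10.0)
```

At 80% utilization the mean number in system is rho/(1-rho) = 4 and the mean sojourn time is 1/(mu-lam) = 0.5, consistent with Little's law L = lambda * W.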
References:
 I. Mitrani, Modelling of Computer and Communication Systems, Cambridge, 1987.
 J. Walrand and P. Varaiya, High Performance Communication Networks, Harcourt Asia (Morgan Kaufmann), 2000.
 S. Keshav, An Engineering Approach to Computer Networking, Pearson Education, 1997.
 D. Bertsekas and R. Gallager, Data Networks, Prentice Hall of India, 1999.
 J. F. Kurose and K. W. Ross, Computer Networking: A Top-Down Approach Featuring the Internet, Pearson Education, 2001.
E0 242 (3:1)  Probabilistic Models for Learning  Chiranjib Bhattacharyya 
Review of Probability theory: Random variables, Expectation, Central Limit theorem. Latent variable models: mixture models, Hidden Markov models, EM algorithm. Graphical models: algorithms for inference, Markov Chain Monte Carlo methods, belief propagation, variational methods, factor analysis. Applications to text: Maxent formalism, statistical parsing, CKY algorithm, topic models.
References:
 Sheldon Ross, Introduction to Probability Theory
 C. Bishop, Pattern Recognition and Machine Learning
 J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference
 C. Manning and H. Schutze, Foundations of Statistical Natural Language Processing
E0 243 (3:1)  Computer Architecture  Arkaprava Basu 
Processor Architecture: Instruction-Level Parallelism, Superscalar and VLIW architecture; Multicore processors; Memory Subsystem: Multilevel caches, Caches in multicore processors, Memory controllers for multicore systems; Multiple processor systems: shared and distributed memory systems, memory consistency models, cache coherence, and Interconnection networks; Advanced topics in architecture.
References:
 Hennessy, J.L., and Patterson, D.A.: Computer Architecture: A Quantitative Approach, Morgan Kaufmann.
 Stone, H.S.: High-Performance Computer Architecture, Addison-Wesley.
 Current literature
E0 244 (3:1)  Computational Geometry and Topology  Sathish Govindarajan / Vijay Natarajan 
Voronoi diagram, Delaunay triangulation. Geometric data structures: interval tree, range tree, segment tree. Complexes: simplicial complex, Rips complex, alpha complex, homology, Betti numbers, persistent homology, Morse functions, Reeb graph. Approximation and fixed parameter algorithms for geometric problems: hitting set and set cover, epsilon nets, epsilon approximations, geometric intersection graphs, geometric discrepancy, clustering.
References:
 Computational Topology: An Introduction, Herbert Edelsbrunner and John L. Harer, American Mathematical Society, Indian Edition, 2010.
 Computational Geometry: Algorithms and Applications, Mark de Berg, Otfried Cheong, Marc van Kreveld, and Mark Overmars, Third Edition, Springer (SIE), 2011.
 Geometric Approximation Algorithms, Sariel HarPeled, American Mathematical Society, Indian Edition, 2013.
Prerequisites
 E0 225: Design and Analysis of Algorithms
E0 248 (3:1)  Theoretical Foundations of Cryptography  Bhavana Kanukurthi / Nishanth Chandran 
This course is a complexity-theoretic introduction to cryptography. Emphasis will be placed on exploring connections between various fundamental cryptographic primitives via reductions.
Some of the primitives we will cover are one-way functions, pseudorandom generators, pseudorandom functions, trapdoor permutations, encryption, digital signatures, hash functions, and commitments. We will also try to cover some special topics (private information retrieval, zero-knowledge proofs, oblivious transfer, etc.).
E0 249 (3:1)  Approximation Algorithms  Anand Louis / Arindam Khan 
Combinatorial algorithms: greedy algorithms, local search based algorithms; Linear programming based algorithms: randomized rounding, primal-dual schema based algorithms, iterated rounding; multicut, sparsest cut and metric embeddings; Semidefinite programming based algorithms; Hardness of approximation.
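The greedy paradigm listed above is classically illustrated by set cover, where repeatedly picking the set that covers the most uncovered elements gives an H_n = O(log n) approximation. A minimal sketch on a hypothetical instance:

```python
def greedy_set_cover(universe, sets):
    """Greedy set cover: repeatedly choose the set covering the most
    still-uncovered elements. Achieves an H_n = O(log n) approximation
    ratio relative to the optimal cover."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & s))
        if not (uncovered & best):
            raise ValueError("the given sets do not cover the universe")
        chosen.append(best)
        uncovered -= best
    return chosen

# Hypothetical instance: universe {1..10} and four candidate sets.
universe = range(1, 11)
sets = [set(range(1, 6)), set(range(4, 9)), set(range(7, 11)), {1, 10}]
cover = greedy_set_cover(universe, sets)
```

On this instance greedy picks {1..5}, then {7..10}, then {4..8} to catch element 6, using 3 sets; the O(log n) bound is tight in the worst case, which motivates the LP-based techniques later in the syllabus.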
References:
 “The Design of Approximation Algorithms” by David Williamson and David Shmoys.
 “Approximation Algorithms” by Vijay Vazirani.
Prerequisites
 E0225: Design and Analysis of Algorithms.
E0 250 (3:1)  Deep Learning  Ambedkar Dukkipati / Sargur N Srihari 
Machine learning approaches that are based on multiple layers of latent variables have come to be known as deep learning. Students will learn concepts, architectures and implementations underlying deep learning practice and deep learning research. Topics include feedforward networks, generalization capability, optimization, architectures (convolutional networks, recurrent neural networks), representation learning, generative models (variational and adversarial networks) and practical implementation issues.
Student Expectations:
1. Students will be required to implement two projects using deep learning frameworks such as TensorFlow/Keras or PyTorch. The first project will be compulsory. The second project will be on deep learning research (Chapters 13-20). Up to two students can participate in a project team.
2. Students will be expected to study the materials that will be discussed in class that day. The topics for each week are described at the end of the course syllabus. Each lecture will conclude with a practicum and a quiz on the topic of the day. Each quiz will have multiple choice questions concerning topics covered in class.
Syllabus:
The topics are selections from a more complete set at: http://www.cedar.buffalo.edu/~srihari/CSE676.
Overview, Applied Mathematics and Machine Learning Basics, Deep Feedforward Networks, Regularization, Optimization for Training Deep Models, Convolutional Networks, Sequence Modeling: Recurrent and Recursive Nets, Practical Methodology, Applications, Linear Factor Models, Autoencoders, Representation Learning, Structured Probabilistic Models for Deep Learning, Monte Carlo Methods, The Partition Function, Approximate Inference, Deep Generative Models.
References:
 Goodfellow, Bengio and Courville, Deep Learning, MIT Press, 2016.
Prerequisites
 Machine Learning
E0 251 (3:1)  Data Structures and Algorithms  Y.N. Srikant 
Abstract data types and data structures, Classes and objects, Complexity of algorithms: worst case, average case, and amortized complexity. Algorithm analysis. Algorithm design paradigms. Lists: stacks, queues, implementation, garbage collection. Dictionaries: Hash tables, Binary search trees, AVL trees, Red-Black trees, Splay trees, Skip lists, B-Trees. Priority queues. Graphs: Shortest path algorithms, minimal spanning tree algorithms, depth-first and breadth-first search. Sorting: Advanced sorting methods and their analysis, lower bound on complexity, order statistics.
References:
 A.V. Aho, J.E. Hopcroft, and J.D. Ullman, Data Structures and Algorithms, Addison Wesley, Reading Massachusetts, USA, 1983
 T.H. Cormen, C.E. Leiserson, and R.L. Rivest, Introduction to Algorithms, The MIT Press, Cambridge, Massachusetts, USA, 1990
 M.A. Weiss, Data Structures and Algorithm Analysis in C++, Benjamin/Cummings, Redwood City, California, USA, 1994.
E0 252 (3:1)  Programming Languages – Design and Implementation  Y.N. Srikant 
Features and implementation of imperative, object-oriented, concurrent, distributed, logic-programming, functional, aspect-oriented, scripting, business-oriented and web programming languages.
Example languages from each of these categories would be discussed along with their implementation details. Formal semantics would be used to enhance the understanding of the features and to assist in the design of correct implementations. However, there will be no deep discussion of the theory. This is neither a course on compiler design nor a course on the theory of programming languages. Emphasis would be on understanding the features and their implementation. Students will be required to carry out mini projects as a part of the course.
References:
 Robert Harper, Practical Foundations for Programming Languages, Cambridge University Press, 2012.
 John Mitchell, Concepts in Programming Languages, Cambridge University Press, 2002.
 John Reynolds, Theories of Programming Languages, Cambridge University Press, 2009.
 Selected papers
Prerequisites
 None. However, programming in C/C++/Java/shell/Perl and a course on compiler design at the BE/BTech level would be helpful. There will be no overlap with the compiler design course in the CSA department (E0 255).
E0 253 (3:1)  Operating Systems  Vinod Ganapathy / Arkaprava Basu 
User Level Specification of OS. Fundamental Concepts of Multiprogrammed OS, Basic Concepts and Techniques for Implementation of Multiprogrammed OS. Processes and the Kernel, Microkernel Architecture of OS. Multiprocessor, Multimedia, and Real-Time OS. POSIX Standards. Management and Control of Processes. Basic Concept of Threads, Types of Threads, Models of Thread Implementations. Traditional and Real-Time Signals. Clocks, Timers and Callouts. Thread Scheduling for Unix, Windows, and Real-Time OS, Real-Time Scheduling. Interprocess/Interthread Synchronization and Communication, Mutual Exclusion/Critical Section Problem, Semaphores, Monitors, Mailbox, Deadlocks. Concepts and Implementation of Virtual Memory (32-bit and 64-bit), Physical Memory Management. File Organization, File System Interface and Virtual File Systems, Implementation of File Systems. I/O Software: Interrupt Service Routines and Device Drivers. Protection and Security. Case Study of Unix, Windows, and Real-Time OS.
References:
 Andrew S. Tanenbaum: Modern Operating Systems, Second Edition, Pearson Education, Inc., 2001.
 Uresh Vahalia: UNIX Internals: The New Frontiers, Prentice-Hall, 1996.
 J. Mauro and R. McDougall: Solaris Internals: Core Kernel Architecture, Sun Microsystems Press, 2001.
 Daniel P. Bovet and Marco Cesati: Understanding the Linux kernel, 2nd Edition O’Reilly & Associates, Inc., 2003.
E0 254 (3:1)  Network and Distributed Systems Security  R.C. Hansdah 
Security Goals and Violations; Security Requirements; Security Services; Discrete Logs, Encryption/Decryption Functions, Hash Functions, MAC Functions; Requirements and Algorithmic Implementation of One-Way Functions; OS Security Violations and Techniques to Prevent Them; Access Control Models; Secure Programming Techniques; Authenticated Diffie-Hellman Key Establishment Protocols; Group Key Establishment Protocols; Block Ciphers and Stream Ciphers; Modes of Encryption; Digital Signatures; Authentication Protocols; Nonce and Timestamps; PKI and X.509 Authentication Service; BAN logic; Kerberos; Email Security; IP Security; Secure Socket Layer and Transport Layer Security; Secure Electronic Transactions; Intrusion Detection; Malicious Software Detection; Firewalls.
References:
 William Stallings: Cryptography and Network Security: Principles and Practices, Fourth Edition, Prentice Hall, 2006.
 Neil Daswani, Christoph Kern and Anita Kesavan: Foundations of Security: What Every Programmer Needs to Know, Published by Apress, 2007.
 Yang Xiao and Yi Pan: Security in Distributed and Networking Systems, World Scientific, 2007.
 Current Literature.
Prerequisites
 Knowledge of Java is desirable, but not necessary.
E0 255 (3:1)  Compiler Design  R. Govindarajan / Uday Kumar Reddy B. 
Control flow graphs and analysis; Dataflow analysis; Static single assignment (SSA); Compiler optimizations; Dependence analysis, Loop optimizations and transformations, Parallelization, Optimizations for cache locality, and Vectorization; Domainspecific languages, compilation, and optimization; Register allocation, Instruction scheduling; Run time environment and storage management; Impact of language design and architecture evolution on compilers.
References:
 Aho, A.V., Ravi Sethi and J.D. Ullman: Compilers – Principles, Techniques and Tools, Addison Wesley, 1988.
 S. Muchnick: Advanced Compiler Design and Implementation, Morgan Kaufmann, 1998
 Selected Papers.
E0 256 (3:1)  Theory and Practice of Computer Systems Security  Vinod Ganapathy 
This course will seek to equip students with the fundamental principles and practice of computer systems security. The course will cover the major techniques of offense and defense, thereby educating students to think both as attackers and defenders. By the end of the course, students will have been exposed to the state of the art, and will be equipped with the background to start conducting original research in computer systems security. Core concepts such as basic security goals, threat models, notion of TCB and security policies vs. mechanisms. Operating system primitives for protection, reference monitors, authentication, and authorization. Examples of classic security policies from the literature (e.g., Biba, BLP) and their realization on modern systems. Various forms of hijacking attacks, such as buffer overflows, return-oriented programming, and non-control data attacks, and examples of such attacks as used by exploits in the wild. Design and implementation of defenses such as control-flow integrity, ASLR, privilege separation, capabilities, information-flow control and virtual machine introspection. Attacks and defenses against the Web ecosystem, mobile devices and the cloud platform. Emerging role of modern hardware in improving systems security. Other assorted topics based on current research literature.
References:
 Security Engineering, 2nd Edition, Wiley, by Ross Anderson. http://www.cl.cam.ac.uk/~rja14/book.html (free online copy)
 Research papers from systems security conferences and journals.
Prerequisites
 None, but standard undergraduatelevel exposure to OS, computer architecture and compilers courses will be assumed.
E0 257 (3:1)  Software Architecture  Raghu Hudli / Y.N. Srikant 
Software process and the role of modeling and analysis, software architecture, and software design. Software Modeling and Analysis: analysis modeling and best practices, traditional best practice diagrams such as DFDs and ERDs, UML diagrams and UML analysis modeling, analysis case studies, analysis tools, analysis patterns. Software Architecture: architectural styles, architectural patterns, analysis of architectures, formal descriptions of software architectures, architectural description languages and tools, scalability and interoperability issues, web application architectures, case studies. Software Design: design best practices, design patterns, extreme programming, design case studies, component technology, object oriented frameworks, distributed objects, object request brokers, case studies.
References:
 Booch,G., Rumbaugh, J., Jacobson, I., The Unified Modeling Language User Guide, Addison Wesley, 1999.
 Gamma, E., Helm, R., Johnson, R., Vlissides, J., Design Patterns: Elements of Reusable Object Oriented Software, Addison-Wesley, 1995.
 Frank Buschmann et al. Pattern Oriented Software Architecture, Volume 1: A System of Patterns. John Wiley and Sons, 1996.
 Shaw, M., and Garlan, D., Software Architecture: Perspectives on an Emerging Discipline, PrenticeHall, 1996.
 Len Bass et al. Software Architecture in Practice. Addison Wesley, 1998.
E0 258 (3:1)  Foundations of Programming Languages  Raghu Hudli / Y. Narahari 
Survey of programming paradigms and computational models for program execution. Programming language examples, syntax description and language semantics. Functional programming, lambda calculus, higher-order functions, currying, recursion. Imperative programming and control structures, invariants, object models, messages, and method dispatch, inheritance, subtypes and subclasses, polymorphism, covariance, and contravariance. Formal aspects of Java. Concurrent programming models and constructs, programming in the multicore environment. Introduction to logic programming: quantifiers, first order logic, Horn clauses, unification and resolution.
References:
 Daniel Friedman, Mitchell Wand and Christopher Haynes, “Essentials of Programming Languages”, Prentice Hall of India, 2nd Edition, 2001.
 John Reynolds, “Theories of Programming Languages”, Cambridge Univ. Press, 1998.
 John Mitchell, “Concepts in Programming Languages”, Cambridge Univ. Press, 2003.
 Benjamin Pierce, “Types and Programming Languages”, MIT Press, 2002.
 Selected Chapters from J. van Leeuwen, Ed., “Handbook of Theoretical Computer Science”, Vol. B, Elsevier, MIT Press, 1994.
 Kim Bruce, “Foundations of Object Oriented Languages”, Prentice Hall of India, 2002.
 Martin Abadi and Luca Cardelli, “A Theory of Objects”, Springer, 1996.
 Current research papers and Internet resources
E0 259 (3:1)  Data Analytics  Ramesh Hariharan / Rajesh Sundaresan 
Data Analytics is assuming increasing importance in recent times. Several industries are now built around the use of data for decision making. Several research areas too, genomics and neuroscience being notable examples, are increasingly focused on large-scale data generation rather than small-scale experimentation to generate initial hypotheses. This brings about a need for data analytics. This course will develop modern statistical tools and modelling techniques through hands-on data analysis in a variety of application domains. The course will illustrate the principles of hands-on data analytics through several case studies (8-10 such studies). On each topic, we will introduce a scientific question and discuss why it should be addressed. Next, we will present the available data, how it was collected, etc. We will then discuss models, provide analyses, and finally touch upon how to address the scientific question using the analyses. Data sets from astronomy, genomics, visual neuroscience, sports, speech recognition, computational linguistics and social networks will be analysed in this course. Statistical tools and modeling techniques will be introduced as needed to analyse the data and eventually address these scientific questions. There will be a few guest lectures from industry also.
Prerequisites
 Random Processes (E2 202) or Probability and Statistics (E0 232) or equivalent.
E0 261 (3:1)  Database Management Systems  Jayant R. Haritsa 
Design of Database Kernels, Query Optimization, Query Processing, Data Access Methods, Transaction Management, Distributed Databases, Data Mining, Data Warehousing, Main-Memory Databases, Columnar Databases, NoSQL systems.
References:
 Database System Concepts, H. Korth, A. Silberschatz and S. Sudarshan, McGraw-Hill.
 Fundamentals of Database Systems, R. Elmasri and S. B. Navathe, Addison-Wesley.
 Database Management Systems, R. Ramakrishnan and J. Gehrke, McGraw-Hill.
 Readings in Database Systems, M. Stonebraker and J. Hellerstein, Morgan Kaufmann.
 Recent Conference and Journal papers.
Prerequisites
 Data Structures, C or C++, Undergraduate course in DBMS
E0 264 (3:1)  Distributed Computing Systems  R.C. Hansdah 
Fundamental Issues in Distributed Systems, Distributed System Models and Architectures; Classification of Failures in Distributed Systems, Basic Techniques for Handling Faults in Distributed Systems; Logical Clocks and Virtual Time; Physical Clocks and Clock Synchronization Algorithms; Security Issues in Clock Synchronization; Secure RPC and Group Communication; Group Membership Protocols and Security Issues in Group Membership Problems; Naming Service and Security Issues in Naming Service; Distributed Mutual Exclusion and Coordination Algorithms; Leader Election; Global State, Termination and Distributed Deadlock Detection Algorithms; Distributed Scheduling and Load Balancing; Distributed File Systems and Distributed Shared Memory; Secure Distributed File Systems; Distributed Commit and Recovery Protocols; Security Issues in Commit Protocols; Checkpointing and Recovery Protocols; Secure Checkpointing; FaultTolerant Systems, Tolerating Crash and Omission Failures; Implications of Security Issues in Distributed Consensus and Agreement Protocols; Replicated Data Management; SelfStabilizing Systems; Design Issues in Specialized Distributed Systems.
References:
 Randy Chow, and Theodore Johnson, “Distributed Operating Systems and Algorithms”, AddisonWesley, 1997.
 Sukumar Ghosh, “Distributed Systems: An Algorithmic Approach”, CRC Press, 2006.
 Kenneth P. Birman, “Reliable Distributed Systems: Technologies, Web Services, and Applications”, Springer New York, 2005.
 G. Coulouris, J. Dollimore, and T. Kindberg, “Distributed Systems: Concepts and Designs”, Fourth Edition, Pearson Education Ltd., 2005.
 Current Literature
Prerequisites
 NDSS(E0 254) or equivalent course
E0 267 (3:1)  Soft Computing  V. Susheela Devi 
To introduce the student to the soft computing paradigm as compared to hard computing, and to teach techniques of soft computing such as neural networks, fuzzy and rough systems, and evolutionary algorithms, which can be applied to classification, clustering, and other tasks. Definition of soft computing, soft computing vs. hard computing, advantages of soft computing, tools and techniques. Neural networks: fundamentals, backpropagation, associative memory, self-organizing feature maps, applications. Fuzzy and rough sets: concepts and applications. Evolutionary algorithms, swarm intelligence, particle swarm optimization, ant colony optimization, applications. Hybrid systems: integration of neural networks, fuzzy logic and genetic algorithms, integration of genetic algorithms and particle swarm optimization, applications.
References:
 Timothy J. Ross, Fuzzy Logic with Engineering Applications, McGraw-Hill, 1997.
 David E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Pearson Education, 2009.
 Melanie Mitchell, An Introduction to Genetic Algorithms, Prentice Hall, 1998.
 S. Haykin, Neural Networks, Pearson Education, 2nd Edition, 2001.
 Z. Pawlak, Rough Sets, Kluwer Academic Publishers, 1991.
Prerequisites
 Nil
E0 268 (3:1)  Practical Data Science  Shirish K. Shevade 
Introduction, Data Preparation, Linear Methods for Classification and Regression, Additive Models and Tree-based Methods, Support Vector Machines, Model Assessment and Selection, Unsupervised Learning, Link Analysis, Recommendation Systems, and Handling Large Datasets: MapReduce.
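The MapReduce topic can be previewed with the canonical word-count example, sketched here as plain Python functions standing in for a framework's map, shuffle, and reduce phases (the document set is hypothetical):

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    """Mapper: emit a (word, 1) pair for every word in the document."""
    return [(w, 1) for w in doc.lower().split()]

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, docs))))
```

In a real MapReduce system the mappers and reducers run in parallel across machines and the shuffle moves data over the network; the sequential sketch preserves only the programming model.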
References:
 James, Witten, Hastie and Tibshirani, An Introduction to Statistical Learning with Applications in R, Springer, 2015
 Rajaraman, Leskovec and Ullman, Mining of Massive Datasets, Cambridge University Press, 2014
 Hastie, Tibshirani and Friedman, The Elements of Statistical Learning, Springer, 2009
 Recent literature
Prerequisites
 Linear Algebra, Probability and Statistics, Some programming experience in any language.
E0 269 (3:1)  Probabilistic Graphical Models  Indrajit Bhattacharya 
Graph types: conditional independence; directed, undirected, and factor models; algorithms for conditional independence (e.g., Bayes-ball, d-separation, Markov properties on graphs, factorization, Hammersley-Clifford theorems). Static models: linear Gaussian models, mixture models, factor analysis, probabilistic decision trees, Markov Random Fields, Gibbs distributions, static conditional random fields (CRFs), multivariate Gaussians as graphical models, exponential family, generalized linear models, factored exponential families. Dynamic (temporal) models: Hidden Markov Models, Kalman filtering and linear-Gaussian HMMs, linear dynamic models, dynamic Bayesian networks (DBNs), label and observation bias in natural language processing, dynamic conditional random fields (CRFs), and general dynamic graphical models. Chordal graph theory: moralization; triangulated, decomposable, and intersection graphs; treewidth and pathwidth parameters of a graph. Exact probabilistic inference: the elimination family of algorithms, relation to dynamic programming, generality (such as Viterbi, MPE, the fast Fourier transform), junction trees, belief propagation, optimal triangulations, NP-hardness results. Approximate probabilistic inference: loopy belief propagation (BP), expectation propagation (EP), sampling (Markov chains, Metropolis-Hastings, Gibbs, convergence and implementational issues), particle filtering. Structure learning: Chow-Liu algorithm. Latent Dirichlet Allocation (1 wk): exchangeability, de Finetti theorem, inference using collapsed Gibbs sampling, Dirichlet compound multinomial model.
References:
 “Probabilistic Graphical Models: Principles and Techniques”, Daphne Koller and Nir Friedman
 Relevant papers
Prerequisites
 Introduction to Probability and Statistics and Consent of Instructor
E0 270 (3:1)  Machine Learning  Chiranjib Bhattacharyya / Ambedkar Dukkipati 
Introduction to Machine Learning, classification using Bayes rule, introduction to Bayes decision theory. Learning as optimization, linear regression. Probabilistic view: ML and MAP estimates. Logistic Regression: Gradient Descent, Stochastic Gradient methods. Hyperplane based classifiers, Perceptron, and Perceptron Convergence Theorem. Support vector machine and kernel methods.
Feedforward neural networks, backpropagation algorithm. Autoencoders, convolutional neural networks, and applications to computer vision. Sequence-to-sequence models, recurrent neural networks and LSTMs, and applications to NLP.
Undirected Graphical Models, Markov Random Fields, Introduction to MCMC and Gibbs Sampling. Restricted Boltzmann Machines. EM algorithm, Mixture models and K-means, Bayesian Networks, Introduction to HMMs. Generative models: GANs and VAEs.
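Logistic regression trained by stochastic gradient descent, as covered in the first unit, can be sketched in a few lines; the 2-D toy data set and hyperparameters below are assumptions for illustration:

```python
import math
import random

def sgd_logistic(data, lr=0.1, epochs=500, seed=0):
    """Stochastic gradient descent on the logistic (cross-entropy) loss
    for 2-D inputs. data: list of ((x1, x2), y) with labels y in {0, 1}."""
    rng = random.Random(seed)
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        rng.shuffle(data)                 # visit examples in random order
        for (x1, x2), y in data:
            z = w[0] * x1 + w[1] * x2 + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid prediction
            err = p - y                      # gradient of loss w.r.t. the logit
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify by the sign of the learned linear score."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Hypothetical linearly separable toy data.
data = [((0.0, 0.0), 0), ((0.2, 0.3), 0), ((1.0, 1.0), 1), ((0.9, 0.8), 1)]
w, b = sgd_logistic(list(data))
```

On separable data the SGD updates drive the decision boundary between the two classes; the same update rule, applied over mini-batches, is what deep learning frameworks scale up.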
References:
 Bishop. C M, Pattern Recognition and Machine Learning, Springer, 2006.
 Hastie T, Tibshirani R and Friedman J, The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer, 2nd Edition, 2009
 Haykin. S, Neural Networks and Learning Machines, Prentice Hall, 3rd Edition, 2009
 Goodfellow, Bengio, Courville, Deep Learning, MIT Press, 2017
Prerequisites
 Probability and Statistics (or equivalent course elsewhere). Some background in linear algebra and optimization will be helpful.
E0 271 (3:1)  Graphics and Visualization  Vijay Natarajan 
Graphics pipeline; transformations; viewing; lighting and shading; texture mapping; modeling; geometry processing – meshing, multiresolution methods, geometric data structures; visualization – visualization pipeline, data reconstruction, isosurfaces, volume rendering, flow visualization.
References:
 Edward S. Angel and Dave Shreiner. Interactive Computer Graphics: A Top-Down Approach with Shader-Based OpenGL. Pearson, 2011.
 Dave Shreiner, Graham Sellers, John Kessenich, and Bill Licea-Kane. OpenGL Programming Guide: The Official Guide to Learning OpenGL. Addison-Wesley, 8th Edition, 2013.
 Research papers from graphics and visualization conferences and journals.
Prerequisites
 Undergraduate courses in linear algebra, data structures, algorithms, and programming.
E0 272 (3:1)  Formal Methods in Software Engineering  K.V. Raghavan / Deepak D'Souza 
Domain modeling using first-order predicate logic and relational calculus: the tools Alloy and Event-B. Verification of finite-state systems and concurrent systems: Spin. Verifying code correctness using logical reasoning: VCC. Testing and bounded exploration of applications: Pex and AFL.
References:
 Logic in Computer Science: Modelling and Reasoning about Systems, by Michael Huth and Mark Ryan.
 Software Abstractions: Logic, Language, and Analysis, by Daniel Jackson.
 Model Checking, by Edmund M. Clarke, Orna Grumberg, and Doron Peled.
 Specifying Software: A Hands-On Introduction, by R. D. Tennent.
 Research papers.
Prerequisites
 Exposure to programming, and the basics of mathematical logic and discrete structures.
E0 290 (3:1)  Mathematical Foundations for Modern Computing  Ravi Kannan 
High-Dimensional Data: modeling of data in a high-dimensional space, high-dimensional Euclidean geometry, the random projection theorem. Random Graphs: the Erdős–Rényi model, properties of random graphs, the giant component and threshold phenomena, random satisfiability problems. Singular Value Decomposition and its applications. Random Walks: connections to electrical networks, convergence using eigenvalues and conductance measures. Foundations of Learning Theory: the perceptron algorithm, margins and support vector machines, the Vapnik-Chervonenkis theorem and applications. Clustering algorithms and criteria: provable results and algorithms for k-means and other criteria, recent work on finding local communities using random walks. Massive Data Computations, including streaming algorithms and fast approximations to matrices such as the CUR approximation. Games and economics-related models and algorithms: the notion of equilibrium, its existence and computation, markets and market equilibria.
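The random projection theorem can be illustrated in a few lines (illustrative dimensions and seed): projecting points through a random Gaussian map approximately preserves pairwise distances.

```python
import numpy as np

rng = np.random.default_rng(42)
n, d, k = 50, 1000, 300                        # 50 points, project 1000 -> 300 dims
X = rng.standard_normal((n, d))
R = rng.standard_normal((d, k)) / np.sqrt(k)   # random projection map
Y = X @ R

# A pairwise distance is preserved up to a small distortion
# (of order 1/sqrt(k) with high probability).
i, j = 0, 1
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
ratio = proj / orig
assert 0.8 < ratio < 1.2
```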
References:
 John Hopcroft and Ravi Kannan. Mathematics for Modern Computing. Forthcoming chapters will be made available.
 Ravi Kannan and Santosh Vempala. Spectral Algorithms, Now Publishers, 2009.
E0 291 (3:1)  Spatial Databases  Jayant R. Haritsa / N. L. Sarda 
1. Introduction and motivation: application domains of Geographical Information Systems (GIS), common GIS data types and analysis, OGC standards and the reference geometry model.
2. Models of spatial data: conceptual data models for spatial databases (e.g. pictogram-enhanced ERDs); logical data models for spatial databases: the raster model (map algebra), the vector model (OGIS/SQL1999).
3. Spatial query languages: the need for spatial operators and relations, SQL3 and ADTs, spatial operators, OGIS queries.
4. Spatial storage methods: clustering methods (space-filling curves), storage methods (R-trees, grid files), concurrency control (R-link trees), compression methods for raster and vector data, spatial indexing.
5. Spatio-temporal and moving object databases: spatio-bitemporal objects and operations, querying, event models, spatio-temporal indexes.
6. Processing spatial queries: spatial selection, joins, aggregates, nested queries, buffers.
7. Query processing and optimization: strategies for range queries, nearest neighbor queries, spatial joins (e.g. tree matching), cost models for new strategies, impact on rule-based optimization, selectivity estimation.
8. Spatial networks: road network databases and connectivity graphs, topology storage, querying spatial networks.
9. Mining spatial databases: clustering, spatial classification, co-location patterns, spatial outliers.
10. Geosensor databases.
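The space-filling-curve clustering idea mentioned above can be sketched with a Morton (Z-order) key (a hypothetical helper, not from the course materials):

```python
def z_order(x, y, bits=16):
    """Interleave the bits of x and y to get a Morton (Z-order) key.
    Nearby (x, y) cells tend to get nearby keys, which is why Z-order
    curves are used to cluster spatial data on disk."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)       # x bits go to even positions
        key |= ((y >> i) & 1) << (2 * i + 1)   # y bits go to odd positions
    return key

# The first four cells trace the characteristic "Z" shape.
assert z_order(0, 0) == 0
assert z_order(1, 0) == 1
assert z_order(0, 1) == 2
assert z_order(1, 1) == 3
```

Sorting records by this key and packing consecutive keys into pages is a simple way to make range queries touch few disk blocks.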
References:
 Spatial Databases: A Tour, S. Shekhar and S. Chawla, Prentice Hall, 2003.
 Moving Objects Databases, R. H. Güting and M. Schneider, Morgan Kaufmann, 2005.
 Spatial Databases: With Application to GIS, P. Rigaux, M. Scholl, A. Voisard, Morgan Kaufmann, 2002.
 Spatio-Temporal Databases, M. Koubarakis, T. Sellis et al. (eds.), Springer, 2003.
 Selected papers (see the bibliography available at: http://www.spatial.cs.umn.edu/Courses/Fall07/8715/paperList.html)
E0 292 (3:1)  Mobile Application Development  Nigamanth Sridhar / K. Gopinath 
This course will cover topics on developing applications on mobile smartphone platforms. Primary emphasis will be on Android development, while students will also learn the basics of developing applications for iOS. The course will include a project that will be defined and executed by student groups.
References:
 The Android and iOS developer documentation.
 Lecture notes handed out in class.
 Papers from recent conferences and journals.
Prerequisites
 Programming skills.
E0 293 (3:1)  Reinforcement Learning  B. Ravindran 
Reinforcement learning is a paradigm that aims to model the trial-and-error learning process that is needed in many problem situations where explicit instructive signals are not available. It has roots in operations research, behavioral psychology and AI. The goal of the course is to introduce the basic mathematical foundations of reinforcement learning, as well as highlight some of the recent directions of research. The Reinforcement Learning problem: evaluative feedback, non-associative learning, rewards and returns, Markov Decision Processes, value functions, optimality and approximation. Bandit problems: the explore-exploit dilemma, binary bandits, learning automata, exploration schemes. Dynamic programming: value iteration, policy iteration, asynchronous DP, generalized policy iteration. Monte Carlo methods: policy evaluation, rollouts, on-policy and off-policy learning, importance sampling. Temporal Difference learning: TD prediction, optimality of TD(0), SARSA, Q-learning, R-learning, games and afterstates. Eligibility traces: n-step TD prediction, TD(lambda), forward and backward views, Q(lambda), SARSA(lambda), replacing traces and accumulating traces. Function approximation: value prediction, gradient descent methods, linear function approximation, control algorithms, fitted iterative methods. Policy gradient methods: non-associative learning – the REINFORCE algorithm, exact gradient methods, estimating gradients, approximate policy gradient algorithms, actor-critic methods. Hierarchical RL: the MAXQ framework, Options framework, HAM framework, option discovery algorithms. Case studies: elevator dispatching, Samuel’s checkers player, TD-Gammon, Acrobot, helicopter piloting, computational neuroscience.
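Value iteration, listed under dynamic programming, can be sketched on a toy MDP (all states, actions and rewards below are made up for illustration):

```python
# Two states (0, 1), two actions, deterministic transitions.
P = {  # P[state][action] = next state
    0: {"stay": 0, "go": 1},
    1: {"stay": 1, "go": 0},
}
R = {  # R[state][action] = immediate reward (illustrative numbers)
    0: {"stay": 0.0, "go": 1.0},
    1: {"stay": 2.0, "go": 0.0},
}
gamma = 0.9
V = {0: 0.0, 1: 0.0}
for _ in range(200):  # repeatedly apply the Bellman optimality operator
    V = {s: max(R[s][a] + gamma * V[P[s][a]] for a in P[s]) for s in P}

# Optimal policy: "go" from 0 to 1, then "stay" forever.
# Fixed point: V(1) = 2 / (1 - 0.9) = 20, V(0) = 1 + 0.9 * 20 = 19.
assert abs(V[1] - 20.0) < 1e-3
assert abs(V[0] - 19.0) < 1e-3
```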
References:
 R. S. Sutton and A. G. Barto. Reinforcement Learning – An Introduction. MIT Press. 1998.
 D. P. Bertsekas and J. N. Tsitsiklis. Neuro-Dynamic Programming. Athena Scientific. 1996.
 K. S. Narendra and M. A. L. Thathachar. Learning Automata – An Introduction. Prentice-Hall, USA. 1989.
 A. G. Barto and S. Mahadevan, Recent Advances in Hierarchical Reinforcement Learning, Discrete Event Dynamic Systems, Volume 13, Special Issue on Reinforcement Learning, pp. 41-77. 2003.
 R. J. Williams, Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning, Machine Learning, 8:229-256. 1992.
 J. Baxter and P. L. Bartlett, Infinite-Horizon Gradient-Based Policy Search, Journal of Artificial Intelligence Research, 15:319-350. 2001.
 V. R. Konda and J. N. Tsitsiklis, “Actor-Critic Algorithms”, SIAM Journal on Control and Optimization, Vol. 42, No. 4, pp. 1143-1166. 2003.
Prerequisites
 Basic probability theory and linear algebra, Familiarity with regression and nonlinear optimization.
E0 294 (3:1)  Systems for Machine Learning  R. Govindarajan / Arkaprava Basu / Sumit Kumar Mandal / Murali Annavaram 
Introduction to ML; classification of ML kernels. Training and inferencing of ML kernels on GPUs/TPUs. Workload characterization. ML tools and runtimes. ML accelerators. In-memory computing for ML. Input pipelines of ML workloads. Multi-GPU scheduling of DNN training; security and privacy in ML.
Textbooks:
 REQUIRED READING: Sze, Chen, Yang and Emer, “Efficient Processing of Deep Neural Networks,” Morgan & Claypool Press, 2021. ISBN: 9781681738321.
 REQUIRED READING: http://eyeriss.mit.edu/tutorial.html
 SUGGESTED READING: Dubois, Annavaram and Stenström, “Parallel Computer Organization and Design,” Cambridge University Press, 2012. ISBN: 9780521886758. Background material and some of the planned chip-multiprocessor discussion are in this book; some problems and reading assignments will also be drawn from it.
 Brandon Reagen, Robert Adolf, Paul Whatmough, Gu-Yeon Wei and David Brooks, “Deep Learning for Computer Architects,” https://www.morganclaypool.com/doi/pdfplus/10.2200/S00783ED1V01Y201706CAC041
Prerequisite:
Students enrolling in this course should have obtained at least a B+ grade in E0 243 (Computer Architecture), and should preferably have taken E0 253 as well.
E0 301 (3:1)  Virtual Reality and its applications  Swami Manohar / Vijay Natarajan 
Objectives: To provide a graduate-level understanding of the concepts, hardware and software systems, and applications of virtual reality. Course methodology: This course will be driven by a collaborative, problem-solving based approach. Students will create virtual environments by learning to use hardware and software tools, and in the process of creating such environments, grasp the underlying theory and concepts. Course outline: Exploration of VR toolkits; applications of virtual reality: gaming, scientific visualisation, education and healthcare. Use of recent consumer-grade VR equipment (Google Glass, Razer Hydra, NVIDIA SoundStorm, etc.); programming with open-source VR toolkits; conceptual foundations of VR: 3D interactions, multisensory interaction, immersion, head and body position tracking, perceptual issues; modelling of virtual worlds; 3D graphics, stereo and real-time rendering; physics-based rendering. Current challenges in virtual reality.
References:
 Conference and journal papers, online resources for open source software and VR hardware.
Prerequisites
 Competence in at least one programming language. A course in computer graphics is a plus, but not required. This is a project-intensive course.
E0 302 (3:1)  Topics in Software Engineering  Aditya Kanade / Shirish K. Shevade 
Course objective: Study and design of machine learning techniques to improve software engineering.
Motivation: Machine learning has become an effective technique for making sense of large datasets to glean actionable insights. Large software repositories such as open-source Git repositories, smartphone app stores and student submissions in MOOCs contain a wealth of information. The goal of this course is to study and design state-of-the-art machine learning techniques to improve software engineering using the large amount of code available.
Syllabus: Machine learning models for program analysis, automated program repair, program synthesis, mining software repositories, representation and deep learning for software engineering, programming language processing.
References:
 Recent research papers
Prerequisites
 Background in programming
 Data mining or machine learning course in CSA.
E0 303 (3:1)  Resource Proportional Software Design for Emerging Systems  K. Gopinath / Suparna Bhattacharya / Doug Voigt 
Motivation and objectives of the course:
Software systems have become complex and their correctness and performance are not clearly understood or known. We identify key reasons why this could happen and provide a resource proportional model as one way to understand the performance behaviour of complex software systems.
Syllabus:
Software Bloat, Lost Throughput and Wasted Joules: An Opportunity for Green Computing: why hardware advancements aren’t enough; why a plea for lean software isn’t enough. A Sustainable Principle for Emerging Systems: Resource Proportional Software Design. The Problem of Software Bloat: causes and consequences, forms of software runtime bloat, progress in bloat characterization, measurement and mitigation. How Bloat in Software Impacts System Power Performance: analyzing the interplay of bloat, energy proportionality and system bottlenecks. Design Principles to Reduce Propensity for Bloat: a systems perspective on the origin of bloat; defining resource proportionality with respect to feature utilization to predict bloat propensity; strategies for bloat mitigation; what component and tool developers can do; refactoring existing software for improved resource proportionality; implications for non-functional requirements. Resource Proportional Programming for Persistent Memory Applications and Memory-Speed Fabrics. Applying Resource Proportional Design Principles to a Deeply Layered Stack. Data-Centric Resource Proportional Systems Design. Adapting the Systems Software Stack to a Changing Paradigm of Uniform Logical Access to a Radically Non-Uniform System. Bridging the gap from what we know today to open challenges and research topics.
References:
 Current research papers.
Prerequisites
 2 or more 200level courses in software systems (OS, databases, compilers, graphics)
E0 304 (3:1)  Computational Cognitive Neuroscience  Sridharan Devarajan 
This reading course is focused on recent advances in computational frameworks in cognitive neuroscience. We will review the state-of-the-art in data analysis techniques that permit extracting meaningful information from noisy, high-dimensional brain data (e.g. machine learning and dimensionality reduction), as well as theoretical and computational models of brain function. The course will be organized into four reading modules on Machine learning and classification, Dimensionality reduction, Neural computation and Theory, and Deep convolutional neural networks, discussing recent applications in computational neuroscience. The project will require analyzing large-scale brain datasets, for example, decoding cognitive states from brain imaging data.
Prerequisites
 Familiarity with machine learning, dimensionality reduction, and linear algebra at the advanced undergraduate/early graduate level. Knowledge of coding (e.g. C/Matlab/Python) is essential. Some background in neuroscience is preferred, but not essential (background readings will be provided).
E0 305 (3:1)  Blockchain and its Applications  Arpita Patra 
Motivation and objectives of the course:
Blockchain and its applications in cryptography, including cryptocurrencies, are emerging technologies. This course will cover blockchains and their applications to cryptocurrencies such as Bitcoin, distributed consensus and multiparty computation (MPC), smart contracts and beyond.
Syllabus:
(a) Introduction to blockchain and its cryptographic building blocks; (b) blockchain analysis; (c) introduction to cryptocurrencies, Bitcoin and its alternative cryptocurrencies; (d) applications of blockchains beyond cryptocurrencies (such as in consensus, multiparty computation (MPC), smart contracts); (e) alternatives to blockchains.
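A minimal sketch of the core data structure (illustrative only, and far from a real blockchain: no proof-of-work, signatures or consensus):

```python
import hashlib
import json

def make_block(prev_hash, data):
    """A toy block: each block commits to its predecessor's hash."""
    block = {"prev": prev_hash, "data": data}
    block["hash"] = hashlib.sha256(
        json.dumps({"prev": prev_hash, "data": data}, sort_keys=True).encode()
    ).hexdigest()
    return block

def valid(chain):
    """The chain is valid iff every back-pointer matches."""
    return all(cur["prev"] == prev["hash"]
               for prev, cur in zip(chain, chain[1:]))

genesis = make_block("0" * 64, "genesis")
b1 = make_block(genesis["hash"], "pay Alice 5")
b2 = make_block(b1["hash"], "pay Bob 3")
assert valid([genesis, b1, b2])

# Tampering with b1 changes its hash...
b1["data"] = "pay Alice 500"
b1["hash"] = make_block(b1["prev"], b1["data"])["hash"]
assert not valid([genesis, b1, b2])   # ...so b2's back-pointer breaks
```

This is the sense in which the chain makes history tamper-evident: rewriting any block forces rewriting all of its successors.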
References:
 Bitcoin and Cryptocurrency Technologies: A Comprehensive Introduction by Arvind Narayanan, Joseph Bonneau, Edward W. Felten, Andrew Miller, Steven Goldfeder and Jeremy Clark. Princeton University Press, 2016.
 Mastering Bitcoin: Unlocking Digital Cryptocurrencies by Andreas Antonopoulos. O’Reilly Media, Inc, 2013.
 Recent research papers and reports.
Prerequisites
 Mathematical maturity will be assumed.
E0 306 (3:1)  Deep learning: theory and practice  Anand Louis / Amit Deshpande (MSR) / Navin Goyal (MSR) 
Motivation and objectives of the course:
The area of deep learning has been making rapid empirical advances; however, this success is largely guided by intuition and trial and error, and remains more of an art than a science. We lack theory that applies “end-to-end.” While the traditional theory of machine learning leaves much to be desired, current research to remedy this is very active. Besides being of interest in its own right, progress on theory has the potential to further improve current deep learning methods. This course will bring students up to date with the current fast-moving frontier. Along with theory topics, the empirical phenomena that the theory seeks to explain will be covered in detail. The course will involve both theory and programming assignments.
Syllabus:
Tentative list of topics (subject to change depending on class interests, new developments in the field etc.): Quick introduction/reminder of deep learning and the theory of machine learning and optimization; Expressive power of neural nets; Landscape of deep learning optimization; Generalization in deep learning; Architectures (e.g. convolutional networks), network compression; Adversarial examples; Formal verification and neural networks; Visualization and interpretation; Deep generative models; Recurrent neural networks; Deep reinforcement learning
References:
 Deep Learning by Ian Goodfellow, Yoshua Bengio, Aaron Courville
 Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David
 Relevant recent literature
Prerequisites
 Linear Algebra, Probability. Courses in ML, DL, Optimization and some prior familiarity with Python and deep learning frameworks such as PyTorch/TensorFlow will be useful, though not an absolute requirement.
E0 307 (3:1)  Program Synthesis meets Machine Learning  Chiranjib Bhattacharyya / Deepak D'Souza / Sriram Rajamani (MSR India) 
This course will have two parts:
Part 1: In this part, we will cover the theory and fundamentals of program synthesis, including recent formulations that restrict synthesis using templates and reformulate synthesis as a search problem. We will also cover black-box formulations of synthesis, starting with the classic Angluin’s algorithm [5] and moving to its modern variants [6]. We will teach this part in a structured manner through planned lectures.
Part 2: In this part, we will read and discuss recent papers exploring the combination of machine learning and program synthesis. Specific topics include:
– Using ML to Rank Programs and Prune Search Space for Program Synthesis [7, 8]
– Combining ML and synthesis [9]
– Neural program induction [10, 11]
– Automatic differentiation [12, 13]
Motivation and objectives of the course:
Program synthesis has its roots in formal methods and programming languages. The goal of program synthesis is to automatically generate a program (from a space of possible programs) which satisfies a specification written in logic. The problem has its roots in a paper by Church in 1957, and the initial breakthroughs were made by Büchi and Landweber (1969) and M. O. Rabin (1972), who showed that the synthesis problem is decidable for specifications written in certain logics. However, the complexity of the algorithms was too high (non-elementary to EXPTIME) to be useful in practice.
Recent formulations have made synthesis more practical. In his PhD thesis, Solar-Lezama formulated synthesis as “sketching” [2], a process where part of the program is given by the user as a template and the synthesizer merely fills in “holes” in the sketch using search. Another recent formulation, due to Sumit Gulwani, uses input/output examples (rather than formulas) as specifications [3], and uses clever search algorithms to generate appropriate programs. Sparked by these two works, there has been a resurgence of work in program synthesis in the past decade. There is an annual SyGuS competition [4] where practical tools compete every year. Recently, an interesting interplay has been developing between program synthesis and machine learning. Machine learning uses continuous optimization methods to learn models that minimize a specified loss function, whereas program synthesis uses discrete combinatorial search to learn programs that satisfy a specification. While program synthesis produces interpretable programs, which can be formally verified, machine learning deals with noise in the inputs more gracefully. There is a rich body of recent work in combining machine learning and program synthesis to get the benefits of both approaches.
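The programming-by-example formulation can be illustrated with a tiny enumerative synthesizer over a made-up string DSL (a sketch in the spirit of the above, not any actual system):

```python
from itertools import product

# Hypothetical four-operator string DSL.
OPS = {
    "upper":   str.upper,
    "lower":   str.lower,
    "reverse": lambda s: s[::-1],
    "first3":  lambda s: s[:3],
}

def run(prog, s):
    """Apply a sequence of DSL operators left to right."""
    for op in prog:
        s = OPS[op](s)
    return s

def synthesize(examples, max_len=3):
    """Enumerate programs shortest-first; return one that is
    consistent with all input/output examples, or None."""
    for n in range(1, max_len + 1):
        for prog in product(OPS, repeat=n):
            if all(run(prog, i) == o for i, o in examples):
                return prog
    return None

prog = synthesize([("hello", "LEH"), ("world", "ROW")])
assert prog is not None
assert run(prog, "abcdef") == "CBA"   # generalizes to an unseen input
```

The ML connection in the papers below is largely about replacing this blind enumeration with learned rankers and guided search.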
References:
 Relevant research literature, some of them listed below:
 [1] Alonzo Church, Application of Recursive Arithmetic to the Problem of Circuit Synthesis, Summaries of talks presented at the Summer Institute for Symbolic Logic, Cornell University, 1957.
 [2] Solar-Lezama, Program Synthesis by Sketching, PhD Thesis, UC Berkeley, 2008.
 [3] Sumit Gulwani, Automating String Processing in Spreadsheets using Input-Output Examples, POPL 2011.
 [4] SyGuS competition, https://sygus.org.
 [5] Angluin, D., Learning Regular Sets from Queries and Counterexamples, Inf. Comput. 75, 2 (1987), 87-106.
 [6] Vaandrager, F., Model Learning, CACM, Feb 2017.
 [7] Ashwin Kalyan, Abhishek Mohta, Alex Polozov, Dhruv Batra, Prateek Jain, Sumit Gulwani, Neural-Guided Deductive Search for Real-Time Program Synthesis from Examples, 6th International Conference on Learning Representations (ICLR), January 2018.
 [8] Sumit Gulwani, Prateek Jain, Programming by Examples: PL meets ML, Dependable Software Systems Engineering, IOS Press, 2019.
 [9] A. Iyer, M. Jonnalagedda, S. Parthasarathy, A. Radhakrishna, S. Rajamani, Synthesis and Machine Learning for Heterogeneous Extraction, PLDI 2019.
 Abhinav Verma, Vijayaraghavan Murali, Rishabh Singh, Pushmeet Kohli, and Swarat Chaudhuri, Programmatically Interpretable Reinforcement Learning, ICML 2018.
 [10] Alex Graves, Greg Wayne, Ivo Danihelka, Neural Turing Machines, 2014.
 [11] Scott Reed, Nando de Freitas, Neural Programmer-Interpreters, 2016.
 [12] Pearlmutter and Siskind, Reverse-Mode AD in a Functional Framework, TOPLAS 2008.
 [13] Elliott, The Simple Essence of Automatic Differentiation, ICFP 2018.
Prerequisites
 We require students to have good knowledge of programming. We also require students to have taken an introductory course in machine learning (regression, classification, deep learning, etc.). We will not require prior exposure to program synthesis or formal methods; we will supply the necessary background in Part 1. Students will need to show initiative in reading papers and leading discussions in Part 2. Students will also need to do both theory and implementation for the project.
E0 309 (3:1)  Topics in complexity theory  Neeraj Kayal / Chandan Saha 
The theme of this course in the Jan-Apr 2015 semester is arithmetic circuit complexity. Arithmetic circuits are the algebraic analogues of Boolean circuits, and naturally compute multivariate polynomials. The quest for a thorough understanding of the power and limitations of the model of arithmetic circuits (and its connection to Boolean circuits) has led researchers to several intriguing structural, lower bound and algorithmic results. These results have bolstered our knowledge by providing crucial insights into the nature of arithmetic circuits. Still, many of the fundamental questions/problems on arithmetic circuits have remained open till date. The aim of this course is to provide an introduction to this area of computational complexity theory. We plan to discuss several classical and contemporary results and learn about a wealth of mathematical (particularly, algebraic and combinatorial) techniques that form the heart of this subject.
References:
 Current literature on Arithmetic circuit complexity.
Prerequisites
 Familiarity with basic abstract algebra, linear algebra, probability theory and algorithms will be helpful. More importantly, we expect some mathematical maturity with an inclination towards theoretical computer science.
E0 310 (3:1)  Advanced Software Engineering  Murali Krishna Ramanathan 
The course is composed of two parts. The first part will introduce the fundamentals of writing concurrent programs, their applicability in the context of building large-scale software systems, different models of concurrency, and an introduction to various bug patterns. The second part will study recent trends in designing program analysis techniques to detect bugs, with a special emphasis on scalable approaches. A course project will help consolidate the concepts covered in the lectures.
References:
 Java Concurrency in Practice by Brian Goetz, Tim Peierls, Joshua Bloch, Joseph Bowbeer, David Holmes, Doug Lea, Addison-Wesley, 2006.
 Slides and research papers listed on the course webpage.
Prerequisites
 Previous experience with building a system will be helpful but not essential.
E0 311 (3:1)  Topics in Combinatorics  L. Sunil Chandran 
Tools from combinatorics are used in several areas of computer science. This course aims to teach some advanced techniques and topics in combinatorics. In particular, we would like to cover the probabilistic method, which is not covered in the introductory course ‘graph theory and combinatorics’. Moreover, the course will cover, to some extent, the linear algebraic methods used in combinatorics. We will also discuss some topics from extremal combinatorics.
Linear Algebraic methods: basic techniques, the polynomial space method, higher incidence matrices, applications to combinatorial and geometric problems. Probabilistic methods: basic techniques, entropy-based methods, martingales, random graphs. Extremal combinatorics: sunflowers, intersecting families, chains and antichains, Ramsey theory.
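As a flavour of the probabilistic method, the classic first-moment lower bound for Ramsey numbers can be worked out in a few lines (standard textbook material, not specific to these course notes):

```latex
% Erdos' lower bound $R(k,k) > 2^{k/2}$ via the union bound.
Color each edge of $K_n$ red or blue independently with probability $1/2$.
For a fixed $k$-set $S$ of vertices, the probability that $S$ spans a
monochromatic $K_k$ is $2 \cdot 2^{-\binom{k}{2}}$. By the union bound,
\[
  \Pr[\text{some monochromatic } K_k] \;\le\; \binom{n}{k}\, 2^{1-\binom{k}{2}},
\]
which is less than $1$ when $n \le 2^{k/2}$ (for $k \ge 3$). Hence some
two-coloring of $K_n$ has no monochromatic $K_k$, i.e.\ $R(k,k) > 2^{k/2}$.
```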
References:
 L. Babai and P. Frankl: Linear Algebra Methods in Combinatorics with Applications to Geometry and Computer Science, unpublished manuscript.
 N. Alon and J. Spencer: The Probabilistic Method, Wiley-Interscience.
 Stasys Jukna: Extremal Combinatorics: With Applications in Computer Science, Springer.
Prerequisites
 Basic familiarity with probability theory, linear algebra, and graph theory and combinatorics.
E0 312 (3:1)  Foundations of Secure Computation  Arpita Patra 
Indistinguishability, real-ideal world and simulation-based security notions; Secret Sharing, Verifiable Secret Sharing, Oblivious Transfer, Circuit Garbling and function encoding, Commitment Schemes, Zero-knowledge Proofs, Threshold Cryptography, Encryption, Broadcast and Byzantine Agreement, Coin-tossing protocols, Theoretical and practical protocols for secure computation in various models.
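A minimal sketch of the secret-sharing primitive listed above: Shamir's (t, n) scheme over a small prime field (all parameters below are illustrative and far too small to be secure):

```python
p = 2087                       # small prime, for illustration only
secret, t, n = 1234, 3, 5      # threshold 3 out of 5 shares
coeffs = [secret, 166, 94]     # random degree t-1 polynomial; f(0) = secret

def f(x):
    """Evaluate the sharing polynomial over GF(p)."""
    return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p

shares = [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(subset):
    """Lagrange interpolation at x = 0 from any t shares."""
    total = 0
    for i, (xi, yi) in enumerate(subset):
        num, den = 1, 1
        for j, (xj, _) in enumerate(subset):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        total = (total + yi * num * pow(den, -1, p)) % p
    return total

# Any 3 of the 5 shares recover the secret.
assert reconstruct(shares[:3]) == secret
assert reconstruct(shares[2:]) == secret
```

With only t-1 shares, every candidate secret remains equally consistent, which is the information-theoretic hiding property.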
References:
 Book: “Efficient Secure Two-Party Protocols: Techniques and Constructions” by Carmit Hazay and Yehuda Lindell.
 Book Draft: “Secure Multiparty Computation and Secret Sharing – An Information Theoretic Approach” by Ronald Cramer, Ivan Damgård and Jesper Buus Nielsen.
 Recent Research Papers.
Prerequisites
 Mathematical maturity.
 Basic level crypto course.
E0 313 (3:1)  Theory of Convex Optimization and Sampling  Ankit Garg / Anand Louis 
Motivation and objectives of the course: This course is intended to teach students in theoretical computer science and related areas about the theory of convex optimization. Our goal is that at the end of the course, students should know some of the common algorithmic techniques used for optimizing convex functions and sampling from convex bodies.
Tentative Syllabus: The topics covered are likely to be a subset of the following.
a. Introduction to Gradient descent and cousins: Basics of convex functions, gradient descent, accelerated gradient descent, stochastic gradient descent, mirror descent and other variants.
b. Cutting plane methods: Center of gravity, Ellipsoid algorithm, recent breakthroughs on improved running times.
c. Interior point methods: Newton’s method, theory of self-concordant functions, polynomial time algorithm for linear programming, recent breakthroughs on improved running times.
d. Sampling from convex bodies: Basics of Markov Chains, Isoperimetric inequalities, Ball walk, Dikin walk, Hamiltonian Monte Carlo.
e. Connections to combinatorial optimization: Graph sparsification, Laplacian system solvers, faster algorithms for max flow
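Topic (a) can be illustrated with the most basic instance: gradient descent with a 1/L step size on a convex quadratic (A and b are made-up illustrative values):

```python
import numpy as np

# f(x) = 0.5 * x^T A x - b^T x is convex for A positive definite,
# and its minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

L = max(np.linalg.eigvalsh(A))           # smoothness constant lambda_max(A)
x = np.zeros(2)
for _ in range(500):
    grad = A @ x - b                     # gradient of f
    x = x - grad / L                     # standard 1/L step

assert np.allclose(A @ x, b, atol=1e-6)  # converged to the minimizer
```

Strong convexity gives linear convergence here; the accelerated and stochastic variants in topic (a) modify only the update line.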
References: We will be teaching the material from multiple courses/books. Some of them are the following.
 Yin Tat Lee and Santosh Vempala. Theory of Optimization and Continuous Algorithms. 2019.
 Nisheeth Vishnoi. Algorithms for convex optimization. 2018.
 Sebastien Bubeck. Convex Optimization: Algorithms and Complexity, Foundations and Trends in Machine Learning. 2015.
Prerequisites: E0 225 (Design and Analysis of Algorithms)
E0 314 (3:1)  Proof Systems in Cryptography  Chaya Ganesh / Arpita Patra 
Motivation and objectives of the course: The course is intended to introduce cryptographic proof systems and their applications to students studying cryptography.
Syllabus: The tentative topics that will be covered:
 Interactive proofs: the class IP, IP = PSPACE; sum-check protocol, doubly efficient proofs; delegating computation, interactive proofs for muggles.
 Foundations of zero-knowledge (ZK) proofs: ZK for NP, motivation and definitions; round complexity, non-black-box zero-knowledge; sequential and parallel composition; limitations and lower bounds, witness indistinguishability.
 More ZK: honest-verifier zero-knowledge; malicious-verifier zero-knowledge, proofs of knowledge, zero-knowledge arguments; Sigma protocols, non-interactive ZK, the Groth-Sahai proof system; MPC and zero-knowledge, MPC-in-the-head.
 SNARKs (Succinct Non-interactive ARguments of Knowledge): PCPs, succinct arguments, separation from falsifiable assumptions; preprocessing SNARKs with trusted setup; SNARKs from linear PCPs; polynomial commitments, universal updatable SNARKs, holographic proofs.
 Interactive Oracle Proofs (IOPs): model and definitions; IP and PCP as special cases of IOP; applications of IOPs in delegation of computation; transparent SNARKs.
 Recent developments: confidential transactions, anonymous cryptocurrencies such as ZCash; recursive composition for SNARKs, proof-carrying data and applications in succinct blockchains; decentralized private computation; research directions.
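As a concrete instance of a Sigma protocol from the syllabus, here is a sketch of one Schnorr protocol run, an honest-verifier zero-knowledge proof of knowledge of a discrete log (toy parameters and fixed coins, NOT secure; a real run samples r and c at random over a large group):

```python
# Group: the order-11 subgroup of Z_23^* generated by g = 2.
p, q, g = 23, 11, 2
x = 7                          # prover's secret
A = pow(g, x, p)               # public statement: A = g^x mod p

# One run with fixed coins, for illustration.
r = 5
t = pow(g, r, p)               # prover's commitment t = g^r
c = 3                          # verifier's challenge
s = (r + c * x) % q            # prover's response

# Verifier accepts iff g^s == t * A^c (mod p).
assert pow(g, s, p) == (t * pow(A, c, p)) % p
```

Completeness is the identity g^(r+cx) = g^r * (g^x)^c; special soundness and honest-verifier ZK follow from two related transcripts and from simulating (t, c, s) in reverse order, respectively.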
References: There will be multiple sources. Since this is an advanced course, references for most of the material will be research papers and surveys.
 Foundations of Cryptography, Parts I and II, Oded Goldreich.
 Efficient Secure Two-Party Protocols — Techniques and Constructions, Carmit Hazay and Yehuda Lindell.
 Computational Complexity: A Modern Approach, Sanjeev Arora and Boaz Barak.
 Surveys by Oded Goldreich on doubly efficient proofs and PCP
Prerequisites: Algorithms, an introductory course in cryptography.
E0 316 (3:1)  Interacting Particle Systems  Gugan Thoppe 
Motivation and objectives of the course: This course will focus on two broad topics: (i) Markov processes, and (ii) interacting particle systems (which can be viewed as countable systems of locally interacting Markov processes). Both topics are classic in that they have been around for over 50 years and have found applications in varied academic disciplines. Nevertheless, several (often easy-to-formulate) problems remain unresolved, which is exactly why younger researchers continue to be attracted to these two fields.
The broad aim of this course is to familiarize the student with the fundamentals of these two topics and the inherent challenges in their study, e.g., construction, convergence, etc.
Tentative Syllabus:
The following topics will likely be covered in the course:
Convergence of Probability Measures: the Prohorov metric and its connections to weak convergence, Prohorov’s theorem, weak convergence in the space of right-continuous functions with left limits.
Markov Processes: Markov jump and Feller processes, Martingale problem, Martingale problem for interacting particle systems, Convergence theorems for Markov processes. Application: Asymptotic topology of dynamic simplicial complexes.
Interacting Particle Systems and Finite Markov Information Exchange Processes: averaging model, voter model, epidemic models and first-passage percolation, SIR and SIS epidemics, contact process.
Mean-field limit: processes on the complete graph, mean-field limit of the Ising model, analysis of the mean-field model, functions of Markov processes, mean-field contact process, mean-field voter model, mean-field games, and mean-field equilibria.
Construction and ergodicity: Poisson construction, generator construction, ergodicity.
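The voter model above can be simulated in a few lines (an illustrative sketch on the complete graph; n and the seed are arbitrary):

```python
import random

# Voter model on the complete graph K_n: at each step a uniformly
# chosen node adopts the opinion of a uniformly chosen other node.
# The chain is absorbed at consensus (all opinions equal).
random.seed(1)
n = 10
opinions = [i % 2 for i in range(n)]   # start half 0s, half 1s

steps = 0
while len(set(opinions)) > 1:
    i, j = random.sample(range(n), 2)
    opinions[i] = opinions[j]          # node i copies node j
    steps += 1

assert len(set(opinions)) == 1         # consensus reached almost surely
```

On K_n the fraction of 1s is a martingale, so consensus on opinion 1 occurs with probability equal to its initial fraction; this is the kind of statement the mean-field analysis in the syllabus makes precise.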
Recent Literature:
The course will have significant programming assignments based on recent literature. This will engage the students to explore active research fronts in the abovementioned topics.
Textbooks:
 Ethier, S. N. and Kurtz, T. G., 2009. Markov Processes: Characterization and Convergence. John Wiley & Sons.
 Liggett, T. M., 2004. Interacting Particle Systems: An Introduction.
 Swart, J. M., 2017. A Course in Interacting Particle Systems. arXiv preprint arXiv:1703.10007.
 Various surveys and lecture notes.
Prerequisites:
Students should have completed a graduate-level course on probability.
Statement on overlap with other courses: This course does not have much overlap with any existing or past CSA courses. However, there is some overlap with MA 392 (Random Graphs and Interacting Particle Systems).
E0 320 (3:1)  Topics in Graph Theory  L. Sunil Chandran 
Minors: Introduction; properties which cause dense minors in graphs: average degree, girth; Wagner's characterisation of graphs without K5 minors. Tree Decompositions: treewidth, pathwidth, upper and lower bounds for treewidth, relation of treewidth to minors, influence on algorithmic graph problems. Hadwiger's conjecture: its relation to the four colour theorem, related work.
References:
 Graph Theory (Chapters 8 and 12), Reinhard Diestel, Springer, 2000.
 Current Literature
E0 322 (3:1)  Topics in Algebra and Computation  Chandan Saha 
The course will consist of two parts: computational aspects of algebra and number theory, and the use of algebraic methods in theoretical computer science. Part 1: Chinese remaindering, Discrete Fourier Transform, resultants of polynomials, Hensel lifting, automorphisms of rings, short vectors in lattices, smooth numbers, etc., and how these tools are used to design algorithms for fundamental problems such as integer and polynomial factoring, integer and matrix multiplication, fast linear algebra, root finding, primality testing, and discrete logarithm. Part 2: applications of algebraic methods/algorithms in cryptography (RSA cryptosystem, Diffie-Hellman), coding theory (Reed-Solomon and Reed-Muller codes, locally decodable codes), analysis of Boolean functions (Fourier analysis), and construction of expander graphs.
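A hedged sketch (ours, not course code) of the first Part-1 tool listed above, Chinese remaindering: given residues modulo pairwise coprime moduli, reconstruct the unique solution modulo their product.

```python
# Illustrative Chinese Remainder Theorem reconstruction for pairwise
# coprime moduli.  Uses pow(x, -1, m) for modular inverses (Python 3.8+).

from math import prod

def crt(residues, moduli):
    """Return the unique x mod prod(moduli) with x = r_i (mod m_i)."""
    m = prod(moduli)
    x = 0
    for r, mi in zip(residues, moduli):
        ni = m // mi
        x += r * ni * pow(ni, -1, mi)   # inverse of ni modulo mi
    return x % m

if __name__ == "__main__":
    # Sun Tzu's classic instance: x = 2 (mod 3), 3 (mod 5), 2 (mod 7)
    print(crt([2, 3, 2], [3, 5, 7]))   # 23
```

The same interpolation idea underlies fast integer and polynomial arithmetic covered in Part 1.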
References:
 Modern Computer Algebra by von zur Gathen and Gerhard.
 Introduction to Finite Fields by Lidl & Niederreiter.
 Relevant research papers and online lecture notes.
Prerequisites
 Basic familiarity with linear algebra and properties of finite fields (as in Chapter 13 of the book 'Introduction to Finite Fields and their Applications' by Rudolf Lidl and Harald Niederreiter). Alternatively, an undergraduate course in algebra. Most importantly, some mathematical maturity with an inclination towards theoretical computer science.
E0 323 (3:1)  Topics in Verification  Aditya Kanade 
In this course, we aim to study algorithmic approaches for automating 1. Synthesis of programs, 2. Discovering specifications of programs, 3. Selection of domainspecific algorithms. Along with presentations by course instructors, every participant will be assigned a few papers to be presented in the class. The exchange of knowledge will be mainly through open discussions in the classes. An optional course project will be offered for interested participants. The evaluation will be based on quality of presentations, understanding of material, and participation in the class discussions.
References:
 A number of classic as well as recent research papers has been identified carefully; the list can be made available if required. There are no specific textbook references for the course.
Prerequisites
 Program Analysis and Verification (E0 227) or Automated Verification (E0 223); in other cases, you can seek permission from the instructors.
E0 325 (3:1)  Topics in Approximation and Online Algorithms  Anand Louis / Arindam Khan 
(i) Introduction to online algorithms, random order arrival models, etc.
(ii) Primal-dual methods, dual-fitting, etc., and their applications in designing algorithms.
(iii) Convex optimization techniques and their applications to online algorithms, etc.
(iv) Convex programming hierarchies and their application in designing algorithms.
(v) Hardness of approximation using PCPs, reductions from unique games, etc.
(vi) Other selected topics.
References:
 (i) The Design of Approximation Algorithms by David P. Williamson and David Shmoys.
 (ii) Lecture notes from similar course offerings at other universities.
 (iii) Recent papers and monographs/survey articles.
Prerequisites
 E0 249
E0 327 (3:1)  Topics in Program Analysis  Deepak D Souza / K.V. Raghavan 
Dataflow analysis: applications in program verification and transformation. Type systems: applications in software development and verification. Program slicing: applications in software development. Techniques for points-to analysis. Symbolic execution: applications in program testing. Model checking of software using abstractions. Program logics: applications in program verification. Techniques for testing and verification of concurrent programs.
References:
 Research papers
Prerequisites
 Program Analysis and Verification (E0 227)
E0 330 (3:1)  Convex Optimization  Shirish K. Shevade 
Convex sets and functions, Convex Optimization Problems, Duality, Approximation and fitting, Statistical Estimation, Geometric Problems, Unconstrained minimization, Interior-point methods.
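To make the flavour of the course concrete, here is a minimal sketch (an illustration of ours, not course material) of projected gradient descent for a constrained convex problem: minimize ||x - c||^2 over the unit Euclidean ball, whose optimum is simply the projection of c onto the ball.

```python
# Illustrative projected gradient descent: min ||x - c||^2 s.t. ||x|| <= 1.
# The gradient step moves toward c; the projection restores feasibility.

import math

def project_ball(x):
    """Euclidean projection onto the unit ball."""
    n = math.hypot(*x)
    return x if n <= 1.0 else [xi / n for xi in x]

def projected_gradient(c, step=0.1, iters=500):
    x = [0.0, 0.0]
    for _ in range(iters):
        grad = [2.0 * (xi - ci) for xi, ci in zip(x, c)]
        x = project_ball([xi - step * gi for xi, gi in zip(x, grad)])
    return x

if __name__ == "__main__":
    x = projected_gradient([1.0, 2.0])
    print([round(v, 3) for v in x])   # ~ (1,2)/sqrt(5) = [0.447, 0.894]
```

For this objective the gradient is 2-Lipschitz, so any step size below 0.5 guarantees convergence.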
References:
 S. Boyd and L. Vandenberghe: Convex Optimization, Cambridge University Press, 2004.
E0 331 (3:1)  Optimization for Machine Learning  Shirish K. Shevade 
Convex Optimization – Introduction, Incremental Gradient, Subgradient and Proximal Methods. Nonsmooth Convex Optimization, DC (Difference of Convex functions) Programming, Lagrangian Relaxation – Dual Decomposition. Augmented Lagrangian Methods, Cutting-Plane Methods, Large-Scale Learning – Approximate Optimization.
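As a small worked example of the proximal methods named above (our own illustrative sketch, not course code): proximal gradient descent (ISTA) for the one-dimensional lasso min_x 0.5*(x-b)^2 + lam*|x|, whose fixed point is the soft-thresholding of b at level lam.

```python
# Illustrative ISTA (proximal gradient) for the 1-D lasso.
# prox of t*|.| is soft-thresholding; with unit step the iteration
# reaches the closed-form solution soft(b, lam) immediately.

def soft(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    return max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0)

def ista(b, lam, step=1.0, iters=100):
    x = 0.0
    for _ in range(iters):
        x = soft(x - step * (x - b), step * lam)   # gradient step, then prox
    return x

if __name__ == "__main__":
    print(ista(3.0, 1.0))   # soft(3, 1) = 2.0
    print(ista(0.5, 1.0))   # shrunk to 0.0
```

The second call shows the sparsity-inducing effect of the l1 proximal operator.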
References:
 Optimization for Machine Learning, Suvrit Sra, Sebastian Nowozin and Stephen Wright (Editors), The MIT Press, Dec. 2011.
 Recent Literature
Prerequisites
 A course in Machine Learning or Data Mining
E0 333 (3:1)  Theory of Probability and Information  Ambedkar Dukkipati / Shalabh Bhatnagar 
Probability spaces and measure theory, Borel sigma-algebras and random variables, Lebesgue theory of integration, expectation, Radon-Nikodym theorem, Shannon entropy and I-divergence, GYP theorem for I-divergence, Pinsker inequality, stochastic processes and entropy rate, product spaces and Fubini's theorem, probability on metric spaces, conditional expectation, martingales, introduction to stochastic calculus.
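A numeric sketch (ours, purely illustrative) of the Pinsker inequality listed above: D(P||Q) >= 2 * d_TV(P, Q)^2, with the I-divergence D measured in nats and d_TV the total variation distance.

```python
# Illustrative check of Pinsker's inequality on a two-point distribution.

import math

def kl(p, q):
    """I-divergence (relative entropy) in nats; assumes q has full support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def tv(p, q):
    """Total variation distance = half the L1 distance."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

if __name__ == "__main__":
    P, Q = [0.7, 0.3], [0.4, 0.6]
    print(kl(P, Q) >= 2 * tv(P, Q) ** 2)   # True (0.1838 >= 0.18)
```

This particular pair makes the bound nearly tight, which is a useful sanity check when first meeting the inequality.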
References:
 Billingsley, P., Convergence of Probability Measures, Wiley-Interscience, 1999.
 Borkar, V. S., Probability Theory: An Advanced Course, Springer-Verlag, 1995.
 K. R. Parthasarathy, Coding Theorems of Classical and Quantum Information Theory, TRIM publication, 2007.
 I. Karatzas and S. E. Shreve, Brownian Motion and Stochastic Calculus, Springer, 2nd edition, 1991.
Prerequisites
 Any basic course in Probability.
E0 334 (3:1)  Deep Learning for Natural Language Processing  Shirish K. Shevade / S Sundararajan 
Introduction, Multilayer Neural Networks, Backpropagation, Training Deep Networks; Simple word vector representations: word2vec, GloVe; sentence, paragraph and document representations. Recurrent Neural Networks; Convolutional Networks and Recursive Neural Networks; GRUs and LSTMs; building attention models; memory networks for language understanding. Design and Applications of Deep Nets to Language Modeling, parsing, sentiment analysis, machine translation etc.
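A toy, hedged illustration of the word-vector idea above: with dense vector representations, semantic similarity becomes cosine similarity. The tiny hand-made vectors below are assumptions for illustration only, not real word2vec or GloVe output.

```python
# Illustrative cosine similarity over toy "word vectors".
# The vectors and vocabulary here are invented for the example.

import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.75, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

if __name__ == "__main__":
    print(cosine(vectors["king"], vectors["queen"]) >
          cosine(vectors["king"], vectors["apple"]))   # True
```

Real embedding models learn such vectors from corpora; the geometry, not the training, is what this sketch shows.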
References:
 Ian Goodfellow and Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press, 2016
 Recent Literature.
Prerequisites
 A course on Machine Learning or equivalent.
E0 335 (3:1)  Topics in Cryptology: Emerging asymmetric cryptosystems  Sanjit Chatterjee 
Emerging encryption primitives such as identity-based encryption, attribute-based encryption, predicate encryption, and functional encryption. Cryptographic protocols for privacy-preserving computation, secure storage and the cloud. Revisiting security definitions and security reductions with an emphasis on concrete security and the interplay of functionality, security and efficiency of cryptographic protocols. Cryptanalysis of provable security.
References:
 A selection of research papers from journals and conference proceedings.
Prerequisites
 Cryptography (E0 235).
E0 336 (3:1)  Randomness in Cryptography  Bhavana Kanukurthi 
Entropy notions such as min-entropy, Shannon entropy, etc. Computational variants of these notions and the challenges in analyzing them. Randomness extractors, privacy amplification protocols, leakage-resilient cryptography. Design of error-correcting codes with specialized properties (motivated by various cryptographic applications), e.g., non-malleable codes, updatable codes, etc.
References:
 Research papers.
Prerequisites
 An undergraduate course on Probability Theory will be helpful.
E0 337 (3:1)  Topics in Advanced Cryptography  Bhavana Kanukurthi 
The goal of this course is to focus on cutting-edge research themes in cryptography and understand the mathematical objects and/or computational assumptions behind them. Advanced encryption schemes such as, for example, CCA-secure encryption, circular-secure encryption, searchable encryption, fully homomorphic encryption and their underlying computational assumptions (LWE, etc.). Other advanced topics such as puncturable PRFs, obfuscation, and multilinear maps.
References:
 Not Applicable
Prerequisites
 A course in Cryptography and mathematical maturity.
E0 338 (3:1)  Topics in Security and Privacy  Sanjit Chatterjee 
Recent technological advances in diverse domains such as CPS/IoT, cloud storage and computation, and quantum information processing, as well as the proliferation of tools for digital mass surveillance, have thrown up many interesting research problems. This course will focus on some of the theoretical questions in security and privacy from a cryptographic perspective. We plan to cover a subset of the following topics: (A) Cryptographic security in a post-quantum world. (B) Design and analysis of privacy-enhancing tools. (C) Efficient, secure and verifiable query processing in outsourced databases. (D) Cryptocurrency, smart contracts, blockchain and applications.
References:
 Recent research papers in the relevant areas.
Prerequisites
 Good performance in E0 235 (Cryptography) and consent of the instructor.
E0 343 (3:1)  Topics in Computer Architecture  Matthew Jacob Thazhuthaveetil / R. Govindarajan 
Architecture and hardware description languages (RTL, ISPS, VHDL). Processor architecture, instruction-level parallelism, latency tolerance, multithreading, interconnection networks, standards (bus, SCI), architectures, routing, cache coherency, protocol specification, correctness, performance. Memory consistency models, synchronization primitives, parallel programming paradigms, I/O systems, interface standards, parallel I/O, performance evaluation, analytical methods, simulation algorithms and techniques, benchmarking.
Prerequisites
 Computer Architecture, Operating Systems, Some Familiarity with Analytical Performance Evaluation Techniques.
E0 352 (3:1)  Topics in System Research: Learning for Computer System  K. Gopinath / Chiranjib Bhattacharyya 
Regression, feature selection, ensemble methods (boosting, bagging, etc.) and HMMs. Selected topics in OS (related to the papers under discussion, including, as necessary, some review of required background).
References:
 Current Literature (Conference proceedings of SOSP, SysML, etc)
Prerequisites
 Background in at least one computer systems area such as OS, databases or compilers, and the instructors' consent.
E0 353 (3:1)  Topics in Operating Systems  K. Gopinath 
Selected topics in operating systems of topical interest. Design, implementation, correctness and performance related aspects. Past offerings included study of subsystems such as process, storage and network subsystems.
References:
 Recent Literature
Prerequisites
 Consent of the instructor; courses in Operating Systems and Computer Architecture, with some familiarity with the internals of Linux/Unix.
E0 355 (3:1)  Topics in Compiler Design  Y.N. Srikant 
Dynamic and Just-In-Time compilation. Compilation for embedded systems: performance, memory, and energy considerations. Static analysis: points-to analysis, abstract interpretation. WCET estimation. Type systems. Optimizations for OO languages. Compilation for multicore systems.
This course will be based on seminars and mini projects.
References:
 Y.N. Srikant and Priti Shankar (ed.), The Compiler Design Handbook: Optimizations and Machine Code Generation, 2nd ed., CRC Press, 2008.
Prerequisites
 Good knowledge of dataflow analysis and compiler optimizations
E0 358 (3:1)  Advanced Techniques in Compilation and Programming for Parallel Architectures  Uday Kumar Reddy B. 
Parallel architectures: a brief history and design. Auto-parallelization for multicores, GPUs, and distributed-memory clusters. Lock-free and wait-free data structures/algorithms for parallel programming. Study of existing languages and models for parallel and high-performance programming; issues in the design of new ones.
References:
 Aho, Lam, Sethi, and Ullman, Compilers: Principles, Techniques, and Tools, 2nd edition
 Herlihy and Shavit, The Art of Multiprocessor Programming
 Ananth Grama, Introduction to Parallel Computing
 List of research papers and other material which will be the primary reference material will be available on course web page.
Prerequisites
 Knowledge of “E0 255 Compiler Design” course content (especially on parallelization) will be very useful, but not absolutely necessary.
 Knowledge of microprocessor architecture and some basic understanding of parallel programming models.
E0 361 (3:1)  Topics in Database Systems  Jayant R. Haritsa 
Object-Oriented Databases, Distributed and Parallel Databases, Multidatabases, Access Methods, Transaction Management, Query Processing, Deductive Databases, Multimedia Databases, Real-Time Databases, Active Databases, Temporal Databases, Mobile Databases, Database Benchmarks, Database Security, Data Mining and Data Warehousing.
References:
 Readings in Database Systems edited by M. Stonebraker, Morgan Kaufmann, 2nd ed., 1994.
 Conference and Journal papers
E0 367 (3:1)  Topics in Mobile Computing Technologies  L. M. Patnaik 
Wireless Technologies: Land Mobile vs. Satellite vs. In-building Communications Systems, Cellular Telephony, Personal Communication Systems/Networks. Wireless Architectures for Mobile Computing. Applications. Wireless LANs, Wireless Networking, Handoff, Media Access Methods, Mobile IP, Unicast and Multicast Communication, Wireless TCP, Security Issues. Mobile Computing Models, System-Level Support, Disconnected Operation, Mobility, Failure Recovery. Information Management, Broadcast, Caching, Querying Location Data. Location and Data Management for Mobile Computing, Hierarchical Schemes, Performance Evaluation. Case Studies.
References:
 Current literature from IEEE Transactions, journals, and conference proceedings.
 Abdelsalam A. Helal et al., Any Time, Anywhere Computing: Mobile Computing Concepts and Technology, Kluwer International Series in Engineering and Computer Science, 1999.
 Evaggelia Pitoura and George Samaras, Data Management for Mobile Computing, Kluwer International Series on Advances in Database Management, October 1997.
Prerequisites
 Consent of the Instructor
E0 370 (3:1)  Statistical Learning Theory  Shivani Agarwal 
Theoretical foundations of modern machine learning. Kernel methods: support vector machines. Ensemble methods: boosting. Generalization analysis: VC-dimension bounds, covering numbers, margin analysis, Rademacher averages, algorithmic stability. Statistical consistency analysis. PAC learning. Online learning and regret bounds. Selected additional topics of current interest.
References:
 Devroye, L, Gyorfi L, and Lugosi G, A Probabilistic Theory of Pattern Recognition. Springer, 1996.
 Anthony M, and Bartlett P L, Neural Network Learning: Theoretical Foundations. Cambridge University Press, 1999.
 Vapnik V N, Statistical Learning Theory. WileyInterscience, 1998.
 Current literature.
Prerequisites
 A strong foundation in probability and statistics, and some previous exposure to machine learning. Some background in linear algebra and optimization will be helpful.
E0 371 (3:1)  Topics in Machine Learning  Indrajit Bhattacharya / Chiranjib Bhattacharyya / Shivani Agarwal 
Review of Directed Graphical Models: Semantics; Exact Inference using Junction Tree Algorithm, Complexity Analysis; Parameter Estimation. Approximate Inference: Loopy Belief Propagation and Generalized Belief Propagation; Sampling techniques, Variational Techniques. Case Study: Latent Dirichlet Allocation. Nonparametric Bayesian Models: Dirichlet Processes, Chinese Restaurant Processes and Polya Urn; Hierarchical Dirichlet Processes and Chinese Restaurant Franchise; Infinite Mixture Models and Indian Buffet Processes; Sequential models, Hidden Markov Dirichlet Processes and Hierarchical Dirichlet Process HMM; Hierarchical models, Nested Chinese Restaurant Processes; Dynamic models, Recurrent Chinese Restaurant Processes. Efficient Inference: Parallel, distributed and online algorithms.
References:
 Textbooks: Current literature.
Prerequisites
 Machine Learning, Consent of the instructor.
E0 372 (3:1)  Topics in Bioinformatics  Chiranjib Bhattacharyya / Ramesh Hariharan 
Sequence Alignment, Global and Local Alignment, Hidden Markov Models and their Applications in sequence processing, Phylogenetics, Bayesian Statistics, Sampling Algorithms, Clustering, Classification of Gene expression datasets, Support vector machines, Optimization, Principal Component Analysis.
References:
 R. Durbin, S. Eddy, A. Krogh, G. Mitchison, Biological Sequence Analysis, Cambridge University Press, 1998.
 M. S. Waterman, Introduction to Computational Biology: Maps, Sequences and Genomes, Chapman and Hall / CRC Press, 1995.
 Current Literature
Prerequisites
 Consent of the Instructor.
E0 373 (3:1)  Topological Methods for Visualization  Vijay Natarajan 
Topological methods for analyzing scientific data; efficient combinatorial algorithms in Morse theory; topological data structures including contour trees, Reeb graphs, Morse-Smale complexes, Jacobi sets, and alpha shapes; robustness and application to sampled data from simulations and experiments; multiscale representations for data analysis and feature extraction; application to data exploration and visualization.
References:
 Textbooks: Course material will consist of current literature and lecture notes.
Prerequisites
 Basic familiarity with fundamental algorithms and data structures is desirable (E0 225 or E0 251). Familiarity with the basics of scientific visualization will be useful but not essential. Interested students with a nonCS background may also register for the course after consent of instructor.
E0 374 (3:1)  Topics in Combinatorial Geometry  Sathish Govindarajan 
Fundamental Theorems: Radon's theorem, Helly's theorem. Geometric graphs: Proximity graphs, geometric results on planar graphs. Geometric incidences: Incidence bounds using the cuttings technique, crossing lemma. Distance-based problems: Bounds on repeated distances and distinct distances. Epsilon Nets: Epsilon Net theorem using random sampling and discrepancy theory, epsilon nets for simple geometric spaces, weak epsilon nets.
References:
 Janos Pach and Pankaj K. Agarwal: Combinatorial Geometry, Wiley, 1st edition, 1995.
 J. Matousek: Lectures on Discrete Geometry, SpringerVerlag, 1st edition, 2002.
 Current literature
Prerequisites
 The registrants should have preferably completed the “Design and Analysis of Algorithms” or “Discrete Structures” course.
E0 376 (3:1)  Information Theory and Statistical Inference  Ambedkar Dukkipati / Rajesh Sundaresan 
Probability and basic information theory, universal data compression, I-projections and iterative algorithms for estimation with applications to statistics, large deviations and hypothesis testing, probabilities on metric spaces and information topology, Kolmogorov complexity, applications of information theory to other areas such as ergodic theory, gambling, and biology.
References:
 Information Theory and Statistics: A Tutorial by I. Csiszár and P. Shields, Now Publishers, 2008.
 Elements of Information Theory, by T. M. Cover and J. A. Thomas, John Wiley and Sons, 2nd edition, 2006.
 Information and Distribution: Occam’s Razor in Action by P. Harremoes and A. Dukkipati, (in preparation) 2008.
 Coding Theorems of Classical and Quantum Information theory by K. R. Parthasarathy, TRIM publication, 2007.
 Information Theory, Inference, and Learning Algorithms by D.J.C. MacKay, Cambridge University Press, 2003.
Prerequisites
 Basic probability theory or consent of instructor.
E0 391 (3:1)  Algebra and Computation  T. Kavitha 
Preliminaries, polynomials, factorization of polynomials, finite fields, Berlekamp's algorithm, Hensel lifting, the LLL algorithm, applications to error-correcting codes, the turnpike problem, some group theory, special cases of graph isomorphism, algorithms for primality testing.
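As a hedged sketch of the primality-testing topic above (our own illustration, not course material), the Miller-Rabin strong pseudoprime test with a few fixed bases; with bases 2, 3, 5 and 7 it is known to be deterministic for n below 3,215,031,751.

```python
# Illustrative Miller-Rabin primality test with fixed bases.

def is_probable_prime(n, bases=(2, 3, 5, 7)):
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    # write n - 1 = d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # base a witnesses compositeness
    return True

if __name__ == "__main__":
    print([n for n in range(2, 30) if is_probable_prime(n)])
    # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Note how base 3 catches 2047 = 23 * 89, which is a strong pseudoprime to base 2 alone.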
References:
 Joachim von zur Gathen and Jürgen Gerhard: Modern Computer Algebra
 Relevant research papers and online notes.
E0 392 (2:0)  Models and Algorithms for Modern Data  Ravi Kannan 
Representing and processing data as high-dimensional points, Random graphs and other random models, Probability concentration phenomena, Eigenvalues, eigenvectors, Singular Value Decomposition and algorithmic applications, Massive matrix computations using randomized algorithms, Learning algorithms, Optimization.
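One primitive behind the spectral topics above can be sketched in a few lines (an illustration of ours, not course code): power iteration for the leading eigenvalue and eigenvector of a symmetric matrix.

```python
# Illustrative power iteration: repeatedly multiply and renormalize;
# the iterate aligns with the top eigenvector, and the Rayleigh quotient
# estimates the leading eigenvalue.

import math

def power_iteration(A, iters=200):
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(x * y for x, y in zip(Av, v))   # Rayleigh quotient

if __name__ == "__main__":
    A = [[2.0, 1.0], [1.0, 2.0]]   # eigenvalues 3 and 1
    print(round(power_iteration(A), 6))   # 3.0
```

Randomized SVD methods for massive matrices build on exactly this multiply-and-orthogonalize pattern, applied to sketched random starting blocks.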
References:
 Current Literature
Prerequisites
 A solid undergraduate background in calculus, linear algebra, probability, and exposure to algorithms.
E0 394 (3:1)  Performance Management of Internet Applications  Varsha Apte 
Part I: Performance Analysis
Introduction to multitier application performance characteristics, Measurement-based performance analysis of distributed applications, Analytical performance modeling of multitier applications, Layered queueing models (generic), Case studies of performance analysis of specific technologies (e.g. web servers, virtual machines).
Part II: Performance Management
Overload control mechanisms, QoS guarantee mechanisms, Dynamic resource provisioning mechanisms (e.g. in virtualized platforms), Power-aware performance management.
References:
 Scaling for ebusiness: technologies, models, performance, and capacity planning, Daniel A. Menascé, Virgilio A. F. Almeida, PrenticeHall, 2000.
 Papers:
 Woodside, Neilson, Petriu, Majumdar, The Stochastic Rendezvous Network Model for Performance of Synchronous Client-Server-like Distributed Software, IEEE Trans. on Computers, January 1995 (vol. 44, no. 1), pp. 20-34.
 Rolia and Sevcik, The Method of Layers, IEEE Transactions on Software Engineering, Volume 21 , Issue 8 (August 1995), Pages: 689 – 700.
 John Dilley, Rich Friedrich, Tai Jin, Jerome Rolia, Web server performance measurement and modeling techniques, Performance Evaluation, Volume 33 , Issue 1 (June 1998), Special issue on tools for performance evaluation, Pages: 5 – 26
 Paul Barford, Mark Crovella, Generating representative Web workloads for network and server performance evaluation, ACM SIGMETRICS Performance Evaluation Review, Volume 26, Issue 1 (June 1998), Pages: 151 – 160.
 TF Abdelzaher, KG Shin, N Bhatti, Performance guarantees for web server end-systems: A control-theoretical approach, IEEE Transactions on Parallel and Distributed Systems, 2002.
 Comparison of the three CPU schedulers in Xen, L Cherkasova, D Gupta, A Vahdat – Performance Evaluation Review, 2007.
 B Urgaonkar, P Shenoy, A Chandra, P Goyal, Agile dynamic provisioning of multitier Internet applications, ACM Transactions on Autonomous and Adaptive Systems (TAAS), Volume 3 , Issue 1 (March 2008).
 Jeffrey S. Chase, Darrell C. Anderson, Prachi N. Thakar, Amin M. Vahdat, Ronald P. Doyle, Managing energy and server resources in hosting centers, December 2001, SOSP ’01: Proceedings of the eighteenth ACM symposium on Operating systems principles.
Prerequisites
 It will be very useful to have a background in queuing systems (as provided in course E0 241, or any equivalent course from other departments). Undergraduate level background in Operating Systems and Computer Networking will be assumed. Students should be comfortable with a broad range of quantitative methods generally required in engineering.
E0 397 (3:1)  Performance and Resource Management in Virtualization and Cloud Computing  Varsha Apte 
Application performance characteristics; performance metrics and their fundamental behaviour with respect to allocated resources, offered load, etc. Overview of virtualization: virtual machines (e.g. Xen, KVM, VMware), performance isolation in virtual machines, Xen CPU schedulers, schedulers in other VMs, live migration. Understanding energy as a resource: energy consumption behaviour of server machines, how power consumption can be controlled. Cloud computing: overview, brief case studies, dynamic and autonomic resource management in clouds, resource allocation within one physical machine, methods based on control theory, reinforcement learning, and other methods. Resource management of a virtualized cluster, specifically approaches for power usage reduction, again via methods based on control theory, reinforcement learning, and other methods.
References:
 The Definitive Guide To The Xen Hypervisor (Series – Prentice Hall Open Source Software Development) by David Chisnall
 Running Xen: A Hands-on Guide to the Art of Virtualization by Jeanna Matthews, Eli M. Dow, Todd Deshane. Prentice Hall.
Prerequisites
 Undergraduate level background in Operating Systems and Computer Networking will be assumed.
E0 399 (1:2)  Research in Computer Science  Deepak D Souza / Shirish K. Shevade / Y.N. Srikant 
Contemporary topics of research in theoretical computer science, computer systems and software, intelligent systems.
Motivation and objectives of the course :
This course is meant for MTech (CSE) students. The idea behind the course is that a student works on a short research problem to get hands-on experience and to develop the soft skills necessary to conduct research. The 1 credit is for one contact hour per week between the instructor(s) and student(s) for discussion and presentations. The 2 credits are for the research work that the student conducts during the week for the course.
References:
 Recent literature
Prerequisites
 Prior consent of instructor(s)
E1 213 (3:1)  Pattern Recognition and Neural Networks  P. S. Sastry 
Introduction to pattern recognition, Bayesian decision theory, supervised learning from data, parametric and nonparametric estimation of density functions, Bayes and nearest neighbor classifiers, introduction to statistical learning theory, empirical risk minimization, discriminant functions, learning linear discriminant functions, Perceptron, linear least squares regression, LMS algorithm, artificial neural networks for pattern classification and function learning, multilayer feedforward networks, backpropagation, RBF networks, deep neural networks, autoencoders, CNNs, support vector machines, kernel-based methods, feature selection and dimensionality reduction methods.
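The Perceptron listed above admits a very short sketch (ours, for illustration only; the toy data set is an assumption): the mistake-driven update rule on a linearly separable 2-D data set.

```python
# Illustrative perceptron learning rule: update the weights only on
# misclassified points; on separable data it converges in finitely many steps.

def perceptron(data, epochs=20):
    """data: list of (x1, x2, label) with label in {-1, +1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x1, x2, y in data:
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:   # mistake-driven update
                w[0] += y * x1
                w[1] += y * x2
                b += y
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1

if __name__ == "__main__":
    data = [(2, 2, 1), (3, 3, 1), (-2, -1, -1), (-3, -2, -1)]
    w, b = perceptron(data)
    print([predict(w, b, x1, x2) for x1, x2, _ in data])   # [1, 1, -1, -1]
```

The same update, pushed through a margin analysis, leads to the classical perceptron convergence theorem covered early in such a course.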
References:
 R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification, John Wiley & Sons, 2002.
 C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press (Indian Edition), 2003.
Prerequisites
 Knowledge of Probability theory.
E1 246 (3:1)  Natural Language Understanding  Partha Pratim Talukdar 
Syntax: syntactic processing; linguistics; parts-of-speech; grammar and parsing; ambiguity resolution; tree-adjoining grammars. Semantics: semantic interpretation; word sense disambiguation; logical form; scoping noun phrases; anaphora resolution. Pragmatics: context and world knowledge; knowledge representation and reasoning; local discourse context and reference; discourse structure; semantic web; dialogue; natural language understanding and generation. Cognitive aspects: mental models, language acquisition, language and thought; theories of verbal field cognition. Applications: text summarization, machine translation, sentiment analysis, perception evaluation, cognitive assistive systems; augmentation of NLP toolkits.
References:
 Allen J, Natural language understanding, Pearson Education, 1995, 2003.
 Jurafsky D, and Martin J H, Speech and language processing: an introduction to natural language processing, computational linguistics and speech recognition, Pearson Education, 2000, 2003.
 Posner M I, Foundations of Cognitive Science, MIT Press, 1998.
 Research Literature.
Prerequisites
 Familiarity with programming (optionally including scripting languages); data structures, algorithms and discrete structures; reasonable knowledge of English language.
E1 254 (3:1)  Game Theory  Y. Narahari / Siddharth Barman 
Introduction: rationality, intelligence, common knowledge, von Neumann - Morgenstern utilities; Noncooperative Game Theory: strategic form games, dominant strategy equilibria, pure-strategy Nash equilibrium, mixed-strategy Nash equilibrium, existence of Nash equilibrium, computation of Nash equilibrium, matrix games, minimax theorem, extensive form games, subgame perfect equilibrium, games with incomplete information, Bayesian games. Mechanism Design: social choice functions and properties, incentive compatibility, revelation theorem, Gibbard-Satterthwaite theorem, Arrow's impossibility theorem, Vickrey-Clarke-Groves mechanisms, dAGVA mechanisms, revenue equivalence theorem, optimal auctions. Cooperative Game Theory: correlated equilibrium, the two-person bargaining problem, coalitional games, the core, the Shapley value, other solution concepts in cooperative game theory.
References:
 Roger B. Myerson, Game Theory: Analysis of Conflict, Harvard University Press, September 1997.
 Martin J. Osborne, An Introduction to Game Theory, Oxford University Press, 2003.
 Y. Narahari, Dinesh Garg, Ramasuri Narayanam, Hastagiri Prakash. Game Theoretic Problems in Network Economics and Mechanism Design Solutions. Springer, 2009.
E1 277 (3:1)  Reinforcement Learning  Shalabh Bhatnagar 
Introduction to reinforcement learning, introduction to stochastic dynamic programming, finite and infinite horizon models, the dynamic programming algorithm, infinite horizon discounted cost and average cost problems, numerical solution methodologies, full state representations, function approximation techniques, approximate dynamic programming, partially observable Markov decision processes, Qlearning, temporal difference learning, actorcritic algorithms.
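A minimal tabular Q-learning sketch for the topic above (ours, purely illustrative; the toy MDP is an assumption): from either of two states, action a moves to state a and pays reward 1 iff a == 1, so the greedy policy should learn to always pick action 1.

```python
# Illustrative tabular Q-learning on a toy deterministic two-state MDP.
# Optimal values: Q*(s,1) = 1/(1-gamma) = 10, Q*(s,0) = gamma * 10 = 9.

import random

def q_learning(gamma=0.9, alpha=0.5, steps=2000, seed=0):
    rng = random.Random(seed)
    Q = [[0.0, 0.0], [0.0, 0.0]]          # Q[state][action]
    s = 0
    for _ in range(steps):
        a = rng.randrange(2)              # pure exploration for simplicity
        s2, r = a, (1.0 if a == 1 else 0.0)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2
    return Q

if __name__ == "__main__":
    Q = q_learning()
    print([max(range(2), key=lambda a: Q[s][a]) for s in range(2)])   # [1, 1]
```

With deterministic transitions the learned table converges to the Bellman fixed point; adding stochastic rewards would require a decaying step size.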
References:
 D.P.Bertsekas and J.N.Tsitsiklis, Neuro-Dynamic Programming, Athena Scientific, 1996.
 R.S.Sutton and A.G.Barto, Reinforcement Learning: An Introduction, MIT Press, 1998.
 D.P.Bertsekas, Dynamic Programming and Optimal Control, Vol.I, Athena Scientific, 2005.
E1 313 (3:1)  Topics in Pattern Recognition  M. Narasimha Murty 
Foundations of pattern recognition. Soft computing paradigms for classification and clustering. Knowledgebased clustering. Association rules and frequent itemsets for pattern recognition. Largescale pattern recognition.
References:
 R. O. Duda, P. E. Hart, and D.G. Stork, Pattern Classification, John Wiley & Sons (Asia), Singapore, 2002
 Recent Literature.
E1 335 (3:1)  Cognition and Machine Intelligence  C.E. Veni Madhavan 
Biological versus computational dichotomy, critical computer: anatomy of the neocortex, the 100-steps-at-5-msec rule, symbolic architecture, connectionist approach, multi-sensory-motor information, hierarchical, network, pyramidal models, spatio-temporal pattern matching, pattern representation and storage, invariant representations, sequences of sequences, auto-associative, content-addressable memory retrieval, the memory-prediction paradigm, domains: language acquisition, vision and attention, mental models, design and development of thought experiments and simulation.
References:
 Posner M I, Foundations of Cognitive Science, The MIT Press, 1993.
 Books and Survey Articles by: M. Minsky, A. Newell, H.A. Simon, D.E. Rumelhart, T. Sejnowski, J. Barwise, N. Chomsky, S. Pinker, V.S. Ramachandran and others
E1 354 (3:1)  Topics in Game Theory  Y. Narahari 
Foundational results in game theory and mechanism design: Nash’s existence theorem, Arrow’s impossibility theorem, Gibbard-Satterthwaite theorem, etc.; selected topics in repeated games, evolutionary games, dynamic games, and stochastic games; selected topics at the interface between game theory, mechanism design, and machine learning; selected topics in algorithmic game theory; modern applications of game theory and mechanism design: incentive compatible learning, social network analysis, etc.
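A basic object behind Nash's existence theorem is the best-response condition, which for small finite games can be checked exhaustively. The sketch below finds pure-strategy Nash equilibria of a 2x2 bimatrix game; the payoff matrices are the standard Prisoner's Dilemma, chosen purely as an illustration:

```python
import numpy as np

# Payoff matrices for a 2-player, 2-strategy game (Prisoner's Dilemma).
# Row player's payoff A[i, j], column player's payoff B[i, j].
A = np.array([[-1, -3],
              [ 0, -2]])
B = np.array([[-1,  0],
              [-3, -2]])

# (i, j) is a pure Nash equilibrium iff neither player gains by deviating:
# i is a best response to j, and j is a best response to i.
equilibria = []
for i in range(2):
    for j in range(2):
        if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
            equilibria.append((i, j))
```

For the Prisoner's Dilemma this returns only mutual defection, the classic example of an equilibrium that is not Pareto optimal. Mixed equilibria, guaranteed to exist by Nash's theorem, require solving for indifference conditions rather than this enumeration.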
References:
 Roger B. Myerson, Game Theory: Analysis of Conflict, Harvard University Press, 1997.
 Rakesh V. Vohra, Advanced Mathematical Economics, Routledge, New York, NY, 2005.
 Andreu Mas-Colell, Michael D. Whinston, and Jerry R. Green, Microeconomic Theory, Oxford University Press, New York, 1995.
 Current Literature
Prerequisites
 Elementary knowledge of linear algebra, linear programming, algorithms, and game theory is useful for this course.
E1 395 (3:0)  Topics in Stochastic Control and Reinforcement Learning  Shalabh Bhatnagar 
Markov decision processes, finite horizon models, infinite horizon models under discounted and long-run average cost criteria, classical solution techniques: policy iteration, value iteration; problems with perfect and imperfect state information. Reinforcement learning, solution algorithms: Q-learning, TD(λ), actor-critic algorithms.
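Of the solution algorithms listed, tabular Q-learning admits a very short sketch: it performs stochastic approximation of the Bellman optimality equation from simulated transitions, without knowing the model. The MDP below (costs, transitions, step-size schedule) is an invented toy example:

```python
import numpy as np

# Tabular Q-learning on a toy 2-state, 2-action discounted-cost MDP.
rng = np.random.default_rng(0)
gamma = 0.9
C = np.array([[1.0, 2.0],
              [0.5, 3.0]])          # C[s, a]: one-step costs
P = np.array([[[0.8, 0.2],
               [0.3, 0.7]],
              [[0.1, 0.9],
               [0.6, 0.4]]])        # P[a, s, s']: transition probabilities

Q = np.zeros((2, 2))
N = np.zeros((2, 2))                # per-(s, a) visit counts for step sizes
s = 0
for _ in range(20000):
    a = int(rng.integers(2))        # uniform random behaviour (off-policy)
    s_next = rng.choice(2, p=P[a, s])
    N[s, a] += 1
    alpha = 1.0 / N[s, a] ** 0.6    # diminishing step size
    # Stochastic-approximation update toward the one-step Bellman target.
    Q[s, a] += alpha * (C[s, a] + gamma * Q[s_next].min() - Q[s, a])
    s = s_next
```

Because the behaviour policy visits every state-action pair infinitely often and the step sizes satisfy the standard summability conditions, Q converges to the optimal Q-factors, from which a greedy policy is recovered.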
References:
 D. P. Bertsekas, Dynamic Programming and Optimal Control, Vols. I and II, Athena Scientific, 2005.
 D. P. Bertsekas and J. N. Tsitsiklis, Neuro-Dynamic Programming, Athena Scientific, 1996.
 R. S. Sutton and A. G. Barto, Reinforcement Learning: An Introduction, MIT Press, 1998.
 Selected Research Papers.
Prerequisites
 A course on probability theory and stochastic processes. Knowledge of nonlinear programming is desirable.
E1 396 (3:1)  Topics in Stochastic Approximation Algorithms  Gugan Thoppe / Shalabh Bhatnagar 
Introduction to stochastic approximation algorithms, ordinary differential equation based convergence analysis, stability of iterates, multi-timescale stochastic approximation, asynchronous update algorithms, gradient-search-based techniques, topics in stochastic control, infinite horizon discounted and long-run average cost criteria, algorithms for reinforcement learning.
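The ODE-based viewpoint can be seen in the simplest Robbins-Monro iteration, here used to track the mean of noisy observations (the target mean and noise model are illustrative choices): the iterate x_n follows the ODE x' = mu - x, whose globally stable equilibrium is mu.

```python
import numpy as np

# Robbins-Monro iteration: x_{n+1} = x_n + a_n * (Y_n - x_n),
# where Y_n are noisy samples with unknown mean mu.
rng = np.random.default_rng(1)
mu = 2.5                       # true mean (unknown to the algorithm)
x = 0.0
for n in range(1, 100001):
    y = mu + rng.normal()      # noisy observation Y_n
    a_n = 1.0 / n              # step sizes: sum a_n = inf, sum a_n^2 < inf
    x += a_n * (y - x)
```

With step sizes a_n = 1/n this iteration reduces to the running sample mean, and the general ODE argument shows x_n converges almost surely to mu; the same template underlies the Q-learning and actor-critic algorithms listed above.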
References:
 H. J. Kushner and G. Yin, Stochastic Approximation and Recursive Algorithms and Applications (2nd edition), Springer-Verlag, New York, 2003.
 A. Benveniste, M. Métivier, and P. Priouret, Adaptive Algorithms and Stochastic Approximation, Springer-Verlag, 1990.
 V. S. Borkar, Stochastic Approximation: A Dynamical Systems Viewpoint, Hindustan Book Agency, 2008.
 D. P. Bertsekas and J. N. Tsitsiklis, Neuro-Dynamic Programming, Athena Scientific, 1996.
 Relevant research papers.
Prerequisites
 A basic course on probability theory and stochastic processes.