Sagnik Chatterjee

About Me

I am a fifth-year PhD student in the BraQIIIT lab at IIIT-Delhi, where I am very fortunate to be advised by Prof. Debajyoti Bera. In the fall, I will be joining the University of Latvia, Riga, as a postdoc, where I will be hosted by Prof. Andris Ambainis.


My research

My research lies at the intersection of quantum computing and computational learning theory. I develop algorithms for discriminative and generative tasks under various noise models, and prove theoretical bounds on convergence, generalization, and speedups.


Contact

Office: B-513, New Academic Block, IIITD
Email: sagnikc [at] iiitd [dot] ac [dot] in


News

  • Aug-Oct'24: Visiting the Indian Statistical Institute, Kolkata, for a research internship.
  • July'24: Presented our work at the Recent Trends in Algorithms (RTA) 2024 workshop at IACS, Kolkata.
  • June'24: Gave a talk on quantum learning at the ACMU seminar series at the Indian Statistical Institute, Kolkata.
  • May'24: Presented our work at AISTATS'24 in Valencia.
  • Feb'24: Gave three invited lectures on advanced quantum algorithms at the Quantum Computing Semester at the Chennai Mathematical Institute, Chennai.
  • Jan'24: Efficient Quantum Agnostic Improper Learning of Decision Trees accepted at AISTATS'24.
  • Jan'24: Presented our work at QIP'24 in Taipei, Taiwan.
  • Jan'24: Teaching Assistant for Theory of Computation.
  • Sep'23: Gave a Ketchup talk at IIIT-Delhi.
  • Sep'23: Gave an invited talk at the IDA seminar at the Czech Technical University in Prague.
  • Sep'23: Gave an invited talk at the University of Latvia, Riga.
  • Jul-Sep'23: Visiting the Czech Technical University in Prague for a research internship.
  • Jul'23: New preprint on Quantum Solutions to the Privacy vs. Utility Tradeoff out on arXiv.
  • Jun'23: Quantum boosting using domain-partitioning hypotheses accepted at Quantum Machine Intelligence.
  • Jun'23: Attended the JTG Summer School at IISc Bangalore.
  • Feb'23: Gave an invited talk on our QTML paper at the IIITD CSE Department seminar.
  • Jan'23: Teaching Assistant for Introduction to Quantum Computing.
  • Nov'22: Presented our paper Quantum boosting using domain-partitioning hypotheses at QTML'22 in Naples, Italy.
  • Aug'22: Quantum boosting using domain-partitioning hypotheses accepted at QTML'22 for a short talk.
  • Apr'22: Qualified my Ph.D. Comprehensive Exam.
  • Mar'22: Teaching Assistant for Data Structures and Algorithms.
  • Mar'22: Gave an invited talk on quantum boosting algorithms at the IITD Theoretical CS seminar.
  • Mar'22: Poster presented at QIP'22.
  • Dec'21: Organized and conducted QISE'21, co-located with FSTTCS'21.
  • Aug'21: Teaching Assistant for Modern Algorithm Design.
  • Jan'21: Teaching Assistant for Theory of Computation.
  • Jan'21: Poster presented at QIP'21.
  • Dec'20: Gave an invited talk on Quantum Machine Learning at the Faculty Development Program hosted by JNTU, Anantapur.
  • Nov'20: Extended Abstract on Applying QAOA+ to the graph matching problem accepted at AQIS'20.
  • Aug'20: Teaching Assistant for Modern Algorithm Design.
  • Feb'20: Attended the Boolean Functions workshop at ISI-Kolkata.
  • Jan'20: Runners-up in the IBMQ Awards, Teach Me Quantum category.
  • Jan'20: Teaching Assistant for Theory of Computation.
  • Dec'19: Attended the 4th Winter School in CS organized by the IIAS, at the Hebrew University of Jerusalem.

Research §

Generalization Bounds for Dependent Data using Online-to-Batch Conversion.


with Manuj Mukherjee and Alhad Sethi.

Under submission.

We give generalization error bounds for statistical learning algorithms trained on non-i.i.d. data in the Online-to-Batch setting.


[ arXiv ]

Abstract:
In this work, we give generalization bounds for statistical learning algorithms trained on samples drawn from a dependent data source, both in expectation and with high probability, using the Online-to-Batch conversion paradigm. We show that the generalization error of statistical learners in the dependent-data setting matches the generalization error of statistical learners in the i.i.d. setting up to a term that depends on the decay rate of the underlying mixing stochastic process, and is independent of the complexity of the statistical learner. Our proof techniques involve defining a new notion of stability of online learning algorithms based on Wasserstein distances, and employing “near-martingale” concentration bounds for dependent random variables to arrive at appropriate upper bounds on the generalization error of statistical learners trained on dependent data.

Citation:
@misc{chatterjee24b,
  title         = {Generalization Bounds for Dependent Data using Online-to-Batch Conversion},
  author        = {Chatterjee, Sagnik and Mukherjee, Manuj and Sethi, Alhad},
  year          = {2024},
  eprint        = {2405.13666},
  archivePrefix = {arXiv}
}
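
A side note on the Online-to-Batch paradigm the title refers to: the classical i.i.d. statement (due to Cesa-Bianchi, Conconi, and Gentile, restated here from memory for a loss bounded in \([0,1]\)) says that if an online learner produces hypotheses \(h_1, \dots, h_T\) while incurring regret \(R_T\) on i.i.d. samples \(z_1, \dots, z_T\), then with probability at least \(1-\delta\), the averaged hypothesis \(\bar{h}\) (for convex losses; otherwise, a uniformly random \(h_t\)) satisfies
\[
L_D(\bar{h}) \;\le\; \frac{1}{T}\sum_{t=1}^{T}\ell(h_t, z_t) + \sqrt{\frac{2\ln(1/\delta)}{T}} \;\le\; \inf_{h}\,\frac{1}{T}\sum_{t=1}^{T}\ell(h, z_t) + \frac{R_T}{T} + \sqrt{\frac{2\ln(1/\delta)}{T}}.
\]
The concentration step here is a martingale argument that breaks for dependent samples; our paper swaps it for “near-martingale” bounds, at the price of an additive term governed by the decay rate of the mixing process.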

Efficient Quantum Agnostic Improper Learning of Decision Trees.


with Tharrmashastha SAPV and Debajyoti Bera.

Accepted at AISTATS 2024.
Poster at QIP 2024.


We give the first quantum algorithm for learning size-\(t\) decision trees in polynomial time and with polynomial sample complexity, under the uniform marginal over instances, in the agnostic setting, and without membership queries.


[ proceedings | arXiv | slides | poster ]

Abstract:
The agnostic setting is the hardest generalization of the PAC model, since it is akin to learning with adversarial noise. In this paper, we give a \(\mathrm{poly}\left(n,t,\frac{1}{\varepsilon}\right)\)-time quantum algorithm for learning size-\(t\) decision trees with uniform marginal over instances, in the agnostic setting, without membership queries. Our algorithm is the first algorithm (classical or quantum) for learning decision trees in polynomial time without membership queries. We show how to construct a quantum agnostic weak learner by designing a quantum version of the classical Goldreich-Levin algorithm that works with strongly biased function oracles. We show how to quantize the agnostic boosting algorithm of Kalai-Kanade (NIPS'09) to obtain the first efficient quantum agnostic boosting algorithm. Our quantum boosting algorithm has a polynomial improvement in the dependence on the bias of the weak learner over all adaptive quantum boosting algorithms, while retaining the standard speedup in the VC dimension over classical boosting algorithms. We then use our quantum boosting algorithm to boost the weak quantum learner we obtained in the previous step to obtain a quantum agnostic learner for decision trees. Using the above framework, we also give quantum decision tree learning algorithms for both the realizable setting and the random classification noise model, again without membership queries.

Citation:
@InProceedings{pmlr-v238-chatterjee24a,
  title     = {Efficient Quantum Agnostic Improper Learning of Decision Trees},
  author    = {Chatterjee, Sagnik and SAPV, Tharrmashastha and Bera, Debajyoti},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {514--522},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  url       = {https://proceedings.mlr.press/v238/chatterjee24a.html}
}
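
In symbols, the guarantee targeted here (the standard agnostic learning definition, spelled out for concreteness): given samples from an arbitrary distribution \(D\) over \(\{0,1\}^n \times \{-1,+1\}\) whose marginal on \(x\) is uniform, the learner must output, with high probability and in \(\mathrm{poly}\left(n,t,\frac{1}{\varepsilon}\right)\) time and samples, a hypothesis \(h\) (not necessarily a tree itself, hence “improper”) satisfying
\[
\Pr_{(x,y)\sim D}\left[h(x)\neq y\right] \;\le\; \min_{T\in\mathcal{T}_t}\;\Pr_{(x,y)\sim D}\left[T(x)\neq y\right] \;+\; \varepsilon,
\]
where \(\mathcal{T}_t\) denotes the class of size-\(t\) decision trees.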

Quantum boosting using domain-partitioning hypotheses.


with Rohan Bhatia, Parmeet Singh Chani, and Debajyoti Bera.

Published in Quantum Machine Intelligence, 5(2): 1-20 (2023).
Short talk at QTML 2022.
Poster at QIP 2022.


We answer an open question on the existence of quantum boosting algorithms for weak learners with non-binary hypotheses.

[ journal | arXiv | code | poster | slides ]

Abstract:
Boosting is an ensemble learning method that converts a weak learner into a strong learner in the PAC learning framework. The \(\mathrm{AdaBoost}\) algorithm is a well-known classical boosting algorithm for weak learners with binary hypotheses. Recently, Arunachalam and Maity (ICML'20) presented the first quantum boosting algorithm by quantizing \(\mathrm{AdaBoost}\). Their algorithm, which we refer to as \(\mathrm{QAdaBoost}\) hereafter, is a quantum adaptation of \(\mathrm{AdaBoost}\) and only works for the binary hypothesis case. \(\mathrm{QAdaBoost}\) is quadratically faster than \(\mathrm{AdaBoost}\) in terms of the VC dimension of the hypothesis class of the weak learner, while it is polynomially worse in the bias of the weak learner.
In this work, we address an open question of Izdebski et al. on the existence of boosting algorithms for quantum weak learners with non-binary hypotheses. We answer this question in the affirmative by developing the \(\mathrm{QRealBoost}\) algorithm, motivated by the classical \(\mathrm{RealBoost}\) algorithm. The main technical challenge was to provide provable guarantees for convergence, generalization bounds, and quantum speedup, given that quantum subroutines are noisy and probabilistic. We prove that \(\mathrm{QRealBoost}\) retains the quadratic speedup of \(\mathrm{QAdaBoost}\) over \(\mathrm{AdaBoost}\), and further achieves a polynomial speedup over \(\mathrm{QAdaBoost}\) in terms of both the bias \(\gamma\) of the learner and the time taken by the learner to learn the target concept class.
Finally, we perform empirical evaluations of \(\mathrm{QRealBoost}\) and report encouraging observations on quantum simulators by benchmarking the convergence performance of \(\mathrm{QRealBoost}\) against \(\mathrm{QAdaBoost}\), \(\mathrm{AdaBoost}\), \(\mathrm{RealBoost}\), and \(\mathrm{SmoothBoost}\) on a subset of the \(\mathrm{MNIST}\) dataset and the Breast Cancer Wisconsin dataset.

Citation:
@Article{Chatterjee2023,
  author  = {Chatterjee, Sagnik and Bhatia, Rohan and Singh Chani, Parmeet and Bera, Debajyoti},
  title   = {Quantum boosting using domain-partitioning hypotheses},
  journal = {Quantum Machine Intelligence},
  year    = {2023},
  month   = {Jul},
  day     = {27},
  volume  = {5},
  number  = {2},
  pages   = {33},
  issn    = {2524-4914},
  doi     = {10.1007/s42484-023-00122-3},
  url     = {https://doi.org/10.1007/s42484-023-00122-3}
}
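
To make “domain-partitioning hypotheses” concrete, here is a minimal sketch of classical \(\mathrm{RealBoost}\) in the Schapire-Singer confidence-rated style, i.e., the classical algorithm that \(\mathrm{QRealBoost}\) quantizes. The two-block threshold stump used as the weak partitioner is a stand-in of my own for illustration, not the weak learner analyzed in the paper.

import numpy as np

def stump_partitioner(X, y, w):
    # Hypothetical weak partitioner: pick the single-feature threshold whose
    # two blocks have the largest total weight imbalance between the labels,
    # and return a function mapping points to block indices in {0, 1}.
    best, best_score = (0, 0.0), -1.0
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            blocks = (X[:, f] > thr).astype(int)
            score = sum(abs(w[(blocks == j) & (y == +1)].sum() -
                            w[(blocks == j) & (y == -1)].sum()) for j in (0, 1))
            if score > best_score:
                best_score, best = score, (f, thr)
    f, thr = best
    return lambda Z: (Z[:, f] > thr).astype(int)

def real_boost(X, y, T=20, eps=1e-9):
    # X: (n, d) float array; y: (n,) array with labels in {-1, +1}.
    n = len(y)
    w = np.full(n, 1.0 / n)                  # example weights
    ensemble = []
    for _ in range(T):
        part = stump_partitioner(X, y, w)
        blocks = part(X)
        conf = np.zeros(2)
        for j in (0, 1):
            wp = w[(blocks == j) & (y == +1)].sum() + eps
            wm = w[(blocks == j) & (y == -1)].sum() + eps
            conf[j] = 0.5 * np.log(wp / wm)  # confidence-rated vote on block j
        w = w * np.exp(-y * conf[blocks])    # exponential reweighting
        w = w / w.sum()
        ensemble.append((part, conf))
    # Final classifier: sign of the summed confidence-rated votes.
    return lambda Z: np.sign(sum(c[p(Z)] for p, c in ensemble))

The quantum version has to work with noisy, probabilistic estimates of the per-block weights computed above, which is exactly where the convergence guarantees in the paper come in.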

Talks §

Block Encodings and LCUs

  • Quantum Computing Semester, CMI, 2024

These lectures were given at the Chennai Mathematical Institute as part of the 2024 Quantum Computing Semester.
YouTube videos: Lecture 1, Lecture 2.
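
For anyone landing here first, the two standard definitions these lectures revolve around (reproduced from memory): a unitary \(U\) acting on \(a\) ancilla qubits plus the system register is an \((\alpha, a, \varepsilon)\)-block-encoding of a matrix \(A\) if
\[
\left\lVert A \;-\; \alpha \left(\langle 0|^{\otimes a} \otimes I\right) U \left(|0\rangle^{\otimes a} \otimes I\right) \right\rVert \;\le\; \varepsilon,
\]
i.e., \(A/\alpha\) sits (approximately) in the top-left block of \(U\). The Linear Combination of Unitaries (LCU) technique produces such an encoding for \(A = \sum_j c_j U_j\) with \(c_j > 0\): taking \(\mathrm{PREP}\,|0^a\rangle = \frac{1}{\sqrt{\lVert c \rVert_1}} \sum_j \sqrt{c_j}\,|j\rangle\) and \(\mathrm{SELECT} = \sum_j |j\rangle\langle j| \otimes U_j\), the circuit \(\left(\mathrm{PREP}^\dagger \otimes I\right) \mathrm{SELECT} \left(\mathrm{PREP} \otimes I\right)\) is a \((\lVert c \rVert_1, a, 0)\)-block-encoding of \(A\).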

Generating Hard Instances for the Short Basis problem

  • IIITD Theory Reading Group, 2023

Here are my notes and a short blog post on the Ajtai 1996 paper. The reference papers are Micciancio-Regev and Alwen-Peikert.
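
In one line, the object the talk constructs (parameters as I recall them from the Alwen-Peikert paper): a probabilistic polynomial-time algorithm that outputs a matrix \(A \in \mathbb{Z}_q^{n \times m}\), statistically close to uniform, together with a short basis \(S\) of the lattice
\[
\Lambda_q^{\perp}(A) \;=\; \left\{\, x \in \mathbb{Z}^m \;:\; Ax \equiv 0 \pmod{q} \,\right\},
\]
with Gram-Schmidt norm \(\lVert \tilde{S} \rVert = O(\sqrt{n \log q})\). Since finding such a short basis from \(A\) alone is presumed hard, the pair \((A, S)\) can serve as a public key with a trapdoor.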

Improved Quantum Query Upper Bounds based on Classical Decision Trees

  • IIITD Theory Reading Group, 2023

This was the inaugural talk of the bi-weekly IIIT-Delhi Theory Reading Group. Here are my notes on the CMP 2022 paper.

Quantum Boosting Algorithms

  • IIIT-Delhi CSE Seminar, 2023
  • IIT-Delhi Theoretical CS Seminar, 2022

These talks covered joint work with my advisor, Rohan Bhatia, and Parmeet Singh Chani. See above for details.

Introduction to C

  • Operating Systems Refresher Module, 2022

This was a short course I taught at IIIT-Delhi as part of a refresher module on Operating Systems during the Monsoon 2022 semester.
Lecture 1, Lecture 2, Lecture 3.

The HHL algorithm

  • Quantum Computing Semester, CMI, 2024
  • Gong Show Talk, IIAS Winter School, 2019

My HHL notes are part of the course we designed for the “Teach Me Quantum” category of the 2019-20 IBMQ Awards, in which we won second place. The references are HHL '09, Ambainis '10, WZP '17, and CKS '17.

YouTube video: CMI lecture.
My Notes and Slides.
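
In brief, for anyone skimming: given a Hermitian matrix \(A = \sum_j \lambda_j |u_j\rangle\langle u_j|\) and a state \(|b\rangle = \sum_j \beta_j |u_j\rangle\), HHL prepares, up to normalization,
\[
|x\rangle \;\propto\; A^{-1}|b\rangle \;=\; \sum_j \frac{\beta_j}{\lambda_j}\,|u_j\rangle
\]
by running phase estimation on \(e^{iAt}\) to write estimates of the \(\lambda_j\) into an ancilla register, applying the controlled rotation \(|\lambda\rangle|0\rangle \mapsto |\lambda\rangle\left(\frac{C}{\lambda}|1\rangle + \sqrt{1-\frac{C^2}{\lambda^2}}\,|0\rangle\right)\), uncomputing the estimate, and postselecting on the flag qubit being \(|1\rangle\). The original paper's runtime is roughly \(\widetilde{O}\!\left(\log(N)\, s^2 \kappa^2 / \varepsilon\right)\) for an \(s\)-sparse \(A\) with condition number \(\kappa\); as I recall, the Ambainis and CKS references improve the \(\kappa\) and \(\varepsilon\) dependence, respectively.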

§ See CV for full list.