Accepted posters TQC 2024
There are 429 accepted posters for TQC 2024. Of these, the Programme Committee highlighted 19 Outstanding Posters: you can find them by filtering with the tag dropdown menu below.
Clarifications
Accepted does not mean presented: Note that not all accepted posters will be presented at the conference, due to author availability constraints. Shortly before the conference starts, we will announce which posters will be presented in person, based on whether the authors have registered for the conference. If you are interested in a particular poster, please contact the authors directly.
Online presentation: For authors who cannot make it to the conference, it will be possible to present the poster online throughout the week on our Discord server. We will share instructions closer to the conference. In our experience, online attendance at these presentations is much lower than in-person attendance.
Withdrawing a poster: If you cannot or do not wish to present your accepted poster, you don’t need to contact the organizers or PC chairs; this list will remain here as a record of all accepted submissions. Exception: if you have found a fatal mistake in the submission or would like to change the authors’ names, please let us know.
Upload media: If you would like to upload a thumbnail, additional links, or the poster PDF, please follow the link in the notification email sent by the PC chairs to the corresponding authors.
Poster sessions: The live poster sessions will be on Monday and Thursday (see schedule). If your poster submission number is below 290, you present on Monday; if it is above 290, you present on Thursday (290 is a talk). If you cannot make it to your allocated session, just bring the poster to the other session and find a free slot. You don’t need to ask the organizers.
Poster printing and size: The poster size should be A0 (84.1 cm × 118.9 cm) in portrait orientation. We recommend bringing your poster with you, as printing options in Okinawa are limited.
1.
João F. Doriguello, Debbie Huey Chih Lim, Chi Seng Pun, Patrick Rebentrost, Tushar Vaidya
Quantum Algorithms for the Pathwise Lasso (Poster)
2024.
@Poster{P24_7,
title = {Quantum Algorithms for the Pathwise Lasso},
author = {João F. Doriguello and Debbie Huey Chih Lim and Chi Seng Pun and Patrick Rebentrost and Tushar Vaidya},
url = {https://arxiv.org/pdf/2312.14141},
year = {2024},
date = {2024-01-01},
abstract = {We present a novel quantum high-dimensional linear regression algorithm with an L1-penalty based on the classical LARS (Least Angle Regression) pathwise algorithm. Similarly to available classical numerical algorithms for Lasso, our quantum algorithm provides the full regularisation path as the penalty term varies, but quadratically faster per iteration under specific conditions. A quadratic speedup on the number of features/predictors d is possible by using the simple quantum minimum-finding subroutine from Dürr and Høyer [1] in order to obtain the joining time at each iteration. We then improve upon this simple quantum algorithm and obtain a quadratic speedup both in the number of features d and the number of observations n by using the recent approximate quantum minimum-finding subroutine from Chen and de Wolf [2]. In order to do so, we construct, as one of our main contributions, a quantum unitary based on quantum amplitude estimation to approximately compute the joining times to be searched over by the approximate quantum minimum-finding subroutine. Since the joining times are no longer exactly computed, it is no longer clear that the resulting approximate quantum algorithm obtains a good solution. As our second main contribution, we prove, via an approximate version of the KKT conditions and a duality gap, that the LARS algorithm (and therefore our quantum algorithm) is robust to errors. This means that it still outputs a path that minimises the Lasso cost function up to a small error if the joining times are only approximately computed. Finally, in the model where the observations are generated by an underlying linear model with an unknown coefficient vector, we prove bounds on the difference between the unknown coefficient vector and the approximate Lasso solution, which generalises known results about convergence rates in classical statistical learning theory analysis.
[1] Christoph Dürr and Peter Høyer. A quantum algorithm for finding the minimum. arXiv preprint quant-ph/9607014, 1996. [2] Yanlin Chen and Ronald de Wolf. Quantum algorithms and lower bounds for linear regression with norm constraints. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023), volume 261 of Leibniz International Proceedings in Informatics (LIPIcs), pages 38:1–38:21, Dagstuhl, Germany, 2023. Schloss Dagstuhl – Leibniz-Zentrum für Informatik.},
keywords = {Poster session Monday},
pubstate = {published},
tppubtype = {Poster}
}
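The abstract's second main contribution hinges on an approximate version of the Lasso KKT conditions. As a purely classical illustration of what those optimality conditions say, the sketch below checks them in the simplest possible setting: an orthonormal design (X = identity), where the Lasso solution is coordinate-wise soft-thresholding. The function names and toy data are ours, not the authors'; this is not the quantum algorithm from the poster.

```python
# Classical sanity check of the Lasso KKT conditions for the special case
# X = I, where minimising (1/2)||y - beta||^2 + lam * ||beta||_1 has the
# closed-form soft-thresholding solution. Illustrative names and data only.

def soft_threshold(z, lam):
    """Closed-form Lasso solution per coordinate when X = I."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def kkt_violation(y, beta, lam):
    """Largest violation of the Lasso KKT conditions for X = I.

    KKT: if beta_j != 0, then y_j - beta_j == lam * sign(beta_j);
         if beta_j == 0, then |y_j| <= lam.
    Returns 0.0 exactly when beta is optimal.
    """
    worst = 0.0
    for yj, bj in zip(y, beta):
        resid = yj - bj  # negative gradient of the least-squares term
        if bj != 0.0:
            sign = 1.0 if bj > 0 else -1.0
            worst = max(worst, abs(resid - lam * sign))
        else:
            worst = max(worst, max(abs(yj) - lam, 0.0))
    return worst

y = [3.0, -0.5, 1.5, -2.5]
lam = 1.0
beta = [soft_threshold(v, lam) for v in y]  # [2.0, 0.0, 0.5, -1.5]
print(beta, kkt_violation(y, beta, lam))
```

An approximately optimal beta yields a small positive violation rather than zero; bounding the effect of such violations (via approximate KKT and a duality gap) is, per the abstract, how the robustness of the LARS path to inexact joining times is established.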