Unleashing the Potential of LLMs for Quantum Computing: A Study in Quantum Architecture Design
arXiv preprint arXiv:2307.08191, 2023
This paper explores the opportunities and potential of current and forthcoming generations of generative pre-trained transformers (GPTs) in assisting the development of noisy intermediate-scale quantum (NISQ) technologies and fault-tolerant quantum computing (FTQC) research. The authors implement a QGAS model that rapidly proposes promising ansatz architectures and evaluates them on application benchmarks, including quantum chemistry and quantum finance tasks. The results show that, after a limited number of prompt guidelines and iterations, the model can obtain a high-performance ansatz whose results are comparable to state-of-the-art quantum architecture search methods. The study provides an overview of GPT's capabilities in supporting quantum computing research while also highlighting the limitations of current GPT models.
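The abstract describes an iterative loop: prompt the language model for a candidate ansatz, score it on an application benchmark, and feed the score back into the next prompt. As a rough illustration of that loop only (not the paper's implementation), the sketch below mocks the GPT proposal step with a stub function and scores each candidate with a toy VQE-style energy minimization in PennyLane; the Hamiltonian, the function names (propose_ansatz, evaluate), and all hyperparameters are illustrative assumptions.

```python
# Illustrative sketch of a GPT-guided ansatz search loop (not the authors' code).
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

# Toy 2-qubit Hamiltonian standing in for a chemistry/finance benchmark observable.
H = qml.Hamiltonian([1.0, 0.5], [qml.PauliZ(0) @ qml.PauliZ(1), qml.PauliX(0)])

def propose_ansatz(feedback):
    """Stand-in for the GPT prompt/response step (returns a fixed candidate here).
    In a GPT-guided search, `feedback` would be folded into the next prompt and a
    new gate template would be parsed from the model's reply."""
    return [("RY", 0), ("RY", 1), ("CNOT", [0, 1]), ("RY", 0)]

def apply_template(template, params):
    """Map a gate template onto the circuit, consuming one parameter per RY."""
    p = iter(params)
    for gate, wires in template:
        if gate == "RY":
            qml.RY(next(p), wires=wires)
        elif gate == "CNOT":
            qml.CNOT(wires=wires)

def evaluate(template, steps=50):
    """Score a candidate ansatz via a small VQE-style energy minimization."""
    n_params = sum(1 for gate, _ in template if gate == "RY")

    @qml.qnode(dev)
    def energy(params):
        apply_template(template, params)
        return qml.expval(H)

    params = np.array(np.random.uniform(0, np.pi, n_params), requires_grad=True)
    opt = qml.GradientDescentOptimizer(stepsize=0.2)
    for _ in range(steps):
        params = opt.step(energy, params)
    return float(energy(params))

# Search loop: propose an ansatz, benchmark it, and feed the score back
# into the next proposal.
best, feedback = float("inf"), None
for it in range(3):
    template = propose_ansatz(feedback)
    score = evaluate(template)
    feedback = f"iteration {it}: energy {score:.4f}"
    best = min(best, score)
print("best energy found:", best)
```

In the paper's setting the proposal step is a prompted GPT call and the benchmarks are quantum chemistry and quantum finance tasks; the stub and toy Hamiltonian above are only meant to show the shape of the propose/evaluate/feedback cycle.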