Generatively modelling properties of a single quantum system can already be computationally expensive, and often one wishes to model several different scenarios to see how the dynamical or equilibrium properties of a quantum system evolve as certain parameters, such as time, temperature, or the Hamiltonian, are varied continuously. Such parametric families of tasks often carry an inherent information geometry, both across the space of tasks and within each task; ideally, one would leverage awareness of this geometry to guide the optimization of generative models from task to task. Here we explore the use of quantum-probabilistic hybrid representations that combine probabilistic generative models with quantum neural networks, paired with optimization strategies that relate the geometry of the task space to that of the parameter space of our models, in order to achieve an optimization advantage. We specifically study Riemannian metrics defined on the space of density operators, in particular the Bogoliubov-Kubo-Mori (BKM) metric, which admits unbiased estimation for our class of quantum-probabilistic models, namely quantum Hamiltonian-based models (QHBMs). We show that natural gradient descent with respect to this construction attains quantum Fisher efficiency in parameter estimation. We further present an alternative first-order formulation of mirror descent that is conducive to improvements in quantum sample complexity. We also derive conditional initialization strategies for simulating time evolution processes and equilibrium states across values of the problem-space parameters. We demonstrate both theoretically and numerically that such techniques may enable accelerated convergence to higher-quality solutions of quantum generative modelling tasks.
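To make the geometric idea concrete, the following is a minimal illustrative sketch of natural gradient descent, in which the raw gradient is preconditioned by the inverse of a Riemannian metric tensor. This is not the paper's QHBM implementation or the BKM metric itself; the fixed metric matrix and toy quadratic loss here are stand-in assumptions for illustration only.

```python
# Illustrative sketch of natural gradient descent (NOT the paper's QHBM
# construction): the update direction is G(theta)^{-1} grad, where G is
# a positive-definite Riemannian metric. Here a toy quadratic loss and a
# fixed metric matrix stand in for the BKM metric on density operators.
import numpy as np

def natural_gradient_step(theta, grad, metric, lr=0.1):
    """One natural-gradient update: theta <- theta - lr * G^{-1} grad."""
    # Solve G x = grad instead of forming G^{-1} explicitly (cheaper, stabler).
    nat_grad = np.linalg.solve(metric, grad)
    return theta - lr * nat_grad

# Toy quadratic loss L(theta) = 0.5 * theta^T A theta, with ill-conditioned
# curvature A. When the metric matches the curvature, the natural gradient
# removes the ill-conditioning and a unit step reaches the minimum.
A = np.array([[3.0, 0.0],
              [0.0, 0.5]])
theta = np.array([1.0, 1.0])
G = A.copy()                      # hypothetical metric equal to the curvature

grad = A @ theta                  # gradient of the quadratic loss
theta = natural_gradient_step(theta, grad, G, lr=1.0)
print(theta)                      # -> [0. 0.]
```

The design point is that the metric solve reshapes the descent direction according to the geometry of the model space, which is the same mechanism the abstract invokes with the BKM metric on density operators.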



