Machine Learning is the construction of algorithms and statistical models that can extract information hidden within a dataset. By learning a model from a dataset, one gains the ability to make predictions on unseen data drawn from the same underlying probability distribution.

For several decades, research in machine learning focused on models that could provide theoretical guarantees for their performance. In recent years, however, methods based on heuristics have become dominant, partly due to an abundance of data and computational resources.

Deep Learning is one such heuristic method that has seen great success. Deep learning methods are based on learning a representation of the dataset in the form of networks of parameterized layers.

A quantum model has the ability to represent and generalize data with a quantum mechanical origin. To understand quantum models, however, two concepts must be introduced: quantum data and hybrid quantum-classical models.

Quantum data exhibits **superposition** and **entanglement**, leading to joint probability distributions that could require an exponential amount of classical computational resources to represent or store. Quantum data, which can be generated or simulated on quantum processors, sensors, and networks, includes the **simulation of chemicals** and **quantum matter**, **quantum control**, **quantum communication networks**, **quantum metrology**, and much more.
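A minimal sketch of both points, using plain NumPy rather than any quantum library: a two-qubit Bell state whose joint measurement distribution cannot be factored into independent single-qubit distributions (entanglement), and the exponential amplitude count that makes generic quantum data expensive to store classically.

```python
import numpy as np

# A Bell state on two qubits: (|00> + |11>) / sqrt(2).
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)

# Joint probabilities over the basis states |00>, |01>, |10>, |11>.
joint = np.abs(bell) ** 2  # [0.5, 0.0, 0.0, 0.5]

# Marginal distribution of each qubit taken on its own.
p_first = np.array([joint[0] + joint[1], joint[2] + joint[3]])   # [0.5, 0.5]
p_second = np.array([joint[0] + joint[2], joint[1] + joint[3]])  # [0.5, 0.5]

# The product of the marginals is uniform over all four outcomes,
# which differs from the true joint distribution: the qubits are
# correlated in a way no product of independent distributions captures.
product = np.outer(p_first, p_second).ravel()  # [0.25, 0.25, 0.25, 0.25]

# The state vector of n qubits holds 2**n complex amplitudes, so storing
# generic quantum data classically scales exponentially with n.
n = 30
amplitudes_needed = 2 ** n  # already over a billion amplitudes
```

The gap between `joint` and `product` is exactly the correlation that a classical factored model misses, and that quantum models aim to exploit.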

One key technical insight is that quantum data generated by NISQ processors are noisy and typically entangled just before measurement occurs. Applying quantum machine learning to noisy, entangled quantum data can maximize the extraction of meaningful information.

Inspired by these techniques, the TFQ library provides primitives for the development of models that disentangle and generalize correlations in quantum data, opening up opportunities to improve existing quantum algorithms or discover new ones.

Just as machine learning evolved toward deep learning with the advent of new computational capabilities, quantum machine learning now builds on new algorithms that use parameterized quantum transformations called parameterized quantum circuits (PQCs) or quantum neural networks (QNNs). In analogy to classical deep learning, the parameters of a QNN are optimized with respect to a cost function, via either black-box optimization heuristics or **gradient descent**-based methods, in order to learn a representation of the training data. In this paradigm, quantum machine learning is the development of models, training strategies, and inference schemes built on parameterized quantum circuits.