In more detail:
A syntax tree for the sentence is obtained by calling a statistical CCG parser.
lambeq is equipped with a detailed API that greatly simplifies this process, and ships with support for several state-of-the-art parsers.
Internally, the parse tree is converted into a string diagram. This is an abstract representation of the sentence reflecting the relationships between the words as defined by the compositional model of choice, independently of any implementation decisions that take place at a lower level.
The string diagram can be simplified or otherwise transformed by applying rewriting rules; these can be used, for example, to remove specific interactions between words that are redundant for the task at hand, or to make the computation more amenable to implementation on a quantum processing unit.
The resulting string diagram can be converted into a concrete quantum circuit (or a tensor network in the case of a “classical” experiment), based on a specific parameterisation scheme and concrete choices of ansätze.
lambeq features an extensible class hierarchy containing a selection of pre-defined ansätze, appropriate for both classical and quantum experiments.
Now the output of the pipeline (quantum circuit or tensor network) is ready to be used for training. Since Release 0.2.0,
lambeq provides a detailed hierarchy of model and trainer classes that cover all the important use cases of supervised learning.
In a fully quantum pipeline, the trainer first processes the quantum circuit by calling a quantum compiler and then uploads the result to a quantum computer; in the classical case, the tensor network is passed to an ML or optimisation library, such as PyTorch or JAX.