Quality prediction and inspection are important in industrial engineering because they help manufacturers maintain quality standards and meet regulatory requirements.
In this module, Logistic Regression, Hybrid Quantum Machine Learning, and Quantum Machine Learning will be applied to analyze a quality prediction dataset.
Install required libraries
Import required libraries
Here we preprocess the data. First, we take the feature and label datasets, which are in ".data" format, convert them to ".csv", and concatenate them into a single CSV file.
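As a sketch, the conversion step might look like this (the file names and the whitespace separator are assumptions; adjust them to the actual ".data" files):

```python
import pandas as pd

def data_to_csv(features_path, labels_path, out_path):
    """Read whitespace-separated ".data" files and merge them into one CSV."""
    X = pd.read_csv(features_path, sep=r"\s+", header=None)
    y = pd.read_csv(labels_path, sep=r"\s+", header=None)
    # Keep only the first column of the label file as the class label.
    merged = pd.concat([X, y.iloc[:, [0]].set_axis(["label"], axis=1)], axis=1)
    merged.to_csv(out_path, index=False)
    return merged
```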
Mounting Google Drive and accessing the dataset from it.
These are the steps to mount Google Drive through code.
First, upload the dataset to Drive and make a note of the dataset path.
When you run the code cell above, this pop-up will appear. Click "Connect to Google Drive".
Then select the Google Drive account in which the dataset is uploaded.
Select "Continue"
Select "Continue" and the drive will be mounted.
Then, fill the NaN fields with 0, separate the features (X) and the label (y), and map the label values 1 and -1 to 1 and 0, respectively. Next, normalize the data and split it into training and testing sets.
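A minimal sketch of this preprocessing, assuming the merged CSV has a class column named "label":

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def preprocess(df):
    # Replace NaN fields with 0.
    df = df.fillna(0)
    # "label" is the assumed name of the class column; adjust to your CSV.
    X = df.drop(columns=["label"]).values
    # Map the labels 1 -> 1 and -1 -> 0.
    y = np.where(df["label"].values == 1, 1, 0)
    # Normalize the features, then split into train and test sets.
    X = StandardScaler().fit_transform(X)
    return train_test_split(X, y, test_size=0.2, random_state=42)
```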
We train our model using a Logistic Regression classifier, test it, and print the model's accuracy and confusion matrix.
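A sketch of this step on synthetic stand-in data (in the notebook, the preprocessed train/test split from above is used instead):

```python
from sklearn.datasets import make_classification  # only for the demo data
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Synthetic stand-in data in place of the real quality dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = clf.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))
```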
We are performing feature selection using a Logistic Regression classifier to identify the most important features.
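One simple way to do this, shown here on synthetic data, is to rank features by the magnitude of the fitted logistic-regression coefficients and keep the top k (k = 8 is an assumed budget):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Fit once, then rank features by coefficient magnitude.
clf = LogisticRegression(max_iter=1000).fit(X, y)
importance = np.abs(clf.coef_).ravel()    # one value per feature
top_k = np.argsort(importance)[::-1][:8]  # indices of the 8 largest
X_selected = X[:, top_k]
```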
We set up the quantum device with as many qubits as there are features, and define the quantum circuit as a QNode. We then define the hybrid model and the cost function.
We initialize the weights and the optimizer.
The cost is printed every 10 steps.
Training cost Over Time:
In the early steps the cost is high, with some fluctuations, as the model begins to learn. Through the middle of training the cost decreases gradually, which is a good sign. In the final stage it continues to decrease, though with fluctuations, indicating that the optimization was only partly stable.
Printing the accuracy of the Hybrid model.
Now we implement the QML model, starting by normalizing the data and applying PCA.
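A sketch of this reduction, assuming a 4-qubit circuit and random stand-in data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler, minmax_scale

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 12))  # stand-in for the full feature matrix

n_qubits = 4  # assumed circuit width
X_std = StandardScaler().fit_transform(X)
X_pca = PCA(n_components=n_qubits).fit_transform(X_std)
# Rescale to [0, pi] so each component can serve as a rotation angle.
X_qml = minmax_scale(X_pca, feature_range=(0, np.pi))
```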
This code defines a quantum variational classifier that encodes data on qubits, applies rotations and entangling gates, and outputs predictions measured as a Pauli-Z expectation value. A cost function then computes the model's performance using a mean squared error loss to guide optimization.
Initializing the weights and creating the Adam optimizer.
This code trains the quantum model over 100 steps, recording the cost at each step for a mini-batch of training data. A plot then visualizes the reduction in cost over time, showing the model’s learning progress.
The training curve is not ideal: although the cost decreases at some points, it still spikes at others.
Calculating the accuracy of the model.
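With ±1 labels, accuracy can be computed by thresholding the circuit's expectation values at zero. The values below are hypothetical stand-ins for the real test-set outputs:

```python
import numpy as np

# Hypothetical Pauli-Z expectation values for four test samples.
expvals = np.array([0.7, -0.2, 0.05, -0.9])
y_test = np.array([1, -1, 1, -1])

# Threshold the expectation value at 0 to get +/-1 class predictions.
y_pred = np.sign(expvals)
accuracy = np.mean(y_pred == y_test)
print(f"QML accuracy: {accuracy:.2f}")
```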
Plotting a graph to compare the accuracies of all the models.
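A sketch of the comparison plot; the accuracy values below are placeholders, so substitute the ones computed for each model above:

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Placeholder accuracies; replace with the values computed above.
accuracies = {"Logistic Regression": 0.93, "Hybrid QML": 0.90, "QML": 0.85}

plt.bar(list(accuracies), list(accuracies.values()),
        color=["tab:blue", "tab:orange", "tab:green"])
plt.ylabel("Accuracy")
plt.ylim(0, 1)
plt.title("Model accuracy comparison")
plt.savefig("model_accuracies.png")
```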
These are the accuracies of all the models.