Let's look at the implementation of the logistic regression model on the Credit Card dataset.
Let's start by installing pennylane and pennylane-qiskit.
Import all the required libraries.
Mount Google Drive so that you can access your files through the path '/content/drive', which corresponds to your Google Drive directory; you can combine it with os functions to list files, read data, and interact with files stored in Drive. Define the path for the dataset, scale the data, convert the target to numeric, and split the dataset into train and test sets.
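The preprocessing steps above can be sketched as follows. A small synthetic DataFrame stands in for the credit card data here so the snippet is self-contained; the column names, the string labels, and the `read_csv` path in the comment are all assumptions, not taken from the original notebook.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the credit card data; the real notebook would load it with
# something like pd.read_csv('/content/drive/MyDrive/creditcard.csv') (path assumed).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 4)), columns=["V1", "V2", "V3", "Amount"])
df["Class"] = rng.choice(["legit", "fraud"], size=200)   # string target (assumed labels)

X = df.drop(columns="Class").values
y = (df["Class"] == "fraud").astype(int).values          # convert the target to numeric

X = StandardScaler().fit_transform(X)                    # scale the features
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
print(X_train.shape, X_test.shape)
```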
Mounting Google Drive and accessing the dataset from it. These are the steps to mount Google Drive through code:
First, upload the dataset to Drive and make a note of its path.
When you run the above code cell, this pop-up will appear. Click "Connect to Google Drive".
Then select the Google Drive account to which the dataset was uploaded.
Select "Continue"
Select "Continue" and the drive will be mounted.
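The mount step itself is a one-liner. The sketch below also falls back to a local path when it is not running inside Colab, so it stays runnable elsewhere; the Drive file location is an assumption.

```python
try:
    # Available only inside Google Colab
    from google.colab import drive
    drive.mount('/content/drive')
    DATA_PATH = '/content/drive/MyDrive/creditcard.csv'  # assumed Drive location
except ImportError:
    DATA_PATH = 'creditcard.csv'                         # fallback outside Colab
print(DATA_PATH)
```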
Training and evaluating the classical logistic regression model.
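A minimal sketch of this step, using a linearly separable toy dataset in place of the preprocessed credit card features (the data and hyperparameters are assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy stand-in for the preprocessed credit card features
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # separable labels, so LR should do well

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"classical LR accuracy: {acc:.2f}")
```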
Preprocessing the data again for quantum logistic regression: scaling the features and applying PCA.
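PCA reduces the feature count to something a small quantum circuit can encode, typically one feature per qubit. A sketch, assuming 4 components (random data stands in for the real features):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))            # stand-in for the scaled feature matrix

X_scaled = StandardScaler().fit_transform(X)
X_pca = PCA(n_components=4).fit_transform(X_scaled)   # 4 components = 4 qubits (assumed)
print(X_pca.shape)
```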
Defining the quantum device, feature map, and ansatz.
Defining the QLR circuit by combining the feature map and ansatz (so that the circuit imitates logistic regression), along with the loss function and the Adam optimizer.
Training the model, testing it, and printing the accuracy; tqdm is used to show a training progress bar.
Plotting a graph to compare the accuracies of the classical and quantum models.
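The comparison plot might be produced along these lines; the accuracy values below are placeholders standing in for the numbers the two runs actually produce, not results from the source.

```python
import matplotlib
matplotlib.use("Agg")            # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

# Placeholder values; in the notebook these come from the two evaluations above
results = {"Classical LR": 0.95, "Quantum LR": 0.90}

plt.bar(results.keys(), results.values(), color=["steelblue", "darkorange"])
plt.ylabel("Accuracy")
plt.ylim(0, 1)
plt.title("Classical vs Quantum Logistic Regression")
plt.savefig("accuracy_comparison.png")
print("saved accuracy_comparison.png")
```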
Accuracies (for the creditcard dataset):
Link for the code: Code Link (time taken to run the code: 35-45 min)
Link for Dataset: Dataset Link