TSH Group is a contract equipment manufacturer that specialises in precision machining, module assembly and system integration, and quick-turn manufacturing, among other services.
They were looking for a way to automate the quality control (QC) portion of their manufacturing process, with intelligent AI built into the system to help carry out parts of the inspection process.
Problem Description
In this company, there are two main users: inspectors and managers.
The manager needs to ensure that the quality of all components is high so that the company can maximise its profits and customer satisfaction. For this, inspectors verify that the components of each machine, including the label on the control box, are correct. Currently, human inspectors check the accuracy of label placement and QC the other components. They manually document pictures of the product, and if any QC check fails, they send the product back to the production department. To ensure that there are no discrepancies in the process, the manager would like a system that digitises this workflow and displays a big-picture analysis.
How might we make the classification of label placement and quality inspection data tracking as accurate as possible with the help of artificially intelligent software?
Who are we designing for?
Mobile application to help inspectors manage multiple QC entries at once
Web page for managers to view analytics, as well as change system controls
AI inspection service to detect whether labels pass or fail the quality control process
Manager Dashboard
Inspector App
Front End
Back End
AI, App Server, Authentication
The AI inspection service would involve multi-label image detection and binary classification. Its only responsibility is to analyse an input image and return prediction results.
The Authentication service is an isolated service managing user registrations and logins. This singular authentication service provides access to multiple related applications such as the dashboard and the inspector application.
The App Server is in charge of the main business logic required for our applications, providing endpoints for the frontend services to Create, Read, Update and Delete (CRUD) resources. It also supports real-time communication between the frontend services and itself via WebSockets with Rails ActionCable.
We generally follow REST API principles such as:
Client-Server decoupling: Client applications can only interact with the server via known URIs
Statelessness: No server-side sessions and every request includes information needed to process it (e.g. via request headers or bodies)
Uniform Interface: Each resource has its own unique URI. Since our data is relational, we standardised the URIs of records associated with a resource (for one-to-many or many-to-many relationships). For example, machine_types/:id/component_types returns the associated component types for the machine type with the given ID.
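As an illustration of the convention above, the nested URI pattern can be expressed as a small helper (a sketch only; the real routing is handled by Rails, and this helper is not part of our codebase):

```python
def nested_resource_uri(parent: str, parent_id: int, child: str) -> str:
    """Build the standardised URI for records associated with a parent resource.

    Follows the convention described above, e.g. machine_types/:id/component_types.
    Illustrative helper only; Rails nested resource routes generate these paths.
    """
    return f"{parent}/{parent_id}/{child}"

# The associated component types for machine type 7:
print(nested_resource_uri("machine_types", 7, "component_types"))
# → machine_types/7/component_types
```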
App Development, AI Inspection
Cloud Run, Cloud SQL, Pub/Sub, Cloud Functions, BigQuery
Cloud Run
Docker
We used Cloud Run to deploy our three microservices and frontend applications as dockerised containers.
Cloud SQL
PostgreSQL
We used Cloud SQL with PostgreSQL to store most of our application data and resources, as we had decided on a relational database.
Cloud Pub/Sub
Cloud Pub/Sub is used to facilitate inter-service communication. In our project, we push to a Pub/Sub topic whenever a work order is submitted to trigger a Cloud Function.
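The publish step can be sketched as follows. The message payload shape and the topic name work-order-submitted are illustrative assumptions, not our actual schema; the serialisation is pure Python, while the actual publish call (shown in comments) requires the google-cloud-pubsub client and GCP credentials:

```python
import json

def build_work_order_message(work_order: dict) -> bytes:
    """Serialise a submitted work order into a Pub/Sub message body.

    The payload shape here is an illustrative assumption; the real schema
    depends on the App Server's work-order model.
    """
    return json.dumps(work_order, sort_keys=True).encode("utf-8")

if __name__ == "__main__":
    # Publishing the message (requires google-cloud-pubsub and credentials):
    #   from google.cloud import pubsub_v1
    #   publisher = pubsub_v1.PublisherClient()
    #   topic = publisher.topic_path("our-project", "work-order-submitted")
    #   publisher.publish(topic, build_work_order_message(order))
    order = {"id": 42, "status": "submitted"}
    print(build_work_order_message(order))
```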
Cloud Functions
We have multiple Cloud Functions that mainly interact with BigQuery. One of them is triggered by a push to the aforementioned Pub/Sub topic; it then extracts, transforms and loads (ETL) data into BigQuery.
We also have Cloud Functions to query data from BigQuery. These are used by the dashboard application to build interactive dashboards.
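The transform step of the ETL function can be sketched as a pure function that flattens a work order into a BigQuery row. The column names here are assumptions for illustration, not our real table schema; loading would be done with the google-cloud-bigquery client (e.g. insert_rows_json):

```python
def work_order_to_row(work_order: dict) -> dict:
    """Flatten a nested work-order record into a flat dict for a BigQuery insert.

    Column names are illustrative assumptions. A Cloud Function would pass a
    list of such rows to the BigQuery client for loading.
    """
    return {
        "work_order_id": work_order["id"],
        "machine_type": work_order["machine"]["type"],
        # The order passes QC only if every individual check passed.
        "passed_qc": all(check["passed"] for check in work_order["checks"]),
        "num_checks": len(work_order["checks"]),
    }
```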
BigQuery
We used BigQuery to store structured data that is optimal for building dashboard analytics. Data is transformed and written into BigQuery with the help of a Cloud Function.
Cloud Vision
Cloud Vision
To detect the placement of the labels on the machines, we used the Cloud Vision API to train a custom model. The steps to create our model are as follows:
Created our own label dataset through data augmentation.
Uploaded the labelled dataset to Cloud Storage, from where it was imported into Cloud Vision.
Trained a multi-label classification model using Cloud Vision.
Deployed the model through Cloud Run so that it can be called as an API.
To return results as quickly as possible, our model's predictions pass through two gateways that we have termed First Check and Second Check.
First Check checks for the presence of the label in the photo, while Second Check checks the placement of the label. Second Check returns whether the label has passed or has a non-conformance.
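The two-gateway flow can be sketched as a short decision function. The check functions here are stand-ins for calls to the deployed model, so this is a sketch of the control flow rather than the actual inference code:

```python
from typing import Callable

def inspect_label(
    first_check: Callable[[bytes], bool],
    second_check: Callable[[bytes], bool],
    image: bytes,
) -> str:
    """Run the two-gateway inspection flow described above.

    first_check(image)  -> bool: is a label present in the photo?
    second_check(image) -> bool: is the label correctly placed?
    Returns one of "no_label", "non_conformance", or "pass".
    """
    if not first_check(image):
        return "no_label"          # fail fast: no label detected at all
    if not second_check(image):
        return "non_conformance"   # label present but placement is wrong
    return "pass"
```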
The database diagram above illustrates the different tables/models in our application. We normalised our data as much as possible, as our resources are highly relational, with numerous one-to-many and many-to-many relationships.