What are the key concepts in this project?

Network Latency 

Latency is the delay between the initiation of a request and the receipt of its response. In a computer system or network, it is the time data takes to travel from one point to another, commonly measured as round-trip time (RTT) in milliseconds. High latency degrades the user experience, leading to slow response times, decreased efficiency, and frustration. Reducing latency is therefore a key focus of our project: we continuously monitor and optimize it so that users get a fast, seamless experience.
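A simple way to observe latency is to time a TCP handshake. The sketch below is illustrative only (it uses just the Python standard library, and connects to a local listener so it runs anywhere without network access); real measurements would target an actual remote host.

```python
import socket
import threading
import time

def measure_latency(host: str, port: int) -> float:
    """Return the time in seconds to complete a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; close it immediately
    return time.perf_counter() - start

# Demo against a local listener so the example is self-contained.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=server.accept, daemon=True).start()

rtt = measure_latency("127.0.0.1", port)
print(f"TCP connect latency: {rtt * 1000:.3f} ms")
```

On a loopback interface this prints a fraction of a millisecond; across the internet, tens to hundreds of milliseconds are typical.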


Smart Queue Management (SQM)

Smart Queue Management is a set of techniques for controlling how packets are queued and scheduled at a network bottleneck. It typically combines traffic shaping (keeping the queue at a device we control rather than in an uncontrolled upstream buffer), flow queuing (isolating flows so one heavy transfer cannot starve the others), and active queue management (dropping or marking packets early to signal congestion to senders). Schedulers such as fq_codel and CAKE implement SQM and are the standard remedy for bufferbloat, keeping latency low even when the link is fully loaded.
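The active-queue-management idea behind schedulers like fq_codel can be sketched as a toy queue: each packet records when it was enqueued, and packets that sat longer than a target delay are dropped at dequeue. This is a simplified illustration, not the real CoDel algorithm (which also tracks how long the delay has stayed above target).

```python
import time
from collections import deque

TARGET = 0.005  # target sojourn time of 5 ms (CoDel's default target)

class TinyAQM:
    """Toy queue that timestamps packets on enqueue and, on dequeue,
    drops any packet whose queueing delay exceeded TARGET."""

    def __init__(self):
        self.q = deque()

    def enqueue(self, pkt):
        self.q.append((time.monotonic(), pkt))

    def dequeue(self):
        while self.q:
            t_in, pkt = self.q.popleft()
            if time.monotonic() - t_in <= TARGET:
                return pkt
            # Packet sat in the queue too long: drop it, which makes
            # TCP senders back off and keeps the queue short.
        return None
```

Dropping (or ECN-marking) early, before the buffer is full, is what keeps queueing delay bounded under load.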

Bufferbloat

Bufferbloat is a phenomenon that occurs when network buffers become excessively large, delaying packet delivery and increasing latency. It typically appears on links where large buffers are used to absorb bursts and prevent packet loss: under sustained load the buffer fills, every new packet must wait behind the backlog, and queueing delay grows in proportion to the buffer size.
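The effect is easy to quantify: a full buffer adds `buffer_bytes * 8 / link_rate_bps` seconds of queueing delay, because every packet must wait for the whole backlog to drain at the link rate. A short worked example (the buffer and link figures are hypothetical but realistic):

```python
def queueing_delay_s(buffer_bytes: int, link_rate_bps: int) -> float:
    """Worst-case time to drain a full buffer at the given link rate."""
    return buffer_bytes * 8 / link_rate_bps

# A 1 MB buffer in front of a 10 Mbit/s uplink:
delay = queueing_delay_s(1_000_000, 10_000_000)
print(f"{delay * 1000:.0f} ms")  # prints 800 ms
```

That is, a modem with a mere 1 MB of buffering can add nearly a second of latency to a 10 Mbit/s link once a single bulk transfer fills the queue.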

Open Systems Interconnection (OSI) Model

The Open Systems Interconnection model is a conceptual framework that describes how data communication occurs in a network. It divides the communication process into seven layers, from the Physical layer at the bottom to the Application layer at the top; each layer has a specific function and interacts only with the layers directly above and below it.
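As a quick reference, the seven layers can be listed top to bottom. The layer numbers and names below are standard; the printout itself is just an illustrative sketch.

```python
# The seven OSI layers, from the Application layer (7) at the top
# down to the Physical layer (1) at the bottom.
OSI_LAYERS = {
    7: "Application",
    6: "Presentation",
    5: "Session",
    4: "Transport",
    3: "Network",
    2: "Data Link",
    1: "Physical",
}

# Outgoing data is encapsulated from the top layer down; incoming
# data is decapsulated from the bottom layer up.
for number in sorted(OSI_LAYERS, reverse=True):
    print(f"Layer {number}: {OSI_LAYERS[number]}")
```

For this project, the layers that matter most are Transport (layer 4, where TCP reacts to drops) and Network (layer 3, where packets are queued and routed).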