Authors: Alina Humenyuk, Vera Afreixo, Ana Helena Tavares, Diogo Raimundo, Heitor Cardoso, and Bernardo Marques
Cataracts, a clouding of the eye’s lens, lead to a decline in visual acuity. Despite being treatable through surgical removal, they remain the leading cause of blindness and the second most common cause of moderate to severe visual impairment globally. Additionally, the aging global population is expected to increase the number of patients affected by this condition. This scenario highlights the growing need for efficient management of the clinical resources associated with cataract treatment. In this context, identifying outliers becomes particularly important, as an outlier clinical pathway could represent either an exceptional response or an inefficiency in care delivery. This study aimed to identify outlier clinical pathways regarding the sequence of clinical activities (Objective 1) and within three specific periods (Objective 2), using data from 1949 patients who underwent cataract surgery. Three approaches were considered: a general approach (A1), one focused only on patients who had surgery on one eye (A2), and another focused on patients who had surgery on both eyes (A3). For each approach, sequence analysis was applied using a dissimilarity measure based on the longest common subsequence (LCS). Additionally, four unsupervised machine learning algorithms for outlier detection were implemented: k-nearest neighbors (k-NN), local outlier factor (LOF), density-based spatial clustering of applications with noise (DBSCAN), and hierarchical density-based spatial clustering of applications with noise (HDBSCAN). The developed workflow was validated through a post-hoc analysis of the identified outliers, confirming that the methodology effectively distinguished outlier pathways from normal ones. Outlier pathways were found to be more resource- and time-intensive.
Ultimately, this work shows promise for developing tools that could enable more efficient allocation of clinical resources for cataract surgery clinical pathways, addressing the growing demands of an aging population.
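The two building blocks of the workflow can be sketched in a few lines. The sketch below is illustrative only, written in Python and not the authors' code: the activity names are invented, and the outlier score shown is a plain distance-to-k-th-neighbour score, the simplest of the four detectors named in the abstract.

```python
# Illustrative sketch (not the study's code): LCS-based dissimilarity
# between clinical pathways, plus a k-NN outlier score.
# Activity names are invented for the example.

def lcs_len(a, b):
    """Length of the longest common subsequence of two activity sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return dp[len(a)][len(b)]

def lcs_dissimilarity(a, b):
    """A common LCS-based distance: number of elements not covered by the LCS."""
    return len(a) + len(b) - 2 * lcs_len(a, b)

def knn_outlier_scores(paths, k=1):
    """Score each pathway by its distance to its k-th nearest neighbour."""
    scores = []
    for i, p in enumerate(paths):
        dists = sorted(lcs_dissimilarity(p, q) for j, q in enumerate(paths) if j != i)
        scores.append(dists[k - 1])
    return scores

paths = [
    ["consult", "exam", "surgery", "review"],
    ["consult", "exam", "surgery", "review"],
    ["consult", "exam", "surgery", "review"],
    ["consult", "exam", "surgery", "emergency", "reoperation", "review", "review"],
]
scores = knn_outlier_scores(paths, k=1)  # -> [0, 0, 0, 3]
```

The longer, more resource-intensive pathway receives the highest score, which is the intuition behind flagging outlier pathways by dissimilarity.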
Authors: Ana Leonor Saraiva, Catarina Cardoso, Cristiana Silva, Jorge Cabral, José Pedro Antunes, Paula Rama, Sofia Pinheiro, Vera Afreixo
Primary Health Care (PHC) focuses on addressing the physical, mental, and social well-being of individuals and communities. Monitoring the quality of services provided to the community is crucial, and in recent years, PHC has undergone significant organizational reforms to improve service delivery and foster closer ties to communities. As part of this transformation, Health Indicators have been introduced, allowing each unit to manage and enhance its performance based on these metrics.
The aim of this study is to analyze the behavior and correlations of Health Indicators to support health unit teams in efficiently adjusting and optimizing their management processes. To achieve this, an R Shiny application was developed.
Data were collected from the Primary Health Care Identity Card website, specifically from the Contracting and Indicators section. The dataset spans all months of each contractual year for 41 health units from ACES Baixo Vouga. After finalizing the dataset, an exploratory analysis was conducted to assess the performance of both the indicators and the health units over time. Correlations between indicators across all health units were calculated using the rmcorr package, alongside individual correlations for each health unit.
Subsequently, hierarchical clustering of the time series was performed using the Dynamic Time Warping distance measure to group units with similar patterns in indicator performance.
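The Dynamic Time Warping distance underlying the clustering step can be sketched as follows. This is a Python toy (the study itself worked in R), with invented indicator values rather than ACES Baixo Vouga data:

```python
# Illustrative sketch of Dynamic Time Warping (DTW), the dissimilarity
# used for the hierarchical clustering of indicator time series.
# Toy monthly indicator values; not real health-unit data.

def dtw_distance(x, y):
    """Classic DTW with absolute-difference local cost."""
    inf = float("inf")
    n, m = len(x), len(y)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = abs(x[i - 1] - y[j - 1])
            cost[i][j] = step + min(cost[i - 1][j],      # insertion
                                    cost[i][j - 1],      # deletion
                                    cost[i - 1][j - 1])  # match
    return cost[n][m]

unit_a = [60, 62, 65, 70, 72]        # steadily improving indicator
unit_b = [60, 60, 62, 65, 70, 72]    # same pattern, shifted one month
unit_c = [90, 88, 85, 80, 78]        # opposite trend

d_ab = dtw_distance(unit_a, unit_b)  # small: warping absorbs the shift
d_ac = dtw_distance(unit_a, unit_c)  # large: genuinely different pattern
```

Because DTW tolerates temporal shifts, units with the same indicator pattern at slightly different times end up in the same cluster, which a pointwise (Euclidean) distance would miss.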
Authors: Asmae Tajani, Delfim F.M. Torres, and Cristiana J. Silva
In this work, we investigate optimal control problems for the gradient controllability of a new spatio-temporal epidemic model, constructed by adding to the basic SIR epidemic system a diffusion term in each compartment and the ψ-Caputo time-fractional derivative of order α ∈ (0, 1); for more details about fractional operators, see [1]. The obtained results can be used by policy-makers to control the spread of infection. Using semigroup theory, we provide a framework to analyze sufficient conditions under which the considered time-fractional SIR system is gradient approximately controllable with several proposed control actuators. Using the Hilbert Uniqueness Method (HUM) introduced in [2], we find the optimal solution that minimizes a cost functional depending on the vaccination and treatment control strategies. Finally, we present some numerical results to illustrate our theoretical approach.
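A plausible form of such a system is sketched below. This is a reconstruction under standard SIR notation, not the paper's exact model: the diffusion coefficients $d_S, d_I, d_R$, the transmission rate $\beta$, and the recovery rate $\gamma$ are generic symbols, and the vaccination/treatment controls would enter as additional forcing terms.

```latex
% Sketch only: generic SIR-with-diffusion notation, not necessarily
% the paper's symbols; control terms are omitted.
\begin{cases}
{}^{C}D^{\alpha,\psi}_{t} S = d_S \,\Delta S - \beta S I, \\[2pt]
{}^{C}D^{\alpha,\psi}_{t} I = d_I \,\Delta I + \beta S I - \gamma I, \\[2pt]
{}^{C}D^{\alpha,\psi}_{t} R = d_R \,\Delta R + \gamma I,
\end{cases}
\qquad \alpha \in (0,1),
```

where ${}^{C}D^{\alpha,\psi}_{t}$ denotes the ψ-Caputo fractional derivative with respect to the function ψ [1], and Δ is the Laplacian on the spatial domain.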
[1] R. Almeida, A Caputo fractional derivative of a function with respect to another function, Communications in Nonlinear Science and Numerical Simulation, Volume 44, 2017, pp. 460-481.
[2] J.L. Lions, Contrôlabilité exacte, perturbations et stabilisation de systèmes distribués, Tome 1, 2, RMA, Vol. 8, 9, 1988.
Authors: Beatriz Lau, Daniel Ramos, Vera Afreixo, Luís Silva, Ana Helena Tavares, Miguel Martins Felgueiras, Diana Gomes, Firmino Machado, Henrique Coelho
Background: In recent years, the concept of patient blood management (PBM) has emerged as a novel initiative defined as patient-centered and evidence-based, with the aim of preserving patients' own blood, thus reducing the necessity for allogeneic blood components (Dhir & Tempe, 2018). While patient blood management is beneficial for all patients, there are certain populations in whom the application of PBM principles has a more pronounced influence on outcome. A proper identification of these patients permits more expedient and suitable referrals, thereby reducing the costs associated with late or ineffective treatments and the necessity for large quantities of red blood cell transfusions. The objective of this study is, therefore, to develop and analyse a predictive model using machine learning algorithms that can identify those patients who are likely to respond most favourably to the implementation of this approach, measured by the necessity for red blood cell (RBC) transfusion. Methods: This retrospective observational cohort study was conducted at a Northern Portuguese hospital between 2018 and 2023. To address the imbalanced distribution of the dependent variable, the Synthetic Minority Oversampling Technique (SMOTE) was employed. The primary outcome measures for the classification task were the area under the receiver operating characteristic curve (AUC) and accuracy, chosen to evaluate model performance. Two machine learning algorithms, XGBoost and neural networks (NNs), were employed due to their efficiency in handling complex feature interactions. SHAP values were analysed to assess the contribution of each feature to the predictions generated by the XGBoost model, thereby providing interpretability and insights into feature importance. Results: The two models demonstrated an inability to accurately predict the necessity for at least one unit of red blood cell transfusion.
The neural network achieved an accuracy of 0.7548 (recall 0.8491 and specificity 0.5510) and an area under the ROC curve of 0.7983 (95% CI 0.7472 to 0.8494). The XGBoost model achieved an accuracy of 0.700 (recall 0.7358 and specificity 0.6224) and an area under the ROC curve of 0.762 (95% CI 0.7071 to 0.8169). An analysis of SHAP values revealed that the most important variable was preoperative haemoglobin levels, which can be optimised through the PBM approach. Conclusion: The findings of this study emphasise the significance of preoperative optimisation of blood components in minimising the necessity for red blood cell transfusion. The moderate performance of the machine learning algorithms may be attributed to the inclusion of variables with limited predictive capacity.
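The rebalancing step named above (SMOTE) can be sketched in a few lines. This pure-Python toy is illustrative only; the study presumably used a library implementation, and the feature values below are invented:

```python
# Illustrative SMOTE sketch: a synthetic minority sample is a random
# interpolation between a minority point and one of its nearest
# minority neighbours. Toy data, not the study's variables.
import math
import random

def smote(minority, n_new, k=2, seed=0):
    rng = random.Random(seed)
    new_points = []
    for _ in range(n_new):
        a = rng.choice(minority)
        neighbours = sorted((s for s in minority if s is not a),
                            key=lambda s: math.dist(a, s))[:k]
        b = rng.choice(neighbours)
        u = rng.random()  # interpolation factor in [0, 1)
        new_points.append(tuple(ai + u * (bi - ai) for ai, bi in zip(a, b)))
    return new_points

# Toy minority class: (preoperative haemoglobin, age) pairs, invented values.
minority = [(9.5, 70.0), (10.1, 64.0), (9.8, 75.0), (10.4, 68.0)]
synthetic = smote(minority, n_new=6)
```

Because each synthetic point lies on a segment between two real minority points, the oversampled class stays within the observed feature range rather than adding arbitrary noise.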
Authors: Daniel Ramos, Beatriz Lau, Diana Gomes, Ana Helena Tavares, Luís Silva, Vera Afreixo, Miguel Martins Felgueiras, Henrique Coelho, and Firmino Machado
Patient blood management (PBM) is a patient-centered, evidence-based, and systematic approach that aims to manage and preserve the patient’s own blood, minimizing blood loss and optimizing the patient’s physiological tolerance of anaemia, which can improve patient outcomes while saving healthcare resources and reducing costs, owing to the reduction in blood transfusions and hospitalization days. Despite being a relatively new field of knowledge, there is already a wide range of articles supporting the adoption of these measures, so much so that the World Health Organization has urgently called for their integration into healthcare institutions. However, this integration can be challenging due to the culture of healthcare professionals and the scarcity of resources, which requires their precise allocation. Therefore, it is essential to understand the patient characteristics most determinative of adverse outcomes, such as long intensive care unit stays, extended ward admissions, or an increased need for red blood cell transfusion. This can be achieved through the implementation of predictive algorithms, such as Random Forests. In this study, involving 834 patients undergoing PBM, the variable Transfusion was created with three levels: 0, 1, and >1, reflecting the quantity of transfused red blood cells. Analyzing it using a Random Forest revealed sensitivities of 0.95, 0.05, and 0.28 for the levels "0", "1", and ">1", respectively. Due to the class imbalance, a model was trained on a balanced dataset with 64 observations for each transfusion level (representing 70% of the ">1" level observations). In this new model, sensitivity improved to 0.98, 0.73, and 0.89 for the same levels, with Hemoglobin measured at the consultation, Glomerular Filtration Rate, EuroSCORE, Platelet level, Age, Surgery, and Creatinine level identified as the most important variables.
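The balancing step described above (equal numbers of observations per transfusion level) amounts to downsampling each class to the size of the smallest. A minimal Python sketch, with toy counts rather than the study's 834 records:

```python
# Illustrative class-balancing sketch: downsample every class to the
# size of the smallest class (or to an explicit n_per_class).
# Toy record counts; not the study's data.
import random
from collections import defaultdict

def balance_classes(records, label_of, n_per_class=None, seed=0):
    rng = random.Random(seed)
    groups = defaultdict(list)
    for r in records:
        groups[label_of(r)].append(r)
    n = n_per_class or min(len(g) for g in groups.values())
    balanced = []
    for g in groups.values():
        balanced.extend(rng.sample(g, n))
    return balanced

records = ([("0", i) for i in range(700)]
           + [("1", i) for i in range(90)]
           + [(">1", i) for i in range(44)])
balanced = balance_classes(records, label_of=lambda r: r[0])
```

Training on the balanced set removes the majority class's dominance, which is why the per-class sensitivities for "1" and ">1" improved so markedly in the second model.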
Authors: Diana A. Vázquez-Limón, Pedro Damião, Adelaide Freitas, Marco Costa, and Nélia da Silva
Health institutions need to allocate their resources according to the demand they face in order to provide their services; one performance indicator of the service is the maximum number of days for a health service to be delivered, which in Portugal is outlined by law (Diário da República n.º 86/2017, Série I, 2017-05-04). This has led to appointment fixing in the Primary Care Services (PCS) when feasible, so that the medical staff can distribute resources according to the expected demand. Patients enter the appointment scheduling system following some probability distribution, assumed in this work to be Poisson; the operation of the scheduling system can thus be described as a queueing system. The objective of this study is to explore the patterns in the scheduling system. We used R code to generate random numbers from a Poisson distribution to simulate entries to the PCS system. Then, a model to simulate the scheduling of appointments (similar to ) was generated, based on three constraints: the quantity of appointment requests, the availability of physicians, and ensuring that the operational response time of the service is lower than the maximum established response time. The simulation studies the number of days the system takes to assign an appointment date from the day it was requested, in order to characterize the patterns and identify failures. The simulation yields a proposed timeline to fulfill the demand, from which we took two possible performance indicators to determine whether the results improved when changing the allocation of resources. By running generated instances, conclusions are drawn about the time suggested to plan for the available resources and about their distribution. This work aims to show that these types of simulations may help in decision making to improve the service.
By understanding and identifying the patterns, in the future we expect to be able to use the simulation to identify the parameters of the system that will optimize the use of resources in the PCS while complying with the performance indicators.
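The core of such a simulation is small: Poisson arrivals of appointment requests, a fixed daily physician capacity, and the wait of each request until it is served. The sketch below is a Python illustration under assumed parameters (the study used R, and its constraints and indicators are richer than this):

```python
# Illustrative sketch of the scheduling simulation: Poisson request
# arrivals, fixed daily capacity, FIFO assignment, wait in days.
# Parameters are invented for the example.
import math
import random

def poisson(lam, rng):
    """Knuth's method for drawing a Poisson-distributed count."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def schedule(arrivals_per_day, capacity):
    """FIFO scheduling: returns the wait (in days) of each served request."""
    backlog, waits = [], []
    for day, arrivals in enumerate(arrivals_per_day):
        backlog.extend([day] * arrivals)
        for _ in range(min(capacity, len(backlog))):
            waits.append(day - backlog.pop(0))
    return waits

def simulate(days, lam, capacity, seed=0):
    rng = random.Random(seed)
    return schedule([poisson(lam, rng) for _ in range(days)], capacity)

# Deterministic check of the queue logic: 5 requests on day 0, 2 slots/day.
waits = schedule([5, 0, 0], capacity=2)  # -> [0, 0, 1, 1, 2]
```

Performance indicators such as the mean wait, or the fraction of waits exceeding the legal maximum response time, can then be computed from `waits` and compared across resource allocations.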
Authors: Diana Lucas, Ausenda Machado, Baltazar Nunes, Vera Afreixo, Patrícia Soares
Monitoring vaccine effectiveness (VE) in real-world settings is essential for COVID-19 vaccination, as confounding factors can affect the precision of results, making it important to control for these variables. However, traditional regression methods may struggle in scenarios with low event rates and many confounders, such as estimating VE against hospitalisation. The effectiveness of the COVID-19 XBB.1.5 monovalent vaccine against hospitalisation was assessed in mainland Portugal between October and November 2023. Two methods were used to adjust for confounders: stratified Cox regression and Cox regression with Inverse Probability Weighting (IPW). A historical cohort of 2.2 million individuals was analysed, with 41.7% receiving the vaccine. VE estimates against hospitalisation were higher using IPW, reaching 60% for ages 65-79 and 78% for those ≥80. Results suggest IPW provides more precise estimates, especially with low event frequencies and numerous confounders, though further research is needed.
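The weighting idea behind IPW can be illustrated in a few lines. The Python toy below shows only the weighting and a weighted risk comparison with invented numbers; the study itself combined these weights with Cox regression, which is not reproduced here:

```python
# Illustrative IPW sketch: weight each individual by the inverse
# probability of the vaccination status they actually had, then
# compare weighted hospitalisation risks. Toy cohort, invented values.

def ipw_weights(vaccinated, propensity):
    """1/p for the vaccinated, 1/(1-p) for the unvaccinated."""
    return [1.0 / p if v else 1.0 / (1.0 - p)
            for v, p in zip(vaccinated, propensity)]

def weighted_risk(outcomes, weights):
    return sum(w * y for w, y in zip(weights, outcomes)) / sum(weights)

# Toy cohort: vaccination status, propensity score, hospitalisation (0/1).
vaccinated = [1, 1, 1, 1, 0, 0, 0, 0]
propensity = [0.5] * 8          # no confounding in this toy example
outcome    = [0, 0, 0, 1, 1, 1, 0, 1]

w = ipw_weights(vaccinated, propensity)
risk_v = weighted_risk([y for v, y in zip(vaccinated, outcome) if v],
                       [wi for v, wi in zip(vaccinated, w) if v])
risk_u = weighted_risk([y for v, y in zip(vaccinated, outcome) if not v],
                       [wi for v, wi in zip(vaccinated, w) if not v])
ve = 1.0 - risk_v / risk_u      # toy VE as one minus the risk ratio
```

With real confounding, the propensity scores differ across individuals and the weights create a pseudo-population in which vaccination is independent of the measured confounders, which is what lets the weighted comparison estimate VE less biasedly than a crude one.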