6th International Conference on Signal, Image Processing (SIPO 2022)

January 29~30, 2022, Copenhagen, Denmark

Accepted Papers

Using Convolutional Neural Network to Enhance Heart Diseases Predictions in South African Men Living in the Western Cape Region

Elias Tabane, University of South Africa, South Africa

ABSTRACT

The medical analysis domain is frequently cited as a valuable source of rich data that nevertheless yields poor insight. Heart disease is among the leading causes of mortality worldwide. Early detection of heart disease may help reduce mortality rates, but the main challenge lies in the difficulty and vastness of this data when making predictions with conventional methods. The aim of this research is to use historical medical data to predict coronary heart disease (CHD) using machine learning algorithms. The scope of this paper is restricted to the logistic regression algorithm. Using the South African Heart Disease dataset of 462 samples, the model is built in Python with a 10-fold cross-validation assessment methodology.

KEYWORDS

Coronary Heart Disease, Support Vector Machine, Random Forest, Logistic Regression, Convolutional Neural Network.
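
As a concrete illustration of the evaluation methodology described in the abstract above, here is a minimal Python sketch of 10-fold cross-validated logistic regression with scikit-learn; the file name and column names are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: 10-fold cross-validated logistic regression for CHD prediction.
# Assumes a CSV with feature columns and a binary "chd" target column; the file
# name and column layout are illustrative, not from the paper.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("saheart.csv")          # the SA Heart dataset has 462 samples
X, y = df.drop(columns=["chd"]), df["chd"]

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```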


Brain Oscillatory Representations of Vibrotactile Parameters: an EEG Study

Xinrong WANG, Xinyue FANG, Xinyi ZHENG, Xuemei DENG, Yong LI, and Mei WANG, Department of Mechanical Engineering, Sichuan University, Chengdu, China

ABSTRACT

In this work, we explore brain oscillatory representations triggered by vibrotactile parameters. We used amplitude-modulated vibrotactile stimuli with four envelope frequencies and four amplitudes. Brain oscillations in five frequency bands and ten channels were measured with a noninvasive EEG technique and represented as power spectral densities (PSD). Results showed that when the envelope frequency (Fe) = 20 Hz or 40 Hz, for all four amplitudes, brain oscillations over the ipsilateral sensorimotor cortex decreased across the frequency bands; additionally, a slight decrease of activation in the contralateral frontal regions was found when the amplitude (A) = 0.8 Grms. When A = 0.8 Grms, the overall variation tendency of the spectrum was similar to that described above. In terms of oscillatory differences, the activated regions were the bilateral sensorimotor cortex; additionally, an increase of oscillations in the frontal and parietal regions was found when Fe = 20 Hz and Fe = 40 Hz. These results contribute to a better physiological understanding of physical vibrotactile parameters.

KEYWORDS

Vibrotactile Stimuli, Brain Oscillations, Haptic Feedback.
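
For readers unfamiliar with the PSD representation used in the abstract above, here is a minimal Python sketch of band-wise power extraction from one EEG channel with SciPy's Welch estimator; the sampling rate and band edges are generic assumptions, not values from the paper.

```python
# Minimal sketch: per-band power from one EEG channel via Welch's PSD estimate.
# Sampling rate and band definitions are illustrative assumptions.
import numpy as np
from scipy.signal import welch

fs = 250                                   # assumed sampling rate (Hz)
eeg = np.random.randn(10 * fs)             # stand-in for one channel's samples

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
df = freqs[1] - freqs[0]
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = psd[mask].sum() * df           # integrate PSD over the band
    print(f"{name}: {power:.4f}")
```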


Fixed-Point Code Synthesis for Neural Networks

Hanane Benmaghnia1, Matthieu Martel1,2, and Yassamine Seladji3, 1University of Perpignan Via Domitia, Perpignan, France, 2Numalis, Cap Omega, Rond-point Benjamin Franklin 34960 Montpellier, France, 3University of Tlemcen Aboubekr Belkaid, Tlemcen, Algeria

ABSTRACT

Over the last few years, neural networks have started penetrating safety-critical systems to make decisions in robots, rockets, autonomous cars, etc. A problem is that these critical systems often have limited computing resources. Often, they use fixed-point arithmetic for its many advantages (speed, compatibility with small memory devices). In this article, a new technique is introduced to tune the formats (precision) of already trained neural networks using fixed-point arithmetic, which can be implemented using integer operations only. The new optimized neural network computes the output with fixed-point numbers while keeping the accuracy within a threshold fixed by the user. A fixed-point code is synthesized for the new optimized neural network, ensuring that the threshold is respected for any input vector belonging to the range [xmin, xmax] determined during the analysis. From a technical point of view, we perform a preliminary analysis of our floating-point neural network to determine the worst cases, then we generate a system of linear constraints among integer variables that we can solve by linear programming. The solution of this system is the new fixed-point format of each neuron. The experimental results obtained show the efficiency of our method, which can ensure that the new fixed-point neural network has the same behavior as the initial floating-point neural network.

KEYWORDS

Computer Arithmetic, Code Synthesis, Formal Methods, Linear Programming, Numerical Accuracy, Static Analysis.
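
To make the fixed-point idea in the abstract above concrete, here is a minimal Python sketch of quantizing a neuron's weights and inputs to a Q-format and computing the dot product with integer arithmetic only; the fractional bit-width is hard-coded here, whereas the paper derives per-neuron formats by linear programming.

```python
# Minimal sketch: fixed-point (Q-format) quantization of one neuron's dot product.
# The fractional bit-width is a fixed assumption here; the paper instead derives
# per-neuron formats by solving a system of linear constraints.
import numpy as np

FRAC_BITS = 12                     # assumed fractional bits (Q-format)
SCALE = 1 << FRAC_BITS

def to_fixed(x):
    return np.round(np.asarray(x) * SCALE).astype(np.int64)

w = [0.75, -1.25, 0.5]
x = [1.0, 2.0, -0.5]

acc = int(np.dot(to_fixed(w), to_fixed(x)))   # integer operations only
y_fixed = acc / (SCALE * SCALE)               # product of two Q12 values is Q24
y_float = float(np.dot(w, x))
print(f"float={y_float:.6f} fixed={y_fixed:.6f} error={abs(y_float - y_fixed):.2e}")
```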


Detection Datasets: Forged Characters for Passport and Driving Licence

Teerath Kumar1, Muhammad Turab2, Shahnawaz Talpur2, Rob Brennan1 and Malika Bendechache1, 1CRT AI and ADAPT, School of Computing, Dublin City University, Ireland, 2Department of Computer Systems Engineering, Mehran University of Engineering and Technology, Jamshoro, Pakistan

ABSTRACT

Forged character detection in personal documents such as passports or driving licences is an extremely important and challenging task in digital image forensics, as forged information on personal documents can be used for fraud, e.g., theft or robbery. For any detection task, such as forged character detection, deep learning models are data hungry, and obtaining a forged-character dataset for personal documents is very difficult for several reasons, including information privacy, unlabeled data, and the fact that existing work is evaluated on private datasets with limited access; getting data labelled is another big challenge. To address these issues, we propose a new algorithm that generates two new datasets, named forged characters detection on passport (FCD-P) and forged characters detection on driving licence (FCD-D). To the best of our knowledge, we are the first to release these datasets. The proposed algorithm first reads a plain image, then performs forging operations, i.e., it randomly changes the position of a random character or randomly adds slight noise. At the same time, the algorithm records the bounding boxes of the forged characters. To reflect real-world situations, we apply multiple data augmentations to the cards very carefully. Overall, each dataset consists of 15000 images, each of size 950 x 550 pixels. Our algorithm code, FCD-P, and FCD-D are publicly available.

KEYWORDS

Character detection dataset, Deep learning forgery, Forged character detection.
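
A minimal Python sketch of the forge-and-annotate step described in the abstract above, assuming a known character bounding box on the card image; the coordinates, shift range, and noise level are illustrative assumptions.

```python
# Minimal sketch: forge one character by shifting its patch and adding light
# noise, then record its bounding box. The character's source box, shift range,
# and noise level are assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
card = rng.integers(0, 256, size=(550, 950, 3), dtype=np.uint8)  # stand-in card

x, y, w, h = 100, 200, 20, 30          # assumed box of the character to forge
patch = card[y:y + h, x:x + w].copy()

dx, dy = rng.integers(-5, 6, size=2)   # small random displacement
nx, ny = x + int(dx), y + int(dy)
noise = rng.normal(0, 8, patch.shape)  # light additive noise
card[ny:ny + h, nx:nx + w] = np.clip(patch + noise, 0, 255).astype(np.uint8)

annotation = {"bbox": [nx, ny, w, h], "label": "forged"}
print(annotation)
```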


Stride Random Erasing Augmentation

Teerath Kumar, Rob Brennan and Malika Bendechache, CRT AI and ADAPT, School of Computing, Dublin City University, Ireland

ABSTRACT

This paper presents a new method for data augmentation called Stride Random Erasing Augmentation (SREA) to improve classification performance. In SREA, probability-based strides of one image are pasted onto another image, and the labels of both images are mixed with the same probability as the image mixing, to generate a new augmented image and augmented label. Stride augmentation overcomes limitations of the popular random erasing data augmentation method, where a random portion of an image is erased with 0, 255, or the mean of the dataset without considering the location of the important feature(s) within the image. A variety of experiments have been performed using different network flavours and popular datasets, including fashion-MNIST, CIFAR10, CIFAR100, and STL10. The experiments showed that SREA generalizes better than both the baseline and the random erasing method. Furthermore, the effect of stride size in SREA was investigated by performing experiments with different stride sizes; a random stride size showed the best performance. SREA outperforms the baseline and random erasing, especially on the fashion-MNIST dataset. To enable reuse, reproduction, and extension of SREA, the source code is provided in a public git repository.

KEYWORDS

Data Augmentation, Image Classification, Erasing Augmentation.
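
A minimal Python sketch of the stride-mixing operation described in the abstract above, assuming vertical strides of fixed width and one-hot labels; the stride geometry and paste probability are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch: Stride Random Erasing Augmentation on a pair of images.
# Vertical strides of img_b are pasted onto img_a with probability p, and the
# labels are mixed in the same proportion. Stride width and p are assumptions.
import numpy as np

def srea(img_a, img_b, label_a, label_b, stride=4, p=0.3, rng=None):
    rng = rng or np.random.default_rng()
    out = img_a.copy()
    pasted = 0
    n_strides = img_a.shape[1] // stride
    for i in range(n_strides):
        if rng.random() < p:               # paste this stride from img_b
            out[:, i * stride:(i + 1) * stride] = img_b[:, i * stride:(i + 1) * stride]
            pasted += 1
    lam = pasted / n_strides               # fraction of img_b content
    label = (1 - lam) * label_a + lam * label_b
    return out, label

a, b = np.zeros((32, 32)), np.ones((32, 32))
img, lbl = srea(a, b, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
print(lbl)
```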


Research on Dual Channel News Headline Classification Based on ERNIE Pre-training Model

Junjie Li and Hui Cao, Key Laboratory of China's Ethnic Languages and Information Technology of Ministry of Education, Northwest Minzu University, Lanzhou, China

ABSTRACT

The quality and style of news headlines in the information age directly determine the click-through rate and popularity of their news content, and have a significant impact on how the core of the news is conveyed and how public opinion is guided. The proposed method uses ERNIE to extract low-level lexical, semantic, and contextual feature information from the text and generate dynamic word-vector representations that integrate context. A BiLSTM-AT network channel then extracts global features from the contextual temporal information a second time, with an attention mechanism assigning higher weight to the key parts, while a DPCNN channel extracts long-distance text dependencies to obtain deep local features. The dual-channel feature vectors are concatenated, passed into a fully connected layer, and the final classification result is output through Softmax. The experimental results show that this classification method improves the accuracy of news headline classification compared with the multiple comparison models set up in the experiment.

KEYWORDS

Text Classification, ERNIE, Dual-Channel, BiLSTM, Attention, DPCNN.
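
A minimal PyTorch sketch of the dual-channel layout described in the abstract above, with a plain embedding layer standing in for ERNIE and a single convolutional block standing in for the full DPCNN; all dimensions are illustrative assumptions.

```python
# Minimal sketch of a dual-channel headline classifier. A plain embedding
# stands in for ERNIE, and one conv block stands in for the full DPCNN;
# all sizes are illustrative assumptions.
import torch
import torch.nn as nn

class DualChannel(nn.Module):
    def __init__(self, vocab=10000, emb=128, hidden=64, classes=10):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)            # stand-in for ERNIE
        self.bilstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)            # attention scores
        self.conv = nn.Conv1d(emb, hidden, kernel_size=3, padding=1)
        self.fc = nn.Linear(2 * hidden + hidden, classes)

    def forward(self, ids):                            # ids: (batch, seq)
        e = self.emb(ids)
        h, _ = self.bilstm(e)                          # (batch, seq, 2*hidden)
        w = torch.softmax(self.att(h), dim=1)          # weights over positions
        global_feat = (w * h).sum(dim=1)               # attention-pooled global
        local_feat = self.conv(e.transpose(1, 2)).amax(dim=2)  # max over time
        return self.fc(torch.cat([global_feat, local_feat], dim=1))

logits = DualChannel()(torch.randint(0, 10000, (2, 20)))
print(logits.shape)  # torch.Size([2, 10])
```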


Advanced Service Data Provisioning in RoF-based Mobile Backhauls/Fronthauls

Mikhail E. Belkin, Leonid Zhukov and Alexander S. Sigov, Russian Technological University MIREA, Moscow, Russia

ABSTRACT

A new cost-efficient concept for real-time monitoring of quality-of-service metrics and other service data in 5G-and-beyond access networks is proposed and discussed. It uses a separate return channel based on a vertical-cavity surface-emitting laser in the optically injection-locked mode, which simultaneously operates as an optical transmitter and as a resonant-cavity enhanced photodetector. The feasibility and efficiency of the proposed approach are confirmed by a proof-of-concept experiment in which a high-speed digital signal with multi-position quadrature amplitude modulation of a radio-frequency carrier is optically transceived.

KEYWORDS

5G and beyond, access network, RoF-based mobile fronthaul/backhaul, real-time monitoring, QoS metrics, OIL-VCSEL.


An Automated Quantitative Information Flow Analysis for Concurrent Programs

Khayyam Salehi1, Ali A. Noroozi2, Sepehr Amir-Mohammadian3 and Jaber Karimpour2, 1Shahrekord University, Shahrekord, Iran, 2University of Tabriz, Tabriz, Iran, 3University of the Pacific, Stockton, CA, USA

ABSTRACT

Quantitative information flow is a rigorous approach for evaluating the security of a system, including network protocols. It is used to quantify the amount of secret information leaked to the public outputs. In this paper, we propose an automated approach for quantitative information flow analysis of concurrent programs, which are widely used to implement network and communication systems. Markovian processes are used to model the behavior of these programs. To this end, we assume that the attacker is capable of observing the internal behavior of the program and propose an equivalence relation, back-bisimulation, to capture the attacker's view of the program behavior. A partition refinement algorithm is developed to construct the back-bisimulation quotient of the program model, and a quantification method is then proposed for computing the information leakage using the quotient. We have implemented our proposed approach as a leakage quantification tool, called PRISM-Leak, built on the probabilistic model checker PRISM. Finally, an anonymity protocol, dining cryptographers, is analyzed as a case study to show the applicability and scalability of the proposed approach.

KEYWORDS

Information leakage, Network protocol security, Quantitative information flow, Confidentiality, PRISM-Leak.
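
To make the quantification step in the abstract above concrete, here is a minimal Python sketch of computing Shannon leakage (mutual information between secret and observation) from a discrete channel matrix; the toy prior and channel are illustrative and unrelated to the PRISM-Leak internals.

```python
# Minimal sketch: Shannon leakage = I(secret; observation) for a toy channel.
# The prior and channel matrix are illustrative; PRISM-Leak computes leakage
# on the back-bisimulation quotient of a Markovian program model.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

prior = np.array([0.5, 0.5])                 # P(secret)
channel = np.array([[0.9, 0.1],              # P(obs | secret)
                    [0.2, 0.8]])

joint = prior[:, None] * channel             # P(secret, obs)
p_obs = joint.sum(axis=0)
cond = sum(p_obs[o] * entropy(joint[:, o] / p_obs[o]) for o in range(len(p_obs)))
leakage = entropy(prior) - cond              # initial minus remaining uncertainty
print(f"leakage = {leakage:.4f} bits")
```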


A Functional Review of the Most Common Opportunistic Routing Protocols

Swati Sharma, Amita Dev, and Arun Sharma, Indira Gandhi Delhi Technical University for Women, Delhi, India

ABSTRACT

Opportunistic networks are on-demand wireless networks that do not incur routing overheads until there is a packet to send from a source to a destination node. The intermediary nodes on the network operate as forwarding nodes, thereby assisting in the process. The selection of these forwarding nodes, the coordination between them, and the utilization of critical resources in these processes determine the basic operation of an opportunistic routing protocol. The performance of these protocols is assessed based on the utilization of key network parameters, or metrics. This paper provides a functional review of the operation of the most common opportunistic routing protocols. It compares these protocols in terms of their operation and critical-resource utilization. It also summarizes the metrics used and the challenges encountered in opportunistic routing protocols. Most of the protocols discussed here are used as benchmarks for routing in opportunistic networks.

KEYWORDS

Opportunistic Networks, Opportunistic Routing Protocols, Forwarding Set, Routing Metrics.


Sentiment Analysis of Cybersecurity Topics in Twitter and Reddit

Bipun Thapa, College of Business, Innovation, Leadership, and Technology, Marymount University, USA

ABSTRACT

Sentiment analysis provides an opportunity to understand a subject, especially in the digital age, due to an abundance of public data points and effective algorithms. Cybersecurity is a subject where opinions are plentiful and differ widely in the public domain. This descriptive research aims to analyze cybersecurity topics on Twitter and Reddit and provide a breakdown of their sentiment. To conduct this research, data is collected from Twitter, Reddit, and an online survey and analyzed through an NLP (Natural Language Processing) algorithm. The results produce mostly positive and neutral sentiments based on the data collected. In addition, the NLP algorithm is evaluated for its accuracy against human classification. The goal of the research is to capture a snapshot of public sentiment on social media platforms over a given timeframe and potentially enable explanatory research into the variables that influence polarity.

KEYWORDS

NLTK, NLP, VADER, Sentiment Analysis, API, Python, Polarity, Evaluation Metrics.
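
A minimal Python sketch of the VADER scoring step named in the keywords above, using NLTK; the example texts and the compound-score cutoffs are illustrative conventions, not details from the paper.

```python
# Minimal sketch: VADER polarity scoring with NLTK on illustrative texts.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

for text in ["Zero-trust rollouts are finally paying off.",
             "Another breach, another round of stolen credentials."]:
    scores = sia.polarity_scores(text)       # neg/neu/pos plus compound in [-1, 1]
    label = ("positive" if scores["compound"] >= 0.05
             else "negative" if scores["compound"] <= -0.05 else "neutral")
    print(label, scores)
```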


A New Approach for Speech Keyword Detection in Noisy Environments

Peiwen Ye and Hancong Duan, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan, China

ABSTRACT

Keyword spotting, a.k.a. keyword recognition, has been widely used in products such as mobile devices and smart homes. By detecting wake-up keywords in a continuous voice stream, the corresponding program can be started intelligently. Neural networks have dominated keyword spotting recently. Meanwhile, attention mechanisms have been widely used in natural language processing and have achieved great performance improvements. However, few researchers have addressed the noise issue in noisy-speech keyword recognition. Thus, we propose a new neural network architecture combined with an attention mechanism, which employs a backbone network of LSTM and CNN layers to emphasize significant information in the keyword. We use separable convolution instead of ordinary convolution to reduce the number of parameters of the model. We train our model on the Google Speech Commands dataset to compare recognition accuracy. Besides, to improve the performance of the model in noisy environments, we apply data augmentation methods such as time masking and frequency masking to this task, and add noise from various scenes with different signal-to-noise ratios to the training data. The experimental results show that our model achieves an accuracy of up to 94.93%, exceeding existing models in noisy environments.

KEYWORDS

Keyword Spotting, Noise robustness, Command Recognition, Attention Mechanism, Data augmentation.
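
A minimal Python sketch of the time- and frequency-masking augmentation mentioned in the abstract above, applied to a log-mel spectrogram with NumPy; the spectrogram shape and maximum mask widths are illustrative assumptions.

```python
# Minimal sketch: SpecAugment-style time and frequency masking on a spectrogram.
# Spectrogram shape and maximum mask widths are illustrative assumptions.
import numpy as np

def mask_spectrogram(spec, max_f=8, max_t=10, rng=None):
    rng = rng or np.random.default_rng()
    out = spec.copy()
    n_mels, n_frames = out.shape
    f = rng.integers(0, max_f + 1)           # frequency mask width
    f0 = rng.integers(0, n_mels - f + 1)
    out[f0:f0 + f, :] = 0.0                  # zero a band of mel bins
    t = rng.integers(0, max_t + 1)           # time mask width
    t0 = rng.integers(0, n_frames - t + 1)
    out[:, t0:t0 + t] = 0.0                  # zero a span of frames
    return out

spec = np.random.rand(40, 100)               # stand-in log-mel spectrogram
print(mask_spectrogram(spec).shape)
```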


An Educational Electronic Health Record with a Configurable User Interface

Yichun Zhao, Florianë Shala and Evangeline Wagner, University of Victoria, Canada

ABSTRACT

Background: Proper educational training and support are proven to be major components of EHR implementation and use. However, the majority of health providers are not sufficiently trained in EHR use, leading to adverse events, errors, and decreased quality of care. In response to this, students studying Health Information Science, Public Health, Nursing, and Medicine should all gain a thorough understanding of EHR use at different levels for different purposes. The design of a usable and safe EHR system that accommodates the needs and workflows of different users, user groups, and disciplines is required for EHR learning to be efficient and effective. Objectives: This project builds several artifacts which seek to address both the educational and usability aspects of an educational EHR. The artifacts proposed are models for, and examples of, such an EHR with a configurable UI to be learnt by students who need a background in EHR use during their degrees. Methods: Review literature and gather professional opinions from domain experts on usability, the use of workflow patterns, UI configurability and design, and the educational aspect of EHR use. Conduct interviews in a semi-casual virtual setting with open discussion in order to gain a deeper understanding of the principal aspects of EHR use in educational settings. Select a specific task and user group to illustrate how the proposed solution will function based on the current research. Develop three artifacts based on the available research, professional opinions, and prior knowledge of the topic. The artifacts capture the user task and the user's interactions with the EHR for learning. The first, generic model provides a general understanding of the EHR system process. The second model is a specific example of performing the task of MRI ordering with a configurable UI. The third artifact includes UI mock-ups showcasing the models in a practical and visual way. Significance: Due to the lack of educational EHRs, medical professionals do not receive sufficient EHR training. Implementing an educational EHR with a usable and configurable interface to suit the needs of different user groups and disciplines will help facilitate EHR learning and training, and ultimately improve the quality of patient care.

KEYWORDS

Education, Usability, EHR.


Understanding of Comorbidities Using Modeling Techniques on EHR

ABSTRACT

Comorbidities refer to the existence of multiple, co-occurring diseases in a patient. Due to their co-occurrence, the course of one comorbidity is typically highly dependent on the course of the other, and treatments can have significant spill-over effects. Despite the high prevalence of comorbidities among patients, there is no complete statistical framework for modelling their longitudinal dynamics. In this paper, we propose a probabilistic approach for studying comorbidity dynamics in patients over time. We develop a coupled hidden Markov model (coupled HMM) with a personalized, non-homogeneous transition mechanism. Clinical research influenced the design of our coupled HMM: (1) it accounts for different disease stages (acute, stable) in disease progression by providing clinically meaningful latent phases; (2) it models a relationship between the trajectories of comorbidities, capturing their co-evolution; (3) the transition mechanism takes between-patient heterogeneity (e.g., risk factors, treatments) into account. Based on 675 health trajectories, we assessed our proposed coupled HMM by investigating the concomitant evolution of diabetes mellitus and chronic liver disease. We find that our coupled HMM provides a superior fit compared to competing models without coupling. We also assess the spill-over effect, that is, the extent to which diabetes treatments are linked to a shift in chronic liver disease from an acute to a stable state. As a result, our approach has immediate applications in both treatment planning and clinical research in the context of comorbidities. CCS Concepts: Applied computing → Health informatics; Mathematics of computing → Bayesian computation; Markov processes.

KEYWORDS

Longitudinal data analysis, Disease dynamics, Comorbidity, Hidden Markov model, Bayesian analysis.
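
A minimal Python simulation sketch of the coupling idea described in the abstract above, where each disease's transition between an acute and a stable phase depends on the other disease's current phase; all probabilities are illustrative, not estimates from the paper.

```python
# Minimal sketch: two coupled two-state (acute=0, stable=1) Markov chains.
# Each chain's transition matrix depends on the other chain's current state.
# All probabilities are illustrative, not estimates from the paper.
import numpy as np

rng = np.random.default_rng(1)

# trans[other_state][own_state] -> P(next own state)
trans = {0: np.array([[0.7, 0.3], [0.4, 0.6]]),   # other disease acute
         1: np.array([[0.5, 0.5], [0.1, 0.9]])}   # other disease stable

diabetes, liver = 0, 0                            # both start in the acute phase
trajectory = [(diabetes, liver)]
for _ in range(10):
    diabetes_next = rng.choice(2, p=trans[liver][diabetes])
    liver_next = rng.choice(2, p=trans[diabetes][liver])
    diabetes, liver = diabetes_next, liver_next
    trajectory.append((diabetes, liver))
print(trajectory)
```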


Bigdata Platform to Accelerate Insights Generation from Healthcare Data

Arun Sundararaman, Healthcare Data and Analytics Practice Head, Accenture Technology, Chennai, India

ABSTRACT

The healthcare industry is abundant with data, yet poor in insights. The gaps are primarily due to the highly unstructured data prevalent in medical notes, clinical records, lab results, etc. This challenge is compounded by a lack of integration of data sources: the data required for effective medical outcomes is spread across four large islands, viz. EMRs, health administrators, personal health records (including wearables, lifestyle data, and implanted medical devices), and public health data. Integrating such disparate sources and structures and making meaningful use of healthcare data continues to be time-consuming and costly. The industry cannot afford longer timelines and excessive costs in the form of multiple proprietary environments for integrating such datasets. The emergence of data platforms on the cloud is expected to address these challenges. This paper discusses the need for a healthcare data platform to accelerate insight generation and presents a patented cloud health data platform solution that does so. It discusses in detail the architecture and design principles, technical details, and the benefits and value seen in implementation experiences, besides outlining future directions.

KEYWORDS

Data Supply chain, accelerated insights, Healthcare Data Platform, modular design, Big Data.


Discovery of Association Rules of the Relationship between Food Consumption and Lifestyle Diseases from the Swiss Nutrition (menuCH) Dataset & Multiple Swiss Health Datasets from 1992 to 2012

Timo Lustenberger1, Helena Jenzer2 and Farshideh Einsele1, 1Section of Business Information, Bern University of Applied Sciences, Switzerland, 2Hospital of Psychiatry, University of Zurich, Switzerland

ABSTRACT

This article demonstrates that using data mining methods such as Weighted Association Rule Mining (WARM) on an integrated Swiss database derived from a Swiss national dietary survey (menuCH) and 25 years of Swiss demographic and health data is a powerful way to determine whether a specific population subgroup is at particular risk of developing a lifestyle disease based on its food consumption patterns. The objective of the study was to discover critical food consumption patterns linked with lifestyle diseases known to be strongly tied to food consumption. Food consumption databases from the Swiss national survey menuCH were gathered along with data from large surveys of demographics and health collected over 25 years from the Swiss population by the Swiss Federal Office of Public Health (FOPH). These databases were integrated and reported in a previous study as a single integrated database. WARM was applied to this integrated database, and a set of promising rules with their corresponding interpretations was generated. As an example, the rules found show that the consumption of alcohol in small quantities does not have a negative impact on health, whereas the consumption of vegetables is important for the supply of B-group vitamins, which help the energy metabolism provide energy. These vitamins are particularly lacking in alcoholics and should then be taken as supplements. Another finding is that dietary supplements have little effect, especially for diabetes. Applying the WARM algorithm was beneficial for this study, since no interesting rules were pruned out early and the significance of the rules could be greatly increased compared to a previous study using the pure Apriori algorithm.

KEYWORDS

Data Mining, WARM Association Analysis, Diet & Chronic Diseases, Health Informatics.
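
A minimal Python sketch of the weighted-support computation at the heart of WARM, where each item carries a weight and an itemset's weighted support scales its transaction frequency by the mean item weight; the transactions, weights, and the exact weighting scheme are illustrative assumptions, not the paper's data.

```python
# Minimal sketch: weighted support of an itemset, the core quantity in WARM.
# Transactions, item weights, and the weighting scheme are assumptions.
transactions = [{"alcohol", "vegetables"}, {"vegetables", "supplements"},
                {"alcohol"}, {"vegetables", "alcohol", "supplements"}]
weights = {"alcohol": 0.9, "vegetables": 0.6, "supplements": 0.3}

def weighted_support(itemset, transactions, weights):
    freq = sum(itemset <= t for t in transactions) / len(transactions)
    mean_w = sum(weights[i] for i in itemset) / len(itemset)
    return mean_w * freq                     # weight-scaled support

for itemset in [{"alcohol"}, {"alcohol", "vegetables"}]:
    print(itemset, round(weighted_support(itemset, transactions, weights), 3))
```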


Encryption Based Watermarking Technique for Security of Medical Image

Abderrahmane Daham1 and Mohamed Ouslim2, 1Department of Computer Engineering, Bechar University (UTMB), Algeria, 2University of Science and Technology of Oran (USTO), Algeria

ABSTRACT

This paper proposes an encryption-based image watermarking scheme for medical images using adaptive quantization of wavelet coefficients and a cryptosystem based on chaotic encryption of the Singular Value Decomposition (SVD). In order to increase the robustness of the algorithm and provide extra security, an improved SVD-CHAOS embedding and extraction procedure is used to scramble the watermark logo in the preprocessing step of the proposed method. In the watermark embedding process, an R-level discrete wavelet transform is applied to the host image. The high-frequency wavelet coefficients are selected to carry the scrambled watermarks using adaptive quantization low-bit modulation (LBM). The proposed image watermarking method withstands a wide range of attacks and extracts the concealed watermark without significant degradation in image quality, as shown when the Peak Signal to Noise Ratio (PSNR) and Normalized Correlation (NC) performance of the proposed algorithm is compared with that of other related techniques.

KEYWORDS

Watermarking, medical image, discrete wavelet transform, singular value decomposition, quantization, chaos cryptosystem.
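
A minimal Python sketch of the chaos-based scrambling step described in the abstract above, using a logistic map to derive a key-dependent pixel permutation for the watermark; the map parameters and watermark size are illustrative assumptions, and the DWT/SVD embedding itself is omitted.

```python
# Minimal sketch: logistic-map chaotic scrambling of a watermark image.
# Map parameter, seed, and watermark size are illustrative; the DWT/SVD
# embedding described in the paper is not shown here.
import numpy as np

def logistic_permutation(n, x0=0.3567, r=3.99):
    x, seq = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)                  # chaotic logistic map iteration
        seq[i] = x
    return np.argsort(seq)                   # key-dependent permutation

wm = np.arange(64 * 64).reshape(64, 64)      # stand-in watermark
perm = logistic_permutation(wm.size)
scrambled = wm.ravel()[perm].reshape(wm.shape)

inverse = np.argsort(perm)                   # descrambling with the same key
restored = scrambled.ravel()[inverse].reshape(wm.shape)
assert (restored == wm).all()
```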


Improving Forecasting Demand for Maintenance Spare Parts: Case Study of Power Utility

Abdulaziz A. Afandi, College of Engineering, Islamic University of Medina, Medina, Saudi Arabia

ABSTRACT

The need to reduce inventory holding costs and increase system operational availability is the main motivation behind improving spare parts inventory management in a major power utility company in Medina. This paper reports on an effort to optimize the order quantities of spare parts by improving the method of forecasting demand. The study focuses on equipment that has frequent spare-part purchase orders with uncertain demand. The demand follows a lumpy pattern, which makes conventional forecasting methods less effective. A comparison was made by benchmarking various forecasting methods against experts' criteria to select the most suitable method for the case study. Three actual data sets were used to make the forecasts in this case study. Two neural network (NN) approaches were utilized and compared, namely long short-term memory (LSTM) and the multilayer perceptron (MLP). The results, as expected, showed that the NN models gave better results than the traditional (judgmental) forecasting method. In addition, the LSTM model had higher predictive accuracy than the MLP model.

KEYWORDS

ANN, long short-term memory, multilayer perceptron, forecasting demand, inventory management, spare parts.
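
A minimal Keras sketch of the LSTM demand-forecasting setup described in the abstract above, using a synthetic lumpy demand series and a sliding-window framing; the window length, layer sizes, and training settings are illustrative assumptions.

```python
# Minimal sketch: LSTM one-step demand forecasting on a synthetic lumpy series.
# Window length, layer sizes, and training settings are illustrative.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
# Lumpy demand: mostly zeros with occasional spikes.
demand = rng.poisson(0.3, 300) * rng.integers(1, 10, 300)

window = 12
X = np.array([demand[i:i + window] for i in range(len(demand) - window)])
y = demand[window:]
X = X[..., None].astype("float32")           # (samples, timesteps, features)

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(window, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
print("next-period forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```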


AI and Machine Learning Applications in Medicine

Prof. Dr. Mitat Uysal1, M. Ozan Uysal2 and R.A. Nurdanur Pehlivan3, 1Faculty of Software Engineering, Dogus University, Istanbul, Turkey, 2Appcent Ltd, Istanbul, Turkey, 3Faculty of Software Engineering, Istanbul, Turkey

ABSTRACT

In this study, the importance of and opportunities for using Artificial Intelligence and Machine Learning methods in medicine are discussed, and several application examples from this area are given. Machine Learning methods can provide more accurate diagnoses from medical images than human experts. A newly introduced metaheuristic optimization algorithm called MBO (migrating birds optimization) is used with an SVM for the classification of cancer tumors. The newly developed model gives very satisfactory results in this area.

KEYWORDS

AI, machine learning, MBO, SVM.
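
A minimal Python sketch of SVM-based tumor classification with a simple random-search loop standing in for the MBO metaheuristic; the actual migrating-birds neighbor-sharing scheme is not reproduced here, and the dataset and parameter ranges are illustrative assumptions.

```python
# Minimal sketch: SVM tumor classification with a random-search stand-in for
# the MBO metaheuristic used in the paper; the real MBO neighbor-sharing
# scheme is not reproduced here. Parameter ranges are illustrative.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)

best_score, best_params = -np.inf, None
for _ in range(20):                          # candidate solutions in the search
    C = 10 ** rng.uniform(-2, 3)
    gamma = 10 ** rng.uniform(-4, 1)
    clf = make_pipeline(StandardScaler(), SVC(C=C, gamma=gamma))
    score = cross_val_score(clf, X, y, cv=5).mean()
    if score > best_score:
        best_score, best_params = score, (C, gamma)

print(f"best CV accuracy {best_score:.3f} with C={best_params[0]:.3g}, "
      f"gamma={best_params[1]:.3g}")
```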