
Vol. 12 No. 7 JULY 2014 International Journal of Computer Science and Information Security

Publication JULY 2014, Volume 12 No. 7

.

Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

1. Paper 30061401: Logical Analysis of an Accelerated Secure Multicast Authentication Protocol (pp. 1-10)

Full Text: PDF

.

Ghada Elkabbany, Informatics Dept., Electronics Research Institute, Cairo, Egypt

Mohamed Rasslan, Informatics Dept., Electronics Research Institute, Cairo, Egypt

Heba Aslan, Informatics Dept., Electronics Research Institute, Cairo, Egypt

.

Abstract — Multicast authentication is a challenging problem: received packets must be verified without assuming the availability of the entire original stream, while resisting many types of attacks, such as pollution attacks. Many solutions proposed in the literature suffer from high communication and computation overheads; others are vulnerable to packet loss and pollution attacks. Recently, signcryption techniques have been used to provide multicast authentication. Signcryption has the advantage of achieving the basic goals of both encryption and signature schemes, but it cannot by itself resist packet loss. In previous work, we proposed a multicast authentication protocol based on signcryption techniques and an erasure code function to solve the packet loss problem. In this paper, we utilize a pipelining technique to reduce the computation overhead; pipelining is chosen because it suits the nature of the signcryption algorithm and reduces the computation time. Moreover, a verification of our protocol using BAN logic is performed, and the analysis shows that it achieves the goals of authentication without bugs or redundancies. A comparison of multicast authentication protocols is also carried out. The results show that the accelerated multicast authentication protocol resists packet loss and pollution attacks with low computation and communication overheads, and it could therefore be used in real-time applications.
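To illustrate the erasure-code idea the abstract relies on for packet loss (this is a generic single-parity sketch, not the authors' actual protocol or signcryption scheme), any one lost packet can be rebuilt from the survivors plus an XOR parity packet:

```python
# Minimal XOR-parity erasure code: one parity packet lets the receiver
# rebuild any single lost packet from the surviving ones.
def make_parity(packets):
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return parity

def recover(received, parity):
    # XOR-ing all surviving packets with the parity reproduces the
    # single missing packet.
    missing = parity
    for p in received:
        missing = bytes(a ^ b for a, b in zip(missing, p))
    return missing

packets = [b"pkt0", b"pkt1", b"pkt2"]
parity = make_parity(packets)
lost = packets.pop(1)              # simulate losing packet 1
assert recover(packets, parity) == lost
```

Real protocols use stronger codes (e.g. Reed-Solomon) that tolerate multiple losses, but the recovery principle is the same.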

.

Keywords - Multicast Communication, Authentication, Signcryption, Erasure Code Functions, Parallel Pipelined, BAN Logic.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

2. Paper 30061403: Image Zooming using Sinusoidal Transforms like Hartley, DFT, DCT, DST and Real Fourier Transform (pp. 11-16)

Full Text: PDF

.

Dr. H. B. Kekre, Senior Professor Computer Engineering Department MPSTME, NMIMS University, Vile Parle, Mumbai, India,

Dr. Tanuja Sarode, Associate Professor, Computer Department, Thadomal Shahani Engg. College, Bandra, Mumbai 50, India

Shachi Natu, Ph.D. Research Scholar, Computer Engineering Department MPSTME, NMIMS University, Vile Parle, Mumbai, India

.

Abstract — A simple method of resizing an image is proposed, using the relation between sampling frequency and zero padding in the frequency and time domains of the Fourier transform. Padding zeroes in the frequency domain and then taking the inverse transform gives a zooming effect. Transforms such as the Fourier transform, Real Fourier transform, Hartley transform, DCT and DST are used, and their performance is compared; the Hartley transform is found to give the best performance, although DCT starts to perform better as the image size increases. The performance of all these transforms is also compared with another resizing technique, grid-based scaling, and transform-based resizing is observed to be better than grid-based resizing.
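The zero-padding idea can be sketched for the DFT case (a minimal NumPy illustration of the general principle, not the paper's exact implementation or its Hartley/DCT variants):

```python
import numpy as np

def zoom_dft(img, factor):
    """Upscale a grayscale image by zero-padding its centred 2-D spectrum."""
    h, w = img.shape
    H, W = h * factor, w * factor
    spec = np.fft.fftshift(np.fft.fft2(img))       # DC term at the centre
    padded = np.zeros((H, W), dtype=complex)
    top, left = (H - h) // 2, (W - w) // 2
    padded[top:top + h, left:left + w] = spec      # surround spectrum with zeros
    # Scale so mean intensity is preserved after the larger inverse transform.
    return np.real(np.fft.ifft2(np.fft.ifftshift(padded))) * factor ** 2

img = np.ones((4, 4))
big = zoom_dft(img, 2)
assert big.shape == (8, 8)
assert np.allclose(big, 1.0)       # a flat image stays flat after zooming
```

Zero-padding the spectrum is equivalent to band-limited (sinc) interpolation of the spatial samples, which is why it produces a smooth zoom.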

.

Keywords-Image zooming; DFT; Hartley Transform; Real Fourier Transform; DCT; DST

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

3. Paper 30061404: A Self-Training with Multiple CPUs Algorithm for Load Balancing using Time estimation (pp. 17-21)

Full Text: PDF

.

Aziz Alotaibi, Fahad Alswaina

Department of Computer Science, 221 University Ave, University of Bridgeport, Bridgeport, CT, USA

.

Abstract - In this paper, we propose a self-training algorithm that uses two new parameters, execution time and priority type, to improve load-balancing performance. Load balancing uses information such as CPU load, memory usage, and network traffic, extracted from previous executions, to increase resource utilization. We include the execution time of each property individually, such as CPU-bound and memory-bound work, to balance the work between nodes. Priority type is taken into account to enhance and expedite the processing of high-priority requests.
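A minimal sketch of the scheduling idea described here (the request names, the priority encoding, and the least-loaded-node rule are illustrative assumptions, not the paper's algorithm):

```python
import heapq

# Each node tracks its estimated finish time; each request carries an
# estimated execution time and a priority (0 = high). High-priority
# requests are placed first, each on the currently least-loaded node.
def schedule(requests, num_nodes):
    nodes = [(0.0, i) for i in range(num_nodes)]   # (finish_time, node_id)
    heapq.heapify(nodes)
    placement = {}
    for name, exec_time, priority in sorted(requests, key=lambda r: r[2]):
        finish, node = heapq.heappop(nodes)
        placement[name] = node
        heapq.heappush(nodes, (finish + exec_time, node))
    return placement

reqs = [("backup", 30.0, 1), ("query", 2.0, 0), ("report", 10.0, 1)]
plan = schedule(reqs, 2)
assert plan == {"query": 0, "backup": 1, "report": 0}
```

Using estimated execution time rather than instantaneous CPU load lets the balancer avoid piling long jobs onto a node that merely looks idle at the moment of dispatch.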

.

Keywords – Cloud Computing, Load Balancing, Resource allocation.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

4. Paper 30061405: Result-Oriented Approach for Websites Accessibility Evaluation (pp. 22-30)

Full Text: PDF

.

Marya Butt, School of Governance, Utrecht University, Utrecht, the Netherlands

.

Abstract — The paper devises a result-oriented approach for evaluating the accessibility of three Dutch government websites. Most research on website accessibility evaluation aims to benchmark organizations; this study instead aims to initiate learning in the selected government bodies (GB) so that they can improve website accessibility. The devised approach spans three phases and is tested in three government bodies of the Netherlands. In the first phase, website accessibility is evaluated for the selected government bodies. In the second phase, feedback is collected from the web developers of the selected bodies to disclose their knowledge and practices. The third phase focuses on measuring the utilization of the results. The evaluation is carried out against WCAG version 2.0 (level AA) using various online tools, e.g. TAW, CCA (Color Contrast Analyzer) and RIC (Readability Index Calculator), and a test case checking that each website is keyboard-operable. Test results show that the selected websites fail to adhere to WCAG 2.0. The developers' feedback revealed that although they are aware of the guidelines, clients do not want to compromise on other aspects, e.g. appearance and cost. The study initiated learning in all tested government bodies, which found the accessibility reports useful and showed perseverance in exploiting the research results to improve website accessibility.

.

Keywords - E-government, Websites Accessibility, Evaluation, Netherlands, WCAG 2.0

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

5. Paper 30061411: Performance Evaluation of Forward Difference Scheme on Huffman Algorithm to Compress and Decompress Data (pp. 31-36)

Full Text: PDF

.

Adamu Garba Mubi, Computer Science Department, Federal Polytechnic, Mubi, Adamawa State, Nigeria

Dr. P. B. Zirra, Computer Science Department, Federal University Kashere, Gombe State, Nigeria

.

Abstract - This research investigates how a forward difference technique can be applied to the Huffman algorithm to compress and decompress data without loss of information. The study measures the performance of the Huffman algorithm against forward difference on Huffman using compression ratio, compression factor and saving percentage. During encoding, the new algorithm reads the input file, serializes the distinct characters, determines the probability of each character, computes the forward difference of the positions of each character, computes the two's complement of the resulting differences, computes the new probabilities from the two's complement codes, determines the codeword for each distinct character, and finally determines the binary symbols to be transmitted. During decoding, the algorithm reads the encoded message bit by bit, determines each codeword in the coded message and the symbol it represents, and regenerates the two's complement codes from the new probabilities. The decimal equivalent of each two's complement code describes a delta difference, and a backward difference is used to recover the positions of each character, which are then used to reconstruct the whole message file. The results obtained reveal clearly that forward difference on Huffman performs better than Huffman alone.
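The forward-difference step on character positions, and its backward reconstruction at the decoder, can be sketched as follows (a simplified illustration of that one step, omitting the two's complement and Huffman stages of the full algorithm):

```python
# Forward differences over one character's occurrence positions, and the
# backward (cumulative-sum) reconstruction used when decoding.
def forward_diff(positions):
    return [positions[0]] + [b - a for a, b in zip(positions, positions[1:])]

def reconstruct(deltas):
    positions, total = [], 0
    for d in deltas:
        total += d
        positions.append(total)
    return positions

text = "banana"
pos_a = [i for i, c in enumerate(text) if c == "a"]   # [1, 3, 5]
deltas = forward_diff(pos_a)                          # [1, 2, 2]
assert reconstruct(deltas) == pos_a
```

The point of differencing is that repeated characters yield small, repetitive deltas, which skews the symbol distribution and lets the subsequent Huffman stage assign shorter codewords.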

.

Keyword: Data, Huffman algorithm, data compression

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

6. Paper 30061422: Wireless Sensor Networks Attacks and Solutions (pp. 37-40)

Full Text: PDF

.

Naser Alajmi, Computer Science and Engineering Department, University of Bridgeport, Bridgeport, CT 06604, USA

.

Abstract — A few years ago, wireless sensor networks (WSNs) were used only by the military. Today, many organizations use WSNs for purposes such as weather monitoring, pollution monitoring, traffic control, and healthcare. Security is becoming a major concern for wireless sensor networks. In this paper I focus on the main types of security attacks and their detection, analyze the security requirements and security attacks in wireless sensor networks, and indicate benchmarks for security in WSNs.

.

Keywords-Wireless sensor network, security, vulnerability, attacks

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

7. Paper 30061423: Enhancing the Accuracy of Biometric Feature Extraction Fusion Using Gabor Filter and Mahalanobis Distance Algorithm (pp. 41-48)

Full Text: PDF

.

Ayodeji S. Makinde, Yaw Nkansah-Gyekye, Loserian S. Laizer

School of Computational and Communication Science and Engineering, NM-AIST, Tanzania

.

Abstract - Biometric recognition systems have advanced significantly in the last decade, and their use in specific applications will increase in the near future. The ability to conduct meaningful comparisons and assessments will be crucial to successful deployment and increasing biometric adoption. Even the best unimodal biometric systems cannot fully address the need for a high recognition rate. Multimodal biometric systems can mitigate some of the limitations of unimodal systems, such as non-universality, lack of distinctiveness, non-acceptability, noisy sensor data, spoof attacks, and poor performance. More reliable recognition accuracy and performance are achievable when different modalities are combined and different algorithms or techniques are used. The work presented in this paper focuses on a bimodal biometric system using face and fingerprint. An image enhancement technique (histogram equalization) is used to enhance the face and fingerprint images. Salient features of the face and fingerprint are extracted using the Gabor filter technique, and dimensionality reduction is carried out on the extracted features of both images using principal component analysis. A feature-level fusion algorithm (the Mahalanobis distance technique) is used to combine the unimodal features. The performance of the proposed approach is validated and shown to be effective.
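The Mahalanobis distance used for matching can be stated in a few lines (a generic NumPy sketch of the distance measure itself, not the paper's fusion pipeline; the test vectors are illustrative):

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Distance of feature vector x from a template distribution
    with the given mean and covariance."""
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# With an identity covariance the measure reduces to Euclidean distance.
x = np.array([3.0, 4.0])
mean = np.zeros(2)
assert np.isclose(mahalanobis(x, mean, np.eye(2)), 5.0)
```

Unlike plain Euclidean distance, the covariance term whitens correlated feature dimensions, which matters after PCA when components retain different variances.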

.

Keywords – Gabor filters; Mahalanobis distance; principal component analysis; face; fingerprint; feature extraction.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

8. Paper 30061428: GPGPU based Parallel Spam Filter (pp. 49-58)

Full Text: PDF

.

Prachi Goyal Juneja, M.Tech Scholar, Maulana Azad National Institute of Technology Bhopal (M.P) India-462003

R. K. Pateriya, Associate Professor, Maulana Azad National Institute of Technology Bhopal (M.P) India-462003

.

Abstract - Spam refers to the unwanted emails that arrive in our mailboxes each day: promotional messages from companies, viruses, lucrative offers of extra income, and many more. They are sent in bulk from unknown sources to flood our mailboxes. Various ways have been devised to deal with spam; these are known as spam filtering techniques. Spam filtering is performed on many parameters, such as keywords, URLs and content. Content-based spam filtering is becoming popular since it judges the content of an email and then analyzes whether it is spam or ham. As data volumes increase and electronic data takes over most communication media, faster processing and computing devices are needed. GPGPUs have proven effective at sharing the CPU's tasks and making parallel processing possible.
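The Bayesian combining step behind such content filters can be sketched as follows (this is the well-known naive Bayes "spamicity" rule; the per-word probabilities below are hypothetical, standing in for values learned from a training corpus):

```python
import math

def spamicity(word_probs):
    """Combine per-word spam probabilities into a message-level score."""
    # Work in log space to avoid floating-point underflow on long messages.
    log_spam = sum(math.log(p) for p in word_probs)
    log_ham = sum(math.log(1.0 - p) for p in word_probs)
    return 1.0 / (1.0 + math.exp(log_ham - log_spam))

# Hypothetical per-word probabilities from a training corpus.
assert spamicity([0.99, 0.95, 0.80]) > 0.9   # reads as spam
assert spamicity([0.10, 0.05, 0.20]) < 0.1   # reads as ham
```

The per-word terms are independent of each other, which is exactly what makes this step embarrassingly parallel and a natural fit for a GPGPU implementation.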

.

Keywords- Spam, Bayesian Spam Filtering, Serial Spam Filter, Parallel Spam Filter, Spamicity.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

9. Paper 30061429: Mobile- Health Application Software Design and Development (pp. 59-66)

Full Text: PDF

.

Ayangbekun Oluwafemi J., Department of Information Systems, University of Cape Town, South Africa

Kasali Olanrewaju M., Department of Information Technology, Crescent University Abeokuta, Nigeria

.

Abstract — Mobile technologies are developing fast and have completely changed the way we interact and provide healthcare services. The rapid spread of mobile technologies and of inventive applications addressing health-related problems has evolved into a new field known as mobile health. The purpose of this research is to improve the quality of, and access to, healthcare services with the aid of mobile-health application software known as "Crescent Mobile Health". This paper addresses the problem of self-medication by creating a channel of communication between a patient and a doctor in distant locations, thereby helping to resolve emergency situations. Our method is to design and develop mobile-health application software that patients use on an Android smartphone to communicate with a doctor, pharmacist or laboratory scientist who uses electronic-health application software, the Crescent Health Information System, on a desktop via the intranet. The smartphone and desktop applications communicate by instant messaging over a persistent connection, using "sockets" and "pusher" to provide the interconnectivity. The Crescent Health Information System carries out major functionalities such as drug and test inventory, instant messaging, prescription of drugs and tests, and profile updates; Crescent Mobile Health offers instant messaging and the viewing of prescribed drugs, tests, health tips and a help file. The mobile-health application software was developed using the Java programming language and Android development studio, while the electronic-health (E-Health) application software was developed using the PHP programming language and a MySQL database.
The results of this project show that the mobile-health application software resolves the problem of communication between a patient and a doctor and provides a means to verify the drugs available and the tests carried out in the clinic or health sector.
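The persistent-socket messaging pattern the abstract describes can be illustrated in miniature (an in-process Python sketch using a connected socket pair; the message contents are invented, and the real system uses Java/Android on one end and PHP on the other):

```python
import socket

# A connected socket pair stands in for the phone-to-server link; each
# side can push instant messages over the one persistent connection.
patient, doctor = socket.socketpair()
patient.sendall(b"Symptoms: fever and headache")
received = doctor.recv(1024)
doctor.sendall(b"Prescription: rest and fluids")
reply = patient.recv(1024)
patient.close(); doctor.close()
assert received == b"Symptoms: fever and headache"
assert reply.startswith(b"Prescription")
```

Keeping one connection open in both directions is what distinguishes this design from request-response polling: either side can initiate a message at any time.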

.

Keywords - Electronic-Health; Healthcare; Intranet; Mobile- Health; Patient; Smartphone; Socket

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

10. Paper 30061427: System Analysis and Design for integrated sponsored SMS/USSD Based M-Services: A case study of Maternal Health M-Service in Tanzania (pp. 67-77)

Full Text: PDF

.

Timothy Y. Wikedzi, Ramadhani S. Sinde

Computational and Communication Sci & Eng, Nelson Mandela African Institution of Sci & Tech, Arusha, Tanzania

Dan K. McIntyre, Information Technology, University of Iringa, Iringa, Tanzania

.

Abstract -- Mobile phones have proven to be the best way of providing reliable access to information in low- and middle-income countries where other forms of communication perform poorly. As a result of the wide spread of mobile phones, there has been an increase in the number of mobile applications (M-Services) used as tools for disseminating different types of information, especially to low-income people. M-Services of this nature are established to address the informational challenges people face, so these projects must be sustained if people are to keep enjoying their benefits. Reports show, however, that most of these M-Services face the challenge of operating cost, which directly affects their sustainability. In this paper we therefore present the analysis, and later the design, of a non-commercial M-Service that integrates an advertising functionality as a tool for subsidizing the cost of operating M-Services. To achieve this we employ concepts of Information System Analysis and Design (ISAD) as the guiding principles of our design. A prototype M-Health service is used for the study.
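One way to picture the ad-subsidy mechanism for the SMS channel (a hypothetical sketch; the sponsor name, message texts, and fitting rule are invented for illustration and are not from the paper's design):

```python
GSM_SMS_LIMIT = 160  # single-segment SMS length in 7-bit characters

def attach_ad(message, ads):
    """Append the longest sponsor ad that still fits in one SMS segment."""
    for ad in sorted(ads, key=len, reverse=True):
        candidate = message + " | " + ad
        if len(candidate) <= GSM_SMS_LIMIT:
            return candidate
    return message  # no ad fits; send the health message untouched

tip = "Attend your antenatal clinic this week."
ads = ["Visit PharmaCo for vitamins", "X" * 200]
out = attach_ad(tip, ads)
assert len(out) <= GSM_SMS_LIMIT
assert out.endswith("Visit PharmaCo for vitamins")
```

Keeping the combined text within one 160-character segment matters because each extra segment is billed separately, which would erode the very subsidy the ad is meant to provide.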

.

Keywords-M-Service; ISAD; Ad, USSD; SMS; Mobile; Sustainable; Cost of operation.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

11. Paper 30061407: Conversion of an SR-Flip Flop to a JK-Flip Flop (pp. 78-92)

Full Text: PDF

.

Prof. Olawale J. Omotosho, Engr. Samson O. Ogunlere

Babcock University, Computer Science Department, Ilishan-Remo, Ogun State, Nigeria

.

Abstract - This paper presents a design method for converting a conventional SR flip-flop to perform the functions of a corresponding conventional JK flip-flop. This requirement is important because of the many applications of JK flip-flops in digital systems, especially in systems that drive production industries. In such industries, uninterrupted production is one of the targets that must be met so as not to lose production and, consequently, revenue. Equipment failure can be responsible for such unwanted production stoppages, so the downtime of any equipment is crucial in the assurance procedures for the associated equipment and instrumentation of a manufacturing plant. Long downtime is mainly due to the unavailability of spare parts, and sometimes to the incompetence or inexperience of the technologists responsible for the upkeep and assurance of this equipment and instrumentation. Technologists must be versatile in providing alternative solutions to existing provisions, adequate to solve any prevailing situation requiring urgent attention so as to keep production going. Such experience is not only born of hands-on practice but can also be acquired through sound theoretical knowledge of what to do. This paper examines a situation where a JK flip-flop is not available to replace a defective one, and an SR flip-flop is instead configured for the same purpose without degradation of performance.
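The standard conversion places combinational logic in front of the SR inputs, S = J·Q' and R = K·Q; a small Python simulation can check that this reproduces the full JK truth table (a behavioural sketch, not the paper's K-map derivation):

```python
def sr_next(q, s, r):
    # Next state of an SR flip-flop; S = R = 1 is the forbidden input.
    assert not (s and r), "S=R=1 is invalid for an SR flip-flop"
    return 1 if s else (0 if r else q)

def jk_next(q, j, k):
    # Conversion logic in front of the SR inputs: S = J.Q', R = K.Q.
    return sr_next(q, s=j and not q, r=k and q)

# Verify the full JK truth table: hold, reset, set, toggle.
for q in (0, 1):
    assert jk_next(q, 0, 0) == q          # hold
    assert jk_next(q, 0, 1) == 0          # reset
    assert jk_next(q, 1, 0) == 1          # set
    assert jk_next(q, 1, 1) == 1 - q      # toggle
```

Gating S with Q' and R with Q also guarantees the forbidden S = R = 1 input can never occur, which is what lets J = K = 1 become the toggle case instead of an invalid state.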

.

Keywords—Conventional Flip-Flops, uninterrupted production, unavailability of spare parts, incompetence and inexperience of the Technologists, downtime of equipment, K-maps.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

12. Paper 30061418: Digital Shorthand Based Text Compression (pp. 93-97)

Full Text: PDF

.

Yogesh Rathore, CSE,UIT, RGPV, Bhopal, M.P., India

Dr. Rajeev Pandey, CSE,UIT, RGPV, Bhopal, M.P., India

Manish K. Ahirwar, CSE,UIT, RGPV, Bhopal, M.P., India

.

Abstract — With the growing demand for text transmission and storage resulting from the advent of Internet technology, text compression has gained its own momentum. Text is usually coded in American Standard Code for Information Interchange (ASCII) format, and Huffman coding or other run-length encoding techniques compress the plain text [6][11]. We propose a new technique for plain text compression, inspired by the ideas of Pitman Shorthand. In this technique we propose a stronger coding strategy, which can provide higher compression ratios and higher security against all possible attacks during transmission. The objective of this method is to develop a stronger transformation yielding greater compression and additional security [11]. The basic idea is to transform the text into some intermediate form, which can be compressed with higher efficiency and encoded more securely, exploiting the natural redundancy of the language in making this transformation.
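The Huffman back-end the abstract refers to can be sketched in standard form (a textbook construction of a prefix code from character frequencies, not the paper's shorthand transformation):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Derive a prefix-code table from character frequencies."""
    counts = Counter(text)
    if len(counts) == 1:                       # degenerate single-symbol case
        return {next(iter(counts)): "0"}
    # Heap entries: (frequency, tiebreaker, {char: partial code}).
    heap = [(f, i, {c: ""}) for i, (c, f) in enumerate(counts.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)        # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        merged = {c: "0" + code for c, code in t1.items()}
        merged.update({c: "1" + code for c, code in t2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("mississippi")
encoded = "".join(codes[c] for c in "mississippi")
# The prefix property guarantees unambiguous decoding.
assert all(not a.startswith(b) for a in codes.values()
           for b in codes.values() if a != b)
```

A shorthand-style transformation pays off precisely here: the more skewed the symbol distribution of the intermediate form, the shorter the codewords Huffman assigns to its frequent symbols.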

.

Keywords - Compression; Encoding; REL; RLL; Huffman; LZ; LZW; Pitman Shorthand

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------