International Journal of Computer Science and Information Security
Vol. 6 No. 3, December 2009 (Download Full Journal)

Copyright © 2009-2010 IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

1. Paper 30110957: Genealogical Information Search by Using Parent Bidirectional Breadth Algorithm and Rule Based Relationship (pp. 001-006)
Full Text: PDF

Sumitra Nuanmeesri, Department of Information Technology, King Mongkut’s University of Technology North Bangkok, Bangkok, Thailand
Chanasak Baitiang, Department of Applied Science, King Mongkut’s University of Technology North Bangkok, Bangkok, Thailand
Phayung Meesad, Department of Information Technology, King Mongkut’s University of Technology North Bangkok, Bangkok, Thailand

Abstract—Genealogical information is among the best historical resources for the study of culture and cultural heritage. Genealogical research generally presents family information and depicts it as a tree diagram. This paper presents the Parent Bidirectional Breadth Algorithm (PBBA) to find the consanguine relationship between two persons. In addition, the paper utilizes a rule-based system to identify the consanguine relationship. The study reveals that PBBA solves the genealogical information search problem quickly, and that the rule-based relationship approach provides further benefits in blood-relationship identification.

Keywords-Genealogical Information; Search; Rule Based; algorithm; Bidirectional Search; Relationship
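The PBBA itself is not reproduced in the abstract, but the underlying idea of growing search frontiers from both persons at once can be sketched as a generic bidirectional breadth-first search over a family graph. All names, the graph layout, and the balancing heuristic below are illustrative, not taken from the paper:

```python
def bidirectional_search(graph, a, b):
    """Generic bidirectional BFS over an undirected family graph
    (each person maps to the set of directly related persons).
    Returns a connecting path from a to b, or None."""
    if a == b:
        return [a]

    def walk(came, node):
        # Follow predecessor links back to the root of one search tree.
        path = []
        while node is not None:
            path.append(node)
            node = came[node]
        return path

    came_a, came_b = {a: None}, {b: None}
    frontier_a, frontier_b = [a], [b]
    while frontier_a and frontier_b:
        # Expand the smaller frontier to keep the two searches balanced.
        if len(frontier_a) > len(frontier_b):
            frontier_a, frontier_b = frontier_b, frontier_a
            came_a, came_b = came_b, came_a
        next_frontier = []
        for node in frontier_a:
            for nxt in graph.get(node, ()):
                if nxt in came_a:
                    continue
                came_a[nxt] = node
                if nxt in came_b:  # the two frontiers have met
                    path = walk(came_a, nxt)[::-1] + walk(came_b, nxt)[1:]
                    return path if path[0] == a else path[::-1]
                next_frontier.append(nxt)
        frontier_a = next_frontier
    return None
```

For example, two cousins connected through a shared grandparent yield the five-person path cousin, uncle, grandparent, parent, child; a rule base like the one in the paper would then map such a path to a kinship term.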
2. Paper 12110918: Web-Based Expert System for Civil Service Regulations: RCSES (pp. 007-016)
Full Text: PDF

Mofreh Hogo, Dept. of Electrical Engineering, Higher Institution of Technology Benha, Benha University, Egypt.
Khaled Fouad, Central Lab. for Agricultural Expert Systems (CLAES)
Fouad Mousa, Business management Dept, Faculty of commerce, Assuit university

Abstract— The Internet and expert systems have offered new ways of sharing and distributing knowledge, but there is a lack of research in the area of web-based expert systems. This paper introduces the development of a web-based expert system for the regulations of civil service in the Kingdom of Saudi Arabia, named RCSES. It is the first system to address the application of civil service regulations, and the first to do so using a web-based approach. The proposed system covers 17 regulations of the civil service system. The different phases of developing the RCSES system are presented: knowledge acquisition and selection, and ontology and knowledge representation in XML format. The XML rule-based knowledge sources and the inference mechanisms were implemented using ASP.net. An interactive tool for entering the ontology and knowledge base, and for inferencing, was built; it gives the ability to use, modify, update, and extend the existing knowledge base in an easy way. The knowledge was validated by experts in the domain of civil service regulations, and the proposed RCSES was tested, verified, and validated by different technical users and the developers’ staff. The RCSES system is compared with other related web-based expert systems; the comparison demonstrates the usability and high performance of RCSES.

Keywords- Knowledge base; Ontology; RCSES; Civil regulation;
3. Paper 05110909: A Wide-range Survey on Recall-Based Graphical User Authentications Algorithms Based on ISO and Attack Patterns (pp. 017-025)
Full Text: PDF

Arash Habibi Lashkari, Computer Science and Information Technology, University of Malaya (UM), Kuala Lumpur, Malaysia
Dr. Rosli Saleh, Computer Science and Information Technology, University of Malaya (UM), Kuala Lumpur, Malaysia
Samaneh Farmand, Computer Science and Information Technology (IT), University of Malaya (UM), Kuala Lumpur, Malaysia
Dr. Omar Bin Zakaria, Computer Science and Data Communication (MCS), University of Malaya (UM), Kuala Lumpur, Malaysia

Abstract- Nowadays, user authentication is one of the most important topics in information security. Text-based strong password schemes can provide a certain degree of security. However, the fact that strong passwords are difficult to memorize often leads their owners to write them down on paper or even save them in a computer file. Graphical user authentication (GUA) has been proposed as a possible alternative to text-based authentication, motivated particularly by the fact that humans can remember images better than text. In recent years, many networks, computer systems and Internet-based environments have tried to use GUA techniques for user authentication. All GUA algorithms have two aspects: usability and security. Unfortunately, none of the graphical algorithms covers both of these aspects at the same time. This paper presents a wide-range survey of the pure and cued recall-based algorithms in GUA, based on ISO standards for usability and attack pattern standards for security. After explaining the usability ISO standards and the international attack pattern standards, we collect the major attributes of usability and security in GUA. Finally, we compile comparison tables of all recall-based algorithms based on the usability attributes and attack patterns we found.

Keywords - Recall-Based Graphical User Authentication; Graphical Password; Usability and Security; ISO 9241-11; ISO 9126; ISO 13407; Attack Patterns; Brute Force; Dictionary Attacks; Guessing; Spyware; Shoulder Surfing; Social Engineering.
4. Paper 29110946: A New Method to Extract Dorsal Hand Vein Pattern using Quadratic Inference Function (pp. 026-030)
Full Text: PDF

Maleika Heenaye- Mamode Khan, Department of Computer Science and Engineering, University of Mauritius, Mauritius
Naushad Ali Mamode Khan, Department of Mathematics, University of Mauritius, Mauritius

Abstract—Among all biometrics, the dorsal hand vein pattern has lately been attracting the attention of researchers. Extensive research is being carried out on various techniques in the hope of finding an efficient one that can be applied to the dorsal hand vein pattern to improve its accuracy and matching time. One of the crucial steps in biometrics is the extraction of features. In this paper, we propose a method based on the quadratic inference function to extract dorsal hand vein features. The biometric system developed was tested on a database of 100 images. The false acceptance rate (FAR), false rejection rate (FRR) and the matching time were computed.

Keywords-dorsal hand vein pattern; quadratic inference function; generalised method of moments;
5. Paper 29110949: Architecture of Network Management Tools for Heterogeneous System (pp. 031-040)
Full Text: PDF

Rosilah Hassan, Rozilawati Razali, Shima Mohseni, Ola Mohamad and Zahian Ismail
Department of Computer Science, Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, Bangi, Selangor, Malaysia

Abstract— Managing heterogeneous network systems is a difficult task because each of these networks has its own peculiar management system. These networks are usually built on independent management protocols that are not compatible with each other. This results in the coexistence of many management systems with different managing functions and services across enterprises. The incompatibility of different management systems makes management of the whole system a very complex and often complicated job. Ideally, it is necessary to implement centralized meta-level management across distributed heterogeneous systems and their underlying supporting network systems, where information flow and guidance are provided via a single console or a single operating panel that integrates all the management functions in spite of their individual protocols and structures. This paper attempts to provide a novel network management tool architecture which supports heterogeneous management across many different architectural platforms. Furthermore, an architectural approach to integrating heterogeneous networks is proposed. This architecture takes into account both fixed wireless and mobile nodes.

Keywords- Network Tools Architecture; Services Management; Heterogeneous System;
6. Paper 23110936: A Topological derivative based image segmentation for sign language recognition system using isotropic filter (pp. 041-045)
Full Text: PDF

M. Krishnaveni, Department of Computer Science, Avinashilingam University for Women, Coimbatore, India. 
Dr. V. Radha, Department of Computer Science, Avinashilingam University for Women, Coimbatore, India

Abstract-The need for sign language is increasing rapidly, especially in the hearing-impaired community. Only a few research groups have tried to automatically recognize sign language from video, colored gloves, etc. Their approaches require a valid segmentation of the data that is used for training and of the data that is to be recognized. Recognition of a sign language image sequence is challenging because of the variety of hand shapes and hand motions. This paper proposes to apply a combination of image segmentation and restoration using topological derivatives to achieve high recognition accuracy. Image quality measures are used here to differentiate the methods both subjectively and objectively. Experiments show that the additional use of restoration before segmenting the postures significantly improves the correct rate of hand detection, and that the discrete derivatives yield a high rate of discrimination between different static hand postures as well as between hand postures and the scene background. Eventually, this research will contribute to the implementation of an automated sign language recognition system established mainly for welfare purposes.

Key words: Sign Language; segmentation; restoration; topological derivatives; quality measures;
7. Paper 29110945: A Framework for Validation of Object-Oriented Design Metrics (pp. 046-052)
Full Text: PDF

Devpriya Soni, Department of Computer Applications, Maulana Azad National Institute of Technology (A Deemed University), Bhopal 462007 India 
Ritu Shrivastava & M. Kumar, SIRT, Bhopal (India)

Abstract: A large number of metrics have been proposed for measuring the quality of object-oriented software. Many of these metrics have not been properly validated, due to poor validation methods and the non-acceptance of metrics on scientific grounds. In the literature, two types of validation, namely internal (theoretical) and external (empirical), are recommended. In this study, the authors have used both theoretical and empirical validation to validate an already proposed set of metrics for five quality factors. These metrics were proposed by Kumar and Soni.

Keywords- object-oriented software; metrics; validation;
8. Paper 30110969: A New Image Steganography Based On First Component Alteration Technique (pp. 053-056)
Full Text: PDF

Amanpreet Kaur, Renu Dhir, and Geeta Sikka,
Department of Computer Science and Engineering, National Institute of Technology, Jalandhar, India

Abstract—In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover image, a first-component alteration technique is used. Techniques used so far focus on only two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a high root mean square error. In this technique, the 8 bits of the blue component of each pixel are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To validate this scheme, several experiments were performed, and the experimental results are compared with related previous works.

Keywords—image; mean square error; Peak signal to noise ratio; steganography;
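Since the abstract describes replacing all 8 bits of each pixel's blue component with secret data, the embedding step can be sketched as follows. This is a simplified model operating on a plain list of RGB tuples, one secret byte per pixel; the paper's actual implementation details may differ:

```python
def embed(pixels, secret):
    """Hide `secret` (bytes) by overwriting the blue component of the
    first len(secret) pixels with the secret bytes."""
    if len(secret) > len(pixels):
        raise ValueError("cover image too small for this payload")
    stego = list(pixels)
    for i, byte in enumerate(secret):
        r, g, _ = stego[i]
        stego[i] = (r, g, byte)  # blue channel now carries one data byte
    return stego

def extract(pixels, n):
    """Recover n hidden bytes from the blue channel."""
    return bytes(pixels[i][2] for i in range(n))
```

Overwriting the whole blue byte embeds four times more data per pixel than a 2-bit LSB scheme, at the cost of concentrating all distortion in one channel.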
9. Paper 07100903: Evaluating Effectiveness of Tamper-Proofing on Dynamic Graph Software Watermarks (pp. 057-063)
Full Text: PDF

Malik Sikandar Hayat Khiyal, Aihab Khan, Sehrish Amjad, Department of Computer Science, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan
M. Shahid Khalil, Department of Mechanical Engineering, University of Engineering, Taxila.

Abstract—To enhance the protection level of dynamic graph software watermarks, and to analyze the effect of integrating two software protection techniques, namely software watermarking and tamper-proofing, a constant encoding technique with an enhancement based on the idea of constant splitting is proposed. In this paper, the Thomborson technique has been implemented with a constant-breaking scheme that makes it possible to encode all constants, regardless of their values relative to the value of the watermark tree. The experimental analysis conducted and reported in this paper concludes that the constant encoding process significantly increases the code size, heap space usage, and execution time, while making the tamper-proofed code resilient to a variety of semantics-preserving program transformation attacks.

Keywords- constant encoding; software watermarking; tamper-proofing;
10. Paper 11100908: A Novel Trigon-based Dual Authentication Protocol for Enhancing Security in Grid Environment (pp. 064-072)
Full Text: PDF

V. Ruckmani, Senior lecturer, Department of Computer Applications, Sri Ramakrishna Engineering College, India
Dr G Sudha Sadasivam, Professor, Department of Computer Science and Engineering, PSG College of Technology, Coimbatore, India

Abstract— In recent times, there has been a growing need to distribute computing applications across grids. These applications depend on services like data transfer or data portal services as well as job submission. Security is of utmost importance in grid computing applications, as grid resources are heterogeneous, dynamic, and multi-domain. Authentication remains a significant security challenge in grid environments. In a traditional authentication protocol, a single server stores the sensitive user credentials, like username and password. When such a server is compromised, a large number of user passwords will be exposed. Our proposed approach uses a dual authentication protocol to improve the authentication service in grid environments. The protocol utilizes the fundamental concepts of the trigon, and user authentication is performed based on the parameters of the trigon. In the proposed protocol, the password is interpreted and split into more than one unit, and these units are stored on two different servers, namely, the Authentication Server and the Backend Server. Only when the combined authentication scheme from both servers authenticates the user does the user obtain the privilege of accessing the requested resources. The main advantage of utilizing the dual authentication protocol in grid computing is that an adversary cannot attain the access privilege by compromising a single consolidated server, because the split password is stored on different servers.

Keywords- Dual authentication; authentication protocol; trigon parameters; authentication code; grid computing; grid security;
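The trigon-based interpretation of the password is specific to the paper, but the core idea of storing complementary units of a password-derived secret on two servers, so that neither alone suffices, can be sketched with simple XOR secret sharing. This is an illustrative stand-in, not the paper's actual scheme:

```python
import hashlib
import hmac
import os

def split_credential(password):
    """Derive a verifier from the password and split it into two shares.
    Neither share alone reveals anything about the verifier."""
    verifier = hashlib.sha256(password.encode()).digest()
    share_as = os.urandom(len(verifier))                         # held by Authentication Server
    share_bs = bytes(x ^ y for x, y in zip(verifier, share_as))  # held by Backend Server
    return share_as, share_bs

def authenticate(password, share_as, share_bs):
    """Both servers must contribute their share to rebuild the verifier."""
    rebuilt = bytes(x ^ y for x, y in zip(share_as, share_bs))
    candidate = hashlib.sha256(password.encode()).digest()
    return hmac.compare_digest(rebuilt, candidate)
```

An attacker who compromises only one server obtains a uniformly random share, which carries no information about the password verifier.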
11. Paper 20110928: Design and Analysis of a Spurious Switching Suppression Technique Equipped Low Power Multiplier with Hybrid Encoding Scheme (pp. 073-078)
Full Text: PDF

S. Saravanan, Department of ECE, K.S.R. College of Technology, Tiruchengode-637215, India.
M. Madheswaran, Department of ECE, Muthayammal Engineering College, Rasipuram-647408, India

Abstract— Multiplication is an arithmetic operation that is widely used in Digital Signal Processing (DSP) and communication applications, and efficient implementation of multipliers is required in many of them. The design and analysis of a Spurious Switching Suppression Technique (SSST) equipped low power multiplier with hybrid encoding is presented in this paper. The proposed encoding technique reduces the switching activity and dynamic power consumption by analyzing the bit patterns in the input data: the operation executed depends upon the number of 1’s and their positions in the multiplier data. The architecture of the proposed multiplier is designed using a low power full adder which consumes less power than other adder architectures. The switching activity of the proposed multiplier has been reduced by 86% and 46% compared with the conventional and Booth multipliers, respectively. Device level simulation using TANNER 12.6 EDA shows that the power consumption of the proposed multiplier has been reduced by 87% and 26% compared with the conventional and Booth multipliers.

Keywords- Low power VLSI design; Booth multiplier; hybrid encoding;
12. Paper 22110934: Using Sloane Rulers for Optimal Recovery Schemes in Distributed Computing (pp. 079-083)
Full Text: PDF

R. Delhi Babu, Department of Computer Science & Engineering, SSN College of Engineering, Chennai, India
P. Sakthivel, Department of Electronics & Communication Engineering, Anna University, Chennai, India

Abstract—Clusters and distributed systems offer fault tolerance and high performance through load sharing, and are thus attractive for real-time applications. When all computers are up and running, we would like the load to be evenly distributed among them. When one or more computers fail, the load must be redistributed. The redistribution is determined by the recovery scheme. The recovery scheme should keep the load as evenly distributed as possible even when the most unfavorable combinations of computers break down, i.e., we want to optimize the worst-case behavior. In this paper we compare the worst-case behavior of schemes such as the Modulo ruler, Golomb ruler, Greedy sequence and Log sequence with the worst-case behavior of the Sloane sequence. We observe that the Sloane scheme performs better than all the other schemes.

Keywords: Fault tolerance; High performance computing; Cluster technique; Recovery schemes; Sloane sequence;
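The defining property of the ruler-based schemes compared above is that a Golomb ruler's marks have all-distinct pairwise differences, which is what keeps backup load spread out when the marks are used as recovery offsets. A sketch of the property check and of a ruler-based backup list (the ring-offset construction here is illustrative, not the paper's exact scheme):

```python
from itertools import combinations

def is_golomb(marks):
    """A Golomb ruler: every pair of marks has a distinct difference."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

def recovery_list(node, marks, n_nodes):
    """Backup order for `node` in an n-node cluster: offsets taken from
    the ruler marks, wrapped around a ring of nodes."""
    return [(node + m) % n_nodes for m in marks if m != 0]
```

Because no two nodes share more than one ruler difference, the worst-case overlap between any two nodes' recovery lists stays small even when several computers fail together.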
13. Paper 24110938: ICD 10 Based Medical Expert System Using Fuzzy Temporal Logic (pp. 084-089)
Full Text: PDF

P. Chinniah, Research Scholar, Department of ECE, CEG, Anna University, Chennai, India.
Dr. S. Muttan, Professor, Centre for Medical Electronics, CEG, Anna University, Chennai, India

Abstract-The medical diagnosis process involves many levels, and a considerable amount of time and money is invariably spent on the first level of diagnosis, usually made by the physician for every patient. Hence there is a need for a computer-based system which not only asks relevant questions of the patients but also aids the physician by inferring a set of possible diseases from the symptoms obtained. In this work, an ICD10 based medical expert system is presented that provides advice, information and recommendations to the physician using fuzzy temporal logic. The knowledge base used in this system consists of facts about symptoms and rules on diseases. It also provides a fuzzy severity scale and weight factor for each symptom and disease, which can vary with respect to time. The system generates the possible disease conditions based on a modified Euclidean metric, using Elder’s algorithm for effective clustering. The minimum similarity value is used as the decision parameter to identify a disease.

Keywords -Fuzzy clustering, symptoms, fuzzy severity scale, weight factor, Minkowski distance, ICD, WHO, Rules Base, TSQL
14. Paper 30110960: DNA-MATRIX: a tool for DNA motif discovery and weight matrix construction (pp. 090-092)
Full Text: PDF

Chandra Prakash Singh, Department of Computer Sciences, R.S.M.T., U.P. College, Varanasi (India).
Feroz Khan, MSB Division, Central Institute of Medicinal & Aromatic Plants (CSIR), Lucknow (India)
Sanjay Kumar Singh, Department of Computer Sciences, R.S.M.T., U.P. College, Varanasi (India).
Durg Singh Chauhan, Institute of Technology, B.H.U., Varanasi (India).

Abstract— In computational molecular biology, genome-wide prediction of gene regulatory binding sites remains a challenge for researchers. Nowadays, genome-wide regulatory binding site prediction tools require either a direct pattern sequence or a weight matrix. Although known transcription factor binding site databases are available for genome-wide prediction, no tool is available which can construct different weight matrices as per the user's needs, or which can handle large data sets by first aligning the input upstream or promoter sequences and then constructing the matrices at different levels and in different file formats. Considering this, we developed the DNA-MATRIX tool for searching for putative regulatory binding sites in gene upstream sequences. This tool uses a simple, biological rule based heuristic algorithm for weight matrix construction; the matrix can be transformed into different formats after motif alignment and therefore provides the possibility of identifying the most conserved potential binding sites in the regulated genes. The user may construct and save specific weight or frequency matrices in different forms and file formats, based on the user's selection of a conserved aligned block of short sequences ranging from 6 to 20 base pairs and of prior nucleotide frequencies before weight scoring.

Keywords: File format; weight matrix; motif prediction;
15. Paper 12110919: Multiprocessor Scheduling For Tasks With Priority Using GA (pp. 093-100)
Full Text: PDF

Dr. G. Padmavathi, Professor and Head, Dept. of Computer Science, Avinashilingam University for Women, Coimbatore – 43, India.
Mrs. S. R. Vijayalakshmi, Lecturer, School of Information Technology and Science, Dr. G.R.D College of Science, Coimbatore – 14, India.

Abstract - Multiprocessors have emerged as a powerful computing means for running real-time applications, especially where a uniprocessor system would not be sufficient to execute all the tasks. The high performance and reliability of multiprocessors have made them a powerful computing resource. Such a computing environment requires an efficient algorithm to determine when and on which processor a given task should execute. In multiprocessor systems, efficient scheduling of a parallel program onto the processors to minimize the entire execution time is vital for achieving high performance; this scheduling problem is known to be NP-hard. In the multiprocessor scheduling problem, a given program is to be scheduled on a given multiprocessor system such that the program’s execution time is minimized: the last job must be completed as early as possible. The genetic algorithm (GA) is one of the widely used techniques for constrained optimization problems. Genetic algorithms are basically search algorithms based on the mechanics of natural selection and natural genetics. The main goal behind research on genetic algorithms is robustness, i.e., a balance between efficiency and efficacy. This paper proposes a genetic algorithm to solve the multiprocessor scheduling problem that minimizes the makespan.

Keywords: Task Scheduling; Genetic Algorithm (GA); parallel processing;
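As a minimal illustration of the GA formulation, assuming independent tasks with known durations and ignoring the precedence and priority constraints handled in the paper, a chromosome can be a task-to-processor assignment whose fitness is the makespan:

```python
import random

def makespan(assignment, durations, n_procs):
    """Finish time of the busiest processor under a task->processor map."""
    loads = [0] * n_procs
    for task, proc in enumerate(assignment):
        loads[proc] += durations[task]
    return max(loads)

def ga_schedule(durations, n_procs, pop_size=40, generations=200, seed=0):
    """Minimal genetic algorithm: truncation selection, one-point
    crossover and point mutation over task->processor assignments,
    minimising the makespan."""
    rng = random.Random(seed)
    n = len(durations)
    pop = [[rng.randrange(n_procs) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: makespan(ind, durations, n_procs))
        survivors = pop[: pop_size // 2]              # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            mum, dad = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)                 # one-point crossover
            child = mum[:cut] + dad[cut:]
            if rng.random() < 0.2:                    # mutation: reassign one task
                child[rng.randrange(n)] = rng.randrange(n_procs)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda ind: makespan(ind, durations, n_procs))
```

A real scheduler for dependent tasks would instead evolve priority orderings and decode each chromosome through a list-scheduling step, but the fitness function and operators keep the same shape.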
16. Paper 30110953: Measurement of Nuchal Translucency Thickness for Detection of Chromosomal Abnormalities using First Trimester Ultrasound Fetal Images (pp. 101 -106)
Full Text: PDF

S. Nirmala, Center for Advanced Research, Muthayammal Engineering College, Rasipuram
V. Palanisamy, Info Institute of Engineering, Kovilpalayam, Coimbatore – 641 107.

Abstract—A Nuchal Translucency thickness measurement for identifying Down syndrome in first trimester fetal screening is presented in this paper. Mean shift analysis and Canny operators are utilized for segmenting the nuchal translucency region, and the exact thickness is estimated using blob analysis. It is observed from the results that a fetus in the 14th week of gestation is expected to have a nuchal translucency thickness of 1.87±0.25 mm.

Keywords- Down syndrome; Nuchal translucency thickness; Mean Shift Analysis; Blob analysis;
17. Paper 29110947: An Improved Image Mining Technique For Brain Tumour Classification Using Efficient Classifier (pp. 107-116)
Full Text: PDF

P. Rajendran, Department of Computer science and Engineering, K. S. Rangasamy College of Technology, Tiruchengode-637215, Tamilnadu, India.
M. Madheswaran, Center for Advanced Research, Department of Electronics and Communication Engineering, Muthayammal Engineering College, Rasipuram – 637 408, Tamilnadu, India.

Abstract— An improved image mining technique for brain tumor classification using pruned association rules with the MARI algorithm is presented in this paper. The proposed method makes use of association rule mining to classify CT scan brain images into three categories, namely normal, benign and malignant. It combines low-level features extracted from images and high-level knowledge from specialists. The developed algorithm can assist physicians in efficient classification, with multiple keywords per image to improve accuracy. Experimental results on a pre-diagnosed database of brain images showed 96% sensitivity and 93% accuracy.

Keywords- Data mining; Image mining; Association rule mining; Medical imaging; Medical image diagnosis; Classification;
18. Paper 30110963: Mining Spatial Gene Expression Data Using Negative Association Rules (pp. 117-120)
M. Anandhavalli & M. K. Ghose, Department of Computer Science Engineering, SMIT, Majitar, India
K. Gauthaman, Department of Drug Technology, Higher Institute of Medical Technology, Derna, Libya

Abstract— Over the years, data mining has attracted much attention from the research community. Researchers attempt to develop faster, more scalable algorithms to navigate the ever increasing volumes of spatial gene expression data in search of meaningful patterns. Association rules are a data mining technique that tries to identify intrinsic patterns in spatial gene expression data. They have been widely used in different applications, and many algorithms have been introduced to discover such rules. However, Apriori-like algorithms have been used to find only positive association rules. In contrast to positive rules, negative rules encapsulate the relationship between the occurrence of one set of items and the absence of another set of items. In this paper, an algorithm for mining negative association rules from spatial gene expression data is introduced. The algorithm discovers the negative association rules which are complementary to the association rules typically generated by Apriori-like algorithms. Our study shows that negative association rules can be discovered efficiently from spatial gene expression data.

Keywords- Spatial Gene expression data; Association Rule; Negative Association Rule; 
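The complementary nature of negative rules can be made concrete: where a positive rule A -> B is scored by the support of transactions containing both A and B, a negative rule A -> not-B is scored by the support of transactions containing A but none of B. A small sketch over toy transaction data (not the paper's algorithm):

```python
def support(transactions, present, absent=()):
    """Fraction of transactions containing every item in `present`
    and no item in `absent`."""
    present, absent = set(present), set(absent)
    hits = sum(1 for t in transactions
               if present <= t and not (absent & t))
    return hits / len(transactions)

def negative_confidence(transactions, antecedent, consequent):
    """Confidence of the negative rule A -> not-B, i.e. P(no B | A)."""
    base = support(transactions, antecedent)
    if base == 0:
        return 0.0
    return support(transactions, antecedent, absent=consequent) / base
```

In the gene expression setting, each transaction would be the set of genes expressed in one spatial region, so a strong negative rule flags genes that tend not to be co-expressed.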
19. Paper 30110954: Hierarchical Route Optimization by Using Tree Information Option in Mobile Networks (pp. 121-123)
K. K. Gautam & Menu Chaudhary, Department of Computer Science & Technology, Roorkee Engineering & Management Technology Institute, Shamli-247 774 (INDIA)

Abstract—The Network Mobility (NEMO) protocol is a way of managing the mobility of an entire network, and the mobile internet protocol is the basic solution for network mobility. A hierarchical route optimization system for mobile networks is proposed to solve the management problems of hierarchical route optimization. In the present paper, we study a Hierarchical Route Optimization Scheme using the Tree Information Option (HROSTIO). The concept of optimization, finding the extreme of a function that maps candidate ‘solutions’ to scalar values of ‘quality’, is an extremely general and useful idea. For solving this problem, we use a few salient adaptations, and we also extend HROSTIO to perform routing between mobile networks.

Keywords-Route Optimization; Tree Information Option; personal area networks; NEMO; IP;
20. Paper 22110932: Seeing Beyond the Surface: Understanding and Tracking Fraudulent Cyber Activities (pp. 124-135)
Full Text: PDF 
Longe O. B. & Mbarika V., Int. Centre for IT & Development, Southern University, Baton Rouge, LA 70813  
Kourouma M, Dept. of Computer Science, Southern University, Baton Rouge, LA 70813
Wada F. & Isabalija R, Nelson Mandela School of Public Policy, Southern University, Baton Rouge, LA 70813

Abstract - The malaise of electronic spam mail that solicits illicit partnerships using bogus business proposals (popularly called 419 mails) remains unabated on the internet despite concerted efforts. In addition, there is the emergence and prevalence of phishing scams that use social engineering tactics to obtain online access codes such as credit card numbers, ATM pin numbers, bank account details, social security numbers and other personal information [22]. In an age where dependence on electronic transactions is on the increase, the web security community will have to devise more pragmatic measures to make cyberspace safe from these demeaning ills. Understanding the perpetrators of internet crimes and their mode of operation is the basis for any meaningful effort towards stemming these crimes. This paper discusses the nature of the criminals engaged in fraudulent cyberspace activities, with special emphasis on the Nigerian 419 scam mails. Based on a qualitative analysis and experiments to trace the sources of electronic spam and phishing e-mails received over a six-month period, we provide information about the scammers’ personalities, motivations, methodologies and victims. We posit that popular e-mail clients are deficient in providing effective mechanisms that can aid users in identifying fraud mails and protect them against phishing attacks. We demonstrate, using state-of-the-art techniques, how users can detect and avoid fraudulent e-mails, and conclude by making appropriate recommendations based on our findings.

Keyword: Spammers; Scamming; E-mail; Fraud; Phishing; Nigeria; IPLocator;
21. Paper 19110927: On the Efficiency of Fast RSA Variants in Modern Mobile Phones (pp. 136-140)
Klaus Hansen, Troels Larsen, Kim Olsen, Department of Computer Science, University of Copenhagen, Denmark

Abstract—Modern mobile phones are increasingly being used for services that require modern security mechanisms such as the public-key cryptosystem RSA. It is, however, well-known that public-key cryptography demands considerable computing resources and that RSA encryption is much faster than RSA decryption. It is consequently an interesting question whether RSA as a whole can be executed efficiently on modern mobile phones. In this paper, we explore the efficiency on modern mobile phones of variants of the RSA cryptosystem, covering CRT, Multi-Prime RSA, Multi-Power RSA, Rebalanced RSA and R-Prime RSA, by comparing the encryption and decryption times using a simple Java implementation and a typical RSA setup.

Keywords—Public-key cryptography; RSA; software; mobile phones;
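Of the variants compared, CRT decryption is the common baseline: the private exponentiation mod n is replaced by two half-size exponentiations mod p and mod q, recombined with Garner's formula, for roughly a fourfold speedup. A sketch with textbook-sized numbers (real keys would be 1024 bits or more; the paper's Java implementation is not reproduced here):

```python
def crt_decrypt(c, d, p, q):
    """RSA-CRT decryption of ciphertext c with private exponent d and
    primes p, q: two small exponentiations plus Garner recombination."""
    dp, dq = d % (p - 1), d % (q - 1)   # reduced private exponents
    q_inv = pow(q, -1, p)               # q^(-1) mod p (Python 3.8+)
    m_p = pow(c % p, dp, p)
    m_q = pow(c % q, dq, q)
    h = (q_inv * (m_p - m_q)) % p       # Garner's formula
    return m_q + h * q
```

Multi-Prime and Multi-Power RSA push the same idea further by splitting the modulus across more (or repeated) prime factors, trading key structure for even smaller exponentiations.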
22. Paper 30110962: An Efficient Inter Carrier Interference Cancellation Schemes for OFDM Systems (pp. 141-148)
B. Sathish Kumar, K. R. Shankar Kumar, R. Radhakrishnan
Department of Electronics and Communication Engineering, Sri Ramakrishna Engineering College, Coimbatore, India.

Abstract— Orthogonal Frequency Division Multiplexing (OFDM) has recently been widely used in wireless communication systems. OFDM is very effective in combating inter-symbol interference and can achieve high data rates in frequency selective channels. For OFDM communication systems, frequency offsets in mobile radio channels distort the orthogonality between subcarriers, resulting in Inter Carrier Interference (ICI). ICI causes power leakage among subcarriers, thus degrading the system performance. A well-known problem of OFDM is its sensitivity to frequency offset between the transmitted and received carrier frequencies. There are two deleterious effects caused by frequency offset: one is the reduction of signal amplitude at the output of the filters matched to each of the carriers, and the second is the introduction of ICI from the other carriers. This work investigates three effective methods for combating the effects of ICI: ICI Self Cancellation (SC), Maximum Likelihood (ML) estimation, and the Extended Kalman Filter (EKF) method. These three methods are compared in terms of bit error rate performance and bandwidth efficiency. Through simulations, it is shown that all three techniques are effective in mitigating ICI, and that the ML and EKF methods perform better than the SC method.

Keywords- Orthogonal Frequency Division Multiplexing (OFDM); Inter Carrier Interference (ICI); Carrier to Interference Power Ratio (CIR); Self Cancellation (SC); Carrier Frequency Offset (CFO); Maximum Likelihood (ML); Extended Kalman Filtering (EKF).
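The self-cancellation idea can be illustrated with the standard ICI coefficient analysis: mapping data onto adjacent subcarrier pairs as (d, -d) makes neighbouring ICI weights cancel. The sketch below uses assumed values for the FFT size and normalized frequency offset and models only the transmit-side mapping, not the paper's full simulations.

```python
import cmath, math

# Carrier-to-interference ratio (CIR) with and without transmit-side
# ICI self-cancellation, from the standard ICI coefficient S(m).
# N (subcarriers) and eps (normalized frequency offset) are assumed
# example values, not the paper's simulation settings.
N, eps = 16, 0.15

def S(m):
    # ICI weight contributed by a subcarrier m positions away
    x = math.pi * (m + eps)
    return (math.sin(x) / (N * math.sin(x / N))) * cmath.exp(1j * x * (N - 1) / N)

# Standard OFDM: CIR on one subcarrier, in dB
cir_std = 10 * math.log10(abs(S(0))**2 / sum(abs(S(l))**2 for l in range(1, N)))

# Self-cancellation: data on adjacent pairs as (d, -d), so the
# effective weight becomes the difference of neighbouring coefficients
def S_sc(m):
    return S(m) - S(m + 1)

cir_sc = 10 * math.log10(abs(S_sc(0))**2
                         / sum(abs(S_sc(l))**2 for l in range(2, N, 2)))
print(round(cir_std, 1), round(cir_sc, 1))   # SC yields a higher CIR
```

Since adjacent ICI coefficients are nearly equal, their differences are small, which is where the CIR improvement (at the cost of halved bandwidth efficiency) comes from.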
23. Paper 30110951: High-Precision Half-Wave Rectifier Circuit In Dual Phase Output Mode (pp. 149-152)
Theerayut Jamjaem, Department of Electrical Engineering, Faculty of Engineering, Kasem Bundit University, Bangkok, Thailand 10250
Bancha Burapattanasiri, Department of Electronic and Telecommunication Engineering, Engineering Collaborative Research Center, Faculty of Engineering, Kasem Bundit University, Bangkok, Thailand 10250 

Abstract—This paper presents a high-precision half-wave rectifier circuit in dual-phase output mode, implemented in 0.5 μm CMOS technology at a low supply voltage of +/- 1.5 V. The circuit receives an input signal, delivers an output current signal, and responds at high frequency. The main structure comprises a CMOS inverter circuit, a common-source circuit, and a current mirror circuit. Simulation in PSpice confirms the circuit's operation: it works at a maximum frequency of about 100 MHz with a maximum input current range of about 400 μAp-p, provides a high-precision output signal, has low power dissipation, and uses few transistors.

Keywords-component; half-wave; rectifier circuit; high-precision; dual phase;
24. Paper 17110924: Internal Location Based System For Mobile Devices Using Passive RFID And Wireless Technology (pp. 153-159) arXiv link
Kapil N. Vhatkar, Computer Technology Department, Veermata Jijabai Technological Institute, Mumbai, India-400 019
G. P. Bhole, Computer Technology Department, Veermata Jijabai Technological Institute, Mumbai, India-400 019

Abstract— We present our work on the design and development of an internal location-identification system for mobile devices based on the integration of RFID and wireless technology. The system relies on strategically located passive RFID tags placed on objects around a building, which are identified using an RFID reader attached to a mobile device. The mobile device reads an RFID tag and, through the wireless network, sends a request to the server. The server resolves the request and sends the desired location-based information back to the mobile device. We show that RFID technology can be used for internal (indoor) location identification, providing good location accuracy: no contact between the tag and the reader is required, and the system needs no line of sight. In this paper we also focus on these characteristics of RFID technology, i.e., non-line-of-sight operation and high inventory speeds.

Keywords- Location Based Services; RFID; J2ME;
25. Paper 30110952: High-Precision Multi-Wave Rectifier Circuit Operating in Low Voltage + 1.5 Volt Current Mode (pp. 160-164)
Bancha Burapattanasiri, Department of Electronic and Telecommunication Engineering, Engineering Collaborative Research Center, Faculty of Engineering, Kasem Bundit University, Bangkok, Thailand 10250

Abstract—This article presents a high-precision multi-wave rectifier circuit operating in current mode at a low supply voltage of +/- 1.5 V, implemented in 0.5 μm CMOS technology. It receives its input and provides its output in current mode and responds over a high frequency range. The structure comprises a high-speed current comparator circuit, a current mirror circuit, and a CMOS inverter circuit. PSpice simulations confirm the performance: the circuit operates at a maximum input current of 400 μAp-p and a maximum frequency response of 200 MHz, with high precision, low power losses, and precise zero crossing of the output signal.

Keywords-component; rectifier circuit; high-precision; low voltage; current mode;
26. Paper 05110905: Classifying Application Phases in Asymmetric Chip Multiprocessors (pp. 165-170)
A. Z. Jooya, Computer Science dept., Iran University of Science and Technology, Tehran, Iran
M. Analoui, Computer Science dept., Iran University of Science and Technology, Tehran, Iran

Abstract— In the present study, in order to improve performance and reduce the power dissipated in heterogeneous multi-core processors, the ability to detect program execution phases is investigated. The program's execution intervals are classified into phases based on their throughput and the utilization of the cores. The results of implementing the phase detection technique are investigated on a single-core processor and also on a multi-core processor. To minimize the profiling overhead, an algorithm for the dynamic adjustment of the profiling intervals is presented; it is based on the behavior of the program and reduces the profiling overhead more than threefold. The results are obtained from executing multiprocessor benchmarks on a given processor. To show the program phases clearly, the throughput and utilization of the execution intervals are presented on a scatter plot. Results are presented for both fixed and variable intervals.

Keywords- Heterogeneous multi-core processor; multiprocessor benchmarks; program phases; execution intervals; dynamic profiling; throughput; resource utilization;
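The interval-classification idea can be sketched as quantizing each profiled interval's (throughput, utilization) pair so that intervals landing in the same cell belong to the same phase. The step sizes and sample figures below are invented; the paper's classifier is more elaborate.

```python
# Toy classification of profiled execution intervals into phases by
# quantizing (throughput, utilization) pairs; the step sizes are
# invented, and the paper's classifier is more elaborate.
def phase(ipc, util, ipc_step=0.5, util_step=0.25):
    # intervals falling in the same quantization cell share a phase
    return (int(ipc / ipc_step), int(util / util_step))

# (throughput, core utilization) samples for four profiled intervals
intervals = [(1.9, 0.90), (1.8, 0.85), (0.4, 0.30), (1.95, 0.88)]
phases = [phase(*iv) for iv in intervals]
```

Here the first, second, and fourth intervals map to the same phase label, while the low-throughput third interval is detected as a distinct phase, which is the behaviour the scatter plots in the paper make visible.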
27. Paper 09110913: Syllable Analysis to Build a Dictation System in Telugu language (pp. 171-176)
N. Kalyani, Assoc. Prof, CSE Dept, G.N.I.T.S, Hyderabad, India.
Dr. K. V. N. Sunitha, Professor & HOD, CSE Dept., G.N.I.T.S, Hyderabad, India

Abstract— In recent decades, speech-interactive systems have gained increasing importance. To develop a dictation system like Dragon for Indian languages, it is most important to adapt the system to a speaker with minimum training. In this paper we focus on the importance of creating a speech database at the level of syllable units and on identifying the minimum text to be considered while training any speech recognition system. Systems have been developed for continuous speech recognition in English and in a few Indian languages like Hindi and Tamil. This paper gives statistical details of syllables in Telugu and their use in minimizing the search space during speech recognition. The minimum set of words that covers the maximum number of syllables is identified. This word list can be used to prepare a small text for collecting speech samples while training the dictation system. Results are plotted for the frequency of syllables and the number of syllables in each word. The approach is applied to the CIIL Mysore text corpus, which contains 3 million words.
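Selecting the minimum words that cover the maximum syllables is essentially a set-cover problem, commonly attacked greedily. The sketch below uses a toy fixed-width splitter in place of real Telugu syllabification, so it only illustrates the coverage idea, not the paper's method.

```python
# Greedy selection of a small word list covering all syllables seen in
# a corpus. The syllabifier is a toy two-character splitter standing
# in for real Telugu syllabification.
def syllables(word):
    # hypothetical splitter: fixed-size chunks stand in for syllables
    return {word[i:i + 2] for i in range(0, len(word), 2)}

def min_cover(words):
    uncovered = set().union(*(syllables(w) for w in words))
    chosen = []
    while uncovered:
        # pick the word covering the most still-uncovered syllables
        best = max(words, key=lambda w: len(syllables(w) & uncovered))
        gain = syllables(best) & uncovered
        if not gain:
            break
        chosen.append(best)
        uncovered -= gain
    return chosen

corpus = ["rama", "ramana", "kamala", "mala", "nara"]
print(min_cover(corpus))      # → ['ramana', 'kamala']
```

Two words suffice to cover every toy syllable in this corpus; the same greedy criterion, applied with a real syllabifier over the 3-million-word corpus, yields the compact training text the abstract describes.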

28. Paper 30110958: Sinusoidal Frequency Doublers Circuit With Low Voltage + 1.5 Volt CMOS Inverter (pp. 177-180)
Bancha Burapattanasiri, Department of Electronic and Telecommunication Engineering, Engineering Collaborative Research Center, Faculty of Engineering, Kasem Bundit University, Bangkok, Thailand 10250

Abstract — This paper presents a sinusoidal frequency doubler circuit using a CMOS inverter at a low supply voltage of + 1.5 V. The main structure of the circuit has three parts: a CMOS inverter circuit, a differential amplifier circuit, and a square root circuit. The circuit is designed to receive an input voltage and provide an output voltage using few MOS transistors; it is easy to understand, non-complex, highly precise, and has low error and low power consumption. In simulation, the MOS transistors operate in the active and saturation regions. PSpice is used to confirm the design through testing and simulation.

Keywords-component; sinusoidal frequency doublers; low voltage; CMOS inverter;
29. Paper 30110972: Speech Recognition by Machine: A Review (pp. 181-205)
M. A. Anusuya, Department of Computer Science and Engineering, Sri Jayachamarajendra College of Engineering, Mysore, India
S. K. Katti, Department of Computer Science and Engineering, Sri Jayachamarajendra College of Engineering, Mysore, India

Abstract - This paper presents a brief survey of Automatic Speech Recognition (ASR) and discusses the major themes and advances made in the past 60 years of research, so as to provide a technological perspective and an appreciation of the fundamental progress that has been accomplished in this important area of speech communication. After years of research and development, the accuracy of automatic speech recognition remains one of the important research challenges (e.g., variations of context, speakers, and environment). The design of a speech recognition system requires careful attention to the following issues: definition of the various types of speech classes, speech representation, feature extraction techniques, speech classifiers, databases, and performance evaluation. The problems that exist in ASR and the various techniques proposed by researchers to solve them are presented in chronological order. The authors hope that this work will be a contribution to the area of speech recognition. The objective of this review paper is to summarize and compare some of the well-known methods used in the various stages of speech recognition systems and to identify research topics and applications at the forefront of this exciting and challenging field.

Key words: Automatic Speech Recognition; Statistical Modeling; Robust speech recognition; Noisy speech recognition; classifiers; feature extraction; performance evaluation; Database;
30. Paper 29110948: An Extension for Combination of Duty Constraints in Role-Based Access Control (pp. 206-215)
Full Text: PDF
Ali Hosseini, ICT Group, E-Learning Center, Iran University of Science and Technology, Tehran, Iran
Mohammad Abdollahi Azgomi, School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran

Abstract—Among access control models, Role-Based Access Control (RBAC) is very useful and is used in many computer systems. Static Combination of Duty (SCD) and Dynamic Combination of Duty (DCD) constraints have been introduced recently for this model to handle dependent roles. These roles must be used together and can be considered as a contrary point of conflicting roles. In this paper, we propose several new types of SCD and DCD constraints. Also, we introduce strong dependent roles and define new groups of SCD constraints for these types of roles as SCD with common items and SCD with union items. In addition, we present an extension for SCD constraints in the presence of hierarchy.

Keywords- Role-Based Access Control (RBAC); Combination of Duty (CD); Static combination of Duty (SCD); Dynamic Combination of Duty (DCD); Dependent Roles.
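A Combination of Duty constraint is the dual of separation of duty: dependent roles must be used together rather than kept apart. One plausible reading of an SCD constraint can be sketched as a simple check; the (role_set, m) formulation below is an assumption for illustration, not the paper's exact formalism.

```python
# Sketch of a Static Combination of Duty check: the constraint
# (role_set, m) is read as "any user assigned a role from role_set
# must hold at least m roles from it". This is an assumed reading of
# the dependent-role idea, not the paper's formal definition.
def scd_satisfied(user_roles, role_set, m):
    held = user_roles & role_set
    return not held or len(held) >= m     # vacuously true if none held

alice = {"auditor", "reviewer", "clerk"}
bob = {"auditor"}

constraint = ({"auditor", "reviewer"}, 2)  # roles that must be used together
assert scd_satisfied(alice, *constraint)   # holds both -> OK
assert not scd_satisfied(bob, *constraint) # holds only one -> violation
```

Such a check would run at role-assignment time for SCD constraints, whereas the DCD variants described in the abstract would apply the analogous condition to roles activated within a session.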
31. Paper 30110961: An Improved Approach to High Level Privacy Preserving Itemset Mining (pp. 216-223)
Full Text: PDF
Rajesh Kumar Boora, Ruchi Shukla, A. K. Misra
Computer Science and Engineering Department, Motilal Nehru National Institute of Technology, Allahabad, India – 211004

Abstract—Privacy preserving association rule mining has triggered the development of many privacy-preserving data mining techniques. A large fraction of them use randomized data distortion techniques to mask the data. This paper proposes a new transaction randomization method, a combination of the fake transaction randomization method and a new per-transaction randomization method. This method distorts the items within each transaction and ensures a higher level of data privacy in comparison to previous approaches. The per-transaction randomization method uses a randomization function to replace each item by a random number, guaranteeing privacy within the transaction as well. A tool has also been developed that implements the proposed approach to mine frequent itemsets and association rules from the data while guaranteeing the anti-monotonic property.

Keywords: Data Mining; Privacy; Randomization; Association Rules;
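The two-layer distortion idea can be sketched as follows. A single keyed code table stands in for the paper's per-transaction randomization function, and the function names and `fake_ratio` parameter are illustrative, not taken from the paper.

```python
import random

# Sketch of two-layer distortion: item substitution plus fake
# transactions. A single keyed code table stands in for the paper's
# per-transaction randomization function; names and the fake_ratio
# parameter are illustrative.
def distort(transactions, items, fake_ratio=0.5, seed=7):
    rng = random.Random(seed)
    code = {i: rng.randrange(10**6) for i in items}   # item -> random code
    out = [{code[i] for i in t} for t in transactions]
    for _ in range(int(len(transactions) * fake_ratio)):
        k = rng.randint(1, len(items))
        out.append(set(rng.sample(list(code.values()), k)))  # fake transaction
    rng.shuffle(out)
    return out, code

real = [{"bread", "milk"}, {"bread", "beer"}, {"milk", "beer"}, {"bread"}]
masked, code = distort(real, {"bread", "milk", "beer"})
```

An observer of `masked` sees neither the original item names nor which transactions are genuine, while a party holding `code` and the fake-transaction count can still correct the support estimates for mining.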
32. Paper 22110931: Call Admission Control performance model for Beyond 3G Wireless Networks (pp. 224-229)
Full Text: PDF
Ramesh Babu H.S., Department of Information Science and Engineering, Acharya Institute of Technology
Gowrishankar, Department of Computer Science and Engineering, B.M.S. College of Engineering,
Satyanarayana P.S, Department of Electronics and Communication Engineering, B.M.S. College of Engineering, Bangalore, India

Abstract— Next Generation Wireless Networks (NGWN) will be heterogeneous in nature, with different Radio Access Technologies (RATs) operating together. Mobile terminals operating in this heterogeneous environment will have different QoS requirements to be handled by the system, determined by a set of QoS parameters. Radio resource management is one of the key challenges in NGWN. Call admission control, one of the radio resource management techniques, plays an instrumental role in ensuring the desired QoS for users running applications with diversified QoS requirements on wireless networks. The call blocking probability is one such QoS parameter, and for better QoS it is desirable to reduce it. In this scenario it is highly desirable to obtain an analytic performance model. In this paper we propose a higher-order Markov chain based performance model for call admission control in a heterogeneous wireless network environment. In the proposed algorithm we consider three classes of traffic with different QoS requirements, in a heterogeneous network environment whose RATs can effectively handle applications such as voice calls, web browsing and file transfer, each with its own QoS parameters. The paper presents the call blocking probabilities for all three types of traffic, for both fixed and varied traffic scenarios.

Keywords: Radio Access Technologies; Call admission control; Call blocking probability; Markov model; Heterogeneous wireless Networks.
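How a blocking probability falls out of a Markov chain can be shown with the far simpler single-class M/M/C/C model, whose stationary distribution yields the classical Erlang B formula. The channel count and offered load below are assumed example figures; the paper's higher-order multi-class chain is not reproduced.

```python
# Blocking probability of a single-class M/M/C/C loss system via the
# Erlang B recursion. This is a much smaller model than the paper's
# higher-order multi-class Markov chain; the channel count and offered
# load are assumed example figures.
def erlang_b(channels, offered_load):
    b = 1.0                              # blocking with zero channels
    for c in range(1, channels + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

blocking = erlang_b(20, 12.0)            # 20 channels, 12 Erlangs offered
print(round(blocking, 4))
```

The recursion is numerically stable even for large channel counts; a multi-class model like the paper's replaces this one-dimensional birth-death chain with a multidimensional state space, but the blocking probability is still read off the stationary distribution in the same way.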
33. Paper 05110908: Efficient Candidacy Reduction For Frequent Pattern Mining (pp. 230-237)
Full Text: PDF
Mohammad Nadimi-Shahraki, Faculty of Computer Engineering, Islamic Azad University, Najafabad Branch, Iran, & Ph.D. Candidate of Computer Science, Universiti Putra Malaysia
Norwati Mustapha, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM), Selangor, Malaysia.
Md Nasir B Sulaiman, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM), Selangor, Malaysia.
Ali B Mamat, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM), Selangor, Malaysia.

Abstract— Nowadays, knowledge discovery, or extracting knowledge from large amounts of data, is a desirable task in competitive businesses. Data mining is a main step in the knowledge discovery process, and frequent patterns play a central role in data mining tasks such as clustering, classification, and association analysis. Identifying all frequent patterns is the most time-consuming process due to the massive number of candidate patterns. Over the past decade there has been an increasing number of efficient algorithms to mine frequent patterns. However, reducing the number of candidate patterns and the number of comparisons for support counting are still two open problems in this field, which have made frequent pattern mining one of the active research themes in data mining. A reasonable solution is to identify a small candidate pattern set from which all frequent patterns can be generated. In this paper, a method is proposed based on a new candidate set, called the candidate head set or H, which forms a small set of candidate patterns. The experimental results verify the accuracy of the proposed method and the reduction in the number of candidate patterns and comparisons.

Keywords- Data mining; Frequent patterns; Maximal frequent patterns; Candidate pattern
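A baseline for candidate reduction is the classic anti-monotone (Apriori-style) pruning sketched below: a pattern can only be frequent if all of its subsets are. The paper's candidate head set H is a further reduction beyond this and is not implemented here.

```python
from itertools import combinations

# Apriori-style mining with anti-monotone candidate pruning; shown as
# the baseline that candidate-reduction methods such as the paper's
# head set H improve upon.
def frequent_itemsets(transactions, minsup):
    items = {i for t in transactions for i in t}
    level = [frozenset([i]) for i in sorted(items)]
    frequent = {}
    while level:
        # count support of the surviving candidates
        counts = {c: sum(1 for t in transactions if c <= t) for c in level}
        survivors = {c: n for c, n in counts.items() if n >= minsup}
        frequent.update(survivors)
        # next level: join, then prune any candidate with an
        # infrequent (k-1)-subset -- the anti-monotonic property
        prev = set(survivors)
        joined = {a | b for a in prev for b in prev if len(a | b) == len(a) + 1}
        level = [c for c in joined
                 if all(frozenset(s) in prev for s in combinations(c, len(c) - 1))]
    return frequent

T = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
fi = frequent_itemsets(T, minsup=2)
```

On this toy database all singletons and pairs are frequent, while {a, b, c} survives the subset check but fails the support count; support counting over every candidate at every level is exactly the cost the paper's smaller candidate set targets.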
34. Paper 30110964: Application of a Fuzzy Programming Technique to Production Planning in the Textile Industry (pp. 238-243)
Full Text: PDF
I. Elamvazuthi, T. Ganesan, P. Vasant, Universiti Teknologi PETRONAS, Tronoh, Malaysia
J. F. Webb, Swinburne University of Technology Sarawak Campus, Kuching, Sarawak, Malaysia

Abstract—Many engineering optimization problems can be considered as linear programming problems where all or some of the parameters involved are linguistic in nature. These can only be quantified using fuzzy sets. The aim of this paper is to solve a fuzzy linear programming problem in which the parameters involved are fuzzy quantities with logistic membership functions. To explore the applicability of the method a numerical example is considered to determine the monthly production planning quotas and profit of a home-textile group. 

Keywords: fuzzy set theory, fuzzy linear programming, logistic membership function, decision making
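A logistic membership function of the general shape described in the abstract can be sketched as follows; the steepness value and bounds are assumed examples, not the paper's fitted parameters.

```python
import math

# Logistic membership function grading how well a fuzzy quantity x is
# satisfied between a fully acceptable level lo and a fully
# unacceptable level hi; gamma controls the vagueness of the
# transition. The logistic shape follows the general form in the
# fuzzy LP literature; the parameter values are assumed examples.
def mu(x, lo, hi, gamma=13.8):
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    t = (x - lo) / (hi - lo)               # position within the fuzzy band
    return 1.0 / (1.0 + math.exp(gamma * (t - 0.5)))

# satisfaction degrades smoothly from lo toward hi
print([round(mu(x, 100.0, 200.0), 3) for x in (100, 125, 150, 175, 200)])
```

In the fuzzy linear program, each linguistic parameter carries such a membership grade, and the solver trades off the objective against keeping all grades acceptably high.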
35. Paper 30110971: The Application of Mamdani Fuzzy Model for Auto Zoom Function of a Digital Camera (pp. 244-249)
I. Elamvazuthi, P. Vasant, Universiti Teknologi PETRONAS, Tronoh, Malaysia
J. F. Webb, Swinburne University of Technology Sarawak Campus, Kuching, Sarawak, Malaysia

Abstract—The Mamdani fuzzy model is an important technique in Computational Intelligence (CI). This paper presents an implementation of a supervised learning method based on membership function training in the context of Mamdani fuzzy models. Specifically, the auto zoom function of a digital camera is modelled using the Mamdani technique. The performance of the control method is verified through a series of simulations, and numerical results are provided as illustrations.

Keywords-component: Mamdani fuzzy model, fuzzy logic, auto zoom, digital camera
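A minimal single-input Mamdani controller, with min-implication, max aggregation and centroid defuzzification, can be sketched as below. The membership functions, rule base, and universes are invented for illustration; the paper's trained camera model is not reproduced.

```python
# Minimal single-input Mamdani sketch: subject distance (metres)
# drives a zoom factor via two rules, min-implication, max
# aggregation, and centroid defuzzification. All membership
# functions, rules, and universes are invented for illustration.
def tri(x, a, b, c):
    # triangular membership function peaking at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def zoom(d):
    near = tri(d, -10.0, 0.0, 12.0)    # rule 1: subject near -> wide
    far = tri(d, 0.0, 12.0, 24.0)      # rule 2: subject far -> tele
    num = den = 0.0
    for i in range(10, 51):            # zoom output universe 1.0x .. 5.0x
        z = i / 10.0
        m = max(min(near, tri(z, -1.0, 1.0, 3.0)),  # clipped "wide" set
                min(far, tri(z, 3.0, 5.0, 7.0)))    # clipped "tele" set
        num += z * m                   # centroid accumulation
        den += m
    return num / den if den else 1.0

assert zoom(1.0) < zoom(11.0)          # farther subject, more zoom
```

The supervised learning the abstract mentions would then adjust the triangle parameters so that the defuzzified zoom matches recorded training data, rather than using hand-picked values as here.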
36. Paper 30110968: Comparative Evaluation and Analysis of IAX and RSW (pp. 250-252)
Full Text: PDF
Manjur S Kolhar, Mosleh M. Abu-Alhaj, Omar Abouabdalla, Tat Chee Wan, and Ahmad M. Manasrah
National Advanced IPv6 Centre of Excellence, Universiti Sains Malaysia, Penang, Malaysia

Abstract— Voice over IP (VoIP) is a technology for transporting media over IP networks such as the Internet. VoIP can connect people over packet-switched networks instead of traditional circuit-switched networks. Recently, the InterAsterisk eXchange protocol (IAX) has emerged as a new VoIP protocol that is gaining popularity among VoIP products. IAX is known for its simplicity, NAT-friendliness, efficiency, and robustness. More recently, the Real-time Switching (RSW) control criterion has emerged as a multimedia conferencing protocol. In this paper, we make a comparative evaluation and analysis of IAX and RSW using the Mean Opinion Score (MOS) rating and find that both perform well under different network packet delays.

Keywords-VoIP; MOS; InterAsterisk eXchange Protocol and Real Time Switching Control Criteria.

37. Paper 09090907: On Utilization and Importance of Perl Status Reporter (SRr) in Text Mining
Full Text: PDF
Sugam Sharma, Department of Computer Science, Iowa State University, USA 
Tzusheng Pei, and Hari Cohly, Center for Bioinformatics, Jackson State University, USA

Abstract - In bioinformatics, text mining (the terms text mining and text data mining are sometimes used interchangeably) is a process for deriving high-quality information from text. Perl Status Reporter (SRr) [1] is a data-fetching tool for flat text files, and in this paper we illustrate the use of SRr in text/data mining. SRr needs a flat text input file on which the mining process is to be performed. SRr reads the input file and derives high-quality information from it. Typical text mining tasks are text categorization, text clustering, concept and entity extraction, and document summarization. SRr can be utilized for any of these tasks with little or no customization effort. In our implementation we perform a text categorization mining operation on the input file. The input file has two parameters of interest (firstKey and secondKey); the combination of these two parameters describes the uniqueness of entries in the file, in the same manner as a composite key in a database. SRr reads the input file line by line, extracts the parameters of interest, and forms a composite key by joining them together. It subsequently generates an output file named firstKey_secondKey. SRr tracks the composite key and stores all data lines having the same composite key in the output file generated for that key.

Keywords: Perl, regexpr, File handler.
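SRr itself is a Perl tool; the composite-key categorization the abstract describes can be rendered in Python as follows. The field positions, example data, and in-memory buckets (in place of per-key output files) are assumptions for illustration.

```python
from collections import defaultdict

# Python rendering of the composite-key categorization SRr performs
# (the tool itself is Perl); the firstKey_secondKey naming follows the
# abstract, while field positions and example data are assumed.
def categorize(lines, first_idx=0, second_idx=1):
    buckets = defaultdict(list)
    for line in lines:
        fields = line.split()
        if len(fields) <= max(first_idx, second_idx):
            continue                      # skip malformed lines
        key = f"{fields[first_idx]}_{fields[second_idx]}"  # composite key
        buckets[key].append(line)         # one output bucket per key
    return buckets

rows = ["geneA liver 0.91", "geneA liver 0.45", "geneB brain 0.77"]
out = categorize(rows)
print(sorted(out))                        # → ['geneA_liver', 'geneB_brain']
```

Each bucket corresponds to one output file in SRr's scheme: all lines sharing a composite key land together, which is the text categorization operation the abstract describes.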