
Vol. 8 No. 2 May 2010 International Journal of Computer Science and Information Security

Publication: May 2010, Volume 8 No. 2

.

Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

1. Paper 22041035: Policy-based Self-Adaptive Media Service Architecture for Reliable Multimedia Service Provisioning (pp. 1-8)

Full Text: PDF

.

1 G. Maria kalavathy, 2 N. Edison Rathinam and 3 P. Seethalakshmi

1 Sathyabama University, Chennai, India

2 Madras University, Chennai, India

3 Anna University, Tiruchirapalli, India

.

Abstract— The main objective of this paper is to design and develop the Self-Adaptive Media Service Architecture (SAMSA) for providing reliable multimedia services through policy-based actions. The distributed multimedia services deployed using SOA can be accessed in heterogeneous environments that are prone to changes at run-time. To provide reliable multimedia services, a powerful self-adaptable architecture is necessary that adapts at run time and reacts to the environment. The adaptability in the proposed architecture is achieved by enabling the service providers to Monitor, Analyze and Act on defined policies that support customization of compositions of multimedia services. The Media Service Monitor (MSM) observes the business and quality metrics associated with the media services at run-time. The Adaptive Media Service Manager (AMSM) takes corrective actions based on the monitored results, through policies defined as an extension of WS-Policy. The effectiveness of the proposed SAMSA has been evaluated on a Dynamic Composite Real-Time Video on Demand Web Service (DCRVWS) for a maximum of 200 simultaneous client requests. The analysis of results shows that the proposed architecture provides a 20% improvement in reliability, response time and user satisfaction.

.

Index Terms— DCRVWS, Media Service Monitor, Reliable Multimedia Service, SAMSA.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

2. Paper 22041036: Marker-less 3D Human Body Modeling using Thinning algorithm in Monocular Video (pp. 9-15)

Full Text: PDF

.

K. Srinivasan, Department of EIE, Sri Ramakrishna Engineering College, Coimbatore, India

K. Porkumaran, Department of EEE, Dr. N. G. P Institute of Technology, Coimbatore, India

G. Sainarayanan, Head, R & D, ICT Academy of Tamilnadu, Chennai, India

.

Abstract— Automatic marker-less 3D human body modeling for the motion analysis in security systems has been an active research field in computer vision. This research work attempts to develop an approach for 3D human body modeling using thinning algorithm in monocular indoor video sequences for the activity analysis. Here, the thinning algorithm has been used to extract the skeleton of the human body for the pre-defined poses. This approach includes 13 feature points such as Head, Neck, Left shoulder, Right shoulder, Left hand elbow, Right hand elbow, Abdomen, Left hand, Right hand, Left knee, Right knee, Left leg and Right leg in the upper body as well as in the lower body. Here, eleven activities have been analyzed for different videos and persons who are wearing half sleeve and full sleeve shirts. We evaluate the time utilization and efficiency of our proposed algorithm. Experimental results validate both the likelihood and the effectiveness of the proposed method for the analysis of human activities.

.

Keywords- Video surveillance, Background subtraction, Human body modeling, Thinning algorithm, Activity analysis.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

3. Paper 25041045: Cryptanalysis on two multi-server password based authentication protocols (pp. 16-20)

Full Text: PDF

.

Jue-Sam Chou 1, Chun-Hui Huang 2, Yalin Chen *3,

1 Department of Information Management, Nanhua University, Taiwan

2 Department of Information Management, Nanhua University, Taiwan

3 Institute of Information Systems and Applications, National Tsing Hua University, Taiwan

.

Abstract- In 2004 and 2005, Tsaur et al. proposed two smart card based password authentication protocols for multi-server environments. They claimed that their protocols are safe and can withstand various kinds of attacks. However, after analyses, we found both of them have some security loopholes. In this article, we will demonstrate the security loopholes of the two protocols.

.

Keywords- multi-server; remote password authentication; smart card; key agreement; Lagrange interpolating polynomial

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

4. Paper 27041056: An Efficient Feature Extraction Technique for Texture Learning (pp. 21-28)

Full Text: PDF

.

R. Suguna, Research Scholar, Department of Information Technology Madras Institute of Technology, Anna University, Chennai- 600 044, Tamil Nadu, India.

P. Anandhakumar, Assistant Professor, Department of Information Tech., Madras Institute of Technology, Anna University, Chennai- 600 044, Tamil Nadu, India.

.

Abstract— This paper presents a new methodology for discovering features of texture images. An Orthonormal Polynomial based Transform is used to extract the features from the images. Polynomial operators of different sizes are generated from the orthonormal polynomial basis functions. These operators are applied over the images to capture the texture features. The training images are segmented into fixed-size blocks and features are extracted from them. The operators are applied over each block and their inner product yields the transform coefficients. This set of transform coefficients forms the feature set of a particular texture class. Using a clustering technique, a codebook is generated for each class. Then significant class representative vectors, which characterize the textures, are calculated. Once the orthonormal basis function of a particular size is found, the operators can be realized with a few matrix operations and hence the approach is computationally simple. The Euclidean distance measure is used in the classification phase. The transform coefficients have rotation-invariant capability. In the training phase the classifier is trained with samples at one particular rotation angle and tested with samples at different angles. Texture images are collected from the Brodatz album. Experimental results prove that the proposed approach provides good discrimination between the textures.
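The classification phase described above assigns a feature vector to the class whose representative vector is nearest in Euclidean distance. The following minimal Python sketch illustrates that generic nearest-representative step; the class names, dimensionality and values are made up for illustration and are not taken from the paper.

import numpy as np

def classify(feature_vector, class_representatives):
    # Return the label of the class representative nearest in Euclidean distance.
    best_label, best_dist = None, float("inf")
    for label, rep in class_representatives.items():
        dist = np.linalg.norm(feature_vector - rep)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical 4-dimensional class representatives for two texture classes
reps = {"grass": np.array([0.8, 0.1, 0.3, 0.5]),
        "brick": np.array([0.2, 0.9, 0.6, 0.1])}
print(classify(np.array([0.7, 0.2, 0.4, 0.5]), reps))  # -> grass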

.

Keywords- Texture Analysis; Orthonormal Transform; Codebook Generation; Texture Class Representatives; Texture Characterization.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

5. Paper 30041077: A Comparative Study of Microarray Data Classification with Missing Values Imputation (pp. 29-32)

Full Text: PDF

.

Kairung Hengpraphrom 1, Sageemas Na Wichian 2 and Phayung Meesad 3

1 Department of Information Technology, Faculty of Information Technology

2 Department of Social and Applied Science, College of Industrial Technology

3 Department of Teacher Training in Electrical Engineering, Faculty of Technical Education

King Mongkut's University of Technology North Bangkok, 1518 Piboolsongkram Rd.Bangsue, Bangkok 10800, Thailand

.

Abstract—Incomplete data is an important problem in data mining, since it makes the consequent downstream analysis less effective. Most algorithms for statistical data analysis need a complete set of data. Microarray data usually consist of a small number of samples with high dimensionality and a number of missing values. Many missing value imputation methods have been developed for microarray data, but only a few studies have investigated the relationship between the missing value imputation method and classification accuracy. In this paper we carry out experiments with the Colon Cancer dataset to evaluate the effectiveness of four methods for dealing with missing value imputation: the row average method, KNN imputation, KNNFS imputation and the Multiple Linear Regression imputation procedure. The classifier considered is the Support Vector Machine (SVM).
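For readers unfamiliar with KNN imputation, the sketch below fills each missing entry with the mean of the corresponding column over the k nearest rows, where distance is computed on the columns both rows have observed. It is a minimal generic illustration in Python, not the exact variant (or the KNNFS extension) evaluated in the paper; the toy matrix and value of k are arbitrary.

import numpy as np

def knn_impute(data, k=3):
    # Fill NaN entries using the mean of the k nearest rows (simple KNN imputation).
    data = np.asarray(data, dtype=float)
    filled = data.copy()
    for i, row in enumerate(data):
        missing = np.isnan(row)
        if not missing.any():
            continue
        candidates = []
        for j, other in enumerate(data):
            if j == i or np.isnan(other[missing]).any():
                continue                      # a neighbour must observe the missing columns
            shared = ~np.isnan(row) & ~np.isnan(other)
            if shared.any():
                candidates.append((np.linalg.norm(row[shared] - other[shared]), j))
        neighbours = [j for _, j in sorted(candidates)[:k]]
        if neighbours:
            filled[i, missing] = data[neighbours][:, missing].mean(axis=0)
    return filled

X = [[1.0, 2.0, np.nan],
     [1.1, 1.9, 3.0],
     [0.9, 2.1, 2.8],
     [5.0, 5.0, 5.0]]
print(knn_impute(X, k=2))   # the NaN is replaced by the mean of its two nearest rows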

.

Keywords: KNN, Regression, Microarray, Imputation, Missing Values

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

6. Paper 30041079: Dependability Analysis on Web Service Security: Business Logic Driven Approach (pp. 33-42)

Full Text: PDF

.

Saleem Basha, Department of Computer Science, Pondicherry University, Puducherry, India

Dhavachelvan Ponnurangam, Department of Computer Science, Pondicherry University, Puducherry, India

Abstract— In the modern computing world, the Internet and e-business are a composite blend of web services and technology. Organizations must secure their computing systems or risk malicious attacks. Business logic is the fundamental driver of computer-based business tasks, where business processes and business functions add their features to better illustrate the abstract view of the business domain. The advent and astronomical rise of the Internet and e-business make business logic specify and drive web services. Because web services are loosely coupled with applications, analyzing the dependability of the business logic becomes an essential artifact in producing the complex web service compositions and orchestrations needed to complete a business task. This paper extends the Markov chain for the dependability analysis of business-logic-driven web service security.

.

Keywords- Web Service; Dependability Analysis; Business Logic; Web Service Security

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

7. Paper 18031020: Data mining Aided Proficient approach for optimal inventory control in supply chain management (pp. 43-50)

Full Text: PDF

.

Chitriki Thotappa, Assistant Professor, Department of Mechanical Engineering, Proudadevaraya Institute of Technology, Hospet. Visvesvaraya Technological University, Karnataka, India

Dr. Karnam Ravindranath, Principal, Annamacharya Institute of Technology, Tirupati

.

Abstract— Optimal inventory control is one of the significant tasks in supply chain management. Optimal inventory control methodologies intend to reduce the supply chain (SC) cost by controlling the inventory in an effective manner, so that the SC members are affected by neither surplus nor shortage of inventory. In this paper, we propose an efficient approach that effectively utilizes data mining concepts as well as a genetic algorithm for optimal inventory control. The proposed approach consists of two major functions: mining association rules for inventory and selecting SC cost-impact rules. Firstly, association rules are mined from EMA-based inventory data, which is determined from the original historical data. Apriori, a classic data mining algorithm, is utilized for mining association rules from the EMA-based inventory data. Secondly, with the aid of a genetic algorithm, SC cost-impact rules are selected for every SC member. The obtained SC cost-impact rules signify the probable future state of inventory at any SC member. Moreover, the level of holding or reducing the inventory can be determined from the SC cost-impact rules. Thus, the SC cost-impact rules derived using the proposed approach greatly facilitate optimal inventory control and hence make supply chain management more effective.

.

Keywords-SC cost; SC cost-impact rule; EMA-based inventory; Apriori; Genetic Algorithm (GA).

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

8. Paper 16041024: Robust Video Watermarking Algorithm Using Spatial Domain Against Geometric Attacks (pp. 51-58)

Full Text: PDF

.

Sadik Ali. M. Al-Taweel, Putra. Sumari, Saleh Ali K. Alomari

School of Computer Science, Universiti Sains Malaysia, 11800 Penang, Malaysia

.

Abstract— Digital watermarking is important for copyright protection of digital data and multimedia, such as video, music, text, and images, because network and multimedia techniques make copying easy. One of the significant problems in video watermarking is geometric attacks. In this paper a new robust watermarking algorithm is proposed, based on the spatial domain, which is robust against geometric attacks such as downscaling, cropping, rotation, and frame dropping. In addition, the embedded data rate is high and robust. The experimental results show that the embedded watermark is robust and invisible. The watermark was successfully extracted from the video after various attacks.

.

Keywords- Video watermarking, geometric attacks, copyright protection.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

9. Paper 10041017: An Energy Efficient Reliable Multipath Routing Protocol for Data Gathering In Wireless Sensor Networks (pp. 59-64)

Full Text: PDF

.

U. B. Mahadevaswamy, Assistant Professor, Department of Electronics and Communication, Sri Jayachamarajendra college of Engineering, Mysore, Karnataka, India.

M. N. Shanmukhaswamy, Professor, Department of Electronics and communication, Sri Jayachamarajendra college of Engineering, Mysore, Karnataka, India.

.

Abstract—In Wireless Sensor Networks (WSN), the protocols available today have their own sets of problems and most of them deal with energy efficiency. Little specific work has been done on high network traffic or contention issues, and significant work remains on robustness and reliability. An important topic addressed by the wireless sensor networks community has been in-network data aggregation, because of the severe energy constraints of sensor nodes and the limited transport capacity of multihop wireless networks. In this paper, we propose an energy efficient, reliable multipath routing protocol for data gathering in wireless sensor networks. This protocol is intended to provide a reliable transmission environment with low energy consumption, by efficiently utilizing the energy availability of the forwarding nodes to gather and deliver data to the sink according to its requirements. By simulation results, we show that our proposed algorithm attains a good packet delivery ratio with reduced energy consumption and delay.

.

Keywords- WSN, Multipath Routing Protocol, contention issue, sensor nodes, energy consumption, sink, Periodic Interest Propagation.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

10. Paper 10041013: A Novel Approach towards Cost Effective Region-Based Group Key Agreement Protocol for Secure Group Communication (pp. 65-74)

Full Text: PDF

.

K. Kumar, Research Scholar, Lecturer in CSE Government College of Engg, Bargur- 635104, Tamil Nadu, India

J. Nafeesa Begum, Research Scholar & Sr. Lecturer in CSE, Government College of Engg, Bargur- 635104, Tamil Nadu, India

Dr.V. Sumathy, Asst .Professor in ECE, Government College of Technology, Coimbatore, Tamil Nadu, India

.

Abstract—This paper addresses an interesting security problem in wireless ad hoc networks: dynamic group key agreement and key establishment. For secure group communication in an ad hoc network, a group key shared by all group members is required. This group key should be updated when there are membership changes (when a new member joins or a current member leaves) in the group. In this paper, we propose a novel, secure, scalable and efficient Region-Based Group Key Agreement protocol (RBGKA) for ad hoc networks. It is implemented by a two-level structure and a new scheme of group key update. The idea is to divide the group into subgroups, each maintaining its subgroup keys using the Group Diffie-Hellman (GDH) protocol and linking with other subgroups in a tree structure using the Tree-based Group Diffie-Hellman (TGDH) protocol. By introducing the region-based approach, messages and key updates are limited to the subgroup and the outer group; hence the computation load is distributed among many hosts. Both theoretical analysis and experimental results show that this region-based key agreement protocol performs better for the key establishment problem in ad hoc networks in terms of memory cost, computation cost and communication cost.

.

Keywords- Ad Hoc Network, Region-Based Group Key Agreement Protocol, Group Diffie-Hellman, Tree-Based Group Diffie-Hellman.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

11. Paper 07041002: Image Processing Algorithm JPEG to Binary Conversion (pp. 75-77)

Full Text: PDF

.

Mansi Gupta, Dept. of Computer Sc. & Engg., Lingaya’s University, Faridabad, Haryana, India

Meha Garg, Dept. of Computer Sc. & Engg., Lingaya’s University, Faridabad, Haryana,India

Prateek Dhawan, Dept. of Computer Sc. & Engg., Lingaya’s University, Faridabad, Haryana, India

.

Abstract – The JPEG processing algorithm works best on photographs and paintings of realistic scenes with smooth variations of tone and colour, but is not well suited to files that will undergo multiple edits. The direct conversion of a JPEG image into binary format is very low in efficiency. In this paper, the conversion of a JPEG image to a binary image is carried out step by step, without using MATLAB's built-in JPEG-to-binary conversion. Since the binary image is used for comparison purposes, the JPEG image is converted into LAB format to make the luminance scale perceptually more uniform, so that the procedure becomes more efficient.
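The LAB-then-threshold idea can be illustrated outside MATLAB as well. The Python sketch below converts an RGB image to CIELAB and thresholds the L (lightness) channel to obtain a binary image; the use of scikit-image and the threshold value of 50 are assumptions made for illustration, not the paper's step-by-step procedure.

import numpy as np
from skimage import color   # scikit-image provides the RGB -> CIELAB conversion

def rgb_to_binary(rgb_image, threshold=50.0):
    # Threshold the perceptually uniform lightness channel (L ranges over 0..100).
    lab = color.rgb2lab(rgb_image)
    luminance = lab[:, :, 0]
    return luminance > threshold            # boolean (binary) image

# Tiny synthetic image: one dark pixel and one bright pixel
img = np.array([[[0.1, 0.1, 0.1], [0.9, 0.9, 0.9]]])
print(rgb_to_binary(img))                   # [[False  True]]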

.

Keywords: LAB, Binary image, sign language

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

12. Paper 09041007: Ontology Based Information Retrieval for E-Tourism (pp. 78-83)

Full Text: PDF

.

G. Sudha Sadasivam, C.Kavitha, M.SaravanaPriya

PSG College of Technology, Coimbatore, India

.

Abstract - This paper reports work done in the E-Tourism project. The overall goal of the project is to improve information creation, maintenance and delivery in the tourism industry by introducing semantic technologies. This paper analyzes the weaknesses of keyword-based techniques and proposes the need for semantic-based intelligent information retrieval for the tourism domain. The Semantic Web is an evolving development of the World Wide Web in which the meaning of information and services on the web is defined, making it possible for the web to understand and satisfy the requests of people and machines using web content. It also supports the transparent exchange of information and knowledge among collaborating e-business organizations and focuses on the meaningful exchange of knowledge between organizations. A major challenge faced by semantic web applications is the modeling of ontologies and ontology-based information retrieval. A software framework has been developed using the Protégé tool for the travel and tourism domain. This framework facilitates the creation and maintenance of ontologies. The paper also proposes two methods for information retrieval, namely a top-down and a bottom-up approach. A comparison of these two approaches is also presented in the paper.

.

Keywords: Semantic Web, Keyword based Search Engine, Ontology, Protégé Tool, Jambalaya, Jena Agent.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

13. Paper 10041011: Mean – Variance parametric Model for the Classification based on Cries of Babies (pp. 84-88)

Full Text: PDF

.

Khalid Nazim S. A., Dr. M.B Sanjay Pande

Department of Computer Science & Engineering, GSSSIETW, Mysore, India

.

Abstract- Cry is a feature which prompts an individual to take care of the infant that has initiated it. It is also equally understood that a cry makes a person take certain steps. In the present work, we have tried to implement a mathematical model which can classify a cry into its cluster or group based on certain parameters, by which a cry is classified as normal or abnormal. To corroborate the methodology we took 17 distinguishing features of cry. The implemented mathematical model takes into account Doyle's distance to identify the required features, out of the 17, for classifying the dataset. A dataset of 100 samples was used to substantiate the efficacy of the model.

.

Keywords: Cry, Doyle’s distance.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

14. Paper 10041014: Comparative Performance of Information Hiding in Vector Quantized Codebooks using LBG, KPE, KMCG and KFCG (pp. 89-95)

Full Text: PDF

.

Dr. H.B. Kekre, Senior Professor, MPSTME, NMIMS University, Vile-parle(W), Mumbai-56, India

Archana Athawale, Assistant Professor, Thadomal Shahani Engineering College, Bandra(W), Mumbai-50, India

Ms. Tanuja K. Sarode, Assistant Professor, Thadomal Shahani Engineering College, Bandra(W), Mumbai-5, India

Kalpana Sagvekar, Lecturer, Fr. Conceicao Rodrigues COE, Bandra(W), Mumbai-50, India

.

Abstract - In traditional VQ data hiding schemes, secret data is hidden inside an index-based cover image, resulting in limited embedding capacity. To improve the embedding capacity as well as to keep distortion of the carrier media to a minimum, we have proposed a novel method of hiding secret data in the codebook. In this paper we have used four different algorithms, Linde-Buzo-Gray (LBG), Kekre's Proportionate Error (KPE), Kekre's Median Codebook Generation algorithm (KMCG) and Kekre's Fast Codebook Generation algorithm (KFCG), to prepare codebooks. It is observed that KFCG gives minimum distortion.

.

Keywords - Reversible (lossless) data hiding, VQ, LBG, KPE, KMCG, KFCG.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

15. Paper 10041016: Registration of Brain Images using Fast Walsh Hadamard Transform (pp. 96-105)

Full Text: PDF

.

D. Sasikala 1 and R. Neelaveni 2

1 Research Scholar, Assistant Professor, Bannari Amman Institute of Technology, Sathyamangalam. Tamil Nadu - 638401.

2 Assistant Professor, PSG College of Technology, Coimbatore, Tamil Nadu - 641004.

.

Abstract - Many image registration techniques have been developed, with great significance for data analysis in medicine, astrophotography, satellite imaging and a few other areas. This work proposes a method for medical image registration using the Fast Walsh Hadamard transform. The algorithm registers images of the same or different modalities. Each image is expanded in terms of Fast Walsh Hadamard basis functions. Each basis function captures a particular aspect of local structure, e.g., horizontal edge, corner, etc. These coefficients are normalized and used as numerals in a chosen number system, which allows one to form a unique number for each type of local structure. The experimental results show that the Fast Walsh Hadamard transform accomplishes better results than the conventional Walsh transform in the time domain. The Fast Walsh Hadamard transform is also more reliable in medical image registration and consumes less time.
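The core transform named above has a simple in-place butterfly form. The following Python sketch shows a generic, unnormalized Fast Walsh-Hadamard Transform for a signal whose length is a power of two; the ordering and normalization conventions are standard textbook choices and may differ from those used in the paper.

import numpy as np

def fwht(signal):
    # Iterative, in-place Fast Walsh-Hadamard Transform (unnormalized, natural ordering).
    a = np.array(signal, dtype=float)
    n = len(a)                  # n must be a power of two
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

print(fwht([1, 0, 1, 0, 0, 1, 1, 0]))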

.

Keywords: Walsh Transform, Fast Walsh Hadamard Transform, Local Structure, Medical Image Registration, Normalization.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

16. Paper 12031007: Multi - Level Intrusion Detection Model Using Mobile Agents In Distributed Network Environment (pp. 106-111)

Full Text: PDF

.

S. Ramamoorthy, Sathyabama University, Chennai

Dr. V. Shanthi, St. Joseph's College of Engineering, Chennai

.

Abstract - Computer security in today's networks is one of the fastest expanding areas of the computer industry. Protecting resources from intruders is a difficult task that must be automated so that it is efficient and responsive. Most intrusion-detection systems currently rely on some type of centralized processing to analyze the data necessary to detect an intruder in real time. A centralized approach can be vulnerable to attack: if an intruder can disable the central detection system, then most protection is weakened. The paper presented here demonstrates that independent detection agents can be run in a distributed fashion at three levels, each operating mostly independently of the others, cooperating and communicating with the help of mobile agents to provide a truly distributed detection mechanism without a single point of failure. The agents can run alongside user and system applications without much consumption of system resources, and without generating much network traffic during an attack.

.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

17. Paper 14041021: Defending AODV Routing Protocol Against the Black Hole Attack (pp. 112-117)

Full Text: PDF

.

Fatima Ameza, Department of computer sciences, University of Bejaia, 06000 Algeria.

Nassima Assam, Department of computer sciences, University of Bejaia, 06000 Algeria.

Rachid Beghdad, LAMOS laboratory, Faculty of Sciences, University of Bejaia, 06000 Algeria.

.

Abstract—In this paper we propose a simple method to detect black hole attacks in the Ad hoc On-Demand Distance Vector (AODV) routing protocol. Many previous works have focused on authentication and cryptography techniques; nevertheless, these techniques suffer from some weaknesses. In fact, this kind of solution is just a first line of defense, which should be complemented by an intrusion detection system as a second line. The second line proposed here consists of including the source route in the header of the control packets (RREQ). In addition, every intermediate node records the sequence number of the destination. Thus, if a packet is compromised, the destination node can easily retrieve the address of the attacker. To secure RREP packets, every intermediate node records the addresses of the nodes to which it forwards RREQs. Thus, any node receiving an RREP can check whether the sender is legitimate or not. Simulation results show the robustness of our protocol: it delivers a high ratio of data and incurs less route establishment delay.

.

Keywords- AODV routing protocol; Black hole attacks; Intrusion detection; Reactive routing protocols; Wireless ad hoc networks.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

18. Paper 20041030: An Efficient OFDM Transceiver Design suitable to IEEE 802.11a WLAN standard (pp. 118-122)

Full Text: PDF

.

T. Suresh, Research Scholar, R.M.K Engineering College, Anna University, Chennai TamilNadu, India

Dr. K. L. Shunmugathan, Professor & Head, Department of CSE, R.M.K Engineering College, Kavaraipettai, TamilNadu, India

.

Abstract—In today's advanced communication technology, the multicarrier modulation Orthogonal Frequency Division Multiplexing (OFDM) has become widespread, mostly in the fields of wireless and wired communications such as digital audio/video broadcast (DAB/DVB), wireless LAN (802.11a and HiperLAN2), and broadband wireless (802.16). In this paper we discuss an efficient design technique for an OFDM transceiver according to the IEEE 802.11a WLAN standard. The various blocks of the OFDM transceiver are simulated using ModelSim SE v6.5 and implemented on a Xilinx Spartan-3E FPGA platform. Techniques such as pipelining and strength reduction are utilized to improve the performance of the system. The implementation results show a remarkable saving in consumed power and silicon area. Moreover, the design reduces hardware resources by utilizing efficient reconfigurable modules.

.

Keywords- FPGA; VHDL; OFDM; FFT; IFFT; IEEE 802.11a

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

19. Paper 20041031: Comparative analysis of smart antenna array, basis of beamforming schemes and algorithms: A Review (pp. 123-128)

Full Text: PDF

.

Abhishek Rawat , R. N. Yadav and S. C. Shrivastava

Maulana Azad National Institute Of Technology, Bhopal, INDIA

.

Abstract— The smart antenna array is a group of antennas in which the relative phases of the respective signals feeding the antennas are varied in such a way that the effective radiation pattern of the array is reinforced in a desired direction and suppressed in undesired directions. Smart antennas are arrays with smart signal processing algorithms used to identify spatial signal signatures, such as the direction of arrival of the signal, and use them to calculate beamforming vectors to track and locate the antenna beam on the mobile/target. An array antenna may be used to point a fixed radiation pattern, or to scan rapidly in azimuth or elevation. This paper explains the architecture and evolution of the smart antenna and how it differs from the basic antenna format. The paper further discusses different beamforming schemes and algorithms for smart antenna arrays.

.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

20. Paper 20041032: Comments on five smart card based password authentication protocols (pp. 129-132)

Full Text: PDF

.

Yalin Chen 1, Jue-Sam Chou 2,* , Chun-Hui Huang 3

1 Institute of information systems and applications, National Tsing Hua University, Taiwan

2 Department of Information Management, Nanhua University, Taiwan

3 Department of Information Management, Nanhua University, Taiwan

.

Abstract- In this paper, we use the ten security requirements proposed by Liao et al. for a smart card based authentication protocol to examine five recent works in this area. After analysis, we found that the protocols of Juang et al., Hsiang et al., Kim et al., and Li et al. all suffer from an offline password guessing attack if the smart card is lost, and that the protocol of Xu et al. is subject to an insider impersonation attack.

.

Keywords- password authentication protocol; insider attack; smart card loss problem; password guessing attack

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

21. Paper 20041033: Cryptanalysis on four two-party authentication protocols (pp. 133-137)

Full Text: PDF

.

Yalin Chen 1, Jue-Sam Chou 2,* , Chun-Hui Huang 3

1 Institute of information systems and applications, National Tsing Hua University, Taiwan

2 Department of Information Management, Nanhua University, Taiwan

3 Department of Information Management, Nanhua University, Taiwan

.

Abstract- In this paper, we analyze four authentication protocols, by Bindu et al., Goriparthi et al., Wang et al. and Holbl et al. After investigation, we reveal several weaknesses of these schemes. First, Bindu et al.'s protocol suffers from an insider impersonation attack if a malicious user obtains a lost smart card. Second, both Goriparthi et al.'s and Wang et al.'s protocols cannot withstand a DoS attack in the password change phase, i.e. an attacker can invoke this phase so that the user's password can never be used in subsequent authentications. Third, Holbl et al.'s protocol is vulnerable to an insider attack, since a legal but malevolent user can deduce the KGC's secret key.

.

Keywords- password authentication protocol; insider attack; denial-of-service attack; smart card lost problem; mutual authentication; man-in-the-middle attack

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

22. Paper 20041034: Software Metrics: Some degree of software measurement and analysis (pp. 138-144)

Full Text: PDF

.

Rakesh. L, Department of Computer-Science, SCT Institute of Technology, Bangalore, India-560075

Dr. Manoranjan Kumar Singh, PG Department of Mathematics, Magadh University, Bodhagaya, India-824234

Dr. Gunaseelan Devaraj, Department of Information Technology, Ibri college of Technology, Ibri, Sultanate of Oman- 516

.

Abstract — Measurement lies at the heart of many systems that govern our lives. Measurement is essential to our daily life; measuring has become commonplace and well accepted. Engineering disciplines use methods that are based on models and theories. Methodological improvements alone do not make an engineering discipline. Measurement encourages us to improve our processes and products. This paper examines the realm of software engineering to see why measurement is needed and also sets the scene for a new perspective on software reliability metrics and their improvement. Software measurement is not a mainstream topic within software engineering; rather, it is a diverse collection of fringe topics. Unlike in other engineering disciplines, measurement must still become an integral part of software engineering practice.

.

Keywords- External Attribute, Reliability model, Fault tolerance.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

23. Paper 22041038: Preprocessing of video image with unconstrained background for Drowsy Driver Detection (pp. 145-151)

Full Text: PDF

.

M. Moorthi 1, Dr. M.Arthanari 2, M.Sivakumar 3

1 Assistant Professor, Kongu Arts and Science College, Erode – 638 107, Tamil Nadu, India

2 Prof. & Head, Tejaa Sakthi Institute of Technology for Women, Coimbatore – 641 659, Tamil Nadu, India

3 Doctoral Research Scholar, Anna University, Coimbatore, Tamil Nadu, India

.

Abstract - Face recognition includes enhancement and segmentation of the face image, detection of the face boundary and facial features, matching of extracted features, and finally recognition of the face. Though a number of algorithms have been devised for face recognition, the technology is not mature enough to recognize a person's face when the algorithm must deal with a significant amount of illumination variation in the image. We propose a new image preprocessing algorithm that compensates for this problem. The proposed algorithm enhances the contrast of images by transforming the values in an intensity image so that the histogram of the output image is approximately uniformly distributed over the pixels. Our algorithm does not require any training steps or reflective surface models for illumination compensation. We apply the algorithm to face images prior to recognition. Simulation is done using seventy-five web camera images in MATLAB 7.0.

.

Keywords: Facial recognition, Facial features extraction, Eye detection

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

24. Paper 23041041: Ultra Fast Computing Using Photonic Crystal Based Logic Gates (pp. 152-155)

Full Text: PDF

.

X. Susan Christina, Dept. of ECE, Mookambigai College of Engg., Trichy- 622 502, India.

A. P. Kapilan, Dept. of ECE, Chettinad College of Engg & Tech, Karur - 639114, India.

P. Elizabeth Caroline, Dept. of ECE, JJ College of Engg & Tech, Trichy - 620 009, India

.

Abstract—A novel design of all-optical fundamental NAND and XNOR logic gates based on two-dimensional photonic crystals has been presented in this paper. In a photonic crystal, self-collimated beams are partially transmitted and partially reflected, with a phase lag, at a line defect in the Γ-X direction. By employing an appropriate phase shifter, the reflected and transmitted input beams are interfered constructively or destructively to obtain the required logic outputs. The operation of the logic gates is simulated using the two-dimensional Finite Difference Time Domain (FDTD) method.

.

Keywords- optical computing; logic gates; photonic crystal; self collimated beam; FDTD

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

25. Paper 25041043: Markov Chain Simulation of HIV/AIDS Movement Pattern (pp. 156-167)

Full Text: PDF

.

Ruth Stephen Bature, Department of Computer/Mathematical Science, School of Science Technology, Federal College of Chemical and Leather Technology, Zaria, Nigeria.

Obiniyi, A. A., Department of Mathematics, Ahmadu Bello University, Zaria, Nigeria

Ezugwu El-Shamir Absalom, Department of Mathematics, Ahmadu Bello University, Zaria, Nigeria

Sule, O. O., Department of Computer/Mathematical Science, School of Science Technology, Federal College of Chemical and Leather Technology, Zaria, Nigeria.

.

Abstract - The objective of this research work is to simulate the spread of HIV/AIDS from one generation to another, or from one person to another, with a view to contributing to the control of the disease. This is accomplished using the Markov chain method, with a computer program written in Java to simulate the process. This paper is also concerned with the movement pattern of HIV/AIDS from one generation to another over a period of 20 years. This can help professionals take probability measures of HIV/AIDS over a given period of time, within a specific area or location.
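A Markov chain of this kind evolves a population distribution over states by repeated multiplication with a transition matrix. The short Python sketch below steps a hypothetical three-state model over a 20-year horizon; the states and transition probabilities are invented purely for illustration and are not the figures used in the paper (which uses a Java program).

import numpy as np

# Hypothetical states: Susceptible, HIV-infected, AIDS (absorbing)
P = np.array([[0.95, 0.05, 0.00],    # transitions from Susceptible
              [0.00, 0.90, 0.10],    # transitions from HIV-infected
              [0.00, 0.00, 1.00]])   # transitions from AIDS

state = np.array([0.98, 0.02, 0.00]) # initial population distribution

for year in range(1, 21):            # 20-year horizon, as in the paper
    state = state @ P                # one Markov step: row vector times transition matrix
    print(f"Year {year:2d}: {np.round(state, 4)}")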

.

Keywords: HIV/AIDS, Markov Chain, Transition Matrix, Probability Matrix

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

26. Paper 25041044: Webpage Classification based on URL Features and Features of Sibling Pages (pp. 168-173)

Full Text: PDF

.

Sara Meshkizadeh, Department of Computer Engineering, Science and Research Branch, Islamic Azad University (IAU), Khouzestan, Iran

Dr. Amir Masoud Rahmani, Department of Computer Engineering, Science and Research Branch, Islamic Azad University (IAU), Tehran, Iran

Dr. Mashallah Abbasi Dezfuli, Department of Computer Engineering, Science and Research Branch, Islamic Azad University (IAU), Khouzestan, Iran

.

Abstract - Webpage classification plays an important role in information organization and retrieval. It involves the assignment of a webpage to one or more predetermined categories. The uncontrolled nature of web content implies that more work is required for webpage classification than for traditional text classification. The interconnected nature of hypertext, however, carries some features which contribute to the process, for example the URL features of a webpage. This study illustrates that by using such features along with features of sibling pages (pages sharing the same parent), together with a Bayesian algorithm for combining the results of these features, it is possible to improve the accuracy of webpage classification.

.

Keywords: classification, hyper text, URL, sibling pages, Bayesian algorithm

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

27. Paper 25041046: Clustering Unstructured Data (Flat Files) - An Implementation in Text Mining Tool (pp. 174-180)

Full Text: PDF

.

Yasir Safeer, Atika Mustafa and Anis Noor Ali

Department of Computer Science FAST – National University of Computer and Emerging Sciences Karachi, Pakistan

.

Abstract—With the advancement of technology and reduced storage costs, individuals and organizations are tending towards the use of electronic media for storing textual information and documents. It is time consuming for readers to retrieve relevant information from an unstructured document collection. It is easier and less time consuming to find documents in a large collection when the collection is ordered or classified by group or category. The problem of finding the best such grouping remains. This paper discusses our implementation of the k-Means clustering algorithm for clustering unstructured text documents, beginning with the representation of unstructured text and reaching the resulting set of clusters. Based on the analysis of the resulting clusters for a sample set of documents, we also propose a technique for representing documents that can further improve the clustering result.
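As a pocket-sized illustration of the pipeline the abstract describes (a bag-of-words representation followed by k-Means), the Python sketch below clusters four toy documents with scikit-learn; the corpus, the TF-IDF weighting and k = 2 are assumptions made only for this example, not the authors' implementation.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "network security intrusion detection",
    "intrusion detection for wireless networks",
    "image segmentation and texture features",
    "texture classification from image features",
]

# Bag-of-words (TF-IDF weighted) representation of the unstructured documents
X = TfidfVectorizer().fit_transform(docs)

# Partition the documents into k clusters
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)   # e.g. [0 0 1 1]: security documents vs. image documents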

.

Keywords — Information Extraction (IE); Clustering, k-Means Algorithm; Document Classification; Bag-of-words; Document Matching; Document Ranking; Text Mining

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

28. Paper 25041047: Controlling Wheelchair Using Electroencephalogram (pp. 181-187)

Full Text: PDF

.

Vijay Khare, Jaypee Institute of Information Technology, Dept. of Electronics and Communication Engineering, Noida, India.

Jayashree Santhosh, Indian Institute of Technology, Computer Services Centre, Delhi, India.

Sneh Anand, Indian Institute of Technology, Centre for Biomedical Engineering Centre, Delhi, India.

Manvir Bhatia, Sir Ganga Ram Hospital, Department of Sleep Medicine, New Delhi, India.

.

Abstract— This paper presents the development of a powered wheelchair controller based on the electroencephalogram (EEG). To achieve this goal, the wavelet packet transform (WPT) was used for feature extraction of the relevant frequency bands from EEG signals. A Radial Basis Function network was used to classify the predefined movements of the wheelchair: rest, forward, backward, left and right. Classification and evaluation results showed the feasibility of EEG as an input interface to control a mechanical device such as a powered wheelchair.

.

Keywords— Electroencephalogram (EEG), Wavelet Packet Transform (WPT), Radial Basis Function neural network (RBFNN), Brain computer interface (BCI), Rehabilitation, Wheelchair Controller.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

29. Paper 25041049: A New Biometrics based Key Exchange and Deniable Authentication Protocol (pp. 188-193)

Full Text: PDF

.

K. Saraswathi * Dr. R. Balasubramanian #

* Asst. Professor, Department of Computer Science, Govt Arts College, Udumalpet, Tirupur, India.

# Dean Academic Affairs, PPG Institute of Technology, Coimbatore, India.

.

Abstract - Wireless Local Area Networks (WLANs) are gaining recognition as they are fast, cost effective, flexible and easy to use. These networks face a series of issues and challenges in establishing security for the users of the network. With users accessing networks remotely, transmitting data by means of the Internet and carrying around laptops containing sensitive data, ensuring security is an increasingly multifarious challenge. Therefore it is necessary to ensure the security of the network users. In order to provide network security many techniques and systems have been proposed earlier in the literature. Most of these traditional methods make use of passwords, smart cards and so on to provide security to the network users. Though these traditional methods are effective in ensuring security, they possess some limitations too. The problem with these traditional approaches is the possibility of forgetting the password. Moreover, a compromised password means that an unauthorized user can gain access to the accounts of the valid user. This paper proposes an approach for network security using biometrics and a deniable authentication protocol. Human biometrics like hand geometry, face, fingerprint, retina, iris, DNA, signature and voice can be effectively used to ensure network security. The diverse phases included in the proposed approach are user registration, fingerprint enhancement, minutiae point extraction, mapping function and the deniable authentication protocol. Furthermore, biometric authentication systems can be more convenient for users since they involve no password that might be forgotten, or key that might be lost, and therefore a single biometric trait (e.g., fingerprint) can be used to access several accounts without the burden of remembering passwords. This paper also explains some fingerprint enhancement techniques to make the biometric template noise free. Experiments are conducted to evaluate the performance of the proposed approach.

.

Keywords - Biometrics, Cryptography, Data Security, Fingerprint, Mapping Function, Minutiae Point, Network Security, User Registration.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

30. Paper 25041050: A New Region based Group Key Management Protocol for MANETs (pp. 194-200)

Full Text: PDF

.

N. Vimala *, Dr. R. Balasubramanian #

* Senior Lecturer, Department of Computer Science, CMS College of Science and Commerce, Coimbatore, India.

# Dean Academic Affairs, PPG Institute of Technology, Coimbatore, India.

.

Abstract - Key management in ad hoc networks is a challenging issue concerning the security of group communication. Group key management protocols can be approximately classified into three categories: centralized, decentralized, and distributed. The most suitable solution to provide services like authentication, data integrity and data confidentiality is the establishment of a key management protocol. This paper proposes an approach for the design and analysis of region-based key management protocols for scalable and reconfigurable group key management in Mobile Ad Hoc Networks (MANETs). Most centralized key management protocols raise data security issues in group communication. The proposed region-based group key management protocol divides a group into region-based subgroups based on decentralized key management principles. This region-based group key management protocol deals with outsider attacks in MANETs to preserve the security properties. A performance model is developed to evaluate the network traffic cost generated for group key management in the proposed region-based protocol for MANETs. The cost of joining or leaving the group and the cost of group communication are considered in evaluating the performance of the proposed region-based group key management scheme.

.

Keywords- Cluster Head, Group Key, Key Management Protocol, Mobile Ad Hoc Networks (MANETs), Region-based, and Rekeying.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

31. Paper 26041053: Automated Rapid Prototyping of TUG Specifications Using Prolog for Implementing Atomic Read/ Write Shared Memory in Mobile Ad Hoc Networks (pp. 201-216)

Full Text: PDF

.

Fatma Omara # , Said El Zoghdy *, Reham Anwer *

# Information Systems and Computers Faculty - Cairo University-Egypt.

* Science Faculty – Menufiya University- Egypt.

.

Abstract - Rapid prototyping has been used for exploring vague user requirements in the front end of the software life cycle. Automated rapid prototyping may reduce the cost of prototyping and the time to develop a prototype. One automated rapid prototyping technique is the direct execution of a specification. Direct execution of a specification has the benefits of quick construction of the prototype, direct support for formal specification, and quick response to specification changes. However, existing formal specification languages still have difficulties in specifying aspects of software systems such as their non-functional behavior. For non-executable formal specification languages, a prototype may be derived from the specification via software transformations. This approach to rapid prototyping uses a formal specification language to automatically generate a prototype in Prolog via a set of software transformation rules. Because there is a direct correspondence between the language and Prolog, the transformation is mechanical and straightforward. Specifiers can concentrate on generating the prototype without the distraction of transforming one notation into another. The formal specification language may not provide enough abstraction for prototyping some particular features of systems; therefore, this approach is designed to allow the derived prototype to be extended or modified in a modular manner. The specification is written in modules in terms of language patterns that support module independence, and the prototype is then derived in a modular way that supports ease of modification. The software transformation rules used for the derivation of prototypes in Prolog are presented. In this paper, we apply this specification approach to the implementation of atomic read/write shared memory objects in mobile ad hoc networks.

.

Keywords: Rapid Prototyping, TUG language, Prolog, Mobile Ad Hoc Networks.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

32. Paper 27041055: PSS Design Based on RNN and the MFA\FEP Control Strategy (pp. 217-221)

Full Text: PDF

.

Rebiha Metidji and Boubekeur Mendil

Electronic Engineering Department, University of A. Mira, Targua Ouzemour, Bejaia, 06000, Algeria.

.

Abstract – The conventional design of PSS (power system stabilizers) was carried out using a linearized model around the nominal operating point of the plant, which is naturally nonlinear. This limits the PSS performance and robustness. In this paper, we propose a new design using RNN (recurrent neural networks) and the model free approach (MFA) based on the FEP (feed-forward error propagation) training algorithm [15]. The results show the effectiveness of the proposed approach. The system response is less oscillatory with a shorter transient time. The study was extended to faulty power plants.

.

Keywords - Power Network; Synchronous Generator; Neural Network; Power System Stabilizer; MFA/FEP control.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

33. Paper 27041057: An Efficient SJRR CPU Scheduling Algorithm (pp. 222-230)

Full Text: PDF

.

Saeeda Bibi, Farooque Azam, Sameera Amjad, Wasi Haider Butt, Hina Gull, Rashid Ahmed,

Department of Computer Engineering, College of Electrical and Mechanical Engineering, NUST, Rawalpindi, Pakistan

Yasir Chaudhry, Department of Computer Science, Maharishi University of Management, Fairfield, Iowa, USA

.

Abstract— CPU scheduling is a vital discipline which helps us gain deep insight into the complex set of policies and mechanisms used to govern the order in which tasks are executed by the processor. This article proposes an efficient Shortest Job Round Robin (SJRR) CPU scheduling algorithm having better average waiting time (AWT) and average turnaround time (ATT) compared to other CPU scheduling techniques. The primary objective of this algorithm is to optimize system performance according to the criteria deemed most important by system designers. Included in this work is a simulation that compares the proposed algorithm with some well-known approaches to CPU scheduling.
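To make the comparison metric concrete, the Python sketch below computes the average waiting time of a set of CPU bursts executed back to back, first in arrival (FCFS) order and then in shortest-job order; the burst times are a standard textbook example, not data from the paper, and the sketch does not implement the SJRR quantum logic itself.

def average_waiting_time(burst_times):
    # Average waiting time when jobs run back-to-back in the given order.
    waiting, elapsed = 0, 0
    for burst in burst_times:
        waiting += elapsed
        elapsed += burst
    return waiting / len(burst_times)

bursts = [24, 3, 3]                          # hypothetical CPU burst times
print(average_waiting_time(bursts))           # FCFS order          -> 17.0
print(average_waiting_time(sorted(bursts)))   # shortest-job order  ->  3.0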

.

Keywords- First Come First Serve Algorithm, Shortest Job First Algorithm, Round Robin Algorithm, Priority Algorithm, Average Waiting Time, Turnaround Time, Response Time, Throughput

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

34. Paper 28021071: Robust Resilient Two Server Password Authentication Vs Single Server (pp. 231-237)

Full Text: PDF

.

T. S. Thangavel, Dr. A. Krishnan

K. S. Rangasamy College of Technology, Tiruchengode, Tamilnadu, India

.

Abstract - When an authentication system stores passwords on a single central server, it is easy for an intruder who compromises that server to obtain the passwords and gain access to the users' content. Multi-server authentication systems proposed for this purpose require the user to communicate with one or all of the servers; they demand high communication bandwidth, are not easy to maintain, and their protocols are highly expensive. The two-server authentication system avoids these problems by relying on passwords and session keys rather than heavyweight cryptographic techniques. It consists of two servers: a front-end service server, which communicates with the user, and a back-end control server, which is visible only to the service server. These two servers are jointly responsible for authentication. The password is split into two parts, one held by the service server and the other by the control server, and both servers take part in the validation process. The system is suitable in terms of both computation and communication, serves multiple clients, and can also be applied to single-server systems.
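
The paper's exact protocol is not reproduced here; the minimal sketch below only illustrates the underlying idea of splitting a password into two shares so that neither server alone learns it. The XOR-based splitting is an assumed, illustrative choice.

    # Minimal sketch of the two-share idea only: the password is split so
    # that neither server alone learns it. XOR splitting is an assumption
    # for illustration; it is not the paper's actual protocol.
    import os

    def split_password(password: str):
        pwd = password.encode()
        share_control = os.urandom(len(pwd))                               # held by control server
        share_service = bytes(a ^ b for a, b in zip(pwd, share_control))   # held by service server
        return share_service, share_control

    def verify(candidate: str, share_service, share_control):
        recombined = bytes(a ^ b for a, b in zip(share_service, share_control))
        return recombined == candidate.encode()

    s1, s2 = split_password("s3cret")
    print(verify("s3cret", s1, s2))   # True only when both shares are combined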

.

Keywords: Password Authentication, Two Servers password, Cryptosystem, single server Secure Password, Service server, control server.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

35. Paper 28041059: Effective MSE optimization in fractal image compression (pp. 238-243)

Full Text: PDF

.

A. Muruganandham, Sona College of Technology, Salem-05, India.

Dr. R. S. D. Wahida banu, Govt Engineering College, Salem-11, India

.

Abstract- Fractal image compression encodes images at a low bit rate with acceptable image quality, but the time taken for encoding is large. In this paper we propose a fast fractal encoder using particle swarm optimization (PSO). The optimization technique is used to minimize the MSE between range blocks and domain blocks. The PSO technique speeds up the fractal encoder while preserving image quality.
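
A heavily simplified sketch of the idea, assuming an invented 64x64 image, 16x16 domain blocks down-sampled to match an 8x8 range block, and standard PSO constants: a small swarm searches for the domain-block position that minimizes the MSE against the range block. The contrast/brightness transform and symmetry operations of a full fractal encoder are omitted.

    # Simplified PSO search for a domain block whose down-sampled version best
    # matches a range block (MSE objective). Sizes and PSO constants are
    # illustrative assumptions, not the paper's settings.
    import numpy as np

    rng = np.random.default_rng(0)
    image = rng.random((64, 64))
    range_block = image[8:16, 8:16]                               # 8x8 range block

    def mse(x, y):
        x = min(max(int(round(x)), 0), 64 - 16)
        y = min(max(int(round(y)), 0), 64 - 16)
        domain = image[x:x + 16, y:y + 16]                        # 16x16 domain block
        shrunk = domain.reshape(8, 2, 8, 2).mean(axis=(1, 3))     # down-sample to 8x8
        return np.mean((shrunk - range_block) ** 2)

    n, w, c1, c2 = 20, 0.7, 1.5, 1.5
    pos = rng.uniform(0, 48, (n, 2)); vel = np.zeros((n, 2))
    pbest = pos.copy(); pbest_val = np.array([mse(*p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(50):
        r1, r2 = rng.random((n, 2)), rng.random((n, 2))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, 48)
        vals = np.array([mse(*p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    print(gbest, pbest_val.min())    # best domain-block position and its MSE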

.

Keywords- mean square error (MSE), particle swarm optimization (PSO), fractal image compression (FIC), Iteration Function System (IFS)

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

36. Paper 28041061: Embedding Expert Knowledge to Hybrid Bio-Inspired Techniques- An Adaptive Strategy Towards Focussed Land Cover Feature Extraction (pp. 244-253)

Full Text: PDF

.

Lavika Goel, M.E. (Masters) Student, Delhi College of Engineering, New Delhi, India

Dr. V. K. Panchal, Additional Director & Scientist 'G', Defence Terrain & Research Lab (DTRL), DRDO, Delhi

Dr. Daya Gupta, Head of Department, Computer Engineering Department, Delhi College of Engineering, Delhi

.

Abstract --- The findings of recent studies provide strong evidence that some aspects of biogeography can be adaptively applied to solve specific problems in science and engineering. This paper presents a hybrid biologically inspired technique, the ACO2/PSO/BBO (Ant Colony Optimization 2 / Particle Swarm Optimization / Biogeography Based Optimization) technique, that can be adapted according to a database of expert knowledge for a more focussed satellite image classification. The hybrid classifier exploits the adaptive nature of the Biogeography Based Optimization technique and is therefore flexible enough to classify a particular land cover feature more efficiently than others from the 7-band image data, and hence can be adapted to the application. The paper also presents a comparative study of the proposed classifier against other recent soft computing classifiers such as ACO, hybrid Particle Swarm Optimization – cAntMiner (PSO-ACO2), the hybrid ACO-BBO classifier, fuzzy sets, rough-fuzzy tie-up and semantic web based classifiers, as well as traditional probabilistic classifiers such as the Minimum Distance to Mean Classifier (MDMC) and the Maximum Likelihood Classifier (MLC). The proposed algorithm has been applied to a 7-band cartoset satellite image of size 472 x 576 of the Alwar area in Rajasthan, since it contains a variety of land cover features. The algorithm has been verified on water pixels, for which it shows the maximum achievable efficiency, i.e. 100%. The accuracy of the results has been checked by obtaining the error matrix and the KHAT statistic. The results show that highly accurate land cover features can be extracted effectively when the proposed algorithm is applied to the 7-band image, with an overall Kappa coefficient of 0.982.
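
The KHAT (Kappa) accuracy figure quoted above can be computed from an error (confusion) matrix as in the sketch below; the 3x3 matrix is made up purely to show the arithmetic.

    # Cohen's kappa (KHAT) from an error matrix: (observed agreement -
    # chance agreement) / (1 - chance agreement). The matrix is invented.
    import numpy as np

    def kappa(error_matrix):
        m = np.asarray(error_matrix, dtype=float)
        total = m.sum()
        observed = np.trace(m) / total                                  # overall agreement
        expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / total ** 2   # chance agreement
        return (observed - expected) / (1 - expected)

    print(kappa([[50, 2, 1], [3, 40, 2], [0, 1, 45]]))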

Keywords - Biogeography based Optimization, Rough Set Theory, Remote Sensing, Feature Extraction, Particle Swarm Optimization, Ant Colony Optimization, Flexible Classifier, Kappa Coefficient.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

37. Paper 29041063: On Multi-Classifier Systems for Network Anomaly Detection and Features Selection (pp. 254-263)

Full Text: PDF

.

Munif M. Jazzer, Faculty of ITC, Arab Open University-Kuwait, Kuwait.

Mahmoud Jazzar, Dept. of Computer Science, Birzeit University, Birzeit, Palestine

Aman Jantan, School of Computer Sciences, University of Science Malaysia, Pulau Pinang, Malaysia

.

Abstract—Due to irrelevant patterns and noise in network data, most network intrusion detection sensors suffer from the false alerts they produce. This condition gets worse when intrusion detection measures are deployed in real-time environments. In addition, most existing IDS sensors consider all network packet features; using all packet features for network intrusion detection results in lengthy and contaminated detection. In this research we highlight the necessity of using important features in various anomaly detection cases. The paper presents a new multi-classifier system for intrusion detection. The basic idea is to quantify the causal inference relation to attack and attack-free data in order to determine attack detection and the severity of odd packets. Initially, we refine the data patterns and attributes to classify the training data, and then we use SOM clustering and fuzzy cognitive map diagnosis to replicate attacks and normal network connections. Experimental results show that the classifiers give a better representation of normal and attack connections using significant features.

.

Keywords- Anomaly Detection; SOM; FCM; Security

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

38. Paper 29041066: AccSearch: A Specialized Search Engine for Traffic Analysis (pp. 264-271)

Full Text: PDF

.

K. Renganathan, Computer Science and Engineering Department, SRM University, India

B. Amutha, Computer Science and Engineering Department, SRM University, India

.

Abstract— AccSearch is a specialized web search engine that provides information about road accidents within Chennai, India, and assists traffic authorities, police, NGOs, lawyers, students and statistical bureaus. People who need road accident information for various reasons struggle to collect correct information through a single search. Special-purpose search engines are designed to work on a particular domain and fill the gap where an all-purpose search engine falls short. Since existing search engines cannot handle traffic search alone well for several reasons, we have designed a search algorithm using a Markov chain to provide search results faster. The mathematical proof of our modified Markov chain algorithm shows that its speed and efficiency are better than those of existing search algorithms. Because Markov chains can be used for prediction and our search engine concentrates on the single domain of traffic analysis, it returns exact responses to user queries and leads to a greater amount of user satisfaction.
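
The paper's modified Markov chain algorithm is not given in the abstract; the sketch below only shows the generic building block such an approach rests on, ranking the states of a Markov chain by their stationary distribution obtained through power iteration, with an invented transition matrix.

    # Ranking states of a Markov chain by its stationary distribution.
    # The small row-stochastic transition matrix is invented for illustration.
    import numpy as np

    P = np.array([[0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2],
                  [0.5, 0.3, 0.2]])

    pi = np.full(3, 1 / 3)
    for _ in range(100):          # power iteration: pi <- pi P
        pi = pi @ P
    print(pi / pi.sum())          # stationary probabilities used as ranking scores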

.

Keywords: AccSearch; road traffic; accident; Markov chain; accident prediction

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

39. Paper 30031085: A Study of Voice over Internet Protocol (pp. 272-278)

Full Text: PDF

.

Mohsen Gerami, The Faculty of Applied Science of Post and Communications, Danesh Blv, Jenah Ave, Azadi Sqr, Tehran, Iran.

.

Abstract—Voice over Internet Protocol (VoIP) is an application that enables data packet networks to transport real-time voice traffic. VoIP uses the Internet as the transmission network. This paper describes VoIP and its requirements, and further discusses various VoIP protocols, security, and the VoIP market.

.

Keywords: VOIP; H.323; SIP; Security; Market;

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

40. Paper 30041067: Performance Issues of Health Care System using SQL server (pp. 279-284)

Full Text: PDF

.

Narendra Kohli, Electrical Engineering Department, Indian Institute of Technology, Kanpur, India

Nishchal K. Verma, Electrical Engineering Department, Indian Institute of Technology, Kanpur, India

.

Abstract— In this paper, a smart-card-based online health care system and its performance issues using SQL Server are proposed. To provide good quality of treatment, all the hospitals of the country need to be integrated via the Internet. A smart card carrying a 10-digit unique registration number and some personal information is issued to the patient. After registering at any hospital in the hospital network, the patient needs only the smart card to go for a check-up. All patient information, i.e. personal details, doctor prescriptions, test reports, etc., is stored in the database of the hospital's local server and uploaded from time to time to the centralized server. On the basis of the unique registration number, all patient information can be retrieved from the database of the centralized server. The smart-card-based online health care application has been designed with a .NET front end and a SQL Server back end. The block size (page size) used during database creation plays a very important role in performance tuning, so it is important to decide on the proper block size before database design: the block size cannot be changed once the database has been created, and recreating the database is a very costly affair.

.

Keywords- hospital, patient, smart card, SQL server 2005

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

41. Paper 30041068: Color Steel Plates Defect Detection Using Wavelet And Color Analysis (pp. 285-292)

Full Text: PDF

.

Ebrahim Abouei Mehrizi, Department of Electronic Engineering, Islamic Azad University, najafabad branch, Isfahan, 81746, Iran

Amirhassan Monadjemi, Department of Computer Engineering, University of Isfahan, Isfahan, 81746, Iran

Mohsen Ashorian, Department of Electronic Engineering, Islamic Azad University, shahremajlesi branch, Isfahan, 81746, Iran

.

Abstract— In this study, having reviewed automatic surface inspection and its benefits compared with manual inspection, we explain the wavelet transformation method with an emphasis on it. There are various methods for image segmentation; in this paper we use the wavelet transform to segment colored steel plates into normal and defective areas. Each image is converted to the RGB, HSL and LAB color spaces. Then, for a given color space, the discrete wavelet transform is applied to the three color channels of the image and detail images at various levels are obtained. Owing to the visible differences between normal and defective areas, defective areas are expected to show clear borders with normal areas in some of the detail images, so that clustering the image into defective and normal areas becomes possible. Finally, the results obtained from the different color channels are compared. The tests have been carried out on a set of images of normal and defective steel surfaces, demonstrating the quality of the wavelet method.
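
A minimal sketch of the per-channel wavelet step described above, assuming PyWavelets and a synthetic "plate" with a bright scratch: a 2-D DWT is applied to each color channel and the detail sub-bands are inspected for a defect response.

    # Per-channel 2-D DWT: defects show up as strong responses in the
    # horizontal/vertical/diagonal detail sub-bands. The synthetic plate
    # with a scratch is an assumption for illustration.
    import numpy as np
    import pywt

    plate = np.full((64, 64, 3), 0.5)
    plate[30:32, 10:50, :] = 1.0                        # simulated defect (a scratch)

    for c, name in enumerate(("R", "G", "B")):          # one DWT per color channel
        cA, (cH, cV, cD) = pywt.dwt2(plate[:, :, c], "haar")
        response = np.abs(cH).max() + np.abs(cV).max() + np.abs(cD).max()
        print(name, "max detail response:", round(response, 3))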

.

Keywords- Color; Image segmentation; Metals industry; Defect detection; Wavelet Transform; Steel Surfaces Inspection; color spaces;

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

42. Paper 30041069: Clustering in Mobile Ad hoc Networks: A Review (pp. 293-301)

Full Text: PDF

.

Meenu Chawla, Department of CSE, MANIT, Bhopal, India

Jyoti Singhai, Department of ECE, MANIT, Bhopal, India

J L Rana, Department of CSE, MANIT, Bhopal, India

.

Abstract—Mobile Ad-hoc Networks (MANETs) are future wireless networks consisting entirely of mobile nodes that communicate on the move without base stations. Nodes in these networks generate user and application traffic and carry out network control and routing functions. Dynamic and random topologies lead to rapidly changing connectivity and network partitions. This dynamic nature, together with bandwidth and power constraints, poses new problems in network scalability and network control, especially in the design of higher-level protocols such as routing, and in implementing applications with Quality of Service requirements. Hierarchical routing provides a means to tackle these problems in large-scale networks. Clustering is the process of building hierarchies among nodes in the network; in this approach an ad hoc network is partitioned into groups of nodes called clusters. This paper presents a review of the different clustering algorithms and the criteria on which each of them bases its clustering decisions.
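
As a concrete example of the kind of clustering criterion such reviews cover, the sketch below runs the classic lowest-ID clusterhead election on an invented toy topology; it is not an algorithm proposed in this paper.

    # Lowest-ID clusterhead election: the undecided node with the lowest ID
    # becomes a clusterhead and its undecided neighbours join it. The toy
    # adjacency list is invented.
    neighbours = {1: [2, 5], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [1, 4]}

    undecided = set(neighbours)
    clusterheads, members = set(), {}
    while undecided:
        node = min(undecided)                      # lowest remaining ID wins
        clusterheads.add(node)
        joined = {n for n in neighbours[node] if n in undecided}
        for n in joined:
            members[n] = node                      # neighbours join this clusterhead
        undecided -= joined | {node}
    print(clusterheads, members)                   # {1, 3} {2: 1, 5: 1, 4: 3}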

.

Keywords- Mobile Ad-hoc networks; clustering; clusterhead selection.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

43. Paper 30041071: Survey of Nearest Neighbor Techniques (pp. 302-305)

Full Text: PDF

.

Nitin Bhatia, Department of Computer Science, DAV College, Jalandhar, India

Vandana, SSCS, Deputy Commissioner’s Office, Jalandhar, India

.

Abstract— The nearest neighbor (NN) technique is very simple, highly efficient and effective in fields such as pattern recognition, text categorization and object recognition. Its simplicity is its main advantage, but its disadvantages cannot be ignored either: the memory requirement and computational complexity also matter. Many techniques have been developed to overcome these limitations. NN techniques are broadly classified into structure-less and structure-based techniques. In this paper, we present a survey of such techniques. Weighted kNN, model-based kNN, condensed NN, reduced NN and generalized NN are structure-less techniques, whereas the k-d tree, ball tree, principal axis tree, nearest feature line, tunable NN and orthogonal search tree are structure-based algorithms developed on the basis of kNN. The structure-less methods overcome the memory limitation, and the structure-based techniques reduce the computational complexity.
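
A minimal sketch of one of the structure-less variants listed above, distance-weighted kNN, on invented 2-D toy data: neighbours vote with weight 1/distance.

    # Distance-weighted kNN: the k nearest training points vote for their
    # class with weight inversely proportional to their distance.
    import numpy as np
    from collections import defaultdict

    def weighted_knn(train_X, train_y, query, k=3):
        d = np.linalg.norm(train_X - query, axis=1)
        idx = np.argsort(d)[:k]
        votes = defaultdict(float)
        for i in idx:
            votes[train_y[i]] += 1.0 / (d[i] + 1e-9)     # closer points weigh more
        return max(votes, key=votes.get)

    X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
    y = ["a", "a", "a", "b", "b", "b"]
    print(weighted_knn(X, y, np.array([4.5, 5.0])))      # -> "b"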

.

Keywords- Nearest neighbor (NN), kNN, Model based kNN, Weighted kNN, Condensed NN, Reduced NN.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

44. Paper 30041081: Time Domain Analysis Based Fault Diagnosis Methodology for Analog Circuits - A Comparative Study of Fuzzy and Neural Classifier Performance (pp. 306-313)

Full Text: PDF

.

V. Prasannamoorthy 1, R. Bharat Ram 2, V. Manikandan 3, N. Devarajan 4

1, 2, 4 Department of Electrical Engineering, Government College of Technology Coimbatore, India

3 Department of Electrical Engineering, Coimbatore Institute of Technology Coimbatore, India

.

Abstract — In this paper, we attempt to diagnose the occurrence of faults in analog electronic circuits based upon variations in the time domain specifications of the circuit condition under consideration relative to the fault-free circuit. To achieve this, both a fuzzy and a neural classifier have been utilized, operating on the fault dictionary data. Through this process, a general comparison is drawn between the performance of the two routes in dealing with fault diagnosis of circuits. An illustrative example is considered, on which both the fuzzy and neural algorithms are tested, and their performance in fault diagnosis is compared. Further, the suitability of the fuzzy and neural techniques to various kinds of diagnosis problems, depending upon the nature of the data available, is also discussed.

Keywords — Fault diagnosis, fuzzy logic system, neural networks, Sallen-key Bandpass filter.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

45. Paper 30041082: Evaluation of English-Telugu and English-Tamil Cross Language Information Retrieval System using Dictionary Based Query Translation Method (pp. 314-319)

Full Text: PDF

.

P. Sujatha , Department of Computer Science, Pondicherry Central University, Pondicherry-605014, India

P. Dhavachelvan, Department of Computer Science, Pondicherry Central University, Pondicherry-605014, India

V. Narasimhulu, Department of Computer Science, Pondicherry Central University, Pondicherry-605014, India

.

Abstract—A Cross Lingual Information Retrieval (CLIR) system helps users pose a query in one language and retrieve documents in another language. We developed a CLIR system in the computer science domain to retrieve documents in Telugu and Tamil for a given English query. We opted for the method of translating queries for the English-Tamil and English-Telugu language pairs using bilingual dictionaries. Transliteration is also performed for the named entities present in the query. Finally, the translation and transliteration results are combined, and the resultant query is passed to the search module to retrieve target language documents. For Telugu we achieve a Mean Average Precision (MAP) of 0.3835 and for Tamil a MAP of 0.3665.
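
The Mean Average Precision figures quoted above follow the standard definition sketched below; the per-query relevance lists are invented only to show the computation, with average precision taken over each ranked result list (1 = relevant) and then averaged across queries.

    # Mean Average Precision: average precision per ranked result list,
    # then the mean over all queries. Relevance lists are invented.
    def average_precision(relevance):
        hits, precisions = 0, []
        for rank, rel in enumerate(relevance, start=1):
            if rel:
                hits += 1
                precisions.append(hits / rank)
        return sum(precisions) / max(hits, 1)

    queries = [[1, 0, 1, 0, 0], [0, 1, 1, 1, 0]]
    print(sum(average_precision(q) for q in queries) / len(queries))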

.

Keywords - Cross Lingual Information Retrieval; Translation; Transliteration; Ranking.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

46. Paper 30041086: A Novel Approach for Hand Analysis Using Image Processing Techniques (pp. 320-323)

Full Text: PDF

.

Vishwaratana Nigam, Divakar Yadav, Manish K Thakur

Department of Computer Science & Engineering and Information Technology, Jaypee Institute of Information Technology, Noida, India

.

Abstract- Palmistry, also known as palm reading or chirology, is the art of characterization and foretelling the future through the study of the palm. With the help of the palm lines and fingers one can determine a person's characteristics as well as foretell his or her future, but this field is still not very developed technically and hands have to be analyzed in person. In this paper we propose a ratio-based system to characterize persons on the basis of their palm width and length and their finger length. We apply image processing techniques to generate and analyze the results.

.

Keywords- Palmistry, Palm-width (Pw), Palm-length (Pl), Finger-length (Fl), Jupiter ruled, Saturn ruled, Sun ruled, Mercury ruled.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

47. Paper 30041087: Applying l-Diversity in anonymizing collaborative social network (pp. 324-329)

Full Text: PDF

.

G. K. Panda, Department of CSE & IT, MITS, Sriram Vihar, Rayagada, India

A. Mitra, Department of CSE & IT, MITS, Sriram Vihar, Rayagada, India

Ajay Prasad, Department of CSE, Sir Padampat Singhania University, Udaipur, India

Arjun Singh, Department of CSE, Sir Padampat Singhania University, Udaipur, India

Deepak Gour, Department of CSE, Sir Padampat Singhania University, Udaipur, India

.

Abstract — Publishing a giant social network jointly contributed by different parties has become an easy collaborative approach. Agencies and researchers who collect such social network data often have a compelling interest in allowing others to analyze the data. In many cases the data describes relationships that are private, and sharing the data in full can result in unacceptable disclosures. Thus, preserving privacy without revealing sensitive information in the social network is a serious concern. Recent developments in privacy preservation using anonymization techniques have focused on relational data only. Preserving privacy in social networks against neighborhood attacks is an initial effort which uses the definition of privacy called k-anonymity; a k-anonymous social network may still leak privacy under homogeneity and background knowledge attacks. To overcome this, we employ a practical and efficient definition of privacy called l-diversity. In this paper, we take a step further on preserving privacy in collaborative social network data with algorithms, and analyze the effect on the utility of the data for social network analysis.
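
As a rough illustration of the privacy notion used above, the sketch below performs a distinct l-diversity check: every anonymized equivalence class must contain at least l distinct sensitive values. The tiny table of (group, sensitive value) pairs is invented; the paper's own algorithms for collaborative social network data are not reproduced.

    # Distinct l-diversity check over anonymized equivalence classes.
    from collections import defaultdict

    def is_l_diverse(records, l=2):
        groups = defaultdict(set)
        for group_id, sensitive in records:
            groups[group_id].add(sensitive)
        return all(len(values) >= l for values in groups.values())

    records = [("g1", "flu"), ("g1", "cancer"), ("g2", "flu"), ("g2", "flu")]
    print(is_l_diverse(records, l=2))    # False: group g2 is homogeneous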

.

Keywords- bottom R-equal, top R-equal, R-equal, bottom R-equivalent, top R-equivalent and R-equivalent, l-diversity

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

48. Paper 30041072: 3D-Mesh denoising using an improved vertex based anisotropic diffusion (pp. 330-337)

Full Text: PDF

.

Mohammed El Hassouni, DESTEC, FLSHR, University of Mohammed V-Agdal- Rabat, Morocco

Driss Aboutajdine, LRIT, UA CNRST, FSR, University of Mohammed V-Agdal-Rabat, Morocco

.

Abstract — This paper deals with an improvement of vertex-based nonlinear diffusion for mesh denoising. The method directly filters the positions of the vertices, using Laplace, reduced centered Gaussian and Rayleigh probability density functions as diffusivities. The use of these PDFs improves the performance of the vertex-based diffusion method by adapting it to the underlying mesh structure. We also compare the proposed method with other mesh denoising methods such as Laplacian flow, mean, median, min and adaptive MMSE filtering. To evaluate these filtering methods, we use two error metrics, the first based on the vertices and the second based on the normals. Experimental results demonstrate the effectiveness of our proposed method in comparison with the existing methods.
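
A toy sketch of a single vertex-based diffusion step, assuming an invented four-vertex mesh and a Gaussian diffusivity only: each vertex moves towards the average of its 1-ring neighbours, scaled by a diffusivity of the displacement norm. The paper's Laplace and Rayleigh diffusivities and its error metrics are not reproduced.

    # One vertex-based diffusion step with a Gaussian diffusivity.
    # Mesh, neighbour lists, sigma and step size are invented for illustration.
    import numpy as np

    vertices = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.1],
                         [0.0, 1.0, -0.1], [1.0, 1.0, 0.0]])
    neighbours = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
    sigma, step = 0.5, 0.5

    denoised = vertices.copy()
    for v, nbrs in neighbours.items():
        lap = vertices[nbrs].mean(axis=0) - vertices[v]        # umbrella operator
        g = np.exp(-np.dot(lap, lap) / (2 * sigma ** 2))       # Gaussian diffusivity
        denoised[v] = vertices[v] + step * g * lap
    print(denoised)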

.

Keywords- Mesh denoising, diffusion, vertex.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

49. Paper 30041083: A New Approach for Security Risk Assessment Caused by Vulnerabilities of System by Considering the Dependencies (pp. 338-346)

Full Text: PDF

.

Mohammad Taromi, Performance and Dependability Eng. Lab., School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran

Mohammad Abdollahi Azgomi, Performance and Dependability Eng. Lab., School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran

.

Abstract — Risk estimation is a necessary step in risk management: it measures the impact implied by the probability of exploiting the vulnerabilities recognized in a system. At present, qualitative metrics are used for this purpose, and they are believed to suffer from subjectivity. The risk caused by a recognized vulnerability is computed using the values of the Common Vulnerability Scoring System (CVSS) attributes, but a great challenge in this field is that the dependencies between the vulnerabilities recognized in the system are not taken into account. In this paper, a new approach to assessing the risks caused by system vulnerabilities is proposed which considers the dependencies among vulnerabilities. The approach consists of three steps. In the first step, after identifying the vulnerabilities and the configuration of the system, an attack graph is generated for all the critical resources of the system using the MulVAL framework; from these attack graphs, the dependencies among vulnerabilities are extracted. In the second step, a Markov model is generated using the extracted dependencies and the impact and exploitability estimates defined from the CVSS attributes of each vulnerability. In the third step, the Markov model is used to estimate the quantitative security risk as the attacker progresses through the system. We introduce the proposed approach, a case study demonstrating the above steps, and the results of the quantitative security risk estimation.
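
Under the strong simplifying assumption that the attack graph reduces to a single chain of vulnerabilities, each exploited independently with probability CVSS exploitability / 10, the quantitative risk of the attacker reaching the critical resource can be sketched as below; the scores are invented and the paper's actual Markov model is richer than this.

    # Toy step-by-step compromise probability along a single attack chain.
    # Exploitability scores are invented; per-step success probability is
    # assumed to be score / 10, which is a simplification for illustration.
    import numpy as np

    exploitability = [8.6, 5.5, 3.9]                   # per-vulnerability CVSS scores
    p_step = np.array(exploitability) / 10.0           # assumed success probability per step

    risk_of_reaching_goal = np.cumprod(p_step)         # attacker progress, step by step
    for depth, risk in enumerate(risk_of_reaching_goal, start=1):
        print(f"probability of compromise after step {depth}: {risk:.3f}")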

.

Keywords-Security Risk Assessment; Vulnerability; Attack Graph

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

50. Paper 28041060: Image Super Resolution Using Marginal Distribution Prior (pp. 347-351)

Full Text: PDF

.

S. Ravishankar, Department of Electronics and Communication, Amrita Vishwa Vidyapeetham University, Bangalore, India

Dr. K.V.V. Murthy, Department of Electronics and Communication, Amrita Vishwa Vidyapeetham University, Bangalore, India

.

Abstract— In this paper, we propose a new technique for image super-resolution. Given a single low resolution (LR) observation and a database consisting of low resolution images and their high resolution versions, we obtain super-resolution for the LR observation using a regularization framework. First we obtain a close approximation of the super-resolved image using a learning-based technique, learning the high frequency details of the observation using the Discrete Cosine Transform (DCT). The LR observation is represented using a linear model. We model the texture of the HR image using a marginal distribution and use it as prior information to preserve the texture. We extract the texture features of the image by computing histograms of the filtered images obtained by applying the filters in a filter bank, and match them to those of the close approximation. We arrive at a cost function consisting of a data fitting term and a prior term and optimize it using Particle Swarm Optimization (PSO). We show the efficacy of the proposed method by comparing the results with interpolation methods and existing super-resolution techniques. The advantage of the proposed method is that it converges quickly to the final solution and does not require a number of low resolution observations.

Keywords-

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

51. Paper 10041012: A Survey on WiMAX (pp. 352-357)

Full Text: PDF

Mohsen Gerami, The Faculty of Applied Science of Post and Communications, Danesh Blv, Jenah Ave, Azadi Sqr, Tehran, Iran.

Abstract— This paper presents an overview of WiMAX. The paper outlines the fundamental architectural components of WiMAX and explains WiMAX security issues. Furthermore, the various 802.16 standards, the IEEE 802.16 protocol architecture and the WiMAX market are discussed.

Keywords: WiMAX; IEEE 802.16; Security; Protocol; Market;

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

52. Paper : Critical Success factors for Enterprise Resource Planning implementation in Indian Retail Industry: An Exploratory study (pp. 358-363)

Full Text: PDF

.

Poonam Garg, Professor, Information Technology and Management Dept., Institute of Management Technology, Ghaziabad-India

.

Abstract— Enterprise Resource Planning (ERP) has become a key business driver in today's world, and retailers are also trying to reap its benefits. In most of the large Indian retail industry, ERP systems have replaced non-integrated information systems with integrated and maintainable software. A retail ERP solution integrates demand and supply effectively to help improve the bottom line. The implementation of ERP systems in such firms is a difficult task; so far, ERP implementations have yielded more failures than successes, and very few implementation failures are recorded in the literature because few companies wish to publicize them. This paper explores and empirically validates the existing literature to find the critical success factors that lead to the success of ERP in the context of the Indian retail industry. The findings provide valuable insights for researchers and practitioners interested in implementing Enterprise Resource Planning systems in the retail industry: how best to utilize limited resources and to pay adequate attention to those factors that are most likely to have an impact on the implementation of the ERP system.

.

Keywords: Enterprise Resource Planning, Retail, CSF

.