International Journal of Computer Science and Information Security (IJCSIS)

Vol. 8 No. 4, July 2010

Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

1. Paper 28061064: Expert-Aware Approach: An Innovative Approach To Improve Network Data Visualization (pp. 1-7)

Full Text: PDF

Doris Hooi-Ten Wong, National Advanced IPv6 Centre (NAv6), Universiti Sains Malaysia, 11800, Penang, MALAYSIA

Kok-Soon Chai, National Advanced IPv6 Centre (NAv6), Universiti Sains Malaysia, 11800, Penang, MALAYSIA

Sureswaran Ramadass, National Advanced IPv6 Centre (NAv6), Universiti Sains Malaysia, 11800, Penang, MALAYSIA

Nicolas Vavasseur, Université de Franche Comté, 16 route de Gray, 25030 Besançon cedex, FRANCE


Abstract—Computer users are increasingly affected by network anomalies. Network data visualization tools can greatly help users perceive and avoid these anomalies, yet many such tools are designed only for users with advanced network knowledge, even though the tools are needed by a diverse range of computer users. We propose an expert-aware approach to designing a system that works with large amounts of network data and adapts to diverse computer users. In the preliminary phase, we construct an intelligent expertise classification algorithm that provides a default setting for the expert-aware network data visualization tool. The tool also learns from continual user feedback in order to statistically satisfy the needs of the majority of its users. In this paper, we focus on the expert-aware approach, which assesses a user's expertise level in network security and adapts the visualization views best suited to that user. Initial results from our implementation show that the approach is capable of representing a variety of network security data, not only from small networks but also complicated high-dimensional network data. Our main focus in this paper is to fulfill the different requirements of diverse computer users.


Keywords- network data visualization tool, network knowledge, expert-aware approach, network security.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

2. Paper 11061012: Load Balancing in Distributed Computer Systems (pp. 8-13)

Full Text: PDF

Ali M. Alakeel, College of Computing and Information Technology, Tabuk University, Tabuk, Saudi Arabia

Abstract—Load balancing in distributed computer systems is the process of redistributing the work load among processors in the system to improve system performance. Trying to accomplish this, however, is not an easy task. In recent research and literature, various approaches have been proposed to achieve this goal. Rather than promoting a specific load balancing policy, this paper presents and discusses different orientations toward solving the problem.


Keywords-distributed systems; load balancing; algorithms; performance evaluation

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

3. Paper 14061013: Intrusion Detection using Multi-Stage Neural Network (pp. 14-20)

Full Text: PDF

Sahar Selim, Mohamed Hashem and Taymoor M. Nazmy

Faculty of Computer and Information Science, Ain Shams University, Cairo, Egypt

Abstract— Security has become a crucial issue for computer systems. New security failures are discovered every day, and a growing number of bad-intentioned people try to take advantage of them. Intrusion detection is a critical process in network security. Intrusion Detection Systems (IDS) aim at protecting networks and computers from malicious network-based or host-based attacks. This paper presents a neural network approach to intrusion detection. We compare our proposed multi-stage neural network to a single-stage neural network for intrusion detection, both built from single-layer perceptrons. The advantage of the proposed multi-stage system is not only accuracy but also parallelism, since every network can be trained on a separate computer, reducing training time. The multi-stage design also gives the system scalability: if new attacks of a specific class are added, only the branch (the neural networks) affected by the new attack has to be retrained, not all the networks. The results show that the designed multi-stage network classifies records with 99.71% accuracy, compared with 98.67% for the single-stage network.
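Where the abstract describes routing a record through a category-level network and then a per-category network, the idea can be sketched as follows. This is a hypothetical illustration (the class name TwoStageIDS and the use of scikit-learn's Perceptron are assumptions, not the authors' code); it shows why only one branch needs retraining when a new attack of an existing class appears.

```python
# Illustrative sketch of the staged idea; NSL-KDD preprocessing and the
# authors' exact topology are not reproduced here.
import numpy as np
from sklearn.linear_model import Perceptron

class TwoStageIDS:
    def __init__(self):
        self.stage1 = Perceptron()   # record -> attack category (or 'normal')
        self.stage2 = {}             # category -> specific-attack classifier

    def fit(self, X, category, attack):
        self.stage1.fit(X, category)
        for c in np.unique(category):
            m = category == c
            labels = np.unique(attack[m])
            # each branch below could be trained on a separate machine;
            # a branch with a single attack label needs no classifier
            self.stage2[c] = (labels[0] if len(labels) == 1
                              else Perceptron().fit(X[m], attack[m]))

    def predict(self, X):
        out = []
        for x, c in zip(X, self.stage1.predict(X)):
            branch = self.stage2[c]
            out.append(branch.predict(x.reshape(1, -1))[0]
                       if hasattr(branch, "predict") else branch)
        return np.array(out)
```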

Keywords-component; network intrusion detection; neural network; NSL-KDD dataset

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

4. Paper 28061063: Improvement of the Performance of Advanced Local Area Optical Communication Networks by Reducing the Effects of Propagation Problems (pp. 21-31)

Full Text: PDF

Mahmoud I. Abd-Alla and Fatma M. Aref M. Houssien

Electronics & Communication Department, Faculty of Engineering, Zagazig University

Abstract - In the present paper, the improvement of the transmission distances and bit rates of Advanced Local Area Optical Communication Networks (ALAOCN) is investigated by reducing the effects of propagation problems over a wide range of the affecting parameters. Dispersion characteristics in high-speed optical transmission systems are analyzed in depth over a span of optical wavelengths from 1.2 μm up to 1.65 μm. Two different fiber structures for dispersion management are investigated, and two types of link fabrication material are suggested: single-mode fiber made of germania-doped silica, and plastic fiber. Two successive segments of germania-doped silica single-mode fiber are suggested to be employed periodically in long-haul systems, the two segments having either i) different chemical structures (x), or ii) different relative refractive index differences (Δn). The total spectral losses of both fabrication materials, and the total insertion losses of their connectors and splices, are evaluated under thermal effects in order to determine both the transmission lengths and the bit rates per channel for multi-link cables over a wide range of the affecting parameters. Using soliton and maximum time division multiplexing (MTDM) transmission techniques, both the transmission bit rate and the capacity-distance product per channel are estimated for the silica-doped and plastic fabrication materials. The bit rates are studied with thermal sensitivity effects, and the loss and dispersion sensitivities of the refractive index of the fabrication core materials are taken into account to show their effects on the performance of optical fiber cable links. Dispersion characteristics and dispersion management are studied in depth for the two types of optical fiber cable core materials. A chromatic dispersion management technique in single-mode optical fiber is introduced which is suitable for ALAOCN and facilitates the design of the highest and best transmission bit-rate performance in optical networks.

Keywords: Propagation problems, Single mode fiber (SMF), Fiber losses, Dispersion types, Dispersion management, Soliton Bit rate thermal sensitivity, optical link design, Thermal effects, Advanced-optical networks.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

5. Paper 29061070: Classifying Maintenance Request in Bug Tracking System (pp. 32-38)

Full Text: PDF

Naghmeh Mahmoodian, Rusli Abdullah, Masrah Azrifah Azim Murad

Universiti Putra Malaysia, Faculty of Computer Science and Information Technology, 43400 UPM Serdang, Selangor, Malaysia

Abstract—Software maintenance (SM), the complex process of modifying software, is a costly and time-consuming task. Maintenance requests (MRs) can be classified by maintenance type (MT) as corrective, adaptive, perfective or preventive. The maintenance type is important in keeping up the quality factors of the system; however, classifying MTs is inherently difficult, and this affects the maintainability of the system. Thus, there is a need for a tool that can classify MRs into MTs automatically. In this study, we present new results on combining textual features during MR classification. Three different classification techniques, namely Decision Tree, Naïve Bayesian and Logistic Regression, are applied to perform the classification. 600 MRs from the Mozilla bug tracking system (BTS) are used as the source of data. The results show that the model is able to classify MRs into MTs with accuracy between 67.33% and 78.17%.


Keywords- Classification; Software Maintenance; Maintenance Type; Corrective Maintenance

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

6. Paper 22061041: An Optimized Clustering Algorithm Using Genetic Algorithm and Rough set Theory based on Kohonen self organizing map (pp. 39-44)

Full Text: PDF

Asgarali Bouyer, Department of Computer Science, Islamic Azad University – Miyandoab Branch, Miyandoab, Iran

Abdolreza Hatamlou, Department of Computer Science, University Kebangsaan Malaysia, Selangor, Malaysia

Abdul Hanan Abdullah, Department of Computer and Information Systems, Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, 81310 Skudai, Johor Bahru, Malaysia


Abstract—The Kohonen self-organizing map (SOM) is an efficient tool in the exploratory phase of data mining and pattern recognition. The SOM is a popular tool that maps a high-dimensional space onto a small number of dimensions by placing similar elements close together, forming clusters. Recently, many researchers have found that, to capture the uncertainty involved in cluster analysis, crisp boundaries are not necessary in some clustering operations. In this paper, an optimized two-level clustering algorithm based on SOM, which employs rough set theory and a genetic algorithm, is proposed to overcome the uncertainty problem. Evaluation of the proposed algorithm on our gathered poultry-diseases data and on the Iris data shows it to be more accurate than crisp clustering methods, and it reduces the errors.
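For readers unfamiliar with the first level of such a scheme, a generic Kohonen SOM can be written in a few lines of NumPy. The sketch below is only the plain SOM stage; the rough-set boundary handling and genetic-algorithm optimization of the paper are not reproduced, and the grid size, learning-rate and radius schedules are illustrative assumptions.

```python
import numpy as np

def train_som(data, rows=10, cols=10, iters=1000, lr0=0.5, rad0=3.0, seed=0):
    """Generic Kohonen SOM: returns a (rows, cols, dim) weight grid."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    w = rng.random((rows, cols, dim))
    grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # best-matching unit: the node whose weights are closest to the sample
        d = np.linalg.norm(w - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        lr = lr0 * np.exp(-t / iters)        # decaying learning rate
        rad = rad0 * np.exp(-t / iters)      # shrinking neighborhood radius
        dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)
        h = np.exp(-dist2 / (2 * rad ** 2))  # Gaussian neighborhood function
        w += lr * h[..., None] * (x - w)     # pull neighborhood toward sample
    return w
```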


Index Terms- SOM, Clustering, Rough set theory, Genetic Algorithm.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

7. Paper 28061059: Secured Communication through Hybrid Crypto-Steganography (pp. 45-48)

Full Text: PDF

A. Joseph Raphael, Research Scholar, Karpagam University, Coimbatore, India and Lecturer in Information Technology, Ibra College of Technology, Sultanate of Oman

Dr. V. Sundaram, Head and Director, Department of Computer Applications, Karpagam College of Engineering, Coimbatore, India


Abstract-In this paper we present a hybrid technique that combines cryptography and steganography to send secret messages. The method has the advantages of both approaches: even if one fails, the other comes to the rescue. On the cryptography side we use the RSA method for encryption and decryption of the original message; the LSB (Least Significant Bit) method is then used to hide the encrypted message in a cover image, which is sent to the recipient. The original message is retrieved by the reverse process, first by collecting the LSBs of the pixels and then by RSA decryption. Since the private key in the RSA method is very difficult to recover, the suggested method is a strong encryption scheme, and messages can be communicated securely over an insecure channel.
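A toy end-to-end illustration of the encrypt-then-embed pipeline follows. The RSA key here uses deliberately tiny textbook primes and is trivially breakable; it is a sketch of the data flow only, and a real system would use a proper RSA library and actual image I/O rather than a list standing in for pixels.

```python
# Toy sketch of the encrypt-then-embed pipeline; insecure key, illustration only.
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)      # n = 3233, so ciphers fit in 12 bits
d = pow(e, -1, phi)                     # private exponent (Python 3.8+)

def rsa_encrypt(msg: bytes) -> list:
    return [pow(b, e, n) for b in msg]

def rsa_decrypt(cipher: list) -> bytes:
    return bytes(pow(c, d, n) for c in cipher)

def embed_lsb(pixels: list, bits: str) -> list:
    out = pixels[:]
    for i, bit in enumerate(bits):      # one message bit per pixel LSB
        out[i] = (out[i] & ~1) | int(bit)
    return out

def extract_lsb(pixels: list, nbits: int) -> str:
    return "".join(str(px & 1) for px in pixels[:nbits])

cipher = rsa_encrypt(b"hi")
bits = "".join(f"{c:012b}" for c in cipher)         # 12 bits per cipher value
cover = list(range(50, 50 + len(bits)))             # stand-in for image pixels
stego = embed_lsb(cover, bits)
recovered = extract_lsb(stego, len(bits))
rec = [int(recovered[i:i + 12], 2) for i in range(0, len(bits), 12)]
assert rsa_decrypt(rec) == b"hi"
```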


Keywords-stegano object; cryptosystem

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

8. Paper 29061075: Lossy audio coding flowchart based on adaptive time-frequency mapping, wavelet coefficients quantization and SNR psychoacoustic output (pp. 49-59)

Full Text: PDF

Khalil Abid, Kais Ouni and Noureddine Ellouze

Laboratory of Systems and Signal Processing (LSTS), National Engineering School of Tunis (ENIT), BP 37, Le Belvédère 1002, Tunis, Tunisia


Abstract—This paper describes a novel wavelet based audio synthesis and coding method. The adaptive wavelet transform selection and the coefficient bit allocation procedures are designed to take advantage of the masking effect in human hearing. They minimize the number of bits required to represent each frame of audio material at a fixed distortion level. This model incorporates psychoacoustic model into adaptive wavelet packet scheme to achieve perceptually transparent compression of high quality audio signals.


Keywords- D.W.T; Psychoacoustic Model; Signal to Noise Ratio; Quantization

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

9. Paper 11061007: A Tropos Based Requirement Engineering Framework for Self Adaptive Systems (pp. 60-67)

Full Text: PDF

Farah Noman, Department of Computer Science, National University of Computer and Emerging Sciences, Karachi, Pakistan

Zafar Nasir, Department of Computer Science, National University of Computer and Emerging Sciences, Karachi, Pakistan


Abstract—Applications developed in the current era are deployed in environments that change over the course of time. In a normal application, such changes would require re-work, with design- and architectural-level updates implemented to cater to the newly changed environment. This in turn results in wasted effort and increased system maintenance cost. Hence there arises a need for systems that can alter their functioning to adapt to changing environmental needs and heal themselves from likely errors and system failures automatically, without human intervention. Such systems are known as Self-Healing, Self-Adaptive or Autonomic Systems. The approaches and frameworks used for gathering, analyzing and specifying requirements for self-adaptive systems are quite different from the traditional life cycle, and they require a different line of action from the processes followed when capturing requirements for a normal system, whose environment is relatively stable and whose states are all known beforehand. This research focuses on analyzing the various methods, techniques and frameworks for gathering requirements for self-adaptive systems. A Tropos-based approach for requirements engineering for self-adaptive systems is also proposed.


Keyword-component; Self-adaptive Systems; Requirement Engineering; Autonomic Systems; Agent oriented Methodologies

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

10. Paper 11061008: Fuzzy Logic in a Low Speed Cruise-Controlled Automobile (pp. 68-77)

Full Text: PDF

Mary Lourde R. and Waris Sami Misbah

Department of Electrical & Electronics Engineering, BITS, Pilani-Dubai, Dubai International Academic City, U.A.E


Abstract — Traffic congestion is a major problem that drivers face these days; long rush hours take both a mental and a physical toll on a driver. This paper describes the design of a cruise control system based on fuzzy logic, intended to reduce the workload on a driver during traffic congestion. The proposed low-speed cruise control system operates by sensing the speed and headway distance of the preceding vehicle and controlling the host vehicle's speed accordingly. The vehicle speed is controlled through the throttle and the brakes. The fuzzy-logic-based cruise-controlled vehicle is simulated using MATLAB Simulink, and the results are presented in this paper.
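To make the control strategy concrete, here is a minimal fuzzy controller in plain Python. The authors' controller is a MATLAB Simulink model; the membership ranges, the rule base, and the single combined throttle/brake output in [-1, 1] below are illustrative assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cruise_command(headway_err, rel_speed):
    """headway_err: actual - desired gap (m); rel_speed: lead - host (m/s).
    Returns a command in [-1, 1]: negative = brake, positive = throttle."""
    close = tri(headway_err, -20, -10, 0)
    ok = tri(headway_err, -10, 0, 10)
    far = tri(headway_err, 0, 10, 20)
    closing = tri(rel_speed, -6, -3, 0)
    steady = tri(rel_speed, -3, 0, 3)
    opening = tri(rel_speed, 0, 3, 6)
    # (rule strength, crisp output) pairs: an illustrative rule base
    rules = [
        (min(close, closing), -1.0),   # too close and still closing: brake hard
        (min(close, steady),  -0.5),
        (min(ok,    steady),   0.0),
        (min(far,   steady),   0.5),
        (min(far,   opening),  1.0),   # far and opening: accelerate
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0   # weighted-average defuzzification

print(cruise_command(-12.0, -2.0))     # close gap, closing -> negative (brake)
```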


Keywords - fuzzy logic, cruise control, low speed, and traffic congestion.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

11. Paper 11061009: Plant Classification Based on Leaf Recognition (pp. 78-81)

Full Text: PDF

Abdolvahab Ehsanirad, Department of Computer Science, Islamic Azad University, Minoodasht Branch, Iran


Abstract — This study uses image processing techniques to classify plants based on leaf recognition. Two methods, the Gray-Level Co-occurrence Matrix (GLCM) and Principal Component Analysis (PCA), are applied to extract leaf texture features. To classify 13 kinds of plants, with 65 new or deformed leaves as test images, the algorithms are trained on 390 leaves. The findings indicate that the PCA method, with 98% accuracy, is more effective than the GLCM method, with 78% accuracy.
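A minimal sketch of the GLCM texture route follows, using scikit-image and scikit-learn. The function names are from scikit-image 0.19+, where the older "grey" spellings were renamed; the exact features and classifier used in the paper may differ, and the PCA route would instead flatten the images and apply sklearn.decomposition.PCA.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' before 0.19
from sklearn.neighbors import KNeighborsClassifier

def glcm_features(gray_img_u8):
    """Haralick-style statistics from a gray-level co-occurrence matrix."""
    glcm = graycomatrix(gray_img_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def train(leaves, labels):
    """leaves: iterable of (H, W) uint8 images; labels: plant species ids."""
    feats = np.stack([glcm_features(img) for img in leaves])
    return KNeighborsClassifier(n_neighbors=1).fit(feats, labels)
```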


Keywords - Classification, GLCM, PCA, Feature Extraction.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

12. Paper 20051026: Reliable Routing With Optimized Power Routing For Wireless Adhoc Network (pp. 82-89)

Full Text: PDF

T. K. Shaik Shavali, Professor, Department of Computer Science, Lords Institute of Engineering & Tech, Hyderabad-08, A.P., INDIA

Dr. T. Bhaskara Reddy, Department of Computer Science & Technology, S.K. University, Anantapur-03, A.P., INDIA

Sk Fairooz, Associate Professor, Department of ECE, AHCET, Hyderabad-08, A.P., INDIA


Abstract - In this work, a routing protocol called RMP (Route Management Protocol) is implemented to cope with misbehavior in ad hoc networks. It enables nodes to detect misbehavior by first-hand observation and by using second-hand information provided by other nodes. The RMP protocol can run on top of any routing protocol to cope with misbehavior; in this work we have tested it with the DSR routing protocol (i.e., DSR with RMP). The efficiency of the communication routes is evaluated in terms of node power consumption, and a mechanism is developed to optimize the power consumed by the routing scheme.


Keyword: route management protocol, adhoc network, power optimization, network efficiency

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

13. Paper 21061031: Performance of Hybrid Routing Protocol for Adhoc Network under Bandwidth Constraints (pp. 90-98)

Full Text: PDF

A K Daniel, Assistant Professor, Computer Sc & Engg Department, M M M Engineering College, GORAKHPUR (U P) India,

R Singh, Assistant Professor, Department of CS & I T, M J P Rohilkhand University, BAREILLY (U P) India

J P Saini, Principal, M M M Engineering College, GORAKHPUR (U P) India


ABSTRACT: An ad hoc network is a collection of wireless mobile nodes dynamically forming a temporary network without the use of any existing network infrastructure or centralized administration. Routing protocols used inside ad hoc networks must be prepared to automatically adjust to an environment that can vary between the extremes of high mobility with low bandwidth and low mobility with high bandwidth. In this paper, a bandwidth-efficient multicast routing protocol for ad hoc networks is presented: a hybrid routing protocol under bandwidth constraints (HRP-BC). The proposed protocol achieves low communication overhead and high multicast efficiency, and it improves on existing routing protocols by creating a mesh and providing multiple alternate routes. The protocol considers 1) route setup, as the routing distance of the path, 2) load at the node, as traffic, and 3) bandwidth, as queue length at the node. The proposed scheme utilizes the path, traffic and bandwidth resource information at each node for selection of the route path, and is compared with traditional DSR schemes. The simulation results show that the proposed HRP-BC protocol achieves better performance than the DSR protocol in maintenance overhead and path reliability. It reduces congestion in the network and improves bandwidth utilization, thus providing efficient use of bandwidth in the ad hoc network.


Keywords:- MANET, Proactive, Reactive, Hybrid, bandwidth

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

14. Paper 21061032: MVDR an Optimum Beamformer for a Smart Antenna System in CDMA Environment (pp. 99-106)

Full Text: PDF

M Yasin, Pervez Akhtar, M Junaid Khan

Department of Electronics and Power Engineering, Pakistan Navy Engineering College, National University of Sciences and Technology (NUST), Karachi, PAKISTAN


Abstract: Efficient utilization of the limited radio frequency spectrum is only possible with a smart/adaptive antenna array system. The Minimum Variance Distortionless Response (MVDR) algorithm is one option for a smart antenna to exploit the spatial distribution of users and the access-delay distribution of signal paths, enhancing a mobile system's capacity for quality voice and data communication. This paper analyzes the performance of MVDR (a blind algorithm) and the Kernel Affine Projection Algorithm, KAPA (a nonblind algorithm), for a CDMA application. KAPA was first implemented in [1] in the context of noise cancellation; we use it here for adaptive beamforming, which is novel in this application. The smart antenna incorporates these algorithms in coded form to calculate the optimum weight vector, which minimizes the total received power except for the power coming from the desired direction. Simulation results verify that MVDR, the blind algorithm, has high resolution not only for beam formation but is also better for null generation compared with the nonblind algorithm KAPA. Therefore, MVDR is found to be the more efficient beamformer.
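The MVDR weight vector has the closed form w = R^{-1} a(θ) / (a^H(θ) R^{-1} a(θ)), which minimizes output power subject to unit gain toward the desired direction. The NumPy sketch below computes it for a uniform linear array with half-wavelength spacing; the array geometry, the diagonal loading, and the sample covariance estimate are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

def steering_vector(theta_deg, n_elems, spacing=0.5):
    """ULA steering vector; element spacing given in wavelengths."""
    k = np.arange(n_elems)
    return np.exp(-2j * np.pi * spacing * k * np.sin(np.radians(theta_deg)))

def mvdr_weights(snapshots, theta_deg):
    """snapshots: (n_elems, n_samples) array of received baseband data."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    R += 1e-6 * np.trace(R).real / len(R) * np.eye(len(R))   # diagonal loading
    a = steering_vector(theta_deg, len(R))
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)   # w = R^-1 a / (a^H R^-1 a)
```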


Keywords: Adaptive Filtering, Minimum Variance Distortionless Response (MVDR) Algorithm and Kernel Affine Projection Algorithm (KAPA).

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

15. Paper 22061035: Specifying And Validating Quality Characteristics For Academic Web-sites – Indian Origin (pp. 107-113)

Full Text: PDF

Ritu Shrivastava, Department of Computer Science and Engineering, Sagar Institute of Research Technology & Science, Bhopal 462007, India

J. L. Rana, Retired Professor, Department of Computer Science and Engineering, Maulana Azad National Institute of Technology, Bhopal 462002, India

M Kumar, Prof. & Dean, Department of Computer Science and Engineering, Sagar Institute of Research & Technology, Bhopal 462007, India


Abstract— Every stakeholder of academic Web-sites is mainly concerned with external quality, viz., usability, functionality, and reliability. Signore and Olsina have given hierarchical quality characteristics for measuring the quality of Web-sites, especially for the e-commerce and museum domains. In this paper, the authors propose a hierarchical model of attributes, sub-attributes, and metrics for measuring the external quality of academic Web-sites of Indian origin. The theoretical validation of the model has been carried out using the distance measure construction method. The empirical validation is in progress and will be reported soon.


Keywords-component; Web-site Quality, Academic domain, Hierarchical model, Attributes, Metrics

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

16. Paper 22061036: ISOR: Intelligent Secure On-Demand Routing Protocol (pp. 114-119)

Full Text: PDF

Moitreyee Dasgupta, Department of Computer Science and Engg., JSS Academy of Technical Education, Noida, India

Gaurav Sandhu, Department of Computer Science and Engg., GTBIT, New Delhi, India

Usha Banerjee, Department of Computer Science & Engg., College of Engineering Roorkee, Roorkee, India


Abstract— MANETs are highly vulnerable to attacks due to their inherent characteristics: the lack of infrastructure and the complexity of wireless communication. Considerable improvements have been made towards providing ad hoc network security, and existing solutions apply cryptography, intrusion detection systems or reputation systems. However, these conventional defense lines are insufficient to ward off all attacks and intrusions. Our approach is to study the behavior of the AODV routing protocol in the presence of blackhole attacks, one of the major Denial-of-Service attacks. In the first phase of this research, we provide a detailed simulation methodology for blackhole attacks and detail the steps of creating a new routing protocol, named the Intelligent Secure On-demand Routing protocol (ISOR), using NS-2. In ISOR, an intelligent prevention scheme is presented in which every node behaves intelligently to prevent blackhole attacks. Simulation studies show that, compared with the original ad hoc on-demand distance vector (AODV) routing scheme, our proposed solution can verify 75% to 98% of the routes to the destination, depending on the pause times, at minimum delay in the network.


Keywords- Blackhole attacks, DoS Attacks, MANET, Security in MANET routing protocol

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

17. Paper 22071026: High Performance Fingerprint Identification System (pp. 120-125)

Full Text: PDF

Dr. R. Seshadri, B.Tech, M.E, Ph.D, Director, S.V.U. Computer Center, S.V. University, Tirupati

Yaswanth Kumar Avulapati, M.C.A, M.Tech, (Ph.D), Research Scholar, Dept of Computer Science, S.V. University, Tirupati


Abstract - Biometrics is the science of establishing the identity of an individual based on the person's physical, chemical and behavioral characteristics. Fingerprints are the most widely used biometric feature for person identification and verification in the field of biometric identification. A fingerprint is the representation of the epidermis of a finger: it consists of a pattern of interleaved ridges and valleys. Fingerprints are graphical flow-like ridges present on human fingers. They are fully formed at about seven months of fetal development, and finger ridge configurations do not change throughout the life of an individual except due to accidents such as bruises and cuts on the fingertips. This property makes fingerprints a very attractive biometric identifier. This paper presents an approach to classifying fingerprints into different groups, which increases the performance of the system: it speeds up fingerprint matching when the input template is matched against the stored templates.


Keywords- Biometrics, Verification, Identification

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

18. Paper 25061047: Constraint-free Optimal Meta Similarity Clusters Using Dynamic Minimum Spanning Tree (pp. 126-135)

Full Text: PDF

S. John Peter, Department of Computer Science and Research Center, St. Xavier's College, Palayamkottai, Tamil Nadu, India

S.P. Victor, Department of Computer Science and Research Center, St. Xavier's College, Palayamkottai, Tamil Nadu, India


Abstract — Clustering is a process of discovering groups of objects such that objects in the same group are similar and objects belonging to different groups are dissimilar. A number of clustering algorithms exist that can solve the clustering problem, but most of them are very sensitive to their input parameters, so it is very important to evaluate their results. The minimum spanning tree clustering algorithm is capable of detecting clusters with irregular boundaries. In this paper we propose a constraint-free minimum-spanning-tree-based clustering algorithm. The algorithm constructs a hierarchy from top to bottom; at each hierarchical level it optimizes the number of clusters, from which the proper hierarchical structure of the underlying dataset can be found. The algorithm uses a new cluster validation criterion, based on the geometric property of the partition of the data set, to find the proper number of clusters at each level. The algorithm works in two phases: the first phase creates clusters with guaranteed intra-cluster similarity, whereas the second phase creates a dendrogram, using the clusters as objects, with guaranteed inter-cluster similarity. The first phase uses a divisive approach, whereas the second phase uses an agglomerative approach; we use both approaches in the algorithm to find optimal meta-similarity clusters.
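The core MST splitting step (without the paper's validation criterion or its two-phase divisive/agglomerative hierarchy) can be sketched with SciPy: build the Euclidean minimum spanning tree, then cut the k-1 longest edges so that the connected components of the remaining forest are the clusters. The helper name mst_clusters is assumed for illustration.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_clusters(points, k):
    """Cut the k-1 longest MST edges; returns one cluster label per point."""
    dist = squareform(pdist(points))            # dense Euclidean distance matrix
    mst = minimum_spanning_tree(dist).tocoo()   # n-1 edges of the MST
    order = np.argsort(mst.data)[::-1]          # longest edges first
    keep = np.ones(len(mst.data), dtype=bool)
    keep[order[:k - 1]] = False                 # remove the k-1 longest edges
    n = len(points)
    forest = np.zeros((n, n))
    forest[mst.row[keep], mst.col[keep]] = mst.data[keep]
    return connected_components(forest, directed=False)[1]
```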


Keywords: Euclidean minimum spanning tree, Subtree, Clustering, Eccentricity, Center, Hierarchical clustering, Dendrogram, Cluster validity, Cluster Separation

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

19. Paper 25061054: Media Streaming using Multiple Description Coding in Overlay Networks (pp. 136-139)

Full Text: PDF

Sachin Yadav, Department of CSE, SGIT College of Engineering, Ghaziabad, India

Ranjeeta Yadav, Department of ECE, SGIT College of Engineering, Ghaziabad, India

Shailendra Mishra, Department of CSE, Kumaon Engineering College, Dwarahat, India


Abstract—In this paper we examine the performance of two types of Overlay networks i.e. Peer-to-Peer (P2P) & Content Delivery Network (CDN) media streaming using Multiple Description Coding (MDC). In both the approaches many servers simultaneously serve one requesting client with complementary descriptions. This approach improves reliability and decreases the data rate a server has to provide. We have implemented both approaches in the ns-2 network simulator. The experimental results indicate that the performance of Multiple Description Coding-based media streaming in case of P2P network is better than CDN.


Keywords- MDC; CDN; Video Streaming; P2P; Overlay Network

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

20. Paper 28061056: Secured and QoS based multicast routing in MANETs (pp. 140-148)

Full Text: PDF

Maya Mohan, Department of CSE, NSS College of Engineering, Palakkad, Kerala

S.Mary Saira Bhanu, Department of CSE, National Institute of Technology, Thiruchirappalli, TN


Abstract- A mobile ad hoc network (MANET) is a dynamic network of self-controlled mobile nodes without any centralized coordinator (access point or base station) or wired infrastructure. The main difficulty in designing a routing protocol for MANETs is the dynamic topology that results from the random movement of mobile nodes within the source's transmission range. A MANET, which is fundamentally different from conventional infrastructure-based networks, is self-configuring and formed directly by a set of mobile nodes. In a MANET, the heterogeneity of networks and destinations makes it difficult to improve bandwidth utilization and service flexibility, and the mobility of nodes makes the design of data distribution jobs greatly challenging. The wide use of multiparty conferences in MANETs leads to multicast routing for the transmission of information such as video and other streaming data. In multicasting, quality of service (QoS) and security are the leading challenges: QoS deals with bandwidth utilization and network failures, and security keeps group communication confidential. In this paper the MAODV protocol is modified to add both QoS and security to group communication. The QoS part handles link failures and node failures; security is provided using a symmetric key encryption method.
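As a concrete illustration of the symmetric-key encryption used for confidentiality (not of the MAODV integration itself), a shared group key with the Python cryptography package looks like the sketch below; distributing the key securely to all group members is assumed to happen out of band.

```python
from cryptography.fernet import Fernet

group_key = Fernet.generate_key()   # distributed securely to all group members
cipher = Fernet(group_key)

token = cipher.encrypt(b"multicast payload")   # any member encrypts...
plain = cipher.decrypt(token)                  # ...and any member decrypts
assert plain == b"multicast payload"
```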


Key Words- multicast; MANET; QoS; security;

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

21. Paper 28061061: Analytical Comparison of Fairness Principles for Resource Sharing in Packet-Based Communication Networks (pp. 149-156)

Full Text: PDF

Yaser Miaji and Suhaidi Hassan

InterNetWorks Research Group, UUM College of Arts and Sciences, Universiti Utara Malaysia, 06010 UUM Sintok, Malaysia


Abstract - The number of Internet users has increased enormously, and the applications they use devour enormous bandwidth. Under these conditions, the Internet is no longer a fair and protective environment. The diversity of Internet applications requires a reconsideration of the mechanisms used to deliver each packet passing through a router, in order to provide better fairness and a more protective environment. Furthermore, an observer of an Internet packet can easily identify that its delay is caused chiefly by queuing in the output buffer of the router. To reduce such delay for sensitive applications, such as real-time applications, scholars have developed many fairness principles which in turn can improve QoS, and hence fairness and protection. This study highlights the most famous fairness principles used in the literature, along with some other novel ideas in the concept of fairness. The analytical comparison of these principles shows the weaknesses and strengths of each principle, and it illuminates which fairness principle is more appropriate in which environment.


Keywords-components; Fairness, max-min, proportional fairness, balanced, max-min charge

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

22. Paper 28061062: Multiple Values Bidirectional Square Root Search (pp. 157-161)

Full Text: PDF

Syed Zaki Hassan Kazmi, Department of Computer Science, IQRA University H-9, Islamabad, Pakistan

Syeda Shehla Kazmi, Department of Computing & Mathematics, Manchester Metropolitan University, United Kingdom

Jamil Ahmad, Department of Computer Science, IQRA University H-9, Islamabad, Pakistan

Syeda Sobia Hassan Kazmi, Department of Computer Science, The University of Azad Jammu and Kashmir, Muzaffarabad A.K, Pakistan


Abstract—The research in hand is an effort to introduce a new efficient searching technique known as Multiple Values Bidirectional Square Root Search. With this technique, a sorted list of values can be searched for within another sorted list very efficiently: the overall time for sorting the values to be searched plus the search itself is less than the time taken by linear search or binary search. In this technique, the size of the target list is reduced for every next value to be searched, so the searching time for the remaining values keeps decreasing, whereas in linear search and binary search the size of the target list remains the same for every value to be searched; in other words, the target list is traversed completely, from beginning to end, for every value searched. A running cost analysis and the results obtained after implementation are provided in graphical form, with the objective of comparing the efficiency of the technique with linear search as well as binary search, binary search being considered an efficient and fast one.
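The abstract's central idea, shrinking the target window after each hit once the query values are themselves sorted, can be illustrated with a simplified variant that uses Python's bisect for the probing (binary search within the remaining window, rather than the paper's square-root stepping):

```python
import bisect

def search_many(target, queries):
    """Find each query in sorted `target`; queries are sorted first so the
    search window can shrink after every probe. Returns {query: index or -1}."""
    result, lo = {}, 0
    for q in sorted(queries):
        i = bisect.bisect_left(target, q, lo)   # search only target[lo:]
        if i < len(target) and target[i] == q:
            result[q] = i
            lo = i + 1   # later (larger) queries cannot occur before here
        else:
            result[q] = -1
            lo = i       # the window still shrinks past smaller elements
    return result

print(search_many([2, 3, 5, 7, 11, 13, 17], [17, 5, 4, 11]))
# {4: -1, 5: 2, 11: 4, 17: 6}
```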


Keywords- Searching; Linear Search; Binary Search.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

23. Paper 29061068: Chunk Sort (pp. 162-166)

Full Text: PDF

Syed Zaki Hassan Kazmi, Department of Computer Science, IQRA University H-9, Islamabad, Pakistan

Syeda Shehla Kazmi, Department of Computing & Mathematics, Manchester Metropolitan University, United Kingdom

Syeda Sobia Hassan Kazmi, Department of Computer Science, The University of Azad Jammu and Kashmir, Muzaffarabad A.K, Pakistan

Syed Raza Hussain Bukhari, Department of Computer Science, The University of Azad Jammu and Kashmir, Muzaffarabad A.K, Pakistan


Abstract—The objective of this paper is to develop a new efficient sorting technique known as Chunk Sort. Chunk sort is based on the idea of merge sort: divide the main data list into a number of sub-lists, then sort the sub-lists and combine them until the whole list becomes sorted. The difference between merge sort and chunk sort is that merge sort merges two sub-lists into one, while chunk sort merges three sub-lists into a single sorted list. It is faster than many existing sorting techniques, including merge sort and quick sort. A running cost analysis and the results obtained after implementation are provided in graphical form, with the objective of comparing the efficiency of the technique with quick sort and merge sort, both of which are considered efficient and fast.
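A minimal sketch of the three-way split-and-merge idea follows (an illustration, not the authors' implementation); heapq.merge from the standard library performs the single-pass merge of the three sorted chunks:

```python
from heapq import merge

def chunk_sort(a):
    """Three-way merge sort: split into three chunks, sort each recursively,
    then merge the three sorted chunks in a single pass."""
    if len(a) <= 1:
        return a
    third = max(1, len(a) // 3)
    chunks = [a[:third], a[third:2 * third], a[2 * third:]]
    return list(merge(*(chunk_sort(c) for c in chunks if c)))

print(chunk_sort([5, 1, 4, 9, 0, 7, 3, 8, 2, 6]))
```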


Keywords- Sorting; Merge Sort; Quick Sort.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

24. Paper 29061069: Top-Down Approach for the Development of Scheduling Mechanism in Packet-Switching Networks (pp. 167-173)

Full Text: PDF

Yaser Miaji and Suhaidi Hassan

InternetWorks Research Group, UUM College of Arts and Sciences, Universiti Utara Malaysia, 06010 UUM Sintok, MALAYSIA.


Abstract - Resource sharing is common in the networked society, particularly with the current enormous increase in the number of Internet users relative to limited resources. The emergence of new applications, such as real-time applications with greater sensitivity and odd behavior, requires a fundamental change in resource sharing policy. The delay caused by congestion can demolish any aspiration of delivering such sensitive applications within their restrictions. The scheduling mechanism plays an essential and crucial part in this function, since queuing delay is the largest delay experienced by a packet. Therefore, this paper looks at the evolution of scheduling mechanisms in the academic literature. We present the development in a top-down approach, from the most recent proposal back to the very beginning. While such an approach provides comprehensive knowledge, intensive information and rigorous tracking of the evolution of scheduling mechanisms, the results show that there have not been many changes in the principles of scheduling, except for the most recent scheduler, named Just Queuing.


Keywords: QoS, real time application, scheduling, queueing, network layer.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

25. Paper 31051080: Survey on Text Document Clustering (pp. 174-178)

Full Text: PDF

M.Thangamani, Computer Technology, Kongu Engineering College, Perundurai, Tamilnadu, India

Dr.P.Thangaraj, Dean, School of Computer Technology and Applications, Kongu Engineering College, Perundurai, Tamilnadu, India


Abstract—Document clustering, also referred to as text clustering, is conceptually much the same as data clustering. It is very difficult to find selected information within a large volume of information, which is where document clustering comes into the picture. Basically, a cluster is a group of similar data; document clustering means segregating the data into different groups of similar data. Clustering can be of a mathematical, statistical or numerical domain. Clustering is a fundamental data analysis technique used in various applications such as biology, psychology, control and signal processing, information theory and mining technologies. From a theoretical or machine-learning perspective, clusters represent hidden patterns, so the search can be done by unsupervised learning, called the data concept. From a practical perspective, clustering plays a vital role in data mining applications such as scientific data exploration, information retrieval and text mining, spatial database applications, Web analysis, CRM, marketing, medical diagnostics, computational biology, cybernetics, genetics, etc. In this survey we mainly concentrate on text mining and data mining. The process of extracting interesting information and knowledge from unstructured text is referred to as text mining; data mining is sorting through data to identify patterns and plot out relationships. There are many algorithms based on text and data mining.


Keywords-Text Mining, Information Retrieval and Text Mining, Spatial Database Applications, Web Analysis.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

26. Paper 08061001: Simulation Analysis of Node Misbehavior in an Ad-hoc Network using NS2 (pp. 179-182)

Full Text: PDF

Rekha Kaushik, Department of Information Technology, MANIT, Bhopal, M.P, India

Dr. Jyoti Singhai, Department of Electronics and Communication Engineering, Bhopal, M.P, India


Abstract- Proper operation of a MANET requires the mutual cooperation of participating nodes. In the presence of selfish or malicious nodes, network performance degrades significantly. Selfish nodes drop packets coming from their neighbor nodes to conserve their own energy, or push their own packets forward in the buffer queue. To prevent this misbehavior, selfish nodes need to be detected and isolated from the network. This paper detects selfish nodes that arise from nodes conserving their energy. After their detection, the performance of the network is analyzed by comparing, using NS2, the ideal network with the network containing selfish nodes.


Keywords- MANET, DSR, Selfish node.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

27. Paper 11061006: Survey on Fuzzy Clustering and Rule Mining (pp. 183-187)

Full Text: PDF

D. Vanisri, Computer Technology, Kongu Engineering College, Perundurai, Tamilnadu, India

Dr. C. Loganathan, Principal, Maharaja Arts and Science College, Coimbatore, Tamilnadu, India


Abstract—Document clustering improves the retrieval effectiveness of an information retrieval system. Association rules discover interesting relations between variables in transaction databases. Transaction data in real-world applications carry fuzzy and quantitative values, motivating the design of sophisticated data mining algorithms for optimization. If documents can be clustered together in a sensible order, then indexing and retrieval operations can be optimized. This study presents a review of fuzzy document clustering. The survey also aims at giving an overview of some of the previous research done in fuzzy rule mining, evaluating the current status of the field, and envisioning possible future trends in this area.


Keywords- Fuzzy set, Fuzzy clustering, Fuzzy rule mining, Information Retrieval, Web analysis.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

28. Paper 11061011: An Agent Based Approach for End-to-End QoS Guarantees in Multimedia IP networks (pp. 188-197)

Full Text: PDF

A. Veerabhadra Reddy, Lecturer in ECE, Government Polytechnic for Women, Hindupur

Dr. D. Sreenivasa Rao, Professor, Department of ECE, JNTU CE, Hyderabad


Abstract— Quality of Service (QoS) guarantees are important when network capacity is insufficient, particularly for real-time streaming multimedia applications such as voice over IP. Differentiated Services (DiffServ) prioritizes flows according to their service class and provides much better bandwidth utilization than the original Internet's best-effort service. Predicting the end-to-end behavior, and determining how individual routers deal with the type-of-service field, is difficult, and it becomes more difficult if a packet crosses two or more DiffServ clouds before reaching its destination. In this paper, we propose a QoS mapping framework to achieve scalability and end-to-end accuracy in QoS, using a Policy Agent (PA) in every DiffServ domain. This agent performs admission control decisions based on a policy database, and it configures the ingress and egress routers to perform traffic policing and conditioning jobs. Moreover, it constructs the shortest path between a source and destination satisfying the QoS constraints of bandwidth and delay. Through simulation results, we show that our proposed approach attains high throughput with reduced packet loss when compared with the normal DiffServ architecture.
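The route-construction step performed by the Policy Agent (a shortest path subject to bandwidth and delay constraints) can be sketched as a delay-minimizing Dijkstra search that prunes links below the requested bandwidth. This is a standard simplification; the PA's admission control and DiffServ signaling are not modeled, and the helper name qos_route is assumed.

```python
import heapq

def qos_route(links, src, dst, min_bw, max_delay):
    """links: {node: [(neighbor, bandwidth, delay), ...]}.
    Returns (total_delay, path) or None if no route meets both constraints."""
    pq, seen = [(0, src, [src])], set()
    while pq:
        delay, node, path = heapq.heappop(pq)
        if node == dst:
            return (delay, path) if delay <= max_delay else None
        if node in seen:
            continue
        seen.add(node)
        for nxt, bw, d in links.get(node, []):
            if bw >= min_bw and nxt not in seen:   # prune low-bandwidth links
                heapq.heappush(pq, (delay + d, nxt, path + [nxt]))
    return None

net = {"A": [("B", 10, 2), ("C", 5, 1)], "B": [("D", 10, 2)], "C": [("D", 20, 1)]}
print(qos_route(net, "A", "D", min_bw=8, max_delay=10))   # (4, ['A', 'B', 'D'])
```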


Keywords- Quality of Service (QoS); Policy Agent (PA); DiffServ domain; QoS Route Selection; Packet loss; Throughput

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

29. Paper 14061014: High Performance Reconfigurable Balanced Shared Memory Architecture For Embedded DSP (pp. 198-206)

Full Text: PDF

J.L. Mazher Iqbal, Assistant Professor, ECE Department, Rajalakshmi Engineering College, Chennai-602 105, India

S. Varadarajan, Associate Professor, ECE Department, Sri Venkateswara College of Engineering, Sri Venkateswara University, Tirupati-517 502, India


Abstract—Reconfigurable computing greatly accelerates a wide variety of applications and has therefore become the subject of a great deal of research. It offers the ability to perform computations in hardware to increase performance, while keeping much of the flexibility of a software solution. In addition, reconfigurable computers contain functional resources that may easily be modified after field deployment in response to changing operational parameters and datasets. To date, the core processing element of most reconfigurable computers has been the field programmable gate array (FPGA) [3]. This paper presents a reconfigurable FPGA-based hardware accelerator for embedded DSP. Reconfigurable FPGAs have significant logic, memory and multiplier resources, which can be used in parallel to implement very high performance DSP processing. The advantages of DSP design using FPGAs are a high number of instructions per clock, a high number of multipliers, high bandwidth, and flexible I/O and memory connectivity. The proposed processor is a reconfigurable processing-element architecture that consists of processing elements (PEs), memories, an interconnection network and control elements. A processing element based on bit-serial arithmetic (multiplication and addition) is also given. In this paper, it is established that the specific universal balanced architecture implemented in the FPGA is a universal solution suited to a wide range of DSP algorithms. First the principle of the modified shared-memory-based processor is shown, and then the specific universal balanced architecture is proposed. An example of a processor for the TVDFT transformation on the given accelerator is also given. With the proposed architecture, we can reduce cost, area and hence power compared with the best-known designs in Xilinx FPGA technology.


Keywords- Reconfigurable architectures; FPGA; Pipeline; Processing Element; Hardware Accelerator

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

30. Paper 16061022: A Novel approach of Data Hiding Using Pixel Mapping Method (PMM) (pp. 207-214)

Full Text: PDF

Souvik Bhattacharyya, Lalan Kumar, and Gautam Sanyal


Abstract—Steganography is a process that involves hiding a message in an appropriate carrier, such as an image or audio file. The carrier can then be sent to a receiver such that no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out on steganography by different researchers. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray-scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each two bits of the secret message into each of the neighbor pixels according to that pixel's features, in a specified manner. The approach can be modified to map four bits of the secret message by considering more features of the embedding pixel. Before embedding, a check is made to determine whether the selected pixel or its neighbor lies at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.


Keywords— Cover Image, Pixel Mapping Method (PMM), Stego Image.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

31. Paper 16061023: Matching SHIQ Ontologies (pp. 215-222)

Full Text: PDF

B.O. Akinkunmi, A.O. Osofisan, and A.F. Donfack Kana

Department of Computer Science, University of Ibadan, Nigeria.


Abstract- This paper proposes and evaluates an approach to facilitate semantic interoperability between ontologies built in the SHIQ description logic language, in an attempt to overcome the heterogeneity problem of ontologies. The structural definition of the ontologies is used as the key to predicting their similarities. Based on SHIQ General Concept Inclusion, the ontologies to be mapped are translated into hierarchical trees, and a graph matching technique is used to find similarities between the trees. Similarity between concepts is predicted based on their level in the hierarchy and their logical definition. Semantic similarities between concepts are evaluated by putting more emphasis on the logical operators used in defining concepts, with less reference to syntactic similarity analysis of concepts. The results obtained show that a pure structural comparison, based mainly on the logical operators used in defining ontology concepts, provides a better approximation than a comparison combining logical and syntactic similarity analysis evaluated with the edit distance function.


Keywords: Ontology, Description logics, Mapping, Interoperability, Semantic.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

32. Paper 18061027: Parallel Genetic Algorithm System (pp. 223-228)

Full Text: PDF

Nagaraju Sangepu, Assistant Professor

K.Vikram, CSE dept, KITS, Warangal, India


Abstract – The Genetic Algorithm (GA) is a popular technique for finding optimal transformations because of its simple implementation procedure. In image processing, GAs are used as a parameter-search procedure, and this processing demands very high computer performance. Recently, parallel processing has been used to reduce this time by distributing the appropriate amount of work to each computer in a clustering system, so that processing time decreases as the number of dedicated computers grows. Parallel implementations of systems can be grouped into three categories: 1) parallel hardware architectures designed especially for parallel processing, 2) supporting software implementations on machines with hardware support for parallel processing, and 3) parallel processing algorithms implemented entirely in software on general-purpose hardware. The last category includes a clustering architecture consisting of a homogeneous collection of general-purpose computer systems connected via networks, also termed a clustered computing environment. The queue length is optimally adjusted using the GA so that it is minimized during data transfer, in order to keep the bandwidth at a stable condition; a graph is also drawn to show the difference in bandwidth. Implementation includes all the activities that take place in converting from the old system to the new one; the new system may be totally new, replacing an existing system, or it may be a major modification to the system currently in use. This application is implemented with a simulation model of a computer network, constructed along with the router, and options are given to invoke FCFS and the Genetic Algorithm. The paths between source and destination are drawn, and the results of both algorithms are discussed.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

33. Paper 21061033: Framework for vulnerability reduction in real time intrusion detection and prevention systems using SOM based IDS with Netfilter-Iptables (pp. 229-233)

Full Text: PDF

Abhinav Kumar, Kunal Chadha, Dr. Krishna Asawa

Jaypee Institute of Information Technology, Deemed University, Noida, India

.

Abstract— Intrusion Detection Systems (IDSs) and Intrusion Prevention Systems (IPSs) are among the possible ways of handling various types of attacks or intrusions, but the credibility of such systems is itself at stake: none of the existing systems can guarantee safety. In this paper we propose the integration of a SOM-based intrusion detection system with an intrusion prevention system on the Linux platform. We propose a framework for reducing real-time security risks by using Self-Organizing Maps for intrusion detection, accompanied by packet filtering through Netfilter/iptables to handle malicious data packets.
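For illustration, a minimal self-organizing map training loop in Python/NumPy; feature extraction from live traffic and the Netfilter hook are out of scope here, and the grid size, feature count and learning rates are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    grid_h, grid_w, dim = 10, 10, 4          # 10x10 neurons, 4 traffic features
    weights = rng.random((grid_h, grid_w, dim))

    def train(samples, epochs=20, lr0=0.5, radius0=5.0):
        for t in range(epochs):
            lr = lr0 * (1 - t / epochs)
            radius = max(1.0, radius0 * (1 - t / epochs))
            for x in samples:
                d = np.linalg.norm(weights - x, axis=2)
                bi, bj = np.unravel_index(np.argmin(d), d.shape)  # best matching unit
                ii, jj = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")
                h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * radius ** 2))
                weights[:] += lr * h[..., None] * (x - weights)   # pull neighbourhood toward x

    train(rng.random((100, dim)))
    # At detection time, a packet whose BMU distance exceeds a threshold would
    # be flagged, and a matching DROP rule pushed to iptables.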

.

Keywords-Intrusion Detection System, SOM.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

34. Paper 23061044: Challenges in Managing Information Security From an Organization’s Perspective (pp. 234-243)

Full Text: PDF

Patrick Kanyolo Ngumbi, School of Science and Engineering, Atlantic International University, Hawaii, USA

.

Abstract: This study used purposively selected employees to fill in self-administered unstructured questionnaires providing information on aspects of information security at the organizational level. The responses were subjected to non-probability analysis, from which an understanding of the challenges encountered and their subsequent impact was obtained. Six evaluation questions were used to gain insight into information security components. The study documented four categories of challenges encountered, the possible outcomes of those challenges and their consequential impact. These results are beneficial to business end-users, information security managers, and top and senior management in organizations.

.

Keywords: Information security management, organizational level, business information systems, challenges, outcome, impact

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

35. Paper 25061046: Image Retrieval with Texture Features Extracted using Kekre’s Median Codebook Generation of Vector Quantization (pp. 244-251)

Full Text: PDF

Dr. H. B. Kekre, Sr. Professor, MPSTME, NMIMS Vileparle(W), Mumbai 400056, India

Sudeep D. Thepade, Ph.D. Scholar & Assistant Professor, MPSTME, NMIMS Vileparle(W), Mumbai 400-056, India

Tanuja K. Sarode, Ph.D. Scholar MPSTME, NMIMS Assistant Professor, TSEC, Mumbai 400-050, India

Vaishali Suryavanshi, Lecturer, Thadomal Shahani Engg. College, Bandra (W), Mumbai 400-050, India

.

Abstract — In this paper, novel methods for image retrieval based on texture feature extraction using Vector Quantization (VQ) are proposed. We have used the Linde-Buzo-Gray (LBG) and Kekre's Median Codebook Generation (KMCG) algorithms for texture feature extraction. The image is first divided into blocks of size 2x2 pixels (each pixel with red, green and blue components), and a training vector of dimension 12 is created from each block; the collection of all such training vectors is the training set. To generate the texture feature vector of the image, the LBG and KMCG algorithms are applied to the initial training set to obtain codebooks of size 16, 32, 64, 128, 256 and 512. These codebooks are used as feature vectors for CBIR. Thus the two codebook generation algorithms, with six codebook sizes per algorithm, result in 12 proposed image retrieval techniques. The proposed techniques are tested on a generic image database of 1000 images, and the results are compared with the Gray Level Co-occurrence Matrix (GLCM) method. The proposed CBIR methods outperform GLCM with higher precision and recall values. KMCG-based CBIR gives a performance improvement over LBG-based CBIR, and the performance of KMCG CBIR improves with increasing codebook size. Overall, KMCG CBIR with codebook size 512 gives the best results, with the highest precision and recall values.
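A sketch, in Python/NumPy, of the training-set construction and an LBG-style codebook split as described above; the random stand-in image, codebook size and iteration counts are assumptions, and a real CBIR run would load database images instead.

    import numpy as np

    rng = np.random.default_rng(1)
    img = rng.integers(0, 256, (64, 64, 3)).astype(float)   # stand-in image

    # Collect all 2x2 RGB blocks as 12-dimensional training vectors.
    blocks = [img[i:i+2, j:j+2].reshape(-1)
              for i in range(0, 64, 2) for j in range(0, 64, 2)]
    train = np.array(blocks)                                 # shape (1024, 12)

    def lbg(train, size=16, iters=10, eps=1e-3):
        codebook = train.mean(axis=0, keepdims=True)
        while len(codebook) < size:
            codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])  # split
            for _ in range(iters):                           # Lloyd refinement
                d = np.linalg.norm(train[:, None] - codebook[None], axis=2)
                nearest = d.argmin(axis=1)
                for k in range(len(codebook)):
                    members = train[nearest == k]
                    if len(members):
                        codebook[k] = members.mean(axis=0)
        return codebook

    feature = lbg(train, size=16)   # the 16x12 codebook serves as the feature vector
    print(feature.shape)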

.

Keywords— CBIR, Vector Quantization, GLCM, LBG, KMCG

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

36. Paper 25061049: An Efficient Trust Establishment Framework for MANETs (pp. 252-259)

Full Text: PDF

Mohammad Karami, Mohammad Fathian

Department of Industrial Engineering, Iran University of Science and Technology, Tehran, Iran

.

Abstract— In this paper, we present a general trust establishment framework comprising three components. The first is the trust computation model, which evaluates the trust level of each participating node through monitoring and quantification of relevant behavioral metrics. The second is the trust evidence distribution scheme, which distributes the trust evidence obtained by the first component. Finally, the third is the reputation computation model, which combines the trust evidence collected from other nodes to form an overall reputation score and a basis for judging the trustworthiness of each node. The trust computation model is based on first-hand evidence obtained via direct observations at the MAC layer. The proposed trust evidence distribution scheme is an efficient, scalable and completely distributed scheme based on an ant colony optimization algorithm. For combining the collected evidence in the reputation computation model, Dempster's rule of combination is applied; it gives a numerical procedure for fusing multiple pieces of evidence from unreliable observers. The paper illustrates the applicability of the proposed framework on the data packet delivery functionality, with Dynamic Source Routing (DSR) as the underlying routing protocol. We present simulation results that demonstrate the effectiveness and efficiency of the proposed framework.
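A minimal Python sketch of Dempster's rule of combination for two observers' basic mass assignments over a two-element frame {trust, distrust} plus their union (uncertainty); the mass values and function name are illustrative, not the authors' code.

    def dempster(m1, m2):
        """Combine two mass functions whose focal sets are frozensets."""
        combined, conflict = {}, 0.0
        for a, wa in m1.items():
            for b, wb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb          # mass falling on the empty set
        k = 1.0 - conflict                       # normalization factor
        return {s: w / k for s, w in combined.items()}

    T, D = frozenset({"trust"}), frozenset({"distrust"})
    U = T | D                                    # total uncertainty
    m1 = {T: 0.6, D: 0.1, U: 0.3}                # observer 1
    m2 = {T: 0.5, D: 0.2, U: 0.3}                # observer 2
    print(dempster(m1, m2))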

.

Keywords- Trust establishment framework; mobile ad hoc network (MANET); evidence distribution; ant colony optimization; Dempster-Shafer theory

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

37. Paper 25061053: Fault Analysis Attacks and Its Countermeasure using Elliptic Curve Cryptography (pp. 260-262)

Full Text: PDF

M. Prabu, Anna University Coimbatore, Tamil Nadu, India

R. Shanmugalakshmi, Government College of Technology, Tamil Nadu, India

.

Abstract- In the last decade, many researchers have published overall analyses of implementation attacks on elliptic curve cryptographic devices. Usually such information is not sufficient to learn about the individual attacks. In this article, we concentrate on the fault analysis attack and its countermeasure.

.

Keywords: Elliptic Curve, Implementation Attacks, Individual attack, Fault analysis attack

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

38. Paper 28061060: A Compressed Video Steganography using Random Embedding Scheme (pp. 263-267)

Full Text: PDF

Sherly A P, TIFAC CORE in Cyber Security, Amrita Vishwa Vidyapeetham, Coimbatore, India

Sapna Sasidharan, TIFAC CORE in Cyber Security, Amrita Vishwa Vidyapeetham, Coimbatore, India

Amritha P P, TIFAC CORE in Cyber Security, Amrita Vishwa Vidyapeetham, Coimbatore, India

.

Abstract − Steganography is the art of hiding the fact that communication is taking place, by concealing information within other information. Many different carrier file formats can be used: images, video, audio, etc. This paper proposes a compressed-video steganographic scheme in which the data hiding operations are executed entirely in the compressed domain. Data are embedded in the macroblocks of the I frame with maximum scene change. To enlarge the capacity of the hidden secret information and to provide a stego-image imperceptible to human vision, a random embedding scheme (Pixel Value Differencing) is used. No decompression is required in this scheme. Experimental results demonstrate that the proposed algorithm has high imperceptibility and capacity.
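A minimal sketch of one pixel-value-differencing embed step, using the common range-table formulation of PVD; the range table and pixel pair are illustrative, and boundary (overflow) handling is omitted.

    # Wider ranges (larger pixel differences) hide more bits.
    RANGES = [(0, 7), (8, 15), (16, 31), (32, 63), (64, 127), (128, 255)]

    def embed_pair(p1, p2, bits):
        d = abs(p2 - p1)
        lo, hi = next(r for r in RANGES if r[0] <= d <= r[1])
        n = (hi - lo + 1).bit_length() - 1        # capacity of this pair in bits
        value = int(bits[:n].ljust(n, "0"), 2)    # next secret chunk as integer
        m = (lo + value) - d                      # required change in difference
        # Spread the change across the two pixels, preserving their order.
        if p2 >= p1:
            p1, p2 = p1 - m // 2, p2 + (m - m // 2)
        else:
            p1, p2 = p1 + (m - m // 2), p2 - m // 2
        return p1, p2, bits[n:]                   # stego pair + remaining bits

    print(embed_pair(120, 135, "1011"))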

.

Keywords- Video Steganography; MPEG-4; PVD

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

39. Paper 29061065: Selective Image Encryption Using DCT with Stream Cipher (pp. 268-274)

Full Text: PDF

Sapna Sasidharan, TIFAC CORE in Cyber Security, Amrita Vishwa Vidyapeetham, Coimbatore, India

Jithin R, TIFAC CORE in Cyber Security, Amrita Vishwa Vidyapeetham, Coimbatore, India

.

Abstract ─ Encryption is used to transmit data securely in open networks. Each type of data has its own features; therefore different techniques should be used to protect confidential image data from unauthorized access. In this paper, selective image encryption using the DCT with a stream cipher is performed. In the DCT method, the basic idea is to decompose the image into 8×8 blocks, which are transformed from the spatial domain to the frequency domain by the DCT. Then the DCT coefficients corresponding to the lower frequencies of each image block are encrypted using the RC4 stream cipher, and the resulting encrypted blocks are shuffled using a shuffling algorithm. Selective encryption is a recent approach in which only parts of the data are encrypted, to reduce the computational requirements for huge volumes of images.
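To make the pipeline concrete, the sketch below computes an 8x8 2-D DCT with an explicit basis matrix and XORs the low-frequency corner with an RC4 keystream; the key, the stand-in block and the 4x4 "low frequency" choice are assumptions, not the paper's parameters.

    import numpy as np

    def dct_matrix(n=8):
        m = np.zeros((n, n))
        for k in range(n):
            for i in range(n):
                m[k, i] = np.cos(np.pi * k * (2 * i + 1) / (2 * n))
        m[0] *= np.sqrt(1 / n)     # orthonormal DCT-II scaling
        m[1:] *= np.sqrt(2 / n)
        return m

    def rc4_keystream(key, length):
        s, j = list(range(256)), 0
        for i in range(256):                      # key scheduling
            j = (j + s[i] + key[i % len(key)]) % 256
            s[i], s[j] = s[j], s[i]
        i = j = 0
        out = []
        for _ in range(length):                   # pseudo-random generation
            i = (i + 1) % 256
            j = (j + s[i]) % 256
            s[i], s[j] = s[j], s[i]
            out.append(s[(s[i] + s[j]) % 256])
        return out

    C = dct_matrix()
    block = np.arange(64, dtype=float).reshape(8, 8)   # stand-in image block
    coeffs = C @ block @ C.T                           # 2-D DCT
    low = coeffs[:4, :4].astype(np.int64) & 0xFF       # quantize corner to bytes
    ks = np.array(rc4_keystream(b"secret-key", 16))
    print((low.flatten() ^ ks).reshape(4, 4))          # encrypted low frequencies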

.

Keywords- DCT; Stream Cipher; Shuffling Algorithm; Selective Encryption

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

40. Paper 29061067: Adaptive Background Estimation and object detection applying in Automated visual surveillance (pp. 275-279)

Full Text: PDF

M. Sankari, Department of Computer Applications, Nehru Institute of Engineering and Technology,

Coimbatore, India.

C. Meena, Head, Computer Centre, Avinashilingam University, Coimbatore, India.

.

Abstract— Automated visual surveillance is currently a hot topic in computer vision research. A common approach is background subtraction, which identifies moving objects as the parts of a video frame sequence that differ significantly from a background model. Fundamentally, the background image must be a representation of the scene with no moving objects and must be kept regularly updated; there are many challenges in developing a good background subtraction algorithm. We propose a methodology for background subtraction of moving vehicles in traffic video sequences that combines statistical assumptions about moving objects with information from previous frames. The background image must be updated frequently to guarantee reliable motion detection. To that end, a binary moving-object hypothesis mask is constructed to classify any group of lattices as belonging to a moving object, based on an optimal threshold, and the new incoming information is then integrated into the current background image using a Kalman filter. To improve performance, post-processing is needed; it is accomplished by shadow and noise removal algorithms operating at the lattice level that identify object-level elements, and its results can be used to detect objects more efficiently. Experimental results and comparisons using real data demonstrate the superiority of the proposed approach, which achieves an average accuracy of 92% on completely novel test images.
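A minimal per-pixel background update in this spirit, with a binary moving-object mask and a motion-dependent gain standing in for the full Kalman formulation; the threshold and gains are illustrative placeholders.

    import numpy as np

    def update_background(bg, frame, thresh=25.0, g_bg=0.1, g_fg=0.01):
        diff = np.abs(frame - bg)
        moving = diff > thresh                    # binary moving-object mask
        gain = np.where(moving, g_fg, g_bg)       # adapt slowly where motion is seen
        return bg + gain * (frame - bg), moving

    rng = np.random.default_rng(2)
    bg = np.full((240, 320), 100.0)
    frame = bg + rng.normal(0, 5, bg.shape)       # stand-in video frame
    frame[100:140, 150:200] += 80                 # a synthetic "vehicle" region
    bg, mask = update_background(bg, frame)
    print(mask.sum(), "pixels flagged as moving")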

.

Keywords- Background subtraction; Background updating; Binary segmentation mask; Kalman filter; Noise removal; Shadow removal; Traffic video sequences.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

41. Paper 29061073: Securing Web Communication with Quantum Cryptography (pp. 280-283)

Full Text: PDF

R.K.Pateriya 1, R.K. Baghel 2, Anupriya Gupta 3

1 Associate Professor, Department of Information Technology

2 Associate Professor, Department of Electronics Engineering

3 M.Tech (Information Security) Scholar, Department of Computer Science & Engineering

Maulana Azad National Institute of Technology, Bhopal, India

.

Abstract — The problem of transmitting secret messages securely between two parties is a very old one. Human imagination has come up with clever ways of overcoming the difficulties associated with this problem, in particular preventing a malevolent eavesdropper from obtaining information about the secret message exchanged over the communication channel. Nowadays internet security is a most important issue, because a large number of people depend on online transactions. In recent years quantum cryptography has been the object of strong activity and is extending into various areas. It is now widely used for communicating secret data between two authenticated parties, and it has great potential to become the key technology for securing the confidentiality and privacy of communication, and thus the driver for the success of a series of web services in the fields of e-governance, e-commerce, e-health, transmission of biometric data, etc. The main problem in quantum cryptography is establishing the initial raw key. This problem is discussed in this paper, and a method is proposed that uses quantum cryptography in an SSL/TLS server for securing web communication.
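For illustration, a minimal BB84 sifting step in Python: Alice's random bits and bases against Bob's random measurement bases, with agreeing positions kept as the raw key. No channel noise, eavesdropping or reconciliation is modelled, and the sifted key would still need privacy amplification before feeding an SSL/TLS handshake.

    import secrets

    n = 32
    alice_bits  = [secrets.randbelow(2) for _ in range(n)]
    alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n)]

    # Bob's measurement matches Alice's bit whenever the bases agree;
    # where they differ the outcome is random, so those positions are discarded.
    raw_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
    print("sifted raw key:", "".join(map(str, raw_key)))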

.

Keywords - BB84 protocol, Random key generation, QKD.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

42. Paper 30061081: A Robust -knowledge guided fusion of clustering Ensembles (pp. 284-290)

Full Text: PDF

Anandhi R J, Research Scholar, Dept. of CSE, Dr MGR University, Chennai, India

Dr Natarajan Subramaniyan, Professor, Dept of ISE, PES Institute of Technology, Bangalore, India

.

Abstract— Discovering interesting, implicit knowledge and general relationships in geographic information databases is very important for understanding and using spatial data, and spatial clustering has been recognized as a primary data mining method for knowledge discovery in spatial databases. In this paper we show that by using a guided approach to combining the outputs of the various clusterers, we can reduce intensive computations and obtain more robust clusters. We discuss our proposed layered cluster-merging technique for spatial datasets and use it in our three-phase clustering combination technique. At the first level, m heterogeneous ensembles are run against the same spatial data set to generate results B1…Bm. The major challenge in fusing ensembles is the generation of the voting or proximity matrix, which is of order n^2, where n is the number of data points; this is very expensive in both time and space for spatial datasets. Instead, our method computes a symmetric clusterer compatibility matrix of order m x m, where m is the number of clusterers and m << n, using the cumulative similarity between the clusters of the clusterers. This matrix identifies which two clusterers, if fused first, provide the most information gain. As we travel down the layered merge, for every layer we calculate a factor called the Degree of Agreement (DOA), based on the agreeing clusterers; using the DOA updated at every layer, the movement of unresolved, unsettled data elements is handled at much reduced computational cost. In addition, we prune the datasets after every (m-1)/2 layers using the knowledge gained in the previous layer, which gives faster convergence than existing cluster aggregation techniques. The correctness and efficiency of the proposed cluster ensemble algorithm are demonstrated on real-world datasets available in the UCI data repository.
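A sketch of the m x m clusterer-compatibility computation, using average best Jaccard overlap as an (assumed) cluster-similarity measure; the toy clusterings below are illustrative, not the paper's datasets.

    import numpy as np

    def jaccard(a, b):
        return len(a & b) / len(a | b)

    def compat(c1, c2):
        # Average, over clusters of c1, of the best Jaccard match in c2.
        return sum(max(jaccard(a, b) for b in c2) for a in c1) / len(c1)

    clusterers = [
        [{0, 1, 2}, {3, 4, 5}],        # result of clusterer B1
        [{0, 1}, {2, 3, 4, 5}],        # result of clusterer B2
        [{0, 1, 2, 3}, {4, 5}],        # result of clusterer B3
    ]
    m = len(clusterers)
    M = np.zeros((m, m))
    for i in range(m):
        for j in range(i + 1, m):      # symmetric m x m matrix, m << n
            M[i, j] = M[j, i] = 0.5 * (compat(clusterers[i], clusterers[j])
                                       + compat(clusterers[j], clusterers[i]))
    i, j = np.unravel_index(M.argmax(), M.shape)
    print("fuse clusterers", i, "and", j, "first")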

.

Keywords- Clustering ensembles, Spatial Data mining, Degree of Agreement, Cluster Compatibility matrix.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

43. Paper 30061083: Fault Diagnosis Algorithm for Analog Electronic Circuits based on Node-Frequency Approach (pp. 291-298)

Full Text: PDF

S.P. Venu Madhava Rao, Department of ECE, KMIT, Hyderabad, India.

Dr. N. Sarat Chandra Babu, & Dr. K. Lal Kishore

.

Abstract: In this paper we present a novel approach to analog electronic circuit fault diagnosis based on the selection of both nodes and frequencies, for the first time as far as we know. Two fault isolation and localization algorithms are presented. The first selects the nodes and frequencies that isolate all, or a desired number of, faults. The second converts the fault dictionary contents into binary form; importantly, this helps in automating the fault diagnosis process.
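As a rough illustration of the binary-dictionary idea, the sketch below thresholds measured node voltages against a nominal value to obtain binary fault signatures; the dictionary contents, nominal value and tolerance are invented for illustration.

    NOMINAL, TOL = 5.0, 0.5

    dictionary = {   # fault -> voltages at the selected (node, frequency) pairs
        "R1_short": [2.1, 5.0, 7.3],
        "C2_open":  [5.1, 3.2, 5.0],
        "nominal":  [5.0, 4.9, 5.2],
    }

    def signature(voltages):
        # 1 where the measurement deviates from nominal beyond tolerance.
        return tuple(int(abs(v - NOMINAL) > TOL) for v in voltages)

    binary_dict = {fault: signature(v) for fault, v in dictionary.items()}
    measured = [2.2, 5.0, 7.1]
    match = min(binary_dict,
                key=lambda f: sum(a != b for a, b in
                                  zip(binary_dict[f], signature(measured))))
    print("diagnosed fault:", match)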

.

Keywords: Fault Dictionary, Fault Isolation Table, Binary dictionary, singletons.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

44. Paper 30061085: Significance of Rapid Solutions Development to Business Process Management (pp. 299-303)

Full Text: PDF

Steve Kruba, Northrop Grumman, 3975 Virginia Mallory Drive, Chantilly VA 20151, USA

.

Abstract—Business process management (BPM) is moving from a niche market into the mainstream. One of the factors leading to this transformation is the emergence of very powerful rapid solutions development tools for creating BPM solutions (BPM RSD). It has been widely recognized that this facility is important for achieving benefits quickly. Similar benefits are attributed to the agile software movement, but BPM RSD differs in that the objective is to reduce the need for custom software development. As the BPM RSD features of some of the current business process management suites (BPMS) products have matured, additional benefits have emerged that fundamentally change the way we approach solutions in this space.

.

Keywords—BPM, Business process management, workflow, agile, rapid applications development, rapid solutions development, RAD, BPM RSD, BPM RAD.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

45. Paper 30061087: A Hybrid Network Interface Card-Based Intrusion Detection System (pp. 304-313)

Full Text: PDF

Samir Elmougy, Faculty of Computers and Information Sciences, Mansoura University, Mansoura 35516, Egypt,

Mohammed Mohsen, Faculty of Computers and Information Sciences, Mansoura University, Mansoura 35516, Egypt

.

Abstract—In recent years, networks have played a vital role in modern society. To prevent data tampering as well as eavesdropping, it is important to ensure that connections are always private and secure. Intrusion Detection Systems (IDSs) are gaining importance among applied technologies and have become an integral part of the security infrastructure of organizations. In this paper, a new hybrid intrusion detection system called HSIDS, which combines heuristic and signature intrusion detection approaches, is proposed and implemented based on reading bytes from the Network Interface Card (NIC). Embedding the capturing module in the protocol stack is another capturing method used in HSIDS. HSIDS has a layered structure, which allows bugs to be detected quickly and easily. Also, its functionality does not depend on any external applications, so its protocol parsing classes are easy to upgrade. The experimental results show that the proposed system is an efficient IDS.
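A minimal Linux sketch of reading frames directly from the NIC with a raw AF_PACKET socket (root privileges required); the heuristic and signature engines that HSIDS layers on top are omitted.

    import socket
    import struct

    ETH_P_ALL = 0x0003   # capture all protocols
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETH_P_ALL))

    frame, _ = s.recvfrom(65535)                       # one raw Ethernet frame
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    print("ethertype 0x%04x, %d bytes" % (ethertype, len(frame)))
    # A signature module would match payload bytes against known patterns;
    # a heuristic module would score header anomalies.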

.

Keywords-Computer security, hybrid intrusion detection system, network interface cards (NIC), heuristic intrusion detection, signature intrusion detection.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

46. Paper 30061090: Scheduling of Workflows in Grid Computing with Probabilistic Tabu Search (pp. 314-319)

Full Text: PDF

R. Joshua Samuel Raj, CSE, VV college of Engineering, Tirunelveli, India

Dr. V. Vasudevan, Prof. & Head/IT, Kalasalingam University, Srivilliputur, India

.

Abstract: In a grid environment the number of resources and tasks to be scheduled is usually variable and dynamic in nature. This characteristic makes scheduling a complex optimization problem. Scheduling is a key issue that must be solved in grid computing research, and a better scheduling scheme can greatly improve efficiency. The objective of this paper is to explore Probabilistic Tabu Search for compute-intensive grid applications, maximizing the job completion ratio and minimizing lateness in job completion, based on a comprehensive understanding of the challenges and the state of the art of current research. Experimental results demonstrate the effectiveness and robustness of the proposed algorithm. Further, a comparative evaluation against other scheduling algorithms such as First Come First Serve (FCFS), Last Come First Serve (LCFS), Earliest Deadline First (EDF) and Tabu Search is plotted.
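A skeleton of a probabilistic tabu search over task-to-node assignments; the makespan objective, rank-based acceptance probability and all parameters are stand-ins for the paper's grid workflow model.

    import random

    def makespan(assign, cost):
        load = {}
        for task, node in enumerate(assign):
            load[node] = load.get(node, 0) + cost[task]
        return max(load.values())

    def pts(n_tasks=20, n_nodes=4, iters=200, tabu_len=15, p_accept=0.7):
        cost = [random.randint(1, 10) for _ in range(n_tasks)]
        cur = [random.randrange(n_nodes) for _ in range(n_tasks)]
        best, tabu = list(cur), []
        for _ in range(iters):
            moves = [(t, m) for t in range(n_tasks) for m in range(n_nodes)
                     if m != cur[t] and (t, m) not in tabu]
            # Rank neighbours best-first, then accept probabilistically by rank
            # instead of always taking the best non-tabu move.
            moves.sort(key=lambda mv: makespan(cur[:mv[0]] + [mv[1]] + cur[mv[0]+1:], cost))
            for mv in moves:
                if random.random() < p_accept:
                    break
            cur[mv[0]] = mv[1]
            tabu.append(mv)
            if len(tabu) > tabu_len:
                tabu.pop(0)                       # expire oldest tabu entry
            if makespan(cur, cost) < makespan(best, cost):
                best = list(cur)
        return makespan(best, cost)

    print("best makespan found:", pts())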

.

Key words: grid computing, workflow, Tabu Search, scheduling problem, Probabilistic Tabu Search

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

47. Paper 30061092: Overclocked Load Scheduling in Large Clustered Reservation Systems (pp. 320-325)

Full Text: PDF

Tania Taami, Islamic Azad University, Science and Research Branch, Tehran, Iran

Amir Masoud Rahmani, Islamic Azad University, Science and Research Branch, Tehran, Iran

Ahmad Khademzade, Islamic Azad University, Science and Research Branch, Tehran, Iran

Ismail Ataie, Jam Petro. Complex, Tehran, Iran

.

Abstract—Advance resource reservation has a great role in maintaining the QoS of requests. Allocating and managing resources for reservation requests, so as to optimize utilization and guarantee quality of service, is a challenging effort. When a reservation request for a resource type fails even though enough free capacity might be available, there is no chance of resolving the conflict; the inflexibility of reservation requests with respect to displacement on the time axis results in rigid resource utilization and even poor system QoS. With the help of new overclocking technologies, applied to some currently scheduled reservation chunks, new chances emerge to beat these restrictions [1]. Using a strict overclocking scheme with traditional processors for limited times in a cluster of servers, simulation results show that the QoS of reservations can be improved. This is achieved through better resource utilization and an increase in accepted reservations, without side effects on the processing or the reliability of computations.

.

Keywords- scheduling; overclocking; thermal behaviour; advance reservation; cluster; QoS;

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

48. Paper 30061095: Skew Correction and Noise Reduction for Automatic Gridding of Microarray Images (pp. 326-334)

Full Text: PDF

Manjunath S S, Assistant Professor, Dept of Computer Science, Dayananda Sagar College of Engineering, Bangalore, India

Dr. Lalitha Rangarajan, Reader, Dept of Studies in Computer Science, University of Mysore, India

.

Abstract- Complementary DNA (cDNA) microarrays are a powerful high-throughput technology developed in the last decade, allowing researchers to analyze the behavior and interaction of thousands of genes simultaneously. The large amount of information provided by microarray images requires automatic techniques for efficient processing in order to arrive at accurate biological conclusions. Most of the methods discussed in the literature need different levels of human intervention, which inevitably reduces the efficiency and reproducibility of the entire automation process. In this paper a novel approach for automatic gridding of skewed and noisy microarray images is presented. The microarray image is skew corrected, noise is removed using adaptive thresholds computed on various segments, the spatial topology of spots is detected, gridding is performed, and finally the grids are refined. Experiments conducted on selected microarray images (skewed and noisy) from the Stanford and UNC databases are encouraging.
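For illustration, a projection-profile skew estimator of the kind commonly used before gridding: rotate the image over a small angle range and keep the angle whose row projection is most peaky. The synthetic "spot rows", angle range and variance criterion are assumptions, not the authors' method.

    import numpy as np
    from scipy.ndimage import rotate

    def estimate_skew(img, angles=np.arange(-5, 5.25, 0.25)):
        best_angle, best_score = 0.0, -1.0
        for a in angles:
            r = rotate(img, a, reshape=False, order=1)
            score = r.sum(axis=1).var()       # sharp row projection => aligned grid
            if score > best_score:
                best_angle, best_score = a, score
        return best_angle

    rng = np.random.default_rng(3)
    img = rng.random((128, 128))
    img[::8, :] += 2.0                        # stand-in rows of spots
    skewed = rotate(img, 2.0, reshape=False, order=1)
    angle = estimate_skew(skewed)
    deskewed = rotate(skewed, -angle, reshape=False, order=1)  # image passed on to gridding
    print("estimated skew: %.2f degrees" % angle)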

.

Keywords: Microarray, Gridding, Adaptive threshold, Spatial topology, Grid refinement, Skewed images, Noisy images.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

49. Paper 30061098: LDCP+: An Optimal Algorithm for Static Task Scheduling in Grid Systems (pp. 335-340)

Full Text: PDF

Negin Rzavi, Islamic Azad University, Science and Research Branch, Tehran, Iran

Safieh Siadat, Islamic Azad University, Science and Research Branch, Tehran, Iran

Amir Masoud Rahmani, Islamic Azad University, Science and Research Branch, Tehran, Iran

.

Abstract— After a computational job is designed and realized as a set of tasks, an optimal assignment of these tasks to the processing elements in a given architecture needs to be determined. In a grid system with heterogeneous processing elements and data transfer times between them, determining an assignment of tasks to processing elements that optimizes performance and efficiency is very important. In this paper a heuristic algorithm named LDCP+ is presented, which optimizes the Longest Dynamic Critical Path (LDCP) algorithm presented by Mohammad I. Daoud and Nawwaf Kharma in 2007. It is a list-based algorithm in that it assigns each task a priority for its execution. Task duplication, use of idle processing-element time, and an improved priority assignment method are the basic features of LDCP+. Since the LDCP algorithm assumes that the computation costs of tasks are monotonic, the algorithm presented in this paper frees the scheduler from this restriction; in the case of non-monotonic computation costs, LDCP+ achieves the minimum total finish time in comparison with other scheduling algorithms such as HEFT and CPOP.
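As a small illustration of list-based critical-path priorities, the sketch below computes each task's rank as its cost plus the largest successor rank on a toy DAG (communication costs omitted); the graph and costs are invented for illustration.

    from functools import lru_cache

    succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}   # toy task DAG
    cost = {"A": 4, "B": 3, "C": 5, "D": 2}                     # computation costs

    @lru_cache(maxsize=None)
    def rank(task):
        # Length of the longest path from this task to an exit task.
        return cost[task] + max((rank(s) for s in succ[task]), default=0)

    order = sorted(succ, key=rank, reverse=True)
    print([(t, rank(t)) for t in order])   # A first: it heads the critical path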

.

Keywords- Grid; Static task scheduling; Longest Dynamic Critical Path.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

50. Paper 15061017: Density Distribution and Sector Mean with Zero-cal and Highest-sal Components in Walsh Transform Sectors as Feature Vectors for Image Retrieval (pp. 341-349)

Full Text: PDF

H. B. Kekre, Sr. Professor, MPSTME, SVKM’s NMIMS (Deemed-to be-University) Vile Parle West, Mumbai -56, India

Dhirendra Mishra, Assistant Professor & PhD Research Scholar, MPSTME, SVKM’s NMIMS (Deemed-to be-University), Vile Parle West, Mumbai -56, India

.

Abstract- We introduce a novel idea of considering the complex Walsh transform for sectorization of the transformed components. In this process the first coefficient (zero-cal) and the last coefficient (highest-sal) are normally not used. In this paper we propose two different approaches to feature vector generation, sector density and sector mean, along with the extra zero-cal and highest-sal components. Two similarity measures, the sum of absolute differences and the Euclidean distance, are used and the results compared, and the crossover-point performance of the overall average precision and recall of both approaches on different sector sizes is compared. The density distribution of real (cal) and imaginary (sal) values and the sector means of Walsh sectors in all three color planes are used to design the feature vector. The proposed algorithm is evaluated on a database of 1055 images spread over 12 different classes. Overall average precision and recall are calculated for the performance evaluation and comparison of 4, 8, 12 and 16 Walsh sectors. The sum of absolute differences as a similarity measure always gives lower computational complexity, and the density distribution approach with this measure has the best retrieval performance.
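For reference, a minimal fast Walsh-Hadamard transform in Python/NumPy, as would precede the sectorization of cal/sal components; the sector assignment itself is omitted and the input row is a stand-in. Input length must be a power of two.

    import numpy as np

    def fwht(x):
        a = np.array(x, dtype=float)
        h = 1
        while h < len(a):
            for i in range(0, len(a), h * 2):
                for j in range(i, i + h):
                    a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]  # butterfly
            h *= 2
        return a

    row = [1, 0, 1, 0, 0, 1, 1, 0]          # stand-in image row
    print(fwht(row))
    # Pairs of (cal, sal) coefficients would next be treated as complex values
    # and binned into 4, 8, 12 or 16 angular sectors.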

.

Keywords-CBIR, Walsh Transform, Euclidean Distance, Absolute Difference, Precision, Recall

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

51. Paper 08061004: Comparison Of Neural Network And Multivariate Discriminant Analysis In Selecting New Cowpea Variety (pp. 350-358)

Full Text: PDF

.

Adewole, Adetunji Philip, Department of Computer Science, University of Agriculture, Abeokuta

Sofoluwe, A. B., Department of Computer Science, University of Lagos, Akoka

Agwuegbo, Samuel Obi-Nnamdi, Department of Statistics, University of Agriculture, Abeokuta

.

Abstract - In this study, a neural network (NN) algorithm and a multivariate discriminant analysis (MDA) based model were developed to classify ten (10) varieties of cowpea widely planted in Kano. To demonstrate the validity of our models, we use the case study to build a neural network model using a multilayer feedforward neural network and compare its classification performance against multivariate discriminant analysis. Two groups of data (Spray and No-spray) were used, with twenty kernels serving as the training and test data sets for classifying cowpea seed varieties. The neural network classified the new cowpea seed varieties based on the information it was trained with. Finally, both methods were compared for their strengths and weaknesses. NN performed better than MDA, so NN could be considered a support tool in the process of selecting new cowpea varieties.
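A minimal multilayer feed-forward network trained by gradient descent on toy two-class data, standing in for the cowpea classifier; the features, labels and hyperparameters are invented for illustration, and a real run would use the kernel measurements as inputs.

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.random((20, 3))                       # 20 kernels, 3 features each
    y = (X[:, 0] + X[:, 1] > 1.0).astype(float)   # toy spray/no-spray labels

    W1, b1 = rng.normal(0, 1, (3, 5)), np.zeros(5)
    W2, b2 = rng.normal(0, 1, (5, 1)), np.zeros(1)
    sigmoid = lambda z: 1 / (1 + np.exp(-z))

    for _ in range(2000):
        h = sigmoid(X @ W1 + b1)                  # forward pass, hidden layer
        out = sigmoid(h @ W2 + b2)[:, 0]          # forward pass, output layer
        grad_out = (out - y)[:, None] / len(y)    # cross-entropy gradient at output
        grad_h = grad_out @ W2.T * h * (1 - h)    # backpropagate through hidden layer
        W2 -= 0.5 * h.T @ grad_out
        b2 -= 0.5 * grad_out.sum(axis=0)
        W1 -= 0.5 * X.T @ grad_h
        b1 -= 0.5 * grad_h.sum(axis=0)

    print("training accuracy:", ((out > 0.5) == y).mean())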

.

Keywords: Cowpea, Multivariate Discriminant Analysis (MDA), Neural Network (NN), Perceptron

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------