International Journal of Computer Science and Information Security
Vol. 7 No. 3, March 2010 (Download Full Journal)
Copyright © 2010 IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
1. Paper 28021082: Design and Implementation of an Intelligent Educational Model Based on Personality and Learner’s Emotion (pp. 1-13)
Full Text: PDF


Somayeh Fatahi, Department of Computer Engineering, Kermanshah University of Technology, Kermanshah, Iran
Nasser Ghasem-Aghaee, Department of Computer Engineering, Isfahan University, Isfahan, Iran


Abstract—Personality and emotions are influential parameters in the learning process, and virtual learning environments should therefore take them into account. In this paper, a new e-learning model is designed and implemented around these parameters. The virtual learning environment presented here uses two agents: a Virtual Tutor Agent (VTA) and a Virtual Classmate Agent (VCA). During the learning process, the learner's emotions change in response to events in the environment; in this situation, the learning style should be revised according to the learner's personality traits as well as his or her current emotions. The VTA selects a suitable learning style for each learner based on personality traits. To improve the learning process, the system employs the VCA in some of the learning steps. The VCA is an intelligent agent with its own personality, designed to present an attractive and realistic learning environment in its interaction with the learner. The system uses the MBTI test to recognize the learner's personality and the OCC model to obtain emotion values. Finally, tests of the system in real environments show that considering human features in interaction with the learner increases learning quality and learner satisfaction.

Keywords- Emotion; Learning Style; MBTI Indicator; Personality; Virtual Classmate Agent (VCA); Virtual Tutor Agent (VTA); Virtual learning.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2. Paper 22021039: Signature Recognition Using Multi Scale Fourier Descriptor and Wavelet Transform (pp. 14-19)
Ismail A. Ismail, Professor, Dean, College of Computers and Informatics, Misr International University, Egypt
Mohammed A. Ramadan, Professor, Department of Mathematics, Faculty of Science, Menofia University, Egypt
Talaat S. El Danaf, Lecturer, Department of Mathematics, Faculty of Science, Menofia University, Egypt
Ahmed H. Samak, Assistant Lecturer, Department of Mathematics, Faculty of Science, Menofia University, Egypt


Abstract—This paper presents a novel off-line signature recognition method based on multiscale Fourier descriptors and the wavelet transform. The main steps of constructing a signature recognition system are discussed, and experiments on real data sets show that the average error rate can reach 1%. Finally, we compare eight distance measures between feature vectors with respect to recognition performance.

Key words: signature recognition; Fourier Descriptor; Wavelet transform; personal verification
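Two ingredients named in the abstract, Fourier descriptors of a signature contour and a distance measure between feature vectors, are easy to illustrate. The following is a minimal sketch, not the authors' implementation: the contour points, descriptor count and normalisation are illustrative choices.

```python
import cmath

def fourier_descriptors(contour, k=8):
    """First k Fourier descriptor magnitudes of a closed contour given as
    (x, y) points. The DC term is dropped (translation invariance) and the
    rest are normalised by the first harmonic (scale invariance)."""
    n = len(contour)
    pts = [complex(x, y) for x, y in contour]
    coeffs = []
    for u in range(k):
        s = sum(pts[i] * cmath.exp(-2j * cmath.pi * u * i / n) for i in range(n))
        coeffs.append(abs(s) / n)
    norm = coeffs[1] if coeffs[1] else 1.0
    return [c / norm for c in coeffs[1:]]

def euclidean(a, b):
    """One of many possible distance measures between feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
```

Because the DC term is dropped and the remaining magnitudes are normalised, two differently sized copies of the same signature yield (near-)identical descriptors and a near-zero distance.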
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
3. Paper 31011077: Feature-Based Adaptive Tolerance Tree (FATT): An Efficient Indexing Technique for Content-Based Image Retrieval Using Wavelet Transform (pp. 20-29)
Dr.P.AnandhaKumar, Department of Information Technology, Madras Institute of Technology, Anna University Chennai, Chennai India
V. Balamurugan, Research Scholar, Department of Information Technology, Madras Institute of Technology Anna University Chennai, Chennai, India
 

Abstract—This paper introduces a novel indexing and access method called the Feature-Based Adaptive Tolerance Tree (FATT), which uses the wavelet transform to organize large image data sets efficiently and to support popular image access mechanisms such as Content-Based Image Retrieval (CBIR). Conventional database systems are designed for managing textual and numerical data, and retrieval in such systems is often based on simple comparisons of text or numerical values. This approach is no longer adequate for images, since the digital representation of an image does not convey its visual content, and retrieval becomes difficult when the database is very large. This paper addresses these problems and presents FATT, an indexing technique designed to provide an effective solution especially for indexing large databases. The proposed indexing scheme is used together with query-by-image-content in order to achieve the ultimate goal from the user's point of view: retrieval of all relevant images. In the FATT indexing technique, image features are extracted using the 2-dimensional discrete wavelet transform (2D-DWT), and an index code is generated from the determinant value of the features. Multiresolution analysis using the 2D-DWT decomposes the image into components at different scales, so that the coarsest-scale components carry global approximation information while the finer-scale components contain the detailed information. Experimental results show that FATT outperforms the M-tree by up to 200%, the Slim-tree by up to 120% and HCT by up to 89%. The FATT indexing technique thus increases the efficiency of data storage and retrieval.

Index Terms— CBIR, FATT, indexing, wavelet transform.
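The 2D-DWT decomposition mentioned in the abstract can be sketched with the simplest wavelet, the Haar filter. This is a generic one-level decomposition using the un-normalised averaging/differencing variant, not the paper's FATT pipeline:

```python
def haar_dwt_2d(img):
    """One level of a 2D Haar wavelet transform (averaging/differencing
    variant). Returns the approximation (LL) and detail (LH, HL, HH)
    sub-bands of an image with even dimensions, each half the size."""
    def rows_pass(m):
        lo, hi = [], []
        for row in m:
            lo.append([(row[2 * i] + row[2 * i + 1]) / 2 for i in range(len(row) // 2)])
            hi.append([(row[2 * i] - row[2 * i + 1]) / 2 for i in range(len(row) // 2)])
        return lo, hi
    def cols_pass(m):
        t = list(map(list, zip(*m)))        # transpose, filter, transpose back
        lo, hi = rows_pass(t)
        return list(map(list, zip(*lo))), list(map(list, zip(*hi)))
    L, H = rows_pass(img)
    LL, LH = cols_pass(L)
    HL, HH = cols_pass(H)
    return LL, LH, HL, HH
```

The LL band is the coarse approximation from which further decomposition levels are computed; LH, HL and HH hold horizontal, vertical and diagonal detail.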
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
4. Paper 28021084: Ontology-supported processing of clinical text using medical knowledge integration for multi-label classification of diagnosis coding (pp. 30-35)
Phanu Waraporn 1,4,*, Phayung Meesad 2, Gareth Clayton 3
1 Department of Information Technology, Faculty of Information Technology
2 Department of Teacher Training in Electrical Engineering, Faculty of Technical Education
3 Department of Applied Statistics, Faculty of Applied Science, King Mongkut’s University of Technology North Bangkok
4 Division of Business Computing, Faculty of Management Science, Suan Sunandha Rajabhat University, Bangkok, Thailand


Abstract—This paper discusses the integration of clinical knowledge extracted from distributed medical ontologies in order to improve a machine-learning-based multi-label coding assignment system. The proposed approach is implemented using a decision-tree-based cascade hierarchical technique on university hospital data for patients with Coronary Heart Disease (CHD). The preliminary results obtained are satisfactory.

Keywords— medical ontology, diagnosis coding, knowledge integration, machine learning, decision tree.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
5. Paper 23021042: Botnet Detection by Monitoring Similar Communication Patterns (pp. 36-45)
Hossein Rouhani Zeidanloo, Faculty of Computer Science and Information Systems, University of Technology Malaysia, 54100 Kuala Lumpur, Malaysia
Azizah Bt Abdul Manaf, College of Science and Technology, University of Technology Malaysia, 54100 Kuala Lumpur, Malaysia


Abstract—Botnets are among the most widespread and common vehicles of today's cyber attacks, posing serious threats to network assets and organizations' property. A Botnet is a collection of compromised computers (Bots) remotely controlled by its originator (BotMaster) under a common Command-and-Control (C&C) infrastructure, and it is used to distribute commands to the Bots for malicious activities such as distributed denial-of-service (DDoS) attacks, spam and phishing. Most existing Botnet detection approaches concentrate on particular C&C protocols (e.g., IRC, HTTP) and structures (e.g., centralized), and can become ineffective as Botnets change their structure and C&C techniques. In this paper we first provide a taxonomy of Botnet C&C channels and evaluate the well-known protocols used in each. We then propose a new general detection framework, currently focused on P2P-based and IRC-based Botnets, which is grounded in the definition of a Botnet as a group of bots that perform similar communication and malicious activity patterns within the same Botnet. What distinguishes our detection framework from many similar works is that it requires no prior knowledge of Botnets, such as Botnet signatures.

Keywords- Botnet; Bot; centralized; decentralized; P2P; similar behavior
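The framework's core idea, grouping hosts that exhibit similar communication patterns, can be sketched as a simple similarity grouping over per-host traffic features. The feature choice and threshold below are illustrative assumptions, not the paper's algorithm:

```python
def group_similar_hosts(features, threshold=0.1):
    """Greedy grouping of hosts whose normalised traffic-feature vectors
    (e.g. mean packet size, flow rate, destination entropy) lie within
    `threshold` of a group seed. Hosts that land in a group of size > 1
    share a similar communication pattern and are candidate bot members."""
    def dist(a, b):
        return max(abs(x - y) for x, y in zip(a, b))
    groups = []  # list of (seed_vector, [host names])
    for host, vec in features.items():
        for seed, members in groups:
            if dist(seed, vec) <= threshold:
                members.append(host)
                break
        else:
            groups.append((vec, [host]))
    return [members for _, members in groups]
```

A real detector would correlate such groups with malicious-activity evidence before flagging them, as the abstract's Botnet definition requires both similar communication and similar malicious activity.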
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
6. Paper 04021004: JawaTEX: A System for Typesetting Javanese (pp. 46-52)
Full Text: PDF


Ema Utami 1, Jazi Eko Istiyanto 2, Sri Hartati 3, Marsono 4, and Ahmad Ashari 5
1 Information Systems Major, STMIK AMIKOM Yogyakarta, Ring Road Utara St., Condong Catur, Depok, Sleman, Yogyakarta, Tel. (0274) 884201-884206, Fax. (0274) 884208; Doctoral Candidate in Computer Science, Postgraduate School, Gadjah Mada University
2,3,5 Doctoral Program in Computer Science, Graha Student Internet Center (SIC) 3rd floor, Faculty of Mathematics and Natural Sciences, Gadjah Mada University, Sekip Utara, Bulaksumur, Yogyakarta 55281, Tel/Fax: (0274) 522443
4 Sastra Nusantara Major, Faculty of Cultural Sciences, Gadjah Mada University, Humaniora St. No. 1, Bulaksumur, Yogyakarta, Fax. (0274) 550451


…This research is the first phase of work on the problems of Latin-to-Javanese text document transliteration. It is therefore necessary to build a transliteration model, named JawaTEX, that transliterates Latin text documents into Javanese characters. The parser used in this research is a context-free recursive-descent parser. The Latin text document is processed into a list of Latin string split patterns using a rule-based method, while each split pattern is matched to its LATEX mapping form using pattern matching. The rule-based method solves problems left open by previous research that used other methods. The transliteration model is supported by production rules for finding Latin string split patterns, models of the split patterns, production rules for Latin-to-Javanese character mapping, models of syntax coding patterns, a LATEX style (macro package), and a Javanese-character Metafont. A spelling checker that corrects letter-typing mistakes using the Brute Force algorithm is also provided within the system. The testing results show that, provided the user writes every word correctly (including loanwords spelled according to their original pronunciation) and follows the Latin spelling of the source text, the transliteration model can be used to transliterate Latin text documents into Javanese character writing. The concepts of text document splitting and transliteration established in this article can serve as a basis for developing other cases. Future research still needs to address well-formed splitting of Javanese character writing: Javanese text sometimes cannot be given justified alignment, since the script does not use spaces between words.

Key words: transliteration, Javanese characters, typesetting, Context Free Recursive Descent Parser, Pattern Matching, rule based, Brute Force, LATEX, JawaTEX
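The pattern-matching step, splitting a Latin string against a table of split patterns, can be sketched as a greedy longest-match scan. The pattern table below is a hypothetical stand-in for the JawaTEX rules, which are not reproduced here:

```python
def split_longest_match(word, patterns):
    """Greedy longest-match split of a Latin word against a pattern table
    (a toy stand-in for the JawaTEX split-pattern rules). Returns the
    list of matched pieces; unknown characters pass through unchanged."""
    pieces, i = [], 0
    longest = max(map(len, patterns))
    while i < len(word):
        for length in range(min(len(word) - i, longest), 0, -1):
            if word[i:i + length] in patterns:
                pieces.append(word[i:i + length])
                i += length
                break
        else:
            pieces.append(word[i])  # no rule matched: emit the character
            i += 1
    return pieces
```

Each emitted piece would then be mapped to its LATEX macro form; the real system derives its pattern table from production rules rather than a flat set.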
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
7. Paper 04110902: Effective Query Retrieval System In Mobile Business Environment (pp. 53-57)
Full Text: PDF

R.Sivaraman, Dy.Director, Center for Convergence of Technologies (CCT), Anna University Tiruchirappalli, Tiruchirappalli, Tamil Nadu, India
RM. Chandrasekaran, Registrar, Anna University Tiruchirappalli, Tiruchirappalli, Tamil Nadu, India


Abstract—The Web Based Query Management System (WBQMS) is a methodology for designing and implementing Mobile Business, in which a server acts as the gateway connecting databases with clients, receiving requests and sending responses in a distributed manner. The gateway, which communicates with mobile phones via a GSM modem, receives coded queries from users and sends packed results back. The client software, which communicates with the gateway via SMS, packs users' requests, IDs and codes, sends the package to the gateway, and then interprets the packed response for the user to read in a GUI. Wherever they are, customers can query information by sending messages from a client device, which may be a mobile phone or a PC, and obtain the appropriate services through the mobile business architecture in a distributed environment. Messages are secured through a client-side encoding mechanism to thwart intruders. The gateway system is programmed in Java, the client software in J2ME, and the database is built on Oracle for reliable and interoperable services.

Key words: Query, J2ME, Reliability, Database and Midlet
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
8. Paper 10021011: Predictive Gain Estimation – A mathematical analysis (pp. 58-61)
Full Text: PDF

P. Chakrabarti, Sir Padampat Singhania University, Udaipur, Rajasthan, India
Abstract—Gain analysis is essential to realizing a successful business. In this paper we present new techniques for gain expectation based on the neural properties of the perceptron. Artificial-intelligence-oriented practices based on support rules and sequence mining have also been applied in this context. In view of the above, fuzzy- and statistics-based gain sensing is also discussed.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

10. Paper 12021016: Analysis of Empirical Software Effort Estimation Models (pp. 68-77)
Full Text: PDF

Saleem Basha, Department of Computer Science, Pondicherry University, Puducherry, India
Dhavachelvan Ponnurangam, Department of Computer Science, Pondicherry University, Puducherry, India

Abstract—Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering: it is the preliminary phase between the client and the business enterprise, the relationship between them begins with it, and the client's confidence in the enterprise grows with its accuracy. Effort estimation often requires generalizing from a small number of historical projects, and generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is also a complex process because it is, in essence, a prediction of software effort, and, as the term indicates, a prediction never becomes an actual value. This work reviews the basics of empirical software effort estimation models. The primary conclusion is that no single technique is best for all situations, and that a careful comparison of the results of several approaches is most likely to produce realistic estimates.

Keywords- Software Estimation Models, Conte’s Criteria, Wilcoxon Signed-Rank Test.
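As a concrete example of the empirical models such surveys cover, Basic COCOMO estimates effort as E = a · KLOC^b in person-months. The sketch below uses Boehm's classic constants for the three project modes; the paper compares many such models rather than prescribing this one, and real estimation calibrates the constants locally:

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO effort estimate E = a * KLOC**b (person-months),
    using Boehm's published (a, b) constants for each project mode."""
    params = {
        "organic":      (2.4, 1.05),
        "semidetached": (3.0, 1.12),
        "embedded":     (3.6, 1.20),
    }
    a, b = params[mode]
    return a * kloc ** b
```

The superlinear exponent b captures the diseconomy of scale: a 10 KLOC embedded project is estimated to cost far more than ten 1 KLOC projects.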
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
11. Paper 12021018: A Survey on Preprocessing Methods for Web Usage Data (pp. 78-83)
V. Chitraa, Lecturer, CMS College of Science and Commerce, Coimbatore, Tamilnadu, India
Dr. Antony Selvdoss Davamani, Reader in Computer Science, NGM College (Autonomous), Pollachi, Coimbatore, Tamilnadu, India


Abstract—The World Wide Web is a huge repository of web pages and links that provides an abundance of information for Internet users. The growth of the web is tremendous, with approximately one million pages added daily. Users' accesses are recorded in web logs; because of the tremendous usage of the web, log files grow at a fast rate and reach huge sizes. Web data mining is the application of data mining techniques to web data. Web usage mining applies mining techniques to log data to extract user behaviour, which is used in applications such as personalized services, adaptive web sites, customer profiling, prefetching and creating attractive web sites. Web usage mining consists of three phases: preprocessing, pattern discovery and pattern analysis. Web log data is usually noisy and ambiguous, so preprocessing is an important step before mining, and sessions must be constructed efficiently before patterns can be discovered. This paper reviews existing work on the preprocessing stage, gives a brief overview of data mining techniques for pattern discovery and pattern analysis, and presents a glimpse of the various applications of web usage mining.

Keywords- Data Cleaning, Path Completion, Session Identification, User Identification, Web Log Mining
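The session-identification step discussed in the survey is commonly implemented with an inactivity timeout. Here is a minimal sketch of that heuristic; the 30-minute value is the conventional default, not something fixed by the paper:

```python
def sessionize(requests, timeout=1800):
    """Split one user's ordered request timestamps (in seconds) into
    sessions: a gap longer than `timeout` starts a new session."""
    sessions = []
    for t in requests:
        if sessions and t - sessions[-1][-1] <= timeout:
            sessions[-1].append(t)   # continue the current session
        else:
            sessions.append([t])     # inactivity gap: start a new one
    return sessions
```

Real preprocessing pipelines combine this with user identification (IP, user agent, cookies) and path completion to repair requests served from caches.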
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
12. Paper 16021023: Seamless Data Services for Real Time Communication in a Heterogeneous Networks using Network Tracking and Management (pp. 84-91)
Adiline Macriga. T, Research Scholar, Department of Information & Communication, MIT Campus, Anna University Chennai, Chennai – 600025.
Dr. P. Anandha Kumar, Asst. Professor, Department of Information Technology, MIT Campus, Anna University Chennai, Chennai – 600025.


Abstract—A heterogeneous network is the integration of all existing networks under a single environment, with an understanding between their functional operations, the ability to make use of multiple broadband transport technologies, and support for generalized mobility. Integrating several IP-based access technologies in a seamless way is a challenging feature of heterogeneous networks. The focus of this paper is on the requirements of a mobility management scheme for multimedia real-time communication services, specifically mobile video conferencing. The range of available wireless access technologies today includes cellular or wide-area wireless systems, such as cellular networks (GSM/GPRS/UMTS) or WiMAX, and local-area or personal-area wireless systems, comprising for example WLAN (802.11 a/b/g) and Bluetooth. For mobile video conferencing, the more advanced mobile terminals are capable of keeping more than one interface active at the same time. In addition, the heterogeneity of access technologies and the demand for seamless information flow will increase in the future, making seamless integration of access networks a key challenge for mobility management in a heterogeneous network environment. Services must be provided to the user regardless of the particular access technology, the type of service provider, or the network used.

Keywords: Location Tracking, Location Management, Mobility Management, Heterogeneous Networks, Seamless services.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
13. Paper 17021026: Effect of Weighting Scheme to QoS Properties in Web Service Discovery (pp. 92-100)
Agushaka J. O., Lawal M. M., Bagiwa A. M. and Abdullahi B. F.
Mathematics Department, Ahmadu Bello University Zaria-Nigeria


Abstract—Specifying QoS properties can exclude good web services that the user would otherwise have considered, because typical selection algorithms strictly require a match between the consumer's QoS properties and those of the available services. A service may lack some of the properties the user specifies yet be rated highly on those it has. With trade-offs specified in the form of weights, such services can still be made available to the user for consideration. This reflects the fact that the user's requirements for the specified QoS properties are of varying importance: the property preferred most receives the highest weight. If a consumer assigns a light weight to the QoS properties a web service is deficient in, and a high weight to those it has, the distance between them is minimized, and hence the service can be returned.

Key Words: QoS properties, QoS weighting vector, Distance Measure
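The weighting idea can be made concrete as a weighted distance between the consumer's required QoS vector and a service's offered vector. This is an illustrative formulation, not necessarily the exact distance measure used in the paper:

```python
def weighted_qos_distance(required, offered, weights):
    """Weighted Euclidean distance between a consumer's required QoS
    vector and a service's offered vector. A small weight on a property
    the service lacks shrinks that property's contribution, so the
    service can still come back as a candidate match."""
    return sum(w * (r - o) ** 2
               for w, r, o in zip(weights, required, offered)) ** 0.5
```

With weights (0.9, 0.1), a service deficient only in the lightly weighted property scores a much smaller distance than one deficient in the heavily weighted property, which is exactly the trade-off behaviour the abstract describes.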
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
14. Paper 17021030: Fuzzy Logic of Speed and Steering Control System for Three Dimensional Line Following of an Autonomous Vehicle (pp. 101-108)
Dr. Shailja Shukla, Department of Electrical Engineering, J.E.C. Jabalpur
Mr. Mukesh Tiwari, Department of Electrical Engineering, J.E.C. Jabalpur

Abstract—A major problem in robotics research today is the huge barrier to entry caused by system software complexity and the need for a researcher to learn the details, dependencies and intricacies of a complete system, since a robot system needs several different modules that communicate and execute in parallel. There are currently few controlled comparisons of algorithms and solutions for a given task, which is the standard scientific method of other sciences, and there is very little sharing between groups and projects, requiring code to be written from scratch over and over again. This paper describes exploratory research on the design of a modular autonomous mobile robot controller. The controller incorporates a fuzzy logic approach [8][9] for steering and speed control [37], a fuzzy logic approach for ultrasound sensing, and an overall expert system for guidance. The advantages of a modular system are portability and transportability: any vehicle can become autonomous with minimal modifications. A mobile robot test bed has been constructed at the University of Cincinnati using a golf cart base. The cart has full speed control, with guidance provided by a vision system and obstacle avoidance using ultrasonic sensors. The speed and steering fuzzy logic controller is supervised through a multi-axis motion controller. The obstacle avoidance system is based on a microcontroller interfaced with ultrasonic transducers; this microcontroller independently handles all timing and distance calculations and sends distance information back to the fuzzy logic controller via the serial line. This design yields a portable, independent system in which high-speed computer communication is not necessary.
Vision guidance is accomplished with CCD cameras that judge the current position of the robot [34][35][36], generating a good image to reduce erroneous commands in ground coordinates, to tackle the parameter uncertainties of the system, and to obtain a good WMR dynamic response [1]. Here we apply a 3D line-following methodology: it transforms from 3D to 2D and maps the image coordinates and vice versa, leading to improved accuracy of the WMR position. The fuzzy logic controller can give a good command signal even though a highly accurate plant model accounting for unknown factors such as friction and a dynamic environment is difficult to obtain. This design, in its modularity, creates a portable autonomous fuzzy logic controller applicable to any mobile vehicle with only minor adaptations.
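The steering part of such a fuzzy controller can be sketched with triangular membership functions and weighted-average defuzzification. The rule base and membership ranges below are illustrative, not the controller implemented on the Cincinnati test bed:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steering(error):
    """Three-rule Mamdani-style line-following controller:
      IF offset is LEFT   THEN steer RIGHT (+1)
      IF offset is CENTRE THEN steer STRAIGHT (0)
      IF offset is RIGHT  THEN steer LEFT  (-1)
    defuzzified by a weighted average of the rule outputs."""
    left   = tri(error, -2.0, -1.0, 0.0)   # vehicle is left of the line
    centre = tri(error, -1.0,  0.0, 1.0)
    right  = tri(error,  0.0,  1.0, 2.0)   # vehicle is right of the line
    num = left * 1.0 + centre * 0.0 + right * -1.0
    den = left + centre + right
    return num / den if den else 0.0
```

Overlapping memberships make the output vary smoothly between the rule peaks, which is the practical appeal of fuzzy control for steering without an accurate plant model.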
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
15. Paper 19011013: A reversible high embedding capacity data hiding technique for hiding secret data in images (pp. 109-115)
Full Text: PDF

Mr. P. Mohan Kumar, Asst. Professor, CSE Department, Jeppiaar Engineering College, Chennai, India.
Dr. K. L. Shunmuganathan, Professor and Head, CSE Department, R.M.K. Engineering College, Chennai, India.

Abstract—As multimedia and internet technologies grow fast, the transmission of digital media plays an important role in communication. Various digital media, such as audio, video and images, are transferred over the internet, where they face many threats, and a number of security techniques have been employed to protect them. This paper proposes a new technique for sending secret messages securely using steganography. The proposed system uses multiple levels of security for data hiding: the data is hidden in an image file, and the resulting stego file is again concealed in another image. Beforehand, the secret message is encrypted with an encryption algorithm, which ensures highly secure data transfer over the internet.

Keywords – steganography, watermarking, stego image, payload
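The basic hide-in-an-image step can be illustrated with least-significant-bit (LSB) embedding, the simplest spatial-domain technique. This sketch operates on a flat pixel list and is an illustration only; the paper's multi-level scheme additionally encrypts the message and re-embeds the stego file in a second image:

```python
def embed_lsb(pixels, message_bits):
    """Hide a bit string in the least-significant bits of a pixel
    sequence; each cover pixel changes by at most 1, which is
    visually imperceptible. Returns the stego pixel list."""
    stego = list(pixels)
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & ~1) | int(bit)
    return stego

def extract_lsb(pixels, n_bits):
    """Recover n_bits from the LSBs of the stego pixels."""
    return "".join(str(p & 1) for p in pixels[:n_bits])
```

The payload capacity of plain LSB is one bit per pixel (per channel); multi-level schemes trade some of that capacity for the extra layer of concealment.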
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
16. Paper 19011014: Mining The Data From Distributed Database Using An Improved Mining Algorithm (pp. 116-121)
Full Text: PDF

J. Arokia Renjit, Asst. Professor, CSE Department, Jeppiaar Engineering College, Chennai, Tamil Nadu, India – 600119.
Dr. K. L. Shunmuganathan, Professor & Head, Department of CSE, RMK Engineering College, Tamil Nadu, India – 601206.

Abstract—Association rule mining (ARM) is an active data mining research area, and most ARM algorithms cater to a centralized environment. Centralized data mining to discover useful patterns in distributed databases isn't always feasible, because merging data sets from different sites incurs huge network communication costs. In this paper, an improved, high-performance algorithm for distributed data mining is proposed. Each local site runs the improved LMatrix algorithm to calculate local support counts, and the local sites elect a centre site to manage every message exchange needed to obtain all globally frequent itemsets. Using LMatrix also reduces the time needed to scan the partition database, which increases the performance of the algorithm. The aim of this research is thus a distributed algorithm for geographically distributed data sets that offers lower communication costs, superior running efficiency, and stronger scalability than the direct application of a sequential algorithm to distributed databases.
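The division of labour the abstract describes, local support counting at each site and global aggregation at a centre site, can be sketched as follows. The LMatrix optimisation itself is not reproduced; this is plain counting over candidate itemsets:

```python
from collections import Counter

def local_support(transactions, candidates):
    """Support counts for candidate itemsets at one local site."""
    counts = Counter()
    for t in transactions:
        for c in candidates:
            if c <= t:          # itemset contained in the transaction
                counts[c] += 1
    return counts

def global_frequent(site_counts, total_tx, min_sup):
    """The centre site sums the per-site counts and keeps itemsets
    whose global support ratio meets the threshold."""
    total = Counter()
    for counts in site_counts:
        total.update(counts)
    return {c for c, n in total.items() if n / total_tx >= min_sup}
```

Only the per-site count tables travel over the network, not the transactions themselves, which is the communication saving distributed ARM is after.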
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
17. Paper 20021033: Node Sensing & Dynamic Discovering Routes for Wireless Sensor Networks (pp. 122-131)
Prof. Arabinda Nanda, Department of CSE, KEC, Bhubaneswar, India 
Prof (Dr) Amiya Kumar Rath, Department of CSE & IT, CEB, Bhubaneswar, India 
Prof. Saroj Kumar Rout, Department of CSE, KEC, Bhubaneswar, India


Abstract—The applications of Wireless Sensor Networks (WSN) span a wide variety of scenarios. In most of them, the network is composed of a significant number of nodes deployed over an extensive area in which not all nodes are directly connected, so data exchange is supported by multihop communications. Routing protocols are in charge of discovering and maintaining the routes in the network; however, the suitability of a particular routing protocol mainly depends on the capabilities of the nodes and on the application requirements. This paper presents a dynamic route discovery method for communication between sensor nodes and a base station in a WSN. The method tolerates failures of arbitrary individual nodes (node failure) or of a small part of the network (area failure). Each node performs only local route maintenance, needs to record only its neighbour nodes' information, and incurs no extra routing overhead during failure-free periods. It dynamically discovers new routes when an intermediate node, or a small part of the network on the path from a sensor node to the base station, fails. In the proposed method, every node decides its path based only on local information, such as its parent node's and neighbour nodes' routing information, so it is possible for a loop to form in the routing path. We believe the loop problem in sensor network routing is not as serious as in Internet routing or traditional mobile ad-hoc routing; we aim to find all possible loops and eliminate them as far as possible in the WSN.

Keywords- routing protocol; wireless sensor network; node failure; area failure
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
18. Paper 20021034: A Robust Fuzzy Clustering Technique with Spatial Neighborhood Information for Effective Medical Image Segmentation (pp. 132-138)
.
S. Zulaikha Beevi, Assistant Professor, Department of IT, National College of Engineering, Tamilnadu, India.
M. Mohammed Sathik, Associate Professor, Department of Computer Science, Sathakathullah Appa College, Tamilnadu, India.
K. Senthamaraikannan, Professor & Head, Department of Statistics, Manonmaniam Sundaranar University, Tamilnadu, India.


Abstract- Medical image segmentation demands a segmentation algorithm that is both efficient and robust to noise. The conventional fuzzy c-means (FCM) algorithm is an efficient clustering algorithm widely used in medical image segmentation, but it is highly vulnerable to noise since it uses only intensity values for clustering. This paper develops a novel and efficient fuzzy spatial c-means clustering algorithm that is robust to noise. The proposed algorithm incorporates fuzzy spatial information into the membership calculation. The input image is clustered using the proposed ISFCM algorithm, and a comparative study between conventional FCM and the proposed ISFCM shows that the proposed approach outperforms conventional FCM.

Index Terms - clustering, fuzzy c-means, image segmentation, membership function.
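The spatial idea the abstract describes can be sketched as follows: standard FCM memberships are computed from intensity alone, then re-weighted by the summed memberships of each pixel's 3x3 neighbourhood before the cluster centers are updated. This is an illustrative reading of a spatially-informed FCM, not the paper's exact ISFCM; the function name, the percentile-based initialization and the exponents p and q are assumptions.

```python
import numpy as np

def fcm_spatial(img, c=2, m=2.0, p=1, q=1, n_iter=20):
    """Sketch of fuzzy c-means with spatial membership smoothing (not the exact ISFCM)."""
    x = img.astype(float).ravel()
    h, w = img.shape
    centers = np.percentile(x, np.linspace(5, 95, c))      # spread initial centers
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9   # (c, N) distances
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=0, keepdims=True)                  # standard FCM memberships
        # spatial function: sum of memberships over a 3x3 window around each pixel
        U = u.reshape(c, h, w)
        pad = np.pad(U, ((0, 0), (1, 1), (1, 1)), mode='edge')
        H = sum(pad[:, i:i+h, j:j+w] for i in range(3) for j in range(3))
        u = (U ** p) * (H ** q)                            # spatially re-weighted membership
        u = (u / u.sum(axis=0, keepdims=True)).reshape(c, -1)
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)                # update cluster centers
    return u.argmax(axis=0).reshape(h, w), np.sort(centers)
```

The neighbourhood term pulls isolated noisy pixels toward the label of their surroundings, which is the property that makes such variants more noise-robust than plain FCM.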
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
19. Paper 20021035: Design And Implementation Of Multilevel Access Control In Medical Image Transmission Using Symmetric Polynomial Based Audio Steganography (pp. 139-146)
.
J. Nafeesa Begum, Research Scholar & Sr. Lecturer in CSE, Government College of Engg, Bargur-635104, Krishnagiri District, Tamil Nadu, India
K. Kumar, Research Scholar & Lecturer in CSE, Government College of Engg, Bargur-635104, Tamil Nadu, India
Dr. V. Sumathy, Asst. Professor in ECE, Government College of Technology, Coimbatore, Tamil Nadu, India


Abstract— Steganography techniques are used in multimedia data transfer to prevent adversaries from eavesdropping. The medical profession is also under a strict duty to protect the confidentiality of patients' medical records, as per the HIPAA Act of 1996; thus, protection of medical data in telemedicine is of paramount importance. Most telemedicine systems include some form of security measure, such as passwords. Password identification determines whether a user is authorized to gain access to a system, but passwords are an insufficient mechanism for maintaining patient confidentiality against intruders who learn a user's password. This paper deals with the design and implementation of multilevel access control in medical image transmission using symmetric-polynomial-based audio steganography. The scheme hides a medical image inside an audio file and sends it to multiple receivers; only the intended recipients know that the audio files actually contain medical images, and they can use their keys to view the images. We have developed a multilevel access control model in which medical images sent to lower-class users, such as medical assistants, can also be seen by physicians higher in the hierarchy, whereas the reverse is not allowed. Multilevel access control is provided by a symmetric-polynomial-based scheme. The steganography scheme hides the medical image in different bit locations of the host media without inviting suspicion: the secret file is embedded in a cover medium with a key, and at the receiving end the key can be derived by all classes higher in the hierarchy using the symmetric polynomial, after which the medical image can be retrieved. The system is implemented and found to be secure, fast and scalable. Simulation results show that the system is dynamic in nature and supports any type of hierarchy.
The proposed approach performs well even during frequent member joins and leaves, and the computation cost is reduced because the same algorithm is used for key computation and descendant key derivation. The steganographic technique used in this paper does not use the conventional LSBs; it uses two bit positions, and the hidden data starts only from a frame dictated by the key. Hence the quality of the stego data is improved.

Index Terms- HIPAA, Steganography, Multilevel Access control, audio file, symmetric polynomial, dynamic, scalable
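The two-bit, key-dictated embedding described above might look roughly like the sketch below, which hides two payload bits per 16-bit sample at bit positions 1 and 2, starting from a frame derived from the key. All names and the start-frame rule are illustrative assumptions, not the paper's scheme; the payload length is assumed even.

```python
def embed(samples, payload_bits, key, bit_positions=(1, 2)):
    """Hide payload bits in two bit positions of 16-bit audio samples,
    starting at a key-dictated frame (a sketch, not the paper's scheme)."""
    out = list(samples)
    start = key % max(1, len(samples) - len(payload_bits) // 2)  # key-dictated start frame
    it = iter(payload_bits)
    i = start
    for pair in zip(it, it):                     # two payload bits per sample
        for bit, pos in zip(pair, bit_positions):
            out[i] = (out[i] & ~(1 << pos)) | (bit << pos)
        i += 1
    return out, start

def extract(samples, n_bits, start, bit_positions=(1, 2)):
    """Recover n_bits payload bits from the stego samples."""
    bits = []
    i = start
    while len(bits) < n_bits:
        for pos in bit_positions:
            bits.append((samples[i] >> pos) & 1)
        i += 1
    return bits[:n_bits]
```

Because only two mid-low bit positions per sample change, and only from the key-selected frame onward, the audible distortion stays small while an eavesdropper without the key does not know where the payload begins.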
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
20. Paper 23021040: Enhanced Authentication and Locality Aided - Destination Mobility in Dynamic Routing Protocol for MANET (pp. 147-152)
.
Sudhakar Sengan, Lecturer, Department of CSE, Nandha College of Technology, Erode, Tamil Nadu, India
Dr. S. Chenthur Pandian, Principal, Selvam College of Technology, Namakkal, Tamil Nadu, India

.
Abstract — The Mobile Ad Hoc Network (MANET) is an emerging area of research in the communication-network world. Because a MANET is infrastructureless, its arbitrary network topology is dynamic, so a new set of networking strategies must be implemented to provide efficient end-to-end communication. Node activities such as sending or receiving data are highly traceable, and nodes are vulnerable to attacks and disruptions. To identify such nodes, a method of direct validation is proposed: since it is unlikely for two ad hoc nodes to occupy the same position concurrently, the match between a position and an ID is unique. This information is obtained via the Global Positioning System (GPS) and location services. In the routing protocol, location information is distributed between nodes by means of position beacons. Routing schemes rely on the cooperation and information exchanged among the nodes; here, in addition to the node ID, extra information such as node positions is used for making routing decisions. Neighbouring nodes receive the request and contend for the channel to become the next hop using a receiver-contention channel-access mechanism: a receiver that is geographically closer to the destination is assigned a higher priority and wins the contention. The destination also finds the corresponding authentication code according to the position carried in the RREQ and encrypts the code with the private key of its key pair. The encrypted result is included in the RREP and sent to the source, which verifies that it has reached the right destination by decrypting the information with the destination's key and comparing the authentication code with the one obtained through the position request. To protect routing against intruders, packet dropping and Sybil attacks, a watchdog and a path selector are used: the watchdog identifies misbehaving nodes, while the path selector avoids routing packets through them.
The watchdog and the path selector are run by each server, and each server maintains a rating for every other node it knows about in the VHR. In the proposed model, route selection is a function of the following parameters: hop count, trust level of the node, and security level of the application. This paper focuses on secure neighbor detection, trust-factor evaluation, operational mode, route discovery and route selection, and mainly addresses the security of geographic routing. To keep the source informed about the destination's mobility, the destination keeps sending an alert message to its previous hop, telling it that it has changed its position and that any reference to it for data-packet forwarding should be reported to the VHR server.

Keywords— Mobile ad hoc networks, routing protocols, multipath routing, Reliable Routing, Position Based.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
21. Paper 23021041: Processor Based Active Queue Management for providing QoS in Multimedia Application (pp. 153-158)
.
N. Saravana Selvam, Department of Computer Science and Engineering, Sree Sowdambika College of Engineering, Aruppukottai, India
Dr. S. Radhakrishnan, Department of Computer Science and Engineering, Arulmigu Kalasalingam College of Engineering, Krishnankoil, India


Abstract—The objective of this paper is to implement an active-network-based active queue management (AQM) technique for providing Quality of Service (QoS), using a Network Processor (NP) based router to enhance multimedia applications. The performance is evaluated using the Intel IXP2400 NP simulator. The results demonstrate that active-network-based AQM performs better than the RED algorithm under congestion and is well suited to high-speed packet classification supporting multimedia applications with minimum delay and queue loss. Using simulation, we show that the proposed system can provide assurance for prioritized flows with improved network utilization, where bandwidth is shared among the flows according to their priority levels. We first analyze the feasibility and optimality of the load-distribution schemes and then present separate solutions for non-delay-sensitive and delay-sensitive streams. Rigorous simulations and experiments have been carried out to evaluate the performance.

Key words - Multimedia, QoS, Network Processor, IXP 2400.
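For reference, the RED baseline that the proposed technique is compared against keeps an exponentially weighted moving average (EWMA) of the queue length and drops arriving packets with a probability that rises linearly between two thresholds. A minimal sketch, with illustrative parameter values:

```python
import random

class RED:
    """Random Early Detection sketch: drop probability grows linearly with the
    EWMA queue length between min_th and max_th (illustrative parameters)."""
    def __init__(self, min_th=5, max_th=15, max_p=0.1, wq=0.002):
        self.min_th, self.max_th, self.max_p, self.wq = min_th, max_th, max_p, wq
        self.avg = 0.0

    def on_packet(self, queue_len, rng=random.random):
        """Return True if the arriving packet should be dropped."""
        self.avg = (1 - self.wq) * self.avg + self.wq * queue_len  # EWMA update
        if self.avg < self.min_th:
            return False                      # no congestion: accept
        if self.avg >= self.max_th:
            return True                       # severe congestion: drop
        p = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
        return rng() < p                      # probabilistic early drop
```

The EWMA smooths out transient bursts, so RED reacts to sustained congestion rather than momentary queue spikes, which is the behaviour an AQM alternative must beat.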
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
22. Paper 23021045: New Clustering Algorithm for Vector Quantization using Rotation of Error Vector (pp. 159-165)
.
Dr. H. B. Kekre, Computer Engineering, Mukesh Patel School of Technology Management and Engineering, NMIMS University, Vileparle(w) Mumbai 400–056, India
Tanuja K. Sarode, Ph.D. Scholar, MPSTME, NMIMS University, Assistant Professor, Computer Engineering, Thadomal Shahani Engineering College, Bandra(W), Mumbai 400-050, India


Abstract—This paper presents a new clustering algorithm that gives less distortion than the well-known Linde-Buzo-Gray (LBG) algorithm and Kekre's Proportionate Error (KPE) algorithm. In LBG, a constant error is added every time the clusters are split, resulting in cluster formation in one direction, which is 135° in the 2-dimensional case; because of this, clustering is inefficient and the MSE of LBG is high. To overcome this drawback, KPE adds a proportionate error to change the cluster orientation, but the variation is limited to ±45° about 135°. The proposed algorithm takes care of this problem by introducing a new orientation each time the clusters are split. The proposed method improves PSNR by 2 dB to 5 dB for codebook sizes 128 to 1024 with respect to LBG.

Keywords-component; Vector Quantization; Codebook; Codevector; Encoding; Compression.
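The splitting idea can be illustrated in 2-D: LBG splits each centroid with a fixed error vector, while the variant described above changes the orientation of the error vector at each split stage. The sketch below rotates the error vector by 45° per stage and refines with standard Lloyd iterations; it is an illustration of the rotation idea, not the authors' exact algorithm.

```python
import numpy as np

def lbg_rotated(data, codebook_size, n_iter=10, eps=1.0):
    """LBG-style codebook design in 2-D where the splitting error vector is
    rotated at each split stage (a sketch of the rotation idea)."""
    codebook = data.mean(axis=0, keepdims=True)              # start with one centroid
    angle = 0.0
    while len(codebook) < codebook_size:
        e = eps * np.array([np.cos(angle), np.sin(angle)])   # current error vector
        codebook = np.vstack([codebook + e, codebook - e])   # split every centroid
        angle += np.pi / 4                                   # new orientation next stage
        for _ in range(n_iter):                              # standard Lloyd refinement
            d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
            lab = d.argmin(1)
            for k in range(len(codebook)):
                if (lab == k).any():
                    codebook[k] = data[lab == k].mean(0)
    return codebook
```

Rotating the split direction spreads the new codevectors over different orientations instead of always 135°, which is exactly the inefficiency the abstract attributes to constant-error splitting.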
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
23. Paper 25021046: Enhanced Ad-Hoc on Demand Multipath Distance Vector Routing protocol (pp. 166-170)
.
Mrs. Sujata V. Mallapur, Department of Information Science and Engineering Appa Institute of Engineering and Technology Gulbarga, India 
Prof. Sujata .Terdal, Department of Computer Science and Engineering P.D.A College of Engineering Gulbarga, India


Abstract—Due to mobility in an ad hoc network, the topology may change randomly, rapidly and unexpectedly; as a result, routes often disappear and new ones arise. To avoid frequent route discovery and route failure, EAOMDV (Enhanced Ad-Hoc On-demand Multipath Distance Vector) was proposed, based on the existing AOMDV routing protocol, to solve the "route failure" problem in AOMDV. EAOMDV reduces route failures by preemptively predicting link failure from the signal power received by the receiver (pr). The proposed protocol controls overhead, increases throughput and reduces delay. EAOMDV was implemented on NS-2, and evaluation results show that it outperforms AOMDV.

Keywords— Ad-Hoc Networks, AODV, AOMDV, Multipath Routing, EAOMDV.
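Preemptive link-failure prediction from received signal power can be illustrated by linearly extrapolating successive power samples to the time at which they cross a sensitivity threshold, giving the router time to repair the route before the link actually breaks. This is a generic sketch of the idea, not EAOMDV's exact rule:

```python
def predict_link_break(p1, p2, t1, t2, p_thresh):
    """Estimate when received power will drop below p_thresh by linear
    extrapolation of two successive samples (p1 at t1, p2 at t2).
    Returns None if power is steady or rising (no predicted break)."""
    if p2 >= p1:
        return None
    slope = (p2 - p1) / (t2 - t1)             # rate of signal decay
    return t2 + (p_thresh - p2) / slope       # time at which power hits threshold
```

A node that sees the predicted break time approaching can trigger an alternate-path switch preemptively instead of waiting for packet loss to reveal the failure.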
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
24. Paper 25021047: A Survey on Space-Time Turbo Codes (pp. 171-177)
.
Dr. C. V. Seshaiah, Prof and Head, Sri Ramakrishna Engg. College 
S. Nagarani, Research Scholar, Anna University, Coimbatore


Abstract—As wireless communication systems look to make the transition from voice communication to interactive Internet data, achieving higher bit rates becomes both increasingly desirable and challenging. Space-time coding (STC) is a communications technique for wireless systems that employ multiple transmit antennas and single or multiple receive antennas. Space-time codes take advantage of both the spatial diversity provided by multiple antennas and the temporal diversity available with time-varying fading, and can be divided into block codes and trellis codes. Space-time trellis coding merges signal processing at the receiver with coding techniques appropriate to multiple transmit antennas. These advantages make STC extremely attractive for high-rate wireless applications. Initial STC research efforts focused on narrowband flat-fading channels. The decoding complexity of space-time turbo codes (STTC) increases exponentially as a function of the diversity level and transmission rate. This paper provides an overview of various techniques used for the design of space-time turbo codes, and discusses the techniques researchers have used to build encoders and decoders for multiple transmit and receive antennas. In addition, the future enhancement gives a general idea for the improvement and development of various codes, which will involve implementing a Viterbi decoder with soft decoding in a multi-antenna scenario; the space-time code may then be analyzed using some of the available metrics and simulated for different receive-antenna configurations.

Keywords—Antenna, Bit Rate, Block Codes, Diversity, Space-Time Coding (STC), Transmitter, Trellis Codes, and Viterbi Decoder.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
25. Paper 25021048: Mathematical Principles in Software Quality Engineering (pp. 178-184)
.
Dr. Manoranjan Kumar Singh, PG Department of Mathematics, Magadha University, Bodhagaya, Gaya, Bihar, India-823001
Rakesh. L, Department of Computer-Science, SCT Institute of Technology, Bangalore, India-560075


Abstract— Mathematics has many useful properties for the development of complex software systems. One is that it can exactly describe a physical situation of an object or the outcome of an action. Mathematics supports abstraction, which makes it an excellent medium for modeling; since it is an exact medium, there is little possibility of ambiguity. This paper demonstrates that mathematics provides a high level of validation when it is used as a software medium. It also outlines the distinguishing characteristics of structural testing, which is based on the source code of the program under test; structural testing methods are very amenable to rigorous definition, mathematical analysis and precise measurement. Finally, it discusses the functional-versus-structural testing debate in order to reach a sense of complete testing. Any program can be considered a function in the sense that program inputs form its domain and program outputs form its range. In general, discrete mathematics is more applicable to functional testing, while graph theory pertains more to structural testing.

Keywords— Propositional logic, Graph theory, Validation, Fault.
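As one concrete example of graph theory serving structural testing, McCabe's cyclomatic complexity V(G) = E - N + 2P counts the linearly independent paths through a program's control-flow graph, and hence the minimum number of test paths needed for branch-level coverage. The example graph below is illustrative:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity V(G) = E - N + 2P for a control-flow graph."""
    return len(edges) - len(nodes) + 2 * components

# Illustrative control-flow graph: an if/else nested inside a loop.
nodes = ["entry", "loop", "if", "then", "else", "join", "exit"]
edges = [("entry", "loop"), ("loop", "if"), ("if", "then"), ("if", "else"),
         ("then", "join"), ("else", "join"), ("join", "loop"), ("loop", "exit")]
```

Here E = 8, N = 7 and P = 1, so V(G) = 3: one basis path each for the loop exit, the then-branch and the else-branch.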
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
26. Paper 27021050: An Analytical Study on Behavior of Clusters Using K Means, EM and K* Means Algorithm (pp. 185-190)
.
G. Nathiya, Department of Computer Science, P.S.G.R Krishnammal College for Women, Coimbatore- 641004, Tamilnadu India.
S. C. Punitha, Department of Computer Science, P.S.G.R Krishnammal College for Women, Coimbatore- 641004, Tamilnadu India.
Dr. M. Punithavalli, Director of the Computer Science Department, Sri Ramakrishna college of Arts and Science for Women, Coimbatore, Tamilnadu, India.


Abstract—Clustering is an unsupervised learning method that constitutes a cornerstone of intelligent data analysis. It is used to explore the inter-relationships among a collection of patterns by organizing them into homogeneous clusters, and it has been applied to a variety of tasks in the field of Information Retrieval (IR). Clustering has become one of the most active areas of research and development. It attempts to discover the set of meaningful groups in which members of each group are more closely related to one another than to members of other groups; the resulting clusters can provide a structure for organizing large bodies of text for efficient browsing and searching. A wide variety of clustering algorithms has been intensively studied. Among the most common and effective are the iterative optimization clustering algorithms, which have demonstrated reasonable clustering performance, e.g. the Expectation-Maximization (EM) algorithm and its variants, and the well-known k-means algorithm. This paper presents an analysis of how the partition clustering techniques EM, K-means and K*-means work on the heartspect dataset with respect to the following measures: purity, entropy, CPU time, cluster-wise analysis, mean-value analysis and inter-cluster distance. The paper finally provides experimental results on the dataset for five clusters, strengthening the conclusion that the quality of cluster behavior under the EM algorithm is far better than under the k-means and k*-means algorithms.

Keywords—Cluster, EM, K-means, K*-means, Purity, Entropy, Cluster wise analysis and Mean value analysis.
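The purity and entropy measures used in the comparison can be stated concretely: purity is the fraction of points falling in their cluster's majority class, and entropy is the size-weighted average of each cluster's class entropy (lower is better). A minimal sketch; the exact weighting used in the paper may differ:

```python
import math
from collections import Counter

def _group(labels_true, labels_pred):
    """Group the true labels by predicted cluster."""
    clusters = {}
    for t, p in zip(labels_true, labels_pred):
        clusters.setdefault(p, []).append(t)
    return clusters

def purity(labels_true, labels_pred):
    """Fraction of points belonging to the majority class of their cluster."""
    clusters = _group(labels_true, labels_pred)
    correct = sum(Counter(m).most_common(1)[0][1] for m in clusters.values())
    return correct / len(labels_true)

def entropy(labels_true, labels_pred):
    """Size-weighted average class entropy over clusters (0 = pure clusters)."""
    clusters = _group(labels_true, labels_pred)
    n = len(labels_true)
    total = 0.0
    for m in clusters.values():
        h = -sum((c / len(m)) * math.log2(c / len(m)) for c in Counter(m).values())
        total += (len(m) / n) * h
    return total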
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
27. Paper 27021055: Node inspection and analysis thereof in the light of area estimation and curve fitting (pp. 191-197)
.
A. Kumar, Dept. of Comp. Sc. & Engg., Sir Padampat Singhania University, Udaipur, India.
P. Chakrabarti, Dept. of Comp. Sc. & Engg., Sir Padampat Singhania University, Udaipur, India.
P. Saini, Dept. of Comp. Sc. & Engg., Sir Padampat Singhania University, Udaipur, India
.

Abstract- In this paper, we present an idea of area specification and the corresponding sensing of nodes in a dynamic network, applying the concept of Monte Carlo methods. We cite certain statistical as well as artificial-intelligence-based techniques for determining the position of a node, and we also apply curve fitting for node detection and its verification.

Keywords-Monte Carlo, least square curve fitting.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
28. Paper 27021059: An Improved Fixed Switching Frequency Direct Torque Control of Induction Motor Drives Fed by Direct Matrix Converter (pp. 198-205)
.
Nabil Taïb and Toufik Rekioua, Electrical Engineering Department, University of A. Mira, Targua Ouzemour, Bejaia, 06000, Algeria. 
Bruno François, L2EP Laboratory, Central School of Lille, Lille 59651, France


Abstract— Few papers have addressed fixed-switching-frequency direct torque control fed by direct matrix converters, and those that have use only the direct-torque-controlled space-vector-modulated method. In this paper, we present an improved method for fixed-switching-frequency direct torque control (DTC) using a direct matrix converter (DMC). The method is characterized by a simple structure, a fixed switching frequency that minimizes torque ripple, and a unity input power factor. Using this strategy, we combine the advantages of direct matrix converters with those of DTC schemes. The constant-frequency technique is combined with the input-current space vector to create the switching table of the DMC. Simulation results clearly demonstrate the better dynamic and steady-state performance of the proposed method.

Keywords- Direct matrix converter; fixed switching frequency; space vector modulation; direct torque control
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
29. Paper 28021065: Internet ware Cloud Computing: Challenges (pp. 206-210)
.
Dr. S Qamar, Department of Computer Science, CAS, King Saud University, Riyadh, Saudi Arabia
Niranjan Lal, Department of Information Technology, SRM University-NCR Campus, Ghaziabad, India
Mrityunjay Singh, Department of Information Technology, SRM University-NCR Campus, Ghaziabad, India


Abstract- After decades of engineering development and infrastructural investment, Internet connections have become a commodity product in many countries, and Internet-scale "cloud computing" has started to compete with the traditional software business through its technological advantages and economies of scale. Cloud computing is a promising enabling technology of Internet ware, and is termed the next big thing in the modern corporate world. Beyond present-day software and technologies, cloud computing will have a growing impact on enterprise IT and business activities in many large organizations. This paper provides an insight into cloud computing and its impacts, and discusses various issues that business organizations face while implementing it; further, it recommends strategies that organizations should adopt while migrating to cloud computing. The purpose of this paper is to develop an understanding of cloud computing in the modern world and its impact on organizations and businesses. The paper first gives a brief introduction to the cloud computing model and its purposes. It then discusses various technical and non-technical issues that must be overcome for the benefits of cloud computing to be realized in corporate businesses and organizations, and finally provides recommendations and strategies that businesses need to work on before stepping into the new technology.

Keywords: Distributed systems, distributed computing, cloud computing, issues and challenges in cloud computing, grid computing, SaaS, IaaS, PaaS.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
30. Paper 28021070: Mobile Database System: Role of Mobility on the Query Processing (pp. 211-216)
.
Samidha Dwivedi Sharma and Dr. R. S. Kasana, Department of Computer Science & Applications, Dr. H. S. Gour University, Sagar, MP, India

Abstract—The rapidly expanding technology of mobile communication gives mobile users the capability of accessing information from anywhere at any time, and wireless technology has made it possible to achieve continuous connectivity in a mobile environment. When a query is specified as continuous, the requesting mobile user can obtain continuously changing results; to provide accurate and timely outcomes, the locations of moving objects have to be closely monitored. The objective of this paper is to discuss problems related to the roles of personal and terminal mobility and of query processing in the mobile environment.

Keywords- Mobile Computing, Mobile Database, Location Management, Location Dependent Data
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
31. Paper 28021076: Secure Iris Authentication Using Visual Cryptography (pp. 217-221)
.
P.S. Revenkar, Faculty of Department of Computer Science and Engineering, Government College of Engineering, Aurangabad, Maharashtra, India
Anisa Anjum, Department of Computer Science and Engineering, Government College of Engineering, Aurangabad, Maharashtra, India
W. Z. Gandhare, Principal of Government College of Engineering, Aurangabad, Maharashtra , India


Abstract—Biometrics deals with automated methods of identifying a person, or verifying a person's identity, based on physiological or behavioral characteristics. Visual cryptography is a secret-sharing scheme in which a secret image is encrypted into shares that independently disclose no information about the original secret image. Because biometric templates are stored in a centralized database, they are exposed to security threats and may be modified by an attacker; if a template is altered, the authorized user will not be allowed to access the resource. To address this issue, visual cryptography schemes can be applied to secure the iris template. Visual cryptography serves such security needs well and provides an extra layer of authentication.

Keywords-component; Biometrics, Visual cryptography, Iris, Authentication.
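The share construction alluded to above is usually illustrated with Naor and Shamir's (2,2) scheme: each secret pixel expands into a two-subpixel pattern per share, random in either share alone, and stacking the shares (a Boolean OR, physically achieved by overlaying transparencies) reveals the secret. A sketch for a 1-D pixel list, not necessarily the exact scheme the paper uses:

```python
import random

def make_shares(secret, seed=0):
    """(2,2) visual cryptography sketch. Secret pixels: 0 = white, 1 = black.
    Share 1 gets a random 2-subpixel pattern; share 2 repeats it for white
    pixels and complements it for black pixels."""
    rng = random.Random(seed)
    s1, s2 = [], []
    for pix in secret:
        a = rng.choice([(0, 1), (1, 0)])                 # random pattern, share 1
        b = a if pix == 0 else (1 - a[0], 1 - a[1])      # same if white, complement if black
        s1.append(a)
        s2.append(b)
    return s1, s2

def stack(s1, s2):
    """Overlay the two shares: subpixel-wise Boolean OR."""
    return [(a[0] | b[0], a[1] | b[1]) for a, b in zip(s1, s2)]
```

Each share alone is a uniformly random pattern (one black subpixel per pixel), so a stolen share, like a stolen template fragment, reveals nothing; only stacking recovers the contrast between black (both subpixels dark) and white (one subpixel dark).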
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
32. Paper 28021077: A New Approach to Lung Image Segmentation using Fuzzy Possibilistic C-Means Algorithm (pp. 222-228)
Full Text: PDF

.
M. Gomathi, Department of MCA, Velalar College Of Engineering and Technology, Thindal (PO), Erode, India 
Dr. P.Thangaraj, Dean, School of Computer Technology and Applications, Kongu Engineering College, Perundurai, Erode, India

.
Abstract- Image segmentation is a vital part of image processing, with widespread application to medical images for diagnosing diseases. Medical images can be segmented manually, but segmentation algorithms achieve greater accuracy than manual segmentation. In the field of medical diagnosis, an extensive diversity of imaging techniques is presently available, such as radiography, computed tomography (CT) and magnetic resonance imaging (MRI), and medical image segmentation is an essential step for most subsequent image-analysis tasks. Although the original FCM algorithm yields good results for segmenting noise-free images, it fails to segment images corrupted by noise, outliers and other imaging artifacts. This paper presents an image-segmentation approach using a Modified Fuzzy C-Means (FCM) algorithm and the Fuzzy Possibilistic C-Means (FPCM) algorithm. The approach is a generalized version of the standard Fuzzy C-Means clustering algorithm, and the limitation of the conventional FCM technique is eliminated by modifying the standard technique. The Modified FCM algorithm is formulated by changing the distance measurement of the standard FCM so that the labeling of a pixel is influenced by other pixels, restraining the noise effect during segmentation. Instead of one term in the objective function, a second term is included, forcing the membership to be as high as possible without a maximum-limit constraint of one. Experiments are conducted on real images to investigate the performance of the proposed modified FCM technique in segmenting medical images. Standard FCM, Modified FCM and FPCM are compared to explore the accuracy of the proposed approach.
.
Keywords-Fuzzy C-Means Clustering Algorithm, Modified FCM, Fuzzy Possibilistic C-Means Clustering Algorithm, Lung Nodule Detection, Medical Image Processing and Image Segmentation
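The two-term objective described above is commonly formulated, following Pal, Pal and Bezdek's FPCM, by adding a typicality term to the fuzzy membership term. A standard statement of that objective, given here for orientation rather than as the authors' exact formulation, is:

```latex
J_{m,\eta}(U, T, V) \;=\; \sum_{i=1}^{c} \sum_{k=1}^{n} \left( u_{ik}^{m} + t_{ik}^{\eta} \right) \lVert x_k - v_i \rVert^{2},
\qquad \text{subject to} \quad \sum_{i=1}^{c} u_{ik} = 1 \;\; \forall k,
\qquad \sum_{k=1}^{n} t_{ik} = 1 \;\; \forall i,
```

where the memberships u_ik sum to one across clusters as in FCM, while the typicalities t_ik are constrained only across the data, which is what lets a point's possibilistic weight stay high without the unit-sum cap and damps the influence of noise and outliers.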
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
33. Paper 28021078: Protection of Web Applications from Cross-Site Scripting Attacks in Browser Side (pp. 229-236)
Full Text: PDF

.
K. Selvamani, Department of Computer Science and Engineering, Anna University, Chennai, India 
A. Duraisamy, Department of Computer Science and Engineering, Anna University, Chennai, India 
A. Kannan, Department of Computer Science and Engineering, Anna University, Chennai, India

.
Abstract— Cross-Site Scripting (XSS) flaws are currently among the most common security problems in modern web applications. These flaws exploit vulnerabilities in the code of web applications, with serious consequences such as theft of cookies, passwords and other personal credentials. Cross-site scripting flaws occur when information in intermediate trusted sites is accessed. A client-side solution can act as a web proxy that mitigates XSS attempts using manually generated rules, effectively protecting against information leakage from the user's environment. XSS flaws are easy to execute but difficult to detect and prevent, and existing client-side solutions degrade the performance of the client's system, resulting in a poor web-surfing experience. This paper provides a client-side solution that uses a step-by-step approach to protect against cross-site scripting without significantly degrading the user's web-browsing experience.
.
Keywords-Web Application; Cross Site Scripting; Client Side Solution; Detection of XSS Attacks
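The rule-based client-side filtering described above can be sketched with a few illustrative rules: strip script elements, inline event handlers and javascript: URIs from untrusted HTML before rendering. These three regex rules are assumptions for illustration only; a real filtering proxy needs a far more careful rule set than any regex sketch.

```python
import re

# Illustrative manually generated rules, in the spirit of a client-side XSS proxy.
SCRIPT_RE = re.compile(r"<\s*script\b.*?<\s*/\s*script\s*>", re.I | re.S)
EVENT_RE = re.compile(r"\son\w+\s*=\s*(\"[^\"]*\"|'[^']*'|[^\s>]+)", re.I)
JSURI_RE = re.compile(r"javascript\s*:", re.I)

def sanitize(html):
    """Apply the rules in order to untrusted HTML before the browser renders it."""
    html = SCRIPT_RE.sub("", html)      # drop <script>...</script> blocks
    html = EVENT_RE.sub("", html)       # drop onload=, onclick=, ... handlers
    html = JSURI_RE.sub("", html)       # neutralize javascript: URIs
    return html
```

Running such rules in a proxy between server and browser is what lets the defense work without modifying either the web application or the browser, at the cost of the per-page filtering overhead the abstract mentions.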
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
34. Paper 28021079: Review of Robust Video Watermarking Algorithms (pp. 237-246)
Mrs. Neeta Deshpande, Research Scholar, SRTM University, Nanded, India
Dr. Archana Rajurkar, Professor and Head, MGM College of Engineering, Nanded, India
Dr. R. Manthalkar, Professor and Head, SGGS Institute of Engineering and Technology, Nanded, India

Abstract— There has been a remarkable increase in data exchange over the web and in the widespread use of digital media, and multimedia data transfers have grown accordingly. The mounting interest in digital watermarking throughout the last decade is largely due to the increasing need for copyright protection of digital content, further driven by its commercial prospects. Applications of video watermarking in copy control, broadcast monitoring, fingerprinting, video authentication, copyright protection, etc. are growing rapidly. The main aspects of information hiding are capacity, security and robustness. Capacity deals with the amount of information that can be hidden; security with the difficulty of anyone detecting the hidden information; and robustness with resistance to modification of the cover content before the concealed information is destroyed. Video watermarking algorithms normally favour robustness: in a robust algorithm it is not possible to eliminate the watermark without severe degradation of the cover content. In this paper, we introduce the notion of video watermarking and the features required to design a robust watermarked video for a valuable application. We review several algorithms and introduce frequently used key techniques. The aim of this paper is to survey the various domains of video watermarking techniques. The majority of the reviewed video watermarking methods emphasize the robustness of the algorithm.

Keywords- Video Watermarking, Authentication, Robust, Techniques, DCT, DWT.
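To make the robustness idea concrete, here is a minimal spread-spectrum embedding sketch in a generic coefficient domain, a common building block of the DCT/DWT methods such reviews cover. The multiplicative rule, the `alpha` strength and the non-blind correlation detector are illustrative choices, not any specific reviewed algorithm.

```python
import numpy as np

def embed(coeffs, watermark, alpha=0.1):
    """Multiplicative spread-spectrum embedding: c' = c * (1 + alpha * w)."""
    return coeffs * (1 + alpha * watermark)

def detect(marked, original, watermark, alpha=0.1):
    """Non-blind detection: correlate the recovered residual with the watermark.

    For an unattacked image the residual equals alpha * w exactly, so the
    correlation is 1; attacks lower it.
    """
    residual = marked / original - 1
    return float(np.corrcoef(residual, alpha * watermark)[0, 1])
```

Robustness in this scheme comes from spreading the watermark over many coefficients, so that destroying it requires perturbing them all, which visibly degrades the cover.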
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
35. Paper 28021080: Terrorism Event Classification Using Fuzzy Inference Systems (pp. 247-256)
Uraiwan Inyaem, Faculty of Information Technology, King Mongkut’s University of Technology North Bangkok, Bangkok, Thailand
Choochart Haruechaiyasak, Human Language Technology Laboratory, National Electronics and Computer Technology Center, Pathumthani, Thailand
Phayung Meesad, Faculty of Technical Education, King Mongkut’s University of Technology North Bangkok, Bangkok, Thailand
Dat Tran, Faculty of Information Science and Engineering, University of Canberra, ACT, Australia


Abstract—Terrorism has caused many problems in Thai society, not only property damage but also civilian casualties. Predicting terrorist activities in advance can help in preparing for and managing the risk of sabotage. This paper proposes a framework for event classification in the terrorism domain using fuzzy inference systems (FISs). A FIS is a decision-making model combining fuzzy logic and approximate reasoning. It consists of five main parts: the input interface, the fuzzification interface, the knowledge base unit, the decision-making unit and the output defuzzification interface. The adaptive neuro-fuzzy inference system (ANFIS) is a FIS model that combines fuzzy logic with a neural network: it automatically identifies the fuzzy logic rules and adjusts the membership functions (MFs), and the neural network can learn directly from the data set to construct the rules and MFs used in various applications. The FIS settings are evaluated in two comparisons. The first is between unstructured and structured events using the same FIS setting; the second compares the FIS and ANFIS model settings for classifying structured events. The data set consists of news articles related to terrorism events in the three southern provinces of Thailand. The experimental results show that the classification performance of the FIS on structured events achieves satisfactory accuracy and is better than on unstructured events. In addition, classification of structured events using ANFIS outperforms classification using the FIS alone in predicting terrorism events.

Keywords- Event classification; terrorism domain; fuzzy inference system (FIS); adaptive neuro-fuzzy inference system (ANFIS); membership function (MF)
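As a toy illustration of the FIS machinery the abstract describes (fuzzification via membership functions, rule firing, defuzzification), here is a zero-order Sugeno-style sketch with triangular MFs. The breakpoints and rule consequents are invented, not the paper's actual rule base.

```python
def tri_mf(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    if a < x <= b:
        return (x - a) / (b - a)
    if b < x < c:
        return (c - x) / (c - b)
    return 1.0 if x == b else 0.0

def sugeno_infer(x, rules):
    """rules: list of ((a, b, c), consequent).

    Fire each rule by its membership degree, then defuzzify as the
    weighted average of the rule consequents (zero-order Sugeno).
    """
    weights = [tri_mf(x, *mf) for mf, _ in rules]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * out for w, (_, out) in zip(weights, rules)) / total
```

ANFIS, as described in the abstract, would go one step further and tune the `(a, b, c)` parameters and consequents from training data instead of fixing them by hand.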
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
36. Paper 28021086: A Model of Cloud Based Application Environment for Software Testing (pp. 257-260)
T. Vengattaraman, Department of Computer Science, Pondicherry University, India 
P. Dhavachelvan, Department of Computer Science, Pondicherry University, India.
R. Baskaran, Department of Computer Science and Engineering, Anna University, Chennai, India.


Abstract— Cloud computing is an emerging platform of service computing designed for swift and dynamic delivery of assured computing resources. Cloud computing provides Service-Level Agreements (SLAs) that guarantee uptime availability, enabling convenient, on-demand network access to distributed and shared computing resources. Although the cloud computing paradigm has established its status in the field of distributed computing, cloud platforms have not yet attracted the attention of the majority of researchers and practitioners; more specifically, the research and practitioner community still has fragmented and incomplete knowledge of cloud computing principles and techniques. In this context, one of the primary motivations of the work presented in this paper is to reveal the versatile merits of the cloud computing paradigm; accordingly, the objective of this work is to bring out the significance of cloud computing through an application environment. In this work, a cloud computing model for software testing is developed.

Keywords-component; Cloud Computing; Software Testing; Web Services
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
37. Paper 28021089: Joint Design of Congestion Control Routing With Distributed Multi Channel Assignment in Wireless Mesh Networks (pp. 261-266)
K. Valarmathi, Research Scholar, Sathyabama University, Chennai, India
N. Malmurugan, Principal, Oxford Engineering College, Trichy, India


Abstract— In Wireless Mesh Networks (WMN), channel assignment has to balance the objectives of maintaining connectivity and increasing aggregate bandwidth. The main aim of a channel assignment algorithm is to assign channels to the network interfaces, given the expected load on each virtual link. From the existing work done so far, we observe that there is no combined solution for multi-channel assignment together with routing and congestion control. In this paper, we propose a congestion control routing protocol with multi-channel assignment. The protocol uses a traffic-aware metric in order to provide quality of service. The proposed protocol can improve throughput and channel utilization to a very high extent because it provides a joint solution for multi-channel assignment and congestion control. The proposed algorithm assigns channels so that congestion is avoided and co-channel interference among links sharing the same channel is reduced. Our simulation results in NS2 show that the proposed protocol attains high throughput and channel utilization along with reduced latency.

Keywords- Wireless Mesh Networks (WMN), channel assignment algorithm, multi-channel assignment, routing, congestion control.
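A minimal greedy sketch of interference-aware channel assignment is shown below. This is not the paper's algorithm (which is joint with routing and congestion control); the link and interference representation here is invented to illustrate the basic objective of spreading interfering links across channels.

```python
def assign_channels(links, num_channels, interferes):
    """Greedy assignment: give each link the channel least used by the
    already-assigned links that interfere with it."""
    assignment = {}
    for link in links:
        counts = [0] * num_channels
        for other, ch in assignment.items():
            if interferes(link, other):
                counts[ch] += 1  # co-channel interference pressure
        assignment[link] = counts.index(min(counts))
    return assignment
```

On a three-link chain with two channels, the middle link ends up on a different channel from both neighbours, which is exactly the co-channel-interference reduction the abstract targets.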
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
38. Paper 28021095: Mobile Broadband Possibilities considering the Arrival of IEEE 802.16m & LTE with an Emphasis on South Asia (pp. 267-275)
Nafiz Imtiaz Bin Hamid 1, Md. Zakir Hossain 2, Md. R. H. Khandokar 3, Taskin Jamal 4, Md. A. Shoeb 5
Department of Electrical and Electronic Engineering (EEE) 
1 Islamic University of Technology (IUT), Board Bazar, Gazipur-1704, Bangladesh. 
4 The University of Asia Pacific (UAP), Dhanmondi R/A, Dhaka-1209, Bangladesh. 
5 Stamford University, Siddeswari, Dhaka-1217, Bangladesh. 
3 School of Engineering and Computer Science. Independent University, Bangladesh. 
2 Radio Access Network (RAN) Department, Qubee. Augure Wireless Broadband Bangladesh Limited.


Abstract— This paper looks deeper into finding an ideal mobile broadband solution, with special stress on the South Asian region through comparative analysis. Having proven their competency in numerous aspects, WiMAX and LTE have already established a strong position in the telecommunication industry. Both WiMAX and LTE are 4G technologies designed to move data rather than voice, with IP networks based on OFDM technology; they are therefore not typical technological rivals in the way GSM and CDMA were. Still, a degree of hostility seems to have broken out long before the stable commercial launch of LTE. In this paper, various deployment aspects of WiMAX and LTE are analyzed. We also consider the question with respect to South Asia, i.e. how the people of this region may benefit. As a result, this paper may serve as a useful source for making major BWA deployment decisions in the region, and it opens the path for further research and in-depth thinking on this issue.

Keywords-BWA;WiMAX; IEEE 802.16e; IEEE 802.16m; LTE; LTE-Advanced
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
39. Paper 28021096: SAR Image Segmentation using Vector Quantization Technique on Entropy Images (pp. 276-282)
Dr. H. B. Kekre, Computer Engineering, MPSTME, NMIMS University, Vileparle (W), Mumbai 400-056, India
Saylee Gharge, Ph.D. Scholar, MPSTME, NMIMS University; Assistant Professor, V.E.S.I.T, Mumbai-400071, India
Tanuja K. Sarode, Ph.D. Scholar, MPSTME, NMIMS University; Associate Professor, TSEC, Mumbai 400-050, India


Abstract— The development and application of various remote sensing platforms has resulted in the production of huge amounts of satellite image data, so there is an increasing need for effective querying and browsing in these image databases. To take advantage of and make good use of satellite image data, we must be able to extract meaningful information from the imagery. Hence we propose a new algorithm for SAR image segmentation using vector quantization on an entropy image. First, we obtain the entropy image; in the second step we use Kekre's Fast Codebook Generation (KFCG) algorithm to segment it. A codebook of size 128 is generated for the entropy image; these code vectors are then clustered into 8 clusters using the same KFCG algorithm and converted into 8 images, which are displayed as the result. This approach leads to neither over-segmentation nor under-segmentation. We compared these results with the well-known Gray Level Co-occurrence Matrix approach; the proposed algorithm gives better segmentation with less complexity.

Keywords-component; SAR image; image Segmentation; Probability; Entropy; Vector Quantization; Codevector;
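The first step of the pipeline, mapping each pixel to the Shannon entropy of its local neighbourhood, can be sketched as below. The window size and border handling are illustrative assumptions; the KFCG clustering stage is not reproduced here.

```python
import math
from collections import Counter

def window_entropy(pixels):
    """Shannon entropy (bits) of the grey-level histogram of a pixel window."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_image(img, k=3):
    """Map each pixel of a 2-D grey-level list to the entropy of its
    k x k neighbourhood (clipped at the image border)."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            win = [img[a][b]
                   for a in range(max(0, i - r), min(h, i + r + 1))
                   for b in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = window_entropy(win)
    return out
```

Homogeneous regions get entropy near 0 while textured regions score higher, which is what makes the entropy image a useful input for the subsequent vector-quantization step.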
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
40. Paper 28021098: Reversible Image data Hiding using Lifting wavelet Transform and Histogram Shifting (pp. 283-289)
S. Kurshid Jinna, Professor, Dept of Computer Science & Engineering, PET Engineering College, Vallioor, Tirunelveli, India
Dr. L. Ganesan, Professor, Dept of Computer Science & Engineering, A.C College of Engineering & Technology, Karaikudi, India


Abstract- A method of lossless data hiding in gray-scale images using the integer wavelet transform and histogram shifting is proposed. The method shifts part of the histogram to create space for embedding the watermark information bits, while maintaining good visual quality. The method is completely reversible: the original image and the watermark data can be recovered without any loss.

Keywords: Data Hiding, Histogram shifting, reversible watermarking, Integer-integer wavelet transforms.
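The core histogram-shifting step can be sketched as follows for a flat list of grey values. This simplified, hypothetical version works directly on pixel values and assumes the zero point lies above the peak; the paper applies the idea to integer (lifting) wavelet coefficients.

```python
from collections import Counter

def embed_bits(pixels, bits):
    """Shift the histogram range (peak, zero) right by one to create a gap,
    then embed one bit at each peak-valued pixel:
    peak stays for bit 0, becomes peak+1 for bit 1."""
    hist = Counter(pixels)
    peak = max(hist, key=hist.get)
    zero = min(v for v in range(peak + 1, 256) if hist.get(v, 0) == 0)
    out, it = [], iter(bits)
    for p in pixels:
        if peak < p < zero:
            out.append(p + 1)              # make room next to the peak
        elif p == peak:
            out.append(p + next(it, 0))    # embed one payload bit
        else:
            out.append(p)
    return out, peak

def extract_bits(pixels, peak, n):
    """Recover the first n embedded bits by re-scanning peak / peak+1 pixels."""
    bits = [1 if p == peak + 1 else 0 for p in pixels if p in (peak, peak + 1)]
    return bits[:n]
```

Reversibility follows because shifted originals end up at `peak+2` or above, so after extraction every value above `peak+1` can be shifted back down by one to restore the exact original histogram.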
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
41. Paper 31011061: GIS: (Geographic Information System) An application for socio-economical data collection for rural area (pp. 290-293)
Full Text: PDF

Mr. Nayak S. K., Head, Dept. of Computer Science, Bahirji Smarak Mahavidyalaya, Basmathnagar, Dist. Hingoli (MS), India
Dr. S. B. Thorat, Director, Institute of Technology and Mgmt, Nanded, Dist. Nanded (MS), India
Dr. Kalyankar N. V., Principal, Yeshwant Mahavidyalaya, Nanded (MS), India

Abstract—India carries out planning through its Planning Commission, on the basis of information collected by traditional, tedious and manual methods that are too slow to sustain. We are now in the 21st century. Over the last few decades, information technology has progressed by leaps and bounds and has completely changed the way of life in developed nations. The internet has changed established working practices, opened new vistas and provided a platform to connect, creating opportunities for collaborative work spaces that go beyond national boundaries. We live in a global economy, with India moving towards a Liberalized Market-Oriented Economy (LMOE). Considering these things and focusing on GIS, we propose a system for collecting socio-economic data and water resource management information for rural areas via the internet.
Keywords- Cartography, photogrammetry, digital-divide, data capture.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
42. Paper 28021073: Probabilistic Semantic Web Mining Using Artificial Neural Analysis (pp. 294-304)
Mr. T. Krishna Kishore, Assistant Professor, St. Ann's College of Engineering and Technology, Chirala-523187
Mr. T. Sasi Vardhan, Assistant Professor, St. Ann's Engineering College, Chirala-523187
Mr. N. Lakshmi Narayana, Assistant Professor, St. Ann's College of Engineering and Technology, Chirala-523187


Abstract - Most web users' requirements concern search or navigation time and getting correctly matched results. These constraints can be satisfied with additional modules attached to existing search engines and web servers. This paper proposes a powerful architecture for search engines, named Probabilistic Semantic Web Mining after the methods used. With ever larger collections of data resources on the World Wide Web (WWW), web mining has become one of the most important requirements for web users. Web servers store data in various formats, including text, image, audio and video, but cannot identify the contents of the data. Search techniques can be improved by adding semantic web mining and probabilistic analysis to obtain more accurate results. Semantic web mining provides meaningful search of data resources by eliminating useless information during the mining process; in this technique, web servers maintain meta-information about each data resource they hold, which helps the search engine retrieve information relevant to the user's input string. This paper proposes combining these two techniques, semantic web mining and probabilistic analysis, for efficient and accurate web mining search results. The SPF is calculated by considering both the semantic accuracy and the syntactic accuracy of the data with respect to the input string, and is the deciding factor for producing results.
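The abstract does not define how the SPF combines the two accuracies. As a purely hypothetical illustration, a weighted blend of a semantic score and a token-overlap syntactic score might look like this (both function names, the Jaccard stand-in and `alpha` are invented):

```python
def syntactic_sim(query, text):
    """Token-overlap (Jaccard) similarity as a stand-in syntactic score."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q | t) if q | t else 0.0

def spf_score(sem, syn, alpha=0.5):
    """Hypothetical SPF: weighted blend of semantic and syntactic accuracy,
    both assumed to lie in [0, 1]."""
    return alpha * sem + (1 - alpha) * syn
```

Ranking candidate resources by such a combined score would implement the "deciding factor" role the abstract assigns to the SPF.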
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
43. Paper 28021075: Document Clustering using Sequential Information Bottleneck Method (pp. 305-312)
Full Text: PDF
Ms. P. J. Gayathri, M.Phil Scholar, P.S.G.R. Krishnammal College for Women, Coimbatore, India
Mrs. S. C. Punitha, HOD, Department of Computer Science, P.S.G.R. Krishnammal College for Women, Coimbatore, India
Dr. M. Punithavalli, Director, Department of Computer Science, Sri Ramakrishna College of Arts and Science for Women, Coimbatore, India

Abstract-Document clustering is a subset of the larger field of data clustering, borrowing concepts from information retrieval (IR), natural language processing (NLP) and machine learning (ML). It is a specific technique for unsupervised document organization, automatic topic extraction and fast information retrieval or filtering. A wide variety of unsupervised clustering algorithms exist. This paper presents a sequential algorithm for document clustering that enhances the features of existing algorithms. It illustrates the Principal Direction Divisive Partitioning (PDDP) algorithm, describes its drawbacks, and introduces a combinatorial framework for PDDP; it then describes a simplified version of the EM algorithm called the spherical Gaussian EM (sGEM) algorithm, and the Information Bottleneck (IB) method, a technique concerned with accuracy, complexity and time-space trade-offs. The PDDP algorithm recursively splits the data samples into two sub-clusters using the hyperplane normal to the principal direction derived from the covariance matrix, which is the central logic of the algorithm. However, PDDP can yield poor results, especially when clusters are not well separated from one another. To improve the quality of the clustering results, new cluster memberships are reallocated using the IB algorithm with different settings; the IB method gives good accuracy but consumes more time. Furthermore, based on the theoretical background of the sGEM algorithm and the sequential Information Bottleneck (sIB) method, the framework can be extended to cover the problem of estimating the number of clusters using the Bayesian Information Criterion. Experimental results show the effectiveness of the proposed algorithm in comparison with the existing algorithm.
Keywords- Principal Direction Divisive Partitioning algorithm (PDDP), Spherical Gaussian Expectation - Maximization (sGEM), Sequential Information Bottleneck Method (sIB), Bayesian Information Criterion (BIC), Centroid Scattered Value (CSV).
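The basic PDDP split described above (cut by the hyperplane normal to the principal direction of the mean-centred data) can be sketched with one SVD. This is the generic textbook step, not the paper's enhanced combinatorial framework or the sIB reallocation.

```python
import numpy as np

def pddp_split(X):
    """One PDDP step: split the rows of X (documents as feature vectors)
    by the sign of their projection onto the principal direction
    of the mean-centred data."""
    centred = X - X.mean(axis=0)
    # Principal direction = first right singular vector of the centred matrix
    # (equivalently, leading eigenvector of the covariance matrix).
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    proj = centred @ vt[0]
    return X[proj >= 0], X[proj < 0]
```

Applying this recursively to the larger sub-cluster yields the divisive hierarchy; the paper's point is that this split can be poor when clusters overlap, which is where the IB-based reallocation comes in.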


---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

