
Vol. 9 No. 6 June 2011 International Journal of Computer Science and Information Security

Publication: June 2011, Volume 9, No. 6

.

Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

1. Paper 19051107: Additive Model of Reliability of Biometric Systems with Exponential Distribution of Failure Probability (pp. 1-4)

Full Text: PDF

.

Zoran Ćosić, Director, Statheros d.o.o., Kaštel Stari, Croatia

Jasmin Ćosić, IT Section of Police Administration, Ministry of Interior of Una-sana canton, Bihać, Bosnia and Hercegovina

Miroslav Bača, Professor, Faculty of Organisational and Informational science, Varaždin, Croatia

.

Abstract — Approaches to the reliability analysis of biometric systems have been reviewed in numerous scientific papers, most of which consider the reliability of the component software applications. System reliability, covering both the technical and the software parts, is of crucial importance to users and to manufacturers of biometric systems. In this paper, the authors develop a mathematical model for analysing the reliability of biometric systems that accounts for the dependence of components with exponentially distributed failure probability.
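As a brief illustration of the kind of model the abstract refers to, the sketch below computes the reliability over time of a biometric pipeline whose components fail independently with exponentially distributed lifetimes; the component failure rates are assumed values for illustration only, not taken from the paper.

    import numpy as np

    # Assumed failure rates (per hour) for sensor, feature extractor and
    # matcher; illustrative values, not the paper's data.
    rates = np.array([2e-5, 5e-5, 1e-5])

    def series_reliability(t, lam):
        # For independent components in series with exponential lifetimes:
        # R(t) = prod_i exp(-lambda_i * t) = exp(-t * sum(lambda_i))
        return np.exp(-t * lam.sum())

    for t in (100.0, 1000.0, 10000.0):
        print(f"R({t:8.0f} h) = {series_reliability(t, rates):.4f}")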

.

Keywords- Additive model, Biometric system, reliability, exponential distribution, UML

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

2. Paper 26051129: IP Private Branch eXchange of Saint Joseph University, Macao: a Design Case Study (pp. 5-11)

Full Text: PDF

.

A. Cotão, R. Whitfield, J. Negreiros

Information Technology Department, University of Saint Joseph, Macau, China

.

Abstract — The main goal of this research is to present the specification of a digital telephone system, an IP PBX, for the new campus of Saint Joseph University (USJ), Macao. Given that the new USJ campus at Green Island, Macao, planned for September 2012, was designed with the latest technologies available to achieve energy savings and contribute to environmental sustainability, the available prototype was designed using VoIP (Voice over IP) in line with this trend. The study also expects to conclude that there is a financial case for preferring a VoIP phone system over a conventional one: choosing this platform eliminates the need for conventional telephone wiring on the new campus, which represents considerable savings in cost and logistics. Further, good open-source VoIP software such as AsteriskNOW© is already available. Finally, the internal USJ connection to the Catholic University of Lisbon, Portugal, adds another financial justification for this project, since international calls can be quite expensive. The aim of this paper is therefore to analyze the setup, testing and implementation of a prototype, including the difficulties encountered, technical lessons learned and recommendations on the tested IP PBX.

.

Keywords- Voice Over IP (VoIP), Private Branch eXchange (PBX), AsteriskNOW©, PSTN, IP handsets, IP Providers.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

3. Paper 30051137: A Novel and Secure Data Sharing Model with Full Owner Control in the Cloud Environment (pp. 12-17)

Full Text: PDF

.

Mohamed Meky and Amjad Ali

Center of Security Studies, University of Maryland University College, Adelphi, Maryland, USA

.

Abstract — Cloud computing is a rapidly growing segment of the IT industry that will bring new service opportunities, with significant reductions in IT capital expenditure and operating costs, on-demand capacity, and pay-per-use pricing models for IT service providers. Among these services are Software-as-a-Service, Platform-as-a-Service, Infrastructure-as-a-Service, Communication-as-a-Service, Monitoring-as-a-Service, and Storage-as-a-Service. Storage-as-a-Service gives data owners a cost-effective way to store massive data sets and handle efficient routine data backup by utilizing the vast storage capacity offered by a cloud computing infrastructure. However, shifting data storage to a cloud infrastructure introduces several security threats, as cloud providers may have complete control over the computing infrastructure that underpins the services. These threats include unauthorized data access, compromised data integrity and confidentiality, and reduced direct control over the data for its owner. The current literature proposes several approaches for storing and sharing data in cloud environments, but these approaches are tied to specific data formats or encryption techniques. In this paper, unlike previous studies, we introduce a secure and efficient model that allows data owners to retain full control over data sharing in the cloud environment. In addition, it prevents cloud providers from revealing data to unauthorized users. The proposed model can be used in different IT areas, with different data formats and encryption techniques, to provide secure data sharing for fixed and mobile computing devices.

.

Keywords- cloud computing; cloud storage; data sharing model; data access control; data owner full control, cloud storage as a service; data encryption

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

4. Paper 17051101: Performances Evaluation of Inter-System Handover between IEEE802.16e and IEEE802.11 Networks (pp. 18-24)

Full Text: PDF

.

Abderrezak Djemai, Mourad Hadjila, Mohammed Feham

STIC laboratory, University of Tlemcen, Algeria

.

Abstract— This article presents the mechanisms to be implemented for analyzing the performance of inter-system handover between WiFi and WiMAX networks. A handover entity is essential so that a mobile terminal supporting both technologies can perform heterogeneous transfers. In this paper, we propose the development of a software platform able to manage the interoperability between WiMAX and WiFi with uninterrupted communication.

.

Keywords- Networks, Wireless, WiFi, WiMAX, Handover, Packets.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

5. Paper 24051124: Recognizing the Electronic Medical Record Data from Unstructured Medical Data Using Visual Text Mining Techniques (pp. 25-35)

Full Text: PDF

.

Prof. Hussain Bushinak, Faculty of Medicine, Ain Shams University, Cairo, Egypt

Dr. Sayed AbdelGaber, Faculty of Computers and Information, Helwan University, Cairo, Egypt

Mr. Fahad Kamal AlSharif, College of Computer Science, Modern Academy, Cairo, Egypt

.

Abstract - Computer systems and communication technologies have established a strong and influential presence in the different fields of medicine. The cornerstone of a functional medical information system is the Electronic Health Record (EHR) management system. EHR implementation and adoption face different barriers that slow deployment in many organizations. This research focuses on resolving the most common barriers: data entry, unstructured clinical data, and disruption of the physician workflow. It proposes a solution that uses text mining and natural language processing techniques. The solution was tested and verified in four real-world clinical organizations and achieved a correctness and precision of 91.88%.

.

Keywords: Electronic Health Record, Text Mining, Unstructured Medical Data, Medical Data Entry, Health Information Technology.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

6. Paper 31051149: Creating an Appropriate Programming Language for Student Compiler Project (pp. 36-39)

Full Text: PDF

.

Elinda Kajo Mece, Department of Informatics Engineering, Polytechnic University of Tirana, Tirana, Albania

.

Abstract — Finding an appropriate and simple source language for a student compiler project is a challenge, especially when the students are not familiar with high-level programming languages. This paper presents a new programming language intended principally for beginners and for didactic purposes in a compiler design course. SimJ, a reduced form of the Java programming language, is designed for simpler and faster programming. More readable code, low complexity, and basic functionality are the primary goals of SimJ. The language includes the most important functions and data structures needed for creating the simple programs generally found in beginners' programming textbooks. The Polyglot compiler framework is used for the implementation of SimJ.

.

Keywords- compiler design; new programming language; polyglot framework

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

7. Paper 31051152: The History of Web Application Security Risks (pp. 40-47)

Full Text: PDF

.

Fahad Alanazi, Software Technology Research Laboratory, De Montfort University, Leicester, LE1 9BH UK

Mohamed Sarrab, Software Technology Research Laboratory, De Montfort University, Leicester, LE1 9BH UK

.

Abstract — This article surveys current web application risks that are causing public concern and piquing the interest of many scientists and organizations as a result of an increase in attacks. The primary concern of many governments, organizations and companies is data loss and theft; these organizations are therefore seeking to secure their web applications against vulnerabilities. Awareness of the vulnerabilities of web applications leads to recognition of the need for improvements. The three main facets of web security are: confidentiality, integrity and safety of content, and continuity. This paper identifies and discusses ten web application vulnerabilities, detailing the opinions of researchers and OWASP regarding risk assessment and protection.

.

Keywords:

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

8. Paper 31051156: Improving the Performance of Translation Wavelet Transform using BMICA (pp. 48-56)

Full Text: PDF

.

Janett Walters-Williams, School of Computing & Information Technology, University of Technology, Jamaica, Kingston 6, Jamaica W.I.

Yan Li, Department of Mathematics & Computing, Centre for Systems Biology, University of Southern Queensland, Toowoomba, Australia

.

Abstract — Research has shown the Wavelet Transform to be one of the best methods for denoising biosignals, and the Translation-Invariant form of this method has been found to give the best performance. In this paper we merge this method with our newly created Independent Component Analysis method, BMICA. Different EEG signals are used to verify the method within the MATLAB environment. Results are then compared with those of the original Translation-Invariant algorithm and evaluated using the performance measures Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), Signal to Distortion Ratio (SDR), and Signal to Interference Ratio (SIR). Experiments revealed that the BMICA Translation-Invariant Wavelet Transform outperformed the original on all four measures. This indicates that it performs better than the basic Translation-Invariant Wavelet Transform algorithm, producing cleaner EEG signals, which can influence diagnosis as well as clinical studies of the brain.
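Three of the four performance measures named above are straightforward to compute; the sketch below evaluates MSE, PSNR and SDR on a synthetic signal. The signal and the "denoiser output" are placeholders, not the paper's EEG data or algorithms.

    import numpy as np

    def mse(ref, est):
        return np.mean((ref - est) ** 2)

    def psnr(ref, est):
        # Peak signal-to-noise ratio in dB, relative to the reference peak.
        return 10 * np.log10(np.max(np.abs(ref)) ** 2 / mse(ref, est))

    def sdr(ref, est):
        # Signal-to-distortion ratio in dB.
        return 10 * np.log10(np.sum(ref ** 2) / np.sum((ref - est) ** 2))

    rng = np.random.default_rng(0)
    clean = np.sin(np.linspace(0, 8 * np.pi, 1000))        # stand-in "EEG" signal
    denoised = clean + 0.05 * rng.standard_normal(1000)    # stand-in denoiser output
    print(f"MSE={mse(clean, denoised):.5f}  "
          f"PSNR={psnr(clean, denoised):.2f} dB  SDR={sdr(clean, denoised):.2f} dB")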

.

Keywords-B-Spline; Independent Component Analysis; Mutual Information; Translation-Invariant Wavelet Transform

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

9. Paper 31051157: Hole Filling IFCNN Simulation by Parallel RK(5,6) Techniques (pp. 57-64)

Full Text: PDF

.

S. Senthilkumar and Abd Rahni Mt Piah,

Universiti Sains Malaysia, School of Mathematical Sciences, Pulau Pinang-11800, Penang, Malaysia

.

Abstract — This paper concentrates on employing different parallel RK(5,6) techniques for hole filling via the unique characteristics of improved fuzzy cellular neural network (IFCNN) simulation, to improve the performance of image or handwritten character recognition. Results are presented according to the range of templates selected for simulation.

.

Keywords- Parallel fifth-order six-stage numerical integration techniques, Improved fuzzy cellular neural network, Hole filling, Simulation, Ordinary differential equations.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

10. Paper 31051160: Location Estimation and Mobility Prediction Using Neuro-fuzzy Networks In Cellular Networks (pp. 65-69)

Full Text: PDF

.

Maryam Borna, Department of Electrical Engineering, Iran University of Science and Technology, Tehran, Iran

Mohammad Soleimani, Department of Electrical Engineering, Iran University of Science and Technology, Tehran, Iran

.

Abstract - In this paper an approach is proposed for location estimation, tracking and mobility prediction in cellular networks in dense urban areas using neural and neuro-fuzzy networks. In urban areas with high buildings, the accuracy of positioning methods based on direction finding and ranging degrades significantly due to multipath fading and non-line-of-sight conditions. In these areas, high user traffic also creates a need for network resource management, for which knowing the user's next likely position is helpful. Here, using the fingerprint positioning concept, appropriate fingerprinting parameters for GSM cellular networks are chosen, and MLP and RBF neural networks are used for position estimation. A neuro-fuzzy tracking and post-processing method is then applied to the estimated locations, and an ANFIS neuro-fuzzy network is used for mobility prediction.
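As a rough illustration of the fingerprint-positioning step (not the authors' network or data), the sketch below trains an MLP to map simulated received-signal-strength fingerprints to positions; the path-loss model, network sizes and noise levels are all assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Simulated fingerprint database: RSS from 6 base stations -> (x, y) position.
    rng = np.random.default_rng(1)
    positions = rng.uniform(0, 1000, size=(500, 2))                 # metres
    stations = rng.uniform(0, 1000, size=(6, 2))
    dist = np.linalg.norm(positions[:, None] - stations[None], axis=2)
    rss = -30 - 35 * np.log10(dist) + rng.normal(0, 4, dist.shape)  # path loss + shadowing

    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    model.fit(rss[:400], positions[:400])                           # train on 400 fingerprints
    err = np.linalg.norm(model.predict(rss[400:]) - positions[400:], axis=1)
    print(f"median positioning error: {np.median(err):.1f} m")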

.

Keywords- position estimation; neuro-fuzzy; prediction; cellular networks.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

11. Paper 31051161: A Fuzzy Clustering Based Approach for Mining Usage Profiles from Web Log Data (pp. 70-79)

Full Text: PDF

.

Zahid Ansari (1), Mohammad Fazle Azeem (2), A. Vinaya Babu (3) and Waseem Ahmed (4)

1,4 Dept. of Computer Science Engineering, P.A. College of Engineering, Mangalore, India

2 Dept. of Electronics and Communication Engineering, P.A. College of Engineering, Mangalore, India

3 Dept. of Computer Science Engineering, Jawaharlal Nehru Technological University, Hyderabad, India

.

Abstract — The World Wide Web continues to grow at an amazing rate in both the size and complexity of Web sites and is well on its way to being the main reservoir of information and data. Due to this growth in the size and complexity of the WWW, web site publishers face increasing difficulty in attracting and retaining users. To design popular and attractive websites, publishers must understand their users' needs; analyzing users' behaviour is therefore an important part of web page design. Web Usage Mining (WUM) is the application of data mining techniques to web usage log repositories in order to discover the usage patterns that can be used to analyze the user's navigational behavior [1]. WUM contains three main steps: preprocessing, knowledge extraction and results analysis. The goal of the preprocessing stage in Web usage mining is to transform the raw web log data into a set of user profiles, each capturing a sequence or a set of URLs representing a user session. This sessionized data can be used as the input for a variety of data mining tasks such as clustering [2], association rule mining [3], sequence mining [4] etc. If the data mining task at hand is clustering, the session files are filtered to remove very small sessions in order to eliminate noise from the data [5]. But direct removal of these small sessions may result in the loss of a significant amount of information, especially when the number of small sessions is large. We propose a "Fuzzy Set Theoretic" approach to deal with this problem. Instead of directly removing all the small sessions below a specified threshold, we assign weights to all the sessions using a "Fuzzy Membership Function" based on the number of URLs accessed by each session. After assigning the weights we apply a "Fuzzy c-Means Clustering" algorithm to discover the clusters of user profiles. In this paper, we discuss our methodology to preprocess the web log data, including data cleaning, user identification and session identification. We also describe our methodology for the feature selection (or dimensionality reduction) and session weight assignment tasks. Finally we compare our soft computing based approach of session weight assignment with the traditional hard computing based approach of small session elimination.
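A compact sketch of the two ingredients described above: a fuzzy membership function that weights a session by its URL count, and a standard fuzzy c-means loop. The ramp thresholds, cluster count and fuzzifier are assumed values, and the combination shown is schematic rather than the paper's exact algorithm.

    import numpy as np

    def session_weight(n_urls, low=3, high=10):
        # Fuzzy membership for session significance: 0 below `low` URLs,
        # 1 above `high`, linear ramp in between (assumed shape/thresholds).
        return np.clip((n_urls - low) / (high - low), 0.0, 1.0)

    def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
        # Standard FCM: alternate fuzzy-membership and centroid updates.
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(c), size=len(X))
        for _ in range(iters):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-10
            U = 1.0 / d ** (2.0 / (m - 1.0))
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    sessions = np.random.default_rng(2).random((200, 8))   # stand-in session vectors
    print(session_weight(np.array([2, 5, 12])))            # -> [0.  0.2857  1.]
    centers, U = fuzzy_c_means(sessions)
    print(centers.shape, U.shape)                          # (3, 8) (200, 3)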

.

Keywords- web usage mining; data preprocessing; fuzzy clustering; knowledge discovery

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

12. Paper 31051168: Inception of Hybrid Wavelet Transform using Two Orthogonal Transforms and It’s use for Image Compression (pp. 80-87)

Full Text: PDF

.

Dr. H.B. Kekre, Senior Professor, Computer Engineering Department, SVKM’s NMIMS (Deemed-to-be University), Vile Parle(W), Mumbai, India.

Dr. Tanuja K. Sarode, Assistant Professor, Computer Engineering Department, Thadomal Shahani Engineering College, Bandra(W), Mumbai, India.

Sudeep D. Thepade, Associate Professor, Computer Engineering Department, SVKM’s NMIMS (Deemed-to-be University), Vile Parle(W), Mumbai, India

.

Abstract — The paper presents a novel hybrid wavelet transform generation technique using two orthogonal transforms. The orthogonal transforms are used to analyze the global properties of the data in the frequency domain. To study the local properties of the signal, the concept of the wavelet transform is introduced, where the mother wavelet function gives the global properties of the signal and the wavelet basis functions, which are compressed versions of the mother wavelet, are used to study its local properties. The wavelets of some orthogonal transforms extract the global characteristics of the data better, while others may capture the local characteristics better. The idea of the hybrid wavelet transform arises from combining the traits of two different orthogonal transform wavelets to exploit the strengths of both. The paper proves the worth of hybrid wavelet transforms for image compression, which can further be extended to other image processing applications such as steganography, biometric identification, content-based image retrieval etc. Here the hybrid wavelet transforms are generated using four orthogonal transforms, namely the Discrete Cosine Transform (DCT), Discrete Hartley Transform (DHT), Discrete Walsh Transform (DWT) and Discrete Kekre Transform (DKT). The hybrid wavelet transforms are also compared with the original orthogonal transforms and their wavelet transforms. The experimental results show that the transform wavelets give better image compression quality than the respective original orthogonal transforms, and that the hybrid transform wavelets perform best. The hybrid of DCT and DKT gives the best results among the combinations of the four image transforms used for generating hybrid wavelet transforms.

.

Keywords-Orthogonal transform; Wavelet transform; Hybrid Wavelet transform; Compression.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

13. Paper 31051177: A Model for the Controlled Development of Software Complexity Impacts (pp. 88-93)

Full Text: PDF

.

Ghazal Keshavarz, Computer department, Science and Research Branch, Islamic Azad University, Tehran, Iran

Nasser Modiri, Computer department, Islamic Azad University, Zanjan, Iran

Mirmohsen Pedram, Computer department, Tarbiat Mollem University, Karaj, Iran

.

Abstract — Several studies have shown that software complexity affects different features of software, the most important being productivity, quality and maintainability. Thus, measuring and controlling complexity has an important influence on improving these features. So far, most proposed approaches to measuring and controlling complexity operate in the design and coding phases and are based mainly on code and cognitive methods; but measuring and controlling complexity in these phases is too late. In this paper, with emphasis on the requirements engineering process, we analyze the factors affecting complexity in the early stages of the software life cycle and present a model. This model enables software engineers to identify the sources of complexity that are the origin of many costs in later phases (especially the maintenance phase) and to prevent error propagation. We also specify the relationship between software complexity and important features of software, namely quality, productivity and maintainability, and present a model for it as well.

.

Keywords- Requirement Engineering, Software Complexity, Software Quality

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

14. Paper 31051178: A Hierarchical Overlay Design for Peer to Peer and SIP Integration (pp. 94-99)

Full Text: PDF

.

Md. Safiqul Islam & Syed Ashiqur Rahman, Computer Science and Engineering Department, Daffodil International University, Dhaka, Bangladesh

Rezwan Ahmed, American International University – Bangladesh, Dhaka, Bangladesh

Mahmudul Hasan, Computer Science and Engineering Department, Daffodil International University, Dhaka, Bangladesh

.

Abstract — Peer-to-Peer Session Initiation Protocol (P2PSIP) is the upcoming migration from the traditional client-server based SIP system. A traditional centralized server-based SIP system is vulnerable to several problems, such as performance bottlenecks and a single point of failure. Integrating a Peer-to-Peer (P2P) system with the Session Initiation Protocol (SIP) will therefore improve the performance of a conventional SIP system, because a P2P system is highly scalable, robust, and fault tolerant owing to its decentralized nature and the self-organization of the network. However, the P2PSIP architecture faces several challenges, including trustworthiness of peers, resource lookup delay, Network Address Translation (NAT) traversal, etc. This paper focuses on understanding the needs of P2P and SIP integration. It also reviews the existing approaches to identify their advantages and shortcomings. Based on the existing approaches, it proposes a layered architecture to address the major challenges introduced by P2PSIP.

.

Keyword:

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

15. Paper 31051199: Evaluation of CPU Consumption, Memory Utilization and Transfer Time Between Virtual Machines in a Network Using HTTP and FTP Techniques (pp. 100-105)

Full Text: PDF

.

Igli TAFA, Elinda KAJO, Elma ZANAJ, Ariana BEJLERI, Aleksandër XHUVANI

Polytechnic University of Tirana, Information Technology Faculty, Computer Engineering Department, Tiranë, Albania

.

Abstract - In this paper we evaluate the transfer time, memory utilization and CPU consumption between virtual machines in a network using FTP and HTTP benchmarks. As the virtualization platform for running the benchmarks we used the Xen hypervisor in para-virtualization mode. Virtual machine technology offers benefits such as live migration, fault tolerance, security, resource management etc. The experiments performed show that virtual machines above the hypervisor consume more CPU and memory and have longer transfer times than a non-virtualized environment.

.

Keywords: Transfer Time, Memory Utilization, CPU Consumption, Virtual Machines, Xen Hypervisor.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

16. Paper 17051103: A Proposal for Common Vulnerability Classification Scheme Based on Analysis of Taxonomic Features in Vulnerability Databases (pp. 106-111)

Full Text: PDF

.

Anshu Tripathi, Department of Information Technology, Mahakal Institute of Technology, Ujjain, India

Umesh Kumar Singh, Institute of Computer Science, Vikram University, Ujjain, India

.

Abstract — A proper vulnerability classification scheme aids the system security evaluation process. Many vulnerability classification schemes exist, but a standard classification scheme is lacking. The focus of this work is to devise a common classification scheme by effectively combining characteristics derived from the classification schemes of prominent vulnerability databases. In order to identify a balanced set of characteristics for the proposed scheme, a comparative analysis of the existing classification schemes was performed on five major vulnerability databases. A set of taxonomic features and classes was extracted as a result of this analysis. A common vulnerability classification scheme is then proposed by harmonizing the extracted taxonomic features and classes. A mapping of the proposed scheme to the existing classification schemes is also presented, to eliminate inconsistencies across the selected set of databases.

.

Keywords- Vulnerability; Classification scheme; Vulnerability databases; Taxonomy; Security evaluation.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

17. Paper 19051108: Abrupt Change Detection of Fault in Power System Using Independent Component Analysis (pp. 112-118)

Full Text: PDF

.

Satyabrata Das, Asstt Prof., Department of CSE, College of Engineering Bhubaneswar, Orissa, India-751024

Soumya Ranjan Mohanty, Asstt Prof., Department of EE, Motilal Nehru National Institute of Technology, Allahabad, India-211004

Sabyasachi Pattnaik, Prof., Department of I&CT, Fakir Mohan University, Balasore, India -756019

.

Abstract — This paper proposes a novel approach for fault detection in a power system based on Independent Component Analysis (ICA). The index for detecting a fault is derived from the independent components of faulty current samples. The proposed approach is tested on simulated data obtained from MATLAB/Simulink for a typical power system and is compared with existing approaches in the literature for fault detection in time-series data. The comparison demonstrates the accuracy and consistency of the proposed approach under the considered changing conditions of a typical power system. By virtue of its accuracy and consistency, the proposed approach can also be used in real-time applications.
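The sketch below illustrates the general idea on synthetic data and is not the authors' index: FastICA separates simulated three-phase currents containing an abrupt fault-like change, and a simple windowed-energy statistic over the components flags the change point. The signal model, window length and detection statistic are all assumptions.

    import numpy as np
    from sklearn.decomposition import FastICA

    # Synthetic three-phase 50 Hz currents with an abrupt change at sample 600,
    # standing in for MATLAB/Simulink fault data.
    rng = np.random.default_rng(3)
    t = np.arange(1200) / 1000.0
    currents = np.stack([np.sin(2 * np.pi * 50 * t + p) for p in (0.0, 2.1, 4.2)], axis=1)
    currents[600:] *= 4.0                                   # fault inception
    currents += 0.05 * rng.standard_normal(currents.shape)

    ica = FastICA(n_components=3, random_state=0)
    sources = ica.fit_transform(currents)                   # independent components

    # Assumed detection index: moving-average energy of the components.
    energy = np.convolve((sources ** 2).sum(axis=1), np.ones(40) / 40, mode="same")
    print("abrupt change detected near sample:", int(np.argmax(np.diff(energy))))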

.

Index Terms— Digital relaying, distance relay, fault detection, independent component analysis.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

18. Paper 19051113: Modeling and Analyzing the Deep Web: Surfacing Hidden Value (pp. 119-124)

Full Text: PDF

.

Suneet Kumar, Associate Professor; Computer Science Dept., Dehradun Institute of Technology, Dehradun, India

Anuj Kumar Yadav, Assistant Professor; Computer Science Dept., Dehradun Institute of Technology, Dehradun, India

Rakesh Bharati, Assistant Professor; Computer Science Dept., Dehradun Institute of Technology, Dehradun, India

Rani Choudhary, Sr. Lecturer; Computer Science Dept., BBDIT, Ghaziabad, India

.

Abstract — Focused web crawlers have recently emerged as an alternative to the well-established web search engines. While the well-known focused crawlers retrieve relevant web pages, various applications target whole websites instead of single web pages. For example, companies are represented by websites, not by individual web pages. To answer queries targeted at websites, web directories are an established solution. In this paper, we introduce a novel focused website crawler that employs the paradigm of focused crawling for the search of relevant websites. The proposed crawler is based on a two-level architecture, with corresponding crawl strategies and an explicit concept of websites. The external crawler views the web as a graph of linked websites, selects the websites to be examined next and invokes internal crawlers. Each internal crawler views the web pages of a single given website and performs focused (page) crawling within that website. Our experimental evaluation demonstrates that the proposed focused website crawler clearly outperforms previous methods of focused crawling which were adapted to retrieve websites instead of single web pages.
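A schematic sketch of the two-level strategy described above, under stated assumptions: `is_relevant_page` and `fetch_links` are hypothetical callables supplied by the caller (a relevance classifier and an HTML link extractor), and the site-selection policy here is plain FIFO rather than the paper's ranking.

    from collections import deque
    from urllib.parse import urlparse

    def site_of(url):
        return urlparse(url).netloc

    def focused_website_crawl(seeds, is_relevant_page, fetch_links, page_budget=200):
        entry = {}                                   # site -> one entry URL
        for u in seeds:
            entry.setdefault(site_of(u), u)
        site_queue, visited_sites, results = deque(entry), set(), []
        while site_queue and page_budget > 0:
            site = site_queue.popleft()              # external level: pick a website
            if site in visited_sites:
                continue
            visited_sites.add(site)
            frontier, seen, hits = deque([entry[site]]), set(), 0
            while frontier and page_budget > 0:      # internal level: crawl this site
                url = frontier.popleft()
                if url in seen:
                    continue
                seen.add(url)
                page_budget -= 1
                if is_relevant_page(url):
                    hits += 1
                for link in fetch_links(url):
                    s = site_of(link)
                    if s == site:
                        frontier.append(link)        # stay inside this website
                    else:
                        entry.setdefault(s, link)    # hand new site to external level
                        site_queue.append(s)
            if hits:
                results.append((site, hits))
        return results

    # Hypothetical usage with stub callables (no network access):
    print(focused_website_crawl(["http://example.com/"],
                                lambda url: "example" in url,   # stub classifier
                                lambda url: []))                # stub link extractor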

.

Keywords- Deep Web; link references; searchable databases; site page-views.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

19. Paper 24051122: Instigation of Orthogonal Wavelet Transforms using Walsh, Cosine, Hartley, Kekre Transforms and their use in Image Compression (pp. 125-133)

Full Text: PDF

.

Dr. H. B. Kekre, Sr. Professor, MPSTME, SVKM’s NMIMS (Deemed-to-be University), Vileparle (W), Mumbai-56, India.

Dr. Tanuja K. Sarode, Asst. Professor, Thadomal Shahani Engg. College, Bandra (W), Mumbai-50, India.

Sudeep D. Thepade, Associate Professor, MPSTME, SVKM’s NMIMS (Deemed-to-be University), Vileparle (W), Mumbai-56, India.

Ms. Sonal Shroff, Lecturer, Thadomal Shahani Engg. College Bandra (W), Mumbai-50, India

.

Abstract — In this paper a novel orthogonal wavelet transform generation method is proposed. To assess the advantage of wavelet transforms over the respective orthogonal transforms in image compression, the generated wavelet transforms are applied to color images of size 256x256x3 on each of the R, G, and B color planes separately, giving the transformed R, G, and B planes. From each of these transformed color planes, 70% to 95% of the data (in the form of the coefficients with lower energy values) is removed and the image is reconstructed. The orthogonal transforms Discrete Cosine Transform (DCT), Walsh Transform, Hartley Transform and Kekre Transform are used for the generation of DCT wavelets, Walsh wavelets, Hartley wavelets, and Kekre wavelets respectively. From the results it is observed that each wavelet transform outperforms its original orthogonal transform.
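The compress-by-discarding-coefficients experiment is easy to reproduce in outline. The sketch below uses a 2-D DCT as a stand-in for any of the four transforms, zeroes all but the highest-magnitude 10% of coefficients, and reconstructs; the test image is synthetic and the energy criterion (a magnitude quantile) is an assumption.

    import numpy as np
    from scipy.fft import dctn, idctn

    def compress_plane(plane, keep=0.10):
        # Transform, zero the (1 - keep) fraction of lowest-magnitude
        # coefficients, and inverse-transform.
        coeffs = dctn(plane, norm="ortho")
        thresh = np.quantile(np.abs(coeffs), 1.0 - keep)
        coeffs[np.abs(coeffs) < thresh] = 0.0
        return idctn(coeffs, norm="ortho")

    # Smooth synthetic stand-in for one 256x256 color plane.
    u = np.linspace(0, 3 * np.pi, 256)
    plane = np.add.outer(np.sin(u), np.cos(u))
    recon = compress_plane(plane, keep=0.10)       # discard ~90% of coefficients
    print("MSE after 90% coefficient removal:", np.mean((plane - recon) ** 2))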

.

Keywords:

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

20. Paper 25051125: Analysing Assorted Window Sizes with LBG and KPE Codebook Generation Techniques for Grayscale Image Colorization (pp. 134-138)

Full Text: PDF

.

Dr. H. B. Kekre, Sr. Professor, MPSTME, SVKM’s NMIMS (Deemed-to-be University), Vileparle (W), Mumbai-56, India.

Dr. Tanuja K. Sarode, Asst. Professor, Thadomal Shahani Engg. College, Bandra (W), Mumbai-50, India.

Sudeep D. Thepade, Associate Professor, MPSTME, SVKM’s NMIMS (Deemed-to-be University), Vileparle (W), Mumbai-56, India.

Ms. Supriya Kamoji, Sr. Lecturer, Fr. Conceicao Rodrigues College of Engg, Bandra (W), Mumbai-50, India

.

Abstract — This paper presents the use of assorted window sizes and their impact on the colorization of grayscale images using Vector Quantization (VQ) codebook generation techniques. The problem of coloring a grayscale image has no exact solution. An attempt is made to minimize the human effort needed in manually coloring grayscale images; human interaction is needed only to find a reference image of a similar type. The job of transferring color from the reference image to the grayscale image is done by the proposed techniques. The vector quantization algorithms Linde-Buzo-Gray (LBG) and Kekre Proportionate Error (KPE) are used to generate the color palette in the RGB and Kekre's LUV color spaces. For colorization, the source color image is taken as the reference image and divided into non-overlapping pixel windows; the initial clusters formed using the VQ algorithms LBG and KPE are used to generate the color palette. The grayscale image to be colored is also divided into non-overlapping pixel windows, and every pixel window of the gray image is compared with the color palette to get the nearest color values, with the best match found using least mean squared error. To test the performance of these algorithms, a color image is converted into a grayscale image and the same grayscale image is recolored back; finally the MSE between the recolored image and the original image is computed. The experiment is conducted in both the RGB and Kekre's LUV color spaces for pixel windows of size 1x2, 2x1, 2x2, 2x3, 3x2, 3x3, 1x3, 3x1, 2x4, 4x2, 1x4 and 4x1. Kekre's LUV color space gives outstanding performance, and among the different window sizes, KPE with a 1x2 pixel window and LBG with a 2x1 pixel window perform well with respect to image quality.
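A sketch of the matching step described above. The palette here is a random codebook, whereas the paper builds it with LBG/KPE, and a per-window channel average stands in for the gray representation of each codevector; both are assumptions for illustration.

    import numpy as np

    def colorize(gray, palette_gray, palette_color, win=(1, 2)):
        # Match each non-overlapping gray pixel window to the palette entry
        # with least squared error and copy that entry's color window.
        h, w = gray.shape
        wh, ww = win
        out = np.zeros((h, w, 3))
        for i in range(0, h - wh + 1, wh):
            for j in range(0, w - ww + 1, ww):
                block = gray[i:i+wh, j:j+ww].ravel()
                k = np.argmin(((palette_gray - block) ** 2).sum(axis=1))
                out[i:i+wh, j:j+ww] = palette_color[k].reshape(wh, ww, 3)
        return out

    rng = np.random.default_rng(5)
    palette_color = rng.random((256, 1 * 2 * 3))        # 256 codevectors, 1x2 RGB windows
    palette_gray = palette_color.reshape(256, 2, 3).mean(axis=2)
    gray = rng.random((64, 64))
    print(colorize(gray, palette_gray, palette_color).shape)   # (64, 64, 3)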

.

Keywords- Colorization, Pixel Window, Color Palette, Vector Quantization(VQ) , LBG, KPE.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

21. Paper 27051132: Evolving Fuzzy Classification Systems from Numerical Data (pp. 139-147)

Full Text: PDF

.

Pardeep Sandhu, Maharishi Markandeshwar University, Mullana, Haryana, India

Shakti Kumar, Institute of Science and Technology, Klawad, Haryana, India

Himanshu Sharma, Maharishi Markandeshwar University, Mullana, Haryana, India

Parvinder Bhalla, Institute of Science and Technology, Klawad, Haryana, India

.

Abstract — Fuzzy classifiers are an important class of fuzzy systems, and evolving fuzzy classifiers from numerical data has assumed a lot of significance in the recent past. This paper proposes a method of evolving fuzzy classifiers using a three-step approach. In the first step, we apply a modified Fuzzy C-Means clustering technique to generate membership functions. In the second step, we generate the rule base using the Wang and Mendel algorithm. The third step reduces the size of the generated rule base, successfully tackling the rule explosion issue. The proposed method was implemented using MATLAB and tested on four well-known multidimensional benchmark classification data sets: Iris, Wine, Glass and Pima Indian Diabetes. The performance of the proposed method was very encouraging. We further applied our algorithm to a Mamdani-type control model for a quick fuzzy battery charger data set; this integrated approach was able to evolve the model quickly.

.

Keywords — Linguistic rules, Fuzzy classifier, Fuzzy logic, Rule base.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

22. Paper 27051134: A Low-Power CMOS Implementation of a Cellular Neural Network for Connected Component Detection (pp. 148-152)

Full Text: PDF

.

S. El-Din, A.K. Abol El-Seoud, and A. El-Fahar

Electrical Engineering Department, University of Alexandria, Alex, Egypt.

M. El-Sayed Ragab, School of Electronics, Comm. and Computer Eng., E-JUST. , Alexandria, Egypt.

.

Abstract - In this paper, we describe an analog VLSI implementation of a Cellular Neural Network (CNN) for Connected Component Detector (CCD) applications. In this implementation, a novel compact network architecture based on a low-power CMOS realization is employed. The functionality of the proposed network has been verified through SPICE simulations for 1-D vectors of arbitrary black-and-white pixels.

.

Keywords: Cellular Neural Network, Low-power CMOS, Connected Component Detector.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

23. Paper 30041170: An Integrated Framework for Content Based Image Retrieval (pp. 153-157)

Full Text: PDF

.

Ritika Hirwane, SOIT, RGPV, Bhopal

Prof. Nishchol Mishra, SOIT, RGPV, Bhopal

.

Abstract — Content-based image retrieval (CBIR) is an important research area concerned with retrieving images from large databases. The extraction of invariant features is the basis of CBIR; color, texture, shape and spatial information have been important image descriptors in content-based image retrieval systems. This paper presents a framework that combines three features, i.e. color, texture and shape, to accomplish higher retrieval efficiency using the Dempster-Shafer theory of evidence (DST). The main aim of evidence theory is to represent and handle uncertain information, and an important property of this theory is its ability to merge different data sources in order to improve the quality of information retrieval. With Dempster's rule integrating the color, shape and texture analysis in image retrieval, the accuracy is much higher than when the techniques are used separately.
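Dempster's rule of combination, the fusion step named above, is compact enough to sketch directly. The two mass functions below are hypothetical evidence from a color channel and a texture channel over a two-image frame of discernment; the feature extraction itself is not shown.

    from itertools import product

    def dempster_combine(m1, m2):
        # Dempster's rule: multiply masses of intersecting focal elements,
        # discard conflicting pairs, renormalize by (1 - conflict).
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Hypothetical evidence about which database image matches the query.
    color_evidence = {frozenset({"img1"}): 0.6, frozenset({"img1", "img2"}): 0.4}
    texture_evidence = {frozenset({"img1"}): 0.5, frozenset({"img2"}): 0.3,
                        frozenset({"img1", "img2"}): 0.2}
    print(dempster_combine(color_evidence, texture_evidence))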

.

Keywords- Belief function, Dempster-Shafer theory, Evidence theory, Feature extraction, Probabilities

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

24. Paper 30051135: A Novel Approach for Intranet Mailing For Providing User Authentication (pp. 158-163)

Full Text: PDF

.

ASN Chakravarthy, Sri Sai Aditya Institute of Science & Technology, Suram Palem, E.G. Dist., Andhra Pradesh, India

A.S.S.D. Toyaza, Sri Sai Aditya Institute of Science & Technology, Suram Palem, E.G. Dist., Andhra Pradesh, India

.

Abstract - With the explosion of the public Internet and e-commerce, private computers and computer networks, if not adequately secured, are increasingly vulnerable to damaging attacks. Hackers, viruses, vindictive employees and even human error all represent clear and present dangers to networks. Various antidotes, inextricably linked with these security issues, are cryptography, authentication, integrity and non-repudiation, key distribution and certification, and access control implemented through firewalls. The main idea of this paper is to overcome PGP's (Pretty Good Privacy) main limitation of an incomplete non-repudiation service. The proposed scheme increases the security and efficiency of e-mail communication through Non-Repudiation of Receipt (NRR), retains PGP's original feature of Non-Repudiation of Origin (NRO), and thereby assures the new security service of Mutual Non-Repudiation (MNR).

.

Keywords: PGP, EPGP, Non-Repudiation, NRO, NRR, MNR, Security.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

25. Paper 31011189: Visualization of Fluid Flow Patterns in Horizontal Circular Pipe Ducts (pp. 164-170)

Full Text: PDF

.

Olagunju, Mukaila, Department of Computer Science, Kwara State Polytechnic, Ilorin, Nigeria

Taiwo, O. A (Ph.D), Department of Mathematics, University of Ilorin, Nigeria.

.

Abstract — This paper develops a visualization model for determining frictional head loss in circular pipe ducts. Head loss arises from friction when liquids or gases come into contact with the pipe wall. To determine the loss in each duct, a modified Hagen equation was used in the visualization. A framework was developed consisting of data generation, data enrichment, data rendering, visualization development and output representation stages. Based on this model, a MATLAB program was used to determine the head loss due to pressure drop and to present it in tabular and 2-D form. The model greatly assists learners and instructors in determining flow patterns, especially the head loss of a fluid along the pipe wall at different points or ducts, and serves as a reusable tool for identifying the region along the pipe wall where the head loss is greatest.
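The paper's modified Hagen equation is not reproduced in the abstract, so as a stand-in illustration the sketch below computes head loss with the standard Darcy-Weisbach formula h_f = f (L/D) v^2 / (2g); the friction factor and pipe dimensions are assumed values.

    import math

    def head_loss(length_m, diameter_m, velocity_ms, friction_factor=0.02, g=9.81):
        # Darcy-Weisbach frictional head loss (metres of fluid column).
        return friction_factor * (length_m / diameter_m) * velocity_ms ** 2 / (2 * g)

    for v in (0.5, 1.0, 2.0):                     # flow velocities in m/s
        print(f"v={v:.1f} m/s -> h_f={head_loss(100.0, 0.1, v):.3f} m")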

.

Keywords: Step Wise Visualization, Patterns, circular, pipe ducts, fluid flow.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

26. Paper 31031183: Iris Image Pre-Processing and Minutiae Points Extraction (pp. 171-174)

Full Text: PDF

.

Archana R. C., J. Naveenkumar, Prof. Dr. Suhas H. Patil

Computer Engineering, VDUCOE, Pune, Maharashtra, India

.

Abstract — An efficient method for personal identification based on the pattern of the human iris is proposed in this paper. Crypto-biometrics is an emerging architecture in which cryptography and biometrics are merged to achieve high-level security systems. Iris recognition is a method of biometric authentication that uses pattern-recognition techniques based on high-resolution images of the irides of an individual's eyes. Here we discuss recognizing the iris and storing the pattern of the recognized iris.
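A minimal localization sketch using the two techniques named in the keywords, Canny edge detection and the circular Hough transform (via OpenCV). The file name is a placeholder, and the thresholds and radius range are assumed values, not the paper's parameters.

    import cv2
    import numpy as np

    eye = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)   # placeholder file name
    eye = cv2.medianBlur(eye, 5)                        # suppress noise before edges
    edges = cv2.Canny(eye, 50, 150)                     # explicit Canny edge map

    # HoughCircles runs its own internal Canny (param1 = upper threshold).
    circles = cv2.HoughCircles(eye, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=eye.shape[0] // 2,
                               param1=150, param2=30, minRadius=20, maxRadius=80)
    if circles is not None:
        x, y, r = np.round(circles[0, 0]).astype(int)
        print(f"iris candidate: centre=({x},{y}), radius={r}")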

.

Keywords- Biometrics, Cryptography, pattern recognition, Canny edge detection, Hough transform

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

27. Paper 31051142: Establishing Relationships among Chidamber And Kemerer’s Suite of Metrics using Field Experiments (pp. 175-182)

Full Text: PDF

.

Ezekiel U Okike, Department of computer Science, University of Ibadan, Ibadan, Nigeria

Adenike O. Osofisan, Department of computer Science, University of Ibadan, Ibadan, Nigeria

.

Abstract — The Chidamber and Kemerer suite of metrics was used in a study of 3250 classes from five Java-based industrial systems. System 1 had 34 classes, System 2 had 318 classes, System 3 had 383 classes, System 4 had 1055 classes, and System 5 had 1460 classes. Metric values were computed for Lack of Cohesion in Methods (LCOM), Coupling Between Object Classes (CBO), Response For a Class (RFC), Number of Children (NOC), Depth of Inheritance (DIT), Weighted Methods per Class (WMC), Number of Public Methods (NPM) and Afferent Coupling (CA). LCOM was used as the cohesion variable; CBO, RFC and CA as coupling variables; WMC, NPM and NOC as size variables; and DIT as the inheritance variable. Descriptive statistics and correlation analysis were used to analyze the results. Hypothesis tests examined whether there were significant relationships between cohesion and coupling variables, cohesion and size variables, cohesion and inheritance variables, coupling and inheritance variables, coupling and size variables, and inheritance and size variables. The results showed that in System 1 there were strong significant relationships between cohesion and coupling (CBO, RFC), cohesion and size (WMC, NPM), and coupling and size (WMC, NPM), and a weak relationship between afferent coupling (CA) and inheritance (DIT). In System 2 there were strong significant relationships between cohesion and coupling (CBO, RFC), cohesion and size (WMC, NPM), and coupling and size (WMC, NPM); a low relationship between coupling by CA and WMC, NOC; and a weak relationship between inheritance (DIT) and WMC, RFC, NPM. In System 3 there were strong significant relationships between cohesion and coupling (CBO, RFC), cohesion and size (WMC, NPM), and coupling and size (WMC, NPM), and a low relationship for coupling by CA. In System 4 there were strong significant relationships between cohesion and coupling (RFC), cohesion and size (WMC, NPM), and size (WMC) and size (NPM); low relationships between cohesion and coupling (CBO, CA); and weak relationships between inheritance (DIT) and CA, and between inheritance and size (WMC). In System 5 there were strong significant relationships between cohesion and size (WMC, NPM) and coupling and size (WMC, NPM), and low relationships between cohesion and coupling, inheritance and coupling (CBO), and coupling and size.

.

Keywords- software measurement; cohesion; coupling; inheritance; Chidamber and Kemerer metrics

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

28. Paper 31051143: Risk Assessment of Authentication Protocol: Kerberos (pp. 183-187)

Full Text: PDF

.

Pathan Mohd. Shafi, Smt. Kashibai Navale College of Engineering, Pune

Dr. Abdul Sattar, Royal Institute of Technology and Science, R. R. Dist.

Dr. P. Chenna Reddy, JNTU College of Engineering, Pulivendula.

.

Abstract — Kerberos is a well-established authentication system. As new authentication methods arise, incorporating them into Kerberos is desirable. However, extending Kerberos poses challenges due to a lack of source code availability for some implementations and a lengthy standardization process. This proposal presents the important functions, strengths and weaknesses of Kerberos. It details the design issues, limitations and risks associated with Kerberos, and addresses one further aspect: its extensibility. It also outlines Kerberos enhancements for using public-key cryptography, discussing scalability and security needs and how public-key cryptography helps accomplish these goals. Finally, it surveys commercial applications and services that rely heavily on Kerberos. The proposal then turns to pre-authentication: Extensible Pre-Authentication in Kerberos (EPAK) is a Kerberos extension which enables many authentication methods to be loosely coupled with Kerberos, without further modification to Kerberos. Why is pre-authentication necessary? An attacker can impersonate a user by obtaining authentication responses and performing offline dictionary attacks against the encrypted data, a likely attack on Kerberos; pre-authentication thus lowers the possibility of offline password-guessing attacks. We also discuss two prototype examples to illustrate the flexibility of Kerberos using EPAK. EPAK uses a public-key approach, which is very resource-consuming; in the era of wireless communication and mobile devices, lightweight alternatives are needed. Kerberos has grown to become the most widely deployed system for authentication and authorization in modern computer networks. It is currently shipped with all major computer operating systems and is uniquely positioned to become a universal solution to the distributed authentication and authorization problem of communicating parties.

.

Keywords:

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

29. Paper 31051144: TOTN: Development of A Tourism – Specific Ontology For Information Retrieval In Tamilnadu Tourism (pp. 188-193)

Full Text: PDF

.

K. R. Ananthapadmanaban, Research Scholar, Sri Chandrasekarendra Saraswathi Viswa Mahavidyalaya University, Enathur, Kanchipuram-631 561

Dr. S. K. Srivatsa, Senior Professor, St. Joseph's College of Engg., Jeppiaar Nagar, Chennai-600 064

.

Abstract — Tourism is an information business, and electronic tourism is one of the activities that has enjoyed considerable success on the Internet. The constant, fast growth in travel-related information makes it difficult to find, organize, access and maintain the information required by users. E-tourism is a perfect candidate for the Semantic Web, and the success of the Semantic Web depends on ontologies. An ontology is an explicit specification of a conceptualization [1] and provides a description of the domain of interest. This paper focuses on creating a tourism ontology for Tamilnadu tourism in order to improve the process of searching for the perfect tourism package according to the user context.

.

Keywords-component; Ontologies, semantic web, Protégé tool, OWL

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

30. Paper 31051145: Unified Fast Algorithm for Most Commonly used Transforms using Mixed Radix and Kronecker Product (pp. 194-202)

Full Text: PDF

.

Dr. H.B. Kekre, Senior Professor, Department of Computer Science, Mukesh Patel School of Technology Management and Engineering, Mumbai, India

Dr. Tanuja Sarode, Associate Professor, Department of Computer Science, Thadomal Shahani College of Engineering, Mumbai, India

Rekha Vig, Asst. Prof. and Research Scholar, Dept. of Elec. and Telecom., Mukesh Patel School of Technology Management and Engineering, Mumbai, India

.

Abstract — In this paper we present a unified algorithm, with some minor modifications, applicable to most of the commonly used transforms. Many transforms are used in signal and image processing for data compression and many other applications, and many authors have given different algorithms, developed at different points in time, for reducing the complexity to increase the speed of computation. The paper shows how the mixed radix system of counting can be used along with the Kronecker product of matrices, leading to a fast algorithm that reduces the complexity to logarithmic order. The results of using such transforms are shown for both 1-D and 2-D (image) signals, and considerable compression is observed in each case.
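The Kronecker-product structure is what makes such factorizations fast: a transform that factors as A ⊗ B never needs the full matrix, thanks to the identity (A ⊗ B) vec(X) = vec(B X Aᵀ). A small numeric check with random stand-in matrices:

    import numpy as np

    rng = np.random.default_rng(6)
    A, B = rng.random((4, 4)), rng.random((8, 8))
    X = rng.random((8, 4))                  # length-32 signal reshaped to 8x4

    # Full 32x32 matrix-vector product vs. two small matrix products.
    slow = np.kron(A, B) @ X.reshape(-1, order="F")
    fast = (B @ X @ A.T).reshape(-1, order="F")
    print(np.allclose(slow, fast))          # True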

.

Keywords - Orthogonal transforms, Data compression, Fast algorithm, Kronecker product, Decimation in Time, Decimation in Frequency, mixed radix system of counting

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

31. Paper 31051147: A Framework for Identifying Software Vulnerabilities within SDLC Phases (pp. 203-207)

Full Text: PDF

.

Zeinab Moghbel, Department of Computer Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran

Nasser Modiri, Department of Computer Engineering, Zanjan Branch, Islamic Azad University, Zanjan, Iran

.

Abstract — Considering the fast development of software and its growing complexity, security requirements have taken on new dimensions. As software becomes more complex and more widely accessible, ever more creative techniques are devised to attack it and to access or manipulate its data. Therefore, creating a new approach to detecting software vulnerabilities is essential. Various studies have shown that addressing security only in the late development and testing phases, in order to mitigate software vulnerabilities, is time-consuming and complex, and will probably not provide complete security. So, taking the security issue into account from the early phases of software development is essential. In this paper, we propose a framework for identifying software vulnerabilities. In this framework, we use the Common Criteria standard (ISO/IEC 15408) and CVE (Common Vulnerabilities and Exposures) to identify software vulnerabilities in every phase of the software development life cycle. As a result, the process of secure software development is improved, and software with fewer vulnerabilities is produced.

.

Keywords- Software vulnerability; Common Criteria (CC);Common Vulnerabilities and Exposures (CVE); Common Vulnerability Scoring System (CVSS); secure software

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

32. Paper 31051148: Text Clustering Based on Frequent Items Using Zoning and Ranking (pp. 208-214)

Full Text: PDF

.

S. Suneetha, Dr. M. Usha Rani, Department of Computer Science, SPMVV, Tirupati

Yaswanth Kumar.Avulapati, Dept of Computer Science, S.V.University, Tirupati

.

Abstract — In today's information age, there is incredible, nonstop growth in the textual information available in electronic form. This increasing textual data has led to the task of mining useful or interesting frequent itemsets (words/terms) from very large unstructured text databases, and this task still seems quite challenging. The use of such frequent associations for text clustering has received a great deal of attention in research communities, since the mined frequent itemsets reduce the dimensionality of the documents drastically. In this work, an effective approach for text clustering is developed in accordance with the frequent itemsets, providing significant dimensionality reduction. Here, the Apriori algorithm, a well-known method for mining frequent itemsets, is used. Then, a set of non-overlapping partitions is obtained using these frequent itemsets, and the resultant clusters are generated within the partitions for the document collection. An extensive analysis of the frequent item-based text clustering approach is conducted with a real-life text dataset, Reuters-21578. Experimental results for 100 documents of the Reuters-21578 dataset are given, and performance is evaluated with precision, recall and F-measure. The results show that the proposed approach effectively groups the documents into clusters and mostly provides better precision for the dataset taken for experimentation.
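The frequent-itemset mining step is sketched below with a plain Apriori loop over toy "documents" represented as term sets (stand-ins for preprocessed Reuters documents); the minimum support and the candidate-generation shortcut are illustrative choices, not the paper's parameters.

    from itertools import combinations

    def apriori(transactions, min_support=2):
        # Candidates of size k are unions of frequent (k-1)-sets differing in
        # one element; each generation is pruned by support counting.
        transactions = [frozenset(t) for t in transactions]
        items = {i for t in transactions for i in t}
        freq, k_sets = {}, [frozenset({i}) for i in items]
        while k_sets:
            counts = {s: sum(s <= t for t in transactions) for s in k_sets}
            survivors = {s: c for s, c in counts.items() if c >= min_support}
            freq.update(survivors)
            keys = list(survivors)
            k_sets = list({a | b for a, b in combinations(keys, 2)
                           if len(a | b) == len(a) + 1})
        return freq

    docs = [{"oil", "price", "opec"}, {"oil", "opec"}, {"wheat", "price"},
            {"oil", "price"}, {"wheat", "export"}]
    for itemset, support in sorted(apriori(docs).items(), key=lambda kv: -kv[1]):
        print(set(itemset), support)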

.

Keywords— Text Mining, Text Clustering, Text Documents, Frequent Itemsets, Apriori Algorithm, Reuters-21578.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

33. Paper 31051162: Steganography based on Contourlet Transform (pp. 215-220)

Full Text: PDF

.

Sushil Kumar, Department of Mathematics, Rajdhani College, University of Delhi, New Delhi, India

S.K. Muttoo, Department of Computer Science, University of Delhi, Delhi, India

.

Abstract — In this paper we present a steganographic technique based on the Contourlet transform (CTT). The proposed technique uses a self-synchronizing variable-length code, which has been shown to outperform Huffman coding in terms of power energy, to encode the original message. The secret data is then embedded in the high-frequency sub-bands obtained by applying the CTT to the cover image, using a variable LSB method and a thresholding method. The Contourlet transform is well suited to data-hiding applications because it captures more edges, and more data can be hidden in the high-frequency regions without perceptibly distorting the original image. Experimental results show that both the original message and the original image can be recovered accurately from the stego-image. The results are compared with existing steganographic techniques [10-12] based on the Discrete Wavelet Transform (DWT) and the Discrete Slantlet Transform (SLT). SLT is known to be a better candidate for signal compression than DWT-based schemes and can provide better time localization. Experimental results confirm that the CTT-based method gives better imperceptibility and a better embedding rate than the DWT.
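The embedding step itself is easy to illustrate. In the sketch below, a one-level Haar difference stands in for the contourlet transform (no standard contourlet implementation is assumed here), and message bits go into the least significant bits of integer high-frequency coefficients:

```python
# Illustrative sketch only: LSB embedding into transform detail coefficients.
import numpy as np

def haar_rows(x):
    # one averaging/differencing step along rows (stand-in for CTT sub-bands)
    a = (x[:, ::2] + x[:, 1::2]) // 2
    d = x[:, ::2] - x[:, 1::2]
    return a, d

img = np.arange(64, dtype=np.int64).reshape(8, 8)      # toy "cover image"
approx, detail = haar_rows(img)

bits = [1, 0, 1, 1]
flat = detail.flatten()
for i, b in enumerate(bits):                           # LSB embedding
    flat[i] = (flat[i] & ~1) | b
stego_detail = flat.reshape(detail.shape)

recovered = [int(v & 1) for v in stego_detail.flatten()[:len(bits)]]
assert recovered == bits
print("embedded and recovered:", recovered)
```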

.

Keywords- Steganography, DWT, SLT, CTT, LSB, PSNR

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

34. Paper 31051164: A Comparative Study of Proposed Improved PSO Algorithm with Proposed Hybrid Algorithm for Multiprocessor Job Scheduling (pp. 221-228)

Full Text: PDF

.

K. Thanushkodi, Akshaya College of Engineering and Technology Coimbatore, India

K. Deeba, Department of Computer Science and Engineering, Kalaignar Karunanidhi Institute of Technology, Coimbatore, India

.

Abstract — Particle Swarm Optimization (PSO) is currently employed in several optimization and search problems due to its simplicity and its ability to find solutions successfully. A variant of PSO, called Improved PSO, has been developed in this paper and is hybridized with a simulated annealing approach to achieve better solutions. The hybrid technique is employed in order to improve the performance of the Improved PSO. This paper shows the application of the hybrid Improved PSO to scheduling multiprocessor tasks, and a comparative performance study is reported. It is observed that the proposed hybrid approach gives better solutions for multiprocessor job scheduling.
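A minimal sketch of a PSO/simulated-annealing hybrid on a toy objective (this illustrates the general hybridization, not the authors' Improved PSO or their scheduling encoding): standard velocity and position updates, with an SA-style probabilistic acceptance applied to each particle's move.

```python
import math, random

def f(x):                       # toy 1-D objective to minimize
    return (x - 3.0) ** 2

n, iters = 10, 50
w, c1, c2, temp, cool = 0.7, 1.5, 1.5, 1.0, 0.95
xs = [random.uniform(-10, 10) for _ in range(n)]
vs = [0.0] * n
pbest = xs[:]
gbest = min(xs, key=f)

for _ in range(iters):
    for i in range(n):
        vs[i] = (w * vs[i]
                 + c1 * random.random() * (pbest[i] - xs[i])
                 + c2 * random.random() * (gbest - xs[i]))
        cand = xs[i] + vs[i]
        delta = f(cand) - f(xs[i])
        # SA acceptance: always take improvements, sometimes accept worse moves
        if delta < 0 or random.random() < math.exp(-delta / temp):
            xs[i] = cand
        if f(xs[i]) < f(pbest[i]):
            pbest[i] = xs[i]
    gbest = min(pbest, key=f)
    temp *= cool                # cooling schedule shrinks the acceptance window

print("best found:", gbest, "f =", f(gbest))
```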

.

Keywords— PSO, Improved PSO, Simulated Annealing, Hybrid Improved PSO, Job Scheduling.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

35. Paper 31051165: SCAM – Software Component Assessment Model (pp. 229-234)

Full Text: PDF

.

Hasan Tahir, Aasia Khannum, Ruhma Tahir

Department of Computer Engineering, College of Electrical & Mechanical Engineering, National University of Sciences and Technology (NUST), Islamabad, Pakistan

.

Abstract - It is widely understood that component-based development differs from conventional development because components offer accelerated growth. In the absence of an effective component assessment strategy, the developers of a software project have no way of assessing the quality of a software component they are about to incorporate into the project. We present two laws that link software components, software projects and their quality. We further propose a simple software component assessment strategy with which both component developers and component consumers can independently assess their components.

.

Keywords – software component; software component quality; software component assessment

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

36. Paper 31051170: Selected Problems on Mobile Agent Communication (pp. 235-239)

Full Text: PDF

.

Yinka A. Adekunle and Sola S. Maitanmi

Department of Computer Science & Mathematics, Babcock University, Ilisan Remo, Ogun State, Nigeria.

.

Abstract - Mobile agent technology offers a new computing paradigm in which a program, in the form of a software agent, can transfer its execution from agent to agent, masquerading as the original source of a message. The use of mobile code has a long history dating back to the remote job entry systems of the 1960s. Today's agent incarnations can be characterized in a number of ways, ranging from simple distributed objects to highly secured software whose messages can be interpreted only by the sender and the receiver. As the sophistication of mobile software has increased over time, so too have the associated security threats. This paper studies masquerading as one of these threats and provides an appropriate solution in the form of an algorithm.
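One standard countermeasure against masquerading, shown here only as an illustrative sketch (the paper proposes its own encryption/decryption algorithm), is to authenticate each agent message with a shared-key HMAC so a masquerading agent cannot forge the sender's identity without the key:

```python
import hmac, hashlib

SHARED_KEY = b"sender-receiver-secret"        # assumed pre-shared key

def sign(message: bytes) -> bytes:
    # tag computable only by holders of the shared key
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(sign(message), tag)

msg = b"migrate agent to host B"
tag = sign(msg)
assert verify(msg, tag)                             # genuine sender accepted
assert not verify(b"migrate agent to host X", tag)  # masqueraded message rejected
print("authentication check passed")
```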

.

Keywords: Mobile agent, masquerading, encryption and decryption.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

37. Paper 31051175: KGS Based Control for Parking Brake Cable Manufacturing System (pp. 240-248)

Full Text: PDF

.

Geeta Khare, S.S.J.P, Asangaon

Dr. R.S. Prasad, R.R.I.M.T., Lucknow

.

Abstract - In today’s competitive production environment, process industries demand a totally integrated control and optimization solution that can increase productivity, reliability and quality while minimizing cost. Automation is a step beyond mechanization: for the automation of a production plant, either centralized or distributed control systems are used, with standardized approaches and standard hardware and software available worldwide as required. Investment in process control systems is also an important factor in improving product quality and lowering production cost while providing a competitive edge, which has forced industries to improve their process control technology. Various aspects of centralized and distributed digital control, and their pros and cons, have been studied. Here an attempt is made to overcome the disadvantages of the above systems and to improve efficiency as well as profit by addressing every level in detail, from the on-line operator up to the management and supervisor level, in a system named KGS (Kaleidoscopic Governing System). The paper presents the technical viability of the KGS system and shows how it is efficient in various ways at optimizing output, improving profit margin, and reliably predicting output in the manufacture of parking brake cable (PBC), which is used in the automobile sector.

.

Keywords: - Distributed Control systems, KGS, Conduit, Abutment, PBC

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

38. Paper 31051183: Performance Appraise of Assorted Thresholding Methods in CBIR using Block Truncation Coding (pp. 249-255)

Full Text: PDF

.

Dr. H.B. Kekre, Sudeep D. Thepade, Shrikant Sanas

Computer Engineering Department, MPSTME, SVKM’s NMIMS (Deemed-to-be University), Mumbai, India

.

Abstract — This paper proposes various thresholding methods for the generation of image bitmaps used in Block Truncation Coding (BTC), and presents a performance comparison of these thresholding methods for image retrieval using multilevel BTC. The thresholding methods discussed here are Global thresholding, Local thresholding and Intermediate thresholding; the performance of a BTC-based image retrieval method varies with the type of thresholding used for bitmap generation. The proposed variations of BTC-based image retrieval are tested on the extended Wang generic image database of 1000 images spread across 11 categories. For each CBIR (content-based image retrieval) technique, 1000 queries are fired on the image database to compute the average precision and recall for all queries with respect to the number of retrieved images. These values are plotted to obtain the crossover point of precision and recall, which is used as the criterion for performance comparison. The results show a performance improvement (i.e., higher precision-recall crossover values) with the Intermediate BTC-CBIR method. The performance of multilevel Intermediate BTC-CBIR increases gradually with level up to a certain extent (level 3) and then increases only slightly, due to voids created at higher levels. Overall, Intermediate-9 BTC at level 3 gives the best performance.
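For readers unfamiliar with BTC, the sketch below shows bitmap generation with a global (mean) threshold, the simplest of the variants the paper compares; the block values are a toy example.

```python
import numpy as np

block = np.array([[12, 200, 34, 190],
                  [18, 210, 40, 185],
                  [15, 205, 30, 180],
                  [20, 195, 36, 188]], dtype=float)

threshold = block.mean()                 # global threshold
bitmap = block >= threshold              # BTC bitmap: True = above threshold

# Two reconstruction levels: means of the pixels on each side of the threshold
high = block[bitmap].mean()
low = block[~bitmap].mean()
print("threshold:", threshold)
print("bitmap:\n", bitmap.astype(int))
print("levels (low, high):", low, high)
```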

.

Keywords: CBIR, BTC, Multilevel BTC, Thresholding.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

39. Paper 31051193: Performance Analysis of Cryptographic Algorithms Like ElGamal, RSA, and ECC for Routing Protocols in Distributed Sensor Networks (pp. 256-263)

Full Text: PDF

.

Suresha, Department of CSE, Reva Institute of Technology and Management, Bangalore, Karnataka, India

Dr. Nalini N., Prof. and Head, Department of CSE, Nitte Meenakshi Institute of Technology, Bangalore, Karnataka, India

.

Abstract - Distributed Sensor Networks (DSNs) are multihop networks that depend on intermediate nodes to transmit data packets to the destination. These nodes are equipped with limited memory, limited battery power, little computation capability and a small communication range, and they need a secure and efficient routing path for forwarding incoming packets. Sensor nodes are used to collect data in hostile environments, but energy, processing speed and security are major concerns in large-scale deployments. In this paper, two routing protocols (flat and hierarchical) are compared with respect to the energy dissipated in transmitting data, both with and without security features. The cryptographic algorithms ElGamal, RSA and ECC, which provide security features such as confidentiality, are considered for the performance analysis. The proposed model estimates the energy required to provide security features for the routing protocols.
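As a very rough illustration of why key size drives the energy cost of such comparisons (this is an assumed proxy, not the paper's energy model), one can time big-integer modular exponentiation at operand sizes typical of ECC (160-bit) versus RSA (1024-bit) keys; ECC's actual point arithmetic differs, so this only contrasts the big-integer workloads.

```python
import time

def modexp_cost(base, exp, mod, repeats=100):
    # average wall-clock time of one modular exponentiation
    start = time.perf_counter()
    for _ in range(repeats):
        pow(base, exp, mod)
    return (time.perf_counter() - start) / repeats

small = modexp_cost(7, 2**160 - 1, 2**160 - 5)    # ECC-scale operand size
large = modexp_cost(7, 2**1024 - 1, 2**1024 - 5)  # RSA-scale operand size
print(f"160-bit exponentiation:  {small * 1e6:.1f} us")
print(f"1024-bit exponentiation: {large * 1e6:.1f} us")
```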

.

Keywords: Routing Protocols, Energy, Cryptosystems.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

40. Paper 310511100: Exaggerate Self Quotient Image Model For Face Recognition Enlist Subspace Method (pp. 264-269)

Full Text: PDF

.

S. Muruganantham, Assistant professor, S.T.Hindu college, Nagercoil, India -629003

T. Jebarajan, Principal, Kings College of Engineering, Chennai, India - 600105

.

Abstract - The reliability of facial recognition techniques is frequently affected by illumination variation, such as shadows and changes in illumination direction. In this paper, an enhanced self-quotient image (SQI) model for eliminating lighting effects in face recognition is described and discussed in detail. Histogram equalization is adopted to enhance the contrast of the samples, which are then normalized as the output of the enhanced SQI. To apply the enhanced SQI to face recognition, subspace analysis algorithms (PCA, KPCA and ICA) are employed on the normalized samples. Experiments are conducted on the CAS-PEAL face database, where the face samples are preprocessed by the enhanced SQI to improve the performance of the subspace analysis algorithms. Experimental results confirm that this preprocessing makes face recognition robust not only to lighting but also to facial expressions, masking, occlusion, etc.
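The basic self-quotient idea is compact enough to sketch (this is the standard SQI, not the paper's enhanced model; a uniform filter stands in for the weighted Gaussian smoothing, and the toy image is random):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def hist_equalize(img):
    # classic histogram equalization on an 8-bit image
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum() / img.size
    return (cdf[img] * 255).astype(np.uint8)

def self_quotient(img, size=3, eps=1e-6):
    # divide the image by a smoothed copy of itself to suppress
    # slowly varying illumination
    smooth = uniform_filter(img.astype(float), size=size)
    return img / (smooth + eps)

face = (np.random.rand(8, 8) * 255).astype(np.uint8)   # toy "face" image
q = self_quotient(hist_equalize(face))
print(np.round(q, 2))
```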

.

Keywords: Face recognition, Histogram equalization, Subspace analysis, illumination.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

41. Paper 10000000: PHCC: Predictive Hop-by-Hop Congestion Control Protocol for Wireless Sensor Networks (pp. 270-274)

Full Text: PDF

.

Shahram Babaie, Eslam Mohammadi, Saeed Rasouli Heikalabad, Hossein Rasouli

Technical and Engineering Dept., Tabriz Branch, Islamic Azad University, Tabriz, Iran

.

Abstract — In wireless sensor networks (WSNs), congestion may cause packet loss, delay and energy waste due to a large number of packet drops and retransmissions. Congestion in WSNs therefore needs to be controlled to achieve high energy efficiency, prolong system lifetime, improve fairness, and improve quality of service in terms of throughput, packet loss ratio and packet delay. To achieve this objective, a predictive hop-by-hop congestion control (PHCC) algorithm is proposed in this paper. PHCC can predict congestion at a node and adjusts each upstream traffic rate according to node priority to mitigate congestion hop by hop. PHCC introduces a priority-based rate adjustment algorithm that guarantees weighted fairness in multipath-routing WSNs. In addition, PHCC can distribute traffic fairly across the entire network. Simulation results show that the proposed protocol performs more efficiently than previous algorithms.
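A minimal sketch of priority-weighted rate adjustment in the flavour the abstract describes (the split rule and numbers are assumed, not PHCC's actual formula): a parent divides its available forwarding rate among upstream children in proportion to their priorities.

```python
def adjust_rates(parent_rate, child_priorities):
    # weighted-fair split of the parent's forwarding capacity
    total = sum(child_priorities.values())
    return {node: parent_rate * p / total
            for node, p in child_priorities.items()}

# Parent can forward 100 packets/s; children have unequal priorities.
rates = adjust_rates(100.0, {"n1": 1, "n2": 2, "n3": 5})
print(rates)   # {'n1': 12.5, 'n2': 25.0, 'n3': 62.5}
```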

.

Keywords-wireless sensor network; congestion control; predictive; priority-based fairness

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

42. Paper 31051158: Secured Right Angled or Ant Search Protocol for Reducing Congestion Effects and Detecting Malicious Node in Mobile Ad hoc Networks by Multipath Routing (pp. 275-283)

Full Text: PDF

.

Lt. Dr. S Santhosh Baboo, P.G. & Research Dept. of Computer Science, D G Vaishnav College, Arumbakkam, Chennai – 106

V J Chakravarthy, Research Scholar, Dravidian University

.

Abstract – In this paper, we develop a security-based protocol using a Biased Geographical – Ant Search multipath routing approach, which attains confidentiality and authentication of packets in both the routing and link layers of MANETs. In the first phase, we developed a new method for routing packets from source to destination using right-angled geographical routing techniques, with the shortest path found by an ant search method to reduce congestion effects. Secondly, we proposed an on-demand routing protocol called SRAOA (Secured Right Angled or Ant Search). Thirdly, we added security to the proposed protocol using MD5 (a message-digest algorithm), which provides link-level security for packet transmission between source and destination, and which detects and isolates malicious nodes using certificate generation for the nodes in the parent network. In the next phase of the protocol, we used the RSA algorithm for encryption and decryption to provide authentication. The performance of our SRAOA protocol is compared and validated, in the presence of malicious nodes in the simulation environment, against prominent routing protocols for mobile ad hoc networks, namely Ad hoc On-demand Distance Vector (AODV), Ad hoc On-demand Multipath Distance Vector (AOMDV), DSR (Dynamic Source Routing) and DSDV (Destination Sequenced Distance Vector). We chose four performance metrics: average delay, packet delivery ratio, routing load and throughput. We simulated the protocol scheme in NS-2. Simulation results show that SRAOA achieves fair throughput and high packet delivery while attaining low delay and overhead in the presence of malicious nodes.
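The per-hop digest step alone is easy to illustrate (this sketch covers only the MD5 integrity check; the full SRAOA protocol also involves certificates and RSA, and MD5 is shown only because the paper uses it, despite no longer being considered collision-resistant):

```python
import hashlib

def with_digest(payload: bytes):
    # attach an MD5 digest so the next hop can detect in-transit alteration
    return payload, hashlib.md5(payload).hexdigest()

def check(payload: bytes, digest: str) -> bool:
    return hashlib.md5(payload).hexdigest() == digest

pkt, tag = with_digest(b"route-request: S -> D")
assert check(pkt, tag)                           # intact packet accepted
assert not check(b"route-request: S -> M", tag)  # tampered packet rejected
print("per-hop digest check passed")
```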

.

Keywords:

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

43. Paper 31051190: Side Lobe Reduction Of A Planar Array Antenna By Complex Weight Control Using SQP Algorithm And Tchebychev Method (pp. 284-289)

Full Text: PDF

.

A. Hammami, R. Ghayoula, and A. Gharsallah

Unité de recherche: Circuits et systèmes électroniques HF, Faculté des Sciences de Tunis, Campus Universitaire Tunis EL-manar, 2092, Tunisie

.

Abstract — In this paper, we propose an efficient hybrid method, based on the sequential quadratic programming (SQP) algorithm and the Dolph-Tchebychev method, for the pattern synthesis of planar antenna arrays with prescribed pattern nulls in the interference directions and minimum side-lobe level (SLL), controlling only the phase of each array element. The SQP algorithm is among the most widely used methods for solving nonlinear optimization problems; it transforms the nonlinear problem into a sequence of quadratic subproblems by using a quadratic approximation of the Lagrangian function. To illustrate the performance of the proposed method, several examples of complex-excited planar array patterns with one-half-wavelength-spaced isotropic elements were investigated, placing the main beam in the direction of the useful signal while reducing the side-lobe level.
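As a rough illustration of the SQP step only, the sketch below uses SciPy's SLSQP routine (an SQP implementation) for phase-only side-lobe suppression on a small uniform linear array; the planar geometry, the Tchebychev stage and the null constraints of the paper are omitted, and all parameters are assumed.

```python
import numpy as np
from scipy.optimize import minimize

N = 8                                        # elements, half-wavelength spacing
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
u = np.pi * np.sin(angles)                   # k*d*sin(theta) with d = lambda/2
sidelobe = np.abs(angles) > np.radians(15)   # region where we suppress lobes

def pattern(phases):
    steering = np.exp(1j * (np.outer(u, np.arange(N)) + phases))
    return np.abs(steering.sum(axis=1))

def max_sll(phases):
    # objective: worst side-lobe level relative to the pattern peak
    p = pattern(phases)
    return (p[sidelobe] / p.max()).max()

start = 0.1 * np.random.default_rng(0).normal(size=N)
print("initial SLL:", max_sll(start))
res = minimize(max_sll, start, method="SLSQP")
print("optimized SLL:", max_sll(res.x))
```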

.

Keywords— Planar antenna arrays, Synthesis method, null steering, sequential quadratic programming

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

44. Paper 22051119: Image Compression Algorithm- A Review (pp. 290-295)

Full Text: PDF

.

Marcus Karnan, Tamilnadu College of Engineering, Coimbatore, India

M.S.Tamilselvi, Research Scholar, Dept of Computer Science & Engg., ANNA University, Coimbatore, India

.

Abstract - Existing video indexing models are analyzed and a practical approach to optimal video indexing is introduced. It is studied across all phases of the video indexing process, such as segmentation, indexing, database storage, query-based access, browsing and video clip retrieval. The main aim is to parse the video stream easily into meaningful scenes and to maintain them in an effective database with minimal data repetition, efficient query handling and user-friendly browsing capabilities. Conceptual graphs, motion estimation, Mean Absolute Frame Difference, Displaced Frame Difference, Dublin Core and other important existing techniques are utilized in this model. A further aim is to reduce the storage requirements of video clips, without visible loss in quality, by using a predictive video compression technique. Today almost all video clips face a compromise between quality and storage size, and video clips are often not recommended for inclusion in web pages because of their download time and size. Rectifying this, and taking positive measures to convert a video clip into a file similar to *.swf (Flash Player's Shockwave files), is the aim of this presentation.
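Of the measures the review names, Mean Absolute Frame Difference (MAFD) is simple to demonstrate (toy frames and threshold assumed): it serves both as a scene-cut indicator for indexing and as a cheap predictor for inter-frame difference coding.

```python
import numpy as np

prev = np.random.randint(0, 256, (4, 4)).astype(float)   # toy frames
curr = prev.copy()
curr[0, 0] += 40                                         # small local change

mafd = np.abs(curr - prev).mean()                        # Mean Absolute Frame Difference
print("MAFD:", mafd)       # large values suggest a scene cut / new shot
if mafd < 10:
    residual = curr - prev     # difference coding: store only the change
    print("store residual, nonzero entries:", np.count_nonzero(residual))
```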

.

Keywords: Coding, Block Materials, Inter Frame Compress Technique, Sub sampling, Difference Coding

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

45. Paper 31051189: Compensation of Nonlinear Distortion in OFDM Systems Using an Efficient Evaluation Technique (pp. 296-299)

Full Text: PDF

.

Dr. (Mrs.) R. Sukanesh, Professor, Department of ECE, TCE, Madurai – 15, India.

R. Sundaraguru, Research Scholar, Anna University, Chennai-25, India.

.

Abstract — An Orthogonal Frequency Division Multiplexing (OFDM) signal with a large peak-to-average power ratio (PAPR) causes undesirable spectrum re-growth and bit error rate (BER) degradation, both due to the intermodulation products arising in the nonlinear amplifier at the transmitter. This paper proposes a new approach to compensating the nonlinearity introduced by the HPA. By approximating the attenuation coefficient of the HPA model, the distortion is estimated and then subtracted from the received symbol at the receiver. Over several iterations, the estimate of the distortion becomes more accurate and cancels the nonlinear distortion. Simulation results show that the presented scheme compensates the nonlinear distortion in OFDM systems efficiently.
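The general receiver idea (not the paper's exact estimator) can be sketched with a soft-limiter HPA model: decide the symbols, re-create the distortion those decisions would suffer through the HPA, and subtract it from the received symbol, repeating a few times.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
X = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), N)  # QPSK subcarriers
x = np.fft.ifft(X) * np.sqrt(N)                                  # time-domain symbol

def hpa(s, clip=1.0):
    # soft-limiter model of the high power amplifier
    mag = np.abs(s)
    return np.where(mag > clip, clip * s / mag, s)

Y = np.fft.fft(hpa(x)) / np.sqrt(N)          # distorted received symbol

X_hat = Y.copy()
for _ in range(3):                           # iterative distortion cancellation
    decisions = np.sign(X_hat.real) + 1j * np.sign(X_hat.imag)
    x_dec = np.fft.ifft(decisions) * np.sqrt(N)
    distortion = np.fft.fft(hpa(x_dec) - x_dec) / np.sqrt(N)
    X_hat = Y - distortion

final = np.sign(X_hat.real) + 1j * np.sign(X_hat.imag)
print("symbol errors after cancellation:", int(np.count_nonzero(final != X)))
```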

.

Keywords— Orthogonal Frequency division Multiplexing (OFDM), Nonlinear Distortion (NLD), High Power Amplifier (HPA), Bit Error Rate (BER), Peak to Average Power Ratio (PAPR).

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

46. Paper 31051191: Performance Prediction of Single Static String Algorithms on Cluster Configurations (pp. 300-306)

Full Text: PDF

.

Prasad J. C., Research Scholar, Dept. of CSE, Dr. M.G.R. University, Chennai

cum Asst. Professor, Dept of CSE, FISAT, Angamaly, India

K. S. M. Panicker, Professor, Dept of CSE, Federal Institute of Science and Technology [FISAT], Angamaly, India

.

Abstract — This paper studies various factors in the performance of static single-pattern searching algorithms and predicts the search time for the operation in a cluster computing environment. A master-worker model of parallel computation and communication is designed for the searching algorithms using MPI technologies. Prediction and analysis are based on the KMP, BM, BMH, ZT, QS, BR, FS, SSABS, TVBS, ZTBMH and BRFS algorithms. Performance is compared and the results are discussed; both theoretical and practical results are presented. This work includes implementation of the same algorithms in two different cluster architecture environments, Dhakshina-I and Dhakshina-II, which improves the reliability of the prediction method. The results help us predict and implement this technology cost-effectively in areas such as text search, DNA search and web-related fields.
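The core of any such master-worker split is that text chunks must overlap by len(pattern) - 1 characters so no match spanning a boundary is lost. The sketch below shows that logic with plain Python standing in for the paper's MPI workers (chunking scheme assumed for illustration):

```python
def chunk_offsets(text_len, workers, overlap):
    # master: cut the text into worker ranges with boundary overlap
    size = text_len // workers
    for w in range(workers):
        start = w * size
        end = text_len if w == workers - 1 else (w + 1) * size + overlap
        yield start, min(end, text_len)

def worker_search(text, pattern, start, end):
    # each worker scans only its own range
    hits, i = [], text.find(pattern, start, end)
    while i != -1:
        hits.append(i)
        i = text.find(pattern, i + 1, end)
    return hits

text, pattern = "abcabxabcab" * 100, "abcab"
matches = sorted({m for s, e in chunk_offsets(len(text), 4, len(pattern) - 1)
                  for m in worker_search(text, pattern, s, e)})
print("matches found:", len(matches))
```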

.

Keywords- Static String Searching, Beowulf Cluster, Parallel programming

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

47. Paper 31051181: Explicit Solution of Hyperbolic Partial Differential Equations by an Iterative Decomposition Method (pp. 307-309)

Full Text: PDF

.

Adekunle, Y.A., Department of Computer Science and Mathematics, Babcock University, Ilisan-Remo Ogun State, Nigeria

Kadiri, K.O., Department Electrical/Electronics Engineering, Federal Polytechnic, Offa, Kwara State, Nigeria

Odetunde, O.S., Department of Mathematical Science, Olabisi Onabanjo University, Ago-Iwoye, Ogun State,Nigeria

.

Abstract - In this paper, an iterative decomposition method is applied to solve partial differential equations. The solution of a partial differential equation of the hyperbolic form is obtained by the stated method in the form of an infinite series of easily computable terms. Some examples are given and the solutions obtained by the method are found to compare favourably with the known exact solutions.
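The abstract does not reproduce the scheme itself; as a rough illustration only (the model problem and notation below are assumed, not quoted from the paper), a decomposition recursion of this family for a hyperbolic equation reads:

```latex
% Model problem (assumed): u_{tt} = u_{xx} + g(x,t),
% with u(x,0) = f_0(x) and u_t(x,0) = f_1(x).
% Seek u = \sum_{n=0}^{\infty} u_n, built term by term:
\begin{align*}
  u_0(x,t)     &= f_0(x) + t\,f_1(x) + L_t^{-1} g(x,t),\\
  u_{n+1}(x,t) &= L_t^{-1}\!\left(\frac{\partial^2 u_n}{\partial x^2}\right),
  \qquad n \ge 0,
\end{align*}
% where L_t^{-1}(\cdot) = \int_0^t \int_0^s (\cdot)\,d\tau\,ds is the twofold
% time integral; a truncated partial sum serves as the computable solution.
```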

.

Keywords: Hyperbolic partial differential equation, iterative decomposition method, analytical solution.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

48. Paper 31051182: Numerical Approximation Of Generalised Riccati Equations By Iterative Decomposition Method (pp. 310-312)

Full Text: PDF

.

Adekunle, Y.A., Department of Computer Science and Mathematics, Babcock University, Ilisan-Remo Ogun State, Nigeria

Kadiri, K.O., Department Electrical/Electronics Engineering, Federal Polytechnic, Offa, Kwara State, Nigeria

Odetunde, O.S., Department of Mathematical Science, Olabisi Onabanjo University, Ago-Iwoye, Ogun State,Nigeria

.

Abstract - In this paper, the iterative decomposition method is applied to solve generalized Riccati equations. We consider equations with variable coefficients as well as those with constant coefficients. The method presents solutions as easily computable, rapidly convergent infinite series, requiring no discretization. Examples are presented to establish the accuracy and efficiency of the method.
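For the Riccati case specifically, one computable recursion of this family looks as follows (form and notation assumed for illustration, not taken from the paper):

```latex
% Model problem (assumed): the generalized Riccati equation
%   y'(t) = p(t) + q(t)\,y(t) + r(t)\,y(t)^2, \qquad y(0) = y_0.
% With y = \sum_{n=0}^{\infty} y_n, the terms are generated by:
\begin{align*}
  y_0(t)     &= y_0 + \int_0^t p(s)\,ds,\\
  y_{n+1}(t) &= \int_0^t \big( q(s)\,y_n(s) + A_n(s) \big)\,ds,
  \qquad n \ge 0,
\end{align*}
% where the A_n expand the nonlinearity r\,y^2 polynomially
% (A_0 = r\,y_0^2,\; A_1 = 2\,r\,y_0\,y_1,\;\dots); partial sums approximate y.
```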

.

Keyword: Generalised Riccati Equations, Iterative Decomposition Method.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

49. Paper 26051128: Automated Face Detection and Feature Extraction Using Color FERET Image Database (pp. 313-318)

Full Text: PDF

.

Dewi Agushinta R.(1), Fitria Handayani S. (2)

(1) Information System, (2) Informatics

Gunadarma University, Jl. Margonda Raya 100 Pondok Cina, Depok 16424, Indonesia

.

Abstract — Detecting the locations of human faces and then extracting the facial features in an image is an important capability with a wide range of applications, such as human face recognition, surveillance systems, human-computer interfacing and biometric identification. Face detection methods and facial feature extraction methods have each been reported by researchers as separate processes in the field of face recognition. They need to be connected by adapting the face detection results to serve as the input face for the extraction process, by tuning the minimum face size produced by the detection process and the face cropping step of the extraction process. The identification and recognition of human face features developed in this research combines the face feature detection and extraction processes on 150 frontal single still face images from the color FERET facial image database, with additional extracted face features and face feature distances.
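A minimal sketch of the detect-then-crop handoff the abstract describes, using OpenCV's Haar cascade as a stand-in detector (not the authors' method); the minSize argument plays the role of the minimum-face-size tuning mentioned above, and the file paths are placeholders.

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
img = cv2.imread("face_sample.jpg")            # placeholder image path
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1,
                                 minNeighbors=5, minSize=(60, 60))
for i, (x, y, w, h) in enumerate(faces):
    crop = img[y:y + h, x:x + w]               # cropped input to feature extraction
    cv2.imwrite(f"face_{i}.png", crop)
print(f"{len(faces)} face(s) cropped for the extraction stage")
```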

.

Keywords- face detection, face extraction, face features, face recognition, feret

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

50. Paper 31051186: Simplified Neural Network Design for Hand Written Digit Recognition (pp. 319-322)

Full Text: PDF

.

Muhammad Zubair Asghar (1), Hussain Ahmad (1), Shakeel Ahmad (1), Sheikh Muhammad Saqib (1), Bashir Ahmad (1) and Muhammad Junaid Asghar (2)

(1) Institute of Computing and Information Technology Gomal University, D.I.Khan, Pakistan

(2) Faculty of Pharmacy, Gomal University, D.I.Khan, Pakistan

.

Abstract - A neural network is an abstraction of the central nervous system and works as a parallel processing system. Optimization, image processing, diagnosis and many other applications become very simple through neural networks, whereas they are difficult and time consuming to implement with conventional methods. A neural network is a simplified model of the human brain and, like the brain, performs efficiently on perceptual tasks such as recognizing visual images of objects and handwritten characters. Recognition of handwritten digits is one of the oldest applications of ANNs. Recognizing digits written in different hands, and digits from scanned text, has remained troublesome and has therefore received much attention from researchers in the field of artificial neural networks. We can distinguish the handwriting of different persons because the human brain is sensitive to even slight variations in visual images. In this research work, a very simple and flexible neural network scheme is proposed and implemented for handwritten digit recognition, which will assist beginners and AI students who want to understand the perceptual capability of neural networks. In the proposed system, a very simple artificial neural network design is implemented: first the learning mechanism of the network is described, and then its architecture is discussed. The proposed network is trained in a supervised manner using various (approximately 250) patterns/fonts of handwritten digits. A unique token is allocated to each digit when it is input to the system, and the network becomes adaptive when different patterns of the same digit are taught to the network for one particular token.
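A tiny supervised network of this general kind can be sketched in a few lines (the architecture, learning rate and 3x3 toy patterns below are illustrative assumptions, not the paper's design): one hidden layer trained by gradient descent to map digit bitmaps to one-hot tokens.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two toy 3x3 patterns standing in for handwritten digit bitmaps.
X = np.array([[1, 1, 1, 1, 0, 1, 1, 1, 1],      # crude "0"
              [0, 1, 0, 0, 1, 0, 0, 1, 0]],     # crude "1"
             dtype=float)
Y = np.array([[1, 0], [0, 1]], dtype=float)     # one-hot token per digit

W1 = rng.normal(0, 0.5, (9, 6))
W2 = rng.normal(0, 0.5, (6, 2))
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(2000):                  # supervised training loop (backprop)
    H = sigmoid(X @ W1)
    O = sigmoid(H @ W2)
    dO = (O - Y) * O * (1 - O)         # output-layer delta
    dH = (dO @ W2.T) * H * (1 - H)     # hidden-layer delta
    W2 -= 0.5 * H.T @ dO
    W1 -= 0.5 * X.T @ dH

print("predicted tokens:\n", sigmoid(sigmoid(X @ W1) @ W2).round(2))
```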

.

Keywords: Neural Network, Visual Images, digit recognition

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

51. Paper 31051186: Diagnosis of Skin Diseases using Online Expert System (pp. 323-325)

Full Text: PDF

.

Muhammad Zubair Asghar (1), Muhammad Junaid Asghar (2), Sheikh Muhammad Saqib (1), Bashir Ahmad (1), Shakeel Ahmad (1) and Hussain Ahmad (1).

(1) Institute of Computing and Information Technology Gomal University, D.I.Khan, Pakistan

(2) Faculty of Pharmacy, Gomal University, D.I.Khan, Pakistan

.

Abstract -- This paper describes an expert system (ES) for the diagnosis and management of skin diseases. More than 13 types of skin disease can be diagnosed and treated by our system. It is a rule-based, web-supported expert system assisting skin specialists, medical students specializing in dermatology, researchers, and computer-literate skin patients. The system was developed with Java technology. The expert rules were developed from the symptoms of each type of skin disease, represented using a tree graph, and inferred using forward chaining with a depth-first search method. User interaction with the system is enhanced by efficient user interfaces. The web-based expert system described in this paper can detect and give early diagnoses for more than thirteen skin diseases, and it can be extended to diagnose all types of skin disease.
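Forward chaining over symptom rules, the inference style the abstract names, is compact to sketch (the rules and disease labels below are illustrative placeholders, not the system's medical knowledge base):

```python
# Each rule: a set of required symptom facts, and the conclusion it adds.
RULES = [
    ({"itching", "scaly_patches"}, "suspect: psoriasis-like condition"),
    ({"redness", "blisters"}, "suspect: contact-dermatitis-like condition"),
]

def forward_chain(facts):
    facts = set(facts)
    fired = True
    while fired:                  # keep firing until no rule adds anything new
        fired = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                fired = True
    return facts

print(forward_chain({"itching", "scaly_patches"}))
```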

.

Keywords: Expert System, Skin Disease, Diagnose, Artificial Intelligence, Knowledge, Database.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

52. Paper 31051194: Cloud based Data Warehouse 2.0 Storage Architecture: An extension to traditional Data Warehousing (pp. 326-329)

Full Text: PDF

.

Kifayat Ullah Khan, Sheikh Muhammad Saqib, Bashir Ahmad, Shakeel Ahmad and Muhammad Ahmad Jan

Institute of Computing and Information Technology Gomal University, D.I.Khan, Pakistan

.

Abstract — The Data Warehouse plays a critical role in organizational decision making. This gigantic environment provides extremely conducive grounds for decision making, provided its smooth operation and maintenance are guaranteed. The recent enhancements in Data Warehouse architecture, DW 2.0, provide an approach that organizes the data into separate physical sectors. Maintaining such a massive data repository is a hard task, but if this maintenance responsibility follows a divide-and-conquer approach, splitting these sectors of data over the cloud, maximum desirable output can be ensured. In this research, the authors give a model that explains how the data should be split and which data should reside in local storage versus cloud storage. In this way, organizational management can concentrate fully on data analysis and remain free from non-functional concerns.
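The split the authors describe can be caricatured as routing DW 2.0 data sectors to storage tiers by access temperature (the sector-to-tier mapping below is an assumed example, not the paper's model):

```python
# DW 2.0 sectors mapped to storage tiers; the mapping is illustrative.
SECTOR_TIER = {
    "interactive": "local",    # hottest, latency-sensitive data stays local
    "integrated":  "local",
    "near-line":   "cloud",
    "archival":    "cloud",    # coldest data moves to cheap cloud storage
}

def route(record_sector: str) -> str:
    # unknown sectors default to cloud storage
    return SECTOR_TIER.get(record_sector, "cloud")

for sector in SECTOR_TIER:
    print(f"{sector:11s} -> {route(sector)} storage")
```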

.

Keywords- Data Warehouse, Maintenance, DW 2.0, Cloud Computing.

.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

.