Vol. 2, JUNE 2009

International Journal of Computer Science and Information Security

IJCSIS June 2009, Volume 2 (Download Full Journal).

Copyright © 2009 IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Paper 28040902: Intrusion Detection System Using Advanced Honeypots
Download Published Paper @ [arXiv], [docstoc]
Ram Kumar Singh, Dept. of Information Technology, International School of Business & Tech., Kampala, Uganda, East Africa.
Prof. T. Ramanujam, Dept. of Electronics & Communication Engg., Krishna Engineering College, Mohan Nagar, Ghaziabad, U.P., India

Abstract - The exponential growth of Internet traffic has made public servers increasingly vulnerable to unauthorized accesses and intrusions. In addition to maintaining low latency for clients, filtering unauthorized accesses has become one of the major concerns of a server maintainer. This implementation of an Intrusion Detection System distinguishes between traffic coming from legitimate clients and traffic originating from attackers, in an attempt to mitigate the problems of both latency and security simultaneously. We then present the results of a series of stress and scalability tests and suggest a number of potential uses for such a system. As computer attacks become more and more difficult to identify, the need for better and more efficient intrusion detection systems increases. The main problem with current intrusion detection systems is their high rate of false alarms; using honeypots provides an effective way to increase security.

Paper 28040903: Automatic Music Genre Classification System
Download Published Paper @ [arXiv], [docstoc]

Rajinder Kumar Math, A. V. Sutagundar, Basaveswar Engineering College, Bagalkot, Karnataka, India

Abstract- This paper proposes the design and development of a simple and effective automatic music genre classification system that provides quick classification of MP3 files. Though music genre classification remains fuzzy in nature, this paper attempts to design such a system. The designed system not only provides accurate classification but also makes it easier to search and manage music files. The design of the system is divided into two parts: feature extraction and classification. In the first part, the important features of each MP3 file are extracted and statistics are obtained for these features; histograms are then computed for each statistic. A simple algorithm is used for measuring the similarity between different classes of music. The classifier groups music files into genres using a histogram-based class-separability measure: it calculates the power spectral density of the static features for each class, computes the histogram error of each class against every other class, and evaluates the resulting probability of error. These error rates are used to group the MP3 files into classes, each class being a genre. Finally, a ranked list of the classified genres is generated.

Keywords: Classification, Genres, automatic genre classification.
Paper 11050915: Rough Set Model for Discovering Hybrid Association Rules
Download Published Paper @ [arXiv], [docstoc]
Anjana Pandey, Research Scholar, Maulana Azad National Institute of Technology Bhopal,(India) 

K.R.Pardasani, Professor & Head, Department of Mathematics, Maulana Azad National Institute of Technology,Bhopal,(India) 

Abstract—In this paper, the mining of hybrid association rules with a rough set approach is investigated through the algorithm RSHAR. The RSHAR algorithm consists of two main steps. First, the participating tables are joined into a general table in order to generate rules expressing the relationship between two or more domains that belong to several different tables in a database, and a mapping code is applied on the selected dimension, which can be added directly into the information system as an ordinary attribute. Second, frequent itemsets are generated: candidate itemsets are derived through equivalence classes, and the mapping codes are transformed back into real dimensions. The search method for candidate itemsets is similar to the Apriori algorithm. An analysis of the performance of the algorithm has been carried out.

Index Terms—Rough Set, multidimensional, inter-dimension association rule, data mining 
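The abstract describes RSHAR's candidate-itemset search as Apriori-like. A minimal level-wise sketch of that search pattern (function names and the absolute minimum-support convention are our own, not from the paper, and the Apriori prune step is omitted for brevity):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Apriori-style level-wise search over itemsets.
    transactions: iterable of sets of items; min_support: absolute count."""
    items = {i for t in transactions for i in t}
    level = [frozenset([i]) for i in items]
    frequent = {}
    k = 1
    while level:
        # count the support of each candidate at this level
        counts = {c: sum(1 for t in transactions if c <= t) for c in level}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # join step: merge frequent k-itemsets into (k+1)-candidates
        level = list({a | b for a, b in combinations(survivors, 2)
                      if len(a | b) == k + 1})
        k += 1
    return frequent
```

In RSHAR the transactions would come from the joined general table, with the mapped dimension appearing as one more attribute.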

Paper 12050921: A Rough Sets Partitioning Model for Mining Sequential Patterns with Time Constraint
Download Published Paper @  [arXiv], [docstoc]

Jigyasa Bisaria, Namita Shrivastava and K. R. Pardasani, Department of Mathematics, Maulana Azad National Institute of Technology (A Deemed University), Bhopal 462051, India

Abstract— Nowadays, data mining and knowledge discovery methods are applied in a variety of enterprise and engineering disciplines to uncover interesting patterns from databases. The study of sequential patterns is an important data mining problem due to its wide applications to real-world time-dependent databases. Sequential patterns are inter-event patterns, ordered over a time period, associated with specific objects under study. The analysis and discovery of frequent sequential patterns over a predetermined time period are interesting data mining results and can aid decision support in many enterprise applications. The problem of sequential pattern mining poses computational challenges, as a long frequent sequence contains an enormous number of frequent subsequences; useful results also depend on the right choice of event window. In this paper, we study the problem of sequential pattern mining from two perspectives: the computational aspect of the problem, and the incorporation and adjustability of a time constraint. We use the indiscernibility relation from the theory of rough sets to partition the search space of sequential patterns, and propose a novel algorithm that allows previsualization of patterns and adjustment of the time constraint prior to execution of the mining task. The Rough Set Partitioning algorithm is at least ten times faster than the naive time-constraint-based sequential pattern mining algorithm GSP. In addition, the method also determines the time intervals of the sequential patterns.

Keywords- Data mining, sequential patterns, indiscernibility relation, partitioning
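The core rough-set idea the abstract relies on, partitioning records into equivalence classes under an indiscernibility relation, can be sketched in a few lines (the attribute-based grouping below is a generic illustration; the paper's actual partitioning of sequence search spaces is more involved):

```python
from collections import defaultdict

def indiscernibility_partition(records, attrs):
    """Partition records into equivalence classes: two records are
    indiscernible when they agree on every attribute in `attrs`."""
    classes = defaultdict(list)
    for rec in records:
        key = tuple(rec[a] for a in attrs)   # the class signature
        classes[key].append(rec)
    return dict(classes)
```

Mining can then proceed independently within each class, which is what shrinks the search space relative to scanning all sequences at once.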


Paper 12050922: XDANNG: XML based Distributed Artificial Neural Network with Globus Toolkit
Download Published Paper @ [arXiv], [docstoc]

Hamidreza Mahini, ELearning Department, Iran University of Science and Technology Tehran, Iran.

Alireza Mahini, Academic Staff of Computer Engineering Department, Islamic Azad University-Gorgan branch, Gorgan, Iran.

Javad Ghofrani, Computer Department, Iran University of Science and Technology-Behshahr Branch, Behshahr, Iran
Abstract— The artificial neural network (ANN) is one of the most common AI application fields, with direct and indirect uses in most sciences. The main goal of an ANN is to imitate biological neural networks in order to solve scientific problems, but the achievable level of parallelism is the main shortcoming of ANN systems in comparison with biological systems. To address this problem, we offer an XML-based framework for implementing ANNs on the Globus Toolkit platform. Globus Toolkit is well-known management software for multipurpose Grids. Using the Grid to simulate the neural network leads to a high degree of parallelism in the implementation of the ANN. We use XML to improve the flexibility and scalability of our framework.

Keywords— Grid Computing; Artificial Neural Network; Artificial Intelligence; XML; Globus Toolkit; MPI; Distributed Systems;

Paper 19050932: Integrating Genetic Algorithm, Tabu Search Approach for Job Shop Scheduling
Download Published Paper @ [arXiv]
R.Thamilselvan, Kongu Engineering College/Computer Science and Engineering, Erode, India 

Dr.P.Balasubramanie, Kongu Engineering College/Computer Science and Engineering, Erode, India

Abstract— This paper presents a new algorithm that integrates Genetic Algorithms and Tabu Search to solve the Job Shop Scheduling problem. The idea of the proposed algorithm is derived from Genetic Algorithms. Most scheduling problems require either exponential time or space to generate an optimal answer. Job Shop Scheduling (JSS) is a general scheduling problem and is NP-complete, so it is difficult to find the optimal solution. This paper applies Genetic Algorithms and Tabu Search to the Job Shop Scheduling problem and compares the results obtained by each. With our approach, the JSS problem reaches an optimal solution and the makespan is minimized.

Index Terms —Genetic Algorithm, Tabu Search, Simulated Annealing, Clustering Algorithm, Job Shop Scheduling
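Both the GA and Tabu Search need a fitness function: decode a chromosome into a schedule and measure its makespan. A common JSS encoding (a repeated-job-id permutation, our assumption here, not necessarily the paper's) can be decoded like this:

```python
def makespan(jobs, order):
    """Decode an operation sequence into a schedule; return its makespan.
    jobs: {job_id: [(machine, duration), ...]} with operations in fixed order.
    order: list of job ids; the i-th occurrence of job j dispatches job j's
    i-th operation (a common GA chromosome encoding for JSS)."""
    job_ready = {j: 0 for j in jobs}    # earliest start of job j's next op
    mach_ready = {}                     # earliest free time of each machine
    next_op = {j: 0 for j in jobs}
    end = 0
    for j in order:
        machine, dur = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        finish = start + dur
        job_ready[j] = finish
        mach_ready[machine] = finish
        next_op[j] += 1
        end = max(end, finish)
    return end
```

A GA would evolve `order` permutations to minimize this value, while Tabu Search would explore swaps of adjacent entries while forbidding recently visited moves.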
Paper 19050933: Analysis of the various key management algorithms and new proposal in the secure multicast communications
Download Published Paper @ [arXiv], [docstoc]

Joe Prathap P M. , Department of Information Technology, Arulmigu Kalasalingam College of Engineering, Krishnankoil. Madurai, Tamilnadu, India.

V.Vasudevan , Department of Information Technology, Arulmigu Kalasalingam College of Engineering, Krishnankoil. Madurai, Tamilnadu, India.

Abstract— With the evolution of the Internet, multicast communication seems particularly well adapted for large-scale commercial distribution applications, for example pay-TV channels and secure videoconferencing. Key management for multicast remains an open topic in secure communications today. Key management mainly concerns the distribution and update of keying material during the group's lifetime. Several key-tree-based approaches have been proposed by various authors to create and distribute the multicast group key in an effective manner. There are different key management algorithms that facilitate efficient distribution and rekeying of the group key; these protocols normally add communication overhead as well as computation overhead at the group key controller and at the group members. This paper explores the various algorithms along with their performance and derives an improved method.
Keywords- Group key management, Key tree, Multicast security, Rekeying
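The appeal of the key-tree approaches the abstract surveys is that a member's departure only invalidates the keys on its leaf-to-root path. A rough cost estimate for a logical key hierarchy (the message count is an approximation, one encryption per child subtree of each refreshed key; real protocols differ in detail):

```python
def lkh_rekey_cost(n_members, degree=2):
    """Approximate rekeying cost when one member leaves an LKH key tree
    of the given branching degree, versus n-1 messages in a flat scheme."""
    if n_members < 2:
        return {"keys_refreshed": 0, "messages": 0}
    depth, capacity = 0, 1
    while capacity < n_members:       # depth = ceil(log_degree(n_members))
        capacity *= degree
        depth += 1
    return {
        "keys_refreshed": depth,          # keys on the leaf-to-root path
        "messages": degree * depth,       # each new key sent to each subtree
    }
```

For a group of 4096 members a binary tree refreshes 12 keys instead of distributing 4095 individual messages, which is the logarithmic saving these schemes trade overhead for.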

Paper 22050943: A Bandwidth Characterization Tool for MPEG-2 File
Download Published Paper @ [arXiv] , [ScientificCommons] , [docstoc]
* Sandeep Kugali, ** S. S. Manvi, * A. V. Sutagundar
* Basaveswar Engineering College, Bagalkot, Karnataka, India
** Reva Institute of Technology and Management, Bangalore

Abstract- This paper proposes the design and development of an MPEG-2 video decoder that offers flexible and effective characterization of bandwidth utilization. The decoder is capable of decoding an MPEG-2 bit stream on a single host machine. The present decoder is designed to be simple, yet to effectively reconstruct the video from the MPEG-2 bit stream. The designed MPEG-2 video decoder displays the name of each frame with its corresponding bandwidth, and also displays the buffer fullness for each picture 'n', the bit rate 'R', the time interval 'I', the maximum quantisation error 'E' and the decoded size of the previous picture 'd'. In order to characterize the bandwidth requirement, the proposed multimedia tool is tested on various MPEG-2 bit streams and its effectiveness is evaluated in terms of various performance parameters: cumulative bandwidth, peak bandwidth, average bandwidth and minimum bandwidth.

Keywords: Bandwidth, MPEG-2, 2D-IDCT, Multimedia
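The four reported statistics can be derived directly from per-frame coded sizes. A minimal sketch, assuming "per-frame bandwidth" means the bit rate the stream would need if that frame's size were sustained at the frame rate (our reading, not spelled out in the abstract):

```python
def bandwidth_profile(frame_bits, fps=25):
    """Summary bandwidth statistics for a coded video stream.
    frame_bits: coded size of each frame in bits; fps: frame rate."""
    per_frame = [bits * fps for bits in frame_bits]   # bits/s if sustained
    return {
        "cumulative": sum(frame_bits),                # total bits in stream
        "peak": max(per_frame),
        "average": sum(per_frame) / len(per_frame),
        "minimum": min(per_frame),
    }
```

Because MPEG-2 I-frames are much larger than P- and B-frames, the peak typically sits far above the average, which is exactly the burstiness such a characterization tool exposes.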
Paper 23050945: Measurable & Scalable NFRs using Fuzzy Logic and Likert Scale 
Download Published Paper @ [arXiv], [docstoc]

Nasir Mahmood, Arif Mushtaq, Samina Khalid, Tehmina Khalil, Department of Computer Science, Bahria University, Islamabad, Pakistan

Abstract - Most of the research related to Non-Functional Requirements (NFRs) has presented NFR frameworks that integrate non-functional requirements with functional requirements, whereas we propose that some NFRs, e.g. cost and performance, can be measured, and others, such as usability, can be scaled. Our novel hybrid approach integrates three things rather than two: Functional Requirements (FRs), Measurable NFRs (M-NFRs) and Scalable NFRs (S-NFRs). We have also found Fuzzy Logic and a Likert Scale effective for handling discretely measurable as well as scalable NFRs, as these techniques provide a simple way to arrive at a discrete or scalable NFR in contrast to a vague, ambiguous, imprecise, noisy or missing one. Our approach can act as a baseline for new NFR and aspect-oriented frameworks using all types of UML diagrams.
Keywords: Non Functional Requirements, Functional Requirements, NFRs, FRs, Fuzzy Logic, Likert Scale 
Paper 24050946: Forecasting Model for Crude Oil Price and Commodity Futures Prices using Artificial Neural Networks
Download Published Paper @ [arXiv], [docstoc]
Siddhivinayak Kulkarni*, Graduate School of Information Technology and Mathematical Sciences, University of Ballarat, Ballarat, Australia
Imad Haidar, Graduate School of Information Technology and Mathematical Sciences, University of Ballarat, Ballarat, Australia

Abstract— This paper presents a model based on a multilayer feedforward neural network to forecast the direction of the crude oil spot price in the short term, up to three days ahead. A great deal of attention was paid to finding the optimal ANN model structure, and several methods of data pre-processing were tested. Our approach is to create a benchmark based on lagged values of the pre-processed spot price, then add pre-processed futures prices for one, two, three, and four months to maturity, one by one and also all together. The results on the benchmark suggest that a dynamic model with 13 lags is optimal for forecasting the short-term spot price direction. The forecast accuracy for the direction of the market was 78%, 66%, and 53% for one, two, and three days ahead, respectively. For all the experiments that include futures data as an input, the results show that, in the short term, futures prices do hold new information on the spot price direction. The results obtained provide a more comprehensive understanding of crude oil dynamics, which can help investors and individuals with risk management.
Keywords-Crude Oil; Future Price; ANN; Prediction Models
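The lagged-input setup and the direction-accuracy metric the abstract reports can be sketched without the network itself (the 1-up/0-down encoding is our assumption; the paper's pre-processing is richer than raw prices):

```python
def lagged_matrix(series, n_lags):
    """Build (X, y): each row of X holds the previous n_lags values,
    y is the next value's direction (1 = up, 0 = down or flat)."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(1 if series[t] > series[t - 1] else 0)
    return X, y

def direction_accuracy(pred, actual):
    """Share of days on which the forecast direction matched the market."""
    hits = sum(p == a for p, a in zip(pred, actual))
    return hits / len(actual)
```

With 13 lags, as the paper's best model uses, each training row would hold the 13 most recent pre-processed spot prices, optionally extended with the futures-price columns.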
Paper 27050954: Threshold Verification Technique for Network Intrusion Detection System
Download Published Paper @ [arXiv], [docstoc]

Mohd Faizal A., Mohd Zaki M., Shahrin S., Robiah Y., Siti Rahayu S.
Faculty of Information Technology and Communication
Universiti Teknikal Malaysia Melaka, Ayer Keroh, Melaka, Malaysia

Abstract— The Internet has played a vital role in the modern world, and the possibilities and opportunities it offers are limitless. Despite all the hype, Internet services are liable to intrusion attacks that can tamper with the confidentiality and integrity of important information. An attack starts with gathering information about the target, an activity that can be carried out as either a fast or a slow attack. The defensive measure a network administrator can take to overcome this liability is to introduce Intrusion Detection Systems (IDSs) in the network. An IDS has the capability to analyze network traffic and recognize incoming and ongoing intrusions. Unfortunately, combining both modules on real-time network traffic slows down the detection process. In a real-time network, early detection of a fast attack can prevent further attacks and reduce unauthorized access to the targeted machine. A suitable set of selected features and the correct threshold value give an IDS an extra advantage in detecting anomalies in the network. Therefore, this paper discusses a new technique for selecting a static threshold value from a minimal standard feature set for detecting fast attacks from the victim's perspective. In order to increase confidence in the threshold value, the result is verified using Statistical Process Control (SPC). The implementation of this approach shows that the selected threshold is suitable for identifying fast attacks in real time.

Keywords- fast attack detection, intrusion detection system, Statistical Process Control
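The SPC verification the abstract mentions usually amounts to placing a control limit a few standard deviations above the mean of normal traffic. A minimal sketch (the feature "connections per interval to one victim" and the 3-sigma limit are illustrative assumptions, not the paper's exact choices):

```python
def spc_threshold(samples, sigmas=3):
    """Static upper control limit from Statistical Process Control:
    mean + sigmas * stddev of a normal-traffic feature, e.g. the number
    of connections per interval observed at the victim."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n   # population variance
    return mean + sigmas * var ** 0.5

def is_fast_attack(count, upper_limit):
    """Flag an interval whose count exceeds the control limit."""
    return count > upper_limit
```

A fast attack, such as a port scan, produces a burst of connections in one interval and lands far above the limit, while normal traffic stays inside it.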
Paper 29050959: Defense Strategies Against Modern Botnets
Download Published Paper @ [arXiv], [docstoc]

Srdjan Stanković, Personnel Department, Ministry of Defense, Belgrade, Serbia 
Dejan Simić, Department of Information Technology, Faculty of Organizational Sciences, Belgrade, Serbia

Abstract—Botnets are networks of compromised computers infected with malicious code which are remotely controlled and used to launch distributed denial of service (DDoS) attacks, send enormous numbers of e-mails (SPAM) and carry out other sorts of attacks. Defense against modern Botnets is a real challenge. This paper offers several strategies for defense against Botnets, with a list and description of the measures and activities which should be carried out in order to establish a successful defense. The paper also offers a side-by-side comparison of the strategies, with their advantages and disadvantages considered according to various criteria.
Keywords – Botnets, Defense, Security, Strategies, DDoS, SPAM.

Paper 29050960: Using Agent to Coordinate Web Services
Download Published Paper @ [arXiv], [docstoc]

C. H. Liu, Y. F. Lin and J. Y. Chen
Department of Computer Science & Information Engineering
National Central University, Jhong-Li, Taiwan

Abstract— Traditionally, agents and web services have been two separate research areas. We argue that, through agent communication, agents are well suited to coordinate web services. However, agent communication problems persist due to the lack of a uniform, cross-platform vocabulary. Fortunately, an ontology defines such a vocabulary. We thus propose a new agent communication layer and present web ontology language (OWL)-based operational ontologies that provide a declarative description which can be accessed by various engines to facilitate agent communication. Further, in our operational ontologies, we define the mental attitudes of agents, which can be shared among other agents. Our architecture enhances the 3APL agent platform and is implemented as an agent communication framework. Finally, we extended the framework to be compatible with the web ontology language for services (OWL-S), and then developed a movie recommendation system with four OWL-S semantic web services on the framework. The benefits of this work are: 1) dynamic web service coordination; 2) ontological reasoning through a uniform representation, namely the declarative description; and 3) easy reuse and extension of both ontology and engine through extending the ontology.
Keywords- agent communication; semantic web service; agent mentality layer

Paper 30050961: Effective Focused Crawling Based on Content and Link Structure Analysis
Download Published Paper @ [arXiv], [docstoc]
Anshika Pal, Deepak Singh Tomar, S.C. Shrivastava
Department of Computer Science & Engineering, Maulana Azad National Institute of Technology, Bhopal, India

Abstract — A focused crawler traverses the web, selecting pages relevant to a predefined topic and neglecting those out of scope. While crawling the web it is difficult to deal with irrelevant pages and to predict which links lead to quality pages. In this paper a technique for effective focused crawling is implemented to improve the quality of web navigation. A similarity function is used to check the similarity of web pages with respect to the topic keywords, and the priorities of the extracted out-links are calculated based on metadata and on the pages the focused crawler has already retrieved. The proposed work also uses a method of traversing the irrelevant pages encountered during crawling to improve the coverage of the specific topic.
Keywords- focused crawler, metadata, weight table, World-Wide Web, Search Engine, links ranking.
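The two moving parts the abstract names, a page-to-topic similarity function and a prioritized frontier of out-links, can be sketched as follows (term-frequency cosine similarity is a common choice we assume here; the paper's weight-table scoring differs in detail):

```python
import heapq
from collections import Counter

def cosine_similarity(text, topic_keywords):
    """Term-frequency cosine similarity between a page and the topic."""
    tf = Counter(text.lower().split())
    topic = Counter(w.lower() for w in topic_keywords)
    dot = sum(tf[w] * topic[w] for w in topic)
    norm = (sum(v * v for v in tf.values()) ** 0.5) * \
           (sum(v * v for v in topic.values()) ** 0.5)
    return dot / norm if norm else 0.0

# the frontier is a max-priority queue: most promising link crawled first
frontier = []

def enqueue(url, priority):
    heapq.heappush(frontier, (-priority, url))   # negate: heapq is a min-heap

def next_url():
    return heapq.heappop(frontier)[1]
```

Each fetched page is scored against the topic keywords, and its out-links are enqueued with priorities derived from that score and the link metadata.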

Paper 30050962: Efficient IRIS Recognition through Improvement of Feature Extraction and subset Selection

Download Published Paper @ [arXiv], [docstoc]

Amir Azizi, Islamic Azad University Mashhad Branch, 
Hamid Reza Pourreza, Ferdowsi University of Mashhad

Abstract—The selection of the optimal feature subset and the classification have become important issues in the field of iris recognition. In this paper we propose several methods for iris feature subset selection and feature vector creation. A deterministic feature sequence is extracted from the iris image using the contourlet transform, which captures the intrinsic geometrical structure of the iris image by decomposing it into a set of directional sub-bands, with texture details captured in different orientations at various scales. To reduce the dimension of the feature vector, we extract only the significant bits of information from the normalized iris images, ignoring the fragile bits. Finally, we use an SVM (Support Vector Machine) classifier to estimate identification accuracy in our proposed system. Experimental results show that the proposed method reduces processing time, increases classification accuracy, and yields an iris feature vector that is much smaller than those of other methods.

Paper 31050963: A New Approach to Manage QoS in Distributed Multimedia Systems

Download Published Paper @ [arXiv], [docstoc]

Bechir Alaya, Claude Duvallet, Bruno Sadeg 
LITIS, UFR des Sciences et Techniques

Abstract—Dealing with network congestion is one way to enhance quality of service (QoS) in distributed multimedia systems. Existing solutions to the problem of network congestion ignore scalability considerations because they maintain a separate classification for each video stream. In this paper, we propose a new method for controlling the QoS provided to clients according to network congestion, by discarding some frames when needed. The proposed technique, called (m,k)-frame, is scalable with little degradation in application performance. The (m,k)-frame method derives from the notion of (m,k)-firm real-time constraints, which require that among any k invocations of a task, at least m invocations meet their deadline. Our simulation studies show the usefulness of the (m,k)-frame method in adapting the QoS of a multimedia application to the real conditions, according to the current system load; notably, the system must adjust the QoS provided to active clients as their number varies, i.e. with the dynamic arrival of clients.
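The (m,k)-firm constraint the abstract describes can be enforced with a sliding window over the last k frame outcomes: a frame may be dropped under congestion only if at least m of any k consecutive frames are still delivered. A minimal sketch of that admission rule (our own simplification of the paper's (m,k)-frame method):

```python
from collections import deque

class MKFrameFilter:
    """(m,k)-firm frame dropping: among any k consecutive frames,
    at least m must be delivered."""

    def __init__(self, m, k):
        self.m = m
        self.window = deque(maxlen=k)   # 1 = delivered, 0 = dropped

    def admit(self, congested):
        """Return True to deliver the frame, False to drop it."""
        # deliveries that would remain in the window if we drop this frame
        deliveries_if_drop = sum(self.window)
        if len(self.window) == self.window.maxlen:
            deliveries_if_drop -= self.window[0]   # oldest outcome falls out
        drop = congested and deliveries_if_drop >= self.m
        self.window.append(0 if drop else 1)
        return not drop
```

Under sustained congestion a (3,5) filter settles into delivering three frames out of every five, degrading the stream gracefully instead of stalling it.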

Paper 31050964 : Fuzzy Logic Based Method for Improving Text Summarization

Download Published Paper @ [arXiv], [docstoc]

Ladda Suanmali and Naomie Salim 
Faculty of Science and Technology, Suan Dusit Rajabhat University, Bangkok, Thailand 

Mohammed Salem Binwahlan
Faculty of Computer Science and Information System, Universiti Teknologi Malaysia 

Abstract—Text summarization can be classified into two approaches: extraction and abstraction. This paper focuses on the extraction approach, whose goal is sentence selection. One method of obtaining suitable sentences is to assign each sentence a numerical measure, called the sentence weight, and then select the best ones for the summary. The first step in summarization by extraction is the identification of important features. In our experiment, we used 125 test documents from the DUC2002 data set. Each document is prepared by a preprocessing pipeline: sentence segmentation, tokenization, stop-word removal, and word stemming. We then compute the scores of 8 important features for each sentence. We propose text summarization based on fuzzy logic to improve the quality of the summary created by the general statistic method. We compare our results with a baseline summarizer and the Microsoft Word 2007 summarizer. The results show that the best average precision, recall, and F-measure for the summaries were obtained by the fuzzy method.

Keywords - 
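The "general statistic method" the paper improves on weights each sentence by a combination of its feature scores and keeps the top sentences. A minimal sketch (the weighted-sum scoring and feature names are illustrative; the paper replaces this step with fuzzy inference rules):

```python
def sentence_score(features, weights):
    """Sentence weight as a weighted sum of (0..1) feature values,
    e.g. title-word overlap, sentence position, term frequency."""
    return sum(weights[name] * value for name, value in features.items())

def summarize(sentences, features_per_sentence, weights, n):
    """Keep the n highest-scoring sentences, restored to document order."""
    ranked = sorted(range(len(sentences)),
                    key=lambda i: sentence_score(features_per_sentence[i],
                                                 weights),
                    reverse=True)
    keep = sorted(ranked[:n])           # re-sort into reading order
    return [sentences[i] for i in keep]
```

The fuzzy variant maps the same feature values through membership functions and IF-THEN rules to get the sentence weight, which is what softens the hard cutoffs of the weighted sum.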

Paper 31050966: Incidence Handling and Response System

Download Published Paper @ [arXiv], [docstoc]

Prof. Dhananjay R. Kalbande, Asst. Prof., Computer Engineering, S.P.I.T., Mumbai
Dr. G. T. Thampi, Principal, PIIT, New Panvel
Mr. Manish Singh, B.E. Computers, SPIT, Mumbai

Abstract - A computer network can be attacked in a number of ways. Security-related threats have become not only numerous but also diverse, and they may come in the form of blended attacks. It is difficult for any security system to block all types of attack. This gives rise to the need for an incident handling capability, which is necessary for rapidly detecting incidents, minimizing loss and destruction, mitigating the weaknesses that were exploited, and restoring computing services. Incident response has always been an important aspect of information security, but it is often overlooked by security administrators. In this paper, we propose an automated system which will handle security threats and make the computer network capable of withstanding attacks. We also present the state of the art in the computer, network and software technology required to build such a system.
Key Words- Incidence, Incidence Handling

Paper 31050970: Energy-Aware Multicast Routing in MANETs: A Genetic Algorithm Approach
Download Published Paper @ [arXiv], [docstoc]

Dilip Kumar S.M  
Dept. of Computer Science Engineering, University Visvesvaraya College of Engineering (UVCE), Bangalore University, Bangalore - 560 001, INDIA.
Vijaya Kumar B.P.
Dept. of Computer Science and Engineering, Reva Institute of Technology and Management (RITM), Yelahanka, Bangalore - 560 064, INDIA.

Abstract—Energy-aware multicasting in mobile ad hoc networks (MANETs) is an important issue due to the energy constraints of the battery in each mobile node. In this paper, we propose an energy-aware source-based multicast routing algorithm for MANETs that finds, for each node pair (s, di), a path consisting of the minimum number of links that passes through common links, where s is the source node and di is a member of the destination set, and that extends the lifetime of the ad hoc network using optimization techniques. Linear programming and a genetic algorithm are used for the optimization. The simulation results demonstrate the computational power of the proposed algorithm and show that this approach is efficient and robust for constructing an optimized multicast tree.

keywords—Mobile ad hoc networks; energy-aware; multicast routing; optimization; genetic algorithm

Paper 31050971: Application of non-uniform laxity to EDF for aperiodic tasks to improve task utilisation on multicore platforms
Download Published Paper @ [arXiv], [docstoc]

K Pradheep Kumar and A P Shanthi , Department of CSE , Anna University, Chennai, India 

Abstract - This paper proposes a new scheduler that applies the concept of non-uniform laxity to the Earliest Deadline First (EDF) approach for aperiodic tasks. This scheduler improves task utilisation (execution time / deadline) and also increases the number of tasks that can be scheduled. Laxity is a measure of the spare time available to a task before it misses its deadline, computed as (deadline - (current time + execution time)). Weight decides the priority of the task and is defined as (quantum slice time / allocated time) * total core time for the task, where the quantum slice time is the time actually used, the allocated time is the time allocated by the scheduler, and the total core time is the time reserved by the core for executing one quantum of the task. Non-uniform laxity, computed by multiplying the weight of a task by its laxity, enables higher-priority tasks to be scheduled before the normal execution of other tasks. The algorithm presented in the paper has been simulated on Cheddar, a real-time scheduling tool, and on SESC, an architectural simulator for multicore platforms, for up to 5000 random task sets and up to 5000 cores. The scheduler improves task utilisation by 35% and the number of tasks scheduled by 36%, compared to conventional EDF.

Keywords –
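The abstract gives the laxity and weight formulas explicitly, so the dispatch rule can be sketched directly (the task-dict field names and the "pick least weighted laxity" tie to EDF are our reading of the abstract, not the paper's pseudocode):

```python
def laxity(task, now):
    """Spare time before the task misses its deadline:
    deadline - (current time + execution time)."""
    return task["deadline"] - (now + task["exec_time"])

def weight(task):
    """Priority weight: (quantum slice time / allocated time) * core time."""
    return (task["quantum"] / task["allocated"]) * task["core_time"]

def pick_next(tasks, now):
    """Dispatch the runnable task with the least non-uniform (weighted)
    laxity; plain EDF would instead pick the earliest deadline."""
    runnable = [t for t in tasks if laxity(t, now) >= 0]
    return min(runnable, key=lambda t: weight(t) * laxity(t, now),
               default=None)
```

In the test below, plain EDF would pick t1 (deadline 10 before 12), but the smaller weight of t2 pulls its weighted laxity below t1's, so t2 runs first.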

Paper 31050975: A Novel Two-Staged Decision Support based Threat Evaluation and Weapon Assignment Algorithm: Asset-based Dynamic Weapon Scheduling using Artificial Intelligence Techniques
Download Published Paper @ [arXiv], [docstoc]

Huma Naeem, Asif Masood, Mukhtar Hussain, and Shoab A. Khan
Department of Computer Science, National University of Science and Technology, Rawalpindi, Pakistan 
Abstract— The surveillance control and reporting (SCR) system for air threats plays an important role in the defense of a country. An SCR system handles air and ground situation management and processing, along with information fusion, communication, coordination, simulation and other critical defense-oriented tasks. Threat Evaluation and Weapon Assignment (TEWA) sits at the core of an SCR system, in which maximal or near-maximal utilization of constrained resources is of extreme importance. Manual TEWA systems cannot provide optimality because of various limitations; e.g., a surface-to-air missile (SAM) can fire from a distance of 5 km, but manual TEWA is constrained by human vision range among other factors. Current TEWA systems usually work on a target-by-target basis using some type of greedy algorithm, which affects the optimality of the solution and fails in multi-target scenarios. This paper presents a novel two-staged, flexible, dynamic decision support based optimal threat evaluation and weapon assignment algorithm for multi-target airborne threats.
Keywords- Optimization Algorithm; Threat Evaluation (TE) and Weapon Assignment (WA) Algorithm (TEWA); Decision Support System (DSS); Stable Marriage Algorithm (SMA); Cybernetics Application, preferential and subtractive defense strategies.

Paper 31050976: A new approach for digit recognition based on hand gesture analysis
Download Published Paper @ [arXiv], [docstoc]

Ahmed BEN JMAA, Walid MAHDI, Abdelmajid BEN HAMADOU
Multimedia InfoRmation systems and Advanced Computing Laboratory
Higher Institute of Computer Science and Multimedia of Sfax, Tunisia
Yousra BEN JEMAA,Signal and System Research Unit, National Engineering School of Tunis, Tunisia

Abstract—We present in this paper a new approach to hand gesture analysis that allows digit recognition. The analysis is based on extracting a set of features from a hand image and then combining them using an induction graph. The most important features we extract from each image are the finger locations, their heights and the distance between each pair of fingers. Our approach consists of three steps: (i) hand detection and localization, (ii) finger extraction and (iii) feature identification and combination for digit recognition. Each input image is assumed to contain only one person, and we apply a fuzzy classifier to identify the skin pixels. In the finger extraction step, we attempt to remove all the hand components except the fingers, a process based on the anatomical properties of the hand. The final step consists of building a histogram of the detected fingers in order to extract the features used for digit recognition. The approach is invariant to scale, rotation and translation of the hand. Experiments have been undertaken to show the effectiveness of the proposed approach.

Paper 31050977: TTSS Packet Classification Algorithm to enhance Multimedia Applications in Network Processor based Router 
Download Published Paper @ [arXiv], [docstoc]

R. Avudaiammal, Research Scholar, Anna University
R. SivaSubramanian, Research Scholar, Vinayaka Mission's University
R. Pandian, Research Scholar, Anna University
P. Seethalakshmi, Anna University

Abstract- The objective of this paper is to implement the Trie-based Tuple Space Search (TTSS) packet classification algorithm on a Network Processor (NP) based router to enhance multimedia applications. The performance is evaluated using the Intel IXP2400 NP simulator. The results demonstrate that TTSS performs better than the Tuple Space Search algorithm and is well suited to achieving the high-speed packet classification needed to support multimedia applications.

Keywords - Multimedia, QoS, Packet Classification, TTSS, Network Processor, IXP 2400.

Paper 31050979: Acquiring Knowledge for Evaluation of Teachers’ Performance in Higher Education – using a Questionnaire
Download Published Paper @ [arXiv], [docstoc]
Hafeez Ullah Amin, Institute of Information Technology, Kohat University of Science & Technology (KUST), Kohat, Pakistan.
Abdur Rashid Khan, Institute of Computer and Information Technology, Gomal University
D.I.Khan, Pakistan

Abstract— In this paper, we present a step-by-step knowledge acquisition process, choosing a structured method and using a questionnaire as the knowledge acquisition tool. The problem domain is how to evaluate teachers' performance in higher education using expert system technology. The challenge is to acquire the specific knowledge for a selected problem efficiently and effectively from human experts and encode it in a suitable computer format; acquiring knowledge from human experts is one of the most commonly cited problems in expert system development. The questionnaire was sent to 87 domain experts in public and private universities in Pakistan, of whom 25 returned their valuable opinions. Most of the domain experts were highly qualified, well experienced and in positions of high responsibility. The questionnaire was divided into 15 main groups of factors, which were further divided into 99 individual questions. These facts were analyzed further to give the questionnaire its final shape. This knowledge acquisition technique may be used as a learning tool for further research.
Keywords- Expert system, knowledge acquisition process, knowledge acquisition techniques, questionnaire, domain experts

Paper 01060980: Global Stability Analysis for an Internet Congestion Control Model with a Time-Varying Link Capacity
Download Published Paper @ [arXiv], [docstoc]
B. Rezaie, M.-R. Jahed Motlagh, M. Analoui, Iran University of Science and Technology
S. Khorsandi, Amirkabir University of Technology

Abstract— In this paper, a global stability analysis is given for a rate-based congestion control system modeled by a nonlinear delayed differential equation. The model captures the dynamics of a single-source single-link network with a time-varying link capacity and a fixed communication delay. We obtain sufficient delay-independent conditions on the system parameters under which global asymptotic stability of the system is guaranteed. The proof is based on an extension of the Lyapunov-Krasovskii theorem for a class of nonlinear time-delay systems. Numerical simulations of a typical scenario confirm the theoretical results.
Keywords- Internet congestion control; global stability; time-varying capacity; nonlinear time-delay system.
