
Vol. 10 No. 5 May 2012 International Journal of Computer Science and Information Security

Publication: May 2012, Volume 10 No. 5

.

Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

1. Paper 30041202: Multidimensional Analysis in Intelligence Business Systems (pp. 1-6)

Full Text: PDF

.

Elma Zanaj, Ledion Liço, Indrit Enesi

Faculty of Information Technology, Polytechnic University of Tirana, Tirana, Albania

.

Abstract — The purpose of this study is to create an OLTP (Online Transaction Processing) system and a DW (Data Warehouse) in order to simplify the extraction of various reports and the retrieval of information from different systems. Another purpose is to compare different OLAP (Online Analytical Processing) technologies on a large number of records. We compare HOLAP and ROLAP technologies and evaluate their performance. To this end, a query is run against the DW using ROLAP and against the intelligent cubes created using HOLAP, for a considerable number of records, and the system response time is analyzed.
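
In practice, the comparison described above amounts to timing the same aggregate query once against the relational warehouse (ROLAP) and once against a pre-built cube (the HOLAP intelligent cube). The paper does not publish its test harness; the Python sketch below, with a hypothetical sales schema and an in-memory cube standing in for the intelligent cube, only illustrates how such a response-time measurement could be set up.

# Response-time comparison sketch: the same aggregate answered by a ROLAP query
# on detail rows and by a lookup into a pre-aggregated (HOLAP-style) cube.
# Schema, data sizes and the in-memory cube are illustrative assumptions.
import sqlite3, time
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [(("North", "South", "East", "West")[i % 4], 2011 + i % 2, float(i % 100))
                  for i in range(200000)])

# ROLAP: aggregate directly over the warehouse detail rows at query time.
t0 = time.perf_counter()
rolap = conn.execute(
    "SELECT region, year, SUM(amount) FROM sales GROUP BY region, year").fetchall()
t_rolap = time.perf_counter() - t0

# HOLAP-style intelligent cube: aggregates are materialized once, then reused.
cube = defaultdict(float)
for region, year, amount in conn.execute("SELECT region, year, amount FROM sales"):
    cube[(region, year)] += amount
t0 = time.perf_counter()
holap = [(r, y, v) for (r, y), v in cube.items()]
t_cube = time.perf_counter() - t0

print(f"ROLAP query: {t_rolap:.4f} s, cube lookup: {t_cube:.6f} s")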

.

Keywords: information, systems, business, analysis

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

2. Paper 30041208: Fingerprint Feature Extraction and Identification using Direction Oriented Matrix with Color Band (pp. 7-13)

Full Text: PDF

.

T. Vidhya, Dept. of Information and Communication Engineering, Sri Venkateswara College of Engineering, Sriperumbudur, India

T. K. Thivakaran, Dept. of Information and Communication Engineering, Sri Venkateswara College of Engineering, Sriperumbudur, India

.

Abstract — Fingerprint recognition is a method of biometric authentication that uses pattern recognition techniques based on high-resolution fingerprint images. Fingerprints have several advantages over other biometrics, such as high universality, distinctiveness, permanence, good performance, easy collectability and wide acceptability. A fingerprint image is made up of a pattern of ridges and valleys that replicates the surface of the human fingertip. The fingerprint image represents a system of oriented texture and carries very rich structural information. A new algorithm to extract fingerprint features using a direction matrix is proposed. The workflow is to transform the fingerprint image into a 16x16 matrix by computing the orientation field using gradient information. The feature matrix is used to identify the features; to this end, color-band information is derived which follows a different pattern for different feature-based fingerprint images. The algorithm was implemented in the MATLAB environment using images from the FVC databases.
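
The direction matrix in this approach comes from the gradient-based orientation field; the quantization into features and the color-band step are not reproduced here. A minimal NumPy sketch of the standard block-wise orientation estimate, assuming a grayscale fingerprint image already loaded as a 2-D array, is shown below.

# Block-wise orientation field via image gradients (standard least-squares estimate).
import numpy as np

def direction_matrix(img, blocks=16):
    """Return a blocks x blocks matrix of block orientations in radians."""
    gy, gx = np.gradient(img.astype(float))          # pixel gradients (rows, cols)
    h, w = img.shape
    bh, bw = h // blocks, w // blocks
    theta = np.zeros((blocks, blocks))
    for i in range(blocks):
        for j in range(blocks):
            sx = gx[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
            sy = gy[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
            # dominant orientation of the squared gradients in the block;
            # the ridge direction is perpendicular to it
            num = 2.0 * np.sum(sx * sy)
            den = np.sum(sx**2 - sy**2)
            theta[i, j] = 0.5 * np.arctan2(num, den)
    return theta

# Example with a synthetic image in place of an FVC fingerprint scan.
img = np.random.rand(256, 256)
print(direction_matrix(img).shape)   # (16, 16)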

.

Keywords- orientation; direction matrix; pattern; identification

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

3. Paper 30041209: Behavioural API based Virus Analysis and Detection (pp. 14-22)

Full Text: PDF

.

Sulaiman Al amro, Software Technology Research Laboratory (STRL), De Montfort University, Leicester, UK

Antonio Cau, Software Technology Research Laboratory (STRL), De Montfort University, Leicester, UK

.

Abstract — The growing number of computer viruses and the detection of zero-day malware have been a concern for security researchers for a long period of time. Existing antivirus products (AVs) rely on detecting virus signatures, which does not provide a full solution to the problems associated with these viruses. The use of logic formulae to model the behaviour of viruses is one of the most encouraging recent developments in virus research, as it provides alternatives to classic virus detection methods. To address the limitations of traditional AVs, we propose a virus detection system based on extracting Application Program Interface (API) calls from virus behaviours. The proposed research uses a temporal-logic, behaviour-based detection mechanism to detect viruses at both user and kernel level. Interval Temporal Logic (ITL) will be used for virus specifications, properties and formulae based on the analysis of API calls representing the behaviour of computer viruses.
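
The ITL formulae themselves are beyond the scope of an abstract; as a much-simplified stand-in, the sketch below only checks whether an observed API-call trace contains a suspicious ordered pattern. The call names and the pattern are illustrative, not taken from the paper.

# Simplified behaviour check: does the observed API trace contain the suspicious
# calls in the given order (not necessarily consecutively)?  The paper's actual
# specifications are ITL formulae; this ordered-subsequence test is only a stand-in.
def matches_behaviour(trace, pattern):
    it = iter(trace)
    return all(call in it for call in pattern)   # each pattern call must appear after the previous one

observed = ["NtOpenFile", "NtReadFile", "NtCreateFile", "NtWriteFile",
            "NtSetInformationFile", "NtCreateProcess"]
self_replication = ["NtOpenFile", "NtReadFile", "NtCreateFile", "NtWriteFile"]  # illustrative pattern

print(matches_behaviour(observed, self_replication))   # True -> flag this trace for analysis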

.

Keywords-computer viruses; virus behaviour; API calls; interval temporal logic

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

4. Paper 30041269: An Efficient GPU Implementation of Modified Discrete Cosine Transform Using CUDA (pp. 23-30)

Full Text: PDF

.

Massimo Panella, Luigi Basset

Dpt. of Information Engineering, Electronics and Telecommunications (DIET), University of Rome “La Sapienza”, Via Eudossiana 18, 00184 Rome, Italy

.

Abstract — A new method is presented in this paper that uses general-purpose programming tools for graphics processing units to calculate the modified discrete cosine transform employed in audio coding and compression algorithms for popular audio formats such as MP3, AAC/AC-3, and WMA. The proposed algorithm consists of matrix multiplications that are performed by the graphics processing unit. The experiments show that the proposed implementation is considerably faster than usual implementations based on earlier algorithms for standard hardware, so the proposed approach can be considered for critical applications in real-time multimedia signal processing.
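
The key observation is that the MDCT of a length-2N frame is the product of an N x 2N cosine basis matrix with the frame, which maps directly onto a GPU matrix-multiply kernel. The NumPy sketch below shows that matrix formulation; the paper offloads the multiplication to the GPU via CUDA, whereas here a CPU matmul stands in for it, and the frame length and batch size are assumed values.

# MDCT of a length-2N frame expressed as a single matrix multiplication,
# the formulation that the paper offloads to the GPU.
import numpy as np

def mdct_matrix(N):
    n = np.arange(2 * N)
    k = np.arange(N).reshape(-1, 1)
    # X_k = sum_n x_n * cos[(pi/N) * (n + 1/2 + N/2) * (k + 1/2)]
    return np.cos(np.pi / N * (n + 0.5 + N / 2) * (k + 0.5))   # shape (N, 2N)

N = 1024                                 # half the frame length (illustrative value)
frames = np.random.randn(100, 2 * N)     # 100 windowed, overlapping audio frames
M = mdct_matrix(N)
coeffs = frames @ M.T                    # batched MDCT as one matrix product
print(coeffs.shape)                      # (100, 1024)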

.

Keywords- GPU computation; parallel pipelined architecture; modified discrete cosine transform; multimedia signal processing.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

5. Paper 30041224: Using Hybrid Decision Tree - Hough Transform Approach For Automatic Bank Check Processing (pp. 31-37)

Full Text: PDF

.

Heba A. Elnemr, Computer Science Department, Akhbar Elyoum Academy; Computer and Systems Department, Electronics Research Institute, Giza, Egypt

.

Abstract — One of the first steps in the realization of an automatic bank check processing system is the automatic classification of checks and the extraction of the handwritten area. This paper presents a new hybrid method which couples together statistical color histogram features, the entropy, the energy and the Hough transform to achieve the automatic classification of checks as well as the segmentation and recognition of the various pieces of information on the check. The proposed method relies on two stages. First, a two-step classification algorithm is implemented. In the first step, a decision classification tree is built using the entropy, the energy, the logo location and histogram features of colored bank checks. These features are used to classify checks into several groups. Each group may contain one or more types of checks; therefore, in the second step the bank logo or bank name is matched against its stored template to identify the correct prototype. Second, the Hough transform is utilized to detect lines in the classified checks. These lines are used as indicators of the bank check fields. A group of experiments is performed showing that the proposed technique is promising with regard to classifying bank checks and extracting the important fields from each check.
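
The field-location step relies on straight lines detected by the Hough transform. The sketch below gives a minimal NumPy Hough line accumulator applied to a synthetic binary edge map; the classification stage and the mapping from detected lines to named check fields are not shown.

# Minimal Hough line accumulator over a binary edge map: each edge pixel votes
# for all (rho, theta) lines passing through it; accumulator peaks correspond to
# the ruled lines that delimit the check fields.
import numpy as np

def hough_lines(edges, n_theta=180):
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for t, theta in enumerate(thetas):
        rho = (xs * np.cos(theta) + ys * np.sin(theta)).round().astype(int) + diag
        np.add.at(acc, (rho, t), 1)
    return acc, thetas, diag

edges = np.zeros((100, 200), dtype=np.uint8)
edges[40, 20:180] = 1                         # a synthetic horizontal ruled line
acc, thetas, diag = hough_lines(edges)
rho_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
print("rho =", rho_idx - diag, "theta(deg) =", np.rad2deg(thetas[t_idx]))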

.

Keywords- automatic check processing; color histogram; Hough transform; statistical features; decision tree classifier

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

6. Paper 30041255: Data Aggregation with Energy Efficient Reliable Routing Protocol For Wireless Sensor Networks (pp. 38-43)

Full Text: PDF

.

Basavaraj S. Mathapati, Dept. of Computer Science & Engg., Appa IET, Gulbarga, Karnataka, India

Siddarama. R. Patil, Dept. of Electronics & Comm. Engg., P. D. A College of Engineering, Gulbarga, Karnataka, India

V. D. Mytri, Principal, GND College of Engineering, Bidar, Karnataka, India

.

Abstract - In Wireless Sensor Networks (WSN), data aggregation is essentially used to gather and aggregate data in an energy-efficient manner so that network lifetime is enhanced. Data aggregation protocols aim at eliminating redundant data transmission. Power consumption is an important aspect to be considered in data aggregation, since the energy of sensor nodes is a scarce and irreplaceable resource. In addition to power consumption, reliability is also a major concern in data aggregation. In this paper, we propose to design an energy-efficient, reliable data aggregation technique for wireless sensor networks. Initially we form clusters, and a coordinator node (CN) is selected near each cluster in order to monitor the nodes in the cluster. The CN selects a cluster head (CH) in each cluster based upon the energy level and the distance to the CN. The packets sent by the sensor nodes are aggregated at the CH and transmitted to the CN. The CN measures the loss ratio and compares it with a threshold value. Depending upon this value, the forward node count is incremented or decremented and the cluster size is adaptively changed, ensuring reliability and balanced energy consumption. Our simulation results show that this technique is efficient in terms of energy consumption and reliability.
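
The cluster-head choice above depends only on each node's residual energy and its distance to the CN, and the loss-ratio comparison drives the forward-node count. The abstract does not give the exact scoring function or threshold, so the Python sketch below assumes a simple energy-to-distance ratio and illustrative field names.

# Cluster-head selection from residual energy and distance to the coordinator
# node (CN).  The energy/distance weighting is an assumption; the abstract does
# not specify the scoring function used by the paper.
import math

def select_cluster_head(nodes, cn_pos):
    def score(node):
        d = math.dist(node["pos"], cn_pos)
        return node["energy"] / (d + 1e-9)      # prefer high energy and short distance to the CN
    return max(nodes, key=score)

cluster = [{"id": 1, "energy": 0.8, "pos": (10, 20)},
           {"id": 2, "energy": 0.9, "pos": (40, 35)},
           {"id": 3, "energy": 0.6, "pos": (12, 18)}]
cn_position = (15, 22)
print("cluster head:", select_cluster_head(cluster, cn_position)["id"])

# Loss-ratio check at the CN: grow or shrink the forward-node count.
def adjust_forward_count(loss_ratio, threshold, forward_count):
    return forward_count + 1 if loss_ratio > threshold else max(1, forward_count - 1)

print(adjust_forward_count(loss_ratio=0.15, threshold=0.10, forward_count=3))  # 4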

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

7. Paper 30041256: The requirements of Parallel Data Warehousing Environment to Improve the Performance with dominating sets for Next generation Users (pp. 44-51)

Full Text: PDF

.

Umapavankumar Kethavarapu, Research Scholar at Pondicherry Engineering College, CSE Dept, Pondicherry, India

Dr. S. Saraswathi, Associate Professor, IT Dept, Pondicherry Engineering College, Pondicherry, India

.

Abstract — The data warehousing (DWH) environment is useful for handling bulk data processing by providing techno-functional capabilities; parallelism and distribution in the DWH greatly serve both customers and management. In this paper we describe master data management (MDM), data virtualization, and the integration of middleware technologies into the parallel data warehouse (PDW). This is very useful because users require their data in less time and want to interact with the system in a less complex way. The management of the DWH also wants to handle its data efficiently so as to produce strategic decisions. To meet all these requirements, the use of master data management, data virtualization and middleware technologies (MWT) in the PDW gives a better solution to both developers and customers. The proposed work improves the Extraction, Transformation and Loading (ETL) process, the management of common data in data marts and the data warehouse, and brings middleware benefits to the PDW. To handle faster data processing and work distribution in the PDW with MDM, data virtualization and MWT, we propose one common solution: the use of dominating sets to identify the systems and transactions participating in data processing. The set of participating systems is named the critical sub-system, and the activated transactions are known as critical transactions.
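
The critical sub-system is obtained as a dominating set over the graph of participating systems; the abstract does not say how the set is constructed, so the sketch below uses a standard greedy approximation over hypothetical connectivity data.

# Greedy dominating-set approximation: repeatedly pick the system that covers
# the most still-uncovered systems.  The chosen set plays the role of the
# "critical sub-system" described in the abstract.
def greedy_dominating_set(adj):
    uncovered = set(adj)
    dominating = set()
    while uncovered:
        best = max(adj, key=lambda v: len(({v} | adj[v]) & uncovered))
        dominating.add(best)
        uncovered -= {best} | adj[best]
    return dominating

# Hypothetical connectivity between PDW ETL nodes and data marts.
systems = {
    "etl1": {"mart1", "mart2"},
    "etl2": {"mart2", "mart3"},
    "mart1": {"etl1"},
    "mart2": {"etl1", "etl2"},
    "mart3": {"etl2"},
}
print(greedy_dominating_set(systems))   # e.g. {'etl1', 'etl2'}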

.

Keywords-Data warehousing; parallel Data warehousing; Master Data Management; Data Virtualization; Middle ware technologies; dominating sets; Critical Transactions.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

8. Paper 30041259: Talking Business Card Using Augmented Reality (pp. 52-58)

Full Text: PDF

.

Farimah Ghazaei, Master of Computer Science, Multi Media, Faculty of Computer Science and Information Technology, University Putra Malaysia, Serdang, Malaysia

Sahar Sabbaghi Mahmouei, Master of Smart Technology and Robotic Program, Institute of Advanced Technology (ITMA), University Putra Malaysia, Serdang, Malaysia

.

Abstract — Augmented reality (AR) is a relatively new technology that allows mixing the virtual with the real world in different proportions to achieve a level of immersion that no purely virtual equipment can provide. Recent advances in the field of computers and virtual environments make it possible for AR technology to reach many applications. AR technology aims to enhance the user's perception of and interaction with the real world by augmenting the real world with 3D virtual objects that appear to coexist in the same space as the real world. Traditional business cards are no longer popular, since showing all the relevant data in the small space of a business card is impractical, time consuming or costly. Therefore there is a need for developing an Augmented Reality advertising application which is capable of storing and representing a large amount of data at a reasonable time and cost. This paper presents the current status of AR systems for business cards, a new type of automated application that enhances the effectiveness and attractiveness of marketing for people in a real-life scene. An Augmented Reality Business Card (ARBC) is an effective way of getting people talking about something special. This research proposes a system based on Augmented Reality and the design of a business card with a marker.
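
The core technical step is detecting the printed marker in the camera frame so that virtual content can be anchored to the card. The paper does not name its marker library; the sketch below uses OpenCV's ArUco module (assuming opencv-contrib-python and its classic cv2.aruco API) purely as a stand-in for that detection step.

# Marker detection for an AR business card, sketched with OpenCV's ArUco module
# as a stand-in; the marker system actually used by the paper is not specified.
import cv2

def find_card_marker(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    return corners, ids   # marker corners anchor the virtual overlay (contact details, video, 3D model)

cap = cv2.VideoCapture(0)                 # webcam pointed at the business card
ok, frame = cap.read()
if ok:
    corners, ids = find_card_marker(frame)
    print("markers found:", [] if ids is None else ids.ravel().tolist())
cap.release()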

.

Keywords- Business card; Augmented Reality; Marker; virtual environment.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

9. Paper 30041264: Survival Analysis in Cancer Gene Using Vector Space Model (pp. 59-65)

Full Text: PDF

.

Jitasha Mishra, Debashis Hati, Amritesh Kumar

Computer Science and Engineering, Gandhi Institute Of Technology And Management, Bhubaneswar, India

.

Abstract — The study is an effort to design a stable classification system and perform a survival analysis to categorize microarray gene expression profiles. Currently, high-throughput microarray technology is widely used to simultaneously probe the expression values of thousands of genes in a biological sample. However, due to the nature of DNA hybridization, the expression profiles are highly noisy and demand specialized data mining methods for analysis. Our proposed approach focuses on developing an effective and stable sample classification system using gene expression data. The traditional cancer prognostic tools of tumor stage and morphology are inadequate benchmarks for the accurate determination of patient risk. The emergence of microarray technology has enabled the simultaneous measurement of thousands of gene expression levels, allowing researchers to apply sophisticated data mining and statistical techniques in the search for a superior prognostic methodology. This paper extends an existing procedure called Bayesian Model Averaging (BMA) to Cosine Bayesian Model Averaging for application to survival analysis. Cosine BMA is a method for predicting survival prognosis by isolating a small group of relevant predictor genes from a high-dimensional microarray dataset. In this paper, the Cosine BMA algorithm for survival analysis is applied to two real cancer datasets: diffuse large B-cell lymphoma and breast cancer. The selected genes are used to divide patients into high- and low-risk categories. Results show that the Cosine BMA algorithm for survival analysis consistently selects a small number of relevant genes while providing a higher degree of predictive accuracy than other feature selection methods. The procedure shows promise as a powerful and cost-effective prognostic tool in future cancer research.
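
The cosine ingredient can be illustrated as a similarity ranking of genes against the outcome vector (here the cosine of the mean-centred vectors); the iterative BMA model averaging that the paper builds on is not reproduced, and the data below are synthetic.

# Cosine-similarity ranking of genes against the outcome vector, the ingredient
# that distinguishes the Cosine BMA variant; the BMA model-averaging step itself
# is not shown here.
import numpy as np

def cosine_rank(expression, outcome, top_k=10):
    """expression: genes x samples matrix; outcome: length-samples vector."""
    X = expression - expression.mean(axis=1, keepdims=True)
    y = outcome - outcome.mean()
    sims = (X @ y) / (np.linalg.norm(X, axis=1) * np.linalg.norm(y) + 1e-12)
    return np.argsort(-np.abs(sims))[:top_k]        # indices of the most relevant genes

rng = np.random.default_rng(0)
expr = rng.normal(size=(1000, 40))                  # synthetic microarray: 1000 genes, 40 patients
surv = rng.normal(size=40)                          # synthetic survival scores
print(cosine_rank(expr, surv, top_k=5))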

.

Keywords- Microarray, Supervised learning, survival analysis, classification, DNA, FNAC, biopsy.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

10. Paper 30041204: Impact of Predicate on Object Oriented Programming (pp. 66-68)

Full Text: PDF

.

Mohammad Ahmer Munir Khan, Computer Science Department, ITM University Gurgaon, Gurgaon, Haryana, India

Rita Chhikara, Computer Science Department, ITM University Gurgaon, Gurgaon, Haryana, India

.

Abstract - A programming language is an artificial language designed to express computations that can be performed by a machine, particularly a computer. Object-oriented programming works around objects to relate real-world entities on the computer. An object is the main active body in the real world; it has a set of attributes and performs tasks according to its behavior. The methods in an object-oriented language are defined to give objects the behavior of actions. If we provide constraints and logic predicates in terms of objects, their behavior and attributes, the system will start working like an actual object in the real world, and its behaviors can be invoked according to the logic and constraints, which will provide artificial intelligence in the objects defined by the programmer. In this paper, we propose an object-oriented programming language with predicates around it, to make programming more realistic and closer to the real world.
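
One way to read this proposal is to attach logic predicates to an object's attributes and behaviors so that a behavior is only invoked when its constraints hold. The Python sketch below illustrates the idea with a hypothetical class; it is not the language extension the paper proposes.

# Attaching a predicate to an object's behavior: the method only runs when the
# constraint over the object's state holds, mimicking a real-world rule.
# Class and predicate names are illustrative, not taken from the paper.
class Account:
    def __init__(self, balance):
        self.balance = balance

    # predicate over the object's attributes
    def can_withdraw(self, amount):
        return 0 < amount <= self.balance

    # behavior guarded by the predicate
    def withdraw(self, amount):
        if not self.can_withdraw(amount):
            raise ValueError("predicate can_withdraw violated")
        self.balance -= amount
        return self.balance

acc = Account(100)
print(acc.withdraw(30))      # 70
try:
    acc.withdraw(500)        # violates the predicate, so the behavior is refused
except ValueError as e:
    print(e)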

.

Keywords - Object Oriented Programming, Predicate, Class, Domain declarative programming, imperative programming

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

11. Paper 30041260: A Framework for Multimedia Data Mining in Information Technology Environment (pp. 69-77)

Full Text: PDF

.

Owoade A. Akeem, Ogunyinka T. K., Abimbola B. L.

Department of Computer Science, Tai Solarin University of Education, Ijebu Ode, Nigeria

Department of Computer Science, Gateway (ICT) Polytechnic, Saapade, Remo, Ogun State, Nigeria

.

Abstract - The digital information revolution has brought about profound changes in our society and our lives. The many advantages of digital information have also generated new challenges and new opportunities for innovation, which necessitate the mining of multimedia data: multimedia data sets such as audio, speech, text, web, image, video and combinations of several types are becoming increasingly available, and they are mostly unstructured or semi-structured by nature, which makes it difficult for human beings to extract information without powerful tools. This calls for the development of data mining techniques that can work for all kinds of multimedia data.

.

Keywords: Multimedia data mining, Data mining, Multimedia database.

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

12. Paper 20051203: Intrusion Detection and Prevention System: Classification and Quick Review (pp. 78-83)

Full Text: PDF

.

G. Ramesh Kumar, Research Scholar, Dept. of Computer Science, Dravidian University, Kuppam, Andhra Pradesh.

Dr. Ujwal A. Lanjewar, Research Supervisor, HOD, Dept. of Computer Science, Centre Point College, Samarth Nagar, Wardha Road, Nagpur.

.

Abstract — Peer-to-peer technology, also known as peer computing, is an emerging paradigm that is now viewed as a potential technology for providing a decentralized infrastructure for information sharing. The term peer-to-peer refers to the concept that, in a network of equals (peers) using appropriate information and communication systems, two or more individuals are able to spontaneously collaborate without necessarily needing central coordination. This paper defines P2P concepts, specifies how P2P differs from the client-server model, distributed systems and a grid, and discusses various applications of P2P systems. The main aim of this paper is to review P2P concepts and to highlight the importance of P2P through its advantages.

.

Keywords:

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

13. Paper 30041266: Performance Evaluation for Scalable Recursive Multicast Protocol using NS2 Simulator (pp. 84-87)

Full Text: PDF

.

(1) Jafar Ababneh, (2) Firas E. Albalas, (1) Nidhal Kamel Taha El-Omari, (1) Abdel Rahman A. Alkarabsheh, (1) Abd Alsalam Obiadat, (3) Mahmood Baklizi

(1) Faculty of Science and Information Technology, The World Islamic Sciences and Education (W.I.S.E.) University, Amman, 11947, P.O. Box 1101, Jordan

(2) Faculty of Science and Information Technology, Jadara University, Amman-Irbid Main Road, 21110, P.O. Box 733, Jordan

(3) National Advanced IPv6 Centre (NAV6), Universiti Sains Malaysia, 11800 USM, Penang, Malaysia

.

Abstract - In multicast routing, scalability should be considered; the issue arises because the size of the Multicast Forwarding Table (MFT) grows with the increase in multicast group members or in the number of multicast groups. SReM [1] is a multicast routing protocol that addresses this issue by explicitly encoding the construction of the multicast tree. An extensive performance evaluation of this protocol is presented in this paper. The results show that the protocol improves scalability by minimizing the header size, and also improves the packet delivery ratio and the end-to-end delay.
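
The two reported metrics can be computed by post-processing send and receive records such as those extracted from an ns-2 trace file. The sketch below, with made-up timestamps, shows only the packet delivery ratio and mean end-to-end delay calculation; it is not the SReM protocol itself.

# Post-processing sketch for the two reported metrics: packet delivery ratio and
# average end-to-end delay, computed from (packet id -> send time) and
# (packet id -> receive time) records extracted from a simulation trace.
def evaluate(sent, received):
    delivered = [pid for pid in sent if pid in received]
    pdr = len(delivered) / len(sent)
    delay = sum(received[p] - sent[p] for p in delivered) / len(delivered)
    return pdr, delay

sent = {1: 0.10, 2: 0.20, 3: 0.30, 4: 0.40}         # packet id -> send time (s)
received = {1: 0.15, 2: 0.27, 4: 0.46}              # packet 3 was lost
pdr, delay = evaluate(sent, received)
print(f"delivery ratio = {pdr:.2f}, mean end-to-end delay = {delay*1000:.1f} ms")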

.

Keywords:

.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

.