International Journal of Computer Science and Information Security
Vol. 8 No. 8, November 2010
Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

1. Paper 12101008: A Brief Survey on RFID Security and Privacy Issues (pp. 1-10)
Full Text: PDF

Mohammad Tauhidul Islam
Department of Mathematics and Computer Science, University of Lethbridge, Alberta, Canada T1K 3M4.

Abstract—Radio Frequency IDentification (RFID) security and privacy are active research areas involving rich interactions among many disciplines, such as signal processing, supply-chain logistics, hardware design, privacy rights, and cryptography. Connections remain to be explored between the work surveyed here and other areas of study; this paper highlights a few of them. Most of the articles treated in this survey examine security and privacy as an issue between RFID tags and readers, and also compare RFID with other technologies such as barcodes. Tags and readers, of course, lie at the periphery of a full-scale RFID system, and many of the attendant data-security problems, such as authenticating readers to servers, involve already familiar data-security protocols. The paper also discusses key management, cost, and tag collision for RFID, and identifies PIN distribution for tags as one such potential problem.
Keywords- RFID; privacy and security; RFID tags; RFID readers
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

2. Paper 26101031: An Improved Fuzzy Time Series Model For Forecasting (pp. 11-19)
Full Text: PDF

Ashraf K. Abd-Elaal, Department of Computer and Information Sciences, The High Institute of Computer Science, Sohag, Egypt
Hesham A. Hefny, Department of Computer and Information Sciences, Institute of Statistical Studies and Research, Cairo University, Egypt
Ashraf H. Abd-Elwahab, Department of Computer Sciences, Electronics Research Institute National Center for Research, Cairo, Egypt

Abstract— In this paper, an efficient fuzzy time series forecasting model based on fuzzy clustering is introduced to handle forecasting problems and improve forecasting accuracy. Each value (observation) is represented by a fuzzy set, and the transitions between consecutive values are taken into account in order to model the time series data. The proposed model employs eight main steps in both time-invariant and time-variant fuzzy time series models to increase performance. The FCMI method is integrated into the fuzzy time series process to partition the datasets. The proposed model has been applied to forecast the world production of iron and steel and the enrollments of the University of Alabama, and it provides higher forecasting accuracy. Our results show that this approach can lead to satisfactory performance for fuzzy time series.
Keywords- forecasting; fuzzy clustering; fuzzy time series; iron.
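As a minimal illustration of the generic fuzzy time series idea the abstract builds on (interval fuzzification and fuzzy logical relationship groups, in the Song/Chen style), the following sketch forecasts the next value of a series. It is not the authors' eight-step clustering-based model; the interval count and defuzzification rule are illustrative assumptions.

```python
# Generic fuzzy time series forecast sketch (not the paper's FCMI model).
def forecast_fts(series, n_sets=7):
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_sets or 1.0
    mids = [lo + (i + 0.5) * width for i in range(n_sets)]

    def fuzzify(x):
        # index of the interval (fuzzy set) whose support contains x
        return min(int((x - lo) / width), n_sets - 1)

    # fuzzy logical relationship groups: A_i -> {A_j, ...}
    labels = [fuzzify(x) for x in series]
    groups = {}
    for a, b in zip(labels, labels[1:]):
        groups.setdefault(a, []).append(b)

    last = labels[-1]
    nxt = groups.get(last, [last])
    # defuzzify: average of the midpoints of the successor sets
    return sum(mids[j] for j in nxt) / len(nxt)
```

The forecast always falls inside the observed range, since it is an average of interval midpoints.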
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

3. Paper 30101046: The 2D Image-Based Anthropologic Measurement By Using Chinese Medical Acupuncture And Human Body Slice Model (pp. 20-29)
Full Text: PDF

Sheng-Fuu Lin, Institute of Electrical Control Engineering, National Chiao Tung University, 1001 Ta Hsueh Road, Hsinchu City, Taiwan 300, ROC
Shih-Che Chien, Institute of Electrical Control Engineering, National Chiao Tung University, 1001 Ta Hsueh Road, Hsinchu City, Taiwan 300, ROC
Kuo-Yu Chiu, Institute of Electrical Control Engineering, National Chiao Tung University, 1001 Ta Hsueh Road, Hsinchu City, Taiwan 300, ROC

Abstract — Anthropometric measurement of the human body is widely used in daily life and has become indispensable for people and garment manufacturers. Two-dimensional image-based anthropometric measurement systems provide an alternative to traditional and three-dimensional methods, and are attractive because of their lower cost and ease of use. Although such systems exist, most require the user to wear as little as possible to reduce the errors introduced by garments, the measurement equipment is not easily available everywhere, and the setup is often complex. This paper presents an approach with fewer constraints and simpler operation whose performance is as good as manual measurement. In this approach, Chinese medicine acupuncture theory is used to locate the positions of interest, replacing manual marking or other feature extraction methods. For circumference measurement, a human body slice model is used to approximate the circumference shapes, and piecewise Bezier curves approximate the circumference curve. Finally, a garment-thickness compensation system corrects the measurements, which are obtained directly from clothed subjects, to ensure accuracy. With these methods, subjects are not required to wear as little as possible, and the experiments show that the approach is quite comparable to traditional measurement methods.
Keywords - anthropometric measurement; Chinese medicine acupuncture; garment thickness
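As a hedged sketch of the piecewise-Bezier step mentioned above, the following evaluates cubic Bezier segments and estimates a circumference by dense chord sampling. The control points and sampling density are illustrative, not the paper's slice-model data.

```python
# Hypothetical sketch: approximating a closed circumference with piecewise
# cubic Bezier segments and estimating its length by dense sampling.
import math

def bezier_point(p0, p1, p2, p3, t):
    # closed-form cubic Bezier evaluation at parameter t in [0, 1]
    u = 1.0 - t
    x = u**3 * p0[0] + 3*u*u*t * p1[0] + 3*u*t*t * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3*u*u*t * p1[1] + 3*u*t*t * p2[1] + t**3 * p3[1]
    return x, y

def curve_length(segments, samples=200):
    # sum chord lengths over finely sampled points of each Bezier segment
    total = 0.0
    for p0, p1, p2, p3 in segments:
        prev = bezier_point(p0, p1, p2, p3, 0.0)
        for i in range(1, samples + 1):
            cur = bezier_point(p0, p1, p2, p3, i / samples)
            total += math.hypot(cur[0] - prev[0], cur[1] - prev[1])
            prev = cur
    return total
```

Four such segments with the usual control-point offset k ≈ 0.5523 reproduce a unit circle's circumference to well under one percent, which is the sense in which piecewise Bezier curves can approximate a measured circumference.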
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

4. Paper 31101073: A Fast Fractal Image Encoding Based On Haar Wavelet Transform (pp. 30-36)
Full Text: PDF

Sofia Douda, Département de Mathématiques et Informatique & ENIC, Faculté des Sciences et Techniques, Université Hassan 1er, Settat, Morocco.
Abdallah Bagri, ENIC, Faculté des Sciences et Techniques, Université Hassan 1er, Settat, Morocco.
Abdelhakim El Imrani, LCS, Faculté des Sciences, Université Mohammed V, Rabat, Morocco
Abstract — In order to improve fractal image encoding, we propose a fast method based on the Haar wavelet transform. The proposed method speeds up fractal image encoding by reducing the size of the domain pool; the reduction uses the Haar wavelet coefficients. Experimental results on the test images show that the proposed method reaches a high speedup factor without decreasing image quality.
Keywords - Fractal image compression, PIFS, Haar wavelet transform, SSIM index.
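For context, a one-level 2D Haar transform produces the kind of coefficients a domain-pool reduction could rank blocks by. The sketch below is the generic Haar step on a square image, not the authors' specific selection rule.

```python
# One-level 2D Haar transform on an n-by-n image (n even), plain lists.
def haar2d_level(img):
    n = len(img)
    # rows: pairwise averages (low-pass) followed by differences (high-pass)
    rows = []
    for r in img:
        avg = [(r[2*i] + r[2*i + 1]) / 2.0 for i in range(n // 2)]
        dif = [(r[2*i] - r[2*i + 1]) / 2.0 for i in range(n // 2)]
        rows.append(avg + dif)
    # columns: the same averaging/differencing applied down each column
    out = [[0.0] * n for _ in range(n)]
    for c in range(n):
        col = [rows[r][c] for r in range(n)]
        avg = [(col[2*i] + col[2*i + 1]) / 2.0 for i in range(n // 2)]
        dif = [(col[2*i] - col[2*i + 1]) / 2.0 for i in range(n // 2)]
        for r, v in enumerate(avg + dif):
            out[r][c] = v
    return out  # top-left quadrant holds the low-frequency approximation
```

On a constant image every detail coefficient is zero, which is what makes such coefficients useful for screening out flat, uninformative domain blocks.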
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

5. Paper 31101082: A New Noise Estimation Technique of Speech Signal by Degree of Noise Refinement (pp. 37-43)
Full Text: PDF

Md. Ekramul Hamid, College of Computer Science, King Khalid University, Abha, Kingdom of Saudi Arabia
Md. Zashim Uddin, Department of Computer Science and Engg., University of Rajshahi, Rajshahi, Bangladesh.
Md. Humayun Kabir Biswas, College of Computer Science, King Khalid University, Abha, Kingdom of Saudi Arabia
Somlal Das, Dept. of Computer Science, University of Rajshahi , Rajshahi, Bangladesh
Abstract — An improved method for estimating the noise in speech utterances disturbed by additive noise is presented in this paper. We introduce a degree-of-noise refinement of the minima value sequence (MVS), together with some additional noise estimation techniques. Initially, noise is estimated from the valleys of the spectrum, based on the harmonic properties of noisy speech, yielding the MVS. However, the valleys of the spectrum are not pronounced enough to warrant reliable noise estimates, so we use the estimated degree of noise (DON) to adjust the MVS level. The DON is calculated for every English phoneme and averaged over the processing frames for each input SNR. The calculated average DONs are taken as standard values for each input SNR and aligned with the true DON using the least-squares (LS) method, resulting in a function that estimates the degree of noise. This technique makes it possible to estimate the state of the added noise more accurately. We use a two-stage refinement of the estimated DON to update the MVS and to estimate a nonlinear weight for noise subtraction. The proposed noise estimation performs well when integrated with a speech enhancement technique.
Keywords - Noise Estimation, Degree of Noise, Speech Enhancement, Nonlinear Weighted Noise Subtraction

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

6. Paper 31101060: Scalable Video Coding in Online Video transmission with Bandwidth Limitation (pp. 44-47)
Full Text: PDF

Sima Ahmadpour, Salah Noori Saleh, Omar Amer Abouabdalla, Mahmoud Baklizi, Nibras Abdullah
National Advanced IPv6 Center of Excellence, University Science Malaysia, Penang, Malaysia

Abstract— Resource limitations and the variety of networks and users create many obstacles when transmitting data, especially online video data, through a network. Video applications on the Internet face significant growth in several market segments; bandwidth limitation is one of those challenges and is considered the main obstacle in this paper.
Keywords - bandwidth limitation, video codec, video conferencing, SVC
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

7. Paper 30101053: Off-Line Handwritten Signature Retrieval using Curvelet Transforms (pp. 48-51)
Full Text: PDF

M. S. Shirdhonkar, Dept. of Computer Science and Engineering, B.L.D.E.A’s College of Engineering and Technology, Bijapur, India
Manesh Kokare, Dept. of Electronics and Telecommunication, S.G.G.S Institute of Engineering and Technology, Nanded, India

Abstract — In this paper, a new method for offline handwritten signature retrieval based on the curvelet transform is proposed. Many applications in image processing require similarity retrieval of an image from a large collection of images, in which case image indexing becomes important for efficient organization and retrieval. This paper addresses the issue in the context of a database of handwritten signature images and describes a system for similarity retrieval. The proposed system uses curvelet-based texture feature extraction. The performance of the system has been tested with an image database of 180 signatures. The results obtained indicate that the proposed system is able to identify signatures with great accuracy even when part of a signature is missing.
Keywords- Handwritten recognition, Image indexing, Similarity retrieval, Signature verification, Signature identification.
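The retrieval step itself, once features are extracted, is typically a nearest-neighbor ranking. A minimal sketch, with stand-in feature vectors rather than actual curvelet texture features:

```python
# Generic similarity retrieval: rank database entries by Euclidean distance
# between feature vectors. Feature extraction is assumed done elsewhere.
import math

def retrieve(query_feats, db, top_k=5):
    # db: list of (image_id, feature_vector) pairs
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    ranked = sorted(db, key=lambda item: dist(query_feats, item[1]))
    return [image_id for image_id, _ in ranked[:top_k]]
```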

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

8. Paper 14101014: Low Complexity MMSE Based Channel Estimation Technique for LTE OFDMA Systems (pp. 52-56)
Full Text: PDF

Md. Masud Rana, Department of Electronics and Radio Engineering Kyung Hee University, South Korea
Abbas Z. Kouzani, School of Engineering, Deakin University, Geelong, Victoria 3217, Australia

Abstract — Long term evolution (LTE) is designed for high data rates, higher spectral efficiency, and lower latency, as well as high-capacity voice support. LTE uses the single carrier frequency division multiple access (SC-FDMA) scheme for uplink transmission and orthogonal frequency division multiple access (OFDMA) in the downlink. Among the most important challenges for a terminal implementation are channel estimation (CE) and equalization. In this paper, a minimum mean square error (MMSE) based channel estimator is proposed for OFDMA systems that avoids the ill-conditioned least squares (LS) problem and has lower computational complexity. This channel estimation technique uses knowledge of the channel's properties to estimate the unknown channel transfer function at non-pilot subcarriers.
Index Terms — Channel estimation, LTE, least squares, OFDMA, SC-FDMA.
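A toy per-subcarrier sketch of the LS and MMSE ideas at pilot positions: LS divides the received pilot by the known transmitted pilot, and with a diagonal channel correlation the MMSE estimator reduces to a scalar Wiener weighting of the LS estimate. The paper's estimator is more general; all values here are illustrative assumptions.

```python
# Toy LS vs. MMSE channel estimation at pilot subcarriers.
def ls_estimate(rx, pilot):
    # least squares: divide each received pilot by the known transmitted pilot
    return [r / p for r, p in zip(rx, pilot)]

def mmse_estimate(rx, pilot, channel_var, noise_var):
    # scalar Wiener filtering of the LS estimate: shrink toward zero by the
    # ratio of channel power to channel-plus-noise power
    gain = channel_var / (channel_var + noise_var)
    return [gain * h for h in ls_estimate(rx, pilot)]
```

As the noise variance goes to zero the MMSE gain approaches one and the two estimates coincide, which is why MMSE mainly helps at low SNR.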
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

9. Paper 27101033: Survey: RTCP Feedback In A Large Streaming Sessions (pp. 57-62)
Full Text: PDF

Adel Nadhem Naeem, Ali Abdulqader Bin Salem, Mohammed Faiz Aboalmaaly, and Sureswaran Ramadass
National Advanced IPv6 Centre, Universiti Sains Malaysia, Pinang, Malaysia
Abstract — RTCP has a scalability limitation in large streaming sessions because of the limited bandwidth allotted to RTCP reports. Many researchers have studied, and are still studying, how to overcome this limitation, and most of them arrive at a tree structure as the solution, though in different ways.
Keywords- RTCP/RTP; Scalability; Large Streaming Sessions
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

10. Paper 27101034: Performance Analysis Of Nonlinear Distortions For Downlink MC-CDMA Systems (pp. 63-70)
Full Text: PDF

Labib Francis Gergis
Misr Academy for Engineering and Technology, Mansoura, Egypt

Abstract - The multi-carrier (MC) scheme has become a promising technique for its spectral efficiency and robustness against frequency-selective fading. Multi-carrier code division multiple access (MC-CDMA) is a powerful modulation technique that is being considered in many emerging broadband communication systems. MC-CDMA combines the advantages of multi-carrier modulation with those of code division multiple access (CDMA) to offer reliable high-data-rate downlink cellular communication services. MC-CDMA signals are a superposition of many narrow-band signals and, as a result, suffer from strong envelope fluctuations that make them very prone to the nonlinear effects introduced by a high power amplifier (HPA). The HPA introduces conversion in both amplitude and phase. In this paper we focus on the signals at the output of the nonlinear distorting device and present a practical technique for determining the bit error rate (BER) of downlink MC-CDMA systems using the binary phase-shift keying (BPSK) modulation scheme. The results are applicable to systems employing coherent demodulation with maximal ratio combining (MRC) or equal gain combining (EGC).
Keywords- MC-CDMA systems, high power amplifiers, nonlinear distortions, maximal ratio combining (MRC), equal gain combining (EGC).
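As a baseline for BER analyses of this kind, the classical BPSK bit-error rate over an ideal AWGN channel is BER = 0.5·erfc(√(Eb/N0)). The paper analyzes the harder nonlinear MC-CDMA case with combining; the sketch below is only this textbook reference point.

```python
# Classical BPSK BER over AWGN, as a baseline reference curve.
import math

def bpsk_ber_awgn(ebn0_db):
    ebn0 = 10 ** (ebn0_db / 10.0)  # convert dB to linear Eb/N0
    return 0.5 * math.erfc(math.sqrt(ebn0))
```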
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

11. Paper 16101016: Channel Estimation Algorithms, Complexities and LTE Implementation Challenges (pp. 71-76)
Full Text: PDF
Md. Masud Rana
Department of Electronics and Communication Engineering, Khulna University of Engineering and Technology, Khunla, Bangladesh

Abstract — The main goals of long term evolution (LTE) are substantially improved end-user throughput, low latency, reduced user equipment (UE) complexity, high data rates, and a significantly improved user experience with full mobility. LTE uses single carrier frequency division multiple access (SC-FDMA) for uplink transmission and orthogonal frequency division multiple access (OFDMA) for downlink transmission. The major challenges for LTE terminal implementation are efficient channel estimation (CE) methods and equalization. This paper discusses the basic CE techniques and future directions for research in the CE field. Simulation results demonstrate that the linear minimum mean square error (LMMSE) CE method outperforms the least squares (LS) CE method in terms of mean square error (MSE) by around 3 dB. Hence, given the resources and specifications of an LTE system, an appropriate method among those presented can be applied to OFDMA systems.
Keywords — LS, LMMSE, LTE, OFDMA.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

12. Paper 18101017: Implementation Of Wavelet And RBF For Power Quality Disturbance Classification (pp. 77-82)
Full Text: PDF

Pramila P 1, Puttamadappa C 2 and S. Purushothaman 3
1 Department of Electrical & Electronics Engineering, Bangalore Institute Of Technology, Bangalore, India
2 Department of Electronics & Communication Engineering, SJB Institute Of Technology, Bangalore, India
3 Sun College Of Engineering & Technology, Sunnagar, Kanyakumari, Tamilnadu, India

Abstract - This paper presents an application of wavelets and a Radial Basis Function (RBF) network for power quality disturbance classification. Features are extracted from the electrical signals using db wavelets. The features obtained from the wavelet transform are unique to each type of electrical fault. These features are normalized and given to the RBF network. The required data are generated by simulating various faults in the test system. The performance of the proposed method is compared with existing feature extraction techniques, and simulation results show the effectiveness of the proposed method for power quality disturbance classification.
Keywords - wavelets, Radial Basis Function (RBF), Harmonics, Power quality
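The classification stage described above can be pictured as a standard RBF network forward pass: Gaussian activations over stored centers followed by a linear output layer. The centers, widths, and weights below are placeholders, not values trained on power quality data.

```python
# Generic RBF network forward pass for classification.
import math

def rbf_forward(x, centers, widths, weights):
    # hidden layer: Gaussian response of input x to each stored center
    hidden = [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c))
                       / (2.0 * s ** 2))
              for c, s in zip(centers, widths)]
    # output layer: one linear combination of hidden activations per class
    return [sum(w * h for w, h in zip(ws, hidden)) for ws in weights]
```

The predicted disturbance class is simply the index of the largest output.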
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

13. Paper 28091037: GA-ANN based Dominant Gene Prediction in Microarray Dataset (pp. 83-91)
Full Text: PDF

Manaswini Pradhan , Lecturer, P.G. Department of Information and Communication Technology, Fakir Mohan University, Orissa, India
Dr. Sabyasachi Pattnaik, Reader,P.G. Department of Information and Communication Technology, Fakir Mohan University, Orissa, India.
Dr. B. Mittra, Reader, School of Biotechnology, Fakir Mohan University, Orissa, India
Dr. Ranjit Kumar Sahu, Assistant Surgeon, Post Doctoral Department of Plastic and Reconstructive Surgery, S.C.B. Medical College, Cuttack, Orissa, India

Abstract - Genome analysis of a human being permits useful insight into the ancestry of that person and also facilitates the determination of that person's weaknesses and susceptibilities toward inherited diseases. The amount of accumulated genome data is increasing at a tremendous rate with the rapid development of genome sequencing technologies, and gene prediction is one of the most challenging tasks in genome analysis. Many tools have been developed for gene prediction, which still remains an active research area. Gene prediction involves the analysis of the entire genomic data accumulated in the database, so scrutinizing the predicted genes takes too much time. However, the computational time can be reduced and the process made more effective through the selection of dominant genes. In this paper, a novel method is presented to predict the dominant genes of ALL/AML cancer. First, to train an FF-ANN, a combinational dataset is generated from the input dataset and its dimensionality is reduced through probabilistic principal component analysis (PPCA). Then, the classified database of ALL/AML cancer is given as the training dataset to design the FF-ANN. After the FF-ANN is designed, the genetic algorithm is applied to the test input sequence and the fitness function is computed using the designed FF-ANN. After that, the genetic operations of crossover, mutation, and selection are carried out. Finally, through analysis, the optimal dominant genes are predicted.
Keywords - gene prediction, Microarray gene expression data, Probabilistic PCA (PPCA), dimensionality reduction, Artificial Neural Network (ANN), Back propagation (BP), dominant gene, genetic algorithm.
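The GA loop the abstract describes (selection, crossover, mutation, with an ANN-computed fitness) can be sketched generically as below. The fitness function is an argument standing in for the trained FF-ANN; population size, mutation rate, and the truncation-selection scheme are illustrative assumptions.

```python
# Bare-bones genetic algorithm with one-point crossover, bit-flip mutation,
# and truncation selection over binary chromosomes (e.g., gene subsets).
import random

def genetic_search(fitness, length, pop_size=20, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children                    # parents survive (elitism)
    return max(pop, key=fitness)
```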
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

14. Paper 29091039: Therapeutic Diet Prediction for Integrated Mining of Anemia Human Subjects using Statistical Techniques (pp. 92-95)
Full Text: PDF

Sanjay Choudhary, Department of Mathematics & Computer Science, Govt. Narmada P.G. Mahavidyalaya, Hoshangabad, India
Abha Wadhwa, Department of Computer Science & Application, Govt Girls P.G. College, Hoshangabad, India
Kamal Wadhwa, Department of Mathematics & Computer Science, Govt. Narmada P.G. Mahavidyalaya, Hoshangabad, India
Anjana Mishra, Department of Mathematics & Computer Science, Govt. Narmada P.G. Mahavidyalaya, Hoshangabad, India

Abstract :- The chronic disease anemia [1] occurs when the blood does not have enough hemoglobin. Hemoglobin is a protein in red blood cells that carries oxygen from the lungs to the rest of the body. All body parts need oxygen, and anemia can starve the body of the oxygen it needs to survive. Possible causes of anemia include low vitamin B12 or folic acid intake and some chronic illnesses, but the most common cause is not having enough iron in the blood, which the body needs to make hemoglobin. This type of anemia is called iron deficiency anemia. Data mining is widely used in database communities because of its wide applicability, and one major application area is therapeutic diet prediction. Several chronic diseases can be prevented using nutritive food. This paper presents the association and correlation between anemic human subjects and prevention through diet nutrients. The role of diet in preventing and controlling iron deficiency is significant. Due to changes in dietary and lifestyle patterns anemia can become catastrophic, so by predicting proper and sufficient diet nutrients for individuals, we can reduce the impact of anemia on human subjects.
Keywords :- Chronic Disease, Anemia, Diet Nutrients, Clinical System, Correlation, Data Mining
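Since the abstract rests on correlating anemia indicators with diet nutrients, a minimal illustration of the statistic involved is the Pearson correlation coefficient between two equal-length samples (for example, hemoglobin level versus iron intake, with made-up data rather than the paper's):

```python
# Pearson correlation coefficient of two equal-length samples.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```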
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

15. Paper 29101041: Improve the Test Case Design of Object Oriented Software by Refactoring (pp. 96-100)
Full Text: PDF

Divya Prakash Shrivastava , Department of Computer Science, Al Jabal Al Garbi University, Zawya, Libya
R.C. Jain, Department of Computer Application, Samrat Ashoka Technological Institute, Vidisha, India
Abstract — Refactoring is the process of changing a software system in order to improve the organization of its source code, making the system easier to change and less error-prone while preserving observable behavior. The concept has become popular in agile software methodologies, such as eXtreme Programming (XP), which maintains source code as the only relevant software artifact. Although refactoring was originally conceived to deal with production source code, two key aspects of XP, unit testing and merciless refactoring, make test code a refactoring target as well. We found that refactoring test code differs from refactoring production code in two ways: (1) there is a distinct set of bad smells involved, and (2) improving test code involves additional test code refactorings. We describe a set of code smells that indicate trouble in test code and a collection of test code refactorings explaining how to overcome some of these problems through simple program modifications.
Keywords- Test Smell, Test Case, Refactoring, Unit Testing, Object Oriented, TDD.
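As an illustration of the kind of test-code refactoring the abstract discusses, the example below extracts duplicated fixture setup (a classic test smell) into a shared setUp method. The smell and refactoring names follow the general literature; this is not an example taken from the paper itself.

```python
# Duplicated per-test fixture setup refactored into setUp, preserving the
# observable behavior of the tests.
import unittest

class CartTests(unittest.TestCase):
    def setUp(self):
        # shared fixture, previously copy-pasted into every test method
        self.cart = {"apple": 2, "pear": 1}

    def test_total_items(self):
        self.assertEqual(sum(self.cart.values()), 3)

    def test_remove_item(self):
        self.cart.pop("pear")
        self.assertNotIn("pear", self.cart)
```

Because setUp runs before each test method, the tests stay independent even though they share fixture code.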
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

16. Paper 29101042: Extraction of Information from Images using Dewrapping Techniques (pp. 101-109)
Full Text: PDF

Khalid Nazim S. A., Research Scholar, Singhania University, Rajasthan, India.
Dr. M.B. Sanjay Pande, Professor and Head, Department of Computer Science & Engineering, VVIET, Mysore, India

Abstract - An image containing textual information is called a document image. The textual information in document images is useful in areas such as vehicle number plate reading, passport reading, cargo container reading, and so on. Thus, extracting useful textual information from a document image plays an important role in many applications. One of the major challenges in camera-based document analysis is dealing with wrap and perspective distortions. Despite the prevalence of dewrapping techniques, there is no standard efficient algorithm for performance evaluation that concentrates on visualization. Wrapping is a common appearance of document images before recognition. To capture the document images, a mobile camera with 2-megapixel resolution is used. A database is developed with variations in background, size, and colour, along with wrapped, blurred, and clean images. This database is explored, and text extraction from those document images is performed. For wrapped images, no efficient dewrapping technique has been implemented to date, so text is extracted from the wrapped images by maintaining a suitable template database. Further, the text extracted from the wrapped or other document images is converted into an editable form such as a Notepad or MS Word document. The experimental results were corroborated on various objects of the database.
Keywords: Dewrapping, Template Database, Text Extraction.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

17. Paper 29101043: Secured Authentication Protocol System Using Images (pp. 110-116)
Full Text: PDF

G. Arumugam, Prof. & Head, Computer Science Department, Madurai Kamaraj University, Madurai, India
R. Sujatha, Research Associate, SSE Project, Department of Computer Science, Madurai Kamaraj University, Madurai, India

Abstract — In order to protect secret information in sensitive and varied applications, a secured authentication system incorporating both security and confidentiality should be used. Even if the cryptographic primitives are assumed to be perfect, the security goals may not be achieved: the system itself may have weaknesses that an attacker can exploit in network attacks. In this paper a Secured Authentication Protocol System using Images (SAPSI) is presented. It ensures confidentiality and authentication using a server-based, image-based authentication mechanism.
Keywords- Confidentiality, Security, Server, Image-Based Authentication System, Authentication.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

18. Paper 29101044: SIP and RSW: A Comparative Evaluation Study (pp. 117-119)
Full Text: PDF

Mahmoud Baklizi, Nibras Abdullah, Omar Abouabdalla, Sima Ahmadpour
National Advanced IPv6 Centre of Excellence, Universiti Sains Malaysia, Penang, Malaysia

Abstract — Voice over Internet Protocol (VoIP) is a technology that uses the Internet to transmit voice information digitally. The Session Initiation Protocol (SIP) and Real time Switching (RSW) are signaling protocols that emerged with VoIP and gained popularity among VoIP products. In the literature, many comparative studies have been conducted to evaluate signaling protocols, but none of them addressed these two protocols. In this paper, we make a comparative evaluation and analysis of SIP and RSW using the Mean Opinion Score (MOS) rating. We found that RSW performs better than SIP across different networks in terms of packet delay.
Keywords - VoIP; MOS; InterAsterisk eXchange Protocol; Real-Time Switching; Session Initiation Protocol.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

19. Paper 30091053: A role oriented requirements analysis for ERP implementation in health care Organizations (pp. 120-124)
Full Text: PDF

Kirti Pancholi, Acropolis Institute of Pharmaceutical Education and Research, Indore, MP, India
Durgesh Kumar Mishra, Acropolis Institute of Technology and Research, Indore, MP, India

Abstract - Information is worthwhile only if it can be accessed at the right time, by the right person, and is useful for the purpose defined. Health care providers have a strong tradition of safeguarding private health information. Today's world belongs to information technology: with information broadly held and transmitted electronically, the rule provides clear standards to all parties regarding protection of personal health information. Integration of medical resources has also been a long-standing problem, which needs to be addressed in collaboration with information technology, aiming at a common goal. The complexity and extent of the roles in the planning system demand extensive, seamless integration across the organization. Medical enterprise resource planning (ERP) integrates every level of healthcare staff by providing information and knowledge in a timely manner, making ERP a synchronizing solution for all the roles required in the organization for various timely decisions. With this motivation, this paper proposes a role-oriented requirements definition analysis for ERP implementations in such organizations. Integrated hospitals need a central planning and control system to plan patient processes and the required capacity. Given the changes in healthcare, one can ask what type of information system can best support these healthcare delivery organizations. In this review we focus on the potential of enterprise resource planning (ERP) systems for healthcare delivery organizations.
Keywords: Patient Logistics, Planning and control, clinical management, ERP, AP.

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

20. Paper 30101050: Fuzzy expert system for evaluation of students and online exams (pp. 125-130)
Full Text: PDF

.
Mohammed E. Abd-Alazeem, Computer Science Department, Faculty of Computers and Information, Mansoura, Egypt.
Sherief I. Barakat, Information System Department, Faculty of Computers and Information, Mansoura, Egypt.

.
Abstract - In this paper we introduce an expert system for the evaluation of online exams. We use a fuzzy system to classify students based on their usage data and the final marks obtained in their respective courses. We have used real data from nine Moodle courses taken by Mansoura University pharmacy students and applied the techniques to two hundred students. This expert system facilitates education by playing the role of a virtual intelligent teacher attuned to student capabilities through feedback mechanisms, and it evaluates online exams and questions to measure the difficulty level of exams. The main components of this expert system are the inference engine, the knowledge acquisition facility, and the knowledge base, which constitute the back end of the system. We realize the model as a fuzzy rule-based expert system whose inference engine uses various inference methods for education.
.
Keywords: Fuzzy rule base, Knowledge base, Inference engine
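The fuzzy classification step the abstract describes can be sketched with triangular membership functions; the class names and cut-points below are invented for illustration, not taken from the paper:

```python
# Hypothetical sketch of fuzzy classification of a student's final mark,
# in the spirit of the paper's rule-based approach (labels and ranges invented).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_student(mark):
    """Fuzzify a 0-100 mark into overlapping performance classes."""
    memberships = {
        "fail": tri(mark, -1, 0, 50),
        "pass": tri(mark, 40, 60, 80),
        "excellent": tri(mark, 70, 100, 101),
    }
    # Defuzzify by taking the class with the highest membership degree.
    return max(memberships, key=memberships.get), memberships

label, m = classify_student(75)
```

A real system would combine many such fuzzified inputs (usage counts, per-question scores) through a rule base rather than a single max.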
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

21. Paper 30101051: Intelligent Controller for Networked DC Motor Control (pp. 131-137)
Full Text: PDF

.
B. Sharmila, Department of EIE, Sri Ramakrishna Engineering College, Coimbatore, India
N. Devarajan, Department of EEE, Government College of Technology, Coimbatore, India

.
Abstract — This paper focuses on the feasibility of neural network controllers for Networked Control Systems. Intelligent controllers have been developed for controlling the speed of a networked DC motor by exploiting the features of neural networks and fuzzy logic. The major challenges in Networked Control Systems are the network-induced delays and data packet losses in the closed loop, which degrade performance and can destabilize the system. The proposed Neural Network Controller and Fuzzy Logic Controller schemes aim to improve the performance of the networked DC motor, and their results are compared with a Ziegler-Nichols tuned Proportional-Integral-Derivative controller. The performance of the proposed controllers has been verified through simulation using the MATLAB/SIMULINK package. The results show that the performance of the networked DC motor is improved more by the intelligent controllers than by the other controllers.
.
Keywords- Networked Control Systems (NCS); Network Challenges; Tuning; Proportional – Integral – Derivative Controllers (PID); Fuzzy Logic Controller (FLC); Artificial Neural Networks (ANN).
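The Ziegler-Nichols tuned PID controller used as the comparison baseline follows the standard closed-loop tuning table; a minimal sketch (the ultimate gain and period values are illustrative):

```python
# Classic Ziegler-Nichols closed-loop PID tuning: given the ultimate gain Ku
# (at which the loop sustains oscillation) and the oscillation period Tu,
# the standard table yields the PID gains.

def ziegler_nichols_pid(Ku, Tu):
    """Return (Kp, Ki, Kd) from ultimate gain Ku and ultimate period Tu."""
    Kp = 0.6 * Ku
    Ti = Tu / 2.0          # integral time
    Td = Tu / 8.0          # derivative time
    return Kp, Kp / Ti, Kp * Td

Kp, Ki, Kd = ziegler_nichols_pid(Ku=10.0, Tu=2.0)
```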
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

22. Paper 31101062: A Novel LTCC Bandpass Filter for UWB Applications (pp. 138-140)
Full Text: PDF

.
Thirumalaivasan K. and Nakkeeran R.
Department of Electronics and Communication Engineering, Pondicherry Engineering College, Puducherry-605014, India

.
Abstract — Bandpass filter based on parallel coupled line microstrip structure is designed in low-temperature co-fired ceramic technology (LTCC) suitable for short range Ultra-Wideband (UWB) applications. Fifth order Chebyshev filter of 0.05 dB passband ripple with fractional bandwidth of 62.17% is proposed using insertion loss method. The filter demonstrates -10 dB bandwidth and linear phase response over the frequency range 3.8 GHz - 7.4 GHz. With the above functional features, the overall dimension of the filter is 33.5 mm (height) × 1.6 mm (length) × 1.6 mm (breadth). It is not only compact but also delivers excellent scattering parameters with the magnitude of insertion loss, |S21| lower than -0.09 dB and return loss better than -49 dB. In the passband, the computed group delay is well within the tolerable variation of 0.1 ns.
.
Keywords- Ultra-wideband; bandpass filter; parallel coupled line; low-temperature co-fired ceramic; group delay
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

23. Paper 25101027: Retrieval of Bitmap Compression History (pp. 141-146)
Full Text: PDF

.
Salma Hamdy, Haytham El-Messiry, Mohamed Roushdy, Essam Kahlifa
Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt

.
Abstract — The histogram of Discrete Cosine Transform coefficients contains information on the compression parameters for JPEGs and previously JPEG-compressed bitmaps. In this paper we extend the work in [1] to identify previously compressed bitmaps and to estimate the quantization table that was used for compression from the peaks of the histogram of DCT coefficients. This can help in establishing bitmap compression history, which is particularly useful in applications like image authentication, JPEG artifact removal, and JPEG recompression with less distortion. Furthermore, the estimated table is used to compute distortion measures that classify the bitmap as genuine or forged. The method shows good average estimation accuracy of around 92.88% against MLE and autocorrelation methods. In addition, because bitmaps do not experience data loss, detecting inconsistencies becomes easier. Detection performance resulted in average false negative rates of 3.81% and 2.26% for the two distortion measures, respectively.
.
Keywords: Digital image forensics; forgery detection; compression history; Quantization tables.
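The core estimation idea, recovering the quantization step from the spacing of DCT-coefficient histogram peaks, can be sketched as follows (a simplification of the paper's method; the gcd shortcut stands in for true peak detection on noisy histograms):

```python
# Minimal sketch (not the authors' full method): for one DCT frequency, the
# coefficients of a previously JPEG-compressed image cluster at multiples of
# the quantization step q, so q can be recovered from the histogram spacing,
# here approximated by a gcd over the rounded nonzero coefficient values.
from functools import reduce
from math import gcd

def estimate_q_step(coeffs):
    """Estimate the quantization step from dequantized DCT coefficients."""
    vals = [abs(round(c)) for c in coeffs if abs(round(c)) > 0]
    return reduce(gcd, vals) if vals else 1

# Coefficients that were quantized with q = 8 and then dequantized:
sample = [0, 8, -16, 24, 8, -8, 32, 0]
q = estimate_q_step(sample)
```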
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

24. Paper 11101007: Steganography and Error-Correcting Codes (pp. 147-149)
Full Text: PDF

.
M.B. Ould MEDENI and El Mamoun SOUIDI
Laboratory of Mathematic Informatics and Applications, University Mohammed V-Agdal, Faculty of Sciences, Rabat, BP 1014, Morocco

.
Abstract — In this work we study how error-correcting codes are used in steganographic protocols (sometimes also called “matrix encoding”), which use a linear code as an ingredient. Among all codes of a fixed block length and fixed dimension (and thus of a fixed information rate), an optimal code is one that maximizes the length embeddable (MLE). Steganographic protocols are closely related to error-correcting codes; we clarify this link, which leads to a bound on the maximum capacity a steganographic protocol can have and gives a way to build optimal steganographic protocols.
.
Keywords: Steganography, Error-correcting code, average distortion, matrix encoding, embedding efficiency.
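The matrix encoding the abstract refers to can be illustrated with the [7,4] Hamming code, where 3 message bits are embedded in 7 cover bits by changing at most one bit:

```python
# Matrix encoding with the [7,4] Hamming code: the message is carried by the
# syndrome of the cover bits, so embedding flips at most one cover bit.

H_COLS = [1, 2, 3, 4, 5, 6, 7]  # columns of the parity-check matrix, as integers

def syndrome(bits):
    """XOR of column indices where the cover bit is 1 (a 3-bit value)."""
    s = 0
    for col, b in zip(H_COLS, bits):
        if b:
            s ^= col
    return s

def embed(cover, msg):
    """Embed a 3-bit message (int 0..7) into 7 cover bits, flipping at most one."""
    flip = syndrome(cover) ^ msg
    out = list(cover)
    if flip:                  # flip == 0 means the cover already carries msg
        out[flip - 1] ^= 1    # flip the single position indicated by the syndrome
    return out

cover = [1, 0, 1, 1, 0, 0, 1]
stego = embed(cover, 5)
extracted = syndrome(stego)   # the receiver recovers the message this way
```

This achieves the embedding efficiency the abstract alludes to: 3 bits embedded per at most 1 changed bit.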
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

25. Paper 29091047: A Comparative Study on Kakkot Sort and Other Sorting Methods (pp. 150-155)
Full Text: PDF

.
Rajesh Ramachandran, HOD, Department of Computer Science, Naipunnya Institute of Management & Information Technology, Pongam, Kerala
Dr. E. Kirubakaran, Sr. DGM(Outsourcing), BHEL, Trichy

.
Abstract: Several efficient algorithms have been developed to cope with the popular task of sorting. Kakkot sort is a new variant of Quick sort and Insertion sort. The Kakkot sort algorithm requires O(n log n) comparisons in the worst and average cases. Typically, Kakkot sort is significantly faster in practice than other O(n log n) algorithms because its inner loop can be efficiently implemented on most architectures. This sorting method requires data movement, but less than that of insertion sort, and this data movement can be reduced further by implementing the algorithm using a linked list. In this comparative study the mathematical results of Kakkot sort were verified experimentally on ten randomly generated sets of unsorted numbers. To obtain experimental data to sustain this comparison, four different sorting methods were chosen, the code was executed, and execution times were recorded to verify and analyze the performance. The performance of Kakkot sort was found to be better than that of the other sorting methods.
.
Keywords: Complexity, performance of algorithms, sorting
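Since the exact Kakkot sort rule is only given in the paper, a generic hybrid of the same family, quicksort falling back to insertion sort on small partitions, can serve as a sketch for repeating the timing comparison:

```python
# A quick/insertion hybrid in the spirit described by the abstract; this is a
# standard hybrid, not the authors' Kakkot sort itself.
import random

def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place by insertion."""
    for i in range(lo + 1, hi + 1):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_sort(a, lo=0, hi=None, cutoff=16):
    """Quicksort that hands small partitions to insertion sort."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo < cutoff:             # small partition: insertion sort
            insertion_sort(a, lo, hi)
            return
        pivot = a[(lo + hi) // 2]
        i, j = lo, hi
        while i <= j:                    # Hoare-style partition
            while a[i] < pivot: i += 1
            while a[j] > pivot: j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1; j -= 1
        hybrid_sort(a, lo, j, cutoff)    # recurse on the left part
        lo = i                           # iterate on the right part

data = random.sample(range(1000), 200)
hybrid_sort(data)
```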
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

26. Paper 27101037: A Generalization of the PVD Steganographic Method (pp. 156-159)
Full Text: PDF
.
M. B. Ould MEDENI, Laboratory of Mathematic Informatics and Applications, University Mohammed V-Agdal, Faculty of Sciences, Rabat, BP 1014, Morocco
El Mamoun SOUIDI, Laboratory of Mathematic Informatics and Applications, University Mohammed V-Agdal, Faculty of Sciences, Rabat, BP 1014, Morocco

.
Abstract — In this work we propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach works by dividing the cover into blocks of equal size and then embedding the message in the edge of each block, depending on the number of ones in the left four bits of the pixel. The purpose of this work is to generalize the PVD method [7] with four-pixel differencing instead of two-pixel differencing, and to use LSB substitution to hide the secret message in the cover image.
.
Keywords: Steganography, Watermarking, Least Significant Bit(LSB), PVD method, Digital Images, Information Hiding, Pixel-value differencing.
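For reference, the classic two-pixel PVD embedding that the paper generalizes can be sketched as follows (the range table and pixel-adjustment rule follow the standard scheme, not the authors' four-pixel variant):

```python
# Simplified two-pixel PVD embedding: the message is hidden in the difference
# of a pixel pair; larger differences (edges) carry more bits.

RANGES = [(0, 7), (8, 15), (16, 31), (32, 63), (64, 127), (128, 255)]

def embed_pair(p1, p2, bits):
    """Embed a bit-string into the difference of two pixel values."""
    d = abs(p2 - p1)
    lo, hi = next(r for r in RANGES if r[0] <= d <= r[1])
    n = (hi - lo + 1).bit_length() - 1       # capacity of this range, in bits
    value = int(bits[:n].ljust(n, "0"), 2)   # pad if the message is short
    new_d = lo + value
    change = new_d - d
    # Split the change across both pixels, preserving the sign of (p2 - p1).
    if p2 >= p1:
        return p1 - change // 2, p2 + (change + 1) // 2
    return p1 + (change + 1) // 2, p2 - change // 2

q1, q2 = embed_pair(100, 110, "101")   # d=10 falls in (8,15): 3-bit capacity
```

Extraction recovers the bits as `abs(q2 - q1) - lo` for the matching range.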

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

27. Paper 12101009: Implementation of Polynomial Neural Network in Web Usage Mining (pp. 160-167)
Full Text: PDF

.
S. Santhi, Research Scholar, Mother Teresa Women’s University, Kodaikanal, India
Dr. S. Purushothaman, Principal, Sun College of Engineering and Technology, Nagarkoil, India
.
Abstract — Education, banking, various businesses, and essential human services are available on the Internet, and the number of users and providers of these facilities grows exponentially day by day. People face the challenge of reaching their target among the enormous amount of information on the web, while on the other side the owners of web sites strive to retain their visitors against competitors. Personalized attention to a user is one of the best solutions to meet these challenges. Thousands of papers have been published about personalization, most of them distinct in either gathering users' logs, preprocessing the web logs, or the mining algorithm. In this paper a simple codification is performed to filter the valid web logs. The codified logs are preprocessed with polynomial vector preprocessing and then trained with the backpropagation algorithm. The computational effort is measured on various sets of usage logs. The results demonstrate the superiority of the algorithm over conventional methods.
.
Keywords- web usage mining; Back propagation algorithm; Polynomial vector processing.
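The polynomial vector preprocessing step can be sketched as a degree-2 feature expansion of each codified log vector; the exact expansion used in the paper may differ:

```python
# Hypothetical sketch of polynomial vector preprocessing: augment each
# codified log vector with its degree-2 product terms before feeding it to a
# backpropagation network.
from itertools import combinations_with_replacement

def polynomial_expand(x, degree=2):
    """Return x augmented with all degree-2 product terms."""
    out = list(x)
    for i, j in combinations_with_replacement(range(len(x)), degree):
        out.append(x[i] * x[j])
    return out

features = polynomial_expand([2.0, 3.0])
```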
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

28. Paper 14101013: Efficient Probabilistic Classification Methods for NIDS (pp. 168-172)
Full Text: PDF

.
S.M. Aqil Burney, Meritorious Professor Department of Computer Science, University of Karachi, Karachi-Pakistan
M. Sadiq Ali Khan, Assistant Professor Department of Computer Science, University of Karachi, Karachi-Pakistan.
Jawed Naseem, Principal Scientific Officer-PARC
.
Abstract: As technology improves, attackers try to gain access to network system resources by many means; open loopholes in the network allow them to penetrate it more easily. Various approaches have been tried for the classification of attacks. In this paper we compare two methods, Naïve Bayes and the Junction Tree Algorithm, on a reduced set of features, improving performance compared to the full data set. For feature reduction, PCA is used, which helped in proposing a new method for efficient classification. We propose a Bayesian network-based model with a reduced set of features for intrusion detection. Our proposed method generates a lower false positive rate, which increases detection efficiency by reducing the workload and thus increases the overall performance of an IDS. We also investigated whether conditional independence really affects attack/threat detection.
.
Keywords - Network Intrusion Detection System (NIDS); Bayesian Networks; Junction Tree Algorithm
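The pipeline of PCA-based feature reduction followed by a Gaussian Naïve Bayes classifier can be sketched on toy data (plain NumPy; neither the IDS dataset nor the paper's exact settings are assumed):

```python
# Sketch of the paper's pipeline: PCA for feature reduction, then a Gaussian
# Naive Bayes classifier, demonstrated on synthetic two-class data.
import numpy as np

def pca(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def fit_gnb(X, y):
    """Per-class mean, variance, and prior for Gaussian Naive Bayes."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return stats

def predict_gnb(stats, x):
    def log_posterior(c):
        mu, var, prior = stats[c]
        return np.log(prior) - 0.5 * np.sum(
            np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return max(stats, key=log_posterior)

# Two well-separated synthetic classes standing in for normal/attack records.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 6)), rng.normal(4, 1, (50, 6))])
y = np.array([0] * 50 + [1] * 50)
Z = pca(X, 2)                      # feature reduction, 6 -> 2 dimensions
model = fit_gnb(Z, y)
pred = predict_gnb(model, Z[0])    # classify the first (class-0) record
```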
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

29. Paper 16101015: A Survey on Digital Image Enhancement Techniques (pp. 173-178)
Full Text: PDF

.
V. Saradhadevi, Research scholar, Karpagam University, Coimbatore, India.
Dr. V. Sundaram, Director of MCA, Karpagam Engineering College, Coimbatore, India.

.
Abstract --- Image enhancement is one of the major research fields in image processing. In many applications, such as medical, military, and media applications, image enhancement plays an important role. Many techniques have been proposed by different authors to remove noise from an image and produce a clear visual, and many filters and image smoothing methods are available, but each of these techniques is designed for a particular kind of noise. Recently, neural networks have turned out to be a very effective tool for supporting image enhancement. Neural networks are applied in image enhancement because they provide many advantages over other techniques and can be made suitable for the removal of all kinds of noise through appropriate training data. This paper provides a survey of some of the techniques applied for image enhancement, dealing with several existing methods for image enhancement using neural networks.
.
Keywords --- Image Enhancement, Image Denoising, Neural Network, Image Filter, Image Restoration.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

30. Paper 18101019: A Survey on Designing Metrics Suite to Assess the Quality of Ontology (pp. 179-184)
Full Text: PDF

.
K. R. Uthayan, Department of Information Technology, SSN College of Engineering, Chennai, India
G. S. Anandha Mala, Professor & Head, Department of Computer Science & Engineering, St. Joseph’s College of Engineering, Chennai, India

.
Abstract --- With the persistent growth of the World Wide Web, retrieving relevant information for a user's query has become more difficult. Present search engines offer the user many web pages, but with varying levels of relevancy. To overcome this, the Semantic Web has been proposed by various authors to retrieve and utilize additional semantic information from the web. As the Semantic Web emphasizes sharing knowledge on the internet, this has led to the development and publishing of several ontologies in different domains. In database terminology, the web ontology of a semantic web system is the schema of that system. Since a web ontology is an integral aspect of a semantic web system, the design quality of such a system can be assessed by measuring the quality of its web ontology. This survey focuses on developing good ontologies and draws upon semiotic theory to develop a suite of metrics that assess the syntactic, semantic, pragmatic, and social aspects of ontology quality. The research discusses the metrics that may contribute to developing a high-quality semantic web system.
.
Keywords--- Quality Metrics, Web ontology, Semiotic Metrics, Semantic Quality, Domain modularity.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

31. Paper 20101021: An Anomaly-Based Network Intrusion Detection System Using Fuzzy Logic (pp. 185-193)
Full Text: PDF

.
R. Shanmugavadivu, Assistant professor, Department of Computer Science, PSG College of Arts & Science, Coimbatore.
Dr. N. Nagarajan, Principal, Coimbatore Institute of Engineering and Information Technology, Coimbatore.

.
Abstract—IDSs, which are increasingly a key part of system defense, are used to identify abnormal activities in a computer system. In general, traditional intrusion detection relies on the extensive knowledge of security experts, in particular on their familiarity with the computer system to be protected. To reduce this dependence, various data-mining and machine learning techniques have been used in the literature. In the proposed system, we have designed a fuzzy logic-based system for effectively identifying intrusion activities within a network. The proposed system is able to detect intrusive behavior in networks because its rule base contains a well-chosen set of rules. Here, we use an automated strategy for generating fuzzy rules, obtained from definite rules using frequent items. The experiments and evaluations of the proposed intrusion detection system were performed with the KDD Cup 99 intrusion detection dataset. The experimental results clearly show that the proposed system achieves higher precision in identifying whether records are normal or attacks.
.
Keywords - Intrusion Detection System (IDS); Anomaly based intrusion detection; Fuzzy logic; Rule learning; KDD Cup 99 dataset.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

32. Paper 25101030: Blemish Tolerance in Cellular Automata And Evaluation Reliability (pp. 194-200)
Full Text: PDF

.
Rogheye Parikhani, Engineering Department, Islamic Azad University, Tabriz Branch, Tabriz, Iran
Mohmad Teshnelab, Department of Controls Engineering, Faculty of Electrical and Computer Engineering, KN Toosi University of Technology, Tehran, Iran
Shahram Babaye, Engineering Department, Islamic Azad University, Tabriz Branch, Tabriz, Iran

.
Abstract — The computational paradigm known as quantum-dot cellular automata (QCA) encodes binary information in the charge configuration of Coulomb-coupled quantum-dot cells. Functioning QCA devices made of metal-dot cells have been fabricated and measured. We focus here on the issue of robustness in the presence of disorder and thermal fluctuations. We examine the performance of a semi-infinite QCA shift register as a function of both clock period and temperature. The existence of power gain in QCA cells acts to restore signal levels even in situations where high speed operation and high temperature operation threaten signal stability. Random variations in capacitance values can also be tolerated.
.
Keywords- QCA, molecular electronics, single electronics, quantum-dot cellular automata, nanoelectronics
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

33. Paper 30101048: Feed Forward Neural Network Algorithm for Frequent Patterns Mining (pp. 201-205)
Full Text: PDF

.
Amit Bhagat, Department of Computer Applications
Dr. Sanjay Sharma, Associate Professor, Department of Computer Applications
Dr. K. R. Pardasani, Professor, Department of Mathematics

Maulana Azad National Institute of Technology, Bhopal (M.P.)462051, India
.
Abstract: Association rule mining is used to find relationships among items in large data sets, and frequent pattern mining is an important aspect of it. In this paper, an efficient algorithm named Apriori-Feed Forward (AFF), based on the Apriori algorithm and a feed forward neural network, is presented to mine frequent patterns. The Apriori algorithm scans the database many times to generate frequent itemsets, whereas the Apriori-Feed Forward (AFF) algorithm scans the database only once. Computational results show that the Apriori-Feed Forward (AFF) algorithm performs much faster than the Apriori algorithm.
.
Keywords: Association rule mining, dataset scan, frequent itemsets, Neural Network.
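The single-database-scan idea behind Apriori-Feed Forward can be sketched by counting the support of every itemset in one pass and filtering afterwards (practical only for short transactions; the neural-network component is omitted):

```python
# One pass over the transactions counts every itemset's support, so no second
# database scan is needed; shown on toy market-basket data.
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    counts = Counter()
    for t in transactions:               # the single scan of the database
        items = sorted(set(t))
        for r in range(1, len(items) + 1):
            for combo in combinations(items, r):
                counts[combo] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

db = [["bread", "milk"], ["bread", "butter"], ["bread", "milk", "butter"]]
freq = frequent_itemsets(db, min_support=2)
```

The trade-off is memory: every candidate itemset is counted, which is exactly what a feed-forward structure over itemsets would have to manage for longer transactions.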
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

34. Paper 30101057: An Efficient Vector Quantization Method for Image Compression with Codebook generation using Modified K-Means (pp. 206-212)
Full Text: PDF

.
S. Sathappan, Associate Professor of Computer Science, Erode Arts and Science College, Erode-638 009. Tamil Nadu. India.
.
Abstract — With the growth of the internet and multimedia, compression techniques have become a thrust area in computing. Image compression is a technique for efficiently coding a digital image to reduce the number of bits required to represent it, and many image compression techniques exist for different types of images. In this paper a Vector Quantization (VQ) based compression scheme is introduced, in which low bit-rate still-image compression is performed by compressing the indices of the vector quantizer and generating a residual codebook. The VQ indices are compressed by exploiting the correlation among image blocks, which reduces the bits per index. A residual codebook, similar to the VQ codebook, is generated to represent the distortion produced by VQ; using this residual codebook, the distortion in the reconstructed image is removed, thereby increasing image quality. The proposed technique combines these two methods and replaces the LBG algorithm with a modified k-means algorithm in codebook generation. Experimental results on the standard image Lena show that the proposed scheme can give a reconstructed image with a higher PSNR value than existing image compression techniques.
.
Keywords — Image compression, Vector Quantization, Residual Codebook, Modified K-Means
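Codebook generation can be sketched with plain k-means standing in for the paper's modified variant (the modification itself is not reproduced here):

```python
# VQ codebook generation via k-means: image blocks are clustered and each
# cluster centroid becomes a codeword; plain k-means stands in for the
# paper's modified algorithm.
import random

def kmeans_codebook(vectors, k, iters=20, seed=0):
    random.seed(seed)
    codebook = random.sample(vectors, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:                 # assign each block to nearest codeword
            i = min(range(k), key=lambda j: sum((a - b) ** 2
                                                for a, b in zip(v, codebook[j])))
            clusters[i].append(v)
        for j, cl in enumerate(clusters): # move codewords to cluster centroids
            if cl:
                codebook[j] = tuple(sum(c) / len(cl) for c in zip(*cl))
    return codebook

blocks = [(0, 0), (1, 1), (0, 1), (10, 10), (11, 10), (10, 11)]
cb = kmeans_codebook(blocks, k=2)
```

Encoding then replaces each block by the index of its nearest codeword; the residual codebook described in the abstract would additionally quantize the block-minus-codeword differences.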
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

35. Paper 31101064: Optimization of work flow execution in ETL using Secure Genetic Algorithm (pp. 213-222)
Full Text: PDF

.
Raman Kumar, Saumya Singla, Sagar Bhalla and Harshit Arora
Department of Computer Science and Engineering, D A V Institute of Engineering and Technology, Jalandhar, Punjab, India.
.
Abstract — Data warehouses (DW) typically grow asynchronously, fed by a variety of sources which all serve different purposes, resulting in, for example, different reference data. ETL is a key process for bringing heterogeneous and asynchronous source extracts into a homogeneous environment. The range of data values or data quality in an operational system may exceed the expectations of designers at the time validation and transformation rules are specified, so data profiling of a source during data analysis is recommended to identify the data conditions that will need to be managed by transformation rules and their specifications; this leads to the implementation of the ETL process. Extraction-Transformation-Loading (ETL) tools are sets of processes by which data is extracted from numerous databases, applications, and systems, transformed as appropriate, and loaded into target systems, including, but not limited to, data warehouses, data marts, and analytical applications. Usually ETL activity must be completed in a certain time frame, so there is a need to optimize the ETL process. A data warehouse contains multiple views accessed by queries, and one of the most important decisions in designing it is selecting views to materialize for the purpose of efficiently supporting decision making; heuristics have therefore been used to search for an optimal solution, and evolutionary algorithms for materialized view selection based on multiple global processing plans for queries have also been implemented. While ETL systems typically work on the theory of random numbers, this paper shows that the optimal solution for ETL systems can be reached in fewer stages using a genetic algorithm. Reaching the optimal solution early saves bandwidth and CPU time, which can then be used efficiently for other tasks. The proposed scheme is therefore secure and efficient against notorious conspiracy goals in information processing.
.
Keywords- Extract, Transform, Load, Data Warehouse (DW), Genetic Algorithm (GA), Architecture, Information Management System, Virtual Storage Access Method and Indexed Sequential Access Method
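The genetic-algorithm search the abstract describes can be sketched on a toy bit-string objective; the count-of-ones fitness below is an invented stand-in for the paper's ETL cost model:

```python
# Toy genetic algorithm: evolve a bit-string (e.g. which views to materialize)
# toward an optimum via selection, one-point crossover, and mutation.
import random

def genetic_optimize(n_bits=20, pop_size=30, generations=60, seed=1):
    random.seed(seed)
    fitness = lambda ind: sum(ind)            # placeholder objective
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_bits)       # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_optimize()
```

A real ETL optimizer would replace `fitness` with the estimated cost of a workflow or view-selection plan.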
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

36. Paper 31101066: 3D Protein Structure Comparison and Retrieval Methods : Investigation Study (pp. 223-227)
Full Text: PDF

.
Muhannad A. Abu-Hashem, Nur’Aini Abdul Rashid, Rosni Abdullah, Hesham A. Bahamish
School of Computer Science, Universiti Sains Malaysia USM, Penang, Malaysia

.
Abstract— The rapid daily growth of computational biology databases opens the door for researchers in this field of study. Although much work has been done in this field, the results and performance are still imperfect due to insufficient review of the current methods. In this paper we discuss the most common and popular methods in the field of 3D protein structure comparison and retrieval, as well as the representation methods that have been used to support the similarity process in order to get better results. The most important challenge in the study of protein structure is to identify its function and chemical properties, and the main factor in determining them is the three-dimensional structure of the protein; in other words, we cannot identify the function of a protein unless we represent it in its three-dimensional structure. Hence, many methods have been proposed for protein 3D structure representation, comparison, and retrieval. This paper summarizes the challenges, advantages, and disadvantages of the current methods.
.
Keywords-3D protein structure; protein structure retrieval; protein structure comparison; PDB;
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

37. Paper 31101084: The Impact of Speed on the Performance of Dynamic Source Routing in Mobile Ad-Hoc Networks (pp. 228-233)
Full Text: PDF

.
Naseer Ali Husieen, Osman B Ghazali, Suhaidi Hassan, Mohammed M. Kadhum
Internetworks Research Group, College of Arts and Sciences, Universiti Utara Malaysia, 06010 UUM Sintok, Malaysia

.
Abstract — Ad-hoc networks are characterized by multihop wireless connectivity and frequently changing network topology, so efficient dynamic routing protocols play an important role. Due to mobility in an ad-hoc network, the topology of the network may change rapidly. Mobility models represent the moving behavior of each mobile node in the MANET and should be realistic. This paper concerns the performance of a mobile ad-hoc network (MANET) routing protocol with respect to the effects of the mobility model on the DSR protocol, for the purpose of finding optimal node-speed settings. We evaluate the performance of the DSR protocol using the Random Waypoint Mobility Model in terms of node speed, number of connections, and number of nodes.
.
Keywords - MANET, Mobility Models, Routing Protocol, DSR Protocol.
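The Random Waypoint Mobility Model used in the evaluation can be sketched as follows: each node repeatedly picks a random destination and speed, travels there, then pauses (the area, speed range, and pause time below are illustrative):

```python
# Random Waypoint mobility sketch for one node: pick a waypoint and a speed,
# move there, pause, repeat; returns the (time, x, y) trace at each waypoint.
import math
import random

def random_waypoint(steps, area=(1000, 1000), speed=(1, 20), pause=2, seed=4):
    random.seed(seed)
    x, y = random.uniform(0, area[0]), random.uniform(0, area[1])
    trace = [(0.0, x, y)]
    t = 0.0
    for _ in range(steps):
        dx, dy = random.uniform(0, area[0]), random.uniform(0, area[1])
        v = random.uniform(*speed)
        t += math.hypot(dx - x, dy - y) / v   # travel time to the waypoint
        x, y = dx, dy
        trace.append((t, x, y))
        t += pause                            # pause before the next leg
    return trace

trace = random_waypoint(steps=5)
```

In a simulator such as ns-2 this runs per node, and higher maximum speeds make the topology, and hence DSR's routes, churn faster.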
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

38. Paper XXXXXX: Multidimensionality in Agile Software Development (pp. 234-238)
Full Text: PDF

.
Ashima, Assistant Professor, Computer Science and Engineering Department, Thapar University, Patiala
Dr. Himanshu Aggarwal, Associate Professor. Faculty of Computer Engineering, Punjabi University, Patiala.

.
Abstract - Among new software development processes, Agile Software Development (ASD) gives the software industry a new idea of quick and timely delivery of product. Agile methodologies have received an overwhelming response from all levels of software organizations, but the limited scope for software design and reusability of components keeps them from being the first choice of software development processes and professionals. Agility addresses multidimensional constraints: software design and reusability, architecture and risk, iterations and changeability. Rapid development combined with changeability at later phases adds charm to ASD, but missing design and reusability act as hurdles. The popularity of any software product lies in the length of its stay in the market, which of course yields rewards in terms of money compared to the investment. Agility's approach of developing specialized components also lessens their probability of staying long in the market. This paper aims to find how reusability can be achieved in ASD by adding a bit of design and developing specialized-cum-generalized components.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

39. Paper 31101093: Aggregating Intrusion Detection System Alerts Based on Row Echelon Form Concept (pp. 239-242)
Full Text: PDF

.
Homam El-Taj, Omar Abouabdalla, Ahmed Manasrah, Moein Mayeh, Mohammed Elhalabi
National Advanced IPv6 Center (NAv6), Universiti Sains Malaysia, Penang 11800, Malaysia

.
Abstract — Intrusion Detection Systems (IDS) are among the best-known systems used to secure computer environments. These systems trigger thousands of alerts per day, which becomes a serious issue for analysts, who must assess the severity of the alerts along with attributes such as IP addresses and ports to better understand the relations between the alerts, and in turn the attacks themselves. This paper investigates the most popular aggregation methods that deal with IDS alerts. In addition, we propose the Time Threshold Aggregation (TTA) algorithm to handle IDS alerts. TTA uses time as the main component for aggregating alerts. TTA also supports aggregating alerts without a threshold, which is done by setting the threshold value to 0.
.
Keywords—Intrusion Detection System, False Positive, Redundant Alerts, Alert Aggregation.
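The time-threshold idea in the abstract can be sketched as follows: alerts sharing the same attributes are merged into one aggregate as long as each arrives within the threshold of the group's most recent alert. This is a minimal illustrative sketch, not the paper's TTA implementation; the `Alert` fields, the matching rule, and the reading of threshold 0 as "no time constraint" are assumptions made here for brevity.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    signature: str
    src_ip: str
    dst_ip: str
    timestamp: float  # seconds since some epoch

def aggregate_tta(alerts, threshold):
    """Merge alerts sharing the same attributes when they arrive within
    `threshold` seconds of their group's most recent alert; threshold 0
    is read here as "no time constraint" (merge on attributes alone)."""
    groups = []  # each entry: [representative_alert, count, last_timestamp]
    for a in sorted(alerts, key=lambda x: x.timestamp):
        for g in groups:
            same = (g[0].signature, g[0].src_ip, g[0].dst_ip) == \
                   (a.signature, a.src_ip, a.dst_ip)
            if same and (threshold == 0 or a.timestamp - g[2] <= threshold):
                g[1] += 1            # fold the alert into the group
                g[2] = a.timestamp   # and advance the group's clock
                break
        else:
            groups.append([a, 1, a.timestamp])
    return [(g[0], g[1]) for g in groups]
```

With a 5-second threshold, two alerts 1.5 s apart collapse into one aggregate while a third arriving a minute later starts a new one; with threshold 0, all attribute-identical alerts collapse regardless of timing.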
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

40. Paper 31101056: Evaluation of Vision based Surface Roughness using Wavelet Transforms with Neural Network Approach (pp. 243-252)
Full Text: PDF

.
T.K. Thivakaran, Research Scholar, MS University, Thirunelveli
Dr. RM. Chandrasekaran, Professor, Annamalai University, Chidambaram

.
Abstract --- Machine vision for industry has generated a great deal of interest in the technical community over the past several years. Extensive research has been performed on machine vision applications in manufacturing, because the approach is non-contact and faster than contact methods. Using machine vision, it is possible to evaluate and analyze a surface area: the vision system extracts information from an array of sensors to enable the user to make intelligent, application-specific decisions. In this work, surface roughness is estimated and analyzed using digital images of machined surfaces obtained by a machine vision system. Features are extracted from the enhanced images in the spatial frequency domain using a two-dimensional Fourier Transform and a Wavelet Transform. An artificial neural network (ANN) is trained on the wavelet feature values as input and tested to produce the roughness parameter Rt as output. The Rt estimates from the ANN are compared with the Rt values obtained from the stylus method, and the best correlation between the two is determined.
.
Keywords--- Surface roughness, Machine vision, Milling, Grinding, Wavelet Transform, Neural Network.
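As a rough illustration of the feature-extraction step described above, the following sketch computes radial band energies of an image's 2-D Fourier spectrum with NumPy. The band count and the use of the FFT rather than the paper's wavelet transform are simplifications chosen here; the paper's actual feature set may differ.

```python
import numpy as np

def fft_band_energies(img, n_bands=4):
    """Split the 2-D power spectrum of `img` into concentric radial
    bands around the DC component and sum the energy in each band.
    Rougher surfaces put more energy into the higher-frequency bands,
    so the resulting vector can serve as input features for an ANN."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - cy, xx - cx)       # radius of each frequency bin
    rmax = r.max() + 1e-9
    return [spectrum[(r >= rmax * i / n_bands) &
                     (r < rmax * (i + 1) / n_bands)].sum()
            for i in range(n_bands)]
```

A perfectly flat (constant) image concentrates all spectral energy in the DC band, while a textured surface spreads energy into the outer bands.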
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

41. Paper 31101059: An In-Depth Study on Requirement Engineering (pp. 253-262)
Full Text: PDF

.
Mohammad Shabbir Hasan, Abdullah Al Mahmood, Farin Rahman, Sk. Md. Nahid Hasan,
Panacea Research Lab, Dhaka, Bangladesh

.
Abstract — Software development includes Requirement Engineering (RE), which comprises discovering stakeholders and their needs, proper documentation, communication, and subsequent implementation. It can be viewed as the most crucial phase, as the success of the software depends largely on it. Requirement Engineering is receiving increasing attention from researchers and from practitioners of software development. This paper investigates RE activities in detail, discusses some current challenges, and proposes suggestions for overcoming them.
.
Keywords- Software Requirement, Requirement Engineering, Requirement Elicitation, Requirement Management.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

42. Paper 31101072: GCC license plates detection and recognition using morphological filtering and neural networks (pp. 263-269)
Full Text: PDF

.
Mohammed Deriche,
Electrical Engineering Department, King Fahd University of Petroleum and Minerals, Dhahran, 31261 Saudi Arabia.

.
Abstract — License Plate Recognition (LPR) systems play an important role in intelligent transportation applications. These systems have been used extensively in highway and bridge toll collection, port and airport gate monitoring, and parking applications, to mention a few. We propose here an automatic license plate detection and recognition system for GCC countries' license plates, which contain Arabic letters and numerals. The system introduces a robust algorithm for extracting the license plate region using adaptive thresholding and morphological filtering. The recognition stage is based on extracting LDA (Linear Discriminant Analysis) features and classifying them with a neural network. Preliminary experiments have been carried out with real images of vehicles captured under various conditions. The proposed system is shown to achieve high recognition accuracy under different illumination conditions.
.
Keywords- license plate recognition; LDA; Arabic character recognition; GCC countries
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

45. Paper 29101040: Localization Accuracy Improved Methods Based on Adaptive Weighted Centroid Localization Algorithm in Wireless Sensor Networks (pp. 284-288)
Full Text: PDF

.
Chang-Woo Song, Jun-Ling Ma, Jung-Hyun Lee, Department of Information Engineering, INHA University, Incheon, Korea.
Kyung-Yong Chung, Department of Computer Information Engineering, Sangji University, Wonju, Korea
Kee-Wook Rim, Department of Computer and Information Science, Sunmoon University, Asan, Korea
.
Abstract — Localization of nodes is a key technology for applications of wireless sensor networks, and equipping every sensor node with a GPS receiver is costly. In the past, several approaches, both range-based and range-free, have been proposed to calculate positions for randomly deployed sensor nodes. Most of them use special nodes, called anchor nodes, which are assumed to know their own locations; other sensors compute their locations from the information these anchor nodes provide. This paper uses a single mobile anchor node that moves through the sensing field and periodically broadcasts its current position. We provide an adaptive weighted centroid localization algorithm whose coefficients, determined by the mobile anchor node's influence on the unknown nodes, improve localization accuracy. We also suggest a criterion for selecting which mobile anchor positions participate in computing a node's location, further improving accuracy. The localization accuracy of the adaptive weighted centroid localization algorithm is better than that of the commonly used maximum likelihood estimation.
.
Keywords- Weighted Centroid Algorithm; Wireless Sensor Networks; Localization
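The weighted-centroid idea the abstract builds on can be sketched as follows: an unknown node averages the anchor's broadcast positions, each weighted by a coefficient reflecting how strongly that broadcast influences the node. Here the weight is simply caller-supplied (e.g. derived from received signal strength); the paper's adaptive coefficient derivation is more involved, so treat this as a plain weighted-centroid sketch under that assumption.

```python
def weighted_centroid(observations):
    """Estimate an unknown node's (x, y) position as the weighted
    centroid of anchor broadcast positions. `observations` is a list
    of (x, y, weight) tuples; a higher weight (e.g. a stronger RSSI,
    implying a closer broadcast point) pulls the estimate toward that
    anchor position."""
    total = sum(w for _, _, w in observations)
    x = sum(x * w for x, _, w in observations) / total
    y = sum(y * w for _, y, w in observations) / total
    return x, y
```

With equal weights this reduces to the plain centroid; unequal weights shift the estimate toward the more influential broadcast positions.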
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

46. Paper 31101068: A Novel Hybridization of ABC with CBR for Pseudoknotted RNA Structure (pp. 289-299)
Full Text: PDF

.
Ra’ed M. Al-Khatib, Nur’Aini Abdul Rashid and Rosni Abdullah
School of Computer Science, Universiti Sains Malaysia USM, Penang, Malaysia

.
Abstract— The RNA molecule is known to perform important functions in living cells. The class of RNA with pseudoknots has essential roles in the design of remedies for many viral diseases in the therapeutic domain, and these functions can be inferred from RNA secondary structure with pseudoknots. Many computationally intensive efforts have emerged with the aim of predicting pseudoknotted RNA secondary structure. Computational approaches are especially promising for predicting RNA structure because the experimental methods for determining RNA tertiary structure are difficult, time-consuming and tedious. In this paper, we introduce ABCRna, a novel method for predicting RNA secondary structure with pseudoknots. The method combines the heuristic-based KnotSeeker with a thermodynamic programming model, UNAFold. ABCRna is a hybrid swarm-intelligence method inspired by the honey-secreting process in natural honey-bee colonies. The novel aspect of this method is its adoption of Case-Based Reasoning (CBR) and a knowledge base, two prominent Artificial Intelligence techniques, employed specifically to enhance the quality of the proposed method. The CBR provides intelligent decisions, which result in a more accurately predicted RNA structure. The modified ABCRna method is tested on different kinds of RNA sequences to demonstrate and compare its efficiency against other pseudoknotted RNA prediction methods in the literature. The proposed ABCRna algorithm runs faster with a significant improvement in accuracy, even for long RNA sequences.
.
Keywords- RNA secondary structure; pseudoknots; Case-Based Reasoning; Artificial Bee Colony (ABC) algorithm.
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.

47. Paper 31101070: Hybrid JPEG Compression Using Histogram Based Segmentation (pp. 300-306)
Full Text: PDF

.
M. Mohamed Sathik, Department of Computer Science, Sadakathullah Appa College, Tirunelveli, India.
K. Senthamarai Kannan, Department of Statistics, Manonmaniam Sundaranar University, Tirunelveli, India.
Y. Jacob Vetha Raj, Department of Statistics, Manonmaniam Sundaranar University, Tirunelveli, India.

.
Abstract-- Image compression is an inevitable solution for image transmission, since channel bandwidth is limited and the demand is for faster transmission. Storage limitations also force the use of image compression, as color and spatial resolutions keep increasing with quality requirements. JPEG is a widely used compression technique; it applies uniform quantization and threshold values to maintain a uniform quality across the entire image. The proposed method instead estimates the vitality of each block of the image and adapts the quantization and threshold values accordingly. This ensures that the vital areas of the image are preserved at higher quality than the other areas. This hybrid approach increases the compression ratio while producing an output image of the desired high quality.
.
Key words-- Image Compression, Edge Detection, Segmentation, Image Transformation, JPEG, Quantization.
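The block-adaptive idea can be sketched as follows: estimate each 8x8 block's vitality (here simply its pixel variance, an assumption made for illustration) and scale the quantization table more gently for vital blocks and more coarsely for background. The uniform base table, the variance threshold, and the scale factors below are illustrative choices, not the paper's values.

```python
import numpy as np

BASE_Q = np.full((8, 8), 16.0)  # illustrative uniform base quantization table

def is_vital(pixel_block, var_threshold=100.0):
    """Crude vitality test: high-variance blocks carry edges/detail."""
    return float(np.var(pixel_block)) > var_threshold

def adaptive_quantize(dct_block, vital, fine=0.5, coarse=2.0):
    """Quantize an 8x8 block of DCT coefficients with a table scaled by
    the block's vitality: finer steps preserve vital regions, coarser
    steps compress the background harder."""
    scale = fine if vital else coarse
    q = np.maximum(np.round(BASE_Q * scale), 1)  # never quantize by 0
    return np.round(dct_block / q).astype(int)
```

A flat background block is classified as non-vital and quantized coarsely (most coefficients collapse toward zero), while a high-contrast block keeps finer coefficient resolution.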

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------.



Comments