Vol. 9 No. 9, September 2011
International Journal of Computer Science and Information Security

.
Copyright © IJCSIS. This is an open access journal distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
1. Paper 31081156: Using Image Steganography to Establish Covert Communication Channels (pp. 1-7)
Full Text: PDF

.
Keith L Haynes
Center for Security Studies, University of Maryland University College, Adelphi, Maryland, USA

.
Abstract - Steganography is the art or science of sending and receiving hidden information. This paper investigates the use of image steganography to breach an organization’s physical and cyber defenses to steal valuable information. Furthermore, it proposes a steganographic technique that exploits the characteristics of the computer vision process that are favorable for encryption. The result is an image steganographic system that is undetectable and secure.
.
Keywords- Steganography, computer vision, machine learning, image hiding
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2. Paper 27081127: Virtual Education and its Importance as a New Method in Educational System (pp. 8-12)
Full Text: PDF

.
Mohammad Behrouzian Nejad, Young Researchers Club, Dezfoul Branch, Islamic Azad University, Dezfoul, Iran
Ebrahim Behrouzian Nejad, Department of Computer Engineering, Shoushtar Branch, Islamic Azad University, Shoushtar, Iran

.
Abstract — The increasing development of technology, especially information technology, in education has led to many changes, among them the emergence of Virtual Education. Virtual Education has affected teaching and learning systems and has itself emerged as one of the main methods of learning. Courses offered in a multimedia environment remove the limitations of time and place and provide rapid feedback; such features are among the advantages of this method of education. In the near future, the structures and processes of traditional training will no longer be responsive to the needs of human society in the information age, in which knowledge is central. Virtual Education, as a new and efficient method, can therefore be very useful. In this paper we examine the concepts, advantages, and features of Virtual Education and its differences from traditional learning, in terms of teaching quality and efficiency, to help executives implement this training method effectively, commensurate with the circumstances in which they are located, and make correct decisions in the application, implementation and development of Virtual Education.
.
Keywords- Virtual Education; E-Learning; Educational Technology; Information Technology.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
3. Paper 31071181: Study of Neural Network Algorithm for Straight-Line Drawings of Planar Graphs (pp. 13-19)
Full Text: PDF

.
Mohamed A. El-Sayed (a), S. Abdel-Khalek (b), and Hanan H. Amin (c)
(a) Mathematics department, Faculty of Science, Fayoum University, 63514 Fayoum, Egypt
(b,c) Mathematics department, Faculty of Science, Sohag University, 82524 Sohag, Egypt
(a) CS department, Faculty of Computers and Information Science, Taif University, 21974 Taif, KSA
(b) Mathematics department, Faculty of Science, Taif University, 21974 Taif, KSA

.
Abstract - Graph drawing addresses the problem of finding a layout of a graph that satisfies given aesthetic and understandability objectives. The most important objective in graph drawing is minimization of the number of crossings in the drawing, as the aesthetics and readability of graph drawings depend on the number of edge crossings; VLSI layouts with fewer crossings are more easily realizable and consequently cheaper. A straight-line drawing of a planar graph G of n vertices is a drawing of G in which each edge is drawn as a straight-line segment without edge crossings. However, a problem with current graph layout methods that are capable of producing satisfactory results for a wide range of graphs is that they often put an extremely high demand on computational resources. This paper introduces a new layout method that nicely draws internally convex planar graphs, consumes only little computational resources and does not need any heavy-duty preprocessing. Here, we use two methods: the first is the Self-Organizing Map (SOM), known from unsupervised neural networks, and the second is the Inverse Self-Organizing Map (ISOM).
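As a hedged illustration of the second method, the ISOM idea can be sketched in a few lines of Python. The graph, iteration count and cooling schedule below are illustrative assumptions, not the paper's actual parameters:

```python
import random

def isom_layout(adjacency, iterations=2000, seed=42):
    """Inverse Self-Organizing Map (ISOM) layout sketch: pick a random
    stimulus point in the unit square, find the nearest node (the
    "winner"), and pull the winner and its graph neighbours toward the
    stimulus with a decaying learning rate."""
    rng = random.Random(seed)
    pos = {v: [rng.random(), rng.random()] for v in adjacency}
    for t in range(iterations):
        eps = 0.8 * (1 - t / iterations)           # cooling schedule
        x, y = rng.random(), rng.random()          # random stimulus
        winner = min(pos, key=lambda v: (pos[v][0] - x) ** 2
                                        + (pos[v][1] - y) ** 2)
        # move the winner strongly, its graph neighbours half as much
        for v, f in [(winner, eps)] + [(u, eps / 2) for u in adjacency[winner]]:
            pos[v][0] += f * (x - pos[v][0])
            pos[v][1] += f * (y - pos[v][1])
    return pos

# a planar 4-cycle: 0-1-2-3-0
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
layout = isom_layout(graph)
```

Because every update is a convex combination of a position and a stimulus inside the unit square, the layout stays inside the square without any clipping step.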
.
Keywords-SOM algorithm, convex graph drawing, straight-line drawing
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
4. Paper 31081135: Multithreaded Image Processing (pp. 20-22)
Full Text: PDF

.
Jamil A. M. Saif, Computer Science Department, Faculty of Computer Science and engineering, Hodeidah University, Hodeidah, Yemen
Hamid S. S. Alraimi, Computer Science Department, Faculty of Computer Science and engineering, Hodeidah University, Hodeidah, Yemen

.
Abstract — Real-time image processing applications require a huge amount of processing power and large resources. The nature of processing in typical image processing algorithms ranges from a large number of arithmetic operations to relatively few. This paper presents an implementation of image processing operations using simultaneous multithreading; the performance of multithreading is analyzed and discussed for a varying number of images.
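The batch pattern the abstract describes can be sketched as follows; the per-image operation (grayscale inversion) and the nested-list image representation are stand-ins, since the paper's actual operations are not reproduced here:

```python
from concurrent.futures import ThreadPoolExecutor

def invert(image):
    """A representative per-image operation: invert 8-bit grayscale."""
    return [[255 - px for px in row] for row in image]

def process_batch(images, workers=4):
    """Process several images simultaneously, one task per image,
    dispatched onto a fixed pool of worker threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(invert, images))

images = [[[0, 128], [255, 64]] for _ in range(8)]   # 8 tiny 2x2 "images"
results = process_batch(images)
```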
.
Keywords - multithreading; image processing; performance.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
5. Paper 31081136: A New Efficient Symbol Timing Synchronization Scheme for MB-OFDM UWB Systems (pp. 23-28)
Full Text: PDF

.
Reza Shahbazian, Department of Electrical Engineering, Iran University of Science and Technology, Tehran, Iran
Bahman Abolhassani, Department of Electrical Engineering, Iran University of Science and Technology, Tehran, Iran

.
Abstract — Conventional symbol timing synchronization algorithms show poor performance at low SNR values. In this paper, a new low-complexity and efficient symbol timing synchronization (ESTS) algorithm is proposed for MB-OFDM UWB systems. The proposed algorithm locates the start of the Fast Fourier Transform (FFT) window during the packet/frame synchronization (PS/FS) sequences of the received signal. First, a cross-correlation based function is defined to determine the time instant of a useful and successfully detected OFDM symbol. The threshold value for detecting the OFDM symbol is predetermined by considering the trade-off between the probability of false alarm and that of missed detection. The exact boundary of the FFT window for each OFDM symbol is then estimated by a maximum likelihood metric, choosing the argument of its peak value. Verifying the estimated timing offset is the last step in locating the start of the FFT window. The proposed algorithm shows great improvement in MSE, synchronization probability and bit error rate compared with those of earlier works.
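The first step, thresholded cross-correlation against a known sequence, can be sketched as below; the preamble, threshold value and noise-free samples are illustrative assumptions, not the paper's PS/FS sequences:

```python
def cross_correlate(received, template):
    """Sliding cross-correlation of the received samples against a known
    preamble template, evaluated at every candidate start position."""
    n, m = len(received), len(template)
    return [sum(received[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

def estimate_offset(received, template, threshold):
    """Return the correlation-peak position as the timing offset, or None
    if the peak is below threshold (trading false alarms for misses)."""
    corr = cross_correlate(received, template)
    peak = max(range(len(corr)), key=corr.__getitem__)
    return peak if corr[peak] >= threshold else None

preamble = [1, -1, 1, 1, -1]
rx = [0, 0, 0] + preamble + [0, 0]        # preamble begins at sample 3
offset = estimate_offset(rx, preamble, threshold=3.0)
```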
.
Keywords- MB-OFDM, Synchronization, Ultra Wide Band, Fast Fourier Transform, Maximum Likelihood.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
6. Paper 31081138: Securing the Multilevel Information System (pp. 29-35)
Full Text: PDF

.
Mohan H.S., Research Scholar, Dr. MGR University, Chennai, India
A. Raji Reddy, Professor & Head, Dept of ECE, Madanapalle Institute of Technology & Science, Madanapalle, Chittoor, India

.
Abstract— Nowadays, multilevel secure databases are common in distributed systems. These databases require a generalized software system for multiuser, simultaneous access in a distributed system, as the client systems may be dissimilar (heterogeneous hardware and software). The information system will usually be a blend of an information retrieval system and an information management (create and maintain) system. This paper gives an approach to developing a generalized multilevel secure information system using three-tier architecture. The approach shows how data-level integrity can be achieved using access and security levels on users/subjects and data/objects respectively.
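The paper's exact access rules are not reproduced in the abstract; as a hedged sketch, the classic multilevel-security checks that such level assignments enforce ("no read up", "no write down") look like this, with an assumed four-level lattice:

```python
# Assumed security lattice; real deployments define their own levels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level, object_level):
    """'No read up': a subject may only read objects at or below its level."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    """'No write down': a subject may only write objects at or above its
    level, so high-level data cannot leak into low-level objects."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```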
.
Keywords- multilevel secure database; information system; generalized software system
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
7. Paper 31081144: Streamed Coefficients Approach for Quantization Table Estimation in JPEG Images (pp. 36-41)
Full Text: PDF

.
Salma Hamdy, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
.
Abstract — A forensic analyst is often confronted with low quality digital images, in terms of resolution and/or compression, raising the need for forensic tools specifically applicable to detecting tampering in low quality images. In this paper we propose a method for quantization table estimation for JPEG compressed images, based on streamed DCT coefficients. Reconstructed dequantized DCT coefficients are used with their corresponding compressed values to estimate quantization steps. Rounding errors and truncation errors are excluded to eliminate the need for statistical modeling and to minimize estimation errors, respectively. Furthermore, the estimated values are then used with distortion measures to verify the authenticity of test images and expose forged parts, if any. The method shows a high average estimation accuracy of around 93.64%, compared against MLE and power spectrum methods. Detection performance yielded average false negative rates of 6.64% and 1.69% for the two distortion measures, respectively.
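The core estimation idea can be sketched very simply: dequantized DCT coefficients at one frequency are integer multiples of the quantization step q, so (ignoring the rounding/truncation handling the paper adds) q is recoverable as their GCD. This is a simplified stand-in for the streamed-coefficient method, not the paper's algorithm:

```python
from math import gcd
from functools import reduce

def estimate_quant_step(dequantized_coeffs):
    """For one DCT frequency, noise-free dequantized coefficients are
    integer multiples of the quantization step q, so q is the GCD of
    the observed nonzero values."""
    vals = [abs(int(c)) for c in dequantized_coeffs if int(c) != 0]
    return reduce(gcd, vals) if vals else None

# coefficients quantized with step q = 12 and then dequantized
observed = [24, -36, 12, 60, -12, 48]
q = estimate_quant_step(observed)
```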
.
Keywords: Digital image forensics; forgery detection; compression history; Quantization tables.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
8. Paper 31081146: GPS L2C Signal Acquisition Algorithms for Resource-Limited Applications in Challenging Environments (pp. 42-50)
Full Text: PDF

.
Nesreen I Ziedan, Computer and Systems Engineering Department, Faculty of Engineering, Zagazig University, Egypt
.
Abstract — Many emerging indoor and wireless applications require the positioning capabilities of GPS. GPS signals, however, suffer from attenuations when they penetrate natural or manmade obstacles. Conventional GPS receivers are designed to detect signals when they have a clear view of the sky, but they fail to detect weak signals. This paper introduces novel algorithms to detect the new GPS L2C civilian signal in challenging environments. The signal structure is utilized in the design to achieve high sensitivity with reduced processing and memory requirements to accommodate the capabilities of resource-limited applications, like wireless devices. The L2C signal consists of a medium length data-modulated code (CM) and a long length dataless code (CL). The CM code is acquired using long coherent and incoherent integrations to increase the acquisition sensitivity. The correlation is calculated in the frequency domain using an FFT-based approach. A bit synchronization method is implemented to avoid acquisition degradation due to correlating over the unknown bit boundaries. The carrier parameters are refined using a Viterbi-based algorithm. The CL code is acquired by searching only a small number of delays, using a circular correlation based approach. The algorithms’ computational complexities are analyzed. The performances are demonstrated using simulated L2C GPS signals with carrier to noise ratio down to 10 dB-Hz, and TCXO clocks.
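The CL acquisition step relies on circular correlation over code-phase delays. As a hedged sketch, the direct definition is shown below with a short toy code; in the paper this is computed efficiently in the frequency domain as IFFT(FFT(signal) x conj(FFT(code))), which gives identical values:

```python
def circular_correlation(signal, code):
    """Circular correlation of the received samples against a local code
    replica, evaluated at every code-phase delay."""
    n = len(code)
    return [sum(signal[(i + d) % n] * code[i] for i in range(n))
            for d in range(n)]

def acquire(signal, code):
    """Return the delay with the largest correlation and its peak value."""
    corr = circular_correlation(signal, code)
    delay = max(range(len(corr)), key=corr.__getitem__)
    return delay, corr[delay]

# toy +/-1 code; real CL codes are far longer
code = [1, -1, -1, 1, 1, -1, 1, -1]
delay_true = 5
rx = code[-delay_true:] + code[:-delay_true]   # code delayed by 5 samples
delay, peak = acquire(rx, code)
```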
.
Index Terms—GPS, L2C, Acquisition, Weak Signal, Indoor, Viterbi
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
9. Paper 31081149: A Method for Fingerprint Authentication for ATM Based Banking Application (pp. 51-58)
Full Text: PDF

.
S. Koteswari #1, Dr. P. John Paul *2, V. Pradeep Kumar #1, A.B.S.R. Manohar #1
#1 Dept of ECE, Andhra Pradesh, India.
*2 Professor, Dept of CSE, GATES Engineering College, Gooty, Ananthapur, Andhra Pradesh, India.

.
Abstract - Fingerprint authentication is widely used in various authentication applications, because fingerprints achieve the best balance among authentication performance, cost, device size and ease of use. With identity fraud in our society reaching unprecedented proportions, and with an increasing emphasis on emerging automatic personal identification applications, biometrics-based verification, especially fingerprint-based identification, is preferable for banking applications. In this paper we provide authentication using the fingerprints of persons. There are two phases: training and testing. In the training phase, we register the fingerprints of the persons to whom we wish to give authorization; the registered fingerprints are converted into predefined templates and stored in a database. In the testing phase, a newly presented fingerprint is compared with the templates available in the database; if it is already in the database, the result is reported as matched, otherwise as not matched. Finally, we show that matching performance can be improved by combining the decisions of matchers based on complementary (minutiae-based and filter-based) fingerprint information. The localization of the core point represents the most critical step of the whole process. Good matching requires accurate positioning, so even small errors must be avoided by the use of complex filtering techniques.
.
Keywords - Authentication, Fingerprints, Biometric application, Templates.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
10. Paper 31081150: Empirical Study of Evolution of Decision making Factors from 1990-2010 (pp. 59-66)
Full Text: PDF

.
Mohammed Suliman Al-Shakkah* , School of Computing, College of Arts and Sciences, University Utara Malaysia, UUM, 06010 UUM-Sintok, Kedah, Malaysia
Wan Rozaini Sheik Osman, School of Computing, College of Arts and Sciences, University Utara Malaysia, UUM, 06010 UUM-Sintok, Kedah, Malaysia

.
Abstract — Intense competition makes the decision making (DM) process important for the survival of organizations, and many factors affect DM in all types of organizations, especially businesses. This qualitative study presents a new view of the decision making process by analyzing nine decision making factors across 210 papers from 1990-2010, selected randomly from the available resources. The time span was divided into seven partitions of three years each, with 30 papers per period. By analyzing figures and charts with Microsoft Excel, the nine decision making factors were categorized into two groups. The main group consists of five factors: time, cost, risk, benefits, and resources; the second group consists of four: financial impact, feasibility, intangibles, and ethics. Time was the most relevant factor of all. More research on decision making is needed to solve the problems of organizations in the different scopes related to decisions.
.
Keywords- Decision making (DM); decision making process (DMP); decision support system (DSS).
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
11. Paper 31081157: Role Based Authentication Schemes to Support Multiple Realms for Security Automation (pp. 67-73)
Full Text: PDF

.
Rajasekhar B. M., Dept of Computer Science, S. K. University, Anantapur, Andhra Pradesh, India
Assoc Prof. Dr. G. A. Ramachandra, Head of the Dept of Computer Science & Technologies, S. K. University, Anantapur, Andhra Pradesh, India

.
Abstract — Academy automation refers to the various computing hardware and software that can be used to digitally create, manipulate, collect, store, and relay academy information needed for accomplishing basic operations, from admissions and registration to finance, student and faculty interaction, online library, medical and business development. Raw data storage, electronic transfer, and the management of electronic business information comprise the basic activities of an academy automation system. The main aim of this work was to design and implement Multiple Realms Authentication, wherein authentication for each realm is implemented using a Role Based Authentication (RBA) system, in which each user is allotted certain roles that define the user's limits and capabilities for making changes, accessing various areas of the software, and transferring/allotting these roles recursively. Strict security measures were kept in mind while designing such a system, and proper encryption and decryption techniques are used at both ends to prevent any possibility of third-party attacks. Further, various new-age authentication techniques like OpenID and Windows CardSpace are surveyed and discussed to serve as a foundation for future work in this area.
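A minimal sketch of the multi-realm RBA idea follows; the role names, permissions and realms are hypothetical placeholders, since the paper's actual role definitions are not reproduced in the abstract:

```python
# Hypothetical roles and permissions for illustration only.
PERMISSIONS = {
    "admin":   {"create", "read", "update", "delete"},
    "faculty": {"read", "update"},
    "student": {"read"},
}

def authorized(user_realm_roles, realm, action):
    """A user may perform an action in a realm if any of the roles
    allotted to them in that realm grants the permission; roles in one
    realm confer nothing in another."""
    return any(action in PERMISSIONS[role]
               for role in user_realm_roles.get(realm, set()))

# Alice holds different roles in two different realms
alice = {"library": {"faculty"}, "finance": {"student"}}
```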
.
Index Terms - RBA, Encryption/Decryption, OpenID, WindowsCard-Space.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
12. Paper 31081158: Parallel Edge Projection and Pruning (PEPP) Based Sequence Graph protrude approach for Closed Itemset Mining (pp. 74-81)
Full Text: PDF

.
Kalli Srinivasa Nageswara Prasad, Sri Venkateswara University, Tirupati, Andhra Pradesh, India
Prof. S. Ramakrishna, Department of Computer Science, Sri Venkateswara University, Tirupati, Andhra Pradesh, India

.
Abstract - Past observations have shown that a frequent itemset mining algorithm should mine the closed itemsets, as the result is a compact yet complete result set with better efficiency. However, the latest closed itemset mining algorithms work with a candidate maintenance-and-test paradigm, which is expensive in runtime as well as space usage when the support threshold is low or the itemsets get long. Here, we present PEPP, a capable algorithm for mining closed sequences without candidate maintenance. It implements a novel sequence closure checking scheme based on Sequence Graph protruding, using an approach labeled Parallel Edge Projection and Pruning, in short PEPP. A thorough evaluation on sparse and dense real-life data sets shows that PEPP outperforms older algorithms, consuming less memory and running faster than the algorithms frequently cited in the literature.
.
Key words – Data Mining; Graph Based Mining; Frequent itemset; Closed itemset; Pattern Mining; candidate; Itemset Mining; Sequential Itemset Mining.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
13. Paper 31081160: A Hierarchical View for Level Set Method Based on Segmentation of Non-Constant Intensity Objects (pp. 82-85)
Full Text: PDF

.
M. Janani, M.Phil Scholar, P.S.G.R Krishnammal College For Women, Coimbatore-641004
D. Kavitha Devi, Assistant Professor, P.S.G.R Krishnammal College For Women, Coimbatore-641004.

.
Abstract — Segmentation of non-constant intensity objects has been an important and vital issue for many applications and is of fundamental importance in image processing; segmentation is a difficult task in noisy images. A complementary method to the Mumford-Shah model for segmentation of non-constant intensity objects is developed here via the level set method. The level set method retrieves the possible multiple memberships of the pixels. Additivity is enforced through the level set method, which allows the user to control the degree of non-constant intensity and is more secure than a soft constraint; the enhanced method increases efficiency and improves the effectiveness of segmentation. Numerical and qualitative analysis shows that the level set algorithm provides more accurate segmentation results with good robustness.
.
Keywords- level set method, non-constant intensity object, Terzopoulos, Kass, Witkin, Lipschitz.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
14. Paper 31081166: Customer Relationship Management and its Implementation in E-Commerce (pp. 86-90)
Full Text: PDF

.
Mohammad Behrouzian Nejad, Young Researchers Club, Dezfoul Branch, Islamic Azad University, Dezfoul, Iran
.
Abstract — In the new business environment, gaining customers occupies a crucial position among an organization's goals, and senior managers are well aware that their success in achieving the overall goals of the organization depends on satisfying their customers. Customer Relationship Management (CRM) includes all the steps that an organization takes to create and establish beneficial relationships with its customers. CRM is now allocated a core position in the business world. E-commerce, one of the areas arising from the effect of information technology, is expanding. Given the importance of CRM in e-commerce and management, this paper shows the impact of CRM on improving relationships with customers in organizations and e-commerce. It also examines the general definition of CRM, its goals, benefits, success factors, and the implementation of traditional and electronic CRM.
.
Keywords-Customer Relationship Management, E-Commerce, Information Technology, Organizations.

.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
15. Paper 31081170: A Decision Tree Based Model to Identify the Career Focus of Computer Stream Students in ITES Industry (pp. 91-97)
Full Text: PDF

.
T. Hemalatha, Research Scholar, R&D Centre, Bharathiar University, Coimbatore; Asst. Prof., M.C.A. Dept, VELS University, Chennai, India
Dr. Ananthi Sheshasaayee, Associate Professor and Head, Dept of Computer Science, Quaid-E-Millath Govt. College for Women (Autonomous), Chennai - 600 002, India

.
Abstract - This paper focuses on the various career opportunities available to computer stream students in the ITES industry. It analyses the various attributes of the skill set of computer stream students, from which a decision tree can be generated to help improve students' confidence in selecting a career in the ITES industry. For the past few years it has become a passion for students to choose computer science as the main stream of their studies, yet during the final semester of their graduation they struggle a lot to choose a career based on the skill set they possess, which is of due importance. Using a decision tree, this paper provides a guideline for deciding on a career in the ITES industry.
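As a hedged sketch of how such a tree classifies a student's skill set, consider the hand-built tree below; the attributes ("programming", "communication") and career labels are hypothetical placeholders, not the paper's actual skill-set attributes:

```python
def classify(skills, tree):
    """Walk a decision tree expressed as nested structures: an internal
    node is an (attribute, branches) pair, a leaf is a career label."""
    while isinstance(tree, tuple):
        attribute, branches = tree
        tree = branches[skills[attribute]]
    return tree

# Hypothetical decision tree for illustration only.
career_tree = (
    "programming", {
        "strong": ("communication", {
            "strong": "software services",
            "weak":   "product development"}),
        "weak": ("communication", {
            "strong": "BPO / customer support",
            "weak":   "technical documentation"}),
    })

student = {"programming": "strong", "communication": "weak"}
career = classify(student, career_tree)
```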
.
Keywords - skill set, career, computer stream, ITES, decision tree, decision
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
16. Paper 21081106: Real Time Transmission Protocol (RTP) and Real Time Transmission Control Protocol (RTCP) Library Implementation (pp. 98-100)
Full Text: PDF

.
Mohammad Monirul Islam, Computer Science and Engineering Department, Daffodil International University 102, Shukrabad, Dhaka-1207, Bangladesh
.
Abstract— The system proposed is to create a library that will cover all the specifications given in RFC 3550.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
17. Paper 21081107: Towards Discriminant Analysis Modeling of Web 3.0 Design and Development for Students, Faculty and IT Professionals (pp. 101-108)
Full Text: PDF

.
S. Padma, Bharathiar University, Coimbatore & Vels University, Chennai, India
Dr. Ananthi Seshasaayee, Quaid-e-Millath Govt. College for women, Chennai, India

.
Abstract - Web 3.0 is an evolving extension of the Web 2.0 scenario, and perceptions regarding Web 3.0 differ from person to person. Web 3.0 architecture supports ubiquitous connectivity, network computing, open identity, the intelligent web, distributed databases and intelligent applications. Some of the technologies that lead to the design and development of Web 3.0 applications are artificial intelligence, automated reasoning, cognitive architecture and the semantic web. An attempt is made to capture the requirements of students, faculty and IT professionals regarding Web 3.0 applications, so as to bridge the gap between the design and development of Web 3.0 applications and the requirements of students, faculty and IT professionals. Discriminant modeling of the requirements facilitates the identification of key areas in the design and development of software products for students, faculty and IT professionals in Web 3.0.
.
Keywords: Web 3.0, Discriminant analysis, Design and Development, Model
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
18. Paper 21081108: Identifying Harmonics By Empirical Mode Decomposition For Effective Control Of Active Filters During Electric Vehicle Charging (pp. 109-113)
Full Text: PDF

.
B.V. Dhananjay, Research Scholar, Vinayaka Mission’s University, Salem, India
Dr. T. Senthil kumar, Professor, Automobile Engineering, Bharathidhasan University, Trichirapalli, India

.
Abstract - This paper applies the Hilbert Huang Transform (HHT), an empirical mode decomposition (EMD) method, to identify the presence of harmonics during electric vehicle battery charging, when harmonics are generated in the electric line by the switching actions of the power electronics. The active filters are activated based on the difference between the load current and the fundamental current measured from the line, and an active power filter (APF) injects the required current to minimize the harmonics. In simulation, the accuracy of the HHT is above 95%. By correctly recognizing the harmonics using HHT and injecting the compensating current into the line, the charging time of the battery can be reduced; the reduction in charging time also depends on the battery condition.
.
Keywords - Hilbert Huang Transform; active power filter.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
19. Paper 24081116: A Multilevel Representation of Repository for Software Reuse (pp. 114-119)
Full Text: PDF

.
C. V. Guru Rao, Professor & Head, Department of Computer Science and Engineering, SR Engineering college, Warangal, Andhra Pradesh, India – 506 371
P. Niranjan, Associate Professor, Department of Computer Science and Engineering, Kakatiya Institute of Technology and Science, Warangal, Andhra Pradesh, India – 506 015

.
Abstract-- Effective software reuse depends on the classification schemes used for software components that are stored in and retrieved from a software repository. This work proposes a new methodology for efficient classification and retrieval of multimedia software components based on user requirements, using attribute and faceted classification schemes. Whenever a user wishes to trace a component, its specified characteristics (attributes) are identified and then compared with the characteristics of the existing components in the repository to retrieve relevant components. A web-based software tool developed here to classify multimedia software components makes this more efficient.
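The attribute-matching retrieval step can be sketched as below; the component names and attribute facets are illustrative assumptions, not the paper's actual classification scheme:

```python
def retrieve(repository, query):
    """Attribute-based retrieval: return the components whose attribute
    values match every attribute specified in the user's query."""
    return [name for name, attrs in repository.items()
            if all(attrs.get(k) == v for k, v in query.items())]

# Hypothetical repository of classified multimedia components.
repository = {
    "video_player":  {"media": "video", "platform": "web",     "language": "Java"},
    "audio_encoder": {"media": "audio", "platform": "desktop", "language": "C"},
    "image_viewer":  {"media": "image", "platform": "web",     "language": "Java"},
}
matches = retrieve(repository, {"platform": "web", "language": "Java"})
```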
.
Keywords: Software Reuse, Classification Schemes, Reuse Repository.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
20. Paper 24081119: Application of Honeypots to Study Character of Attackers Based on their Accountability in the Network (pp. 120-124)
Full Text: PDF

.
Tushar Kanti, Department of Computer Science, L.N.C.T, Bhopal, India
Vineet Richhariya, Department of Computer Science, L.N.C.T, Bhopal, India
Vivek Richhariya, Department of Computer Science, L.N.C.T, Bhopal, India

.
Abstract — Malware in the form of computer viruses, worms, trojan horses, rootkits, and spyware acts as a major threat to the security of networks and creates significant security risks for organizations. In order to protect networked systems against these kinds of threats, and to find methods to stop at least some of them, we must learn more about their behavior, and also about the methods and tactics of the attackers who attack our networks. This paper analyzes observed attacks and exploited vulnerabilities using honeypots in an organization's network. Based on this, we study the attackers' behavior, and in particular their skill level once they gain access to the honeypot systems. The work describes the honeypot architecture as well as design details that allow us to observe the attackers' behavior. We have also proposed a hybrid honeypot framework solution to be used in future work.
.
Keywords- Honeypot; Accountability; Classification; Honeynet; Virtual Machines; Honeyd
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
21. Paper 27081128: Performance of Input and Output Selection Techniques on Routing Efficiency in Network-On-Chip: A Review (pp. 125-130)
Full Text: PDF

.
Mohammad Behrouzian Nejad, Young Researchers Club, Dezfoul Branch, Islamic Azad University, Dezfoul, Iran
Amin Mehranzadeh, Department of Computer Engineering, Islamic Azad University, Dezfoul Branch, Dezfoul, Iran
Mehdi Hoodgar, Department of Computer Engineering, Islamic Azad University, Dezfoul Branch, Dezfoul, Iran

.
Abstract — Network-On-Chip (NOC) is a new paradigm for making the interconnections inside a System-On-Chip (SOC). Networks-On-Chip have emerged as an alternative to buses, providing a packet-switched communication medium for the modular development of large Systems-On-Chip. The performance of a Network-On-Chip largely depends on the underlying routing technique. Routing algorithms can be classified into three categories: deterministic routing, oblivious routing and adaptive routing. Each routing algorithm has two constituent parts: output selection and input selection. In this paper we discuss several input and output selection techniques used by routing algorithms, and we examine their strengths and weaknesses with a view to developing new, more efficient algorithms.
.
Keywords: Network, System-On-Chip, Network-On-Chip, Routing algorithm, Input selection, Output selection.
.
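The output-selection step the abstract above refers to can be sketched concretely. The following is a hypothetical illustration (not from the paper) contrasting deterministic XY output selection with a simple adaptive selection on a 2D mesh; coordinates, direction names, and the buffer-occupancy heuristic are assumptions for the sketch, with NORTH taken as increasing y:

```python
def xy_output_select(cur, dest):
    # Deterministic XY routing: travel along the X dimension first, then Y.
    cx, cy = cur
    dx, dy = dest
    if cx != dx:
        return "EAST" if dx > cx else "WEST"
    if cy != dy:
        return "NORTH" if dy > cy else "SOUTH"
    return "LOCAL"  # packet has arrived at its destination router

def adaptive_output_select(cur, dest, free_slots):
    # Adaptive output selection: among the productive directions (those that
    # reduce distance to the destination), pick the one whose downstream
    # input buffer currently has the most free slots.
    cx, cy = cur
    dx, dy = dest
    candidates = []
    if dx > cx: candidates.append("EAST")
    if dx < cx: candidates.append("WEST")
    if dy > cy: candidates.append("NORTH")
    if dy < cy: candidates.append("SOUTH")
    if not candidates:
        return "LOCAL"
    return max(candidates, key=lambda d: free_slots.get(d, 0))
```

The deterministic variant always returns the same port for a given source/destination pair, while the adaptive variant spreads load by reacting to congestion — the trade-off the paper's survey of selection techniques is concerned with.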
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
22. Paper 31011175: Critical Analysis of Design Criteria and Reliability of Safety Instrumented System (SIS) for Offshore Oil & Gas Production Platforms in India (pp. 131-135)
Full Text: PDF

.
Rakesh Sethi (1), Manjeet Patterh (2)
(1) Superintending Engineer, ONGC; Research Scholar, Punjabi University, Patiala, India
(2) Director, University College of Engineering, Punjabi University, Patiala, India

.
Abstract - This paper observes that there is a growing need in the offshore oil & gas industry to gain insight into the significant aspects and parameters of safety instrumented systems, so as to manage the process in a more reliable and safer manner. The diversity of issues and the use of different subsystems demand a multidisciplinary team with expertise in process, instrumentation, control, safety, maintenance, reliability and management to develop the basis for the design, implementation and maintenance of a Safety Instrumented System, and to successfully establish its design criteria and reliability for offshore oil & gas production platforms in India.
.
Keywords: Safety Instrumented System, Offshore Oil and Gas Industry.
.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
23. Paper 31081140: PAPR Reduction of OFDM Signal Using Error Correcting Code and Selective Mapping (pp. 136-139)
Full Text: PDF

.
Anshu, ECE Department, Maharishi Markendeshwar University, Mullana, India
Er. Himanshu Sharma, Lecturer, ECE Department, Maharishi Markendeshwar University, Mullana, India

.
Abstract - Orthogonal frequency division multiplexing (OFDM) is a promising technique for offering high data rates and reliable communication over fading channels. The main implementation disadvantage of OFDM is the possibility of a high peak-to-average power ratio (PAPR). This paper presents a novel technique to reduce the PAPR using error-correcting coding and selective mapping (SLM). We examine the complementary cumulative distribution function (CCDF) of the PAPR of an OFDM signal with 100 subcarriers.
.
Keywords- OFDM, SLM, CCDF, PAPR, PAR.
.
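The selective mapping step named in the abstract above admits a compact sketch. This is a minimal illustration, not the paper's method (it omits the error-correcting-code component and uses a naive inverse DFT over a handful of subcarriers, with assumed QPSK-like phase rotations):

```python
import cmath
import math
import random

def papr_db(signal):
    # Peak-to-average power ratio of a complex time-domain signal, in dB.
    powers = [abs(s) ** 2 for s in signal]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def idft(X):
    # Naive inverse DFT; a radix-2 IFFT would be used in a real modulator.
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def slm(symbols, num_candidates=8, seed=0):
    """Selective mapping: rotate the symbol vector by several phase sequences
    and keep the candidate OFDM signal with the lowest PAPR."""
    rng = random.Random(seed)
    # Include the all-ones sequence so SLM can never do worse than no SLM.
    sequences = [[1] * len(symbols)]
    sequences += [[rng.choice([1, -1, 1j, -1j]) for _ in symbols]
                  for _ in range(num_candidates - 1)]
    best, best_papr = None, float("inf")
    for phases in sequences:
        candidate = idft([s * p for s, p in zip(symbols, phases)])
        p = papr_db(candidate)
        if p < best_papr:
            best, best_papr = candidate, p
    return best, best_papr
```

In a complete system the index of the chosen phase sequence must also be conveyed to the receiver (or protected by the error-correcting code) so the rotation can be undone after demodulation.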
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
24. Paper 31081143: Lossless Image Compression for Transmitting Over Low Bandwidth Line (pp. 140-145)
Full Text: PDF

.
G. Murugan, Research Scholar, Singhania University & Sri Venkateswara College of Engg., Thiruvallur
Dr. E. Kannan, Supervisor, Singhania University, and Dean Academic, Veltech University
S. Arun, Asst. Professor, ECE Dept., Veltech High Engg. College, Chennai

.
Abstract - The aim of this paper is to develop an effective lossless compression technique that converts an original image into a compressed one without changing the clarity of the original image. Lossless image compression is a class of image compression algorithms that allows the exact original image to be reconstructed from the compressed data. We present a compression technique that provides progressive transmission as well as lossless and near-lossless compression in a single framework. The proposed technique produces a bit stream that results in a progressive and ultimately lossless reconstruction of an image, similar to what one can obtain with a reversible wavelet codec. In addition, the proposed scheme provides near-lossless reconstruction with respect to a given bound after decoding each layer of the successively refinable bit stream. We formulate the image data compression problem as one of successively refining the probability density function (pdf) estimate of each pixel. Experimental results for both the lossless and near-lossless cases indicate that the proposed compression scheme, which innovatively combines lossless, near-lossless and progressive coding attributes, gives competitive performance in comparison to state-of-the-art compression schemes.
.
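The progressive-with-error-bound property the abstract above describes — each decoded layer tightens a guaranteed bound on the reconstruction error until the final layer makes it lossless — can be illustrated with the simplest such scheme, bit-plane transmission. This sketch is an assumption-laden stand-in for the paper's pdf-refinement coder, using 8-bit samples:

```python
def bitplane_layers(pixels, depth=8):
    # Split depth-bit samples into bit planes, most significant plane first.
    return [[(p >> b) & 1 for p in pixels] for b in range(depth - 1, -1, -1)]

def reconstruct(layers, depth=8):
    # Rebuild from a prefix of the layers. After k of `depth` layers the
    # low (depth - k) bits are still unknown, so the error is below 2**(depth - k).
    n = len(layers[0])
    vals = [0] * n
    for i, plane in enumerate(layers):
        b = depth - 1 - i
        vals = [v | (bit << b) for v, bit in zip(vals, plane)]
    return vals
```

Decoding all layers reproduces the pixels exactly (the lossless case); stopping early gives a near-lossless image whose maximum error is known in advance, which is the single-framework behavior the paper targets with a far more sophisticated, entropy-coded scheme.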
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------