Volume 2, Issue 3, March 2010

An Optimal Prefix Replication Strategy for VoD Services [ Full-Text ]

M Dakshayini and T R GopalaKrishnan Nair

In this paper we propose a scalable cluster architecture of interconnected proxy servers for high-quality, high-availability VoD services. We also propose an optimal regional-popularity-based video prefix replication strategy and a scene-change-based replica caching algorithm that exploit the Zipf-like video popularity distribution to maximize the availability of videos close to the client and the request-servicing rate, thereby reducing the client rejection ratio and the client response time. Simulation results for the proposed architecture and algorithms show substantial gains in video availability and client request-servicing rate, together with reductions in initial start-up latency and client rejection ratio.
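The Zipf-driven replication idea can be sketched in a few lines. This is an illustrative toy, not the authors' strategy: popularity follows a Zipf-like law over rank, and a proxy's storage budget is split across video prefixes in proportion to popularity. The skew value, video sizes, and budget below are all assumptions.

```python
# Hypothetical popularity-based prefix allocation: more popular videos get
# a larger prefix replicated at the proxy, subject to a total storage budget.

def zipf_popularity(n_videos, skew=0.8):
    """Return normalized Zipf-like access probabilities for ranks 1..n."""
    weights = [1.0 / (rank ** skew) for rank in range(1, n_videos + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def allocate_prefixes(video_sizes, budget, skew=0.8):
    """Split the storage budget across videos in proportion to popularity,
    capping each prefix at the full video size."""
    probs = zipf_popularity(len(video_sizes), skew)
    return [min(size, budget * p) for size, p in zip(video_sizes, probs)]

sizes = [700, 700, 700, 700]        # video sizes in MB, in rank order
prefixes = allocate_prefixes(sizes, budget=1000)
print([round(p) for p in prefixes]) # the most popular video gets the longest prefix
```

Prefix lengths then shrink monotonically with rank, so the "hottest" videos are served largely from the proxy.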

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Comprehensive Review of Image Enhancement Techniques [ Full-Text ]

Raman Maini and Himanshu Aggarwal

The principal objective of image enhancement is to process an image so that the result is more suitable than the original for a specific application. Digital image enhancement techniques provide a multitude of choices for improving the visual quality of images; the appropriate choice is greatly influenced by the imaging modality, the task at hand, and the viewing conditions. This paper provides an overview of the underlying concepts, along with algorithms commonly used for image enhancement. It focuses on spatial-domain techniques, with particular reference to point processing methods and histogram processing.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Rank Based Replacement Policy for Multimedia Server Cache Using Zipf-Like Law [ Full-Text ]

T R Gopalakrishnan Nair and P Jayarekha

The cache replacement algorithm plays an important role in the overall performance of a proxy-server system. In this paper we propose a rank-based cache replacement policy for managing the cache space of an individual VoD proxy server. The proposed strategy incorporates, in a simple way, the most important characteristics of a video and its accesses: its size, access frequency, recency of the last access, and the cost incurred in transferring the requested video from the server to the proxy. Many studies have demonstrated that a Zipf-like law governs many features of VoD and can describe video popularity; our model ranks each video by its popularity using this law. A video with a higher rank is labelled "hot", and one with a lower rank "cold". We compare our algorithm with several popular cache replacement algorithms through simulation. The results show that the proposed rank-based algorithm improves the cache hit ratio, byte hit ratio, and average request latency compared with the other algorithms, and our experiments indicate that it outperforms LRU, LFU, and GreedyDual.
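The ranking described above combines size, frequency, recency, and transfer cost. A hypothetical composite score in that spirit can be sketched as follows; the abstract does not give the exact formula, so the combination, field names, and data below are assumptions, not the authors' policy.

```python
# Illustrative rank: reward frequently and recently accessed, costly-to-fetch
# videos; penalize large ones. The lowest-ranked ("cold") video is evicted.

def rank(video, now):
    age = now - video["last_access"]   # recency: smaller age is better
    return (video["frequency"] * video["fetch_cost"]) / (video["size"] * (1.0 + age))

def evict_candidate(cache, now):
    """Choose the coldest (lowest-ranked) video for replacement."""
    return min(cache, key=lambda v: rank(v, now))

cache = [
    {"id": "v1", "size": 700, "frequency": 50, "fetch_cost": 2.0, "last_access": 100.0},
    {"id": "v2", "size": 300, "frequency": 5,  "fetch_cost": 1.0, "last_access": 10.0},
]
print(evict_candidate(cache, now=200.0)["id"])  # the rarely-used video is evicted
```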

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Analysis of Supply Chain Network Using RFID Technique with Hybrid Algorithm [ Full-Text ]

P Suresh and R Kesavan

Radio Frequency Identification (RFID) is a dedicated short-range communication technology; the term describes various technologies that use radio waves to automatically identify people or objects by remotely storing and retrieving data on an RFID tag. RFID has been attracting considerable attention with the expectation of improved supply chain visibility for consumer goods, apparel, and pharmaceutical manufacturers, as well as retailers and government procurement agencies, and it is used today in many applications, including security and access control, transportation, and supply chain tracking. Supply Chain Management (SCM) is now at centre stage for manufacturing and service organizations. Given current market strategies, supply chains and logistics are naturally modelled as distributed systems, and their economic importance has motivated both private companies and academic researchers to apply operations research and management science tools to improve the efficiency of transportation. In this context, this work adopts an RFID technique with a hybrid algorithm to optimize a supply chain distribution network.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Measuring Bandwidth for Super Computer Workloads [ Full-Text ]

A. Neela Madheswari and R. S. D. Wahida Banu

Parallel computing plays a major role in almost all fields, from research to large-scale problem solving. Much research continues to focus on parallel processing, and its use now extends to end-user applications through GPUs and multi-core processors. Bandwidth measurement is essential for resource management and for studying the performance factors of existing supercomputer systems, which helps improve system utilization; since supercomputers are few, their resources should be properly utilized. In this paper, a real workload trace from one of the LANL supercomputers is taken, and we show how the bandwidth is estimated from the given parameters.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Plagiarism Detection using ROUGE and WordNet [ Full-Text ]

Chien-Ying Chen, Jen-Yuan Yeh and Hao-Ren Ke

With the arrival of the digital era and the Internet, the lack of information control provides an incentive for people to freely use any content available to them. Plagiarism occurs when users fail to credit the original owner for the content referred to, and such behavior violates intellectual property rights. Two main approaches to plagiarism detection are fingerprinting and term occurrence; one common weakness shared by both, especially fingerprinting, is the inability to detect plagiarism of modified text. This study proposes the adoption of ROUGE and WordNet for plagiarism detection. The former includes n-gram co-occurrence statistics, skip-bigram, and longest common subsequence (LCS), while the latter acts as a thesaurus and provides semantic information. N-gram co-occurrence statistics can detect verbatim copying and certain sentence modifications; skip-bigram and LCS are immune to text modifications such as simple addition or deletion of words; and WordNet may handle the problem of word substitution.
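Two of the measures named above are compact enough to sketch generically. This is standard ROUGE-style machinery, not the authors' implementation; the sample sentences are illustrative.

```python
# n-gram co-occurrence catches verbatim copying; LCS tolerates
# insertions/deletions of words, as the abstract notes.

def ngrams(tokens, n):
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def ngram_overlap(ref, cand, n=2):
    """Fraction of the reference's n-grams that also occur in the candidate."""
    r, c = ngrams(ref, n), ngrams(cand, n)
    return len(r & c) / len(r) if r else 0.0

def lcs_len(a, b):
    """Classic dynamic-programming longest common subsequence length."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

ref  = "the cat sat on the mat".split()
cand = "the cat quietly sat on a mat".split()
print(ngram_overlap(ref, cand))  # bigram overlap drops under word insertion
print(lcs_len(ref, cand))        # LCS skips the inserted word
```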

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Security Based Data Mining Approach in Data Grid [ Full-Text ]

S. Vidhya and S. Karthikeyan

Grid computing is the next logical step in distributed computing; its main objective is to share resources such as CPU cycles, memory, and software. Data grids provide transparent access to semantically related data resources in a heterogeneous system. The proposed system incorporates both data mining and grid computing techniques: the grid application reduces the time needed to send results to several clients simultaneously, and running the data mining application on computational grids gives users fast and sophisticated results. In this work, a grid-based data mining technique performs automatic allocation based on a probabilistic frequent-sequence mining algorithm, finding frequent sequences for many users at a time with accurate results. It also includes a trust management architecture for trust-enhanced security.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Computation of Reducts Using Topology and Measure of Significance of Attributes [ Full-Text ]

P. G. JansiRani and R. Bhaskaran

Data generated in science, technology, business, and many other fields of research are increasing at an exponential rate, and extracting knowledge from such huge data sets is a challenging task. This paper proposes a hybrid and viable method for computing the reducts of an information system in data mining, using topological techniques together with the significance of attributes measured via rough set theory. This reduces the randomness in the process of eliminating redundant attributes, which in turn reduces the complexity of computing the reducts of an information system when a large amount of data must be processed.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Land-cover Classification and Mapping for Eastern Himalayan State Sikkim [ Full-Text ]

Ratika Pradhan, Mohan P. Pradhan, Ashish Bhusan, Ronak K. Pradhan and M. K. Ghose

Classifying satellite imagery has become a challenging task in the current era of tremendous growth in settlement, i.e., construction of buildings, roads, bridges, dams, etc. This paper suggests an improvised k-means and an Artificial Neural Network (ANN) classifier for land-cover mapping of the Eastern Himalayan state of Sikkim. The improvised k-means algorithm shows satisfactory results compared with existing methods, including the k-Nearest Neighbor and maximum likelihood classifiers. The strength of the ANN classifier lies in its speed and good recognition rate, and its capability for self-learning compared with other classification algorithms has made it widely accepted. The ANN-based classifier shows satisfactory and accurate results in comparison with the classical methods.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Novel Approach for Discovery Multi Level Fuzzy Association Rule Mining [ Full-Text ]

Pratima Gautam and K. R. Pardasani

Finding multilevel association rules in transaction databases is widely used in data mining. In this paper, we present a model for mining multilevel association rules that satisfies a different minimum support at each level; we employ fuzzy set concepts, a multi-level taxonomy, and different minimum supports to find fuzzy multilevel association rules in a given transaction data set. The Apriori property is used in the model to prune the itemsets. The proposed model adopts a top-down, progressively deepening approach to derive large itemsets, and it incorporates fuzzy boundaries instead of sharp boundary intervals. An example is also given to demonstrate that the proposed mining algorithm can derive multiple-level association rules under different supports in a simple and effective manner.
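The fuzzy support that such models build on is commonly computed by mapping item quantities to membership grades and taking the minimum grade over an itemset's items in each transaction. The membership boundaries and data below are illustrative assumptions, not the paper's taxonomy or regions.

```python
# Fuzzy support of an itemset = average over transactions of the minimum
# membership grade of its items (minimum as the t-norm).

def membership_low(qty):
    """Triangular 'low quantity' region: full membership at 0, none at 10+."""
    return max(0.0, 1.0 - qty / 10.0)

def fuzzy_support(transactions, itemset, mu=membership_low):
    total = 0.0
    for t in transactions:
        grades = [mu(t[item]) if item in t else 0.0 for item in itemset]
        total += min(grades)
    return total / len(transactions)

tx = [{"milk": 2, "bread": 4}, {"milk": 8}, {"bread": 1, "milk": 1}]
print(round(fuzzy_support(tx, ["milk"]), 3))
print(round(fuzzy_support(tx, ["milk", "bread"]), 3))
```

The two-item support can never exceed the one-item support, which is exactly the Apriori property the model exploits for pruning.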

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Comparative Study of Hidden Node Problem and Solution using Different Techniques and Protocols [ Full-Text ]

Viral V. Kapadia, Sudarshan N. Patel and Rutvij H. Jhaveri

Hidden nodes in a wireless network are nodes that are out of range of another node or collection of nodes. We discuss a few problems introduced by the RTS/CTS mechanism of collision avoidance and focus on the virtual jamming problem, which allows a malicious node to effectively jam a large fragment of a wireless network at minimal expense of power. We also discuss WiCCP (Wireless Central Coordinated Protocol), a protocol booster that provides a good solution to the hidden node problem.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Distributed k-Secure Sum Protocol for Secure Multi-Party Computations [ Full-Text ]

Rashid Sheikh, Beerendra Kumar and Durgesh Kumar Mishra

Secure sum computation of private data inputs is an interesting example of Secure Multiparty Computation (SMC), which has attracted many researchers to devise secure protocols with a lower probability of data leakage. In this paper, we provide a novel protocol to compute the sum of individual data inputs with zero probability of data leakage when two neighboring parties collude to learn the data of a middle party. We break the data block of each party into a number of segments and redistribute the segments among the parties before the computation. Together, these steps create a scenario in which it becomes impossible for semi-honest parties to learn the private data of another party.
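The segmentation-and-redistribution step can be illustrated with a toy sketch. The scatter pattern below is a simplification for demonstration, not the paper's exact k-secure sum protocol or its ring topology.

```python
# Each party splits its private value into k random integer segments that
# sum back to the value, scatters them across parties, and then each party
# announces only the sum of the segments it holds.
import random

def split_into_segments(value, k):
    """Split value into k integer segments summing back to value."""
    cuts = [random.randint(-100, 100) for _ in range(k - 1)]
    return cuts + [value - sum(cuts)]

def distributed_secure_sum(private_values, k=3):
    n = len(private_values)
    pools = [[] for _ in range(n)]        # segments each party ends up holding
    for i, v in enumerate(private_values):
        for j, s in enumerate(split_into_segments(v, k)):
            pools[(i + j) % n].append(s)  # scatter segments across parties
    partial_sums = [sum(p) for p in pools]
    return sum(partial_sums)

data = [13, 7, 22, 5]
print(distributed_secure_sum(data))       # always equals sum(data) = 47
```

Correctness holds for any choice of random cuts, because each party's segments sum back to its value, yet no single party ever holds all segments of another party's input.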

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Quality of Service with Bandwidth [ Full-Text ]

Shivaji P. Mirashe and N. V. Kalyankar

This paper deals with providing Quality of Service (QoS) over IP-based networks. We give a brief survey of the topic and present our work in this area. There are many solutions to the problem, but standardization of the methods is not yet finished. At the moment there are two kinds of approaches to the reservation problem. The distributed method handles the network nodes independently and lets the nodes make their own admittance decisions along the reservation path (e.g., the Border Gateway Reservation Protocol, BGRP). The centralized way, which we discuss in detail, collects the network nodes into domains and handles them using a network manager. Generally there are two significant parts of network management: intra-domain and inter-domain. This article focuses on making reservations over several domains, which is part of the inter-domain functions.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Cloud Computing [ Full-Text ]

Shivaji P. Mirashe and N. V. Kalyankar

Computing as you know it is about to change: your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of desktop based, and you can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer; you can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Hotmail, you're computing in the cloud. If you're using a web-based application such as Google Calendar or Apple MobileMe, or a file- or photo-sharing site such as Flickr or Picasa Web Albums, you're computing in the cloud. It's the technology of the future, available to use today.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Neuro-Fuzzy Multi Swarm FastSLAM Framework [ Full-Text ]

R. Havangi, M. Teshnehlab and M. A. Nekoui

FastSLAM is a framework for simultaneous localization and mapping using a Rao-Blackwellized particle filter. In FastSLAM, a particle filter is used to estimate the mobile robot pose (position and orientation), and an Extended Kalman Filter (EKF) is used to estimate the feature locations. However, FastSLAM degenerates over time, because the particle set estimating the pose of the robot loses its diversity. One of the main reasons for losing particle diversity in FastSLAM is sample impoverishment, which occurs when the likelihood lies in the tail of the proposal distribution; in this case, most particle weights are insignificant. Another problem of FastSLAM relates to the design of the EKF for landmark position estimation. The performance of the EKF and the quality of its estimates depend heavily on correct a priori knowledge of the process and measurement noise covariance matrices (Q and R), which are unknown in most applications; incorrect a priori knowledge of Q and R may seriously degrade the filter's performance. This paper presents a Neuro-Fuzzy Multi Swarm FastSLAM framework: a neuro-fuzzy extended Kalman filter for landmark feature estimation, together with a particle filter based on particle swarm optimization, to overcome the impoverishment of FastSLAM. Experimental results demonstrate the effectiveness of the proposed algorithm.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Similarity Data Item Set Approach: An Encoded Temporal Data Base Technique [ Full-Text ]

M. S. Danessh, C. Balasubramanian and K. Duraiswamy

Data mining has been widely recognized as a powerful tool for extracting added value from large-scale databases, and finding frequent itemsets is a crucial step in mining association rules. Many algorithms have been developed to find frequent itemsets. This paper presents a summary and comparative study of the available FP-growth algorithm variations for mining frequent itemsets, showing their capabilities and their efficiency in terms of time and memory consumption by taking application-specific information into account. It proposes an FP-tree growth algorithm based on the pattern-growth mining paradigm, which employs a tree structure to compress the database. The performance study shows that the proposed FP-growth method is efficient and scalable for mining both long and short frequent patterns, is about an order of magnitude faster than the Apriori algorithm, and is also faster than some recently reported frequent-pattern mining methods.
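The compression idea at the heart of FP-growth (shared transaction prefixes stored once, with counts, so the raw database need not be rescanned) can be shown with a minimal FP-tree builder. The recursive conditional-tree mining step is omitted, and the sample transactions are illustrative.

```python
# Two passes: count item frequencies, then insert each transaction into a
# prefix tree with items ordered by global frequency, so prefixes merge.

class FPNode:
    def __init__(self, item):
        self.item, self.count, self.children = item, 0, {}

def build_fp_tree(transactions, min_support):
    freq = {}
    for t in transactions:
        for item in t:
            freq[item] = freq.get(item, 0) + 1
    frequent = {i for i, c in freq.items() if c >= min_support}
    root = FPNode(None)
    for t in transactions:
        node = root
        for item in sorted((i for i in t if i in frequent),
                           key=lambda i: (-freq[i], i)):
            node = node.children.setdefault(item, FPNode(item))
            node.count += 1
    return root, freq

tx = [["a", "b"], ["a", "b", "c"], ["a", "c"], ["b"]]
root, freq = build_fp_tree(tx, min_support=2)
print(root.children["a"].count)  # shared prefix 'a' stored once with count 3
```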

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Mapping The Best Practices of XP and Project Management: Well defined approach for Project Manager [ Full-Text ]

Muhammad Javed, Bashir Ahmad, Shahid Hussain and Shakeel Ahmad

Software engineering is one of the most recent additions to the various disciplines of system engineering, having emerged as a key discipline in a quick succession of time. Various software engineering approaches are followed in order to produce comprehensive software solutions at affordable cost, with a reasonable delivery timeframe and less uncertainty. All these objectives are satisfied only when the project's status is properly monitored and controlled. eXtreme Programming (XP) uses the best practices of the agile methodology and helps in the rapid development of small-size software. In this paper, the authors propose that via XP, high-quality software can be developed with less uncertainty and within the estimated cost, thanks to proper monitoring and controlling of the project. Moreover, the authors give guidelines on how project management activities can be embedded into the XP development life cycle to enhance the quality of software products and reduce uncertainty.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Group Vehicular Mobility Model for Routing Protocol Analysis in Mobile Ad Hoc Network [ Full-Text ]

Shrirang Ambaji Kulkarni and G Raghavendra Rao

The performance of routing protocols in mobile ad-hoc networks is greatly affected by the dynamic nature of nodes, route failures, wireless channels with variable bandwidth, and scalability issues. A mobility model imitates the real-world movement of mobile nodes and is a central component of simulation-based studies. In this paper we consider mobility models that mimic the vehicular motion of nodes, such as the Manhattan mobility model and the City Section mobility model. We also propose a new Group Vehicular Mobility Model (GVMM) that takes the best features of group mobility models, such as the Reference Point Group mobility model, and applies them to vehicular models. We analyze the performance of GVMM and other vehicular mobility models with various metrics. This analysis provides insight into the impact of mobility models on the performance of routing protocols for ad-hoc networks. The routing protocols are simulated and measured for performance, and we finally arrive at a correlation describing the impact of mobility models on routing protocols, which is central to the design of mobile ad-hoc networks.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Gene Expression Data Knowledge Discovery using Global and Local Clustering [ Full-Text ]

Swathi. H

To understand complex biological systems, the research community has produced a huge corpus of gene expression data, and a large number of clustering approaches have been proposed for its analysis. However, extracting important biological knowledge remains difficult. To address this task, a hybrid hierarchical k-means algorithm is used for clustering and biclustering gene expression data: clustering and biclustering algorithms are utilized to discover both global and local clustering structure. A validation technique, Figure of Merit, is used to determine the quality of the clustering results. Appropriate knowledge is mined from the clusters by embedding a BLAST similarity search program into the clustering and biclustering process.
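A bare-bones k-means step of the kind a hybrid hierarchical k-means method would apply within each split can be sketched directly. The expression matrix (rows = genes, columns = conditions), the choice of k, and the deterministic initialization are illustrative assumptions.

```python
# Lloyd's algorithm in pure Python: assign each profile to the nearest
# centroid, then recompute centroids as cluster means, and repeat.

def kmeans(points, k, iters=20):
    centroids = [list(p) for p in points[:k]]  # deterministic init for the sketch
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:
                centroids[j] = [sum(col) / len(cl) for col in zip(*cl)]
    return centroids, clusters

genes = [[0.1, 0.2], [0.0, 0.1], [5.0, 5.1], [4.9, 5.2]]
centroids, clusters = kmeans(genes, k=2)
print(sorted(len(c) for c in clusters))  # two clusters of two co-expressed genes
```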

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

An RFID-based Campus Context-Aware Notification System [ Full-Text ]

Nazleeni S. Haron, Nur S. Saleem, Mohd H. Hasan, Mazeyanti M. Ariffin and Izzatdin A. Aziz

This paper presents the design and development of a context-aware notification system for university students using RFID technology. The system leverages the student's matric card as the RFID tag (sensor), an RFID reader and server as the processors, and screen monitors at various locations on campus as the actuators of the output. The system aims to deliver urgent notifications to the intended students immediately at their respective locations. In addition, it can display personalized information based on the students' preferences and current location when accessing the system. The background of the study, the design approaches for the system, and a preliminary evaluation of the prototype are presented in this paper. The evaluation results indicate that the proposed system is useful and easy to use.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Fuzzy-based Navigation and Control of a Non-Holonomic Mobile Robot [ Full-Text ]

Razif Rashid, I. Elamvazuthi, Mumtaj Begam and M. Arrofiq

In recent years, the use of non-analytical computing methods such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a theoretical model of a fuzzy-based controller for an autonomous mobile robot is developed. The paper begins with the mathematical model of the robot, which involves the kinematic model; the fuzzy logic controller is then developed and discussed in detail. The proposed method is successfully tested in simulations, which compare the effectiveness of three different sets of membership functions. It is shown that the fuzzy logic controller with three input membership functions provides better performance than those with five or seven.
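A three-function input partition of the kind compared above might look like this. The triangular shape, the breakpoints, and the normalized distance input are assumptions for demonstration, not the paper's controller.

```python
# Triangular membership functions partitioning one controller input
# (e.g., a normalized distance-to-obstacle) into near / mid / far.

def tri(x, a, b, c):
    """Triangular membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

near = lambda d: tri(d, -0.5, 0.0, 0.5)
mid  = lambda d: tri(d,  0.0, 0.5, 1.0)
far  = lambda d: tri(d,  0.5, 1.0, 1.5)

d = 0.3
print([round(f(d), 2) for f in (near, mid, far)])  # degrees sum to 1 on this partition
```

With overlapping triangles spaced like this, every input fires at most two rules, which keeps a three-membership controller cheap to evaluate.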

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Proficient Pair of Replacement Algorithms on L1 and L2 Cache for Merge Sort [ Full-Text ]

Richa Gupta and Sanjiv Tokekar

The memory hierarchy exists to keep pace with processor speed: cache is the fast memory that bridges the speed difference between the processor and main memory. The access patterns of the Level 1 (L1) and Level 2 (L2) caches differ, since the CPU accesses L2 only when it does not find the desired data in L1; thus a replacement algorithm that works efficiently on L1 may not be as efficient on L2. Similarly, applications such as matrix multiplication, web workloads, and the Fast Fourier Transform (FFT) have varying access patterns, so the same replacement algorithm may not be efficient for all types of application. This paper seeks an efficient pair of replacement algorithms on L1 and L2 for merge sort. Using the memory reference string of merge sort, we analyze the behavior on L1 of several existing replacement algorithms: Least Recently Used (LRU), Least Frequently Used (LFU), and First In First Out (FIFO). After analyzing the memory reference pattern of merge sort, we propose a Partition Based Replacement algorithm (PBR_L1) for the L1 cache. Furthermore, we analyze various pairs of algorithms on L1 and L2 to find a suitable pair of replacement algorithms. Simulation on L1 shows that, among the existing replacement algorithms considered, FIFO performs better than the others, while the proposed PBR_L1 works about 1.7% to 44% better than FIFO for various cache sizes. The analysis of pairs on L1 and L2 shows that among the existing pairs considered, the best is FIFO followed by FIFO, while the proposed PBR_L1 followed by FIFO works approximately 66% to 100% better than the FIFO-FIFO pair for various cache sizes.
Further simulation results, fixing the L1 and L2 cache sizes and varying the list length, show that the performance of the proposed algorithm on L1 is better than the others considered in this paper. A similar analysis of pairs shows that PBR_L1 on L1 followed by FIFO on L2 is superior to the other pairs for varying list lengths.
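The pairwise L1/L2 methodology can be sketched with a tiny two-level replay: run a reference string against an L1 policy and feed the L1 misses to an L2 with its own policy. The cache sizes, reference string, and FIFO-FIFO pairing are illustrative; PBR_L1 itself is not reproduced here.

```python
# Replay a reference string; misses at one level become the reference
# string of the next level, so policies can be compared pairwise.
from collections import OrderedDict

def simulate(refs, size, policy):
    cache, misses = OrderedDict(), 0
    passthrough = []                      # misses forwarded to the next level
    for r in refs:
        if r in cache:
            if policy == "LRU":
                cache.move_to_end(r)      # refresh recency; FIFO leaves order alone
        else:
            misses += 1
            passthrough.append(r)
            if len(cache) >= size:
                cache.popitem(last=False) # evict the oldest entry
            cache[r] = True
    return misses, passthrough

refs = [1, 2, 3, 1, 4, 1, 2, 5, 1, 2]
l1_misses, to_l2 = simulate(refs, size=3, policy="FIFO")
l2_misses, _ = simulate(to_l2, size=4, policy="FIFO")
print(l1_misses, l2_misses)
```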

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Voice Recognition Algorithms using Mel Frequency Cepstral Coefficient (MFCC) and Dynamic Time Warping (DTW) Techniques [ Full-Text ]

Lindasalwa Muda, Mumtaj Begam and I. Elamvazuthi

Digital processing of the speech signal and the voice recognition algorithm are very important for fast and accurate automatic voice recognition technology. The voice is a signal of infinite information, and direct analysis and synthesis of the complex voice signal is difficult due to the amount of information it contains. Therefore, digital signal processes such as feature extraction and feature matching are introduced to represent the voice signal. Several methods, such as Linear Predictive Coding (LPC), Hidden Markov Models (HMM), and Artificial Neural Networks (ANN), are evaluated with a view to identifying a straightforward and effective method for the voice signal. The extraction and matching process is implemented right after the pre-processing or filtering of the signal is performed. Mel Frequency Cepstral Coefficients (MFCCs), a non-parametric method for modelling the human auditory perception system, are utilized as the extraction technique, and the non-linear sequence alignment known as Dynamic Time Warping (DTW), introduced by Sakoe and Chiba, is used as the feature matching technique. Since the voice signal tends to have a different temporal rate on each utterance, this alignment is important for better performance. This paper presents the viability of MFCC for extracting features and DTW for comparing the test patterns.
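The DTW alignment at the heart of the matching stage is compact enough to show directly. Here each "frame" is a single number for brevity; the same recurrence applies to MFCC vectors with a frame-wise distance, and the sample sequences are illustrative.

```python
# Classic DTW: fill a cumulative-cost table where each cell extends the
# cheapest of the three predecessor alignments (match, insert, delete).

def dtw(a, b, dist=lambda x, y: abs(x - y)):
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

ref  = [1, 2, 3, 4, 3, 2]        # template utterance (one feature dimension)
test = [1, 1, 2, 3, 4, 3, 2, 2]  # same word spoken at a different tempo
print(dtw(ref, test))            # small distance despite different lengths
```

Because the warp may stretch or compress either sequence, two utterances of the same word at different speaking rates align with low cost, which is why DTW suits the varying temporal rate noted above.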

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

New Classification Methods for Hiding Information into Two Parts: Multimedia Files and Non Multimedia Files [ Full-Text ]

Hamdan O. Alanazi, A. A. Zaidan, B. B. Zaidan, Hamid A. Jalab and Zaidoon Kh. AL-Ani

With the rapid development of various multimedia technologies, more and more multimedia data are generated and transmitted in the medical, commercial, and military fields; these data may include sensitive information that should not be accessed by, or can only be partially exposed to, general users. Security and privacy have therefore become important. Another problem with digital documents and video is that undetectable modifications can be made with very simple and widely available equipment, which puts digital material used for evidential purposes in question. With the large flood of information and the development of digital formats, information hiding is one of the techniques used to protect important information. The main goal of this paper is to provide a general overview of new classification methods for hiding information into two parts: multimedia files and non-multimedia files.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

New Comparative Study Between DES, 3DES and AES within Nine Factors [ Full-Text ]

Hamdan O. Alanazi, B. B. Zaidan, A. A. Zaidan, Hamid A. Jalab, M. Shabbir and Y. Al-Nabhani

With the rapid development of various multimedia technologies, more and more multimedia data are generated and transmitted in fields such as medicine, and the Internet allows for wide distribution of digital media data. It has become much easier to edit, modify, and duplicate digital information; digital documents are also easy to copy and distribute, and therefore face many threats. This is a significant security and privacy issue, and appropriate protection is necessary because of the significance, accuracy, and sensitivity of the information, which may include data that should not be accessed by, or can only be partially exposed to, general users. Another problem with digital documents and video is that undetectable modifications can be made with very simple and widely available equipment, which puts digital material used for evidential purposes in question. Cryptography is one of the techniques used to protect important information. In this paper, three multimedia encryption algorithms proposed in the literature are described, and a new comparative study between DES, 3DES, and AES is presented within nine factors, addressing the efficiency, flexibility, and security that remain a challenge for researchers.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Overview: Main Fundamentals for Steganography [ Full-Text ]

Zaidoon Kh. AL-Ani, A. A. Zaidan, B. B. Zaidan and Hamdan O. Alanazi

The rapid development of multimedia and the Internet allows for wide distribution of digital media data. It has become much easier to edit, modify, and duplicate digital information; digital documents are also easy to copy and distribute, and therefore face many threats. This is a significant security and privacy issue, and appropriate protection is necessary because of the significance, accuracy, and sensitivity of the information. Steganography is one of the techniques used to protect important information. The main goal of this paper is to acquaint researchers with the main fundamentals of steganography. It provides a general overview of the following subject areas: steganography types, the general steganography system, the characterization of steganography systems, and the classification of steganography techniques.