COURSE OUTCOME

Semester I

Descriptive Statistics (Core Course 1)

Descriptive statistics quantitatively describes or summarises the observations that have been collected. Such summaries may be either quantitative, in the form of summary statistics, or visual, in the form of simple graphs and charts.

Univariate analysis involves describing the distribution of a single variable, including its central tendency (the mean, median, mode, quartiles, etc.) and its dispersion (the range, quartile deviation, mean deviation, variance, standard deviation, coefficient of variation, etc.). The shape of the distribution can be described using measures such as skewness and kurtosis. Characteristics of a variable's distribution can also be depicted graphically by histograms and ogives.
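As a minimal sketch, the summary measures named above can be computed directly in Python; the marks below are invented purely for illustration.

```python
import statistics as st

# Hypothetical sample of exam marks (invented for illustration only)
marks = [42, 55, 55, 61, 63, 67, 70, 72, 75, 88]

mean = st.mean(marks)                 # central tendency: arithmetic mean
median = st.median(marks)             # central tendency: median
mode = st.mode(marks)                 # central tendency: mode
variance = st.pvariance(marks)        # dispersion: population variance
std_dev = st.pstdev(marks)            # dispersion: population standard deviation
data_range = max(marks) - min(marks)  # dispersion: range
cv = 100 * std_dev / mean             # coefficient of variation (%)

# Moment coefficient of skewness: mean of cubed deviations divided by sd^3
skewness = sum((x - mean) ** 3 for x in marks) / (len(marks) * std_dev ** 3)

print(mean, median, mode, variance, std_dev, data_range, cv, skewness)
```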

Bivariate analysis deals with methods of summarising data arising from variation in two variables. In this case, descriptive statistics include:

· Graphical representation based on scatter plots.

· Quantitative measures of dependence between variables using Pearson's correlation coefficient and Spearman's rank correlation coefficient.

· Description of the relationship between variables through regression analysis (a brief illustration follows this list).
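The sketch below illustrates these tools on invented paired observations, assuming scipy is available; a scatter plot of the same pairs (for example with matplotlib) would give the graphical view.

```python
from scipy import stats

# Hypothetical paired observations, e.g. hours studied (x) and marks obtained (y)
x = [2, 3, 5, 7, 9, 10, 12, 15]
y = [50, 55, 58, 64, 72, 76, 83, 90]

# Quantitative measures of dependence
pearson_r, _ = stats.pearsonr(x, y)      # Pearson's correlation coefficient
spearman_rho, _ = stats.spearmanr(x, y)  # Spearman's rank correlation coefficient

# Simple linear regression of y on x: y = intercept + slope * x
result = stats.linregress(x, y)

print(f"Pearson r     = {pearson_r:.3f}")
print(f"Spearman rho  = {spearman_rho:.3f}")
print(f"Fitted line: y = {result.intercept:.2f} + {result.slope:.2f} x")
```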

Semester II

Elementary Probability Theory (Core Course 2)

Probability is the logic of uncertainty and randomness, which occur in every field of human activity. It is therefore extremely useful and interesting to understand probability, on which many of the principles and procedures of Statistics depend. Probability theory provides a means of quantifying the likelihood of occurrence of the different events resulting from a random experiment.

The concept of conditional probability deserves special attention due to its twofold importance. Firstly, we are often interested in calculating probabilities when some partial information about the outcome of the experiment is already available, so that the desired probabilities are conditional. Secondly, even when no partial information is available, conditional probabilities can often be used to compute the desired probabilities more easily.
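For instance, the defining relation P(A | B) = P(A and B) / P(B) can be checked on a small invented experiment (two fair dice; the event labels A and B are hypothetical choices made only for this sketch).

```python
from itertools import product
from fractions import Fraction

# Sample space of the random experiment: rolling two fair dice
omega = list(product(range(1, 7), repeat=2))

A = {w for w in omega if sum(w) >= 10}  # event A: total is at least 10
B = {w for w in omega if w[0] == 6}     # event B: first die shows 6

def P(event):
    """Classical (equally likely outcomes) probability."""
    return Fraction(len(event), len(omega))

# Conditional probability of A given B: P(A and B) / P(B)
p_A_given_B = P(A & B) / P(B)
print(P(A), p_A_given_B)  # 1/6 unconditionally, 1/2 once B is known
```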

In analysing the data from an enquiry, we often have to use mathematical representations of the frequency distribution of one or more variables in the population concerned. Theoretical distributions are meant to serve precisely this purpose.

Semester III

Introduction to Statistical Inference (Core Course 3)

Statistical inference is the process of drawing conclusions about an unknown population from a known sample. The subject usually appears in two major categories – the problem of estimation and the problem of hypothesis testing.

Firstly, some feature of the population in which an enquirer is interested may be completely unknown, and he/she may want to make a guess about this feature based solely on a random sample drawn from the population. This is the problem of estimation.

Secondly, some information related to a feature of the population may be available to the enquirer, and he/she may want to judge whether that information is tenable in the light of a random sample selected from the population. This is the problem of hypothesis testing.
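A minimal sketch of both problems on the same invented sample, assuming the population mean is the feature of interest and using scipy for the t-based procedures:

```python
import numpy as np
from scipy import stats

# Hypothetical random sample drawn from the population of interest
sample = np.array([12.1, 11.8, 12.6, 12.3, 11.9, 12.4, 12.2, 12.5])

# Problem of estimation: point estimate and 95% confidence interval for the mean
point_estimate = sample.mean()
ci = stats.t.interval(0.95, df=len(sample) - 1,
                      loc=point_estimate, scale=stats.sem(sample))

# Problem of hypothesis testing: is the claimed population mean 12.0 tenable?
t_stat, p_value = stats.ttest_1samp(sample, popmean=12.0)

print(f"point estimate = {point_estimate:.3f}, 95% CI = {ci}")
print(f"t = {t_stat:.3f}, p-value = {p_value:.3f}")
```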

Experimentation and making inferences are twin essential features of Statistics. Experiments are performed to generate data for making inferences about a population that often has no concrete existence in reality. Variation due to various causes is inherent in any data set. Partitioning the total variability of a data set into components ascribable to different sources of variation is the central goal of the analysis of variance. The systematic and efficient conduct of an experiment that lets us identify these sources of variation is the subject of the design of experiments.
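As an illustration of this partitioning, a one-way analysis of variance on invented yields from three hypothetical treatments:

```python
from scipy import stats

# Hypothetical yields under three treatments of a designed experiment
treatment_a = [20, 22, 19, 24, 25]
treatment_b = [28, 30, 27, 26, 29]
treatment_c = [18, 17, 21, 20, 19]

# One-way ANOVA: does the between-treatment variation exceed what
# within-treatment (error) variation alone could plausibly explain?
f_stat, p_value = stats.f_oneway(treatment_a, treatment_b, treatment_c)
print(f"F = {f_stat:.2f}, p-value = {p_value:.4f}")
```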

Semester IV

Applications of Statistics (Core Course 4)

The theory of sample surveys has the objective of developing methods for collecting a sample of observations from an existing population in such a way that the sample adequately represents the population and supports accurate conclusions about it.
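A sketch of the simplest such design, simple random sampling without replacement from an invented finite population; the estimator shown is the sample mean with its finite-population-corrected standard error.

```python
import random
import statistics as st

random.seed(1)

# Hypothetical finite population of N household incomes (invented values)
population = [round(random.gauss(50, 12), 1) for _ in range(500)]
N = len(population)

# Draw a simple random sample without replacement of size n
n = 40
sample = random.sample(population, n)

# Estimate the population mean and its standard error (with finite population correction)
y_bar = st.mean(sample)
s2 = st.variance(sample)                 # sample variance (divisor n - 1)
se = ((1 - n / N) * s2 / n) ** 0.5

print(f"estimated mean = {y_bar:.2f}, standard error = {se:.2f}")
print(f"true population mean = {st.mean(population):.2f}")
```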

Index numbers are statistical tools for measuring changes in a group of related variables (prices of commodities, value and quantity of sales, amounts of imports and exports, volume of industrial and agricultural production, cost of living, etc.) with respect to time, geographical location or other characteristics. These indicators are of paramount importance to the Government, trade and commerce in taking decisions on the planning of a country's economy, the development of commercial, industrial and agricultural activities, and so on.
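As a worked, wholly invented example, a Laspeyres price index (base-period quantities as weights) for three commodities:

```python
# Hypothetical prices and base-period quantities for three commodities
base_prices     = [10.0, 4.0, 25.0]   # p0
current_prices  = [12.0, 5.0, 30.0]   # p1
base_quantities = [100, 250, 40]      # q0 (weights)

# Laspeyres price index: 100 * sum(p1*q0) / sum(p0*q0)
numerator = sum(p1 * q0 for p1, q0 in zip(current_prices, base_quantities))
denominator = sum(p0 * q0 for p0, q0 in zip(base_prices, base_quantities))
laspeyres_index = 100 * numerator / denominator

print(f"Laspeyres price index = {laspeyres_index:.1f}")  # values above 100 indicate a price rise
```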

A time series is a collection of observations recorded chronologically. The main objectives in investigating a time series are to understand the process generating the series and to predict its future values. Time series analysis is used in applications such as stock market analysis, pattern recognition, weather prediction, economic forecasting, census analysis and so on.
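A small sketch of one basic tool, a simple moving average used to expose the trend in an invented monthly series:

```python
# Hypothetical monthly sales figures (invented)
series = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118,
          115, 126, 141, 135, 125, 149, 170, 170, 158, 133, 114, 140]

def moving_average(values, window):
    """Simple moving average; each output point averages `window` consecutive values."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

trend = moving_average(series, window=12)  # a 12-month window smooths out seasonal swings
print([round(t, 1) for t in trend])
```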

Another important area of statistical application is vital statistics, which deals with the collection of information on the occurrence of vital events such as births, deaths, morbidity, marriages, population growth, etc., and with the techniques used in analysing data pertaining to these events.
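For example, the crude birth and death rates, the most elementary measures in this area, are expressed per 1,000 of mid-year population; all figures below are invented.

```python
# Hypothetical figures for a region in one calendar year (invented)
mid_year_population = 2_400_000
live_births = 43_200
deaths = 16_800

crude_birth_rate = 1000 * live_births / mid_year_population  # births per 1,000 persons
crude_death_rate = 1000 * deaths / mid_year_population       # deaths per 1,000 persons
rate_of_natural_increase = crude_birth_rate - crude_death_rate

print(crude_birth_rate, crude_death_rate, rate_of_natural_increase)  # 18.0 7.0 11.0
```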

Semester V

Operations Research (Discipline Specific Elective)

Operations research (OR) encompasses the development and application of a wide range of problem-solving methods and techniques in the pursuit of improved decision-making and efficiency. OR methods involve the construction of mathematical models that describe a problem and then the optimization of those models. Because of its emphasis on human-technology interaction and its focus on practical applications, OR is widely applied, for instance, in agricultural planning, distribution of goods and resources, emergency and rescue operations, engineering systems design, environmental management, financial planning, health care management, inventory control, manpower and resource allocation, manufacturing of goods, military operations, production process control, risk management, scheduling of tasks, telecommunications, traffic control, etc.

Because of the computational and statistical nature of most of the techniques, OR also has strong ties to computer science and analytics.
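A minimal sketch of this model-then-optimize workflow, assuming a hypothetical two-product production-planning problem and using scipy.optimize.linprog:

```python
from scipy.optimize import linprog

# Hypothetical problem: maximize profit 3x + 5y subject to
#   x + 2y <= 14   (machine hours)
#   3x - y >= 0    (demand ratio), rewritten as -3x + y <= 0
#   x - y  <= 2
#   x, y >= 0
# linprog minimizes, so the objective coefficients are negated.
c = [-3, -5]
A_ub = [[1, 2],
        [-3, 1],
        [1, -1]]
b_ub = [14, 0, 2]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
x, y = result.x
print(f"optimal plan: x = {x:.1f}, y = {y:.1f}, maximum profit = {-result.fun:.1f}")
```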

Research Methodology (Skill Enhancement Course)

Humankind constantly attempts to improve the world through research, the systematic foundation used to attain new knowledge, add to existing knowledge, and develop new processes and techniques. However, in order to conduct research, the researcher must implement research methods. These research methods are the strategies, tools, and techniques used by the researcher to collect the relevant evidence needed to create theories. Consequently, these research methods need to be credible, valid, and reliable. This is accomplished by writing a sound methodology, which consists of a systematic and theoretical analysis of the research methods employed. A methodology allows the researcher to evaluate and validate the rigour of the study and of the methods used to obtain the new information.

Objectives of Research Methodology

1. To gain familiarity with a phenomenon or to achieve new insights into it.

2. To portray accurately the characteristics of a particular individual, situation or group of individuals.

3. To determine the frequency with which something occurs or with which it is associated with something else.

4. To test a hypothesis of a causal relationship between variables.

Semester VI

Project (Discipline Specific Elective)

Project work is considered a special course involving the application of statistical knowledge to exploring and analysing a real-life problem. The purpose of a Statistics project is to address a particular real-life problem by collecting, organising, analysing and interpreting relevant information. The data in such a project should be presented in a defined form and according to the given instructions. The end goal of such a project is to provide the reader with an understandable and exhaustive conclusion obtained through statistical methodology.

Monte Carlo Methods (Skill Enhancement Course)

Monte Carlo methods are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. They are mainly used in three problem classes - optimization, numerical integration and generating samples from a probability distribution. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often applied to physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches.
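A minimal sketch of the numerical-integration case: estimating the integral of exp(-x^2) over [0, 1], which has no elementary closed form, by averaging the integrand at uniformly random points.

```python
import math
import random

random.seed(0)

def monte_carlo_integral(f, a, b, n=100_000):
    """Estimate the integral of f over [a, b] by averaging f at n uniform random points."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

estimate = monte_carlo_integral(lambda x: math.exp(-x * x), 0.0, 1.0)
print(f"Monte Carlo estimate = {estimate:.4f}")  # close to the true value 0.7468...
```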