Diet plays a definite role in a person's health. What we eat and drink is one of the major contributors to our state of physical and mental health.

Many people are coming around to this way of thinking now that the role diet plays in the nation's health is being popularized by TV documentaries.

The idea of fitness was once synonymous with health, and diet took a back seat to getting fit through workouts, Pilates, or aerobic dance classes. However you look at it, though, exercise alone does not complete the picture of health.

My belief is that diet is vitally important. So what exactly is the role diet has to play in the overall scheme of health?