This criterion assesses the extent to which the student’s report provides evidence that the student has selected, recorded, processed and interpreted the data in ways that are relevant to the research question and can support a conclusion.
At the root of this section is the data generated and how it is processed. If there is insufficient data, any treatment will be superficial. It is hoped that you would recognize such a shortfall and revisit the method before beginning the analysis. Alternatively, the use of databases or simulations to provide sufficient material for analysis could help in such situations.
Any treatment of the data must be appropriate to the focus of the investigation in an attempt to answer the research question. The conclusions drawn must be based on the evidence obtained from the data rather than on assumptions. Given the scope of the internal assessment and the time allocated, it is more than likely that variability in the data will lead to a tentative conclusion. This should be recognized and the extent of the variability considered.
The variability should be demonstrated and explained and its impact on the conclusion fully acknowledged. It is important to note that, in this criterion, the word “conclusion” refers to a deduction based on direct interpretation of the data, which is based on asking questions such as: What does the graph show? Does any statistical test used support the conclusion?
Extra notes:
Recording Raw Data
Include all results and observations you made during the experiment.
The table should have a full, descriptive title so the reader can understand the results without needing to read the whole lab report.
Titles, units and uncertainties for the columns need to be displayed clearly in the table.
The results should relate to the aim and the hypothesis. So if you are testing the effect of temperature on enzyme activity, the first column of the results table will show the range of temperatures and the second column the rate of reaction, which could be recorded in different ways: for example the volume of gas produced, a colour change, or the mass of a product or reactant, depending on the type of experiment.
Decimal places should be consistent with the precision of the measuring equipment and constant within each variable.
Qualitative data, i.e. what you noticed but could not measure, should be included. It demonstrates that you were not doing the experiment blindly, but evaluating throughout the whole process. These observations will also give you insights to help with your evaluation later on.
Processing Data
In Biology a form of processing such as % change (in mass/volume/length) or reaction rate (1/time) should be done, if appropriate.
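The two processing steps mentioned above can be sketched as small functions. This is a minimal illustration, not part of the guide itself, and the sample values in the comments are invented:

```python
def percent_change(initial, final):
    """Percentage change (in mass/volume/length) relative to the initial value."""
    return (final - initial) / initial * 100

def rate_from_time(time_taken):
    """Rate of reaction expressed as 1/time."""
    return 1 / time_taken

# e.g. a potato cylinder changing from 2.00 g to 2.31 g
print(percent_change(2.00, 2.31))
# e.g. 25.0 s for a colour change to occur
print(rate_from_time(25.0))
```

Using a relative measure such as percentage change lets you compare samples that did not start at the same initial mass, volume or length.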
Adjust the precision of your values to reflect the processing (if needed); guidance is given below. If you are using a reaction rate or % change calculation then the uncertainty of the raw measurement is lost at this step. See the presentation for guidance:
Precision in processed data from Chris Paine
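One common convention (a sketch only, not a rule taken from the presentation above) is to round a processed value to the same number of significant figures as the least precise raw measurement that went into it. The helper below is a hypothetical illustration:

```python
import math

def round_sig(value, sig_figs):
    """Round value to the given number of significant figures."""
    if value == 0:
        return 0.0
    magnitude = math.floor(math.log10(abs(value)))
    return round(value, sig_figs - 1 - magnitude)

# e.g. a rate of 1/time, where the time (23.7 s) was measured
# to 3 significant figures
rate = 1 / 23.7
print(round_sig(rate, 3))  # rounded to 3 significant figures
```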
Next, you should always aim to calculate the mean and standard deviation, provided you have at least 5 trials. This is the most basic form of data processing.
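This basic step can be done with Python's standard `statistics` module. The trial values below are invented, standing in for five repeats at one level of the independent variable:

```python
import statistics

# e.g. cm3 of gas collected in 5 trials at one temperature
trials = [12.1, 11.8, 12.4, 12.0, 11.7]

mean = statistics.mean(trials)
sd = statistics.stdev(trials)  # sample standard deviation (n - 1 denominator)

print(f"mean = {mean:.1f}, sd = {sd:.1f}")
```

The same result can be obtained in Excel with the AVERAGE and STDEV.S functions.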
In some investigations you will be looking for patterns (i.e. a 5 × 5 minimum: at least five values of the independent variable with five trials each), drawing a curve of best fit and judging the strength of the correlation. In others you will be looking for significant differences between two (or more) mean values (minimum 10 values per mean). If you are looking for a significant difference between the means, consider carrying out a t-test at this point. Review 1. Statistics, or the Excel t-tests from click4biology, if you are unsure how to do this.
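To make the t-test concrete, here is a hand-rolled sketch of an unpaired (Student's) t-test assuming equal variances; the two samples of ten values are invented. In practice you would compare the resulting t value against a critical value for (n1 + n2 − 2) degrees of freedom at your chosen significance level:

```python
import math
import statistics

def t_statistic(a, b):
    """Unpaired t statistic for two samples, assuming equal variances."""
    n1, n2 = len(a), len(b)
    m1, m2 = statistics.mean(a), statistics.mean(b)
    # Pooled variance combines the two sample variances.
    sp2 = ((n1 - 1) * statistics.variance(a)
           + (n2 - 1) * statistics.variance(b)) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# e.g. leaf widths (cm) of plants grown in two light conditions
group_a = [4.1, 3.9, 4.3, 4.0, 4.2, 4.1, 3.8, 4.4, 4.0, 4.2]
group_b = [3.6, 3.5, 3.8, 3.7, 3.4, 3.6, 3.9, 3.5, 3.7, 3.6]
print(f"t = {t_statistic(group_a, group_b):.2f}")
```

Excel's T.TEST function performs the equivalent calculation and returns the p-value directly.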
Always state how you carried out the processing by including the formula used (and/or the Excel functions used). Giving example calculations as well as quoting the formula is advised.
Presenting Data
You are presenting processed data; do not graph raw data by mistake. Raw data may additionally be graphed, but this should only be done if it is genuinely helpful, as it will not gain you marks.
Titles should be self-explanatory, complete and consistent with the processed data.
Axes should be scaled correctly; you don’t have to start at zero, just graph the range of your data. Make sure gridlines have been added to your graphs.
Units must be included in the axis labels, and decimal places must be consistent with the processed data.
Error bars should be included if you have calculated the standard deviation, and do not forget to state what the error bars represent, e.g. in an annotation under the graph. Refer to the statistics topic to review how this is done.
The graph should almost always be a line graph (in MS Excel this is referred to as a scatter graph – stay away from their line graphs). Sometimes a bar chart will be appropriate, if you are comparing the means of two sets of data or if you have a categorical IV.
A curve of best fit (which may be a straight line in some circumstances) should be drawn; do not simply join the points unless the fit is perfect. Remember that a good curve will pass through the error bars (wherever possible) and through the middle of the points: the errors (gaps from the mean points to the curve of best fit) above the curve should balance the errors below the curve.
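For the straight-line case, the "errors above balance errors below" property is exactly what a least-squares fit guarantees. Here is a minimal sketch in pure Python, with invented data points:

```python
def best_fit_line(xs, ys):
    """Return (slope, intercept) minimising the squared vertical errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

temps = [10, 20, 30, 40, 50]       # e.g. temperature in degrees C
rates = [0.8, 1.9, 3.1, 3.9, 5.2]  # e.g. rate of reaction (invented)

m, c = best_fit_line(temps, rates)
print(f"rate = {m:.3f} * temp + {c:.2f}")
```

A least-squares line always passes through the mean point of the data, and the residuals above and below it sum to zero, which matches the balance described above.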
Avoid using Excel’s trend-fitting tools unless you understand the underlying maths; it is easy to make a poor choice of algorithm. Instead, print the graph off and draw a line of best fit by hand.