What's it for?

SCORM avoids stating explicitly what certain elements are for and how they should be used. This is generally a good thing; however, it has led to some confusion regarding the purpose of some elements (see SCORM Bloopers).

In this section I will describe how I use the elements. Be aware, though, that these are only my opinions and you should feel free to disagree.

Comments

As shown in the bloopers, the purpose of the comments element can be misunderstood. Firstly, let's look at what it was not designed for, but what it is commonly misinterpreted as:

1. A Survey Form - there are better ways

The unfortunate name of this element does lend itself to this form of misinterpretation. A quick look at how the data element is structured under SCORM 1.2 shows why it's not designed for this. The comments element in SCORM 1.2 is just a big 4K block of text, and there is no standardized way to separate individual data items out of it. Also note that the interactions element (completely separate from the comments) does contain a 'likert' interaction type, the most common form of non-textual survey feedback, and it also allows you to store individual typewritten responses to questions - all of which are reportable. So if you absolutely must use the SCORM data model for collecting survey data, it makes far more sense to store that data in the interactions element rather than the comments block.
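To make that concrete, here is a minimal sketch (in TypeScript, assuming the SCORM 1.2 API object has already been located and LMSInitialize called; the question ids and response values are made up) of recording a likert-style survey answer as an interaction rather than dumping free text into cmi.comments:

```typescript
// Minimal sketch, assuming the SCORM 1.2 API object has already been
// located (e.g. on window.API) and LMSInitialize("") has been called.
// Question ids and likert values below are illustrative only.
interface Scorm12API {
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
}

declare const API: Scorm12API;

// Record one likert-style survey answer as an interaction rather than
// writing free text into the 4K cmi.comments block.
function recordSurveyAnswer(questionId: string, likertValue: string): void {
  // Append after any interactions already recorded this session.
  const index = Number(API.LMSGetValue("cmi.interactions._count")) || 0;
  const prefix = `cmi.interactions.${index}`;

  API.LMSSetValue(`${prefix}.id`, questionId);                 // e.g. "survey_q1"
  API.LMSSetValue(`${prefix}.type`, "likert");
  API.LMSSetValue(`${prefix}.student_response`, likertValue);  // e.g. "agree"
  API.LMSCommit("");
}

// Usage: recordSurveyAnswer("survey_q1", "agree");
```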

2. Learner Notes - an invasion of privacy

There is no problem with doing this, provided that the learners are aware that their notes are not private. Some learners may get quite upset to learn that their private course notes are stored on the LMS and are available to the LMS admins and training staff. Comments can be output in reports, and in fact are designed to be. It's really only my personal opinion, but I feel that if the learners aren't aware that their "comments" are open to scrutiny by others, then it's an invasion of privacy.

So what is it for then?

As I stated above, 'comments' is badly named; if I could rename it, I would call it "Issue report". Comments is actually designed to work in cooperation with the comments_from_lms functionality.

Comments should be used to allow learners to report problems with the course content or technical issues to the Learning Administrators.

A typical use case would be:
  1. The learner discovers that a course describing the President of the USA is still showing a picture of G.W. Bush.
  2. The learner activates the comments function, and types in "Hey guys, are you die-hard Republicans or just slack? You still have a picture of Bush as the President, c'mon how long has it been already." They then submit it, and the result is stored in cmi.comments.
  3. The LMS admins run a regular report on all comments and realise that this comment is correct; the course has out-of-date material. However, they cannot change that themselves, so they forward the comment to development, which will take about a week to make the change.
  4. In the meantime they enter a new comments_from_lms entry which will be shown to all learners when they log in (SCORM 1.2 works this way; SCORM 2004 is a bit more sophisticated and comments can be targeted at actual screens): "On screen 7 there is an out of date photograph, naming G.W.Bush as president. Please note that the current president is Barack Obama. This issue has been submitted to development and the photo is currently being updated." This comments_from_lms entry is live to students from the moment it is entered.
Whether or not your LMS and/or courseware allows you to actually use comments and comments_from_lms this way under SCORM 1.2 is something you will need to investigate on a case-by-case basis. Hopefully under SCORM 2004 you will have better luck, as it is mandatory for the LMS (at least) to support the elements.
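If your LMS does support it, the courseware side of this workflow is quite small. Here is a minimal SCORM 1.2 sketch (assuming the API object is already located and initialised; showNotice is a hypothetical function in your own courseware, not part of SCORM):

```typescript
// Minimal SCORM 1.2 sketch of the issue-report workflow. showNotice() is a
// hypothetical display function in your courseware.
declare const API: {
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
};
declare function showNotice(text: string): void;

// On launch: surface anything the administrators have posted for learners.
const lmsComment = API.LMSGetValue("cmi.comments_from_lms");
if (lmsComment) {
  showNotice(lmsComment);
}

// When the learner submits an issue report:
function submitIssueReport(text: string): void {
  // SCORM 1.2 defines no structure here; the whole element is one 4K string,
  // and LMSs differ on whether repeated writes append or replace, so check
  // your LMS before relying on either behaviour.
  API.LMSSetValue("cmi.comments", text);
  API.LMSCommit("");
}
```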

Objectives

It would appear that this is another unfortunately named element; however, that is not the case. It is a fairly general-purpose element and it's designed to be used in a number of different ways. What I'll describe here are only three of the ways you might like to use it. If you know of any others that you think are worth mentioning, please let me know.

1. Tracking modules

The term "objectives" does lead the instructional designer to believe that it must track right down to the learning objective or competency level. It certainly can be used for that, but in many cases its overkill to track to that level. In many cases, all the client wants is to track whether a learner has completed a module, and what they scored for that module. What does the objectives element store? A completion status and one score. It's a match made in virtual heaven.

In this method you simply have one objective for each module.

Things become somewhat more difficult if you have modules and topics, but all you really need to do is have one objective for each topic, and then one objective for the umbrella module, whose status and score you calculate from the learner's performance across the subordinate topics.
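For the simple one-objective-per-module method, a minimal SCORM 1.2 sketch might look like the following (the module ids, objective index and pass mark are illustrative only, and I'm assuming the API object is already located and initialised):

```typescript
// Minimal SCORM 1.2 sketch: one objective per module.
declare const API: {
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
};

// Write the module's result to its objective slot. The 80% pass mark and
// module ids are assumptions for this example, not anything SCORM mandates.
function reportModuleResult(index: number, moduleId: string, score: number): void {
  const prefix = `cmi.objectives.${index}`;
  API.LMSSetValue(`${prefix}.id`, moduleId);              // e.g. "module_01"
  API.LMSSetValue(`${prefix}.score.raw`, String(score));
  API.LMSSetValue(`${prefix}.status`, score >= 80 ? "passed" : "failed");
  API.LMSCommit("");
}

// Usage: reportModuleResult(0, "module_01", 92);
```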

To be honest, in SCORM 1.2 this is all that I would use objectives for. In SCORM 2004 the ability to track a progress measure makes it a bit more useful when applying different methodologies.

2. Pre-assessment based recognition of prior learning

Now I know many people believe that pre-assessments are bad. I accept those arguments, but I believe that the advantages of using a pre-assessment to measure current learning outweigh the disadvantages - especially if the learner is not aware that they are doing an assessment.

The concept here is to mimic the initial discussion between a mentor and a learner. Typically the mentor wants to find out what the learner already knows so that they don't waste time covering familiar ground. Surprise, surprise: that's what we are interested in too, if we want to avoid boring the learner and keep the time-out-for-training to a minimum.

Throughout the question and answer session, difficult questions are asked. If the learner does demonstrate full competence across a learning objective, then we mark that objective as complete without them having seen the training. Note that the questions must be difficult, and must cover all aspects of that module. They must be difficult enough that we can say, with assurance, that the learner would gain no benefit from viewing the training material.
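As an illustration only, here is a minimal SCORM 2004 sketch of crediting an objective from a pre-assessment result (the "full marks" threshold and the objective ids are my own assumptions for the example, not part of the standard):

```typescript
// Minimal SCORM 2004 sketch: mark an objective as already achieved when the
// learner demonstrates full competence in the pre-assessment. Assumes the
// API_1484_11 object is located and Initialize("") has been called.
declare const API_1484_11: {
  GetValue(element: string): string;
  SetValue(element: string, value: string): string;
  Commit(param: string): string;
};

function creditPriorLearning(index: number, objectiveId: string, scaledScore: number): void {
  const prefix = `cmi.objectives.${index}`;
  API_1484_11.SetValue(`${prefix}.id`, objectiveId);
  API_1484_11.SetValue(`${prefix}.score.scaled`, scaledScore.toFixed(2));
  if (scaledScore >= 1) {
    // Full marks on hard questions covering the whole objective:
    // credit it without the learner viewing the training material.
    API_1484_11.SetValue(`${prefix}.success_status`, "passed");
    API_1484_11.SetValue(`${prefix}.completion_status`, "completed");
  }
  API_1484_11.Commit("");
}
```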

3. Tracking individual competencies

This would actually work a little more like a matrix. Let's say you have a Captivate course that has a demonstration, a follow-along demonstration and an assessment simulation. Through each of these you track three things: the ability to open an edit panel, the ability to enter data, and the ability to submit the data. Each competency is tracked individually as an objective. In each module the learner's performance against each competency is recorded, so every objective is updated in every module.
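A minimal SCORM 2004 sketch of that matrix idea might look like this (the competency ids are illustrative, and the lookup-by-id helper is my own convenience, not anything mandated by the standard):

```typescript
// Minimal SCORM 2004 sketch: each competency is one objective, and every
// module (demo, follow-along, assessment simulation) updates the same
// objective. Assumes API_1484_11 is located and initialised.
declare const API_1484_11: {
  GetValue(element: string): string;
  SetValue(element: string, value: string): string;
  Commit(param: string): string;
};

// Find the index of an objective by id, or append a new one.
function objectiveIndex(objectiveId: string): number {
  const count = Number(API_1484_11.GetValue("cmi.objectives._count")) || 0;
  for (let i = 0; i < count; i++) {
    if (API_1484_11.GetValue(`cmi.objectives.${i}.id`) === objectiveId) {
      return i;
    }
  }
  API_1484_11.SetValue(`cmi.objectives.${count}.id`, objectiveId);
  return count;
}

// Called from any module with that module's result for the competency.
function updateCompetency(objectiveId: string, passed: boolean): void {
  const i = objectiveIndex(objectiveId);
  API_1484_11.SetValue(`cmi.objectives.${i}.success_status`, passed ? "passed" : "failed");
  API_1484_11.Commit("");
}

// Usage: updateCompetency("open_edit_panel", true);
```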

This is obviously a more complex application of the objectives, and really only worth the effort if you have a good reason to do it.

Interactions

The interactions section of SCORM (be it in 1.2 or 2004) is easily the most complex element in the system. However, it is reasonably well understood in terms of what it is supposed to track; there is little in the way of misunderstanding of its purpose. What I do want to discuss, because this can get confusing, is the differences between SCORM 1.2 and SCORM 2004.

I believe the confusion arises simply because there is so much similarity between the two versions that only by careful reading of the RTE will you spot the differences.

Write only versus read/write 

The biggest difference is that, under SCORM 1.2, the interaction element is write only. Once the data is submitted to the LMS you can't read it back. You have to hold it in active memory if you want to use it later in the session, and if the learner logs out and then logs back in again - you are completely out of luck. 
SCORM 2004 does allow read and write to the interaction element.
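To make the SCORM 1.2 limitation concrete, here is a minimal sketch (with made-up question ids and a 'choice' type) that writes an answer to the LMS and also caches it locally, because the SCO has no way to read it back later:

```typescript
// Minimal SCORM 1.2 sketch of the write-only problem: the answer is written
// to the LMS, but a local copy is kept as well because it cannot be read
// back later in the session, and is lost to the SCO on re-entry.
declare const API: {
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
};

const answeredThisSession: { id: string; response: string; result: string }[] = [];

function recordAnswer(id: string, response: string, correct: boolean): void {
  const index = Number(API.LMSGetValue("cmi.interactions._count")) || 0;
  const prefix = `cmi.interactions.${index}`;
  const result = correct ? "correct" : "wrong";   // SCORM 1.2 vocabulary

  API.LMSSetValue(`${prefix}.id`, id);
  API.LMSSetValue(`${prefix}.type`, "choice");
  API.LMSSetValue(`${prefix}.student_response`, response);
  API.LMSSetValue(`${prefix}.result`, result);
  API.LMSCommit("");

  // The local copy is the only thing a review screen can draw on in 1.2.
  answeredThisSession.push({ id, response, result });
}
```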

What does this really mean? In SCORM 1.2, if you wanted a learner to be able to review an assessment that shows what they answered for any question, across multiple sessions, it would require storing the result in the suspend_data element as well as the interactions element (double storage is never a good idea). Even that may not be possible if you already have a lot of suspend data stored; the SCORM 1.2 suspend_data element is only 4K after all. With SCORM 2004 you can read back which questions were answered and all the details attached to each interaction. The 2004 RTE even discusses two approaches to storing results data and how that can be used.
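Under SCORM 2004, by contrast, a minimal sketch can rebuild a review screen by simply reading the stored interactions back (again assuming the API_1484_11 object has been located and Initialize("") called):

```typescript
// Minimal SCORM 2004 sketch: read previously stored interactions back to
// rebuild a review screen, even across sessions.
declare const API_1484_11: {
  GetValue(element: string): string;
};

interface ReviewItem { id: string; response: string; result: string }

function loadReviewData(): ReviewItem[] {
  const items: ReviewItem[] = [];
  const count = Number(API_1484_11.GetValue("cmi.interactions._count")) || 0;
  for (let i = 0; i < count; i++) {
    const prefix = `cmi.interactions.${i}`;
    items.push({
      id: API_1484_11.GetValue(`${prefix}.id`),
      response: API_1484_11.GetValue(`${prefix}.learner_response`),
      result: API_1484_11.GetValue(`${prefix}.result`),
    });
  }
  return items;
}
```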

Smallest Permitted Maximum (SPM): 250 interactions 

Firstly we need to explain the concept of a smallest permitted maximum. SCORM typically does not set upper limits. If an LMS wants to support an infinite number of interactions, that is not illegal or a breach of the standards. If, on the other hand, the LMS developers decided to be stingy and only allow their LMS to track a maximum of 10 interactions, then this could bring the SCORM system into disrepute. Therefore, what SCORM does set is the smallest maximum value that an LMS developer is allowed to support and still be compliant.

In the case of the interactions element, the smallest permitted maximum is 250 interactions. No LMS is allowed to support fewer than that and call itself compliant.

While describing the limit this way does have its advantages, the only logical conclusion that a SCO developer can come to is that eventually their SCO will be run on an LMS that uses the smallest permitted maximum, and they should therefore self-limit their SCO to 250 tracked interactions.

Be aware that this limit only exists in SCORM 2004. Oddly in SCORM 1.2 there is no limit specified in the RTE that I can find (I may not have looked in the right place of course, let me know if I am wrong about that). 

What this means is that something that did work in SCORM 1.2 may not work in SCORM 2004. I have no real examples of that happening, as I have no proof of whether there is an SPM in 1.2 or not. Here, however, is an example of what could happen.

Suppose you had an eLearning course and the client insisted that the learner read every page, and they determined that the best way to do that was to track the Next button on each page (actually I am not making this bit up - I have seen a course that does this). If that course had a combined total of assessment questions and Next buttons greater than 250, it would not be possible to move it as-is to SCORM 2004 and expect it to work. The number of tracked interactions exceeds the SPM of 250, and therefore as a SCO developer you cannot guarantee that it will work on every SCORM 2004 LMS. It will only work if the LMS provider has elected to use a maximum greater than the SPM - and that is obviously well beyond the control of a SCO developer.

The advice here is to be aware that there is a limit in 2004.

It's a fairly large limit if you have discrete, small SCOs, and it's still reasonably roomy if you are producing multi-module SCOs with trackable assessments in each module. But if you are trying to track Next buttons and various other items as well, in a monster multi-module, multi-topic SCO, you will run out of room.
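If you want to be defensive about it, a small guard is enough to stop a SCORM 2004 SCO from ever writing past the SPM; this is purely an illustrative sketch, and how you handle the overflow (stop tracking, log locally, etc.) is up to you:

```typescript
// Minimal sketch: never write more interactions than the SPM of 250, since
// we cannot know whether a given SCORM 2004 LMS supports more than that.
declare const API_1484_11: {
  GetValue(element: string): string;
};

const INTERACTION_SPM = 250;

// Returns the next free index, or null once the SPM would be exceeded.
function nextInteractionIndex(): number | null {
  const count = Number(API_1484_11.GetValue("cmi.interactions._count")) || 0;
  return count < INTERACTION_SPM ? count : null;
}
```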

 