One of the first questions I get from clients is: "Which SCORM version should we use?" It sounds like it should have a simple answer. It mostly does — but the "mostly" matters.
Both SCORM 1.2 and SCORM 2004 are actively used, both are supported by virtually every modern LMS, and both can deliver effective e-learning. The right choice depends on what you need to track, how complex your courses are, and which LMS you're using.
If you're not sure what SCORM is, start there. This guide assumes you know the basics and need to decide between versions.
The Quick Answer
Choose SCORM 1.2 if you want maximum compatibility, simple tracking (completion + score), and the fewest headaches.
Choose SCORM 2004 if you need sequencing rules, multiple learning objectives, or detailed reporting beyond basic pass/fail.
For the majority of e-learning projects I work on, SCORM 1.2 is the right choice — it remains the dominant version in use. But a significant minority genuinely needs 2004's features.
Feature Comparison
Here's a side-by-side comparison of the two versions:
| Feature | SCORM 1.2 | SCORM 2004 |
|---|---|---|
| Release date | 2001 | 2004 (4th edition: 2009) |
| LMS support | Virtually universal | Widely supported, but implementation depth varies |
| Completion status | completed, incomplete, not attempted, browsed | Separates completion (completed, incomplete, not attempted, unknown) from success (passed, failed, unknown) |
| Score tracking | Three sub-fields: raw, min, max (0–100 by convention, not mandated) | Multiple scores: raw, scaled (-1 to 1), min, max |
| Objectives | Multiple objectives via cmi.objectives.n.*, but no sequencing integration | Multiple objectives with individual completion, scores, and sequencing rollup integration |
| Sequencing | None — learner navigates freely | Full sequencing and navigation rules (prerequisite, flow control, rollup) |
| Bookmarking | cmi.core.lesson_location (255 chars) | cmi.location (1000 chars) |
| Suspend data | cmi.suspend_data (4,096 chars) | cmi.suspend_data (64,000 chars in 3rd/4th editions; 4,000 in 2nd edition) |
| Interaction types | 8 types (true-false, choice, fill-in, matching, performance, sequencing, likert, numeric) | 10 types (adds long-fill-in and other) + improved response patterns |
| Time tracking | cmi.core.session_time (HHHH:MM:SS.SS format) | cmi.session_time (ISO 8601 duration) |
| Communication | JavaScript API (API object) | JavaScript API (API_1484_11 object) |
| Package size limit | No limit in the spec; practical limits vary by LMS | Same; LMS-dependent |
| Specification complexity | Considerably smaller (across 3 books) | Significantly larger (across 4 books + conformance requirements) |
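The time-tracking row above is one of the more mechanical differences between the versions. As an illustrative sketch (the function names are my own, not part of either specification), here's how the same session duration would be formatted for each:

```javascript
// SCORM 1.2 expects the CMITimespan format HHHH:MM:SS.SS
// (hours are allowed to exceed 24).
function toScorm12Time(totalSeconds) {
  const pad = (n) => String(n).padStart(2, "0");
  const hours = Math.floor(totalSeconds / 3600);
  const minutes = Math.floor((totalSeconds % 3600) / 60);
  const seconds = (totalSeconds % 60).toFixed(2).padStart(5, "0");
  return `${pad(hours)}:${pad(minutes)}:${seconds}`;
}

// SCORM 2004 expects an ISO 8601 duration, e.g. PT1H30M15.5S
function toScorm2004Time(totalSeconds) {
  const hours = Math.floor(totalSeconds / 3600);
  const minutes = Math.floor((totalSeconds % 3600) / 60);
  const seconds = Math.round((totalSeconds % 60) * 100) / 100;
  return `PT${hours}H${minutes}M${seconds}S`;
}
```

A session of 5,415.5 seconds becomes `01:30:15.50` for a 1.2 LMS and `PT1H30M15.5S` for a 2004 LMS, which is one reason a package built for one version fails data-model validation on the other.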
SCORM 1.2: The Reliable Workhorse
SCORM 1.2 has been around since 2001, and it's still the most commonly used version. There's a reason for that — it does the fundamentals well without unnecessary complexity.
What SCORM 1.2 Does Well
Broad compatibility. Virtually every LMS on the market supports SCORM 1.2 — it's the most widely adopted e-learning standard in existence. This matters enormously when you're deploying across multiple platforms or don't control which LMS your content ends up in.
Simple implementation. The API is straightforward. There are fewer data model elements to deal with, fewer edge cases, and fewer opportunities for things to go wrong. When I'm building custom SCORM courses, 1.2 packages consistently have fewer LMS-specific issues.
Battle-tested. Twenty-five years of real-world use means the quirks are well-documented and the workarounds are established. The common SCORM errors you'll encounter with 1.2 are well-understood.
Authoring tool support. Every SCORM authoring tool supports 1.2 output, and most default to it.
Where SCORM 1.2 Falls Short
Single status field. SCORM 1.2 combines completion and success into one field (cmi.core.lesson_status). A learner is either "passed," "failed," "completed," "incomplete," "browsed," or "not attempted." You can't separately track "completed the course but failed the assessment" without workarounds.
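One common workaround, sketched below under my own naming (the `api` parameter stands in for the LMS-provided `API` object), is to record whichever status your reporting depends on in `cmi.core.lesson_status` and stash the rest of the outcome in suspend data:

```javascript
// Hypothetical SCORM 1.2 workaround: lesson_status can only hold one
// value, so record the pass/fail outcome there and preserve the
// completion detail in suspend_data for the course's own use.
function reportOutcome12(api, viewedAll, score, passMark) {
  const passed = score >= passMark;
  api.LMSSetValue("cmi.core.lesson_status", passed ? "passed" : "failed");
  // The LMS reports won't see this distinction, but the course can
  // recover it on resume.
  const state = JSON.stringify({ viewedAll: viewedAll, score: score });
  api.LMSSetValue("cmi.suspend_data", state);
  api.LMSCommit("");
}
```

The limitation is obvious: the distinction lives in the course, not the LMS, so it never appears in standard LMS reporting.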
Limited suspend data. At 4,096 characters, the suspend data field can be tight for courses that need to save complex state — branching scenario progress, multiple quiz attempts, interactive element states.
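Because many LMSs silently truncate oversized suspend data rather than rejecting it, it's worth guarding the write. A minimal sketch (the function and constant names are illustrative):

```javascript
// SCORM 1.2 caps cmi.suspend_data at 4,096 characters. Failing loudly
// at save time beats discovering corrupt state on resume.
const SCORM12_SUSPEND_LIMIT = 4096;

function safeSuspendData(state) {
  const serialized = JSON.stringify(state);
  if (serialized.length > SCORM12_SUSPEND_LIMIT) {
    throw new Error(
      `Suspend data is ${serialized.length} chars; limit is ${SCORM12_SUSPEND_LIMIT}`
    );
  }
  return serialized;
}
```

In practice you'd catch the error and prune or compress the state, but the check itself is the part projects most often skip.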
No sequencing. SCORM 1.2 has no built-in mechanism for enforcing navigation order. If you need learners to complete Module 1 before accessing Module 2, the course itself must handle that logic, not the standard.
Basic score model. SCORM 1.2 supports raw, min, and max score sub-fields, but there's no scaled score, and although optional per-objective scores exist in the data model, they don't integrate with sequencing or rollup. If you need granular competency-level reporting, you're working around the standard rather than with it.
SCORM 2004: The Feature-Rich Option
SCORM 2004 addressed the limitations of 1.2 with a significantly more capable specification. It went through four editions, with the 4th edition (2009) being the current version.
What SCORM 2004 Does Well
Separate completion and success. This is the single biggest practical improvement. With SCORM 2004, you can track that a learner completed all the content (completion_status = "completed") but failed the final assessment (success_status = "failed"). For compliance training, this distinction is often critical.
Multiple objectives. Each SCO (Sharable Content Object) can report against multiple learning objectives, each with its own completion and score. This enables detailed competency mapping — "the learner met Objective A and C but not B."
Sequencing and navigation. SCORM 2004 can enforce navigation rules at the LMS level: prerequisites, attempt limits, choice restrictions. The LMS controls what the learner can access and when. This is powerful for structured learning paths.
More data space. 64,000 characters of suspend data (in the 3rd and 4th editions) and 1,000-character bookmarks give courses much more room to save complex state.
Richer scoring. Scaled scores (-1 to 1) plus raw, minimum, and maximum values per objective provide much more granular performance data.
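As a sketch of the scoring model (function name illustrative), deriving the scaled value from a raw score is simple arithmetic, but it must land in the [-1, 1] range the spec requires:

```javascript
// SCORM 2004 scoring: report raw/min/max as before, plus a scaled
// score in [-1, 1]. For a percentage-style quiz, scaled is the
// proportion of the available range.
function reportScore2004(api, raw, min, max) {
  api.SetValue("cmi.score.raw", String(raw));
  api.SetValue("cmi.score.min", String(min));
  api.SetValue("cmi.score.max", String(max));
  const scaled = Math.max(-1, Math.min(1, (raw - min) / (max - min)));
  api.SetValue("cmi.score.scaled", String(scaled));
}
```

So a learner scoring 80 out of 100 reports `cmi.score.scaled` as `0.8`, which is also the value sequencing rollup rules compare against objective thresholds.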
Where SCORM 2004 Falls Short
Sequencing complexity. SCORM 2004's Sequencing and Navigation (SN) specification is notoriously complex. Many LMSs implement it differently, leading to inconsistent behaviour. Some LMSs claim 2004 support but struggle with advanced sequencing rules. I've seen more hours lost to sequencing debugging than any other SCORM issue.
API differences. The API object name changes from API (1.2) to API_1484_11 (2004). The data model element names change too (cmi.core.lesson_status becomes cmi.completion_status and cmi.success_status). These aren't dramatic changes, but they mean a course built for one version won't work with the other without modification.
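The standard discovery pattern, walking up the frame hierarchy until an API object is found, illustrates the difference. This is a sketch (variable and function names are mine; the object names `API` and `API_1484_11` come from the specifications):

```javascript
// Walk up from the content frame looking for whichever API object
// the LMS exposes, giving up after a bounded number of hops.
function findAPI(win) {
  let attempts = 0;
  while (win && attempts < 10) {
    if (win.API_1484_11) return { version: "2004", api: win.API_1484_11 };
    if (win.API) return { version: "1.2", api: win.API };
    if (win.parent && win.parent !== win) {
      win = win.parent;        // content launched in a frame
    } else if (win.opener) {
      win = win.opener;        // content launched in a popup
    } else {
      break;
    }
    attempts++;
  }
  return null;
}
```

Note that even with discovery handling both object names, the call names differ too (`LMSInitialize` vs `Initialize`, `LMSGetValue` vs `GetValue`), so a shared launcher still needs version-specific wrappers.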
Implementation inconsistency. Because the specification is significantly larger and more complex than SCORM 1.2, LMS vendors interpret parts of it differently. A 2004 course that works perfectly in Moodle might behave unexpectedly in Cornerstone or SAP SuccessFactors. I encounter these discrepancies regularly in my testing and consultation work.
Overkill for simple content. If you just need "did they finish?" and "what did they score?" — SCORM 2004 adds complexity without benefit.
Decision Framework
Here's how I advise clients:
Use SCORM 1.2 When:
- You need maximum LMS compatibility
- Your tracking requirements are straightforward (completion + one score)
- You're distributing content to multiple organisations or platforms
- You're converting PowerPoint presentations to e-learning
- You want the simplest, most reliable option
- Your LMS's SCORM 2004 support hasn't been verified
Use SCORM 2004 When:
- You need to separately track completion and pass/fail (common in compliance)
- You must track multiple learning objectives within a single course
- You need LMS-controlled sequencing (prerequisites between modules)
- You need more than 4,096 characters of suspend data
- You've verified your target LMS handles SCORM 2004 correctly
- Your reporting requirements demand per-objective scoring
Check With Your LMS First
Before committing to SCORM 2004, test it with your specific LMS. Upload a sample 2004 package and verify:
- Completion status reports correctly
- Success status (passed/failed) reports correctly
- Scores record as expected
- Bookmarking works on resume
- Sequencing rules (if used) behave as intended
- Terminate() fires correctly on all exit paths (normal close, browser close, session timeout); improper termination is a frequent cause of lost data
- Suspend data persists across sessions (especially if your course saves complex state)
- Interaction/assessment data records correctly (question IDs, responses, results)
- Rollup behaviour works as expected for multi-SCO courses
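The first few checklist items can be exercised with a minimal smoke test. Here's a sketch written against a hypothetical `api` handle to the LMS's `API_1484_11` object (obtained via the usual frame-walking discovery); drop something like this into a test SCO before building real content:

```javascript
// Minimal SCORM 2004 smoke test: initialize, round-trip a value,
// commit, and terminate. Each result should be true on a conformant LMS.
function smokeTest2004(api) {
  const results = {};
  results.initialize = api.Initialize("") === "true";
  results.setStatus =
    api.SetValue("cmi.completion_status", "completed") === "true";
  results.readBack = api.GetValue("cmi.completion_status") === "completed";
  results.commit = api.Commit("") === "true";
  results.terminate = api.Terminate("") === "true";
  return results;
}
```

Any `false` in the result object tells you which basic behaviour your LMS gets wrong before you've invested in course development.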
I've seen too many projects hit problems late in development because someone assumed their LMS fully supported 2004 features it only partially implemented. Always test on SCORM Cloud first, then on your actual production LMS. See version-specific testing differences in our SCORM testing checklist.
Real-World Examples
Compliance Training
A client needed annual compliance training where employees must view all content AND pass a final assessment with 80% or above. SCORM 1.2 can only report one status — so if someone views everything but scores 60%, what do you record? "Failed" loses the completion data. "Completed" hides the failure.
SCORM 2004 was the right call here: completion_status = "completed", success_status = "failed". The LMS could then require a re-take of the assessment without making the learner redo the entire course.
Product Training Library
A company had 200+ short product training modules being deployed across three different LMS platforms (their own, a partner's, and a client's). Simple tracking — just completion, no scoring needed.
SCORM 1.2 was the obvious choice. Maximum compatibility, zero sequencing complexity, and the simplest possible tracking.
Competency-Based Programme
A professional development programme needed to track learner progress across 12 competency areas within a blended course. Each module assessed multiple competencies, and the programme needed to identify specific skill gaps.
SCORM 2004 with multiple objectives tracking was essential. Each SCO reported scores against 3-4 competency objectives via the cmi.objectives.n.* data model. Note that cross-SCO aggregation required careful use of global objectives and thorough LMS testing — rollup rules operate on activity status, not objectives directly, so this took extra implementation work to get right.
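The per-objective reporting in a project like this follows a pattern along these lines. This is an illustrative sketch (the function name is mine, the objective IDs are hypothetical; in a real course they must match the IDs declared in the manifest for global objectives to work):

```javascript
// Report a score and success status against each objective in the
// cmi.objectives collection, indexed by position.
function reportObjectives(api, objectives) {
  objectives.forEach((obj, i) => {
    api.SetValue(`cmi.objectives.${i}.id`, obj.id);
    api.SetValue(`cmi.objectives.${i}.score.scaled`, String(obj.scaled));
    api.SetValue(
      `cmi.objectives.${i}.success_status`,
      obj.scaled >= obj.passMark ? "passed" : "failed"
    );
  });
  api.Commit("");
}
```

A call might look like `reportObjectives(api, [{ id: "competency-a", scaled: 0.9, passMark: 0.8 }])`, with one entry per competency the SCO assesses.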
What About xAPI?
If you're evaluating SCORM versions, you might wonder whether to skip SCORM entirely and use xAPI (Experience API). xAPI is more modern and flexible, but:
- Not all LMSs support xAPI natively — many require plugins or a separate LRS
- xAPI requires a Learning Record Store (LRS), which adds infrastructure unless your LMS has one built in
- xAPI has no built-in sequencing or packaging standard (cmi5 addresses packaging and launch, but intentionally omits complex sequencing)
- SCORM's ecosystem of tools and expertise is vastly larger
For LMS-hosted courseware, SCORM remains the most practical choice in 2026. xAPI excels when you need to track learning beyond the LMS — but that's a different conversation.
My Recommendation
If you're not sure, start with SCORM 1.2. Most authoring tools let you republish in SCORM 2004 later, though you'll need to redesign course logic to actually use 2004-specific features like separate completion/success tracking or sequencing — simply changing the export format doesn't give you those capabilities automatically. Also note that historical learner data in your LMS may not migrate to a republished package (this varies by LMS). Still, this is far less painful than the alternative — discovering mid-project that your LMS doesn't handle 2004 well and having to strip out features your design depends on.
If you know you need SCORM 2004's features, invest time in testing early. Build a proof-of-concept, test it in your LMS, and verify the specific features you need actually work as expected before committing to a full development programme.
Need help deciding? I'm happy to review your requirements and give you a straight answer. Book a free consultation — it's the fastest way to get clarity on what'll work for your specific situation.
Further Reading
- SCORM Run-Time Reference Chart — the definitive data model comparison (Rustici Software)
- SCORM Versions: The Evolution of eLearning Standards — version history and adoption data
- SCORM 2004 Sequencing and Navigation — deep dive into the S&N specification
- ADL SCORM 2004 4th Edition Documentation — official specification documents
- xAPI Overview — if you're considering moving beyond SCORM
- cmi5 Technical 101 — the bridge between SCORM packaging and xAPI tracking