The Complete SCORM Testing Checklist
The same checklist I use for professional SCORM testing. Covers functionality, data tracking, cross-browser compatibility, and common errors.
Last updated: December 2025 | Works for SCORM 1.2, SCORM 2004, and cmi5
Why Testing Matters
A course that works in preview mode can still fail spectacularly in production. SCORM packages interact with LMS platforms through a standardised API, but "standardised" doesn't mean "identical." Every LMS implements SCORM slightly differently.
This checklist helps you catch issues before your learners do. Use it for every course, every time. The 20 minutes you spend testing will save hours of troubleshooting later.
Before You Start
- Identify your SCORM version (1.2 or 2004)
- Document expected behaviours (pass score, completion criteria)
- Prepare test scenarios (pass, fail, partial completion)
- Have access to SCORM Cloud or your target LMS
Package Validation
Before uploading to any LMS, verify your SCORM package structure is correct.
Manifest file exists at root
The imsmanifest.xml file must be at the root level of your ZIP, not in a subdirectory.
Manifest XML is valid
The imsmanifest.xml must be well-formed XML with no syntax errors.
All resource references exist
Every file referenced in the manifest actually exists in the package.
SCO vs Asset declaration correct
Resources are correctly declared as SCO (trackable) or Asset (static content).
Unique resource identifiers
All resource IDs in the manifest are unique - no duplicates.
No absolute paths
All file paths in the manifest and course content are relative, not absolute.
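If you test a lot of packages, a small script can automate most of these structural checks. Here's a rough Node.js sketch that runs against an extracted package folder - the folder path and the regex-based href scan are simplifications, and a proper validator would parse the XML instead.

```javascript
// Rough structural check for an *extracted* SCORM ZIP.
// Run with: node check-package.js ./my-extracted-course
const fs = require('fs');
const path = require('path');

const root = process.argv[2] || '.';

// 1. The manifest must sit at the root of the package, not in a subfolder.
const manifestPath = path.join(root, 'imsmanifest.xml');
if (!fs.existsSync(manifestPath)) {
  console.error('FAIL: imsmanifest.xml not found at package root');
  process.exit(1);
}
const manifest = fs.readFileSync(manifestPath, 'utf8');

// 2. Every href referenced in the manifest should exist in the package.
//    (A regex scan is a simplification; a real validator parses the XML.)
const hrefs = [...manifest.matchAll(/href="([^"]+)"/g)].map(m => m[1]);
for (const href of hrefs) {
  // 3. Absolute paths and external URLs are flagged rather than checked on disk.
  if (/^(https?:)?\/\//.test(href) || href.startsWith('/')) {
    console.warn(`WARN: absolute reference in manifest: ${href}`);
    continue;
  }
  const file = path.join(root, href.split('?')[0]);
  if (!fs.existsSync(file)) {
    console.error(`FAIL: referenced file missing: ${href}`);
  }
}

// 4. Resource identifiers must be unique.
const ids = [...manifest.matchAll(/<resource[^>]*\sidentifier="([^"]+)"/g)].map(m => m[1]);
const dupes = ids.filter((id, i) => ids.indexOf(id) !== i);
if (dupes.length) console.error(`FAIL: duplicate resource identifiers: ${dupes.join(', ')}`);

console.log('Package checks finished.');
```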
Functional Testing
Test the basic functionality of your course before checking SCORM-specific features.
Course launches correctly
The course opens without errors and displays the first screen properly.
All navigation works
Next, previous, menu, and any custom navigation function correctly.
Media plays correctly
All videos, audio, and animations play as expected with proper controls.
Interactions function
Drag-and-drop, clicks, hovers, and form inputs work properly.
Quizzes score correctly
Questions record correct/incorrect answers and calculate scores properly.
Course closes cleanly
The course exits without errors when the learner finishes or closes the window.
Accessibility Testing
WCAG 2.2 AA compliance is now the industry standard for corporate training. Test these critical areas.
Keyboard navigation works
All interactive elements can be accessed and operated using only a keyboard (Tab, Enter, Arrow keys).
Focus order is logical
When tabbing through the course, focus moves in a logical, predictable order.
Color contrast meets standards
Text has at least 4.5:1 contrast ratio against its background (3:1 for large text).
Images have alt text
All meaningful images have descriptive alt text; decorative images are marked as such.
Videos have captions
All video content has accurate closed captions or subtitles available.
Audio has transcripts
Audio-only content has text transcripts available.
Screen reader compatible
Test with at least one screen reader (NVDA, JAWS, or VoiceOver).
No flashing content
Content doesn't flash more than 3 times per second (seizure risk).
SCORM Data Testing
Verify that your course correctly communicates with the LMS through the SCORM API.
API initialises correctly
LMSInitialize() (1.2) or Initialize() (2004) is called before any other API calls.
Data commits regularly
LMSCommit() is called periodically to save progress, not just at the end.
Clean termination on exit
LMSFinish() (1.2) or Terminate() (2004) is called before the course window closes.
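Those first three checks boil down to a simple call order. Here's a minimal sketch using the SCORM 1.2 names (2004 equivalents in the comments); it assumes the API object has already been found - see the discovery sketch later in this article.

```javascript
// Assumes `API` is the SCORM 1.2 API object already discovered on a parent frame.
// SCORM 2004 uses window.API_1484_11 with Initialize/Commit/Terminate instead.

// 1. Initialise once, before any other call. Note the required empty-string argument.
if (API.LMSInitialize('') !== 'true') {
  console.error('Initialise failed:', API.LMSGetLastError());
}

// 2. Commit periodically (e.g. on every slide change), not only at the end,
//    so progress survives a crashed browser or a dropped connection.
function saveProgress(location) {
  API.LMSSetValue('cmi.core.lesson_location', location); // 2004: cmi.location
  API.LMSCommit('');                                      // 2004: Commit('')
}

// 3. Terminate exactly once, before the window closes.
function exitCourse() {
  API.LMSCommit('');
  API.LMSFinish('');                                      // 2004: Terminate('')
}
window.addEventListener('beforeunload', exitCourse);
```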
Completion status updates
cmi.core.lesson_status (1.2) or cmi.completion_status (2004) changes to "completed" when appropriate.
Success/failure status tracks
Pass/fail status reflects quiz performance correctly.
Score reports accurately
The score shown to the learner matches what's reported to the LMS.
Suspend data saves
Bookmark/progress data is saved so learners can resume where they left off.
Time tracking records
Session time and total time report correctly to the LMS.
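Put together, here's roughly what a SCO should write for the checks above, again using SCORM 1.2 element names with the 2004 equivalents noted in comments. Treat the values as illustrative - your authoring tool decides the exact ones.

```javascript
// SCORM 1.2 element names, with SCORM 2004 equivalents in comments.
// Assumes `API` has been discovered and LMSInitialize('') already succeeded.

// Completion and pass/fail (2004 splits these into cmi.completion_status
// and cmi.success_status - set BOTH, or some LMSs will show "incomplete").
API.LMSSetValue('cmi.core.lesson_status', 'passed');

// Score - set min and max as well as raw, or some platforms report 0.
API.LMSSetValue('cmi.core.score.raw', '85');   // 2004: cmi.score.raw
API.LMSSetValue('cmi.core.score.min', '0');    // 2004: cmi.score.min
API.LMSSetValue('cmi.core.score.max', '100');  // 2004: cmi.score.max (+ cmi.score.scaled, 0..1)

// Bookmark / resume state (keep it small - see the size check in the next section).
API.LMSSetValue('cmi.core.lesson_location', 'slide-12'); // 2004: cmi.location
API.LMSSetValue('cmi.suspend_data', JSON.stringify({ answers: [1, 3] }));

// Session time - 1.2 uses HH:MM:SS, 2004 uses ISO 8601 durations (e.g. PT25M10S).
API.LMSSetValue('cmi.core.session_time', '00:25:10');

API.LMSCommit('');
```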
Resume & Bookmark Testing
Suspend/resume is one of the most common failure points. Test these edge cases thoroughly.
Resume from mid-course
Close the course partway through, relaunch, and verify it returns to the correct slide.
Resume from mid-quiz
Close while answering a quiz, relaunch, and verify quiz state is preserved.
Resume from mid-video
Close while a video is playing, relaunch, and check if playback position is restored.
Resume after extended time
Wait several hours or days between sessions to verify suspend data persists.
Resume after course update
If you update and republish the course, test what happens to existing learner progress.
Suspend data size check
Verify your suspend data doesn't exceed limits (4,096 chars for SCORM 1.2).
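A small guard in your bookmark code catches the size problem before the LMS does. This sketch assumes the same `API` object as earlier; the `saveBookmark`/`restoreBookmark` helpers are hypothetical names for illustration.

```javascript
// Guard suspend data size before writing it - SCORM 1.2 LMSs may silently
// truncate or reject anything over 4,096 characters, which breaks resume.
const SUSPEND_LIMIT = 4096; // raise for SCORM 2004 (64,000 in 4th edition)

function saveBookmark(state) {
  const data = JSON.stringify(state);
  if (data.length > SUSPEND_LIMIT) {
    console.warn(`Suspend data is ${data.length} chars - over the ${SUSPEND_LIMIT} limit`);
    // Trim non-essential state rather than losing the whole bookmark.
  }
  API.LMSSetValue('cmi.suspend_data', data); // same element name in 2004
  API.LMSCommit('');
}

// On relaunch, check cmi.core.entry (1.2) / cmi.entry (2004): "resume" means
// a bookmark exists and the course should restore it instead of starting over.
function restoreBookmark() {
  if (API.LMSGetValue('cmi.core.entry') === 'resume') {
    const data = API.LMSGetValue('cmi.suspend_data');
    return data ? JSON.parse(data) : null;
  }
  return null;
}
```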
Cross-Environment Testing
Your course needs to work across different browsers, devices, and LMS platforms. Each has specific quirks.
Chrome (latest)
Test all functionality in the latest Chrome browser.
Firefox (latest)
Test all functionality in the latest Firefox browser.
Safari (latest)
Test all functionality in Safari on macOS.
Edge (latest)
Test all functionality in Microsoft Edge.
iOS Safari
Test on iPhone/iPad if mobile support is required.
Android Chrome
Test on Android devices if mobile support is required.
Target LMS
Test in the actual LMS where the course will be deployed.
LMS Mastery Score setting
Check if your LMS uses its own mastery score calculation that overrides course-reported status.
Common Errors & Solutions
These are the issues I see most frequently when testing SCORM packages. Save this reference for troubleshooting.
| Error | Common Cause | Solution |
|---|---|---|
| Course won't complete | Missing cmi.core.lesson_status or cmi.completion_status API call | Add status update call on final slide or quiz completion |
| Resume doesn't work | Suspend data exceeds 4,096 characters (SCORM 1.2) | Reduce bookmark data size or upgrade to SCORM 2004 |
| Score shows 0 | Using wrong score element or not setting min/max values | Use cmi.core.score.raw and set cmi.core.score.min/max |
| Status resets on re-launch | LMSCommit() not called before window close | Call LMSCommit() and LMSFinish() in onbeforeunload handler |
| Works in SCORM Cloud, fails in LMS | LMS-specific implementation differences | Test in target LMS and check API call timing |
| Completed but shows incomplete | Both completion AND success status required by LMS | Set both completion_status and success_status (SCORM 2004) |
| Course passes but LMS shows fail | LMS "Mastery Score" setting overrides course-reported status | Disable LMS mastery score or ensure score meets LMS threshold |
| Course hangs on loading (Firefox) | Firefox blocks autoplay by default | Add user interaction before playing media, or use muted autoplay |
| Manifest not found error | imsmanifest.xml in subdirectory instead of ZIP root | Repackage with manifest at root level of ZIP file |
| API not found / LMS not detected | Course not finding SCORM API in parent frames | Check API discovery code scans all parent/opener frames (see the sketch below this table) |
| Course works once, fails on retry | Entry status not being checked on re-launch | Check cmi.core.entry (1.2) or cmi.entry (2004) on Initialize |
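For the "API not found" row, this is the discovery pattern most courses use in some form - walk up the parent frames, then try the opener window. It's a sketch, not a drop-in replacement for your authoring tool's own lookup code.

```javascript
// SCORM 1.2 exposes window.API on an ancestor frame; SCORM 2004 exposes window.API_1484_11.
function findAPI(win, name) {
  let attempts = 0;
  // Climb parent frames until the API is found or we hit the top window.
  while (!win[name] && win.parent && win.parent !== win && attempts < 10) {
    win = win.parent;
    attempts++;
  }
  return win[name] || null;
}

function getAPI() {
  const name = 'API'; // use 'API_1484_11' for SCORM 2004
  let api = findAPI(window, name);
  // Some LMSs launch the course in a popup, so also check the opener chain.
  if (!api && window.opener) {
    api = findAPI(window.opener, name);
  }
  if (!api) console.error('SCORM API not found - is the course running inside an LMS?');
  return api;
}
```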
SCORM 1.2 vs 2004: Key Differences
Testing approach varies depending on your SCORM version. Here are the critical differences to keep in mind.
SCORM 1.2
- Suspend data limit: 4,096 characters
- Status element: cmi.core.lesson_status combines completion and success status
- No sequencing/navigation rules
- Wider LMS compatibility
SCORM 2004
- Suspend data limit: 64,000 characters (4th edition; 4,000 in 2nd/3rd)
- Separate cmi.completion_status and cmi.success_status
- Sequencing & navigation controls
- Detailed interaction tracking
- Multiple editions (2nd, 3rd, 4th)
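When I write one test script for both versions, a small lookup of the equivalent data-model elements keeps things straight. A sketch:

```javascript
// Equivalent data-model elements across versions - handy when a single test
// script needs to run against both SCORM 1.2 and SCORM 2004 packages.
const DATA_MODEL = {
  scorm12: {
    status:      'cmi.core.lesson_status',   // single combined element
    scoreRaw:    'cmi.core.score.raw',
    location:    'cmi.core.lesson_location',
    suspendData: 'cmi.suspend_data',          // 4,096 character limit
    sessionTime: 'cmi.core.session_time',     // HH:MM:SS format
  },
  scorm2004: {
    completion:  'cmi.completion_status',     // separate from success
    success:     'cmi.success_status',
    scoreRaw:    'cmi.score.raw',             // plus cmi.score.scaled (0..1)
    location:    'cmi.location',
    suspendData: 'cmi.suspend_data',          // up to 64,000 chars in 4th edition
    sessionTime: 'cmi.session_time',          // ISO 8601 duration, e.g. PT25M10S
  },
};
```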
Future-Proofing: cmi5 & xAPI
If you're planning to migrate from SCORM to modern standards, here's what you need to know about testing differences.
What is cmi5?
cmi5 bridges SCORM and xAPI. It uses xAPI statements for tracking but maintains LMS launch and session management like SCORM.
- Manifest file is cmi5.xml
- Uses REST/JSON (xAPI) instead of a JavaScript API
- Data stored in Learning Record Store (LRS)
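For illustration, here's roughly what a "completed" statement looks like when posted to an LRS over REST. The endpoint, auth token, and activity ID below are placeholders - in a real cmi5 AU they come from the launch parameters.

```javascript
// Minimal "completed" statement sent to an LRS over REST/JSON.
const statement = {
  actor: { mbox: 'mailto:learner@example.com', name: 'Test Learner' },
  verb: { id: 'http://adlnet.gov/expapi/verbs/completed', display: { 'en-US': 'completed' } },
  object: { id: 'https://example.com/courses/safety-101/au-1' }, // placeholder activity ID
  result: { completion: true, score: { scaled: 0.85 } },
};

fetch('https://lrs.example.com/xapi/statements', {  // placeholder LRS endpoint
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-Experience-API-Version': '1.0.3',
    Authorization: 'Basic <token-from-launch>',      // placeholder credentials
  },
  body: JSON.stringify(statement),
}).then(res => console.log('LRS responded with', res.status));
```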
Testing Differences
cmi5 testing requires different tools and approaches than SCORM testing.
- SCORM Cloud also supports cmi5 testing
- Check xAPI statements in LRS, not JavaScript console
- Verify AU (Assignable Unit) registration works
Migration tip: If converting SCORM to xAPI/cmi5, test both versions side-by-side initially. Track the same learner actions and compare what each system records.
Free Testing Tools
These tools will help you test your SCORM packages. Each has strengths and limitations.
SCORM Cloud
The industry standard for SCORM testing. Upload your package, run through the course, and review detailed debug logs.
Browser Developer Tools
Use the browser console (F12) to monitor JavaScript errors and network requests during course playback.
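One trick I use here: wrap the SCORM API functions in the console so every call the course makes from that point on gets logged. This sketch assumes SCORM 1.2 and that the API sits on this window or its parent - adjust the lookup for your LMS's frame structure.

```javascript
// Paste into the browser console (F12) to log every subsequent SCORM 1.2 call.
const api = window.API || window.parent.API;
if (!api) throw new Error('No SCORM API found on this window or its parent');

['LMSInitialize', 'LMSGetValue', 'LMSSetValue', 'LMSCommit', 'LMSFinish'].forEach(fn => {
  const original = api[fn].bind(api);
  api[fn] = (...args) => {
    const result = original(...args);
    console.log(`${fn}(${args.map(a => JSON.stringify(a)).join(', ')}) -> ${result}`);
    return result;
  };
});
```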
Your Target LMS
Nothing beats testing in the actual LMS. Most platforms have a test/preview mode for developers.
Local Moodle Installation
Set up Moodle locally using XAMPP, Docker, or Moodle's Bitnami package. Full LMS testing without cloud uploads.
scorm-again & pipwerks SCORM Wrapper
Open-source JavaScript libraries for local SCORM debugging: scorm-again emulates the LMS side of the API so you can run a course outside an LMS, while the pipwerks wrapper simplifies API discovery and calls from your content.
WAVE Accessibility Tool
Browser extension that checks web content for accessibility issues (WCAG compliance).
NVDA Screen Reader
Free, open-source screen reader for Windows. Essential for testing accessibility.
Need Professional Testing?
This checklist covers the basics, but professional testing catches issues that DIY testing misses. Get expert analysis, detailed reports, and fix recommendations.
Have a Course That Needs Testing?
Send me your SCORM package and I'll have a detailed test report back to you within 24-48 hours.