Did xAPI fail?

The Experience API (xAPI), a specification – and now a standard – for collecting data about learning experiences, has been around for over a decade. Despite the initial excitement, many learning practitioners seem to feel that it has failed to revolutionise learning in the way they had hoped. A lot of hype, but no results. All fur coat, no knickers.

But is that really the case? Has xAPI truly failed?

Spider-Man meme with L&D teams pointing at tech vendors and vice versa
L&D teams and tech vendors seem to blame each other for not adopting xAPI

Back to the start

I jumped onto the xAPI bandwagon in the autumn of 2013. We were exploring ideas around social learning but struggled to find a standard way to track and report the new types of participation we were seeing, which went beyond SCORM’s focus on course completion.

xAPI – commonly known as Tin Can back then – was still in its infancy, but the foundation had been laid. I’ll be clear: I didn’t contribute to its early development; I merely benefited from the hard work of others. 

The concept was part of a broader vision of a “Total Learning Architecture” – a future where learning was no longer episodic but seamlessly integrated into daily work. This vision, even in today’s world of generative AI, still holds much promise, a credit to its originators.

In those early days, Dave Tosh and I worked on a portable, personal learning record store. Our idea was to create a learning record that could follow you from job to job – no more retaking the same training, no more lost evidence of skills acquired elsewhere.

Everyone gets a personal learning locker!

It was a good idea, but it didn’t take off. We realised companies first needed to adopt standardised ways of storing and using data before they could accept records from a third-party source. Dave and I pivoted, and within weeks, Learning Locker (for companies, not individuals) was born. Over the next five years, I devoted myself to promoting xAPI, Learning Record Stores, and the idea of tracking learning in a new way. But the questions and criticisms were always there: 

Why hasn’t it changed eLearning yet? Why aren’t more vendors adopting it? What can xAPI actually do that’s useful?

What is xAPI actually good for?

At its core, xAPI is simply a technical specification – it defines how to collect, store, and retrieve data about learning experiences. What you do with that data has always been up to the implementers. The problem was, not many people were thinking beyond traditional, episodic training modules.
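For anyone who never saw one, an xAPI statement is just a small JSON document – an actor, a verb, and an object – sent over HTTP to a Learning Record Store (LRS). Here's a minimal sketch in Python; the endpoint, credentials, and activity details are made up for illustration:

```python
import requests

# Hypothetical LRS endpoint and credentials – substitute your own.
LRS_ENDPOINT = "https://lrs.example.com/data/xAPI"
LRS_AUTH = ("lrs_key", "lrs_secret")

# A minimal xAPI statement: actor, verb, object.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/activities/fire-safety-refresher",
        "definition": {"name": {"en-US": "Fire Safety Refresher"}},
    },
}

# Statements are posted to the LRS's /statements resource.
response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print(response.json())  # The LRS returns the stored statement ID(s).
```

Nothing in the specification says the verb has to be "completed" or the object has to be a course – that openness was the whole point.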

The learning industry often saw xAPI as “SCORM 2.0,” which was a misunderstanding. SCORM and xAPI have shared ancestry, but like Homo sapiens and Neanderthals, they represent fundamentally different evolutions. SCORM was about tracking course completion in a Learning Management System (LMS); xAPI was meant to track a broad range of learning activities across many contexts. 

However, instead of embracing this opportunity to track new learning experiences, the industry remained stuck comparing xAPI to SCORM – and finding it lacking. The cmi5 specification, which allowed xAPI content to be launched from an LMS much as SCORM was, went some way towards answering that criticism, but it probably arrived too late in the adoption cycle for the mainstream to notice. It was what people thought they wanted. But looking back, maybe they were asking for a faster horse when what they really needed was a car.

We were given a tool to break free from the traditional course + LMS model, but instead, the industry insisted on making it work within that same, limiting structure.

Learning Analytics and ROI

Launching content in an LMS wasn’t the only challenge. Once you had xAPI data, the next question was: What do you do with it?

Some focused on learning analytics. Could xAPI help us determine what learning worked and what didn't? Our competitor, Watershed, made great strides here, using xAPI to deliver detailed analytics – like assessing the effectiveness of specific quiz questions. But the flexibility of xAPI, while powerful, also made it difficult to standardise data. Without a common data model, it was hard to build dashboards that worked across multiple experiences. Essentially, the juice wasn't always worth the squeeze.
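To give a sense of why those dashboards were hard to generalise: the LRS statements API lets you filter by verb, actor, activity, and time window, but every vendor was free to choose its own verbs and activity IDs. A rough sketch of pulling "completed" statements for analysis (again with made-up endpoint details):

```python
import requests

LRS_ENDPOINT = "https://lrs.example.com/data/xAPI"  # hypothetical
LRS_AUTH = ("lrs_key", "lrs_secret")

# Standard xAPI query parameters: filter by verb and a time window.
params = {
    "verb": "http://adlnet.gov/expapi/verbs/completed",
    "since": "2024-01-01T00:00:00Z",
    "limit": 100,
}

response = requests.get(
    f"{LRS_ENDPOINT}/statements",
    params=params,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()

result = response.json()
for s in result["statements"]:
    # Without an agreed data model, even something as simple as the
    # activity name can live in different places (or be missing).
    activity = s["object"].get("id", "unknown activity")
    print(s["actor"].get("name", "anonymous"), "completed", activity)

# Larger result sets are paged via the "more" URL the LRS returns.
```

The query works, but turning the results into a report that means the same thing across ten different content sources is where the real effort went.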

Others focused on measuring Return on Investment (ROI). With Learning Pool, we had a successful case study with Villeroy & Boch, showing a clear ROI for digital learning over traditional training. But this remained a relatively isolated example in a very large field. Why was it so hard to prove the impact of learning?

The answer lies in the complexity of measuring learning’s impact. There are countless variables that affect outcomes, making it nearly impossible to definitively prove that one learning program led to one specific result. In most business decisions – whether in sales, marketing, or technology – clear ROI is rarely easy to demonstrate. Even if you have time and resources for thorough analysis, the metrics that matter – retention, sales growth, productivity – often live in systems outside of xAPI. These are business questions, not xAPI questions, and their answers lie elsewhere. And, frankly, L&D rarely has the time or the resources to make this happen.

Were we ready for the xAPI revolution?

Perhaps we weren’t. Expecting a technical specification to drive industry change is a lot to ask. The unfortunate reality is that most practitioners simply waited for an “xAPI” button to appear in their authoring tools or LMS. Meanwhile, vendors – who profited from selling courses or software to host courses – had little incentive to implement something that could disrupt their business models.

The promise of xAPI wasn’t to replace SCORM. It was to create a new paradigm of tracking all types of learning experiences, from formal to informal, from workplace performance to social learning. But the industry wasn’t ready for such a leap. Maybe xAPI alone wasn’t a big enough step to break free of the course + LMS model. 

Looking ahead: A new era with AI?

Now, with the rise of generative AI, we might finally be at the cusp of a revolution in learning technology. AI has the potential to deliver on some of the more grandiose promises once made for xAPI alone – personalising learning experiences, automating data collection, and integrating learning data with business outcomes in real time.

Could AI succeed in changing L&D where xAPI struggled? Perhaps; the limitations of xAPI lay not so much in its design as in the industry's resistance to change. The future of learning technology lies in how willing we are to embrace that change, whether through AI, xAPI, or something entirely new.

xAPI certainly failed to live up to the hype we managed to generate, but it didn't fail outright; it grew and evolved, as many technical specifications do. It struggled to win over the mainstream because the industry wasn't ready to step beyond the traditional learning paradigms it knew – and, for the most part, it wasn't capable of doing the engineering required to go further. But that doesn't mean xAPI's promise of change is gone. As we move into a new era, the tools are there to revolutionise learning – if we're brave enough to use them.

