Instructional design (ID) activities differ greatly, depending on the mode of delivery. Creating learning can take many different forms, and the role of an instructional designer includes many specialist skills, from needs analysis to cross-platform implementation.
Instructional design for face-to-face or instructor-led virtual training
For these learning modes, ID means supporting the instructor to facilitate live learning sessions, taking into account the facilitator’s skills, the learning cohort, the external learning environment, and resources such as (digital) hand-outs, running sheets, guides, and learning props.
Self-directed eLearning on LMS
Here, the focus of ID is to create interactive, engaging learning content that maximises learner engagement both on and offline. IDs need to be aware of the functionalities and technology constraints of different LMSs.
Microlearning and performance support tools
This requires the ID to create short, sharp, on-the-job learning pieces or digital tools that can be accessed at the point of need. The ID needs to take into account the nature of the task and requirements of the role and understand the goals of the business.
Subject matter experts and instructional design
IDs work closely with subject matter experts to ensure the learning content is factually accurate and contextualised correctly. This requires high-level communication and organisational skills from the instructional designer.
Instructional designers visualise and tell stories
In eLearning, strong, relevant visuals (video or animation) play an important role in making the learning more accessible, authentic, and memorable. IDs need to know how to create contextualised scenarios to support the learning outcomes.
‘You can’t get it right for everyone, and we have our own training rooms that we should use. That’s what they are there for,’ said the branch manager.
‘That may be true, but we should give it our best shot and meet our learners where they like to learn whenever possible,’ said the learning and development lead. ‘Our new recruits are digital natives and, in their pre-employment survey, 72% indicated they’d prefer to learn about this particular topic on their personal device, self-directed, when it suits them within the given timeframe.’
Does this scenario sound familiar? It is one of the many ways in which innovation in L&D stalls, hampering effective learning design efforts that could yield higher returns on the average training dollar spent. We feel it’s time to uncover some of the typical barriers to human-centric learning design we have observed over the years:
01. Discovery phase
Where do you want to go and what is already in your hands to get there?
If we want learners to consistently apply what they have learnt in the workplace, everybody needs to be in the know. Managers/supervisors need to model and support said practices – and not just during the training phase.
Anything else turns learning into a tick-and-flick exercise, with little sustainable impact. For this reason, it is crucial to clearly articulate what the future state should look like in terms of the desired behaviour of the people who need to learn a skill or task AND the people who need to supervise them.
Another aspect that often gets neglected is an organisation’s existing learning culture. For example, are time and value given to formal face-to-face training while self-directed learning during working hours is frowned upon? Is your learning culture driven by functional imperatives, such as learning infrastructure that could be seen as a sunk cost if it is used less frequently than before?
Conversely, there may be existing learning materials that could be leveraged in a project, or others that will be affected by a new learning project’s development.
02. Definition phase
What actually IS the problem?
This is one of the hardest phases to get right, as different stakeholder groups may have different perceptions of the same issue to be solved. Sometimes, it takes more enquiry to get to the bottom of the matter and find out that the initial training intent would have treated the symptoms but not addressed the core performance problem underneath.
Learning, ideally, is a part of a bigger, integrated communications ecosystem, with congruent messaging cascading all the way into the team and individual business outcome KPIs. Hence, a learning project team must have a sufficient understanding of a learning piece’s scope, risks and dependencies.
03. Design phase
Finding the balance between ideal and practical
‘There are many ways to slice an apple; learners should have a choice on how they complete a work task,’ said the learning and development project officer after reviewing a learner guide draft.
‘Um, sure; however, in this instance, we are training construction workers in standard operating procedures and mandatory job site machinery safety checks, so there really cannot be any choice.’
L&D teams are experts in their field: education. They rely on effective collaboration with subject matter experts to co-create a practicable learning solution with tangible outcomes that can be applied on the job directly.
Often, these same L&D teams are under-resourced and lack the capacity for sufficient stakeholder engagement. The same scarcity makes it difficult to keep up with the latest L&D technology trends in-house, which means that barriers to human-centric learning design often come through external providers.
04. Development phase
The right strokes for the right folks
The art of a well-received project with stakeholder support is to give everyone who needs it ‘skin in the game’. Not just at the beginning of a project and on delivery, but throughout development. Iterative testing and actively seeking input from test users will enhance the learning output so, at deployment, there should be no surprises.
Sometimes, we see a reluctance to explore or adopt new instructional design tools that may offer new and better ways of reaching the learner audiences than traditional authoring tools. Time and resourcing pressure in L&D, ironically, stand in the way of learning new tech when it comes out.
Most organisations have far more communication channels than are usually considered for L&D projects, and these could be used to enhance and reinforce learning messages before, during and after a learning intervention.
Wherever possible, give your learners a choice about where they would like to access the content and keep refreshers close and relevant to job contexts that will require that access. For example, display short, sharp product handling information or brief customer service reminders on POS systems.
What are your organisation’s barriers to human-centric learning design?
Uncovering the answer to that question might take your learning further than you think.
This article originally appeared in Training & Development magazine, June 2021 Vol. 48 No. 2, published by the Australian Institute of Training and Development.
The game-changer has a name and came into workplaces across the globe, leaving few lives untouched. In 2020, Covid-19 altered the way we live, the way we work and the way we learn. It forced upon us radical changes in record time and at a record scale, with little chance to reflect, plan and strategise.
Organisations needed to pivot their usual work practices to stay operable; many unfortunately had to downsize and learn how to achieve the same output with fewer resources. Activities that usually happen in person were forced to move online, including learning and development and performance management.
Now, a few months into this new awareness of what is possible in the online space, it is time to take stock and examine how best to take forward any lessons learned. One thing seems inevitable: digital learning solutions are now entrenched in businesses – at a minimum due to risk management requirements for future crises, and at best because they are a proven, enriching element that drives workforce engagement and performance.
Arguably, the current situation lends itself well to experimenting with new ways to operate L&D; it is a forgiving time to embrace data analytics and digital learning, and to put in place what can work for your organisational context in a more agile, staged approach.
In parallel to the Covid-19 return to the ‘new normal’, here is a roadmap for a staged release of your L&D technology.
Stage 1: Understand Learning Data
In this first step, L&D configures a learning record store (LRS) and learns how to read and track learning activity statistics such as completion rates, view and interaction counts, and average time spent per screen. Here, it is not enough to merely look at raw data, because external factors have an influence that needs to be considered in the analysis.
For example, take a mobile-responsive course where a learner spends 5–10 minutes of ‘time on screen’ because it is a longer, vertically scrolling piece. The equivalent learning artefact built for a tablet/PC screen could use different interactive functionality, such as ‘click next’ or ‘swipe next’, and as a result produce a totally different value for ‘time on screen’ – one that has nothing to do with the screen content viewed by the various learners.
Analysing data requires an awareness of the influencing factors that can throw your metrics out. That is why taking the time to experiment, learn how to read and interpret the data, and understand how the build of your training directly influences that data is time well invested.
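To make the point concrete, here is a minimal sketch of this kind of analysis. The record shape and field names are purely illustrative (real xAPI statements stored in an LRS are much richer JSON); the point it demonstrates is that ‘time on screen’ must be split by delivery format before averaging, or the metric becomes meaningless.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical, simplified activity records such as an LRS query might return.
records = [
    {"screen": "intro", "format": "mobile-scroll", "seconds": 420},
    {"screen": "intro", "format": "desktop-click", "seconds": 95},
    {"screen": "intro", "format": "mobile-scroll", "seconds": 380},
    {"screen": "intro", "format": "desktop-click", "seconds": 110},
]

def avg_time_per_screen(records):
    """Average 'time on screen', split by delivery format.

    Pooling formats together would blend a long vertical-scroll page
    with a click-next slide and produce a misleading number."""
    buckets = defaultdict(list)
    for r in records:
        buckets[(r["screen"], r["format"])].append(r["seconds"])
    return {key: mean(vals) for key, vals in buckets.items()}

for (screen, fmt), avg in avg_time_per_screen(records).items():
    print(f"{screen} [{fmt}]: {avg:.0f}s average")
```

Notice how the same screen yields wildly different averages per format; a single pooled mean would describe neither cohort.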
Stage 2: Create a Data Analysis Governance Framework
The lessons learned in Stage 1 will enable your L&D team to identify valuable themes and patterns to consider before rolling out learning data analysis in your specific organisational context:
What are meaningful metrics that can support your L&D function for continuous improvement?
What method will be used to measure performance consistently?
What are some metrics that will be useful in the compliance and risk management context?
What are the privacy and data protection implications of collecting, recording, and using digital learning data?
What policies and procedures need to be put in place to turn digital learning and data analytics into business as usual?
What data sets do you need to track for what purpose?
What information can support performance management and support, and to what extent?
What tools and functional guidelines will be used in relation to data governance and consistency?
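One lightweight way to turn the answers to these questions into something enforceable is a metric catalogue with an automated governance check. Everything below is a hypothetical sketch, assuming your team records a purpose, measurement method and retention period per metric; the names, fields and thresholds are illustrative, not a standard.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    name: str
    purpose: str                  # why it is collected (improvement, compliance, ...)
    method: str                   # how it is measured, so results stay consistent
    contains_personal_data: bool  # privacy/data-protection flag
    retention_days: int           # how long records are kept

CATALOGUE = [
    MetricDefinition(
        name="course_completion_rate",
        purpose="compliance reporting",
        method="completed enrolments / total enrolments per month",
        contains_personal_data=False,
        retention_days=730,
    ),
    MetricDefinition(
        name="time_on_screen",
        purpose="continuous improvement of content",
        method="median seconds per screen, split by delivery format",
        contains_personal_data=True,
        retention_days=90,
    ),
]

def violations(catalogue, max_personal_retention_days=365):
    """Simple governance rule: personal data must not outlive its purpose."""
    return [m.name for m in catalogue
            if m.contains_personal_data
            and m.retention_days > max_personal_retention_days]
```

Writing definitions down this way keeps measurement methods consistent across teams and makes privacy rules checkable rather than aspirational.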
Stage 3: Implementation
Depending on the granularity of your tracking, data analysis from L&D activities can potentially inform analysis of performance, productivity, efficiency, management, leadership, contributions – all the way to gauging the effectiveness of training programs.
It can motivate individuals to take more ownership of their learning, ideally because the data analysis leads to improved learning content, or, to quote Peter Drucker, just because ‘What gets measured gets done.’
Gone are the days when learners can quickly flick through a learning piece without spending the appropriate time and attention to learn what they need to know and do for their job. Moreover, data analytics is also valuable in supporting AI and machine learning programs.
Learning and development can save lives. That is another hard lesson we have collectively learned during these pandemic times. Let’s keep it digital and accessible – and let’s keep learning to make it better, based on data, not opinions or inconsistently measured statistics.
As COVID-19 spreads like wildfire, it leaves a path of disruption in its wake – but not all of it is negative news. In some ways, the current pandemic forced organisations in all industries to reckon with their status quo and acted as a catalyst to move away from ingrained processes. This included the way organisations are able to train their staff and run their business in a safe manner.
Those organisations that already had a foot in the digital world at the beginning of 2020 had a head start – and for those who had to catch up, there were many options ready to go. And yet, not all technology proved useful in overcoming the challenges at hand.
Overall, we saw two tech-related approaches emerge that provided exceptional value for money and performance in these dire times.
The new virtual reality: VR in the education and retail sectors
Australian universities were hit hard with a complete halt to prospective international students travelling, attending classes or even going to open days, cutting one of their main income and marketing channels. Pair that with the simultaneous Australian Government reforms to student fees, along with the need to move all teaching activity online within 2 weeks, and you had a perfect example of ‘we need a magic wand and we need it now to turn this around!’
One solution lay in using virtual reality (VR) technology. In a fast-paced, low-budget project, Liberate Learning worked with a large Australian university to create a virtual open day experience. This allowed prospective students to virtually walk through the lanes of Melbourne, visit key campus amenities and learn about Australian culture. All the content was, and still is, available on web browsers from anywhere in the world, thus increasing the number of potential open day attendees way beyond what a real-life event could have catered for.

Web-based VR has advantages over immersive VR (the kind set up in a specific fixed location with VR goggles) in that it can be deployed anywhere in the world via web browser using low bandwidth. It also helps participants avoid motion sickness and comes with a much smaller price tag and shorter production times compared to immersive VR. It is much more suitable for crisis application, where timing, budget and remote access are all equally critical.
“2021 can be a time to embrace the changes and advantages thrust upon us in 2020.” (Rodney Beach, Managing Director, Liberate Learning)
Another industry taking advantage of this VR solution was the fast-moving consumer goods (FMCG) sector, where organisations faced a higher-than-average infection risk and fluctuating staff cohorts due to quarantine requirements and panic buying.
They needed a way to train remote learners in authentic and locality-specific topics – and web-based VR delivered for over a hundred thousand learners. Unlike fully immersive VR, it also integrated into the retailer’s learning management system (LMS) allowing crucial training tracking for accountability and compliance in workplace health and safety.
The move towards minimum viable products (MVP)
The pandemic often required learning artefact development in timeframes way below the industry norm. As a result, some hastily produced solutions were under par in terms of learner outcomes, engagement, good learning design and measurability.
A case in point: many organisations were uploading session slides and calling them ‘online learning’, when at best they could be called ‘information dissemination’. One way out of death-by-PowerPoint was the rapid development of shorter minimum viable products – a pared-down initial product with sufficient features to fulfil basic needs – using modern authoring tools, multiple deployment platforms (including mobile devices) and modern tracking mechanisms.
For example, we saw subject matter experts take videos of manual handling procedures with their smartphones, eliminating the need to put camera crews at risk on site. We observed learning teams embrace a new role as curators of such user-generated content. We also witnessed a higher sense of ownership of learning content from floor staff, as clips came straight from the horse’s mouth, produced by real-life Jack at the deli counter.
In the end, as the quality level of the videos was similar to self-made videos on social media platforms, learners felt part of the learning story because it was authentic and created a sense of collective effort.
Time will tell how long we will need to contend with this socially distant way of going about our business. However, it is very clear that the digital age has brought many advantages that our ancestors in the times of the influenza or plague pandemics did not have – so let’s be grateful for these opportunities.
2021 can be a time to embrace the changes and advantages thrust upon us in 2020, using them wisely for the betterment of these, and future, circumstances.
“Sorry, that’s not part of this project. We can help you if you sign this contract variation.”
If this sounds familiar, you may have experienced what we call the dreaded scope creep.
Scope creep comes in different forms and sizes. Yet when it occurs while you are working with an eLearning vendor, it always leaves the same aftertaste: a sub-optimal process, possibly sub-optimal deliverables, and a reduced willingness to work with the said vendor again, and vice versa.
1. Scope creep by lack of planning
Scope creep happens when the initial brief is incomplete, unclear or does not include all the expected deliverables. That incomplete brief then forms the basis of the vendor quote (and ultimately the contract).
Especially when engaging a new vendor, make sure you include all upfront information the vendor will need to succeed:
project requirements including timelines
expectations on review cycles, branding and style guidelines
technical requirements and deployment environment
learning design requirements
stakeholder environment and sign-off process
milestone payments on the successful delivery of project parts.
Providing complete documentation will help the vendor understand your specific environment. It is up to them to read it, plan for compliance, and quote accordingly. A good, experienced vendor will know when and what to ask in order to obtain complete information, and will automatically include ‘obvious’ requirements in their scope, e.g. certain government standards that need to be adhered to. An experienced and reputable vendor should also anticipate many of your requirements from their years of experience, even if you do not formally identify them. A working relationship like this deserves to be called a true ‘partnership’ rather than a vendor arrangement.
2. Lack of communication – beware of the word ‘Just’…
Competent vendors will work with you to develop a learning solution that suits your specific context and budget needs. And yet, in the middle of the learning development phase (i.e. in production), one of your key stakeholders may want to add “just a couple of branching case study streams” in a learning piece that was supposed to be short, succinct and very low budget.
A request like this can have a domino effect on the entire learning piece’s integrity, so the design phase may have to start again to integrate it well and ensure the solution is sound. The stakeholder may not be aware of the impact. Now you need to spend valuable time explaining and navigating conflicting requirements for a disjointed learning solution, with potential contract variations, and blown out timelines and costs.
Consult with your subject matter experts (SME) and key stakeholders before going out to market for a quote, or involve the solution designer early in the process. Ask them about their requirements and expectations of what a successful learning solution looks like to them.
Explain the impact of increasing content or functions, such as interactivity levels, scenarios, animations, or video, on both time and budget. Your vendor can provide various solutions and talk about the quality, cost or timeline expectations for each solution, so you can have these conversations before the work begins.
3. Lack of imagination
There are many ways to skin a cat, they say, and indeed there are many, many more ways to design learning.
Design, in general, is one of the most challenging topics to discuss in words (try describing the hue of blue or grey in front of your window to someone right now). The same is true for the look and feel of a learning project.
Please do not wait for the vendor to finish an entire learning piece before discovering that the funky flat illustration style they chose does not suit your law practice’s corporate, traditional style.
Ask your vendor to show you what styles they recommend and get a few test screens done so you can picture it better via an early prototype.
Get sign-off on the preferred style by other stakeholders that may need to be involved, e.g. marketing/internal communications, to make sure your overall branding is on-point throughout your organisation.
A vendor you can trust will genuinely advocate for a great solution and not just ‘take orders’.
4. Inattention to detail and testing
Generally speaking, a well-rounded and ‘engaging’ eLearning project consists of roughly 30% instructional design and pedagogical scripting (engagement of the mind), 30% creative visual design and on-point artwork elements (engagement of the eye), and 30% function development and media production (engagement of the ear and screen). The remaining roughly 10% goes into project management.
All parts are equally important to get right, 100% of the time, as they build on each other. The treacherous belief that ‘we signed off on that script, so the copy in the learning screens should be correct’ has tripped up many a deadline.
Ensure your vendor has robust QA processes in place. You should expect to receive error-free, complete proofs for your review. Still, human error is real, so it remains the instructional designer’s responsibility to check every proof in detail for correctness and completeness before you sign it off to go to the next phase.
Test and ensure the solution works in your technical environment and according to all specifications (e.g. WCAG accessibility, various browsers and SCORM compatibility just to name a few). If your vendor says, ‘it works fine in our test environment’, ask them for validation in the form of a test certificate, and still test it yourself given you’re ultimately responsible for final sign-off. An easy way to do that is to upload the SCORM package to a free SCORM platform.
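If you want a quick pre-flight check before uploading, a package can also be sanity-checked programmatically. The sketch below is not a SCORM conformance test (use a dedicated SCORM platform for that, as suggested above); it only verifies that the zip contains an imsmanifest.xml at its root and that every file the manifest references is actually present in the archive. The function name is our own invention.

```python
import zipfile
import xml.etree.ElementTree as ET

def check_scorm_package(path):
    """Minimal pre-upload sanity check for a SCORM zip.

    Returns a list of problems found (empty means the basic checks passed).
    Accepts a filesystem path or a file-like object, as zipfile does."""
    problems = []
    with zipfile.ZipFile(path) as z:
        names = set(z.namelist())
        if "imsmanifest.xml" not in names:
            return ["imsmanifest.xml missing from package root"]
        root = ET.fromstring(z.read("imsmanifest.xml"))
        # <file href="..."> elements may be namespaced, so match on the
        # local tag name rather than the fully qualified one.
        for el in root.iter():
            if el.tag.rsplit("}", 1)[-1] == "file":
                href = el.get("href", "")
                if href and href not in names:
                    problems.append(f"manifest references missing file: {href}")
    return problems
```

A check like this catches the most common packaging mistakes (a manifest nested one folder too deep, or a renamed asset) before the two-week wait for an internal LMS upload.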
Sometimes, a learning piece takes two weeks to make it through all the internal technical mills and get uploaded to your LMS. That’s a long wait to see if all works correctly. If you work with a vendor you have built a trusted relationship with, you can share access to your LMS and they can do the testing directly in your environment for you.
5. Scope creep as vendor business model
Our least favourite scope creeps are the ones some of our clients experienced in previous vendor relationships: veritable scope traps, set by vendors on purpose. Some vendors win projects by underquoting, then try to claw back their margins by forcing contract variations at every opportunity (including project administration fees each time) for every client change request, no matter how small. That is undoubtedly not a sustainable way to work with clients long-term, yet from what we hear, such vendors exist.
Here is how you can protect yourself:
Set boundaries: list upfront what will warrant a contract variation and what will not, agree on this with the vendor, and make sure the contract reflects the agreement.
A well-established learning provider will be interested in building a longer-term business partnership (as opposed to simply selling to you), so they are more likely to help you with small change requests, even after the project has been signed off and deployed. Over time, give and take will even things out, and often it is better to get the project out the door than to squabble over minutiae.
Over the past few months, many people have had their first encounter with ‘learning online’, as opposed to attending face-to-face sessions. Social media feeds are full of mixed views around online learning, from both the people suddenly tasked to develop online content in short time frames and the learners on the receiving end.
The danger is that the extraordinary times we live in right now – and the necessary stop-gap learning measures – will shape the perceived quality of eLearning design overall. Therefore, we believe it is vital to take a step back and look at what constitutes quality, workplace-related eLearning – and what does not quite measure up.
Overall, designing relevant and engaging eLearning encompasses the disciplines of both the instructional designer (educational design) and the eLearning developer (technical UX and/or UI design). These disciplines need to complement each other for a learning piece to work well.
Can you imagine a less inspiring learning experience than clicking ‘next’ to read through a series of regurgitated snippets of a new policy, with a multiple-choice quiz at the end to see what you have ‘remembered’? We can’t, and yet it is not uncommon; we have seen it a lot over the years. Unsurprisingly, ‘eLearning’ designed like this is of low value to the learners and to the organisations who invested time and resources to create it. We would not call this type of content ‘eLearning’; we would call it ‘information dissemination’, which has nothing to do with the actual art of facilitating the learning process.
For eLearning content to work well, it needs to tap into what people believe they want or why they need to learn something. Context goes a long way in that, together with intrinsic incentives – and extrinsic motivations – to make the learner think.
Effective quality eLearning content is more engaging and relevant when it:
uses stories and authentic case studies that the learners can relate to because they use familiar workplace scenes, behaviours and tacit language
includes typical workplace problems to discover and solve along the way
progresses the learner from unconsciously incompetent to consciously competent through scaffolding, with feedback and practice points along the way
supports the practical application of new knowledge and skills in the workplace.
Create real engagement
In face-to-face learning, a facilitator can engage in direct person-to-person, peer-to-peer and group discussions and activities, and share stories to energise the room. In contrast, content creation in eLearning needs to plan for these elements, or simulate them, in order to achieve the same effect.
Going back to the aforementioned policy learning content, let’s imagine an eLearning piece that invites the learner to witness two different interactive workplace scenarios, with relatable problems, different behaviours, and two or more different outcomes. In both situations, all involved claim they followed the policy, and yet one or more scenario paths have directly resulted in – or indirectly influenced – non-compliant behaviour and outcomes.
With this scenario approach, dependent on the desired learning outcome and cohort, you can spin off a range of activities that allow learners to discover:
how to interpret right from wrong workplace behaviour in the context of the policy and organisation
how to manage a situation like this from the viewpoint of different roles in the organisation
how to lead self and others through any grey area situations
how and where to access support tools when needed after the eLearning piece is complete.
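Under the hood, a branching scenario like the one described above is just a small decision graph. The sketch below shows one hypothetical way to model it; the node names, prompts and choices are illustrative only, not taken from any real course.

```python
# Each node holds the text the learner sees, plus the choices that
# branch to the next node. End-of-path nodes simply have no choices.
SCENARIO = {
    "start": {
        "prompt": "A colleague asks you to skip the safety check 'just this once'.",
        "choices": {
            "refuse_and_report": "compliant_end",
            "skip_the_check": "noncompliant_end",
        },
    },
    "compliant_end": {
        "prompt": "You followed the policy; the incident is logged and reviewed.",
        "choices": {},
    },
    "noncompliant_end": {
        "prompt": "Everyone claims they followed policy, yet the outcome breached it.",
        "choices": {},
    },
}

def walk(scenario, decisions, node="start"):
    """Replay a learner's decisions through the branching scenario,
    returning the path of nodes visited (useful for feedback and tracking)."""
    path = [node]
    for choice in decisions:
        node = scenario[node]["choices"][choice]
        path.append(node)
    return path
```

Keeping the scenario as data rather than hard-coded screens makes it easy to spin off the role-based and grey-area variations listed above without rebuilding the course.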
Now that we have looked at how smart instructional design can influence the quality of an eLearning piece, let’s look at what we can do with the learning space itself: the learning interface.
Get the visuals, audio and interactivity right
‘An image is worth a thousand words’ is an often-quoted cliché, to which we would add: ‘…if it is not just a visual object on the screen, and if it directly helps to tell your eLearning story.’
What do we mean by that?
A quality learning developer will be able to take the content and instructional design blueprint and create a learning experience interface with just the right balance.
In other words, it is not enough to place a stock image next to five lines of copy in a box to ‘make the screen look interesting’. A quality learning designer makes the learner wonder, imagine and think, building in smart and interactive feedback loops to signal learner progression along the way. They do it with an awareness of the entire bandwidth of audio-visual elements (colours, text, buttons, video, audio narration, animation), paying special attention to haptic and spatial design elements in the case of VR, or supportive and assistive design elements in the case of accessibility.
The level of sophistication is, as with any other project, dependent on the context of the learning project, desired quality, its time or technical constraints, and available budget. And yet, a good learning designer will only use what is truly necessary. Instead of overloading the learner with input and interactions of the finger, not the mind, they will give a learner functionality choices to cater for individual preferences (e.g. selecting a persona, pathways, and choices such as switch audio/captions on or off).
“Workplace learning is finally moving away from facilitated one-way delivery and static learning.”
Functionality brings us to the third element that makes a significant difference to a learner’s experience with eLearning and, therefore, its potential to be remembered for the right reasons.
Using the right tools and learning platforms
We believe if you’re going to do something, do it well. In the eLearning industry, there is so much room for improvement in the way we can make our content meet the learner where they are, when they need it, in a way they prefer. That is the power of our 21st-century technology.
However, the current mindset about the technical limits of eLearning seems to be:
teach on Zoom and upload videos and presentation files
chunk policy into paragraphs with a supporting image
paste text on the left of the screen and add a stock image on the right.
However, in our daily lives, ‘eLearning’ at the micro level happens everywhere, at our fingertips – for example, every time we ask Google ‘where is the nearest petrol station?’ because we just noticed the blinking symbol on our car dashboard. Where is that same intuitive level of information accessibility when, say, a retail client is about to buy a model C of brand A in a hardware store? What would happen if the retail assistant, upon hearing why the client opted for that model, could look up their user requirements on a smart device and recommend a better-suited alternative?
Take the eLearning plunge
Workplace learning is finally moving away from facilitated one-way delivery and static learning platforms with eLearning courses on a desktop, to a more organic learning ecosystem approach; however, for many organisations, this kind of ‘learning’ still sounds futuristic, expensive or complicated. We believe this is because many people in learning and development are unaware that the technology needed to create this kind of learning is already here – and is neither expensive nor difficult to use. Yes, this kind of quality eLearning takes more thought from a pedagogical point of view, but the potential to deliver effective outcomes is far greater, too.
In the past, a career was considered to be a profession, occupation, trade or vocation followed along a permanent, undiverging path over a long period. A career could mean working as a banker, teacher, builder, beautician, real estate agent, financial advisor, software developer or any other job that might come to mind, for one’s entire life. However, in 2020 and beyond, the word is taking on another connotation: the progress and actions taken by a person during their working lifetime, regardless of their profession or trade at any particular point in their life. Enter the need to manage a career learning portfolio.
We are being told that the era of frequent job changes is upon us. The Australian Bureau of Statistics notes that over 1 million Australians changed their job or their business during 2017. Over half of these people entered a new industry. The Australian Institute of Business writes: “on average, today’s Australian employee changes jobs 12 times throughout their life, with an average tenure of 3.3 years. For workers over 45, the average job tenure is six years and eight months, while for under 25s, it’s just one year and eight months.” With the disruption to the world’s workforce due to the COVID-19 pandemic, the rise of the gig economy, and an increasingly casualised workforce, the frequency of job changes is likely to climb even higher.
Similarly, the Foundation for Young Australians (FYA) argues that young people will have ‘portfolio careers’, potentially holding 17 different jobs across five different vocations. Will they need to be qualified for each of these jobs before entering employment, as we do now? Professionals and technicians are reported as those most likely to change their job, a group that includes educators and L&D professionals. Will they need to engage in three, four or five years of study at an accredited institution before they can seek employment in the field? Given the fast pace of change, by the time they finish under the current educational system, the skills and knowledge required to perform the job are likely to differ from what they studied. In short, the skills of high performance keep changing as technology assumes many of the functions we thought would never change. For example, a classroom teacher’s role changed almost overnight as the pandemic spread, with many teachers ill-prepared for online delivery.
The FYA’s 2017 report, The New Work Smarts, highlights that by 2030 we will be spending 30 per cent more time learning skills on the job. These skills will relate to solving work problems using critical thinking and judgement, verbal communication, and interpersonal skills, supported by an entrepreneurial mindset. More than ever before, learning will be a critical lifelong practice. To support the trend towards frequent change leading to portfolio careers, we need a timelier system of documenting learning and career progression.
Tertiary institutions are archaic and out of touch with the digital revolution. It is time they take a whole-of-life approach to support learners through their “career learning portfolio”.
Rather than courses that are defined by a list of acceptable subjects, learners need opportunities to create their own programs based on their on-the-job skill requirements, or their portfolio career goals. Direct links, fuelled by intelligent agents and xAPI big data, between what work needs to be done and the institutions where individuals gain their skills and knowledge could just be the life-saver that tertiary institutions need to survive and enter the digital world.
Let’s talk about making eLearning accessible to everyone. Thankfully, it is a request we often receive or an action we routinely recommend when it is not. Sometimes such a request comes without the necessary appreciation of what making eLearning accessible means for learning design, or the project itself.
Before beginning any custom eLearning project, eLearning designers need to clarify with you what level of accessibility you require for the intended learning cohort. The answer will have a significant impact on the nature of what content we can develop, will influence the interaction types to be used, and may even determine the choice of an authoring tool for content development.
Generally speaking, the higher the level of accessibility you aim to achieve, the more careful we need to be about the interactive functionalities, i.e. what we will be able to use in your custom eLearning piece.
Audience, content, and context matter for accessibility
In determining the ‘right’ level of accessibility, the critical factors to consider are:
Your learner audience
Specific project requirements, and
Your overall organisational compliance context.
This exercise requires you, and later the learning designer, to put yourselves in someone else’s shoes and evaluate the learning piece from different ability angles. Rod, Liberate’s Managing Director, recalls an example when we were engaged to create a suite of custom eLearning modules supporting medical diagnosis, which required learners to identify the severity of different types of wounds, such as pressure ulcers:
“I was in the process of writing the alt tags (the descriptive words associated with an image that are read aloud by a screen reader), and I realised that the audience needs to have relatively sharp eyesight in order to visually identify the wound, infections, colouring and inflammation of the skin, degree of healing and more. It made me wonder—does this course need to cater for accessibility using screen readers, given the audience obviously needs to be sighted?”
Can you use Web Accessibility design techniques?
“Do you use web accessibility techniques?” is another question we often get asked, understandably, as the web accessibility guidelines underpin eLearning accessibility.
In reality, designing for web accessibility is different from designing for eLearning accessibility because the purpose/user experience of a website is to make information as accessible as possible for every user. In contrast, for a learner, it is not just a matter of making the information accessible; it’s crucial to provide the ‘learning experience’ as an equally meaningful endeavour for all learners. Therefore, we cannot just rely on technology, or on ticking WCAG guideline checklists alone to achieve this.
Designing for an ‘equal learning experience’ often needs to go beyond the notion of allowing learners to access the same piece of learning via an assistive technology or through a different mode.
The role of the eLearning designer
Our eLearning designers will always be upfront with our clients about what can and cannot be done at certain levels of accessibility. If we get asked to ‘just attach a transcript, so the video is accessible now’, well, that’s not good enough in many instances. Take, for example, a branching scenario with different decision-making points for the learner. That kind of learning experience cannot be replicated in a long-form transcript for the learner to read through, because the result is not an equal learning experience.
Take another example, where you present a complex pie chart or diagram and expect someone to learn by having a screen reader read the chart or diagram from top left to bottom right. Can you imagine how difficult it would be to identify the relationship between the X and Y axes while listening to it in linear order? A case like this would not be a meaningful learning experience.
For the reasons above, in navigating accessibility considerations, competent eLearning designers will work with you and rely on their vast experience to find the sweet spot between:
using technical solutions (e.g. short/long descriptive alternatives, colour contrasting, keyboard tabbing, among others), and
applying sound pedagogy practices (e.g. creation of learning activities that are equally meaningful and engaging for all users).
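To make the ‘technical solutions’ side of that list a little more concrete, colour contrast is one of the few accessibility checks that can be computed exactly. The sketch below implements the WCAG 2.x relative-luminance and contrast-ratio formulas in Python; the colour values are illustrative only, and a real project would still need a human to judge whether the contrast works in context.

```python
def _linearize(channel: float) -> float:
    # Convert an sRGB channel (0..1) to linear light, per the WCAG 2.x
    # relative-luminance definition.
    return channel / 12.92 if channel <= 0.03928 else ((channel + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """Relative luminance of an (R, G, B) colour with 0-255 channels."""
    r, g, b = (_linearize(v / 255) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio between two colours: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum possible ratio (21:1).
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
# WCAG 2.1 AA requires at least 4.5:1 for normal-size body text.
meets_aa_normal_text = ratio >= 4.5
```

A check like this catches only the mechanical part of accessibility; the pedagogical judgement about whether the learning experience is equally meaningful still sits with the designer.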
Authoring tools can help achieve your eLearning design goals
With the widespread adoption of authoring tools in the eLearning profession, eLearning designers may need to work with a range of authoring and testing tools that give the best flexibility when it comes to accessibility. It is good to be aware that some eLearning development platforms are more ‘technically’ suited to achieving higher WCAG levels than others. For example, our sister company App-eLearn.com can accommodate WCAG 2.1 AA accessibility. Do you know what is possible with your in-house systems?
Understand how to test for accessibility
Many organisations request that their eLearning designers and developers meet WCAG accessibility guidelines; however, it is essential that you know how to test your eLearning for its effectiveness. How can you determine whether accessibility is not only functionally achieved, but that the learning experience is equal and the accessible alternatives are meaningful? It is critical that you are aware and well informed, so that you are signing off on quality learning solutions that provide everyone with a valuable learning experience.
Would you like to learn more?
Would you like to explore more ways to make your learning accessible? Call the Liberate team and let’s start the conversation: Call Rod on 0413 982 712 or connect with him on LinkedIn.
Are university study programs, where students take ‘time out from life’ to study towards a future career, becoming an outdated way of gaining qualifications? Enter technology-supported customised learning, where learners acquire the knowledge and skills they need at the time they need them, and receive and use micro-credentials when needed.
Let’s consider the mature-age worker employed as a software developer. She attended university but left when deciding that the course was not keeping up-to-date with technological changes. She has worked in a number of businesses, demonstrating advanced levels of knowledge and skill in her profession by learning on-the-job. Now in the middle of her career, a project management position has become available for which she believes she is highly qualified, yet does not have the degrees which will attract a new employer’s notice. How can she compete in the employment market?
Recognition of Prior Learning (RPL) is one strategy she might use. Most universities and TAFE colleges offer some form of RPL. However, as one university states, RPL will not be recognised until an offer of enrolment is accepted. The Australian Qualifications Framework requires any RPL process to meet the same standard of assessment as would be required at any accredited institution and to be performed by academic or teaching staff with expertise in the subject. These requirements restrict the timeliness of both learning and accreditation.
What a tedious process RPL has become, with the expectations that verified supporting documentation attests to what is known, understood or has been performed. Few of us look to the future and compile work samples or copious documentation of what we do from day to day, either in a professional or personal capacity. Yet, these are the experiences which accumulate to build our expertise in many areas. Learning is a never-ending process, and acknowledgement of what we know and how we use it can provide many new opportunities for employees.
What if we all had the opportunity to keep track of our learning journeys easily by using the capabilities of technology and the interoperability of learning systems? What if technology helped to collect the data associated with our chunks of learning, compiling these into usable summaries linked to professional competencies? A secure linking between private and public cloud services might play an integral role in supporting changes between the tertiary education sector and personal learning spaces.
The benefits of big data and data analytics using modern technologies such as Learning Record Stores (LRS) and xAPI tracking make it possible to think about a completely different approach involving micro-credentials. This approach encompasses how university students acquire their knowledge and skills and how these results are tracked, recorded and credentialed.
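For readers who have not seen xAPI in practice, the sketch below shows the basic actor–verb–object shape of a single xAPI statement, the unit of data an LRS collects and aggregates. The learner, course URL and score here are hypothetical, invented for illustration; the `completed` verb URI is one of the standard ADL verbs.

```python
import json

# A minimal xAPI statement: who did what, to which activity, with what result.
# Actor, activity id and score are hypothetical examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Sample Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/activities/micro-credential-101",
        "definition": {"name": {"en-US": "Example Micro-credential Module"}},
    },
    "result": {"completion": True, "score": {"scaled": 0.85}},
}

# An LRS receives statements like this as JSON over HTTP and stores them
# for later analytics, e.g. mapping learning records to competencies.
payload = json.dumps(statement)
```

Because every statement carries a unique activity identifier, records from many systems can be merged in one store, which is what makes the micro-credentialing approach described above technically feasible.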
Does your Learning and Development (L&D) Team have the finger on the ‘business pulse’ when they decide whether to use a learning course or performance support tools? Do they ask the right questions before they suggest the best option for workplace outcomes?
Why do we ask?
We often receive requests to develop an eLearning course, when, in fact, performance support tools should at least be considered as an additional, if not THE solution.
In working with our client partners, there have been quite a few projects that benefitted from us spending a little extra time getting to the bottom of the business need that led to the learning request. In many instances, we were subsequently able to help develop something better geared at solving the business problem, and all it took was asking some questions and some innovative thinking.
Learning or performance support tools – or both?
The terms ‘learning’ and ‘performance support tools’ are sometimes used interchangeably, but this obscures their different, but equally important, functions in successful workplace skill development.
If there is a need to develop skills and knowledge, the first reaction is typically to address it through formal learning interventions like workshops, courses, and classes (or during this COVID period, webinars and virtual classrooms). It means time away from the job for the duration of the learning intervention, which is often a one-off event. This type of solution is related to the individual’s memory recall capacity, with the knowledge often needing to be recalled from the memory-bank months afterwards.
Learning’s often overlooked and underestimated cousins are performance support tools, which individual learners use on the job, at the time and place needed. Performance tools can stand alone or act as an additional mechanism after the formal learning is complete. They increase learning retention, as well as removing the need to rely solely on one’s memory for knowledge recall.
How can L&D find out what the business needs?
Here are three questions learning teams could work through, together with the business area:
1. What is the business problem you want to solve?
For example, let’s consider an organisation that wants to roll out a new business ethics policy for their management, with a new, anonymous way that anyone in the organisation can self-assess, and receive support.
The business problem to solve in this case is to a) raise awareness of the policy, b) explain the new content and what it means for every manager, and c) facilitate access to the anonymous self-assessment and support channel.
2. How often will the learner need the new knowledge or skill in their role?
While this kind of policy may not need to be accessed often, it is essential to educate everyone in the organisation about the principles, expectations, and main topic points of the new policy and how to access the self-assessment and support system. However, there is little benefit in, say, learning it off by heart, to have every word of it retained in their head at all times. Instead, a short and sharp eLearning piece can achieve the education part – through creating a solid awareness of the key principles of the policy (and certainly not reiterating the policy online with some multi-choice questions at the end). More important in this case is that everyone can access the self-assessment and support tool when they need it – so it is a just in time, just for me, and just enough solution.
3. What is the consequence if the learner cannot access the knowledge when they need it?
Now, imagine a manager on business travel who is invited to a business dinner with a supplier. She subsequently needs to decide whether a planned business interaction discussed over dinner could be a breach of the policy. It is neither realistic to expect the manager to remember the eLearning piece from several months ago, nor practical to expect her to go back into the learning artefact and find the list of criteria needed to make the right decision. This is where a performance support tool (or electronic performance support system) comes in. What if the manager had a company app on her phone or the intranet that let her work through a set of self-check questions to identify whether certain components of the business dinner fall within company policy boundaries, together with links to the policy and supporting documents for fast retrieval if needed?
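A self-check tool of the kind described above can be very simple under the hood: a list of yes/no questions, each mapped to the answer that keeps the interaction within policy. The sketch below is a hypothetical illustration (the questions and expected answers are invented for this example), not a real policy engine.

```python
# Hypothetical policy self-check: each question is paired with the answer
# that indicates the situation is within policy.
QUESTIONS = [
    ("Is the estimated value of the hospitality under the gift threshold?", True),
    ("Is the supplier currently involved in an open tender with us?", False),
    ("Has the dinner been declared in the gifts-and-benefits register?", True),
]

def self_check(answers):
    """Compare yes/no answers with the policy-compliant responses.

    Returns (within_policy, flagged_questions), where flagged_questions
    lists any question whose answer did not match the compliant response.
    """
    flagged = [
        question
        for (question, expected), answer in zip(QUESTIONS, answers)
        if answer != expected
    ]
    return (not flagged, flagged)

# All answers match the compliant responses, so the check passes.
ok, issues = self_check([True, False, True])
```

In a real performance support tool, each flagged question would link back to the relevant policy clause and, where needed, to the anonymous support channel.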
It is easy to see how the consequence of not being able to access this information reliably and fast could be potentially catastrophic for the business. It is also easy to see that an approach like the above will save the business hours, and therefore money, on training the managers.
Liberate have experience in developing performance support tools