The Paradox of Recruiting Well in Learning and Development

It is a common belief that recruiting learning and development specialists should be based on them holding specific minimum qualifications to prove their suitability for the job role. Whilst we value formal tertiary qualifications, we respectfully wish to challenge this view.

Our team has over 70 people, and our learning experts support a wide range of client-based, internal L&D teams at blue-chip companies and federal government agencies across various industry sectors.

Although most of our team have degrees, master’s or PhDs in their respective fields, we have no formal requirements for minimum qualifications if someone wishes to join our team.

The Paradox of Recruiting in Learning and Development

Our recruiting pool is our valued L&D network – and our recruitment method is based on attracting the best of the best of Australia’s talent. We get unsolicited applications daily, yet not many candidates will ever cut the mustard.

The challenge that recruiting graduates brings to the business end of L&D work

Whenever people contact us fresh from finishing their formal education, the main problem is that their practical skills and experience are 2-5 years behind what the market needs, whether in blended learning, eLearning design, technical understanding of interfaces, or virtual learning practices in modern, digitalised workplaces.

So, while the candidate’s theoretical knowledge may be excellent, their ability to translate that knowledge into tangible client value is lacking.

Another aspect that is hard to achieve in a formal learning setting is the sheer level of creativity and multitasking the job demands daily. We need the freshest minds to perform well to come up with the umpteenth brand-new, unique scenario around customer service skills, compliance training or operating heavy machinery safely, for example. Rigid, institution-based learning plans and the ticking of pre-determined learning boxes are not conducive to fostering that required level of mental gymnastics.

We need the freshest minds to perform well to come up with the umpteenth brand-new, unique scenario

Rodney Beach

We didn’t notice this paradox for a while in our early stages after founding Liberate Learning. Over time, as we had larger project teams that needed exceptional project acumen and evolving technical skills, the costs of recruiting the wrong skillsets increased considerably.

Are we alone in this situation and can the dilemma be resolved?

In case you’re wondering, no, this type of situation is not unique to our industry. I’ve seen it happen in other industries where keeping up-to-date with tools and tech beats any formal certificate.

So how do we avoid recruiting the wrong people? What do we do differently? Here are three areas of focus for us:

01. Know what’s coming and plan for it

First, we need to clearly understand each team member’s technical skills and what their specialist work activities require. We are also responsible for knowing which technologies, driven by industry demand, will arrive over the coming 2-3 years, so it’s up to us to set priorities and ask for these skills in advance when recruiting. This is especially true in our industry, where different vendors offer many tools and platforms and the overall rate of innovation is high. We need to be up to date across all of them and consider emerging tech to stay ahead of the game.

02. Understand and cater for the complexities of the job field

Second, we need to understand how complex it is for our team members to work effectively in our fast-paced client environment. For example, if clients want to involve multiple stakeholders in a project, we can’t automatically expect them to meet tight review timeframes. However, if our people have excellent self-management skills and use their project management acumen to proactively provide solutions and proven recommendations, it will be easier for everyone to succeed.

03. Be prepared to invest in internal people development

Finally, we need to respect that even people with excellent L&D and technical skills may need substantial upskilling and onboarding in other areas of learning innovation. This is a profound time commitment when talking about long training periods scheduled during working hours. The solution to this issue is not to hire L&D generalists, but to make sure every technical specialist learns what they need to unlearn to then relearn, as quickly as possible.

For us, the key to successful L&D recruitment is to understand how our team works and make the appropriate trade-offs in broad skill expectations. The paradox of hiring freshly graduated learning specialists is that they will appear well educated for a short time, but soon enough you’ll be back to square one. This is because their knowledge and skills may not transfer to real-life projects and the practical application of learning innovation. It’s much better to invest more in onboarding, training and a supportive workplace culture – and spend less on screening for formal qualifications.

If you have any idea how to resolve the paradox, we are happy to hear from you and share ideas within the AITD/NZATD community.

This article originally appeared in Training & Development magazine, March 2022 Vol. 49 No. 1, published by the Australian Institute of Training and Development.

RODNEY BEACH

Top 5 L&D tech trends to test drive yourself

We understand; the amount of new tech emerging in L&D can be mind-boggling – and outright scary – for many organisations. It’s hard to choose which bandwagon to jump onto.

As in other areas in life, the adage of ‘try before you buy’ can help inform you of the opportunities a new technology solution may hold, and your team will be coming out of it with new skills.

Having personally seen some incredible results, here are a few trends we believe live up to the hype and are worth giving a go.

1. Discover what a learning record store can do to make learning engagement more personal

Some things in life need to be experienced to be understood – and we believe a learning record store (LRS) is just one of those things.

An LRS has the potential to hugely improve the way you offer and track workplace learning. That’s why it’s well worth investing a little time experimenting with low-risk eLearning assets and some essential LRS functions to understand how it all works.

Dipping a toe into this new tech is as easy as finding a free LRS online, setting it up and trialling its functions with a voting or rating feature in one of your existing eLearning courses. Tracking learner ratings or votes and displaying them ‘live’ within a course is a great way to experiment. It will give you a first insight into whether this technology could be suitable for your needs, and it’s also a practical first step towards building a business case for it.
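
To make that first experiment concrete, here is a minimal sketch of what recording a learner’s rating as an xAPI statement in an LRS might look like. It assumes a free or trial LRS exposing the standard xAPI statements endpoint with Basic authentication; the endpoint URL, credentials, course identifier and learner email are placeholders, not real values.

```python
# Minimal sketch: record a learner's course rating as an xAPI statement in an LRS.
# Endpoint, credentials and identifiers below are placeholders for illustration only.
import uuid
import requests  # third-party: pip install requests

LRS_ENDPOINT = "https://example-lrs.com/xapi"   # placeholder LRS endpoint
LRS_AUTH = ("lrs_key", "lrs_secret")            # placeholder Basic-auth credentials

def send_rating(learner_email: str, course_iri: str, rating: int) -> None:
    """Post one 'rated' statement (1-5 stars) for a course to the LRS."""
    statement = {
        "id": str(uuid.uuid4()),
        "actor": {"objectType": "Agent", "mbox": f"mailto:{learner_email}"},
        "verb": {
            "id": "http://id.tincanapi.com/verb/rated",
            "display": {"en-US": "rated"},
        },
        "object": {
            "objectType": "Activity",
            "id": course_iri,  # the course's activity identifier (IRI)
            "definition": {"name": {"en-US": "Customer Service Essentials"}},
        },
        "result": {"score": {"raw": rating, "min": 1, "max": 5}},
    }
    response = requests.post(
        f"{LRS_ENDPOINT}/statements",
        json=statement,
        auth=LRS_AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
        timeout=10,
    )
    response.raise_for_status()

# Example: a learner gives the course four stars at the end of a module.
send_rating("jane.learner@example.com", "https://example.com/courses/cse-101", 4)
```

Most free LRSs include a statement viewer, so you should be able to watch the ratings arrive and play with simple reports straight away.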

2. Explore chatbot technology as part of L&D tech to help streamline your operations

Chatbots can be very useful in supporting any business function that encounters frequently asked questions (FAQs) from internal and external stakeholders.

If you have an extensive FAQ database, feed the questions and answers into a chatbot and test how people interact with it. Ideally, the person seeking information instantly receives the answer they need within the chatbot, along with a pointer to the correct information in your central repository, instead of waiting in a helpdesk phone queue.
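
As a starting point, the matching logic behind a basic FAQ bot can be surprisingly small. The sketch below, using only the Python standard library, returns the closest stored answer to a user’s question; the FAQ entries and the similarity threshold are invented for illustration, and a real chatbot platform would handle this matching (and much more) for you.

```python
# Minimal sketch of the FAQ-matching idea behind a simple chatbot:
# given a user question, return the closest known FAQ answer (standard library only).
from difflib import SequenceMatcher

# Invented example entries; in practice these would come from your FAQ database.
FAQ = {
    "how do I reset my LMS password": "Use the 'Forgot password' link on the LMS login page.",
    "where can I find the onboarding checklist": "The checklist is on the intranet under HR > Onboarding.",
    "who approves my training requests": "Training requests are approved by your direct line manager.",
}

def answer(question: str, threshold: float = 0.5) -> str:
    """Return the best-matching FAQ answer, or a fallback if nothing is close enough."""
    best_question, best_score = None, 0.0
    for known_question in FAQ:
        score = SequenceMatcher(None, question.lower(), known_question.lower()).ratio()
        if score > best_score:
            best_question, best_score = known_question, score
    if best_question is None or best_score < threshold:
        return "Sorry, I don't know that one yet; I'll route you to the helpdesk."
    return FAQ[best_question]

print(answer("How do I reset my password?"))
```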

3. Consider user-generated content to freshen up your learning

Did you know your subject matter experts may have the highest credibility – and therefore highest learner buy-in – when it comes to job skills training?

Image shows a photographer pointing a camera towards the viewer
Photo by Terje Sollie from Pexels

Entice SMEs to use their smartphones to create content such as vox pops, piece-to-camera interviews or videos of processes that can be shared with learners. Review the content’s effectiveness and curate the contributions.

4. Add performance support tools to boost learning transfer

Performance support tools can be a powerful way to carry learning into workplace practice, but they are often neglected in the L&D tech stack.

First, incorporate the principles, such as policies and concepts, into a formal learning piece so your learners understand the ‘why’ and the ‘what’ involved. Then move the actual ‘how-to’ learning into moment-of-need pieces that people can search for and find when and where they need them.

These tools are especially useful for job activities that people don’t perform frequently and need that little extra support to get right.

5. Practise mobile-first learning design to understand its point of difference

eLearning design for mobile-first is different from its traditional counterpart where the deployment primarily happens on desktop screens. Mobile learning has played a more significant role for a while now, especially in engaging cohorts that aren’t desk-bound. Therefore, it would help to practise a few specific design principles behind good mobile learning design.

Use modern, mobile-first authoring tools to test this new way of learning design. Consider how the characteristics of mobile design will impact your learning content and structure, for example vertical screens and scrolling, screen haptics – and the limitations in some functions eLearning designers often use such as drag-and-drop activities.

Some interactions do not meaningfully translate onto a mobile screen, but on the other hand, there are many new interactions to discover in the vertical, mobile-learning world.

Reach out for help

If you are unsure where and how to start, you can always engage a learning consulting service to help you work through your barriers and to make sure the learning tech you’re trialling will actually fulfil a real business need of yours.

This article originally appeared in the Training & Development magazine, December 2021 Vol. 48 No. 4, published by the Australian Institute of Training & Development.

Optimising your learning with data: Are you on the right track?

by Rod Beach

So, you have recently embarked on the adventure of tracking all the learning data you can get your hands on with your learning record store (LRS) or the equivalent systems you have. After all, more learning data means more information – and more information means better decisions, right?

After a couple of months of collecting stats, you sit down with your team in L&D to discuss a couple of unexpected findings:

01. The learning logs report shows that in the past month only 30% of learners have successfully passed the How to Conduct Product Quality Assurance quiz in your quality assurance procedures training.

02. In your LRS, you can see that a particular sequence of a training video about this same topic gets paused, rewound and replayed by over half of the learners at timestamp 2:39:00-2:43:00.

After some debate, this is what your L&D team concludes:

The training video cannot be up to scratch if people have to watch it several times to ‘get it’. This is supported by the fact that 70% have failed to answer the quiz correctly.

Would you agree?

Image shows a magnifying glass lying on top of data and footprints

Let’s start with the ‘why?’ of learning data

In our daily practice working with large organisations across Australia and internationally, I speak with many clients who are keen to jump on the ‘big data’ bandwagon in L&D.

This is something I wholeheartedly welcome, having seen first-hand the impact that it can have on learner satisfaction and business outcomes. Our ability to measure the application of learning within the flow of work is both exciting and powerful. However, success in this field requires giving data the respect it deserves, in all aspects.

I ask our clients, ‘Why do you want to track learning data at this level? What will you do with the data?’ And the broad response I receive sounds something like, ‘The more data we gather, the more we will know. We can think about what we will do with it later.’

I beg to differ.

This watering-can approach will provide you with a plethora of unstructured words, numbers and stats. Instead, I encourage people in L&D to liaise with the business units to uncover, for example, the underperformance issues that the training is supposed to address. What KPIs could be linked back to training-success measures? What is the current performance level, and where do we need to get to?

To use data to its full extent, we first need to think about the information we want it to provide us.

Rod Beach

What will you measure?

The next question to answer is what learning performance indicators could you use to link to the intended performance augmentation? For example, could ‘frequency of refresher training completed during a calendar year’ be relevant? Or could ‘speed of rollout of maintenance instructions for new machinery’ make a difference to the number of days the machines stand still on your factory floor?

You can see where I am going with this. To use learning data to its full extent, we first need to think about the information we want it to provide us.

How and where will you measure?

In modern workplaces, there are many data pools to tap into to paint a picture of what is going on. In L&D, data pools range from LMS completion data to training feedback sheets, to xAPI statements and even Google Analytics.

Look at your tech stack and think laterally about which learning data taps you need and want to turn on, depending on the ‘why?’ and ‘what?’ explored above. With modern authoring and publishing tools, for example, you can see which cohorts have taken which learning path or completed branching video scenarios, and which haven’t.
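
As a simple illustration of turning one of those taps into a report, the sketch below aggregates a handful of xAPI-style records into pass rates per cohort. The record fields and sample data are invented; a real export from your LRS or authoring tool will have its own structure, but the aggregation idea is the same.

```python
# Minimal sketch: turn raw xAPI-style records into a simple per-cohort pass-rate report.
# Field names and sample data are invented for illustration only.
from collections import defaultdict

statements = [
    {"learner": "a@example.com", "cohort": "Retail", "verb": "passed"},
    {"learner": "b@example.com", "cohort": "Retail", "verb": "failed"},
    {"learner": "c@example.com", "cohort": "Warehouse", "verb": "passed"},
    {"learner": "d@example.com", "cohort": "Warehouse", "verb": "passed"},
]

def pass_rate_by_cohort(records):
    """Return {cohort: pass rate}, counting only pass/fail attempts."""
    totals, passes = defaultdict(int), defaultdict(int)
    for record in records:
        if record["verb"] in ("passed", "failed"):
            totals[record["cohort"]] += 1
            if record["verb"] == "passed":
                passes[record["cohort"]] += 1
    return {cohort: passes[cohort] / totals[cohort] for cohort in totals}

for cohort, rate in pass_rate_by_cohort(statements).items():
    print(f"{cohort}: {rate:.0%} passed")
```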

Who can help you interpret the learning data?

Large data sets need skilled and trained professionals to interpret them within the business and L&D context. Statistics are notorious for their ability to skew narratives, so it is important to work with learning data analysts and/or project/line managers to create meaningful reports that can help you make decisions.

Upskilling or augmenting L&D departments in learning analytics is, in my view, one of the most effective investments that L&D can make to future-proof their influence and to stay a relevant and integral part of an organisation’s success.

So, what really happened with the quality assurance procedures training?

In the case of the quality assurance training mentioned at the beginning of this article, it turned out that in the month that the data was recorded, a one-off, temporary product development task force had worked on a special product innovation push, triggered by an increased number of faulty product returns.

One of the task force’s activities was for the entire team to go through the online training from the QA user/ learner perspective. The project team had decided to not review the quiz content in the online learning piece, so most of them skipped it or put in minimal effort to ‘pass’.

As to why the video segment was watched more often than normal: at timestamp 2:39:00, the weak component that had caused the product fault was featured during initial assembly and then again in quality assurance.

Mystery solved.

This article originally appeared in Training & Development magazine, September 2021 Vol. 48 No. 3, published by the Australian Institute of Training and Development.

The many facets of instructional design

Instructional design (ID) activities differ greatly, depending on the mode of delivery. Creating learning can take many different forms, and the role of an instructional designer includes many specialist skills, from needs analysis to cross-platform implementation.

Instructional design: A smorgasbord of skills

Instructional design for face-to-face or instructor-led virtual training

For these learning modes, ID means supporting the instructor to facilitate live learning sessions, taking into account the facilitator’s skills, the learning cohort, the external learning environment, and resources such as (digital) hand-outs, running sheets, guides, and learning props.

Self-directed eLearning on LMS

Here, the focus of ID is to create interactive content that maximises learner engagement both online and offline. IDs need to be aware of the functionality and technology constraints of different LMSs.

Microlearning and performance support tools

This requires the ID to create short, sharp, on-the-job learning pieces or digital tools that can be accessed at the point of need. The ID needs to take into account the nature of the task and requirements of the role and understand the goals of the business.

Subject matter experts and instructional design

IDs work closely with subject matter experts to ensure the learning content is factually accurate and contextualised correctly. This requires high-level communication and organisational skills from the instructional designer.

Instructional designers visualise and tell stories

In eLearning, strong, relevant visuals (video or animation) play an important role in making the learning more accessible, authentic, and memorable. IDs need to know how to create contextualised scenarios to support the learning outcomes.

So we can ensure all our clients’ needs are met, Liberate Learning has the full stack of instructional design skills in-house.

Overcoming real-world barriers to human-centric learning design

‘You can’t get it right for everyone, and we have our own training rooms that we should use. That’s what they are there for,’ said the branch manager.

‘That may be true, but we should give it our best shot and meet our learners where they like to learn whenever it is possible,’ said the learning and development lead, ‘Our new recruits are digital natives and, in their pre-employment survey, 72% have indicated they’d prefer to learn about this particular topic on their personal device, self-directed when it suits them within the given timeframe.’

Does this scenario sound familiar? It is one of the many ways in which innovation in L&D stalls, hampering effective learning design efforts that could yield higher returns on the average training dollar spent. We feel it’s time to uncover some of the typical barriers to human-centric learning design we have observed over the years:

01. Discovery phase

Where do you want to go and what is already in your hands to get there?

If we want learners to consistently apply what they have learnt in the workplace, everybody needs to be in the know. Managers/supervisors need to model and support said practices – and not just during the training phase.

Anything else turns learning into a tick-and-flick exercise, with little sustainable impact. For this reason, it is crucial to clearly articulate what the future state should look like in terms of the desired behaviour of the people who need to learn a skill or task AND the people who need to supervise them.

Another aspect that often gets neglected is an organisation’s existing learning culture. For example, is time and value given to formal face-to-face training while self-directed learning during working hours is frowned upon? Is your learning culture driven by functional imperatives, such as learning infrastructure that could be seen as a sunk cost if it is not used as frequently as before?

Overcoming barriers to human-centric learning design includes examining the existing learning culture in an organisation.

Conversely, there may be existing learning materials that could be leveraged in a project, or others that will be affected by a new project’s development.

02. Definition phase

What actually IS the problem?

This is one of the hardest phases to get right, as different stakeholder groups may have different perceptions of the same issue to be solved. Sometimes, it takes deeper enquiry to get to the bottom of the matter and find that the initial training intent would have treated the symptoms but not addressed the core performance problem underneath.

Learning, ideally, is a part of a bigger, integrated communications ecosystem, with congruent messaging cascading all the way into the team and individual business outcome KPIs. Hence, a learning project team must have a sufficient understanding of a learning piece’s scope, risks and dependencies.

03. Design phase

Finding the balance between ideal and practical

‘There are many ways to slice an apple; learners should have a choice on how they complete a work task,’ said the learning and development project officer after reviewing a learner guide draft.

‘Um, sure; however, in this instance, we are training construction workers in standard operating procedures and mandatory job site machinery safety checks, so there really cannot be any choice.’

L&D teams are experts in their field: education. They rely on effective collaboration with subject matter experts to co-create a practicable learning solution with tangible outcomes that can be applied on the job directly.

Often, these same L&D teams are under-resourced and lack the capacity to staff sufficient stakeholder engagement. This scarcity of resources also makes it difficult to keep up with the latest L&D technology trends in-house, meaning that barriers to human-centric learning design often come through external providers.

04. Development phase

The right strokes for the right folks

The art of a well-received project with stakeholder support is to give everyone who needs it ‘skin in the game’. Not just at the beginning of a project and on delivery, but throughout development. Iterative testing and actively seeking input from test users will enhance the learning output so, at deployment, there should be no surprises.

Sometimes, we see a reluctance to explore or adopt new instructional design tools that may offer new and better ways of reaching the learner audiences than traditional authoring tools. Time and resourcing pressure in L&D, ironically, stand in the way of learning new tech when it comes out.

05. Delivery phase

Channelling and timing done right

The one question we always ask is, ‘what if…?’. And the first answer we often receive is, ‘oh, we never thought of offering it that way.’

Most organisations have many more communication channels than are usually considered for L&D projects, channels that could be used to enhance and reinforce learning messages before, during and after a learning intervention.

Wherever possible, give your learners a choice about where they would like to access the content and keep refreshers close and relevant to job contexts that will require that access. For example, display short, sharp product handling information or brief customer service reminders on POS systems.

What are your organisation’s barriers to human-centric learning design?

Uncovering the answer to that question might take your learning further than you think.

This article originally appeared in Training & Development magazine, June 2021 Vol. 48 No. 2, published by the Australian Institute of Training and Development.

The e-version of the magazine for AITD members can be found at Training & Development – June 2021 – Australian Institute of Training and Development (AITD).

Why now is the time to leverage new technology and data analytics in L&D

The game-changer has a name and came into workplaces across the globe, leaving few lives untouched. In 2020, Covid-19 altered the way we live, the way we work and the way we learn. It forced upon us radical changes in record time and at a record scale, with little chance to reflect, plan and strategise.

Organisations needed to pivot their usual work practices to stay operable; many unfortunately had to downsize and learn how to achieve the same output with fewer resources. Activities that usually happen in person were forced to move online, including learning and development and performance management. 

Now, a few months into this new awareness of what is possible in the online space, it is time to take stock and examine how best to take forward any lessons learned. One thing seems certain: digital learning solutions are now firmly entrenched in businesses – at a minimum because of risk management requirements for future crises, and at best because they are a proven, enriching element that drives workforce engagement and performance.

Image shows data analytics on a tablet

Arguably, the current situation lends itself well to experimenting with new ways to operate L&D; it is a forgiving time to embrace data analytics and digital learning and put in place what can work for your organisational context in a more agile, staged approach.

In parallel with the return to the Covid-19 “new normal”, here is a roadmap for a staged release of your L&D technology.

Stage 1: Understand Learning Data

In this first step, L&D configures a learning record store (LRS) and learns how to read and track learning activity statistics such as completion rates, views and interactions, and average time spent per screen. Here, it is not enough to merely look at raw data, because external factors have an influence that needs to be considered in the analysis.

For example, take a mobile-responsive course where a learner spends 5-10 minutes of “time on screen” because it is a longer, vertically scrolling piece. The equivalent learning artefact built for a tablet/PC screen could have different interactive functionality, such as “click next” or “swipe next”, and as a result produce a totally different value for “time on screen” – one that has nothing to do with the screen content viewed by the various learners.

Analysing data requires an awareness of the influencing factors that can throw your metrics out. That is why taking the time to experiment, learning how to read and interpret data, and understanding the direct influence the build of your training has on the data is time well invested.
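
As a toy illustration of that caveat, the sketch below compares each learner’s “time on screen” against the average for their own delivery format rather than a single global figure. All numbers and format labels are invented; the point is simply that roughly a seven-minute mobile scroll and a three-minute click-next desktop session are not directly comparable.

```python
# Toy sketch: why raw 'time on screen' is misleading across delivery formats.
# Compare each learner against the average for their own format instead of one
# global figure. All values below are invented for illustration.
from statistics import mean

# (learner, delivery format, seconds recorded for the same content)
records = [
    ("a@example.com", "mobile-scroll", 420),
    ("b@example.com", "mobile-scroll", 480),
    ("c@example.com", "desktop-click-next", 150),
    ("d@example.com", "desktop-click-next", 170),
]

def time_vs_format_average(rows):
    """Return (learner, format, ratio of their time to their format's average)."""
    by_format = {}
    for _, fmt, seconds in rows:
        by_format.setdefault(fmt, []).append(seconds)
    averages = {fmt: mean(values) for fmt, values in by_format.items()}
    return [(learner, fmt, seconds / averages[fmt]) for learner, fmt, seconds in rows]

for learner, fmt, ratio in time_vs_format_average(records):
    print(f"{learner} ({fmt}): {ratio:.2f}x the average for that format")
```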

Stage 2: Create a Data Analysis Governance Framework

The lessons learned in Stage 1 will enable your L&D team to identify valuable themes and patterns to consider before rolling out learning data analysis in your specific organisational context:

  • What are meaningful metrics that can support your L&D function for continuous improvement?
  • What method will be used to measure performance consistently?
  • What are some metrics that will be useful in the compliance and risk management context?
  • What are the privacy and data protection implications of collecting, recording, and using digital learning data? 
  • What policies and procedures need to be put in place to turn digital learning and data analytics into business as usual?
  • What data sets do you need to track for what purpose?
  • What information can support performance management and support, and to what extent?
  • What tools and functional guidelines will be used in relation to data governance and consistency?

Stage 3: Implementation

Depending on the granularity of your tracking, data analysis from L&D activities can potentially inform analysis of performance, productivity, efficiency, management, leadership, contributions – all the way to gauging the effectiveness of training programs.

It can motivate individuals to take more ownership of their learning, ideally because the data analysis leads to improved learning content, or, to quote Peter Drucker, just because ‘What gets measured gets done.’

Gone are the days when learners can quickly flick through a learning piece without spending the appropriate time and attention to learn what they need to know and do for their job. Moreover, data analytics is valuable in supporting AI and machine learning programs.

Learning and development can save lives. That is another hard lesson we have collectively learned during these pandemic times. Let’s keep it digital and accessible – and let’s keep learning to make it better, based on data, not opinions or inconsistently measured statistics.

This article originally appeared in Training & Development Magazine, December 2020 Vol. 47 No. 4, published by the Australian Institute of Training and Development.

Leverage technology to virtually engage during a global pandemic

As COVID-19 spreads like wildfire, it leaves a path of disruption in its wake – but not all of it is negative news. In some ways, the current pandemic forced organisations in all industries to reckon with their status quo and acted as a catalyst to move away from ingrained processes. This included the way organisations are able to train their staff and run their business in a safe manner.

Those organisations that already had a foot in the digital world at the beginning of 2020 had a head start – and for those who had to catch up, there were many options ready to go. And yet, not all technology proved useful in overcoming the challenges at hand.

Overall, we saw two tech-related approaches emerge that provided exceptional value for money and performance in these dire times.

The new virtual reality: VR in the education and retail sectors

Australian universities were hit hard with a complete halt to prospective international students travelling, attending classes or even going to open days, cutting one of their main income and marketing channels. Pair that with the simultaneous Australian Government reforms to student fees, along with the need to move all teaching activity online within 2 weeks, and you had a perfect example of ‘we need a magic wand and we need it now to turn this around!’

Image of finger pressing VR button on interactive technology screen

One solution lay in using virtual reality (VR) technology. In a fast-paced, low-budget project, Liberate Learning worked with a large Australian university to create a virtual open day experience. This allowed prospective students to virtually walk through the lanes of Melbourne, visit key campus amenities and learn about Australian culture. All the content was, and still is, available on web browsers from anywhere in the world, thus increasing the number of potential open day attendees way beyond what a real-life event could have catered for.

Web-based VR has advantages over immersive VR (the kind set up in a specific fixed location with VR goggles) in that it can be deployed anywhere in the world via web browser using low bandwidth. It also helps participants avoid motion sickness and comes with a much smaller price tag and shorter production times than immersive VR. That makes it much more suitable for crisis applications where timing, budget and remote access are all equally critical.

2021 can be a time to embrace the changes and advantages thrust upon us in 2020.
(Rodney Beach, Managing Director, Liberate Learning)

Another industry taking advantage of this VR solution was the fast-moving consumer goods (FMCG) sector, where organisations were faced with a higher-than-average infection risk and fluctuating staff cohorts due to quarantine requirements and panic buying.

They needed a way to train remote learners in authentic and locality-specific topics – and web-based VR delivered for over a hundred thousand learners. Unlike fully immersive VR, it also integrated with the retailer’s learning management system (LMS), allowing crucial training tracking for accountability and compliance in workplace health and safety.

The move towards minimum viable products (MVP)

The pandemic often required learning artefact development in timeframes way below the industry norm. As a result, some hastily produced solutions were under par in terms of learner outcomes, engagement, good learning design and measurability.

A case in point is that many organisations were uploading session slides and calling them ‘online learning’ – when at best it could be called ‘information dissemination’. One way out of death-by-PowerPoint was the rapid development of shorter minimum viable products – a pared down initial product with sufficient features to fulfil basic needs – using modern authoring tools, multiple deployment platforms, including mobile devices, and modern tracking mechanisms.

For example, we saw subject matter experts take videos of manual handling procedures with their smartphones, eliminating the need to put camera crews at risk on site. We observed learning teams embrace a new role as curators of such user-generated content. We also witnessed a higher sense of ownership of learning content from floor staff, as clips came straight from the horse’s mouth, produced by real-life Jack at the deli counter.

In the end, as the quality level of the videos was similar to self-made videos on social media platforms, learners felt part of the learning story because it was authentic and created a sense of collective effort.

Time will tell how long we will need to contend with this socially distant way of going about our business. However, it is very clear that the digital age has brought many advantages that our ancestors in the times of the influenza or plague pandemics did not have – so let’s be grateful for these opportunities.

2021 can be a time to embrace the changes and advantages thrust upon us in 2020, using them wisely for the betterment of these, and future, circumstances.

The full article appeared in Training & Development Magazine, March 2021 Vol. 48 No. 1, published by the Australian Institute of Training and Development.

Scope creep – How not to work with an eLearning vendor

“Sorry, that’s not part of this project. 
 We can help you if you sign this contract variation.”

If this sounds familiar, you may have experienced what we call the dreaded scope creep.

Scope creep comes in different forms and sizes, but it always has this in common, particularly when it occurs while you are working with an eLearning vendor: an aftertaste of a sub-optimal process, and possibly even sub-optimal deliverables, that affects your willingness to work with that vendor again, and vice versa.

1. Scope creep by lack of planning

Scope creep happens when the initial brief is incomplete, unclear or missing some of the expected deliverables, because that brief forms the basis of the vendor’s quote (and ultimately the contract).

Especially when engaging a new vendor, make sure you include all upfront information the vendor will need to succeed:

  • project requirements including timelines
  • expectations on review cycles, branding and style guidelines
  • technical requirements and deployment environment
  • learning design requirements
  • assessment requirements
  • stakeholder environment and sign-off process, and milestone payments tied to the successful delivery of project parts.

Providing complete documentation will help the vendor understand your specific environment. It is up to them to read it, plan for compliance and quote accordingly. A good, experienced vendor will know when and what to ask for to obtain complete information, and will automatically include ‘obvious’ requirements in their scope, e.g. certain government standards that need to be adhered to. An experienced and reputable vendor should also anticipate many of your requirements from their years of experience, even if you don’t formally identify them. A working relationship like this is better described as a true ‘partner’ arrangement than a ‘vendor’ one.

Avoiding scope creep: image shows a business meeting between a client and an eLearning vendor, who asks the client to sign a contract variation for a small change

2. Lack of communication – beware of the word ‘Just’…

Competent vendors will work with you to develop a learning solution that suits your specific context and budget needs. And yet, in the middle of the learning development phase (i.e. in production), one of your key stakeholders may want to add “just a couple of branching case study streams” in a learning piece that was supposed to be short, succinct and very low budget.

A request like this can have a domino effect on the entire learning piece’s integrity, so the design phase may have to start again to integrate it well and ensure the solution is sound. The stakeholder may not be aware of the impact. Now you need to spend valuable time explaining and navigating conflicting requirements for a disjointed learning solution, with potential contract variations, and blown out timelines and costs.

  • Consult with your subject matter experts (SME) and key stakeholders before going out to market for a quote, or involve the solution designer early in the process. Ask them about their requirements and expectations of what a successful learning solution looks like to them.
  • Explain the impact of increasing content or functions, such as interactivity levels, scenarios, animations, or video, on both time and budget. Your vendor can provide various solutions and talk about the quality, cost or timeline expectations for each solution, so you can have these conversations before the work begins.

3. Lack of imagination

There are many ways to skin a cat, they say, and indeed there are many, many more ways to design learning. 

Design, in general, is one of the most challenging topics to discuss in words (try describing the hue of blue or grey in front of your window to someone right now). The same is true for the look and feel of a learning project. 

  • Please do not wait for the vendor to finish an entire learning piece to determine the funky flat illustration style they chose does not suit your law practice’s corporate, traditional style.
  • Ask your vendor to show you what styles they recommend and get a few test screens done so you can picture it better via an early prototype.
  • Get sign-off on the preferred style by other stakeholders that may need to be involved, e.g. marketing/internal communications, to make sure your overall branding is on-point throughout your organisation.
  • A vendor you can trust will genuinely advocate for a great solution and not just ‘take orders’.

4. Inattention to detail and testing

Generally speaking, in a well-rounded and ‘engaging’ eLearning project, roughly 30% of the effort goes into instructional design and pedagogical scripting (engaging the mind), 30% into creative visual design and on-point artwork (engaging the eye), and 30% into function development and media production (engaging the ear and the screen). The remaining 10% or so goes into project management.

All parts are equally important to get right, 100% of the time, as they build on each other. The treacherous belief that "We signed off on that script, so the copy in the learning screens should be correct" has tripped up many a deadline. 

  • Ensure your vendor has robust QA processes in place. You should expect to receive error-free, complete proofs for your review. Still, human error is real, so it is the instructional designer’s responsibility to check every proof in detail for correctness and completeness before you sign it off to go to the next phase.
  • Test and ensure the solution works in your technical environment and according to all specifications (e.g. WCAG accessibility, various browsers and SCORM compatibility, to name a few). If your vendor says, ‘it works fine in our test environment’, ask them for validation in the form of a test certificate, and still test it yourself, given you’re ultimately responsible for final sign-off. An easy way to do that is to upload the SCORM package to a free SCORM test platform; a quick structural check like the sketch after this list can also catch packaging problems before you upload.
  • Sometimes, a learning piece takes two weeks to make it through all the internal technical mills and get uploaded to your LMS. That’s a long wait to see if all works correctly. If you work with a vendor you have built a trusted relationship with, you can share access to your LMS and they can do the testing directly in your environment for you.
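
If you want that quick, automated structural check, here is a minimal sketch in Python that verifies a SCORM zip contains an imsmanifest.xml at its root and reports the declared schema version. It is only a packaging sanity check under those assumptions, not a substitute for testing the course in a SCORM test platform or your own LMS.

```python
# Minimal sketch: a pre-upload sanity check on a SCORM zip package.
# It only confirms the manifest exists at the package root and reports the declared
# schema version; it does not replace functional testing in an LMS.
import sys
import zipfile
import xml.etree.ElementTree as ET

def check_scorm_package(path: str) -> None:
    with zipfile.ZipFile(path) as package:
        if "imsmanifest.xml" not in package.namelist():
            print("FAIL: imsmanifest.xml not found at the package root.")
            return
        root = ET.fromstring(package.read("imsmanifest.xml"))
        # The version usually sits in <metadata><schemaversion>; the XML namespace varies.
        version = next(
            (element.text for element in root.iter() if element.tag.endswith("schemaversion")),
            "not declared",
        )
        print(f"OK: manifest found, declared schema version: {version}")

if __name__ == "__main__":
    check_scorm_package(sys.argv[1])  # e.g. python check_scorm.py course.zip
```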

5. Scope creep as vendor business model

Our least favourite form of scope creep is one that some of our clients experienced in previous vendor relationships. These are veritable scope traps, set by vendors on purpose. Some vendors win projects by underquoting and then try to claw back their margins by forcing contract variations (including project administration fees each time) for every single client change request, no matter how small. That is not a sustainable way to work with clients long-term, yet from what we hear, such vendors exist.

Here is how you can protect yourself:

  • Set boundaries and list upfront what will warrant a contract variation and what will not, agree on this with the vendor and make sure the contract reflects this agreement.
  • A well-established learning provider will be interested in building a longer-term business partnership (as opposed to simply selling to you), so they are more likely to be willing to help you with small change requests, even after the project has been signed off and deployed. Over time, give and take will even things out, and often it is better to get the project out the door than to squabble about minutiae. 

We hope this article will help you avoid some of the pitfalls we have seen over the past 10 years. If you’d like to explore what it is like to work with learning experts that are interested in building long, trusted working relationships with their clients, that’s us!

Contact our Managing Director Rodney Beach if you are interested in learning more.

Designing quality eLearning that works

Over the past few months, many people have had their first encounter with ‘learning online’, as opposed to attending face-to-face sessions. Social media feeds are full of mixed views around online learning, from both the people suddenly tasked to develop online content in short time frames and the learners on the receiving end. 

The danger is that the extraordinary times we live in right now – and the necessary stop-gap learning measures – will shape the perceived quality of eLearning design overall. Therefore, we believe it is vital to take a step back and look at what constitutes quality, workplace-related eLearning – and what does not quite measure up.

Overall, designing relevant and engaging eLearning encompasses the disciplines of both the instructional designer (educational design) and the eLearning developer (technical UX and/or UI design). These disciplines need to complement each other for a learning piece to work well.

illustration of eLearning designer thinking about success factors of good eLearning design

Establish context

Can you imagine a less inspiring learning experience than clicking ‘next’ to read through a series of regurgitated snippets of a new policy, with a multiple-choice quiz at the end to see what you have ‘remembered?’ We can’t, and yet, it is not uncommon; we have seen it a lot over the years. Unsurprisingly, ‘eLearning’ designed like this is of low value to the learners and to the organisations who invested time and resources to create it. We would not call this type of content ‘eLearning’; we would call it ‘information dissemination’, which has nothing to do with the actual art of facilitating the learning process.

For eLearning content to work well, it needs to tap into what people believe they want or why they need to learn something. Context goes a long way in that, together with intrinsic incentives – and extrinsic motivations – to make the learner think. 

Effective quality eLearning content is more engaging and relevant when it:

  • uses stories and authentic case studies that the learners can relate to because they use familiar workplace scenes, behaviours and tacit language
  • includes typical workplace problems to discover and solve along the way
  • progresses the learner from unconsciously incompetent to consciously competent through scaffolding and learner progression with feedback and practice points
  • supports the practical application of new knowledge and skills in the workplace.

Create real engagement

In face-to-face learning, a facilitator can engage in direct person-to-person, peer-to-peer and group discussions and activities, and share stories to energise the room. In contrast, content creation in eLearning needs to plan for these elements or simulate them in order to achieve the same effect.

Going back to the aforementioned policy learning content, let’s imagine an eLearning piece that invites the learner to witness two different interactive workplace scenarios, with relatable problems, different behaviours, and two or more different outcomes. In both situations, all involved claim they followed the policy, and yet, one or more scenario paths have directly resulted in – or indirectly influenced – non-compliant behaviour and outcomes. 

With this scenario approach, dependent on the desired learning outcome and cohort, you can spin off a range of activities that allow learners to discover:

  • how to interpret right from wrong workplace behaviour in the context of the policy and organisation
  • how to manage a situation like this from the viewpoint of different roles in the organisation
  • how to lead self and others through any grey area situations
  • how and where to access support tools when needed after the eLearning piece is complete.

Now that we have looked at how smart instructional design can influence the quality of an eLearning piece, let’s look at what we can do with the learning space itself: the learning interface.

Get the visuals, audio and interactivity right

‘An image is worth a thousand words’ is an often-quoted cliché, to which we would add: ‘…if it is not just a visual object on the screen and if it directly helps to tell your eLearning story.’

What do we mean by that? 

A quality learning developer will be able to take the content and instructional design blueprint and create a learning experience interface with just the right balance.

In other words, it is not enough to place a stock image next to five lines of copy in a box to ‘make the screen look interesting’. A quality learning designer makes the learner wonder, imagine and think, building in smart and interactive feedback loops to signal learner progression along the way. They do it with an awareness of the entire bandwidth of audio-visual elements (colours, text, buttons, video, audio narration, animation), paying special attention to haptic and spatial design elements in the case of VR, or supportive and assistive design elements in the case of accessibility.

The level of sophistication is, as with any other project, dependent on the context of the learning project, the desired quality, its time or technical constraints, and the available budget. And yet, a good learning designer will only use what is truly necessary. Instead of overloading the learner with input and interactions of the finger, not the mind, they will give the learner functionality choices to cater for individual preferences (e.g. selecting a persona or pathway, or switching audio and captions on or off).

“Workplace learning is finally moving away from facilitated one-way delivery and static learning.”

Functionality brings us to the third element that makes a significant difference to a learner’s experience with eLearning and, therefore, its potential to be remembered for the right reasons. 

Using the right tools and learning platforms

We believe if you’re going to do something, do it well. In the eLearning industry, there is so much room for improvement in the way we can make our content meet the learner where they are, when they need it, in a way they prefer. That is the power of our 21st-century technology. 

However, the current mindset about the technical limits of eLearning seems to be:

  • teach on Zoom and upload videos and presentation files
  • chunk policy into paragraphs with a supporting image
  • paste text on the left of the screen and add a stock image on the right.

However, in our daily lives, ‘eLearning’ at the micro level happens everywhere, at our fingertips. For example, every time we ask Google ‘where is the nearest petrol station?’ because we have just noticed the blinking symbol on our car dashboard. Where is that same intuitive level of information accessibility when, say, a retail client is about to buy model C of brand A in a hardware store? What would happen if the retail assistant, upon hearing why the client opted for that model, could look up their user requirements on a smart device and recommend a better-suited alternative? 

Take the eLearning plunge

Workplace learning is finally moving away from facilitated one-way delivery and static learning platforms with eLearning courses on a desktop, to a more organic learning ecosystem approach; however, for many organisations, this kind of ‘learning’ still sounds futuristic, expensive or complicated. We believe this is because many people in learning and development are unaware that the technology needed to create this kind of learning is already here – and is neither expensive nor difficult to use. Yes, this kind of quality eLearning takes more thought from a pedagogical point of view, but the potential to deliver effective outcomes is far greater, too.

This article was published in the Australian Institute of Training and Development’s Training and Development Magazine, September 2020.

Managing a Career Learning Portfolio

In the past, a career was considered to be a profession, occupation, trade or vocation followed along a permanent, non-diverse path over a long period. A career could mean working as a banker, teacher, builder, beautician, real estate agent, financial advisor, software developer or any other job that might come to mind for one’s entire life. However, in 2020 and beyond, the word is taking on another connotation: the progress and actions taken by a person during their working lifetime, regardless of their profession or trade at any particular point in their life. Enter the need to manage a career learning portfolio.

We are being told that the era of frequent job changes is upon us. The Australian Bureau of Statistics notes that over 1 million Australians changed their job or their business during 2017. Over half of these people entered a new industry. The Australian Institute of Business writes “on average, today’s Australian employee changes jobs 12 times throughout their life, with an average tenure of 3.3 years. For workers over 45, the average job tenure is six years and eight months, while for under 25s, it’s just one year and eight months.” With the disruption to the world’s workforce caused by the Covid-19 pandemic, the rise of the gig economy and an increasingly casualised workforce, the frequency of job changes is likely to rise even higher. 

Similarly, the Foundation for Young Australians (FYA) argues that young people will have ‘portfolio careers’, potentially holding 17 different jobs across five different vocations. Will they need to be qualified for each of these jobs before entering employment, as we do now? Professionals and technicians are reported as those most likely to change their job, which includes educators and L&D professionals. Will they need to engage in three, four or five years of study at an accredited institution before they can seek employment in the field? Given the fast pace of change, by the time they finish under the current educational system, the skills and knowledge required to perform the job are likely to be different from what they studied. In short, the skills of high performance keep changing as technology assumes many of the functions we thought would never change. For example, a classroom teacher’s role changed almost overnight as the pandemic spread, with many teachers ill-prepared for online delivery.

The FYA’s 2017 report, The New Work Smarts, highlights that by 2030 we will be spending 30 per cent more time learning skills on the job. These will relate to solving work problems using critical thinking and judgement, verbal communication, and interpersonal skills, supported by an entrepreneurial mindset. More than ever before, learning will be a critical lifelong practice. To support the trend towards frequent change leading to portfolio careers, we need a timelier system of documenting learning and career progression. 

Illustration shows people climbing a career ladder between different professions, with a career learning portfolio in their hand.

Software developers are building intelligent systems to aid HR and L&D departments and large organisations when recommending career paths to employees, conducting job matching, or what Josh Bersin calls “intelligent talent mobility”.

Tertiary institutions are archaic and out of touch with the digital revolution. It is time they take a whole-of-life approach to support learners through their “career learning portfolio”.

Rather than courses that are defined by a list of acceptable subjects, learners need opportunities to create their own programs based on their on-the-job skill requirements, or their portfolio career goals. Direct links between the work that needs to be done and the institutions where individuals gain their skills and knowledge, fuelled by intelligent agents and xAPI big data, could be just the life-saver that tertiary institutions need to survive and enter the digital world. 

Read more about how Universities can help shape learning in the future.