Fostering connectivity and engagement in online learning

Fostering connectivity and engagement in online learning – how do you do it, and why should you even care?

A rapid move from live education to online versions is a big leap. HR departments, teachers, education providers, trainers, you name it, are making huge efforts to write online course scripts, gather relevant existing materials, produce new ones at the speed of light, write instructions, and design assignments and evaluation criteria, just to name a few tasks. Training programs and courses should have been live yesterday, employees and students are already waiting, and organizations expect to scale up and smooth out their operations within this quarter, whether we are talking about corporations, training companies, SMEs, educational institutions, or NGOs. Especially now, during the COVID-19 crisis, things are hectic and the transformation is fast. There is a danger that this anxiety, insecurity, hurry, and even stress are unintentionally transferred to otherwise amazing online courses and training. The important question is: how can you overcome this?


Viivi Pentikäinen, CEO and training program leader at Positive Education, experienced first-hand this rapid and, in her own words, even unpleasant change from live training to online programs. She has always believed in people coming together, connecting personally, sharing their thoughts and feelings, and experiencing the feeling of being together as a team and a community. But she, too, had to bow to COVID-19 and figure out how to adjust her business to the new order of the world.

“There was no choice left but to start transferring our materials, training programs, and learning to an online format. Of course, the first versions of our training programs were pretty much thrown together, not so pretty, but we made it. We were able to get our content online and keep running our training programs. For us, the most important thing, as well as a concern, was figuring out how to help learners maintain the feeling of social connectivity as a group, as well as a sense of well-being, even when studying online. Another thing we paid a lot of attention to was how to build and maintain a safe learning environment and encourage dialogue online. These are things our trainings are known and praised for, in addition to the quality of the content, and what our customers hold in high regard.” Viivi says she sighed with relief, and even wrote about it on social media, when she received customer feedback thanking Positive Education for creating such a warm, welcoming, and inspirational atmosphere in their online learning environment.


So, what did Viivi and Positive Education do right?

Here are some tips on how to create a safe and encouraging atmosphere for your online course environment and how to ensure the feeling of social connectivity even when studying online.  

1) People first – social connections are a must! For Positive Education, it is important that people know and trust each other when they learn together. They use live online sessions actively to strengthen the social connectivity of the group, not so much for teaching new content, and without very intense learning objectives. They have assignments through which learners can get to know each other, share ideas, and team up. “At the beginning of each learning module, we had people call a study partner from the group and just chat about how everyone was feeling, as a way to add social connectivity,” Viivi says.

2) Good instructions with a personalized message create a safe learning environment. Remember that not everybody is used to, or comfortable with, studying online. For some people, just logging on to a new platform might cause anxiety and negative feelings that, in turn, make learning less than optimal. Good instructions and welcoming introductory videos on how the learning process will unfold give people a feeling of safety, because now they know what is coming and how to proceed. A personalized greeting from the teacher also gives learners a sense that they are important, and that the teacher is “walking with them” on the study path. “I am terrible at following instructions and impatient if I don’t understand something right away, which makes me pretty good at writing instructions, because I make them SO clear that even I would understand them!” Viivi laughs.

3) Take breaks, recovery, and wellbeing seriously. Live training includes many things that enhance wellbeing that are not present when studying online. In live training, most people get energy from social interaction, discussions, and even the coffee breaks during teaching sessions. It is important to include restorative practices, breaks, and wellbeing sessions, such as mindfulness moments, in an online learning experience to strengthen wellbeing and energy levels during study time.

4) You get learning engagement by engaging people. A successful online learning experience is one where the learner feels engaged throughout the learning process and completes the learning tasks. If the learning experience is not engaging, people drop out. So, to get more engagement, you need to involve and engage people in the learning process personally. Good ways to do this are asking questions, adding personal reflection and commentary assignments, and using activating, emotional hooks to catch learners’ attention.

5) Use humor and activate positive emotions as much as possible! Humor and shared positive emotions connect people. Laughter clears out tension and anxiety, and it often arises together when teams feel secure enough. It is a good idea to use laid-back wording in your instructions, materials, and assignments, and to smile during video conferences. Laughing at your own mistakes, and showing your learners that even though learning might not always be fun, you can still have fun in your learning group, helps your learners relax, too. Or why not share jokes at the beginning of the Zoom session?

“Here is one of our favorites, shared with me by my son: most students know that pressing Ctrl+F4 exits a game or a meeting. So, when teachers ask students for advice on, for example, how to turn the brightness up or the camera on, they should be prepared for the reply: …just press Ctrl+F4! – bye-bye, teacher!”


So, what do you think? Have you remembered to include elements in your online courses that foster social connectivity, dialogue, and feelings of togetherness, and that make the environment psychologically safe and engaging?

Not to worry if this has been forgotten during all the hassle: you have plenty of time and iterations to start making small changes and additions, one by one. And of course, we are happy to help!

Our mini course on the topic will be available for purchase this spring. If you are interested, contact us at to be the first to get it.

About the Author

Solja Sulkunen works as Head of Customer Experience at Claned Group, creating the best possible learning experience for all Claned users. Every day, she ensures fast and easy deployment of the platform for our new users.

A buyer’s cheat sheet to UX in online learning platforms

Perhaps the biggest unknown when choosing an online learning platform for your organization is whether people will actually use it. And beyond that: will they like it and find it useful? These are questions of user experience (often abbreviated as UX). For decision makers, this topic is a beast of a problem, as it is equal parts vague and important.

We often see buyers and evaluators try to assess the user experience of a learning platform during the purchasing process, but this usually produces platitudes and hand-waving instead of useful information. As a UX person, I have insight into getting the answers you are looking for in order to make good decisions. This is your hands-on guide to getting those answers.


Why UX matters when selecting a learning platform

First things first. Why should you care about UX?

Whatever term you choose for the user experience of a learning platform (we often see intuitiveness, usability, ease of use, perceived value, and end-user value used as rough synonyms), the effects on your business are the same.

Good user experience in the platform will positively affect:

  • How much the platform is used (which is good, unless you are billed by the amount of usage :))
  • How likely learners are to persist until the end of your course
  • How likely learners are to recommend your services to others
  • How likely learners are to make additional purchases

In a nutshell, learners and educators will not separate the learning platform from your content and pedagogical design in their experience, so don’t waste your investments in content and pedagogy on a bad learning platform choice.


Good metrics

Okay, so what data should you collect and how?

Piloting with a representative population of learners and educators, even at a small scale, is by far the best way to tell if the product is a good fit. When piloting, try to actually utilize the new affordances created by the platform; don’t just copy over the existing learning process. You wouldn’t buy a car without driving it first, and when you drive it, you would also try the features your current car doesn’t have.

Collect feedback on the overall experience as well as pain points, and actually use it to inform your decision. We often see organizations that pilot learning platforms disregard the end-user feedback in the end. We recommend asking specifically about the following factors:

  • Do they understand how to use the platform? (measures perceived ease of use)
  • Do they feel they trust the platform? (measures perceived trustworthiness)
  • Do they feel they have the necessary skills and information to use the platform? (measures perceived capability)
  • Do they feel the platform is useful? (measures perceived usefulness)
  • Do they like the platform? (measures enjoyment and attitude)

These questions have been proven to be good predictors of use in e-learning systems.
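As a rough illustration, assuming you collect these five factors as 1–5 ratings (all names and numbers below are made up), a small script can average the responses and flag the weakest areas:

```python
from statistics import mean

# Hypothetical pilot feedback: each respondent rates the five factors on a 1-5 scale.
responses = [
    {"ease_of_use": 4, "trust": 5, "capability": 3, "usefulness": 4, "enjoyment": 4},
    {"ease_of_use": 2, "trust": 4, "capability": 2, "usefulness": 3, "enjoyment": 3},
    {"ease_of_use": 5, "trust": 5, "capability": 4, "usefulness": 5, "enjoyment": 5},
]

# Average each factor across respondents to spot the weakest areas.
factors = responses[0].keys()
averages = {f: round(mean(r[f] for r in responses), 2) for f in factors}

# Flag factors scoring below a chosen threshold as pain points to investigate.
pain_points = [f for f, avg in averages.items() if avg < 3.5]
print(averages)
print("Investigate:", pain_points)  # here, perceived capability is the weak spot
```

The 3.5 threshold is an arbitrary choice for the sketch; the point is to turn free-form pilot feedback into something you can compare across providers.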



Of course, piloting and user tests are not always possible, for the usual reasons (money or time constraints). In that case, the following questions will give you some indication of the UX focus of the learning platform provider. The questions are intentionally concrete, to leave as little wiggle room for ambiguous answers as possible.

Ask about the ratio of designers to developers in the product team. A UX-focused team has at least one designer per five developers (a ratio of 1:5 or better), and of course, the more designers the better (to a point, but we rarely see the opposite problem). If there are combined roles (like a UX developer), count those as half. Front-end developers should almost never be counted as design resources, save for some unicorns.
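The rule of thumb above can be sketched as a small check. The half-weighting of combined roles and the 1:5 threshold follow the paragraph; the function itself is just an illustration, not a formal standard:

```python
# Sketch of the staffing rule: combined roles (e.g. a "UX developer") count
# as half a designer and half a developer, and the team should have at least
# one designer per five developers.
def meets_design_ratio(designers, developers, combined_roles=0, max_devs_per_designer=5):
    effective_designers = designers + 0.5 * combined_roles
    effective_developers = developers + 0.5 * combined_roles
    if effective_designers == 0:
        # No design capacity at all is an automatic fail.
        return False
    return effective_developers / effective_designers <= max_devs_per_designer

print(meets_design_ratio(designers=2, developers=9))                     # 4.5:1 -> True
print(meets_design_ratio(designers=1, developers=8, combined_roles=2))   # 9:2 = 4.5:1 -> True
print(meets_design_ratio(designers=1, developers=12))                    # 12:1 -> False
```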

Ask how many updates were made to the product during the last year. A truly agile and user-focused team ships at least five major updates per year; size does not substitute for frequency here. When counting, disregard bug fixes and security updates if release notes are available. Also make sure you will get continuous updates after the purchase, instead of being stuck with whatever version you buy.
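If release notes are available, the counting rule could look something like this sketch; the version scheme and note wording are hypothetical assumptions, so adapt the filter to the provider's actual notes:

```python
# Illustrative: count major updates from a year of release notes, ignoring
# bug-fix and security releases. Versions and notes are made-up examples.
releases = [
    ("2.0",   "Major redesign of the course builder"),
    ("2.0.1", "Bug fix: broken file upload"),
    ("2.1",   "New analytics dashboard"),
    ("2.1.1", "Security update"),
    ("2.2",   "Discussion threads on materials"),
]

def is_major(version, note):
    # Treat x.y releases as major, and skip anything labelled as a fix.
    lowered = note.lower()
    return version.count(".") == 1 and "bug fix" not in lowered and "security" not in lowered

major_count = sum(1 for v, n in releases if is_major(v, n))
print(major_count)  # 3 major updates in this example, below the 5-per-year bar
```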

Ask about overall end-user feedback from organizations similar to your own. Ask the prospective provider for statistics, not quotes or testimonials. It is especially useful to get an idea of customers’ and users’ willingness to recommend the product. Results from formal usability tests are also a good indicator, but are rarely available.


Bad metrics

There are, of course, many pitfalls and seemingly good questions that don’t really have anything to do with what you are trying to measure. Here are some common fallacies:

The number of matching feature bullet points has little to do with how useful the product is when it’s actually used. The added value is created in a dialogue between the user’s needs and the opportunities the platform creates; these are not separate phenomena. Feature lists are important, but they should not trump results from real user testing.

Asking about the design approach is too vague. There’s too much wiggle room here; any team can claim to be user-centered, mobile-first, agile, lean, and whatever else is the hot topic now.

Crude usability metrics, such as the number of clicks required to navigate to any point in the product, are minuscule parts of the bigger picture. These are definitely a part of the user experience, but should not be mistaken to indicate it as a whole.

First impressions of people who are not representative of end users (i.e., learners or educators) are another pitfall. IT experts and administrators have important insight into the process, but they can’t be expected to predict the experience of a different kind of user.


Red flags during the sales process

Here are some worrying behaviors we sometimes see from representatives of platforms where the user experience has been an afterthought. These should help you weed out the worst choices from a UX standpoint.

When you ask about user experience, pedagogy, or end-user feedback, the discussion is steered back to a “more substantive” topic, such as administration, infrastructure, pricing, or information security.

The sales deck has no real content about the learner experience or the pedagogical affordances created by the product, only a few buzzwords that don’t correspond to any features you see in the product demo.

Requiring a sizeable commitment and investment before being able to do a pilot or trial project with the platform.



Trying a prospective learning platform in a real scenario with actual learners and educators is the way to go, if at all possible. If it isn’t, ask concrete questions about learner feedback, the provider’s design staff, and their product development schedule, with as little wiggle room for interpretation as possible.

Hopefully, this UX cheat sheet is useful to you! Let us know if you have other useful tips or insights.


Relevant sources

Lee, Y. H., Hsieh, Y. C., & Chen, Y. H. (2013). An investigation of employees’ use of e-learning systems: applying the technology acceptance model. Behaviour & Information Technology, 32(2), 173-189.

Šumak, B., Heričko, M., & Pušnik, M. (2011). A meta-analysis of e-learning technology acceptance: The role of user types and e-learning technology types. Computers in Human Behavior, 27, 2067–2077. doi:10.1016/j.chb.2011.08.005

Zaharias, P. (2009). Comprehensive usability design in the context of e-learning: A framework augmenting ‘traditional’ usability constructs with instructional design and motivation to learn. International Journal of Technology and Human Interaction, 5, 37–59.

About the Author

Miska Noponen worked as a user experience and concept development lead at Claned Group. He is a multi-talented UX designer, ardent pedagogue, computer programmer, and lifelong learner.

Three key benefits to demand from learning analytics

The best thing about learning analytics is that it sheds light on issues surrounding learning that are usually hidden from the naked eye. That is, it provides ways to peek into the details of the learning process. More importantly, analytics provides new ground for evidence-based learning interventions without removing the need for a skilled educator; instead, it highlights the importance of pedagogical expertise.

Education differs from many other fields that have been revolutionised by data. As it is closely tied to human interaction, decision making will remain in the hands of skilled educators. Learning analytics does not offer quick fixes. Instead, it provides an effective way of improving educational programs and making effective interventions. And the impact of analytics is directly linked to the quality of the analytics you have access to. So, if you are about to invest in a new learning environment that promises analytics, or you already receive analytical views into the training process, let’s ask some critical questions about what matters most in learning analytics.

Can I increase the engagement on my course?

The most crucial problem in online learning is the frustration of facing ambiguous challenges and getting stuck. Too often this is caused by a confusing or unsuitable section of the course, which can hamper even the best material.

To enable corrective action, learning analytics should highlight the sections of a course that are challenging. On the other hand, it should also show which parts work and create engagement.

Additionally, you should be able to identify the challenges faced by the whole learner population or just by particular groups of learners. Thanks to automatic keywording of materials using natural language processing, you should, in addition to single materials, have the option of viewing this at the topic level: for example, which of the central themes in a course are problematic.
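As a minimal sketch of what such topic-level views boil down to, assuming materials already carry keywords (hard-coded here rather than produced by real NLP, and with invented completion figures):

```python
from collections import defaultdict

# Hypothetical materials with topic keywords and per-material completion rates.
materials = [
    {"title": "Intro video",         "topics": ["basics"],                  "completion_rate": 0.92},
    {"title": "Statistics primer",   "topics": ["statistics"],              "completion_rate": 0.41},
    {"title": "Regression exercise", "topics": ["statistics", "exercises"], "completion_rate": 0.38},
    {"title": "Group assignment",    "topics": ["exercises"],               "completion_rate": 0.75},
]

# Roll completion rates up from single materials to the topic level.
by_topic = defaultdict(list)
for m in materials:
    for topic in m["topics"]:
        by_topic[topic].append(m["completion_rate"])

# Average completion per topic reveals which central themes are problematic.
topic_rates = {t: sum(rates) / len(rates) for t, rates in by_topic.items()}
problem_topics = sorted(t for t, rate in topic_rates.items() if rate < 0.5)
print(topic_rates)
print("Needs attention:", problem_topics)  # ['statistics']
```

Here the 0.5 cut-off is arbitrary; the useful part is that a weak topic shows up even when its materials are scattered across the course.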

So, after setting the initial question and goal (in this case, engagement), learning analytics can highlight the sections in need of a fix and the parts that already work. This offers a basis for identifying the targets of improvement and measuring the effects afterward. However, no technology does, or will do, the actual work of designing appropriate learning activities. But with the support of analytics, content iteration will be more successful.

Can I personalize my course for different kinds of learners?

Learners are different. They have varying amounts of experience, different strengths, and diverse interests. Some might find your course too easy while others struggle. Just how different they are, and where they differ, is for learning analytics to reveal.

Learning analytics should be designed to identify groups of participants, or individuals, with specific needs or challenges. As analytics can identify distinctions in behavioral patterns, you can focus on how these relate to performance, the quality of social interaction, and background variables, such as professional role in a company.

Once you have identified the differences between learners, you can provide tailored paths based on their interests and backgrounds: for example, offering background information to those with less experience in a topic, or more advanced material to those who need more of a challenge.

You should also be able to bring together groups from two different backgrounds by first providing a path for each to prepare them for the subject, and then offering a shared part where they can exchange their knowledge. Imagine all this with the help of effective learning analytics, using your existing materials.

Can I see whether my training programs increase work effectiveness?

Whether you work in a company offering training programs or are responsible for courses inside an organization, the holy grail is showing the real impact of training. That is, how actual performance changes as a result of training.

An example of an industry transformed by data is finance, an area that naturally lends itself to smart technologies. It has clear metrics and outcomes. As a result, algorithms that optimise the desirable outcomes can work their magic. Learning, on the other hand, is another story.

In a learning process, the contributing factors, the nature of the information or collected data, and even the results are not easily defined. However, this is slowly changing as more developed technologies, such as natural language processing, can make sense of unstructured data.

Nevertheless, by combining learning data with performance indicators, it is possible to reveal what kind of impact training programs are producing. These outcome metrics can include customer service, sales, or other measurable results. For education providers, the possibility of finally showing what they can deliver through their training programs is now within reach, and it will soon distinguish the best in the business.
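A toy example of this kind of combination, with entirely made-up per-employee figures, might compute a simple Pearson correlation between training activity and an outcome metric; real impact analysis would of course control for many more variables:

```python
from statistics import mean

# Hypothetical per-employee training activity and a business outcome.
training_hours = {"anna": 12, "ben": 3, "carla": 9, "dev": 1, "eva": 7}
sales_growth   = {"anna": 0.15, "ben": 0.02, "carla": 0.11, "dev": 0.01, "eva": 0.08}

# Join the two datasets on the employee and compute Pearson's r by hand.
names = sorted(training_hours)
x = [training_hours[n] for n in names]
y = [sales_growth[n] for n in names]
mx, my = mean(x), mean(y)
cov   = sum((a - mx) * (b - my) for a, b in zip(x, y))
var_x = sum((a - mx) ** 2 for a in x)
var_y = sum((b - my) ** 2 for b in y)
r = cov / (var_x * var_y) ** 0.5
print(round(r, 2))  # strong positive association in this fabricated data
```

Correlation alone is not causation, but a consistently strong link between engagement in training and an outcome metric is exactly the kind of evidence this section argues for.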

Arm yourself with goal-oriented analytics

When we started our journey of developing Claned, we wanted to set the new standard for learning analytics and how it supports successful training programs. To do this, we had to create the best learning analytics in the business. And we did, and it is now available to enhance training programs anywhere. We believe an investment in learning platforms should come with a clear view of ROI, for example, through reduced drop-out rates, improved learning results, employee satisfaction, and improved sales.

We believe the use of learning analytics should relate to your training goals from the start.

That is why, at Claned, we focus on two main ways of using learning analytics to help education providers. First, our learning platform holds inbuilt analytics for real-time follow-up at the education program level; you get these for any course implemented on the platform. Secondly, we offer tailored data consultation services to answer specific challenges. Evidence-based decisions and personalization based on gathered analytics can significantly improve the experience for your learners.

If you want to hear more about how we can help you achieve new heights of success for your training programs, get in touch and we will tell you more.

About the Author

Dr Topi Litmanen worked as Chief Educational Scientist at Claned Group. He was responsible for ensuring that the pedagogical aspects of Claned are based on evidence-based learning research.

Preparing your organisation for AI supported learning

Supporting learning with data means using analytics to empower learning and to make decisions based on data-backed evidence.

Smart new technologies, such as machine learning and data mining, have made significant progress in recent years, and their impact is growing in many areas of life. The finance sector is a prime example of how data and machine learning are used to optimise business processes. The field of finance naturally lends itself to data: it runs on clear metrics and has precise targets for optimisation.

Organisational learning, on the other hand, is a field with varied goals and fuzzy data. That is the main reason the field has largely lacked analytics and data-powered decision making. However, technological developments are reaching a stage at which these obstacles can be overcome.

HR is naturally positioned to become a true leader in digital learning. This transition has started, and it will gain strength with time. The phases and tools of the process are described in Graph 1 and explained in the following text.

Graph 1. The phases of leading learning with data  

Scattered data
The data is already here, but where exactly?  

Most organisations already collect and hold massive amounts of learning-related data. Digital platforms, online courses, tests, competence appraisals, self-reports, course feedback, and other similar systems produce a wealth of information. One of the key challenges is that this data is located in multiple unlinked systems. To use it, one needs to extract it from various sources and combine it manually. As a result, HR is left periodically producing one-off reports with cross-sectional analyses of the current state of affairs.

Manual reporting, even if partially automated to reduce the amount of work, is not real-time. It focuses on a pre-selected set of explicit variables, is rudimentary, and offers fewer possibilities for insight than more modern solutions.


Combined data and dashboards

Once an organisation combines its databases, or builds interfaces for that purpose, more efficient data-enabled learning can begin. Combining learning management system data, educational data, competency evaluations, and performance data brings insight into how all of these relate to one another. Ideally, this information is presented in a simple dashboard providing real-time analytics about learning and development. It can show where learners are spending time, what they are engaged in, and what the evident obstacles to development seem to be. Correlating the educational variables with business results, such as sales data, provides ways to explore the progress and effects of specific programs or interventions.

For the most part, dashboards are still constrained by human deficiencies in decision making. They only show what they were designed to show. We are inclined to focus on the explicit relations between the most obvious variables, while many of the implicit links between causes and effects remain hidden from a human investigator.


Predictive insights

Even with the ill-defined datasets related to learning and development, advances in natural language processing and image recognition allow algorithms to make sense of the contents and contexts of materials. That is, they can mine the data points for meaningful correlations that often escape the naked eye, such as relations between implicit, hidden variables, and draw on historical data and decisions within the organisation. They avoid some of the pitfalls of human decision making and can outperform even the most experienced human practitioners. An effective way to gain insight into learning data is to merge desired outcomes, such as sales results or customer feedback, with usage patterns in a learning platform. This highlights the effectiveness of different ways of engaging with the learning possibilities an organisation offers. The information can then be used to adjust the learning programs for future learners.

These systems can be assigned to identify learners who are not participating, or whose skills are in danger of lagging behind. They can also highlight knowledge gaps and strengths within an organisation. In parallel with helping HR professionals, the same algorithms can be harnessed to serve the learners. Indeed, the next level of learning systems is one that can make accurate recommendations for learners, educators, and HR.


Dynamic, actionable recommendations

Web stores and social media are effective at interpreting our interests and recommending products or services that appeal to us. The same technology can be applied to support learning. It enables an organization to deliver truly tailored recommendations for ‘just-in-time’ learning and personalised training programs for each employee, instead of fixed courses designed for the masses.

When a system has an understanding of the needs, interests, and learning activities of employees, it has a robust set of data to draw conclusions from. It can recommend materials, activities, and interactions to a specific learner based on identified needs. Further on, a learning platform can make accurate recommendations to future learners about which actions would benefit them, based on previous learners’ activities.
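A minimal sketch of such activity-based recommendations (not Claned's actual algorithm; the learners and materials are invented) could score unseen materials by how much study history a learner shares with the people who used them:

```python
# Each learner maps to the set of materials they have worked with.
history = {
    "maria": {"intro", "stats", "regression"},
    "juha":  {"intro", "stats", "visualization"},
    "li":    {"intro", "visualization"},
}

def recommend(learner, history):
    seen = history[learner]
    scores = {}
    for other, items in history.items():
        if other == learner:
            continue
        overlap = len(seen & items)  # similarity = number of shared materials
        for item in items - seen:    # only suggest materials the learner hasn't used
            scores[item] = scores.get(item, 0) + overlap
    # Highest-scoring unseen materials first.
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("maria", history))  # ['visualization']
```

Real recommender systems add content features, recency, and outcome data on top of this kind of overlap scoring, but the core idea of learning from previous learners' activity is the same.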

The next development in this process will be automated learning paths that use the materials inside an organisation and automatically provide appropriate tests to measure learning and motivation.



Digitalisation and data are not solutions to every problem. Leading with data is about developing new ways of operating. It is slow; it requires work and most of all, it requires a comprehensive understanding of current operations. The first step is recognising the current state of learning data in an organisation and designing the steps to take the process forward.

Data-driven systems do not replace effective competence management, but HR professionals that refuse to leverage available data for this purpose will be replaced by those who do.

This article was also posted in the EAPM Newsletter.

About the Author

Dr Topi Litmanen worked as Chief Educational Scientist at Claned Group. He was responsible for ensuring that the pedagogical aspects of Claned are based on evidence-based learning research.

Design to support learning

You could have the most professional learning content in the world, but if your visual composition isn’t well designed, it will be harder to engage your learners. To make it easier for your learners to absorb the content you are offering, you should pay attention to its visual composition.

Start by thinking about your content: what is the most important thing your learners should take away from your material? Can your learners easily get an idea of what your content is all about? The visual composition of your content should ideally also support learners who just want to skim it. The “essence” of the learning materials should be obvious from the start.

Principles of Effective Visual Design

When you are building course materials for your learners, your content and your design deserve equal attention and should ideally work in harmony. Just as you carefully choose your words, you need to choose focal points for your visuals that help deliver the learning experience effectively. There are some universal visual design principles you might want to get acquainted with.

Gestalt Principles are in the Mind, Not the Eye

Gestalt principles in visual design help users perceive the overall design as opposed to individual elements. If the design elements are arranged properly, the Gestalt of the overall design will be very clear.

The Gestalt principles are a set of laws arising from 1920s psychology, describing how humans typically see objects by grouping similar elements, recognizing patterns, and simplifying complex images. Designers use these principles to engage users via powerful, yet natural, “tricks” of perception and best-practice design standards.

The Gestalt Principles are pivotal in UX design, notably in interfaces, as users must be able to understand what they see—and find what they want—at a glance.


Some useful visual design principles:

Unity means that all elements on a page visually or conceptually belong together. The visual design must strike a balance between unity and variety to avoid a dull or overwhelming design. When creating learning materials, unity means similar concepts and themes are conceptually and visually aligned. This way the visual composition supports the learning topics.

The use of space helps reduce noise and increase readability. White space (empty areas) can be a surprisingly important part of your layout strategy, so try leaving more space between graphical elements, larger margins, and more generous line spacing.

Hierarchy shows the difference in significance between graphical elements and signals importance visually. Hierarchies are often created through different font sizes, colors, and the placement of elements on the page. So, you might make a more important element bigger and bolder, and a less important element smaller and fainter.

Scale is also a useful tool for communicating hierarchy, drawing attention toward or away from certain elements and thus signifying their importance.

Contrast focuses on making items stand out by emphasizing differences in size, color, direction, and other characteristics.

Dominance focuses on having one element as the focal point and others being subordinate. This is often done through scaling and contrasting based on size, color, position, shape, etc.

Similarity refers to creating continuity throughout a design without direct duplication. Similarity in the use of colors, the shapes of elements, and text styles, for example, is used to match the different pieces visually and make them work together in a layout.

By using these universal visual design principles, you can produce learning materials where the “essence” of the material will be obvious.

Hopefully these tips are useful to you when designing your next learning materials.

Let me know if you have any specific challenges! I would love to hear from you. Send me an email at

About the Author

Virve Johansson heads Claned’s branding, marketing, and product design, ensuring the platform is intuitive and user-centric. She is passionate about data and information visualization and about building the Claned brand.