We need to design services that support users who have difficulty accessing them, including the 25% of adults who do not use the internet or who lack the skills or motivation to access the services we provide.
We may well find it more helpful to think about Designing Inclusive Services rather than designing services which provide for the Assisted Digital User.
Claire – Confessions of an Assisted Digital Researcher:
AD (Assisted Digital) Users: Finding an assisted digital user is problematic. People do not understand the concept of AD and we can’t find users to test our services. Once we stopped looking for AD Users and just worked with people, we found people with AD needs.
In short, there are no AD users, trying to find them is to embark upon a Unicorn Hunt.
The Users whose needs we must plan for may not be digitally illiterate; they may well have problems accessing the service because of their personal context rather than their skills.
Claire related a number of case studies, all with the same theme – we need to plan for and work with the Anxious User. These anxieties stemmed from their personal circumstances, not necessarily their digital skills. Users’ contexts matter; their feelings matter. Their digital skills may or may not be relevant.
“Look for people who need help”
Ben – A Review of the purpose of the Assisted Digital team – Inclusive Service Design.
Service design may well be impacted negatively by applying a blunt understanding of Assisted Digital. To some extent we are all AD Users; the service itself should be the support for AD Users, and this is not contingent upon it being a digital service. Many people can’t/won’t access [online] services because of their context or skills.
Some people just need an offline service. Those offline services that people need are often badly designed. The cost to Government of poorly designed services is huge because they fail regularly and require human intervention to rescue the User.
Ben is promoting the holistic design of a service, including the offline channel, so that Users can make the most cost-effective choice of channel when accessing a service, and so that episodes of ‘failure’ requiring human intervention to rescue the User are reduced.
Perhaps this is more an issue of culture in Government, which may see the Digital Service as distinct from the Service itself.
Hypotheses (the highlights):
- Service designers do not apply the same standards to all channels of the service.
- The largest volume channel seems to be five times more likely than any other channel to receive design attention.
- Service Teams seem to measure KPIs for online services more readily.
- The User Centred Design approach for the online channel permeates through to the other channels.
- Service teams can’t measure or iterate every User Journey in every channel because those journeys may take the user beyond the remit of the design team.
- AD team saw low awareness of 3rd party services.
“AD simply means providing a complementary channel to enable those who can’t/won’t access the online channel.”
Ben shared a number of insights as to how User behaviour and Service design can be impacted upon by external factors such as KPIs.
“Service Teams may lack the strategic, decision making or the policy making authority to design truly inclusive services.”
John – What are we really looking for in a Service Assessment?
The GDS Design Principles say:
- This is for everyone (Principle 6)
- Understand Context (Principle 7)
- Build Digital Services not Websites (Principle 8)
Do Good Research:
- Understand the variety of Users who use the service.
- Consider different levels of Access, Skill & Confidence
- Consider the support needs of those Users.
- Understand current behaviour on the “as is” process.
- Understand how people think about support and how they access it.
Identify the biggest barriers and failure points:
- What are the biggest barriers and failure points and how will you design them out?
- What needs aren’t we meeting?
- What drives the need for support? (can we design those needs out of the service?)
Research and test the end to end service:
- Explore the entire sequence of the service including the parts preceding the digital service. (For example: Have you seen people receive the letter triggering the process and act upon it?)
- Include support options in the beta service, do user testing on them.
- Gather performance data and feedback for the support options.
Make Research Inclusive:
- Watch for bias in recruitment of participants.
- Watch for bias in methods (some people will decline to take part in groups or to be filmed; their views must be included too).
- How carefully are users categorised?
Do Research in Context:
- Explore how context affects the way Users interact with the service.
- Watch how they seek support and watch them get that support.
- Work with people away from the lab and in their life context.
What will the evolution of the Digital Inclusion Scale look like?
It needs to evolve. It needs to reflect device dependency. It needs to be less linear. It needs to broaden to encompass users’ niche needs. Maybe call it “harder to meet needs”. Digital Exclusion and Social Exclusion are closely linked.
Should services be designed to work across hand held devices and computers?
Designing something to fit all those devices might not mean squeezing the same service into the different devices. It may also mean designing the service as having distinct channels where the service operates differently on the different devices.
How should we recruit hard-to-reach users?
The most effective way is to work with the support organisations who help hard-to-reach users. It is possible to incentivise their help by paying for room hire etc. It also works when you deploy ethnographic strategies and spend time in the environment until your presence is normalised. It may also be possible to deploy a “refer a friend” strategy. A small number of the right users is better than a large number of AD users who hardly use the service at all.