The future of design is words and data

Sometimes, when you view a collection of objects together, you notice qualities, characteristics and connections that you wouldn’t have spotted had you viewed each object individually.

Daniel Harvey, Pamela Pavliscak, Sarah Doody and Andrea Resmini were amongst the speakers at Interact London 2016. Four individually fascinating and excellently delivered talks.

The theme of Interact London was “How will people interact with technology in the future, and how do we design for it today?” On the flight back to Stockholm, I reflected on the conference, and on the presentations of Daniel, Pamela, Sarah and Andrea in particular.

Conversation is the command line of tomorrow

Daniel Harvey talked about how conversation is the command line of tomorrow.

“Software is eating the world” said venture capitalist Marc Andreessen in a 2011 Wall Street Journal article.

Daniel said that if software is eating the world, then it’s becoming clear that messaging is eating software. Chat is going to be a very important part of our future; it just keeps on growing.

The number of messages sent globally using one particular messaging app — WhatsApp — whizzed past the global number of SMS messages by the end of 2014. For a few years now we’ve been messaging more than talking.

Both e-commerce and customer service are moving to chat. WeChat in Asia is leading the way with in-chat transactions. Facebook Messenger and others are also joining in. Customer service in the form of chat is often available on service providers’ websites, as well as in other channels — the ability to chat directly with your driver in the Uber app is one such example.

It’s not just human-to-human chat that’s increasing. Chatbots are helping us with various tasks and queries. Sometimes they are simple bots that post information automatically — such as daily weather forecasts, new Trello cards to Slack, or roaming costs when we arrive in a new country.
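To give a feel for how little such a bot needs, here’s a minimal sketch (in Python) of a script that pushes a daily weather summary into a Slack channel via an incoming webhook. The webhook URL and the forecast text are placeholders of my own, not anything from Daniel’s talk.

    # A minimal "dumb" bot: it doesn't converse, it just posts on a schedule.
    # Assumes a Slack incoming webhook URL (placeholder) and the requests library.
    import requests

    SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

    def post_daily_forecast(forecast_text: str) -> None:
        """Push a plain-text message into a Slack channel via an incoming webhook."""
        payload = {"text": "Good morning! Today's forecast: " + forecast_text}
        response = requests.post(SLACK_WEBHOOK_URL, json=payload, timeout=10)
        response.raise_for_status()  # fail loudly if Slack rejects the message

    if __name__ == "__main__":
        # In practice the forecast would come from a weather API; hard-coded here.
        post_daily_forecast("light rain, 8°C, clearing in the afternoon")

Run it from cron (or a scheduled cloud function) and you have a bot of the simplest kind.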

In other situations the bot is more advanced, making use of natural language processing to respond to our queries — artificial narrow intelligence. It’s learning and inferring rather than thinking. Narrow AI is making our interactions with chatbots a bit more human and a bit less like ELIZA.
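Here is a toy sketch of what that difference looks like in practice: instead of ELIZA-style hand-written response patterns, a narrow-AI bot learns to map utterances to intents from labelled examples. The utterances, intents and library choice (scikit-learn) are illustrative assumptions on my part, not a recipe from the talk.

    # A toy intent classifier: the bot infers an intent from examples it has
    # seen, rather than matching hand-crafted scripts the way ELIZA did.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    training_utterances = [
        "what's the weather like today", "will it rain tomorrow",
        "how much does roaming cost here", "what are the data charges abroad",
        "add a card to the sprint board", "create a new task for me",
    ]
    training_intents = [
        "weather", "weather",
        "roaming", "roaming",
        "task", "task",
    ]

    intent_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    intent_model.fit(training_utterances, training_intents)

    # "rain" only appears in the weather examples, so this should map to "weather".
    print(intent_model.predict(["is it going to rain later?"]))

A real bot would of course train on far more data and hand the detected intent to a dialogue manager, but the shift is the same: the words are the input, and the mapping is learned rather than scripted.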

A screenshot of a conversation between a user and ELIZA. Image borrowed from sart3480.

In a chat-based world, the words are the UI. Rather than tapping or clicking, we’re writing and responding. It’s a dialogue. Crafting the right words becomes interaction design.

When Your Internet Things Know How You Feel

Pamela Pavliscak talked about how emotional data will be collected to augment the existing data layer — When Your Internet Things Know How You Feel.

There’s a huge amount of data being collected about us and what we do. Machines already know a lot, but Pamela points out that this vast quantity of data is lacking emotion. Yes, we can like, heart and favourite things, but it’s still just a superficial click — it’s not genuinely emotive.

An animation showing Twitter’s popping “fave” heart icon.

Genuine emotions will become another input into our data-driven ecosystems, bringing with them a whole load of ethical considerations.

Facial coding, voice and tone recognition, pulse rate, textual analysis, personality profiling. “Tomorrow would be a better day to argue with your partner”. “The person approaching you is potentially hostile”. “That hug had a score of 47”.
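As a crude illustration of just one of those inputs, textual analysis, here’s a sketch that scores the emotional tone of a couple of messages with NLTK’s VADER sentiment model. The messages, the threshold and the idea of feeding the score into a wider “emotion layer” are my own assumptions.

    # Scoring the emotional tone of text with VADER (a lexicon-based model in NLTK).
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)  # one-off download of the lexicon
    analyser = SentimentIntensityAnalyzer()

    messages = [
        "This is wonderful news, thank you so much!",
        "I'm so frustrated and disappointed right now.",
    ]

    for message in messages:
        scores = analyser.polarity_scores(message)  # neg/neu/pos plus a compound score in [-1, 1]
        mood = "positive" if scores["compound"] > 0.05 else "negative or flat"
        print(mood, round(scores["compound"], 2), message)

Whether it’s a hug sensor, a tone-of-voice analyser or a text scorer, the pattern is the same: a messy human signal reduced to a number that other systems will then act on.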

A couple hugging: “That hug had a score of 47”. Image credit: h3h (CC BY 2.0).

When machines know how we feel, and are capable of predicting how we will feel and respond, the question becomes if and when they should. Does my hug really need a score? How much will AI and emotionally enabled devices influence my behaviour and (subconsciously) affect how I think?

Knowing that a person has a “low hug score” could prime me into expecting that the next hug from that person will also be poor.

The behavioural psychology that comes into play in this future requires not only a good understanding of what is happening, but also strong design ethics.

Those design ethics need to come from us. We can’t rely on organisations and businesses to develop and uphold these ethical standards on our behalf.

“Design for an ethical future” — Pamela Pavliscak

Anticipatory design & the invisible interface

Sarah Doody talked about finding the balance between anticipation and automation in UX design: surfacing information and choices at timely and contextually appropriate moments, and how in some situations we can use data to eliminate the need for choice completely.

This could perhaps lead to deskilling and a degree of dysfunctionality. Over time, we would lose our ability to handle certain situations or pieces of information as they become less frequent and less practised.

“If you’re not anticipating people’s needs, then you’re doing it wrong.” — Sarah Doody

By making more information available to the (digital) agents that assist us, we increase the likelihood that they can anticipate our needs and proactively help us. It can help us make better decisions and perhaps even avoid decision fatigue.

We can contrast this with the point Pamela raised: that data (and emotionally enriched data in particular) raises ethical dilemmas and leads to questionable influence over our thoughts and actions.

Ethical considerations aside, the more we allow our data to travel, the more interconnections it builds. We’re lining up the dominoes, putting on a wonderful show when it’s all done — but if one falls over too soon, it’s hard to stop and we’ve got quite a mess on our hands.

One example Sarah gave was how the email data mining feature of iOS had found an email containing an invitation to an event. From the email, a tentative calendar entry had been automatically added to Sarah’s calendar.

A slide from Sarah’s presentation showing her calendar and the mysterious tentative appointment. Image taken from Sarah’s Interact London 2016 presentation.

The first thing Sarah knew about the event was when she received a notification saying it was time to start driving if she was going to make it in time. The dominoes were falling over — one thing attempting to be helpful was triggering another thing trying to be helpful, and together they were being utterly unhelpful.

In a connected world where our data is passed between multiple entities, we need to design these intersections. Consideration needs to be given to the paths our data will take and the possible outcomes that these data-journeys could generate.

Designing how to deal with and anticipate these “wayward paths” will be crucial. This is affordance in the context of machine-to-machine interaction: designing so that we avoid creating confusion and misunderstanding for an actor.
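One hypothetical way to design that affordance (my own sketch, not something from Sarah’s talk) is to gate proactive behaviour on how the underlying data arrived, so that a tentative, machine-inferred calendar entry can’t set off the dominoes on its own.

    # Proactive nudges are only chained onto data a person has actually confirmed.
    from dataclasses import dataclass

    @dataclass
    class CalendarEvent:
        title: str
        source: str      # e.g. "user_created", "invite_accepted", "mined_from_email"
        confirmed: bool  # has a human said yes to this event?

    def should_send_leave_now_alert(event: CalendarEvent) -> bool:
        """Don't drive notifications from events the person never asked for."""
        if event.source == "mined_from_email" and not event.confirmed:
            return False  # surface it quietly and ask first
        return True

    mystery_event = CalendarEvent("Dinner?", source="mined_from_email", confirmed=False)
    print(should_send_leave_now_alert(mystery_event))  # False: ask before nudging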

Mapping cross-channel ecosystems

Andrea Resmini talked about cross-channel ecosystems: systems of actors, actions (tasks), channels, touchpoints and seams.

This is a holistic approach to service design, lifting your perception above the individual (organisationally constrained) service and considering the reality of the person in the middle of it all and the multitude of services they might actually consume as part of achieving a goal.

I love the brutal honesty of this approach.

To give you a better idea of how it works, here are some of the central components of the system. I’ve lifted these directly from one of Andrea’s presentations.

  • An actor is any agent in the ecosystem trying to achieve a future desired state
  • A task is any activity an actor performs towards that state
  • A channel is a pervasive ecosystem-wide information layer
  • A touchpoint is an individual point of interaction in a channel
  • Touchpoints are vessels for information
  • Touchpoints are medium-specific, but the information they convey is medium-aspecific
  • Touchpoints belonging to more than one channel act as a seam between them
  • A seam is a threshold connecting touchpoints on the same or different channels
  • While experiences should be “seamless”, sometimes seams should be visible or perceptible

“Cross-channel ecosystems are semantic constructs that straddle digital and physical spaces, instantiated by individual actors moving freely and at will between locations, devices, and contexts.” — Andrea Resmini
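Read as a data model, those definitions map onto a handful of small types. The sketch below is my own interpretation of the components above; the class and field names are mine, not Andrea’s.

    # A rough data model of the ecosystem components described above.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Channel:
        name: str                        # e.g. "mobile app", "email", "physical venue"

    @dataclass
    class Touchpoint:
        name: str                        # an individual point of interaction
        channels: List[Channel]          # medium-specific, but its information isn't
        information: str                 # the medium-aspecific information it conveys

        def is_seam(self) -> bool:
            # A touchpoint belonging to more than one channel acts as a seam.
            return len(self.channels) > 1

    @dataclass
    class Task:
        description: str                 # an activity performed towards the desired state
        touchpoints: List[Touchpoint] = field(default_factory=list)

    @dataclass
    class Actor:
        name: str
        desired_state: str               # the future state the actor is trying to reach
        tasks: List[Task] = field(default_factory=list)

    # An e-ticket that lives in both the app and email channels is a seam.
    app, email = Channel("mobile app"), Channel("email")
    ticket = Touchpoint("e-ticket", channels=[app, email], information="booking reference")
    print(ticket.is_seam())  # True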

The example Andrea used was that of going to the cinema. That process involves multiple locations, actions, touchpoints, devices and contexts. Some actions are interconnected, some are actually quite detached but necessary as part of the overall task.

An ecosystem map showing four channels and multiple tasks, each involving multiple actions. Image taken from Andrea’s Interact London 2016 presentation.

Purchasing the actual tickets, chatting to a friend to arrange what to see and when, buying popcorn, physically transporting yourself to the cinema. There are many components in the “going to the cinema” ecosystem — we, as humans, hold these complex constructs together with relative ease, understanding and maintaining the overall purpose — going to the cinema.

Organisations (and service design) have traditionally viewed services from a selfish standpoint, treating them as self-contained and self-absorbed. The possibility that a service might naturally be more complex, involving a multitude of interconnected yet distinct services, is sidestepped.

It takes a certain amount of confidence (in the part organisations play within a bigger picture) for an organisation to accept the cross-channel ecosystem as it is — but by accepting it, service design takes a leap forward and we can design for the realistic scenarios that actors find themselves in.

Designing the seams — the visible and invisible connections between touchpoints — is where we deliver value to both actors and organisations.

What is the future of (digital) design?

So back to my reflections on the return flight to Stockholm. Where are we heading? What is the future of (digital) design? What will we be designing in the (near) future?

In this future, chat and chatbots will have replaced many interfaces. The words will become the UI. It’s a full circle back to the beginning of the web. The work of the interaction designer of the future will be to craft the best words and to be the guardian of standard design patterns.

Data and AI will combine to serve us information and choices only when necessary. The (user) experience designer of the future will be an experience architect, designing the way we (and our data) will move across seams. Screen flows will become data flows. We’ll design what gets surfaced to the agent and design what will happen when the dominoes fall.

Perhaps in a future where much of our design work is firmly rooted a step away from pixels, we stand a better chance of delivering the user experience we have been striving for.

Design ethics, rather than visual design, will be our defining moment.

Cartoon drawing of an angel and a devil balanced on a see-saw: design ethics, good vs evil.

James Royal-Lawson co-hosts UX Podcast, a popular twice-monthly user experience podcast, and is a freelance UX consultant based in Stockholm, Sweden.