
A day in the life of a psychiatrist in 2050*

George Gillett1,2

BJPsych Bulletin (2020) 44, 121–123, doi: 10.1192/bjb.2020.22

*This article was the winner of the 2019 Praxis Editorial Award.

1 Oxford University Clinical Academic Graduate School, UK; 2 Department of Psychiatry, University of Oxford, UK

Correspondence to George Gillett ([email protected])

First received 13 Feb 2020, accepted 20 Feb 2020

© The Author 2020. This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Summary
Digital phenotyping (such as using live data from personal digital devices on sleep, activity and social media interactions) to monitor and interpret people's current mental state is a newly emerging development in psychiatry. This article offers an imaginary insight into its future potential for both psychiatrist and patient.

Declaration of interest
None.

Keywords
Information technologies; bipolar affective disorders; schizophrenia; risk assessment; ethics.
The most exciting development in modern psychiatry is arguably the field
of digital phenotyping. Encompassing data related to sleep, speech,
activity, social media and keypad interactions, digital phenotyping
promises to measure and interpret human behaviour at unprecedented
scale. In psychosis, researchers strive to use such data to predict relapse,
while others aim to predict suicide risk using machine-learning
techniques.1,2 However, with the field in its infancy, the potential social
effects of such technological advances are unclear. How successful will
digital phenotyping be in clarifying psychiatry’s uncertainties? Most
importantly, where will the algorithm take us?

A vision of the future
Arriving at work, Dr Singh rests her coffee on the desk and logs into her
electronic record system. Ahead of her first appointment she browses the
neuroimaging, bloodwork and behavioural data for her patients that day.
Reminiscing on how quickly this new world had been sold to the
profession, she remembers a lecture, 30 years ago, marking her first
encounter with digital phenotyping. ‘Where will the algorithm take us?’,
the conference programme asked, leaving Dr Singh shocked at the science fiction surrounding her. The algorithms, already claiming more accurate
suicide risk assessment than clinicians, had begun to quantify mood,
anxiety, sleep, physical activity, interpersonal interactions and geolocation
as markers of social functioning.3 A former Director of the National Institute of Mental Health had left to join Silicon Valley as early as 2015 and social
media companies boasted about their own suicide risk screening tools
only 3 years later.4,5 While psychiatrists of the past stole glimpses of
psychopathology from snapshot mental state examinations and unreliable
histories, the ‘big data’ revolution promised to chart patients’ entire
behavioural phenotype for doctors to assess. Even the machines were
learning, they claimed.

Steve's alarm woke him. The day introduced itself with a motivating message, psychoeducation they called it, as he hunched over a bowl of cereal, inputting its nutritional information to his diet-tracking app. The phone reminded him of his exercise schedule for later
that day, a pending mood assessment and reflective diary entry, shortly
followed by a reminder for his annual appointment with his psychiatrist.

Meanwhile, Dr Singh, like the rest of us, played catch-up. Medical students
continued to read textbooks and revise mental state examinations. Mental
health legislation remained reliant on risk, while algorithms predicted
numbers but failed to rationalise their judgements in a way that humans
could understand. Dr Singh watched the world reduce itself to binary, while her former colleagues – the radiologists and general surgeons she'd known since medical school – began to fear losing
their jobs to automation and robotics. Governments, eager to improve
‘population well-being’ and keen to avoid culpability for suicide and
violence, implemented their own machine-learning projects. Insurance
companies demanded that their clients wear smartwatches, such that
their every behaviour could be monitored.6 The rush towards big data,
‘the new oil’ as one economist put it, spared no profession, field or
domain of daily life.7 As the future came to stay, the algorithms marked
their next victim. Psychiatry?

Yes, Steve replied, as the receptionist
beckoned him towards the sign marking the out-patient department. He
took a seat in the waiting room, reflecting on what to do. How he might
break the news. Remembering, tentatively, how he had stood on the
bridge, looking across the city, contemplating ending it all. It was New Year's Eve 2049, the new year coaxing him intolerably, daring him on. The fireworks exploding in the distance. A note waiting at his flat. A future without a place for him. He hadn't seen his psychiatrist since then. Would
she know what had happened?

Yet in this brave new world, technology
wasn’t just an adjunct to clinical decision-making. It strove to compete,
claiming a therapeutic relationship of its own with patients. As early as the
2010s, self-help phone applications advertised themselves with taglines
such as ‘rule your mind or it will rule you’, while others offered
individualised therapy through artificial intelligence techniques.8 The
market flooded well beyond the traditional boundaries of academia;
researchers were inundated with innovation but starved of time to
regulate it. A 2018 analysis found that only 14 of approximately 100
studies using mental health apps had clinically validated evidence of their
effectiveness.9 However, fears about the field’s lack of regulation could
only chase the technology into the future.

His phone knew. The app which
monitored his mood knew, as did the one which monitored his
geolocation. Maybe the software which monitored his sleep had worked it
out too. He was sure that some of his social media followers had guessed.
His internet searches knew. His family didn’t. His phone had recognised
something his friends had missed. Perhaps it was his phone which had
stopped him. Or had it merely helped him stop himself?

In anticipation of
Steve’s appointment, Dr Singh downloaded his data. She analysed the GPS
data first, before turning to the sleep data, exercise records and daily
mood assessments. She was sceptical of algorithms that claimed to close
the loop between users and their care, seeking to displace the clinician.
Even the most well-intentioned apps, the most effective, lacked
something, she felt. But she didn’t resist the technology entirely. With
data of its own, psychiatry demanded parity with physical health. Digital
biomarkers offered patients objective evidence for years of lived
experience and routine dismissal. The algorithms informed clinical
decisions, streamlined cloudy diagnoses and personalised treatment
choices. Yet alone they lacked something profoundly human, profoundly
therapeutic.

Steve walks into the clinic and sees his doctor of 17 years. A
tear drops from his cheek. The psychiatrist offers a tissue. More come.
They forget the numbers for a moment. A moment’s pause in a world
bustling with answers. A moment of silence in a world full of data. Steve
looks up. ‘It’s good to see you’ he says, wondering how to tell someone
what they already know. They talk, and they reflect. Dr Singh browses
Steve’s numbers in the way that doctors have glanced over blood tests
and clinical observations for years. Steve adds meaning to the data, reflects on the read-outs, adds humanity to the algorithms. Together, they
identify where the models shadow their subject like ill-fitting clothes.
Algorithms worked in broad assumptions; clinical acumen dealt with
individuals, Dr Singh explained, as she tailored the numbers to the man in
front of her. She was reasonable, Steve felt. He liked her. An ally with
whom to navigate this increasingly impersonal world. As the appointment
drew to a close, Dr Singh offered Steve an outreach service. We can have
the data alert us if anything takes a turn for the worse, she explained; a run of poor sleep or abnormal text messaging could prompt your community team to check up on you, or even pay you a visit. It was
something which had appeal, a safety net he wondered if he’d benefit
from.

Dr Singh was hesitant to offer the remote-monitoring outreach
service to all her patients. The qualitative studies had identified that some
found it too intrusive, whereas others worried they would become fixated
on their own mental health, causing an anxiety of its own.10 Historical
concerns around the medicalisation of everyday experience persisted in
new forms. Part of her was relieved when Steve declined.

New Year's Eve

Another year had passed. As 2051 beckoned, Steve continued to struggle with his symptoms. He'd developed coping strategies; the data had helped him identify his triggers of relapse. Sometimes he wondered if the algorithms knew him better than he knew himself. And he wondered if that was OK. With their help, he'd learnt to predict and prepare for his relapses. At his most optimistic he wondered if he'd managed to prevent them.

After seeing her last patient for the day, Dr Singh turned back to her computer. Flicking through the screens, she glanced through a collection of records the remote-monitoring outreach programme had flagged for her attention. A man with schizophrenia exhibiting an unusual geolocation trail and a woman with bipolar disorder whose sleep had become increasingly erratic. She would call them in the morning, reassured that even technology could not evade the uncertainties of clinical practice. After shutting down her computer and returning her coffee mug to the kitchen, Dr Singh exited the clinic into the cold December evening.

Approaching midnight, Steve's phone notified him of another upcoming daily mood assessment. He glanced down, hesitated and turned it off. Placing his phone on the table next to him, he looked to the sky, stood up and walked towards the fireworks.

About the author

George Gillett, BA, BM BCh, is an Academic Foundation Doctor at the Oxford University Clinical Academic Graduate School, and at the Department of Psychiatry, University of Oxford, UK.
References
1 Bedi G, Carrillo F, Cecchi GA, Slezak DF, Sigman M, Mota NB, et al.
Automated analysis of free speech predicts psychosis onset in high-risk
youths. npj Schizophrenia 2015; 1: 15030.
2 Coppersmith G, Leary R, Crutchley P, Fine A. Natural language
processing of social media as screening for suicide risk. Biomed Inform
Insights 2018; 10: 1–11.
3 Tran T, Luo W, Phung D, Harvey R, Berk M, Kennedy RL, et al. Risk
stratification using data from electronic medical records better predicts
suicide risks than clinician assessments. BMC Psychiatry 2014; 14: 76.
4 Dobbs D. The smartphone psychiatrist. The Atlantic 2017; Jul/Aug.
5 Singer N. In screening for suicide, Facebook takes on tricky public health
role. New York Times 2018; 31 Dec.
6 BBC News. John Hancock adds fitness tracking to all policies. BBC News:
Tech 2018; 20 Sep.
7 The Economist. The world’s most valuable resource is no longer oil, but
data. The Economist 2017; 6 May.
8 Dowling J. How I use data to help me manage my mental fitness.
Presentation at Happiness and Humans Conference. The Happiness Index,
2019; 22 May (https://www.youtube.com/watch?v=bZBdwNzBXbY).
9 Wang K, Varma DS, Prosperi M. A systematic review of the effectiveness
of mobile apps for monitoring and management of mental health
symptoms or disorders. J Psychiatr Res 2018; 107: 73–8.
10 Gillett G, Saunders K. Remote monitoring for understanding
mechanisms and prediction in psychiatry. Curr Behav Neurosci Rep 2019;
6: 51–6.
