Centre of Excellence for Information Sharing
As part of my government fast stream programme, at the start of April I joined the Centre and quickly became immersed in the world of information sharing. To find out more, I recently attended the latest brilliant Economic and Social Research Council (ESRC) seminar in the series on information sharing for devolution and public service reform. During the seminar we heard from an eclectic mix of practitioners and academics who had, each in their own way, made inroads into the often bewilderingly complex world of information sharing. They ranged from local health providers in the Greater Manchester area, to public servants analysing the labour market in Lombardy, Italy.
The day started with a presentation on patient-centred care by Gary Leeming from the Academic Health Science Network (AHSN) operating in Greater Manchester. The problem he outlined was “disconnected care”, where patients in the Greater Manchester area would move from one health service provider to another, each of which held a different health record for them. In response, the AHSN helped build a system called DataWell around the principles of consent, ownership and security, allowing different medical services to access the same data on patients.
We also had a presentation by Andy Baker and Si Chun Lam from Coventry City Council, who used data to take a deep dive into the experiences of people with multiple and complex needs, focusing on three big indicators (substance misuse, homelessness and offending) to produce visually engaging dashboards, timelines and maps.
Dr. Kellie Thompson from Liverpool Hope University focused on child welfare; specifically, on the factors preventing professionals from sharing information. In particular, she pointed to the lack of support for affective or intuition-based perceptions of risk, and to confirmation bias and the bystander effect, as factors behind deficits in information sharing in this area. She argued that the ‘jigsaw metaphor’ was a misleading one because of the assumptions it encourages: for example, that everyone holds a piece of the puzzle. Her recommendations were to design services with communication in mind; to strengthen voluntary networks; and to explore what context can be created for professionals to share their affective needs (hunches, feelings, anxieties and so on).
We also heard from Giampaolo Montaletti, a humorous, bright and friendly economist from the Regional Agency for Education in Lombardy, on how he had developed a groundbreaking analytical system for labour markets in 2007. This system could, for example, track a single individual through their career by capturing when they had signed or left an employment contract, and when they were receiving unemployment benefits. This allowed the agency to evaluate the effectiveness of particular labour market policies.
Professors Cathy Parker and Cathy Urquhart of Manchester Metropolitan University introduced us to the use of big data on the high street. They used data such as footfall, weather, vacancy rates and shopper population to produce three dashboards for each of their ‘clients’: policy, town and store. A fascinating insight of theirs was their discovery of 201 factors which affect the vitality and viability of high streets, which they mapped on a chart of influential versus controllable factors. Naturally, the shops and localities were most interested in the quadrant of the chart where the most influential factors they could actually control resided, and chose to prioritise these.
What I found most striking about the event was the consistent theme throughout of understanding individual journeys and needs. This may come as a surprise in the age of mass information, where macro-trends are all too frequently seen as the holy grail of analysis, often having predictive capabilities attributed to them under a one-size-fits-all approach that can be packaged and sold. But it should not; we are moving from an age of uniformity involving mass communication, interests and production towards a society where the individual is put front and centre. What this means in practice is that data is being cut into individual strands over time to gain depth of understanding, rather than sliced into cohorts to generate mass insights and trends.
If this is where we are headed, it represents a fundamental shift in approach towards understanding people’s individual journeys and needs. This may have myriad applications. Individualised medicine is beginning to appear on the scene, but we could also end up with applications that tailor themselves to the kinds of issues vulnerable people are facing at each stage of their lives, helping them to always have access to the right services. Does this mean we are abandoning mass data approaches, with predictive analytics and the like? Probably not. Pattern-seeking is a major ingredient in our unique brand of intelligence as humans, and these patterns are good at elucidating basic correlations.
The two methods are actually beautifully complementary. What you can miss out on using the mass analytical method is a sequential or linear understanding underpinning those correlations; the glue in space and time that holds big slices of data together. You need to dive deeper to understand causality, and this is where the case study proves most powerful. The benefit of this relatively new capability to track individual journeys is that it reveals a narrative behind people’s lives and circumstances, allowing for deeper investigation of causal links, which can then be used as the basis of theories. Zooming out again, their constituent hypotheses can then be tested using the mass data, allowing for the kind of evaluation of policies and their impacts Giampaolo Montaletti achieved in Lombardy. Perhaps, with this exciting prospect coming over the horizon, as Si Chun Lam pointed out, the real test is to do this safely and with consent. But then these are the kinds of challenges we in the Centre try to help our partners overcome.