

With great power comes great responsibility – wisdom that comic-book heroes like Spider-Man took to heart. In the digital realm, a room with no secrets, great power requires not only great responsibility but great accountability. The speed and degree of connectivity have drastically changed how organisations can – and are expected to – uphold and monitor the responsible use of data. And as business and society increasingly intertwine, digital accountability is no longer directed solely towards the user but addresses society at large.


For the humanitarian sector, digital means may inspire new thinking and bring about transparent and accessible feedback systems. At the same time, data is becoming exponentially more personal, and the world is slowly waking up to the negative implications for privacy and trust. In Europe, GDPR marks a first step in protecting the data privacy of the average user and providing a framework for digital accountability. In the case of the forcibly displaced, personal data is far more vulnerable and requires adequate measures from humanitarian agencies to act in harmony with the humanitarian pledge to do no harm.


What do adequate measures in the digital realm look like? How can the participation and the voice of the forcibly displaced be enhanced through digital means and analytics? Who is ultimately accountable when it comes to the use of data?


This future explores digital trends that will accelerate empathic analytics as well as new mindsets around accountability. We unfold both opportunities and rising concerns as we ask you to consider:


How can the humanitarian sector effectively uphold digital accountability towards the forcibly displaced – while factoring in the potential lack of digital literacy?


Below, you will find three selected accelerators that most prominently influence the development of digital accountability.

Responsible data and AI

Resilient, ethical, safe, and transparent tools to measure, analyse, and inform decisions are paving their way into the digital realm. Responsible AI – artificial intelligence solutions trained with inclusive data – is spreading its wings.


In fact, it’s predicted that 75% of organisations will deploy AI accountability throughout their organisation within the next three years. Still, unconscious human biases are found to creep into algorithms, and as AI becomes omnipresent, the danger is that these biases scale with it. The opportunity then becomes to reveal biases that previously went unnoticed and even to hold algorithms accountable to a larger extent than is possible with human unconscious biases. Here, so-called cognitive analytics goes a step further and is meant to act like a human brain: it blends AI, machine and deep learning, and semantics with the newest discoveries in cognitive science. With these digital measures available, organisations are increasingly able to measure the voice of society in an ethical and safe manner.


Meanwhile, we see multiple technological advances in the analytical sphere aiming to simplify data usage. Estimates are that close to two-thirds of available data goes unused – so although businesses are eager to gather more data about their operations and users, it is rarely transformed into actual insights. As data collection and storage contribute to energy emissions, a ‘less is more’ movement is rising. It is supported by the vast number of low-code and no-code analytical solutions that allow even the non-coder to easily access data and analytics. Amazon Kinesis is one example of such an analytical service, and many providers offer packages that include both the technology and the service.


As we move towards simplified and safer digital analytics, data and analytics teams are shifting from a secondary to a primary role in the organisation. This shift demands a massive upskilling of workforces in data literacy. Companies such as Bloomberg, Guardian Insurance, and Adobe are taking the lead and have established digital academies that teach the abilities to interpret data, draw insights, and ask the right questions.


In this future, we see measuring and using data in a responsible and useful way becoming increasingly accessible. Data accountability is then no longer a choice but a given. In striving for radical transparency and responsibility, what are adequate measures and analytics for the humanitarian context? How can data literacy be ensured within humanitarian agencies? How can advanced digital tools allow for empathy at scale? What biases arise through the digital divide when data is used to assess critical situations?



Accountability for the prosumers

In the digital space, the consumer has evolved from a passive role into the active role of a so-called prosumer – both producing and consuming data. The data users generate today is exponentially more personal and valuable than ever before. As a result, we see personal data function as a currency, exchanging data security and privacy for personal benefits. This two-way street serves greatly in situations of natural disaster. Facebook introduced a Safety Check feature that enables users to mark themselves as safe and thus inform friends and family. Other social media platforms allowed emergency managers to retrieve qualitative information on the consequences of a disaster in real time. Here, the prosumers played an active part in the dialogue, which enhanced their voices and brought personal and societal benefits to those in need.


But as the access to and quantity of personal data increase, the trade-off could ultimately be a loss of privacy. Concerns arise when the prosumer is not informed or aware of this loss. In a recent example, personal biometric data of forcibly displaced Rohingya was shared with Myanmar (the country they had fled from) without a full data impact assessment. In Germany, political voices suggested screening social media data to assess the asylum applications of forcibly displaced. In both of these cases, no dialogue was established between the prosumer and the user of the personal data.


However, numbers show that most prosumers are aware, and in fact very careful, about when to agree to a loss of data privacy. Acknowledging this, 82% of companies state that they would withdraw from participating in other companies’ digital ecosystems if data security and ethical controls are lacking. Once digital trust is broken, the prosumer is unforgiving. As data becomes exponentially more personal, the voice of the prosumer will echo back. Thus, organisations have only one chance to get data privacy and security right!


How can the forcibly displaced hold users of their personal data accountable? How will they stay aware and well informed about a potential loss of privacy? When can a loss of privacy be accepted in order to offer certain services to the forcibly displaced? And how does the humanitarian sector get the trust equation right?

Building from a mindset of scale​

Back in 1969, we landed on the moon guided by NASA computers with less computing power than our smartphones have today. Technologies have experienced exponential growth in recent years – a tendency projected to continue, which will lay the ground for solving societal challenges at scale.


An example of these scalable solutions, leveraging the digital, is the free mobile app Be My Eyes. The app allows blind and low-vision people to call for visual assistance and receive guidance from a volunteer through a live video call – available in 180 different languages. The enormous scale of Be My Eyes is enabled through the integration of digital into its core business model. Today, we see social enterprises leaning increasingly towards technological solutions to scale their reach and thereby balance often low margins.


Surprisingly, the scalable solutions to societal challenges are often introduced by new actors outside of the humanitarian sector – a species of social entrepreneurs that embodies an exponential mindset and does not shy away from using technology to accelerate impact. This mindset is most prominent in Silicon Valley. In Rwanda, delivering medicines and vaccines quickly via existing roads was long impossible, until the San Francisco-based start-up Zipline launched a fleet of delivery drones. These so-called ‘sky ambulances’ now drop off life-saving blood and medical supplies around the world.


Besides embodying a mindset of scale, the next generation of these social business models will be composable. Rather than building solutions from scratch every time, existing application blocks will be leveraged via APIs to create digital offerings. The composable enterprise is thus cost-efficient, agile, and at all times ready for innovation.


Looking into the future, embracing scalability and composability will be essential – a mindset currently carried by players not traditionally active in the humanitarian sector. Who is accountable for solving these challenges? How can the humanitarian sector embrace this new mindset? What opportunities do composable business models hold to co-create solutions with and for the forcibly displaced in much leaner ways? In which cases could partnerships with non-traditional actors bring about scalable solutions to previously unsolvable problems?


In this future, you have read about empathic digital analytics, the voice of the prosumer, and scalable solutions that solve previously unsolvable issues. Digital simplifies and scales the usage of, and access to, valuable data. It allows new and enhanced interactions with the forcibly displaced but also generates novel difficulties – the loss of privacy and the everlasting footprint left by personal data.


We have highlighted the power of data and analytics, and how responsibility shifts to an augmented consumer, in the signals from the edge. The signals from within show rising concerns over surveillance but also shed light on digital health solutions.


Engaging with the forcibly displaced in a direct and non-harmful manner is key to humanitarian work. Digital gives new powers to measure, collect, and use vulnerable data – a power that requires great accountability. Let’s return to where we started this future:


How can the humanitarian sector effectively uphold digital accountability for forcibly displaced?

  • How can the forcibly displaced hold users of their personal data accountable? How will they stay aware and well informed about a potential loss of privacy – and when can a loss of privacy be accepted in order to offer certain services to the forcibly displaced?

  • What biases arise through the digital divide when data is used to assess critical situations? 

  • With a mindset for scale and a focus on composable business models, how can the humanitarian sector co-create solutions with and for the forcibly displaced in much leaner ways?

  • How can upskilling in data literacy allow the sector to empathise with the forcibly displaced at scale?

These questions and more we will explore together on our journey forward!

So what?


Below, you will find signals from the edge as well as from within the humanitarian sector. Click the signals to explore them further and use the arrows to navigate between them. We encourage you to navigate this section with a “what if” mindset – and to note down any ideas and thoughts that may arise.
