These potential applications raise a number of important questions, many of which involve ethical and moral dilemmas. How safe is the data that is being shared, and who owns it? Blockchain is widely employed as a way of ensuring that data is stored and transmitted securely, but is it infallible? If your DNA is being profiled, who would you be happy to have access to it? Perhaps you want your GP to see it, but what about your insurance company? What about researchers? With a large enough dataset we might be able to make new breakthroughs in the health arena. So should we all consent to share our anonymised and aggregated data? The recent response to the My Health Record scheme suggests that many of us are wary about this.
Would knowledge of genomic predispositions make us behave more "rationally"? If you knew that you were more likely to develop heart disease, would you eat more healthily? Conversely, if this wasn't a worry for you, would you engage in more "risky" behaviours? If we know one thing for sure, it is that people don't always behave in ways that are predictable or that represent the more "rational" option.
Some of the companies we spoke with talked about offering incentives to individuals to share different aspects of their data in return for vouchers or discounts on other products and services. While this avoids a situation where individuals go unrewarded for the use of their data, it raises potential equity issues. Those most likely to respond to such incentives tend to come from lower socio-economic groups. Indeed, the Australian Human Rights Commission has recently raised a number of concerns about new technologies and their potential to entrench inequities.
Although the discourse around many of these technological developments is that they should make us "safer", there is already evidence that gives reason for concern. Addiction to technology is a real worry for many, particularly in relation to younger children. China has set legislation addressing this after concerns about wellbeing and intense play of the online game Honour of Kings. The developer, Tencent, responded by limiting how much time users can spend on the game and at what times of day. Although welcomed by some, this is a blunt approach and does little to address our obsession with electronic devices more broadly.
A feeling of safety in many of the examples we came across typically also involved significant surveillance, whether via cameras, data collection, or a combination of the two. Although this might leave some of us feeling more secure, for others it raises human rights concerns. A recent story in the New York Times highlights the issues associated with the expansion of facial recognition software and surveillance in China, and some significant concerns about this within the context of an authoritarian regime.
It is clear that there are some exciting developments to come in the technology space that will have a profound impact on our everyday lives. But these developments also bring with them a series of potential negative impacts and associated ethical and moral concerns. Although some of these developments lie some way in the future, many already exist in our everyday lives. Yet one of the issues we are encountering in our research into the use of robots in care services is that governments are not yet systematically or widely discussing what these technologies mean for the ways in which we design and deliver public services, or the challenges they raise. A failure to consider these issues now will likely mean that the first time we really grapple with them is when some sort of incident arises, by which time it will be too late.
While there is much to be excited about in terms of the future of care services, there are some developments that should give us pause for thought. It is a future we need to prepare for to ensure that we get the type of care services we want and need.
This post was contributed by Catherine Smith, Melbourne Graduate School of Education, University of Melbourne, and Helen Dickinson, Public Service Research Group, University of New South Wales, Canberra.