Sunday 4 January 2015

Am I normal?

This article is about the dangers of the word ‘normal’ in an era of ever-increasing uptake of health and fitness devices, and about why that combination has made me dream of a wondrous future for health technology, built on the ideal of open source.
___

Today my phone told me I wasn't normal. My phone has a built-in heart rate monitor; in fact, many smartphones can give a rough estimate of your heart rate these days. My phone now pairs this hardware with some 'fitness' software, and today that software told me my resting heart rate was outside the 'normal' range. In case there are any insurers reading, I won't say whether it's too slow or too fast, or by how much, but needless to say it gave me cause for concern. Suddenly I'm not normal, and more than that, I might not be normal in a potentially lethal way.

My first issue was with the word 'normal'. Normal can be a very helpful word, especially in a medical context. But that context is one of statistics and averages, not of individuals. One person cannot really be described as normal: 'normal' is a statistical result, a description of how a measurement compares with a wider data set. For many people, though, normal takes on a different meaning, especially people less familiar with medical terminology and statistics (for example, people with a lower level of health literacy, some younger people, or those for whom English is not their first language). 'Normal' in everyday life is linked to ideas about identity, social belonging and a host of other associations, including appearance and behaviour.

As I consider myself fairly able to navigate the Internet and seek out trustworthy sources, I quickly learned I didn't have too much to worry about. But it doesn't take a huge leap of imagination to picture somebody reading 'not normal' and worrying, perhaps even worrying themselves to death.

At this point I would just like to say that I think the use of technology in modern health and self-management is a wondrous and fantastic thing. Seeing friends with diabetes managing their blood glucose levels with gadgets, you can begin to see how some of this technology will soon go mainstream. However, developers of technology that helps people manage their own health and fitness must learn quickly the dangers of getting it wrong. My example is a very small one, but it points to the tip of an iceberg. A free market economy combined with semi-regulated technology may not afford developers and manufacturers the time or conditions to adequately test and refine their creations. This creates a new frontier in which the public, patients, users and consumers of devices need a more active role in the development and monitoring of the quality of information, devices and services.

To anticipate this, I dream of some kind of voluntary international mark or accreditation which developers and manufacturers can sign up to (it would have to be international, or it would be pointless). To get the accreditation, developers must show that their information is evidence-based and reliably cited, and that the words used have been developed and improved with users, consumers and other members of the public. Additionally, there should be a clear way for people to comment and give feedback on content.
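To make the point concrete, here is a minimal sketch in Python of the difference I'm describing: the measurement is identical, only the wording changes. The 60-100 bpm range, the function names and the messages are my own illustrative assumptions, not any app's actual code and certainly not medical guidance.

from dataclasses import dataclass

@dataclass
class ReferenceRange:
    low: float          # lower bound of the reference range (bpm)
    high: float         # upper bound of the reference range (bpm)
    population: str     # the population the range was derived from

# Illustrative placeholder range for resting heart rate, not medical guidance.
RESTING_HR = ReferenceRange(low=60, high=100, population="healthy adults at rest")

def blunt_label(bpm: float) -> str:
    # What my phone did: a binary verdict with no context.
    return "normal" if RESTING_HR.low <= bpm <= RESTING_HR.high else "NOT NORMAL"

def contextual_message(bpm: float) -> str:
    # The same check, reported as a comparison with a population rather than a verdict on a person.
    in_range = RESTING_HR.low <= bpm <= RESTING_HR.high
    position = "within" if in_range else "outside"
    message = (f"{bpm:.0f} bpm is {position} the {RESTING_HR.low:.0f}-{RESTING_HR.high:.0f} bpm "
               f"range typically seen in {RESTING_HR.population}.")
    if not in_range:
        message += (" A single reading is not a diagnosis: ranges describe populations, not"
                    " individuals. If readings are consistently outside this range, it may be"
                    " worth discussing with a health professional.")
    return message

if __name__ == "__main__":
    for reading in (72, 48):
        print(blunt_label(reading))
        print(contextual_message(reading))

The design choice is simply to report the reading against a named reference population, with some context, rather than to issue a verdict on the person.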
Finally, I would encourage all ‘for profit’ developers and manufacturers to make as much of their work and code as possible open source. If you’re ‘not for profit’ (a charity or government) then there’s absolutely no reason to hold anything back. There is little value to be gained from intellectual property surrounding bad services and products. Make them open source, make them transparent, and let the world and the community improve and develop them.

Ideas and code, along with hardware, will increasingly form part of a delivery model for a service, rather than being the valuable commodity in themselves. In health technology, the value and revenue will likely come from delivering a service which is useful to the people who need it (think of Google ‘giving away’ the Android operating system, letting other people build the hardware and code that then brings the revenue back to Google). Trust will form a huge part of any business model, with users increasingly handing over the most personal of data to servers, perhaps even our whole genomes.

And a quick note on law. No one is above it, and we all need it to be in place and upheld to protect everyone. So yes, we need lawyers. But revenue won't come from paying lawyers to help monopolize discrete pieces of information that make a wider system work. That will just get us more lawyers.

The example that comes to mind is Apple patenting the action of moving a finger across a screen to ‘unlock’ the device, or ‘performing a gesture’ as they put it. Well, I’m imagining performing a gesture at the people who employ this kind of thinking, especially when applied to health technology.

To quote Dickens, the ‘lawyers always win’. To quote a more up-to-date source ‘the answer to the innovator's dilemma is not here in the courtroom suing people’ (John Quinn, Samsung's lawyer). The balance must be between employing lawyers to protect people and protecting the incentive for creativity.

Openness is a strength, not a weakness. There is a strangely Orwellian paradox here: the more open and transparent a project is, the less likely it is to produce products and services with vulnerabilities. In the panopticon we have ultimate transparency, and we need to strike a balance between having everything open, shareable and hackable/improvable, and leaving security vulnerabilities in things like pacemakers. The United States Food and Drug Administration has already communicated on this issue, and where this balance is struck should be a conversation everyone is invited to be a part of.

When I'm dying, and a robotic combination of code and hardware is keeping me alive, I want to know that anyone has been able to improve the code, that everyone owns it, and that I, along with anyone else, have had the chance to make it better.

Is that normal?