Digital Health in India - Ethics and Consent
All technology has certain inherent flaws, and each technological innovation must overcome various obstacles to adoption. These range from the age-old fear of loss of employment to the contemporary threat of manipulation of political systems and even the human mind by artificial intelligence. In the health sector, there is even less room for error – the potential cost of failure is too high. If digital health is to be the future of the health sector, it needs to address the risks it poses – from the perspective of technology and medicine – in clinical settings and in regard to population health. Timely and incisive legislation incorporating these legal and ethical considerations would help ensure the establishment of an effective regulatory framework to encourage and steer the development of the digital health sector in India so as to maximise the benefits and minimise the costs.
Digital Technology in the Indian Health Sector – Ethical Concerns
The Indian health sector has historically faced numerous hurdles, with somewhere between 0.4 and 0.9 qualified doctors[i] and only 0.7 hospital beds[ii] for every 1000 people in India. The strain on the system is exacerbated by the urban-rural divide: healthcare facilities are concentrated in urban areas[iii], while two-thirds of the population resides in rural areas[iv], leading rural populations to rely on unqualified personnel for medical treatment[v]. Digital health presents an opportunity to address these inadequacies and inequities; however, low levels of education and digital literacy[vi] pose additional challenges to the adoption of digital health in India.
In this context, various ethical issues arise in regard to the deployment of digital health products and services (“DHPS”) – these include issues around privacy and data security, equity in access, data ownership, individual autonomy, procedural transparency, and accountability, among others. These may be examined in the context of the different aspects of the digital health space:
Technology – one of the primary ethical concerns in respect of digital health technology is privacy and data security, i.e. data breaches; unauthorized or unethical collection, use, and sharing of data; and corruption or loss of data. These risks can be addressed, inter alia, through robust technology and transparent systems[vii]; however, with digital services being developed and deployed in the absence of regulation, there is a danger of unchecked ethical transgressions before legal and regulatory oversight is instituted. As Gopichandran et al. observe in the 2020 World Health Organization Bulletin[viii]: “In the absence of strong regulation of electronic health records in low- and middle-income countries, linking sensitive health information” to a unique identification system “results in the privacy of patients being compromised”. Personal data can be exploited to deny or limit insurance, employment, and social benefits, among other things[ix].
Healthcare – technological innovation has outstripped legal and regulatory progress; it is therefore necessary to ensure that DHPS that have already reached the end-user are retrospectively assessed for compliance with regulatory standards. Ensuring the security and integrity of health data is essential; it is also necessary to ensure that, in the delivery of precision healthcare, inbuilt biases arising from uneven sampling do not result in sub-standard care – it has been suggested that this be addressed by designing to overcome the inherent inequities in the system.[x] The needs of populations traditionally underserved by the health sector must be prioritized in designing DHPS, to ensure equitable access across the spectrum of socio-economic backgrounds, digital literacy levels, and physical and geographical constraints. Additionally, adequate oversight of digital and automated processes by qualified healthcare providers is a particular concern in a country with limited numbers of qualified medical personnel, and a multitude of unqualified agents stepping in to fill this gap.
Research – in the research space, a concern has been raised that in the rush to dispense with hypothesis-driven research, replacing it with algorithm-guided searches for correlations between phenomena, safety and clinical utility may be neglected in the evaluation of novel therapies or public health interventions[xi]. Further, “Geolocation technologies on a mobile phone (eg, GPS, WLAN) can reveal a range of personal information. This might include where you live, where your children go to school, whether you visit a therapist and if so how often, how often you visit drinking or gambling establishments, whether you arrive early or late to work…It is possible to identify a specific individual with reasonable certainty from this information. Consequently, it may be impossible to deidentify an individual’s mobile phone data, the standard way of protecting personal privacy in research.”[xii] “Understanding the risks posed by third-party access to their personal health data can be difficult to communicate given the complexity of mHealth technologies.”[xiii] This calls for innovation in the manner in which consent is obtained and applied, so as to ascertain ownership of the data and the permitted uses thereof, and to give participants control over the sharing of their data, with particular attention to the low levels of digital literacy prevalent in India.
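The re-identification risk described in the quote above can be made concrete with a minimal sketch (all names, coordinates, and datasets below are invented for illustration): even after names are stripped from mobility data, a person's two most-visited location cells – a proxy for home and workplace – can act as a near-unique fingerprint that links the “anonymised” record to an identified record elsewhere.

```python
from collections import Counter

# "De-identified" location pings: (pseudonym, rounded lat/lon cell)
pings = [
    ("user_017", (19.07, 72.87)), ("user_017", (19.07, 72.87)),  # home cell
    ("user_017", (19.11, 72.86)),                                # workplace cell
    ("user_022", (28.61, 77.21)), ("user_022", (28.70, 77.10)),
]

# A separate, identified dataset keyed by the same home/work cell pair
directory = {
    ((19.07, 72.87), (19.11, 72.86)): "A. Patient, Mumbai",
}

def top_two_cells(user, data):
    """Return the user's two most frequent location cells (home/work proxy)."""
    counts = Counter(cell for u, cell in data if u == user)
    return tuple(cell for cell, _ in counts.most_common(2))

fingerprint = top_two_cells("user_017", pings)
reidentified = directory.get(fingerprint)  # the pseudonym is undone by linkage
```

Removing direct identifiers is therefore not enough; the structure of the data itself identifies the individual, which is why the authors quoted above doubt that mobile phone data can ever be fully de-identified.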
Population Health and ‘Big-Data’ – the collection of large quantities of personal data is fundamental to data-based studies and technological innovation, especially as regards epidemiological studies and population health; however, even basic health data – such as ethnicity, sexually transmitted infections, diseases with a genetic basis, and risk exposures for disease – can be misused and lead to discrimination and safety concerns. As pointed out by Wyber et al.: “Sheer size increases both the potential risks and potential benefits…Although the approach may have the most value in low-resource settings, it is also most vulnerable to fragmentation and misuse in such settings”.[xiv] Further, there is a fundamental imbalance between the poor and vulnerable majority, being the data generators, and the “profit-motivated, digitally empowered multinational companies”[xv], who are the consumers of this data, and can gain access to and exploit these large population databases even without consent.
Digital Health Data – Adapting Consent to Meet Ethical Challenges
While several factors and tools are needed to resolve the ethical conflicts referred to above, a common theme emerges: the need to empower the individual through transparent processes and innovative regulation. Building a system of trust and accountability, and in particular adapting the concept of ‘consent’ to the digital health context, is necessary to address the ethical concerns highlighted above.
Valid Consent – valid ‘consent’ as traditionally understood under Indian law requires the person giving consent to be competent, i.e. having attained majority and being of sound mind, and further, that the consent be freely given – free from the effects of coercion, undue influence, fraud, mistake, or misrepresentation. In the medical context, doctors are required to inform the patient of all the relevant details of a medical procedure or treatment before obtaining their consent. In clinical settings before the digital age, where a doctor and patient sit face to face, and diagnoses, procedures, and treatment are explained to an individual, this idea of consent may have sufficed; however, in the digital space, there are many more variables at play, for instance: (a) communication over the phone or videoconference may not always be as effective as in-person communication, especially for persons who are less educated and aware; (b) submission of information through apps or online portals may, in such cases, also result in mistakes and miscommunication; (c) personal and health data, once collected, may be re-identified later on even if anonymized, and may also be correlated with other data, such as location data, to derive secondary information from the original data; (d) data collected for certain research purposes today may be utilized by researchers for different purposes tomorrow, yet the DHPS user may not be aware of the future applications of their data.
Models of Consent – various models of consent have been the subject of debate in the medical research world. These range from explicit ‘informed consent’, in which the individual is given all relevant information in language that they understand, to enable them to voluntarily make a specific health-related decision, to ‘broad consent’ or ‘open/ blanket consent’, which represents a spectrum ranging from the individual agreeing to a broad range of applications of their data, to relinquishing confidentiality and future rights to govern the use of their data.[xvi]
Indian Context – consent in the Indian context needs to account for varying levels of digital literacy – the design of the user interface and presentation of information must provide for the lowest literacy levels. It must, further, account for the limited access to healthcare providers and facilities, the traditional reliance on local medical/ purported medical practitioners, multiple users of a single mobile device within a family, gender disparity in internet usage[xvii], and limited education, among other things.
Dynamic Consent – in the above context, dynamic consent stands out as a plausible alternative to traditional explicit consent and the contemporary concepts of broad and open consent, which occupy either end of the spectrum. Dynamic consent is a modified, personalized approach involving communication through a platform that places the user at the centre of the process, and specifically, enables researchers to reach out to participants in real-time, to obtain their consent to different applications of their data as the need arises.[xviii]
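The mechanics of dynamic consent can be sketched as a simple per-purpose consent record (the class and field names below are hypothetical, not drawn from any actual DHPS platform): the participant grants or declines each proposed use of their data as it arises, and every subsequent data use is checked against the current state of the record rather than against a one-time signature.

```python
from dataclasses import dataclass, field

@dataclass
class DynamicConsent:
    participant_id: str
    granted: set = field(default_factory=set)   # purposes currently consented to
    log: list = field(default_factory=list)     # audit trail of each decision

    def request(self, purpose: str, decision: bool) -> None:
        """Record a real-time consent decision for a proposed use of the data."""
        if decision:
            self.granted.add(purpose)
        else:
            self.granted.discard(purpose)       # declining also revokes
        self.log.append((purpose, decision))

    def may_use(self, purpose: str) -> bool:
        """Researchers check this before each use, not just at enrolment."""
        return purpose in self.granted

consent = DynamicConsent("participant-001")
consent.request("diabetes-study", True)         # consented at enrolment
consent.request("insurance-analytics", False)   # declined when later asked
```

The design point is that consent is a living record: a new purpose triggers a new `request`, and revocation takes effect immediately for all future uses, which is what distinguishes this model from broad or blanket consent.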
Consent, Clickwrap Agreements, and Contract Law
Beyond Consent – An obstacle to the adoption of dynamic consent would be the capacity of participants to invest time and energy in making decisions that they may find cumbersome or beyond their scope. The challenge for the technology is to overcome this through design. The challenge for the law is to ensure that safety mechanisms are incorporated in the legal framework. In particular, there is a need to create a safe space for innovation in the digital health sphere, such that the autonomy of the patient and the interests of the public are protected. Mechanisms like dynamic consent are transitory – they will serve for a short time, until the technology expands beyond its guardrails.
Clickwrap Agreements – i.e. mass contracts that users of digital products are usually required to sign – are adhesion contracts. These standard-form contracts are a manifestation of the complete power imbalance between the data generator and the data consumer; they sacrifice individual autonomy at the altar of technological progress. An essential ingredient of consent as traditionally understood is comprehension: the person consenting must understand the subject matter and consequences of their consent. In the fast-evolving digital space, informed consent will likely become impossible. A person who cannot comprehend the workings of a neural network, let alone contend with its larger consequences, cannot be expected to give or withhold consent in any meaningful way.
Contract Act – further to the discussion of ‘Valid Consent’ above, Section 16(3) of the Indian Contract Act, 1872 provides that “where a person who is in a position to dominate the will of another, enters into a contract with him, and the transaction appears, on the face of it or on the evidence adduced, to be unconscionable, the burden of proving that such contract was not induced by undue influence shall be upon the person in a position to dominate the will of the other.” As pointed out by Ramaseshan[xix], this may be applied by the courts to ascertain whether an adhesion contract is ‘unconscionable’, however, this leaves significant room for interpretation. In the first instance, it may not, technically, protect a person who is adjudged competent to contract; secondly, it requires the court to exercise expert judgment in respect of cutting-edge technology, which is a tall order.
Tools such as de-identification of data, robust risk assessment, delegation of data governance to a trusted third party, and other such design and administrative measures may serve as alternatives to the traditional standard of explicit informed consent;[xx] however, as highlighted above, the law must also provide for a fail-safe mechanism. Where the average person lacks the capacity to exercise ‘prudent’ judgment – a standard traditionally applied in order to ascertain liability – the legal obligation to act responsibly must be imposed on the person/ entity in the more powerful position. Thus, there is a need for innovation in the law in keeping with the speed and scope of technological innovation, to ensure that patients are not left behind in the race toward digital health.
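One of the design measures mentioned above, de-identification, can be illustrated with a minimal pseudonymisation sketch (the salt value, field names, and sample record are all invented for illustration): direct identifiers are replaced with a keyed one-way hash and quasi-identifiers are coarsened, so the research copy of a record no longer names the patient, while the same individual still maps to a consistent pseudonym across records.

```python
import hashlib

SALT = b"keep-this-secret-and-out-of-the-dataset"  # hypothetical secret key

def pseudonym(identifier: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:12]

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers."""
    return {
        "pid": pseudonym(record["name"]),
        "age_band": f"{(record['age'] // 10) * 10}s",  # e.g. 34 becomes "30s"
        "diagnosis": record["diagnosis"],
    }

raw = {"name": "A. Patient", "age": 34, "phone": "98xxxxxx01", "diagnosis": "T2DM"}
safe = deidentify(raw)
# 'name' and 'phone' never reach the research dataset, yet the stable 'pid'
# still allows longitudinal linkage of the same patient's records.
```

As the location-data example earlier in this piece shows, such measures reduce but do not eliminate re-identification risk, which is precisely why the law must also supply the fail-safe mechanism argued for above.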
[i] The World Bank – Data <https://data.worldbank.org/indicator/SH.MED.PHYS.ZS> accessed: 17 June 2020; Samarth Bansal, ‘WHO report sounds alarm on ‘doctors’ in India’ The Hindu (New Delhi, 18 July, 2016) <https://www.thehindu.com/data/WHO-report-sounds-alarm-on-%E2%80%98doctors%E2%80%99-in-India/article14495884.ece> accessed: 17 June 2020
[ii] The World Bank – Data <https://data.worldbank.org/indicator/SH.MED.BEDS.ZS?locations=IN&view=chart> accessed: 17 June 2020
[iii]<https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(10)61894-6/fulltext> accessed: 18 June 2020
[iv] The World Bank – Data <https://data.worldbank.org/indicator/SP.RUR.TOTL.ZS> accessed: 17 June 2020
[v] Samarth Bansal, ‘WHO report sounds alarm on ‘doctors’ in India’ The Hindu (New Delhi, 18 July 2016) <https://www.thehindu.com/data/WHO-report-sounds-alarm-on-%E2%80%98doctors%E2%80%99-in-India/article14495884.ece> accessed: 17 June 2020
[vi] FE Bureau, ‘A look at India’s deep digital literacy divide and why it needs to be bridged’ Financial Express (New Delhi, 24 September 2018) <https://www.financialexpress.com/education-2/a-look-at-indias-deep-digital-literacy-divide-and-why-it-needs-to-be-bridged/1323822/> accessed: 05 July 2020
[vii] Vayena Effy, Haeusermann Tobias, Adjekum Afua, Blasimme Alessandro, ‘Digital health: meeting the ethical and policy challenges’ (Swiss Medical Weekly, 16 January 2018) <https://smw.ch/article/doi/smw.2018.14571/> accessed: 28 May 2020
[viii] Vijayaprasad Gopichandran, Parasuraman Ganeshkumar, Sambit Dash, and Aarthy Ramasamy, ‘Ethical challenges of digital health technologies: Aadhaar, India’ (Bull World Health Organ 2020;98:277–281) <https://www.who.int/bulletin/volumes/98/4/19-237123.pdf> accessed: 14 June 2020
[ix] Dankar FK, Gergely M, Dankar SK, ‘Informed consent in biomedical research’ Computational and Structural Biotechnology Journal 2019;17:463–74 <https://www.sciencedirect.com/science/article/pii/S2001037018303489?via%3Dihub> accessed: 29 June 2020
[x] Caroline Brall, Peter Schroder-Back, Els Maeckelberghe, ‘Ethical aspects of digital health from a justice point of view’ European Journal of Public Health, Vol. 29, Supplement 3, 18–22 <https://academic.oup.com/eurpub/article/29/Supplement_3/18/5628045> accessed: 26 May 2020
[xi] Vayena Effy, Haeusermann Tobias, Adjekum Afua, Blasimme Alessandro, ‘Digital health: meeting the ethical and policy challenges’ (Swiss Medical Weekly, 16 January 2018) <https://smw.ch/article/doi/smw.2018.14571/> accessed: 28 May 2020
[xii] Carter A, Liddle J, Hall W, Chenery H, ‘Mobile Phones in Research and Treatment: Ethical Guidelines and Future Directions’ JMIR Mhealth Uhealth 2015;3(4):e95 <https://mhealth.jmir.org/2015/4/e95/> accessed: 05 July 2020
[xiv] Rosemary Wyber, Samuel Vaillancourt, William Perry, Priya Mannava, Temitope Folaranmi & Leo Anthony Celi, ‘Big data in global health: improving health in low- and middle-income countries’ (Bull World Health Organ 2015;93:203–208) <https://www.who.int/bulletin/volumes/93/3/14-139022.pdf> accessed: 9 June 2020
[xv] Gopichandran et al. (n viii)
[xvi] Mark Sheehan, ‘Can Broad Consent be Informed Consent?’ Public Health Ethics 2011 Nov;4(3):226-235.
[xvii] Megha Mandavia, ET Bureau, ‘India has second highest number of Internet users after China: Report’ The Economic Times (Bengaluru, Last Updated: 26 September 2019) <https://economictimes.indiatimes.com/tech/internet/india-has-second-highest-number-of-internet-users-after-china-report/articleshow/71311705.cms> accessed: 05 July 2020
[xviii] J. Kaye et al., ‘Dynamic consent: a patient interface for twenty-first century research networks’ European Journal of Human Genetics (2015) 23, 141–146 <https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4130658/pdf/ejhg201471a.pdf> accessed: 05 July 2020
[xix] V. Ramaseshan, ‘Adhesion Contracts and The Indian Law of Contract’ Vol. 17, No. 2 (April-June 1975) Journal of the Indian Law Institute, pp. 237-256 <https://www.jstor.org/stable/43950482?read-now=1&seq=1#metadata_info_tab_contents> accessed: 05 July 2020
[xx] Dankar et al. (n ix)
By Ms. Kim D’Souza, Associate Fellow at Vidhi Centre for Legal Policy. She has been assisted by Sehaj Singh Cheema, a 3rd Year student of the Rajiv Gandhi National University of Law, Punjab and a Junior Editor at RSRR. This blog is part of the RSRR Blog Series on Digital Healthcare in India, in collaboration with Nishith Desai Associates.