The legal profession in India is witnessing a seismic technological shift. Over the last two years, elite law firms have rapidly integrated specialised legal AI services into their workflows, keeping pace with global counterparts. Tools like Lucio, Harvey and Legora are dramatically enhancing the research, drafting and analytical capabilities of lawyers. In stark contrast, irresponsible and unethical use of AI for drafting petitions has rightfully earned the ire of Supreme Court justices. Meanwhile, legal aid counsels and National Legal Services Authority (NALSA) empanelled lawyers, who do the heavy lifting for litigants relying on state legal aid and pro bono services, continue to depend on manual research, stretched resources and, increasingly, free AI platforms like ChatGPT.
This asymmetry, if allowed to metastasise, risks engendering not merely a digital divide but a substantive justice gap. With the availability of legal aid already grossly insufficient, we face a future where the common citizen’s access to quality legal counsel and defence suffers due to the technological deprivation of legal aid institutions and their representatives. Given the rapid progress of generative AI and the emergence of indigenous startups, there is a real opportunity for the GOI to empower its standing army of over 41,000 panel advocates and 3,164 Legal Aid Defence Counsels (LADCs) with modern technology.
Failure to act would run counter to Article 39A of the Constitution, which mandates equal justice and free legal aid. To ensure this remains a living guarantee, the state cannot be a spectator while legal AI becomes the preserve of elite law firms. NALSA must engage, not as a distant observer, but as an active market participant and ‘model adopter’, ensuring legal AI services in the country are secure, confidential, and localised to cater to India’s diverse linguistic needs.
The legal basis for such intervention is already embedded in the Legal Services Authorities Act, 1987 (Act). Under Section 4, NALSA is mandated to frame “the most effective and economical schemes” and “undertake and promote research in the field of legal services” to ensure justice is not denied by reason of “economic or other disabilities”. In the contemporary context, lack of access to specialised legal AI is itself a disability.
The advantages of these tools in efficiency and research depth are well documented; their absence creates an outright barrier for legal aid lawyers. By invoking Section 4, NALSA can act as a key procurer, setting stringent guidelines on data security, local contextualisation and hallucination prevention. In fact, given the sheer numerical strength of LADCs and panel lawyers, NALSA could become the largest institutional procurer of legal AI services globally.
This aligns with the GOI’s “AI for All” strategy and Phase III of the e-Courts project, which envisions AI-assisted decision-making, case management, forecasting and automated workflows. Such a procurement programme would ensure that the “competent legal services” promised under the Act are not outpaced by the private market’s digital evolution.
Data security and attorney-client privilege are critical concerns. Currently, many lawyers lacking specialised tools are turning to public, free LLMs, creating a privacy nightmare: such use feeds granular, sensitive personal data of vulnerable litigants into unsecured, non-domestic servers. This risks the use of that data for model training, aggregated learning and targeted commercialisation, amounting to a serious breach of attorney-client privilege.
To ensure data confidentiality and limited, purposeful use, NALSA can build on the security layer of legal AI service providers by mandating, as part of its procurement conditions, that they host the data of LADCs and panel lawyers on secure, self-hosted servers. This would establish a walled garden of litigant data, ensuring it is not put to private commercial use.
NALSA could also introduce a “privilege shield”: work products generated within this state-procured tool should enjoy enhanced legal protections. This would incentivise LADCs and panel lawyers to abandon insecure, free alternatives in favour of a secure, state-sanctioned ecosystem.
Critics may argue that legal AI tools for legal aid lawyers are an unnecessary luxury. On the contrary, they are a mathematical necessity for expanding legal aid services. The British economist William Stanley Jevons observed in the 19th century that gains in technological efficiency tend to drive increased consumption, not less. The same logic holds for the availability of legal aid in India. At current market rates for high-quality legal AI services, the annual cost of procuring them for 41,000 panel advocates and 3,164 LADCs would not exceed Rs 30 crore, less than 10% of the central grant NALSA received in 2023-24. With finite budgets and lawyer hours available for legal aid, the only sustainable path to increasing the availability of legal aid services is to leverage technology.
Currently, an LADC may be overwhelmed by caseloads that preclude high-quality, time-bound representation. By automating large segments of a lawyer’s workflow, specialised legal AI lowers the time investment required for effective representation. This increased productivity would allow for an expansion of coverage: more litigants can be served with a level of rigour previously reserved for those who could afford private counsel.
Much like the UPI success story, the state and judiciary have much to deliver as active market participants. Today, the public sector has shown little participation in shaping how AI is leveraged by legal practitioners. As a primary provider of specialised legal AI, the state would have both the budget and the data necessary to influence design principles and ethical standards.
As the CJI has articulated, access to justice is a “sapient right”, and while technology acts as a “force multiplier”, it must not reduce justice to a “factory of canned responses”. AI for NALSA does not seek to automate the judicial mind but to equip legal aid lawyers with the same analytical toolkit available to the most expensive corporate counsel. If the state fails to democratise access to legal AI, it will inadvertently privatise the quality of justice itself.
Dhawan is an LLM candidate at Yale Law School (Information Society Project). Agrawal is Head of Strategy at Lucio.