Category: Paperless Courts

  • When Courts Confuse Asymmetry with Injustice: Kenya’s AI Ruling and the Fear of the Machine

    When Courts Confuse Asymmetry with Injustice: Kenya’s AI Ruling and the Fear of the Machine

    A comparative East African reflection on artificial intelligence, procedural fairness, and the future of legal drafting



A self‑represented litigant in Nairobi used artificial intelligence to draft his pleadings. He reviewed, edited, and adopted every word. His filings contained no fabricated cases and no false citations. He acted transparently, disclosing his use of AI tools.

    Then the High Court of Kenya at Milimani set aside his judgment, called his conduct an abuse of process, and barred him from ever filing any “machine‑generated” pleading in any Kenyan court – unless Parliament first passes a law explicitly allowing AI‑assisted drafting.

    That is not judicial caution. It is judicial anxiety in the face of technological disruption.

    The Ruling in Brief

In HCJRMISC/E120/2025, High Court of Kenya at Milimani, Nairobi (ruling delivered 16 April 2026), Justice J. Chigiti (SC) considered whether it is legal to draft pleadings using artificial intelligence tools. The respondent/ex parte applicant admitted using what he described as ordinary digital tools, including legal research tools, to assist in writing. He maintained that he had personally reviewed, edited, and adopted every document and remained personally responsible for all factual statements on oath and legal citations. He argued that his pleadings contained no fabricated cases, false citations, or invented quotations, and that, being self‑represented, he had used lawful tools to participate effectively in court.

    The Court disagreed. It held that:

    · The use of personalised drafting tools, structures and methodologies not provided for under the rules of drafting was “deplorable”.
    · Allowing such departures would create a “litigation disaster” leaving judges with no guiding beacons.
    · Generating pleadings through unknown tools or AI gives an unfair advantage to the user, amounting to an affront to access to justice under Article 48 of the Constitution.
· The fact that the applicant admitted using such tools amounted to an abuse of the court process.
    · The applicant could not “vouch for or verify for the court the truthfulness or accuracy” of AI‑generated pleadings, because that would mean he acted as a judge in his own case, violating natural justice.

On that basis, the Court barred the applicant from filing any further machine‑generated pleadings in any court, unless a law is passed in Kenya allowing or providing for drafting with artificial intelligence tools.

    The Court did observe that technology is a powerful socio‑economic growth tool when harnessed within a legal framework, and invited the Rules Committee to consider amending the Civil Procedure Rules through public participation to embrace technology and AI drafting rules. But the prohibition stands.



    The Flaws in the Judgment

    Respectfully, the ruling cannot withstand serious scrutiny. I identify four fundamental errors.

    1. The “Procedural Integrity” Error

    The Court reasoned that because the Civil Procedure Rules do not mention AI, using AI is unlawful. But the Civil Procedure Rules do not mention laptops, either. They do not mention word processors, grammar‑check software, the delete key, or the backspace button. No judge has ever struck a pleading for being typed rather than handwritten.

    Silence in the rules is not a prohibition. It is a gap that the rules themselves empower courts to fill – reasonably, proportionately, and with an eye to justice, not to ritual.

    2. The “Unfair Advantage” Error – This One Is Fatal

    The Court held that a litigant using AI has an unfair advantage over one who does not, and that this violates equality of arms.

    Let us apply that logic consistently.

    · Google vs. Law Reports – A lawyer with a smartphone and an internet connection can find authorities in seconds. Another, relying on a dusty shelf of hardbound law reports, takes hours. Is that unfair? No judge has ever said so.
    · AfricanLII / KenyaLII – These digital databases make case law searchable, cross‑referenced, and instantly accessible. A litigant without them is at a disadvantage. Has any court called that an affront to Article 48? On the contrary, the Judiciary itself promotes these tools.
    · ULII (Uganda Legal Information Institute) – It now uses AI to summarise judgments. No judge in Uganda has condemned it. No advocate has been barred for citing an AI‑generated summary. The tool is public, free, and welcomed.
    · Modern medicine – A patient in a Nairobi teaching hospital has access to MRI scans, robotic surgery, and AI‑assisted diagnostics. A patient in a remote clinic does not. That inequality is real. But no court has banned MRI machines because not everyone can afford them. The answer is to spread the technology, not to ban it.

    The Court confused asymmetry with injustice. An asymmetry is unjust only when it is arbitrary (only one side gets the tool), hidden (use is not disclosed), or undermines a core right (such as the ability to test evidence). None of those conditions applied here. The litigant disclosed his AI use. The tools are widely available. And the core right – to present a truthful, coherent pleading – was enhanced, not undermined.

    If the Court’s logic were applied consistently, we would still be filing pleadings in quill and ink. The unfair advantage is not in the tool. It is in the refusal to adapt.

    3. The “Judicial Capacity” Error

    The Court said it cannot “verify” AI‑generated content, so the safer course is to ban it entirely.

    But courts never “verify” how a human wrote a pleading. They do not audit pen strokes, interview secretaries, or review dictation logs. They look at the final document. If it contains lies, fake cases, or false citations, they sanction the filer. That same framework works perfectly well for AI.

    The Court could have required disclosure, a personal verification oath, and a statement that no fabricated content is included. That is governance, not prohibition. Instead, it chose the nuclear option.

    4. The “Parliament’s Prerogative” Error

    The Court held that only Parliament, not the courts, can authorise AI use in legal process.

    Artificial intelligence is not a controlled substance. It is a tool. Courts do not need a statute to permit the use of search engines, word processors, or online databases. They do not need an Act of Parliament to allow a lawyer to take a typing class.

    Mandating a legislative framework for basic productivity software is not judicial restraint. It is jurisdictional abdication.



    A Constitutional Mirror: Article 159 of the Kenya Constitution

The ruling’s approach sits uneasily with Kenya’s own constitutional framework. Article 159(2)(d) of the Constitution of Kenya, 2010 commands that “justice shall be administered without undue regard to procedural technicalities.”

    Procedure exists to serve justice – not to imprison it. A prohibition on an entire category of drafting tools, without any evidence of misuse, elevates form over substance. That is precisely what Article 159 warns against.

    If a self‑represented litigant files a pleading that is truthful, coherent, and personally verified, does the mere fact that an AI assisted in its composition make it less worthy of consideration? The Constitution suggests the answer is no.



    What the Court Could Have Done – And What Others Are Doing

    A more thoughtful, proportionate approach is not only possible; it is already being implemented elsewhere.

    In Kenya itself, Justice Bahati Mwamuye recently struck out an AI‑assisted filing – but for procedural defects (missing notice statements, non‑compliant affidavits), not for AI use itself. He gave the litigant leave to refile. That is proportionate. (See AllAfrica, 11 March 2026)

    Internationally, Singapore’s State Courts have issued a detailed Guide on the Use of Generative Artificial Intelligence Tools by Court Users (effective 1 October 2024). Lawyers may use AI but remain fully responsible for all content; must fact‑check; must not fabricate evidence; violations may lead to sanctions. No prohibition. Just governance. (Registrar’s Circular No. 9, State Courts of Singapore)

    In Estonia, small contract disputes below €7,000 can be decided by an AI judge that proposes a decision; a human judge then reviews and may modify or set it aside. That system has reduced backlog without sacrificing due process. (Law Society Journal, Australia, August 2024)

    Even Kenya’s own Chief Justice, Martha Koome, announced in August 2025 that the Judiciary is developing an AI Adoption Policy Framework to guide integration of AI tools while safeguarding judicial independence, data privacy and due process. (Judiciary of Kenya official website, 11 August 2025)

    The Chigiti ruling is swimming against the tide of its own institution’s planning.

    The correct path is clear:

    · Disclosure – A litigant or lawyer using AI to draft pleadings should disclose that fact.
    · Verification – The filer must personally review and adopt all content, swearing to its truthfulness.
    · Accountability – False citations, fabricated cases, or misleading content remain sanctionable, whether written by a human or generated by a machine.
    · No prohibition – The tool itself is not the offence. Misuse is.

    The Legal Profession Responds

    Prominent Kenyan lawyers have reacted with dismay.

    Ahmednasir Abdullahi, SC, one of Kenya’s most respected advocates, wrote on X: “What an absurd decision. Does it matter whether one drafts pleadings using AI tools or uses a typewriter? It is none of the court’s business.” (Nairobi Law Monthly, 21 April 2026)

    Steve Biko Wafula, senior counsel, published a detailed critique: “This ruling reads less like modern jurisprudence and more like a judicial panic attack in the face of technological change… The court had a first‑rate jurisprudential problem in its hands and squandered it, trying instead to drag the administration of justice back into the pre‑digital age.” (Soko Directory, 21 April 2026)

    These are not fringe voices. They are the heart of the Kenyan bar.

    A Word to My Ugandan Colleagues – And to Our Judges

I write from Uganda, where we have not (yet) seen a ruling of this kind. Our judges have quietly tolerated – perhaps even welcomed – the steady digitisation of practice. We use e‑filing, and we cite ULII’s AI‑generated summaries without panic.

    But the same instinct that produced the Chigiti ruling lives everywhere: the fear that the machine will replace the judge, that the algorithm will swallow the advocate, that technology will dissolve the profession’s hard‑won exclusivity.

    That fear is misplaced.

    AI does not abolish judgment. It does not abolish ethics. It does not abolish the court’s ultimate authority. What AI abolishes is inefficiency – hours spent searching for authorities that software can locate in seconds, repetitive drafting, and the false prestige built around scarcity of technical knowledge.

    And perhaps that is what truly frightens some corners of the profession. When information becomes democratised, gatekeepers begin to sweat.

    But justice does not belong to the gatekeepers. It belongs to the public. And the public does not care whether a pleading was drafted by candlelight, typewriter, Microsoft Word, or artificial intelligence. The public cares whether justice is accessible, affordable, timely, intelligible, and fair.

    If AI helps achieve that mission, then resisting it is not conservatism. It is obstruction. And obstruction disguised as professionalism remains obstruction.

    Conclusion: The Future Cannot Be Injuncted

    History is littered with institutions that initially resisted the printing press, telephones, computers and the internet – only to later embrace them as essential tools.

    Did the world wait for a complete legal framework before embracing mobile money? Did banks issue a constitutional petition before M‑Pesa rewired African commerce? Did Western Union obtain an injunction against digital wallets because “money transfers” had traditionally been their sacred territory? Of course not.

    Technology arrived. Society adapted. Regulators followed. That is how civilisation has always moved.

    The same will happen with AI in the legal profession. The only remaining question is: will courts lead this transformation – or become footnotes in it?

    To our Kenyan brothers and sisters: this ruling is a warning for all of us. Not because Kenya is wrong, but because the same instinct – to fear the machine, to reach for a prohibition when a guideline would suffice – lives in every jurisdiction, including ours. The question is not whether Uganda will face this debate. The question is whether we will face it more wisely.

    And to any judge reading this: thank you for your service. But please, do not ban the future. Regulate it, guide it, human‑oversight it – but do not pretend that a tool becomes an abuse simply because it is new.

    This time, let us not make the same mistake.

    ― END ―

    Disclaimer: This blog is a critique of a judicial ruling and a contribution to the conversation on technology and legal practice. It is not intended as legal advice, nor as an attack on any judicial officer or institution. The author remains committed to the rule of law, judicial independence, and the responsible integration of technology into the administration of justice.



    Enen Ambrose

    Member: Judiciary Affairs Committee

    Uganda Law Society

    For feedback or questions, write to: enen@enenlegalworld.com

  • The Quiet Violence of Procedure: When Digital Service Serves No One

    The Quiet Violence of Procedure: When Digital Service Serves No One



    There is a quiet violence in procedure. It does not shout. It does not argue. It simply assumes; and in that assumption, rights collapse without anyone noticing. This is exactly what happened in two recent High Court decisions: Visare Uganda Ltd vs Festus Katerega T/A Quickway Auctioneers and 3 others. A copy of it can be accessed here:

and: Western Cable Company Limited vs. Juliet Namuli Asiya and 7 others. A copy of the ruling can be accessed here:



    A case is filed. A hearing date is fixed. Somewhere deep within a digital system, a notice is uploaded. The law nods in satisfaction: service has been effected. The machinery moves. The courtroom sits. The judge writes. And somewhere else, perhaps across the city, perhaps across a fragile internet connection, a litigant knows nothing.

    We call this progress.

    We call this efficiency.

    We even call it justice.

    In the recent ruling of the High Court of Uganda in Misc. Application No. 2289 of 2025, the court took the position that once a hearing notice is posted onto ECCMIS, service is complete. It held that it is not mandatory for a party to actually receive an email or SMS notification, so long as the system reflects that service was effected.

    The implication is stark: the burden shifts entirely to the litigant or counsel to constantly monitor the system. Failure to do so is fatal. A case may be dismissed. Rights may evaporate. And yet, in the eyes of the law, nothing has gone wrong.

    But open justice demands something far more stubborn, far more human. It demands not that proceedings merely exist in public form, but that those whose rights are at stake are actually present; or at the very least, actually aware. The old wisdom insisted that justice must be seen to be done. It did not imagine a world where justice could be technically visible yet practically invisible; where a notice exists, but never reaches; where a hearing occurs, but never touches the party it condemns.

    And this is not an abstract concern. It is a doctrinal one.

    The Supreme Court of Uganda, in Geoffrey Gatete & Another v William Kyobe, confronted a similar question under the language of “deemed good service.” The Court drew a careful and deliberate distinction; one that modern digital procedure now risks erasing.

    It held that “deemed service” is a legal fiction, a procedural convenience that allows courts to proceed even where actual notice may not be proven. But it went further to warn that such service does not necessarily amount to “effective service.” For service to be effective, it must achieve its intended purpose: to bring the proceedings to the attention of the party.

    A copy of the decision in Gatete can be accessed here:



    This distinction is not semantic. It is foundational.

    Because once the law accepts that something may be “deemed” without being real, it must also accept the consequences; that the fiction may fail in practice. And where it fails, justice demands correction.

    Yes, there will be cases where a litigant deliberately avoids monitoring the system. But the system cannot punish the many for the bad faith of the few; especially when actual notice remains technically possible.

    Yet the modern system presses on, collapsing this distinction. ECCMIS becomes both the record and the proof, both the act and its consequence. Once a notice is uploaded, the law assumes its journey is complete.

    But a system is not a voice. A database is not a message. A record is not communication.

    And so we arrive at a troubling convergence: a digital architecture that satisfies procedural form while undermining substantive awareness.

    Context makes this even more urgent. Even in Kampala, internet access is not constant. Connectivity fluctuates. Costs are high. Power is unreliable. To build a legal system on the assumption that litigants and advocates will perpetually monitor an online platform is to design justice for an ideal world, not the real one.

    What then becomes of open justice?

    It remains, perhaps, in architecture. The courtroom doors are still open. The rulings are still written. The processes are still documented. But the litigant; the very person for whom the system exists; may never arrive, not out of defiance, but out of ignorance.

    And in that moment, something profound happens.

    Justice is no longer denied loudly. It is denied quietly.

    Not in secrecy, but in silence.

    Not by concealment, but by assumption.

    Justice does not only die in closed courtrooms. It also dies in silent systems, where notices exist, but never reach.

    This is not an argument against technology. It is an argument against unquestioned technology. Against systems that replace human communication with automated presumption. Against a jurisprudence that confuses efficiency with fairness.

    The answer is neither retreat nor resistance. It is correction.

    If ECCMIS is to be the backbone of modern judicial administration, then it must evolve beyond being a passive repository into an active communicator. It must speak, not just store. It must reach, not just record.

External notification systems are not luxuries; they are necessities. SMS alerts. Email notifications. Web‑based and Android push notifications. Real‑time prompts that move beyond the confines of the system and into the lived reality of the user. And more than that, they must not be optional embellishments. They must be integral guarantees, designed to ensure that service is not merely deemed, but actually effected.
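The distinction between a notice that is merely stored and one that actually reaches a party can be made concrete. The sketch below is purely illustrative and assumes nothing about how ECCMIS is actually built (the names `Litigant`, `ServiceRecord`, and `serve_notice` are the author's hypotheticals): it models service as "effected" only when at least one outbound channel succeeds, so that a failed delivery is visible and can trigger a fallback, rather than a silent presumption of notice.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical channel senders. A real integration would call an SMS
# gateway, a mail server, or a push service, and would report success
# only on a confirmed handover to the recipient.
def send_sms(phone: Optional[str], text: str) -> bool:
    return phone is not None

def send_email(address: Optional[str], text: str) -> bool:
    return address is not None

@dataclass
class Litigant:
    name: str
    phone: Optional[str] = None
    email: Optional[str] = None

@dataclass
class ServiceRecord:
    notice: str
    attempts: dict = field(default_factory=dict)

    @property
    def effected(self) -> bool:
        # Service counts as effected only if at least one channel
        # actually reached the litigant, not merely because the
        # notice exists in the repository.
        return any(self.attempts.values())

def serve_notice(litigant: Litigant, notice: str) -> ServiceRecord:
    """Attempt every outbound channel and record what succeeded."""
    record = ServiceRecord(notice=notice)
    record.attempts["sms"] = send_sms(litigant.phone, notice)
    record.attempts["email"] = send_email(litigant.email, notice)
    if not record.effected:
        # Nothing reached the party: flag the matter for fallback
        # (e.g. physical service) instead of deeming service complete.
        print(f"WARNING: no channel reached {litigant.name}; fallback needed")
    return record
```

The design choice the sketch illustrates is the one this post argues for: the delivery record, not the upload, is the evidence of service, and a wholly failed delivery surfaces as an exception demanding human follow‑up rather than a fiction the file quietly absorbs.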

    The Judiciary and the architects behind ECCMIS stand at a critical threshold. They have built the infrastructure. Now they must build the connection.

    Because the law may deem service to be good, but justice demands that service be real.

A system that merely stores notices, without ensuring they reach those whose rights are at stake, does not advance justice; it endangers it. In a jurisdiction where access to digital infrastructure is uneven, to insist that litigants must constantly patrol an online platform is to replace fairness with fiction.

    Technology must serve justice, not obscure it.

    There is an old wisdom in scripture: No one lights a lamp and puts it under a bed. Instead, they set it on a stand, so that those who enter may see.

    ECCMIS is that lamp, lit, visible in theory. But when a notice sits in a database without actively reaching the litigant, we have placed it under the bed. The light exists. It just does not shine where it is needed most. (Mark 4:21)

    Let ECCMIS evolve, blending its internal efficiency with robust external communication, ensuring that every litigant is not merely assumed to know, but is given a real opportunity to know.

    For if justice is to remain open, it must also remain visible.

    Otherwise, quietly and without protest,
    justice will die in the darkness of its own systems.
    -THE END-

    Disclaimers:

This Blog is not an attack on the judicial officers who handed down the two decisions criticised above. It is not an attack on the institution of the Judiciary or the ECCMIS developers. It is intended to spark conversations that make E-Justice, and the whole E-Government digital transformation, a complete and wholesome journey and experience.

This Blog is not to be substituted for or taken as legal advice. The author does not accept responsibility or liability for damage suffered as a result of its use as legal advice. Readers are encouraged to consult a qualified and licensed attorney for situation-specific legal advice.


    Enen Ambrose

    Member, Judiciary Affairs Committee of

    Uganda Law Society.

    For feedback or questions, write to: enen@enenlegalworld.com