Catholic moral theologians, ethicists back Anthropic in government AI showdown
Catholic moral theologians and ethicists have filed an amicus brief supporting AI company Anthropic in its lawsuit against the U.S. Department of War. The experts affirmed Anthropic's stance of maintaining guardrails on its AI technology with respect to autonomous weapons and mass surveillance as an act of responsible corporate citizenship. Anthropic sued the Pentagon after the President directed government agencies to cease working with the company amid disagreements over acceptable uses of its technology by the War Department. Fourteen scholars specializing in Catholic moral theology, philosophy, and social thought contributed to the brief, offering a perspective grounded in moral tradition.
Anthropic sued the U.S. Department of War (DoW) on March 9, 2026, after losing a $200 million contract and receiving a "supply chain risk" designation, the first ever applied to a U.S. company, over its refusal to allow unrestricted AI use for mass surveillance and lethal autonomous weapon systems (LAWS).[1][3][4] President Trump directed agencies to halt use of Anthropic products within six months, prompting the lawsuit, which alleges First and Fifth Amendment violations.[3][4][5]
Anthropic CEO Dario Amodei stated that current AI lacks the reliability required for life-or-death decisions without human oversight or for population-scale surveillance.[1][3][4]
On March 13, 2026, 14 Catholic moral theologians and ethicists filed an amicus curiae brief supporting Anthropic in the U.S. District Court for the Northern District of California.[1][3][4] Its principal authors are Charles Camosy (Catholic University of America), Joseph Vukov (Loyola Chicago), Brian J.A. Boyd, and Brian Patrick Green (Santa Clara University).[1][3][4]
Signers include professors, authors, and Father Michael Baggot; they praised Anthropic as a "responsible and moral corporate citizen."[3][4][5] The brief grounds its arguments in Catholic tradition on human dignity, privacy, and just war theory.[1][4]
Catholic scholars cite Church teaching on privacy from the Catechism and Pope Francis's 2023 call for AI treaties to prevent a "surveillance society."[1][3] The brief argues that military mass surveillance undermines human relationships and dignity, treating people as data objects in a manner akin to the "technocratic paradigm."[1]
The principle of subsidiarity, articulated in Pope Pius XI's Quadragesimo Anno, opposes the centralization of monitoring power, which disempowers local agencies and risks totalitarianism.[1][3][4] The scholars align with Anthropic but emphasize moral boundaries that go beyond technical limits.[3][5]
LAWS fail jus in bello requirements such as proportionality and noncombatant immunity, which demand human prudential judgment, not AI pattern matching.[1][3][4] The Vatican has opposed LAWS since 2013; the scholars reject them even if they were reliable, unlike Anthropic's focus on current technical limitations.[1][3][5]
LAWS obscure moral responsibility, accelerate decisions, and bypass the human oversight essential for life-and-death choices.[1][4] Camosy stressed that war requires human moral accountability.[4]
Camosy told Crux that war demands human oversight for the sake of justice.[4] Vukov highlighted the murky assignment of responsibility in machine decisions.[4] Boyd called Anthropic's stance "responsible citizenship" in the face of government threats.[4]
Green praised Anthropic's stand for sparking debate on AI ethics but noted a potential future tension, since Catholic teaching rejects LAWS outright.[3][4][5] The brief quotes Pope Benedict XVI's Spe Salvi on how unchecked technological progress threatens humanity.[3][5]
The dispute has ignited a national discussion of AI ethics, echoing Vatican warnings.[3][4] U.S. policy requires "appropriate" human judgment in the use of weapons, but that flexibility raises concerns amid an arms race.[4] The scholars urge ethical boundaries on technology that violates human dignity, regardless of legality.[1][3]
Does Catholic moral teaching permit AI firms to refuse military use?
Catholic moral teaching not only permits but often requires refusal to cooperate in actions that gravely violate human dignity, such as the development or use of lethal autonomous weapon systems (LAWS) lacking meaningful human control. This principle extends to AI firms, drawing from teachings on conscientious objection, non-cooperation in evil, and the urgent ethical concerns surrounding AI in warfare.
Catholic doctrine affirms the right and duty of conscience to refuse participation in immoral acts, even when mandated by civil authority or professional obligations. The Compendium of the Social Doctrine of the Church states:
Citizens are not obligated in conscience to follow the prescriptions of civil authorities if their precepts are contrary to the demands of the moral order, to the fundamental rights of persons or to the teachings of the Gospel. Unjust laws pose dramatic problems of conscience for morally upright people: when they are called to cooperate in morally evil acts they must refuse.
This refusal is a basic human right, protected from legal, professional, or financial penalties. The Catechism of the Catholic Church (CCC) further explains cooperation in sin:
We have a responsibility for the sins committed by others when we cooperate in them: by participating directly and voluntarily in them; by ordering, advising, praising, or approving them; by not disclosing or not hindering them when we have an obligation to do so; by protecting evil-doers.
Pope Leo XIV recently emphasized freedom of conscience as essential, allowing refusal of "legal or professional obligations that conflict with moral, ethical or religious principles," including military service. Scholarly analyses rooted in Aquinas and magisterial texts confirm that laws requiring participation in unjust acts (e.g., unjust wars) "must nowise be observed, because... we ought to obey God rather than man" (Acts 5:29).
These principles apply broadly and are not limited to individuals: firms, as communities of persons, share moral responsibility and cannot invoke contracts or profit to justify evil.
Papal and dicasterial teachings repeatedly highlight AI's military applications—especially LAWS—as intrinsically problematic due to the delegation of life-and-death decisions to machines, which lack moral judgment. Pope Francis, in his G7 address, insisted:
No machine should ever choose to take the life of a human being.
The Dicastery for the Doctrine of the Faith's Antiqua et Nova deems LAWS a "cause for grave ethical concern" because they lack "the unique human capacity for moral judgment and ethical decision-making," calling for their prohibition. Pope Francis reiterated in his 2024 World Day of Peace message:
Autonomous weapon systems can never be morally responsible subjects. The unique human capacity for moral judgment... cannot be reduced to programming a machine... It is imperative to ensure adequate, meaningful and consistent human oversight.
Pope Leo XIV echoes this, questioning how AI serves the common good without undermining human dignity. Holy See statements urge a moratorium or ban on LAWS, as they detach warfare from human agency and risk indiscriminate harm.
Developing or supplying such AI constitutes formal or direct cooperation in evil if it enables intrinsically immoral acts, like autonomous killing, triggering the duty to refuse.
While sources primarily address individuals (e.g., soldiers, doctors), the principles scale to corporations: firms must prioritize human dignity over contracts. Pope Francis warns against AI fostering a "throwaway culture" or exacerbating injustice, demanding "proper human control." Refusal aligns with calls for "algor-ethics" and "ethics by design," where ethical considerations guide from research inception.
The Compendium obliges military personnel to resist criminal orders, extending analogously to suppliers enabling such crimes. Scholarly works affirm Catholic scrutiny of state force, supporting objection to "some methods of using force" or branches enabling immoral missions. No source mandates AI firm participation in military projects; instead, they promote peace-oriented AI use.
Firms refusing military AI contracts (e.g., LAWS) exercise conscientious objection, safeguarding conscience and the common good.
In summary, Catholic teaching permits—and in cases of grave evil like uncontrolled lethal AI, requires—AI firms to refuse military use, prioritizing obedience to God and human dignity over worldly demands. This fosters a "culture of encounter" amid technological revolutions.