Photojournalism relies on a set of ethical principles built around truth, trust, and the safety of the people photographers document. The introduction of AI into newsrooms puts all three at risk.
Trust and consent: Photojournalists are granted access to people's most vulnerable moments. That access is built on trust, a promise to document reality faithfully and to protect the people photographed. Using those images to train AI without a source's knowledge or consent breaks that promise: people who agree to be documented in moments of crisis, grief, or vulnerability did not agree to have their likenesses used to train commercial AI systems.
Truth and ethics: A generated image cannot tell the truth. It has no ethics, no accountability, and no moment behind it. Throughout journalism's history, documentary photographers have followed strict ethical guidelines to communicate reality to readers. AI models are bound by none of them. If news organizations use photojournalists' work to train AI, they undermine the very standards their own editorial guidelines are supposed to uphold.
Safety: AI-based surveillance tools are trained on the same kinds of images photojournalists make. The proliferation of these tools puts sources, subjects, and vulnerable communities at real risk — eroding a publication's ability to guarantee the safety of the people who appear in its pages.
This risk is not theoretical. Photographers working with undocumented communities, asylum seekers, activists, and others whose safety depends on anonymity have built their practice on an implicit promise: that access granted in trust will be protected. Work Made for Hire (WMFH) contracts and unlimited sublicensing rights break that promise at a structural level. A publication that holds WMFH rights to an entire body of work, including unpublished images from an assignment, has the legal basis to feed all of it into AI systems, including those that power facial recognition and surveillance technology. No industry-wide ethical guidelines currently prevent it from doing so. This is a present reality that photographers working in sensitive contexts can no longer ignore.
Economics: Photojournalism is already one of the most financially precarious forms of photography. Many photojournalists depend on licensing their work to make ends meet and have carefully protected their copyright to do so. These new WMFH contracts eliminate those protections, opening the door to uncompensated AI use. Some publications are introducing these contracts while simultaneously negotiating multimillion-dollar deals to license editorial content, including photographs, to AI companies.
Join our efforts by signing up for our newsletter.
Explore resources on this site:
FAQs: Your Visual Colleagues put this together to answer the questions we hear most from photographers who have signed our statement — and those who are still deciding.
AI and Photojournalism: The introduction of AI into newsrooms puts truth, trust, and the safety of the people we photograph at risk. Here's why it matters.
AI and Commercial Photography: From ad campaigns to stock libraries, AI is displacing photographers and dismantling the client relationships that sustain commercial careers.
The WSJ Contract: Publications like The Wall Street Journal are rewriting the terms of freelance photography. Here's how we got here — and what the fine print actually means.
Understanding Your Contract: Three contract terms every photographer needs to know — and what to do if you find them in your agreement.
What You Can Do: Concrete steps photographers can take right now to protect their work, their archive, and the people they photograph.
Resources: Organizations, tools, and legislation for photographers navigating the age of AI.