The Children’s Justice Fund (CJF) is an IRC 501(c)(3) nonprofit organization whose main purpose is providing financial support, technical assistance, and strategic guidance to unrelated organizations, institutions, and individuals that serve victims of child trafficking, child sex abuse, online child sexual exploitation, and child pornography. CJF conducts and promotes legal, empirical, and social science research with the goal of encouraging the development and implementation of child-victim-centered best practices, public policy, and law reform. A core part of this work is filing amicus briefs, writing law review articles, and issuing papers and reports in furtherance of CJF’s overall mission and focus on child victims.

The Stop CSAM Act is the worst piece of legislation for crime victims in 30 years

by James R. Marsh – February 2, 2024

When I started working in Congress almost thirty years ago, my goal was to help create a more just and equitable world for children and crime victims, a world that offered society’s most vulnerable victims not only the promise of justice, but the reality of justice and fairness.

Twenty years ago, when I discovered that children were being trafficked to create child pornography (now called child sexual abuse material or CSAM—a term I pioneered in the federal courts), I helped create Masha’s Law—named after my client—to give victims a meaningful civil remedy in federal court. The bill passed both houses of Congress unanimously.

Ten years ago, when the United States Supreme Court told my CSAM client Amy that piecemeal compensation from criminal defendants was all that Congress required, I helped pass the Amy, Vicky, and Andy Act—again named after my clients—to establish a minimum amount of compensation for victims from the criminals who trafficked and collected their images. The bill passed both houses of Congress unanimously and established a compensation fund for victims.

Now, ten years later, Congress says it wants a reckoning with big tech—the new traffickers and criminals exploiting our children. Sadly, Stop CSAM doesn’t make things better; it makes things worse.

Imagine a world where, in order to hold the Catholic Church accountable, the bishop would need to know that a child was being raped and would then have 48 hours to stop that rape. If the bishop did not know of each individual occurrence of child rape, the Church could not be held responsible. Imagine a world in which, even if the bishop knew that a child was being raped right next door in the rectory, he had 48 hours to stop the rape and could be held responsible only at hour 48 and one second.

Imagine the same law applied to the Boy Scouts of America, the Southern Baptist Convention, foster and group homes, and day cares. Would such a law make things better for children or worse?

Then imagine a world in which the lawyers bringing these cases on behalf of child victims could be sanctioned if a judge somewhere found that the offending material was not really CSAM but child erotica. We call this material lawful but awful; the girl with semen on her back, or with panties stuffed into her mouth and the words “hurt me” scrawled across her forehead, might be awful, but it is sadly often considered lawful. Stop CSAM gives big tech the right to demand mini-trials over whether a lawsuit should have been brought in the first place, with sanctions if an attorney ‘got it wrong.’ Thanks to Stop CSAM, the victims will now be paying the tech companies. Is that the kind of accountability Congress is demanding?

This is the unfortunate world Stop CSAM will create. Despite the many positive aspects of this bill, which I thoroughly support, this part of the bill, Section 2255A, has only gotten worse in the past year: more punitive, more intolerant of victims and justice, and more intransigent toward reasonable changes.

Stop CSAM will create a world in which victims themselves will need to police social media, the internet, and the dark web to try to locate their CSAM so they can notify tech companies, in the hope that the images will be taken down within 48 hours. And if the tech companies refuse to take the images down—deeming them not in conflict with their terms of service—the victims will have to find a lawyer to bring a case and then be prepared to pay the tech companies money if a judge somewhere deems their images lawful but awful.

This is clearly not a regime that will stop CSAM; it is a world in which tech companies will continue to act with impunity, shielded by a law that too many people believe makes meaningful change. That is not what we heard from Congress just 48 hours ago; it is in many ways just the opposite. Now, 48 hours and one second later, the path to justice, accountability, and fairness is closing for society’s most vulnerable victims: the children and parents who so bravely confronted big tech in Washington this week.

Yesterday, Stop CSAM was fast-tracked in the Senate for an immediate up-or-down vote without debate or amendments. Pro-tech senators could request a hold on the bill, but I predict it will pass with unanimous consent. That alone should give everyone pause, creating a world in which “Stop” CSAM will become an ironic reality: the king is dead, long live the king!