WeTransfer’s AI Policy Sparks Outcry — What Artists Need to Know
WeTransfer recently faced a wave of backlash after updating its Terms of Service to include a clause that alarmed artists, designers, and other creatives. The now-revised clause (originally Section 6.3) stated that users granted WeTransfer a broad license to use their uploaded files—including artwork, music, and video—for machine learning purposes. This included the right to modify, distribute, and even publicly display user content to “improve performance of machine learning models.”
For many, the language read like a green light for using user-generated work to train AI—raising red flags about intellectual property rights, consent, and creative control.
After widespread criticism, WeTransfer backtracked. On July 15, the company removed all references to AI and machine learning from the clause and issued a public statement clarifying: “We do not train AI models on user content and have no plans to do so.”
However, the incident reveals a deeper issue: ambiguous legal language can expose artists to potential misuse of their work. Even if no AI training was happening at the time, the original terms opened the door to future use without consent or compensation.
⚠️ What artists should watch for:
Beware of “perpetual” and “sublicensable” licenses in TOS documents.
Don’t assume your work is safe—even on trusted platforms.
Monitor changes to terms regularly; platforms often update them quietly.
Consider using encrypted or privacy-first file-sharing alternatives.
Trust, once shaken, is hard to restore. As AI continues reshaping the creative landscape, artists must stay vigilant and proactive in defending their rights.
Bottom line: WeTransfer says it isn’t training AI on your art—but the episode is a stark reminder to always read the fine print.