Relax, Adobe tells fans: Photoshop’s new fine print isn’t as controversial as it seems
Adobe has been under fire lately. The American Association of Media Photographers criticized some tone-deaf Photoshop ads, which it called a “shocking rejection of photography,” when they started appearing a few weeks ago. And now the software giant has been forced to defend itself again, following an outcry on social media over some new Photoshop terms rolled out this week.
In recent days, a number of high-profile Photoshop users have taken issue with the new terms on social media. The new fine print contains some seemingly alarming lines, including one stating that “we access your content through both automated and manual methods, such as for content review.”
Adobe has now defended the new terms in a statement blog post. In short, Adobe claims that the somewhat ambiguous legalese in its new fine print has created an unnecessary furore, and that nothing has fundamentally changed. The two biggest takeaways are that Adobe says it “does not train Firefly Gen AI models on customer content” and that it will “never own a customer’s work.”
On the first point, Adobe explains that apps like Photoshop need to access our cloud-based content to “perform the functions for which they are designed and used,” such as opening and editing files. The new terms also only affect cloud-based files, with the fine print stating that “we (Adobe) do not analyze content processed or stored locally on your device.”
Adobe also admits that the new fine print could have been better explained, stating that “we will clarify the acceptance terms that customers see when opening applications.” But while the statement should allay some fears, other concerns are likely to remain.
One of the main points raised on social media was concern about what Adobe’s content review processes mean for NDA (Non-Disclosure Agreement) work. Adobe said in its statement that for work stored in the cloud, Adobe may use “technologies and other processes, including escalation for manual (human) review, to screen for certain types of illegal content.”
That may not entirely address privacy worries for some Adobe users, though these concerns likely apply to cloud storage in general rather than to Adobe specifically.
A crisis of confidence?
This Adobe incident is yet another example of how the aggressive expansion of cloud-based services and AI tools is contributing to a crisis of trust between tech giants and software users – in some cases understandably so.
On the one hand, the convenience of cloud storage has been a huge boon for creatives – especially those with remote teams spread around the world – and AI tools like Generative Fill in Photoshop can also be big time savers.
But it can also come at a cost, and it remains the case that the only way to ensure true privacy is to store your work locally rather than in the cloud. That won’t be a problem for many Photoshop users, but the fuss will undoubtedly still cause some to look for the best Photoshop alternatives that don’t have such a large cloud component.
When it comes to AI tools, Adobe remains the self-proclaimed torchbearer for “ethical” AI that is not trained on copyrighted works, although there have been some controversies. Last month, for example, the estate of legendary photographer Ansel Adams accused Adobe on Threads of selling AI-created imitations of his work.
In fairness to Adobe, it removed the work, stating that it “violates our generative AI content policy.” But the episode once again highlights the delicate balancing act that companies like Adobe now face between rolling out powerful new AI-powered tools and maintaining the trust of users and creatives alike.