Why are so many creators talking about moving away from Adobe and Photoshop?

Jun 12, 2024, 12:08 IST
Business Insider India
In a digital age where privacy and data security are paramount, Adobe’s recent update to its Terms of Service has stirred significant controversy among its users. The changes have left many creatives concerned that their unpublished and in-progress projects could be used to train Adobe's AI models.

The Controversial Clause


The controversy began when users of Adobe Photoshop and Substance 3D, bread-and-butter applications for designers and artists worldwide, received a pop-up notice stating, "We may access your content through both manual and automated methods, such as for content review." This notice, coupled with an update to Adobe's Terms of Service, has led to widespread confusion and concern. The specific language in the updated terms reads:

"Our automated systems may analyze your Content and Creative Cloud Customer Fonts (defined in section 3.10 (Creative Cloud Customer Fonts) below) using techniques such as machine learning in order to improve our Services and Software and the user experience."

This vague wording has sparked fears that any user-generated content, including works under non-disclosure agreements (NDAs), could be accessed and utilised by Adobe's AI systems for training purposes. Further, the changes had reportedly already taken effect by February, despite Adobe only choosing to notify users now.

User Outrage and Privacy Concerns


The backlash was swift and fierce. Artists and designers took to social media to voice their concerns. One notable comment came from artist @SamSantala, who posted on X, "I can't use Photoshop unless I'm okay with you having full access to anything I create with it, INCLUDING NDA work?" This sentiment echoes a broader fear that sensitive and confidential projects may no longer be secure.


The main concern is not just about the potential misuse of personal or professional work but also about the implications for privacy and intellectual property rights. For many creatives, the idea that their work could be used as training data for AI, without explicit consent or compensation, is unacceptable.

Adobe Releases Clarification


In response to the uproar, Adobe later issued a statement clarifying that it does not use unpublished user content to train its Firefly AI models. The company emphasised that only content stored in its Creative Cloud, and not content stored locally on users' devices, would be accessed. Furthermore, Adobe highlighted that only public content, such as contributions to Adobe Stock and submissions for Adobe Express, is used to train its algorithms.

Interestingly, Adobe Chief Product Officer Scott Belsky noted that the company has had “something like this in TOS” for over a decade and that the new modifications were minor at best. He also mentioned that the company’s legal team is working on clearing up the confusion caused by the ambiguous language.

"The focus of this update was to be clearer about the improvements to our moderation processes that we have in place," explained Adobe in a recent blog post. "Given the explosion of generative AI and our commitment to responsible innovation, we have added more human moderation to our content submissions review processes."

According to the post, Adobe applications only access the work for pertinent functions such as creating thumbnails and previews, developing their machine-learned features such as ‘Photoshop Neural Filters’ and ‘Remove Background,’ and screening for illegal content such as child sexual abuse material. The company also clarified that it will never assume ownership of a customer’s work.

However, the issue of using user content for AI training remains murky. While Adobe maintains that only publicly submitted content is used, the potential for copyrighted material to slip through remains a concern. This issue is not unique to Adobe; other AI tools, such as those from Midjourney, have faced similar allegations of copyright infringement.

As Adobe navigates this controversy, it faces the challenge of balancing the improvement of its AI tools with the need to respect user privacy and intellectual property. For now, Adobe users are left grappling with the implications of the new terms, and many are reconsidering their use of the software for sensitive and confidential projects.