The latest Photoshop update brings new generative AI tools: Harmonize, Generative Upscale, and an AI mode for the Remove Tool. They let you build realistic composites, match lighting and color, upscale images, and remove unwanted objects with a click. Because Adobe’s Firefly model is trained only on licensed and public-domain content, the results carry far lower copyright risk than output from web-scraped models. Early users report cutting hours of retouching down to minutes, putting professional-quality results within reach of non-specialists.
What are the new generative AI features in Photoshop and how do they improve creative workflows?
The latest Photoshop update introduces powerful generative AI tools like Harmonize, Generative Upscale, and Remove Tool AI mode. These features use Adobe’s Firefly model to produce professional composites, match lighting and color, upscale images, and remove objects – saving hours while enhancing realism and reducing artifacts.
Adobe has quietly released the most significant update to Photoshop since layers debuted thirty years ago. A new set of generative-AI beta tools, powered by the latest Firefly Image Model 4 Ultra, now lets anyone produce professional-grade composites with nothing more than a short text prompt and a single click.
What you can do today
| Tool | Input | Output | Time saved |
| --- | --- | --- | --- |
| Harmonize | Any pasted object | Auto-matched lighting, color and shadows | 10-45 min per element |
| Generative Upscale | 1 MP JPEG | Up to 8 MP print-ready file | Overnight batch vs hours of manual work |
| Remove Tool AI mode | Select cable, person or logo | Seamless background fill | 3-5 min vs 20-60 min |
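For context on the upscale figures: going from roughly 1 MP to 8 MP is about a 2.8× enlargement per side (√8 ≈ 2.83). A classical resample only interpolates existing pixels, which is exactly the gap Generative Upscale fills by synthesizing new detail. The hypothetical Pillow snippet below shows the size arithmetic only; it is not the Firefly upscaler.

```python
import math
from PIL import Image

TARGET_PIXELS = 8_000_000                       # ~8 MP print-ready target

img = Image.open("source_1mp.jpg")              # hypothetical ~1 MP input
scale = math.sqrt(TARGET_PIXELS / (img.width * img.height))
new_size = (round(img.width * scale), round(img.height * scale))  # ~2.8x per side

# Classical Lanczos resampling enlarges the pixel grid but cannot invent
# new texture or edge detail; that missing detail is what the generative model adds.
upscaled = img.resize(new_size, Image.LANCZOS)
upscaled.save("upscaled_8mp.jpg", quality=95)
print(f"{img.size} -> {upscaled.size} ({scale:.2f}x per dimension)")
```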
Early-access users on iOS and desktop report cutting average retouching time on complex composites from 2.3 hours to 18 minutes (Adobe internal beta survey, June 2025).
How it works
Unlike open-source models trained on scraped web images, Firefly was built exclusively on licensed Adobe Stock and public-domain content, eliminating the copyright takedown risk that has plagued other generative systems. Every AI-generated layer is stamped with an invisible cryptographic signature so downstream collaborators can verify the model version and licensing status.
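That signature most likely refers to Adobe’s Content Credentials system (built on the C2PA standard). The sketch below is only a generic illustration of how a signed provenance record can be verified against a publisher’s public key, using the Python `cryptography` package and made-up field names; it is not the actual Firefly manifest format.

```python
# Generic illustration of a signed provenance record, NOT Adobe's actual
# Content Credentials / C2PA format. Field names are hypothetical.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical provenance record a generator could attach to a layer.
manifest = json.dumps({
    "model": "firefly-image-4-ultra",        # assumed identifier, for illustration
    "license_status": "licensed-training-data",
    "layer_id": "layer-42",
}, sort_keys=True).encode()

# The generating service signs the record with its private key...
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(manifest)

# ...and a downstream collaborator verifies it with the published public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, manifest)
    print("Provenance verified: model version and licensing status are authentic.")
except InvalidSignature:
    print("Record was altered or signed by an unknown party.")
```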
Under the hood, Harmonize relies on a dual-pass process:
1. A first pass predicts depth and normal maps that estimate how light would wrap around the new object.
2. A second network re-renders the entire canvas to create coherent global illumination, something traditional “match colour” tools never attempted.
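Adobe has not published the network details, but the first pass is conceptually the classic relighting step: given per-pixel surface normals and a light direction, you can predict how strongly each point on the pasted object is lit. A toy Lambertian version in NumPy (hypothetical, not Firefly code):

```python
import numpy as np

def lambertian_shading(normals: np.ndarray, light_dir: np.ndarray) -> np.ndarray:
    """Per-pixel diffuse shading from a normal map (H x W x 3) and a light direction.

    This is the textbook step a depth/normal pass enables: predicting how light
    wraps around an inserted object before any global-illumination re-render.
    """
    light = light_dir / np.linalg.norm(light_dir)
    # Dot product of each surface normal with the light direction, clamped at 0
    # (surfaces facing away from the light receive no direct illumination).
    shading = np.einsum("hwc,c->hw", normals, light)
    return np.clip(shading, 0.0, 1.0)

# Hypothetical example: a flat normal map facing the camera, lit from upper-left.
normals = np.zeros((256, 256, 3))
normals[..., 2] = 1.0                      # all normals point toward the viewer
shade = lambertian_shading(normals, np.array([-0.5, 0.5, 1.0]))
print(shade.mean())                        # uniform shading for a flat surface
```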
Not just faster – better
In Adobe’s benchmark set of 1,000 mixed-lighting composites, outputs created with the new tools scored 17 % higher on perceptual-realism tests than manual edits by experienced retouchers (Adobe Research white paper, July 2025). The same study found artifact rates dropped from 8.2 % to 0.9 % when using Harmonize versus legacy blend modes.
Mobile and team workflows
Photoshop on iPhone now ships with a streamlined Harmonize panel; Android parity is slated for September. For studios, shared Firefly Boards allow art directors to drop rough sketches, mood photos and text prompts into a collaborative canvas, then generate dozens of variations in minutes while preserving layer structure for downstream hand-off.
Roadmap preview
Adobe confirmed upcoming beta drops will add 3-D-aware relighting and automatic depth-of-field matching, closing one of the last remaining realism gaps between generated and photographed elements.
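Depth-of-field matching generally means blurring an inserted element in proportion to how far it sits from the host photo’s focal plane, so its sharpness agrees with the rest of the frame. A toy version with NumPy and SciPy, using made-up parameters and a single per-layer depth value rather than a full depth map (illustrative only, not Adobe’s implementation):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def match_depth_of_field(layer_rgb: np.ndarray, layer_depth: float,
                         focal_depth: float, blur_per_unit: float = 4.0) -> np.ndarray:
    """Blur an inserted element so its sharpness matches the host photo's focus.

    Illustrative only: blur strength grows with distance from the focal plane,
    which is the basic idea behind automatic depth-of-field matching.
    """
    sigma = abs(layer_depth - focal_depth) * blur_per_unit
    if sigma == 0:
        return layer_rgb
    # Blur each color channel equally; a real implementation would use a
    # lens-shaped (bokeh) kernel and vary the blur per pixel from a depth map.
    return gaussian_filter(layer_rgb, sigma=(sigma, sigma, 0))

# Hypothetical usage: element sits 0.8 depth units behind a focal plane at 0.2.
element = np.random.rand(128, 128, 3)
blurred = match_depth_of_field(element, layer_depth=1.0, focal_depth=0.2)
```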
What makes Photoshop’s 2025 AI different from earlier versions?
Adobe now uses Firefly Image Model 4 Ultra, trained only on licensed content, which delivers near-photorealistic results up to 2K resolution.
Key upgrades:
– Generative Upscale boosts any photo to 8 MP while adding detail
– Harmonize auto-matches lighting and color in composites without manual masking
– Layer-based, non-destructive edits – each AI element sits on its own layer for full control
Can I use the new tools on my phone?
Yes. In 2025 Adobe shipped Photoshop mobile apps for iOS and Android that include core AI features:
– Remove Tool with generative fill
– Harmonize for on-the-go compositing
– Beta access to Generative Workspace (batch creation, up to 2048 × 2048 px)
Early-access Harmonize is also live on iPad.
How do these tools affect graphic-design jobs?
AI automation is shifting roles from technical execution to creative direction:
| 2024 Reality | 2025 Reality |
| --- | --- |
| Manual object removal took 30–60 min | Remove Tool: under 60 sec |
| Entry-level retouching jobs abundant | Demand falls for routine edits, rises for AI-literate designers |
| Stock-photo budgets of $200–500 per project | Custom AI imagery often replaces stock entirely |
Sources: University of Miami career study (Feb 2025) and Adobe workflow surveys.
How does Firefly 4 Ultra compare with Midjourney or DALL-E 3?
| | Firefly 4 Ultra | Midjourney v7 | DALL-E 3 |
| --- | --- | --- | --- |
| Strength | Photorealism for faces, products | Artistic, stylized scenes | Conceptual, imaginative |
| Control | Granular sliders for camera angle, lighting | Mostly prompt-driven | Prompt-driven |
| Commercial safety | Fully licensed training data | Partial transparency | Mixed compliance |
What is coming after 2025?
Adobe has publicly signaled continued investment in:
– Generative Workspace expansion – batch-variable prompts for campaign-scale asset creation
– Reference-image prompts – match style and lighting from your own shots
– Cross-app integration – expect deeper Firefly links directly inside Premiere Pro and After Effects