
What followed wasn’t applause. It was outrage.
Artists, fans, and even legal experts voiced concerns: Studio Ghibli, whose iconic aesthetic was being remixed without consent, had nothing to do with this tool. Hayao Miyazaki - Ghibli’s co-founder and longtime AI critic - once called artificial intelligence "an insult to life itself." Yet here was a machine churning out images eerily close to his legacy, in seconds, with no attribution, no permission, and no understanding of the cultural soul behind it.
And for those of us building brands in 2025, this raises a critical question:
“When AI can copy a signature style at scale, where do we draw the line between inspiration and exploitation?”
If you’re using AI to generate visuals, your audience wants to know:
Are you doing it ethically?
In 2025, consumers are vocal - and vigilant. The Ghibli backlash wasn’t just about fandom. It was about a broader discomfort with corporations profiting from artists’ unpaid labor.
When Christie’s held an auction for AI-generated artworks trained on existing styles, more than 6,000 artists signed an open letter condemning it as “mass theft.” They argued that the auction house was legitimizing unlicensed appropriation.
In the U.S., a major lawsuit filed by artists against Stability AI, Midjourney, and DeviantArt is now moving forward. The claim? These platforms scraped billions of images without consent to train their models. The courts are taking it seriously - and this case could shape the future of copyright and AI.
According to the 2025 Edelman Trust Barometer, 81% of Gen Z consumers will boycott brands they believe use AI unethically. This isn’t about being anti-AI - it’s about being pro-transparency, pro-ethics, and pro-creators.

AI tools like DALL·E, Midjourney, or Firefly learn by analyzing massive datasets - millions of artworks, photographs, and illustrations from across the internet.
They don’t “copy” in the literal sense - but they learn styles, shapes, compositions, and techniques. And the images they generate often look like the art they were trained on.
It’s like remixing a song.
You’re not singing the same lyrics, but the melody might be familiar. And if you used Beyoncé’s beat without asking or crediting her, people would have something to say.
That’s the challenge with AI art: it creates something “new,” but it’s built from real, existing creativity.
So… when does inspiration become exploitation?

AI isn’t the enemy; in fact, it can be a powerful creative partner. But how you choose to use it says everything about your brand.
Don’t just say "AI-assisted" - show it. Use industry-standard tools like Adobe Content Credentials 2.0, which embed tamper-proof metadata into every image. This includes how the asset was created, what tools were used, and whether AI played a role.
Add a content label directly on your campaign landing page or press release stating: “This collection includes AI-assisted visual research, human-curated and finalised by our in-house design team.”
Bonus Tip: Make your AI process part of your storytelling. Brands that invite audiences behind the scenes build stronger trust.
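To make that kind of disclosure concrete, here is a minimal Python sketch of a machine-readable content label a team might publish alongside a campaign asset. Everything here is illustrative: the function name, field names, and values are hypothetical, not a standard schema, and a plain JSON label like this is not a substitute for tamper-proof provenance tools such as Adobe Content Credentials.

```python
import json
from datetime import date

def build_ai_disclosure(asset_name, tools_used, ai_role, reviewed_by):
    """Build a simple, human-readable AI-disclosure label for a published asset.

    Note: this is only a plain-text label. Tamper-proof provenance (e.g.
    Adobe Content Credentials) embeds signed metadata in the file itself.
    All field names below are illustrative, not an industry-standard schema.
    """
    return {
        "asset": asset_name,
        "ai_role": ai_role,            # how AI was involved, in plain language
        "tools": tools_used,           # which generators/editors touched it
        "human_review": reviewed_by,   # who curated and finalised it
        "disclosed_on": date.today().isoformat(),
    }

# Hypothetical example values matching the disclosure wording above:
label = build_ai_disclosure(
    asset_name="spring-campaign-hero.png",
    tools_used=["Adobe Firefly"],
    ai_role="AI-assisted visual research, human-curated final design",
    reviewed_by="in-house design team",
)
print(json.dumps(label, indent=2))
```

A label like this could be rendered on the campaign landing page or attached to a press kit, so the "show it, don’t just say it" principle has an auditable artifact behind it.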

Use platforms that respect creator rights - both legally and financially.
Avoid tools that scrape the internet without permission (e.g., early Midjourney or open-source Stable Diffusion models). If you can’t trace the source of training data, don’t use it for brand-facing assets.
Bonus Tip: Ask your creative partners what AI platforms they’re using - and request documentation.
In a strategic branding process, AI isn’t the designer; it’s the provoker. Its strength lies in its ability to rapidly explore big-picture creative directions before the team commits time and energy to development. Used intentionally, AI can help surface unexpected ideas, generate early-stage moodboards, or even simulate how different brand concepts might feel in the world - visually, emotionally, or narratively.
For example, prompting AI with phrases like “ritual meets futurism” or “hyper-local, slow luxury” can produce conceptual visual outputs that help teams articulate and evaluate distinct brand directions. These outputs aren’t final assets. They’re conversation starters, provocations to test with stakeholders, pressure-test in strategy workshops, and spark debate within creative teams.
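The exploration step above can be scripted. As a sketch, a team might cross its strategic themes with a few visual treatments to produce a spread of moodboard prompts before sending any of them to an image model. The themes, treatments, and prompt template here are hypothetical examples, not a recommended formula.

```python
from itertools import product

# Hypothetical early-stage exploration: cross strategic themes with visual
# treatments to generate a spread of moodboard prompts for review.
themes = ["ritual meets futurism", "hyper-local, slow luxury"]
treatments = ["muted analog film palette", "bold editorial collage"]

prompts = [
    f"{theme}, {treatment}, brand moodboard concept"
    for theme, treatment in product(themes, treatments)
]

for p in prompts:
    print(p)
```

The point is breadth, not polish: a handful of deliberately different prompts gives the team distinct directions to debate, rather than one direction to defend.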
Once a few promising directions emerge, the real work begins. Human teams build on these ideas, layering in audience insight, cultural nuance, verbal identity, and system thinking. The initial AI-generated visuals may never appear in the final brand. What survives is the idea - refined, reshaped, and elevated through collaboration, design expertise, and strategic clarity.
This approach doesn’t just streamline early-stage exploration; it also encourages more intentional brand development. By seeing a range of directions early, teams can align faster, avoid dead ends, and move forward with confidence in a concept that’s both inspired and owned.

As AI-generated visuals become more sophisticated, it’s easy to get swept up in what looks “cool.” But in branding, aesthetics without ethics are a liability. Creative teams need to cultivate a culture that prioritizes intention over trend, where every idea is held up to a simple, powerful question:
“Would I feel comfortable showing this to the artist - or the culture - that inspired it?”
If the answer is no, the direction needs to change. If the answer is “maybe,” that hesitation is a signal to slow down and unpack why. These moments of pause, however inconvenient, are essential. Build them into the process. During team critiques or stakeholder reviews, dedicate a few minutes to what we call a “respect check”: an honest discussion about cultural context, source material, and representation. Who are we referencing? Are we honoring, or extracting? This small habit leads to sharper thinking and more responsible work.
And when working with cultural traditions or visual languages outside your own expertise, don’t guess - ask. Involving cultural advisors early on isn’t just respectful; it strengthens the work. It ensures that what you’re building resonates authentically, rather than leaning on aesthetics detached from meaning.
So, is AI-generated art inspiration or theft?
The honest answer: it depends. It depends on the intent behind its use, the transparency of the process, and whether the tools involved were built on consent, or exploitation. AI itself isn’t inherently unethical. But the choices we make around it determine whether we’re amplifying creativity, or appropriating it.
In 2025, using AI is no longer the bold move. Using it responsibly is. Innovation without ethics is just noise, and audiences are paying attention. As creatives, strategists, and brand builders, it’s on us to ask harder questions, hold higher standards, and make space for both imagination and accountability in the process.
AI can absolutely be a powerful creative partner. But it’s up to us to ensure it’s supporting the work, not undermining the people and cultures that inspire it.
Curious how to integrate AI without compromising creativity or respect? Start by asking better questions. The rest follows.