What’s the future of AI-driven creativity in Dubai? Transparency

Dubai’s Human–Machine Collaboration initiative challenges the region to balance innovation, trust, and bias in an AI-augmented creative landscape.

[Source photo: Krishna Prasad/Fast Company Middle East]

Imagine scrolling through your feed and pausing on an eye-catching image or headline. Would it change your perception if you knew it had been curated by artificial intelligence (AI)? That question is no longer hypothetical: Dubai has rolled out a first-of-its-kind Human–Machine Collaboration (HMC) classification system designed to bring radical transparency to the creative process.

Backed by Dubai Crown Prince Sheikh Hamdan, the framework uses simple visual markers to reveal how much of a piece of work, whether a marketing campaign, news story, or design concept, was powered by human ingenuity, machine intelligence, or both.

This could reframe the conversation around AI in creative industries, shifting it from suspicion to informed collaboration.

“The introduction of Dubai’s HMC classification system marks a timely step forward in how societies navigate the rise of generative AI. As the volume of AI-generated content continues to grow, clear, visible signals that indicate whether content was generated by a human, by AI, or through collaboration between the two can help reduce confusion, build trust, and ultimately normalize AI as a creative partner rather than a threat,” says Raghu Chakravarthi, EVP, Engineering and General Manager – Core42 US. 

According to Sid Bhatia, Area VP & General Manager – Middle East, Turkey & Africa at Dataiku, openness is non-negotiable when it comes to AI in content creation.

Transparency in content generation must not be undervalued.

“Audiences have a right to know when AI has played a role in generating content,” Bhatia says. “Transparency builds trust. Whether that knowledge changes engagement depends heavily on the context and quality of the content.”

Bhatia points out that AI can be a force multiplier that enhances, not replaces, human creativity. “If an AI-assisted process helped accelerate a campaign or tailor it more effectively to a segment, most people won’t mind; in fact, they might appreciate the relevance,” he says. “But if the AI-generated content replaces genuine human creativity without disclosure, audiences may feel misled. In our experience with enterprises deploying Universal AI, clearly communicating how AI adds value, not replacing humans, is key to acceptance and engagement.”

Meanwhile, David Boast, General Manager–UAE & KSA at Endava, says the conversation should go beyond simply stating that AI was involved. “At this point, there’s almost an unspoken assumption that AI plays some role in the creative process. So, the issue isn’t about revealing its presence, it’s about clearly defining its role,” he says. “Audiences don’t just want disclosure; they want distinction. Was it human-led with AI support, or entirely machine-generated? That clarity matters.”

Boast warns that if the lines are blurred, there’s a risk of diluting the human experience and undervaluing original thought. “AI is a tool, much like any other technology we’ve adopted over time,” he says. “Done well, transparency can elevate engagement by challenging preconceived ideas and showing how AI can amplify rather than replace creativity. That’s where trust is built.”

For Levent Ergin, Chief Strategist for Climate, Sustainability and AI, and Global Head of ESG Strategic Alliance Partnerships at Informatica, disclosure is not just important; it’s credibility-enhancing. “Audiences do have a right to know and, in today’s landscape, where AI touches nearly every corner of the creative process, disclosure doesn’t make work less valuable, it makes it more credible,” he says.

He points to a recent misstep as a cautionary tale. “Guess’ AI model in Vogue didn’t spark backlash because people spotted the tech. It sparked backlash because people felt creativity was being replaced, not enhanced, and AI was being used to set unrealistic standards,” Ergin says. “In this case, the public outcry wasn’t about whether AI was used, but why and how.”

CAN TRANSPARENCY REDEFINE CREATIVITY?

Could a system like Dubai’s HMC classification help reduce the stigma around using AI in creative industries, or might it create new forms of bias?

“Labeling the degree of AI involvement helps normalize the role of machines as co-creators rather than replacements. This can potentially reduce stigma and elevate discussions around responsible AI use,” says Bhatia.

However, he cautions that any classification system risks oversimplification or unintended bias. “For instance, labeling something as ‘machine-generated’ may lead some audiences to immediately dismiss its value,” he says. “That’s why the framing matters—it’s not about how much AI was used, but how responsibly and creatively it was applied.”

Weighing in on Dubai’s classification system’s potential impact, Boast says that if implemented well, it could help “normalize AI-driven augmentation in creative work.”

He emphasizes that execution will be crucial. “The key will be consistency. If the classification system lacks clarity or credibility, it could backfire, fuelling new biases or further confusion.”

Looking ahead, Boast envisions a positive outcome. “But if we get it right, with functions like a ‘blue tick’ for creative authenticity, it becomes a signal of trust,” he says. “In a world where generative content is growing exponentially, frameworks like this can help audiences understand what’s human, what’s machine, and where the real value lies. That’s not just useful but essential.”

Dubai’s initiative is a progressive move that addresses a critical distinction between AI-assisted and AI-generated content, Ergin says. “This differentiation is essential because it allows creators to embrace transparency without risking the devaluation of their work.”

However, the effectiveness of any labeling system depends heavily on how clearly it is communicated to audiences. “Without a well-defined framework and public understanding, there’s a significant risk of introducing a new form of bias, where any content associated with AI is prematurely judged as ‘less authentic’ or inferior,” he notes.

Education and consistency are crucial. “If stakeholders can collectively establish that AI functions as an augmenting tool rather than a replacement for human creativity, the persistent stigma around AI-generated content will begin to erode,” Ergin adds. “This shift could pave the way for a more nuanced, transparent, and honest conversation about the evolving dynamics of content creation.”

TIMELINES VS. TRANSPARENCY

The tension between transparency and protecting brand image is real. Some brands may hesitate to fully disclose AI involvement, fearing it could undermine perceptions of originality or slow down tight timelines. However, as consumer awareness grows and demands for authenticity rise, embracing transparency could become a competitive advantage rather than a liability.

“Time will tell whether all brands immediately embrace this level of transparency,” says Chakravarthi, underscoring the varied pace at which industries may adapt to new disclosure norms.

He adds, “Forward-looking organizations will recognize that trust builds credibility in the AI era, and disclosure can serve as a competitive advantage.” This perspective highlights how transparency is evolving from a compliance checkbox into a strategic differentiator.

Chakravarthi points to frameworks like Dubai’s Human–Machine Collaboration classification system as crucial in establishing best practices and global standards for responsible AI use. “At Core42, we believe that ethical AI applies not only to the development of models but also to how their outputs are interpreted and acted upon,” he says.

Brands that proactively communicate how AI augments human creativity may build deeper trust and differentiate themselves in crowded markets. Ultimately, the brands that succeed will be those willing to navigate this new paradigm openly, balancing speed with sincerity to meet evolving audience expectations.

Bhatia sees it as a delicate balance. “Brands working in high-pressure, competitive spaces may initially hesitate—fearing that revealing AI involvement could dilute their creative image or expose internal processes,” he says.

“But the most forward-looking brands will see transparency as a brand differentiator,” Bhatia adds. “Those who are first to say, ‘We use AI responsibly to enhance creativity,’ will gain reputational capital. Much like how brands now showcase their sustainability practices, the same will happen with AI ethics and transparency.”

According to Boast, the most established players, the ones with decades of creative capital behind them, are likely to lead the charge. “For them, transparency becomes a strength, not a liability.”

He adds that once those pioneers show that AI can enhance, rather than erode, brand identity, others will follow. “At this point, when the digital space will be awash with AI-generated or AI-assisted content, high-quality data, which is verifiable and from trusted sources, will become the new currency. The brands that openly embrace that shift will be the ones who cut through the noise.”

Boast reiterates that Dubai’s classification system reflects a broader strategic shift towards human–AI collaboration, creating a roadmap for the global creative industry. “While this is a new frontier requiring thoughtful risk-taking and agile learning, old ways of thinking about originality and authenticity won’t fully translate. To stay relevant, businesses must embrace transparency, collaboration, and authenticity, recognising AI not as a threat, but as a partner that supercharges genuine human creativity and strategic insight.”

Ergin believes that disclosing AI involvement, and correctly referencing source material, can elevate brand integrity. 

“It’s not about slowing down creativity, it’s about using AI responsibly while reinforcing that the human is still in control.”

He stresses that in today’s crowded content market, credibility is currency. “Over time, this type of disclosure could become a trust signal rather than a red flag.”

ABOUT THE AUTHOR

Rachel Clare McGrath Dawson is a Senior Correspondent at Fast Company Middle East.
