
How Mattel is using AI to bring your next Barbie box to life

Generative AI is helping the toy company’s designers brainstorm and fine-tune packaging for a range of products.

Life in Barbie’s world is about to get a little more fantastic, thanks to a boost from Adobe Firefly’s generative AI software.

According to Chris Down, Mattel’s chief design officer, the company began offering the web version of Firefly to product designers across all of its subsidiary brands last year as a tool to help create compelling packaging—and the resulting designs are expected to hit shelves in the coming months.

“Mattel makes about 4,000 new toys a year, and a lot of that is packaging,” Down says. “There’s a high volume of stuff—and that gives the first clue as to why we’d be interested in tools that would make the outcomes better or stronger, would allow the creative process to move faster, and would give us a production or creative execution advantage.”

Given the packaging team’s annual workload, Mattel wanted to cut down on preproduction time and get new toys to the selling phase faster. Firefly’s strict approach to copyright and IP law (the software is trained only on stock images for which Adobe already holds the rights, as well as on openly licensed content and public-domain content) also made it a safer option for Mattel, Down says. While no Mattel designer is required to use Firefly—indeed, some still prefer taking pen to paper when possible—some are leading the charge to integrate the software into their daily workflow.

Firefly 3, the software’s most recent iteration, comes with several features that make Photoshop a significantly more powerful design tool. With the Midjourney-esque Generate Image feature, a text prompt is all that’s needed for the program to spin up a fully formed concept image. The Generative Fill with Reference Image tool can integrate a new object into a design, a process that previously demanded a significant time investment, even from the most seasoned professional. And for more menial tasks, like filling out a background, Generative Expand can interpret a scene and extend it to fill a larger canvas.

So far, Down says, Firefly has served two key roles for Mattel’s package designers: helping to visualize fantastical new toy ideas in the pitching stage, and cutting down the extra labor associated with Photoshop’s more time-consuming tasks.

HOW BARBIE BOXES ARE MADE

Say, for example, that a designer developed an idea for a Barbie whose dress transforms into wings. Their next step would be convincing higher-ups that, through a standout packaging design, kids and adults would understand what the doll could do and therefore would want to buy it. At this stage, the designer could use Generate Image as a partner in the brainstorming process. Once they had a better sense of the product, they could then use Firefly features like Structure Reference and Style Reference to create an accurately scaled and themed environment for the Barbie to live in.

If the higher-ups approved the winged Barbie pitch, the designer (and their team) would do their own product photography, editing, and illustrating for any foregrounded elements of the final package design, like the Barbie doll and its accessories. Mattel’s guidelines around GenAI tools include a conservative outlook on including AI-generated images on final packaging: Firefly might be used to fine-tune a layout or expand a background, but not to generate subject matter like the doll itself or its human companions. The key to incorporating any new design tool, Down says, is for consumers not to know it was used at all.

“We started using 3D printing around 20 years ago, using haptic arms and doing digital sculpting instead of wax and clay. No consumer would ever say, ‘Hey, that was a doll face that was sculpted using wax or clay’ versus ‘That’s a doll face that was using voxels and a haptic arm,’” Down says. “I think that the tools should be invisible, and the output should come from the ingenuity of our creators.”

Even at Mattel, Down concedes, there’s been some general hesitance around adopting artificial intelligence tools, as well as a dash of existential fear about what these capabilities might mean for the future of design. But he believes GenAI tools in this context can be viewed as collaborators rather than competitors.

“I was talking to one of my product designers, and he described it in a way that I thought was really compelling,” Down says. “He referenced the old Edison quote of ‘1% inspiration and 99% perspiration.’ [With GenAI tools] that notion is shifting. The 1% starts to expand, and the 99% starts to reduce. . . . It’s amplifying my creativity by compressing time and taking out some of that perspiration.”
