Leadership at two organizations handed down the same directive: use AI to 10x your design process. No framework. No tooling guidance. No map from the mandate to Monday morning.
The gap wasn't motivation. It was method. So I built the translation layer between what executives were asking for and what a practitioner could act on that week.
Mandate without method creates anxiety, not adoption.
Two engagements: building the curriculum live as a design manager at Salesloft, then packaging and delivering it as an invited speaker at Peerspace. The second engagement didn't happen because I pitched it. It happened because the first one worked.
At Salesloft, I was the design manager. I owned the problem end-to-end: diagnosed the gap, built the curriculum, and taught it in real time while the work was still evolving. IC and leadership scope in the same motion.
At Peerspace, I was an invited guest speaker. A lead product designer reached out after the Salesloft work circulated externally. I packaged what worked and ran a single session for a cold audience.
The distinction matters. Salesloft was iteration in progress. Peerspace was proof it had become transferable.
Three choices shaped how this curriculum landed and why it traveled beyond the org that built it.
Rather than introducing AI as a new discipline, I mapped every tool and method to the process already in use: Define, Ideate, Design, Learn. Designers didn't have to change how they thought about their work. They just had to see where AI fit inside it.
Waiting for a polished program would have meant the mandate outpaced the team. I ran sessions as the work evolved: research synthesis one week, prototyping the next. Real work became the test. A designer applying AI to a live project the same week as a session is a better signal than a completed deck.
AI tools make mistakes. Every session opened with that disclaimer: not as a caveat, but as a design principle. Designers who understand the limits of AI use it more confidently and more critically. Framing AI as a collaborator that needs checking lowered anxiety and raised adoption.
The curriculum was validated the way good design gets validated: by watching someone use it on a real problem and seeing what they do next.
Sessions ran live, one topic at a time. Between sessions, I coached designers through applying the methods to actual projects. One designer applied the research synthesis technique to a live project the same week, iterated his own prompting approach independently, and came back to the thread to share what he'd learned. Teach, apply, refine, report back. That loop is the signal a curriculum is working.
The Peerspace session was the packaged version of what had been validated at Salesloft. Running the same framework cold, to a new audience, without the iterative coaching layer was the real test of whether the curriculum could stand on its own. It did. A senior designer noted it made AI feel approachable specifically because it was tangible, not inspirational.
The inbound Peerspace invitation is itself a signal: the work circulated without being pushed, far enough that a lead designer at another company reached out.
Designers at both orgs moved from a mandate with no map to applying the methods in live work. The outcome is a pattern of behavioral change, not a single number.
Feedback from both orgs pointed to the same differentiator: tangibility. Not inspiration, method. Designers left with specific things to try that week.
"They really made AI feel more approachable because a lot of what they mentioned I'm sort of doing, like just scratching the surface, and they gave so many tangible ways to dial that up to get a lot more out of it."
"I found that using the transcript with a great question led to better synthesis results. I also changed my prompt around a little bit, and it helped tremendously."
"That was so awesome, thanks for bringing Leny in. They really made AI feel more approachable and gave so many tangible ways to get a lot more out of it."
"Super helpful!! Thank you Leny. Yes. Thank you for all the help."
The principle I carry forward: a mandate is not a method. The design work is building the translation layer between what leadership is asking for and what a practitioner can act on that week.
Teaching iteratively was the right call. The live feedback loop at Salesloft is what made the Peerspace session land. A curriculum built in isolation would have been a deck. Built through real work, it became a system that traveled.
What I'd do differently: instrument the follow-through more deliberately. The behavioral signals were there, but capturing them systematically would have made the impact case stronger. Next time, I'd close that loop explicitly before the final session runs.