Building your AI dream team


Generative AI is rewriting the rules on how enterprises define and cultivate talent. Employers must stay ahead of the curve—honing employee skills to be compatible with AI capabilities and instilling people with the knowledge to unlock its full potential. 

Business leaders are aware of this talent gap; 82% of respondents in Microsoft’s 2023 Work Trend Index Annual Report acknowledged that employees will need new skills to prepare for the growth of AI.

And those skills aren’t strictly technical—or confined to technical workers. Everyone has the potential to use AI to augment their skills and abilities. 

Existing roles are changing to make better use of AI, and new roles are quickly emerging. Prompt engineering, for instance, is a job title that didn’t exist until recently, says Andrew Mullins, Principal Data Scientist, Kin + Carta. 

Regardless of the roles you’re filling, adaptability is critical in the face of changing tools and technology. Here’s how to build your AI dream team.

Search for innovators and changemakers

Since accessible GenAI tools are relatively new, most organizations are starting from a fairly level playing field. Enterprises that encourage the workforce to experiment and make smart bets are likely to gain an advantage as early adopters. The key to integrating AI successfully is making sure those bets are calculated and intentional.

“You want to hire people who are willing to write the book—not just read it,” says Taylor Bradley, Head of HR Business Partners at Turing, an AI-powered “talent cloud” matching employers with remote developers. With GenAI, businesses can use existing data to create innovative solutions, but only if they hire people willing to test new prompts and practices. 

“Thinking about automation with AI is not a natural thing that we all just develop…It really is a muscle and a skill set,” says Writer CEO May Habib. “And the more that leaders can identify people who are leaning into that and doing it, the more ideation happens, the more experimentation happens.”

So what does that look like in a candidate? Practically speaking, “you need people who like to document their work,” Mullins says. Recording what works and what doesn’t and sharing those findings with colleagues is a critical piece of the puzzle.

 

On a behavioral level, Bradley likes to ask candidates, “What have you done with generative AI to teach yourself something, or what have you learned from it?” His rationale: “If you have someone that’s continuously teaching themselves new things, and they’re using generative AI to do that, they’re going to be able to solve big problems for your business because they’re going to know 1) how to use this tool effectively, and 2) how to use it to drive impact.”

‌"This is such an exciting time to figure out who the innovators are in one's company," Habib says. "Because if they are the right people, they are already the ones that are experimenting and stepping up. You may not know about it, but they're doing it." 

Prioritize people comfortable with ambiguity

AI operates on a probabilistic framework: it doesn’t seek absolute solutions, but forecasts the most plausible outcomes based on accumulated data. This is especially true when processing unstructured data (e.g., images, videos, and transcripts), which makes up the majority of the data businesses produce.

Given this probabilistic nature, employers must prioritize candidates capable of navigating “gray areas” effectively. That means embracing uncertainty, approaching tasks with an open mind, and advancing even without absolute clarity.

Being comfortable with ambiguity means treating the unknown as an opportunity for growth and discovery. People who can think flexibly and creatively, and who can come up with innovative (and ideally simple) solutions to complex problems, are invaluable during periods of disruption and growth.

The good news is that most leaders in enterprise organizations are already familiar with working in ambiguous circumstances. “Working with ambiguity is a hallmark of working in a hyper-growth setting,” Bradley says. 

To find the right people, ask behavioral questions to understand how candidates navigated challenges before GenAI entered the mainstream, especially situations involving risk or ambiguity. Ask candidates to describe a time they encountered a situation with no clear answers. How they responded to ambiguity in the past suggests how they’ll handle it in the future.


Reframe experience through the lens of agility

Hiring for agility has always been important, but in the era of rapidly changing AI applications, it’s imperative. And there’s always room to refine what we need from talent, and how we identify those traits.

Hiring people who can thrive under uncertainty requires rethinking what you look for in a candidate. Chris Lord, Chief Technology Officer and Co-Founder at Lambent, for example, doesn’t hire for expertise; he looks for the ability to learn and apply new knowledge and practices.

“I look for examples where you have done something that shows that you have had to go into a different domain, different space, different technology, different stack—and then learned it to produce some value,” he says. When hiring developers, for instance, Lord doesn’t care what programming languages someone has experience in, as long as they can demonstrate that they’ve learned and adapted to different languages as needs have changed.

“I hire people who want to learn something new and are comfortable living outside of what they're comfortable doing,” Lord adds.

Taking the AI concept of continuous learning and applying it more generally to an organization’s culture can unlock new value and accelerate positive change. 

Ultimately, you want to look for evidence that a candidate can handle unexpected challenges, think on their feet, and stay positive and motivated when facing difficult circumstances.



Look for examples where candidates have overcome difficult challenges and adapted quickly to new situations. Perhaps a candidate spent many months working on a complex project at the beginning of 2020 and then had to pivot the project in a new direction during the pandemic. It's also useful to ask candidates to describe how they responded when they had to take initiative and think creatively to solve problems.

Reframing experience to focus less on formal training and more on a candidate’s broader skills also helps you attract a more diverse slate of candidates—and that’s critical for ethical AI. As you experiment and set trends, greater diversity on your AI dream team provides unique perspectives and reduces the chance of building processes that perpetuate bias.

Find people who argue well and listen better

Organizational psychologist Adam Grant suggests people “argue like they’re right and listen like they’re wrong”—and that advice has never been more applicable than in the search for your AI dream team. 

This type of attitude shows that a candidate is open to different ideas and perspectives, and willing to work together with others to reach a successful outcome. These are essential qualities to have when working alongside GenAI programs. 

You need people who aren't afraid to voice their opinion but know how to listen to those around them and take their thoughts into account. In fact, Yale research shows that people who argue to learn are more likely to find success than people who argue to win. This attitude is essential in any workplace, as it allows for collaboration and growth.

Additionally, someone who argues well but listens better will likely be a more effective communicator. They understand how to respectfully express and justify their perspectives, while also being open to hearing other people’s viewpoints—qualities that are important for experimenting with AI.

Assemble intentionally diverse teams

A team full of people willing to hear each other out and learn from new perspectives is primed for higher diversity. And for teams working with and training AI algorithms, that matters. The more diverse your team is, the fewer biases and blind spots you’re likely to incorporate into your GenAI programs.

Teams composed of people with similar backgrounds—even if those backgrounds are universally impressive—rarely perform as well as diverse teams, Mullins notes. “The most successful projects are those that can incorporate diverse, interdisciplinary viewpoints.”

Applying this concept to a team that works with AI is one of the smart bets you need to make to get ahead. “No model lives in isolation—it needs to interact with customers, the business, and otherwise the world at large,” Mullins continues. “If you're closing your hiring aperture to only focus on a certain profile, you are almost certainly introducing blind spots that will prevent you from solving the hardest problems, the ones that require a lot of diverse contexts.”

As we brave a new frontier, AI's impact on hiring offers immense possibilities for growth and innovation. By fostering a team versed in AI, businesses can unlock new heights in productivity, creativity, and efficiency.

 
