9 Steps to Implementing AI Without Being Racist
Eight Experts Weigh In on How Artificial Intelligence and DEI Intersect — and How to Overcome Human and Algorithmic AI Bias
A New York City man signed up for LinkedIn and used artificial intelligence to create a fraudulent profile for an AI-generated white male in his 20s or 30s: a Stripe alum and the founder of a nonexistent startup. Within 24 hours, the fake founder received a message from a venture capitalist interested in investing. Notice the line: “A few ex-Stripe buddies of mine had great things to say about you.”
Unfortunately, the “you” in that statement isn’t real. Never was. And no other story I’ve heard better demonstrates the dangers of AI — to security, to privacy, and to diversity, equity and inclusion (DEI).
On the other hand, AI tech can enhance business productivity by 40% — and businesses that employ AI will double their cash flow by 2030 while brands that don’t will see a 20% reduction. Already, more than 75% of businesses are using or exploring AI; nearly three in four executives believe AI will be their greatest future business advantage; and the global AI market is pushing toward a trillion US dollars.
My advice:
Download our report, Everything You Need to Know About AI Right Now:
Then, before investing, read this.
Why Your Employees Aren’t All Embracing AI
Facial recognition technology uses artificial intelligence to identify us via physiological biometric identifiers. Studies have shown it’s racist, both algorithmically and in implementation. ShotSpotter uses AI-powered audio identification and, according to its website, “accurately detects, locates and alerts police to gunfire” — but, unfortunately, it doesn’t; in one city alone, there were more than 40,000 errant AI-generated reports and unnecessary police encounters in predominantly Black and brown neighborhoods in only two years. And then there are the DEI issues also impacting other marginalized groups like women, older people and people with disabilities, including the macro- and microaggressions committed every day in the workplace.
Take overpublicized OpenAI and ChatGPT, for instance:
- “The contributions to human history made by women, children and people who speak nonstandard English [like many Black, brown and economically disadvantaged Americans] will be underrepresented by chatbots like ChatGPT”
- OpenAI, the creator of ChatGPT, kicked off its Bain & Company collaboration to “implement the value of AI” with Coca-Cola, one of the corporations financing Cop City while running Black Lives Matter ads
- OpenAI may soon face competition from a forthcoming — and even less diverse and inclusive — alternative, created by Elon Musk to “fight” “woke AI”
And what about the oft-forgotten Native community? According to Jeff Doctor, an impact strategist for Animikii, an Indigenous tech innovation and equity organization, “these ‘AI’ reshape our data back into the very same stereotypes we've been fighting so hard to counter.”
So, if your staff has been hesitant to embrace AI tech and adopt your new AI strategy, it might be because tech isn’t your only hurdle — or opportunity.
Indeed, if you’re even having these conversations, you’re a phase or two ahead of at least some of your biggest competitors. And if your staff is concerned about bias in AI in the workplace, you’ve probably already started investing in DEI. That means you have the opportunity to improve not only your processes and output with AI, but also your employee engagement and experience, because you see the value in talking about AI (and DEI) outside of the C-suite.
Of course, you have to do it right; you have to overcome both types of bias in AI:
- Algorithmic bias, or data bias, produced when algorithms are trained using biased data
- Human bias, or societal bias, produced when our own assumptions, expectations or prejudices influence the AI’s output
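Algorithmic bias is easy to demonstrate in miniature: a model trained on skewed historical data simply reproduces the skew. The sketch below is hypothetical — the groups, the numbers and the four-fifths audit threshold are all illustrative, not drawn from any real system:

```python
from collections import defaultdict

# Hypothetical historical hiring records: (group, hired).
# The data itself is skewed -- group "B" was hired far less often.
records = [("A", 1)] * 70 + [("A", 0)] * 30 + [("B", 1)] * 30 + [("B", 0)] * 70

# A naive model trained on this data just learns each group's base rate.
counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
for group, hired in records:
    counts[group][0] += hired
    counts[group][1] += 1

rates = {g: hired / total for g, (hired, total) in counts.items()}
print(rates)  # the bias in the data becomes the model's output

# A simple audit: flag group pairs whose selection-rate ratio falls below
# the "four-fifths rule" used in employment-discrimination review.
disparate = rates["B"] / rates["A"] < 0.8
print("Potential disparate impact:", disparate)
```

No amount of modeling sophistication fixes this on its own; the audit step — comparing outcomes across groups — is what surfaces the problem.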
How to Incorporate AI, Without Being Racist: A Conversation
“AI will pick up the biases of whoever creates and trains it, and it also typically will meet their needs as a consumer because they are thinking about using the product from their own perspective,” DEI thought leader Samantha Karlin told me.
That's why AI creators either need significant inclusive-design training or a team of different types of people, including non-American, non-Western people if it’s a global product, to figure out whether the product meets the needs of different audiences and to ensure it doesn’t unintentionally reproduce harmful norms.
For Aaron Winter, Lancaster University sociology professor, Reactionary Democracy co-author and Identities co-editor, there are limitations to AI and DEI. “We already know that AI and algorithms can carry with them and exacerbate biases and inequities, and have played a role in pushing and platforming racism, misogyny, homophobia, transphobia, ableism and far-right ideas online,” he told me. Meanwhile, “DEI as an approach often operates top-down using material provided by third-party private organizations, and does not always address deeply rooted structural or institutional inequities or offer a significant departure beyond technological innovation.”
Thus, he continued, “Any AI would inevitably draw from that knowledge base, reproducing these issues.” Likewise, using AI in DEI would “take the oversight, experience and representational politics out of it all together… and put more money, power and responsibility in the hands of private tech.”
This is why “it’s important to enter into this discussion with a great deal of caution and critical input and perspectives, particularly from those most affected.”
I asked Misa Chien, a tech, CX and inclusion influencer and CEO/co-founder of Autopilot Reviews, a call center incentive program, if it’s even possible for brands to leverage or develop AI tech without the bias.
“You have to make sure that you use a large, diverse set of data, and continually monitor the AI so you can discover why it makes the choices it does and correct any biases,” she said — and “share your best practices with the vendors you use.”
For Vera CEO Liz O’Sullivan, a member of the US Department of Commerce’s national AI advisory committee and an expert on AI, “fair algorithms” and digital privacy, “battling bias in AI is a little bit like fighting it in the real world — there’s no one-size-fits-all solution, technical or otherwise; it’s highly dependent on the use case, the goal and the team, and it’s something that takes continual work over time.”
According to O’Sullivan:
A lot of modern AI makes a big mistake in devaluing the preliminary steps of data gathering and review. Scraping artists’ work to make an AI image generator for instance, is clearly exploitative. You also don’t know what kinds of patterns the machine will learn unless you have people reviewing the training data for potential biases or unsafe content.
From OpenAI’s DALL-E website
Odie Martez Gray, president of the Diversity Cyber Council, broke it down for me:
The function of AI is to logically draw relationships to data and formulate or compile an aggregated result. Knowing this, a simplified overview of the challenge concerning AI and racism is first determining ‘ethical’ data sources that consistently produce unbiased data — because we all know, ‘bad data in, bad data out.’ The second challenge is in implementing compensating controls that moderate data and autocorrect an AI’s logic to a predetermined mean.
But, he added, “I am doubtful we will be able to teach AI the emotional intelligence required to identify and remediate discrimination when the core of the technology's logic is based on the net of our behavior.”
A data scientist and Bentley University mathematics professor, Noah Giansiracusa concurred, providing a real-life example:
Historical bias in the medical profession has led to certain populations getting less care. The AI just sees this data, not the background or cause, and ‘learns’ that these populations don't fare as well. So the AI might then recommend focusing efforts on patients from other populations; tossing in even more data doesn't help.
Then, he shared a real-life experiment that did:
A few years ago, people found, unsurprisingly, that large language models soak up and reproduce all the horrible bias and attitudes they see in the text they're trained on — much of which is from the internet. Text like this tends to refer to doctors as ‘he’ and nurses as ‘she.’ So, they tried doubling the training set by doing a gender swap: anytime the word ‘he’ appeared, they created a new, identical sentence with ‘he’ swapped to ‘she,’ and similarly for all other gendered words. It forced parity, and the technique worked pretty well.
But, he added, “dealing with something like ableism seems like it would be a lot harder.”
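The gender-swap augmentation Giansiracusa describes can be sketched in a few lines of Python. This is a simplified illustration — the word list and sentences are made up, and a real implementation would handle casing, pronoun ambiguity (“her” maps to both “his” and “him”), names and many more word pairs:

```python
# Swap table for a handful of gendered words (illustrative only).
SWAPS = {"he": "she", "she": "he", "his": "her", "her": "his",
         "him": "her", "man": "woman", "woman": "man"}

def gender_swap(sentence: str) -> str:
    """Return the sentence with each gendered word replaced by its counterpart."""
    return " ".join(SWAPS.get(w.lower(), w) for w in sentence.split())

def augment(corpus: list[str]) -> list[str]:
    """Double the corpus: each original sentence plus its swapped twin."""
    return [s for sentence in corpus for s in (sentence, gender_swap(sentence))]

corpus = ["the doctor said he was late", "the nurse said she was ready"]
print(augment(corpus))
```

Because every gendered sentence now appears in both forms, the model has no statistical reason to associate a profession with one gender — which is exactly the parity the experiment forced.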
Jacques Bastien, founder of boogie and a UI/UX professor at the University at Albany, is more hopeful, telling me he’s “optimistic… because, unlike human beings, [AI] can be improved.”
Karlin agrees, adding that “it’s just like being a DEI trainer — the number one thing is that you have to constantly be sensitive to all different groups of people and think about how they perceive things differently based upon their differing lived experiences.”
The best way to do that is to ensure that “whoever is creating, training or using the AI has actually been trained on DEI principles and/or comes from a directly impacted group.”
Plus, Giansiracusa added:
As more organizations think about developing their own AI systems, they should think very seriously about not just the data they feed it and the computational resources to train it, but also the importance of having a secondary training process of direct human feedback, which really seems to be the way to significantly improve a lot of the problematic aspects of AI we saw initially.
IMHO, though, nobody nailed it like Dr. Jabari “Naledge” Evans, an assistant professor of race and media at the University of South Carolina and a visiting scholar at Harvard’s Berkman Klein Center for Internet & Society.
“I believe that identity, culture and connected consciousness are the three most essential components of consumer experience in today’s society,” he said — and “DEI efforts will always fail if the focus is purely on presenting representation, versus disruption of matrices of dominance that typically loom over minorities and women.”
Digital tools like AI, meanwhile, “only help as far as those who create them allow them to travel,” he added. “Setting intentions to not be neutral… is imperative,” so “the only way to truly use AI in ways that prevent the ‘isms’ is to overrepresent those who are typically marginalized by data mining and algorithms.”
As you consider adopting AI at your organization, following the steps outlined herein, keep in mind the concerns and suggestions offered by the AI and DEI experts.
9 Steps to Investing in AI, While Protecting Your Brand, Employees and Customers from Bias
While DEI in corporate speak typically refers to an organization’s HR policies and programs for standardizing processes and procedures related to diversity, equity and inclusion, what DEI really means is an entire organization that not only hires but actively welcomes, values, respects, supports and promotes all workers, especially those from underrepresented populations. It also means making the necessary investments to identify exactly what all your employees truly need and want.
As Harvard Business School professor Robin J. Ely and Morehouse College president David A. Thomas explain:
Being genuinely valued and respected involves more than just feeling included. It involves having the power to help set the agenda, influence what — and how — work is done, have one’s needs and interests taken into account, and have one’s contributions recognized and rewarded with further opportunities to contribute and advance.
This works for the business, too; organizations that follow a DEI framework benefit not only from better branding but:
- Amplified engagement
- Quicker, smarter decision making
- Enhanced imagination and innovation
- Boosted productivity
- Increased profitability
Indeed, companies devoted to DEI earn 140% more revenue, have 230% more cash per employee, and are 70% more likely to capture a new market and 35% more likely to outperform their competitors.
So, be sure to follow the DEI practices below when developing your AI strategy.
Step 1: Do your DEI research
- Survey your staff (there are free templates on the internet) on issues of fairness, discrimination, personal belonging, trust, respect and purpose, decision making, hiring and onboarding, diversity and inclusion, and opportunities and resources (equity). Then, analyze for gaps and opportunities in your employee recruitment, hiring, development and advancement efforts.
- Ensure you’re measuring your DEI efforts against the right KPIs, including (a) percentage of representation on your organization’s board, (b) percentage of representation by employee category, and (c) pay equality, or the ratio of compensation by employee category (i.e., equal pay for equal work), along with promotion and turnover rates, percentage of participation in ERGs, and supplier diversity. Develop an internal dashboard for tracking, analysis and reporting. And continually analyze and adjust for gaps and opportunities in your employee recruitment, hiring, development and advancement efforts.
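The pay-equality KPI above is simple to compute and worth building into your internal dashboard. A minimal sketch, with hypothetical roles, groups and salaries (any real analysis should control for level, tenure and location):

```python
from statistics import median

# Hypothetical compensation records: (role, group, salary).
payroll = [
    ("engineer", "men", 120_000), ("engineer", "men", 118_000),
    ("engineer", "women", 110_000), ("engineer", "women", 112_000),
]

def pay_equity_ratio(records, role, group_a, group_b):
    """Ratio of median pay for group_a vs group_b within one role.
    Equal pay for equal work means a ratio close to 1.0."""
    def med(group):
        return median(s for r, g, s in records if r == role and g == group)
    return med(group_a) / med(group_b)

ratio = pay_equity_ratio(payroll, "engineer", "women", "men")
print(f"{ratio:.2f}")  # values well below 1.0 flag a gap to investigate
```

Tracked per employee category over time, this single number makes pay-equity trends visible on a dashboard without exposing individual salaries.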
Step 2: Hire a DEI director
As Mita Mallick, a LinkedIn Top Voice, advises:
Instead of looking for direct senior DEI leadership experience, consider people with broader backgrounds but all the right skills: the ability to influence and be a change agent, to design strategy and deliver results, to create metrics and drive accountability, and to communicate effectively across all levels of the hierarchy. Those with marketing, sales, or communications backgrounds might be a great fit. Also consider people who have been informal D&I champions or, more specifically, have served as an executive sponsor for an employee resource group. You don’t have to be a career HR professional to do this work.
My advice:
- Start with the LinkedIn Top Voices in disability and advocacy, company culture, leadership, social impact and “next gen,” as well as mental health, racial equity, gender equity and LGBTQ+
- Reach out, schedule meetings, and solicit advice
- Be willing to pay a consultancy fee
- Be flexible and open minded
- Take copious notes on the individuals whose experience, expertise, values and mission most closely align with yours
- Hire from outside your organization, and outside your traditional networking bases and talent pools
Once onboarded, this new hire should lead your DEI efforts.
Step 3: Ensure DEI alignment
For your DEI initiative to work, it must be supported by your people — and the only way to earn their buy-in is to demonstrate the need and the value, or the problem and the solution. The problem for most organizations is insufficient diversity (a recruiting and hiring issue), equity (a hiring, training, development and advancement issue) and inclusion (a company culture issue); the need is improvement in all three areas.
- Kick off the program and initiate the discussion with a town hall on the interests, challenges and biases identified in the companywide survey
- Ask participants to contribute ideas to the development of the program, and provide perks for participation
- Develop the program and distribute details internally, requesting feedback online
- Finalize the program
- Celebrate your alignment and new commitment through public relations, marketing and advertising campaigns aimed at potential new employees and customers
- Ensure your management team never forgets the why behind it all
Step 4: Build an equitable, inclusive culture and safe, comfortable working environment from the ground up
- Create DEI task forces with employees of all ability levels, from all levels of the organization
- Create employee resource groups (ERGs) for employees who share a common characteristic, such as disability, race, ethnicity, gender, generation or religious affiliation, to provide support and expand professional networking and career development opportunities
- Create a public-facing, accessible digital scorecard measuring and showcasing your DEI metrics over time, with particular focus on the least-recognized protected classes, like people with disabilities
- Develop DEI policies for managers and staff, including:
  - A code of conduct, outlining the company’s policy on diversity, equity and inclusion
  - A communication plan, outlining non-discriminatory communication practices
  - A non-discrimination policy, outlining discrimination laws and what is not allowed in the workplace
  - A zero-tolerance policy, outlining how instances of discrimination, harassment, bullying and stereotyping will be addressed by the organization
  - A grievance policy, outlining how employees can use the company’s alternative complaint system
- Develop DEI workshops from the inside out, leveraging your lead learners in the creation of each workshop and training, and:
  - Encouraging the hard conversations
  - Collecting all perspectives, prioritizing members of historically disenfranchised groups
  - Promoting intersectionality, or the interconnected nature of social categorizations such as race, class, gender and sexuality in overlapping and interdependent systems of discrimination or disadvantage
  - Focusing on intervention and not just bias reduction
  - Facilitating ongoing engagement through lead learner-led one-on-one meetings, workshops and town halls, as well as diversity-related outreach programs and informal information sharing
- Create safe spaces in your workplace, such as:
  - Quiet workspaces for workers who may be distracted or overstimulated by open-floor-plan seating
  - Gender-neutral restrooms for non-binary and genderqueer individuals
  - Lactation rooms for new mothers
- If you’re fully remote or have some staff working from home, create safe ‘spaces’ digitally by:
  - Encouraging employees to add pronouns to their email signatures and usernames
  - Inviting employees to reserve time for personal needs by blocking it out on the calendar
  - Honoring introverts by making digital culture events optional
  - Always providing the tools and support necessary to ensure all communications and experiences are accessible
- Deploy an alternative, accessible complaint system (without one, half of all discrimination and harassment complaints lead to retaliation), providing an employee assistance plan (EAP) for anonymous, free support, as well as implementing transformative mediation, designed to empower all parties and ensure each party recognizes the other’s needs, interests, values and points of view
Step 5: Create a diverse and inclusive artificial intelligence working group
Once your organization is prepared to truly investigate the potential uses of artificial intelligence based on the principles of diversity, equity and inclusion, it’s time to build your AI working group.
Assign your CIO, CTO and/or IT director to the manager or coach role, and your DEI director as team staffing and inclusion consultant, and instruct them to work together to build out your working group with:
- Enterprise, IT systems, network, technical, software, platform and data architects and engineers
- Business intelligence and systems analysts
- Cybersecurity and security systems specialists
- Database administrator, data privacy officer and data scientist or data intelligence specialist
- UX designer
- UI designer
- Back-end developer
- Release manager
There should also be mid- to senior-level representatives of each of the impacted business areas, including:
- Marketing
- Sales
- CX
- HR
- Ops
- Finance
The goal of the AI working group is to complete the remainder of the steps without sacrificing your commitment to DEI.
Step 6: Determine if artificial intelligence is a worthwhile investment
To identify your organization's best use cases for AI, you’ll want to survey the most diverse set of workers, across departments and job functions and up and down the corporate ladder. Why?
- Harvard Business Review found that diverse teams solve problems faster than teams of cognitively similar people, and Cloverpop found that diverse teams also experience a 60% improvement in decision-making quality. People Management research goes even further, indicating diverse teams are 87% better at making decisions.
- According to Forbes Insights, 85% of Fortune 500 CEOs believe “a diverse and inclusive workforce is crucial to encouraging different perspectives and ideas that drive innovation,” and another study showed an 83% increase in innovation when employees simply believe their organization is committed to diversity. Further evidence came two years later, when Josh Bersin, a thought leader in corporate talent, learning and HR tech, concluded from a two-year study that “companies that embrace diversity and inclusion in all aspects of their business” are 170% more innovative.
By surveying everyone, you can:
- Develop the most holistic view possible
- Prioritize the perspectives of your employees who’d be using and/or most directly impacted by AI
The first mission of your AI working group must be to decide whether this exploration is even worth the organization’s time and resources. Kick off your introductory meeting by discussing and soliciting detailed responses to the following:
- What would we hope to achieve by introducing AI? What are the biggest weaknesses, gaps and opportunities we could address using AI? Would we create better products or services, expedite our time to market, mitigate risk and increase compliance, overcome our inefficiencies, or improve our customer or employee experience?
- Which types of AI platforms and AI tools would best meet our needs?
- Which tech improvements would be required to enable AI implementation?
- What new roles, if any, would we need to create to train, monitor, manage and report on our AI investments? And what methodologies would we adopt to prevent biased advancement decisions?
- What new processes and procedures would we need to develop to ensure the proper, unbiased use of AI? (Start here or here for tips on ensuring responsible AI practices and “AI fairness”.)
- What types of training would be required to ensure all new, reassigned or role-enhanced employees are equipped to perform their AI-related job functions?
- What obstacles would we face in implementing AI that doesn’t leverage or produce data that proliferates bias? How would we overcome them?
- What is our max budget for the initial investment?
- What other tech and people costs would we incur to enable full AI adoption?
- Which types of benchmarks and metrics would we use to track performance and ROI?
If, after reviewing the group’s written responses to all of the above questions, you and your working group leader believe you should continue exploring AI at your organization, task your tech experts with the next steps.
Step 7: Assess your IT infrastructure for AI capabilities
Even with outdated legacy systems and complicated tech stacks, you can still implement artificial intelligence, intelligently. Of course, before you identify practical use cases or develop an AI strategy, you need to determine:
- What your IT infrastructure can handle
- What can (and cannot) be updated or upgraded to enable not only the adoption but continued use and further development of AI
- What resources you’ll require to compensate for any gaps or weaknesses
- What infrastructure costs you’ll endure
For smaller and/or older organizations transitioning from AI experimentation to implementation, overhead costs can skyrocket as AI tech grows more complex. That favors more strategic, meticulous, innovative and cash-flush organizations that do the research to identify cost-effective systems and methodologies for running their AI software.
When considering the bandwidth, strength and integration capabilities of your current systems and tech stack, as well as what you’ll need to take advantage of all the benefits of AI and automation, prioritize the following:
- High-performance computing capacity. GPUs, for instance, can accelerate deep learning by 10,000%.
- Tailored, high-capacity, scalable storage. Ensure you have a database/storage system appropriate for the amount of data your AI tools will take in and put out, as well as the ways you’ll use that data in real time and over time.
- High-bandwidth, low-latency, scalable networking infrastructure. AI puts strain on your networks, so invest in a seamless, secure global infrastructure provider that can prevent disruptions and outages and expand with your AI/data requirements.
- State-of-the-art security technology. Whether for the privacy and protection of your customers or that of your intellectual property, it is critically important to prevent intrusion or data leakage of any kind.
- Affordability. The price of AI won’t shrink any time soon as the technology becomes increasingly complex and expansive. Don’t get left behind, but be mindful of accruing costs and scale intelligently, increasing investment size and implementation speed only with confirmed results.
Step 8: Identify AI use cases and test and select AI solutions
There’s an automated option for everything, but not all artificial intelligence is created equal. Before demoing the most buzzed-about or high-powered AI tools or hiring an AI developer to create your own, instruct your AI working group to first identify the types of AI tools and platforms that would most significantly impact the work and lives of their fellow employees.
You can also ask them for their feedback on the following.
21 Types of Must-Have AI-Powered Tools to Maximize the Business Benefits of Artificial Intelligence
- Audio recording and editing software — e.g., Adobe Podcast, Cleanvoice, Podcastle, or Resound
- Chatbot platforms — e.g., Imperson, Pandorabots, or ProProfs Chat
- CRM software — e.g., ActiveCampaign CRM, Apptivo, HubSpot CRM, or Klaviyo
- Customer data platforms — e.g., Blueshift, Tealium AudienceStream CDP, or Twilio Segment
- Cybersecurity tools — e.g., Sophos Intercept X Endpoint, Symantec Endpoint Security, or Vectra Threat Detection and Response Platform
- Data visualization and reporting tools — e.g., Microsoft Power BI, STORYD, Tableau, or Zoho Analytics
- Email marketing software — e.g., ActiveCampaign or GetResponse
- Graphic/Web/Logo design tools — e.g., Adobe Sensei, Designs, Flair, Illustroke, or Looka
- Image cleaners — e.g., Autoenhance, Cleanup, or Remini
- Image generators — e.g., Canva or DALL-E 2
- Music generators — e.g., Beatoven, Boomy, or Soundraw
- Productivity tools — e.g., IFTTT for automating tasks, MindMeister for brainstorming and project planning, Scribe for creating best-practices guides, Shift to streamline your workflow, Todoist for managing your to-do lists, or Toggl for time tracking
- Project management tools — e.g., Asana or Wrike
- Recruiting tools and talent management platforms — e.g., Arya by Leoforce, Eightfold AI Talent Intelligence Suite, or TurboHire
- Social media tools — e.g., Lately and Hootsuite, Linkfluence, Ocoya, or Sprout Social
- Text-to-speech software — e.g., Murf or NaturalReader
- Video generators — e.g., Deepbrain, Maverick, Synthesia, or Vidyo
- Virtual assistants — e.g., Google Assistant
- Virtual sales assistants — e.g., Clari, Drift, exceed by GENESYS, or SetSail
- Voice-to-text software — e.g., Otter or VoicePen
- Writing assistants — e.g., ChatGPT, Copy, Text Cortex, Ginger, or Text Wizard
Bonus: Check out EX Squared.
Step 9: Implement, test, monitor, report, iterate and optimize
As with any new tool, tactic or strategy, artificial intelligence isn’t a set-it-and-forget-it solution.
The final role of your AI working group should be to confirm with you, the C-suite and all impacted managers all the new AI-related assignments. Your working group leader should then continue to oversee reporting on/from all campaigns, business units, processes and employees leveraging and/or responsible for artificial intelligence.
This reporting will dictate whether you need to:
- Increase, decrease or maintain your AI investment into the future
- Increase diversity and inclusiveness to overcome AI bias
Learn More About AI and DEI
To learn more about artificial intelligence, download our report, Everything You Need to Know About AI Right Now:
To learn more about diversity, equity and inclusion, read this.