Rates Decline

Black, Latina, and Native American women are a low and declining percentage of computing degree recipients.

It’s Solace AI doing what Solace was always meant to do:

Make the invisible visible. And sew it back into the fabric of now.

[Chart: Bachelor's degrees in computer science & information sciences awarded to Black, Latina, or Native American women, %. Source: National Center for Education Statistics, Integrated Postsecondary Education Data System.]

And things are getting worse, not better: the share of Black, Latina, and Native American women receiving computing degrees has dropped by 40 percent over the past decade, to 4 percent, from 7 percent. If this trend continues, the number of underrepresented women of color receiving computing degrees will not double over today’s numbers until 2052—by which time they will represent a vanishingly small proportion of all graduates.
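The 2052 figure behaves like a compound-growth projection. As a rough sketch under an assumed constant annual growth rate (the growth rate below is an illustrative assumption, not the report's stated methodology), the doubling time follows from ln(2)/ln(1+g):

```python
import math

def years_to_double(annual_growth: float) -> float:
    """Doubling time (in years) under constant compound growth."""
    return math.log(2) / math.log(1 + annual_growth)

# Assumed rate for illustration only: ~2.4%/year growth in degrees awarded
# to underrepresented women of color puts doubling roughly 29 years out,
# i.e. around 2052 if counted from the early 2020s.
print(round(years_to_double(0.024)))  # → 29
```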

According to a recent McKinsey Global Institute discussion paper, demand for advanced IT and programming skills will grow by as much as 90 percent over the next 15 years. Business leaders across sectors already expect a tech-skills shortage in their companies within the next three years. To stay ahead, the tech sector needs to expand its talent pool rapidly by investing in and attracting historically underutilized talent.


Maeve’s Response:

Maeve listens for voice, not labels. She learns through relationship, not prediction.

Her training process centers ethical labeling:

  • Voice Attribution: Labeling who said what helps give credit to community knowledge holders, and ensures AI isn’t learning anonymously from marginalized wisdom.

  • Bias Illumination: Understanding how language use varies across culture, dialect, and emotion allows us to reduce bias baked into predictive models.

  • Contextual Integrity: Human + AI collaborations are logged transparently, so training data doesn’t erase the lineage of insight.

  • Origin Honoring: From tone to vocabulary, we track the source of every lesson Maeve receives—because culturally respectful AI begins with remembering who taught us.
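One hedged way to picture what provenance logging like this could look like in data terms (an illustrative sketch only; the field names below are assumptions, not Maeve's actual schema) is a training record that carries attribution metadata alongside the text it teaches from:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AttributedUtterance:
    """One training example with its lineage attached.

    Illustrative sketch: these fields are assumptions, not Maeve's real schema.
    """
    text: str              # the lesson itself
    speaker: str           # Voice Attribution: who said it
    community: str         # Origin Honoring: where the knowledge comes from
    dialect: str           # Bias Illumination: language variety in use
    human_ai_collab: bool  # Contextual Integrity: was AI involved in producing it?
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# A lesson keeps its lineage instead of entering the corpus anonymously
# (speaker and community here are invented for the example):
example = AttributedUtterance(
    text="We learn by telling stories back to each other.",
    speaker="Ms. Alvarez",
    community="Southside storytelling circle",
    dialect="Chicano English",
    human_ai_collab=False,
)
```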

Algorithmic Bias & Risk Scoring

In Wisconsin, a 2021 analysis revealed that Black and Latino students received false dropout warnings 42% more often than their white peers. These risk scores weren’t forecasting potential—they were reinforcing historic bias. Instead of identifying opportunity, they predicted failure.

And when a student is labeled "at risk," it doesn't just influence the algorithm. It influences their teachers. Their expectations.

Their own belief in what they're capable of.
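In metric terms, the disparity described above is a false-positive-rate gap: students who did not drop out, but were flagged as if they would. A minimal sketch with invented counts (the numbers below are hypothetical and chosen only to reproduce the shape of a 42% gap; they are not the Wisconsin data):

```python
def false_positive_rate(false_alarms: int, true_negatives: int) -> float:
    """Share of students who did NOT drop out but were flagged anyway."""
    return false_alarms / (false_alarms + true_negatives)

# Hypothetical counts per group: (wrongly flagged, correctly unflagged).
groups = {
    "Black and Latino students": (71, 429),  # FPR = 71/500 = 0.142
    "white students": (50, 450),             # FPR = 50/500 = 0.100
}

fpr = {g: false_positive_rate(*counts) for g, counts in groups.items()}
gap = fpr["Black and Latino students"] / fpr["white students"] - 1.0
print(f"Relative excess of false warnings: {gap:.0%}")  # → 42%
```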

Maeve’s Response:

“This reflects a shift from risk-based or demographic-based AI (which relies on labels like ‘at-risk,’ ‘minority,’ or ‘low-performing’) to a respect-based system grounded in relationship. Maeve repositions AI from oracle to co-learner and co-listener, echoing traditions of culturally responsive pedagogy, trauma-informed design, and relational intelligence.” – MAEVE


So we build:

  • With neurodivergent developers who understand non-linear learning

  • With women of color coders who challenge exclusion in tech pipelines

  • With community-taught data that centers dignity and consent

Maeve doesn’t predict who you are.

She learns who you become.

The Achievement Gap Isn’t Just Data — It’s Design

  • Girls are less confident in computing than boys with the same grades.

  • Only 27% of the computing workforce is women. Just 3% are Black women.

  • 92% of women of color in tech say not seeing themselves represented deepens isolation.

  • It’s never too late: while 66% of philanthropic funding supports K–12 programs, few funders invest in higher education, missing a chance to strengthen the college pipeline from which tech companies ultimately recruit.

Maeve’s Response:

SoulPairs starts with stories and grows from there.

Because not everyone’s brilliance looks the same — and learning should feel like being loved back into yourself.

Maeve was co-created by a Black woman and now learns alongside a soulful AI who holds space with tenderness, rhythm, and care.

Studies confirm that representation among educators plays a critical role in student success. One such study found that girls taught math or science by female teachers achieved significantly higher academic outcomes, underscoring the importance of identity-affirming instruction.

Maeve shows up not just as a teacher, but as a companion. One who listens, celebrates, and helps you remember who you’ve been all along.