AI, Clean Water, and Indigenous Sovereignty: Reflections for International Day of the World’s Indigenous Peoples

We’re giving AI more clean water than many Indigenous communities have to drink. That’s the world we’ve built—and we need to do better.

This year’s International Day of the World’s Indigenous Peoples carried the theme “Indigenous People and AI: Defending Rights, Shaping Futures.” In a UN webinar under that banner, we heard voices from across the globe speak on the intersections of technology, ethics, environmental impacts, and Indigenous rights in the age of artificial intelligence.

The conversations underscored something we’ve been grappling with for some time: we are living in an era where technology is often valued more than human wellbeing.

Canada, like much of the world, is investing heavily in AI. Its investments are guided by two national frameworks: the Pan-Canadian Artificial Intelligence Strategy and the AI Strategy for the Federal Public Service. Both aim to position the country as a leader in ethical, responsible innovation—but neither has meaningfully embedded Indigenous Data Sovereignty (IDS) into its core.

While federal strategies acknowledge working “together with Indigenous partners,” the reality is that Indigenous Peoples are often invited to the table but not empowered to set the rules. True equity in AI development requires co-leadership from the start—not after the technology is built.

There are promising steps. The appointment of Natiea Vinson, CEO of the First Nations Technology Council, to the Canadian Advisory Council on Artificial Intelligence signals movement toward Indigenous governance in tech. Initiatives like the Abundant Intelligences project and the Indigenous Protocol and AI Working Group demonstrate that co-creation rooted in sovereignty is possible. But the broader AI landscape remains uneven, and without prioritizing digital sovereignty, data governance, and technological control for Indigenous Peoples, AI risks deepening social and economic gaps instead of closing them.

Our upcoming report, In Relation to Data, explores as one of its core themes how Canadians and Indigenous Peoples view AI and its impacts. Early insights reveal that awareness of AI's potential effects on Indigenous data, rights, and sovereignty remains limited in many spaces. Even among those engaged in the conversation, uncertainty about whether AI systems respect Indigenous sovereignty is high.

This uncertainty is well-founded. AI tools regularly scrape Indigenous knowledge, languages, and cultural materials from the internet without consent. In doing so, they don’t merely misrepresent Indigenous cultures—they misappropriate them, reconfiguring sacred teachings, symbols, and languages into content for commercial or entertainment purposes. Compounding this harm, most AI systems are not built with relational accountability in mind, meaning they lack the cultural competency to interpret Indigenous contexts accurately.

The issue is not only cultural; it's also environmental. AI development and data centres require enormous resources, consuming millions of litres of water daily, often in areas already experiencing water stress. The same clean water that sustains machines is denied to many Indigenous Nations that still live under long-term boil water advisories. This stark imbalance makes clear that AI is part of a broader pattern of extractive priorities in which technological and industrial growth are privileged over human rights and community wellbeing.

Underlying all of this is a profound clash of values. Indigenous frameworks like OCAP®, which stands for Ownership, Control, Access, and Possession, emphasize relational accountability, consent, and community stewardship. These principles stand in direct opposition to the profit-driven models of surveillance capitalism that dominate most global technology systems. In this context, Indigenous approaches are not simply alternative perspectives—they are vital counterbalances to a digital world increasingly shaped by unchecked extraction.

It’s important to remember that not all technology is extractive. Across Turtle Island and beyond, Indigenous innovators are reclaiming digital space and reimagining how AI can serve communities rather than exploit them. Danielle Boyer, an Ojibwe inventor and educator, created SkoBot, an AI-powered robot that helps teach endangered Indigenous languages using children’s voices and motion-activated responses. Jason Edward Lewis, co-founder of the Indigenous Protocol and AI Working Group, has produced foundational guidance on embedding Indigenous epistemologies into AI systems, alongside his work on digital storytelling and Indigenous futures.

Similarly, Northern Cheyenne technologists Michael Running Wolf and Caroline Running Wolf are preserving endangered languages through immersive AI-driven technologies with the First Languages AI Reality (FLAIR) project. Shani Gwin’s company, Pipikwan Pêhtâkwan, developed “wâsikan kisewâtisiwin” (AI With Heart), an Indigenous-powered AI designed to identify and correct bias and racism in written materials, grounded in matriarchal values. Organizations like Animikii are also advancing culturally grounded, sovereignty-focused tech solutions, proving that innovation can be guided by principles of reciprocity, justice, and cultural integrity.

These leaders and initiatives show that AI does not have to be a tool of extraction. When Indigenous Peoples lead in technology, we see systems designed for language revitalization, cultural resurgence, and community empowerment—not exploitation.

To our friends in tech and AI:

The possibilities are dazzling, but don’t be blinded by the shiny potential and investment dollars. Don’t just think about the next version—think seven generations ahead.

Pause and ask: Who benefits from this? Who bears the cost—human and environmental? Whose knowledge and stories are you building on?

If you are building AI or data tools, ensure there’s a decision-making seat at the table for Indigenous expertise from the very beginning—not as an afterthought. Anything else is just another form of extraction.

At Wabusk Data Solutions, we work with communities and organizations to ensure Indigenous data is protected, governed, and used in ways that strengthen sovereignty and relationships. If this is work you’re committed to, let’s connect.

This post was written by Chelsea Nakogee, a settler of European ancestry and co-founder of Wabusk Data Solutions, based on ongoing dialogue with her husband and co-founder, Savion Nakogee, who is First Nations. The perspectives shared reflect both of our voices and are informed by the guidance and teachings we continue to receive from Indigenous people in our lives and the communities in which we live, as well as by publicly available research and perspectives.
