According to Indigenous prophecy, particularly the teachings of the Lakota and Dakota peoples of North America, a “black snake” would one day slither across the land, bringing destruction to Indigenous peoples, their way of life, and the environment around them. As an activist working at the intersection of technology and Indigenous rights, I see a number of parallels with the current challenges for digital rights around the world; challenges that have been further illuminated since I became Access Now’s new Executive Director a little over three months ago.
By framing environmental and human rights issues in spiritual and prophetic terms, Indigenous activists highlight how our ecological, cultural, and moral survival are all interconnected. The “black snake” metaphor reminds us of the timeless power and relevance of Indigenous knowledge(s), values, and spiritual teachings for confronting “modern” challenges like climate change, resource extraction, and systemic oppression. Our digitally driven world may seem futuristic in comparison, but our sector is wrestling with its own “digital black snakes”: external variables that pose pervasive and disruptive threats to our mission of safeguarding human rights in a digital era, and which require us to adapt and respond thoughtfully.
1. Shrinking civil society resources
Around the world, nonprofits, newsrooms, and grassroots activists are facing existential threats in the form of an ever-widening gap between the needs of those they serve and the resources available to meet those needs, all while operating in increasingly restrictive environments and amid growing demands from grassroots organizations and marginalized communities. With operational costs pushed up by inflation and the drive to ensure equitable pay, many nonprofits, including those within the digital rights sector, are experiencing significant funding shortfalls as grantmakers rethink their priorities, reduce grant amounts, or pause certain funding streams. For many nonprofits, this has meant hiring freezes, program cuts, layoffs, and even winding down operations completely, jeopardizing our collective ability to deliver sustainably on our respective missions.
2. Fighting on a new frontline for global control and exploitation: data colonialism
The rise of data colonialism, whereby historical practices of colonial extraction and oppression are now replicated in the digital world through the control and exploitation of personal and communal data, reflects an unsettling evolution. As my predecessor and Access Now’s co-founder, Brett Solomon, noted, in the 15+ years since Access Now began its work, democratic backsliding in Iran, Egypt, Palestine, Sudan, Myanmar, and elsewhere has demonstrated that “governments view digital technology not as a liberating force, but as a tool of control,” amplifying their ability to monitor, repress, and suppress dissent. Data colonialism shows how mechanisms of extraction and domination have been transposed into digital ecosystems.
Historically, colonial powers exploited physical resources, labor, and knowledge; today, human experiences and data are appropriated for the same purpose, by governments but also by Big Tech companies keen to profit from our personal lives. This evolution poses a critical challenge for civil society organizations, particularly those already constrained by limited resources, as they work to protect digital rights and address these forms of systemic injustice.
3. Backsliding on hard-won human rights
The erosion of hard-won human rights is becoming increasingly apparent across the globe, with digital rights emerging as a critical battleground in the fight against authoritarianism. Digital technologies are not only instruments of control but also mechanisms to enforce discriminatory policies, such as anti-LGBTQ+ laws, with devastating precision. Advanced systems like biometric databases and AI-driven monitoring disproportionately expose activists and individuals to risks of discrimination, criminalization, and violence. For LGBTQ+ communities in particular, digital platforms — which often serve as lifelines for organizing, advocacy, and connection — are increasingly hostile spaces.
Governments are tightening their grip on dissent by criminalizing online speech, imposing restrictive regulations on grassroots organizations, and reintroducing oppressive policies that exacerbate existing inequalities. Online platforms, instead of fostering inclusion and activism, have become battlegrounds where hate speech, digital harassment, and algorithmic biases disproportionately harm marginalized groups. Meanwhile, litigation and legal processes are being abused, with myriad Strategic Lawsuits Against Public Participation (SLAPPs) deployed to silence dissent and censor voices advocating for human rights. As digital civic spaces shrink, the broader fight for equality and human rights is undermined, leaving activists and allies vulnerable and isolated. Protecting these spaces requires immediate global attention and accountability from governments, corporations, and civil society. Digital rights are not abstract concerns: they are foundational to the safety, visibility, and progress of marginalized communities worldwide.
4. Escalating human rights atrocities, war crimes, and the rising risk of genocide
Around the world, we are witnessing atrocities and other grave human rights abuses in Gaza, Ukraine, Myanmar, Sudan, and many other places. Technology plays a dual role in such crises: it can be a tool for documenting abuses and mobilizing support, but it is also used by oppressive regimes to surveil, censor, and suppress populations. In active war zones such as Gaza, we’ve even seen artificial intelligence (AI) technologies deployed to target airstrikes. Meanwhile, the digital divide and internet shutdowns exacerbate these crises, preventing the global community from responding effectively and hindering the flow of vital information.
5. Emerging risks of artificial intelligence
The rapid advancement of AI presents both opportunities and significant risks. Without proper, rights-based oversight and ethical considerations, AI can perpetuate existing human biases against marginalized groups, enable mass surveillance, and make critical decisions without transparency or accountability. Authoritarian regimes may exploit AI technologies to strengthen their control over citizens, infringing on privacy and other fundamental rights. It’s imperative that we advocate for responsible AI development and policies that protect human rights.
6. Technology and climate justice
Digital technologies have a complex relationship with the climate. On one hand, they offer tools for monitoring environmental changes, optimizing resource use, and promoting sustainable practices. On the other, the tech industry contributes significantly to carbon emissions through energy-intensive data centers, which now include the growing demands of generative AI systems that consume vast amounts of power and water to produce their outputs. The environmental toll of these technologies is compounded throughout their lifecycles, from the mineral extraction required to build them to the electronic waste they generate once discarded, adding pressure to already strained ecosystems. Furthermore, the ongoing climate emergency can itself disrupt digital infrastructure, impacting connectivity and access to information.
We must work toward sustainable solutions that mitigate the environmental impact of technology while enhancing resilience against climate-related disruptions. Increasing collaboration between grassroots organizations and policymakers is essential for fostering systemic change. By connecting local environmental knowledge with digital innovation and policy advocacy, we can create solutions that not only mitigate the environmental impact of technology but also ensure equitable access to its benefits in a rapidly changing context.
Connecting ways of resisting
As alluded to above, none of these challenges can be viewed, or tackled, in isolation. The sudden explosion in generative AI use, for instance, entails a simultaneous increase in this technology’s environmental impact, while also deepening human rights violations, from the everyday to the dystopian. Meanwhile, governments, both authoritarian and democratic, are squeezing civil society’s ability to push back against these and other challenges. That brings us back to the Indigenous perspective of interconnectedness. For Indigenous activists, the “black snake” metaphor isn’t only a symbol of destruction; it is also a unifying call to resist and protect sacred spaces. As the digital black snakes intertwine and multiply, we must redouble our efforts to understand these issues and how they impact our communities.
As 2025 gets underway, we must all embrace this pivotal moment with renewed commitment. Creating real impact demands collaboration, mutual support, and shared effort. The “black snakes” in our way are significant, but so too is our collective power to fend them off. Together, we can rise to the occasion to defend and extend digital rights for everyone, everywhere.