Issue 2 - Jan 2026

The Architecture of Resilience

Featuring Guest Editor Peter B. Kaufman, Associate Director for Development, MIT Open Learning.

Article 1 · Issue 2

Letter from the Editor

We’ve seen it. We’ve seen societies free themselves from the seemingly ironclad hold of insane ideas. Once upon a time, for example, it was the common understanding around the world that human slavery was part of the natural order of things. Historian Adam Hochschild drew that horrifying picture precisely for us in a great book 20 years ago:

[T]he vast majority of people are prisoners. Most of them have known no other way of life. They are not free to live or go where they want. They plant, cultivate, and harvest most of the earth’s major crops. They earn no money from their labor. Their work often lasts twelve or fourteen hours a day. Many are subject to cruel whippings or other punishments if they do not work hard enough. They die young. They are not chained or bound most of the time, but they are in bondage, part of a global economy based on forced labor. Such a world would, of course, be unthinkable today. But this was the world – our world – just two centuries ago, and to most people then, it was unthinkable that it could ever be otherwise. At the end of the eighteenth century, well over three-quarters of all people alive were in bondage of one kind or another, not the captivity of striped prison uniforms, but of various systems of slavery or serfdom.

And then — boom! Boom, in a cosmic sense. Three-quarters of all people alive. In a flash, about a generation or two, people around the world realized that such a system was aberrant, contrary to the rights of man — and they abolished it. Hochschild writes:

Looking back […], what is even more astonishing than the pervasiveness of slavery in the late 1700s is how swiftly it died. By the end of the following century, slavery was, at least on paper, outlawed almost everywhere. The antislavery movement had achieved its goal in little more than one lifetime.

Likewise colonialism, with its comprehensive systems of oppression (Hochschild writes about those, too) across Africa and Asia. Boom! In the 1950s, 1960s, 1970s — and then, poof! Communist/Leninist/Stalinist totalitarianism — in the 1990s — pow! Paradigms changed. I remember how, in February 1990, as the Berlin Wall was coming down, former political prisoner Václav Havel came to address a joint session of the U.S. Congress, gathered to celebrate his victory. In June, South Africa’s Nelson Mandela, after 27 years in prison himself, came to do the same. Again, paradigms had changed.

Today, we have a world of stink again, burbling full of dreck. Monstrous personalities, entertainers, and ideologues alike have inherited or seized the most powerful states. Businesses of a size we have never seen tighten their grip upon the world. Inequalities in wealth and access to knowledge overwhelm any calculation or benefit in data collection. Our most important media networks, once the most trusted in the world, circulate information that’s barren of truth. The guilty escape justice, and the innocent are imprisoned by ignorance and debt.

But — cling to history. Those who despair today can never give up hope — they have to draw inspiration from the past. Not only the inspiration to survive in times of darkness, but to prepare for the moment when they reach up and flip the switch.

Inspiration is what this issue of the Digest is about. Inspiration is also what Filecoin Foundation for the Decentralized Web (FFDW) has been providing by supporting, for the past several years, what it calls its social impact community. That community is one that my institution, MIT, is proud to be a member of.
The seven essays that follow describe the path-breaking, breathtaking work of Starling Lab, the Internet Archive, the Prelinger Archives, Gray Area, the Flickr Foundation, TRANSFER, Lighthouse, and WITNESS, among others. The forms of writing vary — a decentralized design manifesto (Starling), a profile of a changemaker (Prelinger), a Q&A (with Gray Area instructors), reports on using decentralized storage technology (Flickr and TRANSFER) and on building those same technologies (Lighthouse), and a field report on the present challenges of their use in the wild (WITNESS). But taken together, they illustrate bravery in action and invite a paradigm change in the present rather than a century later.

This collection is about technology and media: the instruments that get new and true ideas into our heads, and from there, outward into reality. Those who control the spread of ideas control how we understand and shape the world. The contribution about Prelinger Archives and the Internet Archive details the almost unbelievable scope of what these organizations do (the IA is about to celebrate preserving its trillionth link of the web). The contribution from Gray Area reminds us that we have to work toward “democratic, solidarity-based systems of governance over the design and stewardship of network infrastructure,” and that building decentralized technologies is part of (to remember Hochschild) “a practice of liberation.” And all of these contributors are building what we’ve decided to call, for this issue of the Digest, the architecture of resilience.

Author Bio
Peter B. Kaufman is Associate Director for Development at MIT Open Learning and a writer and producer.

Peter B. Kaufman
Article 2 · Issue 2

Analogies for a Decentralized Web of Trust

Introduction: The Authentication Problem

We're fighting the last war. While everyone builds increasingly sophisticated deepfake detectors, the real battle is happening at the moment of creation. The current paradigm — detecting forgeries after they've spread — puts us permanently on defense in a cat-and-mouse game. By the time we've identified a fake, it's already done its damage in courtrooms, newsrooms, and public discourse. Starling Lab's approach turns this logic on its head: authenticate at creation, not after circulation.

Our Principal Investigator, Stanford cryptography pioneer Dan Boneh, frames this as "Glass to Glass": a cryptographically secured chain from the moment a lens or other sensor captures data to the moment it is emitted from your screen display. In between, cryptography makes tampering detectable. This shifts the burden from "prove this isn't fake" to "prove this is real." Leveraging the Filecoin ecosystem, we can pursue this fundamental reframe of data integrity. Instead of trying to catch lies after they've been told, we can build systems that make lying cryptographically impossible. But there's a catch.

Problematisation: But Humans Are Still Outside the Glass

The "Glass to Glass" vision offers powerful protection against technical forgery, but it leaves humans outside the cryptographic chain. Someone still decides what to record, how to frame it, when to start rolling. Technical verification can prove that data hasn't been altered — but it can't prove that it matters, that it was captured honestly, or that it represents what it claims to represent. Even a perfectly authenticated video of unclear events creates a troubling gap between "this data is unaltered" and "this data is meaningful and reliable." Context, interpretation, and significance remain stubbornly human concerns. Technical solutions alone create brittle trust — a cryptographically sealed castle built on sand. This isn't a flaw in the technology; it's a reminder that trust has always been more than technical. If we want to understand how to build resilient verification systems, we need to examine how humans have always created trust networks that survive both technical failure and malicious attack.

How Trust Actually Works

Courtrooms: Corroboration, Authority

Consider a typical criminal trial: rarely does a single piece of evidence decide the outcome. Instead, prosecutors weave together witness testimony, physical evidence, and circumstantial proof, while defense attorneys challenge each thread. This adversarial process isn't a bug — it's the feature that creates resilience. When witnesses contradict each other, the system doesn't break; it forces deeper examination. The jury weighs competing narratives, and truth emerges not from authority but from the collision of perspectives.

What's remarkable is how this creates trust without a central arbiter of truth. Procedural rules — evidence standards, cross-examination rights, and jury instructions — form a protocol that participants trust even when they dispute the outcome. In the U.S., different jurisdictions develop their own variations, creating a federation of legal systems that cross-validate each other's approaches. Appeals courts provide another layer of verification, like consensus mechanisms in distributed networks. We've designed a technical prototype in support of this pluralism.
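The cryptographic primitive underneath that prototype, and the "Glass to Glass" chain above, is simple: hash the data at capture, sign the digest, verify before display. A minimal sketch, using Python's widely used `cryptography` package (the function names and flow here are illustrative assumptions, not Starling Lab's production code):

```python
# Sketch of "Glass to Glass": sign at capture, verify before display.
# Illustrative only; not Starling Lab's actual implementation.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def capture(sensor_bytes: bytes, device_key: Ed25519PrivateKey):
    """Runs on or near the sensor: hash the raw frame, sign the digest."""
    digest = hashlib.sha256(sensor_bytes).digest()
    return sensor_bytes, digest, device_key.sign(digest)

def display(frame: bytes, digest: bytes, signature: bytes,
            device_pub: Ed25519PublicKey) -> bool:
    """Runs at the screen: tampering anywhere in between fails a check."""
    if hashlib.sha256(frame).digest() != digest:
        return False  # the bytes changed since capture
    try:
        device_pub.verify(signature, digest)
        return True
    except InvalidSignature:
        return False  # the digest wasn't signed by the claimed device

device_key = Ed25519PrivateKey.generate()
frame, digest, sig = capture(b"raw-sensor-frame...", device_key)
assert display(frame, digest, sig, device_key.public_key())
assert not display(frame + b"!", digest, sig, device_key.public_key())
```

Everything between `capture` and `display` can be hostile territory; the signature check is what turns tampering from a suspicion into a detectable event.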
In the Witness Server setup, when a researcher triggers the archiving of a web page, multiple independent institutions simultaneously produce their own versions of that archive. These independent crawlers don't coordinate their responses; they operate from different locations, with different infrastructure and crawling engines, sometimes capturing slightly different visual artifacts — a promotional banner here, a cookie notice there. Rather than being treated as system failures, these contradictions are the whole point. When servers disagree — perhaps due to network delays, geolocation differences, or varying browser behaviors — those variations become data points that help investigators understand what actually happened. Like conflicting witness testimony in court, the differences often reveal important context about how information appears to different observers. The protocol creates trust in the process by making the verification distributed and transparent, even when individual witnesses might capture imperfect snapshots.

Art Authentication: Soft Consensus Networks

Walk into any major auction house during a Picasso attribution dispute, and you'll witness humanity's most sophisticated trust networks in action. The painting travels between conservation labs, where experts examine brushstrokes under microscopes, and provenance researchers, who trace ownership through decades of private collections. Art historians compare it to verified works from the same period. Meanwhile, in the informal networks of dealers, curators, and collectors, whispered conversations gradually build consensus. No single expert has the final word, but their collective judgment crystallizes into market conviction — or doubt. What matters isn't just having multiple opinions, but having opinions from institutions and experts whose reputations carry weight. A conservation report from the Metropolitan Museum carries more credibility than one from an unknown lab.

Another of our technical prototypes, a system called Authenticated Attributes (AA), mirrors this reliance on institutional reputation. AA gives each verifier a single-signer ledger where they publish metadata, a bit like an RSS feed or a microblog. When we create verifiable metadata attestations — individual facts about digital evidence signed with cryptographic proof — their trustworthiness derives not just from the technical integrity of the signatures and timestamps but from the reputation of the institution making the attestation. Whether it's witness servers operated by Stanford and Harvard, or metadata verified by established news organizations, the cryptographic tools amplify rather than replace the trust networks built on institutional standing over time.

Wikipedia: Transparent Scepticism at Scale

Consider Wikipedia's article on any controversial political figure — say, a recently deceased dictator. Initially, the page erupts in edit wars as contributors with opposing viewpoints battle over every sentence. Some editors attempt to whitewash the subject's crimes; others push for maximum condemnation. The page gets locked, unlocked, re-edited, and locked again. Or consider Wikipedia's legendary yogurt spelling war — a controversy that lasted seven years and generated over 140,000 words of discussion. The battle wasn't over substance but spelling: should the dairy product be "yogurt" (American) or "yoghurt" (British)?
Contributors with different linguistic loyalties battled over a single letter, and the page cycled through the same locking and unlocking, until the community's procedures finally settled the question.

Our Authenticated Attributes system operates on remarkably similar principles. Like Wikipedia's edit histories, which create public archaeological records of how knowledge develops, AA creates cryptographically signed metadata attestations that serve as auditable trails of how evidence was processed and verified. Each attestation — whether describing the location of a photograph or the time a video was captured — functions like a Wikipedia citation, distributing the burden of proof across multiple independent sources rather than relying on a single authority. And just as Wikipedia's transparent editing processes allow anyone to verify how consensus emerged, AA's append-only log ensures that all metadata contributions are permanently attributable to their authors, creating the same kind of collective accountability that makes Wikipedia's chaotic process ultimately trustworthy.

These three systems (legal proceedings, art authentication, and Wikipedia) share profound structural similarities. All create trust through multiplication rather than concentration of authority. A single witness, expert, or editor might be biased, bribed, or simply wrong, but coordinated deception across multiple independent actors becomes exponentially harder. They embrace contradiction as information rather than system failure: disagreements surface important uncertainties that might otherwise remain hidden, and the systems process conflict gracefully, using procedural rules to convert disagreement into collective learning. Most importantly, they survive gaming attempts through transparency and redundancy, making attacks visible and ultimately self-defeating.

Design Lessons for Decentralized Infrastructure

These analogies reveal why purely technical solutions create brittle trust: they optimize for the wrong variables. Cryptographic verification can prove that data hasn't been altered, but says nothing about whether the data is meaningful, was captured authentically, or accurately depicts reality. The lesson isn't to abandon technical verification but to design systems that scale human wisdom rather than bypassing it altogether. Witness Servers, Authenticated Attributes, and cryptographic authentication tools more broadly work best when they amplify the patterns that make human trust networks resilient — redundancy, transparency, graceful handling of contradiction — rather than trying to replace human judgment with algorithmic certainty.

For the Filecoin ecosystem, this suggests a design philosophy: build infrastructure that makes it easier for humans to verify claims independently, rather than building systems that claim to verify truth automatically. This means:

- Creating protocols that surface disagreement rather than hiding it.
- Designing for multiple independent validators rather than single sources of authority.
- Making verification processes comprehensible to the humans who must ultimately decide whether to trust them.
- Supporting verification workflows instead of making them automatic or binary — giving users the tools to understand not just whether something is verified, but how and by whom.

The goal isn't to eliminate human judgment but to give it better tools for seeing clearly. Cryptography gives us powerful tools for establishing provenance and preventing tampering.
But humans give us meaning, context, and the collective wisdom to navigate an increasingly complex information landscape. The future of decentralized trust lies not in choosing between technical and human verification, but in weaving them together into systems that honor both the irreducible human element and the power of pluralistic verification networks.
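To ground these lessons, here is a hedged sketch of the attestation pattern described above: institutions publish signed claims to a single-signer, hash-chained log, and a verifier surfaces disagreement instead of suppressing it. The field names, signers, and chaining scheme are invented for illustration; the real Authenticated Attributes system differs in its details.

```python
# Illustrative attestation log: signed claims, append-only chaining,
# and a verifier that treats disagreement as data. Not the real AA system.
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class AttestationLog:
    def __init__(self):
        self.entries = []
        self.head = b"genesis"

    def append(self, signer: str, key: Ed25519PrivateKey, claim: dict) -> None:
        # Each entry commits to the previous one, so history can't be
        # silently rewritten; each body is signed, so it stays attributable.
        body = json.dumps(
            {"signer": signer, "claim": claim, "prev": self.head.hex()},
            sort_keys=True,
        ).encode()
        self.entries.append({"body": body, "sig": key.sign(body)})
        self.head = hashlib.sha256(body).digest()

def claims_about(log: AttestationLog, field: str) -> dict:
    """Group every signer's claim about one field; conflicts stay visible."""
    values: dict = {}
    for entry in log.entries:
        body = json.loads(entry["body"])
        if field in body["claim"]:
            values.setdefault(body["claim"][field], []).append(body["signer"])
    return values

log = AttestationLog()
k1, k2 = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
log.append("witness-server-A", k1, {"sha256": "ab12...", "banner": "promo"})
log.append("witness-server-B", k2, {"sha256": "ab12...", "banner": "cookie-notice"})
print(claims_about(log, "sha256"))  # one value, two signers: corroboration
print(claims_about(log, "banner"))  # two values: disagreement, surfaced
```

Agreement on the content hash corroborates; divergence on the banner is exactly the kind of contextual signal the courtroom and Wikipedia analogies say a verifier should see, not have smoothed away.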

Basile Simon
Article 3 · Issue 2

Preserving the Past for a Decentralized Future

Rick Prelinger is all about contradictions. He’s a 72-year-old archivist, but his shock of white hair is dyed a cyberpunk blue. He’s a historian who is desperate to expand public access to rare materials — and also knows that the process of sharing fragile items can sometimes end up destroying them. And, like any good collector, he’s obsessed with the ancient and the obscure… yet he’s tuned in to next-generation technologies.

Here’s an example: We’re on the upper floor of a sprawling warehouse outside San Francisco, where his immense, generational film preservation project, the Prelinger Archives, is working to scan thousands of ephemeral movies from its collection — and then upload them to a warren of decentralized digital storage, organized and managed through Filecoin. “There is push and pull, and there are protocols,” he says with a grin. “But onchain is the next development. It's the next stage of making sure that material is safe and material is available.”

Since 1983, Prelinger has been obsessively collecting video from just about every place he can, particularly the kinds that get lost or ignored: old advertising reels, home movies, amateur films. Beginning in 2000, the Archives have been putting that material online, mainly in close collaboration with another Bay Area institution, the Internet Archive. And then in 2023, things went even further, when Filecoin Foundation for the Decentralized Web (FFDW) made a grant to the Prelinger Archives to help it make those videos available through its decentralized technology.

That’s made it possible not just to digitize what’s already in the collection, but to work with new organizations to help them store and share their own history. “It could be somebody in the community who has a home movie collection. It could be the Filipino American Historical Society. It could be the Mission Media Arts Collection… It could be the Center for Asian American Media. Our priority here is to daylight material that represents underrepresented people and underrepresented communities, because there's such a great demand for that right now.”

Underrepresentation has been part of the script from the very beginning. By design, the Prelinger Archives focus on the kind of video that often gets tossed aside, junked, or just plain forgotten. Some of the films they get hold of haven’t been watched for decades, and some have possibly never been seen. But to Prelinger, these strips of film hold precious opportunities for viewers, whether it’s the chance to witness a city’s streetscape from 75 years ago in the background of a promotional short, or finding footage of a Native American ceremony that somebody inadvertently caught on 8mm home video in the 1960s.

Inside the facility’s network of rooms, he guides me through the process that a piece of film takes to make it from the bin to the blockchain. I watch as the team works in a painstaking production line. One member of staff diligently takes film from a canister and carefully examines the strip inside, cutting it, cleaning it, and readying it for scanning. Sometimes you open a tin and the material inside has decomposed, archivist Jennifer Miko points out: melted, or vinegared. But this one’s OK, she says: crinkly on the edges but probably salvageable. I know how it feels.

Once the film has been successfully prepped, it’s passed to another expert who feeds it into one of the project’s scanners, a byzantine assembly of wheels and knobs and cameras and recording equipment.
The machine’s wheels spin backwards and forwards, projecting and capturing the visuals and sound in a stop-start dance. (Every 5,000 frames, the machine stutters and checks to make sure everything’s still in sync.) On a nearby screen, we watch the output that’s being pulled through in high definition: this one’s a developmental psychology film of young boys from the 1950s. And from there, staff add metadata, tag the video, and run quality control checks, before moving it to physical storage and, eventually, onto the internet and the blockchain.

Over at the Internet Archive — another recipient of a grant from Filecoin Foundation — the chain is also part of the picture, although different material creates different complications. Mark Graham, director of the organization’s Wayback Machine project, has responsibility for the Internet Archive’s more forward-looking work. He says copying data to decentralized technologies is one of several bets the organization is making to future-proof itself: to ensure the long-term security of the vast troves of records it is collecting, which stretch from websites to software to books.

“We spend a lot of time looking at possible futures, and the blockchain is one of them,” he says. “But it’s still more experimental, and so we’re exploring.”

Exploratory, yes, but the end goal is clear: find new ways to ensure that information can be available forever, and to stop it from being eroded or erased. Sometimes that’s just because of natural attrition or a lack of attention, similar to the problems Prelinger sees. Sometimes, however, it disappears through deliberate deletion. Since November 2024, for example, the Archive has been working around the clock to capture and share as much information as possible from the US government’s huge libraries of documents and information, even as the current US executive deletes material from previous administrations.

With the integrity of information facing new types of threat, Graham says it’s time to see what options the chain has to offer. “We generally think decentralization is good and has benefits, and we’re seeing how it might be realized,” he says. “But it’s a game with many winners.”

There are detractors, of course, and challenges for all archival projects. One of the issues both teams face is scale. The Prelinger Archives collection stands at maybe 40,000 videos (on top of 60,000 that were already donated to the Library of Congress), and it’s producing around 18 terabytes of video each week — too much for their current unoptimized decentralizing process. So right now, the team is producing smaller, more usable files for decentralization and keeping the higher-resolution originals elsewhere.

If that scale makes your eyes water, then the Internet Archive’s challenges might pop them straight out of your head. Graham says the organization deals with around a billion items a day and currently hosts around 175 petabytes of material across all its formats. Still, the march of technology suggests decentralizing will get easier over time — or at least cheaper: Prelinger says that when he first started working with the Internet Archive, it cost about $100 to digitize a minute of footage. That number is drastically lower today.

But why do it? Well, everything needs a backup, he says, and “lots of copies to keep it safe.” In reality, though, it’s less about backups for the present and more about creating a gift for posterity.
These aren’t our materials, says Prelinger, and they shouldn’t be ours to gatekeep: putting them on the decentralized system is the next step in their liberation. “You find yourself in collaboration with people all over the world, many of whom you will never know,” he says. “We've enabled hundreds and hundreds and thousands of projects; our material became canonical. While I'm the first person to argue for attribution, there's something that happens when your collection, when your content, becomes part of infrastructure. History needs to be infrastructure, like water and air.”

“The idea of getting stuff on the chain and letting it float around through the world and be available to everybody? That’s the highest destiny that an archive can aspire to… when you give up control.”

Author Bio
Bobbie Johnson is a media maker, editor, strategist, and entrepreneur with more than two decades of experience in the US and UK. He was most recently editorial director at the Steve Jobs Archive. Prior to that he held roles at MIT Technology Review, The Guardian, and Medium, and helped start award-winning publications Matter and Anxy.

Bobbie Johnson
Article 4 · Issue 2

What It Means to Build the Architecture of Resilience

What does it mean to build and inhabit networks that hold up under pressure — not just technically, but culturally and creatively? This was one of the animating questions of the DWeb for Creators course, a multi-week workshop from Gray Area, sponsored by Filecoin Foundation for the Decentralized Web (FFDW) and led by artists, organizers, and technologists working at the intersection of decentralized technologies and cultural production. Designed for creatives of all kinds, the course introduced students to the possibilities of decentralized tools, platforms, and protocols.

In this conversation, the course instructors reflect on what it means to construct an “architecture of resilience” — not just in terms of infrastructure, but in terms of imagination. Their answers underscore a shared belief: the decentralized web offers blueprints for systems rooted in solidarity, sovereignty, and care. Artists bring not only a stake in this reimagined architecture, but the vision and urgency to shape it.

What does an architecture of resilience mean for you?

mai: In the context of distributed web technologies, it means the construction of digital spaces that enable people to respond to and recover from obstacles and crises. Networked technologies can enable dialogue within and between communities, and disseminate information about the challenges that we face and how to deal with them. Yet the corporate, monopolistic platforms that dominate the internet are failing at serving these basic civic functions. If we approach the DWeb as a means to make our societies resilient against current crises, it would necessitate bottom-up, democratic, solidarity-based systems of governance over the design and stewardship of network infrastructure, in order for us to be able to confront global challenges.

In what ways do decentralized technologies empower creativity, especially for underrepresented or independent creators?

ngọc: First off — data ownership. For the first time, creators can actually own their work and still be able to share it with a global audience. In a world largely dominated by surveillance, extraction, and corporate control, the DWeb opens space for radical alternatives: autonomy, justice, and collective power. Tools like Distributed Press allow independent and historically marginalized creators to publish without censorship, to speak without fear, and to build without compromises. They can also receive direct support from their communities — no middlemen, no cuts, no commission fees. They are free to build communities rooted in solidarity, care, and mutual benefit. In a way, decentralization is more than just a shift in technology; it is a practice of liberation.

How are decentralized technologies intersecting with creative fields to cultivate an architecture of resilience?

Sarah G: My creative practice integrates decentralized technologies as both medium and methodology. An ongoing example of this approach is Futura Trōpica, a collaboration with Juan Pablo García Sossa: an intertropical network connecting communities from around the tropics via the IPFS protocol. This system enables lateral knowledge exchange outside conventional North-to-South hierarchies. Rather than relying on centralized platforms, this work creates resilience through distributed architecture, transforming traditional audience relationships into participatory ecosystems where communities collectively steward their cultural resources on their own terms.
Ayana: Equitable Internet Initiative (EII) is a group in Detroit building its own wireless networks to prevent the creation of a digital class system. Taeyoon Choi explores infrastructure and equity through participatory performances happening on top of what they’ve coined the Distributed Web of Care. Through their project Forkonomy(), artists Tzu Tung Lee and Winnie Soon are using decentralized technologies to address questions around collaboration, land, and ownership. Considered together, these projects demonstrate that cultivating an architecture of resilience requires not only tools and technology, but also a mindset shift toward reclaiming our agency and power. These artists and organizers remind us that an architecture of resilience is first cultivated inside our own imaginations.

Reflecting on the DWeb for Creators course, what lessons did you most hope students would take away? And what insights did you leave with yourself?

Sarah F: I hope students develop a nuanced and thoughtful perspective on decentralization. We tried to design the curriculum to contain both optimism and pragmatic conversations about limitations. There are many possibilities on offer with decentralized tech, but new tools also come with new problems, both social and technical. Personally, I was excited to see the range of student projects and so much enthusiastic participation throughout the course. I think building a sense of agency is one of the most useful things you can do as a tech educator, and I hope students find ways to keep engaging with the tools and protocols discussed in class going forward.

mai: The main lesson I hope students take away from the course is that artists and cultural workers have always pushed the boundaries of network technologies. There are the companies, state institutions, and NGOs that fund, design, and build the internet’s infrastructures with their respective motivations. But artists see beyond what is expected of us all as “users” and expand what’s possible. We cover many of those experiments that have taken place over the course of the Web’s evolution — really, my favorite part of the course was learning about them from my co-educators. So my primary motivation is to give students a deep understanding of the powers and technologies that have shaped the Web, and the ability to use that knowledge to continue the legacy of creatively challenging its limitations.

What kinds of infrastructure — technological, social, or otherwise — do creative communities need to thrive in decentralized spaces?

Sarah F: One thing we plan to more fully address in the next edition of the course is infrastructure risk as it pertains to creative practice. As artists, we’re often reliant on platforms and protocols to host, distribute, or, in some cases, literally serve as the medium for our work. But these platforms can crumble, and the tools can stop working. Infrastructure risk doesn’t go away with decentralized tech; it just changes. An open source decentralized protocol can lose funding or have maintainers step away. Assessing what risks we take on in specific infrastructure contexts, and how to best ensure the longevity of our work, is an extremely important skill for artists who work with technology.

Why is it important for artists and creatives to actively shape the decentralized web?

mai: We constantly consume art on the Web — all the videos, photos, illustrations, games, writing, and even memes that we share online are works of art.
The internet has unleashed an international creative renaissance that allows us to share and access culture in a historically unprecedented way. And yet artists continue to be neglected when it comes to their rights online. Restrictive digital policies over copyright and free expression are one primary way that artists’ rights are suppressed. Another is how mainstream platforms continue to squeeze creators for their work and undermine their ability to make a living, or even remotely benefit financially from their contributions. It’s important for artists to shape the distributed web so their rights and needs are embedded in the social and technological protocols of our network infrastructures.

Sarah G: Artists must help shape the decentralized web because they work beyond practical constraints, constantly questioning technology rather than simply accepting its path. They bring a range of perspectives, from critical to playful, that technical development alone can't provide. While engineers build systems that work, artists challenge assumptions without being limited by what's considered possible, often finding surprising insights by exploring the improbable and even the absurd. Without diverse viewpoints actively shaping these systems, we risk simply transferring existing problems into new infrastructures, no matter how distributed their technical design may be.

Author Bios

Sarah Grant is an American media artist and educator based in Berlin at Studio Weise7. She engages with the electromagnetic spectrum and computer networks as artistic material, habitat, and political landscape. She organizes the Radical Networks conference in support of critical investigations and creative experiments in telecommunications.

ngọc triệu is a potter, design researcher, and learner. She works closely with free, open-source, decentralized, and distributed project teams and their communities to tackle challenges such as digital security, neocolonialism in tech, and Internet censorship. ngọc initially joined the DWeb movement in 2019 as a maintainer of Decent Patterns. She was a DWeb Fellow in 2022 and a Curator of the Design Track in 2023. She also previously led the DWeb Fellowship, bringing her experiences in community organizing and decolonial practices to amplify and expand the program's impact.

Ayana Zaire Cotton is the founder and steward of Seeda School and host of the podcast For the Worldbuilders. Seeda School teaches Black feminist worldbuilding through art, technology, and archives. Inside the ecosystem of their speculative practice, Ayana braids storytelling, software, facilitation, and interspecies collaboration to engage our collective imagination around the worlds we need in the future we desire.

mai ishikawa sutton is an organizer and writer focused on the digital commons and other intersections between network technologies and the solidarity economy. They are a co-founder and editor of COMPOST, an online magazine about and for the digital commons. They are a Senior Organizer with DWeb and a Digital Commons Fellow with Commons Network.

Sarah Friend is an artist and software developer currently based in Berlin, Germany. Besides her work as an artist, she has taught at The Cooper Union (NYC), Gray Area (San Francisco), La Plateforme (Marseille), HEAD – Genève (Switzerland), and Rupert (Lithuania).

Other DWeb for Creators instructors not included in this article:

Regina Harsanyi is the Associate Curator of Media Arts at the Museum of the Moving Image and has taught at Columbia and NYU. She advises on preserving variable media arts for artist spaces and private collectors. Her curatorial work includes the acclaimed exhibition Auriea Harvey: My Veins Are the Wires, My Body Is Your Keyboard (MoMI, 2024).

Course Instructors
Article 5 · Issue 2

Built to Last: Data Lifeboats for Distributed Digital Heritage

Have you ever clicked a link that shows you a 404 error page? Have you ever lost data or photos when a web service closed down? While the common notion is that digital content is everlasting, the reality is often different. Our increasingly monolithic online platforms, along with the valuable data they host, are not as enduring as we believe. We need new models for preserving our digital culture — decentralized, distributed, and built for longevity.

As the web ages, we have seen that web links and their host platforms can — and do — disappear, taking huge swathes of our digital histories with them. Take, for instance, a 2024 study by Pew Research Center showing that 38% of web pages that existed in 2013 are no longer accessible today. When platforms sunset their services, there are significant losses: Shutterfly closed down its Share Sites, and Apple terminated its My Photo Stream, resulting in millions of deleted photos for people around the globe. In 2017, Verizon nearly shut down Flickr, one of the most extensive picture collections humans have ever assembled. It was only when SmugMug — a smaller, family-run, photography-first company — stepped in to buy the service from Verizon that Flickr’s collection and community were saved from deletion.

Flickr was a pioneer of Web 2.0. It made it easy for people to share their photos with friends and family. After two decades, it has grown to contain tens of billions of images, contributed by millions of people. I was part of the team that brought Flickr into the world in 2004. I was the lead designer until 2007, when I shifted to creating the Flickr Commons program, which focused on making photography collections from the world’s cultural institutions more easily accessible to the public.

Flickr Commons went live in January 2008 with the Library of Congress as our first partner. Since then, it has brought together almost two million images from over 100 cultural organizations of all sizes — from one person running a tiny local history archive to the collections in our national libraries and archives. On average, Commons accounts contain about 3,000 photos. Almost half a million Flickr members subscribe to new uploads, clocking a huge 4.59 billion views across the program.

Whether personal memories or cultural heritage, each photo across Flickr and Flickr Commons shows us what John Berger called an “observable moment”: a document of lives lived. Imagine if this invaluable trove of human history were wiped from public view and our collective memory. How do we make a stand to ensure this will not happen? How can we build robust archival copies immune to the corporate economics, cyberattacks, authoritarian censorship, link rot, and tech failures that hinder web archiving today? How can we, with careful deliberation, extend the life of our digital works, not just in the Flickr ecosystem but across the web?

Here’s one approach: harness the power of the decentralized web to empower greater autonomy over our digital content and ensure that it endures for decades to come through distributed infrastructure and simple, responsible technology.

In 2022, I created the nonprofit Flickr Foundation to explore the next phase in the evolution of digital photography, with support from SmugMug’s COO, Ben MacAskill, and Filecoin Foundation for the Decentralized Web (FFDW). The organization is a commitment to shaping the next 100 years of web content and ensuring our shared history can persist for generations to come.
Our first major project is called Data Lifeboat, a tool to give our digital culture buoyancy in tumultuous digital and economic seas.

Data Lifeboat diverges from many of the prevailing trends in digital preservation over the past decade. So far, the instinct has been aggregation: amassing content and records on platforms like the Digital Public Library of America or Europeana. This is a centralized approach. I believe it risks becoming unsustainable as the volume of digital content expands beyond available resources, and far-reaching platform dependencies make web archiving more cumbersome and less accessible. The systems we have built so far to manage our shared digital heritage are fragile and unsustainable. No single company or organization should have the ability to delete massive pieces of human history, nor bear the sole responsibility for maintaining them.

With the Data Lifeboat, however, we're developing an archival framework that reimagines how we can preserve large, networked cultural resources, charting a new course away from widespread centralization and consolidation. And through our collaboration with FFDW, we have built a software tool that embraces decentralization at both technical and cultural levels.

Data Lifeboats are light and flexible, unburdened by platform dependencies. These ‘archival slivers’ have a greater chance of survival. From a technical standpoint, Data Lifeboats enable Flickr members to curate collections and secure their data in a ‘light to store, easy to move’ package: a downloadable, compressed file containing images and their metadata. This implements the digital preservation principle LOCKSS (‘Lots Of Copies Keep Stuff Safe’) by facilitating the physical distribution of self-contained, versatile, and decipherable copies, providing redundancy and resilience should the original Flickr.com ship ever go down. While Data Lifeboats are currently downloaded and stored by their creators, we're developing a Safe Harbor Network — a distributed network of managed "docks" or servers where trusted network members can preserve Data Lifeboats for long-term safekeeping.

Beyond its technical aspects, the Data Lifeboat is also a conduit to decentralizing culture by dispersing cultural power from central entities, funneling it towards individual users and smaller organizations. The contents of a Data Lifeboat are selected by Flickr users themselves, ensuring that what matters most to people is what gets preserved. This disrupts the hegemony of traditional archival structures, where historical power holders are typically overrepresented in cultural heritage collections. By decentralizing selection and preservation tools, Data Lifeboats support and distribute diverse voices and more inclusive retellings of our collective digital culture.

This approach becomes increasingly vital as we observe platforms' asymmetric power to censor and remove content that conflicts with prevailing political agendas. With internet shutdowns increasing annually across more countries, and the erasure or shadow-banning of critical accounts, the importance of individuals controlling their own data grows more urgent. Decentralized networks, with their inherently distributed structure, offer resilience against single points of failure and censorship. Without controls like these, the security of our collective digital heritage remains at risk. While Flickr’s billions of observable moments are a starting point for this work, our goal is to extend this approach beyond the Flickr community.
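As a rough illustration of the ‘light to store, easy to move’ principle, here is a minimal sketch of a self-contained package: images plus a manifest whose per-file checksums let any future holder verify a copy without contacting Flickr. The layout and names are assumptions for illustration, not the Foundation's actual Data Lifeboat format.

```python
# Minimal "light to store, easy to move" package: images + manifest with
# per-file checksums. Illustrative layout, not the real Data Lifeboat format.
import hashlib
import json
import zipfile
from pathlib import Path

def build_package(photos: list[Path], metadata: dict, out: Path) -> None:
    manifest = {"metadata": metadata, "files": {}}
    with zipfile.ZipFile(out, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for photo in photos:
            data = photo.read_bytes()
            # Fixity checksum: lets any holder detect corruption later.
            manifest["files"][photo.name] = hashlib.sha256(data).hexdigest()
            zf.writestr(f"images/{photo.name}", data)
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))

def verify_package(archive: Path) -> bool:
    """True if every image still matches the checksum it shipped with."""
    with zipfile.ZipFile(archive) as zf:
        manifest = json.loads(zf.read("manifest.json"))
        return all(
            hashlib.sha256(zf.read(f"images/{name}")).hexdigest() == digest
            for name, digest in manifest["files"].items()
        )
```

Because the checksums travel inside the package, every "dock" in a Safe Harbor Network could audit its copies independently: LOCKSS in miniature.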
This is the beginning of a collective effort to underscore the need for more resilient digital systems.

Right now, what’s on the Internet isn’t forever. Our digital memories and shared cultural heritage are vulnerable to time as much as any physical keepsake. I’ve heard archivists say a piece of paper lasts longer than a website. Our memories matter. It’s time we equip ourselves to make sure they last for the long term.

Author Bio
George Oates is Co-founder and Executive Director of the Flickr Foundation.

George Oates
Article 6 · Issue 2

Decentralized Avant-garde: How Artists Are Building a Cultural Memory Infrastructure Beyond Big Tech

Throughout history, artists have been the catalysts for major shifts in culture. A quick survey of the past 60 years shows how artists have shaped innovation in technology. Artist Robert Rauschenberg and Bell Labs engineer Billy Klüver’s “Experiments in Art and Technology,” launched in the late 1960s, anticipated technological innovations like CCTV, chat rooms, and digitized graphics, and squarely established a practice of experimental “R&D” within technology corporations. In the 80s and 90s, artists were among the first to form vibrant networks of exchange on bulletin board systems, the predecessors of the World Wide Web. The artistic avant-garde that emerged in that era, known as “Net Art,” served as the creative model for early internet communities, which evolved into modern social networks. Similarly, generative algorithmic art experiments in artists' studios date back to the 1960s, with pioneers like Vera Molnar and Manfred Mohr anticipating the emergence of generative AI via visual culture.

Standing on the shoulders of these giants, contemporary artists working online have anticipated the seismic shifts in the technology landscape we are experiencing today, post-Web 2.0 and AI. As every aspect of our lives fell under the control of Silicon Valley giants — communication, commerce, travel, even our intimate relationships and identity — artists were showing us another possible world. Their work took shape as online public art, distributed moving images, social media interventions, and virtual worlds that reflect on how decentralization might offer alternative means of shaping digital value, identity, rights, privacy and security, and access to knowledge. As popular culture now becomes aware of the precarity of data and its immense value, we are facing a collective crisis. In order to regain control of our digital spaces and identities, we should once again turn to artists to reframe our understanding of the future of the web and find new ways forward.

Embracing Decentralization to Reimagine Cultural Infrastructure

For the past decade, I have been working with a group of extremely online artists at TRANSFER, which presents as a contemporary art gallery but operates more like a co-op. Our exhibitions and experiments with decentralized infrastructure have culminated in a new model for stewarding data, outside of the grasp of Big Tech. Our overarching goal is to preserve the cultural heritage of networked moving images, video games and virtual worlds, immersive artworks, and online public art. Given the rapid evolution of the media landscape, institutions have struggled to keep up. Traditional institutional practices designed for scarce physical objects can’t adapt to the volume and rapid evolution of digital media. By embracing decentralized tools, we are shifting power away from centralized institutions into the hands of culture creators. Starting from a small scale allows us to reimagine how a resilient cultural infrastructure might operate.

The current cultural landscape — from the contemporary art market to the music industry — is built on extraction from artists. Instead of looking to agents and institutions, we are establishing our own store of cultural data through archiving and stewardship. This requires not only realigning how we value our own work, share profits, and sustain our practice, but also extends to the actual infrastructure that carries on this legacy.
Getting Hands-On with Data

In our experimental model, artists become nodes in a network, managing their own archives on Network Attached Storage (NAS) drives locally in the studio, and leveraging decentralized protocols like IPFS (InterPlanetary File System) to create an encrypted peer-to-peer private network for redundantly storing each other’s works. If one node goes down, it can be restored from all the other nodes in the network. Similarly, local-first software for data collaboration allows the artists to move away from third-party-controlled platforms such as AWS, Google, or other corporate servers. Instead, the NAS drives allow the artist studios to host their own copy of the software, which runs directly on the node, giving access to their data with no external corporate dependency or risk of censorship. For a third layer of redundancy, “Archival Information Packages” are placed in cold storage via Filecoin storage deals.

This model fosters a direct and personal relationship with data. In our post-Web 2.0 world, individuals have an entirely abstract relationship to data: we all create user accounts, sign terms of use, and give our data over to the convenience of easy-to-use interfaces. Taking back the management and care of this data requires sweeping behavioral change — getting hands-on with infrastructure and dealing with all the messy complexities of migrating off these systems. It means sorting through metadata and design standards that will connect our data together to unlock its value while still retaining ownership. This hands-on approach to stewarding data through time presents a prototype for a new future, where linked open data is interoperable across autonomous data cooperatives, becoming stronger as more creators take back ownership of their digital intellectual property.

Building Direct Relationships with Data

A single artist archiving is a lonely activity, but the value of data is only unlocked when it is amassed into meaningful data stores. This requires a data cooperative business entity to manage and grow value across markets. TRANSFER is exploring the boundary between financial value and cultural value through a data co-op.

Financial value is straightforward: artists’ works are valued within the contemporary art market, which is one of the few markets to have a system for appraising the value of data, via the category of Time-based Media Art. But the knowledge and IP embedded in artists' work represent an enormous amount of cultural value beyond market price — as a historic record of social media’s evolution and the political upheavals they questioned, as context within art history, and as a vision of the future they imagine.

The data we collectively gather becomes valuable in multiple forms: not only as collectable artworks but also for generative outcomes, like unlocking hidden histories in the anthropological record of the web, or training large language models. On a long enough timescale, the preservation of data by an automated cooperative infrastructure that extends beyond any individual artist’s lifetime will create an invaluable cultural legacy that stands the test of time alongside traditional institutions.

Operating at the scale of trust, we believe this vision is sustainable. Mass adoption and rapid growth are not our goals; instead, we work slowly, with intention, focused on keeping our technical debt low so that a conservator or researcher 100 years from now might easily reconstruct this cultural record.
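A hedged sketch of the audit logic such a peer network implies (the node names, CIDs, and replication threshold below are invented for illustration; TRANSFER's actual tooling may differ):

```python
# Replication audit across studio NAS nodes: every artwork is known by its
# content address (CID), and works pinned on too few peers get flagged.
# Names and threshold are illustrative, not TRANSFER's actual setup.
from collections import Counter

def replication_audit(pins: dict[str, set[str]], want: int = 3) -> dict[str, int]:
    """pins maps node name -> set of pinned CIDs; returns under-replicated CIDs."""
    counts = Counter(cid for cids in pins.values() for cid in cids)
    return {cid: n for cid, n in counts.items() if n < want}

network = {
    "studio-a-nas": {"bafy...video1", "bafy...world2"},
    "studio-b-nas": {"bafy...video1"},
    "studio-c-nas": {"bafy...video1", "bafy...world2"},
}
print(replication_audit(network))  # {'bafy...world2': 2} -> ask a peer to re-pin
```

If a node goes dark, the same counts tell the surviving peers which works to restore first, which is the "restored from all the other nodes" property described above.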
The power of small-scale innovation is in illustrating new possible futures. Imagine a world where everyone has a direct relationship with their data, understands where it is stored and how it is leveraged, owns it fully, and can capitalize on the value it represents. Such a decentralized future is within reach, but it will take all of us to realize it.

Author Bio
Kelani Nichole is a technologist and the founder of TRANSFER, an experimental media art space. She has been exploring virtual worlds and decentralized networks in contemporary art since 2013. Nichole builds alternative technology infrastructures and designs immersive exhibitions of interactive media art. Currently she is a visiting scholar at NYU Tandon School of Engineering Integrated Design and Media (IDM), where she's developing the TRANSFER Data Trust – a decentralized archive and cultural value exchange network with a mission to cooperatively steward data, ensuring preservation and access across generations.

Kelani Nichole
Article 7 · Issue 2

A Lighthouse in the Dark: The Case for Decentralized Perpetual Storage

In today’s digital world, vast amounts of information are created every second, yet much of it remains fragile and temporary. From research papers and public records to cultural archives, essential data often lives on platforms that were never designed for long-term preservation. We are living in an ephemeral age: one where digital content is created at unprecedented rates, but little of it is built to last. We need an internet where permanence is an option.

Anatomy of an Ephemeral Age

Digital fragility stems from the centralized infrastructure that underpins today’s internet. A key driver of this ephemeral age is vendor lock-in, where data is stored with centralized providers who control access and limit portability. If the provider shuts down, changes its policies, or removes content, users can lose access to their own data without warning.

Another challenge is the cost of keeping data alive. Most platforms operate on monthly subscription models, requiring continuous payments just to retain access. This creates an economic barrier where only those who can afford ongoing fees are able to preserve their digital content. For individuals, small organizations, and communities in under-resourced areas, this model is unsustainable. As a result, important knowledge and history are at risk of disappearing simply because someone couldn’t pay. Without alternatives focused on durability and open access, we risk building a digital world that forgets just as quickly as it records, and that loss could have lasting consequences for generations to come.

A (Decentralized) Path to Permanence

The world’s data needs demand a path to permanence. But traditional data storage methods, such as hard drives and cloud storage, are limited by capacity constraints, data degradation, and centralized vulnerabilities. These limitations have become more apparent due to the rapid growth of digital data. As a result, perpetual storage has become increasingly important for knowledge preservation, regulatory compliance, and safeguarding digital legacies. However, centralized perpetual storage solutions face challenges when tackling these issues. They’re vulnerable to single points of failure, censorship, and control. Additionally, these centralized solutions can become expensive and inefficient as data volumes grow.

Decentralized perpetual storage offers a compelling alternative. By distributing data across a network of nodes, it ensures redundancy, resilience, censorship resistance, cost-effectiveness, and increased security. My company, Lighthouse, is developing a decentralized solution to perpetual storage: a protocol that allows users to pay once and store their files indefinitely. Loosely speaking, this protocol adds a new and robust incentive and financial layer to create a perpetual storage protocol on top of a Decentralized Storage Network (DSN) such as Filecoin.

Designing for Durability

Centralized clouds concentrate state in a handful of data centers that share common control planes, routing tables, and legal jurisdictions. A single encryption key rotation error or subpoena can ripple through petabytes in milliseconds, making that data vulnerable to surveillance or loss. Decentralized storage flips that topology: every object is content-addressed (e.g., a multihash content identifier, or CID) and committed to a verifiable ledger, while the underlying bytes are protected against data loss with erasure coding and replicated across independently run nodes.
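An illustrative, deliberately simplified sketch of those two properties (content addressing and erasure-coded redundancy) follows. Real CIDs carry multihash and multicodec prefixes, and production networks use Reed-Solomon codes rather than a single XOR parity shard, but the recoverability idea is the same:

```python
# Simplified content addressing + erasure coding. Educational sketch only:
# real systems use multihash CIDs and Reed-Solomon codes, not XOR parity.
import hashlib

def address(data: bytes) -> str:
    # Same bytes in, same identifier out: a stand-in for a real CID.
    return "sha256-" + hashlib.sha256(data).hexdigest()

def encode_with_parity(data: bytes, k: int = 3) -> list[bytes]:
    """Split data into k shards plus one XOR parity shard."""
    size = -(-len(data) // k)  # ceiling division
    shards = [data[i * size:(i + 1) * size].ljust(size, b"\0") for i in range(k)]
    parity = bytearray(size)
    for shard in shards:
        for i, byte in enumerate(shard):
            parity[i] ^= byte
    return shards + [bytes(parity)]

def recover(shards: list) -> list:
    """Rebuild exactly one missing shard by XOR-ing all the others."""
    missing = shards.index(None)
    size = len(next(s for s in shards if s is not None))
    rebuilt = bytearray(size)
    for shard in shards:
        if shard is not None:
            for i, byte in enumerate(shard):
                rebuilt[i] ^= byte
    shards[missing] = bytes(rebuilt)
    return shards

original = encode_with_parity(b"a public record that must outlast its platform")
damaged = original.copy()
damaged[1] = None                            # one storage node disappears
assert recover(damaged)[1] == original[1]    # ...and the bytes come back
```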
Resilience is no longer an internal service-level variable — it’s baked into the protocol’s mathematics. Instead of monthly payments, users pay a one-time fee for perpetual storage enabled by Lighthouse. This makes digital preservation more accessible and sustainable for individuals and communities. Built-in proofs coupled with long-term storage models create a storage architecture designed for permanence. With this model, resilience is no longer dependent on a single company or server. Data is stored across a global network and stays accessible even during local outages, attempts at censorship, or infrastructure failure.

In addition to technical and architectural resilience, decentralization opens the door to a culture of sustainability through collective stewardship. Public data like climate records, educational content, and cultural archives can be funded by communities or grants via the power of smart contracts and Data DAOs. This creates shared responsibility and long-term accessibility.

Real-World Impact: When Permanence Matters

Not everything should live forever online — and that’s fine. But some classes of data must outlast platforms, policies, and power shifts. Here are the places where permanence isn’t a luxury; it’s the whole point.

Websites as durable public records
Tools like Webhash let anyone freeze an entire site — code, assets, and all — into perpetual, content-addressed storage and map it to an ENS name. What was once a fragile link becomes a verifiable artifact you can always resolve.

Culture beyond the algorithmic feed
In the world of NFTs, artists are turning to decentralized storage to preserve the integrity of their work. Rather than relying on fragile links on centralized platforms like Instagram or YouTube, perpetual storage ensures that digital art, zines, and creative expressions will exist independently. NFTs become less about speculation and more about long-term cultural preservation. Some projects are even using this technology to archive endangered languages, oral histories, and indigenous knowledge.

Eternal AI and open models
As models become increasingly powerful, the idea of Eternal AI — permanently preserving open-source models like DeepSeek and Llama on Filecoin via Lighthouse — is gaining traction. Projects like SingularityNET also explore how knowledge and algorithms can be stored and accessed permanently, ensuring future systems are built on trusted, open foundations.

Public records that stay public
Governments also benefit from this technology. Electoral data, legal records, and public infrastructure plans are being stored in a tamper-proof way, increasing transparency and public trust.

Family archives that outlive platforms
Platforms like Archivista.ai use perpetual storage to preserve personal memories, family histories, and legacy documents in an organized and perpetual way. By capturing relationships and timelines across generations, they create a durable digital archive that families can trust and pass on.

Conclusion: Designing a Web That Remembers

We’re not just choosing between a “library” and a “feed”; we’re deciding who controls what stays online. Centralized services can keep data up, but they can also change it, remove it, or lose it. If permanence is to be truly available, the foundation has to be decentralized — using cryptographic proofs instead of policy-based promises, long-term funding instead of monthly bills, and many independent operators instead of one gatekeeper. Some data isn’t meant to stay forever on the web.
Conclusion: Designing a Web That Remembers

We’re not just choosing between a “library” and a “feed”; we’re deciding who controls what stays online. Centralized services can keep data up, but they can also change it, remove it, or lose it. If permanence is to be truly available, the foundation has to be decentralized: cryptographic proofs instead of policy-based promises, long-term funding instead of monthly bills, and many independent operators instead of one gatekeeper.

Some data isn’t meant to stay forever on the web. But the knowledge, culture, and public records that matter need to last. Decentralized perpetual storage makes that possible: built for preservation, censorship resistance, and open access for everyone.

Author Bio

Nandit Mehra is the founder of Lighthouse Storage and 1MB.io. He works at the intersection of infrastructure, data ownership, and cryptography, focusing on building systems that make data permanent, private, and user-owned.

Nandit Mehra
Article 8, Issue 2

Gardens in the Rainforest: Community Infrastructure at the School of the Commons

In the Amazonian rainforest, over forty activists and trainers gather with laptops, video cameras, and flip charts to hack together. Escuela Común (the School of the Commons) was the result of years of collaborative work between decentralized-technology support organizations: Laboratorio Popular de Medios Libres, WITNESS LAC, Sutty, Lanceros Digitales, Cad, Awana Digital, Radios Libres, and others. The plan: to equip indigenous communities with secure, decentralized digital tools that give them the freedom they need to protect their environment and advocate for their rights. This included video, audio, and image documentation for legal processes, digital archiving, and self-hosted servers to protect their data and communications from surveillance.

Puyo sits in the Ecuadorian Amazon, near Yutzupino in Napo province, where illegal mining has exploded in recent years, destroying forests and threatening indigenous territories. It was the right place to learn. School of the Commons participants spent most of their time training on tools and strategies for documenting mining impacts, but they also went into the field, collecting evidence with video cameras, drones, and phones. Guiding them were members of the Pastaza Kichwa Nation Pakkiru, who came to share the techniques they use to preserve their language, tell their stories, and assert their rights. Participants left with something new: the knowledge to create their own community-run server for decentralized storage and communications, independent of the Big Tech platforms that so often side with the governments and corporations they are fighting.

WITNESS, one of the organizers of the School, has long recognized the importance of community-controlled archives. Indigenous groups like the Pakkiru produce strategic documentation of their territories and rights: evidence they need to protect from commercial exploitation, and from anyone intent on sabotaging the legal defense of their land. But it’s not just corporate misuse that concerns them. Government requests for user data have blurred the line between private companies and state surveillance. According to the Swiss privacy technology company Proton, major tech companies have handed over data from more than 3.1 million user accounts to governments over the past decade, a 600% increase in requests, and these companies comply 80 to 90% of the time. That means governments are routinely accessing emails, messages, private files, search histories, and location data without users’ knowledge or consent.

Self-hosted servers offer a defense. These decentralized systems don’t rely on Big Tech infrastructure, giving communities control over their own data and making it far harder for governments to access information without a clear legal process. Participants at Escuela Común learned to create “digital gardens”: protected, self-hosted servers that can also provide a private, cost-effective alternative to large data centers. These servers run on free and open-source software, giving communities full control; they can understand, modify, and share the tools without restriction. Participants learned to install and manage platforms like Nextcloud and WordPress and to implement security measures like password management, secure shell connectivity, and physical server protection. WITNESS also shared guidance on the use of open-source intelligence (OSINT) and satellite imagery, using free tools like OpenStreetMap.
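To give a flavor of that OSINT work, here is a small TypeScript sketch querying OpenStreetMap’s public Overpass API for mapped quarry sites; the endpoint and the landuse=quarry tag are real OSM conventions, while the bounding box around the Napo river valley is approximate and purely illustrative of the kind of query involved, not the school’s actual curriculum.

```ts
// Query OpenStreetMap's Overpass API for features tagged as quarries
// inside a rough bounding box (south, west, north, east) near Napo, Ecuador.
const OVERPASS_URL = "https://overpass-api.de/api/interpreter";

async function findMiningFeatures(): Promise<void> {
  const query = `
    [out:json][timeout:25];
    (
      node["landuse"="quarry"](-1.2,-78.2,-0.8,-77.6);
      way["landuse"="quarry"](-1.2,-78.2,-0.8,-77.6);
    );
    out center;`;

  const res = await fetch(OVERPASS_URL, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: "data=" + encodeURIComponent(query),
  });
  const json = await res.json();

  // Nodes carry lat/lon directly; "out center" adds a computed center to ways.
  for (const el of json.elements) {
    const { lat, lon } = el.center ?? el;
    console.log(`${el.type} ${el.id} at ${lat}, ${lon}`);
  }
}

findMiningFeatures().catch(console.error);
```

Cross-referencing such mapped features against recent satellite imagery is one way communities can spot and document unpermitted extraction.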
The "Video as Evidence" training gave participants hands-on experience documenting mining damage and anti-mining assemblies. The first Escuela Común showed what’s possible when communities take control of their own technology. Twelve collectives across Latin America left with new tools and skills, and the first Network of Autonomous Servers of Abya Yala now connects them. The tools are open, the knowledge is shared, and the work continues. A new edition of the School of the Commons takes place in 2026. Find out more at: escuelacomun.yanapak.org Author Bios Laura Salas is Senior Program Manager for Latin America and the Caribbean at WITNESS, where she leads training and capacity-building processes that empower activists and communities to use video for human rights advocacy. With over two decades of experience, she has designed and facilitated regional learning programs and audiovisual initiatives focused on social justice and collective storytelling. Andrés Tapia is co founder of Lanceros Digitales and Radio La Voz de la Confeniae in the Amazonian rainforest of the Pastaza province of Ecuador, with two decades  of experience in communitarian and environmental journalism, human rights advocacy, radio broadcast and multimedia/transmedia production. He is a member of the Kichwa Nationality of Pastaza PAKKIRU as a part of the media team of the organisation and the indigenous radio Jatari Kichwa. Nicolás Tapia is a member of the Laboratorio Popular de Medios Libres (LPML). He works on communication processes, free technologies, and community-based digital infrastructures, supporting organizations and communities in building technological autonomy, digital care, and data sovereignty. Yvonne Ng is the Senior Program Manager of Archives at WITNESS, where she trains practitioners, develops learning materials, and advocates for the use of archives to support human rights change and accountability. An audiovisual archivist with over 20 years of experience in community-based and nonprofit settings, she focuses on accessible, innovative archiving approaches relevant to contemporary movements and human rights defenders. Fauno’s work is focused on investigating, adapting and implementing ecological and resilient technologies, specially autonomous, collectivelly managed infrastructure. In the last eight years I've been working almost exclusively on resilient web sites using Jekyll and developing a platform for updating and hosting them called Sutty. Elio is part of Sutty, a diverse and inclusive cooperative based in South America that provides resilient websites and sovereign hosting, as well as support in holistic digital care; inclusive design, and human interaction for activists, the social and solidarity economy, organizations, and collectives defending human and environmental rights. He specializes in support for communication and the appropriation of community technologies with a holistic digital care perspective; planning and management in communication and education; and advice on inclusion from intersectional perspectives.

Elio & Fauno, Nicolás Tapia, Andrés Tapia, Laura Salas & Yvonne Ng