Beyond Mere Decentralization – Orthogonal Web
– The ARPAnet, a DARPA.mil Creation
– The Topologies of Decentralization
– Making Use of the Tools We Have
– Is ______ THE solution?
– Techno-Socratic Dialectic
– The Orthogonal Web
– Resource Mapping and the Consent of the Governed
If we follow the trends of innovation, the future of computing starts to look like incredible immersive experiences powered by distant servers. At the same time, surveillance-ready systems are becoming household items, while the infrastructure the internet was built on remains fundamentally insecure and incomplete.
With so many cutting-edge technologies being developed, from “the cloud” and IoT to blockchains, DLTs, DHTs, stacks, and DApps, it can be hard to keep track. Let’s take a step back for a second and ask: what even is decentralization? Why should anyone care about any of this at all? In this article we’ll go back in time to uncover the nuances of networks and explore concepts beyond mere decentralization.
The ARPAnet, a DARPA.mil Creation
The Advanced Research Projects Agency’s Network
Recall from any internet history primer that the ARPAnet (progenitor of the Internet) was funded and implemented so that U.S. war fighters could employ encrypted intranets and low-bandwidth communications (eventually standardized on TCP/IP) to better defend against a nuclear-armed Soviet Union. It enabled decisive decision making by offering commanders near real-time information exchange and “command & control” (C2) systems. Because the technology was so generalizable, its applications were endless.
Once trained and equipped, soldiers and sailors could quickly relay commands, orders of battle, time-sensitive information, and unit reports, and plan operations spanning many locations at once. Having additional interoperable “high-level” protocols meant that the kind of I/O link, or a link’s security integrity, was no longer critical, because security and data packaging occur in the layers above TCP. This made it trivial to deploy and secure networks using the hardware encryption devices available at the time. “The goal was to exploit new computer technologies to meet the needs of military command and control against nuclear threats, achieve survivable control of US nuclear forces, and improve military tactical and management decision making” (S. Lukasik, 2011).
The idea of “centralized control, decentralized execution” began taking root in military doctrine once some of these foundational systems were in place. One way this doctrine manifests is in the adoption of software and hardware tools and interfaces that are secure and easy (enough) to use even for the lowest-ranking service members, who at the time had likely never seen a computer. To this day, access to national security networks is granted based on levels of trust and verification established by the National Security Act of 1947.
With network privacy built into these new systems using end-to-end hardware encryption, a new style of secure collaboration became possible. With the addition of open protocols like HTTP and HTML by the early 90s, sharing critical information between many large organizations became much simpler. Eventually, Intelink was born: a group of secure intranets that served as a hub of information for services, agencies, engineers, operators, and analysts. Anyone with the proper clearance to the network could access common tools and databases, locate source reporting, coordinate with experts in the field, or even read the President’s Daily Brief.
The Topologies of Decentralization
Internet Relay Chat, or IRC, was one of the first widely deployed chat protocols; it improved communications between commanders and field units using secure, multiplexed chat channels adaptable to any team (just ask Slack how awesome that is). U.S. armed forces use tools like IRC to this day in places like Afghanistan and Iraq to send and receive urgent messages such as 9-line medical evacuation (MEDEVAC) reports and troops-in-contact (TIC) reports, to coordinate resupply efforts, and to support quickly unfolding activities and decisions. IRC is just one example of a simple and flexible tool that’s highly effective, as long as you can implement and secure it.
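To make “simple and flexible” concrete, here is a minimal sketch of the IRC wire protocol in Python, using nothing but a raw TCP socket. The server, nickname, and channel below are hypothetical placeholders; any standards-compliant IRC server speaks these same few lines of plain text, which is exactly why the protocol is so easy to deploy and adapt.

```python
import socket

# Placeholders only: any standards-compliant IRC server will accept
# the same handful of plain-text commands.
HOST, PORT = "irc.example.org", 6667      # hypothetical server
NICK, CHANNEL = "fieldunit7", "#medevac"  # hypothetical nick/channel

sock = socket.create_connection((HOST, PORT))

def send(line: str) -> None:
    # Every IRC command is a single CRLF-terminated line of text.
    sock.sendall((line + "\r\n").encode("utf-8"))

send(f"NICK {NICK}")                      # register a nickname
send(f"USER {NICK} 0 * :Field Unit 7")    # register user details
send(f"JOIN {CHANNEL}")                   # join the team's channel
send(f"PRIVMSG {CHANNEL} :9-line to follow, stand by")

# Servers probe liveness with PING; clients must answer PONG
# or be disconnected.
for raw in sock.makefile(encoding="utf-8", errors="replace"):
    if raw.startswith("PING"):
        send("PONG " + raw.split(" ", 1)[1].strip())
```

The hard part, as noted above, is the “secure it” half: in practice that means wrapping the connection in TLS or running the server inside an already-encrypted network.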
From C2 to C5ISR – A long and historic journey of developing command, control, communications, computers, cyber, intelligence, surveillance, & reconnaissance collection and sharing capabilities within a trust network.
It’s important to recognize that decentralization can arise in different contexts: an organization’s decision making, the physical infrastructure of a network, or a software architecture’s implementation. All that to say, terms like “decentralized” and “distributed” actually aren’t that helpful without the appropriate context of how the systems are put into practice.
It is now common practice for software companies to require users to give up vital private data, to devise lock-in tactics, or to require an internet connection just to function, all while remaining black boxes of present and future risk. Although “cloud” technologies make using and sharing information vastly easier for end users, there are still major privacy and accessibility gaps – especially from a social inclusion standpoint. For example, where are the open protocols and adaptive interfaces that enable spatial navigation for the blind and those with limited mobility? How can businesses, cities, and communities better serve people with disabilities using open protocols, information visualization, and data sharing? What happens when the internet or power goes out?
Even some of the most widely recognized “decentralized” infrastructures are not as decentralized as they appear (ref: Cornell’s original research on Bitcoin); additionally, they can carry highly restrictive licenses and maintain proprietary elements that require buy-in to someone else’s business or governance model. The questions I always like to ask are: who owns the data, who can access that data, and where does it reside? Identifying this pattern can be the first step in deciding where you or your community fits into all of this confusion (or progress).
For an introduction to the subtle differences between decentralized and distributed systems, check out Julia Poenitzsch’s article “What’s the difference between Decentralized and Distributed? And how does this relate to Private vs. Public blockchains?”, published on Medium, Oct 3, 2018: “There are degrees of decentralization and distribution, rather than hard divisions. How much decentralization or distribution is desirable then depends on your objectives.”
Making Use of the Tools We Have
It is no coincidence that tools with profound utility are also broadly applicable to people everywhere, whether in business, emergency response, or “going off to college”. Today more than ever, the importance of being able to connect and transact easily, dependably, and securely is paramount, whether or not it’s obvious. Both the military intelligence community (MIC) and the Silicon Valley behemoths were built on the backbone of open initiatives like Linux and the TCP/IP, HTTP, and HTML protocols. How did so many people (the non-technical masses) miss out on owning and employing these tools directly?
Unfortunately, it has become all too easy to rely on a handful of platform gatekeepers. The technology field has been dominated by companies like MSFT, AAPL, GOOG, and AMZN, with device-specific operating systems, cloud services, apps, and web browsers as the de facto standard. But with the rise of powerful open software languages, distributed computing, and 3D graphics, the equation is rapidly changing. Major development efforts are underway in this nascent space, including projects like Holochain, Polkadot, EOS, Hyperledger Indy, and many others.
However, I think it’s pertinent to keep in mind that the infrastructure you decide to use is part of a much larger integration puzzle. It’s also important that these infrastructure and data tools be metadata-secure and usable (i.e., have an interface) by communities and their members for building additional resilient networks, with the flexibility to centralize or decentralize depending on the needs of the network.
Is ______ THE solution?
Projects like Holochain, DAT, and Substrate have emerged as open and flexible distributed infrastructure software, offering a decisive advantage over proprietary or closed technologies. Holochain provides infrastructure for developers to build apps on its peer-to-peer layer, a distributed hash table (DHT). Like PLAN, Holochain features a private instancing model, meaning that any group or community can use it without becoming dependent on other people, groups, or organizations.
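For readers new to the term, here is a minimal sketch of the core idea behind a DHT, using consistent hashing in Python. This illustrates the general technique only; Holochain’s actual DHT adds validation, gossip, and redundancy far beyond this toy.

```python
import hashlib
from bisect import bisect

def h(value: str) -> int:
    # Content addressing: the SHA-256 hash of a value is its address.
    return int(hashlib.sha256(value.encode("utf-8")).hexdigest(), 16)

class ToyDHT:
    """Consistent hashing over a ring of peers (illustration only)."""

    def __init__(self, peers):
        # Each peer sits on the ring at the hash of its own ID.
        self.ring = sorted((h(p), p) for p in peers)

    def responsible_peer(self, key: str) -> str:
        # The first peer clockwise from the key's position stores it.
        positions = [pos for pos, _ in self.ring]
        i = bisect(positions, h(key)) % len(self.ring)
        return self.ring[i][1]

dht = ToyDHT(["alice", "bob", "carol"])
print(dht.responsible_peer("community-bulletin"))  # same answer on every peer
```

Because every peer computes the same hashes, every peer independently agrees on who is responsible for which data, with no central index to capture or take offline.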
In contrast, many distributed technologies ultimately keep a proprietary element behind lock and key, behind a paywall, or entwined with a digital currency. It’s important to point out that while PLAN is designed to be self-contained, it can also integrate with other existing DLT / blockchain layers that are compatible with our architecture and Design Principles. PLAN is a kind of technology glue that brings all the components of a system together: infrastructure, a complete security data-model, and the interfaces (UI/UX) required for non-technical users to participate.
Holochain = Distributed P2P Infrastructure
PLAN = Integrated Platform with Pluggable Components
A component-based approach works well for PLAN because it allows the end user to choose a configuration for their local needs, whether on or off-grid. As we’ve seen in many blockchain-specific architectures: what happens when the community’s needs outstrip the conceptual models imposed by the distributed infrastructure layer? What happens when the models used at the distributed infrastructure layer are inherently too complex for community end users? And even if we all agree on how pluggable the bottom-most infrastructure layer should be, what about usability? If there hasn’t been an intentional vision or plan for a device operating system AND infrastructure-agnostic user experience, then how far along are we really? As an example:
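Consider a hypothetical sketch, in Python, of what “pluggable” can mean in practice. The names and interfaces here are illustrative only, not PLAN’s actual API; the point is that the application targets one small interface, and the community, not the developer, decides which infrastructure sits behind it.

```python
from abc import ABC, abstractmethod

class StorageProvider(ABC):
    """One small interface the application codes against."""

    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStorage(StorageProvider):
    """Stand-in for a local, off-grid store on community hardware."""

    def __init__(self):
        self._store = {}

    def put(self, key: str, value: bytes) -> None:
        self._store[key] = value

    def get(self, key: str) -> bytes:
        return self._store[key]

class DHTStorage(StorageProvider):
    """Same interface, backed by a peer-to-peer network instead."""

    def __init__(self, dht_client):
        self._dht = dht_client  # hypothetical client object

    def put(self, key: str, value: bytes) -> None:
        self._dht.store(key, value)

    def get(self, key: str) -> bytes:
        return self._dht.fetch(key)

def post_notice(storage: StorageProvider, text: str) -> None:
    # Application logic is unchanged no matter which backend is plugged in.
    storage.put("notice/latest", text.encode("utf-8"))
```

When the chosen backend no longer fits, because the community goes off-grid or outgrows a given network, only the component is swapped; the applications and habits built on top of it stay put.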
With these barriers to entry in mind, there is still the task of building usable interfaces and maintaining integration for all the different operating systems, platforms, and browsers out there. Given that they all come with dependencies and layers of complexity, which ones do you target? While having dependencies is generally acceptable when deriving value from a particular network or specific kinds of applications, it’s just not practical to assume app developers will ensure critical communications infrastructure is usable, private, and universally accessible. This is especially true when the demand to capture that data is so high.
Techno-Socratic Dialectic
Is it realistic that all communities will forever rely on a particular solution as the ideal for managing critical privacy needs, connectivity, identity, value exchange, or trust? What is the cost of finding out the hard way? Do we just discard the principle that communities should choose the components best suited to their needs? A team of journalists working on a high-risk exposé is right to demand a peer-to-peer storage layer, one whose security and performance trade-offs will be rather different from those of, say, a community healing center, a city block organizing a weekly flea market, or a crafting guild with accessibility needs for seniors and people with disabilities.
Why Not “Decentralization”?
It connotes a process to disrupt the status quo… but suggests no vision of a better thing to replace it with.
It suggests a topological fix… but are our true problems merely topological?
– Peter Wang, CEO of Anaconda, Lightning Talk at DWeb 2019
So what’s next for “decentralization”? Is it going out of style before it even got popular? I would say yes, hopefully, for all our sakes. There is actually a much more nuanced and actionable approach available than mere decentralization. While there is no common parlance I can reference to provide an immediately satisfying understanding, I will refer to this approach as The Orthogonal Web, a concept coined and articulated by Peter Wang, CEO of Anaconda, during a lightning talk at DecentralizedWeb (DWeb) Camp 2019 (led and sponsored by the Internet Archive).
The Orthogonal Web
Peter points out that there are “three critical elements of a communication and information system that need to be held orthogonal to each other”: Data | Transport | Identity. Key to understanding this Privacy Trinity is that “all three legs affect each other, but all three legs need to be put together in an orthogonal way … 90 degrees from each other so they can not be used to capture the other.”
For example, if you send a sensitive email using a transport method that relies on Gmail infrastructure, which retains part or all of your message on Google’s servers, the identity and content of the message are tacitly exposed to Google, and by extension to anyone else able to gain access to those lines of communication. The key takeaway from Peter is that with any conventional infrastructure built on top of the Internet (aka the ARPAnet), that orthogonality does not exist.
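To make that concrete, here is a small sketch of orthogonality as a design constraint, in Python. This is my illustration rather than Peter’s or PLAN’s design, and the names are hypothetical: each leg is its own interface, and the composition is arranged so no leg can capture the others.

```python
from typing import Protocol

class DataLayer(Protocol):
    def seal(self, plaintext: bytes) -> bytes: ...    # returns ciphertext

class IdentityLayer(Protocol):
    def sign(self, payload: bytes) -> bytes: ...      # returns signature

class TransportLayer(Protocol):
    def deliver(self, opaque_blob: bytes) -> None: ...

def send_message(data: DataLayer, ident: IdentityLayer,
                 net: TransportLayer, message: bytes) -> None:
    ciphertext = data.seal(message)      # content is sealed first
    signature = ident.sign(ciphertext)   # identity attests without reading content
    net.deliver(ciphertext + signature)  # transport handles only opaque bytes
```

Contrast this with the Gmail example, where a single party operates all three legs at once and can therefore read, link, and route everything.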
Pillars of Information Integrity
– Data (content): integrity, security, resilience, privacy, provenance. Isolated from Identity & Transport.
– Transport (data paths): security, availability, latency, bandwidth, privacy. Isolated from Data & Identity.
– Identity (participants): self-sovereign, user-controlled, anonymity, centralized & decentralized trust chains. Isolated from Data & Transport.
Resource Mapping and the Consent of the Governed
There is an obvious need to build scalable resilience mechanisms into our social safety net. We no longer have the time or luxury to ignore the current gaps in privacy, accessibility, and collaboration. Now is the time to develop systems and tools specifically for communities in need, to provision for privacy and universal inclusion, and to put forward effective strategies for localized problem solving and communications.
Maps are the answer to a critical question: “What is in my environment?” The purpose-driven idea that has carried over into PLAN is to create a robust information visualization platform that fosters the resilience of shared habitats and local relationships through collaborative mapping and exploration. For that to be possible, we need to completely rethink systems so that collaboration, as well as privacy, is built in by design.
Resource mapping is indeed one of the high-utility applications that can be harnessed with spatially collaborative systems; however, who owns that data and where does it reside? People are right to be skeptical of a system designed to, let’s say, track all the money, resources, or the infirm. There’s a really important human aspect at the heart of the matter that is rarely talked about or acknowledged. Unless consent to be part of a data-sharing community (or any community, for that matter) is fostered, and the agreement that governs the span of control in such a system is fully articulated and checked, all we’ve really accomplished is creating additional moral hazard to navigate.
Community-Centric Technology
Community-centric technology means that systems are not only designed for collaboration, but owned, implemented, and managed at the local community and individual level. To facilitate this shift away from developer-centric software, we’re working on bringing together a technology stack that is modular and pluggable, from the encryption AND storage layers, through extensible core functionality, all the way to the interfaces that make that functionality accessible to end users.
PLAN is a new class of technology: a collaborative, peer-based system that is device-agnostic and community-centric (i.e., transport, data, and identity are decided at the community level). Furthermore, PLAN’s infrastructure enables the latest AAA 3D game engines to plug in to this hybrid data-model. This model takes advantage of the latest advances in distributed peer-to-peer and swarm technologies without locking you into a particular system or platform component.
We think PLAN’s open and pluggable architecture stands up to Peter’s Orthogonal Web approach, and provisions for all three vital aspects of an end user’s data footprint — data, transport, and identity. Ultimately, any other compatible layer can be plugged in as a component in PLAN’s data-model, like swapping in a highly advanced hard drive or adding an expansion bay to your computer.
PLAN Systems – Creating self-hosted, hardware-agnostic solutions for dependable communications. To confront the challenges of this century, people need tools that facilitate secure communications and collaboration, tools that are not dependent on a gatekeeper, a business model, or an internet connection to continue functioning. Invest in technology you can actually own.