We are all reflections of the world that raised us, shimmering and twisting and deforming as ripples distort our surface, and so it is that our notions of what should be and of what ought to be and of what could be are each derivative of this reflection.
The Internet is a technology, and like any technology it is fungible. The Internet is a business, and like any business it is abusive for the sake of profit. The Internet is a hyperobject, and like any hyperobject it defies attempts to understand it as a thing.
The Internet is, above all else, a surviving machine, crafted to adapt and to grow and to be of use to people by providing low-risk interoperability for ad-hoc communications during crisis. The Internet is not a cure for ignorance, it is not a means to democratize information, it is not a distribution channel. Connectivity is the universal solvent of the Information Age, and the Internet is the apex of connectivity; that makes the Internet the foremost tool to subvert the status quo by crashing through the barriers to access that owe their existence to distance, censorship, scarcity, and deception.
The Internet is at its best when it is mostly used to break down these barriers. The Internet languishes when it is used to maintain these barriers. The Internet is at its worst when it is used to expand these barriers by placing more information behind them than it releases from behind them.
The Internet, as a business, does not fulfill its eidos, yet the business must succeed for the Internet to exist in any form, and this is the crux of the creative tension at the heart of the technology market: what is good for business is typically not great for people, and eventually that catches up with the businesses. Businesses that sell technology, or access to information, or that package technology and information into services, perhaps more than any other kind of business, will always be running from the eventuality that people don’t want what they are selling at the price they are selling it. Coupled with a mandate for growth engendered by the way capital is dispensed into these businesses and the pre-existing but unchecked monopolies they compete with, this creates an environment where products diminish in their utility at ever-increasing rates, implying that these businesses are built to fail in a marketplace controlled by an aristocracy that profits from failure.
The Internet as a distributed application substrate
Letting a business sell you the essential Internet applications – access, identity, privacy, communications, mobility, resilience, storage, programmability, serving – is perhaps inevitable, but each of these should be portable, composable, and scalable in such a way as to permit you to move freely in a marketplace of offerings at all levels of personal involvement (from DIY to fully managed).
The fullest expression of the eidos of the Internet is for everyone to have their own suite of these essential Internet applications that they and only they control, without needing to devote time and expertise to keeping it all running. When using this – who controls the application – as a lens to view products, sorting all applications into three camps is trivial: those that are controlled by the person who uses them, those that are controlled by a company that uses them, and those that are controlled by the person or company that creates them. The value proposition, risk assessments, and ethical considerations precipitate out of the analysis from there in predictable fashion.
Consolidation and centralization are anathema to the well-being of the Internet, yet they have become the norm because the lack of imagination in the business of the Internet (and of businesses in general) cleaves reflexively to a top-down hierarchy. And this state of affairs is aided by the difficulty of doing basic things. Configuring servers and routers demands more commitment than an average person is willing to make in order to have an application that they control, and running something like your own email server or password manager service is so convoluted that even seasoned technologists would rather not do it. The utility value of solving these problems is high, but the financial value of maintaining the status quo is, for the companies that provide them and the people whose cachet comes from having conquered their inscrutability, much higher.
But the Internet is designed to be distributed, and we have built an Internet that can be made to be distributed. Elastic compute, connectivity, and storage are cheaply available. Last-mile connectivity is, for many people in Asia, Europe, and the Americas, reliable and performant. And the behavior of incumbents has become sufficiently questionable that many people are choosing to be their own provider, using turn-key applications (often available as Free and Open Source Software) and cheap computers that they control to do a variety of things in their own houses that previously were only available on-line or through an expensive and proprietary device controlled by a corporation. Connecting these services to one another through encrypted tunnels to federate with communities is a logical, if not inevitable, next step.
Density creates richer, more flexible, and more resilient marketplaces. The long-term solution to the problem of products whose utility diminishes at an accelerated rate because of the pressures to grow placed on the businesses that make them is to relieve that pressure by changing the business model of creating products for the Information Age away from the Industrial Age standard of multi-national mega-corporations profiting on economies of scale to one more like the Japanese yokocho – a dense horizontal cluster of micro-scale vendors that are simultaneously competing with one another and fungible with one another, optimized for operations that can turn a profit by keeping four to a dozen stools occupied with either tourists or salarymen during the course of a night. Costs are low, the owner is often the operator, there is no forced impetus to grow beyond the walls of your stall, and everyone else in your yokocho is in the same situation. Collectively, the market needs to draw in the sum of all the available stools, but individually, a particular business only needs a fraction of that to succeed each night; yokocho are adjacent to train stations, which bring a steady stream of humanity, ebbing and flowing with the daily pull of offices and suburbs, providing more than enough people for the whole of the yokocho to thrive.
This is the business model that the original services on the Internet tried to emulate – email, news, files, keys, and directory for a particular domain, all available at a particular location that whoever controlled the domain also controlled. And the number of stools at the bar was determined not by how much that organization was required to grow, but by how many people were in the organization and how many people they needed to communicate with outside the organization. Growing the business happened somewhere else (or not at all), and no one placed bets on the IT department’s ability to find new customers, because it only needed to serve the customers that could fit in the stall, and that wasn’t changing.
To grow a service, you would interoperate, not scale up. My email server can send and receive email with many email servers. My news server can send and receive news articles with many news servers. My file server can be mirrored by many file servers. My directory can be reached by many directory servers, but they can also cache my answers to their queries, because directories don’t often change. Scaling up wasn’t needed until we came up with an Application (or, as it happens, an architectural pattern) that didn’t interoperate between servers – the client-server design pattern and its most prolific evangelist, the Web.
The Web is, and truthfully has always been, broken because Web browsers are not also Web servers and vice versa. This had huge short-term advantages when growing the popularity of the Web, but the long-term consequences are at the heart of nearly every Internet-related problem we have today, from infrastructure to fraud to misinformation. Because clients don’t have to be servers, they don’t come with any of the Social Infrastructure that came with Internet services. Because they don’t have email or a directory or keys, they don’t have an Identity, so there is no way to find what you want or be certain you are talking to the authentic source. Because they don’t federate and don’t provide a service, they ask for asymmetric provisioning of connections to support a get-but-don’t-send view of the world. And because they only take, we built businesses that didn’t create relationships with our customers, so our customers became dehumanized numbers – unique visitor counters are the ancestors of engagement metrics – and that forced us to seek ways to monetize these dehumanized numbers by packaging them up like processed food for other industries to consume. So we have a Web whose utility value is decimated for people but maximized for advertising.
But as depressing as that is, undoing that mistake is straightforward. The monolithic monopolies aren’t going away, but there are enormous swaths of territory between the walls of their gardens that can be infilled with small markets built on federation between distributed but autonomous systems. Email, news, files, keys, and directories are still the atomic elements of the Internet application, and the tools and methods for transmitting them have only gotten better over time, even if we’ve been slow to adopt them.
The Internet as a Peer-to-Peer Network
The most fundamental error we’ve created for ourselves in the manner in which we’ve allowed the Internet to emerge is that it is not, on the whole, a Peer-to-Peer network in the infrastructure that connects personal devices and residences to the Internet. This is most evident in the way that last-mile connectivity is sold with asymmetric capacity, but also in how infrastructure providers structure their pricing around an assumption that traffic will follow the historical pattern of Web clients that send small requests and get large responses; in how slowly the transition has proceeded from perimeterized enclaves that hide addressing (and the scarce address space re-use that prompted it) to globally unique addresses for devices; and in the perpetuation of tight coupling between addresses and services, which makes the delivery of applications and services fragile and brittle, both inside data centers and local area networks and across the Internet as a whole.
Over and over again we produce products that follow the client-server pattern and must speak to an intermediary in a far-off data center, or to a controller on a provider network, before they can safely and reliably communicate with a device directly adjacent to them. This simply does not scale, nor does it create systems that are resistant to change or to crisis, nor does it make products that can outlive the company that makes them, which calls into question whether customers are actually buying anything when they pay for a device that may someday suddenly stop working because it is no longer economically viable for a corporation.
As above, the solution to this dilemma born of largesse is to get small to get big. The client-server pattern came about in the context of local area networks, but it is wholly unsuited to solving problems at scales beyond a few thousand devices. Yet we have built the Web out of it, and because of that illusory success, we seek to build upon the Web (an application built on the Internet) to scale the Internet to a few trillion devices. This is such a bad idea that it is hard to come up with words to express how bad an idea it is; let us reach back a few centuries and call it folly.
Applications that scale do so because they are able to access massive resources (like High Performance Compute) or because they are suited to having their activities sliced up into smaller sets, often sets of users. When we do this we aren’t scaling the application, per se, but are instead compartmentalizing the application – chopping it up and duplicating it so that these twenty thousand users get this instance and those twenty thousand users get that instance and so on – and this reveals a few things about the application.
- First, it reveals whether the application relies on a component that this compartmentalizing can’t be applied to. Often this is the database, where all the information about all the users is concentrated. We concentrate it not because it is good for the users, but because it is good for the people who control the application: it makes getting reports about the application, and perhaps more importantly about the customers, easier to do.
- Second, it reveals whether the application is trying to remember something about what the customer is doing while they are doing it, referred to as “state”. If the customer’s connection to the application is broken or the instance of the application fails, this state can be orphaned, lost, or processed twice, leading to a variety of problems. Imagine buying a book, and just as the payment is processed, the server fails. You might take it in stride, re-submit your order, and go on about your day, but in short order a package arrives with two copies of the book, and the corresponding two charges to your credit card. This happens because the state of your transaction was kept on an instance of the application, not in a centralized place that all instances can share, and not with you and your Web browser where it can be presented to any instance of the application to pick up where you left off. This is a trivial example, and one that is, for the most part, solved in modern shopping cart applications, but not long ago it was a common occurrence.
- Lastly, it reveals whether the application needs to be compartmentalized at all. If the various instances of the application are not relying on a component that cannot be broken up, and are not holding on to state for a user session, then we may start to ask why we have the application in our data center at all, and why it can’t be compartmentalized not for twenty thousand users, but for one: an instance of the application for each customer, and an interoperable method of communicating only the information that is needed back to the data center. Taken to the extreme, this could mean that even the payment is conducted directly with the payment processor, and the only information sent to buy your book is a token from the processor affirming that payment was made, the stock number of the book you are buying, the address where it should be delivered, the shipping method, and an identifier that your app generated to track the purchase transaction (a sketch of such a message follows this list). The back end of this book-buying application (the private portion of the communication that processed the order) just became the front end of your application (the public catalog and shopping cart), because your application doesn’t need to be in your data center to let your customers buy your products.
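As a minimal sketch – assuming hypothetical field names rather than any real or standardized schema – that purchase message, and the duplicate-order problem from the book example above, might look like this in TypeScript:

```typescript
// Hypothetical purchase message; every field name here is an assumption.
interface PurchaseOrder {
  transactionId: string;   // generated by the customer's app, so a retry
                           // with the same id can be detected as a duplicate
  paymentToken: string;    // issued by the payment processor, affirming payment
  stockNumber: string;     // the item being purchased
  deliveryAddress: string;
  shippingMethod: "ground" | "express";
}

// The vendor holds no session state; it only remembers which
// transactionIds it has already fulfilled (idempotency).
const fulfilled = new Set<string>();

function acceptOrder(order: PurchaseOrder): "accepted" | "duplicate" {
  if (fulfilled.has(order.transactionId)) {
    return "duplicate"; // the double-charged book from the earlier example
  }
  fulfilled.add(order.transactionId);
  // ...hand off to warehouse and shipping...
  return "accepted";
}
```

Because the customer’s app generates the transaction identifier, the state of the purchase lives with the customer, and any instance of the vendor’s application can recognize a retry.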
Now, imagine further that this transaction message were standardized, and anyone selling anything used the same format. Instead of your application with your catalog and your shopping cart, customers could have a generic application with a generic shopping cart and load your catalog into it. Or they could just ask whether you have a particular stock number and whether it could be delivered by a certain time. Or they could announce to a number of shippers that they want a particular stock number delivered by a certain time, and the shippers could compete for the transaction. All of these possibilities re-frame the Web from a client-server design to a peer-to-peer design and, with that, re-frame the way that business models are written to produce, sell, distribute, and maintain products using the Web.
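To make the announce-and-compete flow concrete, here is a hedged sketch; the `/quote` endpoint and the `Quote` shape are assumptions for illustration, not an existing API:

```typescript
// Hypothetical offer returned by a vendor competing for the transaction.
interface Quote { vendor: string; price: number; deliverBy: string; }

async function solicitQuotes(
  vendors: string[],      // vendor base URLs the customer's agent already knows
  stockNumber: string,
  deliverBy: string,
): Promise<Quote | undefined> {
  // Announce the desired purchase to every vendor in parallel.
  const quotes = await Promise.all(
    vendors.map(async (vendor) => {
      const res = await fetch(`${vendor}/quote`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ stockNumber, deliverBy }),
      });
      return res.ok ? ((await res.json()) as Quote) : undefined;
    }),
  );
  // The customer's agent, not any vendor, picks the winning offer.
  return quotes
    .filter((q): q is Quote => q !== undefined)
    .sort((a, b) => a.price - b.price)[0];
}
```

The design inversion is the point: the customer’s agent holds the cart and chooses among peers, rather than a single vendor’s server holding the customer.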
And it isn’t just the Web that can benefit from this re-framing. The traditional services of email, news, files, keys, and directories have been recreated as Web applications and therefore as client-server applications. A return to their roots as peer-to-peer applications allows us to conceptualize not only what modern adaptations of these traditional services look like, but also where these applications are deployed and run. While the example above probably prompted you to think of the generic shopping cart application as something that you have on your mobile device, and the popularity of services that facilitate peer-to-peer file sharing might lead you to a similar conclusion when imagining a file transfer application, trying to imagine an email server or a news aggregator or a key exchange or a directory service existing exclusively on your mobile device or even on your laptop is challenging. The usefulness of these kinds of things is strongly related to the ability for people to interact with them even when your personal devices are disconnected, whether because they are off or the battery is dead, or because you are somewhere the Internet is inaccessible, like a train tunnel or an airplane or the middle of nowhere. Because of this, they were once operated at the institutional scale by a dedicated IT organization, and that spurred a commercialization of them for personal (but not private) services that mimicked what the IT organization did for companies. Eventually, that gets us services like Gmail and Dropbox, while key exchanges and directories have largely disappeared and news has become less and less recognizable as newspapers dry up and local television, talk radio, and fringe reporting all fight for access to people’s attention, trust, and engagement.
But it doesn’t have to be this barren landscape. In fact, because elastic compute, connectivity, and storage are so plentiful, and because of successive advances in automation, it isn’t completely unthinkable that the average person could have personal and private email, news, files, keys, and directory applications running 24/7 on the Internet, acting as their agent in peer-to-peer relationships with other applications, vendors, and services, while they connect to them with something that looks very much like today’s client-server apps on their mobile device or in their Web browser (because client-server works fine for this). These personal (or family) services become like the micro-scale vendors in the yokocho: any one business can succeed or fail, but the aggregate stays healthy over time.
The Internet as a foundation for Social Infrastructure
If we have a healthy Internet, then we can conceptualize how it can help us have a healthy community. We’ve already talked about one of the most critical pieces of social infrastructure on the Internet when we talked about how, because Web browsers do not have email servers or key exchanges or directory services, the Web grew without ever achieving an organic or interoperable framework for verifying the Identity of a person or a thing. This growth also assured that the developing capabilities for Internet applications to do this atrophied and died. As a result we have spam, phishing, DDoS attacks, ransomware, and fraud of all kinds. Interoperable frameworks for verifiable identity prevent all of this, because when you can use identity as a determinant of whether or not to establish or continue communication – even at the very lowest levels of networking and computing – you stop many attempts to defraud or corrupt or compromise people and systems before they can get started.
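As a minimal sketch of keying that decision on identity – assuming a hypothetical directory of known public keys and a challenge the peer signs to prove control of its identity, not any existing protocol – it might look like this:

```typescript
import { createVerify } from "node:crypto";

// Hypothetical directory mapping an identity to its PEM-encoded public key.
const directory = new Map<string, string>();

// Decide whether to establish communication at all: an unknown or
// unverifiable identity is refused before any session exists.
function admit(identity: string, challenge: Buffer, signature: Buffer): boolean {
  const publicKey = directory.get(identity);
  if (publicKey === undefined) return false; // unknown identity: refuse outright
  const verifier = createVerify("SHA256");
  verifier.update(challenge);
  return verifier.verify(publicKey, signature); // forged identity: refuse
}
```

The particular mechanism matters less than the ordering: the trust decision happens before the connection carries anything else.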
This is because out of identity comes the ability to build trust, and to distrust, and the ability to distrust is how you create safety, whether that is in a community or in a system, by seeing bad actors and making them known to others. Calling attention to bad actors is an ancient social mechanism to keep groups in line. At the extreme this becomes shame, which is embodied in the Internet infrastructure as blocking, filtering, or shunning addresses, domain names, or services that promote or tolerate bad actors. Today these decisions are made in private by governments and corporations, and they tend to be about “big” problems like terrorism, human trafficking, and child exploitation. But they also happen at the small level, when you block someone on social media, or report a spam message, or create a filter for your email. And they happen in between, when a company responds to abuse or copyright complaints by banning or locking a person’s account, or when a moderator or admin blocks a user in a Reddit forum or a server from peering to a federated service like Mastodon.
All of these are feeble, broken responses, called for because it is difficult or impossible to identify bad actors definitively. The criminals move to new IP addresses or new domains to keep conducting their business, the spammers use compromised accounts to keep spamming, the abusive people in our socials and in the Reddit forum create new accounts to keep spewing their nonsense, and the rogue Mastodon server gets a new name or its users get new accounts on different servers. All of this evasion gets harder when we can make trust decisions earlier in the interaction. And our responses become more flexible, more durable, and more effective when we can make identity the thing we key on when programmatically creating safety, whether that safety is for the sake of people, networks, or computers.
Identity, trust, shame, exile, and safety are all examples of the building blocks of social infrastructure. The real-world communities we build all need them, and the Internet (and the applications built on top of it) does too. Fixing anything about how the Internet works without having this as the endgame isn’t worthless, but aiming at social infrastructure is the outcome we should be trying to achieve.