UPCOSM
How Multi-Gigabit Symmetrical Bandwidth Will Unleash A New Era Of Technological And Economic Growth
Part 1: Video Killed The Internet Star
Intentionally or not, the internet’s original architects built a technology platform whose defining attribute is its potential to upend even the most deeply entrenched social, political, artistic, and commercial incumbents, a power which captivated entrepreneurs and troublemakers from the earliest days of dial-up. In his 2014 book From Gutenberg To Zuckerberg, John Naughton called the internet “a global machine for springing surprises,” and added hopefully that “the only way we will stop them coming is either to switch off the network, or to cripple it in ways that will staunch the creative flow."
One of the early surprises Naughton cites: the tectonic upheaval of the music industry, a temblor catalyzed by Napster. Founded in 1999 by Shawn Fanning, then an 18-year-old Northeastern University student, Napster rose meteorically, catching record company executives completely off-guard and instigating an epic disruption that toppled their very profitable business model.
"Shipping atoms to ship bits"---in other words, digitally encoding thin slices of glass, loading them onto trucks, and delivering them by the thousands to brick-and-mortar retailers---made sense and generated healthy margins beginning in 1981, the year that music was first digitized commercially as the compact disc. But with the coming of the public internet, digitization turned out to be a two-edged sword. By the late 90s, CD’s represented only one form of digital transport, and an expensive, cumbersome one at that.
Earlier in the decade, engineers associated with the Moving Picture Experts Group completed MPEG-1 Audio Layer 3 (mp3), a coding format for digital audio. Converted to mp3 files, music tracks could be sent directly to listeners over copper wires and fiber---not at the interstate limit of 65 miles per hour but at the speed of light. Mp3s could be endlessly duplicated with no degradation and at a marginal cost near zero. As ethereal as it seemed, the weightless mp3 was a heavy payload on a deadly missile aimed straight at the heart of the record companies’ cash flow machine. They never saw it coming.
Fanning lit the fuse by building a peer-to-peer application that enabled users---in the beginning, mostly college students---to become nodes in a file-sharing network, allowing them to swap whatever music tracks---files---they could rip from compact discs onto their individual hard drives. Over the course of just a few years, Napster captured the imagination of thought leaders, entrepreneurs, and the music-buying public.
Exposed to still-unsettled copyright issues, however, and unable to weather the industry’s well-bankrolled Big Law cannonade, Napster was forced into bankruptcy (it later re-launched as a subscription service). Ambitious start-ups and mature, well-funded companies like Apple quickly filled the vacuum, jettisoning Napster’s legally problematic file-sharing model in favor of central distribution and control. But the forces set in motion by Napster were unstoppable, and music industry economics changed forever.
Napster’s revolution established two objectives:
· Electronic Delivery: Liberate digitized files from hard media and deliver them via the internet
· Many-to-Many Communications: Create a decentralized sharing and distribution system that envisions every user and every computer, i.e., every node, as both a client and a server
Only the first of these has since been fully realized. The process of delivering content from servers to users took center stage as the broadband era dawned and over the following two decades has become a proxy for what most of us believe the internet is and does.
Peer-to-peer communications and decentralization, on the other hand, have faded in relative importance as the formidable legal challenges which confronted Napster continue to represent a huge obstacle blocking the innovation path for cutting-edge many-to-many implementations. That’s why, in our post-Napster world, nodes mostly consume.
So it is that the internet, which began life as a community pot luck dinner, has become a suburban mall food court (Appendix 2). And thus, despite his optimism, Naughton has more recently seemed resigned to accepting that the “creative flow” about which he’d written so optimistically just a few years earlier was getting “staunched.” Explaining almost wistfully that "the Internet....was seen at the outset as a radically different kind of medium from the mass media which had dominated the print and broadcast world,” he completes his musing with a lament: “[b]ut as the network has evolved to connect billions of users, this early vision of its potential as a communications medium has been tempered by experience. Analysis of data traffic on the network suggests that the kind of passive consumption that characterised the broadcast era is returning.”
In other words….Television! What happened?
Very briefly, this: After Napster and iTunes, the Media Entertainment Establishment (MEE) quickly grokked that video was next in the firebrands’ crosshairs. Determined not to suffer the fate of their musical brethren, they headed straight for the battlements. And over the course of the following decade the most dynamic and disruptive medium in history was itself disrupted by the most laid-back and passive medium in history, a medium so inert that its avid users are commonly identified by their sloth and vernacularly described as starchy, tuberous crops immovably planted on a davenport.
“Television,” wrote Michael Wolff in 2015, “is colonizing the internet.”
Sorry to interrupt your nice story, but this notion of the internet as not much more than TV on steroids seems like a transparently revisionist attempt to portray our glorious network in the most unflattering way, egregiously conflating the internet’s broad application potential with one currently popular application category. Isn’t it an enormous distortion to shoehorn the internet, a technological miracle that in just a few brief decades has transformed nearly every aspect of human life, into an idiot box?
Sure. The internet is obviously far bigger than just broadcast television. Everyone knows this. In The Inevitable, Kevin Kelly describes the irreversible changes in media, entertainment, social interaction, politics, commerce, and the arts already wrought by this miraculous network-of-networks. Selling, buying, banking, managing, manufacturing, talking, listening, watching, electing, reading, writing, composing…..and even with all of this, Kelly notes that it’s early in the game, that “in terms of the internet, nothing has happened yet.”
Passively consumed video is only one piece of the internet, even if it’s a big piece right now. And its current prominence in this still prehistoric internet era shouldn’t really be all that surprising; as Kelly, paraphrasing Marshall McLuhan, explains, “the first version of a new medium imitates the medium it replaces.” The internet may have started out life as telegraphy/telephony on steroids, but once screens overtook dot-matrix printers as the dominant form of terminal, it was pretty clear what medium was getting replaced. So TV is dead, long live TV, this was to be expected, no big deal. Enjoy your Breaking Bad binge; the rest of the internet is alive and well, and it will take care of itself.
Or will it? What about the possibility that our current video obsession will crowd out, or has already crowded out, other important categories of applications---some in early stages, others not yet even conceived--- that require different network architectures and access standards? An open, permissionless platform, the internet is in theory capable of accommodating all of the use cases we can invent; practically speaking, though, we’re more likely to invent those that will work with the network we have.
And video entertainment not only plays an outsize role in our actual use of the internet today; it also greatly influences the way we think about the internet. This matters because our collective understanding of the potential of any technology is a key driver of how that technology evolves.
Downcosm in this context is not so much a state of play as it is a state of mind, less descriptive than prescriptive. It’s intended to represent today’s idealized vision of the internet, but as circumscribed by Newton’s First Law: an object in motion stays in motion with the same speed and in the same direction unless acted upon by an unbalanced force.
And passive consumption of video is a commercial juggernaut which enables and is enabled by the Downcosm: more video, more Downcosm, rinse and repeat. They will continue to feed each other until the cycle is interrupted by a fresh conception of the internet, one which stimulates the development of new application categories such as the kinds of duplex, peer-to-peer, and many-to-many communications uses that are dependent on symmetrical bandwidth. Otherwise, we’ll just go on extrapolating current demand, building ever-bigger downstream pipes to accommodate ever-better quality video, and many possible futures, including ones we don’t even yet know that we want, will fail to materialize. Upcosm in this context is not a prediction; it is a call to arms.
A 6’4”, 210-pound freshman soccer recruit (he grew a bit over the summer) catches the football coach’s eye on the first day of practice. The soccer coach tells the football coach to back off, but they agree to let the kid decide. Over the course of the next four years, in addition to all his course work, he will either spend a lot of time in the weightroom getting bigger and stronger or out on the field improving his agility and endurance. Athletes are born, but linebackers and midfielders are made by choosing.
The future of a great young technology, like that of a great young athlete, is in no way pre-ordained. The internet as it is today wasn’t an inevitability, and the internet of tomorrow will be the result of decisions made and forces set in motion today. Naughton writes of “the distortion imposed by the ‘Whig interpretation’ of Internet history – the tendency to view its [the internet’s] development with the 20/20 vision provided by hindsight. This provides a misleading impression of a linear progression from one great idea to the next, and obscures the paths of development that could have been, but were not, taken.”
The road you didn't take
Hardly comes to mind
Does it?
The door you didn't try
Where could it have led?
---Stephen Sondheim, Follies
It’s 2019, and the internet is a promising rookie facing a choice.
__________________________________________________
Learn more in Upcosm Part 2.
*** www.nytimes.com/2010/03/15/business/media/15carr.html
Downcosm: Follow The Bandwidth
Ask the average broadband user about his or her internet speeds, and the answer you get will nearly always be the downstream rate. To the extent that anyone ever mentions the upstream rate, it’s usually to complain about the time it took to upload a handful of selfies or a home video.
The significant differential between downstream and upstream bandwidth ceilings is not only a feature of the popular service tiers offered by most broadband providers; it’s actually codified in regulation. Since 2015, the Federal Communications Commission has formally designated as “broadband” only those internet service providers offering minimum speeds of 25 Mbps down and 3 Mbps up:
[Table omitted: FCC broadband speed benchmarks over time. Source: Broadband Now]
Note that the original 1996 standard, set well before high-quality internet video was even a gleam in Reed Hastings’ eye, specified equal upstream and downstream bit rates (it’s almost as if the internet pioneers were thinking about its potential as a sharing platform!). Yet as absolute speeds have continued to increase, the ratio of downstream-to-upstream for “official” broadband has grown steadily from 1:1 to 4:1 to 8:1.
What happened? As we’ve seen, higher speeds and streaming video, the key driver of broadband adoption, feed each other. In some ways the FCC’s changing definition, both in terms of absolute throughput and the downstream/upstream ratio, merely ratifies a development path that already happened and which was perhaps inevitable---Naughton notwithstanding--- at this early stage of the internet.
But regulators are not just neutral observers, and the FCC’s after-the-fact validation is actually an endorsement and a mandate. The clear message from regulators is that streaming content is king and must be served. On the FCC’s website today, downstream bit rates are discussed almost entirely in the context of how they affect video quality. Upstream speeds hardly rate a mention.
Yesterday’s egg is today’s chicken. Not surprisingly, few killer apps dependent on large magnitudes of upstream bandwidth have ever emerged. Except when grumbling about a slow Facebook or YouTube upload or an oversized email attachment, most consumers pay little attention to upload speeds.
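To put rough numbers on the imbalance, here is a back-of-the-envelope sketch (the 2 GB file size is an assumed example; the 25/3 Mbps figures are the FCC floor discussed above):

```python
# Illustrative only: moving a 2 GB home video in each direction over a
# link at the FCC's 25 Mbps down / 3 Mbps up broadband floor.
FILE_GB = 2
file_megabits = FILE_GB * 8 * 1000          # 2 GB ~= 16,000 megabits

down_mbps, up_mbps = 25, 3                  # FCC minimum "broadband" rates

download_minutes = file_megabits / down_mbps / 60
upload_minutes = file_megabits / up_mbps / 60

print(f"Download: ~{download_minutes:.0f} minutes")   # ~11 minutes
print(f"Upload:   ~{upload_minutes:.0f} minutes")      # ~89 minutes
```

The same video that arrives in about ten minutes takes an hour and a half to send back.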
What’s wrong with all of this? Nothing, maybe. The benign interpretation of consumer broadband history is that we’re getting the network we want. So….TV. Call this The Domino’s Pizza Model: choose your toppings and we’ll deliver when you’re ready. Poster Child: Netflix.
No, the internet is not just an enormous television network, any more than Las Vegas is just an enormous casino. But the dominant use informs the popular conception, which in turn plays a large role in determining the development roadmap. The Mother Teresa Museum will not be locating in Glitter Gulch.
There must be some kinda way outta here……..
The Field Of Dreams…..
…is the Upcosm, an era of multi-gigabit symmetrical bandwidth in which node-centric applications will flourish alongside centralized delivery models. What kinds of applications might these be?
Peer-to-Peer/Blockchain
Even after Napster's demise, like the last surviving humans before the Final Extinction Event who escape in order to continue to propagate the human race on a distant planet, a vibrant community of peer-to-peer enthusiasts has continued to thrive. But the development of gamechangers---paradigmatic P2P applications that would unleash the power of the internet to enable truly decentralized sharing---has been stifled by the failure to solve the critical issues that doomed Napster.
Enter Blockchain, the technological foundation of alternative currency Bitcoin. A blockchain is a public ledger replicated across every node on a network and updated as new transactions occur. The memorialized transactions reside forever in a block. Rewriting that history would require controlling a majority of the network---the so-called 51% attack---a practical impossibility on any large network; as such, the ledger is effectively immutable and serves as a guarantor of trust. By replacing third parties, e.g., banks, as trusted intermediaries, and eliminating the financial and process friction that third parties introduce to dealings between voluntary actors, blockchain-enabled platforms are capable of generating transaction cost reductions that can power new business models.
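A toy sketch makes the hash-linking idea concrete. This is not Bitcoin or any production design---there is no network, no consensus, no mining here---just an illustration of why quietly rewriting an earlier entry is detectable:

```python
import hashlib, json, time

def block_hash(block: dict) -> str:
    """Digest of a block's contents; any change to those contents changes it."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"time": time.time(), "transactions": transactions, "prev_hash": prev}
    block["hash"] = block_hash(block)          # hash covers time, txs, prev_hash
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every link; tampering with any earlier block breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, [{"from": "alice", "to": "bob", "amount": 0.25}])
add_block(ledger, [{"from": "bob", "to": "carol", "amount": 0.10}])
print(verify(ledger))                          # True
ledger[0]["transactions"][0]["amount"] = 100   # quietly rewrite history...
print(verify(ledger))                          # False: the tampering shows
```

In a real system that verification runs independently at every node, which is what turns mere detectability into practical immutability.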
For relatively small purchases and sales, blockchain efficiencies can mean the difference between profit and loss so that even electronic micropayments, once considered unworkable because of third-party fees that often dwarfed the size of the payment itself, might now make economic sense. And because blockchain ledgers can also act as distributed databases containing rights information, they have the potential to solve the two fundamental difficulties encountered by Napster: 1) the inability to exchange a small amount of money for 2) a legal right to own or rent a work of art represented by a digital file. Payments, no matter how small and whether for full ownership or limited use, could flow directly to the content owner, even if the file itself is distributed from inside a peer network and not from a centralized server controlled by the owner. See Rinaldi, 2018.
Sensors, things, and a touch of the future
From the early days of ARPANET until just a few years ago, network-connected devices---whether personal computers, mobile phones, or high-capacity computers serving content from huge CDNs---required human mediation. That changed early in the current decade when analog components that sense, measure, interpret, and analyze data were connected directly to the network and commercialized as consumer solutions. Designed for specific functions such as home automation and health monitoring, and packaged as “smart devices,” they gave birth to a new application category: the Internet of Things (IoT). IoT applications have thus far not challenged the relatively meager upstream bandwidth currently available to most consumers. That won’t be the case forever.
Even more recently, the so-called “Tactile Internet” (TI) takes IoT a step further, envisioning human/sensor interaction in an effort to reproduce at a distance the immediate experience of touch. Advances in haptics will enable TI applications including telepresence, skills training, and remote surgery. These will require full-duplex communications relying on minimal latency, but they will also depend on the availability of huge volumes of symmetrical bandwidth. (See Appendix 1).
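Some rough arithmetic suggests how quickly always-on, two-way uses could overrun today’s upstream ceilings. Every per-stream rate below is an assumption chosen for illustration, not a measured requirement:

```python
# Assumed figures showing how concurrent two-way household uses stack up
# against a 3 Mbps upstream ceiling. Per-stream rates are illustrative.
UPSTREAM_CEILING_MBPS = 3

upstream_demand_mbps = {
    "two-way HD video call": 4.0,
    "security cameras streaming to the cloud (4x)": 8.0,
    "continuous backup / file sync": 2.0,
    "telepresence or haptic session": 10.0,
}

total = sum(upstream_demand_mbps.values())
print(f"Concurrent upstream demand: ~{total:.0f} Mbps")        # ~24 Mbps
print(f"Available under the FCC floor: {UPSTREAM_CEILING_MBPS} Mbps")
```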
Which comes first, necessary conditions or applications? The answer is that the future begins when chicken-and-egg parlor games give way to decisive action. Applications await the capacity that will enable them. If the bandwidth---and the latency---don’t arrive, neither will the applications, forever abandoned in The Great Perhaps. It's worth observing that no one ever decided to build a bridge by counting the swimmers.
Previously skeptical analysts have begun to reconsider. In 2012, Nokia's Ana Pesovic contended that internet traffic was becoming “more asymmetrical over time” and that a focus on symmetry would "hamper future investments due to higher costs." Yet a mere five years later she suggested that "until recently, residential internet use was almost solely about downloading content in one form or another...operators focused almost completely on giving their customers the best possible download speeds, creating asymmetrical networks," finally acknowledging that "by 2017, several key evolutions in customer behavior changed the way people use the internet."
She continued: "The upstream half of their service is becoming increasingly important to residential users....evidence suggests that the more upstream speed that is available, the more it is used. In Asia – where many operators offer high upstream bitrates – significantly more upstream bandwidth is used. The ratio of upstream to downstream usage in Asia is 1:3, compared with 1:20 in Europe."
In its whitepaper, The Zettabyte Era, Cisco shows the path forward. While maintaining that in all likelihood "residential Internet traffic will remain asymmetric for the next few years," Cisco cites "numerous scenarios" that "could result in a move toward increased symmetry." These include accelerated use of high-end video for communications, and especially adoption by CDNs of peer-to-peer architectures.
Real-world implementations surveyed by Cisco validate Pesovic's suggestion that where more upstream bandwidth is provided, it is absorbed. Cisco notes, for example, that P2P live-streaming applications in China such as PPStream and PPLive have enjoyed great success.
Cisco's optimistic conclusion? "Generally, if service providers provide ample upstream bandwidth, applications that use upstream capacity will begin to appear."
If you build it......
Appendix 1
The Skinternet
The growth in IoT applications to date has brought about at least a reconsideration of the uses for upstream bandwidth, if not an increased demand for it. The next generation of connected devices will surely kick things up a notch.
In 2012, Gerhard Fettweis coined the term “Tactile Internet” (TI) to represent a technological framework supporting applications that could mimic “the experience of touching something in real life.” Two years later the International Telecommunication Union predicted that, following close on the heels of the IoT, “the next wave of innovation will create the Tactile Internet.” Powered by the development of technologies that enable humans and machines to interact not only with each other but also with immediate and remote environments in real time, TI, the ITU forecast, will give rise to “numerous new opportunities for emerging technology markets and the delivery of essential public services.”
Full implementation of TI will obviously rely on continuing breakthroughs in haptics engineering, but it will also require a re-thinking of network architecture as well as major new investments in telecommunications plant and equipment. ITU noted that while “today’s fixed and mobile Internet infrastructure is typically used for transferring content from A-to-B and is optimized for the transmission of static or streaming content,” implementing TI would demand new benchmarking standards across a broad range of network performance criteria.
Over the past several years, Fettweis and others have begun to delineate the operational and networking elements that would be required to implement TI, and have also enumerated dozens of specific applications making up several broad categories. Networking KPI benchmarks for the individual categories vary according to the communication requirements of the use cases they comprise. A recent IEEE Communications Survey offered the following guide:
[Table omitted: networking KPI benchmarks by Tactile Internet use-case category. Source: Towards Haptic Communications over the 5G Tactile Internet, IEEE, June 2018]
Upper-end data throughput requirements for high-reliability/availability use cases are not specified. Clearly, however, bandwidth for mission-critical tactile and presence applications dependent on full-duplex high-definition video must be engineered for bi-directional peaks, not averages.
Latency? Wait ‘n See
The great Aristotle, never one to let actual research and experimentation stand in the way of his own ego and intuition, believed that the speed of light was infinite. And for most of human history, either because of or despite Aristotle’s arguments from authority, so did almost everyone else. But beginning with the work of Danish astronomer Ole Roemer in 1676, a succession of observations and ingenious experiments yielded results that got to within striking distance of the constant we know today.
Before the dawn of the electric age, though, the speed of light was mostly a theoretical curiosity, a phenomenon without much real-world import or any useful applications. Even if you were among the scientific cognoscenti back in the pre-Fara day, without portable lasers and oscilloscopes you couldn’t even use your insider information to win a bar bet. And 186,000 miles per second---fast enough to circle the globe seven times in a single second---must have seemed so incomprehensibly fast that you’d be suspected of just making it up.
Lately, though, we’ve discovered that for many purposes---like sending and receiving information over copper, fiber, or air--- light is actually far slower than we’d like it to be. 40 millisecond round-trips from New York to Paris might sound fast to a road warrior, but now that we can imagine the possibility of an emergency procedure performed by an Upper East Side surgeon on a patient at the Hôpital Saint-Louis, it’s become a constraint.
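The physics is easy to check with a back-of-the-envelope calculation; the distance used here is an approximate great-circle figure, and real routes are longer still:

```python
# Back-of-the-envelope round-trip propagation delay, New York <-> Paris.
# The distance is an approximate great-circle figure; real fiber routes
# are longer and add switching, routing, and queuing delays on top.
C_VACUUM_KM_PER_S = 299_792
C_FIBER_KM_PER_S = C_VACUUM_KM_PER_S / 1.47   # light slows by ~1/3 in glass
DISTANCE_KM = 5_800                            # approx. NYC-Paris great circle

for label, c in [("straight line, vacuum", C_VACUUM_KM_PER_S),
                 ("straight line, fiber", C_FIBER_KM_PER_S)]:
    rtt_ms = 2 * DISTANCE_KM / c * 1000
    print(f"{label}: ~{rtt_ms:.0f} ms round trip")
# straight line, vacuum: ~39 ms
# straight line, fiber:  ~57 ms -- before a single router is traversed
```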
A quick glance at the classification table reveals the critical role of latency in TI. For delivering remote tactile and control sessions that are indistinguishable from immediate experiences, an inability to cap end-to-end delay at ≤ 1 millisecond would be a showstopper.
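That budget translates directly into geography. Assuming light in fiber at roughly 204,000 km/s and ignoring every source of delay other than propagation:

```python
# How far away can the far end be if the entire round trip must fit in 1 ms?
# Assumes ~204,000 km/s in fiber and zero processing time at either end.
C_FIBER_KM_PER_S = 204_000
ROUND_TRIP_BUDGET_S = 0.001

max_one_way_km = C_FIBER_KM_PER_S * ROUND_TRIP_BUDGET_S / 2
print(f"Maximum one-way distance: ~{max_one_way_km:.0f} km")   # ~100 km
```

Roughly 100 kilometers, before a single switch, router, or processor has taken its cut---one reason the Tactile Internet literature leans so heavily on pushing computation toward the network edge.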
APPENDIX 2
History: How Video Killed The Internet Star
ARPANET, a project started in the 1960s as a Department of Defense initiative to interconnect the computers of researchers, had by the late 1990s been transformed by rapid advances in microprocessors, packet switching, networking, and fiber optics into a General Purpose Technology (GPT) called the Internet. "Downcosm" represents Broadband 1.0, the commercial bundling of Internet access as a mainstream consumer service.
Public awareness of the internet began to take hold in the late 1980s with the availability of paid home services: AOL, Prodigy, and CompuServe. The networks built by those early players were referred to as "walled gardens," as they mostly delivered just their own proprietary feeds: news, weather, chaperoned chat rooms, and subscriber-to-subscriber email, as well as a bit of syndicated opinion and topical content. In many respects, they functioned like an online version of a provincial newspaper's Sunday supplement: mostly print stories, some local want ads and personals, and just enough syndicated "multimedia" (like those flashy weekend photo spreads in Parade) to keep the eye entertained.
For the majority of subscribers to these services, the rest of the internet iceberg was only dimly visible and largely out of reach. Soon, however, the lay public and the water cooler crowd began to catch on to the fact that there was something big on the other side of the garden.
The buzz swelled to a roar in the early 90s when Tim Berners-Lee created the World Wide Web. The Big Three responded to this disruption by offering their subscribers perfunctory tunnels to it. Then, in 1994, Marc Andreessen finally captured the zeitgeist when he launched the Netscape browser, opening up the web to millions of consumers clamoring to "get online" and "surf the net."
But despite the success of Netscape, ponderous dial-up speeds continued to render many exciting capabilities of the web inaccessible to most home users. Network developers and service providers took notice, however, and, empowered by both Moore's Law and huge leaps in data transmission know-how, implemented order-of-magnitude increases in bit rates.
By the turn of the millennium we’d been nibbling on frustrating dial-up multimedia for more than a decade. When the telcos began offering DSL most of us happily noted the reduced lag in loading web pages and opening email attachments. So-called "power users" (e.g., the data-hungry pioneers of peer-to-peer networking like Napster) could never get enough bandwidth in either direction and so, unlike the general public, even appreciated the less pronounced improvement on the upstream side. But at the time the only broad consumer application that could fully utilize DSL's downstream-on-steroids was video.
Completing the virtuous circle were traditional media companies, encouraged to build up their web presences and produce content that could take advantage of the ever-faster speeds. By the time YouTube launched in 2005, competition between and among telcos and cable companies had brought home users downstream access as fast as 2 Mbps. Video downstreaming exploded.
Each stage of internet development was itself enabled by improvements in output modalities, from blinking lights, punched cards, and paper tape to CRTs, composite video, plasma, and flat LCDs. These dictated our terms of engagement with the medium. For two or three generations brought up on televised entertainment, the internet was a natural for bringing us more of the same.
The hockey-stick growth in traffic exposed a key weakness in the architecture of the core network. Not only does the speed of light set a limit on the transit time for any requested data packet, but long distance travel on the internet---a network of networks--- requires those packets to take indirect routes, getting bounced back and forth through routers and switches and necessitating periodic conversions from photons to electrons and back again. Even though some applications, like email, could peacefully coexist with the resulting latency and jitter, video quality often suffered to the point of non-commerciality.
The solution to this multi-hop nightmare was provided by the development and proliferation of content distribution networks. CDNs enable the optimization of data-heavy content delivery from the core to the edge by strategically placing multiple servers loaded with caches of often-requested content in geographical proximity to access networks and thus to end users. CDNs function in much the same way as local Amazon fulfillment centers---the closer they are, the more efficiently they deliver.
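The logic is simple enough to sketch. The coordinates, the single origin, and the "nearest node wins" rule below are placeholders for illustration, not any real CDN's topology or routing policy:

```python
# Toy illustration of the CDN idea: serve each request from the nearest
# cache rather than the distant origin.
from math import radians, sin, cos, asin, sqrt

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

ORIGIN = (37.4, -122.1)                         # hypothetical origin datacenter
EDGES = {"New York edge": (40.7, -74.0), "Chicago edge": (41.9, -87.6)}

def pick_server(user):
    """Return the closest edge cache, its distance, and the origin's distance."""
    name, loc = min(EDGES.items(), key=lambda kv: distance_km(user, kv[1]))
    return name, distance_km(user, loc), distance_km(user, ORIGIN)

edge, edge_km, origin_km = pick_server((40.8, -73.9))   # a viewer near New York
print(f"{edge}: ~{edge_km:.0f} km away (origin: ~{origin_km:.0f} km)")
```

Real CDNs steer requests with DNS or anycast and weigh server load and link health as well as distance, but proximity is the heart of it.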
Meanwhile the speed wars---the downstream speed wars---continued to escalate, and just eighteen months after "Me At The Zoo" became its first uploaded video, YouTube was acquired by Google. Soon enough, with a little help from its friends, internet video was ready for prime time---a coming-of-age certified by Netflix Inc.'s early-2007 launch of Watch Now, the video streaming platform known today simply as “Netflix.” Originally available only on personal computers---the first iPhone wouldn't be released for another six months---Netflix over the following decade would transform our viewing habits on both fixed and mobile devices and in the process help to engineer a successful takeover of both Hollywood and much of the retail internet.
Today, for many of its most devoted users, the internet is an NFL Championship Game, a short clip of Bruno Mars in concert, a viral 30-second cat movie on Facebook, or an episode of "Better Call Saul." Welcome to the Downcosm.
APPENDIX 3
Recapitulation? Really?
Just for fun.
The Theory of Recapitulation states that “the development of the embryo of an animal, from fertilization to gestation or hatching (ontogeny), goes through stages resembling or representing successive stages in the evolution of the animal's remote ancestors (phylogeny).” In other words, we microcosmically play out the entire Ascent Of Man while in utero. If that seems a bit silly---like a Woody Allen time-lapse documentary about Darwinism---it is; in fact the specifically biogenetic interpretation of the theory has been roundly discredited. Still, recapitulation may have some explanatory power as applied to the evolution of telecommunications and the development of the embryonic internet:
Telecommunications (Phylogeny) | Internet (Ontogeny) | Users
Telegraphy | ARPANET | Guild Only
Telephony | Modem (e.g., TRS 100) | Guild + Public
Analog Radio | Dial-up/BBS | G + P (+ Narrowband Broadcasting)
Analog Television | Broadband | G + P (+ Wideband Broadcasting)
Mirroring the progression of the telecommunications era that began with telegraphy, the internet started out as a private network utilized only by a guild fluent in the esoteric operating code of specialized hardware. It then evolved through a succession of public phases until what finally emerged was a few-to-many video broadcasting medium.
APPENDIX 4
Gedankenexperiment
History never gives up its counterfactuals, and declaring that things might have worked out differently is either a fruitless lament or a harmless indulgence--unless in the thinking process we become more acutely aware of the ramifications of choices we're making today. To that end, even outlandish thought experiments can be illuminating.
Telephony: A Road Not Taken
Boston, a sunny day in late March, 1876.
Just a few weeks ago, Alexander Graham Bell was issued patent 174,465 ("the method of, and apparatus for, transmitting vocal or other sounds telegraphically"). Over the past few days, he completed a successful demonstration of the device described in the patent.
Lost in thought while taking his daily Cambridge Street constitutional on that brisk early spring afternoon, he bumps into Mayor Phineas Taylor Barnum of Bridgeport, Connecticut. Barnum’s in Boston on a familiar errand--- attending a circus. This time, it’s a plenary session of the New England Republican Party. The GOP’s Presidential nominating convention, to be held in Cincinnati, is only a few months away.
P.T. Barnum, of course, is more than just another glad-handing politician looking to beef up his social network. He’s also a legendary showman with an extraordinary sense of public tastes in entertainment and an instinctive understanding of the role technology can serve in creating events that draw ticket-buying crowds.1
The chance meeting—at least it seemed so to Bell---is no accident to Barnum, who’d been enjoying his ham and egg sandwich that morning at Young’s Hotel on Court Street when he happened to catch a front-page Globe story about Bell and his “telephone.” In fact, Barnum already knew quite a bit about Bell; an 1844 Barnum exhibition in London’s Egyptian Hall had featured an odd invention called the Euphonia, a grotesque mechanical speaking contraption which underwhelmed the hoi polloi but which captured the attention of Bell’s father.
After breakfast, Barnum spent a few hours reading up on Bell in the hotel library, and then put together this 1876 version of an elevator pitch to present to the Scotsman that afternoon:
“Messrs. Bell and Barnum are pleased to announce the formation of their partnership as joint proprietors of bTunes. Making use of Mr. Bell’s wonderful invention and Mr. Barnum’s well-known musical acumen, bTunes will endeavor to allow the citizens of this great city to enjoy wonderful performances of the world’s greatest symphonic and popular music without leaving the comfort of their homes.”
The entertainment business! Bell admitted to Barnum that he’d never actually considered that as a use case for his invention, let alone as its primary application, but that he’d give it some thought.
The development paths would differ greatly. Commercial rollout of remote music delivery would necessitate huge leaps forward in microphone/speaker technology and sound quality, while scaling two-way voice communications as a consumer service would require an intense focus on enabling full-duplex channels and interconnection.
Even more important, the popular conception of the technology---the common cultural understanding in 1870s-1880s America of the promises, limitations, and risks associated with whichever application Bell chose ---would inform the incessant flows of human, intellectual, artistic, social, and financial capital that would serve as launchpad for Version 1.0 and springboard for never-ending upgrades. Bell understood that whether he went with telephony or entertainment he would need to capture the public's imagination with his vision, and in so doing spur a generation of engineers to develop solutions propelling the business forward, a generation of entrepreneurs to develop new products and services that exploited those solutions in startups across a multitude of verticals, and a generation of thought leaders and financiers to imagine and capitalize the wonderful opportunities that continued to lie ahead.
Bell Telephone v. bTunes: same starting technology, two vastly different futures.
Footnotes (Books)
1. George Gilder, Microcosm: The Quantum Revolution In Economics And Technology, Touchstone, 1990
2. George Gilder, Telecosm: How Infinite Bandwidth Will Revolutionize Our World, Free Press, 2000
3. John Naughton, From Gutenberg To Zuckerberg: What You Really Need To Know About The Internet, Quercus, 2012, Loc. 221
4. Kevin Kelly, The Inevitable: Understanding The 12 Technological Forces That Will Shape Our Future, Viking, 2016, p. 63
Internet sources are hyperlinked and shown in blue or purple.