Chapter +7: Earth – home sweet home

Earth photographed from deep space forms a most strikingly beautiful image. It’s an image that carries an enormous significance. Nearly eight billion souls live out their daily lives carried along on this blue-green sphere spinning slowly and silently through the void. All of humanity, and possibly all living things, exist only on this one fragile planet. Carl Sagan said it most eloquently[1], referring to the ‘pale blue dot’ that Voyager 1 captured when it turned its cameras back towards the distant Earth one last time:

“We succeeded in taking that picture, and, if you look at it, you see a dot. That’s here. That’s home. That’s us. On it, everyone you ever heard of, every human being who ever lived, lived out their lives. The aggregate of all our joys and sufferings, thousands of confident religions, ideologies and economic doctrines, every hunter and forager, every hero and coward, every creator and destroyer of civilizations, every king and peasant, every young couple in love, every hopeful child, every mother and father, every inventor and explorer, every teacher of morals, every corrupt politician, every superstar, every supreme leader, every saint and sinner in the history of our species, lived there on a mote of dust, suspended in a sunbeam.”

The formation of the Earth

The Solar System was created some 4.6 billion years ago – about 9.2 billion years after the universe began. The Solar System is defined as starting when the central mass of the primordial cloud of gas and dust attained sufficient density and temperature to ignite hydrogen fusion and thus become our Sun. The outpouring of solar energy swept the inner regions of the Solar System clear of the lighter elements (hydrogen, helium). Electronic forces (chemical bonds, van der Waals forces, and electrostatics) caused the remaining heavier elements and compounds to condense into solid particles. Beyond a certain size, gravitational forces started to dominate and larger rocky bodies were formed, composed mainly of metal silicates. After a chaotic beginning, two large planets (Venus and Earth-Moon) and two smaller bodies (Mercury and Mars) finally emerged and remained in stable orbits in the inner Solar System.

Early Earth would have been unrecognizable. The entire planet was liquid and red-hot – melted by heat from in-falling material, radioactive decay[2], and gravitational fractionation into a very dense iron core (density ~12 grams/cc) surrounded by a mantle of much lighter silicates (~4 grams/cc). The Moon is believed to have formed in this early Hadean[3] period as a result of an unimaginably violent impact with a Mars-sized body. The Moon is composed of just the lighter mantle material that was ejected and does not have a significant iron core. Earth’s hot atmosphere at this point would have been composed of H2O (water vapor), CO2, and nitrogen outgassed during Earth’s formation and also delivered by in-falling comets during these early chaotic beginnings.

As early as four billion years ago, a solid crust had formed on the Earth and large oceans of water had condensed from the atmosphere. There is even some evidence that life already existed. Whether life was created independently on Earth or was delivered courtesy of an interstellar asteroid or comet is not known[4]. Life has had a major impact on Earth’s atmosphere and geology. From four billion to two billion years ago, only simple lifeforms like bacteria existed. Photosynthesis developed during this period and offered an enormous new energy source for building the structures of life. Photosynthesis uses sunlight to strip carbon and hydrogen from CO2 and H2O. Oxygen is excreted in the process. Initially the excess oxygen was incorporated into iron oxide deposits in the crust, but, eventually, large amounts of oxygen built up in the air, completely changing the composition of Earth’s atmosphere. Conversely, over geological time, large deposits of biological origin became incorporated into the crust. These deposits include the carbon and hydrocarbon fossil fuels and the carbonates such as limestone and marble.

More complex cells containing distinct nuclei and organelles started to appear about two billion years ago. Multicellular organisms then developed, and by about half a billion years ago life had started to colonize the land. Convection currents in the liquid outer core had already generated a magnetic field around the Earth. The magnetic field gave protection from the solar wind, and terrestrial life was now also protected from ultra-violet radiation by virtue of the ozone[5] layer – newly formed from the oxygen in the atmosphere.

By 200 million years ago, the multicellular organisms included enormous dinosaurs that roamed both the land and the seas. Today the largest animals are mammals, but the dinosaurs’ descendants still flourish worldwide in the myriad forms of birds. In contrast, Homo sapiens is a real newcomer, originating only about 200,000 years ago in Africa before gradually spreading throughout the rest of the world. By twelve thousand years ago, humans were almost everywhere on the globe[6]. Altogether, there are nearly eight billion of us here today – four hundred million tonnes of humanity.

The Global Village

Early humans are believed to have lived in small bands of perhaps ten to a hundred closely-related individuals. The bands would have been relatively self-sufficient, probably living as nomadic ‘hunter-gatherers’ relying on their intimate knowledge of the land for shelter, water, and food and to avoid danger. The bands would travel from location to location as the need arose. Humans are long-lived and have prodigious memories – a necessity for maintaining an internal map of a large territory with all the details of critical water sources and food and shelter and the numerous perils – all of which would change on a seasonal basis. Early humans are known to have hunted and butchered large animals, a task that requires complex plans to be developed and to be communicated throughout the group. The ability to hold and manipulate an internal mental version of the world and project this vision into the future to test out various possibilities is one of the greatest of human talents. Ranking right beside this is the unparalleled ability of humans to clearly communicate such a vision to others in the group.

The bands in those early days had few enough members that each individual could recognize all members of the band. Humans are exceedingly good at identifying individuals from facial features. Because of competition over resources, the idea of us versus them was inherent and well-developed. Strangers were necessarily treated with suspicion and relations between neighboring bands were not necessarily peaceful. However, certain types of interaction could be of great mutual benefit. These include trading of goods and skills and knowledge, and mutual defense against predators – in fact, collaboration on any urgent large project that could lead to faster, better completion.

But caution over strangers is inherent and is well-justified from the perspective of Darwinian genetics[7]. Nearby bands might have shared kinfolk, whereas distant tribes would have much less genetic similarity. Geographic separation causes the genetics to drift apart (ultimately creating new species). But the drift and divergence in culture and customs and language and tools occur much faster than in genetics. If the different bands or tribes shared some language and had many things in common and had familiar tales to tell of exciting escapades, the chances were good that they could collaborate. But a tribe from distant parts, never previously encountered, that used a foreign language and looked different and had a very alien culture would find collaboration very difficult – and not just because of the problem of communicating.

The shift to an agricultural lifestyle, occurring around 10,000 years ago, would have placed further demands on the need for strong collaboration. With the development of permanent settlements, rules for property and land ownership had to be codified and enforced. The idea of ownership, of course, intensified the possibilities for conflict. Nevertheless, settlements and villages and towns and cities continued to grow. It has been argued that the large-scale development of religions was an important factor in enabling much larger groups to form and work together. Religions propagate memorable stories or viral memes[8] and include instantly recognizable icons. Such factors in common across disparate groups may have played an early role in overcoming ethnic and cultural and language barriers and facilitating collaboration on large projects. One way or another, it does seem that the cooperative nature of humankind is winning out – or maybe the cooperative humans were the ones that survived. Sadly, many millions of people, usually innocent bystanders, have perished in the devastating conflicts that have occurred over the centuries. These kinds of tragic events continue, but, fortunately, the world is generally becoming safer for the vast majority of humankind. Today’s world is organized into very large nation states on the scale of tens or hundreds of millions of people. China and India as political entities both exceed one billion individuals. The United Nations is an organization that covers all of these nation states and ostensibly all of humankind. The term ‘Global Village’ may seem like an oxymoron but it serves well to emphasize the interdependence and interconnectedness of today’s world.

More than anything, it has been the huge advances in communications that have led to today’s global world. Information can be exchanged more quickly and more easily over greater distances than ever before. In the absence of communications, cultures tend to gradually drift further and further apart and become increasingly foreign to each other. Strong communications between geographically separated groups help to converge the cultures and languages. Increasing familiarity with and increased understanding of the ‘foreign’ group’s methods and motives and aspirations all work to reduce our concerns and suspicions and fears. Today’s ability to communicate with almost anyone anywhere without giving it a moment’s thought has served to make the ‘them’ seem much less threatening and much more like the friendly ‘us’. From the perspective of communications, we all now live in one big Global Village.

Communications

Humans had expanded to inhabit almost the entire world 20,000 years ago. But the inhabitants were totally unaware of this fact. It was not until the great sea voyages from the 1400s through 1700s by Zheng He, Columbus, Vespucci, Magellan, Tasman, Cook and many others that the entirety of the world was fully appreciated and the many far-flung outposts of humanity were tallied. A global awareness had finally been established, yet communications were measured in voyages or overland journeys that took many months or even years. Fast, reliable communications on a global scale were not really established until the late 1800s and early 1900s, when long-distance telegraph cables and radio links became available.

Communication within a primitive band of ten to a hundred is straightforward. The individuals are likely to live in close proximity or assemble together frequently, so voice and visual signals work very well to spread information quickly within the group. Over longer distances, a few tens of kilometers (or miles), it perhaps took a day or so to convey a message. Famously, Pheidippides ran 42 km (26 miles) from Marathon to Athens to carry the news of the Persians’ defeat. Horses travel about twice as fast – hence Paul Revere’s midnight ride on horseback to warn of an impending British assault, and also the short-lived Pony Express[9] featured in many Western movies. Carrier pigeons travel at about 80 km/h (50 miles per hour) – about twice as fast again as horses. Pigeons are difficult to intercept and were widely used for conveying messages and light objects as recently as the First World War (1914-18).

Sound travels about 15 times faster than a pigeon flies, but the range is fickle and limited to a few kilometers under most circumstances. War drums and horns have been used for millennia and church bells have summoned the faithful to prayer for centuries. In Hong Kong and Cape Town the tradition persists of marking the noon hour by firing a cannon to inform the town’s citizens. Powerful fog horns, originally driven by steam or compressed air, have been warning shipping of dangerous hazards during foggy conditions since the 1850s. These operate at about 300 Hz for best propagation and audibility. Under good conditions, a fog horn can be heard about 5 km (3 miles) out to sea[10].

Light travels about a million times faster than sound and would seem ideal for high-speed long-distance communication[11]. And indeed it too has been used since antiquity. ‘Heliographs’ were apparently used by the Ancient Greeks, relying on polished metal mirrors to reflect the Sun. Native Americans were noted for using visual smoke signals to transmit critical information over large distances. Fire had the advantage that it could be used in either daylight (smoke) or at night (flames). Fires readied for use and strategically located on hilltops were the basis of the beacons used by the Roman Empire for transmitting critical information (such as an attack by Barbarians). Similar beacons warned the British of the approach of the Spanish Armada. The name survives in the many “Beacon Hills” or equivalents found throughout Europe and the Americas.

The use of line-of-sight visual signaling reached a high degree of sophistication. The Chappe telegraph network was a system of optical semaphore[12] relay stations covering much of France (see diagram). The system was developed by Claude Chappe, whose uncle was the famous astronomer Jean-Baptiste Chappe[13]. Each semaphore station prominently displayed two large movable paddles mounted at each end of a large movable cross-bar. There were two telescopes for viewing the adjacent stations in the chain. Information was conveyed through eight possible positions of each of the two paddles (45 degrees apart) as well as two possible positions of the cross-bar (vertical or horizontal). Of the 128 possible combinations, there were 98 signs or symbols that could be unambiguously distinguished at a distance. Messages were encoded into these 98 symbols. The transmission rate was 2 to 3 symbols per minute, equivalent to roughly 0.3 bits/s. For a message starting in Brest, the first symbol would reach Paris within about 8 minutes. Brest is a city in the far west of France, 500 km from Paris, so the speed of transmission was a remarkable 3800 km/hr (2400 mph). The signals went through 80 ‘repeater’ stations on the route – an average of 6 seconds delay per station.
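
The quoted figures are easy to check with a little arithmetic. The sketch below (in Python, using the symbol count, message rate, and distances from the text) reproduces the information rate, the effective transmission speed, and the per-station delay:

import math

symbols = 98                                   # distinguishable Chappe signs
bits_per_symbol = math.log2(symbols)           # ~6.6 bits of information per sign
rate_bits_per_s = 2.5 * bits_per_symbol / 60   # at 2-3 signs per minute -> ~0.3 bits/s

distance_km = 500                              # Brest to Paris
transit_minutes = 8                            # time for the first sign to arrive
speed_km_per_h = distance_km / (transit_minutes / 60)    # ~3750 km/h
delay_per_station_s = transit_minutes * 60 / 80           # 80 stations -> ~6 s each

print(f"{bits_per_symbol:.1f} bits/sign, {rate_bits_per_s:.2f} bits/s")
print(f"{speed_km_per_h:.0f} km/h end to end, {delay_per_station_s:.0f} s per station")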

Napoleon Bonaparte was quick to recognize the advantage of the Chappe system and promoted its construction. The network played an important role during the Napoleonic wars (1803–1815), giving a distinct advantage to the French versus the fragmented communication amongst Great Britain and her varying allies. However, systems like the Chappe semaphore telegraph do have their drawbacks. They are easily interrupted by rain or snow or fog and they do not work at night. Also, the system required a large, diligent, and extremely attentive and accurate workforce. Accuracy was especially important since the messages were not plain text but were encrypted to prevent eavesdropping. The cost of the workforce and upkeep of the stations eventually led to the demise of the Chappe network in the 1850s in the face of competition from the new electric telegraph.

The idea of using electricity to send signals seems straightforward – all you need is a battery at one end, a long pair of wires, and then a coil and a compass needle at the other end. However, it was not until 1799 that Alessandro Volta (1745-1827) had the idea of putting together a battery or pile of Galvanic cells, and it was not until 1820 that Hans Christian Oersted (1777-1851) discovered, accidentally during a lecture, that an electric current can deflect a compass needle. These inventions/discoveries and the subsequent work by André-Marie Ampère (1775–1836), Georg Ohm (1789-1854), Michael Faraday (1791-1867), Joseph Henry (1797–1878), Wilhelm Weber (1804-1891), Carl Gauss (1777-1855), Lord Kelvin (1824-1907) and others culminated in the widespread deployment of commercial telegraph systems in the 1840s. Interestingly, all these famous gentlemen now have physical units named after them[14]. Early systems employed many parallel wires in an attempt to directly transmit alphabetic characters. The first successful ‘modern’ two-wire one-needle system was run in 1843 from Paddington station in London to Slough station – a distance of 29 km (18 miles). Samuel Morse, an American artist, was responsible for the more economical 2-wire (or 1-wire using the ground as the return path) system and developed the eponymous Morse code[15], a variant of which survives to this day[16]. The development of telegraph systems very much paralleled (literally!) the development of railroads. The train operators needed real-time information about the locations of the trains, if only for reasons of safety (early operations used a single track for travel in both directions). Plus there were clear existing rights-of-way already available. Typically a large number of bare iron or hardened-copper wires would be strung immediately adjacent to the track using porcelain or glass insulators mounted on wooden telegraph poles.

The operation of the telegraph was straightforward. At the sending or transmitting end, a human operator would read the intended message and simultaneously translate it into Morse code, sending the message manually using a Morse key (a spring-loaded on-off switch). At the receiving end was a ‘telegraph sounder’, a device rather like an electromagnetic relay but designed to create sound. The incoming wire was wrapped in multiple turns around a soft iron core to form a solenoid or electromagnet. When current flowed, a small spring-loaded iron armature was attracted to the electromagnet, resulting in an audible ‘click’. When the current stopped flowing, the armature fell back against a mechanical stop and made a ‘clack’ sound. A human operator listened attentively to the clicks and clacks, interpreted the sounds back into alpha-numeric characters, and transcribed them back onto paper in the form of a ‘telegram’. On good links, telegraph operators would work at about 20 words per minute (around 8 bits/second).
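
As a rough illustration of where the 8 bits/second figure comes from, here is a small Python sketch. The Morse table is just a fragment, and the five-characters-per-word and five-bits-per-character conventions are assumptions used only for this estimate:

MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}   # tiny fragment of the full code

def encode(text):
    # Translate text to dots and dashes (letters not in the fragment are skipped).
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

print(encode("SOS"))                      # ... --- ...

words_per_minute = 20                     # a good landline operator
chars_per_word = 5                        # conventional 'word' length (assumed)
bits_per_char = 5                         # rough information content of plain text (assumed)
print(words_per_minute * chars_per_word * bits_per_char / 60, "bits/s")   # ~8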

To cover very long spans, relays or repeaters were required. Initially, these were human repeaters, but the job became automated using sensitive electromechanical relays. In early versions, the incoming line drove a solenoid that caused a small, very lightly-sprung iron armature to dip into a pool of mercury, closing the outgoing circuit. Each repeater needed a local battery providing the hundred volts or so that was required to drive the outgoing line. Although lines of several hundred kilometers (up to 400 km or 250 miles) could be constructed and used, in practice, repeaters were placed about every 40 km (25 miles) to assure reliable, accurate duplication of the signals on each of the many sections of the line. By 1861, telegraph lines were providing ‘instant’ communications across the 5000 kilometers (3000 miles) between the East and West coasts of America (bringing about the early demise of the Pony Express).

But how to span the large bodies of water separating the continents? This was the remaining challenge in completing a truly global network. The route across the North Atlantic from Ireland to Newfoundland was around 4000 km (2500 miles) and submarine repeaters on the ocean floor were not an option at that time. Remarkably, by 1858, the first cable to physically span the Atlantic had been completed. Queen Victoria of the United Kingdom sent a telegram of congratulations to President James Buchanan of the United States. Data rates were painfully slow – about two minutes per character (0.04 bits/s). Even simple messages took many hours. Within a month the cable was dead – its insulation destroyed by the 2 kilovolts of signal applied in trying to reach the other end of a rapidly degrading cable[17]. Interestingly, the best insulating material used for all these cables right up through the 1930s was ‘gutta percha’, a natural latex produced from the sap of a tropical tree, Palaquium gutta, found on the Malay peninsula. Several further attempts with new cables were made in the following years. Finally, in 1866, a more lasting success was achieved with a more robust cable that had copper conductors three times thicker than before[18] – thus requiring much lower voltages.

Communications across the Atlantic had been suddenly cut from the ten days it took for the ocean voyage down to the few minutes it took for the operators to handle the message at either end of the new cable. The submarine telegraph cable business expanded rapidly thereafter and became dominated by British interests. By 1876, almost the entire British Empire was interconnected by a vast telegraph network stretching all the way from Canada to London to Aden to India to Singapore to Australia and ending in New Zealand. The Pacific Ocean itself was not spanned by a cable (via Hawaii and the Philippines) until 1903.

The near monopoly of cables and cable-laying ships by the British proved very much to the detriment of Germany in World War I (1914-18). Britain severed all the German submarine cables right at the start of the war. Around the turn of the century, however, Guglielmo Marconi had started developing a wireless telegraphy system based on electromagnetic waves (radio). In 1901, with a powerful transmitter and a massive wire antenna system, he was able to transmit a signal across the Atlantic from Cornwall in England to St. John’s in Newfoundland[19]. Early ‘spark’ transmitters relied on the negative resistance[20] characteristics of an electric arc discharge in air. The receiver was a ‘coherer’. This was a tube containing a loosely packed iron powder and two electrodes. When a radio frequency signal was applied, the particles would ‘cohere’ (cling together) and the DC resistance of the powder would drop. The change in resistance could be detected by an external circuit with a battery and a sensitive galvanometer. After a signal was received, someone had to give a few taps on the side of the tube to ‘decohere’ the iron particles and return them to their original higher-resistance state, ready for the next signal. The introduction of ‘cat’s whisker’ crystal detectors (point-contact semiconductor diodes) around the turn of the century made the detection process much more sensitive, but it was the invention of the vacuum tube amplifier by Lee De Forest in 1906 that was the real turning point that made wireless telegraphy a practical reality. As suggested above, the German forces in World War I (1914-18), out of necessity, made very good use of wireless telegraphy. The development of strong capabilities in wireless telegraphy became an important strategic element in every nation’s defense, prompted by the obvious ease with which submarine cables could be severed and destroyed. Today’s cables use very different technology (optical fibers) and carry massive amounts of traffic, but they are equally vulnerable to sabotage.

The huge distances over which certain wireless radio signals could propagate around the world came as a big surprise. This was especially true of the shorter wavelengths between 100 and 10 meters (3 to 30 MHz). Strangely, often a signal could be picked up thousands of miles away that could not be heard at all just 10 miles away. It was conjectured that the tenuous upper atmosphere became ionized by solar radiation and served to reflect radio waves arriving at shallow angles. This proved indeed to be the case – the so-called ‘ionosphere’ ranges from about 60 to 1000 km (40 to 600 miles) above the Earth and has a high density of free electrons ionized by UV sunlight and the solar wind. Truly global coverage can be achieved with short-wave radio. Signals become trapped between the ionosphere and the reflective surface of the Earth and, essentially, are confined in only two dimensions rather than three. Under the right conditions, just a few watts of transmitted power can be picked up half-way round the world.

Morse code proved ideal as the preferred mode for short-wave radio. It was easy to key or switch the transmitter on and off rapidly to convey the Morse characters. At the receiving end, the incoming signal was added together with a sinusoidal signal that was generated locally by a vacuum tube oscillator. The locally generated signal was carefully tuned to be about 600-800 Hz above or below the incoming frequency. The two signals together were fed into a nonlinear device such as a point-contact diode. As a result of this mixing, a clearly audible beat frequency (difference frequency) arose that could be listened to with headphones. The human ear proved remarkably sensitive in detecting the distinctive tone and could easily pick out the desired signal from much larger competing interference and noise. Morse code on short-wave radio played an important role in World War II in communications for resistance fighters and operatives behind the front lines. From the many movies about that period, when we now think of Morse code, we think of a series of ~700 Hz ‘beeps’ or ‘dah-di-dah-dits’ (─ • ─ •) rather than the click-clack of the land-line telegraph sounder.
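
The mixing process is easy to demonstrate numerically. The minimal Python/NumPy sketch below uses a 10 kHz stand-in for the radio-frequency carrier (not a real short-wave frequency): it squares the sum of the incoming signal and a local oscillator 700 Hz away, mimicking the diode, and the audible 700 Hz beat pops out:

import numpy as np

fs = 48_000                                      # sample rate, Hz
t = np.arange(fs) / fs                           # one second of samples
rf = 0.1 * np.sin(2 * np.pi * 10_000 * t)        # weak "incoming" carrier (stand-in)
lo = 1.0 * np.sin(2 * np.pi * 10_700 * t)        # local oscillator, 700 Hz higher
mixed = (rf + lo) ** 2                           # square-law detector (ideal diode)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
audible = (freqs > 100) & (freqs < 3_000)        # keep only the audio band
print(freqs[audible][np.argmax(spectrum[audible])])   # -> 700.0 Hz beat note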

Short-wave radio propagation is, however, notoriously fickle. There are strong diurnal and annual cycles that are predictable and also a strong 11-year cycle corresponding to the sunspot cycle (the underlying mechanism is the Sun reversing its North and South magnetic poles every 11 years, returning to the same polarity every 22 years). But all of this is totally subject to the caprices of the solar weather. A solar flare or storm can totally disrupt all short-wave communications for many days. Only in recent years have relatively accurate predictions of solar weather been possible, aided by satellite observations of the Sun. The electromagnetic radiation (UV and X-rays) takes only 8 minutes to reach the Earth, but the large outbursts of particles (coronal mass ejections) allow a day or more of warning before they reach the Earth.

In the 1950s and 1960s, microwave radio links were established across the continents. These typically operated in the 1 to 10 GHz regime where the wavelength is short enough (30 to 300 mm) for highly directional parabolic antennas to be built but not so high that atmospheric attenuation (rainfall) became a problem. These links were strictly line-of-sight with spacings between the microwave repeaters of several tens of kilometers or miles. The repeater towers were typically situated on hilltops – sometimes on the same hilltops occupied earlier by optical semaphore repeater stations.

Spanning the Atlantic with a cable for telephone communication was more of a challenge – much more bandwidth[21] is required for voice. This feat was not accomplished until 1956, when polythene had become available as an insulator and vacuum tubes had become sufficiently reliable to support building submarine repeaters. The first telephone cable, TAT-1, was a coaxial design and had 51 vacuum-tube repeaters (yes, with incandescent cathodes happily glowing several miles down under water) actually built into the cable. These repeaters were strung along the length of the cable at intervals of 69 km (43 miles). They were powered by DC current fed from either shore. Frequency-division multiplexing allowed 36 telephone channels, each 4 kHz in bandwidth, to occupy the single coaxial line. There were two separate cables laid – one to serve each direction. TAT-1 was still working in 1978 when it was retired from service, supplanted by much higher capacity cables (TAT-6, installed in 1976, carried 4,000 telephone channels).

During this period, geostationary satellites also played a role in carrying telephone communications. However, the loop delay (back and forth between customers) exceeded half a second and was severe enough to be a noticeable distraction in normal telephone conversations. The most important role for geostationary satellites has always been that of providing a broadcast platform (covered in the previous chapter on the Clarke orbit).

None of these telephone or telegraph cables or radio links could have supported anything like the modern internet. Today’s cables support Terabits per second of data flow – the equivalent of billions of telephone channels. A major revolution in technology occurred in the 1970s and 80s when transmission using light beams through optical fibers became feasible. Since that time, data-rates on optical fibers have increased exponentially – more than doubling every two years – faster than the growth rate spelled out in Moore’s Law[22].

In 1963, Charles Kao, fresh from receiving his Ph.D. in electrical engineering from University College, London, joined the research department of STL in Harlow, Essex. His task was to investigate why light suffered significant attenuation as it passed down a long light guide. After studying a collection of various glass samples, he reached the conclusion that the glasses had intrinsically very low loss and that it was the trace impurities in the glass that caused the loss. He was so convinced of this that he proposed the idea of using ultra-pure silica optical fibers for long-distance communication. This idea was ridiculed at the time. It was difficult to couple light into fibers and the existing fibers exhibited thousands of decibels of attenuation per kilometer, compared with copper coaxial cables that offered 5 to 10 dB of loss per kilometer.

The first problem, coupling efficiency, was solved by the development of the laser, which was occurring around the same time. Lasers can provide an intense, tightly-collimated (highly-parallel) beam of light suitable for launching into an optical fiber. To achieve high coupling efficiency, the light has to be launched in a very narrow beam at a very shallow angle along the axis of the fiber (see the tutorial box on optical fibers at the end of this chapter).

The second problem was propagation loss. An optical fiber comprises a transparent core with a higher index of refraction surrounded by a transparent cladding with a slightly lower index of refraction. Light trying to leave the core suffers ‘total internal reflection’ at the interface between the core and the cladding and cannot escape. Total internal reflection at a dielectric interface is 100% efficient and differs from reflection off a metallic surface (mirror), which may be only 90% efficient. Given this very effective confinement of the ray, the emphasis of research was on minimizing the absorption of light by the impurities in the glass. In 1970, for the first time, Corning Glass managed to create fibers with less than 20 dB/km loss. This was the maximum loss considered acceptable for a communications cable. As predicted by Kao, the material was ultra-pure silica. Modern optical fibers have losses as low as a few tenths of a dB per kilometer. These are single-mode[23] fibers – also as predicted by Kao. In 2009, Charles Kao was awarded the Nobel Prize in Physics “for groundbreaking achievements concerning the transmission of light in fibers for optical communication”.

There has been a gradual shift away from multimode fibers towards single-mode fibers. Multimode fibers are designed with a much larger core diameter, 50-100 micrometers. This is much larger than the wavelength of light, with the consequence that light can propagate down the fiber in many different waveguide modes (think of a mode as corresponding to the angle at which the light is ricocheting down the core). Unfortunately, different modes propagate at different speeds (shallower angles travel faster), with the result that an initially narrow pulse gradually gets wider and wider as it travels down the fiber (a phenomenon called dispersion). In other words, the available frequency response (bandwidth) decreases rapidly with length. On the plus side, multimode fibers are much easier to launch light into and to splice together. Multimode fibers are used for short links of a few hundred meters, often supporting networks just within buildings.
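
A rough estimate shows why this modal dispersion matters. In the Python sketch below, the core index of 1.50 and the 1% core/cladding index difference are illustrative assumptions; the steepest guided ray travels a longer zig-zag path than the axial ray, and the spread in arrival times limits the usable bandwidth:

c = 3.0e8                  # speed of light in vacuum, m/s
n1 = 1.50                  # core refractive index (assumed)
delta = 0.01               # fractional index difference (n1 - n2)/n1 (assumed)
length_m = 1000.0          # one kilometer of fiber

pulse_spread_s = length_m * n1 * delta / c          # classic step-index estimate
print(pulse_spread_s * 1e9, "ns of pulse spreading per km")             # ~50 ns
print(1 / (2 * pulse_spread_s) / 1e6, "MHz usable bandwidth, roughly")  # ~10 MHz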

Single-mode fibers, in contrast, are designed with a very small diameter core, just a few microns in diameter, and with a refractive index only perhaps 0.2% higher than the surrounding cladding. This ensures that only a single mode can propagate down the fiber eliminating the dispersion problem arising from multiple modes. Single-mode fibers also have the lowest loss. On the down side, extreme precision is required in coupling into and out of the fiber and making splices between fibers. Single-mode fibers are typically used for long-haul applications over distances of kilometers or more.

Single-mode fibers allow a technique known as wavelength-division multiplexing, where one hundred or more distinct optical wavelengths (frequencies) can pass down a single fiber without mutually interfering. Each wavelength channel can carry data at over 100 Gigabits per second. As a recent example, MAREA[24] is a transatlantic communications cable running between Virginia Beach, United States and Bilbao, Spain. It is owned and funded by Microsoft and Facebook, and operated by Telxius, a subsidiary of Spanish telecom Telefónica. It began operation in February 2018. The full length of the cable is 6600 km (4000 miles) and it weighs in at about 4,650 tonnes. The cable carries just eight carefully-protected optical fibers but is capable of a remarkable 160 Terabits per second combined transmission speed.
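
The arithmetic behind such a headline number is straightforward. In the sketch below, the 100 Gbit/s per wavelength is the representative figure from the paragraph above, not MAREA’s actual channel plan:

total_capacity_tbps = 160          # quoted combined capacity, Tbit/s
fibers = 8                         # optical fibers in the cable, as quoted
per_fiber_tbps = total_capacity_tbps / fibers              # 20 Tbit/s each
per_wavelength_gbps = 100          # representative channel rate (assumed)
wavelengths = per_fiber_tbps * 1000 / per_wavelength_gbps
print(per_fiber_tbps, "Tbit/s per fiber =", int(wavelengths), "channels at 100 Gbit/s")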

At the other end in terms of scale is the “last mile” connection. This refers to the final connection to the myriad customers with their individual computers and other devices. This last link is from a central office or computer or ‘server’ that might service a few thousand individuals typically within a mile (kilometer) or so radius. For most of the 20th century, life was simple. Two-way voice communications (and some data via modems) came in over a twisted copper pair from the local telephone exchange and the radio and television signals were picked up wirelessly from nearby broadcast stations. This has all changed in the 21st century. Many residences now have high-bandwidth coaxial cable or optical fiber links directly into the home and cell-phones (mobile-phones) have become almost ubiquitous offering both voice and data services. (We will spend more time on “the last mile” technologies in Chapter E+3 which is the 1 km length scale.)

The Internet

The term “internet” refers to the global network of interconnected computer systems that has developed over roughly the last 30 years. It comprises a vast array of public and private, academic and business, government and non-government computer networks. All these are linked together by high-speed connections (typically optical fiber) using a universal set of protocols called TCP/IP (Transmission Control Protocol / Internet Protocol). The internet provides a transmission medium for applications such as the World-Wide-Web (WWW), electronic mail (e-mail), retail portals (Amazon.com and eBay), file-sharing, podcasts, telephony (VoIP), live streaming of audio and video[25] and a host of other applications.

The essence of the internet is that the data to be transferred is broken down into small self-contained packets that make their own way independently through the network. The nodes in the network are deliberately ‘dumb’ in that their sole job is to receive and forward packets to their neighbors, with no awareness of the contents of the packets and with no need for any record-keeping. There is no global intelligence or control of the network at all. The computers at the nodes are referred to as ‘routers’. Cisco is a dominant player in providing routers for the internet. Each small data packet includes a header with information about its source, its relation to the original data, and, of course, the intended destination. The header is from 20 to 60 Bytes and the ‘payload’ of each packet might be ~1000 Bytes of data, depending on the physical transmission system. The TCP/IP protocols control the routing, detect errors or failures, make acknowledgments or retransmission requests, re-route packets to avoid congestion, and finally re-order packets that may be received out of order (different packets in the sequence may take totally different routes between the same source and destination). TCP/IP is inherently robust and intended for implementation on any physical transmission layer. This has even included carrier pigeons[26].
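
The Python sketch below is a toy illustration of the packetizing idea only; it is not real TCP/IP, and the field names, addresses, and the ~1000-byte payload size are simply the illustrative values mentioned above:

import random

def packetize(data, src, dst, payload_size=1000):
    # Split the data into self-addressed packets; 'seq' records the byte offset.
    return [{"src": src, "dst": dst, "seq": i, "payload": data[i:i + payload_size]}
            for i in range(0, len(data), payload_size)]

def reassemble(packets):
    # Put the packets back in order, however they arrived, and rejoin the payloads.
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

message = b"x" * 4500
packets = packetize(message, "192.0.2.1", "198.51.100.7")
random.shuffle(packets)                  # packets may take different routes
assert reassemble(packets) == message
print(len(packets), "packets")           # -> 5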

The history of the internet[27] goes back to the 1970s, starting with ARPANET (Advanced Research Projects Agency Network). The project was initially funded by the US Department of Defense, which had an interest in a network that would survive a nuclear war. The project linked various universities and military organizations across the US, and in 1973 added NORSAR[28], Kjeller, Norway, and University College, London[29], UK. It was within ARPANET that packet-switching and the TCP/IP protocol were first implemented. In the 1980s, the US National Science Foundation funded expansions of the network, eventually providing connectivity to the major super-computer centers. In 1990, ARPANET, as such, was decommissioned and the network repurposed and enlarged to include commercial enterprises. At this point it began to take on its modern role of a neutral transmission medium servicing a wide variety of functions. Since that time, the internet has expanded literally exponentially to span the globe[30] and to become an integral part of almost everyone’s life.

The World Wide Web (WWW) or Web is one of the biggest and most obvious applications on the Internet. The term ‘Web’ is sometimes used interchangeably with ‘Internet’, but strictly the World Wide Web (WWW) refers to all the many billions of web-page documents and their interconnecting hypertext links. These web-pages are addressed by their unique URLs (Uniform Resource Locators), which comprise the access protocol plus the host name plus the file name[31] (e.g. ‘https://www.ieee.org/about’). The Hypertext Transfer Protocol (HTTP, or HTTPS for ‘-secure’)[32] is the protocol used to request and deliver these documents. Host names (e.g. ‘www.ieee.org’) are unique case-insensitive alpha-numeric strings that must be registered under the corresponding top-level domain. The actual file or document (e.g. ‘about’) would typically be written with Hypertext Markup Language (HTML)[33]. The two essential features of HTML are its ‘reflowable’ text and, obviously, its inclusion of hypertext links. Reflowable refers to the ability of the written text to format itself appropriately to fit whatever screen it is to be viewed on, whether it be a tiny cell-phone or a gigantic high-resolution computer display. Hypertext refers to the presence of hidden links behind selected words or phrases or images on the web-page. These words are usually highlighted in the text to identify them. Then, a simple click or touch will automatically bring up another web-page or document containing relevant, more-detailed information. The gateway into the Web is referred to as a ‘browser’. Browser software is tuned to provide clear formatting of web pages and ease of working with hyperlinks. However, for most users the browser’s most important companion is the search-engine.
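
Python’s standard library can take the example URL apart into exactly those pieces:

from urllib.parse import urlparse

parts = urlparse("https://www.ieee.org/about")
print(parts.scheme)   # 'https'         - the access protocol
print(parts.netloc)   # 'www.ieee.org'  - the host name
print(parts.path)     # '/about'        - the document on that host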

As of 2018, the World Wide Web is estimated to include approximately 2 billion web-sites hosted on about seven million computers[34]. Finding exactly the right piece of information among all these sites is obviously a daunting task. This challenge led to the development of ‘search-engines’ starting in the 1990s[35]. By the mid-1990s, there were a number of search-engines vying for supremacy: Magellan, Excite, Infoseek, Inktomi, Northern Light, AltaVista, Yahoo!, and Google. By 2000, Google had risen to prominence based on the speed and relevance of its search results. A key aspect of Google’s search engine was an iterative process whereby page results were ranked based on the number of links into the page and the ranking of the pages that the links came from[36].
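
A heavily simplified version of that iterative, link-based ranking idea [36] can be written in a few lines of Python; the four-page ‘web’ below is invented purely for illustration, and the damping factor of 0.85 is the value conventionally used in the published algorithm:

links = {                     # page -> pages it links to (made-up example)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
damping = 0.85
rank = {page: 1.0 / len(links) for page in links}
for _ in range(50):                       # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(rank[p] / len(links[p]) for p in links if page in links[p])
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank
print(sorted(rank.items(), key=lambda kv: -kv[1]))   # 'C' ranks highest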

A search engine has three indispensable components: web crawling, indexing, and searching. Web crawling is done by robot “spiders” that continuously visit one site after another and report back on the content they find. ‘Indexing’ means compiling a list of key words and phrases together with links into the appropriate web-pages. The final ‘search’ then involves referencing into the index and using proprietary algorithms to rank and present candidate pages to the customer. Search engines like Google have become indispensable for both formal research and casual enquiries. Indeed, the creation of this book has been totally dependent on access to the web plus the use of a good search engine!
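
At its core, the indexing step is just a mapping from each word to the set of pages containing it. A minimal sketch (with two made-up pages) follows; real search engines layer stemming, phrase handling, and ranking signals on top of this:

from collections import defaultdict

pages = {
    "earth.html": "earth photographed from deep space",
    "cables.html": "submarine cables span the deep ocean",
}
index = defaultdict(set)                  # word -> set of pages containing it
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

print(sorted(index["deep"]))      # both pages mention 'deep'
print(sorted(index["ocean"]))     # only cables.html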

As well as the World Wide Web (and, of course, e-mail), there are many other internet applications that have risen to prominence, notably, on-line retail outlets such as Amazon and Alibaba and eBay, social media such as Facebook and WeChat and Twitter, and video services such as Netflix and YouTube and Tudou. This last group of services that rely on video (moving images) is particularly thirsty in terms of data bandwidth and data storage.

Cloud Storage

The term “Cloud” is a nebulous concept – as indeed the name suggests. In the context of the internet, a formal definition might be that the Cloud refers to resources provided on an as-needed basis by a third party. The third party is typically remote from the customers and specializes in providing the support that might otherwise require a separate IT (information technology) department in a small business. The resources provided to a particular customer are usually a small portion of the total capability available at the provider. The resource provided to the customer is ‘virtual’ in that at any time it may reside on any machine in any location. In fact, for reasons of reliability and security, an effort may be made to distribute the task widely across a number of computers. However, the customer has no need to know ‘where’ and ‘how’. It all happens somewhere in the nebulous ‘cloud’. The customer’s only concern is that the system should seamlessly provide accurate responses in a timely manner.

Although the name and the concept of cloud computing had been around for some time, it was the launch of Amazon Web Services in 2006 that provided the first major foray and established the business model. Amazon’s entry was followed in 2008 by Google. Today cloud computing is a huge business and there are many major players (many with familiar names) including Adobe, Alibaba, Dropbox, Huawei, IBM, Microsoft, 1&1, Red Hat, Salesforce, Oracle, SAP, Tencent, Verizon, VMware, etc.

The Cloud involves aspects of computation and aspects of storage. Most applications involve both. However, it is the data storage aspect of the cloud that is of more interest to us here. In fact, irrespective of how the cloud is defined, we now go on to discuss the total world-wide demand for data storage and the devices that try to keep up with this huge demand.

Total data creation for 2018 is estimated at 30 ZettaBytes (30 × 10²¹ Bytes). The recent growth rate has been about 30% per year. If this growth rate continues exponentially, the data created would reach about 140 ZettaBytes (140 × 10²¹ Bytes) by 2025[37]. However, there is an increasing gap between the amount of data being created and the amount of data that can be stored economically. A lot of data has only a fleeting existence before it is either discarded or is processed to extract the essential information. Photographs and videos may be captured in glorious full resolution, but they typically undergo immediate lossy compression for transmission and storage (into jpeg or mpeg formats, for example). Facebook and similar sites, out of necessity, automatically apply their own compression algorithms to any photographs or videos submitted, no matter how wonderful the quality and resolution of the original images may be.
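
Compound growth of this kind is easy to project, and the outcome is very sensitive to the assumed rate and starting point, which is one reason published forecasts differ. A quick check in Python:

start_zettabytes = 30                  # estimated data created in 2018
for annual_growth in (0.25, 0.30):     # two plausible growth rates
    projected = start_zettabytes * (1 + annual_growth) ** (2025 - 2018)
    print(f"{annual_growth:.0%} per year -> {projected:.0f} ZettaBytes created in 2025")
# 25% per year gives ~143 ZB; 30% gives ~188 ZB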

Increasingly, data is generated locally by consumers, with smart-phones (cell-phones or mobile phones) creating photographs and videos that tend to produce large data files. In addition, there is the much-vaunted Internet of Things (IoT). This refers to the wide variety of devices in our lives that are now being endowed with sufficient electronic intelligence to report their status and pertinent information about their environment. All these devices typically have some local storage, but the almost universal availability of high-bandwidth access to the internet and the low cost of bulk storage devices like Hard Disk Drives (HDDs) means that a huge amount of this data ends up stored remotely in large centralized data-centers managed by companies like Amazon or Facebook or YouTube or Tencent. In other words, the data is stored in the Cloud.

In many situations, the time to access data can be important. Human beings start noticing if system responses take more than a few hundred milliseconds. Solid-State Drives (SSDs) have response times of less than a millisecond and Hard-Disk Drives (HDDs) are more like 10 milliseconds (assuming queries are not queued up). Magnetic tape, even in a fully automated library, takes many seconds to retrieve a piece of data. So SSD and HDD are more than fast enough to keep humans happy, but magnetic tape requires more patience and is generally confined to back-up and archive. However, even a simple search in a browser can automatically launch thousands of queries. Also, human beings are not the only entities using the system. There are various pieces of software (e.g. the web-crawlers mentioned earlier) continually running on the internet. These generate large amounts of traffic and expect very rapid responses.

Speed of response is also limited by geographic separation. Light in optical fibers travels at about 200,000 km/s (124,000 mi/s). This is about 2/3 of the speed of light in a vacuum, corresponding to the fiber’s refractive index of about 1.5. The circumference of the Earth is 40,000 km[38]. It follows that the best possible response time between two places linked by optical fibers on opposite sides of the world is 200 ms. Conversely, if we desire a response time of 10 ms or better (similar to an HDD), then the data-center needs to be within 1000 km (~620 mi) of the customer. In practice, these times and distances are far too optimistic. Cables are generally not routed directly ‘as the crow flies’ and the signals also get slowed by the numerous nodes and repeaters that they must traverse. The bottom line is that data-centers generally have to be within a few hundred kilometers or miles of their customers if they are to support a workload that requires any kind of high-speed transaction processing. This is an important consideration in determining the number and locations of data-centers to serve a given geography and population. It is also a factor in deciding how much SSD vs. HDD capacity to install in a data-center.
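
These best-case figures follow directly from the 200,000 km/s number quoted above; real routes are longer than great-circle distances and add equipment delays. A minimal check:

FIBER_SPEED_KM_PER_S = 200_000          # ~2/3 of c, as quoted in the text

def round_trip_ms(distance_km):
    # Best-case round-trip time over fiber, ignoring routing and equipment delays.
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

print(round_trip_ms(20_000))   # antipodes (half Earth's circumference) -> 200.0 ms
print(round_trip_ms(1_000))    # data-center 1000 km away -> 10.0 ms, like an HDD access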

An amusing pathological case, not unrelated, occurs in the secretive world of high-speed computer trading, where every nanosecond counts[39]. It has been widely reported that some of these trading companies have installed their own microwave links between stock exchanges and data-centers and from exchange to exchange. The reason for this is that microwaves travel from tower to tower at very close to the speed of light, 300,000 km/s, giving a significant advantage over competitors who may be using conventional optical fiber. If this sounds like a backwards step, then consider the recent evidence that shortwave radio is being used to provide very low latency links across the Atlantic ocean[40]. These provide the lowest latency possible, albeit with very low bandwidth, but perhaps able to transmit just enough critical information to trigger a ‘buy’ or ‘sell’ action on a nearby exchange.

As mentioned earlier, there exists a distinct hierarchy of storage devices. SSDs provide the fastest response but are most expensive in dollars per GByte. HDDs are slower but less expensive. Magnetic tape libraries are the slowest (sometimes referred to as “near-line”) but are by far the least expensive in dollars/GByte. A significant part of the cost is in the power consumed by the storage devices and the power required to keep them cool. Great care is taken in data centers to understand the usage of the data and place or migrate the data to the appropriate kind of storage device. We will be talking much more about the storage hierarchy and the design of data-centers in chapter +2 on the 100 meter scale.

To conclude the chapter, we note that the majority of the world’s on-line data is stored on Hard Disk Drives (HDDs). The total capacity of SSD, HDD, and tape shipped in 2017 was 793 ExaBytes (793 × 10¹⁸ Bytes)[41]. In 2018, that figure is expected to be close to one ZettaByte (1,000 ExaBytes). An estimated 400 million HDDs are expected to be shipped in 2018[42], accounting for about 80% of that data capacity. We will be talking much more about the remarkable HDD in the coming chapters as we zoom in on it. Much of the innovation in the HDD business, and in Information Technology generally, has taken place in California, in the San Francisco Bay Area and ‘Silicon Valley’ and its largest city, San Jose.

Further Reading:

Humans, Scientific American, Vol. 319, No. 3, September 2018

G. Holzmann, B. Pehrson, “The Early History of Data Networks”, Wiley-IEEE Computer Society Press, December 1994.

George Prescott, History, Theory, and Practice of the Electric Telegraph, Ticknor and Fields, 1860

B. Dormon, “How the Internet works: Submarine fiber, brains in jars, and coaxial cables”, Ars Technica, May 26, 2016

B. McCullough, How the Internet Happened, Liveright Publishing, Oct. 23, 2018

https://arstechnica.com/information-technology/2016/05/how-the-internet-works-submarine-cables-data-centres-last-mile/

D. Reinsel, J. Gantz, J. Rydning, “Data Age 2025”, IDC White Paper, April 2, 2017, https://www.seagate.com/our-story/data-age-2025/

G. Schulz, Cloud and Virtual Data Storage Networking, Auerbach Publications, Aug. 26, 2011

(see the tutorial box below in blue)


Optical Fibers

At a basic level, the operation of optical fibers relies on the phenomenon of total internal reflection. Light rays travelling at a very shallow angle down a glass rod will be completely reflected from the internal surfaces and will bounce back and forth within the rod with very little loss. This idea extends to very thin glass fibers where the light is confined by total internal reflection at the interface between a core with a high refractive index and a cladding material with a lower refractive index. The difference in refractive index is small – < 1%.

Refractive index, n, is the degree to which light slows down in the medium. For example, in glass with a refractive index of n = 1.5, light travels at 2/3 of the speed that it does in a vacuum. The change in speed at an interface causes some light to be reflected and some to pass through. The refracted light that passes through travels in a different direction. The angles of the incident and refracted rays, measured from the normal to the interface, obey Snell’s Law: n1 sin(i) = n2 sin(r). Examining the equation, one realizes that for n1 > n2 there is a critical value of the incident angle, i, beyond which no refracted ray can exist – equivalently, any ray striking the interface at a sufficiently shallow (grazing) angle cannot escape. This situation corresponds to total internal reflection, with 100% of the ray being reflected off the interface. This differs from reflection off a metallic surface (mirror), which may be only 90% efficient. Given this very effective confinement of the ray, it remained only to find a very low-loss glass material for transmission. This was first achieved in the 1970s by Corning Glass using ultra-pure silica with germanium oxide doping to achieve the higher-index core.
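
For typical fiber indices, the critical angle is easy to compute. In the short Python sketch below, the 0.2% core/cladding index difference is the figure quoted earlier in the chapter, while the core value of 1.45 is an assumed representative number:

import math

n_core = 1.45                                  # representative core index (assumed)
n_clad = n_core * (1 - 0.002)                  # cladding 0.2% lower, as in the text
critical_deg = math.degrees(math.asin(n_clad / n_core))   # measured from the normal
print(f"critical angle: {critical_deg:.1f} degrees from the normal")
print(f"rays within {90 - critical_deg:.1f} degrees of the fiber axis stay trapped")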


References

  1. Public lecture at Cornell University, October 13, 1994, http://www.bigskyastroclub.org/pale_blue_dot.html https://fettss.arc.nasa.gov/collection/details/the-pale-blue-dot/
  2. decay-levels at the Earth’s formation were many times higher than today’s levels
  3. Hadean – an adjective describing Hades, the underworld, or hell
  4. Panspermia refers to the idea that simple lifeforms like bacteria could potentially survive a transfer from one planet to another inside meteorites. https://en.wikipedia.org/wiki/Panspermia
  5. Ozone or trioxygen, O3, is a rare form of oxygen less stable than O2. It has a distinctive pungent smell. It is present in trace amounts in the atmosphere with the highest concentration occurring in the stratosphere, where it absorbs harmful ultra-violet light around a wavelength of 250 nm.
  6. New Zealand, Madagascar and Hawaii are large exceptions. These were not reached by humans until historic times.
  7. Richard Dawkins, The Selfish Gene, Oxford University Press, 4th edition, 2016
  8. The word ‘meme’ was coined by analogy with the word ‘gene’ to express the propagation and evolution of ideas in a culture. The word first appears in The Selfish Gene by Richard Dawkins.
  9. https://en.wikipedia.org/wiki/Pony_Express
  10. http://www.oceannavigator.com/January-February-2003/Electronic-fog-signals/
  11. http://people.seas.harvard.edu/~jones/cscie129/papers/Early_History_of_Data_Networks/The_Early_History_of_Data_Networks.html
  12. Telegraph = ‘distance-writing’ and Semaphore = ‘sign-bearing’, both coined in French from Greek roots.
  13. Jean-Baptiste Chappe is famous for his dogged pursuit of the transit of Venus, first in 1761 in Siberia and again in 1769 in Baja California where he fell ill and died. The successful observations were critical in establishing the size of the Solar System.
  14. The Volt is the unit of electromotive force (emf). The Oersted is the cgs unit of magnetic field strength.

    The Ampere is the unit of electric current.

    The Ohm is the unit of electrical resistance.

    The Farad is the unit of electrical capacitance.

    The Henry is the unit of inductance.

    The Weber is the unit of magnetic flux.

    The Gauss is the cgs unit of magnetic flux density.

    The Kelvin is the unit of absolute temperature.

  15. Morse code is a binary signaling method with current either on or off. Each alphabetic letter or numeral or punctuation mark is represented by a sequence of short and long pulses of current (dots and dashes). For example, “SOS” is transmitted as “dot-dot-dot dash-dash-dash dot-dot-dot”. Dashes are three times longer than dots. The space between letters is equivalent to one dash.
  16. http://www.telegraphlore.com/morse_misc/morse_rescue.html
  17. https://www.wired.co.uk/article/transatlantic-cables
  18. http://theinstitute.ieee.org/tech-history/technology-history/first-successful-transatlantic-telegraph-cable-celebrates-150th-anniversary
  19. https://ethw.org/Milestones:Reception_of_Transatlantic_Radio_Signals,_1901
  20. There is a regime for an electric arc where the voltage across the ‘spark’ gap decreases even though the current is increasing. This provides the condition for a self-sustaining oscillation. The frequency of operation was largely determined by the scale of the antenna and associated circuits. Due to the unruliness of the discharge, spark transmitters were very noisy (did not produce a clean sine wave) and caused massive interference between stations. They were banned in 1930.
  21. This refers to how wide a band of frequencies can be used for transmission. The wider the bandwidth – the more quickly in time the system can respond. A Morse telegraph channel can get by with less than 100 Hz of bandwidth whereas 3,000 Hz is considered the minimum for an analog telephone channel.
  22. Gordon Moore, a cofounder of both Fairchild and Intel, predicted in 1975 that the number of transistors in integrated circuits would double every two years.
  23. Single-mode optical fibers are more difficult to couple light into and to splice but offer lower loss and wider bandwidth. All long-haul fibers are single-mode.
  24. https://en.wikipedia.org/wiki/MAREA https://news.microsoft.com/features/microsoft-facebook-telxius-complete-highest-capacity-subsea-cable-cross-atlantic/
  25. Telephony and streaming use versions of protocol that emphasize timeliness and continuity of delivery. TCP/IP emphasizes accuracy and completion of delivery.
  26. https://en.wikipedia.org/wiki/IP_over_Avian_Carriers
  27. https://www.internetsociety.org/internet/history-internet/brief-history-internet/
  28. https://www.norsar.no/about-us/history/
  29. http://www0.cs.ucl.ac.uk/csnews/internet_pioneers.html
  30. https://global-internet-map-2018.telegeography.com/
  31. https://en.wikipedia.org/wiki/URL
  32. https://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol
  33. https://www.w3schools.com/html/html_intro.asp
  34. https://news.netcraft.com/archives/2018/01/19/january-2018-web-server-survey.html
  35. https://en.wikipedia.org/wiki/Web_search_engine
  36. S. Brin, L. Page, “The Anatomy of a Large-Scale Hypertextual Web Search Engine”, http://ilpubs.stanford.edu:8090/361/1/1998-8.pdf
  37. https://www.storagenewsletter.com/2017/04/05/total-ww-data-to-reach-163-zettabytes-by-2025-idc/
  38. The kilometer, as originally defined by the French Academie des Sciences in 1791, was 1/10,000 of the distance from the equator to the North pole.
  39. https://www.reuters.com/article/us-highfrequency-microwave-idUSBRE9400L920130501
  40. https://sniperinmahwah.wordpress.com/2018/05/07/shortwave-trading-part-i-the-west-chicago-tower-mystery/
  41. https://www.forbes.com/sites/tomcoughlin/2018/05/14/2018-continues-growth-in-near-line-hdd-market/
  42. https://www.storagenewsletter.com/2018/10/17/99-million-to-101-million-hdds-shipped-in-3cq18-trendfocus/