HISTORY72 - Top 10 Inventions that Changed the World
Introduction
Picking the top 10 inventions that changed the world is, of course, a highly debatable exercise. I thought it would be fun to try, and then
challenge you to come up with your own list and rationale.
First off, I eliminated some
obvious choices, like the wheel and fire, because these “inventions” cannot be
associated with a particular individual or group of people. The same goes for inventions such as tools, the
calendar, the clock, nails/screws, the compass, electricity, etc.
Even with this “known-inventor ground rule,” I found it impossible to choose just 10 specific inventions. So, I cheated. You will note that one of my top 10 is really a group of related inventions. Many of the others have intimately related predecessor and successor inventions, in effect representing a class of inventions. (No specifics here; I’ll leave it to you to figure out what I’m talking about.)
Finally, I decided that I couldn’t ignore an entire critical science, so
I added it at the end as an 11th “invention.”
For each of my top inventions, I
will summarize the history, identify appropriate predecessor and successor
inventions, state my rationale for including the invention on my list, and
perhaps relate my personal experiences with the invention.
I will list my principal sources
at the end.
So here are my top inventions in
approximate historical order:
1. Printing Press
Printing (graphic communication by multiplied impressions) has a
long history.
In 3000 BC and earlier, the Mesopotamians used round cylinder seals to roll impressions of images onto clay tablets. In other early
societies in China and Egypt, small stamps were used to print on cloth.
In the second century AD, a Chinese court official named Ts’ai
Lun is credited with inventing paper.
Woodblock
printing in China dates back to the 9th century. Woodcut is a
relief printing technique in which text and images are carved into the surface
of a block of wood. The printing parts remain level with the surface while the
non-printing parts are removed, typically with a knife or chisel. The woodblock
is then inked and the substrate pressed against the woodblock.
In the 11th century, a Chinese man named Pi-Sheng
developed type characters from hardened clay, creating the first movable type. The fairly soft material hampered the success
of this technology.
In the 13th century, type characters cast from metal
(bronze) were developed in China, Japan and Korea. The oldest known book
printed using metal type dates back to the year 1377. It is a Korean Buddhist document, called Selected
Teachings of Buddhist Sages and Seon Masters.
The oldest known European woodblock printing specimen dates from
the beginning of the 15th century.
Books were still rare, since they had to be laboriously hand-copied by scribes. The University of Cambridge had one of the largest libraries in Europe - containing just 122 books.
German
goldsmith Johannes Gutenberg is credited with inventing the printing
press around 1436. Gutenberg’s machine
improved on already existing presses and introduced them to the West. His newly devised hand mold made possible the precise and rapid creation of small metal movable type pieces, with raised, backwards letters, in large quantities; the pieces were arranged in a frame, coated with ink, and pressed against a sheet of paper. This allowed books to be printed much more quickly. The first book to be mass produced was the “Gutenberg Bible” in 1455, when 180 copies were printed.
By
1500, Gutenberg presses were operating throughout Western Europe and had produced an estimated 20 million items, from individual pages to pamphlets and books.
Drawing of the Gutenberg printing press.
In the 19th century, the replacement of the
hand-operated Gutenberg-style press by steam-powered rotary
presses allowed printing on an industrial scale.
The
printing press not only allowed the mass production of newspapers and
pamphlets, but it also lowered the price of printed materials, making books and
newspapers accessible to many, and fostering literacy around the world. The printing press is therefore part of the
foundation upon which modern civilization was built.
Since
Gutenberg, there have been many improvements in printing.
The
rotary press, invented by Richard March Hoe in 1843, was the natural successor of the Gutenberg printing press. It worked by curving the images to be printed around cylinders, which allowed paper to be fed continuously through the press and was much faster.
Offset printing was invented in 1875 and works by transferring (or “offsetting”) the inked image from a plate to a rubber blanket, and then to the printing surface. Offset printing remains almost unchanged today and is the most popular way of printing large runs of books, magazines, posters, and other large-format prints.
Since
the 1950s, printing has also been possible in the comfort of your own home. Inkjet printing was invented in 1951 and requires no direct contact with the paper; ink is applied by spraying it through jets.
In
1969, laser printing was invented to produce high-quality images by passing a laser beam back and forth over a negatively charged cylinder within the printer. The cylinder then attracts electrically charged powdered ink (toner) and transfers the image to the paper.
By
the late 1980s, color printers were readily available to consumers.
In
1991, at the dawn of the digital age, printing became faster and more accessible to everyone. Digital printing made it possible to print straight from a digital file to any “connected”
printer.
The earliest Three-Dimensional (3D printer) originated in
1981, when Dr. Hideo Kodama invented one of the first rapid prototyping
machines. 3D printing or additive manufacturing
is the construction of a three-dimensional object from a
digital 3D model. Material is deposited,
joined or solidified under computer control, with material being added together
(such as plastics, liquids or powder grains being fused), typically layer by
layer.
In recent years, 3D printing has developed
significantly and now performs crucial roles in many areas, most commonly in medical and industrial applications and in the construction of common household items.
Since reading is my primary avocation, I really appreciate the invention of the original printing press and its improvements.
2. Telescope
Inspired by early Dutch efforts, in 1609, Italian physicist and astronomer Galileo became the first person to point a telescope skyward.
Galileo made his first optical telescope by
using a convex objective lens in one end of a leaden tube and a
concave eyepiece lens in the other end, an arrangement that came to
be called a Galilean telescope. Although
that telescope was small and the images fuzzy, Galileo was able to make out
mountains and craters on the moon, as well as a ribbon of diffuse light arching
across the sky - which would later be identified as our Milky Way galaxy.
Galileo demonstrating his telescope in 1609.
After
Galileo, astronomy flourished as a result of larger and more complex
telescopes. With advancing technology,
astronomers observed the planets and moons in our solar system and discovered
many faint stars, enabling the calculation of stellar distances.
In the 19th
century, using a new instrument called a spectroscope, astronomers gathered
information about the chemical composition and motions of celestial objects.
Twentieth
century astronomers developed bigger and bigger telescopes and, later, specialized
instruments (to observe infrared, ultraviolet, X-ray, and gamma-ray radiation)
that could peer into the distant reaches of space and time.
Today’s and future telescopes are designed to
explore the solar system, measure the age and size of the universe, search for
our cosmic roots, chart the evolution of the universe, and unlock the mysteries
of galaxies, stars, planets, and life itself.
We live in exciting astronomical times. It seems that the recently-launched James
Webb Space Telescope is making new discoveries every day about our universe,
and our place in it!
3. Telephone
In 1837, American inventor Samuel
Morse invented a single-wire telegraph system based on earlier European
telegraphs. With the depression of a key at the transmitting end of the wire, Morse’s telegraph sent a series of “dots” and “dashes” to the receiving end, where the message was decoded using the Morse code that Morse and his assistant co-invented. Morse helped to develop the commercial use of telegraphy, which revolutionized long-distance communication.
Although a highly successful system, the telegraph was
basically limited to receiving and sending one message at a time.
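As an aside for the technically curious, here is a minimal, purely illustrative Python sketch of the dot-and-dash idea behind Morse code. It uses just a handful of letters from the International Morse alphabet and, of course, has nothing to do with Morse’s actual electrical apparatus:

    # Illustrative only: a tiny subset of the International Morse alphabet.
    MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

    def encode(message):
        # Encode each known letter; anything else is skipped in this sketch.
        return " ".join(MORSE[c] for c in message.upper() if c in MORSE)

    print(encode("SOS"))  # prints: ... --- ...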
During the 1870s, American
inventor Alexander Graham Bell invented the telephone while attempting to
improve the telegraph. His goal was to
create a device that could transmit the human voice along electrical cables,
and to transmit
multiple messages over the same wire at the same time. The possibilities of being able to talk down
an electrical wire far outweighed those of a modified telegraph system, which
was essentially based on just dots and dashes.
“Mr.
Watson, come here, I want you,” were the immortal first words ever
spoken on a telephone. Alexander Graham
Bell said them on March 10, 1876, to his assistant Thomas Watson. This moment would change communications
forever.
On October 9, 1876, Alexander Graham Bell conducted a two-way test of his telephone over a two-mile distance between Boston and Cambridge, Massachusetts.
Bell’s
telephone quickly improved with the development of telephone networks,
exchanges and rotary dialing, pay phones, long distance, party lines,
touch-tone phones, cordless phones, digital phones, and mobile phones. With the arrival of the mobile phone in the
1980s, personal communications were no longer shackled to cables.
Telephone
development continued with cell phones, and today’s smart phones and even smart
watches. The clever invention of
the cellular network supported the revolution of the telephone
industry. From bulky mobile phones to
ultrathin handsets, mobile phones have come a long way. John F. Mitchell and Martin Cooper of
Motorola demonstrated the first handheld mobile phone in 1973, starting a
technological revolution we still live in today.
The
smart cell phone has changed and developed so rapidly in the past decades that
it seems as though almost anything you can imagine is possible for the future.
The convergence of all our tech gadgets into one mobile device will continue to
advance. Experts believe that the majority of the hardware and software can be moved to “the cloud,” leaving a product that consists mainly of the input and the display.
Telephone
cellular networks were first proposed and described by my Uncle Douglas Ring in
1947 at Bell Labs. It would be decades
before technology developments made his dream come true.
4. Camera
The forerunner of the
photographic camera was the camera obscura, a natural optical phenomenon
that occurs when an
image of a scene on the front side of a screen (or for instance a wall) is
projected through a small hole in that screen and forms an inverted image (left
to right and upside down) on a surface behind the screen, opposite to the
opening. Although it is unclear who
first used the camera obscura, the oldest known written records of this idea
are by Han Chinese scholar Mozi (c. 470 to c. 391 BC). This phenomenon was used to
view solar eclipses without harming one’s eyes and, later, to make drawings
from the inverted image.
Historians
generally accept that the first photographic camera was developed in 1816 by
Frenchman Joseph Nicéphore Niépce, who leveraged discoveries from the late
Middle Ages that some substances are visibly altered by exposure to light. Niépce created photographic images on silver
chloride-lined paper; the oldest extant photograph is one he made around 1826.
Over
the years, cameras gradually improved with silver-based processes
(Daguerreotypes), photographic plates covered with light-sensitive chemicals,
box cameras with paper film (followed by celluloid) that was exposed and then
sent out for development, folding cameras, instant-image cameras (such as the
Polaroid Land Camera) and reflex cameras with mirrors (permitting
photographers to view the image that will be seen through the lens, and
therefore to see exactly what will be captured).
The first Kodak camera designed for the public hit the market in 1888.
Camera
manufacturers gradually added improvements for basic image control, including
capability to set the camera aperture, shutter speed, and exposure time, and
also offered interchangeable lenses and other accessories.
The
first digital camera was prototyped at Kodak in 1975, and by the 1990s digital cameras were revolutionizing photography, with images saved on memory cards rather than on film.
In
2004, mirrorless cameras were introduced, in which light flows directly through the lens to the digital sensor, which displays the image on the camera’s LCD screen, allowing the photographer to adjust settings and preview the image
before shooting.
Today,
every smartphone has at least one built-in camera that can take photos and
videos.
Photography has been an important part of Ring-family life for over a century. In 1905, my Grandfather Ambrose Ring took photos of his mining exploits in southern Arizona. In the 1990s, my brother Al Ring and I (re)discovered these old prints, started exploring where the images were taken, and began research about the mining history of the region. Our research culminated in the publication of the book, Ruby Arizona - Mining, Mayhem, and Murder, in 2005. This set us off on a (post-retirement) career of Arizona historical research and writings that included newspaper columns, additional books, and a dedicated website (ringbrothershistory.com). Pat and I continue the Ring-family photography tradition today by recording family events, travels, and exploring creativity in making interesting pictures.
5. Planes, Trains, and Automobiles
OK, you found it; this is the “invention” that’s
really a group of related inventions.
I’m borrowing the name of a 1987 movie, “Planes, Trains, and
Automobiles,” to cover a group of “transportation” vehicles invented in the 19th
and early 20th centuries.
Before these inventions, land travelers were limited to walking,
horseback, or animal-pulled wagons or carriages. Let’s go in historical order:
Left to right: Steam locomotive of the late 1800s, Henry Ford’s 1908 Model T, the Wright brothers’ first powered flight.
Trains. British
engineer Richard Trevithick built the first full-scale working railway
steam locomotive in the United Kingdom in 1804. It used high-pressure steam to drive the
engine. The commercial appearance of
train networks came in the 1820s and the age of railways began. In the United States, railways began to crisscross the country, starting with the Baltimore and Ohio Railroad, chartered in 1827. The first American transcontinental railroad was completed in 1869. Diesel-powered locomotives were used in Sweden starting in 1913, followed by the U.S. in 1925. The first bullet
train was introduced in Japan in 1964.
Automobiles. Automobiles
had been in the works since 1769, when Nicolas-Joseph Cugnot developed the first steam-powered
automobile capable of human transportation.
More than a
century later, in 1872, American George Brayton invented the first
commercial liquid-fueled internal combustion engine. In 1886, German automobile engineer Karl Benz
began the first commercial production of practical motor vehicles with an
internal combustion gasoline engine.
However,
automobiles did not come into common use until the early 20th
century, when American Henry Ford produced an automobile that most people could
afford. Popularized by Ford's Model T in
1908, the automobile gave the average person more mobility and personal freedom
while also spawning a revolution in the marketplace. Goods could now be transported much more easily and quickly, the seeds of the travel industry were planted,
people could move out of the city, live in the suburbs, and even take driving
vacations - all of this thanks to the automobile.
Airplanes. Flying
has long been the dream of humans. The
first kites were flown in China in 1000 BC.
Leonardo da Vinci made many drawings of wings and flying machines
in the late 1400s. In the 18th
and 19th centuries, there were many experiments with balloons and gliders,
and in the late 19th century, the first attempts at powered flight.
On
December 17, 1903, brothers Wilbur and Orville Wright achieved the
first powered, sustained, and controlled airplane flight with a
propeller-driven wooden airplane. The
duo's success laid the foundation for modern aeronautical engineering by
demonstrating what was possible.
Some
key milestones: In 1917, the first metal
airplane was produced. Charles Lindbergh completed the first solo trans-Atlantic flight in 1927. The first modern airliners began carrying passengers in 1933. The first jet-powered airplane
flew in 1937. In 1947, Charles Yeager
piloted the first aircraft to exceed the speed of sound. In 1970, the Boeing 747 made its first
commercial flight.
Today
airplanes carry passengers thousands of miles world-wide in a matter of hours.
With these transportation
inventions, people are able to travel to places they've never been and trade
goods more easily than ever. And these
technologies are still widely used today and ever-expanding, despite concerns
about climate change.
I’m sure that virtually
everyone has made (and continues to make) generous use of these transportation
inventions.
6. Rockets
Rockets,
fueled by gunpowder, were first used as weapons in China in the 11th
century. By the late 18th and
early 19th century, gunpowder rockets were in common use as short-range
weapons in India and Great Britain.
In
1903, Russian Konstantin Tsiolkovsky began a series of papers discussing
the use of rocketry to reach outer space, space suits, and
colonization of the Solar System. Two key points discussed in his works
are liquid fuels and staging of rockets. In 1926 American engineer, professor,
physicist, and inventor Robert Goddard built and launched the first liquid-fueled rocket. This is considered by some to be the start of the Space Age.
Robert Goddard about to launch the world’s first liquid-fueled rocket on March 16, 1926.
Late
in World War II, Nazi Germany attacked London with 200-mile-range V-2
missiles. After World War II, the United
States and the Soviet Union created their own missile programs.
In
the latter half of the 20th century, rockets were developed that
were powerful enough to overcome the force of gravity to reach orbital
velocities, paving the way for space exploration to become a reality.
On
Oct. 4, 1957, the Soviets launched the first artificial Earth satellite,
Sputnik 1, into space. Four years later
on April 12, 1961, Russian Lt. Yuri Gagarin became the first human to orbit
Earth in Vostok 1.
The
first U.S. satellite, Explorer 1, went into orbit in 1958. In 1961, Alan Shepard became the first
American to fly into space. In 1962,
John Glenn’s historic flight made him the first American to orbit Earth.
On
July 20, 1969, American astronaut Neil Armstrong was the first human to step
onto the Moon, carried there by the Saturn V heavy-lift launch vehicle. Six Apollo missions were made to explore the
Moon between 1969 and 1972.
By
the early 1970s, orbiting communications and navigation satellites were in
everyday use, and the Mariner spacecraft was orbiting and mapping the surface
of Mars. By the end of the decade, the
Voyager spacecraft had sent back detailed images of Jupiter and Saturn, their
rings, and their moons.
Skylab,
America’s first space station, was a human-spaceflight highlight of the 1970s,
as was the Apollo-Soyuz Test Project, the world’s first internationally crewed
(American and Russian) space mission.
In
the 1980s, satellite communications expanded to carry television programs, and
people were able to pick up the satellite signals on their home dish
antennas. Satellites discovered an ozone
hole over Antarctica, pinpointed forest fires, and gave us photographs of the
nuclear power plant disaster at Chernobyl in 1986. Astronomical satellites found new stars and
gave us a new view of the center of our galaxy.
For
30 years (1981-2011), America’s reusable Space Shuttle carried people into
orbit; launched, recovered, and repaired satellites; conducted cutting-edge
research; and helped build the International Space Station, which has been
continuously occupied since 2000.
Space
systems continue to become more and more integral to homeland defense, weather
surveillance, communication, navigation, imaging, and remote sensing for
chemicals, fires, and other disasters.
Today’s
space launch systems have been designed to reduce costs and improve
dependability, safety, and reliability.
Most U.S. military and scientific satellites are launched into orbit by
a family of expendable launch vehicles designed for a variety of missions. Other nations have their own launch systems,
and there is strong competition in the commercial launch market to develop the
next generation of launch systems.
NASA’s Space Launch System (SLS), now in development, will carry humans
beyond the grasp of Earth's gravity, stretching to the moon, Mars (planned for
the 2030s), and perhaps, one day, deep space. The first SLS launch was
the uncrewed Artemis 1, which took place on 16 November 2022.
Meanwhile, private U.S. companies, like SpaceX, Virgin
Galactic, and Blue Origin, are actively developing space launch systems for
near-earth orbital missions.
The U.S.
Defense Advanced Research Projects Agency has recently commissioned three
private companies, Blue Origin, Lockheed Martin and General Atomics, to develop
nuclear fission thermal rockets for use in lunar orbit.
Someday, humans may travel to distant solar systems in advanced
rockets propelled by ion engines, nuclear fission, nuclear fusion, solar
energy, or any of a number of theoretical propulsion sources.
On October 4, 1957, while visiting the University of Michigan
as a prospective college to attend, I heard a radio broadcast about the Soviet
successful launch of the first artificial Earth satellite. Already pointed towards engineering, that
launch and the beginning of the “space race” focused my interest on aerospace engineering,
working towards degrees at Purdue University and the University of Michigan,
and spending 35 years in the aerospace industry, where I worked on both
civilian and military applications of rockets, including being invited by NASA
to attend the Apollo-Soyuz mission launch from Kennedy Space Center in 1975.
7. Television
Radio,
the forerunner of television, was developed through the efforts of many different scientists, including Nikola Tesla (who demonstrated a wireless radio in 1893) and Guglielmo Marconi. Radio became the most used form of communication in the world, and in the early 20th century it served as a social bonding tool, a source of news, an education tool, and a channel for emergency broadcasts - in short, providing an entirely new way for people to communicate and interact.
The
invention of television was also the work of many individuals. The first demonstration of the instantaneous transmission
of images was by Frenchmen Georges Rignoux and A. Fournier in Paris in
1909. In 1911, Russian Boris
Rosing and his student Vladimir Zworykin created a TV system
that used a mechanical mirror-drum scanner to transmit crude images over wires
to a cathode ray tube receiver. But
the system was not sensitive enough to allow moving images.
In
1925, Scottish engineer John Logie Baird gave the first public demonstration of
televised images in motion. Later,
in 1927, he demonstrated the transmission of an image of a face in motion using
telephone lines. This is widely regarded
as being the world's first public television demonstration.
In
1927, American inventor Philo Taylor Farnsworth (sometimes called the father of
television) invented the first fully functional and complete
all-electronic television system.
The
opening of the 1939 World’s Fair in New York introduced television to a
national audience, thanks to RCA and a speech by President Franklin D.
Roosevelt. NBC soon began nightly broadcasts.
As
black-and-white TVs became more common in American households, the finishing
touches on color TV were refined in the late 1940s. “Meet the Press” debuted and eventually
became TV’s longest-running show.
By the 1950s, television had truly entered the mainstream,
with more than half of all American homes owning TV sets by 1955. As the number of consumers expanded, new
stations were created and more programs broadcast, and by the end of that
decade TV had replaced radio as the main source of home entertainment in the
United States.
Improvements
in TV continued rapidly, including cable television (1948), video tape (1956),
remote control (1956), satellite TV (1962), home video recording (1976), and
high-definition TV (1981).
The turn of the century saw the emergence of flat-screen TVs
with high-definition pictures, replacing box TV sets. The late 1990s and early 2000s also saw
people switching from cable to satellite television. In the 2010s, companies started testing 3D
TVs, while streaming services became more popular than cable and satellite
television. TV monitors grew to amazing
sizes, enhancing the viewing experience.
Smart TVs became more prevalent in
American households, allowing viewers to stream music, skim through YouTube,
and watch their favorite shows all on one device.
Today, streaming services are slowly replacing cable and
satellite. And, thanks to computers and
the internet (see below), we can receive TV on our computers and mobile
devices.
Television
was one of the first inventions to affect the lives of masses all over the
world, and to this day still remains a popular way of getting information and
entertainment. Future improvements
include further advances in picture clarity and 3D TV.
Of
all the myriad news, entertainment, and educational shows available on
television these days, I watch only sports (a lot) and an occasional movie or
thriller show. Most of my connections to
electronic information, education, entertainment, and meetings are via the next
two inventions.
8. Computer
In the early 19th
century, English mechanical engineer Charles Babbage conceptualized
and invented (but didn’t build) the first mechanical computer. For this work, Babbage is regarded as the
“father of the computer.” English mathematician and writer Ada
Lovelace was the first to recognize that the machine had applications beyond
pure calculation, and to have published the first algorithm intended to be carried
out by such a machine. As a result, she
is often regarded as the first computer programmer.
The principles of modern computer science were set out by Englishman Alan Turing in his seminal 1936 paper, “On Computable Numbers, with an Application to the Entscheidungsproblem [decision problem].”
There are five generally recognized generations of digital computers. Each generation is defined by a key technological development that fundamentally changed how computers operate.
First Generation
- Vacuum Tubes (1940 - 1956). These first computers used vacuum tubes for circuitry and magnetic drums for memory. As a result, they were enormous, often taking up entire rooms, and they generated so much heat that breakdowns were frequent.
These
first-generation digital computers relied on “machine language” (the most
fundamental programming language that can be understood by computers). These
computers were limited to solving one problem at a time. Input was accomplished with punched cards and
paper tape. Output emerged on paper
printers.
Second
Generation - Transistors (1956 - 1963). Although first invented in 1947, transistors weren’t used
much in computers until the end of the 1950s. Transistors were far superior to
vacuum tubes, making computers smaller, more efficient, less expensive, and less power-hungry. But they still
used punched cards for inputs.
Better programming
languages were being developed (early versions of COBOL and FORTRAN). Transistor-driven machines were the first
computers to store instructions into their magnetic drum memories. The early versions of these machines were
created for the atomic energy industry.
Note: In the 1950s and
1960s, analog computers were used to study the continuous variation aspect of
physical phenomena such as electrical, mechanical, or hydraulic processes to
model the problem being solved. The superior computational speed and accuracy of digital computers made analog computers largely obsolete after the 1960s. Even so, some research in analog computation is still being
done; a few universities still use analog computers to teach control
system theory.
Third Generation - Integrated Circuits
(1964 - 1971). Transistors were now
being miniaturized and put on silicon chips.
This led to a huge improvement in speed and effectiveness of these
machines. These were the first computers
where users interacted using keyboards and monitors that interfaced with an operating system - a great improvement over punched cards and printouts. This enabled these machines to run multiple applications at once, coordinated by a central program that monitored memory.
Fourth
Generation - Microprocessors (1972 - 2010).
Intel developed the first microprocessor in 1971, which placed the components of the computer - CPU, memory, and input/output controls - onto a single chip. The Intel chip contained thousands of integrated circuits, enabling much smaller computers.
In 1977, Apple,
Inc. introduced the Apple II, one of the first successful mass-produced
microprocessor personal computers.
Fourth-generation computers used Ultra
Large-Scale Integration technology, and advanced languages like C, C++, Java,
and .Net.
The increased power of these smaller computers led to linking multiple computers into networks, which eventually led to the birth and rapid evolution of the Internet. (See next invention.) Other primary advances during this period were the graphical user interface (GUI), the mouse, and startling advances in laptop and hand-held device capability.
Fifth Generation
- Artificial Intelligence (2010 Onwards).
Computers with artificial intelligence (see invention
below) are in development. AI-assisted
computing will utilize extensive parallel processing and superconductors. In the future, computers will be revolutionized again by quantum computing, molecular computing, and nanotechnology.
I am truly an
old computer geek. While I was in
college, I programmed analog computers and learned machine language for digital
computers. In my early aerospace career,
I simulated aerospace systems on a room-sized digital computer, inputting trays
of punched cards that I prepared myself in the FORTRAN computer language; the
turn-around time to see if my simulation worked was one day (usually
overnight). I even simulated systems on
a hybrid analog-digital computer. Besides
continuing to use increasingly capable digital computers at work, I began my home
computer history with the early Apple computers. Today, in retirement, I continue to update
and use a personal computer, and do a lot of stuff on my iPad and smart phone.
9. Internet and World Wide Web
The Internet is
a networking infrastructure of computers, whereas the World Wide Web is a way
to access information using the Internet.
The
Internet has no single “inventor.” Instead, it has evolved over time,
starting in the United States in the 1950s, along with the development of
computers.
The
figure below shows the timeline of Internet and World Wide Web development.
Development timeline of the Internet and World Wide Web.
The
first workable prototype of the Internet came in the late 1960s, with
the creation of ARPANET, Advanced Research Projects Agency Network. By the 1970s, the basic rules (protocols)
that define Internet operations had been developed by Vinton Cerf and Robert Kahn, enabling
computers to communicate with each other.
ARPANET adopted the protocols on January 1, 1983, and from there,
researchers began to assemble the “network of networks” that became the
modern Internet.
The
father of the World Wide Web is considered to be British computer scientist Tim Berners-Lee, who in 1989 created the Web, which allows information to be accessed over the Internet. (Today, documents and downloadable media are made available to the network through web servers that distribute web content and can be accessed by web browsers like Google Chrome and Apple Safari.)
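To make that division of labor concrete, here is a minimal, illustrative Python sketch of what a browser does at its core - asking a web server for a document over the Internet. It is a present-day demonstration only, using example.com, a domain reserved for examples:

    # Illustrative sketch: fetch a web page the way a browser would,
    # using Python's standard library.
    from urllib.request import urlopen

    with urlopen("https://example.com") as response:
        html = response.read().decode("utf-8")

    print(html[:80])  # show the first few characters of the page's HTML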
By
the end of 1990, Berners-Lee had the first web server and browser up and
running at CERN, the European Organization for Nuclear Research in Geneva,
Switzerland.
Only
a few users had access to the computer platform that ran the browser, so
development soon started on a more spartan browser, which could run on any
system. The first web browser was available to the
public in 1993. By the year 2000, 300
million people around the world were online.
In
2004, the social media platform Facebook was launched and the Web began to grow
rapidly.
In
2007, the iPhone was released, starting the mobile device revolution.
By
2022, five billion people worldwide were online!
Education,
commerce, science, art, music, communication, modern media, and travel have all
been shaped by the Internet in some way. The Internet has even been used as a tool for positive and negative social, economic, and political change affecting billions of people every day.
Since
my retirement from the aerospace industry in 2000, I have researched and
written (or co-authored) seven books, about 200 newspaper columns, and 100 blog
articles, including this one. (See my website at ringbrothershistory.com and
blog at bobringreflections.blogspot.com.) I
used the Internet and World Wide Web for the research and production of
virtually all of these “products.”
10. Artificial Intelligence
Artificial
Intelligence (AI) is the science of enabling machines and computer applications to mimic human intelligence by modeling human behavior and using human-like thinking processes to solve complex problems.
From ancient times, various mathematicians, theologians,
philosophers, professors, and authors mused about mechanical techniques,
calculating machines, and numeral systems that eventually led to the concept of
mechanized “human” thought in non-human beings.
The origins of artificial intelligence may be dated to the middle
of the 20th century, when computer scientists started to create
algorithms and software that could carry out tasks that ordinarily need human
intelligence, like problem-solving, pattern recognition, and judgment.
One of the earliest pioneers of AI was British mathematician Alan
Turing, who proposed the concept of a machine that could simulate any human
intelligence task.
The
1956 Dartmouth Conference gathered academics from various professions to
examine the prospect of constructing machines that can “think.” The conference officially introduced the field of (and the term) artificial intelligence.
In
the 1960s and 1970s, AI research focused on developing expert
systems designed to mimic the decisions made by human specialists in specific
fields. These methods were frequently
employed in industries such as engineering, finance, and medicine.
In
the 1980s, AI research began to focus on machine learning, which employs statistical methods to let computers learn from data. As a result, neural networks - loosely modeled on the human brain’s structure and operation - were created.
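For readers curious what an artificial “neuron” actually computes, here is a minimal, illustrative Python sketch (the weights and inputs are made-up numbers chosen only for demonstration): each neuron weighs its inputs, sums them, and passes the result through an activation function - a very loose analogy to a biological neuron.

    import math

    def neuron(inputs, weights, bias):
        # Weighted sum of the inputs, plus a bias term...
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        # ...squashed by a sigmoid activation into the range (0, 1).
        return 1.0 / (1.0 + math.exp(-total))

    # Two inputs with hypothetical weights, purely for illustration.
    print(neuron([0.5, 0.8], [0.4, -0.6], 0.1))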
AI
research made substantial strides in the 1990s in robotics, computer vision,
and natural language processing. In
the early 2000s, advances in speech recognition, image recognition, and natural
language processing were made possible by the advent of deep (machine) learning, which uses many-layered neural networks and sophisticated mathematical modeling to process data.
Today,
virtual assistants, self-driving cars, medical diagnostics, and financial
analysis are just a few of the modern-day uses for AI. Another important trend in modern-day AI is
the shift toward more human-like interactions, with voice assistants like Siri
and Alexa leading the way. Natural
language processing has also made significant progress, enabling machines to
understand and respond to human speech with increasing accuracy. ChatGPT - a
large language model trained by the company OpenAI - is an example of the “talk
of the town” AI that can understand natural language and generate human-like
responses to a wide range of queries and prompts.
Looking
to the future, AI is likely to play an increasingly important role in solving
some of the biggest challenges facing society, such as climate change,
healthcare, and cybersecurity. As
AI continues to evolve, it will likely profoundly impact virtually every aspect
of our lives, from how we work and communicate, to how we learn and make
decisions.
However,
there are concerns about AI’s ethical and social implications, particularly as
the technology becomes more advanced and autonomous. Issues that need to be resolved include
potential job displacement, privacy, and weaponized AI.
Also,
an original goal of AI was to achieve generalized
human cognitive abilities in software so that, faced with an unfamiliar task,
the AI system could find a solution. Some
researchers extend this goal to computer programs that experience sentience or
consciousness. Most AI researchers have devoted little
attention to this, with some claiming that intelligence is too complex to be
completely replicated. However, a small
number of computer scientists are still active in this research, with estimates
of the time required to achieve success ranging widely from ten years to over a
century.
Others
question whether machine consciousness is even desirable. More
than a few leading AI figures subscribe to a nightmare scenario, whereby
superintelligent machines take over and permanently alter human existence
through enslavement or eradication.
Many experts are urging us to take the
time to understand what we’re creating and how we’re going to incorporate it
into society.
All of us have probably already experienced contact with artificial intelligence through virtual assistants, chatbots, perhaps AI-assisted internet searches, and generally through more efficient “behind the scenes” interactions in our daily lives. So far, so good. I’m excited about the forecasted near-term improvements. But I also have concerns about AI getting “out of our control” and urge us to really think about what we’re doing, and to generate a plan that identifies the benefits and risks.
11. Medical Firsts
All of the “Top 10” invention
lists that I checked included medical inventions or firsts. I really couldn’t argue with this, so I
decided to include the entire category of “medical firsts” as my 11th
invention that changed the world.
The following list presents the medical firsts (in historical order) that I chose to be part of my 11th world-changing invention, along with the date and “inventor” of each:
1. Vaccines (Edward Jenner, 1796). Beginning with Edward Jenner in 1796 using inoculations to tame the smallpox virus, the usefulness and popularity of vaccines grew very quickly. Throughout the 1800s and early 1900s, vaccinations were created to combat some of the world’s deadliest diseases, including smallpox, rabies, tuberculosis, and cholera. Over 200 years, one of the deadliest diseases known to man - smallpox - was wiped off the face of the earth. Since then, virtually all vaccines have worked using the same concept - until a new technology, called mRNA, created game-changing possibilities for the future. Its high effectiveness, capacity for rapid development, and potential for low production costs were evident during the Covid-19 pandemic, when two separate mRNA vaccines were developed and approved for use in just a matter of months.
2. Anesthesia (William T. G. Morton, 1846). In the mid-19th century, surgery was undertaken only as a last resort, with some patients opting for death rather than enduring the excruciating ordeal. William T. G. Morton made history in 1846 when he successfully used ether as an anesthetic during surgery. Soon after, a faster-acting substance called chloroform became widely used but was considered high-risk after several fatalities. Since the 1800s, safer anesthetics have been developed, allowing millions of life-saving, painless operations to take place.
3. Germ Theory (Louis Pasteur, 1861). It was widely believed that disease was caused by “spontaneous generation.” In 1861, French microbiologist Louis Pasteur proved that infectious disease was a result of an invasion of specific microscopic organisms - known as pathogens - into living hosts. This new understanding marked a significant turning point in how diseases were treated, controlled, and prevented, helping to prevent devastating epidemics that were responsible for thousands of deaths every year, such as the plague, dysentery, and typhoid fever.
4. Medical Imaging (Wilhelm Rontgen, 1895). The X-ray, a form of electromagnetic radiation, was “accidentally” discovered in 1895 by German physicist Wilhelm Rontgen while experimenting with electrical currents through glass cathode-ray tubes. The discovery transformed medicine overnight. Ultrasound began being used for medical diagnosis in 1955, using high-frequency sound waves to create a digital image. In 1967, the computed tomography (CT) scanner was created, which uses X-ray detectors and computers to diagnose many different types of disease, and it has become a fundamental diagnostic tool in modern medicine. In 1973, Paul Lauterbur produced the first magnetic resonance image (MRI). MRI data creates detailed images within the body and is a crucial tool in detecting life-threatening conditions including tumors, cysts, damage to the brain and spinal cord, and some heart and liver problems.
5. Antibiotics (Alexander Fleming, 1928). Alexander Fleming’s penicillin, the world’s first antibiotic, completely revolutionized the war against deadly bacteria. The Scottish biologist accidentally discovered the anti-bacterial “mold” in a petri dish in 1928. However, Fleming’s incredible findings were not properly recognized until the 1940s, when penicillin began being mass-produced by American drug companies for use in World War II. Two other scientists - Australian Howard Florey and Ernst Chain, a refugee from Nazi Germany - were responsible for the mass distribution of penicillin, saving millions of lives. Unfortunately, over the years certain bacteria have become increasingly resistant to antibiotics, leading to a world-wide crisis that calls for the pharmaceutical industry to develop new anti-bacterial treatments.
6. Organ Transplants (Joseph Murray and David Hume, 1954). In 1954, the first successful kidney transplant was carried out by Dr. Joseph Murray and Dr. David Hume in Boston. Despite many previous attempts in history, this was the first instance where the recipient of an organ transplant survived the operation. The turning point came when various technical issues were overcome, such as the connection between blood vessels, placement of the kidney, and immune response. In 1963, the first lung transplant was carried out, followed by a pancreas/kidney transplant in 1966, and liver and heart transplants in 1967. Aside from saving thousands of lives in the years following, transplant procedures have also become increasingly innovative and complex, with doctors successfully completing the first hand transplant in 1998 and the first full-face transplant in 2010.
7. Birth Control (Carl Djerassi, 1960). The first known form of condom (made from a goat bladder) was used in Egypt around 3000 BC. In 1844, Charles Goodyear patented the vulcanization of rubber, which led to the mass production of rubber condoms. In 1914, Margaret Sanger, a nurse and sex educator from New York state, coined the term “birth control.” Carl Djerassi successfully created a progesterone pill that could block ovulation. “The Pill” was approved for sale in 1960 and launched an international revolution that allowed women to determine when they would have children, freeing them from unplanned pregnancies that could derail their lives.
8. Antiviral Drugs (many contributors, 1963). Smallpox, influenza, and hepatitis have ravaged many human populations throughout history. Unlike the sweeping success of antibiotics in the late 1930s and 1940s, the development of antivirals did not really take off until the 1960s, because it was difficult to attack viruses without damaging the patient’s cells. The first antiviral was approved in 1963 for topical treatment of the herpes simplex virus. Over the years antivirals have improved significantly; they work by blocking the rapid reproduction of viral infections, and some can even stimulate the immune system to attack the virus. The development of effective antivirals has been significant in treating and controlling the spread of deadly virus outbreaks such as HIV/AIDS, Ebola, and rabies. One of the most recent antivirals is used to treat COVID-19.
9. Stem Cell Therapy (many contributors, 1970s). The incredible potential of stem cells was discovered in the late 1970s, when they were found inside human cord blood. Two specific characteristics make stem cells remarkable: they are unspecialized cells that can renew themselves through cell division, even after being inactive, and under certain conditions they can be used to make any type of human cell. This discovery has enormous potential, and stem cell therapy has already been used to treat leukemia and other blood disorders, as well as in bone marrow transplantation. Research is currently ongoing to use stem cells to treat spinal cord injuries and a number of neurological conditions such as Alzheimer’s, Parkinson’s, and strokes. However, due to the ethical issues surrounding the use of embryonic stem cells, researchers are likely to face many obstacles when developing stem cell-based therapies.
10. Immunotherapy (many contributors, 1970s). Immunotherapy - a treatment that stimulates the immune system to fight off disease - has been in the making for over a century. In the 1890s, William B. Coley injected inactive bacteria into cancerous tumors, achieving remission in some patients. However, it is only in the last 40 years that serious progress has been made in immunotherapy, particularly with respect to treating cancer. In the 1970s, antibody therapies were developed, and in 1991 researchers produced the first cancer vaccine, which was approved by the FDA in 2010. In the last decade, immuno-oncology has become one of the most revolutionary cancer therapies.
11. Gene Therapy (many contributors, 1990s). Genes inside the human body contain DNA - the code that controls much of the body’s form and function, from height to regulating body systems. Genes that don’t work properly can cause disease. Gene therapy replaces a faulty gene or adds a new gene in an attempt to cure disease or improve the body’s ability to fight disease. Gene therapy holds promise for treating a wide range of diseases, such as cancer, cystic fibrosis, heart disease, diabetes, hemophilia, and AIDS. Gene editing was pioneered in the 1990s. Researchers are still studying how and when to use gene therapy. Currently, in the United States, gene therapy is available only as part of a clinical trial. There are also concerns that gene editing might be used to affect human evolution.
Over the years, I have been
the beneficiary of several of these medical advances. I have also lost a wife to cancer, and
appreciate how much more needs to be accomplished in the medical world.
Principal Sources
My principal sources include: “35
of the most revolutionary inventions that shaped our world,”
interestingengineering.com; “A Brief History of Printing Press &
Printmaking,” instantprint.co.uk; “Telescope History,” nasa.gov; “How the
Telephone Was Invented,” thoughtco.com; “When Was the Camera Invented? Everything You Need to Know,” nfi.edu; “A
Brief History of Space Exploration,” aerospace.org; “Timeline of rocket and
missile technology,” wikipedia.org; “A new era of spaceflight? Promising advances in rocket propulsion,” theconversation.com;
“Who Invented Television?”, history.com; “A brief history of television, by
decade,” stacker.com; “Evolution of Computers,” nettantra.com; “History of
Computers,” toppr.com; “A brief history of artificial intelligence,”
cointelegraph.com; “Science11 - Artificial Intelligence,”
bobringreflections.blogspot.com; “The top 10 medical advances in history,”
proclinical.com; plus, numerous other online sources.
Now it’s your turn to pick the “top 10” inventions that changed the world.