Origins of Computing Morphology

Julian Scaff
May 24, 2024

Why do personal computers look and function the way they do?

Most of the components of modern computers have ancient origins.

Computers are integral to almost every aspect of our daily lives, but have you ever wondered why they look and function the way they do? I have spent several years delving into this question, tracing the evolution of key components like the camera, display, keyboard, graphical user interface (GUI), and mouse. Exploring historical advancements from ancient optical devices to the revolutionary inventions of the 20th century makes clear that thousands of years of technological discoveries have shaped the modern computer. From the first lenses crafted by the ancient Assyrians around the 8th century BCE to the creation of the computer mouse by Douglas Engelbart in the late 1960s, each breakthrough has contributed to the sophisticated, user-friendly devices we rely on today.

I break down the modern computer, whether a desktop or laptop, into its most basic components of human-computer interaction: the display or screen for viewing content and using GUIs, the keyboard for text entry, the mouse (or trackpad) as a pointing and input device, the software that powers the GUI, and the camera that is often used more for communication than photography. Each of these components has a unique genealogy, and their convergence in modern computers affords us incredible extensions of our memory and thinking.

The Camera

The modern camera, now seamlessly integrated into computers and smartphones, traces its lineage through a rich history of technological and scientific advancements. This lineage begins with the ancient Assyrians’ invention of the lens, exemplified by the Nimrud lens, which laid the foundation for manipulating light to create magnified images. The Camera Obscura, used by Renaissance artists and scientists, further developed the understanding of projecting images through a small aperture. The invention of telescopes and microscopes in the Netherlands in the 16th and 17th centuries harnessed optical principles to explore the cosmos and microscopic worlds, respectively. Analog film photography, revolutionized by the Kodak Brownie, democratized the capture and preservation of images. The advent of digital photography in the late 20th century, epitomized by devices like the Apple QuickTake and Nikon D1, marked the shift from chemical processes to electronic image sensors and storage. Each of these innovations contributed to the compact, high-resolution cameras now ubiquitous in our digital devices, enabling instant image and video capture, sharing, and communication worldwide.

The first consumer digital cameras emerged in the late 1980s and early 1990s, revolutionizing photography by transitioning from film to digital imaging. One of the earliest examples was the Fuji DS-1P, introduced in 1988, which stored images on a removable 2 MB (16 Mbit) SRAM memory card. However, it was the Apple QuickTake 100, released in 1994, that became widely recognized as the first commercially successful consumer digital camera. It offered a resolution of 0.3 megapixels (640 × 480 pixels) and could store up to eight photos at that resolution, making digital photography accessible to the general public for the first time.

From left to right: the Fuji DS-1P (1988), the Apple QuickTake 100 (1994), and the Sony Mavica (1997). (images: Digital Kamera Museum)

Throughout the 1990s, major electronics companies like Kodak, Canon, and Sony began developing and releasing their own digital cameras, improving resolution, storage capacity, and user-friendliness. The introduction of the Sony Mavica in 1997, which used a floppy disk for storage, and the Nikon Coolpix series, known for their higher resolution and advanced features, marked significant milestones in the evolution of digital photography. By the early 2000s, digital cameras had largely supplanted film cameras in the consumer market, setting the stage for the rapid advancements and widespread adoption of digital imaging technology.

The first digital single-lens reflex (DSLR) camera, the Nikon D1, was released in 1999, marking a significant milestone in the transition from film to digital photography for professional and serious amateur photographers. The Nikon D1 featured a 2.7-megapixel CCD sensor and was capable of shooting at 4.5 frames per second, offering a remarkable level of performance for its time. Unlike earlier digital cameras that often relied on existing film camera bodies, the D1 was designed from the ground up as a digital device, providing a fully integrated and cohesive user experience. Its introduction set new standards for image quality, speed, and versatility in digital photography, paving the way for the rapid evolution and widespread adoption of DSLR technology in the subsequent decades.

The first film single-lens reflex (SLR) camera, the Kine Exakta, was introduced by Ihagee Kamerawerk in Dresden, Germany, in 1936. This pioneering camera featured a through-the-lens viewfinder, allowing photographers to see exactly what the lens was capturing, a significant advancement over previous viewfinder systems. The Kine Exakta used 35mm film, which was a standard format for still photography, and incorporated a focal plane shutter, interchangeable lenses, and a variety of other innovative features that set the foundation for future SLR designs. Its introduction revolutionized the field of photography by providing greater control, precision, and versatility, making it immensely popular among both professional photographers and serious amateurs. The success of the Kine Exakta established the SLR as a dominant camera type, influencing the design and functionality of cameras for decades to come.

From left to right: The Eastman-Kodak Brownie (1900)(image: FotoVoyage); The Kine Exakta (1936)(image: British Science Museum); The Nikon D1 (1999)(images: Digital Kamera Museum)

The first consumer camera, the Kodak Brownie, was introduced in 1900 by the Eastman Kodak Company. This groundbreaking device was designed to be affordable and easy to use, bringing photography to the masses for the first time. The Brownie camera used roll film and featured a simple box design with a fixed focus lens and a single shutter speed. Priced at just one dollar, it democratized photography, allowing everyday people to capture their own memories. The Brownie’s success helped establish Kodak as a dominant player in the photography industry and set the foundation for the development of modern consumer cameras.

The camera is a device built around the lens, which is crucial for capturing and focusing light to create clear, sharp images on the film or digital sensor. The history of optics is therefore central to the history of the camera, as advancements in understanding light and lenses directly enabled the development of technologies essential for capturing precise and detailed images. In the 16th and 17th centuries, Dutch spectacle makers used their expertise in optics to invent devices for seeing things very far away or very small.

The invention of the first telescopes is credited to Dutch spectacle makers in the early 17th century, with Hans Lippershey often recognized as the earliest to apply for a patent in 1608. Lippershey’s design consisted of a convex objective lens and a concave eyepiece, which allowed for the magnification of distant objects. This breakthrough inspired other inventors, such as Galileo Galilei, who significantly improved the design by increasing the magnification power and utilizing it for astronomical observations. In 1609, Galileo built his own telescope and used it to discover celestial phenomena, such as the moons of Jupiter, the phases of Venus, and detailed observations of the Moon’s surface. These discoveries challenged the prevailing geocentric view of the universe and laid the groundwork for modern astronomy, transforming humanity’s understanding of the cosmos.

The invention of the first microscope is attributed to Dutch spectacle makers Hans Jansen and his son Zacharias Jansen in the late 16th century, around 1590. They created a compound microscope, which utilized two sets of lenses to achieve higher magnification than a single lens could provide. This groundbreaking invention allowed for the observation of objects too small to be seen with the naked eye, revolutionizing scientific inquiry and biological studies. The microscope’s ability to reveal the intricate details of tiny organisms and structures spurred significant advancements in various fields, including biology, medicine, and materials science. Subsequent improvements by other pioneers, such as Antonie van Leeuwenhoek and Robert Hooke, further enhanced the microscope’s capabilities, solidifying its role as an indispensable tool in scientific research and discovery.

The Camera Obscura is a device that projects an inverted image of its surroundings onto a surface through a small hole or lens, illustrating the fundamental principles of optics. Its history dates back to ancient times: the Chinese philosopher Mozi documented its principles in the 5th century BCE, describing how an inverted image forms when light passes through a small hole, laying the groundwork for future developments in optical science. A century or so later, Greek thinkers including Aristotle described the same phenomenon, although it is unclear whether they received the knowledge from the Chinese or discovered it independently.

The concept was refined during the Renaissance by artists and scientists such as Leonardo da Vinci, who used it to study perspective and light. By the 16th century, the Camera Obscura had evolved into a portable device with a lens to enhance image clarity, utilized by artists like Johannes Vermeer for accurate drawing and painting. In the 17th and 18th centuries, it became a popular tool for both scientific observation and artistic creation, laying the groundwork for the development of photographic cameras.

In Japan, during the Edo period (1603–1868), the camera obscura was used for artistic and educational purposes, aiding in the understanding of perspective and optics. This device’s ability to project accurate representations of the external world made it a valuable tool for both scientific inquiry and artistic endeavors across Asia during these centuries.

The Camera Obscura’s principle of capturing light and projecting images directly influenced the invention of the first cameras, making it a foundational technology in the history of photography.
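The optics at work here reduce to similar triangles: a scene point, the pinhole, and the projected point lie on one straight line, so the image is scaled by the ratio of the screen distance to the object distance and flipped through the hole. Here is a minimal sketch of that relationship in code (the function name and example values are my own, purely illustrative):

    # Pinhole (Camera Obscura) projection via similar triangles.
    # A point at height y and distance d_o from the pinhole lands at
    # height -y * (d_i / d_o) on a screen d_i behind the hole.

    def project_point(x, y, object_dist, screen_dist):
        """Project a scene point through a pinhole at the origin."""
        scale = screen_dist / object_dist
        return (-x * scale, -y * scale)  # negated: the image is inverted

    # Example: the top of a 2 m tall subject 10 m from the pinhole,
    # projected onto a screen 0.5 m behind the hole.
    print(project_point(0.0, 2.0, object_dist=10.0, screen_dist=0.5))
    # -> (-0.0, -0.1), i.e. a 10 cm tall, upside-down image

The same geometry governs lens-based cameras, which is why the Camera Obscura is such a direct ancestor of photographic optics.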

From left to right: The Nimrud Lens (800 BCE)(image: British Museum); Illustration of the Camera Obscura by Nicolaas Beets (1864)(image: Beets, Nicolaas. “Camera Obscura.” Amsterdam: Kessinger, republished 2009); Replica of the first Galileo telescope (1609)(image: Museo Galileo).

The Nimrud lens, also known as the Layard lens, was discovered in the ruins of the ancient Assyrian city of Nimrud (modern-day Iraq) by British archaeologist Austen Henry Layard in the mid-19th century. Dating back to the 8th century BCE, this piece of rock crystal, roughly ground into a convex shape, is one of the earliest known examples of a lens. Its exact purpose remains a topic of scholarly debate; it may have been used as a magnifying glass, a burning glass for starting fires, or even as a decorative piece. The Nimrud lens provides evidence that ancient Assyrian craftsmen had a rudimentary understanding of optical principles, demonstrating the early human effort to harness the power of light. This artifact is significant not only for its potential practical uses but also for its contribution to the long history of optical innovation, preceding the development of more sophisticated lenses and optical devices in later civilizations.

The Display

The modern computer display traces its lineage through a rich tapestry of visual and performance arts that reaches back thousands of years. Starting with the Passion Plays of ancient Egypt, which utilized dramatic storytelling and visual spectacle to convey religious narratives, the tradition of theater evolved to include the grand productions of the Shang Theater in ancient China and the immersive performances at the Teatro Olimpico in Renaissance Italy. American Vaudeville theater in the late 19th and early 20th centuries introduced a variety of live acts and early film screenings, serving as a bridge between live performance and recorded media. The advent of cinema in the early 20th century brought moving pictures to large audiences, leading to the development of television, which further miniaturized and popularized visual storytelling. The evolution continued with the transition from bulky cathode-ray tube (CRT) televisions to sleek, high-definition flat-panel displays. These advancements in display technology ultimately culminated in the sophisticated computer monitors of today, capable of delivering high-resolution images and videos, interactive graphics, and immersive experiences, all rooted in a long history of theatrical and cinematic innovation.

In the 1970s, the first home computers and gaming consoles ingeniously repurposed the ubiquitous home television set as their display screen, capitalizing on its widespread presence in households. Early home computers, like the Apple I and the Apple II, and gaming consoles, such as the Magnavox Odyssey and the Atari 2600, were designed to connect to a television, which served as an affordable and accessible output device. This innovation made these pioneering technologies more feasible and appealing to consumers by eliminating the need for a separate, costly monitor. The use of the television as a display allowed users to interact with digital content in new and exciting ways, from playing video games to programming and running software applications. This symbiotic relationship between early home computers, gaming consoles, and televisions played a crucial role in popularizing digital technology and paving the way for the integration of advanced multimedia systems in modern homes.

From left to right: The Apple I (1976)(image: Cynde Moya, own work, CC BY-SA 4.0); The Commodore PET (1977)(image: Rama, Wikimedia Commons, CC BY-SA 2.0 FR); the Atari 2600 (1977)(image: Evan-Amos, own work, Public Domain); a Zenith TV set (1976)(image: eBay).

The invention of the first consumer television sets in the late 1920s and early 1930s marked a significant milestone in the history of mass communication and entertainment. Scottish inventor John Logie Baird is credited with demonstrating one of the first working television systems in 1926, utilizing mechanical scanning techniques. This was followed by American inventor Philo Farnsworth’s development of the first fully electronic television system in 1927, which offered a clearer and more reliable image. By the early 1930s, several companies, including RCA in the United States and Telefunken in Germany, began producing and marketing television sets for home use. These early models featured small screens and limited broadcast content, yet they captivated the public imagination and laid the groundwork for the explosive growth of television as a dominant medium in the following decades. The introduction of television into homes revolutionized the way people consumed news, entertainment, and information, paving the way for the modern multimedia landscape.

The first cinemas and moving picture machines, such as the Praxinoscope, marked the beginning of the film industry and the public’s fascination with motion pictures. The Praxinoscope, invented by Frenchman Charles-Émile Reynaud in 1877, was an early animation device that used a series of hand-drawn images placed on a rotating drum and viewed through a set of mirrors, creating the illusion of motion. This device, along with others like the Zoetrope and the Phenakistoscope, laid the groundwork for more advanced motion picture technologies; such devices were in many ways the animated GIFs of their time. The advent of the Kinetoscope, developed by Thomas Edison and William Kennedy Laurie Dickson in the 1890s, allowed individuals to view short films through a peephole, while the Lumière brothers’ Cinematographe, introduced in 1895, enabled the projection of films to larger audiences. The first public screenings of films in cafés and theaters captivated audiences and led to the establishment of the first dedicated cinemas. These early experiences with moving pictures transformed entertainment and cultural consumption, paving the way for the development of the global film industry.

From left to right: Teatro Olimpico (begun in 1580)(image: Didier Descouens, own work, CC BY-SA 4.0); Vaudeville star Marie Dressler (1917)(image: unknown author, J. Willis Sayre Collection of Theatrical Photographs, Public Domain); a Praxinoscope (1890s)(image: Museum of the History of Science, Oxford, UK); an Edison Kinetoscope in use (1895)(image: Wikimedia Commons, Public Domain).

American Vaudeville Theater, which flourished from the late 19th century to the early 20th century, was a diverse and dynamic form of entertainment that featured a variety of acts, including comedy, music, dance, magic, and more. Vaudeville’s rise coincided with a period of rapid urbanization and technological advancement in the United States, making it a popular form of mass entertainment. The structure of Vaudeville shows, with their continuous, varied performances, provided a template for early film screenings. As motion picture technology advanced, Vaudeville theaters began to incorporate short films into their programs, recognizing the audience’s growing interest in this new medium. By the early 20th century, many Vaudeville theaters had transitioned into or were replaced by dedicated movie theaters, which offered a new way to experience entertainment on a larger scale. The skills and techniques honed in Vaudeville, particularly in terms of timing, humor, and visual spectacle, significantly influenced early filmmakers and the development of cinema as a major entertainment industry. This transition marked a pivotal shift in American entertainment, blending live performance traditions with the emerging art of film. In many ways, American Vaudeville is also a precursor to the formats of popular online and streaming content we see today.

The Teatro Olimpico, located in Vicenza, Italy, is one of the most significant and historic theaters in the world. Designed by the renowned Renaissance architect Andrea Palladio and completed by his student Vincenzo Scamozzi in 1585, it is the oldest surviving indoor theater in Europe. The Teatro Olimpico is celebrated for its classical architecture, which includes a stunningly detailed stage set that replicates a classical Roman city, complete with perspective streets and buildings. This innovative design created an immersive environment for audiences and marked a significant advancement in theater architecture and stagecraft. The theater’s inauguration featured a production of Sophocles’ “Oedipus Rex,” emphasizing its roots in classical drama. The Teatro Olimpico’s blend of architectural beauty and technical ingenuity had a profound influence on the design of subsequent theaters and is considered a masterpiece of Renaissance art and engineering. Its enduring legacy lies in its role as a bridge between ancient theatrical traditions and modern stage design, setting new standards for the visual and experiential aspects of theatrical performances. The immersive experience for audiences was in some ways a forecast of contemporary virtual reality devices.

The Shang Theater in ancient China flourished during the Shang Dynasty (1600–1046 BCE), and is considered one of the earliest forms of organized theater in history. Though direct evidence of the Shang Theater is sparse, historical texts and archaeological findings suggest that theatrical performances were integral to religious and ceremonial life. These early performances likely included ritualistic dances, music, and dramatic storytelling, which were performed to honor ancestors, celebrate harvests, and invoke blessings from deities. The integration of these elements into Shang religious ceremonies laid the groundwork for the rich tradition of Chinese theater. Over time, these early forms of theatrical expression evolved, influencing later theatrical forms such as the Zhou Dynasty’s music and dance dramas, and ultimately contributing to the development of classical Chinese opera. The Shang Theater’s role in ceremonial and cultural life underscores its importance in the early development of theatrical traditions in China, marking the beginnings of a long and diverse history of Chinese performance arts.

The Passion Plays in ancient Egypt, performed during the Middle Kingdom (c. 2050–1710 BCE), are among the earliest known examples of theatrical performance. These dramatic enactments were centered around the story of Osiris, the god of the afterlife, and his death and resurrection. Held annually in the city of Abydos, these plays were part of larger religious festivals that included processions, rituals, and reenactments of Osiris’s myth, symbolizing themes of death, rebirth, and the eternal cycle of nature. The participants, often priests and laypersons, would dramatize key events such as Osiris’s murder by his brother Set, the mourning of his wife Isis, and his eventual resurrection and ascension as the ruler of the underworld. These performances not only served religious and ceremonial purposes but also reinforced the social and moral order of ancient Egyptian society. The Passion Plays of Abydos exemplify how early theatrical traditions were deeply intertwined with religious beliefs and practices, setting a precedent for the use of drama in conveying spiritual narratives.

The Keyboard

The modern computer keyboard, an essential tool in contemporary computing, has a fascinating lineage that traces back to various historical instruments and devices. Its most direct ancestor is the typewriter, invented in the 19th century, which introduced the layout and mechanical action of pressing keys to produce text. The typewriter itself drew inspiration from musical instruments like the piano and the clavichord, which feature rows of keys that produce sound when struck. These musical keyboards evolved from earlier innovations such as the ancient Greek Hydraulis, an early form of the organ that used water to regulate the air pressure that sounded its pipes, and the Middle Eastern hammered dulcimer, which involves striking strings with small hammers. These instruments established the fundamental concept of a key-operated mechanism, ultimately influencing the design and functionality of modern keyboards. Through this lineage, the computer keyboard inherited not only its physical structure but also the cultural and technological advancements that each predecessor contributed over centuries.

The first uses of typewriter-style keyboards as computer inputs emerged in the mid-20th century, significantly transforming human-computer interaction. During the 1940s, early computers like the ENIAC utilized patch cables and switches for input, but as computing technology advanced, there was a growing need for more efficient and user-friendly input methods. The transition began in earnest with the introduction of the UNIVAC I in 1951, which employed a typewriter-style keyboard for data entry and programming. This innovation allowed operators to input data directly and interact more intuitively with the machine. The adoption of typewriter-style keyboards accelerated with the development of time-sharing systems in the 1960s, which enabled multiple users to interact with a computer simultaneously via terminals equipped with keyboards. By the late 1970s and early 1980s, personal computers like the Apple II (1977) and the IBM 5150 (1981) solidified the keyboard’s role as the primary input device, leveraging the familiar QWERTY layout from typewriters. This integration of typewriter-style keyboards into computing laid the foundation for modern human-computer interfaces, making technology more accessible and functional for everyday users.

The IBM Selectric typewriter, introduced in 1961, revolutionized the world of typing and had a profound influence on the development of modern computers. I am intimately familiar with this product line, as I learned how to type on an IBM Selectric III in 1985. Unlike traditional typewriters, the Selectric used a revolutionary “golf ball” type element instead of individual typebars, which allowed for faster typing speeds, reduced jams, and interchangeable fonts. This innovation not only improved the efficiency and versatility of typewriting but also laid the groundwork for the development of electronic text editing. In the 1970s, IBM integrated Selectric mechanisms into computer terminals, creating the IBM 2741, which combined the reliability and familiarity of the Selectric keyboard with the power of computing. These terminals were widely used for data entry, programming, and word processing, bridging the gap between typewriting and computing. The Selectric’s design principles influenced the development of early computer keyboards, helping to establish the standard QWERTY layout and key mechanisms that are still in use today. Its legacy is evident in the ergonomic and functional design of modern computer keyboards, underscoring the Selectric’s pivotal role in the evolution of typing and computer input technologies.

From left to right: A piano made by Cristofori (1726)(image: Opus33, own work, CC BY-SA 4.0); an IBM Selectric I (1961)(image: steve lodefink, Flickr, CC BY 2.0); IBM Selectric type ball for changing fonts (image: Scs, own work, Public Domain).

In the late 19th and early 20th centuries, the typewriter emerged as a revolutionary consumer product, capturing the public’s imagination and becoming a symbol of modernity and efficiency. As businesses and individuals increasingly recognized the typewriter’s potential to enhance productivity and streamline communication, its popularity soared. The release of new typewriter models, such as those by Remington, Underwood, and Smith Corona, became highly anticipated events, drawing large crowds eager to witness the latest advancements in technology. Demonstrations and exhibitions showcased the typewriters’ capabilities, and the sleek designs and innovative features captivated audiences. The excitement surrounding these launches reflected the broader societal fascination with technological progress and the growing demand for tools that could improve daily life and work. Typewriters became essential fixtures in offices, homes, and educational institutions, revolutionizing the way people wrote and communicated, and solidifying their status as a must-have device in an increasingly industrialized and literate society.

The Sholes & Glidden typewriter, invented by Christopher Latham Sholes and introduced in 1874 by E. Remington and Sons, was the first commercially successful typewriter and a significant milestone in the history of typing technology. This pioneering machine featured the QWERTY keyboard layout, which was designed by Sholes and his collaborators to reduce the jamming of type bars by placing commonly used letter pairs apart. Sholes drew inspiration for the typewriter keys from piano keys, envisioning a similar layout that would allow for efficient and rapid typing. Despite initial challenges, including a cumbersome design and the inability to see the typed text immediately, the Sholes & Glidden typewriter offered unprecedented efficiency in writing and document production. The QWERTY layout, despite being somewhat counterintuitive, proved effective in minimizing mechanical issues and quickly became the industry standard. Its success paved the way for the widespread adoption of typewriters in offices and homes, fundamentally transforming business communications and personal correspondence. The legacy of the Sholes & Glidden typewriter endures in the modern computer keyboard, which retains the QWERTY layout, highlighting the enduring impact of its design on usability and efficiency in text entry.

The clavichord, invented in the early 14th century, is one of the earliest stringed keyboard instruments and a predecessor to the piano. It featured a simple mechanism where keys pressed metal blades, known as tangents, against strings to produce sound. The clavichord was valued for its expressive control over volume and sustain, but it was relatively quiet, limiting its use to intimate settings. The piano, invented by Bartolomeo Cristofori around 1700 in Italy, revolutionized keyboard instruments by addressing the limitations of its predecessors, including the clavichord and the harpsichord. Cristofori’s innovative hammer mechanism allowed for dynamic variation in sound, from soft to loud, by varying the pressure applied to the keys. This invention, initially called the “gravicembalo col piano e forte” (harpsichord with soft and loud), combined the expressive capabilities of the clavichord with the volume needed for larger performances, leading to its eventual dominance in both domestic and concert settings. The piano’s versatility and expressive range profoundly influenced music composition and performance, making it a cornerstone of Western music.

Keyed instruments trace their lineage back to ancient innovations such as the Greek Hydraulis and the Middle Eastern hammered dulcimer. The Hydraulis, invented in the 3rd century BCE, was an early type of pipe organ that used water to maintain steady air pressure, with keys that controlled the flow of air through the pipes. This instrument laid the groundwork for the development of later keyboard instruments by introducing the concept of keys to control pitch. Meanwhile, the hammered dulcimer, which originated in the Middle East around the 9th century, utilized strings struck by small hammers to produce sound. Though it lacked a keyboard, the dulcimer’s principle of using hammers to strike strings directly influenced the development of the clavichord and the piano. These early instruments provided crucial technological and conceptual foundations, ultimately leading to the creation of sophisticated keyed instruments that combined elements of both the Hydraulis’s key mechanism and the dulcimer’s string-striking technique. This lineage underscores the rich, cross-cultural heritage of keyboard instruments, culminating in the pianos and organs that became central to Western music traditions.

The Graphical User Interface (GUI)

The modern Graphical User Interface (GUI) traces its lineage through a rich history of innovation, beginning with ancient repositories of knowledge like the Library of Ashurbanipal in Assyria, which organized vast amounts of information on clay tablets. This tradition of information organization and accessibility continued with educational institutions like Al Qarawiyyin University in Fez, one of the oldest universities in the world, fostering environments where knowledge could be systematically structured and shared. Fast forward to the 20th century, the conceptual groundwork for GUIs was laid by Vannevar Bush’s Memex, envisioned in 1945 as a hypothetical device that used analog computers to link information through associative trails, prefiguring hypertext. The Xerox Alto, developed in the 1970s, was the first computer to implement a GUI, featuring windows, icons, and a mouse, significantly shaping the user experience. Building upon these foundations, Apple’s release of MacOS 1 in 1984 brought the GUI to mainstream audiences, refining the interface with intuitive design elements inspired by Xerox PARC’s research. MacOS 1’s emphasis on ease of use and visual interaction set the standard for future operating systems, cementing the GUI as an essential component of modern computing. This evolution from ancient information systems to contemporary digital interfaces highlights the enduring quest to make knowledge accessible and manageable through tools that extend human memory and cognition.

MacOS 1, officially released by Apple in 1984 as the “System Software,” marked a revolutionary leap in personal computing by introducing a graphical user interface (GUI) that departed significantly from the text-based interfaces dominant at the time. Developed under the guidance of Steve Jobs, MacOS 1 leveraged innovative ideas from Xerox PARC’s research on GUIs, which included the use of windows, icons, and a point-and-click mouse interface. This user-friendly design allowed for intuitive interaction with the computer, democratizing access to technology and setting a new standard for software usability. MacOS 1’s success and influence were profound, laying the groundwork for subsequent iterations of Mac operating systems and inspiring the development of other GUI-based operating systems, such as Microsoft Windows. The principles of direct manipulation, visual feedback, and ease of use established by MacOS 1 have become fundamental to modern software design, making it a cornerstone in the evolution of user interfaces that continues to shape the way people interact with technology today.

At left and center: Xerox Alto GUI and icon set (mid-1970s)(images: Computer History Museum). At right: MacOS 1 (1984)(image: uncredited, Apple Wiki).

The Xerox Alto, developed in the early 1970s at Xerox’s Palo Alto Research Center (PARC), was a groundbreaking project that introduced the world to the modern Graphical User Interface (GUI). The Alto featured a revolutionary interface that included windows, icons, menus, and a point-and-click mouse, which allowed users to interact with the computer in a visual and intuitive manner. This was a significant departure from the text-based command-line interfaces that were prevalent at the time. The GUI concepts pioneered by the Alto were heavily influenced by earlier ideas, such as Vannevar Bush’s Memex and Douglas Engelbart’s work on interactive computing. Although the Alto itself was not commercially successful, its innovative design profoundly influenced the development of future computing systems. Apple co-founder Steve Jobs famously visited Xerox PARC and was inspired by the Alto’s GUI, leading to the incorporation of similar elements in Apple’s Lisa and Macintosh computers. The Alto’s GUI laid the foundational principles for user interface design, shaping the development of operating systems and software applications that are still used today.

A diagram of the Memex machine by Vannevar Bush showing its various components and interfaces. This is where our current concepts of the “system desktop” and “hypertext” originate. (“As We May Think,” 1945).

Vannevar Bush’s Memex machine, conceptualized in his 1945 essay “As We May Think,” was a visionary idea that prefigured many elements of modern computing. The Memex was conceived as an electromechanical device that would store vast amounts of information on microfilm, allowing users to create, annotate, and retrieve documents through associative links, much like modern hypertext. Bush envisioned a “desktop” interface where users could organize and access their data intuitively, laying the groundwork for the graphical user interfaces we use today. The Memex also anticipated semantic search capabilities, enabling users to find documents based on content and context rather than just keywords. Additionally, Bush’s concept included dual displays for multitasking and a scanner-like input device to digitize paper documents, paralleling modern peripherals. Although the Memex was never built, its innovative ideas directly influenced the development of personal computing and information technology. The conceptual framework of the Memex can be seen in everything from desktop metaphors and hyperlinking to advanced search engines and document management systems, highlighting its lasting impact on the evolution of modern computers.
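Bush’s associative trails map naturally onto data structures any programmer would recognize today. Here is a toy sketch (my own simplification, not anything Bush specified) of a trail as an ordered walk through linked documents, borrowing the Turkish-bow research example from his essay:

    # Toy model of a Memex associative trail: named documents joined
    # by user-created links and revisited in the order they were tied.
    documents = {
        "turkish-bow": "Notes on the short Turkish bow of the Crusades...",
        "longbow": "An account of the English long bow in battle...",
        "elasticity": "Physical constants of available bow materials...",
    }

    trail = ["turkish-bow", "longbow", "elasticity"]  # one associative trail

    for name in trail:
        print(name, "->", documents[name])

Each hop in such a trail is, in effect, a hyperlink, which is why the Memex is so often cited as the conceptual ancestor of hypertext.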

Al Qarawiyyin University, founded in 859 CE in Fez, Morocco, by Fatima al-Fihri, holds the distinction of being the oldest continuously operating educational institution in the world. As an early university, it represented a monumental advancement in the organization and dissemination of knowledge, effectively serving as an analog computer powered by human intellect and repositories of recorded information. Al Qarawiyyin played a crucial role in preserving and transmitting scientific, philosophical, and religious knowledge across generations, extending and enhancing human memory and cognition. The university’s extensive libraries and scholarly activities created a dynamic environment where knowledge could be stored, retrieved, and expanded upon, much like the functions of modern digital databases. By fostering a culture of learning, critical thinking, and intellectual exchange, Al Qarawiyyin exemplified how educational institutions can act as powerful tools for cognitive enhancement, allowing societies to accumulate and refine knowledge over time. This model of the university as a human-powered repository of information laid the groundwork for the intellectual traditions that continue to drive academic and technological progress today.

At left: Artifacts from the Library of Ashurbanipal in the British Museum, dating from 7th century BCE from Nineveh, Assyria (modern-day Iraq)(image: By Gary Todd — Flickr, CC0). At right: The courtyard (Sahn) of the Qarawiyyin Mosque at the University of al-Qarawiyyin in Fes, Morocco. (image: By Momed.salhi — Own work, CC BY-SA 4.0)

The Library of Ashurbanipal, established in the 7th century BCE in Nineveh, Assyria (modern-day Iraq), is one of the earliest known libraries and represents a seminal effort to compile and preserve human knowledge. Assembled under the reign of Ashurbanipal, the last great king of the Neo-Assyrian Empire, this vast collection housed thousands of clay tablets inscribed with cuneiform script, covering a wide range of subjects including literature, history, medicine, astronomy, and legal texts. The library’s systematic organization and extensive cataloging of information created an early form of a database, enabling the storage, retrieval, and dissemination of knowledge across generations. By preserving and making accessible a wealth of information, the Library of Ashurbanipal extended human memory beyond individual lifespans, ensuring that accumulated wisdom and cultural achievements could be transmitted to future generations. This repository of information not only served as a critical resource for contemporary scholars but also laid the groundwork for the development of future libraries and information management systems, highlighting the enduring human quest to collect, organize, and preserve knowledge.

The Mouse

The computer mouse, or pointing device, stands out among technological innovations as it lacks ancient origins. It is a distinctly 20th-century invention created explicitly for computing. Nonetheless, two inventions from the nineteenth and early twentieth centuries share human-machine interaction (HMI) characteristics with the mouse: the telegraph machine and the airplane control stick, or joystick.

Conceived by Douglas Engelbart in the 1960s, the mouse was designed as part of his groundbreaking work on the oN-Line System (NLS), which aimed to augment human intellect through advanced computing tools. Engelbart’s vision materialized in the form of the “X-Y Position Indicator for a Display System,” patented in 1970, which introduced a simple, intuitive way for users to interact with graphical interfaces by moving a cursor across a screen. Unlike other tools and devices that evolved over centuries from earlier concepts, the mouse was a direct response to the needs of modern computing, providing a revolutionary method for navigating digital environments. Its introduction transformed user interfaces and remains integral to personal computing, highlighting its unique and innovative role in the history of technology.
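At its core, the device Engelbart patented reports relative X-Y motion that the computer accumulates into an absolute cursor position on screen. A minimal sketch of that idea in modern terms (my own simplification of the concept, not SRI’s actual code):

    # How relative mouse motion becomes a cursor position: accumulate
    # (dx, dy) movement reports and clamp to the screen boundaries.
    SCREEN_W, SCREEN_H = 1024, 768  # illustrative resolution

    class Cursor:
        def __init__(self):
            self.x, self.y = SCREEN_W // 2, SCREEN_H // 2  # start centered

        def move(self, dx, dy):
            """Apply one relative movement report from the device."""
            self.x = max(0, min(SCREEN_W - 1, self.x + dx))
            self.y = max(0, min(SCREEN_H - 1, self.y + dy))

    cursor = Cursor()
    cursor.move(40, -25)  # roll the mouse right and up
    print(cursor.x, cursor.y)  # 552 359

Every pointing device since, from trackballs to trackpads, performs some version of this delta-to-position translation.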

Replica of the first prototype X-Y Position Indicator, later renamed the mouse, by Douglas Engelbart (1968)(image: SRI International, CC BY-SA 3.0).

The NLS (oN-Line System), developed by Douglas Engelbart and his team at the Stanford Research Institute (SRI) in the 1960s, was a pioneering computer system designed to augment human intelligence through interactive computing. Engelbart envisioned a future where computers would be used to enhance human problem-solving capabilities, and the NLS was a manifestation of this vision. It introduced many revolutionary concepts and technologies, including hypertext, video conferencing, real-time collaborative editing, and the computer mouse. On December 9, 1968, Engelbart demonstrated the NLS in what has come to be known as the “Mother of All Demos.” During this landmark presentation, Engelbart showcased the system’s capabilities to a stunned audience, demonstrating live text editing, graphics, and remote collaboration, all controlled with the newly invented mouse. This groundbreaking demo not only showcased the potential of interactive computing but also laid the foundation for many aspects of modern computing, influencing the development of graphical user interfaces, personal computing, and the internet. Engelbart’s work with the NLS and his visionary demo are celebrated as pivotal moments in the history of technology, highlighting the transformative power of innovative thinking and collaboration.

After the groundbreaking work on NLS at SRI, several members of Engelbart’s team, including mouse co-developer Bill English, moved to Xerox PARC, where they contributed to the development of the Alto. The Alto incorporated many of Engelbart’s innovative ideas, including the use of a graphical user interface and the computer mouse, further advancing the field of personal computing.

In 1907, French aviator and inventor Robert Esnault-Pelterie made a significant contribution to aviation by inventing the first joystick for aircraft control. His innovative design provided a more intuitive and effective method for pilots to maneuver their planes. The joystick, a vertical stick mounted in the cockpit, allowed the pilot to control the ailerons and elevators by moving the stick in different directions — left and right to tilt the aircraft’s wings and forward and backward to adjust the nose’s pitch. This invention greatly enhanced the precision and responsiveness of aircraft control, marking a pivotal advancement in aviation technology. Esnault-Pelterie’s joystick set the standard for future aircraft controls, influencing the design of modern flight control systems. The aircraft joystick and the computer mouse both serve as human-machine interfaces that translate the user’s physical movements into corresponding actions within a system, enabling precise control and interaction.

The invention of the telegraph in the early 19th century revolutionized long-distance communication by transmitting electrical signals over wires. Samuel Morse, along with Alfred Vail, developed the telegraph system and its accompanying Morse code, a series of dots and dashes representing letters and numbers. The interface of the telegraph was simple yet effective: it consisted of a key, which the operator would press and release to create the signals, and a sounder, which would receive and convert the electrical signals back into audible clicks. This design allowed for quick and efficient transmission of messages, with operators needing to learn the rhythmic patterns of Morse code to send and receive information accurately. The interaction design focused on ease of use and reliability, making it possible for trained operators to communicate over vast distances with unprecedented speed and accuracy, laying the groundwork for modern telecommunications. The telegraph key is similar to the computer mouse button in that both interfaces translate simple, deliberate user actions into signals that can be interpreted by a system to perform specific tasks.
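That mapping from key presses to coded signals is easy to make concrete. Below is a minimal sketch of Morse encoding (the table is deliberately abbreviated and the function name is my own):

    # Minimal Morse encoder: characters become dot/dash patterns, the
    # way an operator's timed key presses encode text as pulses.
    MORSE = {
        "A": ".-", "E": ".", "H": "....",
        "O": "---", "S": "...", "T": "-",
    }  # abbreviated table, for illustration only

    def to_morse(text):
        """Encode text as Morse, separating letters with spaces."""
        return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

    print(to_morse("SOS"))  # ... --- ...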

While the telegraph and joystick did not directly influence the design of the computer mouse, the similarities in their HMI are worth noting. In developing the NLS, Douglas Engelbart tested various input devices, including a joystick, a stylus, and a mouse. Through his experiments, he discovered that the mouse had the best usability, making it the preferred choice for interacting with computers. The joystick, ideal for flying aircraft, turned out to be less optimal for moving a cursor around a screen. Nonetheless, the joystick would find a home in the computer gaming industry beginning in the early 1970s. It’s remarkable that we still use the mouse (or its sibling, the trackpad), a testament to its outstanding usability for controlling graphical user interfaces.

Closing the loop

Most components of the modern computer trace their origins to ancient inventions, illustrating a long history of technological evolution. The digital camera, for instance, can trace its roots back to ancient optical devices such as the Camera Obscura, which utilized lenses to project images. Modern displays have their lineage in ancient theaters, where visual storytelling and stagecraft laid the groundwork for screen-based entertainment and information presentation. The typing keyboard’s ancestry includes antique keyed musical instruments like the clavichord and hydraulis, which pioneered the use of keys to produce specific outputs. The GUI has conceptual ties to ancient schools and libraries, such as the Library of Ashurbanipal, which organized and facilitated access to vast amounts of information, much like modern databases. Among these historically grounded components, the computer mouse stands out as a uniquely modern invention, created explicitly by Douglas Engelbart in the 20th century to improve human-computer interaction.

Modern computing is the result of a rich tapestry of technological and cultural advancements that span continents and millennia. In addition to the examples above, many other inventions contributed to modern computing. European innovations, such as the mechanical calculator, were built on the mathematical and scientific knowledge preserved and expanded by Middle Eastern scholars during the Islamic Golden Age. North African contributions, particularly in mathematics and astronomy, also played a crucial role. Meanwhile, East Asia provided significant advancements in algorithms and early computing and time-keeping devices, like the Chinese abacus (a counting machine), the Japanese soroban (a similar device), and the Korean Jagyeokru, a sophisticated water-powered clock. Ancient India also contributed to modern computing with the invention of the concept of zero, the decimal system, early binary-like systems, and algorithmic processes developed by mathematicians like Aryabhata, Pingala, Brahmagupta, and Bhaskara.

However, the full potential of computing technology can only be realized by incorporating perspectives from cultures that historically had fewer opportunities to contribute. Indigenous peoples of Sub-Saharan Africa, the Americas, Pacific Islanders, and the indigenous populations of Indonesia, Australia, and New Zealand possess unique knowledge systems, problem-solving approaches, and worldviews that could lead to more holistic and innovative solutions. Integrating these diverse perspectives would not only honor these cultures but also enhance the richness and applicability of modern computing, driving further technological and societal advancements. Innovation thrives on diverse perspectives and shared knowledge, and we now have the potential for global inclusion in the design of human-computer interaction.

Sources:

  • Brereton, Gareth. I am Ashurbanipal: King of the World, King of Assyria. London, UK: Thames & Hudson, 2020.
  • Bush, Vannevar. The Essential Writings of Vannevar Bush. New York, NY: Columbia University Press, 2022.
  • Cuomo, S. Technology and Culture in Greek and Roman Antiquity. Cambridge, UK: Cambridge University Press, 2007.
  • DK Publishing. Imperial China (DK Classic History). New York, NY: DK Publishing, 2020.
  • Gustavson, Todd, and George Eastman House. Camera: A History of Photography from Daguerreotype to Digital. New York, NY: Union Square & Co., 2012.
  • Haigh, Thomas and Ceruzzi, Paul E. A New History of Modern Computing. Boston, MA: The MIT Press, 2021.
  • Ilham, Talbi. “Fatima al-Fihri: Founder of the world’s oldest university.” Deutsche Welle (DW), May 8, 2020. https://www.dw.com/en/fatima-al-fihri-founder-of-the-worlds-oldest-university/a-53371150
  • Lewis, Robert M. From Traveling Show to Vaudeville: Theatrical Spectacle in America, 1830–1910. Baltimore, MD: Johns Hopkins University Press, 2007.
  • Markoff, John. What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry. Penguin Books, 2006.
  • Miettinen, Jukka O. “The Early History of Chinese Theater” from Asian Traditional Theatre & Dance. Teatterikorkeakoulun julkaisusarja: https://disco.teak.fi/asia/the-early-history-of-chinese-theatre/
  • Nyce, James M.; Kahn, Paul. From Memex To Hypertext: Vannevar Bush and the Mind’s Machine. Cambridge, MA: Academic Press, 1991.
  • Oosting, J. Thomas. Andrea Palladio’s Teatro Olimpico. Ann Arbor, MI: UMI Research Press, 1981.
  • Paz, Emilio Bautista; Ceccarelli, Marco; Otero, Javier Echávarri; Sanz, José Luis Muñoz. A Brief Illustrated History of Machines and Mechanisms. New York, NY: Springer Press, 2010.
  • Shaw, Ian. The Oxford History of Ancient Egypt. Oxford, UK: Oxford University Press, 2000.
  • Sheikh, Naeem Ur Rahman. Al-Qarawiyyin: The World’s Oldest Operating University. Self-published: 2024.
  • Smith, Mark A. From Sight to Light: The Passage from Ancient to Modern Optics. Chicago, IL: University of Chicago Press, 2017.
  • Solnit, Rebecca. River of Shadows: Eadweard Muybridge and the Technological Wild West. London, UK: Penguin Books, 2004.
  • Van Lente, Dick. Prophets of Computing: Visions of Society Transformed by Computing. ACM Books, 2022.
  • Weil, Peter, and Paul Robert. Typewriter: A Celebration of the Ultimate Writing Machine. New York, NY: Union Square & Co., 2016.
  • Weller, Charles Edward. The Early History of the Typewriter. Classic Reprint, 2012.
  • Williams, John-Paul. The Piano: An Inspirational Guide to the Piano and Its Place in History. New York, NY: Billboard Books, 2002.


Julian Scaff

Interaction Designer and Futurist. Associate Chair of the Master of Interaction Design program at ArtCenter College of Design.