Among the more influential truisms about science today is that it is essential for technological — and thus economic — progress. It is fitting, then, that the apparent slowing of American innovation has fueled a debate about the importance of science and the need for the federal government to support it.
Indeed, there is growing interest across the political spectrum in revitalizing American innovation, raising questions about how best to allocate scarce resources. What kinds of research should we support? Who should decide — government or industry or the scientific community? Should we emphasize science or technology? Should we steer research toward solving practical problems or simply leave science free to pursue its own aims?
When asking these questions, we typically take for granted that scientific research is necessary for innovation. But while it may be a truism today, this contention is in fact a modern one, best known from the writings of Francis Bacon. And it rests on an important claim about — and, too often, a misunderstanding of — the relationship between science and technology.
Bacon was among the first thinkers to argue that scientific knowledge should not be pursued as an end in itself but rather as a means to an end — the improvement of the human condition. Science, in other words, is essentially useful, especially by enabling the technological mastery of nature. Such Baconian rhetoric is so familiar to us today that it likely passes unnoticed. Scientists have long invoked the practical fruits of their trade, especially when seeking public recognition or funding for science. As historian of science Peter Dear wrote in The Intelligibility of Nature (2006):
The authority of science in the modern world rests to a considerable extent upon the idea that science is powerful; that science can do things. Television sets, or nuclear explosions, can act as icons of science because it is taken for granted that they somehow legitimately represent what science really is.
In this way, what Dear calls the “instrumentality” of science “stands for the whole of science.” Science, we might say, gets absorbed into technology.
When thinking about the technological fruits that we expect from science, we are now all Baconians. A critical examination of this inheritance, then, may help illuminate today’s debates about science, technology, and innovation. As we shall see, Bacon’s contention that scientific knowledge is useful — even essential — for technological innovation was ultimately vindicated by history. But the story of how this came to be is more complicated than we typically assume.
Even though science and technology have developed into overlapping and mutually reinforcing fields, they were and remain distinct. The paradox — what I call the Baconian paradox — is that as science becomes more useful for technology (and vice versa), technology tends to overshadow science. As a result, we fail to recognize any meaningful distinction between the two. But this is dangerous for both enterprises. When science is constrained by technological usefulness, scientific knowledge pursued for its own sake falls by the wayside; but besides being intrinsically valuable, this type of knowledge often bears technological fruit.
If we want both science and technology to flourish, we may well need to temper our Baconian rhetoric — to promote science as an end in itself, rather than as a means to technological innovation only — precisely because science has become so useful, just as Bacon predicted.
The year 2020 marked the four hundredth anniversary of Francis Bacon’s New Organon, in which he sought to reform science by modeling it on the mechanical arts — what today we would call technology. Bacon’s 1620 treatise argued that science, like the mechanical arts, should produce “useful works.” In contrast to ancient and medieval science, which sought knowledge for its own sake, science — or natural philosophy, as it was then called — should “be judged of by its fruits, and pronounced frivolous if it be barren.” Science is not to be pursued for the sake of wisdom, in other words, but — as Bacon famously put it in The Advancement of Learning — “for the glory of the Creator and the relief of man’s estate.”
This reformation of knowledge, which Bacon calls the “Great Instauration,” would be accomplished with the help of a new kind of logic, a new “organon.” Organon, meaning “tool” or “instrument” in ancient Greek, is the name conventionally given to Aristotle’s collection of writings on logic. But the difference between the old organon and the new is often misunderstood. One popular misconception is that Bacon proposed induction, reasoning from particulars to universals (say, from observations to natural laws) whereas Aristotle and his medieval followers practiced only deduction, reasoning from universals to particulars (for example, from general concepts to observations of particular instances of those concepts). This mistake is closely related to another canard: that pre-moderns did not make any observations, instead relying entirely on syllogistic reasoning. Bacon himself was under no such illusions: He wanted to replace Aristotle’s method of induction with a new and improved one.
The problem with Aristotelian induction, Bacon charges, is that it moves too blithely from sensory observations to universal statements — with these universals then serving as premises in syllogistic arguments. Human reason, Bacon warns, has a natural tendency to fly off into abstraction, and so, unless weighted down by experience, it will fail to extend our knowledge, even while creating the illusion of doing so. Like spiders, says Bacon in a memorable passage, the scholastic philosophers of the Middle Ages, following Aristotle, spin intricate webs in the air out of their own thoughts, leaving behind experience and producing only vacuous and irresolvable disputations.
These rationalists, as Bacon called them, are not the only victims of his ire. He also excoriated those who, he believed, relied blindly on empirical experience, such as the practical craftsmen and natural historians of his day. Like ants, they pile up stores from sensory observations without understanding them. This “Empirical school” thus winds up giving “birth to dogmas more deformed and monstrous than the Sophistical or Rational school.” This point is often overlooked in cartoonish portrayals of Baconian induction as the mere accumulation of facts. Rather than just relying on plain observation, Bacon “sought … to provide helps for the sense — substitutes to supply its failures, rectifications to correct its errors … by experiments” (emphasis added). Experiments would not only tether the intellect to experience; they would also discipline the senses.
With this mediating role for experiment, Bacon believed that he had “established forever a true and lawful marriage between the empirical and the rational faculty.” Unlike the ant that merely collects details (the empirical), and also unlike the spider that spins webs in the air (the rational), the scientist should be like the bee, which “gathers its material from the flowers of the garden and of the field, but transforms and digests it by a power of its own.”
It’s worth spelling out further how this vision of science departs from classical, Aristotelian notions. Contrary to some caricatures, the ancients and medievals did make observations — the slogan “there is nothing in the mind that was not first in the senses” was an Aristotelian principle revered by the scholastics. Some even performed experiments, notably Archimedes in third-century-b.c. Greece and Roger Bacon in thirteenth-century England. But the prevailing view was that we obtain knowledge of nature by observing things in their natural settings.
Bacon, by contrast, proposed examining “nature under constraint and vexed; that is to say, when by art and the hand of man she is forced out of her natural state, and squeezed and moulded.” Nature could be forced to give up her secrets, but only if pressed with instruments and artificial techniques. Possessed with such knowledge, the natural philosopher would gain not disinterested knowledge but power.
In allying knowledge with power, Bacon did not, unlike many of his followers, seek to abolish the distinction between science and technology — between natural philosophy and mechanical art. He certainly wanted natural philosophy to bear fruit and thus to contribute to improving the human condition. But such utilitarian benefits were the result of knowledge gained through the experimental study of nature. As historian of science Paolo Rossi notes, the practical results of science for Bacon were not “artefacts” — particular tools or inventions like the printing press — but rather knowledge of nature’s underlying causes. Equipped with such knowledge, natural philosophers would have the power to harness nature for their own purposes; scientific knowledge would enable technological power. In this sense, Bacon declared, “those twin objects, human knowledge and human power, do really meet in one.”
Bacon recognized that such knowledge could be of great value to the state. Accordingly, he endeavored (unsuccessfully) to secure political support for his grand proposal for a new science. In his utopian story New Atlantis [from which this journal takes its name –Ed.], he envisioned a large-scale, organized research enterprise, overseen by scientists but supported by public funds and geared toward the betterment of society. It would take over three centuries for something resembling this vision to become reality — perhaps nowhere more clearly than in the American effort to establish scientific and technological preeminence during and after World War II.
Bacon himself made no real scientific contributions, although he did perform some of his own experiments. In one, a test of whether snow could preserve the flesh of a dead chicken, he reportedly fell ill and died of pneumonia. But while Bacon was primarily a statesman, jurist, and philosopher — rather than a practicing scientist — he is undoubtedly among the most important theorists of modern science, which is why he is often referred to as the “father of modern science.”
Scholars continue to debate the extent of Bacon’s influence on the actual practice of science in the centuries after his death, for example how well his logic of induction captures the methods of modern science. But there is no dispute that Bacon’s understanding of science — especially his proposal that it “be judged of by its fruits” — continues to influence our hopes and fears about the nature of science and its place in society.
Bacon’s idea that science is about gaining power rather than acquiring wisdom has attracted critics of modernity no less than its cheerleaders. Writing in 1944, Theodor Adorno and Max Horkheimer singled out Bacon as representative of the Enlightenment, which sought to conform everything to “the standard of calculability and utility.” In Baconian science, “what human beings seek to learn from nature is how to use it to dominate wholly both it and human beings.” This is not unlike magic, which, although it is “bloody untruth,” is similarly “concerned with ends,” with domination. Thus did Adorno and Horkheimer unknowingly echo C. S. Lewis, who, only a year earlier in The Abolition of Man, had pointed to Bacon when arguing that “the serious magical endeavour and the serious scientific endeavour are twins…. born of the same impulse.” The “true object” of Baconian science, as of magic, is “to extend Man’s power to the performance of all things possible.”
Less dramatic, though no less Baconian, is the popular association of the word “science” with such patently technological enterprises as the building of computers, robots, and advanced weapons, or the manufacturing of medicines. As Peter Dear puts it: “The popular image of a scientist is of someone in a white coat who invents something — a vaccine, a satellite, or a bomb.” Science, in this sense, is “a form of engineering, whether that engineering be mechanical, genetic, computational, or any other sort of practical intervention in the world.” And so we get the prevalent terms that simply lump science and technology together — “science & technology” (S&T), “science and innovation,” “research & development” (R&D), or that great linguistic barbarism “science, technology, engineering, and mathematics” (STEM).
All this is more than semantic sloppiness; it affects public policy. Thus we hear proposals to boost federal R&D spending that pay lip service to science but focus almost exclusively on technology — whether “tech hubs” that aim to imitate Silicon Valley or fashionable areas of applied research such as artificial intelligence, robotics, quantum computing, data analytics, and alternative energy technologies. Similarly, politicians and educators promote STEM, with a particular focus on “coding.” Or they want to maintain America’s competitive advantage in “science and technology,” especially by bridging the “STEM talent gap.” And during the Covid-19 pandemic, politicians and pundits have routinely described the efforts of private corporations to develop a vaccine as the hope that “science” will come to our rescue.
One likely explanation for this ready conflation of science and technology is simply sociological. This talk of “science” unceremoniously bags together epidemiology, virology, and clinical medicine; atmospheric physics, marine biology, and organic chemistry; civil engineering, industrial design, and materials science, to name only a few. Most policymakers have no firsthand acquaintance with the inner workings of these various professions and divergent practices, and so do not recognize them as distinct fields of inquiry. Here we have, perhaps, a contemporary version of C. P. Snow’s famous “two cultures,” with an important difference: Where the technical illiteracy of humanists fueled dismissive condescension toward the practitioners of science, technology, and industry, the technical illiteracy of our media and political elites often fuels uncritical admiration.
Yet behind this possible sociological explanation lies something more fundamental: Science has indeed become deeply intertwined with technological development; it is even indispensable to it — just as Bacon envisioned four centuries ago. But the story of how this relationship came to be — and what it means for us today — is not the one that Baconian rhetoric, past or present, would have us believe.
To tell the story of this relationship between science and technology, it will be helpful first to consider an influential version of it — what I will call the Baconian story. It goes something like this.
In ancient times, knowledge was thought to be sharply distinct from utility, just as natural philosophy was from manual labor. “In Greek thought as a whole,” writes historian of technology David E. Nye in Technology Matters, “work with the hands was decidedly inferior to philosophical speculation.” This opposition between knowledge and use reflects a pre-modern and pre-scientific view of knowledge, as well as a class structure, dependent on slavery, that associated utility with the “servile arts.” As political scientist Donald E. Stokes puts it in Pasteur’s Quadrant, “severing inquiry from use was strongly reinforced in Greek civilization by the consignment of the practical arts to people of lesser station — and manual labor increasingly to slaves.”
In contrast to the practical arts, ancient science aimed at knowledge for its own sake and was therefore “useless.” As such, it was the privilege of the elite, for “only free men,” Peter Dear writes, “such as the citizens of the city-state, had the leisure to devote their time to philosophizing, while practical abilities were the province of servants and slaves.” The ancient edifice built on these ideas remained in place through the Greco-Roman period and into the Middle Ages.
Cracks began to appear during the Renaissance, and the structure would crumble under the weight of the Scientific Revolution in the seventeenth and eighteenth centuries. Francis Bacon had introduced the idea of a science aimed toward practical use. No longer the privilege of a class that disdains practicality, science, at Bacon’s direction, was pressed into service for utilitarian ends. At the same time, those very ends came to be of growing interest to the upper classes.
The story culminates in the Industrial Revolution. In an influential work on that period, English economic historian T. S. Ashton credited modern science with the technological innovations that drove industrialization in England. More recently — and more plausibly, if not entirely convincingly — another economic historian, Joel Mokyr, credits Bacon for promoting an experimental culture of “useful knowledge” that was conducive to innovation. Science, through innovation, thus becomes the engine of economic growth.
At this stage the ancient divide between knowledge and use collapsed, and so too did the pre-modern class structure on which it depended. Rather than a privilege made possible by the labor of others, science, through technology, becomes the creator of wealth. If natural philosophy is thereby brought down from the lofty pedestal upon which the Greeks had placed it, technology now comes to occupy a privileged place as the foundation of economic progress.
According to the Baconian story, these historical changes mean that any hard and fast distinction between science and technology is archaic, resting on an outmoded and elitist opposition between knowledge and use. It is now misleading, perhaps even meaningless, to speak of “science” versus “technology,” of “discovery” versus “invention,” of “basic” research versus “applied” research or “development.” Some versions of the story, following Bruno Latour, do away with the distinction altogether: there is only technoscience. What we have is more or less useful knowledge, offering us increasingly effective techniques with which to manipulate nature.
We see the main thrust of this story implicit in some of the most influential critiques of modern science and technology. Consider Martin Heidegger’s famous 1954 essay “The Question Concerning Technology,” where he describes how modern technology transforms nature into a “standing-reserve” to be exploited. The development of this technological framework was made possible by “the rise of modern physics as an exact science.” Note that Heidegger does not simply lump the scientific and industrial revolutions together, as in the crudest version of the Baconian story. On the contrary, he points out that “mathematical science arose almost two centuries before [modern] technology.” “How, then,” he asks, could modern science “have already been … placed in [technology’s] service?” His answer: Modern mathematical science presents nature to us as a “calculable coherence of forces” and a “storehouse of the standing energy reserve.” In so doing, science “prepares the way” — like a “herald” — for modern technological exploitation.
Similarly, Herbert Marcuse — an early student of Heidegger associated with Marxism and the Frankfurt School — writes in One-Dimensional Man that “the principles of modern science were a priori structured in such a way that they could serve as conceptual instruments for a universe of self-propelling, productive control.” His point is not that science when applied technologically leads to domination, but rather that “science, by virtue of its own method and concepts, has projected and promoted a universe in which the domination of nature has remained linked to the domination of man.”
Today, our own techno-pessimists follow Heidegger and Marcuse in blaming modern science for the technological exploitation of nature and humanity. We can also hear echoes of the Baconian story in the prognostications of those techno-optimists who, at their most unrestrained, hope for a posthuman future in which science and technology have allowed us to transcend all limits imposed by nature.
Both modernity’s champions and its critics accept the moral of the Baconian story: that the essence of modern science lies in its technological power.
However influential, this Baconian story is wrong in several crucial respects.
Start with the opposition between knowledge and use allegedly promoted by the ancients and medievals. It is true that Plato and Aristotle distinguished between science (episteme) and technology or craft (techne) as well as between theory (theoria) and practice (praxis). And they certainly promoted the pursuit of knowledge for its own sake. But Plato also criticized the pre-Socratic natural philosophers for reducing philosophy to the mere study of nature. The Socratic innovation was to orient philosophy toward something much less theoretical and more practical: the moral and political question of how to lead a flourishing life. Aristotle followed his teacher in this respect, taking practical wisdom to be the highest virtue, and devoted considerable attention to the practical sciences of ethics, politics, and rhetoric.
As for the Greek words episteme and techne, these do not track well our own distinction between “science” and “technology.” Any comparison between ancient and modern ideas on these subjects is more complicated than the Baconian story suggests.
Plato often uses episteme and techne interchangeably, and even thinks of techne as a paradigmatic form of rational knowledge. In the dialogue Ion, for instance, Socrates concludes that his interlocutor, who is a kind of poetic performer, does not possess a techne because he is unable to explain the nature of his craft. Tellingly, Socrates contrasts Ion’s lack of “art and knowledge” not with the philosopher or scientist but with the carpenter, the charioteer, the fisherman, as well as the doctor and the arithmetician.
In some places, Aristotle appears to draw a sharper distinction between episteme and techne. But, as Alasdair MacIntyre points out, Aristotle in his Metaphysics also takes the master-craftsman to be “the model of the person with sophia,” or wisdom. (Among other things, the master-craftsman knows what is the right thing to do in the right situation.) And, far from seeing episteme and techne as mutually exclusive, in the Nicomachean Ethics Aristotle characterizes the pursuit of wisdom as a science that is a master-craft. Thus while Aristotle, like Plato, takes the pursuit of knowledge for its own sake to be nobler than any other pursuit, that is because the object of this particular craft is wisdom — and wisdom is the precondition for human flourishing.
Similarly, Aristotle believed the theoretical sciences were “higher” than the practical sciences. But this ranking is based on what these sciences study, not on whether they are practical or require the use of one’s hands. As he put it in On the Soul, a science can be said to be “more honourable and precious than another” either because of its “greater exactness” or because of the “higher dignity and greater wonderfulness” of its “objects.” This is why physics and mathematics are higher than practical sciences like politics, which are less exact; but metaphysics is higher still, because it studies not particular beings like plants and stars, but being as such.
Our modern distinctions between knowledge and use, or between science and technology, simply fail to capture these intricate webs of meaning.
There is no question that both Plato and Aristotle had disparaging things to say about those who had to work for a living. Aristotle writes that shepherds are the “laziest” of all men, and infamously argues that slavery is not contrary to nature. But we should not interpret such claims, morally repugnant and factually unfounded though they are, through the lens of our own notions about knowledge and use.
Aristotle does not denigrate the use of one’s hands so much as the condition of people who must concern themselves with survival — with the “servile arts,” such as cooking, blacksmithing, or shepherding. By contrast, he writes in Politics, “those who are in a position which places them above toil,” which is to say, those who have slaves to attend to the necessities of life for them, “occupy themselves with philosophy or with politics.” What makes the servile arts inferior is that they are directed entirely toward instrumental goods rather than being ends in themselves. Thus Aristotle also says — perhaps counterintuitively for us moderns — that the inventors of those “arts” directed to “recreation” are “naturally always regarded as wiser” than the inventors of arts “directed to the necessities of life.”
The Middle Ages inherited many of these ancient ideas — and in this era too, the relationship between knowledge and use is more complicated than the Baconian story suggests. For instance, the curricula of medieval universities were structured around the seven “liberal arts”: the trivium (grammar, logic, and rhetoric) and the quadrivium (arithmetic, geometry, music, and astronomy). All seven were considered crafts — artes in Latin, which is just a translation of the Greek techne. To be sure, these liberal arts were separate from the servile arts, but not because the servile arts were associated with practicality so much as with professional vocation. The liberal arts, by contrast, were those crafts that could be practiced only by those who were free of the need to pursue such vocations. (Modern universities have enshrined this conception in the separation of colleges of liberal arts from those of engineering and other professions.)
Notably, while the medievals retained the classical concept of servility, the society in which they lived was not dependent on slavery in the same way as ancient Greece and Rome. Medieval thinkers like Thomas Aquinas agreed with Plato and Aristotle that a life devoted to the pursuit of knowledge for its own sake was a privilege, in the sense that it required wealth, time, and leisure — a point that is no less true today. But at the height of the Middle Ages, the socioeconomic system that afforded a small minority of men the freedom to devote themselves to such study was one in which slavery had been greatly reduced, and at least in England all but abolished. Though the agrarian society of the Middle Ages depended on a feudal hierarchy, many peasants were not serfs, but had rights (if often limited) to property, tools, and the surplus of what they produced for their lords. And skilled artisans — masons, weavers, blacksmiths — formed a growing middle class bound together in craft guilds that both transmitted artisans’ skills and protected their economic interests.
The contrast in the socio-economic standing between ancient and medieval craftsmen may be brought out by an infamous line from Aristotle’s Politics: “If … the shuttle would weave … without a hand to guide [it], chief workmen would not want servants, nor masters slaves.” At first glance, this appears to be a prophetic statement about automation and emancipation. It is all the more striking when one considers that the nineteenth-century Luddites — who violently protested automation by smashing the machines that threatened their way of life — were weavers. And yet, the Luddites were not servants or slaves, but rather skilled artisans who were members of a craft guild — a medieval institution that had flourished up through the Renaissance and until the Industrial Revolution. Aristotle could more readily imagine the automation of mechanical art than he could a society organized in such a way that its artisans were not slaves. But it was precisely such a society that confronted — and, as we shall see, may have even helped to facilitate — the era of industrialization.
The Baconian story oversimplifies the ancient and medieval conception of science to the point of falsehood. But there is no question that the new science marked a significant departure from classical and medieval natural philosophy, nor that science and technology became increasingly intertwined. We should not overlook Bacon’s role in these historical changes. The Baconian story should be taken seriously — but not therefore literally.
Really, it should not even be taken literally as an interpretation of Bacon. He believed that science needed to reform itself precisely to become more like the mechanical arts, which were “continually thriving and growing, as having in them a breath of life.” In other words, Bacon recognized that progress in technology had been taking place quite independently of science.
Indeed, many of history’s impressive technological achievements — the water wheel, the mechanical clock, the mariner’s compass, eyeglasses, gunpowder and many others — long predated modern science. Thanks to figures such as Archimedes and Hero of Alexandria, the Hellenistic period could boast inventions such as the water screw, the early steam engine, and all manner of mechanical devices and weaponry. The Romans, who did little to advance science, made numerous technological improvements — for example in plumbing, construction (notably cement), and hydropower.
Similarly, the Middle Ages, despite today’s prejudices, were not a dark age for innovation. “In technology, at least,” writes historian Lynn Townsend White, “the Dark Ages mark a steady and uninterrupted advance over the Roman period.” What is known as the Renaissance of the twelfth century — that is, not the more familiar, later Renaissance — saw countless technical improvements, including in windmills, manufacturing, and architecture. White enthuses:
The chief glory of the later Middle Ages was not its cathedrals or its epics or its scholasticism: it was the building for the first time in history of a complex civilization which rested not on the backs of sweating slaves or coolies but primarily on non-human power.
The historian Jean Gimpel has gone so far as to refer to an “Industrial Revolution of the Middle Ages.”
And finally, there is the Renaissance of the fourteenth through sixteenth centuries, which pre-dated the Scientific Revolution but saw unprecedented levels of innovation in fine and mechanical arts, with Leonardo da Vinci as perhaps the best-known exemplar. It is likely no coincidence that it was immediately following this period that Bacon set out to reform natural philosophy.
Of course, technology later came to rely on scientific knowledge to a degree unimaginable to ancient, medieval, or Renaissance inventors. And science, in turn, came to rely on experiments and technology — science became instrumental — just as Bacon envisioned. But for most of history, technology advanced quite independently of science — a dynamic that remained true even during that most significant of technological transformations, the Industrial Revolution.
“The Industrial Revolution marks the most fundamental transformation of human life in the history of the world recorded in written documents,” wrote the historian Eric Hobsbawm. Whether or not one entirely agrees with this, one can hardly deny that the Industrial Revolution, even if measured only in terms of the quality and quantity of inventions, brooks no comparison. But was this momentous transformation due to modern science?
Historians remain divided on this vexed question of causality. But one thing is clear: With a few notable exceptions, most inventions of the Industrial Revolution required little, if any, formal knowledge of the cutting-edge science of the day. And most of the inventors were not educated scientists but skilled artisans and craftsmen. As Hobsbawm puts it, for the most part the technology of the Industrial Revolution “required little scientific knowledge or technical skill beyond the scope of a practical mechanic of the early eighteenth century.”
This is hardly to denigrate their achievements, nor to suggest that inventors were working haphazardly, but rather to point to two interrelated historical realities. First, most of the inventions associated with industrialization (at least in its initial phase) were not applications of scientific theory. The flying shuttle, the spinning jenny, the spinning mule, the puddling furnace — these did not require, and did not stem from, any deep knowledge of Copernican astronomy, Newtonian mechanics or optics, or even the chemistry of Boyle or Lavoisier. Whatever role scientific knowledge played, it did not directly drive industrialization. In fact, perhaps the most iconic invention of industrialization, the steam engine, is one of the clearest examples of technology driving scientific understanding rather than being driven by it.
The second historical reality worth noting is that the natural sciences and the mechanical arts were at the time professionally and sociologically distinct enterprises. Many of the artisans and craftsmen were of humble birth and lacked the formal education that would have acquainted them with the science of the day. A prime example is Henry Cort, whose puddling and rolling techniques for converting pig iron to wrought iron are among the most important innovations of the Industrial Revolution. Cort was of modest pedigree and education — not much is known about his early life — leading the chemist Joseph Black to describe him as “a plain Englishman without Science.”
Most of the inventors of the industrial era were not natural philosophers but engineers, clock makers, miners, and iron masters whose skills were transmitted through apprenticeship rather than university education. The historian Joel Mokyr suggests that this sociological reality may even help explain why industrialization first took hold in Great Britain, for “in an age in which dexterity and experience could still substitute for a formal training in mathematics and physics,” the country was “fortunate to possess a class of able and skilled people, larger and more effective than anywhere else.” Some historians, such as Jane Humphries, have credited the British guild system for preserving these skills that proved so instrumental to industrialization.
The idea that scientific knowledge bears technological fruit, however central to the self-understanding of modern science, remained at this point more a reflection of Baconian rhetoric than of historical reality.
If the role of science in technological change up until the nineteenth century is often greatly exaggerated, so too is the role of technology in science. Even in the Scientific Revolution, technology did not play as significant a role as is commonly believed.
The modern sciences that first came to maturity during that time were those least dependent on either technology or experiment. Astronomy — arguably the science that inaugurated the Scientific Revolution — was an observational rather than an experimental science, and it still depended on naked-eye observations when Copernicus placed the Sun at the center of the cosmos. Even the physics of Galileo was more mathematical than experimental. The historian of science Alexandre Koyré has argued that Galileo likely did not carry out many of his most famous experiments, using them instead as what we might call thought experiments or rhetorical devices.
This is not to deny any role for instruments in the early days of the Scientific Revolution, especially in making observations — Galileo’s telescope and pendulum come to mind — or to deny that experiments were important, for example in Newton’s research on optics. But even at their most technological, the scientific advances characteristic of the early Scientific Revolution shared with medieval and ancient science a desire to understand nature — albeit increasingly with the aid of instruments and experiments — rather than to dominate nature or furnish technologies for manipulating it. Put simply, the classical sciences with which the Scientific Revolution began were not especially Baconian.
If any early modern sciences approximated the Baconian ideal, it was not the classical sciences of astronomy and physics, but the experimental traditions that laid the groundwork for the scientific understanding centuries later of such natural phenomena as magnetism, electricity, and the chemical elements. Unlike the classical sciences, these “Baconian sciences,” as historian Thomas Kuhn calls them, aimed explicitly at the experimental manipulation of nature. But they grew out of an altogether different set of practices and traditions, with roots in Hermeticism — a school of thought that flourished during the Renaissance.
Hermeticism emphasized the wisdom of ancient esoteric knowledge — especially of alchemy and pharmacology — and the power of occult forces. Perhaps the most prominent Hermetic thinker of the time was the Swiss Theophrastus von Hohenheim, known as Paracelsus, whose philosophy combined Neoplatonism, Pythagoreanism, Christianity, and paganism, and straddled the line between science and magic. Although Bacon was critical of Paracelsus, he was clearly influenced by his tradition. For Baconian science, as for Hermeticism, the ideal practitioner was not the natural philosopher but, as Kuhn puts it, the “Faustian figure of the magus, concerned to manipulate and control nature, often with the aid of ingenious contrivances, instruments, and machines.”
Some key figures of the Scientific Revolution were practitioners of both the classical and Baconian traditions. Newton, for instance, was as devoted to alchemy as he was to physics. Yet, the majority of these experimentalists were not natural philosophers but craftsmen and pharmacists. Their research would not be fully integrated into the scientific mainstream until much later, with the rise of modern chemistry and electromagnetism in the eighteenth and nineteenth centuries — helping to forge the bond between science and technology that has become so familiar to us today.
The relationship between science and technology underwent a substantial transformation over the course of the nineteenth century. As we have seen, science played no direct role in the Industrial Revolution. But by 1898, William John McGee, president of the American Association for the Advancement of Science, could proclaim that “scientific progress … is so closely interwoven with industrial and social progress that the advance of one cannot be traced without constant reference to the other.” What had changed?
American companies had begun to recognize the economic significance of scientific knowledge in the second half of the nineteenth century, following the lead of the German chemical industry. The Germans had pioneered the concept of industrial research — the first systematic attempt by industry to exploit science for commercial ends. American entrepreneurs soon followed suit. Thomas Edison opened a research laboratory in Menlo Park in 1876. Around the turn of the century, both General Electric and DuPont did the same, and American chemist Arthur D. Little launched an independent commercial laboratory. Academically trained physicists and chemists were in high demand and started to get hired in large numbers by the private sector. For example, Frank B. Jewett, who had received his doctorate in physics at the University of Chicago, oversaw research and development at AT&T and Western Electric, which later became Bell Telephone Laboratories, where he became president.
So intertwined had science and technology become that when the Great Depression hit, humanists, religious figures, and ordinary citizens alike were ready to blame science. Science, it was said, not only produced wondrous technologies, like telephony, but also led to automation, unemployment, and rising levels of inequality. As William Wickenden, president of the Case School of Applied Science (now part of Case Western), put it in a 1933 Science magazine article:
John Doe isn’t quite so cock-sure as he used to be that all this science is a good thing. This business of getting more bread with less sweat is all right in a way, but when it begins to destroy jobs, to produce more than folks can buy and to make your wife’s relatives dependent on you for a living, it is getting a little too thick.
This new attitude was a reaction to the economic crisis but also to the excessive optimism about the social and economic benefits that many believed would come from the application of science to industry. Scientific knowledge had, for better or worse, become indispensable for technological advance — a reality that affected much more than just industry and economics. During World War I, science came to be seen as vital to the public interest for its use in refining the design of machine guns, tanks, hand grenades, submarines, and (infamously) chemical weapons. And World War II finally solidified the Baconian notion that technological advance required sustained federal support for large-scale scientific research — most prominently in the Manhattan Project.
All this is to say that the Baconian story about the technological usefulness of science had finally come to reflect historical fact. So was the distinction between science and technology thereby dissolved? If yes, then the Baconian story would appear to have been wrong mainly in its chronology, not its conclusion. That is, while science and technology remained distinct through the Industrial Revolution (roughly 1750 to 1850), by the beginning of the 1900s they had become a single, uniform enterprise. If this were the case, our public discourse, and our public policies, would be right to conflate the two.
But this would be a hasty conclusion, because science and technology had in fact not become a single, uniform enterprise. Although deeply interdependent and mutually reinforcing, they remained sociologically and institutionally distinct, with divergent professional identities.
Rather than becoming one enterprise, science and technology underwent parallel transformations during the nineteenth century.
Science became both more professionalized and more specialized. Universities sought to employ scientists from the new specialties as professors, while scientists founded a growing number of professional societies and journals. Around this time, in 1834, William Whewell coined the term “scientist” — a more apt description of the professional scientist than the more classical “natural philosopher” or the patrician “man of science.”
At the same time, technology underwent what historian Edwin Layton has termed the “scientific revolution in technology.” Just as scientists increasingly relied on technology, so too did engineers increasingly rely on science, and the engineering disciplines began to model themselves on the sciences. As Layton describes:
The artisan was replaced in the vanguard of technological progress by a new breed of scientific practitioner. For the oral traditions passed from master to apprentice, the new technologist substituted a college education, a professional organization, and a technical literature patterned on those of science…. As a result, by the end of the 19th century, technological problems could be treated as scientific ones.
And just as the word “scientist” originated in this period, so too did our current understanding of the word “technology.” The term previously referred to the study of practical and mechanical arts. As historian Eric Schatzberg explains, we owe the newer sense of “technology” — referring broadly to industrialization as well as the machinery and outputs of the industrial process — largely to the economist and social theorist Thorstein Veblen, who used it to translate the German Technik in the early 1900s.
The parallel transformations of science and technology and their growing interdependence nevertheless left their institutional and professional separation largely intact. For instance, during the mid-nineteenth century, America’s leading colleges and universities began expanding their course offerings in mathematics and the natural sciences. But science and mathematics departments were (and in many respects still remain) institutionally distinct from the more practically oriented engineering departments and technical schools — such as Harvard’s Lawrence Scientific School, Yale’s Sheffield Scientific School, the Massachusetts Institute of Technology, and the new land-grant universities that tended to focus on such “applied” sciences as agriculture. “The difficulty,” writes one scholar, “was finding an educational formula that could join the progress of theoretical science with applications to the practical arts.”
The professional self-understandings of scientists and engineers were similarly distinct, even opposed. The leaders of higher education who promoted science — such as Charles William Eliot at Harvard, James McCosh at Princeton, and Noah Porter at Yale — did not do so because of purported practical benefits but rather because science “ennobles and purifies the mind,” as Eliot put it. The term “pure science” came into vogue during this same time, to signify purity from what used to be called the “servile arts” — applied sciences, engineering, and industry. The American physicist Henry Rowland, in an 1883 address to the American Association for the Advancement of Science titled “A Plea for Pure Science,” took umbrage at the use of the word “science” to describe “telegraphs, electric lights, and such conveniences.”
I do not wish to underrate the value of all these things: the progress of the world depends on them, and he is to be honored who cultivates them successfully. So also the cook who invents a new and palatable dish for the table benefits the world to a certain degree; yet we do not dignify him by the name of a chemist. And yet it is not an uncommon thing, especially in American newspapers, to have the applications of science confounded with pure science.
These parallel transformations did not fuse two preexisting enterprises — science and technology — into one; rather, they helped to produce the ambivalent but fruitful relationship between them. Science and technology had not become a single, uniform enterprise; as Layton writes, they grew up as “mirror-image twins.”
Today, it has become fashionable in policy circles to accept that there is no real distinction between science and technology. Any such distinction is often associated with the much-maligned “linear model” of innovation, according to which scientists conduct their research in a kind of vacuum, uninterested in the use of their findings, while others separately apply their findings to new technologies and products.
Not only does this model fail to capture the complicated relationship between science and technology, its critics argue, it also surreptitiously promotes an elitist distinction between knowledge and use — that ancient prejudice that is now outdated thanks to the triumph of Bacon’s ideas. Thus the rejection of the linear model often takes the form of a full embrace of the Baconian story, going so far as to erase altogether the line between science and technology.
For instance, Donald Stokes, in his influential Pasteur’s Quadrant, sets out to criticize the linear model but then goes on also to criticize the continued separation of knowledge and use — of basic and applied science, or of science and technology. Any institutional or professional distinctions of this sort are superficial and outdated, leading to “the perception that basic and applied science are separate ventures.”
As we have seen, the linear model that strictly separates science from technology indeed does not hold up; the relationship has always been more complicated. But just because science and technology came to enjoy an intimate and dynamic relationship does not mean that they also became the same enterprise.
Rather, the real distinction between the two enterprises has become harder to discern as science has grown more technologically useful. As Thomas Kuhn pointed out in 1971, technologies that are derived from scientific knowledge have become so ubiquitous that they “disguise the still real cleavage between science and technology.” And so “they make it difficult to realize how very recent and decisive the emergence of this kind of interaction has been.” Now, half a century later, technologies born of science are even more ubiquitous, and the difference between the two is all but invisible to the public eye.
We might call this the Baconian paradox: The final vindication of Bacon’s conception of the utility of science has helped to obscure how long it took for this dynamic relationship between science and technology to come about — and that there nonetheless remains a real and lasting distinction between the two.
How can we make sense of this distinction today, given the undeniable overlap between science and technology? The first thing to notice is that the apparent plausibility of the Baconian story is partly due to our rather narrow focus on certain areas of scientific and technological research.
In some fields, such as genetic engineering, the boundary between science and technology is indeed quite murky. But few would describe, say, the building of bridges or the designing of air conditioning systems as science, even though they depend on scientific principles. Similarly, it’s hard to imagine anyone mistaking cosmology or entomology for engineering, even though they depend on technology to varying degrees.
The restricted focus on a subset of fields in science and technology may be part of the reason why we find it so easy to run the two together. And this restricted focus likely stems from the prestige that some technological domains have come to enjoy in our time. As David E. Nye points out, beginning in the 1990s the word “technology” came to be synonymous with specific types of inventions, especially computers, phones, and related tools. For many of us, “technology” mainly means smartphones and automated systems — the particular products of one broad industry. Not only is the narrow use of the term strikingly different from the classical concept of techne discussed above, it fails even to capture many other types of technologies we take for granted, from automobiles to vacuum cleaners — technologies that do not (or no longer) seem to us especially scientific in nature.
In fields where it has become hard to tell the difference between science and technology — for example in high-energy physics and quantum computing — we might be helped by distinguishing between the respective aims of science and technology, even if in practice they are tightly linked.
As a rough approximation, we might say that science generally aims at explanation, whereas technology aims at production. A scientific theory is accepted over its rivals because the scientific community deems it a more adequate explanation (however one thinks of this), not because it issues in more useful technologies. Which interpretation of quantum physics should we accept? Is global climate change anthropogenic or not? What is the underlying chemical composition of the stars? By contrast, technologies are deemed successful, at least in part, when engineers are able to produce tools or machines — bridges, HVAC systems, cars, computers, cell phones, nuclear reactors, or whatever — that work. Electric lighting works whether or not we accept the existence of the mechanical ether (as did Maxwell’s theory of electromagnetism); automated doors (which utilize the photoelectric effect) function just fine, even though physicists have yet to agree on the proper interpretation of quantum physics.
We can draw a meaningful distinction between science and technology without therefore insisting that they are fully separate: Technology can contribute to scientific knowledge, just as science uses technology to manipulate or produce certain effects. We might think of the two less as separate disciplines than as overlapping but different practices with their own professional and scholarly methods and aims. To be sure, individual scientists and engineers may not always be consciously oriented toward the differing ideals that shape these two enterprises. And exceptional practitioners often participate in and contribute to both at once. But the distinction, as we will see, remains important, even if the relationship is highly interdependent, interactive, symbiotic, and in some places well-nigh invisible.
The risk in conflating science and technology is that this endangers both. As President Roosevelt’s science advisor, Vannevar Bush, pointed out in his famous 1945 report Science, the Endless Frontier,
Under the pressure for immediate results, and unless deliberate policies are set up to guard against this, applied research invariably drives out pure.
This is problematic because many technological developments depend on advances in so-called basic or pure science. The crowding out of science by technology thus threatens technology by threatening certain areas of science itself.
Bush was not the only thinker to sound this alarm. The German cultural critic Friedrich Jünger, in The Failure of Technology (1946), denounced what he called the “subjugation of science”:
As technology progresses, the relation between science and technology undergoes a change. Science becomes the servant of technology…. The disciplines of science become auxiliary disciplines of technology, and they fare the better the more willingly they submit to this role. “Pure science” declines because the important thing is no longer an understanding of the laws of nature, but, first of all, the application, the uses, the exploitation, of those laws.
And in his classic The Technological Society (1954), the French historian and sociologist Jacques Ellul wrote that “in the twentieth century, this relationship between scientific research and technical invention resulted in the enslavement of science to technique.”
Less dramatically, we might simply observe that, especially when budgets are tight, funding for research with overtly utilitarian applications tends to beat out science pursued for its own sake, even though such research may eventually bear fruit. The past several decades of federal R&D policy seem to offer proof of this. Thus, federal funding of basic science, though increasing in absolute terms, has declined over the past half-century as a share of non-defense R&D and as a share of GDP. The ratio is even worse when defense research is included, as it skews heavily toward applied science and product development. Meanwhile, private actors are even less able, and less inclined, to bear the uncertain economic returns of basic science. Unsurprisingly, the vast majority of private R&D spending goes toward application. And so, as the share of R&D spending funded by the federal government has dwindled and industry has come to fund the lion’s share of research, total national spending on applied science has come to dwarf spending on basic science.
The irony of the Baconian legacy is that the more fruitful science becomes, the more it loses its own identity: Whenever science is technologically useful, the two enterprises tend to appear as “different functions performed by the same community,” as Edwin Layton has put it. But “a fundamental fact is that they constitute different communities, each with its own goals and systems of values.” Almost no society in history, Thomas Kuhn declared, “has managed successfully to nurture both at the same time.” That the United States has been able to do so — so far, at least — has surely contributed to our scientific, technological, and economic preeminence.
If we wish science to continue to bear fruit for us, we may well need to refrain from judging it entirely by those fruits, and instead defend science for its own sake.