
Who invented the light bulb?

Eureka, what an idea! TU IS/iStock/Getty Images Plus

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to CuriousKidsUS@theconversation.com.


Who invented the light bulb? – Preben, age 5, New York City


When people name the most important inventions in history, light bulbs are usually on the list. They were much safer than earlier light sources, and they made many more activities possible after the sun went down, for both work and play.

More than a century after the light bulb’s invention, illustrators still use a lit bulb to symbolize a great idea. Credit typically goes to inventor and entrepreneur Thomas Edison, who created the first commercial light and power system in the United States.

But as a historian and author of a book about how electric lighting changed the U.S., I know that the actual story is more complicated and interesting. It shows that complex inventions are not created by a single genius, no matter how talented he or she may be, but by many creative minds and hands working on the same problem.

Thomas Edison didn’t invent the basic design of the incandescent light bulb, but he made it reliable and commercially viable.

Making light – and delivering it

In the 1870s, Edison raced against other inventors to find a way of producing light from electric current. Americans were keen to give up their candles and kerosene lamps for something that promised to be cleaner and safer. Candles offered little light and posed a fire hazard. Some customers in cities had brighter gas lamps, but they were expensive, hard to operate and polluted the air.

When Edison began working on the challenge, he learned from many other inventors’ ideas and failed experiments. They all were trying to figure out how to send a current through a thin carbon thread encased in glass, making it hot enough to glow without burning out.

In England, for example, chemist Joseph Swan patented an incandescent bulb and lit his own house in 1878. Then in 1881, at a great exhibition on electricity in Paris, Edison and several other inventors demonstrated their light bulbs.

Edison’s version proved to be the brightest and longest-lasting. In 1882 he connected it to a full working system that lit up dozens of homes and offices in downtown Manhattan.

But Edison’s bulb was just one piece of a much more complicated system that included an efficient dynamo – the powerful machine that generated electricity – plus a network of underground wires and new types of lamps. Edison also created the meter, a device that measured how much electricity each household used, so that he could tell how much to charge his customers.

Edison’s invention wasn’t just a science experiment – it was a commercial product that many people proved eager to buy.

Inventing an invention factory

As I show in my book, Edison did not solve these many technical challenges on his own.

At his farmhouse laboratory in Menlo Park, New Jersey, Edison hired a team of skilled technicians and trained scientists, and he filled his lab with every possible tool and material. He liked to boast that he had only a fourth-grade education, but he knew enough to recruit men who had the skills he lacked. Edison also convinced banker J.P. Morgan and other investors to provide financial backing to pay for his experiments and bring the results to market.

Historians often say that Edison’s greatest invention was this collaborative workshop, which he called an “invention factory.” It was capable of launching amazing new machines on a regular basis. Edison set the agenda for its work – a role that earned him the nickname “the Wizard of Menlo Park.”

Here was the beginning of what we now call “research and development” – the network of universities and laboratories that produce technological breakthroughs today, ranging from lifesaving vaccines to the internet, as well as many improvements in the electric lights we use now.

Sparking an electric revolution

Many people found creative ways to use Edison’s light bulb. Factory owners and office managers installed electric light to extend the workday past sunset. Others used it for fun purposes, such as movie marquees, amusement parks, store windows, Christmas trees and evening baseball games.

Theater directors and photographers adapted the light to their arts. Doctors used small bulbs to peer inside the body during surgery. Architects and city planners, sign-makers and deep-sea explorers adapted the new light for all kinds of specialized uses. Through their actions, humanity’s relationship to day and night was reinvented – often in ways that Edison never could have anticipated.

Today people take for granted that they can have all the light they need at the flick of a switch. But that luxury requires a network of power stations, transmission lines and utility poles, managed by teams of trained engineers and electricians. To deliver it, electric power companies grew into an industry monitored by insurance companies and public utility regulators.

Edison’s first fragile light bulbs were just one early step in the electric revolution that has helped create today’s richly illuminated world.


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.


Ernest Freeberg does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


Proposed cuts to NIH funding would have ripple effects on research that could hamper the US for decades

The NIH is a node in an interconnected system producing health and medical advances. Anchalee Phanmaha/Moment via Getty Images

In May 2025, the White House proposed reducing the budget of the National Institutes of Health by roughly 40% – from about US$48 billion to $27 billion. Such a move would return NIH funding to levels last seen in 2007. Since NIH budget records began in 1938, NIH has seen only one previous double-digit cut: a 12% reduction in 1952.

Congress is now tasked with finalizing the budget ahead of the new fiscal year, which begins Oct. 1. In July, the Senate rejected the White House’s proposed cuts and instead advanced a modest increase. And in early September, the House of Representatives also supported a budget that maintains the agency’s current funding levels.

However, talk of cutting NIH funding is not a new development. Such proposals tend to resurface from time to time, and the ongoing discussion has created uncertainty about the stability of research overall and prompted concern among scientists about the future of their work.

As researchers studying complex health policy systems – and specifically, science funding policy – we see the NIH as one node in an interconnected system that supports the discovery of new knowledge, trains the biomedical workforce and makes possible medical and public health advances across the U.S.

Our research shows that while cutting NIH funding may appear to save money in the short term, it can trigger a chain of effects that increase long-term health care costs and slow the development of new treatments and public health solutions over time.

Seeing the bigger picture of NIH funding

NIH funding does not just support the work of individual researchers and laboratories. It shapes the foundation of American science and health care by training scientists, supporting preventive health research and creating the knowledge that biomedical companies can later build into new products.

To understand how funding cuts may affect scientific progress, the training of new researchers and the availability of new treatments, we took a broad look at existing evidence. We reviewed studies and data that connect NIH funding, or biomedical research more generally, to outcomes such as innovation, workforce development and public health.

In a study published in July 2025, we built a simple framework to show how changes in one part of the system – research grants, for example – can lead to changes in others, like fewer training opportunities or slower development of new therapies.

Eroding the basic research foundation

The NIH funds early-stage research that lacks immediate commercial value but provides the building blocks for future innovations. This includes projects that map disease pathways, develop new laboratory methods or collect large datasets that researchers use for decades.

For example, NIH-supported research in the 1950s identified cholesterol and its role in heart disease, helping to lay the groundwork for the later discovery of statins, which millions of people now take to lower their cholesterol levels. Cancer biology research in the 1960s led to the discovery of cisplatin, a chemotherapy prescribed to 10% to 20% of cancer patients. Basic research in the 1980s on how the kidneys handle sugar helped pave the way for a new class of drugs for Type 2 diabetes, some of which are also used for weight management. Diabetes affects about 38 million Americans, and obesity affects more than 40% of adults in the U.S.

Cisplatin, a chemotherapy widely used today, was developed through NIH-supported cancer biology research. FatCamera/E+ via Getty Images

Without this kind of public, taxpayer-funded investment, many foundational projects would never begin, because private firms rarely take on work with long timelines or unclear profits. Our study did not estimate dollar amounts, but the evidence we reviewed shows that when public research slows, downstream innovation and economic benefits are also delayed. That can mean fewer new treatments, slower adoption of cost-saving technologies and reduced growth in industries that depend on scientific advances.

Reducing the scientific workforce

By providing grants that support students, postdoctoral researchers and early-career investigators, along with the labs and facilities where they train, the NIH also plays a central role in preparing up-and-coming scientists.

When funding is cut, fewer positions are available and some labs face closure. This can discourage young researchers from entering or staying in the field. The effect extends beyond academic research. Some NIH-trained scientists later move into biotechnology, medical device companies and data science roles. A weaker training system today means fewer skilled professionals across the broader economy tomorrow.

For example, NIH programs have produced not only academic researchers but also engineers and analysts who now work on immune therapies, brain-computer interfaces, diagnostics and AI-driven tools, as well as other technologies in startups and in more established biotech and pharmaceutical companies.

If those training opportunities shrink, biotech and pharmaceutical industries may have less access to talent. A weakened NIH-supported workforce may also risk eroding U.S. global competitiveness, even in the private sector.

Innovation shifts toward narrow markets

Public and private investment serve different purposes. NIH funding often reduces scientific risk by advancing projects to a stage where companies can invest with greater confidence. Past examples include support for imaging physics that led to MRI and PET scans and early materials science research that enabled modern prosthetics.

Our research highlights that when public investment recedes, companies tend to focus on products with clearer near-term returns. That may tilt innovation toward specialty drugs or technologies with high launch prices and away from improvements that serve broader needs, such as more effective use of existing therapies or widely accessible diagnostics.

Imaging technologies such as MRI were developed through NIH funding for basic research. Tunvarat Pruksachat/Moment via Getty Images

Some cancer drugs, for instance, relied heavily on NIH-supported basic science discoveries in cell biology and clinical trial design. Independent studies have documented that without this early publicly supported work, development timelines lengthen and costs increase, which can translate into higher prices for patients and health systems. When public funding shrinks and companies shift toward expensive products instead of lower-cost improvements, overall health spending can rise.

What looks like a budget saving in the near term can therefore have the opposite effect, with government programs such as Medicare and Medicaid ultimately shouldering higher costs.

Prevention and public health are sidelined

NIH is also a major funder of research aimed at promoting health and preventing disease. This includes studies on nutrition, chronic diseases, maternal health and environmental exposures such as lead or air pollution.

These projects often improve health long before disease becomes severe, but they rarely attract private investment because their benefits unfold gradually and do not translate into direct profits.

Delaying or canceling prevention research can result in higher costs later, as more people require intensive treatment for conditions that could have been avoided or managed earlier. For example, decades of observation in the Framingham Heart Study shaped treatment guidelines for risk factors such as high blood pressure and heart rhythm disorders. This cornerstone of prevention research now helps avert heart attacks and strokes, which are far riskier and costlier to treat.

A broader shift in direction?

Beyond these specific areas, the larger issue is how the U.S. will choose to support science and medical research going forward. For decades, public investment has enabled researchers to take on difficult questions and conduct decades-long studies. This support has contributed to advances that do not fit neatly into market priorities the way drugs or devices do, ranging from psychosocial therapies for depression to surgical methods for liver transplants.

If government support weakens, medical and health research may become more dependent on commercial markets and philanthropic donors. That can narrow the kinds of problems studied and limit flexibility to respond to urgent needs such as emerging infections or climate-related health risks.

Countries that sustain public investment may also gain an edge by attracting top researchers and setting global standards for new technologies.

On the other hand, once opportunities are lost and talent is dispersed, rebuilding takes far more time and resources.


The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.