
From prototype to construction site: how innovative smart materials make it out of the lab and into our cities

Building material made from recycled plastic waste. Rene Notenbomer/Shutterstock

The construction industry accounts for approximately 37% of global CO₂ emissions. Traditional materials like cement, steel, and bricks contribute over 70% of its footprint, with cement production making up an especially large share.

To confront this problem, researchers are developing all manner of innovative construction materials and mechanisms, ranging from walls that produce solar energy to self-repairing bacteria-based concrete.

These smart materials, seemingly the stuff of science fiction, are fast becoming a reality, and a raft of European Union (EU) initiatives aim to turn cutting-edge construction materials into real, sustainable, affordable solutions. The private sector is also playing its part – over the past two decades companies such as Dyson Holdings and Monodraught have filed more than 40 patents for advanced materials aimed at enhancing buildings’ thermal performance, durability and environmental impact.

However, any new material has to clear a lot of safety, security and environmental hurdles. This means that getting them out of the lab and into the real world can present a serious challenge.

From prototype to building site

The development process begins with identifying a technical or environmental issue, such as improving insulation or reducing energy use. A functional prototype is then created and tested under controlled conditions to assess its physical and chemical properties. This includes evaluating compressive strength, water absorption, fire resistance, thermal conductivity and acoustic insulation.
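
To make this screening step concrete, below is a minimal Python sketch of how measured results might be checked against target thresholds. The property names and limit values are illustrative assumptions, not figures taken from any EN standard.

```python
# A minimal sketch of a lab-screening check for a prototype material.
# All threshold values are illustrative placeholders, not the limits
# of any actual standard.

REQUIREMENTS = {
    "compressive_strength_mpa":  ("min", 25.0),   # higher is better
    "water_absorption_pct":      ("max", 6.0),    # lower is better
    "thermal_conductivity_w_mk": ("max", 0.20),   # lower = better insulator
}

def screen_prototype(measurements: dict) -> list[str]:
    """Return the properties that fail the screening thresholds."""
    failures = []
    for prop, (kind, limit) in REQUIREMENTS.items():
        value = measurements[prop]
        passed = value >= limit if kind == "min" else value <= limit
        if not passed:
            failures.append(f"{prop}: {value} (limit: {kind} {limit})")
    return failures

sample = {
    "compressive_strength_mpa": 31.2,
    "water_absorption_pct": 7.4,
    "thermal_conductivity_w_mk": 0.18,
}
print(screen_prototype(sample))
# -> ['water_absorption_pct: 7.4 (limit: max 6.0)']
```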

If the prototype shows promise, it then progresses to a pilot production phase, where larger quantities are manufactured to test stability, consistency, and scalability. At the same time, comprehensive technical documentation is prepared.

In the EU, approval is a lengthy and tightly regulated process. Construction materials have to comply with the Construction Products Regulation (EU No 305/2011). This involves obtaining CE (European conformity) marking, submitting a Declaration of Performance (DoP), and adhering to harmonised European standards (EN) established by the European Committee for Standardisation (CEN). These standards ensure products meet criteria related to structural safety, thermal efficiency, moisture resistance and fire behaviour.

Additionally, a Life Cycle Assessment is conducted to evaluate the environmental impact of the material, from the extraction of its component raw materials through to its eventual disposal or recycling. This assessment is crucial for aligning with European policies, and for obtaining green building certifications such as BREEAM and LEED.
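
As a rough illustration of what an LCA adds up, the Python sketch below tallies a material's carbon footprint across its life-cycle stages. The stage names follow common LCA practice; the per-square-metre figures are invented purely for illustration.

```python
# Toy cradle-to-grave carbon tally in the spirit of a Life Cycle
# Assessment. All kg CO2e figures are invented for illustration.

stages_kgco2e_per_m2 = {
    "raw material extraction": 4.1,
    "manufacturing":           6.8,
    "transport":               0.9,
    "use phase (50 years)":    2.5,
    "end of life / recycling": -1.2,  # negative: credit for material recovery
}

total = sum(stages_kgco2e_per_m2.values())
print(f"Estimated footprint: {total:.1f} kg CO2e per m2 of product")
```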

Once technical approvals are complete, strategies for production, packaging, distribution and marketing are developed. Performance simulations and digital representations of the material (known as Building Information Modelling or BIM objects) are also created to ensure seamless integration into architectural designs using specialised commercial software.
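
Real BIM objects follow rich schemas such as IFC, but the simplified sketch below conveys the idea: a material's performance data packaged in a machine-readable form that design software can consume. All names and values here are hypothetical.

```python
from dataclasses import dataclass, asdict
import json

# A highly simplified stand-in for a BIM object's property set.
# Real BIM objects use richer schemas (e.g. IFC); values are invented.

@dataclass
class MaterialBimObject:
    name: str
    thickness_mm: float
    thermal_conductivity_w_mk: float
    density_kg_m3: float
    fire_class: str  # Euroclass rating, e.g. "A2-s1,d0"

    def layer_conductance(self) -> float:
        """Thermal conductance of the layer alone, W/(m2.K)."""
        return self.thermal_conductivity_w_mk / (self.thickness_mm / 1000)

panel = MaterialBimObject("demo ceramic panel", 40.0, 0.16, 900.0, "A2-s1,d0")
print(json.dumps(asdict(panel), indent=2))
print(f"Layer conductance: {panel.layer_conductance():.1f} W/(m2.K)")
```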




Read more:
Buildings inspired by worms and grasshoppers: the future of biomimicry in construction


Innovation isn’t easy (or cheap)

This complex process means that many innovative ideas in construction never reach the market. Developers need to follow strict safety, performance, and environmental rules, which often involve costly testing and certifications. At the same time, many research teams face challenges like limited funding or industry contacts, and they may not fully understand the legal requirements. Without the right support, even the best ideas can stay stuck as prototypes.

To address these challenges, the European Union has launched several initiatives to push innovations from the initial research phase to market adoption; one of them, Exploit4InnoMat, is described below.




Read more:
‘Urban form’ and the housing crisis: Can streets and buildings make a neighbourhood more affordable?


A gateway to future materials

Bridging the gap between prototypes and market-ready construction materials requires comprehensive support. Exploit4InnoMat is a European platform offering a Single-Entry Point for entrepreneurs, SMEs and research centres aiming to scale up smart, sustainable materials.

The platform provides services encompassing the entire development cycle:

  • Technical validation and certification: Access to EU-approved pilot line facilities for testing according to European standards, including the aforementioned CE marking and Declaration of Performance (DoP).

  • Specialised scientific advice: Support in material characterisation, property optimisation, and scaling strategies.

  • Simulation and digital modelling: Tools that predict how materials behave in terms of heat, strength, and environmental impact within digital models of real buildings. These tools help create models that can be directly inserted into BIM platforms.

  • Legal and intellectual property support: Assistance with patent registration and regulatory compliance.

Through this comprehensive approach, Exploit4InnoMat has already brought several new materials to market. These innovations not only enhance energy efficiency, but also minimise environmental impact and extend the lifespan of buildings. Some prominent examples include:

  • Ceramic panels with phase change materials (PCM), which store and release heat, maintaining stable indoor temperatures and reducing the need for heating and cooling (a back-of-envelope estimate of this buffering effect follows this list).

  • Nanotechnology coatings offering antibacterial and reflective properties. These coatings are ideal for hospitals and schools, particularly in hot climates where hygiene and energy efficiency are paramount.

  • Recycled cement panels made from industrial waste, which reduce the use of virgin raw materials and lower emissions in production.

  • Optimised Ceramic Elements, such as bricks and tiles improved with additives, recycled materials, and nanotechnology to boost insulation, porosity, and sustainability.
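
A back-of-envelope estimate shows why PCM panels buffer heat so effectively: the latent heat absorbed while the material melts dwarfs ordinary (sensible) heat storage. The figures below are typical textbook values for a paraffin-type PCM, not the specification of any product above.

```python
# Rough estimate of heat buffered by a PCM layer: Q = m*c*dT + m*L.
# Material properties are generic paraffin-PCM textbook values, and
# the mass per square metre is an assumption for illustration.

mass_kg = 10.0            # PCM per square metre of panel (assumed)
c_solid = 2_000.0         # specific heat, J/(kg.K)
latent_heat = 180_000.0   # heat of fusion, J/kg
delta_t = 4.0             # temperature swing around the melt point, K

sensible_j = mass_kg * c_solid * delta_t        # 80,000 J
latent_j = mass_kg * latent_heat                # 1,800,000 J
total_kwh = (sensible_j + latent_j) / 3.6e6

print(f"Stored per m2 of panel: {total_kwh:.2f} kWh")
# ~0.52 kWh/m2, dominated by latent heat: this is why PCMs flatten
# indoor temperature swings without bulky thermal mass.
```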

Schemes like Exploit4InnoMat play a crucial role by integrating all development phases into a single platform. From laboratory testing and environmental validation through to market entry, they assist developers in accelerating their innovations in the knowledge that they stand a solid chance of actually being used in construction.

Materials that previously stalled at the prototype stage now have a much clearer pathway to real-world application. This streamlined process ensures that scientific advancements reach our built environment more rapidly, contributing to the creation of greener, more efficient cities prepared for future challenges.

The Conversation

Andrés Jonathan Guízar Dena participates as a researcher in the Exploit4InnoMat project, funded by the European Union. Within the project, he provides advisory and product characterisation services for digital modelling, including BIM environments and energy simulation.


NZ Budget 2025: science investment must increase as a proportion of GDP for NZ to innovate and compete

Shutterstock/Olivier Le Queinec

A lack of strategy and research funding – by both the current and previous governments – has been well documented, most comprehensively in the first report by the Science System Advisory Group (SSAG), released late last year.

If there is one word that sums up the current state of New Zealand’s research sector, it is scarcity. As the report summarises:

We have an underfunded system by any international comparison. This parsimony has led to harmful inter-institutional competition in a manner that is both wastefully expensive in terms of process and scarce researcher time, and is known to inhibit the most intellectually innovative ideas coming forward, and of course it is these that can drive a productive innovation economy.

The government expects research to contribute to economic growth, but policy and action undermine the sector’s capacity to do so.

The latest example is last week’s cancellation of the 2026 grant application round of the NZ$55 million Endeavour Fund “as we transition to the science, innovation and technology system of the future”. Interrupting New Zealand’s largest contestable source of science funding limits opportunities for researchers looking for support for new and emerging ideas.

Changes to the Marsden Fund, set up 30 years ago to support fundamental research, removed all funding for social science and the humanities and shifted focus to applied research. This is despite fundamental research in all fields underpinning innovation and the international ranking of our universities.

New Zealand has an opportunity to change its economy based on the potential of emerging sectors such as artificial intelligence, cleantech and quantum technologies. Other countries, including Australia and the United Kingdom, already consider quantum technologies a priority and fund them accordingly.

But when it comes to strategy, the composition of the boards of new Public Research Organisations, set up as part of the government’s science sector reform, is skewed towards business experience. Where there is scientific expertise, it tends to be in established industries. The governance of the proposed new entity focusing on emerging and advanced technologies is yet to be announced.

Critical mass requires funding and strategy

Scientists have been calling for a science investment target of 2% of GDP for a long time. It was once – roughly a decade ago – the average expenditure within the OECD; this has since increased to 2.7% of GDP, while New Zealand’s investment remains at 1.5%.

The SSAG report repeatedly refers to the lack of funding, and it would be the obvious thing to see addressed in this year’s budget. But expectations have already been lowered by the government’s insistence there will be no new money.

The report’s second high-level theme is the engagement of government with scientific strategy. Government announcements to date seem focused on attracting international investment through changes to tax settings and regulation. I would argue this is a matter of focusing on the wrapping rather than the present: the system itself needs to be attractive to investors.

Creating a thriving research sector is also a matter of scale. International cooperation is one way for New Zealand to access efficiencies of scale. And work on building international partnerships is one area of positive intent. But we need to look at our connectivity nationally as well, and use investment to build this further.

Countries with greater GDPs than New Zealand’s invest much more in research as a proportion of GDP. This means the size of these countries’ scientific ecosystems – measured by total expenditure – is three to four times New Zealand’s on a per capita basis, as the rough arithmetic below illustrates.
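
The compounding is simple: research spend per person is GDP per capita multiplied by the share of GDP invested in research. The sketch below uses round, hypothetical figures purely to show how a three-to-four-times gap can arise.

```python
# Illustrative only: research spend per person = GDP per capita x R&D share.
# GDP-per-capita figures are round hypothetical numbers, not official data;
# the R&D shares echo the 1.5% (NZ) figure cited above.

nz = {"gdp_per_capita_usd": 48_000, "rd_share": 0.015}
peer = {"gdp_per_capita_usd": 80_000, "rd_share": 0.030}  # hypothetical peer

nz_spend = nz["gdp_per_capita_usd"] * nz["rd_share"]        # ~$720
peer_spend = peer["gdp_per_capita_usd"] * peer["rd_share"]  # ~$2,400

print(f"NZ:   ~${nz_spend:,.0f} per person per year")
print(f"Peer: ~${peer_spend:,.0f} per person per year")
print(f"Ratio: {peer_spend / nz_spend:.1f}x")  # ~3.3x
```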

A matter of scale

Per-capita scale matters because it tells us how easy it is for researchers to find someone else with the right skillset or necessary equipment. It tells us how likely it is for a student to find an expert in New Zealand to teach them, rather than needing to go overseas.

And it tells us how quickly start-up companies in emerging technologies will be able to find the skilled employees they need. A thriving university system that attracts young people to develop the research skills needed by advanced technology companies is a key part of this challenge.

The government’s science sector reform aims to increase the sector’s contribution to economic growth. But research contributes to economic growth when scientists can “lean in” with confidence to commercialising and translating their science.

That can’t happen if budgets don’t fund the critical mass, connectivity and resources to stimulate the transition to a thriving science system.

The Conversation

Nicola Gaston receives funding from the Tertiary Education Commission as the Director of the MacDiarmid Institute for Advanced Materials and Nanotechnology. She also receives funding from the Marsden Fund. All research funding goes to the University of Auckland to pay the costs of the research she is employed to do.


For a Canadian in London, King Charles’ Royal Garden Party inspires sustainability education

On a glorious afternoon recently, I had the good fortune to attend a specially themed Education and Skills Garden Party hosted at Buckingham Palace in London to celebrate the contributions of educators in the United Kingdom and beyond.

As a Canadian citizen living and working in education in the United Kingdom, I was invited to attend by the High Commission of Canada in London.

The occasion provided a relaxing yet exciting opportunity to reflect on my involvement embedding sustainability into education related to innovation and intellectual property (IP) rights law.

Royal Gardens as oasis

King Charles has been a lifelong supporter of sustainability education, a relatively recent addition to school curricula. For me, the Royal garden and lake beautifully brought those sustainability concerns to life.

The King’s Royal garden at the Palace is an oasis in the city of London, alive with foliage and wildlife that guests may stroll around and explore. According to the event leaflet: “A survey of the Garden by the London Natural History Society revealed a wealth of flora and fauna, some quite rare species.”

Garden parties are a special way for members of the Royal Family to speak to a broad range of people, all of whom have made a positive impact on their community. Today these events are a way to recognize and reward public service.

A network of sponsors is used to invite guests, including lord-lieutenants, societies and associations, government departments and local government, as well as representatives of various churches and other faiths.

Charles first spoke out on the issue of pollution in 1970, when he was a 21-year-old student. The King continues to champion his lifelong passion for the health of the environment and sustainable living.

People striding across a lawn in front of a low palace.
‘The garden party at Buckingham Palace for Queen Victoria’s Golden Jubilee,’ painting by Frederick Sargent, 1887.
(Royal Collection (U.K.) 407255/Wikipedia)

Why intellectual property and sustainability?

Since 2004, I have been an innovation, intellectual property rights and business law educator. My research group contributed to a publication called The Guide to The Sustainable Development Goals (SDGs), developed to explore the connections between the United Nations’ 17 SDGs, sustainable development and IP.

Intellectual property matters because we need to envision and build a common future through innovation and creativity. Overcoming sustainability challenges depends on commercializing the new green technologies that catalyze change.

However, this process is complex. Choosing among solar, wind, hydro, geothermal and tidal energy technologies involves difficult trade-offs. IP rights, such as patents, provide practical scientific information about new green technologies. This information helps society prioritize public, private and alternative financing to support climate change mitigation and adaptation.

Canadian firms have patented numerous climate change mitigation technologies.

For example, the Toronto-based WhalePower has significantly advanced fluid dynamics and has filed Canadian, European Union, United States, Chinese and Indian patents to protect its new technology. Their award-winning invention, inspired by the bumpy flippers of humpback whales, results in more efficient and reliable wind turbine blades.




Read more:
Here’s why UK tides are soon going to play a much bigger part in powering your home


This “tubercle” technology, named after the small rounded projections found on bones, also has applications for hydroelectric turbines and for revolutionizing fan design. These blades, featuring tubercles (bumps) on the leading edge, reduce aerodynamic drag and improve performance. WhalePower also generates revenue by licensing its patented technology to other companies for use in wind turbines.

Patents encourage knowledge sharing

Patents encourage knowledge sharing, because the way the invention works must be disclosed, rather than kept secret.

For example, new tidal energy inventors can read WhalePower’s patents and be inspired to advance the technology further with incremental innovations.

A granted patent is published for free online and digitally tagged using globally recognized classification codes to facilitate easy searching by scientists, investors and financiers. The data collected on the patent register is also used to design new climate innovation research studies and inform policy-making.
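
To see what “digitally tagged for easy searching” looks like in practice, here is a minimal Python sketch of filtering patent records by classification prefix. CPC class Y02E genuinely covers climate change mitigation technologies related to energy generation and distribution; the records themselves are invented examples.

```python
# A minimal sketch of searching patent records by classification code.
# The CPC prefix Y02E is a real class for climate-mitigation energy
# technologies; the patent records below are invented examples.

patents = [
    {"title": "Tubercle leading-edge turbine blade", "cpc": ["Y02E 10/72", "F03D 1/06"]},
    {"title": "Solar-reflective facade coating",     "cpc": ["Y02B 80/00"]},
    {"title": "Tidal stream rotor with serrations",  "cpc": ["Y02E 10/30"]},
]

def by_cpc_prefix(records, prefix):
    """Yield records carrying at least one code under the given CPC prefix."""
    for rec in records:
        if any(code.startswith(prefix) for code in rec["cpc"]):
            yield rec

for hit in by_cpc_prefix(patents, "Y02E"):
    print(hit["title"])
# -> Tubercle leading-edge turbine blade
#    Tidal stream rotor with serrations
```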

In this manner, IP often stimulates investment by providing the legal rights needed to justify longer-term investment in a changing landscape of innovation.

Long-term investment in green technology is a form of environmental stewardship that I discuss in more detail in my article “Companies and UN 2030 Sustainable Development Goal 9 Industry, Innovation and Infrastructure.” IP rights support firms like WhalePower by providing the legal and knowledge tools that can bring sustainable development goals closer to fruition.

Patent attorneys and Earthshot Prize

The significant role of IP rights in promoting sustainability gained a higher profile when the United Kingdom’s Chartered Institute of Patent Attorneys (CIPA) became an Official Nominator for the annual Earthshot Prize launched by Prince William’s Royal Foundation in 2020.

CIPA helps to identify and nominate solutions for the environmental challenges that the prize aims to address. One nominated solution, Colorifix, uses DNA sequencing and nature’s own colours to create sustainable dyes that reduce the use of water and harmful chemicals in the fashion industry; it was a runner-up in the 2023 edition.




Read more:
Can marketing classes teach sustainability? 4 key insights


CIPA provides crucial IP rights checks to finalists, ensuring that their innovations have no outstanding IP issues. This partnership is an example of how the Royal Family works together with CIPA to use the power of IP to help solve sustainability challenges.

As the King stated when he was Prince of Wales in 2017: “Mine is not a new commitment, but perhaps you will allow me to restate my determination to join you in continuing to do whatever I can, for as long as I can, to maintain not only the health and vitality of the ocean and all that depends upon it, but also the viability of that greatest and most unique of living organisms — nature herself.”

The Conversation

Janice Denoncourt is affiliated with the British Association for Canadian Studies (BACS).


Challenges to high-performance computing threaten US innovation

Oak Ridge National Laboratory's Frontier supercomputer is one of the world's fastest. Oak Ridge Leadership Computing Facility, CC BY

High-performance computing, or HPC for short, might sound like something only scientists use in secret labs, but it’s actually one of the most important technologies in the world today. From predicting the weather to finding new medicines and even training artificial intelligence, high-performance computing systems help solve problems that are too hard or too big for regular computers.

This technology has helped make huge discoveries in science and engineering over the past 40 years. But now, high-performance computing is at a turning point, and the choices the government, researchers and the technology industry make today could affect the future of innovation, national security and global leadership.

High-performance computing systems are basically superpowerful computers made up of thousands or even millions of processors working together at the same time. They also use advanced memory and storage systems to move and save huge amounts of data quickly.
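
The core idea – many processors attacking pieces of one problem at the same time – can be sketched even on a single machine. The toy Python example below splits a sum across worker processes; real supercomputers coordinate vastly more cores with technologies such as MPI over fast interconnects, but the divide-and-combine shape is the same.

```python
# A toy, single-machine analogue of parallel HPC computation:
# split a big numerical job into chunks, compute the chunks in
# separate processes, then combine the partial results.

from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 8
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # sum of squares below n, computed in parallel
```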

With all this power, high-performance computing systems can run extremely detailed simulations and calculations. For example, they can simulate how a new drug interacts with the human body, or how a hurricane might move across the ocean. They’re also used in fields such as automotive design, energy production and space exploration.

Lately, high-performance computing has become even more important because of artificial intelligence. AI models, especially the ones used for things such as voice recognition and self-driving cars, require enormous amounts of computing power to train. High-performance computing systems are well suited for this job. As a result, AI and high-performance computing are now working closely together, pushing each other forward.

Lawrence Livermore National Laboratory’s supercomputer El Capitan is currently the world’s fastest.

I’m a computer scientist with a long career working in high-performance computing. I’ve observed that high-performance computing systems are under more pressure than ever, with higher demands on the systems for speed, data and energy. At the same time, I see that high-performance computing faces some serious technical problems.

Technical challenges

One big challenge for high-performance computing is the gap between how fast processors can compute and how quickly memory systems can feed them data. Imagine having a superfast car but being stuck in traffic – it doesn’t help to have speed if the road can’t handle it. In the same way, high-performance computing processors often sit idle because memory systems can’t deliver data fast enough. This makes the whole system less efficient.
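
The traffic-jam analogy has a standard quantitative form, the roofline model: a computation is memory-bound when its arithmetic intensity (operations per byte of data moved) falls below the machine’s ratio of peak compute to peak memory bandwidth. The hardware numbers in this Python sketch are illustrative assumptions, not the specifications of any particular system.

```python
# Back-of-envelope roofline check. Hardware figures are assumed,
# ballpark values for a modern HPC node.

peak_flops = 50e12      # 50 Tflop/s peak compute (assumed)
peak_bandwidth = 2e12   # 2 TB/s memory bandwidth (assumed)
machine_balance = peak_flops / peak_bandwidth  # flops needed per byte: 25

# daxpy-style update y[i] += a * x[i]: 2 flops per 24 bytes moved
# (read x and y, write y, 8 bytes each in 64-bit precision)
intensity = 2 / 24

attainable = min(peak_flops, intensity * peak_bandwidth)
print(f"Machine balance: {machine_balance:.0f} flop/byte")
print(f"Kernel intensity: {intensity:.3f} flop/byte -> memory-bound")
print(f"Attainable: {attainable / 1e9:.0f} Gflop/s "
      f"of {peak_flops / 1e12:.0f} Tflop/s peak")
# ~167 Gflop/s out of 50 Tflop/s: the processor mostly waits on memory.
```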

Another problem is energy use. Today’s supercomputers use a huge amount of electricity, sometimes as much as a small town. That’s expensive and not very good for the environment. In the past, as computer parts got smaller, they also used less power. But that trend, called Dennard scaling, stopped in the mid-2000s. Now, making computers more powerful usually means they use more energy too. To fix this, researchers are looking for new ways to design both the hardware and the software of high-performance computing systems.
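
The scaling behind this paragraph comes from the standard approximation for dynamic chip power, P ≈ C·V²·f (capacitance times voltage squared times frequency). While Dennard scaling held, the voltage V dropped with each generation, so the frequency f could rise at roughly constant power; once V stopped shrinking, higher clocks began to cost extra watts. The numbers in this sketch are purely illustrative.

```python
# Why the end of Dennard scaling matters, in one formula:
# dynamic power P ~ C * V^2 * f. All parameter values are illustrative.

def dynamic_power(c_farads, v_volts, f_hertz):
    return c_farads * v_volts**2 * f_hertz

base = dynamic_power(1e-9, 1.2, 2e9)           # a baseline core: ~2.9 W
faster_same_v = dynamic_power(1e-9, 1.2, 3e9)  # 1.5x clock, same V: ~4.3 W

print(f"Baseline: {base:.2f} W, 1.5x clock: {faster_same_v:.2f} W")
# Power grows linearly with f at fixed V, and quadratically with any
# voltage bump needed to sustain that higher frequency.
```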

There’s also a problem with the kinds of chips being made. The chip industry is mainly focused on AI, which works fine with lower-precision math like 16-bit or 8-bit numbers. But many scientific applications still need 64-bit precision to be accurate. The more bits a number format has, the more significant digits it can represent, hence the greater the precision. If chip companies stop making the parts that scientists need, then it could become harder to do important research.
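
A quick NumPy experiment makes the precision gap tangible: accumulate many small values in 16-bit floats and the running total stalls, while 64-bit arithmetic stays accurate.

```python
import numpy as np

# Summing 100,000 copies of 0.001. True answer: 100.0.
values = np.full(100_000, 0.001)

total16 = np.float16(0.0)
for v in values:
    total16 = np.float16(total16 + np.float16(v))  # round after every add
    # Once the running total is large enough, adding 0.001 falls below
    # half a unit in the last place and the sum stops growing.

total64 = values.astype(np.float64).sum()

print(f"float16 running sum: {float(total16):g}")  # stalls far below 100
print(f"float64 sum:         {total64:g}")         # 100 (to rounding)
```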

This report discusses how trends in semiconductor manufacturing and commercial priorities may diverge from the needs of the scientific computing community, and how a lack of tailored hardware could hinder progress in research.

One solution might be to build custom chips for high-performance computing, but that’s expensive and complicated. Still, researchers are exploring new designs, including chiplets – small chips that can be combined like Lego bricks – to make high-precision processors more affordable.

A global race

Globally, many countries are investing heavily in high-performance computing. Europe has the EuroHPC program, which is building supercomputers in places such as Finland and Italy. Their goal is to reduce dependence on foreign technology and take the lead in areas such as climate modeling and personalized medicine. Japan built the Fugaku supercomputer, which supports both academic research and industrial work. China has also made major advances, using homegrown technology to build some of the world’s fastest computers. All of these countries’ governments understand that high-performance computing is key to their national security, economic strength and scientific leadership.

The U.S.-China supercomputer rivalry explained.

The United States, which has been a leader in high-performance computing for decades, recently completed the Department of Energy’s Exascale Computing Project. This project created computers that can perform a billion billion operations per second. That’s an incredible achievement. But even with that success, the U.S. still doesn’t have a clear, long-term plan for what comes next. Other countries are moving quickly, and without a national strategy, the U.S. risks falling behind.

I believe that a U.S. national strategy should include funding new machines and training for people to use them. It would also include partnerships with universities, national labs and private companies. Most importantly, the plan would focus not just on hardware but also on the software and algorithms that make high-performance computing useful.

Hopeful signs

One exciting area for the future is quantum computing. This is a completely new way of doing computation based on the laws of physics at the atomic level. Quantum computers could someday solve problems that are impossible for regular computers. But they are still in the early stages and are likely to complement rather than replace traditional high-performance computing systems. That’s why it’s important to keep investing in both kinds of computing.

The good news is that some steps have already been taken. The CHIPS and Science Act, passed in 2022, provides funding to expand chip manufacturing in the U.S. It also created an office to help turn scientific research into real-world products. The task force Vision for American Science and Technology, launched on Feb. 25, 2025, and led by American Association for the Advancement of Science CEO Sudip Parikh, aims to marshal nonprofits, academia and industry to help guide the government’s decisions. Private companies are also spending billions of dollars on data centers and AI infrastructure.

All of these are positive signs, but they don’t fully solve the problem of how to support high-performance computing in the long run. In addition to short-term funding and infrastructure investments, this means:

  • Long-term federal investment in high-performance computing R&D, including advanced hardware, software and energy-efficient architectures.
  • Procurement and deployment of leadership-class computing systems at national labs and universities.
  • Workforce development, including training in parallel programming, numerical methods and AI-HPC integration.
  • Hardware road map alignment, ensuring commercial chip development remains compatible with the needs of scientific and engineering applications.
  • Sustainable funding models that prevent boom-and-bust cycles tied to one-off milestones or geopolitical urgency.
  • Public-private collaboration to bridge gaps between academic research, industry innovation and national security needs.

High-performance computing is more than just fast computers. It’s the foundation of scientific discovery, economic growth and national security. With other countries pushing forward, the U.S. is under pressure to come up with a clear, coordinated plan. That means investing in new hardware, developing smarter software, training a skilled workforce and building partnerships between government, industry and academia. If the U.S. does that, the country can make sure high-performance computing continues to power innovation for decades to come.

The Conversation

Jack Dongarra receives funding from the NSF and the DOE.