News

Feb 24, 2026   |   by Catha Mayor

Eric Fossum Honored at 2026 Draper Prize Award Ceremony

On February 18th in Washington, DC, Dartmouth Engineering Professor Eric Fossum was officially presented with the 2026 NAE Charles Stark Draper Prize for Engineering.

In the News

Live Science

Jan 18, 2026

Could There Ever Be a Worldwide Internet Outage?

Professor George Cybenko is quoted in an article about the possibility of a worldwide internet outage. "It is possible but would require significant resources and/or huge coincidences, which makes it a highly unlikely, but possible, event," Cybenko said.

Yahoo Tech

Jan 13, 2026

Eric Fossum earns the prestigious Draper Prize for pioneering the CMOS image sensor

Professor Eric Fossum, director of the PhD Innovation Program, has been named the recipient of the 2026 Charles Stark Draper Prize for Engineering, one of the most prestigious honors for engineering achievement. "Eric Fossum is a pioneering semiconductor device physicist and engineer whose invention of the CMOS active pixel image sensor, or 'camera on a chip,' has transformed imaging across everyday life, industry, and scientific discovery," the NAE said in announcing the prize.

The Guardian

Jan 07, 2026

We study glaciers. ‘Artificial glaciers’ and other tech may halt their total collapse

Professor Colin Meyer co-authors an opinion piece about technological advances that could help prevent the total collapse of the world's glaciers. "Technologies we can bring to bear include satellite-based radar, solar-powered drones, robot submarines, lab-based 'artificial glaciers,' and advanced computing technologies, including artificial intelligence," he writes.

The African Exponent

Jan 06, 2026

How Sim Shagaya Built Konga Into One of Africa's Leading E‑Commerce Platforms

Simdul Shagaya Th'99, who earned his Master of Engineering Management degree from Thayer, is featured in a story about how he built his company Konga into one of Africa's leading e‑commerce platforms.

Research Quick Takes

Figure depicting reversible all-liquid conversion path

Feb 26, 2026

Next-Gen Batteries for Grid Storage

Research Associate Peiyu Wang Th'25, PhD students Huilin Qing, Baiheng Li, and Ruiwen Zhang, and Professor Weiyang (Fiona) Li co-authored "Semi-liquid lithium–sulfur batteries for large-scale energy storage" published in Nature Reviews Clean Technology. This review examines catholyte chemistry and design, static and redox flow configurations, and strategies to improve performance and scalability for large-scale energy storage. "Lithium–sulfur batteries offer high energy density and cost-effectiveness but are limited by the precipitation of solid sulfur species, which has driven interest in semi-liquid systems," said Li.

The study's graphical abstract.

Feb 19, 2026

Machine-Learning-Enabled Phototransistors

PhD student Simon Agnew '22, Research Associate Xavier Cadet, and professors Peter Chin and Will Scheideler co-authored "Decoding disorder: Machine learning unlocks multi-wavelength and intensity sensing in a single indium oxysulfide phototransistor" published in Device. The paper presents machine-learning-enabled phototransistors that decode both light wavelength and intensity from a single printed device—no filters or sensor arrays required. This work points toward simpler, lower-cost, and more scalable multi-parameter sensing for flexible optoelectronics. "By combining scalable liquid-metal printing of ultrathin indium oxysulfide with data-driven analysis, we show how disorder—often viewed as a limitation in printed semiconductors—can be turned into a powerful sensing feature," said Scheideler.
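
As a rough illustration of how such data-driven decoding can work, here is a minimal sketch with synthetic data; it is not the authors' model, and the feature and target values are hypothetical placeholders. A single multi-output regressor maps one device's photoresponse features to both wavelength and intensity at once.

    # Minimal sketch (synthetic data, not the paper's model): one regressor
    # recovers wavelength and intensity from a single device's response.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical dataset: each row is a photoresponse "fingerprint" from one
    # device measured under a known wavelength (nm) and intensity (mW/cm^2).
    n_samples, n_features = 500, 32
    X = rng.normal(size=(n_samples, n_features))   # response features
    y = np.column_stack([
        550.0 + 60.0 * X[:, 0],                    # wavelength (nm), toy relation
        5.0 + 2.0 * X[:, 1],                       # intensity (mW/cm^2), toy relation
    ])

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # One model predicts both targets from a single device's output,
    # standing in for separate filters or sensor arrays.
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print(model.predict(X_test[:3]))               # rows of [wavelength, intensity]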

Research figure depicting transfer learning

Feb 12, 2026

Better Metamaterial Design Via Transfer Learning

PhD students Xiangbei Liu, Ya Tang, and Huan Zhao and Professor Yan Li co-authored "A transfer learning–enabled framework for rapid property prediction toward scalable and data-efficient metamaterial design" published in Results in Engineering. When faced with new design requirements, conventional machine-learning approaches require substantial new datasets for retraining, essentially starting from scratch. Transfer learning can significantly reduce the amount of training data required while maintaining high accuracy and stability. "This approach provides a foundation for building a scalable, data-efficient knowledge base for future applications," said Li.
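
To make the data-efficiency idea concrete, here is a minimal transfer-learning sketch with synthetic data; the architecture, dataset sizes, and property relation are illustrative assumptions, not the paper's framework. A network pretrained on a large source dataset is adapted by freezing its shared layers and fine-tuning only the output layer on a much smaller target dataset.

    # Minimal transfer-learning sketch (synthetic data, illustrative only).
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    def make_data(n):
        # Toy stand-in for geometry descriptors -> effective property.
        X = torch.randn(n, 8)
        y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(n, 1)
        return X, y

    model = nn.Sequential(nn.Linear(8, 64), nn.ReLU(),
                          nn.Linear(64, 64), nn.ReLU(),
                          nn.Linear(64, 1))

    def train(net, X, y, params, epochs=200, lr=1e-3):
        opt = torch.optim.Adam(params, lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss_fn(net(X), y).backward()
            opt.step()

    # 1) Pretrain on a large "source" dataset.
    X_src, y_src = make_data(5000)
    train(model, X_src, y_src, model.parameters())

    # 2) Freeze the shared feature layers, then fine-tune only the output
    #    layer on a much smaller "target" dataset (the new requirement).
    for layer in list(model.children())[:-1]:
        for p in layer.parameters():
            p.requires_grad = False

    X_tgt, y_tgt = make_data(100)
    train(model, X_tgt, y_tgt, model[-1].parameters(), epochs=100)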