Dartmouth Engineering Study Presents Simple, Affordable Tool to Measure Tissue Oxygen and Health
The tool, developed in Professor Brian Pogue's lab, could offer a simple, affordable method far superior to blood oxygen measurements for detecting disease and making treatment decisions.
Dartmouth Ranks Among Top 100 Universities for Patents
The National Academy of Inventors recognizes innovation by faculty and researchers, including professors Hui Fang and Margie Ackerman.
Celebrating an AI Milestone and Guiding the Future
At Dartmouth, where AI began, researchers and scholars convene to define what comes next.
Eric Fossum Honored at 2026 Draper Prize Award Ceremony
On February 18th in Washington, DC, Dartmouth Engineering Professor Eric Fossum was officially presented with the 2026 NAE Charles Stark Draper Prize for Engineering.
Dartmouth Team Earns Distinguished Team Award for Online Engineering Innovation
The team behind Dartmouth Engineering's online Master of Engineering in Computer Engineering program is a 2026 recipient of the Distinguished Team Award from NERCOMP.
Research Quick Takes
Mar 05, 2026
Better Liver Transplant Decisions
PhD student Jiahui "Gary" Luo and Professor Wesley Marrero, with researchers at the University of Michigan, developed a new simulation framework to analyze liver transplant decisions, published in the proceedings of IEEE's 2025 Winter Simulation Conference. The team created a continuous-time simulation that models patient health and organ arrivals while mimicking real-world, varied organ acceptance practices. "The study concludes that high selectivity is a major obstacle to saving lives. Because small quality differences have modest survival effects, accepting a broader range of medically suitable organs can significantly reduce waiting times and maximize the lifesaving potential of the donor pool," said Marrero.
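The core trade-off the study describes, stricter acceptance criteria mean longer waits for an acceptable organ, can be illustrated with a toy offer-by-offer simulation. This is a minimal sketch for intuition only; the acceptance threshold, quality model, and function names are illustrative assumptions, not the published framework.

```python
import random

def simulate_waitlist(accept_threshold, n_organs=10000, seed=0):
    """Toy simulation of organ offers to a transplant wait list.

    Each arriving organ gets a random quality score in [0, 1]; a patient
    accepts it only if quality >= accept_threshold. Returns the average
    number of offers waited through before an acceptance.
    (Illustrative only -- not the published simulation framework.)
    """
    rng = random.Random(seed)
    waits = []
    offers_waited = 0
    for _ in range(n_organs):
        quality = rng.random()
        offers_waited += 1
        if quality >= accept_threshold:
            waits.append(offers_waited)
            offers_waited = 0
    return sum(waits) / len(waits)

# Higher selectivity -> longer expected wait for an acceptable organ.
picky = simulate_waitlist(accept_threshold=0.9)
broad = simulate_waitlist(accept_threshold=0.5)
assert picky > broad
```

Even this crude model shows the geometric blow-up in waiting time as the acceptance bar rises, which is the qualitative effect the study quantifies with realistic patient and organ dynamics.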
Feb 26, 2026
Next-Gen Batteries for Grid Storage
Research Associate Peiyu Wang Th'25, PhD students Huilin Qing, Baiheng Li, and Ruiwen Zhang, and Professor Weiyang (Fiona) Li co-authored "Semi-liquid lithium−sulfur batteries for large-scale energy storage" published in Nature Reviews Clean Technology. This review examines catholyte chemistry and design, static and redox flow configurations, and strategies to improve performance and scalability for large-scale energy storage. "Lithium–sulfur batteries offer high energy density and cost-effectiveness but are limited by the precipitation of solid sulfur species, which has driven interest in semi-liquid systems," said Li.
Feb 19, 2026
Machine-Learning-Enabled Phototransistors
PhD student Simon Agnew '22, Research Associate Xavier Cadet, and professors Peter Chin and Will Scheideler co-authored "Decoding disorder: Machine learning unlocks multi-wavelength and intensity sensing in a single indium oxysulfide phototransistor" published in Device. The paper presents machine-learning-enabled phototransistors that decode both light wavelength and intensity from a single printed device—no filters or sensor arrays required. This work points toward simpler, lower-cost, and more scalable multi-parameter sensing for flexible optoelectronics. "By combining scalable liquid-metal printing of ultrathin indium oxysulfide with data-driven analysis, we show how disorder—often viewed as a limitation in printed semiconductors—can be turned into a powerful sensing feature," said Scheideler.
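The paper's central idea, that a single device's multi-dimensional response can be inverted by a learned model to recover both wavelength and intensity, can be sketched with synthetic data. Everything below is a hypothetical stand-in: the two-feature response function, the calibration grid, and the nearest-neighbor decoder are illustrative assumptions, not the authors' device physics or model.

```python
import random

def device_response(wavelength_nm, intensity, rng):
    """Hypothetical two-feature response: the two components depend
    differently on wavelength and intensity, so jointly they are
    (approximately) invertible. Not the real device physics."""
    fast = intensity * (wavelength_nm / 500.0) + rng.gauss(0, 0.01)
    slow = intensity * (700.0 / wavelength_nm) + rng.gauss(0, 0.01)
    return (fast, slow)

rng = random.Random(0)
# "Calibration" set: known (wavelength, intensity) pairs and responses.
train = []
for wl in (450, 550, 650):
    for inten in (0.2, 0.5, 1.0):
        train.append((device_response(wl, inten, rng), (wl, inten)))

def decode(response):
    """1-nearest-neighbor decoder over the calibration set."""
    return min(train, key=lambda t: sum((a - b) ** 2
               for a, b in zip(t[0], response)))[1]

# One noisy measurement is enough to recover both light parameters.
wl, inten = decode(device_response(550, 0.5, rng))
assert (wl, inten) == (550, 0.5)
```

The design point is the same as in the paper: instead of building separate filtered sensors for each parameter, one rich (even disordered) response plus a data-driven inverse can recover multiple parameters at once.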
Feb 12, 2026
Better Metamaterial Design Via Transfer Learning
PhD students Xiangbei Liu, Ya Tang, and Huan Zhao, and Professor Yan Li are co-authors of "A transfer learning–enabled framework for rapid property prediction toward scalable and data-efficient metamaterial design" published in Results in Engineering. When faced with new requirements, conventional machine-learning approaches require substantial new datasets for retraining, effectively starting from scratch. Transfer learning can significantly reduce the required amount of training data while maintaining high accuracy and stability. "This approach provides a foundation for building a scalable, data-efficient knowledge base for future applications," said Li.
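The data-efficiency argument can be sketched in miniature: reuse a representation "pretrained" on a data-rich task and fit only a small head on a handful of target samples, rather than retraining everything. The frozen feature map and the target property law below are hypothetical stand-ins, not the paper's framework or data.

```python
# Minimal transfer-learning sketch: a frozen "pretrained" representation
# plus a tiny head fit on few target samples. (Illustrative only.)

def pretrained_features(x):
    """Stand-in for a frozen, source-task representation."""
    return (x, x * x)  # pretend these basis functions were learned upstream

def fit_head(samples):
    """Closed-form least squares for a 2-weight linear head,
    y ~ w0*f0 + w1*f1, via the normal equations."""
    s00 = s01 = s11 = b0 = b1 = 0.0
    for x, y in samples:
        f0, f1 = pretrained_features(x)
        s00 += f0 * f0; s01 += f0 * f1; s11 += f1 * f1
        b0 += f0 * y;   b1 += f1 * y
    det = s00 * s11 - s01 * s01
    return ((s11 * b0 - s01 * b1) / det, (s00 * b1 - s01 * b0) / det)

# A new "target" property law y = 3x + 2x^2, recovered from only four
# samples because the representation is reused rather than relearned.
target = [(x, 3 * x + 2 * x * x) for x in (1.0, 2.0, 3.0, 4.0)]
w0, w1 = fit_head(target)
assert abs(w0 - 3) < 1e-6 and abs(w1 - 2) < 1e-6
```

Fitting two head weights instead of an entire model is the toy analogue of the data savings the framework targets: most of the learned structure transfers, so each new design requirement needs only a small calibration set.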
