Article about RETINA published in Photonics Views
- June 2, 2025
- Posted by: joan
- Category: News

Smart Photonic Sensors is a four-year collaborative research project (2024–2028) aimed at developing a new generation of compact, low-power “multimodal” perception systems by combining two complementary photonic technologies:
- Spectral Imaging Module: A miniaturized spectral imager capable of capturing detailed wavelength-resolved information across visible and near-infrared bands. This subsystem will be built around advanced tunable filters and photonic-chip receivers to deliver high-resolution hyperspectral data in a small form factor.
- Lidar Module on Photonic Chip: An integrated lidar (light detection and ranging) sensor fabricated on a photonic-chip platform. By leveraging silicon photonics, this module will provide precise distance-and-shape measurements with low power consumption, high scanning speed, and immunity to electromagnetic interference.
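As a point of reference for how different these two outputs are, the sketch below defines minimal Python containers for a spectral cube and a lidar depth map. The class names, array shapes, band count, and wavelength range are illustrative assumptions, not project specifications.

```python
# Illustrative containers for the two sensor outputs; all names, shapes,
# and band counts are hypothetical and not taken from the project.
from dataclasses import dataclass
import numpy as np


@dataclass
class SpectralFrame:
    """Wavelength-resolved image: one reflectance value per pixel per band."""
    cube: np.ndarray             # shape (height, width, n_bands)
    wavelengths_nm: np.ndarray   # shape (n_bands,), e.g. 450-950 nm


@dataclass
class LidarFrame:
    """Depth map from the photonic-chip lidar, in the lidar's own frame."""
    depth_m: np.ndarray          # shape (height, width), range in metres
    timestamp_s: float


# Example: a 640 x 480 spectral cube with 32 bands and a matching depth map.
spectral = SpectralFrame(
    cube=np.zeros((480, 640, 32), dtype=np.float32),
    wavelengths_nm=np.linspace(450, 950, 32),
)
lidar = LidarFrame(depth_m=np.ones((480, 640), dtype=np.float32), timestamp_s=0.0)
```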
The core idea is to fuse the two data streams (spectral and depth information) in real time through onboard signal-processing electronics and algorithms. By merging “what” (material and color properties from spectral data) and “where” (3D structure from lidar), the project seeks to create a versatile perception engine that can be embedded into applications such as the following (a minimal code sketch of the fusion step appears after the list):
- Autonomous Navigation: Enabling drones, robots, or self-driving vehicles to simultaneously recognize objects/materials (e.g., vegetation vs. pavement) and map their surroundings in 3D for safer path planning.
- Precision Agriculture: Monitoring crop health through hyperspectral signatures (e.g., water stress, nutrient levels) while mapping field topography to optimize irrigation or harvesting strategies.
- Environmental Surveillance: Detecting pollutants or hazardous materials (via characteristic spectral fingerprints) together with terrain profiling in remote or industrial settings.
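To make the “what plus where” fusion concrete, here is a minimal sketch that attaches a coarse material label derived from the spectral cube to 3D points back-projected from the depth map. The pinhole intrinsics, the NDVI-style vegetation index, and the 0.3 threshold are assumptions chosen for illustration, not the project’s actual fusion algorithm.

```python
import numpy as np


def fuse_what_and_where(cube, wavelengths_nm, depth_m, fx, fy, cx, cy):
    """Attach a coarse material label ("what") to each 3D point ("where").

    cube:           (H, W, B) spectral reflectance cube
    wavelengths_nm: (B,) band centres in nanometres
    depth_m:        (H, W) depth map co-registered with the cube
    fx, fy, cx, cy: assumed pinhole intrinsics of the shared optical frame
    """
    H, W, _ = cube.shape

    # "What": a simple NDVI-style index separating vegetation from background.
    red = cube[:, :, np.argmin(np.abs(wavelengths_nm - 660))]
    nir = cube[:, :, np.argmin(np.abs(wavelengths_nm - 850))]
    ndvi = (nir - red) / (nir + red + 1e-6)
    label = (ndvi > 0.3).astype(np.float32)      # 1 = vegetation, 0 = other

    # "Where": back-project each pixel into a 3D point using the depth map.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    points = np.stack([x, y, depth_m], axis=-1)  # (H, W, 3)

    # Fused output: one labelled 3D point per pixel.
    return np.concatenate([points, label[..., None]], axis=-1)  # (H, W, 4)
```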
Key development phases include:
- Design and Fabrication (Months 1–18): Creating prototype spectral-imager chips (with integrated micro-filters) and photonic-chip lidar units (using silicon-nitride waveguides and modulators).
- Algorithm and Firmware Integration (Months 19–30): Developing real-time data-fusion algorithms that align spectral frames with depth maps, plus on-device preprocessing to reduce data bandwidth (a rough alignment-and-binning sketch follows this list).
- System Integration and Packaging (Months 31–42): Combining both photonic chips, associated electronics, and optics into a robust, handheld or vehicle-mountable enclosure with thermal management and power supplies.
- Field Trials and Validation (Months 43–48): Testing in real-world scenarios (e.g., greenhouse environments for agriculture, urban streets for navigation) to benchmark performance against separate commercial spectral imagers and lidar units.
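A rough sketch of what the alignment and bandwidth-reduction steps in the second phase might involve: reprojecting lidar points into the spectral camera with an assumed calibration, and averaging adjacent spectral bands on-device to cut the data rate. The extrinsic transform, intrinsics, and binning factor are placeholders, not project parameters.

```python
import numpy as np


def align_depth_to_spectral(points_lidar, T_spec_from_lidar, fx, fy, cx, cy, shape):
    """Reproject lidar 3D points into the spectral camera so the depth map is
    aligned pixel-for-pixel with the spectral cube. The 4x4 transform and the
    intrinsics stand in for calibration values obtained during integration."""
    H, W = shape
    # Move points from the lidar frame into the spectral-camera frame.
    p = points_lidar @ T_spec_from_lidar[:3, :3].T + T_spec_from_lidar[:3, 3]
    z = p[:, 2]
    valid = z > 0
    u = np.round(fx * p[valid, 0] / z[valid] + cx).astype(int)
    v = np.round(fy * p[valid, 1] / z[valid] + cy).astype(int)
    inside = (u >= 0) & (u < W) & (v >= 0) & (v < H)

    depth = np.full((H, W), np.nan, dtype=np.float32)
    depth[v[inside], u[inside]] = z[valid][inside]
    return depth


def bin_bands(cube, factor=4):
    """On-device preprocessing: average groups of adjacent bands to cut the
    spectral data rate by `factor` before streaming off the sensor head."""
    H, W, B = cube.shape
    b_out = B // factor
    return cube[:, :, :b_out * factor].reshape(H, W, b_out, factor).mean(axis=-1)
```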
The consortium includes academic research groups specializing in photonic-chip design, industrial partners with fabrication facilities for silicon-nitride (SiN) waveguides, and end-users from the automotive and agribusiness sectors. By the project’s end, the team expects to demonstrate a portable sensor head measuring under 10 × 10 × 5 cm, consuming less than 5 W, and delivering real-time fused imaging at 10 Hz across a 50° field of view.
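For a sense of scale behind the on-device preprocessing goal, the back-of-envelope estimate below shows the raw spectral data rate for a hypothetical 640 × 480-pixel, 32-band, 16-bit cube at the targeted 10 Hz update rate. Only the 10 Hz figure comes from the article; the resolution, band count, and bit depth are illustrative assumptions.

```python
# Back-of-envelope data-rate estimate; all values except the 10 Hz target
# are illustrative assumptions.
H, W, BANDS, BIT_DEPTH, FPS = 480, 640, 32, 16, 10

bits_per_frame = H * W * BANDS * BIT_DEPTH       # one raw spectral cube
raw_rate_gbps = bits_per_frame * FPS / 1e9       # gigabits per second

print(f"Raw spectral stream:    {raw_rate_gbps:.2f} Gbit/s")   # ~1.57 Gbit/s
print(f"After 4x band binning:  {raw_rate_gbps / 4:.2f} Gbit/s")
```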
More info here: https://onlinelibrary.wiley.com/doi/10.1002/phvs.202500006