Academic projects showcase
Projects developed by undergraduate students using neuromorphic sensing, algorithms, and processing hardware.

Hybrid artificial and spiking neural networks for closed-loop motor control
A spiking neural network built from LuI silicon neurons converts the sound-processing output of an ANN into spike trains that drive a servo motor; a second ANN then classifies the spike trains to determine the motor direction (a minimal encoding sketch follows the team list).
Team:
- Selin Schmitt - schmittse79503@th-nuernberg.de
- Timon Löwl - loewlti80780@th-nuernberg.de
- Sven Remy - remysv80967@th-nuernberg.de
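
A minimal sketch of the encoding and direction-decision steps, assuming Poisson rate coding of a normalized ANN output and a simple spike-count threshold standing in for the second, classifying ANN; the names, rates, and threshold are illustrative placeholders, not the project's actual implementation.

```python
import numpy as np

def rate_encode(ann_output, duration=0.1, dt=1e-3, max_rate=200.0, seed=0):
    """Poisson rate-encode a normalized ANN output (0..1) into a boolean spike train."""
    rng = np.random.default_rng(seed)
    rate = np.clip(ann_output, 0.0, 1.0) * max_rate   # spikes per second
    n_steps = int(duration / dt)
    return rng.random(n_steps) < rate * dt             # True marks a spike

def decide_direction(spike_train, threshold=10):
    """Stand-in for the classifying ANN: choose the servo direction from the spike count."""
    return "clockwise" if int(spike_train.sum()) >= threshold else "counter-clockwise"

# Example: a sound-processing score of 0.8 yields a dense train and a clockwise command.
spikes = rate_encode(0.8)
print(decide_direction(spikes), int(spikes.sum()), "spikes")
```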

Spiking Silicon Neurons for Text2Morse Conversion
Neuromorphic data encoding and decoding with LuI silicon neurons, used to implement an efficient spike-train-based Text2Morse converter (a minimal encoding sketch follows the team list).
Team:
- Tan Phat Nguyen - phatnguyen@gmx.de
- Paul Schmachtl - paul.schmachtl@outlook.com
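
A minimal sketch of the spike-train encoding, assuming dots and dashes are expressed as short and long bursts of spikes separated by silent gaps; the Morse table, timings, and function names are illustrative and do not reflect the LuI-neuron implementation.

```python
MORSE = {"S": "...", "O": "---"}  # minimal illustrative table

def text_to_spike_train(text, dt=1e-3, dot=0.05, dash=0.15, gap=0.05):
    """Encode each Morse symbol as a burst of spikes (1s) followed by a silent gap (0s)."""
    train = []
    for ch in text.upper():
        for symbol in MORSE.get(ch, ""):
            burst = dot if symbol == "." else dash
            train += [1] * round(burst / dt) + [0] * round(gap / dt)
        train += [0] * round(2 * gap / dt)  # extra inter-letter gap
    return train

spikes = text_to_spike_train("SOS")
print(len(spikes), "time steps,", sum(spikes), "spikes")
```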

Event-based camera spatial and temporal clustering for predictive maintenance
Design and development of a spatial and temporal clustering algorithm for a neuromorphic event-based camera, providing frequency detection for machine state estimation and predictive maintenance (a minimal sketch of both stages follows the team list).
Team:
- Benedikt Fischer - fischerbe98484@th-nuernberg.de
- Felix Sixdorf - sixdorffe80095@th-nuernberg.de
- David Stiegler - stieglerda78912@th-nuernberg.de
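
A minimal sketch of the two stages, assuming events are clustered spatially with DBSCAN and the dominant frequency of each cluster is estimated from its inter-event intervals; the parameters and the frequency estimator are assumptions, not the project's algorithm.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_events(xy):
    """Spatial stage: label DVS events (N x 2 pixel coordinates) with DBSCAN clusters."""
    return DBSCAN(eps=5, min_samples=10).fit(xy).labels_

def estimate_frequency(timestamps):
    """Temporal stage: dominant frequency from the median inter-event interval of one cluster."""
    ts = np.sort(np.asarray(timestamps, dtype=float))
    if ts.size < 2:
        return 0.0
    return 1.0 / np.median(np.diff(ts))

# Toy cluster of events firing roughly every 20 ms -> about 50 Hz.
t = 0.02 * np.arange(50) + np.random.default_rng(0).normal(0, 1e-4, 50)
print(f"estimated frequency: {estimate_frequency(t):.1f} Hz")
```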

Spiking PID Controller for Mobile Robot Trajectory Tracking
A neuromorphic PID controller implemented in Nengo to improve the trajectory-tracking performance of a differential-drive mobile robot under uncertainty (a Nengo sketch follows the team list).
Team:
- Adrian Stangl - stanglad98626@th-nuernberg.de
- Bastian Wunderlich - wunderlichba98628@th-nuernberg.de
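
A minimal Nengo sketch of the idea, reduced to a PI core (the derivative term is omitted) with the integral term built from a recurrently connected ensemble; the gains, time constants, and the plant-feedback node are placeholders rather than the controller actually used on the robot.

```python
import nengo

Kp, Ki = 2.0, 0.5   # placeholder gains
tau = 0.1           # synaptic time constant for the neural integrator

model = nengo.Network(label="spiking PI controller sketch")
with model:
    setpoint = nengo.Node(lambda t: 1.0)       # step reference to track
    feedback = nengo.Node(lambda t: 0.0)       # placeholder for the robot's measured state
    error = nengo.Ensemble(n_neurons=100, dimensions=1)
    integrator = nengo.Ensemble(n_neurons=100, dimensions=1)
    control = nengo.Node(size_in=1)            # control signal sent to the wheel velocities

    nengo.Connection(setpoint, error)
    nengo.Connection(feedback, error, transform=-1)

    # Recurrent connection turns the ensemble into a neural integrator of the error.
    nengo.Connection(error, integrator, transform=tau, synapse=tau)
    nengo.Connection(integrator, integrator, synapse=tau)

    # Proportional and integral contributions to the control output.
    nengo.Connection(error, control, transform=Kp, synapse=0.01)
    nengo.Connection(integrator, control, transform=Ki, synapse=0.01)

    u_probe = nengo.Probe(control, synapse=0.01)

with nengo.Simulator(model) as sim:
    sim.run(1.0)
print(sim.data[u_probe][-1])  # control value at the end of the run
```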

Follow-the-leader robot swarm
A neuromorphic-vision-based frequency-tracking algorithm with closed-loop control, running on the embedded processors of collaborative mini-robots.
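
A minimal closed-loop sketch, assuming the leader carries a blinking marker: pixels whose event rate matches a target frequency are averaged into a centroid, and a proportional steering command keeps that centroid centred in the follower's view. The frequencies, gain, and image width are illustrative assumptions.

```python
import numpy as np

TARGET_HZ, TOL_HZ, IMG_WIDTH, K_STEER = 100.0, 10.0, 240, 0.01

def track_leader(event_x, event_rate_hz):
    """Average the x position of events whose pixel blinks near the target frequency."""
    mask = np.abs(event_rate_hz - TARGET_HZ) < TOL_HZ
    return float(np.mean(event_x[mask])) if mask.any() else None

def steering_command(centroid_x):
    """Proportional turn rate that recentres the leader's marker in the image."""
    return 0.0 if centroid_x is None else K_STEER * (IMG_WIDTH / 2 - centroid_x)

x = np.array([100, 110, 180]); rates = np.array([98.0, 103.0, 30.0])
print(steering_command(track_leader(x, rates)))  # turn toward the ~100 Hz marker
```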

Scene understanding for robot motion
Fusion of neuromorphic visual sensing with LiDAR for scene understanding and motion planning in mobile robot control.

Camera-based instruments for karaoke
Users can accompany their favourite songs instrumentally by imitating selected instruments with gestures sensed using a neuromorphic vision sensor.
Team:
- Selin Schmitt - schmittse79503@th-nuernberg.de
- Kay Hartmann - hartmannka80488@th-nuernberg.de

Crowd-sourced visuals
Privacy-preserving generation of club visuals, based on neuromorphic video sensing synchronized with sound generation.
Team:
- Selin Schmitt - schmittse79503@th-nuernberg.de
- Kay Hartmann - hartmannka80488@th-nuernberg.de

Event-based camera-LiDAR fusion for clustering-based depth estimation
This project addresses the challenge of estimating depth with a single DVS (Dynamic Vision Sensor) camera and a 2D LiDAR. Built on ROS2 for communication and data processing, the system runs on various platforms, including wheeled robots with a Jetson Nano or similar hardware. A custom clustering algorithm estimates the depth of objects outside the LiDAR's measurement plane, improving depth perception while keeping the setup affordable (a minimal sketch of the cluster-to-range association follows the team list).
Team:
- Annika Igl - aiglgg@web.de
- Timo Kapellner - timo.kapellner@gmail.com
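
A minimal sketch of the cluster-to-range association step, assuming a pinhole-style mapping from a pixel column to a horizontal bearing and a nearest-bearing lookup in the 2D scan; the sensor width, field of view, and function names are assumptions, and the ROS2 plumbing around it is omitted.

```python
import numpy as np

def pixel_to_bearing(u, width=346, hfov_deg=60.0):
    """Map a DVS pixel column to a horizontal bearing angle (simple pinhole-style assumption)."""
    return np.deg2rad((u / width - 0.5) * hfov_deg)

def assign_cluster_depths(cluster_centroids, scan_angles, scan_ranges):
    """Give each event-cluster centroid the LiDAR range measured at the closest bearing."""
    depths = []
    for u, _v in cluster_centroids:   # _v: clusters above/below the scan plane reuse the same bearing
        idx = int(np.argmin(np.abs(scan_angles - pixel_to_bearing(u))))
        depths.append(scan_ranges[idx])
    return np.array(depths)

# Toy scan: 180 beams over +/-90 degrees, everything 2 m away; two clusters near the image centre.
angles = np.linspace(-np.pi / 2, np.pi / 2, 180)
ranges = np.full(180, 2.0)
print(assign_cluster_depths([(170.0, 60.0), (200.0, 200.0)], angles, ranges))
```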