Program

Thursday, April 4, 2024, 10:30-14:00
University of Zurich, Irchel Campus, Institute of Neuroinformatics Foyer and lecture hall Y35 F51. See Travel/Location.

10:30 Registration; pick up badges in the INI foyer, where lunch will take place.


11:00-11:30 Welcome and introduction (lecture hall Y35 F51, one floor down and beneath the "rock desert" outside INI)

11:00-11:15 Welcome (Prof. Shih-Chii Liu, Chair, IEEE Swiss CAS/ED Chapter)

11:15-11:30 Introduction to IEEE WiCAS & YP (Prof. Yoko Uwate, Diversity, Equity and Inclusion committee chair, Tokushima University)

11:30-11:35 Networking is key to a career (Prof. Laura Bégon-Lours, D-ITET, ETH Zurich)

11:30-14:00 Lunch with posters & demonstrations
Foyer of the Institute of Neuroinformatics, Building 55, level G


Posters

Yoko Uwate, Tokushima University

Visualization of Neuronal Activity Using Attractor Reconstruction

This study analyzes the developmental status of mouse brains using nonlinear time-series analysis. Attractor embedding makes it possible to trace how the mouse brain develops.
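
As an illustration of the embedding step, here is a minimal Python sketch of Takens-style delay embedding (the dimension and delay values are arbitrary placeholders, not those used in the study):

    import numpy as np

    def delay_embed(x, dim=3, tau=5):
        # Takens-style delay embedding: row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau]).
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    # Toy example: a noisy sine wave standing in for a neural recording.
    t = np.linspace(0, 20 * np.pi, 2000)
    signal = np.sin(t) + 0.05 * np.random.randn(t.size)
    attractor = delay_embed(signal, dim=3, tau=25)
    print(attractor.shape)  # (1950, 3): points tracing the reconstructed attractor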

Xiaying Wang, IIS, ETH Zurich

Near-Sensor Analytics and Machine Learning for Long-Term Wearable Biomedical Systems

Wearable biomedical devices are increasingly crucial in sectors where privacy and low latency matter. They must perform continuous monitoring and data analysis akin to human experts, but on tiny batteries, creating energy-efficiency challenges for both hardware and software. Machine learning and deep learning techniques have shown promise in surpassing human performance in data analytics, yet these methods typically demand computational power unsuitable for low-power wearables. In this poster, we present our research efforts in pushing beyond the current power walls of machine learning and bio-signal processing, moving toward milliwatt (or even microwatt) continuously active, long-term wearable biomedical systems.

Giusy Spacone, IIS, ETH Zurich

Wearables and at-body AI for next-generation human-machine interfaces: an arm-centric approach


Biosignal sensing presents significant potential for developing bio-inspired human-machine interfaces (HMIs), with applications across various domains. This poster presents our research efforts in the design of circuits, systems, and tiny machine learning for ultra-low-power (ULP), arm-centric human-machine interfaces based on surface electromyography (sEMG) and A-mode ultrasound (US). We present BioGAP and WULPUS, groundbreaking wearable sensing platforms with onboard computing for ULP real-time signal processing. Embedded into innovative EMG and ultrasound armbands, BioGAP and WULPUS demonstrate state-of-the-art accuracy in wearable gesture classification and continuous movement regression, with sub-30 mW power consumption and multi-day battery lifetime.
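
Purely as an illustration of the gesture-classification task, here is a minimal PyTorch sketch of a small 1-D CNN over windowed multi-channel sEMG (channel count, window length, and architecture are hypothetical placeholders, not the BioGAP network):

    import torch
    import torch.nn as nn

    # Hypothetical sizes: 8 sEMG channels, 200-sample windows, 5 gestures.
    class TinyEMGNet(nn.Module):
        def __init__(self, channels=8, classes=5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(channels, 16, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool1d(4),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
            )
            self.classifier = nn.Linear(32, classes)

        def forward(self, x):              # x: (batch, channels, samples)
            return self.classifier(self.features(x).squeeze(-1))

    window = torch.randn(1, 8, 200)        # one window of fake sEMG data
    logits = TinyEMGNet()(window)
    print(logits.argmax(dim=1))            # predicted gesture index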

Emma Boulharts, IBM Zurich

Investigating the impact of low precision digital operations on neural networks deployed on mixed-signal in-memory computing accelerators

Recent advancements in AI hardware highlight mixed-signal accelerators, which combine analog computation for matrix multiplications with digital operations, showcasing remarkable performance and efficiency. Here, we investigate the robustness of networks deployed on these mixed-signal chips to reduced-precision digital operations. We perform post-training quantization of the digital operations in networks trained with IBM's AIHWKit, down to 8 and 6 bits, and evaluate on CNNs and Transformers. Our findings reveal that 8-bit precision yields satisfactory accuracy, while 6-bit precision poses challenges, particularly due to normalization layers. Future research will delve into quantization-aware training of the digital layers to enhance network robustness.
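
As a rough illustration of the post-training step, here is a generic uniform-quantization sketch in Python (not the AIHWKit API; a real flow would also quantize activations and calibrate scales):

    import numpy as np

    def quantize(w, bits):
        # Symmetric uniform quantization: snap weights to a (2**bits - 1)-level grid.
        scale = np.abs(w).max() / (2 ** (bits - 1) - 1)
        return np.round(w / scale) * scale

    w = np.random.randn(64, 64).astype(np.float32)
    for bits in (8, 6):
        err = np.abs(w - quantize(w, bits)).mean()
        print(f"{bits}-bit mean abs error: {err:.5f}")  # error grows as bits shrink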


Manasi Muglikar, RPG, UZH

Seeing behind dynamic occlusions with Event Cameras

Weather conditions like snow and rain degrade the performance of computer vision algorithms. Traditional camera-based methods struggle to remove occlusions because they tend to hallucinate the background. We introduce a novel approach that combines a traditional camera with an event camera to remove occlusions and reconstruct the background from a single viewpoint.


Demos

Xiang Deng, Sensors Group, Institute of Neuroinformatics, UZH & ETHZ

Dextra: Rock-Scissors-Paper Robot

Dextra combines a DVS event camera, a CNN computed on a laptop GPU, and a new, very fast tendon-driven robotic hand to play rock, scissors, paper with people.
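
The stages fit together roughly as in this toy Python loop (all names are hypothetical stand-ins, not Dextra's actual software):

    import random

    SYMBOLS = ["rock", "scissors", "paper"]
    COUNTER = {"rock": "paper", "scissors": "rock", "paper": "scissors"}

    def accumulate_and_classify(window_ms=10):
        # Stand-in for binning DVS events into a frame and classifying it with the CNN.
        return random.choice(SYMBOLS)

    def play_round():
        symbol = accumulate_and_classify()   # event frame -> CNN -> recognized shape
        return COUNTER[symbol]               # answer with the winning shape

    print([play_round() for _ in range(3)])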

Tobi Delbruck (for Marcin Paluch), Sensors Group, Institute of Neuroinformatics, UZH & ETHZ

Cartpole Robot

The cartpole is controlled by a neural network (MLP) computed on an FPGA and trained to imitate an optimal nonlinear controller. The MLP is evaluated in under 5 µs and is built with hls4ml, a tool developed by the particle physics community.
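
For readers curious about the toolflow, here is a minimal hls4ml sketch for a small Keras MLP (layer sizes and FPGA part are placeholders, not the actual cartpole design; API details vary across hls4ml versions):

    import hls4ml
    from tensorflow import keras

    # Small MLP standing in for the imitation-learned controller.
    model = keras.Sequential([
        keras.Input(shape=(6,)),                  # e.g. cart/pole state variables
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1),                    # control output
    ])

    # Generate and build an HLS project from the Keras model.
    config = hls4ml.utils.config_from_keras_model(model, granularity="model")
    hls_model = hls4ml.converters.convert_from_keras_model(
        model, hls_config=config, output_dir="hls_cartpole",
        part="xc7z020clg400-1",                   # placeholder FPGA part
    )
    hls_model.compile()                           # C simulation build for testing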