Projects

Analyzing Humans

AI Job Counselor:
Collect transcripts of interviews with leading practitioners of different professions, e.g. using categorizations like:

and then build a system that predicts your profession based on the words you use (e.g. from your WhatsApp messages).
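
A minimal sketch of such a classifier, assuming the interview transcripts are already available as labelled text; the toy data, the TF-IDF plus logistic-regression baseline, and the WhatsApp snippet are illustrative assumptions, not a prescribed design:

    # Predict a profession from word usage: a hedged baseline, not the final system.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins for the collected interview transcripts and their professions.
    transcripts = [
        "we scheduled the operation and discussed the patient's recovery",
        "the contract clause was contested before the appeals court",
    ]
    professions = ["medicine", "law"]

    # Bag-of-words (TF-IDF) features plus a linear classifier: a common text baseline.
    model = make_pipeline(TfidfVectorizer(stop_words="english"),
                          LogisticRegression(max_iter=1000))
    model.fit(transcripts, professions)

    # Apply the trained model to text a user exported, e.g. from WhatsApp.
    whatsapp_snippet = "can we move the hearing? the court date clashes with my deposition"
    print(model.predict([whatsapp_snippet])[0])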

Happy-Map:
Using a machine learning model trained on the movements of people carrying their mobile phones in their pockets, build an enhanced map that shows how happy people are at certain locations. A potential use case is exhibitions and museums, where visitors first download an app similar to Happimeter (Apple App Store, Google Play Store) and then feed their phones' accelerometer/GPS signals to a cloud server for real-time analysis.
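
A minimal sketch of the server-side aggregation, assuming samples arrive as (latitude, longitude, accelerometer window) triples; the mood heuristic, grid size, and coordinates are placeholder assumptions for a Happimeter-style model:

    from collections import defaultdict
    import numpy as np

    def mood_from_accelerometer(window):
        # Placeholder for the trained mood model: here a crude proxy that
        # treats more varied movement as a happier signal.
        return float(np.clip(np.std(window), 0.0, 1.0))

    def build_happy_map(samples, cell_size=0.0005):
        """samples: iterable of (lat, lon, accelerometer_window)."""
        cells = defaultdict(list)
        for lat, lon, window in samples:
            cell = (round(lat / cell_size), round(lon / cell_size))
            cells[cell].append(mood_from_accelerometer(np.asarray(window)))
        # Average mood per map cell, e.g. per room of an exhibition.
        return {cell: sum(scores) / len(scores) for cell, scores in cells.items()}

    # Example: two simulated visitors near the same museum location.
    samples = [(47.3769, 8.5417, np.random.randn(50)),
               (47.3770, 8.5418, np.random.randn(50))]
    print(build_happy_map(samples))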

Happy_Avatar:
Using image enhancement algorithms, build a system that shows an avatar picture of yourself based on your mood, i.e. when you are happy, your face picture is beautiful; when you are unhappy or sad, your face picture is ugly.
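
A minimal sketch of the mood-to-picture mapping with Pillow, assuming a mood score in [0, 1] is supplied by another component (e.g. the Happimeter); the file names and the brightness/colour mapping are illustrative assumptions, not the project's actual algorithm:

    from PIL import Image, ImageEnhance

    def render_avatar(path, mood):
        img = Image.open(path)
        # Happier mood -> brighter, more colourful picture; sad mood -> dull picture.
        img = ImageEnhance.Brightness(img).enhance(0.6 + 0.8 * mood)
        img = ImageEnhance.Color(img).enhance(0.4 + 1.2 * mood)
        return img

    # Hypothetical input file; writes two variants of the same face.
    render_avatar("avatar.jpg", mood=0.9).save("avatar_happy.jpg")
    render_avatar("avatar.jpg", mood=0.1).save("avatar_sad.jpg")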


Analyzing Animals

HappyCow:
Build an app that shows the mood of a cow based on a video stream of the cow's body posture and facial expression.
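
A minimal sketch of per-frame scoring on the video stream, assuming a trained posture/facial-expression classifier exists; the placeholder score_frame function and the stream URL are assumptions:

    import cv2  # OpenCV for reading the video stream

    def score_frame(frame):
        # Placeholder for the trained cow posture/facial-expression classifier;
        # a real system would run a CNN here and return a mood probability.
        return 0.5

    def stream_cow_mood(url="rtsp://barn-camera.example/stream"):  # assumed camera address
        cap = cv2.VideoCapture(url)
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            mood = score_frame(cv2.resize(frame, (224, 224)))
            print(f"cow mood estimate: {mood:.2f}")
        cap.release()

    stream_cow_mood()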

HappyCow-AR:
Build an AR system that shows the view of the cow from the Swiss alp: the cow wears a camera, and its video is displayed in a VR or AR view.
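
One way to sketch the plumbing, assuming the cow-worn camera exposes a network video stream: relay it as MJPEG so an AR/VR viewer (or a browser) can display it; the camera URL and the endpoint are assumptions:

    import cv2
    from flask import Flask, Response

    app = Flask(__name__)
    CAMERA_URL = "rtsp://cow-camera.example/stream"  # assumed address of the worn camera

    def frames():
        # Read frames from the cow's camera and re-encode them as JPEG parts.
        cap = cv2.VideoCapture(CAMERA_URL)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpg = cv2.imencode(".jpg", frame)
            yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + jpg.tobytes() + b"\r\n")

    @app.route("/cow-view")
    def cow_view():
        # MJPEG stream that a viewer application can embed.
        return Response(frames(), mimetype="multipart/x-mixed-replace; boundary=frame")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)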

Analyzing Plants

Singing plant:
Build a plant mood sensor translation system that takes voltage differentials from plants (e.g. basil or bamboo), measured with a plant SpikerBox or our own biolingo hardware, and translates the voltage differential waves into aurally pleasing melodies; see e.g. also the bamboo playing the music of the plants.
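
A minimal sketch of the voltage-to-melody translation, assuming the readings arrive as a list of voltage differentials sampled from the sensor; the pentatonic mapping, note lengths, and MIDI output via the mido package are illustrative choices, not the project's fixed design:

    import mido

    PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic: adjacent notes stay consonant

    def voltages_to_midi(voltages, out_path="plant_song.mid"):
        lo, hi = min(voltages), max(voltages)
        mid = mido.MidiFile()
        track = mido.MidiTrack()
        mid.tracks.append(track)
        for v in voltages:
            # Scale each voltage into the pentatonic scale so the result sounds pleasing.
            idx = int((v - lo) / (hi - lo + 1e-9) * (len(PENTATONIC) - 1))
            note = PENTATONIC[idx]
            track.append(mido.Message("note_on", note=note, velocity=80, time=0))
            track.append(mido.Message("note_off", note=note, velocity=80, time=240))
        mid.save(out_path)

    # Example with made-up voltage differentials in volts.
    voltages_to_midi([0.01, 0.03, 0.02, 0.05, 0.04, 0.02])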

Presentation with project overviews shown in the introduction course