Dr. Kshitij Tiwari
Post-Doctoral Research Scholar
Intelligent Robotics Group, Aalto University
Ask anyone what a robot is, and the very first image that crosses their mind will be a tin "man" with a shotgun (idea courtesy: Terminator). However, a robot is much more than a tin man. It does not even have to look like a "man" to begin with. For instance, Sophia, the first robot citizen, is not a "man". Should we then simply adjust the definition to tin "person"? Obviously not. A robot should be seen as a complex piece of hardware with the ability to carry out complex tasks. Depending on the tasks to be performed, a robot can look like a "man" (in precise terms, a humanoid), a box with wheels (in other words, a ground robot), a piece of thermocol with blades (better known as a drone), and so on. The interesting fact, though, is that a robot by itself needs a human "master" to tele-operate it, just as is done with conventional heavy construction machinery (cranes, bulldozers, etc.). That, to some extent, is not very appealing. What is appealing, however, is a self-driving car. So, how did we go from tele-operated robots to self-driving cars? Well, we gave the robots the key piece of the puzzle: the brain, a.k.a. Artificial Intelligence. Backed by software and driven by hardware, what we get is essentially a robot that comes in all shapes and sizes depending on the application.
So, that was about robots, but giving the Terminator a "brain" sounds like a recipe for a Robot Apocalypse: robots taking over and destroying the infidels (as they would see us). Is that where we are headed? Thanks to science-fiction movies, people perceive robots with brains and emotions as superior entities that would take over the world, but little do they know about the three basic laws of robotics. Isaac Asimov, an American sci-fi writer and professor of biochemistry, devised the following three operational rules for all robots:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Assuming it is 2050 and we have done significant research to attain human-level cognition for robots, as long as these three laws remain intact, the robot apocalypse is far from happening. If only people knew better, they would realize that, in good hands, robots are meant to take over the mundane tasks so that humans can invest their time fruitfully in activities not meant for robots, like eating pizza and playing with dolphins. Robots are not here to take our jobs and get us fired!
So that was me breaking some stereotypes and addressing some of the ethical questions I always get asked about robotics. Hi! My name is Kshitij and I am a roboticist. I design both hardware and software for robots, so theoretically I can be your one-stop shop for all robotic queries. Welcome to my website, where you will find details about my journey to adulthood, interests, educational background, and the like. Summarized below are my key skills and core research areas. Feel free to scroll along.
- Raspberry Pi
- Circuit Design
- PCB Etching
- DIY Robots
- MS Office
The key research areas I am interested in are listed below:
- Path planning under uncertainty
- Cooperative Navigation
- Long-term Autonomy
- Distributed/Decentralized Robotics
- Multi-Robot SLAM
- Applied Machine Learning
- Search-and-Rescue Robotics