
R3plica

Robots In The Future Of Cleaning

Robotics, Service Robots · 5 min read

Pratik Pradhane

Meet Pratik Pradhane

Senior Robotics Engineer @ Peppermint Robots

Pune, India 

Pratik Pradhane works as a Senior Robotics Engineer at Aubotz Labs Pvt. Ltd., based in Pune, India. He works on software development for autonomous floor-cleaning robots. He pursued his Master's in Mechatronics at Vellore Institute of Technology, Vellore, Tamil Nadu.

Pratik started his career with TATA Automation Limited, where he worked on various robotics technologies, including AGVs, robotic arms, and 2D and 3D image processing, to develop innovative solutions for complex problems.


What inspired you to pursue a career in robotics?

I was always fascinated by this field. Even as a child, I was curious to see what was inside toys like remote-controlled cars: how the motors made the toys move and how they received signals from the wireless remote to steer.

The world is transforming rapidly with technological developments. With advancements in robotics in particular, people's lives are getting simpler and simpler as robots take over dull and dirty jobs.

More and more people will be able to concentrate on skilled jobs. In recent years we have seen tremendous developments in electronics, with processors becoming more powerful and more cost-effective.

Also, advancements in open-source robotics software platforms like ROS have enabled many more people to pursue their interest in robotics, which has led to an ever-growing robotics community.

Being part of this journey of transformation has always been my inspiration to pursue a career in robotics.

Can you explain the differences between service robots and industrial robots?

Industrial robots such as robotic arms are programmed to do repetitive tasks like pick and place (to fixed programmed positions), welding, painting, palletizing, and many more. The processes in industry are well defined, with almost no unknown conditions for the robot to face while carrying out these operations.

Service robots, on the other hand, have to deal with changing environments and dynamic objects, such as people moving around, while carrying out simple tasks like floor cleaning in public areas such as malls and airports.

This demands intelligence in the system to recognize dynamic objects, make the robot interact with its environment, and decide whether to stop or take another course of action, considering the safety norms for the environment and the people in it.
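As a rough illustration of that decision logic, here is a minimal, hypothetical Python sketch of the stop-or-replan choice a service robot might make when a dynamic obstacle such as a person appears on its path. The distances, thresholds, and names are illustrative assumptions, not actual safety norms or code from any product.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # distance from the robot along its planned path
    is_dynamic: bool    # True for people, carts, and other moving objects

def decide_action(obstacle: Obstacle,
                  stop_distance_m: float = 0.5,
                  replan_distance_m: float = 2.0) -> str:
    """Return 'stop', 'replan', or 'continue' for one detected obstacle."""
    if obstacle.distance_m <= stop_distance_m:
        return "stop"      # too close: halt immediately for safety
    if obstacle.is_dynamic and obstacle.distance_m <= replan_distance_m:
        return "replan"    # a person ahead: plan a path around them
    return "continue"      # far away or static: keep following the current plan

print(decide_action(Obstacle(distance_m=1.2, is_dynamic=True)))  # -> replan
```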

Can you describe a challenge you have faced working with service robots?

The performance of service robots depends on the taxonomy of the sensors and the programming. Various sensors play an important role in mobile robots, such as [LiDAR](https://en.wikipedia.org/wiki/Lidar#:~:text=Lidar%20(%2F%CB%88la%C9%AAd,D%20representations%20of%20the%20target.), radar, depth cameras, IMUs, GPS, etc., and all of them have their own capabilities and limitations.

Choosing the right sensor taxonomy for our use cases and utilizing the maximum of its capabilities through programming plays a crucial role in service robotics.

Do you have a robotics project you are working on and want to share your experience with us?

Yes, I am working on autonomous floor-cleaning robots, which involves mapping the area and path planning to cover as much of it as possible.
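As an illustration of coverage path planning, here is a minimal sketch of a boustrophedon ("lawn-mower") sweep over a 2D occupancy grid. The grid, cell size, and sweep pattern are simplifying assumptions for illustration only; real coverage planners also handle cell decomposition, the robot's footprint, and obstacles discovered while cleaning.

```python
import numpy as np

def boustrophedon_coverage(grid: np.ndarray):
    """Yield (row, col) cells in a back-and-forth sweep, skipping occupied cells.

    grid: 2D array where 0 = free and 1 = occupied.
    """
    rows, cols = grid.shape
    for r in range(rows):
        # Alternate sweep direction on each row to minimise turning.
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cs:
            if grid[r, c] == 0:
                yield (r, c)

# Tiny example map: a 4x5 room with one occupied cell.
room = np.zeros((4, 5), dtype=int)
room[1, 2] = 1
path = list(boustrophedon_coverage(room))
print(len(path), "cells to cover, starting at", path[0])
```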

I work with SLAM algorithms, to make them more robust in various kinds of environments such as crowded areas, and with sensor data fusion to get the best out of the sensors. It has always been amazing to see the output of our code in the physical world.
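As a simplified example of sensor data fusion (not the actual fusion stack used on the robots, which would more likely be an extended Kalman filter such as ROS's robot_localization), here is a sketch of a complementary filter that blends a fast but drifting gyro yaw rate with a slower, noisier absolute heading estimate, e.g. from laser scan matching.

```python
import math

def fuse_heading(prev_heading, gyro_rate, abs_heading, dt, alpha=0.98):
    """Blend an integrated gyro yaw rate with an absolute heading measurement (radians)."""
    predicted = prev_heading + gyro_rate * dt        # fast update, but drifts over time
    # Wrap the correction so we blend along the shortest angular distance.
    error = math.atan2(math.sin(abs_heading - predicted),
                       math.cos(abs_heading - predicted))
    return predicted + (1.0 - alpha) * error         # slow correction removes the drift

heading = 0.0
for _ in range(100):
    heading = fuse_heading(heading, gyro_rate=0.1, abs_heading=0.5, dt=0.02)
print(round(heading, 3))
```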

It’s quite amazing to stay in continuous learning mode, to grasp state-of-the-art algorithms and sensors as the technology evolves, and to keep developing new features for the robots.

Can you explain how you see the evolution of service robots over the next 10 years?

The growth of any technology depends on many factors and priorities: its necessity in society, its demand in the market, its cost-effectiveness, its reliability, etc. Considering many of these factors, service robotics is sure to grow in the market.

The recent COVID-19 pandemic has taught us a lot of things. We could also see that service robots such as food and medicine delivery robots, UV-based sanitizing mobile robots, and quadcopters for monitoring stay-at-home rules have been of great help in many places globally.

Apart from these, service robots have quite a wide range of applications in the domestic and professional segments specified in the IFR report.

It is also being observed that the line between industrial robots and service robots is blurring, as some robots used in industry could also be used as service robots merely by changing the end application.

For example, some cobot arms are used for preparing and serving coffee or drinks. Moreover, the service robot market is projected to have a CAGR of between 35 and 45% over roughly the next three years, as mentioned by the IFR.

It is not difficult to predict that, with ever-increasing innovation in this field, we could see a good number of service robots in our vicinity within the next 10 years, doing a variety of tasks like outdoor garbage collection, cleaning, gardening, residential security, medical assistance, parcel delivery, etc.

What is the importance of Artificial Intelligence in robotics?

Artificial intelligence plays a crucial role in robotics when it comes to making robots interact more and more with their environment and with humans.

The more a robot understands about its environment, the smarter its decision-making. AI, with the help of sensor data and algorithms, enables the robot to understand the environment in a better way.

AI does justice to the sensors: it makes the most of the data they provide by using it to train neural networks and extract more from it.

This has allowed robots to interpret the details of their surroundings. For example, because of AI algorithms, robots can interpret what a person wants to say; good examples are Google Home, Alexa, etc.

Also, robots can recognize many different objects around them by processing data received from 2D and 3D cameras; the latest example of such algorithms is YOLOv5.
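For instance, here is a minimal sketch of running YOLOv5 on a single camera frame via PyTorch Hub, following the publicly documented ultralytics/yolov5 usage; the image filename is a placeholder, and downloading the pretrained model requires an internet connection.

```python
import torch

# Load the small pretrained YOLOv5 model from the Ultralytics hub.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

# Run inference on one camera frame (any image path or URL works here).
results = model("frame_from_robot_camera.jpg")  # placeholder image path

results.print()                         # summary of detected classes and confidences
detections = results.pandas().xyxy[0]   # bounding boxes as a pandas DataFrame
print(detections[["name", "confidence"]])
```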

In short, AI has enabled robots to understand what they hear and to know what they see.

Can you explain, with a brief example, how machine learning and image processing work?

Machine learning, in simple words, consists of algorithms that learn from the data fed to them; after being trained, they can predict outputs based on the given inputs. When implemented with image processing, the algorithm learns from a set of images fed to it. For example:

If we want an algorithm to recognize whether an animal is a dog or a cat, we would have to feed it a thousand images of cats and a thousand images of dogs.

We feed these images to the algorithm, which is nothing but a deep neural network used for training on image data.

This neural network consists of layers of neurons. The images pass through these layers, and the layers learn features from the images.

An image of a dog will have different features than an image of a cat: the shapes of the body and the face differ, and there are many other features to learn as well. The trained neural network is then able to identify dogs and cats based on the data it has been trained on.

So, to identify more kinds of animals, we need to build and train the neural network with many more images of all those animals.
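To make the cat-vs-dog example concrete, here is a minimal Keras sketch of such a network trained on a folder of labelled images. The directory layout (data/train/cat and data/train/dog), the image size, and the layer sizes are illustrative assumptions; a real classifier would need far more data, augmentation, and tuning.

```python
import tensorflow as tf

# Each subfolder name under data/train ("cat", "dog") becomes the label for
# the images inside it. Paths and sizes are placeholders for illustration.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(128, 128), batch_size=32, label_mode="binary")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 3)),
    tf.keras.layers.Rescaling(1.0 / 255),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),  # early layers learn edges
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),  # deeper layers learn shapes
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),    # probability the image is a dog
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)  # the network learns cat/dog features from the images
```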