Touch is a zero-distance sensory mode: we can see without being seen, but we cannot touch without being touched. When catching an object, we feel it through tactile perception and adjust our grasp accordingly. A common test for robotic arms consists of filling a glass with water by pouring it from a bottle. Why is such a simple task a challenge even for the most advanced technologies? As we pour the water, both the weight of the bottle and its balance point change, so we must continuously reprocess the tactile information provided by the hand and re-calibrate the force applied by each finger. Re-creating these dynamics in a robotic arm is very difficult, since it would require an exact replication of tactile perception: a goal that has not yet been achieved, even though recent studies report significant advances in this direction.
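The re-calibration described above can be illustrated with a toy model (not from the source): if the fingers hold the bottle by friction alone, the minimum squeeze force must scale with the remaining weight, so it has to be updated as water leaves the bottle. All numbers and the friction coefficient below are hypothetical.

```python
# Toy model: minimum grip force for a friction-held bottle.
# Assumes friction must support the full weight: mu * F_grip >= m * g,
# hence F_grip >= m * g / mu. Masses and mu are illustrative values only.

G = 9.81   # gravitational acceleration, m/s^2
MU = 0.5   # assumed finger-bottle friction coefficient

def min_grip_force(bottle_mass_kg, water_mass_kg, mu=MU):
    """Minimum normal force the fingers must apply so that
    friction (mu * F) balances the total weight."""
    total_mass = bottle_mass_kg + water_mass_kg
    return total_mass * G / mu

# As water is poured out, the required grip force drops,
# so a controller must keep re-calibrating it.
for water in (0.50, 0.25, 0.0):        # remaining water, kg
    f = min_grip_force(0.1, water)     # 100 g empty bottle
    print(f"water={water:.2f} kg -> grip force >= {f:.2f} N")
```

In a real hand this update happens continuously and automatically from tactile feedback; the sketch only shows why a fixed, pre-computed grip force cannot work for this task.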