How to detect a target object's offset from a reference point?


I want to know how to detect a target object's offset from a reference point in an image using Machine Learning.

For example, let's say I have five images (1, 2, 3, 4, 5). Each image contains a ball (the ball is the target object). Image-1 is the "reference image", meaning that the position of the ball in Image-1 is the reference point. In all the other images, the ball is offset from the reference point.

I want to detect and measure this offset using Machine Learning. I want something like the following:

  1. Image-1 = measure is 0.0
  2. Image-2 = measure is 0.1
  3. Image-3 = measure is 0.4
  4. Image-4 = measure is -0.3
  5. Image-5 = measure is -0.2
  6. etc.

By "measure" I mean some quantity that increases as the ball's offset increases and decreases as it decreases, with the sign of the measure indicating the direction of the offset. For example: if the ball is offset to the left of the reference point, the sign is minus ("-"), and if the ball is offset to the right, the sign is plus ("+").
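To make the desired output concrete, here is a minimal sketch of the measure itself, assuming some detector (e.g. template matching or an object-detection model, not shown here) has already located the ball's centre x-coordinate in each image. The function name, pixel coordinates, and image width below are all hypothetical, chosen only to reproduce the example numbers above:

```python
def signed_offset(x, x_ref, width):
    """Signed measure in [-1, 1]: negative means the ball is left of
    the reference point, positive means it is to the right."""
    return (x - x_ref) / width

# Hypothetical detected centres (pixels) in a 100-px-wide image;
# Image-1 is the reference, so its measure is 0.0.
x_ref = 50
centers = {1: 50, 2: 60, 3: 90, 4: 20, 5: 30}

measures = {img: signed_offset(x, x_ref, 100) for img, x in centers.items()}
print(measures)  # {1: 0.0, 2: 0.1, 3: 0.4, 4: -0.3, 5: -0.2}
```

The ML part of the problem then reduces to localising the ball (a standard object-detection or keypoint-regression task); once you have its coordinates, the measure is simple arithmetic like the above.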

Can anyone tell me how to achieve this, or anything close to it?

Please help.

Thanks in advance.