This seems to be quite an interesting problem from IOITC 2013.
Problem in Short: You are given N distinct points on a plane, with distances measured in the Manhattan metric. Find the number of ordered triplets of distinct points (A, B, C) such that the distance from A to B plus the distance from B to C equals the distance from A to C.
I have been trying this problem for quite some time but have been unable to come up with a solution that passes the following constraints: N <= 10^5 and |xi| <= 10^9, |yi| <= 10^9.
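Just to pin down exactly what is being counted, here is a minimal brute-force sketch (my own illustration, not a candidate solution): it simply checks dist(A, B) + dist(B, C) == dist(A, C) for every ordered triplet, which is O(N^3) and obviously far too slow for N up to 10^5.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<long long> x(n), y(n);
    for (int i = 0; i < n; ++i) cin >> x[i] >> y[i];

    // Manhattan distance between points i and j
    auto dist = [&](int i, int j) {
        return llabs(x[i] - x[j]) + llabs(y[i] - y[j]);
    };

    long long count = 0;
    // Check every ordered triplet of distinct indices (A, B, C)
    for (int a = 0; a < n; ++a)
        for (int b = 0; b < n; ++b)
            for (int c = 0; c < n; ++c) {
                if (a == b || b == c || a == c) continue;
                if (dist(a, b) + dist(b, c) == dist(a, c)) ++count;
            }

    cout << count << "\n";
}
```

This is only to make the counted quantity unambiguous; the question is how to count the same thing fast enough for the stated limits.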
Some hints to get me unstuck would be highly appreciated. Thank you.