Find minimal A[i]^2 + B[i]^2 when A and B are sorted arrays



It seems to me that the answer to your question is "no": a linear algorithm is the fastest.

Constructively, we can build sequences of any length whose maximum sits at an arbitrary position inside the sequence. The sequences of sums of squares built from them are unordered, so you can't use techniques like binary search, which require sorted sequences.

For example, if we have two arrays of length 7:

A: 5   8   10  11  12  14  15
B: 12  11  10  9   7   6   1

we get this array of sums of squares:

169  185  200  202  193  232  226

We can see that the maximum is 232, but there's no way to find it except by iterating over the whole array, because the sequence of sums of squares is unsorted and the maximum lies somewhere inside. (The same holds for the minimum the question asks about.)

Also, the fact that each array is individually sorted doesn't help: the second array can greatly change the total sums, so binary searching a single array in an attempt to bound the totals makes no sense.

I'm sure this problem is equivalent to finding the largest element in an unsorted array, for which a linear scan is the fastest solution.
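For completeness, here is the linear scan itself (a sketch; the function name is my own, and the arrays are the example above):

```python
def min_sum_of_squares(A, B):
    """Return (index, value) minimizing A[i]**2 + B[i]**2 by linear scan."""
    best_i = 0
    best = A[0] ** 2 + B[0] ** 2
    for i in range(1, len(A)):
        s = A[i] ** 2 + B[i] ** 2
        if s < best:
            best, best_i = s, i
    return best_i, best

A = [5, 8, 10, 11, 12, 14, 15]
B = [12, 11, 10, 9, 7, 6, 1]
print(min_sum_of_squares(A, B))  # (0, 169)
```

It makes exactly one pass and never relies on either array being sorted.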


Here is a simple adversarial argument that it's impossible to find the answer without looking at all the points. That is, the algorithm gives the adversary a possibly adaptive sequence of indexes to check, and the adversary fills in A and B so that the monotonicity constraint is preserved and the algorithm cannot get the right answer without querying every location.

Consider just the top right quadrant of the 2d plane, for some given N. The adversary will stack things on the unit quarter circle, equally spaced in polar coordinates, for all but the last query. Finally, the adversary will put the Nth item queried just barely inside the circle, so this last queried index is the correct answer. If there are fewer than N queries and the algorithm picks a queried location, the adversary insists that one remaining unqueried location is barely inside the circle, which means the right answer had distance less than 1, but the algorithm returned distance 1. If instead the algorithm picks an unqueried location, the adversary puts it just outside the circle.
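To make the geometry concrete, here is a sketch (my own illustration, not part of the argument itself) of the kind of instance the adversary can commit to: N-1 points on the unit quarter circle at equally spaced angles, and one secret point slightly inside it. Note that A comes out descending and B ascending, so both arrays are monotone, yet every sum of squares is 1 except at the secret index.

```python
import math

def adversary_instance(n, secret, eps=1e-6):
    """Build A (strictly descending) and B (strictly ascending) with all
    points on the unit quarter circle except index `secret`, which sits
    slightly inside it. `eps` must be tiny relative to the angular spacing
    so the perturbation does not break monotonicity."""
    A, B = [], []
    for i in range(n):
        theta = (i + 1) * math.pi / (2 * (n + 1))  # strictly increasing angles
        r = 1.0 - eps if i == secret else 1.0
        A.append(r * math.cos(theta))  # cos of increasing angle: descending
        B.append(r * math.sin(theta))  # sin of increasing angle: ascending
    return A, B

A, B = adversary_instance(7, secret=3)
sums = [a * a + b * b for a, b in zip(A, B)]
# every sum is 1 (up to float error) except index 3, which is (1 - eps)**2 < 1
```

Since all unperturbed sums are identical, no comparison between queried points reveals anything about where the secret index is.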


To formalize Edgar's intuition into an actual cell-probe lower bound, consider defining, for i in [1, n] and some j unknown to the algorithm:

X[i] = 2 * (n**2 + i)
Y[i] = 2 * (n**2 - i)
A[i] = X[i] + 1 if i == j
       X[i]     if i != j
B[i] = Y[i] + 1 if i == j
       Y[i]     if i != j

We do some analysis on A[i]**2 + B[i]**2 assuming i != j.

A[i]**2 + B[i]**2 == 4 * n**4 + 8 * n**2 * i + 4 * i**2
                   + 4 * n**4 - 8 * n**2 * i + 4 * i**2
                  == 8 * n**4 + 8 * i**2
                  <= 8 * n**4 + 8 * n**2

Now assuming i == j.

A[i]**2 + B[i]**2 == 8 * n**4 + 8 * n**2 + 8 * i**2 + 2
                  >= 8 * n**4 + 8 * n**2 + 2

The latter is always greater than the former. (Argh, we wanted a minimum, not a maximum. The idea should be basically the same -- just decrease by one instead of increase by one -- but the algebra is slightly different.)

The family of array pairs with j varying looks the same except at j, so the algorithm must examine half of the indexes on average to determine j, which for this family is equivalent to discovering the correct output.
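As a sanity check (my own sketch, not part of the answer above), the construction can be verified numerically: both arrays stay strictly monotone, and the maximum of the sums of squares always lands at the hidden index j.

```python
def hard_instance(n, j):
    """Build A and B as defined above (1-based i mapped to 0-based lists)."""
    X = [2 * (n**2 + i) for i in range(1, n + 1)]
    Y = [2 * (n**2 - i) for i in range(1, n + 1)]
    A = [x + 1 if i == j else x for i, x in enumerate(X, start=1)]
    B = [y + 1 if i == j else y for i, y in enumerate(Y, start=1)]
    return A, B

n, j = 10, 4
A, B = hard_instance(n, j)
sums = [a * a + b * b for a, b in zip(A, B)]
assert all(A[k] < A[k + 1] for k in range(n - 1))     # A strictly ascending
assert all(B[k] > B[k + 1] for k in range(n - 1))     # B strictly descending
assert max(range(n), key=lambda k: sums[k]) == j - 1  # maximum sits at index j
```

Per the parenthetical note above, subtracting 1 at index j instead of adding it would give the minimum variant, with slightly different algebra.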