Non-linear Regression (for Time Difference of Arrival)

I’m working on a project involving the use of Time Difference of Arrival techniques (see: https://sites.tufts.edu/eeseniordesignhandbook/files/2017/05/FireBrick_OKeefe_F1.pdf, e.g.) to localize a signal source.

Working in two dimensions, and given the locations of 3 receivers and the times at which they received some signal (let’s say, a light signal), the following process is used:

Once the signal is received at two reference points, the difference in arrival time can be used to calculate the difference in distances between the source and the two reference points. We use the following equation:

∆d = c ∗ (∆t) [1]

Where ‘c’ is, of course, the speed of light, and ∆t is the difference between the arrival times at the two reference points. In two dimensions, this yields the following equation:

∆d = √((x_2 - x)^2 + (y_2 - y)^2) - √((x_1 - x)^2 + (y_1 - y)^2) [2]

Where (x_1,y_1) and (x_2,y_2) are the known positions of the receivers, and (x,y) is the unknown position of the source.
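(For concreteness, here is a quick standalone check of [2] with made-up coordinates; the two square roots are just the Euclidean distances from a candidate source position to each receiver:)

```cpp
// Quick numeric check of eq. [2] with made-up coordinates: range difference
// seen by two receivers for a hypothetical source at (x, y).
#include <cmath>
#include <cstdio>

void rangeDiff()
{
   const double x1 = 0.0,   y1 = 0.0;   // receiver 1 (example)
   const double x2 = 100.0, y2 = 0.0;   // receiver 2 (example)
   const double x  = 30.0,  y  = 40.0;  // assumed source position

   const double d1 = std::hypot(x1 - x, y1 - y);
   const double d2 = std::hypot(x2 - x, y2 - y);
   printf("d2 - d1 = %.3f  (this is what c * dt should equal)\n", d2 - d1);
}
```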

Now, using nonlinear regression, this equation can be converted to the form of a hyperbola. Once enough hyperbolas have been calculated, the position of the source is estimated by finding their intersection.

So, my question is: is there a ROOT method to use non-linear regression to “convert” [2] to the form of a hyperbola, do so for all 3 possible receiver pairings, and then determine the intersection of the 3 hyperbolas?
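For context, here is a minimal sketch (not an official ROOT recipe) of how such a localization could be set up as a least-squares problem with ROOT’s Minuit2 minimizer, treating the intersection-finding as a chi-square minimization over the candidate source position rather than an explicit hyperbola fit. The receiver positions and the “true” source used to generate the arrival times are made-up example values:

```cpp
// Hypothetical sketch: localize a 2D source from the three pairwise range
// differences by least squares, using ROOT's Minuit2 minimizer.
#include "Math/Factory.h"
#include "Math/Functor.h"
#include "Math/Minimizer.h"
#include <cmath>
#include <cstdio>

namespace {
   const double kC = 299792458.0;              // speed of light [m/s]
   const double rx[3] = {0.0, 100.0, 0.0};     // receiver x [m] (example)
   const double ry[3] = {0.0, 0.0, 100.0};     // receiver y [m] (example)
   double tArr[3];                             // arrival times [s], filled below

   // Sum of squared residuals between measured and predicted range differences
   double Chi2(const double *p)
   {
      const double x = p[0], y = p[1];
      double sum = 0.0;
      for (int i = 0; i < 3; ++i) {
         for (int j = i + 1; j < 3; ++j) {
            const double dMeas = kC * (tArr[j] - tArr[i]);      // eq. [1]
            const double di = std::hypot(rx[i] - x, ry[i] - y);
            const double dj = std::hypot(rx[j] - x, ry[j] - y);
            const double r = dMeas - (dj - di);                 // eq. [2]
            sum += r * r;
         }
      }
      return sum;
   }
}

void tdoaFit()
{
   // Generate arrival times from an assumed "true" source at (30, 40)
   const double xTrue = 30.0, yTrue = 40.0;
   for (int i = 0; i < 3; ++i)
      tArr[i] = std::hypot(rx[i] - xTrue, ry[i] - yTrue) / kC;

   ROOT::Math::Minimizer *min =
      ROOT::Math::Factory::CreateMinimizer("Minuit2", "Migrad");
   ROOT::Math::Functor f(&Chi2, 2);
   min->SetFunction(f);
   min->SetVariable(0, "x", 50.0, 0.1);   // initial guess and step size
   min->SetVariable(1, "y", 50.0, 0.1);
   min->Minimize();

   const double *xs = min->X();
   printf("Estimated source position: (%.2f, %.2f)\n", xs[0], xs[1]);
}
```

(Run as a macro, e.g. `root -l tdoaFit.C` and call `tdoaFit()`; it should recover something close to the assumed (30, 40). With only three receivers and no noise there can be a second, mirror-image solution, so the result depends on the initial guess.)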



ROOT Version: Not Provided
Platform: Not Provided
Compiler: Not Provided


Maybe @moneta can answer your question.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.