Issues with Pure Pursuit Path Generator in JavaScript

I’ve been working on implementing a pure pursuit algorithm using the method described in this whitepaper. I decided to create my own path generator using JavaScript, and I’m having a strange problem with the path smoothing function.

The function I wrote is identical to the one described in the whitepaper, and I didn’t suspect a problem with it until I looked at this version of the same path generator written by @AriMB in Python. If you generate a path using my program, the longer you make the path, the more the smoothing function diverges from the original path.


The straight lines are the desired path, the red trail is the smoothed path. You can see that it handles a simple 90 degree turn just fine, but if I add another point to the path and smooth it again, the smoothed path diverges from the desired path drastically.


However, if you plot a similar path on the program by AriMB, you get a more uniform result.


I’ve compared my smoothing function to his extensively; as far as I can tell, they should take in the exact same inputs and produce the exact same outputs. The only differences I see pertain to Python’s different syntax. This is my smoothing function:

var smoothPath = function(Path, A, B, tolerance) {
    var tempPath = Path;
    var change = tolerance;
    while (change >= tolerance) {
        change = 0.0;
        for (var i = 1; i < Path.length-1; i++) {
            for (var j = 0; j < Path[i].length-1; j++) {
                var aux = tempPath[i][j];
                tempPath[i][j] += A * (Path[i][j] - tempPath[i][j]) + B * (tempPath[i-1][j] + tempPath[i+1][j] - (2.0 * tempPath[i][j]));
                change += Math.abs(aux - tempPath[i][j]);
            }
        }
    }

    return tempPath;
};

And this is AriMB’s smoothing function:

while change >= tolerance:
    change = 0
    for i in range(0, len(total_waypoints)-1):
        for j in range(len(total_waypoints[i])):
            aux = smooth_waypoints[i][j]
            number += 1
            smooth_waypoints[i][j] += weight_data*(total_waypoints[i][j]-smooth_waypoints[i][j]) + \
                weight_smooth*(smooth_waypoints[i-1][j]+smooth_waypoints[i+1][j]-2*smooth_waypoints[i][j])
            change += abs(aux - smooth_waypoints[i][j])
            changes.append((total_waypoints[i][j]))

There are several input parameters to the smoothing algorithm (called A, B, and tolerance in my function) that I made sure to set to the same values used in the Python implementation. Yet my function still produces a very different output.

I’ve done a lot of debugging, and I’m fresh out of ideas. Is there something I’m missing here? Do the JavaScript math functions I’m using behave somehow differently than those used in Python?

Edit: I should probably mention that when running my function with the same inputs as AriMB’s, it generates basically a straight line because the tolerance value is so low. But again, I still don’t understand why the function behaves so differently.


It may not be the cause, but one difference: your i loop’s upper bound should be length - 2 to match the Python code.

The only difference that would make would be that it would ignore the last point on the path. I tested it with -2 just for spits and giggles though and it didn’t change the functionality.

How does the python code initialize smooth_waypoints? Does it copy the total_waypoints? One thing to look at is that Python and JavaScript may copy 2D arrays differently—in particular it can be easy to not end up with a deep copy of the inner arrays (or a copy at all… does your tempPath = path line make a copy or just create a second reference to the same path array?).
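A quick way to see the difference (the variable names mirror the smoothing function; the path data here is made up):

```javascript
// A quick check of what "var tempPath = Path;" actually does.
var Path = [[0, 0], [1, 1], [2, 0]];

var tempPath = Path;            // no copy: both names point at the same array
tempPath[1][0] = 99;
console.log(Path[1][0]);        // 99 — the "original" changed too

var shallow = Path.slice();     // copies the outer array only
shallow[1][0] = 42;
console.log(Path[1][0]);        // 42 — the inner arrays are still shared
```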

Also, the Python code saves a list of the changes made—is this used later for some purpose?

That was just left over from my debugging; it serves no other purpose.

Wow, that was it! Inserting the following code in place of var tempPath = Path; remedies the problem, and the smoothing function now performs as expected.

var tempPath = [];
for (var i = 0; i < Path.length; i++) {
    tempPath.push([Path[i][0], Path[i][1]]);
}
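For what it’s worth, the same copy can be written more compactly with Array.prototype.map — a sketch assuming the path is an array of [x, y] pairs (the sample waypoints here are made up):

```javascript
var Path = [[0, 0], [50, 0], [50, 50]];   // sample waypoints

// Copy the outer array AND each inner [x, y] pair.
var tempPath = Path.map(function (point) {
    return point.slice();
});

tempPath[0][0] = 99;
console.log(Path[0][0]);   // 0 — the original path is untouched
```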

Why is it that JavaScript would handle the copying of multidimensional arrays like this? It seems careless and obstructive to simply lose data when making a one-to-one copy.

Edit: Or maybe it doesn’t actually make a copy, and instead only makes a reference to the same array like you said. I’m fairly certain I ruled out this possibility when debugging, but I could be wrong.

Deep copying is costly; most languages do not do it by default. There’s no “losing data”; you re-use the same data in both arrays.


So for example if I say array1 = array2; then array1[0][3] = 4; now does array2[0][3] == 4? Is array1 simply a reference to array2?

It depends on what specifically you do, and what the array contains.

Arrays are reference types, so in your example, yes: after the assignment, array1 and array2 refer to the same object, and array2[0][3] == 4. It is not entirely correct to call array1 a reference to array2, though, as array2 is itself a reference (and array1 is not a reference-to-a-reference); both are references to the same underlying array.

But even if we actually do copy the array (say, through concat()), we are not necessarily doing what you would expect a copy to do, because the contents of the array might themselves be reference types rather than value types. If you naively copy the contents of one array into another using simple assignment, and those contents are reference types, then the same thing happens one level down: you end up with a new array (in that it is a separate chunk of memory) containing the same references as the first array. The actual objects have not been copied, and both arrays refer to the same set of objects (until you replace entries in one of them). This is called a “shallow copy.”
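To illustrate, using concat() as mentioned above (the array contents here are made up):

```javascript
// concat() with no arguments returns a new outer array,
// but the inner arrays are the same objects as before.
var a = [[1, 2], [3, 4]];
var b = a.concat();

console.log(a === b);           // false — the outer array was copied
console.log(a[0] === b[0]);     // true  — the inner arrays are shared
b[0][0] = 9;
console.log(a[0][0]);           // 9 — mutating through b is visible through a
```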

In order to obtain a “deep copy,” in which the entire tree of references and referred-to objects are copied and nothing is shared between the two arrays, you have to manually populate the new arrays with deep copies of the objects contained in the original array. This can be difficult to do, as those objects may themselves have fields that are reference-type rather than value-type, and so creating a deep copy in general requires recursively creating deep copies down the entire tree, which is extremely costly for complicated objects.
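A minimal sketch of such a recursive deep copy for nested arrays of plain values (it does not handle cycles, Dates, functions, or other object types, which a general deep copy must deal with):

```javascript
// Recursively deep-copy nested arrays of plain values.
function deepCopyArray(value) {
    if (!Array.isArray(value)) {
        return value;                 // plain value: copying is just assignment
    }
    return value.map(deepCopyArray);  // array: copy each element recursively
}

var original = [[1, [2, 3]], [4, 5]];
var copy = deepCopyArray(original);

copy[0][1][0] = 99;
console.log(original[0][1][0]);       // 2 — nothing is shared with the copy
```

Modern runtimes also provide structuredClone() for this kind of thing, but the recursive version shows what is actually going on.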
