An Investigation of Accuracy and Calibration of the VLP-16 Puck LIDAR

VLP-16 Marketing Image

A quick note: This article was originally written in February 2018, so some of the info herein, such as pricing and current delivered product quality, is likely out of date by now.

LIDAR, once the domain of ultra-high-budget industrial and research projects, has permeated cheaper markets in the last two years, primarily through the introduction of ever-cheaper sensors by Velodyne, namely the VLP-16 Puck. A sensor technology that was once ~$50k/unit, then ~$20k, then $8k with the VLP-16, is now, after a recent price drop (Jan 2018), $4k/unit, and sure only to get cheaper. It’s this precipitous price drop that recently led me to a client project with as many as 8 sensors in our fleet.

Having so many sensors on hand allows one to make interesting observations about comparative performance. During development, we noticed that the point clouds returned from two separate sensors, mounted along different axes, were both capturing the same wall, but in different locations within the system reference frame. Specifically, a vertically oriented sensor (scanning floor-wall-ceiling-wall in its sweep) was placing the floor a few degrees out of alignment relative to a horizontally mounted sensor (scanning wall-wall-wall-wall). It was only by luck that we happened to be scanning in a way that included the floor in both point clouds so as to reveal the discrepancy. After digging, I determined that the yaw angle reported by the vertically mounted sensor was actually a few degrees out from reality. That was lucky as well – had the horizontally mounted sensor been the erroneous one, the error wouldn’t have been as apparent in the point cloud view.

This set me down a path to quantify the angular error of the faulty sensor and determine if others in our pool suffered similar issues. Yes, as it turns out.

Designing a Measurement Jig

sketch of lidar alignment

Since I was trying to determine how far the reported point cloud yaw angle was offset from ground truth, I had to start with a known ground truth. My original idea was to align the expected 0° yaw angle (perpendicular to the line between the mounting alignment pins) to some reasonably far-off feature like a telephone pole or something.

The general idea stuck, but I realized something much better would be the interior corner of a room: two flat, straight walls meeting at a vertical corner make a great feature to align on in the point cloud, and an arbitrarily small feature to align my jig to in the physical world. Plus, rooms tend to have flat, level surfaces to position a LIDAR on to make things easy in post.

Next was figuring out how to precisely point the LIDAR. From the beginning, I assumed I’d 3D print a base plate to sit the LIDAR on, with precisely toleranced alignment pins to clock the puck. For pointing THAT plate precisely at the target feature, I figured I’d use a laser. But even if you get a nicely packaged laser diode, they often don’t fire perfectly in-axis so some alignment correction is required. Then I thought about a class of laser that’s designed to be easily aligned coaxially with an existing barrel, and fire perfectly straight out of it – laser boresighters for firearms.

A cheap crappy laser boresighter from Amazon

These range from very expensive and very accurate to dirt cheap and medium-accurate. But what’s cool about a coaxially firing laser is that, as long as the source itself is coaxial to the shaft, its error lies along a single direction that rotates with the barrel. That is, you can roll it until it’s only off in the vertical or horizontal axis.

Aligning the Boresighter

What we need to do, then, is make sure the laser is only off in pitch, and has 0° yaw error. We can find this point by rotating the boresighter on a flat surface (so the barrel doesn’t change yaw angle, only rotation/roll) while watching the dot on a far wall bounce up and down. When it’s at the top or bottom of its bounce, we know there’s no left-right error. We can then mark the top of the barrel as 0° rotation – with this mark perfectly up or down, the laser fires perfectly straight in the horizontal plane (which is all we need).
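The geometry behind this trick can be sketched in a few lines. Assuming the beam leaves the bore at some small cone angle (the numbers below are made up for illustration), the dot traces a circle on a far wall as the boresighter rolls, and its horizontal error passes through zero exactly when its vertical error is at an extremum:

```python
import math

# Hypothetical numbers: beam misaligned 0.2 deg from the bore axis, wall 10 m away.
cone_deg = 0.2
wall_m = 10.0
radius = wall_m * math.tan(math.radians(cone_deg))  # radius of the dot's circle on the wall

def dot_offset(roll_deg):
    """(horizontal, vertical) offset of the laser dot for a given roll angle,
    with roll = 0 defined as the top of the bounce."""
    r = math.radians(roll_deg)
    return radius * math.sin(r), radius * math.cos(r)

# At the top of the bounce, horizontal error vanishes:
h, v = dot_offset(0.0)
print(f"horizontal: {h * 1000:.2f} mm, vertical: {v * 1000:.1f} mm")  # ≈ 0 mm, ≈ 34.9 mm
```

At any other roll angle the dot carries some left-right error, which is why marking the barrel at the extremum matters.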

Marked boresighter
My boresighter, marked with a paint pen at the ‘top’

Drawing and Printing The Base

So then all we need to do is CAD up a base plate that the LIDAR can bolt to, with alignment pins and a hole with an interference fit for the boresighter through the axis of the LIDAR at 0° yaw. This is what I came up with:

Alignment plate CAD screenshot

You can screw the LIDAR in place with a short 1/4″-20 screw, or just don’t – it’s pretty stable either way. I added some rubber feet to the corners of this one so it doesn’t slide around much, which I do recommend so your target doesn’t drift.

Calibration Procedure

  1. If you’ve got a VLP-16, you’ve no doubt tried out VeloView. Fire that up now, and get to the point where you’ve got a live point cloud from your LIDAR. Configure your view so you’re looking top-down (-Z, I believe) with an orthographic rather than isometric projection (the box icon, top left).
  2. Position the boresighter in the hole as above, with the “up” mark up, and aim the whole assembly at a far-off, clean corner in your test room. The farther away the better, since a small angular misalignment will show up as a larger linear displacement the farther away you are. Approach the corner to verify the laser dot is dead-on the crease, as accurately as you can get it. It might help to have an assistant if your corner is far away or if the lighting is tough.
  3. In VeloView, the corner you’re pointed at should be due-north in the window, on x=0. If you zoom in on that point, you may find that your two walls don’t actually converge on that line; that there’s actually an angular offset. That’s what I found, at first.
  4. At this point, you basically have to guess-and-check at the angular displacement. If your corner is far enough away, you can use the linear measurement tools to calculate the linear displacement of your corner from the x=0 line, and do some trig to determine the angle. But you’ll still want to verify by changing the yaw. Click the “Sensor Stream” button (the icon looks like a VLP-32) and check “Advanced Configuration.” Here, you can adjust the yaw value of the laser’s position to determine how far off your internal calibration is. Increase the positive or negative value until your corner perfectly lines up on the x=0 gridline. (Sorry, I don’t have any screenshots of this part).
Aligning the calibration jig on a corner
Aligning the jig on a corner feature
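The trig in step 4 is a one-liner: the yaw error is the arctangent of the corner's lateral displacement off the x=0 line over its range. A quick sketch with made-up numbers:

```python
import math

def yaw_error_deg(lateral_offset_m, range_m):
    """Yaw error implied by a corner sitting off the x=0 gridline in the top-down view."""
    return math.degrees(math.atan2(lateral_offset_m, range_m))

# Hypothetical measurement: corner 12 m away, displaced 0.70 m from x=0.
print(f"{yaw_error_deg(0.70, 12.0):.2f} deg")  # ≈ 3.34 deg
```

This also makes clear why a far-off corner helps: at 12 m, a 0.1° error is only ~2 cm of displacement, right at the edge of the sensor's range noise.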

You can download the calibration plate file on Thingiverse, here.

Results

After calibrating our fleet of 8 VLP-16’s, I found the following azimuth errors:

LIDAR # | Azimuth Correction (deg)
1 | 0°
2 | -3.35°
3 | -0.75°
4 |
5 | +0.95°
6 | -0.74°
7 | +0.3°
8 | -0.34°
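Once a per-sensor correction like those above is known, applying it downstream is just a rotation of each point about the sensor's z-axis. A minimal sketch of the equivalent math (VeloView applies it internally via the yaw field instead):

```python
import math

def apply_yaw_correction(points, correction_deg):
    """Rotate (x, y, z) points about the z-axis by the measured azimuth correction."""
    c = math.cos(math.radians(correction_deg))
    s = math.sin(math.radians(correction_deg))
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

# Example: compensate sensor #2's -3.35 deg error by rotating its cloud +3.35 deg.
cloud = [(10.0, 0.0, 1.2), (0.0, 5.0, -0.3)]
corrected = apply_yaw_correction(cloud, 3.35)
```

Since the error is purely an azimuth offset, z is untouched and ranges are preserved; only the bearing of each point changes.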

Of course, the unit we originally caught was #2 above, with the worst angular error. It’s tough to say whether we would have noticed, or undertaken this verification process at all, had the worst error been the 0.95° of the next-worst sensor; it’s also hard to say whether an error that size would have impacted our SLAM loop closure and gathered data.

All of the errors above were measured at a rotational speed of 600 RPM. In the interest of being thorough, I tested a few sensors at 1200 RPM (the VLP-16’s max rotational speed), but found the angular errors to be identical. My hypothesis is that it’s some kind of un-nulled timing error relative to the sensor’s 0° index.
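As a quick sanity check on that hypothesis, it's easy to compute the time offset a given azimuth error would imply at each spin rate (sensor #2's numbers below, purely for illustration):

```python
def implied_delay_ms(error_deg, rpm):
    """Fixed time offset that would produce the given azimuth error at a spin rate."""
    deg_per_second = rpm / 60.0 * 360.0
    return error_deg / deg_per_second * 1000.0

# Sensor #2's 3.35 deg error, interpreted as a fixed time delay:
print(implied_delay_ms(3.35, 600))   # ≈ 0.93 ms at 600 RPM
print(implied_delay_ms(3.35, 1200))  # ≈ 0.47 ms at 1200 RPM
# A literal fixed delay would produce twice the angular error at 1200 RPM, so an
# error that stays identical across speeds behaves like a constant angular offset
# baked into the azimuth calibration rather than a constant time offset.
```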

I also wanted to compare angular error to distance error, and did so for the first 5 sensors in our fleet. Distance error was taken as the difference between the LIDAR measurement in VeloView and a laser tape measure good to ±2mm, positioned at the rotational center of the LIDAR:

LIDAR # | Distance error (cm)
1 | 2
2 | 2.5
3 | 0.8
4 | 0.3
5 | 0.2

Notably, the newer sensors in our fleet seem to be noticeably more accurate, and the sensor with worst angular error also has the greatest distance error. However, all are comfortably within the datasheet spec of ±3cm.

Conclusions

I haven’t personally been involved with any further LIDAR acquisitions or testing since this study was conducted around February 2018, so it’s difficult to say whether quality has improved. I did discuss this with Velodyne support at the time, so I can only assume in good faith that whatever caused the error in production has since been mitigated. Even within the data gathered here, newer sensors show lower error in both distance AND azimuth angle. In other words, it’s probably safe to say we got bit on sensor #2 by early production quality issues as Velodyne was rapidly ramping up their production line. It’s further worth pointing out that sensor #2 was purchased used. I find it difficult to ascribe the increased error to previous-owner neglect when factory-fresh units also exhibited their own measurably substantial error, but I can’t rule out that handling might exacerbate the underlying issue.
