The Final Solution

This is the 8th and final post on my paper “A Method for Determining and Improving the Horizontal Accuracy of Geospatial Features.” Other posts on this topic:

7. Determining the Spatial Accuracy of Polygons Using Buffer Overlay

In the last post on the subject we described how Buffer Overlay can be used to determine the horizontal accuracy of polygon features. We ran Buffer Overlay on a single Township/Range where we had 259 permits to compare against parcels. This resulted in 259 cumulative probability (CP) curves, one for each permit. In the sample below, 21 of these curves are shown. For each curve, the buffer distance at which the CP reaches 1 is the horizontal accuracy of the permit; if the curve never crosses a CP of 1, the accuracy could not be determined.
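For readers who want to experiment with this step, here is a minimal sketch of how a per-permit CP curve and the resulting accuracy could be computed with shapely. The `permit` polygon, `parcel_lines` control geometry, and buffer distances are hypothetical inputs, not the exact tooling or values used in the paper.

```python
from shapely.geometry import Polygon, MultiLineString

def cp_curve(permit: Polygon, parcel_lines: MultiLineString, distances):
    """Fraction of the permit boundary captured by each control (parcel) buffer."""
    boundary = permit.boundary
    total = boundary.length
    return [(d, boundary.intersection(parcel_lines.buffer(d)).length / total)
            for d in distances]

def horizontal_accuracy(curve, tol=1e-6):
    """Smallest buffer distance whose CP reaches 1 (None if the curve never does)."""
    for d, cp in curve:
        if cp >= 1.0 - tol:
            return d
    return None

# Example (hypothetical buffer steps):
# accuracy = horizontal_accuracy(
#     cp_curve(permit, parcel_lines, [0.5, 1, 2, 5, 10, 20, 40, 60]))
```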

Here is our initial horizontal accuracy distribution for the 259 permits. In the graph below there are two peaks: below 2 feet, where we have about 50 records, and at or above 60 feet, where we have about 35 records; the latter are the features for which accuracies could not be determined. The middle part of the curve shows a random distribution of accuracies.

Here is the cumulative curve. If the data were perfect, all 259 permits would have a horizontal accuracy of 0.5 feet, and the cumulative curve (shown below) would be a straight line across at a cumulative count of 259.
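Both summaries can be produced with a few lines of numpy. This is only an illustrative sketch, assuming `accuracies` holds one accuracy value per permit (with undetermined features capped at 60 feet) and bin edges matching the buffer distances above.

```python
import numpy as np

def accuracy_summary(accuracies, edges=(0.5, 1, 2, 5, 10, 20, 40, 60)):
    """Per-bin counts of permit accuracies and the running cumulative count."""
    bins = np.append(edges, np.inf)      # last bin collects the undetermined (>= 60 ft) features
    counts, _ = np.histogram(accuracies, bins=bins)
    return counts, np.cumsum(counts)     # a perfect dataset would put all 259 in the first bin
```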

This information is visualized below, where you can see that there are many features at accuracy levels from 2 to 60 feet. This type of visualization is useful for manual correction of features, but we were looking to automate the process.

In most cases the last buffer and clip operation will extract a line segment from control that forms a complete ring. This means that, using standard topology tools, the extracted control lines can be built into polygons and used to replace less accurate test (permit) features.
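As a rough illustration of that build step, assuming shapely, the clipped control lines for a permit could be turned into polygons as shown below; `clipped_lines` is a hypothetical list of the line work extracted by the last buffer and clip.

```python
from shapely.ops import polygonize, unary_union

def build_from_rings(clipped_lines):
    """Node the clipped control lines and build polygons from any closed rings."""
    noded = unary_union(clipped_lines)   # split and merge lines at shared nodes
    return list(polygonize(noded))       # empty if the line work does not close into a ring
```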

The solution that we implemented consisted of two processes:

  • If the clipped line work formed a complete ring, the ring could easily be built into a polygon. This was the Phase 1 Correction.
  • If the clipped line work had gaps, we implemented a more complex algorithm that did a node-to-node comparison and then extended lines to form closed rings. In this case it was possible to create multiple rings, and once those rings were built into polygons their areas were compared to the original permit (test feature) to ensure that the new feature was not significantly larger or smaller than the original. This was the Phase 2 Correction (a rough sketch of the approach follows this list).
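The sketch below illustrates the Phase 2 idea under simplifying assumptions: gaps are closed by bridging nearby free endpoints with straight segments (the actual node-to-node extension is more involved), and rebuilt polygons are accepted only if their area stays within a tolerance of the original permit. All names and the 10% tolerance are illustrative, not the paper's values.

```python
from shapely.geometry import LineString, Polygon
from shapely.ops import linemerge, polygonize, unary_union

def close_gaps(lines):
    """Join free endpoints of the merged line work with straight bridging segments."""
    merged = linemerge(lines)            # accepts a sequence of LineStrings
    parts = list(merged.geoms) if merged.geom_type == "MultiLineString" else [merged]
    endpoints = [pt for p in parts if not p.is_ring
                 for pt in (p.coords[0], p.coords[-1])]
    bridges = []
    for i, a in enumerate(endpoints):
        # naive pairing: connect each free endpoint to its nearest other endpoint
        b = min((e for j, e in enumerate(endpoints) if j != i),
                key=lambda e: (e[0] - a[0]) ** 2 + (e[1] - a[1]) ** 2)
        bridges.append(LineString([a, b]))
    return parts + bridges

def accept(candidate: Polygon, original: Polygon, tol=0.10):
    """Keep a rebuilt polygon only if its area is within tol of the original permit."""
    return abs(candidate.area - original.area) <= tol * original.area

# Hypothetical usage:
# rings = list(polygonize(unary_union(close_gaps(clipped_lines))))
# replacements = [r for r in rings if accept(r, permit)]
```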

In this video, the first build is the trivial solution, or Phase 1 Correction; the second build is the more complex Phase 2 Correction.

After the Phase 1 Correction we generated statistics and found that we had corrected 104 permits, or 40% of our 259 features, to parcel. The horizontal accuracy distribution before and after the Phase 1 Correction is shown below.

In this close-up we see that the Phase 1 Correction curve spikes at 0.5 feet above the initial conditions, and that the rest of the Phase 1 Correction curve is under the Initial curve, as expected.

The same improvements can be seen in the cumulative curve.

After the Phase 1 Correction we also calculated the RMSE and found a significant reduction in the horizontal error when compared to parcel (see below).
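The RMSE comparison itself is straightforward; here is a sketch assuming `errors` is a hypothetical array holding one horizontal error value (in feet) per permit, computed before and after correction.

```python
import numpy as np

def rmse(errors):
    """Root mean square of the per-permit horizontal errors (same units as input)."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))
```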

Lastly, a visual comparison was done showing the obvious improvements at the 0.5 feet horizontal accuracy (shown in yellow below).

The Phase 2 Correction made a 6% (16 record) improvement to our data. Here are some images of the corrected features: red is the original permit, and the green area is the corrected feature following parcel.

The image below shows a common problem with our algorithm on the west side of the polygon, where a corner has been clipped. However, in general the new feature is a better representation than the original in red.

The final horizontal accuracy measures after the Phase 2 Correction are provided below. Phase 2 is under Phase 1, which is under the Initial conditions, as expected.

In conclusion, we have found that Buffer Overlay analysis is an excellent tool for determining and improving the horizontal accuracy of our permit features to parcels. The initial accuracy assessment is easy to perform and provides data that is easy to interpret. Visualizing the data by these generated accuracies provides an avenue for manual correction of the data. However, with a little more effort, complete rings can be extracted from the last buffer and clip operation and built into more horizontally accurate polygons. This is the low-hanging fruit, but it also provides the biggest bang for the buck.

Our additional efforts in Phase 2, where we connected nodes to close gaps and create rings, only improved our data by 6%; however, we feel confident that we can tweak our algorithm to yield at least a 10-15% gain in the future.
