A Regulatory Application of LiDAR Data

The South Florida Water Management District (SFWMD) has an agreement with the U.S. Department of Agriculture's Natural Resources Conservation Service (NRCS) to provide engineering and hydrologic modeling technical assistance to NRCS in the delivery of technical services related to fish and wildlife conservation on non-Federal private and Tribal lands throughout the state of Florida. The agreement also supports delivery of conservation programs that are part of the Food, Conservation, and Energy Act of 2008, also referred to as the 2008 Farm Bill.

To evaluate a water management system for potential off-site impacts, a computer model is generally developed for the existing condition; the peak stages and flows it generates are then compared with those of the proposed system to determine impacts.

One of the most data-intensive steps in model development is creating the topographic information necessary to construct the stage/storage relationship.  Digital elevation models (DEMs) derived from LiDAR data allow quick and accurate estimates of basin storage, especially in remote areas where conventional topographic surveys are unavailable or cost prohibitive.  Because of the large extent of the area covered, these LiDAR-derived DEMs allow District staff to accurately delineate and evaluate watersheds.  In the past, many watershed hydrology and hydraulics assessments were not possible because of a lack of topographic data. The following discussion presents a typical project in support of the 2008 Farm Bill.

The project involved producing volume calculations (at one-foot intervals) for each of the 47 basins identified within a large project area encompassing approximately 24,500 acres. The first step was to create a file directory for each basin and, within each directory, a file geodatabase to store the basin boundary, DEMs, contours, TINs, and text files.

The basin boundary becomes the mask for clipping the DEM.

The clipped LiDAR DEM.


Using ESRI's 3D Analyst extension, a contour feature class is created from the clipped DEM.


A TIN is created using the contour layer as the input feature class.

In some instances the TIN extends beyond the boundary of the basin, so it needs to be edited to remove the excess area. When clipping for this purpose, the surface feature type must be set to Hard Clip; this ensures that the basin boundary defines the clipped extent of the TIN.
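For anyone scripting these steps, a minimal arcpy sketch of the clip/contour/TIN sequence is shown below. The paths, dataset names, and the one-foot contour interval are placeholders rather than the actual project data, and the Create TIN / Edit TIN in_features strings should be checked against the tool help.

import arcpy
from arcpy.sa import ExtractByMask

arcpy.CheckOutExtension("Spatial")
arcpy.CheckOutExtension("3D")

# Hypothetical paths; each basin has its own folder and file geodatabase
gdb = "C:/Basins/Basin_01/Basin_01.gdb"
basin = gdb + "/basin_boundary"          # basin boundary polygon (the clip mask)
lidar_dem = "C:/LiDAR/lidar_dem"         # LiDAR-derived DEM
tin = "C:/Basins/Basin_01/basin_tin"

# 1. Clip the LiDAR DEM to the basin boundary
ExtractByMask(lidar_dem, basin).save(gdb + "/dem_clip")

# 2. Create one-foot contours from the clipped DEM (3D Analyst)
arcpy.Contour_3d(gdb + "/dem_clip", gdb + "/contours", 1)

# 3. Build a TIN from the contours, then hard-clip it to the basin boundary
#    so the boundary defines the TIN extent (verify the in_features string
#    format against the Create TIN / Edit TIN documentation)
arcpy.CreateTin_3d(tin, "", gdb + "/contours Contour Mass_Points", "DELAUNAY")
arcpy.EditTin_3d(tin, basin + " <None> <None> Hard_Clip false", "DELAUNAY")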


At this point, the 3D Analyst Surface Volume tool is used to obtain the calculated volumes.  Unfortunately, the tool creates a separate results text file for each elevation analyzed, which can result in a large number of files. To fix this, the standard ESRI volume calculation script was modified to produce one file with the calculated volumes for all elevations analyzed. The output is a comma-delimited text file listing the outputs of the Surface Volume function for multiple depths. You can view the modified script on GitHub.
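The modified script itself is the one linked on GitHub; purely to illustrate the approach, a loop along these lines (paths and the elevation range are hypothetical) runs Surface Volume once per one-foot plane and copies each result into a single consolidated file.

import os
import arcpy

arcpy.CheckOutExtension("3D")

tin = "C:/Basins/Basin_01/basin_tin"          # hypothetical TIN path
scratch = "C:/Basins/Basin_01/temp_vol.txt"   # per-elevation output from the tool
out_txt = "C:/Basins/Basin_01/volumes.csv"    # consolidated output

with open(out_txt, "w") as out:
    for i, elev in enumerate(range(10, 24)):  # one-foot planes, e.g. 10-23 ft NAVD88
        if os.path.exists(scratch):
            os.remove(scratch)
        # Surface Volume reports 2D/3D area and volume below the reference plane
        arcpy.SurfaceVolume_3d(tin, scratch, "BELOW", elev)
        with open(scratch) as tmp:
            header, data = tmp.readlines()[:2]
        if i == 0:
            out.write(header)                 # keep the tool's own column header once
        out.write(data)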

Here is the output for 13 elevations. The last step was to QA/QC the results by comparing the basin acreage with the Area_2D at the maximum elevation.  In this case, the basin acreage is 464.36 acres and the acreage computed from the volume output is 20,227,347 sq ft / 43,560 sq ft per acre ≈ 464.36 acres. With this basin complete, it's time to start another one…

Linda McCafferty, Geographer 3, Regulation GIS, SFWMD
Shakir Ahmed, Geographer 3, Regulation GIS, SFWMD
Juan Tobar, Supervisor – Geographers, Regulation GIS, SFWMD


SFWMD’s New Composite Topo

In South Florida's complex water management system, hydrologic models are used for evaluation, planning, and simulating water control operations under different climatic and hydrologic conditions. These models need accurate topographic data as part of their inputs. It has been four years since SFWMD updated its composite topographic model; here is a general description of the source data and the processing steps that went into the newest composite product for the Lower West Coast.

All work was done using ArcGIS v10.1, with a raster (grid) cell size of 100 ft, elevations in feet relative to NAVD 1988, and X-Y coordinates in the standard State Plane Florida East zone, 1983-HARN, US feet.  The extent is X from 205000 to 750000 and Y from 270000 to 1045000, creating a gridded dataset of 5450-by-7750 cells.  Almost all of the source data had been previously compiled.  The process consisted of assembling the topographic data at 100-ft cell size, layering the topographic data together, blending the source layers to remove discontinuities along the edges, assembling and layering the bathymetric data, joining the bathymetric data with the topographic data, and filling in any remaining no-data holes along the shoreline.

Most of the project land area was covered by modern LiDAR data, which had already been processed to create DEMs at 100-ft cell size.  Several areas lie in the western zone and had already been projected to the Florida East zone.  The LiDAR data includes:

  • FDEM 2007 Coastal LiDAR project, with partial or complete coverage by county:  Lee, Charlotte, Sarasota, Collier, Monroe, and Miami-Dade.
  • SWFWMD 2005 LiDAR:  Peace River South project, covering part of Charlotte County.
  • USGS 2012 LiDAR:  Eastern Charlotte project, covering part of eastern Charlotte and western Glades counties.
  • USACE 2007 LiDAR:  The HHD_EAA project as an add-on to the FDEM Coastal LiDAR project, covering part of Hendry, Glades, Palm Beach, and Okeechobee counties.  The USACE also processed and merged in bathymetric data from Lake Okeechobee (from USGS and other boat surveys) at 100-ft cell size.
  • USACE 2010 LiDAR:  The HHD_Northwest dataset was merged from two deliverables named HHD NW Shore and Fisheating Creek, and covers parts of Okeechobee, Glades, Highlands, and Charlotte counties.
  • USACE 2003 LiDAR:  The Southwest Florida Feasibility Study (SWFFS) LiDAR covered parts of Collier, Hendry, and Glades counties.  This dataset has lower quality than the other LiDAR data.

For the Everglades and Big Cypress areas, the collection of LiDAR data is problematic due to extremely dense vegetation cover.  The USGS conducted a project through 2007 to collect high-accuracy elevation points throughout those areas, essentially using a plumb bob hanging from a helicopter equipped with GPS.  This high-accuracy elevation dataset (HAED) consists of about 50,000 points collected at 400-m intervals.  The points were converted to a gridded surface using ordinary kriging interpolation and resampled at 100-ft cell size.

Another problematic area is the well-known “Topo Hole” in SE Hendry and NE Collier counties, where no high-quality elevation data has been collected.  Several previous approximations of a topographic surface had been made for this area (2003 and 2005 for the SWFFS, and in early 2006 by the USACE), primarily using 5-ft contours and spot elevations from USGS topographic maps (quad sheets).  For this project, several newly available datasets from the C-139 Regional Feasibility Study were obtained, and two were included in the current processing:  ground spot elevations from the “all-static” survey and 1-ft contours for the C-139 Annex area.  Unfortunately, these new datasets cover only a small area at the margins of the Topo Hole.  The contours and spot elevations were converted to a gridded surface using the ArcGIS tool TopoToRaster, formerly known as TOPOGRID and sometimes referred to as the ANUDEM method, which applies “essentially a discretized thin plate spline technique” to create a “hydrologically correct digital elevation model.”  The method is not perfect, and the lack of detailed source data limits the accuracy, but the result is still better than other currently available methods.  A 20,000-ft buffer zone was added around the Topo Hole, and the TopoToRaster tool was applied.  Until LiDAR or similar data is collected for the Topo Hole, this is likely to remain the best-available approximation of a topographic surface for this area.
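The actual processing used the District's C-139 datasets; as a sketch of the call pattern only (the feature class and field names below are invented), the ANUDEM interpolation can be run like this:

import arcpy
from arcpy.sa import TopoToRaster, TopoContour, TopoPointElevation

arcpy.CheckOutExtension("Spatial")

# Placeholder feature classes and elevation fields for the C-139 inputs
inputs = [TopoPointElevation([["allstatic_spots", "ELEV_FT"]]),
          TopoContour([["c139_annex_contours_1ft", "ELEV_FT"]])]

# ANUDEM interpolation at the project cell size (100 ft)
surface = TopoToRaster(inputs, 100)
surface.save("C:/Topo/topo_hole_fill")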


For the Collier, Monroe, and Miami-Dade DEMs, “decorrugated” versions of the processed LiDAR data were used.  During the original processing of the accepted deliverables from the FDEM LiDAR project, significant banding was apparent.  This banding appears as linear stripes (or corn rows, or corrugations) of higher and lower elevations along the LiDAR flightlines.  The DEM data can be “decorrugated” by applying a series of filters to the elevation dataset, but real topographic features can also be altered slightly in the process.  In the resulting product the systematic errors are removed, but at the cost that every land elevation is altered to some extent; thus the decorrugated surface is considered a derivative product. The decorrugation work for the rural areas of these three counties was done in 2010, but the results had never been formally documented or added to the District’s GIS Data Catalog.  For this project, in comparison with the “original” processed DEMs, these datasets are considered the “best available” because visually obvious errors have been removed.

The best-available topographic DEMs were mosaicked into a single DEM, with better data used in areas of overlap.  In order to remove discontinuities where the border of better data joins to other data, a special blending algorithm was used to “feather” the datasets into each other.  The result is a surface that is free from discontinuities along the “join” edges, but of course accuracy of the result is limited by the accuracy of the source data.  The width of the blending zone along the edges was varied according to the type of data and the amount of overlap that was available, and ranged from 4,000 to 20,000 feet.  The “blending-zone” adjusted data was retained and is available for review.

The layering of the topographic data, from best (top) to worst (bottom), was as follows (a simple mosaicking sketch follows the list):

  • 2007 FDEM LiDAR, HHD_EAA, HHD_Northwest, and Eastern Charlotte  (all are roughly equivalent in quality)
  • SWFWMD 2005 LiDAR
  • SWFFS 2003 LiDAR
  • USGS HAED
  • Topo Hole
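The sketch below shows the basic “best data wins” mosaic for this layering, using placeholder raster names; the custom edge-feathering (blending) step described above is a separate process and is not reproduced here.

import arcpy

# Best-to-worst layering; names are placeholders for the processed 100-ft DEMs
layers = ["fdem2007_dem", "hhd_eaa_dem", "hhd_nw_dem", "east_charlotte_dem",
          "swfwmd2005_dem", "swffs2003_dem", "usgs_haed_dem", "topo_hole_dem"]

arcpy.env.workspace = "C:/Topo/Compile.gdb"

# "FIRST" keeps the value from the first (best) raster wherever inputs overlap
arcpy.MosaicToNewRaster_management(layers, "C:/Topo/Compile.gdb", "topo_best",
                                   "", "32_BIT_FLOAT", 100, 1, "FIRST")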

Two datasets for the land area were created: one treated the major lake surfaces as flat areas, and the other used available lake bathymetry to add the lake-bottom elevations into the topo DEM surface.  The USACE-processed bathymetry was added for the area of Lake Okeechobee that had been treated as a water body in the 2007 HHD_EAA LiDAR deliverable.  Bathymetry data for Lake Trafford was also available from a 2005 USACE project.  In the flat-surface version, elevations of 7.5 ft for Lake Okeechobee and 15.1 ft for Lake Trafford, relative to NAVD 1988, were imposed.

Offshore bathymetry was mosaicked at 100-ft cell size.  In 2005 the available data had been collected and mosaicked at 300-ft cell size.  Since that time, no significant bathymetry datasets are known to have been created within the Lower West Coast area.  In 2004, a best-available composite for the Lee County area had been created at 100-ft cell size from 2002 USACE channel surveys of the Caloosahatchee River, boat surveys by USGS for the Caloosahatchee Estuary in 2003 and other areas in 2004, experimental offshore LiDAR from USGS in 2004, and older NOAA bathymetric data for areas that were not otherwise covered.  Other bathymetric data down the shoreline consists of USGS boat surveys down to Cape Romano, Naples Bay boat surveys for SFWMD, and Florida Bay boat surveys that had been compiled in approx. 2004 by Mike Kohler.  For the remaining offshore areas, the older NOAA bathymetric data was used.  The bathymetric pieces were mosaicked together and blended along their edges.

When the offshore bathymetry was joined with the land topography, there were thousands of small no-data holes near the shoreline.

These empty spaces were grouped into three categories.  First, some small, low-lying islands (essentially mangrove rookeries) in Florida Bay had no pertinent elevation data; these were assigned an arbitrary land elevation of +1.5 ft NAVD88.  Second, empty spaces adjacent to offshore bathymetry were treated as shallow offshore areas; for each no-data hole (polygon), the maximum elevation of the adjacent offshore bathymetry was assigned (i.e., it’s water, and we’ll give it the shallowest water value that’s nearby).  Finally, empty spaces that were inland were treated as low-lying land or marsh; for each no-data hole (polygon), the minimum elevation of the adjacent “land” was assigned (i.e., if it’s a low area, it can’t be above anything that’s along its edge).
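As an illustration only (not the production workflow), the per-hole max/min rule can be expressed in a few lines of NumPy/SciPy once the grid has been exported to an array, for example with arcpy.RasterToNumPyArray:

import numpy as np
from scipy import ndimage

def fill_holes(elev, nodata, offshore):
    """Fill each connected no-data region with the maximum (offshore) or
    minimum (inland) elevation of the valid cells touching that region."""
    filled = elev.copy()
    holes = elev == nodata
    labels, count = ndimage.label(holes)
    for region in range(1, count + 1):
        region_mask = labels == region
        # Valid cells immediately adjacent to this hole
        ring = ndimage.binary_dilation(region_mask) & ~holes
        if not ring.any():
            continue
        filled[region_mask] = elev[ring].max() if offshore else elev[ring].min()
    return filled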

After filling in all of the no-data holes, the bathymetry and land-with-lake topo were mosaicked together.  The deepest spot in the full dataset is about 100 ft at a distance of about 90 miles west of Cape Sable.  For most modeling and mapping purposes, however, the areas well off the shore (e.g. 10 miles or farther) are not needed.  Thus a buffer zone was defined as 20,000 ft beyond the LWC offshore boundary, and the full dataset was clipped to that boundary to create Lwc_TopoShore.


These products can accurately be described as composites of the best-available data.  By comparison, both the 2005 SWF composite and the early-2006 USACE SF-Topo composite were based on older data and did not include any of the LiDAR collected after 2003.  It might be possible to make slight improvements to the current source data (say, by decorrugating rural Lee County), but such modifications would involve significant time and work beyond the short turn-around time allocated for this project.

Timothy Liebermann, Senior Geographer, Regulation GIS, SFWMD

Organizing Geospatial Data Collections


When organizing geospatial data collections, it is important to associate data with names that facilitate discovery. This applies both to the names your users see and to how you name features at the root level (think SDE feature names).

In 1997, I was appointed GIS Coordinator for the City of Bakersfield, and while searching the internet for standards I could use to organize the City’s data, I came across the Tri-Services Spatial Data Standards (TSSDS), later renamed the Spatial Data Standards for Facilities, Infrastructure, and the Environment (SDSFIE). You can download the old standards here.

The TSSDS was developed in 1992 by the Tri-Services CADD/GIS Technology Center at the US Army Engineer Waterways Experiment Station in Vicksburg, Mississippi. The Center’s primary mission was to serve as a multi-service (Army, Navy, Air Force, Marine Corps) vehicle to set CADD and GIS standards, coordinate CADD/GIS facilities systems within DoD, and promote CADD/GIS system integration.

I was particularly interested in the hierarchical classification that assigned geospatial features to Entity Sets, Entity Classes, and Entity Types.

  • Entity Sets were a broad classification like boundaries, cadastre, fauna, flora, hydrography, transportation, and others.
  • Entity Classes were more narrowly focused groups such as transportation air, transportation marine, transportation vehicle, and others.
  • Entity Types represented the actual geographic features.

Some examples:
Transportation – Vehicle – Road Centerline
Cadastre – Real-estate – Parcels
Hydrography – Surface – Canal Centerline

In 1997, shapefiles were new and most of us were still using coverages, so most geospatial data was organized in folders, and this standard worked pretty well.

In 2001, at SFWMD we implemented the same standard in SDE:

Some examples:
Transportation – Vehicle – Road Centerline (TRVEH_ROAD_CENTERLINES)
Cadastre – Real-estate – Parcels (CDREL_PARCELS)
Hydrography – Surface – Canal Centerline (HYSUR_CANAL_CENTERLINES)

This system also worked pretty well, but by 2001 there were many more non-geospatial professionals using the system, and there was a need for something more user-friendly.

The solution was a look-up table that takes terse SDE names and renders them in a simpler-to-read style. In principle, such a system would consist of a parent table containing source information (SDE, services, layers, shapes, coverages, etc.) and a daughter table containing your classification schema. The parent table would be populated automatically through ETL scripts, while the daughter table would be crafted manually by a data steward. A graphical user interface would then provide users with access to the information in the daughter table, while a back end uses the parent table to retrieve the data.
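As a rough illustration of the parent/daughter idea (the table and column names are made up, and SQLite stands in for whatever database you actually use), the schema might look like this:

import sqlite3

con = sqlite3.connect("data_catalog.db")   # hypothetical catalog database
con.executescript("""
CREATE TABLE IF NOT EXISTS source (        -- parent: populated by ETL scripts
    source_id   INTEGER PRIMARY KEY,
    sde_name    TEXT NOT NULL,             -- e.g. HYSUR_CANAL_CENTERLINES
    source_type TEXT NOT NULL              -- SDE, service, shapefile, coverage...
);
CREATE TABLE IF NOT EXISTS catalog (       -- daughter: curated by a data steward
    catalog_id   INTEGER PRIMARY KEY,
    source_id    INTEGER REFERENCES source(source_id),
    entity_set   TEXT,                     -- e.g. Hydrography
    entity_class TEXT,                     -- e.g. Surface
    entity_type  TEXT                      -- e.g. Canal Centerline
);
""")
cur = con.execute("INSERT INTO source (sde_name, source_type) VALUES (?, ?)",
                  ("HYSUR_CANAL_CENTERLINES", "SDE"))
con.execute("INSERT INTO catalog (source_id, entity_set, entity_class, entity_type) "
            "VALUES (?, 'Hydrography', 'Surface', 'Canal Centerline')",
            (cur.lastrowid,))
con.commit()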

Some of you may be asking why you should create a library from scratch when GIS packages can automatically index data for you. The answer is that these automated systems are good but still not as effective as a hand-crafted system. Automated systems perform only as well as their metadata is written, so let me ask you: how well is your organization’s metadata written? In most sizable governmental agencies, data comes in from both internal and external sources, and unless you have a robust procedure for vetting the metadata, a lot of garbage will get in. Vetting means not only automatically verifying that required metadata elements are filled in, but also verifying that they make semantic sense and that titles are standardized. Looked at from this perspective, it is much easier to standardize a feature class name in a lookup table than to standardize feature class metadata.

Juan Tobar, Supervisor – Geographers, Regulation GIS, SFWMD

A C# Program to Fix Broken Layer Files

We are currently working through an ArcGIS v9.3 to ArcGIS v10.1 migration of our Regulatory GIS Support System (RegGSS) and have run into a problem when applying a layer file to a data set.

There are two approaches to applying a layer file to a data set: either load the layer file directly, or load the data set first and then apply the layer file to it. In the past, we ran into problems loading layer files directly, where some properties would not get set (especially label properties), so we decided to go with the second option of loading a data set and then applying a layer file. The main difference is that in the first case the layer file cannot have any broken links, while the second method does not care about broken links.

At ArcGIS 10.1, however, this approach stopped working. RegGSS uses close to 800 layer files, and checking/fixing each one by hand would have been a monumental task in both effort and time. Our solution was to create a C# script to check every layer file for broken links and repair those that were broken, or flag them as not repairable so they could be fixed manually. The general logic is as follows:

  • Cycle through all layer files used by RegGSS one at a time and check whether each layer file is valid by using the Valid property on the ILayer interface of ArcObjects.
  • If the layer file is not valid, read the SDE server name, reset the connection parameters, and re-save the layer file.
  • Re-check the validity of the re-saved layer file; if it still has a broken link, write the layer file name to a text file so it can be fixed manually.

You can check out the main procedures here on GitHub.
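The production code is C#/ArcObjects (linked above), but the same check/repair/re-verify flow can be sketched with arcpy.mapping for illustration; the connection file, layer folder, and log paths below are placeholders.

import glob
import arcpy

new_conn = "C:/Connections/prod_sde.sde"   # hypothetical connection file for the new server
broken_log = "C:/Temp/still_broken.txt"

with open(broken_log, "w") as log:
    for lyr_path in glob.glob("C:/RegGSS/LayerFiles/*.lyr"):
        lyr_file = arcpy.mapping.Layer(lyr_path)
        if not lyr_file.isBroken:
            continue
        # Point the layer at the new SDE workspace and re-save the .lyr file
        lyr_file.findAndReplaceWorkspacePath("", new_conn, False)
        lyr_file.save()
        # Re-check; anything still broken gets flagged for manual repair
        if arcpy.mapping.Layer(lyr_path).isBroken:
            log.write(lyr_path + "\n")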

Carlos Piccirillo, Senior Geographer, Regulation GIS, SFWMD
Juan Tobar, Supervising Geographer, Regulation GIS, SFWMD

Migrating GRIDS and Shapefiles to FGDB Using Python

The Drawdown tool within RegGSS can be used to create hydrologic drawdown models (grids) in a non-leaky aquifer for single or multiple well systems. The tool also maintains a library of over 2,500 grids that allows for storage and retrieval of the created drawdown grids and contours.


Drawdown creates a directory named after the permit application number, for example 080410-22. The individual grids are stored within this directory and are named “application number – version”, for example 080410-22-1. Each grid can also have contours, and those are named “contour – version”, so the contours for 080410-22-1 are simply contours-1.

We did extensive testing in our test environment and thought we had figured it all out. But how often does your test environment faithfully duplicate production? Especially in a Windows XP to Windows 7, ArcGIS v9.3 to 10.1 migration?

Here are the problems we encountered migrating to ArcGIS v10.1, and how we handled each (a sketch of the conversion logic follows the list):

  • ArcGIS v10.1 wants everything in a geodatabase (we will use a file geodatabase).
  • Geodatabases do not like grids whose names begin with numbers (we will prefix grid names with a “g”).
  • Geodatabases do not like grids or contours that have dashes “-” in the name (we will use underscores “_”).
  • Prior to ArcGIS v9.2, grids were stored with auxiliary files, for example 080410-22-1.aux, which stored statistical information about the grid. At ArcGIS v9.2, in addition to aux files, ESRI added aux.xml files, for example 080410-22-1.aux.xml. ArcGIS v10.1 does not recognize the older format with the aux file alone (we will generate aux.xml files).
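A minimal sketch of the conversion logic, with hypothetical paths (the real migration script is the one on GitHub below):

import os
import arcpy

def migrate_grid(grid_path, out_gdb):
    """Copy one ESRI GRID into a file geodatabase using a geodatabase-safe name.
    Illustrative only; the naming rules follow the conventions listed above."""
    name = os.path.basename(grid_path)           # e.g. 080410-22-1
    safe_name = "g" + name.replace("-", "_")     # no leading digit, no dashes
    arcpy.CalculateStatistics_management(grid_path)  # recompute statistics (aux.xml sidecar)
    arcpy.CopyRaster_management(grid_path, os.path.join(out_gdb, safe_name))
    return safe_name

# Example: migrate_grid("C:/Drawdown/080410-22/080410-22-1", "C:/Drawdown/Drawdown.gdb")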

The image below is a look at the output of Drawdown for a single well system.
Drawdown Grid

You can take a look at the python code on GitHub.

Juan Tobar, Supervisor – Geographers, Regulation GIS, SFWMD

ArcSDE Replication

SFWMD Regulation GIS has staging and production ArcSDE 9.3.1 SP2 servers used to store geospatial data. Most of the features on staging do not change very often, so they are moved manually to production as the need arises. Those features which are edited daily on staging are replicated to production and synchronized hourly.

We are currently undergoing a simultaneous Windows XP to Windows 7 and ArcGIS v9.3 to ArcGIS 10.1 upgrade. Once our desktops were upgraded to 10.1, our 9.3 Python script that automatically re-creates the replica failed. We tried the 10.1 CreateReplica tool but kept getting an error.

It turns out that CreateReplica will give an error if the feature class was created and registered as versioned in the earlier 9.3 version.

The problem was solved when the feature classes were deleted from both staging and production and then re-created from scratch in 10.1 from a file geodatabase archive. The new feature classes were then assigned privileges, Global IDs were generated, and the feature classes were registered as versioned.  Once this was done, we noticed that feature table views (“VW” suffix) were being created. We originally did not understand this but quickly learned that it is the default behavior of ArcGIS 10.1 when registering feature classes as versioned.
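Our replication.py is on GitHub (see below); as a generic outline only, with placeholder connection files, user, feature class, and replica names, the geoprocessing calls involved look roughly like this:

import arcpy

staging = "C:/Connections/staging.sde"            # hypothetical connection files
prod = "C:/Connections/production.sde"
fc = staging + "/SDE.HYSUR_CANAL_CENTERLINES"     # hypothetical feature class

# Re-created feature class: grant privileges, add Global IDs, register as versioned
arcpy.ChangePrivileges_management(fc, "REGGSS_VIEWER", "GRANT", "AS_IS")
arcpy.AddGlobalIDs_management(fc)
arcpy.RegisterAsVersioned_management(fc, "NO_EDITS_TO_BASE")

# Create a one-way replica from staging to production
arcpy.CreateReplica_management(fc, "ONE_WAY_REPLICA", prod, "RegGSS_Replica", "FULL")

# Hourly synchronization from staging (geodatabase 1) to production (geodatabase 2)
arcpy.SynchronizeChanges_management(staging, "RegGSS_Replica", prod,
                                    "FROM_GEODATABASE1_TO_2",
                                    "IN_FAVOR_OF_GDB1", "BY_OBJECT",
                                    "DO_NOT_RECONCILE")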

We like to use Model Builder as a prototyping environment.
Model Builder

Once done prototyping, we migrate our models to Python scripts. You can view replication.py on GitHub.

Shakir Ahmed, Geographer 3, Regulation GIS, SFWMD
Juan Tobar, Supervisor – Geographers, Regulation GIS, SFWMD

Feature Class Indexes Using Python and XML

The Regulatory GIS Support System (RegGSS) is an ArcMap extension written in C# and managed by SFWMD Regulation GIS.

RegGSS Toolbar

It has numerous functions for loading data, including a number of “Find” functions that work by indexing the names of features in a feature class and making the index available through a listbox from which users can select individual features.

One of these Finds indexes the ArcHydro feature class HydroEdge, which contains canal centerlines. Unfortunately, with over 210,000 features, creating an index on the fly can take well over a minute, and this can try a user’s patience.

Find Canal (AHED)

A solution to this issue is to create a list of feature names ahead of time and then just load it into a listbox. In the example below, this is done using Python and an XML file.

You can view the CreateFindIndex.py code on GitHub.
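As a rough illustration (not the actual CreateFindIndex.py), the index can be built once with a search cursor and written out with ElementTree; the connection path, feature class, and the Name field are assumptions.

import arcpy
import xml.etree.ElementTree as ET

fc = "C:/Connections/prod_sde.sde/SDE.HydroEdge"   # hypothetical path to the canal centerlines
out_xml = "C:/RegGSS/Indexes/FindCanal.xml"

root = ET.Element("features")
# Pull the name of every canal reach once, up front, instead of on the fly
with arcpy.da.SearchCursor(fc, ["Name"]) as rows:
    for (name,) in rows:
        if name:
            ET.SubElement(root, "feature", name=name)

ET.ElementTree(root).write(out_xml, encoding="utf-8", xml_declaration=True)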

And here is what the XML looks like:
XML File

Louis Artola has a good post on using Python to write XML on his blog louisartola|software.

Let us know if you have any comments,

By Juan Tobar, Supervisor – Geographers, Regulation GIS, SFWMD