How Topo to Raster works
The Topo to Raster tool is an interpolation method specifically designed for the creation of hydrologically correct digital elevation models (DEMs). It is based on the ANUDEM program developed by Michael Hutchinson (1988, 1989). See Hutchinson and Dowling (1991) for an example of a substantial application of ANUDEM and for additional associated references. A brief summary of ANUDEM and some applications are given in Hutchinson (1993). The current version of ANUDEM used in ArcGIS is 4.6.3.
Topo to Raster interpolates elevation values for a raster while imposing constraints that ensure:
- A connected drainage structure
- Correct representation of ridges and streams from input contour data
As such, it is the only ArcGIS interpolator specifically designed to work intelligently with contour inputs.
The Topo to Raster by File tool is useful for executing the Topo to Raster tool multiple times, since it is often easier to change a single entry in the parameter file and rerun the tool than to repopulate the tool dialog box each time.
The interpolation process
The interpolation procedure has been designed to take advantage of the types of input data commonly available and the known characteristics of elevation surfaces. This method uses an iterative finite difference interpolation technique. It is optimized to have the computational efficiency of local interpolation methods, such as inverse distance weighted (IDW) interpolation, without losing the surface continuity of global interpolation methods, such as Kriging and Spline. It is essentially a discretized thin plate spline technique (Wahba, 1990) for which the roughness penalty has been modified to allow the fitted DEM to follow abrupt changes in terrain, such as streams and ridges.
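The core idea of an iterative finite difference interpolation can be illustrated with a toy example. The sketch below holds known data cells fixed and repeatedly replaces every other cell with the mean of its neighbors, a crude stand-in for a discretized roughness penalty; ANUDEM's actual scheme is far more elaborate (locally adaptive, multiresolution, and drainage-constrained). The grid size and data points here are hypothetical.

```python
# Toy iterative finite-difference interpolation: data cells stay fixed,
# free cells relax toward the mean of their 4-neighbours (Gauss-Seidel sweep).
def interpolate(nrows, ncols, data, iters=500):
    """data: {(row, col): value} of fixed elevation cells (hypothetical)."""
    z = [[0.0] * ncols for _ in range(nrows)]
    for (r, c), v in data.items():
        z[r][c] = v
    for _ in range(iters):
        for r in range(nrows):
            for c in range(ncols):
                if (r, c) in data:
                    continue  # never move a known data point
                nbrs = [z[rr][cc]
                        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                        if 0 <= rr < nrows and 0 <= cc < ncols]
                z[r][c] = sum(nbrs) / len(nbrs)
    return z

grid = interpolate(5, 5, {(0, 0): 100.0, (4, 4): 0.0})
```

Because each sweep only touches a cell's immediate neighbors, the cost per iteration is local (as in IDW), while repeated sweeps propagate information across the whole grid, giving the smooth global behavior of spline-like methods.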
Water is the primary erosive force determining the general shape of most landscapes. For this reason, most landscapes have many hilltops (local maximums) and few sinks (local minimums), resulting in a connected drainage pattern. Topo to Raster uses this knowledge of surfaces and imposes constraints on the interpolation process that result in a connected drainage structure and correct representation of ridges and streams. This imposed drainage condition produces more accurate surfaces with less input data. The quantity of input data can be up to an order of magnitude less than that normally required to adequately describe a surface with digitized contours, further minimizing the expense of obtaining reliable DEMs. The global drainage condition also virtually eliminates any need for editing or postprocessing to remove spurious sinks in the generated surface.
The program acts conservatively in removing sinks and will not impose the drainage conditions in locations that would contradict the input elevation data. Such locations normally appear in the diagnostic file as sinks. Use this information to correct data errors, particularly when processing large datasets.
The drainage enforcement process
The purpose of the drainage enforcement process is to remove all sink points in the output DEM that have not been identified as sinks in the input sink feature dataset. The program assumes that all unidentified sinks are errors, since sinks are generally rare in natural landscapes (Goodchild and Mark, 1987).
The drainage enforcement algorithm attempts to clear spurious sinks by modifying the DEM, inferring drainage lines via the lowest saddle point in the drainage area surrounding each spurious sink. It does not attempt to clear real sinks as supplied by the Sink function. Since sink clearance is subject to the elevation tolerance, the program is conservative when attempting to clear spurious sinks. In other words, it does not clear spurious sinks that would contradict input elevation data by more than the value of Tolerance 1.
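A toy illustration of tolerance-gated sink clearing (this is a simplification, not ANUDEM's algorithm, which infers drainage lines through the lowest saddle of each sink's surrounding drainage area): detect interior cells lower than all eight neighbors, then raise a sink to its lowest neighbor only when the required modification stays within the tolerance.

```python
def find_sinks(z):
    """Interior cells strictly lower than all 8 neighbours (local minimums)."""
    sinks = []
    for r in range(1, len(z) - 1):
        for c in range(1, len(z[0]) - 1):
            nbrs = [z[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)]
            if all(z[r][c] < n for n in nbrs):
                sinks.append((r, c))
    return sinks

def clear_sinks(z, tolerance):
    """Raise each sink to its lowest neighbour, but only when the change
    does not contradict the data by more than `tolerance` (a simplified
    analogue of Tolerance 1 in the real tool)."""
    for r, c in find_sinks(z):
        lowest_nbr = min(z[r + dr][c + dc]
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if (dr, dc) != (0, 0))
        if lowest_nbr - z[r][c] <= tolerance:
            z[r][c] = lowest_nbr
    return z
```

In this sketch a sink 1 unit below its neighbors is cleared under a tolerance of 2, while a sink 4 units deep is left alone and would be reported as a remaining sink, mirroring the conservative behavior described above.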
Drainage enforcement can also be supplemented with the incorporation of stream line data. This is useful when more accurate placement of streams is required.
The drainage enforcement can be turned off, in which case the sink clearing process is ignored. This can be useful if you have contour data of something other than elevation (for example, temperature) for which you want to create a surface.
Use of contour data
Contours were originally the most common method for storing and presenting elevation information. Unfortunately, they are also the most difficult to use properly with general interpolation techniques. The disadvantage lies in the undersampling of information between contours, especially in areas of low relief.
At the beginning of the interpolation process, Topo to Raster uses information inherent to the contours to build a generalized drainage model. By identifying areas of local maximum curvature in each contour, the areas of steepest slope are identified, and a network of streams and ridges is created (Hutchinson, 1988). This information is used to ensure proper hydrogeomorphic properties of the output DEM and can also be used to verify accuracy of the output DEM.
After the general morphology of the surface has been determined, contour data is also used in the interpolation of elevation values at each cell.
When the contour data is used to interpolate elevation information, all contour data is read and generalized. A maximum of 50 data points is read from the contours within each cell. At the final resolution, only one critical point is used per cell. For this reason, a contour density that places several contours across a single output cell is redundant.
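One way to check whether the contour input is denser than the output cell size can use is to count contour vertices per output cell; a sketch (the coordinates and the reuse of the 50-points-per-cell figure as a threshold are illustrative):

```python
from collections import Counter

def vertices_per_cell(vertices, cell_size):
    """Count contour vertices falling in each output cell.
    vertices: iterable of (x, y) coordinates sampled along the contours."""
    return Counter((int(x // cell_size), int(y // cell_size))
                   for x, y in vertices)

def oversampled_cells(vertices, cell_size, limit=50):
    """Cells whose vertex count exceeds the per-cell cap; extra contour
    density in those cells is redundant at this output resolution."""
    return {cell: n for cell, n in vertices_per_cell(vertices, cell_size).items()
            if n > limit}
```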
Multiresolution interpolation
The program uses a multiresolution interpolation method, starting with a coarse raster and working toward the finer, user-specified resolution. At each resolution, drainage conditions are enforced, interpolation is performed, and the number of remaining sinks is recorded in the output diagnostic file.
Processing stream data
The Topo to Raster tool requires stream network data in which all arcs point downslope and there are no polygons (lakes) or braided streams.
The stream data should be composed of single arcs in a dendritic pattern, with any braided streams, parallel stream banks, lake polygons, and so on, cleaned up by interactive editing. When editing lake polygons out of the network, a single arc should be placed from the beginning to the end of the impounded area. The arc should follow the path of a historic streambed if one is known or exists. If the elevation of the lake is known, the lake polygon and its elevation can be used as a CONTOUR input.
To display the direction of the line sections, change the symbology to the Arrow at End option, which draws each line section with an arrow symbol showing its direction.
Creating and mosaicking adjacent rasters
Sometimes it's necessary to create DEMs from adjacent tiles of input data. Normally this happens when input features are derived from a map sheet series or when, due to memory limitations, the input data must be processed in several pieces.
The interpolation process uses input data from surrounding areas to define the morphology and drainage of the surface and interpolate output values. However, the cell values at the edges of any output DEM are not as reliable as in the central area because they are interpolated with one-half as much information.
To make the most accurate predictions at the edges of the area of interest, the extent of the input datasets should be greater than the area of interest. The Margin in cells parameter provides a method for trimming the edges of output DEMs based on a user-specified distance. The edges of overlapping areas should be at least 20 cells wide.
There should be some overlap of input data into the adjacent areas when multiple output DEMs are to be combined into a single raster. Without this overlap, the edges of merged DEMs may not be smooth. The input datasets for each interpolation should therefore cover an even larger area than they would for a single interpolation, so the edges can be predicted as accurately as possible.
When the DEMs have been created, they can best be combined using the geoprocessing Mosaic tool with the Blend or Mean options. This function provides options for handling overlapping areas to smooth the transition between datasets.
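The effect of a blend across an overlap can be illustrated in one dimension: cross-fade linearly from one raster to the other through the overlapping cells. This is a conceptual sketch, not the Mosaic tool's actual implementation.

```python
def blend_overlap(a, b, overlap):
    """a, b: 1-D rows of elevations; the last `overlap` cells of `a` cover
    the first `overlap` cells of `b`. Returns the merged row with a linear
    cross-fade through the overlap, similar in spirit to the Blend option."""
    merged = a[:-overlap]
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # weight ramps from a toward b
        merged.append((1 - w) * a[len(a) - overlap + i] + w * b[i])
    merged += b[overlap:]
    return merged
```

With a wider overlap the ramp is gentler, which is why the input tiles should extend well past the area of interest: the transition is hidden inside data both interpolations predicted reliably.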
Evaluating the output
Every created surface should be evaluated to ensure that the data and parameters supplied to the program resulted in a realistic representation of the surface. There are many ways to evaluate the quality of an output surface, depending on the type of input available to create the surface.
The most common evaluation is to create contours from the new surface with the Contour tool and compare them to the input contour data. It is best to create these new contours at one-half the original contour interval to examine the results between contours. Drawing the original contours and the newly created contours on top of one another can help identify interpolation errors.
Another method of visual comparison is to compare the optional output drainage cover with known streams and ridges. The drainage feature class contains the streams and ridges that were generated by the program during the drainage enforcement process. These streams and ridges should coincide with known streams and ridges in the area. If a stream feature class was used as input, the output streams should almost perfectly overlay the input streams, although they may be slightly more generalized.
A common method for evaluating the quality of a generated surface is to withhold a percentage of the input data from the interpolation process. After generating the surface, the height of these known points can be subtracted from the generated surface to examine how closely the new surface represents the true surface. These differences can be used to calculate a measure of error for the surface, such as the root mean squared (RMS) error.
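The withheld-point check reduces to a straightforward RMS computation; a sketch (the surface and withheld points here are hypothetical):

```python
import math

def rms_error(surface, withheld):
    """Root mean squared difference between the generated surface and
    withheld control points. withheld: {(row, col): true_elevation}."""
    diffs = [surface[r][c] - z for (r, c), z in withheld.items()]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```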
The optional diagnostic file can be used to evaluate how effectively the tolerance settings are clearing sinks in the input data. Decreasing the tolerance values makes the program more conservative when clearing sinks.
There is a minor bias in the interpolation algorithm that causes input contours to have a stronger effect on the output surface at the contour. This bias can result in a slight flattening of the output surface as it crosses the contour. This may result in misleading results when calculating the profile curvature of the output surface but is otherwise not noticeable.
Likely causes of problems with Topo to Raster
- There are insufficient system resources available. The algorithms used in Topo to Raster hold as much information as possible in memory during processing. This allows point, contour, sink, stream, and lake data to be accessed simultaneously. To facilitate processing of large datasets, it is recommended that unnecessary applications be closed before running the tool to free up physical RAM. It is also important to have sufficient amounts of system swap space on disk.
- The contour or point input may be too dense for the output cell size specified. If one output cell covers several input contours or points, the algorithm may not be able to ascertain a value for that cell. To resolve this, try any of the following:
- Decrease the cell size, then resample back to the larger cell size after Topo to Raster.
- Rasterize smaller sections of the input data using the Output extent and Margin in cells. Assemble the resulting component rasters with the Mosaic tool.
- Clip the input data into overlapping sections, and run Topo to Raster separately on each section. Assemble the resulting component rasters with the Mosaic tool.
- The application of a surface interpolator may not be consistent with the input dataset. For example, if there is a sinks input with more points than there would be cells in the output raster, the tool will fail. Densely sampled data sources, such as LiDAR data, may have similar problems. Using the NO_ENFORCE option may help in this case, but a proper understanding of how the interpolator works is important to prevent misapplication.
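A quick sanity check for the point-count issue described above is to compare the number of input point features against the number of cells the output raster will contain. The extent and cell size values in the test are hypothetical.

```python
import math

def fits_output_raster(n_points, extent_width, extent_height, cell_size):
    """True when the input point count does not exceed the number of output
    cells; more points than cells suggests the data source (for example,
    lidar) is too dense for this interpolator at this cell size."""
    n_cells = (math.ceil(extent_width / cell_size)
               * math.ceil(extent_height / cell_size))
    return n_points <= n_cells
```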
References
Goodchild, M. F., and D. M. Mark. 1987. The fractal nature of geographic phenomena. Annals of the Association of American Geographers 77 (2): 265–278.
Hutchinson, M. F. 1988. Calculation of hydrologically sound digital elevation models. Paper presented at the Third International Symposium on Spatial Data Handling, Sydney, Australia.
Hutchinson, M. F. 1989. A new procedure for gridding elevation and stream line data with automatic removal of spurious pits. Journal of Hydrology, 106: 211–232.
Hutchinson, M. F., and T. I. Dowling. 1991. A continental hydrological assessment of a new grid-based digital elevation model of Australia. Hydrological Processes 5: 45–58.
Hutchinson, M. F. 1993. Development of a continent-wide DEM with applications to terrain and climate analysis. In Environmental Modeling with GIS, ed. M. F. Goodchild et al., 392–399. New York: Oxford University Press.
Hutchinson, M. F. 1996. A locally adaptive approach to the interpolation of digital elevation models. In Proceedings, Third International Conference/Workshop on Integrating GIS and Environmental Modeling. Santa Barbara, CA: National Center for Geographic Information and Analysis. See: http://www.ncgia.ucsb.edu/conf/SANTA_FE_CD-ROM/sf_papers/hutchinson_michael_dem/local.html.
Wahba, G. 1990. Spline Models for Observational Data. CBMS-NSF Regional Conference Series in Applied Mathematics. Philadelphia: Society for Industrial and Applied Mathematics.