The Creation of a National Multiscale Database for the United States Census

Jonathan P Schroeder, Martin Galanda, Robert B McMaster, Ryan Koehnen

Research output: Book/Report › Commissioned report


Although considerable developments in automated generalization have taken place over the last thirty years, it is still difficult to solve generalization problems with off-the-shelf software due to the limited capability of the algorithms and the complexity of the databases. At the National Historical Geographic Information System (NHGIS) project at the University of Minnesota, work is currently underway to design a multiple-scale database at 1:150,000, 1:400,000, and 1:1,000,000 through the application of data models and generalization algorithms. The NHGIS is taking a two-fold approach, using both specific algorithmic approaches and object-oriented data modeling. Early results from the application of a mixture of standard and custom-tailored algorithms, such as the Douglas and Visvalingam routines, have been promising, especially along coastal areas. Examples of coastal generalization at a variety of scales are provided. The project will continue to develop the needed generalization algorithms as specific geographical conditions are encountered. Current work is identifying the specific constraints, such as distance between points and/or objects, and the specific algorithms needed for generalization at the various scales, and applying these in a comprehensive generalization framework that goes beyond the tract-boundary-specific approach.
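For readers unfamiliar with the line-simplification routines the abstract names, the following is a minimal sketch of the classic Douglas-Peucker algorithm, one of the standard algorithms mentioned. This is an illustrative textbook implementation, not the NHGIS project's actual code; the function names, the sample coordinates, and the tolerance value are all assumptions chosen for demonstration.

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the straight line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:  # degenerate chord: both anchors coincide
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Recursively simplify a polyline: keep the vertex farthest from the
    chord between the current endpoints if it exceeds `tolerance`,
    otherwise collapse the run to the chord itself."""
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > tolerance:
        # Split at the farthest vertex and simplify each half.
        left = douglas_peucker(points[: index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right  # avoid duplicating the split vertex
    return [points[0], points[-1]]

# Hypothetical coastline fragment; a larger tolerance yields coarser output,
# which is how a single source line can serve multiple target scales.
line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6),
        (5, 7), (6, 8.1), (7, 9), (8, 9), (9, 9)]
simplified = douglas_peucker(line, 1.0)
```

The Visvalingam routine differs in that it iteratively removes the vertex whose removal changes the line's area the least, which often behaves better than Douglas-Peucker along intricate coastlines, consistent with the abstract's focus on coastal generalization.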
Original language: English (US)
State: Published - 2005


