What is Data Reviewer?

To produce high-quality map products and perform accurate data analysis, your source database must be of high quality and well maintained. Data Reviewer supports data production and analysis by providing a complete system for automating and simplifying data quality control, which can quickly improve the integrity of your data.

Data Reviewer helps you manage both quality control and analysis of your data. For instance, a building detected in the ocean would most likely be an error identified during quality control. A building detected within a 100-year flood zone, on the other hand, may not be an error, but it could be useful analytic information to an insurance adjuster. The checks provided by Data Reviewer support both kinds of review.

Data analysis tools

Data Reviewer creates a dataset and table in the geodatabase you are using for a session. The dataset contains point, line, and polygon feature classes holding the features that have been written to the Reviewer table. The Reviewer dataset can be stored in any existing geodatabase, including the production geodatabase, or in a separate file or personal geodatabase created specifically for review. If stored in an enterprise or ArcSDE database, the Reviewer dataset tables and feature classes support versioning.

Spatial checks analyze the spatial relationships of features. You can analyze whether features overlap, intersect, touch, or are within a specified distance of each other. For instance, you may want to check that a road does not cross into the ocean, or that a fire hydrant is connected to a water lateral.
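The distance-based rule above can be sketched in plain Python. This is a minimal illustration of the geometry behind such a check, not Data Reviewer's implementation; the hydrant and lateral coordinates are hypothetical.

```python
import math

def within_distance(point, segment, tolerance):
    """Return True if `point` lies within `tolerance` of the line `segment`.

    `point` is (x, y); `segment` is ((x1, y1), (x2, y2)).
    """
    (x1, y1), (x2, y2) = segment
    px, py = point
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:                 # degenerate segment: a point
        return math.hypot(px - x1, py - y1) <= tolerance
    # Project the point onto the segment, clamped to the endpoints.
    t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
    nearest = (x1 + t * dx, y1 + t * dy)
    return math.hypot(px - nearest[0], py - nearest[1]) <= tolerance

# Hypothetical rule: a hydrant must sit within 2 map units of its water lateral.
lateral = ((0.0, 0.0), (100.0, 0.0))
print(within_distance((50.0, 1.5), lateral, 2.0))   # → True
print(within_distance((50.0, 25.0), lateral, 2.0))  # → False
```

A hydrant farther than the tolerance from every lateral would be written to the Reviewer table as a result for follow-up.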

Attribute checks analyze the attribute values of features and tables. This can be simple field validation, similar to a geodatabase domain, or validation of more complex attribute dependencies, where one attribute of a feature depends on another attribute of the same feature. For instance, if a road is under construction, it may not be accessible. An attribute check can be configured to monitor the status and accessibility of roads.
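The road example can be expressed as a small validation function. This is an illustrative sketch of an attribute-dependency rule, not Data Reviewer's own check; the field names `status` and `accessible` and the domain values are assumptions.

```python
# Hypothetical attribute-dependency rule: a road under construction
# must not be flagged as accessible, and its status must come from
# an assumed domain of allowed values.
ALLOWED_STATUS = {"open", "closed", "under_construction"}

def check_road(record):
    """Return a list of human-readable problems found in one road record."""
    problems = []
    if record.get("status") == "under_construction" and record.get("accessible"):
        problems.append("road is under construction but flagged as accessible")
    if record.get("status") not in ALLOWED_STATUS:
        problems.append("status %r is not in the allowed domain" % record.get("status"))
    return problems

print(check_road({"status": "under_construction", "accessible": True}))
print(check_road({"status": "open", "accessible": True}))  # → []
```

Each non-empty result corresponds to a record that an attribute check would flag for review.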

Feature integrity checks analyze the properties of features. Not all features in a database follow the same capture criteria. You might have collection rules that define how close two vertices may be or whether multipart features are allowed in your data. Feature integrity checks ensure the collection rules are followed for each feature class. For instance, the cutbacks check can be used to identify features that contain sharp angles.
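The cutbacks idea can be sketched as an angle test over a polyline's vertices. This is an illustrative approximation of what such a check looks for, assuming a hypothetical minimum-angle threshold; it is not the tool's implementation.

```python
import math

def sharp_vertices(vertices, min_angle_deg=30.0):
    """Return indices of interior vertices whose interior angle is sharper
    than `min_angle_deg` — a potential cutback."""
    flagged = []
    for i in range(1, len(vertices) - 1):
        ax, ay = vertices[i - 1]
        bx, by = vertices[i]
        cx, cy = vertices[i + 1]
        v1 = (ax - bx, ay - by)             # vector back to previous vertex
        v2 = (cx - bx, cy - by)             # vector forward to next vertex
        n1, n2 = math.hypot(*v1), math.hypot(*v2)
        if n1 == 0 or n2 == 0:              # duplicate vertex; skip
            continue
        cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < min_angle_deg:
            flagged.append(i)
    return flagged

# A line that doubles back sharply at its second vertex:
line = [(0, 0), (10, 0), (0, 1)]
print(sharp_vertices(line))  # → [1]
```

A straight or gently curving line returns an empty list; only vertices that fold back on themselves are flagged.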

Metadata checks analyze the metadata of feature datasets and feature classes. Metadata can contain critical information about the source used to collect the derived data, which can significantly affect its reliability. For instance, the date range of the source data could affect the accuracy of maps produced and the analysis performed using the data.

Managed review of data is essential to completing data analysis. Whether you review the data through automated checks or visually, it is essential to understand the integrity of the entire database. Polygon grids can be created over the data with an arbitrary cell size, specified either by a number of rows and columns or in map units. You can then use the Reviewer Overview window with the grid to navigate to specific cells, run your batch jobs, or visually review the data. Once the data in a cell has been reviewed, the status of the cell can be changed so you can monitor which areas have been completed.
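The rows-and-columns form of grid creation amounts to dividing an extent into equal rectangular cells. The sketch below shows that arithmetic under assumed coordinates; the extent values are hypothetical, and Data Reviewer's grid tool adds status tracking on top of this.

```python
def make_grid(xmin, ymin, xmax, ymax, rows, cols):
    """Divide an extent into rows x cols rectangular cells, returned as
    (row, col, cell_xmin, cell_ymin, cell_xmax, cell_ymax) tuples."""
    width = (xmax - xmin) / cols
    height = (ymax - ymin) / rows
    cells = []
    for r in range(rows):
        for c in range(cols):
            cells.append((r, c,
                          xmin + c * width, ymin + r * height,
                          xmin + (c + 1) * width, ymin + (r + 1) * height))
    return cells

# Hypothetical extent of 100 x 50 map units, split into a 2 x 4 grid:
grid = make_grid(0, 0, 100, 50, rows=2, cols=4)
print(len(grid))   # → 8
print(grid[0])     # → (0, 0, 0.0, 0.0, 25.0, 25.0)
```

Each cell then serves as a unit of work: run batch jobs or visual review against it, and mark it complete when done.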

You can sketch a geometry to provide better communication about missing features and features with inaccurate shapes. Use the Notepad or Flag Missing Feature tool to capture a sketch to provide specific geometric information about the feature that is in question.

Batch jobs

Define your specification rules in a batch job to ensure consistent and repeatable data analysis. Batch jobs can be thought of as a way of organizing and storing your analysis rules so they can be executed or distributed to internal or external team members. They can contain many checks that cover a wide range of data types, or they can be focused on a single theme. This gives you the flexibility to run batch jobs as necessary. Batch jobs can be scheduled to run through a service, run interactively with the Reviewer Batch Validate tool, or included in scripts or as part of a larger workflow.

Reviewer Batch Validate can be used to run a batch job while you are working with data in ArcMap.

The Data Reviewer service is a Windows service that can be scheduled to run Reviewer batch jobs. Similar to running a batch job using the Reviewer Batch Validate tool, the Reviewer service validates and runs batch jobs and writes the results to the Reviewer table in a specified Reviewer session. Batch jobs can be scheduled to run once at a specific date and time or to run repeatedly at regular intervals.

Workflow Manager organizes the steps necessary to complete a job. You can define a step in your workflow that identifies the appropriate time to run a batch job, then initiate it directly from the step. Other steps in the workflow can create a Reviewer session or open the Reviewer session when starting ArcMap so any results of running the batch job are seamlessly displayed.

Scripting can call the Execute Reviewer Batch Job script in the Data Reviewer toolbox. This allows you to run a batch job on a workspace and write the results to the Reviewer table in a specified Reviewer session. The same functionality used in the Python script can be called through the command line or in a batch script (.bat).
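A scripted run might look like the following sketch. The Execute Reviewer Batch Job geoprocessing tool is real, but the exact parameter order varies by release, so treat the call below as approximate and check the tool's documentation; the paths and session name are hypothetical placeholders.

```python
def run_batch_job(reviewer_workspace, session, batch_job_file, production_workspace):
    """Run a Reviewer batch job via geoprocessing and write its results to
    the Reviewer table in the given session.

    NOTE: parameter names and order here are an approximation; consult the
    Execute Reviewer Batch Job tool reference for your ArcGIS release.
    """
    import arcpy  # deferred import: requires an ArcGIS installation

    arcpy.CheckOutExtension("datareviewer")
    try:
        return arcpy.ExecuteReviewerBatchJob_Reviewer(
            reviewer_workspace,    # geodatabase holding the Reviewer dataset
            session,               # e.g. "Session 1 : Session 1" (hypothetical)
            batch_job_file,        # path to the .rbj batch job file
            production_workspace)  # geodatabase containing the data to check
    finally:
        arcpy.CheckInExtension("datareviewer")

# Hypothetical usage (paths are placeholders):
# run_batch_job(r"C:\data\Reviewer.gdb", "Session 1 : Session 1",
#               r"C:\data\roads_checks.rbj", r"C:\data\Production.gdb")
```

Because the tool is exposed through geoprocessing, the same call works in a stand-alone script launched from the command line or from a .bat file.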

Sample

Often, the database being analyzed is very large, which makes complete analysis of all values in the database impractical or impossible. A sample represents a subset of the database that is of manageable size. If the sample is unbiased, analysis of the sample can be used to represent the database as a whole. This can significantly reduce the time necessary to analyze a database. The sampling check can be used to generate a statistical or random sample of features or records from one or more layers.
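The core of an unbiased random sample is selection without replacement, which the Python standard library provides directly. This sketch only illustrates the statistical idea; the population of record IDs is hypothetical, and the sampling check itself operates on layers and tables.

```python
import random

def sample_records(record_ids, size, seed=None):
    """Draw a simple random sample of `size` record IDs, without replacement.

    Every record has an equal chance of selection, so the sample is
    unbiased with respect to the population. `seed` makes a run repeatable.
    """
    rng = random.Random(seed)
    return rng.sample(record_ids, size)

# Hypothetical population of 1,000 feature object IDs; review only 50.
population = list(range(1, 1001))
subset = sample_records(population, 50, seed=42)
print(len(subset))               # → 50
print(len(set(subset)) == 50)    # → True (no record drawn twice)
```

Reviewing only the 50 sampled records, rather than all 1,000, is what makes analysis of a very large database tractable while still supporting conclusions about the whole.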

Reviewer session

A Reviewer session is used to store and manage the results of your data analysis throughout the life cycle of an anomaly. It allows you to organize the results discovered from running batch jobs and visual inspections. Reviewer sessions are stored in either a centrally located geodatabase for multiuser access or in a local geodatabase. You can create a geodatabase specifically for your Reviewer session or include it with your feature geodatabase. Information about the analysis can include notes, status, the person who performed the analysis, and when it was done. Additionally, a representative sketch, thumbnail image, or a geometry describing the anomaly, such as the specific area of overlap, is captured.

Reviewer table

The Reviewer table is the interface for accessing and interacting with the information stored in Reviewer sessions. Through the Reviewer table, you can manage the stages of the life cycle of an anomaly. When you click a record in the table, the corresponding feature is selected and automatically centered for easy identification. The feature can then be modified based on the information in the record, or, if necessary, analyzed further.

Reviewer reports

Once data analysis is completed by running batch jobs and by visual inspection, reports of the results can be generated. The reporting tools allow you to organize the results in a variety of ways that help you understand the data.

Related topics


7/10/2012