Workflow strategies for loading data
There are several things to consider when deciding whether to use the Object Loader in ArcMap or the Simple Data Loader in ArcCatalog or the Catalog window. The most important of these is time. The Simple Data Loader is faster than the Object Loader because it doesn't validate or process data as it loads. If you aren't under a time constraint, the Object Loader is a good option because it allows you to load into geometric networks, feature-linked annotation, and feature classes that participate in a relationship class.
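The trade-off above can be sketched as a small decision helper. This is purely illustrative; the function and parameter names are hypothetical, not part of any ArcGIS API:

```python
def choose_loader(time_constrained,
                  target_has_network=False,
                  target_has_feature_linked_annotation=False,
                  target_in_relationship_class=False):
    """Sketch of the loader decision described above.

    The Object Loader validates and processes as it loads, so it is the
    tool to use when the target participates in a geometric network,
    feature-linked annotation, or a relationship class. Otherwise the
    faster Simple Data Loader wins when time matters.
    """
    needs_validation = (target_has_network
                        or target_has_feature_linked_annotation
                        or target_in_relationship_class)
    if needs_validation:
        return "Object Loader"
    return "Simple Data Loader" if time_constrained else "Object Loader"
```

For example, loading plain feature classes under a deadline would pick the Simple Data Loader, while any target with feature-linked annotation forces the Object Loader regardless of time pressure.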
Loading into geometric networks
If you do have a time constraint, consider that loading a lot of data into a network feature class with the Object Loader can take a long time, especially if the network is large and consists of several feature classes. So if you are creating a network from scratch, load all the data with the Simple Data Loader before you build the network. If you have already built the network, instead of using the Object Loader, save time by deleting the geometric network, using the Simple Data Loader to load the data into the feature classes, and then rebuilding the network.
Loading into versioned feature classes and tables
Loading into versioned feature classes and tables is slower than loading into feature classes and tables that are not versioned. If you are migrating data to a geodatabase, load the data before registering it as versioned. Once you have finished migrating your data and applications to the geodatabase, register your feature classes and tables as versioned. You can then load any updates into the versioned feature classes and tables.
If your data is already registered as versioned and you need to load into versioned feature classes, the most straightforward approach is to use the Object Loader. Loading in an ArcMap edit session ensures that changes will merge and allows you to review other edits before you save the newly loaded features. It also allows you to take advantage of the conflict resolution capabilities of ArcMap, should you need them.
However, if you are loading a large number of features and time is a factor, you can save time by preparing your data as follows:
1. Reconcile and post each outstanding version in the database against the default version. After posting, delete each version.
2. Run Compress to compress the database.
3. Unregister the data as versioned.
   Note: If you have not completed steps 1 and 2 before unregistering your data as versioned, you will lose any edits those versions contain.
4. Delete any geometric network.
5. Use the Simple Data Loader in ArcCatalog or the Catalog window to load the new data into your existing feature classes.
6. Rebuild the geometric network using the Build Geometric Network wizard in ArcCatalog or the Catalog window.
7. Register your data as versioned and continue with production. Registering the data as versioned automatically updates the database statistics for the feature classes.
This method has some limitations:
- You cannot use this method if your network has any complex junction features with connection points and custom topology, because batch rebuilding the network will not re-create the custom topology.
- Rebuilding the network reconnects all network features, including any you may have deliberately disconnected from the network.
- If any of the feature classes into which you are loading data have feature-linked annotation, you cannot use the Simple Data Loader. In this case, you must use the Object Loader.
- This method is incompatible with some workflows. If you have outstanding versions that cannot be reconciled and posted to the default version, such as incomplete design versions, versions not yet ready for posting, or historical versions, you cannot use this method. In that case, use the Object Loader and append your data as part of an edit session.
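These limitations can be captured as a simple precondition check. The data model below is hypothetical and purely illustrative, not an ArcGIS API:

```python
def can_use_fast_load_path(outstanding_versions,
                           has_complex_junctions_with_custom_topology=False,
                           has_feature_linked_annotation=False):
    """Return (ok, reason) for the fast Simple Data Loader workflow.

    Mirrors the limitations listed above: every outstanding version must
    be reconcilable and postable to the default version, the network must
    not contain complex junctions with custom topology, and no target
    feature class may have feature-linked annotation.
    """
    if has_complex_junctions_with_custom_topology:
        return False, "rebuilding the network will not re-create custom topology"
    if has_feature_linked_annotation:
        return False, "the Simple Data Loader cannot load feature-linked annotation"
    for v in outstanding_versions:
        if not v.get("can_post_to_default", False):
            return False, f"version {v['name']!r} cannot be reconciled and posted"
    return True, "ok"
```

If the check fails, the fallback is the slower but safer route: the Object Loader inside an edit session.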
When you use either the Simple Data Loader or Object Loader, the data loads into the delta tables. Therefore, after you've finished loading into any feature class or table registered as versioned without the option to move edits to the base table, reconcile each version with the DEFAULT, then run Compress on your database to push all the records from the delta tables to the base tables. Having your data in the base tables results in better query speed than if you have large amounts of data in your delta tables. For more details on compressing your database to improve performance, see Compressing the database.
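The reason compressing helps query speed can be seen in a toy model of versioned storage. This is a great simplification; real delta tables record adds and deletes keyed by state IDs, but the shape of the cost is the same:

```python
class VersionedTable:
    """Toy model of a versioned table: queries must merge the base table
    with the delta rows, so a large delta slows every read. Compress
    folds the delta records back into the base table."""

    def __init__(self, base_rows):
        self.base = list(base_rows)
        self.adds = []        # stand-in for the adds (A) delta table
        self.deletes = set()  # stand-in for the deletes (D) delta table

    def load(self, rows):
        # Loading into a versioned class writes to the delta tables,
        # not the base table.
        self.adds.extend(rows)

    def query(self):
        # Every query must merge the base table with both deltas.
        return [r for r in self.base if r not in self.deletes] + self.adds

    def compress(self):
        # Push the delta records into the base table, so subsequent
        # queries read the base table alone.
        self.base = self.query()
        self.adds, self.deletes = [], set()
```

After `compress()`, queries return the same rows but no longer pay the merge cost, which is the performance benefit the paragraph above describes.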
Loading into feature classes that have topology
Loading your data before creating topologies eliminates the overhead of creating dirty areas for each new feature that you insert into a participating feature class. If the topology is created after the data is loaded, a single dirty area spanning all the features is created, which can then be validated as described in Topology.
If you're loading into a feature class that has topology, you can load with either the Object Loader or the Simple Data Loader. However, neither tool validates topology as the features load, so the end result is the same—you will need to validate the topology yourself once you've finished loading.
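A rough sketch of why loading before creating the topology is cheaper, in terms of dirty-area bookkeeping (real dirty areas are geometries, not counters, so this only illustrates the scaling):

```python
class TopologyState:
    """Toy counter for dirty areas in a feature class with topology."""

    def __init__(self):
        self.dirty_areas = 0

    def load_features(self, n, topology_exists):
        # With an existing topology, each inserted feature dirties
        # its own area; without one, loading creates no dirty areas.
        if topology_exists:
            self.dirty_areas += n

    def create_topology(self):
        # Creating the topology after loading produces a single dirty
        # area spanning all the features, validated in one pass.
        self.dirty_areas = 1

    def validate(self):
        # Validation clears the dirty areas in either workflow.
        self.dirty_areas = 0
```

Either way the end state is the same, which matches the point above: neither loader validates topology, so you validate once after loading.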
Loading data from another coordinate system
Suppose you want to load data that's in a coordinate system other than the coordinate system of the destination feature class. For example, you may want to load features from the North American Datum (NAD) 1927 coordinate system into a feature class that uses the NAD 1983 coordinate system. Before you load the features, use the Project tool to convert them to the new coordinate system.
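A minimal pre-flight check for this situation might compare well-known IDs before loading. EPSG 4267 is NAD 1927 and EPSG 4269 is NAD 1983; the `load_plan` helper is hypothetical, and the actual projection would be done with the Project geoprocessing tool:

```python
NAD27, NAD83 = 4267, 4269  # EPSG well-known IDs for the two datums

def load_plan(source_wkid, target_wkid):
    """Return the ordered steps for loading: project first if the
    coordinate systems differ, since loading does not reproject
    features on the fly in this workflow."""
    steps = []
    if source_wkid != target_wkid:
        # Run the Project tool first; a NAD27 -> NAD83 conversion also
        # needs a datum transformation (e.g. a NADCON-based one).
        steps.append(f"project {source_wkid} -> {target_wkid}")
    steps.append("load features")
    return steps
```

When the IDs already match, the plan collapses to the load step alone.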
Loading datasets containing large text fields from a personal geodatabase to an ArcSDE geodatabase
There may be times when you need to move your data from an ArcSDE geodatabase to a personal geodatabase, then move it back to the ArcSDE geodatabase. If the ArcSDE datasets contain text fields longer than 255 characters, those fields are stored as memo fields in Microsoft Access when you copy or load the data into a personal geodatabase.
Because memo fields in Access do not record a field length, ArcGIS interprets them as BLOBs and assigns them a length of 2,147,483,647 characters. Since it is unlikely that you are actually storing that many characters in the field, be sure to record the intended field length in your metadata.
If you try to copy the data from the personal geodatabase and paste it back into your ArcSDE geodatabase, the paste will fail because it will attempt to create a text field that is 2,147,483,647 characters in length. Most database management systems do not support text fields of that length.
To avoid this, create the feature class in the ArcSDE geodatabase and define the text field there with its intended length. Then use the Simple Data Loader or the Object Loader to load the data, mapping the personal geodatabase text field to the text field you defined in the ArcSDE feature class.
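The fix above amounts to schema preparation: detect the inflated memo length and substitute the intended length before creating the target feature class. The field names and the `intended_lengths` mapping below are hypothetical stand-ins for values you would take from your metadata:

```python
ACCESS_MEMO_LENGTH = 2_147_483_647  # length ArcGIS reports for Access memo fields

def target_text_lengths(source_fields, intended_lengths):
    """Build target field lengths for the ArcSDE feature class,
    replacing the bogus memo length with the documented intended
    length for each affected field."""
    out = {}
    for name, length in source_fields.items():
        if length == ACCESS_MEMO_LENGTH:
            if name not in intended_lengths:
                raise ValueError(f"no intended length recorded for {name!r}")
            out[name] = intended_lengths[name]
        else:
            out[name] = length
    return out
```

With the corrected lengths in hand, you would create the feature class and then run either loader, mapping each source field to its corrected target field.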