The UI to the warping algorithm has to depict the source and target volumes in conjunction with the feature elements. Hardware-assisted volume rendering makes possible a UI based solely on direct visualization of the volumes, with the embedded elements interactively scan-converted. On a low-end rendering pipeline, however, the UI must resort to geometric representations of the models embedded in the volumes. These geometric representations can be obtained in either of two ways:
Once geometric representations of the models are available, the animator can use the commercial modeler of his/her choice to specify the elements. Our system, shown in figure 6d, is based on Inventor, the Silicon Graphics (SGI) 3D programming environment. Models are drawn in user-defined materials, usually translucent, in order to distinguish them from the feature elements. The elements, in turn, are drawn in such a way that their attributes --- local coordinate system, scaling factors, and dimensionality --- are graphically depicted and can be altered using a minimal set of widgets.
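The attributes listed above suffice to define a feature element: a local frame, per-axis scaling factors, and a dimensionality that selects a point, line segment, rectangle, or box. The following Python sketch illustrates one plausible representation; the class and field names are our assumptions for exposition, not the system's actual data structures:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class FeatureElement:
    """Illustrative feature element: an oriented, scaled primitive whose
    dimensionality selects a point (0), line (1), rectangle (2), or box (3)."""
    origin: np.ndarray         # position of the element in volume coordinates
    axes: np.ndarray           # 3x3 orthonormal matrix: the local coordinate system
    scales: np.ndarray         # per-axis scaling factors (extent along each local axis)
    dimensionality: int        # 0, 1, 2, or 3

    def to_volume(self, p_local: np.ndarray) -> np.ndarray:
        # Map a point given in the element's local frame into volume
        # coordinates: scale per axis, rotate by the local frame, translate.
        return self.origin + self.axes @ (self.scales * p_local)

# A box element twice as long along its first local axis:
box = FeatureElement(origin=np.array([10.0, 0.0, 0.0]),
                     axes=np.eye(3),
                     scales=np.array([2.0, 1.0, 1.0]),
                     dimensionality=3)
corner = box.to_volume(np.array([1.0, 1.0, 1.0]))  # -> [12., 1., 1.]
```

A UI widget editing such an element would rotate `axes`, drag `scales` handles, or toggle `dimensionality`, with the drawn geometry regenerated from these fields after each interaction.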