

We present a database of 27 identity models and six expression pose models (sadness, anger, happiness, disgust, fear, and surprise), together with software to manipulate the models in ways that are common in the face perception literature, allowing researchers to: (1) create a sequence of renders from interpolations between two or more 3D models (differing in identity, expression, and/or pose), resulting in a “morphing” sequence; (2) create renders by extrapolation in a direction of face space, obtaining 3D “anti-faces” and caricatures; (3) obtain videos of dynamic faces from rendered images; (4) obtain average face models; (5) standardize a set of models so that they differ only in selected facial shape features; and (6) communicate with experiment software (e.g., PsychoPy) to render faces dynamically online. We provide these tools by extending the open-source software MakeHuman.
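To make the face-space operations in (1) and (2) concrete, the sketch below shows the standard linear scheme for morphing and anti-face generation in an abstract parameter space. This is an illustrative simplification, not the toolkit's actual API: the function names (`morph_sequence`, `anti_face`) and the representation of a face as a flat NumPy vector are assumptions for exposition only.

```python
import numpy as np

def morph_sequence(params_a, params_b, n_steps):
    """Linearly interpolate between two face-parameter vectors
    (hypothetical stand-ins for two models' shape parameters),
    yielding the frames of a morphing sequence."""
    params_a = np.asarray(params_a, dtype=float)
    params_b = np.asarray(params_b, dtype=float)
    weights = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - w) * params_a + w * params_b for w in weights]

def anti_face(params, mean_params, strength=1.0):
    """Extrapolate through the average face in the opposite
    direction; with strength > 0 this gives an 'anti-face'.
    Extrapolating away from the mean instead gives a caricature."""
    params = np.asarray(params, dtype=float)
    mean_params = np.asarray(mean_params, dtype=float)
    return mean_params - strength * (params - mean_params)
```

For example, `morph_sequence(identity_a, identity_b, 30)` would produce 30 evenly spaced parameter vectors that, once rendered, form a smooth identity morph; the same interpolation applied to expression parameters yields a dynamic expression sequence.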

A persistent problem in the study of face perception is that results can be confounded by poor stimulus control. Ideally, experiments should precisely manipulate the facial features under study while tightly controlling irrelevant features. Software for 3D face modeling provides such control, but there is a lack of free and open-source alternatives created specifically for face perception research.
