As a three-dimensional object moves through our world, we generally obtain a vivid impression of both its structure and its motion through space. The time course of two-dimensional projections of the scene (optic flow) is important in conveying this three-dimensional information to us. The extent to which we can solve this specific inverse problem, i.e. inferring a three-dimensional scene from two-dimensional flow, depends on the accuracy with which the required flow characteristics are processed by our visual system. Inadequate two-dimensional processing can lead to incomplete representations of the three-dimensional world in which three-dimensional metric information is lost. The motion and structure of objects can then no longer be recovered uniquely. Consequently, metameric classes of three-dimensional representations arise (e.g. only affine properties are conserved). This study investigates under what conditions we find metameric combinations of the perceived attitude and the perceived rotation of a plane. Our subjects are presented with stimuli consisting of two horizontally separated planar patches rotating back and forth in depth about vertical axes. Subjects are required to match both the attitude and the rotation magnitude of the two patches. We vary the attitude from 15 to 60 deg of vertical slant, and the rotation magnitude from 28 to 98 deg. We find that the matched slant and rotation settings vary widely. For high slant values and for small rotations, attitude and rotation settings become highly correlated, suggesting metamery. For low slant values and for large rotations, the correlation almost disappears, suggesting that the two quantities are estimated independently and uniquely. Our paradigm reveals that, with a single task and a single type of stimulus, a gradual transition occurs from unique settings (metric representations) to metameric classes of settings (e.g. affine representations).