Rovers have provided the ability to view planetary surfaces from different positions on the surface, allowing multiple views covering the same surface patch. By mid-November 2010, the MER PanCams (Mars Exploration Rover Panoramic Cameras) had acquired more than three hundred thousand images. About 25 percent are ordinary images covering landscape or objects of interest; the others serve calibration, meteorological, astronomical, or Sun-tracking purposes. Many PanCam sets are multi-wavelength couples that can be processed in the same way without additional manual effort. NavCam (Navigation Camera) image sets consist of some low-resolution samples and thumbnails. The number of (geometrically) different MER images usable in the context of PRoViDE (Planetary Robotics Vision Data Exploitation) is expected to be between 10,000 and 20,000. A similar order of magnitude may apply to Lunokhod imagery; however, the intention is to use only images of acceptable quality in the sense of 3D data exploitation.
As mentioned earlier, LBS (long-baseline stereo) images were taken several times during both MER missions: take a picture, drive a few meters, and take another picture. These long-baseline stereo images fill a mid-range in resolution between the typical in-situ images and the orbital images, and are valuable in the overall PRoViDE context. It is likely that a variety of (multi-sensor, multi-temporal, multi-view) imagery covers the same scenes and could be processed into 3D data sets, without ever having been brought into such a context. We may call these "serendipitous" stereo pairs, to be found by novel robust image-browsing strategies such as visual image search based on image indexing, image feature selection, and geometric verification.
The task is twofold: first, implement a capability to search for serendipitous LBS images – those that were not specifically targeted as LBS, but are usable anyway; second, process these images systematically, similar to the baseline MER process.
The image search and matching technique is based on image features that are invariant to large viewpoint changes and hence facilitate LBS reconstruction and object recognition in images. Figure 10 shows an example in which wide-baseline images from MER Opportunity were automatically matched, providing a reconstruction of a large crater area. Similarly, the trench of the Phoenix mission was automatically reconstructed by the above-mentioned image matching and structure from motion.
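A key step in matching wide-baseline image pairs is geometric verification: putative feature correspondences (in practice obtained from a viewpoint-invariant detector/descriptor such as SIFT) are filtered with RANSAC so that only matches consistent with a single geometric transform survive. The sketch below, a simplification of what such a pipeline would do, fits a 2-D similarity transform to minimal two-point samples and keeps the model with the most inliers; the function names are illustrative, and real wide-baseline systems would estimate a fundamental matrix or homography instead of a similarity.

```python
import numpy as np

def estimate_similarity(p, q):
    """Fit a 2-D similarity transform (scale + rotation + translation)
    to two point correspondences p[i] -> q[i], using the complex-number
    parameterization q = a*p + b with a, b complex."""
    pc = p[:, 0] + 1j * p[:, 1]
    qc = q[:, 0] + 1j * q[:, 1]
    den = pc[1] - pc[0]
    if den == 0:                      # degenerate sample: coincident points
        return None
    a = (qc[1] - qc[0]) / den
    b = qc[0] - a * pc[0]
    return a, b

def ransac_verify(src, dst, iters=200, tol=3.0, seed=0):
    """RANSAC over putative correspondences src[i] <-> dst[i]:
    repeatedly fit a similarity to a random 2-point sample and
    return the inlier mask of the best-supported model."""
    rng = np.random.default_rng(seed)
    sc = src[:, 0] + 1j * src[:, 1]
    dc = dst[:, 0] + 1j * dst[:, 1]
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i, j = rng.choice(len(src), size=2, replace=False)
        fit = estimate_similarity(src[[i, j]], dst[[i, j]])
        if fit is None:
            continue
        a, b = fit
        residuals = np.abs(a * sc + b - dc)   # reprojection error in pixels
        inliers = residuals < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```

The surviving inlier set both verifies that two images genuinely overlap (useful when searching for serendipitous pairs) and provides the clean correspondences needed for subsequent stereo or structure-from-motion reconstruction.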
This image indexing and matching has been used successfully in visual object search and recognition and offers an interesting approach for serendipitous science, since similar visual features can be learned and searched in large image sets. Both very frequent and exceptional visual appearances can be discovered, located, and even reconstructed this way. This would allow a scientist to query all mission imagery by selecting an area in one image and retrieving many (if not all) other images with visually similar features: a future added value of the PRoViDE (Planetary Robotics Vision Data Exploitation) processed data suite. Needless to say, the techniques are not MER-specific, but should be usable for any rover, including MSL (Mars Science Laboratory) and ExoMars, as well as hand-held / tripod imagery from Apollo.
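The query-by-region capability described above is commonly realized with a bag-of-visual-words index: local descriptors are quantized against a learned vocabulary, and an inverted index maps each visual word to the images containing it. The toy sketch below, under the assumption of small 2-D descriptors and a tiny k-means vocabulary, illustrates the idea; all class and function names are hypothetical, and production systems would use large vocabularies, tf-idf weighting, and high-dimensional descriptors such as SIFT.

```python
import numpy as np
from collections import defaultdict

def _kmeanspp_init(X, k, rng):
    """k-means++ seeding: spread initial centers across the data."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min(np.linalg.norm(X[:, None] - np.array(centers)[None],
                                   axis=2) ** 2, axis=1)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)

def build_vocabulary(descriptors, k=8, iters=20, seed=0):
    """Tiny k-means clustering to build a visual vocabulary."""
    rng = np.random.default_rng(seed)
    centers = _kmeanspp_init(descriptors, k, rng)
    for _ in range(iters):
        d = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            members = descriptors[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return centers

def quantize(descriptors, centers):
    """Assign each descriptor to its nearest visual word."""
    d = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
    return d.argmin(axis=1)

class InvertedIndex:
    """Visual word -> image postings; a query image (or image region)
    retrieves images ranked by the number of shared visual words
    (a simplification of tf-idf scoring)."""
    def __init__(self, centers):
        self.centers = centers
        self.postings = defaultdict(set)
    def add_image(self, image_id, descriptors):
        for w in set(quantize(descriptors, self.centers).tolist()):
            self.postings[w].add(image_id)
    def query(self, descriptors):
        scores = defaultdict(int)
        for w in set(quantize(descriptors, self.centers).tolist()):
            for img in self.postings[w]:
                scores[img] += 1
        return sorted(scores.items(), key=lambda kv: -kv[1])
```

In a PRoViDE-style deployment, the descriptors of a user-selected image region would form the query, and the ranked candidate images would then pass through geometric verification to confirm genuine scene overlap.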