One potential application that I'd love to see worked out with this kind of data is to infer and separate lighting and material data from an image (which would require algorithms to guess the original light source(s)) so that artificial light sources can be applied arbitrarily. Imagine taking a picture with a simple flash, and then using software to create whatever dramatic lighting effect you want to generate a new image.
It is a program that takes a single image and generates diffuse, normal, occlusion, specular, and displacement textures from it, in a form that can be used directly as a material in Blender.
As you can see from the video, it can work amazingly well given the limited amount of input information.
I hadn't seen that. So if you can shoot your photos in very flat light, you could get close to the result I'm talking about with very little code beyond what already exists (it would help to reproject the image as a single "texture" wrapped around the 3D data). What I don't know of is an algorithm that "de-lights" an image, bringing it to flat, matte lighting regardless of the original conditions.
Obviously that's something that involves multiple stages, and I can think of three. The first is removing gradients from diffuse shading. The second is lightening shadows to match their surroundings (shadows, incidentally, could probably be used in combination with the 3D structure to infer light sources). Finally, you'd need to identify specular highlights and inpaint them. You might also have to use inpainting in stage 2, in order to deal with full-black shadows.
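To make the three stages concrete, here is a toy sketch in NumPy. Everything about it is an assumption on my part: it works on a single grayscale image in [0, 1], approximates stage 1 (gradient removal) as flat-field division by a heavily blurred copy, stage 2 as lifting pixels far below their local mean, and stage 3 as replacing blown-out highlights with the local mean instead of real inpainting. A serious version would use actual inpainting and the inferred 3D structure; this just shows the shape of the pipeline.

```python
import numpy as np

def _box_blur(img, radius):
    # Separable box blur; edge falloff is corrected by normalizing
    # against a blurred all-ones image.
    k = np.ones(2 * radius + 1)
    def blur1d(a, axis):
        return np.apply_along_axis(
            lambda v: np.convolve(v, k, mode="same"), axis, a)
    num = blur1d(blur1d(img, 0), 1)
    den = blur1d(blur1d(np.ones_like(img), 0), 1)
    return num / den

def delight(gray, radius=15, highlight_thresh=0.95):
    """Toy 'de-lighting' of a grayscale image in [0, 1]:
    1. divide out low-frequency shading gradients,
    2. lift deep shadows toward their surroundings,
    3. replace specular highlights (crude stand-in for inpainting).
    """
    # Stage 1: estimate slow lighting gradients and divide them out.
    shading = _box_blur(gray, radius)
    flat = gray / np.maximum(shading, 1e-3)
    flat = flat / flat.max()  # renormalize to [0, 1]

    # Stage 2: pixels well below their local mean are treated as
    # shadow and pulled halfway toward the surrounding brightness.
    local_mean = _box_blur(flat, radius)
    shadow = flat < 0.5 * local_mean
    flat[shadow] = 0.5 * (flat[shadow] + local_mean[shadow])

    # Stage 3: blown-out pixels in the *original* image are assumed
    # to be specular highlights and overwritten with the local mean.
    highlight = gray > highlight_thresh
    flat[highlight] = local_mean[highlight]
    return np.clip(flat, 0.0, 1.0)
```

On a synthetic test (a constant-albedo surface under a left-to-right lighting gradient), the output has far less variance than the input, which is the point: what remains after de-lighting should be mostly albedo.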