Smartphones are becoming our primary camera and our primary way to consume images. The technical differences between those devices are shrinking. How do you differentiate between them?
Since all devices now ship with approximately the same hardware, from sensor to optics, the differences must come from the software. Take Android: despite running on most smartphones, the manufacturers using the OS can still pre-load their devices with advanced options.
My point here is not to list these options or say which is better than the others, but to highlight how many are now available. What I find interesting is what users have come to take for granted: autofocus, white balance, HDR, face detection, low-light shooting, panorama. It's worth asking how well people actually understand these tools; the marketing departments surely have something to say about that last point.
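To make one of those taken-for-granted features concrete, here is a minimal sketch of automatic white balance using the classic gray-world assumption (the average color of a scene is neutral gray). This is a textbook illustration, not what any particular phone actually runs; real pipelines are far more sophisticated.

```python
import numpy as np

def gray_world_white_balance(image):
    """Scale each channel so its mean matches the overall gray mean.

    `image` is assumed to be an H x W x 3 float array in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)  # per-channel average
    gray = channel_means.mean()                        # target neutral level
    gains = gray / channel_means                       # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)

# A flat patch with a bluish cast: blue brighter than red and green.
img = np.zeros((2, 2, 3))
img[..., 0] = 0.2  # R
img[..., 1] = 0.3  # G
img[..., 2] = 0.6  # B
balanced = gray_world_white_balance(img)
```

After correction the three channel means are equal, so the cast is removed; the trade-off is that gray-world fails on scenes that genuinely are dominated by one color.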
Knowing a bit about the production workflow behind this, I really see the battle of new algorithms and automatic image processing as the way for companies to market their image pipeline (see the latest iPhone campaign). It's a bit like the battle between film manufacturers (Kodak, Agfa, Fuji...) back in the day, where at the same ISO sensitivity (my favorite was Fujifilm's 400 color stock) the same color signal would come out slightly different on paper. The camera lets you record the scene, and the film, or now the digital pipeline, defines the look of the final image.
So what can you really do? "A lot," the color scientist will answer. It is important, and very interesting, to study how our visual system functions and how it adapts to lighting conditions (e.g. how to improve an image viewed on your phone in low light). All of that sounds pretty cool to me; color scientists can be useful.
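The low-light viewing example above can be sketched very simply: lift the shadows with a gamma curve so a dim display stays readable. The gamma value here is purely illustrative; a real phone would derive its correction from the ambient-light sensor and a proper model of visual adaptation.

```python
import numpy as np

def adapt_for_dim_viewing(image, gamma=0.7):
    """Brighten shadows and midtones for viewing in a dark environment.

    Gamma < 1 lifts dark values while leaving black and white fixed.
    The value 0.7 is a hypothetical choice for illustration only.
    `image` holds values in [0, 1].
    """
    return np.clip(image, 0.0, 1.0) ** gamma

pixels = np.array([0.0, 0.1, 0.5, 1.0])
adapted = adapt_for_dim_viewing(pixels)
```

Note that the endpoints are untouched (0 stays 0, 1 stays 1) while everything in between is brightened, which is why gamma curves are a common building block for this kind of display adaptation.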
More to come in the future.