The iPhone's metering is suspicious. It seems to average all areas of the frame equally, resulting in underexposure when the background's exposure differs from the exposure you'd use to capture the subject. Of course this will happen with any camera, but my A7ii seems to be very good at picking a correct exposure. (And it has exposure compensation and manual mode, so I can always override its choice, which is all I really ask.) You can see this demonstrated in the "backlit" photo in the article: I think the exposure could go up a bit more to pull some detail out of the shadowy clouds and the boat, with the only side effect of blowing out that one cloud further. (It's already gone.)
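The averaging behavior is easy to see with a toy example. This is purely my own sketch with made-up luminance numbers, nothing to do with Apple's actual metering code:

```python
# Toy sketch of average vs center-weighted metering on a backlit scene.
# My own illustration; this is not Apple's actual metering algorithm.

# Hypothetical 6x6 luminance frame: bright sky (200) surrounding a
# dark subject (30) in the middle 2x2.
frame = [[200.0] * 6 for _ in range(6)]
for r in (2, 3):
    for c in (2, 3):
        frame[r][c] = 30.0

pixels = [v for row in frame for v in row]

# Straight average: every pixel counts equally.
average_meter = sum(pixels) / len(pixels)

# Center-weighted: the middle of the frame counts 8x as much.
def weight(r, c):
    return 8.0 if r in (2, 3) and c in (2, 3) else 1.0

weighted = [(frame[r][c], weight(r, c)) for r in range(6) for c in range(6)]
center_weighted = sum(v * w for v, w in weighted) / sum(w for _, w in weighted)

# The straight average reads the scene much brighter, so the camera
# picks a shorter exposure and the backlit subject comes out dark.
print(round(average_meter))    # ~181
print(round(center_weighted))  # 115
```

A meter that reads 181 exposes for the sky; one that reads 115 gives the subject a fighting chance.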
I also think the iPhone's sensor is capable of collecting more light data than Apple lets it keep. It's amazing how much dynamic range modern sensors pick up. When I first got my A7ii I put it into auto-bracket mode; by default it does something like -1/3, 0, +1/3, which adds essentially no data. After some more testing, even at +3, 0, -3 you can still recover any of the three exposures from any one of the others with no significant loss of detail. So I don't bracket anymore, and I've never had to throw away a picture because of the exposure.
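The reason recovery works: in linear (RAW-ish) space, exposure compensation is just a power-of-two gain, so any unclipped bracket can be synthesized from any other. A toy sketch of my own, with made-up values:

```python
# Why a +/-1/3 bracket adds no data: pushing or pulling a linear sensor
# value by N stops is just multiplying by 2**N, which is reversible as
# long as nothing clipped. My own toy sketch, made-up values.
def apply_ev(linear_value, ev):
    """Push or pull a linear sensor value by `ev` stops."""
    return linear_value * (2.0 ** ev)

base = 0.05                           # unclipped value from the 0 EV frame
plus_three = apply_ev(base, +3)       # what the +3 bracket records: 8x the light
recovered = apply_ev(plus_three, -3)  # pull it back down in post

print(plus_three == base * 8)  # True
print(recovered == base)       # True -- nothing lost while unclipped
```

The only thing a real bracket buys you is coverage of highlights that would clip at the base exposure, which is exactly why huge RAW headroom makes bracketing unnecessary.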
That said, this is all with RAW files, which capture data that isn't visible in the JPEG. The iPhone tone-maps all that data down to a JPEG, and its tone mapping works differently than how I'd do it manually in Lightroom. The RAW files off my camera capture so much data that you can ridiculously under- or over-expose and still get something that looks nice on screen. (Not as perfect for the pixel-peepers as the correct exposure, of course, but something that would look fine on your wall at 8x10 or shared to G+.) The iPhone's JPEGs are unfixable in Lightroom; the data just isn't there, and all you can do is make bright areas brighter or dark areas darker, which you almost never want to do.
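Here's a rough way to see why the pushed JPEG falls apart while the RAW doesn't. This is a toy bit-depth model, not Lightroom's actual pipeline:

```python
# Toy model (not Lightroom's actual pipeline) of pushing shadows three
# stops in an 8-bit JPEG vs a 14-bit RAW: the RAW keeps fine tonal
# steps, while the JPEG's shadows collapse into coarse bands.
def shadow_levels_after_push(bit_depth, stops):
    max_val = 2 ** bit_depth - 1
    cutoff = max_val >> stops  # only push shadows that won't clip
    pushed = {min(v << stops, max_val) for v in range(cutoff + 1)}
    return len(pushed)

print(shadow_levels_after_push(8, 3))   # 32 distinct levels -> visible banding
print(shadow_levels_after_push(14, 3))  # 2048 distinct levels -> smooth gradients
```

Real JPEGs are gamma-encoded rather than linear, which softens the effect somewhat, but the underlying problem is the same: the fine shadow detail was quantized away before you ever opened the file.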
Finally, based on EXIF data, I've found that the iPhone chooses some oddball exposure settings, using overly fast shutter speeds at high ISOs when it could use a perfectly fine shutter speed at a much lower ISO for the same exposure. But I'm sure that's optimized for how people normally use their phone's cameras, not for whatever I happen to be shooting at the moment. I know the interplay between ISO, aperture, and shutter speed; most people just want a picture of their friend eating dinner. And to be fair, the A7ii in auto-ISO + P mode will choose oddball exposures too. 1/8000s at ISO 32000? Back in my day, we were happy when we had ISO 400 film :P
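For anyone who doesn't have that interplay internalized: at a fixed aperture, exposure is proportional to shutter time times ISO, so you can check the 1/8000s @ ISO 32000 example against a saner pairing. My own back-of-envelope arithmetic, not anything from the article:

```python
# Back-of-envelope check of the 1/8000s @ ISO 32000 example. At a fixed
# aperture, exposure is proportional to shutter_time * ISO, so this
# pairing gathers the same exposure as a much saner 1/250s @ ISO 1000 --
# just with five extra stops of sensor gain (and the noise that comes
# with it). My own arithmetic, not from the article.
from math import log2

def exposure_index(iso, shutter_denom):
    """Relative exposure for ISO `iso` at a shutter of 1/shutter_denom s."""
    return iso / shutter_denom

fast_noisy = exposure_index(32000, 8000)  # the oddball choice
slow_clean = exposure_index(1000, 250)    # equivalent, far less gain

print(fast_noisy == slow_clean)  # True: identical exposure
print(log2(32000 / 1000))        # 5.0 stops more gain
```

Unless you're freezing a hummingbird, 1/250s is plenty, and five fewer stops of gain is a lot less noise.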
So anyway, the iPhone camera is disappointing on a technical level, but the best camera is the one you have with you, so I can't complain. I'd rather have an imperfect picture than no picture at all.
That said, I do wish it were possible to save raw images in the Camera app.
Not really RAW, but at least you get all the bits you can.
That would be useful for seeing whether the scores have improved or not, much like this article demonstrates the progress of the iPhone camera.
It's 100% valid and perfectly clear to not include "the best Android phone cameras" as that is not the goal.
Also, for fun, there's a behind-the-scenes at the same link that shows how the shots came together.
How is the lens attached to the iPhone? How is the focus controlled?
http://androidcommunity.com/android-flagships-may-have-margi...
tl;dr first time it doesn't beat them.
Thanks, this was my first crack at doing something with Canvas! For last year's we did it with CSS animations, but it never had great performance.