Here's the JPEG, non HDR+ shot on a Nexus 5: http://i.imgur.com/So44muL.jpg
Here's a similar image shot with HDR+: http://i.imgur.com/QFS3ZYd.jpg.
As you can see, the dynamic range is greatly increased; however, there are strange black spots in the shadows.
Here's the same photo that I took in DNG format, edited in Lightroom:
http://i.imgur.com/VRFsnf5.jpg
And here's my HDR photo, combined from 5 DNG exposures using Photoshop's HDR Pro functionality:
The strange black dots you find in the shadows are noise: there's simply no data there to produce any results.
The key part of in-camera HDR is that it's automatic. No manually fixing your camera to a tripod, copying the shots onto a computer, loading them into HDR-capable software, and finally processing them by hand.
Do you mind posting a DNG file of this photo? I might switch from iOS if Apple doesn't provide this functionality soon enough.
This demographic can be safely ignored. The people who get the most up in arms about the "purity" of photographic tools are also the ones producing the least work. They're the ones who buy $10,000 worth of bodies and lenses but can never go beyond photos of their local park, or sharpness test charts in their basement. This group isn't good for much more than vociferous, highly technical religious wars on the Internet. We'll start caring what they think when they start producing work.
In the meantime there are many passionate photographers out there producing great work with a variety of tools, cheap to expensive, simple to complex.
This guy with a crappy old iPhone 3GS has been taking better photos (and publishing them) than the bulk of people with 5D3's and 70-200 f/2.8's: http://boingboing.net/2009/10/29/photographer-takes-p-1.html
I totally agree that it's not the price of your equipment, it's what you do with it - most of the time. People pursue photography for a great many reasons, and there are genres that most definitely benefit from higher-quality equipment - for example, nature macro photography (some beautiful examples from the same photographer above: https://www.flickr.com/photos/sasurau/sets/72157631002090680... ). I guess the more you are looking for "The Decisive Moment", the less raw quality matters. Although, again, something like studio photography is likely to be a balance of quality and decisive moments.
Ultimately, unless you're getting most of your income from the photographs, just make sure however you're doing it you're having fun and enjoying it :)
And there is a generally annoying group of online gearheads who spend a lot more time obsessing about camera specs than they do going out and taking photos.
http://sasurau.squarespace.com/about/
It is a pity; there is something magical about the lomography look of a beat-up 3GS.
Features like megapixels, 51 phase detection AF points, face detection metering and other goodies all allow real people to capture better, more memorable images than cell phone cameras.
1. https://www.flickr.com/photos/jemfinch/8594571664/in/set-721... could not have been captured with a cell phone camera. I was holding my DSLR in my offhand and pushing him on the swing with my other hand.
2. https://www.flickr.com/photos/jemfinch/9326415006/in/set-721... could not have been captured with a cell phone either. My wife was holding him and he was peeking out over her shoulder then switching to the other shoulder immediately when I brought the camera up. With a cell phone's shutter lag, he'd have been on the other side before the picture was taken.
3. https://www.flickr.com/photos/jemfinch/8582319362/in/set-721... no cell phone camera in the world could have captured this in such low light so successfully.
4. https://www.flickr.com/photos/jemfinch/9012492563/in/set-721... there are a hundred fleeting moments a day like this, and no cell phone in the world is fast enough to capture them.
It's easy to take a picture of an abandoned alley or cats lying on blacktop with a cellphone camera. Try doing that with a newborn's first smile, or a mischievous preschooler's momentary smirk.
I think the knowledge gap you're presenting is contrived. I used SLRs for years. I seldom use one anymore because the results from my smartphone, under many normal circumstances, are comparable. This is the case for many people. The best camera is the one you have with you, and all of that - and they just keep getting better and better.
SLRs shine when there is more lighting range in the shot (e.g. strong shadows and highlights), and in particular in lower-light situations. A smartphone has to start doing HDR (which has inherent downsides, like its inability to record motion) a lot sooner than most SLRs, since the dynamic range is wider on the latter.
You also lose physical controls and the ability to change lenses on a smartphone, which slows you down, and makes your shots look more samey. The viewfinder (either EVF or OVF) is also a huge benefit when shooting in either strong sunlight or extreme darkness (as the LCD's backlight can bleed light into your shot).
As to contrast vs. phase detection: contrast is very accurate for non-moving targets and even some moving targets (thanks largely to software). Phase detection, however, is still by far the king of the castle when it comes to high-speed and erratic motion.
The DSLR pictures, when I do get around to loading and editing them, are better. It just takes more time.
As for focus points, I have this weird camera, the Canon 70D. Every "pixel" on the sensor can phase-detect. But I don't think the camera is making full use of the sensor. I wish I could write software for it.
An overview of camera focusing: http://www.dpreview.com/reviews/canon-eos-70d/3
Autofocus on the iPhone is much faster than on the majority of Canon (or Nikon) DSLRs, regardless of the number of focus points - partly because it's a lot easier to focus a lens that has nearly infinite depth of field! "Most photographers are also gearheads" is another dangerous thing to say, just like saying (quite correctly) that most serial killers are keen photographers: while true, it doesn't necessarily add much to the conversation.
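The "nearly infinite depth of field" point can be made concrete with the standard hyperfocal-distance formula, H = f^2/(N*c) + f. A quick sketch with rough, assumed numbers (a ~4.1mm f/2.2 phone lens with a tiny circle of confusion, versus a 50mm f/1.8 full-frame prime; these figures are illustrative, not official specs for any particular camera):

```python
# Hyperfocal distance H = f^2 / (N * c) + f: when the lens is focused
# at H, everything from roughly H/2 to infinity is acceptably sharp.
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Rough, illustrative numbers (assumptions, not measured specs):
phone = hyperfocal_mm(4.1, 2.2, 0.002)   # small phone sensor, tiny CoC
dslr = hyperfocal_mm(50.0, 1.8, 0.03)    # full-frame 50mm prime

print(f"phone hyperfocal: {phone / 1000:.1f} m")  # ~3.8 m
print(f"DSLR hyperfocal:  {dslr / 1000:.1f} m")   # ~46.3 m
```

Focus the phone anywhere past a couple of meters and essentially the whole scene is in focus, so the AF system barely has to work; the DSLR has to nail focus far more precisely.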
We have this obsession with resolution largely because of computers. In the "old days" I would make contact sheets and then prints, mainly 8x8" (from a Hasselblad, pre instagram-square-hipness), and they would (and do) look stunning. To use object-oriented speak, a fiber print IS-A photograph, whereas a digital print HAS-A photograph on the paper surface. They are different: when silver crystals are in (vs. on) the medium, it becomes something altogether different. In these modern enlightened times we put a photo up on a retina big screen and stare at it from 2', and of course we're going to see imperfections; this creates a cycle of obsession with sharpness. But of course it doesn't matter - would a painting be any better if it were created with a smaller brush? This is all a very classical philosophical debate; it's "the medium is the message" all over again.
From my perspective this work is impressive. Does it make better photos? No, but it does make the photos more representative of what the taker was looking at. Over time we are subtracting the technology component from photography; eventually we will have the ability to easily record any visual experience we have. For me this comes from having enough experience in using equipment and eking the most out of it, but in contrast my daughter is frustrated by cameras (both her Canon DSLR and her iPhone). She will see something like the solar eclipse and wonder why simply pointing the camera at it won't produce a photo that looks the same as the one taken on Mt Hamilton with their 36" reflecting telescope, or why a photo at her friend's candle-lit birthday party didn't look how she remembered it. Her generation will grow up with sufficiently advanced technology that a lot of the inconvenience of photography's technical side has been automated, just as I grew up in a time when emulsions were capable of exposing in a few hundredths of a second rather than minutes.
In this case, is it?
Their solution involves using burst mode, then taking those many pictures and turning them into one high-quality image. Burst mode would simply output a bunch of .jpg images... couldn't you run the algorithm on those standardized images?
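In principle, yes: a toy version of the align-and-merge step can run on any already-decoded frames. A minimal sketch, assuming whole-pixel translation only (estimated by phase correlation) followed by a plain average; the real HDR+ pipeline is far more sophisticated and works on raw sensor data precisely because JPEGs have already been denoised, tone-mapped, and compressed:

```python
import numpy as np

def align_and_average(frames):
    """Align each frame to the first by whole-pixel translation
    (phase correlation), then average them. A toy stand-in for a
    burst-merge step, operating on already-demosaiced images."""
    ref = frames[0].astype(np.float64)
    acc = ref.copy()
    F_ref = np.fft.fft2(ref)
    for frame in frames[1:]:
        f = frame.astype(np.float64)
        F = np.fft.fft2(f)
        # Normalized cross-power spectrum; its inverse FFT peaks at
        # the translation that maps this frame onto the reference.
        cross = F_ref * np.conj(F)
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        acc += np.roll(f, (dy, dx), axis=(0, 1))
    return acc / len(frames)
```

Averaging aligned frames knocks down random noise, but it can't recover highlight data a JPEG has already clipped, which is part of why starting from raw matters.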
To be clear, I would LOVE to get raw out of an iOS device camera. It would make motion analysis much more precise. With iOS 8 you have kinda/sorta control over the exposure time, but it could be more precise.
Does this mean they're using ISO bracketing instead of exposure bracketing?
So if you think of your dynamic range as being limited by the noise floor and the overexposure ceiling, this technique lowers the noise floor while keeping the ceiling right where it is, giving you a better effective dynamic range. ISO bracketing extends both the floor and ceiling by a much larger margin, of course, but it has issues that complicate its use in automated HDR shots, as described in the article.
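The lowered noise floor is just the statistics of averaging: merging N frames cuts uncorrelated random noise by roughly a factor of sqrt(N). A quick numerical check with synthetic data (not real sensor measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 10.0                    # a dim, constant shadow region
noise_sigma = 5.0                # per-frame noise floor
n_frames, n_pixels = 8, 100_000

# Simulate 8 burst frames of the same shadow patch with random noise.
frames = signal + rng.normal(0.0, noise_sigma, (n_frames, n_pixels))
single = frames[0]
merged = frames.mean(axis=0)

print(single.std())  # ~5.0
print(merged.std())  # ~5.0 / sqrt(8) ~ 1.77
```

The ceiling is untouched because a pixel that clips in one frame clips in all of them (the exposures are identical), so only the floor moves.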
A similar problem might be to try and improve the resolution of a single hand-held photo, shot in normal lighting conditions, by taking many pictures in rapid succession and then processing them. Each picture would be in a very slightly different position and so (again, in theory) might provide more data?
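That is the idea behind multi-frame super-resolution. A deliberately idealized 1D sketch, assuming the sub-pixel shifts between shots are exactly half a pixel and perfectly known (a real pipeline would have to estimate the shifts and cope with noise and blur):

```python
import numpy as np

# A fine "true" signal, sampled at interleaved offsets by two
# low-resolution shots of the same scene.
fine = np.sin(np.linspace(0, 4 * np.pi, 64))
shot_a = fine[0::2]  # low-res frame at offset 0
shot_b = fine[1::2]  # low-res frame shifted by half a low-res pixel

# With the shift known, interleaving the frames recovers the fine grid.
recovered = np.empty_like(fine)
recovered[0::2] = shot_a
recovered[1::2] = shot_b

assert np.allclose(recovered, fine)
```

Hand shake supplies the sub-pixel offsets for free; the hard part is estimating them accurately enough that the extra samples add detail instead of smearing it.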
Has Google discussed any technical reason for this restriction? Seems like lots of third party apps support HDR on a wider variety of phones...
It would be nice if Google could do something similar for EIS phones, too, though.
How come the duplicate detector didn't trigger?
EDIT: In fact, I just tried to re-submit an existing link by adding a question mark and it didn't go through.
The first one I can imagine existing in "real life" if there were a soft-box enclosed fill light out of the frame that was lighting up the lady's face (it can't be the sun since the tree isn't being lit the way it would if the light hitting her was point-light-esque), which would make the photo very "posed", but still possibly something that isn't highly post-processed.
The second picture, with the two ladies, is much more distracting. The HDR version keeps a lot more detail than the non-HDR image, but at the expense of making everything look extremely flat and unnatural as your brain tries to process how the lighting is working (this may be unique to people who are used to worrying about light in photography, and a non-issue for normal folks; I can't say for sure): the scene is clearly midday, but the light across the entire scene makes it seem like the sun must be very low, which doesn't match the contents of the scene.
The HDR version of the second photo would look better if the exposure on the two ladies were bumped up close to the value that the background sky was bumped down, but doing that automatically would be an amazing visual detection feat that I wouldn't expect out of a phone camera. The lighting still wouldn't make sense but at least it wouldn't look so flat.
If it's done right you don't notice it. Since Photoshop added HDR photo merging, it's been a thing to capture the full dynamic range of a scene and make it look "HDR".
Google "hdr photos" to see it overdone: https://www.google.com/search?q=hdr+photos
Well, $300 price difference can buy you a lot.
Now I get it. Comments like this one must be why Google felt pressured into making the next Nexus phone so much more expensive :P
Similar technology; my impression is that it's less automatic but offers more control. I haven't seen either in action, so I'm not sure how they compare.
It takes the cleanest pictures by far, beating out my Canon 7D at times, but for optimal results it does require the device to be extremely steady. I've been waiting for this app to appear on Android too, but it hasn't so far.
[1] https://itunes.apple.com/us/app/average-camera-pro/id4155778...
Edit: I just discovered Flickr is filled with them. Some good examples of what happens when the number of shots you select (anywhere from 4-64, I believe) is less than optimal, if you scroll down: https://www.flickr.com/search/?q=%23avgcampro
I use Avg Cam Pro in touristy situations with lots of movement. Take 64 shots and (moving) people simply fade away by the time you're done.
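The fade-away trick works because any given pixel is covered by a moving person in only a few of the 64 frames, so a per-pixel average converges to the static background (a per-pixel median, used here for a crisper result, rejects the transients outright). A toy demonstration on synthetic frames, not a reconstruction of how Avg Cam Pro actually merges:

```python
import numpy as np

rng = np.random.default_rng(1)
background = rng.uniform(0, 255, (16, 16))     # the static scene
stack = np.repeat(background[None], 64, axis=0)

# A "tourist" occupies one random pixel in each of the 64 shots.
for frame in stack:
    y, x = rng.integers(0, 16, 2)
    frame[y, x] = 255.0

# Each pixel is contaminated in only a handful of frames, so the
# per-pixel median recovers the clean background everywhere.
cleaned = np.median(stack, axis=0)
```

The same reasoning explains why long-exposure film photos of busy streets show empty pavement: anything that doesn't stay put doesn't accumulate enough signal to register.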
This is about algorithms to achieve better results with a given sensor. The Nexus 5, like the various Nexii that came before, has a poor image sensor -- it's a $300 smartphone, and LG wasn't going to put the best in it. HDR+ gives rather decent results in a wide range of settings despite that, and mine has managed a large number of fantastic shots.
The Nexus 6, being twice the price, apparently has a fantastic sensor, and with that dramatically better base results, made even better with HDR+. Awesome. That's good.