Dual, triple, quad, penta camera smartphones: The history
(Pocket-lint) – With more and more smartphones launching with multi-lens camera systems, we’re taking a look at where this has all come from, romping through the history of dual, triple and (gasp) quad lens smartphone cameras.
Dual lenses on smartphones aren’t new, with a number of models offering unique features from this camera setup as far back as 2011 in formats you’ll recognise – not forgetting the Samsung B710 offering a dual lens back in 2007! (Thanks for that tip, Leo.)
Getting a phone with a single lens might now be a rarity, but follow us as we walk you through key moments in smartphone multi-lens camera systems of the past and into the present…
LG Optimus 3D and HTC Evo 3D: Another dimension
In 2011 3D was a thing. The world’s TV manufacturers were lining up 3D TV sets, there were 3D films being produced and we were being told that 3D was the next big thing (again).
For smartphones, it was the opportunity for innovation. The LG Optimus 3D was announced in February 2011 and the HTC Evo 3D launched on Sprint in March 2011.
Both these smartphones (and there were others) used dual lenses to capture 3D video and 3D photos, employing the same technique as dedicated 3D cameras: two lenses working together to create a sense of depth in images. This was boosted by a 3D display to view those images without glasses.
But 3D was just a passing phase, and although we could capture 3D, ultimately, that was only the start of the story for modern multi-lens smartphones.
HTC One M8: Making sense
It was the HTC One M8 that really introduced dual lens cameras to the world and saw HTC trying to do something different. The HTC One M8 was launched in April 2014 and used two sensors in the same way that modern smartphone cameras do.
With a 4-megapixel UltraPixel main image sensor and a secondary 2-megapixel sensor capturing extra data, the dual lens camera was used, like 3D, to create a sense of depth in photos. The idea was that the second lens could create a depth map and feed it into the final image.
That meant you could create bokeh/background blur effects, you could refocus the image with a tap and you could easily manipulate photos, keeping the subject sharp and changing the backgrounds, even after you’d taken the photo.
The One M8 was clever, but the camera wasn’t that impressive. The effects were rather gimmicky and the benefits of having a dual camera didn’t really make an impact – even if the full metal body did.
There are still plenty of devices that have a second lens for “depth” and nothing else – but that’s often seen as a method of getting background blurring on portrait shots.
HTC might have started this whole second lens thing, but it was about two years ahead of the rest of the pack – and it was 2016 that really saw the industry change.
LG G5: Going wide
Step forward a few years and LG announced the LG G5 in February 2016. Two things about it were interesting. Firstly, it attempted to integrate modular accessories – which was a flop – and secondly, LG equipped it with dual cameras, making it one of the first dual-camera phones to launch in 2016.
There was a main 16-megapixel sensor and a second 8-megapixel sensor. Rather than combining information to create effects, the second lens was ultra wide-angle.
With a 135-degree lens on the rear for that 8-megapixel camera, the LG G5 could shoot wide-angle photos to great effect. You could simply switch from one camera to the other – perfect for tight spots or landscapes, and a chance to create something software alone couldn’t.
LG added the wide-angle to the V20 and subsequent models in the G and V series, but it wasn’t until the Huawei Mate 20 triple camera that we saw big moves in wide-angle from other manufacturers. That’s all changed in 2019, as everyone else realised that wide-angle was a creatively sound proposition.
Huawei P9: Leica’s monochrome mark
In April 2016 Huawei launched the P9 in partnership with Leica, with two cameras sitting on the back. Huawei’s big selling point wasn’t about depth sensing or wide-angle, it was about monochrome and it was the start of some influential work in multi-camera systems from Huawei.
Leveraging Leica’s classic monochrome skills, the Huawei P9 presented two cameras on the rear, with one lens capturing RGB colour and the second capturing monochrome detail. This resulted in some great black and white photos, but the P9 also attempted to combine information from both sensors to make all your photos better – and generally speaking it seemed to work well.
Huawei continued with this arrangement through 2018 into the Huawei P20, launched alongside another significant device: the Huawei P20 Pro.
Honor used the same system in a number of devices – without the Leica branding – adding a monochrome sensor on the Honor 8 and subsequent devices, until we hit the Honor View 20. It wasn’t just Huawei and Honor – Nokia adopted the same system on the Nokia 8, but with Zeiss branded lenses.
Apple iPhone 7 Plus: A play to zoom
As 2016 continued, one of the big launches was the Apple iPhone 7 Plus, with two 12-megapixel cameras on the rear offering different focal lengths. The first camera had a 28mm-equivalent lens, while the second was 56mm – and we entered the realms of telephoto on phones.
The idea was to let you zoom without losing as much quality: switching to the 56mm camera gets you closer optically, so any digital zooming starts from a closer position and the loss in quality is lessened. Apple wanted to address what it saw as a significant problem with smartphone photography and came up with a solution that matched user behaviour.
Apple also played HTC’s game by offering bokeh effects thanks to a depth map drawn from both lenses.
Since the launch of the iPhone 7 Plus, Apple has continued to offer zoom on its phones and many others have adopted a zoom lens too – in 2017 OnePlus added one to the OnePlus 5 and Samsung launched its first dual-camera phone, the Note 8, a system it has continued with since.
Huawei P20 and Mate 20 Pro: Three is the magic number
When the Huawei P20 Pro was announced in early 2018, everything was poured into the camera, with a new triple camera system. This added a zoom lens to the existing RGB and monochrome sensors, but there was a lot more happening with AI – including the birth of an impressive Night Mode.
The Huawei P20 Pro was a great success, a camera that justified its excesses with results and proved the critics wrong. It seemed to do everything.
What was a little suspicious, however, was the evolution in the Huawei Mate 20 later in 2018. Again using a triple camera system, Huawei switched things up, dropping the monochrome sensor and swapping in a wide-angle lens instead – effectively turning its back on the previous two years of marketing. The results, though, gave very little to complain about, adding that desirable wide-angle with seemingly no quality downside for losing the monochrome lens – so did it ever actually do anything?
Samsung also offered a three-lens camera in the Samsung Galaxy A7 in 2018, but opted for regular, wide-angle and a dubious third for “depth information” and nothing else. Oppo graced the R17 Pro with three cameras but, perhaps more confusingly, offered a main camera, a depth camera and a time-of-flight camera – one of the first phones to push time of flight as another sensor to feed AR, depth and other applications.
Subsequently, three cameras have become the norm. The 2020 iPhone 12 Pro offers three lenses, the 2021 Samsung Galaxy S21 has three cameras and plenty of more affordable devices have three cameras too. There’s a wide difference in performance, though, with many cheaper phones using macro cameras to make up the numbers.
Galaxy A9: Samsung shoots four the stars
Samsung likes “world firsts” and, having lost out to Huawei on the triple camera front and been fairly slow to adopt dual camera systems, the Samsung Galaxy A9 strode out with four cameras on the back in 2018. Samsung has continued to offer four cameras on some phones: the 2019 S10 5G repeated the A9’s offering of main, zoom, ultra-wide and depth, while also packing two cameras into the front.
Quad cameras have been everywhere since the turn of the decade: at the high end you get duplicated zoom lenses for better performance, as in the Xiaomi Mi 10 Pro, while OnePlus is still offering macro and monochrome lenses on the 8T to make up the numbers; at the more affordable end you have phones like the Moto G 5G Plus hanging on to a depth sensor and macro camera just to get “quad camera” on the spec sheet.
But in the mix of multi-lens cameras came another important development in 2019.
Huawei P30 Pro: Periscope hits the mainstream
In 2019, Huawei launched probably the most notable phone it had ever made. Although Oppo had shown off a periscope lens previously, and the Asus Zenfone Zoom had actually used one in 2015, it was Huawei that hit the big time, offering zoom on the Huawei P30 Pro that was hitherto unchallenged. Offering higher quality long-range capture, it was certainly a breakthrough, while also pushing night shooting skills to rival those shown off by Google’s Night Sight in late 2018.
The P30 Pro seemed to do it all and laid the foundation for phones that now follow. The periscope zoom graced the Oppo Find X2 Pro in 2020, the Samsung Galaxy S20 Ultra and the Realme X3 SuperZoom. It continues to be a highly desired feature, with Samsung rearranging the cameras in the 2021 S21 Ultra to have two zooms, one periscope and one regular, in an attempt to boost zoom quality.
The Huawei P30 Pro can take credit for a lot of this, but the P30 Pro was also one of the last Huawei phones to run with Google Mobile Services. Although more recent Huawei phones offer amazing camera capabilities, they’ve proven less popular than the P30 Pro.
Nokia 9 PureView’s got five on it
Nokia moved in a different direction in 2019, launching the Nokia 9 PureView with five lenses on the back. Unlike other systems, these weren’t lenses designed with different functions – there’s no zoom, no wide-angle. Instead, the lenses used Light’s system, pursuing quality above all else: the idea was to capture a lot more data to combine into images.
It’s a great theory, but Nokia was trying to do with lots of data the job that rivals – like Google – were doing with AI. Ultimately, AI and the growth of computational photography won this race. Google has been able to apply AI and machine learning in photography not just to new images, but to old images too – including bringing these skills to older and less powerful phones.
The Nokia 9 PureView launched on outdated hardware – with Nokia saying at the time that the phone was tuned for the camera system and it didn’t want to change that. Customers didn’t want to buy it either. It didn’t review well – after the first reviews went out, Nokia stopped putting the phone out, so we never got to test it in the flesh. It was ambitious, but ultimately the market ran off in a different direction – and we’re still waiting to see if Nokia will launch another premium flagship to replace the 9.
Writing by Chris Hall.