New iPhone Camera Feature Arrives With Software Upgrade
Editorials News | Nov-01-2019
It’s hard to tell what’s going on inside smartphone cameras. Every time you push the button to take a picture, a computerized Rube Goldberg-style chain of events kicks off with the goal of capturing as much image information as possible. That way, powerful processors can cram it all together into a single cherished memory or a hilarious snap of a turtle biting your buddy’s finger.

This week, Apple introduced its new Deep Fusion camera tech for the iPhone 11 as part of the iOS 13.2 software update. For the average user, photos taken under certain conditions will now look more detailed and slightly less noisy. Unlike Night mode, there’s no indicator in the app, so you won’t even know it’s working. But those somewhat granular improvements took some serious engineering, and a whole lot of processing power, to achieve.

When the iPhone’s camera app is open, it’s taking photos before you push the shutter. It keeps a rolling buffer of images, and once you press the button or tap the screen, it captures a total of nine photos; eight of them happened before you pushed the button. Four of those frames are short exposures, kept as sharp as possible by fending off camera shake and motion blur from objects in the photo. The other four frames from the buffer are standard exposures that capture color and detail. The ninth frame is a long exposure that brings in more light and gives the processor a brighter image to pull from. Up to this point, Deep Fusion and Smart HDR work basically the same way. From here, however, Deep Fusion takes the sharpest of the four short exposures and basically ditches the rest. It then fuses those four standard exposures with the long exposure to get the color and highlights that belong in the finished photo.
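The capture-and-merge logic described above is easier to follow in code. Below is a minimal, hypothetical sketch in Python; the sharpness metric, the 50/50 blend weight, and the function names are assumptions made for illustration, not Apple's published algorithm.

```python
import numpy as np

# Conceptual sketch of a Deep Fusion-style pipeline (not Apple's code):
# nine frames -- four short, four standard, one long -- reduced to one image.

def sharpness(frame: np.ndarray) -> float:
    """Score a frame by the variance of its gradients (higher = sharper)."""
    gy, gx = np.gradient(frame.astype(np.float64))
    return float(np.var(gx) + np.var(gy))

def fuse(short_frames, standard_frames, long_frame):
    """Keep the sharpest short exposure, then blend it with the standard
    and long exposures to recover color and highlight information."""
    # 1. Keep only the sharpest of the four short exposures; ditch the rest.
    detail = max(short_frames, key=sharpness)

    # 2. Average the standard exposures with the long exposure for tone and color.
    tone = np.mean(np.stack(standard_frames + [long_frame]), axis=0)

    # 3. Merge structure from the sharp frame with brightness/color from the
    #    tonal blend. The 0.5 weight is an arbitrary illustration value.
    return 0.5 * detail + 0.5 * tone

# Example with synthetic 8-bit grayscale "frames".
rng = np.random.default_rng(0)
shorts = [rng.integers(0, 256, (4, 4)).astype(np.float64) for _ in range(4)]
standards = [rng.integers(0, 256, (4, 4)).astype(np.float64) for _ in range(4)]
long_exp = rng.integers(0, 256, (4, 4)).astype(np.float64)
print(fuse(shorts, standards, long_exp))
```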
This entire process happens seemingly instantaneously and exists to give the A13 Bionic chip all the raw material it needs to smash those frames together into something that looks like real life. That reliance on Apple's latest processor is the reason why only iPhone 11 users get Deep Fusion, at least for now. Deep Fusion uses the chip's neural engine to analyze the individual pixels in a scene, as well as their relationship to each other in the frame. It can pick and choose where to try to preserve highlights and where to add contrast to make details look more pronounced. To my eye, the sharpening effect is by far the most noticeable difference between Deep Fusion and Smart HDR photos.

When high-end retouchers work to smooth out skin in a photo, they use a technique called frequency separation, which lets them separate the fine details and edges of the image from the colors and tones and manipulate them independently. So, if you wanted to remove blemishes or trouble spots in a picture of a face, you could do it without losing the skin's natural texture or color.
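Frequency separation itself is straightforward to demonstrate. The sketch below is a rough illustration rather than any retoucher's actual workflow: it splits an image into a blurred low-frequency layer (tone and color) and a residual high-frequency layer (texture and edges); the blur radius and helper names are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Frequency separation: split an image into low-frequency (color/tone) and
# high-frequency (texture/edge) layers that can be edited independently.

def split_frequencies(image: np.ndarray, sigma: float = 3.0):
    low = gaussian_filter(image.astype(np.float64), sigma=sigma)  # tones and color
    high = image.astype(np.float64) - low                         # fine detail and edges
    return low, high

def recombine(low: np.ndarray, high: np.ndarray) -> np.ndarray:
    # Adding the layers back reproduces the original image; retouch either
    # layer first to change tone without losing texture (or vice versa).
    return np.clip(low + high, 0, 255)

# Round-trip check on a synthetic image.
img = np.random.default_rng(1).integers(0, 256, (64, 64)).astype(np.float64)
low, high = split_frequencies(img)
assert np.allclose(recombine(low, high), img)
```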
By: Abhishek Singh
Content: https://www.popsci.com/iphone-11-camera-just-got-better-after-software-update/