In this article, I will compare Apple's newest iPhone and the Nikon D850. While the iPhone 12 Pro starts at $1,399 plus taxes for the 128GB model, my DSLR camera, on the other hand, is much more expensive: the body alone goes for $4,000 plus taxes, and that's before adding my 16-35mm lens. That's why I took all the photos on my last trip to Mexico City with my iPhone. The iPhone 13 Pro and iPhone 13 Pro Max have identical camera systems, and on the Pro models you get a 6x optical zoom range (3x optical zoom in, 2x optical zoom out).

(Image: Using my Canon R5, I took multiple images here, focusing at different points on this fly, and then merged them together afterward. The result is a subject that's pin-sharp from front to back. Andrew Lanxon/CNET)

Focus stacking means taking a series of images with the camera staying still while focusing on different elements within a scene. Then, those images are blended together later - usually in desktop software like Adobe Photoshop or dedicated focus software like Helicon Focus - to create an image that has focus on both the extreme foreground and the background. It's the opposite goal of the camera's Portrait Mode, which purposefully tries to defocus the background around a subject for that artful shallow depth of field - or "bokeh."

It might be a niche desire, but I'd love to see this focus stacking capability built into the iPhone, and it possibly wouldn't even be that difficult to do. After all, the phone already uses image blending technology to combine different exposures into one HDR image - it'd just be doing the same thing, only with focus points rather than exposure.

Much better long-exposure photography

Apple has had the ability to shoot long-exposure images on the iPhone for years now. You'll have seen those shots of waterfalls or rivers where the water has been artfully blurred but the rocks and landscape around the water remain sharp. It's a great technique to really highlight the motion in a scene, and it's something I love doing on my proper camera and on my iPhone.

(Image: A standard and long-exposure comparison, taken on the iPhone 11 Pro. Andrew Lanxon/CNET)

And though it's easy to do on the iPhone, the results are only OK. It's a good effort, but you lose a lot of detail in the process. The problem is that the iPhone uses a moving image - a Live Photo - to detect motion in the scene and then digitally blur it, and this usually means that any movement gets blurred, even bits that shouldn't be. The result is shots that look quite mushy, even when you put the phone on a mobile tripod for stability. They're fine for sending to your family or maybe posting to Instagram, but they won't look good printed and framed on your wall, and I think that's a shame.

I'd love to see Apple make better use of its optical image stabilization to allow for really sharp long-exposure photos, not just of water but of nighttime scenes too, perhaps of car headlights snaking their way through the street. It'd be another great way of getting creative with photography from your phone, and making use of the excellent quality from those cameras.
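The long-exposure look described above is, at heart, frame blending: the phone captures a short burst (the Live Photo) and smears together the parts of the scene it thinks are moving. As a rough illustration of the underlying idea, not of Apple's actual pipeline, here is a minimal Python/OpenCV sketch that fakes a long exposure by simply averaging a burst of frames; the file names and frame count are placeholders.

```python
# Minimal sketch: fake a "long exposure" by averaging a burst of frames,
# similar in spirit to blending the frames of a Live Photo.
# This naive version blurs everything that moved; it makes no attempt to keep
# static regions sharp the way a motion-aware pipeline would.
import cv2
import numpy as np

def simulate_long_exposure(frame_paths):
    """Average a burst of frames so moving elements smear into a blur."""
    acc = None
    for path in frame_paths:
        frame = cv2.imread(path).astype(np.float64)
        acc = frame if acc is None else acc + frame
    # Dividing by the frame count keeps overall brightness unchanged.
    return (acc / len(frame_paths)).astype(np.uint8)

if __name__ == "__main__":
    burst = [f"frame_{i:02d}.jpg" for i in range(10)]  # hypothetical file names
    cv2.imwrite("long_exposure.jpg", simulate_long_exposure(burst))
```

Because this averages every pixel, any hand shake or subject movement softens the whole frame, which is roughly why phone results go mushy without very good stabilization; sharper output needs the frames aligned first and the blur confined to the regions that are actually moving.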
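The focus stacking workflow described earlier can be sketched in the same spirit. The blend step, whether done in Photoshop, Helicon Focus, or a hypothetical on-phone feature, comes down to choosing, for each part of the frame, the shot that is sharpest there. The version below is a deliberately naive illustration under that assumption: it uses the Laplacian response as its sharpness measure, skips the frame alignment and seam feathering that real tools perform, and the file names are hypothetical.

```python
# Naive focus stacking sketch: for every pixel, keep the value from whichever
# shot is sharpest at that location, judged by local Laplacian response.
import cv2
import numpy as np

def focus_stack(image_paths):
    images = [cv2.imread(p) for p in image_paths]
    sharpness = []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        # Absolute Laplacian highlights fine detail; a light blur makes the
        # per-pixel "which shot is sharpest" decision less noisy.
        lap = np.abs(cv2.Laplacian(gray, cv2.CV_64F))
        sharpness.append(cv2.GaussianBlur(lap, (9, 9), 0))
    best = np.argmax(np.stack(sharpness), axis=0)  # sharpest source per pixel
    stacked = np.zeros_like(images[0])
    for i, img in enumerate(images):
        stacked[best == i] = img[best == i]
    return stacked

if __name__ == "__main__":
    shots = ["focus_near.jpg", "focus_mid.jpg", "focus_far.jpg"]  # hypothetical
    cv2.imwrite("stacked.jpg", focus_stack(shots))
```

Conceptually this is close to the exposure blending the phone already does for HDR, just keyed on focus instead of exposure, which is why building it into the camera app doesn't seem far-fetched.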