How to calculate FOV?

Solution 1

In iOS 7 and above you can do something along these lines:

    float FOV = camera.activeFormat.videoFieldOfView;

where camera is your AVCaptureDevice. Depending on which preset you choose for the video session, this value can change even on the same device. It is the horizontal field of view (in degrees), so you'll need to derive the vertical field of view from the aspect ratio of the video format you are capturing.

Here's Apple's reference material.
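
For example, here is a minimal Swift sketch, assuming the active format's pixel dimensions share the aspect ratio of the reported horizontal angle; it converts through the tangent of the half-angle rather than scaling degrees linearly:

    import AVFoundation

    // Minimal sketch: derive the vertical FOV (in degrees) from the horizontal
    // FOV reported by the device, using the active format's aspect ratio.
    func verticalFieldOfView(for device: AVCaptureDevice) -> Float {
        let hFOV = device.activeFormat.videoFieldOfView  // horizontal, in degrees
        let dims = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
        let aspect = Double(dims.height) / Double(dims.width)
        // FOV is an angle, so convert via the tangent of the half-angle.
        let hRadians = Double(hFOV) * Double.pi / 180
        let vRadians = 2 * atan(aspect * tan(hRadians / 2))
        return Float(vRadians * 180 / Double.pi)
    }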

Solution 2

Apple has also released a list of all camera specification details, including the FOV (field of view).

https://developer.apple.com/library/ios/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html

The values there match the ones that can be retrieved using:

    float FOV = camera.activeFormat.videoFieldOfView;

Solution 3

To answer your question:

Do my method and my formula look right...?

Maybe, but they also look too complex.

...and if yes, which values do I pass to the function?

I don't know, but if the goal is to calculate the HFOV and VFOV, here is a code example which programmatically finds the horizontal viewing angle (the only viewing angle accessible in Swift at the moment) and then estimates the vertical viewing angle from the 16:9 aspect ratio of the iPhone 6. Note that scaling degrees linearly by the aspect ratio is only an approximation; the exact relationship goes through the tangent of the half-angle, as in the sketch under Solution 1.

    // Find the back-facing video camera.
    let devices = AVCaptureDevice.devices()
    var captureDevice : AVCaptureDevice?
    for device in devices {
        if (device.hasMediaType(AVMediaTypeVideo)) {
            if (device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if let retrievedDevice = captureDevice {
        // Horizontal viewing angle in degrees, from the active format.
        let HFOV : Float = retrievedDevice.activeFormat.videoFieldOfView
        // Approximate vertical angle for a 16:9 format (linear scaling).
        let VFOV : Float = (HFOV / 16.0) * 9.0
    }

Also remember to import AVFoundation if you want this to work!

Solution 4

From the FOV equation, you need the focal length and the sensor dimensions. Exif data has the focal length but not the sensor dimensions. I have found only one camera vendor (Canon) that provides sensor dimensions in the metadata -- and that is in their CRW raw files. So you will have to use a lookup table of sensor dimensions keyed by camera or smartphone model. The following Wikipedia link lists sensor dimensions for numerous cameras and some newer iPhones; the list is near the end of the article. http://en.wikipedia.org/wiki/Image_sensor_format

Another source for this type of information is photography blogs and websites such as http://www.kenrockwell.com, which have lots of information on camera sensor dimensions. Hope this helps.

If digital zoom is used, multiply the focal length by the zoom value. The Exif data contains the zoom factor.

A correction: there are actually Exif tags for focal plane X resolution and focal plane Y resolution (in pixels) and focal plane resolution units. From those tags and the image dimensions, you can compute the sensor size. But not all cameras provide those tags; the iPhone 4, for example, does not.
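
To make that concrete, here is a hedged Swift sketch of the computation described above; the function and parameter names are illustrative, and the resolution-unit codes follow the Exif specification (2 = inch, 3 = centimetre):

    import Foundation

    // Sketch: recover the sensor width from the Exif focal-plane tags, then
    // apply FOV = 2 * atan(d / (2 * f)), returning degrees.
    func horizontalFOV(imageWidthPixels: Double,
                       focalPlaneXResolution: Double,  // pixels per resolution unit
                       resolutionUnit: Int,            // Exif code: 2 = inch, 3 = cm
                       focalLengthMm: Double,
                       digitalZoom: Double = 1.0) -> Double? {
        let unitInMm: Double
        switch resolutionUnit {
        case 2: unitInMm = 25.4    // inch
        case 3: unitInMm = 10.0    // centimetre
        default: return nil        // unrecognised unit
        }
        // Sensor width (mm) = pixels / (pixels per unit) * (mm per unit).
        let sensorWidthMm = imageWidthPixels / focalPlaneXResolution * unitInMm
        // Digital zoom effectively multiplies the focal length.
        let f = focalLengthMm * digitalZoom
        return 2 * atan(sensorWidthMm / (2 * f)) * 180 / Double.pi
    }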

Comments

  • Humbertda about 2 years

    Initial Context

    I am developing a location-based augmented reality application, and I need to get the field of view [FOV]. (I just update the value when the orientation changes, so I am looking for a method that returns this value when I call it.)

    The goal is to make a "degree ruler" that matches reality, like the following: Degree Ruler - AR App

    I am already using AVCaptureSession to display the camera stream, and a path coupled with a CAShapeLayer to draw the ruler. This is working pretty well, but now I have to use the field of view value to place my elements in the right place (for example, choosing the right spacing between 160° and 170°!).

    Actually, I am hardcoding these values from this source: https://stackoverflow.com/a/3594424/3198096 (special thanks to @hotpaw2!). But I am not sure they are fully precise, and this does not handle the iPhone 5, etc. I was unable to obtain values from official sources (Apple!), but there is a link showing values for all the iDevices I think I need (4, 4S, 5, 5S): AnandTech | Some thoughts about the iphone 5s camera improvements.

    Note: After personal tests and some further research online, I am pretty sure these values are inaccurate! This also forces me to use an external library to check which iPhone model I am using, just to initialize my FOV manually... And I have to check my values for every supported device.

    I would prefer a "code solution"!

    After reading this post: iPhone: Real-time video color info, focal length, aperture?, I am trying to get Exif data from AVCaptureStillImageOutput as suggested. From that I should be able to read the focal length from the Exif data and then calculate the horizontal and vertical field of view via the formula! (Or maybe directly obtain the FOV as shown here: http://www.brianklug.org/2011/11/a-quick-analysis-of-exif-data-from-apples-iphone-4s-camera-samples/ -- note: after a number of updates, it seems the field of view can no longer be read directly from the Exif data!)


    Current Status

    Sources: http://iphonedevsdk.com/forum/iphone-sdk-development/112225-camera-app-working-well-on-3gs-but-not-on-4s.html and Modified EXIF data doesn't save properly

    Here is the code I am using:

    AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (camera != nil)
    {
        captureSession = [[AVCaptureSession alloc] init];
        
        AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
        
        [captureSession addInput:newVideoInput];
        
        captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
        captureLayer.frame = overlayCamera.bounds;
        [captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        previewLayerConnection=captureLayer.connection;
        [self setCameraOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
        [overlayCamera.layer addSublayer:captureLayer];
        [captureSession startRunning];
        
        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [stillImageOutput setOutputSettings:outputSettings];
        [captureSession addOutput:stillImageOutput];
        
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in stillImageOutput.connections)
        {
            for (AVCaptureInputPort *port in [connection inputPorts])
            {
                if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }
        
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
         {
             NSData *imageNSData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
             
             // A plain __bridge cast is enough here; __bridge_retained would leak the data.
             CGImageSourceRef imgSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageNSData, NULL);

             // CGImageSourceCopyPropertiesAtIndex returns a +1 reference; transfer it to ARC.
             NSDictionary *metadata = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
    
             NSMutableDictionary *metadataAsMutable = [metadata mutableCopy];
             
             NSMutableDictionary *EXIFDictionary = [[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary]mutableCopy];
            
             if (!EXIFDictionary)
                 EXIFDictionary = [NSMutableDictionary dictionary];

             [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

             NSLog(@"%@", EXIFDictionary);

             // Balance the CGImageSourceCreateWithData above.
             CFRelease(imgSource);
         }];
    }
    

    Here is the output:

    {
        ApertureValue = "2.52606882168926";
        BrightnessValue = "0.5019629837352776";
        ColorSpace = 1;
        ComponentsConfiguration =     (
            1,
            2,
            3,
            0
        );
        ExifVersion =     (
            2,
            2,
            1
        );
        ExposureMode = 0;
        ExposureProgram = 2;
        ExposureTime = "0.008333333333333333";
        FNumber = "2.4";
        Flash = 16;
        FlashPixVersion =     (
            1,
            0
        );
        FocalLenIn35mmFilm = 40;
        FocalLength = "4.28";
        ISOSpeedRatings =     (
            50
        );
        LensMake = Apple;
        LensModel = "iPhone 4S back camera 4.28mm f/2.4";
        LensSpecification =     (
            "4.28",
            "4.28",
            "2.4",
            "2.4"
        );
        MeteringMode = 5;
        PixelXDimension = 1920;
        PixelYDimension = 1080;
        SceneCaptureType = 0;
        SceneType = 1;
        SensingMethod = 2;
        ShutterSpeedValue = "6.906947890818858";
        SubjectDistance = "69.999";
        UserComment = "[S.D.] kCGImagePropertyExifUserComment";
        WhiteBalance = 0;
    }
    

    I think I have everything I need to calculate the FOV. But are these the right values? After reading a lot of different websites giving different focal length values, I am a bit confused! Also, my PixelXDimension and PixelYDimension values seem to be wrong!

    Via http://en.wikipedia.org/wiki/Angle_of_view, this is the formula I plan to use:

    FOV = (IN_DEGREES(   2*atan( (d) / (2  * f) )   ));
    // d = sensor dimensions (mm)
    // f = focal length (mm)
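
    As a sanity check of this formula in Swift, using the 4.28 mm focal length from the Exif output above and an assumed sensor width of 4.54 mm (a figure often quoted for the iPhone 4S sensor, not an official one):

        import Foundation

        let f = 4.28   // focal length in mm (from the Exif output above)
        let d = 4.54   // assumed sensor width in mm
        let fov = 2 * atan(d / (2 * f)) * 180 / Double.pi
        print(fov)     // ≈ 55.9 degrees (horizontal)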
    

    My Question

    Do my method and my formula look right, and if yes, which values do I pass to the function?


    Clarifications

    • FOV is what I think I need to use; if you have any other suggestion for how the ruler can match reality, I would accept that answer!
    • Zoom is disabled in the augmented reality view controller, so my field of view is fixed when the camera is initialized and can't change until the user rotates the phone!

    • Pavan over 10 years
      Maybe try getting more reputation so that you can invest in a higher bounty... maybe that will help attract more attention.
    • Jim Merkel over 10 years
      ExifTool calculates FOV and some other things from the Exif data. Maybe you could look at the output from ExifTool and reverse-engineer it.
    • Humbertda over 10 years
      @Pavan: I think this is what I will do! @JimMerkel: It's a good idea; I did not know about ExifTool by Phil Harvey, thank you! I would give you the bounty if that were an answer rather than a comment!
    • tc. about 10 years
      The pixel dimensions are likely correct: the camera hardware is in video mode, which presumably defaults to recording at 1080p. The FOV ought to be calculable from the EXIF data assuming a "35mm" frame size of 36×24mm, but apparently the iPhone 4S sensor is 4.54×3.42mm, so the numbers don't quite work out (assuming no horizontal crop).
  • Humbertda over 10 years
    (Zoom is disabled; I added the information to the question, thanks!) So the magnification will not change; zoom is not appropriate for this application, so it's better to disable it, I think! Actually, this is the way I was thinking, but in pixels: if I have the FOV and the screen size, I can know how many pixels there are between two degrees (like you proposed with 1.5mm/10). I could also do it in mm and then convert to pixels, but my question is still pending: how do you get the camera FOV in iOS? (mm or degrees) Thanks for your answer!
  • Humbertda over 10 years
    OK, so it is simpler to keep my hardcoded array with the field of view of every iPhone, I guess! Thank you for your answer!
  • Humbertda about 10 years
    It's perfect! This answer deserves to be known by everybody... You made my day, thank you (I hope you will win the bounty!)
  • Nestor almost 5 years
    It's the wider of the two fields of view, i.e. the horizontal FOV when working in landscape orientation.