So, we are capturing frames by starting an AVCaptureVideoDataOutput session. In the capture delegate method

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

(just to be clear about what we are doing) we grab the data, turn it into a UIImage, analyze it, and if we see the feature we are looking for, we "freeze" the live view by copying the UIImage into a UIImageView that we size to be full screen, as follows:
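For reference, our buffer-to-UIImage conversion is roughly like the sketch below (the actual code differs slightly; the shared CIContext and the helper name are just for illustration):

```swift
import UIKit
import AVFoundation

// Reusable context -- creating one per frame is expensive.
let ciContext = CIContext()

// Sketch: convert a captured CMSampleBuffer into a UIImage.
func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    // Orientation may need adjusting depending on the connection's videoOrientation.
    return UIImage(cgImage: cgImage)
}
```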
imageview_Confirm.frame = UIScreen.main.bounds
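For completeness, here is roughly how we configure the overlay (the .scaleAspectFill contentMode is our guess at what should match the preview layer's videoGravity, which defaults to .resizeAspectFill; capturedImage is a placeholder name for the UIImage from the frame grab):

```swift
// Sketch: size the frozen-frame overlay to the full screen and give it a
// content mode intended to match the preview layer's videoGravity
// (assumed here to be .resizeAspectFill, the AVCaptureVideoPreviewLayer default).
imageview_Confirm.frame = UIScreen.main.bounds
imageview_Confirm.contentMode = .scaleAspectFill
imageview_Confirm.clipsToBounds = true
imageview_Confirm.image = capturedImage // hypothetical: the UIImage from the grabbed frame
```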
The intent is that the captured frame/UIImage will be shown over the live view so it looks as if the live view has "frozen", with the frame lining up with the live feed. On an iPhone 7 this works great, but on an iPhone 12 there is an odd horizontal "squeezing" of our image from the left and the right. (Oddly, it seems the iPhone 12 live view actually captured more content, off to the left and right, than it was displaying.)
Is there a proper way to size our UIImageView so that our grabbed data will appear to match the live view? Thanks very much for any help.