So, we are capturing frames by running an AVCaptureSession with an AVCaptureVideoDataOutput. In the capture function

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // ... frame handling happens here ...
    }

(Just to be clear about what we are doing:) we grab the data, turn it into a UIImage, analyze it, and if we see the feature we are looking for, we "freeze" the live view by copying the UIImage into a UIImageView, which we size to be full screen as follows:

    imageview_Confirm.frame.origin.x = 0 
    imageview_Confirm.frame.origin.y = 0 
    imageview_Confirm.frame.size.width = UIScreen.main.bounds.width
    imageview_Confirm.frame.size.height = UIScreen.main.bounds.height
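
(For reference, the "turn it into a UIImage" step above is the usual Core Image route, roughly along these lines; this is a simplified sketch, and our actual conversion code may differ slightly.)

    import AVFoundation
    import CoreImage
    import UIKit

    // Simplified sketch: convert the CMSampleBuffer delivered to captureOutput into a UIImage.
    func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        let context = CIContext()   // in real code, reuse one CIContext instead of creating one per frame
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }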

The intent is that our captured frame/UIImage will be shown over the live view and it will look as if the live view has "frozen", with the frame matching up with the live feed. When we run this on an iPhone 7 it works great; when we move to an iPhone 12 there is an odd horizontal "squeezing" of our image from the left and the right. (It's odd; it seems as if the iPhone 12 live view actually captured more content, off to the left and right, than it was displaying.)
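
Our working theory: if the live view is a standard AVCaptureVideoPreviewLayer using .resizeAspectFill (a setup roughly like the sketch below, where captureSession and view stand in for the real objects), then on the iPhone 12 the layer crops the camera frame, which is wider relative to its height than the 19.5:9 screen, while the captured buffer still contains the full frame. Squeezing that full frame into a screen-sized, scale-to-fill UIImageView would produce exactly the horizontal compression we see, and it would also explain why the iPhone 7, whose 16:9 screen presumably matches the capture aspect ratio, looks fine.

    // Assumed preview setup (sketch only; captureSession and view are placeholders):
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.videoGravity = .resizeAspectFill   // fills the screen by cropping the camera frame
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)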

Is there a proper way to size our UIImageView so that our grabbed data will appear to match the live view? Thanks very much for any help.
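
For example, one candidate we're wondering about is matching the image view's scaling to the preview's aspect-fill behavior rather than stretching it to the screen bounds, roughly like this (frozenImage here is a placeholder for the UIImage we build from the captured frame):

    // Candidate sizing: crop like an aspect-fill preview rather than stretching to fit.
    imageview_Confirm.frame = UIScreen.main.bounds
    imageview_Confirm.contentMode = .scaleAspectFill   // mirrors .resizeAspectFill on the preview layer
    imageview_Confirm.clipsToBounds = true             // discard the overflow on the left/right
    imageview_Confirm.image = frozenImage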



1 Answer

Waiting for an expert to reply.
