CN114900680A - 3D camera system, visual field correction method, device, equipment and storage medium - Google Patents

3D camera system, visual field correction method, device, equipment and storage medium

Info

Publication number
CN114900680A
Authority
CN
China
Prior art keywords
binocular
display
assembly
image
focal length
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210789153.3A
Other languages
Chinese (zh)
Other versions
CN114900680B (en)
Inventor
李娜娜
郭志飞
顾兆泰
刘满林
朱文华
安昕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oupu Mandi Technology Co ltd
Original Assignee
Guangdong Optomedic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Optomedic Technology Co Ltd filed Critical Guangdong Optomedic Technology Co Ltd
Priority to CN202210789153.3A
Publication of CN114900680A
Application granted
Publication of CN114900680B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H04N 13/30 Image reproducers
    • H04N 13/327 Calibration thereof
    • H04N 13/398 Synchronisation thereof; Control thereof
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals

Abstract

The invention relates to the technical field of video processing, and particularly discloses a 3D camera system, a visual field correction method, a device, equipment, and a storage medium. The 3D camera system comprises: a binocular imaging assembly for acquiring binocular image information; a focusing assembly connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly; a display assembly for displaying a 3D image constructed based on the binocular image information; and a controller in communication connection with the binocular imaging assembly, the focusing assembly, and the display assembly, used for sending the binocular image information acquired by the binocular imaging assembly to the display assembly to display the 3D image. The controller is also used for acquiring the focal length information generated when the focusing assembly adjusts the binocular imaging assembly, and for adjusting the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image. The system compensates and adjusts the offset between the two images of the binocular image information in real time, so the user does not experience vertigo.

Description

3D camera system, visual field correction method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of video processing technologies, and in particular, to a 3D camera system, a method, an apparatus, a device, and a storage medium for correcting a field of view.
Background
3D imaging technology strengthens an operator's spatial perception of the field of view by providing stereoscopic images; when applied to medical equipment such as endoscopes in particular, it makes the operative field clearer and the anatomical layers more distinct, overcoming the shortcomings of 2D imaging to a certain extent.
When human eyes observe an object, the eyeballs rotate according to the distance of the observed object, adjusting the binocular fields of view so that the two pictures match accurately; the images are then transmitted to the brain, which processes them into three-dimensional image information.
Existing 3D imaging technology is limited by the design of the binocular optical paths and cannot simulate the binocular field-of-view convergence produced by eyeball rotation when human eyes aim at different observation distances. Consequently, when the 3D images displayed by the imaging system correspond to different working distances (that is, when the focal lengths of the binocular optical paths are adjusted to obtain images of different sizes), varying degrees of visual field misalignment arise if the display device of the 3D image is not adjusted. After the observer receives the binocular information, part of it is in conflict when the brain synthesizes the stereoscopic image, causing poor stereoscopic perception, vertigo, and other discomfort.
In view of the above problems, no effective technical solution exists at present.
Disclosure of Invention
The application aims to provide a 3D camera system, a visual field correction method, a visual field correction device, an electronic device, and a storage medium, so as to avoid the visual field misalignment caused by focusing and the resulting vertigo of the user.
In a first aspect, the present application provides a 3D camera system for capturing and displaying stereoscopic images, the system comprising:
the binocular imaging component is used for acquiring binocular image information;
the focusing assembly is connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly;
the display component is used for displaying a 3D image formed based on the binocular image information;
the controller is in communication connection with the binocular imaging assembly, the focusing assembly and the display assembly and is used for sending the binocular image information acquired by the binocular imaging assembly to the display assembly so as to display the 3D image;
the controller is further used for acquiring focal length information generated by the focusing assembly adjusting the binocular imaging assembly, and adjusting the display position of the binocular image information on the display assembly according to the focal length information so as to display the 3D image.
The 3D camera system provided by the present application acquires, in real time through the controller, the focal length information generated during the focusing process of the focusing assembly, and adjusts the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so no visual field misalignment or torn-image perception arises and the user does not experience vertigo.
In the 3D camera system described above, the controller is in communication connection with the focusing assembly through a focal length feedback device, and the focal length feedback device generates the focal length information based on the adjustment amount of the focusing assembly.
In this example, a focal length feedback device is added to the system to promptly monitor and acquire the adjustment amount of the focusing assembly, generate the corresponding focal length information, and send it to the controller, thereby ensuring the accuracy and timeliness of the focal length information source so that the 3D image is adjusted more accurately and more promptly.
In a second aspect, the present application further provides a method for correcting a visual field, which is applied to a 3D camera system, where the 3D camera system includes:
the binocular imaging component is used for acquiring binocular image information;
the focusing assembly is connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly;
the display component is used for displaying a 3D image formed based on the binocular image information;
the visual field correction method comprises the following steps:
acquiring focal length information generated by the focusing assembly adjusting the binocular imaging assembly;
and adjusting the display position of the binocular image information on the display assembly according to the focal length information so as to display the 3D image.
The visual field correction method described above is applied to a 3D camera system. The method acquires, in real time, the focal length information generated during the focusing process of the focusing assembly and adjusts the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so no torn-image perception caused by visual field misalignment arises and the user does not experience vertigo.
The visual field correction method described above, wherein the step of adjusting the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image comprises:
acquiring a reference overlapping point or a reference overlapping area according to the overlapping area in the two images of the binocular image information;
and respectively adjusting the display positions of the two images of the binocular image information on the display assembly according to the focal length information, so that the reference overlapping point or the reference overlapping area is kept overlapped to display the 3D image.
The visual field correction method according to the present invention, wherein the reference overlapping point is any one or more points in the overlapping area.
The visual field correction method described above, wherein the reference overlapping point is a central point in the reference overlapping area.
In the method of this example, the central point of the reference overlapping region is the orthogonal point of the two images; whenever the two images produce an overlapping region, this central point necessarily exists, so using it as the reference overlapping point ensures that, under normal use conditions, the binocular image information can always be adjusted and corrected to generate the corresponding 3D image.
The visual field correction method described above, wherein the reference overlapping point or the reference overlapping area is located at the exact center of the display assembly.
In a third aspect, the present application further provides a visual field correction device, which is applied in a 3D camera system, where the 3D camera system includes:
the binocular imaging component is used for acquiring binocular image information;
the focusing assembly is connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly;
the display component is used for displaying a 3D image formed based on the binocular image information;
the visual field correction device includes:
the acquisition module is used for acquiring the focal length information generated by the focusing assembly adjusting the binocular imaging assembly;
and the correction module is used for adjusting the display position of the binocular image information on the display assembly according to the focal length information so as to display the 3D image.
The visual field correction device described above is applied to a 3D camera system. The device acquires, in real time, the focal length information generated during the focusing process of the focusing assembly and adjusts the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so no torn-image perception caused by visual field misalignment arises and the user does not experience vertigo.
In a fourth aspect, the present application further provides an electronic device comprising a processor and a memory, wherein the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, perform the steps of the method as provided in the second aspect.
In a fifth aspect, the present application also provides a storage medium having a computer program stored thereon, which, when executed by a processor, performs the steps of the method as provided in the second aspect above.
As can be seen from the foregoing, the present application provides a 3D camera system, a visual field correction method, a device, an electronic device, and a storage medium. The 3D camera system acquires, in real time through the controller, the focal length information generated during the focusing process of the focusing assembly, and adjusts the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so that the display positions of the two fields of view in the 3D image always keep a matching relation consistent with the previous moment and the binocular field-of-view relation of the 3D image remains consistent with the previous moment. The 3D image therefore always matches the vision of the human eyes during focusing, the user does not need to change interpupillary distance to adapt to the 3D image, no torn-image perception caused by visual field misalignment arises, and the user is spared vertigo.
Drawings
Fig. 1 is a schematic structural diagram of a 3D imaging system according to an embodiment of the present application.
Fig. 2 is a more preferred structural schematic diagram of a 3D imaging system according to an embodiment of the present application.
Fig. 3 is a flowchart of a visual field correction method according to an embodiment of the present application.
Fig. 4 is a front view of binocular image information generated by the focusing process.
Fig. 5 is a schematic top view of binocular image information generated by the focusing process.
Fig. 6 is a schematic structural diagram of a visual field correction device according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals: 101. binocular imaging assembly; 102. focusing assembly; 103. display assembly; 104. controller; 105. focal length feedback device; 301. acquisition module; 302. correction module; 401. processor; 402. memory; 403. communication bus.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
In a first aspect, please refer to Figs. 1 and 2, which are schematic structural diagrams of a 3D camera system for capturing and displaying stereoscopic images in some embodiments of the present application; the system includes:
the binocular imaging component 101 is used for acquiring binocular image information;
the focusing assembly 102 is connected with the binocular imaging assembly 101 and used for adjusting the focal length of the binocular imaging assembly 101;
a display component 103 for displaying a 3D image constructed based on binocular image information;
the controller 104 is in communication connection with the binocular imaging assembly 101, the focusing assembly 102 and the display assembly 103 and is used for sending binocular image information acquired by the binocular imaging assembly 101 to the display assembly 103 to display a 3D image;
the controller 104 is further configured to acquire focal length information generated by the focusing assembly 102 adjusting the binocular imaging assembly 101, and adjust a display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image.
Specifically, the binocular image information includes images of two fields of view; the two images form a 3D image in a partially overlapped manner and are displayed on the display assembly 103, and the two images are received by the user's two eyes through 3D glasses or another 3D display mode, so that a stereoscopic image is viewed on the display assembly 103.
More specifically, the binocular imaging assembly 101 is an optical imaging system that collects light and forms binocular image information through photosensitive exposure; because it needs to acquire images of two fields of view, it includes at least two optical imaging systems. Each optical imaging system generally includes an objective lens, an image-transferring lens group, an eyepiece, and a photosensitive element. The objective lens, image-transferring lens group, and eyepiece can be regarded as an imaging lens group for adjusting and collecting light within a specific range, while the photosensitive element is a light processor including at least one photosensitive sensor, used to generate corresponding electrical signals according to the light collected by the imaging lens group and form an image of the corresponding field of view.
More specifically, the focusing assembly 102 is configured to change the focal length with which the binocular imaging assembly 101 acquires the binocular image information; to ensure that the binocular image information can still form a 3D image, it should be noted that during the focusing process of the focusing assembly 102 the focal lengths of the two optical imaging systems in the binocular imaging assembly 101 are kept changing synchronously.
More specifically, the focusing process of the focusing assembly 102 may be manual adjustment, automatic adjustment adapted to the environment in the field of view according to the characteristics of the field of view acquired by the binocular imaging assembly 101, or adjustment performed according to adjustment data set by the user according to the use requirement; the final purpose is to bring the object to be observed in the binocular image information to sufficient definition. In the embodiment of the present application, the focusing process of the focusing assembly 102 generates corresponding focal length information. From the characteristics of optical imaging, the spatial size corresponding to the image of each field of view in the binocular image information changes with the focal length information, while the display screen of the display assembly 103 is generally of a fixed specification, and the images of the two fields of view carry certain offsets on the display screen to ensure matching with human binocular vision. During focusing, the spatial size corresponding to each image changes, so the two images shown on the display screen undergo what appears to be a zooming process; however, there is no reference associating the zooming of the two images, and because the positions of the two imaging lens groups are relatively fixed, no distance adjustment similar to a change of interpupillary distance can be produced directly. The zooming of the two images therefore produces a left-right deviation or left-right staggering, the interpupillary distance of the user's eyes cannot adjust correspondingly at once, and the resulting visual field misalignment causes a torn-image perception and, in severe cases, vertigo.
More specifically, in the process of actually adjusting the focal length of the binocular imaging assembly 101, the focal length information is directly associated with the zoom ratio of the binocular image information, and the zoom ratio is directly associated with the offset on the display assembly 103; the focal length information is therefore associated with the display position of each image of the binocular image information on the display assembly 103. The 3D camera system of the embodiment of the present application acquires, in real time through the controller 104, the focal length information generated during the focusing process of the focusing assembly 102, and adjusts the display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so that the display positions of the two fields of view in the 3D image always keep a matching relation consistent with the previous moment and the binocular field-of-view relation of the 3D image remains consistent with the previous moment. The 3D image thus always matches the vision of the human eyes during focusing, the user does not need to change interpupillary distance to adapt to the 3D image, no torn-image perception caused by visual field misalignment arises, and the user does not experience vertigo.
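As an illustrative, non-limiting sketch of this compensation idea (not the claimed implementation), the following fragment shows how a controller loop might map the current focal length information to a horizontal display offset and reposition the two views; the linear mapping, its gain, and all identifiers are assumptions introduced here purely for illustration.

```python
# Minimal sketch, assuming a linear focal-length-to-offset mapping; the real
# relation would follow the optical geometry of the binocular imaging assembly.

def offset_from_focal_length(focal_length_mm, gain_px_per_mm=12.0, reference_focal_mm=35.0):
    """Horizontal shift (pixels) to compensate for a given focal length (assumed linear)."""
    return gain_px_per_mm * (focal_length_mm - reference_focal_mm)

def controller_step(read_focal_length, read_binocular_images, send_to_display):
    """One pass of the assumed controller loop: read focal length and images,
    compute the compensating shift, and send each view to the display at its
    adjusted position."""
    focal_length_mm = read_focal_length()            # focal length information from the feedback path
    left_img, right_img = read_binocular_images()    # binocular image information
    shift = offset_from_focal_length(focal_length_mm)
    # Reposition each view so the overlap of the two fields stays aligned on screen;
    # the sign convention is arbitrary here.
    send_to_display(left_img, (-shift, 0))
    send_to_display(right_img, (+shift, 0))
```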
In some preferred embodiments, the controller 104 is in communication connection with the focusing assembly 102 through a focal length feedback device 105, and the focal length feedback device 105 generates the focal length information based on the adjustment amount of the focusing assembly 102.
Specifically, as noted above, the focusing assembly 102 may be adjusted manually or controlled electrically. To ensure that the controller 104 acquires the focal length information produced when the focusing assembly 102 focuses the binocular imaging assembly 101 more accurately and in a more timely manner, the embodiment of the present application adds the focal length feedback device 105 to promptly monitor the adjustment amount of the focusing assembly 102, generate the corresponding focal length information, and send it to the controller 104, thereby ensuring the accuracy and timeliness of the focal length information source and making the adjustment of the 3D image more accurate and more timely.
In some preferred embodiments, the focusing assembly 102 preferably includes an adjustment motor for simultaneously adjusting the assembly spacing within both imaging lens groups to change the imaging focal length; correspondingly, the focal length feedback device 105 is preferably a motor controller or a rheostat distance measuring device.
Specifically, when the focal length feedback device 105 is a motor controller, the motor controller drives the adjustment motor to rotate, based on a control amount input by the user, to change the focal length of the imaging lens groups; the motor controller records the rotation amount of the motor in real time, obtains the focal length variation of the imaging lens groups based on a preset conversion formula, generates the focal length information in real time, and feeds it back to the controller 104.
Specifically, when the focal length feedback device 105 is a rheostat distance measuring device, a rheostat is arranged on the motor. The adjusting end of the adjustment motor is connected with an adjusting slider for adjusting the focal length of the imaging lens groups; in other words, the motor adjusts the focal length by changing the position of the adjusting slider. The slider of the rheostat is connected with the adjusting slider, so sliding of the adjusting slider directly changes the resistance value of the rheostat, and the rheostat distance measuring device obtains the focal length information of the imaging lens groups based on the resistance value of the rheostat and a preset conversion formula and feeds it back to the controller 104 over the communication connection.
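A minimal sketch of the two feedback variants just described is given below; the patent only states that a preset conversion formula is used, so the linear formulas, constants, and function names here are illustrative assumptions rather than the claimed conversion.

```python
# Assumed linear conversions; a real device would use its calibrated
# "preset conversion formula" as referred to above.

def focal_from_motor_steps(step_count, mm_per_step=0.005, base_focal_mm=20.0):
    """Motor-controller variant: accumulate the recorded rotation (in steps) into a focal length."""
    return base_focal_mm + step_count * mm_per_step

def focal_from_rheostat(resistance_ohm, r_min=100.0, r_max=10_000.0,
                        focal_min_mm=20.0, focal_max_mm=50.0):
    """Rheostat variant: map the measured resistance linearly onto the focal length range."""
    t = (resistance_ohm - r_min) / (r_max - r_min)
    return focal_min_mm + t * (focal_max_mm - focal_min_mm)
```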
In some preferred embodiments, the adjustment motor is preferably a stepping motor, so that the binocular imaging assembly 101 can be accurately adjusted to different focal lengths in discrete steps, and the focal length information acquired by the focal length feedback device 105 is likewise stepwise, thereby ensuring that the 3D image can be adjusted accurately.
In some preferred embodiments, the controller 104 serves as the processing center for the binocular image information: it collects the binocular image information generated by the binocular imaging assembly 101, adjusts the coincidence of the two images in the binocular image information according to the focal length information obtained from the focal length feedback device 105, sends the binocular image information to the display assembly 103 after the adjustment is completed, and displays the two position-adjusted images of the binocular image information at the corresponding positions of the display assembly 103 to display the 3D image.
In a second aspect, please refer to Fig. 3, which is a flowchart of a visual field correction method provided in some embodiments of the present application; the method is applied in a 3D camera system, and the 3D camera system includes:
the binocular imaging component 101 is used for acquiring binocular image information;
the focusing assembly 102 is connected with the binocular imaging assembly 101 and used for adjusting the focal length of the binocular imaging assembly 101;
a display component 103 for displaying a 3D image constructed based on binocular image information;
the visual field correction method comprises the following steps:
s1, acquiring focal length information generated by the focusing assembly 102 adjusting the binocular imaging assembly 101;
s2, adjusting the display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image.
The visual field correction method described above is applied to a 3D camera system. The method acquires, in real time, the focal length information generated during the focusing process of the focusing assembly 102 and adjusts the display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so that the display positions of the two fields of view in the 3D image always keep a matching relation consistent with the previous moment and the binocular field-of-view relation of the 3D image remains consistent with the previous moment, no torn-image perception caused by visual field misalignment arises, and the user does not experience vertigo.
In some preferred embodiments, the step of adjusting the display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image includes:
s21, acquiring a reference overlapping point or a reference overlapping area according to the overlapping area of the two images of the binocular image information;
specifically, in order to ensure that the 3D image generated after adjustment does not generate the view misalignment phenomenon, it is necessary to determine reference data for guiding the display positions of the two images of the binocular image information in the display assembly 103, and the method of the embodiment of the present application may intuitively complete the position adjustment of the two images of the binocular image information in the display assembly 103 by using the reference overlapping point or the reference overlapping area as the reference data.
More specifically, the overlapping area in the two images of the binocular image information is the overlapping range in which the two optical imaging systems in the binocular imaging assembly 101 acquire the object.
More specifically, the reference overlap point or the reference overlap area is overlap data for matching display in the overlap range.
S22, adjusting the display positions of the two images of the binocular image information on the display module 103 according to the focal length information, respectively, so that the reference overlapping point or the reference overlapping area remains overlapped to display the 3D image.
Specifically, the reference overlapping point or reference overlapping area may be set within a preset overlapping area, or set within the overlapping area of the two images at the previous moment. Since the method of the embodiment of the present application needs to keep the focused 3D image free of visual field misalignment, the reference overlapping point or reference overlapping area may be set based on the overlapping area of the two images of the binocular image information at the previous moment, ensuring that the reference overlapping point or reference overlapping area in the adjusted 3D image at the next moment still coincides and misalignment is avoided; alternatively, it may be acquired from a preset overlapping area, so that the 3D image at every moment is adjusted to keep the reference overlapping point or reference overlapping area always coincident and misalignment is avoided.
More specifically, keeping the reference overlapping point or reference overlapping area overlapped means adjusting the reference overlapping points of the two images on the display assembly 103 so that they coincide, or adjusting the corresponding outer contours of the reference overlapping areas of the two images so that they coincide.
In some preferred embodiments, steps S21 to S22 preferably perform the adjustment and correction of the 3D image using the reference overlapping point, so as to reduce the amount of data processed during adjustment.
In some preferred embodiments, because the two imaging lens groups are adjusted synchronously during focusing, the offsets produced on the display assembly 103 by the two images of the binocular image information, if left unadjusted, are equal and opposite. Therefore, in the embodiment of the present application, step S22 may generate, according to the focal length information, position adjustment information for one image of the binocular image information to adjust its position on the display assembly 103, and simultaneously generate the opposite position adjustment information to adjust the position of the other image on the display assembly 103 to generate the 3D image. This processing mode halves the amount of data to be processed, improves image processing efficiency, and reduces the lag between image acquisition by the binocular imaging assembly 101 and display on the display assembly 103.
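The symmetric adjustment described above can be sketched as follows; the helper compute_shift_px stands in for whatever mapping between focal length information and pixel offset the system actually uses, and is an assumption introduced here for illustration.

```python
# Sketch of the mirrored-offset idea: the correction is computed once for one
# view and its negation is reused for the other, halving the per-frame work.

def place_views(focal_length_mm, compute_shift_px):
    shift = compute_shift_px(focal_length_mm)   # computed a single time
    left_offset = (+shift, 0)                   # one view is shifted one way...
    right_offset = (-shift, 0)                  # ...the other is mirrored, with no second computation
    return left_offset, right_offset

# Example usage with an assumed linear mapping:
left, right = place_views(42.0, lambda f: 12.0 * (f - 35.0))
```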
In some preferred embodiments, the reference overlapping point is any one or more points in the reference overlapping area.
Specifically, in step S21, one or more points may be set as reference overlapping points to serve as the adjustment reference in step S22; setting more reference overlapping points can effectively improve the adjustment accuracy of step S22 and ensure that the 3D image exhibits no visual field misalignment. Considering that the positions of the two optical imaging systems of the binocular imaging assembly 101 are relatively fixed, the visual field misalignment of the generated binocular image information is a fixed linear offset; therefore, in the embodiment of the present application, step S21 preferably uses a single point as the reference overlapping point, which guarantees the adjustment accuracy of the 3D image while improving the efficiency of adjustment and correction.
In some preferred embodiments, the two images of binocular image information corresponding to different focal length information have different overlapping areas, and an arbitrarily set reference overlapping point may fall outside the overlapping area of the two images when the focal length information is small, causing the adjustment of the 3D image to fail; therefore, in the embodiment of the present application, the reference overlapping point is preferably the central point of the overlapping area, i.e., the point O in Figs. 4 and 5.
In some preferred embodiments, the reference overlapping point or reference overlapping area is located at the exact center of the display assembly 103.
Specifically, in the embodiment of the present application, the automatic correction and adjustment of the image offset is realized based on the reference overlapping point or reference overlapping area, and the two corrected images need to be displayed on the display assembly 103 as a 3D image, so the display positions of the two corrected images on the display assembly 103 must also be determined.
Specifically, the central point of the reference overlapping region is the orthogonal point of the two images; whenever the two images produce an overlapping region, this central point necessarily exists, so using it as the reference overlapping point ensures that, under normal use conditions, the binocular image information can always be adjusted and corrected to generate the corresponding 3D image.
More specifically, as shown in Fig. 4, assume that the boundaries of the two images of the binocular image information acquired by the binocular imaging assembly 101 are both circular, that the 3D image displayed by the display assembly 103 is rectangular, and that the reference overlapping point is defined as the central point O of the overlapping area. LD1 and RD1 are the two images of the binocular image information acquired by the binocular imaging assembly 101 before focusing, and LP1 is the corresponding picture displayed on the display assembly 103. When the focal length information increases by a factor of 1.5, the two images of the binocular image information acquired by the binocular imaging assembly 101 become LD2 and RD2, respectively. If the 3D image is not adjusted at this point, the image corresponding to the left field of view is enlarged about the center of its imaging lens group and becomes LP2 in the displayed picture; it can be seen that LP2 shifts to the right, away from the other focused image. The method of the embodiment of the present application aims to compensate for this portion of the offset, adjusting the content displayed by LP2 so that it coincides at point O with the content displayed by the other image on the display assembly 103, thereby preventing the user from experiencing vertigo.
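As a toy numeric illustration of this compensation (values invented, not from the patent): if the displayed left view is magnified by 1.5x about the center of its imaging lens group, a feature that previously sat at a horizontal offset p from that center moves to 1.5p, so translating the view back by 0.5p restores the coincidence of point O with the other view.

```python
# Toy numbers only; the point is that magnification about the lens-group centre
# shifts the displayed content, and the compensation is the translation back.
zoom = 1.5
p = 120.0                           # assumed px offset of point O from the left view's centre before focusing
position_after_zoom = zoom * p      # where that content lands after the 1.5x magnification
compensation = p - position_after_zoom
print(compensation)                 # -60.0 px: translate the left view by this amount
```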
More specifically, as shown in Fig. 5, the reference overlapping point is defined as the central point O of the overlapping area. The diameter of a single image is e = 2c·tanQ + a. The diameter of the overlapping area of the two images is d = 2·(c − b/(2·tanQ))·tanQ = 2c·tanQ − b. The ratio of the horizontal distance from the central point O of the overlapping area to the edge of the overlapping area, to the diameter of a single image, is x = d/(2e) = (2c·tanQ − b)/(2·(2c·tanQ + a)) = (2c·tanQ − b)/(4c·tanQ + 2a). That is, when the focusing distance is c, the proportional position of the central point O in each image is fixed: O lies at a distance x·e from the right edge of the left image and at a distance x·e from the left edge of the right image. The method of the embodiment of the present application can therefore quickly determine the position of the central point O (i.e., the reference overlapping point) from the ratio x to guide the correction of the image positions. When the binocular fields of view are displayed in 3D, translating the two fields so that the central point O of their overlapping area coincides matches the two pictures accurately, so the 3D image does not become misaligned while the focal length is adjusted and the user does not experience vertigo.
In fig. 5, a is the diameter of the objective lens, b is the pitch between the two objective lenses, and Q is the field spread angle of the objective lens.
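The geometry above can be checked numerically with a short fragment; the sample values of a, b, c, and Q are invented solely to exercise the formulas and do not come from the patent.

```python
import math

def overlap_ratio(a, b, c, Q_deg):
    """Return (e, d, x) for objective diameter a, objective spacing b,
    focusing distance c, and field spread angle Q, per the formulas above."""
    tanQ = math.tan(math.radians(Q_deg))
    e = 2 * c * tanQ + a              # diameter of a single image
    d = 2 * c * tanQ - b              # diameter of the overlapping area of the two images
    x = d / (2 * e)                   # = (2c*tanQ - b) / (4c*tanQ + 2a)
    return e, d, x

# Invented sample values: a = 4 mm, b = 6 mm, c = 50 mm, Q = 35 degrees.
e, d, x = overlap_ratio(4.0, 6.0, 50.0, 35.0)
# Point O then lies x*e from the right edge of the left image and x*e from
# the left edge of the right image (x*e equals half the overlap width d/2).
print(f"e = {e:.1f} mm, d = {d:.1f} mm, x = {x:.3f}, x*e = {x * e:.1f} mm")
```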
In a third aspect, referring to Fig. 6, which is a schematic structural diagram of a visual field correction device according to an embodiment of the present application, the device is applied to a 3D camera system, and the 3D camera system includes:
the binocular imaging component 101 is used for acquiring binocular image information;
the focusing assembly 102 is connected with the binocular imaging assembly 101 and used for adjusting the focal length of the binocular imaging assembly 101;
a display component 103 for displaying a 3D image constructed based on binocular image information;
the visual field correction device includes:
the acquisition module 301 is used for acquiring the focal length information generated by the binocular imaging assembly 101 adjusted by the focusing assembly 102;
and the correcting module 302 is used for adjusting the display position of the binocular image information on the display component 103 according to the focal length information so as to display the 3D image.
The visual field correction device described above is applied to a 3D camera system. The device acquires, in real time, the focal length information generated during the focusing process of the focusing assembly 102 and adjusts the display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so that the display positions of the two fields of view in the 3D image always keep a matching relation consistent with the previous moment and the binocular field-of-view relation of the 3D image remains consistent with the previous moment, no torn-image perception caused by visual field misalignment arises, and the user does not experience vertigo.
In some preferred embodiments, the vision correction device is preferably a controller 104 in a 3D camera system.
In some preferred embodiments, the visual field correction device of the embodiment of the present application is used for performing the visual field correction method provided by the second aspect.
In a fourth aspect, referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application, where the present application provides an electronic device including: the processor 401 and the memory 402, the processor 401 and the memory 402 being interconnected and communicating with each other via a communication bus 403 and/or other form of connection mechanism (not shown), the memory 402 storing a computer program executable by the processor 401, the processor 401 executing the computer program when the computing device is running to perform the method of any of the alternative implementations of the embodiments described above.
In a fifth aspect, the present application provides a storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program performs the method in any optional implementation manner of the foregoing embodiments. The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic Memory, a flash Memory, a magnetic disk, or an optical disk.
In summary, embodiments of the present application provide a 3D camera system, a visual field correction method, a device, an electronic device, and a storage medium. The 3D camera system acquires, in real time through the controller 104, the focal length information generated during the focusing process of the focusing assembly 102, and adjusts the display position of the binocular image information on the display assembly 103 according to the focal length information to display the 3D image; that is, the offset between the two images of the binocular image information is compensated and adjusted in real time, so that the display positions of the two fields of view in the 3D image always keep a matching relation consistent with the previous moment and the binocular field-of-view relation of the 3D image remains consistent with the previous moment. The 3D image therefore always matches the vision of the human eyes during focusing, the user does not need to change interpupillary distance to adapt to the 3D image, no torn-image perception caused by visual field misalignment arises, and the user does not experience vertigo.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A 3D camera system for capturing and displaying stereoscopic images, the system comprising:
the binocular imaging component is used for acquiring binocular image information;
the focusing assembly is connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly;
the display component is used for displaying a 3D image formed based on the binocular image information;
the controller is in communication connection with the binocular imaging assembly, the focusing assembly and the display assembly and is used for sending the binocular image information acquired by the binocular imaging assembly to the display assembly so as to display the 3D image;
the controller is further used for acquiring focal length information generated by the focusing assembly adjusting the binocular imaging assembly, and adjusting the display position of the binocular image information on the display assembly according to the focal length information so as to display the 3D image.
2. The 3D camera system of claim 1, wherein the controller is in communication connection with the focusing assembly through a focal length feedback device, and the focal length feedback device generates the focal length information based on an adjustment amount of the focusing assembly.
3. A visual field correction method applied to a 3D camera system, wherein the 3D camera system includes:
the binocular imaging component is used for acquiring binocular image information;
the focusing assembly is connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly;
the display component is used for displaying a 3D image formed based on the binocular image information;
the visual field correction method comprises the following steps:
acquiring focal length information generated by the focusing assembly adjusting the binocular imaging assembly;
and adjusting the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image.
4. The visual field correction method according to claim 3, wherein the step of adjusting the display position of the binocular image information on the display assembly according to the focal length information to display the 3D image comprises:
acquiring a reference overlapping point or a reference overlapping area according to the overlapping area in the two images of the binocular image information;
and respectively adjusting the display positions of the two images of the binocular image information on the display assembly according to the focal length information, so that the reference overlapping point or the reference overlapping area is kept overlapped to display the 3D image.
5. The visual field correction method according to claim 4, wherein the reference overlapping point is any one or more points in the overlapping region.
6. The visual field correction method according to claim 5, wherein the reference overlapping point is a central point in the reference overlapping region.
7. The visual field correction method according to claim 4, wherein the reference overlapping point or the reference overlapping area is located at the exact center of the display assembly.
8. A visual field correction apparatus for use in a 3D camera system, the 3D camera system comprising:
the binocular imaging component is used for acquiring binocular image information;
the focusing assembly is connected with the binocular imaging assembly and used for adjusting the focal length of the binocular imaging assembly;
the display component is used for displaying a 3D image formed based on the binocular image information;
the visual field correction device includes:
the acquisition module is used for acquiring the focal length information generated by the focusing assembly adjusting the binocular imaging assembly;
and the correction module is used for adjusting the display position of the binocular image information on the display assembly according to the focal length information so as to display the 3D image.
9. An electronic device comprising a processor and a memory, said memory storing computer readable instructions which, when executed by said processor, perform the steps of the method according to any one of claims 3 to 7.
10. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the method according to any of claims 3-7.
CN202210789153.3A 2022-07-06 2022-07-06 3D imaging system, visual field correction method, device, equipment and storage medium Active CN114900680B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210789153.3A CN114900680B (en) 2022-07-06 2022-07-06 3D imaging system, visual field correction method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114900680A true CN114900680A (en) 2022-08-12
CN114900680B (en) 2022-10-28

Family

ID=82729394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210789153.3A Active CN114900680B (en) 2022-07-06 2022-07-06 3D imaging system, visual field correction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114900680B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005117193A (en) * 2003-10-03 2005-04-28 Ntt Docomo Inc Imaging terminal, image display terminal, and image display system
US20070070476A1 (en) * 2005-09-20 2007-03-29 Sony Corporation Three-dimensional display
US20070132863A1 (en) * 2005-12-14 2007-06-14 Sony Corporation Image taking apparatus, image processing method, and image processing program
WO2011024423A1 (en) * 2009-08-28 2011-03-03 パナソニック株式会社 Control device for stereoscopic image display and imaging device for stereoscopic images
WO2022086976A1 (en) * 2020-10-19 2022-04-28 Pictometry International Corp. Variable focal length multi-camera aerial imaging system and method

Also Published As

Publication number Publication date
CN114900680B (en) 2022-10-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 528253 Room 503, Floor 5, Building A, Jingu Zhichuang Industrial Community, No. 2, Yong'an North Road, Dawu Community, Guicheng Street, Nanhai District, Foshan City, Guangdong Province (residence declaration)

Patentee after: Guangdong Oupu Mandi Technology Co.,Ltd.

Address before: 528251 room 503, floor 5, building a, Jingu Zhichuang industrial community, No. 2, Yong'an North Road, Dawei community, Guicheng Street, Nanhai District, Foshan City, Guangdong Province

Patentee before: GUANGDONG OPTOMEDIC TECHNOLOGY CO.,LTD.