CN110689583B - Calibration method, calibration device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN110689583B
CN110689583B (application CN201910850140.0A)
Authority
CN
China
Prior art keywords
calibration
image
camera device
qualified
images
Prior art date
Legal status
Active
Application number
CN201910850140.0A
Other languages
Chinese (zh)
Other versions
CN110689583A (en)
Inventor
Not disclosed (不公告发明人)
Current Assignee
Suzhou Zhendi Intelligent Technology Co Ltd
Original Assignee
Suzhou Zhendi Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Zhendi Intelligent Technology Co Ltd filed Critical Suzhou Zhendi Intelligent Technology Co Ltd
Priority to CN201910850140.0A
Publication of CN110689583A
Application granted
Publication of CN110689583B
Legal status: Active

Classifications

    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (under G06T7/00 Image analysis)
    • G06T7/0002 Inspection of images, e.g. flaw detection (under G06T7/00 Image analysis)
    • G06T2207/30168 Image quality inspection (under G06T2207/30 Subject of image; Context of image processing)
    • G06T2207/30244 Camera pose (under G06T2207/30 Subject of image; Context of image processing)

Abstract

The application relates to the technical field of camera calibration and provides a calibration method, a calibration apparatus, a storage medium, and an electronic device. The calibration method comprises the following steps: instructing a camera device, via a dynamic effect displayed in the region of the calibration interface that contains the guide feature, to acquire images; changing the position of the guide feature in the calibration interface and instructing the camera device to continue acquiring images until the camera device has acquired a preset number of qualified images; and performing a calibration calculation on the preset number of qualified images to obtain a calibration result for the camera device. Because the calibration interface interacts well with the user, the method helps the user complete calibration quickly and improves the user's experience of the calibration process. In addition, the flow is simply a process of repeatedly acquiring qualified images, so the steps are simple and easy for ordinary users to master.

Description

Calibration method, calibration device, storage medium and electronic equipment
Technical Field
The invention relates to the technical field of camera calibration, and in particular to a calibration method, a calibration apparatus, a storage medium, and an electronic device.
Background
At present, devices equipped with cameras are used ever more widely in many fields. Before a manufactured device is formally put into use, the parameters of its camera generally need to be calibrated. In the past this calibration was performed by professionals; for ordinary users the process is tedious and complicated, and its interactivity is poor.
Disclosure of Invention
Embodiments of the present application provide a calibration method, a calibration apparatus, a storage medium, and an electronic device to solve the above technical problem.
In order to achieve the above purpose, the present application provides the following technical solutions:
In a first aspect, an embodiment of the present application provides a calibration method, comprising: instructing a camera device, via a dynamic effect displayed in the region of the calibration interface that contains the guide feature, to acquire images; changing the position of the guide feature in the calibration interface and instructing the camera device to continue acquiring images until it is determined that the camera device has acquired a preset number of qualified images; and performing a calibration calculation on the preset number of qualified images to obtain a calibration result for the camera device.
The method obtains the preset number of qualified images required for calibration through multiple rounds of image acquisition. In one round, the guide feature is displayed at a given position in the calibration interface, and the camera device is required to acquire images of the region of the calibration interface that contains the guide feature until a qualified image of that region is captured. Once the camera device is confirmed to have acquired a qualified image, the display position of the guide feature in the calibration interface is changed and a new round of acquisition begins; this repeats until the preset number of qualified images has been obtained. During each round, a dynamic effect is displayed in the region containing the guide feature. The effect prompts the user to move the camera device and/or the calibration interface so that the camera device can capture the region containing the guide feature from a suitable position and angle, thereby obtaining a qualified image.
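The multi-round flow just described can be sketched in code. This is a minimal sketch, not the patent's implementation: the callables `capture` and `is_qualified` and the parameter names are illustrative placeholders standing in for the camera interface and the qualification check.

```python
def run_calibration(positions, capture, is_qualified, required):
    """Multi-round acquisition loop (sketch).

    positions    -- preset sequence of guide-feature positions in the interface
    capture      -- callable returning the camera's next image for a position
    is_qualified -- callable deciding whether an image is qualified
    required     -- preset number of qualified images needed for calibration
    """
    qualified = []
    for pos in positions:
        # One round: the guide feature (with its dynamic effect) is shown at
        # `pos`, and acquisition continues until a qualified image is obtained.
        while True:
            img = capture(pos)
            if is_qualified(img):
                qualified.append(img)
                break
        if len(qualified) >= required:
            break  # preset number reached; proceed to calibration calculation
    return qualified
```

A fake camera that yields a qualified image on its second attempt at each position exercises the loop without any hardware.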
Because the user is prompted by a dynamic effect throughout the calibration process, the calibration interface interacts well with the user, helping the user complete calibration of the camera device quickly and improving the experience of the calibration process. In addition, the flow is simply a process of repeatedly acquiring qualified images, so the steps are simple and easy for ordinary users to master.

The camera device may be a camera itself or any of various devices equipped with a camera. By easing the problems in the device calibration process, the method helps popularize such devices among ordinary users.
In one implementation of the first aspect, determining that the camera device has acquired a qualified image comprises: receiving, in real time, the images acquired by the camera device; detecting the guide feature in each received image; and determining that the camera device has acquired a qualified image when the detected guide feature satisfies preset conditions.

In this implementation, the camera device is only responsible for acquiring images; whether an image is qualified is judged by the terminal device (which may be the device displaying the calibration interface), so the computational burden on the camera device is light.
In one implementation of the first aspect, the preset conditions include: the image contains the guide feature in its entirety; and the size of the guide feature in the image exceeds a preset size, and/or the similarity between the shape of the guide feature as it appears in the image and its original shape exceeds a preset degree.

In this implementation, containing the complete guide feature means that the guide feature must lie entirely within the image; it may not be only partially included, let alone absent altogether. This restricts the camera device to capturing the region that contains the guide feature rather than some other region. Requiring the size of the guide feature in the image to exceed a preset size means the guide feature must not be too small; this restricts the camera device to capturing from a position close to the calibration interface rather than far from it. Requiring the similarity between the shape of the guide feature in the image and its original shape to exceed a preset degree means the guide feature must not be noticeably deformed; this restricts the camera device to capturing the region containing the guide feature as nearly head-on as possible, rather than from the side. These restrictions are requirements on the images, but they equally correspond to requirements on the user who controls the camera device.
In one implementation of the first aspect, determining that the camera device has acquired a qualified image comprises: receiving an acquisition-success signal sent by the camera device, the signal indicating that the camera device has acquired a qualified image.

In this implementation, whether an image is qualified is judged by the camera device itself; if the image is qualified, an acquisition-success signal is sent to the terminal device, so the computational burden on the terminal device is light. Whether the acquired images are also sent to the terminal device in real time is optional.
In one implementation of the first aspect, determining that the camera device has acquired a qualified image comprises: receiving an image sent by the camera device and thereby determining that the camera device has acquired a qualified image, the camera device sending an image only after determining that it is qualified.

In this implementation, whether an image is qualified is judged by the camera device; a qualified image is sent to the terminal device and an unqualified one is not. Every image the terminal device receives is therefore necessarily qualified, and no separate signal indicating a successful acquisition needs to be sent. This reduces the amount of data transmitted during calibration, and the computational burden on the terminal device is light.
In one implementation of the first aspect, performing the calibration calculation on the preset number of qualified images comprises: performing the calibration calculation on the preset number of qualified images stored locally.

In this implementation, the camera device has already sent the preset number of qualified images to the terminal device during acquisition (depending on the implementation, other images may be sent as well); the terminal device stores them locally and reads them directly when performing the calibration calculation.
In one implementation of the first aspect, performing the calibration calculation on the preset number of qualified images comprises: obtaining the preset number of qualified images from the camera device and performing the calibration calculation on them.

In this implementation, the camera device does not send the qualified images to the terminal device during acquisition but stores them locally, and the terminal device must obtain them from the camera device when performing the calibration calculation. They may be obtained either through an active request by the terminal device or by the camera device actively sending them once it has acquired the preset number of qualified images.
In one implementation of the first aspect, the dynamic effect comprises one or more of the following effects: a change in the display state of the guide feature; a change in the display state of the feature points in the region containing the guide feature; a change in the display state of the background of the region containing the guide feature; a change in the display state of the edge of the region containing the guide feature; and a change in the display state of a geometric figure superimposed on the region containing the guide feature.

In this implementation, the dynamic effect has multiple selectable modes. The terminal device may implement only one of the effects, or several, in which case one of them is selected automatically at calibration time or according to the user's preference and presented in the calibration interface.
In one implementation of the first aspect, the change in display state comprises one or more of the following: blinking, a color change, and a brightness change.
In one implementation of the first aspect, changing the position of the guide feature in the calibration interface comprises: changing the position of the guide feature in the calibration interface in a clockwise or counterclockwise direction.

In this implementation, the position of the guide feature changes in clockwise (or counterclockwise) order, which better matches the user's habits: for example, the camera device can move continuously in a single direction throughout image acquisition.
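As an illustration, a preset clockwise sequence of guide-feature positions might be generated as below. The corner margin, the number of positions, and the trailing centre position are assumptions for the sketch; the patent only requires that the positions be visited in clockwise (or counterclockwise) order.

```python
def clockwise_positions(width, height, margin=0.2):
    """Preset guide-feature positions visited in clockwise order (sketch).

    Returns interface coordinates for the four corner regions, clockwise
    from the top-left, followed by the centre of the interface.
    """
    left, right = margin * width, (1 - margin) * width
    top, bottom = margin * height, (1 - margin) * height
    return [(left, top), (right, top), (right, bottom),
            (left, bottom), (width / 2, height / 2)]
```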
In one implementation of the first aspect, the method further comprises: each time the camera device is determined to have acquired a qualified image, refreshing a progress bar in the calibration interface that represents the image-acquisition progress.

In this implementation, the acquisition progress is shown by a progress bar so that the user can follow how far calibration has advanced; if the calibration process takes a long time, displaying progress is the user-friendly thing to do.
In one implementation of the first aspect, the method further comprises: displaying the images acquired by the camera device in real time in a preview window of the calibration interface.

In this implementation, the camera device's live view is shown in a preview window, making it easy for the user to move the camera device and/or the calibration interface according to the picture; the camera device can then acquire qualified images more quickly, improving calibration efficiency.
In one implementation of the first aspect, when the camera device uses a monocular camera, the preset number is at least three.

In one implementation of the first aspect, when the camera device uses a binocular camera, the preset number is at least three pairs.
In one implementation of the first aspect, the guide feature comprises at least three geometric figures.
In one implementation of the first aspect, after obtaining the calibration result for the camera device, the method further comprises: judging whether the calibration result meets the calibration requirement and outputting the judgment result.

In this implementation, the user is prompted with the output of the judgment; if the output indicates that the calibration requirement is not met, the user can take a corresponding measure, for example recalibrating.
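The patent does not say how the calibration requirement is checked. A common choice, shown here as a sketch, is to threshold the mean reprojection error reported by the calibration calculation; the threshold value is illustrative.

```python
def judge_calibration(reprojection_errors, max_mean_error=0.5):
    """Judge a calibration result (sketch).

    reprojection_errors -- per-image reprojection errors in pixels, as
                           produced by the calibration calculation
    max_mean_error      -- acceptance threshold (illustrative value)
    Returns (passed, mean_error) so the interface can output the judgment.
    """
    mean_error = sum(reprojection_errors) / len(reprojection_errors)
    return mean_error <= max_mean_error, mean_error
```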
In an implementation manner of the first aspect, after the determining whether the calibration result meets the calibration requirement, the method further includes: and if the calibration result meets the calibration requirement, sending the calibration result to the camera device.
In this implementation, the terminal device can send the calibration result to the camera device for use. For example, if the camera device is an unmanned aerial vehicle with a binocular camera, the vehicle can use the images captured by the binocular camera together with the calibration result to compute scene depth and thereby avoid obstacles automatically during flight.
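For a rectified binocular camera, the scene depth mentioned above follows from the calibration result by the standard triangulation relation: depth equals focal length (in pixels) times baseline divided by disparity. The sketch below states that relation; the numeric values in the example are illustrative, not from the patent.

```python
def scene_depth(focal_px, baseline_m, disparity_px):
    """Depth from a calibrated, rectified binocular camera (sketch).

    focal_px     -- focal length in pixels, from the calibration result
    baseline_m   -- distance between the two cameras in metres, from the
                    calibration result
    disparity_px -- horizontal shift of a point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```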
In a second aspect, an embodiment of the present application provides a calibration apparatus, comprising: a dynamic prompt module for instructing the camera device, via a dynamic effect displayed in the region of the calibration interface that contains the guide feature, to acquire images; an image confirmation module for determining that the camera device has acquired a qualified image; a feature display module for changing the position of the guide feature in the calibration interface and instructing the camera device to continue acquiring images until the image confirmation module determines that the camera device has acquired a preset number of qualified images; and a calculation module for performing a calibration calculation on the preset number of qualified images to obtain a calibration result for the camera device.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium storing computer program instructions that, when read and executed by a processor, perform the steps of the method provided by the first aspect or any of its possible implementations.

In a fourth aspect, an embodiment of the present application provides an electronic device comprising a memory and a processor, the memory storing computer program instructions that, when read and executed by the processor, perform the steps of the method provided by the first aspect or any of its possible implementations.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 illustrates a schematic diagram of a calibration scenario provided in an embodiment of the present application;
fig. 2 shows a flowchart of a calibration method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating a calibration interface provided by an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a calibration plate provided by an embodiment of the present application;
FIG. 5 is a diagram illustrating a first dynamic effect provided by an embodiment of the present application;
FIG. 6 is a diagram illustrating a second dynamic effect provided by an embodiment of the present application;
FIG. 7 is a diagram illustrating a third dynamic effect provided by an embodiment of the present application;
FIG. 8 is a diagram illustrating a fourth dynamic effect provided by an embodiment of the present application;
FIG. 9 is a diagram illustrating a fifth dynamic effect provided by an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating one manner of changing the position of a guide feature provided by an embodiment of the present application;
FIG. 11 is a functional block diagram of a calibration apparatus provided in an embodiment of the present application;
fig. 12 shows a schematic diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. Note that like reference numbers and letters denote like items in the figures; once an item is defined in one figure, it need not be defined and explained again in subsequent figures. The terms "comprises", "comprising", and any variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises it.
The calibration method provided by the embodiments of the present application is used to calibrate a camera device. A camera device is a device that contains a camera and has an image-acquisition function; calibrating the camera device specifically means calibrating the camera within it. Note that the camera itself may be the camera device, for example a digital camera or a digital video camera; alternatively, the camera may be only one component of the camera device, for example an unmanned aerial vehicle, unmanned ship, monitoring device, or robot in which a camera is installed. A terminal device can be used to calibrate the camera device with the method provided herein; the terminal device may be, but is not limited to, a desktop computer, notebook computer, tablet computer, smartphone, projector, dedicated calibration device, or any other device with processing and display capabilities.
Fig. 1 shows a schematic diagram of a calibration scenario provided in an embodiment of the present application. Referring to fig. 1, the terminal device 120 includes a display 130 on which a calibration interface is displayed. The camera device 100 to be calibrated is provided with a camera 110, and a user holds the camera device 100 to move to different positions in the calibration process to acquire images of a calibration interface.
The camera device 100 and the terminal device 120 are each provided with a communication interface so that they can establish a communication connection during calibration. The camera device 100 can send the acquired images to the terminal device 120, which performs the calibration calculation on them and finally obtains the calibration result; the terminal device 120 can also send the calibration result back to the camera device 100. Of course, in some embodiments the camera device 100 may instead perform the calibration calculation locally after capturing the images.

In one implementation, the camera device 100 need not be held by the user; for example, it may be mounted on a freely movable support and controlled through the support (manually by the user or by a program). This achieves an effect similar to hand-holding while reducing the burden on the user. For simplicity, however, the description below mainly takes the case in which the user holds the camera device 100 as an example.
Fig. 2 shows a flowchart of a calibration method provided in an embodiment of the present application. The method guides the user, through the dynamic effect displayed in the calibration interface, to control the camera device to acquire images in the way the designer of the calibration flow intended, and then completes the calibration calculation. The steps are described mainly in the context of the scenario of fig. 1, i.e., as executed by the terminal device 120; it should be understood that this scenario is only an example and the calibration method is not limited to it. Referring to fig. 2, the method includes:
Step S200: instruct the camera device, via the dynamic effect displayed in the region of the calibration interface that contains the guide feature, to acquire images.
The calibration interface may be a graphical interface shown on the display of the terminal device. For example, a calibration program for calibrating the camera device may be installed on the terminal device; after the program is started, the calibration interface is displayed and a communication connection is established between the terminal device and the camera device.
Fig. 3 is a schematic diagram of a calibration interface provided in an embodiment of the present application. Referring to FIG. 3, the calibration interface 300 includes a calibration board 310, a preview window 320, a progress bar 330, an information prompt box 350, and a start-calibration button 340; the calibration board 310 is an optional component, and in other implementations the calibration interface may include more components than those shown in FIG. 3.

The calibration board includes two types of geometric figures: one type is called feature points and the other is called the guide feature. The feature points may be one kind of figure or a combination of several, such as circles, ellipses, rings, or checkerboard squares, arranged regularly in the calibration board. For example, in the calibration board 310 of fig. 3, the feature points 312 are small solid circles arranged in an array. The positions of the feature points in the calibration board are fixed.

The geometric figures that constitute the guide feature are clearly distinguishable from the feature points; they may, for example, be larger than the feature points or differently shaped. The specific figures used for the guide feature, their number, and their relative positions are not limited. In one implementation, to satisfy the requirements of the calibration calculation, the guide feature may include at least three geometric figures. For example, in the calibration board 310 of FIG. 3, the guide feature 314 consists of four figures, namely one solid triangle, one large solid circle, and two large hollow circles, each clearly distinct from the small solid circles. Note that the position of the guide feature in the calibration board may change as the calibration process proceeds (see fig. 10 and its description below), and the relative positions of the geometric figures forming the guide feature may also change as the process proceeds, although such a change is not required.
Fig. 4 shows a calibration board different from that of fig. 3. Its feature points are the same as in fig. 3, but its guide feature takes the form of three large hollow circles.
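A regularly arranged grid of feature points like those in figs. 3 and 4 is fully determined by a row count, a column count, and a spacing. The sketch below generates such centre positions; the dimensions are illustrative, since the patent fixes only that the arrangement is regular.

```python
def feature_point_grid(rows, cols, spacing):
    """Centres of regularly arranged feature points on a calibration
    board (sketch; dimensions are illustrative)."""
    return [(c * spacing, r * spacing)
            for r in range(rows) for c in range(cols)]
```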
With continued reference to fig. 3, the user may trigger the calibration process (the flow from step S200 to step S240) by clicking the start-calibration button 340 in the calibration interface 300. Alternatively, the calibration process may start automatically when the calibration program is launched, in which case the start-calibration button 340 may be omitted from the calibration interface 300. The content of the calibration board may be displayed either before or after the calibration process is triggered. In some implementations, several calibration boards may be built into the calibration program, and the user may choose which one to use according to preference.

Over the whole calibration process, the guide feature is displayed at several positions in the calibration board; these positions are preset by the designer of the calibration flow, and when calibration starts the guide feature is displayed at one of them. A dynamic effect is displayed in the region of the calibration interface that contains the guide feature, and this effect instructs the camera device to acquire images of that region. For example, the region A shown by the dashed box in fig. 3 is a region containing the guide feature. Suppose, for illustration, that the calibration interface in fig. 1 is the one shown in fig. 3; then, in the scenario of fig. 1, the dynamic effect prompts the user to move the camera to a suitable position to capture an image of region A.
The dynamic effect includes, but is not limited to, one or more of the following five effects:

(1) a change in the display state of the guide feature;

(2) a change in the display state of the feature points in the region containing the guide feature;

(3) a change in the display state of the edge of the region containing the guide feature;

(4) a change in the display state of the background of the region containing the guide feature, where the background is the part of the region other than the feature points and the guide feature;

(5) a change in the display state of a geometric figure superimposed on the region containing the guide feature.

The above changes in display state include, but are not limited to, one or more of the following: blinking, a color change, and a brightness change.
Fig. 5 illustrates blinking of the guide feature, i.e., dynamic effect (1); the calibration board is the same as in fig. 3 and is not described again. At some moment the guide feature is displayed in the calibration board, as shown in fig. 5(A); after being displayed for a period (e.g., 0.5 s or 1 s) it is hidden, as shown in fig. 5(B); after being hidden for a period it is displayed again, as shown in fig. 5(C). Repeating the process from fig. 5(A) to (C) produces the blinking effect. Note that in fig. 5 all the geometric figures of the guide feature blink, but in some implementations only some of them may blink while the rest remain displayed.
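The show/hide cycle of fig. 5 amounts to a periodic display state, which could be computed as below. This is a sketch: the period and the equal show/hide split are illustrative assumptions, matching the "display for a period, hide for a period" cycle described above.

```python
def blink_state(t, period=1.0):
    """Blinking display state for the guide feature (sketch).

    Returns True (shown) during the first half of each period and
    False (hidden) during the second half, as in fig. 5's cycle.
    """
    return (t % period) < (period / 2)
```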
Fig. 6 illustrates blinking of the geometric figures in the region containing the guide feature. Since those figures are the feature points and the guide feature, fig. 6 shows a combination of effects (1) and (2); the calibration board is the same as in fig. 3 and is not described again. At some moment the geometric figures in region A are displayed in the calibration board, as shown in fig. 6(A) (region A is not marked in fig. 6(A); see fig. 3); after a period of display they are hidden, as shown in fig. 6(B); after a period of hiding they are displayed again, as shown in fig. 6(C). Repeating the process from fig. 6(A) to (C) produces the blinking of the geometric figures in region A. Note that in fig. 6 all the geometric figures in region A blink, but in some implementations only some of them may blink, for example only all or some of the feature points.
Fig. 7 shows a schematic diagram of the edge of the region containing the guide feature flashing in dynamic effect (3); the calibration board is the same as in Fig. 3 and is not described again. At a certain moment the edge of area A, shown as a dashed frame, is displayed in the calibration board, as shown in Fig. 7 (A) (area A is not marked in Fig. 7 (A); refer to Fig. 3); after being displayed for a period of time the edge is hidden, as shown in Fig. 7 (B); after being hidden for a period of time it is displayed again, as shown in Fig. 7 (C). Repeating the process from Fig. 7 (A) to (C) produces the flickering effect of the edge of area A.
Fig. 8 shows a schematic diagram of the background of the region containing the guide feature flickering in dynamic effect (4); the calibration board is the same as in Fig. 3 and is not described again. At a certain moment the background of area A is displayed in the calibration board, as shown in Fig. 8 (A) (area A is not marked in Fig. 8 (A); refer to Fig. 3); after a period of display the background is hidden, shown shaded, as shown in Fig. 8 (B); after a period of hiding the background is displayed again, as shown in Fig. 8 (C). Repeating the process from Fig. 8 (A) to (C) produces the flickering effect of the background of area A.
Fig. 9 shows a schematic diagram of a geometric figure superimposed on the region containing the guide feature flickering in dynamic effect (5); the calibration board is the same as in Fig. 3 and is not described again. At a certain moment a semi-transparent rectangle is superimposed on the original pattern of area A, as shown in Fig. 9 (A); after a period of time the rectangle is hidden, as shown in Fig. 9 (B); after a period of hiding it is displayed again, as shown in Fig. 9 (C). Repeating the process from Fig. 9 (A) to (C) produces the flickering effect of the geometric figure superimposed on area A. Note that in Fig. 9 the geometric figure is superimposed over the whole of area A, but in some implementations it may be superimposed over only a portion of area A.
Only flicker has been described above as the change of display state; the cases of color change and brightness change are similar to flicker and are not described in detail.
The calibration interface may implement only one dynamic effect, or it may implement several dynamic effects and give the user the right to choose among them, presenting the chosen dynamic effect in the calibration interface so that calibration better matches the user's personal habits.
Optionally, in addition to displaying the dynamic effect in the area containing the guide feature, the user may be prompted by other means as to what should currently be done. For example, in Fig. 3, besides displaying the dynamic effect at area A, corresponding text prompt information such as "please aim the camera device at the blinking area" may be displayed in the information prompt box 350. As an alternative, the text prompt may be replaced by a voice prompt.
The display of the dynamic effect indicates to the user which area of the calibration board currently requires image acquisition, so that the target to be captured by the camera device is clear and the user can purposefully move the camera device to aim at the area containing the guide feature. This improves image-acquisition efficiency and hence calibration efficiency, and reflects good interactivity between the calibration interface and the user. In particular, with the cooperation of the text prompt information, the user can more easily understand how to perform the calibration operation, further improving calibration efficiency. At the same time, because the user is fully prompted, the user experience during calibration is improved.
Step S210: determining whether the camera device has acquired a qualified image.
During calibration the camera device continuously collects images, but not every collected image is qualified; a qualified image is one that meets the requirements of the calibration calculation. In one implementation, whether an image is qualified depends on whether the guide feature in the image meets a preset condition: if it does, the image is qualified, otherwise it is not. The preset condition may include, but is not limited to, the following:
(1) The image contains the complete guide feature. This condition means that the geometric figures constituting the guide feature are all contained in the acquired image, not merely partially contained, and certainly not entirely absent, since the guide feature is used in the calibration calculation. Condition (1) is intended to restrict the camera device to capturing the region containing the guide feature rather than some other region. In practice, the geometric figures constituting the guide feature may be detected in the image and their number compared with a predetermined number.
It should be noted that for some dynamic effects in step S200, for example when the guide feature flickers, a captured image may not contain the guide feature; condition (1) excludes such images from the qualified images. In practice the flicker frequency of the guide feature can be set relatively high; since the camera device shoots continuously, if it is aimed at the guide feature from a given position, a frame containing the guide feature will necessarily be captured as long as the camera device pauses briefly.
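The timing argument above — that a briefly paused camera is guaranteed a frame with the feature shown — can be sketched as a small worst-case calculation. All concrete numbers and the function name are illustrative, not from the patent:

```python
def min_pause_ms(on_ms: float, off_ms: float, frame_interval_ms: float) -> float:
    """Worst case: the pause starts just as the feature hides, so the camera
    must wait out the whole hidden phase, then keep shooting long enough to
    land at least one frame inside the display phase."""
    if frame_interval_ms > on_ms:
        raise ValueError("frame interval longer than the display phase: "
                         "a frame showing the feature is not guaranteed")
    return off_ms + frame_interval_ms

# Example: feature shown 500 ms / hidden 500 ms, camera at 30 fps (~33.3 ms).
print(min_pause_ms(500, 500, 1000 / 30))
```

This also shows why a higher flicker frequency (shorter hidden phase) shortens the pause the user needs to hold.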
(2) The size of the guide feature in the image exceeds a preset size. This condition means that the guide feature in the image cannot be too small and must reach at least a certain standard, otherwise the calibration result will be affected. Condition (2) is intended to restrict the camera device to capturing images from a position close to the calibration interface (the zoom problem may be disregarded during calibration) rather than far from it. In practice, the geometric figures constituting the guide feature are detected in the image and the size of each figure is compared with a preset size threshold.
(3) The similarity between the shape of the guide feature as it appears in the image and its original shape exceeds a preset degree. This condition means that the deformation of the guide feature caused by the shooting angle cannot be too pronounced, otherwise the calibration result will be affected. For example, a guide feature that is originally circular will appear as an ellipse in the image if photographed from the side; this is deformation caused by the shooting angle. Condition (3) is intended to restrict the camera device to capturing the region containing the guide feature as head-on as possible rather than from the side. In practice, the geometric figures constituting the guide feature are detected in the image and the similarity between each figure and its original shape is calculated; for an originally circular guide feature, for example, its ellipticity may be calculated.
In some cases, besides condition (1), which must be satisfied, only one of conditions (2) and (3) needs to be checked; an image is regarded as qualified as long as that one is satisfied, and it is not necessary to check both. Of course, other determination conditions may be set in addition to the above; they may or may not relate to the guide feature, for example whether the image is clear or whether the feature points in the image meet the requirements.
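The qualification checks of conditions (1) to (3) can be sketched as follows, assuming shape detection has already produced a pixel size and a shape-similarity score for each detected figure (the function name, thresholds, and data layout are illustrative, not from the patent):

```python
def image_qualified(figures, expected_count,
                    min_size_px=20.0, min_similarity=0.8):
    """figures: detected geometric figures, each a dict with its pixel 'size'
    and a 'similarity' score to its original shape (1.0 = identical; for a
    circle this could be derived from ellipticity)."""
    # Condition (1): the complete guide feature, i.e. the expected number of
    # geometric figures, must appear in the image.
    if len(figures) != expected_count:
        return False
    # Condition (2): every figure must exceed the preset size.
    if any(f["size"] < min_size_px for f in figures):
        return False
    # Condition (3): every figure must be similar enough to its original shape.
    if any(f["similarity"] < min_similarity for f in figures):
        return False
    return True

figs = [{"size": 35.0, "similarity": 0.93},
        {"size": 28.5, "similarity": 0.88}]
print(image_qualified(figs, expected_count=2))  # all conditions met
print(image_qualified(figs, expected_count=3))  # condition (1) fails
```

In the variant where only one of conditions (2) and (3) is checked, the corresponding `any(...)` test would simply be dropped.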
Although the above determination conditions are set for an image, they in fact also impose corresponding requirements on how the user controls the camera device to capture images: the complete guide feature should be captured, the guide feature should be captured as large as possible, the calibration board should be shot as head-on as possible, and so on. Optionally, these requirements may be communicated to the user during calibration so that the user does not perform the calibration operation blindly; for example, corresponding text prompt information such as "please face the calibration board as squarely as possible" may be displayed in the information prompt box 350 in Fig. 3. As an alternative, the text prompt may be replaced by a voice prompt.
Although step S210 is executed by the terminal device, the determination of whether an image is qualified is not necessarily performed by the terminal device; at least the following schemes exist:
The first scheme: during calibration the camera device continuously sends the images it acquires in real time to the terminal device, and the terminal device determines whether each image is qualified using the method explained above. If an image is qualified, the terminal device determines that the camera device has acquired a qualified image and may store it for use in the subsequent calibration calculation. This scheme imposes a light computation load on the camera device.
In the first scheme, the terminal device executes step S210 on each received image to determine whether it is qualified; if so it executes step S220, otherwise it continues with the next image. In a specific implementation, the display of the calibration interface and the determination of whether an image is qualified may be executed by different, mutually independent threads of the calibration program; that is, the process of step S200 does not stop while step S210 is executed.
Optionally, if the terminal device determines that an image is not qualified, it may also inform the user of the reason, so that the user can adjust the calibration operation. For example, if the terminal device determines that the guide feature in the image is too small, corresponding text prompt information such as "the camera device is too far from the calibration board; please move closer to shoot" may be displayed in the information prompt box 350 in Fig. 3, so that the user, on seeing it, moves the camera device closer to the calibration board. As an alternative, the text prompt may be replaced by a voice prompt.
The second scheme: during calibration the camera device itself determines whether each acquired image is qualified and, if so, sends the terminal device an acquisition success signal indicating that a qualified image has been acquired; on receiving the signal the terminal device can determine that the camera device has acquired a qualified image. In the second scheme the camera device may still send the images acquired in real time to the terminal device (mainly for preview, described later) and inform the terminal device which one is qualified, or it may send only the qualified images, which the terminal device may store for the subsequent calibration calculation. Alternatively, the camera device may send no images at all, notifying the terminal device of the acquisition of a qualified image only through the acquisition success signal and storing the qualified image locally on the camera device. This scheme imposes a light computation load on the terminal device.
With regard to the second scheme, once the terminal device receives the acquisition success signal sent by the camera device, it is equivalent to having determined that the judgment result of step S210 is yes, and step S220 may be executed next.
The third scheme: during calibration the camera device itself determines whether each acquired image is qualified and sends only the qualified images to the terminal device, so any image the terminal device receives is necessarily qualified, and the terminal device can thereby determine that the camera device has acquired a qualified image. In this scheme the camera device does not need to send a separate acquisition success signal as in the second scheme; the sent image itself serves as the acquisition success signal, which reduces the amount of data transmitted during calibration. This scheme likewise imposes a light computation load on the terminal device.
With regard to the third scenario, the terminal device, upon receiving the image sent by the image capturing apparatus, may determine that the determination result of step S210 is yes, and may then execute step S220.
Optionally, in the second or third scheme, if the camera device determines that an image is not qualified, it may notify the terminal device of the reason, and the terminal device in turn informs the user so that the user can adjust the calibration operation. For example, the reasons for an image being unqualified may be divided into several categories, each corresponding to a code; after judging an image unqualified, the camera device sends the corresponding code to the terminal device, which thereby learns the reason. How the terminal device then informs the user follows the relevant content of the first scheme and is not repeated.
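The code-per-reason protocol described above might be sketched on the terminal-device side as a simple mapping; the codes and prompt texts below are invented for illustration:

```python
# Hypothetical rejection codes sent by the camera device, mapped to the
# user-facing prompts the terminal device would display (or speak).
REJECT_REASONS = {
    1: "The image does not contain the complete guide feature; "
       "please aim the camera device at the blinking area.",
    2: "The guide feature in the image is too small; "
       "please move closer to the calibration board.",
    3: "The guide feature is too distorted; "
       "please face the calibration board as squarely as possible.",
}

def prompt_for_code(code: int) -> str:
    # Fall back to a generic prompt for unknown codes.
    return REJECT_REASONS.get(code, "Image not qualified; please try again.")

print(prompt_for_code(2))
```

Sending a one-byte code instead of a full message keeps the camera-to-terminal traffic minimal, consistent with the light-transmission goal of these schemes.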
As explained above, in some cases of the first and second schemes the terminal device obtains the images acquired by the camera device in real time. In one implementation these images may be displayed in a preview window of the calibration interface, so that the user knows what the camera device is currently shooting and can move the camera device accordingly, obtaining a qualified image more quickly and improving calibration efficiency. For example, in Fig. 3 the user can see through the preview window 320 that the guide feature in the currently captured image is too small (a situation that can also be identified precisely with the help of the prompt in the information prompt box 350) and accordingly move the camera device closer to the calibration board.
Step S220: determining whether the camera device has acquired a preset number of qualified images.
According to camera calibration theory, if the camera device uses a monocular camera the preset number is at least three, and if it uses a binocular camera the preset number is likewise at least three per camera, i.e., each camera collects at least three images. Saying that the camera device uses a monocular camera covers two meanings: the camera device is itself a monocular camera, or a monocular camera is mounted on the camera device. A camera device using a binocular camera can be understood similarly.
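Under one reading of the rule above (at least three qualified images per camera, so a binocular device needs three per eye), the preset number can be sketched as:

```python
def preset_number(num_cameras: int, per_camera: int = 3) -> int:
    """Minimum number of qualified images, assuming 'at least three per
    camera'; this interpretation is illustrative, not fixed by the patent."""
    return num_cameras * per_camera

print(preset_number(1))  # monocular
print(preset_number(2))  # binocular
```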
If the determination result in step S220 is yes, it indicates that a sufficient number of qualified images have been acquired, calibration calculation may be performed, so step S240 is performed, otherwise, it indicates that a sufficient number of qualified images have not been acquired, and step S230 is performed to continue image acquisition.
When the terminal device is responsible for counting the qualified images, step S220 is executed each time the terminal device confirms that a new qualified image has been captured by the camera device. In other implementations, however, the camera device may be responsible for the counting; the terminal device then does not track how many qualified images have been collected and only needs to update the display position of the guide feature when it receives an acquisition success signal (the second scheme in step S210) or a qualified image (the third scheme in step S210). When the camera device determines that the number of collected qualified images has reached the preset number, it notifies the terminal device, which may then start the calibration calculation. That is, the terminal device only needs to confirm from the received notification that the preset number of qualified images has been acquired, which is equivalent to executing step S220 only once with a yes result; this differs from the flow in Fig. 2.
In one implementation, a progress bar, such as the progress bar 330 in Fig. 3, may be provided in the calibration interface to indicate the progress of image acquisition. After step S210 is executed, the progress bar may be refreshed to show the user the current acquisition progress; displaying progress is especially user-friendly when the calibration process lasts long (for example, when the preset number of qualified images is large). There are several ways to calculate progress; to mention one: calculate the ratio of the number of qualified images determined so far to the preset number, and refresh the progress bar according to that ratio. There is no required execution order between refreshing the progress bar and step S220.
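The progress computation mentioned above is a single ratio; a minimal sketch (function name is illustrative):

```python
def acquisition_progress(qualified_so_far: int, preset_total: int) -> float:
    """Fraction of the preset number of qualified images acquired so far,
    clamped to [0, 1] for display on the progress bar."""
    if preset_total <= 0:
        raise ValueError("preset_total must be positive")
    return min(qualified_so_far / preset_total, 1.0)

print(acquisition_progress(3, 12))  # e.g. refresh the bar to 25%
```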
Step S230: changing the position of the guide feature in the calibration interface.
As mentioned in the description of step S200, over the whole calibration process the guide feature is displayed in turn at a plurality of preset positions in the calibration board, and for each position the camera device must capture the area of the calibration interface containing the guide feature until a certain number of qualified images are acquired (corresponding to step S210). After qualified images have been acquired at the current position of the guide feature, if the total number of acquired qualified images has not reached the preset number (corresponding to step S220), the position of the guide feature in the calibration interface is changed (corresponding to step S230) and the acquisition process is repeated (returning to step S200 after step S230) until the preset number of qualified images is acquired.
In some implementations, the display position of the guide feature in the calibration interface is changed in a clockwise or counterclockwise rotation, which better suits the user's habits: for example, a user holding the camera device need only keep moving it along the direction of motion of the guide feature to complete the acquisition required for calibration, which is relatively easy. Of course, even if the position of the guide feature is not changed in clockwise order, as long as the guide feature traverses all the preset display positions and the camera device captures images at each of them, the calibration result shows no obvious difference compared with the clockwise manner; this too is therefore a feasible implementation.
Fig. 10 illustrates one manner of changing the position of the guide feature provided by an embodiment of the present application. The guide feature is to be displayed at three preset positions, corresponding to diagrams (A) to (C) of Fig. 10 and displayed in that order; that is, the guide feature rotates clockwise in the calibration board. As the rotation from Fig. 10 (A) to Fig. 10 (B) shows, the geometric figures constituting the guide feature need not all rotate; some may rotate while the rest remain unchanged (the triangle does not rotate, the three circles do). There is no limitation on which figures in the guide feature rotate and which do not. As a more general conclusion, the relative positional relationship between the geometric figures constituting the guide feature is also allowed to change when the guide feature changes display position.
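The clockwise traversal of preset positions can be sketched as a cyclic sequence; the coordinates below are illustrative anchors within the board, listed clockwise:

```python
from itertools import cycle

# Hypothetical normalized (x, y) anchors of the preset display positions,
# listed in clockwise order around the calibration board.
PRESET_POSITIONS = [(0.2, 0.2), (0.8, 0.2), (0.8, 0.8), (0.2, 0.8)]

def position_sequence():
    """Yield the display positions clockwise, wrapping around if more rounds
    of acquisition are needed than there are preset positions."""
    return cycle(PRESET_POSITIONS)

seq = position_sequence()
print([next(seq) for _ in range(5)])  # wraps back to the first position
```

A counterclockwise traversal, or any order that still visits every preset position, would work the same way per the paragraph above.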
After the display position of the guide feature is changed, the area of the calibration interface containing the guide feature changes accordingly, so the dynamic effect can be displayed in the new area to prompt the user to move the camera device to a new position for image acquisition. It should be understood that, since motion is relative, either the camera device or the calibration interface (or both) may be moved when acquiring images; moving the camera device is the common practice and is used as the example throughout this disclosure, but this does not limit the scope of the present application. It should further be noted that moving the calibration interface may mean moving the calibration interface on the display of the terminal device, or moving the display of the whole terminal device (the calibration interface on the display then naturally moves with it).
It should be understood that step S230 may be executed automatically after step S220, or only after being triggered by the user; for example, a button may be provided in the calibration interface, and the position of the guide feature in the calibration interface is changed after it is clicked.
In the above description the calibration interface is a graphical interface on a display, but in some implementations it may instead be a physical board fitted with indicator lights. For simplicity, suppose the calibration interface contains only the calibration board: indicator lights are installed at the positions of the feature points and of the guide feature (the latter comprising the plurality of preset positions to be displayed), while the rest of the board surface (referred to as the background) carries no indicator lights. When calibration is not in progress, all indicator lights on the board are off and the feature points, guide feature, and background are not obviously distinguishable; during calibration the indicator lights are turned on (the indicator lights corresponding to the guide feature are lit at one preset position at a time) and the feature points, guide feature, and background become clearly distinguishable. Moreover, the indicator lights can be made to flicker, change color, or change brightness, thereby realizing the dynamic effects of the method.
Step S240: performing calibration calculation according to the preset number of qualified images to obtain a calibration result of the camera device.
In some implementations (for example, the first and second schemes in step S210), the camera device has already sent the preset number of qualified images to the terminal device during acquisition; the terminal device stores them locally and reads them directly from local storage when performing the calibration calculation.
In other implementations (for example, part of the second scheme and the third scheme in step S210), the camera device does not send the qualified images to the terminal device during acquisition but stores them locally on the camera device, and the terminal device must obtain them from the camera device when performing the calibration calculation. The terminal device may actively request them, or the camera device may actively send the preset number of qualified images to the terminal device after confirming that they have been acquired.
It can be understood that if the camera device stores all the qualified images locally, the calibration calculation may also be performed locally on the camera device; of course, if the camera device's computing capability is weak, sending the images to the terminal device for calculation is the better approach.
Further, if the calibration calculation is performed on the camera device, the calibration result need not necessarily be sent to the terminal device (the result is quite likely used only by the camera device itself, as described later). In that case no communication between the camera device and the terminal device is needed at all: when the camera device confirms that a qualified image has been acquired, the user may be prompted by means such as an indicator light or a voice broadcast, and the user then manually operates the calibration interface to update the display position of the guide feature. As mentioned in the description of step S230, that step may also be executed after being triggered by the user.
The calibration calculation is intended to compute the parameters of the camera (although other quantities, such as the projection error mentioned later, may also be computed); its specific process may follow implementations in the prior art and is not detailed here. For example, if the camera device uses a monocular camera, the computed parameters may include the coordinates of the principal optical axis, distortion, focal length, and so on; if it uses a binocular camera, they may include the coordinates of the principal optical axis, distortion, focal length, rotation matrix, translation matrix, and so on. Both the feature points and the guide feature in the images are used when performing the calibration calculation.
To sum up, the calibration method performs multiple rounds of image acquisition, since the preset number of qualified images required for calibration cannot be obtained in a single round. In one round, the guide feature is displayed at a certain position in the calibration interface and the camera device captures the area of the calibration interface containing the guide feature until a certain number of qualified images are acquired; after it is confirmed that the camera device has successfully acquired the images, the display position of the guide feature in the calibration interface is changed and a new round of image acquisition begins, and these steps are repeated until the preset number of qualified images is obtained. In each round, a dynamic effect is displayed in the area containing the guide feature to prompt the user, so that the user moves the camera device and/or the calibration interface and the camera device can capture the area containing the guide feature from a suitable position and angle to obtain qualified images.
It can be seen from the above embodiments that, because the operations the user needs to perform are effectively prompted throughout calibration by various means such as dynamic effects, text, and voice, the user can quickly complete the calibration of the camera device, and the experience of the calibration process is also improved. Moreover, the flow of the calibration method is merely a loop of continuously acquiring qualified images, during which the user only needs to follow the guidance of the calibration interface; the steps are simple and easy for ordinary users to master. For example, where the camera device is an unmanned aerial vehicle, simplifying its calibration operation helps more ordinary users to use it.
Further, after step S240 the terminal device may determine whether the calibration result meets the calibration requirement and output the determination result. It should be noted that, since image capture by the camera device is controlled by the user, the captured images are highly random; even if an image is judged qualified according to a given condition, this does not guarantee that the calibration result obtained from it will meet the preset requirement. The user may therefore be prompted by outputting the determination result: if the output indicates that the calibration requirement is not met, the user can take corresponding measures, for example performing calibration again. If the output indicates that the requirement is met, in one implementation the calibration result may be sent to the camera device, which may then use it.
For example, if the camera device is an unmanned aerial vehicle and the unmanned aerial vehicle adopts a binocular camera, the unmanned aerial vehicle can calculate the scene depth by using the images acquired by the binocular camera and the calibration result, so that automatic obstacle avoidance in the flight process is realized. Of course, the use of the calibration result is not limited thereto, and for example, correction of picture distortion by the calibration result, and the like may also be achieved.
In one implementation, besides the parameters of the camera, a projection error may also be computed during the calibration calculation and compared with a preset threshold to determine whether the calibration result meets the calibration requirement (the requirement is met if the projection error is smaller than the threshold).
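The projection-error test can be illustrated with a self-contained pinhole-model sketch: project known board points with assumed intrinsics and pose, compare with the detected image points, and accept when the RMS reprojection error is below the threshold. All numbers here are synthetic, and distortion is omitted for brevity:

```python
import math

def project(K, R, t, pt3d):
    # X_cam = R @ X + t; pixel = perspective division of K @ X_cam.
    x = [sum(R[i][j] * pt3d[j] for j in range(3)) + t[i] for i in range(3)]
    u = (K[0][0] * x[0] + K[0][2] * x[2]) / x[2]
    v = (K[1][1] * x[1] + K[1][2] * x[2]) / x[2]
    return u, v

def rms_reprojection_error(K, R, t, pts3d, pts2d):
    errs = []
    for p3, p2 in zip(pts3d, pts2d):
        u, v = project(K, R, t, p3)
        errs.append((u - p2[0]) ** 2 + (v - p2[1]) ** 2)
    return math.sqrt(sum(errs) / len(errs))

K = [[800, 0, 320], [0, 800, 240], [0, 0, 1]]  # fx, fy, cx, cy (synthetic)
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]          # identity rotation
t = [0, 0, 2.0]                                # board 2 m in front of camera
pts3d = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0)]  # board points (meters)
pts2d = [project(K, R, t, p) for p in pts3d]   # perfect detections
err = rms_reprojection_error(K, R, t, pts3d, pts2d)
print(err < 0.5)  # meets the requirement when below the threshold
```

In a real calibration the error would be computed over all qualified images, and the 0.5-pixel threshold is only an illustrative choice.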
Fig. 11 shows a functional block diagram of a calibration apparatus provided in an embodiment of the present application. Referring to fig. 11, the calibration apparatus 400 includes: the dynamic prompting module 410 is used for indicating the camera device to acquire images through a dynamic effect displayed in an area containing the guide characteristics in the calibration interface; the image confirmation module 420 is used for determining that the camera device acquires a qualified image; the feature display module 430 is configured to change a position of the guide feature in the calibration interface, and instruct the camera to continuously perform image acquisition until the image confirmation module 420 determines that a preset number of qualified images are acquired by the camera; the calculating module 440 is configured to perform calibration calculation according to a preset number of qualified images to obtain a calibration result of the image capturing apparatus.
In one implementation of the calibration apparatus 400, the determining, by the image confirmation module 420, that the qualified image is acquired by the camera device includes: receiving an image acquired by the camera device; and detecting the guide feature in the received image, and determining that the camera device has acquired a qualified image when the detected guide feature meets preset conditions.
In one implementation manner of the calibration apparatus 400, the preset conditions include: the image contains the guide feature in its entirety; and the size of the guide feature in the image exceeds a preset size, and/or the degree of similarity between the shape of the guide feature as presented in the image and the original shape of the guide feature exceeds a preset degree.
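The three preset conditions of this implementation — full containment, minimum size, and shape similarity — can be sketched as a check on the detected corner points of the guide feature. This is illustrative only: the thresholds, and the use of bounding-box aspect ratio as a stand-in for shape similarity against a square reference pattern, are assumptions not stated in the patent:

```python
def is_qualified(corners, img_w, img_h, min_area_ratio=0.05, min_similarity=0.8):
    """Check the three preset conditions on detected guide-feature corners.

    corners : list of (x, y) pixel coordinates of the detected guide feature
    """
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    # 1) the guide feature must lie entirely inside the frame
    fully_contained = (min(xs) >= 0 and min(ys) >= 0
                       and max(xs) < img_w and max(ys) < img_h)
    # 2) its bounding box must exceed a preset fraction of the frame area
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    area_ratio = (w * h) / (img_w * img_h)
    # 3) its shape must stay close to the original shape (here: bounding-box
    #    aspect ratio, assuming a square reference pattern)
    similarity = min(w, h) / max(w, h) if max(w, h) else 0.0
    return (fully_contained
            and area_ratio >= min_area_ratio
            and similarity >= min_similarity)
```

For example, a 300×300-pixel square detected well inside a 640×480 frame would qualify, while a feature clipped by the image border would not.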
In one implementation of the calibration apparatus 400, the determining, by the image confirmation module 420, that the qualified image is acquired by the camera device includes: receiving a successful acquisition signal sent by the camera device, wherein the successful acquisition signal indicates that the camera device has acquired a qualified image.
In one implementation of the calibration apparatus 400, the determining, by the image confirmation module 420, that the qualified image is acquired by the camera device includes: receiving an image sent by the camera device, and determining that the camera device has acquired a qualified image; the camera device sends the image after determining that the acquired image is a qualified image.
In one implementation manner of the calibration apparatus 400, the performing, by the calculation module 440, the calibration calculation according to a preset number of qualified images includes: performing calibration calculation according to a preset number of locally stored qualified images.
In one implementation manner of the calibration apparatus 400, the performing, by the calculation module 440, the calibration calculation according to a preset number of qualified images includes: acquiring a preset number of qualified images from the camera device, and performing calibration calculation according to the preset number of qualified images.
In one implementation of the calibration apparatus 400, the dynamic effect includes a combination of one or more of the following effects: a display state change of the guide feature; a display state change of a feature point in the region containing the guide feature; a display state change of the background within the region containing the guide feature; a display state change of the edge of the region containing the guide feature; and a display state change of a geometric figure overlaid on the region containing the guide feature.
In one implementation of the calibration apparatus 400, the display state change includes one or more of the following changes in combination: flicker, color variation, and brightness variation.
In one implementation of calibration apparatus 400, the feature display module 430 changes the position of the guide feature in the calibration interface, including: changing the position of the guide feature in the calibration interface in a clockwise direction or a counter-clockwise direction.
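The clockwise repositioning performed by the feature display module 430 can be sketched as cycling through a fixed tour of anchor positions in the calibration interface. This is illustrative only; the position names and granularity are assumptions:

```python
# Clockwise tour of guide-feature anchor positions in the calibration
# interface (granularity chosen for illustration).
POSITIONS = ["top-left", "top-center", "top-right", "middle-right",
             "bottom-right", "bottom-center", "bottom-left", "middle-left"]

def next_position(current):
    """Advance the guide feature one step clockwise, wrapping around."""
    return POSITIONS[(POSITIONS.index(current) + 1) % len(POSITIONS)]
```

A counter-clockwise variant would simply step the index by -1; either way, visiting all positions spreads the qualified images across the camera's field of view, which benefits the calibration calculation.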
In one implementation manner of the calibration apparatus 400, the calibration apparatus 400 further includes: a progress bar display module, configured to refresh a progress bar representing the image acquisition progress in the calibration interface each time it is determined that the camera device has acquired a qualified image.
In one implementation of the calibration apparatus 400, the calibration apparatus 400 further includes: an image preview module, configured to display, in real time, the image acquired by the camera device in a preview window of the calibration interface.
In an implementation manner of the calibration apparatus 400, when the camera device is a monocular camera, the preset number is at least three.
In an implementation manner of the calibration apparatus 400, when the camera device employs binocular cameras, the preset number is at least three pairs.
In one implementation of the calibration apparatus 400, the guide feature includes at least three geometric figures.
In one implementation of the calibration apparatus 400, the calibration apparatus 400 further includes: a result judgment module, configured to judge whether the calibration result meets the calibration requirement after the calculation module 440 obtains the calibration result of the camera device, and to output a judgment result.
In this implementation manner, the user is prompted by outputting the judgment result; if the output indicates that the calibration requirement is not met, the user may take a corresponding measure, for example, performing the calibration again.
In one implementation manner of the calibration apparatus 400, the calibration apparatus 400 further includes: a result sending module, configured to send the calibration result to the camera device after the result judgment module judges that the calibration result meets the calibration requirement.
The implementation principle and resulting technical effects of the calibration apparatus 400 provided in this embodiment of the present application have been introduced in the foregoing method embodiments; for brevity, where the apparatus embodiment does not mention a point, reference may be made to the corresponding content in the method embodiments.
Fig. 12 shows a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 12, the electronic device 500 includes: a processor 510, a memory 520, a communication interface 530, and an input-output device 540, which are interconnected and in communication with each other via a communication bus 550 and/or other form of connection mechanism (not shown).
The memory 520 includes one or more memories (only one is shown in fig. 12), which may be, but are not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The processor 510, and possibly other components, may access, read from, and/or write to the memory 520.
The processor 510 includes one or more processors (only one is shown in fig. 12), each of which may be an integrated circuit chip having signal processing capability. The processor 510 may be a general-purpose processor, including a Central Processing Unit (CPU), a Micro Control Unit (MCU), a Network Processor (NP), or another conventional processor; or a special-purpose processor, including a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The communication interface 530 includes one or more interfaces (only one is shown in fig. 12) that may be used to communicate directly or indirectly with other devices for data interaction, the other devices including but not limited to the camera device to be calibrated. The communication interface 530 may be an Ethernet interface; a mobile communication network interface, such as an interface for a 3G, 4G, or 5G network; a wireless communication interface other than mobile communication, such as Bluetooth, Wi-Fi, or infrared; or another type of interface having data transceiving functions.
The input-output device 540 generally refers to input devices, such as a keyboard, a mouse, buttons, or a touch screen, for inputting information to the electronic device 500, and/or output devices, such as a display or speakers, for outputting information generated by the electronic device 500. For example, in the present application, the display may be used to display the calibration interface, and a user may operate the calibration interface through the mouse to start the calibration process.
One or more computer program instructions may be stored in memory 520 and read and executed by processor 510 to implement the steps of the calibration method provided by the embodiments of the present application, as well as other desired functions.
It will be appreciated that the configuration shown in fig. 12 is merely illustrative, and the electronic device 500 may include more or fewer components than shown in fig. 12, or have a configuration different from that shown in fig. 12. The components shown in fig. 12 may be implemented in hardware, software, or a combination thereof. For example, the terminal device 120 in fig. 1 may be implemented using the structure of the electronic device 500.
The embodiment of the present application further provides a computer-readable storage medium, where computer program instructions are stored on the computer-readable storage medium, and when the computer program instructions are read and executed by a processor of a computer, the steps of the calibration method provided in the embodiment of the present application are executed. For example, the computer readable storage medium may be embodied as the memory 520 in the electronic device 500 in fig. 12.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may also be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division of the units is only a logical functional division, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some communication interfaces as an indirect coupling or communication connection between devices or units, and may be in electrical, mechanical, or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (18)

1. A calibration method, comprising:
instructing a camera device to acquire images through a dynamic effect displayed in an area containing a guide feature in a calibration interface;
changing the position of the guide feature in the calibration interface, and indicating the camera device to continuously acquire images until the camera device is determined to acquire a preset number of qualified images;
carrying out calibration calculation according to a preset number of qualified images to obtain a calibration result of the camera device;
the determining that the qualified image is acquired by the camera device comprises:
receiving images acquired by the camera device in real time;
detecting the guide features in the received images, and determining that the camera device acquires qualified images when the detected guide features meet preset conditions;
the preset conditions include: the image contains the guide feature in its entirety;
and the size of the guide feature in the image exceeds a preset size, and/or the degree of similarity between the shape of the guide feature as presented in the image and the original shape of the guide feature exceeds a preset degree.
2. The calibration method according to claim 1, wherein determining that the camera device has acquired a qualified image comprises:
and receiving a successful acquisition signal sent by the camera device, wherein the successful acquisition signal indicates that the camera device acquires a qualified image.
3. The calibration method according to claim 1, wherein determining that the camera device has acquired a qualified image comprises:
receiving an image sent by the camera device, and determining that the camera device acquires a qualified image; and the camera device sends the image after determining that the acquired image is a qualified image.
4. The calibration method according to any one of claims 1 to 3, wherein said performing calibration calculation based on a preset number of qualified images comprises:
and carrying out calibration calculation according to the locally stored qualified images with the preset quantity.
5. The calibration method according to claim 2, wherein the performing the calibration calculation according to the predetermined number of qualified images comprises:
and acquiring a preset number of qualified images from the camera device, and performing calibration calculation according to the preset number of qualified images.
6. Calibration method according to claim 1, wherein the dynamic effect comprises one or more of the following effects in combination:
a display state change of the guide feature;
a display state change of a feature point in the region containing the guide feature;
a display state change of the background within the region containing the guide feature;
a display state change of the edge of the region containing the guide feature;
a display state change of a geometric figure overlaid on the region containing the guide feature.
7. The calibration method according to claim 6, wherein the display state change comprises a combination of one or more of the following changes: flicker, color variation, and brightness variation.
8. The calibration method according to claim 1, wherein the changing the position of the guide feature in the calibration interface comprises:
changing the position of the guide feature in the calibration interface in a clockwise direction or a counter-clockwise direction.
9. The calibration method according to claim 1, further comprising:
and refreshing and displaying a progress bar representing the image acquisition progress in the calibration interface after the camera device is determined to acquire qualified images each time.
10. The calibration method according to claim 1, further comprising:
and displaying the image acquired by the camera device in real time in a preview window of the calibration interface.
11. The calibration method according to claim 1, wherein when the image capturing device is a monocular camera, the preset number is at least three.
12. The calibration method according to claim 1, wherein when the image capturing device employs binocular cameras, the preset number is at least three pairs.
13. A calibration method according to claim 1, wherein the guide feature comprises at least three geometric figures.
14. The calibration method according to claim 1, wherein after said obtaining the calibration result of the image pickup apparatus, the method further comprises:
and judging whether the calibration result meets the calibration requirement or not, and outputting a judgment result.
15. The calibration method according to claim 14, wherein after said determining whether the calibration result satisfies the calibration requirement, the method further comprises:
and if the calibration result meets the calibration requirement, sending the calibration result to the camera device.
16. A calibration device, comprising:
the dynamic prompting module is used for instructing the camera device to acquire images through a dynamic effect displayed in an area containing a guide feature in the calibration interface;
the image confirmation module is used for determining that the qualified image is acquired by the camera device;
the image confirmation module is also used for receiving the image acquired by the camera device; detecting the guide feature in the received image, and determining that the camera device has acquired a qualified image when the detected guide feature meets preset conditions, wherein the preset conditions comprise: the image contains the guide feature in its entirety; and the size of the guide feature in the image exceeds a preset size, and/or the degree of similarity between the shape presented by the guide feature in the image and the original shape of the guide feature exceeds a preset degree;
the characteristic display module is used for changing the position of the guide characteristic in the calibration interface and indicating the camera device to continuously acquire images until the image confirmation module determines that the camera device acquires a preset number of qualified images;
And the calculating module is used for carrying out calibration calculation according to a preset number of qualified images to obtain a calibration result of the camera device.
17. A computer-readable storage medium, having stored thereon computer program instructions, which, when read and executed by a processor, perform the steps of the method according to any one of claims 1-15.
18. An electronic device, comprising: a memory having stored therein computer program instructions which, when read and executed by the processor, perform the steps of the method of any of claims 1-15.
CN201910850140.0A 2019-09-09 2019-09-09 Calibration method, calibration device, storage medium and electronic equipment Active CN110689583B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910850140.0A CN110689583B (en) 2019-09-09 2019-09-09 Calibration method, calibration device, storage medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN110689583A CN110689583A (en) 2020-01-14
CN110689583B true CN110689583B (en) 2022-06-28

Family

ID=69108891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910850140.0A Active CN110689583B (en) 2019-09-09 2019-09-09 Calibration method, calibration device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110689583B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111652942B (en) * 2020-05-29 2024-03-22 维沃移动通信有限公司 Calibration method of camera module, first electronic equipment and second electronic equipment
CN113538590A (en) * 2021-06-15 2021-10-22 深圳云天励飞技术股份有限公司 Zoom camera calibration method and device, terminal equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105096324A (en) * 2015-07-31 2015-11-25 深圳市大疆创新科技有限公司 Camera device and calibration method thereof
CN107081755A (en) * 2017-01-25 2017-08-22 上海电气集团股份有限公司 A kind of robot monocular vision guides the automatic calibration device of system
CN108257186A (en) * 2018-01-18 2018-07-06 广州视源电子科技股份有限公司 Determining method and device, video camera and the storage medium of uncalibrated image
CN109166156A (en) * 2018-10-15 2019-01-08 Oppo广东移动通信有限公司 A kind of generation method, mobile terminal and the storage medium of camera calibration image
CN109345597A (en) * 2018-09-27 2019-02-15 四川大学 A kind of camera calibration image-pickup method and device based on augmented reality
CN109741404A (en) * 2019-01-10 2019-05-10 奥本未来(北京)科技有限责任公司 A kind of mobile device-based optical field acquisition method



Similar Documents

Publication Publication Date Title
CN106254682B (en) A kind of photographic method and mobile terminal
US10491806B2 (en) Camera device control related methods and apparatus
CN113395419A (en) Electronic device, control method, and computer-readable medium
CN106713769B (en) Image shooting control method and device and electronic equipment
CN110689583B (en) Calibration method, calibration device, storage medium and electronic equipment
JP6587081B2 (en) Method and system for assisting a user in capturing an image or video
US20220182551A1 (en) Display method, imaging method and related devices
CN113329172B (en) Shooting method and device and electronic equipment
US20230164427A1 (en) Focusing method, device, and computer-readable storage medium
CN111784844A (en) Method and device for observing virtual object, storage medium and electronic equipment
US11252387B2 (en) Projection apparatus, projection method and storage medium
EP2981065B1 (en) Light metering method and device
CN112672051B (en) Shooting method and device and electronic equipment
CN108279774B (en) Method, device, intelligent equipment, system and storage medium for region calibration
JP2010074264A (en) Photographing apparatus and photographing system
US9699385B2 (en) Imaging apparatus and storage medium, and exposure amount control method
JP2014039166A (en) Controller of automatic tracking camera and automatic tracking camera equipped with the same
CN110213407A (en) A kind of operating method of electronic device, electronic device and computer storage medium
CN112653841B (en) Shooting method and device and electronic equipment
JP6129136B2 (en) Apparatus operating device and apparatus operating method
US20180367741A1 (en) Control device, control method, and program
US11516404B2 (en) Control apparatus and control method
CN106559627B (en) Projection method, device and equipment
US9467602B2 (en) Control terminal, imaging system, control method, and non-transitory medium saving program for controlling a target terminal responsive to detection of a predetermined state of an object or a predetermined motion of the object
JP2014211794A (en) Visual line detection device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant