CN108279809B - Calibration method and device - Google Patents


Info

Publication number
CN108279809B
CN108279809B (application CN201810034980.5A)
Authority
CN
China
Prior art keywords
depth
mapping
image
parameter
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810034980.5A
Other languages
Chinese (zh)
Other versions
CN108279809A (en)
Inventor
陈维亮
董碧峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN201810034980.5A priority Critical patent/CN108279809B/en
Publication of CN108279809A publication Critical patent/CN108279809A/en
Application granted granted Critical
Publication of CN108279809B publication Critical patent/CN108279809B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a calibration method and a calibration device. The method is applied to coordinate mapping between a depth-of-field image and a projection image in an interactive projection device, and comprises the following steps: acquiring depth information of a projection plane according to the depth-of-field image; acquiring a mapping parameter value corresponding to the depth information according to a pre-established linear relationship between the depth parameter and the mapping parameter; and obtaining a calibrated mapping equation according to a pre-established mapping model and the mapping parameter value. The invention realizes automatic coordinate mapping without requiring the user to participate in calibration; the calibration is completed without the user's awareness, which greatly enhances the user experience.

Description

Calibration method and device
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a calibration method and apparatus.
Background
With the development of multi-function electronic devices, the conventional key-based operation interface is gradually unable to meet users' requirements. Against this background, interactive technologies are developing rapidly, and interactive projection devices are receiving increasing attention. An interactive projection audio device can use the depth-of-field image to perform gesture recognition on a projected image, so that a user can, for example, play games, order dishes, or shop online while listening to music; the interactive projection device thus provides a novel mode of interaction.
Fig. 1 is a schematic diagram of the operation of a projection audio apparatus. As shown in fig. 1, the projection audio apparatus mainly includes two parts: a loudspeaker main body and a projection module for interaction, where the projection module provides an inclined projection as the user's interaction area, also called the projection area. The structure of the projection module is shown in fig. 2; the module likewise has two parts. The first is the projection light machine, which forms the region visible to the user: the user can connect an Android mobile phone via Bluetooth and project the phone's interface. The second is the depth-of-field projection: a sensor module measures depth, obtaining the distance from each point in the projection area to the projection sensor. This yields a matrix whose length and width correspond to the length and width of the projection area, each element being the distance from the corresponding point in the projection area to the projection sensor. Therefore, when a finger enters the area, the depth distance of the point occluded by the finger differs from its unoccluded value, and gesture detection is performed based on this change.
As shown in fig. 3, the projection surface of the depth-of-field projection in the depth-of-field module needs to cover the projection surface of the projection light machine so as to effectively capture the interaction information. Fig. 4 is a schematic diagram of the projection image and the depth-of-field image: the darker rectangular region in fig. 4 is the projection image projected by the projection light machine, and the lighter rectangular region is the depth-of-field image. During gesture interaction, the coordinates of the gesture in the depth-of-field image are obtained through gesture recognition, but the interaction processing needs the coordinates in the projection image, so coordinate mapping between the two images is required.
In the prior art, when mapping from the depth-of-field image to the projection image, the coordinates of predetermined points in the two images must be obtained manually; as shown in fig. 3, the points at the four corner positions of the projection image are generally selected in both images for coordinate mapping. Since the parameters involved in the mapping change with the projection depth, the prior art requires a manual calibration process whenever the depth of the projection surface projected by the projection module changes, which greatly affects the user experience.
Disclosure of Invention
The invention provides a calibration method and a calibration device, and aims to solve the prior-art problem that when the projection depth of a projection device changes, the mapping equation must be calibrated manually, which degrades the user experience.
One aspect of the present invention provides a calibration method applied to coordinate mapping between a depth of field image and a projection image in an interactive projection device, the method comprising:
acquiring depth information of a projection plane according to the depth-of-field image;
acquiring a mapping parameter value corresponding to the depth information according to a pre-established linear relationship between the depth parameter and the mapping parameter;
and obtaining a calibrated mapping equation according to a pre-established mapping model and the mapping parameter value.
Another aspect of the present invention provides a calibration apparatus for coordinate mapping between a depth of field image and a projection image in an interactive projection device, the apparatus comprising:
the depth acquisition unit is used for acquiring depth information of the projection plane according to the depth image;
a mapping parameter value obtaining unit, configured to obtain a mapping parameter value corresponding to the depth information according to a pre-established linear relationship between the depth parameter and the mapping parameter;
and the calibration unit is used for obtaining a calibrated mapping equation according to a pre-established mapping model and the mapping parameter value.
Another aspect of the present invention provides a computer-readable storage medium storing executable instructions for performing the above-described method.
A mapping model is established in advance that contains mapping parameters in a linear relationship with the depth parameter of the projection surface. Based on this linear relationship, the mapping parameter values corresponding to the depth information of the projection surface are obtained, and correction processing is performed with these values to obtain a calibrated mapping equation. The invention realizes automatic coordinate mapping without requiring the user to participate in calibration; calibration is completed without the user's awareness, which greatly enhances the user experience.
Drawings
FIG. 1 is a schematic diagram of the operation of a projection audio device;
FIG. 2 is a schematic structural diagram of a projection module of a projection audio device;
FIG. 3 is a schematic diagram illustrating a relationship between a projection area and a depth-of-field area of the projection module;
FIG. 4 is a schematic view of a projection image and a depth image of the projection module;
FIG. 5 is a flowchart of a calibration method according to an embodiment of the present invention;
fig. 6 is a flowchart of acquiring depth information of a projection plane according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of the coordinates of a depth parameter and a transverse stretch ratio parameter A1 according to an embodiment of the present invention;
FIG. 8 is a schematic coordinate diagram of a depth parameter and an interference parameter A2 according to an embodiment of the present invention;
FIG. 9 is a schematic coordinate diagram of a depth parameter and an offset parameter A3 according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of the coordinates of a depth parameter and a longitudinal stretch ratio parameter B1 according to an embodiment of the present invention;
FIG. 11 is a schematic coordinate diagram of a depth parameter and an interference parameter B2 according to an embodiment of the present invention;
FIG. 12 is a schematic coordinate diagram of a depth parameter and an offset parameter B3 according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of deviation of boundary points in a projected image according to an embodiment of the present invention;
fig. 14 is a block diagram of a calibration apparatus according to an embodiment of the present invention.
Detailed Description
Factors influencing the coordinate mapping from the depth-of-field image to the projection image are determined according to the relationship between the two images, and a mapping model is established according to the determined influencing factors (the mapping parameters in the embodiments). The relationship between the mapping parameters and the projection depth is then mined to obtain the mapping parameter values corresponding to the real-time projection depth of the projection image, and these values are substituted into the mapping model to obtain a calibrated mapping equation, so that calibration is completed automatically without manual operation by the user.
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 5 is a flowchart of a calibration method according to an embodiment of the present invention. The method of this embodiment is applied to coordinate mapping between a depth-of-field image and a projection image in an interactive projection device. As shown in fig. 5, the method of this embodiment includes:
and S510, acquiring depth information of the projection plane according to the depth image.
The depth-of-field image here is the image of the projection area that the depth-of-field projection covers on the projection plane, acquired by the projection sensor when no obstruction such as a finger is present between the projection module and the projection plane.
The interactive projection device of this embodiment may acquire the depth information of the projection surface during initialization at each power-on. It may also detect the projection depth of the projection module, i.e., the depth information of the projection surface, at a set detection frequency, so that if the depth information changes while the device is in use, the changed depth information is acquired and the mapping equation is calibrated in time to accurately recognize interaction gestures.
S520, acquiring the mapping parameter value corresponding to the depth information according to the pre-established linear relationship between the depth parameter and the mapping parameter.
The present embodiment determines the mapping parameters according to the relationship between the depth image and the projection image. As shown in fig. 4, when the depth information of the projection surface changes, the stretching ratio of the projection image within the depth image changes, and the offset between the projection image and the depth image also changes; moreover, when the projection image is not a regular pattern, the depth image interferes with the projection image. The mapping parameters of this embodiment therefore include: a transverse stretching ratio parameter; a longitudinal stretching ratio parameter; an interference parameter of the depth image's longitudinal direction on the projection image's transverse direction; an interference parameter of the depth image's transverse direction on the projection image's longitudinal direction; an offset parameter between the depth image and the projection image in the transverse direction; and an offset parameter in the longitudinal direction.
In this embodiment, a large number of depth parameter values and mapping parameter values are used as observations, and statistical analysis shows that a linear relationship exists between the depth parameter and each mapping parameter. Therefore, when the depth information changes, the corresponding mapping parameter values can be obtained from this linear relationship, and the mapping model can be calibrated based on the calculated mapping parameter values.
S530, obtaining a calibrated mapping equation according to a pre-established mapping model and the mapping parameter value.
The mapping model pre-established in this embodiment contains the mapping parameters. After the mapping parameter values corresponding to the depth information are obtained, they can be substituted into the mapping model to obtain a calibrated mapping equation. The coordinates of an interaction gesture in the depth-of-field image are then mapped to the projection image using the calibrated mapping equation, so that the interactive projection device performs gesture recognition according to the gesture's coordinates in the projection image and carries out the interactive operation.
In this embodiment, a mapping model is pre-established that contains mapping parameters in a linear relationship with the depth parameter of the projection surface. Based on this linear relationship, the mapping parameter values corresponding to the depth information of the projection surface are obtained, and correction processing is performed using these values to obtain a calibrated mapping equation. The embodiment thus realizes automatic coordinate mapping without requiring the user to participate in calibration; calibration is completed without the user noticing, which greatly enhances the user experience.
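As an illustration of this flow, the following minimal Python sketch assumes the affine form of the mapping model given later in this description, with illustrative names (calibrate, coeffs) that are not from the patent; it shows how S510–S530 fit together once the depth information and the per-parameter fitting coefficients (k, b) of Table 1 are available. It is a sketch under these assumptions, not the patented implementation.

```python
# Minimal sketch of S510-S530, assuming an affine mapping model and that
# `coeffs` holds a fitted (k, b) pair per mapping parameter plus the error
# constants C1, C2. All names are illustrative.
def calibrate(depth_mm, coeffs):
    """Return the calibrated depth-image -> projection-image mapping."""
    # S520: each mapping parameter is linear in the depth parameter.
    p = {name: kb[0] * depth_mm + kb[1]
         for name, kb in coeffs.items() if name not in ('C1', 'C2')}
    c1, c2 = coeffs.get('C1', 0.0), coeffs.get('C2', 0.0)

    # S530: substitute the parameter values into the mapping model to obtain
    # the calibrated mapping equation.
    def mapping(i, j):
        x = p['A1'] * i + p['A2'] * j + p['A3'] + c1
        y = p['B1'] * j + p['B2'] * i + p['B3'] + c2
        return x, y
    return mapping
```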
In one implementation of this embodiment, the depth information of the projection plane may be obtained by:
calculating the gradient of each data point of the depth image, and calculating the gradient of each longitudinal direction of the depth image according to the gradients of the data points. Specifically, for each data point in the depth image, the absolute value of the difference between the data point and its transversely adjacent data point and the absolute value of the difference between the data point and its longitudinally adjacent data point are calculated, and the sum of the two absolute values is taken as the gradient of that data point; the gradients of all data points in each longitudinal direction of the depth image are then summed to obtain the gradient of that longitudinal direction.
When the gradient of each data point of the depth image is smaller than a concave-convex threshold and the gradient of each longitudinal direction is smaller than an inclined threshold, a central area of the depth image is acquired, where the central area lies within the range covered by the projection image. The data points in the central area are arranged in descending order of value, the average of a preset number of data points at the head of the sequence is calculated, and the obtained average is taken as the depth information.
Fig. 6 is a flowchart of acquiring depth information of a projection plane according to an embodiment of the present invention, and as shown in fig. 6, a process of acquiring depth information of a projection plane includes:
s610, collecting a depth image.
Referring to fig. 2, a depth image corresponding to the projection area on the projection plane may be acquired using the projection sensor. For example, when the projection sensor is an infrared sensor, the infrared sensor may measure the distance from each point in the projection area to the sensor to obtain a depth image, where each data point in the depth image is the distance value between the infrared sensor and the corresponding point on the projection plane.
S620, calculating the gradient of each data point of the depth image.

The present embodiment may calculate the gradient of each data point as

gradient(i, j) = |data_{i+1,j} − data_{i,j}| + |data_{i,j+1} − data_{i,j}|

where gradient(i, j) is the gradient of the data point (i, j) in the depth image, (i, j) is the coordinate of the data point in the depth image, and data_{i,j} is the distance value of the data point (i, j). Assuming the depth image is an M × N matrix, i ∈ [1, N], j ∈ [1, M], where M and N are natural numbers greater than 1. Calculating the gradient of each data point of the depth image with this formula yields a gradient matrix.
S630, calculating the gradient of each longitudinal direction of the depth image.

The present embodiment may calculate the gradient of each longitudinal direction as

gradient_sum(j) = Σ_{i=1}^{imax} gradient(i, j)

where gradient_sum(j) is the gradient of the j-th longitudinal direction in the depth image; if the depth image is an M × N matrix, imax = N.
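The two formulas above translate directly into array operations. The following NumPy sketch is illustrative (it assumes the depth image is indexed as depth[i, j]; the function names are chosen here, not taken from the patent), and boundary points keep only the difference terms that exist:

```python
import numpy as np

def point_gradients(depth):
    """gradient(i, j) = |data[i+1, j] - data[i, j]| + |data[i, j+1] - data[i, j]|."""
    d = depth.astype(float)
    g = np.zeros(d.shape)
    g[:-1, :] += np.abs(np.diff(d, axis=0))  # transverse-neighbour term
    g[:, :-1] += np.abs(np.diff(d, axis=1))  # longitudinal-neighbour term
    return g

def longitudinal_gradients(g):
    """gradient_sum(j): the point gradients summed over i for each j."""
    return g.sum(axis=0)
```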
S640, judging whether the projection plane is inclined; if not, executing S650; if the projection plane is inclined, prompting the user that the projection plane is inclined, and returning to S610 after the projection plane is adjusted.
In this embodiment, the gradient of each longitudinal direction is compared with the inclined threshold; when the gradient of every longitudinal direction is smaller than the inclined threshold, it is judged that the projection plane is not inclined; otherwise, it is judged that the projection plane is inclined.
S650, judging the unevenness of the projection surface; if the projection surface is relatively flat, executing S660; if the projection surface is noticeably uneven, prompting the user that the projection surface is seriously uneven, and returning to S610 after the projection surface is adjusted.
When the projection surface has large irregularities, the gradient exhibits large abrupt changes. This embodiment therefore compares the gradient of each data point with the concave-convex threshold: when the gradient of every data point is smaller than the threshold, the projection surface is judged to be relatively flat; otherwise, it is judged to be uneven.
S660, taking the central area of the depth image as the target area.
The central area of this implementation is located within the area covered by the projected image.
S670, calculating the depth value of the projection plane.

In practical applications, floating dust particles in the air may introduce noise into the data points of the depth image. In this embodiment, the data points in the target area are therefore arranged in descending order of value, the average of the data points in the first 10%–20% of the sequence is calculated, and the calculated average is taken as the depth value.
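Putting S620–S670 together gives the following minimal sketch, which reuses point_gradients and longitudinal_gradients from the previous sketch; the threshold values, the central-window size, and the 15% averaging fraction are illustrative assumptions, since the patent fixes only the 10%–20% range.

```python
import numpy as np

def estimate_plane_depth(depth, inclined_threshold, concave_convex_threshold,
                         center_frac=0.5, top_frac=0.15):
    """Sketch of S620-S670 under the assumptions stated above."""
    g = point_gradients(depth)                                     # S620
    if (longitudinal_gradients(g) >= inclined_threshold).any():    # S640
        raise ValueError('projection plane is inclined')
    if (g >= concave_convex_threshold).any():                      # S650
        raise ValueError('projection surface is seriously uneven')
    # S660: central window, assumed to lie inside the projected image.
    n, m = depth.shape
    di = max(1, int(n * center_frac) // 2)
    dj = max(1, int(m * center_frac) // 2)
    center = depth[n // 2 - di: n // 2 + di, m // 2 - dj: m // 2 + dj]
    # S670: average the largest 10%-20% of values to suppress dust noise.
    vals = np.sort(center.ravel())[::-1]
    k = max(1, int(vals.size * top_frac))
    return float(vals[:k].mean())
```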
In another implementation of this embodiment, the linear relationship between the depth parameter and each mapping parameter is established by the following method:
according to the using distance of the projection module of the interactive projection equipment, a plurality of distance values are selected in the using distance to serve as depth parameter observation values, and if the plurality of distance values are uniformly selected in the using distance range, the obtained depth parameter observation values are uniformly distributed in the using distance range of the projection module.
The mapping parameter value corresponding to each depth parameter observation value is calculated, and the calculated mapping parameter values are taken as mapping parameter observation values. Based on each depth parameter observation value, the coordinates of each observation point on the projection plane in both the projection image and the depth image can be calculated; after the coordinates of a plurality of observation points are obtained, the mapping parameter observation value corresponding to each depth parameter observation value is calculated using the pre-established mapping model.
And performing linear regression fitting processing on the depth parameter observed values and each mapping parameter observed value to obtain a linear relation between the depth parameters and each mapping parameter.
In this embodiment, an optical depth projection module with a usable distance of 200 mm–1100 mm is taken as an example, and 400 mm, 500 mm, 600 mm, 700 mm, 800 mm, 900 mm and 1000 mm are selected within 400 mm–1000 mm as the depth parameter observation values to mine the linear relationship between the depth parameter and the mapping parameters.
Figs. 7 to 12 show, respectively, the relationship between the depth parameter and: the transverse stretching ratio parameter A1 of the projection image on the depth image; the interference parameter A2 of the depth image's longitudinal direction on the projection image's transverse direction; the overall offset parameter A3 of the depth image and the projection image in the transverse direction; the longitudinal stretching ratio parameter B1 of the projection image on the depth image; the interference parameter B2 of the depth image's transverse direction on the projection image's longitudinal direction; and the offset parameter B3 of the depth image and the projection image in the longitudinal direction. In each of figs. 7 to 12, the horizontal coordinate is the depth parameter and the vertical coordinate is the corresponding mapping parameter. It can be seen intuitively from figs. 7 to 12 that the transverse stretching ratio parameter A1, the interference parameter A2, the offset parameter A3, the longitudinal stretching ratio parameter B1, the interference parameter B2, and the offset parameter B3 are all linear in the depth parameter.
In this embodiment, linear regression fits the straight line y = kx + b, where y is a mapping parameter observation value, x is a depth parameter observation value, and k and b are fitting coefficients. Based on the data shown in figs. 7 to 12, the fitting coefficients corresponding to each mapping parameter can be obtained; see table 1.
TABLE 1

Mapping parameter                        k                        b
Transverse stretch ratio parameter A1    6.890054144620897e-04    3.445336790652552
Interference parameter A2                0.001122548818640        -0.834039877906753
Offset parameter A3                      0.426663458553791        -5.035206754938266e+02
Longitudinal stretch ratio parameter B1  -2.175293104056436e-04   0.096839180499118
Interference parameter B2                3.377888095238247e-04    3.649938259523799
Offset parameter B3                      0.203183306878311        1.278813492592590e+03
Based on the value of the fitting coefficient corresponding to each mapping parameter in table 1, a linear relation between each mapping parameter and the depth parameter can be obtained.
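Such a linear fit can be reproduced, for example, with NumPy's least-squares polynomial fit. In the sketch below, the A1 observations are synthesized from the Table 1 coefficients purely for illustration; in practice each observation value comes from observation-point coordinates measured at the corresponding depth.

```python
import numpy as np

depths = np.array([400., 500., 600., 700., 800., 900., 1000.])  # mm, observation values
# Hypothetical A1 observations, synthesized here from the Table 1 coefficients.
a1_obs = 6.890054144620897e-04 * depths + 3.445336790652552

k, b = np.polyfit(depths, a1_obs, deg=1)  # least-squares fit of y = k*x + b
print(f'A1: k = {k:.6e}, b = {b:.6f}')
```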
After the linear relation between each mapping parameter and the depth parameter is obtained, the mapping parameter values corresponding to the depth information can be calculated from the acquired depth information of the projection surface, and substituting the calculated mapping parameter values into the pre-established mapping model yields the calibrated mapping equation.
The mapping model pre-established in this embodiment is

x = A1·i + A2·j + A3 + C1
y = B1·j + B2·i + B3 + C2

where C1 and C2 are the calibration error amounts, (i, j) are the coordinates of the depth image data points, and (x, y) are the coordinates of the projection image data points.
In this embodiment, when the linear relation between each mapping parameter and the depth parameter is calculated, boundary points on the projection plane are used as observation points, and when a boundary point is collected as an observation point, a deviation arises in the projection image. Referring to fig. 13, the coordinates of the boundary point (0, 0) in the projection image may be miscalculated as the point (29, 29); this embodiment therefore sets the constants C1 and C2 in the mapping model to compensate for the deviation introduced in the calculation process.
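A short usage sketch ties the pieces together. It reuses the calibrate function from the earlier sketch; the coefficient pairs are those of Table 1, while the C1/C2 values and the fingertip coordinates are hypothetical, chosen only to illustrate the boundary compensation.

```python
# (k, b) per mapping parameter from Table 1; C1/C2 are hypothetical values.
coeffs = {
    'A1': (6.890054144620897e-04, 3.445336790652552),
    'A2': (0.001122548818640, -0.834039877906753),
    'A3': (0.426663458553791, -5.035206754938266e+02),
    'B1': (-2.175293104056436e-04, 0.096839180499118),
    'B2': (3.377888095238247e-04, 3.649938259523799),
    'B3': (0.203183306878311, 1.278813492592590e+03),
    'C1': -29.0,  # hypothetical compensation, cf. the (29, 29) deviation in fig. 13
    'C2': -29.0,
}
mapping = calibrate(800.0, coeffs)  # projection plane measured at 800 mm
x, y = mapping(120, 80)             # hypothetical fingertip found in the depth image
```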
Corresponding to the calibration method provided by the embodiment of the invention, the embodiment of the invention also provides a calibration device.
Fig. 14 is a block diagram of a calibration apparatus according to an embodiment of the present invention, and as shown in fig. 14, the calibration apparatus according to the embodiment includes:
a depth acquisition unit 141 configured to acquire depth information of the projection plane from the depth-of-field image;
a mapping parameter value obtaining unit 142, configured to obtain a mapping parameter value corresponding to the depth information according to a pre-established linear relationship between the depth parameter and the mapping parameter;
the calibration unit 143 is configured to obtain a calibrated mapping equation according to a mapping model established in advance and the mapping parameter value.
In an implementation of this embodiment, the depth obtaining unit 141 includes:
the first calculation module is used for calculating the gradient of each data point of the depth image and calculating the gradient of each longitudinal direction of the depth image according to the gradient of each data point;
the central region extraction module is used for acquiring a central region of the depth image when the gradient of each data point of the depth image is smaller than a concave-convex threshold value and the gradient of each longitudinal direction is smaller than an inclined threshold value, wherein the central region is located in a range covered by the projection image;
and the second calculation module is used for arranging the data points in the central area in descending order of value, calculating the average value of a preset number of data points in the sequence, and taking the calculated average value as the depth information.
The first calculating module is specifically configured to calculate an absolute value of a difference between each data point in the depth image and its laterally adjacent data point, and an absolute value of a difference between the data point and its longitudinally adjacent data point, and use a sum of the two absolute values as a gradient of the data point; and calculating the gradient sum of all data points in each longitudinal direction in the depth image to obtain the gradient in each longitudinal direction.
In another implementation of this embodiment, the apparatus shown in fig. 14 further includes a calculating unit, configured to calculate a mapping parameter value corresponding to each depth parameter observed value, and use the calculated mapping parameter value as the mapping parameter observed value; performing linear regression fitting processing on the depth parameter observed values and each mapping parameter observed value to obtain a linear relation between the depth parameters and each mapping parameter; wherein the depth parameter observation values are a plurality of distance values selected within a range of use of a projection module of the interactive projection device.
The calibration unit 143 of this embodiment is configured to obtain a calibrated mapping equation from the depth image to the projection image according to the mapping model

x = A1·i + A2·j + A3 + C1
y = B1·j + B2·i + B3 + C2

and the parameter values of the mapping parameters A1, A2, A3, B1, B2, B3.

Another aspect of the present invention provides a computer-readable storage medium storing executable instructions for performing the above-described method.
The computer-readable storage medium of the present embodiments may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium may include a computer program that may include code/computer-executable instructions that, when executed by a processor, cause the processor to perform, for example, the method flows described above and any variations thereof.
The computer program may comprise computer program code, for example organized into computer program modules. In an example embodiment, the code in the computer program may include one or more program modules. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, and when these program modules are executed by a processor, the processor can carry out, for example, the method flows described above and any variations thereof.
A mapping model is established in advance that contains mapping parameters in a linear relationship with the depth parameter of the projection surface. Based on this linear relationship, the mapping parameter values corresponding to the depth information of the projection surface are obtained, and correction processing is performed with these values to obtain a calibrated mapping equation. The invention realizes automatic coordinate mapping without requiring the user to participate in calibration; calibration is completed without the user's awareness, which greatly enhances the user experience.
The specific working modes of the units of the calibration device embodiment of the present invention can be referred to the method embodiment of the present invention, and are not described herein again.
For the convenience of clearly describing the technical solutions of the embodiments of the present invention, in the embodiments of the present invention, the words "first", "second", and the like are used to distinguish the same items or similar items with basically the same functions and actions, and those skilled in the art can understand that the words "first", "second", and the like do not limit the quantity and execution order.
While the foregoing is directed to embodiments of the present invention, other modifications and variations of the present invention may be devised by those skilled in the art in light of the above teachings. It should be understood by those skilled in the art that the foregoing detailed description is for the purpose of better explaining the present invention, and the scope of the present invention should be determined by the scope of the appended claims.

Claims (10)

1. A calibration method for use in coordinate mapping between a depth-of-field image and a projection image in an interactive projection device, the method comprising:
acquiring depth information of a projection plane according to a depth-of-field image; specifically, when it is judged that the projection plane is not inclined and the projection plane is flat, acquiring a central area of the depth-of-field image, wherein the central area is located within a range covered by the projection image, arranging the data points in the central area in descending order of value, calculating the average value of a preset number of data points in the sequence, and taking the calculated average value as the depth information;
acquiring mapping parameter values corresponding to the depth information according to a pre-established linear relationship between the depth parameters and the mapping parameters, wherein the mapping parameters are factors influencing the coordinate mapping from the depth image to the projection image, and each mapping parameter has a linear relationship with the depth parameters;
and obtaining a calibrated mapping equation according to a pre-established mapping model and the mapping parameter values, wherein the mapping model comprises the mapping parameters.
2. The calibration method according to claim 1, wherein the acquiring depth information of the projection plane from the depth image comprises:
calculating a gradient of each data point of the depth image, and calculating a gradient of each longitudinal direction of the depth image according to the gradient of each data point;
when the gradient of each data point of the depth image is smaller than a concave-convex threshold value and the gradient of each longitudinal direction is smaller than an inclined threshold value, acquiring a central area of the depth image, wherein the central area is located in a range covered by the projection image;
and arranging the data points in the central area in descending order of value, calculating the average value of a preset number of data points in the sequence, and taking the calculated average value as the depth information.
3. The calibration method according to claim 2, wherein the calculating the gradient of each data point of the depth image and the calculating the gradient of each longitudinal direction of the depth image according to the gradient of each data point comprises:
calculating the absolute value of the difference between each data point in the depth image and the adjacent data point in the transverse direction of the data point and the absolute value of the difference between the data point and the adjacent data point in the longitudinal direction of the data point, and taking the sum of the two absolute values as the gradient of the data point;
and calculating the gradient sum of all data points in each longitudinal direction in the depth image to obtain the gradient in each longitudinal direction.
4. Calibration method according to claim 1, characterized in that the linear relationship of the depth parameter to the mapping parameter is established by:
selecting a plurality of distance values as depth parameter observation values within the use distance according to the use distance of a projection module of the interactive projection equipment;
calculating a mapping parameter value corresponding to each depth parameter observation value, and taking the calculated mapping parameter value as a mapping parameter observation value;
and performing linear regression fitting processing on the depth parameter observed values and each mapping parameter observed value to obtain a linear relation between the depth parameters and each mapping parameter.
5. The calibration method according to claim 1, wherein obtaining the calibrated mapping equation according to the pre-established mapping model and the mapping parameter value comprises:
according to a mapping model

x = A1·i + A2·j + A3 + C1
y = B1·j + B2·i + B3 + C2

and mapping parameter values of the parameters A1, A2, A3, B1, B2 and B3, obtaining a calibrated mapping equation from the depth image to the projection image;
wherein A1 and B1 are respectively the transverse stretching ratio parameter and the longitudinal stretching ratio parameter of the projection image on the depth image, A2 and B2 are respectively the interference parameter of the depth image's longitudinal direction on the projection image's transverse direction and the interference parameter of the depth image's transverse direction on the projection image's longitudinal direction, A3 and B3 are respectively the offset parameter of the depth image and the projection image in the transverse direction and the offset parameter in the longitudinal direction, C1 and C2 are calibration error amounts, (i, j) are the coordinates of the depth image data points, and (x, y) are the coordinates of the projection image data points.
6. A calibration apparatus for use in coordinate mapping between a depth-of-field image and a projection image in an interactive projection device, the apparatus comprising:
the depth acquisition unit is used for acquiring depth information of a projection surface according to a depth-of-field image; specifically, when it is judged that the projection surface is not inclined and the projection surface is relatively flat, acquiring a central area of the depth-of-field image, wherein the central area is located within a range covered by the projection image, arranging the data points in the central area in descending order of value, calculating the average value of a preset number of data points in the sequence, and taking the calculated average value as the depth information;
the mapping parameter value acquisition unit is used for acquiring a mapping parameter value corresponding to the depth information according to a pre-established linear relationship between the depth parameter and the mapping parameters, wherein the mapping parameters are factors influencing the coordinate mapping from the depth image to the projection image, and each mapping parameter has a linear relationship with the depth parameter;
and the calibration unit is used for obtaining a calibrated mapping equation according to a pre-established mapping model and the mapping parameter values, wherein the mapping model comprises the mapping parameters.
7. The calibration device according to claim 6, wherein the depth acquisition unit comprises:
the first calculation module is used for calculating the gradient of each data point of the depth image and calculating the gradient of each longitudinal direction of the depth image according to the gradient of each data point;
the central region extraction module is used for acquiring a central region of the depth image when the gradient of each data point of the depth image is smaller than a concave-convex threshold value and the gradient of each longitudinal direction is smaller than an inclined threshold value, wherein the central region is located in a range covered by the projection image;
and the second calculation module is used for arranging the data points in the central area in descending order of value, calculating the average value of a preset number of data points in the sequence, and taking the calculated average value as the depth information.
8. The calibration device of claim 7,
the first calculation module is used for calculating the absolute value of the difference between each data point in the depth image and the adjacent data point in the transverse direction of the data point and the absolute value of the difference between the data point and the adjacent data point in the longitudinal direction of the data point, and taking the sum of the two absolute values as the gradient of the data point; and calculating the gradient sum of all data points in each longitudinal direction in the depth image to obtain the gradient in each longitudinal direction.
9. The calibration device according to claim 6, further comprising a calculation unit;
the calculation unit is used for calculating a mapping parameter value corresponding to each depth parameter observation value, and taking the calculated mapping parameter value as a mapping parameter observation value; performing linear regression fitting processing on the depth parameter observed values and each mapping parameter observed value to obtain a linear relation between the depth parameters and each mapping parameter; wherein the depth parameter observation values are a plurality of distance values selected within a range of use of a projection module of the interactive projection device.
10. Calibration device according to claim 6,
the calibration unit is used for obtaining a calibrated mapping equation from the depth image to the projection image according to the mapping model

x = A1·i + A2·j + A3 + C1
y = B1·j + B2·i + B3 + C2

and mapping parameter values of the parameters A1, A2, A3, B1, B2 and B3;
wherein A1 and B1 are respectively the transverse stretching ratio parameter and the longitudinal stretching ratio parameter of the projection image on the depth image, A2 and B2 are respectively the interference parameter of the depth image's longitudinal direction on the projection image's transverse direction and the interference parameter of the depth image's transverse direction on the projection image's longitudinal direction, A3 and B3 are respectively the offset parameter of the depth image and the projection image in the transverse direction and the offset parameter in the longitudinal direction, C1 and C2 are calibration error amounts, (i, j) are the coordinates of the depth image data points, and (x, y) are the coordinates of the projection image data points.
CN201810034980.5A 2018-01-15 2018-01-15 Calibration method and device Active CN108279809B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810034980.5A CN108279809B (en) 2018-01-15 2018-01-15 Calibration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810034980.5A CN108279809B (en) 2018-01-15 2018-01-15 Calibration method and device

Publications (2)

Publication Number Publication Date
CN108279809A CN108279809A (en) 2018-07-13
CN108279809B true CN108279809B (en) 2021-11-19

Family

ID=62803729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810034980.5A Active CN108279809B (en) 2018-01-15 2018-01-15 Calibration method and device

Country Status (1)

Country Link
CN (1) CN108279809B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110769222B (en) * 2018-12-24 2021-08-17 成都极米科技股份有限公司 Projection surface depth information acquisition method, projection method and projection system
CN117529909A (en) * 2021-12-10 2024-02-06 英特尔公司 Automatic projection correction

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1584729A (en) * 2003-08-22 2005-02-23 日本电气株式会社 Image projection method and device
CN101321303A (en) * 2008-07-17 2008-12-10 上海交通大学 Geometric and optical correction method for non-plane multi-projection display
US20120114225A1 (en) * 2010-11-09 2012-05-10 Samsung Electronics Co., Ltd. Image processing apparatus and method of generating a multi-view image
CN102799317A (en) * 2012-07-11 2012-11-28 联动天下科技(大连)有限公司 Smart interactive projection system
CN103824282A (en) * 2013-12-11 2014-05-28 香港应用科技研究院有限公司 Touch and motion detection using surface map, object shadow and a single camera
CN104349096A (en) * 2013-08-09 2015-02-11 联想(北京)有限公司 Image calibration method, image calibration device and electronic equipment
CN104793784A (en) * 2015-03-23 2015-07-22 中国科学技术大学先进技术研究院 Simulation touch operation system and operation method based on depth data
CN105160680A (en) * 2015-09-08 2015-12-16 北京航空航天大学 Design method of camera with no interference depth based on structured light
US20160247286A1 (en) * 2014-02-07 2016-08-25 Lsi Corporation Depth image generation utilizing depth information reconstructed from an amplitude image
CN106255938A (en) * 2014-02-28 2016-12-21 惠普发展公司, 有限责任合伙企业 Sensor and the calibration of projector


Also Published As

Publication number Publication date
CN108279809A (en) 2018-07-13

Similar Documents

Publication Publication Date Title
CN104715249B (en) Object tracking methods and device
US20140363073A1 (en) High-performance plane detection with depth camera data
US8805015B2 (en) Electronic device and method for measuring point cloud of object
US20140278173A1 (en) Baseline management for sensing device
US20120134536A1 (en) Image Processing Apparatus and Method, and Program
WO2021129305A1 (en) Calibration rod testing method for optical motion capture system, device, apparatus, and storage medium
CN103765870A (en) Image processing apparatus, projector and projector system including image processing apparatus, image processing method
CN107704638A (en) Elevator machine room figure generating means and method, modeling data generating means and method
CN108279809B (en) Calibration method and device
CN110648363A (en) Camera posture determining method and device, storage medium and electronic equipment
CN110910445B (en) Object size detection method, device, detection equipment and storage medium
CN110047133A (en) A kind of train boundary extraction method towards point cloud data
CN109727226A (en) A kind of position table automatic generation method based on machine learning
CN105354816B (en) A kind of electronic units fix method and device
CN103544492A (en) Method and device for identifying targets on basis of geometric features of three-dimensional curved surfaces of depth images
CN109241822A (en) A kind of multi-faceted method for detecting human face and system based on MTCNN
CN111829531A (en) Two-dimensional map construction method and device, robot positioning system and storage medium
CN103854026B (en) A kind of recognition methods and electronic equipment
CN110765926B (en) Picture book identification method, device, electronic equipment and storage medium
CN102622742B (en) Method and equipment for searching light spots and apertures of Hartmann wavefront detector
CN107566822A (en) The method, apparatus and electronic equipment of a kind of bore hole stereoscopic display
CN113473118B (en) Data timestamp alignment method, device, equipment and storage medium
CN116360634A (en) Coordinate acquisition method, equipment and medium for touch point group
WO2022215137A1 (en) Communication design assistance device, communication design assistance method, and program
CN111047635A (en) Depth image-based plane touch method and device and touch system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant