CN108337494A - Calibration method and apparatus for a projection device, projection device and terminal device

Info

Publication number
CN108337494A
CN108337494A (Application No. CN201810479778.3A)
Authority
CN
China
Prior art keywords
projection
module
information
image
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810479778.3A
Other languages
Chinese (zh)
Inventor
郭海龙 (Guo Hailong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co., Ltd.
Original Assignee
Goertek Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Technology Co., Ltd.
Priority to CN201810479778.3A
Publication of CN108337494A
Legal status: Pending

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/12 - Picture reproducers
    • H04N 9/31 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 - Video signal processing therefor
    • H04N 9/3182 - Colour adjustment, e.g. white balance, shading or gamut

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses a calibration method for a projection device, applied in the field of image processing. The method includes: acquiring image information collected by a depth-of-field module, where the image information contains a calibration image projected by a projection module; performing image processing on the image information to obtain pixel information of each preset position in the calibration image; determining, from the pixel information, coordinate information of each preset position in the depth-of-field projection; and determining a coordinate transformation relationship between the depth-of-field module and the projection module according to the coordinate information and projection-surface size parameter information of the projection module. By performing image processing on the image information containing the calibration image, the method obtains the coordinate information of each preset position in the depth-of-field projection and then determines the coordinate transformation relationship between the depth-of-field module and the projection module, realizing automatic calibration, avoiding user operation, and improving the user experience. The invention also discloses a calibration apparatus for a projection device, a projection device, a terminal device, and a computer-readable storage medium, which have the above advantageous effects.

Description

Calibration method and device for projection equipment, projection equipment and terminal equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular to a calibration method and apparatus for a projection device, a projection device, a terminal device, and a computer-readable storage medium.
Background
Currently, projection devices capable of human-computer interaction (for example, intelligent projection lamps) are favored by more and more users. Such a projection device generally consists of a projection module (which projects the operating system interface) and a depth-of-field module. Because the area covered by the depth-of-field module is larger than the projection area of the projection module, the user must perform a calibration operation before using the projection device for touch interaction, in order to establish the correspondence between the area covered by the depth-of-field module and the projection area of the projection module.
At present, projection devices are calibrated by projecting calibration icons onto the projection surface, and the user must click each projected icon to calibrate manually. To achieve an accurate calibration, many icons must be clicked, which requires a great deal of user operation and degrades the user experience. Moreover, if the user does not click a calibration icon precisely, the calibration will be biased, the subsequent interaction will be poor, and the user experience will be degraded further. It is therefore important to realize automatic calibration of the projection device, so as to reduce user operation and improve the user experience.
Disclosure of Invention
The object of the invention is to provide a calibration method and apparatus for a projection device, a projection device, a terminal device and a computer-readable storage medium, which enable the projection device to calibrate itself automatically and avoid both the excessive calibration operations otherwise required of the user and the calibration inaccuracy caused by manual operation.
In order to solve the above technical problem, the present invention provides a calibration method for a projection apparatus, including:
acquiring image information acquired by the depth-of-field module; the image information comprises a calibration image projected by the projection module;
performing image processing on the image information to obtain pixel information of each preset position in the calibration image;
determining coordinate information of each preset position in the depth projection by using the pixel information;
and determining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
Optionally, the acquiring the image information collected by the depth of field module includes:
acquiring the image information acquired by the depth-of-field module; the image information contains a calibration image projected by the projection module as a full screen of a preset color.
Optionally, performing image processing on the image information to obtain pixel information of each preset position in the calibration image, including:
carrying out binarization processing on the image information to obtain a gray level image;
and carrying out pixel scanning on the gray level image to obtain pixel information of each preset position in the calibration image.
Optionally, performing pixel scanning on the grayscale image to obtain pixel information of each preset position in the calibration image, including:
performing pixel scanning on the gray level image to obtain pixel information of a first vertex and a second vertex in the calibration image; wherein the first vertex and the second vertex are diagonal vertices.
Optionally, determining a coordinate transformation relationship between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module, including:
obtaining a coordinate transformation relation between the depth of field module and the projection module according to the coordinate information (Px0, Py0) of the first vertex, the coordinate information (Px1, Py1) of the second vertex and the projection surface size parameter information (Screen X, Screen Y) of the projection module:
where (TouchX, TouchY) is the coordinate of the interaction position on the projection surface of the projection module, (x, y) is the coordinate of the interaction position in the image area collected by the depth-of-field module, ScreenX and ScreenY are respectively the length and the width of the projection surface of the projection module, Px0 and Px1 are respectively the minimum and maximum coordinate values, in the depth-of-field image, of the projection surface of the projection module along its length, and Py0 and Py1 are respectively its minimum and maximum coordinate values along its width.
The invention also provides a calibration device of projection equipment, comprising:
the image acquisition module is used for acquiring the image information acquired by the depth-of-field module; the image information comprises a calibration image projected by the projection module;
the image processing module is used for carrying out image processing on the image information to obtain pixel information of each preset position in the calibration image;
the coordinate information acquisition module is used for determining the coordinate information of each preset position in the depth of field projection by using the pixel information;
and the coordinate transformation relation determining module is used for determining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
Optionally, the image processing module includes:
a binarization unit, configured to perform binarization processing on the image information to obtain a grayscale image; the image information comprises a calibration image which is projected by the projection module and takes a full screen as a preset color;
and the pixel scanning unit is used for carrying out pixel scanning on the gray level image to obtain pixel information of each preset position in the calibration image.
Optionally, the pixel scanning unit is specifically a unit that performs pixel scanning on the grayscale image to obtain pixel information of a first vertex and a second vertex in the calibration image; wherein the first vertex and the second vertex are diagonal vertices.
Optionally, the coordinate transformation relation determining module is specifically a module that obtains a coordinate transformation relation between the depth of field module and the projection module according to the coordinate information (Px0, Py0) of the first vertex, the coordinate information (Px1, Py1) of the second vertex, and the projection plane size parameter information (screen x, screen y) of the projection module; the coordinate transformation relation is specifically as follows:
where (TouchX, TouchY) is the coordinate of the interaction position on the projection surface of the projection module, (x, y) is the coordinate of the interaction position in the image area collected by the depth-of-field module, ScreenX and ScreenY are respectively the length and the width of the projection surface of the projection module, Px0 and Px1 are respectively the minimum and maximum coordinate values, in the depth-of-field image, of the projection surface of the projection module along its length, and Py0 and Py1 are respectively its minimum and maximum coordinate values along its width.
The present invention also provides a projection apparatus comprising: the depth of field module, the projection module, the memory and the processor; wherein,
the memory for storing a computer program;
the processor, when executing the computer program, is configured to implement the steps of the calibration method for a projection apparatus according to any of the above.
The invention also provides terminal equipment comprising the projection equipment.
The invention also provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method of calibrating a projection device as defined in any one of the preceding claims.
The invention provides a calibration method of projection equipment, which comprises the following steps: acquiring image information acquired by the depth-of-field module; the image information comprises a calibration image projected by the projection module; performing image processing on the image information to obtain pixel information of each preset position in the calibration image; determining coordinate information of each preset position in the depth projection by using the pixel information; and determining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
Therefore, by performing image processing on the image information that is collected by the depth-of-field module and that contains the calibration image projected by the projection module, the method obtains the coordinate information, in the depth-of-field projection, of each preset position in the calibration image, and from it determines the coordinate transformation relationship between the depth-of-field module and the projection module, thereby calibrating the projection device. Calibration is performed automatically, without manual operation, which improves the user experience; and because manual operation is avoided, the calibration inaccuracy that user operation causes in the prior art, that is, inaccuracy caused by human factors, is also avoided. The invention further provides a calibration apparatus for a projection device, a terminal device and a computer-readable storage medium, which have the above advantageous effects and are not described again here.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a flowchart of a calibration method for a projection apparatus according to an embodiment of the present invention;
fig. 2 is a schematic view of an area of a projection image projected by the projection module and an image area range acquired by the depth of field module according to the embodiment of the present invention;
fig. 3 is a schematic plan view of an area of a projection image projected by the projection module and an area range of an image acquired by the depth of field module according to the embodiment of the present invention;
FIGS. 4-7 are schematic diagrams of calibration images provided by embodiments of the present invention;
FIG. 8 is a schematic diagram of a coordinate transformation relationship provided by an embodiment of the present invention;
fig. 9 is a block diagram of a calibration apparatus of a projection device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating a calibration method of a projection apparatus according to an embodiment of the invention; the method can comprise the following steps:
s100, acquiring image information acquired by a depth-of-field module; the image information comprises a calibration image projected by the projection module.
An existing projection device comprises a depth-of-field module and a projection module; the projection module projects the image output by the operating system, and the depth-of-field module acquires the interaction position information. Specifically, the depth-of-field module comprises a distance-measuring device (e.g., an infrared transceiver) and a camera (e.g., an RGB camera), and the camera performs the image acquisition (an RGB camera, for example, acquires a color image). The image information collected by the depth-of-field module in this embodiment is therefore captured by the camera in the depth-of-field module. This embodiment does not limit the camera; the image information may, for example, be acquired by an RGB camera in the depth-of-field module.
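As a small illustration of this acquisition step, the sketch below grabs one frame from the depth-of-field module's RGB camera through OpenCV. It assumes the camera is exposed to the operating system as an ordinary video device (index 0); an actual product would more likely use the vendor SDK of its depth-of-field module, and the function name is an invention of this example.

```python
import cv2

def acquire_calibration_frame(device_index=0):
    """Grab one BGR frame, expected to contain the projected calibration image,
    from the RGB camera of the depth-of-field module."""
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("camera of the depth-of-field module is not available")
    try:
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("failed to read a frame from the camera")
        return frame
    finally:
        cap.release()
```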
The area of the image projected by the projection module and the image area collected by the depth-of-field module differ in size, and the projected area is contained within the image area collected by the depth-of-field module, as shown in fig. 2 and fig. 3. Fig. 3 shows, on the projection plane, the relationship between the image range collected by the depth-of-field module and the projection range of the projection module. The image information collected by the depth-of-field module therefore contains all of the projection information of the projection module. In this embodiment, calibration is completed mainly by performing image recognition on the image information acquired by the depth-of-field module, so the coordinate transformation relationship between the depth-of-field module and the projection module is determined through that image recognition; accordingly, the projection position information of the projection module must be determined while the image information acquired by the depth-of-field module is processed.
In this embodiment, the calibration image projected by the projection module is identified in the image information collected by the depth-of-field module; the coordinate information, in the depth-of-field projection, of each preset position in the identified calibration image is then determined (these preset positions can represent the position information of the whole projection area of the projection module); and the coordinate transformation relationship between the depth-of-field module and the projection module is finally determined by combining this coordinate information with the projection-surface size parameter information of the projection module.
This embodiment does not limit the calibration image, and the user can set or modify it according to actual requirements, as long as the calibration image can represent the position information of the whole projection area (i.e. the projection surface) of the projection module. For example, the minimum and maximum coordinate values of the length, and the minimum and maximum coordinate values of the width, of the projection area of the projection module within the depth-of-field image can be determined from the coordinate information, in the depth-of-field projection, of each preset position in the identified calibration image.
For example, the calibration image may be a full screen of a preset color (as shown in fig. 4, where the black area may in practice be a highly saturated color such as red); or four colored points at the four corner positions of the projection area of the projection module (as shown in fig. 5, where the black areas may be a highly saturated color such as red); or two colored dots at two diagonally opposite corner positions of the projection area (as shown in fig. 6, where the black areas may be a highly saturated color such as red). Of course, colored points that allow the position of the projection area of the projection module to be determined may also be placed at other positions, and this embodiment does not limit such points to being circles or rectangles. The calibration image may also be a rectangle whose frame, matching the projection range of the projection module, is a preset color and whose interior is uncolored (as shown in fig. 7, where the black border may be a highly saturated color such as red).
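As an illustration of the full-screen variant of fig. 4, the short sketch below builds such a calibration image for the projection module to display. The 1280x720 resolution, the BGR channel order and the choice of red are assumptions made for the example rather than values specified in the text.

```python
import numpy as np

def make_full_screen_calibration_image(width=1280, height=720, bgr_color=(0, 0, 255)):
    """Create a calibration image that fills the whole projection output with one
    highly saturated preset colour (red by default, in BGR order)."""
    image = np.empty((height, width, 3), dtype=np.uint8)
    image[:] = bgr_color       # fill every pixel with the preset colour
    return image               # hand this frame to the projection module for display
```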
And S110, carrying out image processing on the image information to obtain pixel information of each preset position in the calibration image.
And S120, determining coordinate information of each preset position in the depth projection by using the pixel information.
Specifically, the present embodiment does not limit the specific process of image processing, as long as it can identify each preset position in the calibration image. The user can select the corresponding algorithm according to the characteristics of the calibration image actually set by the user. For example, when the calibration image is as shown in fig. 4 to 6, the contour of each preset position may be determined by edge detection, and the pixel information of each preset position may be determined according to the contour. Further, the embodiment also does not limit the specific manner of edge detection. The edge detection can be performed, for example, by binarization.
The present embodiment does not limit the specific manner of acquiring the pixel information at each preset position in the calibration image. For example, after the image information is subjected to image processing to determine each preset position in the calibration image, the pixel information of each preset position can be determined in a progressive scanning manner. Of course, the pixel information of each preset position can also be determined in a column-by-column scanning manner.
After the pixel information of each preset position is determined, the coordinate information of the corresponding position in the depth-of-field projection can be determined from the pixel information of each preset position (that is, from the row and column in which it is located).
Further, in order to improve the processing efficiency of the image information collected by the depth of field module, preferably, the image processing the image information to obtain the pixel information of each preset position in the calibration image may include:
carrying out binarization processing on the image information to obtain a gray level image;
and carrying out pixel scanning on the gray level image to obtain pixel information of each preset position in the calibration image.
Specifically, the image information acquired by the depth-of-field module is binarized to obtain a grayscale image; the grayscale image is then scanned pixel by pixel to locate the point corresponding to each preset position in the calibration image (for example, the scan determines in which row and column such a point lies), and the pixel information of each preset position is determined from the scanning information (the row and column numbers) of that point.
This embodiment does not limit the specific arrangement of the preset positions, as long as they can represent the position information of the whole projection area of the projection module. For example, the four vertices of the projection area in fig. 4 may be used as the preset positions; or two diagonal vertices of the projection area in fig. 4 may be used (which pair of diagonal vertices to select may be decided by the user, generally according to the actual projection coordinate system); or two diagonal vertices of the projection area in fig. 4, together with the center point of the long side and the center point of the short side of the projection range, may be used as the preset positions.
Further, to reduce the data-processing complexity without reducing the calibration accuracy, it is preferable to perform pixel scanning on the grayscale image to obtain the pixel information of a first vertex and a second vertex in the calibration image, where the first vertex and the second vertex are diagonal vertices.
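By way of illustration, a pixel scan of the binarized image could recover the two diagonal vertices simply by taking the extreme rows and columns that contain white pixels, as in the sketch below. This is an illustrative reading of the scanning step under a full-screen calibration image, not the patent's exact algorithm, and the function name is ours.

```python
import numpy as np

def find_diagonal_vertices(binary: np.ndarray):
    """Scan a binarized image (white = calibration area, black = background) and
    return the first vertex (Px0, Py0) and its diagonal counterpart (Px1, Py1)."""
    ys, xs = np.nonzero(binary)              # rows and columns of all white pixels
    if xs.size == 0:
        return None                          # calibration image not detected
    px0, px1 = int(xs.min()), int(xs.max())  # leftmost and rightmost white column
    py0, py1 = int(ys.min()), int(ys.max())  # topmost and bottommost white row
    return (px0, py0), (px1, py1)
```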
And S130, determining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
Specifically, from the coordinate information of each preset position, the minimum and maximum coordinate values, in the depth-of-field projection, of the projection surface of the projection module along its length, and the minimum and maximum coordinate values along its width, can be determined. The projection-surface size parameter information of the projection module is given by the hardware parameters of the actual projection module, for example the length and width corresponding to 720P, 1080P, and so on. Once these data are obtained, the coordinate transformation relationship between the depth-of-field module and the projection module can be determined from the proportional relationship between the image area collected by the depth-of-field module and the projection surface of the projection module.
Further, when the preset positions are a pair of diagonal vertices, i.e. the first vertex and the second vertex, and the upper-left corner is taken as the origin of the system screen coordinates, the coordinate transformation relationship between the depth-of-field module and the projection module is obtained from the coordinate information (Px0, Py0) of the first vertex, the coordinate information (Px1, Py1) of the second vertex, and the projection-surface size parameter information (ScreenX, ScreenY) of the projection module:
where (TouchX, TouchY) is the coordinate of the interaction position on the projection surface of the projection module, (x, y) is the coordinate of the interaction position in the image area collected by the depth-of-field module, ScreenX and ScreenY are respectively the length and the width of the projection surface of the projection module, Px0 and Px1 are respectively the minimum and maximum coordinate values, in the depth-of-field image, of the projection surface of the projection module along its length, and Py0 and Py1 are respectively its minimum and maximum coordinate values along its width.
This embodiment does not limit the specific form of the coordinate transformation relationship; it may be adapted to the coordinate system actually chosen by the user, or adapted according to a preset known ratio (for example, the coordinate transformation relationship may be expressed in terms of half of the maximum range of the projection surface of the projection module; this is merely a change in the mathematical form, still falls within the protection scope of this embodiment, and can still be understood as representing, through the ratio, the position information of the whole projection area of the projection module).
Based on the technical scheme, the calibration method of the projection equipment provided by the embodiment of the invention can obtain the coordinate information of each preset position in the calibration image in the depth-of-field projection by carrying out image processing on the image information which is acquired by the depth-of-field module and contains the calibration image projected by the projection module, and further determine the coordinate transformation relation between the depth-of-field module and the projection module, thereby realizing the calibration of the projection equipment; the method completes the calibration function through image processing (such as image recognition processing) without adding hardware, thereby saving the cost; the automatic calibration function is realized without manual operation in the calibration process, so that the user experience is improved; furthermore, the method avoids manual operation, so that the problem of inaccurate calibration caused by user operation in the prior art can be avoided, namely the problem of inaccurate calibration caused by human factors is avoided, and the calibration accuracy is improved.
Based on the above embodiment, in order to reduce the image processing difficulty and improve the image processing efficiency and the image recognition accuracy in this embodiment, it is preferable that the obtaining of the image information acquired by the depth of field module includes:
acquiring the image information acquired by the depth-of-field module; the image information contains a calibration image projected by the projection module as a full screen of a preset color.
The calibration image in this embodiment may correspond to the black area in fig. 4. This embodiment does not limit the preset color; the user can select a color that contrasts sharply with the color of the surrounding area so as to improve the subsequent image processing, since the strong contrast between the two regions makes detection of the edge of the projection area of the projection module more reliable. For example, when the surrounding area is white, the preset color may be red: when the user starts calibration, the projection module projects a full-screen red image, and this red region contrasts strongly with the other (white) areas in the image information.
Further, when the image information collected by the depth-of-field module contains a calibration image projected by the projection module as a full screen of a preset color, the image information may be processed as follows: the image information collected by the depth-of-field module is binarized to obtain a grayscale image in which the contour of the projection area of the projection module consists of white pixels and the remaining pixels are black. The grayscale image is then scanned pixel by pixel (this embodiment does not limit the scanning order; it may, for example, be row by row or column by column); each white pixel found is an outline point of the red rectangular region (the calibration image), and the pixel information of each preset position is determined from the corresponding scanning information (e.g. the row and column numbers). For the specific algorithm one may refer to the OpenCV image-processing framework; the OpenCV algorithm library can be integrated directly to obtain the contour data of the calibration image.
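As a concrete illustration of the OpenCV route mentioned above, the sketch below isolates the saturated red area, turns it into a binary mask and takes the bounding rectangle of its largest contour as the outline of the projection area in the depth-of-field image. The HSV threshold values, the assumption of an OpenCV 4.x findContours signature and the function name are choices made for this example, not details given by the patent.

```python
import cv2
import numpy as np

def extract_calibration_outline(frame_bgr: np.ndarray):
    """Return the diagonal vertices (Px0, Py0), (Px1, Py1) of the full-screen red
    calibration image inside a frame captured by the depth-of-field module."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue bands into one mask.
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255])) | \
           cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                    # calibration image not visible in the frame
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    return (x, y), (x + w, y + h)      # (Px0, Py0) and its diagonal counterpart (Px1, Py1)
```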
Referring to fig. 8, the derivation of the coordinate transformation relationship is described taking four preset positions as an example. In the depth-of-field projection, the lower-left, upper-left, lower-right and upper-right vertex coordinates of the projection range, i.e. of the four corners of the calibration image, are (Px0, Py0), (Px0, Py1), (Px1, Py0) and (Px1, Py1), and the lower-left, upper-left, lower-right and upper-right vertex coordinates of the four corners of the image range collected by the depth-of-field module are (0, 0), (0, H), (W, 0) and (W, H). Because the interaction position returned by the depth-of-field module (e.g. the position the user clicks) is expressed in the coordinates of the image range collected by the depth-of-field module, i.e. the returned interaction coordinate lies within the range (0, 0) to (W, H), the coordinate of the position actually clicked in the projected image is obtained, after the interaction coordinate is received from the depth-of-field module, by a proportional calculation between the contour range of the projection area and the image-collection range of the depth-of-field module. The size of the image actually output by the system is (ScreenX, ScreenY) (a known value defined by the system, such as 720P or 1080P), the position actually clicked on the output image is denoted (TouchX, TouchY), and TouchX and TouchY are calculated as follows:
the coordinate information of the actual interaction position on the projection image can be obtained through the two formulas. At this time, if the interaction position coordinate obtained from the depth of field module is not in the projection area range, the point is ignored and is not processed, and the point is considered as the interaction operation point.
Based on the above technical solution, the calibration method for a projection device provided by this embodiment of the invention completes calibration through image processing (e.g. image recognition) without adding hardware, which saves cost; automatic calibration is realized without manual operation during the calibration process, which improves the user experience; because manual operation is avoided, the calibration inaccuracy caused by user operation in the prior art, that is, inaccuracy caused by human factors, is also avoided; and using a full-screen preset-color area as the calibration image improves the efficiency and accuracy of the image processing, which further improves the calibration accuracy.
The calibration apparatus of the projection device, the terminal device, and the computer-readable storage medium according to the embodiments of the present invention are described below, and the calibration apparatus of the projection device, the terminal device, and the computer-readable storage medium described below and the calibration method of the projection device described above may be referred to in correspondence.
Referring to fig. 9, fig. 9 is a block diagram of a calibration apparatus of a projection device according to an embodiment of the present invention; the apparatus may include:
the image acquisition module 100 is used for acquiring image information acquired by the depth of field module; the image information comprises a calibration image projected by the projection module;
the image processing module 200 is configured to perform image processing on the image information to obtain pixel information of each preset position in the calibration image;
the coordinate information acquisition module 300 is configured to determine coordinate information of each preset position in the depth projection by using the pixel information;
and the coordinate transformation relation determining module 400 is configured to determine a coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
Based on the above embodiments, the image processing module 200 may include:
the binarization unit is used for carrying out binarization processing on the image information to obtain a gray level image; the image information comprises a calibration image which is projected by the projection module and takes a full screen as a preset color;
and the pixel scanning unit is used for carrying out pixel scanning on the gray level image to obtain pixel information of each preset position in the calibration image.
Based on the above embodiment, the pixel scanning unit is specifically a unit that performs pixel scanning on the grayscale image to obtain pixel information of a first vertex and a second vertex in the calibration image; wherein the first vertex and the second vertex are diagonal vertices.
Based on the above embodiment, the coordinate transformation relation determining module 400 is specifically a module for obtaining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information (Px0, Py0) of the first vertex, the coordinate information (Px1, Py1) of the second vertex, and the projection plane size parameter information (screen x, screen y) of the projection module; the coordinate transformation relation is specifically as follows:
where (TouchX, TouchY) is the coordinate of the interaction position on the projection surface of the projection module, (x, y) is the coordinate of the interaction position in the image area collected by the depth-of-field module, ScreenX and ScreenY are respectively the length and the width of the projection surface of the projection module, Px0 and Px1 are respectively the minimum and maximum coordinate values, in the depth-of-field image, of the projection surface of the projection module along its length, and Py0 and Py1 are respectively its minimum and maximum coordinate values along its width.
It should be noted that, on the basis of any of the above embodiments, the apparatus may be implemented with a programmable logic device, such as an FPGA, a CPLD or a single-chip microcomputer.
An embodiment of the present invention provides a projection apparatus, including: the depth of field module, the projection module, the memory and the processor; wherein,
a memory for storing a computer program;
a processor for implementing, when executing the computer program, the steps of the calibration method for a projection device described in any of the embodiments above. For example, the processor is configured to acquire the image information collected by the depth-of-field module, the image information containing a calibration image projected by the projection module; perform image processing on the image information to obtain the pixel information of each preset position in the calibration image; determine, from the pixel information, the coordinate information of each preset position in the depth-of-field projection; and determine the coordinate transformation relationship between the depth-of-field module and the projection module from the coordinate information and the projection-surface size parameter information of the projection module.
The embodiment of the invention also discloses terminal equipment comprising the projection equipment. Specifically, the present embodiment does not limit the terminal device, and the terminal device may be a mobile phone or a projector.
The embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the calibration method for a projection device according to any of the above embodiments. For example, when the computer program is executed by the processor, the image information collected by the depth-of-field module is acquired, the image information containing a calibration image projected by the projection module; image processing is performed on the image information to obtain the pixel information of each preset position in the calibration image; the coordinate information of each preset position in the depth-of-field projection is determined from the pixel information; and the coordinate transformation relationship between the depth-of-field module and the projection module is determined from the coordinate information and the projection-surface size parameter information of the projection module.
The computer-readable storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The calibration method and apparatus for a projection device, the terminal device and the computer readable storage medium provided by the present invention are described in detail above. The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.

Claims (12)

1. A method of calibrating a projection device, comprising:
acquiring image information acquired by the depth-of-field module; the image information comprises a calibration image projected by the projection module;
performing image processing on the image information to obtain pixel information of each preset position in the calibration image;
determining coordinate information of each preset position in the depth projection by using the pixel information;
and determining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
2. The calibration method for the projection apparatus according to claim 1, wherein the acquiring the image information collected by the depth of field module comprises:
acquiring image information acquired by the depth-of-field module; the image information comprises a calibration image of which the full screen projected by the projection module is a preset color.
3. The method for calibrating a projection apparatus according to claim 1, wherein performing image processing on the image information to obtain pixel information of each preset position in the calibration image comprises:
carrying out binarization processing on the image information to obtain a gray level image;
and carrying out pixel scanning on the gray level image to obtain pixel information of each preset position in the calibration image.
4. The method for calibrating a projection apparatus according to claim 3, wherein performing a pixel scan on the grayscale image to obtain pixel information of each preset position in the calibration image comprises:
performing pixel scanning on the gray level image to obtain pixel information of a first vertex and a second vertex in the calibration image; wherein the first vertex and the second vertex are diagonal vertices.
5. The method for calibrating a projection apparatus according to claim 3, wherein determining the coordinate transformation relationship between the depth of view module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module comprises:
obtaining a coordinate transformation relation between the depth of field module and the projection module according to the coordinate information (Px0, Py0) of the first vertex, the coordinate information (Px1, Py1) of the second vertex and the projection surface size parameter information (Screen X, Screen Y) of the projection module:
wherein (TouchX, TouchY) is the coordinate of the interaction position on the projection surface of the projection module, (x, y) is the coordinate of the interaction position in the image area collected by the depth-of-field module, ScreenX and ScreenY are respectively the length and the width of the projection surface of the projection module, Px0 and Px1 are respectively the minimum and maximum coordinate values, in the depth-of-field image, of the projection surface of the projection module along its length, and Py0 and Py1 are respectively its minimum and maximum coordinate values along its width.
6. A calibration arrangement for a projection device, comprising:
the image acquisition module is used for acquiring the image information acquired by the depth-of-field module; the image information comprises a calibration image projected by the projection module;
the image processing module is used for carrying out image processing on the image information to obtain pixel information of each preset position in the calibration image;
the coordinate information acquisition module is used for determining the coordinate information of each preset position in the depth of field projection by using the pixel information;
and the coordinate transformation relation determining module is used for determining the coordinate transformation relation between the depth of field module and the projection module according to the coordinate information and the projection surface size parameter information of the projection module.
7. The calibration apparatus for a projection device according to claim 6, wherein the image processing module comprises:
a binarization unit, configured to perform binarization processing on the image information to obtain a grayscale image; the image information comprises a calibration image which is projected by the projection module and takes a full screen as a preset color;
and the pixel scanning unit is used for carrying out pixel scanning on the gray level image to obtain pixel information of each preset position in the calibration image.
8. The calibration apparatus of a projection device according to claim 7, wherein the pixel scanning unit is specifically a unit configured to perform pixel scanning on the grayscale image to obtain pixel information of a first vertex and a second vertex in the calibration image; wherein the first vertex and the second vertex are diagonal vertices.
9. The calibration apparatus for a projection device according to claim 8, wherein the coordinate transformation relation determining module is specifically a module for obtaining the coordinate transformation relation between the depth of view module and the projection module according to the coordinate information (Px0, Py0) of the first vertex, the coordinate information (Px1, Py1) of the second vertex, and the projection plane size parameter information (screen x, screen y) of the projection module; the coordinate transformation relation is specifically as follows:
wherein (TouchX, TouchY) is the coordinate of the interaction position on the projection surface of the projection module, (x, y) is the coordinate of the interaction position in the image area collected by the depth-of-field module, ScreenX and ScreenY are respectively the length and the width of the projection surface of the projection module, Px0 and Px1 are respectively the minimum and maximum coordinate values, in the depth-of-field image, of the projection surface of the projection module along its length, and Py0 and Py1 are respectively its minimum and maximum coordinate values along its width.
10. A projection device, comprising: the depth of field module, the projection module, the memory and the processor; wherein,
the memory for storing a computer program;
the processor, when executing the computer program, is configured to carry out the steps of the method of calibrating a projection device according to any of claims 1 to 5.
11. A terminal device characterized by comprising the projection device according to claim 10.
12. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method for calibrating a projection device according to any one of claims 1 to 5.
CN201810479778.3A 2018-05-18 2018-05-18 A kind of calibration method of projection device, device, projection device and terminal device Pending CN108337494A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810479778.3A CN108337494A (en) 2018-05-18 2018-05-18 A kind of calibration method of projection device, device, projection device and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810479778.3A CN108337494A (en) 2018-05-18 2018-05-18 A kind of calibration method of projection device, device, projection device and terminal device

Publications (1)

Publication Number Publication Date
CN108337494A 2018-07-27

Family

ID=62935155

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810479778.3A Pending CN108337494A (en) 2018-05-18 2018-05-18 A kind of calibration method of projection device, device, projection device and terminal device

Country Status (1)

Country Link
CN (1) CN108337494A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622108A (en) * 2012-01-18 2012-08-01 深圳市中科睿成智能科技有限公司 Interactive projecting system and implementation method for same
CN103517017A (en) * 2012-06-22 2014-01-15 精工爱普生株式会社 Projector, image display system, and projector control method
CN103838437A (en) * 2014-03-14 2014-06-04 重庆大学 Touch positioning control method based on projection image
CN104090664A (en) * 2014-07-29 2014-10-08 广景科技有限公司 Interactive projection method, device and system
CN105654502A (en) * 2016-03-30 2016-06-08 广州市盛光微电子有限公司 Panorama camera calibration device and method based on multiple lenses and multiple sensors

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109660779A (en) * 2018-12-20 2019-04-19 歌尔科技有限公司 Touch-control independent positioning method, projection device and storage medium based on projection
CN110290364A (en) * 2019-06-04 2019-09-27 成都极米科技股份有限公司 Electrodeless Zooming method, device and readable storage medium storing program for executing under side throwing mode
CN110290364B (en) * 2019-06-04 2021-06-29 成都极米科技股份有限公司 Stepless zooming method and device in side-casting mode and readable storage medium
CN110347257A (en) * 2019-07-08 2019-10-18 北京七鑫易维信息技术有限公司 Calibration method, device, equipment and the storage medium of eyeball tracking equipment
CN111932571A (en) * 2020-09-25 2020-11-13 歌尔股份有限公司 Image boundary identification method and device and computer readable storage medium
WO2022247418A1 (en) * 2021-05-25 2022-12-01 青岛海信激光显示股份有限公司 Image correction method and photographing device

Similar Documents

Publication Publication Date Title
CN108337494A (en) A kind of calibration method of projection device, device, projection device and terminal device
CN109118569B (en) Rendering method and device based on three-dimensional model
CN109104596B (en) Projection system and correction method of display image
US7342572B2 (en) System and method for transforming an ordinary computer monitor into a touch screen
CN109510948B (en) Exposure adjusting method, exposure adjusting device, computer equipment and storage medium
US9721532B2 (en) Color chart detection apparatus, color chart detection method, and color chart detection computer program
US9922443B2 (en) Texturing a three-dimensional scanned model with localized patch colors
JP6115214B2 (en) Pattern processing apparatus, pattern processing method, and pattern processing program
KR20200023651A (en) Preview photo blurring method and apparatus and storage medium
CN102484724A (en) Projection image area detecting device
CN108074237B (en) Image definition detection method and device, storage medium and electronic equipment
JP2011050013A (en) Video projection apparatus and video projection method
WO2022105276A1 (en) Method and apparatus for determining projection area, projection device, and readable storage medium
US9554121B2 (en) 3D scanning apparatus and method using lighting based on smart phone
US9319666B1 (en) Detecting control points for camera calibration
WO2022105277A1 (en) Projection control method and apparatus, projection optical machine, and readable storage medium
CN111160063B (en) Internet of things IPC two-dimensional code distribution network image enhancement method and system
CN113191327A (en) Biological characteristic collection method, chip and computer readable storage medium
WO2024055531A1 (en) Illuminometer value identification method, electronic device, and storage medium
CN104933430B (en) A kind of Interactive Image Processing method and system for mobile terminal
CN112308933B (en) Method and device for calibrating camera internal reference and computer storage medium
CN116645275A (en) Method, device, projector and storage medium for correcting projection image
CN105989587B (en) Automatic calibration method of multifunctional OCT system
CN114119609B (en) Method, device and equipment for detecting image stain concentration and storage medium
WO2020107196A1 (en) Photographing quality evaluation method and apparatus for photographing apparatus, and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180727)