CN112445326B - Projection interaction method based on TOF camera, system thereof and electronic equipment - Google Patents

Projection interaction method based on TOF camera, system thereof and electronic equipment

Info

Publication number
CN112445326B
CN112445326B (application number CN201910826028.3A)
Authority
CN
China
Prior art keywords
projection, projector, TOF camera, fingertip, module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910826028.3A
Other languages
Chinese (zh)
Other versions
CN112445326A
Inventor
戴怡洁 (Dai Yijie)
张建峰 (Zhang Jianfeng)
樊能 (Fan Neng)
Current Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority to CN201910826028.3A
Publication of CN112445326A
Application granted
Publication of CN112445326B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G06V40/117 Biometrics derived from hands

Abstract

A projection interaction method based on a TOF camera, and a system and electronic device thereof, are provided. The projection interaction method based on the TOF camera comprises the following steps: performing contour detection based on a grayscale image and a depth image to obtain hand contour data, wherein the grayscale image and the depth image are obtained by the TOF camera photographing a hand in front of a projector; based on the hand contour data, locating a fingertip point on the hand contour to obtain the position of the fingertip point on the grayscale image; judging whether the fingertip of the hand is located on the projection surface of the projector; based on the mapping relationship between the grayscale image and a projection picture projected by the projector, mapping the position of the fingertip point on the grayscale image to the projection picture, so as to obtain the position of the fingertip point on the projection picture; and in response to judging that the fingertip is located on the projection surface, sending a corresponding touch instruction based on the position of the fingertip point on the projection picture, thereby realizing projection interaction.

Description

Projection interaction method based on TOF camera, system thereof and electronic equipment
Technical Field
The invention relates to the technical field of human-computer interaction, and in particular to a projection interaction method based on a TOF camera, and a system and electronic device thereof.
Background
With the rapid development of computer vision, human-computer interaction technology based on computer vision has also advanced, and human-computer interfaces are developing in a friendlier and more convenient direction. In recent years, various human-computer interaction systems have emerged for consumers, such as touch screens, data gloves, motion-sensing games, remote control pads, bare-hand interaction, and the like. Among them, human-computer interaction realized by bare-hand operation (for example, inputting characters or other commands by tapping a projected virtual keyboard image with a fingertip) has become a focus of consumer attention due to its low cost, simple operation, and user-friendliness, and is one of the most convenient and feasible methods to date.
Fingertip detection is the key to bare-hand human-computer interaction, and the accuracy and timeliness of its detection results directly affect the quality and user experience of such interaction. Currently, research on and applications of fingertip detection are widespread. In particular, fingertip detection based on two-dimensional RGB images acquired by an ordinary RGB camera mainly uses the difference between the skin color of the hand and the RGB color gamut of the background to preliminarily identify the hand region. However, since RGB images are greatly affected by environmental factors (such as illumination intensity, backlight, and reflections) and complex backgrounds, and since hand skin color varies from person to person, fingertip detection based on RGB images is inevitably affected by these factors, reducing the precision and efficiency of human-computer interaction. Therefore, accurate fingertip localization remains a very challenging topic in human-computer interaction.
Disclosure of Invention
An object of the present invention is to provide a projection interaction method based on a TOF camera, a system and an electronic device thereof, which can improve the precision and efficiency of human-computer interaction and contribute to enhancing the practicability of human-computer interaction.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a system and an electronic device thereof, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method can reduce the influence of ambient light and other factors on fingertip detection, which is helpful for improving the accuracy of fingertip detection.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a TOF camera-based projection interaction system, and an electronic device, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method can reduce errors caused by differences of different skin colors of hands, which is helpful for ensuring accuracy of fingertip detection.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a TOF camera-based projection interaction system, and an electronic device, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method uses corner point markers to calibrate the TOF camera and the projector, which is helpful for improving the accuracy of a calibration result, so as to improve the subsequent interaction accuracy.
Another object of the present invention is to provide a projection interaction method based on a TOF camera, a system and an electronic device thereof, wherein in an embodiment of the present invention, the projection interaction method based on the TOF camera can dynamically obtain mapping matrices at different projection distances, which is helpful to improve the accuracy of the mapping relationship between the TOF camera and the projector, and ensure that the subsequent projection interaction is performed smoothly.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a TOF camera-based projection interaction system, and an electronic device, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method uses an infrared depth information fusion map for fingertip detection, which is helpful for improving the precision of fingertip detection.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a system and an electronic device thereof, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method can remove detection noise caused by highlight influence in a gray-scale image through contour detection, which is helpful for improving precision of fingertip detection.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a TOF camera-based projection interaction system, and an electronic device, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method can fully consider the influence of wrist points and finger pits, which is helpful to greatly improve the effectiveness and accuracy of fingertip detection.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a TOF camera-based projection interaction system, and an electronic device, wherein in an embodiment of the present invention, the TOF camera-based projection interaction method can reduce an error influence caused by TOF single-point cloud jitter, which is helpful for ensuring that projection interaction has good stability.
Another object of the present invention is to provide a TOF camera-based projection interaction method, a system and an electronic device thereof, wherein, in order to achieve the above objects, the present invention does not need to adopt expensive materials or complex structures. Therefore, the present invention successfully and effectively provides a solution that not only provides a simple TOF camera-based projection interaction method, system and electronic device, but also increases the practicability and reliability thereof.
To achieve at least one of the above objects or other objects and advantages, the present invention provides a TOF camera based projection interaction method, including the steps of:
performing contour detection to obtain hand contour data based on a grayscale image and a depth image obtained by a TOF camera, wherein the grayscale image and the depth image are obtained by the TOF camera photographing a hand located in front of a projector, and wherein the relative position between the projector and the TOF camera is kept unchanged;
based on the hand contour data, locating a fingertip point on the hand contour to obtain the position of the fingertip point on the grayscale image;
judging whether the fingertip of the hand is located on the projection surface of the projector;
based on the mapping relationship between the grayscale image of the TOF camera and a projection picture projected by the projector on the projection surface, mapping the position of the fingertip point on the grayscale image to the projection picture to obtain the position of the fingertip point on the projection picture; and
in response to judging that the fingertip is located on the projection surface of the projector, sending a corresponding touch instruction based on the position of the fingertip point on the projection picture, thereby realizing projection interaction.
In an embodiment of the present invention, the step of performing contour detection based on the grayscale image and the depth image obtained by the TOF camera to obtain the hand contour data includes the steps of:
fusing the grayscale image and the depth image obtained by the TOF camera to obtain a grayscale-depth fusion image;
performing contour detection on the grayscale-depth fusion image to detect a hand contour region on the grayscale image; and
filtering detection noise from the hand contour region through noise filtering to obtain the hand contour data.
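The three sub-steps above (fusion, detection, noise filtering) can be sketched in Python. This is only an illustrative NumPy stand-in, since the patent provides no code: a real implementation would likely use OpenCV contour extraction and morphological filtering, and the weighting, thresholds, and function names here are assumptions.

```python
import numpy as np

def fuse_gray_depth(gray, depth, w_gray=0.5, max_depth=1500.0):
    # Normalize both channels to [0, 1]; nearer points get larger depth scores,
    # so a hand close to the projection surface stands out from the background.
    g = gray.astype(np.float32) / 255.0
    d = 1.0 - np.clip(depth.astype(np.float32) / max_depth, 0.0, 1.0)
    return w_gray * g + (1.0 - w_gray) * d

def hand_mask(fused, thresh=0.6):
    # Threshold the fused image to a candidate hand-region mask
    # (a simple stand-in for contour detection on the fused image).
    return fused > thresh

def filter_noise(mask):
    # Majority vote in each 3x3 neighbourhood: isolated highlight pixels
    # (e.g. specular noise in the grayscale channel) are discarded.
    padded = np.pad(mask.astype(np.int32), 1)
    h, w = mask.shape
    acc = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return acc >= 5
```

In use, the filtered mask would then be traced into the hand contour that the following steps operate on.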
In an embodiment of the present invention, the step of locating a fingertip point on the hand contour based on the hand contour data to obtain the position of the fingertip point on the grayscale image includes the steps of:
calculating the curvature of each contour point on the hand contour based on the hand contour data;
screening the contour points that meet the requirement through a preset curvature threshold and clustering them, so as to obtain fingertip-like candidate points; and
filtering out, through convex hull detection, the fingertip-like points located at the wrist region and the finger-valley regions of the hand contour, so as to determine the positions of the fingertip points on the grayscale image.
In an embodiment of the present invention, in the step of screening the contour points that meet the requirement through the preset curvature threshold and clustering them to obtain the fingertip-like candidate points, the curvature of each screened contour point is greater than the preset curvature threshold.
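One common way to realize "curvature of each contour point" is the k-curvature angle; a sketch under that assumption follows (the patent does not specify the curvature formula, and the thresholds and names are illustrative — note that "curvature greater than the threshold" corresponds to an *angle smaller* than the angle threshold here):

```python
import numpy as np

def k_curvature_angles(contour, k=2):
    # Angle at each contour point between the vectors to the points k behind
    # and k ahead; a small angle means a sharp protrusion (high curvature).
    n = len(contour)
    angles = np.empty(n)
    for i in range(n):
        v1 = contour[(i - k) % n] - contour[i]
        v2 = contour[(i + k) % n] - contour[i]
        c = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angles[i] = np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
    return angles

def fingertip_candidates(contour, angle_thresh=60.0, k=2):
    # Indices of sharp contour points; wrist and finger-valley false positives
    # would still need to be removed by the convex hull detection step.
    return np.where(k_curvature_angles(contour, k) < angle_thresh)[0]
```

Convex hull filtering (e.g. keeping only candidates that lie on the hull) would then discard the concave finger-valley points, since valleys never appear on the hull.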
In an embodiment of the present invention, the step of judging whether the fingertip of the hand is located on the projection surface of the projector includes the steps of:
selecting a point cloud in the corresponding fingernail region on the hand contour according to the position of the fingertip point on the hand contour;
calculating the average depth of the selected region according to the depth information of the selected point cloud, so as to obtain the distance between the fingernail region and the projection surface of the projector; and
judging, through a preset distance threshold, whether the fingertip is located on the projection surface of the projector.
In an embodiment of the invention, in the step of selecting the point cloud in the corresponding fingernail region on the hand contour according to the position of the fingertip point on the hand contour, the selected point cloud is located within the inscribed circle of the fingernail region of the hand contour.
In an embodiment of the invention, in the step of judging, through the preset distance threshold, whether the fingertip is located on the projection surface of the projector, the fingertip is judged to be located on the projection surface of the projector when the distance between the fingernail region and the projection surface is smaller than or equal to the preset distance threshold.
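The touch test above can be sketched as follows (illustrative only: a disc around the fingertip pixel stands in for the inscribed circle of the fingernail region, and the radius, threshold, and function name are assumptions not given in the patent):

```python
import numpy as np

def fingertip_on_surface(depth, tip_rc, surface_depth, radius=3, dist_thresh=10.0):
    # Average the depth of the point cloud inside a small disc around the
    # fingertip; averaging many points suppresses the TOF single-point
    # jitter that the disclosure mentions.
    ys, xs = np.ogrid[:depth.shape[0], :depth.shape[1]]
    disc = (ys - tip_rc[0]) ** 2 + (xs - tip_rc[1]) ** 2 <= radius ** 2
    avg = depth[disc].mean()
    # Touch is declared when the nail-to-surface gap is within the threshold.
    return abs(surface_depth - avg) <= dist_thresh
```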
In an embodiment of the present invention, the step of mapping the position of the fingertip point on the grayscale image to the projection picture based on the mapping relationship between the grayscale image of the TOF camera and the projection picture projected by the projector on the projection surface, so as to obtain the position of the fingertip point on the projection picture, includes the steps of:
determining, based on the projection distance of the projector, the mapping relationship between the grayscale image and the projection picture at that projection distance; and
mapping the position of the fingertip point on the grayscale image to the projection picture based on the mapping relationship at that projection distance, so as to obtain the position of the fingertip point on the projection picture.
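Assuming the distance-dependent mapping is stored as a table of 3x3 homographies (the patent does not fix the representation), the two sub-steps might look like this sketch:

```python
import numpy as np

def homography_for_distance(calib, distance):
    # Pick the calibrated homography whose projection distance is nearest to the
    # measured one; a real system might interpolate between neighbouring entries.
    nearest = min(calib, key=lambda d: abs(d - distance))
    return calib[nearest]

def map_to_projection(tip_xy, H):
    # Apply a 3x3 homography H (grayscale image -> projection picture) to a pixel,
    # dividing by the homogeneous coordinate.
    p = H @ np.array([tip_xy[0], tip_xy[1], 1.0])
    return p[:2] / p[2]
```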
In an embodiment of the present invention, the TOF camera-based projection interaction method further includes the steps of:
calibrating the TOF camera and the projector, whose relative positions are fixed, to obtain the mapping relationships between the grayscale image of the TOF camera and the projection picture of the projector at different projection distances.
In an embodiment of the present invention, the step of calibrating the TOF camera and the projector with fixed relative positions to obtain the mapping relationships between the grayscale image of the TOF camera and the projection picture of the projector at different projection distances includes the steps of:
adjusting the distance between the projector and a calibration board so that the projector projects the projection picture on the calibration board at different distances, wherein the distance between the calibration board and the projector is the projection distance of the projector;
marking the corner points of the projection picture on the calibration board with corner markers, so as to obtain marked corner points at the different projection distances;
photographing, with the TOF camera, the marked corner points on the calibration board at the different projection distances in sequence, so as to obtain grayscale images containing the marked corner points at the different projection distances; and
establishing, based on the positions of the marked corner points on the grayscale images, the mapping relationship between the grayscale image of the TOF camera and the projection picture of the projector as a function of the projection distance.
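Given the marked-corner observations, the last step amounts to fitting one homography per projection distance. A minimal Direct Linear Transform (DLT) sketch is shown below; this is an assumption about the fitting method (a production system would more likely call OpenCV's `findHomography`), and the data layout is illustrative:

```python
import numpy as np

def fit_homography(src, dst):
    # DLT: least-squares 3x3 homography mapping src -> dst from >= 4 point
    # pairs, taken as the null vector of the stacked constraint matrix.
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=np.float64))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def calibrate(corner_observations):
    # {projection_distance: (corners_in_grayscale_image, corners_in_picture)}
    # -> {projection_distance: homography}: the distance-dependent mapping table
    # consumed later when fingertip positions are mapped to the picture.
    return {d: fit_homography(img, pic) for d, (img, pic) in corner_observations.items()}
```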
According to another aspect of the present invention, the present invention also provides a TOF camera based projection interaction system, comprising:
a contour detection module, wherein the contour detection module is configured to perform contour detection based on a grayscale image and a depth image obtained by a TOF camera to obtain hand contour data, wherein the grayscale image and the depth image are obtained by the TOF camera photographing a hand in front of a projector, and wherein the relative position between the projector and the TOF camera remains unchanged;
a fingertip positioning module, wherein the fingertip positioning module is communicably connected to the contour detection module and is configured to locate a fingertip point on the hand contour based on the hand contour data, so as to obtain the position of the fingertip point on the grayscale image;
a judging module, wherein the judging module is configured to judge whether the fingertip of the hand is located on the projection surface of the projector;
a mapping module, wherein the mapping module is communicably connected to the fingertip positioning module and is configured to map the position of the fingertip point on the grayscale image to a projection picture projected by the projector on the projection surface, based on the mapping relationship between the grayscale image of the TOF camera and the projection picture, so as to obtain the position of the fingertip point on the projection picture; and
a touch module, wherein the touch module is communicably connected to the judging module and the mapping module respectively, and is configured to, in response to judging that the fingertip is located on the projection surface of the projector, send a corresponding touch instruction based on the position of the fingertip point on the projection picture, so as to realize projection interaction.
In an embodiment of the present invention, the contour detection module includes a fusion module, a detection module, and a filtering module, which are communicably connected to each other, wherein the fusion module is configured to fuse the grayscale image and the depth image obtained by the TOF camera to obtain a grayscale-depth fusion image; the detection module is configured to perform contour detection on the grayscale-depth fusion image to detect a hand contour region on the grayscale image; and the filtering module is configured to filter detection noise from the hand contour region through noise filtering to obtain the hand contour data.
In an embodiment of the present invention, the fingertip positioning module includes a curvature calculation module, a screening module, and a convex hull detection module, which are sequentially communicably connected, wherein the curvature calculation module is configured to calculate the curvature of each contour point on the hand contour based on the hand contour data; the screening module is configured to screen the contour points that meet the requirement through a preset curvature threshold and cluster them, so as to obtain fingertip-like candidate points; and the convex hull detection module is configured to filter out, through convex hull detection, the fingertip-like points located at the wrist region and the finger-valley regions of the hand contour, so as to determine the positions of the fingertip points on the grayscale image.
In an embodiment of the present invention, the judging module is further configured to select a point cloud in the corresponding fingernail region on the hand contour according to the position of the fingertip point on the hand contour; calculate the average depth of the selected region according to the depth information of the selected point cloud, so as to obtain the distance between the fingernail region and the projection surface of the projector; and judge, through a preset distance threshold, whether the fingertip is located on the projection surface of the projector.
In an embodiment of the present invention, the mapping module is further configured to determine, based on the projection distance of the projector, the mapping relationship between the grayscale image and the projection picture at that projection distance; and to map the position of the fingertip point on the grayscale image to the projection picture based on the mapping relationship at that projection distance, so as to obtain the position of the fingertip point on the projection picture.
In an embodiment of the present invention, the TOF camera based projection interaction system further includes a calibration module, wherein the calibration module is configured to calibrate the TOF camera and the projector with fixed relative positions, so as to obtain the mapping relationships between the grayscale image of the TOF camera and the projection picture of the projector at different projection distances.
In an embodiment of the present invention, the calibration module includes an adjusting module, a corner marking module, a marked-corner acquisition module, and a mapping relationship establishing module, which are sequentially communicably connected, wherein the adjusting module is configured to adjust the distance between the projector and a calibration board so that the projector projects the projection picture on the calibration board at different distances, the distance between the calibration board and the projector being the projection distance of the projector; the corner marking module is configured to mark the corner points of the projection picture on the calibration board with corner markers, so as to obtain marked corner points at the different projection distances; the marked-corner acquisition module is configured to photograph, with the TOF camera, the marked corner points on the calibration board at the different projection distances in sequence, so as to obtain grayscale images containing the marked corner points at the different projection distances; and the mapping relationship establishing module is configured to establish, based on the positions of the marked corner points on the grayscale images, the mapping relationship between the grayscale image of the TOF camera and the projection picture of the projector as a function of the projection distance.
According to another aspect of the present invention, the present invention also provides an electronic device comprising:
a processor for executing instructions; and
a memory, wherein the memory is configured to hold machine-readable instructions which, when executed by the processor, implement some or all of the steps of the TOF camera based projection interaction method as described in any one of the above.
Further objects and advantages of the invention will be fully apparent from the ensuing description and drawings.
These and other objects, features and advantages of the present invention will become more fully apparent from the following detailed description, the accompanying drawings and the claims.
Drawings
Fig. 1 is a flowchart of a TOF camera-based projection interaction method according to an embodiment of the invention.
Fig. 2 shows a flowchart of one of the steps of the TOF camera based projection interaction method according to the above-described embodiment of the invention.
Fig. 3 shows a flowchart of a second step of the TOF camera based projection interaction method according to the above embodiment of the invention.
Fig. 4 shows a flowchart of a third step of the TOF camera based projection interaction method according to the above embodiment of the invention.
Fig. 5 shows a flowchart of the fourth step of the TOF camera based projection interaction method according to the above embodiment of the invention.
Fig. 6 shows a flow chart of the fifth step of the TOF camera based projection interaction method according to the above embodiment of the invention.
Fig. 7A to 7D are schematic diagrams illustrating a process of projection interaction by the TOF camera-based projection interaction method according to the above embodiment of the invention.
Fig. 8 shows a schematic diagram of the effect of fingertip detection in the TOF camera based projection interaction method according to the above embodiment of the invention.
Fig. 9 shows a block diagram of a TOF camera based projection interaction system according to an embodiment of the invention.
Fig. 10 shows an example of an electronic device according to an embodiment of the invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
In the present invention, the terms "a" and "an" in the claims and the description should be understood as meaning "one or more"; that is, an element may be singular in one embodiment and plural in another. The terms "a" and "an" should not be construed as limiting an element to one in number unless the present disclosure explicitly recites that the number of that element is one.
In the description of the present invention, it is to be understood that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It should also be noted that, unless otherwise explicitly specified or limited, the term "connected" is to be interpreted broadly: the connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct or indirect through an intermediary. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
With the rapid development of human-computer interaction technology, human-computer interaction based on computer vision is also advancing, and realizing human-computer interaction through hand motion is one of the most convenient and feasible methods to date. For example, in current human-computer interaction technology on the market, fingertip detection is performed on a two-dimensional RGB image by using the difference between the RGB color gamut of the hand's skin and that of the background, and a corresponding instruction is then issued in response to the fingertip position and/or gesture, so as to achieve human-computer interaction. However, RGB images captured by an ordinary camera are greatly affected by ambient light and background, and skin color varies from person to person, so the precision and accuracy of fingertip detection in human-computer interaction methods based on an ordinary camera are not high, making such methods difficult to popularize and apply on a large scale. Therefore, in order to solve the above problems, the present invention provides a TOF camera-based projection interaction method, a system thereof, and an electronic device.
It is noted that a TOF (Time of flight) camera mainly obtains a depth image of an object (i.e., depth information of a point cloud on the object) by continuously sending light pulses to the object (e.g., a hand), and then receiving light returning from the object by using a sensor, and detecting the Time of flight of the light pulses. Meanwhile, the sensor of the TOF camera can also acquire a gray scale image of the target object (i.e. gray scale information of the target object), that is, the sensor of the TOF camera does not acquire RGB information, so that the TOF camera-based projection interaction method of the invention can be effectively prevented from being affected by ambient light, and reliability and stability of projection interaction can be ensured. It is understood that the gray scale image acquired by the TOF camera can be implemented as, but is not limited to, an infrared image, that is, the sensor of the TOF camera only collects invisible infrared light, but is not affected by visible light projected by the projector and visible light in the environment, which helps to greatly enhance the anti-interference capability of the TOF camera based projection interaction system.
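The time-of-flight principle described above reduces to a one-line relation: the pulse travels to the object and back, so depth is half the round-trip path. A tiny illustrative sketch (names are assumptions):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(round_trip_seconds):
    # The emitted light pulse covers the camera-to-object distance twice,
    # so the one-way distance (depth) is half the round-trip path length.
    return C * round_trip_seconds / 2.0
```

For example, a round-trip time of about 6.7 nanoseconds corresponds to a depth of roughly one metre, which shows why TOF sensors rely on high-precision timing or phase measurement.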
Illustrative method
Referring to fig. 1-6 of the drawings, a TOF camera based projection interaction method according to an embodiment of the invention is illustrated. Specifically, as shown in fig. 1, the TOF camera-based projection interaction method includes the steps of:
s120: performing contour detection based on a grayscale image and a depth image obtained by a TOF camera to obtain hand contour data, wherein the grayscale image and the depth image are obtained by the TOF camera shooting a hand in front of a projector, and wherein the relative position between the projector and the TOF camera is kept unchanged;
s140: locating a fingertip point on the hand contour based on the hand contour data, so as to obtain the position of the fingertip point on the grayscale image;
s160: judging whether the fingertip of the hand is located on the projection surface of the projector;
s180: mapping the position of the fingertip point on the grayscale image onto a projection picture projected by the projector on the projection surface, based on the mapping relationship between the grayscale image of the TOF camera and the projection picture, so as to obtain the position of the fingertip point on the projection picture; and
s190: in response to judging that the fingertip is located on the projection surface of the projector, issuing a corresponding touch instruction based on the position of the fingertip point on the projection picture, thereby realizing projection interaction.
More specifically, as shown in fig. 2, the step S120 of the TOF camera-based projection interaction method includes the steps of:
s121: fusing the grayscale image and the depth image obtained by the TOF camera to obtain a grayscale-depth fusion image; and
s122: performing contour detection on the grayscale-depth fusion image to detect a hand contour region on the grayscale image.
Further, as shown in fig. 2, the step S120 of the TOF camera-based projection interaction method further includes the step of:
s123: filtering detection noise in the hand contour region through noise filtering to obtain the hand contour data.
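As a rough illustration of steps S121-S122, the fusion can be thought of as gating each grayscale (IR amplitude) pixel by a plausible depth range. The sketch below is a minimal pure-Python stand-in; the function name, the thresholds, and the specific gating rule are illustrative assumptions, not details taken from the patent:

```python
def fuse_gray_depth(gray, depth, gray_min=30, depth_near=300.0, depth_far=1500.0):
    """Build a binary hand-candidate mask from a grayscale (IR amplitude)
    image and a depth image of the same size: a pixel is kept only if its
    IR return is strong enough AND its depth lies in the expected
    interaction range in front of the projection surface."""
    h, w = len(gray), len(gray[0])
    return [[1 if (gray[r][c] >= gray_min and depth_near <= depth[r][c] <= depth_far)
             else 0
             for c in range(w)]
            for r in range(h)]
```

In practice the resulting mask would then be handed to a contour extractor (e.g., a border-following algorithm) to obtain the hand contour region of step S122.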
It should be noted that, in the step S122, fine noise regions are also filtered out during contour detection to obtain a more accurate hand contour region; that is, filtering out fine noise regions helps to locate the hand on the grayscale image more accurately. It can be understood that, since the TOF camera relies mainly on invisible light such as infrared light when obtaining the grayscale image and the depth image, neither the influence of ambient (visible) light nor differences in skin color need be considered. The step S120 of the present invention therefore not only reduces the adverse effect of ambient light on detection, but also reduces errors caused by differences in skin color, which helps the obtained hand contour data achieve higher accuracy.
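One common way to realize the noise filtering of step S123 is to discard small connected components of the binary hand mask. The following is a hedged sketch; the 4-connectivity and the area criterion are illustrative assumptions rather than the patent's stated implementation:

```python
from collections import deque

def remove_small_components(mask, min_area):
    """Label 4-connected components of a binary mask and zero out every
    component whose pixel count is below min_area (a fine noise region)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    out = [row[:] for row in mask]
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one component, collecting its pixels.
                queue = deque([(r, c)])
                seen[r][c] = True
                component = []
                while queue:
                    y, x = queue.popleft()
                    component.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(component) < min_area:
                    for y, x in component:
                        out[y][x] = 0
    return out
```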
According to the above embodiment of the present invention, as shown in fig. 3, the step S140 of the TOF camera based projection interaction method includes the steps of:
s141: calculating the curvature of each contour point on the hand contour based on the hand contour data;
s142: screening out the contour points that satisfy a preset curvature threshold and clustering them to obtain fingertip-like points; and
s143: filtering out the fingertip-like points at the wrist region and the inter-finger regions of the hand contour through convex hull detection, so as to determine the position of the fingertip point on the grayscale image.
It is to be noted that, after the curvature of each contour point on the hand contour is calculated, all contour points with curvature greater than or equal to a preset curvature threshold T are screened out and clustered to obtain a plurality of fingertip-like points. However, besides the contour points near the fingertips, the contour points located in the wrist region (i.e., the region where the palm joins the arm) and in the inter-finger regions (i.e., the valley regions between adjacent fingers) also have large curvature. Therefore, in the step S140 of the TOF camera-based projection interaction method of the present invention, the fingertip-like points at the wrist region and the inter-finger regions need to be filtered out, so as to avoid misjudging the fingertip position.
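The convex hull filtering of step S143 relies on the fact that true fingertips lie on the hand's convex hull, while the high-curvature points at the wrist and in the inter-finger valleys lie inside it. Below is a minimal sketch using Andrew's monotone chain; the exact-membership test is a simplification (a real implementation would use a distance tolerance to the hull), and all names are illustrative:

```python
def convex_hull(points):
    """Andrew's monotone chain; returns the hull vertices of a 2D point set."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def keep_hull_candidates(candidates, contour):
    """Keep only the fingertip-like points that are vertices of the hand
    contour's convex hull; wrist and inter-finger valley points, which lie
    inside the hull, are thereby filtered out."""
    hull = set(convex_hull(contour))
    return [p for p in candidates if p in hull]
```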
It is worth mentioning that, in the step S141 of the TOF camera-based projection interaction method, the curvature of each contour point on the hand contour is calculated mainly as the average included angle between the current contour point and the contour points in specific neighboring regions on its left and right, which greatly improves the effectiveness and accuracy of fingertip positioning, with both high speed and high precision. For example: for each contour point X_i on the obtained hand contour, select the contour points X_i^L and X_i^R at distance K to the left and right of X_i, and calculate the cosine of the angle between the vectors (X_i, X_i^L) and (X_i, X_i^R), denoted cos K_i, where K takes each value in the threshold range (K_min, K_max) so as to obtain a set of cosine values; the average of these values is then taken as the curvature value of the contour point X_i.
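The cosine-based curvature just described can be sketched as follows in pure Python; treating the contour as closed (wraparound indexing) and taking the K range as inclusive are assumptions about details the text leaves open:

```python
import math

def cosine_at(contour, i, k):
    """Cosine of the angle at contour[i] between the vectors pointing to the
    contour points k steps to its left and right (contour treated as closed).
    A sharp tip gives a small included angle, hence a cosine close to +1;
    a flat stretch of contour gives a cosine close to -1."""
    n = len(contour)
    xi = contour[i]
    xl = contour[(i - k) % n]
    xr = contour[(i + k) % n]
    vl = (xl[0] - xi[0], xl[1] - xi[1])
    vr = (xr[0] - xi[0], xr[1] - xi[1])
    dot = vl[0] * vr[0] + vl[1] * vr[1]
    return dot / (math.hypot(*vl) * math.hypot(*vr))

def curvature(contour, i, k_min, k_max):
    """Average of cos K_i over K in [k_min, k_max], as in the text."""
    cosines = [cosine_at(contour, i, k) for k in range(k_min, k_max + 1)]
    return sum(cosines) / len(cosines)
```

Contour points whose curvature value meets the preset threshold T would then be clustered into fingertip-like points (step S142).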
According to the above embodiment of the present invention, as shown in fig. 4, the step S160 of the TOF camera based projection interaction method includes the steps of:
s161: selecting the point cloud in the corresponding nail region on the hand contour according to the position of the fingertip point on the hand contour;
s162: calculating the average depth of the nail region according to the depth information of the selected point cloud, so as to obtain the distance between the nail region and the projection surface of the projector; and
s163: judging whether the fingertip is located on the projection surface of the projector through a preset distance threshold.
Preferably, in the step S161, the selected point cloud is located entirely within the inscribed circle of the nail region on the hand contour. Using the average depth of this point cloud as the basis of judgment is relatively stable, so that whether the fingertip is on the projection surface of the projector can be judged accurately.
It is to be noted that, after the distance between the nail region and the projection surface of the projector is obtained, it is determined whether the fingertip is on the projection surface of the projector by the preset distance threshold S. Specifically, in the step S163 of the TOF camera-based projection interaction method of the present invention: when the distance between the nail region and the projection surface of the projector is smaller than or equal to the preset distance threshold S, determining that the fingertip is on the projection surface of the projector; when the distance between the nail region and the projection surface of the projector is greater than the preset distance threshold value S, determining that the fingertip is not located on the projection surface of the projector. It can be understood that the preset distance threshold S can be adjusted according to the sensitivity required by the interaction, and when the sensitivity of the interaction needs to be improved, only the preset distance threshold S needs to be correspondingly reduced; and when the sensitivity of the interaction needs to be reduced, only the preset distance threshold value S needs to be correspondingly increased.
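Steps S161-S163 reduce to comparing an average depth against the threshold S. A minimal sketch, in which the units, argument names, and the absolute-difference formulation are illustrative assumptions:

```python
def is_touching(nail_depths, plane_depth, threshold):
    """Judge fingertip-on-surface: average the depths of the point cloud
    sampled inside the nail region's inscribed circle, take the distance
    between that average and the projection surface, and compare it
    against the preset distance threshold S."""
    avg_depth = sum(nail_depths) / len(nail_depths)
    distance_to_surface = abs(plane_depth - avg_depth)
    return distance_to_surface <= threshold
```

Consistent with the text, the threshold argument (the patent's S) is the single knob for tuning how strictly a touch is registered.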
According to the above embodiment of the present invention, in the step S180 of the TOF camera-based projection interaction method, the position of the fingertip point on the projection screen is obtained by mapping its position on the grayscale image onto the projection screen according to the mapping relationship between the two. However, since this mapping relationship changes with the projection distance of the projector, the mapping relationship to be adopted needs to be determined before mapping. Specifically, as shown in fig. 5, the step S180 of the TOF camera-based projection interaction method includes the steps of:
s181: determining a mapping relation between the gray-scale image and the projection picture at the projection distance based on the projection distance of the projector; and
s182: and mapping the position of the fingertip point on the gray level image to the projection picture based on the mapping relation under the projection distance so as to obtain the fingertip position on the projection picture.
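Steps S181-S182 can be sketched as looking up a per-distance planar mapping and applying it to the fingertip position. Here the mapping is modeled as a 3x3 homography, a common choice for plane-to-plane mappings; the patent itself does not name the mapping's mathematical form, so this and the nearest-distance lookup are assumptions:

```python
def apply_homography(H, pt):
    """Apply a 3x3 plane-to-plane homography to a 2D point."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def map_fingertip(homographies, projection_distance, fingertip_px):
    """homographies: {calibrated_distance: 3x3 matrix}, as produced by the
    calibration of step S110. Select the mapping calibrated nearest to the
    current projection distance (step S181), then map the fingertip from
    grayscale-image pixels to projection-picture coordinates (step S182)."""
    nearest = min(homographies, key=lambda d: abs(d - projection_distance))
    return apply_homography(homographies[nearest], fingertip_px)
```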
It is noted that the projection distance of the projector may be set in advance before the projector and the TOF camera are installed, i.e. the parameters of the projector are initialized to determine the position and distance of the projection surface; or may be obtained by performing measurements by the TOF camera after the projector and the TOF camera are mounted.
In addition, in the step S190 of the TOF camera-based projection interaction method of the present invention, to realize projection interaction, it must not only be ensured that the fingertip is located on the projection surface of the projector, but the position of the fingertip point on the projection screen must also be determined, so that an accurate touch instruction can be issued. In this embodiment of the present invention, after the fingertip is judged to be located on the projection surface of the projector, the position of the fingertip point on the projection screen is determined through mapping, and a corresponding touch instruction is then issued to realize projection interaction. In this way, when the fingertip is judged not to be on the projection surface of the projector, the position of the fingertip point on the projection screen need not be determined through mapping, which reduces the computation of the TOF camera-based projection interaction method. Of course, in other examples of the present invention, the step S160 and the step S180 may be performed simultaneously, or the step S180 may be performed before the step S160.
It is worth mentioning that the accuracy of the mapping relationship between the grayscale image of the TOF camera and the projection screen of the projector directly determines the interaction quality of the projection interaction by the TOF camera based projection interaction method, that is, the higher the accuracy of the mapping relationship, the higher the interaction quality of the projection interaction by the TOF camera based projection interaction method. Therefore, in order to obtain a mapping relationship with higher precision, as shown in fig. 1, the TOF camera based projection interaction method according to the above embodiment of the invention further includes the steps of:
s110: and calibrating the TOF camera and the projector with fixed relative positions to obtain a mapping relation between the gray scale image of the TOF camera and the projection picture of the projector under different projection distances.
More specifically, as shown in fig. 6, the step S110 includes the steps of:
s111: adjusting the distance between the projector and a calibration plate so that the projector projects the projection picture onto the calibration plate at different distances, wherein the distance between the calibration plate and the projector is the projection distance of the projector;
s112: marking the corner points of the projection picture on the calibration plate with corner marks, so as to obtain marked corner points at different projection distances;
s113: shooting the marked corner points on the calibration plate at the different predetermined distances in sequence with the TOF camera, so as to obtain grayscale images containing the marked corner points at the different projection distances; and
s114: establishing the mapping relationship, varying with the projection distance, between the grayscale image of the TOF camera and the projection picture of the projector based on the positions of the marked corner points on the grayscale images.
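Step S114's distance-dependent mapping can be built, for example, by fitting each marked corner's image coordinates as a function of projection distance (fig. 7B speaks of a "distance position curve fitting"). The linear model and all names below are assumptions for illustration, not the patent's stated curve form:

```python
def fit_linear(xs, ys):
    """Least-squares line y = a*x + b through the samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def build_corner_model(samples):
    """samples: {projection_distance: [(u, v) per marked corner]}.
    Fit each corner coordinate as a linear function of projection distance,
    so that corner positions (hence the image-to-picture mapping) can be
    predicted at intermediate distances."""
    distances = sorted(samples)
    n_corners = len(samples[distances[0]])
    model = []
    for c in range(n_corners):
        us = [samples[d][c][0] for d in distances]
        vs = [samples[d][c][1] for d in distances]
        model.append((fit_linear(distances, us), fit_linear(distances, vs)))
    return model

def predict_corner(model, c, distance):
    """Predicted grayscale-image position of corner c at a given distance."""
    (au, bu), (av, bv) = model[c]
    return (au * distance + bu, av * distance + bv)
```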
It should be noted that, because the TOF camera does not acquire RGB images, the influence of visible light on calibration and on projection interaction is avoided; that is, the grayscale image captured by the TOF camera does not record the projection picture projected by the projector, so the TOF camera is not interfered with by the projector's projection picture. The corner points of the projection picture are marked on the calibration plate by a corner marking method to form marked corner points; the marked corner points on the calibration plate are then shot by the TOF camera to obtain the grayscale image containing the marked corner points, so that the mapping relationship can be established according to the positions of the marked corner points on the grayscale image. In this way, the mapping relationship between the grayscale image of the TOF camera and the projection picture of the projector can be obtained, and both the calibration efficiency of the TOF camera and the projector and the accuracy of the mapping relationship can be improved.
Exemplarily, as shown in fig. 7A, the TOF camera based projection interaction method of the present invention initializes a projector and a TOF camera to complete calibration of the projector and the TOF camera, so as to determine a projection distance of the projector and a mapping relationship between a grayscale image of the TOF camera and a projection screen of the projector at the projection distance; then, carrying out hand detection to obtain a hand outline; then, carrying out fingertip positioning and mapping to obtain the position of the fingertip on a projection picture of the projector; finally, multi-point touch control judgment is carried out to complete projection interaction; that is to say, it is determined whether the fingertip of the hand is located on the projection surface of the projector, and in response to determining that the fingertip is located on the projection surface of the projector, a corresponding touch instruction is issued based on the position of the fingertip point on the projection screen to complete the gesture response, thereby implementing the projection interaction.
First, as shown in fig. 7B, when performing projection calibration, an infrared image (e.g., a grayscale image) and point cloud data (e.g., a depth image) are acquired by the TOF camera, respectively, and a projection screen is projected by the projector; performing plane fitting on the point cloud data, and performing corner marking and detection on a projection picture on a calibration plate; then, performing distance position curve fitting to fit a change curve of the angular point position according to the distance length; and finally, obtaining the mapping relation between the gray level image and the projection picture under different projection distances so as to finish calibration.
Next, as shown in fig. 7C, after the projection calibration is completed, the detection of the hand contour needs to be performed. Illustratively, a projector is initialized firstly so as to project at a preset projection distance, and then background modeling is carried out; meanwhile, infrared data and depth data are obtained through a TOF camera, and then data fusion is carried out to obtain gray level depth fusion data; finally, carrying out contour detection and noise filtering in sequence, and stopping contour detection if the hand contour is obtained after the noise filtering is passed; if the hand contour is not obtained after the noise filtering is passed, contour detection is performed again until a hand contour map (as shown in fig. 8) is obtained.
Finally, as shown in fig. 7D, after the hand contour map is obtained, the fingertip-like points are determined by curvature detection, clustering, and a secondary curvature screening in sequence; meanwhile, convex hull detection is performed on the hand contour map so as to determine the fingertip points from the fingertip-like points; then, the triangular region of the nail and the inscribed circle region of the nail are determined in sequence based on the point cloud data (i.e., the depth data) obtained by the TOF camera; finally, the click touch is judged based on the average distance between the inscribed circle region of the nail and the projection picture, thereby realizing projection interaction.
Illustrative System
Referring to FIG. 9 of the drawings, a TOF camera based projection interaction system according to an embodiment of the present invention is illustrated. Specifically, as shown in fig. 9, the TOF camera based projection interaction system 10 includes a contour detection module 12, a fingertip positioning module 14, a determination module 16, a mapping module 18 and a touch module 19. The contour detection module 12 is configured to perform contour detection based on a grayscale image and a depth image obtained by a TOF camera to obtain hand contour data, wherein the grayscale image and the depth image are obtained by the TOF camera shooting a hand in front of a projector, and a relative position between the projector and the TOF camera is kept unchanged; the fingertip positioning module 14 is communicably connected to the contour detection module 12, and is configured to position a fingertip point on a hand contour based on the hand contour data, so as to obtain a position of the fingertip point on the hand contour on the grayscale image; the judging module 16 is configured to judge whether a fingertip of the hand is located on the projection surface of the projector; the mapping module 18 is communicably connected to the fingertip positioning module 14, and is configured to map a position of the fingertip point on the grayscale image to a projection screen projected on the projection surface based on a mapping relationship between the grayscale image of the TOF camera and the projection screen, so as to obtain a position of the fingertip point on the projection screen; the touch module 19 is communicably connected to the determining module 16 and the mapping module 18, respectively, and is configured to issue a corresponding touch instruction based on a position of the fingertip point on the projection screen in response to determining that the fingertip is located on the projection surface of the projector, so as to implement projection interaction.
In an example of the present invention, as shown in fig. 9, the contour detection module 12 includes a fusion module 121, a detection module 122, and a filtering module 123 communicably connected to each other, wherein the fusion module 121 is configured to fuse the grayscale image and the depth image obtained by the TOF camera to obtain a grayscale depth fusion image; the detection module 122 is configured to perform contour detection on the grayscale depth fusion image to detect a hand contour region on the grayscale image; the filtering module 123 is configured to filter the detection noise in the hand contour region through noise filtering to obtain the hand contour data.
In an example of the present invention, as shown in fig. 9, the fingertip positioning module 14 includes a curvature calculation module 141, a screening module 142, and a convex hull detection module 143, which are sequentially communicably connected, wherein the curvature calculation module 141 is configured to calculate the curvature of each contour point on the hand contour based on the hand contour data; the screening module 142 is configured to screen out the contour points that satisfy a preset curvature threshold and cluster them to obtain fingertip-like points; and the convex hull detection module 143 is configured to filter out the fingertip-like points in the wrist region and the inter-finger regions by convex hull detection, so as to determine the position of the fingertip point on the grayscale image.
In an example of the present invention, the determining module 16 is further configured to select a point cloud in a corresponding fingernail region on the hand contour region according to the position of the fingertip point on the hand contour region; calculating the average depth of the nail region according to the depth information of the selected point cloud, and further obtaining the distance between the nail region and the projection plane of the projector; and judging whether the fingertip point is on the projection surface of the projector or not through a preset distance threshold value.
In an example of the present invention, the mapping module 18 is further configured to determine a mapping relationship between the grayscale image and the projection screen at the projection distance based on the projection distance of the projector; and mapping the position of the fingertip point on the gray level image to the projection picture based on the mapping relation under the projection distance so as to obtain the fingertip position on the projection picture.
According to the above embodiment of the present invention, the TOF camera based projection interaction system 10 further includes a calibration module 11, configured to calibrate the TOF camera and the projector with fixed relative positions, so as to obtain mapping relationships between the grayscale image of the TOF camera and the projection image of the projector at different projection distances.
Further, as shown in fig. 9, the calibration module 11 of the TOF camera-based projection interaction system 10 may include an adjusting module 111, a corner marking module 112, a marked corner obtaining module 113, and a mapping relationship establishing module 114, which are sequentially communicably connected, wherein the adjusting module 111 is configured to adjust the distance between the projector and a calibration plate so that the projector projects the projection picture onto the calibration plate at different distances, the distance between the calibration plate and the projector being the projection distance of the projector; the corner marking module 112 is configured to mark the corner points of the projection picture on the calibration plate with corner marks, so as to obtain marked corner points at different projection distances; the marked corner obtaining module 113 is configured to sequentially shoot the marked corner points on the calibration plate at the different predetermined distances with the TOF camera, so as to obtain the grayscale images containing the marked corner points at the different projection distances; and the mapping relationship establishing module 114 is configured to establish the mapping relationship, varying with the projection distance, between the grayscale image of the TOF camera and the projection picture of the projector based on the positions of the marked corner points on the grayscale images.
Illustrative electronic device
Next, an electronic apparatus according to an embodiment of the present invention is described with reference to fig. 10 (fig. 10 shows a block diagram of the electronic apparatus according to the embodiment of the present invention). As shown in fig. 10, the electronic device 20 includes one or more processors 21 and a memory 22.
The processor 21 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 20 to perform desired functions.
The memory 22 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read-Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 21 to implement the methods of the various embodiments of the invention described above and/or other desired functions.
In one example, as shown in fig. 10, the electronic device 20 may further include: an input device 23 and an output device 24, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 23 may be, for example, a camera module or the like for capturing image data or video data.
The output device 24 can output various information including the classification result and the like to the outside. The output devices 24 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for simplicity, only some of the components of the electronic device 20 relevant to the present invention are shown in fig. 10, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 20 may include any other suitable components depending on the particular application.
Illustrative computing program product
In addition to the above-described methods and apparatus, embodiments of the present invention may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the methods according to various embodiments of the present invention described in the "exemplary methods" section above of this specification.
The computer program product may write program code for carrying out operations of embodiments of the present invention in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the C language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, an embodiment of the present invention may also be a computer-readable storage medium having stored thereon computer program instructions, which, when executed by a processor, cause the processor to perform the steps of the above-described method of the present specification.
The computer readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present invention have been described above with reference to specific embodiments, but it should be noted that the advantages, effects, etc. mentioned in the present invention are only examples and are not limiting, and the advantages, effects, etc. must not be considered to be possessed by various embodiments of the present invention. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the invention is not limited to the specific details described above.
The block diagrams of devices, apparatuses, and systems involved in the present invention are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including", "comprising", "having", and the like are open-ended words that mean "including, but not limited to", and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or", unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the apparatus, devices and methods of the present invention, the components or steps may be broken down and/or re-combined. These decompositions and/or recombinations are to be regarded as equivalents of the present invention.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the invention. Thus, the present invention is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the embodiments, and any variations or modifications may be made to the embodiments of the present invention without departing from the principles described.

Claims (16)

1. A projection interaction method based on a TOF camera is characterized by comprising the following steps:
performing contour detection based on a grayscale image and a depth image obtained by a TOF camera to obtain hand contour data, wherein the grayscale image and the depth image are obtained by the TOF camera shooting a hand in front of a projector, and the relative position between the projector and the TOF camera remains unchanged;
locating a fingertip point on the hand contour based on the hand contour data, so as to obtain the position of the fingertip point on the grayscale image;
determining whether the fingertip of the hand is on a projection surface of the projector;
mapping the position of the fingertip point on the grayscale image to a projection picture projected by the projector on the projection surface, based on a mapping relationship between the grayscale image of the TOF camera and the projection picture, so as to obtain the position of the fingertip point on the projection picture; and
in response to determining that the fingertip is on the projection surface of the projector, issuing a corresponding touch instruction based on the position of the fingertip point on the projection picture, thereby realizing projection interaction;
wherein the step of determining whether the fingertip of the hand is on the projection surface of the projector comprises the steps of:
selecting a point cloud within a nail region on the hand contour according to the position of the fingertip point on the hand contour;
calculating an average depth of the nail region from the depth information of the selected point cloud, so as to obtain the distance between the nail region and the projection surface of the projector; and
determining, by a preset distance threshold, whether the fingertip is on the projection surface of the projector.
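The touch decision in claim 1 reduces to comparing the average nail-region depth against the known depth of the projection surface. A minimal sketch (the patent contains no code; the millimeter units and the 5 mm default threshold are illustrative assumptions, not values from the disclosure):

```python
def fingertip_on_surface(nail_depths_mm, plane_depth_mm, threshold_mm=5.0):
    """Decide whether a fingertip touches the projection surface.

    nail_depths_mm: depth samples (mm) of the point cloud selected
    inside the nail region of the detected hand contour.
    plane_depth_mm: depth of the projection surface along the same
    viewing ray, known from calibration.
    """
    if not nail_depths_mm:
        return False
    # Averaging the nail-region depths smooths per-pixel TOF noise.
    avg_depth = sum(nail_depths_mm) / len(nail_depths_mm)
    # Distance between the nail region and the projection surface.
    distance = abs(plane_depth_mm - avg_depth)
    # Touch when the distance is within the preset threshold.
    return distance <= threshold_mm
```

With the surface at 1000 mm, samples averaging within 5 mm of it count as a touch, while a hand hovering tens of millimeters away does not.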
2. The TOF camera-based projection interaction method according to claim 1, wherein the step of performing contour detection based on a grayscale image and a depth image obtained by a TOF camera to obtain hand contour data comprises the steps of:
fusing the grayscale image and the depth image obtained by the TOF camera to obtain a grayscale-depth fused image;
performing contour detection on the grayscale-depth fused image to detect a hand contour region on the grayscale image; and
filtering out detection noise in the hand contour region through noise filtering, so as to obtain the hand contour data.
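One plausible reading of the grayscale-depth fusion in claim 2 is to gate grayscale pixels by a depth range in which the hand is expected, so that contour detection operates on the hand alone. A sketch under that assumption (the patent does not specify the fusion rule; the depth gate is illustrative):

```python
def fuse_gray_depth(gray, depth, near_mm, far_mm):
    """Fuse a grayscale image with its registered depth map by keeping
    grayscale values only where depth falls inside [near_mm, far_mm];
    all other pixels are zeroed. Images are row-major nested lists of
    equal size.
    """
    h, w = len(gray), len(gray[0])
    fused = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if near_mm <= depth[y][x] <= far_mm:
                fused[y][x] = gray[y][x]
    return fused
```

Pixels whose depth lies outside the gate (background wall, sensor dropouts) are suppressed before the contour and noise-filtering steps.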
3. The TOF camera-based projection interaction method according to claim 1, wherein the step of locating a fingertip point on the hand contour based on the hand contour data, so as to obtain the position of the fingertip point on the grayscale image, comprises the steps of:
calculating the curvature of each contour point on the hand contour based on the hand contour data;
screening out contour points meeting a requirement through a preset curvature threshold and clustering them, so as to obtain fingertip-like points; and
filtering out, through convex hull detection, the fingertip-like points located at the wrist region and the finger regions of the hand contour, so as to determine the position of the fingertip point on the grayscale image.
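The curvature screening of claim 3 can be approximated with a discrete turning-angle test: a contour point whose neighbours k steps away form a sharp included angle is a fingertip candidate. A sketch (the neighbour offset k and the 60-degree angle threshold are assumed values, not from the patent; the subsequent clustering and convex hull filtering are omitted):

```python
import math

def fingertip_candidates(contour, k=5, angle_thresh_deg=60.0):
    """Flag contour points sharp enough to be fingertip-like points.

    contour: closed contour as a list of (x, y) points.
    k: neighbour offset used to estimate local curvature.
    angle_thresh_deg: keep points whose included angle is below this.
    """
    n = len(contour)
    candidates = []
    for i in range(n):
        px, py = contour[(i - k) % n]   # neighbour k steps back
        cx, cy = contour[i]
        nx, ny = contour[(i + k) % n]   # neighbour k steps ahead
        v1 = (px - cx, py - cy)
        v2 = (nx - cx, ny - cy)
        norm = math.hypot(*v1) * math.hypot(*v2)
        if norm == 0:
            continue  # degenerate (repeated) points
        cos_a = max(-1.0, min(1.0, (v1[0] * v2[0] + v1[1] * v2[1]) / norm))
        if math.degrees(math.acos(cos_a)) < angle_thresh_deg:
            candidates.append((cx, cy))
    return candidates
```

Sharp convexities at the wrist would also pass this test, which is why claim 3 follows it with convex hull filtering.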
4. The TOF camera-based projection interaction method according to claim 3, wherein in the step of screening out contour points meeting a requirement through a preset curvature threshold and clustering them, the curvature of each screened-out contour point is greater than the preset curvature threshold.
5. The TOF camera-based projection interaction method according to claim 1, wherein in the step of selecting a point cloud within a nail region on the hand contour according to the position of the fingertip point on the hand contour, the selected point cloud lies within an inscribed circle of the nail region of the hand contour.
6. The TOF camera-based projection interaction method according to claim 5, wherein in the step of determining, by a preset distance threshold, whether the fingertip is on the projection surface of the projector, the fingertip is determined to be on the projection surface of the projector when the distance between the nail region and the projection surface is less than or equal to the preset distance threshold.
7. The TOF camera-based projection interaction method according to claim 1, wherein the step of mapping the position of the fingertip point on the grayscale image to the projection picture, based on the mapping relationship between the grayscale image of the TOF camera and the projection picture projected by the projector on the projection surface, comprises the steps of:
determining, based on the projection distance of the projector, the mapping relationship between the grayscale image and the projection picture at that projection distance; and
mapping the position of the fingertip point on the grayscale image to the projection picture based on the mapping relationship at that projection distance, so as to obtain the position of the fingertip point on the projection picture.
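Once the per-distance mapping of claim 7 has been selected, applying it is a single point transform. A sketch assuming the mapping is stored as a 3x3 planar homography in row-major nested lists (the homography form is an assumption; the patent only specifies that a mapping exists for each projection distance):

```python
def map_to_projection(pt, H):
    """Map a fingertip point from grayscale-image coordinates into
    projection-picture coordinates using homography H (3x3, row-major).
    """
    x, y = pt
    # Projective division normalizes the homogeneous coordinate.
    denom = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / denom
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / denom
    return u, v
```

With the identity matrix the point is unchanged; an affine H with scale 2 and offsets (10, 20) maps (3, 4) to (16, 28).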
8. The TOF camera-based projection interaction method according to any one of claims 1 to 7, further comprising the step of:
calibrating the TOF camera and the projector, whose relative position is fixed, so as to obtain the mapping relationships between the grayscale image of the TOF camera and the projection picture of the projector at different projection distances.
9. The TOF camera-based projection interaction method according to claim 8, wherein the step of calibrating the TOF camera and the projector comprises the steps of:
adjusting the distance between the projector and a calibration board so that the projector projects the projection picture onto the calibration board at different distances, wherein the distance between the calibration board and the projector is the projection distance of the projector;
marking the corner points of the projection picture on the calibration board with corner marks, so as to obtain marked corner points at the different projection distances;
shooting, by the TOF camera, the marked corner points on the calibration board at the different projection distances in sequence, so as to obtain grayscale images containing the marked corner points at the different projection distances; and
establishing, based on the positions of the marked corner points on the grayscale images, the mapping relationship between the grayscale image of the TOF camera and the projection picture of the projector as a function of the projection distance.
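The calibration of claim 9 turns four marked corner positions per distance into a mapping. As a simplifying assumption not taken from the patent, the sketch below treats the picture as an axis-aligned rectangle in the grayscale image, so a scale-and-offset model suffices; a full implementation would fit a homography from the four correspondences instead:

```python
def build_mapping(corners_img, picture_w, picture_h):
    """Derive a grayscale-image -> projection-picture mapping from the
    four marked corner positions observed at one projection distance.

    corners_img: the four marked corner points (x, y) in the grayscale image.
    picture_w, picture_h: projection picture resolution in pixels.
    """
    xs = [p[0] for p in corners_img]
    ys = [p[1] for p in corners_img]
    x0, y0 = min(xs), min(ys)
    sx = picture_w / (max(xs) - x0)  # horizontal scale, picture px per image px
    sy = picture_h / (max(ys) - y0)  # vertical scale

    def to_picture(pt):
        # Shift to the picture origin, then scale into picture pixels.
        return ((pt[0] - x0) * sx, (pt[1] - y0) * sy)

    return to_picture
```

Repeating this at each calibration distance yields the family of distance-indexed mappings referred to in claim 8, from which the runtime mapping of claim 7 is selected.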
10. A TOF camera based projection interaction system, comprising:
a contour detection module, wherein the contour detection module is configured to perform contour detection based on a grayscale image and a depth image obtained by a TOF camera to obtain hand contour data, wherein the grayscale image and the depth image are obtained by the TOF camera shooting a hand in front of a projector, and the relative position between the projector and the TOF camera remains unchanged;
a fingertip positioning module, wherein the fingertip positioning module is communicably connected to the contour detection module and is configured to locate a fingertip point on the hand contour based on the hand contour data, so as to obtain the position of the fingertip point on the grayscale image;
a judging module, wherein the judging module is configured to determine whether the fingertip of the hand is on a projection surface of the projector;
a mapping module, wherein the mapping module is communicably connected to the fingertip positioning module and is configured to map the position of the fingertip point on the grayscale image to a projection picture projected by the projector on the projection surface, based on a mapping relationship between the grayscale image of the TOF camera and the projection picture, so as to obtain the position of the fingertip point on the projection picture; and
a touch module, wherein the touch module is communicably connected to the judging module and the mapping module respectively, and is configured to, in response to determining that the fingertip is on the projection surface of the projector, issue a corresponding touch instruction based on the position of the fingertip point on the projection picture, so as to realize projection interaction;
wherein the judging module is further configured to: select a point cloud within a nail region on the hand contour according to the position of the fingertip point on the hand contour; calculate an average depth of the nail region from the depth information of the selected point cloud, so as to obtain the distance between the nail region and the projection surface of the projector; and determine, by a preset distance threshold, whether the fingertip is on the projection surface of the projector.
11. The TOF camera-based projection interaction system of claim 10, wherein the contour detection module comprises a fusion module, a detection module and a filtering module communicably connected to one another, wherein the fusion module is configured to fuse the grayscale image and the depth image obtained by the TOF camera to obtain a grayscale-depth fused image; the detection module is configured to perform contour detection on the grayscale-depth fused image to detect a hand contour region on the grayscale image; and the filtering module is configured to filter out detection noise in the hand contour region through noise filtering, so as to obtain the hand contour data.
12. The TOF camera-based projection interaction system of claim 11, wherein the fingertip positioning module comprises a curvature calculation module, a screening module and a convex hull detection module which are communicably connected in sequence, wherein the curvature calculation module is configured to calculate the curvature of each contour point on the hand contour based on the hand contour data; the screening module is configured to screen out contour points meeting a requirement through a preset curvature threshold and cluster them, so as to obtain fingertip-like points; and the convex hull detection module is configured to filter out, through convex hull detection, the fingertip-like points located at the wrist region and the finger regions of the hand contour, so as to determine the position of the fingertip point on the grayscale image.
13. The TOF camera-based projection interaction system of claim 10, wherein the mapping module is further configured to determine, based on the projection distance of the projector, the mapping relationship between the grayscale image and the projection picture at that projection distance, and to map the position of the fingertip point on the grayscale image to the projection picture based on the mapping relationship at that projection distance, so as to obtain the position of the fingertip point on the projection picture.
14. The TOF camera-based projection interaction system according to any one of claims 10 to 13, further comprising a calibration module, wherein the calibration module is configured to calibrate the TOF camera and the projector, whose relative position is fixed, so as to obtain the mapping relationships between the grayscale image of the TOF camera and the projection picture of the projector at different projection distances.
15. The TOF camera-based projection interaction system of claim 14, wherein the calibration module comprises an adjusting module, a corner marking module, a marked-corner acquisition module and a mapping relationship establishing module which are communicably connected in sequence, wherein the adjusting module is configured to adjust the distance between the projector and a calibration board so that the projector projects the projection picture onto the calibration board at different distances, the distance between the calibration board and the projector being the projection distance of the projector; the corner marking module is configured to mark the corner points of the projection picture on the calibration board with corner marks, so as to obtain marked corner points at the different projection distances; the marked-corner acquisition module is configured to shoot, by the TOF camera, the marked corner points on the calibration board at the different projection distances in sequence, so as to obtain grayscale images containing the marked corner points at the different projection distances; and the mapping relationship establishing module is configured to establish, based on the positions of the marked corner points on the grayscale images, the mapping relationship between the grayscale image of the TOF camera and the projection picture of the projector as a function of the projection distance.
16. An electronic device, comprising:
a processor, wherein the processor is configured to execute instructions; and
a memory, wherein the memory is configured to store machine-readable instructions which, when executed by the processor, perform some or all of the steps of the TOF camera-based projection interaction method according to any one of claims 1 to 9.
CN201910826028.3A 2019-09-03 2019-09-03 Projection interaction method based on TOF camera, system thereof and electronic equipment Active CN112445326B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910826028.3A CN112445326B (en) 2019-09-03 2019-09-03 Projection interaction method based on TOF camera, system thereof and electronic equipment

Publications (2)

Publication Number Publication Date
CN112445326A CN112445326A (en) 2021-03-05
CN112445326B (en) 2023-04-07

Family

ID=74734456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910826028.3A Active CN112445326B (en) 2019-09-03 2019-09-03 Projection interaction method based on TOF camera, system thereof and electronic equipment

Country Status (1)

Country Link
CN (1) CN112445326B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114066917B (en) * 2021-11-11 2022-08-05 深圳市云鼠科技开发有限公司 Cleaning method, cleaning device, electronic equipment and storage medium
CN114296556A (en) * 2021-12-31 2022-04-08 苏州欧普照明有限公司 Interactive display method, device and system based on human body posture
CN115908573B (en) * 2023-02-20 2023-06-02 季华实验室 Rubber glove opening positioning method, system, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102572505A (en) * 2010-11-03 2012-07-11 微软公司 In-home depth camera calibration
CN103383731A (en) * 2013-07-08 2013-11-06 深圳先进技术研究院 Projection interactive method and system based on fingertip positioning and computing device
CN105528082A (en) * 2016-01-08 2016-04-27 北京暴风魔镜科技有限公司 Three-dimensional space and hand gesture recognition tracing interactive method, device and system
CN108549489A (en) * 2018-04-27 2018-09-18 哈尔滨拓博科技有限公司 A kind of gestural control method and system based on hand form, posture, position and motion feature
CN109375833A (en) * 2018-09-03 2019-02-22 深圳先进技术研究院 A kind of generation method and equipment of touch command

Similar Documents

Publication Publication Date Title
US10620712B2 (en) Interactive input system and method
CN112445326B (en) Projection interaction method based on TOF camera, system thereof and electronic equipment
US20140354602A1 (en) Interactive input system and method
US8837780B2 (en) Gesture based human interfaces
CN111949111B (en) Interaction control method and device, electronic equipment and storage medium
US6979087B2 (en) Display system with interpretable pattern detection
US8743089B2 (en) Information processing apparatus and control method thereof
CN104423731B (en) Apparatus of coordinate detecting, the method for coordinate measurement and electronic information plate system
JP5974165B2 (en) User input processing by target tracking
US20110234542A1 (en) Methods and Systems Utilizing Multiple Wavelengths for Position Detection
TWI471815B (en) Gesture recognition device and method
JP5802247B2 (en) Information processing device
US20120319945A1 (en) System and method for reporting data in a computer vision system
US9733764B2 (en) Tracking of objects using pre-touch localization on a reflective surface
US9823782B2 (en) Pre-touch localization on a reflective surface
CN106569716B (en) Single-hand control method and control system
CN106370883B (en) Speed measurement method and terminal
CN110213407B (en) Electronic device, operation method thereof and computer storage medium
US9652081B2 (en) Optical touch system, method of touch detection, and computer program product
US9569013B2 (en) Coordinate detection system, information processing apparatus, and recording medium
JP2017125764A (en) Object detection apparatus and image display device including the same
WO2010023348A1 (en) Interactive displays
US20180074648A1 (en) Tapping detecting device, tapping detecting method and smart projecting system using the same
CN115793893B (en) Touch writing handwriting generation method and device, electronic equipment and storage medium
CN116974400B (en) Screen touch recognition method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant