CN113093916A - Infrared intelligent interaction system - Google Patents

Infrared intelligent interaction system

Info

Publication number
CN113093916A
CN113093916A
Authority
CN
China
Prior art keywords
interaction
infrared intelligent
module
sight line
active
Legal status: Pending
Application number
CN202110505087.8A
Other languages
Chinese (zh)
Inventor
贾涛 (Jia Tao)
洪旺 (Hong Wang)
周若楠 (Zhou Ruonan)
习向智 (Xi Xiangzhi)
杨粉丽 (Yang Fenli)
李蔓 (Li Man)
Current Assignee
Shenzhen Heijin Industrial Manufacturing Co ltd
Original Assignee
Shenzhen Heijin Industrial Manufacturing Co ltd
Application filed by Shenzhen Heijin Industrial Manufacturing Co ltd
Priority to CN202110505087.8A
Publication of CN113093916A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ophthalmology & Optometry (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an infrared intelligent interaction system, which belongs to the field of intelligent interaction systems and comprises a sight line judging module, a movement control module, an active interaction module, a gesture detection module and an interaction control module. The sight line judging module captures eye images of nearby intended interaction objects and judges whether to carry out active interaction; the movement control module controls the device to move toward the intended interaction object; the active interaction module sends out active interaction instructions; the gesture detection module performs gesture detection on the intended interaction object; and the interaction control module controls the active interaction module to send out active interaction instructions. By judging and selecting interaction objects before man-machine interaction begins, the system reduces the missed-interaction rate for objects with interaction intent while avoiding undue disturbance to pedestrians, saves energy, and suits a wide range of environments.

Description

Infrared intelligent interaction system
Technical Field
The invention belongs to the field of intelligent interaction systems, and particularly relates to an infrared intelligent interaction system capable of increasing interaction volume and interaction accuracy.
Background
Man-machine interaction technology realizes effective dialogue between people and computers through computer input and output devices. The machine provides people with large amounts of relevant information and prompts through output or display devices, and people in turn supply relevant information, answer questions and issue prompts to the machine through input devices. Human-computer interaction technology is one of the important elements of computer user-interface design. At present, input may take the form of keyboard, mouse, voice, expression or gesture input, among others, and the forms of human-computer interaction have become increasingly diverse.
Expression input, gesture input and the like require infrared image collection and analysis devices and systems, such as infrared generators and infrared cameras. At present, human-computer interaction using such devices and systems generally identifies the intended interaction object inaccurately before active interaction begins, so that objects with interaction intent are missed, or objects without interaction intent are interacted with by mistake, disturbing pedestrians and causing adverse effects.
Disclosure of Invention
In order to solve the problems that infrared interaction systems disturb pedestrians and other interaction objects and suffer from poor accuracy, the invention provides a novel infrared intelligent interaction system that improves the accuracy with which intended interaction objects are identified and interacted with. The specific scheme is as follows:
an infrared intelligent interaction system comprises a sight line judging module, a movement control module, an active interaction module, a gesture detection module and an interaction control module;
the sight line judging module is used for capturing eye images of intended interaction objects near the device carrying the infrared intelligent interaction system and judging, according to those eye images, whether to carry out active interaction; the movement control module is used for controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object when active interaction is judged to proceed; the active interaction module is used for sending out active interaction instructions; the gesture detection module is used for performing gesture detection on the intended interaction object; the interaction control module is used for controlling the active interaction module to send out an active interaction instruction, or for controlling the active interaction module to send out a corresponding active interaction instruction according to the result obtained by the gesture detection module.
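The module relationships can be pictured as a thin control layer. The following Python sketch is illustrative only: the class and method names and the dwell threshold are assumptions, not the patent's reference implementation. It shows how the interaction control module might wire the other four modules together:

```python
from dataclasses import dataclass


@dataclass
class GazeResult:
    in_preset_range: bool    # located sight line falls within the preset range
    dwell_seconds: float     # how long it has stayed there


class SightLineModule:
    def judge(self) -> GazeResult:
        """Capture an eye image and locate the sight line position."""
        raise NotImplementedError


class MovementModule:
    def approach_object(self) -> None:
        """Move the carrier device to the vicinity of the intended object."""
        raise NotImplementedError


class ActiveInteractionModule:
    def issue_instruction(self, kind: str = "voice") -> None:
        """Play interactive voice, perform a gesture operation, or show a picture."""
        raise NotImplementedError


class GestureModule:
    def detect(self):
        """Return a recognized gesture label, or None if no gesture is seen."""
        raise NotImplementedError


class InteractionController:
    """Coordinates the other modules: judge intent, approach, then interact."""

    def __init__(self, sight, mover, active, gesture, dwell_threshold_s=1.5):
        self.sight, self.mover = sight, mover
        self.active, self.gesture = active, gesture
        self.dwell_threshold_s = dwell_threshold_s  # assumed value, in seconds

    def step(self) -> None:
        result = self.sight.judge()
        if result.in_preset_range and result.dwell_seconds >= self.dwell_threshold_s:
            self.mover.approach_object()          # move near the object
            self.active.issue_instruction()       # open the interaction
            while (label := self.gesture.detect()) not in (None, "end"):
                # answer each detected gesture with a further instruction
                self.active.issue_instruction(kind=label)
```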
Preferably, the sight line judging module includes an eye image capturing unit, a sight line judging unit and a time monitoring unit;
the eye image capturing unit is used for capturing eye images within its capture range; the sight line judging unit is used for processing the captured eye images and locating the sight line position; the time monitoring unit is used for monitoring, when the located sight line position of the intended interaction object falls within a preset range, the duration for which it stays within that range, and for judging according to this duration whether to carry out active interaction.
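As one concrete reading of the time monitoring unit, the dwell duration can be accumulated against a monotonic clock. This is a minimal sketch under assumed names; the 1.5 s threshold is an illustrative assumption, not a value from the patent:

```python
import time


class DwellTimer:
    """Tracks how long the located sight line has stayed inside the
    preset range; reports True once the dwell threshold is crossed."""

    def __init__(self, threshold_s: float = 1.5):
        self.threshold_s = threshold_s
        self._since = None  # time the sight line entered the preset range

    def update(self, gaze_in_range: bool) -> bool:
        now = time.monotonic()
        if not gaze_in_range:
            self._since = None      # sight line left the range: reset
            return False
        if self._since is None:
            self._since = now       # sight line just entered the range
        return now - self._since >= self.threshold_s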
preferably, the above-mentioned eye image capturing mode is camera shooting or infrared camera shooting, and the captured eye image is a normal image or an infrared imaging image. More preferably a normal image.
Preferably, the eye image capturing unit comprises two or more camera devices distributed on the device carrying the infrared intelligent interaction system.
Preferably, the camera devices are positioned so that they can capture eye images from which any sight line falling on the device carrying the infrared intelligent interaction system can be determined.
Preferably, the movement control module comprises an environment monitoring unit, a forward control unit, an avoidance control unit and a stop control unit;
the environment monitoring unit is used for detecting the surrounding environment; the forward control unit is used for controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object according to the surrounding environment; the avoidance control unit is used for controlling the device to avoid surrounding pedestrians and obstacles according to the surrounding environment; the stop control unit is used for controlling the device to stop moving near the intended interaction object.
Preferably, sending out an active interaction instruction includes one or more of playing interactive voice, controlling the device carrying the infrared intelligent interaction system to perform an interactive gesture operation, or playing an interactive picture.
Preferably, the gesture detection module comprises a device control unit, a gesture image acquisition unit and an operation recognition unit;
the device control unit is used for controlling the detection devices related to gesture detection to turn toward the intended interaction object; the gesture image acquisition unit is used for acquiring gesture images; the operation recognition unit is used for performing operation recognition on the acquired gesture images to obtain recognition results.
Preferably, the detection devices related to gesture detection comprise a camera and may further comprise an infrared generating device; the camera may be an infrared camera.
A method for performing man-machine interaction using the infrared intelligent interaction system comprises the following steps:
S1, capturing an eye image of an intended interaction object near the device carrying the infrared intelligent interaction system;
S2, processing the captured eye image and locating the sight line position;
S3, when the located sight line position falls within a preset range, monitoring the duration for which the intended interaction object's sight line stays within the preset range, and judging according to this duration whether to carry out active interaction;
S4, when active interaction is judged to proceed in S3, controlling the device carrying the infrared intelligent interaction system to move to the vicinity of the intended interaction object;
S5, sending out an active interaction instruction;
S6, performing gesture detection on the intended interaction object;
S7, carrying out subsequent interaction according to the detected gesture.
Preferably, the eye images captured in S1 include all eye images from which the located sight line position can be determined to fall within a preset position range.
Preferably, the preset position range includes any part of the device carrying the infrared intelligent interaction system.
Preferably, the monitoring method of S3 includes: repeating S1 and S2 at a time interval t, where t is a set threshold; active interaction is judged to proceed when the located sight line position falls within the preset range twice in succession.
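This two-consecutive-samples rule can be written compactly. In the sketch below, `locate_gaze` and `in_preset_range` are hypothetical callables standing in for S1/S2 and the range test; they are assumptions, not names from the patent:

```python
import time


def should_interact(locate_gaze, in_preset_range, t: float) -> bool:
    """Repeat S1 and S2 at interval t (a set threshold); judge that
    active interaction should proceed only when the located sight line
    position falls within the preset range twice in succession."""
    first = locate_gaze()              # S1 + S2: capture and localize
    if not in_preset_range(first):
        return False
    time.sleep(t)                      # wait one monitoring interval
    second = locate_gaze()             # repeat S1 + S2
    return in_preset_range(second)     # two hits in a row -> interact
```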
Preferably, S4 includes the following steps (a movement-loop sketch follows the list):
S401, detecting the surrounding environment in real time;
S402, controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object according to the surrounding environment;
S403, controlling the device carrying the infrared intelligent interaction system to avoid surrounding pedestrians and obstacles according to the surrounding environment;
S404, controlling the device carrying the infrared intelligent interaction system to stop moving near the intended interaction object.
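Read as control flow, S401 to S404 amount to a sense, steer, avoid, stop loop. The sketch below is an assumption-laden outline: the `sense_environment` and `drive` interfaces and the distance figures are hypothetical stand-ins, not APIs defined by the patent:

```python
def approach_intended_object(sense_environment, drive, target,
                             stop_distance_m: float = 0.8):
    """S401-S404: move toward the intended interaction object while
    avoiding pedestrians and obstacles, then stop nearby."""
    while True:
        env = sense_environment()                      # S401: real-time sensing
        if env.distance_to(target) <= stop_distance_m:
            drive.stop()                               # S404: stop near object
            return
        heading = env.direction_to(target)             # S402: head for target
        if env.obstacle_within(heading, radius_m=0.5):
            heading = env.clear_heading_near(heading)  # S403: avoid obstacle
        drive.move(heading)
```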
Preferably, sending out the active interaction instruction in S5 includes one or more of playing interactive voice, controlling the device carrying the infrared intelligent interaction system to perform an interactive gesture operation, or playing an interactive picture.
Preferably, step S6 includes the following steps (a recognition sketch follows the list):
S601, controlling the detection devices related to gesture detection to turn toward the intended interaction object;
S602, acquiring a gesture image;
S603, performing operation recognition on the acquired gesture images to obtain recognition results.
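The patent does not fix a recognition algorithm for S603. As one hedged illustration, a visible-light frame can yield a crude gesture cue by skin-colour segmentation and convexity-defect counting with OpenCV; all thresholds below are assumptions, and this is a sketch rather than the patent's method:

```python
import cv2


def count_extended_fingers(frame_bgr) -> int:
    """Rough S603 stand-in: segment a hand by skin colour in HSV and
    count deep convexity defects as gaps between extended fingers."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))  # assumed skin band
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)   # largest blob taken as the hand
    if len(hand) < 4:
        return 0
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # defect depth d is fixed-point (1/256 px); ~40 px separates fingers
    return sum(1 for i in range(defects.shape[0])
               if defects[i, 0, 3] > 10000)
```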
Preferably, step S7 includes the following steps (an interaction-loop sketch follows the list):
S701, sending out a further active interaction instruction according to the detected gesture;
S702, performing gesture detection on the intended interaction object;
S703, repeating the above steps until the detected gesture indicates that the interaction is finished, or until no gesture is detected and the interaction object is judged to have ended the interaction.
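Read as control flow, S701 to S703 form a respond-and-listen loop that ends either on an explicit end gesture or on the object's absence. A minimal sketch follows, with `detect_gesture` and `issue_instruction` as hypothetical callables and the miss limit an assumption:

```python
def follow_up_interaction(detect_gesture, issue_instruction,
                          max_misses: int = 3) -> str:
    """S701-S703: answer each detected gesture with a further active
    interaction instruction until an 'end' gesture is seen or no
    gesture is detected for several rounds."""
    misses = 0
    while misses < max_misses:
        gesture = detect_gesture()         # S702 / S6: gesture detection
        if gesture is None:
            misses += 1                    # object may have walked away
            continue
        if gesture == "end":
            return "ended_by_gesture"      # detected gesture ends interaction
        misses = 0
        issue_instruction(gesture)         # S701: respond to the gesture
    return "ended_by_absence"              # object actively ended interaction
```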
Advantageous effects
The invention has the following advantages:
Before it is determined whether an object intends to interact, only the sight line judging module is active; this module need not include infrared generation or infrared camera devices, which saves energy and also dispels the doubts of pedestrians concerned about infrared exposure.
The invention can judge the sight line from ordinary images alone. Because the sight line judging step may require a relatively large number of camera devices (their viewing angles must cover the entire preset range of located sight line positions), not having to use infrared camera devices throughout greatly reduces the cost of the carrying equipment.
The invention first identifies the sight line region; when the region matches, it monitors how long the sight line stays in the specific region against a set threshold, and sends out an interaction instruction once the dwell time exceeds that threshold. Interaction intent is thus judged from sight line dwell time, and pedestrians without interaction intent are not disturbed. This step does not rely on actively projecting infrared or other beams onto the iris to extract features; it can be completed with ordinary eye images collected under natural light, avoiding user discomfort and disturbance to pedestrians. Tracking proceeds from the eyeball and the feature changes around it, or from changes in the angle of the iris; an eyeball-center positioning and tracking algorithm can be adopted. After the sight line is judged, the system can actively interact with pedestrians who show interaction intent. It can be used in terminals with promotion or recommendation functions, such as vending machines, to attract pedestrians and increase the interaction volume through active interaction.
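As one concrete way to realize natural-light sight line judgment without projecting infrared onto the iris, an ordinary webcam frame can be searched for eyes with OpenCV's bundled Haar cascade, and each pupil centre estimated from the darkest region of the eye patch. This is a sketch consistent with the eyeball-center approach described above, not the patent's algorithm; the intensity threshold is an assumption:

```python
import cv2

# OpenCV ships this cascade file with the opencv-python package
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")


def eye_centres(frame_bgr):
    """Return approximate (x, y) pupil centres found in a BGR frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    centres = []
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, 1.1, 5):
        roi = gray[y:y + h, x:x + w]
        # the darkest blob of the eye region approximates the pupil
        _, dark = cv2.threshold(roi, 50, 255, cv2.THRESH_BINARY_INV)
        m = cv2.moments(dark)
        if m["m00"] > 0:
            centres.append((x + m["m10"] / m["m00"],
                            y + m["m01"] / m["m00"]))
    return centres
```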
Drawings
FIG. 1 is a schematic diagram of the overall structure of an infrared intelligent interaction system according to the present invention;
FIG. 2 is a schematic diagram of the sight line judging module in an infrared intelligent interaction system according to the present invention;
FIG. 3 is a schematic diagram of the movement control module in an infrared intelligent interaction system according to the present invention;
FIG. 4 is a schematic diagram of the gesture detection module in an infrared intelligent interaction system according to the present invention;
FIG. 5 is a schematic diagram of the steps of the method for performing human-computer interaction using the infrared intelligent interaction system provided by the invention;
FIG. 6 is a schematic diagram of the step, within that method, of controlling the device carrying the infrared intelligent interaction system to move to the vicinity of the intended interaction object;
FIG. 7 is a schematic diagram of the step, within that method, of performing gesture detection on the intended interaction object;
fig. 8 is a schematic diagram of the step, within that method, of carrying out subsequent interaction according to the detected gesture.
Detailed Description
The embodiments of the present invention will be further described with reference to the accompanying drawings.
One embodiment of the present invention: as shown in fig. 1, an infrared intelligent interaction system includes a sight line judging module, a movement control module, an active interaction module, a gesture detection module and an interaction control module;
the sight line judging module is used for capturing eye images of intended interaction objects near the device carrying the infrared intelligent interaction system and judging, according to those eye images, whether to carry out active interaction; the movement control module is used for controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object when active interaction is judged to proceed; the active interaction module is used for sending out active interaction instructions; the gesture detection module is used for performing gesture detection on the intended interaction object; the interaction control module is used for controlling the active interaction module to send out an active interaction instruction, or for controlling the active interaction module to send out a corresponding active interaction instruction according to the result obtained by the gesture detection module.
The effect is as follows: the system first identifies the sight line region; when the region matches, it monitors the dwell time of the sight line in the specific region against a set threshold, sends out an interaction instruction once the dwell time exceeds the threshold, judges the object's interaction intent from the dwell time, and does not disturb pedestrians without interaction intent.
One embodiment of the present invention: as shown in FIG. 2, the sight line judging module includes an eye image capturing unit, a sight line judging unit and a time monitoring unit;
the eye image capturing unit is used for capturing eye images within its capture range; the sight line judging unit is used for processing the captured eye images and locating the sight line position; the time monitoring unit is used for monitoring, when the located sight line position of the intended interaction object falls within a preset range, the duration for which it stays within that range, and for judging according to this duration whether to carry out active interaction.
The eye images are captured by visible-light or infrared photography, and the captured eye image is an ordinary image or an infrared image; an ordinary image is more preferable.
The eye image capturing unit comprises two or more camera devices distributed on the device carrying the infrared intelligent interaction system.
The camera devices are positioned so that they can capture eye images from which any sight line falling on the device carrying the infrared intelligent interaction system can be determined.
The effect is as follows: the sight line judging step does not rely on actively projecting infrared or other beams onto the iris to extract features; it can be completed with ordinary eye images collected under natural light, avoiding user discomfort and disturbance to pedestrians. Tracking proceeds from the eyeball and the feature changes around it, or from changes in the angle of the iris; an eyeball-center positioning and tracking algorithm can be adopted. The sight line can be judged from ordinary images alone, and because the sight line judging step may require a relatively large number of camera devices (their viewing angles must cover the entire preset range of located sight line positions), not using infrared camera devices throughout greatly reduces the cost of the carrying equipment.
One embodiment of the present invention: as shown in fig. 3, the movement control module includes an environment monitoring unit, a forward control unit, an avoidance control unit and a stop control unit;
the environment monitoring unit is used for detecting the surrounding environment; the forward control unit is used for controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object according to the surrounding environment; the avoidance control unit is used for controlling the device to avoid surrounding pedestrians and obstacles according to the surrounding environment; the stop control unit is used for controlling the device to stop moving near the intended interaction object.
The effect is as follows: the device can move to the vicinity of the interaction object; through sight line detection and movement control it can interact with intended interaction objects over a larger range, reducing the missed-interaction rate for objects with interaction intent.
One embodiment of the present invention: sending out an active interaction instruction includes one or more of playing interactive voice, controlling the device carrying the infrared intelligent interaction system to perform an interactive gesture operation, or playing an interactive picture.
The effect is as follows: different interaction instructions can be issued according to environmental requirements, suiting a variety of interaction scenarios.
One embodiment of the present invention: as shown in fig. 4, the gesture detection module includes a device control unit, a gesture image acquisition unit and an operation recognition unit;
the device control unit is used for controlling the detection devices related to gesture detection to turn toward the intended interaction object; the gesture image acquisition unit is used for acquiring gesture images; the operation recognition unit is used for performing operation recognition on the acquired gesture images to obtain recognition results.
The detection devices related to gesture detection comprise a camera and may further comprise an infrared generating device; the camera may be an infrared camera.
The effect is as follows: interaction can be made accurate. Before it is determined whether an object intends to interact, only the sight line judging module is active; infrared generation and infrared camera devices can be omitted at this stage, saving energy and dispelling the doubts of pedestrians concerned about infrared exposure.
One embodiment of the present invention: as shown in fig. 5, the method for performing human-computer interaction using the infrared intelligent interaction system includes:
S1, capturing an eye image of an intended interaction object near the device carrying the infrared intelligent interaction system;
S2, processing the captured eye image and locating the sight line position;
S3, when the located sight line position falls within a preset range, monitoring the duration for which the intended interaction object's sight line stays within the preset range, and judging according to this duration whether to carry out active interaction;
S4, when active interaction is judged to proceed in S3, controlling the device carrying the infrared intelligent interaction system to move to the vicinity of the intended interaction object;
S5, sending out an active interaction instruction;
S6, performing gesture detection on the intended interaction object;
S7, carrying out subsequent interaction according to the detected gesture.
The eye images captured in S1 include all eye images from which the located sight line position can be determined to fall within a preset position range.
The preset position range includes any part of the device carrying the infrared intelligent interaction system.
The monitoring method of S3 includes: repeating S1 and S2 at a time interval t, where t is a set threshold; active interaction is judged to proceed when the located sight line position falls within the preset range twice in succession.
The effect is as follows: the system first identifies the sight line region; when the region matches, it monitors the dwell time of the sight line in the specific region against a set threshold and sends out an interaction instruction once the dwell time exceeds the threshold, judging the object's interaction intent from the dwell time and leaving pedestrians without interaction intent undisturbed. This step does not rely on actively projecting infrared or other beams onto the iris to extract features; it can be completed with ordinary eye images collected under natural light, avoiding user discomfort and disturbance to pedestrians. Tracking proceeds from the eyeball and the feature changes around it, or from changes in the angle of the iris; an eyeball-center positioning and tracking algorithm can be adopted.
One embodiment of the present invention: as shown in fig. 6, in the above embodiment:
The man-machine interaction method is as set out in steps S1 to S7 of the embodiment above, wherein S4 includes:
S401, detecting the surrounding environment in real time;
S402, controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object according to the surrounding environment;
S403, controlling the device carrying the infrared intelligent interaction system to avoid surrounding pedestrians and obstacles according to the surrounding environment;
S404, controlling the device carrying the infrared intelligent interaction system to stop moving near the intended interaction object.
In S5, sending out the active interaction instruction includes one or more of playing interactive voice, controlling the device carrying the infrared intelligent interaction system to perform an interactive gesture operation, or playing an interactive picture.
The effect is as follows: the device can move to the vicinity of the interaction object; through sight line detection and movement control it can interact with intended interaction objects over a larger range, reducing the missed-interaction rate for objects with interaction intent.
One embodiment of the present invention: as shown in fig. 7, in the above embodiment:
The man-machine interaction method is as set out in steps S1 to S7 of the embodiment above, wherein step S6 includes:
S601, controlling the detection devices related to gesture detection to turn toward the intended interaction object;
S602, acquiring a gesture image;
S603, performing operation recognition on the acquired gesture images to obtain recognition results.
The effect is as follows: the interaction accuracy is increased.
One embodiment of the present invention: as shown in fig. 8, in the above embodiment:
The man-machine interaction method is as set out in steps S1 to S7 of the embodiment above, wherein step S7 includes:
S701, sending out a further active interaction instruction according to the detected gesture;
S702, performing gesture detection on the intended interaction object;
S703, repeating the above steps until the detected gesture indicates that the interaction is finished, or until no gesture is detected and the interaction object is judged to have ended the interaction.
The effect is as follows: as set out under Advantageous Effects above, only the sight line judging module runs before interaction intent is determined, which saves energy and reassures pedestrians concerned about infrared; judging the sight line from ordinary images keeps the cost of the carrying equipment down; dwell-time thresholding leaves pedestrians without interaction intent undisturbed; and once intent is judged, active interaction can attract pedestrians to terminals with promotion or recommendation functions, such as vending machines, increasing the interaction volume.
While the preferred embodiments and examples of the present invention have been described in detail, the invention is not limited to them; various changes can be made, within the knowledge of those skilled in the art, without departing from the spirit of the invention.

Claims (10)

1. An infrared intelligent interaction system, characterized in that it comprises a sight line judging module, a movement control module, an active interaction module, a gesture detection module and an interaction control module;
the sight line judging module is used for capturing eye images of intended interaction objects near the device carrying the infrared intelligent interaction system and judging, according to those eye images, whether to carry out active interaction; the movement control module is used for controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object when active interaction is judged to proceed; the active interaction module is used for sending out active interaction instructions; the gesture detection module is used for performing gesture detection on the intended interaction object; the interaction control module is used for controlling the active interaction module to send out an active interaction instruction, or for controlling the active interaction module to send out a corresponding active interaction instruction according to the result obtained by the gesture detection module.
2. The infrared intelligent interaction system of claim 1, wherein the sight line judging module comprises an eye image capturing unit, a sight line judging unit and a time monitoring unit;
the eye image capturing unit is used for capturing eye images within its capture range; the sight line judging unit is used for processing the captured eye images and locating the sight line position; and the time monitoring unit is used for monitoring, when the located sight line position of the intended interaction object falls within a preset range, the duration for which it stays within that range, and for judging according to this duration whether to carry out active interaction.
3. The infrared intelligent interaction system of claim 1, wherein the movement control module comprises an environment monitoring unit, a forward control unit, an avoidance control unit and a stop control unit;
the environment monitoring unit is used for detecting the surrounding environment; the forward control unit is used for controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object according to the surrounding environment; the avoidance control unit is used for controlling the device to avoid surrounding pedestrians and obstacles according to the surrounding environment; and the stop control unit is used for controlling the device to stop moving near the intended interaction object.
4. The infrared intelligent interaction system of claim 1, wherein the gesture detection module comprises a device control unit, a gesture image acquisition unit and an operation recognition unit;
the device control unit is used for controlling the detection devices related to gesture detection to turn toward the intended interaction object; the gesture image acquisition unit is used for acquiring gesture images; and the operation recognition unit is used for performing operation recognition on the acquired gesture images to obtain recognition results.
5. The infrared intelligent interaction system of any one of claims 1 to 4, wherein the method for performing man-machine interaction using the infrared intelligent interaction system comprises the following steps:
S1, capturing an eye image of an intended interaction object near the device carrying the infrared intelligent interaction system;
S2, processing the captured eye image and locating the sight line position;
S3, when the located sight line position falls within a preset range, monitoring the duration for which the intended interaction object's sight line stays within the preset range, and judging according to this duration whether to carry out active interaction;
S4, when active interaction is judged to proceed in S3, controlling the device carrying the infrared intelligent interaction system to move to the vicinity of the intended interaction object;
S5, sending out an active interaction instruction;
S6, performing gesture detection on the intended interaction object;
S7, carrying out subsequent interaction according to the detected gesture.
6. The infrared intelligent interaction system of claim 5, wherein the eye images captured in S1 include all eye images from which the located sight line position can be determined to fall within a preset position range; the preset position range includes any part of the device carrying the infrared intelligent interaction system; and the monitoring method of S3 includes: repeating S1 and S2 at a time interval t, where t is a set threshold, and judging that active interaction is to proceed when the located sight line position falls within the preset range twice in succession.
7. The infrared intelligent interaction system of claim 5, wherein S4 comprises:
S401, detecting the surrounding environment in real time;
S402, controlling the device carrying the infrared intelligent interaction system to move toward the intended interaction object according to the surrounding environment;
S403, controlling the device carrying the infrared intelligent interaction system to avoid surrounding pedestrians and obstacles according to the surrounding environment;
S404, controlling the device carrying the infrared intelligent interaction system to stop moving near the intended interaction object.
8. The infrared intelligent interaction system of claim 5, wherein sending out the active interaction instruction in S5 comprises one or more of playing interactive voice, controlling the device carrying the infrared intelligent interaction system to perform an interactive gesture operation, or playing an interactive picture.
9. The infrared intelligent interaction system of claim 5, wherein step S6 comprises:
S601, controlling the detection devices related to gesture detection to turn toward the intended interaction object;
S602, acquiring a gesture image;
S603, performing operation recognition on the acquired gesture images to obtain recognition results.
10. The infrared intelligent interaction system of claim 5, wherein step S7 comprises:
S701, sending out a further active interaction instruction according to the detected gesture;
S702, performing gesture detection on the intended interaction object;
S703, repeating the above steps until the detected gesture indicates that the interaction is finished, or until no gesture is detected and the interaction object is judged to have ended the interaction.
CN202110505087.8A, filed 2021-05-10 (priority date 2021-05-10), Infrared intelligent interaction system, published as CN113093916A (pending)

Priority Applications (1)

CN202110505087.8A, priority and filing date 2021-05-10, Infrared intelligent interaction system


Publications (1)

CN113093916A, published 2021-07-09

Family

ID=76665170

Family Applications (1)

CN202110505087.8A (published as CN113093916A, pending), priority and filing date 2021-05-10, Infrared intelligent interaction system

Country Status (1)

CN: CN113093916A

Citations (4)

* Cited by examiner, † Cited by third party
Publication number, priority date, publication date, assignee, title:
US20160054805A1 *, 2013-03-29, 2016-02-25, LG Electronics Inc., Mobile input device and command input method using the same
US20170046965A1 *, 2015-08-12, 2017-02-16, Intel Corporation, Robot with awareness of users and environment for use in educational applications
CN111070214A *, 2018-10-18, 2020-04-28, LG Electronics Inc., Robot
CN112556126A *, 2020-12-14, 2021-03-26, Gree Electric Appliances Inc. of Zhuhai, Air deflector angle control method and control device and air conditioner


Similar Documents

Publication number and title
US11257223B2 (en) Systems and methods for user detection, identification, and localization within a defined space
Hsieh et al. A real time hand gesture recognition system using motion history image
US10095033B2 (en) Multimodal interaction with near-to-eye display
CN107493495A (en) Interaction locations determine method, system, storage medium and intelligent terminal
EP3477593B1 (en) Hand detection and tracking method and device
US20110273551A1 (en) Method to control media with face detection and hot spot motion
CN102200830A (en) Non-contact control system and control method based on static gesture recognition
KR20020031188A (en) Pointing direction calibration in video conferencing and other camera-based system applications
CN111597969A (en) Elevator control method and system based on gesture recognition
CN111767785A (en) Man-machine interaction control method and device, intelligent robot and storage medium
CN113934307B (en) Method for starting electronic equipment according to gestures and scenes
CN108108709B (en) Identification method and device and computer storage medium
CN112379781B (en) Man-machine interaction method, system and terminal based on foot information identification
CN104063041A (en) Information processing method and electronic equipment
CN106951077B (en) Prompting method and first electronic device
Khilari Iris tracking and blink detection for human-computer interaction using a low resolution webcam
CN115565241A (en) Gesture recognition object determination method and device
CN113093916A (en) Infrared intelligent interaction system
CN103327385A (en) Distance identification method and device based on single image sensor
TW201709022A (en) Non-contact control system and method
CN114821753B (en) Eye movement interaction system based on visual image information
CN113093907B (en) Man-machine interaction method, system, equipment and storage medium
CN115421590A (en) Gesture control method, storage medium and camera device
KR102305880B1 (en) User's Gaze Tracking Method, and Medium Being Recorded with Program for Executing the Method
CN114333056A (en) Gesture control method, system, equipment and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2021-07-09)