CN116301321A - Control method of intelligent wearable device and related device - Google Patents

Control method of intelligent wearable device and related device

Info

Publication number
CN116301321A
Authority
CN
China
Prior art keywords
wearable device
determining
infrared
plane
intelligent wearable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211722887.6A
Other languages
Chinese (zh)
Inventor
骆伟华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yingmu Technology Co., Ltd.
Original Assignee
Shenzhen Yingmu Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yingmu Technology Co., Ltd.
Priority to CN202211722887.6A
Publication of CN116301321A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a control method of an intelligent wearable device and a related device. The method includes: determining, based on the gesture (i.e., the pose) of the intelligent wearable device, a preset plane capable of data interaction with the intelligent wearable device; establishing an infrared plane parallel to the preset plane and determining a mapping relation between the infrared plane and a virtual screen of the intelligent wearable device; acquiring a user action image when the user acts on the infrared plane, the user action image carrying infrared light spots; and calculating the actual coordinates of the infrared light spots based on image processing, converting the actual coordinates into virtual coordinates on the virtual screen of the intelligent wearable device, and determining control information of the intelligent wearable device according to the virtual coordinates. Because only user action images carrying infrared light spots are processed, and only the actual coordinates of the light spots need to be calculated, the computational burden is greatly reduced compared with recognizing the full content of every image.

Description

Control method of intelligent wearable device and related device
Technical Field
The invention relates to the technical field of controlling intelligent wearable devices, and in particular to a control method of an intelligent wearable device and a related device.
Background
Existing AR smart wearable devices, such as smart glasses, typically rely on SLAM plus image-based gesture recognition to achieve positioning, interaction, and typing, which is done by clicking floating virtual buttons suspended in mid-air by hand. SLAM positioning and gesture-recognition image processing occupy a large share of the smart-glasses terminal's CPU, memory, and other resources, so a high-performance processor and a large-capacity battery are needed; as a result, the smart glasses cannot be made light and thin, and battery life is short. The approach also depends on sensors such as the camera and IMU on the glasses, which imposes high precision requirements, and because it depends on ambient light it cannot be used in darker environments. Meanwhile, SLAM plus image-based gesture recognition must monitor the user's actions in real time and faces massive amounts of image data; recognizing and processing the content of every frame requires enormous computing power. In addition, clicking a virtual button suspended in mid-air provides no real-time feedback, so the user cannot intuitively tell whether the button has been clicked.
Disclosure of Invention
In view of the above, the invention provides a control method of an intelligent wearable device and related devices, to solve the prior-art problems that realizing positioning, interaction, and typing with SLAM plus image-based gesture recognition requires a large amount of computation, places high demands on hardware, and limits use to well-lit environments. To achieve one, some, or all of the above or other objects, the invention provides a control method of an intelligent wearable device, including: determining, based on the gesture of the intelligent wearable device, a preset plane capable of data interaction with the intelligent wearable device;
establishing an infrared plane parallel to the preset plane according to the preset plane, and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device;
when a user acts on an infrared plane, acquiring a user action image, wherein the user action image is provided with infrared light spots;
and calculating the actual coordinates of the infrared light spots based on an image processing technology, converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
Optionally, the step of determining the preset plane for performing data interaction with the intelligent wearable device based on the gesture of the intelligent wearable device includes:
determining a control range capable of performing information interaction with the intelligent wearable equipment based on the gesture of the intelligent wearable equipment;
and determining a preset plane for carrying out data interaction with the intelligent wearable equipment according to the control range.
Optionally, the step of determining the control range capable of performing information interaction with the smart wearable device based on the gesture of the smart wearable device includes:
determining a current sight line range of a user based on the gesture of the intelligent wearable device;
determining a screen range of the virtual screen of the intelligent wearable device according to the sight range;
and determining a control range capable of performing information interaction with the intelligent wearable device based on the screen range.
Optionally, the step of determining a mapping relationship between the infrared plane and the virtual screen of the intelligent wearable device includes:
acquiring actual coordinates of all vertexes of the infrared plane and virtual coordinates of all vertexes of a screen range of a virtual screen of the intelligent wearable device;
matching the actual coordinates with the virtual coordinates in azimuth order to obtain at least three coordinate pairs;
and determining the mapping relation between the infrared plane and the virtual screen of the intelligent wearable device based on the coordinate pair.
Optionally, the step of acquiring a user action image, where the user action image has an infrared light spot includes:
monitoring the running state of the infrared plane, wherein the running state comprises a shielding state and a normal running state;
and when the running state of the infrared plane is a shielding state, determining the shielded target infrared emission equipment, and acquiring a user action image with the infrared light spots.
Optionally, the step of calculating the actual coordinates of the infrared light spot based on the image processing technology and converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation includes:
identifying the infrared light spots through an image processing technology, and determining the brightness of the infrared light spots;
determining an energy intensity of the infrared light spot based on the brightness;
determining a distance value of the infrared light spot from the target infrared emission device according to the energy intensity;
determining actual coordinates of the infrared light spot based on the position information of the target infrared light emitting device and a distance value of the infrared light spot from the target infrared light emitting device;
and converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation.
Optionally, before the step of determining the preset plane capable of performing data interaction with the smart wearable device based on the gesture of the smart wearable device, the method further includes:
acquiring image data of the intelligent wearable device through an image acquisition device;
and determining the gesture of the intelligent wearable device based on the image data.
In another aspect, the present application provides a control device of an intelligent wearable device, the control device including:
the data acquisition module is used for determining a preset plane capable of performing data interaction with the intelligent wearing equipment based on the gesture of the intelligent wearing equipment;
the establishing module is used for establishing an infrared plane parallel to the preset plane according to the preset plane and determining the mapping relation between the infrared plane and the virtual screen of the intelligent wearable device;
the image acquisition module is used for acquiring a user action image when a user acts on an infrared plane, wherein the user action image is provided with infrared light spots;
the control module is used for calculating the actual coordinates of the infrared light spots based on an image processing technology, converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the above control method of the intelligent wearable device.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium on which a computer program is stored; the computer program, when executed by a processor, performs the steps of the control method of the intelligent wearable device described above.
The implementation of the embodiment of the invention has the following beneficial effects:
determining, based on the gesture of the intelligent wearable device, a preset plane capable of data interaction with the intelligent wearable device; establishing an infrared plane parallel to the preset plane and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device; acquiring a user action image with infrared light spots when the user acts on the infrared plane; and calculating the actual coordinates of the infrared light spots based on image processing, converting the actual coordinates into virtual coordinates on the virtual screen based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates. Because only user action images carrying infrared light spots are processed, real-time monitoring of all user actions is avoided, the amount of computation is reduced, and accidental operations caused by incidental user movements are also reduced; compared with recognizing the full content of every image, the computational burden is greatly lowered.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Wherein:
fig. 1 is a flowchart of a control method of an intelligent wearable device provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a control system of a smart wearable device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a control device of an intelligent wearable device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a storage medium according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1, an embodiment of the present application provides a control method of an intelligent wearable device, including:
s101, determining a preset plane capable of performing data interaction with intelligent wearing equipment based on the gesture of the intelligent wearing equipment;
s102, establishing an infrared plane parallel to the preset plane according to the preset plane, and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device;
s103, when a user acts on an infrared plane, acquiring a user action image, wherein the user action image is provided with infrared light spots;
s104, calculating actual coordinates of the infrared light spots based on an image processing technology, converting the actual coordinates into virtual coordinates of a virtual screen of the intelligent wearable device based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
Determining, based on the gesture of the intelligent wearable device, a preset plane capable of data interaction with the intelligent wearable device; establishing an infrared plane parallel to the preset plane and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device; acquiring a user action image with infrared light spots when the user acts on the infrared plane; and calculating the actual coordinates of the infrared light spots based on image processing, converting the actual coordinates into virtual coordinates on the virtual screen based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates. Because only user action images carrying infrared light spots are processed, real-time monitoring of all user actions is avoided, the amount of computation is reduced, and accidental operations caused by incidental user movements are also reduced; compared with recognizing the full content of every image, the computational burden is greatly lowered.
In a possible implementation manner, the step of determining a preset plane for performing data interaction with the smart wearable device based on the gesture of the smart wearable device includes:
determining a control range capable of performing information interaction with the intelligent wearable equipment based on the gesture of the intelligent wearable equipment;
and determining a preset plane for carrying out data interaction with the intelligent wearable equipment according to the control range.
By way of example, the gesture of the intelligent wearable device can be obtained through the outside-in positioning mode of an external positioning device; the control range capable of information interaction with the intelligent wearable device is then determined based on that gesture, and the preset plane for data interaction with the intelligent wearable device is determined from the control range, the preset plane being chosen as, for example, a desktop or a wall surface.
In a possible implementation manner, the step of determining the control range capable of performing information interaction with the smart wearable device based on the gesture of the smart wearable device includes:
determining a current sight line range of a user based on the gesture of the intelligent wearable device;
determining a screen range of the virtual screen of the intelligent wearable device according to the sight range;
and determining a control range capable of performing information interaction with the intelligent wearable device based on the screen range.
For example, the user's current sight range is determined based on the gesture of the intelligent wearable device; say the current sight range is A. The virtual screen of the intelligent wearable device is set at the central position of sight range A, its screen range being B, with the boundary of screen range B a preset distance inside the boundary of sight range A; the control range C on the desktop or wall surface capable of information interaction with the intelligent wearable device is then determined based on screen range B, as sketched below.
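A minimal sketch of this containment relation, assuming rectangular ranges and a made-up margin (the patent fixes neither units nor a margin value):

```python
# Hypothetical sketch: centre screen range B inside sight range A so that
# B's boundary lies a preset distance (margin) inside A's boundary.
# All dimensions below are illustrative assumptions.
def screen_range(sight_width: float, sight_height: float, margin: float):
    """Return (x, y, width, height) of screen range B inside sight range A."""
    return (margin, margin, sight_width - 2 * margin, sight_height - 2 * margin)

print(screen_range(1.2, 0.8, 0.1))   # -> (0.1, 0.1, 1.0, 0.6)
```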
In a possible implementation manner, the step of determining the mapping relationship between the infrared plane and the virtual screen of the intelligent wearable device includes:
acquiring actual coordinates of all vertexes of the infrared plane and virtual coordinates of all vertexes of a screen range of a virtual screen of the intelligent wearable device;
matching according to the actual coordinates and the virtual coordinates and the azimuth sequence to obtain at least three pairs of coordinate pairs;
and determining the mapping relation between the infrared plane and the virtual screen of the intelligent wearable device based on the coordinate pair.
The infrared plane is obtained by projecting infrared beams, parallel to the plane of control range C, over control range C, and its size coincides with that of control range C; that is, control range C can be regarded as the projection of the infrared plane onto the plane where control range C lies, along the direction perpendicular to control range C. Matching the actual coordinates with the virtual coordinates in azimuth order yields at least three coordinate pairs: for example, an upper-left pair comprising the upper-left vertex of the infrared plane and the upper-left vertex of the virtual screen, a lower-left pair comprising the two lower-left vertices, an upper-right pair comprising the two upper-right vertices, and a lower-right pair comprising the lower-right vertex of the infrared plane and the lower-right vertex of the virtual screen. The mapping relation between the infrared plane and the virtual screen of the intelligent wearable device is then determined from these coordinate pairs, as in the sketch below.
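As a hedged illustration rather than the patent's own implementation, the mapping relation derived from matched corner pairs can be realized as a planar homography; with four pairs, OpenCV computes it directly. The corner values below are made-up examples.

```python
# Sketch: derive the infrared-plane -> virtual-screen mapping from four
# matched corner pairs (top-left, top-right, bottom-right, bottom-left).
import cv2
import numpy as np

# actual coordinates of the infrared plane's vertices (e.g. centimetres)
ir_corners = np.float32([[0, 0], [40, 0], [40, 25], [0, 25]])
# virtual coordinates of the virtual screen's vertices (pixels), same order
screen_corners = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

# homography realizing the mapping relation between the two planes
M = cv2.getPerspectiveTransform(ir_corners, screen_corners)

# convert one actual coordinate (a touch on the desktop) to a virtual one
touch = np.float32([[[10.0, 5.0]]])              # shape (1, 1, 2)
virtual = cv2.perspectiveTransform(touch, M)[0, 0]
print(virtual)                                   # -> [480. 216.]
```

With only three pairs, an affine transform (cv2.getAffineTransform) would serve the same purpose, which matches the claim's minimum of at least three coordinate pairs.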
In one possible implementation manner, the step of acquiring a user action image, where the user action image has an infrared light spot, includes:
monitoring the running state of the infrared plane, wherein the running state comprises a shielding state and a normal running state;
and when the running state of the infrared plane is a shielding state, determining the shielded target infrared emission equipment, and acquiring a user action image with the infrared light spots.
For example, all of the user's actions need not be monitored in real time; only the actions with which the user interacts with the infrared plane are captured, yielding a user action image with infrared light spots. A sketch of the occlusion check follows.
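A minimal sketch of this monitoring step, under the assumption that the infrared camera frame stays dark in the normal running state and shows a bright reflection spot only when a beam is blocked; the threshold and frame format are illustrative.

```python
# Sketch of the occlusion check on one infrared camera frame.
import numpy as np

SPOT_THRESHOLD = 200              # 8-bit grey level; assumed, tuned per camera

def plane_state(ir_frame: np.ndarray) -> str:
    """Return 'occluded' if the frame contains a reflected infrared spot."""
    return "occluded" if ir_frame.max() >= SPOT_THRESHOLD else "normal"

frame = np.zeros((480, 640), dtype=np.uint8)   # normal running state: dark
frame[240, 320] = 255                          # simulate a fingertip reflection
print(plane_state(frame))                      # -> occluded
```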
In one possible implementation manner, the step of calculating the actual coordinates of the infrared light spot based on the image processing technology and converting the actual coordinates into virtual coordinates of the virtual screen of the smart wearable device based on the mapping relationship includes:
identifying the infrared light spots through an image processing technology, and determining the brightness of the infrared light spots;
determining an energy intensity of the infrared light spot based on the brightness;
determining a distance value of the infrared light spot from the target infrared emission device according to the energy intensity;
determining actual coordinates of the infrared light spot based on the position information of the target infrared light emitting device and a distance value of the infrared light spot from the target infrared light emitting device;
and converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation.
For example, since the brightness of the infrared light has a linear relationship with its propagation distance, and the brightness is proportional to the energy intensity of the infrared light spot, the propagation distance can be deduced back from the spot's energy intensity, giving the Y value of the spot's actual coordinate; the X value is obtained from the position information of the target infrared emission device. Together these give the complete actual coordinate of the infrared light spot, i.e. the coordinate of the actual position the user operated, and the actual coordinate is converted into the virtual coordinate on the virtual screen of the intelligent wearable device based on the mapping relation, as in the sketch below.
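A minimal sketch of this back-calculation under the linear brightness-to-distance model described above; the calibration constants a and b are made-up values that a real emitter and camera pair would have to be calibrated for.

```python
# Sketch: invert the assumed linear relation between spot brightness and
# propagation distance, then assemble the actual coordinate of the spot.
def spot_actual_coordinates(brightness: float, emitter_x: float,
                            a: float = -0.25, b: float = 80.0):
    # brighter spot => shorter propagation distance (linear model, assumed)
    distance_cm = a * brightness + b       # Y value: range along the beam
    return (emitter_x, distance_cm)        # X value: the emitter's position

# a spot of grey level 220 on the beam of the emitter located at x = 12 cm
x, y = spot_actual_coordinates(brightness=220.0, emitter_x=12.0)
print(x, y)   # -> 12.0 25.0
```

The resulting actual coordinate would then be pushed through the mapping relation, for instance the homography M of the earlier sketch, to obtain the virtual coordinate.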
In a possible implementation manner, before the step of determining the preset plane capable of performing data interaction with the smart wearable device based on the gesture of the smart wearable device, the method further includes:
acquiring image data of the intelligent wearable device through an image acquisition device;
and determining the gesture of the intelligent wearable device based on the image data.
As an example, a control method of an intelligent wearable device for determining the gesture of the intelligent wearable device includes:
acquiring image data of the target intelligent wearable device through the image acquisition device, wherein the image data is any frame of picture;
for example, the image data of the target intelligent wearable device is shot by the positioning box, namely, the binocular camera on the image acquisition device, and any frame of picture in the image data is selected;
identifying preset mark points on target intelligent wearable equipment in the image data, and determining position information of the preset mark points;
the method comprises the steps of identifying preset mark points on target intelligent wearing equipment in image data according to an image identification technology, and identifying the infrared mark points on the target intelligent wearing equipment in the image data by receiving the infrared identification technology when the preset mark points are infrared mark points;
drawing a virtual image of the target intelligent wearing equipment in a three-dimensional space coordinate system based on the position information, and determining the gesture of the target intelligent wearing equipment according to the virtual image;
the three-dimensional space coordinates and the gesture of the target intelligent wearable device are calculated according to the position information of the preset mark points through a triangulation principle;
and determining control information of the target intelligent wearing equipment according to the gesture of the target intelligent wearing equipment so as to complete control of the target intelligent wearing equipment.
Image data of the target intelligent wearable device is acquired by the image acquisition device, the image data being any one frame; the preset mark points on the target intelligent wearable device in the image data are identified and their position information is determined; a virtual image of the target intelligent wearable device is drawn in a three-dimensional space coordinate system based on the position information, and the gesture of the target intelligent wearable device is determined from the virtual image; control information of the target intelligent wearable device is then determined from that gesture to complete its control. Because the outside-in positioning mode is adopted, i.e. each frame is positioned independently, accumulated error is avoided and positioning precision is improved. A sketch of the triangulation step follows.
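A minimal sketch of the triangulation principle mentioned above, assuming a calibrated infrared binocular camera on the positioning box; the intrinsics, the 10 cm baseline, and the pixel measurements are illustrative stand-ins, not calibration data from the patent.

```python
# Sketch: recover one marker's 3-D position from its pixel coordinates in
# the positioning box's left and right infrared cameras.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],       # assumed camera intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-0.10], [0.0], [0.0]])])

# pixel position of one infrared LED mark point in each image
uv_left = np.array([[400.0], [300.0]])
uv_right = np.array([[360.0], [300.0]])

point_h = cv2.triangulatePoints(P_left, P_right, uv_left, uv_right)
point = (point_h[:3] / point_h[3]).ravel()   # homogeneous -> Euclidean
print(point)   # -> [0.2 0.15 2.] metres in the left camera's frame
```

Repeating this for every visible mark point yields the three-dimensional coordinate data from which the virtual image and the gesture of the device are derived.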
In one possible embodiment, before the step of acquiring the image data of the target intelligent wearable device through the image acquisition device, the method further includes:
arranging at least four preset mark points on the target intelligent wearable device based on its structural characteristics and determining the relative positional relation among the preset mark points, where the preset mark points include a first and a second preset mark point used to characterize the horizontal state of the target intelligent wearable device, and further include a third and a fourth preset mark point used to characterize its vertical state.
For example, four preset mark points are set on the target intelligent wearable device based on its structural characteristics; the four points are connected pairwise to obtain connecting lines, and the relative positional relation among the preset mark points is determined from the lengths and angles of those lines. The upper and lower preset mark points are used to characterize the vertical state of the target intelligent wearable device, and the left and right preset mark points are used to characterize its horizontal state.
In one possible implementation, the step of identifying a preset mark point on the target smart wearable device in the image data and determining the position information of the preset mark point includes:
identifying the preset mark points on the target intelligent wearable device in the image data to obtain target preset mark points, and determining the position information of the target preset mark points;
and obtaining the position information of the unrecognized hidden preset mark points on the target intelligent wearing equipment based on the position information of the target preset mark points and the relative position relation among the preset mark points, and obtaining the position information of all the preset mark points on the target intelligent wearing equipment.
For example, because the angle between the target intelligent wearable device and the positioning box is uncertain, the image of the device captured by the positioning box often cannot show the device completely. For instance, when the positioning box captures the device from its left side, the acquired image corresponds to the device's left view, so only the left preset mark point and the upper and lower preset mark points, i.e. the target preset mark points, are identified. The position information of the unidentified, hidden preset mark point, i.e. the right mark point, is then obtained by combining the position information of the left, upper, and lower preset mark points with the relative positional relation between those points and the unidentified right preset mark point, as in the sketch below.
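A minimal sketch of recovering the hidden mark point, under the illustrative assumption that the upper/lower pair is centred between the left and right points; in that layout right = up + down - left holds under any rigid motion, so the relation survives arbitrary device poses.

```python
# Sketch: infer the occluded right mark point from the three visible ones,
# using the (assumed) symmetric relative layout of the four markers.
import numpy as np

up   = np.array([0.02, 0.31, 1.98])   # triangulated positions in metres
down = np.array([0.02, 0.25, 1.98])
left = np.array([-0.05, 0.28, 1.97])

right = up + down - left              # hidden marker, from the layout relation
print(right)                          # -> [0.09 0.28 1.99]
```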
In one possible implementation, the step of drawing a virtual image of the target smart wearable device in a three-dimensional space coordinate system based on the location information includes:
constructing a three-dimensional space coordinate system by taking the image acquisition equipment as an origin, and converting the position information of all preset mark points on the target intelligent wearable equipment into three-dimensional coordinate data;
and drawing a virtual image of the target intelligent wearable device in a three-dimensional space coordinate system based on the three-dimensional coordinate data.
For example, a three-dimensional space coordinate system is constructed with the image acquisition device, i.e. the positioning box, as the origin; the position information of all preset mark points on the target intelligent wearable device is converted into three-dimensional coordinate data based on the distances between the mark points and the positioning box together with the mark points' position information, and the virtual image of the target intelligent wearable device is drawn in the coordinate system from that three-dimensional coordinate data.
In one possible implementation, the step of determining the pose of the target smart wearable device from the virtual image includes:
determining the position of a virtual screen on the target intelligent wearable device based on the virtual image;
and determining the gesture of the target intelligent wearable device according to the position of the virtual screen.
For example, the position of the virtual screen on the target intelligent wearable device, i.e. its display range, angle, coordinate data, and so on, is determined from the virtual image, and the gesture of the target intelligent wearable device is determined from the position of the virtual screen.
In one possible implementation manner, the step of determining the control information of the target smart wearable device according to the gesture of the target smart wearable device includes:
determining a control range capable of performing information interaction with the target intelligent wearing equipment based on the gesture of the target intelligent wearing equipment;
and determining control information of the target intelligent wearable device according to the action information of the user in the control range.
For example, the control range capable of information interaction with the target intelligent wearable device is determined from the device's gesture. Say the device is tilted by 30°: the control range may then be set within the angular span of 0° to 30° and within a length of 0.5 metres, so that a user action made at an angle of 45° or at a distance of 0.6 metres from the device performs no information interaction with it, as the sketch below illustrates.
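A minimal sketch of the range test in this example; the 0° to 30° span and the 0.5 m length are the example's illustrative bounds, not fixed parameters of the method.

```python
# Sketch: decide whether a user action falls inside the control range.
def in_control_range(angle_deg: float, distance_m: float) -> bool:
    return 0.0 <= angle_deg <= 30.0 and distance_m <= 0.5

print(in_control_range(45.0, 0.6))   # -> False, the example's rejected action
print(in_control_range(20.0, 0.4))   # -> True, inside the control range
```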
In one possible implementation, the step of determining, based on the pose of the target smart wearable device, a control range capable of information interaction with the target smart wearable device includes:
determining a current sight range of a user based on the gesture of the target intelligent wearable device;
an infrared plane perpendicular to the user's head-up line of sight is constructed within the sight range, yielding an infrared plane that coincides with the user's current sight range and can interact with the target intelligent wearable device. Information interaction with the device then proceeds as follows: when a user action pierces the infrared plane, the interaction information between the user and the target intelligent wearable device is determined from the position at which the plane is pierced.
For example, the user's current sight range is determined from the gesture of the target intelligent wearable device: when the device is tilted by 40°, an action performed at point A corresponds to key B on the virtual screen, whereas when the device is tilted by 45°, the same action at point A corresponds to key Z. The current sight range is therefore determined from the device's gesture, and an infrared plane perpendicular to the user's head-up line of sight is constructed within that range, so that the infrared plane used for interaction follows changes in the device's gesture.
By way of example, AR interaction and typing are realized by combining outside-in positioning with infrared laser light. Because a laser emits the light and the captured images are processed, only the specific light spots in each image need handling; compared with a pure image-based gesture recognition technique that recognizes the user's hands directly, the computational burden is greatly reduced, and the infrared laser can also be used in dark environments. The linear infrared beam emitted by the infrared emitter is projected parallel to the desktop. While the user's hand does not touch the desktop, the picture captured by the infrared camera contains no light spot; when a finger touches the desktop it blocks the infrared beam, and the reflected infrared light enters the infrared camera and forms a light spot in the image (the infrared camera filters out visible light, so only infrared light of a specific waveband can enter). The image coordinates of the light spot are recognized and calculated by image processing, and the position of the user's finger on the desktop is deduced back from them, thereby enabling interaction. Because the interaction area lies on the desktop plane, the desktop can be touched accurately, and combined with the animations and sound effects of the software interaction system this provides real-time feedback, realizing efficient input. A sketch of the spot recognition follows.
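A minimal sketch of the spot recognition described above: threshold the filtered infrared frame and take the image-moment centroid of the bright blob. The threshold and the simulated frame are illustrative; the centroid would then be converted to a desktop position, and on to the virtual screen, through the calibrated mapping (e.g. the homography of the earlier sketch).

```python
# Sketch: locate the infrared light spot formed by a fingertip touch.
import cv2
import numpy as np

def find_spot_centroid(ir_frame: np.ndarray):
    """Return the (x, y) image coordinates of the spot, or None if absent."""
    _, mask = cv2.threshold(ir_frame, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None                          # no finger touching the desktop
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (320, 240), 3, 255, -1)    # simulated fingertip reflection
print(find_spot_centroid(frame))             # -> approximately (320.0, 240.0)
```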
In one possible embodiment, as shown in fig. 2, the present application provides a control system of a smart wearable device, the control system including a positioning box and smart glasses connected via Bluetooth.
The infrared binocular camera is arranged on the positioning box and works together with the infrared LED marks on the smart glasses to locate the smart glasses' position and posture. The infrared camera locates the position and operation of the user's fingers. The linear infrared laser emitter emits infrared laser light parallel to the desktop; this light is captured and processed by the infrared camera to obtain the position and operation of the user's fingers. The preset infrared LED mark points consist of several infrared light-emitting diodes. The optical display is the display component of the smart glasses, through which the user sees the virtual keyboard superimposed on the desktop.
The typewriting scene flow mainly comprises the following steps:
an infrared binocular camera collects and processes images.
Infrared camera recognizes infrared LED mark point
The positioning boxes are positioned to the positions and the postures of the intelligent glasses through the infrared LED mark points and are sent to the intelligent glasses through Bluetooth, the intelligent glasses receive positioning information sent by the positioning boxes, virtual keyboards are overlapped in a desktop through the optical display, and a user clicks corresponding character virtual keys according to the virtual keyboards. The linear infrared laser transmitter transmits laser, and the infrared camera acquires gesture images in an infrared plane and processes the gesture images, so that finger actions of a user are recognized, input information is judged, and the input information is sent to the intelligent glasses through Bluetooth, and a typing input process is completed.
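A minimal sketch of the key-resolution step referenced above: once the light spot has been mapped to a virtual-screen coordinate, the input character is the key whose rectangle contains that coordinate. The key geometry below is made up for illustration.

```python
# Sketch: resolve a virtual-screen coordinate to the virtual key it hits.
KEYS = {                 # key -> (x, y, width, height) on the virtual screen
    "A": (100, 500, 60, 60),
    "S": (170, 500, 60, 60),
}

def key_at(vx: float, vy: float):
    for char, (x, y, w, h) in KEYS.items():
        if x <= vx < x + w and y <= vy < y + h:
            return char
    return None                               # spot outside the keyboard

print(key_at(190, 520))   # -> S
```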
Mouse and gesture interaction scene:
In the mouse and gesture interaction scene the user does not need to look at a virtual keyboard; interaction is performed simply by sliding one or more fingers within the sensing area in front of the positioning box.
The linear infrared laser emitter emits laser light, and the infrared camera collects and processes gesture images with infrared light spots, thereby recognizing the user's finger actions, determining the input information, and sending it to the smart glasses via Bluetooth to complete one interactive input.
In a possible implementation manner, as shown in fig. 3, the present application provides a control device of an intelligent wearable apparatus, where the control device includes:
the data acquisition module 201 is configured to determine a preset plane capable of performing data interaction with the intelligent wearable device based on a gesture of the intelligent wearable device;
the establishing module 202 is configured to establish an infrared plane parallel to the preset plane according to the preset plane, and determine a mapping relationship between the infrared plane and the virtual screen of the intelligent wearable device;
the image acquisition module 203 is configured to acquire a user action image when a user performs an action in an infrared plane, where the user action image has an infrared light spot;
the control module 204 is configured to calculate an actual coordinate of the infrared light spot based on an image processing technology, convert the actual coordinate into a virtual coordinate of a virtual screen of the intelligent wearable device based on the mapping relationship, and further determine control information of the intelligent wearable device according to the virtual coordinate.
In one possible implementation, as shown in fig. 4, an embodiment of the present application provides an electronic device 300, including a memory 310, a processor 320, and a computer program 311 stored on the memory 310 and executable on the processor 320, where the processor 320, when executing the computer program 311, implements: determining a preset plane capable of data interaction with the intelligent wearable device based on the gesture of the intelligent wearable device; establishing an infrared plane parallel to the preset plane and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device; acquiring a user action image with infrared light spots when the user acts on the infrared plane; and calculating the actual coordinates of the infrared light spots based on image processing, converting the actual coordinates into virtual coordinates on the virtual screen based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
In one possible implementation, as shown in fig. 5, the present embodiment provides a computer-readable storage medium 400 on which a computer program 411 is stored; the computer program 411, when executed by a processor, implements: determining a preset plane capable of data interaction with the intelligent wearable device based on the gesture of the intelligent wearable device; establishing an infrared plane parallel to the preset plane and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device; acquiring a user action image with infrared light spots when the user acts on the infrared plane; and calculating the actual coordinates of the infrared light spots based on image processing, converting the actual coordinates into virtual coordinates on the virtual screen based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It will be appreciated by those of ordinary skill in the art that the modules or steps of the invention described above may be implemented in a general purpose computing device, they may be centralized on a single computing device, or distributed over a network of computing devices, or they may alternatively be implemented in program code executable by a computer device, such that they are stored in a memory device and executed by the computing device, or they may be separately fabricated as individual integrated circuit modules, or multiple modules or steps within them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of numerous obvious changes, rearrangements and substitutions without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.
The foregoing disclosure is illustrative of the present invention and is not to be construed as limiting the scope of the invention, which is defined by the appended claims.

Claims (10)

1. A control method of an intelligent wearable device, characterized by comprising the following steps:
determining a preset plane capable of performing data interaction with the intelligent wearing equipment based on the gesture of the intelligent wearing equipment;
establishing an infrared plane parallel to the preset plane according to the preset plane, and determining a mapping relation between the infrared plane and the virtual screen of the intelligent wearable device;
when a user acts on an infrared plane, acquiring a user action image, wherein the user action image is provided with infrared light spots;
and calculating the actual coordinates of the infrared light spots based on an image processing technology, converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
2. The method for controlling a smart wearable device according to claim 1, wherein the step of determining a preset plane for data interaction with the smart wearable device based on the posture of the smart wearable device comprises:
determining a control range capable of performing information interaction with the intelligent wearable equipment based on the gesture of the intelligent wearable equipment;
and determining a preset plane for carrying out data interaction with the intelligent wearable equipment according to the control range.
3. The method for controlling a smart wearable device according to claim 2, wherein the step of determining a control range capable of information interaction with the smart wearable device based on the posture of the smart wearable device includes:
determining a current sight line range of a user based on the gesture of the intelligent wearable device;
determining a screen range of the virtual screen of the intelligent wearable device according to the sight range;
and determining a control range capable of performing information interaction with the intelligent wearable device based on the screen range.
4. The method for controlling a smart wearable device according to claim 1, wherein the step of determining a mapping relationship between the infrared plane and the virtual screen of the smart wearable device comprises:
acquiring actual coordinates of all vertexes of the infrared plane and virtual coordinates of all vertexes of a screen range of a virtual screen of the intelligent wearable device;
matching the actual coordinates with the virtual coordinates in azimuth order to obtain at least three coordinate pairs;
and determining the mapping relation between the infrared plane and the virtual screen of the intelligent wearable device based on the coordinate pair.
5. The method for controlling a smart wearable device according to claim 1, wherein the step of acquiring a user action image, the user action image having an infrared light spot, comprises:
monitoring the running state of the infrared plane, wherein the running state comprises a shielding state and a normal running state;
and when the running state of the infrared plane is a shielding state, determining the shielded target infrared emission equipment, and acquiring a user action image with the infrared light spots.
6. The method for controlling a smart wearable device according to claim 5, wherein the step of calculating the actual coordinates of the infrared light spot based on the image processing technique and converting the actual coordinates into virtual coordinates of a virtual screen of the smart wearable device based on the mapping relationship comprises:
identifying the infrared light spots through an image processing technology, and determining the brightness of the infrared light spots;
determining an energy intensity of the infrared light spot based on the brightness;
determining a distance value of the infrared light spot from the target infrared emission device according to the energy intensity;
determining actual coordinates of the infrared light spot based on the position information of the target infrared light emitting device and a distance value of the infrared light spot from the target infrared light emitting device;
and converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation.
7. The method for controlling a smart wearable device according to claim 1, further comprising, before the step of determining a preset plane capable of data interaction with the smart wearable device based on the posture of the smart wearable device:
acquiring image data of the intelligent wearable device through an image acquisition device;
and determining the gesture of the intelligent wearable device based on the image data.
8. A control device of an intelligent wearable device, characterized in that the control device comprises:
the data acquisition module is used for determining a preset plane capable of performing data interaction with the intelligent wearing equipment based on the gesture of the intelligent wearing equipment;
the establishing module is used for establishing an infrared plane parallel to the preset plane according to the preset plane and determining the mapping relation between the infrared plane and the virtual screen of the intelligent wearable device;
the image acquisition module is used for acquiring a user action image when a user acts on an infrared plane, wherein the user action image is provided with infrared light spots;
the control module is used for calculating the actual coordinates of the infrared light spots based on an image processing technology, converting the actual coordinates into virtual coordinates of the virtual screen of the intelligent wearable device based on the mapping relation, and further determining control information of the intelligent wearable device according to the virtual coordinates.
9. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine readable instructions executable by the processor, the processor and the memory in communication over the bus when the electronic device is operating, the machine readable instructions when executed by the processor performing the steps of the method of controlling a smart wearable device as claimed in any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the method of controlling a smart wearable device as claimed in any one of claims 1 to 7.
CN202211722887.6A 2022-12-30 2022-12-30 Control method of intelligent wearable device and related device Pending CN116301321A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211722887.6A CN116301321A (en) 2022-12-30 2022-12-30 Control method of intelligent wearable device and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211722887.6A CN116301321A (en) 2022-12-30 2022-12-30 Control method of intelligent wearable device and related device

Publications (1)

Publication Number Publication Date
CN116301321A (en)

Family

ID=86822910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211722887.6A Pending CN116301321A (en) 2022-12-30 2022-12-30 Control method of intelligent wearable device and related device

Country Status (1)

Country Link
CN (1) CN116301321A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117138353A (en) * 2023-09-08 2023-12-01 广州火石传娱科技有限公司 Coordinate image processing method and system applied to toy gun interaction system
CN117138353B (en) * 2023-09-08 2024-04-19 广州火石传娱科技有限公司 Coordinate image processing method and system applied to toy gun interaction system

Similar Documents

Publication Publication Date Title
CN111949111B (en) Interaction control method and device, electronic equipment and storage medium
WO2016017932A1 (en) Method and apparatus for providing interface recognizing movement in accordance with user's view
US11941207B2 (en) Touch control method for display, terminal device, and storage medium
US20150138086A1 (en) Calibrating control device for use with spatial operating system
CN110197461B (en) Coordinate conversion relation determining method, device, equipment and storage medium
CN104166509A (en) Non-contact screen interaction method and system
US11449196B2 (en) Menu processing method, device and storage medium in virtual scene
KR100907104B1 (en) Calculation method and system of pointing locations, and collaboration system comprising it
JP6127564B2 (en) Touch determination device, touch determination method, and touch determination program
CN116301321A (en) Control method of intelligent wearable device and related device
CN202159302U (en) Augment reality system with user interaction and input functions
CN114186007A (en) High-precision map generation method and device, electronic equipment and storage medium
CN114564106B (en) Method and device for determining interaction indication line, electronic equipment and storage medium
JP2017219942A (en) Contact detection device, projector device, electronic blackboard system, digital signage device, projector device, contact detection method, program and recording medium
CN115847384B (en) Mechanical arm safety plane information display method and related products
CN111915642A (en) Image sample generation method, device, equipment and readable storage medium
CN111814651A (en) Method, device and equipment for generating lane line
CN111462072A (en) Dot cloud picture quality detection method and device and electronic equipment
CN114706487A (en) Character input method and device, electronic equipment and readable storage medium
CN116301320A (en) Control method of intelligent wearable device and related device
US11562549B2 (en) System and method for user interaction in complex web 3D scenes
CN111047710B (en) Virtual reality system, interactive device display method, and computer-readable storage medium
WO2021190421A1 (en) Virtual reality-based controller light ball tracking method on and virtual reality device
CN113758481A (en) Grid map generation method, device, system, storage medium and electronic equipment
CN113325950A (en) Function control method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination