CN112965592A - Equipment interaction method, device and system - Google Patents
- Publication number
- CN112965592A (application CN202110204497.9A)
- Authority
- CN
- China
- Prior art keywords
- operation instruction
- user
- limb action
- information
- interactive
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y20/00—Information sensed or collected by the things
- G16Y20/10—Information sensed or collected by the things relating to the environment, e.g. temperature; relating to location
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y20/00—Information sensed or collected by the things
- G16Y20/20—Information sensed or collected by the things relating to the thing itself
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/10—Detection; Monitoring
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/20—Analytics; Diagnosis
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/30—Control
- G16Y40/35—Management of things, i.e. controlling in accordance with a policy or in order to achieve specified objectives
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
Abstract
The invention discloses a device interaction method, apparatus and system, applicable to the financial field and other technical fields. The device interaction method comprises the following steps: acquiring position information of a user in a preset environment; acquiring first limb action information of the user; determining the interactive device corresponding to the user's position information according to a preset correspondence between positions and interactive devices; searching the first operation instruction list of the corresponding interactive device for the operation instruction corresponding to the user's first limb action information, wherein the first operation instruction list records the correspondence between first limb actions and operation instructions; and sending the operation instruction corresponding to the user's first limb action information to the interactive device corresponding to that instruction. The invention accurately realizes interaction with devices based on the user's position and limb actions.
Description
Technical Field
The invention relates to the technical field of the Internet of Things, and in particular to a device interaction method, apparatus and system.
Background
Currently, with the rapid development of smart homes and the industrial Internet of Things, demand for perceptible, intelligent, real-time contactless interaction is growing. Because of the limited precision of existing indoor positioning technology, the positions of users and devices cannot be judged accurately, so many interactive operations that depend on relative position cannot be completed. The prior art lacks a method for accurately interacting with devices based on the user's position.
Disclosure of Invention
The present invention provides a device interaction method, apparatus and system for solving the technical problems in the background art.
In order to achieve the above object, according to an aspect of the present invention, there is provided a device interaction method including:
acquiring position information of a user in a preset environment, wherein the position information is obtained by a UWB base station arranged in the preset environment locating a UWB chip in a portable device carried by the user;
acquiring first limb action information of the user, wherein the first limb action information is detected by a motion sensor in the portable device;
determining the interactive device corresponding to the user's position information according to a preset correspondence between positions and interactive devices;
searching the first operation instruction list of the corresponding interactive device for the operation instruction corresponding to the user's first limb action information, wherein the first operation instruction list records the correspondence between first limb actions and operation instructions;
and sending the operation instruction corresponding to the user's first limb action information to the interactive device corresponding to that instruction.
Optionally, the device interaction method further includes:
acquiring image information of the user in the preset environment;
and identifying second limb action information of the user according to the image information and a preset action identification model.
Optionally, the device interaction method further includes:
searching the second operation instruction list of the corresponding interactive device for the operation instruction corresponding to the combination of the user's first limb action information and second limb action information, wherein the second operation instruction list records the correspondence between combinations of first and second limb actions and operation instructions;
and sending the operation instruction corresponding to the combination of the user's first and second limb action information to the interactive device corresponding to that instruction.
Optionally, the portable device includes: at least one of a handheld device, a hand-worn device, a wrist-worn device, and an arm-worn device.
Optionally, the position information is three-dimensional position information, obtained by at least four fixed-position UWB base stations arranged in the preset environment locating the UWB chip.
Optionally, the motion sensor includes: a nine-axis sensor.
In order to achieve the above object, according to another aspect of the present invention, there is provided a device interaction apparatus, including:
a position information acquisition module, used for acquiring position information of a user in a preset environment, wherein the position information is obtained by a UWB base station arranged in the preset environment locating a UWB chip in a portable device carried by the user;
the first limb action information acquisition module is used for acquiring first limb action information of the user, wherein the first limb action information is detected by a motion sensor in the portable device;
the interactive device determining module is used for determining the interactive device corresponding to the user's position information according to a preset correspondence between positions and interactive devices;
the first operation instruction determining module is used for searching the first operation instruction list of the corresponding interactive device for the operation instruction corresponding to the user's first limb action information, wherein the first operation instruction list records the correspondence between first limb actions and operation instructions;
and the first operation instruction sending module is used for sending the operation instruction corresponding to the user's first limb action information to the interactive device corresponding to that instruction.
In order to achieve the above object, according to another aspect of the present invention, there is provided a device interaction system including: the device comprises a portable device, a UWB base station and a data analysis system, wherein the portable device is provided with a UWB chip and a motion sensor;
the UWB base station is used for carrying out positioning detection on a UWB chip in portable equipment carried by a user to obtain position information of the user in a preset environment and sending the position information to the data analysis system;
the portable equipment is used for sending the first limb action information of the user, which is detected by the motion sensor, to the data analysis system;
the data analysis system is used for determining the interactive equipment corresponding to the position information of the user according to the corresponding relation between the preset position and the interactive equipment; searching an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, wherein the first operation instruction list records the corresponding relation between the first limb action and the operation instruction; and sending an operation instruction corresponding to the first limb action information of the user to the interactive device corresponding to the operation instruction.
In order to achieve the above object, according to another aspect of the present invention, there is also provided a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the device interaction method when executing the computer program.
In order to achieve the above object, according to another aspect of the present invention, there is also provided a computer-readable storage medium storing a computer program which, when executed in a computer processor, implements the steps in the above-described device interaction method.
The invention has the following beneficial effects: by using UWB base stations to locate the UWB chip in the portable device carried by the user, the user's position in the environment is obtained accurately, and the user's limb actions are detected by the motion sensor, so that interaction with devices based on the user's position and limb actions is realized accurately. This solves the prior-art problem that interactive operations based on relative position cannot be completed because the user's indoor position is difficult to determine precisely.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts. In the drawings:
FIG. 1 is a first flowchart of an apparatus interaction method according to an embodiment of the present invention;
FIG. 2 is a second flowchart of a device interaction method according to an embodiment of the present invention;
FIG. 3 is a third flowchart of a device interaction method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an interactive system of a device according to an embodiment of the present invention;
FIG. 5 is a first block diagram of a device interaction apparatus according to an embodiment of the present invention;
FIG. 6 is a second block diagram of a device interaction apparatus according to an embodiment of the present invention;
FIG. 7 is a third block diagram of a device interaction apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a computer apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, in the description and claims of the present invention and the above-described drawings, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
It should be noted that the device interaction method, apparatus and system of the present invention can be used in the financial field, and can also be applied in other technical fields.
Fig. 4 is a schematic diagram of a device interaction system according to an embodiment of the present invention, and as shown in fig. 4, the device interaction system of the present invention includes: the device comprises a portable device, a UWB base station and a data analysis system, wherein the portable device is provided with a UWB chip and a motion sensor.
The UWB base station is used for positioning and detecting a UWB chip in portable equipment carried by a user to obtain the position information of the user in a preset environment and sending the position information to the data analysis system.
The portable device is used for sending the first limb action information of the user, which is detected by the motion sensor, to the data analysis system.
The data analysis system is used for determining the interactive equipment corresponding to the position information of the user according to the corresponding relation between the preset position and the interactive equipment; searching an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, wherein the first operation instruction list records the corresponding relation between the first limb action and the operation instruction; and sending an operation instruction corresponding to the first limb action information of the user to the interactive device corresponding to the operation instruction.
As shown in fig. 4, in one embodiment of the present invention, the device interaction system further comprises an image acquisition device, used for acquiring image information of the user in the preset environment and sending the image information to the data analysis system. The data analysis system is further used for identifying second limb action information of the user according to the image information and a preset action recognition model.
In an embodiment of the present invention, the data analysis system is further configured to search, from a second operation instruction list of the corresponding interactive device, an operation instruction corresponding to a combination of the first limb action information and the second limb action information of the user, where the second operation instruction list describes a correspondence between the combination of the first limb action and the second limb action and the operation instruction; and sending an operation instruction corresponding to the combination of the first limb action information and the second limb action information of the user to the interactive equipment corresponding to the operation instruction.
Fig. 1 is a first flowchart of a device interaction method according to an embodiment of the present invention; the method is performed by the data analysis system in fig. 4. As shown in fig. 1, in an embodiment of the present invention, the device interaction method comprises steps S101 to S105.
Step S101, obtaining position information of a user in a preset environment, wherein the position information is obtained by positioning and detecting a UWB chip in portable equipment carried by the user through a UWB base station arranged in the preset environment.
In one embodiment of the invention, the portable device comprises: at least one of a handheld device (e.g., a cell phone, etc.), a hand-worn device (e.g., a glove, etc.), a wrist-worn device (e.g., a watch, a smart bracelet, etc.), and an arm-worn device (e.g., an arm ring, etc.), and any combination thereof.
In an embodiment of the present invention, the position information is three-dimensional position information, obtained specifically by at least four fixed-position UWB base stations arranged in the preset environment locating the UWB chip.
In the embodiment of the invention, the preset environment is modeled in three dimensions to obtain an environment model. The UWB base stations are fixedly arranged in the preset environment, and the three-dimensional position of each base station is fixed. In one embodiment of the invention, there are at least four UWB base stations. These base stations maintain continuous communication with the UWB chip and continuously locate it along the X, Y and Z directions of the environment, yielding the three-dimensional position of the portable device carried by the user; this three-dimensional position is taken as the user's position in the environment.
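The patent does not give the positioning algorithm, but a common way to recover a 3-D position from the ranges reported by four fixed base stations is multilateration. The sketch below is an illustrative assumption, not the patented method: subtracting the sphere equation of one anchor from the other three gives a linear 3x3 system, solved here with Cramer's rule. The anchor coordinates are invented for the example.

```python
# Hypothetical multilateration sketch: 3-D position from four fixed
# UWB base stations ("anchors") and the measured ranges to each.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def locate(anchors, dists):
    """Solve for (x, y, z) from four anchor positions and ranges.

    Subtracting anchor 0's sphere equation |p - a0|^2 = d0^2 from the
    other three yields a linear system 2(ai - a0) . p = d0^2 - di^2
    + |ai|^2 - |a0|^2, solved here with Cramer's rule.
    """
    (x0, y0, z0), d0 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi, zi), di in zip(anchors[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(d0**2 - di**2
                 + xi**2 + yi**2 + zi**2 - x0**2 - y0**2 - z0**2)
    d = det3(A)
    solution = []
    for j in range(3):
        m = [row[:] for row in A]
        for i in range(3):
            m[i][j] = b[i]          # replace column j with b
        solution.append(det3(m) / d)
    return tuple(solution)

# Four fixed anchors at room corners (metres) and a tag's true position.
anchors = [(0, 0, 0), (5, 0, 0), (0, 4, 0), (0, 0, 3)]
tag = (2.0, 1.5, 1.0)
dists = [sum((t - a) ** 2 for t, a in zip(tag, anc)) ** 0.5
         for anc in anchors]
print(locate(anchors, dists))  # recovers approximately (2.0, 1.5, 1.0)
```

In practice the measured ranges are noisy, so a least-squares or Kalman-filtered variant would be used; this sketch only shows the geometric core.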
Step S102, obtaining first limb motion information of the user, where the first limb motion information is obtained through detection by a motion sensor in the portable device.
In one embodiment of the present invention, the first limb action is a movement of the user's hand, wrist or arm; that is, the first limb action information includes at least one of hand actions (e.g., gestures), wrist actions (e.g., raising the hand) and arm actions, and any combination thereof.
In one embodiment of the present invention, the motion sensor may be a nine-axis sensor or a gyroscope, preferably a nine-axis sensor.
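The patent does not specify how the sensor readings are turned into limb action information. As a purely illustrative sketch, a coarse wrist gesture could be derived from a short window of accelerometer samples; the axis convention, threshold and gesture names below are all assumptions for demonstration.

```python
# Illustrative sketch (not from the patent): classifying a coarse wrist
# gesture from a window of (ax, ay, az) accelerometer samples taken
# from a nine-axis sensor.

def classify_gesture(samples, g=9.81, thresh=0.6):
    """Return 'raise', 'lower' or 'still' for a window of samples.

    Uses the mean change of the z-axis reading between the first and
    second half of the window, in units of g, as a crude proxy for
    lifting or dropping the wrist.
    """
    half = len(samples) // 2
    early = sum(s[2] for s in samples[:half]) / half
    late = sum(s[2] for s in samples[half:]) / (len(samples) - half)
    delta = (late - early) / g
    if delta > thresh:
        return "raise"
    if delta < -thresh:
        return "lower"
    return "still"

# A synthetic window in which the z reading climbs from ~0 to ~1 g.
window = [(0.0, 0.0, 0.0)] * 5 + [(0.0, 0.0, 9.81)] * 5
print(classify_gesture(window))  # "raise"
```

A real system would fuse all nine axes (accelerometer, gyroscope, magnetometer) and likely use a trained classifier, but the thresholded sketch conveys the idea of mapping raw sensor data to a discrete first limb action.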
Step S103, determining the interactive equipment corresponding to the position information of the user according to the corresponding relation between the preset position and the interactive equipment.
In an embodiment of the present invention, after the environment is modeled in three dimensions to obtain the environment model, a correspondence between positions and interactive devices is also established; this correspondence records which interactive devices the user can interact with at a given position.
In one embodiment of the present invention, more than one interactive device may correspond to a given position.
Step S104, searching an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, where the first operation instruction list records a corresponding relationship between the first limb action and the operation instruction.
In an embodiment of the present invention, the present invention further establishes a first operation instruction list in advance for each interactive device.
When the user interacts with an interactive device, the corresponding interactive device is determined from the user's position and its first operation instruction list is obtained. The corresponding operation instruction is then searched for in the first operation instruction list according to the user's first limb action information.
If multiple interactive devices correspond to the user's position, the corresponding operation instruction is searched for in the first operation instruction list of each such device according to the user's first limb action information.
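The two lookups in steps S103 and S104 can be sketched as plain tables. The zone names, device names and instruction strings below are invented for illustration; the patent only specifies the structure (position to device, then device plus first limb action to operation instruction).

```python
# Minimal sketch of the lookups in steps S103-S104; all names invented.

# Correspondence between preset positions (zones) and interactive devices.
ZONE_TO_DEVICES = {
    "living_room": ["tv", "lamp"],
    "kitchen": ["range_hood"],
}

# First operation instruction list per device: limb action -> instruction.
FIRST_INSTRUCTION_LIST = {
    "tv": {"raise": "power_on", "lower": "power_off"},
    "lamp": {"raise": "brighten", "lower": "dim"},
    "range_hood": {"raise": "fan_on"},
}

def instructions_for(zone, action):
    """Return (device, instruction) pairs to dispatch for this action.

    If several devices correspond to the user's position, each device's
    first operation instruction list is consulted in turn.
    """
    results = []
    for device in ZONE_TO_DEVICES.get(zone, []):
        instr = FIRST_INSTRUCTION_LIST.get(device, {}).get(action)
        if instr is not None:
            results.append((device, instr))
    return results

print(instructions_for("living_room", "raise"))
# [('tv', 'power_on'), ('lamp', 'brighten')]
```

Step S105 would then send each instruction to its device; here the same gesture in the living room addresses both devices that share that zone, matching the multi-device case described above.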
Step S105, sending an operation instruction corresponding to the first limb action information of the user to an interaction device corresponding to the operation instruction.
Therefore, by using UWB base stations to locate the UWB chip in the portable device carried by the user, the user's position in the environment is obtained accurately, and the user's limb actions are detected by the motion sensor, so that interaction with devices based on the user's position and limb actions is realized accurately. This solves the prior-art problem that interactive operations based on relative position cannot be completed because the user's indoor position is difficult to determine precisely.
Fig. 2 is a second flowchart of a device interaction method according to an embodiment of the present invention; these steps are also performed by the data analysis system in fig. 4. As shown in fig. 2, in an embodiment of the present invention, the device interaction method further includes steps S201 to S202.
Step S201, acquiring image information of the user in the preset environment.
As shown in fig. 4, in one embodiment of the present invention, the device interaction system comprises an image acquisition device, used for acquiring image information of the user in the preset environment and sending the image information to the data analysis system.
Step S202, identifying second limb action information of the user according to the image information and a preset action identification model.
In an embodiment of the present invention, the data analysis system is further configured to identify second limb motion information of the user according to the image information and a preset motion recognition model.
In one embodiment of the invention, the second limb action information may be a gross limb action of the user, such as running or jumping.
In an embodiment of the present invention, the motion recognition model may be obtained by training any machine learning model in the prior art.
Fig. 3 is a third flowchart of a device interaction method according to an embodiment of the present invention; these steps are also performed by the data analysis system in fig. 4. As shown in fig. 3, in an embodiment of the present invention, the device interaction method further includes steps S301 to S302.
Step S301, searching an operation instruction corresponding to a combination of the first limb action information and the second limb action information of the user from a second operation instruction list of the corresponding interactive device, where the second operation instruction list records a corresponding relationship between the combination of the first limb action and the second limb action and the operation instruction.
In one embodiment of the invention, a second operation instruction list is also established in advance for some interactive devices.
When the user interacts with an interactive device, the corresponding interactive device is determined from the user's position and its second operation instruction list is obtained. The corresponding operation instruction is then searched for in the second operation instruction list according to the combination of the user's first limb action information and second limb action information.
Step S302, sending an operation instruction corresponding to the combination of the first limb action information and the second limb action information of the user to an interactive device corresponding to the operation instruction.
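The second operation instruction list described in steps S301 to S302 is, in essence, a per-device mapping from action combinations to instructions. A minimal sketch, with all device names, action labels, and instructions invented for illustration:

```python
# Illustrative second operation instruction list: for each interactive device,
# a mapping from (first limb action, second limb action) combinations to an
# operation instruction. Every name here is a hypothetical placeholder.

SECOND_INSTRUCTION_LISTS = {
    "living_room_tv": {
        ("raise_arm", "standing"): "power_on",
        ("flip_wrist", "standing"): "next_channel",
    },
    "treadmill": {
        ("raise_arm", "running"): "increase_speed",
    },
}

def find_instruction(device, first_action, second_action):
    """Look up the instruction for an action combination; None if unmapped."""
    table = SECOND_INSTRUCTION_LISTS.get(device, {})
    return table.get((first_action, second_action))

print(find_instruction("treadmill", "raise_arm", "running"))  # increase_speed
```

Step S302 then corresponds to sending the looked-up instruction to the device that owns the matching list entry.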
It can be seen from the above embodiments that the invention integrates existing hardware devices with UWB positioning, three-dimensional spatial sensing, video acquisition, and similar functions, including but not limited to UWB chips in mobile phones and smart watches, nine-axis sensors, and high-definition network cameras in the scene environment. By fusing the data acquired by each hardware device, application scenes such as device control in smart homes and the industrial internet of things can be realized based on the user's accurate position combined with gross limb actions and fine operations (such as flipping or raising an arm), thereby achieving a good interactive experience and improving the convenience of user operation.
The invention can greatly improve convenience in application scenes such as smart homes, the industrial internet, and precision marketing systems, with the following specific advantages:
1. the method is software-based, realizing real-time intelligent interactive operation through software algorithms and rules combined with the common modules of current hardware devices, such as sensors and cameras;
2. the method builds on a combination of existing mature technologies, such as action judgment by VR nine-axis sensors, accurate position information from UWB positioning technology, and video behavior detection, and can therefore be realized quickly;
3. the method has broad and reproducible application scenes: based on three types of devices (handheld devices, wearable devices, and environment acquisition devices), multiple permutations and combinations of hardware can realize the application modes, widening the range of application scenes.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
Based on the same inventive concept, a device interaction apparatus is further provided in the embodiments of the present invention, which can be used to implement the device interaction method described in the above embodiments, as described in the following embodiments. Because the principle by which the device interaction apparatus solves the problem is similar to that of the device interaction method, reference may be made to the embodiments of the device interaction method for the embodiments of the apparatus, and repeated details are not described herein. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the embodiments below is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 5 is a first structural block diagram of an apparatus interaction device according to an embodiment of the present invention, and as shown in fig. 5, the apparatus interaction device according to the embodiment of the present invention includes:
the system comprises a position information acquisition module 1, configured to acquire position information of a user in a preset environment, where the position information is obtained through positioning detection, by a UWB base station arranged in the preset environment, of a UWB chip in a portable device carried by the user;
a first limb action information obtaining module 2, configured to obtain first limb action information of the user, where the first limb action information is obtained through detection by a motion sensor in the portable device;
the interactive device determining module 3 is configured to determine an interactive device corresponding to the location information of the user according to a correspondence between a preset location and the interactive device;
the first operation instruction determining module 4 is configured to search an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, where the first operation instruction list records a corresponding relationship between a first limb action and an operation instruction;
and the first operation instruction sending module 5 is configured to send an operation instruction corresponding to the first limb action information of the user to the interaction device corresponding to the operation instruction.
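Modules 1 to 5 together form a simple pipeline: acquire the position, resolve the interactive device from the position, look up the instruction for the first limb action, and dispatch it. A toy sketch of that pipeline follows; all zones, devices, actions, and instructions are assumed for illustration and the zone geometry is deliberately trivial.

```python
# Toy sketch of the first interaction flow (modules 1-5). The zone map, device
# names, action labels, and instructions are all hypothetical placeholders.

ZONE_TO_DEVICE = {"sofa_zone": "living_room_tv", "kitchen_zone": "range_hood"}

# First operation instruction list: per-device mapping of first limb actions.
FIRST_INSTRUCTION_LISTS = {
    "living_room_tv": {"raise_arm": "power_on", "flip_wrist": "mute"},
    "range_hood": {"raise_arm": "fan_on"},
}

def zone_of(position):
    """Map an (x, y, z) UWB fix to a named zone; a real system would use
    actual room geometry rather than this one-axis split."""
    x, _, _ = position
    return "sofa_zone" if x < 2.0 else "kitchen_zone"

def handle_event(position, first_action, send):
    """Resolve the device, look up the instruction, and dispatch it via send()."""
    device = ZONE_TO_DEVICE[zone_of(position)]
    instruction = FIRST_INSTRUCTION_LISTS.get(device, {}).get(first_action)
    if instruction is not None:
        send(device, instruction)
    return device, instruction

sent = []
handle_event((1.2, 0.5, 0.0), "raise_arm", lambda d, i: sent.append((d, i)))
print(sent)  # [('living_room_tv', 'power_on')]
```

The position-to-device resolution stands in for module 3, and `send` stands in for module 5's dispatch to the interactive device.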
Fig. 6 is a second structural block diagram of the device interaction apparatus according to the embodiment of the present invention, and as shown in fig. 6, the device interaction apparatus according to the embodiment of the present invention further includes:
an image information obtaining module 6, configured to obtain image information of the user in the preset environment;
and the second limb action information determining module 7 is configured to identify the second limb action information of the user according to the image information and a preset action identification model.
Fig. 7 is a third structural block diagram of an apparatus interaction device according to an embodiment of the present invention, and as shown in fig. 7, the apparatus interaction device according to the embodiment of the present invention further includes:
a second operation instruction determining module 8, configured to search, from a second operation instruction list of the corresponding interactive device, an operation instruction corresponding to a combination of the first limb action information and the second limb action information of the user, where the second operation instruction list records a correspondence between the combination of the first limb action and the second limb action and the operation instruction;
and a second operation instruction sending module 9, configured to send an operation instruction corresponding to a combination of the first limb action information and the second limb action information of the user to the interaction device corresponding to the operation instruction.
To achieve the above object, according to another aspect of the present application, there is also provided a computer device. As shown in fig. 8, the computer device comprises a memory, a processor, a communication interface, and a communication bus. The memory stores a computer program that can run on the processor, and the processor implements the steps of the method of the above embodiments when executing the computer program.
The processor may be a Central Processing Unit (CPU). The Processor may also be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, or a combination thereof.
The memory, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and units, such as the program units corresponding to the method embodiments of the present invention described above. By executing the non-transitory software programs, instructions, and units stored in the memory, the processor performs various functional applications and work-data processing, thereby realizing the method in the above method embodiments.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor, and the like. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be coupled to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more units are stored in the memory and when executed by the processor perform the method of the above embodiments.
The specific details of the computer device may be understood by referring to the corresponding related descriptions and effects in the above embodiments, and are not described herein again.
In order to achieve the above object, according to another aspect of the present application, there is also provided a computer-readable storage medium storing a computer program which, when executed in a computer processor, implements the steps in the above device interaction method. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Hard Disk (Hard Disk Drive, abbreviated as HDD) or a Solid State Drive (SSD), etc.; the storage medium may also comprise a combination of memories of the kind described above.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices. Optionally, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device, fabricated separately as individual integrated circuit modules, or fabricated as a single integrated circuit module from multiple of the modules or steps. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (14)
1. A device interaction method, comprising:
acquiring position information of a user in a preset environment, wherein the position information is obtained by positioning and detecting a UWB chip in portable equipment carried by the user through a UWB base station arranged in the preset environment;
acquiring first limb action information of the user, wherein the first limb action information is detected by a motion sensor in the portable equipment;
determining interactive equipment corresponding to the position information of the user according to the corresponding relation between a preset position and the interactive equipment;
searching an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, wherein the first operation instruction list records the corresponding relation between the first limb action and the operation instruction;
and sending an operation instruction corresponding to the first limb action information of the user to the interactive device corresponding to the operation instruction.
2. The device interaction method of claim 1, further comprising:
acquiring image information of the user in the preset environment;
and identifying second limb action information of the user according to the image information and a preset action identification model.
3. The device interaction method of claim 2, further comprising:
searching an operation instruction corresponding to the combination of the first limb action information and the second limb action information of the user from a second operation instruction list of the corresponding interactive device, wherein the second operation instruction list records the corresponding relation between the combination of the first limb action and the second limb action and the operation instruction;
and sending an operation instruction corresponding to the combination of the first limb action information and the second limb action information of the user to the interactive equipment corresponding to the operation instruction.
4. The device interaction method according to claim 1, wherein the portable device comprises: at least one of a handheld device, a hand-worn device, a wrist-worn device, and an arm-worn device.
5. The device interaction method according to claim 1, wherein the position information is three-dimensional position information, and the position information is obtained by performing positioning detection on the UWB chip through at least four UWB base stations which are arranged in the preset environment and have fixed positions.
6. The device interaction method of claim 1, wherein the motion sensor comprises: a nine-axis sensor.
7. An apparatus interaction device, comprising:
the device comprises a position information acquisition module, used for acquiring position information of a user in a preset environment, wherein the position information is obtained by a UWB base station arranged in the preset environment positioning and detecting a UWB chip in portable equipment carried by the user;
the first limb action information acquisition module is used for acquiring first limb action information of the user, wherein the first limb action information is obtained by detection of a motion sensor in the portable equipment;
the interactive equipment determining module is used for determining interactive equipment corresponding to the position information of the user according to the corresponding relation between a preset position and the interactive equipment;
the first operation instruction determining module is used for searching an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, wherein the first operation instruction list records a corresponding relation between the first limb action and the operation instruction;
and the first operation instruction sending module is used for sending the operation instruction corresponding to the first limb action information of the user to the interactive device corresponding to the operation instruction.
8. The device interaction apparatus of claim 7, further comprising:
the image information acquisition module is used for acquiring the image information of the user in the preset environment;
and the second limb action information determining module is used for identifying the second limb action information of the user according to the image information and a preset action identification model.
9. The device interaction apparatus of claim 8, further comprising:
a second operation instruction determining module, configured to search, from a second operation instruction list of the corresponding interactive device, an operation instruction corresponding to a combination of the first limb action information and the second limb action information of the user, where the second operation instruction list records a correspondence between the combination of the first limb action and the second limb action and the operation instruction;
and the second operation instruction sending module is used for sending the operation instruction corresponding to the combination of the first limb action information and the second limb action information of the user to the interactive device corresponding to the operation instruction.
10. A device interaction system, comprising: the device comprises a portable device, a UWB base station and a data analysis system, wherein the portable device is provided with a UWB chip and a motion sensor;
the UWB base station is used for carrying out positioning detection on a UWB chip in portable equipment carried by a user to obtain position information of the user in a preset environment and sending the position information to the data analysis system;
the portable equipment is used for sending the first limb action information of the user, which is detected by the motion sensor, to the data analysis system;
the data analysis system is used for determining the interactive equipment corresponding to the position information of the user according to the corresponding relation between the preset position and the interactive equipment; searching an operation instruction corresponding to the first limb action information of the user from a first operation instruction list of the corresponding interactive device, wherein the first operation instruction list records the corresponding relation between the first limb action and the operation instruction; and sending an operation instruction corresponding to the first limb action information of the user to the interactive device corresponding to the operation instruction.
11. The device interaction system of claim 10, further comprising: the image acquisition device is used for acquiring the image information of the user in the preset environment and sending the image information to the data analysis system;
the data analysis system is further used for identifying second limb action information of the user according to the image information and a preset action identification model.
12. The device interaction system according to claim 11, wherein the data analysis system is further configured to search a second operation instruction list of the corresponding interaction device for an operation instruction corresponding to a combination of the first limb motion information and the second limb motion information of the user, where the second operation instruction list describes a correspondence between the combination of the first limb motion and the second limb motion and the operation instruction; and sending an operation instruction corresponding to the combination of the first limb action information and the second limb action information of the user to the interactive equipment corresponding to the operation instruction.
13. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of claims 1 to 6 when executing the computer program.
14. A computer-readable storage medium, in which a computer program is stored which, when executed in a computer processor, implements the method of any one of claims 1 to 6.
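Claim 5 requires at least four fixed UWB base stations to obtain three-dimensional position information. With four anchors and one range measurement per anchor, the tag position can be recovered by linearized trilateration. The sketch below uses exact, noise-free ranges and illustrative coordinates; a real system would apply least squares over more, noisy measurements rather than this minimal solve.

```python
# Minimal trilateration sketch, assuming four fixed anchors at known
# coordinates and exact range measurements. All coordinates are illustrative.
import math

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def trilaterate(anchors, dists):
    """Solve |p - a_i| = d_i for p. Subtracting the first sphere equation from
    the others linearizes the system: 2(a_i - a_0) . p = d_0^2 - d_i^2
    + |a_i|^2 - |a_0|^2, which is 3 equations in 3 unknowns for 4 anchors."""
    a0 = anchors[0]
    rows, rhs = [], []
    for a, d in zip(anchors[1:], dists[1:]):
        rows.append([2 * (a[k] - a0[k]) for k in range(3)])
        rhs.append(dists[0] ** 2 - d ** 2
                   + sum(a[k] ** 2 - a0[k] ** 2 for k in range(3)))
    base = det3(rows)
    solution = []
    for col in range(3):  # Cramer's rule on the 3x3 system
        m = [row[:] for row in rows]
        for row, value in zip(m, rhs):
            row[col] = value
        solution.append(det3(m) / base)
    return tuple(solution)

anchors = [(0, 0, 0), (5, 0, 0), (0, 5, 0), (0, 0, 3)]
target = (1.0, 2.0, 1.5)
dists = [math.dist(a, target) for a in anchors]
print(trilaterate(anchors, dists))  # recovers approximately (1.0, 2.0, 1.5)
```

The fourth anchor is what makes the z coordinate observable, matching the claim's "at least four" requirement; anchors must not be coplanar in a degenerate way, or the linear system becomes singular.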
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110204497.9A CN112965592A (en) | 2021-02-24 | 2021-02-24 | Equipment interaction method, device and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112965592A true CN112965592A (en) | 2021-06-15 |
Family
ID=76285839
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110204497.9A Pending CN112965592A (en) | 2021-02-24 | 2021-02-24 | Equipment interaction method, device and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112965592A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---|
CN113489831A (en) * | 2021-06-29 | 2021-10-08 | 青岛海尔科技有限公司 | Equipment control method and device
CN113489831B (en) * | 2021-06-29 | 2023-02-03 | 青岛海尔科技有限公司 | Equipment control method, device and storage medium
CN113681557A (en) * | 2021-08-17 | 2021-11-23 | Oppo广东移动通信有限公司 | Robot control method, robot, and readable storage medium
CN113849065A (en) * | 2021-09-17 | 2021-12-28 | 支付宝(杭州)信息技术有限公司 | Method and device for triggering client operation instruction by using body-building action
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108375911A (en) * | 2018-01-22 | 2018-08-07 | 珠海格力电器股份有限公司 | A kind of apparatus control method, device, storage medium and equipment |
CN209712883U (en) * | 2018-08-30 | 2019-12-03 | 青岛联合创智科技有限公司 | A kind of UWB positioning bracelet with heart rate measurement function |
CN110568931A (en) * | 2019-09-11 | 2019-12-13 | 百度在线网络技术(北京)有限公司 | interaction method, device, system, electronic device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||