CN110069137B - Gesture control method, control device and control system - Google Patents

Gesture control method, control device and control system

Info

Publication number
CN110069137B
Authority
CN
China
Prior art keywords
user
instruction
control
action
hand image
Prior art date
Legal status
Active
Application number
CN201910359363.7A
Other languages
Chinese (zh)
Other versions
CN110069137A (en)
Inventor
张连第
张程
柴君飞
俞宗嘉
Current Assignee
Xuzhou Heavy Machinery Co Ltd
Original Assignee
Xuzhou Heavy Machinery Co Ltd
Priority date
Filing date
Publication date
Application filed by Xuzhou Heavy Machinery Co Ltd
Priority to CN201910359363.7A
Publication of CN110069137A
Application granted
Publication of CN110069137B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a gesture control method, a control device, and a control system. The gesture control device captures video in a predetermined direction and, from the user's hand image in the captured video, detects whether the user has activated the gesture control function in a reference area of the shooting field of view. If the function is activated, the device detects whether the user has completed a control operation in a working area of the shooting field of view; if so, it detects the moving direction and moving distance of the hand image in the working area relative to the reference area, generates an action instruction corresponding to the current working mode, and sends the instruction to the corresponding vehicle control device. One operator can thereby complete lifting-rigging installation, observation of the surrounding environment, and crane operation, which effectively reduces labor cost and improves operation safety, operation efficiency, and the installation precision of the hoisted object.

Description

Gesture control method, control device and control system
Technical Field
The present disclosure relates to the field of control, and in particular, to a gesture control method, a control device, and a control system.
Background
With the rapid development of infrastructure and civil engineering, engineering vehicles such as cranes play an important role in engineering construction, rail transit, emergency rescue, and other fields. Wheeled cranes, for example, are widely favored in the market because of their high hoisting capacity and high transfer efficiency.
Conventional lifting operations require an operator to enter the control cabin. Based on the type, shape, and size of the object to be hoisted, the surrounding environment, the current position, and the operator's own experience, the operator manipulates in real time the handles arranged on both sides of the control-cabin seat, adjusting the length, luffing angle, and height of the crane boom so that the hoisted object can safely reach the target position, thereby completing the hoisting operation.
Disclosure of Invention
The inventors have noted that in the conventional way of operating a work vehicle, the operator must enter the control cabin, so the field of view is limited. In addition, because dedicated personnel are required to install the lifting rigging, observe the surrounding environment, and issue operation instructions, working efficiency is low, and there is a hoisting risk that operation instructions may not be understood quickly and accurately by the operator.
The present disclosure therefore provides a scheme for controlling an engineering vehicle conveniently and efficiently.
According to a first aspect of the embodiments of the present disclosure, there is provided a gesture control method, including: performing video capture in a predetermined direction; detecting, according to a user hand image in the captured video, whether the user activates a gesture control function in a reference area of the shooting field of view; if the user activates the gesture control function in the reference area, detecting, according to the user hand image in the captured video, whether the user completes a control operation in a working area of the shooting field of view; if the user completes a control operation in the working area, detecting the moving direction of the current position of the user hand image in the working area relative to the reference area and the moving distance in that direction, wherein the reference area is set at a predetermined position in the working area; generating an action instruction corresponding to the current working mode according to the moving direction and the moving distance; and sending the action instruction to a corresponding vehicle control device so that the vehicle control device controls the corresponding engineering vehicle according to the action instruction.
In some embodiments, generating the action instruction corresponding to the current working mode according to the moving direction and the moving distance includes: determining the ratio of the moving distance to a preset length as the action speed in the current working mode, wherein the preset length is the distance from the reference area to the boundary of the working area along the moving direction; determining the action direction in the current working mode according to the moving direction; and generating a corresponding action instruction according to the action direction and the action speed.
In some embodiments, prompt information corresponding to the action direction and the action speed is displayed on a display screen.
In some embodiments, the display screen is a transparent display screen.
In some embodiments, sending the action instruction to the corresponding vehicle control device includes: processing the action instruction using a first protocol to obtain first instruction information; processing the action instruction using a second protocol to obtain second instruction information; and sending the first instruction information and the second instruction information to the corresponding vehicle control device.
In some embodiments, detecting whether the user activates the gesture control function within the reference region comprises: detecting whether the hand image of the user is located in the reference area or not according to the hand image of the user in the acquired video; if the hand image of the user is located in the reference area, further detecting whether the hand image of the user corresponds to a preset gesture; and if the hand image of the user corresponds to the preset gesture, determining that the user activates a gesture control function in the reference area.
In some embodiments, detecting whether a user has completed a control operation in the work area comprises: detecting whether the user hand image stops moving in the work area; and if the hand image of the user stops moving in the working area, determining that the user completes control operation in the working area.
In some embodiments, the current operating mode is adjusted according to instructions entered by a user via an input device.
In some embodiments, if the user's hand image is not included in the captured video, the gesture control function is turned off.
According to a second aspect of the embodiments of the present disclosure, there is provided a gesture control apparatus including: a camera module configured to perform video capture in a predetermined direction; a control module configured to detect, according to a user hand image in the captured video, whether the user activates a gesture control function in a reference area of the shooting field of view, to detect, according to the user hand image, whether the user completes a control operation in a working area of the shooting field of view if the user activates the gesture control function in the reference area, to detect the moving direction of the current position of the user hand image in the working area relative to the reference area and the moving distance in that direction if the user completes the control operation in the working area, wherein the reference area is set at a predetermined position in the working area, and to generate an action instruction corresponding to the current working mode according to the moving direction and the moving distance; and a communication module configured to send the action instruction to a corresponding vehicle control device so that the vehicle control device controls the corresponding engineering vehicle according to the action instruction.
In some embodiments, the control module is configured to determine the ratio of the moving distance to a predetermined length as the action speed in the current working mode, wherein the predetermined length is the distance from the reference area to the boundary of the working area along the moving direction, to determine the action direction in the current working mode according to the moving direction, and to generate a corresponding action instruction according to the action direction and the action speed.
In some embodiments, the above apparatus further comprises: a display screen configured to display prompt information corresponding to the action direction and the action speed.
In some embodiments, the display screen is a transparent display screen.
In some embodiments, the communication module is configured to process the action instruction using a first protocol to obtain first instruction information, process the action instruction using a second protocol to obtain second instruction information, and send the first instruction information and the second instruction information to the corresponding vehicle control device.
In some embodiments, the control module is configured to detect whether the user hand image is located in the reference area according to the user hand image in the captured video, further detect whether the user hand image corresponds to a preset gesture if the user hand image is located in the reference area, and determine that the user activates the gesture control function in the reference area if the user hand image corresponds to the preset gesture.
In some embodiments, the control module is configured to detect whether the user hand image stops moving in the work area, and determine that a user has completed a control operation in the work area if the user hand image stops moving in the work area.
In some embodiments, the above apparatus further comprises: an input module configured to receive an instruction input by a user; the control module is further configured to adjust the current working mode according to the received instruction.
In some embodiments, the control module is further configured to turn off the gesture control function if the user hand image is not included in the captured video.
According to a third aspect of the embodiments of the present disclosure, there is provided a gesture control apparatus including: a memory configured to store instructions; and a processor coupled to the memory, the processor configured to perform the method of any of the embodiments described above based on the instructions stored in the memory.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a gesture control system including the gesture control apparatus of any of the embodiments described above and an engineering vehicle configured with a vehicle control device, wherein the vehicle control device is configured to control the engineering vehicle according to the action instruction sent by the gesture control apparatus.
In some embodiments, the vehicle control device is configured to, when the received instruction includes first instruction information and second instruction information, process the first instruction information using the corresponding first protocol to obtain a first action instruction and process the second instruction information using the corresponding second protocol to obtain a second action instruction, and, when the first action instruction and the second action instruction are the same, control the engineering vehicle using either of them.
In some embodiments, the vehicle control device is further configured to determine, after receiving the action instruction, whether a handle control instruction is also received, and to execute only the handle control instruction if one is received.
According to a fifth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, in which computer instructions are stored, and when executed by a processor, the computer-readable storage medium implements the method according to any of the embodiments described above.
Other features of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present disclosure, and those skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is an exemplary flow chart of a gesture control method of one embodiment of the present disclosure;
FIG. 2 is a schematic view of a motion control interface according to one embodiment of the present disclosure;
FIG. 3 is a schematic view of a motion control interface according to another embodiment of the present disclosure;
FIG. 4 is an exemplary block diagram of a gesture control apparatus according to an embodiment of the present disclosure;
FIG. 5 is an exemplary block diagram of a gesture control apparatus according to another embodiment of the present disclosure;
FIG. 6 is an exemplary block diagram of a gesture control apparatus according to yet another embodiment of the present disclosure;
FIG. 7 is an exemplary block diagram of a gesture control system of one embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The following description of at least one exemplary embodiment is merely illustrative and in no way limits the disclosure, its application, or its uses. All other embodiments obtained by those skilled in the art from the disclosed embodiments without creative effort fall within the protection scope of the present disclosure.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that, for convenience of description, the parts shown in the drawings are not drawn to scale.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as exemplary only and not as limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Fig. 1 is an exemplary flowchart of a gesture control method according to an embodiment of the present disclosure. In some embodiments, the steps of the gesture control method are performed by a gesture control device.
In step 101, video capture is performed in a predetermined direction.
In some embodiments, video capture is performed by a camera device in a predetermined direction.
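The disclosure does not prescribe a capture implementation. As a minimal illustrative sketch, assuming OpenCV and a camera fixed in the predetermined direction (the function name and camera index are assumptions, not part of the disclosure):

    import cv2  # OpenCV is an assumed choice; the disclosure names no library

    def capture_frames(camera_index=0):
        """Yield frames from a camera pointed in the predetermined direction."""
        cap = cv2.VideoCapture(camera_index)
        if not cap.isOpened():
            raise RuntimeError("camera not available")
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                yield frame
        finally:
            cap.release()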
In step 102, whether the user activates the gesture control function in a reference area of the shooting field of view is detected according to the user hand image in the captured video.
In some embodiments, a reference area is set in the shooting field of view. Whether the user hand image is located in the reference area is detected according to the hand image in the captured video. If the hand image is located in the reference area, it is further detected whether the hand image corresponds to a preset gesture; if it does, it is determined that the user has activated the gesture control function in the reference area.
For example, the user moves a hand so that the hand image lies within the reference area and then raises a thumb there to turn on the gesture control function. Subsequent processing is performed only after the gesture control function has been turned on, which prevents misoperation caused by the user casually waving a hand.
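A sketch of this activation check, assuming an external hand detector that returns a bounding box and a gesture label; the box format, the detector, and the "thumbs_up" label are all assumptions, since the disclosure does not prescribe a recognition algorithm:

    def hand_inside(hand_box, region):
        """True if the detected hand bounding box lies entirely within a region.

        Boxes and regions are (x, y, w, h) tuples in pixel coordinates."""
        hx, hy, hw, hh = hand_box
        rx, ry, rw, rh = region
        return hx >= rx and hy >= ry and hx + hw <= rx + rw and hy + hh <= ry + rh

    def activation_detected(hand_box, gesture_label, reference_area):
        """Gesture control is activated only when the hand is inside the
        reference area and shows the preset gesture (a raised thumb in the
        example above); 'thumbs_up' is an assumed label."""
        return hand_inside(hand_box, reference_area) and gesture_label == "thumbs_up"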
In step 103, if the user activates the gesture control function in the reference area, it is detected, according to the user hand image in the captured video, whether the user completes a control operation in the working area of the shooting field of view.
In some embodiments, a working area is set in the shooting field of view, and the reference area is set at a predetermined position within the working area. Whether the user hand image stops moving in the working area is detected; if it stops, it is determined that the user has completed a control operation in the working area.
For example, after turning on the gesture control function, the user moves a hand to the right. After moving a certain distance, the user holds the hand still for a certain time, which indicates that one control operation is complete.
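One plausible implementation of the "held still for a certain time" test is a dwell detector; the pixel radius and hold time below are illustrative assumptions, not values from the disclosure:

    import math
    import time

    class DwellDetector:
        """Report 'stopped' once the hand centre stays within radius_px of an
        anchor point for hold_seconds (both thresholds are illustrative)."""

        def __init__(self, radius_px=10, hold_seconds=1.0):
            self.radius_px = radius_px
            self.hold_seconds = hold_seconds
            self.anchor = None
            self.anchor_time = None

        def update(self, centre):
            """Feed the hand centre for each frame; returns True on dwell."""
            now = time.monotonic()
            if self.anchor is None or math.dist(centre, self.anchor) > self.radius_px:
                self.anchor, self.anchor_time = centre, now  # hand still moving
                return False
            return now - self.anchor_time >= self.hold_seconds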
In step 104, if the user has completed a control operation in the working area, the moving direction of the current position of the user hand image in the working area relative to the reference area, and the moving distance in that direction, are detected.
In step 105, an action command corresponding to the current operation mode is generated based on the moving direction and the moving distance.
In some embodiments, the ratio of the moving distance to a predetermined length is determined as the action speed in the current working mode, wherein the predetermined length is the distance from the reference area to the boundary of the working area along the moving direction. The action direction in the current working mode is determined from the moving direction, and a corresponding action instruction is generated from the action direction and the action speed.
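A sketch of this mapping for axis-aligned movements, with the reference area and working area represented as (x, y, width, height) rectangles in image coordinates; all names and the coordinate convention are assumptions. Distances are measured from the reference-area border to the working-area boundary, matching the percentage computation of the FIG. 3 example below:

    def command_from_position(hand_centre, ref_area, work_area):
        """Map the hand's offset from the reference area to (direction, speed%)."""
        hx, hy = hand_centre
        rx, ry, rw, rh = ref_area
        wx, wy, ww, wh = work_area
        dx = hx - (rx + rw / 2)   # offset from the reference-area centre
        dy = hy - (ry + rh / 2)
        if abs(dx) >= abs(dy):    # horizontal movement dominates
            if dx > 0:
                direction, moved, full = "right", hx - (rx + rw), (wx + ww) - (rx + rw)
            else:
                direction, moved, full = "left", rx - hx, rx - wx
        else:
            if dy > 0:
                direction, moved, full = "down", hy - (ry + rh), (wy + wh) - (ry + rh)
            else:
                direction, moved, full = "up", ry - hy, ry - wy
        # clamp: a hand still inside the reference area yields speed 0
        speed_pct = max(0, min(round(100 * moved / full), 100))
        return direction, speed_pct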
In some embodiments, prompt information corresponding to the action direction and the action speed is displayed on a display screen so that the user can see the current action direction and action speed.
In some embodiments, the display screen is a transparent display screen. For example, the gesture control device may be smart glasses with a transparent display screen disposed on at least one lens. Displaying the action direction and action speed on the screen makes the information easy to read, and because the screen is transparent it does not block the user's line of sight.
In step 106, the action instruction is sent to the corresponding vehicle control device, so that the vehicle control device controls the corresponding engineering vehicle according to the action instruction.
In some embodiments, the action instruction is processed using a first protocol to obtain first instruction information, and using a second protocol, different from the first, to obtain second instruction information. Both pieces of instruction information are sent to the corresponding vehicle control device. After receiving them, the vehicle control device processes the first instruction information with the first protocol to obtain a first action instruction and processes the second instruction information with the second protocol to obtain a second action instruction. Only when the two action instructions are the same is the corresponding control carried out, using either of them. This improves the safety of information transmission and effectively avoids misoperation caused by transmission errors in the action instruction.
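The disclosure does not name the two protocols. As an illustrative sketch, the first protocol might be a JSON text encoding and the second a checksummed binary packing, with the receiver acting only when both decode to the same command:

    import json
    import struct

    DIRECTIONS = {"left": 0, "right": 1, "up": 2, "down": 3}

    def encode_first(direction, speed_pct):
        """'First protocol' (assumed): a JSON text encoding."""
        return json.dumps({"dir": direction, "speed": speed_pct}).encode()

    def encode_second(direction, speed_pct):
        """'Second protocol' (assumed): packed binary with a simple checksum."""
        d = DIRECTIONS[direction]
        return struct.pack("BBB", d, speed_pct, (d + speed_pct) % 256)

    def decode_and_check(first_bytes, second_bytes):
        """Receiver side: decode both messages independently and return the
        command only when they agree; otherwise return None (no action)."""
        msg = json.loads(first_bytes)
        first = (msg["dir"], msg["speed"])
        d, s, chk = struct.unpack("BBB", second_bytes)
        if (d + s) % 256 != chk:
            return None
        inverse = {v: k for k, v in DIRECTIONS.items()}
        second = (inverse[d], s)
        return first if first == second else None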
In the gesture control method provided by the above embodiments of the present disclosure, an operator can control the engineering vehicle through hand movements and control gestures without entering the control cabin. One operator can therefore complete lifting-rigging installation, observation of the surrounding environment, and crane operation, which effectively reduces labor cost and improves operation safety, operation efficiency, and the installation precision of the hoisted object.
In some embodiments, the current operating mode is adjusted according to instructions entered by a user via an input device.
For example, a crane has independent action control modes and compound action control modes. The independent action control modes include a telescopic control mode, a rotary control mode, a variable-amplitude (luffing) control mode, and a lifting control mode. The compound action control modes include telescopic/variable-amplitude, telescopic/lifting, telescopic/rotary, variable-amplitude/rotary, variable-amplitude/lifting, and rotary/lifting compound control modes, as enumerated in the sketch below. The user can select the current working mode as needed through input devices such as direction keys, a confirmation key, or a touch pad.
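For illustration only, these modes could be enumerated as follows; the identifiers are assumptions, since the disclosure names the modes only in prose:

    from enum import Enum

    class WorkingMode(Enum):
        # independent action control modes
        TELESCOPIC = "telescopic"
        ROTARY = "rotary"
        VARIABLE_AMPLITUDE = "variable amplitude"   # i.e. luffing
        LIFTING = "lifting"
        # compound action control modes
        TELESCOPIC_VARIABLE_AMPLITUDE = "telescopic/variable amplitude"
        TELESCOPIC_LIFTING = "telescopic/lifting"
        TELESCOPIC_ROTARY = "telescopic/rotary"
        VARIABLE_AMPLITUDE_ROTARY = "variable amplitude/rotary"
        VARIABLE_AMPLITUDE_LIFTING = "variable amplitude/lifting"
        ROTARY_LIFTING = "rotary/lifting"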
In some embodiments, the reference region and the working region in the photographic field of view are mapped onto the display screen. This allows the user to see the gesture movement while viewing the necessary information.
FIG. 2 is a schematic diagram of a motion control interface according to an embodiment of the present disclosure.
As shown in fig. 2, the motion control interface displayed on the display screen includes a reference area 21 and a work area 22. The user can turn on the gesture control function in the reference area 21 and can also perform corresponding control by moving the hand in the work area 22. Since the display screen is a transparent screen, the user can intuitively perform corresponding control operations in the reference area 21 and the work area 22 through gesture motions.
In some embodiments, the reference area 21 is square, circular, or other shape that facilitates manipulation by a user.
In some embodiments, an indication device 23 is also provided on the motion control interface to alert the user when needed. In addition, working-mode prompt information is provided on the motion control interface to indicate the current working mode.
Fig. 3 is a schematic diagram of a motion control interface according to another embodiment of the disclosure.
As shown in fig. 3, if the user selects "rotary" as the current working mode, "rotary" is highlighted on the motion control interface. The user moves a hand into the reference area and raises a thumb to turn on the gesture control function. After the function is turned on, the gesture control device lights the indication device 23 to remind the user that activation succeeded.
The user's hand can then move within the working area 22. For example, the hand moves a certain distance to the right from the reference area 21 and stops. The distance between the right border of the reference area 21 and the current position of the hand is calculated and converted to a percentage of the maximum length; in this example the result is 51%. Since the user has moved 51% of the way to the right in rotary mode, the resulting command state is: action direction, rightward rotation; action speed, 51%. This command state is displayed on the motion control interface, as shown by information 24 in fig. 3.
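Continuing the hypothetical command_from_position sketch from step 105, coordinates chosen so that the hand covers 51% of the available span reproduce the command state of FIG. 3 (all coordinates are illustrative):

    # Reference area 100 px wide with its right border at x = 400,
    # working area extending to x = 800, hand stopped at x = 604.
    ref_area = (300, 200, 100, 100)
    work_area = (0, 0, 800, 500)
    hand_centre = (604, 250)          # 204 px of a 400 px span = 51%
    print(command_from_position(hand_centre, ref_area, work_area))  # ('right', 51)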
In addition, a corresponding progress indication bar may also be displayed according to the movement of the user's hand, as shown by the progress indication bar 25 in fig. 3.
The gesture control device sends the resulting instruction to the vehicle control device, so that the vehicle control device controls the engineering vehicle to perform rightward rotation at an action speed of 51%.
In some embodiments, if the user's hand leaves the work area 22, the gesture control function is turned off to avoid false control of the vehicle.
Fig. 4 is an exemplary block diagram of a gesture control apparatus according to an embodiment of the present disclosure. As shown in fig. 4, the gesture control means includes a camera module 41, a control module 42, and a communication module 43.
The camera module 41 is configured to perform video capturing in a predetermined direction.
The control module 42 is configured to detect, according to the user hand image in the captured video, whether the user activates the gesture control function in a reference area of the shooting field of view. If the user activates the function in the reference area, it detects, according to the user hand image, whether the user completes a control operation in a working area of the shooting field of view, wherein the reference area is set at a predetermined position in the working area. If the user completes a control operation in the working area, it detects the moving direction of the current position of the hand image relative to the reference area and the moving distance in that direction, and generates an action instruction corresponding to the current working mode according to the moving direction and the moving distance.
In some embodiments, the control module 42 is configured to determine the ratio of the moving distance to a predetermined length as the action speed in the current working mode, wherein the predetermined length is the distance from the reference area to the boundary of the working area along the moving direction, to determine the action direction in the current working mode according to the moving direction, and to generate a corresponding action instruction according to the action direction and the action speed.
In some embodiments, the control module 42 is configured to detect whether the user hand image is located in the reference area according to the user hand image in the captured video, further detect whether the user hand image corresponds to the preset gesture if the user hand image is located in the reference area, and determine that the user activates the gesture control function in the reference area if the user hand image corresponds to the preset gesture.
In some embodiments, the control module 42 is configured to detect whether the user hand image stops moving in the work area, and if the user hand image stops moving in the work area, determine that the user has completed the control operation in the work area.
In some embodiments, the control module 42 is further configured to turn off the gesture control function if the user hand image is not included in the captured video.
The communication module 43 is configured to send the action instruction to the corresponding vehicle control device so that the vehicle control device controls the corresponding engineering vehicle according to the action instruction.
In some embodiments, the communication module 43 is configured to process the action instructions using a first protocol to obtain first instruction information, process the action instructions using a second protocol to obtain second instruction information, and transmit the first instruction information and the second instruction information to the corresponding vehicle control devices.
In the gesture control apparatus provided by the above embodiments of the present disclosure, an operator can control the engineering vehicle through hand movements and control gestures without entering the control cabin. One operator can therefore complete lifting-rigging installation, observation of the surrounding environment, and crane operation, which effectively reduces labor cost and improves operation safety, operation efficiency, and the installation precision of the hoisted object.
Fig. 5 is an exemplary block diagram of a gesture control apparatus according to another embodiment of the present disclosure. Fig. 5 differs from fig. 4 in that in the embodiment shown in fig. 5 the gesture control means further comprises a display 44. The display screen 44 is configured to display prompt information corresponding to the direction of motion and the speed of motion.
In some embodiments, the display screen 44 is a transparent display screen. For example, the gesture control device may be smart glasses with a transparent display screen disposed on at least one lens. Displaying the action direction and action speed on the screen makes the information easy to read, and because the screen is transparent it does not block the user's line of sight.
In some embodiments, as shown in fig. 5, the gesture control apparatus further includes an input module 45. The input module 45 may consist of direction keys and a confirmation key, a touch pad, or another type of input device.
The input module 45 is configured to receive an instruction input by the user. The control module 42 is further configured to adjust the current working mode according to the received instruction.
Fig. 6 is an exemplary block diagram of a gesture control apparatus according to still another embodiment of the present disclosure. As shown in fig. 6, the gesture control means includes a memory 61 and a processor 62.
The memory 61 is used to store instructions; the processor 62 is coupled to the memory 61 and is configured to perform the method of any of the embodiments described with reference to fig. 1 based on the instructions stored in the memory.
As shown in fig. 6, the gesture control apparatus further includes a communication interface 63 for exchanging information with other devices, and a bus 64 through which the processor 62, the communication interface 63, and the memory 61 communicate with one another.
The memory 61 may comprise high-speed RAM and may further comprise non-volatile memory (e.g., at least one disk memory). The memory 61 may also be a memory array, or may be partitioned into blocks that can be combined into virtual volumes according to certain rules.
Further, the processor 62 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present disclosure.
The present disclosure also relates to a computer-readable storage medium storing computer instructions that, when executed by a processor, implement the method of any of the embodiments described with reference to fig. 1.
Fig. 7 is an exemplary block diagram of a gesture control system of one embodiment of the present disclosure. As shown in fig. 7, the gesture control system includes a gesture control device 71 and a working vehicle 72, and a vehicle control device 73 is provided in the working vehicle 72. The gesture control device 71 is the gesture control device according to any one of the embodiments shown in fig. 4 to 6.
The vehicle control device 73 is configured to control the engineering vehicle according to the action instruction sent by the gesture control device 71.
In some embodiments, the vehicle control device 73 is configured to, when the received instruction includes first instruction information and second instruction information, process the first instruction information using the corresponding first protocol to obtain a first action instruction and process the second instruction information using the corresponding second protocol to obtain a second action instruction. When the first action instruction and the second action instruction are the same, the engineering vehicle is controlled using either of them.
In this way, misoperation caused by instruction transmission errors can be avoided.
In some embodiments, the vehicle control device 73 is further configured to determine, after receiving the action instruction, whether a handle control instruction is also received, and if so, to execute only the handle control instruction.
It should be noted that in an engineering vehicle such as a crane, the control cabin is provided with a control handle, and the control signal generated by operating the handle has a higher priority than signals from other control devices. Therefore, when a handle control instruction is received, the corresponding control is carried out according to the handle control instruction even if the gesture control device has sent an action instruction.
In some embodiments, the vehicle control device 73 is further configured to determine whether the action direction and the action speed are valid before executing the instruction sent by the gesture control device, and to execute the corresponding control only if both are valid, further improving system safety. A combined arbitration sketch follows.
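A vehicle-side sketch combining both rules, handle priority and validity checking; the command representation and names are assumptions:

    VALID_DIRECTIONS = {"left", "right", "up", "down"}

    def dispatch(handle_cmd, gesture_cmd):
        """Arbitrate: a handle control command always outranks a gesture
        command, and a gesture command runs only if its action direction
        and action speed are valid."""
        if handle_cmd is not None:
            return handle_cmd                 # handle input has top priority
        if gesture_cmd is None:
            return None
        direction, speed_pct = gesture_cmd
        if direction in VALID_DIRECTIONS and 0 <= speed_pct <= 100:
            return gesture_cmd
        return None                           # invalid command: do nothing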
In some embodiments, the functional modules described above can be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof for performing the functions described in this disclosure.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk.
The description of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the forms disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical application, and to enable others of ordinary skill in the art to understand the disclosure in its various embodiments and with the various modifications suited to the particular use contemplated.

Claims (19)

1. A gesture control method, comprising:
carrying out video acquisition in a preset direction;
detecting whether a user starts a gesture control function in a reference area of a shooting visual field according to a user hand image in the collected video;
if the user starts a gesture control function in the reference area, detecting whether the user finishes control operation in a working area of a shooting visual field according to a hand image of the user in the collected video;
if the user finishes control operation in the working area, detecting the moving direction of the current position of the hand image of the user in the working area relative to the reference area and the moving distance in the moving direction, wherein the reference area is arranged at a preset position in the working area;
generating an action instruction corresponding to the current working mode according to the moving direction and the moving distance;
sending the action instruction to a corresponding vehicle control device so that the vehicle control device can control the corresponding engineering vehicle according to the action instruction;
wherein, according to the moving direction and the moving distance, generating an action instruction corresponding to the current working mode comprises:
determining the ratio of the moving distance to a preset length as the action speed in the current working mode, wherein the preset length is the distance from the reference area to the boundary of the working area along the moving direction;
determining an action direction in the current working mode according to the moving direction;
generating a corresponding action instruction according to the action direction and the action speed;
and displaying prompt information corresponding to the action direction and the action speed on a display screen.
2. The method of claim 1, wherein,
the display screen is a transparent display screen.
3. The method of claim 1, wherein sending the action instruction to the corresponding vehicle control device comprises:
processing the action instruction by using a first protocol to obtain first instruction information;
processing the action instruction by using a second protocol to obtain second instruction information;
and sending the first instruction information and the second instruction information to corresponding vehicle control devices.
4. The method of claim 1, wherein detecting whether a user activates a gesture control function within the reference area comprises:
detecting whether the hand image of the user is located in the reference area or not according to the hand image of the user in the acquired video;
if the hand image of the user is located in the reference area, further detecting whether the hand image of the user corresponds to a preset gesture;
and if the hand image of the user corresponds to the preset gesture, determining that the user activates a gesture control function in the reference area.
5. The method of claim 1, wherein detecting whether a user has completed a control operation in the working area comprises:
detecting whether the user hand image stops moving in the work area;
and if the hand image of the user stops moving in the working area, determining that the user completes control operation in the working area.
6. The method of claim 1, further comprising:
and adjusting the current working mode according to an instruction input by a user through the input device.
7. The method of any of claims 1-6, further comprising:
and if the collected video does not comprise the hand image of the user, closing the gesture control function.
8. A gesture control apparatus comprising:
a camera module configured to perform video capture in a predetermined direction;
a control module configured to detect whether a user starts a gesture control function in a reference area of a shooting visual field according to a user hand image in the captured video; detect whether the user completes a control operation in a working area of the shooting visual field according to the user hand image in the captured video if the user starts the gesture control function in the reference area; detect the moving direction of the current position of the user hand image in the working area relative to the reference area and the moving distance in the moving direction if the user completes the control operation in the working area, wherein the reference area is set at a predetermined position in the working area; and generate an action instruction corresponding to a current working mode according to the moving direction and the moving distance, wherein the ratio of the moving distance to a predetermined length is determined as the action speed in the current working mode, the predetermined length being the distance from the reference area to the boundary of the working area along the moving direction, the action direction in the current working mode is determined according to the moving direction, and the corresponding action instruction is generated according to the action direction and the action speed;
a display screen configured to display prompt information corresponding to the action direction and the action speed;
and a communication module configured to send the action instruction to a corresponding vehicle control device so that the vehicle control device can control the corresponding engineering vehicle according to the action instruction.
9. The apparatus of claim 8, wherein,
the display screen is a transparent display screen.
10. The apparatus of claim 8, wherein,
the communication module is configured to process the action command by using a first protocol to obtain first command information, process the action command by using a second protocol to obtain second command information, and send the first command information and the second command information to corresponding vehicle control devices.
11. The apparatus of claim 8, wherein,
the control module is configured to detect whether the user hand image is located in the reference area according to the user hand image in the acquired video, further detect whether the user hand image corresponds to a preset gesture if the user hand image is located in the reference area, and determine that the user activates a gesture control function in the reference area if the user hand image corresponds to the preset gesture.
12. The apparatus of claim 8, wherein,
the control module is configured to detect whether the user hand image stops moving in the work area, and determine that a user completes control operation in the work area if the user hand image stops moving in the work area.
13. The apparatus of claim 8, further comprising:
an input module configured to receive an instruction input by a user;
the control module is further configured to adjust the current operating mode.
14. The apparatus of any one of claims 8-13,
the control module is further configured to turn off the gesture control function if the captured video does not include an image of the user's hand.
15. A gesture control apparatus comprising:
a memory configured to store instructions;
a processor coupled to the memory, the processor configured to perform the method of any one of claims 1-7 based on the instructions stored in the memory.
16. A control system, comprising: the gesture control apparatus according to any one of claims 8-15; and
an engineering vehicle provided with a vehicle control device, wherein the vehicle control device is configured to control the engineering vehicle according to the action instruction sent by the gesture control apparatus.
17. The system of claim 16, wherein,
the vehicle control device is configured to, when the received instruction includes first instruction information and second instruction information, process the first instruction information using a corresponding first protocol to obtain a first action instruction, process the second instruction information using a corresponding second protocol to obtain a second action instruction, and, when the first action instruction and the second action instruction are the same, control the engineering vehicle using the first action instruction or the second action instruction.
18. The system of claim 16, wherein,
the vehicle control device is further configured to determine, after receiving the action instruction, whether a handle control instruction is also received, and to execute only the handle control instruction if the handle control instruction is received.
19. A computer readable storage medium, wherein the computer readable storage medium stores computer instructions which, when executed by a processor, implement the method of any one of claims 1-7.
CN201910359363.7A 2019-04-30 2019-04-30 Gesture control method, control device and control system Active CN110069137B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910359363.7A CN110069137B (en) 2019-04-30 2019-04-30 Gesture control method, control device and control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910359363.7A CN110069137B (en) 2019-04-30 2019-04-30 Gesture control method, control device and control system

Publications (2)

Publication Number Publication Date
CN110069137A CN110069137A (en) 2019-07-30
CN110069137B true CN110069137B (en) 2022-07-08

Family

ID=67369700

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910359363.7A Active CN110069137B (en) 2019-04-30 2019-04-30 Gesture control method, control device and control system

Country Status (1)

Country Link
CN (1) CN110069137B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110705510B (en) * 2019-10-16 2023-09-05 杭州优频科技有限公司 Action determining method, device, server and storage medium
CN111736693B (en) * 2020-06-09 2024-03-22 海尔优家智能科技(北京)有限公司 Gesture control method and device of intelligent equipment
CN112244705B (en) * 2020-09-10 2023-05-23 北京石头创新科技有限公司 Intelligent cleaning device, control method and computer storage medium
CN112270302A (en) * 2020-11-17 2021-01-26 支付宝(杭州)信息技术有限公司 Limb control method and device and electronic equipment
CN114007140A (en) * 2021-10-29 2022-02-01 海信视像科技股份有限公司 Method for controlling position of controlled role through gesture and display device


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101459445B1 (en) * 2012-12-18 2014-11-07 현대자동차 주식회사 System and method for providing a user interface using wrist angle in a vehicle
KR102561132B1 (en) * 2016-09-21 2023-07-28 엘지전자 주식회사 Vehicle control device mounted on vehicle and method for controlling the vehicle

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102870078A (en) * 2010-02-10 2013-01-09 微晶片科技德国第二公司 System and method for contactless detection and recognition of gestures in three-dimensional space
CN102591446A (en) * 2011-01-10 2012-07-18 海尔集团公司 Gesture control display system and control method thereof
CN103383598A (en) * 2012-05-04 2013-11-06 三星电子株式会社 Terminal and method for controlling the same based on spatial interaction
CN102749994A (en) * 2012-06-14 2012-10-24 华南理工大学 Indicating method for motion direction and speed strength of gesture in interaction system
CN104699238A (en) * 2013-12-10 2015-06-10 现代自动车株式会社 System and method for gesture recognition of vehicle
CN103793056A (en) * 2014-01-26 2014-05-14 华南理工大学 Mid-air gesture roaming control method based on distance vector
CN104216514A (en) * 2014-07-08 2014-12-17 深圳市华宝电子科技有限公司 Method and device for controlling vehicle-mounted device, and vehicle
CN107563286A (en) * 2017-07-28 2018-01-09 南京邮电大学 A kind of dynamic gesture identification method based on Kinect depth information

Also Published As

Publication number Publication date
CN110069137A (en) 2019-07-30

Similar Documents

Publication Publication Date Title
CN110069137B (en) Gesture control method, control device and control system
CN111017726B (en) Crane hook positioning method, device and system and engineering machinery
AU2018333452B2 (en) User interface for reversing a trailer with automated steering system
JP5326794B2 (en) Remote operation system and remote operation method
JP6743676B2 (en) Remote control terminal
TW201622916A (en) Robot and control method thereof
EP3831766B1 (en) Crane
EP3560881A1 (en) Operation control method and system for crane, and crane
CN111891922B (en) Crane operation real-time navigation system and method
EP3228760A1 (en) Travel control system and method for a work machine
SE1450785A1 (en) Method and a mobile electronic device for controlling a vehicle
CN110555913B (en) Virtual imaging method and device based on industrial human-computer interface
US20210349537A1 (en) Remote control of a device via a virtual interface
US11649146B2 (en) Safety system
JP2012179682A (en) Mobile robot system, mobile robot control device, and moving control method and moving control program to be used for the control device
CN105460785A (en) Control method and system for video monitoring and crane
EP3629135B1 (en) Action processing apparatus
KR101664968B1 (en) Position tracking device for riding basket of high place works car and its method
US12030751B2 (en) Crane information display system
CN113620191B (en) Crane operation protection method, device and system and crane
CN105442867A (en) Break-in device lifting system, fire engine and method
EP3778463A1 (en) Work vehicle
JP6922686B2 (en) Operating device
US20230333550A1 (en) Remote operation system, remote operation method, and storage medium
US11989841B2 (en) Operation system for industrial machinery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant