CN111736511A - Equipment control method and device - Google Patents

Equipment control method and device

Info

Publication number
CN111736511A
Authority
CN
China
Prior art keywords
equipment
direction information
control terminal
current
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010691827.7A
Other languages
Chinese (zh)
Inventor
孔紫微
谢文皓
王璐璐
王大勇
菅磊
蔡瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute Of Technology Robot (yueyang) Military And Civilian Integration Research Institute
Original Assignee
Harbin Institute Of Technology Robot (yueyang) Military And Civilian Integration Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute Of Technology Robot (yueyang) Military And Civilian Integration Research Institute filed Critical Harbin Institute Of Technology Robot (yueyang) Military And Civilian Integration Research Institute
Priority to CN202010691827.7A
Publication of CN111736511A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G05B 19/02 - Programme-control systems electric
    • G05B 19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B 19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B 19/0428 - Safety, monitoring
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 - Program-control systems
    • G05B 2219/20 - Pc systems
    • G05B 2219/26 - Pc applications
    • G05B 2219/2603 - Steering car

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an equipment control method and device. The method comprises: acquiring current first direction information of a control terminal and a motion instruction sent by the control terminal to equipment, wherein the motion instruction is used for indicating the movement direction to be adjusted and the target movement speed of the equipment; determining a target movement direction of the equipment according to the first direction information and the movement direction to be adjusted; acquiring current second direction information of the equipment; and adjusting the second direction information to the target movement direction and the movement speed of the equipment to the target movement speed, wherein the first direction information and the second direction information are determined based on the same reference coordinate system. With the method and device, the equipment can be controlled from a third-person perspective, which lowers the difficulty of controlling the equipment and reduces control errors.

Description

Equipment control method and device
Technical Field
The invention relates to the field of computers, and in particular to an equipment control method and device.
Background
An existing combat robot trolley (hereinafter referred to simply as the trolley) is operated from a first-person perspective: the operator's viewing direction is always taken to coincide with the direction of the trolley's head, and commands are issued in the directions as the operator judges them. Pushing the joystick of the remote control forward drives the trolley forward in the direction of its head, and pulling the joystick back drives the trolley backward toward its tail. In other words, direction control in the conventional remote control scheme is referenced to the trolley's own coordinate system. In an intense match, however, this control scheme easily leads the operator to misjudge directions, and such operating errors put the operator at a disadvantage.
Disclosure of Invention
The invention mainly aims to provide an equipment control method and device, so as to solve the prior-art problem that operating a robot trolley from a first-person perspective easily leads to operating errors.
In order to achieve the above object, according to an aspect of the present invention, there is provided a control method of an apparatus, including: acquiring current first direction information of a control terminal and a motion instruction sent to equipment by the control terminal, wherein the motion instruction is used for indicating the motion direction to be adjusted and the target motion speed of the equipment; determining a target movement direction of the equipment according to the first direction information and the movement direction to be adjusted; acquiring current second direction information of the equipment; adjusting the second direction information to the target movement direction and the movement speed of the device to the target movement speed; wherein the first direction information and the second direction information are determined based on the same reference coordinate system.
In order to achieve the above object, according to another aspect of the present invention, there is provided an equipment control apparatus, comprising: a first acquisition module, configured to acquire current first direction information of a control terminal and a motion instruction sent by the control terminal to equipment, wherein the motion instruction is used for indicating the movement direction to be adjusted and the target movement speed of the equipment; a determining module, configured to determine a target movement direction of the equipment according to the first direction information and the movement direction to be adjusted; a second acquisition module, configured to acquire current second direction information of the equipment; and an adjusting module, configured to adjust the second direction information to the target movement direction and the movement speed of the equipment to the target movement speed; wherein the first direction information and the second direction information are determined based on the same reference coordinate system.
By applying the technical scheme of the invention, because the direction information of the control terminal and of the equipment is determined in the same reference coordinate system rather than in the coordinate system of the control terminal or of the equipment itself, the control direction of the control terminal stays consistent with the movement direction of the equipment; that is, the equipment is controlled from a third-person perspective rather than operated from a first-person perspective. The third-person mode lowers the difficulty of controlling the equipment and reduces control errors, thereby solving the prior-art problem that operating a robot trolley from a first-person perspective easily leads to operating errors.
In addition to the objects, features and advantages described above, the present invention has other objects, features and advantages, which will be described in further detail below with reference to the drawings.
Drawings
The accompanying drawings, which constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it. In the drawings:
FIG. 1 is a flow chart of a method of controlling a device according to an embodiment of the present application;
FIG. 2 is a first schematic structural diagram of a combat robot trolley and a handheld remote control terminal according to an embodiment of the application;
FIG. 3 is a second schematic structural diagram of the combat robot trolley and the handheld remote control terminal according to an embodiment of the application;
fig. 4 is a schematic structural diagram of a control device of an apparatus according to an embodiment of the present application.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances for describing embodiments of the invention herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should further be understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.
An embodiment of the present application provides a method for controlling a device, and fig. 1 is a flowchart of a method for controlling a device according to an embodiment of the present application, and as shown in fig. 1, the method includes the steps of:
step S102, acquiring current first direction information of a control terminal and a motion instruction sent by the control terminal to equipment, wherein the motion instruction is used for indicating the movement direction to be adjusted and the target movement speed of the equipment;
step S104, determining the target movement direction of the equipment according to the first direction information and the movement direction to be adjusted;
step S106, acquiring the current second direction information of the equipment;
step S108, adjusting the second direction information to be a target movement direction, and adjusting the movement speed of the equipment to be a target movement speed; wherein the first direction information and the second direction information are determined based on the same reference coordinate system.
Through steps S102 to S108, since the direction information of the control terminal and of the equipment is determined in the same reference coordinate system rather than in the coordinate system of the control terminal or of the equipment itself, the control direction of the control terminal stays consistent with the movement direction of the equipment; in effect, the equipment is controlled from a third-person perspective rather than operated from a first-person perspective. The third-person mode lowers the difficulty of controlling the equipment and reduces control errors, thereby solving the prior-art problem that operating a robot trolley from a first-person perspective easily leads to operating errors.
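For concreteness, one control cycle of steps S102 to S108 can be outlined as follows. This is a minimal sketch under the assumption that the "direction information" is a heading (yaw) angle in the shared reference frame and that the target direction is obtained by adding the commanded offset to the terminal's heading; the disclosure itself does not fix these details.

```python
def control_step(terminal_yaw_deg, stick_angle_deg, target_speed, device_yaw_deg):
    """Outline of one cycle of steps S102-S108 (an illustrative sketch,
    not the claimed implementation).

    terminal_yaw_deg: first direction information (terminal heading).
    stick_angle_deg, target_speed: the motion instruction.
    device_yaw_deg: second direction information (device heading).
    """
    # S104: target movement direction in the shared reference frame
    # (additive combination is one plausible reading of the disclosure).
    target_yaw_deg = (terminal_yaw_deg + stick_angle_deg) % 360.0

    # S108: heading change the device must make, taken along the shortest path,
    # plus the speed it should hold.
    heading_error = (target_yaw_deg - device_yaw_deg + 180.0) % 360.0 - 180.0
    return target_yaw_deg, heading_error, target_speed
```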
Optionally, the manner of acquiring the current first direction information of the control terminal and the motion instruction sent by the control terminal to the device, which is referred to in step S102 in this embodiment of the application, may further include:
step S102-11, acquiring first position information acquired through a first sensor arranged on a control terminal, and calculating a first attitude angle of the control terminal according to the first position information; wherein the first direction information comprises a first attitude angle;
step S102-12, acquiring an electric signal triggered on a control terminal, and converting the electric signal into a digital motion instruction; the electric signals are used for indicating the target movement direction and the target movement speed of the equipment.
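As an illustration of step S102-12, the sketch below assumes a two-axis joystick sampled by a 12-bit ADC; the deflection angle of the stick gives the movement direction to be adjusted (relative to the terminal) and the deflection magnitude gives the target movement speed. The ADC resolution, dead zone and scaling are assumptions, not values from the disclosure.

```python
import math

def stick_to_motion_instruction(adc_x, adc_y, adc_max=4095, max_speed=1.0):
    """Convert raw joystick ADC readings into a digital motion instruction.

    Returns (direction_deg, speed): the movement direction to be adjusted,
    measured relative to the terminal, and the target movement speed.
    A 12-bit ADC centered at adc_max / 2 is an illustrative choice.
    """
    # Normalize to [-1, 1] around the stick's center position.
    x = (adc_x - adc_max / 2) / (adc_max / 2)
    y = (adc_y - adc_max / 2) / (adc_max / 2)

    magnitude = min(math.hypot(x, y), 1.0)
    if magnitude < 0.05:            # small dead zone around the center
        return 0.0, 0.0

    # 0 deg = stick pushed straight forward; angles increase clockwise.
    direction_deg = math.degrees(math.atan2(x, y)) % 360.0
    return direction_deg, magnitude * max_speed
```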
For the first sensor involved in step S102, in a specific application scenario the first position information may be collected by a 9-axis sensor, that is, a sensor comprising a 3-axis gyroscope, a 3-axis accelerometer and a 3-axis magnetometer. A first attitude angle of the control terminal can then be calculated with a 9-axis attitude fusion algorithm. The first attitude angle may also be called an Euler angle; its reference frame is the NED (north-east-down) ground coordinate system, and it comprises the heading (yaw), roll and pitch angles. Of course, the 9-axis sensor is merely an example, and any other sensor capable of collecting the first position information falls within the scope of the present application.
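The disclosure names a 9-axis attitude fusion algorithm but does not spell it out. The sketch below is one common way to obtain the three Euler angles, assuming calibrated accelerometer (in g), gyroscope (in deg/s) and magnetometer readings: roll and pitch from the gravity vector, a tilt-compensated heading from the magnetometer, and a complementary filter to blend in integrated gyroscope rates. It is an illustrative stand-in, not the patented algorithm.

```python
import math

def _wrap180(angle_deg):
    """Wrap an angle to the interval [-180, 180)."""
    return (angle_deg + 180.0) % 360.0 - 180.0

class ComplementaryAHRS:
    """Illustrative 9-axis attitude estimate (roll, pitch, yaw in degrees).

    A minimal complementary filter using one common NED sign convention;
    accel in g, gyro in deg/s, mag in any consistent unit, dt in seconds.
    Body rates are treated as Euler-angle rates, which is adequate for
    small roll and pitch.
    """

    def __init__(self, alpha=0.98):
        self.alpha = alpha                  # weight of the integrated gyro
        self.roll = self.pitch = self.yaw = 0.0

    def update(self, accel, gyro, mag, dt):
        ax, ay, az = accel
        gx, gy, gz = gyro
        mx, my, mz = mag

        # Absolute roll/pitch from the gravity vector.
        roll_acc = math.degrees(math.atan2(ay, az))
        pitch_acc = math.degrees(math.atan2(-ax, math.hypot(ay, az)))

        # Tilt-compensated magnetic heading.
        r, p = math.radians(roll_acc), math.radians(pitch_acc)
        xh = mx * math.cos(p) + mz * math.sin(p)
        yh = (mx * math.sin(r) * math.sin(p) + my * math.cos(r)
              - mz * math.sin(r) * math.cos(p))
        yaw_mag = math.degrees(math.atan2(-yh, xh))

        # Integrate gyro rates, then pull toward the absolute references
        # along the shortest angular path.
        a = self.alpha
        self.roll += gx * dt
        self.pitch += gy * dt
        self.yaw += gz * dt
        self.roll += (1 - a) * _wrap180(roll_acc - self.roll)
        self.pitch += (1 - a) * _wrap180(pitch_acc - self.pitch)
        self.yaw += (1 - a) * _wrap180(yaw_mag - self.yaw)
        return self.roll, self.pitch, self.yaw
```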
It should be noted that, in the embodiment of the present application, the step S104 of determining the target movement direction of the device according to the first direction information and the movement direction to be adjusted may be executed on the control terminal side, or may be executed by the device after the motion instruction has been sent to the device. That is, the present application does not limit which component executes step S104.
Optionally, the manner of acquiring the current second direction information of the device in step S106 may further include: acquiring second position information collected by a second sensor arranged on the device, and calculating a second attitude angle of the device according to the second position information; wherein the second direction information comprises the second attitude angle, and the first sensor and the second sensor are sensors of the same type.
For the second sensor involved in step S106, in a specific application scenario the second position information may likewise be collected by a 9-axis sensor, that is, a sensor comprising a 3-axis gyroscope, a 3-axis accelerometer and a 3-axis magnetometer, and a second attitude angle of the device can then be calculated with a 9-axis attitude fusion algorithm. The second attitude angle may also be called an Euler angle; its reference frame is the NED ground coordinate system, and it comprises the heading (yaw), roll and pitch angles. Of course, the 9-axis sensor is merely an example, and any other sensor capable of collecting the second position information falls within the scope of the present application.
It should be noted that the reference coordinate system in the embodiment of the present application includes: a north-east-down (NED) ground coordinate system or a body coordinate system. Of course, these are merely examples; other coordinate systems may be used, as long as the direction information of the control terminal and of the device can be determined in them.
Optionally, in this embodiment of the present application, in a case that the reference coordinate system is a body coordinate system, before the obtaining of the current second direction information of the device, the method of this embodiment of the present application may further include:
step S1, judging whether the current attitude angle of the equipment is zero;
step S2, under the condition that the current attitude angle of the equipment is not zero, the current attitude angle of the equipment is adjusted to be zero;
step S3, when the current attitude angle of the device is zero, triggering the step of acquiring the current second direction information of the device.
As can be seen from steps S1 to S3, when the reference coordinate system is the body coordinate system, the device needs to be calibrated first; only after this calibration can the target direction and target speed of the device be determined accurately.
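As a concrete illustration of steps S1 to S3, the sketch below checks whether the current attitude is already the zero angle and, if not, records the current pose as the zero reference, as the trolley controller does in steps S36 and S47 of the embodiments below. The device interface and the tolerance are assumptions, not part of the disclosure.

```python
def ensure_zero_reference(device, tolerance_deg=0.5):
    """Steps S1-S3: make sure the device's current attitude is the zero angle.

    `device.attitude()` and `device.set_zero_reference()` are hypothetical
    interfaces; the tolerance is an illustrative choice.
    """
    yaw, roll, pitch = device.attitude()          # S1: read the current attitude
    if max(abs(yaw), abs(roll), abs(pitch)) > tolerance_deg:
        # S2: record the current pose as the new zero angle
        # (yaw = 0, roll = 0, pitch = 0).
        device.set_zero_reference()
    # S3: from here on, second direction information is read relative
    # to the recorded zero reference.
    return device.attitude()
```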
In the following, embodiments of the present application are illustrated with different reference coordinate systems and with the target direction and speed of the equipment computed in different places. In these specific embodiments the control terminal is a handheld remote control terminal and the equipment is a combat robot trolley; that is, what is described is how the handheld remote control terminal controls the combat robot trolley from a third-person perspective.
Embodiment 1): the reference coordinate system is the north-east-down (NED) ground coordinate system, and the target direction and speed are computed on the combat robot trolley side. The control method comprises the following steps:
and step S11, the hand-held remote control terminal controller calculates the attitude angle of the hand-held remote control terminal according to the 9-axis sensor original data on the terminal and the 9-axis attitude fusion algorithm.
Wherein, the reference system is an NED ground coordinate system, which includes a heading angle yaw, a roll angle, and a pitch angle pitch).
And step S12, the hand-held remote control terminal controller obtains the electric signals of the rocker and the key, and the electric signals are converted into digital motion instructions through the analog and digital.
The remote sensing and the key are used for determining the direction and the speed of the fighting robot trolley to be adjusted, and can be determined in other modes in other application scenes, such as touch screen control and the like.
And step S13, the hand-held remote control terminal transmits the terminal attitude angle and the motion instruction to the 2.4G data transmission module, and the 2.4G data transmission module converts the digital signal into a wireless signal and sends the wireless signal.
And step S14, the 2.4G wireless data transmission module of the fighting robot trolley receives the wireless signals and detects and transmits the effective information to the trolley controller. The car controller acquires direction information (attitude angle) and motion instructions (rocker and key signals) of the handheld remote control terminal.
And step S15, the wrestling robot car controller calculates the attitude angle of the car according to the 9-axis sensor original data on the car and the 9-axis attitude fusion algorithm.
Wherein, the reference system is an NED ground coordinate system, which comprises a heading angle yaw, a roll angle and a pitch angle pitch.
Step S16, the car controller of the fighting robot calculates the target direction and speed of the car according to the direction information (attitude angle) and the motion instruction (rocker and key signal) of the hand-held remote control terminal; and then, according to the current direction information (attitude angle) of the trolley, calculating and converting the PWM (Pulse width modulation) output value of the current left wheel and the current right wheel of the trolley.
And step S17, the trolley electric modulator of the fighting robot receives the PWM value output by the trolley controller, and controls the left and right wheel motors to reach corresponding rotating speeds, so as to complete the control of the trolley posture.
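Step S16 combines the terminal's attitude angle with the joystick instruction to obtain the trolley's target direction and speed in the shared NED frame. The exact combination is not spelled out in the disclosure; one plausible reading, sketched below, adds the joystick deflection angle (measured relative to the terminal) to the terminal's heading.

```python
def target_from_terminal(terminal_yaw_deg, stick_angle_deg, stick_speed):
    """One plausible reading of step S16 (also step S23 in Embodiment 2).

    terminal_yaw_deg: terminal heading in the shared NED frame.
    stick_angle_deg:  joystick deflection angle relative to the terminal
                      (0 = stick pushed straight away from the operator).
    stick_speed:      target movement speed from the stick magnitude.
    """
    target_yaw_deg = (terminal_yaw_deg + stick_angle_deg) % 360.0
    return target_yaw_deg, stick_speed
```

For example, with the terminal facing 30 degrees east of north and the joystick pushed straight forward, the target heading is 30 degrees, regardless of where the trolley's head currently points.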
Embodiment 2): the coordinate system is a North East Down (NED) coordinate system, and the target direction and speed are determined to be at the side of the handheld remote control terminal; the control method comprises the following steps:
step S21, the hand-held remote control terminal controller calculates the attitude angle of the hand-held remote control terminal according to the 9-axis sensor original data on the terminal and the 9-axis attitude fusion algorithm;
wherein, the reference system is an NED ground coordinate system, which comprises a heading angle yaw, a roll angle and a pitch angle pitch.
And step S22, the hand-held remote control terminal controller obtains the electric signals of the rocker and the key, and the electric signals are converted into digital motion instructions through the analog and digital.
And step S23, the controller of the hand-held remote control terminal calculates the target direction and speed of the movement of the trolley according to the direction information (attitude angle) and the movement instruction (rocker and key signal) of the hand-held remote control terminal.
And step S24, the hand-held remote control terminal transmits the target direction and speed of the trolley movement to the 2.4G data transmission module, and the 2.4G data transmission module converts the digital signal into a wireless signal and sends the wireless signal.
And step S25, the 2.4G wireless data transmission module of the fighting robot trolley receives the wireless signals and detects and transmits the effective information to the trolley controller. The car controller obtains the target direction and speed of the movement of the car.
Step S26, the fighting robot trolley controller calculates the attitude angle of the trolley according to the 9-axis sensor original data on the trolley and the 9-axis attitude fusion algorithm;
wherein, the reference system is an NED ground coordinate system, which comprises a heading angle yaw, a roll angle and a pitch angle pitch.
And step S27, calculating and converting the PWM output values of the current left and right wheels of the trolley according to the current direction information (attitude angle) and the target direction and speed of the movement of the trolley by the trolley.
And step S28, the trolley electric modulator of the fighting robot receives the PWM value output by the trolley controller, and controls the left and right wheel motors to reach corresponding rotating speeds, so as to complete the control of the trolley posture.
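Steps S16 to S17 and S27 to S28 turn the trolley's current heading, target heading and target speed into left- and right-wheel PWM values. The patent does not give the conversion; the sketch below assumes a differential-drive trolley and a simple proportional steering law, with the gain and PWM range as illustrative values rather than values from the disclosure.

```python
def _wrap180(angle_deg):
    return (angle_deg + 180.0) % 360.0 - 180.0

def wheel_pwm(current_yaw_deg, target_yaw_deg, target_speed,
              turn_gain=0.02, pwm_max=1000):
    """Illustrative differential-drive mixer for steps S17 / S27-S28.

    target_speed is normalized to [0, 1]; turn_gain and pwm_max are
    assumptions, not values from the disclosure.
    """
    heading_error = _wrap180(target_yaw_deg - current_yaw_deg)

    forward = target_speed                                   # drive component
    turn = max(-1.0, min(1.0, turn_gain * heading_error))    # steer component

    # Positive heading error means the target lies to the right of the nose,
    # so the left wheel speeds up and the right wheel slows down.
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return int(left * pwm_max), int(right * pwm_max)
```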
The structures of the combat robot trolley and the handheld remote control terminal corresponding to steps S11 to S17 and steps S21 to S28 are shown in Fig. 2. The combat robot trolley comprises a 9-axis sensor, a 2.4 GHz data transmission module, a trolley controller, and two channels of electronic speed controllers (ESCs) and motors. The handheld remote control terminal comprises a 2.4 GHz data transmission module, a remote control terminal controller, a joystick and keys, and a 9-axis sensor.
Embodiment 3): the coordinate system is a machine body coordinate system, and the target direction and the speed are determined to be on the trolley side of the fighting robot; the control method comprises the following steps:
step S31, the hand-held remote control terminal controller calculates the attitude angle of the hand-held remote control terminal according to the 6-axis sensor original data on the terminal and the 6-axis attitude fusion algorithm;
the reference system is a machine body coordinate system and comprises a heading angle yaw, a roll angle and a pitch angle pitch.
And step S32, the hand-held remote control terminal controller obtains the electric signals of the rocker and the key, and the electric signals are converted into digital motion instructions through the analog and digital.
And step S33, the hand-held remote control terminal transmits the terminal attitude angle and the motion instruction to the 2.4G data transmission module, and the 2.4G data transmission module converts the digital signal into a wireless signal and sends the wireless signal.
And step S34, the 2.4G wireless data transmission module of the fighting robot trolley receives the wireless signals and detects and transmits the effective information to the trolley controller. The car controller acquires direction information (attitude angle) and motion instructions (rocker and key signals) of the handheld remote control terminal.
And step S35, the trolley controller judges whether the current posture of the trolley needs to be calibrated to be the zero-angle position in the motion instruction. If yes, go to step S36; if not, go to step S37.
At step S36, the car controller records the current position of the car as a zero angle (yaw =0 °, roll =0 °, pitch =0 °).
Step S37, the fighting robot trolley controller calculates the attitude angle of the trolley according to the 6-axis sensor original data on the trolley and the 6-axis attitude fusion algorithm;
the reference system is a machine body coordinate system and comprises a heading angle yaw, a roll angle and a pitch angle pitch.
Step S38, the car controller of the fighting robot calculates the target direction and speed of the car according to the direction information (attitude angle) and the motion instruction (rocker and key signal) of the hand-held remote control terminal; and then, according to the current direction information (attitude angle) of the trolley, calculating and converting the PWM output values of the current left and right wheels of the trolley.
And step S39, the trolley electric modulator of the fighting robot receives the PWM value output by the trolley controller, and controls the left and right wheel motors to reach corresponding rotating speeds, so as to complete the control of the trolley posture.
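When the body coordinate system is used, as in this embodiment and in Embodiment 4) below, the attitude angles are expressed relative to the recorded zero-angle pose (steps S36 and S47) rather than to magnetic north. A minimal sketch of that conversion, with the stored reference treated as an assumption:

```python
def body_frame_yaw(raw_yaw_deg, zero_yaw_deg):
    """Yaw relative to the recorded zero-angle pose (steps S36 / S47).

    raw_yaw_deg comes from the 6-axis fusion; zero_yaw_deg is the yaw that
    was stored when the controller recorded the current position as zero.
    Returns the relative yaw wrapped to [-180, 180).
    """
    return (raw_yaw_deg - zero_yaw_deg + 180.0) % 360.0 - 180.0
```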
Embodiment 4): the coordinate system is a machine body coordinate system, and the target direction and speed are determined to be at the terminal side of the handheld rocker; the control method comprises the following steps:
step S41, the hand-held remote control terminal controller calculates the attitude angle of the hand-held remote control terminal according to the 6-axis sensor original data on the terminal and the 6-axis attitude fusion algorithm;
the reference system is a machine body coordinate system and comprises a heading angle yaw, a roll angle and a pitch angle pitch.
And step S42, the hand-held remote control terminal controller obtains the electric signals of the rocker and the key, and the electric signals are converted into digital motion instructions through the analog and digital.
And step S43, the controller of the hand-held remote control terminal calculates the target direction and speed of the movement of the trolley according to the direction information (attitude angle) and the movement instruction (rocker and key signal) of the hand-held remote control terminal.
And step S44, the hand-held remote control terminal transmits the target direction and speed of the trolley movement to the 2.4G data transmission module, and the 2.4G data transmission module converts the digital signal into a wireless signal and sends the wireless signal.
And step S45, the 2.4G wireless data transmission module of the fighting robot trolley receives the wireless signals, detects and transmits effective information to the trolley controller, and then the trolley controller acquires the target direction and speed of the trolley movement.
And step S46, the trolley controller judges whether the current posture of the trolley needs to be calibrated to be the zero-angle position in the received terminal signal. If yes, go to step S47; if not, go to step S48.
At step S47, the car controller records the current position of the car as a zero angle (yaw =0 °, roll =0 °, pitch =0 °).
Step S48, the fighting robot trolley controller calculates the attitude angle of the trolley according to the 6-axis sensor original data on the trolley and the 6-axis attitude fusion algorithm;
the reference system is a machine body coordinate system and comprises a heading angle yaw, a roll angle and a pitch angle pitch.
And step S49, calculating and converting the PWM output values of the current left and right wheels of the trolley according to the current direction information (attitude angle) and the target direction and speed of the movement of the trolley by the trolley.
And step S50, the trolley electric modulator of the fighting robot receives the PWM value output by the trolley controller, and controls the left and right wheel motors to reach corresponding rotating speeds, so as to complete the control of the trolley posture.
With respect to the structures of the fisting robot trolley and the handheld remote control terminal corresponding to the above steps S31 to S39, and steps S41 to S50, as shown in fig. 3, the fisting robot trolley includes: the 6-axis sensor (including 3-axis gyroscope, 3-axis accelerometer), 2.4G data transmission module, car controller and 2-channel electric regulation and motor. The handheld rocker terminal includes: the device comprises a 6-axis sensor (comprising a 3-axis gyroscope and a 3-axis accelerometer), a remote control terminal controller, a rocker and a key and a 2.4G data transmission module.
It can be seen that, in these embodiments, the trolley attitude angle obtained by the combat robot trolley in real time may be relative to the NED ground coordinate system or to the body coordinate system. Likewise, the terminal attitude angle obtained by the wireless remote control terminal in real time may be relative to the NED ground coordinate system or to the body coordinate system, but it must use the same reference frame as the trolley. The motion instruction given by the joystick and keys is a relation relative to the terminal's attitude, and this relation is always fixed, so the trolley's target attitude and speed can be calculated from the terminal's attitude angle and the motion instruction; this calculation may be performed either on the controller of the wireless remote control terminal or on the trolley controller.
Because the wireless remote control terminal and the combat robot trolley share the same motion reference frame, the trolley controller can correctly execute the corresponding action once it has obtained the trolley's target attitude angle and speed. This removes the first-person-perspective limitation on controlling the trolley's motion and realizes third-person-perspective control of the combat robot trolley.
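A short numeric illustration of this relation, using the same additive combination and shortest-path heading error assumed in the sketches above:

```python
terminal_yaw = 30.0        # terminal heading in the shared frame (degrees)
stick_angle = 45.0         # stick pushed forward-right relative to the terminal
stick_speed = 0.6          # 60 % of full speed

target_yaw = (terminal_yaw + stick_angle) % 360.0            # 75 degrees
trolley_yaw = 120.0        # the trolley's head currently points to 120 degrees

# Shortest-path heading error: the trolley must turn 45 degrees to the left,
# no matter which way its head happens to be pointing.
error = (target_yaw - trolley_yaw + 180.0) % 360.0 - 180.0   # -45 degrees
print(target_yaw, error, stick_speed)
```

The trolley turns 45 degrees to the left and moves off at 60 % speed toward heading 75 degrees, exactly as the operator sees the scene from the outside.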
An embodiment of the present application further provides a control apparatus of a device, fig. 4 is a schematic structural diagram of the control apparatus of the device in the embodiment of the present application, and as shown in fig. 4, the apparatus includes:
a first obtaining module 42, configured to obtain current first direction information of the control terminal and a motion instruction sent by the control terminal to the device, where the motion instruction is used to indicate the movement direction to be adjusted and the target movement speed of the device;
a determining module 44, configured to determine a target movement direction of the device according to the first direction information and the movement direction to be adjusted;
a second obtaining module 46, configured to obtain current second direction information of the device;
an adjusting module 48, configured to adjust the second direction information to a target movement direction, and adjust the movement speed of the device to a target movement speed;
wherein the first direction information and the second direction information are determined based on the same reference coordinate system.
Optionally, the first obtaining module 42 of the embodiment of the present application further includes: a first processing unit, configured to acquire first position information collected by a first sensor arranged on the control terminal and calculate a first attitude angle of the control terminal according to the first position information, where the first direction information comprises the first attitude angle; and a second processing unit, configured to acquire an electrical signal triggered on the control terminal and convert the electrical signal into a digital motion instruction, where the electrical signal is used to indicate the target movement direction and the target movement speed of the device.
Optionally, the second obtaining module 46 in this embodiment of the application is further configured to obtain second position information collected by a second sensor arranged on the device and calculate a second attitude angle of the device according to the second position information; the second direction information comprises the second attitude angle, and the first sensor and the second sensor are sensors of the same type.
Optionally, the reference coordinate system in the embodiment of the present application includes: a north-east-down (NED) ground coordinate system or a body coordinate system.
Optionally, in the case that the reference coordinate system is a body coordinate system, the apparatus of the embodiment of the present application may further include: the judging module is used for judging whether the current attitude angle of the equipment is a zero angle or not before acquiring the current second direction information of the equipment; the adjusting module is used for adjusting the current attitude angle of the equipment to be a zero angle under the condition that the current attitude angle of the equipment is not a zero angle; and the triggering module is used for triggering and executing the step of acquiring the current second direction information of the equipment under the condition that the current attitude angle of the equipment is zero.
In the embodiment of the application, because the direction information of the control terminal and of the equipment is determined in the same reference coordinate system rather than in the coordinate system of the control terminal or of the equipment itself, the control direction of the control terminal stays consistent with the movement direction of the equipment; in effect, the equipment is controlled from a third-person perspective rather than operated from a first-person perspective. This lowers the difficulty of controlling the equipment and reduces control errors, thereby solving the prior-art problem that operating a robot trolley from a first-person perspective easily leads to operating errors.
The relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Spatially relative terms, such as "above", "over", "on" and the like, may be used herein for ease of description to describe one device's or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that such spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" those other devices or configurations; thus, the exemplary term "above" can encompass both an orientation of "above" and of "below". The device may be otherwise oriented (rotated 90 degrees or in other orientations), and the spatially relative descriptors used herein are interpreted accordingly.
In the description of the present invention, it is to be understood that orientation terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom" usually refer to the orientations or positional relationships shown in the drawings, and are used only for convenience and simplicity of description. Unless stated otherwise, these orientation terms do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore they should not be construed as limiting the scope of the present invention; the terms "inner" and "outer" refer to directions toward and away from the profile of the respective component itself.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method of controlling a device, comprising:
acquiring current first direction information of a control terminal and a motion instruction sent to equipment by the control terminal, wherein the motion instruction is used for indicating the motion direction to be adjusted and the target motion speed of the equipment;
determining a target movement direction of the equipment according to the first direction information and the movement direction to be adjusted;
acquiring current second direction information of the equipment;
adjusting the second direction information to the target movement direction and the movement speed of the device to the target movement speed;
wherein the first direction information and the second direction information are determined based on the same reference coordinate system.
2. The method of claim 1, wherein the obtaining of the current first direction information of the control terminal and the motion instruction sent by the control terminal to the device comprises:
acquiring first position information acquired by a first sensor arranged on the control terminal, and calculating a first attitude angle of the control terminal according to the first position information; wherein the first direction information comprises the first attitude angle;
acquiring an electric signal triggered on the control terminal, and converting the electric signal into a digital motion instruction; wherein the electrical signals are used to indicate a target movement direction and a target movement speed of the device.
3. The method of claim 2, wherein obtaining current second direction information of the device comprises:
acquiring second position information acquired by a second sensor arranged on the equipment, and calculating a second attitude angle of the equipment according to the second position information; wherein the second direction information comprises the second attitude angle; the first sensor and the second sensor are the same type of sensor.
4. The method of claim 1, wherein the reference coordinate system comprises: a north-east-down (NED) coordinate system or a body coordinate system.
5. The method according to claim 4, wherein in the case that the reference coordinate system is the body coordinate system, before acquiring the current second direction information of the device, the method further comprises:
judging whether the current attitude angle of the equipment is a zero angle or not;
under the condition that the current attitude angle of the equipment is not a zero angle, adjusting the current attitude angle of the equipment to be a zero angle;
and under the condition that the current attitude angle of the equipment is a zero angle, triggering and executing the step of acquiring the current second direction information of the equipment.
6. A control apparatus of a device, characterized by comprising:
a first acquisition module, configured to acquire current first direction information of a control terminal and a motion instruction sent by the control terminal to equipment, wherein the motion instruction is used for indicating the movement direction to be adjusted and the target movement speed of the equipment;
the determining module is used for determining the target movement direction of the equipment according to the first direction information and the movement direction to be adjusted;
the second obtaining module is used for obtaining the current second direction information of the equipment;
the adjusting module is used for adjusting the second direction information into the target movement direction and adjusting the movement speed of the equipment into the target movement speed;
wherein the first direction information and the second direction information are determined based on the same reference coordinate system.
7. The apparatus of claim 6, wherein the first obtaining module comprises:
the first processing unit is used for acquiring first position information acquired by a first sensor arranged on the control terminal and calculating a first attitude angle of the control terminal according to the first position information; wherein the first direction information comprises the first attitude angle;
the second processing unit is used for acquiring an electric signal triggered on the control terminal and converting the electric signal into a digital motion instruction; wherein the electrical signals are used to indicate a target movement direction and a target movement speed of the device.
8. The apparatus according to claim 7, wherein the second obtaining module is further configured to obtain second position information acquired by a second sensor disposed on the equipment, and calculate a second attitude angle of the equipment according to the second position information; wherein the second direction information comprises the second attitude angle; the first sensor and the second sensor are the same type of sensor.
9. The apparatus of claim 6, wherein the reference coordinate system comprises: a north-east-down (NED) coordinate system or a body coordinate system.
10. The apparatus of claim 9, wherein in the case that the reference coordinate system is the body coordinate system, the apparatus further comprises:
the judging module is used for judging whether the current attitude angle of the equipment is zero angle or not before acquiring the current second direction information of the equipment;
the adjusting module is used for adjusting the current attitude angle of the equipment to be a zero angle under the condition that the current attitude angle of the equipment is not a zero angle;
and the triggering module is used for triggering and executing the step of acquiring the current second direction information of the equipment under the condition that the current attitude angle of the equipment is a zero angle.
CN202010691827.7A 2020-07-17 2020-07-17 Equipment control method and device Pending CN111736511A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010691827.7A CN111736511A (en) 2020-07-17 2020-07-17 Equipment control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010691827.7A CN111736511A (en) 2020-07-17 2020-07-17 Equipment control method and device

Publications (1)

Publication Number Publication Date
CN111736511A true CN111736511A (en) 2020-10-02

Family

ID=72654909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010691827.7A Pending CN111736511A (en) 2020-07-17 2020-07-17 Equipment control method and device

Country Status (1)

Country Link
CN (1) CN111736511A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103177545A (en) * 2011-12-26 2013-06-26 联想(北京)有限公司 Remote controller, mobile equipment and method for controlling mobile equipment by using remote controller
CN105391988A (en) * 2015-12-11 2016-03-09 谭圆圆 Multi-view unmanned aerial vehicle and multi-view display method thereof
CN105835984A (en) * 2016-05-10 2016-08-10 方绍峡 Hexapod bionic robot
CN106695741A (en) * 2017-02-10 2017-05-24 中国东方电气集团有限公司 Method for system state detection and initial work of mobile robot
CN106973261A (en) * 2017-03-03 2017-07-21 湖北天专科技有限公司 With the equipment, system and method for third party's view unmanned plane
CN108018796A (en) * 2017-11-21 2018-05-11 浙江工业大学 A kind of tide track altering system and method based on incremental encoder

Similar Documents

Publication Publication Date Title
CN111386504B (en) Golf cart system for automatic driving based on accurate position information and golf cart control method using the same
US20190250619A1 (en) Autonomous bicycle
EP3299920B1 (en) Unmanned aerial vehicle control method and device based on no-head mode
KR100807449B1 (en) Control device for legged robot
AU2018333452B2 (en) User interface for reversing a trailer with automated steering system
CN106527439B (en) Motion control method and device
CN104603706A (en) Autopilot control arrangement and methods
US20160229057A1 (en) Unmanned ground vehicle stability control
CN104914988A (en) Gesture recognition apparatus and control method of gesture recognition apparatus
CN106774318B (en) Multi-agent interactive environment perception and path planning motion system
JPH11198075A (en) Behavior support system
KR20130001955A (en) Remote control system and the method for automatic welding
Yim et al. Drift-free roll and pitch estimation for high-acceleration hopping
JP2018513496A (en) Foot contact position tracking device, method for controlling movement thereof, computer-executable program, and computer-readable non-transitory information recording medium storing the same
CN111736511A (en) Equipment control method and device
CN105139619A (en) System capable of simulating moving posture of remote-control moving device
CN111752282A (en) Equipment control method and device
CN205594454U (en) Mobile electronic equipment
JP2004016275A (en) Mobile body
WO2019140807A1 (en) Operating method for stabilizer somatosensory remote control system
CN110371170B (en) Intelligent power assisting device, system and method for controlling intelligent power assisting device to provide power assistance
CN107567605B (en) Improved method for correcting trajectories in a personal mobility assistance device equipped with a sensor
CN110674762B (en) Method for detecting human body in automatic doll walking process
CN108710443B (en) Displacement data generation method and control system
JP2002258944A (en) Remote controller for mobile cart and steering input device for mobile cart

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication
Application publication date: 20201002