CN111124185B - Control method and device of equipment, server and storage medium

Info

Publication number
CN111124185B
Authority
CN
China
Prior art keywords
posture information
touch
relative
touch pad
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911364561.9A
Other languages
Chinese (zh)
Other versions
CN111124185A (en)
Inventor
陈勇健
张小军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Donika Avionics Co ltd
Original Assignee
Shenzhen Donika Avionics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Donika Avionics Co ltd filed Critical Shenzhen Donika Avionics Co ltd
Priority to CN201911364561.9A priority Critical patent/CN111124185B/en
Publication of CN111124185A publication Critical patent/CN111124185A/en
Application granted granted Critical
Publication of CN111124185B publication Critical patent/CN111124185B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

Embodiments of the present application provide a control method and apparatus for a device, a server, and a storage medium. The control method of the device includes the following steps: receiving a touch instruction issued through a touch pad to a controlled device; acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction; calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information; matching a control instruction corresponding to the touch instruction according to the relative posture information; and controlling the controlled device to execute the corresponding function in response to the control instruction. Because the control instruction corresponding to the touch instruction is matched using the relative posture of the touch pad with respect to the controlled device, the accuracy with which the touch pad controls the controlled device is improved.

Description

Control method and device of equipment, server and storage medium
Technical Field
Embodiments of the present application relate to the technical field of intelligent control, and in particular to a control method and apparatus for a device, a server, and a storage medium.
Background
On board an aircraft such as an airplane, a controlled device is often operated through a touch pad that is connected to it. For example, the seat-back screen of an aircraft can be connected to a touch pad so that a passenger controls the seat-back screen through the touch pad. As another example, a pilot in the cockpit may perform flight control of the aircraft through a touch pad.
Control of the controlled device by the touch pad may work as follows: when the touch pad is in portrait orientation, a top-to-bottom slide on the touch pad corresponds to a top-to-bottom slide on the controlled device; when the touch pad is in landscape orientation, a top-to-bottom slide on the touch pad corresponds to a left-right slide on the controlled device. A mainstream approach at present is to embed a gyroscope in the touch pad; the gyroscope is given a reference posture at the factory, so that whether the touch pad is in landscape or portrait orientation is judged from its posture relative to the ground, and the controlled device is then controlled accordingly.
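The ground-referenced judgment described above can be sketched as follows (illustrative only, not part of the patent; the axis convention with x along the short edge and y along the long edge of the pad is an assumption made here):

```python
# Classify the pad as landscape or portrait from the world "down" direction
# expressed in the pad's own frame, obtained from the ground-referenced
# (factory-calibrated) posture. Axis convention assumed: x = short edge,
# y = long edge, z = screen normal.
from scipy.spatial.transform import Rotation


def ground_referenced_orientation(pad_posture: Rotation) -> str:
    down_in_pad = pad_posture.inv().apply([0.0, 0.0, -1.0])  # world "down" in the pad frame
    if abs(down_in_pad[1]) >= abs(down_in_pad[0]):
        return "portrait"   # gravity runs mostly along the long edge
    return "landscape"      # gravity runs mostly along the short edge
```

During take-off, landing, or a banked turn this classification can flip even though the pad has not moved relative to the seat-back screen, which is exactly the problem described next.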
However, in some flight scenarios, such as take-off, landing, or banking turns at altitude, a top-to-bottom slide on the touch pad should still correspond to a top-to-bottom slide on the controlled device. Because the touch pad's gyroscope takes the earth as its reference posture, the touch pad may instead be judged to be in landscape orientation, so the user's top-to-bottom slide on the touch pad is mapped to a left-right slide on the controlled device. This results in poor control of the controlled device by the touch pad and an inaccurate control response.
Disclosure of Invention
Embodiments of the present application provide a device control method and apparatus, a server, and a storage medium, which improve the accuracy with which a touch pad controls a controlled device.
In a first aspect, an embodiment of the present application provides a method for controlling a device, including:
receiving a touch instruction issued through a touch pad to a controlled device;
acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction;
calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information;
matching a control instruction corresponding to the touch instruction according to the relative posture information;
and controlling the controlled device to execute the corresponding function in response to the control instruction.
Optionally, the first posture information is obtained through a first gyroscope built into the touch pad.
Optionally, the second posture information is acquired through a second gyroscope built into the controlled device.
Optionally, the relative posture information includes landscape posture information and portrait posture information, and matching the control instruction corresponding to the touch instruction according to the relative posture information includes:
when the relative posture information is landscape posture information, matching a first control instruction corresponding to the touch instruction according to the landscape posture information;
and when the relative posture information is portrait posture information, matching a second control instruction corresponding to the touch instruction according to the portrait posture information.
Optionally, calculating the relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information includes:
calculating a deflection angle, about a target axis, of the controlled device relative to the touch pad according to the first posture information and the second posture information;
when the deflection angle satisfies a first preset angle, the relative posture information is portrait posture information;
and when the deflection angle satisfies a second preset angle, the relative posture information is landscape posture information.
Optionally, the first preset angle is smaller than 45°, and the second preset angle is greater than or equal to 45°.
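Stated compactly (a sketch only; the patent gives no explicit formulas, so the rotation-matrix formulation, the use of atan2, and the assumption that the relative rotation is dominated by rotation about the target axis are additions here): with the first and second posture information written as rotation matrices $R_{\mathrm{pad}}$ and $R_{\mathrm{device}}$ in the shared reference frame,
$$R_{\mathrm{rel}} = R_{\mathrm{device}}^{\top} R_{\mathrm{pad}}, \qquad \theta = \left|\operatorname{atan2}\!\big((R_{\mathrm{rel}})_{21},\,(R_{\mathrm{rel}})_{11}\big)\right|,$$
and the relative posture information is portrait posture information when $\theta < 45^{\circ}$ and landscape posture information when $\theta \ge 45^{\circ}$.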
Optionally, before receiving the touch instruction issued through the touch pad to the controlled device, the method includes:
establishing a communication connection between the touch pad and the controlled device in a wireless communication mode.
In a second aspect, an embodiment of the present application provides a control apparatus for a device, including:
a receiving module, used for receiving a touch instruction issued through the touch pad to the controlled device;
an acquisition module, used for acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction;
a calculation module, used for calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information;
a matching module, used for matching a control instruction corresponding to the touch instruction according to the relative posture information;
and a control module, used for controlling the controlled device to execute the action corresponding to the control instruction.
In a third aspect, an embodiment of the present application provides a server, including:
one or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the device control method according to any embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the device control method according to any embodiment of the present application.
According to the embodiments of the present application, a touch instruction issued through the touch pad to the controlled device is received; first posture information of the touch pad and second posture information of the controlled device are acquired according to the touch instruction; relative posture information of the controlled device with respect to the touch pad is calculated according to the first posture information and the second posture information; a control instruction corresponding to the touch instruction is matched according to the relative posture information; and the controlled device is controlled to execute the corresponding function in response to the control instruction. This solves the problem that, when the ground is used as the reference posture, the touch pad controls the controlled device poorly and responds inaccurately in some flight scenarios, and achieves the effect of improving the accuracy with which the touch pad controls the controlled device.
Drawings
Fig. 1 is a flowchart of a device control method according to Embodiment 1 of the present application;
Fig. 2 is a flowchart of a device control method according to Embodiment 2 of the present application;
Fig. 3 is a schematic structural diagram of a device control apparatus according to Embodiment 3 of the present application;
Fig. 4 is a schematic structural diagram of a server according to Embodiment 4 of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present application are shown in the drawings.
Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts steps as a sequential process, many of the steps may be implemented in parallel, concurrently, or with other steps. Furthermore, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Furthermore, the terms "first," "second," and the like may be used herein to describe various directions, actions, steps, or elements, but these directions, actions, steps, or elements are not limited by these terms. These terms are only used to distinguish one direction, action, step, or element from another. For example, without departing from the scope of the present application, the first preset angle may be referred to as the second preset angle, and similarly, the second preset angle may be referred to as the first preset angle; both are preset angles, but they are not the same preset angle. The terms "first," "second," and the like are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present application, "a plurality of" means at least two, for example two or three, unless specifically defined otherwise.
Embodiment 1
Fig. 1 is a flowchart of a device control method according to Embodiment 1 of the present application. The embodiment is applicable to a scenario in which a touch pad is used to control a device. The method may be performed by a device control apparatus, which may be implemented in software and/or hardware and may be integrated on a server.
As shown in Fig. 1, the device control method according to Embodiment 1 of the present application includes:
S110, receiving a touch instruction issued through the touch pad to the controlled device.
The touch pad is a tool for controlling the controlled device. Specifically, operations performed on the touch pad cause the controlled device to execute corresponding operations. For example, when the touch pad is in a portrait state with respect to the controlled device, a slide from the top to the bottom of the touch pad corresponds to a slide from top to bottom on the controlled device. The controlled device is the device that carries out the operation corresponding to the touch pad operation. In this embodiment, the controlled device includes, but is not limited to, a seat-back screen on an aircraft, a console in the flight deck, and the like. The touch instruction is an instruction generated when the touch pad is operated. Specifically, the touch instruction can be converted into a control instruction for the controlled device, so that the controlled device executes the corresponding function.
S120, acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction.
The first posture information is the posture information of the touch pad. Optionally, the first posture information of the touch pad is obtained through a first gyroscope built into the touch pad; the manner in which the first posture information is acquired is not particularly limited in this embodiment. Specifically, the first gyroscope uses a fixed coordinate system, set at the factory, as its reference frame. Typically this reference frame is the geodetic coordinate system, although the reference frame of the first gyroscope is not particularly limited in this embodiment. The second posture information is the posture information of the controlled device. Optionally, the second posture information is acquired through a second gyroscope built into the controlled device. In particular, the reference frame of the second gyroscope is identical to the reference frame of the first gyroscope.
S130, calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information.
The relative posture information is the posture information of the controlled device relative to the touch pad. In this embodiment, the relative posture information may include, but is not limited to, landscape posture information and portrait posture information, and it can be calculated from the first posture information and the second posture information. In some scenarios, such as take-off, landing, or banking turns at altitude, if the controlled device were controlled according to the first posture information alone, the touch pad might be judged to be in landscape orientation even though, relative to the controlled device, it is in portrait orientation; a top-to-bottom slide on the touch pad would then be mapped to a left-right slide on the controlled device, giving the user a poor experience.
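A minimal sketch of this step follows, under assumptions the patent does not state: each posture is available as a quaternion in the shared factory reference frame, and SciPy is used for the rotation algebra.

```python
# Relative posture of the touch pad as seen from the controlled device.
# Quaternions are in SciPy's (x, y, z, w) order; both are expressed in the
# same reference frame (e.g. the geodetic frame used by the two gyroscopes).
from scipy.spatial.transform import Rotation


def relative_posture(pad_quat, device_quat):
    r_pad = Rotation.from_quat(pad_quat)        # first posture information
    r_device = Rotation.from_quat(device_quat)  # second posture information
    # Composing the inverse of the device posture with the pad posture cancels
    # the shared reference frame, leaving only the pad-relative-to-device part.
    return r_device.inv() * r_pad
```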
S140, matching a control instruction corresponding to the touch instruction according to the relative posture information.
The control instruction is an instruction that causes the controlled device to execute a corresponding operation. Specifically, the control instruction for the controlled device is obtained by matching the touch instruction of the touch pad according to the relative posture information. For example, when the touch instruction indicates that the touch position slides from the top to the bottom of the touch pad and the relative posture information is portrait posture information, the control instruction makes the cursor of the controlled device slide from top to bottom; when the touch instruction indicates that the touch position slides from the top to the bottom of the touch pad and the relative posture information is landscape posture information, the control instruction makes the controlled device slide from the left end to the right end.
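As an illustration of the matching step, a lookup-based sketch is given below; the swipe-direction names and the particular 90° remapping chosen for the landscape case are assumptions for illustration, not values taken from the patent.

```python
# Map a touch-pad swipe to a controlled-device action for the given relative
# posture. For landscape, one of the two possible 90° remappings is chosen.
LANDSCAPE_REMAP = {
    "top_to_bottom": "left_to_right",
    "bottom_to_top": "right_to_left",
    "left_to_right": "bottom_to_top",
    "right_to_left": "top_to_bottom",
}


def match_control_instruction(touch_instruction: str, relative_posture: str) -> str:
    if relative_posture == "portrait":
        # Pad and device agree on "up", so the swipe passes through unchanged.
        return touch_instruction
    # Landscape: the pad is rotated roughly 90° relative to the device, so the
    # swipe axis is rotated before being sent to the controlled device.
    return LANDSCAPE_REMAP[touch_instruction]
```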
S150, controlling the controlled device to execute the corresponding function in response to the control instruction.
Here, responding means that the controlled device carries out the control instruction and performs the corresponding function, for example the cursor of the controlled device sliding from top to bottom or from left to right, without limitation.
In the technical solution of this embodiment, a touch instruction issued through the touch pad to the controlled device is received; first posture information of the touch pad and second posture information of the controlled device are acquired according to the touch instruction; relative posture information of the controlled device with respect to the touch pad is calculated according to the first posture information and the second posture information; a control instruction corresponding to the touch instruction is matched according to the relative posture information; and the controlled device is controlled to execute the corresponding function in response to the control instruction. Because the control instruction is determined from both the touch instruction and the relative posture information, which is a more reliable reference than a ground-referenced posture alone, inaccurate control in certain scenarios is avoided, and the accuracy with which the touch pad controls the controlled device is improved.
Embodiment 2
Fig. 2 is a flowchart of a device control method according to Embodiment 2 of the present application. This embodiment further refines the above technical solution and is applicable to a scenario in which a touch pad is used to control a device. The method may be performed by a device control apparatus, which may be implemented in software and/or hardware and may be integrated on a server.
As shown in Fig. 2, the device control method according to Embodiment 2 of the present application includes:
S210, establishing a communication connection between the touch pad and the controlled device in a wireless communication mode.
The wireless communication may be, but is not limited to, WiFi communication or Bluetooth communication; preferably, Bluetooth communication is used. The touch pad establishes a communication connection with the controlled device over Bluetooth so that the touch pad can control the controlled device.
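One possible way to establish such a connection is sketched below; the patent does not name any protocol stack or library, so the use of the third-party bleak library, the BLE GATT characteristic, and the address and UUID placeholders are all assumptions for illustration.

```python
# Send a touch event to the controlled device over a Bluetooth LE connection.
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                       # hypothetical address
TOUCH_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"   # hypothetical characteristic


async def send_touch_event(payload: bytes) -> None:
    # The context manager connects on entry and disconnects on exit.
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.write_gatt_char(TOUCH_CHAR_UUID, payload)


if __name__ == "__main__":
    asyncio.run(send_touch_event(b"swipe:top_to_bottom"))
```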
S220, receiving a touch instruction issued through the touch pad to the controlled device.
The touch pad is a tool for controlling the controlled device. Specifically, operations performed on the touch pad cause the controlled device to execute corresponding operations. For example, when the touch pad is in a portrait state with respect to the controlled device, a slide from the top to the bottom of the touch pad corresponds to a slide from top to bottom on the controlled device. The controlled device is the device that carries out the operation corresponding to the touch pad operation. In this embodiment, the controlled device includes, but is not limited to, a seat-back screen on an aircraft, a console in the flight deck, and the like. The touch instruction is an instruction generated when the touch pad is operated. Specifically, the touch instruction can be converted into a control instruction for the controlled device, so that the controlled device executes the corresponding function.
S230, acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction.
The first posture information is the posture information of the touch pad. Optionally, the first posture information of the touch pad is obtained through a first gyroscope built into the touch pad; the manner in which the first posture information is acquired is not particularly limited in this embodiment. Specifically, the first gyroscope uses a fixed coordinate system, set at the factory, as its reference frame. Typically this reference frame is the geodetic coordinate system, although the reference frame of the first gyroscope is not particularly limited in this embodiment. The second posture information is the posture information of the controlled device. Optionally, the second posture information is acquired through a second gyroscope built into the controlled device. In particular, the reference frame of the second gyroscope is identical to the reference frame of the first gyroscope.
S240, calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information.
The relative posture information is the posture information of the controlled device relative to the touch pad. In this embodiment, the relative posture information may include, but is not limited to, landscape posture information and portrait posture information, and it can be calculated from the first posture information and the second posture information. In some scenarios, such as take-off, landing, or banking turns at altitude, if the controlled device were controlled according to the first posture information alone, the touch pad might be judged to be in landscape orientation even though, relative to the controlled device, it is in portrait orientation; a top-to-bottom slide on the touch pad would then be mapped to a left-right slide on the controlled device, giving the user a poor experience.
S250, matching a control instruction corresponding to the touch instruction according to the relative posture information.
The control instruction is an instruction that causes the controlled device to execute a corresponding operation. Specifically, the control instruction for the controlled device is obtained by matching the touch instruction of the touch pad according to the relative posture information. For example, when the touch instruction indicates that the touch position slides from the top to the bottom of the touch pad and the relative posture information is portrait posture information, the control instruction makes the cursor of the controlled device slide from top to bottom; when the touch instruction indicates that the touch position slides from the top to the bottom of the touch pad and the relative posture information is landscape posture information, the control instruction makes the controlled device slide from the left end to the right end.
In an optional implementation, the relative posture information includes landscape posture information and portrait posture information, and matching the control instruction corresponding to the touch instruction according to the relative posture information includes:
when the relative posture information is landscape posture information, matching a first control instruction corresponding to the touch instruction according to the landscape posture information;
and when the relative posture information is portrait posture information, matching a second control instruction corresponding to the touch instruction according to the portrait posture information.
Landscape posture information means that the touch pad is in landscape orientation relative to the controlled device; portrait posture information means that the touch pad is in portrait orientation relative to the controlled device. The first control instruction is the control instruction obtained by matching the touch instruction according to the landscape posture information. For example, when the touch instruction is a slide from the top to the bottom of the touch pad, the first control instruction slides the cursor of the controlled device from the left end to the right end or from the right end to the left end; when the touch instruction is a slide from the left end to the right end of the touch pad, the first control instruction slides the cursor of the controlled device from top to bottom or from bottom to top. The second control instruction is the control instruction obtained by matching the touch instruction according to the portrait posture information. For example, when the touch instruction is a slide from the top to the bottom of the touch pad, the second control instruction slides the cursor of the controlled device from top to bottom or from bottom to top; when the touch instruction is a slide from the left end to the right end of the touch pad, the second control instruction slides the cursor of the controlled device from the left end to the right end or from the right end to the left end.
In an optional embodiment, calculating the relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information includes:
calculating a deflection angle, about a target axis, of the controlled device relative to the touch pad according to the first posture information and the second posture information;
when the deflection angle satisfies a first preset angle, the relative posture information is portrait posture information;
and when the deflection angle satisfies a second preset angle, the relative posture information is landscape posture information.
The target axis is a specified axis. In this embodiment, the target axis is the axis perpendicular to the display interface of the controlled device; for example, when the controlled device is a seat-back screen, the target axis is the axis perpendicular to the display of the seat-back screen. The deflection angle is the angle of rotation about the target axis. Specifically, when the touch pad is exactly upright with respect to the controlled device, the deflection angle is 0°, where exactly upright means that the vertical symmetry axis of the touch pad coincides with the vertical symmetry axis of the controlled device. On an airplane the sitting postures of passengers vary, and this embodiment considers only the deflection angle about the target axis, so a user can control the controlled device normally even when reclining. For example, when a user reclines, the touch pad may be rotated by some angle, relative to the seat-back screen in front of the user, about an axis parallel to the seat-back display; because only the deflection angle about the target axis perpendicular to the display interface of the controlled device is considered, the user can still control the controlled device through the touch pad regardless of sitting posture. The first preset angle is the condition for judging the relative posture information to be portrait posture information: when the deflection angle satisfies the first preset angle, the relative posture information is portrait posture information. In this embodiment, optionally, the first preset angle is smaller than 45°, that is, when the deflection angle is smaller than 45° the relative posture information is portrait posture information. The second preset angle is the condition for judging the relative posture information to be landscape posture information: when the deflection angle satisfies the second preset angle, the relative posture information is landscape posture information. In this embodiment, optionally, the second preset angle is greater than or equal to 45°, that is, when the deflection angle is greater than or equal to 45° the relative posture information is landscape posture information.
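The angle test can be sketched as follows (assumptions not in the patent: both postures are available as SciPy rotations in the same reference frame, and the relative rotation is dominated by rotation about the target axis, taken as the z axis of the device's display frame).

```python
# Classify the relative posture from the deflection angle about the target axis.
import math

from scipy.spatial.transform import Rotation


def classify_relative_posture(r_pad: Rotation, r_device: Rotation,
                              threshold_deg: float = 45.0) -> str:
    r_rel = r_device.inv() * r_pad              # pad posture seen from the device
    m = r_rel.as_matrix()
    # In-plane rotation about the display normal, in degrees (0°..180°).
    deflection = abs(math.degrees(math.atan2(m[1, 0], m[0, 0])))
    if deflection < threshold_deg:              # first preset angle: smaller than 45°
        return "portrait"
    return "landscape"                          # second preset angle: 45° or more
```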
S260, controlling the controlled device to execute the corresponding function in response to the control instruction.
Here, responding means that the controlled device carries out the control instruction and performs the corresponding function, for example the cursor of the controlled device sliding from top to bottom or from left to right, without limitation.
In the technical solution of this embodiment, a touch instruction issued through the touch pad to the controlled device is received; first posture information of the touch pad and second posture information of the controlled device are acquired according to the touch instruction; relative posture information of the controlled device with respect to the touch pad is calculated according to the first posture information and the second posture information; a control instruction corresponding to the touch instruction is matched according to the relative posture information; and the controlled device is controlled to execute the corresponding function in response to the control instruction. Because the control instruction is determined from both the touch instruction and the relative posture information, which is a more reliable reference than a ground-referenced posture alone, inaccurate control in certain scenarios is avoided, and the accuracy with which the touch pad controls the controlled device is improved.
Embodiment 3
Fig. 3 is a schematic structural diagram of a device control apparatus according to Embodiment 3 of the present application. This embodiment is applicable to a scenario in which a touch pad is used to control a device; the apparatus may be implemented in software and/or hardware and may be integrated on a server.
As shown in Fig. 3, the device control apparatus provided in this embodiment may include a receiving module 310, an acquisition module 320, a calculation module 330, a matching module 340, and a control module 350, where:
the receiving module 310 is configured to receive a touch instruction issued through the touch pad to the controlled device; the acquisition module 320 is configured to acquire, according to the touch instruction, first posture information of the touch pad and second posture information of the controlled device; the calculation module 330 is configured to calculate, according to the first posture information and the second posture information, relative posture information of the controlled device with respect to the touch pad; the matching module 340 is configured to match a control instruction corresponding to the touch instruction according to the relative posture information; and the control module 350 is configured to control the controlled device to execute the action corresponding to the control instruction.
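A minimal wiring of these five modules is sketched below; the class, the method names, and the injected helper callables (for example a posture classifier and an instruction matcher such as the ones sketched earlier, and a controlled device exposing an execute method) are illustrative assumptions, not structures defined by the patent.

```python
# Illustrative composition of the five modules of the device control apparatus.
class DeviceController:
    def __init__(self, read_pad_posture, read_device_posture,
                 classify_relative_posture, match_control_instruction):
        self._read_pad_posture = read_pad_posture          # feeds acquisition module 320
        self._read_device_posture = read_device_posture    # feeds acquisition module 320
        self._classify = classify_relative_posture         # calculation module 330
        self._match = match_control_instruction            # matching module 340

    def handle(self, touch_instruction, controlled_device):
        # Receiving module 310: the touch instruction arrives as an argument.
        first = self._read_pad_posture()
        second = self._read_device_posture()
        relative = self._classify(first, second)
        control_instruction = self._match(touch_instruction, relative)
        # Control module 350: the controlled device executes the matched instruction.
        controlled_device.execute(control_instruction)
```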
Optionally, the first posture information is obtained through a first gyroscope built into the touch pad.
Optionally, the second posture information is acquired through a second gyroscope built into the controlled device.
Optionally, the relative posture information includes landscape posture information and portrait posture information, and the matching module 340 includes: a first control instruction matching unit, used for matching a first control instruction corresponding to the touch instruction according to the landscape posture information when the relative posture information is landscape posture information; and a second control instruction matching unit, used for matching a second control instruction corresponding to the touch instruction according to the portrait posture information when the relative posture information is portrait posture information.
Optionally, the calculation module 330 includes: a deflection angle calculation unit, used for calculating a deflection angle, about a target axis, of the controlled device relative to the touch pad according to the first posture information and the second posture information; and a posture information determination unit, used for determining that the relative posture information is portrait posture information when the deflection angle satisfies a first preset angle, and that the relative posture information is landscape posture information when the deflection angle satisfies a second preset angle.
Optionally, the first preset angle is smaller than 45°, and the second preset angle is greater than or equal to 45°.
Optionally, the apparatus further includes a communication connection module, used for establishing a communication connection between the touch pad and the controlled device in a wireless communication mode.
The device control apparatus provided in this embodiment of the application can execute the device control method provided in any embodiment of the application, and has the functional modules and beneficial effects corresponding to the executed method. For details not explicitly described in this embodiment, reference may be made to the description of any method embodiment of the application.
Embodiment 4
Fig. 4 is a schematic structural diagram of a server according to Embodiment 4 of the present application, showing a block diagram of an exemplary server 612 suitable for implementing embodiments of the application. The server 612 depicted in Fig. 4 is merely an example and is not meant to limit the functionality or scope of use of embodiments of the present application.
As shown in Fig. 4, the server 612 takes the form of a general-purpose server. Components of the server 612 may include, but are not limited to: one or more processors 616, a storage device 628, and a bus 618 that connects the various system components, including the storage device 628 and the processors 616.
The bus 618 represents one or more of several types of bus structures, including a storage device bus or storage device controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Server 612 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by server 612 and includes both volatile and nonvolatile media, removable and non-removable media.
The storage device 628 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 630 and/or cache memory 632. The server 612 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, the storage system 634 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 4, commonly referred to as a "hard drive"). Although not shown in Fig. 4, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk such as a compact disc read-only memory (CD-ROM), a digital versatile disc read-only memory (DVD-ROM), or other optical media, may be provided. In such cases, each drive may be coupled to the bus 618 through one or more data medium interfaces. The storage device 628 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the present application.
A program/utility 640 having a set (at least one) of program modules 642 may be stored, for example, in the storage 628, such program modules 642 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 642 generally perform the functions and/or methods of the described embodiments of the present application.
The server 612 may also communicate with one or more external devices 614 (e.g., a keyboard, a pointing device, a display 624, etc.), with one or more terminals that enable a user to interact with the server 612, and/or with any terminal (e.g., a network card, a modem, etc.) that enables the server 612 to communicate with one or more other computing terminals. Such communication may occur through an input/output (I/O) interface 622. The server 612 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 620. As shown in Fig. 4, the network adapter 620 communicates with the other modules of the server 612 over the bus 618. It should be appreciated that, although not shown, other hardware and/or software modules may be used in connection with the server 612, including, but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, Redundant Array of Independent Disks (RAID) systems, tape drives, data backup storage systems, and the like.
The processor 616 executes various functional applications and performs data processing by running programs stored in the storage device 628, for example implementing the device control method provided in any embodiment of the present application, which may include:
receiving a touch instruction issued through a touch pad to a controlled device;
acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction;
calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information;
matching a control instruction corresponding to the touch instruction according to the relative posture information;
and controlling the controlled device to execute the corresponding function in response to the control instruction.
In the technical solution of this embodiment, a touch instruction issued through the touch pad to the controlled device is received; first posture information of the touch pad and second posture information of the controlled device are acquired according to the touch instruction; relative posture information of the controlled device with respect to the touch pad is calculated according to the first posture information and the second posture information; a control instruction corresponding to the touch instruction is matched according to the relative posture information; and the controlled device is controlled to execute the corresponding function in response to the control instruction. Because the control instruction is determined from both the touch instruction and the relative posture information, which is a more reliable reference than a ground-referenced posture alone, inaccurate control in certain scenarios is avoided, and the accuracy with which the touch pad controls the controlled device is improved.
Embodiment 5
Embodiment 5 of the present application further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the device control method provided in any embodiment of the present application; the method may include:
receiving a touch instruction issued through a touch pad to a controlled device;
acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction;
calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information;
matching a control instruction corresponding to the touch instruction according to the relative posture information;
and controlling the controlled device to execute the corresponding function in response to the control instruction.
The computer-readable storage media of embodiments of the present application may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or terminal. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
In the technical solution of this embodiment, a touch instruction issued through the touch pad to the controlled device is received; first posture information of the touch pad and second posture information of the controlled device are acquired according to the touch instruction; relative posture information of the controlled device with respect to the touch pad is calculated according to the first posture information and the second posture information; a control instruction corresponding to the touch instruction is matched according to the relative posture information; and the controlled device is controlled to execute the corresponding function in response to the control instruction. Because the control instruction is determined from both the touch instruction and the relative posture information, which is a more reliable reference than a ground-referenced posture alone, inaccurate control in certain scenarios is avoided, and the accuracy with which the touch pad controls the controlled device is improved.
It should be noted that the above describes only the preferred embodiments of the present application and the technical principles applied. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions can be made by those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, it is not limited to them, and may be embodied in many other equivalent forms without departing from the spirit or scope of the application, which is set forth in the following claims.

Claims (9)

1. A control method of a device, characterized by comprising:
receiving a touch instruction issued through a touch pad to a controlled device;
acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction;
calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information;
matching a control instruction corresponding to the touch instruction according to the relative posture information;
and controlling the controlled device to execute the corresponding function in response to the control instruction;
wherein calculating the relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information comprises:
calculating a deflection angle, about a target axis, of the controlled device relative to the touch pad according to the first posture information and the second posture information;
when the deflection angle satisfies a first preset angle, the relative posture information is portrait posture information;
and when the deflection angle satisfies a second preset angle, the relative posture information is landscape posture information.
2. The control method of a device according to claim 1, wherein the first posture information is acquired through a first gyroscope built into the touch pad.
3. The control method of a device according to claim 1, wherein the second posture information is acquired through a second gyroscope built into the controlled device.
4. The control method of a device according to claim 1, wherein the relative posture information includes landscape posture information and portrait posture information, and matching the control instruction corresponding to the touch instruction according to the relative posture information comprises:
when the relative posture information is landscape posture information, matching a first control instruction corresponding to the touch instruction according to the landscape posture information;
and when the relative posture information is portrait posture information, matching a second control instruction corresponding to the touch instruction according to the portrait posture information.
5. The control method of a device according to claim 1, wherein the first preset angle is smaller than 45° and the second preset angle is greater than or equal to 45°.
6. The control method of a device according to claim 1, comprising, before receiving the touch instruction issued through the touch pad to the controlled device:
establishing a communication connection between the touch pad and the controlled device in a wireless communication mode.
7. A control apparatus of a device, characterized by comprising:
a receiving module, used for receiving a touch instruction issued through a touch pad to a controlled device;
an acquisition module, used for acquiring first posture information of the touch pad and second posture information of the controlled device according to the touch instruction;
a calculation module, used for calculating relative posture information of the controlled device with respect to the touch pad according to the first posture information and the second posture information;
a matching module, used for matching a control instruction corresponding to the touch instruction according to the relative posture information;
a control module, used for controlling the controlled device to execute the action corresponding to the control instruction;
wherein the calculation module comprises: a deflection angle calculation unit, used for calculating a deflection angle, about a target axis, of the controlled device relative to the touch pad according to the first posture information and the second posture information;
and a posture information determination unit, used for determining that the relative posture information is portrait posture information when the deflection angle satisfies a first preset angle, and that the relative posture information is landscape posture information when the deflection angle satisfies a second preset angle.
8. A server, comprising:
one or more processors;
a storage device for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the control method of a device according to any one of claims 1-6.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the control method of a device according to any one of claims 1-6.
CN201911364561.9A 2019-12-26 2019-12-26 Control method and device of equipment, server and storage medium Active CN111124185B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911364561.9A CN111124185B (en) 2019-12-26 2019-12-26 Control method and device of equipment, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911364561.9A CN111124185B (en) 2019-12-26 2019-12-26 Control method and device of equipment, server and storage medium

Publications (2)

Publication Number Publication Date
CN111124185A CN111124185A (en) 2020-05-08
CN111124185B (en) 2023-08-15

Family

ID=70502921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911364561.9A Active CN111124185B (en) 2019-12-26 2019-12-26 Control method and device of equipment, server and storage medium

Country Status (1)

Country Link
CN (1) CN111124185B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111949046A (en) * 2020-08-20 2020-11-17 中国商用飞机有限责任公司 Airplane, and flight mode control device and flight mode control method for airplane

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103674468A (en) * 2012-09-07 2014-03-26 通用电气航空系统有限责任公司 Method of determining a turbulent condition in an aircraft
CN105824535A (en) * 2016-03-25 2016-08-03 乐视控股(北京)有限公司 Setting adjusting method and terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3075987B1 (en) * 2017-12-21 2019-12-13 Thales METHOD AND SYSTEM FOR DUAL HARMONIZATION OF A HEAD-UP DISPLAY SYSTEM WITH AN INERTIAL ATTITUDE DEVICE REMOVABLE IN THE COCKPIT

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103674468A (en) * 2012-09-07 2014-03-26 通用电气航空系统有限责任公司 Method of determining a turbulent condition in an aircraft
CN105824535A (en) * 2016-03-25 2016-08-03 乐视控股(北京)有限公司 Setting adjusting method and terminal

Also Published As

Publication number Publication date
CN111124185A (en) 2020-05-08

Similar Documents

Publication Publication Date Title
US10054946B2 (en) Electronic device and operating method thereof
CN108279694B (en) Electronic device and control method thereof
US8761961B2 (en) Electronic device and method for controlling unmanned aerial vehicle using the same
US9633412B2 (en) Method of adjusting screen magnification of electronic device, machine-readable storage medium, and electronic device
CN109968979B (en) Vehicle-mounted projection processing method and device, vehicle-mounted equipment and storage medium
US11087633B2 (en) Simulation server capable of interacting with a plurality of simulators to perform a plurality of simulations
WO2020143676A1 (en) Unmanned aerial vehicle data processing method and apparatus, device and storage medium
US10311715B2 (en) Smart device mirroring
US20210026531A1 (en) Collaborative drawing method and electronic device therefor
US11521501B2 (en) Method, apparatus and system for operating waypoint, ground station and computer readable storage medium
KR20170132404A (en) Screen controlling method and electronic device supporting the same
US20140232748A1 (en) Device, method and computer readable recording medium for operating the same
CN111124185B (en) Control method and device of equipment, server and storage medium
WO2024002191A1 (en) Motion state determination method and apparatus, and electronic device and readable storage medium
CN113051022A (en) Graphical interface processing method and graphical interface processing device
US11157155B2 (en) Air line displaying method, apparatus and system, ground station and computer-readable storage medium
CN115061762A (en) Page display method and device, electronic equipment and medium
CN115097976A (en) Method, apparatus, device and storage medium for image processing
US11210862B1 (en) Data selection for spatial reconstruction
US20160293047A1 (en) Simulator for generating and exchanging simulation data for interacting with a portable computing device
CN112546613B (en) Equipment control method, device, equipment and storage medium
CN112947520B (en) Attitude control method and device for improving stability of low-speed aircraft under stall
US10235890B1 (en) System for navigating an aircraft display with a mobile device
US11875462B2 (en) Systems for augmented reality authoring of remote environments
CN111488768B (en) Style conversion method and device for face image, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant