CN112313590A - Control method, device and system of intelligent equipment and storage medium - Google Patents


Info

Publication number
CN112313590A
CN112313590A (application number CN201980040074.9A)
Authority
CN
China
Prior art keywords
user operation
operation information
behavior
instruction
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980040074.9A
Other languages
Chinese (zh)
Inventor
杜凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112313590A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808: Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A control method, apparatus, and system for a smart device, and a storage medium, are provided. The method includes: acquiring a user operation instruction, where the user operation instruction includes user operation information representing a behavior of the smart device; and storing the user operation information, so that when a preset trigger condition is met, the smart device is controlled to execute the behavior according to the stored user operation information. Because the user operation information in the user operation instruction is input by the user, the behavior it represents can be more accurate, precise, natural, and lively than an algorithmically generated one. Therefore, whenever the smart device needs to perform the behavior again, the stored user operation information can be retrieved and the smart device controlled to execute the corresponding behavior, with correspondingly more accurate and natural results.

Description

Control method, device and system of intelligent equipment and storage medium
Technical Field
The embodiments of the present application relate to the technical field of smart devices, and in particular to a control method, apparatus, system, and storage medium for a smart device.
Background
In recent years, with the development and progress of smart devices, smart devices have begun to be applied in many fields. A smart device, such as a robot, can perform various behaviors and actions.
In the prior art, an intelligent algorithm may be stored in a smart device, and the smart device may perform certain behaviors and actions when it detects a user operation or when an automatic trigger fires.
However, when the smart device performs behaviors and actions according to such an intelligent algorithm, its movements are stiff and not smooth, so its behaviors and actions are not accurate enough, and the precision with which the smart device completes actions is low.
Disclosure of Invention
The embodiments of the present application provide a control method, apparatus, system, and storage medium for a smart device, so that the behaviors and actions of the smart device are accurate, the precision of its actions is improved, and its movements are no longer stiff but lifelike.
In a first aspect, an embodiment of the present application provides a method for controlling an intelligent device, including:
acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment;
and storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when a preset trigger condition is met.
In a second aspect, an embodiment of the present application provides a control method for an intelligent device, which is applied to a control device for the intelligent device, and the method includes:
acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment;
and sending the user operation information to the intelligent equipment so that the intelligent equipment stores the user operation information and executes the behavior according to the user operation information when a preset trigger condition is met.
In a third aspect, an embodiment of the present application provides a control apparatus for an intelligent device, including: a processor and a memory;
the memory for storing program code;
the processor, configured to invoke the program code, when the program code is executed, is configured to perform the following:
acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment;
and storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when a preset trigger condition is met.
In a fourth aspect, an embodiment of the present application provides a control system for an intelligent device, including: a control device and the smart device;
the control device is used for acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent device, and sending the behavior operation information to the intelligent device;
and the smart device is configured to store the user operation information and to execute the behavior according to the user operation information when a preset trigger condition is met.
In a fifth aspect, an embodiment of the present application provides a readable storage medium on which a computer program is stored; when executed, the computer program implements the control method of the smart device according to the first aspect, or the control method of the smart device according to the second aspect.
In a sixth aspect, the present application provides a program product, where the program product includes a computer program stored in a readable storage medium, and the computer program is readable from the readable storage medium by at least one processor of a control device of a smart device, and the at least one processor executes the computer program to make the control device of the smart device implement the control method of the smart device according to the first aspect, or implement the control method of the smart device according to the second aspect.
According to the control method, apparatus, system, and storage medium for a smart device provided by the embodiments of the present application, a user operation instruction is acquired, where the instruction includes user operation information representing a behavior of the smart device; the user operation information is then stored, so that when a preset trigger condition is met, the smart device is controlled to execute the behavior according to the stored information. In this way, the user's operation instruction for the smart device is captured, and the user operation information in the instruction is stored; when it is detected that the preset trigger condition is met, the stored user operation information is retrieved to control the smart device to execute the behavior that information represents. Because the user operation information in the instruction is input by the user, the behavior it represents can be more accurate, precise, natural, and lively than an algorithmically generated one; therefore, whenever the smart device needs to perform the behavior again, the stored user operation information can be retrieved and the device controlled to execute the corresponding behavior, with correspondingly more accurate and natural results.
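The summary above describes what is essentially a record-then-replay scheme: capture the operator's inputs, store them, and play them back when a trigger fires. The following is a minimal sketch of that idea in Python; the class and method names, the dictionary payloads, and the `execute` interface are all invented for illustration and do not come from the patent.

```python
class OperationRecorder:
    """Record-then-replay sketch of the claimed method.

    The device interface (`execute`) and the payload format are
    placeholders; the patent does not specify an API.
    """

    def __init__(self):
        self._stored = []  # stored user operation information, in order

    def record(self, operation_info):
        # Acquire a user operation instruction and store the user
        # operation information it contains.
        self._stored.append(operation_info)

    def replay(self, device):
        # Once a preset trigger condition is met, control the device
        # to execute each recorded behavior in the original order.
        for info in self._stored:
            device.execute(info)


class FakeDevice:
    """Stand-in for the smart device, used only to demonstrate replay."""

    def __init__(self):
        self.executed = []

    def execute(self, info):
        self.executed.append(info)


rec = OperationRecorder()
rec.record({"behavior": "move", "speed_mps": 0.5})
rec.record({"behavior": "shoot", "angle_deg": 30})

dev = FakeDevice()
rec.replay(dev)  # would be called when the trigger condition is met
```

Replaying the stored information reproduces the operator's original (human, hence natural-looking) inputs rather than synthesizing motion from an algorithm, which is the core of the claimed improvement.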
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a first schematic view of an application scenario provided in the present application;
fig. 2 is a second schematic view of an application scenario provided by the present application;
fig. 3 is a flowchart of a control method of an intelligent device according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating user operation information input provided by the present application;
FIG. 5 is a schematic representation of the behavior of an unmanned vehicle provided herein;
fig. 6 is a first schematic action diagram of a control device of an unmanned vehicle provided by the present application;
fig. 7 is a second schematic action diagram of a control device of the unmanned vehicle provided by the present application;
fig. 8 is a flowchart of a control method for an intelligent device according to another embodiment of the present application;
fig. 9 is a flowchart of a control method for an intelligent device according to another embodiment of the present application;
FIG. 10 is a schematic illustration of an editing process provided herein;
fig. 11 is a flowchart of a control method of an intelligent device according to yet another embodiment of the present application;
FIG. 12 is a first schematic diagram of an interactive interface provided in an embodiment of the present application;
FIG. 13 is a second schematic view of an interactive interface provided in an embodiment of the present application;
fig. 14 is a flowchart of a control method for an intelligent device according to another embodiment of the present application;
fig. 15 is a schematic structural diagram of a control apparatus of an intelligent device according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a control system of an intelligent device according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When a component is referred to as being "connected" to another component, it can be directly connected to the other component or intervening components may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Some terms in the present application are explained below to facilitate understanding by those skilled in the art. It should be noted that, when the scheme of the embodiment of the present application is applied to an intelligent device or a control device of the intelligent device, names of the control device and the control device of the intelligent device may be changed, but this does not affect the implementation of the scheme of the embodiment of the present application.
1) Movable platform: including, but not limited to, drones and smart robots.
2) "Plurality" means two or more; other quantifiers are analogous. "And/or" describes an association between objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the objects before and after it.
3) "Correspond" may refer to an association or binding relationship: "A corresponds to B" means there is an association or binding relationship between A and B.
It should be noted that the terms or terms referred to in the embodiments of the present application may be mutually referred and are not described in detail.
The embodiments of the present application provide a control method, apparatus, system, and storage medium for a smart device. Fig. 1 is a first schematic view of an application scenario provided by the present application, and fig. 2 is a second schematic view of an application scenario provided by the present application. As shown in figs. 1 and 2, the control method of the smart device may be applied to the smart device itself or to a control device of the smart device. The smart device may be a movable platform; movable platforms include, but are not limited to, drones, smart robots, unmanned vehicles, and the like. For example, fig. 1 shows a drone and fig. 2 shows a smart robot.
The control method of the intelligent device can also be applied to any device or system, and further the control method of the intelligent device provided by the application is completed.
It should be understood that the above-described nomenclature for the various components of the device is for identification purposes only, and should not be construed as limiting the embodiments of the present application.
In the prior art, an intelligent algorithm can be stored in an intelligent device, and the intelligent device can perform some behaviors and actions when detecting user operation or performing automatic triggering. However, when the intelligent device performs actions and movements according to the intelligent algorithm, the actions of the intelligent device are relatively rigid and not smooth, so that the actions and movements of the intelligent device are not accurate enough, and the precision of completing the actions by the intelligent device is low.
The control method, device, system and storage medium of the intelligent device provided by the embodiment can solve the problems.
Fig. 3 is a flowchart of a control method of an intelligent device according to an embodiment of the present application, and as shown in fig. 3, the method of the present embodiment may include:
s101, obtaining a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent device.
In this embodiment, the execution subject may be the smart device itself, or a control device of the smart device. The smart device may be a smart robot, a drone, an unmanned vehicle, or any other device capable of posture and/or position changes. For example, the smart device is a drone, which is itself a movable platform; alternatively, the smart device is provided with a movable platform, i.e., the smart device includes a movable platform.
In this embodiment, the execution subject is taken to be a smart device, specifically an unmanned vehicle, by way of example. The unmanned vehicle may include a chassis and a gimbal mounted on the chassis; the chassis may be driven by, for example, omnidirectional wheels, and at least one of a shooting device, a speaker, a camera, etc. may be mounted on the gimbal. The gimbal may be a single-axis, two-axis, or three-axis gimbal.
When a user operates the intelligent device to perform related behaviors, the user can send a user operation instruction to the intelligent device through the controller, or the user directly sends the user operation instruction to the intelligent device; furthermore, the user can control the intelligent device to perform a series of behaviors through the user operation instruction. The controller may be a remote control device connected to the intelligent device, or may be a controller on the intelligent device.
In order to enable the intelligent device to perform a series of actions, one or more pieces of user operation information may be included in the user operation instruction. And the user operation information is used for indicating the intelligent equipment to perform one or more actions. The actions performed by the smart device include, for example, a shooting action, a moving action, an accelerating action, and a decelerating action.
In one example, the user may input the user operation information into the smart device in one or more of the following ways: a touch control mode, a physical control mode, a body sensing trigger mode and a voice trigger mode. Thus, the user operation information includes any one or more of: touch information, physical control operation information, somatosensory information and voice information.
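The four input modalities listed above could be modeled as a tagged record carrying modality-specific data. The following sketch is an illustration only; the enum members, field names, and payload format are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto


class InputModality(Enum):
    """The four input modes named above (member names are illustrative)."""
    TOUCH = auto()          # touch information
    PHYSICAL = auto()       # physical control (stick/button) information
    SOMATOSENSORY = auto()  # body-motion (e.g. IMU) information
    VOICE = auto()          # voice information


@dataclass
class UserOperationInfo:
    modality: InputModality
    payload: dict          # raw data for this modality, e.g. {"x": 120}
    timestamp_ms: int = 0  # when the operation was captured


# A touch on the interactive interface, captured as one piece of
# user operation information:
op = UserOperationInfo(InputModality.TOUCH, {"x": 120, "y": 45}, 1000)
```

Tagging each stored record with its modality lets the device replay touch, stick, motion, and voice inputs through the appropriate handler later.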
For example, an execution subject is taken as an intelligent device for exemplary explanation, and a touch controller is arranged on the intelligent device, for example, the touch controller is an intelligent display screen; a user can touch the touch controller, so that touch information is input into the touch controller, and the touch information indicates behaviors which need to be executed by the intelligent equipment; the touch controller sends the touch information to the intelligent equipment; and then, the intelligent device executes the behavior indicated by the touch information according to the touch information.
For another example, fig. 4 is a schematic diagram of user operation information input provided by the present application, and as shown in fig. 4, an execution subject is taken as a control device for exemplary illustration, a remote control device is provided, a touch key may be displayed on an interactive interface of the remote control device, and the remote control device and the smart device may communicate with each other; as shown in fig. 4, the remote control device shows 2 virtual joysticks; a user can touch the virtual rocker so as to input touch information into the remote control equipment, wherein the touch information indicates behaviors which need to be executed by the intelligent equipment; the remote control equipment sends the touch information to the intelligent equipment; and then, the intelligent device executes the behavior indicated by the touch information according to the touch information. The control device may include, but is not limited to, a mobile terminal such as a mobile phone.
Illustratively, when a user operates the smart device, the smart device may obtain a user operation instruction containing one or more pieces of user operation information. To make the pieces of user operation information easy to chain together and to clearly indicate the behavior of the smart device, each piece of user operation information may include behavior parameters, where a behavior parameter indicates a behavior of the smart device; one parameter may indicate one behavior, or several parameters together may indicate one behavior. Illustratively, the behavior parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
For example, the user operates the smart device to move and shoot, and then the smart device is required to complete the movement behavior and the shooting behavior, and at this time, the movement parameters and the shooting parameters can be acquired. For example, movement parameters include, but are not limited to, distance moved, speed moved, acceleration moved, and firing parameters include, but are not limited to, angle of fire, range distance.
For another example, the user operates the smart device to perform a limb activity, and then the smart device is required to complete the limb activity, for example, a mechanical arm is installed on the smart device; at this time, the intelligent device can acquire the pose parameters of the mechanical arm. For example, the pose parameters include, but are not limited to, the position of the robot arm of the smart device, and the moving speed of the robot arm of the smart device.
For another example, when the user operates the intelligent device to perform operations of various skills, the intelligent device is required to complete the related skills; at this time, the intelligent device may obtain a skill parameter, where the skill parameter may be a parameter corresponding to a special function, or may be enabling or disabling of some functions. For example, skills include, but are not limited to, jumping, shooting, filming, infrared emission, successive revolutions, and the like. Skill parameters include, but are not limited to, skill category, various types of parameters of skill. For example, skill parameters of a skill include, but are not limited to, jump speed, jump distance. For example, the skill parameters of the shooting skill include, but are not limited to, a shooting range, a shooting subject, a shooting time, and a shooting angle. For example, the skill parameters of infrared skills include, but are not limited to, the range of infrared rays, the angle of infrared rays.
For another example, the user operates the intelligent device to send different voices, and the intelligent device is required to send related voices; at this time, the smart device may obtain the voice parameters. The speech parameters include, but are not limited to, volume, speech content, character type.
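The parameter categories above (movement, pose, shooting, skill, audio) could be grouped into one behavior record, with each category optional so that a single behavior can combine several of them, as in the "move and shoot" example. This is a sketch with invented field names, not the patent's data model; only two of the five categories are written out.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MovementParams:
    distance_m: float = 0.0
    speed_mps: float = 0.0
    acceleration_mps2: float = 0.0


@dataclass
class ShootingParams:
    angle_deg: float = 0.0
    range_m: float = 0.0


@dataclass
class BehaviorParams:
    """One behavior may be described by one or more parameter groups.

    Pose, skill, and audio parameter groups would follow the same
    pattern as the two shown here.
    """
    movement: Optional[MovementParams] = None
    shooting: Optional[ShootingParams] = None


# "Move and shoot": both movement and shooting parameters are present.
b = BehaviorParams(movement=MovementParams(2.0, 0.5),
                   shooting=ShootingParams(angle_deg=30.0, range_m=8.0))
```

Leaving unused groups as `None` keeps a stored record compact while still allowing compound behaviors.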
And S102, storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when the preset trigger condition is met.
In this embodiment, since the intelligent device obtains the user operation information, the intelligent device can store the user operation information, and then the intelligent device stores the behavior of the intelligent device under the user operation instruction.
In one example, the smart device may detect an external trigger condition.
In one example, the smart device may detect an active trigger by the user. For example, the user issues a trigger instruction through voice, touch, pose, and the like.
In one example, the smart device may detect state information of the smart device itself or environment information recognized by the smart device to determine whether the state of the smart device satisfies a trigger condition. At this time, the state of the smart device includes but is not limited to: the voice state of the intelligent device, the position state of the intelligent device, the pose state of the intelligent device and the skill state of the intelligent device. For example, it is detected whether the speed of the smart device satisfies a preset speed, or whether the pose of the smart device is a preset pose, or whether the position of the smart device is within a preset position range.
When it is determined that the preset trigger condition is met, the smart device can further be controlled to execute, according to the stored user operation information, the behavior that information represents.
Illustratively, the preset trigger condition may be a preset user operation, for example, the preset user operation may be a preset action of a user, a preset pose of the user, and a preset audio of the user. Alternatively, the preset trigger condition may be an external environmental condition, for example, the external environmental condition may be a preset environmental temperature, a preset environmental humidity, a preset environmental noise, or a preset environmental identifier. Alternatively, the preset trigger condition may be a preset state of the smart device, for example, the preset state of the smart device may be a preset position of the smart device, a preset motion state of the smart device, and a preset pose state of the smart device.
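As an illustration of the device-state class of trigger conditions (the third class above), the check below fires when the device's speed reaches a preset value or its position falls inside a preset range. The thresholds, the rectangular-area shape, and all names are invented for this sketch; the patent does not fix concrete values.

```python
from dataclasses import dataclass


@dataclass
class DeviceState:
    speed_mps: float
    position: tuple  # (x, y) in metres


def trigger_met(state: DeviceState,
                preset_speed: float = 1.0,
                preset_area=((0.0, 0.0), (5.0, 5.0))) -> bool:
    """Device-state trigger: true if the speed meets the preset speed,
    or the position lies within the preset position range."""
    (x0, y0), (x1, y1) = preset_area
    x, y = state.position
    in_area = x0 <= x <= x1 and y0 <= y <= y1
    return state.speed_mps >= preset_speed or in_area


assert trigger_met(DeviceState(1.2, (9.0, 9.0)))      # speed condition met
assert trigger_met(DeviceState(0.2, (2.0, 3.0)))      # position condition met
assert not trigger_met(DeviceState(0.2, (9.0, 9.0)))  # neither condition met
```

User-operation triggers (a preset gesture, pose, or audio) and environmental triggers (temperature, humidity, noise, a recognized marker) would be separate predicates combined in the same way.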
For example, when a user operates the smart device, the smart device obtains a user operation instruction, executes the behavior the user indicates, and stores the user operation information contained in the instruction. Later, when the user performs a certain action and the smart device determines that this action calls for a stored behavior, the smart device retrieves the user operation information corresponding to that action and completes the behavior indicated by that information.
For another example, when the user operates the smart device through the control device, the smart device executes the behavior indicated by the user according to the user operation instruction, and may store the user operation information in that instruction. Afterwards, the smart device can detect its own state; when it determines that its state meets a certain condition, for example, that its position is at a preset position or its pose is a preset pose, it retrieves the user operation information corresponding to that state, and then controls itself to execute the behavior indicated by this information. In addition, when the execution subject is the control device, the control device sends the user operation information corresponding to the state of the smart device to the smart device, so that the smart device executes the behavior indicated by that information; in this case the smart device and/or the control device may store the user operation information.
For example, the user may use the virtual joysticks of the remote control device shown in fig. 4 to input user operation instructions to the smart device so as to control the behavior of an unmanned vehicle (the smart device). The left virtual joystick controls the chassis of the unmanned vehicle, where the chassis moves the vehicle forward, backward, left, and right; for example, the left virtual joystick controls the direction of movement and the acceleration of the unmanned vehicle. The right virtual joystick controls the gimbal of the unmanned vehicle, for example its Pitch and Yaw; since a shooting device is mounted on the gimbal, the right virtual joystick, when controlling the gimbal, can indirectly control the shooting angle of the shooting device on the gimbal. An Inertial Measurement Unit (IMU), a device for measuring the three-axis attitude angles (or angular rates) and acceleration of an object, may be provided in the remote control device; the user can therefore input a user operation instruction through the IMU, that is, through the attitude change of the remote control device itself, and thereby control the smart device. When the user inputs a user operation instruction, the remote control device can control the behavior of the unmanned vehicle according to the user operation information in the instruction, and can control the motion of the unmanned vehicle's gimbal with the operations corresponding to Pitch/Yaw. The remote control device may also store the user operation information in the user operation instruction.
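The two-joystick mapping described above can be sketched as follows. The axis names, value range, and scaling are assumptions for illustration; the application does not specify a concrete interface.

```python
# Illustrative sketch of the virtual-joystick mapping: left stick drives the
# chassis, right stick drives the gimbal (Pitch/Yaw). Names are assumed.
def sticks_to_commands(left, right):
    """left and right are (x, y) deflections, each in [-1.0, 1.0]."""
    lx, ly = left
    rx, ry = right
    return {
        # Left stick: chassis translation (forward/backward, left/right).
        "chassis": {"forward": ly, "lateral": lx},
        # Right stick: gimbal angles, which indirectly set the shooting angle.
        "gimbal": {"yaw": rx, "pitch": ry},
    }

# Full forward deflection on the left stick, slight gimbal adjustment on the right.
cmd = sticks_to_commands(left=(0.0, 1.0), right=(-0.5, 0.2))
```

An IMU-based input would feed the same command structure, replacing stick deflections with attitude changes of the remote control device.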
For example, fig. 5 is a schematic view of a behavior of an unmanned vehicle provided by the present application, fig. 6 is a first schematic view of a control device of an unmanned vehicle provided by the present application, and fig. 7 is a second schematic view of a control device of an unmanned vehicle provided by the present application. As shown in figs. 5 to 7, the user may send a user operation instruction to the unmanned vehicle through the control device, where the user operation instruction is used to instruct the chassis of the unmanned vehicle to move, or to instruct the gimbal of the unmanned vehicle to pitch and/or yaw. As shown in fig. 6, the chassis of the unmanned vehicle can be controlled to move left and right by moving the control device horizontally left and right (see the arrow in the horizontal direction), and can be controlled to move forward and backward by moving the control device forward and backward (see the arrow in the vertical direction). As shown in fig. 7, the gimbal of the unmanned vehicle can be controlled to perform a yaw movement by turning the control device left and right (see the arrow in the horizontal direction), and to perform a pitch movement by turning the control device up and down (see the arrow in the vertical direction). During this process, the user operation information in the user operation instructions sent by the user can be stored, and when a preset trigger condition is met, the chassis of the unmanned vehicle is controlled to move, or the gimbal on the unmanned vehicle is controlled to rotate in pitch and/or yaw, according to the stored user operation information.
For another example, the user speaks to the smart device, and the smart device performs a corresponding behavior according to the received voice. For example, the smart device may issue a voice response, move, or perform a corresponding skill according to the voice uttered by the user. Moreover, the smart device can perform semantic analysis on the voice to obtain the user operation information it carries, and can store that information. Afterwards, the smart device detects whether a trigger condition is met; for example, when the user utters the voice again, the smart device can determine from the currently received voice that the trigger condition is reached. The smart device then retrieves the stored user operation information and executes the behavior corresponding to it.
In one example, the method can further comprise the following steps: and when the user operation instruction is obtained, controlling the intelligent equipment to execute the behavior according to the user operation information.
In this embodiment, when the user operation instruction is obtained, the smart device may further control itself to execute the behavior corresponding to the user operation information in the instruction. By executing the behavior represented by the user operation information in real time as the instruction is acquired, the user can confirm through this concrete presentation whether the stored user operation information matches expectations, and can then correct errors in, or further process, the user operation information according to the actual result.
The user operation instruction comprises one or more user operation information; and the user operation information is used for indicating the intelligent equipment to perform one or more actions.
For example, a user operates the intelligent device, and then the intelligent device obtains user operation information, wherein the user operation information comprises shooting parameters; the intelligent device can control the intelligent device to shoot according to the shooting parameters in the user operation information.
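The record-while-executing behavior described in the preceding paragraphs can be sketched as follows. The class and parameter names (`Recorder`, `handle_instruction`, the `angle` shooting parameter) are illustrative assumptions.

```python
# Illustrative sketch: each user operation instruction is executed in real
# time AND its user operation information is stored for later replay.
class Recorder:
    def __init__(self):
        self.store = []                      # stored user operation information

    def handle_instruction(self, operation_info: dict) -> dict:
        self.store.append(operation_info)    # keep for a later trigger condition
        return self.execute(operation_info)  # execute immediately so the user
                                             # can confirm the recorded behavior

    def execute(self, info: dict) -> dict:
        # e.g. apply shooting parameters such as shooting angle or range
        return {"behavior": info["behavior"], **info.get("params", {})}

rec = Recorder()
result = rec.handle_instruction({"behavior": "shoot", "params": {"angle": 30}})
```

Because execution happens at recording time, a mismatch between intent and stored behavior is visible immediately and can be corrected before replay.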
In this embodiment, a user operation instruction is obtained, where the instruction includes user operation information used to represent a behavior of the smart device; the user operation information is stored so that, when a preset trigger condition is met, the smart device is controlled to execute the behavior according to that information. In other words, a user operation instruction issued to the smart device is obtained and the user operation information it contains is stored; when it is detected that the preset trigger condition is met, the stored user operation information is retrieved to control the smart device to execute the behavior it represents. Because the user operation information in the user operation instruction is input by the user, the behavior represented by that information can make the behaviors and actions of the smart device more accurate, precise, natural, and lively. Therefore, whenever the smart device needs to perform these behaviors and actions, the stored user operation information can be retrieved and the smart device controlled accordingly, so that its behaviors and actions become more accurate, natural, and lively.
Fig. 8 is a flowchart of a control method of an intelligent device according to another embodiment of the present application, and as shown in fig. 8, the method of this embodiment may include:
S201, obtaining a plurality of user operation instructions, wherein each user operation instruction comprises user operation information used for representing the behavior of the intelligent device.
In one example, the number of the user operation instructions is multiple, and each user operation instruction further includes an instruction identifier, where the instruction identifier is used to identify the user operation instruction.
In one example, the generation times of the plurality of user operation instructions are continuous; when the preset trigger condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions. Alternatively, the generation times of at least some of the user operation instructions are discontinuous.
In this embodiment, the execution subject may be a smart device, or a control device of the smart device, where the smart device may be a movable platform. The execution subject of this embodiment is the same as in the above embodiments and is described here by way of example.
When a user operates the smart device to perform certain behaviors and actions, the user can input user operation instructions to the smart device. Specifically, when the smart device is required to perform multiple behaviors or a series of behaviors, a single user operation instruction cannot make the smart device complete them all, so the user can input multiple user operation instructions. Since multiple user operation instructions need to be stored, and in order to facilitate their processing and storage, each user operation instruction may carry its own instruction identifier, which identifies that instruction.
Each user operation instruction comprises user operation information, and the user operation information represents the behavior of the intelligent device.
Each user operation information can comprise a behavior parameter; the behavioral parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters. Wherein the movement parameters include, but are not limited to, velocity, acceleration, displacement. Pose parameters include, but are not limited to, pose position, pose state. The shooting parameters include, but are not limited to, range, angle of shot, speed of shot. Skill parameters include, but are not limited to, movement skills, jumping skills, shooting skills, infrared skills, zoom skills. The audio parameters include, but are not limited to, play parameters, voice parameters, volume parameters.
In one example, the user may operate the smart device continuously so that it performs a series of behaviors; in this case, the user continuously sends user operation instructions to the smart device, so the acquired user operation instructions are continuous in time, that is, the generation times of the acquired user operation instructions are continuous. Because each user operation instruction input by the user causes the smart device to execute a behavior according to the user operation information it contains, in step S202, after the user operation information in the obtained instructions is stored and a preset trigger condition is met, the behaviors represented by the user operation information need to be executed in the order in which the instructions were generated; that is, the execution sequence of the behaviors represented by each piece of user operation information is the same as the generation sequence of the user operation instructions.
For example, the user inputs a user operation instruction A, which includes user operation information a and user operation information b, and whose generation time is time 1; the smart device then executes behavior 1 according to each piece of user operation information in instruction A. Next, the user inputs a user operation instruction B, which includes user operation information c and user operation information d, and whose generation time is time 2; the smart device then executes behavior 2 according to each piece of user operation information in instruction B. Next, the user inputs a user operation instruction C, which includes user operation information e, and whose generation time is time 3; the smart device then executes behavior 3 according to each piece of user operation information in instruction C. Times 1, 2, and 3 are sequential and consecutive in time. Afterwards, when a preset trigger condition is met, the stored user operation information a, b, c, d, and e can be retrieved, and the smart device is controlled to execute behaviors 1, 2, and 3 in sequence.
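The ordered replay in the example above can be sketched as follows; the dictionary layout and field names are assumptions for illustration.

```python
# Illustrative sketch: replaying user operation information in the order the
# user operation instructions were generated (instruction A, then B, then C).
instructions = [
    {"id": "B", "time": 2, "info": ["c", "d"]},
    {"id": "A", "time": 1, "info": ["a", "b"]},
    {"id": "C", "time": 3, "info": ["e"]},
]

# Sort by generation time, then flatten each instruction's operation info,
# so the execution order matches the generation order of the instructions.
replay = [op
          for ins in sorted(instructions, key=lambda i: i["time"])
          for op in ins["info"]]
```

With consecutive generation times, this reproduces exactly the sequence behavior 1, behavior 2, behavior 3 from the example.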
In another example, the user may operate the smart device discontinuously to make it perform a series of behaviors; in this case, the user may issue user operation instructions to the smart device at intervals, that is, between instructions the user may suspend operating the smart device, so at least some of the user operation instructions acquired by the smart device are discontinuous in time. Alternatively, the user inputs multiple user operation instructions to the smart device so that it performs a series of behaviors, and the smart device then adjusts the order of these instructions, so that at least some of the acquired instructions are discontinuous in time. Therefore, in step S202, after the user operation information in the obtained instructions is stored and a preset trigger condition is met, the smart device does not need to execute the behaviors in the order in which the instructions were generated; instead, it may execute the behaviors represented by the user operation information in the order in which that information is stored.
For example, the user inputs a user operation instruction A, which includes user operation information a and user operation information b, and whose generation time is time 1; the smart device then executes behavior 1 according to each piece of user operation information in instruction A. Next, the user inputs a user operation instruction B, which includes user operation information c and user operation information d, and whose generation time is time 2; the smart device then executes behavior 2 according to each piece of user operation information in instruction B. Next, the user inputs a user operation instruction C, which includes user operation information e, and whose generation time is time 3; the smart device then executes behavior 3 according to each piece of user operation information in instruction C. Times 1, 2, and 3 are sequential and consecutive in time. Afterwards, the order of instructions A, B, and C can be adjusted, the adjusted order being instruction A, instruction C, instruction B; thus, the user operation instructions stored by the smart device are not continuous in generation time. Then, when a preset trigger condition is met, the smart device can retrieve the stored user operation information a, b, e, c, and d, and control itself to execute behaviors 1, 3, and 2 in sequence.
S202, storing user operation information corresponding to the user operation instructions in an associated mode to obtain a user operation information set, and controlling the intelligent equipment to execute behaviors represented by the user operation information according to the user operation information set when a preset trigger condition is met.
In this embodiment, the intelligent device obtains a plurality of user operation instructions, and each user operation instruction includes at least one piece of user operation information; therefore, the user operation information represents a series of behaviors of the intelligent device, and therefore, the user operation information in the user operation instructions has a certain association relationship. And then the intelligent equipment associates each piece of user operation information in the obtained multiple user operation instructions to obtain a user operation information set.
Furthermore, a corresponding relationship between the preset trigger condition and the user operation information set may be set, that is, a plurality of preset trigger conditions are configured, and each preset trigger condition corresponds to a respective user operation information set.
Then, when the fact that a preset trigger condition is met is detected, the intelligent device calls a user operation information set corresponding to the preset trigger condition; then, the intelligent device can control the intelligent device to execute the behavior represented by each user operation information in the user operation information set according to the user operation information set corresponding to the preset trigger condition.
For example, after a user inputs a user operation instruction a, a user operation instruction B, a user operation instruction C, and a user operation instruction D, the smart device may sequentially execute behaviors corresponding to the user operation instructions; the user operation instruction A comprises user operation information a and user operation information B, the user operation instruction B comprises user operation information C and user operation information D, the user operation instruction C comprises user operation information e, and the user operation instruction D comprises user operation information f. Then, the input is stopped. The intelligent device needs to sequentially store user operation information a, user operation information b, user operation information c, user operation information d, user operation information e and user operation information f in a correlated manner, so as to obtain a user operation information set Q. Then, when the fact that the preset triggering condition is met is detected, the intelligent device calls a user operation information set Q corresponding to the preset triggering condition. And then, controlling the intelligent equipment to perform a series of behaviors according to each user operation information in the user operation information set Q.
Therefore, in the processes of step S201 and step S202, the smart device may store each piece of recorded user operation information in a unified manner after the whole recording process is completed; alternatively, during steps S201 and S202, the smart device may store each piece of user operation information in real time.
In an example, the method provided in this embodiment may further include: in the process of acquiring the plurality of user operation instructions, if a pause instruction is received, pausing the storage of user operation information.
In this embodiment, in the processes of step S201 and step S202, in the process of inputting the user operation instruction by the user, the intelligent device may store the user operation information in the user operation instruction in real time; in this process, if the smart device receives the pause instruction, the smart device may pause the storage processing of the user operation information in the user operation instruction.
For example, the user issues a pause instruction by means of voice, gesture, touch and the like; or the intelligent equipment detects the state of the intelligent equipment and generates a pause instruction when the state of the intelligent equipment is determined to meet a certain condition; or the intelligent device detects the state of the external environment and generates a pause instruction when determining that the state of the external environment meets a certain condition.
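The pause behavior described above can be sketched as follows; the class and method names are illustrative assumptions.

```python
# Illustrative sketch: storage of user operation information is paused while
# a pause instruction is in effect, and resumes afterwards.
class PausableRecorder:
    def __init__(self):
        self.store = []       # stored user operation information
        self.paused = False   # set by a pause instruction (voice, gesture, ...)

    def pause(self):
        self.paused = True

    def resume(self):
        self.paused = False

    def handle(self, operation_info):
        # While paused, incoming operation information is not stored.
        if not self.paused:
            self.store.append(operation_info)

rec = PausableRecorder()
rec.handle("move")     # stored
rec.pause()
rec.handle("shoot")    # dropped: a pause instruction is in effect
rec.resume()
rec.handle("turn")     # stored again
```

The pause instruction itself may originate from the user, from the device state, or from the external environment, as described above; only the `paused` flag changes.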
In this embodiment, a plurality of user operation instructions are obtained, each including user operation information used to represent a behavior of the smart device; the user operation information corresponding to these instructions is stored in an associated manner to obtain a user operation information set, and when a preset trigger condition is met, the smart device is controlled to execute the behavior represented by each piece of user operation information according to that set. When the user inputs multiple user operation instructions, each piece of user operation information in those instructions is stored in an associated manner to obtain a user operation information set; because the pieces of user operation information in the set can indicate a series of behaviors, the stored set can be retrieved when a preset trigger condition is met, and the smart device controlled to execute the series of behaviors according to each piece of information in the set. Because the user operation information in the user operation instructions is input by the user, the behaviors it represents can make the behaviors and actions of the smart device more accurate, precise, natural, and lively; therefore, whenever the smart device needs to perform these behaviors and actions, the stored user operation information can be retrieved and the smart device controlled accordingly, so that its behaviors and actions become more accurate, natural, and lively.
Fig. 9 is a flowchart of a control method of an intelligent device according to another embodiment of the present application, and as shown in fig. 9, the method of this embodiment may include:
S301, a start instruction is obtained, where the start instruction is used to indicate the start of recording of user operation information.
In this embodiment, the execution subject of this embodiment may be an intelligent device, and may be a control device of the intelligent device. Wherein the smart device may be a movable platform. The main body of the present embodiment is the same as the above embodiments, and is described as an example.
Before executing the embodiment of the present application, a start instruction is needed to indicate that recording of the user operation information in user operation instructions issued by the user can begin. Therefore, the smart device can detect in real time whether the start instruction is received; after the smart device determines that the start instruction is received, it can determine that, once a user operation instruction is obtained, the user operation information in it can be recorded. It should be understood that recording does not necessarily mean storing: recording may be paused and edited while in progress, and after recording is completed, whether to store the recording can be determined according to a trigger condition.
For example, the user issues the open command by touch, joystick, gesture, voice, and the like. Alternatively, the state of the smart device may be detected, and the start instruction may be generated when the state of the smart device is determined to satisfy a certain condition, for example, when the smart device determines that an engine of the smart device starts to start, the smart device determines to generate the start instruction. Or, the smart device may detect a state of the external environment, and generate the start instruction when determining that the state of the external environment satisfies a certain condition, for example, when determining that the vibration state of the external environment is greater than a preset vibration amplitude, the smart device determines to generate the start instruction.
S302, obtaining a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent device.
In this embodiment, this step may refer to step S101 shown in fig. 3, or this step may refer to step S201 shown in fig. 8, which is not described again.
And S303, storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when the preset trigger condition is met.
In this embodiment, this step may refer to step S102 shown in fig. 3, or this step may refer to step S202 shown in fig. 8, which is not described again.
And S304, acquiring an ending instruction, wherein the ending instruction is used for indicating the recording end of the user operation information.
In one example, the ending instruction is generated according to the time indicated by the opening instruction and a preset time length.
In this embodiment, after step S303, or during the process of storing the user operation information in step S303, whether an end instruction is received may be detected in real time; after the intelligent device determines that the ending instruction is received, the intelligent device can determine that the user operation information does not need to be recorded continuously, and can end recording and storing of the user operation information.
In one example, the user may actively issue an end instruction. For example, the user issues the ending instruction by touching, joystick, gesture, voice, and the like.
In one example, the smart device may detect a state of the smart device and generate the end instruction upon determining that the state of the smart device satisfies a condition, e.g., determine to generate the end instruction upon determining that an engine of the smart device is no longer running.
In one example, the smart device may detect a state of the external environment and generate the end instruction when the state of the external environment is determined to satisfy a certain condition, for example, when the noise of the external environment is determined to be less than a preset noise value.
In one example, when the start instruction is obtained, the smart device may obtain the recording start time indicated by the start instruction, and the start instruction may also indicate a preset duration. The smart device then adds the preset duration to the recording start time indicated by the start instruction to determine a time point, which serves as the time point at which the end instruction is generated; when that time point arrives, the smart device automatically generates an end instruction indicating that recording of the user operation information ends.
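The derivation of the end-instruction time from the start instruction can be sketched as follows; the function names and the use of raw timestamps are assumptions.

```python
# Illustrative sketch: the end instruction fires at the start time indicated
# by the start instruction plus the preset duration it carries.
def end_instruction_time(start_ts: float, preset_duration: float) -> float:
    # Superpose (add) the preset duration onto the recording start time.
    return start_ts + preset_duration

def should_generate_end(now: float, start_ts: float,
                        preset_duration: float) -> bool:
    # The end instruction is generated automatically once the time point arrives.
    return now >= end_instruction_time(start_ts, preset_duration)

deadline = end_instruction_time(start_ts=100.0, preset_duration=30.0)
```

In a real device, `now` would come from a monotonic clock rather than wall-clock time, to avoid jumps from clock adjustments.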
S305, editing at least one piece of user operation information to obtain the processed user operation information, and controlling the intelligent device to execute the behavior represented by the processed user operation information when a preset trigger condition is met.
In one example, the editing process includes any one or more of:
moving processing, used to indicate that the execution order of the behaviors represented by the user operation information is adjusted; deletion processing, used to indicate that user operation information is deleted; adding processing, used to indicate that at least one piece of user operation information is added to form a user operation information set; superposition processing, used to indicate that the execution of the behaviors represented by at least two pieces of user operation information is associated; scaling processing, used to lengthen or shorten the execution duration of the behavior represented by the user operation information; and clipping processing, used to indicate that the behavior represented by the user operation information is divided into multiple sub-behaviors, or that only part of that behavior is executed.
In this embodiment, after step S303 or S304, that is, after the smart device stores the user operation information, the smart device may edit the stored user operation information to obtain processed user operation information; or, during the storage of the user operation information in step S303, the smart device may edit the stored user operation information to obtain processed user operation information; or, before the user operation information is stored in step S303, the smart device may edit the recorded user operation information. In addition, after obtaining the processed user operation information, the smart device may replace the previous user operation information with it; or, the smart device may retain both the previous user operation information and the processed user operation information.
Wherein the editing process includes, but is not limited to, the following processes: moving processing, deleting processing, adding processing, overlaying processing, zooming processing and clipping processing.
For example, the smart device receives a movement processing instruction that indicates one or more pieces of user operation information to be moved, together with their original and target positions; the smart device then moves the indicated user operation information to the target position according to the instruction.
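For illustration only (the list-of-strings data model below is hypothetical, not part of the application), the movement processing described above can be sketched as reordering a stored sequence of user operation information:

```python
def apply_move(operations, src_index, dst_index):
    """Move the operation at src_index so that it executes at dst_index."""
    ops = list(operations)       # copy, leaving the stored sequence untouched
    op = ops.pop(src_index)      # remove from the original position
    ops.insert(dst_index, op)    # re-insert at the target position
    return ops

recorded = ["zoom", "move", "shoot"]
# Movement instruction: move "shoot" ahead of "move" (index 2 -> index 1).
processed = apply_move(recorded, 2, 1)
# processed is ["zoom", "shoot", "move"]; recorded is preserved unchanged,
# matching the option of retaining both versions described above.
```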
For example, the smart device receives a deletion processing instruction that indicates one or more pieces of user operation information to be deleted; the smart device then deletes the indicated user operation information according to the instruction.
For example, the smart device receives an addition processing instruction that indicates one or more further pieces of user operation information to be added; the smart device then inserts the indicated user operation information before, after, or in the middle of the current user operation information according to the instruction.
For example, the smart device receives a superposition processing instruction that indicates a plurality of pieces of user operation information to be superimposed; the smart device then superimposes the indicated user operation information according to the instruction. Suppose user operation information a is movement and user operation information b is shooting, and a superposition processing instruction indicating that a and b are to be superimposed is received; the smart device superimposes a and b according to the instruction to obtain processed user operation information a, which now represents that the smart device is to move and shoot. For instance, the processed information may represent that the smart device moves and shoots at the same time, shoots at a certain position during the movement, or moves within a certain time period during the shooting.
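A minimal sketch of the superposition process, assuming a hypothetical dictionary representation in which a "mode" field (not named in the application) records how the two behaviors are correlated:

```python
def superimpose(op_a, op_b, mode="simultaneous"):
    """Correlate two operations so that their behaviors execute together.

    mode is an assumed field: "simultaneous" runs both behaviors at once;
    other modes could anchor one behavior at a point inside the other.
    """
    return {"behaviors": [op_a["behavior"], op_b["behavior"]], "mode": mode}

op_a = {"behavior": "move"}    # user operation information a: movement
op_b = {"behavior": "shoot"}   # user operation information b: shooting
combined = superimpose(op_a, op_b)
# combined now represents moving and shooting at the same time.
```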
For example, the smart device receives a scaling processing instruction that indicates one or more pieces of user operation information whose execution duration is to be scaled; the smart device then lengthens or shortens the duration of the indicated user operation information according to the instruction. Suppose user operation information a is a zoom operation, and a scaling processing instruction indicating that the duration of a is to be lengthened is received; the duration of user operation information a is then lengthened according to the instruction.
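The scaling process can be sketched as below (the `duration_s` field and multiplicative factor are illustrative assumptions; the application does not specify how durations are stored):

```python
def scale_duration(operation, factor):
    """Lengthen (factor > 1) or shorten (factor < 1) the execution duration."""
    scaled = dict(operation)  # keep the original operation intact
    scaled["duration_s"] = operation["duration_s"] * factor
    return scaled

zoom_op = {"behavior": "zoom", "duration_s": 2.0}
longer = scale_duration(zoom_op, 2.0)   # lengthen the zoom to 4 seconds
```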
For example, the smart device receives a clipping processing instruction that indicates one or more pieces of user operation information whose represented behavior is to be divided into a plurality of sub-behaviors; the smart device then divides the behavior represented by the indicated user operation information into a plurality of sub-behaviors. Suppose the user operation information indicated by the clipping processing instruction includes zoom operation information, movement operation information, photographing operation information, and video-recording operation information, so that the represented behavior includes zooming, moving, photographing, and recording, that is, the smart device can complete all four simultaneously according to the user operation information; the user operation information can then be divided to obtain the zoom, movement, photographing, and recording operation information separately. In this way, the behavior represented by the user operation information is divided into a plurality of sub-behaviors, namely zooming, moving, photographing, and recording, each of which can be executed independently or combined with other behaviors.
For example, the smart device receives a clipping processing instruction that indicates one or more pieces of user operation information whose represented behavior is to be executed only in part; the smart device then clips some sub-behaviors out of the behavior represented by the indicated user operation information. Suppose, as above, that the indicated user operation information includes zoom, movement, photographing, and video-recording operation information, so that the represented behavior includes zooming, moving, photographing, and recording; the smart device can then divide the user operation information and remove the zoom operation information, so that the zooming sub-behavior is removed from the behavior represented by the user operation information.
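Both variants of the clipping process (dividing into sub-behaviors, and removing a sub-behavior so execution is incomplete) can be sketched as follows; the composite dictionary and the behavior names are illustrative assumptions:

```python
def split_operations(composite):
    """Divide a composite operation into independently executable sub-behaviors."""
    return [{"behavior": b} for b in composite["behaviors"]]

def clip_out(composite, behavior):
    """Remove one sub-behavior so that the composite behavior runs incompletely."""
    return {"behaviors": [b for b in composite["behaviors"] if b != behavior]}

composite = {"behaviors": ["zoom", "move", "photograph", "record"]}
subs = split_operations(composite)      # four independent sub-behaviors
clipped = clip_out(composite, "zoom")   # composite with zooming removed
```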
Moreover, a plurality of the above-described editing processes may be performed simultaneously.
Then, the smart device monitors for the trigger condition; when the preset trigger condition is determined to be met, the smart device invokes the processed user operation information corresponding to the preset trigger condition and controls itself to execute the corresponding behavior according to the processed user operation information.
In an example, the instruction of the editing process may be issued by a user through touch, voice, pose, or the like.
For example, fig. 10 is a schematic diagram of the editing process provided in the present application. As shown in fig. 10, an interactive interface may be displayed on the smart device or on its control device, and the user may view the behaviors represented by the acquired user operation information on this interface. A play button, a pause button, and an edit button are provided on the interactive interface: touching the play button plays back the behaviors of the smart device represented by the acquired user operation information; touching the pause button pauses that playback; touching the edit button allows the user to edit the acquired user operation information. Furthermore, when the user operation information is edited, an editing frame for each piece of user operation information, sorted in chronological order, may be provided on the interface, and the user may touch an editing frame to edit the corresponding user operation information.
For example, in the example shown in fig. 10, the user touches editing frame 1 and drags it to a position in front of editing frame 2, where editing frame 1 corresponds to user operation information a and editing frame 2 corresponds to user operation information b. A movement processing instruction sent by the user is thus received, indicating that user operation information a is to be moved ahead of user operation information b; user operation information a is then moved ahead of user operation information b according to the movement processing instruction.
It can be understood that the embodiment of the present invention may provide a plurality of interactive interfaces related to the interaction, or a single interactive interface. When there are multiple interactive interfaces, the user may switch between them; when there is a single interactive interface, part of the identifiers on it may be hidden or revealed.
In one example, before the smart device is controlled to execute the behavior in step S303 or step S305, the following process may further be performed:
when the preset trigger condition is a specific trigger condition, detecting whether the smart device meets the preset condition for executing the behavior; if not, controlling the smart device to perform a corresponding operation so that the preset condition for executing the behavior is met.
In this embodiment, after the smart device stores the user operation information, obtains the processed user operation information, or obtains the user operation information set, it may detect whether the preset trigger condition occurs. After detecting that the preset trigger condition is met, that is, when the preset trigger condition is determined to be the specific trigger condition, the smart device judges whether its current state meets the preset condition.
For the preset trigger condition, reference may be made to step S102 shown in fig. 3. The specific trigger condition may be a trigger condition that meets certain requirements. For example, the preset trigger condition may be a user operation corresponding to the user operation information, and the specific trigger condition may be specific user operation information.
When the current state of the smart device meets the preset condition, the smart device determines that the user operation information corresponding to the preset trigger condition can be invoked, and controls itself to execute the behavior represented by that user operation information according to the user operation information.
When the smart device determines that its current state does not meet the preset condition, then in order to be able to invoke and execute the user operation information, it must first perform a certain operation so that its current state comes to meet the preset condition; the smart device then invokes the user operation information corresponding to the preset trigger condition and controls itself to execute the behavior represented by that information. Here, the preset condition is associated with the specific trigger condition. For example, the preset condition includes, but is not limited to, a preset external environment condition, a state of the smart device, or a specific user operation; the user operation may be a user action, a user pose, or user audio; the external environment condition may be the ambient temperature, humidity, or noise; and the state of the smart device may be its position, motion state, or pose state.
For example, the smart device is controlled to perform corresponding operations including, but not limited to, the following: moving, rotating, zooming, shooting, and pose adjustment.
For example, the smart device detects that the preset trigger condition is a specific trigger condition, namely that the smart device starts to operate. The smart device then detects whether its gimbal is at a preset position. If the gimbal is determined to be at the preset position, the stored shooting user operation information is invoked, and the smart device controls the shooting device on the gimbal to shoot according to that information. If the gimbal is determined not to be at the preset position, the smart device first controls the gimbal to move to the preset position, then invokes the shooting user operation information and controls the shooting device on the gimbal to shoot according to it.
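The precondition check above can be sketched as follows, assuming a gimbal (cradle head) whose angle must match a hypothetical preset value before the stored shooting operation is invoked; the angle representation and function names are illustrative, not from the application:

```python
PRESET_GIMBAL_POSITION = 0.0  # hypothetical preset pitch angle, in degrees

def run_on_trigger(gimbal_angle, actions_log):
    """If the gimbal is not at the preset position, move it there first,
    then invoke the stored shooting user operation information."""
    if gimbal_angle != PRESET_GIMBAL_POSITION:
        actions_log.append("move_gimbal_to_preset")  # satisfy the precondition
        gimbal_angle = PRESET_GIMBAL_POSITION
    actions_log.append("shoot")                      # execute the stored behavior
    return actions_log

# Gimbal starts away from the preset position, so it is moved first.
log = run_on_trigger(gimbal_angle=-15.0, actions_log=[])
```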
In this embodiment, on the basis of the above embodiment, the acquired user operation information may be edited, for example by moving, deleting, adding, superimposing, scaling, or clipping, so as to obtain user operation information in multiple different forms, or a set of multiple pieces of user operation information. Then, when the preset trigger condition is met, the smart device is controlled to execute the corresponding behavior according to the processed user operation information. This increases the diversity of the user operation information, and since different user operation information indicates different behaviors, it also increases the diversity of the behaviors of the smart device.
Fig. 11 is a flowchart of a control method for an intelligent device according to yet another embodiment of the present application, and as shown in fig. 11, the method according to this embodiment may include:
S401, obtaining a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the smart device.
In this embodiment, this step may refer to step S101 shown in fig. 3, or this step may refer to step S201 shown in fig. 8, or this step may refer to step S302 shown in fig. 9, and is not described again.
S402, storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when the preset trigger condition is met.
In this embodiment, this step may refer to step S102 shown in fig. 3, or this step may refer to step S202 shown in fig. 8, or this step may refer to step S303 shown in fig. 9, and is not described again.
In an example, the method provided in this embodiment may further perform each step in fig. 9, which is not described again.
S403, displaying a behavior play bar on the interactive interface; wherein, in the execution process of the behavior, the behavior play bar dynamically changes.
In one example, the instruction for instructing to process the user operation information is generated according to the operation of the user on the behavior play bar and/or the operation of the user on the virtual key on the interactive interface.
In this embodiment, during the execution of this scheme, an interactive interface may be provided on the smart device or on its control device; the interactive interface is used to receive instructions from the user, to present the behaviors of the smart device to the user, and to let the user edit the user operation information.
Fig. 12 is a first schematic view of an interactive interface provided in the embodiment of the present application. As shown in fig. 12, when the smart device executes a corresponding behavior according to the user operation information, that behavior may be displayed on the interactive interface. To allow the user operation information to be edited, and to let the user selectively watch the behaviors it represents, a behavior play bar may be provided on the interactive interface; alternatively, the interface may jump to another interactive interface on which the behavior play bar is displayed, either in response to a user operation or automatically. A plurality of editing frames (behavior frames) are displayed on the behavior play bar; each editing frame corresponds to one behavior, and each behavior corresponds to at least one piece of user operation information. As the smart device executes the behaviors represented by the user operation information, the behaviors may be displayed on the interactive interface while the editing frames on the behavior play bar advance frame by frame, each editing frame being marked as played once its behavior has been played; for example, as shown in fig. 12, a black dot added to an editing frame indicates that it has been played. In this way, through the dynamic change of the behavior play bar, the behaviors already executed by the smart device are marked.
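A minimal sketch of the play-bar behavior described above; the class, its fields, and the "played" marker are hypothetical stand-ins for the black-dot marking in fig. 12:

```python
class BehaviorPlayBar:
    """Hypothetical play bar: editing frames advance one by one, and each
    frame is marked as played once its behavior has been executed."""

    def __init__(self, behaviors):
        self.frames = [{"behavior": b, "played": False} for b in behaviors]
        self.cursor = 0  # index of the next frame to play

    def step(self):
        """Execute one behavior and mark its editing frame as played."""
        if self.cursor < len(self.frames):
            self.frames[self.cursor]["played"] = True
            self.cursor += 1

bar = BehaviorPlayBar(["zoom", "move", "shoot"])
bar.step()
bar.step()  # two behaviors executed so far
played = [f["behavior"] for f in bar.frames if f["played"]]
```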
It can be understood that, when the behavior play bar is triggered, the behavior represented by the corresponding user operation information need not be displayed on the interactive interface; instead, the smart device itself may be triggered to execute the corresponding behavior, that is, triggering the behavior play bar serves as the preset trigger condition for the smart device to execute that behavior. When the behavior represented by the corresponding user operation information is displayed on the interactive interface upon triggering the behavior play bar, it may be shown either as a virtual multimedia picture or as a recorded video of the behavior that the smart device executed according to the user operation instruction after acquiring it.
Furthermore, a play button, a pause button, and an edit button are provided on the interactive interface. By touching the play button, the user causes the control device to obtain a play instruction and play back the behaviors of the smart device represented by the acquired user operation information; by touching the pause button, the user causes the control device to obtain a pause instruction and pause that playback; by touching the edit button, the user causes the control device to obtain an editing instruction, according to which the control device edits the acquired user operation information.
In one example, a behavior play bar is provided on the interactive interface, on which a plurality of editing frames (behavior frames) are displayed; each editing frame corresponds to one behavior, and each behavior corresponds to at least one piece of user operation information. When the user operation information is edited, the user's operation on the behavior play bar can therefore be detected, and an editing processing instruction generated from it. For example, the user touches editing frame 1 on the behavior play bar and drags it to a position in front of editing frame 2, where editing frame 1 corresponds to user operation information a and editing frame 2 corresponds to user operation information b; a movement processing instruction sent by the user is thus received, indicating that user operation information a is to be moved ahead of user operation information b, and user operation information a is moved ahead of user operation information b according to the instruction.
In another example, a virtual key is provided on the interactive interface, for example an "edit button", which the user can touch to input different editing processing instructions. For example, the user touches the "edit button" on the interactive interface; different editing options then pop up, from which the user may select superposition; a superposition processing instruction sent by the user is thus received, indicating that user operation information a and user operation information b are to be superimposed, and the two are superimposed according to the instruction.
S404, displaying at least one behavior identifier on the interactive interface, wherein each behavior identifier is used for indicating a set of behaviors characterized by at least one piece of user operation information.
In this embodiment, the smart device may combine a plurality of pieces of user operation information into one user operation information set, and such a set may indicate a series of behaviors of the smart device. Therefore, to help the user see which pieces of user operation information belong to the same set, and to make editing convenient, a behavior identifier for each user operation information set may be displayed on the interactive interface alongside the behaviors of the smart device. It can be understood that a behavior identifier is also the identifier of a behavior set, where the behavior set refers to the above-mentioned series of behaviors.
Since a user operation information set can indicate a series of behaviors of the smart device, the smart device can complete multiple behaviors, that is, a complete behavior set; the complete set of behaviors executed by the smart device may be recorded and displayed.
Fig. 13 is a second schematic view of an interactive interface provided in the embodiment of the present application. As shown in fig. 13, the smart device may complete a series of behaviors according to user operation information set 1, that is, complete behavior set 1, and the behaviors of behavior set 1 may be displayed on the interactive interface; likewise, the smart device may complete behavior set 2 according to user operation information set 2, and behavior set 3 according to user operation information set 3, with the behaviors of each set displayed on the interactive interface. To distinguish the behavior sets, a behavior identifier may be configured for each of them; the behavior identifier is, for example, a text, a picture, a figure, a button, or a frame.
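The mapping from behavior identifiers to behavior sets can be sketched as a simple registry; the identifier names and behavior lists below are hypothetical examples, not taken from fig. 13:

```python
# Hypothetical registry mapping each behavior identifier to its behavior set.
behavior_sets = {
    "set_1": ["move", "shoot"],
    "set_2": ["zoom", "rotate"],
    "set_3": ["move", "zoom", "record"],
}

def invoke(identifier):
    """Return the series of behaviors indicated by the selected identifier,
    ready to be played or edited on the interactive interface."""
    return behavior_sets[identifier]
```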
S405, when one of the at least one behavior identifier is selected, dynamically changing the selected behavior identifier on the interactive interface.
In this embodiment, based on the behavior identifiers in step S404, each behavior identifier indicates a behavior set, and the interactive interface may display the individual behavior identifiers. The user can touch a behavior identifier to select it. The smart device may then invoke the behavior set corresponding to the selected identifier and play it on the interactive interface; alternatively, the smart device may invoke the corresponding behavior set, edit it, display the editing process on the interactive interface, and also display the edited behavior set there.
Moreover, when the user selects a behavior identifier, in order to help the user confirm the selection, the smart device may apply one or more dynamic changes to the behavior identifier to highlight it.
For example, the intelligent device highlights the behavior identifier selected by the user, or the intelligent device changes the color of the behavior identifier selected by the user, or the intelligent device bolds the behavior identifier selected by the user.
S406, acquiring a touch instruction of a user on the interactive interface; and displaying the sliding switching of the behavior identifier on the interactive interface according to the sliding direction indicated by the touch instruction.
In this embodiment, based on the behavior identifiers in step S404, each behavior identifier indicates a behavior set, and the interactive interface may display the individual behavior identifiers. The user can select a behavior identifier and switch between the behavior sets corresponding to the identifiers; when the user switches the selection, the smart device obtains a touch instruction. Because the user slides on the interactive interface, the smart device can detect the sliding direction indicated by the touch instruction.
The smart device then detects the behavior identifier to which the user's touch instruction switches and displays the corresponding behavior set; while the user slides on the interactive interface, the smart device displays the sliding switch of the behavior identifiers according to the sliding direction indicated by the touch instruction. For example, as shown in fig. 13, three behavior identifiers, namely behavior set 1, behavior set 2, and behavior set 3, may be displayed on the interactive interface, and the user slides on the interface to indicate a switch of behavior set; if the user slides from left to right, the sliding of behavior set 1, behavior set 2, and behavior set 3 from left to right is displayed on the interactive interface.
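The sliding switch can be sketched as moving a selection cursor through the ordered list of identifiers; the direction names and clamping at the ends are illustrative assumptions about the interface behavior:

```python
def switch_identifier(identifiers, current, direction):
    """Slide to the neighboring behavior identifier: "right" selects the
    next set, "left" the previous one, clamped at either end of the list."""
    i = identifiers.index(current)
    if direction == "right":
        i = min(i + 1, len(identifiers) - 1)
    elif direction == "left":
        i = max(i - 1, 0)
    return identifiers[i]

ids = ["set_1", "set_2", "set_3"]
# A left-to-right slide from behavior set 1 selects behavior set 2.
selected = switch_identifier(ids, "set_1", "right")
```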
On the basis of the above embodiment, this embodiment may further provide an interactive interface that receives instructions from the user, presents the behaviors of the smart device, and lets the user edit the user operation information. The behaviors and behavior sets of the smart device can be displayed on the interactive interface, as can the switching state of the behavior sets, so that the user knows which behavior of the smart device is currently displayed and the acquired and edited behaviors are shown intuitively. Moreover, the user can operate on the interactive interface to further edit the user operation information; by building behavior logic in this programmed way, the diversity of the user operation information, and hence of the behaviors of the smart device, can be increased.
Fig. 14 is a flowchart of a control method of an intelligent device according to another embodiment of the present application, where the method may be applied to a control device of an intelligent device, as shown in fig. 14, the method of this embodiment may include:
S501, obtaining a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the smart device.
S502, sending the user operation information to the intelligent equipment so that the intelligent equipment stores the user operation information and executes behaviors according to the user operation information when a preset trigger condition is met.
In this embodiment, the execution subject of this embodiment may be a control device of an intelligent device. For example, the smart device may be a mobile platform, a smart robot, a drone, and so on. The control device may be a mobile terminal, a remote control device, or the like.
In one example, the user operation information includes any one or more of: touch information, physical control operation information, somatosensory information and voice information.
In one example, the user operation information includes a behavior parameter indicating a behavior; the behavioral parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
In one example, the number of the user operation instructions is multiple, and each user operation instruction further includes an instruction identifier, where the instruction identifier is used to identify the user operation instruction.
In one example, step 501 includes: and acquiring a plurality of user operation instructions.
Step 502, comprising: and sending a user operation information set formed by the user operation information corresponding to the plurality of user operation instructions to the intelligent equipment.
In one example, the generation times of the plurality of user operation instructions are continuous; when the preset triggering condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
In an example, the method provided in this embodiment further includes: in the process of acquiring a plurality of user operation instructions, if a pause instruction is received, pausing the recording of the user operation information.
In one example, the generation time of at least part of the user operation instruction is discontinuous.
In an example, before step 501, the method provided in this embodiment further includes: acquiring a start instruction, wherein the start instruction is used for indicating the start of recording the user operation information.
In an example, the method provided in this embodiment further includes: acquiring an end instruction, wherein the end instruction is used for indicating the end of recording the user operation information.
In one example, the end instruction is generated according to the time indicated by the start instruction and a preset time length.
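This generation rule amounts to simple arithmetic on the start time; the 30-second default below is an arbitrary illustrative value, as the application does not specify the preset length:

```python
def end_time(start_time_s, preset_duration_s=30.0):
    """Derive when the end instruction fires: recording stops automatically
    preset_duration_s seconds after the time indicated by the start instruction."""
    return start_time_s + preset_duration_s

stop_at = end_time(100.0)  # start at t = 100 s with the 30 s preset length
```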
In an example, the method provided by this embodiment further includes: editing at least one piece of user operation information to obtain processed user operation information, so that when the preset trigger condition is met, the smart device is controlled to execute the behavior represented by the processed user operation information.
After the control device edits at least one piece of user operation information, it may send the processed user operation information to the smart device. That is, after the user operation information is recorded, it may be edited first and then stored on the smart device side.
In one example, the editing process includes any one or more of:
a moving process, used for indicating adjustment of the execution order of the behaviors represented by the user operation information;
a deletion process, used for indicating deletion of the user operation information;
an adding process, used for indicating addition of at least one piece of user operation information to form a user operation information set;
a superposition process, used for indicating that the execution orders of the behaviors represented by at least two pieces of user operation information are associated;
a scaling process, used for indicating that the execution duration of the behavior represented by the user operation information is lengthened or shortened; and
a clipping process, used for indicating that execution of the behavior represented by the user operation information is divided into a plurality of sub-behaviors, or that execution of the behavior represented by the user operation information is incomplete.
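A few of these editing processes can be sketched as plain operations on a list of operation-information entries. The function names and the `duration_s` field are hypothetical illustrations, not the patent's API; they only show what moving, deleting, scaling, and clipping mean for a recorded behavior sequence.

```python
def move(ops, i, j):
    """Moving process: adjust execution order by moving entry i to position j."""
    ops = list(ops)
    ops.insert(j, ops.pop(i))
    return ops

def delete(ops, i):
    """Deletion process: remove one piece of user operation information."""
    return ops[:i] + ops[i + 1:]

def scale(op, factor):
    """Scaling process: lengthen or shorten the execution duration."""
    return {**op, "duration_s": op["duration_s"] * factor}

def clip(op, fractions):
    """Clipping process: split a behavior into sub-behaviors whose
    durations are fractions of the original."""
    return [{**op, "duration_s": op["duration_s"] * f} for f in fractions]

ops = [{"name": "forward", "duration_s": 4.0},
       {"name": "rotate", "duration_s": 2.0},
       {"name": "beep", "duration_s": 1.0}]

ops = move(ops, 2, 0)             # "beep" now executes first
ops = delete(ops, 1)              # drop "forward"
ops[0] = scale(ops[0], 2.0)       # "beep" now lasts 2.0 s
parts = clip(ops[1], [0.5, 0.5])  # split "rotate" into two sub-behaviors
```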
In an example, the method provided in this embodiment further includes: displaying a behavior play bar on the interactive interface; wherein, in the execution process of the behavior, the behavior play bar dynamically changes.
It can be understood that, when the user operation information is not stored on the control device side of the intelligent device but the behavior represented by the user operation information needs to be played on the interactive interface of the control device, the control device may obtain the corresponding user operation information from the intelligent device.
In one example, the instruction for instructing to process the user operation information is generated according to the operation of the user on the behavior play bar and/or the operation of the user on the virtual key on the interactive interface.
In an example, the method provided in this embodiment further includes: and displaying at least one behavior identifier on the interactive interface, wherein each behavior identifier is used for indicating a set of behaviors characterized by at least one piece of user operation information.
In an example, the method provided in this embodiment further includes: when one behavior identifier in the at least one behavior identifier is selected, the selected behavior identifier is dynamically changed on the interactive interface.
In an example, the method provided in this embodiment further includes: acquiring a touch instruction of a user on an interactive interface; and displaying the sliding switching of the behavior identifier on the interactive interface according to the sliding direction indicated by the touch instruction.
In an example, before the performing of the behavior, the method provided by the embodiment further includes:
when the preset trigger condition is a specific trigger condition, detecting whether the intelligent device meets the preset condition for executing the behavior; if not, controlling the intelligent device to execute a corresponding operation so as to meet the preset condition for executing the behavior.
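This check-then-correct step can be sketched minimally. The names (`ensure_preconditions`, the `armed`/`battery` fields) are assumptions made for illustration: each precondition is paired with a corrective operation that is performed only when the check fails.

```python
def ensure_preconditions(device, preconditions):
    """Illustrative sketch: before executing a behavior, run each
    precondition check; if it fails, perform the paired corrective
    operation, then report whether all preconditions now hold."""
    for check, corrective_action in preconditions:
        if not check(device):
            corrective_action(device)
    return all(check(device) for check, _ in preconditions)

# Hypothetical device state for illustration only.
device = {"armed": False, "battery": 80}

preconditions = [
    (lambda d: d["armed"], lambda d: d.update(armed=True)),  # arm the device first
    (lambda d: d["battery"] > 20, lambda d: None),           # battery must suffice
]

ready = ensure_preconditions(device, preconditions)
```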
In an example, the method provided in this embodiment further includes: and when the user operation instruction is obtained, controlling the intelligent equipment to execute the behavior according to the user operation information.
In one example, the smart device includes a movable platform.
The steps of this embodiment may refer to the descriptions of the above embodiments, and are not described again.
Fig. 15 is a schematic structural diagram of a control apparatus of an intelligent device according to an embodiment of the present application, and as shown in fig. 15, the control apparatus 600 of the intelligent device according to this embodiment may include: a memory 601 and a processor 602.
A memory 601 for storing program codes.
A processor 602 for invoking program code, which when executed, performs the following: acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment; and storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when the preset trigger condition is met.
The control apparatus may be the intelligent device itself (optionally, it may be installed on the intelligent device or be independent of it), or a control device of the intelligent device (optionally, it may be installed on the control device or be independent of it).
Optionally, when the control device is a control device of an intelligent device, the processor 602 is configured to call the program code, and when the program code is executed, the processor is configured to perform the following operations: acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment; and sending the user operation information to the intelligent equipment so that the intelligent equipment stores the user operation information and executes behaviors according to the user operation information when a preset trigger condition is met.
In some embodiments, the user operational information includes any one or more of: touch information, physical control operation information, somatosensory information and voice information.
In some embodiments, the user operation information includes a behavior parameter indicating a behavior; the behavioral parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
In some embodiments, the number of the user operation instructions is multiple, and each user operation instruction further includes an instruction identifier for identifying the user operation instruction.
In some embodiments, the processor 602 is specifically configured to: acquiring a plurality of user operation instructions; and storing user operation information corresponding to the plurality of user operation instructions in an associated manner to obtain a user operation information set, and controlling the intelligent equipment to execute the behavior represented by each user operation information according to the user operation information set when a preset trigger condition is met.
In some embodiments, the processor 602 is specifically configured to: acquiring a plurality of user operation instructions; and sending a user operation information set formed by the user operation information corresponding to the plurality of user operation instructions to the intelligent equipment.
In some embodiments, the generation times of the plurality of user operation instructions are continuous; when the preset triggering condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
In some embodiments, the processor 602 is further configured to: in the process of acquiring a plurality of user operation instructions, if a pause instruction is received, pause the recording of the user operation information.
In some embodiments, the generation time of at least part of the user operation instructions is discontinuous.
In some embodiments, the processor 602 is further configured to: and before the user operation instruction is acquired, acquiring a starting instruction, wherein the starting instruction is used for indicating the recording start of the user operation information.
In some embodiments, the processor 602 is further configured to: and acquiring an ending instruction, wherein the ending instruction is used for indicating the recording end of the user operation information.
In some embodiments, the ending instruction is generated according to the time indicated by the start instruction and a preset time length.
In some embodiments, the processor 602 is further configured to:
and editing at least one piece of user operation information to obtain the processed user operation information so as to control the intelligent equipment to execute the behavior represented by the processed user operation information when the preset trigger condition is met.
In some embodiments, the editing process includes any one or more of:
a moving process, used for indicating adjustment of the execution order of the behaviors represented by the user operation information;
a deletion process, used for indicating deletion of the user operation information;
an adding process, used for indicating addition of at least one piece of user operation information to form a user operation information set;
a superposition process, used for indicating that the execution orders of the behaviors represented by at least two pieces of user operation information are associated;
a scaling process, used for indicating that the execution duration of the behavior represented by the user operation information is lengthened or shortened; and
a clipping process, used for indicating that execution of the behavior represented by the user operation information is divided into a plurality of sub-behaviors, or that execution of the behavior represented by the user operation information is incomplete.
In some embodiments, the processor 602 is further configured to: in the process of acquiring a plurality of user operation instructions, if a pause instruction is received, pause the storage of the user operation information.
In some embodiments, the control apparatus 600 further comprises a display 603, configured to display a behavior play bar on the interactive interface; wherein, during execution of the behavior, the behavior play bar dynamically changes.
In some embodiments, the instruction for instructing to process the user operation information is generated according to the operation of the user on the behavior play bar and/or the operation of the user on the virtual key on the interactive interface.
In some embodiments, the display 603 is further configured to display at least one behavior identifier on the interactive interface, each behavior identifier being indicative of a set of behaviors characterized by at least one user operation information.
In some embodiments, the display 603 is further configured to, when one of the at least one behavior identifier is selected, dynamically change the selected behavior identifier on the interactive interface.
In some embodiments, the processor 602 is further configured to obtain a touch instruction of a user on the interactive interface; the display 603 is further configured to display sliding switching of the behavior identifier on the interactive interface according to the sliding direction indicated by the touch instruction.
In some embodiments, the processor 602 is further configured to:
before the behavior is executed, when the preset trigger condition is a specific trigger condition, detect whether the intelligent device meets the preset condition for executing the behavior; if not, control the intelligent device to execute a corresponding operation so as to meet the preset condition for executing the behavior.
In some embodiments, the processor 602 is further configured to:
and when the user operation instruction is obtained, controlling the intelligent equipment to execute the behavior according to the user operation information.
In some embodiments, the smart device includes a movable platform.
The control device 600 of this embodiment may be configured to execute the technical solutions corresponding to the method embodiments, and the implementation principles and technical effects thereof are similar, and are not described herein again.
The embodiment of the present application further provides a computer storage medium in which program instructions are stored; when the program instructions are executed, some or all of the steps of the method in the corresponding embodiment above may be performed.
Fig. 16 is a schematic structural diagram of a control system of a smart device according to an embodiment of the present application, where the system includes a control device 701 and a smart device 702;
the control device 701 is configured to obtain a user operation instruction, where the user operation instruction includes user operation information used for characterizing a behavior of the smart device 702, and to send the user operation information to the smart device 702.
And the intelligent device 702 is configured to store the user operation information, and execute a behavior according to the user operation information when it is determined that the preset trigger condition is met.
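The division of labor in this system — the control device sends the recorded operation information, and the smart device stores it and executes only once the trigger condition is met — can be sketched as follows. The class and method names are hypothetical placeholders for illustration, not the actual protocol.

```python
class SmartDevice:
    """Illustrative sketch: stores received user operation information
    and executes the behaviors only when a trigger condition is met."""

    def __init__(self):
        self.stored = []
        self.executed = []

    def receive(self, operation_info):
        self.stored.append(operation_info)  # store only; do not execute yet

    def on_trigger(self, condition_met):
        if condition_met:
            for info in self.stored:        # execute in stored order
                self.executed.append(info)

class ControlDevice:
    """Illustrative sketch: forwards user operation information
    to the smart device it controls."""

    def __init__(self, device):
        self.device = device

    def send(self, operation_info):
        self.device.receive(operation_info)

device = SmartDevice()
controller = ControlDevice(device)
controller.send({"behavior": "spin"})
controller.send({"behavior": "stop"})
device.on_trigger(condition_met=True)
```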
In some embodiments, the user operational information includes any one or more of: touch information, physical control operation information, somatosensory information and voice information.
In some embodiments, the user operation information includes a behavior parameter indicating a behavior; the behavioral parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
In some embodiments, the number of the user operation instructions is multiple, and each user operation instruction further includes an instruction identifier for identifying the user operation instruction.
In some embodiments, the control device 701 is specifically configured to obtain a plurality of user operation instructions, and send user operation information corresponding to the plurality of user operation instructions to the intelligent device 702.
The intelligent device 702 is specifically configured to store user operation information corresponding to a plurality of user operation instructions in an associated manner, so as to obtain a user operation information set; and executing the behavior represented by each user operation information according to the user operation information set when the preset trigger condition is determined to be met.
In some embodiments, the generation times of the plurality of user operation instructions are continuous; when the preset triggering condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
In some embodiments, the control device 701 is further configured to: in the process of acquiring a plurality of user operation instructions, if a tentative instruction is received, recording of user operation information is tentative.
In some embodiments, the generation time of at least part of the user operation instructions is discontinuous.
In some embodiments, the control device 701 is further configured to: and before the user operation instruction is acquired, acquiring a starting instruction, wherein the starting instruction is used for indicating the recording start of the user operation information.
In some embodiments, the control device 701 is further configured to: and acquiring an ending instruction, wherein the ending instruction is used for indicating the recording end of the user operation information.
In some embodiments, the ending instruction is generated according to the time indicated by the start instruction and a preset time length.
In some embodiments, the control device 701 is further configured to perform editing processing on at least one piece of user operation information, so as to obtain processed user operation information.
The smart device 702 is further configured to execute the behavior represented by the processed user operation information when a preset trigger condition is met.
In some embodiments, the editing process includes any one or more of:
a moving process, used for indicating adjustment of the execution order of the behaviors represented by the user operation information;
a deletion process, used for indicating deletion of the user operation information;
an adding process, used for indicating addition of at least one piece of user operation information to form a user operation information set;
a superposition process, used for indicating that the execution orders of the behaviors represented by at least two pieces of user operation information are associated;
a scaling process, used for indicating that the execution duration of the behavior represented by the user operation information is lengthened or shortened; and
a clipping process, used for indicating that execution of the behavior represented by the user operation information is divided into a plurality of sub-behaviors, or that execution of the behavior represented by the user operation information is incomplete.
In some embodiments, the control device 701 is further configured to: displaying a behavior play bar on the interactive interface; wherein, in the execution process of the behavior, the behavior play bar dynamically changes.
In some embodiments, the instruction for instructing to process the user operation information is generated according to the operation of the user on the behavior play bar and/or the operation of the user on the virtual key on the interactive interface.
In some embodiments, the control device 701 is further configured to: and displaying at least one behavior identifier on the interactive interface, wherein each behavior identifier is used for indicating a set of behaviors characterized by at least one piece of user operation information.
In some embodiments, the control device 701 is further configured to: when one behavior identifier in the at least one behavior identifier is selected, the selected behavior identifier is dynamically changed on the interactive interface.
In some embodiments, the control device 701 is further configured to: acquiring a touch instruction of a user on an interactive interface; and displaying the sliding switching of the behavior identifier on the interactive interface according to the sliding direction indicated by the touch instruction.
In some embodiments, the smart device 702 is further configured to: before executing the behavior, when the preset trigger condition is a specific trigger condition, detect whether the smart device 702 meets the preset condition for executing the behavior; if not, execute a corresponding operation so as to meet the preset condition for executing the behavior.
In some embodiments, the smart device 702 is further configured to: when the user operation instruction is obtained, the intelligent device 702 is controlled to execute the behavior according to the user operation information.
In some embodiments, the smart device 702 comprises a movable platform.
The control device 701 may refer to the structure and implementation manner of the related embodiment, and the intelligent device 702 may refer to the structure and implementation manner of the related embodiment, which have similar implementation principles and technical effects and are not described herein again.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: various media capable of storing program codes, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (85)

1. A control method of an intelligent device is characterized by comprising the following steps:
acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment;
and storing the user operation information so as to control the intelligent equipment to execute the behavior according to the user operation information when a preset trigger condition is met.
2. The method of claim 1, wherein the user operation information comprises any one or more of:
touch information, physical control operation information, somatosensory information and voice information.
3. The method according to claim 1 or 2, wherein the user operation information includes a behavior parameter indicating the behavior;
the behavior parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
4. The method according to any one of claims 1 to 3, wherein the number of the user operation instructions is plural, and each user operation instruction further comprises an instruction identifier for identifying the user operation instruction.
5. The method according to any one of claims 1 to 4, wherein the obtaining of the user operation instruction comprises:
acquiring a plurality of user operation instructions;
the storing the user operation information to control the intelligent device to execute the behavior according to the user operation information when a preset trigger condition is met, includes:
and associating and storing user operation information corresponding to the plurality of user operation instructions to obtain a user operation information set so as to control the intelligent equipment to execute the behavior represented by each user operation information according to the user operation information set when a preset trigger condition is met.
6. The method according to claim 5, wherein the generation times of a plurality of the user operation instructions are continuous;
when the preset trigger condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
7. The method of claim 6, further comprising:
in the process of obtaining a plurality of the user operation instructions, if a pause instruction is received, pausing the recording of the user operation information.
8. The method of claim 5, wherein the generation times of at least some of the user operation instructions are discontinuous.
9. The method according to any one of claims 1-8, wherein prior to said obtaining the user operation instruction, the method further comprises:
and acquiring a starting instruction, wherein the starting instruction is used for indicating the recording start of the user operation information.
10. The method of claim 9, further comprising:
and acquiring an end instruction, wherein the end instruction is used for indicating the recording end of the user operation information.
11. The method according to claim 10, wherein the end command is generated according to the time indicated by the start command and a preset time length.
12. The method according to any one of claims 1-11, further comprising:
and editing at least one piece of user operation information to obtain the processed user operation information, so as to control the intelligent equipment to execute the behavior represented by the processed user operation information when the preset trigger condition is met.
13. The method of claim 12, wherein the editing process comprises any one or more of:
a moving process for instructing to adjust an execution order of the behaviors represented by the user operation information;
a deletion process for instructing deletion of the user operation information;
adding processing for instructing to add at least one piece of user operation information to form a user operation information set;
the superposition processing is used for indicating the execution sequence of the behaviors represented by at least two pieces of user operation information to be correlated;
a scaling process for indicating that an execution time length of a behavior characterized by the user operation information is lengthened or shortened;
a clipping process to indicate that execution of a behavior characterized by the user operation information is partitioned into a plurality of child behaviors or that execution of a behavior characterized by the user operation information is incomplete.
14. The method according to any one of claims 1-13, further comprising:
displaying a behavior play bar on the interactive interface;
and in the execution process of the behavior, the behavior play bar is dynamically changed.
15. The method according to claim 14, wherein the instruction for instructing to process the user operation information is generated according to a user operation on the behavior play bar and/or a user operation on a virtual key on the interactive interface.
16. The method according to any one of claims 1-15, further comprising:
and displaying at least one behavior identifier on an interactive interface, wherein each behavior identifier is used for indicating a set of behaviors characterized by at least one piece of user operation information.
17. The method of claim 16, further comprising:
when one behavior identifier in at least one behavior identifier is selected, the selected behavior identifier is dynamically changed on the interactive interface.
18. The method of claim 16, further comprising:
acquiring a touch instruction of a user on the interactive interface;
and displaying the sliding switching of the behavior identifier on the interactive interface according to the sliding direction indicated by the touch instruction.
19. The method of any of claims 1-18, wherein prior to the performance of the action, the method further comprises:
when the preset trigger condition is a specific trigger condition, detecting whether the intelligent equipment meets the preset condition for executing the behavior;
if not, controlling the intelligent equipment to execute corresponding operation so as to meet the preset condition for executing the behavior.
20. The method according to any one of claims 1-19, further comprising:
and when the user operation instruction is acquired, controlling the intelligent equipment to execute the behavior according to the user operation information.
21. The method of any one of claims 1-20, wherein the smart device comprises a movable platform.
22. A control method of an intelligent device is applied to the control device of the intelligent device, and is characterized by comprising the following steps:
acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing the behavior of the intelligent equipment;
and sending the user operation information to the intelligent equipment so that the intelligent equipment stores the user operation information and executes the behavior according to the user operation information when a preset trigger condition is met.
23. The control method according to claim 22, wherein the user operation information includes any one or more of:
touch information, physical control operation information, somatosensory information and voice information.
24. The method according to claim 22 or 23, wherein the user operation information includes a behavior parameter indicating the behavior;
the behavior parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
25. The method according to any one of claims 22 to 24, wherein the number of the user operation instructions is plural, and each of the user operation instructions further includes an instruction identifier for identifying the user operation instruction.
26. The method according to any one of claims 22-25, wherein the obtaining of the user operation instruction comprises:
acquiring a plurality of user operation instructions;
the sending the user operation information to the intelligent device includes:
and sending a user operation information set formed by the user operation information corresponding to the plurality of user operation instructions to the intelligent equipment.
27. The method according to claim 26, wherein the generation times of a plurality of the user operation instructions are continuous;
when the preset trigger condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
28. The method of claim 27, further comprising:
in the process of obtaining a plurality of the user operation instructions, if a pause instruction is received, pausing the recording of the user operation information.
29. The method of claim 26, wherein the generation times of at least some of the user operation instructions are discontinuous.
30. The method according to any one of claims 22-29, wherein prior to said obtaining said user operation instruction, said method further comprises:
and acquiring a starting instruction, wherein the starting instruction is used for indicating the recording start of the user operation information.
31. The method of claim 30, further comprising:
and acquiring an end instruction, wherein the end instruction is used for indicating the recording end of the user operation information.
32. The method of claim 31, wherein the end command is generated according to a time indicated by the start command and a preset time duration.
33. The method according to any one of claims 22-32, further comprising:
and editing at least one piece of user operation information to obtain the processed user operation information, so as to control the intelligent equipment to execute the behavior represented by the processed user operation information when the preset trigger condition is met.
34. The method of claim 33, wherein the editing process comprises any one or more of:
a moving process for instructing to adjust an execution order of the behaviors represented by the user operation information;
a deletion process for instructing deletion of the user operation information;
adding processing for instructing to add at least one piece of user operation information to form a user operation information set;
the superposition processing is used for indicating the execution sequence of the behaviors represented by at least two pieces of user operation information to be correlated;
a scaling process for indicating that an execution time length of a behavior characterized by the user operation information is lengthened or shortened;
a clipping process to indicate that execution of a behavior characterized by the user operation information is partitioned into a plurality of child behaviors or that execution of a behavior characterized by the user operation information is incomplete.
35. The method according to any one of claims 22-34, further comprising:
displaying a behavior play bar on the interactive interface;
and in the execution process of the behavior, the behavior play bar is dynamically changed.
36. The method according to claim 35, wherein the instruction for instructing to process the user operation information is generated according to a user operation on the behavior play bar and/or a user operation on a virtual key on the interactive interface.
37. The method according to any one of claims 22-36, further comprising:
and displaying at least one behavior identifier on an interactive interface, wherein each behavior identifier is used for indicating a set of behaviors represented by at least one piece of user operation information.
38. The method of claim 37, further comprising:
when one of the at least one behavior identifier is selected, the selected behavior identifier changes dynamically on the interactive interface.
39. The method of claim 37, further comprising:
acquiring a touch instruction of a user on the interactive interface;
and displaying sliding switching of the behavior identifiers on the interactive interface according to a sliding direction indicated by the touch instruction.
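Claim 39's slide-to-switch behavior can be sketched as a simple index update driven by the swipe direction (a minimal model; the mapping of direction to step is a hypothetical choice, not stated in the patent):

```python
def switch_identifier(index: int, direction: str, count: int) -> int:
    """Select the next or previous behavior identifier from a swipe direction,
    wrapping around the available identifiers."""
    step = 1 if direction == "left" else -1  # assume swipe-left advances
    return (index + step) % count

idx = switch_identifier(0, "left", 3)     # advance to the next identifier
idx = switch_identifier(idx, "right", 3)  # swipe back to the original one
```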
40. The method of any of claims 22-39, wherein prior to the performance of the action, the method further comprises:
when the preset trigger condition is a specific trigger condition, detecting whether the smart device meets a preset condition for executing the behavior;
if not, controlling the smart device to execute a corresponding operation so that the preset condition for executing the behavior is met.
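One way to read claim 40 is a precondition check with a corrective step before replay; this sketch shows the shape of that logic (the state keys and the corrective action are invented for illustration):

```python
def ensure_preconditions(device_state: dict, required: dict) -> dict:
    """Check each precondition for executing a recorded behavior; if one is
    not met, perform the corresponding corrective operation (modeled here as
    simply driving the state to the required value)."""
    for key, value in required.items():
        if device_state.get(key) != value:
            # e.g. take off first if the behavior requires being airborne
            device_state[key] = value
    return device_state

state = ensure_preconditions({"armed": True, "airborne": False}, {"airborne": True})
```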
41. The method of any one of claims 22-40, further comprising:
and when the user operation instruction is acquired, controlling the smart device to execute the behavior according to the user operation information.
42. The method of any one of claims 22-41, wherein the smart device comprises a movable platform.
43. A control device of an intelligent device, comprising: a processor and a memory;
the memory for storing program code;
the processor, configured to invoke the program code, when the program code is executed, is configured to perform the following:
acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing a behavior of the smart device;
and storing the user operation information, so as to control the smart device to execute the behavior according to the user operation information when a preset trigger condition is met.
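The record-then-replay core of claim 43 can be sketched as follows; the class, dict schema, and trigger callback are illustrative assumptions rather than the patent's actual implementation:

```python
class OperationRecorder:
    """Store user operation information describing behaviors so the smart
    device can execute them later, when a trigger condition is met."""

    def __init__(self):
        self.recorded = []

    def on_user_instruction(self, info: dict):
        # each user operation instruction carries information representing a behavior
        self.recorded.append(info)

    def on_trigger(self, execute):
        # replay the behaviors in the order they were recorded
        for info in self.recorded:
            execute(info)

rec = OperationRecorder()
rec.on_user_instruction({"behavior": "move", "duration": 2.0})
rec.on_user_instruction({"behavior": "rotate", "duration": 1.0})
played = []
rec.on_trigger(played.append)
```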
44. The control device of claim 43, wherein the user operational information includes any one or more of:
touch information, physical control operation information, somatosensory information and voice information.
45. The control device according to claim 43 or 44, wherein the user operation information includes a behavior parameter indicating the behavior;
the behavior parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
46. The control device according to any one of claims 43 to 45, wherein the number of the user operation instructions is plural, and each of the user operation instructions further includes an instruction identifier for identifying the user operation instruction.
47. The control device of any of claims 43-46, wherein the processor is specifically configured to:
acquiring a plurality of user operation instructions;
and associating and storing the user operation information corresponding to the plurality of user operation instructions to obtain a user operation information set, so as to control the smart device to execute the behavior represented by each piece of user operation information according to the user operation information set when a preset trigger condition is met.
48. The control device according to claim 47, wherein the generation times of a plurality of the user operation instructions are continuous;
when the preset trigger condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
49. The control device of claim 48, wherein the processor is further configured to:
in the process of acquiring the plurality of user operation instructions, if a pause instruction is received, recording of the user operation information is paused.
50. The control device of claim 47, wherein at least some of the user operation instructions are generated discontinuously.
51. The control device of any one of claims 43-50, wherein the processor is further configured to:
and before the user operation instruction is obtained, obtaining a starting instruction, wherein the starting instruction is used for indicating the recording start of the user operation information.
52. The control device of claim 51, wherein the processor is further configured to:
and acquiring an end instruction, wherein the end instruction is used for indicating the recording end of the user operation information.
53. The control device of claim 52, wherein the end instruction is generated according to a time indicated by the start instruction and a preset time duration.
54. The control device of any one of claims 43-53, wherein the processor is further configured to:
and performing an editing process on at least one piece of the user operation information to obtain processed user operation information, so as to control the smart device to execute the behavior represented by the processed user operation information when the preset trigger condition is met.
55. The control device of claim 54, wherein the editing process comprises any one or more of:
a moving process for instructing adjustment of the execution order of behaviors represented by the user operation information;
a deletion process for instructing deletion of the user operation information;
an adding process for instructing addition of at least one piece of user operation information to form a user operation information set;
a superposition process for indicating that the execution sequences of behaviors represented by at least two pieces of user operation information are associated;
a scaling process for indicating that the execution duration of a behavior represented by the user operation information is lengthened or shortened;
a clipping process for indicating that a behavior represented by the user operation information is divided into a plurality of sub-behaviors or is executed only in part.
56. The control device of any one of claims 43-55, further comprising: a display;
the display is used for displaying a behavior play bar on the interactive interface;
and during execution of the behavior, the behavior play bar changes dynamically.
57. The control device of claim 56, wherein the instruction for instructing to process the user operation information is generated according to a user operation on the behavior play bar and/or a user operation on a virtual key on the interactive interface.
58. The control device of any one of claims 43-57, further comprising: a display;
the display is used for displaying at least one behavior identifier on the interactive interface, wherein each behavior identifier is used for indicating a set of behaviors represented by at least one piece of user operation information.
59. The control device of claim 58, wherein the display is further configured to:
when one of the at least one behavior identifier is selected, the selected behavior identifier changes dynamically on the interactive interface.
60. The control device of claim 58, wherein the processor is further configured to obtain a touch instruction of a user on the interactive interface;
the display is further configured to display the sliding switching of the behavior identifier on the interactive interface according to the sliding direction indicated by the touch instruction.
61. The control device of any one of claims 43-60, wherein the processor is further configured to:
before the behavior is executed, when the preset trigger condition is a specific trigger condition, detect whether the smart device meets a preset condition for executing the behavior; and if not, control the smart device to execute a corresponding operation so that the preset condition for executing the behavior is met.
62. The control device of any one of claims 43-61, wherein the processor is further configured to:
when the user operation instruction is acquired, control the smart device to execute the behavior according to the user operation information.
63. The control apparatus of any of claims 43-62, wherein the smart device comprises a movable platform.
64. A control system for a smart device, comprising: a control device and the smart device;
the control device is used for acquiring a user operation instruction, wherein the user operation instruction comprises user operation information used for representing a behavior of the smart device, and sending the user operation information to the smart device;
and the smart device is used for storing the user operation information and executing the behavior according to the user operation information when a preset trigger condition is met.
65. The system of claim 64, wherein the user operational information comprises any one or more of:
touch information, physical control operation information, somatosensory information and voice information.
66. The system according to claim 64 or 65, wherein the user operation information includes a behavior parameter indicating the behavior;
the behavior parameters include any one or more of the following: movement parameters, pose parameters, shooting parameters, skill parameters, and audio parameters.
67. The system according to any one of claims 64-66, wherein the number of the user operation instructions is plural, and each of the user operation instructions further comprises an instruction identifier for identifying the user operation instruction.
68. The system according to any one of claims 64 to 67, wherein the control device is specifically configured to obtain a plurality of user operation instructions, and send user operation information corresponding to the plurality of user operation instructions to the smart device;
the smart device is specifically configured to store the user operation information corresponding to the plurality of user operation instructions in an associated manner to obtain a user operation information set, and to execute, when it is determined that the preset trigger condition is met, the behavior represented by each piece of user operation information according to the user operation information set.
69. The system according to claim 68, wherein the generation time of a plurality of said user operation instructions is continuous;
when the preset trigger condition is met, the execution sequence of the behaviors represented by the user operation information is the same as the generation sequence of the user operation instructions.
70. The system of claim 69, wherein the control device is further configured to:
in the process of acquiring the plurality of user operation instructions, if a pause instruction is received, recording of the user operation information is paused.
71. The system according to claim 68, wherein at least some of the user operation instructions are generated discontinuously.
72. The system of any one of claims 64-71, wherein the control device is further configured to:
and before the user operation instruction is obtained, obtaining a starting instruction, wherein the starting instruction is used for indicating the recording start of the user operation information.
73. The system of claim 72, wherein the control device is further configured to:
and acquiring an end instruction, wherein the end instruction is used for indicating the recording end of the user operation information.
74. The system of claim 73, wherein the end instruction is generated according to a time indicated by the start instruction and a preset time duration.
75. The system according to any one of claims 64-74, wherein the control device is further configured to perform an editing process on at least one piece of the user operation information to obtain processed user operation information;
and the smart device is further configured to execute the behavior represented by the processed user operation information when the preset trigger condition is met.
76. The system according to claim 75, wherein the editing process comprises any one or more of:
a moving process for instructing adjustment of the execution order of behaviors represented by the user operation information;
a deletion process for instructing deletion of the user operation information;
an adding process for instructing addition of at least one piece of user operation information to form a user operation information set;
a superposition process for indicating that the execution sequences of behaviors represented by at least two pieces of user operation information are associated;
a scaling process for indicating that the execution duration of a behavior represented by the user operation information is lengthened or shortened;
a clipping process for indicating that a behavior represented by the user operation information is divided into a plurality of sub-behaviors or is executed only in part.
77. The system of any one of claims 64-76, wherein the control device is further configured to:
displaying a behavior play bar on the interactive interface;
and during execution of the behavior, the behavior play bar changes dynamically.
78. The system according to claim 77, wherein the instruction for instructing to process the user operation information is generated according to a user operation on the behavior play bar and/or a user operation on a virtual key on the interactive interface.
79. The system of any one of claims 64-78, wherein the control device is further configured to:
and displaying at least one behavior identifier on an interactive interface, wherein each behavior identifier is used for indicating a set of behaviors represented by at least one piece of user operation information.
80. The system of claim 79, wherein the control device is further configured to:
when one of the at least one behavior identifier is selected, the selected behavior identifier changes dynamically on the interactive interface.
81. The system of claim 79, wherein the control device is further configured to:
acquiring a touch instruction of a user on the interactive interface;
and displaying sliding switching of the behavior identifiers on the interactive interface according to a sliding direction indicated by the touch instruction.
82. The system according to any one of claims 64-81, wherein the smart device is further configured to:
before the behavior is executed, when the preset trigger condition is a specific trigger condition, detecting whether the smart device meets a preset condition for executing the behavior;
if not, executing a corresponding operation so that the preset condition for executing the behavior is met.
83. The system according to any one of claims 64-82, wherein the smart device is further configured to:
and when the user operation instruction is acquired, executing the behavior according to the user operation information.
84. The system of any one of claims 64-83, wherein the smart device comprises a movable platform.
85. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of claims 1-21 or the method of any one of claims 22-42.
CN201980040074.9A 2019-11-28 2019-11-28 Control method, device and system of intelligent equipment and storage medium Pending CN112313590A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/121612 WO2021102800A1 (en) 2019-11-28 2019-11-28 Smart device control method, apparatus, system, and storage medium

Publications (1)

Publication Number Publication Date
CN112313590A true CN112313590A (en) 2021-02-02

Family

ID=74336570

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980040074.9A Pending CN112313590A (en) 2019-11-28 2019-11-28 Control method, device and system of intelligent equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112313590A (en)
WO (1) WO2021102800A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023028830A1 (en) * 2021-08-31 2023-03-09 深圳市大疆创新科技有限公司 Method and device for controlling unmanned aerial vehicle, and unmanned aerial vehicle and storage medium
CN117250883A (en) * 2022-12-06 2023-12-19 北京小米机器人技术有限公司 Intelligent device control method, intelligent device control device, storage medium and chip

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103156645A (en) * 2013-03-18 2013-06-19 飞依诺科技(苏州)有限公司 Workflow self-custom-making method and device for ultrasonic diagnostic device
CN106790424A (en) * 2016-12-01 2017-05-31 同方工业信息技术有限公司 Time control method, client, server and timing control system
US20170315545A1 (en) * 2016-04-29 2017-11-02 Shenzhen Hubsan Technology Co., Ltd. Method for recording flight path and controlling automatic flight of unmanned aerial vehicle
CN108227729A (en) * 2016-12-15 2018-06-29 北京臻迪机器人有限公司 A kind of motion sensing control system and motion sensing control method
CN108563161A (en) * 2018-01-22 2018-09-21 深圳市牧激科技有限公司 Open type intelligent control method, system and computer readable storage medium


Also Published As

Publication number Publication date
WO2021102800A1 (en) 2021-06-03

Similar Documents

Publication Publication Date Title
US11198066B2 (en) Control method and apparatus for game character, electronic device and readable medium
CN108597530B (en) Sound reproducing method and apparatus, storage medium and electronic apparatus
US9672756B2 (en) System and method for toy visual programming
US10086267B2 (en) Physical gesture input configuration for interactive software and video games
US9333420B2 (en) Computer readable medium recording shooting game
CN110262730A (en) Edit methods, device, equipment and the storage medium of game virtual resource
JP2018195177A (en) Information processing method, device and program causing computer to execute information processing method
CN112245921B (en) Virtual object control method, device, equipment and storage medium
CN109076263A (en) Video data handling procedure, equipment, system and storage medium
TWI831074B (en) Information processing methods, devices, equipments, computer-readable storage mediums, and computer program products in virtual scene
US20090219291A1 (en) Movie animation systems
CN112313590A (en) Control method, device and system of intelligent equipment and storage medium
CN112121417B (en) Event processing method, device, equipment and storage medium in virtual scene
KR20220083803A (en) Method, apparatus, medium and program product for state switching of virtual scene
CN114159787A (en) Control method and device of virtual object, electronic equipment and readable medium
CN107096223B (en) Movement control method and device in virtual reality scene and terminal equipment
US8487928B2 (en) Game program, game apparatus, and game control method
WO2019097793A1 (en) Information processing device and information processing method, computer program, and program production method
JP2007299330A (en) Image display device and its control method and program
KR20210011383A (en) Virtual camera placement system
WO2022156490A1 (en) Picture display method and apparatus in virtual scene, device, storage medium, and program product
US20120252575A1 (en) Game device, game device control method, and information storage medium
EP3556443A1 (en) Tangible mobile game programming environment for non-specialists
CN113440850A (en) Virtual object control method and device, storage medium and electronic device
CN110141850B (en) Action control method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination