WO2021179143A1 - Control system, method, electronic device, movable device, and computer-readable storage medium - Google Patents

Control system, method, electronic device, movable device, and computer-readable storage medium

Info

Publication number
WO2021179143A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
controlled device
instruction
interface
controlled
Prior art date
Application number
PCT/CN2020/078478
Other languages
English (en)
French (fr)
Inventor
张鹏辉
程亮
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080038669.3A priority Critical patent/CN113874175A/zh
Priority to PCT/CN2020/078478 priority patent/WO2021179143A1/zh
Publication of WO2021179143A1 publication Critical patent/WO2021179143A1/zh

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls

Definitions

  • This application relates to the field of device control, and in particular to a control system, method, electronic device, movable device, and computer-readable storage medium.
  • Current programming methods are mainly based on module control instruction programming: the user writes a segment of program code and controls the educational robot by running it, but to control the educational robot again the user must rewrite the program or re-run it. This is cumbersome and detracts from the user experience.
  • In view of this, one of the objectives of the present invention is to provide a control system, method, electronic device, movable device, and computer-readable storage medium.
  • According to a first aspect, a control system is provided, including a controlled device and a control application, the control application being loaded on an electronic device;
  • the control application includes a control generation layer and a first communication logic layer;
  • the control generation layer is used to: provide a programming interface and a UI interactive interface, generate a control based on the source code on the programming interface, and, when the control is displayed on the UI interactive interface, in response to a trigger event on the control on the UI interactive interface, generate a first control instruction based on the communication protocol supported by the control generation layer and the operation of the controlled device associated with the control; wherein the operation of the controlled device associated with the control is determined based on the interface, provided by the controlled device, that is called in the source code;
  • the first communication logic layer is used to: convert the first control instruction into a second control instruction in the communication protocol supported by the controlled device;
  • the controlled device is used to: execute, according to the second control instruction, the operation of the controlled device associated with the control.
  • According to a second aspect, a control method is provided, including:
  • generating a control according to source code on a programming interface, and displaying the control on a UI interactive interface;
  • in response to a trigger event on the control on the UI interactive interface, generating a first control instruction based on the communication protocol supported by the programming interface and the operation of the controlled device associated with the control; wherein the operation of the controlled device associated with the control is determined based on the interface, provided by the controlled device, that is called in the source code;
  • converting the first control instruction into a second control instruction in the communication protocol supported by the controlled device;
  • according to the second control instruction, controlling the controlled device to perform the operation of the controlled device associated with the control.
  • According to a third aspect, an electronic device is provided, including a display, a memory, a processor, and a program of a control application that is stored in the memory and executable on the processor; the control application provides a programming interface and a UI interactive interface;
  • the display is used to display the programming interface and the UI interactive interface;
  • the processor calls the program of the control application and, when the program is executed, is used to perform the following operations:
  • generating a control according to the source code on the programming interface, and displaying the control on the UI interactive interface;
  • in response to a trigger event on the control on the UI interactive interface, generating a first control instruction based on the communication protocol supported by the programming interface and the operation of the controlled device associated with the control; wherein the operation of the controlled device associated with the control is determined based on the interface, provided by the controlled device, that is called in the source code;
  • converting the first control instruction into a second control instruction in the communication protocol supported by the controlled device;
  • according to the second control instruction, controlling the controlled device to perform the operation of the controlled device associated with the control.
  • According to a fourth aspect, a movable device is provided, including:
  • a body;
  • a power system, installed in the body, for driving the movable device to move;
  • a communication system, installed in the body, for receiving the second control instruction sent by the electronic device according to the third aspect;
  • a control system, installed in the body, for providing an interface to be called by the control program in the electronic device according to the third aspect, and for performing corresponding operations according to the second control instruction.
  • According to a fifth aspect, a computer-readable storage medium is provided, having computer instructions stored thereon; when the instructions are executed by a processor, the method according to the second aspect is implemented.
  • In these embodiments, a control is generated from the source code on the programming interface and displayed on the UI interactive interface, and in response to a trigger event on the UI interactive interface, the controlled device is controlled to perform the operation of the controlled device associated with the control.
  • The user can thus interact with the controlled device through the control, which increases the richness and freedom of programming, makes the control process interactive, enhances the fun of programming, and improves the user experience.
  • The first control instruction, generated in the communication protocol supported by the control generation layer in response to the trigger event on the UI interactive interface, is converted into a second control instruction in the communication protocol supported by the controlled device, which ensures that the controlled device can accurately parse the converted second control instruction and execute the corresponding operation in correct response to the trigger event on the control.
  • Fig. 1 is a structural diagram of a first control system according to an exemplary embodiment of the present application.
  • Fig. 2 is a structural diagram of a second control system according to an exemplary embodiment of the present application.
  • Fig. 3 is a structural diagram of a third control system according to an exemplary embodiment of the present application.
  • Fig. 4 is a structural diagram of a fourth control system according to an exemplary embodiment of the present application.
  • Fig. 5 is a flow chart of a control method according to an exemplary embodiment of the present application.
  • Fig. 6 is a structural diagram of an electronic device according to an exemplary embodiment of the present application.
  • Fig. 7 is a structural diagram of a movable device according to an exemplary embodiment of the present application.
  • Aiming at the problems in the related art, an embodiment of the present application provides a control system; refer to FIG. 1, a structural diagram of the first control system according to an exemplary embodiment of the present application.
  • This embodiment can generate controls automatically from code or through user programming, and then use the controls to control the controlled device 20 to perform corresponding operations.
  • The controls persist after generation and do not need to be generated repeatedly, which reduces tedious user operations; moreover, the user can interact with the controlled device 20 through the controls, which increases the richness and freedom of programming and makes the control process interactive.
  • In the embodiment shown in FIG. 1, the control system includes a controlled device 20 and a control application 10, and the control application 10 is loaded on an electronic device.
  • The electronic device includes, but is not limited to, a device equipped with a display, such as a smartphone, a computer, a tablet, or a personal digital assistant (PDA).
  • The controlled device 20 includes, but is not limited to, movable robots, educational robots, unmanned vehicles, unmanned aerial vehicles, and unmanned vessels.
  • In this embodiment, the system automatically generates controls from code, or the user writes a program in the control application 10 to generate controls, and the controls are then used to control the controlled device 20 to perform corresponding operations.
  • The controlled device 20 and the control application 10 may be connected through a wired or wireless connection; FIG. 1 illustrates the case where they are connected through a wireless network.
  • The communication technologies usable for such a connection include, but are not limited to, short-range wireless communication technologies and mobile communication protocol technologies.
  • The short-range wireless communication technology may be infrared, WiFi, Bluetooth, UWB, or ZigBee, among others.
  • The mobile communication protocol technology may be 3G, 4G, GSM, or GPRS, among others.
  • Further, referring to FIG. 2, a structural diagram of a second control system according to an exemplary embodiment of the present application, the control application 10 includes a control generation layer 21 and a first communication logic layer 22.
  • Referring to FIG. 3, the control generation layer 21 provides a programming interface 210 and a UI interactive interface 211. The control generation layer 21 generates a control based on the source code on the programming interface 210 and displays the control on the UI interactive interface; in response to a trigger event on the control on the UI interactive interface, it generates a first control instruction based on the communication protocol supported by the control generation layer 21 and the operation of the controlled device 20 associated with the control, where that operation is determined based on the interface, provided by the controlled device 20, that is called in the source code.
  • The first communication logic layer 22 then converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20.
  • Finally, the controlled device 20 executes, according to the second control instruction, the operation of the controlled device 20 associated with the control.
  • In one embodiment, the user can write control-generation code, for example in Python, on the programming interface 210 provided by the control application 10. After receiving a run instruction for the source program written by the user on the programming interface 210, the control generation layer 21 responds to the run instruction by generating a control according to the source program on the programming interface 210 and displaying the generated control on the UI interactive interface 211.
  • As an example, the controlled device includes an educational robot. Educational robots are mainly aimed at children and teenagers, so the control-generation code can be written in Python, a widely used interpreted, high-level, general-purpose programming language that is simple, easy to use, and easy to learn. It is therefore suitable for beginners and well suited to writing control-generation code, which reduces the difficulty of writing it.
  • As another example, other programming languages such as Java, C, or C++ may also be used; the embodiments of the present application impose no limitation on this.
  • The user can customize the operation of the controlled device 20 associated with the control. The controlled device 20 provides a number of interfaces for controlling the controlled device 20, and each interface corresponds to an operation of the controlled device 20; the interfaces provided by the controlled device 20 include API interfaces.
  • The control application 10 can pre-store information about the interfaces provided by the controlled device 20, so that when programming on the programming interface 210 the user can freely determine, according to actual needs, the operation of the controlled device 20 to be associated with the control to be generated; that is, the user can select one or more interfaces to be called from the pre-stored interface information.
  • Letting the user define the operation of the controlled device 20 associated with a control in this way increases the fun of programming: the custom-control capability increases the richness of programming and the sense of interaction with the controlled device 20, and the freedom of configuration expands the freedom of programming and improves the user experience.
  • The information about the interfaces provided by the controlled device 20 and pre-stored in the control application 10 may be obtained by the control application 10 from the controlled device 20 itself; alternatively, the cloud corresponding to the controlled device 20 may already store this interface information when the controlled device 20 leaves the factory, in which case the control application 10 obtains it from that cloud. The embodiments of the present application impose no restriction on this.
  • If one interface is called by multiple custom controls at the same time, the controlled device 20 cannot respond to all of these calls simultaneously, which leads to abnormal control behavior. Therefore, in one possible implementation, when a custom control is generated according to the source program on the programming interface 210, it can be detected whether the interface provided by the controlled device 20 and called in the source code has already been called by another custom control. If so, a reminder message is output to indicate that the interface called in the source code is being called repeatedly, prompting the user to change the called interface or destroy the control that already calls it. This avoids repeated calls to the same interface and ensures that the process of controlling the controlled device 20 through custom controls proceeds correctly.
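  • As a purely illustrative sketch (not the application's actual implementation; the registry structure and names below are assumptions), such duplicate-call detection could be organized as follows:

```python
# Minimal sketch of how the control generation layer could detect that an
# interface provided by the controlled device is already called by another
# custom control. The registry and names are illustrative assumptions.

# Registry mapping interface name -> name of the control that already calls it.
interface_registry: dict = {}

def register_control(control_name: str, called_interfaces: list) -> list:
    """Record the interfaces called by a new control and return any conflicts."""
    conflicts = [i for i in called_interfaces if i in interface_registry]
    if conflicts:
        for interface in conflicts:
            print(f"Reminder: interface '{interface}' is already called by "
                  f"control '{interface_registry[interface]}'. Change the called "
                  f"interface or destroy that control before generating '{control_name}'.")
        return conflicts
    for interface in called_interfaces:
        interface_registry[interface] = control_name
    return []

# Example: control A already calls the chassis movement interface, so
# generating control B against the same interface triggers a reminder.
register_control("control_A", ["chassis.move"])
register_control("control_B", ["chassis.move"])
```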
  • In addition, when a custom control is generated according to the source program on the programming interface 210, the control generation layer 21 may also send a control generation notification to the controlled device 20 through the first communication logic layer 22 (such as an RDK-based layer). The control generation notification informs the controlled device 20 of the control generated by the control application 10 and includes the information about the interface called by the control, so that the controlled device 20 can record the correspondence between the control and the interface it provides, that is, associate the control with the operation of the controlled device 20 to facilitate the subsequent interactive control process.
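  • For illustration only, a control generation notification of this kind might carry information along the following lines; the field names and layout are assumptions rather than any actual message format defined by the application:

```python
# Illustrative sketch of the information a control generation notification
# could carry so the controlled device can record the control-to-interface
# correspondence. The field names are assumptions for illustration.

import json

control_generation_notification = {
    "type": "control_generated",
    "control_id": "move_button_01",
    "control_type": "button",
    # Information about the interface called by the control, so the controlled
    # device can associate the control with the corresponding operation.
    "called_interfaces": ["chassis.move"],
}

# The notification would be serialized and sent through the first
# communication logic layer to the controlled device.
payload = json.dumps(control_generation_notification).encode("utf-8")
print(payload)
```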
  • Besides customizing the operation of the controlled device 20 associated with the control, the user can also customize the control properties of the control. When writing the source program on the programming interface 210, the user can write code corresponding to the desired control properties, so that the control generation layer 21 configures the control properties of the control according to the source code on the programming interface 210. This lets the user freely configure both the operation of the controlled device 20 associated with the control and the control properties, which increases the richness, freedom, and fun of programming and improves the user experience.
  • The control properties include, but are not limited to, the appearance attributes, display position, or control type of the control.
  • The appearance attributes include at least one of the following: the color, size, display font, shape, or border of the custom control.
  • The control type includes at least one of the following: button, slider, radio button, drop-down list, text, scroll bar, or input box.
  • In one example, the user-defined control is a button labeled "move"; the button is circular, and its associated operation of the controlled device 20 is a movement operation, that is, the button is used to control the controlled device 20 to move.
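  • The following minimal, self-contained Python sketch illustrates the idea behind such user-written control-generation code; the Button class and chassis_move function are hypothetical stand-ins for the control generation layer and the API interface provided by the controlled device, not the application's real API:

```python
# Self-contained sketch of user-written control-generation code: a circular
# "move" button with configured control properties, bound to a movement
# interface of the controlled device. All names are illustrative stand-ins.

from typing import Callable, Optional

def chassis_move(x: float, y: float, speed: float) -> None:
    """Stand-in for the movement API interface provided by the controlled device."""
    print(f"controlled device moves: x={x}, y={y}, speed={speed}")

class Button:
    """Stand-in for a control offered by the control generation layer."""
    def __init__(self, text: str, shape: str, color: str, size: tuple, position: tuple):
        # Control properties configured from the user's source code.
        self.text, self.shape, self.color = text, shape, color
        self.size, self.position = size, position
        self._handler: Optional[Callable[[], None]] = None

    def on_click(self, handler: Callable[[], None]) -> Callable[[], None]:
        """Associate the control with an operation of the controlled device."""
        self._handler = handler
        return handler

    def trigger(self) -> None:
        """Simulate a click trigger event on the UI interactive interface."""
        if self._handler:
            self._handler()

# The user's source code: a circular "move" button bound to the movement interface.
move_button = Button(text="move", shape="circle", color="#1E90FF",
                     size=(80, 80), position=(120, 300))

@move_button.on_click
def on_move_clicked() -> None:
    chassis_move(x=0.5, y=0.0, speed=0.7)

move_button.trigger()   # simulates the user clicking the button
```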
  • Further, after generating a control according to the source code on the programming interface 210, the control generation layer 21 can display the generated control on the UI interactive interface 211, and the user can perform a corresponding trigger operation on the control. The control generation layer 21 then responds to the trigger event on the control on the UI interactive interface by generating a first control instruction based on the communication protocol supported by the control generation layer 21 and the operation of the controlled device 20 associated with the control, where that operation is determined based on the interface, provided by the controlled device 20, that is called in the source code. In this way the user can interact with the controlled device 20 through the custom control, which gives the user better feedback, increases the user's sense of participation in controlling the controlled device 20, and improves the user experience.
  • The trigger event is determined by the control type configured by the user and includes, but is not limited to: a click event, a double-click event, a slide event, a drag event, an input event, or a long-press event.
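  • Purely as a sketch of this step (the data structures and field names below are assumptions, not the layer's actual format), the generation of a first control instruction from a trigger event could look like this:

```python
# Sketch of how the control generation layer 21 could turn a trigger event on
# a control into a first control instruction in its own protocol.

# Association recorded when the control was generated: control id -> the
# interface of the controlled device called in the source code, plus arguments.
CONTROL_BINDINGS = {"move_button_01": ("chassis.move", {"x": 0.5, "speed": 0.7})}

def on_trigger(control_id: str, event_type: str) -> dict:
    """Generate a first control instruction for a trigger event on a control."""
    interface, args = CONTROL_BINDINGS[control_id]
    return {
        "protocol": "control-generation-layer/v1",  # protocol supported by layer 21
        "event": event_type,                        # click, double click, slide, ...
        "command_id": 0x01,
        "interface": interface,                     # operation of the controlled device
        "args": args,
    }

first_instruction = on_trigger("move_button_01", "click")
print(first_instruction)
```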
  • It should be noted that the electronic device includes a display, and the programming interface 210 and the UI interactive interface 211 may be shown on the display screen presented by the display at the same time, or on separate screens presented by the display; this embodiment imposes no restriction on this.
  • In some application scenarios, in order to lower the complexity of programming so that users can get started quickly, and so that users can also learn the related programming techniques, the technology used to generate controls may be a mature, widely used control generation technology such as Unity, whose supported communication protocol is fixed and public. Meanwhile, to ensure the security and confidentiality of data transmission, a product manufacturer may develop a separate, private communication protocol for its own products. As a result, the communication protocol supported by the control generation layer 21 differs from the communication protocol supported by the controlled device 20; the control application 10 therefore provides the first communication logic layer 22.
  • After the control generation layer 21 responds to the trigger event on the control on the UI interactive interface and generates the first control instruction based on the communication protocol it supports and the operation of the controlled device 20 associated with the control, the first communication logic layer 22 converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20 and sends it to the controlled device 20, so as to control the controlled device 20 to perform the corresponding operation.
  • This protocol conversion by the first communication logic layer 22 ensures that the controlled device 20 can accurately parse the converted second control instruction and perform the corresponding operation in correct response to the trigger event on the control.
  • It should be noted that the first control instruction and the second control instruction differ only in the communication protocol they follow; the data indicating the operation of the controlled device 20 associated with the control is the same in both.
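  • The following minimal sketch illustrates such a conversion, assuming a hypothetical frame and field layout; the actual private protocol of a controlled device would of course differ:

```python
# Sketch of the protocol conversion performed by the first communication
# logic layer 22: the operation data stays the same, only the framing changes.
# The frame layout (magic byte, command id, length, body, checksum) is an
# assumption made for illustration.

import json
import struct

def to_second_instruction(first_instruction: dict) -> bytes:
    """Convert a first control instruction into a hypothetical device frame."""
    # Same operation-related data, re-encoded for the device's protocol.
    body = json.dumps({
        "interface": first_instruction["interface"],
        "args": first_instruction["args"],
    }).encode("utf-8")
    header = struct.pack("<BBH", 0xA5, first_instruction["command_id"], len(body))
    checksum = sum(header + body) & 0xFF
    return header + body + bytes([checksum])

# First control instruction produced by the control generation layer in
# response to a click on the "move" button.
first = {"command_id": 0x01, "interface": "chassis.move", "args": {"x": 0.5, "speed": 0.7}}
second = to_second_instruction(first)
print(second.hex())
```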
  • In another application scenario, if the communication protocol supported by the control generation layer 21 is the same as that supported by the controlled device 20, no protocol conversion through the first communication logic layer 22 is needed.
  • In one embodiment, after receiving the second control instruction, the controlled device 20 executes, according to the second control instruction, the operation of the controlled device 20 associated with the control.
  • In this embodiment, the controlled device accurately parses the second control instruction, responds correctly to the trigger event on the control, and performs the operation that the user indicated through interaction with the control, giving the user better feedback and improving the user experience.
  • As one implementation, referring to FIG. 4, the controlled device 20 includes a second communication logic layer 11 and a business logic layer 12. The second communication logic layer 11 is used to receive and parse the second control instruction and determine the operation of the controlled device 20 associated with the control.
  • Of course, the second communication logic layer 11 may also send other types of instructions to the control application or to other external applications, such as instructions feeding back the state of the controlled device.
  • The business logic layer 12 is used to provide the API interfaces to be called by the control generation layer 21 of the control program, and to perform the operation of the controlled device 20 associated with the control.
  • In this implementation, the controlled device accurately parses the second control instruction, responds correctly to the trigger event on the control, and performs the operation that the user indicated through interaction with the control, giving the user better feedback and improving the user experience.
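  • Continuing the hypothetical frame layout from the earlier conversion sketch, the device-side handling could be organized roughly as follows; the API table and function names are assumptions, not the device's real firmware interfaces:

```python
# Sketch of how the second communication logic layer 11 could parse the second
# control instruction and how the business logic layer 12 could dispatch it to
# the API interface associated with the control. Frame layout as assumed above.

import json
import struct

# Business logic layer: API interface provided by the controlled device.
def chassis_move(x: float = 0.0, y: float = 0.0, speed: float = 0.5) -> None:
    print(f"moving: x={x}, y={y}, speed={speed}")

API_TABLE = {"chassis.move": chassis_move}

def handle_second_instruction(frame: bytes) -> None:
    """Second communication logic layer: parse the frame, then dispatch."""
    magic, command_id, body_len = struct.unpack_from("<BBH", frame, 0)
    if magic != 0xA5:
        raise ValueError("bad magic byte")
    body = frame[4:4 + body_len]
    if (sum(frame[:4 + body_len]) & 0xFF) != frame[4 + body_len]:
        raise ValueError("checksum mismatch")
    instruction = json.loads(body.decode("utf-8"))
    # Determine the operation of the controlled device associated with the
    # control and execute it through the business logic layer's API interface.
    API_TABLE[instruction["interface"]](**instruction["args"])

# Example: build a frame matching the assumed layout and handle it.
body = json.dumps({"interface": "chassis.move", "args": {"x": 0.5, "speed": 0.7}}).encode()
frame = struct.pack("<BBH", 0xA5, 0x01, len(body)) + body
handle_second_instruction(frame + bytes([sum(frame) & 0xFF]))
```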
  • In an exemplary embodiment, the controlled device 20 includes a movable device, and the first control instruction includes a movement instruction for controlling the movement of the controlled device 20. After the first communication logic layer 22 converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20 and sends it to the controlled device 20, the controlled device 20 moves according to the second control instruction.
  • This embodiment controls the movement of the controlled device 20 through a user-defined control, provides a sense of interaction with the controlled device 20, and enhances the richness and freedom of programming.
  • In an exemplary embodiment, the controlled device 20 further includes a camera device (including, but not limited to, a camera), and the first control instruction further includes a shooting instruction for controlling the camera device to shoot. After the first communication logic layer 22 converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20 and sends it to the controlled device 20, the controlled device 20 controls the camera device to shoot according to the second control instruction. Further, the controlled device 20 transmits the images captured by the camera device to the control application 10 through the second communication logic layer 11, and after receiving the images through the first communication logic layer 22, the control application 10 displays them on the UI interactive interface 211 or on another interface.
  • This embodiment controls shooting by the camera device on the controlled device 20 through a user-defined control, provides a sense of interaction with the controlled device 20, and enhances the richness and freedom of programming. Further, the user can watch the first-person-view images captured by the camera device in real time on the UI interactive interface 211, which improves the user experience.
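  • A minimal sketch of this image feedback path, with a queue standing in for the wireless link and placeholder functions for the camera device and UI display, might look like this:

```python
# Illustrative sketch of the image feedback path: the controlled device
# transmits camera frames through the second communication logic layer, and
# the control application receives them through the first communication logic
# layer and shows them on the UI interactive interface 211. The queue and the
# two helper functions are stand-ins, not real APIs.

import queue
import time

link: "queue.Queue[bytes]" = queue.Queue()   # stands in for the WiFi/Bluetooth link

def capture_frame(i: int) -> bytes:
    return f"jpeg-frame-{i}".encode()        # placeholder for camera output

def show_on_ui(frame: bytes) -> None:
    print("UI interactive interface 211 displays:", frame.decode())

# Controlled device side: shoot and transmit frames along the way in real time.
for i in range(3):
    link.put(capture_frame(i))
    time.sleep(0.01)

# Control application side: receive each frame and display it, giving the
# user a real-time first-person view.
while not link.empty():
    show_on_ui(link.get())
```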
  • In an exemplary application scenario, the user defines controls A and B, uses control A to control the movement of the controlled device 20, and uses control B to control the camera device to shoot. While controlling the movement of the controlled device 20 through control A, triggering control B causes the camera device installed on the controlled device 20 to capture the scenery along the way in real time. The controlled device 20 then transmits the captured images to the control application 10 in real time through the second communication logic layer 11, and after receiving them through the first communication logic layer 22, the control application 10 displays them on the UI interactive interface 211 in real time, so that the user can watch a real-time first-person view, which improves the user experience.
  • In an exemplary embodiment, the controlled device 20 further includes a pan/tilt that supports the camera device, and the first control instruction further includes a rotation instruction for controlling the rotation of the pan/tilt. After the first communication logic layer 22 converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20 and sends it to the controlled device 20, the controlled device 20 controls the pan/tilt to rotate according to the second control instruction, thereby changing the scene captured by the camera device.
  • In another exemplary scenario, the user defines controls A, B, and C, uses control A to control the movement of the controlled device 20, uses control B to control the camera device to shoot, and uses control C to control the rotation of the pan/tilt. While controlling the movement of the controlled device 20 through control A, triggering control B causes the camera device installed on the controlled device 20 to capture the scenery along the way in real time; the controlled device 20 transmits the captured images to the control application 10 in real time through the second communication logic layer 11, and after receiving them through the first communication logic layer 22 the control application 10 displays them on the UI interactive interface 211 in real time, so that the user can watch a real-time first-person view. Further, the rotation of the pan/tilt can be controlled through control C to change the scene captured by the camera, so that the user can watch real-time first-person images from different angles, improving the user experience.
  • In one embodiment, the system further includes smart glasses, the controlled device 20 further includes a camera device, and the first control instruction further includes a shooting instruction for controlling the camera device to shoot. When the controlled device 20 controls the camera device to shoot according to the second control instruction converted from the first control instruction, the controlled device 20 may further transmit the images captured by the camera device to the smart glasses, so that the images are displayed in real time on the display screen of the smart glasses and the user can watch a real-time first-person view through them.
  • In an exemplary embodiment, the controlled device 20 is further equipped with a speaker, and the first control instruction includes an instruction to adjust the volume. After the first communication logic layer 22 converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20 and sends it to the controlled device 20, the speaker plays at the volume indicated by the second control instruction. This embodiment provides a sense of interaction with the controlled device 20 through a user-defined control and enhances the richness and freedom of programming.
  • In an exemplary embodiment, the controlled device 20 is further equipped with a launching device for launching marbles, and the first control instruction includes an instruction for controlling the launching device to launch marbles. After the first communication logic layer 22 converts the first control instruction into a second control instruction in the communication protocol supported by the controlled device 20 and sends it to the controlled device 20, the launching device launches marbles according to the second control instruction. This embodiment likewise provides a sense of interaction with the controlled device 20 through a user-defined control and enhances the richness and freedom of programming.
  • In one embodiment, if the user no longer needs a control, the control can be destroyed. The control generation layer 21 is further used to: in response to a destruction trigger event for the control, delete the source code corresponding to the control, and generate a first destruction instruction according to the communication protocol supported by the control generation layer 21. The first communication logic layer 22 is further used to: convert the first destruction instruction into a second destruction instruction in the communication protocol supported by the controlled device 20. The controlled device 20 is further used to: according to the second destruction instruction, disassociate the control from the operation of the controlled device 20, that is, remove the correspondence between the control and the interface indicated by that operation.
  • This embodiment enables the destruction of custom controls and prevents too many unused controls from being created and occupying excessive memory resources.
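  • As an illustrative sketch only (the dictionaries below stand in for the application's and the device's internal records), the destruction flow could be modeled as follows:

```python
# Minimal sketch of the control destruction flow: on a destruction trigger
# event, the control's source code is deleted, a first destruction instruction
# is generated and converted into the device's protocol, and the controlled
# device removes the association between the control and its interface.
# Data layouts are assumptions for illustration only.

control_sources = {"move_button_01": "move_button = Button(...)"}   # kept by the app
device_bindings = {"move_button_01": "chassis.move"}                # kept by the device

def destroy_control(control_id: str) -> None:
    # Control generation layer: delete the source code corresponding to the control.
    control_sources.pop(control_id, None)
    # First destruction instruction (control generation layer's protocol)...
    first_destroy = {"type": "destroy", "control_id": control_id}
    # ...converted by the first communication logic layer and sent; here the
    # controlled device simply disassociates the control from its interface.
    device_bindings.pop(first_destroy["control_id"], None)

destroy_control("move_button_01")
print(control_sources, device_bindings)   # both empty: association removed
```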
  • Correspondingly, referring to FIG. 5, the present application further provides a control method that can be applied to an electronic device. The method includes:
  • In step S101, a control is generated according to the source code on the programming interface, and the control is displayed on the UI interactive interface.
  • In step S102, in response to a trigger event on the control on the UI interactive interface, a first control instruction is generated based on the communication protocol supported by the programming interface and the operation of the controlled device associated with the control; the operation of the controlled device associated with the control is determined based on the interface, provided by the controlled device, that is called in the source code.
  • In step S103, the first control instruction is converted into a second control instruction in the communication protocol supported by the controlled device.
  • In step S104, according to the second control instruction, the controlled device is controlled to perform the operation of the controlled device associated with the control.
  • In one embodiment, the method further includes: configuring the control properties of the control according to the source code on the programming interface.
  • In one embodiment, the control properties include at least one of the following: the appearance attributes, display position, or control type of the custom control.
  • In one embodiment, the appearance attributes of the custom control include at least one of the following: the color, size, display font, or border of the custom control.
  • The control type includes at least one of the following: button, slider, radio button, drop-down list, text, scroll bar, or input box.
  • In one embodiment, the trigger event includes at least one of the following: a click event, a double-click event, a slide event, a drag event, an input event, or a long-press event.
  • In one embodiment, the method further includes:
  • in response to a destruction trigger event for the control, deleting the source code corresponding to the control and generating a first destruction instruction according to the communication protocol supported by the control generation layer;
  • converting the first destruction instruction into a second destruction instruction in the communication protocol supported by the controlled device;
  • controlling the controlled device, through the second destruction instruction, to disassociate the control from the operation of the controlled device.
  • In one embodiment, the controlled device includes movable robots, unmanned vehicles, unmanned aerial vehicles, and unmanned vessels.
  • In one embodiment, the control is a user-defined control.
  • In one embodiment, the interface provided by the controlled device includes an API interface.
  • In one embodiment, the first control instruction includes a movement instruction for controlling the movement of the controlled device.
  • In one embodiment, the controlled device is equipped with a camera device; the first control instruction includes a shooting instruction for controlling the camera device to shoot; and the method further includes: receiving the image captured by the camera device and displaying the image on the UI interactive interface.
  • Correspondingly, referring to FIG. 6, a structural diagram of an electronic device according to an exemplary embodiment of the present application, the electronic device may be a networked device such as a mobile phone, a computer, a tablet, or a PDA (Personal Digital Assistant).
  • The electronic device includes a display 33, a memory 32, a processor 31, and a program of a control application that is stored in the memory 32 and executable on the processor 31; the control application provides a programming interface and a UI interactive interface.
  • The display 33 is used to display the programming interface and the UI interactive interface.
  • The processor 31 calls the program of the control application and, when the program is executed, is used to perform the following operations:
  • generating a control according to the source code on the programming interface, and displaying the control on the UI interactive interface;
  • in response to a trigger event on the control on the UI interactive interface, generating a first control instruction based on the communication protocol supported by the programming interface and the operation of the controlled device associated with the control, where the operation of the controlled device associated with the control is determined based on the interface, provided by the controlled device, that is called in the source code;
  • converting the first control instruction into a second control instruction in the communication protocol supported by the controlled device;
  • according to the second control instruction, controlling the controlled device to perform the operation of the controlled device associated with the control.
  • The processor 31 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
  • The memory 32 stores a computer program of executable instructions for the control method. The memory 32 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disks, optical discs, and the like.
  • Moreover, the electronic device 30 may cooperate, through a network connection, with a network storage device that performs the storage function of the memory 32.
  • The memory 32 may be an internal storage unit of the electronic device 30, such as a hard disk or internal memory of the electronic device 30. The memory 32 may also be an external storage device of the electronic device 30, such as a plug-in hard disk, a smart media card (SMC), a Secure Digital (SD) card, or a flash card equipped on the electronic device 30. Further, the memory 32 may include both an internal storage unit and an external storage device of the electronic device 30. The memory 32 is used to store the computer program and other programs and data required by the device, and may also be used to temporarily store data that has been or will be output.
  • The display 33 is used to display the programming interface and the UI interactive interface. The display 33 includes, but is not limited to, a CRT (cathode ray tube) display, an LCD (liquid crystal) display, an LED (light-emitting diode) display, or a PDP (plasma display panel) display.
  • The various embodiments described herein may be implemented using a computer-readable medium, such as computer software, hardware, or any combination thereof.
  • For a hardware implementation, the embodiments described here may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein.
  • For a software implementation, implementations such as procedures or functions may be realized with separate software modules that allow at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any suitable programming language, and the software code can be stored in a memory and executed by a controller.
  • In one embodiment, the processor is further configured to configure the control properties of the control according to the source code on the programming interface.
  • In one embodiment, the control properties include at least one of the following: the appearance attributes, display position, or control type of the custom control.
  • In one embodiment, the appearance attributes of the custom control include at least one of the following: the color, size, display font, or border of the custom control.
  • The control type includes at least one of the following: button, slider, radio button, drop-down list, text, scroll bar, or input box.
  • In one embodiment, the processor is further configured to:
  • in response to a destruction trigger event for the control, delete the source code corresponding to the control and generate a first destruction instruction according to the communication protocol supported by the control generation layer;
  • convert the first destruction instruction into a second destruction instruction in the communication protocol supported by the controlled device;
  • control the controlled device, through the second destruction instruction, to disassociate the control from the operation of the controlled device.
  • In one embodiment, the control is a user-defined control.
  • In one embodiment, the interface provided by the controlled device includes an API interface.
  • In one embodiment, the first control instruction includes a movement instruction for controlling the movement of the controlled device.
  • In one embodiment, the controlled device includes a camera device; the first control instruction includes a shooting instruction for controlling the camera device to shoot; and the processor is further configured to: receive the image captured by the camera device and display the image on the UI interactive interface.
  • Correspondingly, referring to FIG. 7, a structural diagram of a movable device 40 according to an exemplary embodiment of the present application, the movable device 40 includes:
  • a body 41;
  • a power system 42, arranged inside the body 41, for driving the movable device to move;
  • a communication system 43, arranged inside the body 41, for receiving the second control instruction sent by the above electronic device;
  • a control system 44, installed in the body, for providing an interface to be called by the control program in the above electronic device, and for performing corresponding operations according to the second control instruction.
  • In one embodiment, the movable device 40 includes a movable robot, an unmanned vehicle, an unmanned aerial vehicle, and an unmanned vessel.
  • In one embodiment, the communication system 43 is configured to facilitate wired or wireless communication between the movable device 40 and other devices. The movable device 40 can access a wireless network based on a communication standard, such as WiFi, 3G, or 4G, or a combination thereof. In an exemplary embodiment, the communication system 43 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel.
  • In one embodiment, the communication system 43 is specifically configured to: receive and parse the second control instruction and determine the operation of the controlled device associated with the control.
  • In one embodiment, the movable device 40 further includes a camera device, and the second control instruction includes a shooting instruction for controlling the camera device to shoot.
  • The control system is further configured to: control the camera device to shoot according to the shooting instruction, and transmit the images captured by the camera device to the electronic device through the communication system 43.
  • In one embodiment, the second control instruction further includes a movement instruction for controlling the movement of the movable device 40; the control system is specifically configured to: control the power system 42 according to the movement instruction to drive the movable device 40 to move.
  • In one embodiment, the movable device 40 further includes a pan/tilt for supporting the camera device, and the second control instruction further includes a rotation instruction for controlling the rotation of the pan/tilt; the control system is specifically configured to: control the pan/tilt according to the rotation instruction to drive the pan/tilt to rotate.
  • Those skilled in the art will appreciate that FIG. 7 is only an example of the movable device 40 and does not constitute a limitation on it; the movable device 40 may include more or fewer components than shown, combine certain components, or use different components. For example, the movable device 40 may also include input and output devices.
  • In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as a memory including instructions, which may be executed by a processor of an interactive device to perform the foregoing method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • When the instructions in the storage medium are executed by the processor, the electronic device is enabled to execute the foregoing control method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Telephone Function (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A control system, method, electronic device (30), movable device (40), and computer-readable storage medium. The method includes: generating a control according to source code on a programming interface (210) and displaying the control on a UI interactive interface (211); in response to a trigger event on the control on the UI interactive interface (211), generating a first control instruction based on the communication protocol supported by the programming interface (210) and the operation of the controlled device (20) associated with the control, where the operation of the controlled device (20) associated with the control is determined based on the interface, provided by the controlled device (20), that is called in the source code; converting the first control instruction into a second control instruction in the communication protocol supported by the controlled device (20); and, according to the second control instruction, controlling the controlled device (20) to perform the operation of the controlled device (20) associated with the control. Interaction with the controlled device (20) through the control increases the richness and freedom of programming.

Description

控制系统、方法、电子设备、可移动设备及计算机可读存储介质 技术领域
本申请涉及设备控制领域,尤其涉及一种控制系统、方法、电子设备、可移动设备及计算机可读存储介质。
背景技术
随着人工智能技术、计算机技术等相关技术的发展,对教育机器人的研究也越来越多。通过用户自主学习编程语言进行自主编程,实现对教育机器人的控制,诸如控制教育机器人移动等,实现寓教于乐的过程。
目前的编程方式主要以模块控制指令编程为主,即用户在编写完一段程序,通过运行程序实现对教育机器人的控制,但在用户想要再次控制教育机器人时,需要重新编写程序或者重新运行程序,操作上较为繁琐,不利于用户的使用体验。
发明内容
有鉴于此,本发明的目的之一是提供一种控制系统、方法、电子设备、可移动设备及计算机可读存储介质。
首先,本申请的第一方面提供了一种控制系统,包括:
包括被控设备以及控制应用,所述控制应用装载于一电子设备上;
所述控制应用包括控件生成层和第一通信逻辑层;
所述控件生成层用于:提供编程界面和UI交互界面,根据所述编程界面上的源代码生成控件,并在所述控件显示于所述UI交互界面时,响应于所述UI交互界面上对所述控件的触发事件,基于所述控件生成层支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令; 其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定;
所述第一通信逻辑层用于:将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令;
所述被控设备用于:根据所述第二控制指令,执行所述控件关联的被控设备的操作。
根据本申请实施例的第二方面,提供一种控制方法,包括:
根据编程界面上的源代码生成控件,并将所述控件显示于UI交互界面;
响应于所述UI交互界面上对所述控件的触发事件,基于所述编程界面支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定;
将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令;
根据所述第二控制指令,控制所述被控设备执行所述控件关联的被控设备的操作。
根据本申请实施例的第三方面,提供一种电子设备,包括显示器、存储器、处理器及存储在存储器上并可在处理器上运行的控制应用的程序;所述控制应用提供编程界面和UI交互界面;
所述显示器用于显示所述编程界面和UI交互界面;
其中,所述处理器调用所述控制应用的程序,当程序被执行时,用于执行以下操作:
根据所述编程界面上的源代码生成控件,并将所述控件显示于所述UI交互界面;
响应于所述UI交互界面上对所述控件的触发事件,基于所述编程界面支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指 令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定;
将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令;
根据所述第二控制指令,控制所述被控设备执行所述控件关联的被控设备的操作。
根据本申请实施例的第四方面,提供一种可移动设备,包括:
机身;
动力系统,安装于所述机身内,用于驱动所述可移动设备运动;
通信系统,安装于所述机身内,用于接收如第三方面所述的电子设备发送的第二控制指令;
控制系统,安装于所述机身内,用于提供供如第三方面所述的电子设备中的控制程序调用的接口,以及根据所述第二控制指令执行相应的操作。
根据本申请实施例的第五方面,提供一种计算机可读存储介质,其上存储有计算机指令,该指令被处理器执行时实现第二方面中任意一项所述的方法。
本申请的实施例提供的技术方案可以包括以下有益效果:
本实施例通过在编程界面上的源程序生成控件并显示在UI交互界面,并且响应于所述UI交互界面的触发指令,控制所述被控设备执行所述控件关联的被控设备的操作。用户可以通过该控件来进行与被控设备之间的交互,增加了编程的丰富度和自由度,实现控制过程的交互性,增强了编程的趣味性,优化用户的使用体验。其中,将响应于所述UI交互界面的触发指令生成的支持控件生成的通信协议的第一控件转换为支持所述被控设备支持的通信协议的第二控制指令,保证所述被控设备能够准确解析转换后的第二控制指令,准确响应于所述控件的触发事件执行相应的操作。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本申请。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是为本申请根据一示例性实施例示出的第一种控制系统的结构图。
图2为本申请根据一示例性实施例示出的第二种控制系统的结构图。
图3为本申请根据一示例性实施例示出的第三种控制系统的结构图。
图4为本申请根据一示例性实施例示出的第四种控制系统的结构图。
图5为本申请根据一示例性实施例示出的一种控制方法的流程图。
图6为本申请根据一示例性实施例示出的一种电子设备的结构图。
图7为本申请根据一示例性实施例示出的一种可移动设备的结构图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案 进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
针对于相关技术中的问题,本申请实施例提供了一种控制系统,请参阅图1,为本申请根据一示例性实施例示出的第一种控制系统的结构图。本实施例可以根据代码自动生成控件或通过用户的编程生成控件,进而通过该控件控制被控设备20执行相应的操作,所述控件在生成后长期存在,无需重复生成,从而减少了用户繁琐的操作,并且用户可以通过该控件来进行与被控设备20之间的交互,增加了编程的丰富度和自由度,实现控制过程的交互性。
图1所示的实施例中,所述控制系统包括被控设备20以及控制应用10,所述控制应用10装载于一电子设备上。所述电子设备包括但不限于智能手机、电脑、平板或者个人数字助理(Personal Digital Assistant,PDA)等具备显示器的设备。所述被控设备20包括但不限于可移动机器人、教育机器人、无人驾驶车辆、无人飞行器以及无人驾驶船只。
本实施例中,系统根据代码自动生成控件或用户在控制应用10上进行编程以生成控件,通过控件控制所述被控设备20执行相应的操作。所述被控设备20与控制应用10之间可以通过有线连接或无线连接,图1以所述被控设备20与控制应用10通过无线网络连接进行具体说明,可以理解的是,所述无线网络连接方式所应用的通信技术包括但不限于:近距离无线通信技术或移动通信协议技术,所述近距离无线通信技术可以是红外技术、WiFi技术、蓝牙技术、UWB技术或者ZigBee技术等,所述移动通信协议技术可以是3G通信技术、4G通信技术、GSM通信技术或者GPRS通信技术等。
进一步地,请参阅图2,为本申请根据一示例性实施例示出的第二 种控制系统的结构图,所述控制应用10包括控件生成层21和第一通信逻辑层22。
请参阅图3,所述控件生成层21提供编程界面210和UI交互界面,所述控件生成层21根据所述编程界面210上的源代码生成控件,并在所述控件显示于所述UI交互界面时,响应于所述UI交互界面上对所述控件的触发事件,基于所述控件生成层21支持的通信协议以及所述控件关联的被控设备20的操作生成第一控制指令;其中,所述控件关联的被控设备20的操作基于在所述源代码中调用的所述被控设备20提供的接口确定。
然后,所述第一通信逻辑层22将所述第一控制指令转换为所述被控设备20支持的通信协议的第二控制指令。
最后,所述被控设备20根据所述第二控制指令,执行所述控件关联的被控设备20的操作。
在一实施例中,用户可以在所述控制应用10提供的编程界面210,例如利用Python进行相关控件生成代码的编写,在接收到针对于用户在所述编程界面210上编写的源程序的运行指令之后,所述控件生成层21响应于所述运行指令,根据所述编程界面210上的源程序生成控件,并将生成的控件展示在所述UI交互界面211上。作为例子,所述被控设备包括教育机器人,教育机器人的主要面向对象为少年或者青少年等,其年龄层次较低,则相关控件生成代码可以通过Python语言编写,Python是一种广泛使用的解释型、高级编程、通用型编程语言,其具有简洁,易用,上手容易等特点,因此适合初学者入门编程,可以将其用于相关控件生成代码的编写,从而降低相关控件生成代码的编写难度;作为例子,也可以采用其他编程语言如java、C语言或者C++语言等,本申请实施例对此不作任何限制。
其中,用户可以自定义该控件关联的被控设备20的操作,所述被控 设备20提供若干用于控制被控设备20的接口,每一所述接口与所述被控设备20的操作对应,所述被控设备20提供的接口包括API接口;所述控制应用10可以预存所述被控设备20提供的若干接口的信息,用户在所述编程界面210上进行编程时,可以根据实际需要自由确定待生成的控件关联的被控设备20的操作,即可以从所述控制应用10预存的所述被控设备20提供的若干接口的信息中选择要调用的一个或多个接口。本实施例中通过用户自定义控件关联的被控设备20的操作,增加了编程的趣味性,自定义控件功能增加了编程的丰富度以及与被控设备20之间的交互感,同时用户可以自由配置,也扩大了编程的自由度,提高了用户的使用体验。
对于所述控制应用10预存的所述被控设备20提供的若干接口的信息,可以是所述控制应用10从所述被控设备20中获取,也可以是所述被控设备20对应的云端在所述被控设备20出厂时已存储了所述被控设备20提供的若干接口的信息,则所述控制应用10可以从所述被控设备20对应的云端获取,本申请实施例对此不做任何限制。
考虑到如果一个接口被多个自定义的控件同时调用,被控设备20无法同时响应于多个自定义的控件针对于该接口的调用,从而造成控制异常问题,因此,在一种可能的实现方式中,在根据所述编程界面210上的源程序生成自定义的控件时,可以检测所述源代码中调用的所述被控设备20提供的接口是否已经被别的自定义的控件调用,若是,则输出提醒消息,所述提醒消息用于提醒所述源代码中调用的所述被控设备20提供的接口发生重复调用问题,从而提醒用户更改调用的接口或者销毁调用该接口的控件,避免接口重复调用问题,进而保证基于自定义的控件对所述被控设备20进行控制的进程正确进行。
另外,在根据所述编程界面210上的源程序生成自定义的控件时,所述控件生成层21还可以通过诸如RDK之类的所述第一通信逻辑层22向所述被控设备20发送控件生成通知,所述控件生成通知用于通知所述被 控设备20所述控制应用10所生成的控件,所述控件生成通知包括所述控件调用的接口的信息,以便所述被控设备20记录所述控件与对应的所述被控设备20提供的接口之间的对应关系,即将所述控件与所述被控设备20的操作关联起来,以便后续的交互控制过程。
用户除了可以自定义所述控件关联的被控设备20的操作之后,还可以自定义所述控件的控件属性,在编程界面210上编写源程序时,用户可以根据自身的需求,在所述编程界面210上编写与所述控件的控件属性相应的代码,从而所述控件生成层21可以根据所述编程界面210上的源代码配置所述控件的控件属性;本实施例实现用户自由配置控件所关联的被控设备20的操作以及控件属性,增加编程的丰富度和自由度,也增加了编程的趣味性,优化用户的使用体验。
其中,所述控件属性包括但不限于:所述控件的外观属性、显示位置或者控件类型。所述外观属性至少包括以下一项:所述自定义控件的颜色、尺寸、显示字体、形状或边框等;所述控件类型至少包括以下一项:按钮、滑动条、单选框、下拉列表、文本、滚动条或输入框等。在一个例子中,用户自定义的控件为按钮,在按钮上显示“移动”字样,该按钮呈圆形,其关联的被控设备20的操作为移动操作,即该按钮用于控制所述被控设备20移动。
进一步地,在根据所述编程界面210上的源代码生成控件之后,所述控件生成层21可以将生成的所述控件展示于所述UI交互界面211上,用户可以对所述控件进行相应的触发操作,从而所述控件生成层21响应于所述UI交互界面上对所述控件的触发事件,基于所述控件生成层21支持的通信协议以及所述控件关联的被控设备20的操作生成第一控制指令,所述控件关联的被控设备20的操作基于在所述源代码中调用的所述被控设备20提供的接口确定。本实施例实现用户可以通过自定义的控件与所述被控设备20进行交互,通过控件给用户提供更好的反馈,增加了用户控制 被控设备20的参与感,优化用户的使用体验。
其中,所述触发事件基于用户配置的所述控件的控件类型所确定,所述触发事件包括但不限于:点击事件、双击事件、滑动事件、拖动事件、输入事件或者长按事件等。
需要说明的是,所述电子设备包括显示器,所述编程界面210和所述UI交互界面211可以同时展示在所述显示器所呈现的显示屏幕上,也可以分别展示在所述显示器所呈现的显示屏幕上,本实施例对此不作任何限制。
在一些应用场景中,为了降低编程的复杂度,使得用户可以轻松快速上手,同时也为了用户能够学习到相关的编程技术,用于生成控件的技术可能会选择目前市场上较为成熟或者普遍使用的控件生成技术,比如unity技术,其支持的通信协议是固定且公开的,而为了保证产品在数据传输过程中的安全性和保密性,产品生产商可能会另外开发一套私有的通信协议以供其生产的产品使用,从此导致所述控件生成层21支持的通信协议与被控设备20支持的通信协议不同,因此,所述控制应用10提供第一通信逻辑层22,在所述控件生成层21响应于所述UI交互界面上对所述控件的触发事件,基于所述控件生成层21支持的通信协议以及所述控件关联的被控设备20的操作生成第一控制指令之后,所述第一通信逻辑层22将所述第一控制指令转换为所述被控设备20支持的通信协议的第二控制指令并发送给所述被控设备20,以控制所述被控设备20执行相应的操作;本实施例通过所述第一通信逻辑层22的协议转换过程,保证所述被控设备20能够准确解析转换后的第二控制指令,准确响应于所述控件的触发事件执行相应的操作。
需要说明的是,所述第一控制指令与所述第二控制指令的区别仅在于支持的通信协议不同,其中用于指示所述控件关联的被控设备20的操作的相关数据相同。
在另一种应用场景中,若所述控件生成层21支持的通信协议与被控设备20支持的通信协议相同,则无需通过所述第一通信逻辑层22进行协议转换。
在一实施例中,所述被控设备20在接收到所述第二控制指令之后,根据所述第二控制指令,执行所述控件关联的被控设备20的操作。本实施例中所述第二设备实现对所述第二控制指令地准确解析,正确响应于所述控件的触发事件,执行用户通过控件交互指示的操作,给用户提供更好的反馈,优化用户的使用体验。
作为其中一种实现方式,请参阅图4,所述被控设备20包括第二通信逻辑层11以及业务逻辑层12;所述第二通信逻辑层11,用于接收并解析所述第二控制指令,确定所述控件关联的被控设备20的操作,当然,所述第二通信逻辑层11也可以向控制应用或其他外部应用发送其他类型的指令,例如反馈被控设备状态的指令;所述业务逻辑层12,用于提供供所述控制程序的控件生成层21调用的API接口,以及执行所述控件关联的被控设备20的操作。本实施例中所述第二设备实现对所述第二控制指令地准确解析,正确响应于所述控件的触发事件,执行用户通过控件交互指示的操作,给用户提供更好的反馈,优化用户的使用体验。
在一示例性实施例中,所述被控设备20包括可移动设备,所述第一控制指令包括控制所述被控设备20移动的移动指令,在所述第一通信逻辑层22将所述第一控制指令转换为所述被控设备20支持的通信协议的第二控制指令并发送给所述被控设备20之后,所述被控设备20根据所述第二控制指令进行移动。本实施例通过用户自定义的控件控制被控设备20移动,提供与被控设备20之间的交互感,增强编程的丰富度和自由度。
在一示例性实施例中,所述被控设备20还包括摄像装置,所述摄像装机包括但不限于照相机等;所述第一控制指令还包括控制所述摄像装置进行拍摄的拍摄指令;在所述第一通信逻辑层22将所述第一控制指令转换 为所述被控设备20支持的通信协议的第二控制指令并发送给所述被控设备20之后,所述被控设备20根据所述第二控制指令控制所述摄像装置进行拍摄,进一步地,所述被控设备20将所述摄像装置拍摄的图像通过所述第二通信逻辑层11传输给所述控制应用10,所述控制应用10在通过所述第一通信逻辑层22接收所述图像之后,在所述UI交互界面211展示所述图像或者在其他界面上展示所述图像。本实施例通过用户自定义的控件控制被控设备20上的摄像装置拍摄,提供与被控设备20之间的交互感,增强编程的丰富度和自由度,进一步地,用户还可以在UI交互界面211上实时观看到所述摄像装置拍摄的第一视角的图像,优化用户的使用体验。
在一示例性的应用场景中,用户自定义控件A以及控件B,通过自定义的控件A控制所述被控设备20移动,以及通过自定义的控件B控制所述摄像装置进行拍摄,在通过控件A控制所述被控设备20移动过程中,通过触发控件B使得安装于所述被控设备20上的摄像装置实时拍摄沿途画面,然后所述被控设备20通过第二通信逻辑层11将拍摄的沿途画面实时传输给所述控制应用10,所述控制应用10在通过所述第一通信逻辑层22接收所述沿途画面之后,在所述UI交互界面211实时展示所述沿途画面,从而用户可以观看到第一视角的实时画面,提高了用户的使用体验。
在一示例性实施例中,所述被控设备20还包括云台,所述云台用于支撑所述摄像装置,所述第一控制指令还包括控制所述云台转动的转动指令,在所述第一通信逻辑层22将所述第一控制指令转换为所述被控设备20支持的通信协议的第二控制指令并发送给所述被控设备20之后,所述被控设备20根据所述第二控制指令控制所述云台转动,从而改变所述摄像装置拍摄的场景。
在另一示例性的场景中,用户自定义控件A、控件B以及控件C,通过自定义的控件A控制所述被控设备20移动,通过自定义的控件B控制所述摄像装置进行拍摄,以及通过自定义的控件C控制所述云台转动, 在通过控件A控制所述被控设备20移动过程中,通过触发控件B使得安装于所述被控设备20上的摄像装置实时拍摄沿途画面,然后所述被控设备20通过第二通信逻辑层11将拍摄的沿途画面实时传输给所述控制应用10,所述控制应用10在通过所述第一通信逻辑层22接收所述沿途画面之后,在所述UI交互界面211实时展示所述沿途画面,从而用户可以观看到第一视角的实时画面,进一步地,可以通过控件C控制所述云台转动,从而更改所述摄像头拍摄的画面场景,从而用户可以观看到第一视角的不同角度的实时画面,优化用户的使用体验。
在一实施例中,所述系统还包括智能眼镜,所述被控设备20还包括摄像装置,所述第一控制指令还包括控制所述摄像装置进行拍摄的拍摄指令;当所述被控设备20根据由第一控制指令转换得到的所述第二控制指令控制所述摄像装置进行拍摄时,进一步地,所述被控设备20可以将所述摄像装置拍摄的图像传输给所述智能眼镜,以在所述智能眼镜上的显示屏实时显示所述图像,从而使用者可以通过智能眼镜观看到第一视角的实时画面。
在一示例性的实施例中,所述被控设备20还安装有扬声器,所述第一控制指令包括调节音量的指令,在所述第一通信逻辑层22将所述第一控制指令转换为所述被控设备20支持的通信协议的第二控制指令并发送给所述被控设备20之后,所述扬声器根据所述第二控制指令指示的音量进行播放。本实施例通过用户自定义的控件提供与被控设备20之间的交互感,增强编程的丰富度和自由度。
在一示例性的实施例中,所述被控设备20还安装有发射装置,所述发射装置用于发射弹珠,所述第一控制指令包括控制所述发射装置发射弹珠的指令,在所述第一通信逻辑层22将所述第一控制指令转换为所述被控设备20支持的通信协议的第二控制指令并发送给所述被控设备20之后,所述发射装置根据所述第二控制指令发射弹珠。本实施例通过用户自定义 的控件提供与被控设备20之间的交互感,增强编程的丰富度和自由度。
在一实施例中,若用户无需使用所述控件,可以销毁所述控件,所述控件生成层21还用于:响应于所述控件的销毁触发事件,删除所述控件对应的源代码,以及根据所述控件生成层21支持的通信协议生成第一销毁指令;所述第一通信逻辑层22还用于:将所述第一销毁指令转换为所述被控设备20支持的通信协议的第二销毁指令;所述被控设备20还用于:根据所述第二销毁指令,解除所述控件与所述被控设备20的操作的关联,即解除所述控件与所述被控设备20的操作指示的接口的对应关系。本实施例可以实现自定义控件的销毁,避免创建过多的无需使用的控件占用过多的内存资源。
相应地,请参阅图5,本申请还提供了一种控制方法,所述方法可应用于电子设备上,所述方法包括:
在步骤S101中,根据编程界面上的源代码生成控件,并将所述控件显示于UI交互界面。
在步骤S102中,响应于所述UI交互界面上对所述控件的触发事件,基于所述编程界面支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定。
在步骤S103中,将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令。
在步骤S104中,根据所述第二控制指令,控制所述被控设备执行所述控件关联的被控设备的操作。
在一实施例中,还包括:
根据所述编程界面上的源代码配置所述控件的控件属性。
在一实施例中,所述控件属性至少包括以下一项:所述自定义控件的外观属性、显示位置或者控件类型。
在一实施例中,所述自定义控件的外观属性至少包括以下一项:所述自定义控件的颜色、尺寸、显示字体或边框。
所述控件类型至少包括以下一项:按钮、滑动条、单选框、下拉列表、文本、滚动条或输入框。
在一实施例中,所述触发事件至少包括以下一项:点击事件、双击事件、滑动事件、拖动事件、输入事件或者长按事件。
在一实施例中,还包括:
响应于所述控件的销毁触发事件,删除所述控件对应的源代码,以及根据所述控件生成层支持的通信协议生成第一销毁指令。
将所述第一销毁指令转换为所述被控设备支持的通信协议的第二销毁指令。
通过所述第二销毁指令控制所述被控设备解除所述控件与所述被控设备的操作的关联。
在一实施例中,所述被控设备包括可移动机器人、无人驾驶车辆、无人飞行器以及无人驾驶船只。
在一实施例中,所述控件为用户自定义的控件。
在一实施例中,所述被控设备提供的接口包括API接口。
在一实施例中,所述第一控制指令包括控制所述被控设备移动的移动指令。
在一实施例中,所述被控设备安装有摄像装置。
所述第一控制指令包括控制所述摄像装置进行拍摄的拍摄指令。
所述方法还包括:
接收所述摄像装置拍摄的图像,并在所述UI交互界面展示所述图像。
相应的,请参阅图6,为本申请根据一示例性实施例示出的一种电子设备的结构图,所述电子设备可以是手机、电脑、个人平板或者PDA(Personal Digital Assistant,个人数字助理)等具备联网的设备,所述电子 设备包括显示器33、存储器32、处理器31及存储在存储器32上并可在处理器31上运行的控制应用的程序;所述控制应用提供编程界面和UI交互界面。
所述显示器33用于显示所述编程界面和UI交互界面。
其中,所述处理器31调用所述控制应用的程序,当程序被执行时,用于执行以下操作:
根据所述编程界面上的源代码生成控件,并将所述控件显示于所述UI交互界面。
响应于所述UI交互界面上对所述控件的触发事件,基于所述编程界面支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定。
将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令。
根据所述第二控制指令,控制所述被控设备执行所述控件关联的被控设备的操作。
所述处理器31可以是中央处理单元(Central Processing Unit,CPU),还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
所述存储器32存储所述控制方法的可执行指令计算机程序,所述存储器32可以包括至少一种类型的存储介质,存储介质包括闪存、硬盘、多媒体卡、卡型存储器(例如,SD或DX存储器等等)、随机访问存储器(RAM)、静态随机访问存储器(SRAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、可编程只读存储器(PROM)、磁性存储器、磁盘、光盘 等等。而且,所述电子设备30可以与通过网络连接执行存储器32的存储功能的网络存储装置协作。存储器32可以是电子设备30的内部存储单元,例如电子设备30的硬盘或内存。存储器32也可以是电子设备30的外部存储设备,例如电子设备30上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,存储器32还可以既包括电子设备30的内部存储单元也包括外部存储设备。存储器32用于存储计算机程序以及设备所需的其他程序和数据。存储器32还可以用于暂时地存储已经输出或者将要输出的数据。
所述显示器33用于显示所述编程界面和UI交互界面,所述显示器33包括但不限于CRT(Cathode Ray Tube,阴极射线管)显示器、LCD(液晶)显示器、LED(light emitting diode,发光二极管)显示器或者PDP(Plasma Display Panel,等离子显示器)显示器。
这里描述的各种实施方式可以使用例如计算机软件、硬件或其任何组合的计算机可读介质来实施。对于硬件实施,这里描述的实施方式可以通过使用特定用途集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理装置(DSPD)、可编程逻辑装置(PLD)、现场可编程门阵列(FPGA)、处理器、控制器、微控制器、微处理器、被设计为执行这里描述的功能的电子单元中的至少一种来实施。对于软件实施,诸如过程或功能的实施方式可以与允许执行至少一种功能或操作的单独的软件模块来实施。软件代码可以由以任何适当的编程语言编写的软件应用程序(或程序)来实施,软件代码可以存储在存储器中并且由控制器执行。
在一实施例中,所述处理器还用于:根据所述编程界面上的源代码配置所述控件的控件属性。
在一实施例中,所述控件属性至少包括以下一项:所述自定义控件的外观属性、显示位置或者控件类型。
在一实施例中,所述自定义控件的外观属性至少包括以下一项:所述自定义控件的颜色、尺寸、显示字体或边框。
所述控件类型至少包括以下一项:按钮、滑动条、单选框、下拉列表、文本、滚动条或输入框。
在一实施例中,所述处理器还用于:
响应于所述控件的销毁触发事件,删除所述控件对应的源代码,以及根据所述控件生成层支持的通信协议生成第一销毁指令;
将所述第一销毁指令转换为所述被控设备支持的通信协议的第二销毁指令;
通过所述第二销毁指令控制所述被控设备解除所述控件与所述被控设备的操作的关联。
在一实施例中,所述控件为用户自定义的控件。
在一实施例中,所述被控设备提供的接口包括API接口。
在一实施例中,所述第一控制指令包括控制所述被控设备移动的移动指令。
在一实施例中,所述被控设备包括摄像装置;
所述第一控制指令包括控制所述摄像装置进行拍摄的拍摄指令;
所述处理器还用于:
接收所述摄像装置拍摄的图像,并在所述UI交互界面展示所述图像。
相应的,请参阅图7,为本申请根据一示例性实施例示出的一种可移动设备40的结构图,所述可移动设备40包括:
机身41;
动力系统42,设于所述机身41内部,用于驱动所述可移动设备运动;
通讯系统43,设于所述机身41内部,用于接收上述的电子设备发送的第二控制指令;
控制系统44,安装于所述机身内,用于提供供上述的电子设备中的控制程序调用的接口,以及根据所述第二控制指令执行相应的操作。
在一实施例中,所述可移动设备40包括:可移动机器人、无人驾驶车辆、无人飞行器以及无人驾驶船只。
在一实施例中,所述通讯系统43被配置为便于可移动设备40和其它设备之间有线或无线方式的通信。可移动设备40可以接入基于通信标准的无线网络,如WiFi,3G或4G,或它们的组合。在一个示例性实施例中,所述通讯系统43经由广播信道接收来自外部广播管理系统的广播信号或广播相关信息。
在一实施例中,所述通信系统43具体用于:接收并解析所述第二控制指令,确定所述控件关联的被控设备的操作。
在一实施例中,所述可移动设备40还包括摄像装置。所述第二控制指令包括控制所述摄像装置进行拍摄的拍摄指令。
所述控制系统还用于:根据所述拍摄指令控制所述摄像装置进行拍摄,以及将所述摄像装置拍摄的图像通过所述通信系统43传输给所述电子设备。
在一实施例中,所述第二控制指令还包括控制所述可移动设备40移动的移动指令;所述控制系统具体用于:根据所述移动指令控制所述动力系统41,以驱动所述可移动设备40移动。
在一实施例中,所述可移动设备40还包括云台,所述云台用于支撑所述摄像装置,所述第二控制指令还包括控制所述云台转动的转动指令;所述控制系统具体用于:根据所述转动指令控制所述云台,以驱动所述云台转动。
本领域技术人员可以理解,图7仅仅是可移动设备40的示例,并不构成对可移动设备40的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件,例如可移动设备40还可以包括输入输出设备等。
在示例性实施例中,还提供了一种包括指令的非临时性计算机可读存储介质,例如包括指令的存储器,上述指令可由交互设备的处理器执行 以完成上述方法。例如,所述非临时性计算机可读存储介质可以是ROM、随机存取存储器(RAM)、CD-ROM、磁带、软盘和光数据存储设备等。
其中,当所述存储介质中的指令由所述处理器执行时,使得电子设备能够执行前述控制方法。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
以上对本发明实施例所提供的方法和装置进行了详细介绍,本文中应用了具体个例对本发明的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本发明的方法及其核心思想;同时,对于本领域的一般技术人员,依据本发明的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本发明的限制。

Claims (37)

  1. 一种控制系统,包括被控设备以及控制应用,所述控制应用装载于一电子设备上;
    所述控制应用包括控件生成层和第一通信逻辑层;
    所述控件生成层用于:提供编程界面和UI交互界面,根据所述编程界面上的源代码生成控件,并在所述控件显示于所述UI交互界面时,响应于所述UI交互界面上对所述控件的触发事件,基于所述控件生成层支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定;
    所述第一通信逻辑层用于:将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令;
    所述被控设备用于:根据所述第二控制指令,执行所述控件关联的被控设备的操作。
  2. 根据权利要求1所述的系统,其特征在于,
    所述控件生成层还用于:根据所述编程界面上的源代码配置所述控件的控件属性。
  3. 根据权利要求1所述的系统,其特征在于,所述被控设备包括第二通信逻辑层以及业务逻辑层;
    所述第二通信逻辑层,用于接收并解析所述第二控制指令,确定所述控件关联的被控设备的操作;
    所述业务逻辑层,用于提供供所述控制程序的控件生成层调用的接口,以及执行所述控件关联的被控设备的操作。
  4. 根据权利要求2所述的系统,其特征在于,所述控件属性至少包括以下一项:所述控件的外观属性、显示位置或者控件类型。
  5. 根据权利要求4所述的系统,其特征在于,所述控件的外观属性至少包括以下一项:所述自定义控件的颜色、尺寸、显示字体或边框;
    所述控件类型至少包括以下一项:按钮、滑动条、单选框、下拉列表、文本、滚动条或输入框。
  6. 根据权利要求1所述的系统,其特征在于,所述触发事件至少包括以下一项:点击事件、双击事件、滑动事件、拖动事件、输入事件或者长按事件。
  7. 根据权利要求1所述的系统,其特征在于,
    所述控件生成层还用于:响应于所述控件的销毁触发事件,删除所述控件对应的源代码,以及根据所述控件生成层支持的通信协议生成第一销毁指令;
    所述第一通信逻辑层还用于:将所述第一销毁指令转换为所述被控设备支持的通信协议的第二销毁指令;
    所述被控设备还用于:根据所述第二销毁指令,解除所述控件与所述被控设备的操作的关联。
  8. 根据权利要求1至7任意一项所述的系统,其特征在于,所述被控设备包括可移动机器人、无人驾驶车辆、无人飞行器以及无人驾驶船只。
  9. 根据权利要求1所述的系统,其特征在于,所述控件为自定义的控件。
  10. 根据权利要求1所述的系统,其特征在于,所述被控设备提供的接口包括API接口。
  11. 根据权利要求1所述的系统,其特征在于,所述第一控制指令包括控制所述被控设备移动的移动指令。
  12. 根据权利要求1所述的系统,其特征在于,所述被控设备安装有摄像装置;
    所述第一控制指令包括控制所述摄像装置进行拍摄的拍摄指令;
    所述被控设备还用于:将所述摄像装置拍摄的图像传输给所述控制应用;
    所述控制应用还用于:在所述UI交互界面展示所述图像。
  13. 一种控制方法,其特征在于,包括:
    根据编程界面上的源代码生成控件,并将所述控件显示于UI交互界面;
    响应于所述UI交互界面上对所述控件的触发事件,基于所述编程界面支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定;
    将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令;
    根据所述第二控制指令,控制所述被控设备执行所述控件关联的被控设备的操作。
  14. 根据权利要求13所述的方法,其特征在于,还包括:
    根据所述编程界面上的源代码配置所述控件的控件属性。
  15. 根据权利要求14所述的方法,其特征在于,所述控件属性至少包括以下一项:所述自定义控件的外观属性、显示位置或者控件类型。
  16. 根据权利要求15所述的方法,其特征在于,所述自定义控件的外观属性至少包括以下一项:所述自定义控件的颜色、尺寸、显示字体或边框;
    所述控件类型至少包括以下一项:按钮、滑动条、单选框、下拉列表、文本、滚动条或输入框。
  17. 根据权利要求13所述的方法,其特征在于,所述触发事件至少包括以下一项:点击事件、双击事件、滑动事件、拖动事件、输入事件或者长按事件。
  18. 根据权利要求13所述的方法,其特征在于,还包括:
    响应于所述控件的销毁触发事件,删除所述控件对应的源代码,以及根据所述控件生成层支持的通信协议生成第一销毁指令;
    将所述第一销毁指令转换为所述被控设备支持的通信协议的第二销毁指令;
    通过所述第二销毁指令控制所述被控设备解除所述控件与所述被控设备的操作的关联。
  19. 根据权利要求13至18任意一项所述的方法,其特征在于,所述被控设备包括可移动机器人、无人驾驶车辆、无人飞行器以及无人驾驶船只。
  20. 根据权利要求13所述的方法,其特征在于,所述控件为用户自定义的控件。
  21. 根据权利要求13所述的方法,其特征在于,所述被控设备提供的接口包括API接口。
  22. 根据权利要求13所述的方法,其特征在于,所述第一控制指令包括控制所述被控设备移动的移动指令。
  23. 根据权利要求13所述的方法,其特征在于,所述被控设备安装有摄像装置;
    所述第一控制指令包括控制所述摄像装置进行拍摄的拍摄指令;
    所述方法还包括:
    接收所述摄像装置拍摄的图像,并在所述UI交互界面展示所述图像。
  24. 一种电子设备,其特征在于,包括显示器、存储器、处理器及存储在存储器上并可在处理器上运行的控制应用的程序;所述控制应用提供编程界面和UI交互界面;
    所述显示器用于显示所述编程界面和UI交互界面;
    其中,所述处理器调用所述控制应用的程序,当程序被执行时,用于执行以下操作:
    根据所述编程界面上的源代码生成控件,并将所述控件显示于所述UI交互界面;
    响应于所述UI交互界面上对所述控件的触发事件,基于所述编程界面支持的通信协议以及所述控件关联的被控设备的操作生成第一控制指令;其中,所述控件关联的被控设备的操作基于在所述源代码中调用的所述被控设备提供的接口确定;
    将所述第一控制指令转换为所述被控设备支持的通信协议的第二控制指令;
    根据所述第二控制指令,控制所述被控设备执行所述控件关联的被控设备的操作。
  25. 根据权利要求24所述的设备,其特征在于,
    所述处理器还用于:根据所述编程界面上的源代码配置所述控件的控件属性。
  26. 根据权利要求25所述的设备,其特征在于,所述控件属性至少包括以下一项:所述控件的外观属性、显示位置或者控件类型。
  27. 根据权利要求26所述的设备,其特征在于,所述控件的外观属性至少包括以下一项:所述控件的颜色、尺寸、显示字体或边框;
    所述控件类型至少包括以下一项:按钮、滑动条、单选框、下拉列表、文本、滚动条或输入框。
  28. 根据权利要求24所述的设备,其特征在于,所述处理器还用于:
    响应于所述控件的销毁触发事件,删除所述控件对应的源代码,以及根据所述控件生成层支持的通信协议生成第一销毁指令;
    将所述第一销毁指令转换为所述被控设备支持的通信协议的第二销毁指令;
    通过所述第二销毁指令控制所述被控设备解除所述控件与所述被控设备的操作的关联。
  29. 根据权利要求24所述的设备,其特征在于,所述控件为用户自定义的控件。
  30. 根据权利要求24所述的设备,其特征在于,所述被控设备提供的 接口包括API接口。
  31. 根据权利要求24所述的设备,其特征在于,所述第一控制指令包括控制所述被控设备移动的移动指令。
  32. 根据权利要求24所述的设备,其特征在于,所述被控设备包括摄像装置;
    所述第一控制指令包括控制所述摄像装置进行拍摄的拍摄指令;
    所述处理器还用于:
    接收所述摄像装置拍摄的图像,并在所述UI交互界面展示所述图像。
  33. 一种可移动设备,其特征在于,包括:
    机身;
    动力系统,安装于所述机身内,用于驱动所述可移动设备运动;
    通信系统,安装于所述机身内,用于接收如权利要求24所述的电子设备发送的第二控制指令;
    控制系统,安装于所述机身内,用于提供供如权利要求24所述的电子设备中的控制程序调用的接口,以及根据所述第二控制指令执行相应的操作。
  34. 根据权利要求33所述的设备,其特征在于,所述可移动设备包括:可移动机器人、无人驾驶车辆、无人飞行器以及无人驾驶船只。
  35. 根据权利要求33所述的设备,其特征在于,还包括摄像装置;
    所述第二控制指令包括控制所述摄像装置进行拍摄的拍摄指令
    所述控制系统还用于:根据所述拍摄指令控制所述摄像装置进行拍摄,以及将所述摄像装置拍摄的图像通过所述通信系统传输给所述电子设备。
  36. 根据权利要求33所述的设备,其特征在于,所述第二控制指令包括控制所述可移动设备移动的移动指令;
    所述控制系统具体用于:根据所述移动指令控制所述动力系统,以驱动所述可移动设备移动。
  37. 一种计算机可读存储介质,其特征在于,其上存储有计算机指令,该指令被处理器执行时实现权利要求13至23任意一项所述的方法。
PCT/CN2020/078478 2020-03-09 2020-03-09 控制系统、方法、电子设备、可移动设备及计算机可读存储介质 WO2021179143A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080038669.3A CN113874175A (zh) 2020-03-09 2020-03-09 控制系统、方法、电子设备、可移动设备及计算机可读存储介质
PCT/CN2020/078478 WO2021179143A1 (zh) 2020-03-09 2020-03-09 控制系统、方法、电子设备、可移动设备及计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/078478 WO2021179143A1 (zh) 2020-03-09 2020-03-09 控制系统、方法、电子设备、可移动设备及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2021179143A1 true WO2021179143A1 (zh) 2021-09-16

Family

ID=77670518

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078478 WO2021179143A1 (zh) 2020-03-09 2020-03-09 控制系统、方法、电子设备、可移动设备及计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN113874175A (zh)
WO (1) WO2021179143A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106178528A (zh) * 2016-08-31 2016-12-07 北京趣智阿尔法科技有限公司 可编程教育无人机
CN107657866A (zh) * 2016-07-25 2018-02-02 北京东易晖煌国际教育科技有限公司 一种模块化可编程的智能教育机器人
CN107932504A (zh) * 2017-11-13 2018-04-20 浙江工业大学 基于PyQt的机械臂运行控制系统
CN108406764A (zh) * 2018-02-02 2018-08-17 上海大学 智能开放型服务机器人操作系统及方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7814463B2 (en) * 2003-05-16 2010-10-12 Oracle International Corporation User interface debugger for software applications
KR101306556B1 (ko) * 2013-04-08 2013-09-09 한성대학교 산학협력단 스마트 기기 기반 통합 로봇 제어 시스템
CN105993163B (zh) * 2015-07-02 2018-12-14 深圳市大疆创新科技有限公司 图像处理系统、图像数据处理方法、装置及相关设备
KR101850203B1 (ko) * 2016-04-11 2018-04-18 라인 가부시키가이샤 기기간 어플리케이션 연동 방법 및 시스템
CN107132975A (zh) * 2017-05-26 2017-09-05 努比亚技术有限公司 一种控件编辑处理方法、移动终端以及计算机可读存储介质
CN110000753B (zh) * 2019-02-28 2021-10-26 深圳镁伽科技有限公司 用户交互方法、控制设备及存储介质
CN110543144B (zh) * 2019-08-30 2021-06-01 天津施格自动化科技有限公司 图形化编程控制机器人的方法及系统
CN110757464A (zh) * 2019-11-21 2020-02-07 东莞固高自动化技术有限公司 工业机器人示教器及作业系统、控制系统及控制方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107657866A (zh) * 2016-07-25 2018-02-02 北京东易晖煌国际教育科技有限公司 一种模块化可编程的智能教育机器人
CN106178528A (zh) * 2016-08-31 2016-12-07 北京趣智阿尔法科技有限公司 可编程教育无人机
CN107932504A (zh) * 2017-11-13 2018-04-20 浙江工业大学 基于PyQt的机械臂运行控制系统
CN108406764A (zh) * 2018-02-02 2018-08-17 上海大学 智能开放型服务机器人操作系统及方法

Also Published As

Publication number Publication date
CN113874175A (zh) 2021-12-31

Similar Documents

Publication Publication Date Title
US11973613B2 (en) Presenting overview of participant conversations within a virtual conferencing system
US20240064037A1 (en) Providing a room preview within a virtual conferencing system
US11722535B2 (en) Communicating with a user external to a virtual conference
US20230283494A1 (en) Updating element properties based on distance between elements in virtual conference
WO2021160161A1 (zh) 消息提醒方法及电子设备
WO2020057327A1 (zh) 信息列表显示方法、装置及存储介质
CN103677261B (zh) 用户装置的情景感知服务提供方法和设备
US20170293272A1 (en) Electronic device and method for providing information in electronic device
WO2022212391A1 (en) Presenting participant conversations within virtual conferencing system
WO2015188614A1 (zh) 操作虚拟世界里的电脑和手机的方法、装置以及使用其的眼镜
KR102047499B1 (ko) 콘텐트를 공유하는 방법 및 그 장치
KR20170102634A (ko) 전자 장치와 전자 장치의 영상 디스플레이 및 전송 방법
KR20140088820A (ko) 디스플레이장치 및 그 제어방법
CN109451341B (zh) 视频播放方法、视频播放装置、电子设备及存储介质
US20220150598A1 (en) Method for message interaction, and electronic device
EP3282644B1 (en) Timing method and device
KR20090010135A (ko) 화상 속 화상 기능을 가진 휴대형 전자 장치
WO2016065757A1 (zh) 设备列表动态显示的方法及装置
WO2022057393A1 (zh) 事件处理方法、装置、存储介质、移动终端及电脑
US11310064B2 (en) Information processing apparatus, information processing system, and information processing method
KR101560352B1 (ko) 대형 벽면 환경과 연동 가능한 스마트 테이블을 이용한 대형 벽면 활용형 대화형 영상컨텐츠 디스플레이 시스템
WO2021179143A1 (zh) 控制系统、方法、电子设备、可移动设备及计算机可读存储介质
WO2020186929A1 (zh) 直播中的交互方法、装置、电子设备及存储介质
CN108829473B (zh) 事件响应方法、装置及存储介质
CN111973980A (zh) 虚拟宠物的控制方法、移动设备和计算机存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20923796; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 20923796; Country of ref document: EP; Kind code of ref document: A1)