CN110297472A - Apparatus control method, terminal, controlled device, electronic equipment and storage medium - Google Patents

Apparatus control method, terminal, controlled device, electronic equipment and storage medium

Info

Publication number
CN110297472A
CN110297472A
Authority
CN
China
Prior art keywords
controlled device
program code
program
terminal
controlled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910578882.2A
Other languages
Chinese (zh)
Inventor
张军伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sensetime Intelligent Technology Co Ltd
Original Assignee
Shanghai Sensetime Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sensetime Intelligent Technology Co Ltd filed Critical Shanghai Sensetime Intelligent Technology Co Ltd
Priority to CN201910578882.2A priority Critical patent/CN110297472A/en
Publication of CN110297472A publication Critical patent/CN110297472A/en
Priority to PCT/CN2020/091915 priority patent/WO2020259154A1/en
Priority to TW109121221A priority patent/TWI743853B/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 19/4183 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 19/41835 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by programme execution
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 19/41845 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by system universality, reconfigurability, modularity
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 19/4185 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the network communication
    • G05B 19/41855 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the network communication by local area network [LAN], network structure
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D 1/0278 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/12 Target-seeking control
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Selective Calling Equipment (AREA)
  • Stored Programmes (AREA)

Abstract

The embodiments of the present disclosure disclose a device control method, a terminal, a controlled device, an electronic device, and a storage medium. The method comprises: acquiring, through a local agent program, program code for controlling a first controlled device to work; and sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code. With the technical solution of the embodiments of the present disclosure, program code can be acquired and pushed based on the local agent program, control of the first controlled device can then be realized based on the program code, and the program code can be flexibly edited, so that the first controlled device can be controlled more flexibly.

Description

Device control method, terminal, controlled device, electronic device, and storage medium
Technical Field
The present invention relates to intelligent device control technologies, and in particular, to a device control method, a terminal, a controlled device, an electronic device, and a storage medium.
Background
Currently, remote control methods for smart devices are applied in various fields, for example, the remote control field or the programming education field. However, in the related art, a smart device can only be controlled by sending corresponding control commands through fixed command buttons; that is, the control commands for the smart device are drawn from a fixed set of one or more commands, so the manner of controlling the smart device is inflexible.
Disclosure of Invention
The embodiments of the disclosure are expected to provide a technical solution for device control.
The embodiments of the disclosure provide a device control method, which comprises the following steps:
acquiring, through a local agent program, program code for controlling a first controlled device to work;
and sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
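The claims do not fix a wire format for pushing the code from the local agent program to the first controlled device. As one minimal sketch, assuming a length-prefixed JSON frame carried over any byte stream (the frame layout and field names are invented for illustration), the push and its device-side inverse could look like:

```python
import json
import struct

def frame_program_code(code: str) -> bytes:
    """Wrap user-submitted program code in a length-prefixed JSON frame
    so the controlled device knows where the payload ends."""
    payload = json.dumps({"type": "program_code", "code": code}).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def unframe_program_code(frame: bytes) -> str:
    """Inverse of frame_program_code; run on the controlled-device side."""
    (length,) = struct.unpack(">I", frame[:4])
    payload = json.loads(frame[4:4 + length].decode("utf-8"))
    assert payload["type"] == "program_code"
    return payload["code"]
```

The 4-byte big-endian length prefix is just one conventional choice for delimiting messages on a stream socket.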
Optionally, the method further comprises:
and after the program code is sent to the first controlled device, receiving feedback information of the first controlled device, wherein the feedback information is generated after the first controlled device runs the program code.
Optionally, the method further comprises:
and after receiving the feedback information of the first controlled device, loading and/or displaying the feedback information at a terminal where the local agent program is located.
Optionally, the feedback information includes an execution result of the program code.
Optionally, the obtaining, through the local agent program, of the program code for controlling the first controlled device to work includes:
acquiring, through the local agent program, the program code submitted by the user via a WEB page.
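The WEB-page submission path can be sketched with Python's standard-library HTTP server standing in for the agent's collection endpoint. The route, the plain-text request body, and the pending-code queue are all assumptions; the patent only states that the user submits code via a WEB page:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

submitted = []  # program code collected by the agent, pending push to the device

class SubmitHandler(BaseHTTPRequestHandler):
    """Accepts a POST whose body is the user's program code."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        submitted.append(self.rfile.read(length).decode("utf-8"))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the sketch quiet
        pass

def start_agent() -> HTTPServer:
    """Bind to an ephemeral local port and serve in a background thread."""
    server = HTTPServer(("127.0.0.1", 0), SubmitHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A browser form on the WEB page would POST to this endpoint; the agent then pushes each queued submission on to the first controlled device.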
The embodiments of the present disclosure further provide a device control method, which is applied to a first controlled device, and the method includes:
receiving program code sent by a local agent program of a terminal for controlling the first controlled device to work; the program code is collected by the local agent program;
and running the program code.
Optionally, the method further comprises:
running the program code to generate feedback information, and sending the feedback information to the terminal.
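The device-side step of executing the pushed code and producing feedback can be sketched with `exec()` plus stdout capture. The feedback-dictionary shape is an assumption, and a real controlled device would want to sandbox or at least restrict the execution environment rather than use a bare `exec`:

```python
import contextlib
import io
import traceback

def run_program_code(code: str) -> dict:
    """Execute pushed program code and package the outcome as feedback
    information for the terminal (execution result or error trace)."""
    out = io.StringIO()
    try:
        with contextlib.redirect_stdout(out):
            exec(code, {})  # empty globals: the code runs in isolation
        return {"status": "ok", "result": out.getvalue()}
    except Exception:
        return {"status": "error", "result": traceback.format_exc()}
```

Returning the error trace as feedback mirrors the claim that the feedback information includes the execution result of the program code.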
Optionally, the feedback information includes an execution result of the program code.
Optionally, the method further comprises:
generating a control instruction of a second controlled device by running the program code, wherein the first controlled device and the second controlled device form a communication connection;
sending the control instruction of the second controlled device to the second controlled device, and enabling the second controlled device to execute the control instruction of the second controlled device;
receiving an instruction execution result sent by the second controlled device, generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device; and the instruction execution result is obtained after the second controlled device executes the control instruction of the second controlled device.
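The three steps of this clause can be sketched as one cooperation round, with callables standing in for the wired link to the second controlled device and for the first device's own actuator. All names here are illustrative, not the patent's API:

```python
def cooperation_round(send_to_second, derive_first_instruction, execute_first):
    """One round of first/second controlled-device cooperation:
    1. generate an instruction for the second device and send it,
    2. receive the second device's instruction execution result,
    3. derive the first device's own control instruction from that
       result and execute it."""
    instruction = {"op": "image_detection"}   # produced by the running program code
    result = send_to_second(instruction)      # over the communication connection
    first_instruction = derive_first_instruction(result)
    execute_first(first_instruction)
    return first_instruction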
Optionally, the method further comprises:
acquiring working mode switching information; the working mode switching information is used for instructing the first controlled device and the second controlled device to switch their current working mode to a target working mode;
controlling the first controlled device to switch its working mode according to the working mode switching information; and sending the working mode switching information to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information.
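The mode-switching clause reduces to: apply the switch locally, then forward the same switching information to the second controlled device. A sketch (the mode names, state dictionary, and message shape are assumptions):

```python
def switch_working_mode(local_state: dict, target_mode: str, send_to_second) -> dict:
    """Switch the first controlled device's working mode and forward the
    switching information so the second controlled device follows suit."""
    new_state = dict(local_state, mode=target_mode)           # local switch
    send_to_second({"type": "mode_switch", "mode": target_mode})  # forward
    return new_state
```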
Optionally, the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
the generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device include:
generating a control instruction of the first controlled device according to the image detection result, wherein the control instruction of the first controlled device is a human body tracking instruction;
and controlling the motion state of the first controlled device according to the human body tracking instruction.
Optionally, the generating a control instruction of the first controlled device according to the image detection result includes:
generating a control instruction for controlling the first controlled device to advance, in response to the image detection result indicating that the human body has become smaller;
generating a control instruction for controlling the first controlled device to keep still, in response to the image detection result indicating that the human body has become larger;
generating a control instruction for controlling the first controlled device to turn left, in response to the image detection result indicating that the human body is on the left side of the first controlled device;
and generating a control instruction for controlling the first controlled device to turn right, in response to the image detection result indicating that the human body is on the right side of the first controlled device.
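The four rules above read directly as a mapping from the image detection result to a motion instruction. A sketch, with the detection-result field names invented for illustration:

```python
def tracking_instruction(detection: dict) -> str:
    """Map an image detection result from the second controlled device to
    a motion instruction for the first controlled device, following the
    four human-tracking rules in the text."""
    if detection.get("body_scale") == "smaller":
        return "advance"       # person receding: move forward
    if detection.get("body_scale") == "larger":
        return "keep_still"    # person approaching: hold position
    if detection.get("body_side") == "left":
        return "turn_left"
    if detection.get("body_side") == "right":
        return "turn_right"
    return "keep_still"        # default when nothing is detected
```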
Optionally, the first controlled device and the second controlled device are connected by a wired connection.
The disclosed embodiment also provides a terminal, which includes an obtaining module and a first processing module, wherein,
the obtaining module is configured to acquire, through a local agent program, program code for controlling the first controlled device to work;
and the first processing module is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
Optionally, the first processing module is further configured to receive feedback information of the first controlled device after the program code is sent to the first controlled device, where the feedback information is generated by the first controlled device after the program code is run.
Optionally, the first processing module is further configured to load and/or display the feedback information at a terminal where the local agent program is located after receiving the feedback information of the first controlled device.
Optionally, the feedback information includes an execution result of the program code.
Optionally, the obtaining module is configured to acquire, through the local agent program, the program code submitted by the user via a WEB page.
The embodiments of the present disclosure further provide a first controlled device, which comprises a receiving module and a second processing module, wherein,
the receiving module is configured to receive program code sent by a local agent program of a terminal for controlling the first controlled device to work; the program code is collected by the local agent program;
and the second processing module is configured to run the program code.
Optionally, the second processing module is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
Optionally, the feedback information includes an execution result of the program code.
Optionally, the second processing module is further configured to:
generating a control instruction of a second controlled device by running the program code, wherein the first controlled device and the second controlled device form a communication connection;
sending the control instruction of the second controlled device to the second controlled device, and enabling the second controlled device to execute the control instruction of the second controlled device;
receiving an instruction execution result sent by the second controlled device, generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device; and the instruction execution result is obtained after the second controlled device executes the control instruction of the second controlled device.
Optionally, the second processing module is further configured to acquire working mode switching information; control the first controlled device to switch its working mode according to the working mode switching information; and send the working mode switching information to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information; the working mode switching information is used for instructing the first controlled device and the second controlled device to switch their current working mode to a target working mode.
Optionally, the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
the second processing module is configured to generate a control instruction of the first controlled device according to the image detection result, wherein the control instruction of the first controlled device is a human body tracking instruction; and to control the motion state of the first controlled device according to the human body tracking instruction.
Optionally, the second processing module is configured to generate a control instruction for controlling the first controlled device to advance, in response to the image detection result indicating that the human body has become smaller; generate a control instruction for controlling the first controlled device to keep still, in response to the image detection result indicating that the human body has become larger; generate a control instruction for controlling the first controlled device to turn left, in response to the image detection result indicating that the human body is on the left side of the first controlled device; and generate a control instruction for controlling the first controlled device to turn right, in response to the image detection result indicating that the human body is on the right side of the first controlled device.
Optionally, the first controlled device and the second controlled device are connected by a wired connection.
The disclosed embodiments also provide a terminal comprising a processor and a memory for storing a computer program capable of running on the processor; wherein,
the processor is configured to execute any one of the above-described device control methods when the computer program is executed.
An embodiment of the present disclosure also provides an electronic device, which includes a processor and a memory for storing a computer program capable of running on the processor; wherein,
the processor is configured to execute any one of the above-described device control methods when the computer program is executed.
The disclosed embodiments also provide a computer storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements any one of the above-mentioned device control methods.
In the device control method, the terminal, the controlled device, the electronic device, and the computer storage medium provided by the embodiments of the disclosure, program code for controlling a first controlled device to work is acquired through a local agent program, and the program code is sent to the first controlled device through the local agent program, so that the first controlled device runs the program code. According to the technical solution of the embodiments of the disclosure, the program code can be acquired and pushed based on the local agent program, control of the first controlled device can then be realized based on the program code, and the program code can be flexibly edited, so that the first controlled device can be controlled more flexibly.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a first flowchart of an apparatus control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating a connection relationship between an EV3 smart car and a Raspberry Pi in an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a connection relationship between an EV3 intelligent vehicle and a terminal in an embodiment of the disclosure;
FIG. 4 is a flowchart of an apparatus control method according to an embodiment of the present disclosure;
FIG. 5 is a functional flow diagram of an agent in an embodiment of the present disclosure;
FIG. 6 is a schematic view of a processing flow of an EV3 smart car and a Raspberry Pi in a human tracking mode according to an embodiment of the disclosure;
fig. 7 is a schematic view of a processing flow for feedback information in an embodiment of the present disclosure;
FIG. 8 is a second flowchart of an apparatus control method according to an embodiment of the disclosure;
fig. 9 is a flowchart of a device control method according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of another terminal according to an embodiment of the present disclosure;
FIG. 12 is a schematic diagram of a first controlled device according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
The present disclosure will be described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the examples provided herein are merely illustrative of the present disclosure and are not intended to limit the present disclosure. In addition, the embodiments provided below are some embodiments for implementing the disclosure, not all embodiments for implementing the disclosure, and the technical solutions described in the embodiments of the disclosure may be implemented in any combination without conflict.
It should be noted that, in the embodiments of the present disclosure, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a method or apparatus including a series of elements includes not only the explicitly recited elements but also other elements not explicitly listed or inherent to the method or apparatus. Without further limitation, an element defined by the phrase "including a/an ..." does not exclude the presence of other related elements (e.g., steps in a method or elements in a device, such as portions of circuitry, processors, programs, software, etc.) in the method or device that includes the element.
For example, the device control method provided by the embodiment of the present disclosure includes a series of steps, but the device control method provided by the embodiment of the present disclosure is not limited to the described steps, and similarly, the terminal and the controlled device provided by the embodiment of the present disclosure include a series of modules, but the terminal and the controlled device provided by the embodiment of the present disclosure are not limited to include the explicitly described modules, and may also include modules that are required to be set for acquiring related information or performing processing based on the information.
The term "and/or" herein merely describes an association between associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the group consisting of A, B, and C.
The disclosed embodiments may be implemented in connection with a control system comprising terminals and devices to be controlled and are operational with numerous other general purpose or special purpose computing system environments or configurations. Here, the terminal may be a thin client, a thick client, a hand-held or laptop device, a microprocessor-based system, a set-top box, a programmable consumer electronics, a network personal computer, a small computer system, etc., and the controlled device may be an electronic device such as a smart car.
Electronic devices such as terminals, controlled devices, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types.
In some embodiments of the present disclosure, the educational robot is a common teaching-aid model in modern programming education, and the educational robot can be controlled in a WEB mode or through a mobile Application (APP) client. In related smart device control schemes, transmission is usually performed over Bluetooth or wireless video, and information processing is then performed on the server side. A first example scheme discloses a mind-controlled video car system based on Wi-Fi (Wireless Fidelity) communication; the system consists of four parts, namely an electroencephalogram acquisition card, a smartphone, a mobile car, and a personal computer, where the smartphone is connected to the electroencephalogram acquisition card over Bluetooth so that remote control of the mobile car can be realized. A second example scheme discloses a remotely controlled weeding robot; in a specific implementation, a camera and a Global Positioning System (GPS) chip can be mounted on the robot body, the cart transmits the captured video to an upper computer in a wireless communication mode, and after the relevant processing is finished, instructions are issued to the robot. A third example scheme discloses a remote management scheme for a sweeping robot; in a specific implementation, a communication module can wirelessly transmit mobile phone information to a network management server, and a management platform then provides services to the user.
In the above smart device control schemes, only fixed commands can be transmitted in a wireless communication manner, and only fixed control instructions can be sent through fixed command buttons; moreover, the quality of the wireless communication network can affect the communication between the upper computer and the lower computer.
In some embodiments of the present disclosure, implementing remote control over a smart device in a WEB mode is an efficient and convenient scheme and plays an extremely important role in the remote control and education fields; however, many problems remain to be solved before accurate, real-time control can be realized. First, a wireless connection is adopted between the upper computer and the lower computer, but a wireless network is unstable and has a high packet loss rate; this is a serious problem in scenarios with high performance requirements, and it limits the application scenarios. Second, the related art only realizes the transmission of control instructions from the upper computer to the lower computer; feedback information cannot be collected or effectively processed, and both the issuing of control instructions from the WEB side of the upper computer and the collection and reporting of instruction-execution feedback from the lower computer still need to be designed into a flexible and complex architecture.
In view of the above technical problems, some embodiments of the present disclosure provide a technical solution for device control, and the embodiments of the present disclosure may be applied in scenarios such as educational robots, remote control, intelligent remote control, and the like.
First embodiment
The embodiment of the disclosure provides a device control method; fig. 1 is a first flowchart of a device control method according to an embodiment of the present disclosure, and as shown in fig. 1, the method may include the following steps:
step 101: acquiring a program code for controlling the first controlled device to work through a local agent program;
in the embodiment of the disclosure, the first controlled device may be an electronic device such as an intelligent vehicle, an embedded development device, an intelligent robot, and the like; the first controlled device can comprise a sensor, a camera and the like, and the terminal can provide a human-computer interaction interface. In the embodiment of the present disclosure, the first controlled device may be a single electronic device or an integrated electronic device.
In practical applications, a local agent (agent) program may be installed in the terminal, and the local agent program is at least used for collecting program codes submitted by the user, where the program codes submitted by the user may be program codes for controlling the operation of the first controlled device, and thus, by running the local agent program, program codes for controlling the operation of the first controlled device may be collected.
As an implementation manner, the program code submitted by the user based on the WEB page may be acquired by the local agent program, that is, the user may submit the program code for controlling the first controlled device to operate through the WEB page of the terminal, and the local agent program may collect the corresponding program code under the condition that the local agent program is running; therefore, the user can conveniently submit the program codes based on the WEB page.
In the embodiment of the present disclosure, the type of the program code is not limited, and the type of the program code may be predetermined according to an actual application requirement, for example, the program code may be python code or other program codes.
Step 102: the program code is sent to the first controlled device by the local agent.
In practical applications, a communication connection between the terminal and the first controlled device may be established in advance, and in the embodiment of the present disclosure, a connection manner of the communication connection between the terminal and the first controlled device is not limited, for example, the terminal and the first controlled device may perform data interaction in a wired connection manner or in various wireless connection manners.
In an example, the local agent program may be configured to establish a communication connection between the terminal and the first controlled device, for example, when the local agent program is running, the terminal may send an Address Resolution Protocol (ARP) packet in a local area network in a broadcast manner, and when receiving the ARP packet, the first controlled device in the local area network replies a corresponding packet, and the local agent program may analyze the packet replied by the first controlled device to obtain an Internet Protocol (IP) Address of the first controlled device, so that a communication connection between the terminal and the first controlled device may be established; optionally, when there are multiple controlled devices in the local area network, if the local agent receives a data packet returned by the multiple controlled devices, an intelligent device list representing the multiple controlled devices may be generated, and the user may select at least one controlled device in the multiple controlled devices from the intelligent device list of the multiple controlled devices; the communication connection between the terminal and the selected controlled device can be established based on the local agent program.
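The list-building part of the discovery step above can be sketched as follows; the ARP probing itself is omitted, and the replies are assumed to have already been parsed into dictionaries whose `ip` and `name` fields are hypothetical:

```python
def build_device_list(arp_replies):
    """Collect the IP addresses of controlled devices that answered the
    broadcast ARP probe into a selectable device list (de-duplicated,
    in reply order)."""
    devices = []
    seen = set()
    for reply in arp_replies:
        ip = reply.get("ip")
        if ip and ip not in seen:
            seen.add(ip)
            devices.append({"ip": ip, "name": reply.get("name", "unknown")})
    return devices

# Hypothetical parsed replies from two controlled devices (one answered twice).
replies = [
    {"ip": "192.168.1.23", "name": "EV3-car-1"},
    {"ip": "192.168.1.42", "name": "EV3-car-2"},
    {"ip": "192.168.1.23", "name": "EV3-car-1"},   # duplicate reply
]
device_list = build_device_list(replies)
```

From such a list, the user can then select at least one controlled device with which the local agent program establishes a connection.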
Optionally, after the communication connection between the terminal and the first controlled device is established, the communication connection may be monitored to obtain the communication connection status between the terminal and the first controlled device. For example, based on the local agent program, the terminal may continue to send ARP packets in the local area network in a broadcast manner, so that the terminal may determine the communication connection status according to the replies to these ARP packets. For example, after the terminal sends an ARP packet to the first controlled device, if the terminal does not receive a reply packet for the ARP packet within a set duration, it may determine that the communication connection between the terminal and the first controlled device is interrupted; if a reply packet for the ARP packet is received within the set duration, the terminal and the first controlled device may be considered to be in communication connection. The set duration may be set according to actual requirements; in a specific example, the set duration may be 1 to 3 seconds.
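The timeout decision just described can be sketched as a small helper; the function name, its parameters, and the timestamps are illustrative, not from the original:

```python
def connection_status(last_reply_time, now, timeout_s=3.0):
    """Return 'connected' if a reply to the most recent ARP probe arrived
    within the set duration, otherwise treat the link as interrupted."""
    return "connected" if (now - last_reply_time) <= timeout_s else "interrupted"

# Example: last reply 1.5 s ago vs. 10 s ago, with a 3 s set duration.
status_ok = connection_status(last_reply_time=100.0, now=101.5)
status_lost = connection_status(last_reply_time=100.0, now=110.0)
```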
In the embodiment of the present disclosure, the terminal may send the program code to the first controlled device according to a communication connection established between the terminal and the first controlled device in advance.
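As a hedged sketch of this push step, the following simulates the terminal-to-device transfer over a UDP loopback socket (UDP is the transport used in the second embodiment and is assumed here; the addresses and the sample code string are illustrative):

```python
import socket

# Device side: a UDP service listening on a port (an ephemeral loopback
# port here, so the sketch is self-contained).
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server.settimeout(2.0)
port = server.getsockname()[1]

# Terminal side: the local agent program pushes the collected program code.
program_code = b"print('forward')"
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(program_code, ("127.0.0.1", port))

# Device side: receive the pushed program code.
received, addr = server.recvfrom(4096)
client.close()
server.close()
```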
Step 103: the first controlled device receives and runs the program code.
In practical application, the first controlled device may run the uploaded program code to generate a corresponding control instruction, and then execute the control instruction; in this manner, since the control instruction is generated according to the program code, the terminal can control the first controlled device through the program code.
For example, when the first controlled device comprises a smart car, by executing the above program code, a control instruction may be generated, which may indicate a motion state of the smart car.
In practical applications, the steps 101 to 102 may be implemented based on a Processor of a terminal, and the step 103 may be implemented based on a Processor of the first controlled Device, where the Processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understood that, for different terminals or controlled devices, the electronic device implementing the above processor function may be another electronic device; the embodiments of the present disclosure are not particularly limited in this respect.
It can be seen that, by adopting the technical scheme of the embodiment of the present disclosure, the program code can be obtained and pushed based on the local agent program, and then the control of the first controlled device can be realized based on the program code, and the program code can be flexibly edited, so that the first controlled device can be more flexibly controlled.
Optionally, the first controlled device may run the program code to generate feedback information, and then the feedback information may be sent to the terminal.
In this embodiment, the feedback information may be used to indicate feedback on the running condition of the program code, for example, the feedback information may include an execution result of the program code.
It can be seen that, by sending the feedback information to the terminal, the terminal can know the running condition of the program code and can conveniently perform further processing on the program code; for example, when the running condition of the program code does not meet expectations, the program code can be modified.
Optionally, after receiving the feedback information sent by the first controlled device, the terminal may load the feedback information and/or display the feedback information on a display interface of the terminal.
Therefore, the feedback information is loaded and/or displayed locally on the terminal, so that a user can conveniently and intuitively know the running condition of the program code.
In the embodiment of the present disclosure, an implementation manner of loading the feedback information by the terminal is not limited, and illustratively, the terminal loads the feedback information in a synchronous loading manner or an asynchronous loading manner.
In a specific example, the terminal may load and/or display the feedback information at the WEB front end, so that the user can intuitively know the running condition of the program code through the WEB front end.
Optionally, the first controlled device may generate a control instruction of a second controlled device by running the program code, where the first controlled device and the second controlled device form a communication connection;
in this embodiment, one second controlled device may be provided, or a plurality of second controlled devices may be provided; the second controlled device may be a raspberry pi or other electronic device; the second controlled device may include a sensor, a camera, and the like; the second controlled device may be the same kind as the first controlled device or may be different.
After the first controlled device generates the control instruction of the second controlled device, the first controlled device may send the control instruction of the second controlled device to the second controlled device; the second controlled device may execute the control instruction of the second controlled device to obtain an instruction execution result.
The second controlled device may transmit the instruction execution result to the first controlled device; the first controlled device may generate a control instruction of the first controlled device according to the instruction execution result, and execute the control instruction of the first controlled device.
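A minimal in-process simulation of this cooperative loop, with hypothetical instruction and field names (the real devices would exchange these messages over the serial link):

```python
def second_device_execute(instruction, frame):
    """Second controlled device (e.g. a raspberry pi): execute the image
    detection instruction and return the instruction execution result."""
    assert instruction == "detect_human"
    return {"human_id": frame["human_id"], "x_offset": frame["x_offset"]}

def first_device_step(frame):
    """First controlled device: issue the detection instruction to the
    second device, then turn the returned instruction execution result
    into its own control instruction."""
    result = second_device_execute("detect_human", frame)
    return "turn_left" if result["x_offset"] < 0 else "turn_right"

# The detected person is offset to the left of center, so the first
# controlled device generates a left-turn instruction.
cmd = first_device_step({"human_id": 7, "x_offset": -12})
```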
It can be seen that the present embodiment provides a cooperative control scheme for a first controlled device and a second controlled device, and specifically, based on the program code, the cooperative control of the first controlled device and the second controlled device can be implemented.
Optionally, the first controlled device and the second controlled device are connected in a wired connection manner, for example, the first controlled device and the second controlled device are connected through a serial port; in this manner, the reliability of communication between the first controlled device and the second controlled device can be ensured.
Optionally, the first controlled device may further obtain working mode switching information; the working mode switching information is used for instructing the first controlled device and the second controlled device to switch the current working mode to a target working mode.
therefore, the first controlled equipment can control the first controlled equipment to switch the working mode according to the working mode switching information; the first controlled equipment can also send the working mode switching information to the second controlled equipment; the second controlled device may switch the operation mode based on the operation mode switching information after receiving the operation mode switching information.
That is to say, in the embodiment of the present disclosure, multiple working modes may be set for each controlled device, and the controlled device performs different operations in different working modes. It can be seen that the present embodiment can implement synchronous switching of the working modes of the first controlled device and the second controlled device.
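The synchronous mode switching can be sketched as follows; the `ControlledDevice` class and the in-process peer link standing in for the serial connection are illustrative assumptions:

```python
class ControlledDevice:
    def __init__(self, name, mode="idle"):
        self.name = name
        self.mode = mode
        self.peer = None   # stands in for the serial-linked peer device

    def switch_mode(self, target_mode, propagate=True):
        """Switch this device's working mode and forward the working mode
        switching information to the peer, so both devices end up in the
        same target working mode."""
        self.mode = target_mode
        if propagate and self.peer is not None:
            self.peer.switch_mode(target_mode, propagate=False)

ev3 = ControlledDevice("EV3")
rpi = ControlledDevice("RaspberryPi")
ev3.peer, rpi.peer = rpi, ev3

# e.g. triggered by the touch sensor on the first controlled device
ev3.switch_mode("human_tracking")
```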
Optionally, the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result; at this time, the first controlled device may generate a human body tracking instruction according to the received image detection result; then, the motion state of the first controlled device can be controlled according to the human body tracking instruction, and then human body tracking is achieved.
It can be seen that in the embodiments of the present disclosure, human body tracking can be achieved on the basis of cooperative control of the first controlled device and the second controlled device.
Optionally, the first controlled device generates a control instruction for controlling the first controlled device to advance in response to the image detection result indicating that the human body becomes smaller; generates a control instruction for controlling the first controlled device to keep still in response to the image detection result indicating that the human body becomes larger; generates a control instruction for controlling the first controlled device to turn left in response to the image detection result indicating that the human body is on the left side of the first controlled device; and generates a control instruction for controlling the first controlled device to turn right in response to the image detection result indicating that the human body is on the right side of the first controlled device.
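The four motion rules above can be collected into one decision function; the detection-result field names (`size_change`, `side`) are hypothetical:

```python
def tracking_command(detection):
    """Map an image detection result to a motion command for the first
    controlled device, following the human-tracking rules."""
    if detection.get("size_change") == "smaller":
        return "forward"      # person moved away: advance
    if detection.get("size_change") == "larger":
        return "stop"         # person is close enough: keep still
    if detection.get("side") == "left":
        return "turn_left"
    if detection.get("side") == "right":
        return "turn_right"
    return "stop"             # no usable detection: stay put
```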
It can be seen that, in the embodiment of the present disclosure, by controlling the motion state of the first controlled device, the human body tracking can be accurately achieved.
Second embodiment
The present disclosure is further illustrated by the following specific application example.
In this specific application embodiment, the first controlled device may include an EV3 smart car, a motor, related sensors, and the like; the EV3 smart car can drive the related sensors to move through the motor. The second controlled device can comprise a raspberry pi, a motor, a camera, and the like. Fig. 2 is a schematic diagram illustrating a connection relationship between the EV3 smart car and the raspberry pi in an embodiment of the present disclosure; as shown in fig. 2, the raspberry pi may be connected to the camera in a Universal Serial Bus (USB) connection manner, so as to collect surrounding video or image information in real time; the raspberry pi can also drive the camera to rotate through the motor; and the raspberry pi can be connected to the EV3 smart car through a serial port patch cord. In one example, the related sensor can be a touch sensor, and a trigger signal generated when the touch sensor is touched can be used for synchronously modifying the functional modes of the raspberry pi and the EV3 smart car. Specifically, when touched, the touch sensor generates a trigger signal to the directly connected EV3 smart car, so that the functional mode flag bit of the EV3 smart car is changed; the flag bit is then sent to the raspberry pi through the serial port, which triggers the change of the functional mode flag bit of the raspberry pi, thereby realizing synchronous switching of the functional modes of the raspberry pi and the EV3 smart car.
In one example, the connection line correspondence of the raspberry pi and the EV3 smart car may be as shown in table 1.
TABLE 1
The terminal can install the above-described local agent program. Fig. 3 is a schematic diagram illustrating a connection relationship between the EV3 smart car and the terminal in an embodiment of the present disclosure; as shown in fig. 3, the EV3 smart car may be connected to the terminal in a wireless connection manner, so that after the terminal's local agent program obtains the program code, the program code can be pushed to the EV3 smart car in real time. In order to implement the wireless connection between the EV3 smart car and the terminal, a wireless network card may be externally connected to the USB interface of the EV3 smart car. Optionally, the EV3 smart car and the terminal can be in the same network segment, so that the stability of the wireless network connection and the reliability of communication between the EV3 smart car and the terminal can be ensured.
Fig. 4 is a flowchart of a device control method according to an embodiment of the present disclosure, and as shown in fig. 4, the flowchart may include:
step 401: connect EV3 smart car and raspberry group, connect EV3 smart car and terminal.
The implementation of this step has already been described in the foregoing description, and is not described herein again.
Step 402: and on the terminal side, realizing program code collection of the WEB front end based on the local agent program, and pushing the collected program codes to the EV3 intelligent vehicle.
Here, the python code submitted by the user can be collected based on the local agent program; after the python code is pushed to the EV3 smart car, the EV3 smart car can work based on the python code, that is, remote control of the EV3 smart car can be realized based on the python code, so that controlled devices such as the EV3 smart car can be controlled more flexibly and in more diverse ways. In an education scene, students can continuously try, make mistakes, and verify novel control schemes, which helps them develop programming thinking.
In order to realize the connection between the EV3 smart car and the terminal, a User Datagram Protocol (UDP) service needs to be started on the EV3 smart car side, and the implementation of the local agent program needs to be based on the UDP service started by the EV3 smart car.
Fig. 5 is a schematic diagram of a functional flow of an agent program in an embodiment of the present disclosure, and referring to fig. 5, the flow may include:
step A1: and acquiring an intelligent vehicle list of the terminal in the same network segment in a broadcasting mode.
Specifically, when the local agent program is run, the terminal can issue an ARP (Address Resolution Protocol) data packet in the local area network in a broadcast manner; an EV3 smart car in the local area network replies a corresponding data packet after receiving the ARP data packet, and the local agent program can analyze the data packet replied by the EV3 smart car to obtain the IP address of the EV3 smart car. When the IP addresses of a plurality of EV3 smart cars are obtained, a list of EV3 smart cars in the same network segment can be generated.
Step A2: select EV3 smart car and establish connection.
Specifically, a particular smart vehicle may be selected in the EV3 smart vehicle list and a UDP connection may be created with the corresponding smart vehicle.
Step A3: and pushing program codes to the EV3 intelligent vehicle, and receiving feedback information of the EV3 intelligent vehicle.
Specifically, at the front end of the WEB on the terminal side, a user can submit a program code, after the user clicks a submit button, the local agent program can collect the program code submitted by the user, and then the program code can be pushed to a selected EV3 smart car; the EV3 smart car may run program code, generate feedback information, and send the feedback information to the terminal.
Step A4: and loading feedback information at the front end of the WEB in an asynchronous loading mode.
Step A5: after the program codes are executed, the terminal can be disconnected from the EV3 intelligent vehicle, and resources can be recycled.
Specifically, after receiving or loading the feedback information, it may be considered that the program code has been executed, and at this time, by disconnecting the terminal from the EV3 smart car, the network transmission resource between the terminal and the EV3 smart car may be released, and at the terminal side, the corresponding hardware resource and/or software resource (e.g., WEB front-end resource) may also be released.
Step 403: the EV3 intelligent vehicle receives and runs the program codes, and pushes the image detection instruction to the raspberry pi.
Specifically, the EV3 smart car starts a corresponding UDP service, which may continuously monitor information of the fixed port, and after the local agent program pushes the program code to the EV3 smart car, the EV3 smart car may run the received program code (which may be encapsulated in a data packet), so as to obtain a control command; if the control instruction relates to an image processing task, an image detection instruction can be generated, a corresponding service mode is started, and the image detection instruction is pushed to the raspberry pi to instruct the raspberry pi to execute the image detection task, so that an image detection result can be obtained from the raspberry pi.
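A minimal sketch of this device-side UDP service, assuming the pushed packet carries python source that is run with `exec` (a loopback socket and the sample code string stand in for the fixed port and the real program code):

```python
import socket

# Device side: UDP service bound to a port (ephemeral loopback port here).
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server.settimeout(2.0)
port = server.getsockname()[1]

# The terminal's local agent pushes a packet containing program code.
agent = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
agent.sendto(b"command = 'detect_human'", ("127.0.0.1", port))

# Device side: receive the packet, run the program code it encapsulates,
# and keep the resulting control instruction.
packet, _ = server.recvfrom(4096)
namespace = {}
exec(packet.decode("utf-8"), namespace)
control_instruction = namespace["command"]
agent.close()
server.close()
```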
Illustratively, the EV3 smart car and the raspberry pi may have multiple working modes; several working modes of the EV3 smart car and the raspberry pi are illustratively described in Table 2 below.
Raspberry pi working mode | Data result for the EV3 smart car
Face detection            | Face ID and position coordinates
Gesture classification    | Gesture type
Human body tracking       | Human body ID and coordinates
Sphere detection, etc.    | Sphere color and position coordinates
TABLE 2
Step 404: the raspberry group executes the image detection command, transmits the image detection result (also called image processing result) to the EV3 intelligent vehicle, and the EV3 intelligent vehicle executes a corresponding control command according to the image detection result.
Specifically, the image detection instruction can be encapsulated in a data packet and sent; the raspberry pi receives the data packet from the EV3 smart car, parses it to obtain its own working mode, then executes the corresponding image detection task, and sends the image detection result to the EV3 smart car. In the same working mode, the EV3 smart car and the raspberry pi cooperate to perform a specific task.
Fig. 6 is a schematic view of the processing flow of the EV3 smart car and the raspberry pi in the human body tracking mode in an embodiment of the disclosure; as shown in fig. 6, the processing flow of the EV3 smart car and the raspberry pi can be described in terms of an image processing stage, a detection data processing stage and an instruction execution stage. The image processing stage can be triggered by an image detection instruction pushed by the EV3 smart car: firstly, the camera collects pictures, and then human body detection is performed on the collected images based on a deep learning technology to obtain human body coordinates. The data processing stage mainly realizes human body ID identification and position coordinate processing. The data processing result is sent to the EV3 smart car, and the EV3 smart car can control the motor to run according to the logic shown in fig. 6; in this way, a basic human body tracking function can be realized. Referring to fig. 6, when it is determined that the human body becomes smaller, the EV3 smart car is driven by the motor to advance; when it is determined that the human body becomes larger, the EV3 smart car is controlled to remain still; when the human body is determined to be on the left side, the EV3 smart car is driven by the motor to turn left; and when the human body is determined to be on the right side, the EV3 smart car is driven by the motor to turn right.
Step 405: the EV3 smart car collects the feedback information and sends the feedback information to the terminal.
Here, based on steps 401 to 405, remote control of the EV3 smart car from the WEB front end has been implemented; however, a complete feedback flow needs to be designed in cooperation to ensure the integrity of the bidirectional information flow.
Fig. 7 is a schematic diagram of a processing flow for feedback information in an embodiment of the present disclosure. As shown in fig. 7, on the EV3 smart car side (denoted as the EV3 side in fig. 7), after the python code is received, the source code may be converted into bytecode by the smart car's python interpreter, and the bytecode is then converted into machine code so that the instructions can be executed. The execution result of the program code can be redirected to a text file. When a UDP communication module is started on the EV3 side, a communication connection between the EV3 smart car and the terminal can be established based on the UDP communication module; the UDP communication module can detect in real time whether the text file has been updated, and after the text file is updated, it can read the content of the text file (namely, the feedback information) and actively push the read feedback information to the local agent program of the terminal through the established communication connection. The local agent program is based on a JAVA framework and can load the feedback information to the WEB front end in an asynchronous loading manner; after a series of rendering steps, the WEB front end displays the feedback information in a designated region.
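The redirection of the execution result into a text file can be sketched with `contextlib.redirect_stdout`; the temporary file path and the sample output line are illustrative, and the real-time update detection and the UDP push to the agent are omitted:

```python
import contextlib
import os
import tempfile

# The execution result of the program code is redirected into a text file;
# the UDP communication module would then read this file and push its
# content to the terminal's local agent program.
feedback_path = os.path.join(tempfile.mkdtemp(), "feedback.txt")
with open(feedback_path, "w") as f, contextlib.redirect_stdout(f):
    print("motor: forward, speed=30")   # stands in for the code's output

with open(feedback_path) as f:
    feedback = f.read()                 # what the UDP module would push
```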
It can be seen that the application embodiment of the present disclosure provides a complete remote control scheme based on the EV3 smart car: program code can be input through the WEB end and transmitted to the lower computer (i.e., the EV3 smart car) in real time, and the lower computer can reply with feedback information. Furthermore, the collection of python code at the WEB front end can be completed through the local agent program, and the program code is pushed to the lower computer side for execution based on a UDP connection; through the local agent program, information interaction between the terminal and the EV3 smart car can be realized, and the transmission is efficient and reliable. Furthermore, a connection scheme among the terminal, the EV3 smart car, the raspberry pi and related peripherals is designed, which can guarantee an efficient and feasible connection and is compatible with most embedded development devices.
In the related art, the EV3 smart car and the raspberry pi are usually controlled independently, and no cooperative control scheme for the EV3 smart car and the raspberry pi is given; moreover, the related art only realizes single-working-mode control of the EV3 smart car or the raspberry pi, and cannot switch the EV3 smart car or the raspberry pi between different working modes.
In the application embodiment of the disclosure, a complete implementation scheme is provided for the connection and control of the EV3 smart car and the raspberry pi; moreover, a synchronous switching mechanism for different working modes of the EV3 smart car and the raspberry pi can be realized, which ensures the consistency of information interaction under different working modes.
Third embodiment
Based on the contents described in the foregoing embodiments, a device control method according to the present embodiment will be described below from the viewpoint of a terminal.
Fig. 8 is a second flowchart of a device control method according to an embodiment of the present disclosure, and as shown in fig. 8, the flowchart may include:
step 801: acquiring a program code for controlling the first controlled device to work through a local agent program;
in the embodiment of the disclosure, the first controlled device may be an electronic device such as an intelligent vehicle and an embedded development device; the first controlled device can comprise a sensor, a camera and the like, and the terminal can provide a human-computer interaction interface. In the embodiment of the present disclosure, the first controlled device may be a single electronic device, or may integrate a plurality of electronic devices.
In practical applications, a local agent (agent) program may be installed in the terminal, and the local agent program is at least used for collecting program codes submitted by the user, where the program codes submitted by the user may be program codes for controlling the operation of the first controlled device, and thus, by running the local agent program, program codes for controlling the operation of the first controlled device may be collected.
In the embodiment of the present disclosure, the type of the program code is not limited, and the type of the program code may be predetermined according to an actual application requirement, for example, the program code may be python code or other program codes.
Step 802: and sending the program code to the first controlled equipment through the local agent program, so that the first controlled equipment runs the program code.
In practical applications, a communication connection between the terminal and the first controlled device may be established in advance, and in the embodiment of the present disclosure, a connection manner of the communication connection between the terminal and the first controlled device is not limited, for example, the terminal and the first controlled device may perform data interaction in a wired connection manner or in various wireless connection manners.
In practical application, the first controlled device may run the uploaded program code to generate a corresponding control instruction, and then execute the control instruction; in this manner, since the control instruction is generated according to the program code, the terminal can control the first controlled device through the program code.
In practical applications, steps 801 to 802 may be implemented based on a processor of the terminal, where the processor may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understood that the electronic device for implementing the above processor function may be other electronic devices for different terminals or controlled devices, and the embodiments of the present disclosure are not particularly limited.
It can be seen that, by adopting the technical scheme of the embodiment of the present disclosure, the program code can be obtained and pushed based on the local agent program, and then the control of the first controlled device can be realized based on the program code, and the program code can be flexibly edited, so that the first controlled device can be more flexibly controlled.
Optionally, the method further comprises:
and after the program code is sent to the first controlled equipment, receiving feedback information of the first controlled equipment, wherein the feedback information is generated after the first controlled equipment runs the program code.
Optionally, the feedback information includes an execution result of the program code.
It can be seen that, through the feedback information, the terminal can know the running condition of the program code and can conveniently perform further processing on the program code; for example, when the running condition of the program code does not meet expectations, the program code can be modified.
Optionally, the method further comprises:
and after receiving the feedback information of the first controlled device, loading and/or displaying the feedback information at a terminal where the local agent program is located.
Therefore, the feedback information is loaded and/or displayed locally on the terminal, so that a user can conveniently and intuitively know the running condition of the program code.
Optionally, the obtaining, by the local agent program, of the program code for controlling the first controlled device to work includes:
and acquiring the program code submitted by the user based on the WEB page through the local agent program.
Therefore, the user can conveniently submit the program code based on the WEB page, which further facilitates the collection, by the local agent program, of the program code submitted by the user.
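As an illustrative, non-limiting sketch of the scheme above (the class `LocalAgent` and its method names are hypothetical, not part of the disclosure), a terminal-side local agent that collects code submitted via the WEB page and pushes it to the first controlled device might be organized as follows:

```python
# Hypothetical sketch of the terminal-side local agent: it collects
# Python code submitted via the WEB page and pushes it to the first
# controlled device over an abstract send channel (wired or wireless).

class LocalAgent:
    def __init__(self, send_func):
        # send_func stands in for the terminal-to-device connection.
        self.send = send_func

    def collect(self, submitted_code: str) -> str:
        # A real agent would receive this from an HTTP handler behind
        # the WEB page; here we only check that the submission parses.
        compile(submitted_code, "<submitted>", "exec")
        return submitted_code

    def push(self, code: str) -> None:
        # Encode the collected code and hand it to the channel.
        self.send(code.encode("utf-8"))

# Usage: a plain list stands in for the real device connection.
sent = []
agent = LocalAgent(sent.append)
agent.push(agent.collect("print('hello device')"))
```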
Fourth embodiment
Based on the contents described in the foregoing embodiment, another apparatus control method of the present embodiment will be described below from the perspective of the first controlled apparatus.
Fig. 9 is a third flowchart of the device control method according to the embodiment of the present disclosure, and as shown in fig. 9, the flowchart may include:
step 901: receiving a program code which is sent by a local agent program of a terminal and used for controlling the first controlled equipment to work; the program code is collected by the home agent.
In the embodiment of the disclosure, the first controlled device may be an electronic device such as an intelligent vehicle or an embedded development device; the first controlled device may comprise a sensor, a camera, and the like, and the terminal may provide a human-computer interaction interface. In the embodiment of the present disclosure, the first controlled device may be a single electronic device, or may be an integration of a plurality of electronic devices.
In practical applications, a local agent (agent) program may be installed in the terminal. The local agent program is at least used for collecting program code submitted by the user, where the program code submitted by the user may be program code for controlling the operation of the first controlled device; thus, by running the local agent program, the program code for controlling the operation of the first controlled device can be collected.
In the embodiment of the present disclosure, the type of the program code is not limited, and the type of the program code may be predetermined according to actual application requirements; for example, the program code may be Python code or another kind of program code.
In practical applications, a communication connection between the terminal and the first controlled device may be established in advance, and in the embodiment of the present disclosure, a connection manner of the communication connection between the terminal and the first controlled device is not limited, for example, the terminal and the first controlled device may perform data interaction in a wired connection manner or in various wireless connection manners.
Step 902: executing the program code.
In practical application, the first controlled device may run the uploaded program code to generate a corresponding control instruction, and the first controlled device may then execute the control instruction; in this manner, since the control instruction is generated according to the program code, the terminal can realize control of the first controlled device through the program code.
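The flow of step 901 and step 902 can be sketched as follows; the `emit` helper and the function name are illustrative assumptions, since the disclosure does not fix how control instructions are generated from the program code:

```python
# Hypothetical device-side sketch: the received program code is run in a
# namespace that exposes an `emit` helper; the code generates control
# instructions, which the device would then execute in order.

def run_program_code(code: str):
    instructions = []
    namespace = {"emit": instructions.append}  # helper exposed to the code
    exec(code, namespace)                      # run the pushed code
    return instructions

received = "emit('set_speed 0.5'); emit('turn_left 15')"
for instruction in run_program_code(received):
    # A real device would dispatch each instruction to its actuators.
    pass
```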
It can be seen that, by adopting the technical scheme of the embodiment of the present disclosure, the program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code. Since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
Optionally, the method further comprises:
and operating the program code to generate feedback information, and sending the feedback information to the terminal.
Optionally, the feedback information includes an execution result of the program code.
It can be seen that, by having the first controlled device send the feedback information to the terminal, the terminal can learn the running condition of the program code and conveniently perform further processing on the program code; for example, when the program code does not run as expected, the program code can be modified.
Optionally, the method further comprises:
generating a control instruction of a second controlled device by running the program code, wherein the first controlled device and the second controlled device form a communication connection;
sending the control instruction of the second controlled device to the second controlled device, and enabling the second controlled device to execute the control instruction of the second controlled device;
receiving an instruction execution result sent by the second controlled device, generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device; and the instruction execution result is obtained after the second controlled device executes the control instruction of the second controlled device.
It can be seen that the present embodiment provides a cooperative control scheme for a first controlled device and a second controlled device, and specifically, based on the program code, the cooperative control of the first controlled device and the second controlled device can be implemented.
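One cooperative-control step described above can be sketched as follows; the instruction string, the result fields, and the stand-in for the second controlled device are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of one cooperative-control step: the first device
# sends an instruction to the second device, awaits the instruction
# execution result, and turns it into its own control instruction.

def second_device(instruction):
    # Stand-in for the second controlled device; a real one would run an
    # actual detector and return its result over the communication link.
    assert instruction == "detect_human"
    return {"side": "left", "size_change": "smaller"}

def cooperative_step(send_to_second):
    result = send_to_second("detect_human")  # instruction execution result
    if result["side"] == "left":
        return "turn_left"
    if result["side"] == "right":
        return "turn_right"
    return "advance" if result["size_change"] == "smaller" else "hold"

command = cooperative_step(second_device)  # "turn_left" for this stand-in
```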
Optionally, the method further comprises:
acquiring working mode switching information; the working mode switching information is used for indicating the first controlled device and the second controlled device to switch the current working mode to a target working mode;
controlling the first controlled equipment to switch the working mode according to the working mode switching information; and sending the working mode switching information to second controlled equipment, so that the second controlled equipment switches the working mode based on the working mode switching information.
That is to say, in the embodiment of the present disclosure, multiple operating modes may be set for each controlled device, and a controlled device behaves differently in different operating modes. It can be seen that the present embodiment can implement synchronous switching of the operating modes of the first controlled device and the second controlled device.
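The synchronous mode switching can be sketched as follows; the class and attribute names are hypothetical, and the propagation flag is an assumption about how repeated forwarding is avoided:

```python
# Hypothetical sketch of synchronized mode switching: the first device
# applies the target mode itself, then forwards the same switching
# information to the second device so both end up in the same mode.

class ControlledDevice:
    def __init__(self):
        self.mode = "idle"
        self.peer = None  # the other controlled device, if connected

    def switch_mode(self, target_mode: str, propagate: bool = True):
        self.mode = target_mode
        if propagate and self.peer is not None:
            # Forward the switching information; the peer does not
            # propagate again, avoiding an endless ping-pong.
            self.peer.switch_mode(target_mode, propagate=False)

first, second = ControlledDevice(), ControlledDevice()
first.peer = second
first.switch_mode("tracking")  # both devices are now in "tracking" mode
```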
Optionally, the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
the generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device include:
generating a control instruction of the first controlled device according to the image detection result, wherein the control instruction of the first controlled device is a human body tracking instruction;
and controlling the motion state of the first controlled equipment according to the human body tracking instruction.
It can be seen that in the embodiments of the present disclosure, human body tracking can be achieved on the basis of cooperative control of the first controlled device and the second controlled device.
Optionally, the generating a control instruction of the first controlled device according to the image detection result includes:
generating a control instruction for controlling the first controlled device to advance, in response to the image detection result indicating that the human body becomes smaller;
generating a control instruction for controlling the first controlled device to keep still, in response to the image detection result indicating that the human body becomes larger;
generating a control instruction for controlling the first controlled device to turn left, in response to the image detection result indicating that the human body is located on the left side of the first controlled device;
and generating a control instruction for controlling the first controlled device to turn right, in response to the image detection result indicating that the human body is located on the right side of the first controlled device.
It can be seen that, in the embodiment of the present disclosure, by controlling the motion state of the first controlled device, the human body tracking can be accurately achieved.
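The four response rules can be expressed as a single decision function. The detection-result field names are illustrative, and the rule priority when several conditions could apply at once is an assumption, as the disclosure does not specify how conflicting conditions are ordered:

```python
# The four human-tracking response rules as one decision function.
# Field names ("side", "size_change") and rule ordering are assumptions.

def tracking_command(detection: dict) -> str:
    if detection.get("side") == "left":
        return "turn_left"       # human on the left of the device
    if detection.get("side") == "right":
        return "turn_right"      # human on the right of the device
    if detection.get("size_change") == "smaller":
        return "advance"         # human receding: close the distance
    return "hold"                # human larger (or no change): keep still
```

For instance, a detection result of `{"side": "left"}` would yield a turn-left instruction under this sketch.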
Optionally, the first controlled device and the second controlled device are connected in a wired connection manner; in this manner, the reliability of communication between the first controlled device and the second controlled device can be ensured.
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their functions and possible inherent logic.
Fifth embodiment
On the basis of the device control method provided by the foregoing embodiment, the embodiment of the present disclosure provides a terminal.
Fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure. As shown in fig. 10, the terminal includes an obtaining module 1001 and a first processing module 1002, where,
an obtaining module 1001 configured to obtain, through a local agent program, a program code for controlling an operation of a first controlled device;
the first processing module 1002 is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
In an embodiment, the first processing module 1002 is further configured to receive feedback information of the first controlled device after the program code is sent to the first controlled device, where the feedback information is generated after the first controlled device runs the program code.
In an embodiment, the first processing module 1002 is further configured to, after receiving the feedback information of the first controlled device, load and/or display the feedback information at a terminal where the local agent program is located.
In an embodiment, the feedback information comprises results of the execution of the program code.
In an embodiment, the obtaining module 1001 is configured to obtain, by the local agent, the program code submitted by a user based on a WEB page.
The obtaining module 1001 and the first processing module 1002 may be implemented by a processor located in a terminal, where the processor is at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor.
In addition, each functional module in this embodiment may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware or a form of a software functional module.
It is understood that the technical solution of the present embodiment, in essence or in the part contributing to the prior art, may be embodied wholly or partly in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method of the present embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Specifically, the computer program instructions corresponding to the device control method in the present embodiment may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive, and when the computer program instructions corresponding to the device control method in the storage medium are read or executed by an electronic device, any one of the device control methods of the foregoing third embodiment is implemented.
Based on the same technical concept of the foregoing embodiments, referring to fig. 11, it shows another terminal provided by the embodiments of the present disclosure, which may include: a first memory 1101 and a first processor 1102; wherein,
the first memory 1101 for storing computer programs and data;
the first processor 1102 is configured to execute the computer program stored in the first memory to implement any one of the device control methods according to the third embodiment.
In practical applications, the first memory 1101 may be a volatile memory, such as a RAM; or a non-volatile memory, such as a ROM, a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the above types of memories; and provides instructions and data to the first processor 1102.
The first processor 1102 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understood that the electronic devices for implementing the above-described processor functions may be other devices, and the embodiments of the present disclosure are not particularly limited.
Sixth embodiment
On the basis of another device control method proposed by the foregoing embodiment, the embodiment of the present disclosure proposes a controlled device.
Fig. 12 is a schematic structural diagram of a first controlled device according to an embodiment of the present disclosure, as shown in fig. 12, the device includes a receiving module 1201 and a second processing module 1202, wherein,
a receiving module 1201, configured to receive a program code sent by a local agent program of a terminal and used to control operation of the first controlled device; wherein the program code is collected by the local agent program;
a second processing module 1202, configured to run the program code.
In an embodiment, the second processing module 1202 is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
In an embodiment, the feedback information comprises results of the execution of the program code.
In an embodiment, the second processing module 1202 is further configured to:
generating a control instruction of a second controlled device by running the program code, wherein the first controlled device and the second controlled device form a communication connection;
sending the control instruction of the second controlled device to the second controlled device, and enabling the second controlled device to execute the control instruction of the second controlled device;
receiving an instruction execution result sent by the second controlled device, generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device; and the instruction execution result is obtained after the second controlled device executes the control instruction of the second controlled device.
In an embodiment, the second processing module 1202 is further configured to obtain operating mode switching information; controlling the first controlled equipment to switch the working mode according to the working mode switching information; sending the working mode switching information to second controlled equipment to enable the second controlled equipment to switch the working mode based on the working mode switching information; the working mode switching information is used for indicating the first controlled device and the second controlled device to switch the current working mode to a target working mode.
In one embodiment, the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
the second processing module 1202 is configured to generate a control instruction of the first controlled device according to the image detection result, where the control instruction of the first controlled device is a human body tracking instruction; and controlling the motion state of the first controlled equipment according to the human body tracking instruction.
In an embodiment, the second processing module 1202 is configured to: generate a control instruction for controlling the first controlled device to advance, in response to the image detection result indicating that the human body becomes smaller; generate a control instruction for controlling the first controlled device to keep still, in response to the image detection result indicating that the human body becomes larger; generate a control instruction for controlling the first controlled device to turn left, in response to the image detection result indicating that the human body is located on the left side of the first controlled device; and generate a control instruction for controlling the first controlled device to turn right, in response to the image detection result indicating that the human body is located on the right side of the first controlled device.
In one embodiment, the first controlled device and the second controlled device are connected by a wired connection.
The receiving module 1201 and the second processing module 1202 may be implemented by a processor located in the first controlled device, where the processor is at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor.
In addition, each functional module in this embodiment may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware or a form of a software functional module.
It is understood that the technical solution of the present embodiment, in essence or in the part contributing to the prior art, may be embodied wholly or partly in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method of the present embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Specifically, the computer program instructions corresponding to the other device control method in the present embodiment may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive, and when the computer program instructions corresponding to the other device control method in the storage medium are read or executed by an electronic device, any one of the device control methods of the foregoing fourth embodiment is implemented.
Based on the same technical concept of the foregoing embodiment, referring to fig. 13, it shows an electronic device provided by an embodiment of the present disclosure, which may include: a second memory 1301 and a second processor 1302; wherein,
the second memory 1301 for storing computer programs and data;
the second processor 1302 is configured to execute the computer program stored in the second memory to implement any one of the device control methods of the fourth embodiment.
In practical applications, the second memory 1301 may be a volatile memory, such as a RAM; or a nonvolatile memory such as a ROM, a flash memory, an HDD, or an SSD; or a combination of the above types of memories and provides instructions and data to the second processor 1302.
The second processor 1302 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understood that the electronic device for implementing the processor function may be other electronic devices, and the embodiments of the present disclosure are not limited in particular.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments; for specific implementation, reference may be made to the description of the above method embodiments, and details are not repeated here for brevity.
The foregoing description of the various embodiments tends to emphasize the differences between the embodiments; for the same or similar parts, reference may be made between the embodiments, and details are not repeated here for brevity.
The methods disclosed in the method embodiments provided by the present application can be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in various product embodiments provided by the application can be combined arbitrarily to obtain new product embodiments without conflict.
The features disclosed in the various method or apparatus embodiments provided herein may be combined in any combination to arrive at new method or apparatus embodiments without conflict.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present disclosure.
While the embodiments of the present disclosure have been described in connection with the drawings, the present disclosure is not limited to the specific embodiments described above, which are intended to be illustrative rather than limiting, and it will be apparent to those of ordinary skill in the art in light of the present disclosure that many more modifications can be made without departing from the spirit of the disclosure and the scope of the appended claims.

Claims (10)

1. An apparatus control method, characterized in that the method comprises:
acquiring, through a local agent program, a program code for controlling a first controlled device to work;
and sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
2. The method of claim 1, further comprising:
and after the program code is sent to the first controlled device, receiving feedback information of the first controlled device, wherein the feedback information is generated after the first controlled device runs the program code.
3. The method of claim 2, further comprising:
and after receiving the feedback information of the first controlled device, loading and/or displaying the feedback information at a terminal where the local agent program is located.
4. An apparatus control method applied to a first controlled apparatus, the method comprising:
receiving a program code which is sent by a local agent program of a terminal and used for controlling the first controlled device to work; wherein the program code is collected by the local agent program;
and executing the program code.
5. A terminal, characterized in that the terminal comprises an acquisition module and a first processing module, wherein,
the acquisition module is used for acquiring, through a local agent program, a program code for controlling a first controlled device to work;
and the first processing module is used for sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
6. A first controlled device, characterized in that it comprises a receiving module and a second processing module, wherein,
the receiving module is used for receiving a program code which is sent by a local agent program of the terminal and is used for controlling the first controlled device to work; wherein the program code is collected by the local agent program;
and the second processing module is used for operating the program code.
7. A terminal, characterized in that the terminal comprises a processor and a memory for storing a computer program capable of running on the processor; wherein,
the processor, when executing the computer program, is adapted to perform the method of claim 1.
8. An electronic device, characterized in that the device comprises a processor and a memory for storing a computer program executable on the processor; wherein,
the processor, when being configured to execute the computer program, is configured to perform the method of claim 4.
9. A computer storage medium on which a computer program is stored, characterized in that the computer program realizes the method of claim 1 when executed by a processor.
10. A computer storage medium on which a computer program is stored, characterized in that the computer program realizes the method of claim 4 when executed by a processor.
CN201910578882.2A 2019-06-28 2019-06-28 Apparatus control method, terminal, controlled plant, electronic equipment and storage medium Pending CN110297472A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910578882.2A CN110297472A (en) 2019-06-28 2019-06-28 Apparatus control method, terminal, controlled plant, electronic equipment and storage medium
PCT/CN2020/091915 WO2020259154A1 (en) 2019-06-28 2020-05-22 Device control method, terminal, controlled device, electronic device, medium, and program
TW109121221A TWI743853B (en) 2019-06-28 2020-06-22 Device control method, electronic device and medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910578882.2A CN110297472A (en) 2019-06-28 2019-06-28 Apparatus control method, terminal, controlled plant, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110297472A true CN110297472A (en) 2019-10-01

Family

ID=68029509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910578882.2A Pending CN110297472A (en) 2019-06-28 2019-06-28 Apparatus control method, terminal, controlled plant, electronic equipment and storage medium

Country Status (3)

Country Link
CN (1) CN110297472A (en)
TW (1) TWI743853B (en)
WO (1) WO2020259154A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112860522A (en) * 2021-03-02 2021-05-28 北京梧桐车联科技有限责任公司 Program operation monitoring method, device and equipment
CN114070659B (en) * 2021-10-29 2023-11-17 深圳市优必选科技股份有限公司 Equipment locking method and device and terminal equipment
CN115118699A (en) * 2022-06-21 2022-09-27 国仪量子(合肥)技术有限公司 Data transmission method, device, system, upper computer and storage medium
CN116661400A (en) * 2023-07-14 2023-08-29 广船国际有限公司 Control method, device, equipment and medium of target equipment based on machine vision


Family Cites Families (18)

Publication number Priority date Publication date Assignee Title
US6853867B1 (en) * 1998-12-30 2005-02-08 Schneider Automation Inc. Interface to a programmable logic controller
WO2004114032A1 (en) * 2003-06-25 2004-12-29 Avecontrol Oy Signal processing method and signal processing system
US7950010B2 (en) * 2005-01-21 2011-05-24 Sap Ag Software deployment system
US20060168575A1 (en) * 2005-01-21 2006-07-27 Ankur Bhatt Defining a software deployment
TWI337691B (en) * 2007-07-24 2011-02-21 Apparatus and method for positioning control
US8914783B2 (en) * 2008-11-25 2014-12-16 Fisher-Rosemount Systems, Inc. Software deployment manager integration within a process control system
JP5240141B2 (en) * 2009-09-14 2013-07-17 株式会社リコー Program download system, program download method, image forming apparatus, program distribution server, and download program
TW201233096A (en) * 2011-01-31 2012-08-01 Fuither Tech Co Ltd Remote assistance service method for embedded operation system
CN102981758B (en) * 2012-11-05 2015-05-13 福州瑞芯微电子有限公司 Connection method between electronic devices
CN104808600B (en) * 2014-01-26 2017-07-07 广东美的制冷设备有限公司 Controlled terminal multi-control modes self-adaptation control method, system and control terminal
CN204965043U (en) * 2015-09-23 2016-01-13 苏州工业园区宅艺智能科技有限公司 Intelligent house control system based on cloud platform
CN106651384A (en) * 2015-10-30 2017-05-10 阿里巴巴集团控股有限公司 Sample quality detecting method, detecting data recording method, sample quality detecting device and detecting data recording system
CN105760106B (en) * 2016-03-08 2019-01-15 网易(杭州)网络有限公司 A kind of smart home device exchange method and device
CN105871670A (en) * 2016-05-20 2016-08-17 珠海格力电器股份有限公司 Control method, device and system of terminal equipment
US20170351226A1 (en) * 2016-06-01 2017-12-07 Rockwell Automation Technologies, Inc. Industrial machine diagnosis and maintenance using a cloud platform
CN108958103A (en) * 2018-06-25 2018-12-07 珠海格力电器股份有限公司 Control method, controlled method and device, intelligent terminal and intelligent electric appliance
CN109166404A (en) * 2018-10-12 2019-01-08 山东爱泊客智能科技有限公司 The implementation method and device of self-editing process control based on shared controllable model
CN110297472A (en) * 2019-06-28 2019-10-01 上海商汤智能科技有限公司 Apparatus control method, terminal, controlled plant, electronic equipment and storage medium

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
CN102520665A (en) * 2011-12-23 2012-06-27 中国科学院自动化研究所 Open robot demonstration device and robot control system
CN104175308A (en) * 2014-08-12 2014-12-03 湖南信息职业技术学院 Self-service robot
CN104932298A (en) * 2015-04-28 2015-09-23 中国地质大学(武汉) Controller of teaching robot
CN105045153A (en) * 2015-07-31 2015-11-11 中国地质大学(武汉) Three-mode control system based on mobile robot platform
CN105488815A (en) * 2015-11-26 2016-04-13 北京航空航天大学 Real-time object tracking method capable of supporting target size change
CN105760824A (en) * 2016-02-02 2016-07-13 北京进化者机器人科技有限公司 Moving body tracking method and system
CN106737676A (en) * 2016-12-28 2017-05-31 南京埃斯顿机器人工程有限公司 It is a kind of based on script can secondary development industrial robot programing system

Non-Patent Citations (4)

Title
R. Patrick Goebel: "ROS By Example" (《ROS入门实例》), 31 January 2016 *
Wang Lijia et al.: "Mobile robot target tracking using block-based multi-feature target descriptors", Control and Decision (《控制与决策》) *
Chen Hongxing et al.: "Android smartphone remote-control software system for mobile robots", PLC & FA *
Han Jianhai: "Industrial Robots" (《工业机器人》), 30 September 2009 *

Cited By (8)

Publication number Priority date Publication date Assignee Title
WO2020259154A1 (en) * 2019-06-28 2020-12-30 上海商汤智能科技有限公司 Device control method, terminal, controlled device, electronic device, medium, and program
CN111262912A (en) * 2020-01-09 2020-06-09 北京邮电大学 System, method and device for controlling vehicle motion
CN112001827A (en) * 2020-09-25 2020-11-27 上海商汤临港智能科技有限公司 Teaching aid control method and device, teaching equipment and storage medium
CN112001827B (en) * 2020-09-25 2024-09-13 上海商汤临港智能科技有限公司 Teaching aid control method and device, teaching equipment and storage medium
CN113903218A (en) * 2021-10-21 2022-01-07 深圳市优必选科技股份有限公司 Programming method, electronic device and computer readable storage medium
CN113903218B (en) * 2021-10-21 2024-03-15 优必选(湖北)科技有限公司 Educational programming method, electronic equipment and computer readable storage medium
CN115442450A (en) * 2022-08-24 2022-12-06 山东浪潮科学研究院有限公司 Cloud sharing method and storage medium for programmable artificial intelligent vehicle
CN115827516A (en) * 2023-02-03 2023-03-21 北京融合未来技术有限公司 Equipment control method and device, data acquisition system, equipment and medium

Also Published As

Publication number Publication date
TW202122944A (en) 2021-06-16
TWI743853B (en) 2021-10-21
WO2020259154A1 (en) 2020-12-30

Similar Documents

Publication Publication Date Title
CN110297472A (en) Apparatus control method, terminal, controlled plant, electronic equipment and storage medium
JP6126738B2 (en) Hand shaking control method, apparatus and system
CN111283680B (en) System and method for wireless remote control of robot
US10591999B2 (en) Hand gesture recognition method, device, system, and computer storage medium
WO2022127829A1 (en) Self-moving robot, and path planning method, apparatus and device therefor, and storage medium
JP6932852B2 (en) Data communication method and human-computer interaction system
CN109992111B (en) Augmented reality extension method and electronic device
CN112543960A (en) Information processing apparatus, mediation apparatus, simulation system, and information processing method
US20240069550A1 (en) Method for processing abnormality of material pushing robot, device, server, and storage medium
CN110737798A (en) Indoor inspection method and related product
CN111427340A (en) Sweeper, server, sweeper control method and control system
CN104239842B (en) A kind of methods, devices and systems for realizing visual identity
CN109324605A (en) A kind of flying vehicles control method, apparatus, equipment and storage medium
Kool et al. Visual machine intelligence for home automation
CN111061371A (en) Control method and device of electronic painted screen, mobile terminal and storage medium
Chamberlain et al. A distributed robotic vision service
KR20210072463A (en) Method of human-machine interaction, and device for the same
CN213122965U (en) Test system of vehicle-mounted networking terminal
CN114610143A (en) Method, device, equipment and storage medium for equipment control
CN110554966A (en) Drive debugging method, behavior analysis method and drive debugging system
CN113043268A (en) Robot eye calibration method, device, terminal, system and storage medium
CN115147785B (en) Vehicle identification method and device, electronic equipment and storage medium
CN111152230B (en) Robot teaching method, system, teaching robot and storage medium
CN206226618U (en) A kind of embedded RGB D video flowing acquisition systems
CN117731205A (en) Cleaning equipment operation control method and device and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191001