WO2020259154A1 - Device control method, terminal, controlled device, electronic device, medium, and program - Google Patents
Device control method, terminal, controlled device, electronic device, medium, and program
- Publication number
- WO2020259154A1 (PCT/CN2020/091915)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- controlled device
- program code
- control instruction
- program
- terminal
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 106
- 238000001514 detection method Methods 0.000 claims description 64
- 238000012545 processing Methods 0.000 claims description 59
- 238000004891 communication Methods 0.000 claims description 40
- 238000004590 computer program Methods 0.000 claims description 30
- 230000004044 response Effects 0.000 claims description 28
- 230000015654 memory Effects 0.000 claims description 26
- 230000033001 locomotion Effects 0.000 claims description 12
- 238000010586 diagram Methods 0.000 description 16
- 230000006870 function Effects 0.000 description 15
- 238000005516 engineering process Methods 0.000 description 9
- 230000003993 interaction Effects 0.000 description 7
- 230000008569 process Effects 0.000 description 7
- 230000001360 synchronised effect Effects 0.000 description 7
- 230000003287 optical effect Effects 0.000 description 5
- 230000005540 biological transmission Effects 0.000 description 4
- 238000011161 development Methods 0.000 description 4
- 230000018109 developmental process Effects 0.000 description 4
- 230000008859 change Effects 0.000 description 3
- 238000007689 inspection Methods 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000013135 deep learning Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 230000010365 information processing Effects 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 238000009333 weeding Methods 0.000 description 1
Images
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/4183—Total factory control characterised by data acquisition, e.g. workpiece identification
- G05B19/41835—Total factory control characterised by programme execution
- G05B19/41845—Total factory control characterised by system universality, reconfigurability, modularity
- G05B19/4185—Total factory control characterised by the network communication
- G05B19/41855—Total factory control characterised by the network communication by local area network [LAN], network structure
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0231—Control specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0276—Control specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control specially adapted to land vehicles using satellite positioning signals, e.g. GPS
- G05D1/0287—Control specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
- G05D1/12—Target-seeking control
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- This application relates to smart device control technology, in particular to a device control method, terminal, controlled device, electronic device, computer storage medium, and computer program.
- the remote control of smart devices can be applied in many fields; for example, it can be applied in the fields of remote control and programming education.
- in related solutions, corresponding control instructions can be sent through fixed command buttons.
- the embodiments of the present application propose a device control method, terminal, controlled device, electronic device, computer storage medium, and computer program.
- the embodiment of the present application provides a device control method, the method includes:
- the program code is sent to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- the method further includes:
- the feedback information of the first controlled device is received, and the feedback information is generated by the first controlled device after running the program code.
- in this way, the running status of the above program code can be learned, and the program code can then be processed conveniently; for example, when the running status of the program code does not meet expectations, the program code can be modified.
- the method further includes:
- the feedback information includes the execution result of the program code.
- the obtaining program code for controlling the operation of the controlled device through a local agent program includes:
- the program code submitted by the user based on the web (WEB) page is obtained through the local agent program.
- the local agent program can collect the corresponding program code; it can be seen that the user can submit the program code more conveniently based on the WEB page.
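The collect-and-push workflow of the local agent program can be sketched as follows; the class and method names (`LocalAgent`, `collect_code`, `push`) and the callback-based transport are illustrative assumptions, not names used in this application.

```python
class LocalAgent:
    """Sketch of a local agent that collects program code submitted from a
    WEB page and pushes it to the first controlled device."""

    def __init__(self, send_to_device):
        # send_to_device: hypothetical transport callback, e.g. a socket
        # send wrapper over the communication connection to the device
        self.send_to_device = send_to_device
        self.pending = []

    def collect_code(self, submitted_code: str) -> None:
        """Collect program code submitted by the user via the WEB page."""
        self.pending.append(submitted_code)

    def push(self) -> int:
        """Send all collected program code to the controlled device."""
        sent = 0
        while self.pending:
            self.send_to_device(self.pending.pop(0))
            sent += 1
        return sent
```

In a real deployment the transport callback would wrap the pre-established communication connection to the first controlled device rather than an in-memory list.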
- the embodiment of the present application also provides another device control method, which is applied to the first controlled device, and the method includes:
- the method further includes:
- the terminal can learn about the operation of the program code, and then facilitate processing of the program code. For example, when the operation of the program code does not meet expectations, the program code can be modified.
- the feedback information includes the execution result of the program code.
- the method further includes:
- the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
- cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
- the method further includes:
- the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode;
- according to the working mode switching information, the first controlled device is controlled to switch its working mode; the working mode switching information is sent to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information.
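The mode-switch propagation described above can be sketched as follows; the `ControlledDevice` class, the `peer` link standing in for the connection to the second controlled device, and the dictionary encoding of the switching information are all assumptions for illustration.

```python
class ControlledDevice:
    """Sketch: a device that applies a working-mode switch and forwards
    the switching information to a peer (second controlled) device."""

    def __init__(self, mode: str = "idle", peer=None):
        self.mode = mode
        self.peer = peer  # the second controlled device, if any

    def switch_mode(self, switching_info: dict) -> None:
        """Apply the target working mode, then forward the info to the peer."""
        self.mode = switching_info["target_mode"]
        if self.peer is not None:
            self.peer.switch_mode(switching_info)
```

After the first device calls `switch_mode`, both devices end up in the target working mode, matching the coordinated switch the text describes.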
- the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result.
- the generating the control instruction of the first controlled device according to the instruction execution result and executing the control instruction of the first controlled device includes:
- a human body tracking instruction is generated according to the image detection result; according to the human body tracking instruction, the movement state of the first controlled device is controlled.
- human body tracking can be implemented on the basis of the coordinated control of the first controlled device and the second controlled device.
- the generating a control instruction of the first controlled device according to the image detection result includes:
- in response to the image detection result indicating that the human body becomes smaller, a control instruction for controlling the first controlled device to advance is generated; in response to the image detection result indicating that the human body becomes larger, a control instruction for controlling the first controlled device to remain still is generated; in response to the image detection result indicating that the human body is on the left side of the first controlled device, a control instruction for controlling the first controlled device to turn left is generated; in response to the image detection result indicating that the human body is on the right side of the first controlled device, a control instruction for controlling the first controlled device to turn right is generated.
- the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
- the first controlled device and the second controlled device are connected through a wired connection.
- An embodiment of the present application also provides a terminal, the terminal includes an acquisition module and a first processing module, wherein:
- the obtaining module is configured to obtain, through the local agent program, program code for controlling the operation of the first controlled device;
- the first processing module is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- the first processing module is further configured to receive feedback information of the first controlled device after sending the program code to the first controlled device, where the feedback information is generated after the first controlled device runs the program code.
- in this way, the running status of the above program code can be learned, and the program code can then be processed conveniently; for example, when the running status of the program code does not meet expectations, the program code can be modified.
- the first processing module is further configured to load and/or display the feedback information on the terminal where the local agent program is located, after receiving the feedback information of the first controlled device.
- the feedback information includes the execution result of the program code.
- the obtaining module is configured to obtain the program code submitted by the user based on the WEB page through the local agent program.
- the local agent program can collect the corresponding program code; it can be seen that the user can submit the program code more conveniently based on the WEB page.
- An embodiment of the present application also provides a first controlled device.
- the first controlled device includes a receiving module and a second processing module, wherein:
- the receiving module is configured to receive the program code sent by the terminal through the local agent program for controlling the operation of the first controlled device; the program code is collected by the local agent program;
- the second processing module is configured to run the program code.
- the second processing module is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
- the terminal can learn about the operation of the program code, and then facilitate processing of the program code. For example, when the operation of the program code does not meet expectations, the program code can be modified.
- the feedback information includes the execution result of the program code.
- the second processing module is further configured to:
- the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
- cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
- the second processing module is further configured to obtain working mode switching information; control the first controlled device to switch its working mode according to the working mode switching information; and send the working mode switching information to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information; the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode.
- the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result.
- the second processing module is configured to generate a control instruction of the first controlled device according to the image detection result, where the control instruction of the first controlled device is a human body tracking instruction, and to control the movement state of the first controlled device according to the human body tracking instruction.
- human body tracking can be implemented based on the coordinated control of the first controlled device and the second controlled device.
- the second processing module is configured to: in response to the image detection result indicating that the human body becomes smaller, generate a control instruction for controlling the first controlled device to advance; in response to the image detection result indicating that the human body becomes larger, generate a control instruction for controlling the first controlled device to remain still; in response to the image detection result indicating that the human body is on the left side of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; in response to the image detection result indicating that the human body is on the right side of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
- the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
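The four decision rules above can be sketched as a single mapping; the string encoding of the image detection result (`size_change`, `position`) and the instruction names are assumed representations for illustration, not the application's actual formats.

```python
def tracking_instruction(size_change: str, position: str) -> str:
    """Map an image detection result to a human-tracking motion instruction,
    following the four cases described in the text."""
    if size_change == "smaller":   # human appears smaller -> farther away -> advance
        return "advance"
    if size_change == "larger":    # human appears larger -> too close -> remain still
        return "still"
    if position == "left":         # human on the left of the device -> turn left
        return "turn_left"
    if position == "right":        # human on the right of the device -> turn right
        return "turn_right"
    return "still"                 # no change detected -> hold position
```

The size cases are checked before the position cases here; the text lists the cases without specifying a priority, so that ordering is a design assumption.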
- the first controlled device and the second controlled device are connected through a wired connection.
- An embodiment of the present application also provides a terminal, the terminal including a processor and a memory for storing a computer program that can run on the processor; wherein, when the processor runs the computer program, any one of the above-mentioned device control methods applied to the terminal is executed.
- An embodiment of the present application also provides an electronic device, which includes a processor and a memory for storing a computer program that can run on the processor; wherein,
- when the processor runs the computer program, it executes any one of the aforementioned device control methods applied to the first controlled device.
- the embodiment of the present application also provides a computer storage medium on which a computer program is stored, and when the computer program is executed by a processor, any one of the above device control methods is implemented.
- the embodiment of the present application also provides a computer program, including computer-readable code; when the computer-readable code runs in the terminal, the processor in the terminal executes any one of the above-mentioned device control methods applied to the terminal.
- the embodiments of the present application also provide another computer program, including computer-readable code; when the computer-readable code runs in the electronic device, the processor in the electronic device executes any one of the above device control methods.
- the program code for controlling the operation of the first controlled device is obtained through a local agent program;
- the program code is sent to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- in the technical solutions of the embodiments of the present application, the program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
- FIG. 1 is a first flowchart of a device control method according to an embodiment of the application;
- FIG. 2 is a schematic diagram of the connection relationship between the EV3 smart car and the Raspberry Pi in an embodiment of the application;
- FIG. 3 is a schematic diagram of the connection relationship between the EV3 smart car and the terminal in an embodiment of the application;
- FIG. 5 is a schematic diagram of the functional flow of the agent program in an embodiment of the application;
- FIG. 6 is a schematic diagram of the processing flow of the EV3 smart car and the Raspberry Pi in the human tracking mode in the embodiment of the application;
- FIG. 7 is a schematic diagram of a processing flow for feedback information in an embodiment of the application.
- FIG. 8 is a third flowchart of a device control method according to an embodiment of the application.
- FIG. 9 is a fourth flowchart of a device control method according to an embodiment of this application.
- FIG. 10 is a schematic diagram of the composition structure of a terminal according to an embodiment of the application.
- FIG. 11 is a schematic structural diagram of another terminal according to an embodiment of the application.
- FIG. 12 is a schematic diagram of the composition structure of a first controlled device according to an embodiment of the application.
- FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the application.
- the terms "include", "comprise" or any other variations thereof are intended to cover non-exclusive inclusion, so that a method or device including a series of elements includes not only the explicitly stated elements, but also other elements that are not explicitly listed, or elements inherent to the implementation of the method or device. Without more restrictions, an element defined by the sentence "including a" does not exclude the existence of other related elements (such as steps in the method or units in the device; for example, the unit may be part of a circuit, part of a processor, part of a program or software, etc.) in the method or device that includes that element.
- the device control method provided by the embodiments of the present application includes a series of steps, but is not limited to the recorded steps.
- similarly, the terminal and controlled device provided by the embodiments of the present application include a series of modules, but are not limited to the explicitly recorded modules; they may also include modules that need to be set to obtain related information or perform processing based on the information.
- the embodiments of the present application can be applied to a control system composed of a terminal and a controlled device, and can be operated with many other general or dedicated computing system environments or configurations.
- the terminal can be a thin client, a thick client, a handheld or laptop device, a microprocessor-based system, a set-top box, a programmable consumer electronic product, a network personal computer, a small computer system, etc.
- the controlled device can be an electronic device such as a smart car.
- the device control method provided in the embodiments of the present application may also be implemented by a processor executing computer program code.
- Electronic devices such as terminals and controlled devices can be described in the general context of computer system executable instructions (such as program modules) executed by the computer system.
- program modules may include routines, programs, object programs, components, logic, data structures, etc., which perform specific tasks or implement specific abstract data types.
- the educational robot is a universal teaching aid for modern programming education, which can be controlled through a WEB or mobile application (APP) client; related smart device control solutions usually rely on Bluetooth and wireless video transmission, with information processing performed on the server side; in the first exemplary solution, a mind-controlled video car system based on Wi-Fi (Wireless Fidelity) communication is proposed.
- the system consists of four parts: an EEG acquisition card, a smart phone, a mobile car and a personal computer.
- the connection between the smart phone and the EEG acquisition card is realized through Bluetooth, and remote control of the mobile car can be realized; in the second exemplary solution, a remote-controlled weeding robot is proposed.
- a camera and a Global Positioning System (GPS) chip can be installed on the robot body.
- the car completes the video capture and then transmits it to the host computer through wireless communication.
- the instructions are issued to the robot; in the third exemplary solution, a remote management solution for a sweeping robot is proposed.
- the communication module can be used to wirelessly transmit the device information to the network management server.
- the management platform provides services to users.
- the solution mainly uses WEB to provide related services to facilitate users' management of equipment.
- in the related solutions, command transmission can only be accomplished through wireless communication, and fixed control commands can only be sent through fixed command buttons; the communication quality of the wireless network will affect the communication between the upper and lower computers.
- remote control of smart devices based on the WEB is an efficient and convenient solution, which plays an extremely important role in the fields of remote control and education; but in order to achieve accurate and real-time control, many problems still need to be resolved. Firstly, the upper and lower computers are connected wirelessly, but the wireless network is unstable and has a high packet loss rate, which causes serious problems in scenarios with higher performance requirements and limits the application scenarios. Secondly, related technologies only implement the transmission of control instructions from the upper computer to the lower computer, and cannot collect or process feedback information effectively. The issuance of control instructions on the WEB side of the upper computer, and the collection and reporting of the feedback information of the instructions executed by the lower computer, also require a flexibly designed and complex architecture.
- FIG. 1 is a flowchart 1 of the device control method of an embodiment of the application. As shown in FIG. 1, the flow may include:
- Step 101: The terminal obtains the program code for controlling the operation of the first controlled device through the local agent program;
- the first controlled device may be an electronic device such as a smart car, an embedded development device, or a smart robot; the first controlled device may include a sensor, a camera, and other devices, and the terminal may provide a human-computer interaction interface.
- the first controlled device may be a separate electronic device or an integrated electronic device.
- a local agent program can be installed in the terminal.
- the local agent program is at least used to collect the program code submitted by the user.
- the program code submitted by the user may be program code used to control the operation of the first controlled device; in this way, by running the local agent program, the program code used to control the operation of the first controlled device can be collected.
- the program code submitted by the user based on the WEB page can be obtained through the local agent program; that is, the user can submit the program code for controlling the operation of the first controlled device through the WEB page of the terminal, and the local agent program can collect the corresponding program code; it can be seen that the user can submit the program code more conveniently based on the WEB page.
- the type of the program code is not limited.
- the type of the program code can be predetermined according to actual application requirements.
- the program code can be Python code or other program code.
- Step 102: The terminal sends the program code to the first controlled device through the local agent program.
- the communication connection between the terminal and the first controlled device can be established in advance.
- the embodiment of this application does not limit the connection mode of the communication connection between the terminal and the first controlled device.
- the devices can perform data interaction through a wired connection or various wireless connection methods.
- the above-mentioned local agent program may be used to establish a communication connection between the terminal and the first controlled device.
- the terminal may broadcast Address Resolution Protocol (ARP) data packets in the local area network; upon receiving an ARP data packet, the first controlled device in the LAN will reply with a corresponding data packet, and the above local agent program can parse the data packet replied by the first controlled device to obtain the Internet Protocol (IP) address of the first controlled device, so that a communication connection between the terminal and the first controlled device can be established; in some embodiments of the present application, when there are multiple controlled devices, if the above-mentioned local agent program receives data packets from multiple controlled devices, it can generate a smart device list representing the multiple controlled devices, and the user can select at least one controlled device from the smart device list; based on the local agent program, a communication connection between the terminal and the selected controlled device can be established.
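The discovery step described above can be sketched as follows, with an ARP reply modeled as a hypothetical `(name, ip)` pair already parsed by the local agent program; the real packet parsing and broadcast are omitted.

```python
def build_device_list(replies):
    """Deduplicate parsed ARP replies into the smart device list the
    local agent presents to the user for selection."""
    devices, seen = [], set()
    for name, ip in replies:
        if ip not in seen:       # one entry per discovered IP address
            seen.add(ip)
            devices.append({"name": name, "ip": ip})
    return devices
```

The user would then pick one or more entries from this list, and the agent would open a communication connection to each selected IP address.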
- the communication connection between the terminal and the first controlled device can also be monitored to learn the status of the communication connection between the terminal and the first controlled device; for example, based on the above-mentioned local agent program, the terminal can continuously broadcast ARP data packets in the local area network.
- the terminal can determine the status of the communication connection between the terminal and the first controlled device based on the response to the ARP data packet; for example, after the terminal sends an ARP data packet to the first controlled device, if no reply data packet for the ARP data packet is received within a set time period, it can be considered that the communication connection between the terminal and the first controlled device is interrupted; if a reply data packet for the ARP data packet is received within the set time, it can be considered that the terminal and the first controlled device maintain a communication connection; the set time can be chosen according to actual needs, for example, 1 to 3 seconds.
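The interrupted / connected decision described above reduces to a timestamp comparison; a minimal sketch, where the probe sender and reply waiter are injected stand-ins for the actual ARP machinery:

```python
REPLY_TIMEOUT_S = 2.0   # the "set time"; the text suggests 1 to 3 seconds

def connection_alive(last_reply_time, now, timeout=REPLY_TIMEOUT_S):
    """The connection is considered interrupted once no reply packet
    has arrived within `timeout` seconds."""
    return (now - last_reply_time) <= timeout

def monitor_round(send_probe, wait_reply, timeout=REPLY_TIMEOUT_S):
    """One monitoring round: send a probe, then report the link status.
    `send_probe` broadcasts an ARP-style packet; `wait_reply(timeout)` blocks
    up to `timeout` seconds and returns True if a reply arrived."""
    send_probe()
    return "connected" if wait_reply(timeout) else "interrupted"
```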
- the terminal may send the above program code to the first controlled device according to the pre-established communication connection between the terminal and the first controlled device.
- Step 103 The first controlled device receives and runs the above program code.
- the first controlled device can run the pushed program code to generate corresponding control instructions, and then execute the control instructions; since the control instructions are generated based on the program code, the terminal can control the first controlled device through the above program code.
- when the first controlled device includes a smart car, a control instruction can be generated that indicates the motion state of the smart car.
- step 101 to step 102 can be implemented based on the processor of the terminal, and step 103 can be implemented based on the processor of the first controlled device.
- the aforementioned processor can be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor.
- the program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
- the first controlled device runs the program code to generate feedback information, and then the feedback information can be sent to the terminal.
- the feedback information may be used to indicate feedback on the running status of the program code.
- the feedback information may include the execution result of the above-mentioned program code.
- the terminal can learn about the operation of the program code, and then facilitate processing of the program code. For example, when the operation of the program code does not meet expectations, the program code can be modified.
- the terminal may load the feedback information and/or display the feedback information on the display interface of the terminal.
- the implementation manner of loading feedback information by the terminal is not limited.
- the terminal loads the feedback information in a synchronous loading or asynchronous loading manner.
- the terminal may load and/or display the feedback information on the WEB front end, so that the user can intuitively learn the operation status of the above program code through the WEB front end.
- the first controlled device may generate the control instruction of the second controlled device by running the program code, wherein the first controlled device and the second controlled device form a communication connection;
- one second controlled device or multiple second controlled devices may be provided; the second controlled device may be an electronic device such as a Raspberry Pi, and may include devices such as a sensor and a camera.
- the second controlled device and the first controlled device can be of the same or different types.
- after the first controlled device generates the control instruction of the second controlled device, it can send the control instruction to the second controlled device; the second controlled device can execute the control instruction of the second controlled device to obtain an instruction execution result.
- the second controlled device may send the instruction execution result to the first controlled device; the first controlled device may generate the control instruction of the first controlled device according to the instruction execution result, and execute the control instruction of the first controlled device.
- the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
- cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
- the first controlled device and the second controlled device are connected through a wired connection, for example, through a serial port; in this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
- the first controlled device may also obtain working mode switching information; the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode;
- the first controlled device can control the first controlled device to switch the working mode according to the working mode switching information; the first controlled device can also send the working mode switching information to the second controlled device; the second controlled device is in After receiving the working mode switching information, the working mode can be switched based on the working mode switching information.
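A minimal sketch of this synchronized switch on the first controlled device; the mode names and the `SWITCH:` wire format are illustrative assumptions, and `send_fn` stands in for whatever writes to the link to the second controlled device:

```python
MODES = ("human_tracking", "idle")   # example mode names, not the actual mode table

class ModeController:
    """Applies a working-mode switch locally, then forwards the same switching
    information so the second controlled device changes mode in sync."""
    def __init__(self, send_fn, initial=MODES[0]):
        self.mode = initial
        self.send_fn = send_fn   # e.g. writes to the serial link to the second device

    def switch(self, target_mode):
        if target_mode not in MODES:
            raise ValueError(f"unknown working mode: {target_mode}")
        self.mode = target_mode                           # switch itself first
        self.send_fn(("SWITCH:" + target_mode).encode())  # then notify the peer
        return self.mode
```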
- in the embodiment of the present application, multiple working modes can be set for each controlled device, and the behavior of the controlled device differs between working modes. It can be seen that the embodiment of the present application can realize synchronous switching of the working modes of the first controlled device and the second controlled device.
- in the case where the control instruction of the second controlled device is an image detection instruction and the instruction execution result is an image detection result, the first controlled device may generate a human body tracking instruction according to the received image detection result; the motion state of the first controlled device can then be controlled according to the human body tracking instruction, so as to realize human body tracking.
- human body tracking can be implemented on the basis of the coordinated control of the first controlled device and the second controlled device.
- the first controlled device generates a control instruction for controlling the first controlled device to advance in response to the image detection result indicating that the human body becomes smaller; generates a control instruction for controlling the first controlled device to remain stationary in response to the image detection result indicating that the human body becomes larger; generates a control instruction for controlling the first controlled device to turn left in response to the image detection result indicating that the human body is located on the left side of the first controlled device; and generates a control instruction for controlling the first controlled device to turn right in response to the image detection result indicating that the human body is located on the right side of the first controlled device.
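The four rules above are a pure mapping from detection result to motion command; a sketch, where the `detection` dict fields (`scale_change`, `side`) are hypothetical names for whatever the image detection result actually carries:

```python
def tracking_command(detection):
    """Map an image detection result to a motion control instruction."""
    if detection.get("scale_change") == "smaller":
        return "forward"     # body appears smaller -> target moved away -> advance
    if detection.get("scale_change") == "larger":
        return "stop"        # body appears larger -> target is close -> stay still
    if detection.get("side") == "left":
        return "turn_left"   # body on the left of the device
    if detection.get("side") == "right":
        return "turn_right"  # body on the right of the device
    return "stop"            # no actionable change detected
```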
- the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
- the first controlled device can include an EV3 smart car, motors, related sensors and other devices; the EV3 smart car can drive the related sensors to move through the motors; the second controlled device can include a Raspberry Pi, motors, a camera and other devices;
- Figure 2 is a schematic diagram of the connection relationship between the EV3 smart car and the Raspberry Pi in an embodiment of the application. As shown in Figure 2, the Raspberry Pi 201 can be connected to the camera 202 through a Universal Serial Bus (USB) connection.
- the Raspberry Pi 201 can also rotate the camera through motor driving; the Raspberry Pi 201 can be connected to the EV3 smart car 203 through a serial port adapter; the EV3 smart car 203 can also be connected to a related sensor 204, a first driving wheel 205 and a second driving wheel 206; in one example, the related sensor 204 can be a touch sensor, and the trigger signal generated when the touch sensor is touched can be used to modify the function modes of the Raspberry Pi 201 and the EV3 smart car 203 simultaneously; the EV3 smart car 203 can also drive the first driving wheel 205 and the second driving wheel 206 to work; in some embodiments of the present application, the touch sensor can send a trigger signal to the directly connected EV3 smart car 203, which then changes the function mode flag bit of the EV3 smart car 203 and sends the flag bit to the Raspberry Pi 201 through the serial port, which in turn triggers the change of the function mode flag bit of the Raspberry Pi 201, thereby realizing synchronous switching of the function modes of the Raspberry Pi 201 and the EV3 smart car 203.
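The flag-bit handoff can be sketched as a few small helpers on the EV3 side; the `MODE:` serial framing is an assumption, and `serial_write` stands in for an actual serial-port write (e.g. via the third-party `pyserial` package):

```python
def next_mode_flag(current_flag, num_modes=2):
    """Advance the function-mode flag bit when the touch sensor fires."""
    return (current_flag + 1) % num_modes

def frame_mode_flag(flag):
    """Encode the flag for the serial link to the Raspberry Pi (assumed framing)."""
    return b"MODE:" + str(flag).encode() + b"\n"

def on_touch(current_flag, serial_write):
    """Touch handler: change the local flag, then push it over the serial port
    so the Raspberry Pi toggles its own flag and both devices stay in sync."""
    flag = next_mode_flag(current_flag)
    serial_write(frame_mode_flag(flag))
    return flag
```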
- connection lines of the Raspberry Pi 201 and the EV3 smart car 203 can be as shown in Table 1. From Table 1, the corresponding relationship between the interface of the Raspberry Pi 201 and the color of the connection line can be seen.
- FIG. 3 is a schematic diagram of the connection relationship between the EV3 smart car and the terminal in an embodiment of this application.
- the EV3 smart car 203 can be connected to the terminal 301 through a wireless connection.
- a wireless network card can be connected to the USB interface of the EV3 smart car 203;
- the EV3 smart car 203 and the terminal 301 can be in the same network segment. In this way, the stability of the wireless network connection and the reliability of the communication between the EV3 smart car 203 and the terminal 301 can be ensured.
- Fig. 4 is a second flowchart of the device control method according to an embodiment of the application. As shown in Fig. 4, the process may include:
- Step 401 Connect the EV3 smart car 203 and the Raspberry Pi 201, and connect the EV3 smart car 203 and the terminal 301.
- Step 402 On the terminal side, the program code collection of the WEB front-end is implemented based on the local agent program, and the collected program code is pushed to the EV3 smart car 203.
- the python code submitted by the user can be collected based on the local agent program.
- the EV3 smart car 203 can be made to work based on the python code; that is to say, remote control of the EV3 smart car 203 can be implemented based on the python code. In this way, the EV3 smart car 203 and other controlled equipment can be controlled more flexibly and diversely; in an education scene, students can continuously trial-and-error and verify novel control schemes, which can cultivate students' programming thinking.
- FIG. 5 is a schematic diagram of the functional flow of the agent program in an embodiment of the application. Referring to FIG. 5, the flow may include:
- Step A1 Obtain a list of smart cars in the same network segment of the terminal 301 by broadcasting.
- when running the local agent program, the terminal 301 can send ARP data packets in the local area network by broadcasting, and the EV3 smart car 203 in the local area network will reply with a corresponding data packet after receiving the ARP data packet.
- the above-mentioned local agent program can parse the data packet returned by the EV3 smart car 203 to obtain the IP address of the EV3 smart car 203; when the IP addresses of multiple EV3 smart cars 203 are obtained, it can generate a list of the EV3 smart cars under the same network segment as the terminal 301.
- Step A2 Select EV3 smart car 203 and establish a connection.
- a specific smart car can be selected from the above-mentioned EV3 smart car list, and then a User Datagram Protocol (UDP) connection with the corresponding smart car can be created.
- Step A3 Push the program code to the EV3 smart car 203, and receive the feedback information of the EV3 smart car 203.
- the user can submit the program code.
- the local agent program can collect the program code submitted by the user, and then push the program code to the selected EV3 smart car 203; the EV3 smart car 203 can run the program code, generate feedback information, and send the feedback information to the terminal 301.
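A sketch of the agent-side push step over the UDP connection; the fixed port number and the assumption that the car replies with feedback on the same socket are illustrative, not specified by this application:

```python
import socket

EV3_PORT = 50001   # hypothetical fixed port the EV3-side UDP service listens on

def push_code(ip, source_code, port=EV3_PORT, timeout=5.0):
    """Push collected program code to the selected EV3 smart car over UDP and
    wait for its feedback information (assumed to arrive on the same socket)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.settimeout(timeout)
        sock.sendto(source_code.encode(), (ip, port))
        feedback, _addr = sock.recvfrom(65535)
        return feedback.decode()
    finally:
        sock.close()
```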
- Step A4 Load feedback information on the front end of the WEB through asynchronous loading.
- Step A5 After the program code is executed, the connection between the terminal 301 and the EV3 smart car 203 can be disconnected, and resources can be recovered.
- after the feedback information is received or loaded, the program code can be considered to have finished executing.
- the connection between the terminal 301 and the EV3 smart car 203 can be released.
- corresponding hardware resources and/or software resources, for example, WEB front-end resources and the network transmission resources between the terminal 301 and the EV3 smart car 203, can also be released.
- Step 403 The EV3 smart car 203 receives and runs the program code, and pushes the image detection instruction to the Raspberry Pi 201.
- the EV3 smart car 203 starts a corresponding UDP service, which can continuously monitor information on a fixed port; after the local agent program pushes the program code to the EV3 smart car 203, the EV3 smart car 203 can run the received program code (the program code can be encapsulated in a data packet) to obtain the control instruction; if the control instruction involves an image processing task, the EV3 smart car 203 can generate an image detection instruction, start the corresponding service mode, and push the image detection instruction to the Raspberry Pi 201 to instruct the Raspberry Pi 201 to perform an image detection task, so that the image detection result can be obtained from the Raspberry Pi 201.
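A sketch of the EV3-side service: a UDP socket bound to a fixed port, with the pushed code run through the interpreter and its printed output captured as feedback. Executing received code with `exec` is shown only because the scenario is a closed classroom LAN; the port number is an assumption:

```python
import io
import socket
from contextlib import redirect_stdout

def run_pushed_code(source):
    """Execute received program code and capture its printed output as feedback."""
    buf = io.StringIO()
    try:
        with redirect_stdout(buf):
            exec(compile(source, "<pushed>", "exec"), {})
    except Exception as exc:          # surface failures back to the terminal side
        buf.write(f"error: {exc}")
    return buf.getvalue()

def serve(port=50001):
    """Continuously monitor the fixed port, run pushed code, reply with feedback."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, addr = sock.recvfrom(65535)
        sock.sendto(run_pushed_code(data.decode()).encode(), addr)
```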
- the EV3 smart car 203 and the Raspberry Pi 201 may have multiple working modes.
- Table 2 illustrates several working modes of the EV3 smart car 203 and the Raspberry Pi 201 as an example.
- Step 404 The Raspberry Pi 201 executes the image detection instruction, and transmits the image detection result (also referred to as the image processing result) to the EV3 smart car 203, and the EV3 smart car 203 executes the corresponding control command according to the image detection result.
- the image detection instruction can be encapsulated in a data packet and sent.
- the Raspberry Pi 201 receives the data packet from the EV3 smart car 203, obtains its own working mode through parsing, then executes the corresponding image detection task, and sends the image detection result to the EV3 smart car 203. In the same working mode, the EV3 smart car 203 and the Raspberry Pi 201 cooperate to perform specific tasks.
- FIG. 6 is a schematic diagram of the processing flow of the EV3 smart car 203 and the Raspberry Pi 201 in the human tracking mode in the embodiment of the application.
- the processing can be explained in terms of an image processing stage, a detection data processing stage and an instruction execution stage.
- in the image processing stage, the camera 202 is used to collect pictures, and the collected images are then detected based on deep learning technology to achieve human body detection and obtain position coordinates; the data processing stage mainly realizes human body ID recognition and position coordinate processing; the result of the data processing is sent to the EV3 smart car 203.
- in the instruction execution stage, the EV3 smart car 203 can control the motor operation according to the logic shown in Figure 6, and in this way a basic human body tracking function can be achieved; referring to Figure 6, when it is determined that the human body becomes smaller, the EV3 smart car 203 is driven forward by the motors; when it is determined that the human body becomes larger, the EV3 smart car 203 is controlled to stand still; when it is determined that the human body is on the left side, the motors drive the EV3 smart car 203 to turn left; when it is determined that the human body is on the right side, the motors drive the EV3 smart car 203 to turn right.
- Step 405 The EV3 smart car 203 collects feedback information and sends the feedback information to the terminal 301.
- Figure 7 is a schematic diagram of the processing flow for feedback information in an embodiment of the application.
- when the python code is received, the EV3 smart car 203 can use its own Python interpreter to complete the conversion from source code to bytecode, and then convert the bytecode into machine code to facilitate the execution of instructions.
- the execution result of the program code can be redirected to a text file.
- the communication connection between the EV3 smart car 203 and the terminal 301 can be established based on a UDP communication module; the UDP communication module can detect in real time whether the above text file is updated, and after the text file is updated, read its content (that is, the feedback information); the read feedback information is then actively pushed to the local agent program of the terminal 301 through the established communication connection; the local agent program can be implemented based on JAVA.
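The redirect-then-watch mechanism can be sketched with the standard library; polling the file's modification time stands in for whatever real-time update detection the UDP communication module actually uses:

```python
import os

def read_if_updated(path, last_mtime):
    """Return (mtime, content) when the feedback text file has changed since
    last_mtime, else (last_mtime, None); the content is what gets pushed
    to the local agent program over the established connection."""
    mtime = os.path.getmtime(path)
    if mtime != last_mtime:
        with open(path) as f:
            return mtime, f.read()
    return last_mtime, None
```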
- the feedback information can be loaded to the WEB front end by asynchronous loading, and the WEB front end will display the feedback information in a designated area after a series of rendering operations.
- a complete remote control scheme based on the EV3 smart car 203 is proposed.
- the program code can be input through the WEB terminal, transmitted to the lower computer (that is, the EV3 smart car) in real time, and the lower computer can respond with feedback information; further, the local agent program supports collecting the WEB front-end python code and pushing it to the lower computer based on a UDP connection so that the program code can be executed; through the local agent program, information interaction between the terminal 301 and the EV3 smart car 203 can be realized, and the transmission is efficient and reliable; further, a connection scheme between the terminal, the EV3 smart car 203, the Raspberry Pi 201 and related peripherals is designed to ensure that the connection is efficient and feasible, and it is compatible with most embedded development equipment.
- a complete implementation scheme is given for the connection between the EV3 smart car and the Raspberry Pi and for the control of the EV3 smart car and the Raspberry Pi; moreover, a synchronous switching mechanism for different working modes is realized for the EV3 smart car and the Raspberry Pi, which can ensure the consistency of information exchange under different working modes.
- the following describes a device control method of the embodiment of the present application from the perspective of the terminal.
- FIG. 8 is the third flow chart of the device control method according to the embodiment of the application. As shown in FIG. 8, the flow may include:
- Step 801 Obtain program code for controlling the operation of the first controlled device through the local agent program
- the first controlled device may be an electronic device such as a smart car or an embedded development device; the first controlled device may include a sensor, a camera, and other devices, and the terminal may provide a human-computer interaction interface.
- the first controlled device may be a single electronic device, or may integrate multiple electronic devices.
- a local agent program can be installed in the terminal.
- the local agent program is used at least to collect the program code submitted by the user.
- the program code submitted by the user may be the program code used to control the operation of the first controlled device.
- the program code used to control the work of the first controlled device can be collected.
- the type of the program code is not limited.
- the type of the program code can be predetermined according to actual application requirements.
- the program code can be python code or other program codes.
- Step 802 Send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- the communication connection between the terminal and the first controlled device can be established in advance.
- the embodiment of this application does not limit the connection mode of the communication connection between the terminal and the first controlled device.
- the device can perform data interaction through wired connection or various wireless connection methods.
- the first controlled device can run the pushed program code to generate corresponding control instructions, and then execute the control instructions; since the control instructions are generated based on the program code, the terminal can control the first controlled device through the above program code.
- steps 801 to 802 may be implemented based on the processor of the terminal, and the foregoing processor may be at least one of ASIC, DSP, DSPD, PLD, FPGA, CPU, controller, microcontroller, and microprocessor. It is understandable that, for different terminals or controlled devices, the electronic devices used to implement the above-mentioned processor functions may also be other, which are not specifically limited in the embodiment of the present application.
- the program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
- the method further includes:
- the feedback information of the first controlled device is received, and the feedback information is generated by the first controlled device after running the program code.
- the feedback information includes the execution result of the program code.
- the terminal can be informed of the operation of the above program code, which facilitates processing of the program code; for example, when the operation of the program code does not meet expectations, the program code can be modified.
- the method further includes:
- the obtaining the program code for controlling the operation of the controlled device through the local agent program includes:
- the program code submitted by the user based on the WEB page is obtained through the local agent program.
- the user can submit the program code more conveniently based on the WEB page, which in turn facilitates the local agent program to collect the program code submitted by the user.
- Fig. 9 is a fourth flowchart of a device control method according to an embodiment of this application. As shown in Fig. 9, the process may include:
- Step 901 A program code for controlling the operation of the first controlled device sent by a terminal through a local agent program is received; the program code is collected by the local agent program.
- the first controlled device may be an electronic device such as a smart car or an embedded development device; the first controlled device may include a sensor, a camera, and other devices, and the terminal may provide a human-computer interaction interface.
- the first controlled device may be a single electronic device, or may integrate multiple electronic devices.
- a local agent program can be installed in the terminal.
- the local agent program is used at least to collect the program code submitted by the user.
- the program code submitted by the user may be the program code used to control the operation of the first controlled device.
- the program code used to control the work of the first controlled device can be collected.
- the type of the program code is not limited.
- the type of the program code can be predetermined according to actual application requirements.
- the program code can be python code or other program codes.
- the communication connection between the terminal and the first controlled device can be established in advance.
- the embodiment of this application does not limit the connection mode of the communication connection between the terminal and the first controlled device.
- the device can perform data interaction through wired connection or various wireless connection methods.
- Step 902 Run the program code.
- the first controlled device can run the pushed program code to generate corresponding control instructions, and then execute the control instructions; since the control instructions are generated based on the program code, the terminal can control the first controlled device through the above program code.
- the program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
- the method further includes:
- the feedback information includes the execution result of the program code.
- the terminal can be informed of the operation of the above program code, which facilitates processing of the program code; for example, when the operation of the program code does not meet expectations, the program code can be modified.
- the method further includes:
- the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
- cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
- the method further includes:
- the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode;
- control the first controlled device to switch operating modes; send the operating mode switching information to a second controlled device, so that the second controlled device is based on the operating mode switching information Switch working mode.
- in the embodiment of the present application, multiple working modes can be set for each controlled device, and the behavior of the controlled device differs between working modes. It can be seen that the embodiment of the present application can realize synchronous switching of the working modes of the first controlled device and the second controlled device.
- control instruction of the second controlled device is an image detection instruction, and the execution result of the instruction is an image detection result
- the generating the control instruction of the first controlled device according to the instruction execution result and executing the control instruction of the first controlled device includes:
- the movement state of the first controlled device is controlled.
- human body tracking can be implemented on the basis of the coordinated control of the first controlled device and the second controlled device.
- the generating a control instruction of the first controlled device according to the image detection result includes:
- a control instruction for controlling the first controlled device to turn right is generated.
- the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
- the first controlled device and the second controlled device are connected through a wired connection; in this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
- the writing order of the steps does not imply a strict execution order and does not constitute any limitation on the implementation process; the specific execution order of each step should be determined by its function and possible internal logic.
- an embodiment of the present application proposes a terminal.
- FIG. 10 is a schematic diagram of the composition structure of a terminal according to an embodiment of the application. As shown in FIG. 10, the device includes an acquisition module 1001 and a first processing module 1002, where,
- the obtaining module 1001 is configured to obtain program code for controlling the operation of the first controlled device through a local agent program
- the first processing module 1002 is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- the first processing module 1002 is further configured to receive feedback information of the first controlled device after sending the program code to the first controlled device, and the feedback information It is generated after the first controlled device runs the program code.
- the first processing module 1002 is further configured to, after receiving the feedback information of the first controlled device, load and/or display the feedback information on the terminal where the local agent program is located.
- the feedback information includes an execution result of the program code.
- the obtaining module 1001 is configured to obtain the program code submitted by the user based on the WEB page through the local agent program.
- Both the acquisition module 1001 and the first processing module 1002 can be implemented by a processor located in a terminal.
- the processor is at least one of an ASIC, DSP, DSPD, PLD, FPGA, CPU, controller, microcontroller, and microprocessor.
- the functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit can be realized in the form of hardware or software function module.
- the integrated unit is implemented in the form of a software function module and is not sold or used as an independent product, it can be stored in a computer readable storage medium.
- the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product.
- the computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the method described in the embodiments of the present application.
- the aforementioned storage media include: a USB flash drive, a mobile hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media that can store program code.
- the computer program instructions corresponding to a device control method in the embodiments of the present application may be stored on storage media such as optical disks, hard disks, and USB flash drives.
- FIG. 11 shows another terminal provided by an embodiment of the present application, which may include: a first memory 1101 and a first processor 1102; wherein,
- the first memory 1101 is configured to store computer programs and data
- the first processor 1102 is configured to execute a computer program stored in the first memory, so as to implement any device control method applied to the terminal in the foregoing embodiments.
- the above-mentioned first memory 1101 may be a volatile memory, such as a RAM; a non-volatile memory, such as a ROM, a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the foregoing types of memories, and provides instructions and data to the first processor 1102.
- the first processor 1102 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, or a microprocessor. It is understandable that, for different devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of the present application.
- an embodiment of the present application proposes a controlled device.
- FIG. 12 is a schematic diagram of the composition structure of a first controlled device according to an embodiment of the application. As shown in FIG. 12, the device includes a receiving module 1201 and a second processing module 1202, wherein,
- the receiving module 1201 is configured to receive the program code for controlling the operation of the first controlled device sent by the terminal through a local agent program; the program code is collected by the local agent program;
- the second processing module 1202 is configured to run the program code.
- the second processing module 1202 is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
- the feedback information includes an execution result of the program code.
- the second processing module 1202 is further configured to: generate, by running the program code, a control instruction of a second controlled device, where the first controlled device and the second controlled device form a communication connection; send the control instruction of the second controlled device to the second controlled device, so that the second controlled device executes it; and receive an instruction execution result sent by the second controlled device, generate a control instruction of the first controlled device according to the instruction execution result, and execute the control instruction of the first controlled device; the instruction execution result is obtained by the second controlled device after executing the control instruction of the second controlled device.
- the second processing module 1202 is further configured to obtain working mode switching information; control the first controlled device to switch working modes according to the working mode switching information; and send the working mode switching information to the second controlled device, so that the second controlled device switches working modes based on the working mode switching information; the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to a target working mode.
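The working-mode switching just described (the first controlled device switches itself, then relays the same switching information to the second controlled device) can be sketched as below. The `ControlledDevice` class, the mode names, and the message shape are illustrative assumptions, not the application's implementation.

```python
from dataclasses import dataclass, field
from typing import List

# Hedged sketch: the first controlled device applies a working-mode switch
# itself and forwards the same switching information to a peer device.
# Mode names and the message shape are illustrative assumptions.

@dataclass
class ControlledDevice:
    name: str
    mode: str = "idle"
    sent: List[dict] = field(default_factory=list)

    def switch_mode(self, info: dict) -> None:
        # Switch the current working mode to the target working mode.
        self.mode = info["target_mode"]

    def relay(self, info: dict, peer: "ControlledDevice") -> None:
        # Forward the switching information so the peer switches too.
        self.sent.append(info)
        peer.switch_mode(info)

first = ControlledDevice("first")
second = ControlledDevice("second")
info = {"target_mode": "tracking"}
first.switch_mode(info)
first.relay(info, second)
print(first.mode, second.mode)  # tracking tracking
```

Both devices end up in the target working mode after a single piece of switching information is obtained.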
- the control instruction of the second controlled device is an image detection instruction, and the execution result of the instruction is an image detection result;
- the second processing module 1202 is configured to generate a control instruction of the first controlled device according to the image detection result, where the control instruction of the first controlled device is a human body tracking instruction, and to control the motion state of the first controlled device according to the human body tracking instruction.
- the second processing module 1202 is configured to: in response to the image detection result indicating that the human body becomes smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body becomes larger, generate a control instruction for controlling the first controlled device to remain stationary; in response to the image detection result indicating that the human body is located to the left of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is located to the right of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
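The four tracking rules above can be sketched as a simple mapping, assuming the image detection result is reported as one of four tokens; the token names and command strings are invented for illustration.

```python
# Hedged sketch: map an image detection result to a motion command for
# the first controlled device, following the four rules in the text.
# The result tokens and command names are illustrative assumptions.

def tracking_instruction(detection_result: str) -> str:
    mapping = {
        "smaller": "forward",     # human appears smaller -> move forward
        "larger": "stay",         # human appears larger -> remain stationary
        "left": "turn_left",      # human on the left -> turn left
        "right": "turn_right",    # human on the right -> turn right
    }
    return mapping[detection_result]

print(tracking_instruction("smaller"))  # forward
```

A person moving away makes the detected body smaller, so the device advances to keep tracking; a person approaching makes it larger, so the device holds position.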
- the first controlled device and the second controlled device are connected through a wired connection.
- Both the receiving module 1201 and the second processing module 1202 can be implemented by a processor located in the first controlled device.
- the processor is at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, or a microprocessor.
- the functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit can be realized in the form of hardware or software function module.
- if the integrated unit is implemented in the form of a software function module and is not sold or used as an independent product, it can be stored in a computer-readable storage medium.
- the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the existing technology, or all or part of the technical solutions, can be embodied in the form of a software product.
- the computer software product is stored in a storage medium and includes several instructions that enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the method described in the embodiments of the present application.
- the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.
- the computer program instructions corresponding to another device control method in the embodiments of the present application may be stored on storage media such as optical disks, hard disks, and USB flash drives.
- FIG. 13 shows an electronic device provided by an embodiment of the present application, which may include: a second memory 1301 and a second processor 1302; wherein,
- the second memory 1301 is configured to store computer programs and data
- the second processor 1302 is configured to execute a computer program stored in the second memory to implement any device control method applied to the first controlled device in the foregoing embodiments.
- the above-mentioned second memory 1301 may be a volatile memory, such as a RAM; a non-volatile memory, such as a ROM, a flash memory, an HDD, or an SSD; or a combination of the above-mentioned types of memories, and provides instructions and data to the second processor 1302.
- the aforementioned second processor 1302 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, or a microprocessor. It is understandable that, for different electronic devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of the present application.
- the functions or modules contained in the apparatus provided in the embodiments of the present application can be used to execute the methods described in the above method embodiments; for brevity, details are not repeated here, and reference may be made to the description of the above method embodiments.
- the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform; they can, of course, also be implemented by hardware, but in many cases the former is the better implementation.
- the technical solution of this application, in essence, or the part that contributes to the existing technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that enable a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the methods described in the various embodiments of the present application.
- the embodiment of the application discloses a device control method, a terminal, a controlled device, an electronic device, a medium, and a program.
- the method includes: obtaining program code for controlling the operation of a first controlled device through a local agent program;
- the local agent program sends the program code to the first controlled device, so that the first controlled device runs the program code.
- in the technical solution of the embodiments of the present application, the program code can be obtained and pushed through the local agent program, and the first controlled device can then be controlled based on the program code; because the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
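The overall flow summarized above (obtain code, push it via the local agent, run it on the controlled device, return feedback) can be sketched in-process. The `DeviceRuntime` and `LocalAgent` classes and the use of `exec()` are assumptions made to keep the sketch self-contained; a real deployment would send the code over the claimed communication connection.

```python
# Hedged end-to-end sketch of the claimed flow: a local agent pushes program
# code, the controlled device runs it and returns feedback containing the
# execution result. A real system would use a network transport instead.

class DeviceRuntime:
    """Stand-in for the first controlled device."""
    def run(self, program_code: str) -> dict:
        namespace: dict = {}
        exec(program_code, namespace)               # run the pushed code
        return {"result": namespace.get("result")}  # feedback information

class LocalAgent:
    """Stand-in for the local agent program on the terminal."""
    def __init__(self, device: DeviceRuntime):
        self.device = device

    def push(self, program_code: str) -> dict:
        # Send the collected code to the device and return its feedback.
        return self.device.run(program_code)

agent = LocalAgent(DeviceRuntime())
feedback = agent.push("result = 2 + 3")
print(feedback)  # {'result': 5}
```

Because the pushed code is ordinary, editable program text, changing the device's behavior only requires submitting different code, which is the flexibility the embodiment highlights.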
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Engineering & Computer Science (AREA)
- Manufacturing & Machinery (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Selective Calling Equipment (AREA)
- Stored Programmes (AREA)
Abstract
Description
Claims (32)
- A device control method, comprising: obtaining, through a local agent program, program code for controlling the operation of a first controlled device; and sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- The method according to claim 1, wherein the method further comprises: after sending the program code to the first controlled device, receiving feedback information from the first controlled device, the feedback information being generated by the first controlled device after running the program code.
- The method according to claim 2, wherein the method further comprises: after receiving the feedback information from the first controlled device, loading and/or displaying the feedback information on the terminal where the local agent program is located.
- The method according to claim 2 or 3, wherein the feedback information includes an execution result of the program code.
- The method according to any one of claims 1 to 3, wherein obtaining, through the local agent program, the program code for controlling the operation of the controlled device comprises: obtaining, through the local agent program, the program code submitted by a user via a web (WEB) page.
- A device control method, applied to a first controlled device, the method comprising: receiving program code, sent by a terminal through a local agent program, for controlling the operation of the first controlled device, the program code being collected by the local agent program; and running the program code.
- The method according to claim 6, wherein the method further comprises: running the program code to generate feedback information, and sending the feedback information to the terminal.
- The method according to claim 7, wherein the feedback information includes an execution result of the program code.
- The method according to any one of claims 6 to 8, wherein the method further comprises: generating, by running the program code, a control instruction of a second controlled device, wherein the first controlled device and the second controlled device form a communication connection; sending the control instruction of the second controlled device to the second controlled device, so that the second controlled device executes the control instruction of the second controlled device; and receiving an instruction execution result sent by the second controlled device, generating a control instruction of the first controlled device according to the instruction execution result, and executing the control instruction of the first controlled device; wherein the instruction execution result is obtained by the second controlled device after executing the control instruction of the second controlled device.
- The method according to claim 9, wherein the method further comprises: obtaining working mode switching information, the working mode switching information being used to instruct the first controlled device and the second controlled device to switch the current working mode to a target working mode; controlling the first controlled device to switch working modes according to the working mode switching information; and sending the working mode switching information to the second controlled device, so that the second controlled device switches working modes based on the working mode switching information.
- The method according to claim 9 or 10, wherein the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result; and generating the control instruction of the first controlled device according to the instruction execution result and executing the control instruction of the first controlled device comprises: generating the control instruction of the first controlled device according to the image detection result, the control instruction of the first controlled device being a human body tracking instruction; and controlling the motion state of the first controlled device according to the human body tracking instruction.
- The method according to claim 11, wherein generating the control instruction of the first controlled device according to the image detection result comprises: in response to the image detection result indicating that the human body becomes smaller, generating a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body becomes larger, generating a control instruction for controlling the first controlled device to remain stationary; in response to the image detection result indicating that the human body is located to the left of the first controlled device, generating a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is located to the right of the first controlled device, generating a control instruction for controlling the first controlled device to turn right.
- The method according to any one of claims 6 to 12, wherein the first controlled device and the second controlled device are connected through a wired connection.
- A terminal, comprising an obtaining module and a first processing module, wherein the obtaining module is configured to obtain, through a local agent program, program code for controlling the operation of a first controlled device; and the first processing module is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
- The terminal according to claim 14, wherein the first processing module is further configured to, after sending the program code to the first controlled device, receive feedback information from the first controlled device, the feedback information being generated by the first controlled device after running the program code.
- The terminal according to claim 15, wherein the first processing module is further configured to, after receiving the feedback information from the first controlled device, load and/or display the feedback information on the terminal where the local agent program is located.
- The terminal according to claim 15 or 16, wherein the feedback information includes an execution result of the program code.
- The terminal according to any one of claims 14 to 16, wherein the obtaining module is configured to obtain, through the local agent program, the program code submitted by a user via a web (WEB) page.
- A first controlled device, comprising a receiving module and a second processing module, wherein the receiving module is configured to receive program code, sent by a terminal through a local agent program, for controlling the operation of the first controlled device, the program code being collected by the local agent program; and the second processing module is configured to run the program code.
- The first controlled device according to claim 19, wherein the second processing module is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
- The first controlled device according to claim 20, wherein the feedback information includes an execution result of the program code.
- The first controlled device according to any one of claims 19 to 21, wherein the second processing module is further configured to: generate, by running the program code, a control instruction of a second controlled device, wherein the first controlled device and the second controlled device form a communication connection; send the control instruction of the second controlled device to the second controlled device, so that the second controlled device executes the control instruction of the second controlled device; and receive an instruction execution result sent by the second controlled device, generate a control instruction of the first controlled device according to the instruction execution result, and execute the control instruction of the first controlled device; wherein the instruction execution result is obtained by the second controlled device after executing the control instruction of the second controlled device.
- The first controlled device according to claim 22, wherein the second processing module is further configured to obtain working mode switching information; control the first controlled device to switch working modes according to the working mode switching information; and send the working mode switching information to the second controlled device, so that the second controlled device switches working modes based on the working mode switching information; the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to a target working mode.
- The first controlled device according to claim 22 or 23, wherein the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result; and the second processing module is configured to generate the control instruction of the first controlled device according to the image detection result, the control instruction of the first controlled device being a human body tracking instruction, and to control the motion state of the first controlled device according to the human body tracking instruction.
- The first controlled device according to claim 24, wherein the second processing module is configured to: in response to the image detection result indicating that the human body becomes smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body becomes larger, generate a control instruction for controlling the first controlled device to remain stationary; in response to the image detection result indicating that the human body is located to the left of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is located to the right of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
- The first controlled device according to any one of claims 19 to 25, wherein the first controlled device and the second controlled device are connected through a wired connection.
- A terminal, comprising a processor and a memory configured to store a computer program executable on the processor, wherein the processor is configured to execute the method according to any one of claims 1 to 5 when running the computer program.
- An electronic device, comprising a processor and a memory configured to store a computer program executable on the processor, wherein the processor is configured to execute the method according to any one of claims 6 to 13 when running the computer program.
- A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 5.
- A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 6 to 13.
- A computer program comprising computer-readable code, wherein when the computer-readable code runs in a terminal, a processor in the terminal executes instructions for implementing the method according to any one of claims 1 to 5.
- A computer program comprising computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device executes instructions for implementing the method according to any one of claims 6 to 13.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910578882.2 | 2019-06-28 | ||
CN201910578882.2A CN110297472A (zh) | 2019-06-28 | 2019-06-28 | Device control method, terminal, controlled device, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020259154A1 true WO2020259154A1 (zh) | 2020-12-30 |
Family
ID=68029509
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/091915 WO2020259154A1 (zh) | Device control method, terminal, controlled device, electronic device, medium and program | 2019-06-28 | 2020-05-22 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN110297472A (zh) |
TW (1) | TWI743853B (zh) |
WO (1) | WO2020259154A1 (zh) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112860522A (zh) * | 2021-03-02 | 2021-05-28 | 北京梧桐车联科技有限责任公司 | Program operation monitoring method, apparatus and device |
CN114070659A (zh) * | 2021-10-29 | 2022-02-18 | 深圳市优必选科技股份有限公司 | Device locking method, apparatus and terminal device |
CN115118699A (zh) * | 2022-06-21 | 2022-09-27 | 国仪量子(合肥)技术有限公司 | Data transmission method, apparatus, system, host computer and storage medium |
CN116661400A (zh) * | 2023-07-14 | 2023-08-29 | 广船国际有限公司 | Machine-vision-based control method, apparatus, device and medium for a target device |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110297472A (zh) * | 2019-06-28 | 2019-10-01 | 上海商汤智能科技有限公司 | Device control method, terminal, controlled device, electronic device and storage medium |
CN111262912B (zh) * | 2020-01-09 | 2021-06-29 | 北京邮电大学 | System, method and apparatus for controlling vehicle motion |
CN112001827B (zh) * | 2020-09-25 | 2024-09-13 | 上海商汤临港智能科技有限公司 | Teaching aid control method and apparatus, teaching device, and storage medium |
CN113903218B (zh) * | 2021-10-21 | 2024-03-15 | 优必选(湖北)科技有限公司 | Educational programming method, electronic device and computer-readable storage medium |
CN115442450B (zh) * | 2022-08-24 | 2024-07-30 | 山东浪潮科学研究院有限公司 | Cloud-based sharing method and storage medium for a programmable artificial intelligence cart |
CN115827516A (zh) * | 2023-02-03 | 2023-03-21 | 北京融合未来技术有限公司 | Device control method and apparatus, data acquisition system, device and medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004114032A1 (en) * | 2003-06-25 | 2004-12-29 | Avecontrol Oy | Signal processing method and signal processing system |
US20090030546A1 (en) * | 2007-07-24 | 2009-01-29 | Wah Hong Industrial Corp. | Apparatus and method for positioning control |
CN102981758A (zh) * | 2012-11-05 | 2013-03-20 | 福州瑞芯微电子有限公司 | Connection method between electronic devices |
CN104808600A (zh) * | 2014-01-26 | 2015-07-29 | 广东美的制冷设备有限公司 | Adaptive control method and system for multiple control modes of a controlled terminal, and control terminal |
CN204965043U (zh) * | 2015-09-23 | 2016-01-13 | 苏州工业园区宅艺智能科技有限公司 | Cloud-platform-based smart home control system |
CN105871670A (zh) * | 2016-05-20 | 2016-08-17 | 珠海格力电器股份有限公司 | Control method, apparatus and system for terminal device |
CN108958103A (zh) * | 2018-06-25 | 2018-12-07 | 珠海格力电器股份有限公司 | Control method, controlled method, apparatus, intelligent terminal and intelligent appliance |
CN110297472A (zh) * | 2019-06-28 | 2019-10-01 | 上海商汤智能科技有限公司 | Device control method, terminal, controlled device, electronic device and storage medium |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6853867B1 (en) * | 1998-12-30 | 2005-02-08 | Schneider Automation Inc. | Interface to a programmable logic controller |
US7950010B2 (en) * | 2005-01-21 | 2011-05-24 | Sap Ag | Software deployment system |
US20060168575A1 (en) * | 2005-01-21 | 2006-07-27 | Ankur Bhatt | Defining a software deployment |
US8914783B2 (en) * | 2008-11-25 | 2014-12-16 | Fisher-Rosemount Systems, Inc. | Software deployment manager integration within a process control system |
JP5240141B2 (ja) * | 2009-09-14 | 2013-07-17 | 株式会社リコー | Program download system, program download method, image forming apparatus, program distribution server, and download program |
TW201233096A (en) * | 2011-01-31 | 2012-08-01 | Fuither Tech Co Ltd | Remote assistance service method for embedded operation system |
CN102520665A (zh) * | 2011-12-23 | 2012-06-27 | 中国科学院自动化研究所 | Open robot teaching device and robot control system |
CN104175308A (zh) * | 2014-08-12 | 2014-12-03 | 湖南信息职业技术学院 | Autonomous service robot |
CN104932298A (zh) * | 2015-04-28 | 2015-09-23 | 中国地质大学(武汉) | Teaching robot controller |
CN105045153A (zh) * | 2015-07-31 | 2015-11-11 | 中国地质大学(武汉) | Three-mode control system based on a mobile robot platform |
CN106651384A (zh) * | 2015-10-30 | 2017-05-10 | 阿里巴巴集团控股有限公司 | Sample quality detection method, and detection data entry method, apparatus and system |
CN105488815B (zh) * | 2015-11-26 | 2018-04-06 | 北京航空航天大学 | Real-time object tracking method supporting target size changes |
CN105760824B (zh) * | 2016-02-02 | 2019-02-01 | 北京进化者机器人科技有限公司 | Moving human body tracking method and system |
CN105760106B (zh) * | 2016-03-08 | 2019-01-15 | 网易(杭州)网络有限公司 | Smart home device interaction method and apparatus |
US20170351226A1 (en) * | 2016-06-01 | 2017-12-07 | Rockwell Automation Technologies, Inc. | Industrial machine diagnosis and maintenance using a cloud platform |
CN106737676B (zh) * | 2016-12-28 | 2019-03-15 | 南京埃斯顿机器人工程有限公司 | Script-based industrial robot programming system supporting secondary development |
CN109166404A (zh) * | 2018-10-12 | 2019-01-08 | 山东爱泊客智能科技有限公司 | Method and apparatus for implementing self-programming control based on a shared controllable model |
- 2019
- 2019-06-28 CN CN201910578882.2A patent/CN110297472A/zh active Pending
- 2020
- 2020-05-22 WO PCT/CN2020/091915 patent/WO2020259154A1/zh active Application Filing
- 2020-06-22 TW TW109121221A patent/TWI743853B/zh active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112860522A (zh) * | 2021-03-02 | 2021-05-28 | 北京梧桐车联科技有限责任公司 | Program operation monitoring method, apparatus and device |
CN114070659A (zh) * | 2021-10-29 | 2022-02-18 | 深圳市优必选科技股份有限公司 | Device locking method, apparatus and terminal device |
CN114070659B (zh) * | 2021-10-29 | 2023-11-17 | 深圳市优必选科技股份有限公司 | Device locking method, apparatus and terminal device |
CN115118699A (zh) * | 2022-06-21 | 2022-09-27 | 国仪量子(合肥)技术有限公司 | Data transmission method, apparatus, system, host computer and storage medium |
CN116661400A (zh) * | 2023-07-14 | 2023-08-29 | 广船国际有限公司 | Machine-vision-based control method, apparatus, device and medium for a target device |
Also Published As
Publication number | Publication date |
---|---|
TW202122944A (zh) | 2021-06-16 |
CN110297472A (zh) | 2019-10-01 |
TWI743853B (zh) | 2021-10-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020259154A1 (zh) | Device control method, terminal, controlled device, electronic device, medium and program | |
JP6126738B2 (ja) | Hand gesture control method, apparatus and system | |
CN111283680B (zh) | System and method for wireless remote control of a robot | |
US10591999B2 (en) | Hand gesture recognition method, device, system, and computer storage medium | |
WO2022127829A1 (zh) | Self-moving robot and path planning method, apparatus, device and storage medium therefor | |
WO2021027967A1 (zh) | Route determination method, travelable device, and storage medium | |
US20180300552A1 (en) | Differential Tracking for Panoramic Images | |
CN109992111B (zh) | Augmented reality extension method and electronic device | |
WO2022178985A1 (zh) | Exception handling method, apparatus, server and storage medium for a material-pushing robot | |
WO2019069436A1 (ja) | Monitoring device, monitoring system and monitoring method | |
JP2018149669A (ja) | Learning device and learning method | |
WO2023115927A1 (zh) | Mapping method, system, device and storage medium for cloud robots | |
CN112346809A (zh) | Web page image annotation method, apparatus, electronic device and storage medium | |
WO2017012499A1 (zh) | Unmanned aerial vehicle control method, apparatus and system | |
Barbosa et al. | ROS, Android and cloud robotics: How to make a powerful low cost robot | |
US20170026617A1 (en) | Method and apparatus for real-time video interaction by transmitting and displaying user interface correpsonding to user input | |
Velamala et al. | Development of ROS-based GUI for control of an autonomous surface vehicle | |
CN116483357A (zh) | Construction method of a ROS-based online robot simulation training platform | |
WO2022000757A1 (zh) | AR-based robot Internet-of-Things interaction method, apparatus and medium | |
CN112233208B (zh) | Robot state processing method, apparatus, computing device and storage medium | |
Annable et al. | Nubugger: A visual real-time robot debugging system | |
WO2020067204A1 (ja) | Training data creation method, machine learning model generation method, training data creation device, and program | |
Bhuvanesh et al. | Design and Development of a Mobile Wireless Video Streaming Mote | |
CN110896442A (zh) | Device monitoring method, apparatus, system and camera device | |
CN110554966A (zh) | Driver debugging method, behavior analysis method and driver debugging system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20832004 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20832004 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 13.09.2022) |
|