CN108805928B - Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium - Google Patents

Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium

Info

Publication number
CN108805928B
CN108805928B (application CN201810502117.8A)
Authority
CN
China
Prior art keywords
unmanned
unmanned equipment
controlling
equipment
state instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810502117.8A
Other languages
Chinese (zh)
Other versions
CN108805928A (en)
Inventor
杜立
黄俊凯
黄晓霞
靳倩慧
盛亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd
Priority to CN201810502117.8A
Priority to PCT/CN2018/102874 (WO2019223159A1)
Publication of CN108805928A
Application granted
Publication of CN108805928B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 CCTV systems for receiving images from a single remote source
    • H04N7/185 CCTV systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention discloses a method and a device for controlling the live broadcast of unmanned equipment, computer equipment, and a storage medium. The method comprises the following steps: the unmanned device acquires a first state instruction to be executed, wherein the first state instruction is used for controlling the movement track of the unmanned device; the unmanned device moves on a preset path according to the movement track, and acquires, through an image sensor on the unmanned device, spatial video information in its direction of travel during the movement; and the unmanned device push-streams the spatial video information to a server side. The method and device can be used in automatic house-viewing scenarios: any user can control the device, have it walk a preset route according to the user's own preferences, and shoot the relevant images, so that remote house viewing is realized by remotely controlling the device. The user needs no professional control training, the control mode is simple, and control flexibility is high.

Description

Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium
Technical Field
The invention relates to the technical field of remote control, in particular to a method and a device for controlling live broadcast of unmanned equipment, computer equipment and a storage medium.
Background
As living standards improve, demand for buying houses is high, and in large cities a great number of people rent. Before buying or renting a house, people usually visit it in person to make sure a suitable house can be bought or rented, and decide only after seeing it for themselves.
To help people learn about a house in advance, many housing transaction service providers in the prior art adopt an online house-viewing mode: picture information about the house is first provided on a network; if that information is satisfactory, the supplying and demanding parties agree on a viewing time, and service personnel provided by the service provider, for example a housing agent, then accompany the viewing. This process is costly, wastes time, incurs high labor cost, and makes house viewing inefficient. Moreover, the keys used for unlocking are varied and not uniformly managed, which further increases labor and maintenance costs.
To satisfy clients, the prior art also provides house viewing through online video: a worker holds a video-shooting device, moves it through the house, and sends the data stream to a server side over a wireless network; the user side then accesses the server side to obtain the live data, completing online house viewing. The prior art further provides a method in which a robot carrying video equipment is remotely controlled by the user to move, realizing video house viewing. However, because the environment inside a house is complex, a user without professional training cannot drive unmanned equipment in such an environment, and careless operation easily damages the robot.
Disclosure of Invention
The present invention is directed to solving at least one of the above technical drawbacks, and in particular to providing a method, an apparatus, a computer device, and a storage medium for freely controlling the live broadcast of an unmanned device, so that a user can freely control the device and view images of the environment where it is located.
The invention provides a method for controlling the live broadcast of unmanned equipment, which comprises the following steps:
the method comprises the steps that the unmanned equipment obtains a first state instruction to be executed, wherein the first state instruction is used for controlling the movement track of the unmanned equipment;
the unmanned equipment moves on a preset path according to the movement track, and acquires, through an image sensor on the unmanned equipment, spatial video information in its direction of travel during the movement;
and the unmanned equipment push-streams the spatial video information to a server side.
Further, when the unmanned device moves on a preset path according to the movement track, the method further includes:
acquiring a first parameter value between the unmanned equipment and an obstacle;
judging whether the first parameter value meets a first preset condition or not;
and executing a preset second state instruction when the first parameter value meets a first preset condition.
Further, the spatial video information includes: a spot image of a ranging spot projected in the direction of travel by a spot projector on the unmanned equipment; the step of acquiring the first parameter value between the unmanned equipment and the obstacle comprises:
acquiring a frame image of the spatial video information;
and calculating the view proportion of the light spot image in the frame picture image.
Further, the second state instruction includes: any one of a stop-traveling instruction, a reduce-traveling-speed instruction, or a reverse-traveling instruction.
Further, before executing the first state instruction, the method further comprises:
acquiring a preset traveling path and real-time position information;
judging whether the real-time position information is in the traveling path;
prohibiting execution of the first status instruction when the real-time location information is not in the travel path.
Further, before executing the first state instruction, the method further comprises:
acquiring a preset traveling path and real-time position information;
judging, according to the boundary distance between the real-time position information and the traveling path, whether the end position represented by the first state instruction is in the traveling path;
prohibiting execution of the first state instruction when the end position is not in the traveling path.
The invention also discloses a device for controlling the live broadcast of the unmanned equipment, which comprises the following components:
an acquisition module: the unmanned equipment is used for acquiring a first state instruction to be executed, wherein the first state instruction is used for controlling the movement track of the unmanned equipment;
a processing module: the unmanned equipment moves on a preset path according to the moving track and obtains spatial video information in the moving direction of the unmanned equipment in the moving process through an image sensor on the unmanned equipment;
an execution module: and the unmanned equipment is used for pushing and sending the spatial video information to a server side.
Further, the method also comprises the following steps:
a first acquisition module: used for acquiring a first parameter value between the unmanned equipment and an obstacle;
a first processing module: the first parameter value is used for judging whether the first parameter value meets a first preset condition or not;
a first execution module: used for executing a preset second state instruction when the first parameter value meets the first preset condition.
Further, the spatial video information includes: a spot image of a ranging spot projected in the traveling direction by a spot projector on the unmanned equipment; the step of acquiring the first parameter value between the unmanned equipment and the obstacle specifically comprises:
acquiring a frame image of the spatial video information;
and calculating the view proportion of the light spot image in the frame picture image.
Further, the second state instruction includes: any one of the stop-traveling instruction, the reduce-traveling-speed instruction, or the reverse-traveling instruction.
Further, the device further comprises:
a second obtaining module: used for acquiring a preset traveling path and real-time position information;
a second processing module: for determining whether the real-time location information is in the travel path;
a second execution module: for prohibiting execution of the first status instruction when the real-time location information is not in the travel path.
Further, the device further comprises:
a third obtaining module: used for acquiring a preset traveling path and real-time position information;
a third processing module: used for judging, according to the boundary distance between the real-time position information and the traveling path, whether the end position represented by the first state instruction is in the traveling path;
a third execution module: used for prohibiting execution of the first state instruction when the end position is not in the traveling path.
The invention also discloses computer equipment which comprises a memory and a processor, wherein computer readable instructions are stored in the memory, and when the computer readable instructions are executed by the processor, the processor executes the steps of the method for controlling the live broadcast of the unmanned equipment.
The invention also discloses a storage medium storing computer readable instructions, which when executed by one or more processors, cause the one or more processors to execute the steps of the above method for controlling the live broadcast of the unmanned device.
The invention has the following beneficial effects: in the disclosed method and device for controlling the live broadcast of unmanned equipment, the unmanned equipment receives a control instruction from a remote terminal and moves according to that instruction, while shooting a spatial image of its position and transmitting it back to the remote terminal, so that a client can view the spatial image remotely in real time.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a method for controlling live broadcast of an unmanned device according to the present invention;
FIG. 2 is a flowchart of a method for acquiring a distance by an unmanned aerial vehicle according to the present invention;
FIG. 3 is a flowchart of a method for acquiring a distance by using a ranging spot by the unmanned aerial vehicle according to the present invention;
FIG. 4 is a flowchart illustrating a first embodiment of the preconditions for executing a first state instruction according to the invention;
FIG. 5 is a flowchart illustrating a second embodiment of the present invention for performing preconditions for a first state instruction;
FIG. 6 is a schematic diagram of a module of an apparatus for controlling live broadcast of an unmanned aerial vehicle according to the present invention;
FIG. 7 is a block diagram of the basic structure of the computer device of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention and are not to be construed as limiting the present invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As will be understood by those skilled in the art, a "terminal" as used herein includes both devices having only a wireless signal receiver without transmit capability and devices having receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device with or without a multi-line display; a PCS (Personal Communications Service) device, which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, pager, internet/intranet access, web browser, notepad, calendar and/or GPS (Global Positioning System) receiver; or a conventional laptop and/or palmtop computer or other appliance having and/or including a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. The "terminal" or "terminal device" may also be a communication terminal, an internet access terminal, or a music/video playing terminal, for example a PDA, an MID (Mobile Internet Device) and/or a mobile phone with a music/video playing function, or a smart television, set-top box, or other device.
In order to enable any person to remotely control a traveling device, execute a shooting function, and view the shot images in real time, the invention provides a method for controlling the live broadcast of unmanned equipment. When used in the application scenario of remote house viewing, the method involves at least two terminals. One is a remote terminal used by the user to view images and control states and to send control instructions; the remote terminal may be a computer, a notebook, a mobile phone, or another terminal. The other is an unmanned device that is controlled to move and can perform the associated shooting actions; the unmanned device may be a robot, an unmanned aerial vehicle, a remote-control car, or similar equipment. The remote terminal and the unmanned device communicate remotely: the unmanned device analyzes the received instruction in light of its own motion state and executes or prohibits the first state instruction, achieving remotely controlled house viewing. At the same time, the unmanned device has an automatic obstacle-avoidance function, a high degree of intelligence, and a simple control mode.
Specifically, referring to fig. 1, the method for controlling the live broadcast of the unmanned device according to the remote terminal and the unmanned device includes the following steps:
s100, the unmanned equipment acquires a first state instruction to be executed, wherein the first state instruction is used for controlling the movement track of the unmanned equipment;
the first instruction is used for controlling the movement track of the unmanned equipment, namely controlling the unmanned equipment to move according to the specified direction. Further, a direction input unit is arranged on the remote terminal, and the unmanned equipment can be controlled to move through the direction input unit or through setting related direction parameters. The user can send any direction instruction which can enable the unmanned equipment to move, and the unmanned equipment is controlled to move according to the intention of the user.
S200, the unmanned equipment moves on a preset path according to the moving track, and spatial video information of the unmanned equipment in the moving direction in the moving process is acquired through an image sensor on the unmanned equipment;
the spatial video information is of various types, including: the image of the ranging spot projected in the direction of travel or the image of the surroundings at the current location. The spatial video information is acquired by an image sensor on the unmanned device.
In the present invention, the ranging spot image projected in the traveling direction is mainly used to monitor the distance between the unmanned device and the obstacle in the traveling direction to avoid collision. The image of the surrounding environment at the current position is used for the convenience of the user to know the environment around the position of the unmanned device.
The spatial video information for the two cases may be obtained simultaneously or in a time-shared manner. When the ranging spot image and the image of the surroundings at the current position are obtained simultaneously, two different image pickup devices are used. When they are obtained in a time-shared manner, usually one image pickup device is used: during travel, the ranging spot image is acquired first to measure the distance between the unmanned device and the obstacle ahead, and when the unmanned device stops, it begins to shoot the environment around its position to acquire the corresponding image.
In the present invention, the first state instruction includes instruction information for controlling the moving direction. Because the unmanned device moves automatically according to the first state instruction, an obstacle-avoidance mechanism is needed. Obtaining the ranging spot image is, in effect, an obstacle-avoidance method: in the invention, the distance between the unmanned device and the obstacle ahead is judged from the ranging spot image.
S300, the unmanned equipment pushes the spatial video information to be sent to a server side.
Since the spatial video information varies as described above, after obtaining the relevant spatial video information the unmanned device directly push-streams it to the server side.
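As an illustration of the push-stream step S300, the sketch below builds an ffmpeg command line that forwards raw camera frames to an RTMP ingest endpoint, one common way to implement push-streaming. The patent does not specify a streaming protocol or tooling; the RTMP URL, frame size, and encoder settings here are assumptions.

```python
def build_push_command(rtmp_url, width=640, height=480, fps=25):
    """Build an ffmpeg argument list that reads raw BGR frames from stdin
    and push-streams them to an RTMP server (hypothetical endpoint)."""
    return [
        "ffmpeg",
        "-f", "rawvideo",          # frames arrive as raw bytes on stdin
        "-pix_fmt", "bgr24",
        "-s", f"{width}x{height}",
        "-r", str(fps),
        "-i", "-",                 # read the frame stream from stdin
        "-c:v", "libx264",
        "-preset", "ultrafast",    # favour latency over compression for live use
        "-f", "flv",               # RTMP carries an FLV container
        rtmp_url,
    ]

cmd = build_push_command("rtmp://example-server/live/house-tour")
```

The unmanned device would open this command with `subprocess.Popen` and write each captured frame to its stdin.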
Further, referring to fig. 2, when the unmanned device moves on a preset path according to the movement track, the method further includes:
s210, acquiring a first parameter value between the obstacle and the first parameter value;
An obstacle is an object blocking the direction of travel of the unmanned device. The first parameter value describes a relationship, such as a distance, between the unmanned device and the obstacle.
S220, judging whether the first parameter value meets a first preset condition or not;
The first preset condition is a condition associated with the first parameter value. For example, when the first parameter value is the distance between the unmanned device and an obstacle, a first threshold related to that distance may be set so that the unmanned device can better perform a specified action in the traveling direction; when the first parameter value is smaller than the first threshold, the unmanned device is close to the obstacle. The first preset condition is a relationship between the first parameter value and the first threshold, for example the first parameter value being greater than the first threshold, smaller than the first threshold, or satisfying some other relationship, as determined by the corresponding application scenario.
And S230, when the first parameter value meets a first preset condition, executing a preset second state instruction.
Further, in the present invention, the second state instruction includes any one of a stop-traveling instruction, a reduce-traveling-speed instruction, or a reverse-traveling instruction. The three steps above are illustrated with a specific embodiment.
For example, when the distance between the unmanned device and the obstacle is obtained by ultrasound, assume the first threshold is 1 meter and the first preset condition is that the first parameter value is smaller than the first threshold. If the first parameter value obtained by ultrasound is 0.8 meters, it is smaller than the first threshold and the first preset condition is met, so the unmanned device executes the second state instruction: it stops immediately, reduces its traveling speed, or travels in reverse.
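The threshold check in this example can be sketched as a small decision function. The 1-meter default matches the example above; the function name and the choice of a stop instruction as the fallback are illustrative, since the method allows any of the three second state instructions.

```python
def second_state_instruction(distance_m, first_threshold_m=1.0):
    """Return a second state instruction when the first preset condition
    (distance below the first threshold) is met, otherwise None."""
    if distance_m < first_threshold_m:
        return "stop"   # could equally be "slow_down" or "reverse"
    return None         # condition not met: keep executing the first state instruction
```

With the ultrasonic reading of 0.8 m from the example, `second_state_instruction(0.8)` returns `"stop"`.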
In the present invention, there are many ways to obtain the first parameter value between the unmanned device and the obstacle; it may be obtained by laser ranging, visible-light ranging, or other means. The invention further discloses a method of ranging with a ranging spot, in which the acquired first parameter value is a parameter value related to the ranging spot.
Referring to fig. 3, the specific steps of obtaining the first parameter value by using the ranging spot and executing the second preset condition include:
s211, acquiring a frame image of the spatial video information;
Before this step is performed, note that it applies to the case where the spatial video information is a ranging spot image; the purpose of acquiring a frame image of the spatial video information is to measure the distance between the unmanned device and the obstacle. The specific ranging method is described in step S212.
S212, calculating the view proportion of the light spot image in the frame image;
In general, the device that emits the light beam is mounted directly on the unmanned device, together with a camera that shoots a spatial image containing the complete light spot ahead of the device. In this embodiment, the light-emitting device is a spot projector; the spot projector and the camera that shoots the spot image together form a ranging unit. The ranging unit always points in the direction of travel: the projector casts the light spot onto the obstacle, and the camera captures the image of the region containing the spot in real time or periodically.
After an image containing the complete light spot is collected, the complete spot area can be obtained through image processing, and because the camera is fixed and the area of the whole shot frame is fixed, the ratio of the spot to the whole image area can be obtained with a fixed algorithm. In this embodiment the shape of the projected spot is fixed; only the distance changes the area the spot casts on the obstacle. When the unmanned device is close to the obstacle, the projected spot area is small; when it is far away, the projected spot area is large. Therefore, with the projection direction fixed, a function relating the projected spot area to the distance is easily obtained. Correspondingly, with the camera's position and shooting angle fixed, the size of the captured image is fixed, so the relationship between the spot's share of the image area and the distance from the shooting location to the obstacle can be obtained from the prior art. The distance between the unmanned device and the obstacle can thus be obtained by judging the ratio of the spot's area in the image to the whole image.
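The ratio computation described above can be sketched as follows. The frame is modeled as a 2-D grid of grayscale values and the spot is segmented with a simple brightness threshold; a real device would use more robust segmentation, and the threshold value and function name are assumptions, not details from the patent.

```python
def spot_view_ratio(frame, spot_threshold=200):
    """Return S2/S1: the fraction of frame pixels occupied by the spot.

    frame is a 2-D list of grayscale pixel values; pixels at or above
    spot_threshold are counted as spot pixels (S2), and S1 is the total
    pixel count of the frame.
    """
    spot_pixels = sum(1 for row in frame for px in row if px >= spot_threshold)  # S2
    total_pixels = sum(len(row) for row in frame)                                # S1
    return spot_pixels / total_pixels

# A 10x10 frame whose top-left 2x5 block is a bright spot: 10 of 100 pixels
frame = [[255] * 5 + [0] * 5 for _ in range(2)] + [[0] * 10 for _ in range(8)]
```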
Further, in another embodiment, the distance between the unmanned device and the obstacle may be measured by emitting visible light, in the same way as the spot ranging described above; alternatively, laser ranging may be used, calculating the distance between the laser emitter and the obstacle by the pulse method or the phase method. These are only some of the ranging methods disclosed in the present invention; the invention is not limited to them, and other methods may be used.
And S213, executing a preset second state instruction when the view scale is larger than the first threshold.
In this embodiment, the first threshold is a critical value of the view ratio of the spot image in the frame image of step S212. As can be seen from step S212, the distance between the current unmanned device and the obstacle ahead can be determined from the ratio of the spot image within the frame. When the image of the surroundings at the current position is also to be shot, an unmanned device that is too close to the obstacle is hindered in turning, advancing, and similar actions, and the obstacle blocks part of the visual angle, which is unfavorable for shooting the surroundings. An optimal distance between the unmanned device and the obstacle therefore needs to be set, so that the image of the surroundings can be shot well while ensuring that the unmanned device does not collide with the obstacle.
The view ratio of the spot image in the frame image obtained in step S212 yields the distance between the current position of the unmanned device and the obstacle ahead; the view ratio corresponding to the critical distance between the unmanned device and the obstacle is the first threshold. Let the total area of the frame image be S1 and the area of the spot image within it be S2. The actual distance L between the unmanned device and the obstacle is a mapping of the ratio S2/S1 through a corresponding distance function f(n), i.e., L = (S2/S1) × f(n). In this relationship the functional form of f(n) is fixed and the variable is S2/S1, and S2/S1 and L are inversely related: when the distance L is small, S2/S1 is large, and when L is large, S2/S1 is small. Therefore, when S2/S1 is larger than the first threshold, the corresponding distance L is too small, and the preset second state instruction is executed.
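A minimal numeric sketch of the relation above, assuming for illustration a simple inverse calibration consistent with the described behaviour (larger S2/S1 implies smaller L). The calibration constant k and the threshold value are made-up figures; a real device would fit f(n) from calibration measurements.

```python
def estimate_distance(view_ratio, k=0.05):
    """Illustrative inverse calibration: larger S2/S1 -> smaller distance L.
    k is an assumed calibration constant, not a value from the patent."""
    return k / view_ratio

def too_close(view_ratio, ratio_threshold=0.1):
    """First preset condition on the ratio: exceeding the ratio threshold
    means the corresponding distance L is too small."""
    return view_ratio > ratio_threshold
```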
In one embodiment, the second state instruction includes any one of: stopping travel, reducing the traveling speed, or traveling in reverse. For example, in one embodiment, when the ratio exceeds the first threshold, a second state instruction to stop traveling is executed, avoiding a collision between the unmanned device and the obstacle.
Further, in another embodiment, when the ratio exceeds the first threshold, the traveling speed is reduced so that the device approaches the obstacle more slowly. In this embodiment a second threshold may additionally be set, so that the unmanned device moves slowly to the position corresponding to the second threshold and then performs another action such as stopping or turning. Note that the distance between the unmanned device and the obstacle may be acquired in real time or at a fixed frequency. When the distance is sampled at a fixed frequency, the time interval between samples means the measured distance may already be smaller than the preset value by the time it is detected. A first threshold is therefore set first: once the ratio exceeds it, the traveling speed is reduced and the unmanned device is controlled to adjust its distance to the obstacle in smaller movement units at a corresponding detection frequency, so that it reaches the position corresponding to the second threshold more accurately. Further, in another embodiment, when the ratio exceeds the first threshold, a second state instruction of reverse travel is executed so that the unmanned device moves backwards while the ratio S2/S1 is monitored continuously, until S2/S1 is less than or equal to the first threshold, at which point the device stops or performs another action.
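As a minimal sketch, the two-threshold logic above can be expressed as follows. All function and instruction names here are illustrative, not taken from the embodiment; the only assumption carried over from the text is that the view ratio S2/S1 grows as the device approaches the obstacle, so the stop position (second threshold) corresponds to a larger ratio than the slow-down position (first threshold).

```python
def second_state_instruction(spot_area, frame_area, first_threshold, second_threshold):
    """Map the view ratio S2/S1 of the ranging spot to a second state instruction.

    The ratio grows as the unmanned device approaches the obstacle, so
    second_threshold (the stop position) is larger than first_threshold.
    """
    ratio = spot_area / frame_area      # S2 / S1
    if ratio >= second_threshold:
        return "stop_travel"            # reached the target position: stop or turn
    if ratio > first_threshold:
        return "reduce_speed"           # approach the obstacle more slowly
    return "keep_first_state"           # safe distance: keep the first state instruction
```

For example, with a first threshold of 0.2 and a second threshold of 0.5, a spot occupying 30% of the frame triggers `"reduce_speed"`, while 60% triggers `"stop_travel"`.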
The distance corresponding to the first threshold may be the safest distance, for example a distance suitable for the unmanned device to turn, or one at which collision is unlikely. It may also be the distance best suited for capturing images of the surrounding space, because when the unmanned device is too close to an obstacle, the obstacle's blocking easily interferes with capturing the spatial image. The distance value of the first threshold is therefore set, and when the distance between the unmanned device and the obstacle falls below it, the unmanned device is controlled to retreat until the distance is greater than or equal to the critical value, reaching the optimal capture range. The unmanned device can then film the surroundings more comprehensively, and the user can observe the most complete spatial image at the current distance.
Further, the second state instruction also includes sending warning information. When S2/S1 exceeds the preset first threshold, the unmanned device is at risk of colliding with the obstacle ahead, or the next action is hindered; warning information can therefore be sent to a remote terminal or an associated server to remind a user or administrator to inspect the unmanned device or take measures to adjust it. The warning information may control an alarm to sound, place an emergency call, send an emergency mail, and the like.
In the present invention, referring to fig. 4, before the step of obtaining the spatial video information in the view range in the traveling direction according to the first status instruction, the method further includes:
S110, acquiring a preset traveling path and real-time position information;
the step can be specifically applied to some scenes, for example, in an automatic house-watching system, the unmanned device is remotely controlled, and the surrounding environment where the unmanned device is located is shot so as to carry out automatic house-watching. Because the number of objects in a room is large, the unmanned device can be used for automatic walking shooting completely, collision is easy to happen, the unmanned device capable of automatically walking usually can move according to a specified route, different clients have different preferences and observation habits, and the clients cannot check any scene in the current environment at will. Therefore, the method for controlling the unmanned equipment to control the live broadcast of the unmanned equipment can solve the problem. However, in a room, the unmanned aerial vehicle cannot be moved freely due to restrictions of a layout and decoration, for example, when the unmanned aerial vehicle travels to a balcony without glass or a guardrail having a gap, the unmanned aerial vehicle is liable to fall from a gap of the guardrail of the balcony, is unsuitable to go to an edge of the balcony, or is equipped with a toilet for squatting, the unmanned aerial vehicle is liable to fall, or is liable to fall off from a stair at a next floor in a duplex room, and the like. In a room, there are some unsafe places that are not suitable for the unmanned device to move forward, so in this embodiment, a travel path suitable for the current environment can be further established to ensure that the unmanned device can move safely and can completely capture images around the current environment.
S120, judging whether the real-time position information is in the advancing path;
S130, when the real-time position information is not in the travel path, the first state instruction is forbidden to be executed.
The travel paths available at a given position are not all the same. To adapt to different floor plans, a travel path can be set for each room. When the unmanned device starts work, its real-time position is obtained and the travel path of the corresponding room is matched to it. When a first state instruction is received, it is first determined whether the unmanned device is currently on the travel path: if so, the device moves in the direction indicated by the first state instruction; if not, the current position is an illegal area, that is, an unsafe area, and the first state instruction is not executed. Furthermore, the unmanned device can be controlled to emit an alarm signal reminding a worker or remote operator that the current position is illegal, prompting them to move the unmanned device out of the illegal area.
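The gating of the first state instruction by the travel path can be sketched as follows, in a simplified model where the travel path of each room is a set of grid cells; the function name, the cell representation, and the alarm message are illustrative assumptions, not part of the embodiment.

```python
def gate_first_state_instruction(position, room_paths, room):
    """Permit the first state instruction only when the device's
    real-time position lies on the travel path of the matched room.

    room_paths maps each room name to the set of (x, y) cells of its
    preset travel path. Returns (allowed, alarm_message).
    """
    travel_path = room_paths[room]          # match the path to the room
    if position in travel_path:
        return True, None                   # legal area: execute the instruction
    return False, "current position is an illegal area"  # forbid and alarm

room_paths = {"living_room": {(0, 0), (0, 1), (1, 1)}}
print(gate_first_state_instruction((0, 1), room_paths, "living_room"))  # (True, None)
```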
Further, referring to fig. 5, in the present invention, another way to limit the unmanned device from executing the first status instruction is:
S140, acquiring a preset traveling path and real-time position information;
s150, pre-judging whether the end position represented by the first state instruction is in the advancing path according to the real-time position information and the boundary distance of the advancing path;
and S160, when the end position is not in the travel path, prohibiting the first state instruction from being executed.
This method is similar to the one above: a travel path must be preset, and the current position information of the unmanned device is obtained to match the corresponding travel path. The difference is that the device must determine, from the direction indicated by the first state instruction, whether both its current position and the end position it is about to reach lie within the boundary of the travel path. For example, if the first state instruction is to move forward at 10 cm/s for 10 seconds, the movement covers 100 cm, so before executing the instruction it is judged in advance whether the end position 100 cm ahead of the current position lies within the preset travel path. The first state instruction is executed only when that end point is within the travel path; if executing it would take the unmanned device off the travel path, execution is prohibited, avoiding the operating risk the device would otherwise incur.
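The pre-judgment of the end position can be sketched as follows, assuming a rectangular path boundary and a unit direction vector; all names are hypothetical, and only the worked figures (10 cm/s for 10 s covering 100 cm) come from the text above.

```python
def predicted_end(position, direction, speed_cm_s, duration_s):
    """End point of a first state instruction: current position plus
    speed x duration along the unit direction vector."""
    dist = speed_cm_s * duration_s
    return (position[0] + direction[0] * dist,
            position[1] + direction[1] * dist)

def allow_first_state_instruction(position, direction, speed_cm_s, duration_s, lo, hi):
    """Execute the instruction only when the predicted end point stays
    inside the rectangular travel-path boundary [lo, hi]."""
    x, y = predicted_end(position, direction, speed_cm_s, duration_s)
    return lo[0] <= x <= hi[0] and lo[1] <= y <= hi[1]

# The example from the text: 10 cm/s forward for 10 s covers 100 cm.
print(allow_first_state_instruction((0, 0), (0, 1), 10, 10, (-50, -50), (200, 200)))  # True
```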
Furthermore, the travel path mentioned in the two solutions above is a preset electronic path that can be transmitted to the moving device as data. In the present invention, however, the preset travel path is not limited to such an electronic path: it may also be an electromagnetic track laid in the house, with the moving unmanned device carrying a Hall sensor for detecting the track. The Hall sensor detects whether the device is on the electromagnetic track, and this in turn determines whether the first state instruction is executed. Specifically, when the unmanned device is an unmanned robot, Hall sensors are arranged on both of its sides; when the sensor on one side cannot detect the electromagnetic field, or the field it detects is too weak, the robot shifts toward the opposite side until that sensor receives the electromagnetic field signal again. House viewing thus becomes more convenient and intelligent through this form of automatic driving.
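The Hall-sensor correction can be sketched as follows; the field threshold, function name, and the returned action labels are illustrative assumptions, and only the shift-toward-the-opposite-side behavior comes from the text.

```python
def track_correction(left_field, right_field, min_field):
    """Correct drift relative to the electromagnetic track.

    When the Hall sensor on one side no longer detects the field (or
    the field is too weak), the device shifts toward the opposite side
    until that sensor receives the field signal again.
    """
    left_ok = left_field >= min_field
    right_ok = right_field >= min_field
    if left_ok and right_ok:
        return "on_track"        # both sensors see the track: keep going
    if not left_ok and right_ok:
        return "shift_right"     # left sensor lost the field
    if left_ok and not right_ok:
        return "shift_left"      # right sensor lost the field
    return "halt"                # neither sensor sees the track
```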
The present invention also discloses a device for controlling the live broadcast of the unmanned device, please refer to fig. 6, which includes:
the acquisition module 100: used by the unmanned device to obtain a first state instruction to be executed, where the first state instruction is used for controlling the movement track of the unmanned device;
the processing module 200: used by the unmanned device to move on a preset path according to the movement track and, during the movement, to acquire spatial video information in the direction of movement through an image sensor on the unmanned device;
the execution module 300: used by the unmanned device to push-stream the spatial video information to a server side.
The first state instruction includes command information for controlling the direction of movement, that is, for controlling the unmanned device to move in a given direction. Furthermore, a direction input unit is provided on the remote terminal, and the unmanned device can be controlled to move through the direction input unit or by setting the relevant direction parameters. The user can send any direction instruction that makes the unmanned device move, controlling it to move as the user intends.
The spatial video information is of several types, including the image of the ranging spot projected in the traveling direction and the image of the surroundings at the current location. When the processing module 200 receives the first state instruction, the ranging spot image and the surrounding-environment image may be acquired at the same time, or the ranging spot image may be acquired first and the environment image acquired only under certain conditions. Simultaneous acquisition usually applies when the camera device that captures the ranging spot image and the camera device that captures the surroundings are two different devices, so the two capture different images that complement each other. Separate acquisition of the two image types may apply when a single camera device is shared. In that case, the ranging spot image is acquired first to measure the distance between the current position and the obstacle, ensuring the unmanned device operates within a safe distance range, and the surrounding-environment image is captured automatically once the unmanned device has stopped or the current traveling direction is safe.
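The two acquisition modes described above can be sketched as follows; the function name and return labels are illustrative assumptions.

```python
def acquisition_plan(separate_cameras, direction_is_safe):
    """Decide how the ranging-spot image and the surrounding-environment
    image are captured.

    With two separate camera devices, both images are captured at once
    and complement each other; with one shared device, the ranging-spot
    image is captured first, and the environment image only once the
    device has stopped or the current traveling direction is safe.
    """
    if separate_cameras:
        return ["spot", "environment"]   # simultaneous capture
    if direction_is_safe:
        return ["spot", "environment"]   # spot first, then the environment
    return ["spot"]                      # ranging takes priority for safety
```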
Any captured image information can be push-streamed by the execution module 300 to the server for storage, or sent through the server to a remote client for real-time viewing.
Further, the device for controlling the live broadcast of the unmanned device further includes:
a first acquisition module: used for acquiring a first parameter value between the unmanned device and an obstacle;
a first processing module: used for judging whether the first parameter value meets a first preset condition;
a first execution module: used for executing a preset second state instruction when the first parameter value meets the first preset condition.
Further, the spatial video information includes a spot image of a ranging spot projected in the traveling direction by a spot projector on the unmanned device, and the step of acquiring a first parameter value between the unmanned device and an obstacle specifically includes:
acquiring a frame image of the spatial video information;
and calculating the view proportion of the light spot image in the frame picture image.
Combining the above steps: the first acquisition module is used for acquiring a frame image of the spatial video information; the first processing module is used for calculating the view ratio of the light spot image in the frame image; the first execution module is used for executing the preset second state instruction when the view ratio is greater than the first threshold.
The first acquisition module, the first processing module and the first execution module are mainly used for measuring the distance between the unmanned equipment and the obstacle according to the light spot image of the ranging light spot in the spatial video information.
A device for projecting the ranging spot is mounted on the unmanned device, as is a device for capturing spot images containing the complete spot. The closer the unmanned device is to the obstacle ahead, the larger the proportion of the captured frame image occupied by the complete spot. From this relationship, a first threshold can be set for the area ratio of the complete spot to the whole frame image, which corresponds to a distance threshold between the unmanned device and the obstacle. Through this first distance threshold, the unmanned device is controlled to execute the second state instruction.
The second state instruction includes any one of a stop-travel command, a reduce-travel-speed command, or a reverse-travel command, applied to different situations as appropriate. The application and control method are the same as in the method for controlling the live broadcast of the unmanned device.
Further, the second state instruction also includes sending warning information. When S2/S1 exceeds the preset first threshold, the unmanned device is at risk of colliding with the obstacle ahead, or the next action is hindered; warning information can therefore be sent to a remote terminal or an associated server to remind a user or administrator to inspect the unmanned device or take measures to adjust it. The warning information may control an alarm to sound, place an emergency call, send an emergency mail, and the like, so as to improve safety.
Further, the device for controlling the live broadcast of the unmanned device further includes:
a second acquisition module: used for acquiring a preset travel path and real-time position information;
a second processing module: used for judging whether the real-time position information is on the travel path;
a second execution module: used for prohibiting execution of the first state instruction when the real-time position information is not on the travel path.
If the device were allowed to move arbitrarily from its current position, the layout and terrain of the site would make some movements unsafe, so a travel path must be set for each different layout and terrain. For example, to adapt to different floor plans, a travel path may be set for each room. When the unmanned device starts work, its real-time position is obtained and the travel path of the corresponding room is matched to it. When a first state instruction is received, it is first determined whether the unmanned device is currently on the travel path: if so, the device moves in the direction indicated by the first state instruction; if not, the current position is an illegal area, that is, an unsafe area, and the first state instruction is not executed. Furthermore, the unmanned device can be controlled to emit an alarm signal reminding a worker or remote operator that the current position is illegal, prompting them to move the unmanned device out of the illegal area.
Further, the present invention includes another structure for restricting the unmanned device from executing the first status instruction, including:
a third acquisition module: used for acquiring a preset travel path and real-time position information;
a third processing module: used for judging, from the real-time position information and the boundary of the travel path, whether the end position indicated by the first state instruction lies within the travel path;
a third execution module: used for prohibiting execution of the first state instruction when the end position is not within the travel path.
Similar to the method above, this approach also requires presetting a travel path, acquiring the current position information of the unmanned device, and matching the corresponding travel path. The difference is that the device must judge, from the direction indicated by the first state instruction, whether its current position and the end position it is about to reach are within the boundary of the travel path. The first state instruction is executed only when the end point lies within the travel path; if executing it would take the unmanned device off the travel path, execution is prohibited, avoiding the operating risk the device would otherwise incur.
The invention also discloses computer equipment which comprises a memory and a processor, wherein computer readable instructions are stored in the memory, and when the computer readable instructions are executed by the processor, the processor is enabled to execute any one of the above methods for controlling the live broadcast of the unmanned equipment.
FIG. 7 is a block diagram of a basic structure of a computer device according to an embodiment of the present invention.
The computer device includes a processor, a non-volatile storage medium, a memory, and a network interface connected by a system bus. The non-volatile storage medium of the computer device stores an operating system, a database and computer-readable instructions; the database can store control information sequences, and the computer-readable instructions, when executed by the processor, cause the processor to implement the method for controlling the live broadcast of the unmanned device. The processor of the computer device provides the computation and control capability supporting the operation of the whole computer device. The memory of the computer device may store computer-readable instructions that, when executed by the processor, cause the processor to perform the method for controlling the live broadcast of the unmanned device. The network interface of the computer device is used for connecting and communicating with the terminal. Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or arrange the components differently.
The present invention also provides a storage medium storing computer-readable instructions, which when executed by one or more processors, cause the one or more processors to perform the method for controlling a live broadcast of an unmanned device according to any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the computer program is executed. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages that are not necessarily completed at the same moment but may be performed at different times, and not necessarily in sequence: they may be performed in turn or in alternation with other steps, or with at least part of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present invention. It should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and these improvements and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (9)

1. A method for controlling a live broadcast of an unmanned device, comprising:
the method comprises the steps that the unmanned equipment obtains a first state instruction to be executed, wherein the first state instruction is used for controlling the movement track of the unmanned equipment;
the unmanned equipment acquires a preset travelling path and real-time position information, judges whether the real-time position information is in the travelling path, and forbids to execute the first state instruction when the real-time position information is not in the travelling path, otherwise, executes the first state instruction;
the unmanned equipment moves on a preset path according to the moving track, and spatial video information of the unmanned equipment in the moving direction in the moving process is acquired through an image sensor on the unmanned equipment;
and the unmanned equipment pushes and streams the spatial video information to a server side.
2. The method for controlling the live broadcast of the unmanned aerial vehicle as claimed in claim 1, wherein when the unmanned aerial vehicle moves on a preset path according to the movement track, the method further comprises:
acquiring a first parameter value between the unmanned device and an obstacle;
judging whether the first parameter value meets a first preset condition or not;
and when the first parameter value meets a first preset condition, executing a preset second state instruction.
3. The method for controlling the live broadcast of the unmanned device according to claim 2, wherein the spatial video information comprises a spot image of a ranging spot projected in the traveling direction by a spot projector on the unmanned device, and the step of acquiring a first parameter value between the unmanned device and the obstacle comprises:
acquiring a frame image of the spatial video information;
and calculating the view proportion of the light spot image in the frame picture image.
4. The method for controlling the live broadcast of the unmanned aerial vehicle as claimed in claim 2 or 3, wherein the second status instruction comprises: any one of a stop travel command, a decrease travel speed command, or a reverse travel command.
5. The method for controlling the live broadcast of the unmanned device according to claim 1, wherein before executing the first state instruction, the method further comprises:
acquiring a preset travelling path and real-time position information;
judging whether the end point position represented by the first state instruction is in the advancing path or not according to the boundary distance between the real-time position information and the advancing path;
inhibiting execution of the first status instruction when the end position is not in the travel path.
6. An apparatus for controlling live broadcast of an unmanned device, comprising:
an acquisition module: used for the unmanned device to obtain a first state instruction to be executed, wherein the first state instruction is used for controlling the movement track of the unmanned device;
a judging module: acquiring a preset travelling path and real-time position information, judging whether the real-time position information is in the travelling path, and if not, forbidding to execute the first state instruction, otherwise, executing the first state instruction;
a processing module: the unmanned equipment moves on a preset path according to the moving track and obtains spatial video information in the moving direction of the unmanned equipment in the moving process through an image sensor on the unmanned equipment;
an execution module: and the unmanned equipment is used for pushing and sending the spatial video information to a server side.
7. The apparatus for controlling the live broadcast of the unmanned device according to claim 6, further comprising:
a first acquisition module: used for acquiring a first parameter value between the unmanned device and an obstacle;
a first processing module: used for judging whether the first parameter value meets a first preset condition;
a first execution module: used for executing a preset second state instruction when the first parameter value meets the first preset condition.
8. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to carry out the steps of the method of controlling a live broadcast of an unmanned device as claimed in any of claims 1 to 5.
9. A storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the method for controlling the live broadcast of the unmanned device according to any one of claims 1 to 5.
CN201810502117.8A 2018-05-23 2018-05-23 Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium Active CN108805928B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810502117.8A CN108805928B (en) 2018-05-23 2018-05-23 Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium
PCT/CN2018/102874 WO2019223159A1 (en) 2018-05-23 2018-08-29 Method and apparatus for controlling live broadcast of unmanned device, computer device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810502117.8A CN108805928B (en) 2018-05-23 2018-05-23 Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN108805928A CN108805928A (en) 2018-11-13
CN108805928B true CN108805928B (en) 2023-04-18

Family

ID=64092902

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810502117.8A Active CN108805928B (en) 2018-05-23 2018-05-23 Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN108805928B (en)
WO (1) WO2019223159A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113848986A (en) * 2021-11-03 2021-12-28 广州港集团有限公司 Unmanned aerial vehicle safety inspection method and system
CN114494848B (en) * 2021-12-21 2024-04-16 重庆特斯联智慧科技股份有限公司 Method and device for determining vision path of robot
CN115599126A (en) * 2022-12-15 2023-01-13 深之蓝海洋科技股份有限公司(Cn) Automatic collision-prevention wireless remote control unmanned submersible and automatic collision-prevention method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2671492A2 (en) * 2012-06-08 2013-12-11 LG Electronics, Inc. Robot cleaner, controlling method of the same, and robot cleaning system
CN107343153A (en) * 2017-08-31 2017-11-10 王修晖 A kind of image pickup method of unmanned machine, device and unmanned plane
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment
CN107807639A (en) * 2017-10-20 2018-03-16 上海犀木信息技术有限公司 Room mobile robot is seen in a kind of mobile platform, mobile platform system and interior

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102094347B1 (en) * 2013-07-29 2020-03-30 삼성전자주식회사 Auto-cleaning system, cleaning robot and controlling method thereof
CN105222760A (en) * 2015-10-22 2016-01-06 一飞智控(天津)科技有限公司 The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method


Also Published As

Publication number Publication date
CN108805928A (en) 2018-11-13
WO2019223159A1 (en) 2019-11-28

Similar Documents

Publication Publication Date Title
US20200333789A1 (en) Information processing apparatus, information processing method, and medium
US20230132171A1 (en) Integrative security system and method
US11875579B2 (en) Parking objects detection system using live inventory management
WO2019206270A1 (en) Distance measurement method, intelligent control method and apparatus, electronic device and storage medium
CN108805928B (en) Method and device for controlling live broadcast of unmanned equipment, computer equipment and storage medium
TWI659398B (en) Intrusion detection with directional sensing
JP2024032829A (en) Monitoring system, monitoring method and monitoring program
US20170123413A1 (en) Methods and systems for controlling an unmanned aerial vehicle
US20040119819A1 (en) Method and system for performing surveillance
US10742935B2 (en) Video surveillance system with aerial camera device
US10728505B2 (en) Monitoring system
US20100013917A1 (en) Method and system for performing surveillance
US8144199B2 (en) Moving object automatic tracking apparatus
JP6080568B2 (en) Monitoring system
KR101959366B1 (en) Mutual recognition method between UAV and wireless device
EP2946283B1 (en) Delay compensation while controlling a remote sensor
KR20170100892A (en) Position Tracking Apparatus
KR20150003893U (en) An Automated System for Military Surveillance and Security utilizing RADAR and DRONE
US9948897B2 (en) Surveillance camera management device, surveillance camera management method, and program
KR20180040255A (en) Airport robot
KR20180068435A (en) Visual observation system and visual observation method using the same
WO2005120070A2 (en) Method and system for performing surveillance
US20220397915A1 (en) Systems And Methods For Service Drone Landing Zone Operations
JP7252788B2 (en) shooting system
KR102334509B1 (en) Mutual recognition method between UAV and wireless device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant