Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description, in the claims of the present application, and in the above-described drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments described herein can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and any variations thereof are intended to cover a non-exclusive inclusion: a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that descriptions referring to "first", "second", etc. in the present invention are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In addition, the technical solutions of the various embodiments may be combined with one another, provided that the combination can be realized by a person skilled in the art; when the technical solutions are contradictory or cannot be realized, such a combination should be considered not to exist and falls outside the protection scope of the present invention.
Please refer to fig. 1 and fig. 3 in combination, which are a flowchart of a remote monitoring method and a schematic diagram of an application scenario of the remote monitoring method according to an embodiment of the present invention. The remote monitoring method is used for remotely monitoring transportation equipment so as to guarantee the operation safety of the transportation equipment. Transportation equipment includes, but is not limited to, cars, motorcycles, trucks, sport utility vehicles (SUVs), recreational vehicles (RVs), aircraft, and the like. In the present embodiment, the remote monitoring method is used for remotely monitoring the unmanned vehicle 100, where the unmanned vehicle 100 has a Level 5 automation system. Level 5 is referred to as "full automation": a vehicle with a Level 5 automation system can drive itself in any legal, drivable road environment; a human driver only needs to set a destination and start the system, and the vehicle travels to the specified place along an optimized route. The remote monitoring method specifically comprises the following steps.
Step S102: control the sensor to detect the surrounding environment to obtain sensing data. In this embodiment, the remote monitoring platform 30 controls the sensor 10 to detect the surrounding environment to obtain the sensing data, where the sensor 10 is provided on the unmanned vehicle 100. The sensor 10 includes, but is not limited to, a radar, a lidar, a thermal imager, an image sensor, an infrared sensor, an ultrasonic sensor, and other devices with a sensing function. Accordingly, the sensing data includes, but is not limited to, radar sensing data, lidar sensing data, thermal imager sensing data, image sensor sensing data, infrared sensor sensing data, ultrasonic sensor sensing data, and the like. In this embodiment, the remote monitoring platform 30 is communicatively coupled to the sensor 10 through a wireless connection, which includes but is not limited to Wi-Fi, a 4G network, a 5G network, and the like.
Step S104: control the processing device to perform perception processing on the sensing data to obtain environment data. The processing device 20 is provided on the unmanned vehicle 100 and is electrically connected to the sensor 10. The various sensors 10 acquire sensing data and each transmit the sensing data to the processing device 20. The remote monitoring platform 30 controls the processing device 20 to perform perception processing on the sensing data according to a fusion perception algorithm to obtain the environment data. In the present embodiment, the fusion perception algorithm includes, but is not limited to, a pre-fusion perception algorithm, a post-fusion perception algorithm, and a hybrid fusion perception algorithm. When the sensing data is processed according to the pre-fusion perception algorithm, the processing device 20 first performs data synchronization on the various sensing data, and then performs perception processing on the synchronized data to obtain the environment data. When the sensing data is processed according to the post-fusion perception algorithm, the processing device 20 first performs perception processing on the sensing data of the different sensors 10 to obtain corresponding sensor target data, and then performs data fusion on the various sensor target data to obtain the environment data. When the sensing data is processed according to the hybrid fusion perception algorithm, the processing device 20 processes the various sensing data by mixing the pre-fusion perception algorithm and the post-fusion perception algorithm to obtain the environment data. In some possible embodiments, the processing device 20 may also perform perception processing on the sensing data by combining multiple fusion perception algorithms.
Combining multiple fusion perception algorithms means using the pre-fusion perception algorithm, the post-fusion perception algorithm, and the hybrid fusion perception algorithm in parallel, or combining them in a certain manner to obtain the environment data. In this embodiment, the remote monitoring platform 30 is communicatively coupled to the processing device 20 through a wireless connection, which includes but is not limited to Wi-Fi, a 4G network, a 5G network, and the like.
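The distinction between pre-fusion and post-fusion can be illustrated with a minimal sketch. The Python code below is not from the patent; `perceive` and `fuse_targets` are hypothetical placeholders standing in for the actual perception model and fusion logic.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Detection:
    """A perceived target with a position and a confidence score."""
    x: float
    y: float
    score: float

def perceive(samples: List[dict]) -> List[Detection]:
    # Placeholder perception model: turn each raw sample into a Detection.
    return [Detection(s["x"], s["y"], s.get("score", 1.0)) for s in samples]

def fuse_targets(per_sensor: List[List[Detection]]) -> List[Detection]:
    # Placeholder fusion: keep the highest-score detection per rounded position.
    best = {}
    for detections in per_sensor:
        for d in detections:
            key = (round(d.x, 1), round(d.y, 1))
            if key not in best or d.score > best[key].score:
                best[key] = d
    return list(best.values())

def pre_fusion(sensing: Dict[str, List[dict]]) -> List[Detection]:
    """Pre-fusion: synchronize/combine the raw data of all sensors first,
    then run a single perception pass over the combined data."""
    combined = [sample for samples in sensing.values() for sample in samples]
    return perceive(combined)

def post_fusion(sensing: Dict[str, List[dict]]) -> List[Detection]:
    """Post-fusion: perceive each sensor's data separately to obtain
    per-sensor target data, then fuse the target lists."""
    return fuse_targets([perceive(samples) for samples in sensing.values()])
```

A hybrid fusion scheme would mix the two, for example pre-fusing camera and lidar data while post-fusing radar targets.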
Step S106: control the processing device to perform visualization processing on the environment data to obtain a visualization result. In this embodiment, the remote monitoring platform 30 controls the processing device 20 to perform visualization processing on the environment data to obtain the visualization result. The visualization processing may be rendering processing on the environment data to generate visualized video data, i.e., the visualization result; the visualization processing may also be another existing process, and is not limited herein.
In this embodiment, the remote monitoring platform 30 sends a visualization instruction to the processing device 20, and controls the processing device 20 to perform visualization processing on the environment data according to the visualization instruction to obtain the visualization result. That is, the processing device 20 performs visualization processing on the environment data after receiving the visualization instruction. The visualization instruction may be generated by an operator of the remote monitoring platform 30 using an external device electrically connected to the remote monitoring platform 30, or may be generated directly by the remote monitoring platform 30 at regular intervals, which is not limited herein. The external device includes, but is not limited to, a mouse, a keyboard, a voice input device, and the like. In some possible embodiments, after the processing device 20 performs perception processing on the sensing data to obtain the environment data, it may visualize the environment data directly. That is, the processing device 20 may perform visualization processing on the environment data without receiving a visualization instruction.
In this embodiment, the remote monitoring platform 30 controls the processing device 20 to perform visualization processing on the environment data to obtain a single channel of visualization result. It is understood that each sensor 10 produces one channel of sensing data, and the various sensors 10 transmit multiple channels of sensing data to the processing device 20 simultaneously. The processing device 20 performs perception processing on the multiple channels of sensing data simultaneously to obtain corresponding multiple channels of environment data, and performs visualization processing on the multiple channels of environment data to obtain a single channel of visualization result, which it transmits to the remote monitoring platform 30.
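As a rough illustration of compositing multiple channels into a single result, the sketch below merges per-sensor object positions into one frame; the occupancy-grid representation is an assumption for illustration, not the patent's actual rendering pipeline.

```python
def render_single_channel(env_channels, width=8, height=8):
    """Composite multiple channels of environment data (one list of
    (x, y) object positions per sensor) into one frame, so that only
    a single channel needs to be transmitted to the monitoring platform."""
    frame = [[0] * width for _ in range(height)]  # 0 = empty, 1 = occupied
    for channel in env_channels:
        for (x, y) in channel:
            if 0 <= x < width and 0 <= y < height:
                frame[y][x] = 1
    return frame
```

However many sensor channels feed in, the output is one frame of fixed size, which is why the bandwidth saving described below does not grow with the number of sensors.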
In some possible embodiments, the visualization instruction may include several designated sensors. The remote monitoring platform 30 may control the processing device 20 to obtain the designated sensors from the visualization instruction, select designated environment data matching the designated sensors from the environment data, and perform visualization processing on the designated environment data to obtain the visualization result. That is, according to the visualization instruction, the processing device 20 may select part of the environment data to be visualized instead of visualizing all of the environment data. For example, when the operator of the remote monitoring platform 30 only wants to monitor the environment in front of the unmanned vehicle 100, the sensors 10 provided at the front end of the unmanned vehicle 100 may be added as designated sensors in the visualization instruction. After receiving the visualization instruction, the processing device 20 selects the environment data of the corresponding sensors 10 for visualization processing and forms a visualization result about the area in front of the unmanned vehicle 100.
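The designated-sensor filtering described above can be sketched as follows. This is a minimal sketch under assumed names: the instruction format, sensor names, and the `render` placeholder are hypothetical, not from the patent.

```python
def render(channels):
    # Placeholder for the actual rendering pipeline: report which
    # channels were composited into the visualization result.
    return sorted(channels)

def visualize(env_data, instruction=None):
    """env_data maps a sensor name to its channel of environment data.
    If the visualization instruction designates sensors, only the
    matching channels are visualized; otherwise all channels are."""
    designated = (instruction or {}).get("sensors")
    selected = {name: data for name, data in env_data.items()
                if designated is None or name in designated}
    return render(selected)
```

For the front-monitoring example, the instruction would list only the front-end sensors, and the rear channels would be skipped entirely, saving processing work.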
Step S108: acquire the visualization result. The remote monitoring platform 30 obtains the visualization result.
Step S110: monitor according to the visualization result. It is understood that the remote monitoring platform 30 may monitor the unmanned vehicle 100 in real time through the visualization result. When the unmanned vehicle 100 encounters a condition it cannot handle during driving, or when a vehicle part suddenly fails while the unmanned vehicle 100 is driving, the remote monitoring platform 30 can control the unmanned vehicle 100 according to the visualization result. For example, the unmanned vehicle 100 analyzes the environment data during driving and determines that the road ahead is too narrow to drive through; the unmanned vehicle 100 will then stop or change its route. However, by judging the width of the road ahead through the visualization result, the operator of the remote monitoring platform 30 may determine that the unmanned vehicle 100 can pass. The operator may then issue an instruction to the unmanned vehicle 100 through the remote monitoring platform 30 to control the unmanned vehicle 100 to travel through the road ahead.
In the above embodiment, the remote monitoring platform controls the processing device to perform visualization processing on the environment data to obtain a visualization result, through which the remote monitoring platform can monitor the unmanned vehicle. Furthermore, when the unmanned vehicle has, or is about to have, an accident, the remote monitoring platform can control the unmanned vehicle according to the visualization result, thereby guaranteeing the driving safety of the unmanned vehicle. In addition, in this embodiment the multiple channels of environment data are first visualized to obtain a single channel of visualization result, and this single channel is transmitted to the remote monitoring platform, instead of transmitting the multiple channels of environment data to the remote monitoring platform and having the platform visualize all of the environment data. This greatly reduces the amount of transmitted data and therefore the occupied network bandwidth. Meanwhile, the remote monitoring platform can select several designated sensors through the visualization instruction, so that only the environment data corresponding to the designated sensors is visualized, which reduces the computational load of the processing device.
In some possible embodiments, the remote monitoring method may also be used for remotely monitoring the machine equipment, so as to ensure stable operation of the machine equipment.
Please refer to fig. 2 and fig. 4 in combination, which are a sub-flowchart of the remote monitoring method and a sub-schematic diagram of an application scenario of the remote monitoring method according to an embodiment of the present invention. The remote monitoring method further comprises the following steps.
Step S202: send an adjustment instruction to the sensor. In this embodiment, the remote monitoring platform 30 sends adjustment instructions to several adjustable sensors among the sensors 10, where an adjustable sensor is a sensor 10 whose field-of-view direction is adjustable. In the present embodiment, the adjustable sensor includes, but is not limited to, a radar, a lidar, an image sensor, and the like. It will be appreciated that every sensor 10 has a certain field-of-view direction. When the sensor 10 is mounted and fixed on the unmanned vehicle 100, it can only detect the surrounding environment in its field-of-view direction and obtain the corresponding sensing data.
Step S204: control the sensor to adjust its field-of-view direction according to the adjustment instruction. The adjustment instruction comprises a designated direction. In this embodiment, the remote monitoring platform 30 controls the several adjustable sensors to adjust the first field-of-view direction F1 according to the designated direction to obtain a second field-of-view direction F2. The designated direction may be set by the operator of the remote monitoring platform 30 according to the visualization result. For example, the operator of the remote monitoring platform 30 determines from the visualization result that there may be a safety hazard to the front right of the unmanned vehicle 100. The operator then sends an adjustment instruction through the remote monitoring platform 30, with the designated direction set to the front right. The adjustable sensors provided at the front end and the right front end of the unmanned vehicle 100 adjust their field-of-view directions according to the designated direction. Preferably, the remote monitoring platform 30 controls the angle through which the several adjustable sensors rotate, where the adjustment angle is not larger than a preset angle. In the present embodiment, the preset angle is 20 degrees; in some possible embodiments, the preset angle is 10 degrees. For example, the field of view of the adjustable sensor disposed at the front end of the unmanned vehicle 100 is directed straight ahead; that is, the first field-of-view direction F1 of this adjustable sensor is straight ahead. Upon receiving the adjustment instruction and acquiring the designated direction as the front right, the adjustable sensor may rotate its field-of-view direction 10 degrees to the right to obtain the second field-of-view direction F2 (shown in fig. 4).
After the adjustable sensors adjust the viewing direction to the second viewing direction F2, the remote monitoring platform 30 controls the plurality of adjustable sensors to detect the surrounding environment based on the second viewing direction F2.
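The clamped rotation described above can be sketched as follows. The degree convention (0 = straight ahead, positive = toward the right) is an assumption for illustration; the patent does not specify one.

```python
def adjust_field_of_view(current_deg, designated_deg, preset_deg=20.0):
    """Rotate the field-of-view direction toward the designated direction,
    limiting the rotation to the preset angle (20 degrees in this
    embodiment, 10 degrees in some possible embodiments)."""
    delta = designated_deg - current_deg
    delta = (delta + 180.0) % 360.0 - 180.0   # shortest signed rotation
    clamped = max(-preset_deg, min(preset_deg, delta))
    return (current_deg + clamped) % 360.0
```

With F1 straight ahead (0 degrees), a front-right designated direction of 45 degrees and a 10-degree preset yield F2 = 10 degrees, matching the example above.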
In the above embodiment, the remote monitoring platform may adjust the field-of-view direction of the adjustable sensors through the designated direction in the adjustment instruction, so that the adjustable sensors detect the surrounding environment based on the second field-of-view direction, which is closer to the designated direction. More sensing data of the surrounding environment in the designated direction can thereby be collected, and the remote monitoring platform can obtain more visualization results about the designated direction, better guaranteeing the driving safety of the unmanned vehicle.
In some possible embodiments, in an environment with simple road conditions or low traffic, the unmanned vehicle 100 may drive safely using only part of the environment data, so some of the sensors 10 may be turned off. When the operator of the remote monitoring platform 30 determines from the visualization result that the unmanned vehicle 100 has driven into an environment with complex road conditions or heavy traffic, the operator can send a start instruction to the unmanned vehicle 100 through the remote monitoring platform 30. The unmanned vehicle 100 then activates the turned-off sensors 10 according to the start instruction to detect the surrounding environment, thereby acquiring more sensing data and ensuring the driving safety of the unmanned vehicle 100.
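The on/off scheme above can be sketched minimally as follows; the sensor names and the choice of minimal subset are hypothetical, for illustration only.

```python
# Hypothetical minimal subset that stays on in simple road conditions.
MINIMAL_SET = {"front_camera", "front_radar"}

def active_sensors(all_sensors, start_instruction_received):
    """In simple, low-traffic conditions only a minimal subset of sensors
    runs; a start instruction from the monitoring platform re-activates
    the turned-off sensors when conditions become complex."""
    if start_instruction_received:
        return set(all_sensors)              # everything back on
    return set(all_sensors) & MINIMAL_SET    # power-saving subset
```

The start instruction thus trades power and compute savings in easy conditions for full sensing coverage in hard ones.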
Please refer to fig. 5, which is a schematic structural diagram of a remote monitoring platform according to an embodiment of the present invention. The remote monitoring platform 30 includes a processor 31, and a memory 32. The remote monitoring platform 30 includes, but is not limited to, an electronic device such as a notebook computer, a desktop computer, and a tablet computer. In the present embodiment, the memory 32 is used for storing remote monitoring program instructions, and the processor 31 is used for executing the remote monitoring program instructions to implement the remote monitoring method as described above.
The processor 31 may be, in some embodiments, a central processing unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip, and is configured to execute the remote monitoring program instructions stored in the memory 32.
The memory 32 includes at least one type of readable storage medium including flash memory, hard disks, multi-media cards, card-type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disks, optical disks, and the like. The memory 32 may in some embodiments be an internal storage unit of the computer device, such as a hard disk of the computer device. The memory 32 may also be a storage device of an external computer device in other embodiments, such as a plug-in hard disk provided on the computer device, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and so forth. Further, the memory 32 may also include both internal storage units of the computer device and external storage devices. The memory 32 may be used not only to store application software installed in the computer device and various kinds of data such as codes for implementing a remote monitoring method, etc., but also to temporarily store data that has been output or will be output.
Please refer to fig. 6, which is a schematic structural diagram of a remote monitoring system according to an embodiment of the present invention. The remote monitoring system 1000 includes a sensor 10, a processing device 20, and a remote monitoring platform 30. In the present embodiment, the sensor 10 and the processing device 20 are electrically connected and provided to a transportation device or a machine device. The remote monitoring platform 30 is in communication with the sensor 10 and the processing device 20, respectively. The specific structure of the remote monitoring platform 30 refers to the above-mentioned embodiment. Since the remote monitoring system 1000 adopts all technical solutions of all the embodiments, at least all the beneficial effects brought by the technical solutions of the embodiments are achieved, and are not described in detail herein.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, insofar as these modifications and variations of the invention fall within the scope of the claims of the invention and their equivalents, the invention is intended to include these modifications and variations.
The above-mentioned embodiments are only examples of the present invention and should not be construed as limiting the patent scope of the present invention; the protection scope of the present invention is defined by the appended claims.