CN112987704A - Remote monitoring method, platform and system

Remote monitoring method, platform and system

Info

Publication number
CN112987704A
Authority
CN
China
Prior art keywords
remote monitoring
visualization
controlling
processing
data
Prior art date
Legal status
Pending
Application number
CN202110217583.3A
Other languages
Chinese (zh)
Inventor
肖健雄
Current Assignee
Shenzhen Baodong Zhijia Technology Co ltd
Original Assignee
Shenzhen Baodong Zhijia Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Baodong Zhijia Technology Co., Ltd.
Priority to CN202110217583.3A
Publication of CN112987704A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, associated with a remote control arrangement
    • G05D1/0038: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G05D1/0055: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, with safety arrangements

Abstract

The invention provides a remote monitoring method, which comprises the following steps: controlling a sensor to detect the surrounding environment to obtain sensed data; controlling a processing device to perform perception processing on the sensed data to obtain environment data, wherein the processing device is electrically connected to the sensor; controlling the processing device to perform visualization processing on the environment data to obtain a visualization result; acquiring the visualization result; and monitoring according to the visualization result. The invention further provides a remote monitoring platform and a remote monitoring system. The technical solution of the invention effectively ensures the driving safety of the unmanned vehicle.

Description

Remote monitoring method, platform and system
Technical Field
The invention relates to the technical field of unmanned driving, and in particular to a remote monitoring method, a remote monitoring platform, and a remote monitoring system.
Background
When an unmanned vehicle undergoes road testing or actual operation, a control platform usually acquires the vehicle's operating data or controls the vehicle through remote communication. The unmanned vehicle transmits operating data, such as identification data and positioning data, to the control platform via the network. A supervisor, safety officer, or operator at the control platform can supervise the unmanned vehicle based on this operating data. When a potential safety hazard is found during supervision, the unmanned vehicle can be controlled in time to avoid accidents, injuries to others, or disruption of normal traffic, thereby ensuring the driving safety of the unmanned vehicle.
Disclosure of Invention
The invention provides a remote monitoring method, a remote monitoring platform, and a remote monitoring system, which ensure the driving safety of an unmanned vehicle by monitoring it remotely.
In a first aspect, an embodiment of the present invention provides a remote monitoring method, where the remote monitoring method includes:
controlling a sensor to detect a surrounding environment to obtain sensed data;
controlling a processing device to perform perception processing on the sensed data to obtain environment data, wherein the processing device is electrically connected to the sensor;
controlling the processing device to perform visualization processing on the environment data to obtain a visualization result;
acquiring the visualization result;
and monitoring according to the visualization result.
In a second aspect, an embodiment of the present invention provides a remote monitoring platform, where the remote monitoring platform includes a processor and a memory, where the memory is used to store remote monitoring program instructions, and the processor is used to execute the remote monitoring program instructions to implement the remote monitoring method described above.
In a third aspect, an embodiment of the present invention provides a remote monitoring system, where the remote monitoring system includes a sensor, a processing device, and the remote monitoring platform described above, and the remote monitoring platform is communicatively connected to the sensor and the processing device, respectively.
According to the remote monitoring method, the remote monitoring platform, and the remote monitoring system, the remote monitoring platform obtains the visualization result by controlling the processing device to perform visualization processing on the environment data. The remote monitoring platform can then monitor the unmanned vehicle according to the visualization result, thereby ensuring the driving safety of the unmanned vehicle.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a remote monitoring method according to an embodiment of the present invention.
Fig. 2 is a sub-flowchart of a remote monitoring method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an application scenario of a remote monitoring method according to an embodiment of the present invention.
Fig. 4 is a sub-schematic diagram of an application scenario of a remote monitoring method according to an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of a remote monitoring platform according to an embodiment of the present invention.
Fig. 6 is a schematic structural diagram of a remote monitoring system according to an embodiment of the present invention.
DESCRIPTION OF SYMBOLS IN THE DRAWINGS
Label: Name
1000: remote monitoring system
100: unmanned vehicle
10: sensor
20: processing device
30: remote monitoring platform
31: processor
32: memory
F1: first field of view direction
F2: second field of view direction
The implementation, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings and embodiments.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances, in other words that the embodiments described are to be practiced in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and any other variation thereof, may also include other things, such as processes, methods, systems, articles, or apparatus that comprise a list of steps or elements is not necessarily limited to only those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such processes, methods, articles, or apparatus.
It should be noted that references to "first," "second," and the like in the present invention are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. A feature defined as "first" or "second" may therefore explicitly or implicitly include one or more of that feature. In addition, the technical solutions of the various embodiments may be combined with one another, provided such a combination can be realized by a person skilled in the art; when the technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present invention.
Please refer to fig. 1 and fig. 3 in combination, which are a flowchart of a remote monitoring method and a schematic diagram of an application scenario of the remote monitoring method according to an embodiment of the present invention. The remote monitoring method is used to remotely monitor transportation equipment and thereby ensure its operational safety. Transportation equipment includes, but is not limited to, cars, motorcycles, trucks, sport utility vehicles (SUVs), recreational vehicles (RVs), aircraft, and the like. In the present embodiment, the remote monitoring method is used to remotely monitor the unmanned vehicle 100, which has a Level 5 automation system. Level 5 is referred to as "full automation": a vehicle with a Level 5 automation system can drive autonomously in any legal and drivable road environment; a human driver only needs to set a destination and start the system, and the vehicle drives itself to the specified place along an optimized route. The remote monitoring method specifically comprises the following steps.
Step S102: controlling the sensor to detect the surrounding environment to obtain sensed data. In the present embodiment, the remote monitoring platform 30 controls the sensor 10 to detect the surrounding environment and obtain the sensed data. The sensor 10 is provided on the unmanned vehicle 100 and includes, but is not limited to, sensing devices such as radar, lidar, thermal imagers, image sensors, infrared sensors, and ultrasonic sensors. Accordingly, the sensed data includes, but is not limited to, radar data, lidar data, thermal imager data, image sensor data, infrared sensor data, ultrasonic sensor data, and the like. In this embodiment, the remote monitoring platform 30 is communicatively connected to the sensor 10; the two communicate over a wireless connection, including but not limited to WIFI, a 4G network, or a 5G network.
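As a rough illustration of this control path (not part of the patent), the platform might serialize a small detection command and send it to an on-board sensor over the wireless link; the message fields and function name below are hypothetical.

```python
# Hypothetical command message the remote monitoring platform could send to an
# on-board sensor over WIFI/4G/5G. Field names are illustrative assumptions.
import json
import time


def build_detect_command(sensor_id: str) -> str:
    command = {
        "type": "DETECT_SURROUNDINGS",  # ask the sensor to detect its surroundings
        "sensor_id": sensor_id,         # e.g. "front_lidar"
        "issued_at": time.time(),       # platform-side timestamp
    }
    return json.dumps(command)          # serialized payload for the wireless link


if __name__ == "__main__":
    print(build_detect_command("front_lidar"))
```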
Step S104: controlling the processing device to perform perception processing on the sensed data to obtain environment data. The processing device 20 is provided on the unmanned vehicle 100 and is electrically connected to the sensor 10. Each sensor 10 acquires sensed data and transmits it to the processing device 20. The remote monitoring platform 30 controls the processing device 20 to perform perception processing on the sensed data according to a fusion perception algorithm to obtain the environment data. In the present embodiment, the fusion perception algorithm includes, but is not limited to, a pre-fusion perception algorithm, a post-fusion perception algorithm, and a hybrid fusion perception algorithm. With the pre-fusion perception algorithm, the processing device 20 first synchronizes the various sensed data and then performs perception processing on the synchronized data to obtain the environment data. With the post-fusion perception algorithm, the processing device 20 first performs perception on the sensed data of each sensor 10 to obtain the corresponding sensor target data and then fuses the various target data to obtain the environment data. With the hybrid fusion perception algorithm, the processing device 20 processes the various sensed data by mixing the pre-fusion and post-fusion approaches to obtain the environment data. In some possible embodiments, the processing device 20 may also perform perception processing by combining multiple fusion perception algorithms, that is, by using the pre-fusion, post-fusion, and hybrid fusion perception algorithms in parallel, or by combining them in some manner, to obtain the environment data. In this embodiment, the remote monitoring platform 30 is communicatively connected to the processing device 20; the two communicate over a wireless connection, including but not limited to WIFI, a 4G network, or a 5G network.
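To make the difference between the pre-fusion and post-fusion strategies concrete, the following sketch runs both on toy data. The class and function names (SensorReading, pre_fusion_perception, post_fusion_perception) and the placeholder "perception" steps are illustrative assumptions, not the patent's algorithm.

```python
# Toy sketch of pre-fusion vs. post-fusion perception. All names and data
# shapes are assumptions; real perception models are replaced by placeholders.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class SensorReading:
    sensor_id: str          # e.g. "front_lidar", "front_camera"
    timestamp: float        # seconds
    detections: List[Dict]  # raw per-sensor measurements


def synchronize(readings: List[SensorReading]) -> List[SensorReading]:
    """Align readings to a common time base (here: simply sort by timestamp)."""
    return sorted(readings, key=lambda r: r.timestamp)


def pre_fusion_perception(readings: List[SensorReading]) -> List[Dict]:
    """Pre-fusion: synchronize the raw data first, then run one perception pass."""
    synced = synchronize(readings)
    merged = [d for r in synced for d in r.detections]
    # A real system would run a detector on the merged raw data;
    # here each detection is only tagged with a common source label.
    return [{**d, "source": "fused_raw"} for d in merged]


def post_fusion_perception(readings: List[SensorReading]) -> List[Dict]:
    """Post-fusion: perceive each sensor separately, then fuse the object lists."""
    per_sensor_objects = []
    for r in readings:
        # Placeholder for a per-sensor perception model.
        per_sensor_objects.append([{**d, "source": r.sensor_id} for d in r.detections])
    # Placeholder fusion: concatenate the per-sensor object lists.
    return [obj for objects in per_sensor_objects for obj in objects]


if __name__ == "__main__":
    readings = [
        SensorReading("front_lidar", 0.02, [{"type": "car", "distance_m": 18.5}]),
        SensorReading("front_camera", 0.01, [{"type": "pedestrian", "distance_m": 7.2}]),
    ]
    print(pre_fusion_perception(readings))
    print(post_fusion_perception(readings))
```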
Step S106: controlling the processing device to perform visualization processing on the environment data to obtain a visualization result. In the present embodiment, the remote monitoring platform 30 controls the processing device 20 to perform visualization processing on the environment data to obtain the visualization result. The visualization processing may be a rendering of the environment data that generates visualized video data, i.e. the visualization result; it may also be another existing form of processing, which is not limited here.
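As a minimal sketch of what rendering the environment data into a visualization frame could look like, the snippet below rasterizes detected objects into one coarse top-down text frame. The data layout and function name are assumptions; an actual implementation would render images or point clouds.

```python
# Minimal sketch: rasterize environment data (detected objects with 2D positions
# in metres) into one coarse top-down "frame". Names and layout are assumptions.
def render_frame(environment_data, width=80, height=20):
    grid = [["." for _ in range(width)] for _ in range(height)]
    for obj in environment_data:
        # x_m: metres ahead of the vehicle, y_m: metres to the right.
        col = min(max(int(width / 2 + obj["y_m"] * 2), 0), width - 1)
        row = min(max(int(height - 1 - obj["x_m"] / 2), 0), height - 1)
        grid[row][col] = obj["type"][0].upper()   # first letter marks the object
    return "\n".join("".join(r) for r in grid)


if __name__ == "__main__":
    environment_data = [
        {"type": "car", "x_m": 18.0, "y_m": -2.0},
        {"type": "pedestrian", "x_m": 7.0, "y_m": 3.5},
    ]
    print(render_frame(environment_data))   # one frame of a visualization result
```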
In this embodiment, the remote monitoring platform 30 sends a visualization instruction to the processing device 20 and thereby controls the processing device 20 to perform visualization processing on the environment data according to the visualization instruction to obtain the visualization result. That is, the processing device 20 performs the visualization processing on the environment data after receiving the visualization instruction. The visualization instruction may be generated by an operator of the remote monitoring platform 30 using an external device electrically connected to the platform, or may be generated directly by the remote monitoring platform 30 at regular intervals, which is not limited here. The external device includes, but is not limited to, a mouse, a keyboard, a voice input device, and the like. In some possible embodiments, after the processing device 20 performs perception processing on the sensed data to obtain the environment data, it may visualize the environment data directly; that is, the processing device 20 may perform visualization processing without receiving a visualization instruction.
In this embodiment, the remote monitoring platform 30 controls the processing device 20 to perform visualization processing on the environment data to obtain a single path of visualization result. It will be appreciated that each sensor 10 produces one path of sensed data, and the various sensors 10 transmit multiple paths of sensed data to the processing device 20 simultaneously. The processing device 20 performs perception processing on the multiple paths of sensed data simultaneously to obtain the corresponding multiple paths of environment data, performs visualization processing on the multiple paths of environment data to obtain a single path of visualization result, and transmits that single path of visualization result to the remote monitoring platform 30.
In some possible embodiments, the visualization instruction may include several designated sensors. The remote monitoring platform 30 may control the processing device 20 to obtain the designated sensors from the visualization instruction, select from the environment data the designated environment data matching those sensors, and perform visualization processing on the designated environment data to obtain the visualization result. That is, according to the visualization instruction, the processing device 20 may visualize only part of the environment data rather than all of it. For example, when the operator of the remote monitoring platform 30 only wants to monitor the environment in front of the unmanned vehicle 100, the sensor 10 provided at the front end of the unmanned vehicle 100 may be added to the visualization instruction as a designated sensor. After receiving the visualization instruction, the processing device 20 selects the environment data of the corresponding sensor 10 for visualization processing and forms a visualization result of the area in front of the unmanned vehicle 100.
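A minimal sketch of this selection step, assuming the visualization instruction carries a list of designated sensor IDs (the field and function names are hypothetical):

```python
# Sketch: the processing device keeps environment data per sensor and only
# visualizes the paths named in the visualization instruction.
# Field and function names are illustrative assumptions.
def select_designated_environment_data(environment_data_by_sensor, visualization_instruction):
    designated = visualization_instruction.get("designated_sensors")
    if not designated:                        # no designation: use all environment data
        return environment_data_by_sensor
    return {
        sensor_id: data
        for sensor_id, data in environment_data_by_sensor.items()
        if sensor_id in designated
    }


if __name__ == "__main__":
    environment_data = {
        "front_camera": [{"type": "car", "x_m": 18.0}],
        "rear_camera": [{"type": "bicycle", "x_m": -6.0}],
    }
    instruction = {"designated_sensors": ["front_camera"]}   # monitor only the front
    print(select_designated_environment_data(environment_data, instruction))
```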
Step S108: acquiring the visualization result. The remote monitoring platform 30 obtains the visualization result transmitted by the processing device 20.
Step S110: monitoring according to the visualization result. It will be appreciated that the remote monitoring platform 30 can monitor the unmanned vehicle 100 in real time through the visualization result. When the unmanned vehicle 100 encounters a situation it cannot handle while driving, or when a vehicle component suddenly fails, the remote monitoring platform 30 can control the unmanned vehicle 100 according to the visualization result. For example, the unmanned vehicle 100 may analyze the environment data while driving and determine that the road ahead is too narrow to pass, so it stops or changes route. However, the operator of the remote monitoring platform 30 may judge from the visualization result that the road ahead is wide enough for the unmanned vehicle 100 to pass. The operator can then issue an instruction through the remote monitoring platform 30 to control the unmanned vehicle 100 to drive through the road ahead.
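The narrow-road example can be read as a simple override flow: the vehicle's own decision (stop or reroute) stands unless the platform, after an operator has inspected the visualization result, explicitly issues an instruction. The sketch below is an illustration under assumed names and thresholds, not the patent's control protocol.

```python
# Illustrative override logic only; names, margins and message values are assumptions.
from typing import Optional


def onboard_decision(measured_gap_m: float, vehicle_width_m: float, margin_m: float = 0.5) -> str:
    """On-board decision: stop if the perceived gap looks too narrow to pass."""
    return "PROCEED" if measured_gap_m >= vehicle_width_m + margin_m else "STOP"


def apply_platform_instruction(decision: str, platform_instruction: Optional[str]) -> str:
    """An instruction from the remote monitoring platform overrides the on-board decision."""
    return platform_instruction if platform_instruction is not None else decision


if __name__ == "__main__":
    decision = onboard_decision(measured_gap_m=2.2, vehicle_width_m=1.9)   # -> "STOP"
    # The operator judges from the visualization result that the road is wide enough.
    print(decision, "->", apply_platform_instruction(decision, "PROCEED"))
```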
In the above embodiment, the remote monitoring platform obtains the visualization result by controlling the processing device to perform visualization processing on the environment data, and can monitor the unmanned vehicle through the visualization result. Furthermore, when the unmanned vehicle has an accident or is about to have one, the remote monitoring platform can control it according to the visualization result, thereby ensuring its driving safety. In addition, in this embodiment the multiple paths of environment data are first visualized into a single path of visualization result, which is then transmitted to the remote monitoring platform, instead of transmitting the multiple paths of environment data and having the remote monitoring platform visualize all of them. This greatly reduces the amount of transmitted data and therefore the network bandwidth occupied. Meanwhile, the remote monitoring platform can designate several sensors through the visualization instruction so that only the environment data corresponding to the designated sensors is visualized, which reduces the computational load on the processing device.
In some possible embodiments, the remote monitoring method may also be used to remotely monitor machine equipment, thereby ensuring its stable operation.
Please refer to fig. 2 and fig. 4 in combination, which are a sub-flowchart of the remote monitoring method and a sub-schematic diagram of an application scenario of the remote monitoring method according to an embodiment of the present invention. The remote monitoring method further comprises the following steps.
Step S202: sending an adjustment instruction to the sensor. In the present embodiment, the remote monitoring platform 30 sends the adjustment instruction to several adjustable sensors among the sensors 10. An adjustable sensor is a sensor 10 whose field of view direction can be adjusted; in the present embodiment, adjustable sensors include, but are not limited to, radar, lidar, and image sensors. It will be appreciated that every sensor 10 has a certain field of view direction. Once the sensor 10 is mounted and fixed on the unmanned vehicle 100, it can only detect the surrounding environment within that field of view direction and obtain the corresponding sensed data.
Step S204: controlling the sensor to adjust its field of view direction according to the adjustment instruction. The adjustment instruction includes a designated direction. In the present embodiment, the remote monitoring platform 30 controls the several adjustable sensors to adjust the first field of view direction F1 according to the designated direction to obtain the second field of view direction F2. The designated direction may be set by an operator of the remote monitoring platform 30 according to the visualization result. For example, the operator determines from the visualization result that there may be a safety hazard to the front right of the unmanned vehicle 100 and sends an adjustment instruction through the remote monitoring platform 30 with the designated direction set to the front right. The adjustable sensors provided at the front end and the right front end of the unmanned vehicle 100 then adjust according to the designated direction. Preferably, the remote monitoring platform 30 controls the several adjustable sensors to rotate by an adjustment angle that is not larger than a preset angle. In the present embodiment, the preset angle is 20 degrees; in some possible embodiments, it is 10 degrees. For example, suppose the field of view of the adjustable sensor provided at the front end of the unmanned vehicle 100 points straight ahead, i.e. its first field of view direction F1 is straight ahead. Upon receiving the adjustment instruction and obtaining the front right as the designated direction, the adjustable sensor may rotate its field of view 10 degrees to the right to obtain the second field of view direction F2 (shown in fig. 4). After the adjustable sensors have adjusted to the second field of view direction F2, the remote monitoring platform 30 controls them to detect the surrounding environment based on the second field of view direction F2.
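A minimal sketch of the angle-limiting step, assuming the adjustable sensor reports its current yaw and the preset angle is the 20 degrees used in this embodiment (10 degrees in the alternative); all names are hypothetical.

```python
# Sketch: clamp the requested field-of-view rotation to the preset angle.
# The 20-degree preset follows the embodiment above; names are assumptions.
PRESET_MAX_ANGLE_DEG = 20.0


def adjust_field_of_view(current_yaw_deg: float, requested_delta_deg: float,
                         max_delta_deg: float = PRESET_MAX_ANGLE_DEG) -> float:
    """Return the new field-of-view direction, never rotating more than the preset angle."""
    clamped = max(-max_delta_deg, min(max_delta_deg, requested_delta_deg))
    return current_yaw_deg + clamped


if __name__ == "__main__":
    # First field of view direction F1: straight ahead (0 degrees).
    # The adjustment instruction asks for 10 degrees toward the right front -> F2 = 10.
    print(adjust_field_of_view(current_yaw_deg=0.0, requested_delta_deg=10.0))
    # A request of 35 degrees would be clamped to the 20-degree preset.
    print(adjust_field_of_view(current_yaw_deg=0.0, requested_delta_deg=35.0))
```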
In the above embodiment, the remote monitoring platform can adjust the field of view direction of the adjustable sensors through the designated direction in the adjustment instruction, so that the adjustable sensors detect the surrounding environment based on a second field of view direction closer to the designated direction. More sensed data about the surrounding environment in the designated direction can thus be collected, and the remote monitoring platform can obtain a richer visualization result for that direction, which better ensures the driving safety of the unmanned vehicle.
In some possible embodiments, in an environment with simple road conditions or light traffic, the unmanned vehicle 100 can drive safely using only part of the environment data, so some of the sensors 10 may be turned off. When the operator of the remote monitoring platform 30 determines from the visualization result that the unmanned vehicle 100 has driven into an environment with complex road conditions or heavy traffic, the operator can send a start instruction to the unmanned vehicle 100 through the remote monitoring platform 30. The sensors 10 that were turned off are then started according to the start instruction and detect the surrounding environment, so that more sensed data is acquired and the driving safety of the unmanned vehicle 100 is ensured.
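A short sketch of the start-instruction idea: sensors that were turned off are re-enabled once the scene is judged complex (here by a stand-in flag for the operator's judgement). The names and the heuristic are illustrative assumptions.

```python
# Illustrative sketch of re-enabling dormant sensors in complex scenes.
# The complexity flag stands in for the operator's judgement; names are assumptions.
def sensors_to_activate(active_sensors, all_sensors, scene_is_complex: bool):
    """Return the currently disabled sensors that a start instruction should switch on."""
    if not scene_is_complex:
        return set()
    return set(all_sensors) - set(active_sensors)


if __name__ == "__main__":
    all_sensors = {"front_lidar", "front_camera", "rear_camera", "side_radar"}
    active = {"front_lidar", "front_camera"}   # simple road: only a subset is running
    # The operator sees dense traffic in the visualization result.
    print(sensors_to_activate(active, all_sensors, scene_is_complex=True))
```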
Please refer to fig. 5, which is a schematic structural diagram of a remote monitoring platform according to an embodiment of the present invention. The remote monitoring platform 30 includes a processor 31 and a memory 32. The remote monitoring platform 30 includes, but is not limited to, electronic devices such as a notebook computer, a desktop computer, or a tablet computer. In the present embodiment, the memory 32 is used to store remote monitoring program instructions, and the processor 31 is used to execute the remote monitoring program instructions to implement the remote monitoring method described above.
In some embodiments, the processor 31 may be a Central Processing Unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip, and is configured to execute the remote monitoring program instructions stored in the memory 32.
The memory 32 includes at least one type of readable storage medium, including flash memory, hard disks, multimedia cards, card-type memory (e.g., SD or DX memory), magnetic memory, magnetic disks, optical disks, and the like. In some embodiments, the memory 32 may be an internal storage unit of the computer device, such as its hard disk. In other embodiments, the memory 32 may instead be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card. Further, the memory 32 may include both an internal storage unit and an external storage device of the computer device. The memory 32 may be used not only to store the application software installed on the computer device and various kinds of data, such as the code implementing the remote monitoring method, but also to temporarily store data that has been or will be output.
Please refer to fig. 6, which is a schematic structural diagram of a remote monitoring system according to an embodiment of the present invention. The remote monitoring system 1000 includes a sensor 10, a processing device 20, and a remote monitoring platform 30. In the present embodiment, the sensor 10 and the processing device 20 are electrically connected and provided on transportation equipment or machine equipment. The remote monitoring platform 30 is communicatively connected to the sensor 10 and the processing device 20, respectively. For the specific structure of the remote monitoring platform 30, refer to the embodiments described above. Since the remote monitoring system 1000 adopts all the technical solutions of all the above embodiments, it has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, insofar as these modifications and variations of the invention fall within the scope of the claims of the invention and their equivalents, the invention is intended to include these modifications and variations.
The above-mentioned embodiments are only examples of the present invention and should not be construed as limiting its scope; the scope of the present invention is defined by the claims.

Claims (11)

1. A remote monitoring method, characterized in that the remote monitoring method comprises:
controlling a sensor to detect a surrounding environment to obtain sensed data;
controlling a processing device to perform perception processing on the sensed data to obtain environment data, wherein the processing device is electrically connected to the sensor;
controlling the processing device to perform visualization processing on the environment data to obtain a visualization result;
acquiring the visualization result;
and monitoring according to the visualization result.
2. The remote monitoring method according to claim 1, wherein controlling the processing device to perform visualization processing on the environment data to obtain a visualization result specifically comprises:
sending a visualization instruction to the processing device;
and controlling the processing device to perform visualization processing on the environment data according to the visualization instruction to obtain the visualization result.
3. The remote monitoring method according to claim 2, wherein controlling the processing device to perform visualization processing on the environment data according to the visualization instruction to obtain the visualization result specifically comprises:
controlling the processing device to acquire a plurality of specified sensors from the visualization instruction;
controlling the processing device to select, from the environment data, specified environment data matching the specified sensors;
and controlling the processing device to perform visualization processing on the specified environment data to obtain the visualization result.
4. The remote monitoring method of claim 1, wherein the remote monitoring method further comprises:
sending an adjustment instruction to the sensor;
and controlling the sensor to adjust its field of view direction according to the adjustment instruction.
5. The remote monitoring method of claim 4, wherein sending an adjustment instruction to the sensor specifically comprises:
sending the adjustment instruction to a plurality of adjustable sensors among the sensors.
6. The remote monitoring method according to claim 5, wherein the adjustment instruction includes a specified direction, and controlling the sensor to adjust its field of view direction according to the adjustment instruction specifically comprises:
controlling the plurality of adjustable sensors to adjust a first field of view direction according to the specified direction to obtain a second field of view direction;
and controlling the plurality of adjustable sensors to detect the surrounding environment based on the second field of view direction.
7. The remote monitoring method of claim 6, wherein controlling the plurality of adjustable sensors to adjust the first field of view direction to obtain the second field of view direction specifically comprises:
and controlling the plurality of adjustable sensors to rotate by an adjustment angle, wherein the adjustment angle is not larger than a preset angle.
8. The remote monitoring method according to claim 1, wherein controlling the processing device to perform perception processing on the sensed data to obtain the environment data specifically comprises:
and controlling the processing device to perform perception processing on the sensed data according to a fusion perception algorithm to obtain the environment data.
9. The remote monitoring method according to any one of claims 1 to 8, wherein controlling the processing device to perform visualization processing on the environment data to obtain a visualization result specifically comprises:
and controlling the processing device to perform visualization processing on the environment data to obtain a single path of visualization result.
10. A remote monitoring platform, comprising a processor and a memory, the memory storing remote monitoring program instructions, the processor being configured to execute the remote monitoring program instructions to implement the remote monitoring method of any one of claims 1 to 9.
11. A remote monitoring system comprising a sensor, a processing device, and the remote monitoring platform of claim 10, wherein the remote monitoring platform is communicatively connected to the sensor and the processing device, respectively.
Application CN202110217583.3A, filed 2021-02-26 (priority date 2021-02-26), "Remote monitoring method, platform and system", status: Pending, publication CN112987704A.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110217583.3A | 2021-02-26 | 2021-02-26 | Remote monitoring method, platform and system


Publications (1)

Publication Number | Publication Date
CN112987704A | 2021-06-18

Family

ID=76351073

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202110217583.3A | Remote monitoring method, platform and system | 2021-02-26 | 2021-02-26

Country Status (1)

Country | Link
CN | CN112987704A

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106394545A (en) * 2016-10-09 2017-02-15 北京汽车集团有限公司 Driving system, unmanned vehicle and vehicle remote control terminal
CN207165242U (en) * 2017-07-18 2018-03-30 苏州国科康成医疗科技有限公司 Tele-medicine playback system
CN207166665U (en) * 2017-09-11 2018-03-30 上海太鼎汽车工程技术有限公司 Remote monitoring system for low speed automatic Pilot sweeper
CN111045425A (en) * 2019-12-05 2020-04-21 中国北方车辆研究所 Auxiliary teleoperation driving method for ground unmanned vehicle
CN111522350A (en) * 2020-07-06 2020-08-11 深圳裹动智驾科技有限公司 Sensing method, intelligent control equipment and automatic driving vehicle
CN111829545A (en) * 2020-09-16 2020-10-27 深圳裹动智驾科技有限公司 Automatic driving vehicle and dynamic planning method and system for motion trail of automatic driving vehicle


Similar Documents

Publication Publication Date Title
US20230278583A1 (en) Autonomous driving system
CN105667508B (en) Vehicle speed regulation
US20230341855A1 (en) Systems and methods for evaluating and sharing autonomous vehicle driving style information with proximate vehicles
CN109435951B (en) Remote monitoring method, system, terminal and storage medium for unmanned vehicle
US9881482B2 (en) Method and device for displaying information of a system
US10395387B2 (en) Method and apparatus for detecting a utilization of an electronic device by a driver, for a vehicle
US9643493B2 (en) Display control apparatus
US20170025013A1 (en) Distance calculation apparatus, distance calculation method, driving assist apparatus, and driving assist system
WO2015174017A1 (en) In-vehicle apparatus and travel image storage system
CN108473144B (en) Method for controlling an automated driver assistance system of a motor vehicle
CN109470491A (en) Blind monitoring road test evaluation system
CN112987704A (en) Remote monitoring method, platform and system
CN111540224A (en) Road data processing method and related equipment
KR20210152602A (en) driver assistance apparatus and method of thereof
US20210061288A1 (en) Driver driving style detection and application system
US20220297721A1 (en) Multi-sensor synchronization method and system
CN116030614A (en) Traction management system and method for autonomous vehicle
KR20200135588A (en) Vehicle and control method thereof
CN113432614B (en) Vehicle navigation method, device, electronic equipment and computer readable storage medium
CN111557026A (en) Driving support device, driving support system, driving support method, and recording medium storing driving support program
CN115675570A (en) Method and device for displaying obstacle information of train
CN115631626A (en) Vehicle data monitoring and analyzing method, device, equipment and medium
CN110962743A (en) Driving prompting method, vehicle-mounted terminal, electronic terminal, vehicle and storage medium
US20210027078A1 (en) Looking away determination device, looking away determination system, looking away determination method, and storage medium
CN113771561A (en) Vehicle towing method, electronic device, towing vehicle and storage medium

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518057 2301, yuemeite building, No. 1, Gaoxin South seventh Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong
Applicant after: Shenzhen antuzhihang Technology Co.,Ltd.
Address before: 2301, yuemeite building, No.1, Gaoxin South 7th Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000
Applicant before: Shenzhen Baodong Zhijia Technology Co.,Ltd.

CB02 Change of applicant information

Address after: 518057, Office Building 2807, Haofang Tianji Square, No. 11008 Beihuan Avenue, Nanlian Community, Nantou Street, Nanshan District, Shenzhen City, Guangdong Province
Applicant after: Shenzhen antuzhihang Technology Co.,Ltd.
Address before: 518057 2301, yuemeite building, No. 1, Gaoxin South seventh Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong
Applicant before: Shenzhen antuzhihang Technology Co.,Ltd.