CN118298656A - Risk early warning method and device - Google Patents

Risk early warning method and device

Info

Publication number
CN118298656A
CN118298656A (application CN202211713905.4A)
Authority
CN
China
Prior art keywords
traffic event
traffic
event
information
service
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211713905.4A
Other languages
Chinese (zh)
Inventor
肖培伦
李国镇
王琤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Didi Infinity Technology and Development Co Ltd
Original Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Filing date
Publication date
Application filed by Beijing Didi Infinity Technology and Development Co Ltd filed Critical Beijing Didi Infinity Technology and Development Co Ltd
Publication of CN118298656A publication Critical patent/CN118298656A/en
Pending legal-status Critical Current

Abstract

According to embodiments of the present disclosure, a method, apparatus, electronic device, computer storage medium, and computer program product for risk early warning are provided. The method of one aspect described herein comprises: obtaining, by a cloud device, reporting information about a traffic event, wherein detection of the traffic event is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles; determining, using at least part of the reported information, a set of trips expected to be affected by the traffic event; and sending a reminder associated with the traffic event to a set of terminal devices associated with the set of trips. The reported information is obtained with the user's knowledge and consent, and is processed (e.g., uploaded or stored) in compliance with relevant legal regulations. In this manner, embodiments of the disclosure can generate early warnings based on real-time image data, thereby improving the efficiency and accuracy of risk early warning.

Description

Risk early warning method and device
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers, and more particularly, relate to a method, apparatus, electronic device, computer storage medium, and computer program product for risk early warning.
Background
With the development of society, traveling by passenger vehicle has become a common choice. Before a trip, a person may wish to know the traffic conditions along the road to be traveled in order to plan the route. During a trip, a person may wish to learn the traffic conditions of the remaining journey through a navigation application. How to provide users with accurate traffic condition information more effectively has therefore become a focus of attention.
Disclosure of Invention
In a first aspect of the present disclosure, a method of risk early warning is provided. The method comprises the following steps: obtaining, by a cloud device, reporting information about a traffic event, wherein detection of the traffic event is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles; determining, using at least part of the reported information, a set of trips expected to be affected by the traffic event, the set of trips including planned trips that have not yet been initiated and/or ongoing trips; and sending a reminder associated with the traffic event to a set of terminal devices associated with the set of trips.
In a second aspect of the present disclosure, a method of risk early warning is provided. The method comprises the following steps: acquiring an environment image of a vehicle by a terminal device, wherein the environment image is acquired by acquisition equipment associated with the vehicle, and the vehicle is a service vehicle for providing travel service; identifying a traffic event based on the environmental image; and sending reporting information associated with the traffic event to a remote device, the remote device including an edge computing device and/or a cloud device, the reporting information including: travel information of a service travel associated with travel services currently provided by the service vehicle, and an event image associated with a traffic event.
In a third aspect of the present disclosure, an apparatus for risk early warning is provided. The apparatus comprises: an information acquisition module configured to acquire reporting information about a traffic event, wherein detection of the traffic event is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles; a trip determination module configured to determine, using at least a portion of the reported information, a set of trips expected to be affected by the traffic event, the set of trips including planned trips that have not yet been initiated and/or ongoing trips; and a reminder sending module configured to send a reminder associated with the traffic event to a set of terminal devices associated with the set of trips.
In a fourth aspect of the present disclosure, an apparatus for risk early warning is provided. The device comprises: an image acquisition module configured to acquire, by a terminal device, an environmental image of a vehicle, the environmental image being acquired by an acquisition device associated with the vehicle, the vehicle being a service vehicle for providing travel services; an event identification module configured to identify a traffic event based on the environmental image; and an information reporting module configured to send reporting information associated with the traffic event to a remote device, the remote device including an edge computing device and/or a cloud device, the reporting information comprising: travel information of a service travel associated with travel services currently provided by the service vehicle, and an event image associated with a traffic event.
In a fifth aspect of the present disclosure, there is provided an electronic device comprising: a memory and a processor; wherein the memory is for storing one or more computer instructions, wherein the one or more computer instructions are executable by the processor to implement a method according to the first or second aspect of the present disclosure.
In a sixth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon one or more computer instructions, wherein the one or more computer instructions are executed by a processor to implement a method according to the first or second aspect of the present disclosure.
In a seventh aspect of the present disclosure, there is provided a computer program product comprising computer executable instructions which when executed by a processor implement a method according to the first or second aspect of the present disclosure.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, wherein like or similar reference numerals designate like or similar elements, and wherein:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments in accordance with the present disclosure may be implemented;
FIG. 2 illustrates a schematic diagram of an example process of risk early warning according to some embodiments of the present disclosure;
FIG. 3 shows a schematic diagram of an example process of risk early warning according to further embodiments of the present disclosure;
FIG. 4 illustrates a schematic block diagram of an apparatus for risk early warning according to some embodiments of the present disclosure;
FIG. 5 shows a schematic block diagram of an apparatus for risk early warning according to further embodiments of the present disclosure; and
Fig. 6 illustrates a block diagram of an electronic device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
In describing embodiments of the present disclosure, the term "comprising" and its variants should be understood as open-ended, i.e., "including, but not limited to." The term "based on" should be understood as "based at least in part on." The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment." The terms "first," "second," and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
As discussed above, obtaining accurate traffic condition information has become a focus of attention. Conventional traffic condition information typically depends on traffic authorities, but such information may not be timely. In addition, traffic information reported by some users may be inaccurate, which makes it difficult to provide adequate assistance for users' travel.
In view of this, embodiments of the present disclosure provide a solution for risk early warning. According to a first aspect of the solution, the cloud device may obtain reporting information about a traffic event, wherein the detection of the traffic event is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles. Further, the cloud device determines a set of trips expected to be affected by the traffic event using at least a portion of the reported information, wherein the set of trips includes planned trips that have not yet been initiated and/or ongoing trips. Accordingly, the cloud device sends a reminder associated with the traffic event to a set of terminal devices associated with the set of trips.
In this way, embodiments of the present disclosure can acquire traffic events in a more timely and accurate manner, and can precisely provide corresponding reminders to the terminal devices that need them.
According to a second aspect of the solution, the terminal device may acquire an environmental image of the vehicle, wherein the environmental image is acquired by an acquisition device associated with the vehicle, and the vehicle is a service vehicle for providing travel services. Further, the terminal device may identify a traffic event based on the environmental image, and send reporting information associated with the traffic event to a remote device, where the remote device includes an edge computing device and/or a cloud device, and the reporting information includes: trip information of a service trip associated with the travel service currently provided by the service vehicle, and an event image associated with the traffic event.
In this way, embodiments of the present disclosure can utilize travel service vehicles to identify traffic events in a timely and accurate manner, as a basis for reminding other terminals and/or vehicles.
Where the solutions described in this specification and embodiments involve the processing of personal information (including, but not limited to, trip information, user information, location information, etc.), all processing is performed on the premise of a valid legal basis (e.g., consent of the personal-information subject, or necessity for performing a contract) and only within the prescribed or agreed scope. If a user refuses to allow processing of personal information other than that necessary for basic functions, the user's use of those basic functions is not affected.
Various example implementations of this scheme will be described in detail below with reference to the accompanying drawings.
Example Environment
Referring first to FIG. 1, a schematic diagram of an example environment 100 in which embodiments according to the present disclosure may be implemented is shown.
As shown in FIG. 1, environment 100 may include a plurality of vehicles 110-1 and 110-2 (individually or collectively referred to as vehicles 110) traveling on a roadway. Such a vehicle 110 may be associated with respective terminal devices, such as terminal device 125-1 and terminal device 125-2 (individually or collectively referred to as terminal device 125).
In some embodiments, terminal device 125 may be an in-vehicle terminal device integrated within vehicle 110 or a device independent of vehicle 110. For example, the terminal device 125 may be an in-vehicle device in the vehicle 110. As another example, the terminal device 125 may also be, for example, an appropriate terminal device used by a driver, security or passenger of the vehicle 110.
In some embodiments, the vehicle 110 may include a service vehicle for providing travel services. Such travel services may include, but are not limited to: ride-hailing, car-sharing, and other similar services.
Further, vehicle 110 may be any type of vehicle that may carry a person and/or object and that is moved by a power system such as an engine, including, but not limited to, a car, truck, bus, electric vehicle, motorcycle, caravan, train, and the like. In some cases, the vehicle 110 in the environment 100 may also be a vehicle having some intelligent driving capability, such a vehicle also being referred to as an intelligent driving vehicle.
As shown in fig. 1, vehicle 110 may also be associated with image-capturing devices (e.g., image-capturing device 115-1 and image-capturing device 115-2, individually or collectively referred to as image-capturing device 115). In one example, the image capture device 115 may be an on-board capture device of the vehicle 110, such as a fixedly mounted tachograph. In another example, the image acquisition device 115 may also be, for example, a stand-alone image acquisition device integrated in the terminal device 125 or arranged in communication connection with the terminal device 125.
The image acquisition device 115 may be configured to acquire an environmental image 130 of the vehicle 110. As will be described in detail below, the terminal device 125 may identify the traffic event 120 based on the acquired environmental image 130.
In some embodiments, as shown in FIG. 1, environment 100 may also include an edge computing device 135. As will be described in detail below, the edge computing device 135 may be used to integrate the detection results of the plurality of vehicles 110 with respect to the traffic event 120 to determine whether the traffic event 120 needs to be reported to the cloud device 140.
Further, as shown in fig. 1, the environment 100 may also include a cloud device 140 that may be used to receive reporting information about the traffic event 120 from the terminal device 125 and determine a set of trips that are expected to be affected by the traffic event 120 based on the reporting information.
Additionally, the cloud device 140 may send a reminder regarding the traffic event 120 to a terminal device (e.g., terminal device 145) associated with the set of trips. In some embodiments, such a terminal device 145 may be a device associated with a vehicle 150 that may travel through an area of occurrence of the traffic event 120.
In some embodiments, the terminal device 145 may be an in-vehicle terminal device integrated within the vehicle 150, or a device independent of the vehicle 150. For example, the terminal device 145 may be an in-vehicle device in the vehicle 150. As another example, the terminal device 145 may also be, for example, an appropriate terminal device used by a driver, security or passenger of the vehicle 150. In some embodiments, vehicle 150 may also include a service vehicle for providing travel services.
In other embodiments, the terminal device 145 may also include a terminal device associated with a user that has not yet begun a trip. For example, the terminal device 145 may include a passenger calling a network taxi service whose future journey may be through the area of occurrence of the traffic event 120.
Specific processes related to traffic event recognition, traffic event reporting, and traffic event alerting are described in further detail below.
Image-based traffic event identification
As described with reference to fig. 1, according to an embodiment of the present disclosure, a terminal device associated with a vehicle may identify traffic events occurring in an environment based on an environment image of the vehicle.
With continued reference to fig. 1, the terminal device 125 may acquire an environmental image 130 captured by the image acquisition device 115. As an example, the environmental image 130 may include a video captured, during driving, by a dashcam mounted on the vehicle 110.
In some embodiments, the environmental image 130 may also include, for example, an image captured by a passenger or driver of the vehicle 110 using a mobile device. For example, upon noticing a traffic accident, a passenger may capture an image of the accident using a hand-held mobile device.
In some embodiments, the vehicle 110 may be a service vehicle for providing travel services, which may carry a service provider (e.g., a driver) and a service demander (e.g., a passenger). In this way, embodiments of the present disclosure are able to take advantage of the large number of travel service vehicles in a city to obtain more real-time traffic information.
Further, the terminal device 125 may also identify the traffic event 120 based on the environmental image 130. In particular, the terminal device 125 may identify various types of traffic events, such as traffic events related to a host vehicle, traffic events related to other traffic participants, traffic events related to road conditions, traffic events related to traffic restrictions, and traffic events associated with particular traffic participants, among others.
For identification of traffic events associated with the vehicle 110, the terminal device 125 may detect, based on the environmental image 130, the distance between a surrounding traffic element and the vehicle 110, and may determine that a traffic event associated with the vehicle 110 has occurred if the distance is less than a first threshold distance.
Illustratively, the terminal device 125 may utilize a target detection algorithm to identify various traffic elements that appear in the environmental image 130. Such traffic elements may include, for example: other vehicles, pedestrians, non-motor vehicles, etc.
It should be appreciated that any suitable object detection algorithm may be employed to identify traffic elements, such as YOLO (You Only Look Once)-based, SSD (Single Shot MultiBox Detector)-based, or Faster R-CNN-based object detection algorithms, and the disclosure is not intended to limit the specific detection algorithm.
Further, the terminal device 125 may determine a distance, e.g., a lateral distance and/or a longitudinal distance, of each surrounding traffic element relative to the vehicle 110 based on calibration information of the image acquisition device 115. If the distance is less than the first threshold distance, the terminal device 125 may, for example, determine that a traffic event, such as a collision accident, associated with the vehicle 110 has occurred.
It should be appreciated that any suitable ranging algorithm may be employed to calculate the distance, such as GeoNet-based or LEGO (Learning Edge with Geometry)-based algorithms, and the disclosure is not intended to limit the specific ranging algorithm.
Alternatively, the terminal device 125 may also combine sensor data (e.g., speed data and/or acceleration data) of the vehicle 110 and/or the terminal device to determine that a traffic event associated with the vehicle 110 has occurred.
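The host-vehicle branch above (detect surrounding traffic elements, estimate their distance, compare against the first threshold distance) can be sketched as follows. This is a minimal illustration rather than the patented implementation: the `TrafficElement` fields, the Euclidean combination of lateral and longitudinal offsets, and the 1.0 m default threshold are all assumptions; in practice the distances would come from a detection-plus-ranging pipeline as described above.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TrafficElement:
    kind: str              # e.g. "vehicle", "pedestrian", "non-motor"
    lateral_m: float       # lateral offset from the host vehicle, meters
    longitudinal_m: float  # longitudinal offset from the host vehicle, meters

def detect_host_vehicle_event(elements, first_threshold_m=1.0):
    """Return the elements closer to the host vehicle than the first threshold.

    A non-empty result is treated as a candidate traffic event (e.g. an
    imminent or actual collision) associated with the host vehicle.
    """
    return [e for e in elements
            if hypot(e.lateral_m, e.longitudinal_m) < first_threshold_m]
```

In a real pipeline this check would run per frame and be fused with vehicle sensor data (speed, acceleration) before an event is declared.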
For the identification of traffic events associated with other traffic participants, the terminal device 125 may track first and second traffic elements around the vehicle based on the environmental images. Further, upon detecting that the distance between the first traffic element and the second traffic element is less than a second threshold distance and that the relative speed between them is less than a threshold speed, the terminal device 125 may determine that a traffic event associated with the first and second traffic elements has occurred.
Illustratively, the terminal device 125 may detect and locate various traffic elements (e.g., motor vehicles, non-motor vehicles, pedestrians, etc.) around using a target detection and ranging algorithm, and track the trajectory of the traffic elements over a continuous time period using a target tracking algorithm to estimate the relative movement speed of the traffic elements.
It should be appreciated that any suitable tracking algorithm may be employed to track traffic elements, such as SORT (Simple Online and Realtime Tracking)-based, JDE (Joint Detection and Embedding)-based, or ByteTrack-based algorithms, and the disclosure is not intended to limit the specific tracking algorithm.
Further, if the relative speed of two traffic elements drops significantly and is less than a threshold speed (e.g., near 0) and the relative distance of the two traffic elements is below a safety threshold, the terminal device 125 may determine that a traffic event, such as a collision accident, has occurred with respect to the two traffic elements.
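The two-element check just described — relative distance below the second threshold and relative speed dropping below the threshold speed — can be sketched as follows, assuming tracks are already available from a tracker such as those listed above. The track representation, default thresholds, and the last-step speed estimate are illustrative assumptions.

```python
def detect_pair_collision(track_a, track_b, dt,
                          second_threshold_m=0.5, threshold_speed_mps=0.3):
    """track_a/track_b: lists of (x, y) positions per frame, same length.
    dt: time between consecutive frames, seconds.

    Flags a candidate collision between two tracked elements when their final
    relative distance is below the second threshold AND their relative speed
    has dropped below the threshold speed (i.e. near standstill).
    """
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    gaps = [dist(a, b) for a, b in zip(track_a, track_b)]
    rel_speed = abs(gaps[-1] - gaps[-2]) / dt  # closing speed over last step
    return gaps[-1] < second_threshold_m and rel_speed < threshold_speed_mps
```

A production system would smooth the gap sequence over several frames rather than use a single step, to be robust to detection jitter.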
For the identification of traffic events associated with road conditions, the terminal device 125 may determine the road condition of a first road based on the environmental image. Further, if the road condition is below a threshold level, the terminal device 125 may determine that a traffic event associated with the first road has occurred.
For example, the terminal device 125 may utilize the image recognition model to detect certain obstacles (e.g., rocks) on the road, cracks in the road pavement, potholes, ponding, mud, etc., in the environmental image 130. Further, the terminal device 125 may determine that a traffic event associated with the road has occurred, such as road debris, road pavement cracking, and the like.
For the identification of traffic events associated with traffic restrictions, the terminal device 125 may detect a traffic restriction sign associated with a second road based on the environmental image. If the traffic restriction sign is detected, the terminal device 125 may determine that a traffic event associated with the second road has occurred.
Illustratively, the terminal device 125 may detect, for example, a road prohibition sign, a road closure sign, or a road maintenance sign from the environmental image 130, to determine that a corresponding road prohibition event, road closure event, or road maintenance event associated with the road has occurred.
For traffic events associated with a particular traffic participant, the terminal device 125 may determine the number of particular traffic participants within the predetermined area based on the environmental image. Further, if the number is greater than the threshold number, the terminal device 125 determines that a traffic event associated with a particular traffic participant has occurred.
For example, the terminal device 125 may utilize environmental images acquired over a predetermined period of time to determine the number of particular traffic participants. Such specific participants may include, for example: large vehicles, pedestrians, wandering animals, etc.
As another example, where legal regulations allow and no relevant privacy data is revealed, the terminal device 125 may also locally identify the number of certain specific traffic participants (e.g., students, elderly people) and, if the number is greater than a threshold number, determine that a traffic event of high-frequency occurrence of those specific traffic participants has occurred.
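The counting logic for specific traffic participants can be illustrated as follows; the participant labels and the default threshold are hypothetical, and the `detections` list stands in for aggregated per-frame detection results over the predetermined area and period.

```python
from collections import Counter

def detect_high_frequency_event(detections, participant_kind, threshold_count=10):
    """detections: iterable of participant kinds seen in a predetermined area
    over a predetermined period (e.g. labels from per-frame detection results).

    Returns True when the number of the given specific participant kind
    exceeds the threshold, i.e. a traffic event associated with that
    participant (such as high pedestrian density) is deemed to have occurred.
    """
    counts = Counter(detections)
    return counts[participant_kind] > threshold_count
```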
Based on the manner discussed above, embodiments of the present disclosure can detect, in real time, various types of traffic events occurring in the surrounding environment based on images of the vehicle's surroundings. In addition, considering the comprehensive coverage of the travel paths of travel service vehicles, embodiments of the disclosure can further improve the timeliness and accuracy of traffic event detection and reduce its cost.
Reporting of traffic events
As discussed with reference to fig. 1, after identifying a traffic event based on the environmental image, the terminal device 125 may initiate a report to a remote device (e.g., the edge computing device 135 and/or the cloud device 140).
In some embodiments, considering the accuracy limits of image-based recognition, the terminal device 125 may first send descriptive information associated with the traffic event to the edge computing device 135 after detecting the traffic event. Then, upon receiving a confirmation of the traffic event from the edge computing device, the terminal device 125 may send the reporting information associated with the traffic event to the cloud device.
Taking fig. 1 as an example, the terminal devices 125-1 and 125-2 may each detect the same traffic event 120. Thus, the edge computing device 135 may receive descriptive information about the same traffic event from multiple terminal devices.
Further, the edge computing device 135 may determine the confidence level of the traffic event 120 based on descriptive information from a plurality of terminal devices. For example, the edge computing device 135 may determine the number of a set of terminal devices that sent descriptive information about the same traffic event and determine the confidence level of the traffic event 120 based on the number.
Illustratively, if there are multiple terminal devices reporting the same traffic event, the edge computing device 135 may consider the traffic event to have a higher confidence and generate a confirmation for the traffic event.
In some embodiments, the edge computing device 135 may associate different terminal devices to the same traffic event based on the descriptive information sent by them. For example, the edge computing device 135 may determine that the terminal device 125-1 and the terminal device 125-2 identified the same traffic event based on the location and type of the traffic event indicated in the descriptive information sent by the terminal device 125-1 and the location and type of the traffic event indicated in the descriptive information sent by the terminal device 125-2.
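The edge-side aggregation described above — associating descriptive information from different terminals to the same event by location and type, then confirming the event when enough distinct terminals report it — might look like the following sketch. The report fields, the coarse grid-based location matching, and the two-terminal minimum are assumptions for illustration only.

```python
from collections import defaultdict

def confirm_events(reports, min_terminals=2, grid_m=50.0):
    """reports: list of dicts like
        {"terminal": "125-1", "type": "collision", "x": 103.2, "y": 44.8}

    Groups reports that describe the same event (same type, locations falling
    in the same coarse grid cell) and confirms an event when it was reported
    by at least `min_terminals` distinct terminals. Returns a mapping from
    event key to the number of reporting terminals (a simple confidence proxy).
    """
    groups = defaultdict(set)
    for r in reports:
        key = (r["type"], round(r["x"] / grid_m), round(r["y"] / grid_m))
        groups[key].add(r["terminal"])
    return {key: len(terms) for key, terms in groups.items()
            if len(terms) >= min_terminals}
```

The number of distinct terminals per group can serve directly as the confidence signal mentioned above, with a richer model (recency weighting, per-terminal reliability) layered on in practice.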
In some embodiments, the edge computing device 135 may also receive the environmental image 130 from the terminal device 125 for determining the traffic event 120, and use its greater computing power (e.g., a more accurate recognition model) to confirm whether the traffic event occurred.
In some embodiments, upon detecting the traffic event 120 or receiving a confirmation about the traffic event 120 from the edge computing device 135, the terminal device 125 may send the reported information about the traffic event 120 directly to the cloud device 140, or to the cloud device 140 via the edge computing device 135.
In some embodiments, the reported information may include a type of traffic event, a location of occurrence of the traffic event, a time of occurrence of the traffic event, or a combination of more than one of the foregoing.
Illustratively, the terminal device 125 may determine the type of the traffic event 120 based on the environmental image 130; determine the occurrence location of the traffic event 120 based on a ranging algorithm; and determine the occurrence time of the traffic event based on the acquisition time of the environmental image 130.
In some embodiments, the terminal device 125 may also determine the severity of the traffic event 120 based on the environmental image 130 and indicate the severity of the traffic event 120 in the reported information. For example, for a crash incident, the terminal device 125 may utilize an image-based vehicle deformation detection algorithm to determine the extent of damage to the incident vehicle and, in turn, the severity of the traffic event 120.
In some embodiments, for scenarios where the vehicle 110 is a service vehicle for providing travel services, the reported information may also include trip information for a service trip associated with the travel service currently provided by the service vehicle. For example, where legal regulations allow and no relevant privacy data is revealed, the terminal device 125 may indicate a trip identifier, trip origin information, etc. of the service trip in the reported information.
In some embodiments, the terminal device 125 may also include a set of event images in the reported information. Such a set of event images may include, for example, one or more environmental images 130, or videos of a predetermined duration, used for identifying the traffic event 120.
It should be appreciated that, as permitted by law, appropriate privacy protection or anonymization is required for any private content potentially contained in the environmental image 130 acquired by the image acquisition device 115 before it is uploaded to the cloud device 140. For example, the terminal device 125 may perform a desensitization operation after acquisition of the environmental image 130 is completed, so as to avoid leakage of private data.
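As a toy illustration of such a desensitization step, the following masks out given regions of an image (e.g., detected faces or license plates) before upload. Region detection itself and production-grade blurring are outside this sketch; the 2D-list image representation is an assumption for simplicity.

```python
def desensitize(image, regions, fill=0):
    """image: 2D list of pixel values; regions: list of (top, left, bottom,
    right) boxes, half-open on the bottom/right edges.

    Returns a copy with each region filled with a constant value, as a
    minimal stand-in for blurring/anonymization before upload. The input
    image is left unmodified.
    """
    out = [row[:] for row in image]
    for top, left, bottom, right in regions:
        for r in range(top, bottom):
            for c in range(left, right):
                out[r][c] = fill
    return out
```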
By indicating the trip information of the travel service in the reported information, embodiments of the present disclosure can help the cloud device determine the location and/or time of the traffic event more conveniently, and further reduce communication and computation costs.
Reminding of traffic event
As discussed with reference to fig. 1, upon receiving the reporting information regarding the traffic event 120, the cloud device 140 may send a reminder regarding the traffic event 120 to a particular terminal device 145.
In some embodiments, the cloud device 140 may determine a set of trips expected to be affected by the traffic event 120 using at least a portion of the reported information. Such trips may include planned trips that have not yet been initiated and/or ongoing trips.
For example, planned trips may include a future trip planned by a user through a navigation application, a trip planned by a user through a travel application, a reserved future trip, and/or a trip being called and awaiting response, and so forth.
By way of example, the ongoing itinerary may include, for example, a current itinerary being navigated by the user through the navigation application, an ongoing itinerary (including an itinerary in a pick-up, an itinerary after pick-up) by the user through the travel application, and so on.
In some embodiments, the cloud device 140 may utilize at least a portion of the reported information to determine an impact time of the traffic event 120, and determine a set of trips expected to be impacted by the traffic event 120 based on the impact time.
In some embodiments, the cloud device 140 may utilize a machine learning model to determine the impact time of the traffic event 120. Such a machine learning model may include any suitable model, such as a neural network (NN), a decision tree, a multi-layer perceptron (MLP), or support vector regression (SVR), which is not limited by the present disclosure.
Specifically, the cloud device 140 may determine input features to the time estimation model based on at least a portion of the reported information, and determine an impact time of the traffic event using the time estimation model.
In some embodiments, the cloud device may determine, for example, a type of the traffic event, an occurrence location of the traffic event, an occurrence time of the traffic event, or a combination thereof based on the reported information, and determine at least a portion of the input features to the time estimation model based thereon.
Additionally or alternatively, the input features may also indicate reporting terminal information for the traffic event, wherein the reporting terminal information may indicate at least the number of terminals reporting the same traffic event. It should be appreciated that the number of reporting terminals may be determined by the edge computing device 135 and/or the cloud device 140.
Additionally or alternatively, the input features may also indicate, for example, level information of the traffic event, wherein the level information may indicate a severity of the traffic event (e.g., a degree of damage to the accident vehicle). Such level information may be determined by the terminal device 125 and included in the reported information, or by the edge computing device 135 and/or the cloud device 140.
Additionally or alternatively, the input features may also indicate traffic state information associated with the traffic event 120. For example, the cloud device 140 may determine a traffic congestion state of the corresponding road when the traffic event 120 occurs, and take it as an input feature to the time estimation model.
Additionally or alternatively, the input features may also indicate whether the traffic event 120 is a real-time detected event. For example, some terminal devices 125 may report a traffic event that occurred some time earlier.
In some embodiments, one or more items of the information discussed above may be appropriately encoded as input to the time estimation model, thereby determining the impact time of the traffic event 120. Such an impact time may, for example, indicate the length of time for which the traffic event 120 is expected to affect normal traffic.
It should be appreciated that the time estimation model may be trained using an appropriate training data set, and the training process for the time estimation model is not described in detail herein.
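As an illustration of the feature-encoding step described above, the following sketch encodes one report as a numeric vector and feeds it to a stand-in linear model. The field names, the one-hot encoding, and the linear form are assumptions for illustration only; the disclosure leaves the model architecture open (NN, decision tree, MLP, SVR, etc.).

```python
from dataclasses import dataclass

# Hypothetical event-type vocabulary; labels are illustrative, not from the disclosure.
EVENT_TYPES = {"collision": 0, "road_damage": 1, "restriction": 2, "crowd": 3}

@dataclass
class ReportedInfo:
    event_type: str      # type of the traffic event
    severity: int        # level information (e.g., 1..5)
    reporter_count: int  # number of terminals reporting the same event
    congestion: float    # traffic congestion state in [0, 1]
    is_realtime: bool    # whether the event was detected in real time

def encode_features(info: ReportedInfo) -> list:
    """Encode one report as a numeric feature vector for the time estimation model."""
    one_hot = [0.0] * len(EVENT_TYPES)
    one_hot[EVENT_TYPES[info.event_type]] = 1.0
    return one_hot + [
        float(info.severity),
        float(info.reporter_count),
        info.congestion,
        1.0 if info.is_realtime else 0.0,
    ]

def estimate_impact_minutes(features: list, weights: list, bias: float) -> float:
    """Stand-in for the trained time estimation model (here: a clipped linear model)."""
    return max(0.0, bias + sum(w * x for w, x in zip(weights, features)))
```

In practice the weights would come from training on historical events, as the passage above notes; the clipping at zero simply reflects that an impact time cannot be negative.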
Further, the cloud device 140 may determine an impact time period of the traffic event based on the impact time and the occurrence time of the traffic event, and may in turn determine a target trip expected to pass through an area associated with the occurrence location of the traffic event during the impact time period.
For example, the cloud device 140 may determine a predetermined area based on the occurrence location of the traffic event 120 and, in turn, screen out trips that may travel through the predetermined area during the impact period. For example, the cloud device 140 may obtain the location of a terminal of the navigation application and/or a terminal of the travel application and determine an estimated time of arrival (ETA) at which it reaches the predetermined area. If it is determined based on the ETA that the terminal will travel through the area during the impact period of the traffic event, the cloud device 140 may determine the trip associated with the terminal as a trip that is expected to be affected by the traffic event 120.
In some embodiments, cloud device 140 may also obtain a set of service trips from the travel service platform. Further, the cloud device 140 may determine a target trip expected to travel through the area during the impact period from a set of service trips based on the time information and route information of the set of service trips.
For a travel service trip, the cloud device 140 may, for example, periodically obtain track information corresponding to the travel service trip, so that the ETA of the travel service trip at the predetermined area may be conveniently determined. If it is determined based on the ETA that the travel service trip will travel through the area during the impact period of the traffic event, the cloud device 140 may determine the travel service trip as a trip that is expected to be affected by the traffic event 120.
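The ETA-based screening described in the preceding paragraphs can be sketched as follows; the trip identifiers, the dictionary shape, and the inclusive interval test are illustrative assumptions.

```python
from datetime import datetime, timedelta

def impact_period(occurred_at: datetime, impact_minutes: float):
    """Impact period = [occurrence time, occurrence time + impact time]."""
    return occurred_at, occurred_at + timedelta(minutes=impact_minutes)

def screen_affected_trips(trip_etas: dict, occurred_at: datetime,
                          impact_minutes: float) -> list:
    """Keep the trips whose ETA at the predetermined area falls inside
    the impact period of the event."""
    start, end = impact_period(occurred_at, impact_minutes)
    return [trip_id for trip_id, eta in trip_etas.items()
            if start <= eta <= end]
```

The same check applies whether the ETA comes from a navigation terminal's position or from the periodically obtained track of a travel service trip.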
Further, the cloud device 140 may send a reminder regarding the traffic event 120 to a set of terminal devices associated with the determined set of trips.
As discussed with reference to fig. 1, such a set of terminal devices may include, for example, vehicle-mounted terminal devices and/or appropriate terminal devices for use by drivers, security officers, or passengers associated with a trip.
In some embodiments, the alert sent to the group of terminal devices (e.g., terminal device 145 shown in fig. 1) may indicate at least one of the following: the type of the traffic event, the occurrence location of the traffic event, the occurrence time of the traffic event, and the impact time and/or impact time period of the traffic event.
In some embodiments, the alert sent to the terminal device 145 may also include at least one target image for indicating the traffic event 120. Specifically, the cloud device 140 may determine at least one target image for indicating a traffic event, for example, from the received set of event images. For example, the cloud device 140 may determine at least one image from a set of event images that is capable of characterizing the traffic event 120 as a target image based on an appropriate key frame detection algorithm. For example, for an example in which the uploaded set of event images includes a video of a predetermined length, the cloud device 140 may utilize a key frame detection algorithm to determine one or more key frames as target images.
In this way, the cloud device 140 may make the terminal device 145 more intuitively perceive the traffic event 120 and facilitate its corresponding decision making.
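A minimal sketch of the key-frame selection step follows. It scores frames by inter-frame pixel difference, a deliberately simplified stand-in for whatever key frame detection algorithm a deployment would actually use, and represents frames as flat pixel lists.

```python
def select_key_frames(frames: list, top_k: int = 1) -> list:
    """Toy key-frame detector: score each frame by its absolute pixel-wise
    difference from the previous frame, then return the top_k highest-scoring
    frames in temporal order. Frames are flat lists of pixel intensities."""
    scores = []
    for i, frame in enumerate(frames):
        prev = frames[i - 1] if i > 0 else frame
        scores.append((sum(abs(a - b) for a, b in zip(frame, prev)), i))
    top_indices = sorted(i for _, i in sorted(scores, reverse=True)[:top_k])
    return [frames[i] for i in top_indices]
```

On an uploaded clip of predetermined length, the frames with the largest scene change would be pushed as the target images described above.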
In some embodiments, for the scenario of travel services, a particular service trip may be affected by the traffic event 120, for example. In such a case, the cloud device 140 may, for example, send a first reminder to a first terminal device corresponding to a service provider (e.g., the driver of the travel service) associated with the particular service trip. Additionally or alternatively, the cloud device 140 may also send a second reminder to a second terminal device corresponding to a service demander (e.g., the passenger of the travel service) associated with the particular service trip.
In some embodiments, the cloud device 140 may send the first alert to the first terminal device in a first manner and send the second alert to the second terminal device in a second manner, where the first manner is different from the second manner.
For example, to avoid interference with the driver, the cloud device 140 may send the first alert to the first terminal device in a voice manner. Conversely, in order to make the passenger more clearly aware of the details of the traffic event, the cloud device 140 may send the second alert to the second terminal device, for example, by using an in-application alert.
In some embodiments, the second reminder indicates additional information about the traffic event as compared to the first reminder. Specifically, the additional information includes, for example, image information associated with the traffic event. That is, the cloud device 140 may, for example, provide the image information to the passenger's terminal but not to the driver's terminal.
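The two delivery channels can be sketched as below; the reminder fields and channel names are hypothetical, not from the disclosure.

```python
def build_reminders(event: dict) -> tuple:
    """Role-dependent delivery: a short voice summary for the driver's
    terminal, and a richer in-app card (including the image information)
    for the passenger's terminal."""
    summary = f"Traffic event ahead: {event['type']} near {event['location']}."
    driver_reminder = {"channel": "voice", "text": summary}
    passenger_reminder = {
        "channel": "in_app",
        "text": summary,
        "images": event.get("images", []),        # additional image information
        "impact_minutes": event.get("impact_minutes"),
    }
    return driver_reminder, passenger_reminder
```

The asymmetry mirrors the design choice above: voice avoids distracting the driver, while the in-app card lets the passenger inspect the event in detail.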
In some embodiments, the second terminal device may also receive an acknowledgement of the traffic event 120 from the service demander after receiving the second alert. For example, the passenger may view, through the second terminal device, a scene picture and a type description of the traffic event 120 pushed by the cloud device 140. Further, the passenger may confirm, for example through the second terminal device, that the identification of the traffic event 120 is accurate. Thus, the second terminal device may send a confirmation regarding the traffic event to the cloud device 140.
Further, the cloud device 140 may cause the second terminal device to provide an entry for changing the route of the first travel service trip. For example, after the passenger confirms the traffic event 120, the second terminal device may provide an entry to an alternative route for avoiding the traffic event 120.
For example, when the service demander selects the entry, the cloud device 140 and/or the second terminal device may generate at least one recommended route that bypasses the traffic event 120.
Based on such a manner, embodiments of the present disclosure reduce the possible impact of traffic events on travel services, thereby improving the efficiency of travel services.
In some embodiments, the cloud device 140 may also send a reminder to terminal devices associated with a plurality of vehicles within a predetermined range of the occurrence location of the traffic event, regardless of whether those vehicles are undertaking a related trip or whether their trip information has been obtained.
For example, some idle travel service vehicles may not yet have picked up passengers or may not be on a trip. The cloud device 140 may determine that these idle vehicles are within a predetermined range of the traffic event and send a reminder about the traffic event to their corresponding terminal devices (e.g., driver-related terminal devices or vehicle-mounted devices). In this way, embodiments of the present disclosure are also able to avoid the impact of traffic events on other surrounding vehicles.
In some embodiments, the cloud device 140 may also determine the push range more accurately based on the direction of travel of the vehicle. For example, the cloud device 140 may refrain from pushing a reminder to a vehicle (or its associated terminal) that is within the predetermined range but is driving out of that range. In some embodiments, such a predetermined range may also be determined based on, for example, the impact time of the traffic event discussed above. For example, a longer impact time may correspond to a greater range.
In this way, the embodiment of the disclosure can realize more accurate information push, and avoid interference to irrelevant terminals or vehicles.
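The direction- and impact-time-aware push range above can be sketched as follows; the radius growth rate, the 90-degree heading test, and the local planar coordinates (in kilometres) are all assumptions for illustration.

```python
import math

def should_push(vehicle_xy: tuple, heading_deg: float, event_xy: tuple,
                impact_minutes: float, base_radius_km: float = 1.0,
                km_per_minute: float = 0.1) -> bool:
    """Push only to vehicles inside a radius that grows with the impact
    time, and only if the vehicle is heading toward the event (within
    90 degrees of the bearing to it)."""
    radius_km = base_radius_km + km_per_minute * impact_minutes
    dx, dy = event_xy[0] - vehicle_xy[0], event_xy[1] - vehicle_xy[1]
    if math.hypot(dx, dy) > radius_km:
        return False  # outside the (impact-time-dependent) push range
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    diff = abs((heading_deg - bearing + 180) % 360 - 180)
    return diff <= 90  # heading toward, not out of, the range
```

A real deployment would use geodesic distances and the planned route rather than an instantaneous heading, but the filtering idea is the same.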
Example procedure
Fig. 2 illustrates a flow chart of a process 200 for risk early warning according to various embodiments of the present disclosure. Process 200 may be implemented by cloud device 140 as shown in fig. 1.
At block 210, the cloud device 140 obtains reporting information regarding traffic events, wherein detection of traffic events is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles.
At block 220, the cloud device 140 determines a set of trips expected to be affected by the traffic event, the set of trips including planned trips and/or ongoing trips that have not yet been initiated, using at least a portion of the reported information.
At block 230, the cloud device 140 sends a reminder associated with the traffic event to a set of terminal devices associated with a set of trips.
In some embodiments, determining a set of trips expected to be affected by a traffic event includes: determining the influence time of the traffic event by utilizing at least part of the reported information; and determining a set of trips expected to be impacted by the traffic event based on the impact time.
In some embodiments, determining the impact time includes: determining input features to the time estimation model based on at least part of the reported information; and determining the influence time of the traffic event by using the time estimation model.
In some embodiments, the input features indicate at least one of the following: the type of the traffic event; the occurrence location of the traffic event; the occurrence time of the traffic event; reporting terminal information for the traffic event, the reporting terminal information indicating at least the number of terminals reporting the traffic event; level information of the traffic event, the level information indicating a severity of the traffic event; and traffic state information associated with the traffic event.
In some embodiments, determining a set of trips expected to be affected by a traffic event based on the time of impact comprises: determining an influence time period of the traffic event based on the influence time and the occurrence time of the traffic event; determining a target journey that is expected to pass through an area associated with the location of occurrence of the traffic event during the impact period; and determining a set of trips expected to be affected by the traffic event based on the target trips.
In some embodiments, determining the target trip includes: acquiring a group of service trips from a travel service platform; and determining a target trip expected to travel through the area during the period of influence from the set of service trips based on the time information and the route information for the set of service trips.
In some embodiments, process 200 includes: determining at least one target image for indicating a traffic event from a set of event images; and generating a reminder for transmission to a group of terminal devices based on the at least one target image.
In some embodiments, the set of event images includes a reported video and the at least one target image includes a keyframe determined from the video.
In some embodiments, the alert indicates at least one of: the type of traffic event, the location of occurrence of the traffic event, the time of occurrence of the traffic event, and the time of impact and/or time period of impact of the traffic event.
In some embodiments, the set of trips includes a first service trip associated with a travel service, and the set of terminal devices includes: a first terminal device corresponding to a service provider associated with the first service trip and/or a second terminal device corresponding to a service demander associated with the first service trip.
In some embodiments, sending a reminder associated with a traffic event to a set of terminal devices associated with a set of trips includes: sending a first reminder to a first terminal device in a first manner; and sending a second alert to the second terminal device in a second manner, the first manner being different from the second manner.
In some embodiments, the second reminder indicates additional information about the traffic event as compared to the first reminder, wherein the additional information includes image information associated with the traffic event.
In some embodiments, the process 200 further comprises: receiving an acknowledgement from the second terminal device regarding the traffic event; and causing the second terminal device to provide an entry for changing the route of the first service trip.
In some embodiments, the set of vehicles includes a service vehicle for providing travel services, and the acquisition device includes an image acquisition device onboard the service vehicle.
In some embodiments, the reported information includes trip information for a target travel service trip associated with the service vehicle, the target travel service trip including: a service trip undertaken by the service vehicle when the image acquisition device acquired the associated event image.
In some embodiments, the process 200 further comprises: determining the occurrence position of a traffic event; and transmitting a reminder associated with the traffic event to a terminal device associated with at least one of the plurality of vehicles based on the directions of travel of the plurality of vehicles within a predetermined range of the occurrence location.
In some embodiments, the predetermined range is determined based on an impact time of the traffic event.
Fig. 3 shows a flow chart of a process 300 for risk early warning according to further embodiments of the present disclosure. The process 300 may be implemented by the terminal device 125 as shown in fig. 1.
As shown in fig. 3, at block 310, the terminal device 125 acquires an environmental image of a vehicle, the environmental image being acquired by an acquisition device associated with the vehicle, the vehicle being a service vehicle for providing travel services.
At block 320, the terminal device 125 identifies a traffic event based on the environmental image.
At block 330, the terminal device 125 sends reporting information associated with the traffic event to a remote device, the remote device including an edge computing device and/or a cloud device, the reporting information including: travel information of a service travel associated with travel services currently provided by the service vehicle, and an event image associated with a traffic event.
In some embodiments, identifying the traffic event includes: detecting the distance between the surrounding traffic elements of the vehicle and the vehicle based on the environment image; and in response to the distance being less than the first threshold distance, determining that a traffic event associated with the vehicle occurred.
In some embodiments, identifying the traffic event includes: tracking a first traffic element and a second traffic element around the vehicle based on the environmental image; and in response to detecting that the distance between the first traffic element and the second traffic element is less than the second threshold distance and the relative speed between the first traffic element and the second traffic element is less than the threshold speed, determining that a traffic event associated with the first traffic element and the second traffic element occurred.
In some embodiments, identifying the traffic event includes: determining a road condition of the first road based on the environmental image; and determining that a traffic event associated with the first road has occurred in response to the road condition being below the threshold level.
In some embodiments, identifying the traffic event includes: in response to identifying a traffic restriction sign associated with a second road based on the environmental image, determining that a traffic event associated with the second road occurred.
In some embodiments, identifying the traffic event includes: determining a number of particular traffic participants within a predetermined area based on the environmental image; and in response to the number being greater than the threshold number, determining that a traffic event associated with the particular traffic participant occurred.
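The identification rules in the embodiments above can be combined into a single rule set, sketched below; all thresholds, dictionary keys, and event labels are illustrative assumptions about what the vehicle-side perception output might look like.

```python
# Illustrative thresholds; real values would be calibrated per deployment.
EGO_THRESHOLD_M = 2.0       # first threshold distance (ego proximity)
PAIR_THRESHOLD_M = 1.0      # second threshold distance (two tracked elements)
THRESHOLD_SPEED_MPS = 0.5   # relative speed for a suspected collision
THRESHOLD_CONDITION = 0.3   # road condition score in [0, 1]
THRESHOLD_COUNT = 20        # number of particular traffic participants

def identify_events(obs: dict) -> list:
    """Apply the rule set above to one frame of perception output."""
    events = []
    if obs.get("ego_nearest_distance_m", float("inf")) < EGO_THRESHOLD_M:
        events.append("ego_collision_risk")
    pair = obs.get("tracked_pair")  # (distance_m, relative_speed_mps)
    if pair and pair[0] < PAIR_THRESHOLD_M and pair[1] < THRESHOLD_SPEED_MPS:
        events.append("third_party_collision")
    if obs.get("road_condition", 1.0) < THRESHOLD_CONDITION:
        events.append("road_damage")
    if obs.get("restriction_sign_detected", False):
        events.append("traffic_restriction")
    if obs.get("participant_count", 0) > THRESHOLD_COUNT:
        events.append("crowd_gathering")
    return events
```

Each rule corresponds to one of the embodiments above; a terminal device could report any nonempty result as a candidate traffic event.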
In some embodiments, transmitting the reported information associated with the traffic event to the remote device includes: transmitting descriptive information associated with the traffic event to an edge computing device; and in response to receiving a confirmation from the edge computing device regarding the traffic event, sending reporting information associated with the traffic event to the cloud device.
In some embodiments, the confirmation regarding the traffic event is based on the number of terminal devices of a group of terminal devices that sent descriptive information regarding the same traffic event to the edge computing device.
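One simple reading of this count-based confirmation is a quorum over distinct reporting terminals, sketched below; the quorum value and the class shape are assumptions.

```python
from collections import defaultdict

class EdgeConfirmer:
    """Confirm a traffic event once reports about it have arrived from at
    least `quorum` distinct terminals."""
    def __init__(self, quorum: int = 2):
        self.quorum = quorum
        self._reports = defaultdict(set)  # event_id -> reporting terminal ids

    def report(self, event_id: str, terminal_id: str) -> bool:
        """Record one report; return True once the event is confirmed."""
        self._reports[event_id].add(terminal_id)
        return len(self._reports[event_id]) >= self.quorum
```

Using a set of terminal identifiers means repeated reports from the same terminal do not inflate the count, which matches the intent of counting distinct reporting terminals.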
In some embodiments, the reporting information further indicates at least one of: the type of traffic event, the location of occurrence of the traffic event, and the time of occurrence of the traffic event.
In some embodiments, the reported information also indicates a severity of the traffic event, wherein the severity is determined based on an event image associated with the traffic event.
In some embodiments, the terminal device includes: a vehicle-mounted terminal device mounted on the vehicle, a terminal device associated with a service demander of the service trip, and/or a terminal device associated with a service provider of the service trip.
Example apparatus and apparatus
Embodiments of the present disclosure also provide corresponding apparatus for implementing the above-described methods or processes. Fig. 4 shows a schematic block diagram of an apparatus 400 for risk early warning according to some embodiments of the present disclosure.
As shown in fig. 4, the apparatus 400 includes an information acquisition module 410 configured to acquire reporting information regarding traffic events, wherein detection of the traffic events is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles. The apparatus 400 further includes a trip determination module 420 configured to determine a set of trips expected to be affected by the traffic event, the set of trips including planned trips and/or ongoing trips that have not yet been initiated, using at least a portion of the reported information. In addition, the apparatus 400 includes a reminder sending module 430 configured to send a reminder associated with a traffic event to a set of terminal devices associated with a set of trips.
In some embodiments, the trip determination module 420 is further configured to: determining the influence time of the traffic event by utilizing at least part of the reported information; and determining a set of trips expected to be impacted by the traffic event based on the impact time.
In some embodiments, the trip determination module 420 is further configured to: determining input features to the time estimation model based on at least part of the reported information; and determining the influence time of the traffic event by using the time estimation model.
In some embodiments, the input features indicate at least one of the following: the type of the traffic event; the occurrence location of the traffic event; the occurrence time of the traffic event; reporting terminal information for the traffic event, the reporting terminal information indicating at least the number of terminals reporting the traffic event; level information of the traffic event, the level information indicating a severity of the traffic event; and traffic state information associated with the traffic event.
In some embodiments, the trip determination module 420 is further configured to: determining an influence time period of the traffic event based on the influence time and the occurrence time of the traffic event; determining a target journey that is expected to pass through an area associated with the location of occurrence of the traffic event during the impact period; and determining a set of trips expected to be affected by the traffic event based on the target trips.
In some embodiments, the trip determination module 420 is further configured to: acquiring a group of service trips from a travel service platform; and determining a target trip expected to travel through the area during the period of influence from the set of service trips based on the time information and the route information for the set of service trips.
In some embodiments, alert sending module 430 is further configured to: determining at least one target image for indicating a traffic event from a set of event images; and generating a reminder for transmission to a group of terminal devices based on the at least one target image.
In some embodiments, the set of event images includes a reported video and the at least one target image includes a keyframe determined from the video.
In some embodiments, the alert indicates at least one of: the type of traffic event, the location of occurrence of the traffic event, the time of occurrence of the traffic event, and the time of impact and/or time period of impact of the traffic event.
In some embodiments, the set of trips includes a first service trip associated with a travel service, and the set of terminal devices includes: a first terminal device corresponding to a service provider associated with the first service trip and/or a second terminal device corresponding to a service demander associated with the first service trip.
In some embodiments, alert sending module 430 is further configured to: sending a first reminder to a first terminal device in a first manner; and sending a second alert to the second terminal device in a second manner, the first manner being different from the second manner.
In some embodiments, the second reminder indicates additional information about the traffic event as compared to the first reminder, wherein the additional information includes image information associated with the traffic event.
In some embodiments, the alert sending module 430 is further configured to: receiving an acknowledgement from the second terminal device regarding the traffic event; and causing the second terminal device to provide an entry for changing the route of the first service trip.
In some embodiments, the set of vehicles includes a service vehicle for providing travel services, and the acquisition device includes an image acquisition device onboard the service vehicle.
In some embodiments, the reported information includes trip information for a target travel service trip associated with the service vehicle, the target travel service trip including: a service trip undertaken by the service vehicle when the image acquisition device acquired the associated event image.
In some embodiments, the apparatus 400 is further configured to: determining the occurrence location of a traffic event; and transmitting a reminder associated with the traffic event to a terminal device associated with at least one of a plurality of vehicles based on the directions of travel of the plurality of vehicles within a predetermined range of the occurrence location.
In some embodiments, the predetermined range is determined based on an impact time of the traffic event.
Embodiments of the present disclosure also provide corresponding apparatus for implementing the above-described methods or processes. Fig. 5 shows a schematic block diagram of an apparatus 500 for risk early warning according to further embodiments of the present disclosure.
As shown in fig. 5, the apparatus 500 includes an image acquisition module 510 configured to acquire, by a terminal device, an environmental image of a vehicle, the environmental image being acquired by an acquisition device associated with the vehicle, the vehicle being a service vehicle for providing travel services. The apparatus 500 further includes an event identification module 520 configured to identify a traffic event based on the environmental image. In addition, the apparatus 500 further includes an information reporting module 530 configured to send reporting information associated with the traffic event to a remote device, the remote device including an edge computing device and/or a cloud device, the reporting information including: travel information of a service travel associated with travel services currently provided by the service vehicle, and an event image associated with a traffic event.
In some embodiments, event identification module 520 is further configured to: detecting the distance between the surrounding traffic elements of the vehicle and the vehicle based on the environment image; and in response to the distance being less than the first threshold distance, determining that a traffic event associated with the vehicle occurred.
In some embodiments, event identification module 520 is further configured to: tracking a first traffic element and a second traffic element around the vehicle based on the environmental image; and in response to detecting that the distance between the first traffic element and the second traffic element is less than the second threshold distance and the relative speed between the first traffic element and the second traffic element is less than the threshold speed, determining that a traffic event associated with the first traffic element and the second traffic element occurred.
In some embodiments, event identification module 520 is further configured to: determining a road condition of the first road based on the environmental image; and determining that a traffic event associated with the first road has occurred in response to the road condition being below the threshold level.
In some embodiments, the event identification module 520 is further configured to: in response to identifying a traffic restriction sign associated with a second road based on the environmental image, determining that a traffic event associated with the second road occurred.
In some embodiments, event identification module 520 is further configured to: determining a number of particular traffic participants within a predetermined area based on the environmental image; and in response to the number being greater than the threshold number, determining that a traffic event associated with the particular traffic participant occurred.
In some embodiments, the information reporting module 530 is further configured to: transmitting descriptive information associated with the traffic event to an edge computing device; and in response to receiving a confirmation from the edge computing device regarding the traffic event, sending reporting information associated with the traffic event to the cloud device.
In some embodiments, the confirmation regarding the traffic event is based on the number of terminal devices of a group of terminal devices that sent descriptive information regarding the same traffic event to the edge computing device.
In some embodiments, the reporting information further indicates at least one of: the type of traffic event, the location of occurrence of the traffic event, and the time of occurrence of the traffic event.
In some embodiments, the reported information also indicates a severity of the traffic event, wherein the severity is determined based on an event image associated with the traffic event.
In some embodiments, the terminal device includes: a vehicle-mounted terminal device mounted on the vehicle, a terminal device associated with a service demander of the service trip, and/or a terminal device associated with a service provider of the service trip.
The elements included in apparatus 400 and/or apparatus 500 may be implemented in various ways, including software, hardware, firmware, or any combination thereof. In some embodiments, one or more units may be implemented using software and/or firmware, such as machine executable instructions stored on a storage medium. In addition to or in lieu of machine-executable instructions, some or all of the elements of apparatus 400 and/or apparatus 500 may be at least partially implemented by one or more hardware logic components. By way of example and not limitation, exemplary types of hardware logic components that can be used include Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
Fig. 6 illustrates a block diagram of an electronic device/server 600 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device/server 600 illustrated in fig. 6 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein.
As shown in fig. 6, the electronic device/server 600 is in the form of a general-purpose electronic device. The components of electronic device/server 600 may include, but are not limited to, one or more processors or processing units 610, memory 620, storage 630, one or more communication units 640, one or more input devices 650, and one or more output devices 660. The processing unit 610 may be an actual or virtual processor and is capable of performing various processes according to programs stored in the memory 620. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capabilities of the electronic device/server 600.
The electronic device/server 600 typically includes a number of computer storage media. Such media may be any available media accessible by the electronic device/server 600, including, but not limited to, volatile and non-volatile media, and removable and non-removable media. The memory 620 may be volatile memory (e.g., registers, cache, Random Access Memory (RAM)), non-volatile memory (e.g., Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory), or some combination thereof. The storage device 630 may be a removable or non-removable medium and may include a machine-readable medium such as a flash drive, a magnetic disk, or any other medium capable of storing information and/or data (e.g., training data for training) and accessible within the electronic device/server 600.
The electronic device/server 600 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 6, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 620 may include a computer program product 625 having one or more program modules configured to perform the various methods or acts of the various embodiments of the disclosure.
The communication unit 640 enables communication with other electronic devices through a communication medium. Additionally, the functionality of the components of the electronic device/server 600 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Thus, the electronic device/server 600 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 650 may be one or more input devices such as a mouse, keyboard, trackball, etc. The output device 660 may be one or more output devices such as a display, speakers, printer, etc. The electronic device/server 600 may also communicate, as needed through the communication unit 640, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that enable a user to interact with the electronic device/server 600, or with any device (e.g., network card, modem, etc.) that enables the electronic device/server 600 to communicate with one or more other electronic devices. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium is provided, on which one or more computer instructions are stored, wherein the one or more computer instructions are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.
Where the solutions described in this specification and the embodiments involve the processing of personal information, such processing is performed only on a lawful basis (for example, with the consent of the personal information subject, or as necessary for the performance of a contract) and only within the stated or agreed scope. If a user refuses to allow processing of personal information other than the information necessary for basic functions, the user's use of those basic functions is not affected.
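The trip-determination steps recited in the disclosure (determining an impact period from the occurrence time and impact time, then selecting service trips expected to pass through the affected area during that period) can be sketched as follows. The data shapes, field names, distance model, and the 1 km radius are assumptions for illustration, not values from the disclosure.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical sketch of impact-window trip filtering: given an event's
# occurrence time, its estimated impact duration, and its location,
# select service trips expected to pass near the location during the
# impact period.

@dataclass
class ServiceTrip:
    trip_id: str
    start_time: float          # seconds since epoch
    end_time: float
    route: list                # planned waypoints as (lat, lon) pairs

def approx_km(p, q):
    # Crude equirectangular distance for mid-latitudes; fine for a sketch.
    return hypot((p[0] - q[0]) * 111.0, (p[1] - q[1]) * 85.0)

def affected_trips(trips, event_loc, occurred_at, impact_seconds,
                   radius_km=1.0):
    """Return ids of trips whose time window overlaps the impact period
    and whose route passes within radius_km of the event location."""
    window_end = occurred_at + impact_seconds
    hits = []
    for t in trips:
        overlaps = t.start_time <= window_end and t.end_time >= occurred_at
        near = any(approx_km(wp, event_loc) <= radius_km for wp in t.route)
        if overlaps and near:
            hits.append(t.trip_id)
    return hits
```

A trip is excluded either because its route never comes near the event location or because its time window falls entirely outside the impact period, which mirrors the two-stage filtering of the disclosed method.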

Claims (33)

1. A risk early warning method comprising:
obtaining, by a cloud device, reporting information about a traffic event, wherein detection of the traffic event is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles;
determining, using at least part of the reporting information, a set of trips expected to be affected by the traffic event, the set of trips including planned trips that have not yet been initiated and/or ongoing trips; and
sending a reminder associated with the traffic event to a set of terminal devices associated with the set of trips.
2. The method of claim 1, wherein determining a set of trips expected to be affected by the traffic event comprises:
determining an impact time of the traffic event using the at least part of the reporting information; and
determining, based on the impact time, the set of trips expected to be affected by the traffic event.
3. The method of claim 2, wherein determining the impact time comprises:
determining input features for a time estimation model based on the at least part of the reporting information; and
determining the impact time of the traffic event using the time estimation model.
4. The method of claim 3, wherein the input features indicate at least one of:
the type of the traffic event,
the location of occurrence of the traffic event,
the time of occurrence of the traffic event,
reporting terminal information of the traffic event, the reporting terminal information at least indicating a number of terminals reporting the traffic event,
level information of the traffic event, the level information indicating a severity of the traffic event, and
traffic state information associated with the traffic event.
5. The method of claim 2, wherein determining, based on the impact time, the set of trips expected to be affected by the traffic event comprises:
determining an impact period of the traffic event based on the impact time and the time of occurrence of the traffic event;
determining a target trip that is expected to pass through an area associated with the location of occurrence of the traffic event during the impact period; and
determining, based on the target trip, the set of trips expected to be affected by the traffic event.
6. The method of claim 5, wherein determining the target trip comprises:
acquiring a set of service trips from a travel service platform; and
determining, from the set of service trips and based on time information and route information for the set of service trips, the target trip that is expected to pass through the area during the impact period.
7. The method of claim 1, further comprising:
determining, from the set of event images, at least one target image for indicating the traffic event; and
generating, based on the at least one target image, the reminder for sending to the set of terminal devices.
8. The method of claim 7, wherein the set of event images comprises a reported video and the at least one target image comprises a key frame determined from the video.
9. The method of claim 1, wherein the reminder indicates at least one of:
the type of the traffic event,
the location of occurrence of the traffic event,
the time of occurrence of the traffic event, and
the impact time and/or impact period of the traffic event.
10. The method of claim 1, wherein the set of trips includes a first service trip associated with a travel service, and the set of terminal devices comprises: a first terminal device corresponding to a service provider associated with the first service trip and/or a second terminal device corresponding to a service demander associated with the first service trip.
11. The method of claim 10, wherein sending a reminder associated with the traffic event to a set of terminal devices associated with the set of trips comprises:
sending a first reminder to the first terminal device in a first manner; and
sending a second reminder to the second terminal device in a second manner, wherein the first manner is different from the second manner.
12. The method of claim 11, wherein the second reminder indicates additional information about the traffic event as compared to the first reminder, wherein the additional information includes image information associated with the traffic event.
13. The method of claim 12, further comprising:
receiving a confirmation from the second terminal device regarding the traffic event; and
causing the second terminal device to provide an entry for changing a route of the first service trip.
14. The method of claim 1, wherein the set of vehicles includes a service vehicle for providing travel services, and the acquisition device includes an image acquisition device onboard the service vehicle.
15. The method of claim 14, wherein the reporting information includes trip information of a target service trip associated with the service vehicle, the target service trip being a service trip performed by the service vehicle when the image acquisition device acquired the associated event image.
16. The method of claim 1, further comprising:
determining the location of occurrence of the traffic event; and
sending, based on driving directions of a plurality of vehicles within a predetermined range of the location of occurrence, a reminder associated with the traffic event to a terminal device associated with at least one vehicle of the plurality of vehicles.
17. The method of claim 16, wherein the predetermined range is determined based on an impact time of the traffic event.
18. A risk early warning method comprising:
acquiring, by a terminal device, an environmental image of a vehicle, the environmental image being acquired by an acquisition device associated with the vehicle, the vehicle being a service vehicle for providing travel services;
identifying a traffic event based on the environmental image; and
transmitting reporting information associated with the traffic event to a remote device, the remote device including an edge computing device and/or a cloud device, the reporting information comprising: trip information of a service trip associated with a travel service currently provided by the service vehicle, and an event image associated with the traffic event.
19. The method of claim 18, wherein identifying the traffic event comprises:
detecting, based on the environmental image, a distance between a traffic element surrounding the vehicle and the vehicle; and
in response to the distance being less than a first threshold distance, determining that the traffic event associated with the vehicle has occurred.
20. The method of claim 18, wherein identifying the traffic event comprises:
tracking, based on the environmental image, a first traffic element and a second traffic element around the vehicle; and
in response to detecting that a distance between the first traffic element and the second traffic element is less than a second threshold distance and that a relative speed between the first traffic element and the second traffic element is less than a threshold speed, determining that the traffic event associated with the first traffic element and the second traffic element has occurred.
21. The method of claim 18, wherein identifying the traffic event comprises:
determining a road condition of a first road based on the environmental image; and
in response to the road condition being below a threshold level, determining that the traffic event associated with the first road has occurred.
22. The method of claim 18, wherein identifying the traffic event comprises:
in response to identifying, based on the environmental image, a traffic restriction sign associated with a second road, determining that the traffic event associated with the second road has occurred.
23. The method of claim 18, wherein identifying the traffic event comprises:
determining, based on the environmental image, a number of particular traffic participants within a predetermined area; and
in response to the number being greater than a threshold number, determining that the traffic event associated with the particular traffic participants has occurred.
24. The method of claim 18, wherein sending the report information associated with the traffic event to a remote device comprises:
transmitting descriptive information associated with the traffic event to the edge computing device; and
in response to receiving a confirmation from the edge computing device regarding the traffic event, sending the reporting information associated with the traffic event to the cloud device.
25. The method of claim 24, wherein the confirmation regarding the traffic event is based on a number of terminal devices of a group of terminal devices that sent descriptive information regarding the same traffic event to the edge computing device.
26. The method of claim 18, wherein the reporting information further indicates at least one of:
the type of the traffic event,
the location of occurrence of the traffic event, and
the time of occurrence of the traffic event.
27. The method of claim 18, wherein the reporting information further indicates a severity of the traffic event, wherein the severity is determined based on an event image associated with the traffic event.
28. The method of claim 18, wherein the terminal device comprises:
a vehicle-mounted terminal device mounted on the vehicle,
a terminal device associated with a service consumer of the service trip, and
a terminal device associated with a service provider of the service trip.
29. An apparatus for risk early warning, comprising:
an information acquisition module configured to acquire reporting information about a traffic event, wherein detection of the traffic event is based at least on a set of event images acquired by a set of acquisition devices associated with a set of vehicles;
a trip determination module configured to determine, using at least part of the reporting information, a set of trips expected to be affected by the traffic event, the set of trips including planned trips that have not yet been initiated and/or ongoing trips; and
a reminder sending module configured to send a reminder associated with the traffic event to a set of terminal devices associated with the set of trips.
30. An apparatus for risk early warning, comprising:
An image acquisition module configured to acquire, by a terminal device, an environmental image of a vehicle, the environmental image being acquired by an acquisition device associated with the vehicle, the vehicle being a service vehicle for providing travel services;
An event identification module configured to identify a traffic event based on the environmental image; and
An information reporting module configured to send reporting information associated with the traffic event to a remote device, the remote device including an edge computing device and/or a cloud device, the reporting information comprising: trip information of a service trip associated with a travel service currently provided by the service vehicle, and an event image associated with the traffic event.
31. An electronic device, comprising:
a memory and a processor;
wherein the memory is configured to store one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the method of any one of claims 1 to 17 or claims 18 to 28.
32. A computer readable storage medium having stored thereon one or more computer instructions, wherein the one or more computer instructions are executed by a processor to implement the method of any of claims 1 to 17 or 18 to 28.
33. A computer program product comprising computer executable instructions which when executed by a processor implement the method of any one of claims 1 to 17 or 18 to 28.
CN202211713905.4A 2022-12-29 Risk early warning method and device Pending CN118298656A (en)

Publications (1)

Publication Number Publication Date
CN118298656A (en) 2024-07-05


Similar Documents

Publication Publication Date Title
US10417911B2 (en) Inter-vehicle cooperation for physical exterior damage detection
CN112712717B (en) Information fusion method, device and equipment
US11836985B2 (en) Identifying suspicious entities using autonomous vehicles
CN109472975A (en) Driving assist system, drive supporting device and driving support method
CN110400478A (en) A kind of road condition notification method and device
CN113223317B (en) Method, device and equipment for updating map
US11162800B1 (en) Accident fault detection based on multiple sensor devices
JP6710714B2 (en) Vehicle management system, vehicle management method, and program
CN112466141A (en) Vehicle-road-collaboration-oriented intelligent network connection end equipment interaction method, system and storage medium
CN108966145A (en) Hit-and-run criminal is tracked using V2X communication
US20230139740A1 (en) Remote access application for an autonomous vehicle
US20180143033A1 (en) Method and system for lane-based vehicle navigation
CN111434534A (en) Vehicle information processing device and vehicle information processing method
JPWO2020100922A1 (en) Data distribution systems, sensor devices and servers
JP2022542366A (en) Evaluate vehicle safety performance
JP2019067201A (en) Vehicle search system, vehicle search method, and vehicle and program employed in the same
CN114194209A (en) Risk assessment in an autonomous driving environment
CN114724364B (en) Vehicle control method, apparatus, device, storage medium, and program product
US20210049384A1 (en) Systems and methods for collecting information from a vehicle for damage assessment caused by riders
US20210323565A1 (en) Vehicle control device, automated driving vehicle development system, vehicle control method, and program
CN111801714A (en) Method for encrypting vehicle defect report
CN111523366A (en) Information processing apparatus, information processing method, and program
CN113386738A (en) Risk early warning system, method and storage medium
US20230090338A1 (en) Method and system for evaluation and development of automated driving system features or functions
CN112700648B (en) Method and device for determining traffic violation position

Legal Events

Date Code Title Description
PB01 Publication