US12008904B2 - Terminal device equipped with a camera and display and mounted on a vehicle that predicts an intention of the vehicle to turn left or right - Google Patents


Info

Publication number
US12008904B2
US12008904B2
Authority
US
United States
Prior art keywords
image
mobile object
vehicle
terminal device
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/701,740
Other versions
US20220319326A1 (en
Inventor
Yuji Yasui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Assigned to HONDA MOTOR CO., LTD. (Assignor: YASUI, YUJI)
Publication of US20220319326A1 publication Critical patent/US20220319326A1/en
Application granted granted Critical
Publication of US12008904B2 publication Critical patent/US12008904B2/en

Classifications

    • B60R 1/00: Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G08G 1/163: Anti-collision systems; decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • B60Q 1/525: Signalling devices automatically indicating risk of collision between vehicles in traffic or with pedestrians, e.g. after risk assessment using the vehicle sensor data
    • B62J 27/00: Cycle safety equipment
    • B62J 6/26: Warning or information lights warning or informing other road users, e.g. police flash lights
    • G06V 20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G08G 1/16: Anti-collision systems
    • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • B60Q 1/46: Giving flashing caution signals during drive, other than signalling change of direction, e.g. flashing the headlights or hazard lights
    • B60Q 9/008: Signal devices for anti-collision purposes
    • B60R 2300/30: Viewing arrangements using vehicle-mounted cameras and displays, characterised by the type of image processing
    • B60R 2300/80: Viewing arrangements using vehicle-mounted cameras and displays, characterised by the intended use of the viewing arrangement
    • B62J 50/22: Information-providing devices intended to provide information to rider or passenger; electronic, e.g. displays

Definitions

  • the present invention relates to an information supply method, a storage medium, and a terminal device.
  • Another objective of the present invention is to provide an information supply method, a storage medium, and a terminal device capable of supplying useful information to a user by displaying an alarm image indicating the presence of traffic participants in accordance with a predicted behavioral intention.
  • An information supply method, a storage medium, and a terminal device according to the present invention adopt the following configurations.
  • the information supply method, the application program stored in the storage medium, and the terminal device can predict the behavioral intention of a mobile object more accurately by predicting whether the mobile object intends to turn right or left based on images captured at different times. Further, the information supply method can supply useful information to a user by displaying, on the display, an alarm image indicating the presence of traffic participants based on the result of the estimation.
  • FIG. 1 is a diagram showing an overview of a support system.
  • FIG. 2 is a diagram showing an example of a scenario of support.
  • FIG. 3 is a diagram showing another example of the scenario of the support.
  • FIG. 5 is a diagram showing still another example of the scenario of the support.
  • FIG. 6 is a diagram showing an example of a functional configuration of the support system.
  • FIG. 8 is a diagram showing an exemplary configuration of functional units or the like of a terminal device.
  • FIG. 9 is a flowchart showing an example of a flow of a process executed by an application program (a terminal device).
  • FIG. 10 is a diagram showing an example of an image displayed on a display of the terminal device.
  • FIG. 11 is a diagram showing a process of estimating a movement amount.
  • FIG. 12 is a diagram showing a process of determining a posture.
  • FIG. 14 is a diagram showing a process of detecting traffic participants.
  • FIG. 15 is a diagram showing a process of predicting behaviors of the detected traffic participants.
  • FIG. 16 is a diagram showing an example of definition of a risk during straight ahead traveling.
  • FIG. 17 is a diagram showing an example of a process of determining a risk during straight ahead traveling of an own vehicle.
  • FIG. 18 is a diagram showing an example of definition of a risk during turning left.
  • FIG. 19 is a diagram showing an example of a process of determining a risk during turning-left of the own vehicle.
  • FIG. 20 is a diagram showing an example of definition of a risk during turning right.
  • FIG. 22 is a diagram showing an example of a process of estimating a potential risk.
  • FIG. 23 is a flowchart showing an example of a flow of a process executed by the processor.
  • FIG. 24 is a flowchart showing an example of a flow of a process for alarm/caution.
  • FIG. 25 is a diagram showing a scenario and a notification in which a front alarm risk occurs.
  • FIG. 26 is a diagram showing a scenario and a notification in which a left alarm risk or a right alarm risk occurs.
  • FIG. 27 is a diagram showing a scenario and a notification in which a caution risk occurs.
  • FIG. 28 is a diagram showing a scenario and a notification in which a potential risk occurs.
  • FIG. 1 is a diagram showing an overview of a support system 1 .
  • in the description, a mobile object is assumed to be a two-wheeled vehicle (hereinafter sometimes referred to as an “own vehicle”), but the invention may also be applied to various other mobile objects such as small vehicles or three-wheeled vehicles.
  • the front and rear directions of the own vehicle are referred to as the X direction,
  • the direction orthogonal to the front and rear directions is referred to as the Y direction, and
  • the direction orthogonal to the X and Y directions is referred to as the Z direction in some cases.
  • the support system 1 includes a passing light PL, a terminal device (for example, a smartphone) SP, and an application program AP installed in the terminal device SP.
  • the passing light PL is mounted, for example, in the vicinity of the headlight so that its light illuminates (passes) in the positive X direction (the forward direction).
  • a passing light PL-R is mounted in the positive Y direction of the vehicle and a passing light PL-L is mounted in the negative Y direction of the vehicle.
  • a camera and a light are mounted on a first surface of the terminal device SP and a display is provided on a second surface opposite to the first surface.
  • the terminal device SP is mounted on the own vehicle.
  • the terminal device SP is mounted at a position at which the camera can execute imaging in the positive X direction of the vehicle and the light illuminates in the positive X direction.
  • the terminal device SP is provided at a position at which a driver (an operator) can view the display.
  • the support system 1 executes passing on traffic participants using one or both of a flash of the terminal device SP and the passing light PL.
  • the support system 1 further displays an alarm on the display of the terminal device SP.
  • both executing of the passing and displaying of the alarm are referred to as “support” in some cases.
  • the support system 1 informs the driver early of an approach to traffic participants so as to prevent the approach, and requests the traffic participants to cooperate in preventing the approach (by decelerating, stopping, avoiding, or the like) by reporting the possibility of the approach through auto-passing of the passing light.
  • the support system 1 executes support to prevent an approach that could lead to a collision when two vehicles meet each other.
  • the support system 1 executes support to avoid an approach of the own vehicle to a pedestrian, a two-wheeled vehicle, or a four-wheeled vehicle.
  • the support system 1 executes support to prevent an approach to traffic participants at an intersection.
  • the support system 1 executes support to prevent an approach to traffic participants when the own vehicle slips past the side of a four-wheeled vehicle stuck in congestion, or executes support to prevent an approach to traffic participants stopped in front.
  • the support system 1 executes support to prevent an approach at a potential risk.
  • the support system 1 prevents a risk when the own vehicle turns right during congestion of four-wheeled vehicles, or prevents a risk when a traffic participant rushes out from behind a stopped vehicle.
  • when there is a row of objects (four-wheeled vehicles) near the own vehicle extending in a direction identical to the traveling direction of the own vehicle, as in the right drawing of FIG. 5 , the support system 1 determines that there is a potential risk.
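The "row of vehicles" potential-risk check described above can be sketched as follows. The box positions (forward, lateral offsets relative to the own vehicle), the lateral tolerance, and the minimum row length are all illustrative assumptions, not values taken from the patent.

```python
# Hypothetical sketch of the "row of vehicles" potential-risk check.
# Each box is a (forward, lateral) position relative to the own vehicle.

def has_potential_risk(boxes, lateral_tol=1.5, min_row=3):
    """Return True when at least `min_row` detected vehicles line up
    at a similar lateral offset, i.e. form a row parallel to the own
    vehicle's traveling direction (the X direction)."""
    for _, lat in boxes:
        # group boxes with a similar lateral (Y) offset
        row = [b for b in boxes if abs(b[1] - lat) <= lateral_tol]
        if len(row) >= min_row:
            # the row must actually extend forward, not be a cluster
            xs = sorted(b[0] for b in row)
            if xs[-1] - xs[0] >= 2 * (len(row) - 1):
                return True
    return False
```

Under these assumptions, three vehicles strung out ahead at a near-constant lateral offset count as a row, while two scattered detections do not.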
  • FIG. 6 is a diagram showing an example of a functional configuration of the support system 1 .
  • the support system 1 includes the terminal device SP and the passing light PL, for example, as described in FIG. 1 .
  • the passing light PL and the terminal device SP are connected with a battery BT of the own vehicle M by a power line and are supplied with power from the battery BT to operate.
  • the terminal device SP may be supplied with power from the battery BT via the passing light PL to operate or may operate with power of a battery mounted on the terminal device SP.
  • the terminal device SP and the passing light PL are connected by a communication line to communicate with each other.
  • the terminal device SP transmits an execution signal for operating the passing light PL.
  • the passing light PL executes passing in accordance with the execution signal.
  • the terminal device SP is mounted in the own vehicle by a holder, as shown in FIG. 7 .
  • a camera Cam of the terminal device SP executes imaging in the positive X direction and a light Lig of the terminal device SP illuminates in the positive X direction, and a display Dis is mounted to be oriented in the negative X direction.
  • FIG. 8 is a diagram showing an exemplary configuration of functional units or the like of the terminal device SP.
  • the terminal device SP includes, for example, an acquirer 10 , a processor 20 , a notifier 30 , and a storage 50 .
  • the acquirer 10 , the processor 20 , and the notifier 30 are implemented, for example, by causing a hardware processor such as a central processing unit (CPU) to execute an application program AP (software) stored in the storage 50 .
  • the application program AP is supplied by a server device that provides the application program AP.
  • the application program AP may also be supplied by, for example, a service provider that manufactures and sells the own vehicle.
  • the application program AP includes, for example, information functioning as a deep neural network (DNN) 52 to be described below in detail.
  • the application program AP may be stored in advance in a storage device of the storage 50 (a storage device including a non-transitory storage medium, such as a flash memory), as described above, or may be stored in a detachably mounted storage medium (a non-transitory storage medium) such as an SD card, which is mounted on the terminal device SP so that the application program is installed on the storage of the terminal device SP.
  • the acquirer 10 acquires an image captured by the camera Cam.
  • the processor 20 analyzes the image and executes various processes based on a result of the analysis.
  • the notifier 30 gives a warning indicating an alarm, a caution, or the like to a driver based on a processing result of the processor 20 . The details of the process will be described below.
  • FIG. 9 is a flowchart showing an example of a flow of a process executed by an application program (the terminal device SP).
  • the terminal device SP estimates a movement amount (step S 100 ) and estimates a posture (step S 110 ).
  • the movement amount is a movement amount of a traffic participant or a movement amount of the own vehicle.
  • the posture is a posture of the own vehicle.
  • the terminal device SP detects a traffic participant (step S 120 ) and predicts a behavior of the detected traffic participant (step S 130 ). Subsequently, the terminal device SP detects a risk of the own vehicle during straight ahead traveling (step S 140 ), detects a risk during turning left (step S 150 ), and detects a risk during turning right (step S 160 ). Subsequently, the terminal device SP detects a potential risk (step S 170 ). Subsequently, the terminal device SP executes a determination process (step S 180 ).
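The flow of steps S 100 to S 180 can be sketched as a single per-frame routine. Every function body below is a placeholder stub; only the ordering of the calls reflects the flowchart, and all names are illustrative assumptions.

```python
# Skeleton of the per-frame flow of FIG. 9 (steps S100-S180).
# All bodies are placeholder stubs; only the step ordering is real.

def estimate_movement(frames):    return 0.0          # S100
def estimate_posture(frames):     return "straight"   # S110
def detect_participants(frames):  return []           # S120
def predict_behaviors(ps):        return []           # S130
def detect_risk(kind, posture, behaviors):            # S140-S160
    return None
def detect_potential_risk(ps):    return None         # S170

def process_frame(frames):
    movement = estimate_movement(frames)
    posture = estimate_posture(frames)
    participants = detect_participants(frames)
    behaviors = predict_behaviors(participants)
    risks = {k: detect_risk(k, posture, behaviors)
             for k in ("straight", "left", "right")}
    risks["potential"] = detect_potential_risk(participants)
    # S180: the determination process picks the report level
    return "alarm" if any(risks.values()) else "none"
```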
  • the terminal device SP displays an alarm level report or a caution level report shown in FIG. 10 on the display of the terminal device SP in accordance with a result of the determination in the foregoing determination process.
  • the alarm level report or the caution level report is a report about the front side, a report about the left or the front left, a report about the right or the front right, or the like.
  • FIG. 11 is a diagram showing a process of estimating a movement amount.
  • the acquirer 10 of the terminal device SP acquires images (“images captured at different times”) of a time series (k, . . . , k−n) captured by the camera Cam (step S 101 ).
  • the processor 20 of the terminal device SP derives a difference between the image captured at the present time and an image captured i sampling times earlier among the captured images (step S 102 ).
  • the processor 20 digitizes the derived difference (step S 104 ) and obtains the total sum of all the pixel values (step S 106 ).
  • the total sum of all the pixel values is the total sum of pixels for which it is determined that there is a difference.
  • the processor 20 derives a difference of the luminance value for each pixel and digitizes the difference in accordance with whether the derived difference is equal to or greater than a threshold or less than the threshold. For example, a pixel with a difference equal to or greater than the threshold is expressed in black and a pixel with a difference less than the threshold is expressed in white. Differences of the RGB components may be used instead. In this case, a value obtained by statistically processing the difference of each component may be compared with the threshold and digitized for each pixel.
  • the terminal device SP can easily determine a state of the own vehicle or a state of the traffic participant based on an image.
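A minimal sketch of this movement-amount estimate, assuming grayscale frames stored as nested lists and an illustrative luminance threshold:

```python
# Sketch of the movement-amount estimate of FIG. 11: a per-pixel
# difference between two frames is thresholded ("digitized") and the
# changed pixels are summed. Frames are 2-D lists of luminance values.

def movement_amount(prev, curr, threshold=30):
    """Count pixels whose luminance changed by at least `threshold`
    between the frame at time k-i (`prev`) and time k (`curr`)."""
    total = 0
    for row_p, row_c in zip(prev, curr):
        for p, c in zip(row_p, row_c):
            if abs(c - p) >= threshold:  # S104: 1 if changed, else 0
                total += 1               # S106: sum over all pixels
    return total
```

A large total suggests that the own vehicle or a traffic participant has moved appreciably between the two frames; identical frames yield zero.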
  • FIG. 12 is a diagram showing a process of determining a posture.
  • the processor 20 cuts out a rectangular region AR 1 from an image of a time k among the images of the time series.
  • the rectangular region AR 1 is a region centering on a central portion of an image and is a region in which a reference position (for example, a center of gravity or the like) of an image is set at its center.
  • the processor 20 cuts out, from the image at a time k−i among the images of the time series, a region identical or approximate to the region captured in the image of the time k, and expands the image of the cutout region so that it approximates the image of the time k.
  • This process is an example of a process of “approximating the size of a cut image in which a specific region is cut from an image captured at the first time to the size of an image captured at the second time.”
  • the expansion amount of the image may be determined as follows. For example, when the movement speed of the own vehicle is obtained by a positioning device that measures the position of the terminal device SP, the processor 20 may expand the image by the movement amount over i sampling times in accordance with the speed of the own vehicle. For example, the processor 20 expands the image with reference to information in which a speed is associated with a degree of expansion; when the speed is obtained, as described above, the processor 20 may expand the image to the degree of expansion appropriate for that speed. When the speed is not obtained, the processor 20 may expand the image by a predetermined value.
  • adjustment may instead be executed through image processing so that the imaging range of the image at the time k is matched to the imaging range of the image at the time k−1.
  • the processor 20 may acquire regions including predetermined scenery from the image at the time k and the image at the time k−1, and may contract the image including the region acquired from the image at the time k so that its size approximates the size of the image including the region acquired from the image at the time k−1.
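The scale-matching step can be sketched as a simple upscaling of the region cut from the older frame so that its size approximates the newer frame. Nearest-neighbour resizing and the integer factor are assumptions; the patent only requires that the sizes be made approximate.

```python
# Sketch of the scale-matching step: the region cut from the frame at
# time k-i is expanded so its size approximates the frame at time k.
# Nearest-neighbour upscaling by an integer factor is an assumption.

def expand(img, factor):
    """Upscale a 2-D grayscale image (nested lists) by an integer
    factor using nearest-neighbour sampling."""
    out = []
    for row in img:
        # repeat each pixel `factor` times horizontally...
        new_row = [v for v in row for _ in range(factor)]
        # ...and each row `factor` times vertically
        out.extend([new_row[:] for _ in range(factor)])
    return out
```

In the document's scheme, the factor would be chosen from the own vehicle's speed (or a predetermined default) rather than passed in directly.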
  • the processor 20 acquires a plurality of images obtained by rotating the expanded image by multiples of a predetermined angle in the left and right directions using a reference position as a starting point. For example, images rotated clockwise and counterclockwise about the reference position of the image are acquired.
  • a parameter ID is assigned to each image in accordance with the degree of rotation. For example, images rotated in the left direction are assigned the parameter IDs +1, . . . , +n, and images rotated in the right direction are assigned the parameter IDs −1, . . . , −n.
  • the processor 20 sets a rectangular region AR 2 in a region centering on a central portion of an image to which a parameter ID is assigned.
  • the rectangular region AR 2 is, for example, a region corresponding to the rectangular region AR 1 .
  • the rectangular region AR 2 is a region which is centered on a central portion of an image without being affected by rotation of the image and in which a reference position of the image is set at its center.
  • the processor 20 compares an image (which is an example of a “second image”) of the rectangular region AR 1 at the time k with an image (which is an example of a “first image”) of the rectangular region AR 2 to which a parameter ID is assigned and derives a difference therebetween. Subsequently, the processor 20 digitizes each derived difference and obtains a total sum of all the pixel values for each image. Further, the processor 20 selects the parameter ID of the image in which the total sum (which is the total sum of the pixels for which it is determined that there is a difference) of all the pixels is the minimum.
  • the parameter ID “0” in FIG. 12 is selected.
  • the parameter ID “+n” in FIG. 12 is selected.
  • the parameter ID “−n” in FIG. 12 is selected.
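The rotation search of FIG. 12 can be sketched as follows: candidate rotations of the older frame are compared with the current frame, and the angle (standing in for a parameter ID) with the smallest summed pixel difference is selected. Nearest-neighbour rotation and the candidate angles are assumptions made for the sketch.

```python
import math

# Sketch of the posture search of FIG. 12: rotate the older frame by
# each candidate angle and pick the angle whose rotated image differs
# least from the current frame.

def rotate(img, deg):
    """Rotate a square grayscale image about its centre using
    nearest-neighbour sampling (out-of-range pixels become 0)."""
    n = len(img)
    c = (n - 1) / 2.0
    rad = math.radians(deg)
    out = [[0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            # inverse-map each output pixel into the source image
            sx = math.cos(rad) * (x - c) + math.sin(rad) * (y - c) + c
            sy = -math.sin(rad) * (x - c) + math.cos(rad) * (y - c) + c
            si, sj = round(sy), round(sx)
            if 0 <= si < n and 0 <= sj < n:
                out[y][x] = img[si][sj]
    return out

def best_rotation(prev, curr, candidates=(-10, -5, 0, 5, 10)):
    """Return the candidate angle minimising the summed absolute
    pixel difference between rotate(prev, angle) and curr."""
    def diff(a, b):
        return sum(abs(p - q) for ra, rb in zip(a, b)
                   for p, q in zip(ra, rb))
    return min(candidates, key=lambda a: diff(rotate(prev, a), curr))
```

A result of 0 corresponds to straight-ahead travel; a positive or negative angle corresponds to a left- or right-rotated parameter ID.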
  • FIG. 13 is a flowchart showing an example of a flow of a process executed by the processor 20 .
  • the terminal device SP can more accurately determine, based on images captured at different times, whether the own vehicle has an intention to travel straight ahead, an intention to turn right, or an intention to turn left.
  • the processor 20 may determine whether the own vehicle has an intention to travel straight ahead, an intention to turn right, or an intention to turn left based on a detection result of an acceleration sensor that is mounted on the terminal device SP and can detect an inclination of the terminal device SP. For example, the processor 20 may determine whether the own vehicle has an intention to turn in a direction in which the terminal device SP is inclined.
  • one image rotated in the left direction and one image rotated in the right direction may each be compared with the image captured at the time k, and it may be determined that the own vehicle will travel in the direction corresponding to the image with the smaller difference.
  • the central portions of the images have been compared in the above description. However, instead of the central portions, other regions may be compared.
  • when the imaging interval of the images is equal to or less than a predetermined value, the process of adjusting the sizes of the image at the time k−i and the image at the time k among the images of the time series (for example, the process of expanding the image at the time k−i) may be omitted, provided that whether the own vehicle has the intention to travel straight ahead, the intention to turn right, or the intention to turn left can still be determined accurately.
  • FIG. 14 is a diagram showing a process of detecting traffic participants.
  • the processor 20 identifies kinds of traffic participants in images using the DNN 52 .
  • the DNN 52 is a learned model that outputs information indicating the kinds of traffic participants included in the images or information for identifying positions of the traffic participants.
  • the kinds of traffic participants are, for example, kinds of traffic participants such as four-wheeled vehicles, two-wheeled vehicles, bicycles, pedestrians, and the like.
  • detection boxes which are rectangular regions are associated with the traffic participants.
  • the detection boxes are, for example, regions set so that the traffic participants are included.
  • FIG. 15 is a diagram showing a process of predicting behaviors of the detected traffic participants.
  • the processor 20 estimates movement directions of the traffic participants based on changes in positions of the traffic participants identified in images captured at different times and predicts behaviors of the traffic participants based on a result of the estimation.
  • the processor 20 determines that the own vehicle and the traffic participant tend to become closer to each other. In this case, for example, the traffic participant approaches the own vehicle or the own vehicle approaches the traffic participant.
  • the processor 20 determines that the own vehicle and the traffic participant tend to move away from each other. In this case, for example, the traffic participant moves away from the own vehicle or the own vehicle moves away from the traffic participant.
  • the processor 20 estimates a movement amount of a traffic participant based on an expansion amount or a contraction amount of the detection box.
  • the processor 20 determines that a traffic participant moves to the right.
  • the processor 20 determines that the traffic participant moves to the left.
  • the processor 20 estimates the movement amount of the traffic participant based on the movement amount of the detection box.
  • the processor 20 estimates the movement direction and the movement amount of the traffic participant, as described above, and predicts the behavior of the traffic participant.
  • the processor 20 predicts a future position of the traffic participant based on detection results of the position of the traffic participant at the time k−1 and the position of the traffic participant at the time k.
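The behaviour prediction above can be sketched with detection-box centres and areas at the times k−1 and k. The (x, y, w, h) box format and the linear extrapolation to the next frame are illustrative assumptions.

```python
# Sketch of the behaviour prediction of FIG. 15, using detection boxes
# at times k-1 and k. A box is (x, y, w, h) in image coordinates.

def predict_behavior(box_prev, box_curr):
    """Estimate closing tendency, lateral movement direction, and the
    extrapolated next-frame centre of a traffic participant."""
    (x0, y0, w0, h0), (x1, y1, w1, h1) = box_prev, box_curr
    cx0, cx1 = x0 + w0 / 2, x1 + w1 / 2
    cy0, cy1 = y0 + h0 / 2, y1 + h1 / 2
    # an expanding box suggests the participant and the own vehicle
    # are becoming closer; a contracting box, that they are separating
    closing = w1 * h1 > w0 * h0
    direction = "right" if cx1 > cx0 else "left" if cx1 < cx0 else "none"
    # linear extrapolation of the centre to time k+1
    future = (2 * cx1 - cx0, 2 * cy1 - cy0)
    return closing, direction, future
```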
  • FIG. 16 is a diagram showing an example of definition of a risk during straight ahead traveling.
  • a risk during straight ahead traveling (a straight-traveling risk) is set in a predetermined range of the front of the own vehicle.
  • a straight-traveling risk is set in a first angle range using a line segment (d 1 ) of a traveling direction of the own vehicle or an extension direction (reference direction) of a road as a reference.
  • a second angle range less than the first angle range is a front risk range when a line segment of the traveling direction is a reference.
  • a range in the left direction is a left risk range and a range in the right direction is a right risk range.
  • the range of a first distance from the own vehicle is an alert level range and the range of a second distance from the own vehicle is a caution level range.
  • a region in the first angle range and the range of the first distance is referred to as a “region a”
  • a region in a range between the first angle and the second angle (which may be an angle different from the second angle: the same applies below) in the left direction and in the range of the first distance is referred to as a “region b”
  • a region in a range between the first and second angles in the right direction and the range of the first distance is referred to as a “region c” in some cases (see FIGS. 17 , 19 , and 21 to be described below).
  • a region in the first angle range and in the range between the first and second distances is referred to as a “region d”,
  • a region in the range between the first and second angles in the left direction and between the first and second distances is referred to as a “region e”, and
  • a region in the range between the first and second angles in the right direction and between the first and second distances is referred to as a “region f” in some cases (see FIGS. 17 , 19 , and 21 to be described below).
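The region definitions above amount to a lookup over the signed angle of the target position from the reference direction and its distance from the own vehicle. All thresholds below are illustrative; the patent does not fix numeric values.

```python
# Sketch of the region-a..f lookup: a target position is classified by
# its signed angle from the reference direction (negative = left) and
# its distance from the own vehicle. Thresholds are assumptions.

def classify_region(angle_deg, dist,
                    first_angle=15, second_angle=45,
                    first_dist=10, second_dist=20):
    """Map a target at (signed angle, distance) to region 'a'-'f',
    or None when it lies outside every risk range."""
    mag = abs(angle_deg)
    if mag <= first_angle:
        band = 0                          # front band (regions a / d)
    elif mag <= second_angle:
        band = 1 if angle_deg < 0 else 2  # left / right band
    else:
        return None
    if dist <= first_dist:
        return "abc"[band]                # alert-level distance
    if dist <= second_dist:
        return "def"[band]                # caution-level distance
    return None
```

The risk-determination steps that follow then only need to check which letter the target position maps to.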
  • FIG. 17 is a diagram showing a process of determining a risk during straight ahead traveling of the own vehicle.
  • the processor 20 determines whether the own vehicle is traveling straight ahead based on the result of the above-described process (step S 300 ). When the own vehicle is not traveling straight ahead, the process proceeds to a process of step S 400 of FIG. 19 to be described below.
  • the processor 20 determines in which of the regions a to e there is a risk to the own vehicle. For example, the processor 20 determines in which of the regions a to e a position (a target position) of a target traffic participant (for example, a traffic participant present at a position closest to the own vehicle) or a future position (a target position) of the traffic participant is located.
  • FIG. 18 is a diagram showing an example of definition of a risk during turning left.
  • a risk during turning left (a left-turning risk) is set centering on the front left side of the own vehicle.
  • a risk is set using a traveling direction of the own vehicle as a reference direction.
  • a risk is set using a line segment (dL) rotated by a predetermined angle in the left direction from the line segment of the traveling direction of the own vehicle instead of the line segment of the traveling direction.
  • the predetermined angle is, for example, an angle within a range from 30 degrees to 60 degrees or an angle set in accordance with the shape of a road.
  • in step S400, the processor 20 determines whether the own vehicle is turning left or has an intention to turn left.
  • when the own vehicle is not turning left and has no intention to turn left, the process proceeds to a process of step S500 of FIG. 21; otherwise, the processor 20 determines in which of the regions a to e there is a risk to the own vehicle.
  • FIG. 20 is a diagram showing an example of definition of a risk during turning right.
  • a risk during turning right (a right-turning risk) is set centering on the front right of the own vehicle.
  • a risk is set using a traveling direction of the own vehicle as a reference direction.
  • a risk is set using a line segment (dR) rotated by a predetermined angle in the right direction from the line segment of the traveling direction of the own vehicle as a reference instead of the line segment of the traveling direction.
  • the predetermined angle is, for example, an angle within a range from 30 degrees to 60 degrees or an angle set in accordance with the shape of a road.
  • FIG. 21 is a diagram showing an example of a process of determining a risk during turning-right of the own vehicle.
  • the processor 20 determines whether the own vehicle is turning right or has an intention to turn right (step S500). When the own vehicle is not turning right and has no intention to turn right, the process proceeds to a process of step S600 of FIG. 23. When the own vehicle is turning right or has the intention to turn right, the processor 20 determines in which of the regions a to e there is a risk to the own vehicle.
  • when the own vehicle is traveling straight ahead, the processor 20 sets a risk (an alert region) in front of the own vehicle.
  • when the own vehicle is turning left, a risk is set in front of and in the left direction of the own vehicle.
  • when the own vehicle is turning right, a risk is set in front of and in the right direction of the own vehicle. In this way, a risk appropriate to the traveling direction of the own vehicle can be set.
  • FIG. 22 is a diagram showing an example of a process of estimating a potential risk.
  • the processor 20 detects an average inter-vehicle distance Lvv_ave and a transverse distance Lev based on positions of other vehicles in an image.
  • the average inter-vehicle distance Lvv_ave is, for example, the average of the inter-vehicle distances of predetermined vehicles (for example, all the vehicles) present between the detected own vehicle and another vehicle.
  • the other vehicle is a vehicle present at a position closest to the own vehicle.
  • An inter-vehicle distance between vehicles is, for example, the distance between reference positions of vehicles forming a row in the traveling direction. When only one vehicle is present, the inter-vehicle distance is set to a large value such as a preset distance (for example, 30 m).
  • the transverse distance Lev is a distance between the own vehicle and another vehicle in the transverse direction.
  • the other vehicle is, for example, a vehicle located at a position closest to the own vehicle.
  • the average inter-vehicle distance Lvv_ave and the transverse distance Lev are used for a process related to estimation of the following potential risk.
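A minimal sketch of how Lvv_ave and Lev could be computed from detected vehicle positions (function names, argument names, and the coordinate convention are assumptions):

```python
def average_inter_vehicle_distance(positions_along_travel, default=30.0):
    """Average gap between consecutive reference positions of vehicles
    forming a row in the traveling direction. With fewer than two
    vehicles, a preset large value (for example, 30 m) is returned, as
    stated in the text above.
    """
    xs = sorted(positions_along_travel)
    if len(xs) < 2:
        return default
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    return sum(gaps) / len(gaps)

def transverse_distance(own_y, other_ys):
    """Lateral distance to the other vehicle closest to the own vehicle."""
    return min(abs(y - own_y) for y in other_ys)
```

In practice the positions would come from the image-based detections described earlier, projected into vehicle coordinates.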
  • FIG. 23 is a flowchart showing an example of a flow of a process executed by the processor 20 .
  • the processor 20 determines whether the average inter-vehicle distance Lvv_ave satisfies the threshold Lvv_LR (step S600).
  • the average inter-vehicle distance Lvv_ave is, for example, an average inter-vehicle distance based on a distance, in the longitudinal direction, between a first object now entering an intersection (a connection portion between first and second roads) and a second object behind the first object, or on the inter-vehicle distances between the first object and each of the objects up to an n-th object present behind the second object.
  • in step S600, it may be determined whether another vehicle is stopped and the average inter-vehicle distance Lvv_ave satisfies the threshold Lvv_LR.
  • when the process of determining whether another vehicle is stopped is omitted to reduce the calculation load and the average inter-vehicle distance Lvv_ave satisfies the threshold Lvv_LR, it is alternatively determined that the other vehicle is stopped.
  • the process proceeds to a process of step S 700 of FIG. 24 .
  • the processor 20 determines whether the transverse distance Lev is less than a threshold Lev_LR. When the transverse distance Lev is less than the threshold Lev_LR, the processor 20 determines that there is a potential risk in front of the own vehicle (step S612). When there is the potential risk in front, a flag “F_FLR=1” is set.
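The potential-risk check described here combines the two thresholds; a sketch follows, with the caveat that the comparison direction for Lvv_LR and all numeric values are assumptions (the text does not state whether a small or large average gap indicates a stopped row of vehicles):

```python
def detect_front_potential_risk(lvv_ave, lev, lvv_lr=8.0, lev_lr=2.0):
    """Sketch of steps S600-S612: when the average inter-vehicle
    distance Lvv_ave indicates a (likely stopped) row of vehicles and
    the transverse distance Lev is below Lev_LR, flag a potential risk
    in front of the own vehicle.
    """
    flags = {"F_FLR": 0}
    if lvv_ave <= lvv_lr and lev < lev_lr:
        flags["F_FLR"] = 1     # S612: potential risk in front
    return flags
```

The returned flag dictionary stands in for the internal state that the later alarm/caution determination process consults.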
  • FIG. 24 is a flowchart showing an example of a flow of a process (determination process) for alarm/caution.
  • the processor 20 determines whether the own vehicle is moving (step S 700 ). When the own vehicle is not moving, one routine of the flowchart ends.
  • the processor 20 determines whether there is an alarm risk in front (step S 702 ). When there is no front alarm risk, the processor 20 determines whether there is an alarm risk in the left direction (step S 704 ). When there is no alarm risk in the left direction, the processor 20 determines whether there is an alarm risk in the right direction (step S 706 ).
  • when there is an alarm risk in any of the directions, the notifier 30 executes display in accordance with the alarm risk (step S708) and further executes passing of a first mode (step S710).
  • the processor 20 determines whether there is a caution risk in front (step S712). When there is no caution risk in front, the processor 20 determines whether there is a caution risk in the left direction (step S714). When there is no caution risk in the left direction, the processor 20 determines whether there is a caution risk in the right direction (step S716).
  • when there is a caution risk in any of the directions, the notifier 30 executes display in accordance with the caution risk (step S718) and further executes passing of the second mode (step S720).
  • the processor 20 determines whether there is a potential risk in front (step S 722 ). When there is no potential risk in front, the processor 20 determines whether there is a potential risk in the left direction (step S 724 ). When there is no potential risk in the left direction, the processor 20 determines whether there is a potential risk in the right direction (step S 726 ).
  • when there is a potential risk in any of the directions, the notifier 30 executes display in accordance with the potential risk (step S728).
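The cascade of steps S700 to S728 checks alarm risks first (front, then left, then right), then caution risks, then potential risks, acting on the first risk found. A skeletal sketch follows; the `risks` mapping and the `notifier` interface are assumed stand-ins for the processor's internal state and the notifier 30:

```python
def run_notification_routine(moving, risks, notifier):
    """One pass of the alarm/caution determination process.

    risks: dict mapping (level, direction) -> bool, e.g.
           {("alarm", "front"): True}.
    notifier: any object with display(level, direction) and
              execute_passing(mode) methods (assumed interface).
    """
    if not moving:                         # S700: do nothing when stopped
        return None
    for level, passing_mode in (("alarm", 1),      # first mode
                                ("caution", 2),    # second mode
                                ("potential", None)):
        for direction in ("front", "left", "right"):
            if risks.get((level, direction)):
                notifier.display(level, direction)          # S708/S718/S728
                if passing_mode is not None:
                    notifier.execute_passing(passing_mode)  # S710/S720
                return (level, direction)
    return None
```

Ordering the checks alarm-before-caution-before-potential ensures the most urgent notification wins when several risks coexist.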
  • FIG. 25 is a diagram showing a scenario and a notification in which an alarm risk in the forward direction occurs.
  • the notifier 30 displays alarm information indicating that there is the alarm risk in the forward direction on the display and further executes the passing of the first mode.
  • This alarm display is predetermined display such as an arrow (for example, a red arrow) indicating the forward direction, as shown.
  • the first mode is a mode in which the passing light PL blinks at a short period (a high frequency).
  • FIG. 26 is a diagram showing a scenario and a notification in which an alarm risk in the left direction or an alarm risk in the right direction occurs.
  • the notifier 30 displays alarm information indicating that there is the alarm risk in the left direction on the display and further executes the passing of the first mode.
  • This alarm display is predetermined display such as an arrow (for example, a red arrow) indicating the left direction, as shown.
  • the notifier 30 displays alarm information indicating that there is the alarm risk in the right direction on the display and further executes the passing of the first mode.
  • This alarm display is predetermined display such as an arrow (for example, a red arrow) indicating the right direction, as shown.
  • FIG. 27 is a diagram showing a scenario and a notification in which a caution risk occurs.
  • the notifier 30 displays caution information indicating that there is a caution risk in the forward direction on the display and further executes the passing of the second mode.
  • This caution display is predetermined display such as an arrow (for example, a yellow arrow) indicating the forward direction, as shown.
  • the second mode is a mode in which the passing light PL blinks at a long period (a low frequency).
  • the notifier 30 displays caution information indicating that there is a caution risk in the left direction on the display and further executes the passing of the second mode.
  • This caution display is predetermined display such as an arrow (for example, a yellow arrow) indicating the left direction, as shown.
  • the notifier 30 displays caution information indicating that there is a caution risk in the right direction on the display and further executes the passing of the second mode.
  • This caution display is predetermined display such as an arrow (for example, a yellow arrow) indicating the right direction, as shown.
  • the color of the image (an image such as an arrow indicating an alert) displayed on the display in a case in which there is the caution risk is different from the color of the image displayed on the display in a case in which there is the alarm risk.
  • FIG. 28 is a diagram showing a scenario and a notification in which a potential risk occurs.
  • the notifier 30 displays potential risk information indicating that there is a potential risk in the forward direction on the display.
  • This potential risk display is predetermined display such as an arrow (for example, a yellow arrow) indicating the forward direction, as shown.
  • the notifier 30 displays potential risk information indicating that there is a potential risk in the left direction on the display.
  • This potential risk display is predetermined display such as an arrow (for example, a yellow arrow) indicating the left direction, as shown.
  • the camera and the display provided in the terminal device SP have been described, but one or both of the camera and the display may not be provided in the terminal device SP and may instead be provided in another device, a mobile object, or the like.
  • in this case, the terminal device SP communicates with the other device or the mobile object to acquire an image or to display information on the display.
  • the application program may display information regarding the alert on the display in a situation requiring alert (when the alert level is equal to or greater than a predetermined level) without determining whether the alert level is the first alert level or the second alert level. Further, in this case, the application program may execute passing.
  • the situation requiring alert is a situation in which the alert level is the first alert level, a situation in which the alert level is the second alert level, or a situation in which there is a potential risk.
  • an application program causes a computer of a terminal device equipped with a camera and mounted in a mobile object to execute: a process of acquiring an image; a process of determining whether an alert level for an operator of the mobile object is a first alert level or a second alert level that is a level lower than the first alert level, based on a surrounding situation of the mobile object obtained from the image; and a process of displaying information appropriate to the alert level on a display provided on a casing in which the camera of the terminal device is provided in accordance with a result of the determination.
  • an application program causes a computer of a terminal device equipped with a camera and mounted in a mobile object to execute a process of acquiring images acquired at different times, a process of estimating whether the mobile object has an intention to turn right or left based on the images captured at the different times, and a process of displaying an image for reporting presence of traffic participants on a display provided on a casing in which the camera of the terminal device is provided based on a result of the estimation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)

Abstract

An information supply method is a method of causing a computer of a terminal device equipped with a camera and mounted in a mobile object to execute: a process of acquiring images captured at different times; a process of predicting whether the mobile object has an intention to turn right or left based on the images captured at the different times; and a process of displaying an image for an alarm about presence of a traffic participant on a display provided on a casing in which the camera of the terminal device is provided, based on a result of the prediction.

Description

CROSS-REFERENCE TO RELATED APPLICATION
Priority is claimed on Japanese Patent Application No. 2021-057597, filed Mar. 30, 2021, the content of which is incorporated herein by reference.
BACKGROUND Field
The present invention relates to an information supply method, a storage medium, and a terminal device.
Description of Related Art
In the related art, a device that determines sudden deceleration of a vehicle based on captured images is disclosed (see Japanese Unexamined Patent Application, First Publication No. 2002-79896).
SUMMARY
However, in the foregoing technology, an intention of a behavior of a mobile object cannot be predicted in some cases.
The present invention has been devised in view of such circumstances and an objective of the present invention is to provide an information supply method, a storage medium, and a terminal device capable of predicting an intention of a behavior of a mobile object more accurately.
Further, another objective of the present invention is to provide an information supply method, a storage medium, and a terminal device capable of supplying useful information to a user by displaying an image of an alarm about presence of traffic participants in accordance with a predicted intention of a behavior.
An information supply method, a storage medium, and a terminal device according to the present invention adopt the following configurations.
    • (1) According to an aspect of the present invention, there is provided an information supply method of causing a computer of a terminal device equipped with a camera and mounted in a mobile object to execute: a process of acquiring images captured at different times; a process of predicting whether the mobile object has an intention to turn right or left based on the images captured at the different times; and a process of displaying an image for an alarm about presence of a traffic participant on a display provided on a casing in which the camera of the terminal device is provided based on a result of the prediction.
    • (2) In the aspect of (1), the computer may further execute predicting whether the mobile object is traveling straight ahead or is turning right or left.
    • (3) In the aspect of (1) or (2), the computer may further execute: rotating an image captured at a first time at a predetermined angle in a left or right direction using a reference position as a starting point; comparing an image captured at a second time different from the first time with an image rotated at the predetermined angle; and predicting whether the mobile object has an intention to turn right or left based on a result of the comparison.
    • (4) In the aspect of (3), the computer may further execute comparing a first image in a central portion of the image captured at the first time with a second image at a central portion of the image rotated at the predetermined angle.
    • (5) In the aspect of (4), the computer may further execute: acquiring the first image of a central portion of each of a plurality of images obtained by rotating the image captured at the first time at each predetermined angle in left and right directions using the reference position as the starting point; comparing each of the acquired first images at the central portions with the second image at the central portion of the image captured at the second time; and predicting that the mobile object has an intention to turn right or left when the rotational angle of the first image in which the difference between the first and second images is minimized is equal to or greater than a threshold.
    • (6) In the aspect of (5), the computer may execute approximating the size of a cutout image in which a specific region is cut out from an image captured at the first time to the size of an image captured at the second time, rotating the cutout image at the predetermined angle, and comparing the image captured at the second time with the image rotated at the predetermined angle.
    • (7) In the aspect of any one of (1) to (6), the computer may further execute a process of displaying an image of an alarm about approach of a traffic participant on the display when the mobile object is predicted to have an intention to turn right or left and the traffic participant is present or approaches in or near a route through which the mobile object passes when the mobile object turns right or left.
    • (8) According to another aspect of the present invention, an application program stored in a non-transitory computer storage medium causes a computer of a terminal device mounted in a mobile object to execute: a process of acquiring images captured at different times; a process of predicting whether the mobile object has an intention to turn right or left based on the images captured at the different times; and a process of displaying an image for an alarm about presence of a traffic participant on a display based on a result of the prediction.
    • (9) According to still another aspect of the present invention, a terminal device includes: a camera provided in a casing; a display provided in the casing; an estimator configured to predict whether a mobile object has an intention to turn right or left based on images captured at different times by the camera; and a display controller configured to display an image of an alarm about presence of a traffic participant on a display provided in the casing in which the camera is provided based on a result of the estimation.
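The rotate-and-compare prediction of aspects (3) to (5) can be sketched as follows, assuming small grayscale images stored as 2D lists. The candidate angle set, central-portion margin, and intention threshold are illustrative, and a real implementation would use an image library for the rotation:

```python
import math

def rotate_nn(img, angle_deg):
    """Nearest-neighbour in-plane rotation of a 2D list about its centre."""
    h, w = len(img), len(img[0])
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    a = math.radians(angle_deg)
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # inverse-map each output pixel back into the source image
            sx = cx + (x - cx) * math.cos(a) + (y - cy) * math.sin(a)
            sy = cy - (x - cx) * math.sin(a) + (y - cy) * math.cos(a)
            si, sj = round(sy), round(sx)
            if 0 <= si < h and 0 <= sj < w:
                out[y][x] = img[si][sj]
    return out

def central_diff(img_a, img_b, margin):
    """Sum of absolute differences over the central portion only."""
    h, w = len(img_a), len(img_a[0])
    return sum(abs(img_a[y][x] - img_b[y][x])
               for y in range(margin, h - margin)
               for x in range(margin, w - margin))

def estimate_turn_intention(img_t1, img_t2, angles, margin=3, threshold_deg=10.0):
    """Rotate the first image by each candidate angle, compare the
    central portions with the second image, and report a turn intention
    when the best-matching rotation angle is at least the threshold.
    """
    best = min(angles,
               key=lambda a: central_diff(rotate_nn(img_t1, a), img_t2, margin))
    return best, abs(best) >= threshold_deg
```

Intuitively, if the camera (and hence the two-wheeled vehicle) has rotated between the two capture times, the first image re-rotated by the true angle matches the second image best; a large best-matching angle then signals a turn.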
According to (1) to (9), the information supply method, the application program stored in the storage medium, and the terminal device can predict the intention of a behavior of a mobile object more accurately by performing a process of predicting whether the mobile object has an intention to turn right or left based on images captured at different times. Further, the information supply method can supply useful information to a user by performing a process of displaying an image of an alarm about presence of traffic participants on the display based on the result of the estimation.
According to (7), the information supply method can inform an operator of the mobile object by displaying an image of an alarm about approach of traffic participants on the display when the mobile object turns right or left.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram showing an overview of a support system.
FIG. 2 is a diagram showing an example of a scenario of support.
FIG. 3 is a diagram showing another example of the scenario of the support.
FIG. 4 is a diagram showing still another example of the scenario of the support.
FIG. 5 is a diagram showing still another example of the scenario of the support.
FIG. 6 is a diagram showing an example of a functional configuration of the support system.
FIG. 7 is a diagram showing an example of a terminal device mounted in an own vehicle.
FIG. 8 is a diagram showing an exemplary configuration of functional units or the like of a terminal device.
FIG. 9 is a flowchart showing an example of a flow of a process executed by an application program (a terminal device).
FIG. 10 is a diagram showing an example of an image displayed on a display of the terminal device.
FIG. 11 is a diagram showing a process of estimating a movement amount.
FIG. 12 is a diagram showing a process of determining a posture.
FIG. 13 is a flowchart showing an example of a flow of a process executed by a processor.
FIG. 14 is a diagram showing a process of detecting traffic participants.
FIG. 15 is a diagram showing a process of predicting behaviors of the detected traffic participants.
FIG. 16 is a diagram showing an example of definition of a risk during straight ahead traveling.
FIG. 17 is a diagram showing an example of a process of determining a risk during straight ahead traveling of an own vehicle.
FIG. 18 is a diagram showing an example of definition of a risk during turning left.
FIG. 19 is a diagram showing an example of a process of determining a risk during turning-left of the own vehicle.
FIG. 20 is a diagram showing an example of definition of a risk during turning right.
FIG. 21 is a diagram showing an example of a process of determining a risk during turning-right of the own vehicle.
FIG. 22 is a diagram showing an example of a process of estimating a potential risk.
FIG. 23 is a flowchart showing an example of a flow of a process executed by the processor.
FIG. 24 is a flowchart showing an example of a flow of a process for alarm/caution.
FIG. 25 is a diagram showing a scenario and a notification in which a front alarm risk occurs.
FIG. 26 is a diagram showing a scenario and a notification in which a left alarm risk or a right alarm risk occurs.
FIG. 27 is a diagram showing a scenario and a notification in which a caution risk occurs.
FIG. 28 is a diagram showing a scenario and a notification in which a potential risk occurs.
DETAILED DESCRIPTION
Hereinafter, an information supply method, an application program, and a terminal device according to an embodiment of the present invention will be described with reference to the drawings. As used throughout this disclosure, the singular forms “a,” “an,” and “the” include plural reference unless the context clearly dictates otherwise.
Overview 1
FIG. 1 is a diagram showing an overview of a support system 1. In the embodiment, for example, a mobile object is assumed to be a two-wheeled vehicle (hereinafter sometimes referred to as an “own vehicle”) in the description, but the embodiment may also be applied to various mobile objects such as a small vehicle or a three-wheeled vehicle. Hereinafter, the front and rear directions of the own vehicle are referred to as an X direction, a direction orthogonal to the front and rear directions is referred to as a Y direction, and a direction orthogonal to the X and Y directions is referred to as a Z direction in some cases.
The support system 1 includes a passing light PL, a terminal device (for example, a smartphone) SP, and an application program AP installed in the terminal device SP. The passing light PL is mounted in the vicinity of a headlight or the like so that, for example, its light illuminates (passes) in the positive X direction (the forward direction). As the passing light PL, a passing light PL-R is mounted on the positive Y direction side of the vehicle and a passing light PL-L is mounted on the negative Y direction side of the vehicle.
In the terminal device SP, a camera and a light are mounted on a first surface of the terminal device SP and a display is provided on a second surface opposite to the first surface. The terminal device SP is mounted on the own vehicle at a position at which the camera can execute imaging in the positive X direction of the vehicle and the light illuminates in the positive X direction. The terminal device SP is provided at a position at which a driver (an operator) can view the display.
The support system 1 executes passing on traffic participants using one or both of a flash of the terminal device SP and the passing light PL. The support system 1 further displays an alarm on the display of the terminal device SP. Hereinafter, both executing of the passing and displaying of the alarm are referred to as “support” in some cases.
For example, since the braking distance of a two-wheeled vehicle tends to be long, its capability to avoid approaching traffic participants tends to be low. The support system 1 informs the driver of an approach to traffic participants early to prevent the approach, and requests the traffic participants to cooperate in preventing the approach (by decelerating, stopping, avoiding, or the like) by reporting the possibility of the approach through auto-passing of the passing light.
As shown in FIG. 2, the support system 1 executes support to prevent a close approach that could lead to a collision when the own vehicle and a traffic participant meet each other. For example, the support system 1 executes support to avoid an approach of the own vehicle to a pedestrian, a two-wheeled vehicle, or a four-wheeled vehicle.
As shown in FIG. 3, the support system 1 executes support to prevent an approach to traffic participants at an intersection. As shown in FIG. 4, the support system 1 executes support to prevent an approach to traffic participants when four-wheeled vehicles are congested and the own vehicle slips through beside them, or executes support to prevent an approach to traffic participants stopping in front.
As shown in FIG. 5, the support system 1 executes support to prevent an approach at a potential risk. For example, the support system 1 prevents a risk when the own vehicle turns right during congestion of four-wheeled vehicles, or prevents a risk when a traffic participant rushes out from the rear side of a stopping vehicle. For example, in the embodiment, when there is a row of objects (four-wheeled vehicles) which are present near the own vehicle and extend in a direction identical to the traveling direction of the own vehicle, as in the right drawing of FIG. 5, it is determined that there is a potential risk. In the embodiment, when the own vehicle traveling on a first road enters a second road connected to the first road (turns right on the first road and enters the second road) as in the left drawing of FIG. 5 and objects (four-wheeled vehicles) are present on the front and rear sides of a connection portion (an intersection) of the first and second roads in the extension direction of the first road, it is determined that there is a potential risk. Hereinafter, the details will be described.
Functional Configuration of Support System
FIG. 6 is a diagram showing an example of a functional configuration of the support system 1.
The support system 1 includes the terminal device SP and the passing light PL, for example, as described in FIG. 1 . For example, the passing light PL and the terminal device SP are connected with a battery BT of the own vehicle M by a power line and are supplied with power from the battery BT to operate. The terminal device SP may be supplied with power from the battery BT via the passing light PL to operate or may operate with power of a battery mounted on the terminal device SP.
The terminal device SP and the passing light PL are connected by a communication line to communicate with each other. For example, the terminal device SP transmits an execution signal for operating the passing light PL. The passing light PL executes passing in accordance with the execution signal.
The terminal device SP is mounted in the own vehicle by a holder, as shown in FIG. 7 . For example, a camera Cam of the terminal device SP executes imaging in the positive X direction and a light Lig of the terminal device SP illuminates in the positive X direction, and a display Dis is mounted to be oriented in the negative X direction.
FIG. 8 is a diagram showing an exemplary configuration of functional units or the like of the terminal device SP. The terminal device SP includes, for example, an acquirer 10, a processor 20, a notifier 30, and a storage 50. The acquirer 10, the processor 20, and the notifier 30 are implemented, for example, by causing a hardware processor such as a central processing unit (CPU) to execute an application program AP (software) stored in the storage 50. For example, the application program AP is supplied by a server device that provides the application program AP. The application program AP may be supplied by, for example, a service provider that manufactures and sells the own vehicle. The application program AP includes, for example, information functioning as a deep neural network (DNN) 52 to be described below in detail.
Some or all of the constituent elements of the foregoing functional units may be implemented by hardware (a circuit unit including circuitry) such as a large-scale integration (LSI) circuit, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a graphics processing unit (GPU), or may be implemented by software and hardware in cooperation. The application program AP may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as a flash memory of the storage 50, as described above, or may be stored in a detachable storage medium (a non-transitory storage medium) such as an SD card, which is mounted on the terminal device SP so that the application program AP is installed in the storage of the terminal device SP.
The acquirer 10 acquires an image captured by the camera Cam. The processor 20 analyzes the image and executes various processes based on a result of the analysis. The notifier 30 gives a warning indicating an alarm, a caution, or the like to a driver based on a processing result of the processor 20. The details of the process will be described below.
Flowchart
FIG. 9 is a flowchart showing an example of a flow of a process executed by an application program (the terminal device SP). First, the terminal device SP estimates a movement amount (step S100) and estimates a posture (step S110). The movement amount is a movement amount of a traffic participant or a movement amount of the own vehicle. The posture is a posture of the own vehicle.
Subsequently, the terminal device SP detects a traffic participant (step S120) and predicts a behavior of the detected traffic participant (step S130). Subsequently, the terminal device SP detects a risk of the own vehicle during straight ahead traveling (step S140), detects a risk during turning left (step S150), and detects a risk during turning right (step S160). Subsequently, the terminal device SP detects a potential risk (step S170). Subsequently, the terminal device SP executes a determination process (step S180).
The terminal device SP displays an alarm level report or a caution level report shown in FIG. 10 on the display of the terminal device SP in accordance with a result of the determination in the foregoing determination process. The alarm level report or the caution level report is a report about the front side, a report about the left or the front left, a report about the right or the front right, or the like. Then, a process of one routine of the flowchart ends. Hereinafter, the details of the process will be described.
Estimating Movement Amount
FIG. 11 is a diagram showing a process of estimating a movement amount. The acquirer 10 of the terminal device SP acquires images ("images captured at different times") of a time series (k . . . k−n) captured by the camera Cam (step S101). Subsequently, the processor 20 of the terminal device SP derives a difference between the image captured at a present time and the image captured i sampling times earlier among the captured images (step S102). Subsequently, the processor 20 digitizes the derived difference (step S104) and obtains the total sum of all the pixel values (step S106). The total sum of all the pixel values is the total sum of the pixels for which it is determined that there is a difference. For example, the processor 20 derives a difference of a luminance value for each pixel and digitizes the difference in accordance with whether it is equal to or greater than a threshold. For example, a pixel with a difference equal to or greater than the threshold is expressed in black and a pixel with a difference less than the threshold is expressed in white. Differences of RGB components may be used instead; in this case, a value obtained by statistically processing the difference of each component may be compared with the threshold and digitized for each pixel.
For example, when the own vehicle is not moving (when the own vehicle is stopped), a difference between images is small (similarity is high). Therefore, when the difference is digitized, whiteness increases. For example, when the own vehicle is moving, the difference is large (the similarity is low). Therefore, when the difference is digitized, the whiteness decreases.
Subsequently, the processor 20 determines whether the total sum of all the pixel values is equal to or less than a threshold (step S108). When the total sum is equal to or less than the threshold (when the whiteness is high), the processor 20 determines that the own vehicle is stopped (F_RUN=0) (step S110). When the total sum is not equal to or less than the threshold (when the whiteness is low), the processor 20 determines that the own vehicle is traveling or that the movement amount of a surrounding traffic participant is large (F_RUN=1) (step S110).
As described above, the terminal device SP can easily determine a state of the own vehicle or a state of the traffic participant based on an image.
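The differencing and thresholding steps above (steps S102 to S108 and the F_RUN determination) can be sketched as follows. The per-pixel and total-sum thresholds are hypothetical toy values, and images are represented as plain 2-D lists of luminance values rather than actual camera frames.

```python
# Sketch of the movement-amount estimation in FIG. 11 (steps S102-S110).
# Images are 2-D lists of luminance values (0-255). Both thresholds are
# hypothetical tuning values, not taken from the patent.
DIFF_THRESHOLD = 20   # per-pixel luminance-difference threshold
SUM_THRESHOLD = 1     # threshold on the count of "different" pixels (toy scale)

def binarize_difference(img_k, img_k_minus_i):
    """Per-pixel difference, digitized: 1 (black) when the difference is
    equal to or greater than the threshold, 0 (white) otherwise."""
    return [[1 if abs(a - b) >= DIFF_THRESHOLD else 0
             for a, b in zip(row_k, row_i)]
            for row_k, row_i in zip(img_k, img_k_minus_i)]

def estimate_movement(img_k, img_k_minus_i):
    """Return F_RUN: 0 when the own vehicle is judged to be stopped
    (total sum of difference pixels is small), 1 otherwise."""
    diff = binarize_difference(img_k, img_k_minus_i)
    total = sum(sum(row) for row in diff)
    return 0 if total <= SUM_THRESHOLD else 1
```

Identical frames yield a total sum of zero (F_RUN=0); frames with many changed pixels exceed the threshold (F_RUN=1).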
Estimating Posture
FIG. 12 is a diagram showing a process of determining a posture. The processor 20 cuts out a rectangular region AR1 from the image at a time k among the images of the time series. The rectangular region AR1 is a region centered on a central portion of the image, in which a reference position (for example, a center of gravity) of the image is set at its center. The processor 20 cuts out, from the image at a time k−i among the images of the time series, a region identical or approximate to the region captured in the image at the time k, and expands the image of the cut-out region so that it approximates the image at the time k. This process is an example of a process of "approximating the size of a cut image in which a specific region is cut from an image captured at the first time to the size of an image captured at the second time."
When the image is expanded, the expansion amount of the image may be determined as follows. For example, when a movement speed of the own vehicle is obtained by a positioning device that measures a position of the terminal device SP, the processor 20 may expand the image by the movement amount corresponding to an i sampling time at the speed of the own vehicle. For example, the processor 20 expands the image with reference to information in which a speed is associated with a degree of expansion. That is, when the speed is obtained, the processor 20 may expand the image to the degree of expansion appropriate for the speed. When the speed is not obtained, the processor 20 may expand the image by a predetermined value.
Instead of the foregoing expansion process, adjustment may be executed through image processing so that an imaging range of the image at the time k is matched to an imaging range of the image at the time k−i. For example, the processor 20 may acquire regions including predetermined scenery from the image at the time k and the image at the time k−i and may contract the size of the image including the region acquired from the image at the time k so that it approximates the size of the image including the region acquired from the image at the time k−i.
Subsequently, the processor 20 acquires a plurality of images obtained by rotating the expanded image by a predetermined angle at a time in the left and right directions using the reference position as a starting point. For example, images rotated clockwise and counterclockwise about the reference position of the image are acquired. A parameter ID is assigned to each image in accordance with the degree of rotation. For example, images rotated in the left direction are assigned parameter IDs +1, . . . , +n, and images rotated in the right direction are assigned parameter IDs −1, . . . , −n.
The processor 20 sets a rectangular region AR2 in a region centering on a central portion of each image to which a parameter ID is assigned. The rectangular region AR2 is, for example, a region corresponding to the rectangular region AR1. The rectangular region AR2 is a region which is centered on the central portion of the image without being affected by the rotation of the image and in which the reference position of the image is set at its center.
Subsequently, the processor 20 compares the image (which is an example of a "second image") of the rectangular region AR1 at the time k with the image (which is an example of a "first image") of the rectangular region AR2 of each image to which a parameter ID is assigned and derives a difference therebetween. Subsequently, the processor 20 digitizes each derived difference and obtains a total sum of all the pixel values for each image. Further, the processor 20 selects the parameter ID of the image in which the total sum of all the pixels (the total sum of the pixels for which it is determined that there is a difference) is the minimum.
When the own vehicle travels straight ahead at the times k and k−i, the parameter ID “0” in FIG. 12 is selected. When the own vehicle is turning left at the times k−i and k, for example, the parameter ID “+n” in FIG. 12 is selected. When the own vehicle is turning right at the times k−i and k, for example, the parameter ID “−n” in FIG. 12 is selected.
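The selection of the parameter ID with the minimum difference can be sketched as follows. The per-ID difference totals are assumed to have already been computed by the binarized-difference procedure described above, so only the selection step is shown.

```python
# Sketch of selecting the parameter ID in FIG. 12: among the images
# rotated by -n..+n steps, pick the ID whose rectangular region AR2
# differs least from the region AR1 of the image at time k. The per-ID
# difference totals are assumed to be precomputed.
def select_parameter_id(diff_sums):
    """diff_sums: dict mapping parameter ID (int) -> total sum of
    difference pixels. Returns the ID with the minimum total."""
    return min(diff_sums, key=diff_sums.get)
```

For example, `select_parameter_id({-1: 40, 0: 35, 1: 12})` returns `1`, i.e. the image rotated one step in the left direction matches best.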
FIG. 13 is a flowchart showing an example of a flow of a process executed by the processor 20. First, the processor 20 determines whether the parameter ID is greater than a threshold +a (step S110). When the parameter ID is greater than the threshold +a, the processor 20 determines that the own vehicle has an intention to turn left because the image is rotated left (step S112). In this case, "F_LTURN=1" and "F_RTURN=0" are set.
When the parameter ID is not greater than the threshold +a, the processor 20 determines whether the parameter ID is less than a threshold −a (step S114). When the parameter ID is less than the threshold −a, the processor 20 determines that the own vehicle has an intention to turn right because the image is rotated right (step S116). In this case, "F_LTURN=0" and "F_RTURN=1" are set.
When the parameter ID is not less than the threshold −a, the processor 20 determines that the own vehicle has an intention to travel straight ahead (step S118). In this case, "F_LTURN=0" and "F_RTURN=0" are set. Then, one routine of the flowchart ends.
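The threshold comparison in the flowchart can be sketched as follows; the threshold value a is a hypothetical tuning value.

```python
# Sketch of the determination in FIG. 13: compare the selected parameter
# ID against a threshold a to set the left-turn / right-turn flags.
A = 2  # hypothetical threshold (+a / -a in the text)

def turn_intention(param_id, a=A):
    """Return (F_LTURN, F_RTURN) from the selected parameter ID."""
    if param_id > a:        # image rotated left -> intention to turn left
        return 1, 0
    if param_id < -a:       # image rotated right -> intention to turn right
        return 0, 1
    return 0, 0             # otherwise: intention to travel straight ahead
```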
As described above, the terminal device SP can determine whether the own vehicle has an intention to travel straight ahead or to turn right or left more accurately based on images captured at different times.
Instead of (or in addition to) the foregoing process, the processor 20 may determine whether the own vehicle has an intention to travel straight ahead, an intention to turn right, or an intention to turn left based on a detection result of an acceleration sensor that is mounted on the terminal device SP and can detect an inclination of the terminal device SP. For example, the processor 20 may determine that the own vehicle has an intention to turn in the direction in which the terminal device SP is inclined.
In the foregoing example, the plurality of prepared images each rotated by the predetermined angle have been described. However, one image rotated in the left direction and one image rotated in the right direction may each be compared with the image captured at the time k, and it may be determined that the own vehicle will travel in the direction corresponding to the image with the smaller difference. In the foregoing example, the central portions of the images have been compared. However, instead of the central portions, other regions may be compared. Further, when the imaging interval of the images is equal to or less than a predetermined value, the process of adjusting the sizes of the image at the time k−i and the image at the time k among the images of the time series (for example, the process of expanding the image at the time k−i) may be omitted, as long as it can still be determined accurately whether the own vehicle has the intention to travel straight ahead, the intention to turn right, or the intention to turn left.
Detecting Traffic Participants
FIG. 14 is a diagram showing a process of detecting traffic participants. The processor 20 identifies kinds of traffic participants in images using the DNN 52. The DNN 52 is a learned model that, when images are input, outputs information indicating the kinds of the traffic participants included in the images or information for identifying positions of the traffic participants. The kinds of traffic participants are, for example, four-wheeled vehicles, two-wheeled vehicles, bicycles, pedestrians, and the like. As the information for identifying the positions, for example, detection boxes, which are rectangular regions, are associated with the traffic participants. The detection boxes are, for example, regions set so as to include the traffic participants.
Process of Predicting Behaviors of Detected Traffic Participants
FIG. 15 is a diagram showing a process of predicting behaviors of the detected traffic participants. The processor 20 estimates movement directions of the traffic participants based on changes in positions of the traffic participants identified in images captured at different times and predicts behaviors of the traffic participants based on a result of the estimation.
When a detection box associated with a traffic participant obtained from the image at the time k is expanded with respect to a detection box associated with the traffic participant obtained from the image at the time k−1, the processor 20 determines that the own vehicle and the traffic participant are approaching each other. In this case, for example, the traffic participant is approaching the own vehicle or the own vehicle is approaching the traffic participant.
When a detection box associated with a traffic participant obtained from the image at the time k is contracted with respect to a detection box associated with the traffic participant obtained from the image at the time k−1, the processor 20 determines that the own vehicle and the traffic participant are moving apart. In this case, for example, the traffic participant is moving away from the own vehicle or the own vehicle is moving away from the traffic participant. The processor 20 estimates a movement amount of the traffic participant based on an expansion amount or a contraction amount of the detection box.
When the position of the detection box at the time k is moved to the right with respect to the position of the detection box at the time k−1, the processor 20 determines that the traffic participant is moving to the right. When the position of the detection box at the time k is moved to the left, the processor 20 determines that the traffic participant is moving to the left. The processor 20 estimates the movement amount of the traffic participant based on the movement amount of the detection box.
The processor 20 estimates the movement direction and the movement amount of the traffic participant, as described above, and predicts the behavior of the traffic participant. The processor 20 predicts a future position of the traffic participant based on detection results of the position of the traffic participant at the time k−1 and the position of the traffic participant at the time k.
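The detection-box reasoning above can be sketched as follows. The (center, width) box representation and the one-step linear extrapolation of the future position are simplifying assumptions, since the text does not specify the prediction model.

```python
# Sketch of the behavior prediction in FIG. 15. A detection box is
# represented as (cx, w): horizontal center and width in image
# coordinates. Growth in width is read as the participant and the own
# vehicle approaching each other; horizontal motion of the center gives
# the lateral direction. The future position is a simple linear
# extrapolation (an assumption for illustration).
def predict_behavior(box_prev, box_now):
    (cx0, w0), (cx1, w1) = box_prev, box_now
    approaching = w1 > w0          # box expanded -> getting closer
    receding = w1 < w0             # box contracted -> moving apart
    lateral = cx1 - cx0            # > 0: moving right, < 0: moving left
    future_cx = cx1 + lateral      # extrapolate one step ahead
    return {"approaching": approaching, "receding": receding,
            "lateral": lateral, "future_cx": future_cx}
```

For example, a box that grows from width 40 to 50 while its center shifts from 100 to 110 is read as an approaching participant moving to the right.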
Process of Detecting Risk
Risk During Straight Ahead Traveling
FIG. 16 is a diagram showing an example of definition of a risk during straight ahead traveling. A risk during straight ahead traveling (a straight-traveling risk) is set in a predetermined range in front of the own vehicle. The straight-traveling risk is set in a first angle range using a line segment (d1) in the traveling direction of the own vehicle or an extension direction (reference direction) of a road as a reference. A second angle range narrower than the first angle range, with the line segment in the traveling direction as a reference, is a front risk range. In the range between the first and second angle ranges, the range in the left direction is a left risk range and the range in the right direction is a right risk range. Within the risk ranges, the range within a first distance from the own vehicle is an alarm level range and the range between the first distance and a second distance from the own vehicle is a caution level range.
Hereinafter, with the reference direction (the traveling direction) as a reference, the region in the second angle range and within the first distance is referred to as a "region a," the region in the range between the second angle and the first angle (which may be an angle different from the second angle; the same applies below) in the left direction and within the first distance is referred to as a "region b," and the region in the corresponding range in the right direction and within the first distance is referred to as a "region c" in some cases (see FIGS. 17, 19, and 21 to be described below).
Similarly, with the reference direction (the traveling direction) as a reference, the region in the second angle range and between the first and second distances is referred to as a "region d," the region in the range between the second angle and the first angle in the left direction and between the first and second distances is referred to as a "region e," and the region in the corresponding range in the right direction and between the first and second distances is referred to as a "region f" in some cases (see FIGS. 17, 19, and 21 to be described below).
FIG. 17 is a diagram showing a process of determining a risk during straight ahead traveling of the own vehicle. The processor 20 determines whether the own vehicle is traveling straight ahead based on the result of the above-described process (step S300). When the own vehicle is not traveling straight ahead, the process proceeds to a process of step S400 of FIG. 19 to be described below.
When the own vehicle is traveling straight ahead, the processor 20 determines in which region among the region a to the region f there is a risk for the own vehicle. For example, the processor 20 determines in which region among the region a to the region f a position (a target position) of a target traffic participant (for example, the traffic participant present at a position closest to the own vehicle) or a future position (a target position) of the traffic participant is located. In the drawing, "*" denotes a target position.
For example, when a target position is located in the region a, a flag “F_FR=1” is assigned. When the target position is located in the region b, a flag “F_LR=1” is assigned. When the target position is located in the region c, a flag “F_RR=1” is assigned.
For example, when the target position is located in the region d, a flag "F_FC=1" is assigned. When the target position is located in the region e, a flag "F_LC=1" is assigned. When the target position is located in the region f, a flag "F_RC=1" is assigned. In the other cases, the flags are set to 0.
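The assignment of the flags from the regions a to f can be sketched as follows. The concrete angle and distance values are hypothetical, and the target position is assumed to be given as a signed angle from the reference direction (left positive) and a distance from the own vehicle.

```python
# Sketch of the region/flag assignment in FIGS. 16 and 17. The target
# position is (angle, dist): the signed angle (degrees, left positive)
# of the target from the reference direction, and its distance from the
# own vehicle. All numeric values are hypothetical; the text only
# defines first/second angles and first/second distances.
ANGLE1, ANGLE2 = 60.0, 20.0   # first / second angle ranges (half-angles)
DIST1, DIST2 = 10.0, 25.0     # first / second distances

def risk_flags(angle, dist):
    """Return the name of the flag set to 1 for regions a-f, or None
    when the target is outside all risk regions."""
    if abs(angle) > ANGLE1 or dist > DIST2:
        return None
    front = abs(angle) <= ANGLE2
    side = "L" if angle > 0 else "R"
    if dist <= DIST1:                          # alarm level: regions a, b, c
        return "F_FR" if front else f"F_{side}R"
    return "F_FC" if front else f"F_{side}C"   # caution level: regions d, e, f
```

A target straight ahead and close maps to region a (`F_FR`); a target to the left at caution distance maps to region e (`F_LC`).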
Risk During Turning Left
FIG. 18 is a diagram showing an example of definition of a risk during turning left. A risk during turning left (a left-turning risk) is set centering on the front left side of the own vehicle. Whereas the straight-traveling risk is set using the traveling direction of the own vehicle as the reference direction, the left-turning risk is set using, as a reference, a line segment (dL) rotated by a predetermined angle in the left direction from the line segment in the traveling direction of the own vehicle instead of the line segment in the traveling direction. The predetermined angle is, for example, an angle within a range from 30 degrees to 60 degrees or an angle set in accordance with the shape of a road.
FIG. 19 is a diagram showing an example of a process of determining a risk during turning left of the own vehicle. Based on a result of the above-described process, the processor 20 determines whether the own vehicle is turning left or has an intention to turn left (step S400). When the own vehicle is not turning left and has no intention to turn left, the process proceeds to a process of step S500 of FIG. 21. When the own vehicle is turning left or has the intention to turn left, the processor 20 determines in which region among the region a to the region f there is a risk for the own vehicle.
As described with reference to FIG. 17, the flag "F_FR=1," the flag "F_LR=1," the flag "F_RR=1," the flag "F_FC=1," the flag "F_LC=1," or the flag "F_RC=1" is assigned, or the flags are set to 0, in accordance with the relation between the target position and the regions.
Risk During Turning Right
FIG. 20 is a diagram showing an example of definition of a risk during turning right. A risk during turning right (a right-turning risk) is set centering on the front right side of the own vehicle. Whereas the straight-traveling risk is set using the traveling direction of the own vehicle as the reference direction, the right-turning risk is set using, as a reference, a line segment (dR) rotated by a predetermined angle in the right direction from the line segment in the traveling direction of the own vehicle instead of the line segment in the traveling direction. The predetermined angle is, for example, an angle within a range from 30 degrees to 60 degrees or an angle set in accordance with the shape of a road.
FIG. 21 is a diagram showing an example of a process of determining a risk during turning right of the own vehicle. Based on a result of the above-described process, the processor 20 determines whether the own vehicle is turning right or has an intention to turn right (step S500). When the own vehicle is not turning right and has no intention to turn right, the process proceeds to a process of step S600 of FIG. 23. When the own vehicle is turning right or has the intention to turn right, the processor 20 determines in which region among the region a to the region f there is a risk for the own vehicle.
As described with reference to FIG. 17, the flag "F_FR=1," the flag "F_LR=1," the flag "F_RR=1," the flag "F_FC=1," the flag "F_LC=1," or the flag "F_RC=1" is assigned, or the flags are set to 0, in accordance with the relation between the target position and the regions.
As described above, when the own vehicle is traveling straight ahead, the processor 20 sets a risk (an alert region) in front of the own vehicle. When the own vehicle is turning left, a risk is set in front of and in the left direction of the own vehicle. When the own vehicle is turning right, a risk is set in front of and in the right direction of the own vehicle. In this way, a risk appropriate for the traveling direction of the own vehicle can be set.
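The choice of reference direction (d1, dL, or dR) described in FIGS. 16, 18, and 20 can be sketched as follows; the 45-degree offset is a hypothetical value within the 30-to-60-degree range mentioned above.

```python
# Sketch of selecting the reference direction: the traveling direction
# d1 for straight-ahead travel, or d1 rotated by a predetermined angle
# to the left (dL) or right (dR) when turning. Left rotation is
# positive; the 45-degree offset is a hypothetical example within the
# 30-60 degree range given in the text.
TURN_OFFSET = 45.0  # degrees

def reference_direction(heading, f_lturn, f_rturn, offset=TURN_OFFSET):
    """heading: traveling direction in degrees. Returns the reference
    direction used to set the risk regions."""
    if f_lturn:
        return heading + offset   # dL: rotated left
    if f_rturn:
        return heading - offset   # dR: rotated right
    return heading                # d1: straight ahead
```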
Estimating Potential Risk
FIG. 22 is a diagram showing an example of a process of estimating a potential risk. The processor 20 detects an average inter-vehicle distance Lvv_ave and a transverse distance Lev based on positions of other vehicles in an image. The average inter-vehicle distance Lvv_ave is, for example, an average of the inter-vehicle distances of predetermined vehicles (for example, all the vehicles) present between the own vehicle and the detected other vehicle. The other vehicle is a vehicle present at a position closest to the own vehicle. An inter-vehicle distance between vehicles is, for example, the distance between reference positions of vehicles forming a row in the traveling direction. When only one vehicle is present, the inter-vehicle distance is set to a large value such as a preset distance (for example, 30 m).
The transverse distance Lev is a distance between the own vehicle and another vehicle in the transverse direction. The other vehicle is, for example, a vehicle located at a position closest to the own vehicle. The average inter-vehicle distance Lvv_ave and the transverse distance Lev are used for the process related to estimation of the potential risk described below.
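The two distance estimates can be sketched as follows. The coordinate convention (lateral x, longitudinal y, own vehicle at the origin) is an assumption for illustration; the 30 m default follows the example preset given above.

```python
# Sketch of the distance estimates in FIG. 22. Positions are given in a
# frame with the own vehicle at the origin: x is lateral, y is
# longitudinal. The coordinate convention is assumed for illustration.
def average_inter_vehicle_distance(ys, default=30.0):
    """Average gap between consecutive vehicles in a longitudinal row
    (Lvv_ave); a large preset value when fewer than two vehicles are
    present, per the example in the text."""
    ys = sorted(ys)
    if len(ys) < 2:
        return default
    gaps = [b - a for a, b in zip(ys, ys[1:])]
    return sum(gaps) / len(gaps)

def transverse_distance(other_xs):
    """Lateral distance Lev to the other vehicle closest to the own
    vehicle in the transverse direction."""
    return min(other_xs, key=abs)
```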
FIG. 23 is a flowchart showing an example of a flow of a process executed by the processor 20. The processor 20 determines whether the average inter-vehicle distance Lvv_ave is equal to or less than a threshold Lvv_LR (step S600). The average inter-vehicle distance Lvv_ave is, for example, an average inter-vehicle distance based on the distance in the longitudinal direction between a first object now entering an intersection (a connection portion between first and second roads) and a second object behind the first object, or on the inter-vehicle distances between the first object and each of the objects up to an n-th object present behind the second object. In step S600, it may be determined whether another vehicle is stopped and the average inter-vehicle distance Lvv_ave is equal to or less than the threshold Lvv_LR. When the process of determining whether the other vehicle is stopped is omitted to reduce the calculation load, it is instead determined that the other vehicle is stopped when the average inter-vehicle distance Lvv_ave is equal to or less than the threshold Lvv_LR. When the average inter-vehicle distance Lvv_ave is not equal to or less than the threshold Lvv_LR, the process proceeds to a process of step S700 of FIG. 24.
When the average inter-vehicle distance Lvv_ave is equal to or less than the threshold Lvv_LR, the processor 20 determines whether the own vehicle is turning right or has an intention to turn right (step S602). When the own vehicle is turning right or has the intention to turn right, the processor 20 determines that there is a potential risk in the left direction (step S604). When there is the potential risk in the left direction, a flag "F_LLR=1" is set.
When the own vehicle is not turning right and has no intention to turn right, the processor 20 determines whether the own vehicle is turning left or has an intention to turn left (step S606). When the own vehicle is turning left or has the intention to turn left, the processor 20 determines that there is a potential risk in the right direction (step S608). When there is the potential risk in the right direction, a flag "F_RLR=1" is set.
When the own vehicle turns left, there is a possibility of a traffic participant approaching the own vehicle from the left direction. However, since the presence of a traffic participant that is likely to approach can be recognized from a gap between vehicles before turning left, caution is directed to a traffic participant approaching from the right side, where recognition is difficult. When the driver pays caution to the left direction at the time of turning left and moves his or her face in that direction, a possibility of the driving becoming unstable is undeniable. Therefore, caution is directed to the right side to prevent this.
When the own vehicle is not turning left and has no intention to turn left, the processor 20 determines whether the transverse distance |Lev| is less than a threshold Lev_LR (step S610). When the transverse distance |Lev| is less than the threshold Lev_LR, the processor 20 determines that there is a potential risk in front of the own vehicle (step S612). When there is the potential risk in front, a flag "F_FLR=1" is set.
When the transverse distance |Lev| is not less than the threshold Lev_LR, the processor 20 determines that there is no potential risk (step S614). In this case, "F_LLR=0," "F_RLR=0," and "F_FLR=0" are set.
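The flowchart of FIG. 23 (steps S600 to S614) can be sketched as follows. The threshold values are hypothetical, and the comparison directions follow the reading above that a short average inter-vehicle distance (a queue with gaps) and a short transverse distance indicate risk.

```python
# Sketch of the potential-risk determination in FIG. 23. Threshold
# values are hypothetical tuning values.
LVV_LR = 8.0    # threshold on the average inter-vehicle distance Lvv_ave
LEV_LR = 2.0    # threshold on the transverse distance |Lev|

def potential_risk_flags(lvv_ave, lev, turning_right, turning_left):
    """Return (F_LLR, F_RLR, F_FLR): potential risk in the left
    direction, the right direction, and in front, respectively."""
    if lvv_ave <= LVV_LR:                 # step S600
        if turning_right:                 # step S602 -> S604
            return 1, 0, 0
        if turning_left:                  # step S606 -> S608
            return 0, 1, 0
        if abs(lev) < LEV_LR:             # step S610 -> S612
            return 0, 0, 1
    return 0, 0, 0                        # step S614 (or S600 not satisfied)
```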
Process Related to Alarm/Caution
FIG. 24 is a flowchart showing an example of a flow of a process (determination process) for alarm/caution. The processor 20 determines whether the own vehicle is moving (step S700). When the own vehicle is not moving, one routine of the flowchart ends.
Process Related to Alarm Risk
When the own vehicle is moving, the processor 20 determines whether there is an alarm risk in front (step S702). When there is no front alarm risk, the processor 20 determines whether there is an alarm risk in the left direction (step S704). When there is no alarm risk in the left direction, the processor 20 determines whether there is an alarm risk in the right direction (step S706).
When there is the alarm risk in front, the alarm risk in the left direction, or the alarm risk in the right direction, the notifier 30 executes display in accordance with the alarm risk (step S708) and further executes passing of a first mode (step S710). As described above in FIGS. 17, 19, and 21 , when the flag “F_FR=1 (a flag indicating that there is an alarm risk in front)” “F_LR=1 (a flag indicating that there is an alarm risk in the left direction),” or “F_RR=1 (a flag indicating that there is an alarm risk in the right direction)” is set, the processes of steps S708 and S710 are executed. The details will be described with reference to FIGS. 25 and 26 to be described below.
Process Related to Caution Risk
When there is no alarm risk in the right direction, the processor 20 determines whether there is a caution risk in front (step S712). When there is no caution risk in front, the processor 20 determines whether there is a caution risk in the left direction (step S714). When there is no caution risk in the left direction, the processor 20 determines whether there is a caution risk in the right direction (step S716).
When there is the caution risk in front, the caution risk in the left direction, or the caution risk in the right direction, the notifier 30 executes display in accordance with the caution risk (step S718) and further executes passing of the second mode (step S720). As described above in FIGS. 17, 19, and 21 , when the flag “F_FC=1 (a flag indicating that there is a caution risk in front)” “F_LC=1 (a flag indicating that there is a caution risk in the left direction),” or “F_RC=1 (a flag indicating that there is a caution risk in the right direction)” is set, the processes of steps S718 and S720 are executed. The details will be described with reference to FIG. 27 to be described below.
Process Related to Potential Risk
When there is no caution risk in the right direction, the processor 20 determines whether there is a potential risk in front (step S722). When there is no potential risk in front, the processor 20 determines whether there is a potential risk in the left direction (step S724). When there is no potential risk in the left direction, the processor 20 determines whether there is a potential risk in the right direction (step S726).
When there is the potential risk in front, the potential risk in the left direction, or the potential risk in the right direction, the notifier 30 executes display in accordance with the potential risk (step S728). As described above in FIGS. 17, 19, and 21 , when the flag “F_FLR=1 (a flag indicating that there is a potential risk in front)” “F_LLR=1 (a flag indicating that there is a potential risk in the left direction),” or “F_RLR=1 (a flag indicating that there is a potential risk in the right direction)” is set, the process of step S728 is executed. The details will be described with reference to FIG. 28 to be described below. Then, the process of one routine of the flowchart ends.
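The priority ordering of the determination process in FIG. 24 (alarm risks first, then caution risks, then potential risks, each checked front, left, right) can be sketched as follows; `flags` is assumed to be a dictionary of the flags set in the earlier steps.

```python
# Sketch of the priority ordering in FIG. 24: alarm risks are checked
# first (steps S702-S706), then caution risks (S712-S716), then
# potential risks (S722-S726); each group is checked front, left, right.
PRIORITY = ["F_FR", "F_LR", "F_RR",      # alarm risks (first-mode passing)
            "F_FC", "F_LC", "F_RC",      # caution risks (second-mode passing)
            "F_FLR", "F_LLR", "F_RLR"]   # potential risks (no passing)

def determine_notification(flags, moving):
    """Return the first active flag in priority order, or None when the
    own vehicle is not moving (step S700) or no risk flag is set."""
    if not moving:
        return None
    for name in PRIORITY:
        if flags.get(name, 0):
            return name
    return None
```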
Notification of Alarm Risk (First Alert Level)
FIG. 25 is a diagram showing a scenario and a notification in which an alarm risk in the forward direction occurs. When there is an alarm risk in the forward direction, the notifier 30 displays alarm information indicating that there is the alarm risk in the forward direction on the display and further executes the passing of the first mode. This alarm display is a predetermined display such as an arrow (for example, a red arrow) indicating the forward direction, as shown. The first mode is a mode in which the passing light PL is flashed at a high rate.
FIG. 26 is a diagram showing a scenario and a notification in which an alarm risk in the left direction or an alarm risk in the right direction occurs. When there is the alarm risk in the left direction, the notifier 30 displays alarm information indicating that there is the alarm risk in the left direction on the display and further executes the passing of the first mode. This alarm display is predetermined display such as an arrow (for example, a red arrow) indicating the left direction, as shown.
When there is the alarm risk in the right direction, the notifier 30 displays alarm information indicating that there is the alarm risk in the right direction on the display and further executes the passing of the first mode. This alarm display is predetermined display such as an arrow (for example, a red arrow) indicating the right direction, as shown.
Notification of Caution Risk (Second Alert Level)
FIG. 27 is a diagram showing a scenario and a notification in which a caution risk occurs. When there is a caution risk in the forward direction, the notifier 30 displays caution information indicating that there is a caution risk in the forward direction on the display and further executes the passing of the second mode. This caution display is a predetermined display such as an arrow (for example, a yellow arrow) indicating the forward direction, as shown. The second mode is a mode in which the passing light PL is flashed at a low rate.
When there is a caution risk in the left direction, the notifier 30 displays caution information indicating that there is a caution risk in the left direction on the display and further executes the passing of the second mode. This caution display is a predetermined display such as an arrow (for example, a yellow arrow) indicating the left direction, as shown.
When there is a caution risk in the right direction, the notifier 30 displays caution information indicating that there is a caution risk in the right direction on the display and further executes the passing of the second mode. This caution display is a predetermined display such as an arrow (for example, a yellow arrow) indicating the right direction, as shown. When there is a caution risk, as described above, the color of the image displayed on the display (an image such as an arrow indicating an alert) is different from the color of the image displayed when there is an alarm risk.
Notification of Potential Risk (Third Alert Level)
FIG. 28 is a diagram showing a scenario in which a potential risk occurs, and the corresponding notification. When there is a potential risk in the forward direction, the notifier 30 displays potential risk information indicating that there is a potential risk in the forward direction on the display. This potential risk display is a predetermined display such as an arrow (for example, a yellow arrow) indicating the forward direction, as shown.
When there is a potential risk in the left direction, the notifier 30 displays potential risk information indicating that there is a potential risk in the left direction on the display. This potential risk display is a predetermined display such as an arrow (for example, a yellow arrow) indicating the left direction, as shown. When there is a potential risk, there are no traffic participants approaching. Therefore, passing is not executed.
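The notification behavior described for FIGS. 26 to 28 amounts to a mapping from risk type and direction to a display color and a passing mode. The sketch below is illustrative only; the function name, the string labels, and the mode names are assumptions and do not appear in the embodiment.

```python
# Illustrative sketch of the notification policy described above (names
# are assumptions): the risk type selects the arrow color and the passing
# mode, and the direction selects which arrow is displayed.
def notification_for(risk, direction):
    """Return (arrow_description, passing_mode) for a detected risk.

    risk: 'alarm' (first alert level), 'caution' (second alert level),
          or 'potential' (potential risk, no approaching participant).
    direction: 'forward', 'left', or 'right'.
    """
    if risk == "alarm":
        # Alarm risk: red arrow, passing in the first mode.
        return (f"red arrow ({direction})", "first mode")
    if risk == "caution":
        # Caution risk: yellow arrow, passing in the second mode.
        return (f"yellow arrow ({direction})", "second mode")
    if risk == "potential":
        # Potential risk: no traffic participant is approaching,
        # so passing is not executed.
        return (f"yellow arrow ({direction})", None)
    return (None, None)
```

Note how the color alone distinguishes the alarm risk from the caution and potential risks, while the passing mode distinguishes the caution risk from the potential risk.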
In the foregoing example, the camera and the display provided in the terminal device SP have been described; however, one or both of the camera and the display may be provided not in the terminal device SP but in another device, a mobile object, or the like. In this case, the terminal device SP communicates with the other device or the mobile object to acquire an image or to display information on the display.
In the foregoing embodiment, as described above, it is determined whether the alert level is the first alert level or the second alert level, and different information is provided for each. However, instead of (or in addition to) this, the application program may skip this determination and simply display information regarding the alert on the display in any situation requiring an alert (that is, when the alert level is equal to or greater than a predetermined level). Further, in this case, the application program may execute passing. A situation requiring an alert is a situation in which the alert level is the first alert level, a situation in which the alert level is the second alert level, or a situation in which there is a potential risk.
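The simplified variant above, which displays alert information whenever the alert level is at or above a predetermined level without distinguishing the first and second alert levels, can be sketched as follows. The numeric encoding of the levels is an assumption made for illustration, not part of the embodiment.

```python
# Hypothetical numeric encoding of the alert levels (not from the patent):
# the first alert level is the highest, the potential risk the lowest.
POTENTIAL_RISK, SECOND_ALERT_LEVEL, FIRST_ALERT_LEVEL = 1, 2, 3

def situation_requires_alert(level, predetermined_level=POTENTIAL_RISK):
    """True when information regarding the alert should be displayed.

    With the default predetermined level, all three situations (first
    alert level, second alert level, potential risk) require an alert.
    """
    return level >= predetermined_level
```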
According to the above-described embodiment, an application program causes a computer of a terminal device equipped with a camera and mounted in a mobile object to execute a process of acquiring an image; a process of determining whether an alert level for an operator of the mobile object is a first alert level or a second alert level, which is a level lower than the first alert level, based on a surrounding situation of the mobile object obtained from the image; and a process of displaying information appropriate to the alert level, in accordance with a result of the determination, on a display provided on a casing in which the camera of the terminal device is provided. Thus, it is possible to improve convenience for a user.
According to the above-described embodiment, an application program causes a computer of a terminal device equipped with a camera and mounted in a mobile object to execute a process of acquiring images captured at different times, a process of estimating whether the mobile object has an intention to turn right or left based on the images captured at the different times, and a process of displaying an image for reporting the presence of traffic participants on a display provided on a casing in which the camera of the terminal device is provided based on a result of the estimation. Thus, it is possible to estimate the intention behind a behavior of the mobile object more accurately.
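A minimal sketch of this estimation process, under simplifying assumptions: the yaw rotation of a forward-facing camera is approximated as a horizontal pixel shift of a small grayscale image, the central portion of each candidate "rotation" of the image captured at the first time is compared with the central portion of the image captured at the second time, and a turn is predicted when the best-matching angle meets a threshold (as in claims 3 and 4). The shift-for-rotation approximation and all function names are illustrative and are not taken from the embodiment.

```python
# Illustrative sketch (assumed names): predict a left/right turn by finding
# which candidate "rotation" (modeled as a horizontal shift) of the first
# image best matches the second image in their central portions.
def central(img, width):
    """Central columns of a row-major 2-D image."""
    cols = len(img[0])
    start = (cols - width) // 2
    return [row[start:start + width] for row in img]

def shift(img, dx):
    """Shift columns by dx (positive = right), padding with zeros."""
    cols = len(img[0])
    out = []
    for row in img:
        if dx >= 0:
            out.append([0] * dx + row[:cols - dx])
        else:
            out.append(row[-dx:] + [0] * (-dx))
    return out

def difference(a, b):
    """Sum of absolute pixel differences between two images."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def predict_turn(first, second, candidates, threshold, width=2):
    """Return 'left', 'right', or 'straight'.

    The candidate shift whose central portion best matches the central
    portion of the second image stands in for the rotational angle at
    which the difference between the images is minimized; a turn is
    predicted when its magnitude is at or above the threshold.
    """
    target = central(second, width)
    best = min(candidates,
               key=lambda dx: difference(central(shift(first, dx), width),
                                         target))
    if abs(best) < threshold:
        return "straight"
    return "left" if best < 0 else "right"
```

Comparing only the central portions keeps the comparison insensitive to the zero padding introduced at the image edges by each candidate shift.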
The embodiments for carrying out the present invention have been described above, but the present invention is not limited to the embodiments. Various modifications and substitutions can be made within the scope of the present invention without departing from the gist of the present invention.

Claims (8)

What is claimed is:
1. An information supply method of causing at least one processor of a terminal device equipped with a camera and mounted in a mobile object to execute instructions stored in at least one memory of the terminal device, the instructions comprising:
acquiring images captured at different times;
rotating a first image captured at a first time at a predetermined angle in a left or right direction using a reference position as a starting point;
comparing a second image captured at a second time different than the first time with the first image rotated at the predetermined angle;
predicting whether the mobile object has an intention to turn right or left based on a result of the comparing;
detecting a traffic participant that has a possibility of interfering with the mobile object based on a third image that is included in the images captured at different times; and
notifying an occupant of the mobile object of the traffic participant near the mobile object by displaying an image representation of an alarm, representing a presence of the traffic participant, on a display device provided on a casing in which the camera of the terminal device is provided, based on a result of the predicting and the detected traffic participant that has a possibility of interfering with the mobile object.
2. The information supply method according to claim 1, wherein the method causes the at least one processor to further execute:
predicting whether the mobile object is traveling straight ahead or the mobile object is turning right or left.
3. The information supply method according to claim 1, wherein the method causes the at least one processor to further execute:
comparing the first image at a central portion of the image rotated at the predetermined angle with the second image at a central portion of the image captured at the second time.
4. The information supply method according to claim 3, wherein the method causes the at least one processor to further execute:
acquiring the first image of a central portion of each of a plurality of images obtained by rotating the image captured at the first time at each predetermined angle in left and right directions using the reference position as the starting point;
comparing each of the first images at the central portions with the second image at the central portion of the image captured at the second time; and
predicting that the mobile object has an intention to turn right or left when a rotational angle of the first image in which a difference between the first and second images is minimized is equal to or greater than a threshold.
5. The information supply method according to claim 1, wherein the method causes the at least one processor to further execute:
approximating a size of a cutout image in which a specific region is cut from the first image captured at the first time to a size of the second image captured at the second time, rotating the cutout image at the predetermined angle, and comparing the second image captured at the second time with the cutout image rotated at the predetermined angle.
6. The information supply method according to claim 1, wherein the method causes the at least one processor to further execute:
a process of displaying the image representation of the alarm about an approach of the traffic participant on the display device when the mobile object is predicted to have an intention to turn right or left and the traffic participant is present or approaches in or near a route through which the mobile object passes when the mobile object turns right or left.
7. A non-transitory computer storage medium storing an application program that causes a computer of a terminal device equipped with a camera and mounted in a mobile object to execute a process of:
acquiring images captured at different times;
rotating a first image captured at a first time at a predetermined angle in a left or right direction using a reference position as a starting point;
comparing a second image captured at a second time different than the first time with the first image rotated at the predetermined angle;
predicting whether the mobile object has an intention to turn right or left based on a result of the comparing;
detecting a traffic participant that has a possibility of interfering with the mobile object based on a third image that is included in the images captured at different times; and
notifying an occupant of the mobile object of the traffic participant near the mobile object by displaying an image representation of an alarm, representing a presence of the traffic participant, on a display device provided on a casing in which the camera of the terminal device is provided, based on a result of the predicting and the detected traffic participant that has a possibility of interfering with the mobile object.
8. A terminal device comprising:
a camera provided in a casing;
a display provided in the casing; and
a processor configured to:
rotate a first image captured at a first time at a predetermined angle in a left or right direction using a reference position as a starting point;
compare a second image captured at a second time different than the first time with the first image rotated at the predetermined angle;
predict whether a mobile object in which the terminal device is mounted has an intention to turn right or left based on a result of the comparing;
detect a traffic participant that has a possibility of interfering with the mobile object based on a third image that is included in images captured at different times; and
notify an occupant of the mobile object of the traffic participant near the mobile object by displaying an image representation of an alarm, representing a presence of the traffic participant, on the display, based on a result of the predicting and the detected traffic participant that has a possibility of interfering with the mobile object.
US17/701,740 2021-03-30 2022-03-23 Terminal device equipped with a camera and display and mounted on a vehicle that predicts an intention of the vehicle to turn left or right Active 2042-08-03 US12008904B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021057597A JP7247252B2 (en) 2021-03-30 2021-03-30 Application program, information provision method, and terminal device
JP2021-057597 2021-03-30

Publications (2)

Publication Number Publication Date
US20220319326A1 (en) 2022-10-06
US12008904B2 (en) 2024-06-11

Family

ID=83405367

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/701,740 Active 2042-08-03 US12008904B2 (en) 2021-03-30 2022-03-23 Terminal device equipped with a camera and display and mounted on a vehicle that predicts an intention of the vehicle to turn left or right

Country Status (3)

Country Link
US (1) US12008904B2 (en)
JP (1) JP7247252B2 (en)
CN (1) CN120773648A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20250047982A1 (en) * 2023-08-01 2025-02-06 Canon Kabushiki Kaisha Information processing apparatus, method, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002079896A (en) 2000-09-06 2002-03-19 Auto Network Gijutsu Kenkyusho:Kk Traveling state announcing device
JP2005196644A (en) 2004-01-09 2005-07-21 Nissan Motor Co Ltd Right turn support information presentation device
JP2015058915A (en) 2013-09-20 2015-03-30 本田技研工業株式会社 Vehicular roll angle estimation device
JP2016122381A (en) 2014-12-25 2016-07-07 株式会社リコー Optical flow calculation device, optical flow calculation method, and program
US20160318445A1 (en) * 2015-04-30 2016-11-03 Honda Motor Co., Ltd. System and method for vehicle collision mitigation with vulnerable road user context sensing
WO2017104712A1 (en) 2015-12-14 2017-06-22 ヤマハ発動機株式会社 Vehicle-use roll-angle estimation system, vehicle, vehicle-use roll-angle estimation method and program
US20210316721A1 (en) * 2020-04-13 2021-10-14 Hyundai Motor Company Vehicle and method of controlling the same
US20220382294A1 (en) * 2020-02-21 2022-12-01 Zoox, Inc. Object or person attribute characterization


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Japanese Office Action for Japanese Patent Application No. 2021-057597 dated Dec. 6, 2022.


Also Published As

Publication number Publication date
JP7247252B2 (en) 2023-03-28
CN120773648A (en) 2025-10-14
JP2022154522A (en) 2022-10-13
CN115139907A (en) 2022-10-04
US20220319326A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
EP3807128B1 (en) A rider assistance system and method
CN102779430B (en) Collision-warning system, controller and method of operating thereof after the night of view-based access control model
US9987979B2 (en) Vehicle lighting system
US11752929B2 (en) Notification device
CN107845104B (en) Method for detecting overtaking vehicle, related processing system, overtaking vehicle detection system and vehicle
US10864906B2 (en) Method of switching vehicle drive mode from automatic drive mode to manual drive mode depending on accuracy of detecting object
CN110430401A (en) Vehicle blind area early warning method, early warning device, MEC platform and storage medium
CN115071702B (en) Vehicle control device, vehicle control method, and vehicle control computer program
US12141982B2 (en) Image processing apparatus, image processing method, and computer-readable storage medium storing program
US10531016B2 (en) On-vehicle display control device, on-vehicle display system, on-vehicle display control method, and non-transitory storage medium
CN111052201B (en) Collision prediction device, collision prediction method, and storage medium
CN115107641B (en) Light projection device, method and storage medium
US11663834B2 (en) Traffic signal recognition method and traffic signal recognition device
CN114365208B (en) Driving assistance device, driving assistance method, and storage medium
US12008904B2 (en) Terminal device equipped with a camera and display and mounted on a vehicle that predicts an intention of the vehicle to turn left or right
US11935412B2 (en) Information supply method and storage medium
JP2020110010A (en) Vehicle display control device, vehicle display system, vehicle display control method and program
JP4661602B2 (en) Rear vehicle analysis device and collision prediction device
CN115139907B (en) Information providing method, storage medium, and terminal device
US20220319187A1 (en) Image processing apparatus, imaging apparatus, movable object, and method for image processing

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONDA MOTOR CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YASUI, YUJI;REEL/FRAME:059350/0633

Effective date: 20220321

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE