WO2020183652A1 - Driving support device - Google Patents

Driving support device

Info

Publication number
WO2020183652A1
WO2020183652A1 (PCT/JP2019/010296, JP2019010296W)
Authority
WO
WIPO (PCT)
Prior art keywords
road traffic
traffic information
distance coefficient
viewpoint
viewpoint distance
Prior art date
Application number
PCT/JP2019/010296
Other languages
English (en)
Japanese (ja)
Inventor
悟史 山口
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2021504710A priority Critical patent/JP7105985B2/ja
Priority to PCT/JP2019/010296 priority patent/WO2020183652A1/fr
Publication of WO2020183652A1 publication Critical patent/WO2020183652A1/fr

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to a driving support device.
  • Patent Document 1 discloses a technique for calculating the visibility of a safety confirmation object to the driver based on the distance of the object from the driver's line-of-sight direction. As a result, the technique calculates not only the visibility of the safety confirmation object the driver is watching directly, but also the visibility of objects that fall within the driver's so-called "peripheral vision".
  • A road information display shows various information about roads.
  • A traffic information display shows various information related to traffic.
  • The road information display and the traffic information display are collectively referred to as the "road traffic information display".
  • the technique described in Patent Document 1 calculates the visibility of the safety confirmation object by the driver.
  • Instead of calculating the visibility of the safety confirmation object, it is conceivable to divert this technique to calculate the degree to which the driver recognizes the road traffic information display (hereinafter the "recognition degree"). This adaptation is hereinafter referred to as the "diverted technique".
  • The driver's recognition degree of the road traffic information display differs not only with the distance of the display from the line-of-sight direction but also with the content of the display. Therefore, when calculating the recognition degree, it is preferable to consider not only the distance of the road traffic information display from the line-of-sight direction but also its content.
  • Patent Document 1 considers the distance of the safety confirmation object from the line-of-sight direction when calculating its visibility, but does not consider the object's content. Consequently, the diverted technique would not consider the content of the road traffic information display when calculating the recognition degree, and the accuracy of determining whether the road traffic information display has been recognized by the driver would be low.
  • The present invention has been made to solve the above problem, and an object of the present invention is to improve the accuracy of determining whether the road traffic information display has been recognized by the driver.
  • The driving support device of the present invention includes: a road traffic information detection unit that detects, in a front image, the area corresponding to the road traffic information display and also detects the content of the display; a viewpoint detection unit that uses a vehicle-interior image to detect the position in the front image corresponding to the driver's viewpoint; a viewpoint distance coefficient calculation unit that calculates a viewpoint distance coefficient according to the distance of the area from that position; a viewpoint distance coefficient correction unit that corrects the viewpoint distance coefficient based on the content of the road traffic information display; and a recognition degree calculation unit that calculates the driver's recognition degree of the road traffic information display by integrating the corrected viewpoint distance coefficient.
  • Since the present invention is configured as described above, it can improve the accuracy of determining whether the road traffic information display has been recognized by the driver.
  • FIG. 1 is a block diagram showing the main part of the driving support system including the driving support device according to Embodiment 1.
  • FIG. 2A is an explanatory drawing showing an example of the front image. FIG. 2B is an explanatory drawing showing the area corresponding to the road traffic information display in the front image shown in FIG. 2A. FIG. 2C is an explanatory drawing showing the position corresponding to the driver's viewpoint in the front image shown in FIG. 2A. FIG. 2D is an explanatory drawing showing the distance of the area from that position in the front image shown in FIG. 2A. FIG. 3A is an explanatory drawing showing another example of the front image. FIG. 3B is an explanatory drawing showing the area corresponding to the road traffic information display in the front image shown in FIG. 3A.
  • FIG. 1 is a block diagram showing a main part of a driving support system including a driving support device according to the first embodiment.
  • The driving support device 100 of the first embodiment will be described with reference to FIG. 1. The driving support system 200 including the driving support device 100 will also be described.
  • the vehicle 1 is provided with a control device 2, a first image pickup device 3, a second image pickup device 4, and a display device 5.
  • the main part of the driving support system 200 is composed of the control device 2, the first image pickup device 3, the second image pickup device 4, and the display device 5.
  • the first image pickup device 3 takes an image of the front of the vehicle 1 at predetermined time intervals, and outputs an image signal indicating the captured image (hereinafter referred to as "front image") I1.
  • the first imaging device 3 is composed of, for example, an infrared camera or a visible light camera.
  • the first imaging device 3 is provided, for example, on the front end of the vehicle 1, on the dashboard of the vehicle 1, or on the ceiling of the front of the vehicle interior of the vehicle 1.
  • the second imaging device 4 captures the interior of the vehicle 1 at predetermined time intervals and outputs an image signal indicating the captured image (hereinafter referred to as "in-vehicle image") I2.
  • the second imaging device 4 is composed of, for example, an infrared camera or a visible light camera.
  • the second imaging device 4 is provided, for example, on the dashboard of the vehicle 1.
  • the display device 5 is composed of, for example, a liquid crystal display or an organic EL (Electro Luminescence) display, and is provided on the dashboard of the vehicle 1.
  • Alternatively, the display device 5 may be configured by a HUD (Head-Up Display).
  • the control device 2 is composed of, for example, an ECU (Electronic Control Unit).
  • the driving support device 100 is provided in the control device 2. Hereinafter, the driving support device 100 will be described.
  • the road traffic information detection unit 21 acquires the image signal output by the first imaging device 3.
  • the road traffic information detection unit 21 detects the road traffic information display S included in the front image I1 by executing an image recognition process for the front image I1 using the acquired image signal. More specifically, the road traffic information detection unit 21 detects the area A corresponding to the road traffic information display S in the front image I1 and also detects the content of the road traffic information display S.
  • the process in which the road traffic information detection unit 21 detects the contents of the area A and the road traffic information display S is referred to as “road traffic information detection process”.
  • the viewpoint detection unit 22 acquires the image signal output by the second imaging device 4.
  • The viewpoint detection unit 22 uses the acquired image signal to execute an image recognition process on the in-vehicle image I2, thereby detecting the position in the front image I1 corresponding to the driver's viewpoint (hereinafter the "viewpoint position") P.
  • The process in which the viewpoint detection unit 22 detects the viewpoint position P is referred to as the "viewpoint detection process".
  • the viewpoint detection unit 22 detects the position of the driver's head (more specifically, the position of the driver's eyes) by executing the image recognition process for the in-vehicle image I2, and also detects the driver's line of sight. Detect the vector.
  • the viewpoint detection unit 22 stores in advance a table showing the correspondence between the position of the driver's eyes, the driver's line-of-sight vector, and the viewpoint position P in the front image I1.
  • the viewpoint detection unit 22 detects the viewpoint position P in the front image I1 by using the stored table.
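The table lookup described above can be approximated geometrically. The sketch below is a hypothetical stand-in for the stored correspondence table: it intersects the driver's line-of-sight vector with a virtual image plane in front of the eyes and converts the hit point to pixel coordinates in the front image I1. The plane distance, pixel scale, and image centre are assumed values, not taken from the patent.

```python
import math

def viewpoint_position(eye_pos, gaze_vec, plane_z=1.0,
                       px_per_m=800.0, center=(640, 360)):
    """Hypothetical stand-in for the table stored by viewpoint
    detection unit 22: intersect the gaze ray with a virtual image
    plane at distance plane_z and map the hit point to pixels."""
    ex, ey, ez = eye_pos
    gx, gy, gz = gaze_vec
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    gx, gy, gz = gx / norm, gy / norm, gz / norm  # unit line-of-sight vector
    if gz <= 0:
        raise ValueError("gaze must point toward the image plane")
    t = (plane_z - ez) / gz              # ray/plane intersection parameter
    hx, hy = ex + t * gx, ey + t * gy    # hit point on the plane
    u = center[0] + hx * px_per_m        # metres -> pixels (x)
    v = center[1] - hy * px_per_m        # image y axis points down
    return (u, v)
```

A gaze pointing straight ahead from the image centre maps back to the image centre, which is the sanity check a real calibration table would also satisfy.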
  • the viewpoint distance coefficient calculation unit 23 acquires the detection result by the road traffic information detection unit 21 and the detection result by the viewpoint detection unit 22.
  • The viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A from the viewpoint position P using the acquired detection results. It then calculates a coefficient C (hereinafter the "viewpoint distance coefficient") corresponding to the calculated distance L. The viewpoint distance coefficient C is a value that gradually decreases as the distance L increases; in other words, it gradually increases as the distance L decreases.
  • The viewpoint distance coefficient calculation unit 23 calculates the distance L in pixel units based on the difference between the coordinate value indicating the viewpoint position P in the front image I1 (hereinafter the "first coordinate value") and the coordinate value indicating the position of a predetermined portion of the region A in the front image I1 (for example, its central portion or its upper-left end; hereinafter the "second coordinate value").
  • the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C corresponding to the calculated distance L in pixel units.
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L in pixel units based on the difference value between the first coordinate value and the second coordinate value.
  • Alternatively, the viewpoint distance coefficient calculation unit 23 stores in advance a table showing the correspondence between the first coordinate value, the second coordinate value, and a coefficient for converting the distance L in pixel units into the distance L in meters.
  • The viewpoint distance coefficient calculation unit 23 converts the calculated distance L in pixel units into the distance L in meters using the stored table.
  • the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C corresponding to the calculated distance L in meters.
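The calculation path above (pixel-unit distance from the coordinate difference, a conversion to metres, then a coefficient that decreases with L) can be sketched as follows. The conversion factor `m_per_px` and the exponential-decay form of C are illustrative assumptions; the patent names no formula.

```python
import math

def distance_l(first_xy, second_xy, m_per_px=0.01):
    """Distance L of region A from viewpoint position P.

    first_xy is the first coordinate value (viewpoint position P),
    second_xy the second coordinate value (a predetermined point of
    region A). m_per_px stands in for the pixel-to-metre table."""
    dx = second_xy[0] - first_xy[0]
    dy = second_xy[1] - first_xy[1]
    return math.hypot(dx, dy) * m_per_px

def viewpoint_distance_coefficient(l, sigma=0.5):
    """Coefficient C: gradually decreases as L increases
    (illustrative exponential decay, sigma is an assumed scale)."""
    return math.exp(-l / sigma)
```

Any monotonically decreasing function of L would satisfy the behaviour the text describes; the decay form is chosen here only because it is bounded in (0, 1].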
  • FIG. 2A shows an example of the front image I1.
  • The front image I1 shown in FIG. 2A includes a road sign (more specifically, a regulation sign) having the meaning of "no vehicle entry", that is, a road traffic information display S.
  • the regulation sign is composed of one symbol (hereinafter referred to as "mark") and does not include a character string.
  • the road traffic information detection unit 21 detects the area A corresponding to the road traffic information display S (see FIG. 2B). Further, the road traffic information detection unit 21 detects the content of the road traffic information display S. More specifically, the road traffic information detection unit 21 detects that the road traffic information display S is a regulatory sign having the meaning of "no vehicle entry". At this time, it is also detected that the road traffic information display S is composed of one mark and that the road traffic information display S does not include a character string.
  • the viewpoint detection unit 22 detects the viewpoint position P in the front image I1 (see FIG. 2C).
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A with respect to the viewpoint position P (see FIG. 2D), and calculates the viewpoint distance coefficient C corresponding to the calculated distance L.
  • FIG. 3A shows another example of the front image I1.
  • the front image I1 shown in FIG. 3A includes a signboard including the character string “future / closed”, that is, the road traffic information display S.
  • the road traffic information detection unit 21 detects the area A corresponding to the road traffic information display S (see FIG. 3B). Further, the road traffic information detection unit 21 detects the content of the road traffic information display S. More specifically, the road traffic information detection unit 21 detects that the road traffic information display S is a signboard including the character string "future / closed”.
  • the viewpoint detection unit 22 detects the viewpoint position P in the front image I1 (see FIG. 3C).
  • the viewpoint distance coefficient calculation unit 23 calculates the distance L of the region A with respect to the viewpoint position P (see FIG. 3D), and calculates the viewpoint distance coefficient C corresponding to the calculated distance L.
  • the viewpoint distance coefficient correction unit 24 acquires the viewpoint distance coefficient C calculated by the viewpoint distance coefficient calculation unit 23. Further, the viewpoint distance coefficient correction unit 24 acquires the detection result by the road traffic information detection unit 21. The viewpoint distance coefficient correction unit 24 corrects the acquired viewpoint distance coefficient C based on the content of the road traffic information display S by using the acquired detection result.
  • The viewpoint distance coefficient correction unit 24 calculates a difficulty level D1 of recognition of the road traffic information display S based on, for example, the number of marks included in the display S, the presence or absence of a character string in the display S, and the number of characters in any included character string.
  • the difficulty level D1 is also an index indicating the complexity of the road traffic information display S.
  • the road traffic information display S in the front image I1 shown in FIG. 2 is composed of one mark and does not include a character string.
  • the road traffic information display S in the front image I1 shown in FIG. 3 includes a character string.
  • Accordingly, the difficulty level D1 of the former (FIG. 2) is calculated to be smaller than the difficulty level D1 of the latter (FIG. 3).
  • The viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C so that its value is smaller when the difficulty level D1 is large than when the difficulty level D1 is small. In other words, it corrects the viewpoint distance coefficient C so that its value is larger when the difficulty level D1 is small than when the difficulty level D1 is large.
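A minimal sketch of this correction, assuming simple illustrative weights for the difficulty level D1 (the patent names the inputs, the number of marks, the presence of a character string, and the character count, but fixes no formula):

```python
def difficulty_d1(num_marks, has_text, num_chars):
    """Difficulty level D1 of recognising display S.
    The weights (1 per mark, 2 per character) are assumptions."""
    return num_marks + (2 * num_chars if has_text else 0)

def correct_coefficient(c, d1):
    """Corrected coefficient C': shrinks as D1 grows, so a more
    complex display needs more accumulated gaze to count as seen."""
    return c / (1.0 + d1)
```

With these weights, the one-mark regulation sign of FIG. 2 yields a smaller D1 (and hence a larger C') than the text-bearing signboard of FIG. 3, matching the behaviour described above.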
  • the forward image I1 is imaged at predetermined time intervals.
  • the road traffic information detection unit 21 executes the road traffic information detection process at predetermined time intervals. Therefore, when the road traffic information display S is included in the front image I1, the viewpoint distance coefficient C is calculated at a predetermined time interval, and the calculated viewpoint distance coefficient C is sequentially corrected.
  • the recognition degree calculation unit 25 calculates the recognition degree D2 of the road traffic information display S by the driver by temporally integrating the corrected viewpoint distance coefficient C'. That is, the recognition degree D2 is updated every time the viewpoint distance coefficient C is calculated and the calculated viewpoint distance coefficient C is corrected.
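The temporal integration of C' can be sketched as a running discrete-time sum, updated once per frame; the frame interval `dt` is an assumed parameter:

```python
class RecognitionDegree:
    """Accumulates corrected coefficients C' over frames to give the
    recognition degree D2 (discrete approximation of the integral)."""

    def __init__(self, dt=0.1):
        self.dt = dt    # assumed frame interval in seconds
        self.d2 = 0.0

    def update(self, c_corrected):
        """Add one frame's corrected coefficient C'; returns D2."""
        self.d2 += c_corrected * self.dt
        return self.d2
```

The longer the driver's viewpoint stays near the display (large C' every frame), the faster D2 grows, which is exactly the role the integration plays in the text.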
  • the recognition degree determination unit 26 acquires the detection result by the road traffic information detection unit 21.
  • the connection line between the road traffic information detection unit 21 and the recognition degree determination unit 26 is not shown.
  • the road traffic information display S in the front image I1 goes out of the front image I1 due to the traveling of the vehicle 1 (that is, due to the passage of time during traveling).
  • the recognition degree determination unit 26 determines whether or not the road traffic information display S has gone out of the front image I1 by using the acquired detection result.
  • the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25 when it is determined that the road traffic information display S has gone out of the front image I1.
  • The recognition degree determination unit 26 determines whether the road traffic information display S has been recognized by the driver by determining whether the acquired recognition degree D2 is equal to or greater than a predetermined threshold value Dth. That is, the threshold value Dth is set to a value that makes it possible to determine whether the road traffic information display S has been recognized by the driver.
  • When the recognition degree D2 is less than the threshold value Dth, the display control unit 27 controls the display device 5 to display an image I3 corresponding to the road traffic information display S (hereinafter the "first display control"). By visually checking the image I3, the driver can confirm the road traffic information display S that he or she missed.
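The determination and the first display control together can be sketched as one function; `show_missed` is a hypothetical callback standing in for rendering the image I3 on the display device 5:

```python
def check_and_display(d2, dth, show_missed):
    """Recognition determination (unit 26) plus first display control
    (unit 27), run once display S has left the front image I1."""
    if d2 >= dth:
        return False   # D2 >= Dth: display S was recognised, do nothing
    show_missed()      # first display control: present image I3
    return True        # the driver likely missed display S
```

The return value indicates whether the first display control fired, which makes the branch easy to exercise in isolation.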
  • The main part of the driving support device 100 is composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • the driving support device 100 has a processor 51 and a memory 52.
  • The memory 52 stores a program for realizing the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • The processor 51 reads out and executes the stored program, thereby realizing the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • the driving support device 100 has a processing circuit 53.
  • Alternatively, the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are realized by the dedicated processing circuit 53.
  • the driving support device 100 has a processor 51, a memory 52, and a processing circuit 53 (not shown).
  • Alternatively, some of the functions of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27 are realized by the processor 51 and the memory 52, while the remaining functions are realized by the dedicated processing circuit 53.
  • the processor 51 uses, for example, at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, and a DSP (Digital Signal Processor).
  • the memory 52 is composed of a non-volatile memory or a non-volatile memory and a volatile memory.
  • the volatile memory in the memory 52 is, for example, one using RAM (Random Access Memory).
  • The non-volatile memory in the memory 52 uses, for example, at least one of a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or an HDD (Hard Disc Drive).
  • the processing circuit 53 is composed of a digital circuit or a digital circuit and an analog circuit.
  • The processing circuit 53 uses, for example, at least one of an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an SoC (System-on-a-Chip), or a system LSI.
  • the first image pickup apparatus 3 continuously executes a process of capturing an image of the front of the vehicle 1 at predetermined time intervals and outputting an image signal indicating the front image I1.
  • the second imaging device 4 continuously executes a process of capturing an image of the interior of the vehicle 1 at predetermined time intervals and outputting an image signal indicating the in-vehicle image I2.
  • In step ST1, the road traffic information detection unit 21 executes the road traffic information detection process.
  • In step ST3, the viewpoint detection unit 22 executes the viewpoint detection process.
  • In step ST4, the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C.
  • In step ST5, the viewpoint distance coefficient correction unit 24 corrects the viewpoint distance coefficient C.
  • In step ST6, the recognition degree calculation unit 25 calculates the recognition degree D2.
  • When the determination in step ST2 is "NO", the process of the driving support device 100 returns to step ST1. Likewise, following step ST6, the process of the driving support device 100 returns to step ST1. That is, as described above, the road traffic information detection process is executed at predetermined time intervals.
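The loop of steps ST1 to ST6 can be sketched as one frame-processing function, with each unit (21 to 25) passed in as a callable; all of the callables below are hypothetical placeholders for the units described above:

```python
def driving_support_step(detect_s, detect_p, calc_c, correct_c, integrate):
    """One pass of steps ST1-ST6 for a single frame."""
    result = detect_s()               # ST1: road traffic info detection
    if result is None:                # ST2 "NO": no display S in image
        return None
    area, content = result
    p = detect_p()                    # ST3: viewpoint detection
    c = calc_c(area, p)               # ST4: viewpoint distance coefficient
    c_prime = correct_c(c, content)   # ST5: correction by display content
    return integrate(c_prime)         # ST6: update recognition degree D2
```

Injecting the units as callables keeps the sketch self-contained and shows the data flow (area and content feed different stages) without committing to any detector implementation.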
  • the operation of the driving support device 100 will be described focusing on the operations of the recognition degree determination unit 26 and the display control unit 27.
  • the recognition degree determination unit 26 determines whether or not the road traffic information display S has gone out of the front image I1 by using the detection result by the road traffic information detection unit 21. When it is determined that the road traffic information display S has gone out of the front image I1, the recognition degree determination unit 26 executes the process of step ST11.
  • In step ST11, the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25 and determines whether the acquired recognition degree D2 is equal to or greater than the threshold value Dth.
  • When the determination in step ST11 is "NO", the display control unit 27 executes the first display control in step ST12.
  • When the determination in step ST11 is "YES", the process of step ST12 is skipped.
  • the driving support device 100 includes the viewpoint distance coefficient correction unit 24 that corrects the viewpoint distance coefficient C based on the content of the road traffic information display S.
  • By using the corrected viewpoint distance coefficient C' for the calculation of the recognition degree D2, the accuracy of determining whether the road traffic information display S has been recognized by the driver can be improved compared with using the uncorrected viewpoint distance coefficient C (that is, compared with the conventional driving support device).
  • The driving support device 100 may execute the processes of steps ST4 to ST6 for each of a plurality of road traffic information displays S (see FIG. 7). In this case, the driving support device 100 may also execute the processes of steps ST11 and ST12 for each of the plurality of road traffic information displays S.
  • The recognition degree determination unit 26 and the display control unit 27 may be provided outside the driving support device 100. That is, the main part of the driving support device 100 may be composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, and the recognition degree calculation unit 25.
  • As described above, the driving support device 100 of the first embodiment includes: the road traffic information detection unit 21 that detects the region A corresponding to the road traffic information display S in the front image I1 and also detects the content of the display S; the viewpoint detection unit 22 that detects the position P corresponding to the driver's viewpoint in the front image I1; the viewpoint distance coefficient calculation unit 23 that calculates the viewpoint distance coefficient C corresponding to the distance L of the region A from the position P; the viewpoint distance coefficient correction unit 24 that corrects the viewpoint distance coefficient C based on the content of the display S; and the recognition degree calculation unit 25 that calculates the driver's recognition degree D2 of the display S by integrating the corrected viewpoint distance coefficient C'. By using the corrected viewpoint distance coefficient C' for calculating the recognition degree D2, it is possible to improve the accuracy of determining whether the road traffic information display S has been recognized by the driver.
  • The driving support device 100 also includes the recognition degree determination unit 26, which determines whether the road traffic information display S has been recognized by the driver by determining whether the recognition degree D2 is equal to or greater than the threshold value Dth, and the display control unit 27, which executes the control (first display control) to display the image I3 corresponding to the road traffic information display S on the display device 5.
  • the driver can confirm the road traffic information display S that he / she missed.
  • the recognition degree determination unit 26 determines whether or not the recognition degree D2 is equal to or greater than the threshold value Dth when the road traffic information display S goes out of the front image I1. As a result, when the vehicle 1 passes the installation position of the road traffic information display S, it is possible to determine whether or not the recognition degree D2 is equal to or higher than the threshold value Dth. That is, it is possible to determine whether or not the recognition degree D2 is equal to or higher than the threshold value Dth for the road traffic information display S that the driver may have missed.
  • FIG. 9 is a block diagram showing a main part of the driving support system including the driving support device according to the second embodiment.
  • the driving support device 100a of the second embodiment will be described with reference to FIG. Moreover, the driving support system 200a including the driving support device 100a will be described.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • control device 2a is provided in the vehicle 1.
  • the control device 2a is composed of, for example, an ECU.
  • the vehicle 1 is provided with sensors 6.
  • the sensors 6 include various sensors.
  • the sensors 6 include a wheel speed sensor, an acceleration sensor, a gyro sensor, a steering sensor, and a wiper sensor.
  • the vehicle 1 is provided with a wireless communication device 7.
  • the wireless communication device 7 is composed of a transmitter and a receiver for wireless communication.
  • The wireless communication device 7 can communicate with the server device 8 outside the vehicle 1.
  • the storage device 9 is provided in the vehicle 1.
  • the storage device 9 is composed of a non-volatile memory.
  • the storage device 9 may be integrally configured with the memory (that is, the memory 52) of the control device 2a.
  • the vehicle 1 is provided with an operation input device 10.
  • the operation input device 10 is composed of, for example, at least one of a touch panel, a hardware key, and a microphone for voice input.
  • the operation input device 10 is provided on the dashboard of the vehicle 1, for example.
  • the operation input device 10 may be provided integrally with the display device 5, or may be provided adjacent to the display device 5.
  • The main part of the driving support system 200a is composed of the control device 2a, the first imaging device 3, the second imaging device 4, the display device 5, the sensors 6, the wireless communication device 7, the storage device 9, and the operation input device 10.
  • the driving support device 100a is provided in the control device 2a.
  • the vehicle information acquisition unit 31 acquires information indicating the running state of the vehicle 1 (hereinafter referred to as "vehicle information").
  • vehicle information includes, for example, information indicating the traveling speed of the vehicle 1 and information indicating the turning amount of the vehicle 1.
  • the vehicle information acquisition unit 31 calculates the traveling speed of the vehicle 1 by using the output signal from the wheel speed sensor of the sensors 6. As a result, information indicating the traveling speed of the vehicle 1 is acquired.
  • the vehicle information acquisition unit 31 calculates the turning amount of the vehicle 1 by using the output signal from the acceleration sensor, the gyro sensor, or the steering sensor of the sensors 6. As a result, information indicating the turning amount of the vehicle 1 is acquired.
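As an illustration of how such vehicle information might be derived from raw sensor signals, here is a minimal Python sketch. The pulse-counting and yaw-rate-integration formulas, and all parameter names, are generic assumptions and are not taken from the disclosure:

```python
def traveling_speed_kmh(pulse_count: int, pulses_per_rev: int,
                        tire_circumference_m: float, interval_s: float) -> float:
    # A wheel speed sensor emits a fixed number of pulses per wheel revolution;
    # the distance covered during the sampling interval converts to km/h.
    revolutions = pulse_count / pulses_per_rev
    distance_m = revolutions * tire_circumference_m
    return distance_m / interval_s * 3.6

def turning_amount_deg(yaw_rate_deg_s: float, interval_s: float) -> float:
    # A gyro sensor reports yaw rate; integrating it over the sampling
    # interval gives the turning amount accumulated in that interval.
    return yaw_rate_deg_s * interval_s
```

For example, 100 pulses from a 50-pulse-per-revolution sensor on a tire of 2.0 m circumference over 0.5 s corresponds to 28.8 km/h.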
  • the environmental information acquisition unit 32 acquires information indicating the environment outside the vehicle 1 (hereinafter referred to as "environmental information").
  • the environmental information includes, for example, information indicating the brightness around the vehicle 1 and information indicating the weather around the vehicle 1.
  • the environmental information acquisition unit 32 detects the brightness around the vehicle 1 by executing an image recognition process for the front image I1. As a result, information indicating the brightness of the surroundings of the vehicle 1 is acquired.
  • the environmental information acquisition unit 32 determines the weather around the vehicle 1 by executing the image recognition process for the front image I1. As a result, information indicating the weather around the vehicle 1 is acquired.
  • the environmental information acquisition unit 32 determines whether or not the wiper of the vehicle 1 is operating by using the output signal from the wiper sensor of the sensors 6. When the wiper of the vehicle 1 is operating, the environmental information acquisition unit 32 determines that the weather around the vehicle 1 is rain or snow. Further, the environmental information acquisition unit 32 determines that the weather around the vehicle 1 is sunny or cloudy when the wiper of the vehicle 1 is stopped. As a result, information indicating the weather around the vehicle 1 is acquired.
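The brightness and wiper heuristics above can be condensed into a few lines. A sketch, where the threshold value and the string labels are placeholders of our choosing rather than values from the disclosure:

```python
def brightness_level(mean_pixel_intensity: float, dark_threshold: float = 60.0) -> str:
    # Classify the surroundings of the vehicle from the mean pixel
    # intensity of the front image I1 (threshold is illustrative).
    return "dark" if mean_pixel_intensity < dark_threshold else "bright"

def weather_from_wiper(wiper_active: bool) -> str:
    # The document's heuristic: an operating wiper implies rain or snow,
    # a stopped wiper implies sunny or cloudy weather.
    return "rain_or_snow" if wiper_active else "sunny_or_cloudy"
```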
  • the server device 8 stores information indicating the weather in each area.
  • the wireless communication device 7 receives information indicating the weather in the area including the current position of the vehicle 1.
  • the environment information acquisition unit 32 acquires the received information. As a result, information indicating the weather around the vehicle 1 is acquired.
  • the driver information acquisition unit 33 acquires information about the driver of the vehicle 1 (hereinafter referred to as "driver information").
  • the driver information includes, for example, information indicating the driver's eyesight.
  • vehicle 1 is used by a plurality of users.
  • the storage device 9 stores a database in which information indicating the visual acuity of each user and information indicating the face image of each user are associated with each other.
  • the information indicating the visual acuity of each user is, for example, input in advance by each user using the operation input device 10.
  • the facial image of each user is, for example, pre-imaged by the second imaging device 4.
  • the driver information acquisition unit 33 acquires the face image of the driver of the vehicle 1 by executing the image recognition process for the in-vehicle image I2.
  • the driver information acquisition unit 33 determines which of the plurality of users the driver of the vehicle 1 is by matching the acquired face image with the face image in the database.
  • the driver information acquisition unit 33 acquires information indicating the user's visual acuity corresponding to the result of the determination from the database. As a result, information indicating the eyesight of the driver of the vehicle 1 is acquired.
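A minimal sketch of the database lookup described above: match the driver's face against the stored records and return the associated visual acuity. Representing faces as plain embedding vectors compared by squared Euclidean distance is an assumption; the disclosure does not specify the matching algorithm.

```python
def identify_driver(face_embedding, database):
    """Return (user_id, visual_acuity) for the closest stored face.

    database: list of (user_id, embedding, visual_acuity) tuples.
    Embeddings are plain number sequences; distance is squared Euclidean.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    user_id, _, acuity = min(database,
                             key=lambda rec: sq_dist(rec[1], face_embedding))
    return user_id, acuity
```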
  • the storage device 9 stores the following trained model. That is, a trained model is stored in which a value indicating the driver's visual acuity is output when a value indicating the result of the image recognition process for the in-vehicle image I2 is input.
  • the trained model is pre-generated by machine learning.
  • the driver information acquisition unit 33 executes an image recognition process for the in-vehicle image I2, and inputs a value indicating the result of the image recognition process into the trained model. As a result, information indicating the driver's eyesight is acquired.
  • Vehicle information, environmental information, and driver information are collectively referred to as "correction information".
  • the vehicle information acquisition unit 31, the environment information acquisition unit 32, and the driver information acquisition unit 33 constitute the correction information acquisition unit 34.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S. That is, the viewpoint distance coefficient correction unit 24a executes the same correction processing as the viewpoint distance coefficient correction unit 24. In addition to this, the viewpoint distance coefficient correction unit 24a executes at least one of the following correction processes (1) to (6).
  • (1) The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value becomes larger when the area A is large than when the area A is small; in other words, the value becomes smaller when the area A is small than when the area A is large. The detection result by the road traffic information detection unit 21 is used for this correction process.
  • (2) The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value becomes smaller when the traveling speed of the vehicle 1 is high than when it is low; in other words, the value becomes larger when the traveling speed is low. The vehicle information acquired by the vehicle information acquisition unit 31 is used for this correction process.
  • (3) The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value becomes smaller when the turning amount of the vehicle 1 is large than when it is small; in other words, the value becomes larger when the turning amount is small. The vehicle information acquired by the vehicle information acquisition unit 31 is used for this correction process.
  • (4) The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value becomes smaller when the surroundings of the vehicle 1 are dark than when they are bright; in other words, the value becomes larger when the surroundings are bright. The environmental information acquired by the environmental information acquisition unit 32 is used for this correction process.
  • (5) The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value becomes smaller when the weather around the vehicle 1 is rain or snow than when it is sunny or cloudy; in other words, the value becomes larger when the weather is sunny or cloudy. The environmental information acquired by the environmental information acquisition unit 32 is used for this correction process.
  • (6) The viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C so that its value becomes smaller when the driver's visual acuity is low than when it is high; in other words, the value becomes larger when the driver's visual acuity is high. The driver information acquired by the driver information acquisition unit 33 is used for this correction process.
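The six correction processes (1) to (6) can be summarized as directional adjustments to the viewpoint distance coefficient C. The following sketch combines them multiplicatively; the multiplicative form and the 1.2/0.8 factor values are illustrative assumptions, since the disclosure only specifies the direction (larger or smaller) of each adjustment:

```python
def corrected_viewpoint_distance_coefficient(
    c,                   # viewpoint distance coefficient C before correction
    area_size_large,     # (1) area A of the road traffic information display is large
    speed_high,          # (2) traveling speed of the vehicle is high
    turning_large,       # (3) turning amount of the vehicle is large
    surroundings_dark,   # (4) surroundings of the vehicle are dark
    precipitation,       # (5) weather is rain or snow
    low_visual_acuity,   # (6) driver's eyesight is low
):
    # Each condition nudges C in the direction the document specifies:
    # a large sign area raises C; every other condition lowers it.
    factor = 1.2 if area_size_large else 0.8
    for penalize in (speed_high, turning_large, surroundings_dark,
                     precipitation, low_visual_acuity):
        if penalize:
            factor *= 0.8
    return c * factor
```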
  • The road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, the display control unit 27, and the correction information acquisition unit 34 constitute the main part of the driving support device 100a.
  • Each function of these units may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
  • In step ST1, the road traffic information detection unit 21 executes the road traffic information detection process.
  • In step ST3, the viewpoint detection unit 22 executes the viewpoint detection process.
  • In step ST4, the viewpoint distance coefficient calculation unit 23 calculates the viewpoint distance coefficient C.
  • In step ST7, the correction information acquisition unit 34 acquires the correction information.
  • In step ST5a, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C.
  • the viewpoint distance coefficient correction unit 24a executes at least one of the above correction processes (1) to (6) in addition to the correction process based on the content of the road traffic information display S.
  • when the viewpoint distance coefficient correction unit 24a executes only the correction process (1) among the above correction processes (1) to (6), the correction information is unnecessary; in this case, the process of step ST7 may be skipped.
  • In step ST6, the recognition degree calculation unit 25 calculates the recognition degree D2.
  • Since the viewpoint distance coefficient correction unit 24a executes at least one of the above correction processes (1) to (6) in addition to the correction process based on the content of the road traffic information display S, using the corrected viewpoint distance coefficient C' to calculate the recognition degree D2 can further improve the accuracy of determining whether or not the road traffic information display S is recognized by the driver.
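Per the abstract, the recognition degree D2 is calculated by integrating the corrected viewpoint distance coefficients C'. A minimal sketch, assuming a simple per-frame summation and a fixed threshold Dth (both assumptions; the disclosure does not specify the integration scheme):

```python
def recognition_degree(corrected_coefficients) -> float:
    # D2 accumulates the corrected coefficient C' over successive frames:
    # the longer the driver's gaze stays near the sign, the larger D2 grows.
    return sum(corrected_coefficients)

def is_recognized(d2: float, dth: float) -> bool:
    # Recognition degree determination unit 26: compare D2 with threshold Dth.
    return d2 >= dth
```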
  • the correction information acquisition unit 34 may be provided outside the driving support device 100a. That is, the main part of the driving support device 100a may be composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27.
  • the recognition degree determination unit 26 and the display control unit 27 may be provided outside the driving support device 100a. That is, the main part of the driving support device 100a may be composed of the road traffic information detection unit 21, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24a, and the recognition degree calculation unit 25.
  • As described above, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S and also corrects the viewpoint distance coefficient C based on the size of the area A. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and corrects the viewpoint distance coefficient C based on the traveling state of the vehicle 1. More specifically, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the traveling speed of the vehicle 1 or the turning amount of the vehicle 1. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and corrects the viewpoint distance coefficient C based on the environment outside the vehicle 1. More specifically, the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the brightness around the vehicle 1 or the weather around the vehicle 1. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • the viewpoint distance coefficient correction unit 24a corrects the viewpoint distance coefficient C based on the content of the road traffic information display S, and corrects the viewpoint distance coefficient C based on the driver's visual acuity. As a result, the accuracy of determining whether or not the road traffic information display S is recognized by the driver can be further improved.
  • FIG. 13 is a block diagram showing a main part of the driving support system including the driving support device according to the third embodiment.
  • the driving support device 100b of the third embodiment will be described with reference to FIG. Moreover, the driving support system 200b including the driving support device 100b will be described.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • control device 2b is provided in the vehicle 1.
  • the control device 2b is composed of, for example, an ECU.
  • the vehicle 1 is provided with the voice output device 11.
  • the audio output device 11 is composed of, for example, a speaker.
  • the main part of the driving support system 200b is composed of the control device 2b, the first imaging device 3, the second imaging device 4, the display device 5, and the audio output device 11.
  • the driving support device 100b is provided in the control device 2b.
  • the importance determination unit 41 acquires the detection result by the road traffic information detection unit 21.
  • the importance determination unit 41 determines the importance D3 of the road traffic information display S based on the content of the road traffic information display S by using the acquired detection result.
  • the importance D3 is determined to be, for example, one of three values (that is, the first value, the second value, or the third value) according to the magnitude of the influence that the content of the road traffic information display S has on the driving of the vehicle 1. The larger the value of the importance D3, the greater the influence of the content of the road traffic information display S on the driving of the vehicle 1.
  • the importance determination unit 41 determines that the importance D3 is the third value when the content of the road traffic information display S requests the vehicle 1 to stop (for example, when the road traffic information display S is a regulation sign meaning "road closed"). Further, the importance determination unit 41 determines that the importance D3 is the second value when the content of the road traffic information display S requests the vehicle 1 to decelerate or change lanes. In other cases, the importance determination unit 41 determines that the importance D3 is the first value.
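The three-level decision can be sketched as a small mapping. The content labels used here are hypothetical placeholders for the detected sign categories; the disclosure keys the decision only on whether the sign requests a stop, a deceleration or lane change, or neither:

```python
def importance(content: str) -> int:
    # Three-level importance D3 from the content of road traffic display S.
    # Category names are illustrative, not from the disclosure.
    if content in ("road_closed", "no_entry"):     # requests the vehicle to stop
        return 3                                   # third value
    if content in ("speed_limit", "lane_closed"):  # deceleration / lane change
        return 2                                   # second value
    return 1                                       # first value (other cases)
```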
  • when the importance determination unit 41 determines that the importance D3 is equal to or greater than a predetermined reference value Dref (for example, the second value), the voice output control unit 42 executes control for outputting the voice corresponding to the road traffic information display S to the voice output device 11 (hereinafter referred to as "voice output control").
  • the voice is, for example, a voice that reads out a sentence corresponding to the content of the road traffic information display S.
  • the output control unit 43 is composed of the display control unit 27 and the audio output control unit 42.
  • the importance determination unit 41 and the voice output control unit 42, together with the units described in the first embodiment, constitute the main part of the driving support device 100b.
  • the recognition degree determination unit 26 acquires the recognition degree D2 calculated by the recognition degree calculation unit 25, and determines whether or not the acquired recognition degree D2 is equal to or greater than the threshold value Dth.
  • the importance degree determination unit 41 determines the importance degree D3 of the road traffic information display S in step ST13.
  • the output control unit 43 executes the first display control in step ST12.
  • the output control unit 43 executes the first display control and the audio output control in step ST14.
  • the display control unit 27 executes the first display control.
  • the voice output control unit 42 executes voice output control.
  • the driving support device 100b may have the same viewpoint distance coefficient correction unit 24a as the driving support device 100a instead of the viewpoint distance coefficient correction unit 24. Further, the control device 2b may have the same correction information acquisition unit 34 as the control device 2a.
  • the driving support system 200b may include sensors 6, a wireless communication device 7, a storage device 9, and an operation input device 10 similar to the driving support system 200a.
  • the driving support device 100b can employ various modifications similar to those described in the first and second embodiments.
  • the driving support device 100b of the third embodiment includes the importance determination unit 41, which determines whether or not the importance D3 of the road traffic information display S is equal to or higher than the reference value Dref based on the content of the road traffic information display S, and the voice output control unit 42, which executes control to output the voice corresponding to the road traffic information display S to the voice output device 11.
  • FIG. 15 is a block diagram showing a main part of the driving support system including the driving support device according to the fourth embodiment.
  • the driving support device 100c of the fourth embodiment will be described with reference to FIG. 15.
  • the driving support system 200c including the driving support device 100c will be described.
  • the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • the same blocks as those shown in FIG. 9 are designated by the same reference numerals, and the description thereof will be omitted.
  • control device 2c is provided in the vehicle 1.
  • the control device 2c is composed of, for example, an ECU.
  • the main part of the driving support system 200c is composed of the control device 2c, the first imaging device 3, the second imaging device 4, the display device 5, the storage device 9, and the operation input device 10.
  • the driving support device 100c is provided in the control device 2c.
  • the road traffic information detection unit 21a executes the same road traffic information detection process as the road traffic information detection unit 21.
  • the road traffic information detection unit 21a stores the data indicating the road traffic information display S detected by the road traffic information detection process in the storage device 9.
  • the storage device 9 holds the stored data for a predetermined time.
  • the display control unit 27a executes the same first display control as the display control unit 27. In addition, when an operation instructing the display of the image I3 corresponding to the road traffic information display S indicated by the data stored in the storage device 9 is input to the operation input device 10, the display control unit 27a executes control for displaying the image I3 corresponding to the road traffic information display S indicated by the stored data on the display device 5 (hereinafter referred to as "second display control").
  • as a result, the driver can confirm, at any timing, the content of a road traffic information display S that the vehicle 1 has passed, regardless of whether or not he or she overlooked it.
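A sketch of the storage behavior behind the second display control: detected signs are held for a predetermined time and can be recalled on demand. The retention-based buffer design and the injectable clock are implementation assumptions, not details from the disclosure:

```python
import time

class SignHistory:
    """Holds detected road traffic signs for a fixed retention period,
    so the driver can recall them later (second display control)."""

    def __init__(self, retention_s: float, clock=time.monotonic):
        self.retention_s = retention_s
        self.clock = clock          # injectable for testing
        self._entries = []          # list of (timestamp, sign_data)

    def store(self, sign_data):
        # Road traffic information detection unit 21a stores each detection.
        self._entries.append((self.clock(), sign_data))

    def recall(self):
        # Drop expired entries, then return the remaining signs, oldest first.
        now = self.clock()
        self._entries = [(t, s) for t, s in self._entries
                         if now - t <= self.retention_s]
        return [s for _, s in self._entries]
```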
  • The main part of the driving support device 100c is composed of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27a.
  • the functions of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, the recognition degree calculation unit 25, the recognition degree determination unit 26, and the display control unit 27a are It may be realized by the processor 51 and the memory 52, or may be realized by the dedicated processing circuit 53.
  • the recognition degree determination unit 26 may be provided outside the driving support device 100c. That is, the main part of the driving support device 100c may be composed of the road traffic information detection unit 21a, the viewpoint detection unit 22, the viewpoint distance coefficient calculation unit 23, the viewpoint distance coefficient correction unit 24, and the recognition degree calculation unit 25.
  • the driving support device 100c may have the same viewpoint distance coefficient correction unit 24a as the driving support device 100a instead of the viewpoint distance coefficient correction unit 24. Further, the control device 2c may have the same correction information acquisition unit 34 as the control device 2a. In this case, the driving support system 200c may include the same sensors 6 and the wireless communication device 7 as the driving support system 200a.
  • the driving support device 100c may have the same importance determination unit 41 and voice output control unit 42 as the driving support device 100b.
  • the driving support system 200c may include an audio output device 11 similar to the driving support system 200b.
  • the driving support device 100c of the fourth embodiment includes the display control unit 27a, which executes control (second display control) for displaying the image I3 corresponding to the road traffic information display S on the display device 5 in response to an operation input to the operation input device 10.
  • as a result, the driver can confirm, at any timing, the content of a road traffic information display S that the vehicle 1 has passed, regardless of whether or not he or she overlooked it.
  • In the present invention, the embodiments can be freely combined, any constituent element of any embodiment can be modified, or any constituent element can be omitted from any embodiment.
  • the driving support device of the present invention can be used for driving support of a vehicle.


Abstract

The present invention relates to a driving support device (100) comprising: a road traffic information detection unit (21) that detects, in a front image (I1), an area (A) corresponding to a road traffic information display (S) and also detects the content of the road traffic information display (S); a viewpoint detection unit (22) that uses an in-vehicle image (I2) to detect, in the front image (I1), a position (P) corresponding to the driver's viewpoint; a viewpoint distance coefficient calculation unit (23) that calculates a viewpoint distance coefficient (C) corresponding to the distance (L) of the area (A) with respect to the position (P); a viewpoint distance coefficient correction unit (24) that corrects the viewpoint distance coefficient (C) based on the content of the road traffic information display (S); and a recognition degree calculation unit (25) that calculates the driver's recognition degree (D2) of the road traffic information display (S) by integrating corrected viewpoint distance coefficients (C').
PCT/JP2019/010296 2019-03-13 2019-03-13 Dispositif d'aide à la conduite WO2020183652A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021504710A JP7105985B2 (ja) 2019-03-13 2019-03-13 運転支援装置
PCT/JP2019/010296 WO2020183652A1 (fr) 2019-03-13 2019-03-13 Dispositif d'aide à la conduite

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/010296 WO2020183652A1 (fr) 2019-03-13 2019-03-13 Dispositif d'aide à la conduite

Publications (1)

Publication Number Publication Date
WO2020183652A1 true WO2020183652A1 (fr) 2020-09-17

Family

ID=72427228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/010296 WO2020183652A1 (fr) 2019-03-13 2019-03-13 Dispositif d'aide à la conduite

Country Status (2)

Country Link
JP (1) JP7105985B2 (fr)
WO (1) WO2020183652A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7400119B2 (ja) 2021-04-06 2023-12-18 グーグル エルエルシー 地理空間情報に基づいたリソースの利用

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000099883A (ja) * 1999-10-13 2000-04-07 Denso Corp 交通情報表示装置
JP2004037149A (ja) * 2002-07-01 2004-02-05 Mazda Motor Corp 経路誘導装置、経路誘導方法、及び、経路誘導用プログラム
JP2005182307A (ja) * 2003-12-17 2005-07-07 Denso Corp 車両運転支援装置
JP2008082886A (ja) * 2006-09-27 2008-04-10 Matsushita Electric Ind Co Ltd 案内標識表示装置及び案内標識表示方法
JP2009110394A (ja) * 2007-10-31 2009-05-21 Equos Research Co Ltd 道路標識表示装置
JP2015170249A (ja) * 2014-03-10 2015-09-28 株式会社デンソーアイティーラボラトリ 安全確認判定装置、及び運転支援装置
JP2017111469A (ja) * 2015-12-14 2017-06-22 富士通株式会社 道路標識視認判定システム、道路標識視認判定方法、及びプログラム

Also Published As

Publication number Publication date
JPWO2020183652A1 (ja) 2021-09-13
JP7105985B2 (ja) 2022-07-25

Similar Documents

Publication Publication Date Title
JP6330903B2 (ja) 情報呈示装置及び情報呈示方法
CN107848416B (zh) 显示控制装置、显示装置及显示控制方法
US9589194B2 (en) Driving assistance device and image processing program
EP2936065B1 (fr) Système pour un véhicule
US8970451B2 (en) Visual guidance system
US10181308B2 (en) System and method for controlling the luminosity of a head-up display and display using said system
JP5689872B2 (ja) 車両の周辺監視装置
JP5299026B2 (ja) 車両用表示装置
KR20180056867A (ko) 디스플레이 장치 및 그의 동작 방법
US20110228980A1 (en) Control apparatus and vehicle surrounding monitoring apparatus
JP5286035B2 (ja) 車速制御装置
JP2017111469A (ja) 道路標識視認判定システム、道路標識視認判定方法、及びプログラム
JP6354804B2 (ja) 視界制御装置
KR101976106B1 (ko) 정보제공을 위한 차량용 통합 헤드업디스플레이장치
CN105224272B (zh) 一种图像显示方法及汽车显示装置
CN106500716A (zh) 车辆导航投影系统及其方法
US10946744B2 (en) Vehicular projection control device and head-up display device
JP6625480B2 (ja) 表示システム
WO2020183652A1 (fr) Dispositif d'aide à la conduite
US10474912B2 (en) Vehicle display controller, vehicle display system, vehicle display control method, and non-transitory storage medium
US11828947B2 (en) Vehicle and control method thereof
JP2007280203A (ja) 情報提示装置、自動車、及び情報提示方法
JP4026598B2 (ja) 車両用画像表示装置
CN113689358A (zh) 车辆挡风玻璃图像增强显示方法、电子设备、存储介质及玻璃
WO2022044396A1 (fr) Dispositif de commande de reconnaissance d'objet et procédé de reconnaissance d'objet

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19918971

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021504710

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19918971

Country of ref document: EP

Kind code of ref document: A1