WO2019163315A1 - Information processing device, imaging device, and imaging system - Google Patents

Information processing device, imaging device, and imaging system

Info

Publication number
WO2019163315A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
brightness
image
luminance
information processing
Prior art date
Application number
PCT/JP2019/000364
Other languages
English (en)
Japanese (ja)
Inventor
山本 英明
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社
Publication of WO2019163315A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present technology relates to an information processing apparatus, an imaging apparatus, and an imaging system that can be applied to a vehicle-mounted camera or the like.
  • Patent Document 1 describes an obstacle detection device using a stereo camera that photographs the front of a vehicle through a front window.
  • the stereo camera is composed of two cameras that respectively capture a reference image and a comparative image.
  • feature points are detected from the reference image, and corresponding points in the comparison image corresponding to the feature points are searched.
  • the search range for the corresponding points is set based on the amount of parallax of each camera with respect to the front window and the dashboard reflected on the front window.
  • a set of feature points having corresponding points is extracted from the reference image as a reflected image on the front window. This makes it possible to improve obstacle detection accuracy.
  • an object of the present technology is to provide an information processing apparatus, an imaging apparatus, and an imaging system that can improve the accuracy of sensing using an image captured from inside a vehicle.
  • an information processing apparatus includes an acquisition unit, a detection unit, and a correction unit.
  • the acquisition unit acquires an external image of the vehicle photographed through a window glass of the vehicle.
  • the detection unit detects brightness information relating to the brightness of an object existing inside the vehicle and reflected on the window glass.
  • the correction unit corrects the external image of the vehicle based on the detected brightness information.
  • an external image photographed through a window glass of a vehicle is acquired. Also, brightness information relating to the brightness of the object inside the vehicle reflected on the window glass is detected. Then, the external image is corrected based on the brightness information of the object. By using the corrected external image, it is possible to improve the accuracy of sensing using an image taken from inside the vehicle.
  • the brightness information may include information on at least one of illuminance, luminance, and reflectance of the object. This makes it possible to accurately detect the brightness of the object and improve the correction accuracy of the external image. As a result, sensing accuracy can be sufficiently improved.
  • the target object may include at least one of a dashboard, an interior part, and a mounted object placed inside the vehicle. Thereby, for example, it becomes possible to reduce the influence of the reflection of various objects existing inside the vehicle on the window glass.
  • the window glass may include at least one of a front window glass, a side window glass, and a rear window glass. Accordingly, it is possible to correct the front, side, and rear external images of the vehicle taken from the inside of the vehicle, and it is possible to sense various directions with high accuracy.
  • the correction unit may correct the luminance of the external image based on the brightness information of the object. Thereby, for example, it is possible to easily remove the reflection of the object from the external image, and it is possible to easily avoid, for example, a sensing error associated with the reflection.
  • the correction unit may calculate a luminance change amount of the external image due to the reflection of the object through the window glass based on the brightness information. For example, by using the luminance change amount, it is possible to remove the reflection of the external image with high accuracy. As a result, sensing errors and the like can be sufficiently avoided.
  • the detection unit may detect the brightness information based on an output of a sensor unit that measures a parameter related to the brightness of the object. Thereby, for example, the brightness of the object can be detected in detail.
  • the sensor unit may include an illuminance sensor that measures the illuminance of the object. Thereby, it becomes possible to easily measure the brightness of the object irradiated with external light or the like.
  • the correction unit may determine a region in which the target object is reflected in the external image and correct a luminance of a region in which it is determined that the target object is reflected based on illuminance of the target object. Thereby, it is possible to accurately correct the reflection of the object. As a result, sensing errors and the like can be sufficiently avoided.
  • the sensor unit may include a plurality of the illuminance sensors arranged in the vehicle according to the intensity of light reflected toward the window glass by the object.
  • the sensor unit may include a position sensor capable of measuring a parameter relating to the brightness for each of a plurality of measurement points on the object and positions of the plurality of measurement points.
  • the detection unit may detect the luminance for each of the plurality of measurement points on the object based on the output of the position sensor. Thereby, it is possible to detect the brightness of the object in detail.
  • the position sensor may be a range sensor that measures reflectance at each of the plurality of measurement points. Thereby, it becomes possible to measure the position of each measurement point of the object and the reflectance at each measurement point with high accuracy.
  • the sensor unit may include an illuminance sensor that measures the illuminance of the object.
  • the detection unit may detect the luminance for each of the plurality of measurement points based on the illuminance of the object and the reflectance for each of the plurality of measurement points.
  • the position sensor may be a distance image sensor that measures luminance at each of the plurality of measurement points. Thereby, it becomes possible to measure the position of each measurement point of the object and the luminance at each measurement point with high accuracy.
  • the correction unit may convert the positions of the plurality of measurement points of the object measured by the position sensor into positions in the external image. This makes it possible to correct the external image with high accuracy based on, for example, the luminance for each measurement point. As a result, it is possible to realize highly accurate sensing using an external image.
  • the information processing apparatus may further include a state detection unit that detects at least one of a state outside the vehicle and a state inside the vehicle based on the output of the position sensor.
  • thereby, the position sensor can be used not only for detecting the brightness information but also for detecting the state inside and outside the vehicle. As a result, the number of parts can be reduced, and the manufacturing cost and the like can be suppressed.
  • the external image may be taken by an imaging unit mounted inside the vehicle.
  • the object may include a first region that reflects light toward the imaging unit via the window glass along a first optical path, and a second region that reflects light along a second optical path different from the first optical path.
  • the detection unit may detect a luminance difference between the first and second regions in the external image. Thereby, it is possible to easily detect the luminance change amount of the external image caused by the reflection of the object. As a result, the external image can be easily corrected.
  • An imaging device includes an imaging unit, an acquisition unit, a detection unit, and a correction unit.
  • the imaging unit is mounted inside the vehicle.
  • the acquisition unit acquires an external image of the vehicle photographed by the imaging unit through the window glass of the vehicle.
  • the detection unit detects brightness information relating to the brightness of an object existing inside the vehicle and reflected on the window glass.
  • the correction unit corrects the external image of the vehicle based on the detected brightness information.
  • An imaging system includes an imaging unit, an acquisition unit, a detection unit, and a correction unit.
  • the imaging unit is mounted inside the vehicle.
  • the acquisition unit acquires an external image of the vehicle photographed by the imaging unit through the window glass of the vehicle.
  • the detection unit detects brightness information relating to the brightness of an object existing inside the vehicle and reflected on the window glass.
  • the correction unit corrects the external image of the vehicle based on the detected brightness information.
  • FIG. 1 is a schematic diagram illustrating a configuration example of a vehicle on which a control unit according to the first embodiment of the present technology is mounted.
  • FIG. 1 is a schematic diagram when the front side of the vehicle 100 is viewed from the side surface of the vehicle 100.
  • the interior space 11 inside the vehicle 100 is schematically illustrated in a region separated by a dotted line.
  • the vehicle 100 includes, for example, a driving assistance function that assists driving, an automatic driving function that enables automatic driving to a destination, and the like.
  • the vehicle 100 includes a windshield 20, a dashboard 30, a camera 40, a brightness sensor 50, and a light 12. The vehicle 100 also has a display device 13 and a control unit 60, which are not shown in FIG. 1 (see FIG. 2).
  • the windshield 20 is a window glass disposed in a window (front window) provided in front of the vehicle 100.
  • Windshield 20 has an inner surface 21 directed toward the inside of vehicle 100 (in-vehicle space 11) and an outer surface 22 directed toward the outside of vehicle 100. As shown in FIG. 1, the windshield 20 is disposed so as to be inclined so that the inner surface 21 faces the lower side of the vehicle 100.
  • the windshield 20 is made of a transparent material.
  • a passenger in the vehicle interior space 11 can visually recognize the front of the vehicle 100 through the windshield 20.
  • the specific configuration of the windshield 20 is not limited.
  • any transparent member that can be used as the window glass of the vehicle 100 may be used as the windshield 20.
  • in the present embodiment, the windshield 20 corresponds to the window glass.
  • the dashboard 30 is connected to the lower end of the windshield 20 and is disposed in front of the interior space 11.
  • the dashboard 30 functions as a partition plate between the engine room of the vehicle 100 and the driver's seat (in-vehicle space 11), for example.
  • the dashboard 30 is appropriately provided with instruments such as a speedometer and a fuel gauge, and a storage unit.
  • the specific configuration of the dashboard 30 is not limited, and may be appropriately designed according to the design and use of the vehicle 100, for example.
  • the dashboard 30 is an example of an interior part.
  • the dashboard 30 has a first surface 31a and a second surface 31b.
  • the first surface 31 a is connected to the lower end of the windshield 20 and is disposed substantially parallel to the front-rear direction and the left-right direction of the vehicle 100. Accordingly, the first surface 31a is, for example, a substantially horizontal surface disposed on the back side of the dashboard 30 when viewed from the passenger.
  • the second surface 31b is disposed below the side of the first surface 31a opposite to the side connected to the windshield 20 so that the second surface 31b faces the rear upper side of the vehicle 100. Therefore, the second surface 31b is, for example, an inclined surface disposed on the front side of the dashboard 30 when viewed from the passenger.
  • the light irradiated on the dashboard 30 (the first surface 31a and the second surface 31b) (hereinafter referred to as irradiation light 23) is reflected by the dashboard 30.
  • a part of the light reflected by the dashboard 30 travels toward the windshield 20.
  • the light reflected by the dashboard 30 toward the windshield 20 will be referred to as first reflected light 32.
  • the irradiation light 23 applied to the dashboard 30 is not limited to the sunlight 24.
  • the present technology can be applied even when light from an outside lamp such as a streetlight, light from a lamp in a tunnel, light from the vehicle interior light, or the like becomes the irradiation light 23.
  • FIG. 1 schematically shows the first reflected light 32 reflected at each point on the first surface 31a of the dashboard 30 using solid arrows.
  • illustration of the first reflected light 32 reflected by the second surface 31b is omitted.
  • the first reflected light 32 is reflected toward the windshield 20 also from the second surface 31b.
  • the first surface 31a and the second surface 31b have different arrangement angles (inclinations) with respect to the windshield 20. Accordingly, the irradiation light 23 (sunlight 24) is incident on the first surface 31a and the second surface 31b at different incident angles. For this reason, the intensity of the first reflected light 32 reflected toward the windshield 20 differs between the two surfaces.
  • the dashboard 30 is an example of an object that exists inside the vehicle and is reflected on the window glass.
  • the camera 40 is mounted inside the vehicle 100 and captures an external image of the vehicle 100 through the windshield 20 of the vehicle 100. As shown in FIG. 1, the camera 40 is disposed in the upper part of the vehicle interior space 11 toward the front of the vehicle 100. By arranging the camera 40 in the upper part of the vehicle interior space 11, it is possible to ensure a sufficient field of view in front of the vehicle 100. Note that the position and orientation of the camera 40 are not limited.
  • as the camera 40, for example, a digital camera including an image sensor such as a CMOS (Complementary Metal-Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor is used.
  • the specific configuration of the camera 40 is not limited, and for example, an RGB camera that captures a color image, a monochrome camera that captures a monochrome image, or the like may be used as appropriate.
  • the present technology is not limited to a monocular camera, and the present technology can also be applied when, for example, a stereo camera or the like is used.
  • the camera 40 corresponds to an imaging unit mounted inside the vehicle.
  • transmitted light 41 that enters the windshield 20 from the front of the vehicle 100 and passes through the windshield enters the camera 40.
  • the transmitted light 41 incident on the camera 40 is received by an image sensor or the like.
  • an external image (front image) of the vehicle 100 can be taken.
  • light reflected by the windshield 20 may enter the camera 40.
  • a part of the first reflected light 32 reflected toward the windshield 20 by the dashboard 30 is reflected toward the camera 40 by the windshield 20.
  • the light reflected toward the camera 40 by the windshield 20 will be referred to as second reflected light 33.
  • the image taken by the camera 40 includes an image of the dashboard 30 reflected on the windshield 20.
  • an image in which the dashboard 30 is reflected is taken.
  • the image photographed by the camera 40 may be an image in which the dashboard 30 or the like is reflected through the windshield 20 (see FIG. 5A).
  • the brightness sensor 50 measures a parameter related to the brightness of the dashboard 30.
  • the illuminance of the dashboard 30 is measured as a parameter relating to brightness.
  • an illuminance sensor 51 that measures the illuminance of the dashboard 30 is used.
  • the illuminance is a value representing the brightness of light that illuminates the surface of an object, for example. Therefore, it can be said that the brightness of the sunlight 24 incident on the dashboard 30 is measured by measuring the illuminance. Thereby, it is possible to easily evaluate the brightness of the dashboard 30.
  • the brightness sensor 50 includes a first illuminance sensor 51a and a second illuminance sensor 51b.
  • the first illuminance sensor 51a is arranged on the first surface 31a of the dashboard 30.
  • the second illuminance sensor 51b is arranged on the second surface 31b of the dashboard 30.
  • Each illuminance sensor 51 is arranged at an arbitrary position such as the left and right ends of each surface.
  • the intensity of the first reflected light 32 reflected toward the windshield 20 by the first surface 31a and the second surface 31b has different values.
  • by using the first illuminance sensor 51a and the second illuminance sensor 51b, it is possible to appropriately measure the illuminance of each surface. Thereby, for example, it is possible to appropriately evaluate the degree of reflection due to the first reflected light 32 reflected from each surface.
  • thus, the brightness sensor 50 includes a plurality of illuminance sensors 51 arranged inside the vehicle 100 according to the intensity of the first reflected light 32 reflected toward the windshield 20 by the dashboard 30. Note that the number and arrangement positions of the illuminance sensors 51 are not limited, and two or more illuminance sensors 51 may be appropriately arranged according to the shape of the dashboard 30, for example. Of course, a single illuminance sensor 51 may be used. In the present embodiment, the brightness sensor 50 corresponds to a sensor unit.
  • the lights 12 are disposed, for example, on both the left and right sides at the front and rear of the vehicle 100.
  • the light 12 includes a headlamp that illuminates the front of the vehicle 100, an auxiliary headlamp (fog lamp), a vehicle width lamp (small lamp) that indicates the vehicle width and the like, and a taillight (tail lamp) disposed at the rear of the vehicle 100.
  • a headlamp disposed in front of the vehicle 100 is illustrated as an example of the light 12.
  • the display device 13 is, for example, a device that is arranged in the interior space 11 and can output visual information and the like to the passenger.
  • the display device 13 includes, for example, an instrument panel and its instruments, an interior lamp, a display such as a monitor, switch backlights, and the like.
  • the type of the display device 13 is not limited; for example, any element or device whose display brightness can be changed may be used as the display device 13.
  • FIG. 2 is a block diagram illustrating a configuration example of the control unit 60.
  • the control unit 60 is disposed, for example, at a predetermined position inside the vehicle 100 and is appropriately connected to each block provided in the vehicle 100.
  • the control unit 60 corresponds to the information processing apparatus according to this embodiment, and includes hardware necessary for a computer such as a CPU, a RAM, and a ROM.
  • the information processing method according to the present technology is executed when the CPU loads a program according to the present technology recorded in advance in the ROM into the RAM and executes the program.
  • the specific configuration of the control unit 60 is not limited, and devices such as a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array), or another ASIC (Application Specific Integrated Circuit), may be used.
  • the control unit 60 includes an image acquisition unit 61, a brightness detection unit 62, a correction unit 63, a storage unit 64, and a light emission control unit 65.
  • each functional block is configured by the CPU of the control unit 60 executing a predetermined program.
  • the image acquisition unit 61 acquires an external image of the vehicle 100 photographed through the windshield 20 of the vehicle 100. Specifically, an external image of the vehicle 100 captured by the camera 40 through the windshield 20 is read. The read external image is output to the correction unit 63.
  • the image acquisition unit 61 corresponds to an acquisition unit.
  • the brightness detection unit 62 detects brightness information related to the brightness of the dashboard 30. Specifically, the brightness detection unit 62 reads an output result measured by the brightness sensor 50 and detects brightness information based on the read output of the brightness sensor 50.
  • in the present embodiment, the illuminance of the first surface 31a measured by the first illuminance sensor 51a and the illuminance of the second surface 31b measured by the second illuminance sensor 51b are used as the brightness information. That is, the illuminance of the horizontal surface (first surface 31a) of the dashboard 30 and the illuminance of the inclined surface (second surface 31b) are detected as data representing the brightness of the dashboard 30. The detected illuminance of each surface is output to the correction unit 63.
  • the brightness detection unit 62 corresponds to a detection unit.
  • the correction unit 63 corrects the external image of the vehicle 100 based on the detected brightness information.
  • the brightness of the external image is corrected based on the brightness information of the dashboard 30.
  • the brightness of the external image is corrected so that the image of the dashboard 30 (the reflected image) reflected in the external image is removed.
  • the process of correcting the luminance of the external image is executed for each pixel, for example.
  • the process for correcting the luminance of the external image will be described in detail later.
  • the storage unit 64 includes, for example, an HDD (Hard Disk Drive) or SSD (Solid State Drive) provided in the control unit 60.
  • the storage unit 64 stores various data used for external image correction processing.
  • the storage unit 64 stores an area map M (u, v) representing a reflection area where the reflection of the dashboard 30 occurs (see FIG. 5B).
  • u and v are values representing the coordinates of the pixels in the horizontal and vertical directions of an image (external image) taken by the camera 40.
  • in the area map M(u, v), for example, 1 is stored when the pixel specified by the coordinates (u, v) is included in the reflection area, and 0 is stored when it is not. This makes it possible to easily determine the reflection area.
  • the specific configuration of the area map M (u, v) is not limited.
  • the light reflected by the dashboard 30 is, for example, diffused light emitted in various directions. Accordingly, the reflection area of the dashboard 30 is substantially the same even when the direction in which the sunlight 24 is irradiated changes. That is, it can be said that the reflection area of the dashboard 30 is an area determined by the positional relationship among the dashboard 30, the windshield 20, and the camera 40.
  • an image in which the dashboard 30 is reflected is taken by the camera 40 at the time of factory shipment or maintenance. Based on this image, it is possible to generate an area map M (u, v) representing the reflection area.
  • the area map M (u, v) may be generated by an arbitrary method capable of converting the reflection area and the like into data.
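  • The following is a minimal sketch, not taken from the patent, of how such an area map M(u, v) might be generated and queried. The thresholding of a factory-shipment calibration image against a reflection-free reference, and all function names, are illustrative assumptions.

```python
import numpy as np

def build_area_map(calib_image: np.ndarray, reference: np.ndarray,
                   threshold: float = 10.0) -> np.ndarray:
    """Return M(u, v): 1 where the calibration image (taken with the
    dashboard reflection present) is brighter than a reflection-free
    reference by more than the threshold, 0 elsewhere."""
    diff = calib_image.astype(np.float32) - reference.astype(np.float32)
    return (diff > threshold).astype(np.uint8)

def in_reflection_area(area_map: np.ndarray, u: int, v: int) -> bool:
    # NumPy arrays are indexed [row, column], i.e. [v, u].
    return bool(area_map[v, u] == 1)
```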
  • the storage unit 64 stores various parameters necessary for correcting the luminance based on the illuminance of the dashboard 30. These parameters are read as appropriate when the correction unit 63 corrects the external image. This point will be described in detail later.
  • the storage unit 64 stores data and the like necessary for the operation of each unit of the control unit 60.
  • the light emission control unit 65 controls the light intensity of the light 12 and the display device 13 mounted on the vehicle 100 based on the output of the illuminance sensor 51.
  • the control of the light emission intensity includes switching of light emission ON / OFF, stepwise intensity control of the light emission intensity, and the like.
  • the illuminance sensor 51 for example, one or both of the first and second illuminance sensors 51a and 51b are appropriately used.
  • the light emission control unit 65 appropriately controls turning on / off of the lights 12 such as a small lamp, a head lamp, and a tail lamp according to the illuminance.
  • the display brightness of the display and the brightness of backlights such as instruments and switches are appropriately controlled according to the illuminance.
  • thereby, it is possible to automatically control the brightness of the light 12 and the display device 13 when it becomes dark in the evening or at night, in rainy or cloudy weather, or when the brightness changes in a tunnel or the like.
  • the illuminance sensor 51 is used not only to correct the external image but also to control the light 12 and the display device 13. Thereby, the number of parts can be suppressed, and the manufacturing cost of the vehicle 100 can be suppressed.
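  • As an illustration of the auto-light behaviour described above, the sketch below switches a headlamp based on the measured illuminance with simple hysteresis. The threshold values and function name are placeholder assumptions, not values taken from the patent.

```python
# Illustrative thresholds in lux; real values would be tuned per vehicle.
HEADLAMP_ON_LUX = 1000.0
HEADLAMP_OFF_LUX = 2000.0   # higher than the ON threshold to avoid flicker

def control_headlamp(illuminance_lux: float, headlamp_on: bool) -> bool:
    """Return the new headlamp state for the measured ambient illuminance."""
    if not headlamp_on and illuminance_lux < HEADLAMP_ON_LUX:
        return True
    if headlamp_on and illuminance_lux > HEADLAMP_OFF_LUX:
        return False
    return headlamp_on
```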
  • the control unit 60 may be appropriately provided with a function block that controls each unit of the vehicle 100.
  • the camera 40 and the control unit 60 mounted on the vehicle 100 constitute an imaging apparatus according to the present technology. Further, the camera 40 and the control unit 60 mounted on the vehicle 100 function as an imaging system according to the present technology.
  • FIG. 3 is a schematic diagram for explaining the reflection by the windshield 20.
  • incident light 2 emitted from the subject 1 outside the vehicle 100 is incident on the outer surface 22 of the windshield 20.
  • Part of the incident light 2 from the subject 1 is reflected by the windshield 20.
  • Another part enters the vehicle interior space 11 as transmitted light 41 that passes through the windshield 20 and enters the camera 40.
  • the luminance of the transmitted light 41 is denoted as L_trn.
  • on the dashboard 30, the irradiation light 23 such as the sunlight 24 is reflected, and a part of the light travels toward the windshield 20 as the first reflected light 32.
  • a part of the first reflected light 32 is reflected by the windshield 20 and enters the camera 40 as the second reflected light 33.
  • the luminance of the second reflected light 33 is denoted as L_ref. It can be said that the luminance L_ref of the second reflected light 33 represents the brightness of the reflection as viewed from the image sensor of the camera 40.
  • the transmitted light 41 transmitted through the windshield 20 and the second reflected light 33 reflected by the windshield 20 enter the camera 40.
  • in the region of the image sensor (camera 40) where the second reflected light 33 is incident, the sum of the luminance L_trn of the transmitted light 41 and the luminance L_ref of the second reflected light 33 is detected.
  • a region where the second reflected light 33 is incident is a region where the dashboard 30 is reflected (a reflection region).
  • FIG. 4 is a graph showing the luminance detected in the reflection area.
  • the horizontal axis of the graph represents the luminance L_trn of the transmitted light 41, and the vertical axis represents the luminance L_cam detected in the reflection area. The luminance is represented by an 8-bit gradation (0 to 255).
  • when there is no reflection, the luminance detected by the camera 40 follows a straight line with a slope of 1 passing through the origin (0, 0) (dotted line 42 in the figure). From another viewpoint, it can be said that the characteristic represented by the dotted line 42 is an ideal camera characteristic in which there is no reflection by the windshield 20.
  • in the reflection region, the luminance L_cam detected by the camera 40 is shifted brighter by the value of the luminance L_ref of the second reflected light 33 (solid line 43 in the figure).
  • as a result, the graph of the solid line 43 is a straight line whose slope is 1 and whose intercept is the luminance L_ref of the second reflected light 33. That is, the luminance detected by the camera 40 becomes brighter due to the reflection by the windshield 20.
  • due to this luminance shift, there can be a saturation region 44 in which the luminance L_cam detected by the camera 40 is saturated.
  • in the saturation region 44, it is conceivable that the luminance of the image reaches the maximum value of 255 and the data of the transmitted light 41 is lost due to overexposure.
  • for this reason, a process of reducing the exposure (exposure time, sensitivity, etc.) of the camera 40 by the shift amount (L_ref) is executed so that the brightness of the subject 1 (transmitted light 41) does not reach the saturation region 44. Thereby, it is possible to suppress the generation of the saturated region 44 and to avoid the loss of the data of the transmitted light 41.
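  • A minimal sketch of this exposure reduction, assuming a linear sensor response and the 8-bit range of FIG. 4; the scaling rule and function name are illustrative, not the patent's implementation.

```python
MAX_LEVEL = 255.0  # 8-bit sensor range, as in the graph of FIG. 4

def reduced_exposure(current_exposure: float, l_ref: float) -> float:
    """Scale the exposure down so that a subject that would just reach
    MAX_LEVEL without reflection stays at MAX_LEVEL - l_ref, leaving
    headroom for the reflection offset l_ref."""
    headroom = max(MAX_LEVEL - l_ref, 0.0) / MAX_LEVEL
    return current_exposure * headroom
```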
  • the luminance L_ref of the second reflected light 33 is a value corresponding to the brightness of the dashboard 30.
  • the luminance L_ref of the second reflected light 33 is expressed by the following equation using the illuminance E that represents the brightness of the dashboard 30.
  • L_ref = α·E + β   (1)
  • α and β are coefficients determined according to the material and shape of the object (dashboard 30) reflected on the windshield 20 and the reflectance of the windshield 20.
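  • A small worked example of equation (1). The coefficient and illuminance values are placeholders; the patent only states that α and β depend on the material and shape of the reflected object and on the reflectance of the windshield 20.

```python
def reflection_luminance(illuminance_e: float, alpha: float, beta: float) -> float:
    """Equation (1): L_ref = alpha * E + beta, converting the measured
    illuminance E of the dashboard into the luminance of its reflection."""
    return alpha * illuminance_e + beta

# Placeholder calibration values for the two dashboard surfaces.
l_ref_a = reflection_luminance(illuminance_e=50_000.0, alpha=2.0e-4, beta=1.5)
l_ref_b = reflection_luminance(illuminance_e=30_000.0, alpha=1.2e-4, beta=0.8)
```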
  • FIG. 5 is a schematic diagram showing an example of external image correction processing.
  • FIG. 6 is a flowchart illustrating an example of external image correction processing. The process shown in FIG. 6 is a loop process that is continuously executed during the operation of the vehicle 100, for example.
  • the camera 40 captures an external image in front of the vehicle 100 through the windshield 20 (step 101).
  • the captured external image is read by the image acquisition unit 61.
  • the luminance value of the external image is described as luminance data X (u, v).
  • the luminance data X (u, v) is data representing the luminance value of a pixel existing at coordinates (u, v) in the external image, for example.
  • FIG. 5A is a schematic diagram illustrating an example of an external image photographed by the camera 40.
  • the external image 45 includes a landscape in front of the vehicle 100 that is captured by detecting the transmitted light 41 that has passed through the windshield 20.
  • in an area where the second reflected light 33 reflected by the windshield 20 is detected, a reflection region 46 in which an image of the dashboard 30 (the first surface 31a and the second surface 31b) is captured is formed.
  • FIG. 5A schematically illustrates a first reflection area 46a in which the first surface 31a of the dashboard 30 is reflected, and a second reflection area 46b in which the second surface 31b is reflected.
  • the reflection area 46 is an area where the brightness (luminance) is increased as compared with other areas where there is no reflection. It should be noted that the amount of luminance shift in the first reflection area 46a and the second reflection area 46b is a value corresponding to the material and arrangement of the first surface 31a and the second surface 31b.
  • the illuminance of the dashboard 30 is measured by the illuminance sensor 51 arranged on the dashboard 30 (step 102).
  • the illuminance E_a of the first surface 31a is measured by the first illuminance sensor 51a disposed on the first surface 31a.
  • the illuminance E_b of the second surface 31b is measured by the second illuminance sensor 51b disposed on the second surface 31b.
  • the measured illuminance of each surface (E_a and E_b) is read by the brightness detection unit 62.
  • the correction unit 63 calculates the amount of luminance shift of the external image 45 due to the reflection of the dashboard 30 through the windshield 20 (step 103). That is, the brightness of the reflection of the dashboard 30 in the external image 45 is calculated.
  • the luminance shift amount of the external image 45 corresponds to the luminance change amount of the external image.
  • the amount of luminance shift of the external image 45 in each reflection region 46 is calculated using the equation (1). It can be said that this process is an illuminance-to-luminance conversion that converts the illuminance E of the dashboard 30 into the luminance L_ref of the second reflected light 33. Accordingly, the coefficients α and β in the equation (1) are calibration values for the illuminance-to-luminance conversion. The coefficients α and β are appropriately read from the storage unit 64.
  • for example, the luminance L_ref_a = α_a·E_a + β_a of the second reflected light 33 in the first reflection region 46a and the luminance L_ref_b = α_b·E_b + β_b in the second reflection region 46b are calculated.
  • FIG. 7 is a flowchart showing an example of a process for correcting the brightness of the reflection area 46.
  • FIG. 7 shows an example of internal processing in step 104 shown in FIG.
  • the correction unit 63 reads the external image 45 (luminance data X(u, v)), the area map M(u, v), and the luminance shift amount L_ref in the reflection area 46 (step 201). For example, as the luminance shift amount L_ref, the shift amount L_ref_a in the first reflection region 46a and the shift amount L_ref_b in the second reflection region 46b are read, respectively.
  • FIG. 5B is a schematic diagram illustrating an example of the area map M (u, v).
  • FIG. 5B schematically shows a first area 47a representing the first reflection area 46a and a second area 47b representing the second reflection area 46b.
  • the area map M (u, v) is appropriately configured so that, for example, the first and second areas 47a and 47b can be distinguished from each other.
  • the luminance shift amounts (L_ref_a and L_ref_b) in each reflection region 46 are shown using a gray scale. Therefore, it can be said that the diagram shown in FIG. 5B shows the reflection luminance of the dashboard 30 (the first surface 31a and the second surface 31b) in the external image 45.
  • the brightness shown in FIG. 5B is superimposed on the scenery in front of the vehicle 100, so that reflection in the external image 45 occurs.
  • correction data Y (u, v) for the external image 45 and a variable n for designating each pixel of the external image 45 are prepared (step 202).
  • the correction data Y (u, v) is data set corresponding to the luminance data X (u, v), and is data that stores the result of the correction process.
  • when the number of pixels in the horizontal direction of the external image 45 is W and the number of pixels in the vertical direction is H, the coordinate u is an integer with 1 ≤ u ≤ W and the coordinate v is an integer with 1 ≤ v ≤ H.
  • an integer index from 1 to W × H representing each pixel is set for each of the W × H pixels.
  • the variable n is used as a variable that specifies the integer index of each pixel.
  • whether or not a pixel is included in the reflection region 46 is determined based on the area map M(u, v).
  • specifically, the coordinates (u, v) of the pixel designated by the variable n are referred to, and it is determined whether or not the position represented by those coordinates is included in the reflection area 46 of the area map M(u, v).
  • it is also possible to determine whether the pixel specified by the variable n is included in the first reflection area 46a (first area 47a) or the second reflection area 46b (second area 47b) of the area map M(u, v).
  • when the pixel is determined to be included in the reflection region 46, the luminance of the external image 45 is corrected (step 205). Specifically, a value obtained by subtracting the luminance shift amount (L_ref) in the target reflection area 46 from the luminance data X(u, v) is calculated as the correction data Y(u, v).
  • next, it is determined whether or not the processing has been executed for all the pixels of the external image 45 (step 207). Specifically, it is determined whether or not the variable n that designates a pixel satisfies n ≥ W × H. When the variable n is smaller than W × H (No in step 207), it is determined that unprocessed pixels remain, and the process returns to step 203 to execute the processing for the next pixel.
  • in this way, the area in which the dashboard 30 is reflected in the external image 45 is determined, and the luminance of the area determined to contain the reflection of the dashboard 30 is corrected based on the illuminance of the dashboard 30. It can be said that this process removes the luminance shift amount due to the reflection shown in FIG. 5B from the external image 45 shown in FIG. 5A. Thereby, it is possible to accurately correct the reflection of the dashboard 30 and the like.
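  • A minimal sketch of this per-region correction (steps 201 to 208). It is vectorized with NumPy instead of the explicit per-pixel loop, and the encoding of the area map (1 for the first reflection region, 2 for the second) is an assumption; the patent leaves the map format open.

```python
import numpy as np

def correct_external_image(x: np.ndarray, area_map: np.ndarray,
                           l_ref_a: float, l_ref_b: float) -> np.ndarray:
    """Compute Y(u, v) by subtracting the luminance shift amount from the
    pixels of X(u, v) that lie inside the reflection regions of M(u, v)."""
    y = x.astype(np.float32)
    y[area_map == 1] -= l_ref_a   # first reflection region 46a
    y[area_map == 2] -= l_ref_b   # second reflection region 46b
    return np.clip(y, 0, 255).astype(np.uint8)
```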
  • when it is determined that the processing has been completed for all the pixels, a corrected image is constructed based on the correction data Y(u, v) (step 208). For example, a predetermined type of corrected image is generated using the luminance data recorded in the correction data Y(u, v).
  • FIG. 5C is a schematic diagram illustrating an example of a corrected image.
  • FIG. 5C shows a corrected image 48 obtained by removing the luminance L_ref of each reflection area 46 shown in FIG. 5B from the external image 45 shown in FIG. 5A.
  • in the corrected image 48, the brightness of the areas brightened by the reflection of the dashboard 30 (the first and second reflection areas 46a and 46b) is corrected, and the original luminance L_trn of the transmitted light 41 is reproduced.
  • the scenery in front of the vehicle 100 can be appropriately represented.
  • the corrected image 48 (correction data Y (u, v)) is output to a recognizer or the like (step 105).
  • the recognizer is, for example, a processing block or a processing device that performs image sensing or the like that detects an object around the vehicle 100 from input image data.
  • in the recognizer, for example, processing for detecting moving objects such as pedestrians, bicycles, and other vehicles, processing for recognizing signals and signs, and the like are executed. Further, for example, a process of detecting the distance to an object using stereo parallax may be executed.
  • the specific configuration of the recognizer is not limited, and the present technology can be applied to any recognizer that performs image sensing or the like.
  • the sensing error is an error in which an object in an image is erroneously detected or cannot be detected by, for example, the dashboard 30 being reflected.
  • erroneous detection of stereo parallax accompanying reflection is suppressed. As a result, it is possible to sufficiently improve the accuracy of image sensing.
  • in step 205 in FIG. 7, the luminance shift amount L_ref due to the reflection is uniformly subtracted from the luminance of the external image 45.
  • alternatively, a process of subtracting a weighted shift amount L_ref may be executed for each pixel included in the reflection area 46.
  • this is because the brightness of the reflection may differ depending on the location on the dashboard 30.
  • the distribution of brightness due to such a difference in material (reflectance, etc.) and shape (reflection angle, etc.) can be measured in advance. Further, according to the brightness distribution, for example, weighting parameters at each point of the dashboard 30 can be calculated.
  • for example, a weighting parameter is set for each pixel of the area map M(u, v), and the weighted shift amount L_ref is subtracted for each pixel. That is, the weighting parameter is set so that the shift amount L_ref becomes large in portions where the reflection is strong and small in portions where the reflection is weak. As a result, the luminance shift of the external image 45 accompanying the reflection can be removed with high accuracy in pixel units, and the accuracy of image sensing can be greatly improved.
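  • A sketch of the weighted variant, assuming the weighting parameters measured in advance are stored as a per-pixel map in the range 0 to 1 (0 where there is no reflection, larger where the reflection is strong); this representation is an assumption, not specified in the patent.

```python
import numpy as np

def correct_with_weights(x: np.ndarray, weight_map: np.ndarray,
                         l_ref: float) -> np.ndarray:
    """Subtract a per-pixel weighted shift amount weight_map(u, v) * L_ref
    from the external image X(u, v)."""
    y = x.astype(np.float32) - weight_map * l_ref
    return np.clip(y, 0, 255).astype(np.uint8)
```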
  • the external image 45 photographed through the windshield 20 of the vehicle 100 is acquired. Further, brightness information relating to the brightness of the dashboard 30 inside the vehicle 100 reflected on the windshield 20 is detected. The external image 45 is corrected based on the brightness information of the dashboard 30. By using the corrected external image 45, it is possible to improve the accuracy of sensing using an image taken from inside the vehicle.
  • as a method for correcting the reflection in an image, a method of comparing two images taken with a stereo camera is conceivable. For example, by searching for feature points due to the reflection based on the parallax between the images, pixels in which the reflection occurs are identified. With this method, however, it may be difficult to correctly correct the luminance values of the pixels in which the reflection occurs. In addition, two cameras are required, and it is difficult to apply the method to a single camera, for example.
  • an illuminance sensor 51 that measures the illuminance E of the dashboard 30 to be reflected is used.
  • accordingly, the brightness of the dashboard 30 reflected by the windshield 20 (the luminance L_ref of the second reflected light 33) can be calculated with high accuracy.
  • the measurement of the illuminance E by the illuminance sensor 51 is executed in accordance with the timing at which the external image 45 is taken.
  • a corrected image can be generated based on the brightness (illuminance E) of the dashboard 30 when the reflection occurs.
  • even when the time of day, the weather, the traveling environment, and the like change, it is possible to appropriately correct the reflection, and the reliability of the apparatus can be improved.
  • in the present embodiment, the reflection area 46 of the external image 45 is determined using the area map M(u, v). This makes it possible to easily determine the region to be corrected.
  • by using the measurement result of the illuminance sensor 51, not only the correction of the reflection but also the control of the light 12 and the display device 13 mounted on the vehicle 100 is executed. As a result, the sensing accuracy of the recognizer can be improved, and the burden on the driver and the like can be sufficiently reduced. In recent years, vehicles equipped with an auto-light function using an illuminance sensor or the like have become widespread, and the installation rate of the illuminance sensor is expected to increase. For example, by using such an illuminance sensor 51, it is possible to provide an apparatus capable of correcting the reflection at low cost.
  • FIG. 8 is a schematic diagram illustrating a configuration example of a vehicle equipped with a control unit according to the second embodiment of the present technology.
  • FIG. 9 is a block diagram illustrating a configuration example of the control unit 260.
  • the vehicle 200 includes a windshield 20 and a dashboard 30.
  • the windshield 20 and the dashboard 30 are configured similarly to the windshield 20 and the dashboard 30 shown in FIG.
  • the illustration of the second surface 31b of the dashboard 30 shown in FIG. 1 is omitted.
  • a placement object 34 placed on the dashboard 30 is shown.
  • the dashboard 30 and the placement object 34 are examples of objects that exist inside the vehicle and are reflected on the window glass.
  • the vehicle 200 includes a camera 40, a brightness sensor 250, and a control unit 260.
  • the camera 40 has the same configuration as the camera 40 shown in FIGS. 1 and 2, for example.
  • the brightness sensor 250 measures parameters related to the brightness of the dashboard 30 and the object 34.
  • the brightness sensor 250 has a TOF camera 251 capable of detecting time of flight (TOF: Time of Flight).
  • the brightness of the dashboard 30 is measured by the TOF camera 251 as a parameter relating to brightness.
  • the TOF camera 251 is arranged in the vicinity of the camera 40 toward the dashboard 30. In FIG. 8, the photographing range of the TOF camera 251 is schematically illustrated using dotted lines.
  • FIG. 10 is a schematic diagram illustrating a configuration example of the TOF camera 251.
  • the TOF camera 251 includes an image sensor 252 and a TOF sensor 253.
  • the image sensor 252 and the TOF sensor 253 are arranged close to each other.
  • the TOF camera 251 corresponds to a distance image sensor.
  • the image sensor 252 captures a target luminance image.
  • the luminance image is, for example, an image in which the luminance value of each target point is detected, and includes a color image (RGB image), a monochrome image, and the like.
  • a digital camera equipped with an image sensor using a CCD, a CMOS, or the like is used.
  • the TOF sensor 253 measures the distance to the target.
  • the TOF sensor 253 includes a light receiving element (image sensor) having the same number of pixels as the image sensor 252, for example.
  • light is irradiated onto the object using a light emitting element (not shown), and the time until the light reflected at each point of the object is received by the light receiving element is measured. Thereby, it is possible to measure a distance image in which the distance to each target point is recorded.
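  • A minimal illustration of the time-of-flight relation used here: the measured round-trip time is converted to a distance by multiplying by the speed of light and halving. The modulation and demodulation details of an actual TOF sensor are omitted.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a measurement point: the light travels out and back,
    so the one-way distance is half of speed * time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a round trip of about 13.3 ns corresponds to roughly 2 m.
distance_m = tof_distance(13.3e-9)
```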
  • the image sensor 252 measures the luminance (luminance image) for each of the plurality of measurement points 35 on the object (dashboard 30 and mounted object 34) to be reflected.
  • in FIG. 8, the measurement points 35 on the placement object 34 are illustrated by black circles.
  • reflected light 25 irradiated with sunlight 24 or the like and reflected at the measurement point 35 is detected by the image sensor 252.
  • the distance (distance image) to the plurality of measurement points 35 on the object (dashboard 30 and mounted object 34) to be reflected is measured by the TOF sensor 253.
  • the distance to the measurement point 35 on the placement object 34 measured by the TOF sensor 253 is schematically illustrated.
  • the position of each measurement point 35 can be measured.
  • the position of the measurement point 35 is, for example, a three-dimensional position represented by the coordinate value of the measurement point 35 in a predetermined three-dimensional coordinate system.
  • the position of the measurement point 35 measured by the TOF sensor 253 is a position in the sensor coordinate system of the TOF sensor 253.
  • This position (the coordinate value in the sensor coordinate system of the TOF sensor 253) can be appropriately converted into, for example, a position in the sensor coordinate system of the image sensor 252. Therefore, by using the TOF camera 251, it is possible to measure data including the luminance and the three-dimensional position for each pixel (measurement point 35).
  • the luminance and position of each measurement point 35 can be measured simultaneously. Thereby, the position and brightness of the object to be reflected can be detected with high accuracy.
  • the specific configuration of the TOF camera 251 is not limited, and for example, a TOF camera 251 including a TOF sensor 253 that can capture a luminance image and a distance image may be used.
  • the TOF camera 251 can measure the luminance for each of the plurality of measurement points on the dashboard 30 and the placement object 34 and the position of the plurality of measurement points 35.
  • the TOF camera 251 functions as a position sensor.
  • the control unit 260 includes an image acquisition unit 261, a brightness detection unit 262, a correction unit 263, a storage unit 264, and a state detection unit 266.
  • the image acquisition unit 261 acquires an external image 45 of the vehicle 200 photographed through the windshield 20 of the vehicle 200.
  • the brightness detection unit 262 detects brightness information related to the brightness of the dashboard 30 and the placed object 34.
  • as the brightness information, the luminance for each of the plurality of measurement points 35 on the dashboard 30 and the placement object 34 is detected based on the output of the TOF camera 251.
  • the correction unit 263 corrects the luminance of the external image 45 of the vehicle 200 based on the detected luminance for each measurement point 35.
  • the storage unit 264 stores various parameters necessary for correcting the luminance of the external image 45 based on the luminance for each measurement point 35. The operation of the correction unit 263 will be described in detail later.
  • the state detection unit 266 detects the internal state of the vehicle 200 based on the output of the TOF camera 251.
  • the internal state of the vehicle 200 refers to various states in the vehicle interior space 11, including, for example, the state of the driver and other passengers of the vehicle 200, the state of the seats, and the state of the mounted object 34.
  • the state of the driver or passenger to be detected includes, for example, position, posture, physical condition, arousal level, concentration level, fatigue level, line-of-sight direction, and the like.
  • by using the position data of the driver output from the TOF camera 251, it is possible to monitor the state of the driver. Thereby, for example, it is possible to execute a process of notifying the driver of the awakening level, the fatigue level, or the like, or a process of safely stopping the vehicle 200 in an emergency.
  • a process of detecting an external state of the vehicle 200 based on the output of the TOF camera 251 may be executed.
  • an arbitrary detection process using the output of the TOF camera 251 may be executed.
  • FIG. 11 is a schematic diagram illustrating an example of the correction process of the external image 45.
  • the correction processing of the external image 45 by the control unit 260 will be described with reference to FIG.
  • FIG. 11A is a schematic diagram illustrating an example of an external image 45 captured by the camera 40. As shown in FIG. 11A, in the external image 45, the dashboard 30 and the placement object 34 are reflected through the windshield 20.
  • the position of the placement object 34 may not be fixed with respect to the vehicle 200. For this reason, the range in which the mounted object 34 is reflected in the external image 45 may change every time the image is taken. That is, when the placement object 34 or the like is reflected, the reflection region 46 in which the reflection is generated in the external image 45 may change.
  • the reflected image 49 is, for example, an image generated by detecting the second reflected light 33 reflected by the windshield 20 (see FIG. 9). Therefore, it can be said that the external image 45 is an image in which the reflected image 49 is superimposed on the scenery in front of the vehicle 200 constituted by the transmitted light 41 transmitted through the windshield 20.
  • the dashboard 30 and the mounted object 34 are photographed by the TOF camera 251 at the timing when the external image 45 is photographed. Then, the brightness detection unit 262 reads the output of the TOF camera 251. For example, a luminance image and a distance image captured by the image sensor 252 and the TOF sensor 253 are read, respectively.
  • FIG. 11B is a schematic diagram illustrating an example of a luminance image 254 photographed by the image sensor 252 of the TOF camera 251.
  • the luminance image 254 is an image obtained by directly photographing the dashboard 30 and the placement object 34. For example, for a certain pixel in the luminance image 254, the luminance of the measurement point 35 corresponding to the pixel is recorded.
  • the luminance image 254 is a luminance map E (u ′, v ′) that represents the luminance distribution of the objects (dashboard 30 and mounted object 34) that are to be reflected in the external image 45.
  • u ′ and v ′ are values representing the coordinates of pixels in the horizontal and vertical directions of an image (luminance image) taken by the image sensor 252 (TOF camera 251).
  • the brightness of the reflected image 49 is calculated by the correcting unit 263 from the brightness of the dashboard 30 and the placed object 34.
  • the brightness of the reflected image 49 is the luminance L_ref of the second reflected light 33 emitted from the dashboard 30 and the placement object 34 and reflected by the windshield 20 (see FIG. 8).
  • the luminance L_ref of the second reflected light 33 is expressed by the following equation using, for example, the luminance map E(u′, v′) representing the luminance of the dashboard 30 and the mounted object 34.
  • L_ref(u′, v′) = α·E(u′, v′) + β   (2)
  • α and β are coefficients determined according to the characteristics (reflectance and the like) of the windshield 20.
  • the coefficients α and β are calculated in advance and stored in the storage unit 264, for example.
  • the luminance map E(u′, v′) represents the luminance according to characteristics such as the reflectance at each measurement point 35 of the dashboard 30 and the placed object 34. Therefore, for example, even when the characteristics of the dashboard 30 and the mounted object 34 change, L_ref(u′, v′) can be calculated using the coefficients α and β stored in the storage unit 264.
  • L_ref(u′, v′) represents the luminance distribution of the second reflected light 33 in the coordinate system (u′, v′) of the luminance image 254 shown in FIG. 11B.
  • in the correction unit 263, coordinate conversion from the coordinate system (u′, v′) of the luminance image 254 to the coordinate system (u, v) of the external image 45 is executed.
  • coordinate transformation using perspective projection transformation is executed.
  • by perspective projection conversion, it is possible to execute processing such as converting three-dimensional coordinates into two-dimensional coordinates or converting two-dimensional coordinates into three-dimensional coordinates.
  • for the perspective projection conversion, a calibration value is used that includes external parameters (the center coordinates and optical axis direction of the lens used to photograph the object) and internal parameters (the focal length, image center position, image size, distortion aberration coefficients, etc.). These calibration values are coefficients determined based on, for example, the characteristics of and the arrangement relationship between the camera 40 and the TOF camera 251.
  • the calibration value is calculated in advance using, for example, a predetermined calibration pattern (checkered pattern or the like) and stored in the storage unit 264.
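  • The calibration procedure itself is not detailed in the present embodiment; the following sketch shows one common way to obtain internal parameters from a checkered pattern using OpenCV. The pattern size, square size, and image directory are assumptions, and the extrinsic relationship between the camera 40 and the TOF camera 251 would still have to be determined separately.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)       # assumed number of inner corners of the checkered pattern
SQUARE_MM = 25.0       # assumed square size in millimetres

# 3-D coordinates of the pattern corners in the pattern's own coordinate system.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):  # hypothetical directory of calibration shots
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Internal parameters (focal length, image center, distortion coefficients) of one camera.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```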
  • the luminance distribution of the second reflected light 33 calculated by the equation (2) is coordinate-converted by the correction unit 263 using the following equation.
  • L_ref(u, v) = W(L_ref(u′, v′), c)   … (3)
  • W is a function (a conversion matrix or the like) for converting from the coordinate system (u ′, v ′) of the luminance image 254 to the coordinate system (u, v) of the external image 45.
  • c is a calibration value for performing coordinate conversion by the function W.
  • In the coordinate conversion by W, for example, two-dimensional coordinates on the TOF camera 251 (luminance image 254) are converted into three-dimensional coordinates in three-dimensional space, and the converted three-dimensional coordinates are then converted into two-dimensional coordinates on the camera 40 (external image 45).
  • the value of the distance image measured by the TOF camera 251 is used.
  • The specific form of the function W that performs the coordinate conversion is not limited; for example, any method capable of converting the coordinates may be used as appropriate.
  • the coordinates (u ′, v ′) of the measurement point 35 on the luminance image 254 shown in FIG. 11B are converted into the coordinates (u, v) of the measurement point 35 on the external image 45 shown in FIG. 11A.
  • other measurement points 35 are also converted into coordinates on the external image 45.
  • In this way, the positions of the measurement points 35 on the dashboard 30 and the placement object 34 measured by the TOF camera 251 are converted by the correction unit 263 into positions in the external image 45.
  • the luminance distribution L ref (u, v) of the second reflected light 33 calculated using the expression (3) is the luminance distribution of the reflected image 49 in the external image 45 shown in FIG. 11A. That is, by using the equations (2) and (3), it is possible to convert the actual luminance of the dashboard 30 and the placement object 34 into the luminance of the reflected image 49.
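  • A minimal sketch of the conversion W of equation (3) is given below, assuming pinhole models for both cameras, a pre-computed extrinsic calibration (R, t) from the TOF camera 251 to the camera 40, and simple nearest-pixel splatting without occlusion handling; none of these details are prescribed by the embodiment.

```python
import numpy as np

def reproject_lref(L_ref_tof: np.ndarray, depth: np.ndarray,
                   K_tof: np.ndarray, K_cam: np.ndarray,
                   R: np.ndarray, t: np.ndarray,
                   out_shape: tuple) -> np.ndarray:
    """Warp L_ref(u', v') from the TOF image into the external-image frame (u, v).

    Each TOF pixel is back-projected to 3-D with its measured distance, moved into
    the camera frame with the extrinsic calibration (R, t), and projected with the
    camera intrinsics K_cam.
    """
    h, w = L_ref_tof.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3).astype(np.float64)

    # Back-project: X = depth * K_tof^-1 [u', v', 1]^T (depth taken from the distance image)
    rays = pix @ np.linalg.inv(K_tof).T
    pts3d = rays * depth.reshape(-1, 1)

    # Move into the camera coordinate system and project onto the external image.
    pts_cam = pts3d @ R.T + t
    proj = pts_cam @ K_cam.T
    u = (proj[:, 0] / proj[:, 2]).round().astype(int)
    v = (proj[:, 1] / proj[:, 2]).round().astype(int)

    out = np.zeros(out_shape, dtype=np.float64)
    ok = (pts_cam[:, 2] > 0) & (u >= 0) & (u < out_shape[1]) & (v >= 0) & (v < out_shape[0])
    out[v[ok], u[ok]] = L_ref_tof.reshape(-1)[ok]
    return out
```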
  • Correction data Y(u, v) = X(u, v) − L_ref(u, v) is calculated by subtracting the luminance distribution L_ref(u, v) of the second reflected light 33 from the luminance data X(u, v) of the external image 45.
  • a corrected image 48 is generated based on the calculated correction data Y (u, v).
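  • The subtraction itself can be sketched as follows, assuming 8-bit luminance data and simple clipping to the valid range.

```python
import numpy as np

def remove_reflection(X: np.ndarray, L_ref: np.ndarray) -> np.ndarray:
    """Correction data Y(u, v) = X(u, v) - L_ref(u, v), clipped to the 8-bit range."""
    Y = X.astype(np.float64) - L_ref
    return np.clip(Y, 0, 255).astype(np.uint8)
```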
  • FIG. 11C is a schematic diagram illustrating an example of the corrected image 48.
  • FIG. 11C shows a corrected image 48 obtained by removing the luminance distribution L ref (u, v) of the second reflected light 33 from the external image 45 shown in FIG. 11A.
  • In the corrected image 48, not only the image of the dashboard 30 reflected in the external image 45 but also the image of the placement object 34 is removed. Thereby, it is possible to generate a corrected image 48 in which the scenery in front is clearly captured.
  • the generated corrected image 48 is output to a recognizer or the like, and image sensing or the like using the corrected image 48 is executed.
  • FIG. 12 is a schematic diagram showing another configuration example of a vehicle on which the TOF camera 251 is mounted.
  • 12A and 12B are schematic views when the interior space 11 of the vehicle 201 is viewed from the side and the upper side of the vehicle 201.
  • the vehicle 201 has an all-around camera 240 and a plurality of TOF cameras 251.
  • the all-around camera 240 is a camera that can capture an image over a range of 360 °, for example.
  • the all-around camera 240 is disposed, for example, in the upper center of the vehicle interior space 11. In the example illustrated in FIG. 12, the all-around camera 240 disposed on the ceiling between the front row seat and the rear row seat is schematically illustrated.
  • a front image of the vehicle 201 can be taken through the windshield 220.
  • a side image and a rear image of the vehicle 201 can be taken through the side glass 221 and the rear glass 222.
  • the front image, the side image, and the rear image are used as the external image 45 of the vehicle 201.
  • the specific configuration of the all-around camera 240 is not limited.
  • the side glass 221 corresponds to a side window glass
  • the rear glass 222 corresponds to a rear window glass.
  • The plurality of TOF cameras 251 are arranged so as to be able to photograph the objects existing in the vehicle interior space 11 that are reflected on each window glass (front glass 220, side glass 221, rear glass 222, etc.). More specifically, each TOF camera 251 is arranged so as to be able to photograph the reflection target surface of each object.
  • The reflection target surface is, for example, the surface of an object that is actually reflected on a window glass, typically the surface (region) of the object that faces the window glass.
  • Each TOF camera 251 is disposed on the ceiling near the side wall of the interior space 11 corresponding to one of the four seats: front-row left, front-row right, rear-row left, and rear-row right. In this way, by arranging a TOF camera 251 corresponding to each seat at a corner of the interior space 11, it is possible to photograph the reflection target surfaces, with respect to the window glass, of each seat 5 and of the passenger 4 seated there. In this case, the seat 5 and the passenger 4 are the objects that are present inside the vehicle and reflected on the window glass.
  • the seat 5 is an example of an interior part.
  • sensing of other vehicles, pedestrians, obstacles, etc. around the vehicle 201 is executed using the all-round camera 240.
  • the obstacle 3 existing on the right side outside the vehicle 201 is schematically illustrated.
  • sensing around the vehicle 201 can be easily realized.
  • The position and brightness of the objects (passenger 4, seat 5, etc.) reflected on the window glass such as the side glass 221 are measured by each TOF camera 251 in accordance with the shooting timing of the all-round camera 240.
  • the brightness of the reflected image 49 of each object that is reflected in the external image 45 captured by the omnidirectional camera 240 is calculated.
  • the brightness of the reflected image 49 can be appropriately calculated from the output of the TOF camera 251 based on the characteristics and the arrangement relationship of the all-round camera 240 and each TOF camera 251.
  • In this way, the present technology can be applied even when sensing is performed not only through the front window (front glass 220) but also through the side windows (side glass 221) and the rear window (rear glass 222).
  • the periphery of the vehicle 201 can be sensed with high accuracy over a range of 360 °.
  • FIG. 13 is a schematic diagram illustrating a configuration example of a vehicle on which a control unit according to the third embodiment of the present technology is mounted.
  • FIG. 14 is a block diagram illustrating a configuration example of the control unit 360.
  • the vehicle 300 includes a camera 40, a brightness sensor 350, and a control unit 360.
  • the brightness sensor 350 includes an illuminance sensor 351 and a LiDAR sensor 352.
  • the illuminance sensor 351 is disposed on the dashboard 30 and measures the illuminance E of the dashboard 30.
  • a placement object 34 is placed on the dashboard 30.
  • the illuminance sensor 351 is disposed in front of the dashboard 30 (on the windshield 20 side) so that the irradiation light 23 such as sunlight 24 is not blocked by the mounted object 34 or the like.
  • the illuminance E detected by the illuminance sensor 351 is a value representing the brightness of the sunlight 24 that is transmitted through the windshield 20 and incident. Therefore, the illuminance E can be used as a parameter representing not only the brightness of the dashboard 30 but also the brightness of the mounted object 34 illuminated by the sunlight 24.
  • the LiDAR sensor 352 is a sensor that performs distance detection (Light Detection and Ranging / Laser Imaging Detection and Ranging) using a laser beam or the like.
  • the LiDAR sensor 352 is disposed in the vicinity of the camera 40 toward the dashboard 30.
  • FIG. 13 schematically shows laser light 353 emitted from the LiDAR sensor 352.
  • a scanning 3D scanner that scans the laser beam 353 and measures three-dimensional point cloud data (3D Point Cloud data) representing the distance to the target is used.
  • the laser beam 353 is reflected by an object on which the scanned laser beam 353 is incident.
  • a part of the reflected laser beam 353 enters a detector included in the sensor.
  • Based on the phase or the like of the light incident on the detector, it is possible to measure the distance to the object, that is, the three-dimensional position of the object.
  • the laser beam 353 emitted from the LiDAR sensor 352 is applied to the dashboard 30, the mounted object 34, and the like inside the vehicle 300.
  • the light reflected at the irradiated point is detected and the position of the point is measured.
  • the point irradiated with the laser beam 353 becomes the measurement point 35.
  • the measurement point 35 on the placement object 34 is illustrated by a black circle.
  • the LiDAR sensor 352 measures the positions of the plurality of measurement points 35 on the dashboard 30 and the placement object 34 in accordance with the scanning of the laser beam 353.
  • the laser beam 353 passes through the windshield 20 and the like, and is also applied to an object outside the vehicle 300. Therefore, the LiDAR sensor 352 can measure not only the object existing in the vehicle interior space 11 but also the position of the object outside the vehicle 300. That is, the LiDAR sensor 352 can measure the position of an object inside and outside the vehicle 300.
  • The LiDAR sensor 352 can measure information relating to the reflectance of the target (a reflectance value or the like) by measuring the detection intensity (L_i: LiDAR intensity) of the reflected light from the target. For example, the approximate reflectance for each measurement point 35 is measured based on the detection intensity L_i.
  • the LiDAR sensor 352 corresponds to a range sensor that measures the reflectance for each of the plurality of measurement points 35.
  • the specific configuration of the LiDAR sensor 352 is not limited.
  • a sensor that can detect point cloud data at a frame rate of several fps to several tens of fps may be used.
  • a LiDAR sensor using laser light having an arbitrary wavelength such as ultraviolet light, visible light, or near infrared light may be used.
  • the scanning range, measurement range, and the like of the laser beam 353 may be set as appropriate according to the size of the vehicle 300 and the like.
  • the LiDAR sensor 352 can measure the reflectance for each of the plurality of measurement points 35 on the dashboard 30 and the placement object 34 and the positions of the plurality of measurement points 35.
  • the LiDAR sensor 352 functions as a position sensor.
  • the illuminance E measured by the illuminance sensor 351 and the reflectance (detection intensity L i ) measured by the LiDAR sensor 352 correspond to parameters relating to the brightness of the object.
  • the control unit 360 includes an image acquisition unit 361, a brightness detection unit 362, a correction unit 363, a storage unit 364, a light emission control unit 365, and a state detection unit 366.
  • the image acquisition unit 361 acquires an external image 45 of the vehicle 300 photographed through the windshield 20 of the vehicle 300.
  • the brightness detection unit 362 detects brightness information related to the brightness of the dashboard 30 and the placement object 34.
  • the illuminance E of the dashboard 30 is detected based on the output of the illuminance sensor 351 as the brightness information.
  • the reflectance for each of the plurality of measurement points 35 on the dashboard 30 and the placement object 34 is detected based on the output of the LiDAR sensor 352.
  • the brightness detection unit 362 detects the luminance for each of the plurality of measurement points 35 based on the illuminance E of the dashboard 30 and the reflectance for each of the plurality of measurement points 35.
  • the luminance of each measurement point 35 can be calculated as appropriate using the product of the illuminance E and the reflectance.
  • the method of calculating the luminance for each of the plurality of measurement points 35 is not limited, and for example, a coefficient such as a calibration value according to the characteristics of the LiDAR sensor 352 may be used as appropriate.
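  • As an illustrative sketch, the per-point luminance can be written as the product mentioned above; the coefficient k stands in for a calibration value of the LiDAR sensor 352 (1/π would correspond to an ideal Lambertian surface) and is an assumption, not a value given by the embodiment.

```python
import numpy as np

def point_luminance(illuminance_lux: float, reflectance: np.ndarray,
                    k: float = 1.0 / np.pi) -> np.ndarray:
    """Luminance per measurement point ~ k * illuminance * reflectance.

    reflectance holds the LiDAR-derived reflectance for each measurement point 35;
    k is a sensor-dependent calibration coefficient (placeholder value).
    """
    return k * illuminance_lux * reflectance
```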
  • the correction unit 363 corrects the luminance of the external image 45 of the vehicle 300 based on the detected luminance for each measurement point 35.
  • the storage unit 364 stores various parameters necessary for correcting the luminance of the external image 45 based on the luminance for each measurement point 35. The operation of the correction unit 363 will be described in detail later.
  • the light emission control unit 365 controls the light intensity of the light 12 and the display device 13 mounted on the vehicle 300 based on the output of the illuminance sensor 351. Therefore, the illuminance sensor 351 is used not only to correct the external image but also to control the light 12 and the display device 13. As a result, the number of parts can be suppressed, and the manufacturing cost of the vehicle 300 can be suppressed.
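  • A minimal sketch of such light emission control is given below; the thresholds and the mapping are placeholders, since the embodiment only states that the emission intensity is controlled based on the output of the illuminance sensor 351.

```python
def update_light_emission(illuminance_lux: float) -> dict:
    """Map the measured illuminance to light / display settings (illustrative only)."""
    headlights_on = illuminance_lux < 1000.0                        # hypothetical dusk threshold
    display_level = min(1.0, max(0.2, illuminance_lux / 50000.0))   # hypothetical dimming curve
    return {"headlights_on": headlights_on, "display_level": display_level}
```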
  • the state detection unit 366 detects the internal state of the vehicle 300 based on the output of the LiDAR sensor 352. For example, based on the position information of the object measured by the LiDAR sensor 352, the position, posture, physical condition, arousal level, concentration level, fatigue level, gaze direction (head direction), etc. of the passenger are detected.
  • the state detection unit 366 detects an external state of the vehicle 300. For example, based on the output of the LiDAR sensor 352, detection processing, recognition processing, tracking processing, detection processing of the distance to the object, and the like of objects around the vehicle 300 are performed. Examples of objects to be detected include vehicles, people, obstacles, structures, roads, traffic lights, traffic signs, road markings, and the like.
  • the output of the LiDAR sensor 352 is used not only for correction of the external image but also for processing for detecting various states inside and outside the vehicle 300.
  • The present technology is not limited to this, and an arbitrary detection process using the output of the LiDAR sensor 352 may be executed. As a result, the number of parts can be suppressed, and the manufacturing cost of the vehicle 300 can be suppressed.
  • FIG. 15 is a schematic diagram illustrating an example of a correction process for the external image 45.
  • Hereinafter, the correction processing of the external image 45 by the control unit 360 will be described with reference to FIG. 15.
  • FIG. 15A is a schematic diagram illustrating an example of an external image 45 photographed by the camera 40.
  • FIG. 15B is a schematic diagram illustrating an example of point cloud data measured by the LiDAR sensor 352.
  • the measurement of the point cloud data 354 by the LiDAR sensor 352 and the measurement of the illuminance E by the illuminance sensor 351 are executed at the timing when the external image 45 is captured.
  • the point group data 354 includes the three-dimensional position (x, y, z) of each measurement point 35 and the reflectance (detection intensity L i ) of each measurement point 35.
  • the point cloud data 354 is described as L i (x, y, z).
  • the three-dimensional position (x, y, z) is a position coordinate based on the sensor coordinate system of the LiDAR sensor 352, for example.
  • the brightness detection unit 362 calculates luminance data E (x, y, z) of each measurement point 35 from the point cloud data L i (x, y, z) based on the illuminance E of the dashboard 30. Then, the correction unit 363 calculates the luminance of the reflected image 49 in the external image 45 shown in FIG. 15A from the luminance data E (x, y, z). That is, the luminance L ref of the second reflected light 33 emitted from the dashboard 30 and the placed object 34 and reflected by the windshield 20 is calculated (see FIG. 13).
  • the luminance L ref of the second reflected light 33 is expressed by the following equation using luminance data E (x, y, z) indicating the luminance of the dashboard 30 and the mounted object 34, for example.
  • L_ref(x, y, z) = α′ · E(x, y, z) + β′   … (4)
  • α′ and β′ are coefficients determined in accordance with the characteristics of the windshield 20 (reflectance, etc.).
  • The coefficients α′ and β′ are calculated in advance and stored in the storage unit 364, for example.
  • the luminance distribution of the second reflected light 33 calculated by the equation (4) is coordinate-converted by the correction unit 363 using the following equation.
  • L_ref(u, v) = W′(L_ref(x, y, z), c′)   … (5)
  • W ′ is a function (conversion matrix or the like) for converting the sensor coordinate system (x, y, z) of the LiDAR sensor 352 to the coordinate system (u, v) of the external image 45.
  • W ′ is appropriately set using, for example, perspective projection conversion.
  • c′ is a calibration value for performing the coordinate conversion by the function W′.
  • the luminance distribution L ref (u, v) of the second reflected light 33 calculated using the equation (5) is the luminance distribution of the reflected image 49 in the external image 45 shown in FIG. 15A. That is, by using the expressions (4) and (5), the brightness of the reflected image 49 can be calculated from the illuminance E and the reflectance (detection intensity L i ).
  • The corrected image 48 shown in FIG. 15C can be generated by appropriately subtracting the calculated luminance distribution L_ref(u, v) of the second reflected light 33 from the luminance of the external image 45 shown in FIG. 15A. As described above, even when the illuminance sensor 351 and the LiDAR sensor 352 are used, it is possible to appropriately remove the reflection of the dashboard 30 and of the placement object 34 placed on the dashboard 30. Thereby, the accuracy of image sensing and the like can be sufficiently improved.
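  • A minimal sketch of the projection W′ of equation (5) is given below, assuming a pinhole model for the camera 40, a pre-computed extrinsic calibration (R, t) from the LiDAR frame to the camera frame, and nearest-pixel splatting; in practice the sparse projected points would also have to be interpolated into a dense map, which is omitted here.

```python
import numpy as np

def lref_map_from_points(points_xyz: np.ndarray, L_ref_pts: np.ndarray,
                         K_cam: np.ndarray, R: np.ndarray, t: np.ndarray,
                         out_shape: tuple) -> np.ndarray:
    """Project per-point values L_ref(x, y, z) into the external-image frame (u, v)."""
    pts_cam = points_xyz @ R.T + t          # LiDAR frame -> camera frame
    proj = pts_cam @ K_cam.T                # camera frame -> homogeneous pixel coordinates
    u = (proj[:, 0] / proj[:, 2]).round().astype(int)
    v = (proj[:, 1] / proj[:, 2]).round().astype(int)

    out = np.zeros(out_shape, dtype=np.float64)
    ok = (pts_cam[:, 2] > 0) & (u >= 0) & (u < out_shape[1]) & (v >= 0) & (v < out_shape[0])
    out[v[ok], u[ok]] = L_ref_pts[ok]
    return out
```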
  • The point cloud data L_i(x, y, z) of the LiDAR sensor 352 also includes information such as the reflectance of objects existing outside the vehicle 300. Therefore, by using the illuminance E measured by the illuminance sensor 351 and the point cloud data L_i(x, y, z), it is also possible, for example, to estimate the brightness of subjects outside the vehicle (other vehicles, pedestrians, obstacles, etc.) appearing in the external image 45, such as the distribution of bright and dark areas and the degree of excessive brightness that leads to white-out.
  • FIG. 16 is a schematic diagram illustrating a configuration example of a vehicle on which a control unit according to the fourth embodiment of the present technology is mounted.
  • FIG. 17 is a block diagram illustrating a configuration example of the control unit 460.
  • the vehicle 400 includes a windshield 20, a dashboard 30, a camera 40, and a control unit 460.
  • the dashboard 30 is connected to the lower end of the windshield 20.
  • An opening 431 is provided in front of the dashboard 30 (side connected to the windshield 20).
  • the opening 431 is a hole through which, for example, hot / cold air output from an air conditioner or the like passes, and is connected to an air duct 432 or the like disposed below the dashboard 30.
  • FIG. 16 schematically shows the opening 431 and the air duct 432.
  • the sunlight 24 that has entered the opening 431 through the windshield 20 is reflected or absorbed inside the air duct 432 connected to the opening 431. Therefore, the light reflected inside the air duct 432 is hardly reflected on the windshield 20. Therefore, it can be said that the opening 431 is a hole in which reflection on the dashboard 30 hardly occurs.
  • a part of the irradiated sunlight 24 enters the windshield 20 as the first reflected light 32.
  • a part of the first reflected light 32 is reflected by the windshield 20 and enters the camera 40 as the second reflected light 33.
  • a region different from the opening 431 on the dashboard 30 is referred to as a reflective portion 433.
  • In this way, the dashboard 30 has the reflection portion 433, which reflects light along the first optical path 434 directed toward the camera 40 via the windshield 20, and the opening 431, which reflects light along the second optical path 435 different from the first optical path.
  • The first optical path 434 is, for example, the optical path through which the first reflected light 32 and the second reflected light 33 shown in FIG. 16 pass, that is, the optical path of the light that is reflected into the external image 45.
  • the second optical path 435 is an optical path through which light reflected and absorbed in the air duct 432 passes, for example.
  • the reflective portion 433 corresponds to the first region
  • the opening 431 corresponds to the second region.
  • the camera 40 has the same configuration as the camera 40 shown in FIGS. 1 and 2, for example. Note that the shooting range of the camera 40 is set so that, for example, the reflection around the opening 431, that is, the lower end of the windshield 20 is shot (see FIG. 18A).
  • the control unit 460 includes an image acquisition unit 461, a brightness detection unit 462, a correction unit 463, and a storage unit 464.
  • the image acquisition unit 461 acquires an external image 45 of the vehicle 400 taken through the windshield 20 of the vehicle 400.
  • the brightness detection unit 462 detects brightness information related to the brightness of the dashboard 30.
  • the brightness information of the dashboard 30 is detected based on the external image 45 photographed by the camera 40. The operation of the brightness detection unit 462 will be described in detail later.
  • the correction unit 463 corrects the luminance of the external image 45 of the vehicle 400 based on the detected brightness information.
  • the storage unit 464 stores various parameters necessary for correcting the luminance of the external image 45.
  • the storage unit 464 stores an area map M (u, v) representing a reflection area where the reflection of the dashboard 30 occurs.
  • FIG. 18 is a schematic diagram illustrating an example of a correction process for the external image 45.
  • Hereinafter, the correction processing of the external image 45 by the control unit 460 will be described with reference to FIG. 18.
  • FIG. 18A is a schematic diagram illustrating an example of an external image 45 photographed by the camera 40. As shown in FIG. 18A, an image of the reflection portion 433 around the opening 431 is schematically illustrated in the reflection region 46 below the external image 45. Note that the white area in the reflection area 46 is an area where the reflection portion 433 is reflected, and the gray area surrounded by the white area corresponds to the opening 431.
  • FIG. 18B is an enlarged view of a region surrounded by a dotted line range 6 shown in FIG. 18A, and is an enlarged view of a range in which an image around the opening 431 is reflected in the external image 45.
  • In the area of the external image 45 where the opening 431 appears, a luminance shift accompanying the reflection (for example, an increase in luminance L_ref due to the second reflected light 33) hardly occurs. Accordingly, the luminance L_hole of the area showing the opening 431 can be regarded as approximately equal to the luminance L_trn of the light transmitted through the windshield 20.
  • L_hole ≈ L_trn   … (6)
  • the brightness detection unit 462 detects a luminance difference between the reflection unit 433 and the opening 431 in the external image 45. Thereby, it is possible to calculate the luminance shift amount accompanying the reflection, that is, the luminance L ref of the second reflected light 33.
  • the luminance L ref of the second reflected light 33 is calculated using, for example, the following equation.
  • L_ref = L_near − L_hole   … (7)
  • the area map M (u, v) representing the reflection area 46 records a range in which a gray area corresponding to the opening 431 is captured.
  • the brightness detection unit 462 calculates the luminance value (L hole ) of the pixel included in the gray area and the luminance value (L near ) of the pixel included in the bright area around the gray area. Then, based on the calculated L hole and L near , L ref is calculated using equation (7).
  • The method for calculating the luminance L_ref of the second reflected light 33 is not limited; for example, L_ref may be calculated using the average luminance values of the gray region and of its surrounding region.
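  • A minimal sketch of this estimation is given below; the mask names and the use of mean values are assumptions for illustration, with the masks derived from the area map M(u, v) held in the storage unit 464.

```python
import numpy as np

def lref_from_opening(X: np.ndarray, hole_mask: np.ndarray, near_mask: np.ndarray) -> float:
    """Estimate L_ref per equation (7) from the region showing the opening 431."""
    L_hole = X[hole_mask].mean()   # pixels showing the opening (no reflection)
    L_near = X[near_mask].mean()   # surrounding pixels showing the reflective part 433
    return float(L_near - L_hole)

def correct_with_opening(X: np.ndarray, reflection_mask: np.ndarray, L_ref: float) -> np.ndarray:
    """Subtract the estimated reflection offset inside the reflection region 46 only."""
    Y = X.astype(np.float64)
    Y[reflection_mask] -= L_ref
    return np.clip(Y, 0, 255).astype(np.uint8)   # assuming 8-bit luminance data
```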
  • the external image 45 can be easily corrected by using the hole (opening 431) provided in the dashboard 30 or the like. Thereby, the accuracy of image sensing or the like can be sufficiently improved.
  • Since the brightness of the dashboard 30 and the like can be corrected using the external image 45 itself, it is not necessary to use a separate sensor or the like. For this reason, the number of parts and the like can be reduced, and the manufacturing cost of the apparatus can be greatly reduced.
  • the method for correcting the external image 45 (front image) mainly photographed through the windshield 20 has been described.
  • the present technology is not limited to this, and the present technology can also be applied when, for example, a side image or a rear image is captured through a side glass or a rear glass of a vehicle.
  • In this case, a camera that captures the outside of the vehicle through each window glass is disposed as appropriate, and an illuminance sensor, a TOF camera, a LiDAR sensor, or the like is disposed as appropriate so that reflections can be removed from the image captured by that camera.
  • the single camera 40 is used.
  • the present technology can also be applied when a plurality of cameras (for example, a stereo camera) are used.
  • When a plurality of cameras are used, an area map, correction parameters, a transformation matrix for coordinate conversion, and the like are stored for each camera. Thereby, it becomes possible to correct the external image captured by each camera.
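  • One possible way to hold such per-camera data is sketched below; the field names are illustrative assumptions and do not correspond to elements defined in the embodiments.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CameraCorrectionParams:
    """Per-camera parameters kept in the storage unit (field names are illustrative)."""
    K: np.ndarray          # internal parameters of the camera
    R: np.ndarray          # rotation from the brightness sensor frame to the camera frame
    t: np.ndarray          # translation from the brightness sensor frame to the camera frame
    area_map: np.ndarray   # region map M(u, v) of the reflection area
    alpha: float           # windshield reflection coefficient
    beta: float            # windshield offset coefficient

# For a stereo camera, e.g.: {"left": CameraCorrectionParams(...), "right": CameraCorrectionParams(...)}
```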
  • the information processing method according to the present technology including external image correction and the like was executed by the control unit.
  • the information processing method according to the present technology may be executed by the cloud server. That is, the function of the control unit may be mounted on the cloud server.
  • the cloud server operates as an information processing apparatus according to the present technology.
  • For example, a computer mounted on the vehicle and another computer (cloud server) that can communicate with it via a network or the like may work together to execute the information processing method and the program according to the present technology, thereby constructing an information processing apparatus according to the present technology.
  • the information processing method and the program according to the present technology can be executed not only in a computer system configured by a single computer but also in a computer system in which a plurality of computers operate in conjunction with each other.
  • the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network and a single device housing a plurality of modules in one housing are all systems.
  • Execution of the information processing method and the program according to the present technology by the computer system includes, for example, both the case where acquisition of the external image of the vehicle, detection of the brightness information of the object reflected on the window glass, and correction of the external image are executed by a single computer, and the case where each process is executed by a different computer.
  • the execution of each process by a predetermined computer includes causing another computer to execute a part or all of the process and acquiring the result.
  • the information processing method and program according to the present technology can be applied to a configuration of cloud computing in which one function is shared by a plurality of devices via a network and is processed jointly.
  • the vehicle that is a moving body is described as an example.
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as an apparatus mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 19 is a block diagram illustrating a schematic configuration example of a vehicle control system 7000 that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. .
  • the communication network 7010 for connecting the plurality of control units conforms to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark). It may be an in-vehicle communication network.
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores programs executed by the microcomputer or parameters used for various calculations, and a drive circuit that drives various devices to be controlled. Is provided.
  • Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication.
  • In FIG. 19, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
  • other control units include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • the drive system control unit 7100 includes a driving force generator for generating a driving force of a vehicle such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to wheels, and a steering angle of the vehicle. It functions as a control device such as a steering mechanism that adjusts and a braking device that generates a braking force of the vehicle.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • a vehicle state detection unit 7110 is connected to the drive system control unit 7100.
  • the vehicle state detection unit 7110 includes, for example, a gyro sensor that detects the angular velocity of the rotational movement of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, an operation amount of an accelerator pedal, an operation amount of a brake pedal, and steering of a steering wheel. At least one of sensors for detecting an angle, an engine speed, a rotational speed of a wheel, or the like is included.
  • the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, or the like.
  • the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • For example, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 7200.
  • the body system control unit 7200 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310 that is a power supply source of the drive motor according to various programs. For example, information such as battery temperature, battery output voltage, or remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature adjustment of the secondary battery 7310 or the cooling device provided in the battery device.
  • the outside information detection unit 7400 detects information outside the vehicle on which the vehicle control system 7000 is mounted.
  • the outside information detection unit 7400 is connected to at least one of the imaging unit 7410 and the outside information detection unit 7420.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The outside information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting the current weather or meteorological conditions and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects sunlight intensity, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the imaging unit 7410 and the outside information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 20 shows an example of installation positions of the imaging unit 7410 and the vehicle outside information detection unit 7420.
  • the imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the front nose, the side mirror, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • An imaging unit 7910 provided in the front nose and an imaging unit 7918 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 7900.
  • Imaging units 7912 and 7914 provided in the side mirror mainly acquire an image of the side of the vehicle 7900.
  • An imaging unit 7916 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 7900.
  • The imaging unit 7918 provided at the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 20 shows an example of shooting ranges of the respective imaging units 7910, 7912, 7914, and 7916.
  • the imaging range a indicates the imaging range of the imaging unit 7910 provided in the front nose
  • the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided in the side mirrors, respectively
  • the imaging range d indicates the imaging range of the imaging unit 7916 provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, an overhead image of the vehicle 7900 viewed from above is obtained.
  • the vehicle outside information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners of the vehicle 7900 and the upper part of the windshield in the vehicle interior may be, for example, an ultrasonic sensor or a radar device.
  • the vehicle outside information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These outside information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, and the like.
  • the vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the vehicle exterior information detection unit 7420 connected thereto. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives received reflected wave information.
  • the outside information detection unit 7400 may perform an object detection process or a distance detection process such as a person, a car, an obstacle, a sign, or a character on a road surface based on the received information.
  • the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, or the like based on the received information.
  • the vehicle outside information detection unit 7400 may calculate a distance to an object outside the vehicle based on the received information.
  • the outside information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
  • The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different imaging units 7410 to generate an overhead image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
  • the vehicle interior information detection unit 7500 detects vehicle interior information.
  • a driver state detection unit 7510 that detects the driver's state is connected to the in-vehicle information detection unit 7500.
  • Driver state detection unit 7510 may include a camera that captures an image of the driver, a biosensor that detects biometric information of the driver, a microphone that collects sound in the passenger compartment, and the like.
  • the biometric sensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of an occupant sitting on the seat or a driver holding the steering wheel.
  • The vehicle interior information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
  • the vehicle interior information detection unit 7500 may perform a process such as a noise canceling process on the collected audio signal.
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by a device that can be input by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
  • the integrated control unit 7600 may be input with data obtained by recognizing voice input through a microphone.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or may be an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera.
  • the passenger can input information using a gesture.
  • data obtained by detecting the movement of the wearable device worn by the passenger may be input.
  • the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600.
  • a passenger or the like operates the input unit 7800 to input various data or instruct a processing operation to the vehicle control system 7000.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like.
  • the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • General-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
  • the general-purpose communication I / F 7620 is connected to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. May be.
  • the general-purpose communication I / F 7620 is a terminal (for example, a driver, a pedestrian or a store terminal, or an MTC (Machine Type Communication) terminal) that exists in the vicinity of the vehicle using, for example, P2P (Peer To Peer) technology. You may connect with.
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol formulated for use in vehicles.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 7640 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
  • the beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station installed on the road, and acquires information such as the current position, traffic jam, closed road, or required time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I / F 7630 described above.
  • the in-vehicle device I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • the in-vehicle device I / F 7660 may establish a wireless connection using a wireless communication protocol such as a wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable).
  • The in-vehicle device 7760 includes, for example, at least one of a mobile device or a wearable device possessed by a passenger, and an information device carried into or attached to the vehicle. The in-vehicle device 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I / F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and may output a control command to the drive system control unit 7100.
  • The microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including collision avoidance or impact mitigation of the vehicle, following traveling based on the inter-vehicle distance, vehicle speed maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like. Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, or the like based on the acquired information about the surroundings of the vehicle.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and persons based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information about the current position of the vehicle.
  • the microcomputer 7610 may generate a warning signal by predicting a danger such as a collision of a vehicle, approach of a pedestrian or the like or an approach to a closed road based on the acquired information.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
  • Display unit 7720 may include at least one of an on-board display and a head-up display, for example.
  • the display portion 7720 may have an AR (Augmented Reality) display function.
  • the output device may be other devices such as headphones, wearable devices such as glasses-type displays worn by passengers, projectors, and lamps.
  • When the output device is a display device, the display device visually displays results obtained by various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it aurally.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be configured by a plurality of control units.
  • the vehicle control system 7000 may include another control unit not shown.
  • some or all of the functions of any of the control units may be given to other control units. That is, as long as information is transmitted and received via the communication network 7010, the predetermined arithmetic processing may be performed by any one of the control units.
  • Furthermore, a sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.
  • A computer program for realizing the functions of the control units 60, 260, 360, and 460 according to the first to fourth embodiments described with reference to FIGS. 2, 9, 14, and 17 can be implemented in any of the control units shown in FIG. 19. It is also possible to provide a computer-readable recording medium in which such a computer program is stored.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • In the vehicle control system 7000 described above, the control units 60, 260, 360, and 460 according to the first to fourth embodiments can be applied to the integrated control unit 7600 of the application example shown in FIG. 19.
  • For example, the image acquisition unit 61 and the brightness detection unit 62 of the control unit 60 according to the first embodiment correspond to the vehicle exterior information detection unit 7400 and the vehicle interior information detection unit 7500, the correction unit 63 and the storage unit 64 correspond to the microcomputer 7610 and the storage unit 7690 of the integrated control unit 7600, and the light emission control unit 65 corresponds to the body system control unit 7200.
  • At least some of the components of the control units 60, 260, 360, and 460 according to the first to fourth embodiments may be realized in a module for the integrated control unit 7600 shown in FIG. 19 (for example, an integrated circuit module configured on a single die).
  • Alternatively, the control units 60, 260, 360, and 460 according to the first to fourth embodiments may be realized by a plurality of control units of the vehicle control system 7000 illustrated in FIG. 19.
  • Note that the present technology may also adopt the following configurations.
  • an acquisition unit that acquires an external image of the vehicle photographed through a window glass of the vehicle;
  • a detection unit for detecting brightness information relating to the brightness of an object existing inside the vehicle and reflected on the window glass;
  • An information processing apparatus comprising: a correction unit that corrects the external image of the vehicle based on the detected brightness information.
  • the brightness information includes information on at least one of illuminance, luminance, and reflectance of the object.
  • the target object includes at least one of a dashboard, an interior part, and a placement object placed inside the vehicle.
  • The information processing apparatus according to any one of (1) to (3), wherein the window glass includes at least one of a front window glass, a side window glass, and a rear window glass.
  • the information processing apparatus corrects the luminance of the external image based on the brightness information of the object.
  • The information processing apparatus according to any one of (1) to (5), wherein the correction unit calculates, based on the brightness information, a luminance change amount of the external image caused by reflection of the object on the window glass.
  • The information processing apparatus according to any one of (1) to (6), wherein the detection unit detects the brightness information based on an output of a sensor unit that measures a parameter relating to the brightness of the object.
  • the information processing apparatus includes an illuminance sensor that measures illuminance of the object.
  • The information processing apparatus determines a region of the external image in which the object is reflected, and corrects, based on the illuminance of the object, the luminance of the region determined to contain the reflection of the object.
  • the sensor unit includes a plurality of the illuminance sensors arranged inside the vehicle according to the intensity of light reflected toward the window glass by the object.
  • The information processing apparatus according to any one of (8) to (10), further including a light emission control unit that controls the light emission intensity of at least one of a light and a display device mounted on the vehicle based on an output of the illuminance sensor.
  • The information processing apparatus according to any one of (7) to (11), wherein the sensor unit includes a position sensor capable of measuring a parameter relating to the brightness for each of a plurality of measurement points on the object, as well as the positions of the plurality of measurement points, and the detection unit detects the brightness at each of the plurality of measurement points on the object based on an output of the position sensor.
  • the position sensor is a range sensor that measures reflectance at each of the plurality of measurement points.
  • The information processing apparatus includes an illuminance sensor that measures the illuminance of the object, and the detection unit detects the brightness at each of the plurality of measurement points based on the illuminance of the object and the reflectance at each of the plurality of measurement points.
  • The information processing apparatus according to any one of (12) to (14), wherein the position sensor is a distance image sensor that measures the luminance at each of the plurality of measurement points.
  • the correction unit converts positions of the plurality of measurement points of the object measured by the position sensor into positions in the external image.
  • the information processing apparatus according to any one of (12) to (16), further including: an external state of the vehicle and an internal state of the vehicle based on an output of the position sensor
  • An information processing apparatus including a state detection unit that detects at least one.
  • the information processing apparatus according to any one of (1) to (17), The external image is taken by an imaging unit mounted inside the vehicle, The object transmits light along a first optical path that reflects light along a first optical path toward the imaging unit through the window glass, and a second optical path that is different from the first optical path. A second region to reflect, The information processing apparatus, wherein the detection unit detects a luminance difference between the first and second regions in the external image.
  • (19) An imaging apparatus comprising: an imaging unit mounted inside a vehicle; an acquisition unit that acquires an external image of the vehicle captured through a window glass of the vehicle by the imaging unit; a detection unit that detects brightness information relating to the brightness of an object that is present inside the vehicle and is reflected on the window glass; and a correction unit that corrects the external image of the vehicle based on the detected brightness information.
  • (20) An imaging system comprising: an imaging unit mounted inside a vehicle; an acquisition unit that acquires an external image of the vehicle captured through a window glass of the vehicle by the imaging unit; a detection unit that detects brightness information relating to the brightness of an object that is present inside the vehicle and is reflected on the window glass; and a correction unit that corrects the external image of the vehicle based on the detected brightness information.
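Items (12) to (16) above describe measuring brightness at a plurality of measurement points on the interior object with a position sensor and converting the measured point positions into positions in the external image. The following Python sketch is only an illustration of that kind of mapping under simplifying assumptions, not an implementation from the publication: it treats the window glass as a single flat mirror plane and the camera as an ideal pinhole, and every name and numeric value in it (mirror_about_plane, project_to_image, glass_normal, glass_offset, K, the dashboard points) is hypothetical.

    import numpy as np

    def mirror_about_plane(points: np.ndarray, n: np.ndarray, d: float) -> np.ndarray:
        """Mirror 3-D points about the plane n.x + d = 0 (the window glass).

        A point on the dashboard maps to the virtual position from which its
        reflection appears to originate when seen by the camera.
        """
        n = n / np.linalg.norm(n)
        dist = points @ n + d                     # signed distance of each point to the glass plane
        return points - 2.0 * dist[:, None] * n   # reflected (virtual) positions

    def project_to_image(points_cam: np.ndarray, K: np.ndarray) -> np.ndarray:
        """Pinhole projection of camera-frame 3-D points into pixel coordinates."""
        uvw = (K @ points_cam.T).T
        return uvw[:, :2] / uvw[:, 2:3]

    # Example with placeholder numbers: three measurement points on the dashboard,
    # a tilted glass plane, and a generic intrinsic matrix, all expressed in the
    # camera coordinate frame.
    dashboard_pts = np.array([[0.10, -0.30, 0.60],
                              [0.00, -0.30, 0.70],
                              [-0.10, -0.30, 0.60]])
    glass_normal, glass_offset = np.array([0.0, 0.5, -0.87]), 0.9
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])

    virtual_pts = mirror_about_plane(dashboard_pts, glass_normal, glass_offset)
    pixels = project_to_image(virtual_pts, K)   # where each measured point appears in the image

In a real vehicle the windshield is curved and only partially reflective, so such a mapping would more likely come from per-vehicle calibration than from a single ideal plane; the sketch only shows the kind of geometric conversion item (16) refers to.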

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Processing (AREA)

Abstract

An information processing device according to one aspect of the present invention includes an acquisition unit, a detection unit, and a correction unit. The acquisition unit acquires an external image of a vehicle, the external image being captured through a window glass of the vehicle. The detection unit detects brightness information relating to the brightness of an object that is present inside the vehicle and is reflected on the window glass. The correction unit corrects the external image of the vehicle based on the detected brightness information.
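Purely as an illustration of the acquisition, detection, and correction flow summarized above, the following Python sketch estimates the luminance offset that a reflected interior object (for example a dashboard) adds to the external image and subtracts it from the region where the object is judged to be reflected. Every name and numeric factor here (estimate_reflection_offset, correct_external_image, lux_to_dn, glass_reflectance, the mask) is an assumption introduced for this sketch and does not come from the publication.

    import numpy as np

    def estimate_reflection_offset(illuminance_lux: float,
                                   object_reflectance: float,
                                   glass_reflectance: float,
                                   lux_to_dn: float = 0.05) -> float:
        """Rough luminance offset (in digital numbers) added by the reflection.

        Light leaving the object scales with illuminance times its reflectance;
        only the fraction mirrored by the window glass reaches the camera, and
        lux_to_dn converts that into sensor digital numbers (assumed factor).
        """
        return illuminance_lux * object_reflectance * glass_reflectance * lux_to_dn

    def correct_external_image(image: np.ndarray,
                               reflection_mask: np.ndarray,
                               offset_dn: float) -> np.ndarray:
        """Subtract the estimated offset only where the object is judged to be reflected."""
        corrected = image.astype(np.float32)
        corrected[reflection_mask] -= offset_dn
        return np.clip(corrected, 0, 255).astype(image.dtype)

    # Example: a dashboard lit at 1200 lx with reflectance 0.3 and a glass that
    # reflects about 8 % of the light toward the camera.
    frame = np.full((720, 1280), 120, dtype=np.uint8)   # placeholder external image
    mask = np.zeros_like(frame, dtype=bool)
    mask[500:, :] = True                                # assumed reflection region
    corrected = correct_external_image(frame, mask,
                                       estimate_reflection_offset(1200.0, 0.3, 0.08))

How the offset is estimated and how the affected region is found would in practice depend on camera calibration and on the sensors described in the numbered items above; the sketch only shows where the detected brightness information enters the correction.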
PCT/JP2019/000364 2018-02-23 2019-01-09 Information processing device, imaging device, and imaging system WO2019163315A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-031013 2018-02-23
JP2018031013A JP2019145021A (ja) Information processing device, imaging device, and imaging system

Publications (1)

Publication Number Publication Date
WO2019163315A1 true WO2019163315A1 (fr) 2019-08-29

Family

ID=67687661

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/000364 WO2019163315A1 (fr) Information processing device, imaging device, and imaging system

Country Status (2)

Country Link
JP (1) JP2019145021A (fr)
WO (1) WO2019163315A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006308329A * 2005-04-26 2006-11-09 Denso Corp Rain sensor
JP2010079706A * 2008-09-26 2010-04-08 Mazda Motor Corp Object detection device for vehicle
JP2012220889A * 2011-04-13 2012-11-12 Fujitsu Ten Ltd Dimmer control device and display control device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210157002A1 (en) * 2019-11-21 2021-05-27 Yandex Self Driving Group Llc Methods and systems for computer-based determining of presence of objects
US11740358B2 (en) * 2019-11-21 2023-08-29 Yandex Self Driving Group Llc Methods and systems for computer-based determining of presence of objects
WO2021190873A1 * 2020-03-24 2021-09-30 Siemens Mobility GmbH Automated sensor monitoring
US20220141392A1 (en) * 2020-10-29 2022-05-05 Toyota Jidosha Kabushiki Kaisha Object detection apparatus

Also Published As

Publication number Publication date
JP2019145021A (ja) 2019-08-29

Similar Documents

Publication Publication Date Title
US10904503B2 (en) Image processing device, information generation device, and information generation method
JP6834964B2 (ja) Image processing device, image processing method, and program
US11076141B2 (en) Image processing device, image processing method, and vehicle
JP6764573B2 (ja) Image processing device, image processing method, and program
CN110574357B (zh) Imaging control device, method for controlling an imaging control device, and moving body
JP7226440B2 (ja) Information processing device, information processing method, imaging device, illumination device, and moving body
US20210218875A1 (en) Information processing apparatus and information processing method, imaging apparatus, mobile device, and computer program
WO2019163315A1 (fr) Information processing device, imaging device, and imaging system
US11585898B2 (en) Signal processing device, signal processing method, and program
US20220057203A1 (en) Distance measurement device and distance measurement method
US20230219495A1 (en) Signal processing device, light adjusting control method, signal processing program, and light adjusting system
WO2016203989A1 (fr) Image processing device and method
JP2018032986A (ja) Information processing device and method, vehicle, and information processing system
WO2018042815A1 (fr) Image processing device and image processing method
JP7059185B2 (ja) Image processing device, image processing method, and imaging device
CN111868778B (zh) Image processing device, image processing method, and storage medium
CN114788257A (zh) Information processing device, information processing method, program, imaging device, and imaging system
WO2023234033A1 (fr) Distance measuring device
WO2021229983A1 (fr) Image capturing device and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19756903

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19756903

Country of ref document: EP

Kind code of ref document: A1