US12214733B2 - Camera unit installing method, moving device, image processing system, image processing method, and storage medium - Google Patents
- Publication number
- US12214733B2 (application US17/933,676)
- Authority
- US
- United States
- Prior art keywords
- area
- image
- camera unit
- optical system
- unit
- Prior art date
- Legal status
- Active, expires
Classifications
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
- G03B30/00—Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
- G03B37/06—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe involving anamorphosis
- G03B43/00—Testing correct operation of photographic apparatus or parts thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
- B60R2011/004—Arrangements for holding or mounting articles, not otherwise provided for characterised by position outside the vehicle
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
- B60R2300/402—Image calibration
- G02B13/001—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
Definitions
- a camera which is the imaging device that captures an image for the electronic rearview mirror is required to have a high resolution such that a driver can more accurately confirm a relatively far rear view.
- a camera for the rearview confirmation system is required to image a broader range such that a driver can confirm safety in a broader area including rear blind spots or rear-lateral angles of the vehicle to avoid collision at the time of rearward movement or the like.
- an objective of the present disclosure is to provide a camera unit installing method that can facilitate settings or the like of a viewing angle in consideration of the aforementioned circumstances.
- FIG. 3 is a functional block diagram illustrating a configuration of an image processing system according to the first embodiment.
- FIG. 6 A is a diagram illustrating a top view of a vehicle according to a second embodiment
- FIG. 6 B is a diagram illustrating a left side view of the vehicle according to the second embodiment
- FIG. 6 C is a diagram illustrating a front view of the vehicle according to the second embodiment.
- FIG. 1 is a diagram illustrating a positional relationship between camera units and a vehicle according to the first embodiment.
- camera units 11 , 12 , 13 , and 14 are installed, for example, on front, right, rear, and left sides of a vehicle 1 which is a mobile object (a mobile object body), respectively.
- four camera units are provided, but the number of camera units is not limited to four and at least one camera unit may be provided.
- the camera units 11 to 14 are installed to image a front area, a right area, a left area, and a rear area of the vehicle 1 which is a mobile object as imaging areas.
- the camera units 11 to 14 have substantially the same configuration, each including an imaging device that captures an optical image and an optical system that forms an optical image on a light receiving surface of the imaging device.
- the center of gravity of the boundary 93 may not match the position at which the optical axis of the optical system crosses the light receiving surface.
- when the center of gravity of the boundary 93 substantially matches the position at which the optical axis of the optical system crosses the light receiving surface, it is possible to facilitate optical design, to obtain stable optical characteristics, and to reduce the load of distortion correction.
- the optical system having the projection characteristic y( ⁇ ) satisfying the conditions of Expression 1 may be referred to as a different-angle-of-view lens.
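- To make the condition concrete: Expression 1 (given later in the Description as 1 < f × sin(θmax)/y(θmax) ≤ 1.9) can be checked numerically. The following is a minimal sketch with hypothetical focal length and image height values, not figures taken from the patent.

```python
import numpy as np

def is_different_angle_of_view(f_mm, theta_max_rad, y_theta_max_mm):
    """Check Expression 1: 1 < f * sin(theta_max) / y(theta_max) <= 1.9.

    A ratio above 1 means the image height at the maximum half angle theta_max
    is compressed relative to an orthographic projection f * sin(theta), so the
    central (high-resolution) area stays magnified while the periphery is squeezed.
    """
    ratio = f_mm * np.sin(theta_max_rad) / y_theta_max_mm
    return 1.0 < ratio <= 1.9

# Hypothetical lens: f = 2.8 mm, 90-degree maximum half angle, y(theta_max) = 2.0 mm
print(is_different_angle_of_view(2.8, np.deg2rad(90.0), 2.0))  # ratio 1.4 -> True
```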
- FIG. 3 is a functional block diagram illustrating the configuration of the image processing system according to the first embodiment.
- R, G, R, and G signals are sequentially output, for example, from a predetermined row of the Bayer layout.
- G, B, G, and B signals are sequentially output from a neighboring row.
- the image processing units 31 a to 34 a perform de-Bayer processing on image data input according to the Bayer layout from the imaging units 21 to 24 and convert the image data to image data in an RGB raster format.
- the image processing units perform various correction processes such as white balance adjustment, gain offset adjustment, gamma processing, color matrix processing, and reversible compression.
- so-called RAW image signals are formed without performing irreversible compression or the like.
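- As a rough illustration of the de-Bayer step performed by the image processing units 31 a to 34 a (a minimal half-resolution sketch, not the actual conversion used in the embodiment), each 2×2 RGGB cell of the Bayer mosaic can be collapsed into one RGB pixel:

```python
import numpy as np

def debayer_rggb_halfres(raw):
    """Collapse each 2x2 RGGB cell of a Bayer mosaic into one RGB pixel.

    raw: (H, W) array with even H and W, laid out as rows of
         R G R G ...  followed by  G B G B ...
    Returns an (H/2, W/2, 3) float32 RGB image (the two greens are averaged).
    """
    r = raw[0::2, 0::2].astype(np.float32)
    g = (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float32)
    return np.stack([r, g, b], axis=-1)

mosaic = np.random.randint(0, 4096, (480, 640), dtype=np.uint16)  # fake 12-bit RAW frame
print(debayer_rggb_halfres(mosaic).shape)  # (240, 320, 3)
```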
- the recognition units 31 b to 34 b may cut out the RAW image signals acquired from the high-resolution areas 10 a and perform the image recognizing process on only the RAW image signals acquired from the high-resolution areas 10 a.
- it is preferable that the area cut out for image recognition at that time have a rectangular shape appropriate for the image recognizing process.
- the cut-out rectangular area may be only a part of each high-resolution area 10 a (for example, a rectangle inscribing the high-resolution area 10 a ) or may be a rectangular shape including both the high-resolution area 10 a and the low-resolution area 10 b.
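- One way to obtain such a crop (a sketch assuming the boundary 93 is a circle whose center and radius are known in pixel coordinates; the function name and parameters are illustrative) is to take the axis-aligned square inscribed in the high-resolution circle, clamped to the sensor:

```python
import math

def inscribed_crop(cx, cy, radius, img_w, img_h):
    """Return (x0, y0, x1, y1) of the largest axis-aligned square inscribed in
    the circular high-resolution boundary, clamped to the image bounds."""
    half = radius / math.sqrt(2.0)  # half of the side of the inscribed square
    x0, y0 = max(0, int(cx - half)), max(0, int(cy - half))
    x1, y1 = min(img_w, int(cx + half)), min(img_h, int(cy + half))
    return x0, y0, x1, y1

# Circle centered on a 1920x1080 sensor with a 500 px radius
print(inscribed_crop(cx=960, cy=540, radius=500, img_w=1920, img_h=1080))
```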
- the recognition units 31 b to 34 b receive prediction information which is a set of information on an object type and a moving direction of the corresponding object or preferential recognition area information from an integrated control unit 41 c of the integrated processing unit 40 . This prediction information will be described later.
- the camera information units 31 c to 34 c store camera information of the camera units 11 to 14 in a memory in advance.
- the camera information units may temporarily store information from various sensors and the like provided in the camera units 11 to 14 .
- the camera information includes, for example, the characteristic information (such as resolution boundary information) illustrated in FIG. 2 of optical images formed by the different-angle-of-view lenses 21 c to 24 c.
- the camera information also includes the numbers of pixels of the imaging devices 21 d to 24 d , position coordinate and posture (such as pitch, roll, and yaw) information in a vehicle coordinate system of the camera units, and imaging directions.
- the camera information may include information such as gamma characteristics, sensitivity characteristics, and frame rates.
- the camera information may include information on an image processing method or an image format when the RAW image signals are generated by the image processing units 31 a to 34 a.
- the position coordinates for attachment may be stored in a memory in the corresponding camera information unit in advance because the attachment position of each camera unit on the vehicle is often determined.
- Posture coordinates of a camera unit are coordinates relative to the vehicle 1 and may be acquired using an encoder or the like (not illustrated) provided in the corresponding camera unit. Alternatively, the posture coordinates of each camera unit may be acquired using a three-dimensional acceleration sensor or the like.
- the information on an imaging direction may be acquired, for example, using a geomagnetic sensor.
- the resolution boundary information of a camera is determined by lens design and thus is stored in the memory of the camera information unit in advance.
- the camera information is specific to each of the imaging units 21 to 24 and differs between them; it is transmitted to the integrated processing unit 40 and is referred to by the integrated processing unit 40 at the time of performing image processing or the like.
- the camera information units 31 c to 34 c serve as a storage unit that stores characteristic information of the optical characteristics thereof or position and posture information of the corresponding camera unit.
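- The camera information held in this storage unit can be pictured as follows (a sketch; the field names are illustrative, not terms from the patent):

```python
from dataclasses import dataclass

@dataclass
class CameraInfo:
    """Per-camera-unit information referred to by the integrated processing unit."""
    num_pixels: tuple             # (width, height) of the imaging device
    position_xyz: tuple           # attachment position in the vehicle coordinate system
    posture_rpy: tuple            # roll, pitch, yaw relative to the vehicle 1
    imaging_direction_deg: float  # e.g. heading obtained from a geomagnetic sensor
    resolution_boundary: dict     # boundary 93, e.g. {"cx": ..., "cy": ..., "radius": ...}
    gamma: float = 2.2            # optional characteristics
    frame_rate: float = 30.0

rear_cam = CameraInfo(
    num_pixels=(1920, 1080),
    position_xyz=(-2.0, 0.0, 0.9),   # determined at design time, stored in advance
    posture_rpy=(0.0, -5.0, 180.0),  # read from an encoder or acceleration sensor
    imaging_direction_deg=180.0,
    resolution_boundary={"cx": 960, "cy": 540, "radius": 500},
)
```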
- a CPU which is a computer and a memory which is a storage medium storing a computer program are provided in the camera processing units 31 to 34 .
- the CPU is configured to control constituents of the camera processing units 31 to 34 by executing the computer program in the memory.
- the image processing units 31 a to 34 a or the recognition units 31 b to 34 b are configured by hardware such as a dedicated circuit (ASIC) or a processor (a reconfigurable processor or a DSP). Accordingly, it is possible to realize an increase in image recognition speed in a high-resolution area and to enhance the possibility of avoiding an accident.
- the image processing units 31 a to 34 a may have a distortion correcting function.
- Some or all of the functional blocks in the camera processing units 31 to 34 may be realized by causing the CPU to execute a computer program stored in the memory. In this case, it is preferable to increase a processing speed of the CPU.
- Reference sign 40 denotes an integrated processing unit and includes a system on chip (SOC)/field programmable gate array (FPGA) 41 , a CPU 42 which is a computer, and a memory 43 which is a storage medium.
- the CPU 42 performs various types of control of the image processing system 100 as a whole by executing a computer program stored in the memory 43 .
- the integrated processing unit 40 is accommodated in a housing different from that for the camera units.
- the SOC/FPGA 41 includes an image processing unit 41 a , a recognition unit 41 b , and an integrated control unit 41 c .
- the image processing unit 41 a acquires RAW image signals from the corresponding camera processing units 31 to 34 and acquires camera information of the camera units 11 to 14 from the camera information units 31 c to 34 c.
- the camera information includes optical characteristics of the different-angle-of-view lenses 21 c to 24 c , the numbers of pixels, photoelectric conversion characteristics, gamma characteristics, and sensitivity characteristics of the imaging devices 21 d to 24 d , format information of the RAW image signals, and position coordinates and posture information in the vehicle coordinate system of the camera units.
- the image processing unit 41 a acquires camera information such as characteristic information of the optical system.
- the image processing unit 41 a performs resolution conversion on the RAW image signals from the camera processing units 31 to 34 on the basis of the acquired camera information and performs an image processing step such as distortion correction on image signals acquired from the low-resolution area 10 b of the imaging units 21 to 24 .
- the image processing unit 41 a performs distortion correction on an image signal from a distortion-correction area on the basis of the optical characteristics, and synthesizes the distortion-corrected image signal and an image signal from a non-distortion-correction area which has not been subjected to distortion correction to generate a composite image. That is, the image processing unit 41 a also serves as a display signal generating unit and performs a display signal generating step of generating a composite image by performing distortion correction or the like.
- the distortion-correction area in the first embodiment can be set by a user or automatically.
- since the image signal acquired from the high-resolution area 10 a is hardly distorted, the image processing unit 41 a does not perform distortion correction on the image signal acquired from the high-resolution area 10 a .
- the image processing unit 41 a may also perform simplified distortion correction on the image signal acquired from the high-resolution area 10 a .
- the image processing unit 41 a appropriately performs an irreversible compression process or the like on the RAW image signals from the camera processing units 31 to 34 .
- the image processing unit 41 a synthesizes the image signal from the low-resolution area 10 b of each of the imaging units 21 to 24 having performed distortion correction and the image signal from the high-resolution area 10 a such that the image signals join smoothly to form the whole image for each of the imaging units 21 to 24 .
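- The smooth join can be sketched as an alpha blend across the boundary 93 (a minimal example assuming both images have already been warped into one common geometry; the feathering width is an illustrative parameter):

```python
import numpy as np

def composite_by_boundary(corrected, uncorrected, cx, cy, radius, feather=20):
    """Blend a distortion-corrected image (used outside boundary 93) with the
    uncorrected high-resolution area (used inside it), feathering the seam over
    `feather` pixels so the two areas join smoothly. Inputs: (H, W, 3) arrays."""
    h, w = corrected.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(xx - cx, yy - cy)
    # 1.0 inside the high-resolution circle, 0.0 outside, ramped across the seam
    alpha = np.clip((radius - dist) / feather, 0.0, 1.0)[..., None]
    return (alpha * uncorrected + (1.0 - alpha) * corrected).astype(corrected.dtype)
```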
- the recognition unit 41 b performs an image recognizing process on the whole image of each of the imaging units 21 to 24 in which distortion correction has been performed on at least the low-resolution area and recognizes a predetermined object (for example, a vehicle, a person, or an obstacle) in the whole image of each of the imaging units 21 to 24 . That is, after distortion correction has been performed on an image signal corresponding to at least the low-resolution area (high-distortion area), the recognition unit 41 b performs image recognition and outputs a second image recognition result.
- the recognition unit 41 b also refers to the recognition results (types or coordinates of an object) from the recognition units 31 b to 34 b . It is described above that the recognition unit 41 b performs image recognition on the whole image of each of the imaging units 21 to 24 , but image recognition does not have to be performed on the whole image. For example, a peripheral part of an image may not be subjected to image recognition.
- the recognition unit 41 b has only to recognize, for example, an area including the areas recognized by the recognition units 31 b to 34 b and wider than the areas.
- the recognition unit 41 b serves as a second image recognizing unit that performs image recognition on an image signal of an area including a partial area subjected to image recognition by the first image recognizing unit and wider than the partial area out of the image signals acquired by the image acquiring unit and outputs a second image recognition result.
- the second image recognizing unit performs image recognition on a composite image into which image signals corresponding to the high-resolution area 10 a which is a low-distortion area and the low-resolution area 10 b which is a high-distortion area are synthesized and outputs the second image recognition result.
- the image processing unit 41 a forms a panoramic composite image by synthesizing the images from the camera units 12 to 14 which are a plurality of imaging units such that the images join.
- it is preferable that the images of the plurality of imaging units to be joined be set such that at least parts of their imaging viewing angles overlap by a predetermined amount or greater.
- the camera units 12 and 13 may be disposed such that the imaging areas thereof overlap each other as will be described later.
- the camera units 13 and 14 may be disposed such that the imaging areas thereof overlap each other. At this time, the imaging areas of the low-distortion areas of at least two image acquiring units may overlap each other.
- the recognition unit 41 b performs image recognition on the panoramic composite image. Accordingly, for example, it is possible to recognize an image of an object which is imaged to extend over the viewing angles of a plurality of imaging units. This is because the whole image of an object may not be recognized from the individual whole images from the imaging units, but substantially the whole image of the object may appear in the panoramic composite image and the image of the object may be able to be recognized through image processing.
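- Joining two images whose viewing angles overlap can be sketched as a linear cross-fade over the known overlap strip (real synthesis would also use the position and posture information of each camera; the fixed pixel overlap here is an assumption):

```python
import numpy as np

def join_with_overlap(left_img, right_img, overlap_px):
    """Concatenate two equally sized (H, W, 3) images whose viewing angles
    overlap by `overlap_px` columns, cross-fading the overlapping strip."""
    h, w = left_img.shape[:2]
    t = np.linspace(0.0, 1.0, overlap_px)[None, :, None]  # 0 -> 1 across the strip
    seam = (1.0 - t) * left_img[:, w - overlap_px:] + t * right_img[:, :overlap_px]
    return np.hstack([left_img[:, : w - overlap_px],
                      seam.astype(left_img.dtype),
                      right_img[:, overlap_px:]])
```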
- the integrated control unit 41 c outputs an integrated image recognition result by employing the recognition result with higher reliability.
- a proportion occupied by an object in the image recognized by the recognition units 31 b to 34 b and a proportion occupied by the same object recognized by the recognition unit 41 b in a screen may be compared, and the recognition result with the larger proportion may be determined to have higher reliability and be employed.
- the recognition result from the recognition unit 41 b may be determined to have higher reliability than the recognition result from the recognition units 31 b to 34 b and be employed.
- this recognition result may be determined to have lower reliability, and the recognition result from the recognition unit 41 b may be determined to have higher reliability and be employed.
- the recognition unit 41 b may perform image recognition on only the low-resolution area in a state in which distortion correction has been performed on the low-resolution area, and may perform image recognition on an object extending over the low-resolution area and the high-resolution area when there is such an object. That is, an object which is located in only the high-resolution area may be considered to have high reliability of recognition using the recognition units 31 b to 34 b and may not be subjected to an image recognizing process by the recognition unit 41 b.
- the integrated control unit 41 c serves as an integrated processing unit that outputs an integrated image recognition result on the basis of reliability of the first image recognition result and reliability of the second image recognition result.
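- The reliability comparison described above can be pictured with a small sketch (the dictionary fields are illustrative; the screen-proportion heuristic mirrors the comparison of proportions mentioned earlier):

```python
def integrate_recognition(first_result, second_result):
    """Choose between the first (per-camera) and second (integrated, after
    distortion correction) image recognition results for one object.

    Each result is a dict such as
    {"type": "pedestrian", "bbox_area": 1200.0, "frame_area": 2073600.0};
    the result whose object occupies the larger proportion of the screen is
    treated as the more reliable one.
    """
    p1 = first_result["bbox_area"] / first_result["frame_area"]
    p2 = second_result["bbox_area"] / second_result["frame_area"]
    return first_result if p1 >= p2 else second_result
```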
- the integrated control unit 41 c forms a signal for displaying a desired image on a first display unit 50 and a second display unit 51 out of the whole images from the imaging units 21 to 24 , the panoramic composite image, and the like.
- the integrated control unit 41 c generates a frame for emphasizing a recognized object, information on a type, a size, a position, a speed, and the like of the object, a computer graphics (CG) for warning, and the like.
- the integrated control unit 41 c may generate a CG of a boundary image for displaying a boundary on the basis of characteristic information of the optical system such as display resolution boundary information acquired from the camera information units 31 c to 34 c.
- the integrated control unit 41 c performs a display process of superimposing such a CG or text on an image.
- the first display unit 50 , the second display unit 51 , and the like serve as a display unit and are configured to display an image signal or an integrated image recognition result.
- the integrated control unit 41 c is configured to allow a plurality of camera units to share information on a recognized object. That is, for example, it is assumed that an object recognized by the camera unit 14 is moving to the viewing angle of the camera unit 11 . In this case, the integrated control unit 41 c transmits prediction information including information on the type of the object and the moving direction of the object or preferential recognition area information to the recognition unit 31 b of the camera unit 11 .
- the integrated control unit 41 c performs communication with the travel control unit (ECU) 60 and the like via a communication unit (not illustrated) provided therein using a protocol such as CAN, FlexRay, or Ethernet. Accordingly, the integrated control unit 41 c performs a display process of appropriately changing information to be displayed on the basis of a vehicle control signal received from the travel control unit (ECU) 60 . That is, a range of an image to be displayed on the display unit or the like is changed, for example, according to a moving state of the vehicle acquired from the vehicle control signal.
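- Changing the displayed range according to the moving state can be sketched as a simple mapping from the vehicle control signal to a view selection (the signal fields and view names are illustrative assumptions):

```python
def select_display_range(vehicle_signal):
    """Pick which image range to show based on the moving state carried in the
    vehicle control signal received from the travel control unit (ECU)."""
    if vehicle_signal.get("shift_gear") == "reverse":
        return "rear_wide"  # broad rear view, including blind spots, for backing up
    if vehicle_signal.get("blinker") in ("left", "right"):
        return vehicle_signal["blinker"] + "_lateral"  # emphasize the turning side
    if vehicle_signal.get("speed_kmh", 0) > 60:
        return "rear_narrow_high_res"  # far rear view for the electronic rearview mirror
    return "default"

print(select_display_range({"shift_gear": "reverse"}))  # rear_wide
```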
- the travel control unit (ECU) 60 is a unit that is mounted in the vehicle 1 and includes a computer or a memory for comprehensively performing drive control, direction control, and the like of the vehicle 1 .
- as the vehicle control signal, for example, information on traveling (a moving state) of the vehicle such as a traveling speed, a traveling direction, a shift lever, a shift gear, and a blinker's state, as well as the direction of the vehicle from a geomagnetic sensor or the like, is input from the travel control unit (ECU) 60 to the integrated processing unit 40 .
- the integrated control unit 41 c transmits information such as a type, a position, a moving direction, and a moving speed of a predetermined object (such as an obstacle) recognized by the recognition unit 41 b to the travel control unit (ECU) 60 .
- the travel control unit (ECU) 60 performs control required for avoidance of an obstacle such as stopping and driving of the vehicle and changing of the traveling direction.
- the travel control unit (ECU) 60 serves as a movement control unit that controls movement of a vehicle which is a mobile object (a moving device) on the basis of the integrated image recognition result.
- the first display unit 50 may be installed in the vicinity of the center in a vehicle width direction of a front-upper part of a driver's seat of the vehicle 1 such that a display screen thereof faces the rear of the vehicle, and serves as an electronic rearview mirror.
- the first display unit 50 may be used as a mirror when it is not used as a display.
- the first display unit 50 may be configured to include a touch panel or an operation button, to acquire an instruction from a user, and to output information to the integrated control unit 41 c.
- the second display unit 51 is installed, for example, near an instrument panel in the vicinity of the center in the vehicle width direction of the front part of the driver's seat of the vehicle 1 .
- a navigation system, an audio system, and the like are mounted in the vehicle 1 which is a mobile object (a moving device).
- various control signals from the navigation system, the audio system, and the travel control unit (ECU) 60 can also be displayed on the second display unit.
- the second display unit is configured to include a touch panel or an operation button and to acquire an instruction from a user.
- the second display unit 51 may be, for example, a display unit of a tablet terminal.
- the second display unit 51 may be configured to display an image through wired connection to the integrated processing unit 40 or may be configured to wirelessly receive an image via a communication unit 62 and to display the received image.
- a liquid crystal display panel, an organic EL display panel, or the like can be used as a display panel of the first display unit 50 or the second display unit 51 .
- the number of display units is not limited to three.
- Some or all of the functional blocks included in the integrated processing unit 40 and the like may be realized in hardware or may be realized by causing the CPU 42 to execute a computer program stored in the memory 43 .
- a dedicated circuit (ASIC), a processor (a reconfigurable processor or a DSP), or the like can be used as the hardware.
- Some or all image processes which are performed by the image processing units 31 a to 34 a may be performed by the image processing unit 41 a of the integrated processing unit 40 . That is, in the first embodiment, for example, the image acquiring unit and the first image recognizing unit are accommodated in the same housing of the camera unit, and the camera unit and the second image recognizing unit are accommodated in different housings. However, for example, the first image recognizing unit along with the second image recognizing unit may be accommodated in the housing of the integrated processing unit 40 .
- the integrated processing unit 40 is mounted in a vehicle 1 which is a mobile object, but some processes of the image processing unit 41 a , the recognition unit 41 b , and the integrated control unit 41 c of the integrated processing unit 40 may be performed, for example, by an external server or the like via a network.
- the imaging units 21 to 24 which are the image acquiring unit are mounted in the vehicle 1 which is a mobile object, but, for example, some functions of the camera processing units 31 to 34 or the integrated processing unit 40 may be performed by an external server or the like. Some or all functions of the integrated processing unit 40 may be provided in the travel control unit (ECU) 60 .
- Reference sign 61 denotes a storage unit, which stores whole images of the imaging units 21 to 24 generated by the integrated processing unit 40 or a panoramic composite image.
- a CG such as a predetermined frame indicating a recognized object, text, or warning, or an image overlapped with the CG and displayed on the first display unit 50 and the second display unit 51 , and the like is stored along with time or GPS information.
- the integrated processing unit 40 can also regenerate past information stored in the storage unit 61 and display the information on the first display unit 50 or the second display unit 51 .
- Reference sign 62 denotes a communication unit, which communicates with an external server or the like via a network and which can transmit information not yet stored in the storage unit 61 , or past information already stored in the storage unit 61 , to the external server or the like for storage there.
- an image may be transmitted to an external tablet terminal or the like and displayed on the second display unit 51 which is a display unit of the tablet terminal.
- the communication unit 62 can acquire congestion information or various types of information from an external server or the like and display the acquired information on the first display unit 50 or the second display unit 51 via the integrated processing unit 40 .
- Reference sign 63 denotes an operation unit, which is used to input various instructions to the image processing system in response to a user's operation.
- the operation unit includes, for example, a touch panel or an operation button.
- FIGS. 4 A to 4 D are diagrams illustrating a relationship between a light receiving surface of an imaging device of a camera unit and a high-resolution area and a low-resolution area according to the first embodiment.
- FIG. 4 A is a diagram illustrating an example of a relationship between the optical system of the rear camera unit 13 and the light receiving surface 130 of the imaging device according to the first embodiment
- FIG. 4 B is a diagram illustrating an example of a distortion-corrected image of the rear camera unit 13 according to the first embodiment.
- FIG. 4 C is a diagram illustrating an example of a relationship between the optical system of the left camera unit 14 and the light receiving surface 140 of the imaging device according to the first embodiment
- FIG. 4 D is a diagram illustrating an example of a distortion-corrected image of the left camera unit 14 according to the first embodiment.
- in FIG. 4 A , for example, an image illustrated in FIG. 2 A is formed on the light receiving surface 130 of the imaging device of the rear camera unit 13 .
- the substantial center of the light receiving surface 130 of the imaging device and the center of the high-resolution area 10 a (the optical axis of the optical system) are disposed to substantially match.
- Reference sign 93 denotes a boundary between the high-resolution area 10 a and the low-resolution area 10 b , and the center of the boundary 93 substantially matches, for example, the substantial center of the light receiving surface 130 of the imaging device of the rear camera unit 13 .
- the center of the boundary 93 (the high-resolution area 10 a ) matches a position at which the optical axis of the optical system crosses the light receiving surface, but may not match the position.
- the center of the boundary 93 is disposed substantially at the center of the screen.
- distortion correction may be performed on the image of the high-resolution area 10 a , or it may be omitted.
- when the image of the high-resolution area 10 a on which distortion correction has not been performed and the image of the low-resolution area (high-distortion area) 10 b on which distortion correction has been performed are synthesized, a process for causing the boundary parts to join smoothly is necessary.
- Distortion correction in the first embodiment is a correction process for reducing distortion and includes a process in which distortion is not zero.
- distortion may be left in a part such as a peripheral part.
- An area having distortion correction not performed thereon is not limited to a circular high-resolution area (low-distortion area) 10 a as illustrated in FIG. 4 B.
- the area may have another shape such as a rectangular shape or a size or a position thereof may be changed.
- an image of an imaging viewing angle 13 a of the rear camera unit 13 is formed in the high-resolution area (low-distortion area) 10 a of the light receiving surface 130 as illustrated in FIG. 4 A .
- An image of an imaging viewing angle 13 b is formed in the low-resolution area (high-distortion area) 10 b of the light receiving surface 130 .
- the same relationship as illustrated in FIGS. 4 A and 4 B is established for the front camera unit 11 .
- the center of the high-resolution area (low-distortion area) 10 a of the left camera unit 14 is deviated from the center of the light receiving surface 140 of the imaging device to an upper side in the drawing (in a first direction) as illustrated in FIG. 4 C . That is, in the left camera unit 14 , the optical system and the imaging device are disposed such that the center of the boundary 93 is deviated from the center of the light receiving surface 140 of the imaging device to the upper side in the drawing (in the first direction).
- the first direction is also referred to as a vignetting direction F. Accordingly, it is possible to control a range of the low-resolution area 10 b which is formed on the imaging device.
- the imaging area of the low-resolution area 10 b can be extended to a direction opposite to the vignetting direction F.
- a plurality of pixels are arranged in rows and columns on the light receiving surface 140 of the imaging device, and photoelectric conversion and reading are performed row by row in a predetermined second direction (a vertical scanning direction), sequentially from the pixels in a predetermined row.
- the light receiving surface is rectangular, and the number of pixels in the length direction is larger than the number of pixels in the width direction.
- charge signals photoelectrically converted by the pixels of the light receiving surface of the imaging device are sequentially read row by row from the upper-left end to the lower-right end, with the length direction of the rectangle defined as the lateral direction.
- the first direction is opposite to the second direction.
- the center of the boundary 93 is deviated to the upper side of the screen as illustrated in FIG. 4 D . Therefore, a part of the viewing angle in the upward direction (the vignetting direction F) in FIGS. 4 C and 4 D is lost, and the viewing angle in the downward direction can be widened in comparison with the example illustrated in FIG. 4 A . As a result, it is possible to effectively use the pixels on the light receiving surface of the imaging device.
- the same relationship as illustrated in FIGS. 4 C and 4 D is established for the right camera unit 12 .
- the first direction is set to a direction opposite to the second direction, but the first direction and the second direction may be the same direction. In this case, a process for changing the direction of an image is necessary.
- the first direction may be set to a direction perpendicular to the second direction. In this case, a maximum number of pixels may not be effectively used.
- the first direction and the second direction may be set to have an appropriate angle.
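- The effect of deviating the boundary center toward the vignetting direction can be quantified with a small sketch (hypothetical sensor geometry in pixels; row counts stand in for viewing angle):

```python
def low_res_extent(sensor_h, boundary_cy, boundary_radius):
    """Rows of peripheral (low-resolution) area remaining above and below the
    high-resolution circle; shifting the circle center up (the vignetting
    direction) trades lost rows at the top for extra rows at the bottom."""
    above = max(0, boundary_cy - boundary_radius)
    below = max(0, sensor_h - (boundary_cy + boundary_radius))
    return above, below

print(low_res_extent(1080, 540, 500))  # (40, 40): centered, symmetric periphery
print(low_res_extent(1080, 340, 500))  # (0, 240): part of the circle is vignetted,
                                       # and the opposite side of the view is extended
```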
- FIGS. 5 A to 7 B are diagrams illustrating an arrangement example of the left camera unit 14
- FIGS. 5 A and 5 B are diagrams illustrating an example of the first embodiment in which the imaging device of the camera unit 14 is vertically arranged and the optical axis of the optical system is directed slightly downward with respect to the horizontal direction on the rear-left side of the vehicle 1
- FIG. 5 A is a diagram illustrating a top view of the vehicle 1 according to the first embodiment
- FIG. 5 B is a diagram illustrating a left side view of the vehicle 1 according to the first embodiment.
- the upward direction (the vignetting direction F) of the imaging device in FIG. 4 C is disposed to face the vehicle 1 as illustrated in FIG. 5 A . That is, the first direction (the vignetting direction F) is a direction perpendicular to the vertical direction and is disposed to face the mobile object.
- since the vehicle body, which is an untargeted area, is not imaged on the imaging device, the imaging viewing angle 14 b corresponding to the low-resolution area 10 b that can be imaged is extended to the front-left side. Accordingly, it is possible to facilitate recognition of an obstacle or the like on the front-left side.
- the number of pixels in the height direction (the Z direction) is larger than the number of pixels in the width direction (the Y direction) of the vehicle 1 , and the optical axis of the optical system is directed outward with respect to a side line of the vehicle 1 in the length direction (the X direction) of the vehicle 1 .
- the optical axis is indicated by a one-dot chain line.
- the optical axis of the optical system is disposed in the horizontal direction or downward with respect to the horizontal direction.
- the high-resolution area 10 a can be used more for the electronic rearview mirror display, a driver can accurately confirm the rear-lateral side, and a blind spot on the front-lateral side can be imaged with the low-resolution area 10 b .
- the viewing angle of the low-resolution area 10 b in the height direction of the vehicle 1 can include a front wheel and thus it is possible to sufficiently visually recognize a blind spot on the front-lateral side, which is very effective when the vehicle parks in a narrow space or the like.
- in FIGS. 5 A and 5 B , the left camera unit 14 is illustrated, but the right camera unit 12 can be disposed to be symmetric therewith. In this case, the same advantages can be achieved.
- the image processing unit 41 a of the integrated processing unit 40 rotates and displays an image from the camera unit 14 on the basis of position and posture information of a camera in the camera information.
- the image from the camera unit 14 is synthesized with an image from the rear camera unit 13 and the composite image is displayed according to necessity.
- the image processing unit 41 a rotates and displays an image from the camera unit 12 on the basis of position and posture information of a camera in the camera information.
- the image from the camera unit 12 is synthesized with an image from the rear camera unit 13 and the composite image is displayed according to necessity.
- the image processing unit 41 a displays the images from the camera units 11 and 13 without rotating the images on the basis of the position and posture information of the camera. Alternatively, the image is synthesized with another image.
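- The rotate-or-pass-through decision can be sketched as follows (a minimal example assuming the roll angle in the posture information is a multiple of 90 degrees; side cameras with vertically arranged imaging devices get a quarter turn, the front and rear cameras none):

```python
import numpy as np

def orient_for_display(image, camera_roll_deg):
    """Rotate a camera image upright for display using the roll angle from the
    camera's posture information in the camera information."""
    quarter_turns = int(round(camera_roll_deg / 90.0)) % 4
    return np.rot90(image, k=quarter_turns)

img = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(orient_for_display(img, 90.0).shape)  # (1920, 1080, 3): side camera, rotated
print(orient_for_display(img, 0.0).shape)   # (1080, 1920, 3): rear camera, unchanged
```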
- the camera unit includes an imaging device and an optical system configured to form an optical image on a light receiving surface of the imaging device, and the light receiving surface includes a high-resolution area and a low-resolution area in a peripheral part of the high-resolution area.
- the optical system and the imaging device are disposed such that the gravity center of the high-resolution area deviates in a first direction from the center of the light receiving surface, and the camera unit is installed such that the first direction is directed to an area other than a predetermined targeted area.
- since an image of the untargeted area is not formed on the imaging device and the imaging viewing angle 14 b corresponding to the low-resolution area 10 b that can be imaged is extended toward the targeted area, it is possible to easily recognize an obstacle or the like in the targeted area.
- FIGS. 6 A to 6 C are diagrams illustrating an example of a second embodiment in which the imaging device of the camera unit 14 is vertically disposed and the optical axis of the optical system is disposed just downward on the rear-left side of a vehicle 1 .
- FIG. 6 A is a diagram illustrating a top view of the vehicle 1 according to the second embodiment
- FIG. 6 B is a diagram illustrating a left side view of the vehicle 1 according to the second embodiment
- FIG. 6 C is a diagram illustrating a front view of the vehicle 1 according to the second embodiment.
- the upward direction (the vignetting direction F) of the imaging device in FIG. 4 C is directed to face the vehicle 1 as illustrated in FIGS. 6 A and 6 C .
- the number of pixels in the length direction (the X direction) of the vehicle 1 is larger than the number of pixels in the width direction (the Y direction) of the vehicle 1 , and the optical axis of the optical system is directed outward with respect to the downward vertical direction in the width direction (the Y direction) of the vehicle 1 as illustrated in FIG. 6 C .
- the optical axis of the optical system is disposed substantially downward along the vertical direction. As illustrated in FIG. 6 C , the optical axis is inclined outward from the vertical direction on one side of the vehicle 1 .
- the imaging viewing angle 14 a corresponding to the high-resolution area 10 a is slightly narrowed to, for example, 50 degrees as in the front view illustrated in FIG. 6 C .
- since the imaging viewing angle 14 b corresponding to the low-resolution area 10 b includes the front-lower side to the rear-lower side of the vehicle 1 as illustrated in FIG. 6 B , an obstacle or the like on the lower side can be more easily recognized than in the example illustrated in FIGS. 5 A and 5 B .
- since the imaging viewing angle 14 a corresponding to the high-resolution area 10 a is assigned to a blind spot on the front-lateral side, it is possible to confirm the blind spot on the front-lateral side with a high resolution.
- the light receiving surface of the imaging device can be effectively used to obtain a down-around-view image around the vehicle 1 , and it is possible to greatly reduce a blind spot particularly in case of a large truck or the like.
- the left camera unit 14 has been described above with reference to FIGS. 6 A to 6 C , but the right camera unit 12 can also be arranged to be symmetric therewith. In this case, the same advantages can be achieved.
- the image processing unit 41 a of the integrated processing unit 40 rotates an image from the camera unit 14 particularly on the basis of position and posture information of the camera in the camera information.
- the image from the camera unit 14 is synthesized with an image from the rear camera unit 13 and the composite image is displayed according to necessity.
- the image processing unit 41 a rotates and displays an image from the camera unit 12 on the basis of position and posture information of the camera in the camera information.
- the image from the camera unit 12 is synthesized with an image from the rear camera unit 13 and the composite image is displayed according to necessity.
- FIGS. 7 A and 7 B are diagrams illustrating an example of a third embodiment in which the imaging device of the camera unit 14 is horizontally disposed and the optical axis of the optical system is disposed slightly downward with respect to the horizontal direction on the rear-left side of a vehicle 1 .
- FIG. 7 A is a diagram illustrating a top view of the vehicle 1 according to the third embodiment
- FIG. 7 B is a diagram illustrating a left side view of the vehicle 1 according to the third embodiment.
- the upward direction (the vignetting direction F) of the imaging device in FIG. 4 C is directed to face a sky area as illustrated in FIG. 7 B . That is, the vignetting direction F is directed upward with respect to the horizontal plane.
- a second direction (a sequential row reading direction) of the imaging device is directed downward with respect to the horizontal plane. That is, the first direction is not parallel to a horizontal scanning direction.
- the vignetting direction can be set to a direction to the sky area which is an untargeted area.
- since the imaging viewing angle 14 a corresponding to the high-resolution area 10 a is slightly narrowed to, for example, 50 degrees and the imaging viewing angle 14 b corresponding to the left low-resolution area 10 b is widened to the front-left side, it is possible to recognize an obstacle or the like on the front-left side.
- the number of pixels in the width direction (the Y direction) is larger than the number of pixels in the height direction (the Z direction) of the vehicle 1 , and the optical axis of the optical system is disposed outward with respect to the side line of the vehicle 1 in the length direction (the X direction) of the vehicle 1 .
- the optical axis of the optical system is disposed horizontally or downward with respect to the horizontal plane.
- the vignetting viewing angle is directed to the sky side, in which there are few obstacles, and thus it is possible to effectively use the light receiving surface of the imaging device.
- the high-resolution area 10 a can be used more for the electronic rearview mirror display, a driver can accurately confirm the rear side, and a blind spot on the front-lateral side can be imaged with the imaging viewing angle 14 b corresponding to the low-resolution area 10 b .
- since the imaging viewing angle 14 a corresponding to the high-resolution area 10 a includes a part of the vehicle 1 , it is possible to easily get a sense of distance to an obstacle or the like.
- in the height direction (the Z direction) of the vehicle 1 , since a front wheel can be included in the imaging viewing angle 14 b corresponding to the low-resolution area 10 b , it is possible to sufficiently visually recognize a blind spot on the front-lateral side, which is very effective when the vehicle parks in a narrow space or the like.
- in FIGS. 7 A and 7 B , the left camera unit 14 is illustrated, but the right camera unit 12 can be disposed to be symmetric therewith. In this case, the same advantages can be achieved.
- the image processing unit 41 a of the integrated processing unit 40 displays an image from the camera unit 14 without rotating it, on the basis of position and posture information of the camera in the camera information, when the camera unit 14 is disposed as illustrated in FIGS. 7 A and 7 B .
- the image from the camera unit 14 is synthesized with an image from the rear camera unit 13 and the composite image is displayed according to necessity.
- the image processing unit 41 a displays an image from the camera unit 12 without rotating the image on the basis of position and posture information of the camera in the camera information.
- the image from the camera unit 12 is synthesized with an image from the rear camera unit 13 and the composite image is displayed according to necessity.
- the center of the high-resolution area 10 a of the optical system including the high-resolution area 10 a and the low-resolution area (high-distortion area) 10 b is intentionally deviated from the center of the light receiving surface of the imaging device. Accordingly, a part of the viewing angle is lost and the viewing angle on the opposite side is extended.
- since the vignetting direction (the first direction) is directed to an area other than a targeted area, it is possible to optimize the imaging viewing angle and to most effectively use the pixels of the imaging device.
- An amount of deviation between the center of the high-resolution area 10 a and the center of the light receiving surface of the imaging device, or the like can be changed according to applications or positions.
- the arrangement illustrated in FIGS. 4 A and 4 B can be achieved by reducing the amount of deviation or making it zero.
- the amount of deviation can be simply increased as illustrated in FIGS. 4 C and 4 D .
- the targeted area includes an area in which an obstacle is assumed to be present such as an area on an obliquely rear-lateral side of the vehicle 1 or an outer peripheral area of a wheel, for example, when a camera unit is installed on a lateral side of the vehicle 1 .
- the vignetting direction (the first direction) is directed to the vehicle 1 (or the installation position) or the sky area other than the targeted area.
- the targeted area changes according to applications in which the camera unit is installed.
- the resolution of an image of the high-resolution area (low-distortion area) 10 a displayed on the first display unit 50 or the second display unit 51 is higher than that of the low-resolution area (high-distortion area) 10 b , and thus it is possible to more accurately display far images of the front view, the side view, and the rear view of the vehicle 1 .
- the present disclosure is more advantageous in terms of costs, processing efficiency, decrease in size, and the like in comparison with a case in which a plurality of camera units with different viewing angles are used.
- an image for the electronic rearview mirror displayed on the first display unit 50 can be displayed in a low-distortion state and thus a driver can visually recognize the surroundings of the vehicle with a more natural perspective feeling.
- the high-resolution area 10 a is configured to have low optical distortion and image recognition can be performed thereon in a state of a RAW image signal of which distortion has not been corrected, it is possible to reduce a processing load for image recognition and to perform image recognition at a high speed.
- the case in which distortion correction is not performed includes a case in which a distortion correction factor is less than a predetermined value X1.
- the case in which distortion correction is performed may include a case in which the distortion correction factor is greater than a predetermined value X2 (where X2 is equal to or greater than X1), where X1 may be set to, for example, 10% and X2 may be set to, for example, 90%.
- the present disclosure is not limited to two types of distortion correction factors, but may employ a configuration in which the distortion correction factor changes gradually.
- the present disclosure includes such embodiments.
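- The two thresholds can be pictured as a three-state classification (a sketch using the example values X1 = 10% and X2 = 90% given above; the state names are illustrative):

```python
def correction_state(factor, x1=0.10, x2=0.90):
    """Classify a distortion correction factor against the thresholds X1 and X2
    (X2 >= X1): below X1 counts as 'not corrected', above X2 as 'corrected',
    and the range between them allows a gradually changing factor."""
    assert x2 >= x1
    if factor < x1:
        return "not corrected"
    if factor > x2:
        return "corrected"
    return "partially corrected"

print(correction_state(0.05), correction_state(0.5), correction_state(0.95))
```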
- a boundary image indicating a boundary between an area in which distortion correction is performed and an area in which distortion correction is not performed may be able to be displayed, for example, while the vehicle is normally traveling.
- the aforementioned boundary may be displayed to reduce a feeling of discomfort.
- a width, a concentration, a color, or the like of a line of the boundary image may be changed to reduce a feeling of discomfort between the distortion-corrected area and the non-distortion-corrected area.
- the distortion correction factor in the distortion-corrected area may be adjusted depending on the size and shape of the boundary image, and the images of the distortion-corrected area and the non-distortion-corrected area for a stationary object may be smoothly joined.
- the boundary image may be displayed. Accordingly, the boundary between the distortion-corrected area and the non-distortion-corrected area becomes clear and thus it is possible to efficiently perform the adjustment operation.
- the image processing system is mounted in a mobile object such as a vehicle
- the mobile object according to the embodiments is not limited to a vehicle such as an automobile, but may be any of a train, a ship, an aircraft, a robot, and a drone as long as it is a movable object.
- the image processing system according to the embodiments may or may not be mounted in such a mobile object.
- the configurations according to the embodiments can also be applied to a case in which a mobile object is remotely controlled.
- At least one of various functions, processes, and methods described above in the first to third embodiments may be realized using a program.
- a program for realizing at least one of various functions, processes, and methods described above in the first embodiment is referred to as a “program X.”
- a computer that executes the program X is referred to as a “computer Y”
- Examples of the computer Y include a personal computer, a microcomputer, and a central processing unit (CPU).
- the computer of the image processing system or the like according to the aforementioned embodiments is also an example of the computer Y.
- At least one of various functions, processes, and methods described above in the first to third embodiments can be realized by causing the computer Y to execute the program X.
- the program X is supplied to the computer Y via a computer-readable storage medium.
- the computer-readable storage medium according to the fourth embodiment includes at least one of a hard disk device, a magnetic storage device, an optical storage device, a magneto-optical storage device, a memory card, a ROM, and a RAM.
- the computer-readable storage medium according to the fourth embodiment is a non-transitory storage medium.
Abstract
Description
1 < f × sin(θmax) / y(θmax) ≤ 1.9    (Expression 1)
Claims (18)
1.0 < f × sin(θmax) / y(θmax) ≤ 1.9
0.1 < f × sin(θmax) / y(θmax) ≤ 1.9
1.0 < f × sin(θmax) / y(θmax) ≤ 1.9
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021-155782 | 2021-09-24 | ||
| JP2021155782A JP2023046930A (en) | 2021-09-24 | 2021-09-24 | Method for installing camera unit, movable body, image processing system, image processing method, and computer program |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230096414A1 (en) | 2023-03-30 |
| US12214733B2 (en) | 2025-02-04 |
Family
ID=83271208
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/933,676 (US12214733B2, Active, expires 2043-04-06) | Camera unit installing method, moving device, image processing system, image processing method, and storage medium | 2021-09-24 | 2022-09-20 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US12214733B2 (en) |
| EP (1) | EP4155817B1 (en) |
| JP (1) | JP2023046930A (en) |
| CN (1) | CN115883776A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12371050B2 (en) * | 2023-08-31 | 2025-07-29 | Hon Hai Precision Industry Co., Ltd. | Method for early warning a blind area, electronic device and storage medium |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116171239A (en) * | 2020-10-23 | 2023-05-26 | 索尼集团公司 | Camera module, information processing system, information processing method, and information processing device |
| JP2023178051A (en) * | 2022-06-03 | 2023-12-14 | キヤノン株式会社 | Mobile object, method of controlling the mobile object, and computer program |
| US20250095121A1 (en) * | 2023-09-15 | 2025-03-20 | Robert Bosch Gmbh | Device and method for surround view camera system with reduced manhattan effect distortion |
| JP2025062393A (en) * | 2023-10-02 | 2025-04-14 | キヤノン株式会社 | Imaging device and moving object |
| US12519922B1 (en) * | 2024-09-25 | 2026-01-06 | Microsoft Technology Licensing, Llc | Adjustment of a monocular display parameter to display content |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004354572A (en) * | 2003-05-28 | 2004-12-16 | Minolta Co Ltd | Imaging apparatus |
| JP2005110202A (en) * | 2003-09-08 | 2005-04-21 | Auto Network Gijutsu Kenkyusho:Kk | Camera device and vehicle periphery monitoring device |
| US10525883B2 (en) * | 2014-06-13 | 2020-01-07 | Magna Electronics Inc. | Vehicle vision system with panoramic view |
| WO2018016305A1 (en) * | 2016-07-22 | 2018-01-25 | Panasonic IP Management Co., Ltd. | Imaging system and mobile body system |
| EP3547678B1 (en) * | 2017-12-19 | 2022-07-20 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device, imaging system, and display system |
- 2021-09-24: JP application JP2021155782A filed (published as JP2023046930A, pending)
- 2022-09-08: EP application EP22194687.4A filed (granted as EP4155817B1, active)
- 2022-09-20: CN application CN202211141129.5A filed (published as CN115883776A, pending)
- 2022-09-20: US application US17/933,676 filed (granted as US12214733B2, active)
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004345554A (en) | 2003-05-23 | 2004-12-09 | Clarion Co Ltd | Vehicle rearward monitoring device and vehicle rearward monitoring system |
| US20050083427A1 (en) | 2003-09-08 | 2005-04-21 | Autonetworks Technologies, Ltd. | Camera unit and apparatus for monitoring vehicle periphery |
| JP2007038856A (en) * | 2005-08-03 | 2007-02-15 | Auto Network Gijutsu Kenkyusho:Kk | Vehicle periphery visual recognition device |
| JP2010095202A (en) | 2008-10-19 | 2010-04-30 | Alpine Electronics Inc | Room mirror position rear-view camera image display |
| US20130265442A1 (en) * | 2012-04-04 | 2013-10-10 | Kyocera Corporation | Calibration operation device, camera device, camera system and camera calibration method |
| JP2015121591A (en) | 2013-12-20 | 2015-07-02 | Fujitsu General Ltd. | In-vehicle camera |
| WO2020153317A1 (en) | 2019-01-23 | 2020-07-30 | Sony Semiconductor Solutions Corporation | Vehicle-mounted camera |
| US20220174254A1 (en) * | 2019-03-02 | 2022-06-02 | Jaguar Land Rover Limited | A camera assembly and a method |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12371050B2 (en) * | 2023-08-31 | 2025-07-29 | Hon Hai Precision Industry Co., Ltd. | Method for early warning a blind area, electronic device and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4155817A1 (en) | 2023-03-29 |
| CN115883776A (en) | 2023-03-31 |
| EP4155817B1 (en) | 2025-06-25 |
| JP2023046930A (en) | 2023-04-05 |
| US20230096414A1 (en) | 2023-03-30 |
Similar Documents
| Publication | Title |
|---|---|
| US12214733B2 (en) | Camera unit installing method, moving device, image processing system, image processing method, and storage medium |
| US12165419B2 (en) | Movable apparatus, control method for movable apparatus, and storage medium |
| US12028603B2 (en) | Image processing system, image processing method, storage medium, image pickup apparatus, and optical unit |
| US12417642B2 (en) | System to integrate high distortion wide-angle camera recognition with low distortion normal-angle camera recognition |
| JP7631275B2 (en) | Mobile body and imaging device installation method |
| US12387455B2 (en) | Image processing system, mobile object, image processing method, and storage medium, with output of image recognition result integrated on basis of first result regarding image recognition on at least partial region and second result regarding image recognition on wider region |
| US12325361B2 (en) | Mobile object, image processing method, and storage medium |
| US12530904B2 (en) | Image processing system, image processing method, and storage medium |
| US12406344B2 (en) | Image processing system, image processing method, and storage medium |
| EP4408006A1 (en) | Image processing system, movable apparatus, image processing method, and storage medium |
| US12470839B2 (en) | Movable apparatus and installation method for imaging device |
| JP7434476B2 (en) | Image processing system, image processing method, imaging device, optical system, and computer program |
| CN118665346A (en) | Installation method of mobile device, image processing device, storage medium and camera device |
Legal Events
| Code | Title | Description |
|---|---|---|
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAHARA, IKUNARI;TSUCHIYA, TOMOAKI;REEL/FRAME:061368/0908. Effective date: 20220830 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| ZAAB | Notice of allowance mailed | Free format text: ORIGINAL CODE: MN/=. |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |