WO2016152572A1 - Indirect view presentation device - Google Patents
- Publication number
- WO2016152572A1 (PCT/JP2016/057713)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- head
- motion
- manipulator
- movement
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/31—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/28—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with an adjustable field of view
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to an indirect view presentation device that indirectly presents a view to a pilot.
- because Patent Document 1 displays video captured by a fixed stereo camera, it can express binocular parallax but cannot express motion parallax according to the motion of the pilot's head, so the pilot's recognition of three-dimensional space may break down.
- an object of the present invention is to provide an indirect visual field presentation device capable of expressing motion parallax according to the motion of a driver.
- the indirect field of view presentation apparatus includes a camera mounted on a transportation facility, a motion detection unit, a manipulator, and a display device.
- the motion detector detects the motion of the head of the transport operator.
- the manipulator moves the camera so as to be interlocked with the movement of the head detected by the movement detection unit.
- the display device displays an image captured by the camera to the operator.
- FIG. 1 is a block diagram illustrating a basic configuration of an indirect field presentation device according to the first embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating the indirect view presentation device according to the first embodiment of the present invention.
- FIG. 3 is a schematic diagram illustrating a manipulator provided in the indirect view presentation device according to the first embodiment of the present invention.
- FIG. 4 is a schematic diagram for explaining the operation of the image drawing unit included in the indirect field presentation device according to the first embodiment of the present invention.
- FIG. 5 is an example illustrating a screen displayed on a stereo display included in the indirect field presentation device according to the first embodiment of the present invention.
- FIG. 6 is a schematic diagram for explaining the operation of the indirect view presentation device according to the modification of the first embodiment of the present invention.
- FIG. 7 is a block diagram illustrating a basic configuration of an indirect view presentation device according to the second embodiment of the present invention.
- FIG. 8 is an example illustrating a case where an automobile including the indirect field presentation device according to the second embodiment of the present invention backs up.
- FIG. 9 is a diagram illustrating the inside of an automobile provided with the indirect field presentation device according to the second embodiment of the present invention shown in FIG.
- the indirect field presentation device includes a motion detection unit 1, a motion control unit 2, a manipulator 3, a stereo camera 4, a display control unit 5, a stereo display 6, a storage unit 7, a state detection unit 8, and an output unit 9.
- the indirect field of view presentation apparatus is an apparatus that indirectly presents the field of view of the outside world viewed from the transportation system to a pilot who operates the transportation system such as an automobile, an aircraft, and a ship.
- the motion detection unit 1 is a motion sensor that detects, at each time, a motion indicating the position and orientation of the head H of the operator D of the automobile V, which is a transportation facility.
- the movement of the head H detected by the movement detector 1 may be at least one of the position and orientation of the head H in the coordinate system in which the operator D exists.
- the motion detector 1 sequentially outputs motion information indicating the detected motion of the head H to the motion controller 2 at a predetermined sampling frequency.
- the motion detection unit 1 may detect the motion of the head H by performing image processing on an image obtained by an image sensor that detects an electromagnetic wave emitted from the head H, for example. In addition, the motion detection unit 1 may detect the motion of the head H using an acceleration sensor, an angular velocity sensor, or the like built in a wearing tool attached to the head H.
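As an illustration of the sensor-based detection option above, the following sketch integrates angular-velocity samples from a head-worn gyro into a yaw angle. The function name, sample rate, and the absence of drift correction are simplifying assumptions; a real motion detection unit 1 would fuse image, acceleration, and angular-velocity data.

```python
def integrate_yaw(gyro_z_samples, dt):
    """Integrate angular-velocity samples (rad/s) about the vertical axis
    into a yaw angle (rad). Hypothetical stand-in for the angular-velocity
    sensor built into the head-worn tool; no drift correction is applied."""
    yaw = 0.0
    for w in gyro_z_samples:
        yaw += w * dt  # rectangular integration over one sample period
    return yaw
```

Sampled at 100 Hz (dt = 0.01 s), one hundred samples of 0.1 rad/s accumulate roughly 0.1 rad of head rotation.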
- the motion control unit 2 controls the driving of the manipulator 3 that holds the stereo camera 4 so that the stereo camera 4 moves in conjunction with the motion of the head H detected by the motion detection unit 1.
- the motion control unit 2 includes a computer such as a microcontroller.
- the motion control unit 2 generates a command value that commands the manipulator 3 to drive based on the motion information input from the motion detection unit 1, and outputs the command value to the manipulator 3.
- the command value output by the motion control unit 2 indicates at least one of the position and orientation of the stereo camera 4 with respect to the automobile V.
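The command-value generation described above can be sketched as follows. The 1:1 head-to-camera gain, the per-axis pose layout, and the clamping to the movable range are illustrative assumptions rather than the patent's specification.

```python
def command_value(head_pose, reference_pose, initial_pose, limits):
    """Map head displacement from the reference position RP onto a camera
    pose commanded relative to the initial position SP, clamping each axis
    (e.g. x, y, z, roll, pitch, yaw) to the manipulator's movable range."""
    cmd = []
    for h, r, s, (lo, hi) in zip(head_pose, reference_pose, initial_pose, limits):
        target = s + (h - r)              # interlock: camera mirrors head displacement
        cmd.append(min(max(target, lo), hi))  # stay inside the movable range
    return cmd
```

For example, a head displaced 0.05 m from RP yields a camera target 0.05 m from SP, while a displacement beyond the movable range is clamped to its limit.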
- the manipulator 3 is a drive mechanism (robot arm) that holds the stereo camera 4 and, under the control of the motion control unit 2, moves the stereo camera 4 in accordance with the movement of the head H detected by the motion detection unit 1.
- the manipulator 3 is driven based on the command value input from the motion control unit 2 and moves the stereo camera 4.
- the manipulator 3 has six degrees of freedom so that the position and orientation of the stereo camera 4 with respect to the automobile V can be controlled.
- the manipulator 3 includes, for example, three linear joints along the XYZ axes of a three-dimensional orthogonal coordinate system, three rotational joints whose rotation axes are those three axes, and actuators that drive the linear and rotational joints. Various other combinations of joints and actuators can be adopted for the manipulator 3.
- the rotary joint may be a ball joint.
- the manipulator 3 may translate the stereo camera 4 along at least one of the XYZ axes so as to be interlocked with the position of the head H detected by the motion detection unit 1, according to the control of the motion control unit 2.
- the manipulator 3 may rotate the stereo camera 4 about at least one of the XYZ axes so as to be interlocked with the orientation of the head H detected by the motion detection unit 1, according to the control of the motion control unit 2.
- the manipulator 3 may also translate the stereo camera 4 along at least one of the XYZ axes so as to be interlocked with the orientation of the head H detected by the motion detection unit 1, according to the control of the motion control unit 2.
- the manipulator 3 may be provided with only linear joints when the stereo camera 4 is only translated, and with only rotary joints when the stereo camera 4 is only rotated.
- the initial position SP of the stereo camera 4 is set by the motion control unit 2.
- the motion control unit 2 sets the reference position RP of the head H detected by the motion detection unit 1 in a state where the pilot D is sitting on the seat S.
- the motion control unit 2 sets an initial position SP of the stereo camera 4 corresponding to the reference position RP for the manipulator 3.
- the movement of the head H from the reference position RP is linked to the movement of the stereo camera 4 from the initial position SP.
- the motion control unit 2 sets the position of the head H in a state where it hits the headrest R of the seat S on which the operator D sits as a reference position RP.
- the head H is rarely located behind the headrest R.
- the motion control unit 2 sets the initial position SP of the stereo camera 4 at the rearmost end of its movable range in the front-rear direction of the automobile V.
- by setting the initial position SP at the rearmost end, commands that would move the stereo camera 4 beyond its movable range are reduced, and the movable range can be used effectively.
- the motion control unit 2 may set the position of the head H most frequently detected by the motion detection unit 1 as the reference position RP.
- the manipulator 3 can effectively use the movable range by setting the initial position SP of the stereo camera 4 based on the reference position RP of the head H that exists for the longest time.
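A minimal sketch of the "most frequently detected position" rule for choosing the reference position RP. The patent does not specify an estimator, so the quantisation step and the one-dimensional position are assumptions.

```python
from collections import Counter

def most_frequent_position(samples, resolution=0.01):
    """Pick the head position observed most often as reference position RP,
    quantised to `resolution` metres so nearby samples fall in one bin."""
    bins = Counter(round(p / resolution) for p in samples)
    best_bin, _ = bins.most_common(1)[0]
    return best_bin * resolution
```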
- the motion control unit 2 may set the center in the movable range of each drive shaft of the manipulator 3 as the reference position RP.
- the movable range of the manipulator 3 is determined so as to correspond to the movable range of the head H.
- when the manipulator 3 has a drive axis whose movable range is equal to or larger than the movable range of the head H, it is not necessary to set the initial position SP for that axis.
- the manipulator 3 has a drive axis (Y axis) with a movable range L5 that is greater than or equal to the movable range of the head H, and therefore it is not necessary to set the initial position SP for the Y axis.
- the manipulator 3 detects a motion indicating the position and orientation of the stereo camera 4 at each time, and outputs it to the display control unit 5 as motion information indicating the motion of the stereo camera 4.
- the stereo camera 4 is mounted on the car V.
- the stereo camera 4 captures a stereoscopic image in the traveling direction (front) of the automobile V.
- the stereo camera 4 is held by the manipulator 3.
- the stereo camera 4 moves in conjunction with the movement of the head H detected by the movement detection unit 1 when the manipulator 3 is driven.
- the stereo camera 4 sequentially outputs the captured stereoscopic image to the display control unit 5 at a predetermined frequency.
- the display control unit 5 includes a difference detection unit 51, an image drawing unit 52, a CG generation unit 53, and a warning unit 54.
- the display control unit 5 includes a computer such as a microcontroller.
- the difference detection unit 51, the image drawing unit 52, the CG generation unit 53, and the warning unit 54 in FIG. 1 each represent a logical structure, and may be configured as separate hardware or as integrated hardware.
- the display control unit 5 performs image processing on a stereoscopic image taken by the stereo camera 4 at a predetermined frequency, and sequentially displays it on the stereo display 6.
- between the actual movement of the head H, the motion information of the motion detection unit 1, the command value of the motion control unit 2, the motion information of the manipulator 3, and the display on the stereo display 6, delays on the order of milliseconds occur between the corresponding generation times.
- the display control unit 5 performs image processing on the image displayed on the stereo display 6 so as to compensate for at least part of the delay time that occurs between the actual movement of the head H and the display on the stereo display 6.
- the time lag between the movement information of the movement detector 1 and the movement of the manipulator 3 is larger than the others.
- the difference detection unit 51 receives the motion information of the motion detection unit 1 and the motion information of the manipulator 3, and detects the difference between the movement of the head H detected by the motion detection unit 1 and the movement of the stereo camera 4. As illustrated in FIG. 4, the difference detection unit 51 calculates a virtual viewpoint VV based on the movement of the head H detected by the motion detection unit 1.
- the virtual viewpoint VV is a virtual viewpoint of the stereo camera 4 that corresponds to the movement of the head H, used to compensate for the delay of the stereo camera 4 relative to the head movement detected by the movement detector 1.
- the difference detection unit 51 detects the difference between the virtual viewpoint VV and the viewpoint F of the stereo camera 4 obtained from the movement information of the manipulator 3 as the difference between the movement of the head H and the movement of the stereo camera 4.
- the difference detection unit 51 may calculate the virtual viewpoint VV based on the command value of the motion control unit 2.
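The difference detection above can be sketched as follows, assuming (as in the command generation) a 1:1 mapping of head displacement onto the camera; poses are simplified to (x, y, z) tuples.

```python
def viewpoint_difference(head_pose, reference_pose, initial_pose, camera_pose):
    """Compute the virtual viewpoint VV implied by the current head pose,
    and its offset from the camera's actual viewpoint F as reported in the
    manipulator's motion information."""
    # VV: where the camera *should* be, given the head's displacement from RP
    virtual = tuple(s + (h - r) for h, r, s in zip(head_pose, reference_pose, initial_pose))
    # difference between VV and the lagging actual viewpoint F
    diff = tuple(v - c for v, c in zip(virtual, camera_pose))
    return virtual, diff
```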
- the image drawing unit 52 receives the stereoscopic image captured by the stereo camera 4, corrects distortion and the like in the stereoscopic image, and maps it so as to correspond to the virtual screen VS.
- the distance between the head H and the stereo display 6 at the reference position RP is L1.
- the virtual screen VS is, for example, a screen that has the same size and the same shape as the stereo display 6 and is virtually located at the same distance L2 as the distance L1 from the initial position SP of the stereo camera 4.
- the image mapped to the virtual screen VS becomes a view seen from the viewpoint F of the stereo camera 4 using the virtual screen VS as a window, like a view seen from the viewpoint E of the operator D using the stereo display 6 as a window.
- the object A located at a distance L3 from the stereo camera 4 is displayed on the stereo display 6 so as to be seen at a distance L3 from the operator D.
- based on the difference detected by the difference detection unit 51, the image drawing unit 52 performs image processing on the stereoscopic image captured by the stereo camera 4 using an affine transformation, thereby drawing the image of the virtual screen VS as viewed from the virtual viewpoint VV. As illustrated in FIG. 4, the image drawing unit 52 can draw such a virtual image whenever the virtual screen VS lies inside the viewing angle θ of the stereo camera 4.
- the image drawing unit 52 shifts the stereoscopic image captured by the stereo camera 4 in the left-right direction based on the difference detected by the difference detection unit 51.
- the image drawing unit 52 shifts the stereoscopic image captured by the stereo camera 4 in the up-down direction based on the difference detected by the difference detection unit 51.
- the image drawing unit 52 enlarges or reduces (zooms in or out) the stereoscopic image captured by the stereo camera 4 based on the difference detected by the difference detection unit 51.
- the image drawing unit 52 rotates the two captured images together with the virtual screen VS so that they always remain horizontal.
- the image drawing unit 52 shifts the left and right images of the stereoscopic image captured by the stereo camera 4 alternately up and down based on the difference detected by the difference detection unit 51.
- the image drawing unit 52 zooms the left and right images of the stereoscopic image captured by the stereo camera 4 in and out alternately based on the difference detected by the difference detection unit 51.
- the image drawing unit 52 shifts the stereoscopic image captured by the stereo camera 4 in the vertical direction and zooms it in or out based on the difference detected by the difference detection unit 51.
- the image drawing unit 52 performs image processing on the stereoscopic image captured by the stereo camera 4 using an affine transformation and draws a virtual image viewed from the virtual viewpoint VV, thereby compensating for the delay corresponding to the difference detected by the difference detection unit 51. Accordingly, the image drawing unit 52 can efficiently compensate for the delay time from the actual movement of the head H to the display on the stereo display 6.
- the image drawing unit 52 may draw a virtual image viewed from the virtual viewpoint VV only when the difference detected by the difference detection unit 51 is equal to or greater than a predetermined threshold.
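A sketch of the delay compensation limited to its translation component: the image is shifted only when the detected difference exceeds a threshold, approximating the view from the virtual viewpoint VV. The pixels-per-metre scale and the wrap-around edge handling of `np.roll` are simplifications; a real implementation would apply a full affine transformation and pad the exposed border.

```python
import numpy as np

def compensate(image, diff_xy, pixels_per_metre, threshold=0.005):
    """Shift the captured image to approximate the view from the virtual
    viewpoint VV when the camera lags the head. Only the horizontal and
    vertical translation component is handled; threshold and scale values
    are illustrative."""
    dx, dy = diff_xy
    if max(abs(dx), abs(dy)) < threshold:  # small lag: display as-is
        return image
    sx = int(round(dx * pixels_per_metre))
    sy = int(round(dy * pixels_per_metre))
    # np.roll wraps pixels around the border, which a real system would pad
    return np.roll(np.roll(image, -sy, axis=0), -sx, axis=1)
```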
- the CG generation unit 53 generates a CG to be combined with the image drawn by the image drawing unit 52 based on the information stored in the storage unit 7 and the state of the automobile V detected by the state detection unit 8.
- the storage unit 7 stores map information, feature information corresponding to the map information, and the like.
- the state detection unit 8 detects the traveling state of the automobile V based on detection results of a positioning system, an acceleration sensor, an angular velocity sensor, a rudder angle sensor, a vehicle speed sensor, and the like. That is, the state detection unit 8 detects the current position, speed, posture with respect to the travel route, etc. of the vehicle V in the map information as the travel state of the vehicle V.
- the state detection unit 8 may include a distance measuring sensor that detects an obstacle or the like existing around the automobile V.
- the CG generation unit 53 generates a CG image corresponding to the stereoscopic video captured by the stereo camera 4.
- the CG generation unit 53 generates a CG image of a feature that may exist at the angle of view of the stereo camera 4 from the current position detected by the state detection unit 8. For example, as shown in FIG. 5, the CG generation unit 53 may generate a CG image of the feature B that is originally shielded by an obstacle and is not captured by the stereo camera 4.
- the CG generation unit 53 may generate a CG image that highlights the feature C such as an intersection by texture mapping.
- the CG generation unit 53 may generate a CG image indicating an icon indicating a type of a feature or a destination, a travel route, and the like.
- the image drawing unit 52 synthesizes the CG image generated by the CG generation unit 53 with the image processed based on the difference detected by the difference detection unit 51 so that they correspond to each other, and displays the result on the stereo display 6, for example as shown in FIG.
- the warning unit 54 receives the motion information of the motion detection unit 1 and the motion information of the manipulator 3, and detects when the movement of the stereo camera 4 is not interlocked with the movement of the head H detected by the motion detection unit 1.
- the warning unit 54 warns the operator D via the output unit 9 when the movement of the stereo camera 4 is not linked with the movement of the head H detected by the movement detection unit 1.
- the warning unit 54 may determine that the stereo camera 4 is not interlocked with the head H when, for example, the movement of the head H goes outside the detectable range of the motion detection unit 1, when a command value outside the movable range is input to the manipulator 3, or when a failure of the device is detected.
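The interlock-failure conditions just listed can be sketched as a single check; the boolean signature and tolerance are illustrative assumptions.

```python
def camera_interlocked(head_in_range, command, limits, fault=False, tol=1e-6):
    """Return True only if the stereo camera can still be considered
    interlocked with the head: the head is inside the motion sensor's
    detectable range, the commanded pose lies inside the manipulator's
    movable range, and no device fault is present."""
    if fault or not head_in_range:
        return False
    return all(lo - tol <= c <= hi + tol for c, (lo, hi) in zip(command, limits))
```

When this returns False, the warning unit 54 would drive the output unit 9 to warn the operator D.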
- the output unit 9 warns the operator D according to the control of the warning unit 54.
- the output unit 9 includes, for example, a display device that displays light, images, characters, and the like, and an acoustic device such as a speaker that outputs sound.
- the output unit 9 may be configured as a part of the stereo display 6.
- the stereo display 6 is a display device that displays a stereoscopic image captured by the stereo camera 4 to the operator D.
- the stereo display 6 may be a glasses type using a liquid crystal shutter or a polarizing filter, or may be a naked eye type using a parallax barrier or a lenticular lens.
- the stereo display 6 may be a head mounted display (HMD).
- when an HMD is used, the virtual screen VS does not correspond to the HMD but is set to a predetermined shape at a predetermined distance from the stereo camera 4, and the display control unit 5 separately performs processing related to head tracking.
- the motion detection unit 1 may detect a target attached to a wearing tool (glasses) worn on the head H. Alternatively, the motion detection unit 1 may detect the motion of the head H using an acceleration sensor, an angular velocity sensor, or the like mounted on the wearing tool.
- when the stereo display 6 is a naked-eye type, the binocular parallax barrier is optimized based on the distance from the head H of the operator D sitting on the seat S.
- by displaying on the stereo display 6 the stereoscopic image captured by the stereo camera 4 that follows the head H of the operator D, motion parallax according to the motion of the operator D can be expressed.
- the movement of the stereo camera 4 does not need to follow the movement of the head H completely; restricting the drive axes of the manipulator 3 can reduce manufacturing cost and improve the processing speed of the motion control unit 2 and the display control unit 5.
- the stereo display 6 displays an image captured by the stereo camera 4 that has been image-processed based on the difference detected by the difference detection unit 51. Therefore, the indirect view presentation device according to the first embodiment can compensate for at least part of the delay time from the motion detected by the motion detection unit 1 to the display on the stereo display 6. In particular, when the difference detection unit 51 detects the difference between the movement of the head H and the movement of the stereo camera 4, the stereo display 6 can display an image that efficiently compensates for the delay time.
- since the initial position SP of the stereo camera 4 is set as described above, the movable range of the manipulator 3 can be used effectively.
- the warning unit 54 warns the driver D that the stereo camera 4 is not interlocked with the movement of the head H. Therefore, the warning unit 54 can warn that the pilot D's spatial recognition may break down, and can improve safety.
- the initial position SP of the stereo camera 4 may be set for each operation mode.
- for the manipulator 3, as shown in FIG. 6, a plurality of initial positions SP1 to SP4 are set by the motion control unit 2, for example.
- the initial position SP1 coincides with the reference position RP in the left-right direction (Y-axis direction) of the automobile V.
- the initial position SP2 is located at the center in the left-right direction of the automobile V.
- the initial positions SP3 and SP4 are located on either side in the left-right direction of the automobile V.
- in response to switching of the operation mode, the manipulator 3 moves the stereo camera 4 to the initial position SP corresponding to the selected operation mode.
- the operation mode may be switched according to an operation by the operator D, for example, or may be automatically switched according to the traveling state of the automobile V detected by the state detection unit 8.
- the operation mode may be switched to the right-side initial position SP4 when an obstacle is detected in the front right of the vehicle V.
- the pilot D can be provided with a field of view that is very close to the field of view that would be visible in the front.
- the driver D can improve driving operability by obtaining the sense of distance on the left and right symmetrically.
- with the initial positions SP3 and SP4, the operator D can easily check conditions on the corresponding side, such as narrow roads and obstacles.
- the indirect visual field presentation apparatus according to the second embodiment differs from the first embodiment in that it includes a plurality of manipulators 3-1, 3-2, ..., 3-n and a plurality of stereo cameras 4-1, 4-2, ..., 4-n.
- Other configurations, operations, and effects that are not described in the second embodiment are substantially the same as those in the first embodiment and are omitted because they overlap.
- the plurality of manipulators 3-1 to 3-n are installed at different locations of the vehicle V, respectively.
- the plurality of stereo cameras 4-1 to 4-n are respectively held by the corresponding manipulators 3-1 to 3-n.
- the manipulator 3-1 and the stereo camera 4-1 correspond to the manipulator 3 and the stereo camera 4 in the first embodiment, respectively.
- the display control unit 5 receives the image captured by whichever of the stereo cameras 4-1 to 4-n corresponds to the operation mode and displays it on the stereo display 6.
- the motion control unit 2 drives the manipulator among 3-1 to 3-n that holds the stereo camera corresponding to the operation mode, so that the camera is interlocked with the motion detected by the motion detection unit 1.
- the indirect view presentation device includes a manipulator 3-2 and a stereo camera 4-2 positioned behind the automobile V.
- the stereo camera 4-2 photographs the back of the automobile V.
- The motion control unit 2 controls the manipulator 3-2 so that the stereo camera 4-2 moves partly differently from the stereo camera 4-1, which captures the front.
- The image drawing unit 52 displays the image captured by the stereo camera 4-2 on the stereo display 6 after reversing it horizontally.
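The horizontal reversal of the rear image can be sketched as a simple array flip. This is an illustrative sketch only; the `mirror_horizontally` helper and the NumPy image representation are assumptions, not part of the disclosure:

```python
import numpy as np

def mirror_horizontally(frame: np.ndarray) -> np.ndarray:
    """Reverse an image left-right, as described for the rear stereo camera 4-2.

    frame is an H x W x C array; reversing the column axis makes the
    displayed rear view behave like a conventional mirror.
    """
    return frame[:, ::-1]

# Tiny 1x3 single-channel "image" with pixel values 0, 1, 2 from left to right.
frame = np.arange(3).reshape(1, 3, 1)
mirrored = mirror_horizontally(frame)
print(mirrored.ravel().tolist())  # [2, 1, 0]
```

The slice `[:, ::-1]` returns a reversed view without copying, so the flip adds essentially no processing latency before display.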
- When the head H moves to the right with respect to the front of the automobile V, the manipulator 3-2 also moves the stereo camera 4-2 to the right under the control of the motion control unit 2; that is, the stereo camera 4-2 moves to the left with respect to its shooting direction (the front of the stereo camera 4-2). Similarly, when the head H rotates counterclockwise (rotation in one direction around the Z axis), the manipulator 3-2 rotates the stereo camera 4-2 clockwise (rotation in the other direction around the Z axis).
- In other words, the manipulator 3-2 moves the stereo camera 4-2 so that the left-right movement of the head H (movement in the Y-axis direction and rotation around the Z axis) is reversed.
- Vertical movement is handled in the same way as for the manipulator 3-1 and the stereo camera 4-1.
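Taken together, the mapping from head motion to the rear-camera command can be sketched as follows. The `Pose` fields, the unit gain, and the sign conventions are illustrative assumptions, not the patent's specification:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    y: float    # lateral translation (Y axis), + to the right w.r.t. the vehicle front
    z: float    # vertical translation (Z axis)
    yaw: float  # rotation around the Z axis in radians, + counterclockwise

def rear_camera_command(head: Pose, gain: float = 1.0) -> Pose:
    """Command for the rear-facing camera 4-2, expressed in its own shooting frame.

    Lateral translation and yaw are sign-inverted so the camera motion is
    mirrored with respect to the rearward shooting direction, while vertical
    motion is passed through unchanged, as for the front camera 4-1.
    """
    return Pose(y=-gain * head.y, z=gain * head.z, yaw=-gain * head.yaw)

cmd = rear_camera_command(Pose(y=0.1, z=0.05, yaw=0.2))
print(cmd)  # Pose(y=-0.1, z=0.05, yaw=-0.2)
```

Only the lateral terms are negated; keeping the vertical term positive matches the text's statement that up-down motion follows the front camera's behavior.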
- the indirect view presentation device includes a switching unit 10 that switches to any one of a plurality of operation modes.
- The switching unit 10 can be configured as, for example, a touch sensor that detects an operation by the operator D.
- the switching unit 10 may be formed integrally with a display device such as the stereo display 6 as a touch panel.
- the switching unit 10 switches to one of a plurality of operation modes according to the operation by the operator D.
- the switching unit 10 may automatically switch the operation mode according to the traveling state of the automobile V detected by the state detection unit 8.
- When the state detection unit 8 detects that the gear of the automobile V has been changed from drive to reverse, the switching unit 10 switches from the operation mode corresponding to the stereo camera 4-1 to the operation mode corresponding to the stereo camera 4-2.
- the motion control unit 2 starts driving the manipulator 3-2, and the display control unit 5 causes the stereo display 6 to display an image photographed by the stereo camera 4-2.
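The gear-triggered mode switch can be sketched as a small selection function. The gear-state names and camera indices are hypothetical here, since the disclosure only specifies the drive-to-reverse transition:

```python
def select_camera(gear: str, current: int) -> int:
    """Select the active stereo camera index from the detected gear state.

    Reverse selects the rear-facing camera 4-2 (index 2), drive selects the
    front-facing camera 4-1 (index 1), and any other state keeps the
    current selection.
    """
    if gear == "reverse":
        return 2
    if gear == "drive":
        return 1
    return current

print(select_camera("reverse", 1))  # 2 -> rear camera after shifting into reverse
print(select_camera("park", 2))     # 2 -> selection unchanged
```

On the transition the motion control unit would then start driving the manipulator for the newly selected camera, and the display control unit would route that camera's image to the display.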
- Suppose the automobile V is parked in a parking lot, with other automobiles Va and Vb parked on its left side. In this state, when the operator D backs up while turning the steering wheel to the left, the automobile V moves backward toward the left with respect to the front, that is, toward the right with respect to the rear. At this time, the image displayed on the stereo display 6 is reversed in the left-right direction with respect to the image captured by the stereo camera 4-2, as shown in the figure.
- As shown in FIG. 9, the stereo display 6 may also display a CG image of an arrow G indicating the traveling direction, generated by the CG generation unit 53.
- the CG generation unit 53 may generate a CG image of the arrow G based on the steering angle detected by the state detection unit 8.
- the image drawing unit 52 displays an image captured by the stereo camera 4-2 on the two stereo displays 6.
- The two stereo displays 6 are arranged adjacent to each other so that their images are connected and so that they face the head H of the operator D.
- the stereo display 6 can provide a natural view to the operator D even when the shooting direction of the stereo cameras 4-1 to 4-n is changed.
- Three or more stereo displays 6 may be combined, and a curved display may also be used.
- the stereo camera 4 may be a single camera that captures a planar image.
- the stereo display 6 may be a display device that displays a planar image. Even in this case, by controlling the camera to follow the motion of the head H, motion parallax according to the motion of the operator D can be expressed on the display device.
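The motion-parallax cue mentioned above follows from pinhole projection: when the camera translates with the head, near objects shift more on screen than far ones. A minimal sketch, with an assumed focal length in pixels:

```python
def parallax_shift(camera_dx_m: float, depth_m: float, focal_px: float) -> float:
    """Horizontal image shift, in pixels, of a point at depth_m metres when
    the camera translates laterally by camera_dx_m metres (pinhole model).

    shift = focal * dx / depth, so nearby objects move more than distant
    ones -- the depth cue reproduced even on a planar display by moving
    the camera together with the head H.
    """
    return focal_px * camera_dx_m / depth_m

# The same 5 cm camera move shifts a 2 m object four times more than an 8 m one.
near = parallax_shift(0.05, 2.0, 800.0)  # 20.0 px
far = parallax_shift(0.05, 8.0, 800.0)   # 5.0 px
print(near, far)
```

This is why a head-tracked monoscopic camera still conveys depth: the differential shift itself encodes distance, independently of binocular disparity.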
- The operator D does not necessarily have to ride in the automobile V (transportation vehicle). As long as at least the stereo camera 4 and the manipulator 3 are mounted on the transportation vehicle, the indirect view presentation device can express motion parallax according to the motion of the operator D during remote control of the vehicle by the operator D.
- Although FIG. 2 illustrates a stereo display 6 corresponding to the entire area of the windshield of the automobile V, various dimensions can be adopted for the stereo display 6.
- the stereo display 6 may be a display of a navigation device or a display installed so as to correspond to the lower part of the windshield.
- Although FIG. 2 illustrates a case where the height of the viewpoint E of the operator D corresponds to the height of the viewpoint F of the stereo camera 4, the heights of the viewpoint E and the viewpoint F do not necessarily have to match.
- When the warning unit 54 detects a movement of the head H such that the stereo display 6 can no longer be viewed stereoscopically, it may warn the operator D via the output unit 9.
- For example, the warning unit 54 determines that the stereo display 6 cannot be viewed stereoscopically when the horizontal interpupillary distance deviates from the design of the stereo display 6, or when the tilt of the head H due to roll rotation exceeds the allowable error of the polarization filter.
- The stereo camera 4 may be configured so that its image pickup device is always kept horizontal. This makes it possible to omit the image processing for keeping the image horizontal in the image drawing unit 52 and to reduce the processing load on the image drawing unit 52. The delay time from the movement of the head H to the display on the stereo display 6 can therefore be reduced.
- The motion control unit 2 may generate the command value for the manipulator 3 based on the motion of the head H predicted from the motion information of the motion detection unit 1. That is, the motion control unit 2 predicts the motion that the head H is about to perform from the moment-to-moment change of the motion information, and controls the manipulator 3 so that the stereo camera 4 moves in accordance with the prediction.
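A common way to realize such prediction is to extrapolate the head trajectory over the system latency. This constant-velocity sketch is an assumption; the patent does not fix a particular prediction model:

```python
def predict_head_position(p_now: float, p_prev: float, dt: float,
                          lead_time: float) -> float:
    """Constant-velocity extrapolation of one head-position coordinate.

    Velocity is estimated from the last two samples (dt seconds apart) and
    projected lead_time seconds ahead, so the manipulator 3 can be commanded
    toward where the head H is about to be, compensating actuation and
    display latency.
    """
    velocity = (p_now - p_prev) / dt
    return p_now + velocity * lead_time

# Head moved 1 cm over a 10 ms sampling interval; predict 50 ms ahead.
predicted = predict_head_position(0.11, 0.10, 0.010, 0.050)
print(predicted)  # about 0.16 m
```

In practice the lead time would be matched to the measured delay from head movement to on-screen update, and a smoother estimator (e.g. a filtered velocity) would replace the two-sample difference.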
- the stereo cameras 4-1 to 4-n and the manipulators 3-1 to 3-n may be installed on the side of the automobile V such as a door mirror, on the roof, or the like.
- The display control unit 5 may combine images captured by a plurality of stereo cameras 4 held by one manipulator 3 and display them on the stereo display 6 with an enlarged angle of view.
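For same-height frames whose fields of view abut, the widening can be sketched as a horizontal concatenation; a real camera rig would additionally need warping and blending, which this sketch omits:

```python
import numpy as np

def widen_view(frames: list) -> np.ndarray:
    """Place same-height frames side by side to enlarge the horizontal
    angle of view, as when images from several stereo cameras 4 held by
    one manipulator 3 are combined for the stereo display 6."""
    return np.concatenate(frames, axis=1)

left = np.zeros((2, 3, 1))   # 2x3 frame
right = np.ones((2, 2, 1))   # 2x2 frame, same height
wide = widen_view([left, right])
print(wide.shape)  # (2, 5, 1)
```

Concatenating along axis 1 keeps rows aligned, so the composite behaves like a single wider image for the display pipeline.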
- The present invention includes various embodiments not described here, such as configurations in which the features described in the first and second embodiments are applied to each other. Therefore, the technical scope of the present invention is defined only by the matters specifying the invention in the claims that are reasonable in view of the above description.
- The present invention provides an indirect visual field presentation device capable of expressing motion parallax according to the motion of the operator by interlocking the camera with the motion of the operator's head.
- V: automobile (transportation vehicle)
- 3: manipulator
- 4: stereo camera (camera)
- 6: stereo display (display device)
Abstract
The present invention relates to an indirect view presentation device comprising: a stereo camera (4) installed on an automobile (V); a motion detection unit (1) for detecting the motion of the head (H) of the driver (D) of the vehicle (V); a manipulator (3) for controlling the motion of the stereo camera (4) in response to the motion of the head (H) detected by the motion detection unit (1); and a stereo display (6) that displays images captured by the stereo camera (4) to the driver (D).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-057717 | 2015-03-20 | ||
JP2015057717 | 2015-03-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016152572A1 true WO2016152572A1 (fr) | 2016-09-29 |
Family
ID=56978716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/057713 WO2016152572A1 (fr) | 2015-03-20 | 2016-03-11 | Indirect view presentation device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016152572A1 (fr) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06276552A (ja) * | 1993-03-24 | 1994-09-30 | Nissan Motor Co Ltd | Stereoscopic image capturing device and image presentation device |
JP2002122397A (ja) * | 2000-10-17 | 2002-04-26 | Mitsubishi Heavy Ind Ltd | Night vision device |
JP2004330873A (ja) * | 2003-05-07 | 2004-11-25 | Denso Corp | Rearward confirmation support device for vehicle |
JP2007112368A (ja) * | 2005-10-24 | 2007-05-10 | Tgal:Kk | Rear-side confirmation camera device for vehicle |
JP2008213649A (ja) * | 2007-03-02 | 2008-09-18 | Toyota Motor Corp | Vehicle periphery monitoring device |
JP2010118935A (ja) * | 2008-11-13 | 2010-05-27 | Mitsubishi Electric Corp | Vehicle-body-transparent display device |
JP2014177143A (ja) * | 2013-03-13 | 2014-09-25 | Mitsubishi Electric Corp | Driving support device for vehicle |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019176035A1 (fr) * | 2018-03-14 | 2019-09-19 | 株式会社ソニー・インタラクティブエンタテインメント | Image generation device, image generation system, and image generation method |
JPWO2019176035A1 (ja) * | 2018-03-14 | 2021-03-11 | 株式会社ソニー・インタラクティブエンタテインメント | Image generation device, image generation system, and image generation method |
JP7122372B2 (ja) | 2018-03-14 | 2022-08-19 | 株式会社ソニー・インタラクティブエンタテインメント | Image generation device, image generation system, and image generation method |
JP6429347B1 (ja) * | 2018-05-18 | 2018-11-28 | 豊 川口 | View display system and moving body |
JP2019200712A (ja) * | 2018-05-18 | 2019-11-21 | 豊 川口 | View display system and moving body |
JP6429350B1 (ja) * | 2018-08-08 | 2018-11-28 | 豊 川口 | Vehicle |
JP2020023293A (ja) * | 2018-08-08 | 2020-02-13 | 豊 川口 | Vehicle |
JP2021030806A (ja) * | 2019-08-21 | 2021-03-01 | 株式会社島津製作所 | Steering support system |
JP7367922B2 (ja) | 2019-08-21 | 2023-10-24 | 株式会社島津製作所 | Steering support system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11528413B2 (en) | Image processing apparatus and image processing method to generate and display an image based on a vehicle movement | |
EP2464113B1 (fr) | Vehicle periphery image generation device | |
US20160297362A1 (en) | Vehicle exterior side-camera systems and methods | |
US10789845B2 (en) | Parking assistance method and parking assistance device | |
US7554461B2 (en) | Recording medium, parking support apparatus and parking support screen | |
US10061992B2 (en) | Image display system | |
JP5439890B2 (ja) | Image processing method, image processing apparatus, and program | |
JP6289498B2 (ja) | System and method for automatically adjusting the angle of a three-dimensional display in a vehicle | |
JP5874920B2 (ja) | Monitor device for checking vehicle surroundings | |
JP5044204B2 (ja) | Driving support device | |
WO2016190135A1 (fr) | Display system for vehicle | |
US20130057690A1 (en) | Driving assist apparatus, driving assist system, and driving assist camera unit | |
WO2016152572A1 (fr) | Indirect view presentation device | |
JP2014198531A (ja) | Image display control device, image display system, and display unit | |
JP2015054598A (ja) | Display device for vehicle | |
WO2018159017A1 (fr) | Vehicle display control device, vehicle display system, program, and vehicle display control method | |
JP4927514B2 (ja) | Driving support device | |
JP2006123670A (ja) | Moving body periphery monitoring device | |
EP1486377A2 (fr) | Device for detecting the environment of a moving body | |
JP2018077400A (ja) | Head-up display | |
JPH10175482A (ja) | Vehicle rear view support device | |
JP4855918B2 (ja) | Driving support device | |
JP6384053B2 (ja) | Rearview mirror angle setting system, rearview mirror angle setting method, and rearview mirror angle setting program | |
JP5020621B2 (ja) | Driving support device | |
WO2016088150A1 (fr) | In-vehicle display device, vehicle, and display method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16768475 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16768475 Country of ref document: EP Kind code of ref document: A1 |