WO2016075954A1 - Display device and display method - Google Patents

Display device and display method

Info

Publication number
WO2016075954A1
WO2016075954A1 (application PCT/JP2015/055957, JP2015055957W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
moving body
display
vehicle
shadow
Prior art date
Application number
PCT/JP2015/055957
Other languages
English (en)
Japanese (ja)
Inventor
拓良 柳
Original Assignee
Nissan Motor Co., Ltd. (日産自動車株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co., Ltd. (日産自動車株式会社)
Priority to JP2016558896A (granted as JP6500909B2)
Publication of WO2016075954A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a display device and a display method for stereoscopically displaying an image around a moving object.
  • Patent Document 1 discloses a moving body display device that projects an image of objects around the moving body and an image of the moving body onto a three-dimensional coordinate system.
  • the problem to be solved by the present invention is to display an image in which the position of the moving object can be easily grasped.
  • the present invention solves the above problem by displaying a video including a position display image indicating the position of the moving object.
  • since the position of the moving body can be expressed by displaying the position display image, it is possible to display an image in which the position of the moving body can be easily grasped.
  • FIGS. 2A and 2B are diagrams showing an example of the installation position of the camera of this embodiment. FIG. 3A shows an example of a three-dimensional coordinate system, and FIG. 3B shows an example of the position display image of the host vehicle in the three-dimensional coordinate system of FIG. 3A. FIG. 4A shows another example of a three-dimensional coordinate system, and FIG. 4B shows another example of the position display image of the host vehicle in the three-dimensional coordinate system of FIG. 4A. FIG. 5 shows a display example of the position display images of the host vehicle and another vehicle. A further figure shows a first example of the first aspect of the position display image.
  • FIG. 10 is a flowchart showing a subroutine of the reference point setting process shown in FIG. 9. A further figure illustrates another display example of the position display image.
  • the display system 1 of the present embodiment displays a video for grasping the moving body and the surroundings of the moving body on a display viewed by an operator of the moving body.
  • FIG. 1 is a block configuration diagram of a display system 1 including a display device 100 according to the present embodiment.
  • the display system 1 of this embodiment includes a display device 100 and a mobile device 200.
  • Each device of the display device 100 and the mobile device 200 includes a wired or wireless communication device (not shown), and exchanges information with each other.
  • the moving body to which the display system 1 of the present embodiment is applied includes a vehicle, a helicopter, a submarine explorer, an airplane, an armored vehicle, a train, a forklift, and other devices having a moving function.
  • a case where the moving body is a vehicle will be described as an example.
  • the moving body of the present embodiment may be a manned machine on which a human can be boarded, or an unmanned machine on which a human is not boarding.
  • the display system 1 of the present embodiment may be configured as a device mounted on a moving body, or may be configured as a portable device that can be brought into the moving body.
  • a part of the configuration of the display system 1 according to the present embodiment may be mounted on a moving body, and another configuration may be mounted on a device physically different from the moving body, and the configuration may be distributed.
  • the mobile body and another device are configured to be able to exchange information.
  • the mobile device 200 of this embodiment includes a camera 40, a controller 50, a sensor 60, a navigation device 70, and a display 80.
  • these devices are connected to one another by an in-vehicle LAN such as a CAN (Controller Area Network) and can exchange information.
  • the camera 40 of the present embodiment is provided at a predetermined position of a vehicle (an example of a moving body; the same applies hereinafter).
  • the number of cameras 40 provided in the vehicle may be one or plural.
  • the camera 40 mounted on the vehicle images the vehicle and / or the surroundings of the vehicle, and sends the captured image to the display device 100.
  • the captured image in the present embodiment includes a part of the vehicle and a video around the vehicle.
  • the captured image data is used for calculation processing of the positional relationship with the ground surface around the vehicle and generation processing of the image of the vehicle or the surroundings of the vehicle.
  • FIGS. 2A and 2B are diagrams illustrating an example of the installation position of the camera 40 mounted on the host vehicle V.
  • the host vehicle V includes a right front camera 40R1, a right center camera 40R2, a right rear camera 40R3, a left front camera 40L1 of the host vehicle V, and a left side.
  • Six cameras 40 of a center camera 40L2 and a left rear camera 40L3 are installed.
  • the arrangement position of the camera 40 is not particularly limited, and the imaging direction can also be set arbitrarily.
  • Each camera 40 sends a captured image to the display device 100 at a command from the control device 10 to be described later or at a preset timing.
  • the captured image captured by the camera 40 is used for generating a video, and for detecting an object and measuring a distance to the object.
  • the captured image of the camera 40 of the present embodiment includes an image of an object around the vehicle.
  • the target object in the present embodiment includes other vehicles, pedestrians, road structures, parking lots, signs, facilities, and other objects existing around the host vehicle V.
  • the object in the present embodiment includes the ground surface around the moving body.
  • the “ground surface” is a term indicating a concept including the surface of the earth and the surface of the earth's crust (land).
  • the term “ground surface” in the present embodiment includes a land surface, a sea surface, a river surface, a lake surface, a seabed surface, a road surface, a parking lot surface, a port surface, and surfaces combining two or more of these.
  • the term “ground surface” in the present embodiment also includes the surface of a structure, such as the floor surface or wall surface of a facility.
  • the “ground surface” used in the description here is a (tangible) surface exposed to the camera 40 during imaging.
  • the camera 40 of this embodiment includes an image processing device 401.
  • the image processing apparatus 401 extracts features such as an edge, a color, a shape, and a size from the captured image data of the camera 40, and identifies an attribute of the target object included in the captured image from the extracted features.
  • the image processing apparatus 401 stores in advance the characteristics of each target object, and identifies the target object included in the captured image by pattern matching processing.
  • the method for detecting the presence of an object using captured image data is not particularly limited, and a method known at the time of filing this application can be used as appropriate.
  • the image processing device 401 calculates the distance from the own vehicle to the object from the position of the feature point extracted from the data of the captured image of the camera 40 or the change over time of the position.
  • the image processing apparatus 401 uses imaging parameters such as the installation position of the camera 40, the optical axis direction, and imaging characteristics.
  • the method for measuring the distance to the object using the captured image data is not particularly limited, and a method known at the time of filing the present application can be used as appropriate.
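As an illustrative aside (not part of the patent text), a simple instance of such a distance calculation can be sketched with a pinhole camera model. The imaging parameters below (focal length in pixels, mounting height, pixel coordinates) are hypothetical, and the camera's optical axis is assumed parallel to a flat ground surface:

```python
def ground_distance(v_px: float, cy: float, f_px: float, cam_height_m: float) -> float:
    """Estimate the distance to a ground-contact point from its image row.

    Pinhole model with the optical axis parallel to the ground: a ground point
    at distance d projects to a row (v - cy) = f * h / d below the principal
    point, so d = f * h / (v - cy).
    """
    dv = v_px - cy
    if dv <= 0:
        raise ValueError("point must lie below the horizon row")
    return f_px * cam_height_m / dv

# Example: camera mounted 1.2 m high, focal length 800 px, ground-contact
# point imaged 60 px below the principal point (row 360 vs. cy = 300).
d = ground_distance(360.0, 300.0, 800.0, 1.2)  # 800 * 1.2 / 60 = 16.0 m
```

This is a sketch under idealized assumptions; a real implementation would account for camera tilt and lens distortion using the calibrated imaging parameters mentioned above.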
  • the distance measuring device 41 may be provided as means for acquiring data for calculating the positional relationship with the host vehicle V.
  • the distance measuring device 41 may be used together with the camera 40 or may be used instead of the camera 40.
  • the distance measuring device 41 detects a target existing around the host vehicle V and measures the distance between the target and the host vehicle V. That is, the distance measuring device 41 has a function of detecting an object around the host vehicle V.
  • the distance measuring device 41 sends distance measurement data up to the measured object to the display device 100.
  • the distance measuring device 41 may be a radar distance measuring device or an ultrasonic distance measuring device. A ranging method known at the time of filing of the present application can be used.
  • the number of distance measuring devices 41 installed on the host vehicle V is not particularly limited.
  • the installation position of the distance measuring device 41 on the host vehicle V is not particularly limited.
  • the distance measuring device 41 may be provided at a position corresponding to the installation position of the camera 40 shown in FIG. 2, in the vicinity thereof, or in front of or behind the host vehicle V. When the moving body is a helicopter, an airplane, a submarine explorer, or another body that moves in the height direction, the camera 40 and/or the distance measuring device 41 may be provided on the bottom side of the body.
  • the controller 50 of this embodiment controls the operation of the moving object including the host vehicle V.
  • the controller 50 centrally manages each piece of information related to the operation of the moving body, including detection information of the sensor 60 described later.
  • the sensor 60 of the present embodiment includes a speed sensor 61 and a longitudinal acceleration sensor 62.
  • the speed sensor 61 detects the moving speed of the host vehicle V.
  • the longitudinal acceleration sensor 62 detects the acceleration in the longitudinal direction of the host vehicle V.
  • the navigation device 70 of the present embodiment includes a position detection device 71 including a GPS (Global Positioning System) 711, map information 72, and road information 73.
  • the navigation device 70 obtains the current position of the host vehicle V using the GPS 711 and sends it to the display device 100.
  • the map information 72 of the present embodiment is information in which points are associated with roads, structures, facilities, and the like.
  • the navigation device 70 has a function of referring to the map information 72, obtaining a route from the current position of the host vehicle V detected by the position detection device 71 to the destination, and guiding the host vehicle V.
  • the road information 73 of this embodiment is information in which position information and road attribute information are associated with each other.
  • the road attribute information includes road attributes such as whether each lane is an overtaking lane and whether it is an uphill lane.
  • the navigation device 70 refers to the road information 73 and, at the current position detected by the position detection device 71, can obtain information on whether the lane adjacent to the road on which the host vehicle V is traveling is an overtaking lane (a lane with a relatively high traveling speed) or an uphill lane (a lane with a relatively low traveling speed).
  • the control device 10 can predict the vehicle speed of the other vehicle from the detected attribute information of the road on which the other vehicle travels.
  • the display 80 of the present embodiment displays a video of the host vehicle V and its surroundings, generated from an arbitrary virtual viewpoint by the display device 100 described later.
  • the display system 1 in which the display 80 is mounted on a moving body will be described as an example.
  • the display 80 may be provided on the portable display device 100 side that can be brought into the moving body.
  • the display device 100 of this embodiment includes a control device 10.
  • the control device 10 of the display device 100 includes a ROM (Read Only Memory) 12 in which a program for displaying the moving body and surrounding images is stored, a CPU (Central Processing Unit) 11 serving as an operation circuit that executes the program stored in the ROM 12 to realize the functions of the display device 100, and a RAM (Random Access Memory) 13 functioning as an accessible storage device.
  • the control device 10 may include a GPU (Graphics Processing Unit) that executes image processing.
  • the control device 10 of the display device 100 realizes an image acquisition function, an information acquisition function, an image generation function, and a display function.
  • the control device 10 of this embodiment executes each function through the cooperation of the software for realizing that function and the hardware described above.
  • the control device 10 acquires captured image data captured by the camera 40.
  • the display device 100 acquires captured image data from the mobile device 200 using a communication device (not shown).
  • the control device 10 acquires various types of information from the mobile device 200 using a communication device (not shown).
  • the control apparatus 10 acquires the current position information of the host vehicle V as a moving body.
  • the control device 10 acquires the current position detected by the GPS 711 of the navigation device 70.
  • the control device 10 acquires position information of an object existing around the host vehicle V as a moving body.
  • the acquired position information of the object is used for setting processing of the position of the virtual light source described later.
  • the control device 10 calculates the distance from the host vehicle V to the target object from the captured image of the camera 40 as position information of the target object with respect to the host vehicle V.
  • the control device 10 may use the imaging parameter of the camera 40 for the calculation process of the position information of the object.
  • the control device 10 may acquire the position information of the object calculated by the image processing device 401.
  • the control device 10 acquires the speed of the object.
  • the control device 10 calculates the speed of the object from the change with time of the position information of the object.
  • the control device 10 may calculate the speed of the object based on the captured image data.
  • the control device 10 may acquire the speed information of the object calculated by the image processing device 401.
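The speed calculation from the change of the object's position over time can be sketched minimally as follows (illustrative only; the positions and sampling interval are hypothetical values):

```python
import math

def speed_from_positions(p0, p1, dt_s):
    """Speed in m/s of an object from two ground-plane positions (x, y) in
    meters, sampled dt_s seconds apart."""
    return math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / dt_s

# An object that moved 3 m east and 4 m north in 0.5 s is doing 10 m/s.
v = speed_from_positions((0.0, 0.0), (3.0, 4.0), 0.5)
```

In practice the positions would come from the captured-image distance estimates or the distance measuring device, and some smoothing over several samples would be applied.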
  • the control device 10 acquires attribute information of the lane on which the target object travels.
  • the control device 10 refers to the road information 73 of the navigation device 70 and acquires the attribute information of the lane on which the object travels.
  • the control device 10 refers to the map information 72 or the road information 73 and identifies a road and a traveling lane that include the acquired position of the object.
  • the control device 10 refers to the road information 73 and acquires attribute information associated with the travel lane of the identified object.
  • the control device 10 calculates the positional relationship between the position of the host vehicle V detected by the GPS 711 and the target object, and, taking this positional relationship into account, determines the attribute of the traveling lane of the target object from the traveling lane attribute of the host vehicle V; for example, it can determine that the traveling lane of the other vehicle is an overtaking lane.
  • the control device 10 acquires the acceleration in the traveling direction of the host vehicle V that is a moving body.
  • the control device 10 acquires the acceleration in the traveling direction of the host vehicle V from the host vehicle V.
  • the control device 10 acquires the longitudinal acceleration detected by the longitudinal acceleration sensor 62.
  • the control device 10 may calculate the acceleration from the speed detected by the speed sensor 61.
  • the control device 10 may calculate acceleration from a change in position information of the host vehicle V detected by the GPS 711.
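Calculating acceleration from detected speed changes can likewise be sketched with finite differences (illustrative only; the sample values and interval are hypothetical):

```python
def accel_from_speeds(speeds_mps, dt_s):
    """Forward finite differences of speed samples (m/s) taken every dt_s
    seconds give acceleration estimates in m/s^2."""
    return [(b - a) / dt_s for a, b in zip(speeds_mps, speeds_mps[1:])]

# Speeds 10, 11, 13 m/s sampled every 0.5 s: accelerations 2.0 and 4.0 m/s^2.
a = accel_from_speeds([10.0, 11.0, 13.0], 0.5)
```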
  • the control device 10 uses the position information of the moving body to generate a video including a position display image indicating the position of the moving body in the captured image. Specifically, the control device 10 of the present embodiment sets a reference point using the position information of the moving body and generates a video containing the captured image and a position display image that indicates the position of the moving body as observed from the reference point. The position display image of the present embodiment is an image representing the position where the moving body exists. The location of the moving body may be expressed positively, by a diagram, a shadow, or the like, or passively, by an absence or change of the background.
  • the position display image of the present embodiment may include a diagram image indicating the position of the moving object in the captured image that is observed from a preset reference point.
  • the control device 10 uses the position information of the moving body to set a viewpoint from which the moving body is viewed as a reference point, and includes a diagram image indicating the position of the moving body when the moving body is viewed from this viewpoint. Generate a display image.
  • the reference point is a viewpoint for observing the moving body.
  • the reference point may be set based on the position where the moving object exists.
  • the reference point may be set based on the position of the projection plane described later.
  • the diagram image of the position display image may be a graphic image imitating the outer shape of the moving body or an icon image reminiscent of the moving body.
  • the representation mode of the diagram image is not particularly limited.
  • the form of lines such as line thickness, color, broken line, and double line of the diagram image is not limited.
  • the hue, brightness, saturation, color tone (tone), pattern, gradation, and size (enlargement / reduction) of the diagram image are not particularly limited.
  • the control device 10 of this embodiment calculates the display position of the diagram image using the actual position information of the moving body.
  • the position display image of the present embodiment may include a background image indicating the position of the moving object observed from a preset reference point.
  • the control device 10 uses the position information of the moving body to set a viewpoint from which the moving body is viewed as a reference point, and when the moving body is viewed from this viewpoint, a position display including a background image indicating the position of the moving body Generate an image.
  • the position of the moving object is expressed by a background image.
  • the reference point is a viewpoint for observing the moving body.
  • the reference point may be set based on the position where the moving object exists.
  • the reference point may be set based on the position of the projection plane described later.
  • the representation mode of the background image is not particularly limited.
  • the mode of lines such as line thickness, color, broken line, and double line of the background image is not limited.
  • the hue, brightness, saturation, color tone (tone), pattern, gradation, and size (enlargement / reduction) of the background image are not particularly limited.
  • the control device 10 of this embodiment calculates the display position of the background image using the actual position information of the moving body.
  • the position display image of the present embodiment may include a shadow image indicating the position of the moving object observed from a preset reference point.
  • the position of the moving object is expressed by the shadow image of the moving object.
  • the reference point is a light source that irradiates light to the moving body.
  • the control device 10 sets a virtual light source as a reference point using the position information of the moving body, and generates a position display image including a shadow image that imitates the shadow produced when light is emitted from the virtual light source onto the moving body and that indicates the existence position of the moving body.
  • in other words, the shadow image imitating the shadow produced when light is emitted from the virtual light source onto the moving body indicates the position of the moving body as observed from a preset reference point.
  • the expression form of the shadow image is not particularly limited.
  • the form of the line such as the thickness, color, broken line, double line, etc. of the shadow image is not limited.
  • the hue, brightness, saturation, color tone (tone), pattern, gradation, and size (enlargement / reduction) of the shadow image are not particularly limited.
  • the position display image is displayed at the position where the moving body exists in the coordinate system of the captured image, which is obtained using the actual position information of the moving body.
  • the control device 10 sets a virtual light source according to the position information of the host vehicle V, and generates a shadow image imitating a shadow that is generated when the host vehicle V is irradiated with light from the virtual light source.
  • the control device 10 displays a video obtained by projecting the captured image onto a three-dimensional coordinate system that includes the position display image indicating the position of the moving body.
  • the displayed video includes part or all of the captured image and the position display image.
  • the control device 10 generates a position display image indicating the position where the moving object is present.
  • the position display image includes a diagram image, a background image, and a shadow image.
  • the control device 10 sets a viewpoint as a reference point according to the acquired position information of the host vehicle V, and generates a diagram image and a background image imitating the view when the moving body and objects are seen from that reference point.
  • the control device 10 sets a virtual light source as a reference point according to the acquired position information of the host vehicle V, and generates a shadow image imitating a shadow generated when light is emitted from the virtual light source to the moving object or the object. Generate.
  • the shadow image may be an image approximated to the shadow itself, or may be an image obtained by deforming the shadow itself in order to display the position and orientation of the host vehicle V.
  • the position display image, including the diagram image, the background image, and the shadow image, need not strictly correspond to the current shape of the host vehicle V; it is sufficient that the image can indicate the position where the host vehicle V exists. Since only the position needs to be conveyed, the image need not imitate the shape of the host vehicle V. However, a position display image imitating the shape of the host vehicle V may be generated so that the traveling direction of the host vehicle V can be understood.
  • the shadow image is an image that looks like a shadow, not an actual shadow. A pattern or color may be added to the position display image.
  • the hue, brightness, saturation, color tone (tone), pattern, and gradation of the position display image can be changed according to the illuminance outside the moving body, the brightness of the captured image, and the like.
  • the position display image including the diagram image, the background image, and the shadow image is not limited to the shape corresponding to the current shape of the host vehicle V, and may be an image showing the movable range of the host vehicle V.
  • the position display image of the present embodiment may be an image showing the movable range of the door when the door of the vehicle V that is currently closed is released.
  • the mode of the position display image is not limited, and is appropriately designed according to the information to be shown to the user. A shadow mapping technique known at the time of filing of the present application may be used to generate the shadow image.
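One simple way to imitate such a shadow, sketched here for illustration only (the patent does not specify the projection math, and the light position and outline point below are hypothetical), is classic planar shadow projection: each point of the vehicle outline is projected from the virtual light source onto the ground plane z = 0:

```python
def shadow_on_ground(point, light):
    """Project a 3-D point onto the ground plane z = 0 along the ray from a
    virtual point light source, giving one vertex of the shadow outline."""
    px, py, pz = point
    lx, ly, lz = light
    if lz <= pz:
        raise ValueError("light must be above the projected point")
    t = lz / (lz - pz)  # parameter where the ray L + t * (P - L) reaches z = 0
    return (lx + t * (px - lx), ly + t * (py - ly), 0.0)

# A point 5 m high, lit from 10 m directly above the origin, lands twice
# as far from the light's ground position as the point itself.
s = shadow_on_ground((2.0, 0.0, 5.0), (0.0, 0.0, 10.0))
```

Repeating this for every outline vertex yields a polygon that can be drawn, tinted, or patterned as the position display image.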
  • FIG. 3A is a diagram illustrating an example of a cylindrical solid coordinate system S1.
  • the host vehicle V is shown placed on the plane G0.
  • FIG. 3B is a diagram illustrating a display example of the position display image SH including the diagram image, the background image, and the shadow image in the three-dimensional coordinate system S1 illustrated in FIG. 3A.
  • the position display image SH including the diagram image, background image, or shadow image in this example is projected on the plane G0.
  • FIG. 4A is a diagram illustrating an example of a spherical solid coordinate system S2.
  • the host vehicle V is shown placed on the plane G0.
  • FIG. 4B is a diagram illustrating a display example of the position display image SH including the diagram image, the background image, or the shadow image of the host vehicle V in the three-dimensional coordinate system S2 illustrated in FIG. 4A.
  • the position display image SH is projected on the plane G0.
  • the position display image SH shown in FIGS. 3B and 4B is a shadow image SH imitating a shadow generated when light is emitted from the virtual light source LG set in the three-dimensional coordinate systems S1 and S2 to the host vehicle V.
  • the position display image SH shown in FIG. 3B and FIG. 4B is a diagram image showing the existence position of the host vehicle V when the host vehicle V is viewed from the viewpoint LG set in the three-dimensional coordinate systems S1 and S2.
  • the captured image is projected onto the three-dimensional coordinate system S (S1, S2) of FIGS. 3B and 4B.
  • the position display image SH may indicate the existence position of the host vehicle V as viewed from the viewpoint LG by changing the background image of the captured image projected onto the three-dimensional coordinate systems S1 and S2 (for example, by shifting the relative position of the background image). As shown in FIGS. 3B and 4B, the presence and orientation of the host vehicle V can be expressed by the position display image SH of the host vehicle V. Thereby, even when images of surrounding objects are projected onto the three-dimensional coordinate systems S1 and S2, a video in which the positional relationship between the host vehicle V and the objects can be easily grasped can be displayed.
  • the shape of the three-dimensional coordinate system S of the present embodiment is not particularly limited, and may be a bowl shape disclosed in Japanese Patent Application Laid-Open No. 2012-138660.
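Projecting a captured image onto such a cylindrical coordinate system amounts to intersecting each viewing ray with the cylinder. A minimal sketch (illustrative only; a vertical cylinder axis through the vehicle is assumed, and the ray and radius values are hypothetical):

```python
import math

def ray_to_cylinder(direction, radius):
    """Intersect a viewing ray from the coordinate origin (the vehicle) with a
    vertical cylindrical projection surface of the given radius, returning the
    3-D point where the ray's pixel is drawn."""
    dx, dy, dz = direction
    horiz = math.hypot(dx, dy)
    if horiz == 0.0:
        raise ValueError("ray is parallel to the cylinder axis")
    t = radius / horiz  # scale so the horizontal component reaches the radius
    return (t * dx, t * dy, t * dz)

# A ray with horizontal length 5 scaled onto a radius-10 cylinder doubles.
p = ray_to_cylinder((3.0, 4.0, 1.0), 10.0)
```

A spherical surface such as S2 would be handled the same way, scaling each ray to the sphere's radius instead of to the cylinder wall.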
  • the control device 10 of the present embodiment sets a reference point according to the acquired position information of an object such as another vehicle, and generates a position display image indicating the position of the object observed from this reference point.
  • the position display image includes the above-described diagram image, background image, and shadow image.
  • the control device 10 according to the present embodiment uses the acquired position information of a target object such as the other vehicle VX to set a viewpoint for viewing the target object as a reference point, and generates a diagram image or background image indicating the location of the target object as viewed from this viewpoint.
  • the control device 10 of the present embodiment sets a virtual light source as a reference point according to the acquired position information of the target object such as the other vehicle VX, and generates a shadow image imitating the shadow produced when the target object is irradiated with light from the virtual light source.
  • a position display image (a diagram image, a background image, a shadow image) indicating the position where the moving object is present is projected onto the three-dimensional coordinate systems S1 and S2.
  • the reference point for generating the diagram image or the background image may be set according to a reference position (center of gravity, driver's seat position, etc.) set in advance on the moving body, or may be set at a predetermined position in the three-dimensional coordinate systems S1 and S2.
  • the reference point is set according to the height of the headrest of the driver's seat of the moving body.
  • FIG. 5 includes a position display image (diagram image, background image, shadow image) SH of the host vehicle V, a position display image SH1 of the other vehicle VX1 as a target object, and a position display image SH2 of the other vehicle VX2.
  • the control device 10 of the present embodiment displays, on the projection plane SQ, the position display image SH indicating the position of a moving body including the host vehicle V and/or the other vehicles VX1 and VX2.
  • the position display image SH is an image showing the position of the moving body when the moving body is observed from a reference point set at a position corresponding to the position of the moving body.
  • the position display image SH may be a shadow image imitating a shadow when a moving body is irradiated with light from a virtual light source.
  • the position of the reference point LG including the viewpoint and the virtual light source may be set according to the reference position ⁇ of the moving object.
  • the reference position ⁇ can be arbitrarily set according to the center of gravity position, the center position, and the like of the moving body.
  • the reference point (viewpoint, virtual light source) LG for the host vehicle V, the other vehicle VX1, and the other vehicle VX2 may be one point, or may be set for each of the host vehicle V, the other vehicle VX1, and the other vehicle VX2.
  • the position of the reference point (viewpoint, virtual light source) LG for the host vehicle V and the position of the reference point (viewpoint, virtual light source) LG for the other vehicle VX1 (VX2) may be the same or different. May be.
• the positional relationship between the host vehicle V and the target object is presented by displaying the position display images (diagram image, background image, shadow image) not only of the host vehicle V but also of target objects such as the other vehicle VX.
• the control device 10 of the present embodiment sets the projection plane SQ, onto which the position display images (diagram image, background image, shadow image) SH are projected, along the direction in which the host vehicle V moves.
• the control device 10 of the present embodiment generates a position display image (diagram image, background image, shadow image) SH1 of the other vehicle VX1, traveling in the adjacent lane Ln1 next to the travel lane Ln2 in which the host vehicle V travels (moves), as observed from the reference point.
• by projecting the position display image (diagram image, background image, shadow image) SH1 of the other vehicle VX1 (target object) traveling in the adjacent lane Ln1 onto the common projection plane SQ together with the position display image (diagram image, background image, shadow image) SH of the host vehicle V, it is possible to present an image that makes it easy to grasp the positional relationship between the host vehicle V and the other vehicle VX1.
• by setting the projection plane SQ along the traveling direction of the host vehicle V, it is possible to present an image in which the distance between the host vehicle V and the other vehicle VX1 can easily be recognized.
• onto this projection plane SQ, a shadow image SH1 is projected. Further, by setting the projection plane SQ so as to be substantially orthogonal to (intersect at 90 degrees with) the road surface of the lane Ln2 on which the host vehicle V is traveling, the position display image SH of the host vehicle V and the position display image SH1 of the other vehicle VX1 can be displayed at a position that the driver of the host vehicle V can easily see. That is, the driver can easily check the position display images SH and SH1.
• the projection position at which the diagram image SH (or background image SH) is projected onto the projection plane SQ is determined according to an arbitrary reference position of the host vehicle V (or other vehicles VX1, VX2) so as to indicate the position of the host vehicle V (or other vehicles VX1, VX2).
• the position, in XZ coordinates, of the arbitrary reference position of the host vehicle V (or other vehicles VX1, VX2) shown in FIG. 5 preferably coincides with the corresponding position, in XZ coordinates, on the projection plane SQ.
  • the diagram image SH, the background image SH, and the shadow image SH are displayed at substantially the same position on the projection plane SQ.
  • the diagram image SH2, the background image SH2, and the shadow image SH2 are displayed at substantially the same position on the projection plane SQ.
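The mapping described above, in which the projection position on the plane SQ shares the X and Z coordinates of a vehicle's reference position, can be sketched as follows. This is a purely illustrative reconstruction; the function name, plane placement, and all coordinate values are assumptions and are not taken from the disclosure.

```python
# Illustrative sketch (not from the patent text): mapping a vehicle's
# reference position onto the common projection plane SQ.

def project_onto_sq(ref_pos, plane_y):
    """Project a 3-D reference position (x, y, z) onto the projection
    plane SQ, assumed parallel to the XZ plane at lateral offset plane_y.
    The X and Z coordinates are preserved, so the projected point shares
    the vehicle's position in XZ coordinates, as the text suggests."""
    x, _, z = ref_pos
    return (x, plane_y, z)

# Host vehicle V and other vehicle VX1 share one common plane SQ:
sq_y = 5.0                       # hypothetical lateral position of SQ (m)
host_ref = (12.0, 0.0, 0.6)      # arbitrary reference position of V
other_ref = (20.0, 3.5, 0.6)     # reference position of VX1 in lane Ln1

sh = project_onto_sq(host_ref, sq_y)    # projection position of SH
sh1 = project_onto_sq(other_ref, sq_y)  # projection position of SH1
```

Because both images land on the same plane, their separation along X directly reflects the longitudinal distance between the two vehicles.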
• the region Rd1 in each figure is the background image Rd1 corresponding to road structures such as the travel path of the moving body, the adjacent travel path, and guardrails and roadside zones.
• the region Rd2 in each figure (the region above the region Rd1) is the background image Rd2 corresponding to trees, buildings, signs, and the like on the roadside of the travel path of the moving body.
  • Background image Rd1 and background image Rd2 are displayed on projection plane SQ.
  • the projection plane SQ may include both the background images Rd1 and Rd2 or any one of them.
  • FIGS. 6A to 6F are views showing aspects of a position display image including a shadow image and a position display image including a diagram image SH imitating the shape of the vehicle.
  • the shadow image also approximates the shape of the vehicle.
  • a position display image including a shadow image and a position display image including a diagram image SH representing a shape of a vehicle will be described with reference to the same drawing.
  • FIG. 6A shows a position display image in which the shadow area of the shadow image SH6a is shown as an opaque area (transmittance is less than a predetermined value).
  • FIG. 6A shows a position display image in which the region in the figure of the diagram image SH6a is shown as an opaque region (low transmittance).
  • FIG. 6B shows a position display image in which the shadow area of the shadow image SH6b is shown as a semi-transparent area (transmittance is a predetermined value or more).
  • FIG. 6B shows a position display image in which the in-figure region of the diagram image SH6b is shown as a semi-transparent region (transmittance is a predetermined value or more).
• FIG. 6C shows a position display image in which the brightness of the shadow area (or diagram area) of the shadow image (or diagram image) SH6c is made higher than the brightness of the area outside the shadow image (or diagram image).
• when the brightness of the background images Rd1 and Rd2 is low, or when a captured image is included in the position display image, lowering the brightness of the shadow image (or diagram image) SH6a as shown in FIG. 6A may make the drawing position of the shadow image (or diagram image) SH6a unclear.
• in such a case, as shown in FIG. 6C, a position display image is used in which the brightness of the shadow area (or diagram area) of the shadow image (or diagram image) SH6c is made higher than that of the surrounding area.
• FIG. 6D shows a position display image in which the shadow region (or diagram region) of the shadow image (or diagram image) SH6d is colored. Since colors cannot be expressed in the drawings attached to the application, the coloring of the shadow image (or diagram image) SH6d is indicated for convenience by hatching. The hue, brightness, and saturation of the color of the shadow area (or diagram area) are not limited.
• FIG. 6E shows a position display image including the outline of the shadow region (or diagram region) of the shadow image (or diagram image) SH6e. The inner region surrounded by the outline of the shadow area (or diagram area) shown in FIG. 6E is transparent or translucent.
  • FIG. 6F shows a position display image obtained by blurring a part of the shadow region (or diagram region) of the shadow image (or diagram image) SH6f.
  • gradation is given so that the brightness decreases from the inside toward the outside.
• in this way, the outline of the shadow area (or diagram area) can be shown clearly, while objects on the far side of the moving body can be shown by increasing the transparency of the inner region. Thereby, the position of the moving body and the presence of objects on the far side of the moving body can be confirmed simultaneously.
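The display aspects of FIGS. 6A, 6B, and 6E can be modeled as per-pixel opacity rules applied to the shadow (or diagram) area. The following sketch assumes a binary mask and illustrative alpha values; the style names and thresholds are inventions for this example, not terms from the disclosure.

```python
# Hypothetical sketch of three of the styles in FIGS. 6A-6F: given a
# shadow/diagram mask, compute the per-pixel opacity (alpha) used when
# compositing the position display image over the background.

def shadow_alpha(mask, style, threshold=0.5):
    """mask: 2-D list of 1 (inside the shadow area) / 0 (outside).
    Returns a 2-D list of alpha values in [0, 1]."""
    h, w = len(mask), len(mask[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            if style == "opaque":          # FIG. 6A: low transmittance
                out[y][x] = 1.0
            elif style == "translucent":   # FIG. 6B: background shows through
                out[y][x] = threshold
            elif style == "outline":       # FIG. 6E: only the contour is drawn
                on_edge = (y == 0 or x == 0 or y == h - 1 or x == w - 1
                           or not mask[y - 1][x] or not mask[y + 1][x]
                           or not mask[y][x - 1] or not mask[y][x + 1])
                out[y][x] = 1.0 if on_edge else 0.0
    return out
```

The gradation of FIG. 6F would replace the binary edge test with an alpha that falls off with distance from the contour.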
  • FIG. 7 is a diagram illustrating an aspect of a position display image including a background image.
• FIG. 7 shows an example in which the shadow image (or diagram image) SH7a is translucent or transparent and the height (position in the Z direction) of the road background image Rd1a displayed through the shadow image (or diagram image) SH7a is changed.
  • the height in the Z-axis direction of the road background image Rd1a transparently displayed inside the shadow image (or diagram image) SH7a is lower than the height of the other road background images Rd1b and Rd1c in the Z-axis direction.
• FIG. 7 also shows an example in which the shadow image (or diagram image) SH7a is translucent or transparent and the horizontal position (position in the X direction) of the utility-pole background image Rp1a displayed through the shadow image (or diagram image) SH7a is changed.
• the position in the X-axis direction of the background image Rp1a of the utility pole displayed through the shadow image (or diagram image) SH7a is shifted to the right in the figure relative to the position in the X-axis direction of the background images Rp1b of the other utility poles.
• as a result, the X-axis positions of the utility-pole background images Rp1a and Rp1b, which should be continuous, change at the boundary of the shadow image (or diagram image) SH7a. From this change in the horizontal position of the utility-pole background image Rp1 shown in the position display image, the user can recognize the position (in the X-axis direction) of the moving body.
• alternatively, a region in which part of the background image is missing (an outlined region) may be formed along the outline of the shadow image (or diagram image) SH7a. That is, an outlined region along the outline of the shadow image (or diagram image) SH7a may be added to the background image Rd1 and the background image Rd2.
  • FIGS. 8A to 8D are diagrams showing aspects of a position display image including a diagram.
  • FIGS. 6A to 6F show examples of graphics imitating the outline of the moving body, but FIGS. 8A to 8D show an aspect of a position display image including an axis indicating a coordinate position and a graphic.
  • FIG. 8A is an example in which the position of the moving object is indicated by a circular diagram image SH8a.
  • the position of the moving body on the projection plane SQ can be indicated by the displayed position of the circle.
  • the diagram image SH8a illustrated in FIG. 8A further includes a vertical line SH8az along the Z-axis direction of the projection surface SQ and a horizontal line SH8ax along the X-axis direction of the projection surface SQ.
  • the vertical line SH8az and the horizontal line SH8ax are substantially orthogonal at the center point SH8a0.
  • the center point SH8a0 corresponds to a reference position such as the center of gravity of the moving object.
  • the vertical line SH8az and the horizontal line SH8ax may be graduated.
• the XZ coordinates are graduated with reference to the center of the circular diagram image SH8a. Thereby, the user can grasp the position of the moving body on the projection plane SQ quantitatively.
• FIG. 8B is an example showing the position of the moving body using coordinate axes SH8bz and SH8bx that are substantially orthogonal to each other at the center point SH8b0.
  • the center point SH8b0 corresponds to a reference position such as the center of gravity of the moving body.
  • the position of the moving object on the projection plane SQ can be indicated by the position of the center point and the coordinate axis.
  • FIG. 8C is an example showing the position of the moving object using two vertical lines SH8c1 and SH8c2 along the Z direction of the projection plane SQ.
  • the position of the vertical line SH8c1 corresponds to the position of the front end (or rear end) of the moving body.
  • the position of the vertical line SH8c2 corresponds to the position of the rear end (or front end) of the moving body.
  • the position of the moving body on the projection plane SQ can be indicated by the positions of the two vertical lines SH8c1 and SH8c2 in the X-axis direction.
  • FIG. 8D is an example showing the location of the moving object using a rectangular area SH8d.
  • One end of the region SH8d in the X-axis direction is defined by a vertical line SH8d1, and the other end is defined by a vertical line SH8d2.
  • the height of the vertical line SH8d1 in the Z-axis direction is not limited. In this example, the height of the vertical line SH8d1 in the Z-axis direction is the same as the height of the projection surface SQ in the Z-axis direction.
  • the position of the vertical line SH8d1 in the X-axis direction corresponds to the position of the front end (or rear end) of the moving body.
  • the position of the vertical line SH8d2 in the X-axis direction corresponds to the position of the rear end (or front end) of the moving body.
  • the position of the moving body on the projection plane SQ can be indicated by the positions of the two vertical lines SH8d1 and SH8d2 in the X direction.
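The markers of FIGS. 8C and 8D reduce to mapping the front and rear ends of the moving body to two X positions on the projection plane. A minimal sketch, assuming (purely for illustration) that the reference position is the center of the body:

```python
# Illustrative sketch (names and values assumed): the two vertical lines
# SH8c1/SH8c2 of FIG. 8C, or SH8d1/SH8d2 bounding the rectangular area
# SH8d of FIG. 8D, are placed at the body's rear and front ends.

def end_lines(ref_x, body_length):
    """Return the X positions of the rear-end and front-end vertical
    lines, assuming ref_x is the centre of the body."""
    half = body_length / 2.0
    return ref_x - half, ref_x + half   # (rear end, front end)

rear, front = end_lines(ref_x=10.0, body_length=4.5)
width = front - rear                    # extent of rectangular area SH8d
```

The rectangular area of FIG. 8D is then simply the span between the two lines, drawn at any chosen height along Z.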
  • the control device 10 generates a video including a position display image indicating the position of the target in the captured image, using the position information of the target existing around the moving body acquired by the information acquisition function.
• a position display image of a target object such as an other vehicle includes a diagram image, a background image, and a shadow image, as does the position display image of the host vehicle (moving body). To avoid repetition, the description given above applies to these as well.
• the control device 10 of the present embodiment generates a position display image (diagram image, background image, shadow image) of the preceding other vehicle VX2, which travels ahead in the travel lane in which the host vehicle V travels (moves), as observed from the reference point.
  • the control device 10 of the present embodiment uses the position display image SH of the host vehicle V, the position display image SH1 of the other vehicle VX1, and the position display image SH2 of the other vehicle VX2 on a common projection plane SQ. Project.
  • the driver of the own vehicle V can correctly grasp the positional relationship between the own vehicle V, the other vehicle VX1, and the other vehicle VX2 from the positional relationship with the position display images SH, SH1, and SH2.
  • the timing at which the host vehicle tries to change the lane from the lane Ln2 to the lane Ln1 is determined from the steering angle of the host vehicle, the blinker operation, the braking operation, and the like.
• the projection position of the position display image such as the shadow image SH is changed according to the acceleration or deceleration of the host vehicle V. Specifically, when the host vehicle V is accelerating, the control device 10 shifts the position of the position display image (diagram image, background image, shadow image) SH forward. On the other hand, when the host vehicle V is decelerating, the control device 10 shifts the position of the position display image (diagram image, background image, shadow image) SH backward. Thereby, a position display image SH from which the situation of the host vehicle V can easily be grasped is displayed according to its state.
  • the projection position of the position display image (line image, background image, shadow image) is changed according to the difference between the speed of the lane in which the host vehicle V travels and the speed of the host vehicle V.
• the flow speed of the lane may be the average speed of the other vehicles VX traveling in the same lane as the host vehicle V, or the legal speed limit of the lane.
• when the difference between the vehicle speed of the host vehicle V and the flow speed of the lane is a large positive value, that is, when the host vehicle V is approaching or overtaking a preceding other vehicle, the control device 10 shifts the position of the position display image (diagram image, background image, shadow image) SH forward.
• when the difference between the vehicle speed of the host vehicle V and the flow speed of the lane is small, or is a large negative value, that is, when the host vehicle V is being approached or overtaken by a following other vehicle, the control device 10 shifts the position display image SH backward.
• in this way, a position display image (diagram image, background image, shadow image) SH that makes it easy to grasp the positional relationship between the host vehicle V and the other vehicle VX can be presented according to the relative situation between the host vehicle V and the other vehicle VX.
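The two shift rules above (forward under acceleration or when outrunning the lane flow, backward under deceleration or when being outrun) can be combined into one signed shift. The gains below are arbitrary assumptions for illustration, not values from the disclosure.

```python
# A minimal sketch (all constants assumed) of the shift rules in the text:
# the projection position of the position display image SH moves forward
# when the host vehicle accelerates or exceeds the lane flow speed, and
# backward when it decelerates or falls behind the flow.

def projection_shift(acceleration, v_host, v_lane_flow,
                     k_acc=0.5, k_speed=0.2):
    """Positive return value = shift forward (traveling direction),
    negative = shift backward. Gains k_acc/k_speed are hypothetical."""
    shift = k_acc * acceleration               # accel -> forward, decel -> backward
    shift += k_speed * (v_host - v_lane_flow)  # faster than flow -> forward
    return shift
```

For example, a vehicle accelerating while overtaking yields a positive (forward) shift, and a decelerating vehicle being overtaken yields a negative (backward) shift.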
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• the control device 10 generates the position display images SH1 and SH2 so that the area of the position display image (diagram image, background image, shadow image) SH of an object, including an other vehicle VX, whose speed is relatively high is larger than the area of the position display image SH of an object whose speed is relatively low.
  • the control device 10 of the present embodiment acquires the vehicle speed P1 of the other vehicle VX1 and the vehicle speed P2 of the other vehicle VX2, and compares the vehicle speeds P1 and P2.
• the control device 10 generates the position display image SH of the other vehicle VX with the higher vehicle speed so that its area is larger than that of the position display image SH of the other vehicle VX with the lower vehicle speed. For example, when the vehicle speed P1 is higher than the vehicle speed P2, the area of the position display image SH1 of the other vehicle VX1 is made larger than the area of the position display image SH2 of the other vehicle VX2.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• the control device 10 of the present embodiment may increase the area of the position display image (diagram image, background image, shadow image) SH of the other vehicle VX as the speed of the other vehicle VX increases.
  • a driver using this system can predict the speed of the other vehicle VX from the size of the position display image SH.
  • the speed of the other vehicle VX may be an absolute speed or may be a relative speed with respect to the vehicle speed of the host vehicle V.
• thereby, the position display image SH of an other vehicle VX with a high degree of approach to the host vehicle V can be displayed in a large size.
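A monotonic area-versus-speed rule consistent with the description above might look like the following sketch; the base area, gain, and function form are assumed values, not taken from the disclosure.

```python
# Hypothetical sketch: scale the area of the position display image with
# an object's speed (absolute, or relative to the host vehicle when v_ref
# is set to the host vehicle speed).

def display_area(speed, base_area=1.0, gain=0.05, v_ref=0.0):
    """Area grows monotonically with speed above the reference speed;
    objects at or below the reference keep the base area."""
    return base_area * (1.0 + gain * max(0.0, speed - v_ref))

a_fast = display_area(30.0)   # e.g. other vehicle VX1, higher speed
a_slow = display_area(20.0)   # e.g. other vehicle VX2, lower speed
```

A driver can then read relative speed from relative image size, as the text describes.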
  • the hue, brightness, saturation, color tone (tone), pattern, gradation, and size (enlargement / reduction) of the position display image may be changed.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• when the control device 10 of the present embodiment acquires attribute information indicating that the lane Ln1 in which an other vehicle VX travels is an overtaking lane, it generates the position display image SH so that the area of the position display image (diagram image, background image, shadow image) SH of the other vehicle VX traveling in the overtaking lane is larger than that of an other vehicle VX traveling in the lane Ln2, which is not an overtaking lane. This is because the speed of an other vehicle VX traveling in the overtaking lane can be predicted to be higher than the speed of an other vehicle VX traveling in a non-overtaking lane.
  • the attribute information that the lane is an overtaking lane or is not an overtaking lane is acquired from the map information 72 and / or road information 73 of the navigation device 70.
  • the method for identifying and acquiring the lane attribute information is not particularly limited, and a method known at the time of filing can be used as appropriate.
• the control device 10 may also change the area of the position display image (diagram image, background image, shadow image) SH according to the driving skill of the operator of the moving body, for example, the driver of the host vehicle V. For example, when the operator's skill is low, the control device 10 generates the position display image SH so that its area is larger than when the operator's skill is high. Thereby, the positions of the host vehicle V and the other vehicles VX, and their positional relationship, can be shown in an easy-to-understand manner to an operator with low skill.
• the operator's skill may be input by the operator himself or herself, or may be determined based on experience such as the number of operations and the distance traveled.
• the operator's skill may also be determined from the operator's past operation history.
• for example, the operation history of a highly skilled operator is compared with the operation history of the individual operator; if the difference is large, the operation skill is judged to be low, and if the difference is small, the operation skill is judged to be high.
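The history-comparison judgment above can be sketched as a mean-absolute-difference test. The threshold value and the choice of operation quantity below are assumptions made only for illustration.

```python
# Sketch of the skill judgment described in the text: the operation
# history of a highly skilled operator is compared with the individual's
# history; a large difference indicates low skill.

def skill_is_high(expert_history, operator_history, threshold=1.0):
    """Histories are equal-length sequences of an operation quantity
    (e.g. steering amount per lane change). A mean absolute difference
    above `threshold` is judged as low skill."""
    diffs = [abs(e - o) for e, o in zip(expert_history, operator_history)]
    return (sum(diffs) / len(diffs)) <= threshold

expert = [2.0, 2.2, 1.9]   # hypothetical reference history
novice = [4.5, 0.5, 3.8]   # hypothetical erratic history
```

An operator judged low-skill would then receive the larger position display image described above.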
• in the case of vehicle driving, the driving skill can be determined based on the accelerator operation, the timing of the steering operation, and the steering amount when the travel lane is changed.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• accordingly, the area of the position display image (diagram image, background image, shadow image) SH1 of the other vehicle VX1 traveling in the lane Ln1 is larger than the areas of the position display image (diagram image, background image, shadow image) SH2 of the other vehicle VX2 traveling in the lane Ln2 and of the position display image SH of the host vehicle V.
• in this way, by changing the size of the position display image SH according to the attribute of the lane in which the host vehicle V and the other vehicles VX travel, the size of the position display image SH can be controlled without depending on the actual vehicle speed.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• when the control device 10 of the present embodiment determines from the acquired acceleration that the host vehicle V is in an acceleration state, it shifts the position of the reference point LG to the side opposite to the traveling direction of the host vehicle V (in the direction of arrow F′ in the figure, opposite to the direction of arrow F).
• thereby, the projection position of the position display image SH can be shifted toward the traveling direction of the host vehicle V (the direction of arrow F in the figure).
• that is, when the control device 10 determines from the acquired acceleration that the host vehicle V is in an acceleration state, it sets the reference point (viewpoint, virtual light source) LG to a rearward position, for example, shifting it to the position of the reference point (viewpoint, virtual light source) LG2 (shown by a broken line).
• in this way, the position of the reference point (viewpoint, virtual light source) LG is shifted backward, and the projection position of the position display image (diagram image, background image, shadow image) SH is shifted forward.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• when it is determined that the host vehicle V is in a deceleration state, the control device 10 shifts the position of the reference point (viewpoint, virtual light source) LG toward the traveling direction of the host vehicle V (in the direction of arrow F in the figure). Thereby, the projection position of the position display image SH can be shifted to the side opposite to the traveling direction of the host vehicle V (the direction of arrow F′ in the figure).
• that is, when the control device 10 of the present embodiment determines from the acquired acceleration that the host vehicle V is in a deceleration state, it shifts the position of the virtual light source LG toward the traveling direction of the host vehicle V (in the direction of arrow F in the figure).
• for example, the reference point (viewpoint, virtual light source) LG is set to a forward position, that is, shifted to the position of the reference point LG1 (shown by a broken line).
• by shifting the position of the reference point LG forward, the projection position of the position display image (diagram image, background image, shadow image) SH is shifted backward, and a position display image suited to the driver's situation judgment can be displayed.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
  • the setting position of the reference point (viewpoint, virtual light source) LG is not limited, but may be the same position as the virtual viewpoint in the projection processing.
  • the position of the virtual viewpoint viewing the host vehicle V can be recognized from the shape of the position display image (line diagram image, background image, shadow image) SH, and the positional relationship between the host vehicle V and the surroundings can be easily understood.
  • the reference point LG may be arranged at infinity. In this case, since parallel projection can be performed, it becomes easy to grasp the positional relationship between the host vehicle V and the object (other vehicle VX, etc.) from the position display image SH.
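The difference between a finite reference point and one at infinity can be checked with a small 2-D projection sketch: a finite light source magnifies distances on the projection plane, while parallel projection preserves them, which is why the text notes that an infinite reference point makes positional relationships easier to grasp. All coordinates below are assumptions for illustration.

```python
# Illustrative comparison (names and coordinates assumed) of the two
# reference-point choices: perspective shadow from a finite light source
# versus parallel projection from a source at infinity.

def perspective_x(point, light, plane_y):
    """Project `point` (x, y) onto the plane y = plane_y along the ray
    from the light position through the point; returns the shadow's X."""
    (px, py), (lx, ly) = point, light
    t = (plane_y - ly) / (py - ly)
    return lx + t * (px - lx)

def parallel_x(point):
    """Light source at infinity (rays parallel to the Y axis): the X
    coordinate is preserved."""
    return point[0]

p1, p2 = (10.0, 2.0), (14.0, 2.0)   # e.g. host vehicle V and other vehicle VX
```

With parallel projection, the 4 m spacing between the two points appears unchanged on the plane; with a finite light source behind them, the spacing on the plane is stretched.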
  • the setting position of the virtual viewpoint is not particularly limited. Further, the position of the virtual viewpoint may be changeable according to the user's designation. Further, the surface on which the position display image SH is projected (represented) may be a road surface of a road on which the host vehicle travels, or may be a projection surface set as shown in FIG. Further, when the moving body is a helicopter, a position display image (line diagram image, background image, shadow image) for displaying the position may be projected on the ground surface below the helicopter. When the moving body is a ship, the shadow information may be projected on the sea surface.
  • the control device 10 projects the captured image data acquired from the camera 40 onto the three-dimensional coordinate system S or the projection plane SQ, and generates images of the host vehicle V and surrounding objects from the set virtual viewpoint. Then, the control device 10 displays the generated video on the display 80.
  • the display 80 may be mounted on the host vehicle V and configured as the mobile device 200 or may be provided on the display device 100 side.
  • the display 80 may be a display for a two-dimensional image or a display that displays a three-dimensional image in which the positional relationship in the depth direction of the screen can be visually recognized.
  • the video to be displayed in the present embodiment includes a position display image (a diagram image, a background image, a shadow image) SH of the host vehicle V.
  • the display image may include position display images (line diagram image, background image, shadow image) SH1 and SH2 of the other vehicle VX as an object.
  • the displayed image may include both the position display image SH of the host vehicle V and the position display images SH1 and SH2 of the other vehicle VX.
  • a shadow image SH, a diagram image SH, and a background image SH can be used as the position display image SH.
• an icon image V′ (see FIGS. 3A, 3B, 4A, and 4B) indicating the host vehicle V, prepared in advance, may be superimposed on the video displayed by the display device 100 of the present embodiment. The icon image V′ of the vehicle may be created and stored in advance based on the design of the host vehicle V. By superimposing the icon image V′ of the host vehicle V on the video in this manner, the relationship between the position and orientation of the host vehicle V and the surrounding video can be shown in an easily understandable manner.
• in step S1, the control device 10 acquires a captured image captured by the camera 40.
• in step S2, the control device 10 acquires the current position of the host vehicle V and the positions of objects including the other vehicle VX.
  • the host vehicle V is an example of a “moving body”
  • the other vehicle VX is an example of an “object”.
• in step S3, the control device 10 sets a reference point using the position information of the moving bodies (the host vehicle V and the other vehicle VX).
  • One reference point may be set based on the host vehicle V, or a plurality of reference points may be set for each of the host vehicle V and the other vehicle VX.
• the reference point includes a point (viewpoint) from which the moving body that is the target of a diagram image is observed, a point (viewpoint) from which the moving body that is the target of a background image is observed, and a point (light source) from which light is emitted toward the moving body that is the target of a shadow image.
• an example of the method for setting the reference point (viewpoint, virtual light source) in step S3 will be described based on the flowchart of FIG.
• first, the control device 10 acquires the acceleration of the host vehicle V.
• in step S12, the control device 10 determines whether the host vehicle V is in an acceleration state or a deceleration state based on the acceleration. If it is in an acceleration state, the process proceeds to step S13.
• in step S13, by shifting the reference point (viewpoint, virtual light source) rearward, the projection position of the position display image, including the diagram image SH, the background image SH, and the shadow image SH, is shifted forward in the traveling direction.
• that is, the position of the position display image projected on the projection plane when the host vehicle V is in an acceleration state is located further forward in the traveling direction of the host vehicle V than the position of the position display image projected on the projection plane when the host vehicle V is not in an acceleration state.
• if the host vehicle V is in a deceleration state, the process proceeds to step S15.
• in step S15, by shifting the reference point (viewpoint, virtual light source) forward, the projection position of the position display image, including the diagram image SH and the background image SH, is shifted rearward in the traveling direction.
• that is, the position of the reference point set when the host vehicle V is in a deceleration state is located further forward in the traveling direction of the host vehicle V than the position of the reference point set when the host vehicle V is not in a deceleration state.
• as a result, the position of the position display image projected on the projection plane when the host vehicle V is in a deceleration state is located further rearward in the traveling direction of the host vehicle V than the position of the position display image projected on the projection plane when the host vehicle V is not in a deceleration state.
  • the projection position of the position display image is a position in the coordinate system of the projection plane set in a later process.
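The branch structure of steps S12, S13, and S15 described above can be sketched as follows; the shift distance and the dead-band around zero acceleration are assumed values, and the function is an illustration rather than the claimed procedure.

```python
# A sketch of the reference-point setting flow: judge the vehicle state
# from the acceleration (S12), then shift the reference point rearward
# when accelerating (S13) or forward when decelerating (S15), so that the
# projected position display image moves forward or rearward respectively.

def set_reference_point(base_x, acceleration, shift=1.0, eps=0.1):
    """Return the reference point's X position along the traveling
    direction; positive X is forward. Shift and eps are hypothetical."""
    if acceleration > eps:        # S12 -> S13: accelerating
        return base_x - shift     # reference point rearward
    if acceleration < -eps:       # S12 -> S15: decelerating
        return base_x + shift     # reference point forward
    return base_x                 # neither: unchanged
```

Moving the reference point rearward pushes the projected image forward on the plane, matching the relationship stated in steps S13 and S15.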
• in step S4, the control device 10 sets a projection plane.
  • the projection plane may be a three-dimensional coordinate system shown in FIGS. 3A, 3B, 4A, and 4B, or may be a two-dimensional coordinate system like the projection plane SQ shown in FIG. Furthermore, as shown in FIG. 11 described later, a plurality of projection planes may be set.
• in step S5, the control device 10 generates the diagram image SH, the background image SH, or the shadow image SH of the host vehicle V.
• similarly, the diagram images SH1, SH2, the background images SH1, SH2, or the shadow images SH1, SH2 of the other vehicles VX1, VX2 are generated.
• in step S6, the control device 10 executes a process of projecting the captured image onto the set projection plane SQ and generates a display image.
• in step S7, the control device 10 displays the generated video on the display 80.
  • the position display image described here includes a shadow image SH, a diagram image SH, and a background image SH.
  • a case where the position display image is a shadow image SH will be described as an example.
  • a diagram image SH and a background image SH may be used.
  • the shadow image SH (or diagram image SH) of the position display image of this example includes an image indicating the movable range of the movable member of the host vehicle V.
• the background image SH includes a contour line of the movable member of the host vehicle V and a diagram (background defect region) indicating the movable range of the movable member.
  • the projection plane showing the shadow image SH of the host vehicle V includes a first projection plane SQs along the vehicle length direction of the host vehicle V and a second projection plane SQb along the vehicle width direction.
• on the first projection plane SQs, a position display image (shadow image, diagram image, background image) SHs showing the movable member as observed from a reference point (viewpoint, virtual light source) LG set on the side of the vehicle is displayed.
  • the position display image includes a shadow image imitating the shadow of the movable member when light is emitted from the virtual light source LG.
  • a position display image (shadow image, diagram image, background image) SHb showing a movable member when observed from a reference point (viewpoint, virtual light source) LG set in front of the vehicle is projected.
  • the position display image projects a shadow image SHb simulating a shadow when light is emitted from the virtual light source LG.
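One common way to realize such a virtual-light-source shadow is a point-light projection of the body's outline points onto the projection plane. The helper below is a hypothetical sketch (not the patent's implementation) that projects a 3D point onto the ground plane z = 0 along the ray from the virtual light source LG.

```python
def project_shadow_point(light, point, eps=1e-9):
    """Project `point` onto the ground plane z = 0 along the ray from the
    virtual light source `light` (both (x, y, z) tuples).

    The ray is light + t * (point - light); the shadow lies where z = 0,
    i.e. t = lz / (lz - pz)."""
    lx, ly, lz = light
    px, py, pz = point
    denom = lz - pz
    if abs(denom) < eps:
        raise ValueError("ray is parallel to the ground plane")
    t = lz / denom
    return (lx + t * (px - lx), ly + t * (py - ly), 0.0)

# A light directly above a point casts its shadow straight down onto z = 0.
shadow = project_shadow_point((1.0, 2.0, 10.0), (1.0, 2.0, 1.5))
```

Applying this to each vertex of the vehicle or movable-member outline yields the polygon drawn as the shadow image SH; projecting onto a vertical plane such as SQs or SQb only changes which coordinate is solved for.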
  • the position display image (shadow image, diagram image, background image) SH of this example includes an image indicating the movable range of the movable member of the host vehicle V.
  • the host vehicle V of this example is a hatchback type vehicle, and has side doors and a back door (hatch door).
  • the side doors of the host vehicle V open and close sideways, and the back door of the host vehicle V opens and closes rearward.
  • the side doors and the back door of the host vehicle V are movable members of the host vehicle V, which is a moving body.
  • the control device 10 generates a position display image (shadow image, diagram image, background image) indicating the movable range of the back door, assuming a case where the occupant opens the back door in order to carry luggage into or out of the rear loading platform.
  • the position display image SHs projected on the first projection surface SQs includes a back door portion Vd3.
  • the back door portion Vd3 represents the rear extension (movable range) of the host vehicle V when the back door is opened.
  • the shadow image indicating the movable range of the back door is projected on the left and right sides of the host vehicle V or on the placement surface (parking surface / road surface) of the host vehicle V.
  • the shadow image may be projected on a wall surface or floor surface that actually exists.
  • the control device 10 generates a position display image (shadow image, diagram image, background image) indicating the movable range of the side doors, assuming a case where a side door is opened so that an occupant can get in or out of a seat or carry luggage in or out.
  • the position display image (shadow image, diagram image, background image) SHb projected on the second projection plane SQb includes side door portions Vd1 and Vd2, which express the lateral extension (movable range) of the side doors when they are opened.
  • the position display image indicating the movable range of the side door is projected forward and / or rearward of the host vehicle V.
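The movable range drawn for a hinged door can be approximated by the arc its outer edge sweeps around the hinge. The sketch below, with hypothetical names and simplified planar geometry, computes the lateral extension that such a position display image would need to cover.

```python
import math

def door_swing_extent(hinge, door_len, max_angle_deg, steps=32):
    """Approximate the footprint of a door that swings from closed (lying
    along the vehicle's length axis) out to `max_angle_deg`, hinged at
    `hinge` = (x, y). Returns the maximum lateral (x) reach of the door tip,
    sampled along the swing arc."""
    hx, hy = hinge
    reach = 0.0
    for i in range(steps + 1):
        a = math.radians(max_angle_deg) * i / steps
        tip_x = hx + door_len * math.sin(a)  # lateral offset grows with angle
        reach = max(reach, tip_x - hx)
    return reach

# A 1.0 m door opened to 90 degrees reaches 1.0 m sideways.
extent = door_swing_extent((0.0, 0.0), 1.0, 90.0)
```

The same sampling of the arc also yields the outline points that would be projected (as in the shadow-image generation) to draw the side door portions Vd1 and Vd2.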
  • the control device 10 may set a projection plane to project a position display image (shadow image, diagram image, background image), and project the position display image onto the projection plane.
  • by checking such a video, the driver of the host vehicle V can determine the parking position of the host vehicle V in consideration of the work to be performed after parking. Furthermore, by superimposing the captured images of surrounding objects together with the position display image SH, it is possible to park at a position where the work after parking is not hindered while avoiding the surrounding objects.
  • the mode of the position display image (shadow image, diagram image, background image) SH has been described for the case where the moving body is the host vehicle V, that is, where the doors of the host vehicle V are the movable members.
  • the mode of the position display image SH is not limited to this.
  • when the moving body is a forklift, a position display image (shadow image, diagram image, background image) is generated in consideration of the movable range of the forklift body and the lift equipment.
  • when the moving body is a helicopter, a position display image (shadow image, diagram image, background image) is generated in consideration of the rotation range of the helicopter body and the rotor blades.
  • when the moving body is an airplane, a position display image (shadow image, diagram image, background image) is generated in consideration of the installation range of the airplane body and ancillary equipment such as a boarding ramp.
  • when the moving body is a submersible craft or a spacecraft, the image is generated in consideration of the installation range of the platform provided as necessary.
  • the generated position display image is displayed on the display 80.
  • when the moving body is a forklift, a position display image (shadow image, diagram image, background image) indicating the operating range of the lift device is displayed on the surrounding ground surface (for example, the floor of a facility such as a warehouse or factory).
  • in the case of a helicopter, the attitude of the helicopter can be confirmed from the sky.
  • in the case of an airplane, by projecting a shadow image indicating the installation range of the airplane body and its attached equipment onto the placement surface (for example, the ground surface), it is possible to search from the sky for a place with an area where the airplane can make an emergency landing.
  • a basic pattern of the position display image (shadow image, diagram image, background image) SH prepared in advance may be stored, and the control device 10 may read it from the memory as necessary.
  • as described above, the display device 100 sets a reference point using the position information of moving bodies such as the host vehicle V and the other vehicle VX, generates a position display image SH indicating the positions at which the host vehicle V and the other vehicle VX exist in the captured image as observed from the reference point, and displays a video including the position display image SH and part or all of the captured image.
  • since the video including the position display image SH indicating the position of the host vehicle V and the captured image can be displayed, a video in which the positional relationship of the host vehicle V to its surroundings is easy to understand can be displayed.
  • the display device 100 of the present embodiment generates a diagram image SH as a position display image indicating the position of the host vehicle V in the captured image, and displays a video including the diagram image SH and part or all of the captured image.
  • since the diagram image SH indicating the location of the host vehicle V and part or all of the captured image can be displayed together, a video in which the positional relationship of the host vehicle V to its surroundings is easy to understand can be displayed.
  • the display device 100 generates a background image SH as a position display image indicating the position of the host vehicle V in the captured image, and displays a video including the background image SH and part or all of the captured image.
  • since the background image SH indicating the position of the host vehicle V and part or all of the captured image can be displayed together, a video in which the positional relationship of the host vehicle V to its surroundings is easy to understand can be displayed.
  • the display device 100 of the present embodiment generates a shadow image SH as a position display image indicating the position of the host vehicle V in the captured image, and displays an image including the shadow image SH.
  • the display device 100 sets a virtual light source as the reference point using the position information of moving bodies such as the host vehicle V and the other vehicle VX, generates a shadow image SH imitating the shadow produced when the host vehicle V is irradiated with light from the virtual light source, and displays a video including part or all of the shadow image SH and a captured image of the surroundings of the host vehicle V.
  • since the presence and orientation of the host vehicle V can be expressed by its shadow image SH, a video in which the positional relationship between the host vehicle V and surrounding objects is easy to understand can be displayed.
  • the display device 100 generates a position display image SH indicating the position of an object including the other vehicle VX, and displays a video including the position display image SH and part or all of the captured image.
  • since the position display image SH indicating the position of the other vehicle VX as an object and part or all of the captured image can be displayed together, a video in which the positional relationship between the host vehicle V and the other vehicle VX is easy to understand can be displayed.
  • the display device 100 generates a diagram image SH as a position display image indicating the position of an object including the other vehicle VX, and displays a video including the diagram image SH and part or all of the captured image.
  • since the diagram image SH indicating the position of the other vehicle VX as an object and part or all of the captured image can be displayed together, a video in which the positional relationship between the host vehicle V and the other vehicle VX is easy to grasp can be displayed.
  • the display device 100 generates a background image SH as a position display image indicating the position at which an object including the other vehicle VX is present, and displays a video including the background image SH and part or all of the captured image.
  • since the background image SH indicating the position of the other vehicle VX as an object and part or all of the captured image can be displayed together, a video in which the positional relationship between the host vehicle V and the other vehicle VX is easy to grasp can be displayed.
  • the display device 100 of the present embodiment generates a shadow image SH as a position display image indicating the position of an object including the other vehicle VX in the captured image, and displays an image including the shadow image SH.
  • the display device 100 according to the present embodiment generates a shadow image SH imitating the shadow produced when an object including the other vehicle VX is irradiated with light, and displays a video including the shadow image SH and part or all of the captured image.
  • since the positional relationship between the other vehicle VX and the host vehicle V can be expressed by the shadow images SH1 and SH2 of the other vehicles VX, a video in which the positional relationship between the host vehicle V and surrounding objects such as the other vehicle VX is easy to understand can be displayed.
  • the display device 100 of the present embodiment sets a projection plane SQ for projecting the position display image along the direction in which the host vehicle V moves, and generates a position display image displaying the position of the other vehicle VX1 traveling in the lane Ln1 adjacent to the travel lane Ln2 in which the host vehicle V travels (moves).
  • since the video includes part or all of the captured image around the host vehicle V, the positional relationship is easier to grasp.
  • by setting the projection plane SQ along the traveling direction of the host vehicle V, an image in which the distance between the host vehicle V and the other vehicle VX1 can be easily recognized can be presented.
  • the display device 100 generates the position display images (diagram images, background images, or shadow images) SH1 and SH2 of objects so that the area of the position display image SH of an object having a relatively high speed is larger than the area of the position display image SH of an object having a relatively low speed.
  • since the position display image (diagram image, background image, or shadow image) of a relatively high-speed object is displayed relatively large, a video that calls attention to the high-speed object can be displayed.
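One simple way to realize this speed-dependent sizing is to scale each position display image by a factor that grows with the object's speed, clamped to a sensible range. The linear mapping and constants below are illustrative assumptions, not the patent's formula.

```python
def shadow_scale(speed_mps, base=1.0, gain=0.02, max_scale=2.0):
    """Return a display scale factor for a position display image:
    1.0 for a stationary object, growing linearly with speed (m/s),
    capped at `max_scale` so fast objects do not dominate the frame."""
    return min(base + gain * max(speed_mps, 0.0), max_scale)

# A vehicle at 25 m/s (90 km/h) gets a 1.5x image; a parked one stays at 1.0x.
fast, parked = shadow_scale(25.0), shadow_scale(0.0)
```

The same function could take a relative speed (other vehicle minus host vehicle) instead of an absolute one, which would emphasize only objects that are closing in.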
  • the display device 100 changes the size of the position display image (diagram image, background image, or shadow image) SH according to the attribute of the lane in which the host vehicle V or the other vehicle VX travels.
  • in this way, the driver can be alerted to another vehicle VX traveling at high speed in the same manner as when the size of the position display image (diagram image, background image, or shadow image) SH is controlled according to the actual vehicle speed.
  • in an acceleration state, the display device 100 shifts the position of the reference point for observing the moving body or object rearward, thereby shifting the projection position of the position display image (diagram image, background image, or shadow image) SH forward, and can display an image suited to the driver's judgment situation.
  • likewise, in an acceleration state, the display device 100 of the present embodiment shifts the position of the virtual light source LG of the shadow image SH rearward; as a result, the projection position of the shadow image SH can be shifted forward, and a shadow image suited to the driver's judgment situation can be displayed.
  • in a deceleration state, by shifting the position of the reference point for observing the moving body or object forward, the projection position of the position display image (diagram image, background image, or shadow image) SH can be shifted rearward to display an image suited to the driver's judgment situation.
  • likewise, in a deceleration state, the display device 100 according to the present embodiment shifts the position of the virtual light source LG forward; as a result, the projection position of the shadow image SH can be shifted rearward, and a shadow image suited to the driver's judgment situation can be displayed.
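This acceleration-dependent behavior can be sketched as moving the virtual light source opposite to the longitudinal acceleration: rearward when accelerating (so the shadow shifts forward), forward when braking (so the shadow shifts rearward). The gain and clamp values below are illustrative assumptions.

```python
def light_source_offset(accel_mps2, gain=0.5, max_shift=3.0):
    """Longitudinal shift (m) of the virtual light source LG: negative
    (rearward) when accelerating, positive (forward) when decelerating,
    clamped to +/- max_shift. The projected shadow then shifts the
    opposite way along the direction of travel."""
    shift = -gain * accel_mps2
    return max(-max_shift, min(max_shift, shift))

# Accelerating at 2 m/s^2 moves the light 1 m rearward; braking at -4 m/s^2
# moves it 2 m forward.
accel_shift, brake_shift = light_source_offset(2.0), light_source_offset(-4.0)
```

Low-pass filtering the acceleration input before applying this offset would keep the shadow from jittering with every throttle change; the patent text does not specify such details.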
  • the display device 100 generates and displays a position display image (shadow image, diagram image, background image) indicating the movable range of a movable member of the host vehicle V.
  • by checking this position display image, the driver of the host vehicle V can determine the parking position of the host vehicle V in consideration of the work to be performed after parking.
  • since the display method of the present embodiment is executed by the display device 100, it achieves the effects described above.
  • in the present description, the display system 1 including the display device 100 has been described as one embodiment of the display device according to the present invention, but the present invention is not limited to this.
  • in the present description, the display device 100 including the control device 10 that includes the CPU 11, the ROM 12, and the RAM 13 has been described as an embodiment of the display device according to the present invention, but the present invention is not limited to this.
  • in the present description, the display device 100 including the control device 10 that executes an image acquisition function, an information acquisition function, an image generation function, and a display function has been described as an example, but the present invention is not limited to this.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A display device (100) is provided, comprising a control device (10) that executes an image acquisition function that obtains an image captured by a camera (40) mounted on a vehicle V, an information acquisition function that obtains position information of the vehicle V, an image generation function that uses the position information of the vehicle to generate a position display image indicating the position of a moving body in the captured image and to generate a video containing the position display image and the captured image, and a display function that displays the generated video.
PCT/JP2015/055957 2014-11-14 2015-02-27 Dispositif et procédé d'affichage WO2016075954A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016558896A JP6500909B2 (ja) 2014-11-14 2015-02-27 表示装置及び表示方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PCT/JP2014/080177 WO2016075810A1 (fr) 2014-11-14 2014-11-14 Dispositif d'affichage et procédé d'affichage
JPPCT/JP2014/080177 2014-11-14

Publications (1)

Publication Number Publication Date
WO2016075954A1 true WO2016075954A1 (fr) 2016-05-19

Family

ID=55953923

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2014/080177 WO2016075810A1 (fr) 2014-11-14 2014-11-14 Dispositif d'affichage et procédé d'affichage
PCT/JP2015/055957 WO2016075954A1 (fr) 2014-11-14 2015-02-27 Dispositif et procédé d'affichage

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/080177 WO2016075810A1 (fr) 2014-11-14 2014-11-14 Dispositif d'affichage et procédé d'affichage

Country Status (2)

Country Link
JP (1) JP6500909B2 (fr)
WO (2) WO2016075810A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106851193A (zh) * 2016-12-22 2017-06-13 安徽保腾网络科技有限公司 用于拍摄事故车辆底盘的新型装置
US12002359B2 (en) 2018-06-20 2024-06-04 Nissan Motor Co., Ltd. Communication method for vehicle dispatch system, vehicle dispatch system, and communication device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7028609B2 (ja) * 2017-11-08 2022-03-02 フォルシアクラリオン・エレクトロニクス株式会社 画像表示装置、及び画像表示システム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007118762A (ja) * 2005-10-27 2007-05-17 Aisin Seiki Co Ltd 周辺監視モニタリングシステム
JP2007210458A (ja) * 2006-02-09 2007-08-23 Nissan Motor Co Ltd 車両用表示装置および車両用映像表示制御方法
JP2007282098A (ja) * 2006-04-11 2007-10-25 Denso Corp 画像処理装置及び画像処理プログラム
JP2009230225A (ja) * 2008-03-19 2009-10-08 Mazda Motor Corp 車両用周囲監視装置
JP2011028634A (ja) * 2009-07-28 2011-02-10 Toshiba Alpine Automotive Technology Corp 車両用画像表示装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10176931A (ja) * 1996-12-18 1998-06-30 Nissan Motor Co Ltd 車両用ナビゲーション装置
WO2007083494A1 (fr) * 2006-01-17 2007-07-26 Nec Corporation Dispositif de reconnaissance graphique, méthode de reconnaissance graphique et programme de reconnaissance graphique
JP5715778B2 (ja) * 2010-07-16 2015-05-13 東芝アルパイン・オートモティブテクノロジー株式会社 車両用画像表示装置



Also Published As

Publication number Publication date
JP6500909B2 (ja) 2019-04-17
JPWO2016075954A1 (ja) 2017-09-28
WO2016075810A1 (fr) 2016-05-19

Similar Documents

Publication Publication Date Title
US10488218B2 (en) Vehicle user interface apparatus and vehicle
US10053001B1 (en) System and method for visual communication of an operational status
CN104883554B (zh) 通过虚拟透视仪器群集显示直播视频的方法和系统
CN111788102B (zh) 用于跟踪交通灯的里程计系统和方法
CN109636924B (zh) 基于现实路况信息三维建模的车载多模式增强现实系统
US10908604B2 (en) Remote operation of vehicles in close-quarter environments
CN109863513A (zh) 用于自主车辆控制的神经网络系统
EP3235684A1 (fr) Appareil qui presente un résultat de la reconnaissance du cible de reconnaissance
CN115039129A (zh) 用于自主机器应用的表面轮廓估计和隆起检测
CN109690634A (zh) 增强现实显示器
US20170262710A1 (en) Apparatus that presents result of recognition of recognition target
WO2020031812A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations, et corps mobile
JPWO2019092846A1 (ja) 表示システム、表示方法、およびプログラム
JP2022510450A (ja) 自動車の遠隔制御におけるユーザの支援方法、コンピュータプログラム製品、遠隔制御装置および自動車の運転支援システム
JP6380550B2 (ja) 表示装置及び表示方法
JP2022008854A (ja) 制御装置
JP6500909B2 (ja) 表示装置及び表示方法
CN110271487A (zh) 具有增强现实的车辆显示器
CN113602282A (zh) 车辆驾驶和监测系统及将情境意识维持在足够水平的方法
JP2022129175A (ja) 車両評価方法及び車両評価装置
CN115857169A (zh) 碰撞预警信息的显示方法、抬头显示装置、载具及介质
US10134182B1 (en) Large scale dense mapping
JP2007072224A (ja) ドライビングシミュレータ
CN117784768A (zh) 车辆避障规划方法、装置、计算机设备和存储介质
JP2017090189A (ja) 走行ガイドシステム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15859227

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016558896

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15859227

Country of ref document: EP

Kind code of ref document: A1