WO2023105842A1 - Information processing device, mobile body, information processing method, and program - Google Patents

Information processing device, mobile body, information processing method, and program

Info

Publication number
WO2023105842A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
control unit
moving body
bird
automobile
Application number
PCT/JP2022/028506
Other languages
French (fr)
Japanese (ja)
Inventor
則政 岸
雅司 高田
秀雄 廣重
Original Assignee
株式会社光庭インフォ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 株式会社光庭インフォ
Priority to CN202280080009.0A priority Critical patent/CN118355655A/en
Priority to JP2023566083A priority patent/JPWO2023105842A1/ja
Publication of WO2023105842A1 publication Critical patent/WO2023105842A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles, for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: with a predetermined field of view
    • B60R1/27: providing all-round vision, e.g. using omnidirectional cameras
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an information processing device, a mobile object, an information processing method, and a program.
  • Patent Document 1 (JP 2021-094939 A) discloses a technique aimed at a display that makes it easy for an occupant to understand the traveling direction of a moving object and to concentrate on monitoring the surroundings.
  • the surrounding information of the moving object is presented in an easy-to-understand manner.
  • an information processing device has a control unit.
  • the control unit generates a bird's-eye view image including the moving body and its surroundings, controls the bird's-eye view image to be displayed on a display unit that can be viewed by the operator of the moving body, and changes the depression angle of the viewpoint of the bird's-eye view image based on the distance between the moving body and an object existing in the traveling direction of the moving body.
  • FIG. 1 is a diagram showing an example of a system configuration of an information processing system 1000.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 110.
  • FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110.
  • FIG. 4 is an activity diagram showing an example of information processing by the information processing apparatus 110.
  • FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image.
  • FIG. 6 is a diagram (part 2) showing an example of a bird's-eye view image.
  • FIG. 7 is a diagram (part 3) showing an example of a bird's-eye view image.
  • FIG. 8 is a diagram showing an example of a displacement curve when per decreases while moving away from an object.
  • FIG. 9 is a diagram showing an example of a displacement curve when per increases while approaching an object.
  • FIG. 10 is a diagram showing an example of control by the display control unit 304 such that the closer the vehicle 100 is to the obstacle 150, the greater the angle of depression.
  • FIG. 11 is a diagram (part 1) showing an example of a viewpoint change in Modification 3.
  • FIG. 12 is a diagram (part 2) showing an example of a viewpoint change in Modification 3.
  • the term "unit” may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing that can be specifically realized by these hardware resources.
  • various kinds of information are handled in the present embodiment; such information is represented, for example, by physical signal values representing voltage and current, by the high and low of signal values as binary bit aggregates composed of 0s and 1s, or by quantum superposition (so-called quantum bits), and communication and computation can be performed on a circuit in a broad sense.
  • a circuit in a broad sense is a circuit realized by at least appropriately combining circuits, circuitry, processors, memories, and the like; that is, it includes application-specific integrated circuits (ASIC) and programmable logic devices, for example simple programmable logic devices (SPLD), complex programmable logic devices (CPLD), and field-programmable gate arrays (FPGA).
  • ASIC Application Specific Integrated Circuit
  • SPLD Simple Programmable Logic Device
  • CPLD Complex Programmable Logic Device
  • FPGA Field Programmable Gate Array
  • FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000.
  • an information processing system 1000 includes an automobile 100 as a system configuration.
  • a car is an example of a mobile object.
  • Automobile 100 includes information processing device 110 , display 120 , and multiple cameras 130 .
  • the information processing device 110 is a device that executes the processing of this embodiment. In the present embodiment, the information processing device 110 is described as being included in the vehicle 100, but may not be included in the vehicle 100 as long as it can communicate with the vehicle 100 or the like.
  • the display 120 is a display device that displays a 3D synthesized image and the like, which will be described later, under the control of the information processing device 110 .
  • Camera 130 is a camera that captures images of the surroundings of automobile 100. More specifically, camera 130 is, for example, an RGB camera that records color information in its images. Camera 130-1 is provided at the rear of automobile 100, and camera 130-2 at the front. Although not shown in FIG. 1, cameras 130 are also provided on the left and right sides of automobile 100 in addition to cameras 130-1 and 130-2. Hereinafter, these cameras are simply referred to as camera 130. In addition, although omitted from FIG. 1 for simplicity of explanation, automobile 100 has a plurality of depth sensors. A depth sensor measures the shape of objects around automobile 100 and the distance from automobile 100 to those objects using laser light or the like. However, the depth sensor is not limited to laser light; it may measure distance using ultrasonic waves, or using a camera with a depth sensor.
  • FIG. 2 is a diagram illustrating an example of the hardware configuration of the information processing apparatus 110.
  • the information processing apparatus 110 includes a control unit 201, a storage unit 202, and a communication unit 203 as a hardware configuration.
  • the control unit 201 is a CPU (Central Processing Unit) or the like, and controls the entire information processing apparatus 110 .
  • the storage unit 202 is any one of an HDD (Hard Disk Drive), ROM (Read Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), etc., or any combination thereof, and stores programs and the data used when the control unit 201 executes processing based on a program.
  • the control unit 201 executes processing based on a program stored in the storage unit 202, thereby realizing the functional configuration of the information processing apparatus 110 shown in FIG. 3 and the processing of the activity diagram shown in FIG. 4.
  • in the present embodiment, the data used when the control unit 201 executes processing based on a program is stored in the storage unit 202; however, the data may instead be stored in another device with which the information processing device 110 can communicate.
  • the communication unit 203 is a NIC (Network Interface Card) or the like, connects the information processing apparatus 110 to a network, and controls communication with other apparatuses (eg, display 120, camera 130, etc.).
  • Storage unit 202 is an example of a storage medium.
  • FIG. 3 is a diagram illustrating an example of the functional configuration of the information processing device 110 .
  • the information processing apparatus 110 includes a peripheral information receiving unit 301, an object recognition unit 302, a behavior information receiving unit 303, and a display control unit 304 as functional configurations.
  • the peripheral information receiving unit 301 receives information on objects around the vehicle 100, information on colors around the vehicle 100, and the like from the camera 130 as peripheral information. Further, the peripheral information receiving unit 301 receives the shapes of objects around the vehicle 100 and the distance from the vehicle 100 to the objects as peripheral information from the plurality of depth sensors.
  • the object recognition unit 302 recognizes objects around the automobile 100 based on the surrounding information received by the surrounding information receiving unit 301 .
  • Surrounding objects include, for example, buildings, roads, other vehicles, parking lots, and the like.
  • the behavior information receiving unit 303 receives behavior information of the moving body. More specifically, the behavior information receiving unit 303 acquires information detected by a wheel speed sensor, a steering angle sensor, an inertial measurement unit (IMU), etc. included in the vehicle 100 as the behavior information of the vehicle 100. An inertial measurement unit is a device that detects three-dimensional inertial motion (translational and rotational motion along three orthogonal axes); rotational motion is detected in degrees per second [deg/sec].
  • the behavior information receiving unit 303 may receive information about the behavior of the automobile 100 from SLAM (Simultaneous Localization and Mapping) included in the automobile 100, LiDAR, a stereo camera, a gyro sensor, etc. included in the automobile 100.
  • SLAM: Simultaneous Localization and Mapping
  • LiDAR: Light Detection and Ranging
  • the behavior information includes, for example, position information and orientation information of the vehicle 100 .
  • the display control unit 304 converts a surrounding object into a 3D model based on information on the object recognized by the object recognition unit 302, information from the SLAM, and the like.
  • the display control unit 304 also generates a 3D model of the automobile 100 based on information from the camera 130, information from SLAM, behavior information acquired by the behavior information receiving unit 303, and the like.
  • the display control unit 304 also generates a bird's-eye view image including the automobile 100 and its surroundings based on the 3D model of the surrounding objects, the 3D model of the automobile 100, information from the camera 130, and the like.
  • the bird's-eye view image is an image that has undergone perspective transformation so that the road surface is viewed from a viewpoint above it, and is an image that includes the entire automobile 100 or at least half of it.
  • Bird's-eye view images are shown in FIGS. 5 to 7, etc., which will be described later.
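For illustration only: given a depression angle, a virtual camera pose that looks down on the vehicle can be derived with basic trigonometry. The function name, the 8 m camera height, and the choice to place the camera behind the vehicle along its heading axis are assumptions for this sketch, not details from the publication.

```python
import math

def virtual_camera_pose(vehicle_xy, heading_rad, depression_deg, height_m=8.0):
    """Place a virtual camera that looks at the vehicle from above at the
    given depression angle. All names and the default height are
    illustrative assumptions."""
    theta = math.radians(depression_deg)
    # Horizontal stand-off so the line of sight meets the vehicle at the
    # requested depression angle: d = h / tan(theta). At 90 degrees the
    # stand-off collapses to ~0 and the camera sits directly overhead.
    d = height_m / math.tan(theta)
    x, y = vehicle_xy
    cam_x = x - d * math.cos(heading_rad)  # behind the vehicle along its heading
    cam_y = y - d * math.sin(heading_rad)
    return (cam_x, cam_y, height_m)
```

As the depression angle grows toward 90 degrees, the computed pose slides from an oblique rear view toward a straight top-down view, which matches the viewpoint-1-to-viewpoint-2 motion described for FIG. 10.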
  • the display control unit 304 controls to display the generated overhead image on the display 120 of the automobile 100 or the like.
  • the display 120 of the automobile 100 is an example of a display visible to the operator of the automobile 100 .
  • part or all of the functional configuration of FIG. 3 may be implemented in the information processing device 110 or the automobile 100 as a hardware configuration. Also, part or all of the functional configuration of the information processing device 110 may be implemented in the vehicle 100 .
  • FIG. 4 is an activity diagram showing an example of information processing of the information processing apparatus 110 .
  • the peripheral information receiving unit 301 receives information on objects around the vehicle 100, information on colors around the vehicle 100, etc. from the camera 130 as peripheral information. Further, the peripheral information receiving unit 301 receives the shapes of objects around the vehicle 100 and the distance from the vehicle 100 to the objects as peripheral information from the plurality of depth sensors.
  • the object recognition unit 302 recognizes objects around the automobile 100 based on the surrounding information received by the surrounding information receiving unit 301 .
  • the behavior information receiving unit 303 receives information about the behavior of the automobile 100.
  • the display control unit 304 converts the surrounding objects into a 3D model based on the information on the object recognized by the object recognition unit 302, the information from the SLAM, and the like.
  • the display control unit 304 also generates a 3D model of the automobile 100 based on information from the camera 130, information from SLAM, behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like.
  • the display control unit 304 also generates a bird's-eye view image including the automobile 100 and its surroundings based on the 3D model of the surrounding objects, the 3D model of the automobile 100, information from the camera 130, and the like.
  • the display control unit 304 controls to display the generated bird's-eye view image on the display 120 of the automobile 100 or the like.
  • the display control unit 304 changes the depression angle with respect to the viewpoint of the bird's-eye view image based on the distance between the vehicle 100 and an object existing in the traveling direction of the vehicle 100 .
  • objects existing in the direction of travel include, for example, obstacles such as car stops.
  • objects existing in the direction of travel include parking lot walls, other cars parked in the parking lot, and parking spaces in the parking lot. It may be a mark or the like.
  • a car stop, a mark indicating a parking place, or the like is an example of a car parking position.
  • the display control unit 304 may recognize and extract obstacles existing in the direction of travel by image processing from an image of the direction of travel, or may use objects around the automobile 100 recognized by the object recognition unit 302.
  • FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image.
  • FIG. 6 is a diagram (part 2) showing an example of a bird's-eye view image.
  • FIG. 7 is a diagram (part 3) showing an example of a bird's-eye view image.
  • display control will be described for the case where the distance L between the automobile 100 and an obstacle behind it is equal to or greater than a predetermined distance, and the case where it is less than the predetermined distance.
  • in the following, a predetermined distance of 5 m is used; 5 m is merely an example.
  • the display control unit 304 uses the obtained ratio per to obtain a viewpoint movement ratio (dif) from a displacement curve.
  • two displacement curves are prepared: one used when per decreases as the automobile 100 moves away from the obstacle (curveDec, FIG. 8), and one used when per increases as the automobile 100 approaches the obstacle (curveInc, FIG. 9). curveInc is always located to the right of curveDec.
  • FIG. 8 is a diagram showing an example of a displacement curve when per decreases while moving away from an obstacle.
  • FIG. 9 is a diagram showing an example of a displacement curve when per increases while approaching an obstacle.
  • the horizontal axis is per and the vertical axis is dif.
  • the end values of the curves in FIGS. 8 and 9 are constant: the left end is always 0 and the right end is always 1.
  • the display control unit 304 continuously changes the depression angle based on the distance between the automobile 100 and obstacles present in the traveling direction. More specifically, the display control unit 304 performs control such that the closer the vehicle 100 is to an obstacle, the greater the angle of depression. At this time, the display control unit 304 uses the displacement curve described above to perform control such that the closer the vehicle 100 is to the obstacle, the greater the angle of depression with respect to the viewpoint of the bird's-eye view image.
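As a rough illustration of the mechanism above, the following Python sketch combines an assumed definition of per (normalized closeness to the obstacle within the 5 m example distance), smoothstep-shaped stand-ins for curveDec/curveInc (the text fixes only their endpoints and the fact that curveInc lies to the right of curveDec, not their exact shapes), and linear interpolation between the 30-degree and 90-degree depression angles used elsewhere in this description. All function names and curve parameters are assumptions.

```python
def _smoothstep(edge0: float, edge1: float, x: float) -> float:
    # Clamped cubic ease, used here as a stand-in for the patent's
    # displacement curves; endpoints are 0 on the left and 1 on the right.
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def per_from_distance(distance_m: float, threshold_m: float = 5.0) -> float:
    # Assumed definition: per is normalized closeness to the obstacle,
    # 0 at or beyond the threshold distance, 1 at contact.
    return min(max(1.0 - distance_m / threshold_m, 0.0), 1.0)

def dif_from_per(per: float, approaching: bool) -> float:
    # curveInc (used while approaching, per increasing) lies to the right
    # of curveDec (used while receding, per decreasing), giving hysteresis.
    if approaching:
        return _smoothstep(0.3, 0.9, per)  # curveInc
    return _smoothstep(0.1, 0.7, per)      # curveDec

def depression_angle(per: float, approaching: bool) -> float:
    # Interpolate between the 30-degree and 90-degree depression angles.
    dif = dif_from_per(per, approaching)
    return 30.0 + dif * (90.0 - 30.0)
```

Because curveInc sits to the right of curveDec, the same per yields a smaller dif while approaching than while receding, so the angle changes at a different rate in each direction, as the text describes.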
  • FIG. 10 is a diagram showing an example of control performed by the display control unit 304 so that the closer the vehicle 100 is to the obstacle 150, the greater the angle of depression.
  • the display control unit 304 changes the viewpoint from the viewpoint 1 to the viewpoint 2 as the automobile 100 approaches the obstacle 150 .
  • the depression angle of the viewpoint 2 is a depression angle larger than the depression angle of the viewpoint 1 .
  • the viewpoints 1 and 2 are on the traveling direction axis of the automobile 100 in this example, but neither viewpoint necessarily has to be on that axis.
  • the display control unit 304 controls to generate a bird's-eye view image from the viewpoint 1 and display it on the display 120 .
  • when an obstacle 150 exists in the traveling direction of the automobile 100, a bird's-eye view image from the viewpoint 2 may be generated and displayed on the display 120.
  • the display control unit 304 may change the rate of change of the depression angle depending on whether the automobile 100 is approaching the obstacle or moving away from it.
  • the display control unit 304 controls rotation and movement of the camera using per and dif.
  • the camera referred to here is a camera that virtually captures an image of the automobile 100 from above the automobile 100 .
  • the display control unit 304 performs control so that the depression angle approaches 90 degrees (an image viewed from directly above) as the vehicle 100 approaches the obstacle.
  • the depression angles of 30 degrees and 90 degrees may be changed in advance according to the situation.
  • the display control unit 304 may change the depression angle and/or the displacement curve to be used according to the type, size, etc. of the obstacle.
  • a displacement curve for rotation and a displacement curve for movement may be prepared, and the display control unit 304 may use the displacement curve for rotation and the displacement curve for movement to control the display of the bird's-eye view image.
  • the display control unit 304 may hide obstacles in front of the automobile 100. By doing so, the host vehicle can be kept visible even in parking facilities such as tower parking garages where there are many occluding objects.
  • the display control unit 304 performs the above-described processing when there is no object on a collision course between the viewpoint and the host vehicle. In other words, the display control unit 304 hides obstacles in front of the vehicle 100 but does not hide obstacles that may collide with the automobile 100. With this arrangement, viewpoint conversion is implemented flexibly.
  • the display control unit 304 can perform the same processing as described above.
  • Parking is broadly divided into an action of entering the automobile 100 into the parking frame and an approach action of adjusting the stop position and stop posture.
  • Information about the surroundings of the automobile 100 required for each action is different.
  • when entering the parking frame, surrounding information and the distance to obstacles on the inner side of the turn when the steering wheel is turned sharply are required.
  • during the approach, the distance to the stop position and the distances between the vehicle 100 and obstacles on both its sides are required. Based on this assumption, the distance from the vehicle 100 to an obstacle is displayed as follows.
  • distance measurement is performed at ⁇ 20 degrees forward, ⁇ 15 degrees backward, and ⁇ 60 degrees left and right.
  • the display control unit 304 displays a marker at the position of the obstacle at the shortest distance.
  • the display control unit 304 forms and displays a triangle with the marker and both ends of the tire width of the automobile 100 .
  • the display control unit 304 displays the position of the obstacle at the shortest distance using a gradation such that the position is red and the near side is green.
  • the display control unit 304 may stereoscopically widen the scan range vertically.
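The red-to-green gradation described above could be computed, for example, as a linear blend over the measured distance. The sketch below is hypothetical: the 5 m range and the RGB blend are assumptions, not values from the publication.

```python
def gradation_color(distance_m: float, max_m: float = 5.0):
    """Color for a point along the scan toward the nearest obstacle:
    red at the obstacle position, green on the near (vehicle) side."""
    # Normalize the distance into [0, 1]: 0 -> at the obstacle, 1 -> far side.
    t = min(max(distance_m / max_m, 0.0), 1.0)
    r = round(255 * (1.0 - t))  # red dominates close to the obstacle
    g = round(255 * t)          # green dominates toward the vehicle
    return (r, g, 0)
```

A renderer could sample this function along each triangle edge drawn between the marker and the tire-width endpoints to produce the gradation display.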
  • FIG. 5 shows an image when the automobile 100 enters the parking lot.
  • FIG. 6 shows an image when the automobile 100 moves from entering the parking lot to approaching.
  • FIG. 7 shows an image when the vehicle 100 is approaching.
  • the passenger can easily understand the traveling direction of the mobile object, and the surrounding information of the mobile object can be presented in an easy-to-understand manner.
  • the information processing apparatus 110 can present intuitive surrounding information by changing the viewpoint of the perspective transformation of the bird's-eye view image according to the surrounding situation.
  • the information processing device 110 takes into account the distance to objects such as obstacles around the vehicle 100, and converts the viewpoint to realize a display method that makes the obstacles more recognizable.
  • the operator can more intuitively understand the forward and backward movements of the vehicle when the vehicle is backing up, and can drive the automobile 100 safely.
  • Modification 1 A modification of the first embodiment will be described.
  • the display control unit 304 may change the depression angle step by step according to the distance from the vehicle 100 to the obstacle.
  • for example, the depression angle is set to 30 degrees while the distance exceeds 5 m, and to 90 degrees once the distance falls within 5 m.
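A minimal sketch of this stepwise variant, assuming the 5 m threshold and the 30/90-degree angles given in the example (the function name is illustrative):

```python
def stepwise_depression_angle(distance_m: float, threshold_m: float = 5.0) -> float:
    # Two-step schedule from Modification 1: 30 degrees beyond the
    # threshold, 90 degrees (top-down view) once the obstacle is within it.
    return 90.0 if distance_m <= threshold_m else 30.0
```

Unlike the continuous displacement-curve control of the first embodiment, this variant switches the angle at the threshold in a single step.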
  • Modification 1 it is possible for the occupant to easily understand the traveling direction of the mobile object, and to present the peripheral information of the mobile object in an easy-to-understand manner.
  • the automobile 100 has been described as an example of a moving object.
  • the mobile object may be, for example, a so-called drone such as an unmanned aerial vehicle.
  • the information processing device 110 may be included in the drone or may be included in the controller of the operator who operates the drone.
  • when the moving body is a drone, a display unit that the operator can view is provided in the controller.
  • examples of the object in Modification 2 include a mark indicating the departure and arrival location of the drone. According to Modification 2, the operator can easily understand the traveling direction of the drone, and the information around the moving object can be presented in an easy-to-understand manner.
  • FIG. 11 is a diagram (part 1) showing an example of a viewpoint change in Modification 3.
  • the display control unit 304 of Modification 3 moves the viewpoint of the bird's-eye view image in the direction of the obstacle when an obstacle exists within a predetermined distance to the left or right of the traveling direction of the automobile 100 .
  • the display control unit 304 shifts the viewpoint leftward from the viewpoint 3 to the viewpoint 4 to generate a bird's-eye view image from the viewpoint 4 .
  • the display control unit 304 generates a bird's-eye view image so that the line of sight from viewpoint 4 is parallel to the line of sight from viewpoint 3 .
  • FIG. 12 is a diagram (part 2) showing an example of a viewpoint change in Modification 3.
  • the display control unit 304 moves the viewpoint of the bird's-eye view image in the direction of the obstacle when an obstacle exists within a predetermined distance to the left or right with respect to the traveling direction of the automobile 100 .
  • the display control unit 304 shifts the viewpoint leftward from the viewpoint 5 to the viewpoint 6 to generate a bird's-eye view image from the viewpoint 6 .
  • the display control unit 304 performs control so that the point viewed by the line of sight from the viewpoint 5 (for example, a point on the ground) and the point viewed by the line of sight from the viewpoint 6 are the same.
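The two variants of Modification 3 can be sketched as follows: the FIG. 11 variant translates the viewpoint laterally while keeping the new line of sight parallel to the old one, whereas the FIG. 12 variant re-aims the shifted viewpoint at the same ground point. Coordinates, function names, and values below are illustrative assumptions.

```python
import math

def shift_viewpoint_parallel(viewpoint, shift):
    """FIG. 11 variant: translate the viewpoint (x, y, z) by a lateral
    shift (dx, dy); the viewing direction is unchanged, so the new line
    of sight is parallel to the old one."""
    (x, y, z) = viewpoint
    (dx, dy) = shift
    return (x + dx, y + dy, z)

def gaze_direction_to(viewpoint, ground_point):
    """FIG. 12 variant: unit line-of-sight vector from a viewpoint toward
    a fixed ground point, so shifted and unshifted viewpoints look at
    the identical spot."""
    vx, vy, vz = viewpoint
    gx, gy = ground_point
    d = (gx - vx, gy - vy, -vz)  # ground point is at height 0
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)
```

For a viewpoint shifted toward an obstacle on the left, the first function reproduces the viewpoint-3-to-viewpoint-4 change, and the second reproduces the viewpoint-5-to-viewpoint-6 change where both lines of sight meet at the same ground point.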
  • An information processing apparatus comprising a control unit, wherein the control unit generates a bird's-eye view image including a moving body and the surroundings of the moving body, controls the bird's-eye view image to be displayed on a display unit viewable by an operator of the moving body, and changes a depression angle of a viewpoint of the bird's-eye view image based on a distance between the moving body and an object existing in a traveling direction of the moving body.
  • the control unit continuously changes the depression angle based on the distance.
  • the control unit performs control such that the depression angle increases as the moving body approaches the object.
  • the control unit changes the rate of change of the depression angle between when the moving body approaches the object and when it moves away from the object.
  • the moving body is an automobile, and the object is related to a parking position of the automobile.
  • a mobile body comprising the information processing apparatus according to any one of (1) to (7) above.
  • An information processing method executed by an information processing apparatus, wherein a bird's-eye view image including a moving body and the surroundings of the moving body is generated, the bird's-eye view image is displayed on a display unit that can be viewed by an operator of the moving body, and a depression angle of a viewpoint of the bird's-eye view image is changed based on a distance between the moving body and an object existing in a traveling direction of the moving body.
  • in the above description, the rear view has mainly been described as an example, but the same effects can be obtained by executing similar processing for the forward view.
  • 100: Automobile 110: Information processing device 120: Display 130: Camera 150: Obstacle 201: Control unit 202: Storage unit 203: Communication unit 301: Peripheral information receiving unit 302: Object recognition unit 303: Behavior information receiving unit 304: Display control unit 1000: Information processing system

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

One aspect of the present invention provides an information processing device. This information processing device has a control unit. The control unit generates a bird's-eye image including a mobile body and the surroundings of the mobile body. Control is carried out so as to display the bird's-eye image on a display unit which can be viewed by an operator of the mobile body. A depression angle pertaining to the viewpoint in the bird's-eye image is changed on the basis of the distance between the mobile body and an object which is present in the travelling direction of the mobile body.

Description

Information processing device, mobile body, information processing method, and program
The present invention relates to an information processing device, a mobile object, an information processing method, and a program.
Display systems for mobile objects exist.
Patent Document 1 discloses a technique aimed at a display that makes it easy for an occupant to understand the traveling direction of a moving object and to concentrate on monitoring the surroundings.
Patent Document 1: JP 2021-094939 A
The surrounding information of the moving object is presented in a more easily understandable manner.
According to one aspect of the present invention, an information processing device is provided. This information processing device has a control unit. The control unit generates a bird's-eye view image including the moving body and its surroundings, controls the bird's-eye view image to be displayed on a display unit that can be viewed by the operator of the moving body, and changes the depression angle of the viewpoint of the bird's-eye view image based on the distance between the moving body and an object existing in its traveling direction.
FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000.
FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 110.
FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110.
FIG. 4 is an activity diagram showing an example of information processing by the information processing device 110.
FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image.
FIG. 6 is a diagram (part 2) showing an example of a bird's-eye view image.
FIG. 7 is a diagram (part 3) showing an example of a bird's-eye view image.
FIG. 8 is a diagram showing an example of a displacement curve when per decreases while moving away from an object.
FIG. 9 is a diagram showing an example of a displacement curve when per increases while approaching an object.
FIG. 10 is a diagram showing an example of control by the display control unit 304 such that the closer the automobile 100 is to the obstacle 150, the larger the depression angle.
FIG. 11 is a diagram (part 1) showing an example of a viewpoint change in Modification 3.
FIG. 12 is a diagram (part 2) showing an example of a viewpoint change in Modification 3.
 Embodiments of the present invention are described below with reference to the drawings. The various features shown in the embodiments below can be combined with one another.
 In this specification, a "unit" may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing that can be concretely realized by those hardware resources. The present embodiment also handles various kinds of information, which can be represented, for example, by the physical value of a signal representing a voltage or current, by the high or low level of a signal value forming a binary bit aggregate of 0s and 1s, or by quantum superposition (so-called qubits), and communication and computation on such information can be carried out on a circuit in the broad sense.
 A circuit in the broad sense is a circuit realized by at least an appropriate combination of circuits, circuitry, processors, memories, and the like. That is, it includes application-specific integrated circuits (ASICs) and programmable logic devices such as simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field-programmable gate arrays (FPGAs).
<Embodiment 1>
1. System Configuration
 FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000. As shown in FIG. 1, the information processing system 1000 includes an automobile 100 as its system configuration. An automobile is an example of a moving body. The automobile 100 includes an information processing device 110, a display 120, and a plurality of cameras 130. The information processing device 110 is a device that executes the processing of this embodiment. In this embodiment, the information processing device 110 is described as being included in the automobile 100, but it need not be included in the automobile 100 as long as it can communicate with the automobile 100 or the like. The display 120 is a display device that displays a 3D composite image and the like, described later, under the control of the information processing device 110. A camera 130 is a camera that captures images of the surroundings of the automobile 100. More specifically, the camera 130 is, for example, an RGB camera that records images with color information attached. A camera 130-1 is provided at the rear of the automobile 100, and a camera 130-2 is provided at the front. Although omitted from FIG. 1, cameras 130 are also provided on the left and right sides of the automobile 100 in addition to the cameras 130-1 and 130-2. These cameras are hereinafter simply referred to as the cameras 130. Although not shown in FIG. 1 for simplicity, the automobile 100 also has a plurality of depth sensors. A depth sensor measures the shape of objects around the automobile 100 and the distance from the automobile 100 to those objects using laser light or the like. However, the depth sensor is not limited to laser light; it may measure distance using ultrasonic waves, or using a camera with a depth sensor.
2. Hardware Configuration
 FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 110. The information processing device 110 includes, as its hardware configuration, a control unit 201, a storage unit 202, and a communication unit 203. The control unit 201 is a CPU (Central Processing Unit) or the like and controls the entire information processing device 110. The storage unit 202 is any one of an HDD (Hard Disk Drive), ROM (Read Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), or the like, or any combination thereof, and stores programs and the data the control unit 201 uses when executing processing based on those programs. When the control unit 201 executes processing based on a program stored in the storage unit 202, the functional configuration of the information processing device 110 shown in FIG. 3 and the processing of the activity diagram shown in FIG. 4, both described later, are realized. In this embodiment, the data the control unit 201 uses when executing processing based on a program is described as being stored in the storage unit 202, but the data may instead be stored in the storage unit or the like of another device with which the information processing device 110 can communicate. The communication unit 203 is a NIC (Network Interface Card) or the like; it connects the information processing device 110 to a network and manages communication with other devices (for example, the display 120, the cameras 130, and so on). The storage unit 202 is an example of a storage medium.
3. Functional Configuration
 FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110. The information processing device 110 includes, as its functional configuration, a peripheral information receiving unit 301, an object recognition unit 302, a behavior information receiving unit 303, and a display control unit 304.
(Peripheral information receiving unit 301)
 The peripheral information receiving unit 301 receives, as peripheral information, information on objects around the automobile 100, information on colors around the automobile 100, and the like from the cameras 130. The peripheral information receiving unit 301 also receives, as peripheral information, the shapes of objects around the automobile 100, the distances from the automobile 100 to those objects, and the like from the plurality of depth sensors.
(Object recognition unit 302)
 The object recognition unit 302 recognizes objects around the automobile 100 based on the peripheral information and the like received by the peripheral information receiving unit 301. Surrounding objects include, for example, buildings, roads, other automobiles, and parking lots.
(Behavior information receiving unit 303)
 The behavior information receiving unit 303 receives behavior information of the moving body. More specifically, the behavior information receiving unit 303 acquires, as behavior information of the automobile 100, information detected by a wheel speed sensor, a steering angle sensor, an inertial measurement unit (IMU), and the like included in the automobile 100. An inertial measurement unit is a device that detects three-dimensional inertial motion (translational and rotational motion along three orthogonal axes); it detects translational motion with an acceleration sensor [m/s²] and rotational motion with an angular velocity (gyro) sensor [deg/sec]. The behavior information receiving unit 303 may also receive information about the behavior of the automobile 100 from SLAM (Simultaneous Localization and Mapping) processing in the automobile 100 and from LiDAR, a stereo camera, a gyro sensor, and the like included in the automobile 100.
 Here, the behavior information includes, for example, information on the position and orientation of the automobile 100.
(Display control unit 304)
 The display control unit 304 converts surrounding objects into 3D models based on the information on objects recognized by the object recognition unit 302, information from SLAM, and the like. The display control unit 304 also generates a 3D model of the automobile 100 based on information from the cameras 130, information from SLAM, the behavior information acquired by the behavior information receiving unit 303, and the like. The display control unit 304 further generates a bird's-eye view image including the automobile 100 and its surroundings based on the 3D models of the surrounding objects, the 3D model of the automobile 100, information from the cameras 130, and the like. Here, the bird's-eye view image is an image whose viewpoint has been perspective-transformed so that the road surface is seen from above, and is an image that includes the entire automobile 100, or at least half or more of the automobile 100. Examples of bird's-eye view images are shown in FIGS. 5 to 7, described later. The display control unit 304 controls the generated bird's-eye view image to be displayed on the display 120 of the automobile 100 or the like. The display 120 of the automobile 100 is an example of a display unit visible to the operator of the automobile 100.
 Part or all of the functional configuration of FIG. 3 may be implemented as a hardware configuration in the information processing device 110 or the automobile 100. Part or all of the functional configuration of the information processing device 110 may also be implemented in the automobile 100.
4. Information Processing
 FIG. 4 is an activity diagram showing an example of information processing by the information processing device 110.
 In A401, the peripheral information receiving unit 301 receives, as peripheral information, information on objects around the automobile 100, information on colors around the automobile 100, and the like from the cameras 130. The peripheral information receiving unit 301 also receives, as peripheral information, the shapes of objects around the automobile 100, the distances from the automobile 100 to those objects, and the like from the plurality of depth sensors.
 In A402, the object recognition unit 302 recognizes objects around the automobile 100 based on the peripheral information and the like received by the peripheral information receiving unit 301.
 In A403, the behavior information receiving unit 303 receives information about the behavior of the automobile 100.
 In A404, the display control unit 304 converts surrounding objects into 3D models based on the information on objects recognized by the object recognition unit 302, information from SLAM, and the like. The display control unit 304 also generates a 3D model of the automobile 100 based on information from the cameras 130, information from SLAM, the behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like. The display control unit 304 further generates a bird's-eye view image including the automobile 100 and its surroundings based on the 3D models of the surrounding objects, the 3D model of the automobile 100, information from the cameras 130, and the like.
 In A405, the display control unit 304 controls the generated bird's-eye view image to be displayed on the display 120 of the automobile 100 or the like. The display control unit 304 changes the depression angle of the viewpoint of the bird's-eye view image based on the distance between the automobile 100 and an object present in the traveling direction of the automobile 100. Examples of objects present in the traveling direction include obstacles such as wheel stops. However, this does not limit the present embodiment; other examples of objects present in the traveling direction include a parking lot wall, another car parked in the parking lot, and a mark indicating a parking space in the parking lot. A wheel stop, a mark indicating a parking space, and the like are examples of objects related to the parking position of an automobile. In the following, however, an obstacle is used as the example of the object to simplify the description. Here, the display control unit 304 may recognize and extract an obstacle present in the traveling direction by image processing from an image of the traveling direction, or may recognize it from the objects around the automobile 100 recognized by the object recognition unit 302.
 Examples of bird's-eye view images when the depression angle is changed are shown in FIGS. 5 to 7.
 FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image. FIG. 6 is a diagram (part 2) showing an example of a bird's-eye view image. FIG. 7 is a diagram (part 3) showing an example of a bird's-eye view image.
(Automatic viewpoint conversion)
 In the following, to simplify the description, display control is described for the case where the distance L between the automobile 100 and an obstacle behind the automobile 100 is equal to or greater than a predetermined distance and the case where it is less than the predetermined distance. As an example of the predetermined distance, 5 m is used below; 5 m is merely an example.
 The display control unit 304 obtains per, which indicates a distance ratio.
 When reversing, the display control unit 304 measures the distance from the automobile 100 to the obstacle behind it and obtains per from that distance and the maximum distance (5 m):
 per = 1 - (distance to the obstacle behind) / maximum distance (5 m)
 When moving forward, the display control unit 304 measures the distance from the automobile 100 to the obstacle ahead and obtains per in the same way:
 per = 1 - (distance to the obstacle ahead) / maximum distance (5 m)
 If no obstacle is closer than the maximum distance, the subsequent processing is not performed.
 per (0 <= per <= 1)
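As a rough sketch (not taken from the patent itself), the per computation above could be implemented as follows; the function name, the clamping, and the use of `None` to signal "no obstacle within range" are assumptions added for illustration:

```python
MAX_DIST_M = 5.0  # maximum distance (5 m in the example above)

def distance_ratio(obstacle_dist_m: float, max_dist_m: float = MAX_DIST_M):
    """Return per in [0, 1], or None when the obstacle is beyond max_dist_m."""
    if obstacle_dist_m >= max_dist_m:
        return None  # obstacle not closer than the maximum distance: skip further processing
    per = 1.0 - obstacle_dist_m / max_dist_m
    return max(0.0, min(1.0, per))  # clamp so that 0 <= per <= 1
```

The same function applies to both forward and reverse travel; only the sensor whose reading is passed in differs.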
 The display control unit 304 uses the obtained per to determine a viewpoint movement ratio (dif) from a displacement curve.
 Here, the automobile 100 may be turned back and forth when parking, and the automobile 100 may move away from an obstacle depending on the steering angle. To prevent the image (video) displayed on the display 120 from changing greatly in such cases, the automobile 100 is given two displacement curves: one used while per decreases as the automobile moves away from the obstacle (curveDec, FIG. 8) and one used while per increases as the automobile approaches the obstacle (curveInc, FIG. 9). In addition, curveInc is always located to the right of curveDec. FIG. 8 is a diagram showing an example of the displacement curve used while per decreases as the automobile moves away from an obstacle. FIG. 9 is a diagram showing an example of the displacement curve used while per increases as the automobile approaches an obstacle. In both FIG. 8 and FIG. 9, the horizontal axis is per and the vertical axis is dif. The values at the left and right ends of the curves in FIGS. 8 and 9 are constant: the left end is always 0 and the right end is always 1.
 dif = curve.evaluate(per)
 While per is increasing, the mode changes when dif reaches 1, and the decreasing curve is used for evaluation thereafter.
 While per is decreasing, the mode changes when dif reaches 0, and the increasing curve is used for evaluation thereafter.
 dif (0 <= dif <= 1)
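The two-curve evaluation with mode switching described above can be sketched as follows. The concrete curve shapes are assumptions chosen only to satisfy the stated constraints (both curves run from 0 on the left to 1 on the right, and curveInc lies to the right of curveDec); the patent does not fix their exact form:

```python
def curve_inc(per: float) -> float:
    # Used while approaching the obstacle (per increasing); rises later (shifted right).
    return max(0.0, min(1.0, 2.0 * per - 1.0))

def curve_dec(per: float) -> float:
    # Used while moving away from the obstacle (per decreasing); rises earlier (shifted left).
    return max(0.0, min(1.0, 2.0 * per))

class ViewpointCurve:
    def __init__(self):
        self.mode = "inc"  # movement starts at per = 0, dif = 0, in the increasing mode

    def evaluate(self, per: float) -> float:
        dif = curve_inc(per) if self.mode == "inc" else curve_dec(per)
        # Mode switches: reaching dif = 1 while increasing hands over to curveDec;
        # reaching dif = 0 while decreasing hands over to curveInc.
        if self.mode == "inc" and dif >= 1.0:
            self.mode = "dec"
        elif self.mode == "dec" and dif <= 0.0:
            self.mode = "inc"
        return dif
```

Because curveInc sits to the right of curveDec, small oscillations of per (for example during back-and-forth parking maneuvers) produce little change in dif, which keeps the displayed image stable.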
(Changing the viewpoint)
 The display control unit 304 continuously changes the depression angle based on the distance between the automobile 100 and an obstacle present in the traveling direction. More specifically, the display control unit 304 performs control such that the depression angle increases as the automobile 100 approaches the obstacle. In doing so, the display control unit 304 uses the displacement curves described above to control the depression angle of the viewpoint of the bird's-eye view image so that it increases as the automobile 100 approaches the obstacle. FIG. 10 is a diagram showing an example in which the display control unit 304 controls the depression angle to increase as the automobile 100 approaches an obstacle 150. The display control unit 304 changes the viewpoint from viewpoint 1 to viewpoint 2 as the automobile 100 approaches the obstacle 150. The depression angle of viewpoint 2 is larger than that of viewpoint 1. In the example of FIG. 10, viewpoint 1 and viewpoint 2 are on the axis of the traveling direction of the automobile 100, but neither viewpoint necessarily needs to be on that axis. However, the generated bird's-eye view image is easier to see when the viewpoint is on the axis.
 Here, when there is no obstacle 150 in the traveling direction of the automobile 100, the display control unit 304 generates a bird's-eye view image from viewpoint 1 and controls it to be displayed on the display 120. On the other hand, when an obstacle 150 is present in the traveling direction of the automobile 100, the display control unit 304 may generate a bird's-eye view image from viewpoint 2 and control it to be displayed on the display 120.
 The display control unit 304 may also change the rate of change of the depression angle depending on whether the automobile 100 is approaching the obstacle or moving away from it.
 The display control unit 304 controls the rotation and movement of the camera using per and dif. The camera here is the virtual camera that captures the automobile 100 from above.
 For example, if a depression angle of 30 degrees is used when there is no obstacle, the display control unit 304 performs control such that the depression angle approaches 90 degrees (a view from directly above) as the automobile 100 approaches the obstacle. The 30-degree and 90-degree settings may be changed in advance according to the situation. For example, the display control unit 304 may change the depression angle and/or the displacement curve to be used according to the type, size, and so on of the obstacle.
 Alternatively, a displacement curve for rotation and a displacement curve for movement may be prepared, and the display control unit 304 may control the display of the bird's-eye view image using both.
 The display control unit 304 also records the maximum value of dif reached since the start of movement (per = 0, dif = 0) while per is increasing, and controls the camera using that maximum value of dif even if the value temporarily drops. This suppresses fine camera movement caused by the shape of the obstacle and the like.
 The display control unit 304 also obtains the distance to the obstacle based on information from each sensor (at the front, rear, left, and right of the automobile 100) and calculates the ratio (per) as follows:
 front/rear: per = 1 - (distance to obstacle) / maximum distance (5 m)
 left/right: per = 1 - (distance to obstacle) / maximum distance (1 m)
 The display control unit 304 uses the largest of these per values (the one for which the relative distance to the obstacle is smallest).
 This makes it possible to switch to a view from directly above when the automobile is about to hit an obstacle at its side, preventing a collision with the obstacle.
 The display control unit 304 may also hide obstacles on the near side of the automobile 100 (between the viewpoint and the automobile). This makes the own vehicle visible even in parking facilities with many occluding objects, such as tower parking facilities. Here, the display control unit 304 performs the above processing when nothing on a collision course lies between the viewpoint and the own vehicle. When the own vehicle is always moving forward and the viewpoint is on the near side, the display control unit 304 hides the obstacles on the near side of the automobile 100. That is, the display control unit 304 does not hide obstacles that may collide with the automobile 100.
 These measures allow the viewpoint conversion to be carried out flexibly.
 In the embodiment described above, the explanation mainly used an example in which the obstacle is behind the automobile 100, but the same applies when the obstacle is in front of the automobile 100. The display control unit 304 can also perform the same processing when there are obstacles to the left and right of the automobile 100.
 Parking is broadly divided into the action of bringing the automobile 100 into a parking frame and the approach action of adjusting the stop position and stop attitude. The information about the surroundings of the automobile 100 required differs for each action. For the action of entering the parking frame, what is needed is information on the surroundings for entering the frame and the distance to obstacles on the inner side of the vehicle's turning path when the steering wheel is turned sharply. For the approach action, what is needed is the distance to the stop position and the distances between the automobile 100 and the obstacles on both of its sides. On this premise, the distance from the automobile 100 to an obstacle is displayed as follows.
 In the following description, the ranging ranges are ±20 degrees to the front, ±15 degrees to the rear, and ±60 degrees to the left and right.
1. Display for the left and right
 The display control unit 304 displays a marker at the position of the nearest obstacle.
 The display control unit 304 forms and displays a triangle from the marker and both ends of the tire width of the automobile 100.
 The display control unit 304 applies a gradation such that the position of the obstacle at the shortest distance is displayed in red and the near side in green.
2. Display for the front and rear
 Parallel lines are displayed at 10 cm intervals from the automobile 100 to the obstacle.
 The display control unit 304 applies a gradation such that the position of the obstacle at the shortest distance is displayed in red and the near side in green.
 The display control unit 304 may also extend the scan range vertically in three dimensions.
 FIG. 5 shows an image taken when the automobile 100 enters a parking space. FIG. 6 shows an image during the transition from entering the parking space to approaching. FIG. 7 shows an image while the automobile 100 is approaching.
 As described above, according to this embodiment, the occupant can easily understand the traveling direction of the moving body, and information about the surroundings of the moving body can be presented in a way that is easier to understand. More specifically, the information processing device 110 can present intuitive surrounding information by changing the viewpoint of the perspective transformation of the bird's-eye view image according to the surrounding situation. In particular, the information processing device 110 takes the distance to obstacles around the automobile 100 into account and converts the viewpoint accordingly, realizing a display method that makes obstacles easier to notice. As a result, the operator can understand the forward and backward movements of the vehicle more intuitively when reversing and can drive the automobile 100 safely.
(Modification 1)
 A modification of Embodiment 1 will be described.
 The embodiment described above showed an example in which the depression angle is changed continuously. However, the display control unit 304 may instead change the depression angle in steps according to the distance from the automobile 100 to the obstacle. An example of changing the depression angle in steps is a depression angle of 30 degrees until the distance falls to 5 m, and a depression angle of 90 degrees once the distance is within 5 m.
 Modification 1 also makes it easy for the occupant to understand the traveling direction of the moving body and presents information about the surroundings of the moving body in a way that is easier to understand.
(Modification 2)
 A modification of Embodiment 1 will be described.
 In Embodiment 1, the automobile 100 was described as an example of a moving body. However, the moving body may also be, for example, a so-called drone such as an unmanned aerial vehicle. When the moving body is a drone, the information processing device 110 may be included in the drone or in the controller of the operator who operates the drone. When the moving body is a drone, the display unit visible to the operator is provided on the controller. Also, when the moving body is a drone, an example of the object in Modification 2 is a mark indicating the drone's takeoff and landing site.
 This modification also makes it easy for the operator to understand the traveling direction of the drone and presents information about the surroundings of the moving body in a way that is easier to understand.
(Modification 3)
A modification of the first embodiment will be described.
FIG. 11 is a diagram (part 1) illustrating an example of a viewpoint change in Modification 3.
When an obstacle exists within a predetermined distance to the left or right of the traveling direction of the automobile 100, the display control unit 304 of Modification 3 moves the viewpoint of the bird's-eye view image toward the obstacle. In the example of FIG. 11, since the obstacle 150 is on the left side of the automobile 100, the display control unit 304 shifts the viewpoint leftward from viewpoint 3 to viewpoint 4 and generates a bird's-eye view image from viewpoint 4. In this example, the display control unit 304 generates the bird's-eye view image so that the line of sight from viewpoint 4 is parallel to the line of sight from viewpoint 3.
FIG. 12 is a diagram (part 2) illustrating an example of a viewpoint change in Modification 3.
When an obstacle exists within a predetermined distance to the left or right of the traveling direction of the automobile 100, the display control unit 304 moves the viewpoint of the bird's-eye view image toward the obstacle. In the example of FIG. 12, since the obstacle 150 is on the left side of the automobile 100, the display control unit 304 shifts the viewpoint leftward from viewpoint 5 to viewpoint 6 and generates a bird's-eye view image from viewpoint 6. In the example of FIG. 12, however, the display control unit 304 generates the bird's-eye view image so that the point ahead of the line of sight from viewpoint 5 (for example, the point on the ground that the line of sight reaches) and the point ahead of the line of sight from viewpoint 6 are the same.
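The two variants of Modification 3 differ only in what happens to the gaze target when the viewpoint is shifted sideways. A minimal sketch, assuming a simple (x, y, z) representation (the `Viewpoint` type and function names are hypothetical illustrations, not from the disclosure):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Viewpoint:
    position: Tuple[float, float, float]  # camera position (x, y, z)
    target: Tuple[float, float, float]    # ground point the camera looks at

def shift_parallel(vp: Viewpoint, dx: float) -> Viewpoint:
    """FIG. 11 variant: translate both the camera and its gaze target
    by the same lateral offset, keeping the lines of sight parallel."""
    px, py, pz = vp.position
    tx, ty, tz = vp.target
    return Viewpoint((px + dx, py, pz), (tx + dx, ty, tz))

def shift_fixed_target(vp: Viewpoint, dx: float) -> Viewpoint:
    """FIG. 12 variant: translate only the camera; the gaze target stays
    at the same ground point, so the line of sight rotates toward it."""
    px, py, pz = vp.position
    return Viewpoint((px + dx, py, pz), vp.target)
```

A negative `dx` would correspond to the leftward shift used when, as in the figures, the obstacle 150 is on the left side of the automobile 100.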
<Appendix>
The invention may be provided in each of the aspects described below.
(1) An information processing apparatus comprising a control unit, wherein the control unit generates a bird's-eye view image including a moving body and the surroundings of the moving body, controls the bird's-eye view image to be displayed on a display unit visible to an operator of the moving body, and changes a depression angle of the viewpoint of the bird's-eye view image based on a distance between the moving body and an object existing in a traveling direction of the moving body.
(2) The information processing apparatus according to (1) above, wherein the control unit continuously changes the depression angle based on the distance.
(3) The information processing apparatus according to (1) above, wherein the control unit changes the depression angle in steps based on the distance.
(4) The information processing apparatus according to any one of (1) to (3) above, wherein the control unit performs control such that the depression angle increases as the moving body approaches the object.
(5) The information processing apparatus according to any one of (1) to (4) above, wherein the control unit changes the rate of change of the depression angle between when the moving body approaches the object and when the moving body moves away from the object.
(6) The information processing apparatus according to any one of (1) to (5) above, wherein, when an object exists to the left or right of the traveling direction of the moving body, the control unit moves the viewpoint toward the object.
(7) The information processing apparatus according to any one of (1) to (6) above, wherein the moving body is an automobile and the object relates to a parking position of the automobile.
(8) A moving body comprising the information processing apparatus according to any one of (1) to (7) above.
(9) The moving body according to (8) above, wherein the moving body is an automobile.
(10) An information processing method executed by an information processing apparatus, the method comprising: generating a bird's-eye view image including a moving body and the surroundings of the moving body; controlling the bird's-eye view image to be displayed on a display unit visible to an operator of the moving body; and changing a depression angle of the viewpoint of the bird's-eye view image based on a distance between the moving body and an object existing in a traveling direction of the moving body.
(11) A program for causing a computer to function as the control unit of the information processing apparatus according to any one of (1) to (7) above.
Of course, the invention is not limited to these aspects.
For example, the invention may be provided as a computer-readable non-transitory storage medium storing the above program.
The modifications described above may also be combined arbitrarily.
Also, although the above-described embodiments and the like have mainly been explained using the back view as an example, the effects described above can also be obtained by executing similar processing for the front view.
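Putting the pieces together, the per-frame behavior described by the method aspect above might be sketched as follows; everything here (the names, the linear interpolation, and the render/display stubs) is a hypothetical illustration rather than the disclosed implementation:

```python
import math

def update_frame(vehicle_pos, object_pos, render, display,
                 min_angle=30.0, max_angle=90.0, near=1.0, far=10.0):
    """One iteration of the method: measure the distance to the object
    ahead, derive a depression angle that grows as the vehicle approaches
    (continuous variant), render the bird's-eye view image, and display
    it on the operator's display unit."""
    distance = math.dist(vehicle_pos, object_pos)
    # Clamp the distance into [near, far], then interpolate linearly:
    # at `far` the angle is min_angle, at `near` it is max_angle.
    clamped = min(max(distance, near), far)
    t = (far - clamped) / (far - near)
    angle = min_angle + t * (max_angle - min_angle)
    display(render(angle))  # show the generated image to the operator
    return angle
```

The `render` and `display` callables stand in for the image generation and display control that the disclosure attributes to the display control unit 304.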
Finally, although various embodiments of the present invention have been described, they are presented as examples and are not intended to limit the scope of the invention. The novel embodiments can be implemented in various other forms, and various omissions, substitutions, and modifications can be made without departing from the gist of the invention. The embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and its equivalents.
100: Automobile
110: Information processing device
120: Display
130: Camera
150: Obstacle
201: Control unit
202: Storage unit
203: Communication unit
301: Surrounding information reception unit
302: Object recognition unit
303: Behavior information reception unit
304: Display control unit
1000: Information processing system

Claims (11)

  1.  An information processing device comprising
     a control unit,
     wherein the control unit
     generates a bird's-eye view image including a moving body and the surroundings of the moving body,
     controls the bird's-eye view image to be displayed on a display unit visible to an operator of the moving body, and
     changes a depression angle of the viewpoint of the bird's-eye view image based on a distance between the moving body and an object existing in a traveling direction of the moving body.
  2.  The information processing device according to claim 1,
     wherein the control unit continuously changes the depression angle based on the distance.
  3.  The information processing device according to claim 1,
     wherein the control unit changes the depression angle in steps based on the distance.
  4.  The information processing device according to any one of claims 1 to 3,
     wherein the control unit performs control such that the depression angle increases as the moving body approaches the object.
  5.  The information processing device according to any one of claims 1 to 4,
     wherein the control unit changes the rate of change of the depression angle between when the moving body approaches the object and when the moving body moves away from the object.
  6.  The information processing device according to any one of claims 1 to 5,
     wherein, when an object exists to the left or right of the traveling direction of the moving body, the control unit moves the viewpoint toward the object.
  7.  The information processing device according to any one of claims 1 to 6,
     wherein the moving body is an automobile, and
     the object relates to a parking position of the automobile.
  8.  A moving body comprising
     the information processing device according to any one of claims 1 to 7.
  9.  The moving body according to claim 8,
     wherein the moving body is an automobile.
  10.  An information processing method executed by an information processing device, the method comprising:
     generating a bird's-eye view image including a moving body and the surroundings of the moving body;
     controlling the bird's-eye view image to be displayed on a display unit visible to an operator of the moving body; and
     changing a depression angle of the viewpoint of the bird's-eye view image based on a distance between the moving body and an object existing in a traveling direction of the moving body.
  11.  A program for causing a computer to function as the control unit of the information processing device according to any one of claims 1 to 7.
PCT/JP2022/028506 2021-12-06 2022-07-22 Information processing device, mobile body, information processing method, and program WO2023105842A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202280080009.0A CN118355655A (en) 2021-12-06 2022-07-22 Information processing device, mobile body, information processing method, and program
JP2023566083A JPWO2023105842A1 (en) 2021-12-06 2022-07-22

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-198099 2021-12-06
JP2021198099 2021-12-06

Publications (1)

Publication Number Publication Date
WO2023105842A1 true WO2023105842A1 (en) 2023-06-15

Family

ID=86730051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/028506 WO2023105842A1 (en) 2021-12-06 2022-07-22 Information processing device, mobile body, information processing method, and program

Country Status (3)

Country Link
JP (1) JPWO2023105842A1 (en)
CN (1) CN118355655A (en)
WO (1) WO2023105842A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003134507A (en) * 2001-10-24 2003-05-09 Nissan Motor Co Ltd Monitor device for rear of vehicle
JP2009071790A (en) * 2007-09-18 2009-04-02 Denso Corp Vehicle surroundings monitoring apparatus
JP2010218058A (en) * 2009-03-13 2010-09-30 Denso It Laboratory Inc Device and method for supporting driving
JP2011004201A (en) * 2009-06-19 2011-01-06 Konica Minolta Opto Inc Circumference display
WO2014087594A1 (en) * 2012-12-04 2014-06-12 株式会社デンソー Vehicle monitoring device


Also Published As

Publication number Publication date
JPWO2023105842A1 (en) 2023-06-15
CN118355655A (en) 2024-07-16

Similar Documents

Publication Publication Date Title
JP7021372B2 (en) Detection of couplers and traction bars through group points for automated trailer hitting
JP2022058391A (en) Self-driving car with improved visual detection capability
JP7010221B2 (en) Image generator, image generation method, and program
US9507345B2 (en) Vehicle control system and method
US20170036678A1 (en) Autonomous vehicle control system
JP6659379B2 (en) Road information recognition system and road information recognition method
EP2429877B1 (en) Camera system for use in vehicle parking
CN107433905A (en) For with the towing vehicle for looking around visual field imaging device and the system and method for trailer
US20200097021A1 (en) Autonomous Farm Equipment Hitching To A Tractor
JP2020525948A (en) Autonomous vehicle collision mitigation system and method
DE112018004507T5 (en) INFORMATION PROCESSING DEVICE, MOTION DEVICE AND METHOD AND PROGRAM
US20200062257A1 (en) Automated Reversing By Following User-Selected Trajectories and Estimating Vehicle Motion
US20190135169A1 (en) Vehicle communication system using projected light
US10522041B2 (en) Display device control method and display device
JP2022526191A (en) Vehicle-Autonomous maneuvering and parking of trailers
CN108399394A (en) Barrier method for early warning, device and terminal
JP7371629B2 (en) Information processing device, information processing method, program, and vehicle
JP7091624B2 (en) Image processing equipment
WO2020031812A1 (en) Information processing device, information processing method, information processing program, and moving body
US20230215196A1 (en) Information processing apparatus, information processing method, and program
US10540807B2 (en) Image processing device
CN112837209B (en) Novel method for generating distorted image for fish-eye lens
WO2023105842A1 (en) Information processing device, mobile body, information processing method, and program
WO2023100415A1 (en) Information processing device, movable body, information processing method and program
WO2022202780A1 (en) Display control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22903778

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023566083

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE