WO2023233821A1 - Information processing device and information processing method - Google Patents

Information processing device and information processing method

Info

Publication number
WO2023233821A1
Authority
WO
WIPO (PCT)
Prior art keywords
grid
unit
information processing
processing device
moving
Application number
PCT/JP2023/014415
Other languages
English (en)
Japanese (ja)
Inventor
龍一 鈴木
真 城間
直樹 西田
真理 安田
Original Assignee
Sony Group Corporation
Application filed by Sony Group Corporation
Publication of WO2023233821A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics

Definitions

  • The present disclosure relates to an information processing device and an information processing method.
  • Patent Document 1 discloses a technology that combines a grid image following an uneven surface shape with an image of a work plane captured by a work machine remotely controlled by an operator, and allows the operator to visually recognize the combined image. This lets the operator more easily grasp the sense of distance to the work plane, and therefore remotely control the work machine more appropriately.
  • According to an aspect of the present disclosure, an information processing device is provided that includes an image generation unit that generates a display image in which a grid having a unit cell of a size corresponding to a moving body is superimposed on an environmental image captured by at least one imaging device mounted on the moving body.
  • According to another aspect of the present disclosure, an information processing method is provided that includes generating, by a computing device, a display image in which a grid having a unit cell of a size corresponding to a moving body is superimposed on an environmental image captured by at least one imaging device mounted on the moving body.
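The claimed display-image generation can be illustrated with a minimal sketch (hypothetical, self-contained Python; a real system would draw on camera frames, not a toy pixel array), overlaying grid lines whose spacing corresponds to the unit cell size:

```python
# Minimal sketch of display-image generation: overlay a grid whose unit
# cell spans cell_px pixels onto an environmental image.  All names here
# are illustrative; none of them appear in the disclosure itself.

def overlay_grid(image, cell_px, line_value=255):
    """Return a copy of `image` with grid lines every `cell_px` pixels."""
    out = [row[:] for row in image]           # do not modify the input image
    for y, row in enumerate(out):
        for x in range(len(row)):
            if y % cell_px == 0 or x % cell_px == 0:
                row[x] = line_value
    return out

env_image = [[0] * 8 for _ in range(8)]             # stand-in environmental image EI
display_image = overlay_grid(env_image, cell_px=4)  # stand-in display image VI
```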
  • FIG. 1 is a schematic diagram showing the overall configuration of a remote control system including an information processing device according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing the functional configurations of a moving body and the information processing device.
  • FIG. 3 is a schematic diagram showing an example of an environment around the moving body.
  • FIG. 4 is a schematic diagram showing an example of an environmental image captured by an imaging unit mounted on the moving body in the environment shown in FIG. 3.
  • FIG. 5 is a schematic diagram showing an example of a display image generated by an image generation unit from the environmental image.
  • FIG. 6 is a schematic diagram illustrating the size of a unit cell of the grid superimposed on the environmental image.
  • FIG. 7A is a schematic diagram illustrating an example of a coordinate system of the grid superimposed on the environmental image.
  • FIG. 7B is a schematic diagram illustrating another example of a coordinate system of the grid superimposed on the environmental image.
  • FIG. 8A is a schematic diagram showing an example of a grid in which the size of the unit cell is expanded.
  • FIG. 8B is a graph showing the relationship between the moving speed of the moving body and the amount of expansion of the grid.
  • FIG. 9A is a schematic diagram showing an example of a grid in which the size of the unit cell is reduced.
  • FIG. 9B is a graph showing the relationship between the moving speed of the moving body and the amount of reduction of the grid.
  • FIG. 10 is a schematic diagram showing an example of deformation of the unit cell of the grid when the moving body moves diagonally.
  • FIG. 11 is a flowchart showing the flow of operations of the information processing device.
  • FIG. 12 is a schematic diagram illustrating the grid and sub-grid superimposed on the environmental image in a first modification.
  • FIG. 13 is a graph showing the relationship between the display level of the sub-grid and the moving speed of the moving body.
  • FIG. 14 is a schematic diagram illustrating the grid superimposed on the environmental image in a second modification.
  • FIG. 15 is a schematic diagram showing the display image in a third modification.
  • FIG. 16 is a schematic diagram showing an example of a display image in which a two-dimensional grid is superimposed at a predetermined height position in a three-dimensional space in a fourth modification.
  • FIG. 17 is a schematic diagram showing an example of a display image in which a three-dimensional grid having a rectangular-parallelepiped unit cell is superimposed on an environmental image captured by the imaging unit of the moving body in the fourth modification.
  • 1. Configuration example 1.1. Overall configuration 1.2. Configuration of information processing device 2. Operation example 3. Modifications 3.1. First modification 3.2. Second modification 3.3. Third modification 3.4. Fourth modification
  • FIG. 1 is a schematic diagram showing the overall configuration of a remote control system including an information processing apparatus according to this embodiment.
  • The remote control system includes a moving body 20 and an information processing device 10 connected to the moving body 20 via a network 30.
  • The moving body 20 is a robot that is remotely controlled through input to the information processing device 10 by an operator 40.
  • The moving body 20 may be a wheeled, legged, or leg-wheeled robot, or may be a rotary-wing or air-levitation type drone.
  • The moving body 20 can, for example, transmit an environmental image captured by an imaging device mounted on the moving body 20 to the information processing device 10, and receive remote operation instructions input to the information processing device 10.
  • The moving body 20 may also be able to move autonomously based on sensing results of the surrounding environment.
  • The information processing device 10 is a terminal device used by the operator 40 to remotely control the moving body 20.
  • The information processing device 10 may be, for example, a personal computer (PC), a tablet, or a smartphone.
  • The information processing device 10 presents to the operator 40 the environmental image of the surroundings of the moving body 20 received from the moving body 20, receives remote operation instructions for the moving body 20 from the operator 40, and transmits the received instructions to the moving body 20.
  • The network 30 is a communication network that connects the moving body 20 and the information processing device 10 so that they can mutually transmit and receive data.
  • The network 30 may be, for example, the Internet, a satellite communication network, a mobile communication network, a LAN (Local Area Network), or a WAN (Wide Area Network).
  • The information processing device 10 may be connected to the network 30 using wired communication.
  • The moving body 20 may connect to the network 30 using wireless communication so that it can connect to the network 30 from any location it moves to.
  • FIG. 2 is a block diagram showing the functional configurations of the mobile object 20 and the information processing device 10.
  • The moving body 20 includes, for example, an imaging unit 210, a sensor unit 220, a drive unit 230, an output unit 240, a control unit 250, and a communication unit 260.
  • The moving body 20 can transmit an environmental image of its surroundings captured by the imaging unit 210 to the information processing device 10, and can receive remote operation instructions transmitted from the information processing device 10.
  • The imaging unit 210 includes a camera that images the environment around the moving body 20.
  • The imaging unit 210 may include, for example, an RGB camera, a stereo camera, an IR camera, or a thermal camera.
  • The imaging unit 210 may include multiple cameras with different imaging directions, or multiple cameras of different types.
  • The sensor unit 220 includes a sensor capable of sensing information about the outside or inside of the moving body 20.
  • The sensor unit 220 may include a sensor that measures the distance to an object in the outside world, such as a ToF (Time of Flight) sensor, LiDAR (Light Detection and Ranging), radar (Radio Detection and Ranging), or an ultrasonic sensor.
  • The sensor unit 220 may include a sensor that measures information related to the environment, such as the temperature, humidity, illuminance, or atmospheric pressure of the outside world.
  • The sensor unit 220 may include a sensor that measures information regarding the inside of the moving body 20, such as the position, vibration, tilt, speed, or acceleration of the moving body 20.
  • The drive unit 230 is a moving mechanism that can move the moving body 20 to an arbitrary position under the control of the control unit 250.
  • The drive unit 230 may be a moving mechanism of various types, such as a wheeled, legged, leg-wheeled, crawler, or air-cushion type; it may also be a mechanism capable of moving through the air, such as a rotary wing, or a mechanism such as a screw propeller capable of moving on or under water.
  • The output unit 240 outputs various information in the form of images or sounds under the control of the control unit 250.
  • The output unit 240 may include, for example, a liquid crystal display (LCD) device or an OLED (Organic Light Emitting Diode) display device that outputs various information as images, or a lamp that outputs various information as light. The output unit 240 may also include, for example, a speaker that outputs various information as audio to the surroundings of the moving body 20.
  • The control unit 250 controls the overall operation of the moving body 20.
  • The control unit 250 is configured by, for example, hardware such as a CPU (Central Processing Unit), RAM (Random Access Memory), and ROM (Read Only Memory), together with software including a control program for the moving body 20.
  • The control unit 250 may estimate the self-position of the moving body 20 based on the sensing results of the sensor unit 220, create an action plan for the moving body 20, and drive the drive unit 230 based on the created action plan.
  • The communication unit 260 is a communication interface for connecting to the network 30.
  • The communication unit 260 may be, for example, a communication interface that can connect to the network 30 wirelessly.
  • Specifically, the communication unit 260 may be a communication interface connectable to the network 30 via another network, such as a mobile communication network or a wireless LAN, or via a base station.
  • The information processing device 10 includes, for example, a communication unit 110, an image generation unit 120, a grid deformation unit 130, and a display unit 140.
  • The information processing device 10 can receive an environmental image of the surroundings of the moving body 20 from the moving body 20, and can transmit to the moving body 20 remote operation instructions for the moving body 20 input to the information processing device 10 by the operator 40.
  • The communication unit 110 is a communication interface for connecting to the network 30.
  • The communication unit 110 may be, for example, a communication interface connectable to the network 30 by wire.
  • Specifically, the communication unit 110 may be a communication interface connectable to the network 30 via another network such as a wired LAN.
  • The image generation unit 120 generates a display image to be presented to the operator 40 by superimposing a grid having a unit cell of a size corresponding to the moving body 20 on the environmental image of the surroundings of the moving body 20 received from the moving body 20.
  • Because the display image with the superimposed grid allows the operator 40 to intuitively grasp the size or movement characteristics of the moving body 20, the information processing device 10 enables the operator 40 to more easily get a feel for operating the moving body 20.
  • FIG. 3 is a schematic diagram showing an example of the environment around the moving body 20 for explaining the operation of the image generation unit 120.
  • FIG. 4 is a schematic diagram showing an example of an environmental image EI captured by the imaging unit 210 mounted on the moving body 20 in the environment shown in FIG. 3.
  • FIG. 5 is a schematic diagram showing an example of a display image VI generated by the image generation unit 120 from the environmental image EI.
  • FIG. 6 is a schematic diagram illustrating the size of a unit cell of the grid Gd superimposed on the environment image EI.
  • FIG. 7A is a schematic diagram illustrating an example of a coordinate system of a grid superimposed on the environmental image EI.
  • FIG. 7B is a schematic diagram illustrating another example of the coordinate system of the grid superimposed on the environmental image EI.
  • The moving body 20 can transmit the captured environmental image EI to the information processing device 10.
  • The image generation unit 120 of the information processing device 10 can generate the display image VI to be presented to the operator 40 by superimposing a grid Gd on the received environmental image EI, on a plane including the top surface of the table 62, which is the running surface of the moving body 20. The operator 40 who views the display image VI can thereby grasp the running surface of the moving body 20 more clearly from the position of the grid Gd.
  • The image generation unit 120 may superimpose the grid Gd only on the region of the display image VI where the moving body 20 can travel. For example, in the display image VI shown in FIG. 5, the image generation unit 120 may superimpose the grid Gd only on the top surface of the table 62 on which the moving body 20 can travel. This allows the image generation unit 120 to express the travelable area of the moving body 20 more clearly in the display image VI.
  • When an obstacle exists in the travelable area, the image generation unit 120 may superimpose the grid Gd so as to cover the obstacle along its surface shape. This allows the image generation unit 120 to express more clearly, in the display image VI, obstacles present in the area where the moving body 20 can travel.
  • The running surface of the moving body 20 can be recognized by analyzing the sensing results of the sensor unit 220 of the moving body 20 (in particular, the sensing results of the distance measuring sensor).
  • Alternatively, the running surface of the moving body 20 can be recognized by image analysis of the environmental image EI captured by the imaging unit 210 of the moving body 20.
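Restricting the grid to the travelable region can be sketched as follows (hypothetical names and toy pixel arrays; a real implementation would derive the mask from the range-sensor data or image analysis described above):

```python
# Sketch: draw grid lines only inside a travelability mask, e.g. one
# derived from distance-sensor data.  Toy arrays stand in for real images.

def overlay_grid_masked(image, mask, cell_px, line_value=255):
    """Overlay grid lines only where mask[y][x] is True (travelable)."""
    out = [row[:] for row in image]
    for y, row in enumerate(out):
        for x in range(len(row)):
            if mask[y][x] and (y % cell_px == 0 or x % cell_px == 0):
                row[x] = line_value
    return out

image = [[0] * 6 for _ in range(6)]
# Travelable region: the left half of the image (e.g., a table top).
mask = [[x < 3 for x in range(6)] for _ in range(6)]
masked_vi = overlay_grid_masked(image, mask, cell_px=2)
```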
  • The unit cell of the grid Gd superimposed on the environmental image EI may have a rectangular shape.
  • For example, the grid Gd may be configured by arranging rectangular unit cells in the front direction of the moving body 20 and in the side direction perpendicular to the front direction.
  • That is, the grid Gd superimposed on the environmental image EI may be configured by arranging rectangular unit cells in a matrix on an xy plane in which the front direction of the moving body 20 is the y direction and the side direction of the moving body 20 is the x direction.
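The matrix arrangement on the xy plane can be sketched as follows, with y as the front direction and x as the side direction (the function and variable names are illustrative assumptions):

```python
# Sketch: corner coordinates (in meters) of unit cell (i, j) of a grid
# whose x axis is the side direction and whose y axis is the front
# direction of the moving body.

def unit_cell_corners(i, j, cell_w, cell_l):
    """Counter-clockwise corners of cell column i, row j."""
    x0, y0 = i * cell_w, j * cell_l
    return [(x0, y0), (x0 + cell_w, y0),
            (x0 + cell_w, y0 + cell_l), (x0, y0 + cell_l)]

# Cell one column to the side and two rows ahead of the origin.
corners = unit_cell_corners(1, 2, cell_w=0.25, cell_l=0.5)
```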
  • However, the shape of the unit cell of the grid Gd is not limited to the above example.
  • For example, the shape of the unit cell of the grid Gd may be any other polygon that can tile a plane by itself.
  • Specifically, the shape of the unit cell of the grid Gd may be a triangle, a parallelogram, a rhombus, or a hexagon.
  • The size of the unit cell of the grid Gd is determined for each moving body 20 based on the size, movement characteristics, or use of the moving body 20. This allows the operator 40 to intuitively grasp the size or movement characteristics of the moving body 20 from the grid Gd superimposed on the display image VI, and therefore to more easily get a feel for operating the moving body 20. Note that information regarding the size, movement characteristics, or use of the moving body 20 can be acquired from the moving body 20 via the communication unit 110, for example.
  • The size of the unit cell of the grid Gd may be determined based on the area occupied by the moving body 20 on the running surface. For example, if the moving body 20 is a wheeled moving body, the size of the unit cell of the grid Gd may be determined based on the full length and full width of the moving body 20.
  • More specifically, the unit cell of the grid Gd may be configured as a rectangle whose length is the full length of the moving body 20 and whose width is the full width of the moving body 20, or as a rectangle expanded to more than 100% and not more than 120% of that rectangle.
  • The size of the unit cell of the grid Gd may instead be determined based on the amount of movement of the moving body 20 per drive. For example, when the moving body 20 is a legged moving body, the size of the unit cell of the grid Gd may be determined based on the width of one walking step of the moving body 20. More specifically, the unit cell of the grid Gd may be configured as a rectangle whose length is the width of one walking step of the moving body 20 and whose width is the full width of the moving body 20, or as a rectangle expanded to more than 100% and not more than 120% of that rectangle.
  • The size of the unit cell of the grid Gd may also be determined based on the size of an object existing in the usage environment of the moving body 20.
  • For example, when the moving body 20 is an endoscopic camera, the size of the unit cell of the grid Gd may be determined based on the size of a surgeon's finger (for example, a thumb), which is frequently seen in the endoscopic image.
  • More specifically, the unit cell of the grid Gd may be configured as a rectangle whose length is the size of a human finger (for example, a thumb) and whose width is the full width of the moving body 20, or as a rectangle expanded to more than 100% and not more than 120% of that rectangle.
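The sizing rules above (a base rectangle from the body footprint, step width, or a reference object, optionally expanded to at most 120%) can be summarized in a small sketch; the helper name and the clamping of the expansion factor are illustrative assumptions:

```python
def unit_cell_size(length_m, width_m, expand=1.0):
    """Unit cell sized from a base length/width (full length and width,
    one step width, or a reference object size), with the expansion
    factor clamped to the 100%-120% range described above."""
    expand = min(max(expand, 1.0), 1.2)
    return length_m * expand, width_m * expand

# Wheeled body 0.5 m long and 0.3 m wide, cell expanded by 10%.
cell = unit_cell_size(0.5, 0.3, expand=1.1)
```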
  • The coordinate system of the grid Gd superimposed on the environmental image EI may be fixed in space, or may move or rotate in conjunction with the movement of the moving body 20.
  • For example, the coordinate system of the grid Gd superimposed on the environmental image EI may be fixed with respect to the space including the running surface.
  • In this case, the coordinate system of the grid Gd is fixed in space.
  • The moving body 20 may then move freely over the grid Gd, whose unit-cell arrangement directions are fixed.
  • Alternatively, the coordinate system of the grid Gd superimposed on the environmental image EI may be fixed with respect to the moving body 20.
  • For example, the coordinate system of the grid Gd may rotate in conjunction with the rotation of the moving body 20 so that the unit cells of the grid Gd are always arranged in the front and side directions of the moving body 20.
  • When the moving direction of the moving body 20 is to be made clearer, the coordinate system of the grid Gd is preferably fixed with respect to the moving body 20.
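The two options, a space-fixed frame versus a body-fixed frame that rotates with the body, can be sketched as follows (a hypothetical helper; only yaw rotation in the plane is considered):

```python
import math

def grid_axes(yaw_rad, fixed_to_body):
    """Unit vectors of the grid's side (x) and front (y) axes.
    Space-fixed: the axes never change.  Body-fixed: the axes rotate
    together with the body's yaw angle."""
    if not fixed_to_body:
        return (1.0, 0.0), (0.0, 1.0)
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c, s), (-s, c)
```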
  • The grid deformation unit 130 expands or reduces the size of the unit cell of the grid Gd based on the moving speed of the moving body 20. Specifically, the grid deformation unit 130 expands or reduces the size of the unit cell of the grid Gd in the traveling direction of the moving body 20 based on the moving speed of the moving body 20. By deforming the unit cell of the grid Gd superimposed on the display image VI, the information processing device 10 allows the operator 40 to intuitively grasp the moving speed of the moving body 20, and therefore to more easily get a feel for operating the moving body 20. Note that information regarding the moving speed of the moving body 20 can be acquired from the moving body 20 via the communication unit 110, for example.
  • FIG. 8A is a schematic diagram showing an example of a grid Gd in which the size of the unit cell is expanded.
  • FIG. 8B is a graph showing the relationship between the moving speed of the moving body 20 and the amount of expansion of the grid Gd.
  • FIG. 9A is a schematic diagram showing an example of a grid Gd in which the size of the unit cell is reduced.
  • FIG. 9B is a graph showing the relationship between the moving speed of the moving body 20 and the amount of reduction of the grid Gd.
  • FIG. 10 is a schematic diagram showing an example of deformation of the unit cell of the grid Gd when the moving body 20 moves diagonally.
  • The grid deformation unit 130 may expand the size d of the unit cell of the grid Gd in the moving direction TD of the moving body 20 in proportion to the moving speed of the moving body 20.
  • Specifically, when the moving speed of the moving body 20 is equal to or higher than a threshold V1, the grid deformation unit 130 may expand the size d of the unit cell of the grid Gd in proportion to the moving speed. By expanding the unit cell of the grid Gd, the grid deformation unit 130 allows the operator 40 to intuitively grasp the increase in the moving speed of the moving body 20.
  • When the moving speed of the moving body 20 is less than the threshold V1, the grid deformation unit 130 may leave the size d of the unit cell of the grid Gd unexpanded, at the size L determined for each moving body 20 based on the size, movement characteristics, or use of the moving body 20.
  • By expanding the size d of the unit cell of the grid Gd only when the moving speed of the moving body 20 is equal to or higher than the threshold V1, the grid deformation unit 130 can suppress frequent changes in the size d of the unit cell of the grid Gd while the moving body 20 moves at low speed.
  • The threshold V1 may be set, for example, to a speed at which the moving distance of the moving body 20 per second exceeds the grid size L.
  • Alternatively, the grid deformation unit 130 may reduce the size d of the unit cell of the grid Gd in the traveling direction TD of the moving body 20 in proportion to the moving speed of the moving body 20.
  • Specifically, when the moving speed of the moving body 20 is equal to or higher than a threshold V2, the grid deformation unit 130 may reduce the size d of the unit cell of the grid Gd in proportion to the moving speed. By reducing the unit cell of the grid Gd, the grid deformation unit 130 allows the operator 40 to intuitively grasp the increase in the moving speed of the moving body 20.
  • When the moving speed of the moving body 20 is less than the threshold V2, the grid deformation unit 130 may leave the size d of the unit cell of the grid Gd unreduced, at the size L determined for each moving body 20 based on the size, movement characteristics, or use of the moving body 20.
  • By reducing the size d of the unit cell of the grid Gd only when the moving speed of the moving body 20 is equal to or higher than the threshold V2, the grid deformation unit 130 can suppress frequent changes in the size d of the unit cell of the grid Gd while the moving body 20 moves at low speed.
  • The threshold V2 may be set, for example, to a speed at which the moving distance of the moving body 20 per second exceeds the grid size L.
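Both deformation schemes, expansion and reduction, each active only at or above a threshold, can be sketched together; the linear-ratio form is an assumption consistent with "in proportion to the moving speed":

```python
def deformed_cell_size(base_size, speed, v_threshold, mode="expand"):
    """Size d of the unit cell along the travel direction.  Below the
    threshold the base size L is kept; at or above it, d grows (expand)
    or shrinks (reduce) in proportion to the moving speed."""
    if speed < v_threshold:
        return base_size
    ratio = speed / v_threshold
    return base_size * ratio if mode == "expand" else base_size / ratio
```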
  • When the moving body 20 moves diagonally, as shown in FIG. 10, the grid deformation unit 130 may expand or reduce the unit cells of the grid Gd in the diagonal direction.
  • For example, the grid deformation unit 130 may expand or reduce the front-direction and side-direction sizes of the unit cell in accordance with the front-direction and side-direction components of the moving speed, respectively.
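For diagonal movement, the per-direction deformation can be sketched by scaling the side (x) and front (y) cell sizes independently by the corresponding velocity components (a hypothetical decomposition, consistent with the description above):

```python
def diagonal_cell_size(cell_w, cell_l, vx, vy, v_threshold):
    """Deform the side (x) and front (y) sizes of the unit cell, each
    in proportion to the corresponding component of the moving speed."""
    sx = max(abs(vx) / v_threshold, 1.0)
    sy = max(abs(vy) / v_threshold, 1.0)
    return cell_w * sx, cell_l * sy

# Diagonal motion: side component twice the threshold, front component at it.
size = diagonal_cell_size(1.0, 2.0, vx=2.0, vy=1.0, v_threshold=1.0)
```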
  • The display unit 140 is a display device that displays the display image VI generated by the image generation unit 120.
  • Through the display unit 140, the operator 40 can view the display image VI in which the grid Gd is superimposed on the environmental image EI of the surroundings of the moving body 20. The operator 40 can thereby recognize information regarding the size or movement characteristics of the moving body 20 from the grid Gd included in the display image VI, and can therefore more easily get a feel for operating the moving body 20.
  • the display unit 140 may be a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an OLED (Organic Light Emitting Diode) display, a hologram, or a projector.
  • FIG. 11 is a flowchart showing the flow of operations of the information processing device 10 according to this embodiment.
  • The information processing device 10 first connects to the moving body 20 to be remotely controlled via the network 30 (S101). Next, the information processing device 10 acquires information for determining the size of the grid Gd from the moving body 20 via the network 30 (S102). Subsequently, the information processing device 10 acquires the environmental image EI captured by the imaging unit 210 of the moving body 20 via the network 30 (S103).
  • Next, the information processing device 10 causes the image generation unit 120 to superimpose on the environmental image EI a grid Gd having a unit cell of a size corresponding to the moving body 20 (S104).
  • The size of the unit cell of the grid Gd is set for each moving body 20 based on the size, movement characteristics, or use of the moving body 20.
  • The environmental image EI on which the grid Gd is superimposed is displayed on the display unit 140 as the display image VI and is thereby viewed by the operator 40.
  • Next, the information processing device 10 acquires the moving speed of the moving body 20 via the network 30 (S105).
  • The information processing device 10 then determines whether the connection with the moving body 20 has been released (S106).
  • If the connection has not been released, the information processing device 10 causes the grid deformation unit 130 to expand or reduce the size of the unit cell of the grid Gd based on the moving speed of the moving body 20 (S107). The operation flow then returns to step S104, and the information processing device 10 causes the image generation unit 120 to superimpose the grid Gd with the expanded or reduced unit cell on the environmental image EI (S104). The information processing device 10 repeats the operations from step S104 to step S107 until the connection with the moving body 20 is released.
  • If the connection has been released, the information processing device 10 determines whether to connect to another moving body 20 (S108). When connecting to another moving body 20 (S108/Yes), the operation flow returns to step S101, and the information processing device 10 connects to the moving body 20 to be remotely controlled (S101). When not connecting to another moving body 20 (S108/No), the information processing device 10 ends its operation.
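The S101 to S108 flow can be sketched as a loop; the `FakeDevice` class and every method name here are illustrative stand-ins, not part of the disclosure:

```python
# Sketch of the operation flow: connect (S101), get grid info (S102),
# then repeat image acquisition (S103), grid superimposition (S104),
# speed acquisition (S105) and grid deformation (S107) until the
# connection is released (S106), finally checking for another target (S108).

def run(device):
    while True:
        device.connect()                                 # S101
        grid_info = device.get_grid_info()               # S102
        while device.connected():                        # S106 check
            ei = device.get_environment_image()          # S103
            vi = device.superimpose_grid(ei, grid_info)  # S104
            device.display(vi)
            speed = device.get_speed()                   # S105
            device.deform_grid(grid_info, speed)         # S107
        if not device.has_next_target():                 # S108
            break

class FakeDevice:
    """Minimal stand-in used only to exercise the loop."""
    def __init__(self, frames):
        self.frames, self.log = frames, []
    def connect(self): self.log.append("S101")
    def get_grid_info(self): return {"L": 1.0}
    def connected(self): return self.frames > 0
    def get_environment_image(self):
        self.frames -= 1
        return "EI"
    def superimpose_grid(self, ei, info):
        self.log.append("S104")
        return "VI"
    def display(self, vi): pass
    def get_speed(self): return 0.0
    def deform_grid(self, info, speed): self.log.append("S107")
    def has_next_target(self): return False

device = FakeDevice(frames=2)
run(device)
```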
  • As described above, the information processing device 10 superimposes the grid Gd having a unit cell of a size corresponding to the moving body 20 on the environmental image EI, and can expand or reduce the size of the unit cell of the grid Gd based on the moving speed of the moving body 20. The operator 40 who views the display image VI can therefore intuitively grasp the size or movement characteristics of the moving body 20, and can remotely control the moving body 20 more smoothly.
  • FIG. 12 is a schematic diagram illustrating the grid Gd and sub-grid Sd superimposed on the environmental image EI in the first modification.
  • FIG. 13 is a graph showing the relationship between the display level α of the sub-grid Sd and the moving speed of the moving body 20.
  • the image generation unit 120 may further superimpose a sub-grid Sd in addition to the grid Gd on the environmental image EI captured by the imaging unit 210 of the moving body 20.
  • the sub-grid Sd is an auxiliary grid having a unit cell that is a fraction of the size of the unit cell of the grid Gd.
  • the sub-grid Sd allows the operator 40 to grasp the amount of movement of the moving body 20 in more detail by dividing the unit grid of the grid Gd into smaller pieces.
  • the image generation unit 120 generates a grid Gd whose unit grid is a rectangular shape whose length is the full length of the moving body 20 and whose width is the full width of the moving body 20, and a unit grid whose size is 1/2 of the grid Gd. may be superimposed on the environment image EI.
  • the sub-grid Sd may be expressed by lines that are different from the grid Gd in at least one of line type, color, or width.
  • the sub-grid Sd may be composed of broken lines or dotted lines.
  • the sub-grid Sd may be expressed by the same lines as the grid Gd.
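  • The sizing of the grid Gd and sub-grid Sd from the dimensions of the moving body 20 can be sketched as follows; the function name and parameterization are illustrative assumptions, using the rectangle-of-footprint grid cell and 1/2-size sub-grid cell from the example above.

```python
def make_cell_sizes(full_length_m: float, full_width_m: float,
                    subdivision: int = 2):
    """Unit-cell sizes for grid Gd (vehicle footprint) and sub-grid Sd.

    The grid cell is a rectangle matching the moving body's full length
    and width; the sub-grid cell is an integer fraction of it (1/2 in
    the example of the first modification).
    """
    grid_cell = (full_length_m, full_width_m)
    sub_cell = (full_length_m / subdivision, full_width_m / subdivision)
    return grid_cell, sub_cell
```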
  • the display level of the sub-grid Sd may be changed based on the moving speed of the moving body 20.
  • The display level α of the sub-grid Sd may be controlled so that the higher the moving speed of the moving body 20, the lower the display level α, with the sub-grid Sd not displayed at or above a threshold value V3.
  • In other words, the sub-grid Sd may be controlled so that its transparency increases as the moving speed of the moving body 20 increases, becoming fully transparent at or above the threshold value V3.
  • That is, the image generation unit 120 can control the sub-grid Sd so that it is displayed on the display image VI when the moving body 20 is moving at a low speed below the threshold V3, and is not displayed when the moving body 20 is moving at a high speed at or above the threshold V3. Therefore, when the moving body 20 is moving at low speed, the image generation unit 120 displays the sub-grid Sd with its smaller unit cells, allowing the operator 40 to remotely operate the moving body 20 using the sub-grid Sd as a guide.
  • Meanwhile, the image generation unit 120 can prevent the visibility of the display image VI from decreasing by not displaying the sub-grid Sd when the moving body 20 is moving at high speed. That is, by controlling display or non-display of the sub-grid Sd based on the moving speed of the moving body 20, the information processing device 10 can discretely change the grid width displayed on the display image VI. Therefore, the information processing device 10 allows the operator 40 to view a display image VI including a grid whose unit cell size suits the moving speed of the moving body 20.
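  • The speed-dependent display level α of the sub-grid Sd (FIG. 13) can be sketched as follows; the linear ramp and the default value of the threshold V3 are assumptions, since the publication only specifies that α decreases with speed and reaches non-display at V3.

```python
def subgrid_alpha(speed_mps: float, v3_mps: float = 2.0,
                  max_alpha: float = 1.0) -> float:
    """Display level (opacity) of sub-grid Sd as a function of speed.

    Opacity falls as the moving body speeds up and reaches 0 (fully
    transparent, i.e. the sub-grid is hidden) at the threshold V3 or
    above. The linear ramp between 0 and V3 is an illustrative choice.
    """
    if speed_mps >= v3_mps:
        return 0.0
    return max_alpha * (1.0 - speed_mps / v3_mps)
```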
  • Note that the grid deformation unit 130 may choose not to expand or contract the unit cell size of the grid Gd.
  • When the unit cell size of the grid Gd expands or contracts, the unit cell size of the sub-grid Sd expands or contracts accordingly. Therefore, if both the grid Gd and the sub-grid Sd superimposed on the environmental image EI are deformed, the visibility of the display image VI may decrease.
  • In such a case, the grid deformation unit 130 may choose not to expand or contract the unit cell sizes of the grid Gd and the sub-grid Sd, depending on the content of the display image VI.
  • the second modification is a modification in which the expression of the grid Gd superimposed on the environment image EI is controlled based on the position of the moving object 20 and the like.
  • FIG. 14 is a schematic diagram illustrating the grid Gd superimposed on the environmental image EI in the second modification.
  • the image generation unit 120 may color the unit grid FL in which the moving object 20 is present among the unit grids of the grid Gd.
  • the color given to the unit cell FL is expressed as hatching.
  • A blind spot may occur in the environmental image EI captured by the imaging unit 210, particularly in the vicinity of the drive unit 230 of the moving body 20 (that is, near its feet).
  • the image generation unit 120 can clarify the position of the moving body 20 in the display image VI by coloring the unit grid FL in which the moving body 20 exists. Therefore, the operator 40 can more accurately grasp the sense of distance between the moving object 20 and the object in the display image VI.
  • the image generation unit 120 can also similarly color unit cells near the unit cell FL where the moving body 20 exists.
  • the color given to the neighboring unit grids may become darker as the unit grids are closer to the unit grid FL where the moving body 20 is present. According to this, even if the blind spot in the display image VI is large, the image generation unit 120 can clarify the position of the moving object 20 in the display image VI.
  • the image generation unit 120 may apply an effect such as highlighting or hatching to the unit grid FL in which the moving object 20 exists instead of the color. Even in such a case, the image generation unit 120 can similarly clarify the position of the moving body 20 within the display image VI.
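  • The coloring of the unit cell FL and its neighbors can be sketched as follows; the use of Chebyshev distance and the concrete shade values are illustrative assumptions, chosen so that cells closer to the cell FL containing the moving body 20 receive a darker shade, as described above.

```python
def cell_shade(cell: tuple, robot_cell: tuple,
               base_shade: float = 0.6, falloff: float = 0.2) -> float:
    """Shade intensity (0 = none, 1 = darkest) for a unit cell.

    The cell FL containing the moving body gets the strongest shade;
    neighboring cells fade with Chebyshev (grid) distance, so the
    robot's position stays visible even inside the camera's blind spot.
    """
    d = max(abs(cell[0] - robot_cell[0]), abs(cell[1] - robot_cell[1]))
    return max(0.0, base_shade - falloff * d)
```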
  • the image generation unit 120 may change the line width of the grid Gd according to the position of the moving object 20. Specifically, the image generation unit 120 may make the line width of the grid Gd thinner in proportion to the distance from the moving object 20. According to this, the image generation unit 120 can improve the visibility of the grid Gd in the display image VI, and can make it easier to understand the sense of distance in the display image VI by the line width of the grid Gd.
  • The image generation unit 120 may change the line width of the grid Gd in the traveling direction of the moving body 20, and may also change the line width of the grid Gd in the direction perpendicular to the traveling direction.
  • Furthermore, the image generation unit 120 may adjust the color tone or brightness of the lines of the grid Gd based on the surrounding environment of the moving body 20 (for example, the color of the running surface). Specifically, the image generation unit 120 may adjust the color tone or brightness of the lines of the grid Gd so that they have a higher contrast with the color tone or brightness of the surrounding environment of the moving body 20. According to this, the image generation unit 120 can further improve the visibility of the grid Gd in the display image VI.
  • The color tone or brightness of the surrounding environment of the moving body 20 may be determined, for example, from the average color tone or brightness of the pixels of the environmental image EI, or from the average color tone or brightness of the pixels of the running surface of the moving body 20.
  • For example, when the average brightness of the surrounding environment of the moving body 20 is low, the image generation unit 120 may adjust the brightness of the lines of the grid Gd to be higher. Conversely, when the average brightness of the surrounding environment of the moving body 20 is high, the image generation unit 120 may adjust the brightness of the lines of the grid Gd to be lower. As another example, the image generation unit 120 may adjust the color tone of the lines of the grid Gd to be complementary to the average color tone of the surrounding environment of the moving body 20.
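  • The contrast-based brightness adjustment of the grid lines can be sketched as follows; the inversion policy and the concrete contrast margin are illustrative assumptions on top of the rule stated above (bright lines on dark surroundings, dark lines on bright surroundings).

```python
def grid_line_brightness(env_mean_brightness: float) -> float:
    """Pick a line brightness that contrasts with the surroundings.

    env_mean_brightness is the average pixel brightness (0..1) of the
    environmental image EI or of the running surface. Dark surroundings
    get bright lines and vice versa; the simple threshold-and-margin
    policy here is an illustrative choice.
    """
    margin = 0.4  # minimum brightness gap between line and background
    if env_mean_brightness < 0.5:
        return min(1.0, env_mean_brightness + margin + 0.1)
    return max(0.0, env_mean_brightness - margin - 0.1)
```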
  • FIG. 15 is a schematic diagram showing a display image VI in the third modification.
  • the image generation unit 120 may generate a display image VIA including each of the environmental images EI1 and EI2 captured by the plurality of imaging units 210.
  • The image generation unit 120 can superimpose, on each of the environmental images EI1 and EI2 captured by the plurality of imaging units 210, grids Gd that have unit cells of the same size in the imaged space and share the same coordinate system.
  • The environmental images EI1 and EI2 captured by the plurality of imaging units 210 may have different viewpoints or scales. Therefore, by deforming the grid Gd fixed in the imaged space according to each scale and superimposing it on each of the environmental images EI1 and EI2, the image generation unit 120 makes the positional and scale relationships between the environmental images EI1 and EI2 easier to understand.
  • For example, when an environmental image EI1 capturing the front of the moving body 20 and an environmental image EI2 capturing its feet are captured, the image generation unit 120 may superimpose the grid Gd on the running surface of the moving body 20 in each of the environmental images EI1 and EI2. According to this, the image generation unit 120 can generate a display image VIA in which the positional relationship and scale relationship between the environmental image EI1 and the environmental image EI2 are easier to understand from the size and arrangement of the unit cells of the grid Gd.
  • As another example, when a low-resolution environmental image and a high-resolution environmental image obtained by enlarging part of the low-resolution environmental image are captured, the image generation unit 120 may superimpose on each environmental image grids Gd that have unit cells of the same size and share the same coordinate system. According to this, the image generation unit 120 can generate a display image in which the consistency, continuity, and correspondence between the low-resolution environmental image and the high-resolution environmental image are easier to understand from the size and arrangement of the unit cells of the grid Gd.
  • the fourth modification is a modification in which the grid Gd is expanded from a two-dimensional plane to a three-dimensional space.
  • FIG. 16 is a schematic diagram showing an example of a display image VIB in which a two-dimensional grid is superimposed at a predetermined height position in a three-dimensional space.
  • The image generation unit 120 may generate a display image VIB in which the grid Gd is superimposed at an arbitrary height in the environmental image (in FIG. 16, at the height of the face of the person 61), not limited to the running surface of the moving body 20. In such a case, the image generation unit 120 can generate a display image VIB that makes it easier to grasp the sense of distance to an object located high above the running surface of the moving body 20.
  • For example, the image generation unit 120 may superimpose the grid Gd at the flight height of the moving body 20 on an environmental image captured by a flying moving body 20 such as a drone. According to this, the image generation unit 120 can support the operator 40 in remotely controlling a flyable moving body 20 such as a drone.
  • As another example, the image generation unit 120 may superimpose the grid Gd, at the height of the manipulator's end effector, on an environmental image captured by a moving body 20 equipped with a manipulator. According to this, the image generation unit 120 can support the operator 40 in remotely controlling the manipulator mounted on the moving body 20.
  • FIG. 17 is a schematic diagram showing an example of a display image VIC in which a three-dimensional grid having a rectangular parallelepiped unit grid is superimposed on an environmental image captured by the imaging unit 210 of the moving body 20.
  • The image generation unit 120 may superimpose a three-dimensional grid in which rectangular-parallelepiped unit cells are arranged in the front direction of the moving body 20, in the side direction perpendicular to the front direction, and in the height direction of the moving body 20.
  • the size of the unit grid in the front direction of the movable body 20 and in the side direction perpendicular to the front direction may be set based on the total length and width of the movable body 20.
  • the size of the unit grid in the height direction of the moving body 20 may be set based on the height of the moving body 20.
  • According to this, when the operator 40 remotely controls the moving body 20, the image generation unit 120 can generate a display image VIC that allows the operator 40 to determine whether the moving body 20 can pass in the height direction.
  • When the remotely controlled moving body 20 is a flying body such as a drone, the moving body 20 is also movable in the height direction. Therefore, it is desirable that the image generation unit 120 generate a display image VIC in which a three-dimensional grid is superimposed on the environmental image.
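  • The sizing of the three-dimensional unit cell from the moving body's dimensions, and the pass/no-pass judgment it supports, can be sketched as follows; the function names and the strict-inequality clearance check are illustrative assumptions.

```python
def grid3d_cell(full_length_m: float, full_width_m: float,
                height_m: float) -> tuple:
    """Rectangular-parallelepiped unit cell for the three-dimensional grid.

    Front/side cell edges come from the moving body's full length and
    width, and the vertical edge from its height, so one cell of the
    3-D grid represents the volume the body occupies.
    """
    return (full_length_m, full_width_m, height_m)


def fits_through(gap_lwh_m: tuple, cell_lwh_m: tuple) -> bool:
    """Rough pass/no-pass check the operator can read off the grid:
    the gap must exceed the unit cell in every dimension."""
    return all(g > c for g, c in zip(gap_lwh_m, cell_lwh_m))
```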
  • It is desirable that the environmental image EI and the grid Gd drawn based on the self-position of the moving body 20 be displayed in synchronization with each other. However, because the delay amount of the self-position estimation and that of the environmental image EI may differ, the grid Gd drawn based on the self-position and the environmental image EI may fail to be synchronized.
  • the information processing device 10 may adjust the difference between each delay amount using a fixed value.
  • As another example, the information processing device 10 may acquire the calculation time of the self-position of the moving body 20 and the imaging time of the environmental image EI. According to this, the information processing device 10 can grasp the absolute delay amount of the self-position of the moving body 20 and that of the environmental image EI from these times. Therefore, the information processing device 10 can synchronize the environmental image EI and the grid Gd drawn based on the self-position of the moving body 20 by adjusting the timing to match the one with the larger delay amount.
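  • The alignment of the two streams to the larger delay can be sketched as follows; the function name and the timestamp representation (seconds on a shared clock) are illustrative assumptions.

```python
def sync_delay_s(pose_computed_at_s: float, frame_captured_at_s: float,
                 now_s: float) -> tuple:
    """Delay compensation for drawing grid Gd over the environmental image.

    From the self-position computation time and the image capture time,
    compute each stream's absolute delay; the fresher stream must be
    buffered by the difference so that grid and image refer to the
    same instant. Returns the laggard stream and the extra wait.
    """
    pose_delay = now_s - pose_computed_at_s
    frame_delay = now_s - frame_captured_at_s
    # Hold back the less-delayed stream to match the more-delayed one.
    extra_wait = abs(pose_delay - frame_delay)
    laggard = "pose" if pose_delay >= frame_delay else "frame"
    return laggard, extra_wait
```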
  • The information processing device 10 may adjust the color tone or thickness of the lines constituting the grid Gd so that the grid blends in with the environmental image EI. This is because, in a low-resolution environmental image EI, the superimposed grid Gd may stand out and make the environmental image EI itself difficult to recognize. Furthermore, when the running surface has a checkered pattern of tiles or the like, the grid Gd and the checkered pattern coexist, which may make it difficult to distinguish and visually recognize the two. In such a case, the information processing device 10 may detect the grid pattern on the running surface and superimpose the grid Gd on the environmental image EI so that the grid pattern and the grid Gd overlap.
  • When the moving body 20 is required to be remotely controlled with higher precision, the information processing device 10 may notify the operator 40 when the moving body 20 crosses a line forming the grid Gd. Specifically, the information processing device 10 may add a momentary vertical shake to the display image VI when the moving body 20 crosses a line forming the grid Gd. Furthermore, when the moving body 20 crosses a line forming the grid Gd, the information processing device 10 may output a sound as if the moving body 20 had crossed a seam, or may output a corresponding vibration to the operator 40's seat.
  • The information processing device 10 may superimpose the sub-grid Sd only on an area of the environmental image EI where, for example, the pitch of the uneven shapes on the running surface is large. Further, the information processing device 10 may supplementarily superimpose, on a partial area of the environmental image EI, an additional sub-grid whose unit cell size is smaller than that of the sub-grid Sd. Furthermore, the information processing device 10 may choose not to superimpose the grid Gd on a partial area of the environmental image EI in order to indicate that the moving body 20 cannot travel in that area.
  • the information processing device 10 may choose to hide the grid Gd under specific conditions in order to prevent the environmental image EI from becoming difficult to view due to the superimposition of the grid Gd.
  • For example, if a predetermined time has passed after the moving body 20 stops moving, the information processing device 10 may choose to hide the grid Gd until the moving body 20 resumes movement.
  • As another example, the information processing device 10 may choose to hide the grid Gd when it determines that the operator 40 is proficient in remote control of the moving body 20, for example because sudden accelerations and decelerations of the moving body 20 are few.
  • the information processing device 10 may choose to hide the grid Gd until the operator 40 inputs a request to display the grid Gd.
  • When the communication quality of the network 30 is unstable and it is difficult to superimpose the grid Gd on the environmental image EI in consideration of delay, the information processing device 10 may choose to hide the grid Gd.
  • the technology according to the present disclosure is applicable not only to the moving object 20 but also to wearable devices such as a powered suit or a power assist suit in which the operator 40's feeling of operation or size changes before and after wearing the device.
  • (1) An information processing device comprising an image generation unit that generates a display image in which a grid having a unit cell of a size corresponding to a moving body is superimposed on an environmental image captured by at least one imaging device mounted on the moving body.
  • (2) The information processing device according to (1), wherein the size of the unit cell of the grid is determined based on the area occupied by the moving body on a running surface, the movement step width of the moving body, or the size of a predetermined object existing in the usage environment of the moving body.
  • (7) The information processing device according to any one of (1) to (6), further comprising a grid deformation unit that expands or contracts the size of the unit cell of the grid based on the moving speed of the moving body.
  • (8) The information processing device according to (7), wherein the grid deformation unit expands or contracts the size of the unit cell of the grid in the traveling direction of the moving body in proportion to the moving speed of the moving body.
  • (9) The information processing device according to (7) or (8), wherein the grid deformation unit expands or contracts the size of the unit cell of the grid when the moving speed of the moving body is equal to or higher than a threshold value.
  • (10) The information processing device according to any one of (1) to (9), wherein the image generation unit further superimposes, on the environmental image, a sub-grid having a unit cell whose size is a fraction of that of the unit cell of the grid.
  • (11) The information processing device according to (10), wherein the sub-grid is expressed by lines that are different from the grid in at least one of line type, color, or width.
  • (12) The information processing device according to (10) or (11), wherein the image generation unit increases the transparency of the sub-grid based on the moving speed of the moving body when the moving speed of the moving body is equal to or higher than a threshold value.
  • (13) The information processing device according to any one of (1) to (12), wherein the image generation unit changes the color of the unit cell in which the moving body is present.
  • (14) The information processing device according to any one of (1) to (13), wherein the image generation unit changes the line width of the grid depending on the position of the moving body.
  • (15) The information processing device according to any one of (1) to (14), wherein the grid is a three-dimensional grid in which rectangular-parallelepiped unit cells are arranged in the front direction of the moving body, in the side direction perpendicular to the front direction, and in the height direction of the moving body.
  • (16) The information processing device according to any one of (1) to (15), wherein the image generation unit generates the display images in which grids having unit cells of the same size and sharing the same coordinate system are superimposed on the environmental images captured by the different imaging devices, respectively.
  • 10 Information processing device 20 Moving body 30 Network 40 Operator 110 Communication unit 120 Image generation unit 130 Grid deformation unit 140 Display unit 210 Imaging unit 220 Sensor unit 230 Drive unit 240 Output unit 250 Control unit 260 Communication unit EI Environmental image VI Display image Gd Grid Sd Sub-grid


Abstract

The problem addressed by the present invention is to allow an operator to intuitively acquire a feel for operating a mobile body present in an unknown environment. The solution according to the invention is an information processing device comprising an image generation unit that generates a display image in which a grid, having a unit cell of a size corresponding to the mobile body, is superimposed on environmental images captured by one or more imaging devices mounted on the mobile body.
PCT/JP2023/014415 2022-06-02 2023-04-07 Dispositif de traitement d'informations et procédé de traitement d'informations WO2023233821A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-090302 2022-06-02
JP2022090302 2022-06-02

Publications (1)

Publication Number Publication Date
WO2023233821A1 true WO2023233821A1 (fr) 2023-12-07

Family

ID=89026183

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/014415 WO2023233821A1 (fr) 2022-06-02 2023-04-07 Dispositif de traitement d'informations et procédé de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2023233821A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0687964A1 (fr) * 1994-06-14 1995-12-20 ZELTRON S.p.A. Système de commande à distance programmable pour un véhicule
JP2003532218A (ja) * 2000-05-01 2003-10-28 アイロボット コーポレーション 移動ロボットを遠隔操作するための方法およびシステム
JP2009226978A (ja) * 2008-03-19 2009-10-08 Mazda Motor Corp 車両用周囲監視装置
JP2011022703A (ja) * 2009-07-14 2011-02-03 Oki Electric Industry Co Ltd 表示制御装置および表示制御方法
JP2011123807A (ja) * 2009-12-14 2011-06-23 Dainippon Printing Co Ltd アノテーション表示システム,方法及びサーバ装置
JP2015071369A (ja) * 2013-10-03 2015-04-16 矢崎総業株式会社 車両用表示装置
JP2019074458A (ja) * 2017-10-18 2019-05-16 株式会社東芝 情報処理装置、学習済モデル、情報処理方法、およびプログラム
JP2020008962A (ja) * 2018-07-03 2020-01-16 パナソニックIpマネジメント株式会社 移動体制御システム、移動体システム、移動体制御方法及びプログラム
US20200141755A1 (en) * 2017-05-24 2020-05-07 SZ DJI Technology Co., Ltd. Navigation processing method, apparatus, and control device



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23815583

Country of ref document: EP

Kind code of ref document: A1