WO2024070398A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2024070398A1 (PCT/JP2023/031079)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- processing device
- operator
- image
- output
- input
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J3/00—Manipulators of leader-follower type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- This disclosure relates to an information processing device, an information processing method, and a program.
- this disclosure provides a mechanism that can prevent a decrease in operability with a simpler system.
- the information processing device disclosed herein includes a control unit.
- when an operator controls an output mechanism located at a remote location to perform a task, the control unit acquires an image of the work area where the task is performed from an imaging device that captures the work area.
- the control unit generates a display image from the captured image in accordance with a first positional relationship between the imaging device and the output mechanism, and a second positional relationship between the operator and an input mechanism that accepts an operation from the operator to control the output mechanism.
- the control unit presents the display image to the operator.
- FIG. 1 is a diagram for explaining an overview of an information processing system according to an embodiment of the present disclosure.
- FIG. 2 is a block diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
- FIG. 3 is a flowchart illustrating an example of the flow of an operation process according to an embodiment of the present disclosure.
- FIG. 4 is a diagram illustrating an example of imaging by an output side imaging device according to an embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example of an output captured image according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of a first positional relationship according to an embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an angle calculated by an input side processing device according to an embodiment of the present disclosure.
- FIG. 8 is a diagram illustrating an example of an angle formed between an output side imaging device and an output mechanism according to an embodiment of the present disclosure.
- FIG. 9 is a diagram for explaining an example of an arrangement of an output side imaging device according to an embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating an example of imaging by an input side imaging device according to an embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of an angle formed between, for example, the head of an operator and an input mechanism according to an embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of a converted image according to an embodiment of the present disclosure.
- FIG. 13 is a diagram for explaining an example of a converted image according to an embodiment of the present disclosure.
- FIG. 14 is a diagram illustrating an example of a method for generating a converted image according to an embodiment of the present disclosure.
- FIG. 15 is a diagram illustrating an example of 3D data according to an embodiment of the present disclosure.
- FIG. 16 is a diagram illustrating another example of a method for generating a converted image according to an embodiment of the present disclosure.
- FIG. 17 is a diagram illustrating an example of an operation performed by an operator according to an embodiment of the present disclosure.
- FIG. 18 is a diagram illustrating another example of a converted image according to an embodiment of the present disclosure.
- FIG. 19 is a diagram illustrating an example of a converted image including lens distortion according to an embodiment of the present disclosure.
- FIG. 20 is a diagram showing an example of a display image generated based on a first generation method according to an embodiment of the present disclosure.
- FIG. 21 is a diagram illustrating another example of a display image generated based on the first generation method according to an embodiment of the present disclosure.
- FIG. 22 is a diagram showing an example of a display image generated based on a second generation method according to an embodiment of the present disclosure.
- FIG. 23 is a diagram illustrating another example of a display image generated based on the second generation method according to an embodiment of the present disclosure.
- FIG. 24 is a diagram illustrating an example of a display image generated based on a third generation method according to an embodiment of the present disclosure.
- FIG. 25 is a diagram illustrating another example of a display image generated based on the third generation method according to an embodiment of the present disclosure.
- FIG. 26 is a diagram illustrating an example of a display image generated based on a fourth generation method according to an embodiment of the present disclosure.
- FIG. 27 is a diagram illustrating another example of a display image generated based on the fourth generation method according to an embodiment of the present disclosure.
- FIG. 28 is a diagram illustrating an example of a display image generated based on a fifth generation method according to an embodiment of the present disclosure.
- FIG. 29 is a diagram illustrating another example of a display image generated based on the fifth generation method according to an embodiment of the present disclosure.
- FIG. 30 is a diagram illustrating an example of a display image according to an embodiment of the present disclosure.
- FIG. 31 is a diagram illustrating an example of a display image according to another embodiment of the present disclosure.
- FIG. 32 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the input side processing device.
- One or more of the embodiments (including examples and variations) described below can be implemented independently. However, at least a portion of the embodiments described below may be implemented in appropriate combination with at least a portion of another embodiment. These embodiments may include novel features that are different from one another. Thus, these embodiments may contribute to solving different purposes or problems and may provide different effects.
- <<Example of general configuration of information processing system>> FIG. 1 is a diagram for explaining an overview of an information processing system 10 according to an embodiment of the present disclosure.
- the information processing system 10 includes an input system 11 and an output system 12.
- the input system 11 accepts input operations from an operator (worker) 140.
- the output system 12 performs work according to the input operations from the operator 140.
- the input system 11 and the output system 12 are located in different locations.
- the location where the input system 11 is located is also referred to as remote location B.
- the location where the output system 12 is located is also referred to as remote location A.
- the distance between remote location A and remote location B may be such that the operation area where the output system 12 performs operations cannot be directly seen by the operator 140. The distance may be so far away that the operator 140 cannot see remote location A. Alternatively, the distance between remote location A and remote location B may be such that the operator 140 can see remote location A but cannot directly see the work area, such as in the same room or adjacent rooms.
- the input system 11 includes an input processing device 100, an input imaging device 110, an input mechanism 120, and a display device 130.
- the output system 12 includes an output processing device 200, an output imaging device 210, and an output mechanism 220.
- the input side processing device 100 is an information processing device that executes each process of the input side system 11.
- the input side processing device 100 communicates with the output side processing device 200, for example, via a network (not shown).
- the input side processing device 100 transmits operation information related to an operation accepted by the input mechanism 120 to the output side processing device 200.
- the input side processing device 100 acquires information related to the output mechanism 220 and an image captured by the output side imaging device 210 from the output side processing device 200.
- the input side processing device 100 generates a display image to be displayed on the display device 130 based on the captured image of the input side imaging device 110 and information acquired from the output side processing device 200.
- the input-side imaging device 110 is an imaging device (e.g., a camera) that captures an image of the remote location B.
- the input-side imaging device 110 captures an image of the operator 140.
- the input-side imaging device 110 captures an image of, for example, the head of the operator 140.
- the input-side imaging device 110 outputs the captured image to the input-side processing device 100.
- one input-side imaging device 110 is disposed in the remote location B, but the number of input-side imaging devices 110 is not limited to one, and may be two or more.
- the input mechanism 120 is a device that accepts an operation from the operator 140.
- the input mechanism 120 accepts a three-dimensional input operation from the operator 140.
- although the input mechanism 120 is shown as a pen-type device in FIG. 1, the input mechanism 120 is not limited to a pen-type device.
- the input mechanism 120 may be a mouse, a joystick, a game controller, a keyboard, or the like.
- the input mechanism 120 outputs operation information regarding the operation received from the operator 140 to the input side processing device 100.
- the display device 130 displays various information.
- the display device 130 is, for example, a liquid crystal display or an organic EL display.
- the display device 130 displays a display image generated by the input side processing device 100, thereby presenting the state of the remote location A (for example, an image of a work area) to the operator 140.
- the output side processing device 200 is an information processing device that executes each process of the output side system 12.
- the output side processing device 200 communicates with the input side processing device 100, for example, via a network (not shown).
- the output side processing device 200 receives operation information related to an operation accepted by the input mechanism 120 from the input side processing device 100.
- the output side processing device 200 transmits information related to the output mechanism 220 and an image captured by the output side imaging device 210 to the input side processing device 100.
- the output side processing device 200 controls the output mechanism 220 to perform the operation input by the operator 140 based on the operation information.
- the output-side imaging device 210 is an imaging device (e.g., a camera) that captures an image of the remote location A.
- the output-side imaging device 210 captures, for example, an image of a work target 230 of work performed in the remote location A.
- the output-side imaging device 210 captures, for example, an image of a work area including at least a part of the work target 230.
- the output-side imaging device 210 outputs the captured image to the output-side processing device 200.
- one output side imaging device 210 is placed at remote location A, but the number of output side imaging devices 210 is not limited to one and may be two or more.
- the output mechanism 220 is a device (e.g., an actuator) that performs work on the work target 230 under control of the output side processing device 200.
- although the output mechanism 220 is shown as a robot arm in FIG. 1, the output mechanism 220 is not limited to this.
- the output mechanism 220 may be an actuator suited to the work content, such as an excavator.
- the operator 140 performs work on the work target 230 by operating the output mechanism 220 of the remote location A via the input mechanism 120 while checking the state of the remote location A displayed on the display device 130.
- in such remote operation, the cognitive operational load on the operator 140 may become high.
- for example, the angle of view of the image captured by the output side imaging device 210 may be inefficient for the operator 140 to work with.
- if the operator 140 performs an operation while checking an image captured at an angle of view that is inefficient for work, the operator 140 must operate under a high cognitive load, resulting in reduced operability for the operator 140.
- a method of arranging multiple output side imaging devices 210 can be considered.
- a method of cutting out an image according to the line of sight of the operator 140 from images captured in various directions by the multiple output side imaging devices 210 and presenting it to the operator 140 can be considered.
- This method requires that many output-side imaging devices 210 be placed at the remote location A in order to obtain images according to the line of sight of the operator 140.
- a mechanism for detecting the line of sight of the operator 140 is required.
- the system configuration required to realize this method becomes complex.
- the input side processing device 100 acquires an image of the work area from an output side imaging device 210 that captures an image of the work area where work is performed.
- the input side processing device 100 generates a display image from the image captured by the output side imaging device 210 according to a first positional relationship between the output side imaging device 210 and the output mechanism 220 and a second positional relationship between the input mechanism 120 and the operator 140.
- the input side processing device 100 presents the display image to the operator 140.
- the input side processing device 100 can detect a state in which the cognitive operational load imposed on the operator 140 is high according to the first positional relationship and the second positional relationship. Furthermore, the input side processing device 100 can generate a display image that imposes a low cognitive operational load on the operator 140 by generating a display image from a captured image according to the first positional relationship and the second positional relationship.
- the input-side processing device 100 also generates a display image from the captured image captured by the output-side imaging device 210. In this way, since the input-side processing device 100 generates a display image rather than cutting out the captured image, the number of output-side imaging devices 210 placed at the remote location A may be small. For example, even if there is only one output-side imaging device 210 placed at the remote location A, the input-side processing device 100 can generate a display image with a low operational burden on the operator 140, i.e., high work efficiency.
- the input side processing device 100 can suppress deterioration of operability for the operator 140 with a simpler system.
- although the display image is generated by the input side processing device 100 here, the display image may instead be generated by the output side processing device 200.
- the proposed technology of this disclosure may be executed by the output side processing device 200.
- <<Detailed configuration example of information processing system>> FIG. 2 is a block diagram showing an example of a configuration of an information processing system 10 according to an embodiment of the present disclosure.
- the input system 11 includes an input processing device 100 , an input imaging device 110 , an input mechanism 120 , and a display device 130 .
- the input-side imaging device 110 captures an image of the remote location B.
- the input-side imaging device 110 captures images of the operator 140 and the input mechanism 120.
- the input-side captured image captured by the input-side imaging device 110 is used to detect a second positional relationship between the input mechanism 120 and the operator 140.
- if the input-side processing device 100 can detect the second positional relationship from information other than the input-side captured image, such as when the operator 140 is wearing a head-mounted display (HMD), the input-side imaging device 110 may be omitted.
- the input mechanism 120 is a device that receives an operation from the operator 140.
- the input mechanism 120 outputs the operation received from the operator 140 to the input side processing device 100.
- the display device 130 is a liquid crystal display or an organic EL display.
- the operator 140 operates the input mechanism 120 while gazing at the display device 130.
- the display device 130 may be a wearable device such as an HMD.
- the input side processing device 100 controls each device of the input side system 11.
- the input side processing device 100 controls the output side system 12 via, for example, a network (not shown), and acquires information such as an output captured image from the output side system 12.
- the input side processing device 100 includes a communication unit 101 and a control unit 102.
- the communication unit 101 is a communication interface for communicating with other devices.
- the communication unit 101 is a network interface.
- the communication unit 101 is a LAN (Local Area Network) interface such as a NIC (Network Interface Card).
- the communication unit 101 may be a wired interface or a wireless interface.
- the control unit 102 is a controller that controls each unit of the input side processing device 100.
- the control unit 102 is realized by a processor such as a central processing unit (CPU), a micro processing unit (MPU), or a graphics processing unit (GPU).
- the control unit 102 is realized by a processor executing various programs stored in a storage device inside the input side processing device 100 using a random access memory (RAM) or the like as a working area.
- the control unit 102 may be realized by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
- the CPU, the MPU, the GPU, the ASIC, and the FPGA can all be regarded as controllers.
- the control unit 102 includes a central processing unit 102A, an image generating unit 102B, a display generating unit 102C, an image processing unit 102D, and an input processing unit 102E.
- Each block constituting the control unit 102 (central processing unit 102A to input processing unit 102E) is a functional block indicating the function of the control unit 102.
- These functional blocks may be software blocks or hardware blocks.
- each of the above-mentioned functional blocks may be a software module realized by software (including a microprogram), or may be a circuit block on a semiconductor chip (die).
- each functional block may be a processor or an integrated circuit.
- the method of configuring the functional blocks is arbitrary.
- the control unit 102 may be configured in functional units different from the above-mentioned functional blocks.
- the image processing unit 102D acquires an input captured image from the input side imaging device 110.
- the image processing unit 102D performs image processing on the input captured image to generate an input image.
- the image processing unit 102D detects the input mechanism 120 and the operator 140 from the input captured image, and generates an input image including the detection results.
- the image processing unit 102D may generate an input image by extracting captured images at a predetermined cycle from a plurality of captured images captured as a moving image.
- the image processing unit 102D may use as an input image a captured image that has changed by more than a predetermined amount from the previously generated input image, such as when the position of the operator 140 has changed significantly since the previous input image.
- the image processing unit 102D outputs the generated input image to the central processing unit 102A.
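- the frame-selection behavior described above (extracting frames at a predetermined cycle, or when the scene changes by more than a threshold) can be sketched as follows. This is a minimal illustration, not the patent's concrete implementation: frames are represented as flat lists of pixel intensities, and the function name, the period, and the mean-absolute-difference threshold are all assumptions.

```python
def select_input_images(frames, period=10, change_threshold=20.0):
    """Pick an input image every `period` frames, or whenever the mean
    absolute pixel difference from the last selected frame exceeds the
    threshold. Frames are flat lists of pixel intensities for simplicity."""
    selected = []
    last = None
    for i, frame in enumerate(frames):
        # Always take the first frame and every `period`-th frame.
        if last is None or i % period == 0:
            selected.append(frame)
            last = frame
            continue
        # Otherwise take the frame only if it changed enough.
        diff = sum(abs(a - b) for a, b in zip(frame, last)) / len(frame)
        if diff > change_threshold:
            selected.append(frame)
            last = frame
    return selected
```

- with a long period, only frames with a large change (e.g., the operator 140 moving) are promoted to input images, which keeps the downstream positional-relationship detection from reprocessing near-identical frames.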
- the input processing unit 102E acquires operation information related to an operation performed by the operator 140 from the input mechanism 120.
- the input processing unit 102E outputs the acquired operation information to the central processing unit 102A.
- the central processing unit 102A transmits operation information acquired via the input processing unit 102E to the output side processing device 200 via the communication unit 101. In addition, the central processing unit 102A acquires an output captured image captured by the output side imaging device 210 from the output side processing device 200.
- the central processing unit 102A judges whether the operator 140 is in a state of high cognitive operational load from the input image acquired from the image processing unit 102D and the output captured image acquired from the output side processing device 200. For example, the central processing unit 102A detects the second positional relationship between the input mechanism 120 and the operator 140 from the input image. The central processing unit 102A then judges whether the operator 140 is in a state of high cognitive operational load according to the second positional relationship and the first positional relationship between the output side imaging device 210 and the output mechanism 220.
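- one way to read this judgment is as a comparison of two direction angles: the camera-to-mechanism direction (first positional relationship) versus the operator-to-input-mechanism direction (second positional relationship). The sketch below is a hypothetical 2D reduction of that idea; the planar positions, the 30-degree threshold, and reducing each relationship to a single angle are illustrative assumptions, not the patent's concrete method.

```python
import math

def angle_deg(from_pos, to_pos):
    """Horizontal angle (degrees) of the direction from one 2D position to another."""
    return math.degrees(math.atan2(to_pos[1] - from_pos[1], to_pos[0] - from_pos[0]))

def is_high_load(camera_pos, mechanism_pos, operator_pos, input_pos, threshold=30.0):
    """Assume a high cognitive load when the camera's view of the mechanism and
    the operator's view of the input mechanism disagree by more than a threshold,
    so the displayed scene no longer matches the operator's frame of reference."""
    first = angle_deg(camera_pos, mechanism_pos)   # first positional relationship
    second = angle_deg(operator_pos, input_pos)    # second positional relationship
    diff = abs((first - second + 180.0) % 360.0 - 180.0)  # wrap into [0, 180]
    return diff > threshold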
- when the operator 140 is judged not to be in a state of high cognitive operational load, the central processing unit 102A instructs the display generation unit 102C to display the output captured image on the display device 130.
- when the operator 140 is judged to be in a state of high cognitive operational load, the central processing unit 102A instructs the image generating unit 102B to generate a converted image (an example of a display image) from the output captured image.
- the central processing unit 102A instructs the image generating unit 102B to generate a converted image based on the cognitive operational load of the operator 140 according to the first positional relationship between the output side imaging device 210 and the output mechanism 220.
- the image generating unit 102B generates a converted image having a lower cognitive operation load than the output captured image from the output captured image in accordance with an instruction from the central processing unit 102A.
- the image generating unit 102B performs viewpoint conversion processing and geometric conversion processing on the output captured image to generate the converted image.
- the image generating unit 102B may generate the converted image by generating a simulation image using CG (computer graphics) or the like based on the output captured image.
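- the viewpoint conversion and geometric conversion mentioned above can, for a planar scene, be expressed as mapping pixel coordinates through a 3x3 homography. The sketch below shows only that per-pixel mapping in pure Python; in practice a library routine (e.g., an image-warping function) would apply it to the whole output captured image, and the example matrix values are assumptions.

```python
def apply_homography(H, x, y):
    """Map pixel (x, y) through a 3x3 homography H (row-major nested lists),
    the core operation of a planar viewpoint/geometric conversion."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    # Homogeneous normalization: divide by w to return to pixel coordinates.
    return xs / w, ys / w
```

- composing such a homography from the first and second positional relationships is one way a converted image with a viewpoint closer to the operator's could be produced from a single output captured image.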
- the image generating unit 102B outputs the generated converted image to the display generation unit 102C.
- the display generation unit 102C generates a display image to be displayed on the display device 130.
- the display generation unit 102C displays the output captured image as a display image on the display device 130.
- the display generation unit 102C generates a display image based on the converted image and displays it on the display device 130.
- the output system 12 includes an output processing device 200 , an output imaging device 210 , and an output mechanism 220 .
- the output side imaging device 210 captures an image of the remote location A.
- the output side imaging device 210 captures images of the work target 230 and the output mechanism 220.
- the output side imaging device 210 outputs the captured output captured image to the output side processing device 200.
- as described above, the output mechanism 220 performs work on the work target 230 in response to an operation by the operator 140.
- the output side processing device 200 controls each device of the output side system 12. In addition, the output side processing device 200 receives an instruction regarding an operation from the input side system 11 via, for example, a network (not shown), and transmits information such as an output captured image to the input side system 11.
- the output processing device 200 includes a communication unit 201 and a control unit 202.
- the communication unit 201 is a communication interface for communicating with other devices.
- the communication unit 201 is a network interface.
- the communication unit 201 is a LAN (Local Area Network) interface such as a NIC (Network Interface Card).
- the communication unit 201 may be a wired interface or a wireless interface.
- the control unit 202 is a controller that controls each unit of the output side processing device 200.
- the control unit 202 is realized by a processor such as a CPU, an MPU, or a GPU.
- the control unit 202 is realized by a processor executing various programs stored in a storage device inside the output side processing device 200 using a RAM or the like as a working area.
- the control unit 202 may be realized by an integrated circuit such as an ASIC or an FPGA.
- the CPU, the MPU, the GPU, the ASIC, and the FPGA can all be considered as controllers.
- the control unit 202 includes an image processing unit 202A, a central processing unit 202B, and an output generating unit 202C.
- Each block constituting the control unit 202 (image processing unit 202A to output generating unit 202C) is a functional block indicating a function of the control unit 202.
- These functional blocks may be software blocks or hardware blocks.
- each of the above-mentioned functional blocks may be a software module realized by software (including a microprogram), or may be a circuit block on a semiconductor chip (die).
- each functional block may be a processor or an integrated circuit.
- the method of configuring the functional blocks is arbitrary.
- the control unit 202 may be configured in functional units different from the above-mentioned functional blocks.
- the image processing unit 202A acquires an output captured image from the output side imaging device 210.
- the image processing unit 202A outputs the acquired output captured image to the central processing unit 202B.
- the central processing unit 202B transmits an output captured image captured by the output side imaging device 210 to the input side processing device 100 via the communication unit 201. In addition, the central processing unit 202B acquires operation information from the input side processing device 100. The central processing unit 202B outputs the operation information to the output generation unit 202C.
- the output generation unit 202C generates control information for controlling the output mechanism 220 based on the operation information, and outputs the control information to the output mechanism 220.
- the output generation unit 202C generates the control information so that the output mechanism 220 performs the same operation as that performed by the operator 140.
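- the mapping from operation information to control information described above can be sketched as translating an operator displacement into a target displacement for the output mechanism. This is an assumed minimal form: the dictionary keys, the leader-follower displacement encoding, and the optional `scale` factor (motion scaling between operator and actuator workspaces) are illustrative, not the patent's wire format.

```python
def make_control_command(operation, scale=1.0):
    """Translate an operator displacement (dx, dy, dz) in the input frame into
    a target displacement for the output mechanism, so the mechanism mirrors
    the operator's motion (optionally scaled)."""
    dx, dy, dz = operation["dx"], operation["dy"], operation["dz"]
    return {
        "target_dx": dx * scale,
        "target_dy": dy * scale,
        "target_dz": dz * scale,
    }
```

- a scale below 1.0 would let coarse operator motions drive fine actuator motions, a common design choice in leader-follower (B25J3/00) systems.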
- <<Operation processing example>> FIG. 3 is a flowchart showing an example of the flow of an operation process according to an embodiment of the present disclosure.
- the operation process shown in FIG. 3 is executed by the information processing system 10 when, for example, the operator 140 performs remote operation.
- remote operation is started on the input processing device 100 (step S101).
- for example, when the operator 140 presses a power button (not shown) on the input processing device 100 or performs an operation on the input mechanism 120, remote operation is started.
- a connection is made from the communication unit 101 of the input side processing device 100 (hereinafter also referred to as the input side communication unit 101) to the communication unit 201 of the output side processing device 200 (hereinafter also referred to as the output side communication unit 201) (step S102).
- the input side processing device 100 is connected to the output side processing device 200.
- the output side processing device 200 acquires the output captured image from the output side imaging device 210 (step S103).
- FIG. 4 is a diagram showing an example of imaging by the output-side imaging device 210 according to an embodiment of the present disclosure.
- the output-side imaging device 210 captures an image of the output mechanism 220 and the work object 230, and outputs the captured image to the output-side processing device 200.
- the output-side processing device 200 transmits the acquired image to the input-side processing device 100 as an output captured image.
- FIG. 5 is a diagram showing an example of an output captured image according to an embodiment of the present disclosure.
- FIG. 5 shows a case where the output side imaging device 210 captures an image of the output mechanism 220 and the work object 230 from the side. More specifically, if the direction from the output mechanism 220 to the work object 230 is the front direction of the output mechanism 220, the output captured image shown in FIG. 5 is an image of the output mechanism 220 and the work object 230 captured from a direction approximately perpendicular to this front direction.
- the output captured image in FIG. 5 shows the rod-shaped tip of the output mechanism 220. Note that the arms of the output mechanism 220 are not shown.
- the work object 230 is, for example, a number of buttons arranged on a work surface. The operator 140 performs the task of pressing the buttons, which are the work object 230, with the tip of the output mechanism 220.
- the output side system 12 is equipped with the output side imaging device 210, but the sensing device equipped in the output side system 12 is not limited to the output side imaging device 210.
- the output side system 12 may be equipped with a distance measurement sensor in addition to the output side imaging device 210. Examples of distance measurement sensors include ToF (Time of Flight), LiDAR (Laser imaging Detection and Ranging), stereo cameras, monocular SLAM (Simultaneous Localization and Mapping), compound eye SLAM, image-plane phase-difference sensors, ultrasonic distance measurement sensors, and electromagnetic wave radar.
- the output side processing device 200 can acquire the sensing data acquired by these sensing devices.
- the output side processing device 200 transmits the output captured image from the output side communication unit 201 to the input side communication unit 101 (step S104).
- the output side processing device 200 transmits a moving image to the input side communication unit 101, for example, by transmitting the output captured image at a predetermined interval (frame).
- the output processing device 200 can transmit sensing data from the sensing device to the input processing device 100 in addition to the output captured image.
- if the output side processing device 200 can acquire imaging information related to the position and imaging direction of the output side imaging device 210, it can transmit the imaging information to the input side processing device 100.
- the imaging information is calculated, for example, from a GPS, IMU, SLAM, or the like installed in the output side imaging device 210.
- the imaging information may be calculated from radio wave information such as Bluetooth (registered trademark).
- the input side processing device 100 calculates the positional relationship (first positional relationship) between the output side imaging device 210 and the output mechanism 220 (step S105).
- the input side processing device 100 uses image processing technology to calculate from the output captured image the position from which the output side imaging device 210 is capturing images relative to the output mechanism 220.
- FIG. 6 is a diagram showing an example of a first positional relationship according to an embodiment of the present disclosure.
- the imaging direction of the output side imaging device 210 is set to the front direction of the output side imaging device 210.
- the front direction of the output side imaging device 210 is set to a direction approximately perpendicular to the screen at the center of the screen of the output captured image.
- the direction from the output mechanism 220 towards the work target 230 is defined as the front direction of the output mechanism 220.
- the front direction of the output mechanism 220 is not limited to this.
- if the output mechanism 220 is movable, the movement direction of the output mechanism 220 may be defined as the front direction.
- the front direction of the output mechanism 220 may be determined according to the drive direction of the arm, etc.
- the input-side processing device 100 calculates the angle between the output-side imaging device 210 and the output mechanism 220, more specifically, the angle between the front axis of the output-side imaging device 210 and the front axis of the output mechanism 220, as the first positional relationship.
- FIG. 7 is a diagram explaining angles calculated by the input side processing device 100 according to an embodiment of the present disclosure.
- rotation around the x-axis is called roll, and the rotation angle around the x-axis is called the roll angle.
- rotation around the y-axis is called pitch, and the rotation angle around the y-axis is called the pitch angle.
- rotation around the z-axis is called yaw, and the rotation angle around the z-axis is called the yaw angle.
- the input side processing device 100 calculates, for example, the yaw angle between the front axis of the output mechanism 220 and the front axis of the output side imaging device 210 as the angle described above.
- FIG. 8 is a diagram showing an example of the angle formed by the output side imaging device 210 and the output mechanism 220 according to an embodiment of the present disclosure.
- the input-side processing device 100 calculates the angle θ between the axis 210a in the front direction of the output-side imaging device 210 and the axis 220a in the front direction of the output mechanism 220 in a plane parallel to the ground.
- This angle θ is the rotation angle (yaw angle) around the z-axis when the axis 220a is the x-axis and the axis perpendicular to the ground is the z-axis.
- the front-facing axis 210a of the output-side imaging device 210 and the front-facing axis 220a of the output mechanism 220 intersect on the work surface of the work object 230. Therefore, the angle θ described above coincides with the angle when the output-side imaging device 210 is rotated in the yaw direction around the work surface of the work object 230.
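- The yaw-angle calculation described above can be sketched in Python as follows. This is an illustrative sketch, not part of the disclosed embodiment; it assumes the two front directions are available as 3D direction vectors with z as the vertical axis, and the function name `yaw_angle_deg` is hypothetical:

```python
import math

def yaw_angle_deg(camera_dir, mechanism_dir):
    """Yaw angle between two front-direction vectors, measured in a
    plane parallel to the ground (z is the vertical axis)."""
    cx, cy = camera_dir[0], camera_dir[1]        # drop the z component
    mx, my = mechanism_dir[0], mechanism_dir[1]
    # Heading of each projected vector, measured from the x-axis
    cam = math.atan2(cy, cx)
    mech = math.atan2(my, mx)
    # Unsigned difference folded into the range [0, 180] degrees
    diff = abs(math.degrees(cam - mech)) % 360.0
    return 360.0 - diff if diff > 180.0 else diff
```

Projecting both front axes onto the ground plane before taking the difference corresponds to the definition of the angle in FIG. 8, which is measured in a plane parallel to the ground.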
- FIG. 9 is a diagram for explaining an example of the arrangement of the output side imaging device 210 according to an embodiment of the present disclosure.
- output captured images captured by the output side imaging device 210 with different arrangements are shown. Note that the angle shown in FIG. 9 corresponds to the angle ⁇ shown in FIG. 8, for example.
- the left diagram in FIG. 9 is an output image captured by the output-side imaging device 210, which forms an angle of 0° with the output mechanism 220.
- the output-side imaging device 210 captures an image of the work object 230 from the front.
- the central diagram in FIG. 9 shows an output image captured by the output-side imaging device 210, which forms an angle of 40° with the output mechanism 220.
- the output-side imaging device 210 captures an image of the work object 230 from an oblique direction.
- the right diagram in FIG. 9 shows an output image captured by the output-side imaging device 210, which forms an angle of 90° with the output mechanism 220.
- the output-side imaging device 210 captures an image of the work object 230 from the side.
- the input side processing device 100 only needs to calculate the first positional relationship once, and subsequent calculation processes may be omitted. Furthermore, if this positional relationship is known, such as if the relative positional relationship between the output side imaging device 210 and the output mechanism 220 is determined when constructing the output side system 12, the calculation by the input side processing device 100 may be omitted.
- the input side processing device 100 calculates the first positional relationship between the output side imaging device 210 and the output mechanism 220, this is not limited to the above.
- the output side processing device 200 may calculate the first positional relationship and notify the input side processing device 100 of the calculation result.
- the input-side processing device 100 determines whether the angle between the output-side imaging device 210 and the output mechanism 220 is equal to or greater than a threshold (step S106).
- This threshold is the angle at which the cognitive operational burden on the operator 140 becomes large, and is calculated, for example, by experiment.
- the inventor has found through experiments that when the angle between the output side imaging device 210 and the output mechanism 220 becomes a predetermined value (e.g., 100 degrees) or more, the cognitive operational burden on the operator 140 becomes large and operability decreases significantly. This holds regardless of the dominant hand of the operator 140 and the rotation direction of the angle. Note that since there are some individual differences between operators 140 (e.g., the value ranges from 80 degrees to 120 degrees), this threshold can be set for each operator 140.
- by determining whether the angle between the output side imaging device 210 and the output mechanism 220 is equal to or greater than the threshold value, the input side processing device 100 can determine whether the cognitive operational load on the operator 140 is high. In other words, the input side processing device 100 can determine whether the operability for the operator 140 is reduced.
- the thresholds described above are merely examples, and the thresholds used for the judgment can be set arbitrarily.
- the thresholds can be set for each operator 140 as described above.
- the input side processing device 100, for example, identifies the operator 140 from the input side captured image captured by the input side imaging device 110, and sets a threshold for each operator 140.
- the operator 140 may log in to the input side processing device 100, which allows the input side processing device 100 to identify the operator 140.
- the operator 140 may set the threshold value himself.
- the threshold value may be set to the same value for each rotation direction of the angle, or different values may be set. That is, in FIG. 8, different threshold values may be set depending on whether the output side imaging device 210 is disposed to the right or left side of the output mechanism 220, or the same threshold value may be set.
- the input-side processing device 100 calculates, for example, the angles between each of the multiple output-side imaging devices 210 and the output mechanism 220.
- the input-side processing device 100 compares, for example, the smallest angle with a threshold value.
- the input side processing device 100 may compare the angle between the output side imaging device 210 that obtained the output captured image to be displayed on the display device 130 and the output mechanism 220 with a threshold value.
- the output captured image to be displayed on the display device 130 may be selected by the operator 140 or may be selected by the input side processing device 100.
- in step S106, if the angle between the output side imaging device 210 and the output mechanism 220 is less than the threshold value (step S106; No), the input side processing device 100 displays the output captured image on the display device 130 (step S107). After that, the input side processing device 100 proceeds to step S112.
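- The threshold decision of step S106, including the per-operator thresholds mentioned above, can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; the default value of 100 degrees and the per-operator values are examples taken from the text, and the function and dictionary names are hypothetical:

```python
DEFAULT_THRESHOLD_DEG = 100.0  # example predetermined value from the experiments

# Hypothetical per-operator thresholds (individual differences, e.g. 80-120 deg)
operator_thresholds = {"operator_a": 80.0, "operator_b": 120.0}

def needs_viewpoint_conversion(angle_deg, operator_id=None):
    """Step S106: True when the camera-to-mechanism angle is at or above
    the threshold, i.e. a converted image should be generated instead of
    displaying the output captured image as is (step S107)."""
    threshold = operator_thresholds.get(operator_id, DEFAULT_THRESHOLD_DEG)
    return angle_deg >= threshold
```

When several output side imaging devices 210 are present, the same comparison could be applied to the smallest of the calculated angles, as described above.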
- the input side processing device 100 displays the output captured image captured by the output side imaging device 210 as is on the display device 130.
- the input side processing device 100 acquires an input captured image of the operator 140 and the input mechanism 120 using the input side imaging device 110 (step S108).
- FIG. 10 is a diagram showing an example of imaging by the input-side imaging device 110 according to an embodiment of the present disclosure.
- the input-side imaging device 110 images the input mechanism 120 and the operator 140, and outputs the captured image to the input-side processing device 100.
- the operator 140 operates the input mechanism 120 while checking the display image displayed on the display device 130.
- the input system 11 is described as having an input imaging device 110, but the sensing device provided in the input system 11 is not limited to the input imaging device 110.
- the input system 11 may have a distance measurement sensor in addition to the input imaging device 110. Examples of distance measurement sensors include ToF, LiDAR, stereo camera, monocular SLAM, compound eye SLAM, image-plane phase-difference sensor, ultrasonic distance measurement sensor, and electromagnetic wave radar.
- the input processing device 100 can acquire the sensing data acquired by these sensing devices.
- the input side processing device 100 calculates the positional relationship (second positional relationship) between the head of the operator 140 and the input mechanism 120 from the input captured image (step S109).
- the input side processing device 100 calculates, as the second positional relationship, for example, an angle between the operator 140 and the input mechanism 120. More specifically, the input side processing device 100 calculates, for example, an angle between the head of the operator 140 and the input mechanism 120.
- FIG. 11 is a diagram showing an example of the angle between the head of the operator 140 and the input mechanism 120 according to an embodiment of the present disclosure.
- the direction in front of the face of the operator 140 and passing through approximately the center of the head is defined as the axis (x-axis) in the front direction of the head of the operator 140.
- the direction from the input mechanism 120 to the operator 140 is defined as the front direction (x-axis direction) of the input mechanism 120.
- the angle at which the operator 140 is viewed from the xy plane of the input mechanism 120 is the angle between the head of the operator 140 and the input mechanism 120.
- the input side processing device 100 calculates the angle between the head of the operator 140 and the input mechanism 120, for example, using image processing technology or a machine learning model.
- the input processing device 100 may recognize the operator 140 using, for example, face recognition technology.
- as long as the input processing device 100 can calculate the angle (pitch angle) between the operation center surface of the input mechanism 120 and the head of the operator 140, it does not need to use the input captured image to calculate this angle.
- the input side imaging device 110 does not need to image the input mechanism 120.
- the input side processing device 100 calculates, for example, the three-dimensional position of the head of the operator 140 at the remote location B from the operator 140 in the input captured image.
- the input side processing device 100 calculates the angle between the head of the operator 140 and the input mechanism 120 based on the position of the operation center of the input mechanism 120 and the position of the head of the operator 140.
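- The angle calculation from the position of the operation center and the position of the operator's head can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; it assumes both positions are available as (x, y, z) coordinates with z vertical, and the function name `head_pitch_angle_deg` is hypothetical:

```python
import math

def head_pitch_angle_deg(head_pos, center_pos):
    """Pitch angle at which the operator's head looks down on the
    operation center of the input mechanism, measured from the
    horizontal plane (z is the vertical axis)."""
    dx = head_pos[0] - center_pos[0]
    dy = head_pos[1] - center_pos[1]
    dz = head_pos[2] - center_pos[2]
    horizontal = math.hypot(dx, dy)          # distance along the ground
    return math.degrees(math.atan2(dz, horizontal))
```

The resulting pitch angle corresponds to the angle along the line connecting the head of the operator 140 and the operation center of the input mechanism 120 (see the arrow in FIG. 11).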
- the input-side imaging device 110 does not need to image the operator 140.
- the input-side processing device 100 can acquire the position of the head of the operator 140 from this device.
- the input side processing device 100 may detect the position of the input mechanism 120 or the head of the operator 140 using sensing data acquired by a sensing device other than the input side imaging device 110.
- the input side processing device 100 may use the position information of the input side imaging device 110 to detect the position of the input mechanism 120 or the head of the operator 140.
- the position of the input side imaging device 110 is detected based on the location where the input side imaging device 110 is installed (for example, mounted on the display device 130) or an IMU sensor mounted on the camera.
- the input side imaging device 110 may not be able to capture images of both the input mechanism 120 and the operator 140, and the operator 140 may be cut off. Also, if the input captured image includes multiple people, the input side processing device 100 may not be able to identify the operator 140.
- the input side processing device 100 may use the previous value or the average of past values as the angle between the head of the operator 140 and the input mechanism 120.
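- The fallback to the previous value or the average of past values can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; the class name and the history length of 10 measurements are hypothetical:

```python
from collections import deque

class AngleEstimator:
    """Keeps a short history of head-to-mechanism angles so that a
    previous or averaged value can be substituted when detection fails
    (operator cut off, or multiple people in the input captured image)."""
    def __init__(self, history=10):
        self._angles = deque(maxlen=history)

    def update(self, measured_angle):
        """Record a successful measurement and return it."""
        self._angles.append(measured_angle)
        return measured_angle

    def fallback(self, use_average=True):
        """Return the average of past values (or the previous value);
        None when no measurement has been made yet."""
        if not self._angles:
            return None
        if use_average:
            return sum(self._angles) / len(self._angles)
        return self._angles[-1]
```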
- the input processing device 100 generates a converted image from a viewpoint according to the calculated positional relationship between the operator 140 and the input mechanism 120 (step S110).
- the input side processing device 100 generates a converted image according to the angle between the head of the operator 140 and the input mechanism 120. More specifically, the input side processing device 100 generates a converted image from a viewpoint on the line connecting the operator 140 and the input mechanism 120 (see the arrow in FIG. 11), with a yaw angle of 0 degrees and a pitch angle equal to the angle between the head of the operator 140 and the input mechanism 120 calculated in step S109.
- the front direction of the output mechanism 220 is the x-axis
- the direction perpendicular to the ground is the z-axis
- the center of the coordinate axis is the operation center of the output mechanism 220 (for example, the center of the bottom surface of the output mechanism 220).
- the input side processing device 100 generates, as a converted image, an image of the work object 230 captured from a viewpoint with a yaw angle of 0 degrees and a pitch angle corresponding to the angle between the head of the operator 140 and the input mechanism 120 calculated in step S109.
- the input side processing device 100 generates a converted image so that the line connecting the work area of the work object 230 and the viewpoint of the converted image is the same as the line connecting the head of the operator 140 and the coordinate center of the input mechanism 120 (see the arrow in FIG. 11).
- FIG. 12 is a diagram showing an example of a converted image according to an embodiment of the present disclosure.
- the upper left diagram of FIG. 12 is a diagram showing a converted image with a pitch angle of 10 degrees.
- the upper right diagram of FIG. 12 is a diagram showing a converted image with a pitch angle of 40 degrees.
- the lower left diagram of FIG. 12 is a diagram showing a converted image with a pitch angle of 80 degrees.
- the lower right diagram of FIG. 12 is a diagram showing a converted image with a pitch angle of 90 degrees.
- the input side processing device 100 generates, from the output captured image, a converted image in which the work object 230 is viewed from the front at the head height of the operator 140.
- the input side processing device 100 may generate a converted image whose pitch angle is exactly the same as the angle calculated in step S109, or a converted image whose pitch angle is close to that angle. For example, the input side processing device 100 generates a converted image with a pitch angle in units of a predetermined value such as 1 degree or 10 degrees. In this way, by generating a converted image with the predetermined pitch angle closest to the angle calculated in step S109, the input side processing device 100 can reduce the processing load for generating the converted image.
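- Snapping the measured angle to a predetermined pitch angle can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; the function name is hypothetical and the default step of 10 degrees is one of the example units from the text:

```python
def quantized_pitch(measured_deg, step_deg=10.0):
    """Snap the measured pitch angle to the nearest predetermined pitch
    angle (e.g. 1-degree or 10-degree units), so that converted images
    only need to be prepared for a fixed set of viewpoints."""
    return round(measured_deg / step_deg) * step_deg
```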
- the input processing device 100 generates a converted image with a yaw angle of 0 degrees, in other words, viewed from the same direction as the front direction of the output mechanism 220. In this embodiment, the input processing device 100 generates a converted image viewed from the direction of the work object 230.
- the input side processing device 100 can generate a converted image with a lower cognitive operational burden on the operator 140 than the output captured image by generating a converted image with a yaw angle of 0 degrees. In this way, the input side processing device 100 can generate a converted image in such a way as to reduce the cognitive operational burden on the operator 140.
- the input side processing device 100 generates a converted image with a pitch angle corresponding to the angle between the head of the operator 140 and the input mechanism 120 calculated in step S109.
- the input side processing device 100 generates a converted image so that the positional relationship between the work object 230 and the viewpoint of the converted image is close to the positional relationship between the input mechanism 120 and the head (viewpoint) of the operator 140. This allows the operator 140 to view a converted image on the display device 130 that is close to how the work object 230 would look if viewed with the naked eye.
- the input side processing device 100 can generate a converted image that imposes a lower cognitive operational burden on the operator 140 than the output captured image by generating a converted image with a pitch angle that corresponds to the angle between the head of the operator 140 and the input mechanism 120. In this way, the input side processing device 100 can generate a converted image that reduces the cognitive operational burden on the operator 140.
- FIG. 13 is a diagram for explaining an example of a converted image according to an embodiment of the present disclosure.
- the angle (pitch angle) between the operator 140 and the input mechanism 120 is 40 degrees.
- the left diagram in FIG. 13 is a diagram of the display device 130 and the input mechanism 120 as viewed from the operator 140 when the input side processing device 100 displays a converted image with a pitch angle of 10 degrees on the display device 130.
- the input side processing device 100 generates a converted image of the work object 230 as viewed from above at a pitch angle of 10 degrees relative to the ground (the surface on which the work object 230 is placed) and presents it to the operator 140.
- the operator 140 will be viewing the work object 230 from a lower position than the position from which the operator 140 is viewing the input mechanism 120.
- the central diagram in FIG. 13 is a view of the display device 130 and the input mechanism 120 as viewed from the operator 140 when the input side processing device 100 displays a converted image with a pitch angle of 40 degrees on the display device 130.
- the input side processing device 100 generates a converted image of the work object 230 as viewed from above at a pitch angle of 40 degrees relative to the ground (the surface on which the work object 230 is placed) and presents it to the operator 140.
- the operator 140 will see the work object 230 from the same angle as the angle at which the operator 140 is viewing the input mechanism 120.
- the right diagram in FIG. 13 is a diagram of the display device 130 and the input mechanism 120 as viewed from the operator 140 when the input side processing device 100 displays a converted image with a pitch angle of 80 degrees on the display device 130.
- the input side processing device 100 generates a converted image of the work object 230 as viewed from above at a pitch angle of 80 degrees relative to the ground (the surface on which the work object 230 is placed) and presents it to the operator 140.
- the operator 140 will be viewing the work object 230 from a higher position than that from which he or she is viewing the input mechanism 120.
- the input side processing device 100 generates a converted image according to the angle (pitch angle) between the operator 140 and the input mechanism 120. This allows the input side processing device 100 to generate a converted image for the operator 140 with good work efficiency, i.e., with a reduced cognitive workload for the operator 140.
- the inventors' experiments have revealed that the pitch angle that provides good work efficiency differs from person to person. Therefore, by having the input processing device 100 generate a converted image according to the posture of the operator 140 operating the input mechanism 120 (the angle (pitch angle) between the operator 140 and the input mechanism 120), the input processing device 100 can generate a converted image that further reduces the cognitive operational burden on the operator 140 performing the operation.
- the input side processing device 100 may generate a converted image with a constant pitch angle.
- the input side processing device 100 can generate a converted image in which the work object 230 is viewed from a constant angle, regardless of the posture (head position) of the operator 140.
- the input side processing device 100 may generate a converted image according to the frequency of postures taken by the operator 140. For example, the input side processing device 100 periodically detects the posture of the operator 140, and generates a converted image at the angle (pitch angle) when looking at the input mechanism 120 in the posture that the operator 140 has most frequently taken up to that point. In this way, the input side processing device 100 may generate a converted image in a posture (pitch angle) that the operator 140 frequently takes when operating.
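- Tracking the most frequently taken posture can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; the class name and the 10-degree bucketing of postures are hypothetical:

```python
from collections import Counter

class PostureTracker:
    """Periodically records the operator's pitch angle (bucketed into
    10-degree units) and reports the most frequently taken posture."""
    def __init__(self, bucket_deg=10):
        self._bucket = bucket_deg
        self._counts = Counter()

    def record(self, pitch_deg):
        """Record one periodic posture detection."""
        self._counts[round(pitch_deg / self._bucket) * self._bucket] += 1

    def most_frequent_pitch(self):
        """Pitch angle of the posture taken most often so far."""
        if not self._counts:
            return None
        return self._counts.most_common(1)[0][0]
```

The converted image would then be generated at the pitch angle returned by `most_frequent_pitch`, rather than at the instantaneous posture.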
- the input side processing device 100 may generate a converted image at a constant pitch angle for each operator 140.
- This constant pitch angle may be an angle registered in advance by the operator 140 through pre-registration, or may be the posture (pitch angle) that was most frequently used in the previous task.
- the input side processing device 100 identifies the operator 140 from the input captured image using a technique such as face recognition, and generates a converted image with a pitch angle according to the identified operator 140.
- the operator 140 may log in to the input side processing device 100, so that the input side processing device 100 can identify the operator 140.
- the input side processing device 100 may generate a converted image with a constant pitch angle regardless of the angle (yaw angle) between the output side imaging device 210 and the output mechanism 220. In this case, the input side processing device 100 may omit step S106 in FIG. 3.
- the input side processing device 100 generates a converted image depending on whether or not 3D data from the remote location A can be acquired.
- if 3D data of remote location A can be acquired, the input side processing device 100 generates a converted image by converting the viewpoint of the output side imaging device 210 on the 3D data.
- if 3D data of remote location A cannot be acquired, the input processing device 100 generates a converted image according to the task. For example, if the task is a two-dimensional task (such as pressing a button placed on a plane), the input processing device 100 generates a converted image by geometrically transforming the output captured image. If the task is a three-dimensional task (such as moving an object), the input processing device 100 uses computer graphics to generate a simulation image as a converted image.
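- The selection among the three generation methods can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; the function name and the returned method labels are hypothetical:

```python
def select_conversion_method(has_3d_data, task_is_two_dimensional):
    """Choose how to generate the converted image, following the text:
    - 3D data available     -> viewpoint conversion on the 3D data
    - no 3D data, 2D task   -> geometric transform of the captured image
    - no 3D data, 3D task   -> computer-graphics simulation image"""
    if has_3d_data:
        return "3d_viewpoint_conversion"
    if task_is_two_dimensional:
        return "geometric_transform"
    return "cg_simulation"
```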
- when 3D data of remote location A can be acquired, the input side processing device 100 generates a converted image according to the angle (pitch angle) between the operator 140 and the input mechanism 120 based on the 3D data of remote location A.
- the 3D data may be held by the information processing system 10 or may exist outside the information processing system 10.
- FIG. 14 is a diagram showing an example of a method for generating a converted image according to an embodiment of the present disclosure.
- the left diagram in FIG. 14 is a diagram showing an output captured image acquired by the output side imaging device 210.
- the center diagram in FIG. 14 is a diagram showing 3D data of remote location A. Note that, here, 3D data including the work target 230 and a part of the output mechanism 220 in remote location A is shown.
- the input side processing device 100 uses 3D data (three-dimensional information) based on the output captured image to generate a converted image with a yaw angle of 0 degrees and an arbitrary pitch angle.
- the pitch angle is, for example, the angle between the operator 140 and the input mechanism 120, as described above.
- examples of 3D data include images from multiple viewpoints captured by multiple output-side imaging devices 210, CAD (Computer Aided Design) data, SLAM data, and three-dimensional capture data using markers.
- the input processing device 100 may generate a 3D map based on the information acquired by these devices.
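- The viewpoint conversion on 3D data (yaw angle 0 degrees, arbitrary pitch angle) can be sketched as a simple pinhole projection of a point cloud. This is an illustrative Python sketch, not part of the disclosed embodiment; it assumes the 3D data is a list of (x, y, z) points with x the front direction of the output mechanism 220, z vertical, and the origin at the operation center, and the function name and parameters `distance`/`focal` are hypothetical:

```python
import math

def project_from_virtual_viewpoint(points, pitch_deg, distance=2.0, focal=1.0):
    """Perspective-project 3D points onto the image plane of a virtual
    camera with yaw 0 and the given pitch, looking at the origin from
    the given distance."""
    p = math.radians(pitch_deg)
    cam = (distance * math.cos(p), 0.0, distance * math.sin(p))
    fwd = (-math.cos(p), 0.0, -math.sin(p))   # towards the origin
    right = (0.0, 1.0, 0.0)                   # yaw 0: world y axis
    up = (-math.sin(p), 0.0, math.cos(p))     # right x fwd
    image = []
    for pt in points:
        rel = tuple(pt[i] - cam[i] for i in range(3))
        depth = sum(rel[i] * fwd[i] for i in range(3))
        u = focal * sum(rel[i] * right[i] for i in range(3)) / depth
        v = focal * sum(rel[i] * up[i] for i in range(3)) / depth
        image.append((u, v))
    return image
```

Varying `pitch_deg` while keeping the yaw at 0 degrees reproduces the family of viewpoints illustrated in FIG. 12.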
- FIG. 15 is a diagram illustrating an example of 3D data according to an embodiment of the present disclosure.
- FIG. 15 illustrates 3D data (3D map) generated based on sensing data from a distance measurement sensor at one viewpoint.
- the input side processing device 100 acquires sensing data from a distance measuring sensor placed in remote location A via the output side processing device 200.
- the input side processing device 100 uses the sensing data to generate a 3D map, for example, as shown in FIG. 15.
- Range sensors that may be placed at remote location A include ToF, LiDAR, stereo cameras, monocular SLAM, compound eye SLAM, image plane phase difference sensors, ultrasonic ranging sensors, and electromagnetic radar.
- although the input side processing device 100 creates the 3D map here, the device that creates the 3D map is not limited to the input side processing device 100.
- the output side processing device 200 may generate the 3D map using sensing data from a distance measuring sensor placed in remote location A, or an external device (not shown) may generate the map. In this case, the input side processing device 100 obtains the 3D map from the output side processing device 200 or the external device.
- FIG. 16 is a diagram showing another example of a method for generating a converted image according to an embodiment of the present disclosure.
- the shape of the work surface (original surface) of the work object 230 is known (a rectangle).
- the input side processing device 100 detects the work surface of the work object 230 from the output captured image.
- the input side processing device 100 geometrically transforms the detected work surface according to the original, known shape of the work surface (a rectangle).
- that is, the input side processing device 100 converts the detected work surface (the trapezoidal work surface in FIG. 16) into the known shape (a rectangle).
- the input side processing device 100 performs viewpoint position conversion of the pitch angle x on the work surface that has been converted into a known shape, and generates a converted image.
- the input processing device 100 may omit the conversion from the detected work surface to the known shape (a rectangle) and generate a converted image of pitch angle x directly from the detected work surface.
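- The plane-to-plane geometric transformation from the detected trapezoidal work surface to the known rectangle can be sketched as a homography estimated from the four corner correspondences (a direct linear transform). This is an illustrative Python sketch, not part of the disclosed embodiment; the corner coordinates in the usage example and the function names are hypothetical:

```python
def homography(src_pts, dst_pts):
    """Solve for the 3x3 homography H (with H[2][2] fixed to 1) mapping
    four detected work-surface corners onto the known rectangle, via
    Gaussian elimination on the 8x8 direct-linear-transform system."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):                     # elimination with pivoting
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):           # back substitution
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], [h[6], h[7], 1.0]]

def warp_point(H, x, y):
    """Apply the homography to one point of the output captured image."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return u, v
```

Applying `warp_point` to every pixel coordinate of the detected work surface yields the rectified (known-shape) work surface, from which the pitch-angle-x viewpoint conversion can then be performed.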
- FIG. 17 is a diagram illustrating an example of an operation performed by an operator 140 according to an embodiment of the present disclosure.
- the operator 140 performs an operation of moving a cube placed on a work target 230.
- when the operator 140 performs three-dimensional work, the input side processing device 100 of this embodiment generates a converted image including a virtual work object 230 and output mechanism 220 based on the output captured image. At this time, the input side processing device 100 generates, for example, a virtual converted image of the range of the work surface of the work object 230.
- FIG. 18 is a diagram showing another example of a converted image according to an embodiment of the present disclosure.
- FIG. 18 shows a converted image that is virtually generated based on the output captured image shown in FIG. 17.
- the input side processing device 100 generates a converted image that virtually represents a part of the work surface and the output mechanism 220, for example, using computer graphics technology.
- the input side processing device 100 generates a converted image based on the output captured image according to the information (e.g., 3D data) that can be used to generate the converted image and the work content (two-dimensional work or three-dimensional work).
- FIG. 19 is a diagram showing an example of a converted image including lens distortion according to an embodiment of the present disclosure.
- FIG. 19 shows a converted image generated by parallel projection, a converted image with a FoV (Field of View) of 15 degrees, a converted image with a FoV of 30 degrees, and a converted image with a FoV of 60 degrees.
- the input side processing device 100 generates a converted image with a yaw angle of 0 degrees, i.e., the work object 230 viewed from the front of the output mechanism 220, but the converted image generated by the input side processing device 100 is not limited to this.
- the input side processing device 100 may generate a converted image with a different yaw angle according to the operation performed by the operator 140.
- suppose, for example, that the operator 140 performs an operation that moves the output mechanism 220 to the rear side of the work object 230 (i.e., away along the front direction of the output mechanism 220).
- in this case, operability for the operator 140 may be better if the input side processing device 100 generates a converted image of the work object 230 viewed from the side, i.e., with a yaw angle of 90 degrees.
- the input side processing device 100 may generate a converted image with a yaw angle according to the task performed by the operator 140, for example.
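The yaw selection just described can be sketched as follows; the rule and the direction labels are illustrative assumptions following the example in the text.

```python
def yaw_for_operation(direction: str) -> float:
    """Yaw angle of the converted image chosen from the movement direction.

    "depth" (moving the output mechanism toward/away from the rear of the
    work object) -> side view (90 degrees); anything else -> frontal view.
    """
    return 90.0 if direction == "depth" else 0.0
```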
- having generated the converted image in step S109, the input side processing device 100 generates a display image from the converted image and displays the generated display image on the display device 130 (step S111).
- the following five methods are examples of how the input side processing device 100 generates a display image. Note that the following methods are only examples, and the input side processing device 100 may generate a display image using other methods.
- (First generation method) A method of using a converted image as a display image.
- (Second generation method) A method of arranging the converted image and the output captured image in a split view to form the display image.
- (Third generation method) A method of displaying a converted image in full screen and displaying the output captured image as a wipe to generate a display image.
- (Fourth generation method) A method of displaying an output captured image in full screen and displaying the converted image as a wipe to generate a display image.
- (Fifth generation method) A method of transparently superimposing one of the converted image and the output captured image to generate a display image.
- the first generation method is a method in which the converted image is used as the display image, and the output captured image is not displayed on the display device 130 .
- FIG. 20 is a diagram showing an example of a display image generated based on a first generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 20 shows an output captured image.
- the right diagram in FIG. 20 shows a display image.
- the input side processing device 100 generates a converted image generated from the output captured image as a display image, and displays it on the display device 130.
- the input side processing device 100 displays the converted image as a display image on the display device 130, so that the operator 140 can operate the input mechanism 120 while checking the display image, which imposes a lower cognitive operational load than the output captured image. This makes it possible to further suppress a decrease in the operational efficiency of the operator 140.
- FIG. 21 is a diagram showing another example of a display image generated based on the first generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 21 shows an output captured image.
- the right diagram in FIG. 21 shows a display image.
- the input side processing device 100 generates a converted image generated from the output captured image as a display image, and displays it on the display device 130.
- the input side processing device 100 generates a converted image that includes the work surface of the work object 230 and part of the output mechanism 220, and omits the display of other parts such as the background.
- the input side processing device 100 may also generate a converted image that highlights parts of the work surface, such as by displaying the boundary of the work surface with a line of a specified color.
- the input side processing device 100 displays the generated converted image on the display device 130 as a display image.
- the input side processing device 100 can further reduce the decrease in the work efficiency of the operator 140 by generating a converted image that includes information necessary for the work (e.g., the work surface and the output mechanism 220) and omits information unnecessary for the work (e.g., the background, etc.).
- note that when the input side processing device 100 generates a virtual converted image, it is preferable that the input side processing device 100 not use the first generation method, in which only the converted image is used as the display image. In other words, the input side processing device 100 preferably generates the display image using the first generation method when it generates the converted image by viewpoint transformation or by geometric transformation using 3D data.
- the second generation method is a method in which the converted image and the output captured image are divided and displayed on the display device 130 .
- FIG. 22 is a diagram showing an example of a display image generated based on a second generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 22 shows the output captured image.
- the right diagram in FIG. 22 shows the display image.
- the input side processing device 100 generates a display image in which the output captured image and the converted image are arranged side by side, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which a converted image generated by viewpoint conversion, for example, is arranged on the right side and an output captured image is arranged on the left side.
- the input side processing device 100 displays both the converted image and the output captured image on the display device 130. This allows the operator 140 to work while checking both the converted image, which places a low cognitive operational burden, and the output captured image that was actually captured by the output side imaging device 210.
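A split-view display image of this kind can be sketched with a simple horizontal concatenation. This is illustrative only; real images would come from the output side imaging device 210 and the conversion step.

```python
import numpy as np

def side_by_side(captured: np.ndarray, converted: np.ndarray) -> np.ndarray:
    """Arrange the output captured image (left) and the converted image
    (right), as in the FIG. 22 layout; swap the arguments for FIG. 23.
    Images are H x W x 3 uint8 arrays with matching heights."""
    assert captured.shape[0] == converted.shape[0], "image heights must match"
    return np.hstack([captured, converted])
```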
- FIG. 23 is a diagram showing another example of a display image generated based on the second generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 23 shows the output captured image.
- the right diagram in FIG. 23 shows the display image.
- the input side processing device 100 generates a display image in which the output captured image and the converted image are arranged side by side, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which, for example, the output captured image is arranged on the right side and the virtually generated converted image is arranged on the left side.
- the input side processing device 100 presents the output captured image to the operator 140 in addition to the converted image. This allows the operator 140 to work while checking both the converted image, which imposes a low cognitive operational load, and the output captured image that was actually captured by the output side imaging device 210.
- the third generation method is a method in which the converted image is displayed full screen, and the output captured image is displayed on the display device 130 as a wipe superimposed on the converted image.
- FIG. 24 is a diagram showing an example of a display image generated based on a third generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 24 shows the output captured image.
- the right diagram in FIG. 24 shows the display image.
- the input side processing device 100 generates a display image by superimposing an output captured image, which is smaller than the converted image, on the converted image, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which the output captured image is superimposed on the lower right of the converted image generated by viewpoint conversion, for example.
- FIG. 25 is a diagram showing another example of a display image generated based on the third generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 25 shows the output captured image.
- the right diagram in FIG. 25 shows the display image.
- the input side processing device 100 generates a display image by superimposing an output captured image, which is smaller than the converted image, on the converted image, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which the output captured image is superimposed on the bottom left of a virtually generated converted image, for example.
- the input side processing device 100 displays both the converted image and the output captured image on the display device 130. This allows the operator 140 to work while checking both the converted image, which places a low cognitive operational burden, and the output captured image that was actually captured by the output side imaging device 210.
- the input side processing device 100 can generate a display image by superimposing an output captured image of any size at any position on the converted image.
- the position and size of the output captured image may be specified by the operator 140.
- the input side processing device 100 may also change the position or display size of the output captured image depending on the situation, such as when the output captured image displayed as a wipe would be superimposed on an area of the full-screen converted image that should be presented to the operator 140, such as the work surface (work area).
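One possible way to reposition the wipe so that it does not cover the work area is sketched below. All names, the quarter-size nearest-neighbour downsampling, and the top-left fallback position are assumptions for illustration.

```python
import numpy as np

def overlay_wipe(base, wipe, top, left, scale=4):
    """Superimpose a shrunken wipe image onto the full-screen base image
    at (top, left), using nearest-neighbour downsampling."""
    out = base.copy()
    small = wipe[::scale, ::scale]
    h, w = small.shape[:2]
    out[top:top + h, left:left + w] = small
    return out

def place_wipe_avoiding(base, wipe, work_box, scale=4):
    """Use the default bottom-right position for the wipe unless it would
    cover the work area, given as (top, left, bottom, right); in that case
    fall back to the top-left corner."""
    small_h = -(-wipe.shape[0] // scale)   # ceil division
    small_w = -(-wipe.shape[1] // scale)
    top, left = base.shape[0] - small_h, base.shape[1] - small_w
    t, l, b, r = work_box
    if not (b <= top or r <= left):        # default position overlaps work area
        top, left = 0, 0
    return overlay_wipe(base, wipe, top, left, scale)
```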
- if the input system 11 has a function for detecting the line of sight of the operator 140, that is, if the input side processing device 100 can detect where on the display device 130 the operator 140 is looking, the display image may be changed according to the line of sight of the operator 140.
- for example, the input side processing device 100 may end the display of the converted image and display the output captured image on the display device 130, or may end the display of the output captured image and display the converted image on the display device 130.
- the input side processing device 100 may also display an image that the operator 140 is gazing at larger than an image that the operator 140 is not gazing at. For example, if the operator 140 continues to look at the output captured image that is displayed as a wipe, the input side processing device 100 may increase the size of the output captured image. Alternatively, in this case, the input side processing device 100 may display the output captured image full screen and display the converted image as a wipe.
- the input side processing device 100 can switch the display image according to the line of sight of the operator 140 in the second generation method described above or the fourth generation method described below, as in the third generation method.
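A gaze-driven switch of the kind described might look like the following sketch. The layout representation and the dwell-time threshold are assumptions, not taken from the disclosure.

```python
def update_layout(layout: dict, gaze_target: str, dwell_s: float,
                  threshold_s: float = 1.5) -> dict:
    """Swap the full-screen image and the wipe when the operator keeps
    gazing at the wipe for longer than a dwell-time threshold.

    layout maps "full" and "wipe" to image names, e.g. "converted"/"captured".
    """
    if gaze_target == layout["wipe"] and dwell_s >= threshold_s:
        return {"full": layout["wipe"], "wipe": layout["full"]}
    return layout
```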
- the fourth generation method is a method in which the output captured image is displayed full screen, and the converted image is displayed on the display device 130 by superimposing it as a wipe on the output captured image.
- FIG. 26 is a diagram showing an example of a display image generated based on a fourth generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 26 shows an output captured image.
- the right diagram in FIG. 26 shows a display image.
- the input side processing device 100 generates a display image by superimposing a converted image that is smaller than the output captured image on the output captured image, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which a converted image generated by viewpoint conversion is superimposed on the lower right of the output captured image, for example.
- FIG. 27 is a diagram showing another example of a display image generated based on the fourth generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 27 shows the output captured image.
- the right diagram in FIG. 27 shows the display image.
- the input side processing device 100 generates a display image by superimposing a converted image that is smaller than the output captured image on the output captured image, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which a virtually generated converted image is superimposed on the lower right of the output captured image, for example.
- the input side processing device 100 displays both the converted image and the output captured image on the display device 130. This allows the operator 140 to work while checking both the converted image, which places a low cognitive operational burden, and the output captured image that was actually captured by the output side imaging device 210.
- the fifth generation method is a method in which one of the output captured image and the converted image is made transparent and superimposed on the other and displayed on the display device 130 .
- FIG. 28 is a diagram showing an example of a display image generated based on a fifth generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 28 shows the output captured image.
- the right diagram in FIG. 28 shows the display image.
- the input side processing device 100 generates a display image by superimposing a converted image with a predetermined transparency on the output captured image, and displays it on the display device 130.
- the input side processing device 100 generates a display image by superimposing on the output captured image, for example, a converted image that is the same size as the output captured image and is generated by viewpoint conversion.
- FIG. 29 is a diagram showing another example of a display image generated based on the fifth generation method according to an embodiment of the present disclosure.
- the left diagram in FIG. 29 shows the output captured image.
- the right diagram in FIG. 29 shows the display image.
- the input side processing device 100 generates a display image by superimposing a converted image with a predetermined transparency on the output captured image, and displays it on the display device 130.
- the input side processing device 100 generates a display image in which a virtually generated converted image is superimposed on the output captured image, the converted image being the same size as the output captured image, for example.
- the input side processing device 100 displays both the converted image and the output captured image on the display device 130. This allows the operator 140 to work while checking both the converted image, which places a low cognitive operational burden, and the output captured image that was actually captured by the output side imaging device 210.
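The fifth generation method amounts to an alpha blend of the two images. A minimal sketch follows; the transparency value is an assumed parameter.

```python
import numpy as np

def blend(captured: np.ndarray, converted: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Superimpose the converted image on the output captured image with a
    predetermined transparency alpha (0 = captured only, 1 = converted only).
    Both images must share the same H x W x 3 uint8 shape."""
    out = (1.0 - alpha) * captured.astype(np.float32) + alpha * converted.astype(np.float32)
    return out.round().astype(np.uint8)
```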
- the input side processing device 100 may generate a display image using a method other than the first to fifth generation methods described above. Alternatively, the input side processing device 100 may generate a display image that includes information other than the converted image and the output captured image in addition to these.
- FIG. 30 is a diagram illustrating an example of a display image according to an embodiment of the present disclosure.
- the display image shown in FIG. 30 is, for example, an image generated according to the fourth generation method.
- the display image shown in FIG. 30 further includes work area information and operation object (part of the output mechanism 220) information that are displayed superimposed on the output captured image.
- the work area information is, for example, information indicating the boundary of the work surface, and is generated by computer graphics.
- the operation object information is information indicating an object that is part of the output mechanism 220 and actually performs work on the work target 230, i.e., an object that the operator 140 operates, and is generated by computer graphics.
- the input side processing device 100 generates a display image, for example, by superimposing the work area information and the operation object information on the output captured image.
- although the input side processing device 100 here superimposes the work area information and the operation object information on the output captured image, the input side processing device 100 may, for example, superimpose the work area information and the operation object information on the converted image instead.
- here, the input side processing device 100 generates a display image including the work area information and the operation object information when using the fourth generation method, but the display images generated using the first to third and fifth generation methods may also include the work area information and the operation object information.
- the operator 140 may also arbitrarily change the viewpoint of the converted image displayed in the display image or the imaging direction of the output side imaging device 210.
- the input side processing device 100 controls the output side imaging device 210 or generates the converted image according to instructions from the operator 140. This allows the operator 140 to instruct the input side processing device 100 to generate a display image that improves the operator's own operational efficiency.
- the input side processing device 100 may also generate an indicator showing the positional relationship between the work surface of the work object 230 and a virtual camera that captures the converted image, and display it on the display device 130.
- the virtual camera is a virtual imaging device that would capture the converted image if it were placed at the remote location A.
- the input side processing device 100 presents an indicator showing the viewpoint and imaging direction of the converted image to the operator 140.
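Such an indicator could be derived from the relative position of the virtual camera, for example as yaw and pitch angles about the centre of the work surface. This is an illustrative computation; the coordinate conventions (z up, angles in degrees) are assumptions.

```python
import math

def camera_indicator(work_center, camera_pos):
    """Yaw/pitch of the virtual camera relative to the work surface centre,
    e.g. to draw an indicator of the converted image's viewpoint and
    imaging direction. Points are (x, y, z) tuples with z up."""
    dx, dy, dz = (c - w for c, w in zip(camera_pos, work_center))
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```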
- having displayed the display image on the display device 130 in step S111, the input side processing device 100 acquires operation information from the input mechanism 120 (step S112).
- the input side processing device 100 acquires the operation performed by the operator 140 on the input mechanism 120 as operation information.
- the input side processing device 100 transmits the operation information from the input side communication unit 101 to the output side communication unit 211 (step S113).
- the output side processing device 200 that has acquired the operation information outputs the operation information through the output mechanism 220 (step S114). As a result, the output mechanism 220 executes an operation on the work target 230 in response to the operation by the operator 140.
- the input side processing device 100 determines whether or not to end the remote operation (step S115). For example, the input side processing device 100 determines to end the remote operation when the operation on the work target 230 is completed. Alternatively, when the operator 140 decides to end the remote operation, the input side processing device 100 determines to end the remote operation according to an instruction from the operator 140.
- if it is determined that the remote operation should not be ended (step S115; No), the input side processing device 100 returns to step S103 and acquires an image (output captured image) from the output side imaging device 210.
- if it is determined that the remote operation should be ended (step S115; Yes), the input side processing device 100 ends the remote operation and ends the operation processing.
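Steps S103 through S115 form a loop that can be sketched as follows. The callables are illustrative stand-ins for the imaging device, the display path, the input mechanism, and the output side communication; none of the names come from the disclosure.

```python
def remote_operation_loop(get_image, render, get_operation, send, task_done):
    """Sketch of the operation processing loop: acquire the output captured
    image, convert and display it, read the operator's input, send the
    operation information to the output side, and repeat until the work
    on the work target is finished."""
    steps = 0
    while not task_done():           # S115: end the remote operation?
        image = get_image()          # S103: output captured image
        render(image)                # S109/S111: convert and display
        send(get_operation())        # S112/S113: operation information
        steps += 1
    return steps
```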
- the above-mentioned input side processing device 100 may be a wearable device worn by the operator 140, such as an HMD that is an AR (Augmented Reality) device or a VR (Virtual Reality) device.
- if such a wearable device includes an imaging device, the input side processing device 100 may use this imaging device as the input side imaging device 110.
- the pitch angle of the converted image may also be changed depending on the work content. For example, when the operator 140 performs an operation in a vertical direction relative to the ground (the surface on which the output mechanism 220 is placed) or the work surface of the work object 230, the input side processing device 100 generates a converted image with a pitch angle of 0 degrees. In other words, when the operator 140 performs an operation in a vertical direction, the input side processing device 100 generates a converted image in which the work object 230 is viewed from the side.
- when the operator 140 performs an operation in a plane parallel to the ground or the work surface, the input side processing device 100 generates a converted image with a pitch angle of 90 degrees. In other words, when the operator 140 performs an operation in the horizontal direction, the input side processing device 100 generates a converted image of the work object 230 viewed from above.
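The pitch selection in the last two paragraphs can be sketched as follows; the direction labels are illustrative.

```python
def pitch_for_operation(direction: str) -> float:
    """Pitch angle of the converted image from the operation direction:
    vertical operations -> side view (0 degrees); horizontal operations in
    a plane parallel to the ground or work surface -> top-down view (90)."""
    if direction == "vertical":
        return 0.0
    if direction == "horizontal":
        return 90.0
    raise ValueError(direction)
```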
- in the above description, the input side processing device 100 performs coordinate conversion of the output captured image (moving image) captured by the output side imaging device 210; however, the input side processing device 100 may also perform coordinate conversion of information other than moving images.
- the output system 12 acquires sound from a remote location A using a sound collection device (not shown) such as a microphone.
- the input side processing device 100 may perform coordinate conversion of the sound acquired by the output system 12.
- the input side processing device 100 converts the position of the sound source in the remote location A to the same coordinates as the generated converted image, for example, using stereophonic technology.
- the input side processing device 100 outputs the sound after the coordinate conversion from an audio output device (not shown) such as a speaker or headphones.
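A very reduced stand-in for the stereophonic rendering is constant-power panning of the converted sound-source position. This is an assumption for illustration; actual stereophonic technology would use head-related transfer functions or similar, not simple panning.

```python
import math

def pan_gains(source_x, listener_x=0.0, spread=1.0):
    """Map the sound source's converted x coordinate to (left, right)
    channel gains using constant-power panning."""
    # clamp the offset to [-1, 1], then map to a pan angle in [0, pi/2]
    p = max(-1.0, min(1.0, (source_x - listener_x) / spread))
    theta = (p + 1.0) * math.pi / 4.0
    return math.cos(theta), math.sin(theta)   # (left, right)
```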
- the position of the output mechanism 220 (or the work object 230) included in the converted image may be different from the position of the output mechanism 220 (or the work object 230) included in the output captured image.
- a delay may occur between the operation performed by the operator 140 and the output mechanism 220 executing it.
- a delay may occur between the time when the output side imaging device 210 actually captures an image and the time when the output captured image is transmitted to the input side processing device 100.
- there may be a discrepancy between the timing at which the operator 140 performs an operation and the timing at which the input side processing device 100 displays the result of this operation in the output captured image on the display device 130.
- This discrepancy in timing may result in a deterioration in operability for the operator 140.
- to address this, the input side processing device 100 displays an image showing the output mechanism 220 (hereinafter also referred to as an operation image) superimposed on the output captured image at a position according to the operation performed by the operator 140.
- FIG. 31 is a diagram showing an example of a display image according to another embodiment of the present disclosure.
- the input-side processing device 100 generates a display image based on the fourth generation method by displaying the output captured image in full screen and displaying the converted image as a wipe.
- the input side processing device 100 displays an operation image 221 showing a part of the output mechanism 220, which is generated using, for example, computer graphics technology, superimposed on the output captured image.
- the position and orientation of the output mechanism 220 shown in the operation image 221 are based on the operation performed by the operator 140 while checking the display image, and may differ from the position and orientation of the output mechanism 220 in the output captured image.
- the input side processing device 100 may also generate the converted image to be displayed as a wipe such that the position and orientation of the output mechanism 220 in the converted image follow the operation performed by the operator 140, similar to the operation image 221.
- the input side processing device 100 presents to the operator 140, in the displayed image, the position of the output mechanism 220 linked to the operation of the operator 140.
- the operator 140 can check without delay how the output mechanism 220 will move as a result of the operation he or she performed. This makes it possible to suppress any deterioration in operability for the operator 140.
- the input side processing device 100 may present to the operator 140 the amount of deviation (amount of delay) between the timing at which the operator 140 performs an operation and the timing at which this operation is displayed as an output captured image, using a motion blur or color. This allows the operator 140 to check how much delay is occurring, making it possible to prevent a decrease in operability.
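The delay amount and a possible colour cue can be sketched as follows; the thresholds and colour names are assumed values, not taken from the disclosure.

```python
import time

def delay_seconds(capture_timestamp, now=None):
    """Deviation between when the output side captured a frame and when it
    is shown to the operator (monotonic-clock timestamps, in seconds)."""
    if now is None:
        now = time.monotonic()
    return max(0.0, now - capture_timestamp)

def delay_colour(delay_s: float) -> str:
    """Map the delay to a colour cue for presentation to the operator."""
    if delay_s < 0.1:
        return "green"
    if delay_s < 0.3:
        return "yellow"
    return "red"
```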
- the information devices such as the input side processing device 100 and the output side processing device 200 according to each embodiment described above are realized by a computer 1000 having a configuration as shown in Fig. 32.
- the input side processing device 100 according to the embodiment will be described below as an example.
- FIG. 32 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the input side processing device 100.
- the computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, a HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. Each part of the computer 1000 is connected by a bus 1050.
- the CPU 1100 operates based on the programs stored in the ROM 1300 or the HDD 1400 and controls each part. For example, the CPU 1100 loads the programs stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processes corresponding to the various programs.
- the ROM 1300 stores boot programs such as the Basic Input Output System (BIOS) that is executed by the CPU 1100 when the computer 1000 starts up, as well as programs that depend on the hardware of the computer 1000.
- the HDD 1400 is a computer-readable recording medium that non-transitorily records programs executed by the CPU 1100 and data used by such programs.
- specifically, the HDD 1400 is a recording medium that records the program according to the present disclosure, which is an example of program data 1450.
- the communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (e.g., the Internet).
- the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
- the input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000.
- the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600.
- the CPU 1100 also transmits data to an output device such as a display, a speaker or a printer via the input/output interface 1600.
- the input/output interface 1600 may also function as a media interface that reads programs and the like recorded on a specific recording medium.
- Examples of media include optical recording media such as DVDs (Digital Versatile Discs) and PDs (Phase change rewritable Disks), magneto-optical recording media such as MOs (Magneto-Optical Disks), tape media, magnetic recording media, and semiconductor memories.
- the CPU 1100 of the computer 1000 executes a program loaded onto the RAM 1200 to realize the functions of the control unit 102, etc.
- the HDD 1400 stores the program according to the present disclosure and data in the storage unit.
- here, the CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, but as another example, the CPU 1100 may obtain these programs from other devices via the external network 1550.
- each component of each device shown in the figure is a functional concept, and does not necessarily have to be physically configured as shown in the figure.
- the specific form of distribution and integration of each device is not limited to that shown in the figure, and all or part of them can be functionally or physically distributed and integrated in any unit depending on various loads, usage conditions, etc. This distribution and integration configuration may also be performed dynamically.
- this embodiment can be implemented as any configuration that constitutes an apparatus or system, such as a processor as a system LSI, a module using multiple processors, a unit using multiple modules, a set in which a unit has been further enhanced with other functions, etc. (i.e., a configuration that constitutes part of an apparatus).
- a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in a single housing, are both systems.
- this embodiment can be configured as a cloud computing system in which a single function is shared and processed collaboratively by multiple devices via a network.
- the present technology can also be configured as follows.
  (1) An information processing device comprising a control unit that, when an operator controls an output mechanism disposed at a remote location to perform a task, acquires a captured image of the task area from an imaging device that images the task area, generates a display image from the captured image in accordance with a first positional relationship between the imaging device and the output mechanism and a second positional relationship between the operator and an input mechanism that receives an operation from the operator to control the output mechanism, and presents the display image to the operator.
  (2) The information processing device according to (1), wherein the control unit generates the display image based on a cognitive operational load of the operator according to the first positional relationship.
- control unit generates the display image when the first positional relationship is a state in which the operational load is high.
- control unit generates the display image according to an angle between a front direction of the output mechanism and a front direction of the imaging device.
- control unit generates the display image when the angle is equal to or greater than a threshold value.
- control unit generates the display image so as to reduce a cognitive operational load on the operator.
- the information processing device according to any one of (1) to (6), wherein the control unit generates the display image according to an angle between the operator and the input mechanism.
- the control unit generates a front display image of the work area captured at the angle based on the captured image.
- the control unit generates the display image based on the captured image from a direction at an angle between the output mechanism and the captured image.
- the control unit generates the display image based on the captured image from a direction in which an angle with respect to a front direction of the output mechanism is equal to or smaller than a predetermined value.
- control unit generates the display image by changing an imaging direction of the captured image based on three-dimensional data of the remote location.
- control unit performs a geometric transformation on the captured image to generate the display image.
- control unit generates the display image that virtually displays the work area based on the captured image.
- control unit generates the display image so that lens distortion is equal to or less than a predetermined value.
- control unit generates the display image based on a parallel projection method.
- control unit presents the display image to the operator and does not present the captured image to the operator.
- control unit presents the display image and the captured image to the operator.
- An information processing method comprising: (19) Computer, When an operator controls an output mechanism disposed at a remote location to perform a task, an image of the task area is acquired from an imaging device that images the task area, and generating a display image from the captured image in accordance with a first positional relationship between the imaging device and the output mechanism and a second positional relationship between the operator and an input mechanism that receives an operation from the operator to control the output mechanism; A control unit that presents the display image to the operator;
- An information processing device comprising: A program to function as a
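Several of the clauses above gate the viewpoint correction on the angle between the front direction of the output mechanism and the front direction of the imaging device, generating the corrected display image only when that angle is at or above a threshold. The publication gives no concrete values or algorithm; the sketch below is a minimal illustration, where the 30-degree threshold, the function names, and the 2-D unit-vector representation of the front directions are all assumptions made for illustration.

```python
import math

def angle_between_deg(u, v):
    """Angle in degrees between two 2-D direction vectors."""
    dot = u[0] * v[0] + u[1] * v[1]
    cos_a = dot / (math.hypot(*u) * math.hypot(*v))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_a = max(-1.0, min(1.0, cos_a))
    return math.degrees(math.acos(cos_a))

def needs_transformed_view(camera_front, output_front, threshold_deg=30.0):
    """Decide whether a viewpoint-corrected display image should be generated.

    The 30-degree default is an illustrative assumption, not a value taken
    from the publication.
    """
    return angle_between_deg(camera_front, output_front) >= threshold_deg

# A camera viewing the work area from the side (90 degrees off the output
# mechanism's front direction) exceeds the threshold; a nearly frontal
# camera does not.
print(needs_transformed_view((1.0, 0.0), (0.0, 1.0)))  # True
print(needs_transformed_view((1.0, 0.0), (1.0, 0.1)))  # False
```

In a real system the two front directions would come from the robot and camera extrinsics rather than being hard-coded.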
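One clause states that the control unit generates the display image by performing a geometric transformation on the captured image. A common geometric transformation for synthesizing a more frontal view of a (roughly planar) work area from an oblique camera is a planar homography. The sketch below applies a 3x3 homography to pixel coordinates in pure Python; the matrix values and the function name are illustrative assumptions, and a real system would warp the full image with an image-processing library, using a homography estimated from the camera pose.

```python
def apply_homography(H, points):
    """Map 2-D pixel coordinates through a 3x3 homography H.

    H is a row-major nested list; each point (x, y) is lifted to
    homogeneous coordinates (x, y, 1), multiplied by H, and divided
    by the resulting w component.
    """
    out = []
    for x, y in points:
        xs = H[0][0] * x + H[0][1] * y + H[0][2]
        ys = H[1][0] * x + H[1][1] * y + H[1][2]
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        out.append((xs / w, ys / w))
    return out

# The identity leaves points unchanged; a nonzero perspective term in the
# bottom row "tilts" the image plane, which is the kind of warp used to
# synthesize a more frontal view of the work area.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
tilt = [[1, 0, 0], [0, 1, 0], [0.001, 0, 1]]
print(apply_homography(identity, [(100, 50)]))  # [(100.0, 50.0)]
print(apply_homography(tilt, [(100, 50)]))
```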
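Another clause specifies generating the display image based on a parallel projection method, under which distance from the viewpoint does not change apparent size. Assuming the three-dimensional data of the remote location mentioned above is available as 3-D points, a minimal orthographic (parallel) projection can be sketched as follows; the axis vectors and function name are assumptions for illustration.

```python
def orthographic_project(points_3d, right, up):
    """Parallel-project 3-D points onto the plane spanned by `right` and
    `up` (assumed orthonormal view axes).

    Unlike a perspective projection, depth along the viewing direction
    does not scale the result, so objects keep their size regardless of
    distance from the viewpoint.
    """
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    return [(dot(p, right), dot(p, up)) for p in points_3d]

# Two points that differ only in depth (z) project to the same 2-D
# location, so their apparent positions are unaffected by distance.
right, up = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
print(orthographic_project([(2.0, 3.0, 1.0), (2.0, 3.0, 9.0)], right, up))
# [(2.0, 3.0), (2.0, 3.0)]
```

This depth invariance is one plausible reading of how a parallel projection could reduce the operator's cognitive load when judging positions in the work area.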
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Image Processing (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---
JP2024549897A (JPWO2024070398A1) | 2022-09-26 | 2023-08-29 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022152542 | 2022-09-26 | ||
JP2022-152542 | 2022-09-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024070398A1 (ja) | 2024-04-04 |
Family
ID=90477072
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/031079 WO2024070398A1 (ja) | Information processing device, information processing method, and program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2024070398A1 |
WO (1) | WO2024070398A1 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS57191180U * | 1981-05-29 | 1982-12-03 | | |
JPH08227319A (ja) * | 1995-02-22 | 1996-09-03 | Kajima Corp | Remote operation support image system |
JP2003181785A (ja) * | 2001-12-20 | 2003-07-02 | Yaskawa Electric Corp | Remote operation device |
JP2014180736A (ja) * | 2013-03-21 | 2014-09-29 | Toyota Motor Corp | Remote-controlled robot system |
JP2018121195A (ja) * | 2017-01-25 | 2018-08-02 | Taisei Corp | Remote control system |
JP2021180421A (ja) * | 2020-05-14 | 2021-11-18 | NTT Communications Corp | Remote control system, remote work device, video processing device, and program |
2023
- 2023-08-29 JP JP2024549897A patent/JPWO2024070398A1/ja active Pending
- 2023-08-29 WO PCT/JP2023/031079 patent/WO2024070398A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2024070398A1 | 2024-04-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6860488B2 (ja) | Mixed reality system | |
US20210049360A1 (en) | Controller gestures in virtual, augmented, and mixed reality (xR) applications | |
TWI659335B (zh) | Graphics processing method and apparatus, virtual reality system, and computer storage medium | |
CN113711109A (zh) | Head-mounted display with pass-through imaging | |
CN106454311B (zh) | LED three-dimensional imaging system and method | |
JP5709440B2 (ja) | Information processing device and information processing method | |
JP7182920B2 (ja) | Image processing device, image processing method, and program | |
JP2004062758A (ja) | Information processing device and method | |
WO2016163183A1 (ja) | Head-mounted display system and computer program for presenting the real-space surroundings of a user in an immersive virtual space | |
JP5565331B2 (ja) | Display system, display processing device, display method, and display program | |
JP2019008623A (ja) | Information processing device, control method of information processing device, computer program, and storage medium | |
CN110969658B (zh) | Localization and mapping using images from multiple devices | |
JP2022058753A (ja) | Information processing device, information processing method, and program | |
CN116205980A (zh) | Virtual reality positioning and tracking method and device in a mobile space | |
US20250203061A1 | Augmented reality eyewear with x-ray effect | |
WO2019106862A1 (ja) | Operation guidance system | |
WO2018146922A1 (ja) | Information processing device, information processing method, and program | |
WO2017098999A1 (ja) | Information processing device, information processing system, control method of information processing device, and computer program | |
JP2020071394A (ja) | Information processing device, information processing method, and program | |
US20240036327A1 | Head-mounted display and image displaying method | |
JP2016171478A (ja) | Camera control device and program therefor | |
WO2024070398A1 (ja) | Information processing device, information processing method, and program | |
EP3599539B1 (en) | Rendering objects in virtual views | |
JP2025089960A (ja) | Display control device, display control method, and program | |
CN111651043A (zh) | Augmented reality system supporting customized multi-channel interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23871639; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2024549897; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |