WO2017158718A1 - Remote work support device, instruction terminal, and site terminal
- Publication number: WO2017158718A1 (application PCT/JP2016/058126)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- work
- instruction
- site
- image
- Prior art date
Classifications
- H04N7/148 — Interfacing a video terminal to a particular transmission medium, e.g. ISDN
- G02B27/01 — Head-up displays
- G02B27/017 — Head-up displays, head mounted
- G05B19/042 — Programme control other than numerical control, using digital processors
- G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0304 — Detection arrangements using opto-electronic means
- G06Q50/10 — Services
- H04N7/142 — Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N7/147 — Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
- G02B2027/0138 — Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014 — Head-up displays comprising information/image processing systems
- G02B2027/0187 — Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- The present invention relates to a remote work support apparatus comprising an on-site terminal having an imaging unit that captures the scene viewed by a worker, and an instruction terminal that transmits information to and receives information from the on-site terminal; the invention also relates to the instruction terminal and the on-site terminal themselves.
- Maintenance and inspection work is indispensable during operation.
- In this maintenance and inspection work, it is necessary to inspect a large number of devices periodically, record the inspection results accurately, and, when a result is unsatisfactory, take measures such as adjusting the device.
- These tasks range from simple tasks that even an unskilled worker can perform to complex tasks that are difficult for anyone but a skilled worker.
- If a skilled worker supports the field work from a remote location, an unskilled worker can carry out even complex work.
- Patent Document 1 discloses one example of such remote work support technology.
- In Patent Document 1, the video captured by the imaging unit of a head-mounted display (hereinafter, HMD) worn by a field worker is shown on a screen for a work instructor at a remote location, so that the field worker and the work instructor can share information.
- In addition, a sub-screen for the work instructor displays an overall image of the work target together with the shooting range of the video within that overall image. As a result, even when the field worker comes close to the work target and only part of it appears in the video, the instructor can grasp the shooting range of the video by viewing the overall image.
- However, the technique of Patent Document 1 cannot acquire information about parts of the site outside the shooting angle of view of the imaging unit. Therefore, when giving an instruction about a work target located away from the field worker, the work instructor must repeatedly issue directions such as "please show me the lower right", or repeatedly have the HMD display a guidance image indicating the direction to the work target, so the instruction does not proceed smoothly.
- The present invention has been made to solve the above problem, and its object is to provide a remote work support apparatus, an instruction terminal, and an on-site terminal that make it possible to give instructions about a work target located outside the shooting angle of view of the imaging unit that captures the on-site video.
- A remote work support apparatus according to the present invention includes an on-site terminal having an imaging unit that captures the scene viewed by a worker, and an instruction terminal that transmits and receives information to and from the on-site terminal. The apparatus comprises: a position/direction estimation unit that estimates the worker's position and orientation from the video captured by the imaging unit; an on-site situation image generation unit that generates an image showing the on-site situation, including the worker's position, from the estimation result of the position/direction estimation unit; an instruction-side display unit that displays a screen including the image generated by the on-site situation image generation unit; a work instruction reception unit that receives information indicating the next work position input by the work instructor on the screen displayed by the instruction-side display unit; a direction calculation unit that calculates the direction to the next work position from the estimation result of the position/direction estimation unit and the reception result of the work instruction reception unit; a guidance image generation unit that generates an image indicating the direction to the next work position from the calculation result of the direction calculation unit; and a site-side display unit that displays a screen including the guidance image.
- With the present invention configured as described above, it is possible to give instructions about a work target located outside the shooting angle of view of the imaging unit that captures the on-site video.
- FIG. 2A is a diagram showing a hardware configuration example of the site terminal and the instruction terminal according to Embodiment 1 of the present invention.
- FIG. 2B is a diagram showing details of the hardware configuration example of the site terminal.
- FIG. 1 is a diagram showing an example of the overall configuration of a remote work support apparatus according to Embodiment 1 of the present invention.
- The remote work support device is a device that enables maintenance work, adjustment work, or installation work on machinery and equipment to be performed even when the worker at the work site (hereinafter, field worker) is unskilled, by having a work instructor who is a skilled expert support the on-site work from a remote location.
- This remote work support device comprises a site terminal 1 used by the field worker who actually performs the work on site, and an instruction terminal 2 with which the work instructor gives instructions to the field worker from a remote location.
- The on-site terminal 1 includes a control unit 101, a storage unit 102, a communication unit 103, an imaging unit 104, a guidance image generation unit 105, a display unit (site-side display unit) 106, a voice input unit 107, and an audio output unit 108.
- The control unit 101 controls the operation of each unit in the site terminal 1.
- The storage unit 102 stores information used in the on-site terminal 1.
- The storage unit 102 stores, for example, pre-registration information to be shown on the display 33 (described later) by the display unit 106, and information transmitted and received by the communication unit 103.
- The communication unit 103 transmits information to and receives information from the communication unit 203 of the instruction terminal 2.
- The communication unit 103 transmits information indicating the video captured by the imaging unit 104 (video data) and information indicating the audio input to the voice input unit 107 (audio data) to the communication unit 203.
- The communication unit 103 also receives work instruction data, text information, and audio data from the communication unit 203.
- The work instruction data is information indicating the direction from the current position of the field worker to the next work position.
- The imaging unit 104 captures the scene as viewed by the field worker.
- The guidance image generation unit 105 generates an image (guidance image) indicating the direction from the current position of the field worker to the next work position, based on the work instruction data received by the communication unit 103.
- An example of such a guidance image is a mark such as an arrow.
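As an illustrative sketch (not part of the patent's disclosure), selecting such an arrow mark from the relative direction carried in the work instruction data might look like the following; the function name and angle convention are assumptions:

```python
def guidance_arrow(relative_deg):
    """Map a relative direction to an arrow glyph for the HMD overlay.

    relative_deg: direction to the next work position relative to the
    worker's heading (0 = straight ahead, positive = to the left).
    The quadrant boundaries are illustrative choices.
    """
    # Normalize into (-180, 180].
    rel = (relative_deg + 180.0) % 360.0 - 180.0
    if -45.0 <= rel <= 45.0:
        return "↑"   # ahead
    if 45.0 < rel < 135.0:
        return "←"   # to the left
    if -135.0 < rel < -45.0:
        return "→"   # to the right
    return "↓"       # behind
```

In a real terminal the glyph would be rendered onto the information presentation screen rather than returned as text.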
- The display unit 106 displays various screens on the display 33.
- For example, the display unit 106 displays a screen including the guidance image (an information presentation screen) on the display 33.
- The display unit 106 also displays a screen including the text indicated by the text information (an information presentation screen) on the display 33.
- The guidance image and the text information may be displayed on the same screen.
- The voice input unit 107 is used by the field worker to input voice.
- The audio output unit 108 reproduces audio data received by the communication unit 103.
- The instruction terminal 2 includes a control unit 201, a storage unit 202, a communication unit 203, a position/direction estimation unit 204, an on-site situation image generation unit 205, a display unit (instruction-side display unit) 206, a work instruction reception unit 207, a direction calculation unit 208, a text reception unit 209, an input unit 210, a voice input unit 211, and a voice output unit 212.
- The control unit 201 controls the operation of each unit in the instruction terminal 2.
- The storage unit 202 stores information used in the instruction terminal 2.
- The storage unit 202 stores, for example, work place data used by the position/direction estimation unit 204 and the on-site situation image generation unit 205, and information transmitted and received by the communication unit 203.
- In the work place data, the various devices existing at the work place are defined as point cloud data, i.e., a set of three-dimensional coordinate values; furthermore, image feature points obtained from video of the work place are associated with points in this point cloud.
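The structure of such work place data might be sketched as follows; the class and field names are illustrative, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class FeaturePoint:
    """An image feature point tied to a 3-D point in the site point cloud."""
    descriptor: bytes  # e.g. a binary feature descriptor from a survey image
    xyz: tuple         # associated 3-D coordinate (site frame)

@dataclass
class WorkplaceData:
    """Point cloud of the work place plus image features linked to it."""
    points: list = field(default_factory=list)    # [(x, y, z), ...]
    features: list = field(default_factory=list)  # [FeaturePoint, ...]
```

The position/direction estimation unit would match features extracted from the live video against `features`, giving 2-D-to-3-D correspondences grounded in `points`.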
- The communication unit 203 transmits information to and receives information from the communication unit 103 of the on-site terminal 1.
- The communication unit 203 transmits to the communication unit 103 information indicating the direction from the current position of the field worker to the next work position, calculated by the direction calculation unit 208 (work instruction data), information indicating the text accepted by the text reception unit 209 (text information), and information indicating the voice input to the voice input unit 211 (voice data).
- The communication unit 203 also receives video data and audio data from the communication unit 103.
- The position/direction estimation unit 204 estimates the current position of the field worker and the direction in which the worker is facing, based on the video data received by the communication unit 203. It does so by comparing the video indicated by the video data with the work place data stored in advance in the storage unit 202.
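As a simplified stand-in for this estimation, the sketch below recovers a worker's 2-D position and heading by rigid registration of matched landmarks. The patent itself matches image feature points against 3-D point cloud data, which a full implementation would turn into a camera-pose (e.g. PnP) problem; the 2-D Kabsch-style solution here only illustrates the idea, and all names are assumptions:

```python
import math

def estimate_pose_2d(local_pts, map_pts):
    """Estimate worker position and heading by rigid 2-D registration.

    local_pts: landmark coordinates measured in the worker's own frame
    map_pts:   the same landmarks' coordinates in the site map
    Returns (position, heading_deg) such that map = R(heading) @ local + position.
    """
    n = len(local_pts)
    lcx = sum(p[0] for p in local_pts) / n
    lcy = sum(p[1] for p in local_pts) / n
    mcx = sum(p[0] for p in map_pts) / n
    mcy = sum(p[1] for p in map_pts) / n
    # Cross-covariance terms of the centered point sets give the
    # optimal rotation angle (2-D analogue of the Kabsch algorithm).
    sxx = sxy = syx = syy = 0.0
    for (lx, ly), (mx, my) in zip(local_pts, map_pts):
        lx, ly, mx, my = lx - lcx, ly - lcy, mx - mcx, my - mcy
        sxx += lx * mx; sxy += lx * my
        syx += ly * mx; syy += ly * my
    theta = math.atan2(sxy - syx, sxx + syy)
    # Translation that maps the rotated local centroid onto the map centroid.
    tx = mcx - (math.cos(theta) * lcx - math.sin(theta) * lcy)
    ty = mcy - (math.sin(theta) * lcx + math.cos(theta) * lcy)
    return (tx, ty), math.degrees(theta)
```

The returned position and heading are exactly the inputs the direction calculation unit and the on-site situation image generation unit consume.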
- The on-site situation image generation unit 205 generates an image (on-site situation image) showing the on-site situation, including the current position of the field worker, based on the estimation result of the position/direction estimation unit 204.
- The display unit 206 displays various screens on the display 6 (described later).
- For example, the display unit 206 displays a screen including the on-site situation image (a site situation screen) on the display 6.
- The display unit 206 also uses the on-site situation image generated by the on-site situation image generation unit 205 to display a screen for designating the work to be performed (a work instruction screen) on the display 6.
- The work instruction reception unit 207 receives information indicating the next work position input by the work instructor via the input unit 210. The work instructor designates the next work position on the work instruction screen shown on the display 6 by the display unit 206.
- The direction calculation unit 208 calculates the direction from the current position of the field worker to the next work position, based on the estimation result of the position/direction estimation unit 204 and the reception result of the work instruction reception unit 207.
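A minimal sketch of this calculation, assuming a 2-D floor-plane coordinate system; the function and field names are illustrative, not the patent's:

```python
import math

def work_instruction(worker_pos, worker_heading_deg, next_pos):
    """Build an illustrative work-instruction payload: the distance to the
    next work position and its direction relative to the worker's heading
    (0 = straight ahead, positive = to the left)."""
    dx = next_pos[0] - worker_pos[0]
    dy = next_pos[1] - worker_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Normalize the relative angle into (-180, 180].
    rel = (bearing - worker_heading_deg + 180.0) % 360.0 - 180.0
    return {"distance_m": math.hypot(dx, dy), "relative_deg": rel}
```

This payload corresponds to the work instruction data that the communication unit 203 sends to the site terminal, where the guidance image generation unit turns it into an arrow.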
- The text reception unit 209 receives information indicating text input by the work instructor via the input unit 210.
- The input unit 210 is used by the work instructor to input various information into the instruction terminal 2.
- The voice input unit 211 is used by the work instructor to input voice.
- The audio output unit 212 reproduces audio data received by the communication unit 203.
- Next, hardware configuration examples of the on-site terminal 1 and the instruction terminal 2 will be described with reference to the figures.
- a hardware configuration example of the on-site terminal 1 will be described.
- Each function of the on-site terminal 1 is realized by the HMD 3 and the headset 4 shown in FIG. 2. The field worker performs various tasks on the work target while wearing the HMD 3 and the headset 4.
- FIG. 2 illustrates a case where inspection work or the like is performed on the switchboard 11.
- The HMD 3 includes a terminal unit 31, an imaging device 32, and a display 33, as shown in the figures.
- The terminal unit 31 includes a processing circuit 311, a storage device 312, and a communication device 313.
- The headset 4 includes a microphone 41 and a speaker 42, as shown in the figures.
- The processing circuit 311 realizes the functions of the control unit 101, the guidance image generation unit 105, and the display unit 106, and executes various processes of the HMD 3.
- The processing circuit 311 may be dedicated hardware as shown in FIG. 3, or, as shown in FIG. 4, it may be a CPU 314 (Central Processing Unit; also called a processing device, arithmetic device, microprocessor, microcomputer, processor, or DSP (Digital Signal Processor)) that executes a program stored in the memory 315.
- When the processing circuit 311 is dedicated hardware, it is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
- The functions of the control unit 101, the guidance image generation unit 105, and the display unit 106 may each be realized by a separate processing circuit 311, or the functions may be realized collectively by a single processing circuit 311.
- When the processing circuit 311 is the CPU 314, the functions of the control unit 101, the guidance image generation unit 105, and the display unit 106 are realized by software, firmware, or a combination of software and firmware. Software and firmware are described as programs and stored in the memory 315.
- The processing circuit 311 reads out and executes the programs stored in the memory 315, thereby realizing the function of each unit. That is, the on-site terminal 1 includes the memory 315 for storing programs which, when executed by the processing circuit 311, result in the execution of each step shown in the flowcharts. These programs can also be said to cause a computer to execute the procedures and methods of the control unit 101, the guidance image generation unit 105, and the display unit 106.
- The memory 315 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or to a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
- The functions of the control unit 101, the guidance image generation unit 105, and the display unit 106 may be realized partly by dedicated hardware and partly by software or firmware.
- For example, the function of the control unit 101 may be realized by the processing circuit 311 as dedicated hardware, while the functions of the guidance image generation unit 105 and the display unit 106 may be realized by the processing circuit 311 reading and executing programs stored in the memory 315.
- In this way, the processing circuit 311 can realize the above functions by hardware, software, firmware, or a combination thereof.
- The storage device 312 realizes the function of the storage unit 102.
- The storage device 312 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, flash memory, EPROM, or EEPROM, or to a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD, or the like.
- The communication device 313 realizes the function of the communication unit 103.
- The communication method and form of the communication device 313 are not limited.
- The imaging device 32 realizes the function of the imaging unit 104.
- The imaging device 32 only needs to be mountable on the HMD 3; its imaging method and form are not limited.
- The display 33 shows the various screens produced by the display unit 106.
- The display 33 only needs to be mountable on the HMD 3; its display method and form are not limited. Examples of the display method include projecting a projector image onto a glass plate using a half mirror, a projection method using interference of laser light, and a method using a small liquid-crystal display.
- The microphone 41 realizes the function of the voice input unit 107.
- The speaker 42 realizes the function of the audio output unit 108.
- The forms of the microphone 41 and the speaker 42 are not limited.
- For example, the headset 4 in which the microphone 41 and the speaker 42 are integrated (see FIG. 2), or an earphone microphone 4b in which the microphone 41 is mounted on the earphone cable (see FIG. 5), may be used.
- Each function of the instruction terminal 2 is realized by a control arithmetic device 5, a display 6, an input device 7, a microphone 8, and a speaker 9.
- The control arithmetic device 5 includes a processing circuit 51, a storage device 52, and a communication device 53.
- The microphone 8 and the speaker 9 are not shown in the figures.
- The processing circuit 51 realizes the functions of the control unit 201, the position/direction estimation unit 204, the on-site situation image generation unit 205, the display unit 206, the work instruction reception unit 207, the direction calculation unit 208, and the text reception unit 209, and executes various processes of the instruction terminal 2.
- The processing circuit 51 may be dedicated hardware as shown in FIG. 3, or, as shown in FIG. 4, it may be a CPU 54 (also called a central processing unit, processing device, arithmetic device, microprocessor, microcomputer, processor, or DSP) that executes a program stored in the memory 55.
- When the processing circuit 51 is dedicated hardware, it is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof. The functions of the control unit 201, the position/direction estimation unit 204, the on-site situation image generation unit 205, the display unit 206, the work instruction reception unit 207, the direction calculation unit 208, and the text reception unit 209 may each be realized by a separate processing circuit 51, or the functions may be realized collectively by a single processing circuit 51.
- When the processing circuit 51 is the CPU 54, the functions of the control unit 201, the position/direction estimation unit 204, the on-site situation image generation unit 205, the display unit 206, the work instruction reception unit 207, the direction calculation unit 208, and the text reception unit 209 are realized by software, firmware, or a combination of software and firmware. Software and firmware are described as programs and stored in the memory 55.
- The processing circuit 51 reads out and executes the programs stored in the memory 55, thereby realizing the function of each unit. That is, the instruction terminal 2 includes the memory 55 for storing programs which, when executed by the processing circuit 51, result in the execution of each step shown in the flowcharts.
- These programs can also be said to cause a computer to execute the procedures and methods of the control unit 201, the position/direction estimation unit 204, the on-site situation image generation unit 205, the display unit 206, the work instruction reception unit 207, the direction calculation unit 208, and the text reception unit 209.
- The memory 55 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, ROM, flash memory, EPROM, or EEPROM, or to a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD, or the like.
- The functions of the control unit 201, the position/direction estimation unit 204, the on-site situation image generation unit 205, the display unit 206, the work instruction reception unit 207, the direction calculation unit 208, and the text reception unit 209 may be realized partly by dedicated hardware and partly by software or firmware.
- For example, the function of the control unit 201 may be realized by the processing circuit 51 as dedicated hardware, while the functions of the position/direction estimation unit 204, the on-site situation image generation unit 205, the display unit 206, the work instruction reception unit 207, the direction calculation unit 208, and the text reception unit 209 may be realized by the processing circuit 51 reading and executing programs stored in the memory 55.
- In this way, the processing circuit 51 can realize the above functions by hardware, software, firmware, or a combination thereof.
- The storage device 52 realizes the function of the storage unit 202.
- The storage device 52 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, flash memory, EPROM, or EEPROM, or to a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD, or the like.
- The communication device 53 realizes the function of the communication unit 203. The communication method and form of the communication device 53 are not limited.
- The display 6 shows the various screens produced by the display unit 206.
- The display 6 only needs to be a monitor device that the work instructor can refer to, such as a liquid-crystal monitor or a tablet device; its display method and form are not limited.
- The input device 7 realizes the function of the input unit 210.
- The input device 7 may be any device that can input characters and coordinate values, such as a keyboard, a mouse, or a touch pen.
- The microphone 8 realizes the function of the voice input unit 211.
- The speaker 9 realizes the function of the audio output unit 212.
- The forms of the microphone 8 and the speaker 9 are not limited.
- A headset in which the microphone 8 and the speaker 9 are integrated, or an earphone microphone in which the microphone 8 is mounted on the earphone cable, may be used.
- The communication relay device 10 secures a communication path from the on-site terminal 1 to the instruction terminal 2 at a remote location.
- The communication relay device 10 may be any device that can connect via a wide-area communication network, such as a wireless LAN, wired LAN, or infrared communication device; its communication method and form are not limited.
- One of the on-site terminal 1 and the instruction terminal 2 may have the hardware configuration shown in FIG. 3 while the other has the configuration shown in FIG. 4. Further, the control arithmetic device 5 may be divided into a plurality of units so that higher-load processing is carried out on a control arithmetic device 5 capable of large-scale computation.
- The on-site terminal 1 is not limited to the configuration shown in FIG. 2; for example, a monocular HMD 3b as shown in FIG. 5 may be used.
- FIG. 5 also illustrates a case where an earphone microphone 4b takes the place of the microphone 41 and the speaker 42.
- First, the communication unit 103 and the communication unit 203 establish communication between the on-site terminal 1 and the instruction terminal 2 (step ST601).
- This communication establishment process may be performed automatically when it is determined, from GPS, from the video of the imaging unit 104, or from wireless LAN communication, that the field worker is located at the work site, or upon a room-entry notification from the work-site security system linked to the worker.
- Next, the on-site terminal 1 captures the scene viewed by the field worker and transmits it to the instruction terminal 2 (step ST602). That is, the imaging unit 104 first captures the scene as viewed by the field worker using the imaging device 32 mounted on the HMD 3.
- The video captured by the imaging unit 104 is preferably a moving image (15 fps or more). However, if hardware resources or communication bandwidth are insufficient, a series of still images captured at a fixed period (4 to 5 fps) may be used instead.
- Then, the communication unit 103 transmits information indicating the video captured by the imaging unit 104 (video data) to the communication unit 203. This video transmission process is performed continuously while communication between the on-site terminal 1 and the instruction terminal 2 is established.
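The frame-rate fallback described above might be sketched as a simple frame-skipping rule; the function and parameter names are assumptions, not the patent's:

```python
def frames_to_send(frame_indices, camera_fps=30, target_fps=15, low_bw=False):
    """Decide which captured frames to transmit.

    Sends roughly target_fps (>= 15 fps moving image) under normal
    conditions, and falls back to ~5 fps still images when hardware or
    bandwidth is short (the text allows 4-5 fps).
    """
    send_fps = 5 if low_bw else target_fps
    step = max(1, camera_fps // send_fps)  # keep every `step`-th frame
    return [i for i in frame_indices if i % step == 0]
```

A real terminal would apply the same decimation to a live camera stream rather than to a list of indices.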
- the instruction terminal 2 uses the video data from the on-site terminal 1 to generate an image indicating the on-site situation including the current position of the on-site worker, and displays the image (step ST603). Details of the on-site situation display process in step ST603 will be described later.
- the on-site status display process is continuously performed while communication between the on-site terminal 1 and the instruction terminal 2 is established.
- the instruction terminal 2 accepts the work instruction for the field worker input by the work instructor and notifies the field terminal 1 (step ST604). Details of the work instruction acceptance process in step ST604 will be described later.
- the on-site terminal 1 displays a screen indicating the work instruction using information indicating the work instruction from the instruction terminal 2 (step ST605). Details of the information presentation process in step ST605 will be described later. Thereafter, the field worker moves to the work position and works according to the screen displayed on the display 33 of the field terminal 1. Then, the above processing is repeated until all operations are completed.
- the communication unit 103 and the communication unit 203 disconnect the communication between the on-site terminal 1 and the instruction terminal 2 (step ST606). Thereby, the work support for the field worker is completed.
- Next, details of the on-site situation display process in step ST603 will be described with reference to FIG. 7.
- the communication unit 203 receives video data from the communication unit 103 (step ST701).
- Next, the position/direction estimation unit 204 estimates the current position of the site worker and the direction in which the site worker is facing (step ST702). At this time, the position/direction estimation unit 204 compares the video indicated by the video data with the work place data stored in advance in the storage unit 202, thereby estimating at which position on the work site the site worker is located and in which direction the site worker is facing.
- FIG. 8 is a diagram illustrating an example of work place data stored in the storage unit 202.
- In the work place data, a device ID 801, a coordinate value 802, RGB data 803, and image feature point data 804 are registered in association with each defined point.
- the device ID 801 is an ID for identifying which device on the work site the defined point belongs to.
- The coordinate value 802 is a coordinate value in three-dimensional space indicating at which position on the work site the defined point is located. Note that the origin of the coordinate system is defined appropriately for each work site, for example, at the center of the entrance/exit of the work site or at a corner of the room.
- the RGB data 803 is color information of defined points, and is obtained from an image captured in advance.
- The image feature point data 804 is data indicating the image feature amount for the defined point, and is calculated based on the RGB data 803 of other points around the defined point. For example, for the set Bi of other points within a predetermined distance around a point A, the distribution of luminance differences between the point A and the set Bi can be defined as the image feature amount of the point A.
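The feature-amount definition above can be sketched as follows. This is a minimal illustration in Python, not the patent's implementation: representing the "distribution of luminance differences" as a normalized histogram, the point format, and the radius value are all assumptions.

```python
import math

def image_feature(point_a, points, radius=0.05, bins=8):
    """Image feature amount for point A: a normalized histogram of the
    luminance differences between A and the set Bi of other points that
    lie within `radius` of A. Each point is ((x, y, z), luminance)."""
    (xa, ya, za), la = point_a
    hist = [0] * bins
    count = 0
    for (xb, yb, zb), lb in points:
        if math.dist((xa, ya, za), (xb, yb, zb)) <= radius:
            d = lb - la                      # luminance difference in [-1, 1]
            k = min(bins - 1, int((d + 1.0) / 2.0 * bins))
            hist[k] += 1
            count += 1
    return [h / count for h in hist] if count else [0.0] * bins
```

Matching a live video frame against such per-point features is what allows the position/direction estimation unit 204 to locate the worker within the pre-registered point cloud.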
- It is assumed that the position/direction estimation unit 204 obtains, as the estimation result, a coordinate value P0(X0, Y0, Z0) indicating the current position of the site worker, a direction vector Vc(Xc, Yc, Zc) representing the direction in which the site worker is facing (the direction of the imaging device 32), a horizontal gradient θH, and a vertical gradient θV.
- Next, the on-site situation image generation unit 205 generates an image indicating the on-site situation including the current position of the on-site worker based on the estimation result by the position/direction estimation unit 204 (step ST703). That is, the on-site situation image generation unit 205 uses the estimation result and the work place data to reproduce the equipment around the work site in a virtual space, and generates an image indicating the current position of the site worker in that virtual space.
- the display unit 206 displays a screen (site status screen) including the image on the display 6 based on the image indicating the site status generated by the site status image generation unit 205 (step ST704).
- FIG. 9 is a diagram illustrating an example of a site situation screen displayed by the display unit 206.
- a site image 901, a virtual site image 902, and operation buttons 903 are displayed.
- the on-site video 901 is a video indicated by video data received by the communication unit 203.
- the virtual site image 902 is an image in which devices around the work site generated by the site situation image generation unit 205 are reproduced in a virtual space.
- the virtual site image 902 shows a frame line 904 indicating which part the site image 901 corresponds to. This frame line 904 makes it possible to grasp the current position of the field worker.
- The operation buttons 903 are button images for moving the viewpoint in the virtual site image 902. With the operation buttons 903 illustrated in FIG. 9, the viewpoint can be moved in the plus/minus direction along each axis (X, Y, Z) and rotated forward or backward about each axis with respect to the virtual site image 902. Alternatively, instead of the button operations, the viewpoint in the virtual site image 902 may be moved by dragging a point on the virtual site image 902 with the mouse.
- Next, details of the work instruction acceptance process in step ST604 will be described with reference to FIG. 10.
- First, when the work instructor requests the start of a work instruction via the input unit 210, the display unit 206 displays a screen for giving a work instruction (work instruction screen) on the display 6 using the image indicating the on-site situation generated by the on-site situation image generation unit 205 (step ST1001).
- the work instruction receiving unit 207 receives information indicating the next work position input by the work instructor via the input unit 210 (step ST1002). At this time, the work instructor designates the next work position using the work instruction screen displayed on the display 6 by the display unit 206.
- FIG. 11 is a diagram illustrating an example of a work instruction screen displayed by the display unit 206.
- a virtual site image 1101 and operation buttons 1102 are displayed.
- the virtual site image 1101 is an image for the work instructor to designate the next work position, and is the same image as the virtual site image 902 in FIG.
- Reference numeral 1103 denotes a frame line indicating which part the site image 901 corresponds to (for grasping the current position of the field worker).
- In the virtual site image 1101, a work position marker 1104 indicating the next work position is added.
- The operation buttons 1102 are button images for moving the work position marker 1104. With the operation buttons 1102 shown in FIG. 11, the work position marker 1104 can be moved in the plus/minus direction along each axis (X, Y, Z). The work instructor operates the operation buttons 1102 to move the work position marker 1104 and designates the work position (coordinate value P1(X1, Y1, Z1)) that the field worker should attend to next. Alternatively, instead of the button operations using the operation buttons 1102 as shown in FIG. 11, the work position marker 1104 may be moved by dragging it with a mouse.
- Next, the direction calculation unit 208 calculates the direction from the current position of the field worker to the next work position based on the estimation result by the position/direction estimation unit 204 and the reception result by the work instruction receiving unit 207 (step ST1003).
- Hereinafter, details of the calculation processing by the direction calculation unit 208 will be described with reference to FIG. 12.
- In the calculation processing by the direction calculation unit 208, as shown in FIG. 12, first, the direction vector Vd(Xd, Yd, Zd) from the current position P0(X0, Y0, Z0) of the field worker to the next work position P1(X1, Y1, Z1) is calculated (step ST1201).
- Next, the direction calculation unit 208 calculates the direction to the next work position based on the calculated direction vector Vd(Xd, Yd, Zd) and the direction in which the field worker is facing (direction vector Vc(Xc, Yc, Zc)) (step ST1202). Specifically, the direction vector Vd(Xd, Yd, Zd) is projected onto a plane having the direction vector Vc(Xc, Yc, Zc) as a normal vector, and the direction θd from the center point of the on-site video (the video captured by the imaging unit 104) is obtained. At this time, the horizontal gradient θH and the vertical gradient θV of the imaging device 32 estimated by the position/direction estimation unit 204 may be corrected in consideration of the inclination of the field worker's head.
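The projection in step ST1202 can be sketched as follows; a minimal Python illustration. The angle convention (measured from the camera's "right" axis, with `right` and `up` supplied as unit vectors spanning the image plane) is an assumption, since the text only states that the direction θd from the center point of the on-site video is obtained.

```python
import math

def project_onto_plane(vd, vc):
    """Project the direction vector Vd onto the plane whose normal is Vc."""
    dot = sum(d * c for d, c in zip(vd, vc))
    norm2 = sum(c * c for c in vc)
    return tuple(d - (dot / norm2) * c for d, c in zip(vd, vc))

def direction_theta_d(vd, vc, right, up):
    """Angle of the projected Vd within the image plane, i.e. the
    direction from the center point of the on-site video."""
    p = project_onto_plane(vd, vc)
    return math.atan2(sum(a * b for a, b in zip(p, up)),
                      sum(a * b for a, b in zip(p, right)))
```

With the camera looking along Vc, the returned angle tells the on-site terminal where on the screen edge the guide arrow should point.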
- the text receiving unit 209 receives information indicating the text input by the work instructor via the input unit 210 (step ST1004).
- the work instructor inputs text while viewing the site situation screen or the work instruction screen displayed by the display unit 206.
- This text may be a character string input by a work instructor using a keyboard, or ink data input using a touch pen. Alternatively, a pre-registered standard sentence may be selected from a selection menu by operating a mouse. Note that if the work instructor determines that no text instruction is required, the text receiving unit 209 does not perform processing.
- the voice input unit 211 receives a voice from the work instructor (step ST1005). At this time, the work instructor inputs voice while looking at the on-site situation screen or the work instruction screen displayed by the display unit 206. If the work instructor determines that no voice instruction is required, the process by the voice input unit 211 is not performed.
- communication unit 203 transmits information related to the work instruction to communication unit 103 (step ST1006).
- At this time, the communication unit 203 transmits information (instruction data) indicating the calculation result by the direction calculation unit 208 to the communication unit 103. When text has been input to the text receiving unit 209, information indicating the text is also transmitted to the communication unit 103. When voice has been input to the voice input unit 211, information indicating the voice is also transmitted to the communication unit 103. Thereafter, the above process is repeated until the work instructor determines that no further work instruction is necessary.
- the communication unit 103 receives information related to a work instruction from the communication unit 203 (step ST1301). At this time, the communication unit 103 receives instruction data from the communication unit 203. In addition, when the text information is transmitted from the communication unit 203, the communication unit 103 also receives the text information. In addition, when audio data is transmitted from the communication unit 203, the communication unit 103 also receives the audio data.
- Based on the work instruction data received by the communication unit 103, the guide image generation unit 105 generates a guide image indicating the direction from the current position of the field worker to the next work position (step ST1302).
- First, it is determined whether the magnitude of the direction vector Vd is greater than or equal to a predetermined threshold THd (step ST1401). That is, the guidance image generation unit 105 determines whether the field worker has reached the next work position by determining whether the magnitude of the direction vector Vd is greater than or equal to the threshold THd. If the guidance image generation unit 105 determines in step ST1401 that the magnitude of the direction vector Vd is less than the threshold THd, it determines that the field worker has reached the next work position and that display of the guidance image is unnecessary, and ends the processing.
- On the other hand, when it is determined in step ST1401 that the magnitude of the direction vector Vd is greater than or equal to the threshold THd, the guidance image generation unit 105 generates a guidance image indicating the direction from the current position of the field worker to the next work position (step ST1402).
- An example of the guidance image is a mark such as an arrow.
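A minimal sketch of the arrival test in step ST1401; the concrete threshold value is an assumption, since the patent only names it THd.

```python
import math

def needs_guidance(vd, thd=0.5):
    """Return True while the guide arrow should still be displayed,
    i.e. while |Vd| >= THd and the field worker has not yet reached
    the next work position (step ST1401)."""
    return math.hypot(*vd) >= thd
```

When this returns False, the on-site terminal simply stops drawing the arrow, which is how the worker learns that the work position has been reached.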
- the display unit 106 displays a screen (information presentation screen) including the guidance image on the display 33 based on the guidance image generated by the guidance image generation unit 105. (Step ST1303).
- display unit 106 displays a screen (information presentation screen) including text indicated by the text information on display 33 (step ST1304).
- FIG. 15 is a diagram illustrating an example of an information presentation screen displayed by the display unit 106.
- a guide image 1501 and text 1502 are displayed.
- an arrow indicating the direction from the current position of the field worker to the next work position is displayed.
- the field worker can move to the next work by looking at the guide image 1501 and the text 1502.
- As described above, the work instructor simply designates the work position, and the direction from the current position of the field worker to the next work position is calculated automatically; the work instructor therefore does not need to indicate the next work position step by step, and smooth communication is possible.
- An overhead view 1601 as shown in FIG. 16 can also be displayed by calculating the overhead-view display direction θd2 in the calculation process of step ST1202 shown in FIG. 12. The display direction θd2 can be obtained by a calculation similar to that for the direction θd, by projecting the direction vector Vd(Xd, Yd, Zd) onto the floor plane.
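The overhead-view direction θd2 can be sketched the same way by projecting Vd onto the floor plane; this assumes a coordinate system in which Z is the vertical axis (the patent fixes the origin per work site but not the axis orientation).

```python
import math

def overhead_direction(vd):
    """Direction theta_d2 of Vd projected onto the floor plane
    (drop the vertical Z component), as an angle from the X axis."""
    xd, yd, _ = vd
    return math.atan2(yd, xd)
```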
- the voice output unit 108 plays back the voice data (step ST1305).
- At this time, the field worker listens to the voice instruction from the work instructor, and likewise asks questions, gives confirmation responses, and so on by voice.
- the voice of the on-site worker is input by the voice input unit 107 and transmitted to the instruction terminal 2 through a route opposite to the instruction voice of the work instructor.
- the work instructor listens to the on-site worker's voice reproduced by the voice output unit 212 of the instruction terminal 2, and determines whether the previous instruction is correctly understood or whether the next instruction should be given.
- As described above, according to the first embodiment, the instruction terminal 2 includes the position/direction estimation unit 204 that estimates the position and direction of the site worker from the video captured by the imaging unit 104 of the on-site terminal 1, the on-site situation image generation unit 205 that generates an image indicating the on-site situation including the position of the site worker from the estimation result by the position/direction estimation unit 204, and the display unit 206 that displays a screen including the image generated by the on-site situation image generation unit 205.
- The on-site terminal 1 includes the guide image generation unit 105 that generates a guide image from the calculation result of the direction calculation unit 208, which calculates the direction to the next work position from the result received by the work instruction receiving unit 207, and the display unit 106 that displays a screen including the image generated by the guide image generation unit 105.
- This makes it possible to give an instruction regarding a work object located outside the shooting angle of view of the imaging unit 104 that captures the on-site video.
- In addition, since the direction from the current position to the next work position can be calculated automatically from the estimation result of the current position and direction of the on-site worker, the work instructor does not need to indicate the direction step by step, and smooth communication is possible. As a result, communication between the field worker and the work instructor is facilitated, and the work is made more efficient.
- Embodiment 2.
- FIG. 17 is a diagram showing an example of the overall configuration of a remote work support apparatus according to Embodiment 2 of the present invention.
- In the remote work support apparatus according to the second embodiment shown in FIG. 17, the work instruction receiving unit 207 of the remote work support apparatus according to the first embodiment shown in FIG. 1 is changed to a work instruction receiving unit 207b, and the direction calculation unit 208 is changed to a direction calculation unit 208b.
- The other configurations are the same and are denoted by the same reference numerals; only the differences are described.
- the work instruction receiving unit 207b receives information indicating a next work position and a route to the work position input by the work instructor via the input unit 210. At this time, the work instructor uses the work instruction screen displayed on the display 6 by the display unit 206 to designate the next work position and a route to the work position.
- The direction calculation unit 208b calculates the direction from the current position of the field worker to the next work position along the route, based on the estimation result by the position/direction estimation unit 204 and the reception result by the work instruction receiving unit 207b.
- the overall process by the remote work support device is the same as the overall process by the remote work support device according to the first embodiment, and a description thereof will be omitted.
- The on-site situation display process and the information presentation process are also the same as those in the first embodiment, and a description thereof is omitted.
- In the work instruction acceptance processing by the instruction terminal 2 in the second embodiment shown in FIG. 18, steps ST1002 and ST1003 of the work instruction acceptance processing by the instruction terminal 2 in the first embodiment shown in FIG. 10 are changed to steps ST1801 and ST1802. The other processes are the same, and their description is omitted.
- step ST1801 the work instruction receiving unit 207b receives information indicating a next work position and a route to the work position input by the work instructor via the input unit 210.
- the work instructor uses the work instruction screen displayed on the display 6 by the display unit 206 to designate the next work position and a route to the work position.
- FIG. 19 is a diagram illustrating an example of a work instruction screen displayed by the display unit 206.
- a virtual site image 1901 and operation buttons 1902 are displayed.
- the virtual site image 1901 is an image for the work instructor to specify the next work position together with the route to the work position, and is the same image as the virtual site image 1101 in FIG.
- a plurality of work route markers 1903 are added.
- This work route marker 1903 is a marker indicating a route to the next work position.
- The operation buttons 1902 are button images for adding and deleting the work route markers 1903 and for moving the work position marker 1104 and the work route markers 1903.
- the direction calculation unit 208b calculates the direction from the current position of the field worker to the next work position along the route based on the estimation result by the position / direction estimation unit 204 and the reception result by the work instruction reception unit 207b. (Step ST1802).
- Hereinafter, details of the calculation processing by the direction calculation unit 208b will be described with reference to FIG. 20.
- In the calculation processing by the direction calculation unit 208b, as shown in FIG. 20, first, the coordinate value Pi(Xi, Yi, Zi) to be used in the calculation is selected based on the current position P0(X0, Y0, Z0) of the field worker, the next work position, and the route to that work position (coordinate values Pi(Xi, Yi, Zi)) (step ST2001).
- For example, while the current position P0(X0, Y0, Z0) of the field worker is moving from the position of the frame line 1103 shown in FIG. 19 to the position of the work route marker 1903a (coordinate value P1(X1, Y1, Z1)), the direction calculation unit 208b selects P1(X1, Y1, Z1) as the calculation target. Thereafter, when the current position P0(X0, Y0, Z0) of the field worker comes within the threshold with respect to P1(X1, Y1, Z1), the direction calculation unit 208b determines that the field worker has reached the position of the work route marker 1903a.
- Next, while the current position P0(X0, Y0, Z0) of the field worker is moving from the position of the work route marker 1903a to the position of the work route marker 1903b (coordinate value P2(X2, Y2, Z2)), the direction calculation unit 208b selects P2(X2, Y2, Z2) as the calculation target. Thereafter, when the current position P0(X0, Y0, Z0) of the field worker comes within the threshold with respect to P2(X2, Y2, Z2), the direction calculation unit 208b determines that the field worker has reached the position of the work route marker 1903b.
- Similarly, while the current position P0(X0, Y0, Z0) of the field worker is moving from the position of the work route marker 1903b to the position of the work position marker 1104 (coordinate value P3(X3, Y3, Z3)), the direction calculation unit 208b selects P3(X3, Y3, Z3) as the calculation target.
- Next, the direction calculation unit 208b calculates the direction vector Vd(Xd, Yd, Zd) from P0(X0, Y0, Z0) to Pi(Xi, Yi, Zi) based on the current position P0(X0, Y0, Z0) of the field worker and the selected coordinate value Pi(Xi, Yi, Zi) (step ST2002). This process is the same as the process in step ST1201 of FIG. 12.
- Next, the direction calculation unit 208b calculates the direction to the next route point or work position based on the calculated direction vector Vd(Xd, Yd, Zd) and the direction in which the field worker is facing (direction vector Vc(Xc, Yc, Zc)) (step ST2003). This process is the same as the process in step ST1202 of FIG. 12.
- When the direction calculation unit 208b determines in step ST2004 that the calculation process has been completed up to the next work position, the sequence ends. On the other hand, when the direction calculation unit 208b determines in step ST2004 that the calculation process has not been completed up to the next work position, the sequence returns to step ST2001 and the above process is repeated.
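The waypoint loop of steps ST2001 to ST2004 can be sketched as follows; a minimal Python illustration in which the route markers and the final work position form a single ordered list of targets, and the arrival threshold is an assumed value.

```python
import math

def select_target_index(p0, targets, idx, reach_threshold=0.3):
    """Step ST2001: advance the current target index while the worker's
    position P0 is within the threshold of the current target Pi.
    `targets` is [route marker 1, ..., route marker n, work position]."""
    while idx < len(targets) - 1 and math.dist(p0, targets[idx]) <= reach_threshold:
        idx += 1
    return idx

def direction_vector(p0, p1):
    """Step ST2002: direction vector Vd from P0 to the selected Pi."""
    return tuple(b - a for a, b in zip(p0, p1))
```

Called once per received position estimate, this keeps aiming the guide arrow at the next unreached marker, then at the work position itself.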
- As described above, according to the second embodiment, the work instruction receiving unit 207b receives information indicating the next work position together with information indicating the route to that work position, and the direction calculation unit 208b calculates the direction to the work position along the route. Therefore, in addition to the effects of the first embodiment, instructions can be given smoothly even when movement to the work position must follow a predetermined route.
- Embodiment 3.
- FIG. 21 is a diagram showing an example of the overall configuration of a remote work support apparatus according to Embodiment 3 of the present invention.
- In the remote work support apparatus according to Embodiment 3 shown in FIG. 21, the direction calculation unit 208 of the remote work support apparatus according to Embodiment 1 shown in FIG. 1 is changed to a direction calculation unit 208c, and the guide image generation unit 105 is changed to a guide image generation unit 105c.
- The other configurations are the same and are denoted by the same reference numerals; only the differences are described.
- The direction calculation unit 208c calculates the direction in three-dimensional space from the current position of the field worker to the next work position, based on the estimation result by the position/direction estimation unit 204 and the reception result by the work instruction receiving unit 207.
- Based on the work instruction data received by the communication unit 103, the guide image generation unit 105c generates an image (guide image) indicating the direction in three-dimensional space from the current position of the field worker to the next work position. An example of the guide image is a mark such as an arrow.
- the overall process by the remote work support device is the same as the overall process by the remote work support device according to the first embodiment, and a description thereof will be omitted.
- the on-site situation display processing is the same as the on-site situation display processing by the instruction terminal 2 in the first embodiment, and thus the description thereof is omitted.
- In the third embodiment, step ST1003 of the work instruction acceptance process by the instruction terminal 2 according to the first embodiment shown in FIG. 10 is changed to step ST2201. The other processes are the same, and their description is omitted.
- In step ST2201, the direction calculation unit 208c calculates the direction in three-dimensional space from the current position of the field worker to the next work position, based on the estimation result by the position/direction estimation unit 204 and the reception result by the work instruction receiving unit 207.
- Hereinafter, details of the calculation processing by the direction calculation unit 208c will be described with reference to FIG. 23. In this calculation processing, first, the direction vector Vd(Xd, Yd, Zd) from the current position P0(X0, Y0, Z0) of the field worker to the next work position P1(X1, Y1, Z1) is calculated (step ST2301).
- Next, the direction calculation unit 208c calculates the direction in three-dimensional space to the next work position (step ST2302). Specifically, the direction vector Vd(Xd, Yd, Zd) is projected, with respect to a plane having the direction vector Vc(Xc, Yc, Zc) as a normal vector, into a direction vector Vdr(Xdr, Ydr, Zdr) for right-eye projection and a direction vector Vdl(Xdl, Ydl, Zdl) for left-eye projection. At this time, the horizontal gradient θH and the vertical gradient θV of the imaging device 32 estimated by the position/direction estimation unit 204 may be corrected in consideration of the inclination of the field worker's head.
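One way to obtain per-eye direction vectors such as Vdr and Vdl is to offset the origin by half the interpupillary distance along the camera's "right" axis before forming the vector toward the work position. This is only an assumption for illustration; the patent does not specify how the right-eye and left-eye vectors are derived from Vd and Vc.

```python
def stereo_direction_vectors(p0, p1, right_axis, ipd=0.064):
    """Direction vectors (Vdr, Vdl) toward the work position P1, seen
    from the right and left eye: P0 shifted by +-ipd/2 along
    `right_axis` (a unit vector; ipd is an assumed interpupillary
    distance in meters)."""
    half = ipd / 2.0
    def toward(origin):
        return tuple(b - a for a, b in zip(origin, p1))
    right_eye = tuple(a + half * r for a, r in zip(p0, right_axis))
    left_eye = tuple(a - half * r for a, r in zip(p0, right_axis))
    return toward(right_eye), toward(left_eye)
```

Rendering the arrow separately from the two slightly different vectors is what gives the binocular HMD display its depth cue.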
- The information presentation process by the on-site terminal 1 in the third embodiment shown in FIG. 24 is obtained by changing step ST1302 of the information presentation process by the on-site terminal 1 in the first embodiment shown in FIG. 13 to step ST2401.
- Other processes are the same, and the description thereof is omitted.
- step ST2401 the guide image generation unit 105c generates a guide image indicating a direction in the three-dimensional space from the current position of the field worker to the next work position based on the work instruction data received by the communication unit 103. Generate.
- Hereinafter, details of the guide image generation processing by the guide image generation unit 105c will be described with reference to FIG. 25.
- In the guide image generation process shown in FIG. 25, only the process for the direction vector Vdr(Xdr, Ydr, Zdr) for right-eye projection is shown.
- In step ST2501, when the guidance image generation unit 105c determines that the magnitude of the direction vector Vdr(Xdr, Ydr, Zdr) is less than the threshold THd, it determines that display of the guidance image is unnecessary and ends the processing. On the other hand, when the guidance image generation unit 105c determines in step ST2501 that the magnitude of the direction vector Vdr(Xdr, Ydr, Zdr) is greater than or equal to the threshold THd, it generates a guide image indicating the direction in three-dimensional space from the current position of the field worker to the next work position (step ST2502).
- An example of the guide image is a mark such as an arrow.
- the same processing as described above is performed on the direction vector Vdl (Xdl, Ydl, Zdl) for left-eye projection.
- display unit 106 displays a screen (information presentation screen) including the guidance image on display 33 (step ST1303).
- a guide image that is a three-dimensional image is displayed on the display 33.
- FIG. 26 is a diagram illustrating an example of an information presentation screen displayed by the display unit 106.
- On the information presentation screen shown in FIG. 26, a guide image 2601 and text 2602 are displayed.
- an arrow indicating the direction from the current position of the field worker to the next work position is displayed three-dimensionally.
- the text 2602 is the same as the text 1502 shown in FIG.
- the field worker can move to the next work by looking at the guide image 2601 and the text 2602.
- As described above, the work instructor simply specifies the work position, and the three-dimensional direction from the current position of the field worker to the next work position is calculated automatically; the work instructor does not need to indicate the position step by step, and smooth communication is possible.
- As described above, according to the third embodiment, the direction calculation unit 208c calculates the direction to the next work position in three-dimensional space, and the guide image generation unit 105c generates a guide image indicating that direction. Therefore, the guide image can be displayed three-dimensionally to the on-site worker, enabling even smoother communication.
- The remote work support apparatus according to the present invention can give instructions regarding a work object located outside the shooting angle of view of the imaging unit that captures the on-site video, and is therefore suitable for use in a remote work support apparatus or the like that includes an on-site terminal having an imaging unit for capturing the video viewed by the site worker and an instruction terminal that transmits and receives information to and from the on-site terminal.
Description
実施の形態1.
図1はこの発明の実施の形態1に係る遠隔作業支援装置の全体構成例を示す図である。
遠隔作業支援装置は、現場の作業者(以下、現場作業者)が非熟練作業者であっても、機械設備の保守点検作業、修正作業又は設置作業等を行うことができるように、熟練作業者である作業指示者が遠隔地から現場の作業を支援するための装置である。この遠隔作業支援装置は、図1に示すように、実際に現場で作業を行う現場作業者が用いる現場用端末1と、作業指示者が遠隔地から現場作業者に対して指示を与えて作業を支援するための指示用端末2とを備えている。
記憶部102は、現場用端末1で用いる情報を記憶するものである。この記憶部102では、例えば、表示部106により後述するディスプレイ33に表示を行うための事前登録情報や、通信部103により送受信される情報等を記憶する。
音声出力部108は、通信部103により音声データが受信された場合に、当該音声データを再生するものである。
記憶部202は、指示用端末2で用いる情報を記憶するものである。この記憶部202では、例えば、位置方向推定部204及び現場状況画像生成部205で用いる作業場所データや、通信部203により送受信される情報を記憶する。なお、作業場所データとは、作業現場に存在する各種機器を3次元座標値の集合である点群データとして定義し、更に、同現場を撮影した映像から得られる画像特徴点を上記点群データと対応付けたデータである。
音声出力部212は、通信部203により音声データが受信された場合に、当該音声データを再生するものである。
まず、現場用端末1のハードウェア構成例について説明する。
現場用端末1の各機能は、図2に示すように、HMD3及びヘッドセット4により実現される。そして、現場作業者は、このHMD3及びヘッドセット4を装着した状態で、作業対象に対して各種作業を行う。なお図2の例では、配電盤11に対して点検作業等を行う場合を示している。
As shown in FIGS. 2 to 4, each function of the instruction terminal 2 is implemented by a control arithmetic device 5, a display 6, an input device 7, a microphone 8, and a speaker 9. The control arithmetic device 5 comprises a processing circuit 51, a storage device 52, and a communication device 53. The microphone 8 and the speaker 9 are omitted from FIG. 2.
Furthermore, the control arithmetic device 5 may be divided into a plurality of devices, and the higher-load processing may be performed on a control arithmetic device 5 capable of large-scale computation.
First, an example of the overall processing performed by the remote work support device will be described with reference to FIG. 6.
In the overall processing by the remote work support device, as shown in FIG. 6, the communication unit 103 and the communication unit 203 first establish communication between the site terminal 1 and the instruction terminal 2 (step ST601). This establishment of communication may be performed automatically when it is determined, from GPS, the video captured by the imaging unit 104, wireless LAN communication, or the like, that the site worker has arrived at the work site, or in response to an entry notification for the site worker linked to the security system of the work site.
Thereafter, the site worker moves to the work position and performs the work in accordance with the screen displayed on the display 33 of the site terminal 1. The above processing is repeated until all the work is completed.
In the site-situation display processing by the instruction terminal 2, as shown in FIG. 7, the communication unit 203 first receives video data from the communication unit 103 (step ST701).
In the work-place data shown in FIG. 8, a device ID 801, a coordinate value 802, RGB data 803, and image feature point data 804 are registered in association with each defined point. The device ID 801 identifies which device at the work site the defined point belongs to. The coordinate value 802 is a coordinate value in three-dimensional space indicating where at the work site the defined point is located. The origin of the coordinate system is defined as appropriate for each work site, for example, at the center of the entrance of the work site or at a corner of the room. The RGB data 803 is color information of the defined point, obtained from video captured in advance. The image feature point data 804 indicates the image feature value of the defined point, and is calculated from, for example, the RGB data 803 of other points in the vicinity of the defined point. For example, for a set Bi of other points within a predetermined distance around a point A, the distribution of luminance differences between the point A and the set Bi can be defined as the image feature value of the point A.
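The example feature above — the distribution of luminance differences between a point A and its neighbours Bi within a fixed distance — can be sketched as follows. The BT.601 luma weights and the sorted-list representation of the "distribution" are assumptions for illustration; the patent does not fix them:

```python
import math

def luminance(rgb):
    """Approximate luma of an (R, G, B) triple (ITU-R BT.601 weights)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def feature_of(point_xyz, point_rgb, cloud, radius=0.5):
    """Image feature of point A: the distribution (here, a sorted list) of
    luminance differences to the neighbouring points within `radius`."""
    la = luminance(point_rgb)
    diffs = []
    for xyz, rgb in cloud:
        d = math.dist(point_xyz, xyz)
        if 0.0 < d <= radius:          # neighbours only, excluding A itself
            diffs.append(luminance(rgb) - la)
    return sorted(diffs)
```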
The site-situation screen shown in FIG. 9 displays a site video 901, a virtual site image 902, and operation buttons 903. The site video 901 is the video indicated by the video data received by the communication unit 203. The virtual site image 902, generated by the site-situation image generation unit 205, reproduces the devices around the work site in a virtual space. The virtual site image 902 also contains a frame 904 indicating which part of it the site video 901 corresponds to; this frame 904 makes it possible to grasp the current position of the site worker. The operation buttons 903 are button images for moving the viewpoint within the virtual site image 902. The operation buttons 903 shown in FIG. 9 allow plus/minus movement along the (X, Y, Z) axes and rotation and counter-rotation about each axis with respect to the virtual site image 902. Instead of button operation using the operation buttons 903 shown in FIG. 9, the viewpoint within the virtual site image 902 may be moved by dragging a point on the virtual site image 902 with the mouse.
In the work-instruction acceptance processing by the instruction terminal 2, as shown in FIG. 10, when the work instructor requests the start of a work instruction via the input unit 210, the display unit 206 first displays on the display 6 a screen for giving work instructions (work instruction screen), using the image showing the site situation generated by the site-situation image generation unit 205 (step ST1001).
The work instruction screen shown in FIG. 11 displays a virtual site image 1101 and operation buttons 1102. The virtual site image 1101 is an image with which the work instructor specifies the next work position, and is similar to the virtual site image 902 of FIG. 9. Reference numeral 1103 denotes a frame indicating which part the site video 901 corresponds to (for grasping the current position of the site worker). A work position marker 1104 indicating the next work position is added to the virtual site image 1101. The operation buttons 1102 are button images for moving the work position marker 1104. The operation buttons 1102 shown in FIG. 11 allow plus/minus movement of the work position marker 1104 along the (X, Y, Z) axes. By operating the operation buttons 1102, the work instructor moves the work position marker 1104 and specifies the work position (coordinate value P1(X1, Y1, Z1)) that the site worker should attend to next. Instead of button operation using the operation buttons 1102 shown in FIG. 11, the work position marker 1104 may be moved by dragging it with the mouse.
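The plus/minus axis buttons that nudge the work position marker P1 amount to a simple per-axis coordinate update. A minimal sketch (function name and step size are illustrative, not from the patent):

```python
def move_marker(pos, axis, sign, step=0.1):
    """Nudge the work position marker along one axis (X=0, Y=1, Z=2),
    mirroring the +/- buttons on the work instruction screen."""
    p = list(pos)
    p[axis] += sign * step
    return tuple(p)
```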
Thereafter, the above processing is repeated until the work instructor determines that no further work instruction is necessary.
In the information presentation processing by the site terminal 1, as shown in FIG. 13, the communication unit 103 first receives information about the work instruction from the communication unit 203 (step ST1301). At this time, the communication unit 103 receives instruction data from the communication unit 203. When text information has been transmitted from the communication unit 203, the communication unit 103 also receives that text information, and when audio data has been transmitted from the communication unit 203, the communication unit 103 also receives that audio data.
The information presentation screen shown in FIG. 15 displays a guidance image 1501 and text 1502. The guidance image 1501 shown in FIG. 15 displays an arrow indicating the direction from the current position of the site worker to the next work position. By looking at the guidance image 1501 and the text 1502, the site worker can move on to the next task.
Since the direction from the current position of the site worker to the next work position is calculated automatically once the work instructor merely specifies the work position, the work instructor need not give step-by-step directions toward the next work position, which enables smooth communication.
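The automatic calculation described above — deriving the direction from the worker's estimated position to the instructor-specified work position — reduces to a normalized difference vector. A minimal sketch (names are illustrative):

```python
import math

def direction_to_next(current, target):
    """Unit vector from the worker's current position to the next
    work position, both given as (X, Y, Z) in the site frame."""
    dx, dy, dz = (t - c for t, c in zip(target, current))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    if norm == 0.0:
        return (0.0, 0.0, 0.0)   # already at the work position
    return (dx / norm, dy / norm, dz / norm)
```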
FIG. 17 is a diagram showing an example of the overall configuration of a remote work support device according to Embodiment 2 of the present invention. The remote work support device according to Embodiment 2 shown in FIG. 17 is obtained by replacing the work instruction acceptance unit 207 of the remote work support device according to Embodiment 1 shown in FIG. 1 with a work instruction acceptance unit 207b, and the direction calculation unit 208 with a direction calculation unit 208b. The other components are the same and are given the same reference numerals; only the differences are described.
The work instruction screen shown in FIG. 19 displays a virtual site image 1901 and operation buttons 1902. The virtual site image 1901 is an image with which the work instructor specifies the next work position together with the route to that work position, and is similar to the virtual site image 1101 of FIG. 11. A plurality of work route markers 1903 are added to the virtual site image 1901; a work route marker 1903 indicates the route leading to the next work position. The operation buttons 1902 are button images for adding and deleting work route markers 1903 and for moving the work position marker 1104 and the work route markers 1903. The operation buttons 1902 shown in FIG. 19 allow addition and deletion of work route markers 1903 and plus/minus movement of the work position marker 1104 and the work route markers 1903 along the (X, Y, Z) axes. By operating the operation buttons 1902, the work instructor adds or deletes work route markers 1903, moves the work position marker 1104 and the work route markers 1903, and thereby specifies the next work position and the route leading to it (coordinate values Pi(Xi, Yi, Zi), i = 1, 2, ..., k). Instead of button operation using the operation buttons 1902 shown in FIG. 19, the positions may be moved by dragging the work position marker 1104 and the work route markers 1903 with the mouse. FIG. 19 shows the case where k = 3: with the site worker in front of switchboard A (at the position of frame 1103), the route to the position of switchboard E, which is the next work position (work position marker 1104), is indicated by work route markers 1903a and 1903b.
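With route markers Pi specified, the guidance direction can follow the route instead of pointing straight at the final work position, for example by always aiming at the first marker not yet reached. This is only one plausible scheme sketched for illustration; the patent does not fix the exact rule:

```python
import math

def next_waypoint(current, route, reach=0.5):
    """Pick the first route marker farther than `reach` from the worker;
    the guidance arrow then points along the route, marker by marker.
    `route` lists the markers P1..Pk ending at the work position."""
    for wp in route:
        if math.dist(current, wp) > reach:
            return wp
    return route[-1]   # all markers passed: head to the work position itself
```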
FIG. 21 is a diagram showing an example of the overall configuration of a remote work support device according to Embodiment 3 of the present invention. The remote work support device according to Embodiment 3 shown in FIG. 21 is obtained by replacing the direction calculation unit 208 of the remote work support device according to Embodiment 1 shown in FIG. 1 with a direction calculation unit 208c, and the guidance image generation unit 105 with a guidance image generation unit 105c. The other components are the same and are given the same reference numerals; only the differences are described.
The direction vector Vdl(Xdl, Ydl, Zdl) for left-eye projection is also processed in the same manner as above.
The information presentation screen shown in FIG. 26 displays a guidance image 2601 and text 2602. The guidance image 2601 shown in FIG. 26 displays, in three dimensions, an arrow pointing in the direction from the current position of the site worker to the next work position. The text 2602 is the same as the text 1502 shown in FIG. 15. By looking at the guidance image 2601 and the text 2602, the site worker can move on to the next task.
Since the three-dimensional direction from the current position of the site worker to the next work position is calculated automatically once the work instructor merely specifies the work position, the work instructor need not give step-by-step directions toward the next work position, which enables smooth communication.
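Rendering the three-dimensional arrow in the worker's view requires transforming the world-frame direction into the HMD's frame. A minimal sketch assuming a yaw-only head orientation — a simplification for illustration; the embodiment also derives per-eye projection vectors such as Vdl:

```python
import math

def to_view_frame(direction, yaw_rad):
    """Rotate a world-frame direction vector (X, Y, Z) into the worker's
    view frame, given the worker's heading as a yaw angle about the Y axis,
    so the 3-D arrow can be drawn relative to the HMD."""
    x, y, z = direction
    c, s = math.cos(-yaw_rad), math.sin(-yaw_rad)
    return (c * x - s * z, y, s * x + c * z)
```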
Claims (5)
- A remote work support device comprising: a site terminal having an imaging unit for capturing video as seen by a worker; and an instruction terminal for transmitting and receiving information to and from the site terminal,
wherein the instruction terminal comprises:
a position/direction estimation unit for estimating a position and an orientation of the worker from the video captured by the imaging unit;
a site-situation image generation unit for generating, from an estimation result of the position/direction estimation unit, an image showing a site situation including the position of the worker;
an instruction-side display unit for displaying a screen including the image generated by the site-situation image generation unit;
a work instruction acceptance unit for accepting information indicating a next work position input by a work instructor on the screen displayed by the instruction-side display unit; and
a direction calculation unit for calculating a direction to the next work position from the estimation result of the position/direction estimation unit and an acceptance result of the work instruction acceptance unit,
and wherein the site terminal comprises:
a guidance image generation unit for generating, from a calculation result of the direction calculation unit, an image indicating the direction to the next work position; and
a site-side display unit for displaying a screen including the image generated by the guidance image generation unit. - The remote work support device according to claim 1, wherein the work instruction acceptance unit accepts the information indicating the next work position together with information indicating a route to the work position, and
the direction calculation unit calculates the direction to the next work position along the route. - The remote work support device according to claim 1, wherein the direction calculation unit calculates the direction to the next work position in three-dimensional space, and
the guidance image generation unit generates a three-dimensional image as the image indicating the direction to the next work position. - An instruction terminal comprising:
a position/direction estimation unit for estimating a position and an orientation of a worker from video as seen by the worker, captured by an imaging unit of a site terminal;
a site-situation image generation unit for generating, from an estimation result of the position/direction estimation unit, an image showing a site situation including the position of the worker;
an instruction-side display unit for displaying a screen including the image generated by the site-situation image generation unit;
a work instruction acceptance unit for accepting information indicating a next work position input by a work instructor on the screen displayed by the instruction-side display unit; and
a direction calculation unit for calculating a direction to the next work position from the estimation result of the position/direction estimation unit and an acceptance result of the work instruction acceptance unit. - A site terminal comprising:
an imaging unit for capturing video as seen by a worker;
a guidance image generation unit for generating an image indicating a direction to a next work position from a calculation result obtained at an instruction terminal, in which a position/direction estimation unit estimates a position and an orientation of the worker from the video captured by the imaging unit, a site-situation image generation unit generates, from an estimation result of the position/direction estimation unit, an image showing a site situation including the position of the worker, an instruction-side display unit displays a screen including the image generated by the site-situation image generation unit, a work instruction acceptance unit accepts information indicating the next work position input by a work instructor on the screen displayed by the instruction-side display unit, and a direction calculation unit calculates the direction to the next work position from the estimation result of the position/direction estimation unit and an acceptance result of the work instruction acceptance unit; and
a site-side display unit for displaying a screen including the image generated by the guidance image generation unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017547024A JP6309176B2 (ja) | 2016-03-15 | 2016-03-15 | 遠隔作業支援装置、指示用端末及び現場用端末 |
US15/772,775 US20180241967A1 (en) | 2016-03-15 | 2016-03-15 | Remote work assistance device, instruction terminal and onsite terminal |
PCT/JP2016/058126 WO2017158718A1 (ja) | 2016-03-15 | 2016-03-15 | 遠隔作業支援装置、指示用端末及び現場用端末 |
TW105120461A TWI579666B (zh) | 2016-03-15 | 2016-06-29 | A remote operation support device, an instruction terminal, and a field terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/058126 WO2017158718A1 (ja) | 2016-03-15 | 2016-03-15 | 遠隔作業支援装置、指示用端末及び現場用端末 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017158718A1 true WO2017158718A1 (ja) | 2017-09-21 |
Family
ID=59241087
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/058126 WO2017158718A1 (ja) | 2016-03-15 | 2016-03-15 | 遠隔作業支援装置、指示用端末及び現場用端末 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20180241967A1 (ja) |
JP (1) | JP6309176B2 (ja) |
TW (1) | TWI579666B (ja) |
WO (1) | WO2017158718A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020080147A (ja) * | 2018-11-13 | 2020-05-28 | 株式会社東芝 | 保全活動サポートシステムおよび保全活動サポート方法 |
WO2021033400A1 (ja) * | 2019-08-21 | 2021-02-25 | ソニー株式会社 | 情報処理装置、情報処理方法および記録媒体 |
WO2023286115A1 (ja) * | 2021-07-12 | 2023-01-19 | 日本電気株式会社 | 表示制御装置、表示システム、表示方法及びコンピュータ可読媒体 |
JP2023016589A (ja) * | 2021-07-21 | 2023-02-02 | 東芝デジタルエンジニアリング株式会社 | 点検作業順表示装置および点検作業支援システム |
WO2023073775A1 (ja) * | 2021-10-25 | 2023-05-04 | 三菱電機株式会社 | 情報処理装置、情報処理方法、及び、情報処理プログラム |
WO2023218740A1 (ja) * | 2022-05-13 | 2023-11-16 | 株式会社Nttドコモ | 表示制御システムおよびウェアラブル装置 |
US11892822B2 (en) | 2021-01-08 | 2024-02-06 | Mitsubishi Electric Corporation | Maintenance support system, maintenance support method and maintenance management server |
US12051218B2 (en) | 2021-08-17 | 2024-07-30 | Fujifilm Business Innovation Corp. | Remote support system, terminal device, and remote device |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210102820A1 (en) * | 2018-02-23 | 2021-04-08 | Google Llc | Transitioning between map view and augmented reality view |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002132487A (ja) * | 2000-10-25 | 2002-05-10 | Oki Electric Ind Co Ltd | 遠隔作業支援システム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3265893B2 (ja) * | 1995-02-13 | 2002-03-18 | 株式会社日立製作所 | 画像表示装置 |
JP2002027567A (ja) * | 2000-07-12 | 2002-01-25 | Hitachi Kokusai Electric Inc | 半導体製造装置のリモート操作システム、半導体製造装置および遠隔操作装置 |
JP4316210B2 (ja) * | 2002-08-27 | 2009-08-19 | 東京エレクトロン株式会社 | 保守システム,基板処理装置及び遠隔操作装置 |
US20040093516A1 (en) * | 2002-11-12 | 2004-05-13 | Hornbeek Marc William Anthony | System for enabling secure remote switching, robotic operation and monitoring of multi-vendor equipment |
US8254713B2 (en) * | 2005-11-11 | 2012-08-28 | Sony Corporation | Image processing apparatus, image processing method, program therefor, and recording medium in which the program is recorded |
-
2016
- 2016-03-15 JP JP2017547024A patent/JP6309176B2/ja not_active Expired - Fee Related
- 2016-03-15 US US15/772,775 patent/US20180241967A1/en not_active Abandoned
- 2016-03-15 WO PCT/JP2016/058126 patent/WO2017158718A1/ja active Application Filing
- 2016-06-29 TW TW105120461A patent/TWI579666B/zh not_active IP Right Cessation
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002132487A (ja) * | 2000-10-25 | 2002-05-10 | Oki Electric Ind Co Ltd | 遠隔作業支援システム |
Non-Patent Citations (1)
Title |
---|
TOMOAKI ADACHI: "A Telepresence System using Live Video Projection onto a 3D Scene Model", IEICE TECHNICAL REPORT, vol. 104, no. 490, 17 January 2005 (2005-01-17), pages 7 - 12, XP055420430 * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020080147A (ja) * | 2018-11-13 | 2020-05-28 | 株式会社東芝 | 保全活動サポートシステムおよび保全活動サポート方法 |
JP7337654B2 (ja) | 2018-11-13 | 2023-09-04 | 株式会社東芝 | 保全活動サポートシステムおよび保全活動サポート方法 |
WO2021033400A1 (ja) * | 2019-08-21 | 2021-02-25 | ソニー株式会社 | 情報処理装置、情報処理方法および記録媒体 |
US11892822B2 (en) | 2021-01-08 | 2024-02-06 | Mitsubishi Electric Corporation | Maintenance support system, maintenance support method and maintenance management server |
WO2023286115A1 (ja) * | 2021-07-12 | 2023-01-19 | 日本電気株式会社 | 表示制御装置、表示システム、表示方法及びコンピュータ可読媒体 |
JP2023016589A (ja) * | 2021-07-21 | 2023-02-02 | 東芝デジタルエンジニアリング株式会社 | 点検作業順表示装置および点検作業支援システム |
JP7410417B2 (ja) | 2021-07-21 | 2024-01-10 | 東芝デジタルエンジニアリング株式会社 | 点検作業順表示装置および点検作業支援システム |
US12051218B2 (en) | 2021-08-17 | 2024-07-30 | Fujifilm Business Innovation Corp. | Remote support system, terminal device, and remote device |
WO2023073775A1 (ja) * | 2021-10-25 | 2023-05-04 | 三菱電機株式会社 | 情報処理装置、情報処理方法、及び、情報処理プログラム |
JPWO2023073775A1 (ja) * | 2021-10-25 | 2023-05-04 | ||
JP7374396B2 (ja) | 2021-10-25 | 2023-11-06 | 三菱電機株式会社 | 情報処理装置、情報処理方法、及び、情報処理プログラム |
WO2023218740A1 (ja) * | 2022-05-13 | 2023-11-16 | 株式会社Nttドコモ | 表示制御システムおよびウェアラブル装置 |
Also Published As
Publication number | Publication date |
---|---|
JP6309176B2 (ja) | 2018-04-11 |
US20180241967A1 (en) | 2018-08-23 |
TW201809934A (zh) | 2018-03-16 |
TWI579666B (zh) | 2017-04-21 |
JPWO2017158718A1 (ja) | 2018-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6309176B2 (ja) | 遠隔作業支援装置、指示用端末及び現場用端末 | |
KR101566543B1 (ko) | 공간 정보 증강을 이용하는 상호 인터랙션을 위한 방법 및 시스템 | |
JP6249248B2 (ja) | 投影装置 | |
US20150116502A1 (en) | Apparatus and method for dynamically selecting multiple cameras to track target object | |
TWI400940B (zh) | 遠端控制軌道式攝影裝置的手持式裝置及方法 | |
CN104982031B (zh) | 处理对象图像生成装置、处理对象图像生成方法及操作支援系统 | |
JP2018112789A (ja) | 情報処理システム、情報処理プログラム、情報処理装置、情報処理方法、ゲームシステム、ゲームプログラム、ゲーム装置、及びゲーム方法 | |
JP5708051B2 (ja) | 映像処理装置、映像処理システム、テレビ会議システム、遠方監視システム、映像処理方法、及び撮像装置 | |
JP2004128997A (ja) | 映像遠隔制御装置,映像遠隔制御方法,映像遠隔制御プログラムおよび映像遠隔制御プログラムを記録した記録媒体 | |
US20140210957A1 (en) | Stereoscopic imaging apparatus and method of displaying in-focus state confirmation image | |
US11494149B2 (en) | Display system, information processing device, display control method of display system | |
JP4199641B2 (ja) | プロジェクタ装置 | |
JP2009200697A (ja) | 画像送信装置、画角制御方法、画像受信装置、画像表示システム、画像表示方法 | |
JP2014039166A (ja) | 自動追尾カメラの制御装置及びそれを備える自動追尾カメラ | |
JP6494060B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
KR101700360B1 (ko) | 공통화각 표시 기능을 갖는 디지털 촬영 장치, 이의 제어 방법 및 상기 방법을 기록한 기록 매체 | |
JP4839858B2 (ja) | 遠隔指示システム及び遠隔指示方法 | |
JP2005333628A (ja) | カメラ制御装置およびこれを用いた監視カメラシステム | |
JP5508308B2 (ja) | 数値制御機器のテレビカメラモニター画面の関連操作方法 | |
JP2016201007A (ja) | 搬送設備の遠隔メンテナンスシステム | |
JP2015053734A (ja) | プロジェクター、画像投写システムおよび画像投写方法 | |
JP2013238891A (ja) | プロジェクター、画像投写システムおよび画像投写方法 | |
WO2021131325A1 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP4556944B2 (ja) | 投影装置、測距処理方法及びプログラム | |
JPS6231272A (ja) | 雲台制御装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017547024 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15772775 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16894342 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16894342 Country of ref document: EP Kind code of ref document: A1 |