WO2021100331A1 - Remote Work Support System
- Publication number: WO2021100331A1 (PCT/JP2020/037310)
- Authority: WIPO (PCT)
- Prior art keywords: supporter, wearable device, image, worker, work support
Classifications
- G06F3/0481 — GUI interaction techniques based on properties of the displayed object or a metaphor-based environment
- G06F3/011 — Arrangements for interaction with the human body, e.g. user immersion in virtual reality
- G02B27/017 — Head-up displays, head mounted
- G05B19/042 — Program control other than numerical control, using digital processors
- G06F1/163 — Wearable computers, e.g. on a belt
- G06F3/017 — Gesture-based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0484 — GUI techniques for control of specific functions or operations
- G06F3/167 — Audio in a user interface, e.g. voice commands, audio feedback
- H04N7/142 — Videophone systems: constructional details of terminal equipment
- H04N7/147 — Videophone systems: communication arrangements
- H04N7/15 — Conference systems
- G02B2027/0138 — Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014 — Head-up displays comprising information/image processing systems
- G02B2027/0187 — Display position slaved to motion of part of the user's body, e.g. head, eye
- G05B2219/24019 — Computer-assisted maintenance
- G05B2219/32014 — Augmented reality assists operator in maintenance, repair, programming, assembly; head-mounted display with voice and gesture command
- G05B2219/32226 — Computer-assisted repair, maintenance of system components
- G10L2015/221 — Announcement of speech recognition results
- H04M11/10 — Telephonic systems combined with dictation recording and playback
- H04M2203/053 — OAM&P: remote terminal provisioning, e.g. of applets
- H04M2203/1016 — Telecontrol
Definitions
- Patent Document 1 (Japanese Unexamined Patent Application Publication No. 2018-207420) discloses a work support system in which a worker without a display device can still confirm a work object through voice communication.
- the remote work support system according to the first aspect includes a wearable device, a mobile terminal, and a supporter terminal.
- Wearable devices are worn by field workers.
- the wearable device includes an imaging unit, an audio input unit, and an audio output unit.
- the wearable device transmits images and sends and receives audio.
- the mobile terminal is carried by a field worker separately from the wearable device.
- the mobile terminal has a display unit that displays the received image.
- the supporter terminal is used by a supporter who remotely supports the field worker.
- the supporter terminal has a voice input unit, a voice output unit, and a display unit.
- the supporter terminal transmits and receives voice to and from the wearable device, receives images from the wearable device, and transmits images to the mobile terminal.
- because the wearable device has no display, the field worker retains a sufficient field of view and can work safely.
- because images can be displayed on the mobile terminal, the field worker can view the content of the remote supporter's instructions on the screen of the display unit.
- the remote work support system according to the second aspect is the system of the first aspect, wherein the wearable device is a neck-mounted device.
- because the wearable device is neck-mounted, the field worker can work hands-free and efficiently without the field of vision being obstructed.
- the remote work support system according to the third aspect is the system of the first or second aspect, wherein the display unit of the supporter terminal displays moving images and still images captured by the imaging unit of the wearable device.
- the remote work support system according to the fourth aspect is the system of any of the first to third aspects, wherein an image captured by the imaging unit of the wearable device is selected, processed, or annotated with information on the supporter terminal, transmitted to the mobile terminal, and displayed on the display unit of the mobile terminal.
- the field worker can thus receive support through processed, information-annotated images of the site without having to deliberately capture images, which increases work efficiency.
- the remote work support system according to the fifth aspect is the system of any of the first to fourth aspects, wherein the utterances of the field worker and the supporter are converted to text by speech recognition and displayed as text information on the display unit of the supporter terminal.
- because the supporter can provide support while confirming the conversation with the field worker as text, note-taking becomes unnecessary and the work proceeds efficiently.
- the remote work support system according to the sixth aspect is the system of the fifth aspect, wherein the text information is further displayed on the display unit of the mobile terminal.
- the field worker can confirm the conversation with the supporter as text.
- the remote work support system according to the seventh aspect is the system of any of the first to sixth aspects, and has a first line and a second line.
- the first line connects the supporter terminal and the wearable device.
- the second line connects the supporter terminal and the mobile terminal.
- the first line is used continuously during work by field workers.
- the second line is connected as needed during work by field workers.
- because the remote work support system of the seventh aspect uses the two lines for distinct purposes and connects the second line only when necessary, the field worker is spared unnecessary operation of the mobile terminal.
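The two-line arrangement described above can be sketched as a small connection manager: the first line stays open for the whole work session, while the second line is opened only when an image is pushed to the mobile terminal. This is an illustrative Python sketch; the class and method names are assumptions, not taken from the patent.

```python
class LineManager:
    """Sketch of the first-line / second-line usage described above.

    The first line (supporter terminal <-> wearable device) stays
    connected for the whole work session; the second line (supporter
    terminal <-> mobile terminal) is opened only on demand.
    """

    def __init__(self):
        self.first_line_open = False
        self.second_line_open = False

    def start_work(self):
        # The first line is used continuously during the work.
        self.first_line_open = True

    def push_image_to_mobile(self, image):
        # The second line is connected only as needed, so the worker
        # never has to operate the mobile terminal unnecessarily.
        if not self.second_line_open:
            self.second_line_open = True  # connect on demand
        return {"line": "second", "payload": image}

    def end_work(self):
        self.first_line_open = False
        self.second_line_open = False


mgr = LineManager()
mgr.start_work()
assert mgr.first_line_open and not mgr.second_line_open
msg = mgr.push_image_to_mobile("annotated.png")
assert mgr.second_line_open and msg["line"] == "second"
```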
- the remote work support system according to the eighth aspect is the system of any of the first to seventh aspects, wherein the progress of the field worker's work is automatically extracted from images captured by the imaging unit and/or from the utterances of the field worker and the supporter. The progress status or the end time of the work is estimated from the automatically extracted images and utterances, and the estimated progress status or end time is displayed on the display unit of the supporter terminal.
- the remote work support system of the eighth aspect allows the supporter to monitor the progress of the work.
- the remote work support system according to the ninth aspect is the system of any of the first to eighth aspects, wherein the supporter terminal accesses a database storing on-site work support information, extracts and processes that information, and transmits it to the mobile terminal.
- because the field worker receives the extracted and processed on-site work support information, the work can proceed efficiently.
- the remote work support system according to the tenth aspect is the system of any of the first to ninth aspects, wherein the support provided from the supporter terminal to the field worker includes support by voice, support by processed images, support by text entered on the supporter terminal, support by text transcribed from the supporter's utterances, or a combination of these.
- the remote work support system according to the eleventh aspect is the system of any of the first to tenth aspects, and includes a plurality of devices connectable to the same server.
- the plurality of devices include wearable devices, mobile terminals, and supporter terminals.
- the server designates at least one of the plurality of devices based on a predetermined condition, and calls the designated device to establish a connection.
- the remote work support system according to the twelfth aspect is the system of the eleventh aspect, wherein the server acquires environmental information about the field worker's work site.
- the server calls at least one of the plurality of devices based on the environmental information.
- the remote work support system according to the thirteenth aspect is the system of the eleventh aspect, wherein the server manages the progress of the field worker's work.
- the server calls at least one of the plurality of devices based on that progress.
- the remote work support system according to the fourteenth aspect is the system of the eleventh aspect, wherein the server applies speech recognition to the utterances of the field worker and the supporter, and calls at least one of the plurality of devices based on keywords obtained through the recognition.
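The twelfth to fourteenth aspects describe a server that selects which device to call based on a predetermined condition: environmental information, work progress, or recognized keywords. A minimal dispatch rule might look like the following sketch; the condition names, thresholds, and keyword lists are illustrative assumptions, not values from the patent.

```python
def select_devices(condition: dict) -> list:
    """Pick which devices the server should call, per the condition
    types named in the patent (environmental information, work
    progress, recognized keywords). All thresholds are assumptions."""
    devices = []
    # Noisy site: voice is unreliable, prefer the mobile terminal's screen.
    if condition.get("noise_db", 0) > 85:
        devices.append("mobile_terminal")
    # Keywords found in recognized speech can also trigger a call.
    keywords = condition.get("keywords", [])
    if "help" in keywords or "support" in keywords:
        devices.append("wearable_device")
    # Near the end of the work, call the supporter terminal for review.
    if condition.get("progress", 0.0) >= 0.9:
        devices.append("supporter_terminal")
    return devices


assert select_devices({"noise_db": 90}) == ["mobile_terminal"]
assert select_devices({"keywords": ["help"]}) == ["wearable_device"]
```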
- the remote work support system according to the fifteenth aspect is the system of any of the eleventh to fourteenth aspects, wherein the server further accepts, from the supporter terminal, a designation of at least one of the plurality of devices used by the field worker.
- the remote work support system according to the sixteenth aspect is the system of any of the first to fifteenth aspects, wherein the plurality of devices can communicate with one another.
- the supporter terminal according to the seventeenth aspect is used by a supporter who remotely supports field workers.
- the supporter terminal has a voice input unit, a voice output unit, and a display unit.
- the supporter terminal transmits and receives audio to and from the wearable device, and receives an image from the wearable device.
- the wearable device is attached to a field worker and has an imaging unit, an audio input unit, and an audio output unit.
- the wearable device transmits images and sends and receives audio.
- the supporter terminal transmits an image to the mobile terminal.
- the mobile terminal is carried by a field worker separately from the wearable device.
- the mobile terminal has a display unit that displays an image received by the image receiving unit.
- the server according to the eighteenth aspect is a server that connects to a plurality of devices.
- the plurality of devices include wearable devices, mobile terminals, and supporter terminals.
- the wearable device is attached to a field worker, has an image pickup unit, an audio input unit, and an audio output unit, transmits an image, and transmits / receives audio.
- the mobile terminal is carried by the field worker separately from the wearable device and has a display unit that displays an image received by the image receiving unit.
- the supporter terminal is used by a supporter who remotely supports a field worker, and has a voice input unit, a voice output unit, and a display unit.
- the server designates at least one of the plurality of devices based on a predetermined condition, and calls the designated device to communicate with.
- the remote work support system 1 of the first embodiment has a wearable device 10, a mobile terminal 20, a server 30, a supporter terminal 40, and a database 50.
- the wearable device 10 and the mobile terminal 20 are held by field workers "AAA_aaa”, “BBB_bbb”, and "CCC_ccc", respectively.
- the field workers are multiple workers dispatched separately to multiple sites. In FIG. 2, three workers are dispatched to three sites, but the number of sites and the number of workers are not limited to this.
- the supporter terminal 40 is arranged for use by a supporter at a location remote from the field workers.
- the wearable device 10, the mobile terminal 20, the server 30, the supporter terminal 40, and the database 50 are connected to a common network and can communicate with each other.
- in the following, a field worker may simply be referred to as a worker; the meaning is the same.
- here, work means any of equipment repair, maintenance work, on-site surveys for equipment installation, and installation work.
- the wearable device 10 is a neck-mounted device as shown in FIG.
- the wearable device 10 has an annular shape with an opening in part of the ring. The worker slips the device over the neck through the opening and wears it with the opening at the front.
- because the wearable device 10 is a neck-mounted type, the worker can work hands-free while wearing it. Also, unlike a glasses-type device, it does not affect the field of vision.
- the wearable device 10 includes a main body 19, an imaging unit 11, a voice input unit 12, a voice output unit 13, a gesture sensor 14, a control unit 15, a communication unit 16, and a power switch (not shown).
- An imaging unit 11 and a gesture sensor 14 are arranged at both ends of the main body 19.
- the imaging unit 11 is a camera.
- the camera continuously captures the front of the worker during the work of the worker.
- the captured image is a moving image, but may instead be a still image.
- the captured image is sent to the server 30 by the communication unit 16 via the network.
- the image sent to the server 30 is stored in the storage unit 32 of the server 30 and, at the same time, displayed on the display unit 43 of the supporter terminal 40.
- the gesture sensor 14 has a switch function.
- the wearer (worker) can operate the wearable device by making a sign in front of the gesture sensor.
- for example, the gesture sensor turns imaging by the imaging unit 11 on and off. It may also be used by the worker to call the supporter via the supporter terminal 40.
- the wearable device 10 may have a mechanical or electronic switch in place of, or in addition to, the gesture sensor 14.
- the voice input unit 12 is a microphone.
- the microphone has directivity and preferentially collects the sound generated by the wearer.
- the voice input unit 12 may be used, instead of or together with the gesture sensor 14, for the worker to call the supporter via the supporter terminal 40. For example, when the worker utters "Please help me", the utterance reaches the server 30, where it is recognized by speech recognition and converted into text.
- the server 30 identifies from the transcribed voice data that the worker is requesting assistance, and blinks the worker's name "AAA aaa" in the worker identification information 112 on the supporter terminal 40. The supporter thereby knows that the worker "AAA aaa" is seeking support.
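The support-request detection described above can be sketched as a simple keyword scan over the transcribed utterance: if a request phrase is found, the server marks the worker's name for blinking in the supporter terminal's worker list. The phrase list and the return format below are illustrative assumptions.

```python
# Request phrases the server might look for in transcribed speech.
# These particular phrases are assumptions for illustration.
REQUEST_PHRASES = ("please help", "help me", "need support")


def check_support_request(worker: str, transcript: str) -> dict:
    """Return a UI instruction for the supporter terminal: blink the
    worker's entry in the worker identification information (112)
    when the transcript contains a support request."""
    text = transcript.lower()
    if any(phrase in text for phrase in REQUEST_PHRASES):
        return {"worker": worker, "blink": True}
    return {"worker": worker, "blink": False}


assert check_support_request("AAA aaa", "Please help me")["blink"] is True
assert check_support_request("BBB bbb", "work complete")["blink"] is False
```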
- the audio output unit 13 is a speaker.
- the audio output unit 13 is arranged behind the body of the wearer (worker) of the wearable device 10.
- the wearable device 10 of the present embodiment includes a voice input unit 12, a voice output unit 13, and a communication unit 16 for transmitting and receiving voice. Therefore, the worker who wears the wearable device 10 can use the device as a telephone to talk with the supporter.
- the communication unit 16 is connected to the network by the first line T1.
- the voice input through the voice input unit 12 and the image captured by the imaging unit 11 can be transmitted to the network as electronic data. The communication unit 16 also receives voice as electronic data from the network.
- the control unit 15 is a computer.
- the control unit 15 has a processor and a storage unit.
- the control unit 15 controls the image pickup unit 11, the voice input unit 12, the voice output unit 13, the gesture sensor 14, and the communication unit 16.
- the control unit 15 receives and executes control commands input via the gesture sensor 14.
- the control unit 15 also executes commands from the server 30 and commands input by the supporter through operation of the supporter terminal 40.
- the mobile terminal 20 is carried or held by an operator.
- the mobile terminal 20 may be a smartphone, a tablet, a PC, or the like.
- the mobile terminal 20 has a display unit 21, a communication unit 22, and a control unit 23.
- the mobile terminal 20 is connected to the network by the second line T2.
- the communication unit 22 receives an image from the network.
- the image may be a moving image or a still image.
- the display unit 21 is a display on which images received from the network are displayed. An image processed by the supporter on the supporter terminal 40 is transmitted to the mobile terminal 20 via the network and displayed on the display unit 21.
- the control unit 23 is a computer.
- the control unit 23 has a processor and a storage unit.
- the control unit 23 controls the display unit 21 and the communication unit 22.
- the mobile terminal 20 need not be constantly connected to the supporter terminal 40 (or the server 30) while the worker works. It suffices to connect only when an image (including text information) is sent from the supporter terminal 40 to the mobile terminal 20.
- the server 30 is a computer.
- the server 30 has a processor 31 and a storage unit 32.
- Applications and data are stored in the storage unit 32.
- the application of the remote work support system 1 stored in the storage unit 32 is read into the processor 31 and executed.
- the application of the remote work support system 1 is a WEB application.
- a browser is launched, and the application of the remote work support system 1 is accessed from the browser.
- when the wearable device 10 is activated by the worker, it connects to the server 30 and transmits the image captured by its imaging unit 11 to the server 30.
- the captured image is stored in the storage unit 32 of the server 30.
- the captured images can be viewed at the same time on the display unit 43 of the supporter terminal 40 by launching the browser on the supporter terminal 40.
- during the worker's work, the application of the remote work support system 1 enables a two-way call between the worker and the supporter in response to a connection request from the wearable device 10 or from the supporter terminal 40.
- the server 30 converts the voice obtained from the wearable device 10 and the voice obtained from the supporter terminal 40 into text information by voice recognition.
- the text information is stored in the storage unit 32 for each work session. It is also transmitted to the supporter terminal 40 and displayed on the display unit 43 of the supporter terminal 40, and, as needed, transmitted to the mobile terminal 20 and displayed on the display unit 21 of the mobile terminal 20.
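The routing of text information described above can be sketched as a small fan-out: every transcript is stored and shown on the supporter terminal, and forwarded to the mobile terminal only as needed. The function name and list-based stand-ins for the storage unit and displays are illustrative assumptions.

```python
def route_transcript(text, store, supporter_ui, mobile_ui,
                     send_to_mobile=False):
    """Sketch of the text-information routing described above: every
    transcript is stored and shown on the supporter terminal; it is
    forwarded to the mobile terminal only on demand."""
    store.append(text)         # storage unit 32, kept per work session
    supporter_ui.append(text)  # utterance record on display unit 43
    if send_to_mobile:
        mobile_ui.append(text)  # display unit 21, only as needed


store, sup, mob = [], [], []
route_transcript("tighten the valve", store, sup, mob)
route_transcript("done", store, sup, mob, send_to_mobile=True)
assert store == ["tighten the valve", "done"]
assert mob == ["done"]
```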
- the server 30 collects the voice data of all connected wearable devices 10, converts it into text information, and stores it in the storage unit 32. If the voice from the worker "BBB bbb", who is connected to the supporter terminal 40 but not currently receiving support, contains wording requesting support, the server 30 blinks the worker's name "BBB bbb" in the worker identification information 112 of the supporter terminal 40.
- the server 30 may further have an automatic translation function that translates the recognized voice data into another language. For example, when the field worker speaks English and the remote supporter speaks Japanese, the worker's English may be delivered to the supporter as Japanese speech, or displayed in Japanese in the utterance record 116 on the display unit 43 of the supporter terminal 40. English and Japanese may also be displayed together.
- images generated during work support, voice information converted into text, and images processed by the supporter are stored in the storage unit 32 of the server 30. This information may instead be stored elsewhere, for example in the database 50 or in the storage unit of the supporter terminal 40.
- the database 50 includes a storage unit for storing electronic data.
- Database 50 is connected to the network.
- the database 50 stores on-site work support information used for equipment repair and maintenance.
- the storage of on-site work support information is not limited to the database 50; it may also be stored in the storage unit 32 of the server 30 or in the storage unit of the supporter terminal 40.
- the supporter terminal 40 is used by a supporter who remotely supports field workers.
- the supporter terminal 40 may be a PC.
- the supporter terminal 40 has a voice input unit 41, a voice output unit 42, a display unit 43, a communication unit 44, and a control unit 45.
- the voice input unit 41 is a microphone.
- the audio output unit 42 is a speaker.
- the supporter terminal 40 of the present embodiment has a voice input unit 41, a voice output unit 42, and a communication unit 44 for transmitting and receiving voice. Therefore, the supporter can use the supporter terminal 40 as a telephone to talk with the on-site worker.
- the communication unit 44 is connected to the network and receives voice as electronic data from it. The communication unit 44 further receives the text information produced by speech recognition on the server 30, and the images captured by the imaging unit 11 of the wearable device 10. Conversely, the communication unit 44 can transmit to the network, as electronic data, the voice input through the voice input unit 41 and images processed by the supporter on the supporter terminal 40.
- the display unit 43 is a display.
- the control unit 45 is a computer.
- the control unit 45 has a processor and a storage unit.
- the control unit 45 controls the voice input unit 41, the voice output unit 42, the display unit 43, and the communication unit 44.
- a browser (program) is stored in the storage unit of the control unit 45.
- the control unit 45 reads and executes the browser program. Because the application of the remote work support system 1 is executed on the server 30, work support using the supporter terminal 40 can be performed through the browser in the control unit 45 of the supporter terminal 40.
- a plurality of communication protocols may be used selectively for communication among the wearable device 10 of the present embodiment, the mobile terminal 20, the server 30, and the supporter terminal 40.
- WebRTC is used as the video and audio communication protocol.
- WebRTC is a standard protocol; it is secure, but offers little design freedom (only parameters such as resolution can be changed).
- HTTPS, MQTT, or WebSocket may be used as the communication protocol for still images.
- These communication protocols can be designed freely, can handle anything including moving images, text, and still images, and allow greater design freedom on the server side.
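To make the protocol split above concrete, the following is a minimal Python sketch of how a still image could be packaged as a self-describing text message for transmission over HTTPS, MQTT, or WebSocket, separately from the WebRTC media stream. The function names and message fields are illustrative assumptions, not part of the patent.

```python
import base64
import json

def build_snapshot_message(image_bytes: bytes, worker_id: str, work_id: str) -> str:
    """Package a still image (snapshot) as a JSON text message.

    Unlike the WebRTC media stream, a still image sent over HTTPS, MQTT,
    or WebSocket can carry arbitrary server-defined metadata, which is
    the design freedom the text refers to.
    """
    payload = {
        "type": "snapshot",
        "worker_id": worker_id,
        "work_id": work_id,
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    }
    return json.dumps(payload)

def parse_snapshot_message(message: str) -> bytes:
    """Recover the raw image bytes on the receiving side."""
    payload = json.loads(message)
    assert payload["type"] == "snapshot"
    return base64.b64decode(payload["image_b64"])
```

Such a message could be POSTed over HTTPS, published to an MQTT topic, or sent as a WebSocket text frame without changing its structure.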
- On the screen of the display unit 43, the work progress 111, the worker identification information 112, the live image 113 of the site, the supporter-processed image 114, the reference list 115, and the utterance record 116 are displayed.
- the work progress 111 represents the number of work sites (or the number of workers) that the supporter using the supporter terminal is involved in "today". Here, it is shown that there are two works that have already been completed "today”.
- the work progress status (which may be indicated as a percentage) and the expected end time of the work may also be displayed in the work progress 111.
- the server 30 may automatically estimate the progress status or the end time of the work by automatically extracting information from the image captured by the imaging unit 11 of the wearable device 10 and/or from the utterances of the worker and the supporter.
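As one illustrative way such an estimate could be computed (the keyword matching and scoring below are assumptions, not the patent's method), the server could score progress as the fraction of known work steps whose keywords have already appeared in the recognized utterances:

```python
def estimate_progress(utterances, step_keywords):
    """Estimate work progress as the fraction of known work steps
    whose keywords have appeared in the recognized utterances.

    utterances: list of transcribed utterance strings
    step_keywords: one keyword per expected work step (illustrative)
    """
    if not step_keywords:
        return 0.0
    spoken = " ".join(utterances).lower()
    done = sum(1 for kw in step_keywords if kw.lower() in spoken)
    return done / len(step_keywords)
```

With a four-step job whose steps are keyed on "fan", "filter", "compressor", and "refrigerant", transcripts mentioning only the first two would yield an estimate of 0.5, which the supporter terminal could display as 50% complete.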
- the worker identification information 112 indicates the workers who are currently online, that is, workers currently using a wearable device 10 connected to the supporter terminal 40.
- the worker's icon (photograph) and the worker's name "AAAaaa” are displayed.
- “Live” is displayed in front of the icon of the worker "AAAaaa”.
- the display of "Live” means that the connection between the supporter terminal 40 and the wearable device 10 is active.
- “Active” means that the live image 113 of the worker “AAA aaa” is displayed on the screen, and the voice is connected to the worker “AAA aaa” so that a call can be made.
- in the worker identification information 112 of FIG. 3, five workers are displayed, but some of the inactive workers may be hidden and revealed by scrolling. In addition, the active worker may be displayed larger, and the names of the other workers may be partially or completely hidden.
- the worker identification information 112 may display not only an icon and the worker's name but also the worker's ID number, the site address, the model number and ID number of the device being worked on (here, an air conditioner), a work ID, and the like.
- the worker identification information 112 is information that allows the supporter to identify the worker or the work.
- the display method of active workers is not limited to the display of "Live". It suffices if the supporter using the supporter terminal 40 can identify the active worker.
- the connected worker identification information may be enlarged, blinked, changed in color, or popped up.
- the live image 113 shows the live video of the active worker, captured by the imaging unit 11 of the wearable device 10 of the worker "AAA aaa".
- FIG. 3 shows an image of the outdoor unit of the air conditioner.
- since a voice connection to the worker "AAA aaa" is established, the supporter can direct the worker by voice instruction and thereby move the position of the imaging unit 11 of the wearable device 10, that is, move the image.
- the supporter processed image 114 is an image extracted from the live image 113 and processed.
- a circular mark is drawn at the center of the image of the fan, and the instruction "Open the fan from here" is handwritten. These are characters and symbols entered by the supporter on the supporter terminal 40 with a touch pen.
- the reference list 115 is a list of references (field work support information) to be referred to when the supporter supports.
- the site work support information may include site survey checklists, construction work procedure manuals, or repair and maintenance information.
- the supporter extracts the documents necessary for the work from the on-site work support information stored in the database 50, extracts the necessary parts from those documents, marks them as needed, and transmits them to the worker's mobile terminal 20.
- the utterance record 116 is displayed by distinguishing between the utterance text of the active worker "AAAaaa" and the utterance text of the supporter.
- the worker arrives at the site and starts work for equipment repair and maintenance.
- an air conditioner is assumed as a device.
- the field worker wears the wearable device 10 on his / her body and turns on the power switch.
- the operator makes a gesture in front of the gesture sensor 14 to activate the imaging unit 11.
- the imaging unit 11 starts imaging.
- the wearable device 10 is connected to the application of the remote work support system 1 of the server 30.
- the captured image is sent from the communication unit 16 to the server 30 via the network, and is stored in the storage unit 32 of the server 30.
- the supporter terminal 40 starts the browser and connects to the application of the remote work support system 1 of the server 30.
- the powering on of the wearable device 10 and the connection of the supporter terminal 40 to the application of the remote work support system 1 on the server 30 need not be simultaneous; either may come first.
- the imaging unit 11 of the wearable device 10 is activated, and the supporter terminal 40 is connected to the application of the remote work support system 1 of the server 30 on the browser.
- the supporter who uses the supporter terminal 40 appropriately selects a worker and starts support.
- the worker makes a gesture in front of the gesture sensor 14 indicating that he or she wants to receive support.
- the request for assistance may be made by another method such as voice.
- the worker's support request causes the worker's name "AAA aaa" in the worker identification information 112 to blink on the screen of the display unit 43 of the supporter terminal 40.
- when the supporter clicks the name "AAA aaa", the connection with the worker "AAA aaa" is activated.
- the worker "AAA aaa" in the worker identification information 112 is marked "Live", the image captured by the wearable device 10 is displayed live as the live image 113, and the supporter can talk with the worker "AAA aaa".
- the supporter confirms the video transmitted from the wearable device 10 of the worker "AAAaaa” while talking with the worker "AAAaaa”, and examines the work support contents.
- the simplest way to support is that the supporter gives advice to the worker while talking to the worker "AAAaaa".
- the contents of the utterances of the worker and the supporter are voice-recognized by the server 30, converted into text, and stored in the storage unit 32 of the server 30. At the same time, it is displayed on the utterance record 116 of the display unit 43 of the supporter terminal 40.
- the supporter considers the support content and gives advice while checking the utterance content in the utterance record 116. Further, the support may be provided by transmitting the textualized utterance or call content of the supporter to the mobile terminal 20 of the worker "AAAaaa" and displaying it on the display unit 21.
- the supporter appropriately selects (cuts) an image from the live image 113 as a still image (snapshot), and displays the still image on the supporter processed image 114.
- the supporter further processes the displayed image. Specifically, the necessary parts are cut out, marked, and characters are written.
- the image written by the supporter is transmitted to the mobile terminal 20 of the worker "AAAaaa".
- the worker "AAA aaa" checks the display unit 21 of the mobile terminal 20 and confirms the image transmitted by the supporter. In this way, the worker proceeds with the work while receiving the supporter's support.
- in the present embodiment, the supporter-processed image created by the supporter is a still image, but it may instead be a moving image.
- the support information provided by the supporter may be not only images captured by the wearable device 10 and then processed, but also images or text (snapshots) extracted or processed from the references (field work support information).
- the field work support information may be displayed in the reference list 115 on the screen, or may be obtained by searching in the window of the reference list 115. Searches in the reference list 115 retrieve the documents stored in the database 50. For example, if the supporter determines that the instruction manual needs to be checked for support, he or she clicks the instruction manual in the reference list 115. A new window then pops up and displays the instruction manual page.
- the supporter selects a predetermined page of the instruction manual, cuts out a necessary part, and displays the cut-out image (snapshot) on the supporter-processed image 114.
- the supporter processes the image (snapshot) of the predetermined page of the instruction manual by marking, writing characters, and the like.
- the processed image is transmitted to the worker's mobile terminal 20 as a supporter-processed image, in the same manner as an image that was captured by the wearable device 10 and then selected and processed. The worker can proceed with the work by referring to the image processed by the supporter.
- the field work support information may be stored in the storage unit 32 of the server 30. Alternatively, it may be stored in the storage unit of the supporter terminal 40.
- the field work support information may be obtained from other sources. For example, it may be website information.
- the supporter will proceed with the support work of the worker "AAAaaa”.
- suppose a support request is made by the worker "BBB bbb" after the support of the worker "AAA aaa" is completed, or while that support is still in progress.
- the name "BBB bbb" then blinks.
- when the supporter clicks it, the connection with the worker "BBB bbb" becomes active.
- the worker "BBB bbb" in the worker identification information 112 is marked "Live", the image captured by the wearable device 10 is displayed live as the live image 113, and the supporter can talk with the worker "BBB bbb".
- the image captured by the wearable device 10, the voice including the calls between the worker and the supporter, the supporter-processed images created by the supporter, and the like are accumulated in the storage unit 32 of the server 30. These are stored for each task.
- the server 30 may automatically create a report based on the stored information.
- the remote work support system 1 of the present embodiment includes a wearable device 10, a mobile terminal 20, and a supporter terminal 40.
- the wearable device 10 is worn and used by a field worker.
- the wearable device 10 includes an image pickup unit 11, and can transmit a captured image of the vicinity of the worker to the supporter terminal 40. Further, the wearable device 10 and the supporter terminal 40 can be used like a telephone to make a telephone call.
- the wearable device 10 does not have a display unit.
- if the wearable device were provided with a display unit, as in a glasses-type device, the worker's field of view would be restricted and the worker could be exposed to danger.
- since no display unit is provided, there is the merit that such concerns are reduced.
- moreover, since the worker carries the mobile terminal 20, support information from the supporter can be displayed on the display unit 21 of the mobile terminal 20 as needed.
- the wearable device 10 of the present embodiment is a neck-mounted device. Therefore, the worker can perform the work hands-free.
- the image captured by the imaging unit 11 of the wearable device 10 is a moving image or a still image.
- the live image 113 can be continuously displayed on the display unit 43 of the supporter terminal 40.
- the supporter creates the supporter processed image, it can be created more easily by using the still image.
- the image captured by the imaging unit 11 of the wearable device 10 is selected, processed, or annotated with information on the supporter terminal 40, transmitted to the mobile terminal 20, and displayed on the display unit 21 of the mobile terminal 20.
- the worker does not need to see the support information of the supporter more than necessary, and can see it on the mobile terminal 20 when necessary.
- because the utterances are converted into text, the supporter can easily confirm the contents of the call and can also use them to support the worker. Furthermore, the record of the utterance content as text information can also be used for creating a report.
- the text information may be further displayed on the display unit 21 of the mobile terminal 20. It is also useful for the worker to be able to confirm the utterance with the supporter in text on the display unit 21. Further, the supporter may extract important utterance content texts and display them on the worker's mobile terminal 20.
- the remote work support system 1 has a first line T1 connecting the supporter terminal 40 and the wearable device 10, and a second line T2 connecting the supporter terminal 40 and the mobile terminal 20.
- the first line T1 is continuously used during the work by the worker, and the second line T2 is connected as needed during the work by the worker.
- the server 30 of the remote work support system 1 stores the image captured by the image capturing unit 11 of the wearable device 10 in the storage unit 32 of the server 30. Further, the utterances of the worker and the supporter are voice-recognized, converted into text, and stored in the storage unit 32 of the server 30.
- the server 30 may estimate the progress status of the work and the end time of the work from these images, textual characters, and the like, and display them on the display unit 43 of the supporter terminal 40. The progress of the work may be estimated and displayed, such as XX% completion. In this way, the supporter can know the progress of the work.
- the supporter terminal 40 displays the reference list 115 on the display unit 43.
- References are field work support information.
- the field work support information is stored in the database 50 on the network.
- the supporter can access the contents of the field work support information from the reference list 115.
- the supporter extracts and processes the contents of the on-site work support information and transmits it to the mobile terminal 20.
- because the supporter processes the on-site work support information in this way and sends it to the worker, the worker does not have to look through a huge amount of material, and the worker's efficiency improves.
- the work support by the remote work support system 1 of the present embodiment includes support by various means.
- in other words, the support is support by voice, support by processed images, support by text entered on the supporter terminal, support by textualized supporter utterances, or a combination of these.
- Voice support is the same as telephone support.
- the support by the processed image includes the one using the image captured by the wearable device 10 and the one using the processed on-site work support information.
- Support by text input on the supporter terminal 40 is support realized by the supporter entering text on the supporter terminal 40, transmitting it to the mobile terminal 20, and displaying it on the display unit 21.
- Support with textualized utterances of the supporter is support in which the supporter's utterances are converted into text and the textualized information is sent to the worker's mobile terminal 20. Workers may miss or forget what the supporter says, so sending important utterances as text can lead to effective support.
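The four support means above, and the routing described in the first embodiment (the supporter's voice goes to the wearable device 10, while visual support is displayed on the mobile terminal 20), can be sketched as follows. The enum and function names are illustrative assumptions, not the patent's terminology:

```python
from enum import Enum

class SupportType(Enum):
    """The four support means named in the text."""
    VOICE = "voice"
    PROCESSED_IMAGE = "processed_image"
    TYPED_TEXT = "typed_text"
    TRANSCRIBED_UTTERANCE = "transcribed_utterance"

def route_support(support_type: SupportType) -> str:
    """Return the worker-side device that receives each support means:
    voice is played by the wearable device 10, everything visual is
    shown on the display unit 21 of the mobile terminal 20."""
    if support_type is SupportType.VOICE:
        return "wearable_device_10"
    return "mobile_terminal_20"
```

A combination of means simply routes each part independently, e.g. a spoken instruction plus its transcribed text reaches both devices.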
- the active worker "AAA aaa" is displayed large at the top, and the images of the inactive workers "BBB bbb", "CCC ccc", "DDD ddd", and "EEE eee" are displayed small below.
- the display unit 43 of the modification 1A thus allows the supporter to know the approximate movements of inactive workers, in other words, workers not currently being supported. The supporter can use this, for example, to caution a worker who is about to make an unfavorable movement.
- in the first embodiment, the server 30 executes the voice recognition and the conversion into text.
- in the modification 1B, the voice recognition and textualization are performed by the wearable device 10 or the supporter terminal 40. Accordingly, the wearable device 10 or the supporter terminal 40 transmits both the audio data and the text data of the voice to the network.
- the wearable device 10 or the supporter terminal 40 may have a translation function.
- in the first embodiment, the wearable device 10 was a neck-mounted type.
- the wearable device 10 may be in another format. It is preferable that it is worn on the body and becomes hands-free.
- the wearable device 10 of the modified example 1C is worn on the head. Specifically, it is in the form of a helmet, a hat, or a hair band.
- in the first embodiment, the field worker held and used the wearable device 10 and the mobile terminal 20.
- the remote work support system 1 may further use another device.
- Another device may be connected to the supporter terminal 40 via a network.
- Another device may be connected to the server 30.
- In the modification 1D, the other device is a network camera separate from the imaging unit 11 of the wearable device 10 used by the field worker.
- the network camera transmits to the supporter terminal, through the network, an image of a place different from that captured by the imaging unit 11 of the wearable device 10, or an image of the same place captured from a different angle.
- the wearable device 10 and the mobile terminal 20 are independently connected to the network by the first line (T1) or the second line (T2), respectively.
- the wearable device 10 and the mobile terminal 20 are connected to each other by P2P.
- for example, the wearable device 10 may be connected to the network via the first line, with the wearable device 10 and the mobile terminal 20 communicating with each other via P2P,
- or the mobile terminal 20 may be connected to the network by the second line.
- the short-range wireless communication technology may be Bluetooth®, infrared communication, or Wi-Fi.
- the field worker can use a PC as a device separate from the wearable device 10 and the mobile terminal 20, which is a smartphone.
- the supporter may select the mobile terminal 20 or a PC and send the image.
- the system may be configured so that the server 30 controls the transmission of images from the supporter terminal 40 to the worker: the supporter terminal 40 issues a command specifying which of the worker's devices the image is transmitted to, and the server 30 accepts the command.
- the content of the training is on-site work such as equipment repair and maintenance. Students participate as workers.
- the number of participants (workers) is preferably one or two. It is preferable to carry out the work to be mastered by the minimum number of people.
- the instructors of the training act as supporters. It is preferable that there are a plurality of instructors, each guiding the students from a different perspective. The place and angle each instructor wants to see therefore differ. For example, an instructor familiar with the work itself wants to check the student's hands, while an instructor familiar with the equipment wants a bird's-eye view of the overall condition of the equipment, including the student, and also wants to check associated instruments such as meters attached to the equipment.
- the instructor urges the student (worker) to switch to, or additionally use, a wearable device 10 or a mobile terminal 20 such as a smartphone that can capture the place or field of view the instructor wants to check.
- the students can thus receive support and instruction on the work they carry out from various viewpoints and angles.
- FIG. 4 shows a diagram schematically explaining the remote work support system of the second embodiment.
- the remote work support system of the second embodiment includes the configuration of the remote work support system 1 of the first embodiment almost as it is.
- the remote work support system of the second embodiment includes a wearable device 10, a mobile terminal 20, a server 30, a supporter terminal 40, and a database 50.
- as the remote work support system provides work support, it accumulates the images captured by the wearable device 10, the voice-recognized and textualized utterances of the worker and the supporter, the support images created by the supporter, work reports, and the like.
- for newly arising work such as failure repair, the server 30 of the second embodiment compares data such as images captured by the wearable device 10 with data accumulated in the past, and automatically determines that an abnormality or failure similar to that of a similar device has occurred. In this way, the remote work support system 1 of the present embodiment can diagnose an abnormality or failure of the device.
- when the server 30 can diagnose an abnormality or failure of the device, it proposes to the supporter an appropriate support method similar to one used in the past, or directly proposes the same past appropriate support method to the worker.
- the remote work support system of this embodiment predicts future equipment failures.
- (7-1) Modification 2A
- the configuration of the modified example 2A is almost the same as that of the second embodiment.
- the server 30 may automatically identify an inappropriate point in the operation manual of the device or the repair / maintenance work manual by accumulating the work support data. In such a case, the server 30 proposes correction and optimization of the operation manual and the repair / maintenance work manual.
- the configuration of the remote work support system 1 of the third embodiment is the same as the configuration of the remote work support system 1 of the first embodiment as shown in FIG.
- the remote work support system 1 includes a plurality of devices.
- the plurality of devices include a wearable device 10, a mobile terminal 20, and a supporter terminal 40.
- the plurality of devices may include yet another device.
- the other device may be a device, other than the wearable device 10 and the mobile terminal 20, that is used by the worker and supports input or output of voice or images.
- Specific examples of other devices may be cameras, smartphones, smart watches, smart glasses, headsets, wearable devices, and the like.
- a web camera similar to the modified example 1D or a communication device similar to the modified example 1F may be used.
- the server 30 is connected to a plurality of devices, and the information collected by the plurality of devices is collected in the server and stored in the storage unit 32.
- the information is, for example, an image captured by the wearable device 10, and is a voice collected by the wearable device 10 or the supporter terminal 40.
- the server 30 designates at least one or more of the plurality of devices based on a predetermined condition, and calls the device to be connected based on the designation.
- AI technology may be used to determine the predetermined conditions of the server 30.
- the server 30 acquires the environmental information of the work site of the field worker and stores it in the storage unit 32.
- the environmental information is information such as the illuminance of the work site, the degree of noise, the temperature, the humidity, and whether or not the work is a work at a high place.
- the server 30 may prompt the worker to select and use a device according to the environmental information, and may call the selected device.
- the server 30 acquires information that the worker is in a noisy environment such as a factory by voice information transmitted from the wearable device 10. In this case, the server 30 urges the operator to use a device with a directional microphone other than the wearable device 10. When the operator turns on the device equipped with the directional microphone, the server 30 calls the device equipped with the directional microphone.
- the server 30 acquires information that the worker is in a dark place by the image information transmitted from the wearable device 10.
- the server 30 urges the operator to use a camera with a light other than the wearable device 10 (for example, a mobile terminal 20 or another digital camera).
- the worker prepares the lighted camera, and the server 30 calls the lighted camera.
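The environment-driven prompts in the examples above can be expressed as simple rules. This sketch uses illustrative threshold values (80 dB, 50 lux) and device names that the patent does not specify:

```python
def suggest_devices(env: dict) -> list:
    """Map work-site environment information to devices the server 30
    might prompt the worker to use.

    Rules follow the examples in the text (noisy site -> directional
    microphone, dark site -> camera with a light, work at a high place
    -> remove view-restricting eyewear); thresholds are assumptions.
    """
    suggestions = []
    if env.get("noise_db", 0) >= 80:            # noisy factory floor
        suggestions.append("directional_microphone")
    if env.get("illuminance_lux", 1000) < 50:   # dark work site
        suggestions.append("camera_with_light")
    if env.get("work_at_height", False):        # work at a high place
        suggestions.append("remove_smart_glasses")
    return suggestions
```

Once the worker powers on a suggested device, the server calls that device, as in the directional-microphone example above.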
- (9-1) Modification 3A
- the server 30 manages the progress of the work by the field worker and stores it in the storage unit 32.
- the server 30 calls at least one or more of the plurality of devices according to the progress of the work.
- for example, when the server 30 determines from the work progress that the next task is work at a high place, it prompts the worker to remove smart glasses, which narrow the field of view.
- when the server 30 determines from the work progress that the task requires confirmation from two or more viewpoints, it prompts the use of two cameras: one capturing the whole scene and one capturing the worker's hands.
- the server 30 then calls the two cameras.
- the server 30 voice-recognizes the utterances of the field worker and the supporter.
- the server 30 calls at least one or more of the plurality of devices based on the keywords acquired by voice recognition.
- for example, the server 30 encourages the worker to use a headset when words such as "climb" and "stepladder" appear in the conversation between the worker and the supporter.
- the server 30 calls the headset.
- the server 30 calls the device equipped with the directional microphone.
- the server 30 calls the smartphone or smart glasses.
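The keyword-triggered device calls described above can be sketched as a small rule table; the rule set and the word matching below are illustrative assumptions rather than the patent's mechanism:

```python
# trigger keywords -> device to prompt and call (illustrative rules;
# the first rule follows the "climb"/"stepladder" -> headset example)
KEYWORD_DEVICE_RULES = [
    ({"climb", "stepladder"}, "headset"),
]

def device_for_utterance(text: str, rules=KEYWORD_DEVICE_RULES):
    """Return the first device whose trigger keywords appear in the
    recognized utterance, or None if no rule matches."""
    words = set(text.lower().replace(".", " ").replace(",", " ").split())
    for keywords, device in rules:
        if keywords & words:
            return device
    return None
```

A production system would match against the speech recognizer's token stream rather than whitespace-split text, but the rule-table structure would be the same.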
- the modification 3C uses the remote work support system 1 for training purposes, as in the modification 1H.
- in addition to the modification 1H, the server 30 further accepts, from the supporter terminal 40, designation of at least one of the plurality of devices used by the field worker.
Description
(1) Overall Configuration
As shown in FIG. 2, the remote work support system 1 of the first embodiment has a wearable device 10, a mobile terminal 20, a server 30, a supporter terminal 40, and a database 50. A wearable device 10 and a mobile terminal 20 are held by each of the field workers "AAA aaa", "BBB bbb", and "CCC ccc". The field workers are a plurality of workers dispatched separately to a plurality of sites. In FIG. 2, three workers are dispatched to three sites, but the numbers of sites and workers are not limited to these. The supporter terminal 40 is arranged to be used by a supporter at a location remote from the field workers.
(2-1) Wearable Device 10
The wearable device 10 is a neck-mounted device as shown in FIG. 1. The wearable device 10 is ring-shaped, with part of the ring open. The worker slips the open part around the neck and wears the device with the open part facing forward.
The mobile terminal 20 is carried or held by the worker. The mobile terminal 20 may be a smartphone, a tablet, a PC, or the like. The mobile terminal 20 has a display unit 21, a communication unit 22, and a control unit 23. The mobile terminal 20 is connected to the network via the second line T2.
The server 30 is a computer. The server 30 has a processor 31 and a storage unit 32. Applications and data are stored in the storage unit 32. In the present embodiment, the application of the remote work support system 1 stored in the storage unit 32 is read into the processor 31 and executed. The application of the remote work support system 1 is a web application. On the supporter terminal 40 and the mobile terminal 20, a browser is launched and the application of the remote work support system 1 is accessed from the browser.
The database 50 includes a storage unit that stores electronic data. The database 50 is connected to the network. In the present embodiment, the database 50 stores field work support information used for equipment repair and maintenance. Storage of the field work support information is not limited to the database 50; it may be stored in the storage unit 32 of the server 30, or in the storage unit of the supporter terminal 40.
The supporter terminal 40 is used by a supporter who remotely supports the field worker. The supporter terminal 40 may be a PC. The supporter terminal 40 has a voice input unit 41, a voice output unit 42, a display unit 43, a communication unit 44, and a control unit 45.
FIG. 3 shows an example of the browser screen on the display unit 43 of the supporter terminal 40 while the application of the remote work support system 1 is running.
Next, a method of supporting a worker using the remote work support system 1 of the present embodiment will be described.
(4-1)
The remote work support system 1 of the present embodiment includes a wearable device 10, a mobile terminal 20, and a supporter terminal 40. The wearable device 10 is worn and used by the field worker. The wearable device 10 includes an imaging unit 11 and can transmit captured images of the worker's surroundings to the supporter terminal 40. The wearable device 10 and the supporter terminal 40 can also be used like telephones to make calls.
The wearable device 10 of the present embodiment is a neck-mounted device. Therefore, the worker can work hands-free.
The images captured by the imaging unit 11 of the wearable device 10 are moving images or still images.
Images captured by the imaging unit 11 of the wearable device 10 are selected, processed, or annotated with information on the supporter terminal 40, transmitted to the mobile terminal 20, and displayed on the display unit 21 of the mobile terminal 20.
The utterances of the worker and the supporter are voice-recognized and displayed as text information in the utterance record 116 on the display unit 43 of the supporter terminal 40.
The text information may further be displayed on the display unit 21 of the mobile terminal 20. It is also useful for the worker to be able to check the exchanges with the supporter as text on the display unit 21. The supporter may also extract important utterance texts and display them on the worker's mobile terminal 20.
The remote work support system 1 has a first line T1 connecting the supporter terminal 40 and the wearable device 10, and a second line T2 connecting the supporter terminal 40 and the mobile terminal 20. The first line T1 is used continuously while the worker works, and the second line T2 is connected as needed while the worker works.
The server 30 of the remote work support system 1 stores images captured by the imaging unit 11 of the wearable device 10 in the storage unit 32 of the server 30. It also voice-recognizes the utterances of the worker and the supporter, converts them into text, and stores the text in the storage unit 32 of the server 30. The server 30 may estimate the progress status and the end time of the work from these images, textualized characters, and the like, and display them on the display unit 43 of the supporter terminal 40. The progress of the work may be estimated and displayed as, for example, XX% complete. In this way, the supporter can know the progress of the work.
The supporter terminal 40 displays the reference list 115 on the display unit 43. The references are field work support information. The field work support information is stored in the database 50 on the network. From the reference list 115, the supporter can access the contents of the field work support information. The supporter extracts and processes the contents of the field work support information and transmits them to the mobile terminal 20.
Work support by the remote work support system 1 of the present embodiment includes support by various means. In other words, it is support by voice, support by processed images, support by text entered on the supporter terminal, support by textualized supporter utterances, or any combination of these.
(5-1) Modification 1A
In the first embodiment, as shown in FIG. 3, only the live image 113 of the active worker "AAA aaa" was displayed on the display unit 43. In the modification 1A, live images of all connected workers "AAA aaa", "BBB bbb", "CCC ccc", "DDD ddd", and "EEE eee" are displayed in the live image 113. In other words, live video of inactive workers is also displayed in the live image 113 at the same time. However, the active worker "AAA aaa" is displayed large at the top, and the videos of the inactive workers "BBB bbb", "CCC ccc", "DDD ddd", and "EEE eee" are displayed small below. With the display unit 43 of the modification 1A, the supporter can thus know the approximate movements of inactive workers, in other words, workers not currently being supported. The supporter can use this, for example, to caution a worker who is about to make an unfavorable movement.
In the first embodiment, voice recognition and conversion into text were executed by the server 30. In the modification 1B, voice recognition and textualization are performed by the wearable device 10 or the supporter terminal 40. Accordingly, the wearable device 10 or the supporter terminal 40 transmits both the audio data and the text data of the voice to the network. Furthermore, in addition to the voice recognition function, the wearable device 10 or the supporter terminal 40 may have a translation function.
In the first embodiment, the wearable device 10 was a neck-mounted type. The wearable device 10 may take another form; a form that is worn on the body and leaves the hands free is preferable. The wearable device 10 of the modification 1C is worn on the head, specifically in the form of a helmet, a hat, or a hair band.
In the remote work support system 1 of the first embodiment, the field worker held and used the wearable device 10 and the mobile terminal 20. The remote work support system 1 may further use another device. The other device may be connected to the supporter terminal 40 via the network, or may be connected to the server 30.
In the first embodiment, the wearable device 10 and the mobile terminal 20 were each independently connected to the network via the first line (T1) or the second line (T2). In the modification 1E, the wearable device 10 and the mobile terminal 20 are connected to each other by P2P.
Another communication device can connect to the wearable device 10 by short-range wireless communication technology. The short-range wireless communication technology may be Bluetooth (registered trademark), infrared communication, or Wi-Fi.
In the first embodiment, the voice uttered by the supporter was transmitted to the wearable device 10, and images sent from the supporter terminal 40 were transmitted to the mobile terminal 20.
The worker support system of the present embodiment can also be used for other purposes, for example, for training.
(6) Remote Work Support System of the Second Embodiment
FIG. 4 schematically illustrates the remote work support system of the second embodiment.
(7-1) Modification 2A
The configuration of the modification 2A is almost the same as that of the second embodiment. In the modification 2A, by accumulating work support data, the server 30 may come to automatically identify inappropriate points in the operation manual of a device or in a repair and maintenance work manual. In such a case, the server 30 proposes correction and optimization of the operation manual or the repair and maintenance work manual.
(8) Remote Work Support System of the Third Embodiment
As shown in FIG. 2, the configuration of the remote work support system 1 of the third embodiment is the same as that of the remote work support system 1 of the first embodiment. The remote work support system 1 includes a plurality of devices. The plurality of devices includes the wearable device 10, the mobile terminal 20, and the supporter terminal 40. The plurality of devices may include yet another device. The other device may be a device, other than the wearable device 10 and the mobile terminal 20, that is used by the worker and supports input or output of voice or images. Specific examples of other devices include cameras, smartphones, smartwatches, smart glasses, headsets, wearable devices, and the like. It may also be a web camera as in the modification 1D or a communication device as in the modification 1F.
(9-1)変形例3A
変形例3Aでは、サーバ30は、現場作業者による作業の進捗を管理し、記憶部32に記憶する。サーバ30は作業の進捗に応じて、前記複数のデバイスのうち少なくとも1つ以上のデバイスを呼び出す。
In Modification 3B, the server 30 performs speech recognition on the utterances of the field worker and the supporter. Based on keywords obtained through the speech recognition, the server 30 calls at least one of the plurality of devices.
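Modifications 3A and 3B both reduce to the server mapping a trigger, either a work-progress stage or a recognized keyword, to the devices it should call. A minimal dispatch sketch, where the trigger-to-device registry and the device names are assumptions introduced for illustration:

```python
def devices_to_call(progress: str, keywords: set[str],
                    registry: dict[str, list[str]]) -> list[str]:
    """Return the names of devices the server should call.

    `registry` maps a trigger (a progress stage or a spoken keyword)
    to the devices useful at that point; stage-based and keyword-based
    triggers feed the same dispatch, mirroring Modifications 3A and 3B.
    """
    called: list[str] = []
    for trigger in [progress, *sorted(keywords)]:
        for device in registry.get(trigger, []):
            if device not in called:
                called.append(device)  # call each device at most once
    return called
```

For instance, with a registry of {"inspection": ["thermal-camera"], "manual": ["smart-glasses"]}, reaching the "inspection" stage while the keyword "manual" is spoken would call both devices.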
Modification 3C, like Modification 1H, is a use of the remote work support system 1 for training purposes. In addition to Modification 1H, the server 30 further accepts, from the supporter terminal 40, a designation of at least one of the plurality of devices used by the field worker.
10 Wearable device
11 Imaging unit
12 Audio input unit
13 Audio output unit
20 Mobile terminal
21 Display unit
22 Communication unit
40 Supporter terminal
41 Audio input unit
42 Audio output unit
43 Display unit
T1 First line
T2 Second line
Claims (18)
- A remote work support system comprising: a wearable device (10) that is worn by a field worker, has an imaging unit (11), an audio input unit (12), and an audio output unit (13), transmits images, and transmits and receives audio; a mobile terminal (20) that is carried by the field worker separately from the wearable device and has a display unit (21) that displays received images; and a supporter terminal (40) that is used by a supporter who remotely supports the field worker, has an audio input unit (41), an audio output unit (42), and a display unit (43), transmits and receives audio to and from the wearable device, receives images from the wearable device, and transmits images to the mobile terminal.
- The remote work support system according to claim 1, wherein the wearable device is a neck-worn device.
- The remote work support system according to claim 1 or 2, wherein the display unit of the supporter terminal displays moving images and still images captured by the imaging unit of the wearable device.
- The remote work support system according to any one of claims 1 to 3, wherein an image captured by the imaging unit of the wearable device is selected, processed, or supplemented with information at the supporter terminal, transmitted to the mobile terminal, and displayed on the display unit of the mobile terminal.
- The remote work support system according to any one of claims 1 to 4, wherein the utterances of the field worker and the supporter undergo speech recognition and are displayed as text information on the display unit of the supporter terminal.
- The remote work support system according to claim 5, wherein the text information is further displayed on the display unit of the mobile terminal.
- The remote work support system according to any one of claims 1 to 6, comprising a first line (T1) connecting the supporter terminal and the wearable device and a second line (T2) connecting the supporter terminal and the mobile terminal, wherein the first line is used continuously during the work by the field worker, and the second line is connected as needed during the work by the field worker.
- The remote work support system according to any one of claims 1 to 7, wherein the progress of the work by the field worker is automatically extracted from images captured by the imaging unit and/or the utterances of the field worker and the supporter, a progress status or an end time of the work is estimated, and the progress status or the end time is displayed on the display unit of the supporter terminal.
- The remote work support system according to any one of claims 1 to 8, wherein the supporter terminal accesses a database (50) storing field work support information, and extracts, processes, and transmits the field work support information to the mobile terminal.
- The remote work support system according to any one of claims 1 to 9, wherein the support provided from the supporter terminal to the field worker includes any one or a combination of support by voice, support by processed images, support by text input at the supporter terminal, and support by text transcribed from the supporter's utterances.
- The remote work support system according to any one of claims 1 to 10, comprising a plurality of devices connectable to the same server, wherein the plurality of devices include the wearable device, the mobile terminal, and the supporter terminal, and the server designates at least one of the plurality of devices based on a predetermined condition and calls the device to be connected based on the designation.
- The remote work support system according to claim 11, wherein the server acquires environmental information on the work site of the field worker and calls at least one of the plurality of devices based on the environmental information.
- The remote work support system according to claim 11, wherein the server manages the progress of the work by the field worker and calls at least one of the plurality of devices based on the progress of the work.
- The remote work support system according to claim 11, wherein the server performs speech recognition on the utterances of the field worker and the supporter and calls at least one of the plurality of devices based on keywords obtained through the speech recognition.
- The remote work support system according to any one of claims 11 to 14, wherein the server further accepts, from the supporter terminal, a designation of at least one of the plurality of devices used by the field worker.
- The remote work support system (1) according to any one of claims 1 to 15, wherein the plurality of devices can communicate with one another.
- A supporter terminal (40) that is used by a supporter who remotely supports a field worker and has an audio input unit (41), an audio output unit (42), and a display unit (43), wherein the supporter terminal transmits and receives audio to and from a wearable device (10) that is worn by the field worker and has an imaging unit (11), an audio input unit (12), and an audio output unit (13); receives images from the wearable device; and transmits images to a mobile terminal (20) that is carried by the field worker separately from the wearable device and has a display unit (21) that displays images received by an image receiving unit (22).
- A server (30) connected to a plurality of devices including: a wearable device (10) that is worn by a field worker, has an imaging unit (11), an audio input unit (12), and an audio output unit (13), transmits images, and transmits and receives audio; a mobile terminal (20) that is carried by the field worker separately from the wearable device and has a display unit (21) that displays images received by an image receiving unit (22); and a supporter terminal (40) that is used by a supporter who remotely supports the field worker and has an audio input unit (41), an audio output unit (42), and a display unit (43); wherein the server designates at least one of the plurality of devices based on a predetermined condition and calls the device to be communicated with based on the designation.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20891306.1A EP4064694A4 (en) | 2019-11-20 | 2020-09-30 | REMOTE CONTROL ASSISTANCE SYSTEM |
CN202080080913.2A CN114731389A (zh) | 2019-11-20 | 2020-09-30 | 远程作业辅助系统 |
AU2020385740A AU2020385740B2 (en) | 2019-11-20 | 2020-09-30 | Remote work support system |
US17/748,805 US20220276701A1 (en) | 2019-11-20 | 2022-05-19 | Remote work support system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019209987 | 2019-11-20 | ||
JP2019-209987 | 2019-11-20 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/748,805 Continuation US20220276701A1 (en) | 2019-11-20 | 2022-05-19 | Remote work support system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021100331A1 true WO2021100331A1 (ja) | 2021-05-27 |
Family
ID=75966362
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/037310 WO2021100331A1 (ja) | 2020-09-30 | Remote work support system |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220276701A1 (ja) |
EP (1) | EP4064694A4 (ja) |
JP (2) | JP7270154B2 (ja) |
CN (1) | CN114731389A (ja) |
AU (1) | AU2020385740B2 (ja) |
WO (1) | WO2021100331A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7382372B2 (ja) * | 2021-09-27 | 2023-11-16 | Hitachi Industry & Control Solutions, Ltd. | Work support device, work support system, and work support program |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008123366A (ja) * | 2006-11-14 | 2008-05-29 | Nec Corp | Remote work support system, remote work support program, and server |
JP2011039996A (ja) * | 2009-08-18 | 2011-02-24 | Chugoku Electric Power Co Inc:The | Work support system |
JP2016181767A (ja) * | 2015-03-23 | 2016-10-13 | Panasonic Intellectual Property Management Co., Ltd. | Wearable camera and wearable camera system |
JP2018067773A (ja) * | 2016-10-18 | 2018-04-26 | Canon Inc. | Imaging device, control method therefor, program, and storage medium |
JP2018185570A (ja) * | 2017-04-24 | 2018-11-22 | Toyo Kanetsu Solutions K.K. | Remote support system |
JP2018207420A (ja) | 2017-06-09 | 2018-12-27 | Canon Inc. | Remote work support system |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4275459B2 (ja) | 2003-05-27 | 2009-06-10 | Tamura Corporation | Microphone set |
JP2008096868A (ja) * | 2006-10-16 | 2008-04-24 | Sony Corp | Imaging display device and imaging display method |
JP2015228009A (ja) * | 2014-06-03 | 2015-12-17 | Seiko Epson Corporation | Head-mounted display device, control method for head-mounted display device, information transmission/reception system, and computer program |
FR3024583B1 (fr) | 2014-07-29 | 2017-12-08 | Airbus | Monitoring of a maintenance operation on an aircraft |
EP3413583A1 (en) * | 2014-10-20 | 2018-12-12 | Sony Corporation | Voice processing system |
JP6540108B2 (ja) * | 2015-03-09 | 2019-07-10 | Fujitsu Limited | Image generation method, system, device, and terminal |
JP6585929B2 (ja) * | 2015-06-02 | 2019-10-02 | Canon Inc. | System and system control method |
JP2018092478A (ja) * | 2016-12-06 | 2018-06-14 | Yazaki Corporation | Work instruction system |
EP3333688B1 (en) * | 2016-12-08 | 2020-09-02 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
US11051120B2 (en) * | 2017-07-31 | 2021-06-29 | Sony Corporation | Information processing apparatus, information processing method and program |
JP7080636B2 (ja) * | 2017-12-28 | 2022-06-06 | Dynabook Inc. | System |
US11326886B2 (en) * | 2018-04-16 | 2022-05-10 | Apprentice FS, Inc. | Method for controlling dissemination of instructional content to operators performing procedures at equipment within a facility |
-
2020
- 2020-09-30 WO PCT/JP2020/037310 patent/WO2021100331A1/ja unknown
- 2020-09-30 CN CN202080080913.2A patent/CN114731389A/zh active Pending
- 2020-09-30 EP EP20891306.1A patent/EP4064694A4/en active Pending
- 2020-09-30 AU AU2020385740A patent/AU2020385740B2/en active Active
- 2020-09-30 JP JP2020165496A patent/JP7270154B2/ja active Active
-
2022
- 2022-05-19 US US17/748,805 patent/US20220276701A1/en active Pending
- 2022-07-19 JP JP2022115125A patent/JP2022145707A/ja active Pending
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023068275A1 (ja) * | 2021-10-18 | 2023-04-27 | Fairy Devices Inc. | Information processing system |
WO2023188951A1 (ja) * | 2022-03-28 | 2023-10-05 | Santa Presents Co., Ltd. | Remote instruction system |
WO2024053476A1 (ja) * | 2022-09-05 | 2024-03-14 | Daikin Industries, Ltd. | System, support method, server device, and communication program |
JP7482459B2 (ja) | 2022-09-05 | 2024-05-14 | Daikin Industries, Ltd. | System, support method, server device, and communication program |
Also Published As
Publication number | Publication date |
---|---|
AU2020385740B2 (en) | 2023-05-25 |
JP2022145707A (ja) | 2022-10-04 |
JP2021083079A (ja) | 2021-05-27 |
JP7270154B2 (ja) | 2023-05-10 |
CN114731389A (zh) | 2022-07-08 |
EP4064694A4 (en) | 2023-01-11 |
AU2020385740A1 (en) | 2022-07-07 |
EP4064694A1 (en) | 2022-09-28 |
US20220276701A1 (en) | 2022-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021100331A1 (ja) | Remote work support system | |
JP6439788B2 (ja) | Information processing device, control method, program, and system | |
US20200236070A1 (en) | Information processing system and information processing method | |
CN100358358C (zh) | Videophone sign language translation assistance device and sign language translation system using the same | |
JP2003514294A (ja) | System and method for marking an object in a directed manner and associating information with selected technical components | |
US9334067B2 (en) | Real-time aircraft maintenance terminal | |
WO2017154136A1 (ja) | Portable information terminal and information processing method used therein | |
JP2018036812A (ja) | IT operation work remote support system and method | |
JP2008217545A (ja) | Console information acquisition system, console information acquisition method, and console information acquisition program | |
JP6826322B2 (ja) | Failed component replacement support method | |
JP2018186366A (ja) | Conference system | |
KR20140000570U (ko) | Glasses-type remote control device | |
JP2018037813A (ja) | Report reception system and report reception method | |
CN104062758B (zh) | Image display method and display device | |
JP2021082295A (ja) | Remote work support system | |
US20150220506A1 (en) | Remote Document Annotation | |
JP2022114111A (ja) | Work support system, work support control device, and work support control program | |
US20220076671A1 (en) | Information processing terminal, information processing apparatus, and information processing method | |
CN207765165U (zh) | Wireless monitoring system with speech recognition function | |
JP7351642B2 (ja) | Audio processing system, conference system, audio processing method, and audio processing program | |
TWI810486B (zh) | Augmented-reality real-time interactive after-sales service and maintenance system | |
US11265512B2 (en) | Door-knocking for teleconferencing | |
JP7381200B2 (ja) | Device setting device, device setting method, device setting system, and device setting program | |
KR20190034896 (ko) | Method and system for providing user-based web manuals | |
Dogan et al. | Trial of a special end user terminal that aids field operators during emergency rescue operations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20891306 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020891306 Country of ref document: EP Effective date: 20220620 |
|
ENP | Entry into the national phase |
Ref document number: 2020385740 Country of ref document: AU Date of ref document: 20200930 Kind code of ref document: A |