WO2023188951A1 - Remote instruction system - Google Patents

Remote instruction system

Info

Publication number
WO2023188951A1
Authority
WO
WIPO (PCT)
Prior art keywords
instruction
site
image
captured image
unit
Application number
PCT/JP2023/005478
Other languages
French (fr)
Japanese (ja)
Inventor
登仁 福田
Original Assignee
株式会社サンタ・プレゼンツ
Application filed by 株式会社サンタ・プレゼンツ filed Critical 株式会社サンタ・プレゼンツ
Publication of WO2023188951A1 publication Critical patent/WO2023188951A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a system that projects an instruction image onto a site from a remote location, or projects a previously recorded instruction image onto the site.
  • Patent Document 1 discloses a system in which a worker at a site wears a camera and a laser projector around the neck, and projects annotations input by a remote instructor onto objects at the site.
  • In this system, a camera worn by the worker captures an image of a marker placed near the object, and annotations are displayed relative to this marker. Therefore, even if the worker moves and the direction of the laser projector changes, the annotation is displayed in the correct position.
  • In this way, annotations can be accurately displayed on-site from a remote location.
  • As long as the marker is captured by the camera, the annotation is displayed in the correct position; however, if the worker turns his or her body in a different direction during the work so that the marker is no longer captured by the camera, the annotation disappears.
  • An object of the present invention is to solve the above-mentioned problems and provide a system in which an instruction image (annotation) is continuously displayed even if the operator changes direction significantly.
  • The remote instruction system according to the present invention comprises an instruction device used by an instructor and a field device used by a person in charge at the site.
  • The on-site device includes an imaging unit that is attached to the person in charge at the site or to a mobile body and that captures an image of the site space to generate a captured image, and a captured image transmitting unit that transmits the captured image to the instruction device using a transmitting unit.
  • It further includes a projection unit that is attached to the person in charge at the site or to the mobile body and projects an instruction image onto the site space based on given instruction image data, and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit.
  • It further includes a direction control means for controlling the drive unit so that the imaging unit and the projection unit face a predetermined direction centered on the person in charge at the site, regardless of the movement of that person or the mobile body, and a correction means that, using the captured image at the time a fixing command is given as a reference captured image, corrects the projection of the instruction image by the projection unit, without depending on the drive unit, based on a comparison between a characteristic partial image in the reference captured image and the characteristic partial image in the currently captured image, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
  • The instruction device includes a captured image receiving means for receiving the transmitted captured image by a receiving unit, a captured image display unit for displaying the received captured image, a fixing command means for giving a fixing command to the on-site device by a transmitting unit when the captured image of the desired site space has been captured, and an instruction image input unit for inputting, by an operation of the instructor on the displayed captured image, an instruction image at a desired position in the site space.
  • The instruction device is further characterized by comprising an instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device, so that the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space based on a characteristic partial image of the site space included in the captured image.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the drive section via a member that absorbs high-frequency vibrations.
  • the remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the helmet of the person in charge of the field via the drive section.
  • the imaging section and the projection section can be stably fixed.
  • the remote instruction system is characterized in that the direction control means changes a predetermined direction based on a direction instruction from the instruction device.
  • the instructor can remotely control the imaging direction and check the situation at the site.
  • the remote instruction system according to the present invention is characterized in that the characteristic portion image is a marker provided in the site space or a characteristic point of the captured image.
  • the instruction image can be displayed correctly based on the markers or feature points.
  • the remote instruction system according to the present invention is characterized in that the drive section has a multi-axis drive mechanism.
  • the orientation of the imaging section and the projection section can be freely controlled.
  • The on-site instruction device according to the present invention includes an imaging unit that is attached to a person in charge at the site or to a mobile body and that captures an image of the site space to generate a captured image, a projection unit that projects an instruction image onto the site space based on given instruction image data, and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit.
  • It further includes a direction control means that controls the drive unit so that the imaging unit and the projection unit face a predetermined direction centered on the person in charge at the site, regardless of the movement of that person or the mobile body,
  • and a correction means that corrects the projection of the instruction image by the projection unit, without depending on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the site space, based on a characteristic partial image of the site space included in the captured image.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • Another remote instruction system according to the present invention comprises an instruction device used by an instructor and a field device used by a person in charge at the site.
  • The on-site device includes an imaging unit that is attached to the person in charge at the site or to a mobile body and that captures an image of the site space to generate a captured image, and a captured image transmitting unit that transmits the captured image to the instruction device using a transmitting unit.
  • It further includes a projection unit that is attached to the person in charge at the site or to the mobile body and projects an instruction image onto the site space based on given instruction image data, and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit.
  • The instruction device includes a captured image receiving means for receiving the transmitted captured image by a receiving unit, and a captured image display unit for displaying the received captured image.
  • It further includes a fixing command means for giving a fixing command to the on-site device by a transmitting unit when the captured image of the desired site space has been captured,
  • and an instruction image input unit for inputting, by an operation of the instructor on the displayed captured image, an instruction image at a desired position in the site space.
  • The drive unit is controlled so that the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space;
  • for this purpose, the instruction device is characterized by comprising an instruction image transmitting means for transmitting, by the transmitting unit, instruction image data in which the position of the instruction image is specified to the on-site device.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the driving section via a member that absorbs high frequency vibrations.
  • the remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the helmet of the person in charge of the site via the drive section.
  • the imaging section and the projection section can be stably fixed.
  • the remote instruction system according to the present invention is characterized in that the direction control means changes the predetermined direction based on a direction instruction from the instruction device.
  • the instructor can remotely control the imaging direction and check the situation at the site.
  • the remote instruction system according to the present invention is characterized in that the characteristic portion image is a marker provided in the site space or a characteristic point of the captured image.
  • the instruction image can be displayed correctly based on the markers or feature points.
  • the remote instruction system according to the present invention is characterized in that the drive section has a multi-axis drive mechanism.
  • the orientation of the imaging section and the projection section can be freely controlled.
  • The remote instruction system according to the present invention is further characterized by comprising a correction means that, using the captured image at the time the fixing command is given as a reference captured image, corrects the projection of the instruction image by the projection unit, without depending on the drive unit, based on a comparison between a characteristic partial image in the reference captured image and the characteristic partial image in the current captured image, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
  • The on-site instruction device according to the present invention includes an imaging unit that is attached to a person in charge at the site or to a mobile body and that captures an image of the site space to generate a captured image, a projection unit that projects an instruction image onto the site space based on given instruction image data, and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit.
  • It further includes a follow-up control means that controls the drive unit based on a characteristic partial image of the site space so that the instruction image is correctly displayed with reference to a predetermined part of the site space, regardless of the movement of the person in charge at the site or the mobile body.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system is a remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
  • the on-site device includes an imaging unit that is attached to a person in charge of the on-site or a moving body, and that captures an image of the on-site space in a wide-angle direction to generate a captured image, and a transmission unit that transmits the captured image to the instruction device.
  • The instruction device includes a captured image receiving means that receives the transmitted captured image by a receiving unit, a fixing command means that issues a fixing command to the field device by a transmitting unit, an instruction image input unit that, when the fixing command is given, takes the vicinity of a characteristic partial image of the captured image as a captured image of interest and allows the instructor to input an instruction image at a desired position in the site space by operating on the captured image of interest, and an instruction image transmitting means that transmits, by the transmitting unit, instruction image data in which the position of the instruction image is specified, in order to control the drive unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space regardless of the movement of the person in charge at the site or the mobile body.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system according to the present invention is characterized in that the projection section is fixed to the drive section via a member that absorbs high frequency vibrations.
  • the remote instruction system according to the present invention is characterized in that the projection section is fixed to the helmet of the person in charge of the field via the drive section.
  • the projection section can be stably fixed.
  • the remote instruction system according to the present invention is characterized in that the characteristic portion image is a marker provided in the site space or a characteristic point of the captured image.
  • the instruction image can be correctly projected based on the markers or feature points.
  • the remote instruction system according to the present invention is characterized in that the drive section has a multi-axis drive mechanism.
  • the orientation of the imaging section and the projection section can be freely controlled.
  • The on-site instruction device according to the present invention includes an imaging unit that is attached to a person in charge at the site or to a mobile body and generates a captured image by capturing an image of the site space in a wide-angle direction, a projection unit that is attached to the person in charge at the site or to the mobile body and projects an instruction image onto the site space based on given instruction image data, a drive unit that changes the projection direction of the projection unit, a direction control means for controlling the drive unit so that the projection unit faces a predetermined direction centered on the person in charge at the site, and a correction means for correcting the projection of the instruction image by the projection unit without depending on the drive unit.
  • the remote instruction system is a remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
  • the on-site device includes an imaging unit that is attached to a person in charge of the on-site or a moving body, and that captures an image of the on-site space in a wide-angle direction to generate a captured image, and a transmission unit that transmits the captured image to the instruction device.
  • The instruction device includes a captured image receiving means that receives the transmitted captured image by a receiving unit, a fixing command means that issues a fixing command to the field device by a transmitting unit, an instruction image input unit that, when the fixing command is given, takes the vicinity of a characteristic partial image of the captured image as a captured image of interest and allows the instructor to input an instruction image at a desired position in the site space by operating on the captured image of interest, and an instruction image transmitting means that transmits, by the transmitting unit, instruction image data in which the position of the instruction image is specified, in order to control the drive unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space regardless of the movement of the person in charge at the site or the mobile body.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system according to the present invention is characterized in that the projection section is fixed to the drive section via a member that absorbs high frequency vibrations.
  • the remote instruction system according to the present invention is characterized in that the projection section is fixed to the helmet of the person in charge of the field via the drive section.
  • the projection section can be stably fixed.
  • the remote instruction system according to the present invention is characterized in that the characteristic portion image is a marker provided in the site space or a characteristic point of the captured image.
  • the instruction image can be correctly projected based on the markers or feature points.
  • the remote instruction system according to the present invention is characterized in that the drive section has a multi-axis drive mechanism.
  • the orientation of the imaging section and the projection section can be freely controlled.
  • The remote instruction system according to the present invention is further characterized by comprising a correction means that, using the captured image at the time the fixing command is given as a reference captured image, corrects the projection of the instruction image by the projection unit based on a comparison between a characteristic partial image in the reference captured image and the characteristic partial image in the current captured image.
  • The on-site instruction device according to the present invention includes an imaging unit that is attached to a person in charge at the site or to a mobile body and that captures an image of the site space in a wide-angle direction to generate a captured image, a projection unit that is attached to the person in charge at the site or to the mobile body and projects an instruction image onto the site space based on given instruction image data, a drive unit that changes the projection direction of the projection unit, and a follow-up control means for controlling the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system is a remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
  • the on-site device includes an imaging unit that is attached to a person in charge of the on-site or a moving body, and that captures an image of the on-site space in a wide-angle direction to generate a captured image, and a transmission unit that transmits the captured image to the instruction device.
  • a projection unit that is attached to a person in charge of the site or a moving body, is capable of projecting in a wide angle direction onto the site space, and projects an instruction image onto the site space based on given instruction image data;
  • The on-site device further includes a follow-up control means for controlling the projection unit so that the instruction image is projected with reference to a predetermined part of the site space, based on a characteristic partial image of the site space included in the captured image, regardless of the movement of the person in charge at the site or the mobile body.
  • The instruction device includes a captured image receiving means for receiving the transmitted captured image by a receiving unit,
  • a fixing command means for giving a fixing command to the on-site device by a transmitting unit, and an instruction image input unit that, when the fixing command is given, takes the vicinity of the characteristic partial image of the captured image as a captured image of interest and allows the instructor to input an instruction image at a desired position in the site space by operating on the captured image of interest.
  • The present invention is characterized by further comprising an instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • the remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed via a member that absorbs high-frequency vibrations.
  • the remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the helmet of the person in charge of the field.
  • the imaging section/projection section can be stably fixed.
  • the remote instruction system according to the present invention is characterized in that the characteristic portion image is a marker provided in the site space or a characteristic point of the captured image.
  • the instruction image can be correctly projected based on the markers or feature points.
  • The on-site instruction device according to the present invention includes an imaging unit that is attached to a person in charge at the site or to a mobile body and that captures an image of the site space in a wide-angle direction to generate a captured image, a projection unit that is attached to the person in charge at the site or to the mobile body, is capable of projecting in a wide-angle direction onto the site space, and projects an instruction image onto the site space based on given instruction image data, and a follow-up control means for controlling the projection unit so that the instruction image is projected with reference to a predetermined part of the site space based on a characteristic partial image of the site space included in the captured image, regardless of the movement of the person in charge at the site or the mobile body.
  • the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
  • The integrated imaging/projection unit includes an imaging unit, a projection unit that projects at substantially the same angle of view as the imaging angle of view of the imaging unit, a structure in which the imaging unit and the projection unit are integrated and can be oriented in at least two axial directions, and a drive unit that controls the orientation of the structure.
  • the "direction control means" corresponds to steps S1 and S2.
  • the "correction means" corresponds to step S34 and step S75.
  • the "fixing command means" corresponds to step S51.
  • the "instruction image transmitting means" corresponds to step S53.
  • the "follow-up control means" corresponds to step S35, step S76, and step S79.
  • the term "device” is a concept that includes not only one computer but also multiple computers connected via a network. Therefore, when the means (or a part of the means) of the present invention is distributed over multiple computers, these multiple computers correspond to the apparatus.
  • The term "program" is a concept that includes not only programs that can be directly executed by the CPU, but also programs in source format, compressed programs, encrypted programs, programs that cooperate with the operating system to perform their functions, and the like.
  • FIG. 1 is a functional configuration diagram of a remote instruction system according to an embodiment of the present invention.
  • FIG. 2 shows the person in charge at the site 54 wearing the field device.
  • FIG. 3 shows the appearance of the laser projector/camera integrated body 58.
  • Further drawings for the first embodiment show: the structure for attaching the laser projector/camera integrated body 58 to the mount member 97; the attachment position of the silicone gel bush 120 (high-frequency vibration absorbing member) on the unit 80; the system configuration of the remote instruction system; the hardware configuration of the instruction device 30; the hardware configuration of the motor control circuit 400; the hardware configuration of the smartphone 200; a flowchart of direction fixing control; a flowchart of instruction image display control; a captured image in which an instruction image 62 is drawn in the instruction device 30; the data structure of the instruction image; the relationship between the movement of the person in charge at the site and the projection direction of the laser projector 84; an example in which feature points 512 are used instead of markers; and a flowchart of instruction processing by the on-site instruction device 11.
  • Drawings for the second embodiment show a functional configuration diagram of the remote instruction system, a flowchart of instruction image display control, the relationship between the movement of the person in charge at the site and the projection direction of the laser projector 84, a functional configuration diagram of the on-site instruction device 11, and a flowchart of instruction processing by the on-site instruction device 11.
  • Drawings for the third embodiment show a functional configuration diagram of the remote instruction system, the appearance of the laser projector/camera integrated body 58, flowcharts of direction fixing control and instruction image display control, a functional configuration diagram of the on-site instruction device 11, and a flowchart of instruction processing by the on-site instruction device 11.
  • Drawings for the fourth embodiment show a functional configuration diagram of the remote instruction system, flowcharts of direction fixing control and instruction image display control, a functional configuration diagram of the on-site instruction device 11, and a flowchart of instruction processing by the on-site instruction device 11.
  • FIG. 1 shows the functional configuration of a remote instruction system according to an embodiment of the present invention.
  • This system includes an on-site device 10 used by a person in charge at the site and an instruction device 30 used by a remote instructor.
  • An imaging unit 12 and a projection unit 14 are provided on the helmet worn by the person in charge of the field via a drive unit 16.
  • the imaging area of the imaging unit 12 and the projection area of the projection unit 14 are arranged to be substantially the same.
  • imaging section 12 and projection section 14 are integrally configured so that their imaging direction and projection direction can be changed by a driving section 16.
  • the imaging direction and projection direction of the imaging unit 12 and the projection unit 14 are detected by the sensor 28.
  • The direction control means 20 controls the drive section 16 based on the output of the sensor 28, and maintains the imaging section 12 and the projection section 14 in a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker.
  • the imaging unit 12 of the field device 10 images the field space including the object 52 and generates a captured image.
  • Since the imaging direction of the imaging unit 12 is fixed in a predetermined direction centered on the person in charge at the site, as long as that person does not move from the spot, a substantially fixed captured image can be obtained even when the person turns his or her face in different directions.
  • This captured image is transmitted to the instruction device 30 by the transmitter 22 under the control of the captured image transmitter 18.
  • the captured image receiving means 36 of the instruction device 30 receives the captured image by the receiving unit 32.
  • the captured image display section 40 displays the received captured image. This allows the instructor to view an image of the site space.
  • When giving an instruction, the instructor inputs a fixing command.
  • The captured image display section 40 then uses the captured image at that time as a reference captured image and displays it as a still image. Note that since the captured image is already displayed in a substantially fixed state by the direction control means 20, the captured image may also be displayed as is.
  • the instructor inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the input instruction image to the field device 10 by the transmitter 34 .
  • the field device 10 receives the instruction image by the receiving unit 24 and projects the instruction image from the projection unit 14. As a result, the instruction image 62 is projected onto the target object 52. As described above, since the projection direction of the projection unit 14 is fixed, even if the on-site person in charge changes the direction of his or her face, the instruction image will be projected onto the location intended by the person in charge. However, if the person in charge of the site moves from place to place, the projection position of the instruction image will shift.
  • The correction means 26 of the field device 10 compares the characteristic partial image (a marker or the like) in the reference captured image with the characteristic partial image in the current captured image, and corrects and controls the projection position of the instruction image so that the instruction image is correctly projected at the intended position.
  • This allows the instruction image to be displayed in the correct position even if the person in charge at the site moves.
  • the transmitting unit 34 transmits the fixing command to the field device 10.
  • the receiving unit 24 of the field device 10 receives this and records the captured image at that time as a reference captured image. Further, the person in charge at the site places a marker near the object 52 so that the object 52 is imaged.
  • the person in charge at the site can confirm the position to be worked on based on the instruction image 62 actually projected onto the site space.
  • This instruction image 62 is displayed at the correct position by the direction control means 20 even if the worker changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the worker's head, which makes the work easier. Furthermore, when multiple people are working, the instruction image 62 continues to be projected even if the person in charge wearing the site device 10 moves his or her head significantly, so the work of the other workers is not interrupted. Furthermore, even if the worker moves, the instruction image 62 is displayed correctly.
  • FIG. 2 shows the field person 54 wearing the field device 10.
  • In the following, the person in charge at the site 54 performs the work of drilling a hole in the side of the chimney 52 on the roof 50, and this will be explained as being instructed remotely by an instructor.
  • the field device 10 is composed of a smartphone (not shown), a laser projector/camera integrated body 58, and a rear storage body 59.
  • The field worker 54 wears a helmet 56; the laser projector/camera integrated body 58 is fixed to the front brim of the helmet, and a rear storage body 59 housing a battery and a motor control circuit is provided at its rear end. Both are connected by a signal line/power line (not shown).
  • the field person in charge 54 holds a smartphone (not shown) in his chest pocket.
  • the smartphone, the laser projector/camera integrated body 58, and the rear housing 59 are connected by a signal line/power line (not shown).
  • FIG. 3 shows the appearance of the laser projector/camera integrated body 58.
  • a clip 96 is provided on the laser projector/camera integrated body 58.
  • the clip 96 is biased by a spring member (not shown) in the direction in which the members 93 and 95 close about the axis 91.
  • To attach it, the members 93 and 95 are opened, the brim of the helmet 56 is sandwiched between them, and the pressure is then released so that the clip is held on the brim by the spring member.
  • It is preferable that the laser projector/camera integrated body 58 be mounted so that the camera 82 and the laser projector 84 are located near the front of the dominant eye (the right eye if the user is right-handed); that is, it is held on the brim of the helmet 56 so that the camera 82 and the laser projector 84 are positioned below the brim (upside down from the state shown in FIG. 3). If the laser projector/camera integrated body 58 would obstruct the line of sight in that position, it may instead be held on the brim so that the camera 82 and the laser projector 84 are positioned above the brim (as shown in FIG. 3).
  • a unit 80 housing a camera 82 and a laser projector 84 is fixed to the clip 96 via a triaxial structure 90 (another multiaxial structure may be used) as a drive section.
  • the member 93 of the clip 96 serves as the base 93 of the triaxial structure 90.
  • A motor 92 is fixed to the base 93 of the triaxial structure 90, and one end of an intermediate member 92A, which is rotated in the XY plane by the motor 92, is connected to the motor 92.
  • the intermediate member 92A is formed in an L-shape, and a motor 94 is fixed to the other end.
  • One end of an intermediate member 94A that is rotated in the ZX plane by the motor 94 is connected to the motor 94.
  • the intermediate member 94A is formed in an L shape, and a motor 96 is fixed to the other end.
  • a mount member 97 that is rotated in the ZY plane by the motor 96 is connected to the motor 96 . Note that the XYZ axes shown in FIG. 3 vary as each member 92A, 94A, 97 rotates.
  • the three-axis structure 90 can adjust the orientation of the mount member 97 with three-axis degrees of freedom by driving the motors 92, 94, and 96.
  • the base 93 is provided with a triaxial gyro sensor JS and a triaxial acceleration sensor AS as the sensors 28.
  • Each of the motors 92, 94, and 96 is controlled by a motor control circuit (not shown) based on the outputs of the three-axis gyro sensor JS and the three-axis acceleration sensor AS.
  • a unit 80 housing a camera 82 and a laser projector 84 is fixed to the mount member 97 of the triaxial structure 90. As shown in FIG. 4, a camera control circuit 102 and a laser projector control circuit 104 that control the camera 82 and the laser projector 84 are provided within the casing 81 of the unit 80.
  • Although the camera control circuit 102 and the laser projector control circuit 104 may be provided in the rear storage body 59, it is preferable that at least the MEMS circuit of the laser projector 84 be provided in the unit 80.
  • the casing 81 is attached to the mount top surface 101, mount side surface 97, and mount bottom surface 99 of the triaxial structure 90 via silicone gel bushings 120 (for example, Taica's anti-vibration material gel bushing B-1). Note that in FIG. 3, the mount top surface 101 is omitted for easy understanding.
  • the silicone gel bush 120 includes a ring-shaped silicone gel 114 inserted outside the upper part of the ring-shaped silicone gel 116.
  • the upper part of the silicone gel 116 is inserted into a hole provided in the housing 81.
  • The housing 81 is sandwiched between the silicone gel 114 and the silicone gel 116.
  • The silicone gels 114 and 116 are screwed to the mount bottom surface 99 by bolts 110 and washers 112. With this structure, the housing 81 is held by the silicone gels 116 and 114, which prevents high-frequency vibrations from being transmitted to the housing 81 from the outside.
  • silicone gel bushes 120 are provided at two locations on each of the top, side, and bottom surfaces of the casing 81.
  • Figure 6 shows the system configuration of the remote instruction system.
  • the field device 10 and the instruction device 30 are connected via the Internet.
  • the field device 10 and the instruction device 30 may directly exchange data via the Internet, or may exchange data via a server device.
  • the field device 10 includes a laser projector/camera integrated body 58, a rear storage body (motor control circuit, battery) 59, and a smartphone 200. These are connected to each other by signal lines and power lines.
  • FIG. 7 shows the hardware configuration of the instruction device 30.
  • a memory 304, a display 306, a microphone 308, a communication circuit 310, an SSD 312, a DVD-ROM drive 314, a mouse/keyboard 316, and a speaker 318 are connected to the CPU 302.
  • Communication circuit 310 is for connecting to the Internet.
  • An operating system 320 and an instruction program 322 are recorded in the SSD 312.
  • the instruction program 322 cooperates with the operating system 320 to perform its functions.
  • These programs were recorded on the DVD-ROM 324 and installed via the DVD-ROM drive 314.
  • a microphone 308 and a speaker 318 are for having a conversation with the person in charge at the site.
  • FIG. 8 shows the hardware configuration of the motor control circuit 400.
  • a memory 404, a gyro sensor JS, an acceleration sensor AS, a camera 82, a laser projector 84, motors 92, 94, and 96, and a nonvolatile memory 406 are connected to the CPU 402.
  • the camera 82 is connected via a camera control circuit 102
  • the laser projector 84 is connected via a laser projector control circuit 104, but these are omitted in the figure.
  • the operating system 31 and motor control program 32 are recorded in the nonvolatile memory 406.
  • the motor control program 32 cooperates with the operating system 31 to perform its functions.
  • FIG. 9 shows the hardware configuration of the smartphone 200.
  • a memory 204, a touch display 206, a short-range communication circuit 208, a camera 82, a laser projector 84, an SSD 212, a speaker 214, a microphone 216, and a communication circuit 218 are connected to the CPU 202.
  • a normal communication circuit is omitted.
  • the communication circuit 218 is a circuit for connecting to the Internet.
  • a speaker 214 and a microphone 216 are for communicating with the instructor via the Internet.
  • An operating system 222 and an image control program 224 are recorded on the SSD 212. The image control program 224 cooperates with the operating system 222 to perform its functions.
  • FIGS. 10 and 11 show flowcharts of the motor control program 32 of the motor control circuit 400, the image control program 224 of the smartphone 200, and the instruction program 322 of the instruction device 30.
  • FIG. 10 is a flowchart of direction fixing control
  • FIG. 11 is a flowchart of instruction image display control.
  • When the person in charge at the site 54 arrives at the site and stands in front of the target object, the chimney 52, he attaches the laser projector/camera integrated body 58 to the brim of his helmet 56 and turns on the power. Thereby, the vicinity of the chimney 52 is imaged by the camera 82.
  • the CPU 402 of the motor control circuit 400 acquires the outputs of the gyro sensor JS and acceleration sensor AS of the laser projector/camera integrated body 58 (step S1).
  • a gyro sensor and an acceleration sensor in three orthogonal axes are used.
  • the motor control circuit 400 calculates in which position and in which direction the base 93 (see FIG. 3) is located in three-dimensional space based on the outputs of the gyro sensor JS and the acceleration sensor AS.
  • the rotation angles of the motors 92, 94, and 96 are then controlled so that the unit 80 faces in a constant direction regardless of the position and direction of the base 93 (step S2). Therefore, regardless of the orientation of the head of the field personnel 54, the unit 80 is kept in a constant direction.
  • Such control is similar to that of a gimbal used as a stabilizing device for cameras and the like.
  • the camera 82 is directed toward the vicinity of the chimney 52 regardless of the direction of the head of the person in charge of the site 54, and a stable captured image can be obtained as a moving image.
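  • As a non-limiting illustration, one iteration of this direction fixing control (steps S1 and S2) could look like the sketch below. The complementary-filter orientation estimate, the sensor values, and the motor-command interface are assumptions made for the example and are not taken from the patent.

```python
# Minimal sketch of gimbal-style direction fixing: estimate the base orientation
# from gyro/accelerometer readings, then command the three motors with the
# opposite rotation so the camera/projector unit keeps a fixed world direction.
import math

def estimate_base_angles(gyro_rates, accel, prev_angles, dt, alpha=0.98):
    """Complementary filter: integrate gyro rates, correct pitch/roll with gravity."""
    pitch_g = prev_angles[0] + gyro_rates[0] * dt
    roll_g  = prev_angles[1] + gyro_rates[1] * dt
    yaw     = prev_angles[2] + gyro_rates[2] * dt            # yaw has no gravity correction
    pitch_a = math.atan2(accel[0], math.sqrt(accel[1] ** 2 + accel[2] ** 2))
    roll_a  = math.atan2(accel[1], accel[2])
    return (alpha * pitch_g + (1 - alpha) * pitch_a,
            alpha * roll_g + (1 - alpha) * roll_a,
            yaw)

def motor_commands(base_angles, target_direction):
    """Rotate each axis by the negative of the base motion so the unit stays on target."""
    return tuple(t - b for t, b in zip(target_direction, base_angles))

# one control step (read sensors, then drive the three motors)
angles = (0.0, 0.0, 0.0)
target = (0.0, 0.0, 0.5)        # desired fixed direction in radians (hypothetical)
angles = estimate_base_angles(gyro_rates=(0.01, -0.02, 0.10),
                              accel=(0.0, 0.0, 9.8),
                              prev_angles=angles, dt=0.01)
pitch_cmd, roll_cmd, yaw_cmd = motor_commands(angles, target)
```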
  • The CPU 202 of the smartphone 200 acquires an image captured by the camera 82 via a signal line (or short-range communication), and transmits it to the instruction device 30 via the Internet (step S21).
  • the CPU 302 of the instruction device 30 displays the received captured image on the display 306 (step S41).
  • FIG. 12 shows an example of a captured image displayed on the display 306.
  • the chimney 52 and its vicinity are displayed as a captured image.
  • the instructor can use the microphone 308 and speaker 318 to have a conversation with the smartphone 200 of the person in charge of the field through an Internet call. Thereby, the instructor can guide the on-site person to the desired position by conversation while looking at the display 306.
  • the instruction device 30 displays a direction change button 500 on the display 306.
  • If the instructor wants to change the imaging direction of the camera 82, he or she clicks on the circumference of the direction change button 500 using the mouse 316.
  • When the instruction device 30 detects the click, it generates an imaging direction change command for changing the imaging direction in the corresponding direction (up, down, left, or right) and transmits it to the smartphone 200 (step S42).
  • the smartphone 200 transmits the received imaging direction change command to the motor control circuit 400 (step S22).
  • the motor control circuit 400 controls the motors 92, 94, and 96 based on the received imaging direction change command to change the direction of the unit 80 (that is, the imaging direction) (step S3).
  • the orientation of the unit is fixed in the changed direction regardless of the orientation of the head of the person in charge at the site. Therefore, the captured image is displayed on the display 306 of the instruction device 30 in the direction indicated by the instruction. In this way, the instructor can obtain an image in a desired direction at the site.
  • the direction fixing control in FIG. 10 is repeatedly executed.
  • the instruction image display control process shown in FIG. 11 is performed.
  • the person in charge at the site takes out the marker 60 prepared in advance and attaches it to the chimney 52, which is the target object, based on a conversational instruction from the instructor.
  • the size and shape of the image of the marker 60 are recorded in advance in the nonvolatile memory 212 of the smartphone 200. Therefore, the smartphone 200 can calculate the distance, direction, etc. from the camera 82 to the marker 60 based on the captured image of the marker 60.
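  • As one possible (assumed) way to obtain this distance and direction from a single captured image, a perspective-n-point solution over the marker's four corners could be used, as sketched below; the marker size, the corner coordinates, and the camera calibration values are hypothetical and only the principle follows the description above.

```python
# Sketch: recover distance and direction from camera 82 to the marker 60, given the
# marker's known physical size. Uses OpenCV's solvePnP as one standard technique.
import numpy as np
import cv2

MARKER_SIZE = 0.10  # metres, assumed side length of the printed marker
object_pts = np.array([[0, 0, 0],
                       [MARKER_SIZE, 0, 0],
                       [MARKER_SIZE, MARKER_SIZE, 0],
                       [0, MARKER_SIZE, 0]], dtype=np.float32)

def marker_pose(corners_px, camera_matrix, dist_coeffs):
    """corners_px: 4x2 pixel coordinates of the marker corners in the captured image."""
    ok, rvec, tvec = cv2.solvePnP(object_pts, corners_px.astype(np.float32),
                                  camera_matrix, dist_coeffs)
    distance = float(np.linalg.norm(tvec))        # camera-to-marker distance
    direction = (tvec / distance).ravel()         # unit vector from camera toward marker
    return distance, direction, rvec
```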
  • the captured image is displayed as a moving image on the display 306 of the instruction device 30. While looking at this captured image, the instructor clicks the instruction input mode button 501 with the marker 60 captured as shown in FIG. 12 to give a fixing instruction.
  • Thereby, the instruction device 30 sets the captured image at that time as a reference captured image and displays it on the display 306 as a still image (step S52).
  • The instructor then uses the mouse 316 to input an instruction for the on-site person as an instruction image on this still image (step S53). For example, as shown in FIG. 13, in the image of the chimney 52 displayed on the display 306, an image 62 indicating the position where the hole should be drilled (in this example, a cross image) is drawn and input using the mouse 316.
  • the instruction device 30 transmits the fixing command to the smartphone 200 (step S32).
  • the smartphone 200 that has received the fixing instruction records the captured image when receiving the fixing instruction in the nonvolatile memory 212 as a reference captured image (step S32).
  • In this way, the instruction device 30 and the smartphone 200 can each treat the captured image taken at the same time as the reference captured image.
  • For this purpose, information for identifying the frame, such as a frame number, can be used;
  • by determining the reference captured image based on this frame-identifying information, the smartphone 200 can prevent deviations due to time lag, as illustrated below.
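  • A minimal sketch of such frame-based synchronization follows; the idea that the fixing command carries the frame number, the buffer length, and the frame rate are assumptions made purely for illustration.

```python
# Sketch: the smartphone keeps a short history of recent frames keyed by frame number,
# so the frame named in the fixing command can be used as the reference captured image
# even if the command arrives with network delay.
from collections import deque

class FrameBuffer:
    def __init__(self, size=120):                  # ~4 s of history at an assumed 30 fps
        self.frames = deque(maxlen=size)           # (frame_number, image) pairs

    def push(self, frame_number, image):
        self.frames.append((frame_number, image))

    def reference_for(self, frame_number):
        """Return the captured image matching the frame number given in the fixing command."""
        for n, img in self.frames:
            if n == frame_number:
                return img
        return self.frames[-1][1] if self.frames else None   # fall back to the latest frame
```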
  • After the instructor finishes inputting the instruction image, he or she clicks an instruction image transmission button 502 (displayed when the instruction input mode is entered) at the lower right of the reference captured image on the display 306. Thereby, the instruction device 30 transmits the instruction image to the smartphone 200 (step S53).
  • the instruction device 30 cancels the instruction input mode, stops displaying the still image as the reference captured image, and displays the transmitted captured image as a moving image (step S54). This allows the instructor to know the situation at the site again.
  • the data structure of the instruction image sent to the smartphone 200 is shown in FIG. 14A.
  • the instruction image data is the actual data of the instruction image input by the instructor, as shown in FIG. 14B.
  • the reference coordinate position is the XY coordinate value of the reference point of the instruction image when the reference point of the marker image (for example, the lower center point of M) is set as the origin.
  • the reference point is the upper left of the rectangle circumscribing the instruction image input by the instructor.
  • In this embodiment the instruction image is transmitted as image data, but for shapes determined in advance the parameters may instead be transmitted as numerical values. For example, for a perfect circle the center coordinates and radius, and for a square the upper-left coordinates and side length, may be expressed numerically and transmitted.
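  • Purely as an illustration, the payload of FIG. 14A/14B might be represented by a structure such as the following; all field names are assumptions, since the description only specifies the image data (or shape parameters) and the reference coordinate position measured from the marker's reference point.

```python
# Illustrative container for the instruction image data: either raster image data drawn
# by the instructor, or numeric parameters for a predefined shape, plus the offset of the
# image's reference point from the marker's reference point.
from dataclasses import dataclass
from typing import Optional

@dataclass
class InstructionImageData:
    reference_x: float                    # X offset of the reference point from the marker origin
    reference_y: float                    # Y offset from the marker origin (e.g. lower center of "M")
    image_png: Optional[bytes] = None     # raster data drawn by the instructor
    shape: Optional[str] = None           # e.g. "circle" or "square" when sent parametrically
    params: Optional[tuple] = None        # circle: (cx, cy, r); square: (x, y, side)

# a perfect circle sent as numbers instead of pixels
cmd = InstructionImageData(reference_x=0.25, reference_y=-0.10,
                           shape="circle", params=(0.0, 0.0, 0.05))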
  • Upon receiving the instruction image data of FIG. 14A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image from the camera 82 (step S33).
  • The imaging range of the camera 82 and the projection range of the laser projector 84 are configured to be the same. Therefore, if the current captured image is exactly the same as the recorded reference captured image (that is, if the person in charge at the site has not moved at all since the reference captured image was taken), then when the instruction image data is projected by the laser projector 84, the instruction image 62 is projected onto the chimney 52 as shown in the drawing.
  • This instruction image 62 matches the position input by the instructor on the display 306, so it is possible to accurately indicate the work position to the person in charge at the site.
  • the person in charge at the site uses the instruction image 62 as a guide to drill a hole at that position.
  • The smartphone 200 calculates the distance and direction between the camera 82 and the marker 60 (and the location where the instruction image 62 is to be projected) based on the image of the marker 60 in the reference captured image. As mentioned above, since a known pattern is printed on the marker 60 in advance, the distance and direction to the marker 60 attached to the chimney 52 (and to the location where the instruction image 62 should be projected) can be calculated from the captured image.
  • the smartphone 200 calculates the distance and direction to the marker 60 attached to the chimney 52 (and the location where the instruction image 62 should be projected) based on the current captured image acquired in step S33.
  • The smartphone 200 then compares the distance and direction to the marker 60 (and to the location where the instruction image 62 should be projected) in the reference captured image with the distance and direction to the marker 60 (and to that location) in the current captured image, deforms the instruction image 62 based on this comparison, and controls the position at which the instruction image 62 is projected (step S34).
  • the instruction image 62 is controlled to be enlarged/reduced and projected according to the change in the distance between the camera 82 and the marker 60 (or the instruction image 62).
  • control is performed to move the position where the instruction image 62 is projected as the marker 60 moves.
  • Since the direction fixing control (see FIG. 10) is performed separately by the triaxial structure 90, in many cases the instruction image 62 can be displayed at the correct position by performing the control of FIGS. 15B and 15C.
  • In this embodiment, the direction fixing control by the triaxial structure 90 and the above correction control are both performed, so the instruction image 62 can be stably displayed at the correct position. Furthermore, even if the on-site person in charge 54 changes the direction of his or her head and takes his or her line of sight away from the object 52, the instruction image 62 continues to be displayed by the direction fixing control. Therefore, there is less stress when the on-site person in charge 54 performs the work.
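  • A simplified sketch of the step S34 correction (scaling with the change in marker distance and shifting with the marker's position in the image) is given below; the flat-surface geometry and the data layout are assumptions, not the actual processing of the embodiment.

```python
# Sketch: compare the marker's distance and image position between the reference and the
# current captured image, then scale and shift the instruction image before projection.
def correct_projection(ref_marker, cur_marker, instr_offset):
    """ref_marker / cur_marker: dicts with 'distance' (m) and 'center' (x, y) in image coords.
    instr_offset: instruction-image position relative to the marker in the reference image."""
    scale = ref_marker["distance"] / cur_marker["distance"]   # closer marker -> draw larger to keep physical size
    dx = cur_marker["center"][0] - ref_marker["center"][0]    # follow the marker's shift in the image
    dy = cur_marker["center"][1] - ref_marker["center"][1]
    new_pos = (ref_marker["center"][0] + instr_offset[0] * scale + dx,
               ref_marker["center"][1] + instr_offset[1] * scale + dy)
    return scale, new_pos

scale, pos = correct_projection({"distance": 2.0, "center": (320, 240)},
                                {"distance": 1.6, "center": (300, 250)},
                                instr_offset=(40, -20))
```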
  • The inclination of the surface 510 of the object 52 to which the marker 60 is pasted may change between the time of the reference captured image in FIG. 16A and the time of the current captured image in FIG. 16B.
  • the smartphone 200 calculates the inclination of the surface 510 of the target object 52 based on the image of the marker 60 in the reference captured image (FIG. 16A). Thereby, the actual distance LL between the marker 60 and the instruction image 62 is calculated based on the reference coordinate position PL1 (X or Y) sent from the instruction device 30.
  • the inclination of the surface 510 of the object 52 is calculated based on the image of the marker 60 in the current captured image (FIG. 16B).
  • the position where the instruction image 62 should be displayed is determined based on the actual distance LL calculated above, and the reference coordinate position PL2 (X or Y) is calculated.
  • the smartphone 200 can control the position at which the instruction image 62 is projected based on this reference coordinate position PL2, and can project the instruction image 62 at the correct position. Furthermore, the instruction image 62 is transformed so that the projected instruction image 62 is not distorted.
  • the above process can be performed in the same way in both the vertical and horizontal directions.
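  • As a rough, assumed model of this tilt compensation, the foreshortening could be approximated by the cosine of the tilt angle, as in the sketch below; the description does not specify the exact formula, and the pixel scale values are hypothetical.

```python
# Sketch: convert the on-image offset PL1 into a physical distance LL along the tilted
# surface using the reference tilt, then re-project LL under the current tilt to get PL2.
import math

def physical_distance(pl1_px, px_per_metre, tilt_ref_rad):
    """Undo the foreshortening seen in the reference image to obtain the real distance LL."""
    return (pl1_px / px_per_metre) / math.cos(tilt_ref_rad)

def projected_offset(ll_m, px_per_metre_now, tilt_now_rad):
    """Re-project LL onto the current image plane to obtain PL2 in pixels."""
    return ll_m * math.cos(tilt_now_rad) * px_per_metre_now

LL  = physical_distance(pl1_px=120, px_per_metre=400, tilt_ref_rad=math.radians(10))
PL2 = projected_offset(LL, px_per_metre_now=380, tilt_now_rad=math.radians(35))
```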
  • Relative to the imaging range 504 of the reference captured image, the current imaging range may be tilted diagonally, as shown by the imaging range 506.
  • Although FIG. 17 shows a tilt in a direction horizontal to the plane of the paper, such a tilt may occur in any three-dimensional direction.
  • In this case, the projected instruction image 62 will also be distorted.
  • the instruction image 62 is transformed (deformed in the opposite way to the distortion) based on the image of the marker 60 in the reference captured image and the image of the marker 60 in the current captured image so as to eliminate the distortion.
  • In this way, a correct instruction image can be projected; one assumed realization of this warp is sketched below.
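  • One assumed way to realize this opposite deformation is a perspective (homography) warp estimated from the marker's four corners in the reference and current captured images, as sketched with OpenCV below; the description does not name a specific method, so this is only an illustration.

```python
# Sketch: estimate the perspective change from the marker corners (reference vs current)
# and pre-warp the instruction image with the inverse so the projection appears undistorted.
import numpy as np
import cv2

def predistort(instr_img, ref_corners, cur_corners):
    """ref_corners / cur_corners: 4x2 float32 arrays of marker corners in the two images."""
    H = cv2.getPerspectiveTransform(ref_corners, cur_corners)   # reference -> current mapping
    H_inv = np.linalg.inv(H)                                    # warp opposite to the observed distortion
    h, w = instr_img.shape[:2]
    return cv2.warpPerspective(instr_img, H_inv, (w, h))
```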
  • The smartphone 200 calculates the distance and direction between the camera 82 and the marker 60 based on the image of the marker 60 in the reference captured image. Furthermore, the smartphone 200 calculates the distance and direction to the marker 60 attached to the chimney 52 based on the current captured image acquired in step S33. The smartphone 200 transforms the instruction image 62 based on a comparison between the distance and direction to the marker 60 in the reference captured image and those in the current captured image, and controls the position at which the instruction image 62 is projected.
  • the instruction image intended by the instructor is projected and displayed on the object 52 at the site.
  • It is preferable that the marker 60 be pasted on the plane on which the instruction image 62 is to be displayed.
  • If this is not possible, it is preferable to combine the above with estimation of the accurate position using image feature points, as in SLAM.
  • the smartphone 200 analyzes the captured image to calculate feature points (points on the boundary of the object, etc.) near the object (near the marker 60). By comparing the feature points in the reference captured image and the feature points in the current captured image, the positional relationship between the marker 60 and the surface on which the instruction image 62 is to be displayed is determined.
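  • The feature-point comparison could, for example, be approximated with off-the-shelf feature matching as sketched below (ORB descriptors plus a homography fit); this is an assumed implementation, since the description only refers to feature points and SLAM-like estimation.

```python
# Sketch: detect and match feature points between the reference and current captured images,
# then fit a homography that describes how the scene near the object has moved in the image.
import numpy as np
import cv2

def estimate_motion(ref_gray, cur_gray):
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(ref_gray, None)
    kp2, des2 = orb.detectAndCompute(cur_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # reference -> current mapping
    return H
```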
  • the field device 10 is attached to the field person 54.
  • it may also be attached to a moving object (or a fixed object) near the person in charge of the site, such as a bicycle or a car that the person in charge of the site is riding.
  • Furthermore, it may be attached to a moving body (or a fixed body) located away from the person in charge at the site.
  • the field device 10 may be attached to a robot instead of the field worker 54.
  • an instructor in a remote location can communicate with people around the robot by displaying text or images using the on-site instruction device 11 attached to the robot. The same applies to the following embodiments.
  • In the above embodiment, the instruction image 62 is always projected onto the object 52 by the laser projector 84. However, if there are people in the projection direction, the laser projector 84 may be controlled not to emit light.
  • the laser projector 84 is used as the projection section.
  • a normal projector may also be used.
  • a triaxial structure 90 (gimbal) is used, but a uniaxial structure, a biaxial structure (gimbal), a structure with four or more axes, etc. may also be used.
  • the person in charge at the site 54 attaches the marker 60 to the object 52.
  • the marker 60 may be placed on the object 52 at the site in advance.
  • the distance and direction to the camera 82 are determined using the marker 60.
  • Instead of using the marker 60, it is also possible to use SLAM or the like to grasp this only from the feature points of the captured image and perform similar processing.
  • In this case, the smartphone 200 recognizes the feature points 512 (vertices that characterize the image, etc.), transmits them to the instruction device 30, and displays them on the display 306, as shown in the drawing in which feature points 512 are used instead of markers.
  • the instructor looks at this image, operates the mouse 316, and selects a feature point 512 to be used for position specification. It is preferable to select a feature point 512 related to the object 52 (a feature point on the object) as the feature point 512 used for position identification.
  • When the instruction input mode button 501 is clicked, information on the selected feature points 512 is transmitted to the smartphone 200.
  • the smartphone 200 can specify the position and direction based on these feature points 512.
  • the instructor checks the screen shown in FIG. 13 on the instruction device 30 and clicks the instruction image transmission button 502.
  • When the instruction device 30 or the smartphone 200 detects that the marker 60 in the captured image has entered a predetermined area (for example, a predetermined central area) of the captured image, the instruction input mode may be entered automatically. The same applies when processing is performed using the feature points 512 without using the marker 60.
  • the motor control circuit 400 controls the triaxial structure 90, and the smartphone 200 controls the projection position based on image processing.
  • the three-axis structure 90 may also be controlled by the smartphone 200.
  • a circuit for controlling the triaxial structure 90 and controlling the projection position based on image processing may be provided in the rear storage body 59.
  • In that case, the smartphone 200 is used only for telephone calls.
  • a telephone call function may also be provided in the rear storage body 59.
  • the instruction image is transformed by the smartphone 200 so that the instruction image is not projected in a distorted (or changed in size) manner.
  • However, when the shape of the instruction image is not important and only a specific position needs to be indicated (for example, when the position is indicated by the center point of a cross mark), there is no problem even if the instruction image becomes distorted (or changes in size), as long as the position is shown correctly. In such a case, the process of transforming the instruction image may be omitted.
  • the projection position and the like are controlled based on image processing by the smartphone 200 (FIG. 11).
  • the projection position and the like may not be controlled based on image processing by the smartphone 200, and only the processing by the triaxial structure 90 may be performed.
  • For an instruction image such as an arrow indicating a direction, control by the triaxial structure 90 alone is sufficient.
  • the smartphone 200 not only performs the control corresponding to FIGS. 15B and 15C, but also performs the control corresponding to FIGS. 16 and 17. However, only the controls corresponding to FIGS. 15B and 15C may be performed.
  • the mode is set such that the reference captured image as a still image is displayed and an instruction image is input.
  • Since the imaging direction is fixed in the same direction by controlling the triaxial structure 90, a substantially fixed image can be obtained even if the reference captured image is displayed as a moving image as it is.
  • the instructor inputs an instruction image in this state and clicks the instruction image transmission button 502.
  • the instruction device 30 and the field device 10 may set the captured image at that time as the reference captured image.
  • the instruction image is projected by the on-site device 10 and the instruction device 30 that can communicate with the on-site device 10.
  • the instruction image 62 may be recorded in the site device 10 and constructed as the site instruction device 11.
  • the functional configuration of the site instruction device 11 constructed in this way is shown in FIG. 11.
  • the processing of the direction control means 20 is similar to steps S1 and S2 in FIG.
  • An instruction image 62 is recorded in the recording section 24.
  • the correction means 26 sets the captured image at that time as the reference captured image.
  • the instruction image 62 is corrected based on the markers or feature points of the reference captured image and the current captured image, and the projection position is controlled. Therefore, the instruction image 62 is projected onto the target object 52.
  • The on-site instruction device 11 may be worn by the on-site person in charge 54, or may be attached to a moving body (or a fixed body) near the person in charge.
  • the instruction image 62 is recorded in the recording unit 24, but it may be acquired from outside the on-site instruction device 11 through communication or the like.
  • FIG. 20 shows a flowchart of instruction processing.
  • the on-site instruction device 11 is configured by a motor control circuit 400 and a smartphone 200.
  • the processing of the motor control circuit 400 is similar to the embodiment described above.
  • the smartphone 200 acquires a captured image (step S71), and determines whether a predetermined marker 60 is captured and falls within a predetermined range (for example, within a predetermined area in the center) of the captured image (step S72).
  • the smartphone 200 records the captured image at that time as a reference captured image (step S73). Subsequently, the current captured image is acquired, compared with the reference captured image, the instruction image is corrected, and the projection position is controlled (step S75).
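A minimal control loop corresponding to steps S71 to S75 might look like the sketch below. It assumes an ArUco-style marker and OpenCV's aruco module (whose exact API differs between OpenCV versions); `capture_image` and `correct_and_project` are hypothetical stand-ins for the device-specific camera and projector interfaces.

```python
import cv2

# ArUco dictionary used for the marker (an assumption, not taken from the embodiment).
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_in_center(corners, frame, margin=0.25):
    """True if the marker centroid lies inside the central area of the frame."""
    h, w = frame.shape[:2]
    cx, cy = corners.reshape(-1, 2).mean(axis=0)
    return margin * w < cx < (1 - margin) * w and margin * h < cy < (1 - margin) * h

def instruction_loop(capture_image, correct_and_project):
    """Rough equivalent of steps S71-S75 for the stand-alone device."""
    reference_image = None
    while True:
        frame = capture_image()                                       # step S71
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)   # step S72
        if reference_image is None:
            if ids is not None and marker_in_center(corners[0], frame):
                reference_image = frame.copy()                        # step S73
        else:
            # step S75: compare the current image with the reference, deform
            # the instruction image and control the projection position.
            correct_and_project(reference_image, frame, corners)
```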
  • this system can be used to project an instruction image onto the item the person wants to buy.
  • a still image is used as the instruction image.
  • a moving image may be used as the instruction image.
  • the on-site device repeatedly reproduces the video.
  • the smartphone, the laser projector/camera integrated body 58, and the rear storage body 59 constitute the on-site device. However, they may be constructed as one. Furthermore, if the rear storage body 59 (motor control circuit) is provided with the functions that a smartphone performs, the smartphone may not be necessary. For example, a dedicated device, a PC, a tablet, a stick type PC, etc. may be used.
  • the direction of the displayed captured image is changed by operating the direction change button 500.
  • the direction of the displayed captured image may be changed by dragging the screen (moving the cursor while holding down the mouse button).
  • FIG. 21 shows the functional configuration of the remote instruction system according to the second embodiment.
  • This system includes an on-site device 10 used by a person in charge at the site and an instruction device 30 used by a remote instructor.
  • An imaging unit 12 and a projection unit 14 are provided on the helmet worn by the person in charge of the field via a drive unit 16.
  • the imaging area of the imaging unit 12 and the projection area of the projection unit 14 are arranged to be substantially the same.
  • the functions of the imaging unit 12, projection unit 14, and drive unit 16 are the same as in the first embodiment.
  • the follow-up control means 21 performs the same function as the direction control means 20 in the first embodiment, and maintains the orientation of the imaging section 12 and the projection section 14 in a predetermined direction.
  • the captured image by the field device 10 is transmitted to the instruction device 30 by the transmitting section 22 under the control of the captured image transmitting means 18.
  • the captured image receiving means 36 of the instruction device 30 receives the captured image by the receiving unit 32.
  • the captured image display unit 40 displays the received captured image. This allows the instructor to view an image of the site space.
  • When giving an instruction, the instructor inputs a fixing command.
  • Then, the captured image display section 40 uses the captured image at that time as the reference captured image and displays it as a still image. Note that since the captured image is kept substantially fixed by the direction control means 20, it may instead be displayed as it is.
  • the instructor inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the input instruction image to the field device 10 by the transmitter 34 .
  • the field device 10 receives the instruction image by the receiving unit 24 and projects the instruction image from the projection unit 14.
  • the tracking control means 21 of the field device 10 controls the drive unit 16 so that the imaging unit 12 follows and captures a characteristic portion image (such as a marker).
  • the instruction image 62 can be projected onto the target object 52.
  • FIG. 22 shows a flowchart of instruction image display control after marker placement.
  • Steps S31 to S33 and steps S51 to S54 are the same as in the first embodiment.
  • step S35 the direction fixing control shown in FIG. 10 is canceled, and thereafter, marker tracking control is performed.
  • The smartphone 200 compares the marker 60 in the reference captured image with the marker 60 in the current captured image, and calculates how the motors 92, 94, and 96 should be controlled to change the direction of the unit 80 so that the position of the marker 60 in the current captured image matches the position of the marker 60 in the reference captured image (step S35). Note that this control signal is calculated so that the inclination of the marker 60 on the image (see FIG. 17) also matches that in the reference captured image.
  • The calculated control signal is transmitted to the motor control circuit 400, and the motors 92, 94, and 96 are controlled by the motor control circuit 400 (step S5). In this way, the imaging direction of the camera 82 (and the irradiation direction of the laser projector 84) is controlled to follow the marker 60.
  • the imaging direction is changed in the direction of the arrow as shown in FIG. 23, and the marker 60 is always imaged at a predetermined position. Note that due to a slight angle change, the marker 60 is not displayed completely correctly (it is slightly distorted), but it can be used as long as it is within an allowable range depending on the purpose of use.
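Informally, this tracking calculation can be pictured as converting the marker's pixel offset from its reference position into pan/tilt corrections for the gimbal, as in the sketch below; the field-of-view values, the small-angle approximation, and the sign conventions are assumptions that would have to be matched to the actual camera and motor axes.

```python
def pan_tilt_correction(marker_xy_now, marker_xy_ref, image_size,
                        fov_deg=(60.0, 40.0)):
    """Convert the marker's pixel offset from its reference position into
    pan/tilt corrections (degrees) for the triaxial structure.

    fov_deg is an assumed horizontal/vertical field of view of the camera;
    the small-angle approximation below treats pixels as proportional to
    angle, and the signs must match the actual motor axis conventions.
    """
    width, height = image_size
    dx = marker_xy_now[0] - marker_xy_ref[0]   # positive: marker drifted right
    dy = marker_xy_now[1] - marker_xy_ref[1]   # positive: marker drifted down
    pan = dx * fov_deg[0] / width              # rotate towards the marker
    tilt = -dy * fov_deg[1] / height           # image y grows downwards
    return pan, tilt
```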
  • In this case, the size of the image changes slightly or the image is projected with some distortion, but it can be used as long as this is within an allowable range depending on the intended use.
  • the marker 60 is controlled to be tracked and imaged by the triaxial structure 90.
  • the instruction image may be modified and the projection position may be controlled by the image processing shown in step S34 of the first embodiment.
  • the instruction image is projected by the on-site device 10 and the instruction device 30 that can communicate with the on-site device 10.
  • the instruction image 62 may be recorded in the site device 10 and constructed as the site instruction device 11.
  • the functional configuration of the site instruction device 11 constructed in this way is shown in FIG. 24.
  • the follow-up control means 21 initially performs steps S1 and S2 in FIG.
  • An instruction image 62 is recorded in the recording section 24.
  • The tracking control means 21 controls the drive unit 16 based on the marker 60 or the feature points of the reference captured image and the current captured image, and tracks and captures the image so that the marker 60 comes to a predetermined position in the captured image. Therefore, the instruction image 62 is projected onto the target object 52.
  • the on-site instruction device 11 may be worn by the on-site person in charge 54, or may be attached to a moving body (or a fixed body) near the on-site person in charge.
  • the instruction image 62 is recorded in the recording unit 24, but it may be acquired from outside the on-site instruction device 11 through communication or the like.
  • FIG. 25 shows a flowchart of instruction processing.
  • the on-site instruction device 11 is configured by a motor control circuit 400 and a smartphone 200.
  • the processing of the motor control circuit 400 is similar to the embodiment described above.
  • the smartphone 200 acquires a captured image (step S71), and determines whether a predetermined marker 60 is captured and falls within a predetermined range (for example, within a predetermined area in the center) of the captured image (step S72).
  • the smartphone 200 records the captured image at that time as a reference captured image (step S73). Subsequently, the current captured image is acquired, compared with the reference captured image, and a motor control signal for tracking and capturing the marker 60 is generated (step S76). This motor control signal is given to motor control circuit 400.
  • the motor control circuit 400 controls the motors 92, 94, and 96, and controls the triaxial structure 90 to follow the marker 60 and image and project it.
  • In this embodiment, the projection section 14 is held by the triaxial structure 90, but the imaging section 12 is capable of capturing wide-angle images and is fixedly provided.
  • FIG. 26 shows the functional configuration of a remote instruction system according to an embodiment of the present invention.
  • This system includes an on-site device 10 used by a person in charge at the site and an instruction device 30 used by a remote instructor.
  • the person in charge at the site is wearing the projection unit 14 via the drive unit 16.
  • the imaging unit 12 is fixedly mounted. Note that in this embodiment, a 360-degree camera is used as the imaging unit 12, so that images can be taken all around.
  • the projection unit 14 is configured so that its projection direction can be changed by a drive unit 16.
  • the projection direction of the projection unit 14 is detected by the sensor 28.
  • the direction control means 20 controls the drive unit 16 based on the output of the sensor 28, and maintains the direction of the projection unit 14 in a predetermined direction centered on the person in charge of the site, regardless of the movement of the person in charge of the site.
  • the imaging unit 12 of the field device 10 images the field space including the object 52 and generates a captured image.
  • The imaging unit 12 captures images over a wide angle (for example, 360 degrees all around), so a captured image that includes the site space can be obtained even if the person in charge of the site moves or changes direction.
  • This captured image is transmitted to the instruction device 30 by the transmitter 22 under the control of the captured image transmitter 18.
  • the captured image receiving means 36 of the instruction device 30 receives the captured image by the receiving unit 32.
  • The captured image display section 40 displays the received captured image. Note that since the captured image covers the entire 360-degree circumference, it is not displayed all at once but only partially. The instructor can look around the site worker by changing the viewing direction up, down, left, and right. Thereby, a captured image in the direction in which the object in the site space appears can be displayed.
  • When giving an instruction, the instructor inputs a fixing command while the captured image including the target object is displayed.
  • Then, the captured image display section 40 uses the partial captured image in that direction as the reference captured image and displays it as a still image. The direction at the time the fixing command is given is transmitted to the field device 10.
  • the instructor inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the input instruction image to the field device 10 by the transmitter 34 .
  • The field device 10 receives the instruction image via the receiving unit 24, controls the drive unit 16 so as to follow the characteristic partial image (such as a feature point of the image) shown in the reference captured image at the time the fixing command was given, and projects the instruction image from the projection unit 14. As a result, the instruction image 62 is projected onto the target object 52.
  • Since the projection direction of the projection unit 14 is controlled so as to follow the characteristic partial image, the instruction image is projected onto the intended location even if the person in charge of the site changes the direction of his or her face. However, if the person in charge of the site moves from place to place, the projected position of the instruction image will shift.
  • Therefore, the correction means 26 of the field device 10 compares the characteristic partial image (such as a feature point of the image) in the reference captured image with the characteristic partial image in the current captured image, and corrects and controls the projection position of the instruction image so that it is projected correctly at the intended position. This allows the instruction image to be displayed at the correct position even if the person in charge of the site moves.
  • the person in charge at the site can confirm the position to be worked on based on the instruction image 62 actually projected onto the site space.
  • This instruction image 62 is displayed at the correct position by the direction control means 20 even if the worker changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the worker's head, making it easier to work. Furthermore, when multiple people are working, the instruction image 62 continues to be projected even if the person in charge wearing the site device 10 moves his or her head significantly, so the work of the other workers is not interrupted. Furthermore, even if the worker moves, the instruction image 62 is displayed correctly.
  • FIG. 27 shows the external appearance of the laser projector/camera integrated body 57.
  • a camera is not provided in the unit 80, but a laser projector 84 is provided. Further, a 360 degree camera 81 is provided as a camera and is fixed to a member 93 of the clip.
  • the field device 10 is composed of a smartphone (not shown), a laser projector/camera integrated body 57, and a rear storage body 59.
  • the field worker 54 wears a helmet 56, and in this embodiment, a laser projector/camera assembly 57 is held at the top of the helmet 56.
  • a rear storage body 59 is provided on the rear end side of the helmet 56 in which a battery and a motor control circuit are stored.
  • the laser projector/camera integrated body 57 and the rear storage body 59 are connected through a signal line/power line (not shown). Furthermore, the field person in charge 54 holds a smartphone (not shown) in his chest pocket. The smartphone, the laser projector/camera integrated body 57, and the rear housing 59 are connected by a signal line/power line (not shown).
  • the system configuration and hardware configuration are the same as in the first embodiment.
  • FIGS. 28 and 29 show flowcharts of the motor control program 32 of the motor control circuit 400, the image control program 224 of the smartphone 200, and the instruction program 322 of the instruction device 30.
  • FIG. 28 is a flowchart of direction change control
  • FIG. 29 is a flowchart of instruction image display control.
  • When the on-site person in charge 54 arrives at the site and stands in front of the target object, the chimney 52, he attaches the laser projector housing 57 to the top of his helmet 58 and turns on the power. It is preferable that a metal fitting for attachment be provided on the top of the helmet 58.
  • the smartphone 200 acquires an image captured by the 360-degree camera 81 via a signal line (or short-range communication), and transmits it to the instruction device 30 via the Internet (step S21). At this time, the smartphone 200 acquires the projection direction of the laser projector 84 (direction based on the helmet 58, that is, the direction on the helmet 58), and transmits it to the instruction device 30.
  • the instruction device 30 records the received captured image in the memory and displays it on the display 306 (step S41).
  • The captured image is an omnidirectional image captured by the 360-degree camera. Therefore, the instruction device 30 displays, of the received captured image, only the partial image that matches the projection direction of the laser projector 84. In this embodiment, control is performed so that the region of the partial image displayed on the instruction device 30 and the projection region of the laser projector 84 match.
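Extracting a partial image in a given direction from an equirectangular 360-degree capture could, for example, be done along the following lines; the yaw/pitch conventions, the pinhole-style re-projection, and the field of view are illustrative assumptions rather than the embodiment's actual method.

```python
import numpy as np
import cv2

def partial_view(equirect_img, yaw_deg, pitch_deg, fov_deg=60.0, out_size=(640, 480)):
    """Render a pinhole-style partial view of an equirectangular image
    looking towards (yaw, pitch)."""
    out_w, out_h = out_size
    src_h, src_w = equirect_img.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)

    # One viewing ray per output pixel, in camera coordinates (y up, z forward).
    x = np.arange(out_w) - out_w / 2
    y = np.arange(out_h) - out_h / 2
    xx, yy = np.meshgrid(x, y)
    zz = np.full_like(xx, f, dtype=np.float64)
    rays = np.stack([xx, -yy, zz], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by pitch (about x) then yaw (about y).
    p, q = np.radians(pitch_deg), np.radians(yaw_deg)
    rot_x = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    rot_y = np.array([[np.cos(q), 0, np.sin(q)], [0, 1, 0], [-np.sin(q), 0, np.cos(q)]])
    rays = rays @ (rot_y @ rot_x).T

    # Convert ray directions to equirectangular pixel coordinates.
    lon = np.arctan2(rays[..., 0], rays[..., 2])        # -pi .. pi
    lat = np.arcsin(np.clip(rays[..., 1], -1, 1))       # -pi/2 .. pi/2
    map_x = ((lon / np.pi + 1) / 2 * src_w).astype(np.float32)
    map_y = ((0.5 - lat / np.pi) * src_h).astype(np.float32)
    return cv2.remap(equirect_img, map_x, map_y, cv2.INTER_LINEAR)
```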
  • the instruction device 30 displays a direction change button 500 on the display 306.
  • When the instructor wants to display a partial captured image in a different direction, the instructor clicks around the direction change button 500 using the mouse 316.
  • When the instruction device 30 detects the click, it displays a partial captured image in the direction corresponding to the click (up, down, left, or right) (step S44).
  • This direction is given from the instruction device 30 to the field device 10 via the smartphone 200, and the direction of the laser projector 84 is controlled to match it. Control is thereby performed so that the region of the partial image displayed on the instruction device 30 and the projection region of the laser projector 84 match. In this way, the instructor can obtain an image of the site in the desired direction.
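A trivial sketch of this direction-change handling is shown below; the click regions, the step size per click, and the `send_direction` callback are made-up illustrations of how the instruction device could keep track of the currently selected direction and forward it to the field device.

```python
STEP_DEG = 10.0  # how far one click moves the view (an assumed value)

class ViewDirection:
    """Current display direction selected on the instruction device."""
    def __init__(self):
        self.yaw = 0.0
        self.pitch = 0.0

    def on_click(self, region):
        # region is 'up', 'down', 'left' or 'right' around button 500
        if region == 'left':
            self.yaw -= STEP_DEG
        elif region == 'right':
            self.yaw += STEP_DEG
        elif region == 'up':
            self.pitch = min(self.pitch + STEP_DEG, 90.0)
        elif region == 'down':
            self.pitch = max(self.pitch - STEP_DEG, -90.0)

def handle_direction_click(view, region, send_direction):
    view.on_click(region)
    # The displayed partial image would be re-rendered here (for example with
    # a routine like the partial_view sketch shown earlier), and the selected
    # direction is forwarded so the projector can be aligned with the display.
    send_direction(view.yaw, view.pitch)
```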
  • the partial captured image in the direction selected by the instructor is displayed as a moving image on the display 306 of the instruction device 30.
  • the instructor clicks the instruction input mode button 501 while looking at this partial captured image and with the target object, the chimney 52, being displayed.
  • the instruction device 30 sets the partial captured image at that time as a reference captured image and displays it on the display 306 as a still image (step S52).
  • the instructor uses the mouse 316 to input an instruction to the on-site person as an instruction image to this still image (step S53). For example, as shown in FIG. 13, in the image of the chimney 52 displayed on the display 306, the position at which a hole should be made is input by drawing an instruction image 62 (in this example, a cross image) using the mouse 316.
  • the instruction device 30 transmits the fixing command and the direction when the fixing command was given to the smartphone 200 (step S51).
  • the smartphone 200 that has received the fixing command determines a reference captured image based on the captured image and direction when receiving the fixing command, and records it in the nonvolatile memory 212 (step S32). Note that after receiving the fixing command, the smartphone 200 does not transmit all captured images to the instruction device 30, but only transmits a partial captured image in the fixed direction to the instruction device 30.
  • In this way, the instruction device 30 and the smartphone 200 can both recognize the partial captured image at the same point in time as the reference captured image.
  • Alternatively, information identifying the frame, such as a frame number, may be transmitted, and the smartphone 200 may determine the reference captured image based on this frame-identifying information; this prevents deviations due to time lag.
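One simple way to realise this frame-based agreement, sketched here with hypothetical message fields, is for the smartphone to keep a short history of recently transmitted frames keyed by frame number and to look the reference frame up when the fixing command arrives.

```python
from collections import OrderedDict

class FrameHistory:
    """Keeps the last N transmitted frames keyed by frame number so that the
    frame named in a fixing command can be recovered exactly."""
    def __init__(self, max_frames=120):
        self.max_frames = max_frames
        self.frames = OrderedDict()

    def add(self, frame_number, image):
        self.frames[frame_number] = image
        while len(self.frames) > self.max_frames:
            self.frames.popitem(last=False)   # drop the oldest frame

    def reference_for(self, fixing_command):
        # fixing_command is assumed to carry the frame number that the
        # instruction device was displaying when the command was issued.
        return self.frames.get(fixing_command["frame_number"])
```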
  • After finishing the input of the instruction image, the instructor clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) shown at the lower right of the reference captured image on the display 306. Thereby, the instruction device 30 transmits the instruction image to the smartphone 200 (step S53).
  • the instruction device 30 cancels the instruction input mode, stops displaying the still image as the reference captured image, and displays the partial captured image of the transmitted captured image as a moving image (step S54). This allows the instructor to know the situation at the site again.
  • Upon receiving the instruction image data of FIG. 14A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image from the camera 82 and extracts a partial captured image based on the received fixed direction (step S33).
  • the smartphone 200 compares the image feature points of the reference captured image and the image feature points of the current partial captured image, and controls the motors 92, 94, and 96 of the triaxial structure 90 to follow the image feature points. At the same time, an instruction image is projected from the laser projector 84 (step S35).
  • the generated signal is transmitted to motor control circuit 400, and motors 92, 94, and 96 are controlled. Thereby, even if the person in charge of the site moves, the instruction image will be projected at the correct position by the laser projector 84.
  • the direction of the unit 80 (laser projector 84) is controlled by the triaxial structure 90.
  • In addition, independently of the control by the triaxial structure 90, the instruction image may be deformed and its projection position controlled based on the characteristic partial images (feature points or markers) by the same processing as in the first embodiment. This allows the instruction image to be projected at a more accurate position.
  • a 360-degree camera 81 that captures images in all directions is used.
  • a camera that captures images at 360 degrees (or a predetermined degree) in the horizontal direction may also be used.
  • In step S21, all of the captured image from the 360-degree camera 81 is transmitted to the instruction device 30.
  • Alternatively, only a partial captured image (or only its information) in the direction corresponding to the direction of the unit 80 may be sent to the instruction device 30. In this way, substantially the same processing as in the first and second embodiments can be performed.
  • Flowcharts for performing such processing are shown in FIGS. 30 and 31.
  • feature points of the image are used as partial feature images, but markers or the like may also be used.
  • the instruction image is projected by the on-site device 10 and the instruction device 30 that can communicate with the on-site device 10.
  • the instruction image 62 may be recorded in the site device 10 and constructed as the site instruction device 11.
  • The functional configuration of the site instruction device 11 constructed in this way is shown in FIG. 32.
  • An instruction image 62 is recorded in the recording section 24.
  • the tracking control means 21 controls the drive unit 16 based on the markers 60 or feature points of the reference captured image and the current captured image, so that the instruction image is projected while following the markers 60. Therefore, the instruction image 62 is projected onto the target object 52.
  • the on-site instruction device 11 may be worn by the on-site person in charge 54, or may be attached to a moving body (or a fixed body) near the on-site person in charge.
  • the instruction image 62 is recorded in the recording unit 24, but it may be acquired from outside the on-site instruction device 11 through communication or the like.
  • FIG. 33 shows a flowchart of instruction processing.
  • the on-site instruction device 11 is configured by a motor control circuit 400 and a smartphone 200.
  • the processing of the motor control circuit 400 is similar to the embodiment described above.
  • the smartphone 200 acquires a captured image (step S71), and determines whether a predetermined marker 60 has been captured (step S72).
  • If so, the smartphone 200 records, as the reference captured image, the partial image in which the marker 60 is captured among the image captured at that time (step S77). At this time, the direction of the partial captured image is selected so that the marker 60 is in a predetermined area (for example, in the center). This direction is also recorded.
  • a partial captured image in the above direction of the current captured image is acquired and compared with the reference captured image to generate a motor control signal for tracking and capturing the marker 60 (step S78).
  • This motor control signal is given to motor control circuit 400.
  • the motor control circuit 400 controls the motors 92, 94, and 96, and controls the triaxial structure 90 to follow and image the marker 60.
  • control for tracking the marker 60 may be performed by the motor control circuit.
  • In the embodiment described above, the projection section 14 is held by the triaxial structure 90 and the wide-angle imaging section 12 is fixedly provided. In this embodiment, not only the imaging section 12 but also the projection section 14 is a wide-angle unit.
  • FIG. 34 shows the functional configuration of a remote instruction system according to an embodiment of the present invention.
  • This system includes an on-site device 10 used by a person in charge at the site and an instruction device 30 used by a remote instructor.
  • the person in charge at the site is wearing a wide-angle projection unit 14 and a wide-angle imaging unit 12.
  • a 360-degree camera is used as the wide-angle imaging unit 12, so that images can be taken all around.
  • a laser projector capable of projecting in all directions of 360 degrees is used as the wide-angle projection unit 14, so that projection can be performed all around the circumference.
  • the imaging unit 12 of the field device 10 images the field space including the object 52 and generates a captured image. As described above, since the imaging unit 12 captures images in all directions of 360 degrees, even if the person in charge of the site moves or changes direction, a captured image including the site space can be obtained.
  • This captured image is transmitted to the instruction device 30 by the transmitter 22 under the control of the captured image transmitter 18.
  • the captured image receiving means 36 of the instruction device 30 receives the captured image by the receiving unit 32.
  • the captured image display section 40 displays the received captured image. Note that since the captured image is an image captured in the entire circumferential direction of 360 degrees, it is not displayed all at once, but is partially displayed (partial captured image).
  • The instructor can look around the site worker by changing the viewing direction up, down, left, and right. Thereby, a partial captured image in the direction in which the object in the site space appears can be displayed.
  • When giving an instruction, the instructor inputs a fixing command while the captured image including the target object is displayed.
  • the captured image display section 40 sets a partial captured image in that direction as a reference captured image and displays it as a still image.
  • the direction when the fixed command is given is transmitted to the field device 10.
  • the instructor inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the input instruction image to the field device 10 by the transmitter 34 .
  • the field device 10 receives the instruction image by the reception unit 24, and controls the projection unit 14 to project the instruction image in the direction based on the direction in which the fixing command was given. As a result, the instruction image 62 is projected onto the target object 52.
  • Since the direction in which the instruction image is projected by the projection unit 14 matches the direction of the reference captured image, the instruction image is projected onto the location intended by the instructor. Although it is possible to implement this control alone, if the person in charge of the site moves from place to place, the projected position of the instruction image will shift.
  • Therefore, the correction means 26 of the field device 10 compares the characteristic partial image (marker, etc.) in the reference captured image with the characteristic partial image in the current captured image, transforms the instruction image, and corrects and controls its projection position so that it is projected correctly at the intended position. This allows the instruction image to be displayed at the correct position even if the person in charge of the site moves.
  • the person in charge at the site can confirm the position to be worked on based on the instruction image 62 actually projected onto the site space.
  • This instruction image 62 is displayed at the correct position by the follow-up control means 21 and the correction means 26 even if the worker changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the worker's head, making it easier to work. Furthermore, when multiple people are working, the instruction image 62 continues to be projected even if the person in charge wearing the site device 10 moves his or her head significantly, so the work of the other workers is not interrupted. Furthermore, even if the worker moves, the instruction image 62 is displayed correctly.
  • the triaxial structure 90 is not used, and a 360 degree camera 81 and a 360 degree laser projector 83 are fixed to the top of the helmet 56.
  • the hardware configuration of the instruction device 30 is the same as that of the first embodiment (see FIG. 7). Furthermore, since the three-axis structure 90 is not used, the motors 92, 94, and 96 for controlling it are unnecessary, and the motor control circuit 400 is also unnecessary.
  • the hardware configuration of the smartphone 200 is the same as that of the first embodiment (see FIG. 8). However, instead of the camera 82 and laser projector 94, a 360 degree camera 81 and a 360 degree laser projector 83 are connected.
  • FIGS. 35 and 36 show flowcharts of the image control program 224 of the smartphone 200 and the instruction program 322 of the instruction device 30.
  • FIG. 35 is a flowchart of direction change control
  • FIG. 36 is a flowchart of instruction image display control.
  • When the on-site person in charge 54 arrives at the site and stands in front of the target object, the chimney 52, he attaches the laser projector housing 57 to the top of his helmet 58 and turns on the power.
  • the smartphone 200 acquires an image captured by the 360-degree camera 81 via a signal line (or short-range communication), and transmits it to the instruction device 30 via the Internet (step S21).
  • the instruction device 30 records the received captured image in the memory and displays it on the display 306 (step S41).
  • the captured image is an omnidirectional image captured by a 360 degree camera. Therefore, the instruction device 30 displays only a partial image in a predetermined direction of the received captured image.
  • the instruction device 30 displays a direction change button 500 on the display 306.
  • When the instructor wants to display a partial captured image in a different direction, he or she clicks around the direction change button 500.
  • When the instruction device 30 detects the click, it displays a partial captured image in the direction corresponding to the click (up, down, left, or right) (step S44). In this way, the instructor can obtain an image of the site in the desired direction.
  • the partial captured image in the direction selected by the instructor is displayed as a moving image on the display 306 of the instruction device 30.
  • The instructor clicks the instruction input mode button 501 while looking at this partial captured image and with the chimney 52, which is the target object, being displayed.
  • the instruction device 30 sets the partial captured image at that time as a reference captured image and displays it on the display 306 as a still image (step S52).
  • the instructor uses the mouse 316 to input an instruction to the on-site person as an instruction image to this still image (step S53). For example, as shown in FIG. 13, in the image of the chimney 52 displayed on the display 306, the position at which a hole should be made is input by drawing an instruction image 62 (in this example, a cross image) using the mouse 316.
  • the instruction device 30 transmits the fixing command and the direction when the fixing command was given to the smartphone 200 (step S51).
  • the smartphone 200 that has received the fixing command determines a reference captured image based on the captured image and direction when receiving the fixing command, and records it in the nonvolatile memory 212 (step S32). Note that after receiving the fixing command, the smartphone 200 does not transmit all captured images to the instruction device 30, but only transmits a partial captured image in the fixed direction to the instruction device 30.
  • In this way, the instruction device 30 and the smartphone 200 can both recognize the partial captured image at the same point in time as the reference captured image.
  • Alternatively, information identifying the frame, such as a frame number, may be transmitted, and the smartphone 200 may determine the reference captured image based on this frame-identifying information; this prevents deviations due to time lag.
  • After finishing the input of the instruction image, the instructor clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) shown at the lower right of the reference captured image on the display 306. Thereby, the instruction device 30 transmits the instruction image to the smartphone 200 (step S53).
  • the instruction device 30 cancels the instruction input mode, stops displaying the still image as the reference captured image, and displays the partial captured image of the transmitted captured image as a moving image (step S54). This allows the instructor to know the situation at the site again.
  • Upon receiving the instruction image data of FIG. 14A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image from the camera 82 and extracts a partial captured image based on the received fixed direction (step S33).
  • The smartphone 200 compares the marker in the reference captured image with the marker in the current partial captured image, and controls the direction in which the instruction image is projected by the 360-degree laser projector so that the instruction image is projected correctly while following the marker (step S34). Furthermore, by comparing the markers, the instruction image is transformed so that it is projected correctly, and its projection position is controlled.
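If the captured image is equirectangular, the projection direction toward the marker can be read almost directly from the marker's pixel position; the following sketch (angle conventions assumed for illustration) converts that position into yaw and pitch for the 360-degree projector.

```python
def direction_from_equirect_pixel(px, py, image_width, image_height):
    """Convert a pixel position in an equirectangular captured image into a
    (yaw, pitch) projection direction in degrees.

    yaw:   0 deg at the centre column, increasing to the right, range -180..180.
    pitch: 0 deg at the horizon (middle row), positive upwards, range -90..90.
    """
    yaw = (px / image_width - 0.5) * 360.0
    pitch = (0.5 - py / image_height) * 180.0
    return yaw, pitch
```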
  • a 360-degree camera 81 that captures images in all directions and a 360-degree laser projector 83 that projects in all directions are used.
  • a camera that captures images at 360 degrees (or a predetermined degree) in the horizontal direction, a laser projector that performs projection, or the like may be used.
  • a marker is used as the partial feature image, but feature points of the image may also be used.
  • the instruction image is projected by the on-site device 10 and the instruction device 30 that can communicate with the on-site device 10.
  • the instruction image 62 may be recorded in the site device 10 and constructed as the site instruction device 11.
  • The functional configuration of the site instruction device 11 constructed in this way is shown in FIG. 37.
  • An instruction image 62 is recorded in the recording section 24.
  • the follow-up control means 21 controls the instruction image to be projected at a desired position based on the marker 60 or feature points of the reference captured image and the current captured image. Therefore, the instruction image 62 is projected onto the target object 52.
  • the on-site instruction device 11 may be worn by the on-site person in charge 54, or may be attached to a moving body (or a fixed body) near the on-site person in charge.
  • the instruction image 62 is recorded in the recording unit 24, but it may be acquired from outside the on-site instruction device 11 through communication or the like.
  • FIG. 38 shows a flowchart of instruction processing.
  • the on-site instruction device 11 includes a 360-degree camera 81, a 360-degree laser projector 83, and a smartphone 200.
  • the smartphone 200 acquires a captured image (step S71), and determines whether a predetermined marker 60 has been captured (step S72).
  • If so, the smartphone 200 records, as the reference captured image, the partial image in which the marker 60 is captured among the image captured at that time (step S77). At this time, the direction of the partial captured image is selected so that the marker 60 is in a predetermined area (for example, in the center). This direction is also recorded.
  • The smartphone 200 compares the marker in the reference captured image with the marker in the current partial captured image, and controls the direction in which the instruction image is projected by the 360-degree laser projector so that the instruction image is projected correctly while following the marker (step S79). Furthermore, by comparing the markers, the instruction image is transformed so that it is projected correctly, and its projection position is controlled.
  • The camera 81 and the projector 83 are attached directly to the helmet. However, they may be attached via a cushioning material such as silicone gel.

Abstract

[Problem] To provide a system capable of projecting an appropriate instruction image from a remote location. [Solution] A helmet worn by an on-site person in charge is provided with an imaging unit 12 and a projection unit 14 via a drive unit 16. An imaging direction of the imaging unit 12 and a projection direction of the projection unit 14 are detected by a sensor 28. A direction control means 20 controls the drive unit 16 on the basis of an output of the sensor 28, and maintains the directions of the imaging unit 12 and the projection unit 14 at prescribed directions centered on the on-site person in charge, regardless of motion of the on-site person in charge. A captured image from the imaging unit 12 of an on-site device 10 is transmitted to an instruction device 30 by a transmission unit 22 through control of a captured image transmission means 18. An instructor inputs an instruction image from an instruction image input unit 44 while watching an on-site reference captured image displayed on a captured image display unit 40. An instruction image transmission means 38 transmits the input instruction image to the on-site device 10 by means of a transmission unit 34. The on-site device 10 receives the instruction image by means of a reception unit 24, and projects the instruction image from the projection unit 14. Therefore, an instruction image 62 is projected on a target object 52.

Description

遠隔指示システム (Remote instruction system)
 この発明は、遠隔から指示画像を現場に投影したり、予め記録されている指示画像を現場に投影したりするシステムに関するものである。 The present invention relates to a system that remotely projects an instruction image onto a site or projects a previously recorded instruction image onto a site.
 遠隔からの指示を現場に投影するためのシステムが提案されている。たとえば、特許文献1には、現場の作業者がカメラとレーザプロジェクタを首から装着し、遠隔の指示者が入力したアノテーションを、現場の対象物に投影するシステムが開示されている。 A system has been proposed for projecting instructions from a remote location onto the site. For example, Patent Document 1 discloses a system in which a worker at a site wears a camera and a laser projector around the neck, and projects annotations input by a remote instructor onto objects at the site.
 このシステムでは、作業者の装着しているカメラにより、対象物の近傍に置かれたマーカを撮像し、このマーカとの関係においてアノテーションを表示するようにしている。したがって、現場の作業者が動いてレーザプロジェクタの向きが変わったとしても、アノテーションが正しい位置に表示される。 In this system, a camera worn by the worker captures an image of a marker placed near the object, and annotations are displayed in relation to this marker. Therefore, even if a worker moves and the direction of the laser projector changes, the annotation will be displayed in the correct position.
 このように、上記システムによれば、遠隔から現場に対し正確にアノテーションを表示することができる。 In this way, according to the above system, annotations can be accurately displayed on-site from a remote location.
特開2019-5095 (JP 2019-5095)
 しかしながら、上記のような従来技術では、現場の作業者が対象物の方を向いている間(すなわち、カメラによってマーカが撮像されている間)は、アノテーションが正しい位置に表示されるが、作業中に、作業者が異なる方向に体の向きを変えると(カメラによってマーカが撮像されなくなると)、アノテーションが表示されなくなってしまう。 However, with the above-mentioned conventional technology, while the on-site worker is facing the object (i.e., while the marker is being imaged by the camera), the annotation is displayed in the correct position, but the During the process, if the worker turns his or her body in a different direction (the marker is no longer captured by the camera), the annotation disappears.
 このため、現場の作業者が大きく向きを変えるとアノテーションが消えてしまい、再び向きを戻すとアノテーションが復活するものの、現場の作業者にストレスを与えることになる。 For this reason, if the on-site worker changes direction significantly, the annotation will disappear, and although the annotation will be restored when the on-site worker changes direction again, it will cause stress to the on-site worker.
 また、複数人で作業をしている際に、レーザプロジェクタを装着した作業者が向きを変えてしまうと、アノテーションが消失し、他の作業者がアノテーションを見ることができなくなるという問題があった。 Additionally, when multiple people are working together, if the worker wearing the laser projector changes direction, the annotation will disappear, making it impossible for other workers to see the annotation. .
 この発明は、上記のような問題点を解決して、作業者が大きく向きを変えても指示画像(アノテーション)が継続して表示されるようなシステムを提供することを目的とする。 An object of the present invention is to solve the above-mentioned problems and provide a system in which an instruction image (annotation) is continuously displayed even if the operator changes direction significantly.
 以下この発明の独立して適用可能ないくつかの特徴を列挙する。 Hereinafter, some independently applicable features of this invention will be listed.
(1)~(5)この発明に係る遠隔指示システムは、指示者が使用する指示装置と、現場担当者が使用する現場装置とを備えた遠隔指示システムであって、
 前記現場装置は、現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、送信部により、前記撮像画像を前記指示装置に送信する撮像画像送信手段と、現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、撮像部および投影部の向きを検出するセンサからの出力を受けて、前記現場担当者または前記移動体の動きに拘わらず、前記撮像部および前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、前記固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段とを備え、
 前記指示装置は、受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、受信した撮像画像を表示する撮像画像表示部と、所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、前記表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部と、前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段とを備えたことを特徴としている。
(1) to (5) The remote instruction system according to the present invention is a remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
The on-site device includes an imaging unit that is attached to a person in charge of the on-site or a mobile body and that captures an image of the on-site space to generate a captured image, and a captured image transmitting unit that transmits the captured image to the instruction device using a transmitting unit. a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data; and a drive that changes the imaging direction of the imaging unit and the projection direction of the projection unit. In response to an output from a sensor that detects the direction of the imaging section and the projection section, the imaging section and the projection section are set in a predetermined position with the on-site personnel at the center, regardless of the movement of the on-site personnel or the mobile object. a direction control means for controlling the drive unit to face the direction; and a characteristic partial image in the reference captured image when the fixing command is given, using the captured image when the fixing command is given as a reference captured image; Based on the comparison with the characteristic partial image in the currently captured image, the projection of the instruction image by the projection unit without depending on the drive unit is performed so that the instruction image is correctly displayed with reference to a predetermined part of the site space. and a correction means for correcting,
The instruction device includes a captured image receiving means for receiving the transmitted captured image by a receiving section, a captured image display section for displaying the received captured image, and a captured image of a desired site space so that the captured image is captured. , a fixing command means for giving a fixing command to the on-site device by a transmitting unit; and an instruction image input unit for inputting an instruction image to a desired position in the on-site space by an operation of an instructor in the displayed captured image. , the transmitting unit transmits an instruction image so that the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the on-site space based on a characteristic partial image of the on-site space included in the captured image. The present invention is characterized by comprising an instruction image transmitting means for transmitting instruction image data specifying the position of to the on-site device.
 したがって、現場作業者の向きに拘わらず、正しい位置に指示画像データを投影することができる。 Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(6)この発明に係る遠隔指示システムは、撮像部および投影部が、駆動部に対して、高周波振動を吸収する部材を介して固定されていることを特徴としている。 (6) The remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the drive section via a member that absorbs high-frequency vibrations.
 したがって、振動の少ない撮像画像を得て、振動の少ない指示画像の投影を実現できる。 Therefore, a captured image with less vibration can be obtained, and an instruction image can be projected with less vibration.
(7)この発明に係る遠隔指示システムは、撮像部および投影部は、前記駆動部を介して、現場担当者のヘルメットに固定されていることを特徴としている。 (7) The remote instruction system according to the present invention is characterized in that the imaging section and the projection section are fixed to the helmet of the person in charge of the field via the drive section.
 したがって、安定して撮像部、投影部を固定することができる。 Therefore, the imaging section and the projection section can be stably fixed.
(8)この発明に係る遠隔指示システムは、方向制御手段が、指示装置からの方向指示に基づいて所定方向を変化させることを特徴としている。 (8) The remote instruction system according to the present invention is characterized in that the direction control means changes a predetermined direction based on a direction instruction from the instruction device.
 したがって、指示者が撮像方向を遠隔から操作して現場の状況を確認することができる。 Therefore, the instructor can remotely control the imaging direction and check the situation at the site.
(9)この発明に係る遠隔指示システムは、特徴部分画像が、前記現場空間内に設けられたマーカまたは前記撮像画像の特徴点であることを特徴としている。 (9) The remote instruction system according to the present invention is characterized in that the characteristic portion image is a marker provided in the site space or a characteristic point of the captured image.
 したがって、マーカまたは特徴点に基づいて、指示画像を正しく表示させることができる。 Therefore, the instruction image can be displayed correctly based on the markers or feature points.
(10)この発明に係る遠隔指示システムは、駆動部が多軸駆動機構を有することを特徴としている。 (10) The remote instruction system according to the present invention is characterized in that the drive section has a multi-axis drive mechanism.
 したがって、自在に撮像部、投影部の向きを制御することができる。 Therefore, the orientation of the imaging section and the projection section can be freely controlled.
(11)(12)この発明に係る現場指示装置は、現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、撮像部および投影部の向きを検出するセンサからの出力を受けて、前記現場担当者または前記移動体の動きに拘わらず、前記撮像部および前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段とを備えている。 (11)(12) The on-site instruction device according to the present invention includes an imaging unit that is attached to a person in charge of the site or a moving body, and that captures an image of the on-site space and generates a captured image; a projection unit that projects an instruction image onto the site space based on given instruction image data; a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit; and an orientation of the imaging unit and the projection unit. In response to an output from a sensor that detects, a drive unit is controlled so that the imaging unit and the projection unit face a predetermined direction with the on-site person at the center regardless of the movement of the on-site person or the moving body. a direction control means, and a direction control means that controls the direction control means without depending on the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space based on a characteristic partial image of the site space included in the captured image. and a correction means for correcting the projection of the instruction image by the projection unit.
 したがって、現場作業者の向きに拘わらず、正しい位置に指示画像データを投影することができる。 Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(13) to (17) A remote instruction system according to the present invention comprises an instruction device used by an instructor and an on-site device used by an on-site worker,
wherein the on-site device comprises: an imaging unit that is mounted on the on-site worker or a moving body and captures an image of the site space to generate a captured image; captured image transmitting means that transmits the captured image to the instruction device via a transmitting unit; a projection unit that is mounted on the on-site worker or the moving body and projects an instruction image onto the site space based on given instruction image data; a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit; and follow-up control means that, taking the captured image at the time the fixing command is given as a reference captured image, controls the drive unit based on a comparison between a characteristic partial image in that reference captured image and a characteristic partial image in the current captured image, so that the instruction image is displayed correctly with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body,
and wherein the instruction device comprises: captured image receiving means that receives the transmitted captured image via a receiving unit; a captured image display unit that displays the received captured image; fixing command means that gives a fixing command to the on-site device via a transmitting unit so that a captured image of the desired site space is captured; an instruction image input unit with which the instructor inputs an instruction image at a desired position in the site space on the displayed captured image; and instruction image transmitting means that transmits, via the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device so that, based on the characteristic partial image of the site space contained in the captured image, the drive unit is controlled and the projection unit of the on-site device projects the instruction image correctly with reference to a predetermined part of the site space.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(18) In the remote instruction system according to the present invention, the imaging unit and the projection unit are fixed to the drive unit via a member that absorbs high-frequency vibration.
Therefore, a captured image with little vibration can be obtained, and the instruction image can be projected with little vibration.
(19) In the remote instruction system according to the present invention, the imaging unit and the projection unit are fixed to the helmet of the on-site worker via the drive unit.
Therefore, the imaging unit and the projection unit can be fixed stably.
(20) In the remote instruction system according to the present invention, the direction control means changes the predetermined direction based on a direction instruction from the instruction device.
Therefore, the instructor can remotely operate the imaging direction and check the situation at the site.
(21) In the remote instruction system according to the present invention, the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
Therefore, the instruction image can be displayed correctly based on the marker or the feature points.
(22) In the remote instruction system according to the present invention, the drive unit has a multi-axis drive mechanism.
Therefore, the orientation of the imaging unit and the projection unit can be controlled freely.
(23) The remote instruction system according to the present invention further comprises correction means that, taking the captured image at the time a fixing command is given as a reference captured image, corrects the projection of the instruction image by the projection unit, without relying on the drive unit, based on a comparison between a characteristic partial image in that reference captured image and a characteristic partial image in the current captured image, so that the instruction image is displayed correctly with reference to a predetermined part of the site space.
Therefore, the instruction image can be projected more accurately.
(24)(25) An on-site instruction device according to the present invention comprises: an imaging unit that is mounted on an on-site worker or a moving body and captures an image of the site space to generate a captured image; a projection unit that is mounted on the on-site worker or the moving body and projects an instruction image onto the site space based on given instruction image data; a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit; and follow-up control means that controls the drive unit based on a characteristic partial image of the site space contained in the captured image so that the instruction image is displayed correctly with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(26) to (30) A remote instruction system according to the present invention comprises an instruction device used by an instructor and an on-site device used by an on-site worker,
wherein the on-site device comprises: an imaging unit that is mounted on the on-site worker or a moving body and captures a wide-angle image of the site space to generate a captured image; captured image transmitting means that transmits the captured image to the instruction device via a transmitting unit; a projection unit that is mounted on the on-site worker or the moving body and projects an instruction image onto the site space based on given instruction image data; a drive unit that changes the projection direction of the projection unit; direction control means that controls the drive unit so that the projection unit faces a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker or the moving body; and correction means that corrects the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is displayed correctly with reference to a predetermined part of the site space,
and wherein the instruction device comprises: captured image receiving means that receives the transmitted captured image via a receiving unit; fixing command means that gives a fixing command to the on-site device via a transmitting unit; an instruction image input unit that, when the fixing command is given, takes the vicinity of the characteristic partial image in the captured image as a captured image of interest and with which the instructor inputs an instruction image at a desired position in the site space on that captured image of interest; and instruction image transmitting means that transmits, via the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device in order to control the drive unit and the projection of the projection unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(31) In the remote instruction system according to the present invention, the projection unit is fixed to the drive unit via a member that absorbs high-frequency vibration.
Therefore, a captured image with little vibration can be obtained, and the instruction image can be projected with little vibration.
(32) In the remote instruction system according to the present invention, the projection unit is fixed to the helmet of the on-site worker via the drive unit.
Therefore, the projection unit can be fixed stably.
(33) In the remote instruction system according to the present invention, the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
Therefore, the instruction image can be projected correctly based on the marker or the feature points.
(34) In the remote instruction system according to the present invention, the drive unit has a multi-axis drive mechanism.
Therefore, the orientation of the imaging unit and the projection unit can be controlled freely.
(35)(36) An on-site instruction device according to the present invention comprises: an imaging unit that is mounted on an on-site worker or a moving body and captures a wide-angle image of the site space to generate a captured image; a projection unit that is mounted on the on-site worker or the moving body and projects an instruction image onto the site space based on given instruction image data; a drive unit that changes the projection direction of the projection unit; direction control means that controls the drive unit so that the projection unit faces a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker or the moving body; and correction means that corrects the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is displayed correctly with reference to a predetermined part of the site space.
Therefore, the instruction image can be projected more accurately.
(37) to (41) A remote instruction system according to the present invention comprises an instruction device used by an instructor and an on-site device used by an on-site worker,
wherein the on-site device comprises: an imaging unit that is mounted on the on-site worker or a moving body and captures a wide-angle image of the site space to generate a captured image; captured image transmitting means that transmits the captured image to the instruction device via a transmitting unit; a projection unit that is mounted on the on-site worker or the moving body and projects an instruction image onto the site space based on given instruction image data; a drive unit that changes the projection direction of the projection unit; and follow-up control means that controls the drive unit so that the instruction image is displayed correctly with reference to a predetermined part of the site space,
and wherein the instruction device comprises: captured image receiving means that receives the transmitted captured image via a receiving unit; fixing command means that gives a fixing command to the on-site device via a transmitting unit; an instruction image input unit that, when the fixing command is given, takes the vicinity of the characteristic partial image in the captured image as a captured image of interest and with which the instructor inputs an instruction image at a desired position in the site space on that captured image of interest; and instruction image transmitting means that transmits, via the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device in order to control the drive unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(42) In the remote instruction system according to the present invention, the projection unit is fixed to the drive unit via a member that absorbs high-frequency vibration.
Therefore, a captured image with little vibration can be obtained, and the instruction image can be projected with little vibration.
(43) In the remote instruction system according to the present invention, the projection unit is fixed to the helmet of the on-site worker via the drive unit.
Therefore, the projection unit can be fixed stably.
(44) In the remote instruction system according to the present invention, the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
Therefore, the instruction image can be projected correctly based on the marker or the feature points.
(45) In the remote instruction system according to the present invention, the drive unit has a multi-axis drive mechanism.
Therefore, the orientation of the imaging unit and the projection unit can be controlled freely.
(46) The remote instruction system according to the present invention further comprises correction means that, taking the captured image at the time a fixing command is given as a reference captured image, corrects the projection of the instruction image by the projection unit, without relying on the drive unit, based on a comparison between a characteristic partial image in that reference captured image and a characteristic partial image in the current captured image, so that the instruction image is displayed correctly with reference to a predetermined part of the site space.
Therefore, the instruction image can be projected more accurately.
(47)(48) An on-site instruction device according to the present invention comprises: an imaging unit that is mounted on an on-site worker or a moving body and captures a wide-angle image of the site space to generate a captured image; a projection unit that is mounted on the on-site worker or the moving body and projects an instruction image onto the site space based on given instruction image data; a drive unit that changes the projection direction of the projection unit; and follow-up control means that controls the drive unit so that the instruction image is displayed correctly with reference to a predetermined part of the site space.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(49) to (53) A remote instruction system according to the present invention comprises an instruction device used by an instructor and an on-site device used by an on-site worker,
wherein the on-site device comprises: an imaging unit that is mounted on the on-site worker or a moving body and captures a wide-angle image of the site space to generate a captured image; captured image transmitting means that transmits the captured image to the instruction device via a transmitting unit; a projection unit that is mounted on the on-site worker or the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data; and follow-up control means that, in response to a fixing command from the instruction device, performs control based on a characteristic partial image of the site space contained in the captured image so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body,
and wherein the instruction device comprises: captured image receiving means that receives the transmitted captured image via a receiving unit; fixing command means that gives a fixing command to the on-site device via a transmitting unit; an instruction image input unit that, when the fixing command is given, takes the vicinity of the characteristic partial image in the captured image as a captured image of interest and with which the instructor inputs an instruction image at a desired position in the site space on that captured image of interest; and instruction image transmitting means that transmits, via the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(54) In the remote instruction system according to the present invention, the imaging unit and the projection unit are fixed via a member that absorbs high-frequency vibration.
Therefore, a captured image with little vibration can be obtained, and the instruction image can be projected with little vibration.
(55) In the remote instruction system according to the present invention, the imaging unit and the projection unit are fixed to the helmet of the on-site worker.
Therefore, the imaging unit and the projection unit can be fixed stably.
(56) In the remote instruction system according to the present invention, the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
Therefore, the instruction image can be projected correctly based on the marker or the feature points.
(57)(58) An on-site instruction device according to the present invention comprises: an imaging unit that is mounted on an on-site worker or a moving body and captures a wide-angle image of the site space to generate a captured image; a projection unit that is mounted on the on-site worker or the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data; and follow-up control means that performs control, based on a characteristic partial image of the site space contained in the captured image, so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
Therefore, the instruction image data can be projected at the correct position regardless of the orientation of the on-site worker.
(59) An integrated imaging and projection unit according to the present invention comprises: an imaging unit; a projection unit that projects over substantially the same angle of view as the imaging angle of view of the imaging unit; a structure capable of orienting the imaging unit and the projection unit as one body in at least two axial directions; and a drive unit that controls the orientation of the structure.
Therefore, when an instruction image or the like is projected onto an object, control for displaying the instruction image accurately is easy even when the unit is attached to a moving person or the like.
In the embodiment, the "direction control means" corresponds to steps S1 and S2.
In the embodiment, the "correction means" corresponds to steps S34 and S75.
In the embodiment, the "fixing command means" corresponds to step S51.
In the embodiment, the "instruction image transmitting means" corresponds to step S53.
In the embodiment, the "follow-up control means" corresponds to steps S35, S76, and S79.
The term "device" covers not only a device consisting of a single computer but also one consisting of multiple computers connected via a network or the like. Accordingly, when the means of the present invention (or part of them) are distributed over multiple computers, those computers together constitute the device.
The term "program" covers not only a program that can be executed directly by a CPU, but also a program in source form, a compressed program, an encrypted program, a program that performs its functions in cooperation with an operating system, and the like.
The drawings are, in order:
FIG. 1: A functional configuration diagram of a remote instruction system according to an embodiment of the present invention.
A diagram showing an on-site worker 54 wearing the on-site device.
An external view of the laser projector/camera integrated body 58.
A diagram showing the structure for attaching the laser projector/camera integrated body 58 to the mount member 97.
A diagram showing the attachment positions of the silicone gel bushes 120 (high-frequency vibration absorbing members) on the unit 80.
The system configuration of the remote instruction system.
The hardware configuration of the instruction device 30.
The hardware configuration of the motor control circuit 400.
The hardware configuration of the smartphone 200.
A flowchart of the direction fixing control.
A flowchart of the instruction image display control.
A captured image displayed on the display 306 of the instruction device 30.
A diagram showing a captured image on which the instruction image 62 has been drawn in the instruction device 30.
A diagram showing the data structure of the instruction image.
A diagram showing the relationship between the movement of the on-site worker and the projection direction of the laser projector 84.
A diagram showing the relationship between the movement of the on-site worker and the projection direction of the laser projector 84.
A diagram showing the relationship between the movement of the on-site worker and the projection direction of the laser projector 84.
An example in which feature points 512 are used instead of markers.
The functional configuration of the on-site instruction device 11.
A flowchart of instruction processing by the on-site instruction device 11.
A functional configuration diagram of a remote instruction system according to a second embodiment.
A flowchart of the instruction image display control.
A diagram showing the relationship between the movement of the on-site worker and the projection direction of the laser projector 84.
The functional configuration of the on-site instruction device 11.
A flowchart of instruction processing by the on-site instruction device 11.
A functional configuration diagram of a remote instruction system according to a third embodiment.
An external view of the laser projector/camera integrated body 58.
A flowchart of the direction fixing control.
A flowchart of the instruction image display control.
A flowchart of the direction fixing control.
A flowchart of the instruction image display control.
A functional configuration diagram of the on-site instruction device 11.
A flowchart of instruction processing by the on-site instruction device 11.
A functional configuration diagram of a remote instruction system according to a fourth embodiment.
A flowchart of the direction fixing control.
A flowchart of the instruction image display control.
A functional configuration diagram of the on-site instruction device 11.
A flowchart of instruction processing by the on-site instruction device 11.
1. First embodiment
1.1 Functional Configuration
FIG. 1 shows the functional configuration of a remote instruction system according to an embodiment of the present invention. This system comprises an on-site device 10 used by an on-site worker and an instruction device 30 used by a remote instructor.
An imaging unit 12 and a projection unit 14 are mounted, via a drive unit 16, on the helmet worn by the on-site worker. The imaging area of the imaging unit 12 and the projection area of the projection unit 14 are arranged to be substantially the same.
The imaging unit 12 and the projection unit 14 are configured so that their imaging direction and projection direction can be changed as one body by the drive unit 16. The imaging direction of the imaging unit 12 and the projection direction of the projection unit 14 are detected by a sensor 28. Based on the output of the sensor 28, direction control means 20 controls the drive unit 16 and keeps the imaging unit 12 and the projection unit 14 oriented in a predetermined direction centered on the on-site worker, regardless of the worker's movement.
The imaging unit 12 of the on-site device 10 captures an image of the site space including the object 52 and generates a captured image. As described above, the imaging direction of the imaging unit 12 is fixed in a predetermined direction centered on the on-site worker; therefore, as long as the worker does not move to another location, a substantially fixed captured image is obtained even if the worker turns his or her face in a different direction on the spot.
This captured image is transmitted to the instruction device 30 by the transmitting unit 22 under the control of the captured image transmitting means 18. The captured image receiving means 36 of the instruction device 30 receives the captured image via the receiving unit 32. The captured image display unit 40 displays the received captured image, allowing the instructor to view an image of the site space.
When giving an instruction, the instructor inputs a fixing command. When the fixing command is input, the captured image display unit 40 takes the captured image at that moment as the reference captured image and displays it as a still image. Since the captured image is kept substantially fixed by the direction control means 20, the captured image may also simply be displayed as is.
While viewing the reference captured image of the site displayed on the captured image display unit 40, the instructor inputs an instruction image from the instruction image input unit 44. The instruction image transmitting means 38 transmits the input instruction image to the on-site device 10 via the transmitting unit 34.
The on-site device 10 receives the instruction image via the receiving unit 24 and projects it from the projection unit 14, so that the instruction image 62 is projected onto the object 52. As described above, the projection direction of the projection unit 14 is fixed, so even if the on-site worker turns his or her face, the instruction image is projected at the location intended by the instructor. However, if the worker moves to another location, the projection position of the instruction image will shift.
Therefore, the correction means 26 of the on-site device 10 compares the characteristic partial image (such as a marker) in the reference captured image with the characteristic partial image in the current captured image and corrects the projection position of the instruction image so that it is projected correctly at the intended position. As a result, the instruction image is displayed at the correct position even if the on-site worker moves.
To realize the above, when the instruction device 30 receives the fixing command, it transmits the fixing command to the on-site device 10 via the transmitting unit 34. The receiving unit 24 of the on-site device 10 receives the command and records the captured image at that time as the reference captured image. The on-site worker also places a marker near the object 52 so that it is captured in the image.
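By way of illustration only (none of the following code appears in the patent), the simplest form of such a correction can be pictured as shifting the instruction image by the displacement of the marker between the reference and current captured images. The sketch below assumes the camera and projector share the same field of view, so pixel offsets carry over directly, and that marker corner detection is handled elsewhere; the function name and array layout are assumptions.

import numpy as np

def correct_projection_offset(ref_marker_corners, cur_marker_corners, instruction_xy):
    """Shift the instruction image so it stays anchored to the marker.

    ref_marker_corners / cur_marker_corners: (4, 2) arrays of the marker's
    corner pixels in the reference and current captured images.
    instruction_xy: (x, y) pixel position that was correct for the reference image.
    Returns the corrected (x, y) position for the current frame.
    """
    # Translation of the marker centre between the two frames.  A fuller
    # implementation could also use the corner layout to compensate for
    # scale (moving closer or farther) and rotation.
    ref_center = np.asarray(ref_marker_corners, dtype=float).mean(axis=0)
    cur_center = np.asarray(cur_marker_corners, dtype=float).mean(axis=0)
    shift = cur_center - ref_center
    return np.asarray(instruction_xy, dtype=float) + shift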
The on-site worker can confirm the position to be worked on from the instruction image 62 actually projected onto the site space. The instruction image 62 is kept at the correct position by the direction control means 20 even when the worker turns his or her head, so the image does not disappear when the worker looks away, which makes the work easier. When several people are working together, the instruction image 62 continues to be projected even if the worker wearing the on-site device 10 moves his or her head significantly, so the work of the other workers is not interrupted. Furthermore, the instruction image 62 is displayed correctly even if the worker moves.
1.2 Appearance and Hardware Configuration
FIG. 2 shows an on-site worker 54 wearing the on-site device 10. In the following description, the on-site worker 54 drills a hole in the side of a chimney 52 on a roof 50, and an instructor gives instructions for this work remotely.
In this embodiment, the on-site device 10 is composed of a smartphone (not shown), a laser projector/camera integrated body 58, and a rear storage body 59. The on-site worker 54 wears a helmet 56; the laser projector/camera integrated body 58 is fixed to the front visor, and the rear storage body 59, which houses a battery and a motor control circuit, is provided at the rear end. The two are connected by signal and power lines (not shown). The on-site worker 54 also carries a smartphone (not shown) in a chest pocket, which is connected to the laser projector/camera integrated body 58 and the rear storage body 59 by signal and power lines (not shown).
FIG. 3 shows the appearance of the laser projector/camera integrated body 58, which is provided with a clip 96. The clip 96 is biased by a spring member (not shown) in the direction in which its members 93 and 95 close about a shaft 91. Pressing the lever 98 in the direction of arrow A opens the members 93 and 95; after the visor of the helmet 56 is inserted between them, releasing the lever allows the spring member to hold the clip on the visor.
The laser projector/camera integrated body 58 is preferably attached so that the camera 82 and the laser projector 84 are located roughly in front of the dominant eye (the right eye for a right-handed worker). That is, it is held on the visor of the helmet 56 so that the camera 82 and the laser projector 84 are positioned below the visor (upside down relative to the state shown in FIG. 3). If the laser projector/camera integrated body 58 obstructs the line of sight in this arrangement, it may instead be held on the visor so that the camera 82 and the laser projector 84 are positioned above the visor (as shown in FIG. 3).
A unit 80 housing the camera 82 and the laser projector 84 is fixed to the clip 96 via a triaxial structure 90 (another multiaxial structure may be used) that serves as the drive unit. In this embodiment, the member 93 of the clip 96 also serves as the base 93 of the triaxial structure 90.
A motor 92 is fixed to the base 93 of the triaxial structure 90, and one end of an intermediate member 92A, which is rotated in the XY plane by the motor 92, is connected to it. The intermediate member 92A is L-shaped, and a motor 94 is fixed to its other end. One end of an intermediate member 94A, which is rotated in the ZX plane by the motor 94, is connected to the motor 94. The intermediate member 94A is also L-shaped, and a motor 96 is fixed to its other end. A mount member 97, which is rotated in the ZY plane by the motor 96, is connected to the motor 96. Note that the XYZ axes shown in FIG. 3 change as the members 92A, 94A, and 97 rotate.
In this way, by driving the motors 92, 94, and 96, the triaxial structure 90 can adjust the orientation of the mount member 97 with three axes of freedom.
The base 93 is also provided with a three-axis gyro sensor JS and a three-axis acceleration sensor AS as the sensor 28. The motors 92, 94, and 96 are controlled by a motor control circuit (not shown) based on the outputs of the three-axis gyro sensor JS and the three-axis acceleration sensor AS.
The unit 80 housing the camera 82 and the laser projector 84 is fixed to the mount member 97 of the triaxial structure 90. As shown in FIG. 4, a camera control circuit 102 and a laser projector control circuit 104, which control the camera 82 and the laser projector 84, are provided inside the housing 81 of the unit 80.
The camera control circuit 102 and the laser projector control circuit 104 may be provided in the rear storage body 59, but it is preferable that at least the MEMS circuit of the laser projector 84 be provided in the unit 80.
The housing 81 is attached to the mount top surface 101, the mount side surface 97, and the mount bottom surface 99 of the triaxial structure 90 via silicone gel bushes 120 (for example, Taica gel bush B-1 vibration-damping material). In FIG. 3, the mount top surface 101 is omitted for clarity.
As shown in FIG. 4, the silicone gel bush 120 comprises a ring-shaped silicone gel 114 fitted over the outside of the upper part of a ring-shaped silicone gel 116. The upper part of the silicone gel 116 is inserted into a hole provided in the housing 81, so that the housing 81 is sandwiched between the silicone gel 114 and the silicone gel 116. The silicone gels 114 and 116 are screwed to the mount bottom surface 99 with a bolt 110 and a washer 112. With this structure, the housing 81 is held by the silicone gels 114 and 116, which prevents high-frequency vibration from being transmitted to the housing 81 from outside.
In this embodiment, as shown in FIG. 5, silicone gel bushes 120 are provided at two locations on each of the top, side, and bottom surfaces of the housing 81.
FIG. 6 shows the system configuration of the remote instruction system. The on-site device 10 and the instruction device 30 are connected via the Internet. They may exchange data directly over the Internet, or may exchange data via a server device. The on-site device 10 comprises the laser projector/camera integrated body 58, the rear storage body 59 (motor control circuit and battery), and a smartphone 200, which are connected to one another by signal and power lines.
FIG. 7 shows the hardware configuration of the instruction device 30. A memory 304, a display 306, a microphone 308, a communication circuit 310, an SSD 312, a DVD-ROM drive 314, a mouse/keyboard 316, and a speaker 318 are connected to a CPU 302. The communication circuit 310 is for connecting to the Internet.
An operating system 320 and an instruction program 322 are recorded on the SSD 312. The instruction program 322 performs its functions in cooperation with the operating system 320. These programs were recorded on a DVD-ROM 324 and installed via the DVD-ROM drive 314. The microphone 308 and the speaker 318 are used to talk with the on-site worker.
FIG. 8 shows the hardware configuration of the motor control circuit 400. A memory 404, the gyro sensor JS, the acceleration sensor AS, the camera 82, the laser projector 84, the motors 92, 94, and 96, and a nonvolatile memory 406 are connected to a CPU 402. The camera 82 is connected via the camera control circuit 102 and the laser projector 84 via the laser projector control circuit 104, but these circuits are omitted in the figure.
An operating system 31 and a motor control program 32 are recorded in the nonvolatile memory 406. The motor control program 32 performs its functions in cooperation with the operating system 31.
FIG. 9 shows the hardware configuration of the smartphone 200. A memory 204, a touch display 206, a short-range communication circuit 208, the camera 82, the laser projector 84, an SSD 212, a speaker 214, a microphone 216, and a communication circuit 218 are connected to a CPU 202. The ordinary telephone call circuit is omitted in the figure.
The communication circuit 218 is a circuit for connecting to the Internet. The speaker 214 and the microphone 216 are used to talk with the instructor over the Internet. An operating system 222 and an image control program 224 are recorded on the SSD 212. The image control program 224 performs its functions in cooperation with the operating system 222.
1.3 Remote Instruction Processing
FIGS. 10 and 11 show flowcharts of the motor control program 32 of the motor control circuit 400, the image control program 224 of the smartphone 200, and the instruction program 322 of the instruction device 30. FIG. 10 is a flowchart of the direction fixing control, and FIG. 11 is a flowchart of the instruction image display control.
When the on-site worker 54 arrives at the site and comes in front of the chimney 52, which is the object, the worker attaches the laser projector/camera integrated body 58 to the visor of the helmet 56 and turns on the power. The camera 82 then captures images of the vicinity of the chimney 52.
The CPU 402 of the motor control circuit 400 (hereinafter sometimes simply called the motor control circuit 400) acquires the outputs of the gyro sensor JS and the acceleration sensor AS of the laser projector/camera integrated body 58 (step S1). In this embodiment, a gyro sensor and an acceleration sensor for three orthogonal axes are used.
Based on the outputs of the gyro sensor JS and the acceleration sensor AS, the motor control circuit 400 calculates the position and orientation of the base 93 (see FIG. 3) in three-dimensional space. It then controls the rotation angles of the motors 92, 94, and 96 so that the unit 80 faces a constant direction regardless of the position and orientation of the base 93 (step S2). The unit 80 is therefore kept facing a constant direction regardless of the orientation of the head of the on-site worker 54. This control is similar to that of a gimbal used as a stabilizing device for cameras and the like.
As described above, the camera 82 is kept pointing toward the vicinity of the chimney 52 regardless of the orientation of the head of the on-site worker 54, and a stable captured moving image can be obtained.
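As a rough, illustrative sketch of this gimbal-like behaviour (steps S1 and S2), the loop below integrates assumed gyro and accelerometer readings and commands three motors to cancel the motion of the base. The imu and motors objects and their methods are placeholders introduced for the example, not an interface described in the patent, and the complementary filter is heavily simplified.

import math
import time

def accel_pitch(a):
    # Pitch estimated from the gravity vector; a = (ax, ay, az) in g units.
    return math.degrees(math.atan2(-a[0], math.hypot(a[1], a[2])))

def accel_roll(a):
    return math.degrees(math.atan2(a[1], a[2]))

def direction_fixing_loop(imu, motors, target_deg):
    """Keep the unit facing target_deg = (yaw, pitch, roll) as the base moves."""
    yaw = pitch = roll = 0.0            # estimated orientation of the base
    last = time.monotonic()
    while True:
        gyro, accel = imu.read()        # step S1: read gyro rates (deg/s) and accel (g)
        now = time.monotonic()
        dt, last = now - last, now
        # Integrate gyro rates; blend in the accelerometer so pitch/roll do not drift.
        yaw += gyro[2] * dt
        pitch = 0.98 * (pitch + gyro[1] * dt) + 0.02 * accel_pitch(accel)
        roll = 0.98 * (roll + gyro[0] * dt) + 0.02 * accel_roll(accel)
        # Step S2: command the three motors to cancel the base motion.
        motors.set_angles(target_deg[0] - yaw,
                          target_deg[1] - pitch,
                          target_deg[2] - roll)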
The CPU 202 of the smartphone 200 (hereinafter sometimes simply called the smartphone 200) acquires the image captured by the camera 82 via the signal line (or short-range communication) and transmits it to the instruction device 30 via the Internet (step S21).
The CPU 302 of the instruction device 30 (hereinafter sometimes simply called the instruction device 30) displays the received captured image on the display 306 (step S41). FIG. 12 shows an example of the captured image displayed on the display 306; the chimney 52 and its vicinity are shown. The instructor operating the instruction device 30 can thus check the situation at the site in real time as a moving image.
Using the microphone 308 and the speaker 318, the instructor can talk with the smartphone 200 of the on-site worker over an Internet call. This allows the instructor, while looking at the display 306, to guide the worker to the desired position by conversation.
As shown in FIG. 12, the instruction device 30 also displays a direction change button 500 on the display 306. When the instructor wants to change the imaging direction of the camera 82, he or she clicks on the circumference of the direction change button 500 with the mouse 316. When the instruction device 30 detects the click, it generates an imaging direction change command for changing the imaging direction in the corresponding direction (up, down, left, or right) and transmits it to the smartphone 200 (step S42).
The smartphone 200 forwards the received imaging direction change command to the motor control circuit 400 (step S22). The motor control circuit 400 controls the motors 92, 94, and 96 based on the received command and changes the orientation of the unit 80 (that is, the imaging direction) (step S3). From then on, the orientation of the unit is fixed in the changed direction regardless of the orientation of the worker's head, so the captured image displayed on the display 306 of the instruction device 30 reflects the direction specified by the instructor. In this way, the instructor can obtain an image of the site in any desired direction.
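A minimal sketch of how the imaging direction change command might be represented, and how it could offset the target orientation held by the direction fixing loop, is shown below. The message fields and the step size are assumptions made for illustration, not values given in the patent.

# Hypothetical command passed from the instruction device to the smartphone and
# forwarded to the motor control circuit (step S42 -> S22 -> S3).
direction_change_command = {
    "type": "imaging_direction_change",
    "direction": "left",      # one of "up", "down", "left", "right"
    "step_deg": 5.0,          # assumed increment per click
}

def apply_direction_change(target_deg, cmd):
    """Offset the fixed target orientation (yaw, pitch, roll) kept by the loop."""
    yaw, pitch, roll = target_deg
    if cmd["direction"] == "left":
        yaw -= cmd["step_deg"]
    elif cmd["direction"] == "right":
        yaw += cmd["step_deg"]
    elif cmd["direction"] == "up":
        pitch += cmd["step_deg"]
    elif cmd["direction"] == "down":
        pitch -= cmd["step_deg"]
    return (yaw, pitch, roll)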
The direction fixing control in FIG. 10 is executed repeatedly. In parallel with it, the instruction image display control in FIG. 11 is performed. Based on the instructor's verbal instructions, the on-site worker takes out a marker 60 prepared in advance and attaches it to the chimney 52, which is the object. The size and shape of the image of the marker 60 (the characteristic partial image) are recorded in advance in the nonvolatile memory 212 of the smartphone 200, so the smartphone 200 can calculate the distance and direction from the camera 82 to the marker 60 based on the captured image of the marker 60.
As described above, through the processing shown in FIG. 10, the captured image is displayed as a moving image on the display 306 of the instruction device 30. While viewing this captured image, the instructor clicks the instruction input mode button 501 while the marker 60 is being captured, as shown in FIG. 12, to give a fixing command.
In response, when the fixing command is given by clicking the instruction input mode button 501, the instruction device 30 takes the captured image at that moment as the reference captured image and displays it on the display 306 as a still image (step S52). On this still image, the instructor inputs an instruction for the on-site worker as an instruction image using the mouse 316 (step S53). For example, as shown in FIG. 13, on the image of the chimney 52 displayed on the display 306, the instructor draws and inputs with the mouse 316 an instruction image 62 indicating the position where the hole should be drilled (in this example, a cross image whose intersection indicates the drilling position).
The instruction device 30 also transmits the fixing command to the smartphone 200 (step S51). Upon receiving the fixing command, the smartphone 200 records the captured image at the time of reception in the nonvolatile memory 212 as the reference captured image (step S32).
Therefore, the instruction device 30 and the smartphone 200 can recognize the captured image at the same point in time as the reference captured image. To avoid a time lag due to communication, information identifying the frame (such as a frame number) may be attached when the fixing command is transmitted from the instruction device 30 to the smartphone 200. By determining the reference captured image in the smartphone 200 based on this frame-identifying information, a mismatch due to the time lag can be prevented.
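One way to picture this frame-identifying information is a fixing command that carries the number of the frame shown on the display when the button was clicked, with the smartphone keeping a short buffer of recent frames. The field names and buffering scheme below are assumptions for illustration only.

# Hypothetical shape of the fixing command; frame_number lets the smartphone
# pick exactly the same frame as the instruction device despite network delay.
fixing_command = {
    "type": "fix",
    "frame_number": 18342,    # frame shown on the display when the button was clicked
}

# Smartphone side (sketch): keep recent frames so the frame named in the
# command can still be retrieved when the command arrives.
recent_frames = {}            # frame_number -> image, bounded in practice

def on_fixing_command(cmd):
    reference_captured_image = recent_frames.get(cmd["frame_number"])
    return reference_captured_image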
When the instructor has finished inputting the instruction image, he or she clicks the instruction image transmission button 502 (displayed in instruction input mode) at the lower right of the reference captured image on the display 306. The instruction device 30 then transmits the instruction image to the smartphone 200 (step S53).
The instruction device 30 also cancels the instruction input mode, stops displaying the still image serving as the reference captured image, and displays the transmitted captured image as a moving image (step S54). This allows the instructor to see the situation at the site again.
FIG. 14A shows the data structure of the instruction image transmitted to the smartphone 200. The instruction image data is the actual data of the instruction image input by the instructor, as shown in FIG. 14B. The reference coordinate position is, as shown in FIG. 14C, the XY coordinate value of the reference point of the instruction image when the reference point of the marker image (for example, the point at the bottom center of the "M") is taken as the origin. In this embodiment, as shown in FIG. 14B, the upper left corner of the rectangle circumscribing the instruction image input by the instructor is used as the reference point of the instruction image.
Although the instruction image is transmitted as image data in the above description, its parameters may instead be transmitted as numerical values according to a predetermined image shape. For example, a circle may be represented and transmitted as its center coordinates and radius, and a square as its upper-left coordinates and side length.
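The payload of FIG. 14A and the numeric-parameter alternative can be sketched roughly as follows; all field and type names are assumptions made for illustration, not definitions from the patent.

from dataclasses import dataclass

@dataclass
class InstructionImageData:
    """Sketch of the payload of FIG. 14A.

    reference_xy is the position of the instruction image's reference point
    (upper-left corner of its circumscribing rectangle) measured from the
    marker's reference point, as in FIG. 14C.
    """
    image_png: bytes          # the drawn instruction image itself (FIG. 14B)
    reference_xy: tuple       # (x, y) offset from the marker reference point

# Parametric alternative mentioned above: send shape parameters as numbers
# instead of pixels, e.g. a circle or a square.
circle_instruction = {"shape": "circle", "center_xy": (120, 80), "radius": 15}
square_instruction = {"shape": "square", "top_left_xy": (100, 60), "side": 30}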
Upon receiving the instruction image data of FIG. 14A, the smartphone 200 holds it in the memory 204. The smartphone 200 then acquires the current captured image from the camera 82 (step S33).
As described above, the imaging range of the camera 82 and the projection range of the laser projector 84 are configured to coincide. Therefore, if the current captured image is exactly the same as the recorded reference captured image (that is, if the on-site worker has not moved at all since the reference captured image was taken), projecting the instruction image data with the laser projector 84 at the position given by the reference coordinate position (FIG. 14C) causes the instruction image 62 to appear on the chimney 52, as shown in FIG. 2.
Since the position of this instruction image 62 matches the position the instructor entered on the display 306, the work position can be indicated accurately to the on-site worker. Using the instruction image 62 as a guide, the on-site worker drills a hole at that position.
As shown in FIG. 15A, even when the helmet 56 is rotated horizontally (or vertically) about the vertical (or horizontal) axis of the head of the on-site worker 54, the direction fixing control of FIG. 10 keeps the instruction image 62 displayed at the correct position, as long as the distance and direction from the camera 82 and laser projector 84 to the instruction image do not change (subject to a mechanical limit on the amount of rotation).
However, as shown by the broken line in FIG. 15B, when the on-site worker 54 (helmet 56) moves closer to or farther from the object 52, the instruction image 62 projected onto the site becomes larger or smaller.
Also, as shown by the broken line in FIG. 15C, when the on-site worker 54 (helmet 56) moves to the left or right, the direction fixing control of FIG. 10 only keeps the imaging direction indicated by the arrow constant relative to the on-site worker 54. The imaging range of the camera 82 therefore shifts sideways, and the instruction image 62 projected by the laser projector 84 is likewise displayed at a laterally shifted position. The same occurs when the on-site worker 54 moves vertically (for example, standing up from a crouch): the instruction image 62 shifts up or down.
To eliminate these problems and project the instruction image 62 correctly, this embodiment performs the following processing.
The smartphone 200 calculates the distance and direction from the camera 82 to the marker 60 (and to the place where the instruction image 62 is to be projected) based on the image of the marker 60 in the reference captured image. As described above, a known pattern is printed on the marker 60 in advance, so the distance and direction to the marker 60 attached to the chimney 52 (and to the place where the instruction image 62 is to be projected) can be calculated from the captured image.
The smartphone 200 also calculates the distance and direction to the marker 60 attached to the chimney 52 (and to the place where the instruction image 62 is to be projected) based on the current captured image acquired in step S33.
The smartphone 200 then deforms the instruction image 62 and controls the position at which it is projected (step S34), based on a comparison between the distance and direction to the marker 60 (and to the projection location) at the time of the reference captured image and the distance and direction to the marker 60 (and to the projection location) in the current captured image.
For example, in the case of FIG. 15B, the instruction image 62 is enlarged or reduced before projection according to the change in the distance between the camera 82 and the marker 60 (or the instruction image 62).
In the case of FIG. 15C, the position at which the instruction image 62 is projected is moved to follow the movement of the marker 60.
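A minimal sketch of this comparison, assuming an ArUco-style marker of known printed size and the OpenCV ArUco module, is shown below. It estimates the camera-to-marker pose in the reference and current frames and derives the scale correction of FIG. 15B and the positional shift of FIG. 15C; the function and parameter names are illustrative only.

```python
import cv2
import numpy as np

ARUCO = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
MARKER_SIDE_M = 0.10                      # assumed printed marker side length (m)

def marker_pose(gray, camera_matrix, dist_coeffs):
    corners, ids, _ = cv2.aruco.detectMarkers(gray, ARUCO)
    if ids is None:
        return None
    _, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIDE_M, camera_matrix, dist_coeffs)
    return corners[0][0], tvecs[0][0]     # marker corners (px) and translation (m)

def adjust_instruction(instr_img, ref_pose, cur_pose):
    ref_corners, ref_t = ref_pose
    cur_corners, cur_t = cur_pose
    # FIG. 15B: keep the projected size constant by scaling with the distance ratio
    scale = np.linalg.norm(ref_t) / np.linalg.norm(cur_t)
    resized = cv2.resize(instr_img, None, fx=scale, fy=scale)
    # FIG. 15C: shift the projection by the marker's motion in the image
    shift = cur_corners.mean(axis=0) - ref_corners.mean(axis=0)
    return resized, shift                 # caller offsets the projector output by `shift`
```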
In this embodiment, direction fixing control by the triaxial structure 90 (see FIG. 10) is performed separately, so in many cases the instruction image 62 can be displayed at the correct position by performing the controls corresponding to FIGS. 15B and 15C.
Because the above control is performed on top of the separate direction fixing control by the triaxial structure 90, the instruction image 62 can be displayed stably at the correct position. Moreover, even if the on-site worker 54 turns his or her head and looks away from the object 52, the direction fixing control keeps the instruction image 62 displayed. This reduces the stress on the on-site worker 54 during the work.
When the limit of the direction fixing control by the triaxial structure 90 is exceeded, however, the inclination of the surface 510 of the object 52 to which the marker 60 is attached, as seen from the camera 82 (laser projector 84), may change between the time of the reference captured image in FIG. 16A and the time of the current captured image in FIG. 16B, as schematically shown in FIG. 16.
In this case, the smartphone 200 calculates the inclination of the surface 510 of the object 52 based on the image of the marker 60 in the reference captured image (FIG. 16A). From this and the reference coordinate position PL1 (X or Y) sent from the instruction device 30, it calculates the actual distance LL between the marker 60 and the instruction image 62.
Next, it calculates the inclination of the surface 510 of the object 52 based on the image of the marker 60 in the current captured image (FIG. 16B). The position at which the instruction image 62 should be displayed is then determined from the actual distance LL calculated above, and the reference coordinate position PL2 (X or Y) is calculated. Based on this reference coordinate position PL2, the smartphone 200 controls the position at which the instruction image 62 is projected, so that the instruction image 62 is projected at the correct position. The instruction image 62 is also deformed so that the projected image is not distorted.
The above processing can be performed in the same way in both the vertical and horizontal directions.
Furthermore, as shown in FIG. 17, the imaging range 504 at the time of the reference captured image may become tilted, as shown by imaging range 506. FIG. 17 shows a tilt in the direction horizontal to the plane of the drawing, but such tilting can occur in any of the three dimensions, and it would also distort the projected instruction image 62.
In these cases too, a correct instruction image can be projected by deforming the instruction image 62 so as to cancel the distortion (deforming it inversely to the distortion), based on the image of the marker 60 in the reference captured image and the image of the marker 60 in the current captured image, and then projecting it.
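One way to realize this inverse deformation, sketched below under the assumption that the projector and camera share the same image frame, is to estimate the homography between the marker's reference and current appearances and pre-warp the instruction image with it before projection. This is an illustrative approach, not necessarily the exact computation of the embodiment.

```python
import cv2

def prewarp_instruction(instr_img, ref_marker_corners, cur_marker_corners):
    # Homography mapping the marker as seen in the reference captured image onto
    # the marker as seen in the current captured image (both 4x2 float32 arrays).
    H, _ = cv2.findHomography(ref_marker_corners, cur_marker_corners)
    # Warping the instruction image forward by H places each drawn pixel over the
    # surface point it was drawn on at reference time, so the projection appears
    # undistorted despite the changed tilt of the surface.
    h, w = instr_img.shape[:2]
    return cv2.warpPerspective(instr_img, H, (w, h))
```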
To summarize the above processing: the smartphone 200 calculates the distance and direction between the camera 82 and the marker 60 based on the image of the marker 60 in the reference captured image. It also calculates the distance and direction to the marker 60 attached to the chimney 52 based on the current captured image acquired in step S33. Based on a comparison between the distance and direction to the marker 60 at the time of the reference captured image and the distance and direction to the marker 60 in the current captured image, the smartphone 200 deforms the instruction image 62 and controls the position at which it is projected.
In this way, the instruction image intended by the instructor is projected and displayed on the object 52 at the site.
Note that, to display the instruction image 62 correctly, the marker 60 is preferably attached to the plane on which the instruction image 62 is to be displayed.
If the surface to which the marker 60 is attached differs from the surface on which the instruction image 62 is to be displayed (for example, when there is a step between them), it is preferable to additionally estimate the exact position using image feature points, for example with SLAM.
That is, the smartphone 200 analyzes the captured image and calculates feature points near the object (near the marker 60), such as points on object boundaries. By comparing the feature points in the reference captured image with those in the current captured image, the positional relationship between the marker 60 and the surface on which the instruction image 62 is to be displayed is determined.
In this way, the instruction image 62 can be projected correctly even when the surface to which the marker 60 is attached differs from the surface on which the instruction image 62 is to be displayed.
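A minimal sketch of the feature-point comparison mentioned above is given below: ORB keypoints are matched between the reference and current captured images, and the resulting correspondences could then feed the same position estimation as the marker. The parameter values are illustrative assumptions.

```python
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_feature_points(ref_gray, cur_gray):
    kp_ref, des_ref = orb.detectAndCompute(ref_gray, None)
    kp_cur, des_cur = orb.detectAndCompute(cur_gray, None)
    if des_ref is None or des_cur is None:
        return []
    matches = sorted(matcher.match(des_ref, des_cur), key=lambda m: m.distance)
    # Corresponding pixel positions in the two images, best matches first
    return [(kp_ref[m.queryIdx].pt, kp_cur[m.trainIdx].pt) for m in matches[:50]]
```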
1.4 Variations
(1) In the above embodiment, the field device 10 is worn by the on-site worker 54.
However, it may instead be mounted on a moving body (or a fixed body) near the on-site worker, such as a bicycle or car that the on-site worker is riding, or on a moving body (or fixed body) located away from the on-site worker.
Alternatively, the field device 10 may be mounted on a robot rather than on the on-site worker 54. For example, an instructor at a remote location can display text or images using the on-site instruction device 11 mounted on the robot and thereby communicate with people around the robot. The same applies to the following embodiments.
(2) In the above embodiment, the instruction image 62 is always projected onto the object 52 by the laser projector 84. However, when a person is in the projection direction, irradiation by the laser projector 84 may be suspended.
This can be realized by having the smartphone 200 determine whether a person appears in the captured image (for example, using a trained AI such as YOLO) and stopping irradiation by the laser projector 84 when a person is judged to be present. Irradiation is resumed once no person is detected.
When a person is detected, instead of stopping laser irradiation entirely, irradiation may be stopped only in the region of the recognized person (a rectangular region in the case of YOLO) while continuing in the other regions.
Furthermore, a person's eyes may be detected and irradiation stopped only in the eye region (and its surroundings).
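As a rough sketch of variation (2), assuming the ultralytics YOLO package and a pretrained detector, the code below builds a mask of the projection frame that is cleared inside every detected person box; the caller would blank the corresponding part of the projected image (or stop the laser entirely if any person is found). The package, model file, and function names are assumptions for illustration.

```python
import numpy as np
from ultralytics import YOLO

model = YOLO("yolov8n.pt")              # pretrained detector; class 0 is "person"

def person_safe_mask(frame):
    h, w = frame.shape[:2]
    mask = np.ones((h, w), dtype=np.uint8)       # 1 = projection allowed
    for box in model(frame, verbose=False)[0].boxes:
        if int(box.cls) == 0:                    # a person was detected
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            mask[max(0, y1):y2, max(0, x1):x2] = 0
    return mask                                   # multiply into the projected image
```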
(3) In the above embodiment, the laser projector 84 is used as the projection unit, but an ordinary projector may be used instead.
(4) In the above embodiment, a triaxial structure 90 (gimbal) is used, but a one-axis or two-axis structure (gimbal), a structure with four or more axes, or the like may be used instead.
(5) In the above embodiment, the on-site worker 54 attaches the marker 60 to the object 52. However, the marker 60 may be placed on the object 52 at the site in advance.
(6) In the above embodiment, the marker 60 is used to determine the distance and direction to the camera 82. However, this may instead be determined, and the same processing performed, using only the feature points of the captured image, for example with SLAM, without using the marker 60.
In this case, the smartphone 200 recognizes feature points 512 (such as vertices that characterize the image), transmits them to the instruction device 30, and displays them on the display 306 as shown in FIG. 18. Looking at this image, the instructor operates the mouse 316 and selects the feature points 512 to be used for position identification. As the feature points 512 used for position identification, it is preferable to select feature points 512 related to the object 52 (feature points on the object).
When the instruction input mode button 501 is clicked, information on the selected feature points 512 is transmitted to the smartphone 200. The smartphone 200 can then identify positions and directions based on these feature points 512.
(7) In the above embodiment, the instructor checks the screen of FIG. 13 on the instruction device 30 and clicks the instruction image transmission button 502. However, the instruction device 30 or the smartphone 200 may detect that the marker 60 in the captured image has entered a predetermined area of the captured image (for example, a predetermined central area) and automatically enter the instruction input mode. The same applies when processing is performed using the feature points 512 instead of the marker 60.
(8) In the above embodiment, the triaxial structure 90 is controlled by the motor control circuit 400, and control of the projection position and the like based on image processing is performed by the smartphone 200.
However, control of the triaxial structure 90 may also be performed by the smartphone 200. Alternatively, a circuit that controls the triaxial structure 90 and controls the projection position and the like based on image processing may be provided in the rear storage body 59. In that case, the smartphone 200 is used only for calls. The call function may also be provided in the rear storage body 59.
(9) In the above embodiment, the instruction image is deformed by the smartphone 200 so that it is not projected distorted (or at a changed size). However, when the shape of the instruction image is unimportant and what matters is indicating a specific position (for example, indicating a position by the center point of a cross mark), a distorted (or resized) instruction image causes no problem as long as the position is indicated correctly. In such cases, the process of deforming the instruction image may be omitted.
(10) In the above embodiment, in addition to the control of the triaxial structure 90 (FIG. 10), control of the projection position and the like based on image processing by the smartphone 200 (FIG. 11) is performed. However, only the processing by the triaxial structure 90 may be performed, without the image-processing-based control of the projection position and the like by the smartphone 200.
This is possible when the on-site worker 54 moves little or when some deviation in the on-site projection position of the instruction image is acceptable. For example, when the laser projector/camera integrated body 58 worn by the on-site worker 54 displays an instruction image (such as an arrow indicating a direction) on the ground and the instruction device 30 provides route guidance, control by the triaxial structure 90 alone is sufficient.
(11) In the above embodiment, the smartphone 200 performs not only the control corresponding to FIGS. 15B and 15C but also the control corresponding to FIGS. 16 and 17. However, only the control corresponding to FIGS. 15B and 15C may be performed.
(12) In the above embodiment, when a fixing command is given to the instruction device 30, the reference captured image is displayed as a still image and the device enters the mode for inputting an instruction image. However, because the imaging direction is kept fixed by the control of the triaxial structure 90, a substantially fixed image is obtained even if the reference image is simply displayed as a moving image.
In that state, the instructor inputs the instruction image and clicks the instruction image transmission button 502. In response, the instruction device 30 and the field device 10 may take the captured image at that moment as the reference captured image.
(13) In the above embodiment, the instruction image is projected using the field device 10 and the instruction device 30 capable of communicating with the field device 10.
However, when the instruction image to be projected at the site is determined in advance, the instruction image 62 may be recorded in the field device 10, which is then constructed as an on-site instruction device 11.
FIG. 11 shows the functional configuration of the on-site instruction device 11 constructed in this way. The processing of the direction control means 20 is the same as steps S1 and S2 of FIG. 10. The instruction image 62 is recorded in the recording unit 24. When the marker 60 enters a predetermined area of the captured image, the correction means 26 takes the captured image at that moment as the reference captured image. It then corrects the instruction image 62 and controls the projection position based on the markers or feature points of the reference captured image and the current captured image. The instruction image 62 is thus projected onto the object 52.
The on-site instruction device 11 may be worn by the on-site worker 54, or may be mounted on a moving body (or a fixed body) near the on-site worker.
In this embodiment, the instruction image 62 is recorded in the recording unit 24, but it may instead be acquired from outside the on-site instruction device 11 by communication or the like.
FIG. 20 shows a flowchart of the instruction processing. In this example, the on-site instruction device 11 is composed of the motor control circuit 400 and the smartphone 200. The processing of the motor control circuit 400 is the same as in the above embodiment.
The smartphone 200 acquires a captured image (step S71) and determines whether the predetermined marker 60 has been imaged within a predetermined range of the captured image (for example, within a predetermined central area) (step S72).
When the marker 60 is imaged within the predetermined range, the smartphone 200 records the captured image at that moment as the reference captured image (step S73). It then acquires the current captured image, compares it with the reference captured image, corrects the instruction image, and controls the projection position (step S75).
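The check of step S72 could, for example, look like the sketch below, which accepts the marker as the trigger for recording the reference captured image only when its center lies in the middle third of the frame; the ArUco dictionary and the choice of region are assumptions for illustration.

```python
import cv2
import numpy as np

ARUCO = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_in_central_region(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, ARUCO)
    if ids is None:
        return False
    cx, cy = np.asarray(corners[0][0]).mean(axis=0)   # marker center in pixels
    h, w = gray.shape
    return (w / 3 <= cx <= 2 * w / 3) and (h / 3 <= cy <= 2 * h / 3)
```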
(14) The above embodiment describes the case where work on a chimney at a site is instructed remotely. However, the system can be used for instruction at various sites, such as factories, roads, plazas, buildings, and stadiums.
The system can also be used when asking an on-site worker to do shopping from a remote location, by projecting an instruction image onto the item to be bought.
An instruction image such as an arrow can also be projected onto the road in front of the on-site worker to guide the worker along a route.
In virtual tourism and the like, it also becomes easy to give instructions to the on-site worker remotely. By moving the imaging direction of the camera, the remote instructor can look around regardless of the orientation of the on-site worker and can project the necessary instructions as an instruction image.
(15) In the above embodiment, a still image is used as the instruction image. However, a moving image may be used instead. In that case, the field device preferably plays the moving image repeatedly.
(16) In the above embodiment, the field device is composed of the smartphone, the laser projector/camera integrated body 58, and the rear storage body 59. However, these may be constructed as a single unit. Also, if the rear storage body 59 (motor control circuit) is given the functions performed by the smartphone, the smartphone may be omitted. For example, a dedicated device, a PC, a tablet, a stick PC, or the like may be used.
(17) In the above embodiment, as shown in FIG. 13, the direction of the displayed captured image is changed by operating the direction change button 500. However, the direction of the displayed captured image may instead be changed by dragging the screen (moving the cursor while holding down the mouse button).
(18) The above variations can be applied in combination with one another as long as this does not contradict their essence. They can also be applied in combination with the other embodiments and their variations.
2. Second Embodiment
2.1 Functional Configuration
FIG. 21 shows the functional configuration of the remote instruction system according to the second embodiment. This system comprises a field device 10 used by the on-site worker and an instruction device 30 used by a remote instructor.
The helmet worn by the on-site worker carries the imaging unit 12 and the projection unit 14 via the drive unit 16. The imaging area of the imaging unit 12 and the projection area of the projection unit 14 are arranged to be substantially the same.
The functions of the imaging unit 12, the projection unit 14, and the drive unit 16 are the same as in the first embodiment. The tracking control means 21 performs the same function as the direction control means 20 of the first embodiment, maintaining the orientation of the imaging unit 12 and the projection unit 14 in a predetermined direction.
The image captured by the field device 10 is transmitted to the instruction device 30 by the transmitting unit 22 under the control of the captured image transmitting means 18. The captured image receiving means 36 of the instruction device 30 receives the captured image through the receiving unit 32, and the captured image display unit 40 displays it. This allows the instructor to see an image of the site space.
When giving an instruction, the instructor inputs a fixing command. When the fixing command is input, the captured image display unit 40 takes the captured image at that moment as the reference captured image and displays it as a still image. Since the captured image is displayed in a substantially fixed state by the direction control means 20, the captured image may instead be displayed as it is.
The instructor inputs an instruction image from the instruction image input unit 44 while viewing the reference captured image of the site displayed on the captured image display unit 40. The instruction image transmitting means 38 transmits the input instruction image to the field device 10 via the transmitting unit 34.
The field device 10 receives the instruction image through the receiving unit 24 and projects it from the projection unit 14. In addition to the direction control described above, the tracking control means 21 of the field device 10 controls the drive unit 16 so that the imaging unit 12 tracks and images the characteristic partial image (such as a marker).
In this way, the instruction image 62 can be projected onto the object 52.
2.2 Appearance and Hardware Configuration
The appearance and hardware configuration are the same as in the first embodiment.
2.3 Remote Instruction Processing
The direction fixing control that fixes the imaging direction of the camera to a predetermined direction is the same as in FIG. 10 of the first embodiment. FIG. 22 shows a flowchart of the instruction image display control after the marker has been placed.
Steps S31 to S33 and steps S51 to S54 are the same as in the first embodiment. In step S35, the direction fixing control shown in FIG. 10 is released, and control that tracks the marker is performed thereafter.
The smartphone 200 compares the marker 60 in the reference captured image with the marker 60 in the current captured image and calculates how the motors 92, 94, and 96 should be controlled to change the orientation of the unit 80 so that the position of the marker 60 in the current captured image matches the position of the marker 60 in the reference captured image (step S35). This control signal is calculated so that the tilt of the marker 60 in the image (see FIG. 17) also matches that in the reference captured image.
The calculated control signal is transmitted to the motor control circuit 400, which controls the motors 92, 94, and 96 (step S5). In this way, the imaging direction of the camera 82 (the irradiation direction of the laser projector 84) is controlled so as to track the marker 60.
Therefore, in the case shown in FIG. 15A, the imaging direction is changed in the direction of the arrow as shown in FIG. 23, so that the marker 60 is always imaged at the predetermined position. Because of the slight change in angle, the marker 60 is not displayed perfectly (it is slightly distorted), but the system can still be used as long as this is within a range acceptable for the intended use.
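A minimal sketch of the step S35 calculation is shown below: the offset of the marker center between the reference and current captured images is converted into pan and tilt corrections using the camera focal length, and these corrections would then be passed to the motor control circuit 400. The focal length value and the sign conventions are assumptions for illustration.

```python
import numpy as np

def pan_tilt_correction(ref_marker_center, cur_marker_center, focal_length_px=800.0):
    # Pixel offset of the marker between the reference and current images
    dx, dy = np.asarray(cur_marker_center) - np.asarray(ref_marker_center)
    pan = np.arctan2(dx, focal_length_px)      # turn toward the marker horizontally
    tilt = np.arctan2(dy, focal_length_px)     # turn toward the marker vertically
    return np.degrees(pan), np.degrees(tilt)   # angle commands for the gimbal motors
```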
Even in a case like that of FIG. 17, the tilt is adjusted and the marker 60 is displayed at the correct position.
In the cases shown in FIGS. 15 and 16, the marker 60 is projected at a somewhat different size or with some distortion, but this too is acceptable as long as it is within a range allowed by the intended use.
2.4 Variations
(1) In the above embodiment, in the instruction image display control of FIG. 22, the triaxial structure 90 is controlled so as to track and image the marker 60. In addition to this, the instruction image may be corrected and the projection position controlled by the image processing shown in step S34 of the first embodiment.
(2) In the above embodiment, the instruction image is projected using the field device 10 and the instruction device 30 capable of communicating with the field device 10.
However, when the instruction image to be projected at the site is determined in advance, the instruction image 62 may be recorded in the field device 10, which is then constructed as an on-site instruction device 11.
FIG. 24 shows the functional configuration of the on-site instruction device 11 constructed in this way. The tracking control means 21 initially performs the processing of steps S1 and S2 of FIG. 10. The instruction image 62 is recorded in the recording unit 24. Based on the marker 60 or feature points of the reference captured image and the current captured image, the tracking control means 21 controls the drive unit 16 so that the marker 60 is tracked and imaged at a predetermined position in the captured image. The instruction image 62 is thus projected onto the object 52.
The on-site instruction device 11 may be worn by the on-site worker 54, or may be mounted on a moving body (or a fixed body) near the on-site worker.
In this embodiment, the instruction image 62 is recorded in the recording unit 24, but it may instead be acquired from outside the on-site instruction device 11 by communication or the like.
FIG. 25 shows a flowchart of the instruction processing. In this example, the on-site instruction device 11 is composed of the motor control circuit 400 and the smartphone 200. The processing of the motor control circuit 400 is the same as in the above embodiment.
The smartphone 200 acquires a captured image (step S71) and determines whether the predetermined marker 60 has been imaged within a predetermined range of the captured image (for example, within a predetermined central area) (step S72).
When the marker 60 is imaged within the predetermined range, the smartphone 200 records the captured image at that moment as the reference captured image (step S73). It then acquires the current captured image, compares it with the reference captured image, and generates a motor control signal for tracking and imaging the marker 60 (step S76). This motor control signal is supplied to the motor control circuit 400.
In response, the motor control circuit 400 controls the motors 92, 94, and 96 so that the triaxial structure 90 tracks the marker 60 for imaging and projection.
(3) The above variations can be applied in combination with one another as long as this does not contradict their essence. They can also be applied in combination with the other embodiments and their variations.
3. Third Embodiment
3.1 Functional Configuration
In the first and second embodiments, the imaging unit 12 and the projection unit 14 are held as a single unit by the drive unit 16.
In this third embodiment, the projection unit 14 is held by the triaxial structure 90, but the imaging unit 12 is capable of wide-angle imaging and is fixedly mounted.
FIG. 26 shows the functional configuration of the remote instruction system according to this embodiment. This system comprises a field device 10 used by the on-site worker and an instruction device 30 used by a remote instructor.
The on-site worker wears the projection unit 14 via the drive unit 16. The imaging unit 12 is fixedly mounted. In this embodiment, a 360-degree camera is used as the imaging unit 12 so that the entire surroundings can be imaged.
The projection unit 14 is configured so that its projection direction can be changed by the drive unit 16. The projection direction of the projection unit 14 is detected by the sensor 28. The direction control means 20 controls the drive unit 16 based on the output of the sensor 28 and maintains the direction of the projection unit 14 in a predetermined direction centered on the on-site worker, regardless of the worker's movements.
The imaging unit 12 of the field device 10 images the site space including the object 52 and generates a captured image. As described above, the imaging unit 12 images a wide angle (for example, the full 360 degrees), so a captured image including the site space is obtained even if the on-site worker moves or changes direction.
This captured image is transmitted to the instruction device 30 by the transmitting unit 22 under the control of the captured image transmitting means 18. The captured image receiving means 36 of the instruction device 30 receives the captured image through the receiving unit 32, and the captured image display unit 40 displays it. Since the captured image covers the full 360 degrees, it is not displayed all at once but only in part. By operating the controls to change the direction up, down, left, and right, the instructor can look around the on-site worker and display the captured image in the direction in which the object in the site space appears.
When giving an instruction, the instructor inputs a fixing command while the captured image including the object is displayed. When the fixing command is input, the captured image display unit 40 takes the partial captured image in that direction as the reference captured image and displays it as a still image. The direction at the time the fixing command was given is transmitted to the field device 10.
The instructor inputs an instruction image from the instruction image input unit 44 while viewing the reference captured image of the site displayed on the captured image display unit 40. The instruction image transmitting means 38 transmits the input instruction image to the field device 10 via the transmitting unit 34.
The field device 10 receives the instruction image through the receiving unit 24, controls the drive unit 16 so as to track the characteristic partial image (such as image feature points) shown in the reference captured image at the time the fixing command was given, and projects the instruction image from the projection unit 14. The instruction image 62 is thus projected onto the object 52.
Since the projection direction of the projection unit 14 is controlled so as to track the characteristic partial image, the instruction image is projected at the location intended by the instructor even if the on-site worker turns his or her face. If the on-site worker moves to a different place, however, the projection position of the instruction image would shift.
The correction means 26 of the field device 10 therefore compares the characteristic partial image (such as image feature points) in the reference captured image with the characteristic partial image in the current captured image and corrects the projection position so that the instruction image is projected correctly at the intended position. The instruction image is thus displayed at the correct position even if the on-site worker moves.
The on-site worker can confirm the position to be worked on based on the instruction image 62 actually projected onto the site space. This instruction image 62 is displayed at the correct position by the direction control means 20 even if the worker turns his or her head, so the instruction image 62 does not disappear depending on the direction of the worker's head and the work is easy to perform. When several people are working together, the instruction image 62 continues to be projected even if the worker wearing the field device 10 moves his or her head significantly, so the work of the other workers is not interrupted. Furthermore, the instruction image 62 is displayed correctly even if the worker moves.
3.2 Appearance and Hardware Configuration
FIG. 27 shows the appearance of the laser projector/camera integrated body 57. No camera is provided in the unit 80; only the laser projector 84 is provided. Instead, a 360-degree camera 81 is provided as the camera and is fixed to the member 93 of the clip.
In this embodiment, the field device 10 is composed of a smartphone (not shown), the laser projector/camera integrated body 57, and the rear storage body 59. The on-site worker 54 wears a helmet 56, and in this embodiment the laser projector/camera integrated body 57 is held at the top of the helmet 56. A rear storage body 59 housing a battery and a motor control circuit is provided at the rear end of the helmet 56.
The laser projector/camera integrated body 57 and the rear storage body 59 are connected by signal and power lines (not shown). The on-site worker 54 also carries a smartphone (not shown) in a breast pocket. The smartphone, the laser projector/camera integrated body 57, and the rear storage body 59 are connected by signal and power lines (not shown).
The system configuration and hardware configuration are the same as in the first embodiment.
3.3 Remote Instruction Processing
FIGS. 28 and 29 show flowcharts of the motor control program 32 of the motor control circuit 400, the image control program 224 of the smartphone 200, and the instruction program 322 of the instruction device 30. FIG. 28 is a flowchart of the direction change control, and FIG. 29 is a flowchart of the instruction image display control.
When the on-site worker 54 arrives at the site and comes in front of the chimney 52, the object, he or she attaches the laser projector housing 57 to the top of the helmet 58 and turns on the power. A fitting for this attachment is preferably provided at the top of the helmet 58.
The smartphone 200 acquires the image captured by the 360-degree camera 81 via a signal line (or short-range communication) and transmits it to the instruction device 30 via the Internet (step S21). At this time, the smartphone 200 also acquires the projection direction of the laser projector 84 (the direction relative to the helmet 58, that is, the direction on the helmet 58) and transmits it to the instruction device 30.
The instruction device 30 records the received captured image in memory and displays it on the display 306 (step S41). Since the captured image is an omnidirectional image from the 360-degree camera, the instruction device 30 displays only the partial image of the received captured image that matches the projection direction of the laser projector 84. In this embodiment, control is performed so that the region of the partial image displayed on the instruction device 30 coincides with the projection region of the laser projector 84.
As shown in FIG. 12, the instruction device 30 displays a direction change button 500 on the display 306. When the instructor wants to display a partial captured image in a different direction, he or she clicks on the circumference of the direction change button 500 with the mouse 316. When the instruction device 30 detects the click, it displays the partial captured image in the corresponding direction (up, down, left, or right) (step S44). At this time, that direction is given from the instruction device 30 to the field device 10 via the smartphone 200, and the orientation of the laser projector 84 is controlled to match it. The region of the partial image displayed on the instruction device 30 and the projection region of the laser projector 84 are thus kept coincident, and the instructor can obtain an image in the desired direction at the site.
As described above, by the processing shown in FIG. 10, the partial captured image in the direction selected by the instructor is displayed as a moving image on the display 306 of the instruction device 30. While viewing this partial captured image, with the chimney 52, the object, displayed, the instructor clicks the instruction input mode button 501.
In response, when the fixing command is given by clicking the instruction input mode button 501, the instruction device 30 takes the partial captured image at that moment as the reference captured image and displays it on the display 306 as a still image (step S52). The instructor uses the mouse 316 to input an instruction to the on-site worker as an instruction image on this still image (step S53). For example, as shown in FIG. 13, on the image of the chimney 52 displayed on the display 306, the instructor draws and inputs with the mouse 316 an instruction image 62 (in this example, a cross image) indicating the position where a hole should be drilled.
The instruction device 30 also transmits the fixing command and the direction at the time the fixing command was given to the smartphone 200 (step S51). On receiving the fixing command, the smartphone 200 determines the reference captured image based on the captured image and the direction at the time of reception, and records it in the nonvolatile memory 212 (step S32). After receiving the fixing command, the smartphone 200 no longer transmits the entire captured image to the instruction device 30 but transmits only the partial captured image in the fixed direction.
As described above, the instruction device 30 and the smartphone 200 can recognize the partial captured image of the same moment as the reference captured image. To avoid discrepancies caused by communication time lag, information identifying the frame (such as a frame number) may be attached when the fixing command is transmitted from the instruction device 30 to the smartphone 200. By having the smartphone 200 determine the reference captured image based on this frame-identifying information, misalignment due to the time lag can be prevented.
When the instructor has finished entering the instruction image, he or she clicks the instruction image transmission button 502 (displayed while the instruction input mode is active) shown at the lower right of the reference captured image on the display 306. The instruction device 30 then transmits the instruction image to the smartphone 200 (step S53).
The instruction device 30 also cancels the instruction input mode, stops displaying the still image serving as the reference captured image, and displays the partial captured image of the incoming captured images as a moving image (step S54). This allows the instructor to see the situation at the site again.
Upon receiving the instruction image data of FIG. 14A, the smartphone 200 holds it in the memory 204. The smartphone 200 then acquires the current captured image from the camera 82 and extracts the partial captured image based on the received fixed direction (step S33).
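A minimal sketch of this extraction step, assuming the 360-degree frame is delivered in equirectangular form and that the fixed direction is given as yaw and pitch angles, is shown below. It renders a pinhole-style partial view in that direction; the field of view, output size, and rotation convention are illustrative assumptions.

```python
import cv2
import numpy as np

def extract_partial_view(equi, yaw, pitch, fov_deg=60.0, out_w=640, out_h=480):
    H, W = equi.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2.0)   # pinhole focal length (px)
    # Rays through each output pixel (x right, y down, z forward)
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                         np.arange(out_h) - out_h / 2.0)
    d = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    # Rotate the rays by pitch (about x) and then yaw (about y)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = d @ (Ry @ Rx).T
    # Spherical angles of each ray -> equirectangular pixel coordinates
    lon = np.arctan2(d[..., 0], d[..., 2])
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))
    map_x = ((lon / (2 * np.pi) + 0.5) * W).astype(np.float32)
    map_y = ((lat / np.pi + 0.5) * H).astype(np.float32)
    return cv2.remap(equi, map_x, map_y, cv2.INTER_LINEAR)
```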
 スマートフォン200は、参照撮像画像の画像特徴点と、現在の部分撮像画像の画像特徴点とを比較して、画像特徴点を追従するように三軸構造体90のモータ92、94、96を制御するための信号を生成するとともに、レーザプロジェクタ84から指示画像を投影する(ステップS35)。 The smartphone 200 compares the image feature points of the reference captured image and the image feature points of the current partial captured image, and controls the motors 92, 94, and 96 of the triaxial structure 90 to follow the image feature points. At the same time, an instruction image is projected from the laser projector 84 (step S35).
 生成された信号は、モータ制御回路400に伝送され、モータ92、94、96が制御される。これにより、現場担当者が動いたとしても、レーザプロジェクタ84によって正しい位置に指示画像が投影されることになる。
 
The generated signal is transmitted to motor control circuit 400, and motors 92, 94, and 96 are controlled. Thereby, even if the person in charge of the site moves, the instruction image will be projected at the correct position by the laser projector 84.
3.4変形例
(1)上記実施形態では、ユニット80(レーザプロジェクタ84)の方向を三軸構造体90によって制御するようにしている。しかし、これに加えて、第1の実施形態と同様の処理により、部分特徴画像(特徴点やマーカ)に基づいて、三軸構造体90による制御とは別に、指示画像を変形したり投影位置を制御したりしてもよい。これにより、より正しい位置に指示画像を投影することができる。
3.4 Variations
(1) In the above embodiment, the direction of the unit 80 (laser projector 84) is controlled by the triaxial structure 90. However, in addition to this, the instruction image can be deformed and the projection position can be deformed and the projection position can be changed, independently of the control by the three-axis structure 90, based on the partial feature images (feature points and markers) by the same processing as in the first embodiment. may also be controlled. This allows the instruction image to be projected at a more correct position.
(2)上記実施形態では、全方向を撮像する360度カメラ81を用いている。しかし、水平方向に360度(所定度でもよい)撮像するカメラなどを用いてもよい。 (2) In the above embodiment, a 360-degree camera 81 that captures images in all directions is used. However, a camera that captures images at 360 degrees (or a predetermined degree) in the horizontal direction may also be used.
(3)上記実施形態では、ステップS21において、360度カメラ81の撮像画像を全て指示装置30に送信するようにしている
 しかし、撮像画像のうち、ユニット80の方向に対応する方向の部分撮像画像のみを指示装置30に送信するようにしてもよい。このようにすれば、実質的に第1の実施形態、第2の実施形態と同様の処理を行うことができる。
(3) In the above embodiment, in step S21, all of the captured images from the 360-degree camera 81 are transmitted to the instruction device 30. However, only the partial captured image in the direction corresponding to the direction of the unit 80 may be transmitted to the instruction device 30. In this way, substantially the same processing as in the first and second embodiments can be performed.
 このような処理を行う場合のフローチャートを図30、図31に示す。 Flowcharts for performing such processing are shown in FIGS. 30 and 31.
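 The bandwidth-saving variation described in (3) above could, assuming the 360-degree image is an equirectangular frame and the unit direction is available as a yaw angle (both assumptions for illustration), be sketched as follows:

```python
import numpy as np

def crop_partial_image(equirect_frame, unit_yaw_deg, width_deg=90.0):
    """Cut out the horizontal band of the equirectangular frame centred on
    the direction the unit 80 is facing, wrapping around the 360-degree seam,
    so that only this portion needs to be transmitted to the instruction device."""
    h, w = equirect_frame.shape[:2]
    center = int((unit_yaw_deg % 360.0) / 360.0 * w)
    half = int(width_deg / 360.0 * w / 2)
    cols = np.arange(center - half, center + half) % w  # wrap-around column indices
    return equirect_frame[:, cols]
```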
(4)上記実施形態では、部分特徴画像として画像の特徴点を用いているが、マーカなどを用いるようにしてもよい。 (4) In the above embodiment, feature points of the image are used as partial feature images, but markers or the like may also be used.
(5)上記実施形態では、現場装置10、現場装置10と通信可能な指示装置30によって指示画像を投影するようにしている。 (5) In the above embodiment, the instruction image is projected by the on-site device 10 and the instruction device 30 that can communicate with the on-site device 10.
 しかし、現場において投影する指示画像が予め定まっている場合には、現場装置10に指示画像62を記録しておき、現場指示装置11として構築するようにしてもよい。 However, if the instruction image to be projected at the site is determined in advance, the instruction image 62 may be recorded in the site device 10, which can then be configured as a site instruction device 11.
 このようにして構築した現場指示装置11の機能構成を図32に示す。記録部24には指示画像62が記録されている。追従制御手段21は、参照撮像画像と現在の撮像画像のマーカ60または特徴点に基づいて、駆動部16を制御し、マーカ60に追従して指示画像が投影されるようにする。したがって、対象物52に指示画像62が投影される。 The functional configuration of the site instruction device 11 constructed in this way is shown in FIG. 32. An instruction image 62 is recorded in the recording section 24. The tracking control means 21 controls the drive unit 16 based on the markers 60 or feature points of the reference captured image and the current captured image, so that the instruction image is projected while following the markers 60. Therefore, the instruction image 62 is projected onto the target object 52.
 現場指示装置11は、現場担当者54が装着していても良く、現場担当者近傍の移動体(固定体でもよい)に装着されていても良い。 The on-site instruction device 11 may be worn by the on-site person in charge 54, or may be attached to a moving body (or a fixed body) near the on-site person in charge.
 この実施形態では、記録部24に指示画像62が記録されているが、現場指示装置11の外部から通信などによって取得するようにしてもよい。 In this embodiment, the instruction image 62 is recorded in the recording unit 24, but it may be acquired from outside the on-site instruction device 11 through communication or the like.
 図33に、指示処理のフローチャートを示す。この例では、現場指示装置11を、モータ制御回路400とスマートフォン200によって構成している。モータ制御回路400の処理は、上述の実施形態と同様である。 FIG. 33 shows a flowchart of instruction processing. In this example, the on-site instruction device 11 is configured by a motor control circuit 400 and a smartphone 200. The processing of the motor control circuit 400 is similar to the embodiment described above.
 スマートフォン200は、撮像画像を取得し(ステップS71)、予め定められたマーカ60が撮像されたかどうかを判定する(ステップS72)。 The smartphone 200 acquires a captured image (step S71), and determines whether a predetermined marker 60 has been captured (step S72).
 マーカ60が撮像されると、スマートフォン200は、その時の撮像画像のうち、マーカ60が撮像された部分撮像画像を参照撮像画像として記録する(ステップS77)。この際、マーカ60が所定領域(たとえば中央部分)になるように、部分撮像画像の方向を選択する。また、併せてこの方向も記録する。 When the marker 60 is imaged, the smartphone 200 records the partial image in which the marker 60 is imaged among the images captured at that time as a reference image (step S77). At this time, the direction of the partial captured image is selected so that the marker 60 is in a predetermined area (for example, in the center). Also, record this direction as well.
 続いて、現在の撮像画像のうち上記の方向の部分撮像画像を取得し、参照撮像画像と比較して、マーカ60を追従して撮像するためのモータ制御信号を生成する(ステップS78)。このモータ制御信号は、モータ制御回路400に与えられる。 Subsequently, a partial captured image in the above direction of the current captured image is acquired and compared with the reference captured image to generate a motor control signal for tracking and capturing the marker 60 (step S78). This motor control signal is given to motor control circuit 400.
 モータ制御回路400は、これを受けて、モータ92、94、96を制御し、マーカ60を追従して撮像するように三軸構造体90を制御する。 In response to this, the motor control circuit 400 controls the motors 92, 94, and 96, and controls the triaxial structure 90 to follow and image the marker 60.
 なお、マーカ60を追従する制御については、モータ制御回路において行うようにしてもよい。 Note that the control for tracking the marker 60 may be performed by the motor control circuit.
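 The loop of steps S71 through S78 for the stand-alone on-site instruction device might look like the sketch below (Python with OpenCV; the use of ArUco markers, the OpenCV 4.7+ ArucoDetector API, and the field-of-view values are assumptions, since the publication only requires that a predetermined marker 60 be detected and followed).

```python
import cv2

# Assumed marker type: a 4x4 ArUco marker (requires OpenCV >= 4.7 with the aruco module).
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

def marker_center(image_bgr):
    """Return the (x, y) centre of the first detected marker, or None (step S72)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None or len(corners) == 0:
        return None
    return corners[0][0].mean(axis=0)

def tracking_step(current_partial_image, reference_center,
                  hfov_deg=60.0, vfov_deg=40.0):
    """Compare the marker position in the current partial image with its
    position in the reference captured image and return (pan_deg, tilt_deg)
    corrections for the motor control circuit 400 (step S78)."""
    center = marker_center(current_partial_image)
    if center is None:
        return 0.0, 0.0            # marker lost: hold the current direction
    h, w = current_partial_image.shape[:2]
    dx, dy = center - reference_center
    pan_deg = -dx / w * hfov_deg   # sign conventions are assumptions
    tilt_deg = dy / h * vfov_deg
    return pan_deg, tilt_deg
```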
(6)上記の各変形例は、その本質に反しない限り互いに組み合わせて適用することができる。また、他の実施形態、その変形例とも組み合わせて適用することが可能である。
 
(6) The above modifications can be applied in combination with one another as long as doing so does not contradict their essence. They can also be applied in combination with the other embodiments and their modifications.
4.第4の実施形態
4.1機能的構成
 第3の実施形態では、投影部14は三軸構造体90によって保持し、広角の撮像部12は固定して設けられている。この実施形態では、撮像部12だけでなく投影部14も広角としている。
4. Fourth embodiment
4.1 Functional Configuration In the third embodiment, the projection section 14 is held by a triaxial structure 90, and the wide-angle imaging section 12 is fixedly provided. In this embodiment, not only the imaging section 12 but also the projection section 14 has a wide angle.
 図34に、この発明の一実施形態による遠隔指示システムの機能構成を示す。このシステムは、現場担当者が使用する現場装置10と遠隔での指示者が使用する指示装置30を備えている。 FIG. 34 shows the functional configuration of a remote instruction system according to an embodiment of the present invention. This system includes an on-site device 10 used by a person in charge at the site and an instruction device 30 used by a remote instructor.
 現場担当者は、広角の投影部14、広角の撮像部12を装着している。なお、この実施形態では、広角の撮像部12として360度カメラを用い、全周を撮像できるようにしている。また、広角の投影部14として360度全方向に投影可能なレーザプロジェクタを用いて、全周に投影できるようにしている。 The person in charge at the site is wearing a wide-angle projection unit 14 and a wide-angle imaging unit 12. Note that in this embodiment, a 360-degree camera is used as the wide-angle imaging unit 12, so that images can be taken all around. Furthermore, a laser projector capable of projecting in all directions of 360 degrees is used as the wide-angle projection unit 14, so that projection can be performed all around the circumference.
 現場装置10の撮像部12は、対象物52を含む現場空間を撮像して撮像画像を生成する。上記のように、撮像部12は360度の全周方向を撮像するので、現場担当者が移動したり向きを変えたりしても、現場空間を含む撮像画像を得ることができる。 The imaging unit 12 of the field device 10 images the field space including the object 52 and generates a captured image. As described above, since the imaging unit 12 captures images in all directions of 360 degrees, even if the person in charge of the site moves or changes direction, a captured image including the site space can be obtained.
 この撮像画像は、撮像画像送信手段18の制御により、送信部22によって、指示装置30に送信される。指示装置30の撮像画像受信手段36は、受信部32により撮像画像を受信する。撮像画像表示部40は、受信した撮像画像を表示する。なお、撮像画像は360度の全周方向が撮像された画像であるので、全てを一度に表示せず、部分的に表示(部分撮像画像)を行うようにしている。指示者は、上下左右に方向を変更する操作を行うことにより、現場作業者の周囲を見渡すことができる。これにより、現場空間にある対象物が映し出された方向の部分撮像画像を表示することができる。 This captured image is transmitted to the instruction device 30 by the transmitter 22 under the control of the captured image transmitter 18. The captured image receiving means 36 of the instruction device 30 receives the captured image by the receiving unit 32. The captured image display section 40 displays the received captured image. Note that since the captured image is an image captured in the entire circumferential direction of 360 degrees, it is not displayed all at once, but is partially displayed (partial captured image). The instructor can look around the site worker by changing the direction up, down, left, and right. Thereby, a partial image taken in the direction in which the object in the scene space is projected can be displayed.
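 Extracting the partial captured image for a chosen viewing direction from the 360-degree frame could be done as in the following sketch (Python with OpenCV/NumPy; the equirectangular layout and the axis conventions are assumptions). The same routine can serve the direction-change operation, simply by re-running it with an updated yaw and pitch.

```python
import cv2
import numpy as np

def extract_partial_view(equirect, yaw_deg, pitch_deg,
                         fov_deg=90.0, out_w=640, out_h=480):
    """Render a rectilinear partial view of an equirectangular 360-degree
    frame, looking toward (yaw_deg, pitch_deg)."""
    h_e, w_e = equirect.shape[:2]
    yaw, pitch, fov = np.radians([yaw_deg, pitch_deg, fov_deg])
    f = 0.5 * out_w / np.tan(0.5 * fov)          # virtual focal length in pixels

    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                         np.arange(out_h) - out_h / 2.0)
    rays = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)

    # Rotate the viewing rays by pitch (about x) then yaw (about y).
    cp, sp, cy, sy = np.cos(pitch), np.sin(pitch), np.cos(yaw), np.sin(yaw)
    rot_x = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rot_y = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rays = rays @ (rot_y @ rot_x).T

    lon = np.arctan2(rays[..., 0], rays[..., 2])                   # -pi .. pi
    lat = np.arcsin(rays[..., 1] / np.linalg.norm(rays, axis=-1))  # -pi/2 .. pi/2
    map_x = ((lon / np.pi + 1.0) * 0.5 * w_e).astype(np.float32)
    map_y = ((lat / (np.pi / 2) + 1.0) * 0.5 * h_e).astype(np.float32)
    return cv2.remap(equirect, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_WRAP)
```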
 指示を与える際、指示者は対象物を含む撮像画像が表示された状態で、固定指令を入力する。撮像画像表示部40は、固定指令が入力されるとその方向の部分撮像画像を参照撮像画像とし、静止画として表示する。固定指令が与えられた際の方向は、現場装置10に送信される。 When giving an instruction, the instructor inputs a fixing command while the captured image including the target object is displayed. When the fixed command is input, the captured image display section 40 sets a partial captured image in that direction as a reference captured image and displays it as a still image. The direction when the fixed command is given is transmitted to the field device 10.
 指示者は、撮像画像表示部40に表示された現場の参照撮像画像を見ながら、指示画像入力部44から指示画像を入力する。指示画像送信手段38は、送信部34により、入力された指示画像を現場装置10に送信する。 The instructor inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40. The instruction image transmitting means 38 transmits the input instruction image to the field device 10 by the transmitter 34 .
 現場装置10は、受信部24によって指示画像を受信し、前記固定指令が与えられた際の方向に基づいて、当該方向に向けて指示画像を投影するように、投影部14を制御する。これにより、対象物52上に指示画像62が投影される。 The field device 10 receives the instruction image by the reception unit 24, and controls the projection unit 14 to project the instruction image in the direction based on the direction in which the fixing command was given. As a result, the instruction image 62 is projected onto the target object 52.
 投影部14による指示画像の投影方向は、参照撮像画像の方向と合致されるので、指示画像は指示者の意図する箇所に投影されることになる。この制御だけでも実施可能であるが、とはいえ、現場担当者が場所を移動すると、指示画像の投影位置はずれてしまうことになる。 The direction in which the instruction image is projected by the projection unit 14 matches the direction of the reference captured image, so the instruction image is projected onto the location intended by the instructor. Although it is possible to implement this control alone, if the person in charge of the site moves from place to place, the projected position of the instruction image will shift.
 そこで、現場装置10の補正手段26は、参照撮像画面における特徴部分画像（マーカなど）と現在の撮像画像における特徴部分画像との比較により、指示画像が意図した位置に正しく投影されるように、指示画像を変形したり指示画像の投影位置を補正制御する。これにより、現場担当者が移動したとしても、指示画像が正しい位置に表示される。 Therefore, the correction means 26 of the field device 10 compares the characteristic partial image (such as a marker) in the reference captured image with the characteristic partial image in the current captured image, and deforms the instruction image and corrects its projection position so that the instruction image is correctly projected at the intended position. As a result, the instruction image is displayed at the correct position even if the person in charge at the site moves.
 現場担当者は、現場空間に実際に投影された指示画像62に基づいて作業すべき位置を確認することができる。この指示画像62は、作業者が頭の向きを変えても追従制御手段21、補正手段26によって正しい位置に表示される。したがって、作業者の頭の向きによって指示画像62が消えてしまうことがなく作業がしやすい。また、複数人で作業している場合、現場装置10を身につけている現場担当者が、頭を大きく動かしたとしても指示画像62が投影され続け、他の作業者の作業が中断することがない。さらに、作業者が移動したとしても、指示画像62が正しく表示される。
 
The person in charge at the site can confirm the position to be worked on based on the instruction image 62 actually projected onto the site space. This instruction image 62 is displayed at the correct position by the follow-up control means 21 and the correction means 26 even if the worker changes the direction of his or her head. Therefore, the instruction image 62 does not disappear when the worker's head turns, making the work easier. Furthermore, when multiple people are working, the instruction image 62 continues to be projected even if the person wearing the site device 10 moves his or her head significantly, so the work of the other workers is not interrupted. Moreover, even if the worker moves, the instruction image 62 is displayed correctly.
4.2外観及びハードウエア構成
 この実施形態においては、三軸構造体90は用いず、360度カメラ81、360度レーザ・プロジェクタ83が、ヘルメット56の頂部に固定されている。
4.2 Appearance and Hardware Configuration In this embodiment, the triaxial structure 90 is not used, and a 360 degree camera 81 and a 360 degree laser projector 83 are fixed to the top of the helmet 56.
 指示装置30のハードウエア構成は、第1の実施形態と同様である(図7参照)。また、三軸構造体90を用いないので、これを制御するモータ92、94、96は不要であり、モータ制御回路400も不要である。スマートフォン200のハードウエア構成は、第1の実施形態と同様である(図8参照)。ただし、カメラ82、レーザプロジェクタ94に代えて、360度カメラ81、360度レーザ・プロジェクタ83が接続されている。
 
The hardware configuration of the instruction device 30 is the same as that of the first embodiment (see FIG. 7). Furthermore, since the three-axis structure 90 is not used, the motors 92, 94, and 96 for controlling it are unnecessary, and the motor control circuit 400 is also unnecessary. The hardware configuration of the smartphone 200 is the same as that of the first embodiment (see FIG. 8). However, instead of the camera 82 and laser projector 94, a 360 degree camera 81 and a 360 degree laser projector 83 are connected.
4.3遠隔指示処理
 図35、図36に、スマートフォン200の画像制御プログラム224、指示装置30の指示プログラム322のフローチャートを示す。図35が方向変更制御のフローチャートであり、図36が指示画像表示制御のフローチャートである。
4.3 Remote Instruction Processing FIGS. 35 and 36 show flowcharts of the image control program 224 of the smartphone 200 and the instruction program 322 of the instruction device 30. FIG. 35 is a flowchart of direction change control, and FIG. 36 is a flowchart of instruction image display control.
 現場に到着した現場担当者54は、対象物である煙突52の前まで来るとヘルメット58の頭頂部に、レーザプロジェクタ収納体57を取り付けて電源を入れる。 When the on-site person in charge 54 arrives at the site and gets in front of the target object, the chimney 52, he attaches the laser projector housing 57 to the top of his helmet 58 and turns on the power.
 スマートフォン200は、信号線(または近距離通信)によって360度カメラ81の撮像画像を取得し、インターネットを介して指示装置30に送信する(ステップS21)。 The smartphone 200 acquires an image captured by the 360-degree camera 81 via a signal line (or short-range communication), and transmits it to the instruction device 30 via the Internet (step S21).
 指示装置30は、受信した撮像画像をメモリに記録し、ディスプレイ306に表示する(ステップS41)。撮像画像は360度カメラによる全方向の画像である。したがって、指示装置30は、受信した撮像画像のうち、所定方向の部分画像のみを表示する。 The instruction device 30 records the received captured image in the memory and displays it on the display 306 (step S41). The captured image is an omnidirectional image captured by a 360 degree camera. Therefore, the instruction device 30 displays only a partial image in a predetermined direction of the received captured image.
 図12に示すように、指示装置30は、ディスプレイ306上に方向変更ボタン500を表示する。指示者は、異なる方向の部分撮像画像を表示したいとき、この方向変更ボタン500の円周上をマウス316にてクリックする。指示装置30は、クリックを検知すると、これに対応する方向(上下左右)の部分撮像画像を表示する(ステップS44)。指示者は、このようにして現場における所望の方向の画像を得ることができる。 As shown in FIG. 12, the instruction device 30 displays a direction change button 500 on the display 306. When the instructor wants to display a partial captured image in a different direction, the instructor clicks on the circumference of the direction change button 500 using the mouse 316. When the instruction device 30 detects a click, it displays a partial captured image in a direction corresponding to the click (up, down, left, and right) (step S44). In this way, the instructor can obtain an image in a desired direction at the site.
 上述の処理により、指示者の選択した方向の部分撮像画像は指示装置30のディスプレイ306に動画として表示されている。指示者は、この部分撮像画像を見ながら、対象物である煙突52が表示された状態にて、指示入力モードボタン502をクリックする。 Through the above-described processing, the partial captured image in the direction selected by the instructor is displayed as a moving image on the display 306 of the instruction device 30. The instructor clicks the instruction input mode button 502 while looking at this partial captured image and with the chimney 52, which is the target object, being displayed.
 これを受けて、指示装置30は、指示入力モードボタン501がクリックされることにより固定指令が与えられると、そのときの部分撮像画像を参照撮像画像とし、静止画としてディスプレイ306に表示する(ステップS52)。指示者は、この静止画に対し、マウス316を用いて現場担当者に対する指示を指示画像として入力する(ステップS53)。たとえば、図13に示すように、ディスプレイ306に表示された煙突52の画像において、穴を開けるべき位置を指示画像62(この例では、十字画像)をマウス316にて描画し、入力する。 In response to this, when a fixing command is given by clicking the instruction input mode button 501, the instruction device 30 sets the partial captured image at that time as a reference captured image and displays it on the display 306 as a still image (step S52). The instructor uses the mouse 316 to input an instruction to the on-site person as an instruction image to this still image (step S53). For example, as shown in FIG. 13, in the image of the chimney 52 displayed on the display 306, the position at which a hole should be made is input by drawing an instruction image 62 (in this example, a cross image) using the mouse 316.
 また、指示装置30は、上記固定指令および固定指令が与えられた時の方向を、スマートフォン200に送信する(ステップS51)。固定指令を受けたスマートフォン200は、当該固定指令を受信した際の撮像画像と方向とに基づいて、参照撮像画像を決定し、不揮発性メモリ212に記録する(ステップS32)。なお、固定指令を受けた後、スマートフォン200は、撮像画像を全て指示装置30に送信せず、固定された方向の部分撮像画像のみを指示装置30に送信する。 Further, the instruction device 30 transmits the fixing command and the direction when the fixing command was given to the smartphone 200 (step S51). The smartphone 200 that has received the fixing command determines a reference captured image based on the captured image and direction when receiving the fixing command, and records it in the nonvolatile memory 212 (step S32). Note that after receiving the fixing command, the smartphone 200 does not transmit all captured images to the instruction device 30, but only transmits a partial captured image in the fixed direction to the instruction device 30.
 上記のように、指示装置30とスマートフォン200において、同じ時点の部分撮像画像を参照撮像画像として認識することができる。なお、通信によるタイムラグを避けるため、指示装置30からスマートフォン200に固定指令を送信する際に、フレームを特定する情報(フレーム番号など)を付けて送信するようにしてもよい。スマートフォン200において、このフレームを特定する情報に基づいて参照撮像画像を決定することで、タイムラグによるずれを防止することができる。 As described above, the instruction device 30 and the smartphone 200 can recognize a partial captured image at the same time as a reference captured image. Note that in order to avoid a time lag due to communication, when transmitting the fixing command from the instruction device 30 to the smartphone 200, information for identifying the frame (such as a frame number) may be attached and transmitted. In the smartphone 200, by determining a reference captured image based on information specifying this frame, it is possible to prevent deviations due to time lag.
 指示者は、指示画像を入力し終わると、ディスプレイ306の参照撮像画像の右下に表示されている指示画像送信ボタン502(指示入力モードになると表示される)をクリックする。これにより、指示装置30は、指示画像をスマートフォン200に送信する(ステップS53)。 After the instructor finishes inputting the instruction image, the instructor clicks an instruction image transmission button 502 (displayed when the instruction input mode is entered) displayed at the lower right of the reference captured image on the display 306. Thereby, the instruction device 30 transmits the instruction image to the smartphone 200 (step S53).
 また、指示装置30は、指示入力モードを解除して参照撮像画像としての静止画の表示を止めて、送信されてきた撮像画像の部分撮像画像を動画として表示する(ステップS54)。これにより、指示者は、再び現場の状況を知ることができる。 Furthermore, the instruction device 30 cancels the instruction input mode, stops displaying the still image as the reference captured image, and displays the partial captured image of the transmitted captured image as a moving image (step S54). This allows the instructor to know the situation at the site again.
 スマートフォン200は、図14Aの指示画像データを受信すると、これをメモリ204に保持する。さらに、スマートフォン200は、カメラ82から現在の撮像画像を取得し、前記受信した固定方向に基づいて、部分撮像画像を抽出する(ステップS33)。 Upon receiving the instruction image data of FIG. 14A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image from the camera 82, and extracts a partial captured image based on the received fixed direction (step S33).
 スマートフォン200は、参照撮像画像のマーカと、現在の部分撮像画像のマーカとを比較して、マーカに追従して正しく指示画像が投影されるように、360度レーザプロジェクタによる指示画像の投影方向を制御する(ステップS34)。さらに、上記のマーカの比較により、指示画像が正しく投影されるように指示画像を変形し、その投影位置を制御する。
 
The smartphone 200 compares the marker in the reference captured image with the marker in the current partial captured image, and controls the direction in which the 360-degree laser projector projects the instruction image so that the instruction image follows the marker and is projected correctly (step S34). Furthermore, based on this comparison of the markers, the instruction image is deformed and its projection position is controlled so that it is projected correctly.
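 A hedged sketch of the deformation step is shown below (Python with OpenCV; it assumes the projector and camera have been calibrated to a common image plane and that the four marker corner points are available in both the reference and the current partial image). The homography estimated from the marker corners is applied to the instruction image before it is handed to the 360-degree laser projector; the translational part of the same homography can also be used to re-aim the projection direction.

```python
import cv2
import numpy as np

def warp_instruction_image(instruction_img, ref_marker_corners, cur_marker_corners):
    """Deform the instruction image so that it keeps covering the same physical
    spot when the wearer moves. ref_marker_corners and cur_marker_corners are
    4x2 arrays of marker corner pixel coordinates in the reference and current
    partial images, respectively."""
    src = np.asarray(ref_marker_corners, dtype=np.float32)
    dst = np.asarray(cur_marker_corners, dtype=np.float32)
    H, _ = cv2.findHomography(src, dst)      # plain least-squares fit on the 4 corners
    if H is None:
        return instruction_img               # fall back to the unmodified image
    h, w = instruction_img.shape[:2]
    return cv2.warpPerspective(instruction_img, H, (w, h))
```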
4.4変形例
(1)上記実施形態では、全方向を撮像する360度カメラ81、全方向に投影を行う360度レーザプロジェクタ83を用いている。しかし、水平方向に360度(所定度でもよい)撮像するカメラや投影を行うレーザプロジェクタなどを用いてもよい。
4.4 Variations
(1) In the above embodiment, a 360-degree camera 81 that captures images in all directions and a 360-degree laser projector 83 that projects in all directions are used. However, a camera that captures images at 360 degrees (or a predetermined degree) in the horizontal direction, a laser projector that performs projection, or the like may be used.
(2)上記実施形態では、部分特徴画像としてマーカを用いているが、画像の特徴点などを用いるようにしてもよい。 (2) In the above embodiment, a marker is used as the partial feature image, but feature points of the image may also be used.
(3)上記実施形態では、現場装置10、現場装置10と通信可能な指示装置30によって指示画像を投影するようにしている。 (3) In the above embodiment, the instruction image is projected by the on-site device 10 and the instruction device 30 that can communicate with the on-site device 10.
 しかし、現場において投影する指示画像が予め定まっている場合には、現場装置10に指示画像62を記録しておき、現場指示装置11として構築するようにしてもよい。 However, if the instruction image to be projected at the site is determined in advance, the instruction image 62 may be recorded in the site device 10, which can then be configured as a site instruction device 11.
 このようにして構築した現場指示装置11の機能構成を図37に示す。記録部24には指示画像62が記録されている。追従制御手段21は、参照撮像画像と現在の撮像画像のマーカ60または特徴点に基づいて、指示画像が所望の位置に投影されるように制御する。したがって、対象物52に指示画像62が投影される。 The functional configuration of the site instruction device 11 constructed in this way is shown in FIG. 37. An instruction image 62 is recorded in the recording section 24. The follow-up control means 21 controls the instruction image to be projected at a desired position based on the marker 60 or feature points of the reference captured image and the current captured image. Therefore, the instruction image 62 is projected onto the target object 52.
 現場指示装置11は、現場担当者54が装着していても良く、現場担当者近傍の移動体(固定体でもよい)に装着されていても良い。 The on-site instruction device 11 may be worn by the on-site person in charge 54, or may be attached to a moving body (or a fixed body) near the on-site person in charge.
 この実施形態では、記録部24に指示画像62が記録されているが、現場指示装置11の外部から通信などによって取得するようにしてもよい。 In this embodiment, the instruction image 62 is recorded in the recording unit 24, but it may be acquired from outside the on-site instruction device 11 through communication or the like.
 図38に、指示処理のフローチャートを示す。この例では、現場指示装置11を、360度カメラ81、360度レーザプロジェクタ83、スマートフォン200によって構成している。 FIG. 38 shows a flowchart of instruction processing. In this example, the on-site instruction device 11 includes a 360-degree camera 81, a 360-degree laser projector 83, and a smartphone 200.
 スマートフォン200は、撮像画像を取得し(ステップS71)、予め定められたマーカ60が撮像されたかどうかを判定する(ステップS72)。 The smartphone 200 acquires a captured image (step S71), and determines whether a predetermined marker 60 has been captured (step S72).
 マーカ60が撮像されると、スマートフォン200は、その時の撮像画像のうち、マーカ60が撮像された部分撮像画像を参照撮像画像として記録する(ステップS77)。この際、マーカ60が所定領域(たとえば中央部分)になるように、部分撮像画像の方向を選択する。また、併せてこの方向も記録する。 When the marker 60 is imaged, the smartphone 200 records the partial image in which the marker 60 is imaged among the images captured at that time as a reference image (step S77). At this time, the direction of the partial captured image is selected so that the marker 60 is in a predetermined area (for example, in the center). Also, record this direction as well.
 スマートフォン200は、参照撮像画像のマーカと、現在の部分撮像画像のマーカとを比較して、マーカに追従して正しく指示画像が投影されるように、360度レーザプロジェクタによる指示画像の投影方向を制御する(ステップS79)。さらに、上記のマーカの比較により、指示画像が正しく投影されるように指示画像を変形し、その投影位置を制御する。 The smartphone 200 compares the marker in the reference captured image with the marker in the current partial captured image, and controls the direction in which the 360-degree laser projector projects the instruction image so that the instruction image follows the marker and is projected correctly (step S79). Furthermore, based on this comparison of the markers, the instruction image is deformed and its projection position is controlled so that it is projected correctly.
(4)上記実施形態では、カメラ81、プロジェクタ83をヘルメットに直接取り付けるようにしている。しかし、シリコンゲルなどの緩衝材を介して取り付けるようにしてもよい。 (4) In the above embodiment, the camera 81 and the projector 83 are attached directly to the helmet. However, they may instead be attached via a cushioning material such as silicone gel.
(5)上記の各変形例は、その本質に反しない限り互いに組み合わせて適用することができる。また、他の実施形態、その変形例とも組み合わせて適用することが可能である。
 
(5) The above modifications can be applied in combination with one another as long as doing so does not contradict their essence. They can also be applied in combination with the other embodiments and their modifications.

Claims (59)

  1.  指示者が使用する指示装置と、現場担当者が使用する現場装置とを備えた遠隔指示システムであって、
     前記現場装置は、
     現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、
     送信部により、前記撮像画像を前記指示装置に送信する撮像画像送信手段と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、
     撮像部および投影部の向きを検出するセンサからの出力を受けて、前記現場担当者または前記移動体の動きに拘わらず、前記撮像部および前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、
     前記固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段とを備え、
     前記指示装置は、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     受信した撮像画像を表示する撮像画像表示部と、
     所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     前記表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段と、
     を備えたことを特徴とする遠隔指示システム。
    A remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
    The field device is
    an imaging unit that is attached to a person in charge of the site or a moving object and captures an image of the site space and generates a captured image;
    a captured image transmitting means for transmitting the captured image to the instruction device by a transmitting unit;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit;
     direction control means for controlling the drive unit, in response to an output from a sensor that detects the orientations of the imaging unit and the projection unit, so that the imaging unit and the projection unit face in a predetermined direction centered on the on-site person regardless of the movement of the on-site person or the moving body;
    Based on a comparison between the characteristic partial image in the reference captured image when the fixing command was given and the characteristic partial image in the current captured image, using the captured image when the fixing command was given as a reference captured image, a correction means for correcting the projection of the instruction image by the projection unit without depending on the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space;
    The indicating device includes:
    captured image receiving means for receiving the captured image transmitted by the receiving unit;
    a captured image display section that displays the received captured image;
    fixing command means for giving a fixing command to the on-site device by a transmitter so that a captured image of a desired on-site space is captured;
    an instruction image input unit that inputs an instruction image into a desired position in the site space by an instruction person's operation in the displayed captured image;
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device so that, based on the characteristic partial image of the site space included in the captured image, the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space;
    A remote instruction system comprising:
  2.  指示装置とともに遠隔指示システムを構築するための現場装置であって、
     現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、
     送信部により、前記撮像画像を前記指示装置に送信する撮像画像送信手段と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、
     撮像部および投影部の向きを検出するセンサからの出力を受けて、前記現場担当者または前記移動体の動きに拘わらず、前記撮像部および前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、
     前記指示装置から固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段と、
     を備えた現場装置。
    A field device for constructing a remote instruction system together with an instruction device,
    an imaging unit that is attached to a person in charge of the site or a moving object and captures an image of the site space and generates a captured image;
    a captured image transmitting means for transmitting the captured image to the instruction device by a transmitting unit;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit;
     direction control means for controlling the drive unit, in response to an output from a sensor that detects the orientations of the imaging unit and the projection unit, so that the imaging unit and the projection unit face in a predetermined direction centered on the on-site person regardless of the movement of the on-site person or the moving body;
    A captured image when a fixing command is given from the instruction device is used as a reference captured image, and a characteristic partial image in the reference captured image when the fixing command is given is compared with a characteristic partial image in the current captured image. a correction unit that corrects projection of the instruction image by the projection unit without depending on the drive unit, so that the instruction image is correctly displayed based on the predetermined part of the site space;
    On-site equipment equipped with
  3.  現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、送信部により、前記撮像画像を指示装置に送信する撮像画像送信手段と、現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部とを備えた現場装置をコンピュータによって実現するための現場装置補正プログラムであって、コンピュータを、
     前記指示装置から固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段として機能させるための現場装置補正プログラム。
     an on-site device correction program for realizing, by a computer, an on-site device comprising: an imaging unit that is attached to an on-site person or a moving body and captures an image of a site space to generate a captured image; captured image transmitting means for transmitting the captured image to an instruction device by a transmitting unit; a projection unit that is attached to the on-site person or the moving body and projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit, the program causing the computer to function as:
     correction means for correcting, using the captured image when a fixing command is given from the instruction device as a reference captured image and based on a comparison between a characteristic partial image in the reference captured image when the fixing command is given and a characteristic partial image in the current captured image, the projection of the instruction image by the projection unit without depending on the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
  4.  現場装置とともに遠隔指示システムを構築するための指示装置であって、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     受信した撮像画像を表示する撮像画像表示部と、
     所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     前記表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段と、
     を備えた指示装置。
    An instruction device for constructing a remote instruction system together with on-site devices,
    captured image receiving means for receiving the captured image transmitted by the receiving section;
    a captured image display section that displays the received captured image;
    fixing command means for giving a fixing command to the on-site device by a transmitter so that a captured image of a desired on-site space is captured;
    an instruction image input unit that inputs an instruction image into a desired position in the site space by an instruction person's operation in the displayed captured image;
    The transmission unit transmits an instruction image so that the projection unit of the on-site device correctly projects the instruction image based on the characteristic partial image of the on-site space included in the captured image. instruction image transmitting means for transmitting instruction image data with a specified position to the on-site device;
    An indicating device equipped with.
  5.  受信した撮像画像を表示する撮像画像表示部と、表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部とを備えた指示装置プログラムであって、コンピュータを、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段として機能させるための指示装置プログラム。
    An instruction device program comprising: a captured image display unit that displays a received captured image; and an instruction image input unit that inputs an instruction image to a desired position in a site space by an instruction person's operation in the displayed captured image. and the computer,
    captured image receiving means for receiving the captured image transmitted by the receiving unit;
    fixing command means for giving a fixing command to the on-site device by a transmitter so that a captured image of a desired on-site space is captured;
     an instruction device program for causing the computer to function as instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device so that, based on the characteristic partial image of the site space included in the captured image, the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space.
  6.  請求項1~5のいずれかのシステム、装置またはプログラムにおいて、
     前記撮像部および前記投影部は、前記駆動部に対して、高周波振動を吸収する部材を介して固定されていることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 1 to 5,
    A system, device, or program, wherein the imaging section and the projection section are fixed to the drive section via a member that absorbs high-frequency vibrations.
  7.  請求項1~6のいずれかのシステム、装置またはプログラムにおいて、
     前記撮像部および前記投影部は、前記駆動部を介して、現場担当者のヘルメットに固定されていることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 1 to 6,
    A system, device, or program, wherein the imaging unit and the projection unit are fixed to a helmet of a person in charge of the field via the drive unit.
  8.  請求項1~7のいずれかのシステム、装置またはプログラムにおいて、
     前記方向制御手段は、前記指示装置からの方向指示に基づいて前記所定方向を変化させることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 1 to 7,
    A system, device, or program, wherein the direction control means changes the predetermined direction based on a direction instruction from the instruction device.
  9.  請求項1~8のいずれかのシステム、装置またはプログラムにおいて、
     前記特徴部分画像は、前記現場空間内に設けられたマーカまたは前記撮像画像の特徴点であることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 1 to 8,
    A system, device, or program characterized in that the characteristic partial image is a marker provided in the site space or a characteristic point of the captured image.
  10.  請求項1~9のいずれかのシステム、装置またはプログラムにおいて、
     前記駆動部は、多軸駆動機構を有することを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 1 to 9,
    A system, device, or program characterized in that the drive unit has a multi-axis drive mechanism.
  11.  現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、
     撮像部および投影部の向きを検出するセンサからの出力を受けて、前記現場担当者または前記移動体の動きに拘わらず、前記撮像部および前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段と、
     を備えた現場指示装置。
    an imaging unit that is attached to a person in charge of the site or a moving object and captures an image of the site space and generates a captured image;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit;
     direction control means for controlling the drive unit, in response to an output from a sensor that detects the orientations of the imaging unit and the projection unit, so that the imaging unit and the projection unit face in a predetermined direction centered on the on-site person regardless of the movement of the on-site person or the moving body; and
     correction means for correcting the projection of the instruction image by the projection unit without depending on the drive unit, based on a characteristic partial image of the site space included in the captured image, so that the instruction image is correctly displayed with reference to a predetermined part of the site space;
    On-site instruction device with.
  12.  現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部とを備えた現場指示装置をコンピュータによって実現するための現場指示プログラムであって、コンピュータを、
     撮像部および投影部の向きを検出するセンサからの出力を受けて、前記現場担当者または前記移動体の動きに拘わらず、前記撮像部および前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段として機能させるための現場指示プログラム。
     a site instruction program for realizing, by a computer, a site instruction device comprising: an imaging unit that is attached to an on-site person or a moving body and captures an image of a site space to generate a captured image; a projection unit that is attached to the on-site person or the moving body and projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit, the program causing the computer to function as:
     direction control means for controlling the drive unit, in response to an output from a sensor that detects the orientations of the imaging unit and the projection unit, so that the imaging unit and the projection unit face in a predetermined direction centered on the on-site person regardless of the movement of the on-site person or the moving body; and
     correction means for correcting the projection of the instruction image by the projection unit without depending on the drive unit, based on a characteristic partial image of the site space included in the captured image, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
  13.  指示者が使用する指示装置と、現場担当者が使用する現場装置とを備えた遠隔指示システムであって、
     前記現場装置は、
     現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、
     送信部により、前記撮像画像を前記指示装置に送信する撮像画像送信手段と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、
     前記固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、前記現場担当者または前記移動体の動きに拘わらず、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、駆動部を制御する追従制御手段とを備え、
     前記指示装置は、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     受信した撮像画像を表示する撮像画像表示部と、
     所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     前記表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、駆動部を制御して前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段と、
     を備えたことを特徴とする遠隔指示システム。
    A remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
    The field device is
    an imaging unit that is attached to a person in charge of the site or a moving object and captures an image of the site space and generates a captured image;
    a captured image transmitting means for transmitting the captured image to the instruction device by a transmitting unit;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit;
    Based on a comparison between the characteristic partial image in the reference captured image when the fixing command was given and the characteristic partial image in the current captured image, using the captured image when the fixing command was given as a reference captured image, a follow-up control means for controlling a drive unit so that the instruction image is correctly displayed based on a predetermined part of the site space, regardless of the movement of the site person or the mobile body;
    The indicating device includes:
    captured image receiving means for receiving the captured image transmitted by the receiving unit;
    a captured image display section that displays the received captured image;
    fixing command means for giving a fixing command to the on-site device by a transmitter so that a captured image of a desired on-site space is captured;
    an instruction image input unit that inputs an instruction image into a desired position in the site space by an instruction person's operation in the displayed captured image;
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device so that, based on the characteristic partial image of the site space included in the captured image, the drive unit is controlled and the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space;
    A remote instruction system comprising:
  14.  指示装置とともに遠隔指示システムを構築するための現場装置であって、
     現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、
     送信部により、前記撮像画像を前記指示装置に送信する撮像画像送信手段と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、
     前記固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、前記現場担当者または前記移動体の動きに拘わらず、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、駆動部を制御する追従制御手段と、
     を備えた現場装置。
    A field device for constructing a remote instruction system together with an instruction device,
    an imaging unit that is attached to a person in charge of the site or a moving object and captures an image of the site space and generates a captured image;
    a captured image transmitting means for transmitting the captured image to the instruction device by a transmitting unit;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit;
    Based on a comparison between the characteristic partial image in the reference captured image when the fixing command was given and the characteristic partial image in the current captured image, using the captured image when the fixing command was given as a reference captured image, a follow-up control means for controlling a drive unit so that the instruction image is correctly displayed based on a predetermined part of the site space, regardless of the movement of the site person or the mobile body;
    On-site equipment equipped with
  15.  現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、送信部により、前記撮像画像を指示装置に送信する撮像画像送信手段と、現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部とを備えた現場装置をコンピュータによって実現するための現場装置プログラムであって、コンピュータを、
     前記固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、前記現場担当者または前記移動体の動きに拘わらず、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、駆動部を制御する追従制御手段として機能させるための現場装置プログラム。
     a field device program for realizing, by a computer, a field device comprising: an imaging unit that is attached to an on-site person or a moving body and captures an image of a site space to generate a captured image; captured image transmitting means for transmitting the captured image to an instruction device by a transmitting unit; a projection unit that is attached to the on-site person or the moving body and projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit, the program causing the computer to function as:
     follow-up control means for controlling the drive unit, using the captured image when the fixing command is given as a reference captured image and based on a comparison between a characteristic partial image in the reference captured image when the fixing command is given and a characteristic partial image in the current captured image, so that the instruction image is correctly displayed with reference to a predetermined part of the site space regardless of the movement of the on-site person or the moving body.
  16.  現場装置とともに遠隔指示システムを構築するための指示装置であって、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     受信した撮像画像を表示する撮像画像表示部と、
     所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     前記表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、駆動部を制御して前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段と、
     を備えた指示装置。
    An instruction device for constructing a remote instruction system together with on-site devices,
    captured image receiving means for receiving the captured image transmitted by the receiving section;
    a captured image display section that displays the received captured image;
    fixing command means for giving a fixing command to the on-site device by a transmitter so that a captured image of a desired on-site space is captured;
    an instruction image input unit that inputs an instruction image into a desired position in the site space by an instruction person's operation in the displayed captured image;
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device so that, based on the characteristic partial image of the site space included in the captured image, the drive unit is controlled and the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space;
    An indicating device equipped with.
  17.  受信した撮像画像を表示する撮像画像表示部と、表示された撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部とを備えた指示装置プログラムであって、コンピュータを、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     所望の現場空間の撮像画像が撮像されるように、送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、駆動部を制御して前記現場装置の投影部が前記現場空間の所定部位を基準として正しく前記指示画像を投影するように、送信部により、指示画像の位置を特定した指示画像データを前記現場装置に送信する指示画像送信手段として機能させるための指示装置プログラム。
    An instruction device program comprising: a captured image display unit that displays a received captured image; and an instruction image input unit that inputs an instruction image to a desired position in a site space by an instruction person's operation in the displayed captured image. and the computer,
    captured image receiving means for receiving the captured image transmitted by the receiving section;
    fixing command means for giving a fixing command to the on-site device by a transmitter so that a captured image of a desired on-site space is captured;
     an instruction device program for causing the computer to function as instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the on-site device so that, based on the characteristic partial image of the site space included in the captured image, the drive unit is controlled and the projection unit of the on-site device correctly projects the instruction image with reference to a predetermined part of the site space.
  18.  請求項13~17のいずれかのシステム、装置またはプログラムにおいて、
     前記撮像部および前記投影部は、前記駆動部に対して、高周波振動を吸収する部材を介して固定されていることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 13 to 17,
    A system, device, or program, wherein the imaging section and the projection section are fixed to the drive section via a member that absorbs high-frequency vibrations.
  19.  請求項13~18のいずれかのシステム、装置またはプログラムにおいて、
     前記撮像部および前記投影部は、前記駆動部を介して、現場担当者のヘルメットに固定されていることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 13 to 18,
    A system, device, or program, wherein the imaging unit and the projection unit are fixed to a helmet of a person in charge of the field via the drive unit.
  20.  請求項13~19のいずれかのシステム、装置またはプログラムにおいて、
     前記方向制御手段は、前記指示装置からの方向指示に基づいて前記所定方向を変化させることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 13 to 19,
    A system, device, or program, wherein the direction control means changes the predetermined direction based on a direction instruction from the instruction device.
  21.  請求項13~20のいずれかのシステム、装置またはプログラムにおいて、
     前記特徴部分画像は、前記現場空間内に設けられたマーカまたは前記撮像画像の特徴点であることを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 13 to 20,
    A system, device, or program characterized in that the characteristic partial image is a marker provided in the site space or a characteristic point of the captured image.
  22.  請求項13~21のいずれかのシステム、装置またはプログラムにおいて、
     前記駆動部は、多軸駆動機構を有することを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 13 to 21,
    A system, device, or program characterized in that the drive unit has a multi-axis drive mechanism.
  23.  請求項13~22のいずれかのシステム、装置またはプログラムにおいて、
     前記固定指令が与えられた際の撮像画像を参照撮像画像として、当該固定指令が与えられた際の参照撮像画像における特徴部分画像と、現在の撮像画像における特徴部分画像との比較に基づいて、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段をさらに備えたことを特徴とするシステム、装置またはプログラム。
    The system, device or program according to any one of claims 13 to 22,
    Based on a comparison between the characteristic partial image in the reference captured image when the fixing command was given and the characteristic partial image in the current captured image, using the captured image when the fixing command was given as a reference captured image, A system further comprising a correction means for correcting the projection of the instruction image by the projection unit without depending on the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space, device or program.
  24.  現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部と、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記現場担当者または前記移動体の動きに拘わらず、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、駆動部を制御する追従制御手段と、
     を備えた現場指示装置。
    an imaging unit that is attached to a person in charge of the site or a moving object and captures an image of the site space and generates a captured image;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes an imaging direction of the imaging unit and a projection direction of the projection unit;
    Based on the characteristic partial image of the site space included in the captured image, the instruction image is correctly displayed with respect to a predetermined part of the site space as a reference, regardless of the movement of the site person or the moving body; a follow-up control means for controlling the drive section;
    On-site instruction device with.
  25.  現場担当者または移動体に装着され、現場空間を撮像して撮像画像を生成する撮像部と、現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、前記撮像部の撮像方向および前記投影部の投影方向を変化させる駆動部とを備えた現場指示装置をコンピュータによって実現するための現場指示プログラムであって、コンピュータを、
     前記撮像画像に含まれる前記現場空間の特徴部分画像に基づいて、前記現場担当者または前記移動体の動きに拘わらず、指示画像が前記現場空間の所定部位を基準として正しく表示されるように、駆動部を制御する追従制御手段として機能させるための現場指示プログラム。
     a site instruction program for realizing, by a computer, a site instruction device comprising: an imaging unit that is attached to an on-site person or a moving body and captures an image of a site space to generate a captured image; a projection unit that is attached to the on-site person or the moving body and projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the imaging direction of the imaging unit and the projection direction of the projection unit, the program causing the computer to function as:
     follow-up control means for controlling the drive unit, based on a characteristic partial image of the site space included in the captured image, so that the instruction image is correctly displayed with reference to a predetermined part of the site space regardless of the movement of the on-site person or the moving body.
  26.  指示者が使用する指示装置と、現場担当者が使用する現場装置とを備えた遠隔指示システムであって、
     前記現場装置は、
     現場担当者または移動体に装着され、現場空間を広角方向に撮像して撮像画像を生成する撮像部と、
     送信部により、前記撮像画像を前記指示装置に送信する撮像画像送信手段と、
     現場担当者または移動体に装着され、与えられた指示画像データに基づいて、前記現場空間に指示画像を投影する投影部と、
     前記投影部の投影方向を変化させる駆動部と、
     前記現場担当者または前記移動体の動きに拘わらず、前記投影部が現場担当者を中心として所定方向を向くように駆動部を制御する方向制御手段と、
     前記指示画像が前記現場空間の所定部位を基準として正しく表示されるように、前記駆動部によらずに前記投影部による指示画像の投影を補正する補正手段とを備え、
     前記指示装置は、
     受信部により、送信されてきた撮像画像を受信する撮像画像受信手段と、
     送信部により、前記現場装置に対して固定指令を与える固定指令手段と、
     固定指令があると、前記撮像画像の特徴部分画像の近傍を注目撮像画像とし、当該注目撮像画像において、指示者の操作により現場空間の所望の位置に指示画像を入力する指示画像入力部と、
     前記現場担当者または前記移動体の動きに拘わらず、前記投影部が前記現場空間の所定部位を基準として前記指示画像を投影するよう、駆動部を制御し、投影部の投影を制御するために、送信部により、指示画像の位置を特定した指示画像データを前記指示装置に送信する指示画像送信手段と、
     を備えたことを特徴とする遠隔指示システム。
    A remote instruction system comprising an instruction device used by an instructor and a field device used by a site person,
    The field device is
    an imaging unit that is attached to a person in charge of the site or a moving object and that captures images of the site space in a wide-angle direction and generates a captured image;
    a captured image transmitting means for transmitting the captured image to the instruction device by a transmitting unit;
    a projection unit that is attached to a person in charge at the site or a moving body and projects an instruction image onto the site space based on the given instruction image data;
    a drive unit that changes the projection direction of the projection unit;
    Direction control means for controlling a drive unit so that the projection unit faces in a predetermined direction with the on-site person at the center regardless of the movement of the on-site person or the mobile body;
    a correction means for correcting projection of the instruction image by the projection unit without depending on the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space;
    The indicating device includes:
    captured image receiving means for receiving the captured image transmitted by the receiving section;
    fixed command means for giving a fixed command to the field device by a transmitter;
    When there is a fixing command, an instruction image input unit sets the vicinity of the characteristic partial image of the captured image as a captured image of interest, and inputs an instruction image to a desired position in the site space in the captured image of interest by an operation of an instructor;
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device in order to control the drive unit and the projection by the projection unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space regardless of the movement of the on-site person or the moving body;
    A remote instruction system comprising:
27.  A field device for a remote instruction system, comprising:
     an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device;
     a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data;
     a drive unit that changes the projection direction of the projection unit;
     direction control means for controlling the drive unit so that the projection unit faces a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker or the moving body; and
     correction means for correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
28.  A field device program for realizing, by a computer, a field device comprising: an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image; captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device; a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the projection direction of the projection unit, the program causing the computer to function as:
     direction control means for controlling the drive unit so that the projection unit faces a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker or the moving body; and
     correction means for correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
29.  An instruction device for a remote instruction system, comprising:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, a fixing command to the field device;
     an instruction image input unit that, when the fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor; and
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, in order to control the drive unit and the projection by the projection unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
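
     On the instruction-device side (claims 29 and 30), "setting the vicinity of the characteristic partial image as a captured image of interest" can be read as cropping a window around the detected marker and letting the instructor place the instruction image inside that crop. The following sketch is hypothetical: the window size, the coordinate convention and the function names are not taken from the publication.

```python
import numpy as np

def make_image_of_interest(captured, marker_xy, half_size=200):
    """Crop a region of interest around the characteristic partial image
    (illustrative; the 400x400 window is an arbitrary assumption)."""
    h, w = captured.shape[:2]
    cx, cy = int(marker_xy[0]), int(marker_xy[1])
    x0, y0 = max(0, cx - half_size), max(0, cy - half_size)
    x1, y1 = min(w, cx + half_size), min(h, cy + half_size)
    return captured[y0:y1, x0:x1], (x0, y0)

def instruction_position_in_captured_image(click_xy_in_crop, crop_origin):
    """Convert the instructor's click inside the crop back to coordinates of
    the full captured image, i.e. the position the instruction image
    transmitting means would send."""
    return (click_xy_in_crop[0] + crop_origin[0],
            click_xy_in_crop[1] + crop_origin[1])

# Example: a 1920x1080 captured image with a marker detected at (900, 520).
captured = np.zeros((1080, 1920, 3), dtype=np.uint8)
crop, origin = make_image_of_interest(captured, (900, 520))
print(crop.shape, instruction_position_in_captured_image((50, 60), origin))
```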
30.  An instruction device program for realizing, by a computer, an instruction device comprising an instruction image input unit that, when a fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor, the program causing the computer to function as:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, the fixing command to the field device; and
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, in order to control the drive unit and the projection by the projection unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
31.  The system, device or program according to any one of claims 26 to 30, wherein the projection unit is fixed to the drive unit via a member that absorbs high-frequency vibration.
32.  The system, device or program according to any one of claims 26 to 31, wherein the projection unit is fixed, via the drive unit, to a helmet of the on-site worker.
33.  The system, device or program according to any one of claims 26 to 32, wherein the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
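
     Claim 33 allows the characteristic partial image to be either a dedicated marker placed in the site space or natural feature points of the captured image. As one possible illustration of the feature-point case, the sketch below uses OpenCV's ORB detector; the choice of detector and its parameters are assumptions, since the claims do not name any particular method.

```python
import cv2
import numpy as np

def detect_characteristic_points(gray_image, max_points=500):
    """Detect natural feature points that can serve as the characteristic
    partial image when no dedicated marker is placed in the site space
    (illustrative sketch; any detector could be substituted)."""
    orb = cv2.ORB_create(nfeatures=max_points)
    keypoints, descriptors = orb.detectAndCompute(gray_image, None)
    return keypoints, descriptors

# Example with a synthetic frame (a real system would use the camera image).
img = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
kps, des = detect_characteristic_points(img)
print(len(kps), "feature points detected")
```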
34.  The system, device or program according to any one of claims 26 to 33, wherein the drive unit has a multi-axis drive mechanism.
35.  An on-site instruction device comprising:
     an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data;
     a drive unit that changes the projection direction of the projection unit;
     direction control means for controlling the drive unit so that the projection unit faces a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker or the moving body; and
     correction means for correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
36.  A site instruction program for realizing, by a computer, an on-site instruction device comprising: an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image; a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the projection direction of the projection unit, the program causing the computer to function as:
     direction control means for controlling the drive unit so that the projection unit faces a predetermined direction centered on the on-site worker, regardless of the movement of the on-site worker or the moving body; and
     correction means for correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
37.  A remote instruction system comprising an instruction device used by an instructor and a field device used by an on-site worker, wherein
     the field device comprises:
     an imaging unit that is worn by the on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device;
     a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data;
     a drive unit that changes the projection direction of the projection unit; and
     follow-up control means for controlling the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space, and
     the instruction device comprises:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, a fixing command to the field device;
     an instruction image input unit that, when the fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor; and
     instruction image transmitting means for transmitting, by a transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, in order to control the drive unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
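
     In the claim 37 family the drive unit itself performs the tracking: the follow-up control means turns the projection unit so that the instruction image stays anchored to the predetermined part of the site space. One conventional way to realise such a loop is a proportional controller that converts the pixel drift of the characteristic partial image into pan and tilt increments, as sketched below; the gains, field-of-view figures and function names are assumptions rather than values from the publication.

```python
def follow_up_pan_tilt(marker_xy_ref, marker_xy_now, image_size=(1920, 1080),
                       fov_deg=(120.0, 70.0), gain=0.8):
    """Proportional follow-up control sketch: pixel drift of the characteristic
    partial image -> pan/tilt increments (degrees) for the drive unit."""
    w, h = image_size
    fov_x, fov_y = fov_deg
    # Approximate degrees per pixel of the wide-angle imaging unit.
    deg_per_px_x = fov_x / w
    deg_per_px_y = fov_y / h
    dx = marker_xy_now[0] - marker_xy_ref[0]
    dy = marker_xy_now[1] - marker_xy_ref[1]
    # Turn toward the drift so the projected instruction image stays put.
    pan_cmd = gain * dx * deg_per_px_x
    tilt_cmd = gain * dy * deg_per_px_y
    return pan_cmd, tilt_cmd

# Example: the marker has drifted 120 px right and 40 px down since fixation.
print(follow_up_pan_tilt((960, 540), (1080, 580)))
```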
38.  A field device for a remote instruction system, comprising:
     an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device;
     a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data;
     a drive unit that changes the projection direction of the projection unit; and
     follow-up control means for controlling the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
39.  A field device program for realizing, by a computer, a field device comprising: an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image; captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device; a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the projection direction of the projection unit, the program causing the computer to function as:
     captured image transmitting means for transmitting, by the transmitting unit, the captured image to the instruction device; and
     follow-up control means for controlling the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
40.  An instruction device for a remote instruction system, comprising:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, a fixing command to the field device;
     an instruction image input unit that, when the fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor; and
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, in order to control the drive unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
41.  An instruction device program for realizing, by a computer, an instruction device comprising an instruction image input unit that, when a fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor, the program causing the computer to function as:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, the fixing command to the field device; and
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, in order to control the drive unit so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
42.  The system, device or program according to any one of claims 37 to 41, wherein the projection unit is fixed to the drive unit via a member that absorbs high-frequency vibration.
43.  The system, device or program according to any one of claims 37 to 42, wherein the projection unit is fixed, via the drive unit, to a helmet of the on-site worker.
44.  The system, device or program according to any one of claims 37 to 43, wherein the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
45.  The system, device or program according to any one of claims 37 to 44, wherein the drive unit has a multi-axis drive mechanism.
46.  The system, device or program according to any one of claims 37 to 45, further comprising correction means for taking the captured image obtained when the fixing command was given as a reference captured image and, based on a comparison between the characteristic partial image in the reference captured image and the characteristic partial image in the current captured image, correcting the projection of the instruction image by the projection unit, without relying on the drive unit, so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
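
     Claim 46 spells out the comparison behind the correction: the captured image taken at the moment of the fixing command serves as a reference, and the displacement of the characteristic partial image between that reference and the current frame drives the correction. A plausible, purely illustrative realisation matches feature descriptors between the two frames and estimates a similarity transform to apply to the instruction image before projection; the library calls and thresholds below are assumptions, not details from the publication.

```python
import cv2
import numpy as np

def estimate_correction(reference_gray, current_gray):
    """Estimate a 2x3 similarity transform describing how the characteristic
    partial image has moved between the reference captured image (taken at the
    fixing command) and the current captured image. Illustrative only."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_ref, des_ref = orb.detectAndCompute(reference_gray, None)
    kp_cur, des_cur = orb.detectAndCompute(current_gray, None)
    if des_ref is None or des_cur is None:
        return None  # not enough texture to compare

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_cur)
    if len(matches) < 4:
        return None

    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_cur[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    transform, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return transform  # applied to the instruction image before projecting it
```

     The returned transform would then be used to shift (and, if needed, rotate and scale) the instruction image inside the projector frame, which is the "correction without relying on the drive unit" of the claim.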
47.  An on-site instruction device comprising:
     an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data;
     a drive unit that changes the projection direction of the projection unit; and
     follow-up control means for controlling the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
48.  A site instruction program for realizing, by a computer, an on-site instruction device comprising: an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image; a projection unit that is worn by the on-site worker or mounted on the moving body and that projects an instruction image onto the site space based on given instruction image data; and a drive unit that changes the projection direction of the projection unit, the program causing the computer to function as:
     follow-up control means for controlling the drive unit so that the instruction image is correctly displayed with reference to a predetermined part of the site space.
49.  A remote instruction system comprising an instruction device used by an instructor and a field device used by an on-site worker, wherein
     the field device comprises:
     an imaging unit that is worn by the on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device;
     a projection unit that is worn by the on-site worker or mounted on the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data; and
     follow-up control means for controlling, in response to a fixing command from the instruction device and based on a characteristic partial image of the site space included in the captured image, the projection unit so that it projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body, and
     the instruction device comprises:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, the fixing command to the field device;
     an instruction image input unit that, when the fixing command is given, sets the vicinity of the characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor; and
     instruction image transmitting means for transmitting, by a transmitting unit, instruction image data specifying the position of the instruction image to the instruction device.
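
     In the claim 49 family there is no drive unit: the projection unit covers a wide angle, so the follow-up control only has to decide where inside the projector frame the instruction image is drawn. When the imaging unit and projection unit are rigidly fixed together (claims 54 and 55), that decision can reduce to warping the instruction overlay from camera coordinates into projector coordinates, as in the hedged sketch below; the calibration homography and resolutions are assumed, not given in the publication.

```python
import cv2
import numpy as np

def render_projector_frame(instruction_overlay_cam, cam_to_proj, proj_size=(1280, 720)):
    """Warp an instruction overlay drawn in camera coordinates into the
    wide-angle projector's frame. With camera and projector rigidly fixed
    together, this single warp is all the projection-side 'follow-up' needed
    (illustrative; the homography is an assumed one-off calibration)."""
    return cv2.warpPerspective(instruction_overlay_cam, cam_to_proj, proj_size)

# Example: a circle drawn at the instructed position in camera coordinates.
overlay = np.zeros((1080, 1920), dtype=np.uint8)
cv2.circle(overlay, (800, 450), 30, 255, thickness=3)
cam_to_proj = np.array([[0.6, 0.0, 30.0],
                        [0.0, 0.6, 20.0],
                        [0.0, 0.0, 1.0]])
frame = render_projector_frame(overlay, cam_to_proj, (1280, 720))
print(frame.shape)  # (720, 1280)
```

     Keeping the overlay anchored as the worker moves then amounts to updating where the circle is drawn, for example using a feature-based correction of the kind sketched after claim 46.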
50.  A field device for a remote instruction system, comprising:
     an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device;
     a projection unit that is worn by the on-site worker or mounted on the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data; and
     follow-up control means for controlling, in response to a fixing command from the instruction device and based on a characteristic partial image of the site space included in the captured image, the projection unit so that it projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
51.  A field device program for realizing, by a computer, a field device comprising: an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image; and a projection unit that is worn by the on-site worker or mounted on the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data, the program causing the computer to function as:
     captured image transmitting means for transmitting, by a transmitting unit, the captured image to the instruction device; and
     follow-up control means for controlling, in response to a fixing command from the instruction device and based on a characteristic partial image of the site space included in the captured image, the projection unit so that it projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
52.  An instruction device for a remote instruction system, comprising:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, a fixing command to the field device;
     an instruction image input unit that, when the fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor; and
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
53.  An instruction device program for realizing, by a computer, an instruction device comprising an instruction image input unit that, when a fixing command is given, sets the vicinity of a characteristic partial image of the captured image as a captured image of interest and, in the captured image of interest, inputs an instruction image at a desired position in the site space by an operation of the instructor, the program causing the computer to function as:
     captured image receiving means for receiving, by a receiving unit, the transmitted captured image;
     fixing command means for giving, by a transmitting unit, the fixing command to the field device; and
     instruction image transmitting means for transmitting, by the transmitting unit, instruction image data specifying the position of the instruction image to the instruction device, so that the projection unit projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
54.  The system, device or program according to any one of claims 49 to 53, wherein the imaging unit and the projection unit are fixed together via a member that absorbs high-frequency vibration.
55.  The system, device or program according to any one of claims 49 to 54, wherein the imaging unit and the projection unit are fixed to a helmet of the on-site worker.
56.  The system, device or program according to any one of claims 49 to 55, wherein the characteristic partial image is a marker provided in the site space or a feature point of the captured image.
57.  An on-site instruction device comprising:
     an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image;
     a projection unit that is worn by the on-site worker or mounted on the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data; and
     follow-up control means for controlling, based on a characteristic partial image of the site space included in the captured image, the projection unit so that it projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
58.  A site instruction program for realizing, by a computer, an on-site instruction device comprising: an imaging unit that is worn by an on-site worker or mounted on a moving body and that captures the site space over a wide angle to generate a captured image; and a projection unit that is worn by the on-site worker or mounted on the moving body, is capable of projecting over a wide angle onto the site space, and projects an instruction image onto the site space based on given instruction image data, the program causing the computer to function as:
     follow-up control means for controlling, based on a characteristic partial image of the site space included in the captured image, the projection unit so that it projects the instruction image with reference to a predetermined part of the site space, regardless of the movement of the on-site worker or the moving body.
59.  An integrated imaging and projection unit comprising:
     an imaging unit;
     a projection unit that projects over substantially the same angle of view as the imaging angle of view of the imaging unit;
     a structure capable of orienting the imaging unit and the projection unit together about at least two axes; and
     a drive unit that controls the orientation of the structure.
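
     Claim 59 describes an integrated body in which the imaging unit and the projection unit share substantially the same angle of view and are oriented together by a structure movable about at least two axes. Pointing such a structure at a target then reduces to converting a direction vector into pan and tilt angles for the drive unit, as in the sketch below; the axis conventions are an assumption, not part of the claim.

```python
import math

def pan_tilt_for_target(target_xyz):
    """Convert a target direction (x right, y up, z forward, in the unit's own
    frame) into pan/tilt angles in degrees for a two-axis drive structure
    carrying the imaging and projection units together (illustrative)."""
    x, y, z = target_xyz
    pan = math.degrees(math.atan2(x, z))                  # rotation about the vertical axis
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))  # then about the horizontal axis
    return pan, tilt

# Example: a point one metre ahead, 0.3 m to the right and 0.2 m below.
print(pan_tilt_for_target((0.3, -0.2, 1.0)))
```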




PCT/JP2023/005478 2022-03-28 2023-02-16 Remote instruction system WO2023188951A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-052309 2022-03-28
JP2022052309 2022-03-28

Publications (1)

Publication Number Publication Date
WO2023188951A1 true WO2023188951A1 (en) 2023-10-05

Family

ID=88200297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/005478 WO2023188951A1 (en) 2022-03-28 2023-02-16 Remote instruction system

Country Status (1)

Country Link
WO (1) WO2023188951A1 (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005323310A * 2004-05-11 2005-11-17 Nippon Telegr & Teleph Corp <Ntt> Visual field sharing instrument, visual field movement input unit, picture image display device, photographing scope projection method, control method of visual field movement input unit, control method of picture image display device, program of visual field sharing device, and program of both visual field movement input device and picture image display device
US20190235242A1 (en) * 2015-03-31 2019-08-01 Timothy A. Cummings System for virtual display and method of use
JP2018201077A (en) * 2017-05-25 2018-12-20 キヤノン株式会社 Imaging apparatus and control method thereof, image processing system and control method thereof, work support system, and program
WO2019119022A1 (en) * 2017-12-21 2019-06-27 Ehatsystems Pty Ltd Augmented visual assistance system for assisting a person working at a remote workplace, method and headwear for use therewith
JP2020137000A (en) * 2019-02-22 2020-08-31 株式会社日立製作所 Video recording device and head-mounted display
WO2021100331A1 (en) * 2019-11-20 2021-05-27 ダイキン工業株式会社 Remote work support system
JP2021125793A (en) * 2020-02-05 2021-08-30 株式会社大林組 Monitoring system and site monitoring device
JP2022042303A (en) * 2020-09-02 2022-03-14 XR Japan株式会社 Remote support system

Similar Documents

Publication Publication Date Title
US10962774B2 (en) System for virtual display and method of use
CN107402000B (en) Method and system for correlating a display device with respect to a measurement instrument
US7714895B2 (en) Interactive and shared augmented reality system and method having local and remote access
US7373218B2 (en) Image distribution system
JP3217723B2 (en) Telecommunications system and telecommunications method
US9503628B1 (en) Camera mounting and control device
JP6788845B2 (en) Remote communication methods, remote communication systems and autonomous mobile devices
US11609345B2 (en) System and method to determine positioning in a virtual coordinate system
WO2023188951A1 (en) Remote instruction system
CN113467731B (en) Display system, information processing apparatus, and display control method of display system
JP2021018710A (en) Site cooperation system and management device
US10250813B2 (en) Methods and systems for sharing views
CN115904188A (en) Method and device for editing house-type graph, electronic equipment and storage medium
JP2021035002A (en) Image specification system and image specification method
WO2023203853A1 (en) Remote experience system
US20230035962A1 (en) Space recognition system, space recognition method and information terminal
KR20220165948A (en) Method and system for remote collaboration
KR20220058122A (en) Collaborative systems and methods for field and remote site using HMD
JP2021190729A (en) Image specification system and image specification method
US20240020927A1 (en) Method and system for optimum positioning of cameras for accurate rendering of a virtual scene
JP7437642B2 (en) remote instruction control system
US11800214B2 (en) Real time camera-based visibility improvement in atmospheric suit
Siegl et al. An AR human computer interface for object localization in a cognitive vision framework
US20220155592A1 (en) System for virtual display and method of use
WO2022215313A1 (en) Information processing method, information processing device, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23778981

Country of ref document: EP

Kind code of ref document: A1