WO2022168689A1 - Neck-hanging device and remote work assisting system - Google Patents


Info

Publication number
WO2022168689A1
Authority
WO
WIPO (PCT)
Prior art keywords
arm
neck
imaging
light
unit
Prior art date
Application number
PCT/JP2022/002803
Other languages
French (fr)
Japanese (ja)
Inventor
真人 藤野
雄一郎 竹崎
Original Assignee
Fairy Devices株式会社
Priority date
Filing date
Publication date
Application filed by Fairy Devices株式会社
Publication of WO2022168689A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G03B15/03 Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05 Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/02 Bodies
    • G03B17/18 Signals indicating condition of a camera member or suitability of light
    • G03B17/56 Accessories
    • G03B19/00 Cameras
    • G03B19/02 Still-picture cameras
    • G03B19/04 Roll-film cameras
    • G03B19/06 Roll-film cameras adapted to be loaded with more than one film, e.g. with exposure of one or the other at will
    • G03B19/07 Roll-film cameras having more than one objective
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a neck-mounted device and a remote work support system using it.
  • Patent Document 1 discloses a remote work support system comprising a worker terminal having a camera, a laser pointer, etc., and a supporter terminal connected to this terminal via the Internet.
  • a worker terminal transmits an image captured by a camera to a supporter terminal
  • the image is displayed on the supporter terminal.
  • a supporter such as an operator designates a part of the image with a mouse pointer or the like on the supporter terminal while viewing the image
  • the designation information is transmitted to the worker terminal via the Internet.
  • the worker terminal controls the irradiation direction and angle of the laser pointer based on this designation information.
  • the support information can be transmitted to the worker at the site via a laser pointer or the like provided in the worker terminal.
  • in Patent Document 2, it has also been proposed to miniaturize the worker terminal into a device (wearable device) that the worker can wear.
  • the small portable terminal disclosed in Patent Document 2 is used by being worn on the user's shoulder, but it is about half the size of the user's head, and there is still room for miniaturization.
  • this small portable terminal has a laser light source and a camera that are integrated and fixed. Therefore, in this terminal, in order to be able to irradiate the laser light from the laser light source within the angle of view range of the camera, a relatively large turntable is used to control the irradiation direction of the laser light. This is an obstacle to miniaturization of terminals. In addition, if the turntable becomes large, it becomes necessary to mount a large battery to supply power to it, which is also one of the factors that hinder the miniaturization of terminals.
  • the main object of the present invention is to further miniaturize a worker terminal that can be used in a remote work support system.
  • a first aspect of the present invention relates to a neck-mounted device that is worn around the user's neck.
  • the neck-hung type device has a first arm and a second arm that can be arranged at positions across the neck. Both the first arm and the second arm are provided with imaging units (such as cameras) capable of photographing the front side of the user. Further, both the first arm and the second arm are provided with light projection units (laser light source, projector, etc.) capable of irradiating light within the imaging range of the imaging unit.
  • the neck-mounted device according to the present invention can use the light emitted from the light projecting section to indicate (point to) a specific location, and the light can also be used to project figures, letters, numbers, symbols, or images.
  • the imaging section is provided on at least one of the first arm and the second arm, and the light projecting section may be provided on at least the other of the first arm and the second arm.
  • the terminal can be further miniaturized. Specifically, for example, when an imaging unit is provided on the first arm and a light projecting unit is provided on the second arm, a certain distance can be secured between the imaging unit and the light projecting unit, and a certain angular difference can be given to their optical axes.
  • therefore, when irradiating the imaging range of the imaging unit with light, the driving amount of the driving unit (such as an actuator) that controls the irradiation direction of the light from the light projecting unit can be reduced. If the drive amount is reduced, the configuration of the drive unit itself can be made smaller. Furthermore, if the drive unit can be made smaller, the battery that powers it can also be made smaller. As a result, the entire device can be made compact. Note that it is also possible to provide the imaging section on either one of the first arm and the second arm and provide the light projecting sections on both arms, or to provide the imaging sections on both arms and provide the light projecting section on either the first arm or the second arm.
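  • the geometric reasoning above can be sketched numerically. The following is a minimal illustration (not part of the patent; the offsets, distances, and angles are assumed example values) of how pre-tilting a light projecting unit toward the opposite arm's imaging range reduces the yaw sweep its actuator must provide:

```python
import math

def required_sweep_deg(proj_x, pretilt_deg, span, d=1.0):
    """Yaw travel (degrees) an actuator must provide, relative to the
    projector's rest direction, to reach every point of the horizontal
    span (metres) on a plane d metres in front of the device."""
    edge_angles = [math.degrees(math.atan2(x - proj_x, d)) for x in span]
    return max(abs(a - pretilt_deg) for a in edge_angles)

# Hypothetical numbers: a right-arm projector offset 7 cm to the right
# must cover the left camera's imaging range on a plane 1 m ahead.
span = (-0.9, 0.1)                                 # metres, assumed
coaxial = required_sweep_deg(0.07, 0.0, span)      # rest direction straight ahead
cross_arm = required_sweep_deg(0.07, -21.0, span)  # rest direction pre-tilted inward
```

  • with these example numbers, a projector whose rest direction already points toward the middle of the opposite camera's range needs roughly half the angular travel of one resting straight ahead, which is the effect the description attributes to the cross-arm arrangement.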
  • the imaging section and the light projecting section are provided on both the first arm section and the second arm section, respectively.
  • the imaging units are provided on both arms.
  • the front side of the user can be photographed widely by the two imaging units.
  • the horizontal shooting range (horizontal angle of view) can be widened.
  • like the imaging units, the light projecting units are provided on both arms, so the light from the light projecting units can be emitted over a wide range.
  • while one light projecting unit indicates a specific location, the other light projecting unit can output character information, graphic information, and the like. Therefore, the amount of information given to the user can be greatly increased.
  • if a single light projecting unit were responsible for both pointing and displaying information, the illuminance of the indication point might decrease.
  • by dividing these roles between two light projecting units, the illuminance of the indication point can be kept strong while, at the same time, character information and the like can be displayed effectively.
  • it is preferable that the light projecting section of the first arm be configured to be able to irradiate light within the photographing range of the imaging section of the second arm.
  • the light projecting section of the second arm is configured to be able to irradiate light within the photographing range of the imaging section of the first arm. In this way, light is emitted from the light projecting section of the first arm within the imaging range of the second imaging section, and light is emitted from the light projecting section of the second arm within the imaging range of the first imaging section.
  • the configuration of the drive unit for controlling the irradiation direction of the light projecting unit can be reduced in size.
  • the light projecting unit of the first arm irradiates light within the shooting range of the imaging unit of the second arm, so a certain distance can be secured between the imaging unit and the light projecting unit, and a certain angular difference can be given to their optical axes. Therefore, when irradiating the imaging range of the imaging unit with light from the light projecting unit, the driving amount of the driving unit (such as an actuator) that controls the irradiation direction of the light can be reduced. If the drive amount is reduced, the configuration of the drive unit itself can be made smaller. Furthermore, if the drive unit can be made smaller, the battery that powers it can also be made smaller.
  • it is preferable that the photographing directions of the imaging units of the first arm and the second arm be inclined outward, and that the light projection directions of the light projecting units of the first arm and the second arm be inclined inward.
  • "outward" means that the imaging direction of the imaging unit of the first arm faces the direction opposite to the direction in which the second arm exists, and similarly, the imaging direction of the imaging unit of the second arm faces the direction opposite to the direction in which the first arm exists.
  • "inward" means that the light projecting direction of the light projecting part of the first arm faces the direction in which the second arm exists, and similarly, the light projecting direction of the light projecting part of the second arm faces the direction in which the first arm exists.
  • since the imaging direction of the imaging unit and the light projection direction of the light projecting unit are set to opposite sides, the light from the light projecting unit arranged on the same arm is less likely to enter the imaging unit. As a result, a phenomenon such as so-called blown-out highlights is less likely to occur in the image acquired by the imaging unit.
  • the neck-mounted device may further include a control section that controls the light projecting section, and one or more sound collecting sections provided on the first arm or the second arm.
  • the control unit can control the light projecting unit based on the information about the sound acquired by the sound collecting unit so that the direction in which the sound is emitted is indicated by light. This allows the user to visually understand the direction from which the sound is generated.
  • a second aspect of the present invention relates to a remote work support system using the neck-hanging device according to the first aspect.
  • a remote work support system according to the present invention is configured by connecting, via an information communication line, a neck-mounted device worn around the neck of a user and a supporter device operated by a supporter.
  • the neck-mounted device includes a first arm and a second arm that can be arranged at positions sandwiching the user's neck, imaging units provided on the first arm and the second arm that can photograph the front side of the user, light projecting units provided on the first arm and the second arm that can irradiate light within the imaging range of the imaging units, and a control unit for controlling the light projecting units.
  • the light projecting section of the first arm is configured to be able to irradiate light within the imaging range of the imaging section of the second arm, and the light projecting section of the second arm is configured to be able to irradiate light within the imaging range of the imaging section of the first arm.
  • the supporter device includes a display unit capable of displaying a photographed image acquired by the imaging unit of the neck-worn device, and an operation unit capable of inputting designation of coordinates on the display screen of the display unit. The supporter device transmits coordinate information in the captured image to the worker terminal when coordinates in the captured image displayed on the display screen are designated by the operation unit.
  • the control unit of the neck-mounted device controls the irradiation direction of the light emitted from the light projecting unit based on this coordinate information.
  • support information can be transmitted from the supporter terminal at a remote location to the neck-mounted device (worker terminal).
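  • as a rough sketch of the last step above (turning a designated screen coordinate into an irradiation direction), the fragment below converts a pixel chosen on the supporter's display into yaw/pitch angles using a simple pinhole camera model. The field-of-view values and the mapping itself are illustrative assumptions, not taken from the patent; a real implementation would also apply the calibrated camera-to-projector offset of the device:

```python
import math

def pixel_to_angles(u, v, width, height, hfov_deg, vfov_deg):
    """Map a designated pixel (u, v) to yaw/pitch angles (degrees) in
    the camera frame, assuming an ideal pinhole camera whose optical
    axis passes through the image centre."""
    fx = (width / 2) / math.tan(math.radians(hfov_deg) / 2)
    fy = (height / 2) / math.tan(math.radians(vfov_deg) / 2)
    yaw = math.degrees(math.atan2(u - width / 2, fx))
    pitch = math.degrees(math.atan2(v - height / 2, fy))
    return yaw, pitch

# A point at the right edge of a 640x480 frame with a 120-degree
# horizontal angle of view maps to a 60-degree yaw command.
yaw, pitch = pixel_to_angles(640, 240, 640, 480, 120.0, 90.0)
```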
  • FIG. 1 is a perspective view showing an example of a neck-mounted device.
  • FIG. 2 is a plan view showing the mounted state of the neck-mounted device, and schematically shows the imaging range of each imaging section and the light projection possible range of each light projection section.
  • FIG. 3 is a block diagram showing the functional configuration of the neck-mounted device.
  • FIG. 4 shows an example of a remote work support system.
  • FIG. 5 is a flow chart showing an example of processing of a worker terminal and a supporter terminal included in the remote work support system.
  • FIG. 1 shows an embodiment of a neck-mounted device 100 according to the present invention.
  • FIG. 2 schematically shows a state in which the wearer wears the neck-mounted device 100 .
  • FIG. 3 is a block diagram showing the components of the neck-worn device 100.
  • the housing that constitutes neck-mounted device 100 includes left arm 10 , right arm 20 , and main body 30 .
  • the left arm portion 10 and the right arm portion 20 extend forward from the left end and the right end of the main body portion 30, respectively, and the neck-hanging type device 100 has a substantially U-shaped structure as a whole when viewed from above.
  • the body portion 30 is brought into contact with the back of the wearer's neck, and the entire device is hung around the neck so that the left arm portion 10 and the right arm portion 20 hang down from the sides of the wearer's neck toward the chest side.
  • various electronic components are stored in the housing of the neck-mounted device 100 .
  • the left arm 10 and the right arm 20 are provided with imaging units (cameras) 41 and 51, respectively.
  • a first imaging unit 41 is provided at the tip of the left arm 10
  • a second imaging unit 51 is provided at the tip of the right arm 20.
  • these imaging units 41, 51 can be used to capture still images and moving images of the wearer's front side.
  • the images acquired by the imaging units 41 and 51 are transmitted to the control unit 60 in the main unit 30, and after being subjected to predetermined processing, are stored as image data.
  • the photographing ranges of the first imaging section 41 and the second imaging section 51 partially overlap. Therefore, the control unit 60 can obtain the image captured by the first imaging unit 41 and the image captured by the second imaging unit 51, and then integrate these two captured images into a single image.
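  • the integration step is not specified further in the description. As one illustrative possibility (an assumption, not the patent's method), two captures of equal size whose shared region is known could be merged with a simple linear cross-fade:

```python
import numpy as np

def integrate_pair(left, right, overlap):
    """Merge two same-size, horizontally overlapping captures into one
    image by linearly cross-fading the shared `overlap` columns."""
    h, w, c = left.shape
    out = np.zeros((h, 2 * w - overlap, c), dtype=np.float32)
    # Blend weight falls from 1 (pure left) to 0 (pure right).
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]
    out[:, :w - overlap] = left[:, :w - overlap]
    out[:, w - overlap:w] = alpha * left[:, w - overlap:] + (1 - alpha) * right[:, :overlap]
    out[:, w:] = right[:, overlap:]
    return out.astype(np.uint8)
```

  • a production implementation would first align the two images geometrically (for example by feature matching and homography estimation) rather than assume a fixed overlap width.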
  • the left arm portion 10 and the right arm portion 20 are further provided with light projecting portions 42 and 52, respectively.
  • a first light projecting section 42 is provided at the tip of the left arm 10
  • a second light projecting section 52 is provided at the tip of the right arm 20 .
  • Each of the light projection units 42 and 52 is configured to irradiate light within the imaging range of the imaging units 41 and 51 .
  • Examples of the light projection units 42 and 52 are known laser light sources and projectors.
  • the light projecting units 42 and 52 can indicate a specific location on the wearer's front side by laser light emitted from the laser light source.
  • each of the light projection units 42 and 52 is configured to be able to independently change the irradiation direction of light by driving units (actuators) 43 and 53 (see FIG. 3) provided inside the housing.
  • FIG. 2 shows the positional relationship of the imaging units 41 and 51 and the light projecting units 42 and 52 on the arms 10 and 20, and the relationship between the imaging ranges of the imaging units 41 and 51 and the light projectable ranges of the light projecting units 42 and 52.
  • the photographing ranges of the imaging units 41 and 51 are indicated by fan-shaped dashed lines; the actual photographing ranges are, of course, wider.
  • the light projectable ranges of the light projecting parts 42 and 52 are shown with dashed-dotted lines.
  • the light projectable range means the maximum range over which light (laser light, etc.) can be emitted from each of the light projecting units 42 and 52 by changing the irradiation direction with the driving units 43 and 53 described above.
  • the optical axis (main axis) of the first imaging section 41 is denoted by L1
  • the central axis of the first light projecting section 42 is denoted by L2.
  • the optical axis L1 of the first imaging section 41 is an axis of symmetry passing through the center of the lens that constitutes this imaging section, and extends along the center of the imaging range.
  • the central axis L2 of the first light projecting section 42 is a line extending along the center of the light projectable range of this light projecting section, and mainly corresponds to the light emitted from the initial position of this light projecting section.
  • the second imaging section 51 and the second light projecting section 52 also have an optical axis and a central axis.
  • in FIG. 2, a planar object arranged at a position 1 m away from the tip of the neck-hanging device 100 on the wearer's front side is denoted by symbol O.
  • the distal end portions of the left arm portion 10 and the right arm portion 20 include outward surfaces 11 and 21 and inward surfaces 12 and 22, respectively.
  • the outward surfaces 11 and 21 are surfaces facing the outside of the neck-mounted device 100 .
  • Inward facing surfaces 12 , 22 are surfaces facing the inside of the neck device 100 .
  • the normal lines of the inward surfaces 12 and 22 intersect on the wearer's front side.
  • the imaging units 41 and 51 are arranged on the outward surfaces 11 and 21 of the distal ends of the arms 10 and 20, respectively. Therefore, as shown in FIG. 2, the optical axes L1 of the imaging units 41 and 51 are inclined outward from the neck-mounted device 100. As a result, the imaging directions of the imaging units 41 and 51 are tilted outward. However, as shown in FIG. 2, the horizontal angle of view of each of the imaging units 41 and 51 is designed to be very wide, so the imaging ranges of the two imaging units 41 and 51 overlap at least partially. Specifically, the imaging ranges of the imaging units 41 and 51 overlap in front of the wearer.
  • the horizontal angle of view of each of the imaging units 41 and 51 may be 90 to 180 degrees, preferably 100 degrees or more, 110 degrees or more, or 120 degrees or more.
  • the inclination of the optical axis L1 of each imaging unit 41, 51 is not particularly limited.
  • the light projecting sections 42, 52 are arranged on the inward surfaces 12, 22 of the distal end portions of the arms 10, 20, respectively. Therefore, as shown in FIG. 2, the central axis L2 of each of the light projection units 42 and 52 is inclined inward with respect to the neck-mounted device 100. As a result, the light projecting directions of the light projecting units 42 and 52 are inclined inward. Further, the light projecting direction of each of the light projecting units 42 and 52 can be adjusted in left-right angle (yaw angle) and vertical angle (pitch angle) by the driving units 43 and 53. When the adjustment width of this angle (the light projectable range) is increased, the structures of the drive units 43 and 53 become correspondingly larger.
  • the adjustment width of the angle of the light projecting direction of the light projecting units 42 and 52 is preferably 10 to 90 degrees, more preferably 15 to 80 degrees, and particularly preferably 20 to 60 degrees, for both the horizontal angle and the vertical angle. Furthermore, the adjustment width of this angle can be 45 degrees or less. Further, as shown in FIG. 2, since the light projecting directions of the light projecting units 42 and 52 are inclined inward, the light projecting directions (central axes) of the light projecting units 42 and 52 intersect on the wearer's front side.
  • the inclination of the central axis L2 of each of the light projection units 42 and 52 is not particularly limited; for example, the inclination θ2 of the central axis L2 with respect to the wearer's line of sight extending straight ahead is preferably 5 to 45 degrees or 10 to 30 degrees. However, as shown in FIG. 2, when it is assumed that the object O is located at a distance of 1 m in front of the wearer, it is preferable that the light projectable ranges of the light projecting units 42 and 52 overlap at least partially on the surface of the object O.
  • the arrangement of the light projecting units 42 and 52 and the inclination θ2 of their central axes L2 may be adjusted so that the light projectable ranges of the light projecting units 42 and 52 overlap on the surface of the object O located 1 m away from the wearer.
  • at least the object O can be irradiated with light without gaps by the two light projection units 42 and 52 .
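  • the overlap condition above can be checked with elementary trigonometry. The sketch below uses assumed example values (7 cm lateral offsets, a 20-degree inward tilt, and a 50-degree adjustment width, i.e. values within the ranges the description calls preferable) to verify that the two projectable ranges meet on a plane 1 m ahead:

```python
import math

def projectable_interval(proj_x, center_deg, half_width_deg, d=1.0):
    """Horizontal interval (metres) that one light projecting unit can
    reach on a plane d metres ahead, given its lateral offset proj_x,
    the tilt of its central axis (positive = toward the wearer's right),
    and half of its angular adjustment width."""
    lo = proj_x + d * math.tan(math.radians(center_deg - half_width_deg))
    hi = proj_x + d * math.tan(math.radians(center_deg + half_width_deg))
    return lo, hi

# Left-arm projector tilted 20 degrees inward (to the right), right-arm
# projector tilted 20 degrees inward (to the left); 50-degree total sweep.
left_range = projectable_interval(-0.07, +20.0, 25.0)
right_range = projectable_interval(+0.07, -20.0, 25.0)
overlap = (max(left_range[0], right_range[0]), min(left_range[1], right_range[1]))
```

  • with these assumed values the two intervals share a central band on the object plane, so the object O can indeed be irradiated without gaps by the two units together.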
  • the shooting ranges of the imaging units 41 and 51 are tilted outward, while the light projectable ranges of the light projection units 42 and 52 are tilted inward. Therefore, as shown in FIG. 2, the first light projecting section 42 provided on the left arm 10 can irradiate light onto the photographing range of the second imaging section 51 provided on the right arm 20.
  • the first light projecting section 42 may also be capable of irradiating light onto the imaging range of the first imaging section 41 provided on the left arm 10, but its main function is to irradiate the imaging range of the second imaging section 51 with light.
  • the second light projecting section 52 provided on the right arm 20 can irradiate the imaging range of the first imaging section 41 provided on the left arm 10 with light.
  • likewise, the second light projecting section 52 may be capable of irradiating light onto the imaging range of the second imaging section 51 provided on the right arm 20, but its main function is to irradiate the imaging range of the first imaging section 41 provided on the left arm 10 with light.
  • the light projecting section provided on one arm is thus configured to mainly irradiate light onto the imaging range of the imaging section provided on the other arm.
  • outward surfaces 11 and 21 and inward surfaces 12 and 22 are formed at the distal end portions of the left arm portion 10 and the right arm portion 20, respectively.
  • the light emitted from the light projecting units 42 and 52 is less likely to enter the imaging units 41 and 51 directly.
  • a phenomenon called overexposure is less likely to occur in the captured images of the imaging units 41 and 51 .
  • alternatively, the outward surfaces 11 and 21 and the inward surfaces 12 and 22 need not be formed at the distal end portions of the arms 10 and 20, and the imaging units 41 and 51 and the light projecting units 42 and 52 may be arranged on the same plane.
  • even in that case, it is preferable to install the imaging units 41 and 51 and the light projecting units 42 and 52 so that the optical axes L1 of the imaging units 41 and 51 are tilted outward and the central axes L2 of the light projecting units 42 and 52 are tilted inward.
  • illumination units 44 and 54 can be further installed at the tip of each arm 10 and 20 separately from the light projection units 42 and 52 .
  • the main function of the light projecting units 42 and 52 is to present information to the wearer, such as indicating specific locations and displaying characters, figures, etc., by light.
  • the illumination units 44 and 54, in contrast, are installed simply for the purpose of brightly illuminating the imaging ranges of the imaging units 41 and 51.
  • it is preferable that the first illumination unit 44 of the left arm 10 be configured to mainly illuminate the imaging range of the second imaging unit 51 of the right arm 20, and that the second illumination unit 54 of the right arm 20 mainly illuminate the imaging range of the first imaging unit 41 of the left arm 10.
  • the lighting units 44 and 54 are preferably installed on the inward surfaces 12 and 22 of the distal ends of the arms 10 and 20, respectively, like the light projecting units 42 and 52 described above. In addition, by installing in this way, it is possible to prevent the illumination light emitted from the lighting units 44 and 54 from directly entering the imaging units 41 and 51 .
  • the left arm 10 and the right arm 20 are provided with one or more sound collectors (microphones) 45 and 55, respectively.
  • the sound collectors 45 and 55 are arranged mainly for the purpose of acquiring the voices of the wearer and the interlocutor.
  • the left arm 10 is provided with three first sound collectors 45
  • the right arm 20 is also provided with three second sound collectors 55 .
  • the left arm 10 and the right arm 20 may additionally be provided with one or more sound collectors.
  • it is also possible to provide a sound collecting section as an optional additional element in the body section 30 located between the left arm section 10 and the right arm section 20.
  • the sound signals acquired by these sound collectors 45 and 55 are transmitted to the controller 60 (see FIG. 3) provided in the main body 30 and subjected to predetermined analysis processing.
  • the control unit 60 may also analyze the sound signals acquired from the sound collectors 45 and 55 and identify the position of the source of the sound and the direction in which the source exists with respect to the neck-worn device 100.
  • the control unit 60 can control the light projecting units 42 and 52 so as to indicate the direction in which the sound is emitted.
  • the control unit 60 may emit light from the light projecting units 42 and 52 to project the shape of an arrow onto an object, visually communicating the source of the sound to the wearer by the direction of the arrow.
  • alternatively, the control unit 60 may actually emit light from the light projecting units 42 and 52 toward the source of the sound and illuminate the source with the light, so that the wearer can visually recognize where the sound came from.
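The patent does not specify how the control unit 60 would localize a sound source from the signals of the sound collectors 45 and 55, but a common approach with a microphone pair is time-difference-of-arrival (TDOA) estimation via cross-correlation. The sketch below is illustrative only; the function names, the 0.15 m microphone spacing, and the 16 kHz sample rate are assumptions, not values from the document.

```python
import math

def estimate_delay(sig_a, sig_b):
    """Lag (in samples) of sig_b relative to sig_a that maximizes
    the cross-correlation (brute force, fine for short frames)."""
    n = len(sig_a)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-n + 1, n):
        score = sum(sig_a[i] * sig_b[i + lag]
                    for i in range(n) if 0 <= i + lag < n)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def direction_of_arrival(lag, fs, mic_distance, c=343.0):
    """Bearing angle (radians) for a two-microphone pair from a sample
    lag; the asin argument is clamped to its valid domain."""
    x = max(-1.0, min(1.0, lag * c / (fs * mic_distance)))
    return math.asin(x)

# Synthetic check: an impulse reaching the second microphone 3 samples later.
fs = 16000
sig_a, sig_b = [0.0] * 32, [0.0] * 32
sig_a[10], sig_b[13] = 1.0, 1.0
lag = estimate_delay(sig_a, sig_b)
angle = direction_of_arrival(lag, fs, mic_distance=0.15)
```

A real device would do this per short frame and over several microphone pairs (three per arm are available) to resolve front/back ambiguity.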
  • the left arm portion 10 and the right arm portion 20 described above can be arranged at positions sandwiching the neck.
  • the left arm portion 10 and the right arm portion 20 are connected by a main body portion 30 provided at a position that contacts the back of the wearer's neck.
  • Electronic parts such as a processor and a battery are installed in the main body 30 .
  • the housing that constitutes the main body 30 has a substantially flat shape, and can accommodate a planar (plate-shaped) circuit board and a battery.
  • the body portion 30 also has a hanging portion 31 that extends downward from the left arm portion 10 and the right arm portion 20 . By providing the main body portion 30 with the hanging portion 31, a space for installing the control system circuit is secured.
  • control system circuits are centrally mounted on the body portion 30 .
  • This control system circuit includes a battery and a circuit board on which various electronic components, such as a processor driven by power supplied from the battery, are mounted. When the total weight of the neck-mounted device 100 is taken as 100%, the weight of the main body 30 preferably accounts for 40 to 80%, or 50 to 70%, of the total. By arranging such a heavy main body 30 on the back of the wearer's neck, stability during wearing is improved. In addition, by arranging the heavy main body 30 at a position close to the wearer's trunk, the burden imposed on the wearer by the weight of the entire device can be reduced.
  • a proximity sensor 63 is provided inside the main body 30 (on the wearer's side).
  • the proximity sensor 63 is for detecting the approach of an object, and when the neck-hanging type device 100 is worn around the wearer's neck, it detects the approach of the neck. Accordingly, when the proximity sensor 63 detects the proximity of an object, devices such as the imaging units 41 and 51, the light projecting units 42 and 52, the lighting units 44 and 54, and the sound collecting units 45 and 55 can be turned on (driving state); when the proximity sensor 63 does not detect the proximity of an object, these devices can be turned off (sleep state) or made impossible to activate. As a result, power consumption of the battery can be efficiently suppressed.
  • further, by making the imaging units 41 and 51 and the sound collecting units 45 and 55 impossible to activate while the proximity sensor 63 is not detecting the proximity of an object, the device can also be expected to prevent data from being recorded, intentionally or unintentionally, while it is not being worn. A known sensor can be used as the proximity sensor 63; where an optical proximity sensor is used, a light-transmitting window should be provided.
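The proximity-gated power behavior described above could be modeled, purely as an illustrative sketch (the class and method names are invented, not from the patent), like this:

```python
class NeckDevicePower:
    """Illustrative power gating: peripherals run only while the
    proximity sensor reports that the device is worn."""
    PERIPHERALS = ("camera", "projector", "illumination", "microphone")

    def __init__(self):
        self.worn = False
        self.active = set()

    def on_proximity(self, detected):
        """Proximity sensor callback: sleep everything on removal."""
        self.worn = detected
        if not detected:
            self.active.clear()

    def activate(self, peripheral):
        """Refuse activation while not worn, preventing unintended recording."""
        if self.worn and peripheral in self.PERIPHERALS:
            self.active.add(peripheral)
            return True
        return False

power = NeckDevicePower()
refused_before_wearing = power.activate("camera")   # refused: not worn yet
power.on_proximity(True)                            # device placed on the neck
allowed_while_worn = power.activate("camera")       # allowed
power.on_proximity(False)                           # device taken off
```

Refusing activation (rather than merely sleeping) while unworn matches the anti-recording behavior the passage describes.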
  • a sound emitting part 64 (speaker) is provided on the outside of the body part 30 (opposite side of the wearer).
  • the sound emitting portion 64 is preferably arranged to output sound toward the outside of the main body portion 30 .
  • the left arm 10 and the right arm 20 are provided with the sound collectors 45 and 55, respectively.
  • the physical distance between the sound emitting unit 64 and the sound collecting units 45 and 55 can be maximized. That is, while the voices of the wearer and the interlocutor are being collected by the sound collectors 45 and 55, any sound output from the sound emitting unit 64 (self-output sound) may be mixed into the recorded voice.
  • if the self-output sound is included in the recorded sound, it interferes with speech recognition and must be removed by echo cancellation processing or the like.
  • by providing the sound emitting unit 64 at a position corresponding to the back of the wearer's neck as described above, it is physically separated from the sound collecting units 45 and 55, which is preferable.
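The echo cancellation mentioned above is typically done with an adaptive filter that subtracts an estimate of the speaker (self-output) signal from the microphone signal. Below is a minimal normalized-LMS sketch; the tap count, step size, and the synthetic echo path are assumptions for illustration, not details from the patent.

```python
import math

def lms_echo_cancel(far_end, mic, taps=4, mu=0.5):
    """Normalized-LMS echo canceller: adaptively estimates the echo of
    the far-end (speaker) signal in the microphone signal and returns
    the residual after subtracting that estimate."""
    w = [0.0] * taps
    residual = []
    for n in range(len(mic)):
        x = [far_end[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        echo_est = sum(wk * xk for wk, xk in zip(w, x))
        e = mic[n] - echo_est                      # error = cleaned sample
        norm = sum(xk * xk for xk in x) + 1e-9     # avoid divide-by-zero
        w = [wk + mu * e * xk / norm for wk, xk in zip(w, x)]
        residual.append(e)
    return residual

# Synthetic echo path: speaker output leaks in attenuated and 1 sample late.
far_end = [math.sin(0.3 * n) for n in range(200)]
mic = [0.6 * far_end[n - 1] if n >= 1 else 0.0 for n in range(200)]
residual = lms_echo_cancel(far_end, mic)
```

The larger the speaker-to-microphone distance, the weaker the leaked echo, which is exactly why the layout described above eases this processing.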
  • the left arm 10 and the right arm 20 have flexible parts 13 and 23 near the joints with the main body 30, as shown in FIG.
  • Flexible portions 13 and 23 are made of a flexible material such as rubber or silicone. Therefore, when the neck-mounted device 100 is worn, the left arm portion 10 and the right arm portion 20 can be easily fitted around the wearer's neck and shoulders. Wires connecting the electronic devices mounted on the arms 10 and 20 and the control unit 60 provided on the main body 30 are also inserted through the flexible portions 13 and 23 .
  • FIG. 3 is a block diagram showing the functional configuration of the neck-worn device 100.
  • the neck-mounted device 100 includes first and second imaging units 41 and 51, first and second light projecting units 42 and 52, first and second driving units 43 and 53, first and second lighting units 44 and 54, first and second sound collecting units 45 and 55, an operation unit 46, a control unit 60, a storage unit 61, a communication unit 62, a proximity sensor 63, a sound emitting unit 64, and a battery 70.
  • the left arm portion 10 is provided with the first imaging unit 41, the first light projecting unit 42, the first driving unit 43, the first lighting unit 44, the first sound collecting unit 45, and the operation unit 46.
  • a second imaging section 51 , a second light projecting section 52 , a second driving section 53 , a second lighting section 54 and a second sound collecting section 55 are arranged on the right arm section 20 . Furthermore, a control unit 60 , a storage unit 61 , a communication unit 62 , a proximity sensor 63 , a sound emitting unit 64 and a battery 70 are arranged in the main unit 30 .
  • the neck-mounted device 100 may also be equipped, as appropriate, with module equipment mounted in general portable information terminals, such as a gyro sensor, an acceleration sensor, a geomagnetic sensor, and a GPS sensor.
  • the imaging units 41 and 51 acquire image data of still images or moving images.
  • a general digital camera may be adopted as the imaging units 41 and 51 .
  • the imaging units 41 and 51 are composed of, for example, a photographing lens, a mechanical shutter, a shutter driver, a photoelectric conversion element such as a CCD image sensor, a digital signal processor (DSP) that reads the amount of charge from the photoelectric conversion element and generates image data, and an IC memory.
  • the imaging units 41 and 51 preferably also include an autofocus sensor (AF sensor) for measuring the distance from the photographing lens to the subject, and a mechanism for adjusting the focal length of the photographing lens according to the distance detected by the AF sensor.
  • the type of AF sensor is not particularly limited, but a known passive sensor such as a phase difference sensor or a contrast sensor may be used. Also, as the AF sensor, an active sensor that directs infrared rays or ultrasonic waves toward a subject and receives the reflected light or reflected waves thereof may be used.
  • the image data acquired by the imaging units 41 and 51 are supplied to the control unit 60 and stored in the storage unit 61, where predetermined image analysis and image processing are performed. Alternatively, the image data is transmitted via the communication unit 62 to another device over the Internet. In this embodiment, as described above, the front side of the wearer is photographed by the two imaging units 41 and 51. Since the two captured images acquired by the imaging units 41 and 51 partially overlap, the control unit 60 preferably performs processing to integrate these images.
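The integration of the two partially overlapping captured images could, in the simplest case, be a horizontal stitch that searches for the overlap minimizing pixel difference. The toy sketch below assumes grayscale images as 2D lists and a purely horizontal offset; a real implementation would use feature matching and warping.

```python
def stitch_horizontal(left, right, min_overlap=1):
    """Merge two same-height grayscale images (2D lists) whose fields of
    view overlap horizontally, as the two arm-mounted cameras' do. The
    overlap width minimizing mean squared pixel difference is chosen."""
    h, wl = len(left), len(left[0])
    best_ov, best_cost = min_overlap, float("inf")
    for ov in range(min_overlap, min(wl, len(right[0])) + 1):
        cost = sum((left[r][wl - ov + c] - right[r][c]) ** 2
                   for r in range(h) for c in range(ov)) / ov
        if cost < best_cost:
            best_ov, best_cost = ov, cost
    # Keep the left image and append the non-overlapping part of the right.
    return [left[r] + right[r][best_ov:] for r in range(h)]

# Toy frames: two 1-row crops of a scene 0..9 with a 3-pixel overlap.
scene = list(range(10))
left_img = [scene[:6]]    # sees pixels 0..5
right_img = [scene[3:]]   # sees pixels 3..9
merged = stitch_horizontal(left_img, right_img)
```

Here `merged` recovers the full scene `[[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]]`, since the 3-pixel overlap matches exactly.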
  • the light projecting units 42 and 52 present information to the wearer by emitting light.
  • Known laser light sources and projectors can be used as the light projection units 42 and 52, but here, an image display unit equipped with a laser light source will be described as an example.
  • the image display unit includes, for example, a laser light source that emits laser light, an optical fiber one end of which is connected to the laser light source, and an optical scanning section arranged on the other end side of the optical fiber.
  • the configuration of the image display unit for example, the disclosure in JP-A-2019-133102 can be referred to.
  • the laser light source emits laser light of each color of R (red), G (green), and B (blue), and includes three monochromatic laser diode chips that emit these three colors of laser light.
  • light emitted from the laser light source is controlled by the control unit 60 based on image data. That is, the control unit 60 converts image data corresponding to the pixels of the image to be displayed into an image signal by a predetermined method, and controls the modulation of the laser light according to that signal.
  • one end of the optical fiber is connected to the output end of the laser light source, and laser light of each color, modulated according to the image signal, is transmitted through the optical fiber.
  • the other end of the optical fiber is open as a free end, and each color laser beam propagated through the optical fiber is emitted from the tip.
  • as this optical fiber, a single optical fiber having one core that guides the laser beams of all three colors may be used; alternatively, a bundle fiber in which three single-core optical fibers, each guiding the laser light of one color, are bundled may be used, or a multi-core fiber having three cores, one for each color of laser light, may be used.
  • the optical scanning unit is composed of, for example, a holding member that cantilevers the optical fiber and a cylindrical piezoelectric element connected to the holding member and provided on the tip side of the optical fiber.
  • the optical fiber is arranged at the center of the cylindrical piezoelectric element, and the cylindrical piezoelectric element is vibrated to flex and vibrate the optical fiber.
  • the cylindrical piezoelectric element has electrodes divided into four; by applying voltages with a phase difference of π/2 to adjacent electrodes based on a drive signal output from the control unit 60, the end face of the piezoelectric element on the free-end side is made to vibrate circularly.
  • by the control unit 60 varying the drive amplitude at a predetermined cycle, the optical scanning unit causes the laser light emitted from the tip of the optical fiber to scan two-dimensionally, for example in a spiral.
  • in this case, the display screen becomes circular or elliptical.
  • the mode of scanning the laser light in two-dimensional directions is not limited to this.
  • Lissajous scanning may be performed by adjusting the application timing of the voltage applied to each electrode of the cylindrical piezoelectric element.
  • in that case, the display screen corresponds to a substantially rectangular shape. In this way, by manipulating the laser beam emitted from the optical fiber, figures, characters, numbers, symbols, or images can be drawn with some degree of freedom.
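The spiral and Lissajous scan trajectories described above can be written down directly. The sketch below computes illustrative fiber-tip positions; the resonance frequencies and frame time are assumed values, not taken from the patent.

```python
import math

def spiral_scan(t, f_res=1000.0, t_frame=0.02):
    """Fiber-tip position under amplitude-modulated circular drive:
    the radius ramps 0 -> 1 over one frame, tracing a spiral."""
    r = (t % t_frame) / t_frame
    phase = 2 * math.pi * f_res * t
    return r * math.cos(phase), r * math.sin(phase)

def lissajous_scan(t, fx=900.0, fy=1100.0):
    """Lissajous trajectory from two sinusoids of different frequency,
    covering a roughly rectangular field rather than a circular one."""
    return math.sin(2 * math.pi * fx * t), math.sin(2 * math.pi * fy * t)

# Sample both trajectories.
spiral_pts = [spiral_scan(n / 10000.0) for n in range(200)]
lissajous_pts = [lissajous_scan(n / 10000.0) for n in range(200)]
```

The spiral stays inside the unit circle (circular/elliptical screen), while the Lissajous pattern fills the unit square (substantially rectangular screen), matching the two cases in the text.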
  • the drive units 43 and 53 have a mechanism for varying the irradiation direction of the light from the light projection units 42 and 52 .
  • the drive units 43 and 53 include a mechanism for turning the direction of the emission ends of the light projecting units 42 and 52 left and right, and a mechanism for turning the direction of the emission ends up and down.
  • a known turntable or the like may be adopted as these rotating mechanisms.
  • alternatively, the drive units 43 and 53 may include mirrors for reflecting the light from the light projecting units 42 and 52 and an adjustment mechanism for adjusting the orientation of the mirrors.
  • the driving units 43 and 53 drive these rotating mechanisms or adjusting mechanisms based on control signals from the control unit 60 to control the irradiation direction of light from the light projecting units 42 and 52 .
  • the illumination units 44 and 54 are lights for illuminating the shooting range of the imaging units 41 and 51 .
  • Light-emitting elements such as known LEDs, laser light sources, or diffused laser light sources can be employed as the illumination units 44 and 54 .
  • the illumination units 44 and 54 may be of a type that obtains white light by combining a blue LED with a yellow phosphor of its complementary color, or a full-color type that obtains white light by combining red, blue, and green LEDs.
  • the illumination units 44 and 54 may be turned off when a brightness sensor (not shown) detects sufficient brightness of ambient light, and turned on when it does not.
  • as the sound collectors 45 and 55, known microphones such as dynamic microphones, condenser microphones, and MEMS (Micro-Electrical-Mechanical Systems) microphones may be adopted.
  • the sound collectors 45 and 55 convert sounds into electric signals, amplify the electric signals with an amplifier circuit, convert them into digital information with an A/D converter circuit, and output the digital information to the control unit 60 .
  • the sound collectors 45 and 55 can acquire not only the voice of the wearer but also the voices of one or more interlocutors around the wearer. For this reason, it is preferable to employ omnidirectional microphones as the sound collectors 45 and 55 so that sounds generated around the wearer can be collected widely.
  • the operation unit 46 accepts an operation input by the wearer.
  • a known switch circuit, touch panel, or the like can be employed as the operation unit 46 .
  • the operation unit 46 accepts, for example, an operation for instructing the start or stop of voice input, an operation for instructing the start or stop of shooting, an operation for instructing the power of the device to be turned on or off, an operation for instructing the speaker volume to be raised or lowered, and other operations necessary for realizing the functions of the neck-mounted device 100.
  • Information input via the operation unit 46 is transmitted to the control unit 60 .
  • the control unit 60 performs arithmetic processing to control other elements included in the neck-mounted device 100 .
  • as the control unit 60, a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) can be used.
  • the control unit 60 basically reads a program stored in the storage unit 61 and executes predetermined arithmetic processing according to this program. In addition, the control unit 60 can appropriately write and read the calculation result according to the program to and from the storage unit 61 .
  • the storage unit 61 is an element for storing information used for arithmetic processing and the like in the control unit 60 and the calculation results thereof. Specifically, the storage unit 61 stores a program that causes the neck-worn device 100 to exhibit the functions unique to the present invention. When this program is activated by an instruction from the user, the control unit 60 executes processing according to the program.
  • the storage function of the storage unit 61 can be realized by a non-volatile memory such as an HDD or SSD.
  • the storage unit 61 may have a function as a memory for writing or reading the intermediate progress of arithmetic processing by the control unit 60 or the like.
  • the memory function of the storage unit 61 can be realized by a volatile memory such as RAM or DRAM.
  • the storage unit 61 may store ID information unique to the user who owns it.
  • the storage unit 61 may store an IP address, which is identification information of the neck-worn device 100 on the network.
  • the communication unit 62 is an element for wirelessly communicating with another device on the cloud (specifically, a supporter terminal described later) or another neck-mounted device.
  • as the communication unit 62, a communication module for known mobile communication standards such as 3G (W-CDMA), 4G (LTE/LTE-Advanced), and 5G, or for wireless LAN communication such as Wi-Fi (registered trademark), may be adopted in order to communicate with a supporter terminal or another neck-mounted device via the Internet.
  • the communication unit 62 can employ a communication module for close proximity wireless communication such as Bluetooth (registered trademark) or NFC in order to directly communicate with another neck-mounted device.
  • the proximity sensor 63 is mainly used to detect the proximity of the neck-worn device 100 (particularly the main body 30) and the wearer.
  • as the proximity sensor 63, a known sensor such as an optical sensor, an ultrasonic sensor, a magnetic sensor, a capacitance sensor, or a temperature sensor can be used, as described above.
  • the proximity sensor 63 is arranged inside the main body 30 and detects when the wearer's neck comes within a predetermined range. When the proximity sensor 63 detects that the wearer's neck is approaching, the imaging units 41 and 51, the light projecting units 42 and 52, the illumination units 44 and 54, and/or the sound collecting units 45 and 55 can be activated.
  • the sound emitting unit 64 is an acoustic device that converts electrical signals into physical vibrations (that is, sound).
  • An example of the sound emitting part 64 is a general speaker that transmits sound to the wearer by air vibration.
  • the sound emitting portion 64 is provided on the outer side of the body portion 30 (the side opposite the wearer) and is preferably configured to emit sound in a direction away from the back of the wearer's neck (horizontally rearward) or along the back of the neck (vertically upward).
  • the sound emitting unit 64 may be a bone conduction speaker that transmits sound to the wearer by vibrating the bones of the wearer.
  • the sound emitting part 64 may be provided inside the main body part 30 (on the side of the wearer) so that the bone conduction speaker contacts the bone on the back of the wearer's neck (cervical vertebrae).
  • the battery 70 is a battery that supplies power to various electronic components included in the neck-mounted device 100 .
  • a rechargeable storage battery is used as the battery 70 .
  • the battery 70 may be a known battery such as a lithium ion battery, a lithium polymer battery, an alkaline storage battery, a nickel-cadmium battery, a nickel-metal hydride battery, or a lead storage battery.
  • the neck-mounted device 100 can be used as a worker terminal that constitutes a remote work support system.
  • the remote work support system includes a worker terminal 100 (a neck-mounted device) and a supporter terminal 200 which are connected to each other via the Internet.
  • the worker terminal 100 is worn by a worker who works on site.
  • as the worker terminal 100, the above-described neck-hanging device according to the present invention is used.
  • the supporter terminal 200 is a terminal operated by a supporter (operator, etc.) who provides support information to workers on site.
  • the supporter can send work instructions and the like from a remote location to the worker at the site.
  • a general PC can be used as the supporter terminal 200 .
  • the supporter terminal 200 includes a control device 210 including a processor and a communication module, a display device 220 such as a display, and an input device 230 such as a mouse and keyboard.
  • FIG. 5 shows an example of the processing flow of the worker terminal 100 and the supporter terminal 200.
  • the imaging units 41 and 51 of the worker terminal 100 acquire a photographed image of the front side of the worker (wearer) (step S1). Since the imaging units 41 and 51 are mounted on the left and right arms 10 and 20 of the worker terminal 100, respectively, these two imaging units 41 and 51 can obtain captured images over a wide range in the horizontal direction.
  • the worker terminal 100 processes the captured images acquired by the two imaging units 41 and 51 (step S2).
  • processing for integrating two captured images into one image is mainly performed.
  • the processing may be correction of distortion caused by a lens, correction of contrast, brightness, sharpness, and color tone of each photographed image.
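The contrast/brightness portion of this correction step can be sketched as a simple linear per-pixel mapping (the pivot at mid-gray 128 and the 8-bit clamping are illustrative choices, not specified in the document):

```python
def adjust(pixels, contrast=1.0, brightness=0):
    """Linear contrast/brightness correction about mid-gray, clamped to 8-bit."""
    return [max(0, min(255, round(contrast * (p - 128) + 128 + brightness)))
            for p in pixels]

row = [0, 64, 128, 192, 255]
corrected = adjust(row, contrast=1.5, brightness=10)   # [0, 42, 138, 234, 255]
```

Lens-distortion and color-tone corrections would be separate, camera-calibration-dependent steps.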
  • the operator terminal 100 transmits the processed image to the supporter terminal 200 via the Internet (step S3).
  • in this embodiment, the image processing is performed on the worker terminal 100 side, but it may instead be performed on the supporter terminal 200 side.
  • in that case, the worker terminal 100 transmits the captured images acquired by the imaging units 41 and 51 to the supporter terminal 200 without processing them.
  • the supporter terminal 200 then performs the processing described above on the captured images received from the worker terminal 100.
  • the supporter terminal 200 displays the processed image received from the worker terminal 100 on the display (step S4).
  • when the worker terminal 100 acquires a moving image with the imaging units 41 and 51, the moving image is displayed on the display of the supporter terminal 200.
  • the supporter terminal 200 determines whether support information for the worker has been input via the input device 230 such as a mouse or keyboard (step S5).
  • An example of assistance information is the designation of coordinates for a displayed image on a display. For example, as shown in FIG. 4, it is possible to designate predetermined coordinates with a mouse pointer with respect to the displayed image.
  • another example of support information is the input of a message (letters, numbers, symbols, graphics, etc.) to the worker. For example, as shown in FIG. 4, information such as characters can be input using a keyboard or the like.
  • the supporter terminal 200 transmits this support information to the worker terminal 100 (step S6).
  • if no support information has been input, steps S1 to S4 are repeated.
  • the worker terminal 100 outputs the support information received from the supporter terminal 200 by controlling the light projecting units 42, 52 and the driving units 43, 53 (step S7).
  • the control unit 60 of the worker terminal 100 generates control signals for the light projection units 42 and 52 and the driving units 43 and 53 based on the support information received from the supporter terminal 200 .
  • the control signals for the light projecting units 42 and 52 contain information for reproducing the message to the operator.
  • the control signals for the drive units 43 and 53 include information for controlling the light irradiation direction of the light projection units 42 and 52 .
  • the control unit 60 of the worker terminal 100 provides these control signals to the light projecting units 42 and 52 and the driving units 43 and 53. For example, as shown in FIG. 4, light beams carrying different information can be emitted from the light projecting units 42 and 52 of the left and right arm portions 10 and 20, respectively.
  • laser light is emitted from the first light projecting section 42 of the left arm 10 to indicate a specific location within the imaging range.
  • the second light projecting portion 52 of the right arm portion 20 reproduces specific character information by scanning laser light.
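Putting steps S5 to S7 together, the translation of supporter input into control signals might look like the following sketch. All field names, the assumed ±30° field of view, and the signal format are invented for illustration; the patent only specifies that drive-unit signals carry the light irradiation direction and projector signals carry the message to be reproduced.

```python
def support_to_control(support, image_w, image_h):
    """Translate supporter input (a clicked pixel and/or a text message)
    into illustrative control signals: pan/tilt angles for a drive unit
    and payloads for the two projection units."""
    signals = {}
    if "point" in support:
        x, y = support["point"]
        # Map the pixel to pan/tilt inside an assumed +/-30 degree field of view.
        signals["left_drive"] = {
            "pan_deg": (x / image_w - 0.5) * 60.0,
            "tilt_deg": (0.5 - y / image_h) * 60.0,
        }
        signals["left_projector"] = {"pattern": "pointer"}
    if "message" in support:
        signals["right_projector"] = {"pattern": "text", "text": support["message"]}
    return signals

signals = support_to_control({"point": (960, 270), "message": "check valve"},
                             image_w=1920, image_h=1080)
```

Splitting the pointer onto one arm's projector and the text onto the other mirrors the left/right division of labor described just above.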
  • the worker terminal 100 and the supporter terminal 200 may each have a call function.
  • the worker terminal 100 acquires the voice of the worker from the sound collectors 45 and 55 and transmits it to the supporter terminal 200 , and outputs the voice of the supporter from the sound emitting unit 64 .
  • the supporter terminal 200 uses, for example, a headset (not shown) to acquire the voice of the supporter, transmit it to the worker terminal 100, and output the voice of the worker from the headset.


Abstract

[Problem] To provide a compact worker terminal usable in a remote work support system. [Solution] A neck-hanging device 100 worn around the user's neck, comprising: a first arm part 10 and a second arm part 20 that can be arranged at positions across the neck; imaging units 41, 51 provided on the first and second arm parts 10, 20 and capable of photographing the front side of the user; and light projection units 42, 52 provided on the first and second arm parts and capable of irradiating light within the photographing range of the imaging units 41, 51. The light projection unit 42 of the first arm part 10 can irradiate light within the photographing range of the imaging unit 51 of the second arm part 20, and the light projection unit 52 of the second arm part 20 can irradiate light within the photographing range of the imaging unit 41 of the first arm part 10.

Description

Neck-hanging device and remote work support system
 The present invention relates to a neck-mounted device and a remote work support system using the same.
 Conventionally, a remote work support system has been known that comprises a worker terminal having a camera, a laser pointer, and the like, and a supporter terminal connected to this terminal via the Internet (Patent Document 1). In this system, when the worker terminal transmits an image captured by the camera to the supporter terminal, the image is displayed on the supporter terminal. When a supporter such as an operator designates a part of the image with a mouse pointer or the like on the supporter terminal while viewing the image, the designation information is transmitted to the worker terminal via the Internet. The worker terminal then controls the irradiation direction and angle of the laser pointer based on this designation information. As a result, even when the supporter is at a location remote from the work site, support information can be conveyed to the worker at the site via the laser pointer or the like provided on the worker terminal.
 In the remote work support system described above, it has also been proposed to miniaturize the worker terminal into a wearable device that the worker can wear (Patent Document 2).
Patent Document 1: JP 2000-125024 A
Patent Document 2: JP 2004-219847 A
 The small portable terminal disclosed in Patent Document 2 is worn on the user's shoulder, but it is about half the size of the user's head, and there is still room for miniaturization. In particular, in this small portable terminal the laser light source and the camera are fixed together as one unit. For this reason, in order for the laser light source to irradiate laser light within the angle-of-view range of the camera, a relatively large turntable is used to control the irradiation direction of the laser light, which is an obstacle to miniaturizing the terminal. In addition, as the turntable becomes larger, a larger battery must be mounted to supply power to it, which is another factor hindering miniaturization of the terminal.
 Therefore, the main object of the present invention is to further miniaturize a worker terminal that can be used in a remote work support system.
 A first aspect of the present invention relates to a neck-hanging device worn around the user's neck. The neck-hanging device has a first arm and a second arm that can be arranged at positions across the neck. Both the first arm and the second arm are provided with imaging units (cameras, etc.) capable of photographing the front side of the user. Both arms are also provided with light projecting units (laser light sources, projectors, etc.) capable of irradiating light within the imaging range of the imaging units. The neck-hanging device according to the present invention can use the light emitted from the light projecting units to point at a specific location, and can also use this light to project message information including figures, letters, numbers, symbols, or images onto an object. By providing the imaging units and the light projecting units on the arms of the neck-hanging device in this way, the worker terminal usable in a remote work support system can be made smaller, improving its portability.
 In another embodiment of the neck-mounted device according to the present invention, the imaging unit may be provided on at least one of the first arm and the second arm, and the light projecting unit may be provided on at least the other of them. Providing the imaging unit and the light projecting unit on separate arms allows the terminal to be miniaturized further. Specifically, when the imaging unit is provided on the first arm and the light projecting unit on the second arm, for example, a fixed distance separates the two units, and a fixed angular difference can be set between their optical axes. Consequently, when light is projected onto the imaging range of the imaging unit, the drive amount of the drive unit (such as an actuator) that controls the irradiation direction of the light projecting unit can be kept small. A smaller drive amount permits a smaller drive unit, and a smaller drive unit in turn permits a smaller battery to power it, so the entire device can be made compact. It is also possible to provide the imaging unit on only one of the first and second arms while providing light projecting units on both arms, or conversely to provide imaging units on both arms while providing the light projecting unit on only one of them.
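As a rough plan-view illustration of why angling the light projecting unit's rest axis toward the other arm's imaging range reduces the required actuator travel, the geometry can be sketched as follows. All positions and angles here are illustrative assumptions, not values specified in the disclosure:

```python
import math

def required_deflection_deg(projector_x, axis_deg, target):
    """Angle (deg) the actuator must swing the projector away from its
    rest axis to aim at `target`, in a 2-D plan view.
    projector_x: lateral offset of the projector from the neck centre (m)
    axis_deg:    rest-axis direction; 0 = straight ahead, negative = leftward
    target:      (x, y) point in metres, y measured straight ahead
    """
    dx = target[0] - projector_x
    dy = target[1]
    aim_deg = math.degrees(math.atan2(dx, dy))
    return abs(aim_deg - axis_deg)

target = (-0.3, 1.0)  # a point 1 m ahead, 0.3 m to the wearer's left

# Projector on the right arm with its rest axis angled 20 deg inward (leftward):
tilted = required_deflection_deg(0.08, -20.0, target)
# The same projector with its rest axis pointing straight ahead:
straight = required_deflection_deg(0.08, 0.0, target)
# The inward-tilted rest axis needs far less actuator travel for the same target.
```

With the numbers above, the inward-tilted axis needs under a degree of deflection where the straight axis would need about twenty, which is the effect the drive-unit miniaturization argument relies on.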
 In the neck-mounted device according to the present invention, it is preferable that an imaging unit and a light projecting unit be provided on both the first arm and the second arm. Providing imaging units on both arms allows a wide area in front of the user to be photographed by the two imaging units; in particular, the horizontal imaging range (horizontal angle of view) is widened. Similarly, providing light projecting units on both arms allows light to be projected over a wide range. Moreover, with light projecting units installed at two or more locations, one unit can, for example, point at a specific location while the other outputs character information, graphic information, and the like, greatly increasing the amount of information that can be given to the user. If a single light projecting unit were used both to point at a specific location and to display character information at the same time, it would have to spread its light over a wide range, raising the concern that the illuminance of the pointed spot would drop. By using two or more light projecting units to handle the pointing and the display of character information separately, the pointed spot can be kept brightly illuminated while character information and the like are displayed effectively at the same time.
 In the neck-mounted device according to the present invention, the light projecting unit of the first arm is preferably configured to irradiate light within the imaging range of the imaging unit of the second arm, and the light projecting unit of the second arm is preferably configured to irradiate light within the imaging range of the imaging unit of the first arm. Arranging the light projecting units and the imaging units in this crossed configuration allows the drive unit that controls the irradiation direction of each light projecting unit to be miniaturized, as described above. Specifically, when the light projecting unit of the first arm irradiates the imaging range of the imaging unit of the second arm, a fixed distance is secured between the imaging unit and the light projecting unit, and a fixed angular difference can be set between their axes. Consequently, when light is projected onto the imaging range of the imaging unit, the drive amount of the drive unit (such as an actuator) that controls the irradiation direction of the light projecting unit can be kept small. A smaller drive amount permits a smaller drive unit, and a smaller drive unit in turn permits a smaller battery to power it.
 In the neck-mounted device according to the present invention, it is preferable that, in plan view, the imaging directions of the imaging units of the first and second arms be tilted outward while the projection directions of the light projecting units of the first and second arms are tilted inward. Here, "outward" means that the imaging direction of the imaging unit on the first arm points away from the second arm, and likewise that the imaging direction of the imaging unit on the second arm points away from the first arm. "Inward" means that the projection direction of the light projecting unit on the first arm points toward the second arm, and likewise that the projection direction of the light projecting unit on the second arm points toward the first arm. Tilting the imaging direction of each arm's imaging unit outward widens the horizontal angle of view of the integrated image obtained by combining the images captured by the two imaging units. Tilting the projection direction of each arm's light projecting unit inward makes it easier for the light projecting unit on one arm to irradiate the imaging range of the imaging unit on the other arm. Furthermore, because the imaging direction and the projection direction on the same arm point in opposite directions, light from the light projecting unit arranged on that arm is less likely to enter the imaging unit, so the phenomenon known as blown-out highlights is less likely to occur in the image acquired by the imaging unit.
 The neck-mounted device according to the present invention may further include a control unit that controls the light projecting units, and one or more sound collecting units provided on the first arm or the second arm. In this case, based on information about a sound acquired by the sound collecting units, the control unit can control a light projecting unit so that the direction from which the sound was emitted is indicated by light. This allows the user to understand visually the direction from which the sound originated.
 A second aspect of the present invention relates to a remote work support system using the neck-mounted device according to the first aspect. The remote work support system according to the present invention is configured by connecting, via an information communication line, a neck-mounted device worn around a user's neck and a supporter device operated by a supporter. The neck-mounted device includes a first arm and a second arm that can be positioned on either side of the user's neck, imaging units provided on the first and second arms and capable of photographing the area in front of the user, light projecting units provided on the first and second arms and capable of irradiating light within the imaging ranges of the imaging units, and a control unit that controls the light projecting units. The light projecting unit of the first arm is configured to irradiate light within the imaging range of the imaging unit of the second arm, and the light projecting unit of the second arm is configured to irradiate light within the imaging range of the imaging unit of the first arm. The supporter device includes a display unit capable of displaying a captured image acquired by the imaging units of the neck-mounted device, and an operation unit through which coordinates on the display screen of the display unit can be designated. When coordinates within the captured image shown on the display screen are designated via the operation unit, the supporter device transmits the coordinate information to the worker terminal. The control unit of the neck-mounted device then controls the irradiation direction of the light emitted from the light projecting units based on this coordinate information. In this way, support information can be transmitted from the supporter terminal at a remote location to the neck-mounted device (worker terminal).
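The conversion from a designated screen coordinate to an irradiation direction is not specified in the disclosure; under a simple undistorted pinhole-camera assumption it could be sketched as below. The image size and angles of view are illustrative assumptions, and a real device would additionally have to calibrate out the offset between the camera and the projector:

```python
import math

def pixel_to_pan_tilt(x_px, y_px, width, height, hfov_deg, vfov_deg):
    """Map a coordinate picked on the supporter's screen to pan/tilt
    angles for the projector drive, assuming an undistorted pinhole
    camera whose optical axis passes through the image centre."""
    nx = (x_px - width / 2) / (width / 2)    # -1 .. +1 across the image
    ny = (y_px - height / 2) / (height / 2)
    pan = math.degrees(math.atan(nx * math.tan(math.radians(hfov_deg / 2))))
    tilt = math.degrees(math.atan(ny * math.tan(math.radians(vfov_deg / 2))))
    return pan, tilt

# Centre of a 1920x1080 image -> no deflection needed:
pan_c, tilt_c = pixel_to_pan_tilt(960, 540, 1920, 1080, 120, 70)
# Right edge of the image -> pan equal to half the horizontal angle of view:
pan_r, _ = pixel_to_pan_tilt(1920, 540, 1920, 1080, 120, 70)
```

The `tan`/`atan` pairing keeps the mapping consistent with a pinhole projection rather than assuming pixels map linearly to angles, which matters at the wide angles of view used here.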
 According to the present invention, the worker terminal usable in a remote work support system can be made more compact.
FIG. 1 is a perspective view showing an example of the neck-mounted device. FIG. 2 is a plan view showing the device in its worn state, schematically illustrating the imaging range of each imaging unit and the projectable range of each light projecting unit. FIG. 3 is a block diagram showing the functional configuration of the neck-mounted device. FIG. 4 shows an example of the remote work support system. FIG. 5 is a flowchart showing an example of the processing performed by the worker terminal and the supporter terminal included in the remote work support system.
 Embodiments for carrying out the present invention are described below with reference to the drawings. The present invention is not limited to the embodiments described below, and includes modifications of them within the scope obvious to those skilled in the art.
 FIG. 1 shows an embodiment of a neck-mounted device 100 according to the present invention, FIG. 2 schematically shows the device 100 worn by a wearer, and FIG. 3 is a block diagram showing the components of the device 100. As shown in FIG. 1, the housing of the neck-mounted device 100 comprises a left arm 10, a right arm 20, and a main body 30. The left arm 10 and the right arm 20 extend forward from the left and right ends of the main body 30, respectively, so that the device as a whole is roughly U-shaped in plan view. To put the device on, as shown in FIG. 2, the wearer places the main body 30 against the back of the neck, lets the left arm 10 and the right arm 20 hang from the sides of the neck toward the chest, and hooks the whole device around the neck. As shown in FIG. 3, various electronic components are housed inside the housing of the neck-mounted device 100.
 In this embodiment, the left arm 10 and the right arm 20 are provided with imaging units (cameras) 41 and 51, respectively. Specifically, a first imaging unit 41 is provided at the tip of the left arm 10 and a second imaging unit 51 at the tip of the right arm 20; these imaging units 41 and 51 can capture still and moving images of the area in front of the wearer. The images acquired by the imaging units 41 and 51 are transmitted to a control unit 60 in the main body 30, subjected to predetermined processing, and stored as image data. Because the imaging ranges of the first imaging unit 41 and the second imaging unit 51 partially overlap, the control unit 60 can, after acquiring the images captured by the two units, integrate the two captured images into a single image.
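The integration of the two overlapping captured images is not detailed in the disclosure. As a toy sketch of the idea only, two images modelled as 1-D lists of pixel-column values can be merged by cross-blending a known overlap region; a real implementation would estimate the overlap by feature matching and warp the images before blending:

```python
def merge_panorama(left_cols, right_cols, overlap):
    """Merge two images (modelled as 1-D lists of pixel-column values)
    whose fields of view share `overlap` columns, averaging the seam."""
    merged = list(left_cols[:len(left_cols) - overlap])
    for l_col, r_col in zip(left_cols[len(left_cols) - overlap:],
                            right_cols[:overlap]):
        merged.append((l_col + r_col) / 2)  # simple blend across the seam
    merged.extend(right_cols[overlap:])
    return merged

# Two 4-column "images" whose last/first two columns cover the same scene:
pano = merge_panorama([10, 20, 30, 40], [30, 50, 60, 70], overlap=2)
```

The merged result is wider than either input, which is the point of the two-camera arrangement: the union of the two imaging ranges, not either range alone, determines the horizontal angle of view of the integrated image.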
 The left arm 10 and the right arm 20 are further provided with light projecting units 42 and 52, respectively: a first light projecting unit 42 at the tip of the left arm 10 and a second light projecting unit 52 at the tip of the right arm 20. Each of the light projecting units 42 and 52 is configured to irradiate light within the imaging range of the imaging units 41 and 51. Examples of the light projecting units 42 and 52 include known laser light sources and projectors. For example, the light projecting units 42 and 52 can point at a specific location in front of the wearer with laser light emitted from a laser light source, and by scanning the laser light (for example, in a Lissajous pattern) they can also draw characters and figures with it. Alternatively, known micro-projectors may be employed as the light projecting units 42 and 52 to project image light toward the area in front of the wearer. Each of the light projecting units 42 and 52 can change its light irradiation direction independently by means of a drive unit (actuator) 43, 53 (see FIG. 3) provided inside the housing.
 FIG. 2 shows the arrangement of the imaging units 41, 51 and the light projecting units 42, 52 on the arms 10, 20, and the relationship between the imaging ranges of the imaging units 41, 51 and the projectable ranges of the light projecting units 42, 52. In FIG. 2, for convenience, the imaging range of each imaging unit 41, 51 is drawn as a fan-shaped dashed outline, but in reality each imaging range naturally extends beyond the arc of the fan. The projectable range of each light projecting unit 42, 52 is drawn as a dash-dot outline; it is the maximum range over which each light projecting unit 42, 52 can irradiate light when the drive units 43, 53 vary the irradiation direction of its light (laser light or the like). In FIG. 2, the optical axis (principal axis) of the first imaging unit 41 is denoted by L1, and the central axis of the first light projecting unit 42 by L2. The optical axis L1 of the first imaging unit 41 is the axis of symmetry passing through the center of the lens of that imaging unit and extends along the center of its imaging range. The central axis L2 of the first light projecting unit 42 is a line extending along the center of the projectable range of that light projecting unit and mainly corresponds to the light emitted when the unit is in its initial position. Although not shown, the second imaging unit 51 and the second light projecting unit 52 likewise have an optical axis and a central axis. Furthermore, in FIG. 2, a planar object positioned in front of the wearer, 1 m from the tip of the neck-mounted device 100, is denoted by O.
 As shown in FIGS. 1 and 2, the tips of the left arm 10 and the right arm 20 include outward faces 11, 21 and inward faces 12, 22, respectively. The outward faces 11, 21 face the outside of the neck-mounted device 100: in plan view, normals (lines perpendicular to the faces) drawn from the outward faces 11, 21 of the left and right arms 10, 20 do not intersect. The inward faces 12, 22, on the other hand, face the inside of the neck-mounted device 100: in plan view, normals drawn from the inward faces 12, 22 of the left and right arms 10, 20 intersect in front of the wearer.
 The imaging units 41, 51 are arranged on the outward faces 11, 21 at the tips of the arms 10, 20. As shown in FIG. 2, the optical axes L1 of the imaging units 41, 51 are therefore tilted toward the outside of the neck-mounted device 100, so the imaging directions of the imaging units 41, 51 are tilted outward. However, as shown in FIG. 2, because the horizontal angle of view of each imaging unit 41, 51 is designed to be ultra-wide, the imaging ranges of the two imaging units 41, 51 at least partially overlap; specifically, they overlap in the region directly in front of the wearer. For example, the horizontal angle of view of each imaging unit 41, 51 may be 90 to 180 degrees, and is particularly preferably 100 degrees or more, 110 degrees or more, or 120 degrees or more. The tilt of the optical axis L1 of each imaging unit 41, 51 is not particularly limited, but the tilt θ1 of the optical axis L1 relative to the wearer's line of sight extending straight ahead is preferably about 5 to 45 degrees, or about 10 to 30 degrees.
 The light projecting units 42, 52, on the other hand, are arranged on the inward faces 12, 22 at the tips of the arms 10, 20. As shown in FIG. 2, the central axes L2 of the light projecting units 42, 52 are therefore tilted toward the inside of the neck-mounted device 100, so the projection directions of the light projecting units 42, 52 are tilted inward. The projection direction of each light projecting unit 42, 52 can be adjusted in horizontal angle (yaw) and vertical angle (pitch) by the drive units 43, 53, but enlarging the adjustment range of these angles (that is, the projectable range) enlarges the structure of the drive units 43, 53 accordingly. From the standpoint of keeping the drive units 43, 53 small, the angular adjustment range of the projection direction of the light projecting units 42, 52 is therefore preferably 10 to 90 degrees in both yaw and pitch, particularly preferably 15 to 80 degrees or 20 to 60 degrees, and may also be 45 degrees or less. Because the projection directions of the light projecting units 42, 52 are tilted inward, as shown in FIG. 2, the projection directions (central axes) of the two units intersect in front of the wearer. The tilt of the central axis L2 of each light projecting unit 42, 52 is not particularly limited, but the tilt θ2 of the central axis L2 relative to the wearer's line of sight extending straight ahead is preferably about 5 to 45 degrees, or about 10 to 30 degrees. Assuming that the object O is located 1 m in front of the wearer, as shown in FIG. 2, the projectable ranges of the two light projecting units 42, 52 preferably overlap at least partially on the surface of the object O; the projectable ranges of the units 42, 52 and the tilts θ2 of their central axes L2 may be adjusted so that this overlap is obtained. In that case, at least on the object O, the two light projecting units 42, 52 can irradiate light without leaving a gap.
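Whether the two projectable ranges actually overlap on a plane 1 m ahead follows from the plan-view geometry, and can be checked numerically as below. The arm offsets, axis tilts, and swing range used are illustrative assumptions picked from within the preferred ranges stated above, not values fixed by the disclosure:

```python
import math

def footprint(x_m, axis_deg, half_swing_deg, dist_m=1.0):
    """Horizontal interval [lo, hi] (m) that a projector at lateral
    offset x_m can light on a plane dist_m ahead, given its central-axis
    tilt (positive = toward the wearer's right) and the +/- swing that
    the drive unit allows."""
    lo = x_m + dist_m * math.tan(math.radians(axis_deg - half_swing_deg))
    hi = x_m + dist_m * math.tan(math.radians(axis_deg + half_swing_deg))
    return lo, hi

# Left-arm projector 0.08 m left of centre, axis tilted 20 deg inward (right),
# drive swing of +/- 25 deg; the right-arm projector mirrors it:
left_lo, left_hi = footprint(-0.08, +20.0, 25.0)
right_lo, right_hi = footprint(+0.08, -20.0, 25.0)

overlap_m = min(left_hi, right_hi) - max(left_lo, right_lo)
# overlap_m > 0 means the two projectable ranges overlap on the object O
```

With these numbers the two footprints share roughly a third of a metre around the wearer's centre line, so the object O can be lit without a gap; shrinking the swing below about 15 degrees makes the overlap vanish, which is the trade-off between drive-unit size and gapless coverage described above.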
 As described above, the imaging ranges of the imaging units 41, 51 are tilted outward, whereas the projectable ranges of the light projecting units 42, 52 are tilted inward. Consequently, as shown in FIG. 2, the first light projecting unit 42 on the left arm 10 can irradiate the imaging range of the second imaging unit 51 on the right arm 20. The first light projecting unit 42 may also be able to irradiate the imaging range of the first imaging unit 41 on the same left arm 10, but its main function is to irradiate the imaging range of the second imaging unit 51 on the right arm 20. Similarly, the second light projecting unit 52 on the right arm 20 can irradiate the imaging range of the first imaging unit 41 on the left arm 10; it may also be able to irradiate the imaging range of the second imaging unit 51 on the same right arm 20, but its main function is to irradiate the imaging range of the first imaging unit 41 on the left arm 10. In this embodiment, therefore, the light projecting unit provided on one arm is configured mainly to irradiate the imaging range of the imaging unit provided on the other arm.
 By forming the outward faces 11, 21 and the inward faces 12, 22 at the tips of the left arm 10 and the right arm 20 and placing the imaging units 41, 51 on the outward faces 11, 21 and the light projecting units 42, 52 on the inward faces 12, 22, as in this embodiment, light emitted from the light projecting units 42, 52 is unlikely to enter the imaging units 41, 51 directly. As a result, the phenomenon known as blown-out highlights is less likely to occur in the images captured by the imaging units 41, 51. In another embodiment (not shown), however, the tips of the arms 10, 20 need not be formed with outward faces 11, 21 and inward faces 12, 22, and the imaging units 41, 51 and the light projecting units 42, 52 may be arranged on the same plane. Even in that case, the imaging units 41, 51 and the light projecting units 42, 52 are preferably installed so that the optical axes L1 of the imaging units 41, 51 are tilted outward and the central axes L2 of the light projecting units 42, 52 are tilted inward.
 As shown in FIG. 1 and elsewhere, illumination units 44, 54 (flashlights) may additionally be installed at the tips of the arms 10, 20, separately from the light projecting units 42, 52. As described above, the main function of the light projecting units 42, 52 is to present information to the wearer, such as pointing at specific locations or displaying characters and figures with light, whereas the illumination units 44, 54 are installed simply to illuminate the imaging ranges of the imaging units 41, 51 brightly. In this embodiment, the first illumination unit 44 on the left arm 10 is preferably configured mainly to illuminate the imaging range of the second imaging unit 51 on the right arm 20, and the second illumination unit 54 on the right arm 20 mainly to illuminate the imaging range of the first imaging unit 41 on the left arm 10. For this reason, the illumination units 44, 54, like the light projecting units 42, 52 described above, are best installed on the inward faces 12, 22 at the tips of the arms 10, 20. Installing them in this way also prevents the illumination light emitted from the illumination units 44, 54 from entering the imaging units 41, 51 directly.
 As shown in FIG. 1 and elsewhere, the left arm 10 and the right arm 20 are each provided with one or more sound collecting units (microphones) 45, 55, arranged mainly to capture the voices of the wearer and of any interlocutor. In the example shown in FIG. 1, three first sound collecting units 45 are provided on the left arm 10 and three second sound collecting units 55 on the right arm 20. As optional elements, one or more additional sound collecting units may be provided on the left arm 10 and the right arm 20, and, although not shown, a sound collecting unit may also be provided as an optional additional element on the main body 30 located between the left arm 10 and the right arm 20. The sound signals acquired by the sound collecting units 45, 55 are transmitted to the control unit 60 (see FIG. 3) provided in the main body 30 and subjected to predetermined analysis processing.
 The control unit 60 may analyze the sound signals acquired from the sound collecting units 45, 55 and identify the position of the sound source, or the direction in which the source lies relative to the neck-mounted device 100. In that case, the control unit 60 can control the light projecting units 42, 52 so that the direction from which the sound was emitted is indicated by light. For example, the control unit 60 may emit light from the light projecting units 42, 52 to project an arrow onto an object and convey the sound source to the wearer visually by the direction of the arrow. Alternatively, the control unit 60 may emit light from the light projecting units 42, 52 directly toward the sound source and illuminate it, thereby conveying the sound source to the wearer visually.
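The disclosure does not specify how the direction of the sound is estimated from the microphone signals. One common approach with two or more microphones is a time-difference-of-arrival (TDOA) estimate; the following is a minimal far-field sketch of that idea, with the baseline length chosen purely for illustration:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_from_tdoa(delay_s, mic_spacing_m):
    """Estimate the horizontal bearing of a sound source from the
    arrival-time difference between two microphones, using the
    far-field (plane-wave) approximation.
    Positive delay = the source lies on the positive side of the axis."""
    s = delay_s * SPEED_OF_SOUND / mic_spacing_m
    s = max(-1.0, min(1.0, s))          # guard against noisy estimates
    return math.degrees(math.asin(s))   # 0 deg = straight ahead

# A source 30 deg off-axis of a 0.15 m microphone baseline produces:
delay = 0.15 * math.sin(math.radians(30.0)) / SPEED_OF_SOUND
bearing_deg = bearing_from_tdoa(delay, 0.15)  # recovers ~30 deg
```

The recovered bearing could then be handed to the drive units 43, 53 as the yaw angle at which to project the arrow or spotlight; with the six microphones of this embodiment, several pairwise estimates would normally be combined for robustness.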
 The left arm 10 and the right arm 20 described above can be positioned on either side of the neck, and are connected by the main body 30, which is provided at a position where it contacts the back of the wearer's neck. Electronic components (control system circuitry) such as a processor and a battery are housed in the main body 30. As shown in FIG. 1, the housing of the main body 30 has a substantially flat shape and can accommodate a planar (plate-shaped) circuit board and a battery. The main body 30 also has a hanging portion 31 that extends downward beyond the left arm 10 and the right arm 20; providing the hanging portion 31 secures space for the control system circuitry, which is concentrated in the main body 30. This control system circuitry includes the battery and a circuit board carrying various electronic components, such as the processor, that are driven by power supplied from the battery. Accordingly, when the total weight of the neck-mounted device 100 is taken as 100%, the main body 30 accounts for 40 to 80%, or 50 to 70%, of it. Placing such a heavy main body 30 at the back of the wearer's neck improves stability during wear, and placing the heavy main body 30 close to the wearer's trunk reduces the load that the weight of the whole device imposes on the wearer.
 A proximity sensor 63 is also provided on the inner side of the main body portion 30 (the wearer's side). The proximity sensor 63 detects the approach of an object; when the neck-mounted device 100 is worn around the wearer's neck, it detects the approach of the neck. Accordingly, while the proximity sensor 63 is detecting a nearby object, devices such as the imaging units 41 and 51, the light projecting units 42 and 52, the illumination units 44 and 54, and the sound collecting units 45 and 55 may be turned on (driven), and while it is not, those devices may be turned off (put to sleep) or made impossible to activate. This efficiently suppresses battery power consumption. Making the imaging units 41 and 51 and the sound collecting units 45 and 55 impossible to activate while no object is detected can also be expected to prevent data from being recorded, intentionally or unintentionally, while the device is not worn. A known sensor can be used as the proximity sensor 63; when an optical type is used, a transmission window that passes the sensor's detection light should be provided in the main body portion 30.
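The proximity-gated power behavior described above can be pictured as simple state logic. The peripheral names and the Python class below are illustrative, not part of any device firmware:

```python
from dataclasses import dataclass, field

@dataclass
class PeripheralPower:
    """Gate the camera/projector/illumination/microphone power states on
    the proximity sensor's detection result (illustrative sketch)."""
    worn: bool = False
    active: set = field(default_factory=set)

    PERIPHERALS = ("imaging", "projector", "illumination", "microphone")

    def on_proximity(self, detected: bool) -> None:
        self.worn = detected
        if detected:
            # Device is on the neck: wake every peripheral.
            self.active = set(self.PERIPHERALS)
        else:
            # Device removed: sleep everything, so nothing can record.
            self.active.clear()

    def can_record(self) -> bool:
        return bool(self.active & {"imaging", "microphone"})
```

The key property is the last method: while the sensor reports no wearer, neither the cameras nor the microphones can record.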
 A sound emitting unit 64 (speaker) is provided on the outer side of the main body portion 30 (the side facing away from the wearer). The sound emitting unit 64 is preferably arranged to output sound outward from the main body portion 30. In this embodiment, the left arm portion 10 and the right arm portion 20 carry the sound collecting units 45 and 55, respectively, so placing the sound emitting unit 64 at a position corresponding to the back of the wearer's neck maximizes the physical distance between it and each sound collecting unit. This matters because, while the sound collecting units 45 and 55 are capturing the voices of the wearer and any interlocutors, any sound output from the sound emitting unit 64 (self-output sound) may mix into the recorded voice. Self-output sound mixed into the recording interferes with speech recognition, so it must be removed, for example by echo cancellation. In practice, however, effects such as housing vibration make it difficult to remove the self-output sound completely even with echo cancellation. For this reason, to minimize the volume of self-output sound mixed into the wearer's voice, it is preferable to place the sound emitting unit 64 at the back of the wearer's neck as described above, keeping it physically distant from the sound collecting units 45 and 55.
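Echo cancellation of the self-output sound mentioned above is typically done with an adaptive filter. Below is a minimal normalized-LMS (NLMS) sketch; the tap count and step size are placeholders, and real firmware would also have to model the acoustic delay and the housing vibration that, as the text notes, make complete cancellation difficult:

```python
import math

def nlms_echo_cancel(mic, speaker, taps=8, mu=0.5, eps=1e-8):
    """Subtract an adaptive estimate of the self-output sound from the
    microphone signal using normalized LMS. `mic` and `speaker` are
    equal-length lists of float samples (illustrative parameters)."""
    w = [0.0] * taps                     # adaptive filter weights
    residual = []
    for n in range(len(mic)):
        # Most recent `taps` speaker samples, zero-padded at the start.
        x = [speaker[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        echo_est = sum(wk * xk for wk, xk in zip(w, x))
        e = mic[n] - echo_est            # residual: near-end speech + error
        norm = sum(xk * xk for xk in x) + eps
        w = [wk + mu * e * xk / norm for wk, xk in zip(w, x)]
        residual.append(e)
    return residual
```

With a microphone signal that is pure echo (no near-end speech), the residual decays toward zero as the filter converges.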
 As a structural feature of the neck-mounted device 100, the left arm portion 10 and the right arm portion 20 have flexible portions 13 and 23 near their joints with the main body portion 30, as shown in FIG. 1. The flexible portions 13 and 23 are formed of a flexible material such as rubber or silicone, so when the neck-mounted device 100 is worn, the left arm portion 10 and the right arm portion 20 fit easily around the wearer's neck and over the shoulders. Wiring that connects the electronic devices mounted on the arm portions 10 and 20 to the control unit 60 in the main body portion 30 also passes through the flexible portions 13 and 23.
 FIG. 3 is a block diagram showing the functional configuration of the neck-mounted device 100. As shown in FIG. 3, the neck-mounted device 100 has first and second imaging units 41 and 51, first and second light projecting units 42 and 52, first and second drive units 43 and 53, first and second illumination units 44 and 54, first and second sound collecting units 45 and 55, an operation unit 46, a control unit 60, a storage unit 61, a communication unit 62, a proximity sensor 63, a sound emitting unit 64, and a battery 70. In this embodiment, the left arm portion 10 carries the first imaging unit 41, the first light projecting unit 42, the first drive unit 43, the first illumination unit 44, the first sound collecting unit 45, and the operation unit 46. The right arm portion 20 carries the second imaging unit 51, the second light projecting unit 52, the second drive unit 53, the second illumination unit 54, and the second sound collecting unit 55. The main body portion 30 houses the control unit 60, the storage unit 61, the communication unit 62, the proximity sensor 63, the sound emitting unit 64, and the battery 70. In addition to the functional configuration shown in FIG. 3, the neck-mounted device 100 may also carry, as appropriate, module devices found in common portable information terminals, such as a gyro sensor, an acceleration sensor, a geomagnetic sensor, or a GPS sensor.
 The imaging units 41 and 51 acquire image data of still or moving images, and common digital cameras may be used for them. The imaging units 41 and 51 comprise, for example, a photographing lens, a mechanical shutter, a shutter driver, a photoelectric conversion element such as a CCD image sensor unit, a digital signal processor (DSP) that reads the charge from the photoelectric conversion element and generates image data, and an IC memory. They preferably also have an autofocus sensor (AF sensor) that measures the distance from the photographing lens to the subject, and a mechanism for adjusting the focal length of the lens according to the distance the AF sensor detects. The type of AF sensor is not particularly limited; a known passive type such as a phase-difference sensor or a contrast sensor may be used, and an active type that directs infrared light or ultrasonic waves at the subject and receives the reflected light or waves may also be used. The image data acquired by the imaging units 41 and 51 are supplied to the control unit 60 and stored in the storage unit 61, where predetermined image analysis and image processing are performed; alternatively, the image data are transmitted to another device over the Internet via the communication unit 62. In this embodiment, as described above, the two imaging units 41 and 51 photograph the area in front of the wearer. Because the two captured images partially overlap, the control unit 60 preferably performs processing that integrates them into a single image.
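The integration of the two partially overlapping captured images can be illustrated with a naive horizontal stitch that searches for the overlap width by minimizing pixel differences. Rows of grayscale numbers stand in for real image data here; a production implementation would use feature matching and blending, which the patent does not detail:

```python
def stitch_horizontal(left, right, max_overlap):
    """Join two same-height images (lists of pixel rows) captured by the
    left and right cameras, discarding the duplicated overlap. The overlap
    width is chosen by minimizing the mean squared difference between the
    right edge of `left` and the left edge of `right` (toy sketch)."""
    h, lw = len(left), len(left[0])
    best_w, best_err = 1, float("inf")
    for ov in range(1, max_overlap + 1):
        err = 0.0
        for y in range(h):
            for k in range(ov):
                d = left[y][lw - ov + k] - right[y][k]
                err += d * d
        err /= ov * h  # mean per overlapping pixel
        if err < best_err:
            best_err, best_w = err, ov
    # Keep `left` intact and append the non-overlapping part of `right`.
    return [lrow[:] + rrow[best_w:] for lrow, rrow in zip(left, right)]
```

For two strips whose last and first two columns coincide, the search finds that overlap and emits one seamless strip.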
 The light projecting units 42 and 52 present information to the wearer by irradiating light. A known laser light source or projector can be used as the light projecting units 42 and 52; here, an image display unit equipped with a laser light source is described as an example. The image display unit includes, for example, a laser light source that emits laser light, an optical fiber with one end connected to the laser light source, and an optical scanning section arranged at the other end of the fiber. For the configuration of the image display unit, reference may be made, for example, to the disclosure of JP 2019-133102 A.
 The laser light source emits laser light in each of R (red), G (green), and B (blue), and comprises three monochromatic laser diode chips that emit these three colors respectively. The light emitted from the laser light source is controlled by the control unit 60 based on image data. Specifically, the control unit 60 converts image data corresponding to the pixels of the image to be displayed into an image signal by a predetermined method, and based on this image signal controls the emission timing of each color's laser diode chip and the modulation of each of the R, G, and B laser beams.
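The per-pixel control might look like the following sketch, which converts one 8-bit RGB pixel into drive levels for the three diode chips. The 10-bit drive range and the gamma exponent are assumptions for illustration, not values from the patent:

```python
def pixel_to_drive_levels(r, g, b, max_level=1023):
    """Convert one 8-bit RGB pixel into drive levels for the three
    monochromatic laser diode chips (hypothetical 10-bit DAC).
    Applies a gamma curve so perceived brightness tracks pixel value."""
    def level(v):
        x = v / 255.0
        return round(x ** 2.2 * max_level)  # gamma-correct, then scale
    return level(r), level(g), level(b)
```

Black maps to all diodes off, white to full drive, and mid-gray to well below half drive because of the gamma curve.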
 One end of the optical fiber is connected to the output end of the laser light source, and each color's laser light, modulated according to the image signal, is transmitted through the fiber. The other end of the fiber is left open as a free end, and the laser light of each color that has propagated through the fiber exits from its tip. As the optical fiber, one may use a single fiber with a single core through which the three colors of laser light are guided, a bundle fiber formed of three single-core fibers each guiding one color, or a multi-core fiber with three cores each guiding one color.
 The optical scanning section is composed, for example, of a holding member that cantilevers the optical fiber and a cylindrical piezoelectric element connected to the holding member and provided on the tip side of the fiber. The optical fiber is placed at the center of the cylindrical piezoelectric element, and vibrating the element causes the fiber to flex and vibrate. Specifically, the cylindrical piezoelectric element has electrodes divided into four; based on a drive signal output from the control unit 60, voltages with a phase difference of π/2 are applied to adjacent electrodes, causing the end face on the element's open end to vibrate in a circle. With the amplitude controlled by the control unit 60 at a predetermined period, the optical scanning section then scans the laser light emitted from the fiber tip two-dimensionally, for example in a spiral; in this case the display screen is circular or elliptical. The manner of scanning the laser light two-dimensionally is not limited to this. For example, Lissajous scanning may be performed by adjusting the timing of the voltages applied to the electrodes of the cylindrical piezoelectric element, in which case the display screen corresponds to a roughly rectangular shape. By steering the laser light emitted from the optical fiber in this way, figures, characters, numbers, symbols, or images can be drawn with some degree of freedom.
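The two scan patterns can be modeled as parametric trajectories of the fiber tip. The frequencies and amplitude-growth rate below are placeholders; the point is only that a spiral ramps the radius of a circular motion (giving a circular screen), while Lissajous scanning drives the two axes at different frequencies (giving a roughly rectangular screen):

```python
import math

def spiral_point(t, f=12000.0, growth=0.001):
    """Fiber-tip position under spiral scanning: circular motion whose
    amplitude is ramped over time (illustrative frequency and growth)."""
    r = growth * t
    phase = 2 * math.pi * f * t
    return r * math.cos(phase), r * math.sin(phase)

def lissajous_point(t, fx=11000.0, fy=13000.0):
    """Fiber-tip position under Lissajous scanning: the two axes are
    driven at different frequencies with a 90-degree phase offset, so the
    trajectory fills a roughly rectangular area over time."""
    return math.cos(2 * math.pi * fx * t), math.sin(2 * math.pi * fy * t)
```

At t = 0 the spiral starts at the origin, while the Lissajous trajectory starts at the edge of its unit square; sampling either function densely traces out the corresponding display shape.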
 The drive units 43 and 53 have mechanisms for varying the direction in which the light projecting units 42 and 52 irradiate light. For example, the drive units 43 and 53 include a mechanism for turning the emission ends of the light projecting units 42 and 52 left and right, and a mechanism for turning them up and down; known turntables or the like may be used for these turning mechanisms. The drive units 43 and 53 may instead include a mirror that reflects the light from the light projecting units 42 and 52 together with an adjustment mechanism for orienting the mirror. Based on control signals from the control unit 60, the drive units 43 and 53 drive these turning or adjustment mechanisms to control the direction of the light emitted from the light projecting units 42 and 52.
 The illumination units 44 and 54 are lights for illuminating the imaging range of the imaging units 41 and 51. Light-emitting elements such as known LEDs, laser light sources, or diffused laser light sources can be used as the illumination units 44 and 54. Since it is desirable for the illumination units 44 and 54 to emit white light, one may use a type that obtains white light by combining a blue LED with a yellow phosphor of its complementary color, or a full-color type that obtains white light by combining red, blue, and green LEDs. The illumination units 44 and 54 may be turned off when a brightness sensor (not shown) detects sufficiently bright ambient light, and turned on when it does not.
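The brightness-sensor behavior just described can be sketched as a threshold with hysteresis, so the LEDs do not flicker when the ambient light hovers near a single switching point. The lux thresholds below are illustrative assumptions:

```python
def illumination_state(lux, currently_on, on_below=80.0, off_above=120.0):
    """Decide the LED on/off state from the brightness sensor reading.
    Hysteresis: turn on below `on_below`, off above `off_above`, and keep
    the current state in between (threshold values are placeholders)."""
    if lux < on_below:
        return True
    if lux > off_above:
        return False
    return currently_on  # inside the hysteresis band: no change
```

In a dim environment the light switches on, in a bright one it switches off, and within the band the previous state persists.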
 Known microphones such as dynamic microphones, condenser microphones, or MEMS (Micro-Electrical-Mechanical Systems) microphones may be used as the sound collecting units 45 and 55. The sound collecting units 45 and 55 convert sound into an electrical signal, amplify it with an amplifier circuit, convert it into digital information with an A/D conversion circuit, and output it to the control unit 60. In this embodiment, the sound collecting units 45 and 55 can acquire not only the wearer's voice but also the voices of one or more interlocutors around the wearer. Omnidirectional microphones are therefore preferably used as the sound collecting units 45 and 55 so that sounds arising around the wearer can be collected over a wide area.
 The operation unit 46 accepts operation input from the wearer. A known switch circuit, touch panel, or the like can be used as the operation unit 46. The operation unit 46 accepts, for example, operations instructing the start or stop of voice input, the start or stop of imaging, turning the device's power on or off, raising or lowering the speaker volume, and any other operations needed to realize the functions of the neck-mounted device 100. Information input via the operation unit 46 is passed to the control unit 60.
 The control unit 60 performs the arithmetic processing that controls the other elements of the neck-mounted device 100. A processor such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit) can serve as the control unit 60. The control unit 60 basically reads a program stored in the storage unit 61 and executes predetermined arithmetic processing according to that program; it can also write the results of that processing to, and read them from, the storage unit 61 as appropriate.
 The storage unit 61 stores the information used in the control unit 60's arithmetic processing and the results of that processing. Specifically, the storage unit 61 stores a program that causes the neck-mounted device 100 to exhibit the functions unique to the present invention; when this program is launched by a user instruction, the control unit 60 executes processing according to it. The storage function of the storage unit 61 can be realized by nonvolatile memory such as an HDD or SSD. The storage unit 61 may also function as a memory for writing and reading the intermediate results of the control unit 60's processing, a function that can be realized by volatile memory such as RAM or DRAM. The storage unit 61 may further store ID information unique to the user who possesses the device, and may store the IP address that identifies the neck-mounted device 100 on the network.
 The communication unit 62 is an element for wireless communication with another device on the cloud (specifically, the supporter terminal described later) or with another neck-mounted device. To communicate with the supporter terminal or another neck-mounted device over the Internet, the communication unit 62 may use a communication module for a known mobile communication standard such as 3G (W-CDMA), 4G (LTE/LTE-Advanced), or 5G, or for a wireless LAN system such as Wi-Fi (registered trademark). To communicate directly with another neck-mounted device, the communication unit 62 may also use a communication module for close-proximity wireless communication such as Bluetooth (registered trademark) or NFC.
 The proximity sensor 63 is used mainly to detect the approach of the neck-mounted device 100 (particularly the main body portion 30) to the wearer. As noted above, a known optical, ultrasonic, magnetic, capacitive, or thermal sensor can serve as the proximity sensor 63. The proximity sensor 63 is arranged on the inner side of the main body portion 30 and detects when the wearer's neck comes within a predetermined range. When the proximity sensor 63 detects the approach of the wearer's neck, the imaging units 41 and 51, the light projecting units 42 and 52, the illumination units 44 and 54, and/or the sound collecting units 45 and 55 can be activated.
 The sound emitting unit 64 is an acoustic device that converts electrical signals into physical vibration (that is, sound). One example of the sound emitting unit 64 is an ordinary speaker that transmits sound to the wearer through air vibration. In that case, as described above, the sound emitting unit 64 is preferably provided on the outer side of the main body portion 30 (the side away from the wearer) and configured to emit sound either in the direction away from the back of the wearer's neck (horizontally rearward) or along it (vertically upward). The sound emitting unit 64 may instead be a bone conduction speaker that transmits sound to the wearer by vibrating the wearer's bones; in that case, it may be provided on the inner side of the main body portion 30 (the wearer's side) so that the bone conduction speaker contacts the bone at the back of the wearer's neck (the cervical vertebrae).
 The battery 70 supplies power to the various electronic components of the neck-mounted device 100. A rechargeable storage battery is used as the battery 70; a known type such as a lithium-ion battery, lithium-polymer battery, alkaline storage battery, nickel-cadmium battery, nickel-metal hydride battery, or lead storage battery may be adopted.
 Next, with reference to FIGS. 4 and 5, an example application of the neck-mounted device 100 according to the present invention is described. As these figures show, the neck-mounted device 100 can be used as the worker terminal of a remote work support system. The remote work support system includes a worker terminal 100 (the neck-mounted device) and a supporter terminal 200 connected to each other via the Internet. The worker terminal 100 is worn by a worker performing tasks on site; the neck-mounted device according to the present invention, configured as described above, serves as the worker terminal 100. The supporter terminal 200, on the other hand, is operated by a supporter (an operator or the like) who provides support information to the on-site worker, allowing the supporter to send work instructions and the like from a remote location to the worker at the site. A general-purpose PC can be used as the supporter terminal 200; specifically, the supporter terminal 200 comprises a control device 210 including a processor and a communication module, a display device 220 such as a monitor, and an input device 230 such as a mouse and keyboard.
 FIG. 5 shows an example of the processing flow of the worker terminal 100 and the supporter terminal 200. As shown in FIG. 5, first, the imaging units 41 and 51 of the worker terminal 100 capture images of the area in front of the worker (the wearer) (step S1). Because the worker terminal 100 carries the imaging units 41 and 51 on the left and right arm portions 10 and 20, respectively, these two imaging units capture images over a wide horizontal range. Next, the worker terminal 100 processes the images captured by the two imaging units 41 and 51 (step S2). This processing mainly integrates the two captured images into a single image; it may also correct lens distortion in each captured image and adjust contrast, brightness, sharpness, and color tone. The worker terminal 100 then transmits the processed image to the supporter terminal 200 via the Internet (step S3). In the example shown in FIG. 5, the image processing is performed on the worker terminal 100 side, but it may instead be performed on the supporter terminal 200 side; in that case, the worker terminal 100 sends the images captured by the imaging units 41 and 51 to the supporter terminal 200 unprocessed, and the supporter terminal 200 applies the processing described above to the captured images it receives.
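One way to picture the messages exchanged in steps S3 and S6 is sketched below. The patent does not define a wire format; the JSON shapes and field names here are purely illustrative:

```python
import json

def build_frame_message(worker_id, image_b64):
    """Wrap one processed frame for transmission to the supporter
    terminal (step S3). Hypothetical message shape."""
    return json.dumps({"type": "frame", "worker": worker_id,
                       "image": image_b64})

def parse_support_message(raw):
    """Decode support information received from the supporter terminal
    (step S6): either a designated coordinate on the displayed image or
    a text message for the worker. Hypothetical message shape."""
    msg = json.loads(raw)
    if msg["type"] == "point":
        return ("point", (msg["x"], msg["y"]))
    if msg["type"] == "text":
        return ("text", msg["body"])
    raise ValueError("unknown support message type: " + str(msg["type"]))
```

A "point" message would drive the laser pointer toward a location in the imaging range, while a "text" message would be reproduced by scanning, matching the two kinds of support information described above.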
 Next, the supporter terminal 200 displays the processed image received from the worker terminal 100 on its display (step S4). When the worker terminal 100 is capturing moving images with the imaging units 41 and 51, moving images are likewise shown on the supporter terminal 200's display. The supporter terminal 200 then determines whether support information for the worker has been input via the input device 230, such as a mouse or keyboard (step S5). One example of support information is the designation of coordinates on the displayed image; as shown in FIG. 4, predetermined coordinates can be designated on the displayed image with a mouse pointer. Another example is the input of a message to the worker (characters, numbers, symbols, figures, and the like); as shown in FIG. 4, such information can be entered with a keyboard or the like. When such support information is input, the supporter terminal 200 transmits it to the worker terminal 100 (step S6); when no support information is input, steps S1 through S4 are repeated.
 Next, the worker terminal 100 outputs the support information received from the supporter terminal 200 by controlling the light projecting units 42 and 52 and the drive units 43 and 53 (step S7). Specifically, the control unit 60 of the worker terminal 100 generates control signals for the light projecting units 42 and 52 and the drive units 43 and 53 based on the support information received from the supporter terminal 200. The control signals for the light projecting units 42 and 52 include information for reproducing the message to the worker, while the control signals for the drive units 43 and 53 include information for controlling the direction in which the light projecting units 42 and 52 irradiate light. The control unit 60 of the worker terminal 100 supplies these control signals to the light projecting units 42 and 52 and the drive units 43 and 53 so that, for example, as shown in FIG. 4, laser light is directed at a specific point within the imaging range, or a specific message is projected onto an object within the imaging range. As shown in FIG. 4, the light projecting units 42 and 52 of the left and right arm portions 10 and 20 can also emit light carrying different information: in the example of FIG. 4, the first light projecting unit 42 of the left arm portion 10 emits laser light that points at a specific location within the imaging range, while the second light projecting unit 52 of the right arm portion 20 reproduces specific character information by scanning its laser light. Providing two or more light projecting units 42 and 52 in this way makes it possible to output two or more pieces of information simultaneously. The support information that the supporter enters at the supporter terminal 200 can thus be conveyed to the worker through the light emitted by the light projecting units 42 and 52 of the worker terminal 100.
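Turning a coordinate designated on the supporter's display (step S5) into a control signal for a drive unit (step S7) amounts to mapping an image point to pan/tilt angles. The pinhole model, the assumption that the projector axis is aligned with the camera axis, and the fields of view below are all illustrative assumptions:

```python
def image_point_to_pan_tilt(px, py, width, height,
                            hfov_deg=60.0, vfov_deg=40.0):
    """Convert a coordinate picked on the displayed image into pan/tilt
    angles for a drive unit, assuming the projector shares the camera's
    axis and a linear angle-per-pixel model (placeholder fields of view).
    """
    # Normalized offsets from the image center, each in [-0.5, 0.5].
    nx = px / width - 0.5
    ny = 0.5 - py / height  # image y grows downward; tilt grows upward
    pan = nx * hfov_deg
    tilt = ny * vfov_deg
    return pan, tilt
```

The image center maps to zero deflection, and the right edge of a 60-degree field maps to a 30-degree pan, which the drive unit's turning mechanism would then realize.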
 The worker terminal 100 and the supporter terminal 200 may also each have a call function. The worker terminal 100 acquires the worker's voice from the sound collecting units 45, 55 and transmits it to the supporter terminal 200, and outputs the supporter's voice from the sound emitting unit 64. The supporter terminal 200 acquires the supporter's voice using, for example, a headset (not shown), transmits it to the worker terminal 100, and outputs the worker's voice from that headset. This allows visual support using the light emitted from the light projecting units 42, 52 and auditory support through conversation between the supporter and the worker to be provided at the same time.
 The embodiments of the present invention have been described above with reference to the drawings in order to explain the content of the invention. The present invention, however, is not limited to these embodiments and encompasses modifications and improvements that would be obvious to those skilled in the art based on the matters described in this specification.
10… Left arm              11… Outward surface
12… Inward surface        13… Flexible portion
20… Right arm             21… Outward surface
22… Inward surface        23… Flexible portion
30… Main body             31… Hanging portion
41… First imaging unit    42… First light projecting unit
43… First drive unit      44… First lighting unit
45… First sound collecting unit    46… Operation unit
51… Second imaging unit   52… Second light projecting unit
53… Second drive unit     54… Second lighting unit
55… Second sound collecting unit   60… Control unit
61… Storage unit          62… Communication unit
63… Proximity sensor      64… Sound emitting unit
70… Battery               100… Neck-mounted device (worker terminal)
200… Supporter terminal   210… Control device
220… Display device       230… Input device

Claims (5)

  1.  A neck-mounted device worn around the neck of a user, comprising:
     a first arm and a second arm that can be arranged at positions flanking the neck;
     imaging units provided on the first arm and the second arm and capable of imaging the front side of the user; and
     light projecting units provided on the first arm and the second arm and capable of irradiating light within the imaging ranges of the imaging units,
     wherein the light projecting unit of the first arm can irradiate light within the imaging range of the imaging unit of the second arm, and
     the light projecting unit of the second arm can irradiate light within the imaging range of the imaging unit of the first arm.
  2.  The neck-mounted device according to claim 1, wherein the neck-mounted device is communicably connected to a supporter device in a remote support system that remotely supports the work of an on-site worker.
  3.  The neck-mounted device according to claim 1, wherein, in plan view, the imaging directions of the imaging units of the first arm and the second arm are inclined outward, and
     the light projection directions of the light projecting units of the first arm and the second arm are inclined inward.
  4.  The neck-mounted device according to claim 1, further comprising:
     a control unit that controls the light projecting units; and
     one or more sound collecting units provided on the first arm or the second arm,
     wherein the control unit controls the light projecting units, based on information about a sound acquired by the sound collecting units, so as to indicate by light the direction from which the sound was emitted.
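As an informal aside (not part of the claims), the sound-direction feature of claim 4 could be realized by estimating the arrival-time difference of a sound between the sound collecting units on the two arms and aiming a light projecting unit at the resulting azimuth. The far-field model, mic spacing, and all names below are assumptions for illustration only.

```python
import math

# Hypothetical sketch: far-field direction-of-arrival from the delay
# between the left-arm and right-arm sound collectors, used to aim a
# projector at the sound source. Geometry values are assumptions.

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degC
MIC_SPACING = 0.15       # assumed distance between the two mics, m

def sound_azimuth_deg(delay_s: float) -> float:
    """Azimuth from the inter-mic delay: sin(theta) = c * dt / d.
    0 deg = straight ahead; sign follows the sign of the delay."""
    s = SPEED_OF_SOUND * delay_s / MIC_SPACING
    s = max(-1.0, min(1.0, s))       # clamp against measurement noise
    return math.degrees(math.asin(s))

def point_projector_at_sound(delay_s: float) -> dict:
    """Control signal aiming the beam horizontally at the sound."""
    return {"pan_deg": sound_azimuth_deg(delay_s), "tilt_deg": 0.0}

# A sound arriving 0.2 ms earlier at one mic is well off-axis:
cmd = point_projector_at_sound(2.0e-4)
```

A real device would first have to estimate the delay itself (e.g. by cross-correlating the two microphone signals), which this sketch takes as given.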
  5.  A remote work support system in which a neck-mounted device worn around the neck of a user and a supporter device operated by a supporter are connected to each other via an information communication line, wherein
     the neck-mounted device comprises:
      a first arm and a second arm that can be arranged at positions flanking the neck;
      imaging units provided on the first arm and the second arm and capable of imaging the front side of the user;
      light projecting units provided on the first arm and the second arm and capable of irradiating light within the imaging ranges of the imaging units; and
      a control unit that controls the light projecting units,
      the light projecting unit of the first arm being capable of irradiating light within the imaging range of the imaging unit of the second arm, and
      the light projecting unit of the second arm being capable of irradiating light within the imaging range of the imaging unit of the first arm;
     the supporter device comprises:
      a display unit capable of displaying a captured image acquired by the imaging units of the neck-mounted device; and
      an operation unit capable of inputting a designation of coordinates on the display screen of the display unit;
     the supporter device transmits, when coordinates in the captured image displayed on the display screen are designated via the operation unit, the coordinate information in the captured image to the neck-mounted device; and
     the control unit of the neck-mounted device controls, based on the coordinate information, the irradiation direction of the light emitted from the light projecting units.
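The coordinate-based control recited in claim 5 (the supporter designates a pixel in the captured image; the control unit steers the beam toward the corresponding point in the scene) can be sketched, purely as an illustration, with a pinhole-camera model that converts the designated pixel into pan/tilt angles for the drive unit. The intrinsic parameters, image size, and function names below are assumptions, not values from the specification.

```python
import math

# Hypothetical sketch: convert a pixel (u, v) designated on the
# supporter's display into pan/tilt angles for the drive unit,
# using a pinhole model of the imaging unit. All intrinsics are
# assumed values for a 1280x720 image.

FX, FY = 800.0, 800.0    # assumed focal lengths, in pixels
CX, CY = 640.0, 360.0    # assumed principal point (image centre)

def pixel_to_pan_tilt(u: float, v: float) -> tuple[float, float]:
    """Pan/tilt (degrees) of the ray through pixel (u, v).
    Positive pan = right of centre; positive tilt = above centre
    (image y grows downward, hence CY - v)."""
    pan = math.degrees(math.atan2(u - CX, FX))
    tilt = math.degrees(math.atan2(CY - v, FY))
    return pan, tilt

# The image centre maps to (0, 0): the beam fires along the camera axis.
centre = pixel_to_pan_tilt(640.0, 360.0)
```

In a real device the projector is mounted on the opposite arm from the camera whose image is being annotated, so a fixed extrinsic calibration between the two (and some handling of parallax at close range) would also be needed; the sketch above ignores that offset.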
PCT/JP2022/002803 2021-02-04 2022-01-26 Neck-hanging device and remote work assisting system WO2022168689A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-016986 2021-02-04
JP2021016986A JP7023022B1 (en) 2021-02-04 2021-02-04 Neck-mounted device and remote work support system

Publications (1)

Publication Number Publication Date
WO2022168689A1

Family

ID=81076710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/002803 WO2022168689A1 (en) 2021-02-04 2022-01-26 Neck-hanging device and remote work assisting system

Country Status (2)

Country Link
JP (1) JP7023022B1 (en)
WO (1) WO2022168689A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016186786A (en) * 2015-01-21 2016-10-27 トヨタ モーター エンジニアリング アンド マニュファクチャリング ノース アメリカ,インコーポレイティド Wearable smart device for hazard detection and warning based on image and audio data
JP2017503198A (en) * 2013-12-20 2017-01-26 クアルコム,インコーポレイテッド Trim content to project onto a target
JP2020150400A (en) * 2019-03-13 2020-09-17 Necプラットフォームズ株式会社 Wearable device and control method


Also Published As

Publication number Publication date
JP2022119684A (en) 2022-08-17
JP7023022B1 (en) 2022-02-21

Similar Documents

Publication Publication Date Title
US11245843B2 (en) Imaging apparatus and imaging method for improvement of reproduction image quality
KR102200740B1 (en) Information processing apparatus, information processing method, program, and measuring system
RU2002133232A (en) METHOD OF VISUAL DISPLAY BY MEANS OF THE PROJECTOR OF ELECTRONIC INFORMATION ISSUED BY THE ELECTRONIC DEVICE, PROJECTOR AND CLOTHING
JP6011072B2 (en) Control device and program
US11245849B2 (en) Information processing apparatus and information processing method
US20120212647A1 (en) Portable photographing device
US9843727B2 (en) Image capturing apparatus and image capturing method
JP2019164420A (en) Transmission type head-mounted display device, control method of transmission type head-mounted display device, and computer program for control of transmission type head-mounted display device
JP2014021707A (en) Information input/output device and information input/output method
JP6136090B2 (en) Electronic device and display device
JP2017142857A (en) Input device
JP2010085472A (en) Image projection/imaging apparatus
US20200068098A1 (en) Shooting apparatus
US20160063290A1 (en) Portable information code reader
WO2022168689A1 (en) Neck-hanging device and remote work assisting system
JPWO2013161250A1 (en) Strobe device and imaging device including the same
JP2017146726A (en) Movement support device and movement support method
US20220167083A1 (en) Signal processing apparatus, signal processing method, program, and directivity variable system
US20190281233A1 (en) Image processing device, setting method, and program
JP2013239767A (en) Head-mounted type information input/output device
JP2009229509A (en) Optical device and optical system
JP2013174730A (en) Information display device
JP2014022942A (en) Head-mounted device
JP6733401B2 (en) Display system, display device, information display method, and program
JP2015159460A (en) Projection system, projection device, photographing device, method for generating guide frame, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22749563

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22749563

Country of ref document: EP

Kind code of ref document: A1