WO2016157923A1 - Information processing device and information processing method - Google Patents

Information processing device and information processing method

Info

Publication number
WO2016157923A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
information processing
optical system
processing apparatus
captured image
Prior art date
Application number
PCT/JP2016/050293
Other languages
French (fr)
Japanese (ja)
Inventor
Yuji Ando
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2016157923A1 publication Critical patent/WO2016157923A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/16Housings; Caps; Mountings; Supports, e.g. with counterweight
    • G02B23/18Housings; Caps; Mountings; Supports, e.g. with counterweight for binocular arrangements
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/02Viewfinders
    • G03B13/06Viewfinders with lenses with or without reflectors
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B19/00Cameras
    • G03B19/02Still-picture cameras
    • G03B19/04Roll-film cameras
    • G03B19/07Roll-film cameras having more than one objective
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • The present disclosure relates to an information processing apparatus and an information processing method.
  • In an optical system apparatus, the subject may be magnified and projected at a high magnification. As the magnification increases, the subject moves greatly even if the orientation of the apparatus changes only slightly, so the subject easily goes out of sight. Once the subject is lost, the surroundings cannot be seen, and it is difficult to capture the subject in sight again.
  • Although this difficulty can be reduced by increasing the apparent viewing angle, widening the apparent viewing angle has been technically difficult and costly. Therefore, as another approach to reducing this difficulty more easily, techniques for projecting the surrounding state in addition to the state of the observation target have been developed.
  • Patent Document 1 discloses a technique in which an electron microscope is provided with a peripheral camera that captures the periphery of the area captured by a main camera, and the video obtained by the main camera is combined with the video obtained by the peripheral camera.
  • Patent Document 2 discloses a technique in which a part of the image projected on the effective imaging region of the image sensor is extracted as a first image, the periphery of the first image is extracted as a second image and stored as a captured image, and the second image is displayed as a through image together with the first image.
  • However, in the technique disclosed in Patent Document 1, the image of the main camera to be combined is displayed relatively small as the magnification of the main camera increases. In the technique disclosed in Patent Document 2, the main first image becomes narrower because the second image must be extracted.
  • Accordingly, the present disclosure proposes a new and improved information processing apparatus and information processing method capable of pseudo-expanding the apparent viewing angle.
  • According to the present disclosure, there is provided an information processing apparatus including an irradiation control unit that controls an irradiation process so that a captured image obtained by a second acquisition unit, which has been processed based on the relationship between the angles of view of the subject images obtained by first and second acquisition units, is irradiated onto an eyepiece that is irradiated with at least a part of the subject image obtained by the first acquisition unit.
  • According to the present disclosure, there is also provided an information processing method executed by a processor, including controlling an irradiation process so that a captured image obtained by a second acquisition unit, which has been processed based on the relationship between the angles of view of the subject images obtained by the first and second acquisition units, is irradiated onto an eyepiece that is irradiated with at least a part of the subject image obtained by the first acquisition unit.
  • Further, according to the present disclosure, there is provided an information processing apparatus including a first acquisition unit, a second acquisition unit, an eyepiece that is irradiated with at least a part of the subject image obtained by the first acquisition unit, and an irradiation control unit that controls an irradiation process so that the captured image obtained by the second acquisition unit, processed based on the relationship between the angles of view of the subject images obtained by the first and second acquisition units, is irradiated onto the eyepiece.
  • As described above, according to the present disclosure, the apparent viewing angle can be pseudo-expanded.
  • Note that the above effect is not necessarily limited; together with or in place of the above effect, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
  • FIG. 1 is a diagram for explaining an outline of the optical system apparatus according to the present embodiment. In this specification, the description assumes that an area 10 in real space as shown in FIG. 1 is observed.
  • FIGS. 2 and 3 are diagrams showing an example of an image seen from the eyepiece of the optical system device.
  • FIG. 2 shows the image of the region 10 seen from the eyepiece of a common telescope or binoculars. The range in which the image can be seen corresponds to the apparent viewing angle, and the outside of that range is dark (black).
  • In the image shown in FIG. 3, by contrast, the visible range substantially matches the actual range of the photograph.
  • Here, the apparent viewing angle is a viewing angle representing the spread of the image that can be seen when looking through the eyepiece. On the other hand, the viewing angle representing the range of the object included in the video is also referred to as the actual viewing angle. The actual viewing angle has the same meaning as the angle of view indicating the imaging range of a camera.
  • FIG. 4 is a diagram for explaining a human viewing angle.
  • The range indicated by symbol A in FIG. 4 indicates the discrimination visual field, a range in which highly accurate information can be received.
  • The range indicated by symbol B indicates the effective visual field, a range in which information can be received instantaneously with the aid of eye movement alone.
  • The range indicated by symbol C indicates the guidance visual field, a range in which the presence of information can be determined.
  • The range indicated by symbol D indicates the auxiliary visual field, an auxiliary range that induces a gaze movement toward a strong stimulus.
  • The discrimination visual field and the effective visual field together can be referred to as the central visual field, and the guidance visual field and the auxiliary visual field as the peripheral visual field.
  • The information processing apparatus according to an embodiment of the present disclosure has been created in view of the above circumstances. The information processing apparatus according to the present embodiment can pseudo-expand the apparent viewing angle. The information processing apparatus according to this embodiment will be described in detail below with reference to the drawings.
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of the binoculars 1 according to the present embodiment.
  • The binoculars 1 include a sub optical system camera block 100, an image processing apparatus 200, a main optical system camera block 300, a display block 400, a beam splitter 500, an objective lens 600, a prism 700, an eyepiece lens 800, and a switch 900.
  • The binoculars 1 include a left-right pair each of the main optical system camera block 300, the display block 400, the beam splitter 500, the objective lens 600, the prism 700, and the eyepiece lens 800; in FIG. 5, only one side is shown and the other is omitted.
  • Although the lens 110, the lens 310, and the lens 410 are illustrated as single lenses, each may be designed as an appropriate optical system suited to the application.
  • the light incident through the objective lens 600 is applied to the beam splitter 500.
  • The beam splitter 500 transmits part of the incident light and reflects the rest; for example, most of the incident light travels straight through, and part is reflected toward the main optical system camera block 300.
  • The light incident on the main optical system camera block 300 passes through the lens 310, forms an image on the imager 320, and is captured by the main optical system camera block 300.
  • the captured video signal (captured image) is input to the image processing apparatus 200.
  • Hereinafter, the optical system related to the light incident through the objective lens 600, such as the main optical system camera block 300, is also referred to as the main optical system, and the optical system related to the light incident through the lens 110, such as the sub optical system camera block 100, is also referred to as the sub optical system.
  • the light incident through the lens 110 forms an image on the imager 120 and is picked up by the sub optical system camera block 100.
  • The sub optical system camera block 100 is adjusted so that its optical axis is positioned between the left and right main optical systems, and can capture, at its center, the same range as the images captured by the main optical systems. The sub optical system camera block 100 can capture an image with a wider angle of view than the main optical system.
  • the captured video signal (captured image) is input to the image processing apparatus 200.
  • the image processing device 200 functions as an arithmetic processing device and a control device, and controls the overall operation in the binoculars 1 according to various programs.
  • the image processing apparatus 200 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
  • the image processing apparatus 200 may include a ROM (Read Only Memory) that stores programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
  • The image processing apparatus 200 functions as an information processing apparatus that performs image processing on input video signals. For example, the image processing apparatus 200 performs, on the video signal input from the sub optical system camera block 100, image processing based on the video signal input from the main optical system camera block 300, and outputs the processed video signal to the display block 400.
  • the display block 400 displays the input video signal on an LCD (liquid crystal display) 420.
  • The image displayed on the LCD 420 is adjusted by the lens 410 and irradiated onto the back side of the beam splitter 500. The image irradiated onto the back side is reflected by the beam splitter 500 and, together with the light transmitted through the beam splitter 500, enters the user's eye through the prism 700 and the eyepiece lens 800.
  • In this way, the eyepiece 800, which is irradiated with at least a part of the subject image obtained by the main optical system (first acquisition unit), is also irradiated with the captured image obtained by the sub optical system (second acquisition unit) and processed by the image processing apparatus 200. In other words, an image combining the image of the main optical system and the image of the sub optical system is formed at the eyepiece 800.
  • FIG. 6 is a block diagram illustrating an example of a logical configuration of the image processing apparatus according to the present embodiment.
  • As shown in FIG. 6, the image processing apparatus 200 includes an irradiation control unit 210 and an operation mode control unit 220, and is connected to the main optical system camera block 300, the sub optical system camera block 100, the display block 400, and the storage unit 230.
  • the irradiation control unit 210 has a function of performing image processing based on video signals input from the sub optical system camera block 100 and the main optical system camera block 300. For example, the irradiation control unit 210 performs image processing on the captured image input from the sub optical system camera block 100 based on the captured image input from the main optical system camera block 300. Then, the irradiation control unit 210 outputs the captured image after image processing to the display block 400, and controls the irradiation process in which the display block 400 irradiates the eyepiece 800 with the captured image.
  • the operation mode control unit 220 has a function of controlling the operation mode of the irradiation control unit 210.
  • the storage unit 230 is a part that records and reproduces data with respect to a predetermined recording medium.
  • the storage unit 230 stores captured images captured by the sub optical system camera block 100 and the main optical system camera block 300.
  • the storage unit 230 may be built in the binoculars 1 or may be formed as a removable medium such as a memory card.
  • The binoculars 1 (for example, the irradiation control unit 210) according to the present embodiment have a function of pseudo-expanding the apparent viewing angle of the main optical system.
  • Specifically, the irradiation control unit 210 irradiates a second region, at the outer edge of the first region irradiated with the subject image obtained by the main optical system, with the captured image obtained by the sub optical system, thereby pseudo-expanding the apparent viewing angle.
  • At this time, processing based on the relationship between the angles of view of the subject images obtained by the main optical system and the sub optical system is performed on the captured image obtained by the sub optical system, whereby the apparent viewing angle is effectively expanded.
  • Hereinafter, the processing based on the relationship between the angles of view will be described with reference to FIGS. 7 to 11.
  • FIG. 7 is a diagram for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment.
  • FIG. 7 shows an example of an image irradiated onto the eyepiece lens 800.
  • The region denoted by reference numeral 21 is the first region, in which the image obtained by the main optical system (in other words, the light incident through the objective lens 600 and transmitted through the beam splitter 500) appears. The region denoted by reference numeral 22 is the second region, in which the image obtained by the sub optical system (in other words, the light displayed by the display block 400 and reflected by the beam splitter 500) appears.
  • The second region may correspond to the human peripheral visual field. Note that the boundary line between the first region and the second region may or may not be explicitly displayed.
  • While the apparent viewing angle of the original main optical system corresponds to the first region, the second region is located at the outer edge of the first region, so the apparent viewing angle of the main optical system is pseudo-expanded to the second region.
  • In the example shown in FIG. 7, the magnifications of the first-region video and the second-region video are the same. That is, the captured image obtained by the sub optical system is processed so that the subject image included in it is represented at the same magnification as the subject image irradiated onto the first region; more precisely, so that the angle of view at the boundary between the first region and the second region matches between the main optical system and the sub optical system. As a result, the subject image shown in the first region and the subject image shown in the second region match, and the two videos are seamlessly connected.
  • For example, the irradiation control unit 210 enlarges or reduces the captured image obtained by the sub optical system so that the magnification of the sub optical system matches that of the main optical system. The irradiation control unit 210 then generates, from the enlarged or reduced captured image, an image in which the portion corresponding to the first region is blacked out (in other words, a masked image), and irradiates the eyepiece 800 with it. In the masked portion, nothing is reflected by the beam splitter 500, so the image of the main optical system transmitted through the beam splitter 500 appears to fit into the masked portion.
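The enlarge/reduce-and-mask step described above can be sketched as follows. This is a minimal illustration using NumPy; the function name, the nearest-neighbor scaling, and the centered geometry are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def make_mask_image(sub_image, scale, inner_h, inner_w):
    """Scale the sub-optical-system frame about its center so its magnification
    matches the main optical system, then black out the central portion
    corresponding to the first region (the "masked image" of the text)."""
    h, w = sub_image.shape[:2]
    cy, cx = h // 2, w // 2
    # Nearest-neighbor enlarge/reduce about the image center.
    ys = np.clip(((np.arange(h) - cy) / scale + cy).astype(int), 0, h - 1)
    xs = np.clip(((np.arange(w) - cx) / scale + cx).astype(int), 0, w - 1)
    scaled = sub_image[ys][:, xs]
    # Mask the first region: nothing is reflected there by the beam splitter,
    # so the optically transmitted main image appears to fill the hole.
    scaled[cy - inner_h // 2:cy + inner_h // 2,
           cx - inner_w // 2:cx + inner_w // 2] = 0
    return scaled

frame = np.arange(1, 65).reshape(8, 8)
masked = make_mask_image(frame, 1.0, 4, 4)
```

In an actual device the scaling would be done with proper interpolation and the mask geometry matched to the optics, but the structure — scale to match magnification, then black out the first region — is the same.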
  • FIG. 8 is a diagram for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment.
  • FIG. 8 shows an example of an image irradiated onto the eyepiece lens 800. The region indicated by reference numeral 21 is the first region, and the region indicated by reference numeral 22 is the second region.
  • In the example shown in FIG. 8, the magnification of the image in the second region is lower than that of the image in the first region. That is, the captured image obtained by the sub optical system is processed to represent the subject image at a lower magnification than the subject image irradiated onto the first region. Thereby, a video with a wider actual viewing angle than in the example shown in FIG. 7 is provided to the user.
  • FIG. 9 is a diagram for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment.
  • FIG. 9 shows an example of an image irradiated onto the eyepiece 800. The region indicated by reference numeral 21 is the first region, and the region indicated by reference numeral 22 is the second region.
  • In the example shown in FIG. 9, the magnification of the image in the second region gradually decreases from the center side toward the outside. That is, the captured image obtained by the sub optical system is processed so that the angle of view increases non-uniformly from the boundary between the first region and the second region toward the outside. Thereby, a video with a wider actual viewing angle than in the example shown in FIG. 7 is provided to the user.
  • FIGS. 10 and 11 are diagrams for explaining examples of image processing by the irradiation control unit 210 according to the present embodiment, showing images irradiated onto the eyepiece 800. The region indicated by reference numeral 21 is the first region, and the region indicated by reference numeral 22 is the second region.
  • FIG. 10 shows an example in which the angle of view increases non-uniformly in the second region.
  • In the example of FIG. 10, the irradiation control unit 210 increases the rate at which the angle of view of the sub-optical-system image grows as it approaches the outside of the image. Specifically, the irradiation control unit 210 makes the interval 24, over which the angle of view increases from 30 degrees to 40 degrees, shorter than the interval 23, over which it increases from 20 degrees to 30 degrees, and makes the interval 25, over which it increases from 40 degrees to 50 degrees, shorter than the interval 24. Conversely to the example shown in FIG. 10, the irradiation control unit 210 may decrease the rate of increase of the angle of view of the sub-optical-system image toward the outside of the image.
  • Such a geometric transformation of the image allows the binoculars 1 to present, compressed within the apparent viewing angle, a seamless continuation outward from the image of the main optical system. Since the apparent viewing angle is pseudo-expanded, the subject is less likely to be lost by going out of the viewing angle, and a subject that suddenly enters the field of view can be quickly detected. Although distortion increases and image quality degrades toward the outside of the image, the image quality of the main-optical-system image is maintained, so the user can comfortably observe the subject.
  • FIG. 11 shows an example in which the angle of view increases at a uniform rate.
  • In the example of FIG. 11, the irradiation control unit 210 increases the angle of view of the sub-optical-system image evenly toward the outside of the image. Specifically, the irradiation control unit 210 makes the interval 26, over which the angle of view increases from 20 degrees to 25 degrees, and the interval 27, over which it increases from 25 degrees to 30 degrees, the same length.
  • the angle of view values shown in FIGS. 10 and 11 are examples.
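The non-uniform compression of FIG. 10 can be illustrated with a simple mapping from field angle to normalized display radius in which equal angle steps occupy progressively shorter radial intervals toward the periphery. The logarithmic form and the constant k below are illustrative assumptions; the patent does not specify a formula.

```python
import math

def display_radius(angle_deg, max_angle_deg=50.0, k=0.03):
    """Map a field angle to a normalized display radius in [0, 1].
    The logarithm compresses large angles, so the radial interval occupied
    by 30-40 degrees is shorter than that occupied by 20-30 degrees."""
    return math.log1p(k * angle_deg) / math.log1p(k * max_angle_deg)

# Radial intervals for equal 10-degree steps shrink outward, mirroring the
# relationship interval 23 > interval 24 > interval 25 in FIG. 10.
interval_23 = display_radius(30) - display_radius(20)
interval_24 = display_radius(40) - display_radius(30)
interval_25 = display_radius(50) - display_radius(40)
assert interval_23 > interval_24 > interval_25
```

A decreasing rate of increase (the converse variant mentioned in the text) would correspond to a convex mapping, e.g. a power greater than one, instead of the logarithm.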
  • The magnification of the image in the second region may change as the magnification of the image in the first region changes. That is, the captured image obtained by the sub optical system may be processed so that the magnification relationship between the subject image irradiated onto the first region and the subject image included in the captured image is maintained. For example, the irradiation control unit 210 enlarges the image of the sub optical system when the image obtained by the main optical system is enlarged, and reduces the image of the sub optical system when the image obtained by the main optical system is reduced. Thereby, the angle of view of the video in the second region automatically changes in accordance with changes in the angle of view of the video in the first region, and consistency between the video in the first region and the video in the second region is maintained.
  • When the main optical system has an optical zoom function, the same processing may be performed so as to maintain the magnification relationship. However, when the zoom ratio of the main optical system is large, the image in the second region may become excessively coarse if the sub-optical-system image is simply enlarged by image processing; it is therefore desirable that the sub optical system also have an optical zoom function.
  • When the difference between the angle of view of the main optical system (the angle of view at the boundary portion) and the angle of view of the sub optical system (the angle of view at the outermost periphery) is excessively large, the distortion of the image may become excessively large and difficult to see, and an uncomfortable video may be provided to the user. Therefore, the irradiation control unit 210 may adjust the angle of view of the sub optical system according to a user operation, or may automatically adjust the angle of view of the sub optical system by image processing, based on the zoom ratio of the main optical system, so that the difference does not become excessively large.
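One possible form of the automatic adjustment based on the main system's zoom ratio is a simple clamp, sketched below. The base field angle of 50 degrees, the maximum ratio, and the function name are purely illustrative assumptions.

```python
def adjusted_sub_angle(main_zoom, requested_sub_deg,
                       max_ratio=2.5, base_main_deg=50.0):
    """Clamp the sub optical system's field angle so that it never exceeds
    max_ratio times the main optical system's current field angle, keeping
    the angle-of-view difference from becoming excessively large."""
    main_deg = base_main_deg / main_zoom   # main field angle shrinks with zoom
    return min(requested_sub_deg, main_deg * max_ratio)

# Low zoom: the requested wide angle is honored.
print(adjusted_sub_angle(1.0, 60.0))   # 60.0
# High zoom: the sub angle is clamped to 2.5x the main angle (10 deg * 2.5).
print(adjusted_sub_angle(5.0, 60.0))   # 25.0
```

A user-operated adjustment would simply replace `requested_sub_deg` with the operated value before the same clamp is applied.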
  • The irradiation control unit 210 may also control the sizes of the first region and the second region. For example, the irradiation control unit 210 may make the first region smaller (and the second region larger) while the user is searching for the target subject, and may make the first region larger while the user is observing the target subject.
  • The binoculars 1 (for example, the operation mode control unit 220) according to the present embodiment have a function of switching the operation mode of the processing that pseudo-expands the apparent viewing angle of the main optical system.
  • Specifically, the operation mode control unit 220 switches the content of the irradiation process controlled by the irradiation control unit 210: for example, which of the processes shown in FIGS. 7 to 11 is executed, or the sizes of the first region and the second region.
  • the operation mode control unit 220 may perform switching according to a user operation.
  • the operation mode control unit 220 can control the operation mode in accordance with the switching operation to the switch 900.
  • The operation mode control unit 220 may also perform switching according to the user's state.
  • For example, the binoculars 1 may include sensors such as an acceleration sensor and a gyro sensor, and the operation mode control unit 220 may control the operation mode according to the user's motion state detected by the sensors. For example, when it is detected that the user continues to move while an operation mode performing the process shown in FIG. 7 or FIG. 9 is active, the operation mode control unit 220 may automatically switch the operation mode. Thereby, motion sickness and discomfort of the user can be prevented or alleviated.
  • The binoculars 1 (for example, the irradiation control unit 210) according to the present embodiment have a function of providing an image for comparing the present with the past.
  • Specifically, the irradiation control unit 210 combines, with the image obtained by the main optical system, an image captured in the past by the main optical system or the sub optical system and stored in the storage unit 230. The video stored in the storage unit 230 may be a video captured by the binoculars 1, or a video captured by another device and input to the binoculars 1.
  • A specific description will be given with reference to FIGS. 12 and 13, which are diagrams for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment.
  • FIG. 12 shows an image obtained by the main optical system.
  • FIG. 13 shows an image obtained by combining a past image 32 with the image 31 obtained by the main optical system. The irradiation control unit 210 may combine the past image over the entire surface of the image obtained by the main optical system, or over only a part, for example, half, as shown in FIG. 13.
  • FIG. 14 is a flowchart illustrating an example of a flow of processing executed in the binoculars 1 according to the present embodiment.
  • First, the binoculars 1 obtain a captured image with the sub optical system (step S102). More specifically, the image processing apparatus 200 acquires the captured image captured by the sub optical system camera block 100.
  • Next, the irradiation control unit 210 enlarges or reduces the captured image (step S104), according to the operation mode set by the operation mode control unit 220. For example, the irradiation control unit 210 may enlarge or reduce the captured image so that the magnification of the sub optical system matches that of the main optical system, so that it becomes lower than that of the main optical system, or so that the magnification gradually decreases toward the outside.
  • Next, the irradiation control unit 210 masks the central portion (in other words, the portion corresponding to the first region) of the enlarged or reduced captured image (step S106). At this time, the irradiation control unit 210 may control the size of the masked region depending on whether the user is searching for the target subject or observing it.
  • Then, the irradiation control unit 210 outputs the resulting mask image to the display block 400 and causes the display block 400 to irradiate the eyepiece 800 with the mask image (step S108).
  • FIG. 15 is a diagram illustrating an example of a hardware configuration of the binoculars 1-1 according to the present modification.
  • The binoculars 1-1 shown in FIG. 15 have a configuration in which the sub optical system camera block 100 is omitted from the binoculars 1 shown in FIG. 5. The binoculars 1-1 also include a memory card 1000 (in other words, the storage unit 230 formed as a removable medium).
  • In this modification, the image processing apparatus 200 (for example, the irradiation control unit 210) stores the captured image captured by the main optical system camera block 300 in the memory card 1000. Then, the image processing apparatus 200 outputs a captured image read from the memory card 1000 to the display block 400, and controls the irradiation process in which the display block 400 irradiates the eyepiece 800 with the captured image. As a result, a past image of the main optical system captured by the main optical system camera block 300 is combined with the current image of the main optical system that has passed through the objective lens 600 and the beam splitter 500 and reached the eyepiece lens 800. At that time, as shown in FIG. 16, the irradiation control unit 210 may irradiate a past captured image onto a region outside the current field of view of the main optical system according to the movement of the binoculars 1-1 detected by a sensor.
  • FIG. 16 is a diagram for explaining an example of irradiation processing by the irradiation control unit 210 according to the present modification.
  • FIG. 16 shows an example of an image irradiated onto the eyepiece lens 800. The region denoted by reference numeral 41 is the region in which the current image obtained by the main optical system (in other words, the light incident through the objective lens 600 and transmitted through the beam splitter 500) appears. The region denoted by reference numeral 42 is the region in which a past video of the main optical system (in other words, a captured image captured by the main optical system camera block 300) appears.
  • The upper diagram of FIG. 16 shows an example of the image when the binoculars 1-1 are stationary, and the lower diagram shows an example of the image when the binoculars 1-1 are moved to the left.
  • the irradiation control unit 210 when the binocular 1-1 is stationary, the irradiation control unit 210 does not irradiate the past main optical system with an image. Therefore, as shown in the upper diagram of FIG. 16, only the current image of the main optical system is displayed on the eyepiece 800. When the binoculars 1-1 are moved, the part that has been visible until now is out of the field of view. Therefore, the irradiation control unit 210 irradiates a captured image corresponding to the region acquired from the memory card 1000 to a region out of the field of view of the subject image obtained by the main optical system.
  • Specifically, the irradiation control unit 210 irradiates the area 42 with the captured image of the part of the subject image that is included in the area 41 of the upper diagram of FIG. 16 but not in the area 41 of the lower diagram. Consequently, when the binoculars 1-1 move to the left, the past video extends to the right (into the peripheral visual field). In this way, when the binoculars 1-1 move up, down, left, or right, the apparent field of view widens.
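The bookkeeping for which stored strip to irradiate can be sketched as follows. This is only an illustrative model, not the implementation in the disclosure: it assumes the sensor reports the pan as a horizontal pixel displacement `dx` of the field of view, and the function name is hypothetical.

```python
def vacated_region(fov_width, fov_height, dx):
    """Return the rectangle (x, y, w, h), in view coordinates, that the
    current main-optical-system image no longer covers after a
    horizontal pan of dx pixels. A pan to the left (dx < 0) vacates a
    strip on the right edge, into which the stored past image is
    irradiated; a pan to the right vacates the left edge."""
    if dx < 0:
        return (fov_width + dx, 0, -dx, fov_height)
    if dx > 0:
        return (0, 0, dx, fov_height)
    return None  # stationary: only the current optical image is shown

# An 800x600 view panned 50 px to the left exposes a 50 px strip on
# the right, matching the lower diagram of FIG. 16.
print(vacated_region(800, 600, -50))  # (750, 0, 50, 600)
```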
  • FIG. 17 shows a configuration example when the present technology is applied to electronic binoculars.
  • FIG. 17 is a diagram illustrating an example of a hardware configuration of the binoculars 1-2 according to the present modification.
  • The binoculars 1-2 shown in FIG. 17 include a sub optical system camera block 100, an image processing apparatus 200, a main optical system camera block 300, an electronic viewfinder 1100, and a switch 900.
  • The light incident through the objective lens 311 forms an image on the imager 320 and is captured by the main optical system camera block 300. Similarly, the light incident through the lens 110 forms an image on the imager 120 and is captured by the sub optical system camera block 100.
  • The image processing apparatus 200 synthesizes the captured images captured by the main optical system camera block 300 and the sub optical system camera block 100, and outputs the synthesized image to the electronic viewfinder 1100.
  • The electronic viewfinder 1100 displays the image synthesized by the image processing apparatus 200 on the LCD 1120.
  • The image displayed on the LCD 1120 passes through the eyepiece 1110 and enters the user's eyes.
  • The binoculars 1-2 according to this modification perform the same processing as the binoculars 1 described above.
  • In the binoculars 1-2 according to this modification, each of the images to be synthesized is digitized data, so the ratio between the fields of view of the main optical system and the sub optical system (in other words, the ratio between the sizes of the first region and the second region) can be controlled more easily.
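Because both images are digital in this modification, the ratio between the first region (main optical system) and the second region (sub optical system) reduces to a single layout parameter. A minimal sketch follows, with an assumed centered-rectangle layout and a hypothetical function name:

```python
def region_layout(out_w, out_h, main_ratio):
    """Return the centered rectangle (x, y, w, h) of the first region
    (main optical system image) inside an out_w x out_h frame; the
    remaining border is the second region (sub optical system image).
    Changing main_ratio rescales the two regions in one step."""
    w, h = int(out_w * main_ratio), int(out_h * main_ratio)
    return ((out_w - w) // 2, (out_h - h) // 2, w, h)

print(region_layout(1920, 1080, 0.75))  # (240, 135, 1440, 810)
```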
  • Note that the imaging device and the display device may be formed separately.
  • For example, an image may be output by an HMD (Head Mounted Display) instead of the electronic viewfinder 1100.
  • An endoscope is an example of a system in which an imaging device and a display device are formed separately.
  • A main camera that functions as the main optical system and a sub camera that functions as the sub optical system may be included in one endoscope.
  • Alternatively, images from a plurality of endoscopes having different angles of view and resolutions may each be handled as the image of the main optical system or the image of the sub optical system. In that case, the size of each endoscope can be kept small.
  • The endoscope system may detect the positional relationship of the plurality of endoscopes in real time, based on information from sensors provided in each endoscope or on feature amounts of the images, and combine the images based on the detection result.
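One way such per-frame alignment could be applied is sketched below. The offset estimation itself (from pose sensors or matched image features) is only named, not implemented; the sketch shows just the compositing step, and the function name and list-of-rows image model are assumptions.

```python
def combine_at_offset(base, overlay, dx, dy):
    """Composite `overlay` onto a copy of `base` at offset (dx, dy),
    clipping anything that falls outside the base image. The offset
    would be re-estimated in real time from each endoscope's sensors
    or from image feature matching."""
    out = [row[:] for row in base]
    for y, row in enumerate(overlay):
        for x, px in enumerate(row):
            ty, tx = y + dy, x + dx
            if 0 <= ty < len(out) and 0 <= tx < len(out[0]):
                out[ty][tx] = px
    return out

canvas = [[0] * 4 for _ in range(3)]
patch = [[9, 9], [9, 9]]
print(combine_at_offset(canvas, patch, 2, 1))
```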
  • The present technology can also be applied to information processing apparatuses other than optical system devices.
  • For example, the present technology can be applied to any information processing apparatus capable of outputting video, such as an HMD, a PC, a smartphone, or a car navigation apparatus.
  • FIG. 18 illustrates a configuration example when the present technology is applied to an HMD.
  • FIG. 18 is a diagram illustrating an example of a hardware configuration of the HMD 2 according to the present modification.
  • The HMD 2 includes a pair of left and right camera blocks 2100, a display element 2200, and an eyepiece lens 2300.
  • The light incident through the lens 2110 forms an image on the imager 2120 and is captured by the camera block 2100.
  • The HMD 2 includes an image processing apparatus (not shown), which synthesizes the captured images captured by the camera blocks 2100 and displays the combined image on the display element 2200.
  • The image displayed by the display element 2200 passes through the eyepiece lens 2300 and enters the user's eyes.
  • One of the pair of left and right camera blocks 2100 may function as the main optical system, and the other as the sub optical system.
  • The HMD 2 may generate, in a pseudo manner, each of the video corresponding to the main camera and the video corresponding to the sub camera by performing image processing such as enlargement and reduction on the captured images captured by the pair of left and right camera blocks 2100.
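Such simulated main/sub videos could be derived from one capture by cropping and rescaling, for example as below. This is a nearest-neighbour sketch under assumed function names, not the HMD 2's actual processing.

```python
def center_crop(img, cw, ch):
    """Cut a cw x ch window out of the middle of a list-of-rows image."""
    h, w = len(img), len(img[0])
    x0, y0 = (w - cw) // 2, (h - ch) // 2
    return [row[x0:x0 + cw] for row in img[y0:y0 + ch]]

def nn_resize(img, out_w, out_h):
    """Nearest-neighbour enlargement or reduction."""
    h, w = len(img), len(img[0])
    return [[img[y * h // out_h][x * w // out_w] for x in range(out_w)]
            for y in range(out_h)]

# Simulated "main camera" view: a 2x zoom made by cropping the central
# half of the wide capture and enlarging it back to full size.
wide = [[x + 10 * y for x in range(4)] for y in range(4)]
tele = nn_resize(center_crop(wide, 2, 2), 4, 4)
```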
  • When the HMD 2 is used, the user's field of view is completely covered by the display. For this reason, if the actual movement of the body does not match the video, there is a risk of video sickness and discomfort. Therefore, as shown in FIG. 19, the HMD 2 may synthesize an image of the user's external world captured by the camera blocks 2100 into the peripheral visual field. As a result, the actual movement of the body matches the image in the peripheral visual field, and the user's video sickness and discomfort can be suppressed.
  • FIG. 19 is a diagram showing an example of an image displayed by the HMD 2 according to this modification.
  • The area denoted by reference numeral 51 is the area in which the main video is shown.
  • For example, a video of the virtual world may be shown in the area indicated by reference numeral 51.
  • An image obtained by an endoscope may also be shown in the area indicated by reference numeral 51.
  • The area indicated by reference numeral 52 is the area in which an image of the user's outside world is shown at the same magnification.
  • In this example, of the two divided areas, the user's external image is shown at the same magnification in the outer area (the area indicated by reference numeral 52), while the main video is shown in the inner area (the area indicated by reference numeral 51).
  • Alternatively, the HMD 2 may project the user's external image at the same magnification in the outermost of three divided regions, and perform the above-described processing for extending the apparent viewing angle using the two inner regions.
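The two- and three-region screen splits described above can be expressed as concentric centered rectangles. A sketch with assumed names follows; the outermost region would carry the unit-magnification external image, and the inner regions the apparent-view extension.

```python
def nested_regions(w, h, ratios):
    """Concentric, centered rectangles (x, y, w, h) for a multi-region
    display, one per entry of `ratios` (fractions of the frame size,
    listed outermost first)."""
    rects = []
    for r in ratios:
        rw, rh = int(w * r), int(h * r)
        rects.append(((w - rw) // 2, (h - rh) // 2, rw, rh))
    return rects

# Three regions: outermost ring for the real-world image at the same
# magnification, two inner regions for the apparent-view extension.
print(nested_regions(1000, 800, [1.0, 0.8, 0.6]))
```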
  • One use case of the HMD 2 is viewing an omnidirectional (spherical) image prepared in advance.
  • The HMD 2 senses the movement of the user's head and moves the image according to that movement, thereby providing an experience as if the user were inside the virtual world looking around.
  • The HMD 2 may also perform the above-described processing for extending the apparent viewing angle when the user uses an optical system device, such as virtual binoculars or a virtual telescope, in the virtual world. Specifically, when zooming, the HMD 2 may perform the image processing shown as an example in FIGS. Further, the HMD 2 may suppress the user's motion sickness by synthesizing a virtual-world or real-world video at the same magnification at the outermost periphery.
  • As described above, the image processing apparatus 200 controls the irradiation process so that the captured image obtained by the sub optical system, which has been processed on the basis of the relationship between the angles of view of the subject images obtained by the main optical system and the sub optical system, is irradiated onto the eyepiece to which at least a part of the subject image obtained by the main optical system is irradiated. Since the apparent viewing angle of the main optical system is expanded in a pseudo manner, the optical system device can be made smaller, lighter, and cheaper than when the apparent viewing angle of the main optical system is physically expanded. In addition, since the image obtained by the sub optical system corresponds to the peripheral visual field, its image quality can be lowered, and the optical system device can be configured even more inexpensively.
  • Each device described in this specification may be realized as a single device, or some or all of them may be realized as separate devices.
  • For example, the image processing apparatus 200 may be provided in a device such as a server connected to the other components via a network or the like.
  • The series of processes performed by each device described in this specification may be realized using software, hardware, or a combination of software and hardware.
  • The programs constituting the software are stored in advance in storage media (non-transitory media) provided inside or outside the respective devices.
  • Each program is read into a RAM when executed by a computer, for example, and executed by a processor such as a CPU.
  • (1) An information processing apparatus comprising: an irradiation control unit that controls an irradiation process so that a captured image obtained by a second acquisition unit, which has been processed on the basis of the relationship between the angles of view of subject images obtained by first and second acquisition units, is irradiated onto an eyepiece to which at least a part of the subject image obtained by the first acquisition unit is irradiated.
  • (2) The information processing apparatus according to (1), wherein the irradiation control unit irradiates the captured image onto a second region at an outer edge of a first region irradiated with the subject image obtained by the first acquisition unit.
  • the information processing apparatus according to one item.
  • (10) The information processing apparatus according to (9), wherein the operation mode control unit executes the switching according to a user operation.
  • (11) The information processing apparatus according to (9) or (10), wherein the operation mode control unit executes the switching according to a user state.
  • (12) The information processing apparatus according to any one of (1) to (11), wherein the irradiation control unit stores the captured image obtained by the first acquisition unit in a storage unit, and irradiates, onto a region that has fallen out of the field of view of the subject image obtained by the first acquisition unit, the captured image corresponding to that region acquired from the storage unit.
  • An information processing apparatus comprising: a first acquisition unit; a second acquisition unit; an eyepiece to which at least a part of a subject image obtained by the first acquisition unit is irradiated; and an irradiation control unit that controls an irradiation process so that a captured image obtained by the second acquisition unit, which has been processed on the basis of the relationship between the angles of view of the subject images obtained by the first and second acquisition units, is irradiated onto the eyepiece.

Abstract

[Problem] To provide an information processing device and an information processing method capable of pseudo-expansion of an apparent field of view. [Solution] An information processing device that comprises an exposure control unit that controls an exposure process so that a captured image acquired via a second acquisition unit and processed on the basis of the relationship of angles of view of an object image acquired by a first acquisition unit and the second acquisition unit is exposed to an eye lens to which at least part of an object image acquired by the first acquisition unit has been exposed.

Description

Information processing apparatus and information processing method
The present disclosure relates to an information processing apparatus and an information processing method.
In optical system devices such as telescopes and binoculars, a subject may be magnified and projected at a high magnification. As the magnification increases, even a slight change in orientation moves the subject greatly, so the subject easily falls out of the field of view; and once it does, the surroundings cannot be seen, making it difficult to capture the subject in the field of view again. Although this difficulty can be reduced by enlarging the apparent viewing angle, widening the apparent viewing angle can be technically difficult and costly. Therefore, as another example of a technique for reducing the above difficulty more simply, techniques have been developed that project the surroundings in addition to the observation target.
For example, Patent Document 1 below discloses a technique in which an electron microscope is provided with a peripheral camera that captures the periphery of the area captured by a main camera, and the video obtained by the main camera is synthesized with the video obtained by the peripheral camera.
Patent Document 2 below discloses a technique in which a part of the image projected on the effective imaging region of an image sensor is extracted as a first image and the periphery of the first image is extracted as a second image, and the second image is displayed as a through image together with the first image, which is stored as a captured image.
JP 2011-138096 A (Patent Document 1)
JP 2004-343363 A (Patent Document 2)
However, with the technique disclosed in Patent Document 1, the higher the magnification of the main camera, the smaller the synthesized video of the main camera is displayed. With the technique disclosed in Patent Document 2, the main first image is narrowed in order to extract the second image.
Therefore, the present disclosure proposes a new and improved information processing apparatus and information processing method capable of pseudo-expanding the apparent viewing angle.
According to the present disclosure, there is provided an information processing apparatus including an irradiation control unit that controls an irradiation process so that a captured image obtained by a second acquisition unit, which has been processed on the basis of the relationship between the angles of view of subject images obtained by first and second acquisition units, is irradiated onto an eyepiece to which at least a part of the subject image obtained by the first acquisition unit is irradiated.
According to the present disclosure, there is also provided an information processing method executed by a processor, the method including controlling an irradiation process so that a captured image obtained by a second acquisition unit, which has been processed on the basis of the relationship between the angles of view of subject images obtained by first and second acquisition units, is irradiated onto an eyepiece to which at least a part of the subject image obtained by the first acquisition unit is irradiated.
According to the present disclosure, there is also provided an information processing apparatus including: a first acquisition unit; a second acquisition unit; an eyepiece to which at least a part of a subject image obtained by the first acquisition unit is irradiated; and an irradiation control unit that controls an irradiation process so that a captured image obtained by the second acquisition unit, which has been processed on the basis of the relationship between the angles of view of the subject images obtained by the first and second acquisition units, is irradiated onto the eyepiece.
As described above, according to the present disclosure, the apparent viewing angle can be expanded in a pseudo manner. Note that the above effect is not necessarily limitative; together with or instead of the above effect, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
FIG. 1 is a diagram for explaining an overview of the optical system device according to the present embodiment.
FIG. 2 is a diagram showing an example of an image seen through the eyepiece of an optical system device.
FIG. 3 is a diagram showing an example of an image seen through the eyepiece of an optical system device.
FIG. 4 is a diagram for explaining human viewing angles.
FIG. 5 is a diagram showing an example of the hardware configuration of the binoculars according to the present embodiment.
FIG. 6 is a block diagram showing an example of the logical configuration of the image processing apparatus according to the present embodiment.
FIG. 7 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 8 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 9 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 10 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 11 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 12 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 13 is a diagram for explaining an example of image processing by the irradiation control unit according to the present embodiment.
FIG. 14 is a flowchart showing an example of the flow of processing executed in the binoculars according to the present embodiment.
FIG. 15 is a diagram showing an example of the hardware configuration of the binoculars according to a modification.
FIG. 16 is a diagram for explaining an example of irradiation processing by the irradiation control unit according to a modification.
FIG. 17 is a diagram showing an example of the hardware configuration of the binoculars according to a modification.
FIG. 18 is a diagram showing an example of the hardware configuration of the HMD according to a modification.
FIG. 19 is a diagram showing an example of an image displayed by the HMD according to a modification.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will proceed in the following order.
 1. Overview
 2. Configuration example
  2.1. Overall configuration
  2.2. Configuration of the image processing apparatus
 3. Technical features
  3.1. Apparent viewing angle extension function
  3.2. Operation mode switching function
  3.3. Comparison function
 4. Operation processing example
 5. Modifications
  5.1. Without the sub optical system
  5.2. Electronic binoculars
  5.3. HMD
 6. Summary
<< 1. Overview >>
First, an overview of an information processing apparatus according to an embodiment of the present disclosure will be described with reference to FIGS. 1 to 4. Hereinafter, as an example, a case in which the information processing apparatus according to the present embodiment is realized as an optical system device will be described.
FIG. 1 is a diagram for explaining an overview of the optical system device according to the present embodiment. In this specification, the description assumes an example in which a region 10 in the real space as shown in FIG. 1 is magnified by the optical system device.
FIGS. 2 and 3 are diagrams showing examples of images seen through the eyepiece of an optical system device. FIG. 2 shows an image of the region 10 as seen through the eyepiece of a typical telescope or binoculars. The range in which the image is visible indicates the apparent viewing angle, and the outside of it is dark (black). FIG. 3 shows an image of the region 10 as seen through the viewfinder of a typical camera. The visible range of the image shown in FIG. 3 substantially matches the range actually captured in a photograph. The apparent viewing angle is the viewing angle representing the spread of the image seen when looking through the eyepiece. In this connection, the viewing angle representing the range of objects included in the image is also referred to as the real viewing angle. The real viewing angle has the same meaning as the angle of view indicating the imaging range of a camera.
FIG. 4 is a diagram for explaining human viewing angles. The range indicated by the symbol A in FIG. 4 is the discrimination visual field, the range in which highly accurate information can be received. The range indicated by the symbol B is the effective visual field, the range in which information can be received instantaneously with the aid of eye movement. The range indicated by the symbol C is the guidance visual field, the range in which the presence of information can be determined. The range indicated by the symbol D is the auxiliary visual field, an auxiliary range in which a strong stimulus induces a gaze operation. The discrimination visual field and the effective visual field may also be called the central visual field, and the guidance visual field and the auxiliary visual field may also be called the peripheral visual field.
Here, in an optical system device such as a telescope or binoculars, when the magnification increases, even a slight change in orientation moves the subject greatly, so the subject easily falls out of the field of view; once it does, the surroundings cannot be seen, making it difficult to capture the subject in the field of view again. If a wider image for the peripheral visual field could be projected by enlarging the apparent viewing angle, this difficulty could be reduced. For example, regarding the examples shown in FIGS. 2 and 3, the wider the apparent viewing angle, the less likely the subject is to fall out of the field of view, and even when it does, the easier it is to capture the subject in the field of view again. In addition, if the body moves while the user is looking through the eyepiece, it causes sickness and discomfort. These difficulties become more pronounced as the magnification increases. Regarding the example shown in FIG. 3, however, considering the purpose of checking every corner of the range captured in the photograph and the size of the effective human visual field, it is arguably not advisable to widen the viewing angle of the viewfinder unnecessarily.
However, widening the apparent viewing angle can be technically difficult and costly. Considering that the main video is displayed relatively small, the techniques described in the above patent documents are not appropriate for a device intended for viewing a high-definition image directly with the naked eye.
In view of the above circumstances, the information processing apparatus according to an embodiment of the present disclosure has been created. The information processing apparatus according to the present embodiment can extend the apparent viewing angle in a pseudo manner. The information processing apparatus according to the present embodiment will be described in detail below with reference to FIGS. 5 to 19.
<< 2. Configuration example >>
A configuration example of the information processing apparatus according to the present embodiment is described below. Here, binoculars, which are an optical system device, are described as an example of the information processing apparatus.
<2.1. Overall configuration>
FIG. 5 is a diagram showing an example of the hardware configuration of the binoculars 1 according to the present embodiment. As shown in FIG. 5, the binoculars 1 include a sub optical system camera block 100, an image processing apparatus 200, a main optical system camera block 300, a display block 400, a beam splitter 500, an objective lens 600, a prism 700, an eyepiece lens 800, and a switch 900. The binoculars 1 have a pair of left and right main optical system camera blocks 300, display blocks 400, beam splitters 500, objective lenses 600, prisms 700, and eyepiece lenses 800; FIG. 5 shows only one of the pair and omits the other. Although the lens 110, the lens 310, and the lens 410 are illustrated as single lenses, each may be designed as an appropriate optical system suited to the application.
The light incident through the objective lens 600 is applied to the beam splitter 500. The beam splitter 500 has the function of transmitting part of the light and reflecting the rest; for example, it passes most of the incident light straight through and reflects a portion toward the main optical system camera block 300. The light incident on the main optical system camera block 300 passes through the lens 310, forms an image on the imager 320, and is captured by the main optical system camera block 300. The captured video signal (captured image) is input to the image processing apparatus 200. Hereinafter, the optical system relating to the light incident through the objective lens 600, such as the main optical system camera block 300, is also referred to as the main optical system. Correspondingly, the optical system relating to the light incident through the lens 110, such as the sub optical system camera block 100, is also referred to as the sub optical system.
The light incident through the lens 110 forms an image on the imager 120 and is captured by the sub optical system camera block 100. The sub optical system camera block 100 is adjusted so that its optical axis lies exactly between the left and right main optical systems, and can capture, at its center, an image of the same range as the image captured by the main optical system. Here, it is assumed that the sub optical system camera block 100 can capture an image with a wider angle of view than the main optical system. The captured video signal (captured image) is input to the image processing apparatus 200.
The image processing apparatus 200 functions as an arithmetic processing device and a control device, and controls the overall operation of the binoculars 1 according to various programs. The image processing apparatus 200 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. The image processing apparatus 200 may also include a ROM (Read Only Memory) that stores the programs, calculation parameters, and the like to be used, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
Specifically, the image processing apparatus 200 functions as an information processing apparatus that performs image processing on the input video signals. For example, the image processing apparatus 200 performs, on the video signal input from the sub optical system camera block 100, image processing based on the video signal input from the main optical system camera block 300, and outputs the processed video signal to the display block 400.
The display block 400 displays the input video signal on an LCD (liquid crystal display) 420. The image displayed on the LCD 420 is adjusted by the lens 410 and applied to the rear side of the beam splitter 500. The image applied to the rear side of the beam splitter 500 is reflected by the beam splitter 500 and, together with the light transmitted through the beam splitter 500, passes through the prism 700 and the eyepiece lens 800 and enters the user's eye. In other words, the captured image obtained by the sub optical system (second acquisition unit) and processed by the image processing apparatus 200 is irradiated onto the eyepiece lens 800, to which at least a part of the subject image obtained by the main optical system (first acquisition unit) is irradiated. As a result, an image in which the image of the main optical system and the image of the sub optical system are combined is shown on the eyepiece lens 800.
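The light budget at the eyepiece lens 800 can be modeled additively: the scene light transmitted by the beam splitter 500 and the LCD image reflected off its rear side sum at the eye. The sketch below assumes illustrative split fractions (70% transmitted, 30% reflected, expressed as integer percentages) and 8-bit pixel values; none of these figures come from the disclosure, and the function name is hypothetical.

```python
def combine_at_eyepiece(optical, displayed, t_pct=70, r_pct=30, full=255):
    """Per-pixel additive mix of the transmitted scene image and the
    reflected LCD image, clipped to the displayable range. Both inputs
    are lists of rows of 8-bit intensities of equal size."""
    return [[min(full, (t_pct * o + r_pct * d) // 100)
             for o, d in zip(orow, drow)]
            for orow, drow in zip(optical, displayed)]

# A mid-grey and a bright scene pixel, each mixed with a fully lit
# LCD pixel:
print(combine_at_eyepiece([[100, 200]], [[255, 255]]))  # [[146, 216]]
```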
 以上、双眼鏡1のハードウェア構成について説明した。続いて、図6を参照して、画像処理装置200の内部構成について説明する。 The hardware configuration of the binoculars 1 has been described above. Next, an internal configuration of the image processing apparatus 200 will be described with reference to FIG.
  <2.2.画像処理装置の構成>
 図6は、本実施形態に係る画像処理装置の論理的な構成の一例を示すブロック図である。図6に示すように、画像処理装置200は、照射制御部210及び動作モード制御部220を含み、主光学系カメラブロック300、副光学系カメラブロック100、表示ブロック400及び記憶部230と接続される。
<2.2. Configuration of Image Processing Device>
FIG. 6 is a block diagram illustrating an example of a logical configuration of the image processing apparatus according to the present embodiment. As shown in FIG. 6, the image processing apparatus 200 includes an irradiation control unit 210 and an operation mode control unit 220, and is connected to the main optical system camera block 300, the sub optical system camera block 100, the display block 400, and the storage unit 230.
 照射制御部210は、副光学系カメラブロック100及び主光学系カメラブロック300から入力された映像信号に基づく画像処理を行う機能を有する。例えば、照射制御部210は、副光学系カメラブロック100から入力された撮像画像に、主光学系カメラブロック300から入力された撮像画像に基づく画像処理を行う。そして、照射制御部210は、画像処理後の撮像画像を表示ブロック400へ出力し、表示ブロック400が撮像画像を接眼レンズ800に照射する照射処理を制御する。 The irradiation control unit 210 has a function of performing image processing based on video signals input from the sub optical system camera block 100 and the main optical system camera block 300. For example, the irradiation control unit 210 performs image processing on the captured image input from the sub optical system camera block 100 based on the captured image input from the main optical system camera block 300. Then, the irradiation control unit 210 outputs the captured image after image processing to the display block 400, and controls the irradiation process in which the display block 400 irradiates the eyepiece 800 with the captured image.
 動作モード制御部220は、照射制御部210の動作モードを制御する機能を有する。 The operation mode control unit 220 has a function of controlling the operation mode of the irradiation control unit 210.
 記憶部230は、所定の記録媒体に対してデータの記録再生を行う部位である。例えば、記憶部230は、副光学系カメラブロック100及び主光学系カメラブロック300により撮像された撮像画像を記憶する。記憶部230は、双眼鏡1に内蔵されていてもよいし、メモリーカードのように着脱可能な媒体として形成されてもよい。 The storage unit 230 is a part that records and reproduces data with respect to a predetermined recording medium. For example, the storage unit 230 stores captured images captured by the sub optical system camera block 100 and the main optical system camera block 300. The storage unit 230 may be built in the binoculars 1 or may be formed as a removable medium such as a memory card.
 以上、本実施形態に係る画像処理装置200の内部構成例について説明した。続いて、本実施形態に係る双眼鏡1の技術的特徴について説明する。 The example of the internal configuration of the image processing apparatus 200 according to the present embodiment has been described above. Next, technical features of the binoculars 1 according to this embodiment will be described.
 <<3.技術的特徴>>
  <3.1.見かけの視野角の拡張機能>
 本実施形態に係る双眼鏡1(例えば、照射制御部210)は、主光学系の見かけの視野角を疑似的に拡張する機能を有する。
<< 3. Technical features >>
<3.1. Apparent viewing angle expansion function>
The binoculars 1 (for example, the irradiation control unit 210) according to the present embodiment has a function of artificially expanding the apparent viewing angle of the main optical system.
 例えば、照射制御部210は、主光学系により得られる被写体像が照射される第1の領域の外縁の第2の領域に、副光学系により得られた撮像画像を照射する。これにより、見かけの視野角が疑似的に拡張される。その際、副光学系により得られた撮像画像に対して、主光学系及び副光学系により得られる被写体像の画角の関係に基づく処理が行われることにより、見かけの視野角が効果的に拡張され得る。以下、図7~図11を参照して、画角の関係に基づく処理の具体例を説明する。 For example, the irradiation control unit 210 irradiates the second region, at the outer edge of the first region onto which the subject image obtained by the main optical system is irradiated, with the captured image obtained by the sub optical system. As a result, the apparent viewing angle is extended in a pseudo manner. At that time, the apparent viewing angle can be effectively extended by applying, to the captured image obtained by the sub optical system, processing based on the relationship between the angles of view of the subject images obtained by the main optical system and the sub optical system. Hereinafter, specific examples of processing based on the angle-of-view relationship will be described with reference to FIGS. 7 to 11.
 図7は、本実施形態に係る照射制御部210による画像処理の一例を説明するための図である。図7では、接眼レンズ800に写る映像の一例を示している。符号21に示す領域は、主光学系により得られた映像(換言すると、対物レンズ600を通過して入射した光であって、ビームスプリッター500を透過してきた光)が映る第1の領域である。符号22に示す領域は、副光学系により得られた映像(換言すると、表示ブロック400により表示され、ビームスプリッター500により反射された光)が映る第2の領域である。第2の領域は、人間の周辺視野に相当し得る。なお、第1の領域と第2の領域との境界線は、明示的に表示されてもよいし、表示されなくてもよい。図7を参照すると、本来の主光学系の見かけの視野角が第1の領域であるのに対し、第1の領域の外縁に第2の領域が位置することで、主光学系の見かけの視野角が疑似的に第2の領域にまで拡張されている。 FIG. 7 is a diagram for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment. FIG. 7 shows an example of an image appearing on the eyepiece 800. The region denoted by reference numeral 21 is the first region, in which the image obtained by the main optical system (in other words, the light that entered through the objective lens 600 and was transmitted through the beam splitter 500) appears. The region denoted by reference numeral 22 is the second region, in which the image obtained by the sub optical system (in other words, the light displayed by the display block 400 and reflected by the beam splitter 500) appears. The second region may correspond to the human peripheral visual field. Note that the boundary line between the first region and the second region may or may not be explicitly displayed. Referring to FIG. 7, whereas the original apparent viewing angle of the main optical system covers only the first region, the second region located at the outer edge of the first region pseudo-extends the apparent viewing angle of the main optical system to the second region.
 図7に示した例では、第1の領域の映像と第2の領域の映像との倍率が一致している。換言すると、副光学系により得られた撮像画像は、第1の領域に照射される被写体像と同じ倍率で撮像画像に含まれる被写体像を表すよう処理されている。これにより、第1の領域に映る被写体像と第2の領域に映る被写体像とが整合することとなる。 In the example shown in FIG. 7, the magnifications of the first area video and the second area video are the same. In other words, the captured image obtained by the sub optical system is processed so as to represent the subject image included in the captured image at the same magnification as the subject image irradiated on the first region. As a result, the subject image shown in the first area matches the subject image shown in the second area.
 さらに、図7に示した例では、第1の領域の映像と第2の領域の映像とがシームレスにつながっている。換言すると、副光学系により得られた撮像画像は、第1の領域と第2の領域との境界部分の画角が、主光学系と副光学系とで一致するように処理されている。これにより、第1の領域に映る被写体像と第2の領域に映る被写体像とが整合することとなる。 Furthermore, in the example shown in FIG. 7, the video of the first area and the video of the second area are seamlessly connected. In other words, the captured image obtained by the sub optical system is processed so that the angle of view at the boundary between the first area and the second area matches between the main optical system and the sub optical system. As a result, the subject image shown in the first area matches the subject image shown in the second area.
 照射制御部210による処理について詳細に説明する。まず、照射制御部210は、副光学系により得られた撮像画像を、副光学系の倍率と主光学系の倍率とが一致するように拡縮する。そして、照射制御部210は、拡縮後の撮像画像から、第1の領域に相当する部分を黒くくり抜いた画像(換言すると、マスクした画像)を生成して、接眼レンズ800へ照射させる。これにより、マスクされた部分についてはビームスプリッター500で何も反射されず、ビームスプリッター500を通過してきた主光学系の映像がマスクされた部分に嵌ったように映る。 The process by the irradiation control unit 210 will be described in detail. First, the irradiation controller 210 enlarges / reduces the captured image obtained by the sub optical system so that the magnification of the sub optical system and the magnification of the main optical system coincide. Then, the irradiation control unit 210 generates an image (in other words, a masked image) in which a portion corresponding to the first region is blacked out from the enlarged / reduced captured image, and irradiates the eyepiece 800 with the image. Thereby, nothing is reflected by the beam splitter 500 in the masked portion, and the image of the main optical system that has passed through the beam splitter 500 appears to fit into the masked portion.
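 The scale-then-mask procedure above can be sketched as follows. This is a minimal illustration only, assuming a grayscale image stored as a list of lists; the function names and the nearest-neighbor scaling are assumptions of this sketch, not details given in the specification.

```python
def scale_nearest(img, factor):
    """Nearest-neighbor scaling, standing in for the step that matches
    the sub optical system's magnification to the main optical system's."""
    h, w = len(img), len(img[0])
    nh, nw = max(1, round(h * factor)), max(1, round(w * factor))
    return [[img[min(h - 1, int(y / factor))][min(w - 1, int(x / factor))]
             for x in range(nw)] for y in range(nh)]

def mask_center(img, mask_h, mask_w):
    """Black out the central rectangle corresponding to the first region.
    A black (zero) pixel reflects nothing at the beam splitter, so the
    analog image of the main optical system shows through there."""
    h, w = len(img), len(img[0])
    top, left = (h - mask_h) // 2, (w - mask_w) // 2
    return [[0 if top <= y < top + mask_h and left <= x < left + mask_w
             else img[y][x] for x in range(w)] for y in range(h)]
```

 A frame from the sub optical system would be scaled first and then masked before being handed to the display block for irradiation.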
 図8は、本実施形態に係る照射制御部210による画像処理の一例を説明するための図である。図8では、接眼レンズ800に写る映像の一例を示している。また、符号21に示す領域は第1の領域を示し、符号22に示す領域は第2の領域を示している。図8に示した例では、第1の領域の映像よりも第2の領域の映像の倍率の方が低い。換言すると、副光学系により得られた撮像画像は、第1の領域に照射される被写体像より低い倍率で撮像画像に含まれる被写体像を表すよう処理されている。これにより、図7に示した例と比較して広い実視野角(画角)の映像がユーザに提供される。 FIG. 8 is a diagram for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment. FIG. 8 shows an example of an image appearing on the eyepiece 800. The region denoted by reference numeral 21 is the first region, and the region denoted by reference numeral 22 is the second region. In the example shown in FIG. 8, the magnification of the image in the second region is lower than that of the image in the first region. In other words, the captured image obtained by the sub optical system is processed to represent the subject image included in the captured image at a lower magnification than the subject image irradiated on the first region. As a result, an image with a wider actual viewing angle (angle of view) than in the example shown in FIG. 7 is provided to the user.
 ここで、人間は自己の姿勢を把握する際に、三半規管のみならず視覚情報にも頼っている。そのため、体の動きと視界の動きが著しく異なる場合、乗り物酔いと同じような症状を招いたり、よろけたりする等の問題が生じる場合がある。そのため、第2の領域に映る映像は、倍率が1倍に近いことが望ましい。その場合、外界の様子を肉眼で見た場合と同様の映像を周辺視野に映すことが可能となるので、ユーザの姿勢の感覚を安定させることができ、酔い及び不快感を防止又は和らげることができる。 Here, humans rely not only on the semicircular canals but also on visual information when grasping their own posture. For this reason, when the movement of the body and the movement of the field of view differ significantly, problems such as symptoms similar to motion sickness or loss of balance may occur. Therefore, it is desirable that the image shown in the second region have a magnification close to 1. In that case, an image similar to what would be seen of the outside world with the naked eye can be shown in the peripheral visual field, which stabilizes the user's sense of posture and can prevent or alleviate sickness and discomfort.
 図9は、本実施形態に係る照射制御部210による画像処理の一例を説明するための図である。図9では、接眼レンズ800に写る映像の一例を示している。また、符号21に示す領域は第1の領域を示し、符号22に示す領域は第2の領域を示している。図9に示した例では、第2の領域の映像の倍率が中心側から外側へ向けて徐々に低まっている。換言すると、副光学系により得られた撮像画像は、第1の領域と第2の領域との境界部分から外側に向けて画角が非均等に増加するよう処理されている。これにより、図7に示した例と比較して広い実視野角(画角)の映像がユーザに提供される。さらに、第1の領域と第2の領域との境界部分の画角が、主光学系と副光学系とで一致しているので、図8に示した例と比較してユーザに与える違和感が軽減される。ここで、図10及び図11を参照して、画角の増加方式の一例を説明する。 FIG. 9 is a diagram for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment. FIG. 9 shows an example of an image appearing on the eyepiece 800. The region denoted by reference numeral 21 is the first region, and the region denoted by reference numeral 22 is the second region. In the example shown in FIG. 9, the magnification of the image in the second region gradually decreases from the center side toward the outside. In other words, the captured image obtained by the sub optical system is processed so that the angle of view increases non-uniformly from the boundary between the first region and the second region toward the outside. As a result, an image with a wider actual viewing angle (angle of view) than in the example shown in FIG. 7 is provided to the user. Furthermore, since the angle of view at the boundary between the first region and the second region matches between the main optical system and the sub optical system, the discomfort given to the user is reduced compared to the example shown in FIG. 8. Here, an example of a method for increasing the angle of view will be described with reference to FIGS. 10 and 11.
 図10及び図11は、本実施形態に係る照射制御部210による画像処理の一例を説明するための図である。図10及び11では、接眼レンズ800に写る映像の一例を示している。また、符号21に示す領域は第1の領域を示し、符号22に示す領域は第2の領域を示している。 FIGS. 10 and 11 are diagrams for explaining examples of image processing by the irradiation control unit 210 according to the present embodiment. FIGS. 10 and 11 show examples of images appearing on the eyepiece 800. The region denoted by reference numeral 21 is the first region, and the region denoted by reference numeral 22 is the second region.
 図10では、第2の領域において画角が非均等に増加する例を示している。例えば、照射制御部210は、副光学系の映像の画角の増加率を、映像の外側に近づくほど大きくする。詳しくは、照射制御部210は、画角が30度から40度に増加するまでの間隔24を、画角が20度から30度に増加するまでの間隔23よりも短くする。また、照射制御部210は、画角が40度から50度に増加するまでの間隔25を、画角が30度から40度に増加するまでの間隔24よりも短くする。なお、図10に示した例とは逆に、照射制御部210は、副光学系の映像の画角の増加率を、映像の外側に近づくほど小さくしてもよい。 FIG. 10 shows an example in which the angle of view increases non-uniformly in the second region. For example, the irradiation control unit 210 increases the increasing rate of the angle of view of the sub optical system image as it approaches the outside of the image. Specifically, the irradiation control unit 210 makes the interval 24 until the angle of view increases from 30 degrees to 40 degrees shorter than the interval 23 until the angle of view increases from 20 degrees to 30 degrees. In addition, the irradiation controller 210 makes the interval 25 until the angle of view increases from 40 degrees to 50 degrees shorter than the interval 24 until the angle of view increases from 30 degrees to 40 degrees. In contrast to the example shown in FIG. 10, the irradiation control unit 210 may decrease the increasing rate of the angle of view of the image of the sub optical system as it approaches the outside of the image.
 このような画像の幾何学変化により、双眼鏡1は、主光学系の映像から外側へシームレスに、広い実視野角を見かけの視野角の中に圧縮して提示することができる。これにより、単に見かけの視野角を拡張する場合と比較して、被写体が視野角から外れて見失うことが減り、視野外から飛び込んでくる被写体にいち早く察知することができるようになる。映像の外側ほど歪みが大きく画質も悪くなるものの、主光学系の映像の画質は維持されるので、ユーザは快適に被写体を見ることができる。 Through such geometric transformation of the image, the binoculars 1 can compress a wide actual viewing angle into the apparent viewing angle and present it seamlessly, extending outward from the image of the main optical system. Compared to simply extending the apparent viewing angle, the subject is thus less likely to be lost from sight outside the viewing angle, and a subject entering from outside the field of view can be noticed quickly. Although distortion increases and image quality degrades toward the outside of the image, the image quality of the main optical system's image is maintained, so the user can view the subject comfortably.
 図11は、画角の変化率が均等に増加する例を示している。例えば、照射制御部210は、副光学系の映像の画角を、映像の外側に近づくほど均等に増加させる。詳しくは、照射制御部210は、画角が20度から25度に増加するまでの間隔26と、画角が25度から30度に増加するまでの間隔27とを同じ長さにする。なお、図10及び図11に示した画角の値は一例である。 FIG. 11 shows an example in which the change rate of the angle of view increases evenly. For example, the irradiation control unit 210 increases the angle of view of the image of the sub optical system evenly as it approaches the outside of the image. Specifically, the irradiation controller 210 makes the interval 26 until the angle of view increases from 20 degrees to 25 degrees and the interval 27 until the angle of view increases from 25 degrees to 30 degrees have the same length. The angle of view values shown in FIGS. 10 and 11 are examples.
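 One way to realize the mappings of FIGS. 10 and 11 is a radial profile from normalized radius in the second region to field angle: a quadratic profile makes the angle increase faster toward the edge (FIG. 10), while a linear profile increases it evenly (FIG. 11). The quadratic choice, the concrete angle values, and all names below are illustrative assumptions, not part of the specification.

```python
import math

# Boundary angle (where the first region ends) and maximum angle at the
# outer edge of the second region; the concrete values are assumptions.
A_BOUNDARY, A_MAX = 10.0, 50.0

def angle_nonuniform(t):
    """Field angle at normalized radius t (0 = inner boundary of the
    second region, 1 = outer edge).  A quadratic profile makes the rate
    of increase grow toward the edge, as in FIG. 10."""
    return A_BOUNDARY + (A_MAX - A_BOUNDARY) * t * t

def angle_uniform(t):
    """Linear profile: the field angle increases evenly, as in FIG. 11."""
    return A_BOUNDARY + (A_MAX - A_BOUNDARY) * t

def radius_of(angle):
    """Inverse of angle_nonuniform: the radius at which a field angle lands."""
    return math.sqrt((angle - A_BOUNDARY) / (A_MAX - A_BOUNDARY))
```

 With this profile, the interval from 30 to 40 degrees is shorter than the interval from 20 to 30 degrees, and the interval from 40 to 50 degrees is shorter still, matching the spacing of the rings in FIG. 10.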
 ここで、第1の領域の映像の倍率の変化に伴い、第2の領域の映像の倍率が変化してもよい。換言すると、副光学系により得られた撮像画像は、第1の領域に照射される被写体像と撮像画像に含まれる被写体像との倍率の関係が維持されるよう処理されてもよい。例えば、照射制御部210は、主光学系がズーム機能を有している場合に、主光学系により得られる映像が拡大されれば副光学系の映像も拡大し、主光学系により得られる映像が縮小されれば副光学系の映像も縮小する。これにより、第1の領域の映像の画角の変化に合わせて第2の領域の映像の画角が自動的に変化する。よって、第1の領域の映像と第2の領域の映像との整合性が維持される。他にも、主光学系又は副光学系のレンズが交換された場合も、倍率の関係が維持されるよう同様に処理されてもよい。 Here, the magnification of the image in the second region may change in accordance with a change in the magnification of the image in the first region. In other words, the captured image obtained by the sub optical system may be processed so that the magnification relationship between the subject image irradiated on the first region and the subject image included in the captured image is maintained. For example, when the main optical system has a zoom function, the irradiation control unit 210 also enlarges the image of the sub optical system when the image obtained by the main optical system is enlarged, and also reduces the image of the sub optical system when the image obtained by the main optical system is reduced. As a result, the angle of view of the image in the second region automatically changes in accordance with the change in the angle of view of the image in the first region. Consistency between the image in the first region and the image in the second region is thus maintained. Likewise, when a lens of the main optical system or the sub optical system is exchanged, the same processing may be performed so that the magnification relationship is maintained.
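 The linkage between the two magnifications reduces to a single ratio. The sketch below, with hypothetical names, recomputes the digital scale factor for the sub optical system's image whenever the main system zooms or a lens is exchanged, so that the magnification relationship is maintained.

```python
def sub_scale_factor(main_magnification, sub_magnification):
    """Digital scale factor for the sub optical system's image so that
    subjects in the second region appear at the same magnification as
    in the first region.  Recomputed on every zoom or lens change."""
    return main_magnification / sub_magnification
```

 For example, if the main optical system zooms from 8x to 10x while the sub optical system stays at 1x, the factor rises from 8.0 to 10.0 and the second region is enlarged accordingly.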
 なお、主光学系のズーム比率が大きい場合、副光学系の映像が画像処理により拡大されるだけでは、第2の領域の映像が過度に荒くなるおそれがある。そのため、副光学系も光学ズーム機能を有していることが望ましい。また、図9に示した例では、主光学系の画角(境界部分の画角)と副光学系の画角(最周辺部の画角)との差が過度に大きい場合、映像の歪みが過度に大きくなり、見づらく、不快感を与える映像がユーザに提供されるおそれがある。そのため、照射制御部210は、ユーザ操作に応じて副光学系の画角を調整してもよいし、主光学系のズーム比率に基づいて差が過度に大きくならないよう、画像処理によって副光学系の画角を自動的に調整してもよい。 Note that when the zoom ratio of the main optical system is large, the image in the second region may become excessively rough if the image of the sub optical system is merely enlarged by image processing. Therefore, it is desirable that the sub optical system also have an optical zoom function. Also, in the example shown in FIG. 9, if the difference between the angle of view of the main optical system (the angle of view at the boundary) and the angle of view of the sub optical system (the angle of view at the outermost periphery) is excessively large, the distortion of the image becomes excessively large, and an image that is hard to see and causes discomfort may be provided to the user. Therefore, the irradiation control unit 210 may adjust the angle of view of the sub optical system according to a user operation, or may automatically adjust the angle of view of the sub optical system by image processing, based on the zoom ratio of the main optical system, so that the difference does not become excessively large.
 照射制御部210は、第1の領域及び第2の領域の大きさを制御してもよい。例えば、照射制御部210は、ユーザが対象の被写体を探している場合は第1の領域を小さく第2の領域を大きく、ユーザが対象の被写体を観察している場合は第1の領域を大きく第2の領域を小さくしてもよい。第1の領域の大きさの制御は、主光学系に設けられる図示しない絞りにより実現され得る。また、第2の領域の大きさの制御は、副光学系により得られた撮像画像のうちマスクする領域の大きさを制御することにより実現され得る。 The irradiation control unit 210 may control the sizes of the first region and the second region. For example, the irradiation control unit 210 may make the first region small and the second region large when the user is searching for a target subject, and may make the first region large and the second region small when the user is observing the target subject. Control of the size of the first region can be realized by a diaphragm (not shown) provided in the main optical system. Control of the size of the second region can be realized by controlling the size of the masked region in the captured image obtained by the sub optical system.
  <3.2.動作モード切り替え機能>
 本実施形態に係る双眼鏡1(例えば、動作モード制御部220)は、主光学系の見かけの視野角を疑似的に拡張するための処理に係る動作モードを切り替える機能を有する。
<3.2. Operation mode switching function>
The binoculars 1 (for example, the operation mode control unit 220) according to the present embodiment has a function of switching an operation mode related to processing for artificially extending the apparent viewing angle of the main optical system.
 例えば、動作モード制御部220は、照射制御部210が制御する上記照射処理の内容を切り替える。具体的には、動作モード制御部220は、上記図7~図11に示した処理のいずれを実行するかを切り替えてもよい。また、動作モード制御部220は、第1の領域及び第2の領域の大きさを切り替えてもよい。 For example, the operation mode control unit 220 switches the content of the irradiation process controlled by the irradiation control unit 210. Specifically, the operation mode control unit 220 may switch which of the processes shown in FIGS. 7 to 11 is executed. In addition, the operation mode control unit 220 may switch the sizes of the first area and the second area.
 動作モード制御部220は、ユーザ操作に応じて切り替えを実行してもよい。例えば、動作モード制御部220は、スイッチ900への切り替え操作に応じて動作モードを制御し得る。 The operation mode control unit 220 may perform switching according to a user operation. For example, the operation mode control unit 220 can control the operation mode in accordance with the switching operation to the switch 900.
 動作モード制御部220は、ユーザの状態に応じて切り替えを実行してもよい。例えば、双眼鏡1は、加速度センサ及びジャイロセンサ等のセンサを有していてもよく、動作モード制御部220は、センサにより検出されたユーザの運動状態に応じて動作モードを制御してもよい。例えば、図7又は図9に示した処理が行われる動作モードにおいてユーザが運動を継続していることが検知された場合に、動作モード制御部220は、自動的に図8に示した処理が行われる動作モードに切り替えてもよい。これにより、ユーザの酔い及び不快感を防止又は和らげることができる。 The operation mode control unit 220 may perform the switching according to the user's state. For example, the binoculars 1 may include sensors such as an acceleration sensor and a gyro sensor, and the operation mode control unit 220 may control the operation mode according to the user's motion state detected by the sensors. For example, when it is detected that the user continues moving while in an operation mode in which the process shown in FIG. 7 or FIG. 9 is performed, the operation mode control unit 220 may automatically switch to the operation mode in which the process shown in FIG. 8 is performed. Thereby, the user's sickness and discomfort can be prevented or alleviated.
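 The sensor-driven switching described above can be sketched as a small rule. The mode names and the acceleration threshold below are assumptions of this sketch, not values given in the specification.

```python
MODE_MATCHED = "matched"  # boundary magnifications matched (FIG. 7 or 9)
MODE_UNITY = "unity"      # peripheral magnification near 1x (FIG. 8)

def select_mode(current_mode, accel_magnitude, threshold=1.5):
    """Switch to the near-1x peripheral mode while sustained motion is
    detected, so that the peripheral view stays consistent with the
    user's body movement and motion sickness is reduced."""
    if accel_magnitude > threshold and current_mode == MODE_MATCHED:
        return MODE_UNITY
    return current_mode
```

 A real implementation would also debounce the sensor signal and allow switching back once the user is stationary again.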
  <3.3.比較機能>
 本実施形態に係る双眼鏡1(例えば、照射制御部210)は、現在と過去とを比較する映像を提供する機能を有する。
<3.3. Comparison function>
The binoculars 1 (for example, the irradiation control unit 210) according to the present embodiment has a function of providing an image for comparing the current and the past.
 例えば、照射制御部210は、主光学系により得られる映像に、記憶部230に記憶された、過去に主光学系又は副光学系により撮像された映像を合成する。なお、記憶部230に記憶された映像は、双眼鏡1において撮像された映像でもよいし、他の装置により撮像され双眼鏡1に入力された映像であってもよい。以下、図12及び図13を参照して、具体的に説明する。 For example, the irradiation control unit 210 synthesizes an image captured by the main optical system or the sub optical system in the past, stored in the storage unit 230, with an image obtained by the main optical system. The video stored in the storage unit 230 may be a video captured by the binoculars 1 or a video captured by another device and input to the binoculars 1. Hereinafter, a specific description will be given with reference to FIGS. 12 and 13.
 図12及び図13は、本実施形態に係る照射制御部210による画像処理の一例を説明するための図である。図12は、主光学系により得られる映像を示している。図13は、主光学系により得られる映像31に、過去の映像32が合成された映像を示している。照射制御部210は、過去の映像を主光学系により得られる映像の全面に合成してもよいし、図13に示したように例えば半分だけ合成してもよい。 FIGS. 12 and 13 are diagrams for explaining an example of image processing by the irradiation control unit 210 according to the present embodiment. FIG. 12 shows an image obtained by the main optical system. FIG. 13 shows an image in which the past image 32 is combined with the image 31 obtained by the main optical system. The irradiation control unit 210 may synthesize the past image over the entire surface of the image obtained by the main optical system, or may synthesize it over only half, for example, as shown in FIG. 13.
 以上、本実施形態に係る双眼鏡1の技術的特徴について説明した。続いて、図14を参照して、本実施形態に係る双眼鏡1の動作処理例を説明する。 The technical features of the binoculars 1 according to this embodiment have been described above. Next, with reference to FIG. 14, an example of operation processing of the binoculars 1 according to the present embodiment will be described.
 <<4.動作処理例>>
 図14は、本実施形態に係る双眼鏡1において実行される処理の流れの一例を示すフローチャートである。
<< 4. Example of operation processing >>
FIG. 14 is a flowchart illustrating an example of a flow of processing executed in the binoculars 1 according to the present embodiment.
 図14に示すように、まず、双眼鏡1は、副光学系により撮像画像を取得する(ステップS102)。より詳しくは、画像処理装置200は、副光学系カメラブロック100により撮像された撮像画像を取得する。 As shown in FIG. 14, first, the binoculars 1 obtain a captured image by the sub optical system (step S102). More specifically, the image processing apparatus 200 acquires a captured image captured by the sub optical system camera block 100.
 次いで、照射制御部210は、撮像画像を拡縮する(ステップS104)。このとき、照射制御部210は、動作モード制御部220による動作モードの設定に従い撮像画像を拡縮する。例えば、照射制御部210は、撮像画像を、副光学系の倍率と主光学系の倍率とが一致するように拡縮してもよいし、副光学系の倍率の方が主光学系の倍率より低くなるよう拡縮してもよいし、外側へ向けて徐々に倍率が低まるよう拡縮してもよい。 Next, the irradiation control unit 210 scales the captured image (step S104). At this time, the irradiation control unit 210 scales the captured image according to the operation mode set by the operation mode control unit 220. For example, the irradiation control unit 210 may scale the captured image so that the magnification of the sub optical system matches that of the main optical system, may scale it so that the magnification of the sub optical system becomes lower than that of the main optical system, or may scale it so that the magnification gradually decreases toward the outside.
 次に、照射制御部210は、拡縮後の撮像画像の中央部分(換言すると、第1の領域に相当する部分)をマスクする(ステップS106)。このとき、照射制御部210は、ユーザが対象の被写体を探しているか、ユーザが対象の被写体を観察しているか、によってマスクする領域の大きさを制御してもよい。 Next, the irradiation controller 210 masks the central portion (in other words, the portion corresponding to the first region) of the captured image after the enlargement / reduction (step S106). At this time, the irradiation control unit 210 may control the size of the masked area depending on whether the user is searching for the target subject or the user is observing the target subject.
 そして、照射制御部210は、マスクしたマスク画像を表示ブロック400に出力して、表示ブロック400によりマスク画像を接眼レンズ800に照射させる(ステップS108)。 Then, the irradiation controller 210 outputs the masked mask image to the display block 400 and causes the display block 400 to irradiate the eyepiece 800 with the mask image (step S108).
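 The flow of steps S102 to S108 can be sketched as a pipeline of injected stages. Every callable below is a hypothetical stand-in for the corresponding block of the binoculars 1, shown only to make the ordering of the steps explicit.

```python
def process_frame(acquire, rescale, mask_first_region, project):
    """One pass of the flow in FIG. 14: acquire the sub optical system's
    image (S102), scale it per the current operation mode (S104), mask
    the central portion corresponding to the first region (S106), and
    hand the mask image to the display block for irradiation (S108)."""
    image = acquire()
    image = rescale(image)
    image = mask_first_region(image)
    project(image)
```

 With stub stages substituted for the real camera, scaler, masker, and display blocks, the ordering of the steps can be checked directly.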
 以上、本実施形態に係る双眼鏡1の動作処理例を説明した。 The operation processing example of the binoculars 1 according to the present embodiment has been described above.
 <<5.変形例>>
  <5.1.副光学系レス>
 本技術は、副光学系を有しない装置にも適用可能である。以下、図15~図16を参照して、副光学系を有しない装置への本技術の適用例を説明する。
<< 5. Modification >>
<5.1. No secondary optical system>
The present technology can also be applied to an apparatus that does not have a secondary optical system. Hereinafter, with reference to FIG. 15 to FIG. 16, an application example of the present technology to an apparatus that does not have a sub optical system will be described.
 図15は、本変形例に係る双眼鏡1-1のハードウェア構成の一例を示す図である。図15に示す双眼鏡1-1は、図5に示した双眼鏡1から副光学系カメラブロック100が省略された構成を有する。また、双眼鏡1-1は、メモリーカード1000(換言すると、着脱可能な媒体として形成された記憶部230)を有する。 FIG. 15 is a diagram illustrating an example of a hardware configuration of the binoculars 1-1 according to the present modification. The binoculars 1-1 shown in FIG. 15 has a configuration in which the sub optical system camera block 100 is omitted from the binoculars 1 shown in FIG. The binoculars 1-1 also includes a memory card 1000 (in other words, the storage unit 230 formed as a removable medium).
 図15に示した双眼鏡1-1においては、画像処理装置200(例えば、照射制御部210)は、主光学系カメラブロック300により撮像された撮像画像をメモリーカード1000に記憶する。そして、画像処理装置200は、メモリーカード1000から読み出した撮像画像を表示ブロック400へ出力し、表示ブロック400が撮像画像を接眼レンズ800に照射する照射処理を制御する。これにより、対物レンズ600及びビームスプリッター500を通過して接眼レンズ800に至った現在の主光学系の映像に、主光学系カメラブロック300により撮像された過去の主光学系の映像が合成される。その際、照射制御部210は、図16に示すように、センサにより検出された双眼鏡1-1の動きに応じて、主光学系の現在の視界から外れた領域に過去の撮像画像を照射してもよい。 In the binoculars 1-1 shown in FIG. 15, the image processing apparatus 200 (for example, the irradiation control unit 210) stores the captured image captured by the main optical system camera block 300 in the memory card 1000. The image processing apparatus 200 then outputs the captured image read from the memory card 1000 to the display block 400, and controls the irradiation process in which the display block 400 irradiates the eyepiece 800 with the captured image. As a result, the past image of the main optical system captured by the main optical system camera block 300 is combined with the current image of the main optical system that has passed through the objective lens 600 and the beam splitter 500 and reached the eyepiece 800. At that time, as shown in FIG. 16, the irradiation control unit 210 may irradiate a region that has left the current field of view of the main optical system with a past captured image, according to the movement of the binoculars 1-1 detected by a sensor.
 図16は、本変形例に係る照射制御部210による照射処理の一例を説明するための図である。図16では、接眼レンズ800に写る映像の一例を示している。符号41に示す領域は、主光学系により得られた現在の映像(換言すると、対物レンズ600を通過して入射した光であって、ビームスプリッター500を透過してきた光)が映る領域である。符号42に示す領域は、主光学系により得られた過去の映像(換言すると、主光学系カメラブロック300により撮像された撮像画像)が映る領域である。図16の上図は、双眼鏡1-1が静止している場合の映像の一例を示し、図16の下図は、双眼鏡1-1が左へ動いた場合の映像の一例を示している。 FIG. 16 is a diagram for explaining an example of irradiation processing by the irradiation control unit 210 according to the present modification. FIG. 16 shows an example of an image captured on the eyepiece lens 800. An area denoted by reference numeral 41 is an area in which a current image obtained by the main optical system (in other words, light incident through the objective lens 600 and transmitted through the beam splitter 500) is reflected. The area indicated by reference numeral 42 is an area in which a past video (in other words, a captured image captured by the main optical system camera block 300) obtained by the main optical system is reflected. The upper diagram of FIG. 16 shows an example of an image when the binoculars 1-1 are stationary, and the lower diagram of FIG. 16 shows an example of an image when the binoculars 1-1 are moved to the left.
 例えば、照射制御部210は、双眼鏡1-1が静止している場合、過去の主光学系の映像を照射しない。そのため、図16の上図に示すように、接眼レンズ800には、現在の主光学系の映像のみが映っている。双眼鏡1-1が動くと、今まで見えていた部分が視野から外れることとなる。そのため、照射制御部210は、主光学系により得られる被写体像のうち視野から外れた領域に、メモリーカード1000から取得した当該領域に対応する撮像画像を照射する。例えば、照射制御部210は、図16の上図に示した領域41の被写体像のうち、下図に示した領域41に含まれない部分の撮像画像を、領域42に照射する。よって、双眼鏡1-1が左に動くと過去の映像が右側(周辺視野)に延びていく。このように、双眼鏡1-1が上下左右に動くと、見かけ上の視野が広がることとなる。 For example, when the binoculars 1-1 are stationary, the irradiation control unit 210 does not irradiate any past image of the main optical system. Therefore, as shown in the upper diagram of FIG. 16, only the current image of the main optical system appears on the eyepiece 800. When the binoculars 1-1 move, a part that was visible until then leaves the field of view. Therefore, the irradiation control unit 210 irradiates the region that has left the field of view of the subject image obtained by the main optical system with the captured image, acquired from the memory card 1000, corresponding to that region. For example, the irradiation control unit 210 irradiates the region 42 with the captured image of the portion of the subject image in the region 41 shown in the upper diagram of FIG. 16 that is not included in the region 41 shown in the lower diagram. Therefore, when the binoculars 1-1 move to the left, the past image extends to the right (peripheral visual field). In this way, when the binoculars 1-1 move up, down, left, or right, the apparent field of view widens.
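 The relationship between the device's movement and the region filled with the past image can be sketched in one dimension. The sign convention and names below are assumptions of this sketch.

```python
def revealed_strip(view_width, dx):
    """Columns of the previous frame (in its own coordinates) that have
    left the current field of view after a horizontal pan of dx pixels
    (dx < 0: device moved left).  The past image is irradiated there.
    Returns (start, end) or None when the device is stationary."""
    if dx < 0:    # moved left: the old right edge is now out of view,
                  # so the past image extends into the right periphery
        return (view_width + dx, view_width)
    if dx > 0:    # moved right: the old left edge is now out of view
        return (0, dx)
    return None
```

 A pan to the left thus yields a strip on the right side of the previous view, matching the way the past image extends into the right peripheral visual field in FIG. 16.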
  <5.2.電子双眼鏡>
 上記では、双眼鏡を一例に挙げて説明してきたが、本技術は多様な装置に適用可能である。例えば、本技術は、望遠鏡、フィールドスコープ、一眼カメラ、顕微鏡、ビノキュラー、測量機器、銃の照準器のような、様々の光学系装置に適用可能である。また、上記では、主光学系の映像はアナログのままで、副光学系の映像が一旦デジタル化された映像である例を示したが、主光学系の映像もデジタル化された映像であってもよい。その一例として、図17に、本技術が電子双眼鏡に適用される場合の構成例を示した。
<5.2. Electronic binoculars>
In the above description, binoculars have been described as an example. However, the present technology can be applied to various apparatuses. For example, the present technology can be applied to various optical system devices such as a telescope, a field scope, a single-lens camera, a microscope, a binocular, a surveying instrument, and a gun sight. In the above example, the image of the main optical system is analog and the image of the sub optical system is once digitized. However, the image of the main optical system is also digitized. Also good. As an example, FIG. 17 shows a configuration example when the present technology is applied to electronic binoculars.
 図17は、本変形例に係る双眼鏡1-2のハードウェア構成の一例を示す図である。図17に示す双眼鏡1-2は、副光学系カメラブロック100、画像処理装置200、主光学系カメラブロック300、電子ビューファインダ1100及びスイッチ900を含む。 FIG. 17 is a diagram illustrating an example of a hardware configuration of the binoculars 1-2 according to the present modification. A binocular 1-2 shown in FIG. 17 includes a secondary optical system camera block 100, an image processing apparatus 200, a main optical system camera block 300, an electronic viewfinder 1100, and a switch 900.
 対物レンズ311を通過して入射した光は、イメージャ320上に結像し、主光学系カメラブロック300により撮像される。また、レンズ110を通過して入射した光は、イメージャ120上に結像し、副光学系カメラブロック100により撮像される。 The light incident through the objective lens 311 forms an image on the imager 320 and is imaged by the main optical system camera block 300. Further, the light incident through the lens 110 forms an image on the imager 120 and is imaged by the sub optical system camera block 100.
 画像処理装置200は、主光学系カメラブロック300及び副光学系カメラブロック100により撮像された撮像画像を合成して電子ビューファインダ1100に出力する。電子ビューファインダ1100は、画像処理装置200により合成された画像をLCD1120により表示する。LCD1120により表示された画像は、接眼レンズ1110を通過してユーザの眼に入射する。 The image processing apparatus 200 synthesizes the captured images captured by the main optical system camera block 300 and the sub optical system camera block 100, and outputs the result to the electronic viewfinder 1100. The electronic viewfinder 1100 displays the image synthesized by the image processing apparatus 200 on the LCD 1120. The image displayed on the LCD 1120 passes through the eyepiece 1110 and enters the user's eye.
The binoculars 1-2 according to this modification perform the same processing as the binoculars 1 described above. Moreover, since each of the images to be combined is digitized data, the binoculars 1-2 can more easily control the ratio between the fields of view of the main optical system and the sub optical system (in other words, the size ratio between the first region and the second region).
Note that if the binoculars 1-2 can output video to the outside, the imaging device and the display device may be formed separately. For example, the video may be output on an HMD (Head Mounted Display) instead of the electronic viewfinder 1100.
Another example of a system in which the imaging device and the display device are formed separately is an endoscope. For example, a main camera functioning as the main optical system and a sub camera functioning as the sub optical system may be included in a single endoscope. In that case, although the endoscope becomes larger, the fields of view and the parallax of both the main camera and the sub camera can be adjusted at the manufacturing stage. Alternatively, the images from a plurality of endoscopes with different angles of view and resolutions may be handled as the image of the main optical system and the image of the sub optical system, respectively. In that case, each endoscope can remain small. In this configuration, however, the endoscope system detects the positional relationship among the endoscopes in real time, based on information from sensors provided in each endoscope or on image feature amounts, and combines the images based on the detection result.
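The publication leaves open how the positional relationship is computed from image feature amounts. As one classical possibility, phase correlation can recover the offset between two overlapping views; the sketch below assumes, for simplicity, that the misalignment is a pure circular translation between two grayscale frames.

```python
import numpy as np

def estimate_shift(img_a: np.ndarray, img_b: np.ndarray) -> tuple:
    """Estimate the (dy, dx) translation of img_b relative to img_a by
    phase correlation: the normalised cross-power spectrum of the two
    frames has a sharp inverse-FFT peak at the relative shift."""
    fa = np.fft.fft2(img_a)
    fb = np.fft.fft2(img_b)
    cross = np.conj(fa) * fb
    cross /= np.abs(cross) + 1e-9            # normalise to unit magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = img_a.shape
    if dy > h // 2:                          # map wrapped peaks to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```

A real endoscope system would also have to handle rotation, scale, and parallax, typically with feature matching rather than pure translation estimation; this sketch only illustrates the image-based half of the detection the text describes.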
<5.3. HMD>
The present technology can also be applied to information processing apparatuses other than optical system devices. For example, it can be applied to information processing apparatuses capable of video output, such as HMDs, PCs, smartphones, and car navigation systems. As an example, FIG. 18 shows a configuration example in which the present technology is applied to an HMD.
FIG. 18 is a diagram illustrating an example of the hardware configuration of the HMD 2 according to this modification. As shown in FIG. 18, the HMD 2 includes a pair of left and right camera blocks 2100, a display element 2200, and an eyepiece lens 2300.
Light incident through the lens 2110 forms an image on the imager 2120 and is captured by the camera block 2100. The HMD 2 includes an image processing apparatus (not shown), combines the captured images obtained by the camera blocks 2100, and displays the result on the display element 2200. The image displayed by the display element 2200 passes through the eyepiece lens 2300 and enters the user's eyes.
One of the pair of left and right camera blocks 2100 may function as the main optical system, and the other may function as the sub optical system. Alternatively, the HMD 2 may pseudo-generate both the video that would be obtained by a main camera and the video that would be obtained by a sub camera, by applying image processing such as enlargement and reduction to the images captured by the pair of left and right camera blocks 2100.
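The pseudo-generation by scaling mentioned above might look like the following sketch, which derives a magnified "main" view and a wide "sub" view from a single captured frame. The zoom factor and the nearest-neighbour resampling are illustrative assumptions chosen to keep the example dependency-free.

```python
import numpy as np

def pseudo_main_sub(frame: np.ndarray, zoom: float = 2.0):
    """From one captured frame, derive a pseudo 'main camera' view
    (centre crop magnified by `zoom`) and a pseudo 'sub camera' view
    (the full frame, i.e. the wide field)."""
    h, w = frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)      # size of the centre crop
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = frame[top:top + ch, left:left + cw]
    ys = (np.arange(h) * ch / h).astype(int)   # nearest-neighbour upscale
    xs = (np.arange(w) * cw / w).astype(int)   # back to the full frame size
    main = crop[ys][:, xs]
    sub = frame                                # wide view used as-is
    return main, sub
```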
When the HMD 2 is used, the user's field of view is completely covered by the display. Consequently, if the actual movement of the body does not match the displayed video, video-induced motion sickness and discomfort may result. Therefore, as shown in FIG. 19, the HMD 2 may composite the video of the user's surroundings captured by the camera blocks 2100 into the peripheral visual field. As a result, the actual movement of the body matches the video in the peripheral visual field, which makes it possible to suppress the user's motion sickness and discomfort.
FIG. 19 is a diagram showing an example of a video displayed by the HMD 2 according to this modification. The region denoted by reference numeral 51 is the region in which the main video appears; for example, a video of a virtual world, or a video obtained by an endoscope, may appear there. The region denoted by reference numeral 52 is the region in which the video of the user's surroundings appears at unit magnification. In the example shown in FIG. 19, the outer of the two regions (reference numeral 52) shows the user's surroundings at unit magnification, but the inner region (reference numeral 51) can hardly be said to have an expanded apparent viewing angle. Therefore, the HMD 2 may instead divide the display into three regions, show the user's surroundings at unit magnification in the outermost region, and use the two inner regions to perform the processing for expanding the apparent viewing angle described above.
A use case of the HMD 2 is viewing an omnidirectional video prepared in advance. For example, by sensing the movement of the user's head and moving the video accordingly, the HMD 2 can provide a video as if the user had entered the virtual world and were looking around. When the user uses a virtual optical system device such as virtual binoculars or a virtual telescope in the virtual world, the HMD 2 may perform the processing for expanding the apparent viewing angle described above. Specifically, when zooming, the HMD 2 may perform the image processing illustrated in FIGS. 7 to 11. Furthermore, the HMD 2 may suppress the user's motion sickness by compositing a unit-magnification video of the virtual world or the real world at the outermost periphery.
<<6. Summary>>
An embodiment of the present disclosure has been described in detail above with reference to FIGS. 1 to 19. As described above, the image processing apparatus 200 according to this embodiment controls the irradiation processing so that the captured image obtained by the sub optical system, on which processing based on the relationship between the angles of view of the subject images obtained by the main optical system and the sub optical system has been performed, is irradiated onto the eyepiece onto which at least part of the subject image obtained by the main optical system is irradiated. Since the apparent viewing angle of the main optical system is expanded in a pseudo manner, the optical system device can be made smaller, lighter, and less expensive than when the apparent viewing angle of the main optical system is expanded physically. In addition, since the video obtained by the sub optical system corresponds to the peripheral visual field, its image quality can be lowered, allowing the optical system device to be configured even more inexpensively.
 以上、添付図面を参照しながら本開示の好適な実施形態について詳細に説明したが、本開示の技術的範囲はかかる例に限定されない。本開示の技術分野における通常の知識を有する者であれば、請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本開示の技術的範囲に属するものと了解される。 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can come up with various changes or modifications within the scope of the technical idea described in the claims. Of course, it is understood that it belongs to the technical scope of the present disclosure.
For example, each apparatus described in this specification may be realized as a single apparatus, or some or all of them may be realized as separate apparatuses. For example, in the functional configuration of the binoculars 1 shown in FIG. 5, the image processing apparatus 200 may be provided in an apparatus such as a server connected to the other components via a network or the like.
Further, the series of processes performed by each apparatus described in this specification may be realized using software, hardware, or a combination of software and hardware. The programs constituting the software are stored in advance, for example, in a storage medium (non-transitory media) provided inside or outside each apparatus. At execution time, each program is read into a RAM and executed by a processor such as a CPU.
The processes described in this specification using flowcharts and sequence diagrams need not necessarily be executed in the illustrated order. Some processing steps may be executed in parallel. Additional processing steps may be adopted, and some processing steps may be omitted.
The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit, in addition to or instead of the above effects, other effects that are apparent to those skilled in the art from the description of this specification.
The following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus including:
an irradiation control unit that controls irradiation processing so that a captured image obtained by a second acquisition unit, on which processing based on a relationship between angles of view of subject images obtained by first and second acquisition units has been performed, is irradiated onto an eyepiece onto which at least part of the subject image obtained by the first acquisition unit is irradiated.
(2)
The information processing apparatus according to (1), wherein the irradiation control unit irradiates the captured image onto a second region at an outer edge of a first region onto which the subject image obtained by the first acquisition unit is irradiated.
(3)
The information processing apparatus according to (2), wherein the captured image is processed to represent the subject image included in the captured image at a lower magnification than the subject image irradiated onto the first region.
(4)
The information processing apparatus according to (3), wherein the captured image is processed so that the angle of view increases non-uniformly outward from a boundary portion between the first region and the second region.
(5)
The information processing apparatus according to any one of (2) to (4), wherein the captured image is processed so that the angles of view match at the boundary portion between the first region and the second region.
(6)
The information processing apparatus according to any one of (2) to (5), wherein the captured image is processed so that the magnification relationship between the subject image irradiated onto the first region and the subject image included in the captured image is maintained.
(7)
The information processing apparatus according to (2), wherein the captured image is processed to represent the subject image included in the captured image at the same magnification as the subject image irradiated onto the first region.
(8)
The information processing apparatus according to any one of (2) to (7), wherein the irradiation control unit controls the sizes of the first region and the second region.
(9)
The information processing apparatus according to any one of (1) to (8), further including an operation mode control unit that switches the content of the irradiation processing.
(10)
The information processing apparatus according to (9), wherein the operation mode control unit performs the switching in accordance with a user operation.
(11)
The information processing apparatus according to (9) or (10), wherein the operation mode control unit performs the switching in accordance with a state of the user.
(12)
The information processing apparatus according to any one of (1) to (11), wherein the irradiation control unit stores the captured image obtained by the first acquisition unit in a storage unit, and irradiates, onto a region of the subject image obtained by the first acquisition unit that is out of the field of view, the captured image corresponding to that region acquired from the storage unit.
(13)
An information processing method executed by a processor, including:
controlling irradiation processing so that a captured image obtained by a second acquisition unit, on which processing based on a relationship between angles of view of subject images obtained by first and second acquisition units has been performed, is irradiated onto an eyepiece onto which at least part of the subject image obtained by the first acquisition unit is irradiated.
(14)
An information processing apparatus including:
a first acquisition unit;
a second acquisition unit;
an eyepiece onto which at least part of the subject image obtained by the first acquisition unit is irradiated; and
an irradiation control unit that controls irradiation processing so that a captured image obtained by the second acquisition unit, on which processing based on a relationship between the angles of view of the subject images obtained by the first and second acquisition units has been performed, is irradiated onto the eyepiece.
DESCRIPTION OF REFERENCE NUMERALS
1 Binoculars
100 Sub optical system camera block
110 Lens
120 Imager
200 Image processing apparatus
210 Irradiation control unit
220 Operation mode control unit
230 Storage unit
300 Main optical system camera block
310 Lens
320 Imager
400 Display block
410 Lens
420 LCD
500 Beam splitter
600 Objective lens
700 Prism
800 Eyepiece
900 Switch
1000 Memory card
1100 Electronic viewfinder
1110 Eyepiece
1120 LCD
2 HMD
2100 Camera block
2110 Lens
2120 Imager
2200 Display element
2300 Eyepiece

Claims (14)

1. An information processing apparatus comprising:
an irradiation control unit that controls irradiation processing so that a captured image obtained by a second acquisition unit, on which processing based on a relationship between angles of view of subject images obtained by first and second acquisition units has been performed, is irradiated onto an eyepiece onto which at least part of the subject image obtained by the first acquisition unit is irradiated.
2. The information processing apparatus according to claim 1, wherein the irradiation control unit irradiates the captured image onto a second region at an outer edge of a first region onto which the subject image obtained by the first acquisition unit is irradiated.
3. The information processing apparatus according to claim 2, wherein the captured image is processed to represent the subject image included in the captured image at a lower magnification than the subject image irradiated onto the first region.
4. The information processing apparatus according to claim 3, wherein the captured image is processed so that the angle of view increases non-uniformly outward from a boundary portion between the first region and the second region.
5. The information processing apparatus according to claim 2, wherein the captured image is processed so that the angles of view match at the boundary portion between the first region and the second region.
6. The information processing apparatus according to claim 2, wherein the captured image is processed so that the magnification relationship between the subject image irradiated onto the first region and the subject image included in the captured image is maintained.
7. The information processing apparatus according to claim 2, wherein the captured image is processed to represent the subject image included in the captured image at the same magnification as the subject image irradiated onto the first region.
8. The information processing apparatus according to claim 2, wherein the irradiation control unit controls the sizes of the first region and the second region.
9. The information processing apparatus according to claim 1, further comprising an operation mode control unit that switches the content of the irradiation processing.
10. The information processing apparatus according to claim 9, wherein the operation mode control unit performs the switching in accordance with a user operation.
11. The information processing apparatus according to claim 9, wherein the operation mode control unit performs the switching in accordance with a state of the user.
12. The information processing apparatus according to claim 1, wherein the irradiation control unit stores the captured image obtained by the first acquisition unit in a storage unit, and irradiates, onto a region of the subject image obtained by the first acquisition unit that is out of the field of view, the captured image corresponding to that region acquired from the storage unit.
13. An information processing method executed by a processor, comprising:
controlling irradiation processing so that a captured image obtained by a second acquisition unit, on which processing based on a relationship between angles of view of subject images obtained by first and second acquisition units has been performed, is irradiated onto an eyepiece onto which at least part of the subject image obtained by the first acquisition unit is irradiated.
14. An information processing apparatus comprising:
a first acquisition unit;
a second acquisition unit;
an eyepiece onto which at least part of the subject image obtained by the first acquisition unit is irradiated; and
an irradiation control unit that controls irradiation processing so that a captured image obtained by the second acquisition unit, on which processing based on a relationship between the angles of view of the subject images obtained by the first and second acquisition units has been performed, is irradiated onto the eyepiece.
PCT/JP2016/050293 2015-03-30 2016-01-07 Information processing device and information processing method WO2016157923A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-069568 2015-03-30
JP2015069568 2015-03-30

Publications (1)

Publication Number Publication Date
WO2016157923A1 true WO2016157923A1 (en) 2016-10-06

Family

ID=57004953

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/050293 WO2016157923A1 (en) 2015-03-30 2016-01-07 Information processing device and information processing method

Country Status (1)

Country Link
WO (1) WO2016157923A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018128483A (en) * 2017-02-06 2018-08-16 鎌倉光機株式会社 Optical observation device
WO2019225354A1 (en) * 2018-05-22 2019-11-28 ソニー株式会社 Information processing device, information processing method, and program
EP4339682A1 (en) * 2022-09-16 2024-03-20 Swarovski-Optik AG & Co KG. Telescope with at least one viewing channel
EP4339683A1 (en) * 2022-09-16 2024-03-20 Swarovski-Optik AG & Co KG. Telescope with at least one viewing channel

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004201104A (en) * 2002-12-19 2004-07-15 Minolta Co Ltd Imaging apparatus
JP2008096584A (en) * 2006-10-10 2008-04-24 Nikon Corp Camera
JP2014003422A (en) * 2012-06-18 2014-01-09 Sony Corp Display control device, imaging device, and display control method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004201104A (en) * 2002-12-19 2004-07-15 Minolta Co Ltd Imaging apparatus
JP2008096584A (en) * 2006-10-10 2008-04-24 Nikon Corp Camera
JP2014003422A (en) * 2012-06-18 2014-01-09 Sony Corp Display control device, imaging device, and display control method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018128483A (en) * 2017-02-06 2018-08-16 鎌倉光機株式会社 Optical observation device
WO2019225354A1 (en) * 2018-05-22 2019-11-28 ソニー株式会社 Information processing device, information processing method, and program
JPWO2019225354A1 (en) * 2018-05-22 2021-07-26 ソニーグループ株式会社 Information processing equipment, information processing methods and programs
EP4339682A1 (en) * 2022-09-16 2024-03-20 Swarovski-Optik AG & Co KG. Telescope with at least one viewing channel
EP4339683A1 (en) * 2022-09-16 2024-03-20 Swarovski-Optik AG & Co KG. Telescope with at least one viewing channel
AT526577A1 (en) * 2022-09-16 2024-04-15 Swarovski Optik Ag & Co Kg Telescope with at least one viewing channel
AT526578A1 (en) * 2022-09-16 2024-04-15 Swarovski Optik Ag & Co Kg Telescope with at least one viewing channel

Similar Documents

Publication Publication Date Title
US11221494B2 (en) Adaptive viewport optical display systems and methods
JP7076447B2 (en) Light field capture and rendering for head-mounted displays
CN106796344B (en) System, arrangement and the method for the enlarged drawing being locked on object of interest
WO2017173735A1 (en) Video see-through-based smart eyeglasses system and see-through method thereof
US20210350762A1 (en) Image processing device and image processing method
JP5409107B2 (en) Display control program, information processing apparatus, display control method, and information processing system
WO2016157923A1 (en) Information processing device and information processing method
CN107037584B (en) Intelligent glasses perspective method and system
JP5695809B1 (en) Display device, display method, and program
JP5484453B2 (en) Optical devices with multiple operating modes
JP6576639B2 (en) Electronic glasses and control method of electronic glasses
JP6419118B2 (en) Image display system
TWI435160B (en) Method for composing three dimensional image with long focal length and three dimensional imaging system
JP2015007722A (en) Image display device
JPH02291787A (en) Wide visual field display device and wide visual field photography device
WO2017081915A1 (en) Image processing device, image processing method and program
JP4727188B2 (en) Superposition type image observation device
US20230244307A1 (en) Visual assistance
JP5281904B2 (en) Viewfinder system and imaging apparatus having the same
JP2016133541A (en) Electronic spectacle and method for controlling the same
JP2023128820A (en) Image processing apparatus, control method thereof, image capturing apparatus, and program
WO2019230115A1 (en) Medical image processing apparatus
JP2006071770A (en) Superimposed type image viewing apparatus
CN113574855A (en) Imaging apparatus, imaging signal processing apparatus, and imaging signal processing method
KR20050091583A (en) The apparatus and method for vergence control of a parallel-axis camera system using auxiliary image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16771809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16771809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP