WO2013132636A1 - Indicator, and close-up lens unit - Google Patents

Indicator, and close-up lens unit

Info

Publication number
WO2013132636A1
Authority
WO
WIPO (PCT)
Prior art keywords
close
lens
subject
image
panel
Application number
PCT/JP2012/056025
Other languages
French (fr)
Japanese (ja)
Inventor
貞雄 安達
隆司 松山
Original Assignee
株式会社洛洛.Com
Application filed by 株式会社洛洛.Com
Priority to PCT/JP2012/056025
Publication of WO2013132636A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0033: Features or image-related aspects of imaging apparatus classified in A61B 5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B 17/56: Accessories
    • G03B 17/565: Optical accessories, e.g. converters for close-up photography, tele-convertors, wide-angle convertors
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 43/00: Testing correct operation of photographic apparatus or parts thereof
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/44: Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B 5/441: Skin evaluation, e.g. for skin disorder diagnosis

Definitions

  • The present invention relates to an index body used mainly with a close-up lens, and to technology for a close-up lens unit.
  • Patent Documents 1 and 2 disclose a close-up lens for photographing human skin or the like in a state of being in contact with or close to the skin, using a camera module mounted on a mobile phone terminal, a portable video game machine, a PC (Personal Computer), or the like.
  • The present invention has been made in view of such problems, and its purpose is to make it possible to reduce the effects of the shooting environment, such as camera function and performance, illumination light, and light source, on an image acquired by a user using a close-up lens.
  • The index body according to one aspect of the present invention is present at a position in contact with or close to the subject when the close-up lens attached to the camera is brought into contact with or close to the subject to photograph the subject, and at least a part of the index body is included in the shooting range of the camera.
  • With the index body, when a user acquires an image using a close-up lens, an indicator portion given a predetermined color, which serves as an index for analysis of the image photographed by the camera, is photographed together with the subject.
  • With the index body, it is therefore possible to analyze the acquired image based on the indicator portion photographed together with the subject. As a result, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on the image acquired by the user using the close-up lens.
  • The index body may further include a mark indicating predetermined information, which is photographed together with the subject and the indicator portion when the close-up lens mounted on the camera is brought into contact with or close to the subject and the subject is photographed.
  • The index body may further include an opening in the portion that is included in the imaging range of the camera when the close-up lens mounted on the camera is brought into contact with or close to the subject and the subject is photographed.
  • the opening may be provided so that the center of the opening is located in the vicinity of the optical axis of the camera.
  • the indicator portion may be provided so as to surround an outer edge of the opening.
  • the opening may have a rotationally symmetric shape.
  • the indicator portion may have a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.
  • a plurality of the marks may be arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.
  • the indicator portion may have a step in the optical axis direction.
  • As another aspect of the present invention, a close-up lens unit that realizes each of the above configurations may be provided.
  • The close-up lens unit according to one aspect of the present invention is used by being attached to a camera, and when attached to the camera, at least a part of the close-up lens unit is in contact with or close to the subject at the time of shooting.
  • According to the present invention, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on the image acquired by the user using the close-up lens.
  • FIG. 1 is a perspective view showing an example of a close-up lens to which a panel according to an embodiment is attached.
  • FIG. 2 is a bottom perspective view showing an example of a close-up lens mounted with the panel according to the embodiment.
  • FIG. 3 is a plan view showing an example of a close-up lens mounted with the panel according to the embodiment.
  • FIG. 4 is a bottom view showing an example of a close-up lens mounted with the panel according to the embodiment.
  • FIG. 5 is a front view showing an example of a close-up lens mounted with the panel according to the embodiment.
  • FIG. 6 is a cross-sectional view taken along line AA of FIG. 3, showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 7 is an exploded perspective view showing an example of a close-up lens mounted with the panel according to the embodiment.
  • FIG. 8 is a bottom exploded perspective view showing an example of a close-up lens mounted with the panel according to the embodiment.
  • FIG. 9 shows an example of using a close-up lens equipped with the panel according to the embodiment.
  • FIG. 10 shows an example of a panel according to the embodiment.
  • FIG. 11 is a perspective view showing another mounting example of the panel according to the embodiment.
  • FIG. 12 is an exploded perspective view illustrating another mounting example of the panel according to the embodiment.
  • FIG. 13 is a cross-sectional view showing another mounting example of the panel according to the embodiment.
  • FIG. 14 shows an example of a panel according to another embodiment.
  • FIG. 15A shows an example of a panel according to another embodiment.
  • FIG. 15B shows an example of a panel according to another embodiment.
  • FIG. 15C shows an example of a panel according to another embodiment.
  • FIG. 15D shows an example of a panel according to another embodiment.
  • FIG. 15E shows an example of a panel according to another embodiment.
  • FIG. 15F shows an example of a panel according to another embodiment.
  • FIG. 16 shows an example of a panel according to another embodiment.
  • FIG. 17A shows an example of a panel according to another embodiment.
  • FIG. 17B shows an example of a panel according to another embodiment.
  • FIG. 17C shows an example of a panel according to another embodiment.
  • FIG. 18 is a perspective view showing an example of a panel according to another embodiment.
  • FIG. 19 is a cross-sectional view showing an example of a close-up lens mounted with a panel according to another embodiment.
  • FIG. 20A shows an example of a panel according to another embodiment.
  • FIG. 20B shows an example of a panel according to another embodiment.
  • FIG. 21 illustrates an outline of a hardware configuration of the information processing system according to the embodiment.
  • FIG. 22 illustrates a schematic hardware configuration of an information processing system according to another embodiment.
  • FIG. 23 illustrates an outline of a functional configuration of the information processing system.
  • FIG. 24 is a flowchart illustrating an example of an information processing procedure performed by the information processing system according to the embodiment.
  • FIG. 25 shows an example of an image acquired in the information processing system according to the embodiment.
  • FIG. 26 shows an example of a screen displayed during shooting in the information processing system according to the embodiment.
  • FIG. 27 is a flowchart illustrating an example of a processing procedure of analysis processing and correction processing by the information processing system according to the embodiment.
  • FIG. 28 illustrates data stored in the information processing system according to the embodiment.
  • Hereinafter, an embodiment according to one aspect of the present invention (hereinafter also referred to as “this embodiment”) will be described with reference to the drawings.
  • The present embodiment described below is merely an example of the present invention in all respects and is not intended to limit its scope. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention. That is, in carrying out the present invention, a specific configuration according to each embodiment may be adopted as appropriate.
  • Although data appearing in this embodiment are described in natural language, more specifically they are specified in a pseudo-language, commands, parameters, machine language, or the like that can be recognized by a computer.
  • §1 Close-up lens: In the following, a scene is illustrated in which a user photographs his or her own skin with a camera module connected to or mounted on a mobile phone terminal, a portable video game machine, a PC (Personal Computer), or the like.
  • the application scene of the present invention is not limited to such an example.
  • the present invention is widely applicable in scenes where close-ups of subjects are taken.
  • In this specification, even a device having another name is referred to as a “camera” if the device has or can have a camera function.
  • FIG. 1 is a perspective view showing an example of a close-up lens 50 to which a panel 40 according to this embodiment is attached.
  • FIG. 2 is a bottom perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 3 is a plan view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 4 is a bottom view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 5 is a front view showing an example of the close-up lens 50 to which the panel 40 according to this embodiment is attached.
  • FIG. 6 is a cross-sectional view taken along the line AA in FIG. 3, showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 7 is an exploded perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 8 is a bottom exploded perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 9 shows an example of using the close-up lens 50 to which the panel 40 according to this embodiment is attached. Note that a rear view, a left side view, and a right side view showing an example of a close-up lens to which the panel 40 according to the present embodiment is attached are each expressed in the same manner as the front view shown in FIG.
  • the close-up lens 50 according to the present embodiment will be described mainly with reference to FIG.
  • The close-up lens 50 according to the present embodiment is used to photograph a magnified image of a subject (skin).
  • The close-up lens 50 according to the present embodiment includes a lens unit 54 that realizes this magnified photographing function, and an outer portion that surrounds the lens unit 54.
  • The outer portion includes a bottom surface portion 51 that faces the camera when the close-up lens 50 is attached to the camera, a groove portion 52 to which a lens fixing clip used for attaching the close-up lens 50 to the camera is attached, and an upper end portion 53 that contacts or is close to the subject when photographing.
  • The bottom surface portion 51 is the portion that comes into contact with the camera when the close-up lens 50 is attached to the camera. Therefore, a removable or repositionable adhesive for attaching the close-up lens 50 to the camera may be applied or affixed to the bottom surface portion 51.
  • the groove portion 52 is provided above the bottom surface portion 51 and around the lens portion 54 as shown in FIGS. 5 and 6.
  • the groove portion 52 is a portion to which a lens fixing clip for attaching the close-up lens 50 to the camera is attached.
  • The lens fixing clip is attached to the close-up lens 50 by clamping the clip onto the groove formed by the groove portion 52.
  • The groove portion 52 is provided so as to extend around the entire circumference of the close-up lens 50.
  • the upper end portion 53 is a portion that comes into contact with or is close to the subject when the subject is photographed with the close-up lens 50 attached to the camera. As shown in FIGS. 6 and 7, a recess having substantially the same shape and size as the outer periphery of the panel 40 is formed inside the upper end portion 53 in order to fit a circular panel 40 described later.
  • the lens unit 54 has a convex lens shape to magnify a subject.
  • The lens unit 54 is designed to suit the subject to be photographed. For example, when the skin of the hand or face is the main photographing target, the lens unit 54 is designed so that the magnification of the lens is, as an example, about 30 times. When the scalp is the main imaging target, the lens unit 54 is designed so that the magnification of the lens is, as an example, about 50 to 80 times.
  • the lens portion 54 is formed integrally with the outer portion of the close-up lens 50.
  • the user attaches such a close-up lens 50 to the camera and shoots his / her skin as shown in FIG.
  • The subject illustrated in FIG. 9 is the user's own skin (the skin on the back of the hand).
  • However, the subject is not particularly limited and may be other than the user's own skin.
  • the close-up lens 50 is made of a translucent member, for example.
  • the close-up lens 50 is made of a transparent plastic.
  • the close-up lens 50 is made of a translucent member, so that ambient light can be taken in as illumination light even when photographing with the upper end 53 in contact with the subject.
  • a panel 40 is attached to the close-up lens 50 of the present embodiment.
  • the panel 40 corresponds to an embodiment of the index body of the present invention.
  • the index body of the present invention is not limited to the close-up lens panel exemplified by the panel 40.
  • FIG. 10 shows an example of the panel 40 according to the present embodiment.
  • The panel 40 is present at a position in contact with or close to the subject when the close-up lens 50 attached to the camera is brought into contact with or close to the subject to photograph the subject, and at least a part of the panel 40 is included in the shooting range 70 of the camera.
  • an index portion 41 with a predetermined color as an index for analysis of an image photographed by the camera is photographed together with the subject.
  • the acquired image can be analyzed based on the index unit 41.
  • The shooting range 70 shown in FIG. 10 and the subsequent figures indicates, for each panel, the range on the panel that is photographed by the camera when the panel is mounted on the close-up lens 50 and the close-up lens 50 is mounted on a camera (a user terminal 2 described later).
  • the panel 40 is attached to the close-up lens 50 by being fitted at the upper end portion 53.
  • the upper end portion 53 is a portion that is in contact with or close to the subject at the time of shooting. Therefore, the panel 40 is present at a position in contact with or close to the subject at the time of shooting.
  • an index portion 41 described later is provided on the surface of the panel 40 on the camera side. Therefore, an index unit 41 to be described later is also present at a position in contact with or close to the subject at the time of shooting.
  • However, the mounting method of the panel 40 according to this embodiment is not limited to such a method.
  • the panel 40 may be bonded to the upper end portion 53 of the close-up lens 50.
  • the panel 40 may be attached to the close-up lens 50 by fitting the upper end portion 53 of the close-up lens 50 to the panel 40 as a cap of the close-up lens 50.
  • FIG. 11 is a perspective view showing another mounting example of the panel 40 according to the present embodiment.
  • FIG. 12 is an exploded perspective view showing another mounting example of the panel 40 according to the present embodiment.
  • FIG. 13 is a cross-sectional view showing another mounting example of the panel 40 according to the present embodiment. As shown in FIGS. 11 to 13, for example, the panel 40 may be attached to the close-up lens 50 by being sandwiched between the close-up lens 50 and the cap 60 when the cap 60 is attached to the close-up lens 50.
  • the shape of the panel 40 is a circle.
  • the shape of the panel 40 may be other than circular.
  • the shape of the panel 40 may be a polygon.
  • In the present embodiment, the panel 40 is fitted into a recess provided in the upper end portion 53 of the close-up lens 50. The recess is therefore shaped to match the outer periphery of the panel 40.
  • FIG. 14 shows an example of variations in the shape of the panel 40.
  • the panel 40A shown in FIG. 14 has a semicircular shape.
  • the panel 40A includes an indicator portion 41A that is a variation of the indicator portion 41 of the panel 40 according to the present embodiment.
  • With the panel 40A, the index portion 41A appears in the right half of the photographing range 70, and the subject appears in the left half of the photographed image.
  • the shape of the indicator portion 41 according to the present embodiment is a donut shape in which the outer periphery and the inner periphery of the indicator portion 41 are circular. As shown in FIG. 10, the indicator portion 41 is formed on the entire surface of the panel 40 except for the region of the opening 43.
  • the index unit 41 only needs to be photographed together with the subject, and the shape of the index unit 41 is not limited to the shape shown in FIG. The shape of the index part 41 is appropriately selected.
  • FIGS. 15A to 15F illustrate variations in the shape of the indicator portion 41, respectively.
  • In the drawings, a reference symbol is attached to each mark, and marks of the same type are given the same symbol.
  • When referring to a mark regardless of its type, it is simply referred to as a mark 44.
  • As shown in FIG. 15A, the indicator portion 41B of the panel 40B has a donut shape like the indicator portion 41 but, unlike the indicator portion 41, is not formed on the entire surface of the panel 40B. Further, unlike the indicator portion 41, the indicator portion 41B is given a single color.
  • the shape of the outer periphery of the indicator portion 41C of the panel 40C shown in FIG. 15B is a regular polygon (regular dodecagon in the figure) unlike the indicator portion 41. Further, unlike the indicator unit 41, the indicator unit 41C has a color pattern in which four colors are arranged.
  • the panel 40C has four types of marks (44a to 44d).
  • the shape of the outer periphery of the indicator portion 41D of the panel 40D shown in FIG. 15C is the same as the shape of the outer periphery of the indicator portion 41C.
  • the index part 41D has a larger area than the index part 41C.
  • the index part 41D has a color pattern in which two colors are arranged, like the index part 41.
  • the shape of the outer periphery of the indicator portion 41E of the panel 40E shown in FIG. 15D is the same as the shape of the indicator portion 41.
  • the indicator portion 41E does not cover the entire surface of the panel 40E.
  • the index part 41E has a color pattern in which four colors are arranged, like the index part 41C. Note that a mark 44 described later may not be present in all color pattern regions, as shown on the panel 40E.
  • the shape of the outer periphery of the indicator portion 41F of the panel 40F shown in FIG. 15E is the same as the shape of the outer periphery of the indicator portion 41D.
  • the index part 41F has the same area as the index part 41D.
  • the indicator portion 41F has a color pattern in which four colors are arranged.
  • The indicator portion 41 need not necessarily be present in order to realize the information processing using the mark 44 described later. Therefore, as in the panel 40G shown in FIG. 15F, the indicator portion may be omitted from the viewpoint of realizing information processing using the mark 44.
  • the indicator unit 41 according to the present embodiment has a color pattern in which white and black are alternately arranged.
  • the predetermined color or colors assigned to the index unit 41 may be any color as long as it is an index for analysis of an image captured by the camera, and may be selected as appropriate.
  • the indicator portion 41 may be formed in a single color. Further, for example, the indicator unit 41 may be colored in cyan, magenta, yellow, black, and white. For example, the indicator unit 41 may be colored in red, green, blue, black, and white. As described above, the predetermined one or a plurality of colors attached to the indicator unit 41 may be appropriately selected.
  • FIG. 16 illustrates a variation of the color scheme of the indicator unit 41.
  • the region of the opening 43 is a region where the subject is captured.
  • On the panel 40H shown in FIG. 16, the area adjacent to the area where the subject is photographed (index part 41G) is colored so that the color gradually changes clockwise from position B within the range of colors that the subject can take.
  • the indicator unit 41 may be arranged in this way. The effects obtained by such color arrangement can be described as follows.
  • On the index unit 41G, the colors of the range that the subject can take are arranged stepwise. Therefore, it is possible to specify, on the acquired image, the position (angle) of the index unit 41G that is given the same color as the subject.
  • As an effect of a color arrangement like that of the indicator portion 41G, such an identifiable position can be used as a reference for the color of the subject.
  • the color of the subject on the photographed image may be different from the actual color of the subject due to the influence of the photographing environment such as the function and performance of the camera, illumination light, and light source.
  • Since the index unit 41G is disposed at a position adjacent to the subject, it is affected by the shooting environment in substantially the same manner as the subject. Therefore, on the captured image, it can be estimated that the actual color of the index unit 41G at the position (angle) given the same color as the subject is almost the same as the actual color of the subject. The actual color of the subject can thus be estimated by analyzing the photographed image and specifying the position in the index portion 41G that has the same color as the subject, as sketched below. In order to obtain such an effect, the indicator section 41 may be colored like the indicator section 41G shown in FIG. 16.
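  • The following is a minimal sketch in Python of this estimation (illustrative only; the function names and the nearest-color matching over a discretized ring are assumptions, not the patent's algorithm):

```python
import numpy as np

def estimate_subject_color(ring_photographed, ring_reference, subject_photographed):
    """Estimate the actual subject color from a gradient ring such as 41G.

    ring_photographed: (N, 3) ring color per sampled angle, as captured.
    ring_reference:    (N, 3) actual printed ring color per sampled angle.
    subject_photographed: (3,) subject color as captured.
    """
    # Find the ring angle whose captured color best matches the captured
    # subject color; that part of the ring was lit and imaged like the subject.
    distances = np.linalg.norm(ring_photographed - subject_photographed, axis=1)
    best = int(np.argmin(distances))
    # The printed (reference) color at that angle approximates the
    # subject's actual color.
    return ring_reference[best]
```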
  • the indicator 41 is provided not on the subject side surface of the panel 40 but on the camera side surface.
  • the indicator 41 is provided by printing on the camera side surface of the panel 40 by oil-based offset printing, silk printing, or the like.
  • the index part 41 may be printed on the subject side of the panel 40.
  • If the panel 40 is made of a translucent member such as transparent plastic, the subject can be photographed even if the index portion 41 is printed on the subject side of the panel 40.
  • the index portion 41 may be provided by sticking a sticker on which a picture of the index portion 41 is printed on the surface of the panel 40 on the camera side.
  • the sticker may be attached to the subject side of the panel 40 as in the case of the printing. This also applies to the mark 44 described later.
  • the subject side surface of the panel 40 that cannot be photographed from the camera may be used arbitrarily.
  • The difference between the color of the index portion 41 on the acquired image and the actual color of the index portion 41 is used in the color correction and illumination unevenness correction of the image, which will be described later.
  • In these corrections, a two-dimensional distribution of this difference, obtained at the positions where the index portion 41 appears on the image, is used (see the sketch below). Therefore, in order to increase the accuracy of the color correction and illumination unevenness correction described later, it is preferable that the index unit 41 exists in all directions around the area where the subject appears in the shooting range 70 of the camera.
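  • A minimal sketch of such a correction in Python (illustrative assumptions: the sampling of index colors at known pixel positions, linear interpolation into a dense two-dimensional map via SciPy, and a simple subtractive model):

```python
import numpy as np
from scipy.interpolate import griddata

def correct_illumination(image, index_positions, index_measured, index_actual):
    """image: (H, W, 3) float RGB array.
    index_positions: (M, 2) (y, x) pixel positions sampled on the index part 41.
    index_measured:  (M, 3) index colors as they appear on the image.
    index_actual:    (M, 3) known printed colors of the index part.
    """
    h, w, _ = image.shape
    # Per-sample color error caused by the camera, illumination and light source.
    errors = index_measured.astype(np.float64) - index_actual
    yy, xx = np.mgrid[0:h, 0:w]
    error_map = np.zeros((h, w, 3))
    for c in range(3):  # interpolate the sparse errors into a dense 2-D map
        lin = griddata(index_positions, errors[:, c], (yy, xx), method="linear")
        near = griddata(index_positions, errors[:, c], (yy, xx), method="nearest")
        # Fall back to nearest-neighbour values outside the convex hull.
        error_map[..., c] = np.where(np.isnan(lin), near, lin)
    return np.clip(image - error_map, 0.0, 255.0)
```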
  • the panel 40 includes a mark 44 indicating predetermined information.
  • the mark 44 is photographed together with the subject and the index unit 41 by being included in the photographing range 70 of the camera when photographing the subject with the close-up lens 50 attached to the camera being in contact with or close to the subject.
  • Like the index portion 41, the mark 44 may be attached to the camera-side surface of the panel 40.
  • The predetermined information indicated by the mark 44 is, for example, information for identifying that the close-up lens 50 is a genuine lens.
  • The predetermined information indicated by the mark 44 may also be information for identifying the distributor who distributed the close-up lens 50 together with the panel 40.
  • The predetermined information indicated by the mark 44 may also be information on a product sold together with the panel 40 and the close-up lens 50.
  • When the mark 44 exists in the acquired image, information processing such as identifying the acquired image based on the predetermined information indicated by the mark 44 becomes possible. That is, the mark 44 may be handled as an identifier for performing predetermined identification.
  • the mark 44 may be, for example, a barcode that represents information in accordance with a predetermined standard.
  • the mark 44 may be a trademark of a distributor, a product manufacturer, a target product, etc., for example.
  • the mark 44 may be set as appropriate.
  • the panel 40 may have a plurality of types of information by including a plurality of types of marks 44.
  • the panel 40 according to the present embodiment includes two types of marks, a mark 44a and a mark 44b.
  • the panel 40C shown in FIG. 15B includes four types of marks 44a to 44d.
  • The mark 44 need not necessarily be present in order to realize the image analysis using the index unit 41. Accordingly, as illustrated in FIG. 10 and elsewhere, the mark 44 may not be present from the viewpoint of realizing image analysis using the index unit 41.
  • the panel 40 according to the present embodiment shown in FIG. 10 has an opening 43 in a portion included in the photographing range 70 of the camera when the close-up lens 50 attached to the camera is brought into contact with or close to the subject and the subject is photographed.
  • The opening 43 is created, for example, by punching a hole in the panel 40 by a processing method such as Thomson processing (die cutting).
  • the region of the opening 43 in the shooting range 70 of the camera is a region where the subject can be seen from the camera side, in other words, a region where the subject is captured. That is, the opening 43 may not be provided in the panel 40 as long as the subject can be photographed.
  • Since the panel 40 according to the present embodiment is made of a translucent member such as transparent plastic, the subject can be photographed through the panel; thus the opening 43 need not necessarily be provided.
  • For example, a panel that is neither provided with the opening 43 nor made of a translucent member is also acceptable, as long as the subject can be photographed, as with the panel 40A shown in FIG. 14.
  • the opening 43 should be provided so as to include the optical axis.
  • the center of the opening 43 is provided in the vicinity of the optical axis of the camera (close-up lens 50).
  • the center of the opening 43 is provided so as to be located on the optical axis.
  • the indicator portion 41 may be provided so as to surround the outer edge of the opening 43 provided so that the center is located near or on the optical axis. As a result, even if the panel 40 is mounted so as to be shifted vertically and horizontally, the index portion 41 exists around the opening 43, so that the index portion 41 can be included in the shooting range 70 of the camera.
  • The opening 43 may have a rotationally symmetric shape. Thereby, even if the panel 40 rotates around the optical axis, the influence on the captured image can be reduced or ignored.
  • the indicator section 41 may have a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening 43.
  • a plurality of marks 44 may be arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening 43.
  • The predetermined regularity is a regularity corresponding to the rotational symmetry of the opening 43.
  • For example, the predetermined regularity corresponds to the rotational symmetry of the opening 43 in that there exists an angle p (0 < p < 360), other than a multiple of 360 degrees, such that the state of the index portion 41 or the mark 44 included in the imaging range 70 does not change before and after rotation by p.
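  • Stated concretely for a pattern divided into equal angular sectors, the regularity amounts to the existence of such an angle p; the following sketch (illustrative, not from the patent) lists the nontrivial rotation angles that leave a sector pattern unchanged:

```python
def rotational_periods(sectors):
    """Nontrivial rotation angles (degrees) that leave a color pattern unchanged.

    sectors lists the colors of equal angular sectors, e.g.
    ["white", "black"] * 6 for an alternating pattern like that of panel 40.
    """
    n = len(sectors)
    return [360 * k / n for k in range(1, n)
            if all(sectors[i] == sectors[(i + k) % n] for i in range(n))]

# The alternating 12-sector pattern is unchanged by any multiple of 60 degrees:
print(rotational_periods(["white", "black"] * 6))
# -> [60.0, 120.0, 180.0, 240.0, 300.0]
```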
  • the shape of the opening 43 according to the present embodiment is a circle.
  • the shape of the opening 43 is not limited to a circle.
  • the shape of the opening 43 is appropriately selected.
  • the shape of the opening 43 includes a polygon, an ellipse, a star, and the like.
  • FIGS. 17A to 17C illustrate variations of the opening 43.
  • the shape of the opening 43A of the panel 40I shown in FIG. 17A is a regular polygon as a shape having rotational symmetry.
  • the indicator portion 41H of the panel 40I is provided on the entire surface of the panel 40I.
  • the indicator portion 41I of the panel 40J shown in FIG. 17B has a shape that protrudes toward the center of the opening 43B.
  • The indicator portion 41J of the panel 40K shown in FIG. 17C has a shape that protrudes toward the center of the opening 43B, and is given a two-color scheme of white and black.
  • the indicator portion 41J is provided with marks 44a and 44b.
  • the user brings the panel 40 (close-up lens 50) into contact with the subject, for example.
  • the subject may partially enter the close-up lens 50 through the opening 43.
  • the index part 41 and the mark 44 are attached to the surface of the panel 40 on the camera side. Therefore, when the deformation of the subject is not taken into consideration, the index portion 41 and the mark 44 are positioned closer to the lens than the subject by the thickness of the panel 40 in the optical axis direction.
  • the index part 41 and the mark 44 exist at positions different from the subject in the optical axis direction. Therefore, when focusing on the subject, there is a possibility that the index unit 41 and the mark 44 may not be focused due to a difference in position in the optical axis direction.
  • the indicator portion 41 may have one or a plurality of steps in the optical axis direction.
  • the mark 44 may be attached to a surface perpendicular to the optical axis direction of each step formed in the panel 40.
  • FIGS. 18 and 19 show variations of the panel 40.
  • FIG. 18 is a perspective view showing an example of a panel 40L in which one step is provided in the indicator portion 41K.
  • FIG. 19 is a cross-sectional view showing an example of a panel 40L in which one step is provided in the indicator portion 41K.
  • a panel 40M shown in FIG. 20A is a modification of the panel 40B shown in FIG. 15A.
  • the panel 40M has one step at the indicator portion 41L.
  • The panel 40N shown in FIG. 20B is a modification of the panel 40C shown in FIG. 15B.
  • the panel 40N has one step at the indicator portion 41M.
  • the panel 40 may be formed integrally with the close-up lens 50.
  • a unit in which the panel 40 and the close-up lens 50 are integrated corresponds to an embodiment of the close-up lens unit of the present invention.
  • the index body of the present invention may be a sticker that is attached to a subject and that exhibits the same function as the panel 40.
  • Such a sticker can be described as a seal that is affixed to the subject.
  • For example, the user affixes to the subject a sticker on which the pattern of the surface of the panel 40 is printed, and brings the close-up lens 50 into contact with the subject on the sticker to photograph it. Thereby, the photographed image can obtain the same effect as an image photographed using the panel 40.
  • Such a sticker is used mainly by being stuck to a subject. Therefore, instead of the opening 43 present in the center of the seal, for example, a transparent sheet that reacts with skin oil may be provided. Thus, the user can check the oil content of the skin when photographing the skin using the close-up lens.
  • In the information processing system according to the present embodiment, when a close-up lens mounted on a camera is brought into contact with or close to a subject and the subject is photographed, a mark indicating predetermined information and existing at a position in contact with or close to the subject is included in the imaging range of the camera, and an image photographed together with the subject is thereby acquired. Then, the information processing system according to the present embodiment extracts the mark from the acquired image and stores the image in association with the information indicated by the extracted mark.
  • The information processing system according to the present embodiment operates as described above and stores an image photographed using the close-up lens in association with the information indicated by the mark. Therefore, the stored image can be identified based on the information indicated by the mark. As a result, according to the present embodiment, images captured and collected using the close-up lens can be used efficiently; a minimal storage sketch follows.
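  • As a minimal sketch of this association (illustrative only: the patent does not specify a storage backend, so SQLite and every name here are assumptions):

```python
import sqlite3

def store_image(db_path, image_bytes, mark_info):
    """Persist a captured image keyed by the information the extracted
    mark 44 indicates (e.g. a distributor or product identifier)."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS images (
                       id INTEGER PRIMARY KEY,
                       mark_info TEXT,
                       data BLOB)""")
    con.execute("INSERT INTO images (mark_info, data) VALUES (?, ?)",
                (mark_info, image_bytes))
    con.commit()
    con.close()

def images_for(db_path, mark_info):
    """Retrieve every stored image that carries the given mark information."""
    con = sqlite3.connect(db_path)
    rows = con.execute("SELECT data FROM images WHERE mark_info = ?",
                       (mark_info,)).fetchall()
    con.close()
    return [r[0] for r in rows]
```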
  • FIG. 21 illustrates a hardware configuration of the information processing system 1 according to the present embodiment.
  • An information processing system 1 according to the present embodiment includes a user terminal 2 and a server 3 connected via a network 5.
  • Information transmission between the user terminal 2 and the server 3 connected to the network 5 is realized, for example, by data communication via the network 5, such as a 3G (3rd Generation) network, the Internet, a telephone network, or a dedicated network.
  • the type of the network 5 is appropriately selected according to each data communication.
  • The user terminal 2 is an information processing apparatus including a control unit 21 including a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like; an auxiliary storage device 22 that stores programs executed by the control unit 21; a network interface 23 for performing communication via the network 5; an input device 24 including a camera module 26; and an output device 25 including a display 27, an LED, a speaker, and the like. In FIG. 21 and FIG. 22 described later, the network interface is denoted as “NW I/F”.
  • Regarding the specific hardware configuration of the user terminal 2, components may be appropriately omitted, replaced, or added according to the embodiment.
  • the user terminal 2 may be further connected with a mouse, a keyboard, or the like as an input device.
  • the control unit 21 may include a plurality of processors.
  • As the user terminal 2, for example, a PC, a mobile phone, a smartphone, a tablet terminal, a portable game machine, or the like may be used, in addition to a terminal designed exclusively for the provided service.
  • the user photographs the subject using the camera module 26 provided in the user terminal 2.
  • the user attaches the close-up lens 50 attached with the panel 40 to the camera module 26 of the user terminal 2.
  • the user performs photographing in a state where the close-up lens 50 is in contact with or close to his / her skin.
  • an image obtained by enlarging the skin, the index unit 41, and the mark 44 is acquired by shooting in this way.
  • The server 3 is an information processing apparatus including a control unit 31 including a CPU, a RAM, a ROM, and the like; an auxiliary storage device 32 that stores programs executed by the control unit 31; and a network interface 33 for performing communication via the network 5. In the present embodiment, images taken by the user are stored in the server 3.
  • The server 3 may be implemented by one or a plurality of information processing apparatuses.
  • the information processing system 1 is not limited to the example realized by the two information processing apparatuses illustrated in FIG.
  • the information processing system 1 may be realized by one information processing apparatus or may be realized by three or more information processing apparatuses.
  • Regarding the hardware configuration of the information processing system 1, it is possible to omit, replace, and add components according to the embodiment.
  • FIG. 22 illustrates a variation of the hardware configuration of the information processing system 1 according to the present embodiment.
  • An information processing system 1A shown in FIG. 22 is realized by a single information processing apparatus.
  • The information processing system 1A is an information processing apparatus including a control unit 10 including a CPU, RAM, ROM, and the like; an auxiliary storage device 11 that stores programs executed by the control unit 10; a network interface 12 for performing communication via a network; an input device 13 including a camera module 26; and an output device 14 including a display, an LED, a speaker, and the like.
  • Regarding the specific hardware configuration of this information processing apparatus, as with the user terminal 2 and the server 3, components may be appropriately omitted, replaced, or added according to the embodiment.
  • FIG. 23 illustrates an outline of a functional configuration of the information processing system 1 according to the present embodiment.
  • the information processing system 1 according to the present embodiment includes an image acquisition unit 15, an image processing unit 16, a storage unit 17, and a display control unit 18.
  • each function may be realized in either the user terminal 2 or the server 3.
  • the display control unit 18 may be realized in the server 3.
  • the image processing unit 16 may be realized in the user terminal 2.
  • one function may be realized across a plurality of information processing apparatuses.
  • the image processing unit 16 may be realized across the user terminal 2 and the server 3.
  • the user terminal 2 and the server 3 realize the image processing unit 16 by sharing a plurality of processes executed by the image processing unit 16 described later. That is, a plurality of information processing apparatuses may realize the one function by sharing a plurality of processes in one function.
  • the plurality of information processing apparatuses cooperate with each other to realize the function of the information processing system 1.
  • data communication for information exchange related to the cooperation is performed between the plurality of information processing apparatuses.
  • the user terminal 2 and the server 3 perform data communication via the network 5 to cooperate with each other to realize the function of the information processing system 1 of the present embodiment.
  • the functional configuration shown in FIG. 23 is only an example of the information processing system 1 according to the present embodiment. Therefore, the functions of the information processing system 1 may be appropriately omitted, replaced, and added according to the embodiment.
  • the display control unit 18 may be omitted when image display or the like at the time of shooting, which will be described later, is omitted.
  • When the close-up lens 50 mounted on the camera module 26 of the user terminal 2 is brought into contact with or close to the subject and the subject is photographed, the image acquisition unit 15 acquires an image in which the mark 44, which indicates predetermined information and exists at a position in contact with or close to the subject, is included in the shooting range of the camera module 26 and photographed together with the subject.
  • the image processing unit 16 extracts the mark 44 from the image acquired by the image acquisition unit 15. Then, the storage unit 17 stores the image in association with the information indicated by the mark 44 extracted from the image.
  • The image acquisition unit 15 may also acquire an image in which the index unit 41, which exists at a position in contact with or close to the subject and is given a predetermined color serving as an index for analyzing an image captured by the camera module 26, is included in the imaging range of the camera module 26 and photographed together with the subject and the mark 44.
  • the image processing unit 16 may extract the index unit 41 from the image acquired by the image acquisition unit 15. Further, the image processing unit 16 may execute an analysis process on the acquired image based on the predetermined color of the extracted index unit 41. Further, the image processing unit 16 may perform a correction process on the acquired image based on the result of the analysis process of the image. Then, the storage unit 17 may store the image that has been subjected to the correction processing by the image processing unit 16.
  • the image processing unit 16 may perform the subject analysis processing on the portion related to the subject of the image acquired by the image acquisition unit 15. Then, the storage unit 17 may store the result of the subject analysis process executed by the image processing unit 16 in association with the image acquired by the image acquisition unit 15.
  • the image processing unit 16 may instruct the image acquisition unit 15 to acquire an image again when the subject analysis processing fails.
  • the image acquisition unit 15 may acquire an image including a mark indicating a position where the close-up lens 50 is attached in the camera module 26 of the user terminal 2.
  • the image processing unit 16 may determine whether or not the mark is included in a predetermined area on the image acquired by the image acquisition unit 15. If the image processing unit 16 determines that the mark is not included in the predetermined area on the acquired image, the image processing unit 16 may notify that the mounting position of the close-up lens 50 is incorrect.
  • The display control unit 18 may output an image that is acquired by the camera module 26 while the camera module 26 of the user terminal 2 is capturing the subject and that includes a mark indicating the position where the close-up lens 50 is mounted on the camera module 26, and may output, on the output image, guidance information indicating a reference for the mounting position of the close-up lens 50 relative to the mark.
  • FIG. 24 shows an example of the processing procedure of the information processing system 1 according to the present embodiment.
  • In the following, “step” is abbreviated as “S”.
  • First, the program stored in the auxiliary storage device 22 is loaded into the RAM or the like of the control unit 21. Then, the program loaded into the RAM or the like of the control unit 21 is executed by the CPU of the control unit 21. In this way, the information processing system 1 starts processing.
  • In step 100, when the close-up lens 50 attached to the camera module 26 of the user terminal 2 is brought into contact with or close to the subject and the subject is photographed, the image acquisition unit 15 acquires an image in which the mark 44 indicating the predetermined information and existing at a position in contact with or close to the subject is included in the shooting range of the camera module 26 and photographed together with the subject.
  • the user attaches the close-up lens 50 with the panel 40 attached to the camera module 26 of the user terminal 2. Then, the user operates the user terminal 2 with the close-up lens 50 in contact with or close to his / her skin.
  • At this time, the control unit 21 controls shooting by the camera module 26 and acquires the image related to the shooting from the camera module 26. Thereby, the information processing system 1 acquires an image in which the marks (44a, 44b) are photographed together with the subject (skin).
  • FIG. 25 illustrates an image 80 captured by the camera module 26 of the user terminal 2.
  • the region 81 corresponds to the region of the opening 43 of the panel 40 and is a region where the subject (skin) is captured.
  • An area 82 corresponds to an area where the index portion 41 and the mark 44 of the panel 40 exist, and is an area where the index portion 41 and the mark 44 are reflected.
  • The boundary 83 is the boundary between the region 81 and the region 82, and corresponds to the boundary between the opening 43 and the indicator portion 41.
  • the image acquisition unit 15 may acquire an image in which the index unit 41 is photographed together with the subject and the mark 44.
  • However, the image acquisition unit 15 may acquire an image that does not include the index part 41.
  • The information processing system 1 may process such an image that does not include the index unit 41.
  • the boundary 83 corresponds to the boundary between the opening 43 and the index portion 41, that is, the edge that defines the opening 43.
  • the panel 40 provided with the opening 43 is mounted so as to be fitted into the upper end portion 53 of the close-up lens 50. Therefore, when viewed from the camera module 26 of the user terminal 2, the position of the edge that defines the opening 43 corresponds to the position where the close-up lens 50 to which the panel 40 is mounted is present. That is, the position of the edge that defines the opening 43 is determined according to the position where the close-up lens 50 exists, and further, the position of the boundary 83 on the acquired image is determined. Therefore, the boundary 83 serves as a mark indicating the position where the close-up lens 50 is attached in the camera module 26 of the user terminal 2.
  • the image acquisition unit 15 acquires an image including a mark indicating a position where the close-up lens 50 is mounted in the camera module 26 of the user terminal 2.
  • However, the mark is not limited to the boundary 83 illustrated in this embodiment.
  • the mark only needs to indicate the positional relationship between the camera module 26 of the user terminal 2 and the close-up lens 50.
  • For example, an edge in the color pattern of the index unit 41, the mark 44, or the like may be used as such a mark.
  • Alternatively, a predetermined pattern or the like provided at a position on the close-up lens 50 that is included in the shooting range of the camera module 26 may be used.
  • the image processing unit 16 may determine whether or not the boundary 83 is included in a predetermined area on the acquired image. When the image processing unit 16 determines that the boundary 83 is not included in the predetermined region on the acquired image, the image processing unit 16 may notify that the close-up lens 50 is mounted in the wrong position.
  • an area 84 shown in FIG. 25 is an example of the predetermined area. The predetermined area may be arbitrarily set.
  • The control unit 21 may extract the boundary 83 from the image acquired by the camera module 26 by edge extraction or the like, and may determine whether or not the extracted boundary 83 is included in the region 84.
  • If the control unit 21 determines that the boundary 83 is not included in the region 84, it may operate the speaker included in the output device 25 to output a sound notifying the user that the close-up lens 50 is mounted in the wrong position. In addition, an image informing the user that the close-up lens 50 is mounted in the wrong position may be displayed on the display 27.
  • The control unit 21 may also control the output device 25 to perform audio output or screen output instructing the user on the mounting position of the close-up lens 50, thereby reducing the difficulty of adjusting the mounting position of the close-up lens 50.
  • On the other hand, if the control unit 21 determines that the boundary 83 is included in the region 84, it may operate the speaker included in the output device 25 to output a sound notifying the user that the position of the close-up lens 50 is correct. Further, a screen informing the user that the close-up lens 50 is mounted correctly may be displayed on the display 27. Thus, the user can recognize that the close-up lens 50 is mounted correctly.
  • the process related to determining whether or not the boundary 83 is included in the region 84 may be executed in an analysis process in step 200 described later. Further, the server 3 may execute a process related to a determination as to whether or not the boundary 83 is included in the region 84. For example, the user terminal 2 may transmit the acquired image to the server 3. If the server 3 determines that the boundary 83 is not included in the region 84, the server 3 may notify the user terminal 2 that the close-up lens 50 is mounted incorrectly. The user terminal 2 may execute the above-described audio output, screen output, and the like by receiving the notification.
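  • A minimal sketch of this determination in Python, modelling the region 84 as an annulus around the image center (a simplifying assumption; the patent leaves the shape of the predetermined region open):

```python
import numpy as np

def boundary_within_region(boundary_pixels, center, r_inner, r_outer):
    """boundary_pixels: (K, 2) (y, x) points on the extracted boundary 83.

    Returns True when every boundary point lies inside the annular
    region 84, i.e. the close-up lens 50 appears correctly mounted.
    """
    d = np.linalg.norm(boundary_pixels - np.asarray(center, dtype=float), axis=1)
    return bool(np.all((r_inner <= d) & (d <= r_outer)))

# On False, the terminal would notify the user by sound or on the display 27
# that the close-up lens 50 is mounted in the wrong position.
```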
  • Further, the display control unit 18 may output to the display 27 an image that is acquired by the camera module 26 while the camera module 26 of the user terminal 2 is capturing the subject and that includes the mark (boundary 83) indicating the position where the close-up lens 50 is mounted, and may output, on the output image, guidance information indicating a reference for the mounting position of the close-up lens 50 relative to the mark (boundary 83).
  • FIG. 26 illustrates a screen display of the display 27 according to the present embodiment.
  • the control unit 21 outputs images continuously acquired by the camera module 26 during shooting to the display 27.
  • the control unit 21 outputs a guide line 91 to the display 27 as guide information indicating the reference of the mounting position of the close-up lens 50 with respect to the boundary 83 on the output image.
  • Thereby, a guide line 91 indicating the reference for the boundary 83 is displayed together with the image, as shown in FIG. 26.
  • the user can mount the close-up lens 50 at the correct mounting position by determining the mounting position of the close-up lens 50 so that the imaged boundary 83 is aligned with the guide line 91. That is, by displaying such a guide line 91, the difficulty associated with the adjustment of the mounting position of the close-up lens 50 can be reduced.
  • the shape of the guide line 91 that is guide information indicating the reference of the mounting position of the close-up lens 50 corresponds to the shape of the boundary 83 that is a mark indicating the position where the close-up lens 50 is mounted.
  • the guide line 91 is not limited to a shape corresponding to the shape of the boundary 83. The shape of the guide line 91 may be selected as appropriate.
  • The guide information according to the present embodiment is not limited to a sign, such as the guide line 91, indicating a position to be aligned with the mark (boundary 83) indicating the position where the close-up lens 50 is mounted.
  • For example, the guidance information may be a sign indicating the direction in which the close-up lens 50 should be moved, or a character display indicating the mounting position of the close-up lens 50. The guide information is set as appropriate according to the embodiment.
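  • A minimal sketch of rendering such a guide line on a preview frame, using Pillow for illustration (the circular guide shape, color, and names are assumptions):

```python
from PIL import Image, ImageDraw

def draw_guide_line(frame, center, radius, color=(0, 255, 0), width=3):
    """Overlay a circular guide line 91 on a preview frame so the user can
    align the photographed boundary 83 with it."""
    cx, cy = center
    ImageDraw.Draw(frame).ellipse(
        [cx - radius, cy - radius, cx + radius, cy + radius],
        outline=color, width=width)
    return frame

preview = Image.new("RGB", (640, 640))
draw_guide_line(preview, center=(320, 320), radius=200)
```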
  • Since the close-up lens 50 is a lens used for magnified photographing, even a small shift in the mounting position has a great influence on the photographed image.
  • In addition, when the mounting position of the close-up lens 50 is not fixed with respect to the camera module 26, the user must adjust the mounting position of the close-up lens 50 so that such a small shift does not occur, which could be difficult.
  • the guide information reduces the difficulty in such a situation by indicating a reference of the mounting position of the close-up lens 50 with respect to a mark indicating the position where the close-up lens 50 is mounted.
  • an image taken by an apparatus included in the information processing system 1 is acquired.
  • the information processing system 1 may acquire an image taken by another device that is not included in the information processing system 1.
  • the camera that captures an image is not limited to the camera module 26 of the user terminal 2.
  • the information processing system 1 may acquire an image from another device.
  • the information processing system 1 may acquire an image in which the mark 44 is photographed together with the subject from another information processing apparatus connected to the network 5.
  • The image to be processed by the information processing system 1 may be an image continuously acquired during shooting by a camera such as the camera module 26 of the user terminal 2, or an image obtained when the shutter of the camera is released.
  • In step 200, the image processing unit 16 executes analysis processing and correction processing.
  • A specific example of the analysis processing and the correction processing is shown in FIG. 27. Below, an example in which the server 3 executes these processes is shown. However, these processes may be executed in either the user terminal 2 or the server 3. Further, the user terminal 2 and the server 3 may share and execute these processes. When these processes are shared between the user terminal 2 and the server 3, the user terminal 2 and the server 3 exchange the data to be processed by data communication via the network 5 and switch the processing subject.
  • FIG. 27 is a flowchart illustrating an analysis process and a correction process by the information processing system 1 according to this embodiment.
  • First, the server 3 receives the image acquired in step 100 and transmitted from the user terminal 2. Then, the server 3 performs the analysis processing and the correction processing on the received image. Specifically, the processing is executed as follows.
  • In step 201, the control unit 31 extracts the index unit 41 from the image acquired in step 100.
  • In the present embodiment, the control unit 31 extracts the region 82 in which the index unit 41 appears from the image 80 shown in FIG. 25. Note that this processing corresponds to an example of the image analysis processing according to the present invention.
  • the indicator unit 41 has a color pattern in which white and black are alternately arranged.
  • Therefore, the control unit 31 estimates a region whose color approximates white or black to be the region 82 in which the index unit 41 appears, and extracts that region.
  • the index unit 41 refers to the RGB value of each pixel of the image 80. If the difference between the referenced RGB value and the RGB value indicating white (255, 255, 255) is within a predetermined threshold value, the control unit 31 selects a pixel having the RGB value as a white region of the index unit 41. It is determined that the pixel is included in the captured area. If the difference between the referenced RGB value and the RGB value (0, 0, 0) indicating black is within a predetermined threshold, the control unit 31 determines that the pixel having the RGB value is the black value of the index unit 41. The pixel is determined to be included in the region where the region is reflected.
  • the control unit 31 determines that the pixel having the RGB value is a pixel that is not included in the region where the index unit 41 is captured.
  • the control unit 31 estimates and extracts the region 82 in which the index unit 41 is captured by making these determinations for each pixel of the image 80.
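As an illustrative sketch of this per-pixel determination (Python, NumPy, the Manhattan color distance, and the concrete threshold value are assumptions made here for illustration; the embodiment does not prescribe a particular implementation), the extraction of the region 82 could look like the following.

```python
import numpy as np

WHITE = np.array([255, 255, 255])
BLACK = np.array([0, 0, 0])

def extract_index_region(image, threshold=60):
    """Estimate the region 82 in which the index unit 41 appears.

    image: H x W x 3 uint8 RGB array (the image 80).
    Returns a boolean mask that is True where a pixel is within the
    given threshold of pure white or pure black, i.e. where it is
    judged to belong to the white/black color pattern of the index
    unit 41.
    """
    px = image.astype(np.int32)
    # Per-pixel distance from the reference white and black values
    # (sum of absolute channel differences).
    dist_white = np.abs(px - WHITE).sum(axis=2)
    dist_black = np.abs(px - BLACK).sum(axis=2)
    # A pixel is assigned to the index region if either distance is
    # within the predetermined threshold; all other pixels are judged
    # not to belong to the region where the index unit 41 appears.
    return (dist_white <= threshold) | (dist_black <= threshold)
```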
In some cases, a pixel in the region in which the subject appears (region 81) may be erroneously determined to be a pixel in the region in which the index unit 41 appears (region 82). To address this, the control unit 31 may store the shape of the opening 43 or the shape of the index unit 41 in advance in the auxiliary storage device 32 or the like. The control unit 31 may then avoid erroneously determining a pixel of the region 81 to be a pixel of the region 82 by referring to the stored information, such as the shape of the opening 43, and estimating the position of the region 81 or the region 82. Alternatively, the control unit 31 may extract the boundary 83 and determine in advance the position of the boundary between the region 81 and the region 82, thereby avoiding erroneous determination of a pixel of the region 81 as a pixel of the region 82.

Further, the control unit 31 may specify the position of the boundary 83 from the extracted region 82, and may determine whether the mounting position of the close-up lens 50 described above is correct. For example, the control unit 31 may determine whether the boundary 83 whose position has been specified is included in the region 84. If the control unit 31 determines that the boundary 83 is not included in the region 84, the control unit 31 may notify the user terminal 2 of a message indicating that the close-up lens 50 is mounted in the wrong position. On the other hand, if it is determined that the boundary 83 is included in the region 84, the control unit 31 may notify the user terminal 2 of a message indicating that the close-up lens 50 is mounted correctly.
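A minimal sketch of this mounting-position check (representing the boundary 83 as pixel coordinates and the region 84 as a boolean mask is an assumption made here for illustration):

```python
import numpy as np

def lens_mounted_correctly(boundary_points, region84_mask):
    """Return True if every point of the extracted boundary 83 falls
    inside the permissible region 84.

    boundary_points: iterable of (row, col) pixel coordinates of the
                     boundary 83 between the regions 81 and 82.
    region84_mask:   H x W boolean array marking the region 84.
    """
    return all(region84_mask[r, c] for r, c in boundary_points)

# The caller would notify the user terminal 2 of a "wrongly mounted"
# or "correctly mounted" message depending on the returned value.
```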
As described above, the control unit 31 attempts to extract the region 82 in which the index unit 41 appears from the image 80. However, the method of extracting the region 82 is not limited to such a method, and may be selected as appropriate according to the embodiment. If the region 82 cannot be extracted from the image 80 ("NO" in step 202), the control unit 31 ends the analysis and correction processing. On the other hand, if the region 82 can be extracted from the image 80 ("YES" in step 202), the control unit 31 advances the processing to step 203.
In step 203, the control unit 31 performs image blur correction. For example, the control unit 31 determines a region where the image 80 is out of focus, and then performs defocus correction on the determined region.

In the present embodiment, the control unit 31 determines whether or not defocusing has occurred using the boundaries (edges) between white and black in the color pattern of the index unit 41 appearing in the region 82 extracted in step 201. When defocusing occurs, the change (gradient) of the pixel values (RGB values) at an edge becomes gentler. Therefore, the control unit 31 determines whether or not defocusing has occurred based on the change in the pixel values (RGB values) at the edges. Then, the control unit 31 corrects the defocus by sharpening the image of the region determined to be out of focus. However, the method of correcting the defocus of an image is not limited to such a method, and may be selected as appropriate according to the embodiment.
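As a sketch of this defocus determination from an edge of the color pattern (the one-dimensional sampling of the edge profile and the gradient threshold are assumptions made here for illustration):

```python
import numpy as np

def edge_is_blurred(profile, gradient_threshold=40.0):
    """Judge defocus from a 1-D intensity profile sampled across a
    white/black edge of the color pattern of the index unit 41.

    profile: 1-D array of gray values crossing the edge.
    A sharp edge yields a large maximum gradient; when defocusing
    occurs the change at the edge becomes gentler, so a small maximum
    gradient suggests that the region is out of focus.
    """
    gradient = np.abs(np.diff(profile.astype(np.float64)))
    return gradient.max() < gradient_threshold
```

A region judged to be out of focus in this way would then be sharpened, for example with a standard sharpening filter.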
In step 204, the control unit 31 corrects the image color and illumination unevenness. For example, an image having a color different from the actual color may be obtained depending on the function and performance of the camera. In addition, illumination may be uneven due to the influence of the illumination light and the light source, and an image having a color different from the actual color may be obtained. Therefore, the control unit 31 analyzes the influence of the shooting environment and corrects the image so as to reduce that influence. Note that the processing for analyzing the influence of the shooting environment corresponds to an example of the image analysis processing of the present invention, and the processing for correcting the image so as to reduce the influence of the shooting environment corresponds to an example of the correction processing of the present invention.

In the present embodiment, the index unit 41 appearing in the region 82 of the image 80 has a color pattern in which white and black are alternately arranged. Pixels included in a white pattern region should therefore have the same RGB value, and pixels included in a black pattern region should likewise have the same RGB value. However, when the image is affected by the shooting environment, the RGB value of each pixel deviates from the actual color according to that influence, so that variations in RGB values occur in regions that should have uniform RGB values. By correcting the RGB values so as to eliminate this variation, the control unit 31 performs color correction and illumination unevenness correction.

Specifically, for each of the region containing the white pattern and the region containing the black pattern, the control unit 31 uses the actual color of the region as a reference (reference color), and measures the difference (shift) between the RGB value of each pixel included in the region and the RGB value of the reference color. For example, in the region where the white pattern appears, the control unit 31 obtains the difference between the RGB value of each pixel included in the region and the RGB value indicating white, thereby measuring the difference between the color of each pixel and the reference color. The control unit 31 thus obtains a two-dimensional distribution indicating the difference between the color of each pixel and the reference color in the region where the white pattern appears. The control unit 31 similarly measures the region where the black pattern appears, and obtains a two-dimensional distribution indicating the difference between the color of each pixel and the reference color.

Next, from the waveform of the two-dimensional distributions of the differences obtained for the pixels included in the region 82, the control unit 31 estimates a two-dimensional distribution of the difference between the color of the pixels included in the region 81 and the reference color. For example, by filling the blank area (region 81) of the two-dimensional distribution created for the region 82 using a predetermined interpolation calculation, the two-dimensional distribution of the difference between the color of the pixels included in the region 81 and the reference color is estimated. Then, the control unit 31 corrects the RGB value of each pixel using these two-dimensional distributions. Thereby, color correction and illumination unevenness correction are realized.
Note that the reference color may be set as appropriate. However, from the viewpoint of bringing the color of the subject close to the actual color, it is desirable to use the actual color of each region as the reference, as in the present embodiment.

The correction in step 204 makes it possible to bring the color of the subject shown in the image close to the actual color. Therefore, the accuracy of the skin analysis described later can be improved. Moreover, since the color of the subject can be brought close to the actual color, color recognition accuracy improves; for example, the number of colors that can be used in a code system that expresses information using a two-dimensional array of colors can be increased.
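The correction of step 204 might be sketched as follows (Python with SciPy's griddata standing in for the "predetermined interpolation calculation", and the equal weighting of the white-based and black-based distributions, are assumptions; the embodiment does not specify these details):

```python
import numpy as np
from scipy.interpolate import griddata

def correct_color_and_shading(image, white_mask, black_mask):
    """Color / illumination-unevenness correction using the index unit.

    image:      H x W x 3 float array (0..255), the image 80.
    white_mask: boolean mask of the white pattern area of region 82.
    black_mask: boolean mask of the black pattern area of region 82.

    For each pattern, the difference between each observed pixel and
    the reference color is known inside region 82; the blank area
    (region 81) of that two-dimensional distribution is filled by
    interpolation, and the estimated shift is subtracted everywhere.
    """
    h, w, _ = image.shape
    grid_r, grid_c = np.mgrid[0:h, 0:w]
    corrected = image.astype(np.float64).copy()
    for mask, ref in ((white_mask, 255.0), (black_mask, 0.0)):
        rows, cols = np.nonzero(mask)
        for ch in range(3):
            # Shift from the reference color, measured on region 82.
            diff = image[rows, cols, ch].astype(np.float64) - ref
            # Fill the blank area (region 81) of the distribution.
            full = griddata((rows, cols), diff, (grid_r, grid_c),
                            method="linear", fill_value=diff.mean())
            # Each of the two distributions contributes half of the
            # final correction (a simplifying assumption).
            corrected[..., ch] -= full / 2.0
    return np.clip(corrected, 0, 255)
```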
In step 205, the control unit 31 corrects image distortion. For example, the control unit 31 determines a region where distortion occurs in the image 80, and then corrects the distortion of the determined region. Note that the processing for determining a region where distortion occurs corresponds to an example of the image analysis processing of the present invention, and the processing for correcting the distortion corresponds to an example of the correction processing of the present invention.

In the present embodiment, the control unit 31 determines whether or not distortion has occurred using the edges in the color pattern of the index unit 41 appearing in the region 82 extracted in step 201. Originally, the shape of an edge in the color pattern of the index unit 41 is a straight line. Therefore, the control unit 31 determines whether or not distortion has occurred by determining whether or not the shape of an edge has departed from a straight line beyond a threshold. Then, the control unit 31 corrects the distortion of the region determined to be distorted by transforming the shape of each edge determined to have departed from a straight line beyond the threshold so that it becomes a straight line again. However, the method of correcting image distortion is not limited to such a method, and may be selected as appropriate according to the embodiment.
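A sketch of the straight-line test for an edge (the least-squares line fitting and the pixel threshold are illustrative assumptions, not prescribed by the embodiment):

```python
import numpy as np

def edge_departs_from_line(points, threshold=2.0):
    """Decide whether an extracted edge of the color pattern of the
    index unit 41 is no longer a straight line, i.e. whether
    distortion has occurred.

    points: N x 2 array of (x, y) edge coordinates.
    Fits a least-squares line through the points and reports True
    when the maximum perpendicular deviation from that line exceeds
    the threshold (in pixels).
    """
    pts = np.asarray(points, dtype=np.float64)
    centered = pts - pts.mean(axis=0)
    # Direction of the best-fit line = first right singular vector.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = np.array([-vt[0, 1], vt[0, 0]])
    deviation = np.abs(centered @ normal)
    return deviation.max() > threshold
```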
In step 206, the control unit 31 extracts the mark 44 from the image 80. For example, the control unit 31 extracts the mark 44 from the image 80 by using the pattern of the mark 44 stored in advance in the auxiliary storage device 32 or the like and detecting a pattern in the image 80 that matches the pattern of the mark 44 (pattern matching).

In the present embodiment, two types of marks 44, namely the marks 44a and 44b, are extracted. The mark 44a indicates that the close-up lens 50 to which the panel 40 is attached is a genuine product. The mark 44b indicates the business operator (trader A) of a product sold together with the close-up lens 50 and the panel 40. Based on the pattern used for the extraction, the control unit 31 recognizes the predetermined information indicated by the mark 44 attached to the panel 40.
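A minimal sketch of this pattern matching (OpenCV's normalized template matching and the score threshold are assumptions; any matching method could be substituted):

```python
import cv2

def find_mark(image_gray, template_gray, score_threshold=0.8):
    """Locate a stored mark pattern (e.g. the pattern of the mark 44a
    or 44b kept in the auxiliary storage device 32) in the image 80
    by normalized cross-correlation.

    Returns the top-left (x, y) of the best match, or None when the
    best score does not reach the threshold (mark not extracted).
    """
    result = cv2.matchTemplate(image_gray, template_gray,
                               cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= score_threshold else None
```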
As a variation of the panel 40, there is a panel 40L having a step in the index portion. The panel 40L has a step around the opening, and includes an index portion 41K and marks (44a, 44b) at each step. Since the steps differ in position in the optical axis direction, the image of one step is clearer than the image of the other step. By using a panel having such a step, the information processing system 1 can change the place (step) from which an image is acquired according to the situation, and can thereby obtain a clearer image including the index portion 41 and the mark 44. That is, it is possible to reduce the shift in focus between the index portion 41 and the mark 44 when the subject is in focus.
In step 207, the image processing unit 16 executes subject analysis processing. In the present embodiment, the control unit 31 analyzes the skin shown in the image 80 (skin analysis). For example, the control unit 31 binarizes and thins the image of the region 81 in which the subject appears. Then, the control unit 31 detects regularity by frequency analysis or the like, thereby detecting the texture, pores, spots, and the like in the image.

Here, skin analysis is exemplified as the subject analysis processing. However, the subject analysis processing is not limited to skin analysis, and may be set as appropriate according to the subject.
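A rough sketch of the binarize/thin/frequency-analysis pipeline (scikit-image's Otsu threshold and skeletonization, and the radial FFT profile, are assumptions standing in for the concrete methods, which the embodiment leaves unspecified):

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def analyze_skin(gray):
    """Sketch of the texture analysis of region 81.

    gray: 2-D float array, the grayscale subject region.
    Binarizes the region, thins it to a line pattern, and collapses
    the 2-D spatial-frequency spectrum to a radial profile whose
    peaks indicate the dominant period of the skin texture.
    """
    binary = gray > threshold_otsu(gray)   # binarization
    thin = skeletonize(binary)             # thinning
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(thin)))
    h, w = spectrum.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h / 2, xx - w / 2).astype(int)
    counts = np.maximum(np.bincount(radius.ravel()), 1)
    radial = np.bincount(radius.ravel(), spectrum.ravel()) / counts
    return thin, radial
```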
As a variation of the panel 40, there is the panel 40H shown in FIG. 16, in which the index portion is colored stepwise. When such a panel is used, the information processing system 1 can compare each color of the index portion 41G with the color of the subject and thereby specify the position (angle) in the index portion 41G that has the same color as the subject. Then, the information processing system 1 can estimate the color of the subject based on the color attached to the specified position (angle) of the index portion 41G.
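A sketch of this color estimation (representing the index portion 41G as per-step observed and actual colors is an assumption made here for illustration):

```python
import numpy as np

def estimate_subject_color(subject_rgb, ring_observed, ring_actual):
    """Estimate the actual color of the subject from the stepwise-
    colored index portion 41G.

    subject_rgb:   observed RGB of the subject in the image.
    ring_observed: observed RGB of the index portion at each angular
                   step, read from the same image.
    ring_actual:   the true (printed) RGB at each step.
    Finds the step whose observed color is closest to the observed
    subject color and returns the actual color printed at that step.
    """
    subject = np.asarray(subject_rgb, dtype=np.float64)
    diffs = [np.abs(np.asarray(obs, dtype=np.float64) - subject).sum()
             for obs in ring_observed]
    return ring_actual[int(np.argmin(diffs))]
```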
This completes the image analysis processing and the correction processing, and the processing proceeds to step 300.

Note that the control unit 31 may change the order of the correction processing in steps 203 to 205. Further, the control unit 31 may execute the processing of step 206 before step 201. Processing may also be omitted, replaced, or added as appropriate according to the embodiment. For example, the index portion does not exist in the panel 40G shown in FIG. 15F. Therefore, when the panel 40G is used, the control unit 31 may omit the processing in steps 201 to 205.
Note that RGB refers to the red, green, and blue color components.
In step 300, the image processing unit 16 determines whether the image acquisition unit 15 should acquire an image again. For example, if the index unit 41 could not be extracted in step 201 described above, if the mark 44 could not be extracted in step 206, or if the subject (skin) analysis processing in step 207 did not succeed, the control unit 31 determines that the image acquisition unit 15 should acquire an image again. In that case, the control unit 31 returns the processing to step 100 by notifying the user terminal 2 to acquire an image again. On the other hand, if none of these situations applies, the control unit 31 advances the processing to step 400.
In step 400, the display control unit 18 outputs the analysis result of the subject obtained in step 207. In the present embodiment, the control unit 31 of the server 3 notifies the user terminal 2 of the analysis result of the subject obtained in step 207. Then, the control unit 21 of the user terminal 2 presents the analysis result to the user by outputting the received analysis result of the subject to the display 27.
In addition, the storage unit 17 stores the image on which each correction process has been executed, the predetermined information indicated by the mark 44, and the analysis result of the subject in association with each other. In the present embodiment, the auxiliary storage device 32 of the server 3 includes a database for storing the acquired information.

FIG. 28 illustrates database records used by the information processing system 1 according to the present embodiment for storing images. Specifically, FIG. 28 illustrates a list of records related to user A. In the present embodiment, the database used to store the acquired information includes an ID field for storing an identifier that identifies a record, a shooting date/time field for storing the shooting date and time of the acquired image, a file name field for storing the file name of the acquired image, an attribute information field for storing the predetermined information indicated by the mark 44, and an analysis result field for storing the analysis result of the subject (skin).

The control unit 31 creates a record whose fields are filled so as to match the image on which each correction process has been executed, the predetermined information indicated by the mark 44 extracted in step 206, and the analysis result of the subject obtained in step 207, and adds the created record to the database. Thereby, the control unit 31 accumulates the image on which each correction process has been executed, the predetermined information indicated by the mark 44, and the analysis result of the subject in association with each other.
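The record layout of FIG. 28 might be sketched as follows (SQLite via Python is an assumption, and the database file, table name, field names, and inserted values are hypothetical examples, not taken from the embodiment):

```python
import sqlite3

conn = sqlite3.connect("records.db")  # hypothetical database file
conn.execute("""
    CREATE TABLE IF NOT EXISTS records (
        id         INTEGER PRIMARY KEY,  -- identifier of the record
        shot_at    TEXT,                 -- shooting date/time
        file_name  TEXT,                 -- file name of corrected image
        attributes TEXT,                 -- information shown by mark 44
        analysis   TEXT                  -- subject (skin) analysis result
    )""")
# Hypothetical values for one record of user A.
conn.execute(
    "INSERT INTO records (shot_at, file_name, attributes, analysis) "
    "VALUES (?, ?, ?, ?)",
    ("2012-03-07 10:15", "userA_0001.jpg",
     "genuine lens; trader A", "texture: fine; pores: few"))
conn.commit()
conn.close()
```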

Abstract

The purpose of one aspect of the present invention is to make it possible to reduce the impact of the camera's function and performance, illumination light, light source, and the like on an image that a user has acquired using a close-up lens. An indicator according to one aspect of the present invention is provided with indicator sections having specific colours which, when a close-up lens mounted on a camera comes into contact with or close to a photographic subject and the photographic subject is captured, are present in positions that come into contact with or close to the photographic subject, are at least partly included in the capture range of the camera and are thus captured together with the photographic subject, and serve as analysis indicators for the image captured by the camera.

Description

Index body and close-up lens unit

The present invention relates to an index body mainly used with a close-up lens, and to a close-up lens unit.

Patent Documents 1 and 2 disclose a close-up lens that is attached to a camera module mounted on a mobile phone terminal, a portable video game machine, a PC (Personal Computer), or the like, and that is used for photographing human skin or the like in a state of being in contact with or close to the skin.

Patent Document 1: Design Registration No. 1352831. Patent Document 2: Design Registration No. 1401884.
In recent years, services have begun to be provided that analyze skin and the like using images obtained by having users photograph their own skin with such a close-up lens. In order to receive such a service, the user must photograph the skin of the part to be analyzed. However, there is a problem in that an image of sufficient quality for receiving the service may not be obtained due to the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source.

In one aspect, the present invention has been made in view of such problems, and its object is to make it possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on an image acquired by a user using a close-up lens.

An index body according to one aspect of the present invention includes an index portion with a predetermined color that serves as an index for analysis of an image captured by a camera; when a close-up lens attached to the camera is brought into contact with or close to a subject to photograph the subject, the index portion is present at a position in contact with or close to the subject, and at least a part of it is included in the shooting range of the camera so that it is photographed together with the subject.

According to the index body of this aspect, when a user acquires an image using the close-up lens, the index portion bearing the predetermined color, which serves as an index for analysis of the image captured by the camera, is photographed together with the subject. Therefore, the acquired image can be analyzed on the basis of the index portion photographed together with the subject. As a result, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on the image acquired by the user using the close-up lens.
As another form of the index body according to the above aspect, the index body may further include a mark indicating predetermined information, which is included in the shooting range of the camera and is therefore photographed together with the subject and the index portion when the close-up lens attached to the camera is brought into contact with or close to the subject to photograph the subject.

As another form of the index body according to each of the above aspects, the index body may further include an opening in a portion included in the shooting range of the camera when the close-up lens attached to the camera is brought into contact with or close to the subject to photograph the subject.

As another form of the index body according to the above aspect, the opening may be provided such that the center of the opening is located in the vicinity of the optical axis of the camera.

As another form of the index body according to the above aspect, the index portion may be provided so as to surround the outer edge of the opening.

As another form of the index body according to the above aspect, the opening may have a rotationally symmetric shape.

As another form of the index body according to the above aspect, the index portion may have a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.

As another form of the index body according to each of the above aspects, a plurality of the marks may be arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.

As another form of the index body according to each of the above aspects, the index portion may have a step in the optical axis direction.

Further, as another aspect of the index body according to each of the above aspects, a close-up lens unit realizing each of the above configurations may be provided.
For example, a close-up lens unit according to one aspect of the present invention is used by being attached to a camera, and includes an index portion with a predetermined color that serves as an index for analysis of an image captured by the camera; in a state of being attached to the camera, the index portion is located at a position in contact with or close to a subject at the time of shooting, and at least a part of it is included in the shooting range of the camera so that it is photographed together with the subject.

According to the present invention, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on an image acquired by a user using a close-up lens.
FIG. 1 is a perspective view showing an example of a close-up lens to which a panel according to an embodiment is attached.
FIG. 2 is a bottom perspective view showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 3 is a plan view showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 4 is a bottom view showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 5 is a front view showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 6 is a cross-sectional view along the line A-A of FIG. 3, showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 7 is an exploded perspective view showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 8 is a bottom exploded perspective view showing an example of a close-up lens to which the panel according to the embodiment is attached.
FIG. 9 shows an example of use of a close-up lens to which the panel according to the embodiment is attached.
FIG. 10 shows an example of the panel according to the embodiment.
FIG. 11 is a perspective view showing another mounting example of the panel according to the embodiment.
FIG. 12 is an exploded perspective view showing another mounting example of the panel according to the embodiment.
FIG. 13 is a cross-sectional view showing another mounting example of the panel according to the embodiment.
FIG. 14 shows an example of a panel according to another embodiment.
FIGS. 15A to 15F each show an example of a panel according to another embodiment.
FIG. 16 shows an example of a panel according to another embodiment.
FIGS. 17A to 17C each show an example of a panel according to another embodiment.
FIG. 18 is a perspective view showing an example of a panel according to another embodiment.
FIG. 19 is a cross-sectional view showing an example of a close-up lens to which a panel according to another embodiment is attached.
FIGS. 20A and 20B each show an example of a panel according to another embodiment.
FIG. 21 illustrates an outline of the hardware configuration of the information processing system according to the embodiment.
FIG. 22 illustrates an outline of the hardware configuration of an information processing system according to another embodiment.
FIG. 23 illustrates an outline of the functional configuration of the information processing system.
FIG. 24 is a flowchart showing an example of the information processing procedure performed by the information processing system according to the embodiment.
FIG. 25 shows an example of an image acquired in the information processing system according to the embodiment.
FIG. 26 shows an example of a screen displayed during shooting in the information processing system according to the embodiment.
FIG. 27 is a flowchart showing an example of the procedure of the analysis processing and correction processing performed by the information processing system according to the embodiment.
FIG. 28 illustrates data stored in the information processing system according to the embodiment.
Hereinafter, an embodiment according to one aspect of the present invention (hereinafter also referred to as "the present embodiment") will be described with reference to the drawings. However, the present embodiment described below is merely an example of the present invention in all respects and is not intended to limit its scope. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention. That is, in carrying out the present invention, a specific configuration according to each embodiment may be adopted as appropriate.

Note that although the data appearing in the present embodiment is described in natural language, more specifically it is specified by a pseudo language, commands, parameters, machine language, or the like that can be recognized by a computer.
§1 Close-up lens

In the present embodiment, a scene is exemplified in which a user photographs his or her own skin with a camera module connected to or mounted on a mobile phone terminal, a portable video game machine, a PC (Personal Computer), or the like. However, the application of the present invention is not limited to such an example; the present invention is widely applicable to scenes in which a subject is photographed close up. In the present embodiment, even a device with another name is referred to as a "camera" if the device has, or can have, a camera function.
FIG. 1 is a perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached. FIG. 2 is a bottom perspective view of the same. FIG. 3 is a plan view of the same. FIG. 4 is a bottom view of the same. FIG. 5 is a front view of the same. FIG. 6 is a cross-sectional view of the same along the line A-A of FIG. 3. FIG. 7 is an exploded perspective view of the same. FIG. 8 is a bottom exploded perspective view of the same. FIG. 9 shows an example of use of the close-up lens 50 to which the panel 40 according to the present embodiment is attached. Note that a rear view, a left side view, and a right side view of the close-up lens to which the panel 40 according to the present embodiment is attached are each represented in the same manner as the front view shown in FIG. 5.
[Close-up lens]

The close-up lens 50 according to the present embodiment will be described mainly with reference to FIG. 6. The close-up lens 50 according to the present embodiment is used to photograph a subject (skin) at a magnification. As shown in FIG. 6, the close-up lens 50 according to the present embodiment includes a lens portion 54 that realizes this magnifying function and an outer portion that surrounds the lens portion 54.

The outer portion includes a bottom surface portion 51 that faces the camera when the close-up lens 50 is attached to the camera, a groove portion 52 to which a lens fixing clip used for attaching the close-up lens 50 to the camera is attached, and an upper end portion 53 that comes into contact with or close to the subject at the time of shooting.
As shown in FIG. 10, the bottom surface portion 51 is the portion that comes into contact with the camera when the close-up lens 50 is attached to the camera. Therefore, a removable or re-attachable adhesive for attaching the close-up lens 50 to the camera may be applied or affixed to the bottom surface portion 51.

As shown in FIGS. 5 and 6, the groove portion 52 is provided above the bottom surface portion 51 and around the lens portion 54. The groove portion 52 is the portion to which a lens fixing clip for attaching the close-up lens 50 to the camera is attached: the lens fixing clip is attached to the close-up lens 50 by clamping the groove formed by the groove portion 52. Note that the groove portion 52 surrounds the periphery of the lens portion 54. Therefore, while the close-up lens 50 is attached to the camera, the close-up lens 50 may rotate about the optical axis.

The upper end portion 53 is the portion that comes into contact with or close to the subject when the subject is photographed with the close-up lens 50 attached to the camera. As shown in FIGS. 6 and 7, a recess having substantially the same shape and size as the outer periphery of the panel 40 is formed inside the upper end portion 53 in order to fit the circular panel 40 described later.

The lens portion 54 has the shape of a convex lens for photographing the subject at a magnification. The lens portion 54 is designed to suit the intended subject. For example, when the skin of the hand or face is the main subject, the lens portion 54 is designed, as one example, so that the magnification of the lens is about 30 times. When the scalp is the main subject, the lens portion 54 is designed, as one example, so that the magnification of the lens is about 50 to 80 times. In the present embodiment, as shown in FIG. 6, the lens portion 54 is formed integrally with the outer portion of the close-up lens 50.

The user attaches such a close-up lens 50 to a camera and photographs his or her own skin as shown in FIG. 9. The subject illustrated in FIG. 9 is the user's own skin (the skin on the back of the hand). However, the subject is not particularly limited and may be other than the user's own skin.

Note that the close-up lens 50 is made of, for example, a translucent member such as transparent plastic. Because the close-up lens 50 is made of a translucent member, ambient light can be taken in as illumination light even when photographing is performed with the upper end portion 53 in contact with the subject.
[Panel]

As shown in FIGS. 7, 8, and elsewhere, the panel 40 is attached to the close-up lens 50 of the present embodiment. The panel 40 corresponds to one embodiment of the index body of the present invention. However, the index body of the present invention is not limited to a panel for a close-up lens as exemplified by the panel 40.

FIG. 10 shows an example of the panel 40 according to the present embodiment. The panel 40 includes an index portion 41 with a predetermined color that serves as an index for analysis of an image captured by the camera; when the close-up lens 50 attached to the camera is brought into contact with or close to the subject to photograph the subject, the index portion is present at a position in contact with or close to the subject, and at least a part of it is included in the shooting range 70 of the camera so that it is photographed together with the subject.

By using the panel 40 when photographing a subject, the index portion 41, bearing the predetermined color that serves as an index for analysis of the image captured by the camera, is photographed together with the subject. In the present embodiment, the acquired image can then be analyzed based on this index portion 41. As a result, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on the image acquired by the user using the close-up lens.

Note that the shooting range 70 shown in FIG. 10 and other figures indicates, on each panel, the range photographed by the camera when shooting is performed with the panel attached to the close-up lens 50 and the close-up lens 50 attached to a camera (the user terminal 2 described later).
First, the method of attaching the panel 40 will be described. The panel 40 is attached to the close-up lens 50 by being fitted into the upper end portion 53. Since the upper end portion 53 is the portion that comes into contact with or close to the subject at the time of shooting, the panel 40 is present at a position in contact with or close to the subject at the time of shooting. Further, the index portion 41 described later is provided on the camera-side surface of the panel 40, so the index portion 41 is also present at a position in contact with or close to the subject at the time of shooting.

Note that the method of attaching the panel 40 according to the present embodiment is not limited to this. For example, the panel 40 may be bonded to the upper end portion 53 of the close-up lens 50. Alternatively, the panel 40 may be attached to the close-up lens 50 by fitting the upper end portion 53 of the close-up lens 50 into the panel 40 serving as a cap for the close-up lens 50.

Another example of a method for attaching the panel 40 to the close-up lens 50 will be described with reference to FIGS. 11 to 13. FIG. 11 is a perspective view showing another mounting example of the panel 40 according to the present embodiment, FIG. 12 is an exploded perspective view of the same, and FIG. 13 is a cross-sectional view of the same. As shown in FIGS. 11 to 13, the panel 40 may be attached to the close-up lens 50 by, for example, being sandwiched between the close-up lens 50 and a cap 60 when the cap 60 is attached to the close-up lens 50.
Next, the shape of the panel 40 will be described. As shown in FIG. 10, the panel 40 according to the present embodiment is circular. However, the panel 40 may have a shape other than a circle; for example, it may be polygonal. In the present embodiment, the panel 40 is fitted into the recess provided in the upper end portion 53 of the close-up lens 50, so the recess is shaped to match the outer periphery of the panel 40.

FIG. 14 shows an example of a variation in the shape of the panel 40. The panel 40A shown in FIG. 14 has a semicircular shape. The panel 40A includes an index portion 41A, which is a variation of the index portion 41 of the panel 40 according to the present embodiment. When a subject is photographed with the panel 40A attached to the close-up lens 50, the index portion 41A appears in the right half of the shooting range 70 and the subject appears in the left half of the captured image.

Next, the shape of the index portion 41 according to the present embodiment will be described. As shown in FIG. 10, the index portion 41 according to the present embodiment has a donut shape whose outer and inner peripheries are circular, and is formed over the entire surface of the panel 40 except for the region of the opening 43. However, the index portion 41 only needs to be photographed together with the subject, and its shape is not limited to the shape shown in FIG. 10; the shape of the index portion 41 may be selected as appropriate.
Variations in the shape of the index portion 41 will be described with reference to FIGS. 15A to 15F, each of which illustrates such a variation. In each figure, openings of the same shape are given the same reference numerals, and marks of the same type are given the same reference numerals. When a mark is referred to regardless of its type, it is referred to as the mark 44.

The index portion 41B of the panel 40B shown in FIG. 15A has a donut shape like the index portion 41 but, unlike the index portion 41, is not formed over the entire surface of the panel 40B. In addition, unlike the index portion 41, the index portion 41B is colored in a single color.

The outer periphery of the index portion 41C of the panel 40C shown in FIG. 15B is, unlike that of the index portion 41, a regular polygon (a regular dodecagon in the figure). Further, unlike the index portion 41, the index portion 41C has a color pattern in which four colors are arranged. The panel 40C has four types of marks (44a to 44d).

The outer periphery of the index portion 41D of the panel 40D shown in FIG. 15C has the same shape as the outer periphery of the index portion 41C, but the index portion 41D has a larger area than the index portion 41C. Like the index portion 41, the index portion 41D has a color pattern in which two colors are arranged.

The outer periphery of the index portion 41E of the panel 40E shown in FIG. 15D has the same shape as that of the index portion 41, but the index portion 41E does not cover the entire surface of the panel 40E. Like the index portion 41C, the index portion 41E has a color pattern in which four colors are arranged. Note that, as shown on the panel 40E, the mark 44 described later need not be present in every region of the color pattern.

The outer periphery of the index portion 41F of the panel 40F shown in FIG. 15E has the same shape as the outer periphery of the index portion 41D, and the index portion 41F has the same area as the index portion 41D. However, unlike the index portion 41D, the index portion 41F has a color pattern in which four colors are arranged.

Note that, for realizing information processing based on the mark 44 described later, the index portion 41 may or may not be present. Therefore, from the viewpoint of realizing information processing based on the mark 44, the index portion need not be present, as in the panel 40G shown in FIG. 15F.
Next, the predetermined color given to the index portion 41 according to the present embodiment will be described. The index portion 41 according to the present embodiment has a color pattern in which white and black are alternately arranged. However, the predetermined color or colors given to the index portion 41 may be any color that can serve as an index for analysis of the image captured by the camera, and may be selected as appropriate.

For example, as shown by the index portion 41B, the index portion 41 may be formed in a single color. Alternatively, the index portion 41 may be colored in cyan, magenta, yellow, black, and white, or in red, green, blue, black, and white. In this way, the predetermined color or colors given to the index portion 41 may be selected as appropriate.

FIG. 16 illustrates a variation of the color scheme of the index portion. As described later, the region of the opening 43 is the region in which the subject appears. In the panel 40H shown in FIG. 16, the region adjacent to the region in which the subject appears (the index portion 41G) is colored so that its color changes stepwise, clockwise from a position B, within the range of colors that the subject can take. The index portion 41 may be colored in this way. The effect obtained by this color scheme can be explained as follows.

Since the colors of the range that the subject can take are arranged stepwise on the index portion 41G, the position (angle) of the index portion 41G bearing the same color as the subject can be specified in the acquired image. A first effect of this color scheme is that the position thus specified can be used as a reference for the color of the subject.

Furthermore, like the color of the index portion 41G, the color of the subject in the captured image may differ from its actual color due to the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source. Since the index portion 41G is arranged adjacent to the subject, the index portion 41G is affected in substantially the same way as the subject. Therefore, in the captured image, the actual color of the index portion 41G at the position (angle) bearing the same color as the subject can be presumed to be substantially the same as the actual color of the subject. Accordingly, by analyzing the captured image and specifying the position in the index portion 41G having the same color as the subject, the actual color of the subject can be estimated. In order to obtain this effect, the index portion 41 may be colored like the index portion 41G shown in FIG. 16.
As shown in FIGS. 7 and 8, the index portion 41 is provided not on the subject-side surface of the panel 40 but on the camera-side surface.

For example, the index portion 41 is provided by being printed on the camera-side surface of the panel 40 by oil-based offset printing, silk-screen printing, or the like. However, as long as the index portion 41 can be photographed, it may be printed on the subject side of the panel 40. For example, when the panel 40 is made of a translucent member such as transparent plastic, the index portion 41 can be photographed even if it is printed on the subject side of the panel 40.

Alternatively, the index portion 41 may be provided by affixing a sticker printed with the pattern of the index portion 41 to the camera-side surface of the panel 40. However, as long as the index portion 41 can be photographed, the sticker may be affixed to the subject side of the panel 40, as in the case of printing. The same applies to the mark 44 described later.

Note that the subject-side surface of the panel 40, which cannot be captured by the camera, may be used for any purpose.

In the present embodiment, the color correction and illumination unevenness correction of the image, described later, use the difference between the color of the index portion 41 in the acquired image and the actual color of the index portion 41; specifically, a two-dimensional distribution of this difference obtained at the positions where the index portion 41 appears in the image is used. Therefore, in order to increase the accuracy of the color correction and illumination unevenness correction, it is preferable that the index portion 41 appear in all directions around the region in which the subject appears within the shooting range 70 of the camera.
Next, the mark 44 provided on the panel 40 will be described. As shown in FIG. 10 and other figures, the panel 40 according to the present embodiment includes a mark 44 indicating predetermined information. When the close-up lens 50 attached to the camera is brought into contact with or close to the subject to photograph the subject, the mark 44 is included in the shooting range 70 of the camera and is thereby photographed together with the subject and the index portion 41. Note that the mark 44 may be placed in the region where the index portion 41 exists, as shown in FIG. 10 and other figures.

For example, the predetermined information indicated by the mark 44 is information for identifying that the close-up lens 50 is a genuine lens, information for identifying the distributor who distributed the close-up lens 50 together with the panel 40, or information on a product sold together with the panel 40 and the close-up lens 50.

The presence of such a mark 44 in the acquired image enables information processing such as identifying the acquired image based on the predetermined information indicated by the mark 44. In other words, the mark 44 may be treated as an identifier for performing predetermined identification.

The mark 44 may be, for example, a barcode or the like that expresses information according to a predetermined standard, or a trademark of the distributor, the product manufacturer, the target product, or the like. The mark 44 may be set as appropriate.

The panel 40 may also carry plural types of information by including plural types of marks 44. For example, the panel 40 according to the present embodiment includes two types of marks, the marks 44a and 44b, and the panel 40C shown in FIG. 15B includes four types of marks, the marks 44a to 44d.

Note that, for realizing image analysis using the index portion 41, the mark 44 may or may not be present. Therefore, from the viewpoint of realizing image analysis using the index portion 41, the mark 44 need not be present, as shown in FIG. 10 and other figures.
 次に、パネル40に設けられる開口43について説明する。図10により示される本実施形態に係るパネル40は、カメラに装着された接写レンズ50を被写体に当接又は近接して当該被写体を撮影する時に、カメラの撮影範囲70に含まれる部分に開口43を有する。当該開口43は、例えば、トムソン加工等の加工方法でパネル40に穴を開けることで、作成される。 Next, the opening 43 provided in the panel 40 will be described. The panel 40 according to the present embodiment shown in FIG. 10 has an opening 43 in a portion included in the photographing range 70 of the camera when the close-up lens 50 attached to the camera is brought into contact with or close to the subject and the subject is photographed. Have The opening 43 is created, for example, by making a hole in the panel 40 by a processing method such as Thomson processing.
 本実施形態において、カメラの撮影範囲70における開口43の領域は、カメラ側から見て被写体が見える領域、言い換えると、被写体が写る領域である。つまり、被写体が撮影可能な状態であれば、パネル40において、当該開口43は設けられなくてもよい。例えば、本実施形態に係るパネル40が透明なプラスチック等の透光性の部材で作成される場合、被写体を写すことは可能であるため、開口43は設けられなくてもよい。 In the present embodiment, the region of the opening 43 in the shooting range 70 of the camera is a region where the subject can be seen from the camera side, in other words, a region where the subject is captured. That is, the opening 43 may not be provided in the panel 40 as long as the subject can be photographed. For example, when the panel 40 according to the present embodiment is made of a translucent member such as a transparent plastic, it is possible to capture the subject, and thus the opening 43 may not be provided.
Even when the panel 40 is not provided with the opening 43 and is not made of a translucent member, it suffices that the subject can be photographed, as with the panel 40A shown in FIG. 14.
In photographing with a close-up lens, the image generally tends to be sharper near the optical axis of the lens. Therefore, to capture the subject more clearly, the opening 43 is preferably provided so as to include the optical axis. For example, the center of the opening 43 is positioned in the vicinity of the optical axis of the camera (close-up lens 50); preferably, it is positioned on the optical axis.
The index portion 41 may also be provided so as to surround the outer edge of the opening 43 whose center is positioned near or on the optical axis. In this way, even if the panel 40 is mounted shifted vertically or horizontally, the index portion 41 surrounds the opening 43 and can therefore still be included in the imaging range 70 of the camera.
The opening 43 may also have a rotationally symmetric shape such as a circle. This makes it possible to reduce or ignore the effect of the panel 40 rotating about the optical axis.
For the same reason, as shown in FIG. 10 and elsewhere, the index portion 41 may have a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening 43, and a plurality of marks 44 may likewise be arranged according to that regularity.
Here, the predetermined regularity is a regularity corresponding to the rotational symmetry of the opening 43. For example, it means that, corresponding to the rotational symmetry of the opening 43, there exists an angle p (0 &lt; p &lt; 360), other than a multiple of 360 degrees, such that the appearance of the index portion 41 or the marks 44 within the imaging range 70 is unchanged by a rotation through p.
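Stated compactly (a formalization in our own notation, not the specification's): letting R_p denote rotation by the angle p about the optical axis and S the portion of the index portion 41 (or of the marks 44) contained in the imaging range 70, the regularity condition is

```latex
\exists\, p \in (0^\circ,\ 360^\circ) \;:\quad R_{p}(S) = S .
```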
The shape of the opening 43 according to the present embodiment is circular. However, the shape of the opening 43 is not limited to a circle and may be selected as appropriate; examples include a polygon, an ellipse, and a star.
Variations of the opening 43 will now be described with reference to FIGS. 17A to 17C, which illustrate such variations.
For example, the opening 43A of the panel 40I shown in FIG. 17A is a regular polygon, a shape having rotational symmetry. The index portion 41H of the panel 40I is provided over the entire surface of the panel 40I.
The index portion 41I of the panel 40J shown in FIG. 17B has a shape that protrudes toward the center of the opening 43B. The index portion 41J of the panel 40K shown in FIG. 17C likewise protrudes toward the center of the opening 43B and is colored in the two colors white and black; the marks 44a and 44b are attached to the index portion 41J.
When photographing a subject, the user brings, for example, the panel 40 (close-up lens 50) into contact with the subject. In this case, the subject may deform and partially enter the inside of the close-up lens 50 through the opening 43.
In the present embodiment, the index portion 41 and the marks 44 are provided on the camera-side surface of the panel 40. Therefore, even ignoring deformation of the subject, the index portion 41 and the marks 44 are positioned closer to the lens than the subject by the thickness of the panel 40 in the optical-axis direction.
Due to these factors, the index portion 41 and the marks 44 may lie at positions different from the subject in the optical-axis direction. Consequently, when the subject is in focus, the index portion 41 and the marks 44 may be out of focus because of this positional difference.
In view of these possibilities, the index portion 41 may have one or more steps in the optical-axis direction, and the marks 44 may be provided on the surface of each step perpendicular to the optical axis.
FIGS. 18 and 19 show variations of the panel 40. FIG. 18 is a perspective view showing an example of a panel 40L in which one step is provided in the index portion 41K, and FIG. 19 is a cross-sectional view of the same panel 40L.
Providing such a step reduces the positional deviation in the optical-axis direction between the index portion 41 and marks 44 on the one hand and the subject on the other. This in turn reduces the defocus of the index portion 41 and the marks 44 when the subject is in focus.
FIGS. 20A and 20B illustrate variations of the panel in which the index portion 41 has one step in the optical-axis direction. The panel 40M shown in FIG. 20A is a modification of the panel 40B shown in FIG. 15A and has one step in its index portion 41L. The panel 40N shown in FIG. 20B is a modification of the panel 40C shown in FIG. 15B and has one step in its index portion 41M.
[Other Embodiments]
The panel 40 may be formed integrally with the close-up lens 50. A unit in which the panel 40 and the close-up lens 50 are integrated corresponds to an embodiment of the close-up lens unit of the present invention.
The index body of the present invention may also be a sticker affixed to the subject that performs the same function as the panel 40. Such a sticker can be described as a sticker bearing the pattern that realizes the surface of the panel 40 described with reference to FIG. 10 and elsewhere. The user, for example, affixes such a sticker to the subject and photographs the subject by bringing the close-up lens 50 into contact with it over the sticker. The photographed image then provides the same effects as an image photographed using the panel 40.
Since such a sticker is used mainly by being affixed to the subject, a transparent sheet that reacts to skin oil, for example, may be provided in place of the opening 43 at the center of the sticker. This allows the user to check the oil content of the skin when photographing the skin with the close-up lens.
§2 Information Processing System
Next, an information processing system that collects images photographed using the close-up lens 50 and the panel 40 described above will be described.
In the information processing system according to the present embodiment, when the close-up lens attached to a camera is brought into contact with or close to a subject to photograph the subject, a mark indicating predetermined information and located at a position in contact with or close to the subject is included in the imaging range of the camera, so that an image photographed together with the subject is acquired. The information processing system then extracts the mark from the acquired image and stores the image in association with the information indicated by the extracted mark.
By operating in this way, the information processing system according to the present embodiment accumulates images photographed with the close-up lens in association with the information indicated by the marks. The accumulated images can therefore be identified based on that information. As a result, according to the present embodiment, images photographed and collected using the close-up lens can be used efficiently.
[Hardware Configuration]
FIG. 21 illustrates the hardware configuration of the information processing system 1 according to the present embodiment. The information processing system 1 includes a user terminal 2 and a server 3 connected via a network 5. Information is exchanged between the user terminal 2 and the server 3 by data communication over the network 5, for example a 3G (3rd Generation) network, the Internet, a telephone network, or a dedicated network. The type of the network 5 is selected as appropriate for each data communication.
The user terminal 2 is an information processing apparatus including a control unit 21 containing a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like; an auxiliary storage device 22 that stores programs executed by the control unit 21; a network interface 23 for communicating via the network 5; an input device 24 including a camera module 26; and an output device 25 including a display 27, LEDs, a speaker, and the like. In FIG. 21 and FIG. 22 described later, the network interface is denoted "NW I/F".
Regarding the specific hardware configuration of the user terminal 2, components may be omitted, replaced, or added as appropriate according to the embodiment. For example, a mouse, a keyboard, or the like may be further connected to the user terminal 2 as an input device, and the control unit 21 may include a plurality of processors. As the user terminal 2, a PC, a mobile phone, a smartphone, a tablet terminal, a portable game machine, or the like may be used, in addition to a terminal designed exclusively for the provided service.
In the present embodiment, the user photographs the subject using the camera module 26 of the user terminal 2. Specifically, the user attaches the close-up lens 50 fitted with the panel 40 to the camera module 26 of the user terminal 2 and, as shown in FIG. 9 for example, performs photographing with the close-up lens 50 in contact with or close to his or her skin. Photographing in this way yields a magnified image of the skin, the index portion 41, and the marks 44.
The server 3 is an information processing apparatus including a control unit 31 containing a CPU, RAM, ROM, and the like; an auxiliary storage device 32 that stores programs executed by the control unit 31; and a network interface 33 for communicating via the network 5. In the present embodiment, the images photographed by the user are accumulated on the server 3.
As with the user terminal 2, components of the specific hardware configuration of the server 3 may be omitted, replaced, or added as appropriate according to the embodiment. The server 3 may also be implemented by one or more information processing apparatuses.
Note that the information processing system 1 according to the present embodiment is not limited to the example realized by the two information processing apparatuses shown in FIG. 21. For example, the information processing system 1 may be realized by a single information processing apparatus or by three or more information processing apparatuses. Regarding the hardware configuration of the information processing system 1, components may be omitted, replaced, or added according to the embodiment.
For example, FIG. 22 illustrates a variation of the hardware configuration of the information processing system 1 according to the present embodiment. The information processing system 1A shown in FIG. 22 is realized by a single information processing apparatus that includes a control unit 10 containing a CPU, RAM, ROM, and the like; an auxiliary storage device 11 that stores programs executed by the control unit 10; a network interface 12 for communicating via a network; an input device 13 including the camera module 26; and an output device 14 including a display, LEDs, a speaker, and the like. As with the user terminal 2 and the server 3, components of the specific hardware configuration of this apparatus may be omitted, replaced, or added as appropriate according to the embodiment.
[Functional Configuration]
FIG. 23 outlines the functional configuration of the information processing system 1 according to the present embodiment. The information processing system 1 includes an image acquisition unit 15, an image processing unit 16, a storage unit 17, and a display control unit 18.
These functions are realized by the CPUs of the user terminal 2 and the server 3 interpreting and executing various programs loaded into their respective RAMs, thereby controlling the respective components. For example, the image acquisition unit 15 and the display control unit 18 are realized in the user terminal 2, and the image processing unit 16 and the storage unit 17 are realized in the server 3.
However, each function may be realized in either the user terminal 2 or the server 3. For example, the display control unit 18 may be realized in the server 3, and the image processing unit 16 may be realized in the user terminal 2.
A single function may also be realized across a plurality of information processing apparatuses. For example, the image processing unit 16 may be realized across the user terminal 2 and the server 3. In this case, the user terminal 2 and the server 3 realize the image processing unit 16 by sharing the plurality of processes executed by the image processing unit 16, described later. In other words, a plurality of information processing apparatuses may realize one function by dividing the plurality of processes of that function between them.
When the functions of the information processing system 1 are realized by a plurality of information processing apparatuses in this way, those apparatuses cooperate with each other to realize the functions, and data communication for exchanging the information involved in this cooperation is performed between them. In the present embodiment, the user terminal 2 and the server 3 cooperate by performing data communication via the network 5 to realize the functions of the information processing system 1 of the present embodiment.
In the present embodiment, an example is described in which all of these functions are realized by general-purpose CPUs. However, some or all of these functions may be realized by one or more dedicated processors.
The functional configuration shown in FIG. 23 is merely an example of the information processing system 1 according to the present embodiment. Functions may therefore be omitted, replaced, or added as appropriate according to the embodiment. For example, the display control unit 18 may be omitted when the display of images during photographing, described later, is omitted.
The image acquisition unit 15 acquires an image photographed together with the subject when the close-up lens 50 attached to the camera module 26 of the user terminal 2 is brought into contact with or close to the subject to photograph the subject, the mark 44 indicating predetermined information and located at a position in contact with or close to the subject being included in the imaging range of the camera module 26. The image processing unit 16 extracts the mark 44 from the image acquired by the image acquisition unit 15. The storage unit 17 then stores the image in association with the information indicated by the mark 44 extracted from the image.
The image acquisition unit 15 may also acquire an image photographed together with the subject and the marks 44 in which at least part of the index portion 41, which bears a predetermined color serving as a reference for analyzing images photographed by the camera module 26 and which is located at a position in contact with or close to the subject, is included in the imaging range of the camera module 26. The image processing unit 16 may extract the index portion 41 from the image acquired by the image acquisition unit 15, may execute analysis processing of the acquired image based on the predetermined color of the extracted index portion 41, and may further execute correction processing on the acquired image based on the result of the analysis processing. The storage unit 17 may then accumulate the image corrected by the image processing unit 16.
The image processing unit 16 may also execute analysis processing of the subject on the portion of the image acquired by the image acquisition unit 15 that relates to the subject. The storage unit 17 may then store the result of this subject analysis in association with the image acquired by the image acquisition unit 15.
The image processing unit 16 may also instruct the image acquisition unit 15 to acquire an image again when the analysis processing of the subject fails.
The image acquisition unit 15 may also acquire an image including a landmark indicating the position at which the close-up lens 50 is mounted on the camera module 26 of the user terminal 2. The image processing unit 16 may determine whether the landmark is included in a predetermined region of the image acquired by the image acquisition unit 15 and, when it determines that the landmark is not included in that region, may issue a notification that the close-up lens 50 is mounted in the wrong position.
While the camera module 26 of the user terminal 2 photographs the subject, the display control unit 18 may output an image containing both the image being acquired by the camera module 26 and the landmark indicating the position at which the close-up lens 50 is mounted on the camera module 26, and may also output, on that image, guidance information indicating the reference mounting position of the close-up lens 50 relative to the landmark.
[Operation Example]
Next, an operation example of the information processing system 1 according to the present embodiment will be described with reference to FIGS. 24 to 28. The operation example described below is merely one example of the information processing of the information processing system 1. The order of the steps may therefore be changed to the extent possible, unless, for example, a step must use the result of a step executed before it, and steps may be omitted, replaced, or added as appropriate according to the embodiment.
FIG. 24 shows an example of the processing procedure of the information processing system 1 according to the present embodiment. In FIG. 24 and FIG. 27 described later, "step" is abbreviated as "S".
First, for example, in response to the user's operation of the user terminal 2, a program stored in the auxiliary storage device 22 is loaded into the RAM or the like of the control unit 21 and executed by the CPU of the control unit 21. In this way, the information processing system 1 starts processing.
In step 100, the image acquisition unit 15 acquires an image photographed together with the subject when the close-up lens 50 attached to the camera module 26 of the user terminal 2 is brought into contact with or close to the subject to photograph the subject, the mark 44 indicating predetermined information and located at a position in contact with or close to the subject being included in the imaging range of the camera module 26.
For example, the user attaches the close-up lens 50 fitted with the panel 40 to the camera module 26 of the user terminal 2 and operates the user terminal 2 with the close-up lens 50 in contact with or close to his or her skin. In response to this operation, the control unit 21 controls photographing by the camera module 26 and obtains the resulting image from the camera module 26. The information processing system 1 thereby acquires an image in which the marks (44a, 44b) are photographed together with the subject (skin).
FIG. 25 illustrates an image 80 photographed by the camera module 26 of the user terminal 2. A region 81 corresponds to the region of the opening 43 of the panel 40 and is the region in which the subject (skin) appears. A region 82 corresponds to the region of the panel 40 occupied by the index portion 41 and the marks 44 and is the region in which they appear. A boundary 83 is the boundary between the regions 81 and 82 and corresponds to the boundary between the opening 43 and the index portion 41.
In this way, the image acquisition unit 15 according to the present embodiment may acquire an image in which the index portion 41 is photographed together with the subject and the marks 44. On the other hand, when photographing is performed using the panel 40G (FIG. 15F), whose index portion is not included in the imaging range 70, the image acquisition unit 15 acquires an image that does not include the index portion 41. The information processing system 1 according to the present embodiment may also process such an image that does not include the index portion 41.
Here, the boundary 83 corresponds to the boundary between the opening 43 and the index portion 41, that is, to the edge that defines the opening 43. The panel 40 provided with the opening 43 is mounted by being fitted onto the upper end portion 53 of the close-up lens 50. Seen from the camera module 26 of the user terminal 2, the position of the edge defining the opening 43 therefore corresponds to the position of the close-up lens 50 on which the panel 40 is mounted. In other words, the position of the close-up lens 50 determines the position of the edge defining the opening 43 and, in turn, the position of the boundary 83 in the acquired image. The boundary 83 thus serves as a landmark indicating the position at which the close-up lens 50 is mounted on the camera module 26 of the user terminal 2.
In this way, in the present embodiment, the image acquisition unit 15 acquires an image including a landmark indicating the position at which the close-up lens 50 is mounted on the camera module 26 of the user terminal 2. The landmark is not limited to the boundary 83 exemplified in this embodiment; any feature indicating the positional relationship between the camera module 26 of the user terminal 2 and the close-up lens 50 may be used. For example, an edge in the color pattern of the index portion 41, a mark 44, or the like may serve as the landmark, as may a predetermined pattern or the like provided at a position on the close-up lens 50 that falls within the imaging range of the camera module 26.
In this case, the image processing unit 16 may determine whether the boundary 83 is included in a predetermined region of the acquired image and, when it determines that the boundary 83 is not included in that region, may issue a notification that the close-up lens 50 is mounted in the wrong position. The region 84 shown in FIG. 25 is an example of such a predetermined region, which may be set arbitrarily.
For example, the control unit 21 may extract the boundary 83 from the image acquired by the camera module 26 by edge extraction or the like, and may determine whether the extracted boundary 83 is included in the region 84.
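One way to realize this check might look as follows (a minimal sketch, not the embodiment's own implementation: we detect the circular boundary with a Hough circle transform rather than generic edge extraction, and the function name, tolerances, and the assumption that boundary 83 is circular are ours):

```python
import cv2
import numpy as np

def lens_position_ok(image_bgr, expected_center, expected_radius, tol_px=15):
    """Detect the opening/index boundary (boundary 83) as a circle and
    check whether it falls within the tolerance region (region 84)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress skin texture before detection
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=gray.shape[0],
        param1=100, param2=40,
        minRadius=int(expected_radius * 0.7),
        maxRadius=int(expected_radius * 1.3))
    if circles is None:
        return False  # boundary 83 not found at all
    cx, cy, r = circles[0][0]
    ex, ey = expected_center
    # The boundary lies "inside region 84" if its center and radius are
    # both within the pixel tolerance of the expected values.
    return (abs(cx - ex) <= tol_px and abs(cy - ey) <= tol_px
            and abs(r - expected_radius) <= tol_px)
```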
When the control unit 21 determines that the boundary 83 is not included in the region 84, it may operate the speaker included in the output device 25 to output a sound informing the user that the close-up lens 50 is mounted in the wrong position, or it may cause the display 27 to display an image informing the user of the same.
This allows the user to learn that the close-up lens 50 is mounted in the wrong position. At this time, the control unit 21 may also control the output device 25 to provide audio or on-screen instructions indicating the correct mounting position of the close-up lens 50, thereby reducing the difficulty involved in adjusting the mounting position.
On the other hand, when the control unit 21 determines that the boundary 83 is included in the region 84, it may operate the speaker included in the output device 25 to output a sound informing the user that the close-up lens 50 is mounted in the correct position, or it may cause the display 27 to display a screen informing the user of the same. The user can thereby confirm that the close-up lens 50 is mounted correctly.
The processing for determining whether the boundary 83 is included in the region 84 may be executed in the analysis processing of step 200, described later. The server 3 may also execute this determination: for example, the user terminal 2 may transmit the acquired image to the server 3, and when the server 3 determines that the boundary 83 is not included in the region 84, it may notify the user terminal 2 that the close-up lens 50 is mounted in the wrong position. Upon receiving this notification, the user terminal 2 may perform the audio output, screen output, and the like described above.
While the camera module 26 of the user terminal 2 photographs the subject, the display control unit 18 may also output an image containing both the image being acquired by the camera module 26 and the landmark (boundary 83) indicating the position at which the close-up lens 50 is mounted on the camera module 26, and may output on that image, to the display 27, guidance information indicating the reference mounting position of the close-up lens 50 relative to the landmark (boundary 83).
FIG. 26 illustrates a screen display of the display 27 according to the present embodiment. For example, the control unit 21 outputs to the display 27 the images continuously acquired by the camera module 26 during photographing. At this time, the control unit 21 outputs a guide line 91 to the display 27 as guidance information indicating the reference mounting position of the close-up lens 50 relative to the boundary 83 on the output image.
That is, the display 27 shows the guide line 91 indicating the reference position of the boundary 83 together with an image such as that shown in FIG. 25. By positioning the close-up lens 50 so that the photographed boundary 83 aligns with the guide line 91, the user can mount the close-up lens 50 in the correct position. Displaying such a guide line 91 thus reduces the difficulty involved in adjusting the mounting position of the close-up lens 50.
In the present embodiment, the shape of the guide line 91, the guidance information indicating the reference mounting position of the close-up lens 50, corresponds to the shape of the boundary 83, the landmark indicating the position at which the close-up lens 50 is mounted. However, the guide line 91 is not limited to a shape corresponding to the boundary 83; its shape may be selected as appropriate.
Moreover, the guidance information according to the present embodiment is not limited to an indicator, such as the guide line 91, that shows the position with which the landmark (boundary 83) should be aligned. For example, the guidance information may be an indicator showing the direction in which to move the close-up lens 50, or a text display instructing the user where to mount it. The guidance information is set as appropriate according to the embodiment.
Because the close-up lens 50 is a lens used for magnified photographing, even a small shift in its mounting position has a large effect on the photographed image. When the mounting position of the close-up lens 50 on the camera module 26 is not fixed, the user must adjust the mounting position so that no such small shift occurs, and there have been situations in which mounting the close-up lens 50 was difficult. By indicating the reference mounting position of the close-up lens 50 relative to the landmark showing where the lens is mounted, the guidance information reduces the difficulty in such situations.
In this operation example, an image photographed by an apparatus included in the information processing system 1 is acquired. However, the information processing system 1 may acquire an image photographed by another apparatus not included in the system; the camera that photographs the image is not limited to the camera module 26 of the user terminal 2.
The information processing system 1 may also acquire images from other apparatuses. For example, it may acquire an image in which the marks 44 are photographed together with the subject from another information processing apparatus connected to the network 5.
The image to be processed by the information processing system 1 may be an image continuously acquired during photographing by a camera such as the camera module 26 of the user terminal 2, or an image obtained by releasing the shutter of that camera.
In step 200, the image processing unit 16 executes analysis processing and correction processing; a specific example is shown in FIG. 27. An example in which the server 3 executes these processes is described below, but they may be executed in either the user terminal 2 or the server 3, or shared between them. When the processes are shared between the user terminal 2 and the server 3, the two exchange the data to be processed by data communication via the network 5 and switch the processing entity accordingly.
FIG. 27 is a flowchart illustrating the analysis processing and correction processing performed by the information processing system 1 according to the present embodiment. The server 3 receives the image acquired in step 100 and transmitted from the user terminal 2, and executes the analysis processing and correction processing on the received image. Specifically, the processing proceeds as follows.
In step 201, the control unit 31 extracts the index portion 41 from the image acquired in step 100. For example, the control unit 31 extracts, from the image 80 shown in FIG. 25, the region 82 in which the index portion 41 appears. This processing corresponds to an example of the image analysis processing of the present invention.
In the present embodiment, the index portion 41 has a color pattern in which white and black are arranged alternately. The control unit 31 estimates regions colored approximately white or approximately black to be the region 82 in which the index portion 41 appears, and extracts those regions.
Specifically, the control unit 31 refers to the RGB value of each pixel of the image 80. If the difference between the referenced RGB value and the RGB value representing white (255, 255, 255) is within a predetermined threshold, the control unit 31 determines that the pixel belongs to a region in which a white area of the index portion 41 appears. Likewise, if the difference between the referenced RGB value and the RGB value representing black (0, 0, 0) is within a predetermined threshold, the control unit 31 determines that the pixel belongs to a region in which a black area of the index portion 41 appears. If the referenced RGB value satisfies neither condition, the control unit 31 determines that the pixel does not belong to the region in which the index portion 41 appears. By making these determinations for each pixel of the image 80, the control unit 31 estimates and extracts the region 82 in which the index portion 41 appears.
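A minimal sketch of this per-pixel classification (the threshold value and the use of Euclidean distance as the "difference" are our assumptions; the embodiment only requires a difference within a predetermined threshold):

```python
import numpy as np

WHITE = np.array([255, 255, 255], dtype=np.float32)
BLACK = np.array([0, 0, 0], dtype=np.float32)

def extract_index_region(image_rgb, threshold=60.0):
    """Return a boolean mask of pixels close to the white or black
    patches of the index portion 41 (the candidate region 82)."""
    px = image_rgb.astype(np.float32)
    # Distance of each pixel's RGB value to the white / black references.
    d_white = np.linalg.norm(px - WHITE, axis=-1)
    d_black = np.linalg.norm(px - BLACK, axis=-1)
    return (d_white <= threshold) | (d_black <= threshold)
```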
Note that when the subject itself has a color close to white or black, pixels in the region in which the subject appears (region 81) may be erroneously determined to be pixels of the region in which the index portion 41 appears (region 82).
To handle such cases, the control unit 31 may store the shape of the opening 43 or the shape of the index portion 41 in advance in the auxiliary storage device 32 or the like. The control unit 31 may then refer to this stored shape information and estimate the position of the region 81 or the region 82, thereby avoiding erroneously determining pixels of the region 81 to be pixels of the region 82.
The control unit 31 may also extract the boundary 83 and determine the position of the border between the regions 81 and 82 in advance, thereby avoiding the same misclassification.
The control unit 31 may also identify the position of the boundary 83 from the extracted region 82 and perform the determination described above as to whether the close-up lens 50 is mounted in the correct position. For example, the control unit 31 may determine whether the boundary 83 whose position has been identified is included in the region 84. When it determines that the boundary 83 is not included in the region 84, the control unit 31 may notify the user terminal 2 with a message indicating that the close-up lens 50 is mounted in the wrong position; when it determines that the boundary 83 is included in the region 84, the control unit 31 may notify the user terminal 2 with a message indicating that the mounting position is correct.
In this way, the control unit 31 attempts to extract, from the image 80, the region 82 in which the index portion 41 appears. The method of extracting the region 82 is not limited to the one described here and may be selected as appropriate according to the embodiment.
If the region 82 cannot be extracted from the image 80, the control unit 31 ends the analysis and correction processing ("NO" in step 202). If the region 82 can be extracted from the image 80 ("YES" in step 202), the control unit 31 proceeds to step 203.
In step 203, the control unit 31 corrects defocus blur in the image. For example, the control unit 31 determines the regions of the image 80 in which defocus has occurred and sharpens those regions using a sharpening filter or the like. The processing for determining the blurred regions corresponds to an example of the image analysis processing of the present invention, and the sharpening of those regions corresponds to an example of the correction processing of the present invention.
In the present embodiment, the control unit 31 determines whether defocus has occurred using the white-to-black boundaries (edges) in the color pattern of the index portion 41 appearing in the region 82 extracted in step 201. The more out of focus the image is, the more gradual the change (gradient) of the pixel values (RGB values) across an edge becomes. The control unit 31 therefore determines whether defocus has occurred based on the change of the pixel values across such edges.
The control unit 31 then sharpens the regions determined to be defocused by applying a sharpening filter or the like to them, thereby correcting the defocus of the image 80. The method of correcting defocus is not limited to this method and may be selected as appropriate according to the embodiment.
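A minimal sketch of this idea (the sharpness measure, the gradient threshold, and the choice of unsharp masking as the "sharpening filter" are our assumptions):

```python
import cv2
import numpy as np

def edge_sharpness(gray, edge_mask):
    """Mean gradient magnitude on the known white/black edges of the
    color pattern; a low value suggests defocus."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = np.sqrt(gx * gx + gy * gy)
    return float(mag[edge_mask].mean())

def unsharp(image, amount=1.0, sigma=2.0):
    """Simple unsharp masking used as the sharpening filter."""
    blurred = cv2.GaussianBlur(image, (0, 0), sigma)
    return cv2.addWeighted(image, 1.0 + amount, blurred, -amount, 0)

# Hypothetical usage with an assumed threshold:
# if edge_sharpness(gray, edge_mask) < SHARPNESS_THRESHOLD:
#     image = unsharp(image)
```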
In step 204, the control unit 31 performs color correction and illumination-unevenness correction on the image. For example, depending on the function and performance of the camera, an image may be obtained whose colors differ from the actual colors; likewise, the illumination light and the light source may cause uneven illumination, again yielding colors that differ from the actual ones. In step 204, the control unit 31 analyzes the effects of this photographing environment and corrects the image so as to reduce them. The processing for analyzing the effects of the photographing environment corresponds to an example of the image analysis processing of the present invention, and the processing for correcting the image to reduce those effects corresponds to an example of the correction processing of the present invention.
The index portion 41 appearing in the region 82 of the image 80 has a color pattern in which white and black are arranged alternately. The pixels within a white patch should all have the same RGB value, and likewise the pixels within a black patch.
In reality, under the influence of the photographing environment, such as the function and performance of the camera, the illumination light, and the light source, the RGB value of each pixel deviates from the actual color according to that influence. Regions that should have identical RGB values therefore exhibit variations in their RGB values. By correcting the RGB values so as to eliminate this variation, the control unit 31 performs color correction and illumination-unevenness correction.
For example, for each of the regions in which a white patch appears and the regions in which a black patch appears, the control unit 31 takes the actual color of the region as the reference (reference color) and computes the difference (deviation) between the RGB value of each pixel in the region and the RGB value of the reference color.
Specifically, in a region where a white patch appears, the control unit 31 measures the difference between the color of each pixel and the reference color by computing the difference between the pixel's RGB value and the RGB value representing white. The control unit 31 thereby obtains a two-dimensional distribution of the difference between pixel color and reference color over the white-patch regions, and obtains a corresponding two-dimensional distribution for the black-patch regions by the same measurement.
From the waveform of the two-dimensional distribution of these differences obtained for the pixels in the region 82, the control unit 31 then estimates the corresponding two-dimensional distribution for the pixels in the region 81. For example, the control unit 31 fills the blank area (region 81) of the two-dimensional distribution created over the region 82 by a predetermined interpolation calculation, thereby estimating the distribution of the difference between pixel color and reference color within the region 81.
Having thus obtained a two-dimensional distribution of the difference between pixel color and reference color over the entire image 80, the control unit 31 realizes color correction and illumination-unevenness correction by correcting the RGB value of each pixel using this distribution.
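A minimal sketch of this interpolation-based correction (linear interpolation and the use of the white patches alone as the reference are simplifying assumptions of ours; the embodiment also uses the black patches):

```python
import numpy as np
from scipy.interpolate import griddata

def correct_illumination(image_rgb, white_mask, white_ref=(255, 255, 255)):
    """Estimate the per-pixel deviation from the reference color on the
    index-portion pixels and interpolate it over the whole frame."""
    h, w, _ = image_rgb.shape
    ys, xs = np.nonzero(white_mask)
    # Deviation of each known (index-portion) pixel from the reference.
    dev = image_rgb[ys, xs].astype(np.float32) - np.float32(white_ref)
    # Interpolate the deviation field over every pixel, including region 81.
    gy, gx = np.mgrid[0:h, 0:w]
    dev_full = np.stack([
        griddata((ys, xs), dev[:, c], (gy, gx),
                 method='linear', fill_value=float(dev[:, c].mean()))
        for c in range(3)], axis=-1)
    corrected = image_rgb.astype(np.float32) - dev_full
    return np.clip(corrected, 0, 255).astype(np.uint8)
```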
The reference colors may be set as appropriate. However, from the viewpoint of bringing the colors of the subject closer to the actual colors, it is desirable to use the actual color of each region as the reference, as in the present embodiment.
The correction of step 204 brings the colors of the subject in the image closer to the actual colors, which improves the accuracy of the skin analysis described later. Because the subject's colors can be brought closer to the actual colors, color recognition accuracy also increases, making it possible, for example, to increase the number of colors usable in a coding scheme that represents information with a two-dimensional array of colors.
In step 205, the control unit 31 corrects distortion in the image. For example, the control unit 31 determines the regions of the image 80 in which distortion has occurred and corrects the distortion of those regions. The processing for determining the distorted regions corresponds to an example of the image analysis processing of the present invention, and the processing for correcting the distortion corresponds to an example of the correction processing of the present invention.
In the present embodiment, the control unit 31 determines whether distortion has occurred using the edges in the color pattern of the index portion 41 appearing in the region 82 extracted in step 201. The edges in the color pattern of the index portion 41 are straight lines. The control unit 31 therefore determines whether distortion has occurred by determining whether the shape of an edge deviates from a straight line beyond a threshold.
 そして、制御部31は、閾値を超えて直線ではなくなっていると判定したエッジの形状を直線になるように変換することで、歪みが生じていると判定した領域の歪みを補正する。なお、画像の歪みを補正する方法は、このような方法に限定されず、実施形態に応じて適宜選択されてよい。 Then, the control unit 31 corrects the distortion of the region determined to be distorted by converting the shape of the edge determined to be no longer a straight line beyond the threshold value so as to become a straight line. Note that the method for correcting image distortion is not limited to such a method, and may be appropriately selected according to the embodiment.
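 As a sketch of the straightness test just described, one could fit a least-squares line to an edge's extracted pixel coordinates and compare the worst deviation against the threshold; the fit method, the 2-pixel threshold, and the assumption that the edge is not near-vertical are illustrative choices rather than the embodiment's own.

    import numpy as np

    def edge_is_distorted(xs, ys, threshold=2.0):
        # xs, ys: pixel coordinates of one edge of the color pattern
        xs = np.asarray(xs, dtype=float)
        ys = np.asarray(ys, dtype=float)
        # fit a straight line y = a*x + b through the edge pixels
        a, b = np.polyfit(xs, ys, deg=1)
        # perpendicular distance of each point from the fitted line
        dist = np.abs(a * xs - ys + b) / np.hypot(a, 1.0)
        # distorted if any point strays more than `threshold` pixels
        return dist.max() > threshold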
 In step 206, the control unit 31 extracts the marks 44 from the image 80. For example, using the patterns of the marks 44 stored in advance in the auxiliary storage device 32 or the like, the control unit 31 detects patterns in the image 80 that match them, thereby extracting the marks 44 (pattern matching). For the panel 40 shown in FIG. 10, two kinds of marks 44, the marks 44a and 44b, are extracted.
 For example, the mark 44a indicates that the close-up lens 50 to which the panel 40 is attached is a genuine product, while the mark 44b indicates the business operator (operator A) of the product sold together with the close-up lens 50 and the panel. Based on the patterns used for the extraction, the control unit 31 recognizes the predetermined information of this kind indicated by the marks 44 on the panel 40.
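 One common realization of this pattern matching is normalized template matching; the sketch below uses OpenCV by way of example, and the 0.8 acceptance threshold is an assumed value.

    import cv2

    def find_mark(image_gray, template_gray, threshold=0.8):
        # slide the stored pattern of the mark 44 over the image and
        # record the normalized correlation at every offset
        scores = cv2.matchTemplate(image_gray, template_gray,
                                   cv2.TM_CCOEFF_NORMED)
        _, best, _, location = cv2.minMaxLoc(scores)
        # accept the mark only when the best match is strong enough
        return location if best >= threshold else None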
 As a variation of the panel 40, there is a panel 40L whose index portion has steps. As shown in FIG. 18, the panel 40L has steps around the opening, and each step carries an index portion 41K and marks (44a, 44b). When an image is captured through such a panel 40L with the focus on the skin (subject), the steps lie at different positions along the optical axis, so the image of one step comes out sharper than that of the other.
 Therefore, by using a stepped panel in this way and switching the step from which the image is read according to the scene, the information processing system 1 can obtain a sharper image that includes the index portion 41 and the marks 44. In other words, it can reduce the defocusing of the index portion 41 and the marks 44 that occurs when the subject is brought into focus.
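 How the sharper step might be chosen automatically is not spelled out above; a conventional focus measure such as the variance of the Laplacian would serve, as in this assumed sketch.

    import cv2

    def sharper_step(step_a, step_b):
        # step_a, step_b: grayscale crops of the two steps
        # variance of the Laplacian: higher means better in focus
        def focus(img):
            return cv2.Laplacian(img, cv2.CV_64F).var()
        return step_a if focus(step_a) >= focus(step_b) else step_b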
 In step 207, the image processing unit 16 executes the subject analysis process. In the present embodiment the subject is human skin, so the control unit 31 analyzes the skin shown in the image 80 (skin analysis). For example, the control unit 31 binarizes and thins the image of the region 81 in which the subject appears, and then detects texture, pores, blemishes, and the like in the image, for instance by measuring regularity through frequency analysis.
 In the present embodiment, skin analysis has been given as the example of the subject analysis process. However, the subject analysis process is not limited to skin analysis and may be set as appropriate according to the target subject.
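 A minimal sketch of the binarize-thin-frequency pipeline of step 207 follows; Otsu thresholding, scikit-image's skeletonize, and an FFT peak ratio as the regularity measure are all illustrative assumptions rather than the embodiment's actual algorithms.

    import cv2
    import numpy as np
    from skimage.morphology import skeletonize

    def analyze_skin(region_gray):
        # region_gray: 8-bit grayscale crop of the subject region 81
        # binarize the skin texture with Otsu's threshold
        _, binary = cv2.threshold(region_gray, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        # thin the texture lines down to one-pixel-wide skeletons
        skeleton = skeletonize(binary > 0)
        # frequency analysis: a pronounced off-center peak in the
        # power spectrum indicates a regular, fine texture
        spectrum = np.abs(np.fft.fftshift(
            np.fft.fft2(skeleton.astype(float))))
        h, w = spectrum.shape
        spectrum[h // 2, w // 2] = 0.0  # drop the DC component
        regularity = spectrum.max() / (spectrum.mean() + 1e-9)
        return skeleton, regularity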
 Further, as a variation of the panel 40 shown in FIG. 16, there is a panel 40H whose index portion is colored in graduated steps. When the skin (subject) is photographed through such a panel 40H, the information processing system 1 can compare each color of the index portion 41G with the color of the subject and thereby identify the position (angle) on the index portion 41G that has the same color as the subject. The information processing system 1 can then estimate the color of the subject with the color of the index portion 41G at the identified position (angle) as the reference.
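 That lookup reduces to a nearest-color search; in this sketch the sampled colors of the index portion 41G and their angular positions are assumed inputs, and plain Euclidean RGB distance is an assumed similarity measure.

    import numpy as np

    def closest_index_color(subject_rgb, index_colors, index_angles):
        # subject_rgb:  mean RGB of the subject region (length 3)
        # index_colors: N x 3 sampled colors of the index portion 41G
        # index_angles: the N angles at which they were sampled
        distances = np.linalg.norm(index_colors - subject_rgb, axis=1)
        i = int(np.argmin(distances))
        # the color at this angle becomes the reference for estimating
        # the subject's color
        return index_angles[i], index_colors[i]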
 This completes the image analysis and correction processes, and the processing proceeds to step 300.
 The order of the processes of steps 201 to 207 may be changed as appropriate according to the embodiment. For example, the control unit 31 may change the order of the correction processes of steps 203 to 205, or may execute the process of step 206 before step 201.
 Processes may also be omitted, replaced, or added as appropriate according to the embodiment. For example, the panel 40G shown in FIG. 15F has no index portion, so the control unit 31 may omit the processes of steps 201 to 205.
 In the present embodiment, colors have been described using RGB. However, the color representation usable in the present embodiment is not limited to RGB and may be selected as appropriate.
 Returning to FIG. 24, in step 300 the image processing unit 16 determines whether to have the image acquisition unit 15 acquire an image again. For example, the control unit 31 decides to have the image re-acquired when the index portion 41 could not be extracted in step 201, when the marks 44 could not be extracted in step 206, or when the subject (skin) analysis of step 207 did not succeed. In that case the control unit 31 notifies the user terminal 2 so that it acquires an image again, and returns the processing to step 100. When none of these situations applies, the control unit 31 advances the processing to step 400.
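 The decision of step 300 amounts to a simple predicate over the outcomes of steps 201, 206, and 207; the flag names below are hypothetical, introduced only for this sketch.

    from dataclasses import dataclass

    @dataclass
    class ProcessingResult:
        index_extracted: bool     # step 201 succeeded
        marks_extracted: bool     # step 206 succeeded
        analysis_succeeded: bool  # step 207 succeeded

    def needs_reacquisition(r: ProcessingResult) -> bool:
        # ask the user terminal for a new image if anything failed
        return not (r.index_extracted and r.marks_extracted
                    and r.analysis_succeeded)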
 In step 400, the display control unit 18 outputs the analysis result of the subject obtained in step 207. For example, the control unit 31 of the server 3 notifies the user terminal 2 of the analysis result, and the control unit 21 of the user terminal 2 outputs the received result to the display 27, thereby presenting it to the user.
 In step 500, the storage unit 17 stores the image on which each correction process has been performed, the predetermined information indicated by the marks 44, and the analysis result of the subject in association with one another. For example, the auxiliary storage device 32 of the server 3 has a database for accumulating this information.
 FIG. 28 illustrates records of the database that the information processing system 1 according to the present embodiment uses to accumulate images; specifically, it illustrates a list of records for user A.
 The database used to accumulate the acquired information has an ID field storing an identifier that identifies each record, a shooting date/time field storing when the acquired image was taken, a file name field storing the file name of the acquired image, an attribute information field storing the predetermined information indicated by the marks 44, and an analysis result field storing the analysis result of the subject (skin).
 The control unit 31 creates a record whose fields are filled in to match the corrected image, the predetermined information indicated by the marks 44 extracted in step 206, and the analysis result of the subject from step 207, and adds the created record to the database. In this way, the control unit 31 accumulates the corrected image, the predetermined information indicated by the marks 44, and the analysis result of the subject in association with one another.
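 A minimal sketch of the step-500 schema and insertion follows, using SQLite by way of example; the table and column names mirror the fields listed above but are otherwise assumptions.

    import sqlite3

    def store_record(db_path, shot_at, file_name, attributes, analysis):
        con = sqlite3.connect(db_path)
        con.execute("""
            CREATE TABLE IF NOT EXISTS records (
                id         INTEGER PRIMARY KEY AUTOINCREMENT,
                shot_at    TEXT,  -- shooting date/time field
                file_name  TEXT,  -- file name of the corrected image
                attributes TEXT,  -- information indicated by marks 44
                analysis   TEXT   -- subject (skin) analysis result
            )""")
        con.execute(
            "INSERT INTO records (shot_at, file_name, attributes, analysis)"
            " VALUES (?, ?, ?, ?)",
            (shot_at, file_name, attributes, analysis))
        con.commit()
        con.close()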
1 ... information processing system, 2 ... user terminal,
3 ... server, 5 ... network,
15 ... image acquisition unit, 16 ... image processing unit,
17 ... storage unit, 18 ... display control unit,
21 ... control unit, 22 ... auxiliary storage device,
23 ... network interface,
24 ... input device, 25 ... output device,
26 ... camera module, 27 ... display,
31 ... control unit, 32 ... auxiliary storage device,
33 ... network interface,
40 ... panel, 41 ... index portion, 43 ... opening, 44 ... mark,
50 ... close-up lens, 51 ... bottom portion, 52 ... groove portion,
53 ... upper end portion, 54 ... lens portion,
60 ... cap, 70 ... shooting range,
80 ... image, 81, 82, 84 ... regions, 83 ... boundary, 91 ... guide line

Claims (10)

  1.  An indicator body comprising:
    an index portion which, when a subject is photographed with a close-up lens attached to a camera in contact with or close to the subject, is present at a position in contact with or close to the subject and, by having at least a part thereof included in the shooting range of the camera, is photographed together with the subject, the index portion bearing a predetermined color that serves as a reference for analysis of the image taken by the camera.
  2.  The indicator body according to claim 1, further comprising:
    a mark indicating predetermined information, the mark being included in the shooting range of the camera and thereby photographed together with the subject and the index portion when the subject is photographed with the close-up lens attached to the camera in contact with or close to the subject.
  3.  The indicator body according to claim 1 or 2, further comprising an opening in a portion included in the shooting range of the camera when the subject is photographed with the close-up lens attached to the camera in contact with or close to the subject.
  4.  The indicator body according to claim 3, wherein the opening is provided such that the center of the opening is located near the optical axis of the camera.
  5.  The indicator body according to claim 4, wherein the index portion is provided so as to surround an outer edge of the opening.
  6.  The indicator body according to claim 5, wherein the opening has a rotationally symmetric shape.
  7.  The indicator body according to claim 6, wherein the index portion has a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.
  8.  The indicator body according to claim 6 or 7, wherein a plurality of the marks are arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.
  9.  The indicator body according to any one of claims 1 to 8, wherein the index portion has a step in the optical axis direction.
  10.  A close-up lens unit used by being attached to a camera, the close-up lens unit comprising:
    an index portion which, in the state of being attached to the camera, is located at a position in contact with or close to a subject at the time of shooting and, by having at least a part thereof included in the shooting range of the camera, is photographed together with the subject, the index portion bearing a predetermined color that serves as a reference for analysis of the image taken by the camera.
PCT/JP2012/056025 2012-03-08 2012-03-08 Indicator, and close-up lens unit WO2013132636A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/056025 WO2013132636A1 (en) 2012-03-08 2012-03-08 Indicator, and close-up lens unit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/056025 WO2013132636A1 (en) 2012-03-08 2012-03-08 Indicator, and close-up lens unit

Publications (1)

Publication Number Publication Date
WO2013132636A1 (en)

Family

ID=49116150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/056025 WO2013132636A1 (en) 2012-03-08 2012-03-08 Indicator, and close-up lens unit

Country Status (1)

Country Link
WO (1) WO2013132636A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026472A1 (en) * 2017-07-31 2019-02-07 マクセルホールディングス株式会社 Conversion lens unit and state measurement system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0719839A (en) * 1993-06-30 1995-01-20 Shiseido Co Ltd Surface condition analyzing system
JPH1163943A (en) * 1997-08-19 1999-03-05 Teijin Ltd Method and device for measuring facial configuration
JP2003162712A (en) * 2001-08-29 2003-06-06 L'oreal Sa Method and device for acquiring image of part of human body

Similar Documents

Publication Publication Date Title
US9345429B2 (en) Image processing device, image processing system, image processing method, and program
US20130113888A1 (en) Device, method and program for determining obstacle within imaging range during imaging for stereoscopic display
GB2533692A (en) Auto-contrast viewfinder for an indicia reader
EP2635019B1 (en) Image processing device, image processing method, and program
JP2017182038A (en) Projection system and correction method of projection screen
KR102134751B1 (en) Sperm liver test kit, system and method for liver sperm test
JP2005176230A (en) Image processor and print system
EP3627822B1 (en) Focus region display method and apparatus, and terminal device
JP6294012B2 (en) Lens unit
US20120327259A1 (en) Image processing device, image processing method, image capturing device, and program
JP2017038162A (en) Imaging apparatus and image processing method, program, and storage medium
JP5619124B2 (en) Image processing apparatus, imaging apparatus, image processing program, and image processing method
US20160227196A1 (en) 3d scanning apparatus and method using lighting based on smart phone
JPWO2014007268A1 (en) Lens unit
TWI394085B (en) Method of identifying the dimension of a shot subject
KR102503872B1 (en) Information processing device, information processing method, and program
WO2013132636A1 (en) Indicator, and close-up lens unit
JP5964772B2 (en) Lens information registration system, lens information server and camera body used for lens information registration system
WO2013132637A1 (en) Information processing system, information processing method, and programme
WO2021172019A1 (en) Image processing device and method for controlling image processing device
CN111314608B (en) Image focusing prompting method, computer device and computer readable storage medium
JP2014116789A (en) Photographing device, control method therefor, and program
WO2020059157A1 (en) Display system, program, display method, and head-mounted device
JP2019111004A (en) Drawing system, drawing device, and terminal device
JP5294100B1 (en) Dot pattern reading lens unit, figure with dot pattern reading lens unit mounted on pedestal, card placed on dot pattern reading lens unit, information processing apparatus, information processing system

Legal Events

Date Code Title Description
121   Ep: the epo has been informed by wipo that ep was designated in this application
      Ref document number: 12870344
      Country of ref document: EP
      Kind code of ref document: A1
DPE1  Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP  Non-entry into the national phase
      Ref country code: DE
122   Ep: pct application non-entry in european phase
      Ref document number: 12870344
      Country of ref document: EP
      Kind code of ref document: A1
NENP  Non-entry into the national phase
      Ref country code: JP