WO2013136602A1 - Imaging device with projector and imaging control method therefor - Google Patents

Imaging device with projector and imaging control method therefor Download PDF

Info

Publication number
WO2013136602A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
projector
subject
unit
projected
Prior art date
Application number
PCT/JP2012/080614
Other languages
French (fr)
Japanese (ja)
Inventor
林 大輔
近藤 茂
貴嗣 青木
和紀 井上
宗之 大島
三沢 岳志
三沢 充史
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社
Publication of WO2013136602A1 publication Critical patent/WO2013136602A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof
    • H04N 9/3173: Constructional details thereof, wherein the projection device is specially adapted for enhanced portability
    • H04N 9/3176: Constructional details thereof, wherein the projection device is specially adapted for enhanced portability and is incorporated in a camera
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3185: Geometric adjustment, e.g. keystone or convergence
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B 17/24: Details of cameras or camera bodies with means for separately producing marks on the film, e.g. title, time of exposure
    • G03B 17/48: Details of cameras or camera bodies adapted for combination with other photographic or optical apparatus
    • G03B 17/54: Details of cameras or camera bodies adapted for combination with a projector

Definitions

  • The present invention relates to a photographing apparatus with a projector that increases the utility value of the projector function, and to a photographing control method therefor.
  • A projector mounted on a digital camera is limited to a small one, because it can be mounted only in the narrow empty space inside the digital camera housing. For this reason, the image quality and fineness of the displayed image are inferior to those obtained when the photographed image is displayed on a large television receiver at home or projected by a dedicated large projector.
  • The digital cameras with a projector disclosed in Patent Documents 1 and 2 are configured to project an image toward the front of the camera so that a catch light can be put in the subject's eyes or the color of the subject's face image can be changed.
  • However, the techniques of Patent Documents 1 and 2 are not sufficient to satisfy a user's willingness to purchase a digital camera with a projector; it is desirable to further increase the utility value of the projector function and to improve its usability.
  • An object of the present invention is to provide a photographing apparatus with a projector, and a photographing control method therefor, that can enhance the utility value of the projector function and improve its usability.
  • An imaging device with a projector according to the present invention includes: an imaging unit that captures an image of a subject; a projector that projects onto the subject a first image, which is an image of a collective pattern combining characters, figures, or symbols (or any of these), and marker patterns each having a preset shape; an inclination calculation unit that, from a captured image taken while a plurality of the marker patterns are projected onto different planar regions of the subject, calculates the inclination of each planar region from the degree of deformation of each marker pattern in the captured image; a plane selection unit that selects, from the calculated inclination values, the planar region where the degree of deformation of the marker pattern is smallest; a first image deformation unit that deforms the first image to be projected onto the subject by reversing the enlargement/reduction direction of the deformation according to the inclination value of the planar region selected by the plane selection unit; and an imaging control unit that causes the imaging unit to image the subject while the first image deformed by the first image deformation unit is projected by the projector onto the selected planar region of the subject.
  • An imaging control method according to the present invention, for an imaging apparatus with a projector that includes an imaging unit that images a subject and a projector that projects onto the subject a first image, which is an image of a collective pattern combining characters, figures, or symbols (or any of these), and marker patterns each having a preset shape, includes: capturing an image while a plurality of the marker patterns are projected by the projector onto different planar regions of the subject; calculating the inclination of each planar region onto which a marker pattern is projected from the degree of deformation of each marker pattern in the captured image; and selecting, from the calculated inclination values, the planar region where the degree of deformation of the marker pattern is smallest.
  • FIG. 1 is an external perspective view of a photographing apparatus according to a first embodiment of the present invention. FIG. 2 is a functional block configuration diagram of the imaging device shown in FIG. 1. FIG. 3 is a flowchart showing the control procedure of the imaging device according to the first embodiment. FIG. 4 shows a state in which a marker pattern according to the first embodiment is projected onto the subject. FIG. 5 shows a state in which character information and the like according to the first embodiment are projected onto the subject. FIG. 6 is an explanatory drawing of the normal vector according to the first embodiment. FIG. 7 is an explanatory drawing of a modification of the character information and the like.
  • FIG. 1 is an external perspective view of a photographing apparatus with a projector (digital camera) 10 according to the first embodiment.
  • The digital camera 10 includes a photographing lens 12 at the front of a rectangular camera casing 11.
  • The taking lens 12 is housed in a retractable lens barrel 13.
  • A shutter release button 14 is provided on the left shoulder portion of the camera casing 11.
  • A liquid crystal display unit (LCD 16 in FIG. 2) for displaying captured images, through images (live view images), camera menu images, and the like is provided on the rear surface of the camera housing 11.
  • A flash light emitting unit 44 is provided at the front of the right shoulder.
  • A front projection type projector (video projection unit) 17 is provided inside the upper part of the camera housing 11, and is configured to project a display image of a built-in small liquid crystal display unit toward the front of the camera through a projection window 18 at the front.
  • A live view image may be displayed on the small liquid crystal display unit built into the projector 17 so that the user can view it through a finder window provided on the rear side of the camera; in this case, the unit can also serve as an electronic viewfinder device.
  • In this embodiment the projector 17 is built in, but it may instead be provided outside the camera housing as described in, for example, Japanese Patent Application Laid-Open No. 2006-80875.
  • FIG. 2 is a functional block configuration diagram of the digital camera shown in FIG.
  • The digital camera 10 includes an image sensor 21 (a CMOS image sensor in this embodiment) provided behind the photographing lens 12, and a control unit (CPU) 22 that performs overall control of the digital camera 10.
  • The CPU 22 is connected to a bus 23. Also connected to the bus 23 are a frame memory 24, a signal processing unit 25 that performs image processing, a card interface (card I/F) 27 that stores captured images in an external memory card 26, a display control unit 28 that performs display control of the LCD (liquid crystal display unit) 16 on the back of the camera, an OSD signal generation unit 29 that generates OSD signals such as character information to be displayed on the LCD 16, and a video projection control unit 30 that controls image projection by the projector 17.
  • The bus 23 is further connected to a projected character deformation unit 31, a plane normal vector calculation unit 32, and a marker detection unit 33. These can be constituted by individual electronic components, or can be realized as part of the processing functions of the CPU 22. The same applies to the following embodiments.
  • The CPU 22 is connected to a ROM 22a and a RAM 22b that store a control program and the like, and to an operation unit 40 that includes the shutter release button 14.
  • The digital camera 10 also includes a lens drive unit 41 that controls the focus lens position and the like of the photographing lens 12, a timing generator (TG) 42 that generates the drive timing signal of the image sensor 21, a driver 43 that drives the image sensor 21, and a flash control circuit 45 that performs light emission control of the flash light emitting unit 44; these operate under control instructions from the CPU 22.
  • The CPU 22 functions as an imaging control unit.
  • FIG. 3 is a flowchart showing the processing procedure of the control program executed when the user images the subject using the digital camera 10.
  • This control program is stored in the ROM 22a, read out to the RAM 22b at the time of execution, and executed by the CPU 22.
  • For example, a character image "January 1, 2012" representing the shooting date and time is projected onto a plane portion of the subject, and an image of the subject with this character image projected on it is taken through the taking lens 12.
  • First, in step S1, a marker pattern is projected onto the subject.
  • An example in which the marker pattern is projected is shown in FIG. 4.
  • The marker pattern consists of a square frame 51 and the character 52 "copy" inside the square frame 51.
  • The marker pattern may be only the square frame 51; however, since square and rectangular subjects are common in the real world, the character 52 "copy" is included in the marker pattern so that it can easily be distinguished from an ordinary square subject.
  • The marker pattern may also be a colored pattern such as red or green.
  • In step S2, a live view image of the subject on which the marker pattern is projected is captured, and in step S3, the marker pattern region in the subject image is detected.
  • When the plane onto which the marker pattern is projected is inclined with respect to the camera, the far-side dimension of the marker pattern 51 in the subject image becomes smaller than the near-side dimension, so the marker pattern appears deformed.
  • Note that the marker pattern may not be detectable in the live view image, depending on the shape of the subject, its reflectance, its diffuse reflectance, and the like.
  • In step S4, it is determined whether the marker pattern has been detected in the live view image. If the marker pattern 51 is not detected, the process returns to step S1; if it is detected, the process proceeds to step S5.
  • In step S5, the normal vector of the plane onto which the marker pattern is projected is calculated.
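The patent does not spell out how the deformation of the marker is turned into a plane orientation. One common reading is to fit a homography between the corners of the ideal square marker and the deformed quadrilateral seen by the camera, since that homography encodes the plane's tilt. The following is a minimal NumPy sketch of the direct linear transform (DLT) fit; all names and the example corner coordinates are hypothetical, not from the patent:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit a 3x3 homography mapping src -> dst (4+ point pairs)
    via the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of the stacked constraints.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]  # normalise so H[2, 2] == 1

# Corners of the ideal square marker vs. the trapezoid the camera sees
# (far edge shrunk by keystone distortion):
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
seen = [(0.0, 0.0), (1.0, 0.0), (0.8, 0.9), (0.2, 0.9)]
H = fit_homography(square, seen)
```

Given the camera intrinsics, decomposing such a homography (e.g. with OpenCV's `cv2.decomposeHomographyMat`) yields candidate plane normals; only the fitting step is sketched here.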
  • In step S6, the character information or other content to be projected is deformed according to the normal vector.
  • The deformed characters are displayed on the small liquid crystal display unit in the projector 17 and projected toward the plane 54 of the subject (step S7).
  • As a result, the shooting date information can be projected so that all the characters appear the same size in the subject image.
  • The extent to which the character size must be deformed depends on how oblique the plane 54 is to the camera optical axis; that is, the characters to be projected may be affine-transformed according to the direction of the plane normal vector.
  • In step S7, the deformed characters are projected toward the subject plane; while maintaining this projection state, an image of the subject is captured in step S8, and the process ends.
  • In the above example, the marker pattern is the square frame 51.
  • However, since the shooting date is horizontally long, the marker may instead be a horizontally long rectangular frame.
  • When a vertically written shooting date is to be projected, a vertically long rectangular frame may be used.
  • Arbitrary character information may also be projected; for example, "* Hakone trip *" can be projected vertically.
  • FIG. 8 is a functional block configuration diagram of a digital camera according to the second embodiment of the present invention. It differs from the embodiment of FIG. 2 in that a plane selection unit 34, which selects the plane minimizing the angle between the normal vector of the marker pattern and the optical axis, is connected to the bus 23.
  • FIG. 9 is a flowchart showing the control procedure according to the second embodiment. It differs from the flowchart of FIG. 3 in the following three points. First, step S1a is executed instead of step S1, projecting a plurality of marker patterns onto the subject. Second, step S4a is executed instead of step S4, determining whether one or more marker patterns have been detected. Third, step S11 is added between step S5 and step S6.
  • In the second embodiment, a plurality of marker patterns are projected from the projector 17.
  • For example, the shooting screen is divided into 16 (4 × 4) regions, and one marker pattern is projected onto each divided region. Then the plane that yields the most easily viewable projection image when characters or the like are projected onto it is selected.
  • FIG. 10 is an explanatory diagram of the second embodiment.
  • A plurality of marker patterns are projected onto the subject in step S1a, and it is determined in step S4a whether one or more markers have been detected. In step S5, a normal vector is calculated for each detected marker pattern, and in the next step S11, the plane that minimizes the angle formed by its normal vector and the camera optical axis is selected. This is because the projection image onto that plane is the least deformed and therefore the easiest to see.
  • In step S6, the character information to be projected onto the selected plane is affine-transformed,
  • and in step S7, the subject image is captured with the affine-transformed character information projected.
  • In the example of FIG. 10, the directly-facing plane 55 is selected as the projection target plane, and easy-to-see character information and the like are projected onto it.
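The selection rule of step S11 reduces to an argmin over angles. A small sketch of that comparison, with hypothetical example normals (the optical axis is taken as the z axis):

```python
import numpy as np

def select_frontal_plane(normals, optical_axis=(0.0, 0.0, 1.0)):
    """Return the index of the plane whose normal makes the smallest
    angle with the camera optical axis, i.e. the most directly-facing
    plane, whose projected image is least deformed."""
    axis = np.asarray(optical_axis, float)
    axis = axis / np.linalg.norm(axis)
    best, best_angle = -1, np.inf
    for i, n in enumerate(normals):
        n = np.asarray(n, float)
        n = n / np.linalg.norm(n)
        # abs() makes the test insensitive to the normal's sign.
        angle = np.arccos(np.clip(abs(n @ axis), -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = i, angle
    return best

# Three detected planes; the second faces the camera almost directly:
idx = select_frontal_plane([(0.7, 0.0, 0.7),
                            (0.05, 0.0, 1.0),
                            (0.0, 0.6, 0.8)])
```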
  • The CPU 22 may cause the projected character deforming unit (first image deforming unit) 31 to fold the projected character information: as shown in FIG. 11, the image is changed to a folded image in which "2012" and "January 1" are divided into two lines, and is projected onto the plane 55.
  • Alternatively, a divided image of the projection character information, split into "2012" and "January 1", may be generated, with "2012" projected onto the plane 55 and "January 1" projected onto the plane 54.
  • In that case, the character information is affine-transformed according to the direction of each plane's normal vector.
  • In the above example, the screen is equally divided and a plurality of marker patterns are projected.
  • However, when character information is projected onto an important part of the subject, it may be bothersome and distracting.
  • In that case, the user may designate in advance, while viewing the live view image displayed on the liquid crystal display unit 16 on the back of the camera, a region in which character information is not to be projected; a plurality of marker patterns are then projected onto the regions other than the designated region.
  • The position of the designated region on the screen is determined, in the captured image (live view image), from the degree of deformation of a projected mesh image.
  • Alternatively, the CPU 22 may check in advance how large each plane capable of receiving projected character information is, and display the plane candidates on the liquid crystal display unit 16 for the user to select from; when the user selects from the plurality of candidates, the second embodiment may then be executed.
  • FIG. 13 is a functional block configuration diagram of a digital camera according to the third embodiment of the present invention. It differs from the embodiment of FIG. 2 in that a marker color setting unit 35, a complementary color calculation unit 36, and a projection-surface color determination unit 37 are connected to the bus 23.
  • The projection-surface color determination unit 37 determines the color of the subject area (projection surface) onto which the character information is projected by the projector 17.
  • The complementary color calculation unit 36 calculates the complementary color of the projection surface's color, and the marker color setting unit 35 sets the marker color to that complementary color.
  • FIG. 14 is a flowchart showing the control procedure of the third embodiment. It differs from the flowchart of FIG. 3 only in that steps S12 and S13 are added before step S1.
  • In step S12, a live view image is captured, and the color information of the plane 55 serving as the projection surface is detected. The complementary color of the color of the plane 55 is then obtained, and in step S13 the color of the marker pattern is set to this complementary color. Thereafter, the process proceeds through steps S1 to S8, and a marker pattern 51 of a color complementary to that of the plane 55 is projected. Since the marker pattern 51 is in a complementary color relationship with the color of the plane 55, the marker detection accuracy on the camera side improves. Needless to say, any color sufficiently different from the color of the plane 55 may be used, even if it is not the exact complementary color.
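The patent leaves the complementary-color computation unspecified; a simple RGB complement, with each channel reflected about the 8-bit midpoint, is one plausible reading. A short sketch with hypothetical helper names:

```python
def average_color(pixels):
    """Average RGB of the sampled plane region -- its representative color."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))

def complementary_rgb(color):
    """Complement of an 8-bit RGB color: each channel is reflected
    about the full range (255 - value)."""
    r, g, b = color
    return (255 - r, 255 - g, 255 - b)

# A greenish projection surface yields a magenta-ish marker color:
plane = [(40, 180, 60), (50, 170, 70)]
marker_color = complementary_rgb(average_color(plane))
```

A production implementation might instead compute the complement in a hue-based color space (e.g. rotating hue by 180° in HSV), which better matches perceptual complements.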
  • FIG. 16 is a functional block configuration diagram of a digital camera according to the fourth embodiment of the present invention. It differs from the third embodiment of FIG. 13 in that a projection character color setting unit 38 is connected to the bus 23.
  • FIG. 17 is a flowchart showing the control procedure of the fourth embodiment. It differs from the flowchart of FIG. 14 in that step S14 is added between step S6 and step S7, in which the complementary color of the plane color is set as the projection color.
  • As a result, the character string 56 of the shooting date is projected by the projection character color setting unit (color setting unit) 38 in a color complementary to that of the plane 55 serving as the projection surface. Easy-to-read character information and the like are thereby projected onto the subject and imaged. Needless to say, any color sufficiently different from that of the projection surface may be used, even if it is not the exact complementary color.
  • FIG. 19 is a functional block configuration diagram of a digital camera according to the fifth embodiment of the present invention. It differs from the embodiment of FIG. 2 in that a projection marker enlargement/reduction unit 39 is connected to the bus 23.
  • FIG. 20 is a flowchart showing the control procedure of the fifth embodiment. It differs from the flowchart of FIG. 3 in that step S6a is executed instead of step S6, and that when no marker pattern is detected in step S4, step S15 is executed before the process returns to step S1.
  • First, steps S1 to S4 in FIG. 20 are executed, and in step S4 it is determined whether a marker pattern has been detected.
  • When the marker pattern is not detected, it can be assumed that detection failed because a marker pattern image larger than the available plane of a given width was projected.
  • Therefore, in step S15, the projection marker enlargement/reduction unit (marker size changing unit) 39 reduces the marker pattern by a predetermined rate, for example multiplying its size by 0.9, and the process returns to step S1. In this way, as shown in FIG. 21, steps S1 to S4 are repeated while reducing the marker pattern until it is detected.
  • In step S6a, the projected character deforming unit (first image size changing unit) 31 applies an affine transformation according to the direction of the normal vector to the character information or other content to be projected.
  • When performing the affine transformation in step S6a, the size of the character information to be projected is also reduced by the same reduction rate applied to the marker pattern.
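The shrink-and-retry procedure of steps S1 to S4 and S15 can be sketched as a loop. The detector callback and the termination floor `min_scale` are assumptions for illustration, not part of the patent:

```python
def find_marker_scale(detect, scale=1.0, factor=0.9, min_scale=0.1):
    """Shrink the projected marker by `factor` until `detect(scale)`
    reports that the camera found it (a sketch of steps S1-S4 plus
    the S15 reduction of the fifth embodiment)."""
    while scale > min_scale:
        if detect(scale):   # project at this scale, capture, detect
            return scale    # the same scale is later applied to the text
        scale *= factor     # step S15: reduce by a predetermined rate
    return None             # no plane large enough was found

# Hypothetical detector: the subject's largest plane fits the marker
# only at 70% of the initial size or smaller.
found = find_marker_scale(lambda s: s <= 0.7)
```

Returning the successful scale mirrors the patent's step S6a, where the character information is reduced by the same rate that made the marker detectable.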
  • As described above, since the subject is imaged while arbitrary character information or the like is projected onto a planar area of the subject, the character information or the like is embedded in the captured image in a natural form. Because the projector function can be used in this way, its utility value and its usability are both improved.
  • In the embodiments above, the planar area of the subject is detected with a marker pattern.
  • However, on an actual subject a perfectly flat "plane" often does not exist.
  • Therefore, the user may set a threshold for how flat a surface must be in order to be used as a "plane" for character information projection.
  • Alternatively, the CPU 22 may automatically determine whether a surface is a "plane" onto which character information can be projected.
  • For example, when a straight line in a marker pattern is projected onto an uneven surface, its projected image becomes a zigzag line instead of a straight line.
  • The surface of a tree or the like is close to a cylindrical surface, so a straight line projects as an arc.
  • If such lines can be regarded as straight within a certain threshold range, the corresponding surfaces can be treated as planes onto which character information can be projected.
  • If the processing capability of the CPU 22 is high, it is also possible to treat uneven or curved surfaces as aggregates of many small planes, obtain the normal vector of each small plane, and deform the character information projected onto each small plane individually. In this way, for example, when photographing a small bird on a branch, it is possible to capture a subject image in which character information indicating the shooting date and location is projected onto the image of the branch.
  • If the CPU 22 can recognize that an elongated flat surface capable of receiving projected character information extends obliquely, the character information can be projected along the oblique plane by drawing it obliquely on the small liquid crystal display unit in the projector 17.
  • A digital camera has been described above as the embodiment of the photographing apparatus of the present invention, but the configuration of the photographing apparatus is not limited to this.
  • For example, a built-in or external PC camera, or a portable terminal device having a shooting function as described below, can be used.
  • Examples of the portable terminal device serving as an embodiment of the photographing apparatus of the present invention include a mobile phone, a smartphone, a PDA (Personal Digital Assistant), and a portable game machine.
  • Below, a smartphone is taken as an example and described in detail with reference to the drawings.
  • FIG. 22 shows the appearance of a smartphone 200 that is an embodiment of the photographing apparatus of the present invention.
  • The smartphone 200 illustrated in FIG. 22 includes a flat housing 201; a display input unit 204, in which a display panel 202 as a display unit and an operation panel 203 as an input unit are integrated, is provided on one surface of the housing 201.
  • The housing 201 also includes a speaker 205, a microphone 206, an operation unit 207, and a camera unit 208.
  • Note that the configuration of the housing 201 is not limited to this; for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide mechanism, can also be employed.
  • As in FIG. 1, a projector 17 (see FIG. 23) that projects an image of character information or the like onto the subject is provided.
  • FIG. 23 is a block diagram showing a configuration of the smartphone 200 shown in FIG.
  • As shown in FIG. 23, the main components of the smartphone include a wireless communication unit 210, the display input unit 204, a call unit 211, the operation unit 207, the camera unit 208, a storage unit 212, an external input/output unit 213, a GPS (Global Positioning System) receiving unit 214, a motion sensor unit 215, a power supply unit 216, and a main control unit 220.
  • The smartphone 200 also has a wireless communication function for performing mobile wireless communication via a base station device BS (not shown) and a mobile communication network NW (not shown).
  • The wireless communication unit 210 performs wireless communication with a base station apparatus BS accommodated in the mobile communication network NW in accordance with instructions from the main control unit 220. Using this wireless communication, various file data such as audio data and image data and e-mail data are transmitted and received, and Web data, streaming data, and the like are received.
  • The display input unit 204 displays images (still images and moving images), character information, and the like, visually conveying information to the user under the control of the main control unit 220, and detects user operations on the displayed information.
  • It is a so-called touch panel, which includes the display panel 202 and the operation panel 203.
  • The display panel 202 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • The operation panel 203 is a device mounted so that an image displayed on the display surface of the display panel 202 is visible through it, and it detects one or more coordinates operated by a user's finger or stylus. When this device is operated with a user's finger or stylus, a detection signal generated by the operation is output to the main control unit 220. The main control unit 220 then detects the operation position (coordinates) on the display panel 202 based on the received detection signal.
  • The projector 17 is connected to the main control unit 220.
  • The display panel 202 and the operation panel 203 of the smartphone 200, exemplified as an embodiment of the photographing apparatus of the present invention, integrally constitute the display input unit 204.
  • The operation panel 203 is arranged so as to completely cover the display panel 202.
  • With this arrangement, the operation panel 203 may have a function of detecting user operations even in the area outside the display panel 202.
  • In other words, the operation panel 203 may include a detection area for the portion overlapping the display panel 202 (hereinafter referred to as the display area) and a detection area for the outer edge portion not overlapping the display panel 202 (hereinafter referred to as the non-display area).
  • The operation panel 203 may thus have two sensitive areas: the outer edge portion and the inner portion. The width of the outer edge portion is designed appropriately according to the size of the housing 201 and other factors.
  • Examples of position detection methods that can be employed in the operation panel 203 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these may be adopted.
  • The call unit 211 includes the speaker 205 and the microphone 206. It converts the user's voice input through the microphone 206 into voice data that can be processed by the main control unit 220 and outputs it to the main control unit 220, and it decodes audio data received by the wireless communication unit 210 or the external input/output unit 213 and outputs it from the speaker 205.
  • For example, the speaker 205 can be mounted on the same surface as the display input unit 204, and the microphone 206 can be mounted on a side surface of the housing 201.
  • The operation unit 207 is a hardware key using a key switch or the like, and receives instructions from the user.
  • For example, the operation unit 207 is a push-button switch mounted on a side surface of the housing 201 of the smartphone 200, which is turned on when pressed with a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.
  • The storage unit 212 stores the control program and control data of the main control unit 220, application software, address data associating the names and telephone numbers of communication partners, sent and received e-mail data, Web data downloaded by Web browsing, and downloaded content data, and it temporarily stores streaming data and the like.
  • The storage unit 212 includes an internal storage unit 217 built into the smartphone and an external storage unit 218 having a removable external memory slot.
  • Each of the internal storage unit 217 and the external storage unit 218 constituting the storage unit 212 is realized using a storage medium such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, a MicroSD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
  • The external input/output unit 213 serves as an interface to all external devices connected to the smartphone 200, for direct or indirect connection to other external devices by communication (for example, universal serial bus (USB) or IEEE 1394) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), or ZigBee (registered trademark)).
  • examples of external devices connected to the smartphone 200 include a wired / wireless headset, a wired / wireless external charger, a wired / wireless data port, a memory card or a SIM (Subscriber Identity Module) / UIM (User Identity Module) card connected via a card socket, external audio / video equipment connected via an audio / video I/O (Input / Output) terminal, and external audio / video equipment connected wirelessly.
  • the external input / output unit 213 can transmit data received from such external devices to each component inside the smartphone 200, and can transmit data inside the smartphone 200 to the external devices.
  • the GPS receiving unit 214 receives GPS signals transmitted from the GPS satellites ST1 to STn in accordance with instructions from the main control unit 220, executes positioning calculation processing based on the received GPS signals, and detects the position of the smartphone 200 consisting of latitude, longitude, and altitude.
  • when the GPS receiving unit 214 can acquire position information from the wireless communication unit 210 or the external input / output unit 213 (for example, via a wireless LAN), it can also detect the position using that position information.
  • the motion sensor unit 215 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 200 in accordance with an instruction from the main control unit 220. By detecting the physical movement of the smartphone 200, the moving direction and acceleration of the smartphone 200 are detected, and the detection result is output to the main control unit 220.
  • the power supply unit 216 supplies power stored in a battery (not shown) to each unit of the smartphone 200 in accordance with an instruction from the main control unit 220.
  • the main control unit 220 includes a microprocessor, operates according to a control program and control data stored in the storage unit 212, and controls each unit of the smartphone 200 in an integrated manner.
  • the main control unit 220 includes a mobile communication control function that controls each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 210.
  • the application processing function is realized by the main control unit 220 operating according to the application software stored in the storage unit 212.
  • application processing functions include, for example, an infrared communication function that controls the external input / output unit 213 to perform data communication with a counterpart device, an e-mail function that transmits and receives e-mails, and a Web browsing function that browses Web pages.
  • the main control unit 220 has an image processing function such as displaying video on the display input unit 204 based on image data (still image or moving image data) such as received data or downloaded streaming data.
  • the image processing function is a function in which the main control unit 220 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 204.
  • the main control unit 220 executes display control for the display panel 202 and operation detection control for detecting a user operation through the operation unit 207 and the operation panel 203.
  • the main control unit 220 displays icons for starting application software and software keys such as a scroll bar, and displays a window for creating an e-mail.
  • the scroll bar refers to a software key for accepting an instruction to move the display portion of a large image that does not fit in the display area of the display panel 202.
  • the main control unit 220 detects a user operation through the operation unit 207, accepts an operation on an icon or an input of a character string in the input field of a window through the operation panel 203, and accepts a request to scroll the displayed image through the scroll bar.
  • the main control unit 220 also has a touch panel control function that determines whether the operation position on the operation panel 203 is in the portion overlapping the display panel 202 (display area) or in the outer edge portion not overlapping the display panel 202 (non-display area), and that controls the sensitive area of the operation panel 203 and the display position of the software keys.
  • the main control unit 220 can also detect a gesture operation on the operation panel 203 and execute a preset function in accordance with the detected gesture operation.
  • a gesture operation is not a conventional simple touch operation, but means an operation of drawing a trajectory with a finger or the like, designating a plurality of positions simultaneously, or a combination of these, drawing a trajectory from at least one of a plurality of positions.
  • the camera unit 208 is a digital camera that performs electronic photography using an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) or a CCD (Charge-Coupled Device).
  • under the control of the main control unit 220, the camera unit 208 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) data, records it in the storage unit 212, and can output it through the external input / output unit 213 or the wireless communication unit 210.
  • in the smartphone 200, the camera unit 208 is mounted on the same surface as the display input unit 204, but the mounting position of the camera unit 208 is not limited to this; it may be mounted on the back surface of the display input unit 204, or a plurality of camera units 208 may be mounted. When a plurality of camera units 208 are mounted, the camera unit 208 used for shooting can be switched to shoot with a single unit, or a plurality of camera units 208 can be used simultaneously for shooting.
  • the camera unit 208 can be used for various functions of the smartphone 200.
  • an image acquired by the camera unit 208 can be displayed on the display panel 202, or the image of the camera unit 208 can be used as one of operation inputs of the operation panel 203.
  • when the GPS receiving unit 214 detects a position, the position can also be detected with reference to an image from the camera unit 208. Furthermore, by referring to an image from the camera unit 208, the optical axis direction of the camera unit 208 of the smartphone 200 can be determined, and the current usage environment can be determined, either without using the three-axis acceleration sensor or in combination with the three-axis acceleration sensor.
  • the image from the camera unit 208 can also be used in the application software.
  • the camera unit 208 shown in FIG. 22 is provided as a camera for capturing the user's own image when making a videophone call.
  • position information acquired by the GPS receiving unit 214, voice information acquired by the microphone 206 (which may be converted into text information by voice-to-text conversion by the main control unit or the like), posture information acquired by the motion sensor unit 215, and the like can be added to the image data of a still image or moving image and stored in the storage unit 212, or output through the external input / output unit 213 or the wireless communication unit 210.
  • the imaging device with a projector includes: an imaging unit that images a subject; a projector that projects onto the subject a first image, which is an image of a set pattern combining characters, figures, symbols, or any of these, and marker patterns of a preset shape; a tilt calculation unit that captures an image taken in a state in which a plurality of the marker patterns are respectively projected by the projector onto different planar areas of the subject, and calculates the tilt of each planar area onto which a marker pattern is projected from the degree of deformation of each marker pattern in the captured image; a plane selection unit that selects, from the calculated tilt values, the planar area in which the degree of deformation of the marker pattern is smallest; a first image deformation unit that deforms the first image to be projected onto the subject according to the tilt value of the selected planar area, with the enlargement / reduction direction of the deformation reversed; and an imaging control unit that causes the imaging unit to image the subject in a state in which the first image deformed by the first image deformation unit is projected onto the selected planar area.
  • the imaging device with a projector further includes a marker color setting unit that captures a live view image of the subject, determines the color of the selected planar area of the subject, and causes the projector to project the marker pattern in a color different from the determined color.
  • the imaging device with a projector further includes a color setting unit that captures a live view image of the subject, determines the color of the selected planar area of the subject, and causes the projector to project the first image in a color different from the determined color.
  • the imaging device with a projector further includes a marker size changing unit that changes the size of the marker pattern when the marker pattern projected from the projector onto the subject cannot be detected, and a first image size changing unit that deforms the first image projected from the projector according to the tilt of the planar area detected by the tilt calculation unit using the changed marker pattern.
  • the first image deformation unit of the imaging device with a projector changes the first image into a folded image that is folded partway when the first image before the change does not fit within the planar area.
  • when the first image does not fit within the first planar area, the first image deformation unit of the imaging device with a projector deforms the portion of the first image that extends beyond the first planar area according to the tilt of a second planar area adjacent to the first planar area, and generates an image for projection in which the first image is divided between the first planar area and the second planar area.
  • the imaging control method for an imaging device with a projector, the device including an imaging unit that images a subject and a projector that projects onto the subject a first image, which is an image of a set pattern combining characters, figures, symbols, or any of these, and marker patterns of a preset shape, includes: a step of capturing an image taken in a state in which a plurality of the marker patterns are respectively projected by the projector onto different planar areas of the subject; a step of calculating the tilt of each planar area onto which a marker pattern is projected from the degree of deformation of each marker pattern in the captured image; a step of selecting, from the calculated tilt values, the planar area in which the degree of deformation of the marker pattern is smallest; a step of deforming the first image to be projected onto the subject according to the tilt value of the selected planar area, with the enlargement / reduction direction of the deformation reversed; and a step of imaging the subject with the imaging unit in a state in which the deformed first image is projected by the projector onto the selected planar area of the subject.
  • since arbitrary character information is projected onto an arbitrary planar area of the subject and the subject is imaged together with the projected character information image, the usability and utility value of the projector function can be improved.
  • since the imaging device with a projector according to the present invention can project character information and the like onto a subject and image the subject, the character information and the like can be embedded in the captured image in a natural state, which is useful for the spread of imaging devices with a high-utility projector function.
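The control flow summarized above — project marker patterns onto several planar areas, measure how much each projected marker is deformed in the captured image, and select the least-deformed (most nearly frontal) plane for the character information — can be sketched as follows. This is an illustrative sketch only: the function names are hypothetical, not from the patent, and the deformation measure (relative difference of opposite side lengths of the detected marker quadrilateral) is one simple choice among many.

```python
import math

def edge_lengths(quad):
    """Side lengths of a detected marker quadrilateral.

    quad: four (x, y) corners in the order top-left, top-right,
    bottom-right, bottom-left (image coordinates).
    """
    tl, tr, br, bl = quad
    return (math.dist(tl, tr), math.dist(bl, br),
            math.dist(tl, bl), math.dist(tr, br))

def deformation(quad):
    """Keystone deformation measure: 0 when the projected square stays
    square in the image, growing as opposite sides differ more."""
    top, bottom, left, right = edge_lengths(quad)
    return (abs(top - bottom) / max(top, bottom)
            + abs(left - right) / max(left, right))

def select_flattest_region(detected_quads):
    """Index of the planar region whose marker is least deformed,
    i.e. the plane most nearly perpendicular to the optical axis."""
    return min(range(len(detected_quads)),
               key=lambda i: deformation(detected_quads[i]))

# Region 0 is frontal (the square stays square); region 1 is tilted,
# so its top edge appears shorter and the quad becomes a trapezoid.
frontal = [(0, 0), (100, 0), (100, 100), (0, 100)]
tilted = [(10, 0), (90, 0), (100, 100), (0, 100)]
print(select_flattest_region([frontal, tilted]))  # 0
```

The same per-quadrilateral measure would be evaluated once per projected marker, so the cost is negligible next to marker detection itself.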

Abstract

This imaging control method for an imaging device with a projector involves capturing a captured image formed by projecting each of a plurality of marker patterns on different planar regions of a subject with a projector, and calculating the slope of each planar region on which the marker patterns are projected from the degree of deformation of each marker pattern. The planar region having the marker pattern with the smallest degree of deformation according to the slope values is selected, and a first image, which is to be projected on the subject, is oppositely deformed according to the value of the slope of the selected planar region. The subject is photographed by an imaging unit in a state in which the deformed first image is projected on the selected planar region of the subject by the projector.

Description

Imaging device with projector and imaging control method therefor
 The present invention relates to an imaging device with a projector that enhances the utility value of the projector function, and to an imaging control method therefor.
 In recent years, imaging devices such as digital cameras equipped with a projector have begun to spread. When a digital camera has a projector, a captured image need not be checked on the small display provided on the back of the camera; instead, it can be enlarged and projected onto a screen, a white wall, or the like, either at the shooting location or after returning home, which is convenient for checking and viewing.
 However, a projector mounted on a digital camera is limited to a small one, because it can only be fitted into the narrow empty space inside the digital camera housing. For this reason, the image quality and fineness of the displayed image are inferior to those obtained when the captured image is displayed on a large home television receiver or projected by a dedicated large projector.
 If only low-quality images can be displayed in this way, digital camera users cannot find utility value in the projector, and will hesitate to purchase a digital camera with a projector, which costs more because of the built-in projector.
 Therefore, in addition to improving the image quality of the projected image, it is necessary to increase the utility value of the projector function so that camera users take an interest in digital cameras with projectors. For example, the digital cameras with a projector disclosed in Patent Documents 1 and 2 are configured to project an image toward the front of the camera, so that a catch light can be put into the subject's eyes or the color of the subject's face image can be changed.
 [Patent Document 1] Japanese Unexamined Patent Publication No. 2008-249956
 [Patent Document 2] Japanese Unexamined Patent Publication No. 2008-148089
 Projecting a catchlight image or a face decoration image toward the front of the camera makes it possible to capture a beautiful subject, which increases the utility value of the projector function. However, the techniques of Patent Documents 1 and 2 alone are not sufficient to satisfy users' willingness to purchase a digital camera with a projector, and it is necessary to further increase the utility value of the projector function and to improve its usability.
 An object of the present invention is to provide an imaging device with a projector, and an imaging control method therefor, that can enhance the utility value of the projector function and improve its usability.
 An imaging device with a projector according to the present invention includes: an imaging unit that images a subject; a projector that projects onto the subject a first image, which is an image of a set pattern combining characters, figures, symbols, or any of these, and marker patterns of a preset shape; a tilt calculation unit that captures an image taken in a state in which a plurality of the marker patterns are respectively projected by the projector onto different planar areas of the subject, and calculates the tilt of each planar area onto which a marker pattern is projected from the degree of deformation of each marker pattern in the captured image; a plane selection unit that selects, from the calculated tilt values, the planar area in which the degree of deformation of the marker pattern is smallest; a first image deformation unit that deforms the first image to be projected onto the subject according to the tilt value of the planar area selected by the plane selection unit, with the enlargement / reduction direction of the deformation reversed; and an imaging control unit that causes the imaging unit to image the subject in a state in which the first image deformed by the first image deformation unit is projected by the projector onto the selected planar area of the subject.
 An imaging control method according to the present invention is a method for an imaging device with a projector, the device including an imaging unit that images a subject and a projector that projects onto the subject a first image, which is an image of a set pattern combining characters, figures, symbols, or any of these, and marker patterns of a preset shape, the method including: a step of capturing an image taken in a state in which a plurality of the marker patterns are respectively projected by the projector onto different planar areas of the subject; a step of calculating the tilt of each planar area onto which a marker pattern is projected from the degree of deformation of each marker pattern in the captured image; a step of selecting, from the calculated tilt values, the planar area in which the degree of deformation of the marker pattern is smallest; a step of deforming the first image to be projected onto the subject according to the tilt value of the selected planar area, with the enlargement / reduction direction of the deformation reversed; and a step of imaging the subject with the imaging unit in a state in which the deformed first image is projected by the projector onto the selected planar area of the subject.
 According to the present invention, an arbitrary character information image can be embedded in a subject image in a natural state, so the utility value and usability of the projector function are improved.
FIG. 1 is an external perspective view of an imaging device according to a first embodiment of the present invention.
FIG. 2 is a functional block diagram of the imaging device shown in FIG. 1.
FIG. 3 is a flowchart showing the control procedure of the imaging device according to the first embodiment of the present invention.
FIG. 4 is a diagram showing a state in which a marker pattern according to the first embodiment of the present invention is projected onto a subject.
FIG. 5 is a diagram showing a state in which character information and the like according to the first embodiment of the present invention are projected onto a subject.
FIG. 6 is an explanatory diagram of a normal vector according to the first embodiment of the present invention.
FIG. 7 is an explanatory diagram of the deformation of character information and the like according to the first embodiment of the present invention.
FIG. 8 is a functional block diagram of an imaging device according to a second embodiment of the present invention.
FIG. 9 is a flowchart showing the control procedure of the imaging device according to the second embodiment of the present invention.
FIG. 10 is an explanatory diagram of multiple marker pattern projection according to the second embodiment of the present invention.
FIG. 11 is an explanatory diagram of projected character information and the like according to a modification of the second embodiment of the present invention.
FIG. 12 is an explanatory diagram of projected character information and the like according to a further modification of the second embodiment of the present invention.
FIG. 13 is a functional block diagram of an imaging device according to a third embodiment of the present invention.
FIG. 14 is a flowchart showing the control procedure of the imaging device according to the third embodiment of the present invention.
FIG. 15 is an explanatory diagram of a projected marker pattern according to the third embodiment of the present invention.
FIG. 16 is a functional block diagram of an imaging device according to a fourth embodiment of the present invention.
FIG. 17 is a flowchart showing the control procedure of the imaging device according to the fourth embodiment of the present invention.
FIG. 18 is an explanatory diagram of projected character information and the like according to the fourth embodiment of the present invention.
FIG. 19 is a functional block diagram of an imaging device according to a fifth embodiment of the present invention.
FIG. 20 is a flowchart showing the control procedure of the imaging device according to the fifth embodiment of the present invention.
FIG. 21 is an explanatory diagram of a projected marker pattern according to the fifth embodiment of the present invention.
FIG. 22 is an external perspective view of an imaging device according to still another embodiment of the present invention.
FIG. 23 is a functional block diagram of the imaging device shown in FIG. 22.
 Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
 FIG. 1 is an external perspective view of a first embodiment of an imaging device with a projector (digital camera) 10. The digital camera 10 includes a photographic lens 12 at the front of a rectangular camera housing 11. The photographic lens 12 is housed in a retractable lens barrel 13. A shutter release button 14 is provided on the left shoulder of the camera housing 11 as viewed from the front. A liquid crystal display unit (LCD 16 in FIG. 2) for displaying captured images, through images (live view images), camera menu images, and the like is provided on the rear surface of the camera housing 11. A flash light emitting unit 44 is provided at the front of the right shoulder.
 A front projection type projector (video projection unit) 17 is provided inside the upper part of the camera housing 11, and is configured to project the image displayed on a built-in small liquid crystal display unit toward the front of the camera through a projection window 18 at the front.
 Note that the configuration may be such that a live view image is displayed on the small liquid crystal display unit built into the projector 17 and the user views this live view image through a finder window provided on the rear side of the camera, so that the projector also serves as an electronic viewfinder. In the illustrated example, the projector 17 is built in, but it may instead be provided outside the camera housing, as described in, for example, Japanese Patent Application Laid-Open No. 2006-80875.
 FIG. 2 is a functional block diagram of the digital camera shown in FIG. 1. The digital camera 10 includes an image sensor (in this embodiment, a CMOS image sensor) 21 provided behind the photographic lens 12 and a control unit (CPU) 22 that performs overall control of the digital camera 10.
 The CPU 22 is connected to a bus 23. To the bus 23 are connected a frame memory 24; a signal processing unit 25 that performs image processing; a card interface (I/F) 27 that saves captured image data, compressed into JPEG format or the like, to an external memory card 26; a display control unit 28 that controls display on the LCD (liquid crystal display unit) 16 on the back of the camera; an OSD signal generation unit 29 that generates OSD signals such as character information to be displayed on the LCD 16; and a video projection control unit 30 that controls image projection by the projector 17.
 The bus 23 is further connected to a projected character deformation unit 31, a plane normal vector calculation unit 32, and a marker detection unit 33. These can be configured as individual electronic components, or can be realized as processing functions of the CPU 22. The same applies to the following embodiments.
 The projected character deformation unit 31 has a function of deforming the image (first image) of characters, figures, symbols, or a set pattern combining any of these (hereinafter referred to as character information) that is projected from the projector 17 of FIG. 1 through the projection window 18 onto a subject in front of the camera. The marker detection unit 33 detects the image of a marker pattern projected from the projector 17 through the projection window 18 onto the subject in front of the camera. The plane normal vector calculation unit (tilt calculation unit) 32 calculates the normal vector of the plane from the marker pattern detected by the marker detection unit 33, thereby calculating the degree of tilt of the plane.
 The CPU 22 is connected to a ROM 22a and a RAM 22b that store control programs and the like, and to an operation unit 40 including the shutter release button 14. The digital camera 10 is also provided with a lens drive unit 41 that controls the focus lens position and the like of the photographic lens 12, a timing generator (TG) 42 that generates drive timing signals for the image sensor 21, a driver 43 that drives the image sensor 21, and a flash control circuit 45 that controls light emission of the flash light emitting unit 44; these are controlled by instructions from the CPU 22. The CPU 22 functions as an imaging control unit.
 FIG. 3 is a flowchart showing the processing procedure of the control program executed when the user images a subject using the digital camera 10. This control program is stored in the ROM 22a, read into the RAM 22b at execution time, and executed by the CPU 22.
 In the digital camera 10 of the present embodiment, for example, as shown in FIG. 5, a character image "January 1, 2012" representing the shooting date is projected onto a planar portion of the subject, and an image of the subject with this character image projected onto it is captured through the photographic lens 12.
 First, in step S1, a marker pattern is projected onto the subject. An example in which the marker pattern is projected is shown in FIG. 4. In this example, the marker pattern is a square frame 51 and the character 52 (写, "photo") inside it. The marker pattern may consist of the square frame 51 alone; however, since there are many square and rectangular subjects in the world, the character 52 is also included in the marker pattern so that it can easily be distinguished from an ordinary square subject. Of course, the marker pattern may also be a colored pattern, such as red or green.
 In the next step S2, a live view image of the subject onto which the marker pattern is projected is captured, and in step S3, the region of the marker pattern in the subject image is detected. The front of the pedestal in the subject image shown in FIG. 4 is perpendicular to the camera optical axis (= the optical axis of the projector 17). For this reason, the square frame (marker pattern) 51 in the subject image onto which the square marker was projected remains square.
 これに対し、図6(a)に示す様に、カメラ光軸に対して斜めの平面54に正方形のマーカパターンが投射されると、被写体画像中のマーカパターン51は、手前側の寸法より奥側の寸法が延びた台形画像に変形してしまう。被写体の形状や反射率,乱反射率などに応じて、マーカパターンがライブビュー画像中に検出されない場合もある。 On the other hand, as shown in FIG. 6A, when a square marker pattern is projected onto a plane 54 that is oblique to the camera optical axis, the marker pattern 51 in the subject image is deeper than the near side dimension. It will deform | transform into the trapezoid image with which the dimension of the side extended. The marker pattern may not be detected in the live view image depending on the shape of the subject, the reflectance, the irregular reflectance, and the like.
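As an illustrative sketch (not part of the patent), the frontal/oblique distinction between FIG. 4 and FIG. 6(a) can be reduced to comparing the near and far edges of the detected square frame; the function name and tolerance below are hypothetical:

```python
def is_frontal(near_edge_px: float, far_edge_px: float, tol: float = 0.05) -> bool:
    """A square marker projected onto a plane facing the camera keeps its
    near and far edges equal in the captured image; on an oblique plane
    the far edge stretches into the long side of a trapezoid (keystone)."""
    return abs(far_edge_px - near_edge_px) / near_edge_px <= tol
```

For example, a 2% difference between the edges would still count as frontal, while a 30% stretch indicates an oblique plane.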
Therefore, in step S4 it is determined whether the marker pattern has been detected in the live view image. If the marker pattern 51 is not detected, the process returns to step S1; if it is detected, the process proceeds to step S5.
In step S5, the normal vector of the plane onto which the marker pattern has been projected is calculated. Then, in step S6, the character information to be projected is deformed according to this normal vector.
For example, if "January 1, 2012" with all characters the same size, as shown in FIG. 7(a), is projected onto the oblique plane 54 of FIG. 6, the projected characters come out smaller toward the near side and larger toward the far side, as shown in FIG. 7(b). Accordingly, as shown in FIG. 7(c), the character sizes are deformed in the opposite direction according to the inclination of the plane 54, so that characters projected nearer are rendered larger and characters projected farther are rendered smaller. In other words, the image projected onto the oblique plane 54 is deformed with the enlargement/reduction directions of the deformation of the marker pattern 51, as detected in the live view image, reversed: portions where the marker pattern is enlarged are shrunk, and portions where it is shrunk are enlarged, on a per-pixel (or few-pixel) basis. The deformed characters are then displayed on the small liquid crystal display inside the projector 17 and projected toward the plane 54 of the subject (step S7). As a result, as shown in FIG. 6(b), the shooting date information can be projected so that all the characters appear the same size in the subject image.
How much the character sizes should be deformed depends on how oblique the plane 54 is to the camera optical axis. That is, the characters to be projected can simply be affine-transformed according to the direction of the plane's normal vector.
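A minimal sketch of such a counter-deformation, assuming the plane is tilted only about the vertical axis by a known yaw angle (a simplification: the exact correction of a perspective keystone is projective rather than affine, so this is only a rough first-order model):

```python
import math

def counter_affine(yaw_deg: float):
    """2x3 affine matrix that pre-compresses the x axis by cos(yaw), so
    projection onto a plane rotated yaw_deg about the vertical axis
    restores roughly uniform character width (rough first-order model)."""
    c = math.cos(math.radians(yaw_deg))
    return [[c, 0.0, 0.0],
            [0.0, 1.0, 0.0]]

def apply_affine(m, x, y):
    """Map a point of the character image through the affine matrix."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])
```

At a 60-degree tilt, for example, the characters would be pre-compressed to half their width along x before projection.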
In step S7, the deformed characters are projected toward the subject plane; with that projection maintained, an image of the subject is captured in step S8, and the process ends.
Writing the shooting date into a photograph by post-processing has long been practiced. However, such conventional stamping is written at a position that appears to float above the subject image, regardless of the depth of the scene, so the character information looks unnatural and out of place.
In contrast, by photographing the subject together with the shooting-date image projected onto it by the projector, as in the present embodiment, the shooting date can be superimposed naturally.
In the embodiment above, the marker pattern is the square frame 51; however, since the shooting date is horizontally long, a horizontally long rectangular frame may be used instead. Alternatively, a vertically long rectangular frame may be used when a vertically written shooting date is to be projected. Although the embodiment has been described using the shooting date as an example, arbitrary character information may be projected; for example, "*Hakone trip*" could be projected in vertical writing. Character information spanning two or three rows, or two or three columns, can also be projected.
FIG. 8 is a functional block diagram of a digital camera according to a second embodiment of the present invention. It differs from the embodiment of FIG. 2 in that a plane selection unit 34, which selects the plane whose marker-pattern normal vector makes the smallest angle with the optical axis, is connected to the bus 23.
FIG. 9 is a flowchart showing the control procedure according to the second embodiment. It differs from the flowchart of FIG. 3 in the following three points. First, step S1a is executed instead of step S1, projecting a plurality of marker patterns onto the subject. Second, step S4a is executed instead of step S4, determining whether at least one marker pattern has been detected. Third, step S11 is inserted between steps S5 and S6.
In the present embodiment, a plurality of marker patterns are projected from the projector 17. For example, the shooting screen is divided 4×4 into 16 regions, and one marker pattern is projected onto each. The plane that yields the most legible projected image when characters are projected is then selected.
FIG. 10 is an explanatory diagram of the second embodiment. In this embodiment, a plurality of marker patterns are projected onto the subject in step S1a, and whether at least one marker has been detected is determined in step S4a. Then, in step S5, a normal vector is calculated for each detected marker pattern, and in the next step S11 the plane whose normal vector makes the smallest angle with the camera optical axis is selected. This is because the image projected onto that plane is deformed the least and is therefore the easiest to read.
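The selection rule of step S11 can be sketched as follows (a hypothetical helper, with the camera optical axis taken as +z in camera coordinates):

```python
import math

def select_plane(normals):
    """Return the index of the plane whose normal makes the smallest
    angle with the camera optical axis (0, 0, 1); text projected onto
    that plane is deformed the least (step S11)."""
    def angle(n):
        length = math.sqrt(sum(c * c for c in n))
        cos_t = abs(n[2]) / length  # |dot product| with (0, 0, 1)
        return math.acos(min(1.0, cos_t))
    return min(range(len(normals)), key=lambda i: angle(normals[i]))
```

Given the oblique plane 54 and a directly facing plane 55, such a rule would pick the directly facing one.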
The character information to be projected onto this plane is then affine-transformed (step S6), and the subject image is captured (step S8) while the affine-transformed character information is being projected (step S7).
According to this embodiment, when the oblique plane 54 and the directly facing plane 55 shown in FIG. 10 are both detected, the directly facing plane 55 is selected as the projection target, and easy-to-read character information is projected onto it.
When long character information is projected onto the plane 55, "January 1, 2012" may not fit within, for example, the horizontal width of the plane 55. On detecting such a case, the CPU 22 causes the projected-character deformation unit (first image deformation unit) 31 to fold the projected character information, changing it into a wrapped image split into the two lines "2012" and "January 1" as shown in FIG. 11, and projects it onto the plane 55. Alternatively, as shown in FIG. 12, split images of the projected character information, divided into "2012" and "January 1", may be generated, with "2012" projected onto the plane 55 and "January 1" onto the plane 54. In either case, the character information is of course affine-transformed according to the direction of each plane's normal vector.
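The folding of FIG. 11 amounts to wrapping the string to the plane's width. A simplified sketch assuming fixed-width characters (the helper and its parameters are illustrative, not taken from the patent):

```python
def wrap_text(text: str, plane_width_px: int, char_width_px: int):
    """Fold a projected string into as many lines as needed so each line
    fits within the target plane's width (cf. "2012" / "January 1")."""
    per_line = max(1, plane_width_px // char_width_px)
    return [text[i:i + per_line] for i in range(0, len(text), per_line)]
```

A string that already fits comes back as a single line; a too-long one is split into consecutive chunks.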
In the second embodiment, the screen is divided equally and a plurality of marker patterns are projected; however, character information projected onto an important part of the subject can be bothersome and intrusive. In such a case, the user may, for example, designate in advance regions where character information is not to be projected while viewing the live view image on the liquid crystal display 16 on the back of the camera. The marker patterns are then projected onto the remaining regions to search for a plane on which to write the character information.
Alternatively, before projecting the marker pattern onto the subject, a fine mesh image may be projected onto the subject with infrared light or the like and its captured image (live view image) taken in; from the degree of deformation of the mesh image, the CPU 22 determines in advance where in the screen there are planes large enough for projecting character information. The plane candidates are then displayed on the liquid crystal display 16 for the user to choose from, and when the user selects a plurality of candidates, the second embodiment may be executed.
FIG. 13 is a functional block diagram of a digital camera according to a third embodiment of the present invention. It differs from the embodiment of FIG. 2 in that a marker color setting unit 35, a complementary color calculation unit 36, and a projection-surface color determination unit 37 are connected to the bus 23. The projection-surface color determination unit 37 determines the color of the subject region (projection surface) onto which the projector 17 projects character information. The complementary color calculation unit 36 calculates the complement of the projection surface's color, and the marker color setting unit 35 sets the marker color to this complementary color.
FIG. 14 is a flowchart showing the control procedure of the third embodiment. It differs from the flowchart of FIG. 3 only in that steps S12 and S13 are added before step S1.
In this embodiment, a live view image is first captured in step S12, and the color of the plane 55 serving as the projection surface shown in FIG. 15 is identified. The complement of the color of the plane 55 is then obtained, and in step S13 the color of the marker pattern is set to this complementary color. The process then proceeds through steps S1 to S8, projecting a marker pattern 51 whose color is complementary to that of the plane 55. Because the marker pattern 51 is in a complementary-color relationship with the plane 55, the camera's marker detection accuracy improves. Needless to say, any color that differs from the color of the plane 55 will do; it need not be the exact complement.
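For an 8-bit RGB projection color, the complement computed in steps S12–S13 is simply the channel-wise inversion (a sketch; the patent does not specify a color space):

```python
def complementary_rgb(rgb):
    """Channel-wise complement of an 8-bit RGB color; projecting the
    marker in the complement of the plane's color maximizes contrast
    and thus marker detection accuracy."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)
```

A red plane, for instance, would yield a cyan marker.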
FIG. 16 is a functional block diagram of a digital camera according to a fourth embodiment of the present invention. It differs from the third embodiment of FIG. 13 in that a projected-character color setting unit 38 is connected to the bus 23.
FIG. 17 is a flowchart showing the control procedure of the fourth embodiment. It differs from the flowchart of FIG. 14 in that step S14 is inserted between steps S6 and S7 to set the projection color to the complement of the plane's color.
According to this embodiment, as shown in FIG. 18, the projected-character color setting unit (color setting unit) 38 projects the character string 56 of the shooting date in a color complementary to that of the plane 55 serving as the projection surface. Easy-to-read character information is thus projected onto the subject and captured. Needless to say, any color that differs from the color of the projection surface will do; it need not be the exact complement.
FIG. 19 is a functional block diagram of a digital camera according to a fifth embodiment of the present invention. It differs from the embodiment of FIG. 2 in that a projected-marker scaling unit 39 is connected to the bus 23.
FIG. 20 is a flowchart showing the control procedure of the fifth embodiment. It differs from the flowchart of FIG. 3 in that step S6a is executed instead of step S6, and in that, when no marker pattern is detected in step S4, step S15 is executed before the process returns to step S1.
In the embodiment described with reference to FIG. 11, the character information to be projected is folded when the projection plane is narrow, and in FIG. 12 it is split across two planes. In some cases, however, one wants to project all of the character information within a single planar region. The fifth embodiment achieves this.
In this embodiment, as in the first embodiment, steps S1 to S4 of FIG. 20 are executed, and whether a marker pattern has been detected is determined in step S4. If no marker pattern is detected, it can be concluded that the marker pattern was not detected because the projected marker pattern image was larger than any plane of sufficient size. In step S15, therefore, the projected-marker scaling unit (marker size changing unit) 39 reduces the marker pattern by a predetermined ratio, for example multiplying it by 0.9, and the process returns to step S1. In this way, as shown in FIG. 21, steps S1 to S4 are repeated, shrinking the marker pattern until it is detected.
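The shrink-and-retry loop of steps S1–S4 and S15 can be sketched as follows, with `detect` standing in for the projection-plus-live-view detection of steps S1–S4 (illustrative names; the 0.9 factor is the example ratio given above):

```python
def shrink_until_detected(detect, scale=1.0, factor=0.9, min_scale=0.1):
    """Project the marker at the current scale and shrink it by `factor`
    (step S15) until detection succeeds (step S4); returns the scale at
    which the marker finally fit on a plane, or None if it never did."""
    while scale >= min_scale:
        if detect(scale):
            return scale
        scale *= factor
    return None
```

The same final scale is then applied to the character information in step S6a, so the text is guaranteed to fit on the detected plane.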
After the marker pattern has been detected in step S4, the direction of the plane's normal vector is calculated in step S5. In the next step S6a, as in step S6 of the first embodiment, the projected-character deformation unit (first image size changing unit) 31 applies to the character information to be projected an affine transformation corresponding to the direction of the normal vector. In addition, when performing this affine transformation in step S6a, the character information is reduced by the same ratio by which the marker pattern was reduced.
This makes it possible to project all of the character information within a single plane.
According to the embodiments described above, the subject is captured while arbitrary character information is projected onto a planar region of the subject, so the character information is embedded in the captured image in a natural form. Because the projector function can be used in this way, both its utility and its usability are improved.
In each of the embodiments described above, a planar region of the subject is detected with the marker pattern. When photographing a natural object, however, a perfectly flat "plane" often does not exist. The user may set a threshold for how flat a surface must be to count as a "plane" for character projection, or the CPU 22 may automatically determine that a surface, while not perfectly flat, is a "plane" onto which character information can be projected.
For example, a straight line in the marker pattern becomes a zigzag line, rather than a straight line, when projected onto an uneven surface; likewise, the surface of a tree trunk approximates a cylinder, so a straight line projects as an arc. When such lines can be regarded as straight within a certain threshold, the surface can be treated as a plane suitable for projecting character information.
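One way to formalize "straight within a certain threshold" is the maximum perpendicular deviation of the sampled edge points from the chord through the endpoints (a hypothetical check, not the patent's own wording):

```python
def is_straight_enough(points, max_dev_px=2.0):
    """Treat a projected marker edge as flat enough when every sampled
    point lies within max_dev_px of the line through its endpoints;
    zigzag (uneven surface) or arc (cylindrical surface) edges fail."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0.0:
        return True
    return all(abs(dy * (x - x0) - dx * (y - y0)) / length <= max_dev_px
               for x, y in points)
```

A gently sloping but straight edge passes; a sharp zigzag or a pronounced arc exceeds the threshold and is rejected.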
If the CPU 22 has sufficient processing power, an uneven or curved surface can also be handled as a collection of many small planes: the normal vector of each small plane is obtained, and the character information projected onto each is deformed accordingly. This makes it possible, for example, when photographing a small bird perched on a branch, to capture a subject image in which character information indicating the shooting date and location is projected onto the branch.
In this case, if the CPU 22 can recognize that the elongated plane capable of receiving the character information extends obliquely, the character information can be projected along the oblique plane by drawing it obliquely on the small liquid crystal display inside the projector 17.
Although the first through fifth embodiments and their modifications have each been described as separate embodiments, any number of them may be combined.
A digital camera has been described above as an embodiment of the imaging device of the present invention, but the configuration of the imaging device is not limited to this. Other imaging devices of the present invention include, for example, a built-in or external PC camera, or a portable terminal device with an imaging function as described below.
Examples of the portable terminal device serving as an embodiment of the imaging device of the present invention include mobile phones, smartphones, PDAs (Personal Digital Assistants), and portable game machines. A smartphone is taken as an example below and described in detail with reference to the drawings.
FIG. 22 shows the appearance of a smartphone 200 that is an embodiment of the imaging device of the present invention. The smartphone 200 shown in FIG. 22 has a flat housing 201 and, on one face of the housing 201, a display input unit 204 in which a display panel 202 serving as a display unit and an operation panel 203 serving as an input unit are integrated. The housing 201 also includes a speaker 205, a microphone 206, an operation unit 207, and a camera unit 208. The configuration of the housing 201 is not limited to this; for example, a configuration in which the display unit and the input unit are separate, or one having a folding structure or a slide mechanism, may also be adopted. Although omitted from FIG. 22, a projector 17 (see FIG. 23) that projects an image such as character information onto the subject is provided, as in FIG. 1.
FIG. 23 is a block diagram showing the configuration of the smartphone 200 shown in FIG. 22. As shown in FIG. 23, the main components of the smartphone are a wireless communication unit 210, the display input unit 204, a call unit 211, the operation unit 207, the camera unit 208, a storage unit 212, an external input/output unit 213, a GPS (Global Positioning System) receiving unit 214, a motion sensor unit 215, a power supply unit 216, and a main control unit 220. As its main function, the smartphone 200 has a wireless communication function for performing mobile wireless communication via a base station device BS (not shown) and a mobile communication network NW (not shown).
The wireless communication unit 210 performs wireless communication with a base station device BS accommodated in the mobile communication network NW in accordance with instructions from the main control unit 220. Using this wireless communication, it transmits and receives various kinds of file data such as voice data and image data, e-mail data, and the like, and receives Web data, streaming data, and the like.
The display input unit 204 is a so-called touch panel that, under the control of the main control unit 220, displays images (still and moving) and character information to convey information visually to the user, and detects user operations on the displayed information; it includes the display panel 202 and the operation panel 203.
The display panel 202 uses an LCD (Liquid Crystal Display), OELD (Organic Electro-Luminescence Display), or the like as its display device. The operation panel 203 is a device placed so that an image displayed on the display surface of the display panel 202 is visible through it, and detects one or more coordinates operated by the user's finger or a stylus. When this device is operated with a finger or stylus, a detection signal generated by the operation is output to the main control unit 220; the main control unit 220 then detects the operation position (coordinates) on the display panel 202 based on the received detection signal. The projector 17 is connected to the main control unit 220.
As shown in FIG. 22, the display panel 202 and the operation panel 203 of the smartphone 200 exemplified as an embodiment of the imaging device of the present invention are integrated to form the display input unit 204, with the operation panel 203 arranged so as to completely cover the display panel 202. With this arrangement, the operation panel 203 may also have a function of detecting user operations in the region outside the display panel 202. In other words, the operation panel 203 may have a detection region for the portion overlapping the display panel 202 (hereinafter referred to as the display region) and a detection region for the outer edge portion that does not overlap the display panel 202 (hereinafter referred to as the non-display region).
The size of the display region may exactly match the size of the display panel 202, but the two need not match. The operation panel 203 may also have two sensitive regions, an outer edge portion and the inner portion. The width of the outer edge portion is designed as appropriate according to the size of the housing 201 and other factors. Position detection methods that may be adopted for the operation panel 203 include the matrix switch method, resistive film method, surface acoustic wave method, infrared method, electromagnetic induction method, and capacitive method, any of which may be used.
The call unit 211 includes the speaker 205 and the microphone 206; it converts the user's voice input through the microphone 206 into voice data that can be processed by the main control unit 220 and outputs it to the main control unit 220, and it decodes voice data received by the wireless communication unit 210 or the external input/output unit 213 and outputs it from the speaker 205. As shown in FIG. 22, for example, the speaker 205 can be mounted on the same face as the display input unit 204, and the microphone 206 on a side face of the housing 201.
The operation unit 207 is a hardware key using a key switch or the like, and accepts instructions from the user. For example, as shown in FIG. 22, the operation unit 207 is a push-button switch mounted on a side face of the housing 201 of the smartphone 200; it turns on when pressed with a finger or the like, and turns off through the restoring force of a spring or the like when the finger is released.
The storage unit 212 stores the control program and control data of the main control unit 220, application software, address data associating the names and telephone numbers of communication partners, sent and received e-mail data, Web data downloaded by Web browsing, and downloaded content data, and temporarily stores streaming data and the like. The storage unit 212 consists of an internal storage unit 217 built into the smartphone and an external storage unit 218 having a removable external memory slot. Each of the internal storage unit 217 and the external storage unit 218 constituting the storage unit 212 is realized using a storage medium such as flash memory, a hard disk, multimedia-card-micro-type or card-type memory (for example, MicroSD (registered trademark) memory), RAM (Random Access Memory), or ROM (Read Only Memory).
The external input/output unit 213 serves as an interface to all external devices connected to the smartphone 200, for connecting directly or indirectly to other external devices by communication (for example, Universal Serial Bus (USB) or IEEE 1394) or by network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), or ZigBee (registered trademark)).
External devices connected to the smartphone 200 include, for example, wired/wireless headsets, wired/wireless external chargers, wired/wireless data ports, memory cards and SIM (Subscriber Identity Module) / UIM (User Identity Module) cards connected via a card socket, external audio/video devices connected via audio/video I/O (Input/Output) terminals, wirelessly connected external audio/video devices, wired/wirelessly connected smartphones, wired/wirelessly connected personal computers, wired/wirelessly connected PDAs, and earphones. The external input/output unit 213 can transmit data received from such external devices to the internal components of the smartphone 200, and can transmit data inside the smartphone 200 to the external devices.
 The GPS receiving unit 214 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 220, executes positioning calculations based on the plurality of received GPS signals, and detects the position of the smartphone 200 as latitude, longitude, and altitude. When position information can be acquired from the wireless communication unit 210 or the external input/output unit 213 (for example, over a wireless LAN), the GPS receiving unit 214 can also detect the position using that position information.
 The motion sensor unit 215 includes, for example, a three-axis acceleration sensor and detects the physical movement of the smartphone 200 in accordance with instructions from the main control unit 220. By detecting the physical movement of the smartphone 200, the direction of movement and the acceleration of the smartphone 200 are detected. The detection result is output to the main control unit 220.
 The power supply unit 216 supplies power stored in a battery (not shown) to each unit of the smartphone 200 in accordance with instructions from the main control unit 220.
 The main control unit 220 includes a microprocessor, operates according to the control program and control data stored in the storage unit 212, and controls each unit of the smartphone 200 in an integrated manner. The main control unit 220 also has a mobile communication control function, which controls each unit of the communication system in order to perform voice and data communication through the wireless communication unit 210, and an application processing function.
 The application processing function is realized by the main control unit 220 operating according to application software stored in the storage unit 212. Examples of application processing functions include an infrared communication function that controls the external input/output unit 213 to perform data communication with a counterpart device, an e-mail function for sending and receiving e-mail, and a web browsing function for viewing web pages.
 The main control unit 220 also has an image processing function, such as displaying video on the display input unit 204 based on image data (still-image or moving-image data) such as received data or downloaded streaming data. The image processing function refers to the function by which the main control unit 220 decodes the image data, applies image processing to the decoded result, and displays the resulting image on the display input unit 204.
 Further, the main control unit 220 executes display control for the display panel 202 and operation detection control for detecting user operations through the operation unit 207 and the operation panel 203. By executing display control, the main control unit 220 displays icons for launching application software and software keys such as scroll bars, or displays a window for composing e-mail. A scroll bar is a software key for accepting an instruction to move the displayed portion of an image, such as a large image that does not fit within the display area of the display panel 202.
 By executing operation detection control, the main control unit 220 detects user operations through the operation unit 207, accepts operations on the above icons and input of character strings into the input fields of the above window through the operation panel 203, and accepts requests to scroll the displayed image via the scroll bar.
 Further, by executing operation detection control, the main control unit 220 determines whether the position of an operation on the operation panel 203 is in the overlapping portion (display area) that overlaps the display panel 202 or in the outer edge portion (non-display area) that does not overlap the display panel 202, and has a touch panel control function that controls the sensitive area of the operation panel 203 and the display positions of software keys.
 The main control unit 220 can also detect gesture operations on the operation panel 203 and execute preset functions according to the detected gesture operation. A gesture operation is not a conventional simple touch operation but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions simultaneously, or, as a combination of these, drawing a trajectory from at least one of a plurality of positions.
 The camera unit 208 is a digital camera that captures images electronically using an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) sensor. Under the control of the main control unit 220, the camera unit 208 can convert image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) data, record it in the storage unit 212, and output it through the external input/output unit 213 or the wireless communication unit 210. In the smartphone 200 shown in FIG. 22, the camera unit 208 is mounted on the same face as the display input unit 204, but the mounting position of the camera unit 208 is not limited to this; it may be mounted on the back face of the display input unit 204, or a plurality of camera units 208 may be mounted. When a plurality of camera units 208 are mounted, images can be captured with a single camera unit 208 by switching the camera unit 208 used for shooting, or with a plurality of camera units 208 used simultaneously.
 The camera unit 208 can also be used for various functions of the smartphone 200. For example, an image acquired by the camera unit 208 can be displayed on the display panel 202, and the image of the camera unit 208 can be used as one of the operation inputs of the operation panel 203. When the GPS receiving unit 214 detects a position, it can also detect the position by referring to an image from the camera unit 208. Furthermore, by referring to an image from the camera unit 208, the direction of the optical axis of the camera unit 208 of the smartphone 200 and the current usage environment can be determined, either without using the three-axis acceleration sensor or in combination with it. Of course, images from the camera unit 208 can also be used within application software.
 The camera unit 208 shown in FIG. 22 is a self-portrait camera used when making videophone calls; a camera unit 208 that uses the display panel 202, on which a live view image is displayed, in place of a viewfinder is also provided on the back side of the display panel 202.
 In addition, position information acquired by the GPS receiving unit 214, audio information acquired by the microphone 206 (which may have been converted to text information by speech-to-text conversion performed by the main control unit or the like), posture information acquired by the motion sensor unit 215, and so on can be added to still-image or moving-image data and stored in the storage unit 212, or output through the external input/output unit 213 or the wireless communication unit 210.
 As described above, the imaging device with a projector according to the present embodiment includes: an imaging unit that images a subject; a projector that projects onto the subject a first image, which is an image of characters, figures, symbols, or a collective pattern combining any of these, and marker patterns of a preset shape; a tilt calculation unit that captures an image taken while a plurality of the marker patterns are projected by the projector onto different planar regions of the subject and calculates, from the degree of deformation of each marker pattern in the captured image, the tilt of each planar region onto which a marker pattern is projected; a plane selection unit that selects, from the calculated tilt values, the planar region in which the marker pattern is least deformed; a first image deformation unit that deforms the first image to be projected onto the subject according to the tilt value of the planar region selected by the plane selection unit, with the scaling direction of the deformation reversed; and an imaging control unit that causes the imaging unit to image the subject while the first image deformed by the first image deformation unit is projected by the projector onto the selected planar region of the subject.
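The marker-based plane selection summarized above can be sketched in code. The deformation measure below (spread of the detected marker quadrilateral's side lengths and corner angles relative to the projected square) and all function names are illustrative assumptions, not the embodiment's actual implementation:

```python
import math

def deformation_score(quad):
    """Deformation of a detected marker quadrilateral relative to an
    ideal square: spread of side lengths plus deviation of corner
    angles from 90 degrees. quad: four (x, y) corners in order."""
    sides = []
    angles = []
    for i in range(4):
        p0, p1, p2 = quad[i - 1], quad[i], quad[(i + 1) % 4]
        v1 = (p0[0] - p1[0], p0[1] - p1[1])
        v2 = (p2[0] - p1[0], p2[1] - p1[1])
        sides.append(math.hypot(*v2))
        cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (
            math.hypot(*v1) * math.hypot(*v2))
        angles.append(math.degrees(math.acos(max(-1.0, min(1.0, cosang)))))
    mean_side = sum(sides) / 4
    side_spread = sum(abs(s - mean_side) for s in sides) / mean_side
    angle_dev = sum(abs(a - 90.0) for a in angles) / 90.0
    return side_spread + angle_dev

def select_flattest_plane(detected_quads):
    """Return the index of the planar region whose marker is least
    deformed, i.e. the plane most nearly facing the projector."""
    scores = [deformation_score(q) for q in detected_quads]
    return scores.index(min(scores))
```

A square marker seen head-on scores near zero; a keystoned marker on a tilted plane scores higher, so the plane with the minimum score is selected.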
 The imaging device with a projector according to the embodiment further includes a marker color setting unit that captures a live view image of the subject, determines the color of the selected planar region of the subject, and causes the projector to project the marker pattern in a color different from the determined color.
 The imaging device with a projector according to the embodiment further includes a color setting unit that captures a live view image of the subject, determines the color of the selected planar region of the subject, and causes the projector to project the first image in a color different from the determined color.
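The color setting in this and the preceding paragraph relies on a complementary color calculation unit (reference numeral 36). A minimal sketch of such a calculation, assuming a plain RGB complement and an average over the region's sampled pixels (neither of which the text specifies), might look like:

```python
def average_color(pixels):
    """Average RGB color of pixels sampled from the selected planar
    region of the live view image. pixels: list of (r, g, b)."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

def complementary_color(rgb):
    """Complement in RGB space: a color that contrasts with the
    region, keeping projected markers or characters detectable."""
    return tuple(255 - c for c in rgb)
```

For example, a near-white wall yields a near-black projection color, which remains visible both to the viewer and to the marker detection step.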
 The imaging device with a projector according to the embodiment further includes: a marker size changing unit that changes the size of the marker pattern when the marker pattern projected from the projector onto the subject is of a size that cannot be detected; and a first image size changing unit that deforms the first image projected from the projector according to the changed size of the marker pattern and the tilt of the planar region detected by the tilt calculation unit.
 In the imaging device with a projector according to the embodiment, the first image deformation unit changes the first image into a folded image, in which the first image is wrapped partway through, when the entire first image before the change does not fit in the planar region.
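The folded image of this paragraph, wrapping character information that is too wide for the selected plane, can be illustrated with a simple line-wrapping sketch; the per-character width budget is a hypothetical parameter, not something the embodiment defines:

```python
import textwrap

def fold_caption(text, plane_width_px, char_width_px):
    """Fold a caption into multiple lines when the selected planar
    region cannot hold it on one line, wrapping at word boundaries."""
    chars_per_line = max(1, plane_width_px // char_width_px)
    return textwrap.wrap(text, width=chars_per_line)
```

A caption that fits stays on one line; a longer one is broken into as many lines as the plane's width allows.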
 In the imaging device with a projector according to the embodiment, when the entire first image does not fit in a first planar region, the first image deformation unit deforms the portion of the first image that extends beyond the first planar region according to the tilt of a second planar region adjacent to the first planar region, and generates an image for projecting the first image divided between the first planar region and the second planar region.
 The imaging control method according to the embodiment is a method for an imaging device with a projector that includes an imaging unit that images a subject and a projector that projects onto the subject a first image, which is an image of characters, figures, symbols, or a collective pattern combining any of these, and marker patterns of a preset shape. The method includes the steps of: capturing an image taken with a plurality of the marker patterns projected by the projector onto different planar regions of the subject; calculating, from the degree of deformation of each marker pattern in the captured image, the tilt of each planar region onto which a marker pattern is projected; selecting, from the calculated tilt values, the planar region in which the marker pattern is least deformed; deforming the first image to be projected onto the subject according to the tilt value of the selected planar region, with the scaling direction of the deformation reversed; and imaging the subject with the imaging unit while the deformed first image is projected by the projector onto the selected planar region of the subject.
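The step of deforming with the scaling direction reversed is, in effect, a keystone pre-correction: estimate the mapping the tilted plane imposes on projected content, then apply its inverse to the first image before projection. The sketch below uses a simplifying assumption of our own, a 2×2 linear approximation of that mapping; a full implementation would use a 3×3 projective homography:

```python
def invert_2x2(m):
    """Invert a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    if det == 0:
        raise ValueError("degenerate plane mapping")
    return [[d / det, -b / det], [-c / det, a / det]]

def prewarp_points(points, plane_map):
    """Apply the inverse of the plane-induced mapping to the first
    image's points, so that after projection onto the tilted plane
    they land undistorted."""
    inv = invert_2x2(plane_map)
    return [(inv[0][0] * x + inv[0][1] * y,
             inv[1][0] * x + inv[1][1] * y) for x, y in points]
```

For example, if the tilted plane stretches the projected content twofold along x, the pre-warp compresses the first image by half along x, so the two deformations cancel in the captured image.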
 According to the embodiments described above, arbitrary character information is projected onto an arbitrary planar region of the subject, and the subject image is captured together with the projected character information image, which improves the usability and value of the projector function.
 Because the imaging device with a projector according to the present invention can image a subject while projecting character information or the like onto it, the character information or the like can be embedded in the captured image in a natural way, which is useful for popularizing imaging devices that make good use of the projector function.
 This application is based on Japanese Patent Application No. 2012-60380, filed March 16, 2012, the contents of which are incorporated herein by reference.
DESCRIPTION OF REFERENCE NUMERALS
10, 200 Imaging device with projector
12 Imaging lens
17 Front projection type projector
21 Image sensor
22, 220 Control unit (CPU)
31 Projected character deformation unit (character information deformation unit)
32 Plane normal vector calculation unit
33 Marker detection unit
34 Plane selection unit
35 Marker color setting unit
36 Complementary color calculation unit
37 Projection surface color determination unit
38 Projected character color setting unit
39 Projection marker scaling unit
51 Marker pattern

Claims (7)

  1.  An imaging device with a projector, comprising:
     an imaging unit that images a subject;
     a projector that projects onto the subject a first image, which is an image of characters, figures, symbols, or a collective pattern combining any of these, and marker patterns of a preset shape;
     a tilt calculation unit that captures an image taken while a plurality of the marker patterns are projected by the projector onto different planar regions of the subject and calculates, from the degree of deformation of each marker pattern in the captured image, the tilt of each planar region onto which a marker pattern is projected;
     a plane selection unit that selects, from the calculated tilt values, the planar region in which the marker pattern is least deformed;
     a first image deformation unit that deforms the first image to be projected onto the subject according to the tilt value of the planar region selected by the plane selection unit, with the scaling direction of the deformation reversed; and
     an imaging control unit that causes the imaging unit to image the subject while the first image deformed by the first image deformation unit is projected by the projector onto the selected planar region of the subject.
  2.  The imaging device with a projector according to claim 1, further comprising a marker color setting unit that captures a live view image of the subject, determines the color of the selected planar region of the subject, and causes the projector to project the marker pattern in a color different from the determined color.
  3.  The imaging device with a projector according to claim 1 or 2, further comprising a color setting unit that captures a live view image of the subject, determines the color of the selected planar region of the subject, and causes the projector to project the first image in a color different from the determined color.
  4.  The imaging device with a projector according to any one of claims 1 to 3, further comprising:
     a marker size changing unit that changes the size of the marker pattern when the marker pattern projected from the projector onto the subject is of a size that cannot be detected; and
     a first image size changing unit that deforms the first image projected from the projector according to the changed size of the marker pattern and the tilt of the planar region detected by the tilt calculation unit.
  5.  The imaging device with a projector according to any one of claims 1 to 4, wherein the first image deformation unit changes the first image into a folded image, in which the first image is wrapped partway through, when the entire first image before the change does not fit in the planar region.
  6.  The imaging device with a projector according to any one of claims 1 to 5, wherein, when the entire first image does not fit in a first planar region, the first image deformation unit deforms the portion of the first image that extends beyond the first planar region according to the tilt of a second planar region adjacent to the first planar region, and generates an image for projecting the first image divided between the first planar region and the second planar region.
  7.  An imaging control method for an imaging device with a projector that includes an imaging unit that images a subject and a projector that projects onto the subject a first image, which is an image of characters, figures, symbols, or a collective pattern combining any of these, and marker patterns of a preset shape, the method comprising the steps of:
     capturing an image taken with a plurality of the marker patterns projected by the projector onto different planar regions of the subject;
     calculating, from the degree of deformation of each marker pattern in the captured image, the tilt of each planar region onto which a marker pattern is projected;
     selecting, from the calculated tilt values, the planar region in which the marker pattern is least deformed;
     deforming the first image to be projected onto the subject according to the tilt value of the selected planar region, with the scaling direction of the deformation reversed; and
     imaging the subject with the imaging unit while the deformed first image is projected by the projector onto the selected planar region of the subject.
PCT/JP2012/080614 2012-03-16 2012-11-27 Imaging device with projector and imaging control method therefor WO2013136602A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-060380 2012-03-16
JP2012060380 2012-03-16

Publications (1)

Publication Number Publication Date
WO2013136602A1 true WO2013136602A1 (en) 2013-09-19

Family

ID=49160555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/080614 WO2013136602A1 (en) 2012-03-16 2012-11-27 Imaging device with projector and imaging control method therefor

Country Status (1)

Country Link
WO (1) WO2013136602A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005031912A (en) * 2003-07-10 2005-02-03 Ricoh Co Ltd Image input device and image processing program
JP2007142495A (en) * 2005-11-14 2007-06-07 Nippon Telegr & Teleph Corp <Ntt> Planar projector and planar projection program


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015169832A (en) * 2014-03-07 2015-09-28 キヤノン株式会社 Video projection device, video projection method, and program
JP2015220661A (en) * 2014-05-20 2015-12-07 株式会社リコー Projection system, information processor, information processing method, and program
WO2016063392A1 (en) * 2014-10-23 2016-04-28 富士通株式会社 Projection apparatus and image processing program
JPWO2016063392A1 (en) * 2014-10-23 2017-09-21 富士通株式会社 Projection apparatus and image processing program
CN113630588A (en) * 2020-05-08 2021-11-09 精工爱普生株式会社 Control method of image projection system and image projection system
CN113630588B (en) * 2020-05-08 2023-04-21 精工爱普生株式会社 Control method of image projection system and image projection system

Similar Documents

Publication Publication Date Title
JP5719967B2 (en) Imaging device with projector and control method thereof
JP6205072B2 (en) Imaging control apparatus, imaging control method, camera, camera system, and program
JP5567235B2 (en) Image processing apparatus, photographing apparatus, program, and image processing method
JP6205071B2 (en) Imaging control apparatus, imaging control method, camera system, and program
JP6205067B2 (en) Pan / tilt operating device, camera system, pan / tilt operating program, and pan / tilt operating method
KR102114377B1 (en) Method for previewing images captured by electronic device and the electronic device therefor
US20090227283A1 (en) Electronic device
JP5819564B2 (en) Image determination apparatus, imaging apparatus, three-dimensional measurement apparatus, image determination method, and program
JP5719480B2 (en) Image deformation apparatus and operation control method thereof
US10924789B2 (en) Display control apparatus, control method for display control apparatus, and non-transitory computer readable medium
JP2014071377A (en) Image display control device, image display device, program and image display method
JP5564633B2 (en) Stereoscopic image display control apparatus, imaging apparatus including the same, and stereoscopic image display control method
WO2013136602A1 (en) Imaging device with projector and imaging control method therefor
CN110881097B (en) Display control apparatus, control method, and computer-readable medium
US10079973B2 (en) Imaging device operation device, operation method, and program
JP6374535B2 (en) Operating device, tracking system, operating method, and program
JP6840903B2 (en) Imaging device, imaging method, and program
CN110881102B (en) Image capturing apparatus, control method of image capturing apparatus, and computer readable medium
JP2007208596A (en) Data reproducing apparatus, and data reproducing method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12871550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12871550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP