WO2005076210A1 - Image Processing Method, Image Processing Device, and Mobile Communication Terminal Device - Google Patents
- Publication number: WO2005076210A1 (PCT/JP2005/001753)
- Authority: WIPO (PCT)
- Prior art keywords: image, information, mask, interest, original
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3871—Composing, repositioning or otherwise geometrically modifying originals the composed originals being of different kinds, e.g. low- and high-resolution originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3872—Repositioning or masking
Definitions
- The present invention relates to an image processing method for creating a composite image in which a mask image is superimposed on an original image, to an image processing apparatus implementing that method, and to a mobile communication terminal apparatus including the image processing apparatus.
- Such image processing techniques include, for example, face recognition, which detects that a human face image is present in an original image acquired by a digital camera or the like. There is also face recognition technology that goes further, extracting facial shape features to identify an individual (see Non-Patent Document 1 and Patent Documents 1 and 2).
- Non-Patent Document 1: Eyematic Japan Co., Ltd., "Technical Information", [retrieved December 29, 2003], Internet <URL: http://www.eyematic.co.jp/tech.html>
- Patent Document 1: Japanese Patent Application Publication No. 2003-271958
- Patent Document 2: Japanese Patent Application Publication No. 2003-296735
- Patent Document 3: Japanese Patent Application Publication No. 2003-309829
- The conventional image processing techniques described above, such as image recognition, image authentication, and image tracking, are well suited to application to moving images, and various applications can be considered.
- However, processing the original image according to the user's own taste, so as to express the user's inner feelings and the like, is not within the technical scope of such image processing technology.
- For example, consider an effect that displays a heart mark of roughly eye size at the eye position of a face image in the original image. If the heart mark is superimposed at a fixed position on the display screen, it cannot follow the eye position of the face image in an ever-changing moving image. Likewise, if the display size of the heart mark is not set according to the size of the face image in the moving image, and only its display position is handled, the displayed image looks strange.
- The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an image processing method and an image processing apparatus that can simply and appropriately create an image in which the user's desired effect is applied to the original image, regardless of whether the original image is a still image or a moving image.
- Another object of the present invention is to provide a mobile communication terminal device capable of transmitting an image in which the user's desired effect is appropriately applied to the original image, regardless of whether the original image is a still image or a moving image.
- The image processing method of the present invention is an image processing method for processing an image, in which a target image area, that is, the area of the image to be focused on, is specified in an original image.
- In a mask image information file, mask image information is registered that includes: at least one piece of relationship information indicating a relationship of position and size with the target image area; element image selection information indicating that information of a specific element image has been selected in advance corresponding to each piece of relationship information; and at least one piece of element image information selected by the element image selection information.
- Here, the "target image area" is, for example, the area of a person's face in an original image containing that person, and is an area that can be identified by image features in the kinds of images assumed as original images.
- In the target image information extraction step, the target image area is identified in the original image, and target image information is extracted that includes information related to the position and size of the target image area in the original image.
- In the mask image data creation step, a mask image to be displayed superimposed on the original image is created based on the target image information and the mask image information registered in the mask image information file.
- In this mask image, an element image sized according to the size of the target image area in the original image is created at a position determined by the information on the position of the target image area in the original image and by the positional relationship information included in the mask image information.
- Then, in the composite image creation step, a composite image is created in which the mask image is superimposed, at the same position and the same size as in the original image, on a target-image-containing image that includes the target image in the target image area (for example, a face image in a face area).
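The superimposition described in the composite image creation step amounts to standard alpha compositing: wherever the mask image is transparent the original pixel shows through, and wherever an element image has been drawn the mask pixel blends over it. A minimal sketch in Python, assuming images are represented as nested lists of RGBA tuples with alpha in [0, 1] (the patent does not specify a pixel format):

```python
def composite(original, mask):
    """Superimpose a mask image onto an original image of the same size.

    Both images are lists of rows of (r, g, b, a) tuples with a in [0, 1].
    The mask is alpha-blended over the original (the "over" operator).
    """
    out = []
    for orig_row, mask_row in zip(original, mask):
        row = []
        for (r0, g0, b0, a0), (r1, g1, b1, a1) in zip(orig_row, mask_row):
            row.append((
                r1 * a1 + r0 * (1 - a1),   # mask pixel weighted by its alpha
                g1 * a1 + g0 * (1 - a1),
                b1 * a1 + b0 * (1 - a1),
                a1 + a0 * (1 - a1),        # combined coverage
            ))
        out.append(row)
    return out
```

A fully transparent mask pixel leaves the original unchanged; a fully opaque element pixel replaces it, which matches the heart-mark example above.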
- Therefore, an image in which an effect according to the user's taste is applied to the target-image-containing image can be created simply and appropriately.
- the image-of-interest containing image can be used as the original image.
- In this case, an effect corresponding to the user's taste can be applied to the original image by selecting the mask image.
- In the image processing method of the present invention, the mask image information file may further include background image selection information indicating that information of a specific background image has been selected.
- In the composite image creation step, a specific display image including the target image is extracted from the original image and superimposed on the specific background image at the same position and the same size as in the original image; the resulting image is used as the target-image-containing image, and the composite image is then created.
- In this case, in the composite image creation step, first, a specific display image including the target image, for example a human figure when the target image is a face image, is cut out from the original image. The specific display image is then superimposed on the background image designated by the background image selection information in the mask image information file to create a background-added image. Finally, the mask image described above is superimposed on this background-added image, which serves as the target-image-containing image. An effect according to the user's taste can thus be applied to a specific display image such as a human figure by selecting the mask image and the background image.
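The layering order just described — background at the bottom, cut-out display image in the middle, mask image on top — can be sketched as two successive "over" operations on each pixel. A hypothetical per-pixel sketch (the helper names are illustrative, not from the patent):

```python
def over(top_px, bottom_px):
    """Alpha-'over' one RGBA pixel (components in [0, 1]) onto another."""
    rt, gt, bt, at = top_px
    rb, gb, bb, ab = bottom_px
    a = at + ab * (1 - at)
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda t, b: (t * at + b * ab * (1 - at)) / a
    return (blend(rt, rb), blend(gt, gb), blend(bt, bb), a)

def layer_pixels(background_px, cutout_px, mask_px):
    """Background-added image = cutout over background; then mask over that."""
    background_added = over(cutout_px, background_px)
    return over(mask_px, background_added)
```

Where the cut-out is transparent the selected background shows; where the mask has an element image, it covers both lower layers.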
- The information related to the position of the target image area can include position information of a representative point of the target image area in the original image.
- In this case, the information in the relationship information indicating the positional relationship with the target image area can be expressed as a relative position with respect to the representative point of the target image area, which simplifies that information.
- When the relative position information is stored as values scaled by the size of a reference target image area, it can be converted to values matching the current frame by the ratio between the size of the target image area in the original image and the size of the reference. Alternatively, the relative position information may be stored as values normalized by the size of the target image area, and then converted to values according to the size of the target image area in the original image.
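Storing the relative position normalized by the target image area's size makes the stored values independent of how large the face happens to appear in any particular frame. A sketch under that assumption (the function name and tuple format are illustrative):

```python
def to_absolute(representative_point, normalized_offset, size):
    """Convert a size-normalized relative position to original-image coordinates.

    representative_point: (x, y) of the target image area's representative point.
    normalized_offset: relative position stored in units of the area's size.
    size: characteristic length of the target image area in the current frame.
    """
    ox, oy = representative_point
    nx, ny = normalized_offset
    # Denormalize: the same stored offset lands farther from the
    # representative point when the face appears larger.
    return (ox + nx * size, oy + ny * size)
```

The same stored offset (0.25, -0.25) thus tracks the face as its apparent size changes between frames.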
- The information related to the position of the target image area may further include tilt information of the target image area in the original image.
- By using the tilt information of the target image area in the original image, it is possible to create a composite image in which a desired element image is superimposed at a position that follows the tilt of the target image area in the original image. The element image can also be displayed with a tilt matching that of the target image area.
- The tilt information may be a single angle indicating the two-dimensional tilt of the target image area, or two or three angles indicating its three-dimensional tilt.
- In the relationship information, coordinate position information in a coordinate system whose origin is the representative point of the target image area and whose axis directions are determined by the tilt of the target image area can be included as the display position information of the specific element image.
- In this case, the information indicating the positional relationship with the target image area is a relative position with respect to the representative point, expressed in a coordinate system aligned with the tilt, and is therefore simple and easy to handle.
- The relationship information may include ratio information between the size of the target image area and the display size of the specified element image. In this case, even if the size of the target image area changes from moment to moment in the original image, a mask image containing element images sized to match the target image area can be created.
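Combining the tilt-aligned coordinate system with the size-ratio information, an element image's display position and size follow the target image area as it moves, rotates, and scales. A sketch, assuming a single counterclockwise tilt angle in degrees and one characteristic length for size (consistent with the two-dimensional case described for FIG. 4A later in the document; the function name is illustrative):

```python
import math

def place_element(rep_point, tilt_deg, area_size, rel_pos, size_ratio):
    """Map an element's position from the target image coordinate system
    into the original image coordinate system, and compute its display size.

    rel_pos is stored in units of the target image area's size, in a
    coordinate system whose origin is the representative point and whose
    axes are rotated by the area's tilt.
    """
    theta = math.radians(tilt_deg)
    # Scale the stored relative position by the area's current size.
    ex, ey = rel_pos[0] * area_size, rel_pos[1] * area_size
    # Rotate into the original image coordinate system (counterclockwise).
    x = rep_point[0] + ex * math.cos(theta) - ey * math.sin(theta)
    y = rep_point[1] + ex * math.sin(theta) + ey * math.cos(theta)
    # The element is drawn at a fixed ratio of the area's size,
    # e.g. a heart mark at 0.2x the face width.
    display_size = area_size * size_ratio
    return (x, y), display_size
```

With a tilted face, the same stored relative position rotates around the representative point, so the element stays anchored to the same facial location.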
- The mask image information can further include display execution time information for each element image display determined by a combination of relationship information and its corresponding element image selection information.
- In this case, each desired element image is displayed only for its set display execution time, so a dynamically changing mask image can be created.
- The mask image information can also include time-sequential display designation information specifying that a plurality of element image displays, each determined by a combination of relationship information and corresponding element image selection information, be displayed in time sequence.
- By creating a mask image in accordance with this designation, a plurality of element images can be displayed on the original image repeatedly in time sequence, each for its desired duration.
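The time-sequential designation can be implemented by treating the execution times as a repeating schedule and selecting the element image whose time slot contains the current elapsed time. A sketch (the schedule format is an assumption, not defined by the patent):

```python
def current_element(schedule, t):
    """Pick the element image active at elapsed time t from a repeating schedule.

    schedule: list of (element_name, duration_seconds) pairs, cycled forever.
    """
    total = sum(d for _, d in schedule)
    t = t % total  # the sequence repeats in time
    for name, duration in schedule:
        if t < duration:
            return name
        t -= duration
    return schedule[-1][0]  # guard for floating-point edge cases
```

For a schedule of a heart for 1.0 s followed by a star for 0.5 s, the display cycles heart, star, heart, star, ... with period 1.5 s.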
- The mask image information may further include simultaneous display designation information specifying that a plurality of element image displays, each determined by a combination of relationship information and corresponding element image selection information, be displayed simultaneously. In this case, a mask image in which each of a plurality of desired element images is displayed at a desired position can be created, enabling mask images with a wide variety of renditions.
- the original image can be a moving image.
- The method may further include a predetermined phenomenon occurrence determination step of determining, based on the target image information, whether a predetermined phenomenon has occurred in the target image area.
- When occurrence of the predetermined phenomenon is determined, the mask image data creation step and the composite image creation step can be started.
- In this case, a mask image rendered to match the phenomenon can be created, and a composite image in which that mask image is superimposed on the original image can be produced. For example, when the target image is a human face, closing of the eyelids in the target image can trigger creation of a composite image in which tears flow from the eye positions.
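The phenomenon-triggered path can be sketched as an edge detector on a per-frame predicate: mask and composite creation start on the frame where the predicate first becomes true. A hypothetical sketch, where the boolean stream stands in for whatever image analysis detects the phenomenon (e.g. eyelids closing):

```python
def trigger_frames(eyes_closed_per_frame):
    """Return the frame indices at which the predetermined phenomenon starts,
    i.e. where mask image creation would begin (e.g. tears effect).
    """
    triggers = []
    previous = False
    for i, closed in enumerate(eyes_closed_per_frame):
        if closed and not previous:  # rising edge: phenomenon just occurred
            triggers.append(i)
        previous = closed
    return triggers
```

Triggering only on the rising edge avoids restarting the effect on every frame while the eyes stay closed.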
- The image processing apparatus of the present invention is an image processing apparatus that processes an image, comprising: target image information extraction means for identifying, in an original image, a target image area that is the area of the image to be focused on, and for extracting target image information including information related to the position and size of the target image area in the original image; storage means for storing a mask image information file in which mask image information is registered, the mask image information including at least one piece of relationship information indicating a relationship of position and size with the target image area, element image selection information indicating that information of a specific element image has been selected corresponding to each piece of relationship information, and at least one piece of element image information selected by the element image selection information; mask image data creation means for creating, based on the target image information and the mask image information, data of a mask image to be displayed superimposed on the original image; and composite image creation means for creating a composite image in which the mask image is superimposed, at the same position and the same size as in the original image, on a target-image-containing image that includes the target image in the target image area.
- In this image processing apparatus, the target image information extraction means identifies the target image area in the original image, and extracts target image information including information related to the position and size of the target image area in the original image.
- The mask image data creation means creates a mask image to be displayed superimposed on the original image, based on the target image information and the mask image information registered in the mask image information file.
- The composite image creation means then creates a composite image in which the mask image is superimposed, at the same position and the same size as in the original image, on the target-image-containing image that includes the target image in the target image area.
- That is, a composite image is created using the image processing method of the present invention. Therefore, according to the image processing apparatus of the present invention, an image in which an effect according to the user's taste is applied to the target-image-containing image can be created simply and appropriately, regardless of whether the original image is a still image or a moving image.
- the target image-containing image can be used as the original image.
- an effect according to the user's taste can be performed on the original image by selecting the mask image.
- In the image processing apparatus of the present invention, the mask image information file may further include background image selection information indicating that information of a specific background image has been selected.
- In this case, the composite image creation means extracts a specific display image including the target image from the original image, creates a background-added image in which the specific display image is superimposed on the specific background image at the same position and the same size as in the original image, and uses that image as the target-image-containing image to create the composite image.
- That is, the composite image creation means first cuts out the specific display image including the target image from the original image. It then superimposes the specific display image on the background image specified by the background image selection information in the mask image information file to create a background-added image.
- Finally, the composite image creation means superimposes the mask image described above on the background-added image. An effect according to the user's taste can thus be applied to a specific display image such as a human figure by selecting the mask image and the background image.
- the image processing apparatus of the present invention can be configured to further include display means for displaying the composite image.
- In this case, the creator of the composite image can confirm the composite image created with an effect based on his or her own taste.
- the image processing apparatus may further comprise transmission means for transmitting the composite image data.
- In this case, a composite image created on the transmitting side with effects based on its creator's taste can be transmitted to the receiving side, conveying to the receiver an image rendered according to the sender's taste.
- the image processing apparatus of the present invention may further include an imaging unit for acquiring the original image.
- In this case, an image acquired by the imaging means can be used as the original image, and an image in which an effect according to the creator's taste is applied to that original image can be created simply and appropriately.
- the image acquired by the imaging means may be a still image or a moving image.
- the mobile communication terminal apparatus of the present invention is a mobile communication terminal apparatus provided with the image processing apparatus of the present invention.
- Since the composite image created by the image processing apparatus of the present invention can be transmitted to a receiving terminal apparatus, an image appropriately rendered by the user can be transmitted regardless of whether the original image is a still image or a moving image.
- As described above, according to the image processing method of the present invention, an image in which the user's desired effect is applied to the original image can be created simply and appropriately.
- According to the image processing apparatus of the present invention, since the composite image is created using the image processing method of the present invention, an image to which the user's rendition is applied can be created simply and appropriately, regardless of whether the original image is a still image or a moving image.
- According to the mobile communication terminal apparatus of the present invention, since it includes the image processing apparatus of the present invention, an image in which the user's effect is appropriately applied to the original image can be transmitted, regardless of whether the original image is a still image or a moving image.
- FIG. 1A is a view schematically showing an appearance of a front side of a mobile phone device according to a first embodiment of the present invention.
- FIG. 1B is a view schematically showing the appearance of the back side of the mobile phone device according to the first embodiment of the present invention.
- FIG. 2 is a functional block diagram for explaining an internal configuration of the mobile phone device of FIGS. 1A and 1B.
- FIG. 3 is a functional block diagram for explaining an internal configuration of the mask image processing unit of FIG. 2;
- FIG. 4A is a diagram (part 1) for describing target image area information.
- FIG. 4B is a diagram (part 2) for describing target image area information.
- FIG. 5A is a view for explaining the configuration of the mask image information file of FIG. 2;
- FIG. 5B is a view for explaining the configuration of the element image file of FIG. 5A.
- FIG. 6 is a diagram (part 1) illustrating an example of an element image.
- FIG. 7 is a diagram (part 1) showing an example of the contents of the display designation file of FIG. 5A.
- FIG. 8A is a diagram (part 2) showing an example of the contents of the display designation file of FIG. 5A.
- FIG. 8B is a diagram (part 3) showing an example of the contents of the display designation file of FIG. 5A.
- FIG. 9A is a view showing an example of an image of interest.
- FIG. 9B is a view for explaining an example of a noted image coordinate system in FIG. 9A.
- FIG. 10A is a diagram showing an example of an original image.
- FIG. 10B is a diagram (part 1) showing an example of a mask image.
- FIG. 11 is a first diagram showing an example of a composite image in the first embodiment.
- FIG. 12A is a diagram (part 2) showing an example of an element image.
- FIG. 12B is a diagram (part 3) showing an example of an element image.
- FIG. 12C is a diagram (part 4) showing an example of an element image.
- FIG. 13 is a second drawing showing an example of a mask image.
- FIG. 14 is a second diagram showing an example of a composite image in the first embodiment.
- FIG. 15A is a diagram (part 5) showing an example of an element image.
- FIG. 15B is a diagram (part 6) showing an example of an element image.
- FIG. 15C is a diagram (part 7) showing an example of an element image.
- FIG. 16 is a diagram (part 3) showing an example of a mask image.
- FIG. 17 is a third diagram showing an example of a composite image in the first embodiment.
- FIG. 18 is a functional block diagram for explaining an internal configuration of a mobile telephone device according to a second embodiment of the present invention.
- FIG. 19 is a functional block diagram for explaining an internal configuration of the mask image processing unit of FIG. 18;
- FIG. 20 is a view for explaining the configuration of a mask image information file of FIG. 18;
- FIG. 21 is a diagram showing an example of the content of the display designation file of FIG. 20.
- FIG. 22 is a view showing an example of a background image.
- FIG. 23 is a diagram showing an example of a specific display image.
- FIG. 24 is a view for explaining a composite image creation operation in the composite image creation unit of FIG. 19;
- FIG. 1 and FIG. 2 schematically show the configuration of a mobile telephone apparatus 10 which is a mobile communication terminal apparatus in one embodiment.
- FIG. 1A shows a front view of the appearance of the mobile telephone device 10
- FIG. 1B shows a rear view of the appearance of the mobile telephone device 10.
- FIG. 2 shows a functional block configuration of the mobile telephone device 10.
- As shown in FIG. 1A, FIG. 1B, and FIG. 2, the mobile phone device 10 includes (a) a mobile phone main body 11 having a control unit 21 and the like described later, (b) an operation unit 12 having a numeric keypad for inputting telephone numbers to the control unit 21 and function keys for inputting various commands, such as operation mode switching, to the control unit 21, and (c) a display unit 13 having a liquid crystal display device for displaying operation guidance, operation status, received messages, and the like according to commands from the control unit 21.
- The mobile phone device 10 also includes (d) a call speaker 14 for reproducing the voice signal sent from the other party during a call and a microphone 15 for inputting voice during a call, and (e) a guidance speaker 16 for generating ring tones and guidance sounds in response to commands from the control unit 21.
- The mobile telephone device 10 further includes (f) an antenna 17, connected to the transmission/reception unit 22, for exchanging radio signals with the base station.
- The mobile phone device 10 also includes (g) an imaging optical system 18 for capturing an image of the operator who operates the operation unit 12 while viewing the display unit 13 (that is, for imaging in so-called self-portrait mode), and (h) an imaging optical system 19 for capturing an image whose field of view lies along the operator's line of sight, that is, for imaging in so-called remote shooting mode.
- The mobile phone main body 11 houses (i) the control unit 21, which comprehensively controls the overall operation of the mobile phone device 10 and performs various data processing, (ii) the transmission/reception unit 22, which transmits and receives radio signals to and from the base station via the antenna 17, and (iii) the storage unit 23, which stores various data.
- The mobile phone main body 11 also houses a switching optical system 24, which, under the control of the control unit 21, directs either the light passing through the imaging optical system 18 or the light passing through the imaging optical system 19 toward the imaging element 25 described later, and (iv) an imaging element 25, such as a CCD element, for capturing the optical image formed on its light-receiving surface by the light passing through the switching optical system 24.
- the imaging device 25 performs an imaging operation under the control of the control unit 21 and notifies the control unit 21 of an imaging result.
- the control unit 21 includes a basic control unit 29 and a mask image processing unit 30.
- The basic control unit 29 controls the start and stop of the operation of the mask image processing unit 30, controls each element disposed outside the control unit 21, and performs data processing other than the image processing for mask image creation described later.
- The mask image processing unit 30 described above includes a target image information extraction unit 31, a mask image creation unit 32, and a composite image creation unit 33.
- Based on the original image data OMD from the imaging element 25, the target image information extraction unit 31 identifies the target image area, for example the face area of a specific person (such as the user of the mobile phone device 10), in the original image, and extracts target image information WMD including position information of the representative point, tilt information of the target image area in the original image, and size information of the target image area.
- Here, each pixel position in the original image area OMR is expressed in the original image coordinate system X-Y, and the tilt of the target image area WMR is expressed as a counterclockwise angle.
- First, the target image information extraction unit 31 identifies the target image area WMR in the original image area OMR, and obtains, as position information of the representative point of the target image area WMR, the coordinate position O(X, Y) of the representative point O in the original image coordinate system.
- the representative point of the area WMR is the position of the center of gravity of the image area of interest WMR, and in the image area of interest obtained by image analysis such as the tip position of the nose in the face image when the image of interest is a face image of a specific person. Characteristic points can be adopted.
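The extraction of a representative point, tilt, and characteristic length from a target image area can be sketched as follows. This is an illustrative reconstruction only, not the patent's actual algorithm; in particular, the use of the region centroid and a principal-axis fit for the tilt are assumptions.

```python
import numpy as np

def region_info(mask):
    """Illustrative sketch: derive the representative point, tilt angle, and
    characteristic width of a target image area given as a boolean mask over
    the original image coordinate system (assumption, not the patent's method)."""
    ys, xs = np.nonzero(mask)
    # Representative point: centre of gravity of the region.
    ox, oy = xs.mean(), ys.mean()
    # Tilt: orientation of the region's principal axis, as a counterclockwise angle.
    cov = np.cov(np.stack([xs - ox, ys - oy]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]
    theta = np.degrees(np.arctan2(major[1], major[0]))
    # Characteristic length: width of the region along the image X axis.
    lx = int(xs.max() - xs.min() + 1)
    return (ox, oy), theta, lx
```

A feature point such as the nose tip would in practice come from a dedicated face-analysis step rather than from the region mask alone.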
- further, the target image information extraction unit 31 obtains, as tilt information of the target image area WMR in the original image area OMR, the intersection angle θ between the X axis of the original image coordinate system and the X axis of the target coordinate system (that is, the above-described tilt angle θ).
- here, the X-Y coordinate system in the target image area WMR (the target coordinate system) is defined in advance as the coordinate system for which the tilt angle θ is "0".
- further, the target image information extraction unit 31 obtains, as size information of the target image area WMR, a characteristic length related to the target image area WMR. In the present example, the width on the X axis of the target image area WMR (width LX in FIG. 4A) is taken as the characteristic length.
- note that, as the size information of the target image area WMR, the maximum width of the target image in the target image area WMR, the distance between two characteristic points in the target image (for example, the distance between the centers of both eyes in a face image when the target image is the face of a specific person), the length of the X-axis component of that distance, or the length of the Y-axis component of that distance can also be adopted.
- in the case of FIG. 4B as well, the target image information extraction unit 31 specifies the target image area WMR in the original image area OMR as in the case of FIG. 4A described above, and then obtains the position information of the representative point, the tilt information, and the size information of the target image area WMR. Specifically, it obtains the coordinate position O(X, Y) of the representative point O in the original image coordinate system as the position information, and the width LX of the target image area WMR on the X axis as the size information.
- the mask image creation unit 32 described above creates mask image data MMD based on the target image information WMD extracted by the target image information extraction unit 31 and the mask image information MDD stored in the storage unit 23 described above.
- the mask image information MDD and the creation of the mask image data MMD by the mask image creation unit 32 will be described later.
- the above-described composite image creation unit 33 creates composite image data SMD, in which the mask image is superimposed on the original image, based on the original image data OMD and the mask image data MMD.
- the composite image data SMD created by the composite image creation unit 33 is notified to the basic control unit 29.
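The superimposition performed by the composite image creation unit 33 can be sketched as a simple alpha overlay. The blending method is an assumption for illustration; the patent itself does not specify how the mask pixels replace the original pixels.

```python
import numpy as np

def composite(original, mask_image, alpha):
    """Hedged sketch of composite image creation: overlay the mask image on
    the original wherever the mask's alpha is set.
    `original` and `mask_image` are H x W x 3 arrays, `alpha` is H x W
    (1 where a mask pixel is drawn, 0 where the original shows through)."""
    a = alpha[..., None].astype(float)
    return (a * mask_image + (1.0 - a) * original).astype(original.dtype)
```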
- the control unit 21 includes a central processing unit (CPU), a digital signal processing unit (DSP), and the like. The basic control unit 29 and the mask image processing unit 30, configured as programs executed by the control unit 21, perform various data processing to realize the mobile phone functions including mail transmission/reception processing, and control the operation of the other components.
- the storage unit 23 stores a mask image information file 40.
- this mask image information file 40 is composed of (i) a display specification file 41 storing a display specification description that specifies how the mask image is displayed in relation to the target image area in the original image, and (ii) an element image file 42 storing the element image data designated by the display specification description in the display specification file 41.
- the mask image information file 40 is provided to the mobile phone device 10 via a content provider over the wireless line, or, after the user creates it using a personal computer or the like, via a storage medium or an external interface (not shown).
- element images are stored in the element image file 42 as shown in FIG. 5B. The i-th element image stored in the element image file 42 is hereinafter referred to as element image EMM.
- in FIG. 5B, the case where a plurality of element images are stored in the element image file 42 is shown, but the number of element images stored in the element image file 42 may be one, or two or more.
- as the element image EMM, there are, for example, a heart figure image, a lip figure image, and the like. Such an element image is shown representatively for the element image EMM in FIG. 6, together with the coordinate system specific to the element image, that is, the X-Y coordinate system having as its origin O the representative point of the element image EMM (the center point of the element image EMM in FIG. 6).
- in FIG. 6 the element image EMM is a heart-shaped element image, but element images of other shapes and sizes may also be used.
- examples of the display specification description are shown in FIG. 7, FIG. 8A and FIG. 8B. Note that the display specification description in the present embodiment is written in a format according to the SMIL (Synchronized Multimedia Integration Language) format.
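As a rough illustration of such a SMIL-style display specification description, a fragment of the following shape could designate one additional image. All element and attribute names here are hypothetical and chosen for readability; the patent's actual vocabulary ("display position designation", "element image designation", and so on) is only paraphrased.

```xml
<!-- Illustrative SMIL-like display specification (hypothetical names):
     one element image anchored at a position in the target coordinate
     system, with its displayed width given as a ratio R of the target
     image area's size LX, shown until a time T elapses. -->
<additional-image id="1">
  <display-position x="-20" y="0"/>  <!-- coordinates in the target X-Y system -->
  <element-image src="heart.png"/>
  <size ratio="0.4"/>                <!-- max width = 0.4 * LX -->
  <display-time t="2s"/>             <!-- shown until time T elapses -->
</additional-image>
```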
- in the example of FIG. 7, a display specification is made such that two element images are fixedly displayed at two positions determined according to the position, tilt, and size of the target image.
- the additional image 1 is specified by the description between <additional image 1 specification> and <additional image 1 specification end>. That is, as designated by "display position designation", the element image designated by "element image designation" is displayed at the coordinate position (X, Y) of the target coordinate system in the target image area WMR.
- the additional image 2 is specified by the description between <additional image 2 specification> and <additional image 2 specification end>. That is, as designated by "display position designation", the element image designated by "element image designation" is displayed at the coordinate position (X, Y) of the target coordinate system determined for the case where the size information of the target image area WMR has a predetermined reference value LX.
- here, the element image file 42 stores only the element image EMM as an element image.
- in the example of FIG. 8A, display is performed at one position (for example, the position of the mouth when the target image is the face image of a specific person) determined according to the position, tilt, and size of the target image. An example of a display specification for displaying one additional image 1, whose content changes with time, is shown as the display specification file 41.
- the additional image 1 is specified by the description between <additional image 1 specification> and <additional image 1 specification end>. That is, first, the element image EMM designated by the first "element image designation" is displayed at the position designated by the first "display position designation" in the target image area. As designated by "size designation", its maximum width in the X-axis direction at display time is the length of the ratio R with respect to the size LX of the target image area WMR, and, as designated by the first "display time designation", it is displayed until the time T elapses.
- next, as designated by the second "display position designation", the next element image EMM is displayed at the designated position; its maximum width is likewise the length of the designated ratio R with respect to the size LX of the target image area WMR, and, as designated by the second "display time designation", it is displayed until the time T elapses. The third element image EMM is displayed in the same manner, as designated by the third "display time designation".
- in this way, the element images EMM are displayed in time sequence, and the display according to the specification of the element images is repeated in time sequence.
- in the example of FIG. 8B, a display specification is shown, as the display specification file 41, in which one element image is displayed at one position determined according to the position, tilt, and size of the target image area, and two element images different in time sequence are alternately displayed at two other positions determined according to the position, tilt, and size of the target image area.
- that is, the additional image 1 and the additional image 2, designated by <additional image 1 designation> and <additional image 2 designation> described between <simultaneous display designation> and <simultaneous display designation end>, are specified to be displayed simultaneously.
- the additional image 1 is specified by the description between <additional image 1 specification> and <additional image 1 specification end>. That is, as designated by "display position designation", the element image designated by "element image specification" is displayed at the coordinate position (X, Y) of the target coordinate system in the target image area WMR.
- the additional image 2 is specified by the description between <additional image 2 specification> and <additional image 2 specification end>. That is, first, as designated by the first "display position designation", the element image EMM is displayed with the size information of the target image area WMR taken to have a predetermined reference value LX. As designated by "size designation", the maximum width of the element image EMM in the X-axis direction at display time is a length of the designated ratio, and the element image EMM is displayed until the designated time T elapses; the size of the subsequent element image EMM at display time is likewise designated by "size designation".
- by combining the additional image 1 and the additional image 2, the element images EMM are displayed simultaneously.
- here, the mask image creation unit 32 has already read the contents of the mask image information file 40 in the storage unit 23 as the mask image information MDD.
- in the following, the original image is assumed to be the image formed on the imaging surface of the imaging element 25 through the imaging optical system 18 and the switching optical system 24, and the target image WMM is assumed to be the face image of a specific person as shown in FIG. 9A.
- further, the tip position of the nose in the target image WMM is taken as the origin O, and the direction connecting both eyes is taken as the X-axis direction; the origin O and the X-axis direction are obtained by image analysis of the target image in the original image, and the Y-axis direction is determined based on the determined X-axis direction.
- this example corresponds to the display specification of FIG. 7 described above, with the positions of both eyes as the display positions and the heart-shaped graphic image shown in FIG. 6 as the element image EMM.
- first, the original image data OMD, which is the data of the original image OMM captured by the imaging device 25, is notified to the target image information extraction unit 31. The original image data OMD is simultaneously notified to the composite image creation unit 33 (see FIG. 3).
- an example of the obtained original image OMM is shown in FIG. 10A.
- the target image information extraction unit 31 performs image processing on the notified original image OMM to extract the area of the target image WMM. Subsequently, the target image information extraction unit 31 analyzes the target image WMM, determines the nose tip position (X, Y) in the target image WMM, expressed in the original image coordinate system (X-Y coordinate system), as the position of the origin O of the target image coordinate system, and also obtains the tilt angle θ.
- further, the target image information extraction unit 31 obtains the width LX on the X axis of the target image WMM. The target image information extraction unit 31 then notifies the mask image creation unit 32, as the target image information data WMD, of the position (X, Y) as position information of the target image area, the angle θ as tilt information of the target image area, and the length LX as size information of the target image area.
- the mask image creation unit 32, having received the target image information data WMD, first calculates the display position (X, Y) of the element image EMM in the additional image 1 in the X-Y coordinate system according to the following equation (1) and the subsequent equation.
- next, the mask image creation unit 32 calculates the scaling factor M for the enlargement or reduction of the element image EMM in the additional image 1 according to the following equation (3).
- subsequently, the mask image creation unit 32 enlarges or reduces the element image EMM about the origin O in the X-Y coordinate system, and rotates the enlarged or reduced element image EMM by the angle θ about the origin O in the X-Y coordinate system.
- next, the mask image creation unit 32 calculates the display position (X, Y) of the element image EMM in the additional image 2 in the X-Y coordinate system according to the following equation (4) and the subsequent equation, and calculates the scaling factor M for the enlargement or reduction of the element image EMM in the additional image 2 according to the following equation (6).
- subsequently, the mask image creation unit 32 enlarges or reduces the element image EMM about the origin O in the X-Y coordinate system, and rotates the enlarged or reduced element image EMM by the angle θ about the origin O in the X-Y coordinate system.
- as a result, a mask image MMMA as shown as image F10B in FIG. 10B is created. The mask image creation unit 32 notifies the composite image creation unit 33 of the data of the created mask image MMMA as mask image data MMD.
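The placement steps described above, that is, enlargement or reduction by the factor M, rotation by the tilt angle θ, and translation to the computed display position, can be sketched as follows. The patent's numbered equations are not reproduced in this text, so the concrete formulas below are assumptions based on a standard similarity transform.

```python
import math

def place_element(points, scale_m, theta_deg, anchor):
    """Sketch of element-image placement (assumed formulas, not the patent's
    equations (1)-(6)): each element-image point, given in the element
    image's own coordinate system with its representative point at the
    origin, is scaled by M, rotated counterclockwise by the tilt angle,
    and translated to the display position in the original image
    coordinate system."""
    th = math.radians(theta_deg)
    cos_t, sin_t = math.cos(th), math.sin(th)
    ax, ay = anchor
    placed = []
    for x, y in points:
        xs, ys = scale_m * x, scale_m * y     # enlarge or reduce by M
        xr = xs * cos_t - ys * sin_t          # rotate by the tilt angle
        yr = xs * sin_t + ys * cos_t
        placed.append((ax + xr, ay + yr))     # move to the display position
    return placed
```

The same transform applies to both the additional image 1 and the additional image 2; only the anchor position and the factor M differ.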
- the composite image creation unit 33, having received the mask image data MMD and the original image data OMD, creates a composite image SMMA in which the mask image MMMA is superimposed on the original image OMM. The composite image SMMA thus created is shown as image F11A in FIG. 11.
- in the same manner, a composite image SMMB in which the mask image is superimposed on the original image is created.
- as shown as image F11C, as in the above case, a composite image SMMC is created in which a mask image, with the element image EMM displayed at the positions of both eyes of the target image WMM at a size of the ratio designated according to the size of the target image, is superimposed on the original image.
- when the target image area becomes larger, a composite image SMMD is created in which the element image EMM is correspondingly larger at the positions of both eyes of the target image WMM.
- that is, the display position of the element image follows the change in the coordinate position, in the original image coordinate system (X-Y coordinate system), of the nose tip position in the target image.
- this example corresponds to the mask image specification of FIG. 8A described above, in which the position of the mouth is the display position, and the element image EMM shown in FIG. 12A and the element image shown in FIG. 12B are used; the mask image shown is an example of a composite image displayed superimposed on the original image.
- the element images EMM are displayed in sizes according to the "size information" specified in the description of FIG. 8A.
- first, the data of the original image OMM captured by the imaging device 25 is notified to the target image information extraction unit 31 as the original image data OMD.
- the target image information extraction unit 31 performs image processing on the notified original image OMM to specify the area of the target image WMM, and extracts the target image information.
- the target image information extraction unit 31 then notifies the mask image creation unit 32, as the target image information data WMD, of the position (X, Y) as position information of the target image, the angle θ as tilt information of the target image, and the length LX as size information of the target image.
- the mask image creation unit 32, having received the target image information data WMD, first calculates the display position (X, Y) of the first element image EMM in the additional image 1 in the X-Y coordinate system according to the following equation (7) and the subsequent equation, and sets the magnification M for the enlargement or reduction of the first element image EMM.
- subsequently, the mask image creation unit 32 enlarges or reduces the element image EMM at the magnification M about the origin O in the X-Y coordinate system, rotates it, and places the enlarged or reduced and rotated element image EMM in the X-Y coordinate system so that its origin O comes to the calculated coordinate position (X, Y).
- the mask image creation unit 32 then notifies the composite image creation unit 33 of the data of the created mask image MMM as mask image data MMD.
- the composite image creation unit 33, having received the mask image data MMD and the original image data OMD, creates a composite image SMM in which the mask image MMM is superimposed on the original image OMM. The composite image SMM thus created is shown as image F14A in FIG. 14.
- in this manner, over the time T, the mask image creation unit 32 creates a mask image in which the element image EMM, enlarged or reduced according to the length value LX representing the size of the target image WMM, is displayed at the position of the mouth of the target image WMM, and composite images in which such mask images are superimposed are sequentially created over the time T.
- when the time T elapses, the mask image creation unit 32 calculates the display position (X, Y) of the second element image EMM in the additional image 1 in the X-Y coordinate system according to the above equation (7) and the subsequent equation.
- then, the mask image creation unit 32 performs the same processing as in the case of the first element image EMM described above: the element image EMM is enlarged or reduced at the magnification M about the origin O in the X-Y coordinate system, and the enlarged or reduced element image EMM is rotated by the angle θ about the origin O in the X-Y coordinate system.
- the mask image creation unit 32 then notifies the composite image creation unit 33 of the data of the created mask image MMM as mask image data MMD.
- the composite image creation unit 33, having received the mask image data MMD and the original image data OMD, creates a composite image SMM in which the mask image MMM is superimposed on the original image OMM. The composite image SMM thus created is shown as image F14B in FIG. 14.
- in this manner, over the time T, the mask image creation unit 32 displays, at the position of the mouth of the target image WMM, the element image EMM enlarged or reduced according to the length value LX representing the size of the target image WMM.
- when the time T elapses, the mask image creation unit 32 calculates the display position (X, Y) of the third element image EMM in the additional image 1 in the X-Y coordinate system according to the above equation (7) and the subsequent equation.
- then, the mask image creation unit 32 performs the same processing as in the case of the first element image EMM described above: the element image EMM is enlarged or reduced at the magnification M about the origin O in the X-Y coordinate system, and the enlarged or reduced element image EMM is rotated by the angle θ about the origin O in the X-Y coordinate system.
- the mask image creation unit 32 then notifies the composite image creation unit 33 of the data of the created mask image MMM as mask image data MMD.
- the composite image creation unit 33, having received the mask image data MMD and the original image data OMD, creates a composite image SMM in which the mask image is superimposed on the original image. The composite image SMM thus created is shown as image F14C in FIG. 14.
- in this manner, over the time T, the mask image creation unit 32 generates a mask image in which the element image EMM is displayed at the position of the mouth of the target image WMM at a size of the ratio R with respect to the length value LX representing the size of the target image WMM. Thereafter, the mask image creation unit 32 performs the same processing as described above.
- as a result, based on the original image data OMD and the mask image data MMD, the composite image creation unit 33 creates composite images in which a mask image containing the element images EMM, each enlarged or reduced to the designated size, at the position of the mouth of the target image in the original image, is superimposed on the original image.
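The time-sequential, repeating display of element images described in this example can be sketched as a simple scheduler. This is an illustrative sketch only; the names and the schedule representation are assumptions.

```python
def element_for_time(schedule, t):
    """Sketch of the time-sequential display designation: `schedule` is a
    list of (element_name, duration) pairs. The element images are shown
    one after another for their designated display times, and when the
    last one finishes, the sequence repeats from the beginning."""
    cycle = sum(d for _, d in schedule)
    t = t % cycle                      # the time-sequential display repeats
    for name, duration in schedule:
        if t < duration:
            return name
        t -= duration
```

At each frame, the element image returned for the current time would be scaled, rotated, and placed as described above before compositing.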
- this example corresponds to the mask image specification of FIG. 8B described above; a mask image simultaneously displaying the additional image 1, whose display position relative to the target image is fixed, and the additional image 2, whose display relative to the target image changes over time, is displayed superimposed on the original image as a composite image.
- here, the element image EMM in the additional image 1 is as shown in FIG. 15A, and the first element image EMM in the additional image 2 is as shown in FIG. 15B; each element image has a size according to the "size information" specified in the description of FIG. 8B.
- first, the original image data OMD captured by the imaging device 25 is notified to the target image information extraction unit 31 in the same manner as in the first or second composite image creation described above.
- the target image information extraction unit 31 performs image processing on the notified original image OMM, specifies the area of the target image WMM, and extracts the target image information. The target image information extraction unit 31 then notifies the mask image creation unit 32, as the target image information data WMD, of the position (X, Y) as position information of the target image area, the angle θ as tilt information of the target image area, and the length LX as size information of the target image area.
- the mask image creation unit 32, having received the target image information data WMD, first calculates the display position (X, Y) of the first element image EMM in the additional image 1 in the X-Y coordinate system according to the following equation (12) and the subsequent equation, and sets the magnification M for the enlargement or reduction of the first element image EMM.
- subsequently, the mask image creation unit 32 enlarges or reduces the element image EMM at the magnification M about the origin O in the X-Y coordinate system, rotates the enlarged or reduced element image EMM by the angle θ about the origin O in the X-Y coordinate system, and places the enlarged or reduced and rotated element image EMM at the calculated position in the X-Y coordinate system.
- next, the mask image creation unit 32 sets the magnification M for the enlargement or reduction of the second element image EMM, executes the same processing as in the case of the element image EMM described above, and places the enlarged or reduced element image EMM.
- the mask image creation unit 32 then notifies the composite image creation unit 33 of the data of the created mask image MMM as mask image data MMD.
- the composite image creation unit 33, having received the mask image data MMD and the original image data OMD, creates a composite image SMM in which the mask image MMM is superimposed on the original image OMM. The composite image SMM thus created is shown as image F17A in FIG. 17.
- in this manner, over the time T, the mask image creation unit 32 creates mask images in which the element image EMM is displayed at the designated size with respect to the length value LX representing the size of the target image WMM, and composite images in which the mask image represented by the mask image data MMD notified from the mask image creation unit 32 is superimposed on the original image are created over the time T.
- when the time T elapses, the mask image creation unit 32 obtains the display position (X, Y) and the magnification M, in the X-Y coordinate system, of the element image EMM in the additional image 1, and enlarges or reduces the element image EMM at the magnification M about the origin O of the coordinate system.
- further, the mask image creation unit 32 generates the first element image EMM in the additional image 1 and sets the magnification M for the enlargement or reduction of the third element image EMM. The mask image creation unit 32 then performs the same processing as for the first element image EMM described above: the element image EMM is enlarged or reduced at the magnification M about the origin O in the X-Y coordinate system, and the enlarged or reduced element image EMM is rotated by the angle θ about the origin O in the X-Y coordinate system.
- in this way, a new mask image MMM is created. The mask image creation unit 32 then notifies the composite image creation unit 33 of the data of the created mask image MMM as mask image data MMD.
- the composite image creation unit 33, having received the mask image data MMD and the original image data OMD, creates a composite image SMM in which the mask image MMM is superimposed on the original image OMM. The composite image SMM thus created is shown as image F17B in FIG. 17.
- in this manner, over the time T, the mask image creation unit 32 creates mask images in which the element image EMM is displayed at the designated size with respect to the length value LX representing the size of the target image WMM. Thereafter, the mask image creation unit 32 performs the same processing as described above, and the composite image creation unit 33 sequentially creates composite images in which the mask image represented by the mask image data MMD notified from the mask image creation unit 32 is superimposed on the original image.
- the information data SMD of the composite image created as in the first, second, or third composite image creation example described above is notified from the composite image creation unit 33 to the basic control unit 29.
- the basic control unit 29, having received this composite image information data, displays the composite image SMM on the display unit 13, or sends it to the other party via the transmission/reception unit 21, in accordance with a command previously given by the user operating the operation unit 12.
- as described above, in the present embodiment, the area of the target image WMM is identified in the original image, and the target image information WMD, including information related to the position and size of the area of the target image WMM in the original image, is extracted. Then, based on the target image information WMD and the mask image information registered in the mask image information file 40, a mask image MMM to be displayed superimposed on the original image OMM is created.
- that is, a mask image is created in which the element image selected by the element image selection information included in the mask image information is displayed at a position determined by the information related to the position of the area of the target image WMM in the original image OMM and the information, included in the mask image information, indicating the relationship with that position, and with a size determined by the information related to the size of the target image area and the information indicating the relationship with the size of the target image. Then, a composite image in which the mask image is superimposed on the original image is created.
- further, in the present embodiment, the information related to the position of the target image area includes the position information of the representative point of the target image area WMR in the original image OMM. Therefore, the information indicating the relationship with the position of the target image area included in the mask image information can be information indicating the relative position with respect to the representative point of the target image area, and this relationship information can be simplified.
- further, the information related to the position of the target image area further includes tilt information of the target image area in the original image OMM. For this reason, by using the tilt information of the target image area in the original image OMM, it is possible to create a composite image in which a desired element image is superimposed and displayed at a position according to the tilt of the target image area in the original image. Further, the element image to be displayed can be displayed with a tilt according to the tilt of the target image area in the original image.
- further, in the present embodiment, coordinate position information in a coordinate system whose origin is the representative point of the target image area and whose coordinate axis directions are determined based on the tilt of the target image area in the original image is included as the display position information of the element image. Therefore, the information indicating the relationship with the position of the target image area can be defined as coordinate position information in the coordinate system whose origin is the representative point of the target image area and whose axis directions are determined based on the tilt of the target image area, and it can be easy to handle.
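Under this convention, a display position (x, y) given in the target coordinate system, whose origin is the representative point O = (X_O, Y_O) of the target image area and whose axes are rotated by the tilt angle θ, corresponds to the following original-image coordinates. This is a generic reconstruction of the conversion, since the patent's numbered equations are not reproduced in this text:

```latex
\begin{aligned}
X &= X_O + x\cos\theta - y\sin\theta,\\
Y &= Y_O + x\sin\theta + y\cos\theta.
\end{aligned}
```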
- further, the mask image information can include ratio information between the size of the target image area and the size of the element image EMM at display time. In this case, even if the size of the target image area changes from moment to moment in the original image OMM, a mask image MMM including an element image EMM of a size adapted to the size of the target image area can be created.
- further, the mask image information can include display execution time information for each of the element images EMM. In this case, each of the desired element images EMM can be displayed only for the display execution time set at the desired time, and a dynamically changing mask image MMM can be created.
- further, the mask image information can include time-sequential display designation information specifying that a plurality of element images EMM are to be displayed in time sequence and that the time-sequential display is to be repeated. By creating a mask image in accordance with the time-sequential display designation information, a plurality of element images can be repeatedly displayed on the original image OMM in time sequence at each desired time.
- further, the mask image information can include simultaneous display designation information which designates that a plurality of element images are to be displayed simultaneously.
- the present embodiment differs from the first embodiment in that an image different from the original image can be used as the background of the person image. The following description will focus on the differences.
- the cellular phone device 10' of the present embodiment differs from the mobile telephone device 10 of the first embodiment in that the control unit 21 in the cellular phone main body 11 includes a mask image processing unit 30', and in that a mask image information file 40' is stored in the storage unit 23.
- the mask image processing unit 30' includes a target image information extraction unit 31, a mask image creation unit 32', and a composite image creation unit 33'. The target image information extraction unit 31 operates in the same manner as in the first embodiment.
- the mask image creation unit 32' creates the mask image data MMD based on the target image information WMD extracted by the target image information extraction unit 31 and the mask image information MDD stored in the mask image information file 40' of the storage unit 23 described above. Further, based on the mask image information MDD stored in the mask image information file 40', the mask image creation unit 32' creates background image information BPN, consisting of background image designation information indicating the presence or absence of a background image designation and, when there is a background image designation, the storage position information of the background image data BDD.
- the composite image creation unit 33 ′, when notified by the background image information BPN that a background image is designated, creates an image in which the specific display image cut out from the original image is superimposed on the designated background image, and then creates composite image data SMD ′ in which the mask image represented by the mask image information MDD is superimposed on that image.
- the composite image creation unit 33 ′, when notified by the background image information BPN that no background image is designated, uses the original image itself as the base image, and creates composite image data SMD ′ in which the mask image represented by the mask image information MDD is superimposed on it. In this case, the composite image data SMD ′ is the same as the composite image data SMD of the first embodiment.
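The branching behavior of the composite image creation unit 33 ′ described above can be sketched as follows. This is a hypothetical illustration only; the function names and the dict-based image representation are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the composite image creation unit 33'.
# Images are modeled as dicts mapping pixel coordinates to values;
# pixels absent from an overlay (or set to None) are transparent.

def superimpose(base, overlay):
    """Overlay the non-transparent pixels of `overlay` onto a copy of `base`."""
    out = dict(base)
    for pos, px in overlay.items():
        if px is not None:
            out[pos] = px
    return out

def create_composite(original, mask, background=None, person=None):
    """Mirror the 33' logic: when a background image is designated (BPN),
    paste the cut-out person image onto the background and use that as the
    base; otherwise superimpose the mask directly on the original image."""
    if background is not None:
        base = superimpose(background, person or {})
    else:
        base = original
    return superimpose(base, mask)

original = {(0, 0): "o", (0, 1): "o"}
mask = {(0, 1): "m"}
# No background designated: same result as composite SMD of the first embodiment
print(create_composite(original, mask))   # {(0, 0): 'o', (0, 1): 'm'}
```

With a `background` argument supplied, the same call reproduces the background-designated branch described above.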
- the composite image data SMD ′ created by the composite image creation unit 33 ′ is sent to the basic control unit 29.
- the mask image information file 40 ′ in the storage unit 23 is composed of (i) a display specification file 41 ′ storing a display specification description, which designates how the mask image is to be displayed in relation to the image-of-interest region in the original image, and a background image specification description, (ii) the element image file 42 described above, and (iii) a background image file 43 storing the background image data BDD. Note that the mask image information file 40 ′ is provided to the mobile phone device 10 ′ from a content provider via a wireless line, or is created by the user on a personal computer or the like and then provided via a storage medium and an external interface (not shown).
- FIG. 21 shows an example of the display specification description in the display specification file 41 ′ of the present embodiment.
- the description example shown in FIG. 21 is the description example of FIG. 7 of the first embodiment with a background image designation added. That is, in the display specification description of FIG. 21, in addition to the descriptions of additional image 1 and additional image 2 of FIG. 7 placed between <simultaneous display designation> and <simultaneous display designation end>, the background image to be displayed simultaneously is designated by <background image designation> and <background image designation end> and the "background image designation" description between them.
- in the "background image designation", the head address BMMA of the background image file 43 in the storage unit 23 is designated.
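Reading such a designation out of the display specification file could be sketched as below. The tag spellings and the textual address format are assumptions modeled on the description of FIG. 21, not the actual file format.

```python
# Hypothetical sketch of extracting the background image designation
# (the head address BMMA) from a display specification file 41'.

def parse_background_designation(spec_text):
    """Return the head address written between the background-image
    designation tags, or None when no designation is present."""
    start = "<background image designation>"
    end = "<background image designation end>"
    i = spec_text.find(start)
    if i == -1:
        return None                      # no background image designated
    j = spec_text.find(end, i)
    body = spec_text[i + len(start):j].strip()
    return int(body, 16)                 # assumed hexadecimal head address

spec = ("<simultaneous display designation>...<simultaneous display designation end>\n"
        "<background image designation> 0x3F00 <background image designation end>")
print(hex(parse_background_designation(spec)))   # 0x3f00
```

A file with no background tags would yield `None`, which corresponds to the no-designation case handled later in the description.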
- in the background image file 43, background image data BDD representing the background image BMM in a background image region BMR, which has the same shape and size as the original image region OMR as shown in FIG. 22, is stored.
- in the following, the person image SDM in the original image region OMR shown in FIG. 23 is assumed to be the specific display image to be cut out from the original image OMM.
- the original image data OMD, which is the data of the original image OMM captured by the imaging device 25, is supplied to the image-of-interest information extraction unit 31, and the image-of-interest information extraction unit 31 generates image-of-interest information data WMD.
- the image-of-interest information data WMD created in this manner is sent from the image-of-interest information extraction unit 31 to the mask image creation unit 32 ′.
- upon receiving the image-of-interest information data WMD, the mask image creation unit 32 ′ creates data of the mask image MMMA based on the image-of-interest information data WMD and the mask image information MDD, and sends it to the composite image creation unit 33 ′ as mask image data MMD. In addition, the mask image creation unit 32 ′ determines, based on the mask image information MDD, whether a background image is designated, and when one is designated, creates background image information BPN including the head address of the background image file 43 and sends it to the composite image creation unit 33 ′.
- here, since the background image designation is described in the display specification file 41 ′ as shown in FIG. 21 described above, the background image information BPN indicates that a background image is designated and includes the head address of the background image file 43.
- upon receiving the mask image data MMD, the original image data OMD and the background image information BPN, the composite image creation unit 33 ′ determines whether a background image is designated. In this case, since a background image is designated as described above, the determination result is affirmative. Subsequently, the composite image creation unit 33 ′ reads the background image data BDD from the background image file 43 using the head address of the background image file 43 contained in the background image information BPN.
- having acquired the background image data BDD in addition to the mask image data MMD and the original image data OMD, the composite image creation unit 33 ′ creates a composite image as follows.
- first, the composite image creation unit 33 ′ cuts out the person image SDM as the specific display image, as shown as image F24B. This cutout is performed by identifying the contour of the person image SDM from general features of person images and then extracting the image within the contour.
- next, the composite image creation unit 33 ′ superimposes the specific display image SDM on the background image BMM, shown as image F24C, at the same position and with the same size as in the original image region OMR. As a result, a background-added image BAM shown as image F24D is created.
- then, the composite image creation unit 33 ′ superimposes the mask image MMMA, shown as image F24E, on the background-added image BAM. As a result, a composite image SMMA ′ shown as image F24F is created.
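The three steps above (images F24B to F24F) can be sketched end to end. This is a toy illustration under stated assumptions: images are character grids, "." marks transparent or non-person pixels, and the contour extraction is approximated by a per-pixel predicate rather than real contour analysis.

```python
# Hypothetical sketch of the F24B-F24F pipeline: cut out the person image
# SDM, paste it onto the background image BMM at the same position
# (background-added image BAM), then overlay the mask image MMMA.

def cut_out(original, is_person):
    """F24B: keep only pixels judged to lie inside the person contour
    (approximated here by a per-pixel predicate)."""
    return [[px if is_person(px) else "." for px in row] for row in original]

def overlay(base, top):
    """Paste every non-'.' pixel of `top` onto `base` at the same position."""
    return [[t if t != "." else b for b, t in zip(brow, trow)]
            for brow, trow in zip(base, top)]

original = [list("..PP"), list("..PP")]       # P = person pixels
background = [list("BBBB"), list("BBBB")]     # image F24C
mask = [list("M..."), list("....")]           # image F24E

sdm = cut_out(original, lambda px: px == "P")  # image F24B
bam = overlay(background, sdm)                 # image F24D
smma = overlay(bam, mask)                      # image F24F
print(["".join(r) for r in smma])              # ['MBPP', 'BBPP']
```

The person pixels survive at their original positions on the new background, and the mask pixels take priority over everything beneath them, matching the layering order described above.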
- the basic control unit 29 having received this composite image data displays the composite image SMMA ′ on the display unit 13, or transmits it to the other party via the transmission/reception unit 21, in accordance with a command previously issued by the user operating the operation unit 12.
- when no background image designation is described in the display specification file 41 ′, or when the storage head address of the background image data BDD is "0", the mask image creation unit 32 ′ creates background image information BPN containing only an indication that no background image is designated, and notifies the composite image creation unit 33 ′. Upon receiving such background image information BPN, the composite image creation unit 33 ′ determines that no background image is designated, and superimposes the mask image MMMA on the original image OMM to create a composite image. As a result, the same composite image SMMA as in the first composite image creation example of the first embodiment is created.
- when no mask image designation is described in the display specification file 41 ′ and only the background image designation is described, the background-added image BAM is created as the final composite image by the composite image creation unit 33 ′.
- that is, the composite image creation unit 33 ′ cuts out the person image SDM from the original image OMM as the specific display image, superimposes it on the designated background image BMM, and creates the background-added image BAM as an image containing the image of interest.
- as described above, the composite image creation unit 33 ′ superimposes the mask image MMMA on the background-added image BAM. Therefore, by selecting the mask image and the background image, an effect suited to the user's taste can be given to the specific display image such as the person image SDM.
- in the above description, the mask image information file 40 ′ includes the background image file 43, but the background image file 43 can also be placed in an area outside the mask image information file 40 ′.
- in the above embodiments, the original image is a moving image, but the present invention can also be applied when the original image is a still image.
- in the above embodiments, the imaging result of light passing through the imaging optical system 18, captured in the so-called self-portrait mode, is used as the original image; however, the imaging result of light passing through the imaging optical system 19, captured in the other imaging mode, can also be used as the original image.
- an image stored in the storage unit 23 can also be used as the original image.
- in the above embodiments, the element image used for the mask image is created from a two-dimensional model, without considering the structure along the depth direction of the object of interest in the image of interest in the original image.
- however, the element image used for the mask image may be created from a three-dimensional model. In this case, even when the object of interest makes a motion having a rotational component around an axis parallel to the display surface, a mask image that follows the rotation can be created without appearing unnatural.
- the number of element images displayed simultaneously is arbitrary.
- the number of element images displayed in time sequence is also arbitrary.
- the mask image information file is composed of the display designation file and the element image file.
- the mask image information file may further include an element image preview file; when a preview of an element image is designated, the element image preview file is referred to and the element image is previewed on the display unit 13.
- it is also possible to determine, based on the image of interest, whether a predetermined phenomenon has occurred in the image-of-interest region, and to start the creation of a mask image triggered by the occurrence of that phenomenon. For example, when the face image of a specific person is taken as the image of interest, a composite image in which tears flow from the eye positions when the eyelids are closed can be created.
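The event-triggered behavior just described can be sketched as follows. The frame representation, the `eye_openness` field, and the threshold are all assumptions made purely for illustration; the patent does not specify how the phenomenon is detected.

```python
# Hypothetical sketch of event-triggered mask creation: a predetermined
# phenomenon (here, "eyes closed" in a frame of the image of interest)
# triggers the creation of a mask image (e.g. tears at the eye positions).

def eyes_closed(frame):
    # Assumed detector: treat low openness as a closed eyelid.
    return frame.get("eye_openness", 1.0) < 0.2

def process_stream(frames, make_mask):
    """Create a mask only for frames in which the phenomenon occurred;
    other frames pass through with no mask (None)."""
    return [make_mask(f) if eyes_closed(f) else None for f in frames]

frames = [{"eye_openness": 0.9}, {"eye_openness": 0.1}, {"eye_openness": 0.8}]
masks = process_stream(frames, lambda f: "tear-mask")
print(masks)   # [None, 'tear-mask', None]
```

Only the middle frame, where the eyelid is judged closed, receives a mask, mirroring how mask creation starts only when the predetermined phenomenon occurs.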
- the present invention is applied to a mobile telephone device in the above embodiments, but it can also be applied to other types of mobile communication terminal devices, and furthermore to general electronic devices such as personal computers.
- the image processing method and the image processing device of the present invention can be applied to the processing of an obtained original image.
- the mobile communication terminal device of the present invention can be applied to a mobile communication terminal having an image mail function and a videophone function.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05709805A EP1713030A4 (en) | 2004-02-05 | 2005-02-07 | PICTURE PROCESSING METHOD, PICTURE PROCESSING DEVICE AND MOBILE COMMUNICATION TERMINAL DEVICE |
JP2005517765A JPWO2005076210A1 (ja) | 2004-02-05 | 2005-02-07 | 画像処理方法、画像処理装置及び移動通信端末装置 |
US11/498,983 US7864198B2 (en) | 2004-02-05 | 2006-08-04 | Image processing method, image processing device and mobile communication terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004029943 | 2004-02-05 | ||
JP2004-029943 | 2004-02-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/498,983 Continuation US7864198B2 (en) | 2004-02-05 | 2006-08-04 | Image processing method, image processing device and mobile communication terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005076210A1 true WO2005076210A1 (ja) | 2005-08-18 |
Family
ID=34835975
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/001753 WO2005076210A1 (ja) | 2004-02-05 | 2005-02-07 | 画像処理方法、画像処理装置及び移動通信端末装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7864198B2 (ja) |
EP (1) | EP1713030A4 (ja) |
JP (1) | JPWO2005076210A1 (ja) |
WO (1) | WO2005076210A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008053855A (ja) * | 2006-08-22 | 2008-03-06 | Olympus Imaging Corp | 撮像装置 |
WO2013132557A1 (ja) * | 2012-03-05 | 2013-09-12 | パナソニック株式会社 | コンテンツ加工装置とその集積回路、方法、およびプログラム |
JP2013198086A (ja) * | 2012-03-22 | 2013-09-30 | Canon Inc | 設定装置、画像処理装置、設定方法、画像処理方法、及びプログラム |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4417292B2 (ja) * | 2005-05-25 | 2010-02-17 | ソフトバンクモバイル株式会社 | オブジェクト出力方法及び情報処理装置 |
EP2225878A1 (en) * | 2007-11-22 | 2010-09-08 | Koninklijke Philips Electronics N.V. | Methods and devices for receiving and transmitting an indication of presence |
US20090169074A1 (en) * | 2008-01-02 | 2009-07-02 | General Electric Company | System and method for computer assisted analysis of medical image |
JP5020135B2 (ja) * | 2008-03-19 | 2012-09-05 | ソニーモバイルコミュニケーションズ, エービー | 携帯端末装置およびコンピュータプログラム |
EP2113881A1 (en) * | 2008-04-29 | 2009-11-04 | Holiton Limited | Image producing method and device |
JP2009278325A (ja) * | 2008-05-14 | 2009-11-26 | Seiko Epson Corp | 画像処理装置、画像処理方法、およびプログラム |
US9239847B2 (en) * | 2009-03-12 | 2016-01-19 | Samsung Electronics Co., Ltd. | Method and apparatus for managing image files |
US20110018961A1 (en) * | 2009-07-24 | 2011-01-27 | Huboro Co., Ltd. | Video call device and method |
JP5304529B2 (ja) * | 2009-08-17 | 2013-10-02 | 富士ゼロックス株式会社 | 画像処理装置及び画像処理プログラム |
EP2355472B1 (en) * | 2010-01-22 | 2020-03-04 | Samsung Electronics Co., Ltd. | Apparatus and method for transmitting and receiving handwriting animation message |
JP5525923B2 (ja) | 2010-06-09 | 2014-06-18 | 任天堂株式会社 | 画像処理プログラム、画像処理装置、画像処理システム、および画像処理方法 |
US8854298B2 (en) * | 2010-10-12 | 2014-10-07 | Sony Computer Entertainment Inc. | System for enabling a handheld device to capture video of an interactive application |
JP5791256B2 (ja) * | 2010-10-21 | 2015-10-07 | キヤノン株式会社 | 表示制御装置、表示制御方法 |
WO2013065121A1 (ja) * | 2011-11-01 | 2013-05-10 | アイシン精機株式会社 | 障害物警報装置 |
JP5805503B2 (ja) * | 2011-11-25 | 2015-11-04 | 京セラ株式会社 | 携帯端末、表示方向制御プログラムおよび表示方向制御方法 |
US9236024B2 (en) | 2011-12-06 | 2016-01-12 | Glasses.Com Inc. | Systems and methods for obtaining a pupillary distance measurement using a mobile computing device |
US9378584B2 (en) | 2012-05-23 | 2016-06-28 | Glasses.Com Inc. | Systems and methods for rendering virtual try-on products |
US9483853B2 (en) * | 2012-05-23 | 2016-11-01 | Glasses.Com Inc. | Systems and methods to display rendered images |
US9286715B2 (en) | 2012-05-23 | 2016-03-15 | Glasses.Com Inc. | Systems and methods for adjusting a virtual try-on |
GB2502591B (en) * | 2012-05-31 | 2014-04-30 | Sony Comp Entertainment Europe | Apparatus and method for augmenting a video image |
KR101948692B1 (ko) | 2012-10-09 | 2019-04-25 | 삼성전자주식회사 | 촬영 장치 및 이미지 합성 방법 |
JP6315895B2 (ja) * | 2013-05-31 | 2018-04-25 | キヤノン株式会社 | 撮像装置、画像処理装置、撮像装置の制御方法、画像処理装置の制御方法、プログラム |
CN106133796B (zh) * | 2014-03-25 | 2019-07-16 | 苹果公司 | 用于在真实环境的视图中表示虚拟对象的方法和系统 |
JP7002846B2 (ja) * | 2017-03-03 | 2022-01-20 | キヤノン株式会社 | 画像処理装置およびその制御方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999057900A1 (en) | 1998-05-03 | 1999-11-11 | John Karl Myers | Videophone with enhanced user defined imaging system |
JP2000322588A (ja) * | 1999-05-06 | 2000-11-24 | Toshiba Corp | 画像処理装置及びその方法 |
JP2001292305A (ja) * | 2000-02-02 | 2001-10-19 | Casio Comput Co Ltd | 画像データ合成装置、画像データ合成システム、画像データ合成方法及び記録媒体 |
JP2002077592A (ja) * | 2000-04-13 | 2002-03-15 | Fuji Photo Film Co Ltd | 画像処理方法 |
JP2003309829A (ja) * | 2002-04-15 | 2003-10-31 | Matsushita Electric Ind Co Ltd | 携帯動画電話装置 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2002A (en) * | 1841-03-12 | Tor and planter for plowing | ||
EP0260710A3 (en) * | 1986-09-19 | 1989-12-06 | Hoya Corporation | Method of forming a synthetic image in simulation system for attachment of spectacles |
US5469536A (en) * | 1992-02-25 | 1995-11-21 | Imageware Software, Inc. | Image editing system including masking capability |
KR100197835B1 (ko) * | 1995-09-01 | 1999-06-15 | 윤종용 | 더블스크린을 이용한 정보신호 표시장치 |
JPH09269999A (ja) * | 1996-01-31 | 1997-10-14 | Fuji Photo Film Co Ltd | 画像合成装置および方法 |
US6400374B2 (en) * | 1996-09-18 | 2002-06-04 | Eyematic Interfaces, Inc. | Video superposition system and method |
US6141433A (en) * | 1997-06-19 | 2000-10-31 | Ncr Corporation | System and method for segmenting image regions from a scene likely to represent particular objects in the scene |
US6018774A (en) * | 1997-07-03 | 2000-01-25 | Yobaby Productions, Llc | Method and system for creating messages including image information |
BR9906453A (pt) * | 1998-05-19 | 2000-09-19 | Sony Computer Entertainment Inc | Dispositivo e método do processamento de imagem, e meio de distribuição. |
US6721446B1 (en) * | 1999-04-26 | 2004-04-13 | Adobe Systems Incorporated | Identifying intrinsic pixel colors in a region of uncertain pixels |
US7236622B2 (en) * | 1999-08-25 | 2007-06-26 | Eastman Kodak Company | Method for forming a depth image |
US6556704B1 (en) * | 1999-08-25 | 2003-04-29 | Eastman Kodak Company | Method for forming a depth image from digital image data |
US7230653B1 (en) * | 1999-11-08 | 2007-06-12 | Vistas Unlimited | Method and apparatus for real time insertion of images into video |
US7106887B2 (en) * | 2000-04-13 | 2006-09-12 | Fuji Photo Film Co., Ltd. | Image processing method using conditions corresponding to an identified person |
JP3784289B2 (ja) | 2000-09-12 | 2006-06-07 | 松下電器産業株式会社 | メディア編集方法及びその装置 |
US6993553B2 (en) * | 2000-12-19 | 2006-01-31 | Sony Corporation | Data providing system, data providing apparatus and method, data acquisition system and method, and program storage medium |
JP3948986B2 (ja) * | 2002-03-14 | 2007-07-25 | 三洋電機株式会社 | 撮影画像表示装置及び撮影画像表示方法 |
US7227976B1 (en) * | 2002-07-08 | 2007-06-05 | Videomining Corporation | Method and system for real-time facial image enhancement |
JP4307910B2 (ja) * | 2003-03-07 | 2009-08-05 | 富士フイルム株式会社 | 動画像切り出し装置および方法並びにプログラム |
TWI241824B (en) * | 2003-10-31 | 2005-10-11 | Benq Corp | Mobile phone and related method for displaying text message with various background images |
-
2005
- 2005-02-07 WO PCT/JP2005/001753 patent/WO2005076210A1/ja not_active Application Discontinuation
- 2005-02-07 EP EP05709805A patent/EP1713030A4/en not_active Ceased
- 2005-02-07 JP JP2005517765A patent/JPWO2005076210A1/ja active Pending
-
2006
- 2006-08-04 US US11/498,983 patent/US7864198B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999057900A1 (en) | 1998-05-03 | 1999-11-11 | John Karl Myers | Videophone with enhanced user defined imaging system |
JP2000322588A (ja) * | 1999-05-06 | 2000-11-24 | Toshiba Corp | 画像処理装置及びその方法 |
JP2001292305A (ja) * | 2000-02-02 | 2001-10-19 | Casio Comput Co Ltd | 画像データ合成装置、画像データ合成システム、画像データ合成方法及び記録媒体 |
JP2002077592A (ja) * | 2000-04-13 | 2002-03-15 | Fuji Photo Film Co Ltd | 画像処理方法 |
JP2003309829A (ja) * | 2002-04-15 | 2003-10-31 | Matsushita Electric Ind Co Ltd | 携帯動画電話装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1713030A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008053855A (ja) * | 2006-08-22 | 2008-03-06 | Olympus Imaging Corp | 撮像装置 |
WO2013132557A1 (ja) * | 2012-03-05 | 2013-09-12 | パナソニック株式会社 | コンテンツ加工装置とその集積回路、方法、およびプログラム |
JPWO2013132557A1 (ja) * | 2012-03-05 | 2015-07-30 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | コンテンツ加工装置とその集積回路、方法、およびプログラム |
US9117275B2 (en) | 2012-03-05 | 2015-08-25 | Panasonic Intellectual Property Corporation Of America | Content processing device, integrated circuit, method, and program |
JP2013198086A (ja) * | 2012-03-22 | 2013-09-30 | Canon Inc | 設定装置、画像処理装置、設定方法、画像処理方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP1713030A1 (en) | 2006-10-18 |
US20070183679A1 (en) | 2007-08-09 |
EP1713030A4 (en) | 2007-05-02 |
JPWO2005076210A1 (ja) | 2007-10-18 |
US7864198B2 (en) | 2011-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005076210A1 (ja) | 画像処理方法、画像処理装置及び移動通信端末装置 | |
JP4377886B2 (ja) | 画像通信中の画像合成方法及び装置 | |
US8111280B2 (en) | Video conference system and method in a communication network | |
TWI584641B (zh) | 提供影像的裝置與方法及非暫態電腦可讀取記錄媒體 | |
KR20040047623A (ko) | 화상 처리 방법 및 그 장치 | |
JP4417292B2 (ja) | オブジェクト出力方法及び情報処理装置 | |
WO2012037850A1 (zh) | 移动终端及其视频通话中对远端图像局部放大方法 | |
US10984537B2 (en) | Expression transfer across telecommunications networks | |
JP2000322588A (ja) | 画像処理装置及びその方法 | |
JP2007213364A (ja) | 画像変換装置、画像変換方法及び画像変換プログラム | |
JP2013115527A (ja) | テレビ会議システム及びテレビ会議方法 | |
JP2005328484A (ja) | テレビ会議システム、情報処理装置及び情報処理方法並びにプログラム | |
JP2008085421A (ja) | テレビ電話機、通話方法、プログラム、声質変換・画像編集サービス提供システム、および、サーバ | |
JP2004056488A (ja) | 画像処理方法、画像処理装置および画像通信装置 | |
KR20070006337A (ko) | 휴대단말기의 이미지편집 방법 | |
JP2007026090A (ja) | 映像作成装置 | |
JP2006099058A (ja) | 表示装置、表示方法、およびプログラム | |
JP4202330B2 (ja) | マスク画像情報作成方法、マスク画像情報作成装置及びマスク画像情報作成プログラム | |
JP2006065683A (ja) | アバタ通信システム | |
JP2022003818A (ja) | 画像表示システム、画像表示プログラム、画像表示方法及びサーバ | |
WO2014208169A1 (ja) | 情報処理装置、制御方法、プログラム、および記憶媒体 | |
JP2006065684A (ja) | アバタ通信システム | |
JP7026839B1 (ja) | リアルタイムデータ処理装置 | |
JP2007180914A (ja) | 画面拡大表示機能付携帯電話、画面拡大表示方法、及び、画面拡大表示プログラムとその記録媒体 | |
JP2003339034A (ja) | ネットワーク会議システム、ネットワーク会議方法およびネットワーク会議プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005517765 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005709805 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11498983 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005709805 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 11498983 Country of ref document: US |