JP5477025B2 - Image display device and program - Google Patents

Image display device and program

Info

Publication number
JP5477025B2
Authority
JP
Japan
Prior art keywords
image
frame
environment
use
plurality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2010021737A
Other languages
Japanese (ja)
Other versions
JP2011158794A (en)
Inventor
智 片貝
Original Assignee
カシオ計算機株式会社 (Casio Computer Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by カシオ計算機株式会社 (Casio Computer Co., Ltd.)
Priority to JP2010021737A
Publication of JP2011158794A
Application granted
Publication of JP5477025B2
Application status: Active

Description

  The present invention relates to an image processing technique for displaying frame images representing different types of frames on the outer periphery of an image obtained by, for example, photography.

  In recent years, various digital photo frames (hereinafter referred to as "DPF") have been commercialized as devices for displaying images, that is, photographs, captured by a digital camera and recorded as image data. Incidentally, when an actual painting is displayed in a room, it is often placed in a frame. Therefore, when a DPF displays an image (photograph), it is desirable to display a frame image representing a frame on the outer periphery of the image.

  As a technique for displaying a frame image on the outer periphery of an image, Patent Document 1 below describes a technique in which the periphery of a protective panel that protects an image-displaying liquid crystal panel is formed in a lens shape, and a frame image is displayed on the peripheral portion of the liquid crystal panel together with the image. In the technique of Patent Document 1, the user selects in advance the type of frame image to be displayed together with the image.

Patent Document 1: JP-A-10-260654

  However, in the technique of Patent Document 1, when an image is displayed, only the frame image previously selected according to the user's preference is displayed. The user therefore has to perform the troublesome task of selecting the frame image himself, and unless the user does so, the type of frame cannot be changed.

  An object of the present invention is to make it possible to selectively display different types of frames on the outer periphery of an image without forcing the user to perform troublesome work.

The image display device according to the first aspect of the present invention is an image display device that displays a frame image arranged around an ornamental image, and comprises: photographing means for acquiring a captured image obtained by photographing the periphery of the image display device; environment determination means for determining, according to the content of the subject of the captured image acquired by the photographing means, which of a plurality of predetermined use environments the current use environment corresponds to; correspondence information storage means for storing correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the types of the frame image; frame setting means for referring to the correspondence information stored in the correspondence information storage means and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means in a state of being arranged around the ornamental image. The environment determination means includes subject recognition means for recognizing a plurality of different predetermined specific subjects from the captured image acquired by the photographing means, and determines, as the current use environment, the use environment among the plurality of predetermined use environments that is defined by the presence of the specific subject recognized by the subject recognition means.

Further, the image display device according to the second aspect of the present invention is an image display device that displays a frame image arranged around an ornamental image, and comprises: photographing means for acquiring a captured image obtained by photographing the periphery of the image display device; environment determination means for determining, according to the content of the subject of the captured image acquired by the photographing means, which of a plurality of predetermined use environments the current use environment corresponds to; frame setting means for referring to correspondence information, stored in correspondence information storage means, indicating a correspondence relationship between the plurality of predetermined use environments and the types of the frame image, and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means in a state of being arranged around the ornamental image. The plurality of predetermined use environments include a plurality of use environments corresponding to at least one of the type of room in which the image display device is installed and the type of viewer.

Further, the program according to the third aspect of the present invention causes a computer included in an image display device that displays a frame image arranged around an ornamental image to function as: environment determination means for determining, according to the content of the subject of a captured image obtained by photographing the periphery of the image display device, which of a plurality of predetermined use environments the current use environment corresponds to; subject recognition means for recognizing a plurality of different predetermined specific subjects from the captured image obtained by photographing the periphery of the image display device; frame setting means for referring to correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the types of the frame image, and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means in a state of being arranged around the ornamental image. The environment determination means determines, as the current use environment, the use environment among the plurality of predetermined use environments that is defined by the presence of the specific subject recognized by the subject recognition means.

The program according to the fourth aspect of the present invention causes a computer included in an image display device that displays a frame image arranged around an ornamental image to function as: environment determination means for determining, according to the content of the subject of a captured image obtained by photographing the periphery of the image display device, which of a plurality of predetermined use environments the current use environment corresponds to; frame setting means for referring to correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the types of the frame image, and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means in a state of being arranged around the ornamental image. The plurality of predetermined use environments include a plurality of use environments corresponding to at least one of the type of room in which the image display device is installed and the type of viewer.

  According to the present invention, different types of frames can be selectively displayed on the outer periphery of an image without forcing the user to perform troublesome work.

FIG. 1 is a front view of a digital photo frame according to an embodiment of the present invention.
FIG. 2 is a block diagram showing an outline of the hardware configuration of the digital photo frame.
FIG. 3 is a functional block diagram showing the main part of the functions of the digital photo frame.
FIG. 4 is an explanatory diagram showing the relationship between the type of use environment, the specified condition, and the type of frame.
FIG. 5 is a flowchart showing the processing procedure of the CPU in the specific display mode.
FIG. 6(a) is a diagram showing an example of a framed image including a frame image with a character pattern, FIG. 6(b) is a diagram showing an example of a framed image including a Western-style frame image, and FIG. 6(c) is a diagram showing an example of a framed image including a Japanese-style frame image.

  Hereinafter, embodiments of the present invention will be described. FIG. 1 is a front view of a digital photo frame (hereinafter referred to as DPF) 1 to which the present invention is applied. The DPF 1 mainly displays an image (photograph) taken by a digital camera or the like. In the following description, an arbitrary image (photograph) displayed by the DPF 1 is referred to as an ornamental image. Further, the DPF 1 has a function of displaying a frame image on the outer periphery of the ornamental image when displaying the ornamental image.

  As shown in FIG. 1, the main body 2 of the DPF 1 holds a color liquid crystal display panel 3 that occupies almost the entire area of the front surface excluding the lower side portion of the main body 2. A power switch 4 and an operation switch 5 are provided at the right end of the lower side portion of the main body 2, and a photographing lens 6 is provided at the center of the lower side portion of the main body 2.

  The photographing lens 6 is provided for photographing a wide range of situations on the front side of the DPF 1 by a built-in camera 401 described later, and is a wide-angle lens having a focal length of about 20 mm, for example.

  FIG. 2 is a block diagram showing the electrical configuration of the DPF 1. The DPF 1 has a configuration in which the entire system is controlled by a CPU (Central Processing Unit) 101. Connected to the CPU 101 are a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, a media controller 104, an operation key group 201, a display 301, and a built-in camera 401.

  The ROM 102 is a memory in which a plurality of types of programs and data for causing the CPU 101 to control the entire system are stored. The programs stored in the ROM 102 include a display control program 102a that causes the CPU 101 to function as environment information acquisition means, environment determination means, frame setting means, and display control means.

  The data stored in the ROM 102 includes frame image data representing a plurality of types of frame images to be displayed on the outer periphery of the ornamental image. There are three specific types of frame images represented by the respective frame image data: with a character pattern, Western style, and Japanese style.

  A frame image with a character pattern represents a frame decorated with a pattern of characters that infants are likely to like. A Western-style frame image represents a frame whose design (shape, color, pattern, etc.) matches the atmosphere of a Western-style room. A Japanese-style frame image represents a frame whose design (shape, color, pattern, etc.) matches the atmosphere of a Japanese-style room.

  The RAM 103 is specifically an SDRAM (Synchronous dynamic random-access memory) or the like.

  The media controller 104 is an input / output interface that controls data input / output between the CPU 101 and a recording medium 50 that is detachably mounted in a memory card slot provided in the main body 2 of the DPF 1. The memory card slot is provided on the side surface or the back surface of the main body 2.

  The recording medium 50 stores a plurality of pieces of image data representing images (photographs) photographed and recorded by a digital camera or the like. The image data is compressed by, for example, the JPEG (Joint Photographic Experts Group) method, and stored as a still image file together with additional information such as the shooting date and time.
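
As a simple illustration of handling such still-image files (not the DPF's actual implementation), the sketch below opens a JPEG photograph with Pillow and reads the date/time from its EXIF metadata; the file path is hypothetical.

```python
# Sketch: load a JPEG still-image file and read its EXIF date/time field.
from PIL import Image
from PIL.ExifTags import TAGS

def load_photo_with_date(path):
    """Return the opened image and its EXIF DateTime string, if present."""
    img = Image.open(path)
    shot_date = None
    for tag_id, value in img.getexif().items():
        if TAGS.get(tag_id) == "DateTime":  # date/time recorded with the photograph
            shot_date = value
            break
    return img, shot_date

if __name__ == "__main__":
    image, date = load_photo_with_date("DCIM/100CASIO/CIMG0001.JPG")  # hypothetical path
    print(image.size, date)
```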

  The operation key group 201 includes the power switch 4 and the operation switch 5 shown in FIG. 1. By constantly scanning the operation state of the keys in the operation key group 201, the CPU 101 detects various instructions given by the user through predetermined key operations.

  The display 301 includes the color liquid crystal display panel 3 shown in FIG. 1, the backlight of the color liquid crystal display panel 3, and a drive circuit that drives the color liquid crystal display panel in accordance with display data, such as image data, supplied from the CPU 101.

  The built-in camera 401 is composed mainly of a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) type imaging device that captures the optical image formed by the photographing lens 6 shown in FIG. 1, and a plurality of stages of signal processing circuits that generate image data based on the imaging signal output from the imaging device. Image data generated by the built-in camera 401 is supplied to the CPU 101. In the following description, an image represented by image data supplied from the built-in camera 401 to the CPU 101 is referred to as a captured image.

  FIG. 3 is a functional block diagram showing the main part of the functions of the DPF 1. The DPF 1 includes a control unit 10, an operation unit 20, a display unit 30, and a photographing unit 40. The control unit 10 includes an environment determination unit 11, a frame setting unit 13, a display control unit 14, and a storage unit 15, and the environment determination unit 11 includes a subject recognition unit 12 and a determination unit.

  The operation unit 20 is a functional part that detects various instructions from the user to the DPF 1, and is realized by the operation key group 201 shown in FIG.

  The display unit 30 displays a still image represented by image data recorded on the recording medium 50 as an ornamental image, or displays various operation screens required when the user operates the DPF 1. The display unit 30 is realized by the display 301 illustrated in FIG. 2.

  The photographing unit 40 supplies the captured image, acquired by photographing the situation on the front side of the DPF 1, to the control unit 10. The photographing unit 40 is realized by the built-in camera 401 shown in FIG. 2.

  The subject recognition unit 12 recognizes predetermined specific subjects in the captured image data representing a captured image acquired by the photographing unit 40. There are a plurality of types of specific subjects recognized by the subject recognition unit 12. The subject recognition unit 12 is realized by the CPU 101 shown in FIG. 2.

  The environment determination unit 11 causes the photographing unit 40 to photograph the situation on the front side of the DPF 1, and acquires captured image data representing the captured image as environment information indicating the use environment of the DPF 1. The environment determination unit 11 then determines the current use environment of the DPF 1 based on the recognition result of the subject recognition unit 12. More specifically, among the plurality of use environments predetermined in the DPF 1, the determination unit determines that the use environment whose defining condition is the presence of a specific subject that the subject recognition unit 12 has successfully recognized is the current use environment. The environment determination unit 11 is realized by the CPU 101 shown in FIG. 2.

  Here, the plurality of predetermined use environments and the conditions defining each use environment (hereinafter referred to as specified conditions) will be described. Four use environments, use environment A to use environment D shown in FIG. 4, are predetermined in the DPF 1, and the specified condition of each use environment is a condition that can be determined from the content of the subject of the captured image.

  Use environments A and B relate to the type of viewer: the specified condition of use environment A is "the viewer is an infant", and that of use environment B is "the viewer is a Westerner". Use environments C and D relate to the type of room at the place of use: the specified condition of use environment C is "a Japanese-style room specific item exists", and that of use environment D is "no Japanese-style room specific item exists". A Japanese-style room specific item is an article or the like that is generally present only in a Japanese-style room and not in a Western-style room, such as a shoji screen or a tatami mat.

  The frame setting unit 13 sets, as a use target, the specific type of frame image that is associated with the use environment determined by the environment determination unit 11, from among the three types of frame images stored as frame image data in the ROM 102.

  The correspondence between the predetermined use environments and the types of frame image is described in the display control program 102a stored in the ROM 102, and is as shown in FIG. 4. That is, the frame image with a character pattern is associated with use environment A, the Western-style frame image with use environments B and D, and the Japanese-style frame image with use environment C.
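
As a non-authoritative illustration of the FIG. 4 correspondence described above, the following sketch pairs each predetermined use environment A to D with its specified condition and the type of frame image set as the use target; the dictionary and function names are illustrative assumptions, not part of the patent.

```python
# Sketch of the FIG. 4 correspondence between use environments, specified
# conditions, and frame image types (names are illustrative assumptions).
SPECIFIED_CONDITIONS = {
    "A": "the viewer is an infant",
    "B": "the viewer is a Westerner",
    "C": "a Japanese-style room specific item exists",
    "D": "no Japanese-style room specific item exists",
}

FRAME_FOR_ENVIRONMENT = {
    "A": "character_pattern",  # frame with a character pattern infants tend to like
    "B": "western_style",      # frame matching a Western-style room
    "C": "japanese_style",     # frame matching a Japanese-style room
    "D": "western_style",      # Western style when no Japanese-style item is found
}

def select_frame_type(use_environment: str) -> str:
    """Return the frame image type associated with the determined use environment."""
    return FRAME_FOR_ENVIRONMENT[use_environment]

if __name__ == "__main__":
    print(select_frame_type("A"))  # -> "character_pattern"
```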

  The display control unit 14 causes the display unit 30 to display an ornamental image. In addition, the display control unit 14 causes the display unit 30 to display the frame image of the type set by the frame setting unit 13 together with the ornamental image in a state of being arranged on the outer periphery of the ornamental image. The display control unit 14 is realized by the CPU 101 shown in FIG.
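
The following is a minimal compositing sketch of the kind of arrangement the display control unit is described as producing, assuming the frame image is an RGBA overlay whose interior is filled by the photograph; the border width and file names are illustrative assumptions, not values from the patent.

```python
# Minimal sketch: arrange a frame image around the outer periphery of an
# ornamental image by pasting the resized photograph inside the frame's border.
from PIL import Image

def compose_framed_image(ornamental: Image.Image, frame: Image.Image,
                         border: int = 40) -> Image.Image:
    """Resize the ornamental image to the frame's inner area and paste it centered."""
    framed = frame.convert("RGBA")
    inner_size = (framed.width - 2 * border, framed.height - 2 * border)
    photo = ornamental.convert("RGBA").resize(inner_size)
    framed.paste(photo, (border, border))
    return framed

# Hypothetical usage:
# result = compose_framed_image(Image.open("photo.jpg"), Image.open("frame_western.png"))
# result.save("framed.png")
```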

  The storage unit 15 temporarily stores data such as the captured image acquired by the photographing unit 40, and the display image used when the display control unit 14 causes the display unit 30 to display a framed image in which a frame image is arranged on the outer periphery of the ornamental image. The storage unit 15 is realized by the RAM 103 illustrated in FIG. 2.

  The DPF 1 configured as described above provides, as one of its display modes, a specific display mode in which a frame image is displayed around the ornamental image when the ornamental image is displayed. The specific display mode is set by the user operating the operation switch 5 in a predetermined procedure.

  Hereinafter, an operation when any image (photograph) recorded on the recording medium 50 is displayed as an ornamental image in a state where a specific display mode is set will be described. Note that the image to be displayed is, for example, selected by the user by a predetermined key operation.

  FIG. 5 is a flowchart showing the contents of processing executed by the control unit 10 (CPU 101) in accordance with the display control program 102a stored in the ROM 102 in a specific display mode.

  In the specific display mode, the control unit 10 first reads image data representing an ornamental image from the recording medium 50 and temporarily stores it in the storage unit 15 (RAM 103) (step S1).

  Next, the environment determination unit 11 (CPU 101) causes the photographing unit 40 (built-in camera 401) to photograph the situation on the front side of the DPF 1, and acquires the captured image data representing the captured image supplied from the photographing unit 40 as environment information indicating the use environment of the DPF 1 (step S2). The environment determination unit 11 stores the acquired captured image data in the storage unit 15 (RAM 103).

  Subsequently, the subject recognition unit 12 (CPU 101) attempts to recognize an infant's face, using the captured image data stored in the storage unit 15 as the processing target (step S3). In the process of step S3, the subject recognition unit 12 detects a face area, which is an area where a human face exists in the captured image represented by the captured image data, then estimates the age from the detected face area and determines whether or not the face in the face area is an infant's face.

  The subject recognition unit 12 detects the face area by applying a known technique that detects, as a face area, an area in which eyes, a nose, and a mouth exist within a certain range of positional relationships, using image recognition techniques such as binarization, contour extraction, and pattern matching. The subject recognition unit 12 determines the age by applying a known technique, described in Japanese Patent Application Laid-Open No. 2007-280291, that discriminates race, age, gender, and the like by arithmetic processing using feature points constituting the facial organs.
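
As a rough, non-authoritative sketch of the face-area detection step only, the code below uses an off-the-shelf OpenCV Haar cascade instead of the binarization/contour-extraction/pattern-matching pipeline, and stubs out the age estimation that the description attributes to JP 2007-280291; the function names are illustrative assumptions.

```python
# Sketch: detect face areas in the captured image; the infant/age decision is a stub.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_face_areas(captured_bgr):
    """Return bounding boxes (x, y, w, h) of face areas found in the captured image."""
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    return face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def is_infant_face(face_crop_bgr):
    """Placeholder for the feature-point-based age estimation cited in the text;
    a real implementation would classify the face's age group here."""
    return False  # stubbed: always reports "not an infant"
```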

  When the subject recognition unit 12 succeeds in recognizing an infant's face in the process of step S3 (step S4: YES), the environment determination unit 11 determines that the current use environment is use environment A (see FIG. 4), whose specified condition is "the viewer is an infant" (step S5). Further, the frame setting unit 13 (CPU 101) sets the frame image with a character pattern corresponding to use environment A as the display target (step S6).

  Thereafter, the display control unit 14 (CPU 101) generates, in the storage unit 15, display image data representing a display image in which the frame image with a character pattern is arranged on the outer periphery of the ornamental image, sends the generated display image data to the display unit 30 (display 301), and causes the display unit 30 to display the framed image (step S17).

  FIG. 6(a) shows, for convenience, an example of the framed image displayed by the display unit 30 when the environment determination unit 11 determines that the current use environment is use environment A, whose specified condition is "the viewer is an infant". That is, it shows a framed image GA in which a frame image Ga with a character pattern is arranged on the outer periphery of the ornamental image G.

  If the subject recognition unit 12 cannot recognize an infant's face in the process of step S3 (step S4: NO), the subject recognition unit 12 next attempts to recognize a Westerner's face, again using the captured image data stored in the storage unit 15 as the processing target (step S7). In the process of step S7, the subject recognition unit 12 determines the race from the face area already detected in the process of step S3, and determines whether or not the face in the face area is a Westerner's face. The race determination method of the subject recognition unit 12 is also a method applying the known technique, described in Japanese Patent Application Laid-Open No. 2007-280291, that discriminates race, age, gender, and the like by arithmetic processing using feature points constituting the facial organs.

  When the subject recognition unit 12 succeeds in recognizing a Westerner's face in the process of step S7 (step S8: YES), the environment determination unit 11 determines that the current use environment is use environment B (see FIG. 4), whose specified condition is "the viewer is a Westerner" (step S9). Further, the frame setting unit 13 sets the Western-style frame image corresponding to use environment B as the display target (step S10).

  Thereafter, the display control unit 14 generates, in the storage unit 15, display image data representing a display image in which the Western-style frame image is arranged on the outer periphery of the ornamental image, sends the generated display image data to the display unit 30, and causes the display unit 30 to display the framed image (step S17).

  FIG. 6(b) shows, for convenience, an example of the framed image displayed by the display unit 30 when the environment determination unit 11 determines that the current use environment is use environment B, whose specified condition is "the viewer is a Westerner". That is, it shows a framed image GB in which a Western-style frame image Gb is arranged on the outer periphery of the ornamental image G.

  If a Westerner's face cannot be recognized in the process of step S7 (step S8: NO), the subject recognition unit 12 next attempts to recognize Japanese-style room specific items such as shoji screens and tatami mats, again using the captured image data stored in the storage unit 15 as the processing target (step S11).

  The subject recognition unit 12 recognizes Japanese-style room specific items by applying known techniques such as binarization, contour extraction, and pattern matching to the image. More specifically, the subject recognition unit 12 performs binarization and contour extraction processing on the captured image data. Next, using template data stored in the ROM 102 that represents reference features, such as patterns, of Japanese-style room items, the subject recognition unit 12 detects one or more region portions in the captured image data that have the features represented by the template data. Then, if the degree of coincidence between a detected region portion and the features represented by the template data is a certain degree or more, the subject recognition unit 12 determines that the detected region portion is a Japanese-style room specific item.
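
The template-matching idea described above can be sketched as follows, assuming grayscale template images (e.g. shoji or tatami patterns) stored alongside the program; the file names and the 0.8 "degree of coincidence" threshold are illustrative assumptions, not values from the patent.

```python
# Sketch: decide whether any Japanese-style room template matches the captured image.
import cv2

MATCH_THRESHOLD = 0.8  # assumed "certain degree or more" of coincidence

def has_japanese_room_item(captured_gray, template_paths):
    """Return True if any stored template matches somewhere in the captured image."""
    for path in template_paths:
        template = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if template is None:
            continue  # skip templates that failed to load
        scores = cv2.matchTemplate(captured_gray, template, cv2.TM_CCOEFF_NORMED)
        _, best_score, _, _ = cv2.minMaxLoc(scores)
        if best_score >= MATCH_THRESHOLD:
            return True
    return False

# Hypothetical usage:
# gray = cv2.cvtColor(cv2.imread("captured.jpg"), cv2.COLOR_BGR2GRAY)
# print(has_japanese_room_item(gray, ["shoji_template.png", "tatami_template.png"]))
```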

  When the subject recognition unit 12 succeeds in recognizing a Japanese-style room specific item in the process of step S11 (step S12: YES), the environment determination unit 11 determines that the current use environment is use environment C (see FIG. 4), whose specified condition is "a Japanese-style room specific item exists" (step S13). Further, the frame setting unit 13 sets the Japanese-style frame image corresponding to use environment C as the display target (step S14).

  Thereafter, the display control unit 14 generates display image data representing a display image in which the Japanese-style frame image is arranged on the outer periphery of the ornamental image, sends the generated display image data to the display unit 30, and causes the display unit 30 to display the framed image (step S17).

  FIG. 6(c) shows, for convenience, an example of the framed image displayed by the display unit 30 when the environment determination unit 11 determines that the current use environment is use environment C, whose specified condition is "a Japanese-style room specific item exists". That is, it shows a framed image GC in which a Japanese-style frame image Gc is arranged on the outer periphery of the ornamental image G.

  When the subject recognition unit 12 cannot recognize a Japanese-style room specific item in the process of step S11 (step S12: NO), the environment determination unit 11 determines that the current use environment is use environment D (see FIG. 4), whose specified condition is "no Japanese-style room specific item exists" (step S15). Further, the frame setting unit 13 sets the Western-style frame image corresponding to use environment D as the display target (step S16).

  Thereafter, the display control unit 14 generates, in the storage unit 15, display image data representing a display image in which the Western-style frame image is arranged on the outer periphery of the ornamental image, sends the generated display image data to the display unit 30, and causes the display unit 30 to display the framed image (step S17).

  When the environment determination unit 11 determines that the current use environment is use environment D, whose specified condition is "no Japanese-style room specific item exists", the display unit 30 displays a framed image in which the Western-style frame image Gb is arranged on the outer periphery of the ornamental image, as in FIG. 6(b).

  As described above, in the DPF 1 of the present embodiment, when an ornamental image is displayed, the current use environment is determined by recognizing specific subjects in the captured image acquired by the photographing unit 40, and the frame image associated with the current use environment is automatically displayed on the outer periphery of the ornamental image. That is, the DPF 1 automatically changes the frame image based on the content related to the subject of the captured image. Therefore, different types of frames can be selectively displayed on the outer periphery of the ornamental image without forcing the user to perform troublesome work.

  Moreover, the type of frame image that the DPF 1 displays on the outer periphery of the ornamental image corresponds to whichever of the predetermined use environments (A to D) the current use environment falls under. Therefore, when displaying an ornamental image, the DPF 1 can automatically display on its outer periphery a frame image of a type suited to the use environment. For example, when the viewer is an infant, the DPF 1 can automatically display, together with the ornamental image, a frame image with a character pattern in which a pattern of characters that infants are likely to like is arranged.

  In addition, the DPF 1 attempts to recognize the plurality of specific subjects included in the specified conditions defining the respective use environments in a predetermined order, and sets the type of frame image to be displayed on the outer periphery of the ornamental image as soon as one of the subjects is successfully recognized. That is, the recognition operation ends when one of the subjects is successfully recognized. Therefore, the recognition of specific subjects in the captured image is kept to a minimum, and the current use environment can be determined efficiently.
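
The ordered, early-exit recognition flow of FIG. 5 (steps S3, S7, and S11) can be summarized by the sketch below; the recognizer callables are hypothetical stand-ins for the subject recognition described above, not the patent's implementation.

```python
# Sketch of the FIG. 5 decision cascade: stop at the first successful recognition.
def determine_use_environment(captured_image,
                              recognize_infant_face,
                              recognize_westerner_face,
                              recognize_japanese_room_item):
    """Return the use environment (A-D) determined from the captured surroundings."""
    if recognize_infant_face(captured_image):          # steps S3/S4
        return "A"  # viewer is an infant -> frame with a character pattern
    if recognize_westerner_face(captured_image):       # steps S7/S8
        return "B"  # viewer is a Westerner -> Western-style frame
    if recognize_japanese_room_item(captured_image):   # steps S11/S12
        return "C"  # Japanese-style room specific item -> Japanese-style frame
    return "D"      # no Japanese-style item -> Western-style frame
```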

(Modifications, etc.)
In the present embodiment, a case has been described in which the plurality of predetermined use environments are use environments A and B, which relate to the type of viewer, and use environments C and D, which relate to the type of room at the place of use. However, the DPF 1 may instead assume only a plurality of use environments relating to the type of viewer, or only a plurality of use environments relating to the type of room at the place of use.

  In addition, the specific contents and the number of the plurality of use environments can be changed as appropriate. In the DPF 1, use environments other than those relating to the type of viewer or the type of room at the place of use may be predetermined as the plurality of use environments. The specific types of frame images corresponding to the plurality of predetermined use environments can also be changed as appropriate, as can the specific subjects in the conditions defining the plurality of use environments.

  In the present embodiment, the case where the present invention is applied to a DPF that displays images recorded as image data has been described. However, the present invention can be applied to any device having a function of displaying a frame image on the outer periphery of an ornamental image, such as a digital camera or a camera-equipped mobile phone terminal.

1 DPF
2 Main body
3 Color liquid crystal display panel
4 Power switch
5 Operation switch
6 Photographing lens
10 Control unit
11 Environment determination unit
12 Subject recognition unit
13 Judgment unit
14 Frame setting unit
15 Display control unit
16 Storage unit
20 Operation unit
30 Display unit
40 Photographing unit
50 Recording medium
101 CPU
102 ROM
102a Display control program
103 RAM
104 Media controller
201 Operation key group
301 Display
401 Built-in camera
A to D Use environments

Claims (4)

  1. In an image display device that displays a frame image arranged around an ornamental image,
    Photographing means for obtaining a photographed image obtained by photographing the periphery of the image display device;
    Environment determination means for determining, according to the content of the subject of the captured image acquired by the photographing means, which of a plurality of predetermined use environments the current use environment corresponds to;
    Correspondence information storage means for storing correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the type of the frame image;
    Frame setting means for referring to the correspondence information stored in the correspondence information storage means and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and
    Display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means, in a state of being arranged around the ornamental image,
    wherein the environment determination means includes
    subject recognition means for recognizing a plurality of different predetermined specific subjects from the captured image acquired by the photographing means,
    the image display device being characterized in that, among the plurality of predetermined use environments, the use environment defined by the presence of the specific subject recognized by the subject recognition means is determined as the current use environment.
  2. In an image display device that displays a frame image arranged around an ornamental image,
    Photographing means for obtaining a photographed image obtained by photographing the periphery of the image display device;
    Environment determination means for determining, according to the content of the subject of the captured image acquired by the photographing means, which of a plurality of predetermined use environments the current use environment corresponds to;
    Correspondence information storage means for storing correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the type of the frame image;
    Frame setting means for referring to the correspondence information stored in the correspondence information storage means and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and
    Display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means, in a state of being arranged around the ornamental image,
    the image display device being characterized in that the plurality of predetermined use environments include a plurality of use environments corresponding to at least one of the type of room in which the image display device is installed and the type of viewer.
  3. A program causing a computer included in an image display device that displays a frame image arranged around an ornamental image to function as:
    Environment determination means for determining which of the plurality of predetermined usage environments the current usage environment corresponds to in accordance with the content of the subject of the captured image taken around the image display device;
    Subject recognition means for recognizing a plurality of different predetermined subjects from a photographed image obtained by photographing the periphery of the image display device;
    Frame setting means for referring to correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the types of the frame image, and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and
    Display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means, in a state of being arranged around the ornamental image,
    the program being characterized in that the environment determination means determines, as the current use environment, the use environment among the plurality of predetermined use environments that is defined by the presence of the specific subject recognized by the subject recognition means.
  4. A program causing a computer included in an image display device that displays a frame image arranged around an ornamental image to function as:
    Environment determination means for determining which of the plurality of predetermined usage environments the current usage environment corresponds to in accordance with the content of the subject of the captured image taken around the image display device;
    Frame setting means for referring to correspondence information indicating a correspondence relationship between the plurality of predetermined use environments and the types of the frame image, and setting, as a use target, a frame image of the type corresponding to the use environment determined to be applicable by the environment determination means; and
    Display control means for displaying on a screen, together with the ornamental image, the frame image of the type set as the use target by the frame setting means, in a state of being arranged around the ornamental image,
    the program being characterized in that the plurality of predetermined use environments include a plurality of use environments corresponding to at least one of the type of room in which the image display device is installed and the type of viewer.
JP2010021737A 2010-02-03 2010-02-03 Image display device and program Active JP5477025B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010021737A JP5477025B2 (en) 2010-02-03 2010-02-03 Image display device and program


Publications (2)

Publication Number Publication Date
JP2011158794A JP2011158794A (en) 2011-08-18
JP5477025B2 (en) 2014-04-23

Family

ID=44590764

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010021737A Active JP5477025B2 (en) 2010-02-03 2010-02-03 Image display device and program

Country Status (1)

Country Link
JP (1) JP5477025B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9552650B2 (en) 2014-08-29 2017-01-24 Fujifilm Corporation Image combining apparatus, image combining method and recording medium storing control program for image combining apparatus

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10260654A (en) * 1997-03-17 1998-09-29 Melco:Kk Display for interior decoration
JP4218830B2 (en) * 2003-11-18 2009-02-04 ソニー・エリクソン・モバイルコミュニケーションズ株式会社 Portable information device
JP4520897B2 (en) * 2005-04-22 2010-08-11 株式会社エクシング Portable electrical equipment
JP2007047480A (en) * 2005-08-10 2007-02-22 Olympus Imaging Corp Image display device
JP5233474B2 (en) * 2008-07-25 2013-07-10 富士通モバイルコミュニケーションズ株式会社 Portable electronic devices
JP2011022420A (en) * 2009-07-16 2011-02-03 Sharp Corp Image display device
JP2011137899A (en) * 2009-12-28 2011-07-14 Casio Computer Co Ltd Image processor and program
JP5110098B2 (en) * 2010-02-08 2012-12-26 カシオ計算機株式会社 Display processing apparatus and program


Also Published As

Publication number Publication date
JP2011158794A (en) 2011-08-18


Legal Events

Date / Code / Title / Description

2013-01-25 / A621 / Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2013-10-07 / A977 / Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2013-10-15 / A131 / Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2013-11-05 / A521 / Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD / Decision of grant or rejection written
2014-01-14 / A01 / Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2014-01-27 / A61 / First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 / Certificate of patent or registration of utility model (Ref document number: 5477025; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)