WO2016208070A1 - Imaging Device and Image Processing Method - Google Patents
- Publication number
- WO2016208070A1 (international application PCT/JP2015/068535)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- image
- person
- unit
- processing unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2625—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
Definitions
- the present invention relates to an imaging apparatus and an image processing method, and particularly to a technique for synthesizing image data obtained by imaging a plurality of people.
- Patent Document 1 has an object of "easily taking a group photo of a plurality of people including the photographer." Immediately after a composite shooting mode is instructed, the photographer takes a captured image a of the other subjects (a plurality of subjects), then hands the digital camera to one of those subjects, who takes a captured image b. A predetermined composition condition is checked, for example that one captured image contains a plurality of faces of at least a predetermined size and the other captured image contains a face of at least the prescribed size.
- in Patent Document 1, when the captured images a and b are photographed continuously (hereinafter referred to as "a series of photographing") immediately after the composite shooting mode is instructed, a composite image c is created based on them.
- the present invention has been made in view of the above circumstances, and an object thereof is to provide an imaging apparatus and an image processing method with improved operability for easily generating a photograph that includes all of a plurality of persons.
- an imaging processing unit that captures an image of a subject and generates image data
- a face recognition processing unit that executes face recognition processing of the image data
- an image recording unit that records the image data
- a synthesis processing unit that performs synthesis processing so that a person captured in each of the plurality of image data is included in one image data to generate composite data
- the face recognition processing unit performs the face recognition process on first image data to recognize the face of a first person, and performs the face recognition process on second image data, shot at an arbitrary shooting timing different from the shooting timing of the first image data, to recognize the face of a second person
- the synthesis processing unit generates the composite data in which the first person and the second person are superimposed on the same background, using the first image data and the second image data
- FIG. 1 is a block diagram illustrating a configuration example of an imaging apparatus according to a first embodiment.
- Rear view of the imaging apparatus according to the first embodiment
- Flowchart showing an example of the overall operation of the imaging apparatus according to the first embodiment
- Flowchart showing an operation example of the composition processing according to the first embodiment
- Explanatory drawing showing an example of search and composition of image data according to the first embodiment
- Explanatory drawing showing an example of search and composition of image data according to the first embodiment
- Flowchart showing an operation example of the whole imaging apparatus according to the second embodiment
- Flowchart showing an operation example of the composition processing according to the second embodiment
- Diagram showing a display example of the display unit during reproduction according to the second embodiment
- Explanatory drawing showing an example of search and composition of image data according to the second embodiment
- Flowchart showing an operation example of the whole imaging apparatus according to the third embodiment
- Flowchart showing an operation example of the composition processing according to the third embodiment
- the image data being displayed and the image data selected by the imaging device are combined in either a manual synthesis mode (an operation mode in which composition is performed after a synthesis instruction from the user) or an automatic synthesis mode (an operation mode in which the imaging device performs composition automatically).
- FIG. 1A is a block diagram illustrating a configuration example of an imaging apparatus according to the first embodiment of the present invention.
- the main control unit 101 includes a CPU (Central Processing Unit) and the like, and controls the whole of the imaging apparatus 1 according to various operation programs and data stored in a ROM (Read Only Memory) 102 or a RAM (Random Access Memory) 103.
- the system bus 100 is a data communication path for performing data transmission / reception between the control unit 101 and each unit in the imaging apparatus 1.
- the ROM 102 is a memory in which various programs for controlling the imaging apparatus 1 are stored.
- a rewritable ROM such as an EEPROM (Electrically Erasable Programmable ROM) or a flash ROM is used.
- the RAM 103 is used as a temporary storage area during execution of a program stored in the ROM 102 or as a temporary storage of captured images.
- the storage unit 104 (corresponding to the image recording unit) stores information such as operation setting values of the imaging apparatus 1; a nonvolatile rewritable device such as a flash ROM or SSD (Solid State Drive) is used.
- the ROM 102 and the RAM 103 may be integrated with the main control unit 101. Further, the ROM 102 may not use an independent configuration as shown in FIG. 1A but may use a partial storage area in the storage unit 104.
- An external storage medium interface (I / F) 105 is an interface for recording / reading information to / from an external recording medium 106 that can be stored in and taken out of the imaging apparatus 1 such as an SD card.
- the external interface 107 is a group of interfaces for extending the functions of the imaging apparatus 1.
- the external interface 107 includes a USB (Universal Serial Bus) interface (I/F) 107a, a video output interface (I/F) 107b, and an audio output interface (I/F) 107c.
- the USB interface 107a is connected to a USB interface of an external device such as a personal computer or a television receiver, and image data stored in the storage unit 104 or the external recording medium 106 can be read and displayed on the external device.
- the video output interface 107b and the audio output interface 107c output a video signal / audio signal to an external video / audio output device.
- the video output interface 107b and the audio output interface 107c may output video and audio together using HDMI (High-Definition Multimedia Interface: registered trademark).
- the audio input unit 108 includes a microphone that converts audio around the imaging device 1 into an electric signal, an A / D converter that converts audio converted into the electric signal into audio data of a digital signal, and the like.
- the audio signal processing unit 109 performs a filtering process on the audio data input from the audio input unit, a process of converting the audio data according to the format of the moving image recorded on the external recording medium 106, and the like.
- the imaging unit 110 includes an optical system including lenses for zooming and focusing operations, a mechanism system for driving those lenses, an image sensor such as a CCD or CMOS that converts the light image of the subject input through the lens into an electrical signal, and an electric device such as an A/D converter that converts the electrical signal into digital image data.
- the image signal processing unit 111 performs image processing such as white balance adjustment, exposure adjustment, and gamma correction on the image data input from the imaging unit 110. It also converts the data according to the format recorded on the external recording medium 106: for a moving image, conversion to a format such as MPEG-2 or H.264; for a still image, conversion to a format such as JPEG or TIFF. Further, it combines image data stored in the storage unit 104 or the external recording medium 106, and performs face recognition processing, pattern matching processing, and the like.
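As a concrete illustration of one step the image signal processing unit 111 performs, the following sketch applies gamma correction to 8-bit pixel values. This is a minimal pure-Python sketch; the function name and parameter values are illustrative assumptions, not the patent's implementation.

```python
def gamma_correct(pixels, gamma=2.2, max_value=255):
    """Apply gamma correction to a flat list of 8-bit pixel values.

    Illustrative sketch of a gamma-correction step (hypothetical
    function name); out = max * (in / max) ** (1 / gamma).
    """
    inv = 1.0 / gamma
    return [round(max_value * ((p / max_value) ** inv)) for p in pixels]
```

With gamma greater than 1, mid-range values are brightened while black and white endpoints are preserved, which is the usual display-encoding behavior.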
- the display unit 112 displays an image of the subject imaged by the image sensor of the imaging unit 110, images recorded in the storage unit 104 or the external recording medium 106, screens for performing various settings of the imaging device 1, and the like.
- the operation unit 113 is an instruction input unit that inputs an operation instruction to the imaging apparatus 1, and includes a power on / off button, a shutter button, a playback button, a composition button, buttons for performing various settings, and the like. Further, a touch panel may be provided on the surface of the display unit 112, and an operation instruction to the imaging device 1 may be input by detecting a position of a button or the like displayed on the display unit 112 and a position where the touch panel is touched.
- the timekeeping unit 114 measures the elapsed time from the date / time set by the user, for example, using an RTC (Real Time Clock) circuit, and outputs date / time information.
- recording is performed by adding date/time information to the image data based on the date/time information acquired from the timekeeping unit 114.
- the GPS receiving unit 115 receives radio waves from a plurality of GPS satellites, and can acquire position (latitude, longitude, etc.) information of the imaging device 1 based on the received signals. This position information is also added to the image data and recorded.
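The tagging of image data with date/time and position information described above can be sketched as follows. The data structure and field names are hypothetical illustrations for exposition, not structures defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CapturedImage:
    """Image data plus the additional information the text describes:
    date/time from the timekeeping unit and latitude/longitude from
    the GPS receiving unit. Field names are illustrative assumptions."""
    pixels: bytes
    shot_at: str                       # e.g. "2013-03-15T10:02:00"
    latitude: Optional[float] = None   # None until GPS data is available
    longitude: Optional[float] = None

def tag_with_position(image: CapturedImage, lat: float, lon: float) -> CapturedImage:
    # Mimics the GPS receiving unit adding position info to image data.
    image.latitude, image.longitude = lat, lon
    return image
```

In practice such metadata would be written as Exif tags alongside the image file; the sketch only shows the association of position and time with each captured frame.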
- the power supply unit 116 includes a battery (not shown) and, in response to instructions from the main control unit 101, supplies power to each part of the imaging apparatus 1 according to the power-on state, power-off state, power-off standby state, and the like.
- the imaging device 1 may be a digital camera, a mobile phone, a smartphone, a tablet terminal, or a navigation device, as long as it includes the imaging unit 110; it may also be a PDA (Personal Digital Assistant) or a notebook PC (Personal Computer). A music player or portable game machine having a communication function, other portable digital devices, or a wearable device such as a smart watch or smart glasses may also be used.
- FIG. 1B is a diagram illustrating a software configuration of the imaging apparatus according to the first embodiment, and illustrates a software configuration in the ROM 102, the RAM 103, and the storage unit 104.
- as shown in FIG. 1B, when the main control unit 101 executes the programs stored in the ROM 102, it functions as a basic operation execution unit 1021, a position information acquisition unit 1022, an imaging processing unit 1023, an image recording unit 1024, an image reproduction unit 1025, a power management unit 1026, a composition processing unit 1027, and a face recognition processing unit 1028.
- the RAM 103 includes a temporary storage area 1031 for temporarily storing data as necessary when executing the programs stored in the ROM 102, and an image temporary storage area 1032 for temporarily storing images captured by the imaging processing unit 1023 via the imaging unit 110.
- the storage unit 104 includes a setting information storage area 1041 that stores setting information used when executing the programs stored in the ROM 102, an image storage area 1042 that stores still image data and moving image data captured by the imaging apparatus 1, and the like.
- the main control unit 101 controls each operation block by executing a program stored in the ROM 102.
- the basic operation execution unit 1021 controls various settings and overall operations of the imaging apparatus 1.
- the position information acquisition unit 1022 performs a process of acquiring position information such as the latitude and longitude of the imaging device 1 based on the signal from the GPS satellite received by the GPS reception unit 115.
- if the imaging device has a Wi-Fi (registered trademark) reception function, the accuracy may be improved by correcting the position information using Wi-Fi radio wave reception level information, or the position information may be acquired from the Wi-Fi radio wave reception level information.
- when the shutter button included in the operation unit 113 is pressed, the imaging processing unit 1023 performs processing for capturing the image data of the subject captured by the imaging unit 110 into the image temporary storage area 1032 of the RAM 103.
- the image recording unit 1024 performs processing of recording the image data captured in the temporary image storage area 1032 in the image storage area 1042 of the storage unit 104 or recording it in the external recording medium 106 via the external recording medium interface 105.
- the image reproduction unit 1025 performs processing for displaying the image of the subject imaged by the image sensor of the imaging unit 110 on the display unit 112 via the image temporary storage area 1032 of the RAM 103. Further, the image data recorded in the storage unit 104 or the external recording medium 106 is read and displayed on the display unit 112.
- the power management unit 1026 performs a process of controlling the power supplied to each part of the imaging apparatus 1 by using a power button included in the operation unit 113.
- when a composition button included in the operation unit 113 is pressed, or when display of composed image data is selected, the composition processing unit 1027 performs processing to synthesize the image data stored in the storage unit 104 or the external recording medium 106.
- the face recognition processing unit 1028 performs face recognition processing of image data stored in the RAM 103, the storage unit 104, and the external recording medium 106.
- the program stored in the ROM 102 may be stored in advance at the time of product shipment.
- a program acquired from a server device on the Internet by a personal computer after product shipment may be stored via the USB interface unit 107a.
- a buffer memory for temporarily storing image data may be provided in the imaging unit 110 or the image signal processing unit 111.
- FIG. 2A is a front view of the imaging apparatus 1.
- a lens constituting the imaging unit 110 is disposed on the front surface of the imaging device 1. Further, a power button 113-1 and a shutter button 113-2 constituting the operation unit 113 are arranged on the upper part of the imaging apparatus 1.
- FIG. 2B is a rear view of the image pickup apparatus 1.
- on the back of the imaging apparatus 1 are arranged the display unit 112, a selection/determination operation button 113-3 constituting the operation unit 113, a menu button 113-4 for selecting various setting and file operation processes, a playback button 113-5 for reproducing the image data recorded in the storage unit 104 or the external recording medium 106, and a composition button 113-6 (corresponding to a composition instruction operation unit) for combining image data.
- the display unit 112 may include a touch sensor so that it can be operated in place of the shutter button 113-2, the selection/determination operation button 113-3, the menu button 113-4, the playback button 113-5, and the composition button 113-6.
- FIG. 3 is a flowchart showing an overall operation example of the imaging apparatus according to the first embodiment.
- the following describes the flow of operations when the image composition processing (S313) is executed in the manual mode, that is, triggered by the user's input of an image composition instruction, more specifically by pressing the image composition button (S312; Yes).
- the power management unit 1026 confirms whether the power button 113-1 has been pressed. If it has not been pressed (S301; No), the process waits at this step without proceeding to the next step.
- when the power button 113-1 is pressed (S301; Yes), the imaging device 1 is activated (S302). When activated, the image of the subject captured by the image sensor of the imaging unit 110 is displayed on the display unit 112.
- the power management unit 1026 then confirms whether the power button 113-1 has been pressed again. When it is pressed (S303; Yes), termination processing of the imaging device 1 is performed (S304).
- the imaging processing unit 1023 confirms that the shutter button 113-2 is pressed.
- the imaging unit 110 captures and acquires image data of the subject (S306) and stores it in the temporary image storage area 1032 of the RAM 103. Further, additional information such as the imaged model, aperture value, number of pixels, ISO sensitivity, shooting date / time, and position information is also stored in the temporary image storage area 1032.
- the image data and additional information stored in the temporary image storage area 1032 are recorded in the storage unit 104 or the external recording medium 106 by the image recording unit 1024 (S307). Then, the image data captured by the imaging unit 110 and stored in the temporary image storage area 1032 is displayed on the display unit 112 (S309).
- the image playback unit 1025 confirms that the playback button 113-5 has been pressed.
- when the playback button 113-5 is pressed (S308; Yes), the image data recorded in the storage unit 104 or the external recording medium 106 is played back by the image playback unit 1025 and displayed on the display unit 112 (S309).
- An image to be played back / displayed can be displayed on the display unit 112 by pressing the play button 113-5.
- the image data being displayed on the display unit 112, or the image data generated by the user's shooting (S306) and displayed on the display unit 112, corresponds to the first image data.
- FIG. 6A is a diagram illustrating a display example of the display unit 112 immediately after shooting by the imaging apparatus 1.
- a person A is photographed against a mountain background.
- the display time of the captured image data is about 3 seconds, for example, and may be changed by the user.
- when the predetermined time elapses, the display of the image data disappears, but pressing the playback button 113-5 displays it again.
- the composition processing unit 1027 confirms that the composition button 113-6 is pressed.
- when it is pressed (S312; Yes), the composition processing unit 1027 synthesizes the image data recorded in the storage unit 104 or the external recording medium 106 via the temporary image storage area 1032 (S313), and the combined image data is displayed on the display unit 112 (S315).
- FIG. 6B is a diagram illustrating a display example of image data synthesized by the imaging apparatus 1.
- a person B photographed against the same mountain is combined with an image including the person A photographed against the mountain.
- the image recording unit 1024 records the image data synthesized by the synthesis processing unit 1027 in the storage unit 104 or the external recording medium 106 (S318).
- FIG. 7A is a diagram showing a display example of a selection screen for determining whether or not to record the image data synthesized by the imaging apparatus 1. The user selects and confirms “Yes” or “No” with the selection / determination operation button 113-3.
- FIG. 7B is a diagram showing a display example when an image cannot be synthesized.
- the screen shown in FIG. 7B is displayed on the display unit 112 instead of FIG. 7A.
- the compositing process may be selected and executed by the menu button 113-4 or the selection / decision operation button 113-3 instead of the independent compositing button 113-6.
- the composition operation may also be performed automatically. For example, when shooting is performed by touching a subject displayed on the touch-sensor-equipped display unit 112 instead of pressing the shutter button 113-2, the composition operation may be performed automatically. When the subject is a person, that person may be registered as an important person, and the composition operation may be performed automatically for that person.
- FIG. 4 is a flowchart showing another operation example (automatic synthesis mode) of the imaging apparatus 1 according to the first embodiment.
- the composition processing unit 1027 confirms whether or not the automatic composition mode is selected (S412).
- the automatic synthesis mode is set in advance by using the menu button 113-4 or the selection / determination operation button 113-3. Accordingly, the menu button 113-4 and the selection / determination operation button 113-3 correspond to a mode setting unit.
- the composition processing unit 1027 synthesizes the image data captured by the imaging unit 110 and recorded in the storage unit 104 or the external recording medium 106 via the image temporary storage area 1032 (S413). The composition processing unit 1027 performs the composition processing in the image temporary storage area 1032.
- the display unit 112 displays a screen for selecting whether to display the synthesized image data.
- the user selects / determines whether or not to display the combined image data using the selection / determination operation button 113-3 (S414; Yes).
- FIG. 8 is a diagram illustrating a display example of a selection screen for determining whether or not to display image data combined with another image by the imaging apparatus 1.
- the user selects and confirms “Yes” or “No” with the selection / determination operation button 113-3. If it cannot be combined with another image, it is not displayed.
- S415 to S418 are the same as S315 to S318 in FIG. 3.
- FIG. 5 is a flowchart showing an operation example of the composition processing unit 1027.
- the composition processing unit 1027 searches the image data stored in the image storage area 1042 of the storage unit 104 or in the external recording medium 106 for the newest image with the same shooting date as the image displayed on the display unit 112 (S501).
- the searched image is hereinafter referred to as the "search image"
- the image displayed on the display unit 112 is hereinafter referred to as the "display image"
- the composition processing unit 1027 checks whether the shooting location of the search image is the same as the shooting location of the display image based on the location information acquired by the location information acquisition unit 1022 (S502).
- the predetermined time is, for example, about 3 minutes and may be changed by the user.
- the face recognition processing unit 1028 confirms whether the same person as the person included in the display image is included in the search image (S505).
- when the search image and the display image are superimposed, it is checked whether the position of the person captured in the display image overlaps the position of the person to be extracted from the search image and combined with the display image (S506). A determination level may be settable, for example "Yes" when the people overlap even slightly, or "Yes" only when the faces overlap.
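The overlap determination of S506, including the selectable determination level, can be sketched with axis-aligned bounding boxes. The dictionary keys and the box representation are hypothetical illustrations; a real implementation would derive these regions from contour detection and face recognition.

```python
def boxes_overlap(a, b):
    """True if axis-aligned boxes (left, top, right, bottom) intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def persons_overlap(person_a, person_b, level="body"):
    """Sketch of the S506 overlap check with a determination level.

    person_a / person_b are hypothetical dicts holding "body" and
    "face" bounding boxes in shared background coordinates.
    level="body": "Yes" when the people overlap even slightly;
    level="face": "Yes" only when the faces overlap.
    """
    key = "face" if level == "face" else "body"
    return boxes_overlap(person_a[key], person_b[key])
```

With the stricter "face" level, two people whose shoulders touch but whose faces are apart would still be judged combinable.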
- when the shooting date and time is within the predetermined time (S503; Yes), the background is the same (S504; Yes), the same person is not included (S505; No), and the persons do not overlap (S506; No), the person in the search image is extracted by outline detection, pasted onto the display image, and synthesized (S507).
- the person in the display image is extracted by outline detection and pasted on the search image and synthesized.
- a geomagnetic sensor may be mounted on the imaging device, and whether or not the orientation of the imaging device is the same may be added to the above determination.
- the combining process may be executed by omitting S502.
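The search of FIG. 5 (S501 to S505) can be summarized as a filter over candidate images, keeping the newest one that matches. The dictionary layout below is a hypothetical illustration, and the overlap check of S506 is omitted for brevity.

```python
from datetime import datetime, timedelta

def find_partner_image(display, candidates, window=timedelta(minutes=3)):
    """Sketch of the search in FIG. 5 (S501-S505).

    Each image is a hypothetical dict with keys "shot_at" (datetime),
    "location" (tuple), "background" (an id standing in for background
    matching), and "persons" (a set of person ids). Returns the newest
    candidate shot at the same location, within the time window, with
    the same background, containing none of the same persons; None if
    no candidate qualifies.
    """
    matches = [
        c for c in candidates
        if c["location"] == display["location"]               # S502
        and abs(c["shot_at"] - display["shot_at"]) <= window  # S503
        and c["background"] == display["background"]          # S504
        and not (c["persons"] & display["persons"])           # S505
    ]
    return max(matches, key=lambda c: c["shot_at"], default=None)
```

Run against data shaped like the FIG. 9 example (images 901 to 904 as candidates for the displayed image 905), this filter drops 903 for its different background and 904 for containing the same person, and returns 902 as the newest remaining match.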
- FIG. 9 is an explanatory diagram showing an example of search and synthesis of image data according to the first embodiment.
- the image data 901 to 905 are image data taken in order from 10:02:00 to 10:02:55 on 2013.03.15 at 35°24′53″ north latitude and 138°51′31″ east longitude.
- the image data 901, 902, 904, and 905 have the same background, and only the image data 903 has a different background.
- the date/time information is acquired by the timekeeping unit 114, and the position information is acquired by the GPS receiving unit 115.
- Image data 901 and 902 are image data in which a person B is photographed by the person A
- image data 903 is image data in which a bird is photographed by the person A or the person B
- Image data 904 and 905 are image data obtained by photographing the person A by the person B.
- the image data 901 and 902 are both image data in which the person B is photographed by the person A; since the person B in the image data 901 is not smiling, the image data 902 shows that the person A took the photograph again.
- the image data 904 and 905 are both image data in which the person A is photographed by the person B; since the person A in the image data 904 is not smiling, the image data 905 shows that the person B took the photograph again. The image data 903 shows a bird that suddenly appeared during the series of photographing (meaning the period over which the image data 901 to 905 were taken).
- when the shutter button 113-2 of the imaging apparatus 1 is pressed, the subject is photographed, and the photographed image is displayed on the display unit 112.
- when the synthesis button 113-6 is pressed, the images are synthesized.
- Image data 906 is image data synthesized from the image data 902 and 905.
- FIG. 6A described above is a display example of the display unit 112 immediately after the image data 905 is captured by the image capturing apparatus 1 or after the playback button 113-5 is pressed after the image capturing.
- when the composition button 113-6 of the imaging apparatus 1 is pressed, the image data is searched according to the flowchart of FIG. 5, and the image data 902 and 905 are synthesized to generate the image data 906.
- FIG. 6B is a display example of the image data 906 synthesized by the imaging apparatus 1.
- the image data 901 to 904 have the same shooting location as the image data 905 (S502; Yes), and the shooting date and time is within a predetermined time (S503; Yes).
- the image data 904 includes the same person as the image data 905 (S505; Yes).
- the image data 903 has a different background from the image data 905 (S504; No).
- for the image data 901 and 902, the background is the same as the image data 905 (S504; Yes), the same person is not included (S505; No), and the persons do not overlap (S506; No).
- the image data 902 is therefore selected as the newest matching image data and combined with the image data 905 to generate the combined image data 906. In the example of FIG. 9, the image data 905 corresponds to the first image data and the image data 902 corresponds to the second image data.
- the image data 903 is excluded because the background is different from that of the image data 905.
- image data having a lens focal length that differs by a predetermined value or more may be excluded.
- by performing the composition with reference to the background, it is possible to absorb the angle-of-view deviation between the image data.
- the person B of the image data 902 is extracted by contour detection and pasted on the image data 905 with reference to the background.
- the person A of the image data 905 may be extracted by contour detection and combined with the image data 902 based on the background.
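The pasting step of S507, once the two images are aligned by their shared background, reduces to copying the contour-detected person region from one image into the other. The following is a simplified stand-in using 2D lists and a binary mask; real contour detection and alignment are outside this sketch.

```python
def paste_person(base, source, mask):
    """Paste the masked person region of `source` onto `base`.

    `base` and `source` are same-size 2D lists of pixel values,
    assumed already aligned via their shared background; `mask` is a
    same-size 2D list of 0/1 flags marking the contour-detected person
    region in `source`. A simplified illustration of S507's pasting.
    """
    return [
        [source[y][x] if mask[y][x] else base[y][x]
         for x in range(len(base[0]))]
        for y in range(len(base))
    ]
```

Pasting in either direction (search image onto display image, or display image onto search image) is the same operation with the roles of `base` and `source` swapped.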
- FIG. 10 is an explanatory diagram showing an example of search and synthesis of image data according to the first embodiment.
- the image data 1001 and 1002 are image data taken in order from 10:02:00 to 10:02:50 on 2013.03.16 at 35°24′53″ north latitude and 138°51′31″ east longitude.
- the image data 1001 and 1002 have the same background.
- The image data 1001 is image data in which the person B is photographed by the person A, and the image data 1002 is image data in which the person A is photographed by the person B.
- When the composition button 113-6 is pressed, the images are combined.
- Alternatively, in the automatic composition mode, the images are automatically combined after shooting.
- Composite image data 1003 is image data synthesized from the image data 1001 and 1002.
- Image data 1004 is an image in which the person B is newly photographed by the person A, either after the composite image data 1003 has been recorded in the storage unit 104 or the external recording medium 106, or without recording it.
- The image data 1001 and 1004 are both image data in which the person B is photographed by the person A; since the person B in the image data 1001 is not smiling, the person A has re-photographed the person B as the image data 1004. Likewise, the person B can re-photograph the person A.
- When the composition button 113-6 of the imaging apparatus 1 is pressed after the image data 1004 is photographed, the images are combined again. Alternatively, in the automatic composition mode, the images are automatically combined again after shooting.
- Image data 1005 is image data synthesized from the image data 1002 and 1004.
- As described above, when the user captures the images to be combined and then presses the composition button, the imaging apparatus can search for the optimal image and perform the composition. Alternatively, the imaging apparatus can automatically search for the optimal image and perform the composition after shooting.
- Any number of images to be combined may be captured, and the imaging apparatus searches them for the optimal image to combine.
- A plurality of images can be combined regardless of their shooting timing as long as they share the same background. For this reason, even when one of the images is re-taken, or when an image that is not a composition target is captured between the composition targets, the composition can be performed while excluding the images that are not composition targets.
- In the above, an example in which a newer image is preferentially selected has been described, but an image containing a smiling person or a person facing the front may be preferentially selected instead.
- <Second embodiment> The second embodiment is an embodiment in which composition is performed using image data selected by the user.
- FIG. 11 is a flowchart illustrating an example of the overall operation of the imaging apparatus according to the second embodiment.
- S1101 to S1108 are the same as S301 to S308 in FIG.
- the image data captured by the imaging unit 110 and stored in the image temporary storage area 1032 is displayed on the display unit 112 (S1109).
- When the playback button 113-5 is pressed (S1108; Yes), or when a change of image is selected with the selection/determination operation button 113-3 (S1111; Yes), the image data recorded in the storage unit 104 or the external recording medium 106 is reproduced by the image reproduction unit 1025 and displayed on the display unit 112 (S1110).
- Here, the selection/determination operation button 113-3 functions as an operation member with which the user selects an image stored in the storage unit 104, and thus corresponds to the selection operation unit.
- FIG. 13A is a diagram showing a display example of the display unit 112 when the play button 113-5 is pressed.
- The user uses the selection/determination operation button 113-3 to select whether to change the image data displayed on the display unit 112 to another image data (S1111).
- FIG. 13B is a diagram illustrating a display example of the display unit 112 by the imaging apparatus 1 when the image data is changed to another image data.
- S1112 is the same as S312 in FIG.
- When the composition button 113-6 is pressed (S1112; Yes), or when an image change is selected with the selection/determination operation button 113-3 (S1116; Yes), the image data recorded in the storage unit 104 or the external recording medium 106 is combined by the composition processing unit 1027 (S1113).
- The composition is performed in the image temporary storage area 1032.
- S1115 is the same as S315 in FIG.
- FIG. 14A is a display example of the display unit 112 when the synthesis button 113-6 is pressed.
- the user uses the selection / determination operation button 113-3 to select whether or not to change the image data to be synthesized to another image data (S1116).
- FIG. 14B is a display example of the display unit 112 by the imaging apparatus 1 when the image data is changed to another image data.
- Here, the image data 1404 selected in step S1116 is not changed, but the image data 1401 to 1403 to be combined with it are changed.
- S1117 to S1118 are the same as S317 to S318 in FIG.
- the recording, reproduction, and composition of images are performed by performing the processes of S1101 to S1118.
- FIG. 12 is a flowchart illustrating an operation example of the synthesis processing unit according to the second embodiment.
- S1202 to S1209 are the same as S502 to S509 in FIG.
- FIG. 15 is an explanatory diagram showing an example of search and synthesis of image data according to the second embodiment.
- The image data 1501 to 1505 are image data taken in order from 10:02:00 to 10:02:55 on 2013.15.15 at latitude 35°24′53″ N, longitude 138°51′31″ E.
- the image data 1501 to 1505 have the same background.
- The image data 1501 to 1503 are image data in which the person B is photographed by the person A, and the image data 1504 and 1505 are image data in which the person A is photographed by the person B.
- When the shutter button 113-2 of the imaging apparatus 1 is pressed, the subject is photographed, and the photographed image is displayed on the display unit 112.
- When the composition button 113-6 is pressed, the images are combined.
- Image data 1506 is image data synthesized from the image data 1502 and 1504.
- According to the second embodiment, the same effects as those of the first embodiment can be obtained, and in addition the user can freely select the images to be combined.
- <Third embodiment> In the third embodiment, only registered persons are added to the combined image, and unregistered persons are deleted from the combined image.
- FIG. 16 is a flowchart illustrating an example of the overall operation of the imaging apparatus according to the third embodiment.
- S1601 to S1612 are the same as S301 to S312 in FIG.
- FIG. 18A is a diagram showing a display example of the display unit 112 immediately after shooting by the imaging apparatus 1 or when the playback button 113-5 is pressed.
- Here, a person A is photographed against a mountain background, but unregistered persons C and D are also photographed. The persons A and B are registered in advance as important persons.
- In addition, when shooting is performed by touching a person displayed on the display unit 112, which includes a touch sensor, instead of pressing the shutter button 113-2, the touched person may be registered as an important person.
- When the composition button 113-6 is pressed (S1612; Yes), the image data captured by the imaging unit 110 and stored in the image temporary storage area 1032 and the image data recorded in the storage unit 104 or the external recording medium 106 are combined by the composition processing unit 1027 (S1613). The composition is performed in the image temporary storage area 1032; registered persons are added by the composition, and unregistered persons are deleted.
- FIG. 18B is a diagram showing a display example of image data synthesized by the imaging apparatus 1.
- In the illustrated example, the person B photographed against the same mountain is combined into the image containing the person A photographed against the mountain.
- Unregistered persons C and D are deleted.
- S1615 to S1618 are the same as S315 to S318 in FIG.
- Recording, reproduction, and composition of an image are performed by performing the processes of S1601 to S1618.
- FIG. 17 is a flowchart showing an operation example of the composition processing unit according to the third embodiment.
- S1701 to S1706 are the same as S501 to S506 in FIG.
- When the search image and the display image have the same shooting location (S1702; Yes), the shooting dates and times are within the predetermined time (S1703; Yes), the backgrounds are the same (S1704; Yes), the same person is not included (S1705; No), and the persons do not overlap (S1706; No), the registered person in the search image is extracted by contour detection, pasted onto the display image, and combined. Alternatively, the registered person in the display image is extracted by contour detection, pasted onto the search image, and combined (S1707).
- In addition, the background of the search image at the position where an unregistered person exists in the display image is extracted and pasted onto the display image (the background data is combined), thereby deleting the unregistered person.
- Alternatively, the background of the display image at the position where an unregistered person exists in the search image is extracted and pasted onto the search image, thereby deleting the unregistered person.
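The add-registered / delete-unregistered operations described around S1707 can be sketched together as below. This is a hedged illustration only: the mask bookkeeping, the person names, and the availability of an aligned background-only view are all assumptions, and the images are assumed to be pre-aligned on the common background.

```python
import numpy as np

def composite_registered(display_img, search_img, background, masks, registered):
    """Third-embodiment sketch: paste registered persons from the search
    image onto the display image, then erase unregistered persons by
    pasting the aligned background over them.  `masks` maps a person
    name to (source_image, HxW boolean mask) for whichever image that
    person appears in.  All inputs are assumed pre-aligned."""
    out = display_img.copy()
    for name, (img, mask) in masks.items():
        if name in registered and img is search_img:
            out[mask] = search_img[mask]   # add a registered person
        elif name not in registered:
            out[mask] = background[mask]   # delete an unregistered person
    return out
```

A registered person already in the display image is simply left in place; only unregistered regions are overwritten with background pixels.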
- S1708 to S1709 are the same as S508 to S509 in FIG.
- Next, the composition processing unit 1027 checks whether an unregistered person exists in the combined image data (S1710).
- If an unregistered person exists, the composition processing unit 1027 checks whether there is another image (S1711).
- The composition processing unit 1027 checks whether the shooting location of the search image and the shooting location of the combined image are the same (S1713).
- The composition processing unit 1027 checks whether the shooting date and time of the search image and that of the combined image are within the predetermined time (S1714).
- The predetermined time is, for example, about 3 minutes, and may be changed by the user.
- The composition processing unit 1027 checks whether the background of the search image is the same as the background of the display image (S1715).
- When the backgrounds are the same, the composition processing unit 1027 deletes the unregistered person (S1716) and returns to S1710.
- the image to be pasted on the display image is searched, the registered person is extracted and synthesized, and the unregistered person is deleted.
- FIG. 19 is an explanatory diagram showing an example of image data search and composition.
- The image data 1901 to 1903 are image data taken in order from 10:02:00 to 10:02:50 on 2013.03.16 at latitude 35°24′53″ N, longitude 138°51′31″ E.
- the image data 1901 to 1903 have the same background.
- The image data 1901 and 1902 are image data in which the person B is photographed by the person A, but the unregistered person D is also captured in the image data 1902. Since the person B in the image data 1901 is not smiling, the person A has re-photographed the person B as the image data 1902.
- The image data 1903 is image data in which the person A is photographed by the person B, but the unregistered persons C and D are also captured. The persons A and B are registered in advance as important persons. In addition, when shooting is performed by touching a person displayed on the display unit 112, which includes a touch sensor, instead of pressing the shutter button 113-2, the touched person may be registered as an important person.
- When the shutter button 113-2 of the imaging apparatus 1 is pressed, the subject is photographed, and the photographed image is displayed on the display unit 112.
- When the composition button 113-6 is pressed, the images are combined.
- Image data 1904 is image data obtained from the image data 1902 and 1903 by combining the registered persons and deleting the unregistered person C.
- Specifically, the person B of the image data 1902 is extracted by contour detection and pasted onto the image data 1903 with the background as a reference, thereby combining the person B into the image data 1903.
- Further, the unregistered person C can be deleted by pasting the background of the image data 1902 at the position where the unregistered person C exists in the image data 1903 over the unregistered person C.
- Alternatively, the person A of the image data 1903 may be extracted by contour detection and combined with the image data 1902 with the background as a reference to generate the combined image data 1904. If unregistered persons exist in one or both of the image data, it is preferable to extract the registered person from the image data containing more unregistered persons and paste it into the image data containing fewer unregistered persons.
- Similarly, the unregistered person D is deleted by pasting the background of the image data 1901 at the position where the unregistered person D exists in the image data 1904 over the unregistered person D.
- Image data 1905 is image data obtained by deleting the unregistered person D from the image data 1904 using the image data 1901.
- According to the third embodiment, the same effects as those of the first embodiment can be obtained; in addition, only registered persons are added to the image, and unregistered persons can be deleted.
- It is also possible to delete unregistered persons without adding any registered person to the image.
- <Fourth embodiment> The fourth embodiment is an embodiment in which, when an image containing only the background with no person (hereinafter referred to as a "background image") has been captured, that background image is used; when there is no background image, image data in which the same background appears, published on the network, is searched for. Unregistered persons are then deleted using these background images.
- In FIG. 20A, the same processing parts as those in FIG. 1A are denoted by the same reference numerals, and their description is omitted.
- FIG. 20A is a block diagram illustrating a configuration example of an imaging apparatus according to the fourth embodiment.
- the imaging device 1 illustrated in FIG. 20A includes a wireless communication unit 120, is connected to the external network 3 via the access point device 2, and transmits and receives data to and from the server device 4 on the external network 3.
- The connection to the access point device 2 is assumed to be a wireless connection using Wi-Fi (registered trademark) or the like.
- The access point device 2 may be a base station of a mobile communication carrier. Further, instead of the imaging apparatus 1 directly transmitting and receiving data to and from the access point device 2, the data may be relayed via a smartphone or another portable terminal.
- the server apparatus 4 has a plurality of image data with position information and image data without position information, and the imaging apparatus 1 can acquire various image data via the access point apparatus 2 and the network 3.
- FIG. 20B is a software configuration diagram of the imaging apparatus 1 according to the fourth embodiment; the same processing parts as those in FIG. 1B are denoted by the same reference numerals, and their description is omitted.
- The ROM 102 includes a wireless connection processing unit 1029, which performs wireless connection processing with the access point device 2 by controlling the wireless communication unit 120. Further, the position information acquisition unit 1022 performs processing for updating the position information in the metadata of the image files recorded in the storage unit 104 of the main body or the external recording medium 106.
- FIG. 21 is a flowchart showing an operation example of the synthesis processing unit according to the fourth embodiment.
- S2101 to S2110 are the same as S1701 to S1710 in FIG.
- S2113 to S2116 are the same as S1713 to S1716 in FIG.
- When no suitable background image is recorded, the imaging apparatus connects to the external network 3 via the access point device 2 and searches the images of the server device 4 on the external network 3 (S2117). Then, processing similar to S2116 is performed using the images on the network, and the unregistered persons are deleted (S2118).
- the image to be pasted on the display image is searched, the registered person is extracted and synthesized, and the unregistered person is deleted.
- FIG. 22 is an explanatory diagram showing an example of search and synthesis of image data according to the fourth embodiment.
- The image data 2201 to 2203 are image data photographed in order from 10:02:00 to 10:02:50 on 2011.03.16 at latitude 35°24′53″ N, longitude 138°51′31″ E.
- the image data 2201 to 2203 have the same background.
- the image data 2201 is image data in which the person B is photographed by the person A, but the unregistered person D is also photographed.
- The image data 2202 is image data in which the person A is photographed by the person B, but the unregistered persons C and D are also captured. The persons A and B are registered in advance as important persons. In addition, when shooting is performed by touching a person displayed on the display unit 112, which includes a touch sensor, instead of pressing the shutter button 113-2, the touched person may be registered as an important person.
- When the shutter button 113-2 of the imaging apparatus 1 is pressed, the subject is photographed, and the photographed image is displayed on the display unit 112.
- When the composition button 113-6 is pressed, the images are combined.
- The composite image data 2203 is image data obtained from the image data 2201 and 2202 by combining the registered persons and deleting the unregistered person C.
- Specifically, the person B of the image data 2201 is extracted by contour detection and pasted onto the image data 2202 with the background as a reference, and the background of the image data 2201 at the position where the unregistered person C exists in the image data 2202 is pasted over the unregistered person C, thereby deleting the unregistered person C.
- Alternatively, the person A of the image data 2202 may be extracted by contour detection and pasted onto the image data 2201 with the background as a reference. If unregistered persons exist in one or both of the image data, it is preferable to extract the registered person from the image data containing more unregistered persons and paste it into the image data containing fewer unregistered persons.
- If an image of the same background in which the unregistered person D does not appear (a background image) has been re-taken, the unregistered person D can also be deleted by pasting that background.
- The image data search is performed by pattern matching to find similar image data. The search may also use the shooting location, the shooting season, the shooting date, and the like.
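As one possible reading of the pattern matching mentioned above, candidate background images can be ranked by a simple histogram-intersection similarity. This is an illustrative stand-in, not the patent's method; images are assumed to be grayscale arrays of equal size.

```python
import numpy as np

def hist_similarity(a, b, bins=32):
    """Histogram intersection of two equally sized gray-level images,
    in [0, 1]; 1.0 means identical intensity distributions.
    (A simple stand-in for the pattern matching named in the text.)"""
    ha, _ = np.histogram(a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(b, bins=bins, range=(0, 256))
    denom = max(ha.sum(), 1)
    return float(np.minimum(ha, hb).sum()) / denom

def best_background(query, candidates):
    """Return the candidate whose distribution best matches the query."""
    return max(candidates, key=lambda c: hist_similarity(query, c))
```

A real system would combine such a visual score with the metadata filters (location, season, date) also mentioned in the text.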
- The composite image data 2205 is image data in which the unregistered person D has been deleted from the composite image data 2203 using the image data 2204 found on the network.
- Specifically, the unregistered person D is deleted by pasting the background of the image data 2204 at the position where the unregistered person D exists in the composite image data 2203 over the unregistered person D.
- The present embodiment provides the same effects as the third embodiment, and can further search for image data published on the network and delete unregistered persons using the retrieved image data.
- Image data previously shot by the user at the same location may also be used for the composition.
- The processing examples described above may be implemented as independent programs, or a plurality of programs may constitute one application program. The order in which the processes are performed may also be changed.
- Part or all of the functions of the present invention described above may be realized in hardware, for example by designing them as an integrated circuit.
- They may also be realized in software, with a microprocessor unit or the like interpreting and executing an operation program that realizes each function. Hardware and software may be used together.
- The control lines and information lines shown in the figures are those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be mutually connected.
- 1: imaging device, 101: main control unit, 102: ROM, 103: RAM, 104: storage unit, 106: external recording medium, 110: imaging unit, 115: GPS receiving unit, 120: wireless communication unit, 1022: position information acquisition unit, 1023: imaging processing unit, 1024: image recording unit, 1025: image reproduction unit, 1027: composition processing unit, 1028: face recognition processing unit
Description
The first embodiment is an embodiment in which the image data being displayed and image data selected by the imaging apparatus are combined in a manual composition mode (an operation mode in which composition is performed after a composition instruction from the user) or in an automatic composition mode (an operation mode in which the imaging apparatus performs the composition automatically).
FIG. 1A is a block diagram showing a configuration example of the imaging apparatus according to the first embodiment of the present invention.
FIG. 1B is a software configuration diagram of the imaging apparatus according to the first embodiment, showing the software configuration in the ROM 102, the RAM 103, and the storage unit 104.
The external appearance of the imaging apparatus according to the first embodiment will be described with reference to FIGS. 2A and 2B.
FIG. 3 is a flowchart showing an example of the overall operation of the imaging apparatus according to the first embodiment. FIG. 3 illustrates the flow of operation in the manual mode, that is, the case where the image composition processing (S313) is executed in response to a user input operation instructing image composition, more specifically the pressing of the composition button (S312; Yes) in the following example.
Next, the image composition processing of S313 and S411 will be described.
The second embodiment is an embodiment in which composition is performed using image data selected by the user.
FIG. 11 is a flowchart showing an example of the overall operation of the imaging apparatus according to the second embodiment.
Next, the image composition processing of S1113 will be described. FIG. 12 is a flowchart showing an operation example of the composition processing unit according to the second embodiment.
The third embodiment is an embodiment in which only registered persons are added to the combined image and unregistered persons are deleted from the combined image.
FIG. 16 is a flowchart showing an example of the overall operation of the imaging apparatus according to the third embodiment.
Next, the image composition processing of S1613 will be described.
The fourth embodiment is an embodiment in which, when an image containing only the background with no person (hereinafter referred to as a "background image") has been captured, that background image is used, and when there is no background image, image data in which a background published on the network is captured is searched for; unregistered persons are then deleted using these background images.
In FIG. 20A, the same processing parts as those in FIG. 1A are denoted by the same reference numerals, and their description is omitted.
FIG. 20B is a software configuration diagram of the imaging apparatus 1 of the present embodiment; the same processing parts as those in FIG. 1B are denoted by the same reference numerals, and their description is omitted.
Next, the image composition processing will be described. FIG. 21 is a flowchart showing an operation example of the composition processing unit according to the fourth embodiment.
Claims (11)
- An imaging apparatus comprising: an imaging processing unit that images a subject and generates image data; a face recognition processing unit that executes face recognition processing on the image data; an image recording unit that records the image data; and a composition processing unit that executes composition processing so that persons captured in each of a plurality of image data are included in one image data, and thereby generates composite data, wherein the face recognition processing unit executes the face recognition processing on first image data and recognizes the face of a first person, and when second image data, which was shot at an arbitrary shooting timing different from the shooting timing of the first image data, has the same background as the background of the first image data, and captures a second person different from the first person, is recorded in the image recording unit, the composition processing unit uses the first image data and the second image data to generate the composite data in which the first person and the second person are superimposed on the same background.
- The imaging apparatus according to claim 1, further comprising a display unit that displays the image data generated by the imaging processing unit or one of the image data recorded in the image recording unit, wherein the face recognition processing unit executes the face recognition processing using the image data displayed on the display unit as the first image data.
- The imaging apparatus according to claim 1, further comprising a selection operation unit with which a user selects image data recorded in the image recording unit, wherein the composition processing unit executes the composition processing using the selected image data as the second image data.
- The imaging apparatus according to claim 1, further comprising a composition instruction operation unit for a user to instruct the start of the composition processing, wherein, when the composition instruction operation unit receives the instruction, the face recognition processing unit starts the face recognition processing.
- The imaging apparatus according to claim 1, further comprising a mode setting unit for setting an automatic composition mode in which the composition processing unit is caused to execute the composition processing when the imaging processing unit generates the image data, wherein, when the imaging apparatus is set to the automatic composition mode and the imaging processing unit generates the image data, the face recognition processing unit starts the face recognition processing on the image data generated by the imaging processing unit.
- The imaging apparatus according to claim 5, wherein, when the image data generated by the imaging processing unit contains a subject region in which a person is captured at a position shifted from the image center of the image data by a predetermined ratio or more, the face recognition processing unit starts the face recognition processing on the image data generated by the imaging processing unit.
- The imaging apparatus according to claim 1, wherein the image recording unit records face image data of registered persons to be combined, and the composition processing unit compares the recognition results obtained by the face recognition processing unit executing the face recognition processing on the first image data and the second image data with the face image data of the registered persons, and, when it determines that the face of an unregistered person different from the registered persons is captured in the first image data or the second image data, deletes the subject region in which the unregistered person is captured and combines the background data into the deleted region to generate the composite data.
- The imaging apparatus according to claim 7, further comprising a wireless communication unit that transmits and receives data to and from an external device via a wireless communication line, wherein the composition processing unit acquires third image data having the same background from the external device via the wireless communication unit, and combines the background data into the deleted region using the third image data.
- The imaging apparatus according to claim 1, further comprising a position information acquisition unit that acquires position information corresponding to the position at which the subject was imaged, wherein the image recording unit records the position information together with the image data, and the composition processing unit extracts candidates for the second image data from the image data recorded in the image recording unit, using as a search condition that the same position information as the position information added to the first image data is attached.
- An image processing method comprising: a step of imaging a subject and generating first image data in which a first person is captured; a step of receiving an instruction to start composition processing using the first image data; a step of executing face recognition processing on the first image data and recognizing the first person; a step of searching for and reading out, from an image recording unit in which image data different from the first image data is recorded, second image data that has the same background as the background of the first image data and in which a second person different from the first person is captured; and a step of generating composite data in which the first person and the second person are superimposed on the same background.
- An image processing method comprising: a step of setting an operation mode in which, when a subject is imaged and image data is generated, execution of composition processing using the image data is started; a step of generating first image data in which a first person is captured; a step of executing face recognition processing on the first image data and recognizing the first person; a step of searching for and reading out, from an image recording unit in which image data different from the first image data is recorded, second image data that has the same background as the background of the first image data and in which a second person different from the first person is captured; and a step of generating composite data in which the first person and the second person are superimposed on the same background.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/739,097 US10542224B2 (en) | 2015-06-26 | 2015-06-26 | Imaging device and image processing method |
CN201580079346.8A CN107710731B (zh) | 2015-06-26 | 2015-06-26 | 摄像装置以及图像处理方法 |
JP2017524553A JP6537608B2 (ja) | 2015-06-26 | 2015-06-26 | 撮像装置及び画像処理方法 |
CN202110381333.3A CN113099119B (zh) | 2015-06-26 | 2015-06-26 | 摄像装置以及图像处理方法 |
CN202110381252.3A CN113099118B (zh) | 2015-06-26 | 2015-06-26 | 摄像装置以及图像处理方法 |
PCT/JP2015/068535 WO2016208070A1 (ja) | 2015-06-26 | 2015-06-26 | 撮像装置及び画像処理方法 |
US16/709,476 US11159714B2 (en) | 2015-06-26 | 2019-12-10 | Imaging device and image processing method |
US17/499,275 US11528432B2 (en) | 2015-06-26 | 2021-10-12 | Imaging device and image processing method |
US18/054,600 US11870944B2 (en) | 2015-06-26 | 2022-11-11 | Imaging device and image processing method |
US18/527,950 US20240106972A1 (en) | 2015-06-26 | 2023-12-04 | Imaging device and image processing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/068535 WO2016208070A1 (ja) | 2015-06-26 | 2015-06-26 | 撮像装置及び画像処理方法 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/739,097 A-371-Of-International US10542224B2 (en) | 2015-06-26 | 2015-06-26 | Imaging device and image processing method |
US16/709,476 Continuation US11159714B2 (en) | 2015-06-26 | 2019-12-10 | Imaging device and image processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016208070A1 true WO2016208070A1 (ja) | 2016-12-29 |
Family
ID=57585226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/068535 WO2016208070A1 (ja) | 2015-06-26 | 2015-06-26 | 撮像装置及び画像処理方法 |
Country Status (4)
Country | Link |
---|---|
US (5) | US10542224B2 (ja) |
JP (1) | JP6537608B2 (ja) |
CN (3) | CN113099119B (ja) |
WO (1) | WO2016208070A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110140152A (zh) * | 2017-10-20 | 2019-08-16 | 三菱电机株式会社 | 数据处理装置、可编程显示器及数据处理方法 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220058975A (ko) * | 2016-09-16 | 2022-05-10 | 소니 세미컨덕터 솔루션즈 가부시키가이샤 | 촬상 장치 및 전자 기기 |
CN109089045A (zh) * | 2018-09-18 | 2018-12-25 | 上海连尚网络科技有限公司 | 一种基于多个摄像装置的摄像方法及设备及其终端 |
US11513669B2 (en) * | 2020-02-28 | 2022-11-29 | Micron Technology, Inc. | User interface for modifying pictures |
US11776237B2 (en) * | 2020-08-19 | 2023-10-03 | Adobe Inc. | Mitigating people distractors in images |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010108475A (ja) * | 2008-10-03 | 2010-05-13 | Sony Corp | 画像処理装置および方法、プログラム、並びに記録媒体 |
JP2012109693A (ja) * | 2010-11-16 | 2012-06-07 | Casio Comput Co Ltd | 撮像装置、画像合成方法、及びプログラム |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11136568A (ja) | 1997-10-31 | 1999-05-21 | Fuji Photo Film Co Ltd | タッチパネル操作式カメラ |
US7221395B2 (en) * | 2000-03-14 | 2007-05-22 | Fuji Photo Film Co., Ltd. | Digital camera and method for compositing images |
JP4508596B2 (ja) | 2002-11-06 | 2010-07-21 | キヤノン株式会社 | 通信装置、画像記憶装置およびそれらの制御方法 |
CN100388277C (zh) * | 2002-11-06 | 2008-05-14 | 佳能株式会社 | 通信装置以及通信装置的控制方法 |
JP2004248020A (ja) | 2003-02-14 | 2004-09-02 | Fuji Photo Film Co Ltd | 画像処理装置および画像処理システム |
JP2005033532A (ja) * | 2003-07-14 | 2005-02-03 | Noritsu Koki Co Ltd | 写真処理装置 |
JP4670976B2 (ja) | 2008-10-03 | 2011-04-13 | ソニー株式会社 | 学習装置および方法、認識装置および方法、プログラム、並びに記録媒体 |
JP2011118694A (ja) | 2009-12-03 | 2011-06-16 | Sony Corp | 学習装置および方法、認識装置および方法、並びにプログラム |
JP5105550B2 (ja) * | 2009-03-19 | 2012-12-26 | カシオ計算機株式会社 | 画像合成装置及びプログラム |
JP4752941B2 (ja) * | 2009-03-31 | 2011-08-17 | カシオ計算機株式会社 | 画像合成装置及びプログラム |
JP5267279B2 (ja) | 2009-03-31 | 2013-08-21 | カシオ計算機株式会社 | 画像合成装置及びプログラム |
CN102075675A (zh) * | 2010-12-24 | 2011-05-25 | 富泰华工业(深圳)有限公司 | 便携式电子装置及其拍照方法 |
US20150009359A1 (en) * | 2013-03-19 | 2015-01-08 | Groopic Inc. | Method and apparatus for collaborative digital imaging |
JP5761323B2 (ja) | 2013-12-17 | 2015-08-12 | カシオ計算機株式会社 | 撮像装置、撮像方法、及びプログラム |
CN103747180A (zh) * | 2014-01-07 | 2014-04-23 | 宇龙计算机通信科技(深圳)有限公司 | 照片拍摄方法及拍照终端 |
KR20160057867A (ko) * | 2014-11-14 | 2016-05-24 | 삼성전자주식회사 | 디스플레이 장치 및 그에 의한 이미지 처리 방법 |
- 2015-06-26 CN CN202110381333.3A patent/CN113099119B/zh active Active
- 2015-06-26 JP JP2017524553A patent/JP6537608B2/ja active Active
- 2015-06-26 US US15/739,097 patent/US10542224B2/en active Active
- 2015-06-26 CN CN201580079346.8A patent/CN107710731B/zh active Active
- 2015-06-26 WO PCT/JP2015/068535 patent/WO2016208070A1/ja active Application Filing
- 2015-06-26 CN CN202110381252.3A patent/CN113099118B/zh active Active
- 2019-12-10 US US16/709,476 patent/US11159714B2/en active Active
- 2021-10-12 US US17/499,275 patent/US11528432B2/en active Active
- 2022-11-11 US US18/054,600 patent/US11870944B2/en active Active
- 2023-12-04 US US18/527,950 patent/US20240106972A1/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010108475A (ja) * | 2008-10-03 | 2010-05-13 | Sony Corp | 画像処理装置および方法、プログラム、並びに記録媒体 |
JP2012109693A (ja) * | 2010-11-16 | 2012-06-07 | Casio Comput Co Ltd | 撮像装置、画像合成方法、及びプログラム |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110140152A (zh) * | 2017-10-20 | 2019-08-16 | 三菱电机株式会社 | 数据处理装置、可编程显示器及数据处理方法 |
CN110140152B (zh) * | 2017-10-20 | 2020-10-30 | 三菱电机株式会社 | 数据处理装置、可编程显示器及数据处理方法 |
Also Published As
Publication number | Publication date |
---|---|
CN113099119B (zh) | 2023-08-22 |
US11870944B2 (en) | 2024-01-09 |
CN113099118A (zh) | 2021-07-09 |
US20180241951A1 (en) | 2018-08-23 |
US20230075223A1 (en) | 2023-03-09 |
US10542224B2 (en) | 2020-01-21 |
US11159714B2 (en) | 2021-10-26 |
JP6537608B2 (ja) | 2019-07-03 |
CN113099118B (zh) | 2023-08-22 |
US11528432B2 (en) | 2022-12-13 |
CN113099119A (zh) | 2021-07-09 |
CN107710731A (zh) | 2018-02-16 |
JPWO2016208070A1 (ja) | 2018-03-08 |
US20220046167A1 (en) | 2022-02-10 |
US20240106972A1 (en) | 2024-03-28 |
CN107710731B (zh) | 2021-05-04 |
US20200112694A1 (en) | 2020-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8599251B2 (en) | Camera | |
KR101720774B1 (ko) | 디지털 촬영 장치 및 그의 사진 제공 방법 | |
US8665345B2 (en) | Video summary including a feature of interest | |
US8643746B2 (en) | Video summary including a particular person | |
US11528432B2 (en) | Imaging device and image processing method | |
KR101901910B1 (ko) | 선택 영역을 변화시키는 결과 영상을 생성 또는 저장하는 장치 및 방법 | |
US8525913B2 (en) | Digital photographing apparatus, method of controlling the same, and computer-readable storage medium | |
JP2010199633A (ja) | 再生装置および再生方法 | |
JP2012084052A (ja) | 撮像装置、制御方法及びプログラム | |
JP6703361B2 (ja) | 撮像装置及び画像処理方法 | |
JP6916342B2 (ja) | 撮像装置及び画像処理方法 | |
JP7362696B2 (ja) | 撮像装置 | |
JP2008271239A (ja) | カメラ、コンテンツ作成方法、及びプログラム | |
CN103581512B (zh) | 拍摄装置和方法 | |
US9571717B2 (en) | Imaging device, imaging system, imaging method, and computer-readable recording medium | |
JP2017228828A (ja) | 撮像装置、表示装置、及び撮像表示システム | |
JP2022137136A (ja) | 表示装置 | |
JP5687480B2 (ja) | 撮影装置、撮影方法及び撮影プログラム | |
JP2012134864A (ja) | 情報処理装置とその処理方法及びプログラム | |
JP2014216904A (ja) | 撮像装置、画像再生装置、データ記録方法、画像再生方法及びプログラム | |
JP2010109634A (ja) | 画像表示装置及び撮像装置 | |
JP2008270975A (ja) | 情報付加装置、情報付加方法及びプログラム | |
JP2010087712A (ja) | 動画撮像装置、動画撮像方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15896388; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017524553; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 15739097; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15896388; Country of ref document: EP; Kind code of ref document: A1 |