WO2015115203A1 - Imaging Device, Imaging Method, and Program - Google Patents
- Publication number
- WO2015115203A1 (PCT/JP2015/051018)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- imaging
- communication
- unit
- photographing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G03B7/091—Digital circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/53—Constructional details of electronic viewfinders, e.g. rotatable or detachable
- H04N23/531—Constructional details of electronic viewfinders, e.g. rotatable or detachable being rotatable or detachable
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/634—Warning indications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/684—Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time
Definitions
- the present technology relates to a photographing apparatus, a photographing method, and a program, and more particularly, to a photographing apparatus, a photographing method, and a program that can appropriately photograph a subject, for example.
- the present technology has been made in view of such a situation, and makes it possible to appropriately photograph a subject.
- An imaging device of the present technology includes a communication control unit that controls a communication process for communicating with a subject, and an imaging control unit that performs imaging control for controlling imaging, by an imaging unit that images the subject, in accordance with the communication process. A program of the present technology causes a computer to function as such an imaging device.
- The photographing method of the present technology includes a step of performing photographing control that controls photographing by a photographing unit that photographs the subject, in accordance with a communication process for communicating with the subject.
- In the imaging device, imaging method, and program of the present technology, photographing by the photographing unit that photographs the subject is controlled in accordance with the communication process for communicating with the subject.
- the photographing device may be an independent device or an internal block constituting one device.
- the program can be provided by being transmitted through a transmission medium or by being recorded on a recording medium.
- the subject can be appropriately photographed.
- FIG. 6 is a flowchart illustrating an example of processing for photographing a subject in a desired state. FIG. 7 is a flowchart illustrating an example of processing for selecting a communication process (induction process). FIG. 8 is a diagram illustrating examples of desired states and induction processes.
- A further flowchart illustrates an example of processing for photographing a subject that is stationary and smiling.
- FIG. 18 is a block diagram illustrating a configuration example of an embodiment of a computer to which the present technology is applied.
- FIG. 1 is a block diagram illustrating a configuration example of an embodiment of a photographing apparatus to which the present technology is applied.
- the photographing apparatus is, for example, a digital camera that photographs a moving image or a still image, and includes a control unit 11, a communication processing unit 12, a photographing unit 13, a recording unit 14, a communication unit 15, and a position information acquisition unit 16.
- the control unit 11 controls the entire photographing apparatus.
- The control unit 11 performs necessary processing on the captured image supplied from the imaging unit 13, and supplies the processed image to the recording unit 14 for recording.
- control unit 11 includes a communication control unit 11A and a photographing control unit 11B.
- the communication control unit 11A controls communication processing by the communication processing unit 12.
- the photographing control unit 11B performs photographing control for controlling photographing of the photographed image by the photographing unit 13.
- the communication processing unit 12 performs (executes) communication processing for communicating with the subject in accordance with the control of the control unit 11 (communication control unit 11A).
- As the communication process, any process for communicating with the subject can be adopted: for example, image display, audio output, or a gesture by a robot.
- the communication processing unit 12 is included in the imaging device, but the communication processing unit 12 can be configured as a device separate from the imaging device.
- In this case, the control unit 11 (the communication control unit 11A) controls the communication processing by the communication processing unit 12 by performing wireless or wired communication with the communication processing unit 12.
- the imaging unit 13 has an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) sensor, for example, and images the subject according to the imaging control by the control unit 11 (the imaging control unit 11B). Further, the photographing unit 13 supplies a photographed image obtained by photographing the subject to the control unit 11.
- the recording unit 14 records the captured image supplied from the control unit 11.
- the recording unit 14 includes a recording control unit 14A and a recording medium 14B.
- the recording control unit 14A performs recording control for recording the captured image supplied from the control unit 11 on the recording medium 14B.
- the recording medium 14B is, for example, a recording medium such as a memory card or a disk, and may be a recording medium that can be attached to and detached from the recording unit 14, or may be a recording medium that is fixed in the recording unit 14.
- the communication unit 15 performs communication via the Internet or other networks under the control of the control unit 11.
- the control unit 11 can acquire necessary information (contents, programs, and the like) from, for example, a server on the Internet by causing the communication unit 15 to perform communication.
- The position information acquisition unit 16 acquires the current location of the imaging apparatus using, for example, GPS (Global Positioning System), and supplies the current location to the control unit 11.
- the control unit 11 performs processing using the current location supplied from the position information acquisition unit 16 as necessary.
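As a rough illustration only, the unit composition described above can be sketched in code. The following Python sketch is hypothetical: the class and method names are invented for illustration and do not appear in the source, which only specifies the units of FIG. 1 and their roles.

```python
class CommunicationProcessingUnit:
    """Performs a communication process (e.g. image display, audio output,
    or a robot gesture), corresponding to communication processing unit 12."""
    def __init__(self):
        self.executed = []

    def execute(self, process):
        # A real device would drive an LCD panel, speaker, or robot here.
        self.executed.append(process)


class PhotographingUnit:
    """Photographs the subject with an image sensor such as a CMOS sensor
    (photographing unit 13)."""
    def capture(self):
        return {"frame": "raw image data"}  # placeholder captured image


class RecordingUnit:
    """Records captured images (recording unit 14)."""
    def __init__(self):
        self.medium = []  # stands in for recording medium 14B

    def record(self, image):
        self.medium.append(image)  # recording control by 14A


class ControlUnit:
    """Controls the whole apparatus (control unit 11): communication control
    (11A) drives the communication processing unit, and photographing
    control (11B) drives the photographing unit."""
    def __init__(self):
        self.communication = CommunicationProcessingUnit()
        self.photographing = PhotographingUnit()
        self.recording = RecordingUnit()

    def shoot_and_record(self):
        image = self.photographing.capture()
        self.recording.record(image)
        return image
```

In this sketch the communication processing unit is held by the control unit, but, as noted below for FIG. 3, it could equally be a separate device reached over wireless or wired communication.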
- FIG. 2 is a perspective view showing a first external configuration example of the photographing apparatus of FIG.
- the photographing device is a digital still camera capable of photographing still images and moving images, and includes a main body 21 and a liquid crystal panel 22.
- The main body 21 incorporates the control unit 11, the photographing unit 13, and the recording unit 14 of FIG. 1, and a lens for condensing light on the photographing unit 13 (image sensor) is provided on the front side.
- The liquid crystal panel 22 functions as the communication processing unit 12 in FIG. 1. In FIG. 2, the liquid crystal panel 22 is provided on the back side of the main body 21 and is mounted so as to be rotatable about the upper part of the main body 21.
- By rotating the liquid crystal panel 22, its display screen can be made to face the front side (photographing direction), whereby the image displayed on the display screen is presented to the subject and can be used to communicate with the subject.
- the liquid crystal panel 22 performs a process of displaying a predetermined message or other image on the display screen as a communication process, thereby achieving communication with the subject by the image displayed on the display screen.
- the liquid crystal panel 22 (or the main body 21) can incorporate a speaker.
- In that case, the liquid crystal panel 22 can perform, as a communication process, a process of outputting a predetermined message or other audio from the speaker, and communication with the subject can be achieved by the sound output from the speaker.
- FIG. 3 is a perspective view showing a second external configuration example of the photographing apparatus of FIG.
- the photographing apparatus is a digital still camera capable of photographing a still image and a moving image, and includes a main body 31.
- The main body 31 is configured similarly to the main body 21 of FIG. 2. That is, the main body 31 incorporates the control unit 11, the imaging unit 13, and the recording unit 14 of FIG. 1, and a lens for condensing light on the imaging unit 13 is provided on the front side.
- the main body 31 can communicate with an output terminal 32 separate from the main body 31 by wireless communication such as wireless LAN (Local Area Network) or Bluetooth (registered trademark).
- the output terminal 32 is a terminal capable of communication processing such as image display and sound output, and functions as the communication processing unit 12 in FIG.
- the output terminal 32 communicates with the subject by performing communication processing such as image display and sound output according to control by wireless communication from the main body 31 (the control unit 11).
- a tablet terminal can be adopted as the output terminal 32.
- When a tablet terminal is adopted as the output terminal 32, for example, the photographer who operates the main body 31 to photograph a subject holds the output terminal 32, and the images and the like displayed on the output terminal 32 are presented to the subject.
- the output terminal 32 for example, a smartphone owned by the subject or a wearable device such as a glass type can be employed.
- When a smartphone or wearable device owned by the subject is adopted as the output terminal 32, the subject can receive the presentation of images and the like on that smartphone or wearable device.
- communication with the subject can be achieved by the image displayed on the output terminal 32 separate from the main body 31 and the sound output from the output terminal 32.
- The liquid crystal panel 22 of FIG. 2, removed from the main body 21, can be used as the output terminal 32 of FIG. 3.
- In addition to the image displayed in the communication process, a captured image of the subject photographed by the photographing unit 13 can be transmitted from the main body 31 to a smartphone or the like owned by the subject.
- In the main body 31, identification information for identifying a smartphone or the like owned by the subject is registered in advance; when the subject is photographed, the subject is recognized and the identification information of the smartphone or the like owned by the subject is retrieved, so that the captured image of the subject can be transmitted to the smartphone or the like identified by that information, that is, to a smartphone or the like owned by the photographed subject.
- FIG. 4 is a perspective view showing a third external configuration example of the photographing apparatus of FIG.
- the photographing device is a digital video camera capable of photographing moving images and still images, and includes a main body 41 and a liquid crystal panel 42.
- the main body 41 incorporates the control unit 11, the image capturing unit 13, and the recording unit 14 of FIG. 1, and a lens for condensing light on the image capturing unit 13 is provided on the front side.
- The liquid crystal panel 42 functions as the communication processing unit 12 in FIG. 1. In FIG. 4, the liquid crystal panel 42 is provided on the side surface of the main body 41 and is mounted so as to be openable from the main body 41, with one side of the main body 41 on the lens side serving as the rotation center.
- The liquid crystal panel 42 is also rotatable about an axis orthogonal to the rotation center of the opening and closing rotation with respect to the main body 41, so that, for example, when closed against the main body 41, its display screen can be rotated to a state of being housed toward the main body 41 or a state of facing outward.
- The liquid crystal panel 42 can turn its display screen toward the subject (photographing direction) while opened from the main body 41, thereby presenting the image displayed on the display screen to the subject and making it possible to communicate with the subject by that image.
- the liquid crystal panel 42 communicates with the subject by the image displayed on the display screen by performing a process of displaying a predetermined message or other image on the display screen as a communication process.
- the liquid crystal panel 42 (or the main body 41) can incorporate a speaker.
- In that case, the liquid crystal panel 42 can perform, as a communication process, a process of outputting a predetermined message or other audio from the speaker.
- communication with the subject can be achieved by the sound output from the speaker.
- the output terminal 32 separate from the main body 41 can function as the communication processing unit 12 of FIG.
- FIG. 5 is a perspective view showing a fourth external configuration example of the photographing apparatus of FIG.
- In FIG. 5, the photographing apparatus is configured as a pet-type (dog-type) robot, which can move by moving its four legs, autonomously or according to remote operation by a user (the photographer or the subject), and can take images.
- the photographing unit 13 is built in the head 51 of the pet robot as eyes (vision) of the pet robot.
- a liquid crystal panel 52 that functions as the communication processing unit 12 of FIG. 1 is provided on the chest of the pet-type robot.
- the liquid crystal panel 52 performs a process of displaying a predetermined message and other images as a communication process, thereby achieving communication with the subject by the image displayed in the communication process.
- the liquid crystal panel 52 can be provided at a position other than the chest of the pet-type robot, that is, for example, at the abdomen.
- In that case, the image displayed on the liquid crystal panel 52 is presented to the subject when the pet-type robot stands on its hind legs alone (a so-called begging posture), and communication with the subject can be achieved by that image.
- The head 51 of the pet-type robot can incorporate a speaker serving as the robot's mouth, and the pet-type robot can perform, as a communication process, a process of outputting a predetermined message or other audio from the speaker. In this case, communication with the subject can be achieved by the sound output from the speaker.
- the output terminal 32 separate from the pet type robot can function as the communication processing unit 12 of FIG.
- gestures can be performed as communication processing.
- As the gesture, a predetermined action performed by moving the head 51, legs, tail, or the like of the pet-type robot can be adopted.
- Further, the head 51 can be configured so that the mouth can open and close, and an action such as producing a candy ball from the mouth can be employed as a gesture serving as a communication process. Such an action of producing a candy ball from the mouth can attract a child's attention.
- Further, a projector can be mounted on the pet-type robot of FIG. 5, and a process of projecting an image on a screen by the projector can be performed as a communication process.
- the present invention can be applied to an electronic device that can be equipped with a function for taking an image.
- FIG. 6 is a flowchart for explaining an example of processing for photographing a subject in a desired state in the photographing apparatus of FIG.
- In the photographing apparatus of FIG. 1, by performing a communication process in the communication processing unit 12, communication like that of a professional photographer at a shoot can be established with the subject, and the subject can be induced by that communication to act so as to enter a desired state, so that the subject can be photographed in the desired state.
- In step S11, the photographing unit 13 photographs the subject and supplies the captured image to the control unit 11, and the process proceeds to step S12.
- In step S12, the control unit 11 detects the state of the subject using the captured image from the photographing unit 13, and the process proceeds to step S13.
- To detect the state of the subject, the subject's voice, body temperature, and other biological information can be used in addition to the captured image.
- In step S13, the control unit 11 determines whether or not the state of the subject is a desired state (or is likely to become the desired state).
- Examples of the desired state include a state in which the subject is smiling, a state in which the subject faces the camera (the shooting direction), and a state in which the subject is stationary (stationary to the extent that the subject in the captured image is not blurred).
- the desired state can be determined, for example, according to a user operation or autonomously by the photographing apparatus.
- If it is determined in step S13 that the subject is not in the desired state (and is not likely to enter it), the process proceeds to step S14, where the control unit 11 selects one (or two or more) induction processes as the communication process from among the induction processes prepared in a plurality of patterns for the desired state, and controls the communication processing unit 12 to execute the communication process. The process then proceeds to step S15.
- The induction process is a process that induces the subject to enter a predetermined state, and includes a process that induces the subject to take an action that finally results in the desired state.
- Here, “induce” refers to direct or indirect instruction (for example, output of a “laugh” message or of a humorous message) and presentation of information for bringing the subject into a predetermined state, and is a broad concept including any other action that causes the subject to enter a predetermined state.
- Examples of the induction process include a process that induces the subject to take a predetermined pose and a process that induces the subject to be photographed in a predetermined direction.
- Examples of induction processes that induce the subject to take a predetermined pose include: a process that induces the subject to smile (take a smiling pose); a process that induces the subject to stand still (hold a pose); a process that induces the subject to jump (take a jumping pose); a process that induces the subject to make a peace sign (V sign); a process that induces the subject to take a pose theoretically preferable for portrait photography; and a process that induces the subject to pose like a model.
- Examples of induction processes that induce the subject to be photographed in a predetermined direction include a process that induces the subject to face the camera directly (so that shooting is performed from the direction the subject faces) and a process that induces shooting to be performed with front lighting.
- Induction processes are prepared in a plurality of patterns for each desired state.
- When the desired state is a state in which the subject is smiling, for example, a plurality of patterns of processes are prepared as induction processes: output of a “laugh” message that directly induces laughing (display of an image of the “laugh” message, or audio output of the message); display of images of characters that tend to be popular with children or with women; output of content that tends to be popular in each region; and output of a gesture by a pet-type robot that tends to be popular with women.
- When the desired state is a state in which the subject is stationary, for example, a plurality of patterns of processes are prepared as induction processes: output of “stop” messages with different wordings (for example, a gentle “please hold still” or a strong “stop”); display of images of products that tend to be popular with men; and display of images of idols that tend to be popular with women (for example, idols who appeared frequently in women's magazines in the previous year).
- In step S14, the control unit 11 selects one induction process as the communication process from among the induction processes prepared in a plurality of patterns for the desired state, as described above.
- In step S15, the communication processing unit 12 executes the communication process selected in step S14 under the control of the control unit 11.
- The process then returns from step S15 to step S11, and steps S11 to S15 are repeated until it is determined in step S13 that the state of the subject is the desired state.
- When steps S11 to S15 are repeated, an induction process that has not yet been selected as the communication process in step S14 can be selected, because an induction process that has already been selected may be less effective at bringing the subject into the desired state.
- If it is determined in step S13 that the state of the subject is the desired state, the process proceeds to step S16, where the photographing unit 13 photographs the subject in the desired state and supplies the resulting captured image to the control unit 11.
- Further, in step S16, the control unit 11 causes the recording unit 14 to record the captured image from the photographing unit 13, and the process ends.
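The flow of steps S11 through S16 can be summarized as a short loop. The following Python sketch is illustrative only: the function and hook names are invented, and the capture, state-detection, and execution steps are supplied by the caller as callables; `max_rounds` is an assumption not present in the source.

```python
def photograph_in_desired_state(capture, detect_state, is_desired,
                                induction_processes, execute, max_rounds=10):
    """Sketch of the Fig. 6 flow (steps S11-S16). Untried induction
    processes are preferred, as described for the repetition of S11-S15."""
    tried = []
    for _ in range(max_rounds):
        image = capture()                      # S11: photograph the subject
        state = detect_state(image)            # S12: detect subject state
        if is_desired(state):                  # S13: desired state reached?
            return capture()                   # S16: photograph for recording
        untried = [p for p in induction_processes if p not in tried]
        process = untried[0] if untried else induction_processes[0]  # S14
        tried.append(process)
        execute(process)                       # S15: run communication process
    return None                                # gave up (not in the source)
```

The preference for untried processes mirrors the note above that an already-selected induction process may be less effective on repetition.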
- The control unit 11 can store the communication process performed when the subject entered the desired state (the induction process selected as the communication process) as a communication history in association with the subject, and can use the communication history to perform ranking learning that ranks induction processes (communication processes) likely to bring a subject into the desired state, for example, for each subject, age group, gender, or region.
- Based on the result of the ranking learning, an induction process that easily brings the subject into the desired state can be preferentially selected as the communication process.
- the subject can be photographed in a desired state relatively quickly.
- The age and sex of the subject can be predicted (estimated) by recognizing the subject in a captured image of the subject.
- The region of the subject can be predicted based on, for example, the current location acquired by the position information acquisition unit 16 or the time setting of the imaging apparatus (that is, which region's time the apparatus is set to).
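Under simple assumptions, the ranking learning described above might look like the following sketch: success counts of each induction process are aggregated per attribute (age group, gender, region) and ordered by success rate. The data layout and function name are invented for illustration; the source does not prescribe a learning algorithm.

```python
from collections import defaultdict

def rank_induction_processes(communication_history):
    """Rank induction processes by how often they brought subjects with a
    given attribute into the desired state.

    `communication_history` is an iterable of (attribute, process, success)
    records; the result maps each attribute to its processes ordered from
    most to least effective."""
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    for attribute, process, success in communication_history:
        ok_total = counts[attribute][process]
        ok_total[1] += 1          # total times this process was tried
        if success:
            ok_total[0] += 1      # times it produced the desired state
    return {
        attribute: sorted(procs, key=lambda p: procs[p][0] / procs[p][1],
                          reverse=True)
        for attribute, procs in counts.items()
    }
```

The top-ranked process for a recognized attribute would then be the one selected preferentially in step S14.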
- FIG. 7 is a flowchart for explaining an example of a process for selecting the communication process (the induction process) in step S14 of FIG.
- In step S21, the control unit 11 recognizes the subject of interest, that is, the subject in the captured image from the imaging unit 13, by image processing, and the process proceeds to step S22.
- In step S22, the control unit 11 determines whether or not the subject of interest recognized in step S21 is a registered subject.
- Here, subject information relating to a subject is registered in a table (hereinafter also referred to as a subject table) stored in a memory (not shown) built into the control unit 11.
- A registered subject is a subject whose subject information is registered in the subject table.
- The subject information includes, for example, a feature amount of the subject extracted from a captured image in which the subject appears (or the subject image itself), and is registered in the subject table with a unique ID (identification) assigned.
- If it is determined in step S22 that the subject of interest is not a registered subject, that is, if the subject information of the subject of interest is not registered in the subject table, the process proceeds to step S23, where the control unit 11 registers the subject of interest (its subject information) in the subject table, and the process proceeds to step S24.
- In step S24, the control unit 11 selects one induction process as the communication process from among a plurality of default induction processes prepared for the desired state, for example, based on the subject of interest, and the process proceeds to step S27.
- That is, the control unit 11 can recognize attributes of the subject of interest such as age, sex, and region, and if the ranking learning described above has been performed, in step S24 an induction process effective for the attributes of the subject of interest can be selected as the communication process based on the result of the ranking learning and the subject of interest. According to a communication process selected in this manner, the subject of interest can quickly be brought into the desired state.
- As an effective induction process, for example, when the subject of interest is (by its attributes) a young woman, an induction process that displays an image of an idol that tends to be popular among young women can be selected as the communication process.
- Alternatively, in step S24, one induction process can be selected at random as the communication process from among the plurality of default induction processes prepared for the desired state, regardless of the subject of interest.
- On the other hand, if it is determined in step S22 that the subject of interest is a registered subject, the process proceeds to step S25, where it is determined whether or not a communication history, that is, a history of induction processes performed as communication processes to bring the subject of interest into a desired state, is registered in the subject table in association with the subject of interest (its subject information).
- If it is determined in step S25 that no communication history is registered for the subject of interest, the process proceeds to step S24, and the processing described above is performed.
- If it is determined in step S25 that a communication history is registered for the subject of interest, the process proceeds to step S26, where the control unit 11 performs learning using the communication history of the subject of interest to find an induction process effective for bringing the subject of interest into the desired state, and selects, based on the learning result, one induction process effective for bringing the subject of interest into the desired state as the communication process. The process then proceeds to step S27.
- In the communication history, each induction process performed as a communication process for the subject of interest is associated with the reaction of the subject of interest when that induction process was performed.
- In step S26, an induction process associated with a good reaction (a reaction in which the subject of interest entered the desired state) is obtained as an effective induction process.
- As the reaction associated with an induction process, the time from when the induction process (communication process) was performed until the subject of interest reached the predetermined state, and information representing the emotion of the subject of interest predicted from the subject's facial expression or the like when the induction process was performed, can be employed.
- For example, when the desired state is a smiling state, an induction process associated with a reaction in which the emotions of joy and fun (among joy, anger, sorrow, and pleasure) are strong and the time from the induction process to the smile is short is obtained by the learning in step S26 as an induction process associated with a good reaction.
- The learning in step S26 can also be performed in advance.
- As described above, by selecting the induction process to be performed as the communication process based on the communication history of the subject of interest, an induction process known to be effective from past communication with the subject of interest can be selected as the communication process.
- An effective induction process selected in this way is then performed as the communication process for the subject of interest, so that the subject of interest can quickly be brought into the desired state.
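One way to turn the reaction described above (time until the desired state, plus the emotion predicted from the facial expression) into a single "goodness" score is sketched below. The weighting, normalization, and parameter names are assumptions for illustration only; the source does not specify how the two signals are combined.

```python
def reaction_score(time_to_state, emotion_level, max_time=30.0):
    """Score a subject's reaction to an induction process.

    A reaction counts as good when the subject reached the desired state
    quickly (`time_to_state` in seconds) and the positive emotion predicted
    from the facial expression (`emotion_level`, 0.0-1.0) is strong.
    A subject that never reached the state (None) scores 0."""
    if time_to_state is None:
        return 0.0
    # Faster reactions score closer to 1; reactions slower than max_time
    # contribute nothing on the speed axis.
    speed = max(0.0, 1.0 - time_to_state / max_time)
    return 0.5 * speed + 0.5 * emotion_level
```

In the learning of step S26, the induction process whose recorded reactions score highest would be the one chosen as "effective" for the subject of interest.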
- For example, when the desired state is a smiling state, the communication process that most readily made the subject of interest smile among the communication processes performed on the subject of interest in the past is performed, so that the subject can be made to smile effectively.
- In step S27, the control unit 11 adds (registers) the trigger process selected as the communication process in step S24 or S26 to the subject table as part of the communication history of the subject of interest, and the process ends.
- Then, the communication process selected as described above is executed in step S15 of FIG. 6, and thereafter the reaction of the subject of interest to the trigger process performed as that communication process is associated with the trigger process and registered in the subject table.
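The selection of an effective trigger process from the communication history (steps S24 to S26) can be sketched as follows in Python. This is a minimal illustrative sketch: the history layout, the scoring rule (good reactions that reach the desired state quickly score higher), and the function name `select_trigger` are assumptions, not terms defined in this document.

```python
# Hypothetical sketch of steps S24-S26: names and data layout are illustrative.
from collections import defaultdict

def select_trigger(history, available_triggers, default_trigger):
    """Pick the trigger process whose past reactions were best.

    `history` is a list of (trigger, reached_desired_state, reaction_time)
    tuples recorded for the subject of interest (the subject table).
    A "good" reaction reaches the desired state quickly (small reaction time).
    """
    if not history:                      # step S24: no history registered yet
        return default_trigger
    scores = defaultdict(list)
    for trigger, reached, reaction_time in history:
        # Score 0 for failures; otherwise faster reactions score higher.
        scores[trigger].append(1.0 / (1.0 + reaction_time) if reached else 0.0)
    # step S26: learn which trigger is effective and select it.
    return max(available_triggers,
               key=lambda t: sum(scores[t]) / len(scores[t]) if scores[t] else 0.0)
```

Here the trigger process with the best average past reaction is chosen; when no history is registered for the subject, the default path of step S24 is taken.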
- FIG. 8 is a diagram for explaining examples of the desired state and the trigger process.
- Examples of the desired state that can be adopted include a state where the subject is smiling, a state where the subject is stationary, a state where the subject is facing the camera, a state where the subject's eyes are open, a state where the subject's mouth is closed, and a state where the subject's posture is good.
- Examples of the trigger process for bringing the subject into a smiling state include the display of a plurality of patterns of “laughing” messages that differ in output timing (output timing relative to the so-called shutter (exposure) timing, that is, the shooting timing), wording, and the like.
- Examples of the trigger process for bringing the subject into a stationary state include the display of a plurality of patterns of “stop” messages that differ in output timing, wording, and the like, and the display of a plurality of patterns of warnings that differ for each age group and gender.
- Examples of the trigger process for bringing the subject into a directly facing state include the display of a plurality of patterns of “look this way” messages, the output of a candy ball, warning sounds, the flashing of a light, and the like.
- Examples of the trigger process for bringing the subject into a state where the eyes are open include the display of a plurality of patterns of “open your eyes” messages that differ in output timing, wording, and the like.
- Examples of the trigger process for bringing the subject into a state where the mouth is closed include the display of a plurality of patterns of “close your mouth” messages that differ in output timing, wording, and the like.
- Examples of the trigger process for bringing the subject into a state of good posture include the display of a plurality of patterns of messages, such as a “tuck in your chin” message and a “straighten your back” message, that differ in output timing, wording, and the like.
- In step S14 of FIG. 6 (steps S24 and S26 of FIG. 7), the control unit 11 selects one trigger process as the communication process from among the trigger processes prepared for the desired state.
- Note that not only one trigger process but also a combination of two or more trigger processes can be selected as the communication process.
- For example, two or more trigger processes, such as a trigger process for displaying a “laughing” message of a predetermined pattern and a trigger process for outputting a gesture of a predetermined pattern that invites laughter, can be selected as the communication process.
- In this case, the two or more trigger processes serving as the communication process can be executed simultaneously or in time series.
- The contents of the trigger processes can be registered in the control unit 11 in advance.
- In addition, the communication unit 15 can communicate with an external server to acquire the contents of new trigger processes from the server, and the trigger processes registered in the control unit 11 can be updated with the contents of the new trigger processes.
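A trigger-process registry along these lines can be sketched as follows; the class name, method names, and server payload format are illustrative assumptions, not part of the document.

```python
# Illustrative sketch of a trigger-process registry that can be pre-registered
# in the control unit and later updated with new contents fetched from a server.
class TriggerRegistry:
    def __init__(self, initial):
        # desired state -> list of trigger-process descriptions
        self._triggers = {state: list(procs) for state, procs in initial.items()}

    def triggers_for(self, desired_state):
        return list(self._triggers.get(desired_state, []))

    def update_from_server(self, server_payload):
        # New contents acquired via the communication unit extend the
        # registered trigger processes (duplicates are skipped).
        for state, procs in server_payload.items():
            self._triggers.setdefault(state, [])
            for proc in procs:
                if proc not in self._triggers[state]:
                    self._triggers[state].append(proc)

    def select_combination(self, desired_state, n=2):
        # Two or more trigger processes may be combined into one communication
        # process and executed simultaneously or in time series.
        return tuple(self.triggers_for(desired_state)[:n])
```

The registry keeps one pool of trigger processes per desired state, which matches the per-state examples listed above.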
- FIG. 9 is a diagram illustrating an example of timing for performing the induction process.
- For example, when the desired state is a smiling state, the control unit 11 obtains a smile degree representing how much the subject shown in the captured image is smiling, and determines (detects), based on the smile degree, whether or not the subject is in the desired smiling state.
- FIG. 9 shows an example of the change in the degree of smile over time.
- The time from the start time t0 of a communication process performed in the past, such as the output (display) of a “laughing” message, to the time t1 at which the smile degree of the subject reaches its maximum (hereinafter referred to as the reaction time ΔT) can be included in the communication history.
- In the learning in step S26 of FIG. 7, the reaction time ΔT from the start of the communication process to the maximum smile degree is obtained for each subject or the like using the reaction times ΔT included in the communication history, and based on the obtained reaction time ΔT, the start timing of the communication process, such as the output timing of the “laughing” message, can be adjusted.
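The reaction time ΔT of FIG. 9, and its use for scheduling the shutter relative to the start of the communication process, can be sketched as follows; the sample layout and function names are illustrative assumptions.

```python
# Sketch of estimating the reaction time dT of FIG. 9 from a smile-degree
# time series, and of using it to schedule the shutter.
def reaction_time(samples):
    """samples: list of (time, smile_degree), with the communication process
    starting at the first sample's time t0. Returns dT = t1 - t0, where t1
    is the time at which the smile degree is maximal."""
    t0 = samples[0][0]
    t1 = max(samples, key=lambda s: s[1])[0]
    return t1 - t0

def shutter_time(message_start, avg_reaction_time):
    """Shoot when the learned average reaction time has elapsed from the
    start of the communication process (e.g. the 'laughing' message)."""
    return message_start + avg_reaction_time
```

In practice the averaging of ΔT could be done per subject, or per age group and gender, as the later description of reaction-time learning suggests.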
- FIG. 10 is a flowchart for explaining an example of processing for photographing a subject in a stationary state with a smile in the photographing apparatus of FIG. 1.
- The photographing apparatus of FIG. 1 can employ a plurality of states as the desired state and can photograph a subject that is in all of the plurality of desired states.
- FIG. 10 explains the process of photographing a subject that is smiling and stationary, adopting a smiling state and a stationary state (a state without motion blur) as the desired states.
- step S31 the photographing unit 13 photographs the subject, supplies a photographed image obtained by the photographing to the control unit 11, and the process proceeds to step S32.
- step S32 the control unit 11 detects the state of the subject using the captured image from the imaging unit 13, and the process proceeds to step S33.
- step S33 the control unit 11 determines whether the subject is in a smiling state (or is likely to be in a smiling state).
- If it is determined in step S33 that the subject is not in a smiling state (and is not likely to be in one), the process proceeds to step S34, where the communication processing unit 12 outputs, as the communication process, for example, a “laughing” message (by image display or audio output) as one of the trigger processes for inducing a smiling state, and the process returns to step S31.
- If it is determined in step S33 that the subject is in a smiling state, the process proceeds to step S35, where the photographing unit 13 photographs the subject in the smiling state and supplies the resulting captured image to the control unit 11, and the process proceeds to step S36.
- In step S36, the control unit 11 determines whether or not there is motion blur in the subject shown in the captured image supplied from the photographing unit 13 in step S35.
- If it is determined in step S36 that the subject shown in the captured image has motion blur, the process proceeds to step S37, where the control unit 11 determines whether or not captured images in which the subject has motion blur have been shot N times consecutively (N being, for example, a preset number of times).
- If it is determined in step S37 that captured images with motion blur in the subject have not yet been shot N times consecutively, that is, if the number of times such captured images have been shot consecutively is less than N, the process returns to step S31, and the same processing is repeated.
- If it is determined in step S37 that captured images with motion blur have been shot N times consecutively, that is, if the subject keeps moving without becoming stationary and a captured image without motion blur cannot be obtained even after waiting for a certain amount of time, the process proceeds to step S38, where the communication processing unit 12 outputs, as the communication process, for example, a “stop” message as one of the trigger processes for inducing a stationary state, and the process returns to step S31.
- If it is determined in step S36 that the subject in the captured image has no motion blur, that is, if a smiling subject without motion blur has been photographed, the process proceeds to step S39, where the control unit 11 records the captured image of the smiling subject without motion blur in the recording unit 14, and the process ends.
- As described above, in FIG. 10, a “stop” message is output as the communication process for inducing the subject into a stationary state.
- The threshold N that triggers the output of the “stop” message can be set to a predetermined number of times in advance, or can be a variable number based on the communication history; for example, the number can be made smaller than the default number.
- Furthermore, the threshold N can be set to 0, for example, so that the “stop” message is output immediately when a captured image with motion blur is captured, that is, immediately whenever a captured image without motion blur cannot be obtained.
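The loop of FIG. 10 (steps S31 to S39) can be sketched as follows, assuming hypothetical `camera` and `messenger` objects whose interfaces (`shoot`, `is_smiling`, `has_motion_blur`, `output`) are illustrative, not part of the document.

```python
# Minimal sketch of the FIG. 10 flow: shoot until the subject is both smiling
# and free of motion blur, issuing trigger-process messages along the way.
def shoot_smiling_still(camera, messenger, n_threshold, max_frames=100):
    blur_streak = 0
    for _ in range(max_frames):
        image = camera.shoot()                      # steps S31/S35
        if not camera.is_smiling(image):            # step S33
            messenger.output("laughing message")    # step S34
            blur_streak = 0
            continue
        if camera.has_motion_blur(image):           # step S36
            blur_streak += 1                        # step S37
            if blur_streak >= n_threshold:          # N consecutive blurred shots
                messenger.output("stop message")    # step S38
                blur_streak = 0
            continue
        return image                                # step S39: record and finish
    return None
```

Setting `n_threshold=0` or `1` reproduces the variant in which the “stop” message is output as soon as a motion-blurred image is captured.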
- FIG. 11 is a flowchart for explaining an example of the photographing process, performed by the photographing apparatus of FIG. 1, in which photographing control is performed in accordance with the communication process.
- In FIG. 10, when a captured image with motion blur is captured, the image is captured again with the same settings of the photographing unit 13 (shutter speed (exposure time), aperture, shooting timing, and the like) as when the motion-blurred image was captured.
- In contrast, in the photographing process of FIG. 11, the control unit 11 performs, for example, photographing control to increase the shutter speed in response to the output of the “stop” message, and the photographing unit 13 photographs the subject at the increased shutter speed in accordance with that photographing control.
- FIG. 11 is a flowchart for explaining the above photographing process.
- In step S51, the control unit 11 selects one trigger process as the communication process from among the plurality of trigger processes, controls the communication processing unit 12 to execute the communication process, and the process proceeds to step S52.
- the selection of the communication process in step S51 can be performed as described in FIG. 7, for example.
- Further, not only one trigger process but also two or more trigger processes can be selected as the communication process.
- step S52 the communication processing unit 12 executes the communication process selected in step S51 according to the control of the control unit 11, and the process proceeds to step S53.
- In step S53, the control unit 11 performs photographing control for controlling the photographing unit 13 in accordance with the communication process executed by the communication processing unit 12, and the photographing unit 13 photographs the subject according to the photographing control.
- step S53 the photographing unit 13 supplies a photographed image obtained as a result of photographing the subject to the control unit 11, and the process proceeds to step S54.
- step S54 the control unit 11 causes the recording unit 14 to record the captured image from the imaging unit 13 and ends the process.
- As described above, by performing the photographing control in accordance with the communication process and photographing the subject according to that photographing control, the subject can be photographed appropriately.
- Hereinafter, as specific examples of the photographing process in which photographing control is performed in accordance with the communication process and the subject is photographed according to that photographing control, the cases of photographing a subject without motion blur, photographing a subject in a jumping state, and photographing a subject in a directly facing state will be described.
- FIG. 12 is a flowchart for explaining an example of photographing processing when photographing a subject in a state without motion blur.
- step S61 the photographing unit 13 photographs the subject, supplies a photographed image obtained as a result to the control unit 11, and the process proceeds to step S62.
- In step S62, the control unit 11 determines whether or not there is motion blur in the subject shown in the captured image supplied from the photographing unit 13.
- If it is determined in step S62 that there is motion blur in the subject in the captured image, that is, if the subject in the captured image is blurred because the subject is moving, the process proceeds to step S63.
- In step S63, the control unit 11 selects, for example, the output of a “stop” message as one of the trigger processes for inducing a stationary state, controls the communication processing unit 12 to execute it as the communication process, and the process proceeds to step S64.
- step S64 the communication processing unit 12 outputs a “stop” message as communication processing in accordance with the control of the control unit 11, and the processing proceeds to step S65.
- step S65 the control unit 11 performs shooting control of the shooting unit 13 so as to increase the shutter speed in response to the output of the “stop” message as the communication process performed in step S64.
- Further, in step S65, the photographing unit 13 photographs the subject according to the photographing control of the control unit 11, for example, by raising the shutter speed above the reference value used in normal auto exposure, supplies the resulting captured image to the control unit 11, and the process proceeds to step S66.
- If it is determined in step S62 that there is no motion blur in the subject in the captured image, the process proceeds to step S66.
- step S66 the control unit 11 causes the recording unit 14 to record the captured image from the imaging unit 13 and ends the process.
- As described above, in FIG. 12, in order to capture an image without motion blur, the communication processing unit 12 outputs a “stop” message as the communication process, photographing control for increasing the shutter speed is performed in accordance with the output of the “stop” message, and the photographing unit 13 photographs the subject at the increased shutter speed according to that photographing control.
- step S65 in addition to shooting control for increasing the shutter speed, for example, shooting control for adjusting the shooting timing can be performed.
- In this case, the photographing control of the photographing unit 13 can be performed so that photographing takes place at the timing when the reaction time obtained as a result of learning has elapsed from the output of the “stop” message.
- Note that the learning of the reaction time can be performed for each user who is a subject, or by classifying subjects according to age group, gender, or the like, or without such classification.
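The shooting control of FIG. 12 can be sketched as follows; the settings dictionary, the halving of the exposure time, and the function name are illustrative assumptions rather than values from the document.

```python
# Sketch of the FIG. 12 shooting control: when motion blur is detected, a
# "stop" message is output and the shutter speed is raised above the normal
# auto reference; the shooting time may also use a learned reaction time.
def control_for_motion_blur(settings, blurred, reaction_time=None, now=0.0):
    """settings: {'shutter_speed': exposure time in seconds}. Returns the new
    settings and the time at which to shoot after the 'stop' message."""
    new = dict(settings)
    shoot_at = now
    if blurred:                              # steps S62-S65
        # Halve the exposure time, i.e. raise the shutter speed (illustrative).
        new["shutter_speed"] = settings["shutter_speed"] / 2.0
        if reaction_time is not None:
            # Shoot when the learned reaction time to "stop" has elapsed.
            shoot_at = now + reaction_time
    return new, shoot_at
```

Unlike the FIG. 10 loop, which re-shoots with unchanged settings, this control changes the exposure in response to the communication process.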
- FIG. 13 is a flowchart for explaining an example of photographing processing when photographing a subject in a jumping state.
- In step S71, the control unit 11 selects, for example, the output of a “jump” message as one of the trigger processes for inducing a jumping state, controls the communication processing unit 12 to execute it as the communication process, and the process proceeds to step S72.
- step S72 the communication processing unit 12 outputs a “jump” message as a communication process under the control of the control unit 11, and the process proceeds to step S73.
- In step S73, in response to the output of the “jump” message as the communication process performed in step S72, the control unit 11 performs photographing control of the photographing unit 13 so as to set the shooting mode of the photographing unit 13 to a still-image continuous shooting mode or a moving-image high frame rate mode.
- Further, in step S73, the photographing unit 13 sets the shooting mode to the still-image continuous shooting mode or the moving-image high frame rate mode in accordance with the photographing control of the control unit 11, photographs the subject, supplies the resulting captured image to the control unit 11, and the process proceeds to step S74.
- step S74 the control unit 11 causes the recording unit 14 to record the captured image from the imaging unit 13 and ends the process.
- examples of the shooting mode of the shooting unit 13 include a moving image mode for shooting a moving image and a still image mode for shooting a still image.
- The moving image mode includes, for example, a normal frame rate mode for capturing a moving image at a normal frame rate, which is a predetermined frame rate, and a high frame rate mode for capturing a moving image at a high frame rate, which is higher than the normal frame rate.
- The still image mode includes, for example, a mode for shooting a single still image, a mode for continuously shooting still images, and a mode for performing interval shooting (time-lapse shooting).
- In FIG. 13, by performing photographing control that sets the shooting mode of the photographing unit 13 to the still-image continuous shooting mode or the moving-image high frame rate mode, a jumping subject can be photographed appropriately.
- That is, in FIG. 13, in response to the output of the “jump” message as the communication process, photographing control is performed to set the shooting mode to the still-image continuous shooting mode or the moving-image high frame rate mode, and shooting is performed in that continuous shooting mode or high frame rate mode.
- Therefore, when the subject jumps in response to the output of the “jump” message, the jumping subject is photographed in the still-image continuous shooting mode or the moving-image high frame rate mode, so that captured images capturing every moment of the jumping subject can be obtained.
- As the photographing control performed in response to the output of the “jump” message as the communication process, in addition to the setting of the shooting mode, the adjustment of the shooting timing, the control of effects applied to captured images, the control of the movement of the photographing apparatus, the control of the shooting direction of the photographing unit 13, and the like can be employed.
- For example, the shooting timing can be controlled so that shooting is performed at the timing when the subject reaches the highest point of the jump.
- Further, the effect can be controlled so that, for example, an effect of blurring the background to emphasize the subject's jump is applied.
- Further, when the photographing apparatus is a movable pet-type robot as shown in FIG. 5, the photographing apparatus can be controlled to move so that a panning shot is performed in accordance with the jump of the subject.
- Further, when the photographing apparatus is a pet-type robot, the shooting direction of the photographing unit 13 can be adjusted by moving the head 51 containing the photographing unit 13, so that the shooting direction follows the subject and a panning shot is performed in accordance with the jump of the subject.
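The mode switch and apex timing for a jumping subject (FIG. 13) can be sketched as follows; the mode labels and the height-series interface are illustrative assumptions.

```python
# Sketch of the FIG. 13 photographing control for a jumping subject: choose
# burst or high-frame-rate capture, and locate the apex of the jump, e.g. to
# adjust the shooting timing or to pick the best frame from a burst.
def jump_controls(prefer_stills=True):
    """Step S73: switch the shooting mode in response to the 'jump' message."""
    return "still_continuous" if prefer_stills else "movie_high_frame_rate"

def apex_index(heights):
    """heights: per-frame estimates of the subject's height above the ground.
    Returns the index of the frame where the subject is at the jump's apex."""
    return max(range(len(heights)), key=lambda i: heights[i])
```

The apex selection corresponds to the option of controlling the shooting timing so that shooting happens at the highest point of the jump.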
- FIG. 14 is a flowchart for explaining an example of the photographing process in the case of photographing a subject in a directly facing state.
- step S81 the photographing unit 13 photographs the subject, supplies the photographed image obtained as a result to the control unit 11, and the process proceeds to step S82.
- In step S82, the control unit 11 determines whether or not the subject shown in the captured image supplied from the photographing unit 13 is turned sideways rather than directly facing the camera.
- If it is determined in step S82 that the subject is not directly facing, that is, if the subject (its face) is turned sideways, the process proceeds to step S83.
- In step S83, the control unit 11 selects, for example, the output of a “look this way” message as one of the trigger processes for inducing the subject to face the camera, controls the communication processing unit 12 to execute it as the communication process, and the process proceeds to step S84.
- In step S84, the communication processing unit 12 outputs a “look this way” message as the communication process in accordance with the control of the control unit 11, and the process proceeds to step S85.
- In step S85, in response to the output of the “look this way” message as the communication process performed in step S84, the control unit 11 starts detecting the movement of the subject's face and performs photographing control of the photographing unit 13 so that the timing after the deceleration of the face movement is confirmed becomes the shooting timing.
- Further, in step S85, the photographing unit 13 photographs the subject at the shooting timing according to the photographing control of the control unit 11, supplies the resulting captured image to the control unit 11, and the process proceeds to step S86.
- If it is determined in step S82 that the subject in the captured image is directly facing the camera (not sideways), the process proceeds to step S86.
- step S86 the control unit 11 causes the recording unit 14 to record the captured image from the imaging unit 13 and ends the process.
- As described above, in FIG. 14, in order to photograph a directly facing subject, the communication processing unit 12 outputs a “look this way” message as the communication process, photographing control is performed with the timing after the deceleration of the face movement is confirmed as the shooting timing, and the photographing unit 13 photographs the subject at that shooting timing.
- Therefore, after the subject starts to turn its face toward the shooting direction (of the photographing unit 13), shooting is performed at the timing when the deceleration of the face movement, which occurs as the face comes to face the shooting direction, is confirmed, that is, at the timing when the subject faces (almost) in the shooting direction.
- As a result, the subject can be photographed appropriately; that is, a captured image showing the subject in a directly facing state can be obtained.
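The timing decision of FIG. 14 (shoot after the deceleration of the face movement is confirmed) can be sketched as follows; the yaw-angle samples and the speed threshold are illustrative assumptions.

```python
# Sketch of the FIG. 14 timing: track the per-frame angular speed of the
# subject's face as it turns toward the camera, and shoot once the movement
# has decelerated (the face is settling into the facing state).
def facing_shot_index(yaw_angles, settle_speed=2.0):
    """yaw_angles: per-frame face yaw in degrees (0 = facing the camera).
    Returns the first frame index after deceleration is confirmed, i.e. the
    turn has slowed below `settle_speed` after having been faster."""
    was_moving = False
    for i in range(1, len(yaw_angles)):
        speed = abs(yaw_angles[i] - yaw_angles[i - 1])
        if speed > settle_speed:
            was_moving = True
        elif was_moving:          # movement started, then decelerated: shoot
            return i
    return None
```

Returning `None` when no turn was observed corresponds to the branch in which the subject was already facing the camera and no trigger process is needed.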
- FIG. 15 is a flowchart for explaining an example of the process of selecting one trigger process as the communication process from among a plurality of trigger processes in step S51 of FIG. 11.
- step S91 the control unit 11 determines whether or not there are a plurality of subjects based on a photographed image (as a so-called through image) photographed by the photographing unit 13.
- If it is determined in step S91 that there are not a plurality of subjects, that is, that there is one subject, the process proceeds to step S92, where the control unit 11 selects one of the trigger processes for a single subject from among the plurality of trigger processes as the communication process, and the process ends.
- That is, the plurality of trigger processes include trigger processes for a single subject, used for communicating with one subject, and trigger processes for multiple subjects, used for communicating with a plurality of subjects.
- Further, the trigger processes for multiple subjects include trigger processes for the case where a main subject, that is, the subject of primary interest, exists among the plurality of subjects; these in turn include trigger processes for communicating with the main subject and trigger processes for communicating with subjects other than the main subject.
- In step S92, one of the trigger processes for a single subject described above is selected as the communication process from among the plurality of trigger processes.
- If it is determined in step S91 that there are a plurality of subjects, the process proceeds to step S93, where the control unit 11 determines the relationship between the plurality of subjects shown in the captured image based on, for example, the captured image photographed by the photographing unit 13, and the process proceeds to step S94.
- The relationship between the plurality of subjects can be determined based on information about the subjects (for example, a parent-child relationship between a father (or mother) and a son (or daughter), a friendship, and the like) input by the user who is the photographer.
- Alternatively, the control unit 11 can determine the relationship based on the result of estimating the age group and gender of the subjects shown in the captured image, or based on the result of learning the states of the plurality of subjects shown in the captured image (for example, a state in which one subject is holding another, a state in which the plurality of subjects are lined up, and the like).
- step S94 the control unit 11 determines whether or not a main subject is present among a plurality of subjects based on the relationship between the subjects.
- In step S94, for example, when the relationship between the plurality of subjects is a friendship, it is presumed that the purpose is to photograph all of the plurality of subjects who are friends, so it can be determined that no main subject exists.
- Further, in step S94, for example, when the relationship between the plurality of subjects is a parent-child relationship between a father and a child, it is presumed that the purpose is to photograph the child, so it can be determined that the child is the main subject.
- Note that a child priority mode that gives priority to children can be provided as an operation mode of the photographing apparatus, and the child can be determined to be the main subject when the user (photographer) performs an operation to set the operation mode to the child priority mode.
- If it is determined in step S94 that no main subject exists among the plurality of subjects, the process proceeds to step S95, where the control unit 11 selects, as the communication process, one of the trigger processes for multiple subjects other than those for the case where a main subject exists, and the process ends.
- If it is determined in step S94 that a main subject exists among the plurality of subjects, the control unit 11 selects, as the communication process, one of the trigger processes for the main subject or one of the trigger processes for communicating with subjects other than the main subject, in accordance with the relationship between the plurality of subjects, and the process ends.
- According to the communication process selection of FIG. 15, for example, when the relationship between the plurality of subjects is a parent-child relationship between a father and a child, it is determined that the child exists as the main subject.
- In this case, the trigger processes for the main subject include not only processes that directly induce a smile, such as the output of a “laughing” message, but also the output of messages that indirectly induce a smile according to the relationship between the plurality of subjects, such as the output of a “You're with your dad” message, and the output of content suitable for the child who is the main subject.
- Then, the output of the “You're with your dad” message, which is one of the trigger processes for the main subject, can be selected as the communication process; in this case, the message is output to the child who is the main subject.
- Further, the trigger processes for communicating with subjects other than the main subject include, for example, the output of a guidance instruction message instructing a subject other than the main subject to guide the child who is the main subject into a predetermined state; as the guidance instruction message, for example, a message instructing the parent, who is not the main subject, to guide the child into an appropriate state such as a smiling state by, for example, amusing the child can be adopted.
- Then, the output of such a guidance instruction message, which is one of the trigger processes for communicating with subjects other than the main subject, can be selected as the communication process; in this case, a guidance instruction message instructing the parent, who is not the main subject, to amuse the child who is the main subject and guide the child into an appropriate state such as a smiling state is output.
- Note that the guidance instruction message can be displayed by the communication processing unit 12 as, for example, a so-called cue card.
- As described above, the trigger process to be used as the communication process can be selected in consideration of the relationship between a plurality of subjects, such as a parent and a child.
- Further, photographing control such as movement of the pet-type robot serving as the photographing apparatus and adjustment of the shooting direction of the photographing unit 13 can also be performed.
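The branch structure of FIG. 15 can be sketched as follows; the trigger-pool labels and the relationship strings (`"friends"`, `"parent_child"`) are illustrative assumptions.

```python
# Sketch of the FIG. 15 selection: one trigger pool for a single subject,
# another for multiple subjects, and main-subject handling based on the
# relationship between the subjects (or a child priority mode).
def select_for_scene(num_subjects, relationship=None, child_priority=False):
    if num_subjects == 1:                       # steps S91-S92
        return "single_subject_trigger"
    # steps S93-S94: infer whether a main subject exists from the relationship
    main_subject_exists = relationship == "parent_child" or child_priority
    if not main_subject_exists:                 # e.g. a group of friends
        return "multi_subject_trigger"
    # A main subject (e.g. the child) exists: communicate with the main
    # subject, or with the others (e.g. a guidance instruction to the parent).
    return "main_subject_trigger"
```

A fuller implementation would then pick a concrete trigger process from the returned pool, for example using the history-based selection sketched earlier.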
- FIG. 16 is a flowchart for explaining an example of the imaging control performed in accordance with the communication process in step S53 of FIG.
- For example, photographing control can be performed in accordance with the reaction taken by the subject to the communication process.
- FIG. 16A is a flowchart illustrating a first example of imaging control performed in accordance with subject reaction to communication processing.
- step S101 the control unit 11 detects a subject that has reacted to the communication process, and the process proceeds to step S102.
- In step S102, the control unit 11 performs photographing control for framing the subject that reacted to the communication process or for focusing on that subject, and ends the process.
- FIG. 16B is a flowchart for explaining a second example of the imaging control performed in accordance with the reaction of the subject with respect to the communication process.
- step S111 the control unit 11 performs shooting control for applying an effect corresponding to the reaction taken by the subject to the communication process, and ends the process.
- As the effect corresponding to the reaction taken by the subject to the communication process, for example, when the subject takes a jumping reaction, an effect that shows the jump effectively can be adopted.
- Further, as an effect that recalls a pleasant emotion, for example, an overlay of illustrations such as flowers or stars, or a pink background, can be adopted.
- FIG. 16C is a flowchart for explaining a third example of the imaging control performed in accordance with the reaction of the subject with respect to the communication process.
- step S121 the control unit 11 performs shooting control for setting a shooting mode corresponding to the reaction taken by the subject with respect to the communication process, and ends the process.
- As the shooting mode corresponding to the reaction taken by the subject to the communication process, for example, the moving-image normal frame rate mode can be adopted when the subject takes an interesting motion reaction, and the still-image continuous shooting mode can also be adopted.
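The reaction-to-control mapping of FIG. 16 can be sketched as follows; the reaction labels and the control dictionaries are illustrative assumptions, not terms defined by the document.

```python
# Sketch of FIG. 16: map the reaction the subject takes to the communication
# process onto a photographing control (framing/focus, effect, or mode).
def control_for_reaction(reaction):
    if reaction == "jump":
        # B of FIG. 16: apply an effect that shows the jump effectively
        return {"effect": "emphasize_jump"}
    if reaction == "interesting_motion":
        # C of FIG. 16: record the motion as a normal-frame-rate movie
        return {"mode": "movie_normal_frame_rate"}
    if reaction == "momentary":
        return {"mode": "still_continuous"}
    # A of FIG. 16: default to framing and focusing on the reacting subject
    return {"framing": "reacting_subject", "focus": "reacting_subject"}
```

The default branch corresponds to the first example (framing or focusing on the subject that reacted); the others correspond to the effect and shooting-mode examples.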
- FIG. 17 is a flowchart for explaining an example of the process of recording a captured image performed by the recording unit 14 in step S54 of FIG. 11 (and likewise in step S66 of FIG. 12, step S74 of FIG. 13, step S86 of FIG. 14, step S16 of FIG. 6, and step S39 of FIG. 10).
- In step S141, the recording unit 14 records the captured image supplied from the control unit 11 in association with the communication information related to the communication process performed when the captured image was captured, and ends the process.
- the communication information related to the communication process for example, information representing the content (category) of the communication process can be employed.
- Specifically, for example, when a “laughing” message is output as the communication process, a tag representing the “laughing” message can be adopted as the communication information.
- the communication information as described above can be recorded as metadata of captured images, and such communication information can be used when organizing (classifying) captured images.
- When the communication process is a display of an image (a moving image or a still image), that image (the image displayed in the communication process itself, a reduced version of it, a symbolized version of it, etc.) can be adopted as the communication information.
- Further, a video of the user's reaction (hereinafter also referred to as a reaction video) can be shot while a video is displayed in the communication process, and a video in which the reaction video is appended immediately after the video displayed in the communication process can be used as the communication information.
- When the communication process is a display of a still image and the still image displayed in the communication process is adopted as the communication information, the still image (or a reduced version of it) serving as the communication information associated with a captured image can be displayed as an overlay on the captured image, so that the still image displayed in the communication process performed when the captured image was captured can be confirmed.
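- Overlaying a reduced version of the communication-information still image on a captured image, as just described, involves placing a thumbnail at a fixed corner. The following sketch computes such a placement; the corner choice, margin, and all sizes are illustrative assumptions.

```python
# Hypothetical sketch: compute where to draw a reduced still image
# (thumbnail) as an overlay on a displayed captured image. The bottom-right
# placement and the 16-pixel margin are illustrative assumptions.

def overlay_rect(image_w: int, image_h: int,
                 thumb_w: int, thumb_h: int, margin: int = 16) -> tuple:
    """Return (x, y, w, h) of the thumbnail overlay, bottom-right corner."""
    x = image_w - thumb_w - margin
    y = image_h - thumb_h - margin
    return (x, y, thumb_w, thumb_h)
```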
- FIG. 18 is a flowchart for explaining an example of a process of learning processing rules for the communication process and the shooting control.
- In the imaging apparatus, processing rules for the communication process and the shooting control can be learned using a communication history, which is a history in which communication processes executed in the past are associated with the subject's reactions to those processes, and subsequent communication processing and shooting control can be performed according to the processing rules obtained as a result of the learning.
- FIG. 18 illustrates an example of such processing-rule learning (hereinafter also referred to as the rule learning process).
- The rule learning process can be performed, for example, after a communication process has been executed.
- In step S151, the control unit 11 associates the communication process executed by the communication processing unit 12 with the subject's reaction (expression, action, etc.) to that process, records the pair as, for example, a communication history, and the process proceeds to step S152.
- In step S152, using the communication history, the control unit 11 learns (updates) the processing rules for the communication process (including the triggering process selected as the communication process) and for the shooting control, and ends the process.
- Examples of processing rules for the communication process include a first rule for determining the content to be output in the communication process, a second rule for determining the output method by which the content is output in the communication process, and a third rule for determining the output timing at which the content is output in the communication process.
- An example of a processing rule for the shooting control is a fourth rule for determining the shooting timing.
- In the rule learning process, the first rule can be updated so that content yielding a higher smile degree of the subject is determined as the content to be output in the communication process.
- According to the first rule updated in the rule learning process, content yielding a higher smile degree of the subject can be output in the communication process. Different content can also be output depending on whether the subject is a single person or a plurality of persons.
- The second rule can be updated so that, of the content output methods by sound and by image, the output method with a higher reaction degree, which represents how good the subject's reaction is, is determined as the method for outputting the content in the communication process.
- According to the second rule updated in the rule learning process, the content can be output in the communication process by whichever of sound and image elicits the better reaction from the subject.
- The third rule can be updated so that, for example, a "stop" message is output at an earlier timing for a subject that is likely to cause motion blur.
- According to the third rule updated in the rule learning process, a "stop" message can be output early for a subject prone to motion blur, inducing the subject to become still sooner.
- The fourth rule can be updated so that the shooting timing is determined in consideration of the time from when the content is output in the communication process until the subject reacts.
- According to the fourth rule updated in the rule learning process, the shooting timing can be controlled in consideration of the time from the output of the content in the communication process until the subject reacts.
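- The update of the first rule from the communication history, described above, can be sketched as choosing the content with the highest average smile degree observed so far. This is an illustrative sketch only: the content names, smile-degree values, and the averaging scheme are hypothetical assumptions.

```python
# Hypothetical sketch of the rule learning process (steps S151-S152): the
# communication history associates each executed communication process with
# the observed smile degree, and the first rule is updated to prefer the
# content with the highest average smile degree. All values are illustrative.

def learn_first_rule(history: list) -> str:
    """Return the content with the highest average smile degree."""
    totals, counts = {}, {}
    for content, smile_degree in history:
        totals[content] = totals.get(content, 0.0) + smile_degree
        counts[content] = counts.get(content, 0) + 1
    return max(totals, key=lambda c: totals[c] / counts[c])

history = [
    ("funny_song", 0.9), ("dance", 0.4),
    ("funny_song", 0.7), ("dance", 0.6),
]
```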
- FIG. 19 is a block diagram illustrating a configuration example of another embodiment of a photographing apparatus to which the present technology is applied.
- The imaging apparatus of FIG. 19 is common to that of FIG. 1 in that it includes the control unit 11 through the position information acquisition unit 16. However, the imaging apparatus of FIG. 19 differs from that of FIG. 1 in that it additionally includes a movement control unit 17 and a moving unit 18.
- the movement control unit 17 performs movement control for moving the photographing apparatus by driving the moving unit 18 according to the control of the control unit 11.
- The moving unit 18 is a mechanism that moves the imaging apparatus, such as the legs of the pet-type robot shown in FIG. 5, wheels that make the imaging apparatus travel like an automobile, a propeller (rotor) that makes the imaging apparatus fly like a helicopter, or a screw propeller that makes the imaging apparatus travel like a motorboat or submarine.
- the moving unit 18 is driven according to the movement control of the movement control unit 17, thereby moving the imaging apparatus.
- Through the movement control of the movement control unit 17, it is possible to perform shooting control that adjusts the position and shooting direction of the imaging apparatus during shooting by moving the imaging apparatus.
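- Adjusting the shooting direction by driving the moving unit can be sketched as a simple feedback loop that turns the apparatus toward the subject. This is an illustrative sketch only: the class interfaces, the proportional-control gain, and the bearing values are hypothetical assumptions, not taken from the patent.

```python
# Hypothetical sketch of movement control: the movement control unit drives
# the moving unit so that the apparatus's heading converges toward the
# subject's bearing. The gain and interfaces are illustrative assumptions.

class MovingUnit:
    """Stand-in for legs, wheels, a rotor, or a screw propeller."""
    def __init__(self) -> None:
        self.heading = 0.0  # degrees

    def rotate(self, delta: float) -> None:
        self.heading += delta

class MovementController:
    def __init__(self, unit: MovingUnit, gain: float = 0.5) -> None:
        self.unit = unit
        self.gain = gain

    def step_toward(self, subject_bearing: float) -> None:
        """Turn a fraction of the remaining angle toward the subject."""
        error = subject_bearing - self.unit.heading
        self.unit.rotate(self.gain * error)

unit = MovingUnit()
ctrl = MovementController(unit)
for _ in range(3):
    ctrl.step_toward(40.0)  # heading converges toward 40 degrees
```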
- FIG. 20 shows a configuration example of an embodiment of a computer in which a program for executing the series of processes described above is installed.
- the program can be recorded in advance in a hard disk 105 or a ROM 103 as a recording medium built in the computer.
- the program can be stored (recorded) in the removable recording medium 111.
- Such a removable recording medium 111 can be provided as so-called packaged software.
- examples of the removable recording medium 111 include a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a magnetic disc, and a semiconductor memory.
- The program can be installed on the computer from the removable recording medium 111 as described above, or it can be downloaded to the computer via a communication network or a broadcast network and installed on the built-in hard disk 105. That is, the program can be transferred wirelessly from a download site to the computer via an artificial satellite for digital satellite broadcasting, or transferred to the computer by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer includes a CPU (Central Processing Unit) 102, and an input / output interface 110 is connected to the CPU 102 via the bus 101.
- The CPU 102 executes a program stored in a ROM (Read Only Memory) 103 accordingly.
- the CPU 102 loads a program stored in the hard disk 105 into a RAM (Random Access Memory) 104 and executes it.
- By executing the program, the CPU 102 performs the processing according to the flowcharts described above or the processing performed by the configurations in the block diagrams described above. The CPU 102 then, as necessary, outputs the processing result from the output unit 106 via the input / output interface 110, transmits it from the communication unit 108, or records it on the hard disk 105, for example.
- the input unit 107 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 106 includes an LCD (Liquid Crystal Display), a speaker, and the like.
- The processing performed by the computer according to the program does not necessarily have to be performed in time series in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processing executed in parallel or individually (for example, parallel processing or object-based processing).
- The program may be processed by one computer (processor) or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed there.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- When one step includes a plurality of processes, the plurality of processes can be executed by one apparatus or shared and executed by a plurality of apparatuses.
- The present technology can also be configured as follows.
- <1> An imaging apparatus including: an imaging unit that captures an image of a subject; a communication control unit that controls a communication process for communicating with the subject; and an imaging control unit that performs imaging control for controlling imaging by the imaging unit in accordance with the communication process.
- <2> The imaging apparatus according to <1>, in which, as the communication process, an induction process for inducing the subject to enter a predetermined state is performed.
- <3> The imaging apparatus according to <1> or <2>, in which control of an imaging mode is performed as the imaging control.
- <4> The imaging apparatus according to <3>, having, as the imaging mode, two or more of: a mode for capturing a single still image; a mode for continuously capturing still images; a mode for time-lapse imaging; a mode for capturing a moving image at a normal frame rate; and a mode for capturing a moving image at a high frame rate.
- ⁇ 5> The imaging apparatus according to ⁇ 1> or ⁇ 2>, wherein the induction process for inducing the subject to take a predetermined action is performed as the communication process.
- ⁠<6> The photographing apparatus according to ⁠<1>, ⁠<2>, or ⁠<5>, in which the shutter speed of photographing by the photographing unit is controlled as the photographing control.
- ⁇ 7> The imaging apparatus according to ⁇ 1>, ⁇ 2>, or ⁇ 5>, wherein the imaging timing of the imaging unit is controlled as the imaging control.
- ⁇ 8> The imaging apparatus according to ⁇ 1>, ⁇ 2>, or ⁇ 5>, wherein as the imaging control, effect control is performed on a captured image captured by the imaging unit.
- ⁇ 9> The photographing apparatus according to ⁇ 1>, ⁇ 2>, or ⁇ 5>, wherein one or both of movement control of the photographing apparatus and control of a photographing direction of the photographing unit is performed as the photographing control.
- ⁇ 10> The imaging device according to any one of ⁇ 1> to ⁇ 9>, wherein the induction processing for inducing the subject to take a predetermined pose is performed as the communication processing.
- <11> The imaging apparatus according to <10>, in which, as the communication process, the induction process for inducing the subject to smile, the induction process for inducing the subject to stand still, or the induction process for inducing the subject to jump is performed.
- <12> The imaging apparatus according to any one of <1> to <9>, in which, as the communication process, the induction process for inducing the imaging of the subject to be performed in a predetermined orientation is performed.
- the imaging device according to ⁇ 12>, wherein the inducing process for inducing the subject to face the subject is performed as the communication process.
- <14> The imaging apparatus according to any one of <1> to <13>, in which, when a plurality of subjects are present and a main subject exists among the plurality of subjects, the communication control unit causes a communication process for communicating with the main subject to be performed.
- <15> The imaging apparatus according to any one of <1> to <14>, in which the imaging control unit performs the imaging control in accordance with a subject that reacted to the communication process or a reaction taken by the subject to the communication process.
- ⁠<16> The imaging apparatus according to any one of ⁠<1> to ⁠<15>, further including a recording unit that records communication information related to the communication process performed at the time of imaging by the imaging unit in association with a captured image captured by the imaging unit.
- ⁠<17> The imaging apparatus according to any one of ⁠<1> to ⁠<16>, in which an executed communication process and the subject's reaction to that communication process are recorded in association with each other as a communication history, and the communication process or the imaging control is performed based on the communication history.
- <18> The imaging apparatus according to any one of <1> to <17>, further including a communication processing unit that performs the communication process.
- <19> An imaging method including a step of performing imaging control for controlling imaging, by an imaging unit that captures an image of a subject, in accordance with a communication process for communicating with the subject.
- <20> A program for causing a computer to function as: a communication control unit that controls a communication process for communicating with a subject; and an imaging control unit that performs imaging control for controlling imaging, by an imaging unit that captures an image of the subject, in accordance with the communication process.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
- Exposure Control For Cameras (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
Abstract
Description
An imaging unit that captures an image of a subject;
a communication control unit that controls a communication process for communicating with the subject; and
an imaging control unit that performs imaging control for controlling imaging by the imaging unit in accordance with the communication process;
an imaging apparatus including the above.
<2>
The imaging apparatus according to <1>, in which, as the communication process, an induction process for inducing the subject to enter a predetermined state is performed.
<3>
The imaging apparatus according to <1> or <2>, in which control of an imaging mode is performed as the imaging control.
<4>
The imaging apparatus according to <3>, having, as the imaging mode, two or more of: a mode for capturing a single still image; a mode for continuously capturing still images; a mode for time-lapse imaging; a mode for capturing a moving image at a normal frame rate; and a mode for capturing a moving image at a high frame rate.
<5>
The imaging apparatus according to <1> or <2>, in which, as the communication process, the induction process for inducing the subject to take a predetermined action is performed.
<6>
The imaging apparatus according to <1>, <2>, or <5>, in which control of the shutter speed of imaging by the imaging unit is performed as the imaging control.
<7>
The imaging apparatus according to <1>, <2>, or <5>, in which control of the imaging timing of the imaging unit is performed as the imaging control.
<8>
The imaging apparatus according to <1>, <2>, or <5>, in which control of an effect applied to a captured image captured by the imaging unit is performed as the imaging control.
<9>
The imaging apparatus according to <1>, <2>, or <5>, in which one or both of movement control of the imaging apparatus and control of an imaging direction of the imaging unit is performed as the imaging control.
<10>
The imaging apparatus according to any one of <1> to <9>, in which, as the communication process, the induction process for inducing the subject to take a predetermined pose is performed.
<11>
The imaging apparatus according to <10>, in which, as the communication process, the induction process for inducing the subject to smile, the induction process for inducing the subject to stand still, or the induction process for inducing the subject to jump is performed.
<12>
The imaging apparatus according to any one of <1> to <9>, in which, as the communication process, the induction process for inducing the imaging of the subject to be performed in a predetermined orientation is performed.
<13>
The imaging apparatus according to <12>, in which, as the communication process, the induction process for inducing the subject to directly face the apparatus is performed.
<14>
The imaging apparatus according to any one of <1> to <13>, in which, when a plurality of subjects are present and a main subject exists among the plurality of subjects, the communication control unit causes a communication process for communicating with the main subject to be performed.
<15>
The imaging apparatus according to any one of <1> to <14>, in which the imaging control unit performs the imaging control in accordance with a subject that reacted to the communication process or a reaction taken by the subject to the communication process.
<16>
The imaging apparatus according to any one of <1> to <15>, further including a recording unit that records communication information related to the communication process performed at the time of imaging by the imaging unit in association with a captured image captured by the imaging unit.
<17>
The imaging apparatus according to any one of <1> to <16>, in which an executed communication process and the subject's reaction to that communication process are recorded in association with each other as a communication history, and the communication process or the imaging control is performed based on the communication history.
<18>
The imaging apparatus according to any one of <1> to <17>, further including a communication processing unit that performs the communication process.
<19>
An imaging method including a step of performing imaging control for controlling imaging, by an imaging unit that captures an image of a subject, in accordance with a communication process for communicating with the subject.
<20>
A program for causing a computer to function as:
a communication control unit that controls a communication process for communicating with a subject; and
an imaging control unit that performs imaging control for controlling imaging, by an imaging unit that captures an image of the subject, in accordance with the communication process.
Claims (20)
- An imaging apparatus comprising:
an imaging unit that captures an image of a subject;
a communication control unit that controls a communication process for communicating with the subject; and
an imaging control unit that performs imaging control for controlling imaging by the imaging unit in accordance with the communication process.
- The imaging apparatus according to claim 1, wherein, as the communication process, an induction process for inducing the subject to enter a predetermined state is performed.
- The imaging apparatus according to claim 2, wherein control of an imaging mode is performed as the imaging control.
- The imaging apparatus according to claim 3, having, as the imaging mode, two or more of: a mode for capturing a single still image; a mode for continuously capturing still images; a mode for time-lapse imaging; a mode for capturing a moving image at a normal frame rate; and a mode for capturing a moving image at a high frame rate.
- The imaging apparatus according to claim 2, wherein, as the communication process, the induction process for inducing the subject to take a predetermined action is performed.
- The imaging apparatus according to claim 5, wherein control of the shutter speed of imaging by the imaging unit is performed as the imaging control.
- The imaging apparatus according to claim 5, wherein control of the imaging timing of the imaging unit is performed as the imaging control.
- The imaging apparatus according to claim 5, wherein control of an effect applied to a captured image captured by the imaging unit is performed as the imaging control.
- The imaging apparatus according to claim 5, wherein one or both of movement control of the imaging apparatus and control of an imaging direction of the imaging unit is performed as the imaging control.
- The imaging apparatus according to claim 4, wherein, as the communication process, the induction process for inducing the subject to take a predetermined pose is performed.
- The imaging apparatus according to claim 10, wherein, as the communication process, the induction process for inducing the subject to smile, the induction process for inducing the subject to stand still, or the induction process for inducing the subject to jump is performed.
- The imaging apparatus according to claim 4, wherein, as the communication process, the induction process for inducing the imaging of the subject to be performed in a predetermined orientation is performed.
- The imaging apparatus according to claim 12, wherein, as the communication process, the induction process for inducing the subject to directly face the apparatus is performed.
- The imaging apparatus according to claim 4, wherein, when a plurality of subjects are present and a main subject exists among the plurality of subjects, the communication control unit causes a communication process for communicating with the main subject to be performed.
- The imaging apparatus according to claim 4, wherein the imaging control unit performs the imaging control in accordance with a subject that reacted to the communication process or a reaction taken by the subject to the communication process.
- The imaging apparatus according to claim 4, further comprising a recording unit that records communication information related to the communication process performed at the time of imaging by the imaging unit in association with a captured image captured by the imaging unit.
- The imaging apparatus according to claim 4, wherein an executed communication process and the subject's reaction to that communication process are recorded in association with each other as a communication history, and the communication process or the imaging control is performed based on the communication history.
- The imaging apparatus according to claim 4, further comprising a communication processing unit that performs the communication process.
- An imaging method comprising a step of performing imaging control for controlling imaging, by an imaging unit that captures an image of a subject, in accordance with a communication process for communicating with the subject.
- A program for causing a computer to function as:
a communication control unit that controls a communication process for communicating with a subject; and
an imaging control unit that performs imaging control for controlling imaging, by an imaging unit that captures an image of the subject, in accordance with the communication process.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015559868A JP6481866B2 (ja) | 2014-01-28 | 2015-01-16 | 情報処理装置、撮像装置、情報処理方法、及び、プログラム |
US15/112,444 US10455148B2 (en) | 2014-01-28 | 2015-01-16 | Image capturing device to capture image of a subject based on a communication process |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-013299 | 2014-01-28 | ||
JP2014013299 | 2014-01-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015115203A1 true WO2015115203A1 (ja) | 2015-08-06 |
Family
ID=53756780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/051018 WO2015115203A1 (ja) | 2014-01-28 | 2015-01-16 | 撮影装置、撮影方法、及び、プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US10455148B2 (ja) |
JP (1) | JP6481866B2 (ja) |
WO (1) | WO2015115203A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106131393A (zh) * | 2016-06-15 | 2016-11-16 | 北京小米移动软件有限公司 | 拍照提示方法及装置 |
WO2017051577A1 (ja) * | 2015-09-25 | 2017-03-30 | ソニー株式会社 | 感情誘導システム、および感情誘導方法 |
JP2020092370A (ja) * | 2018-12-07 | 2020-06-11 | ルネサスエレクトロニクス株式会社 | 撮影制御装置、撮影システム及び撮影制御方法 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9979890B2 (en) | 2015-04-23 | 2018-05-22 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US9936127B2 (en) * | 2015-11-02 | 2018-04-03 | Paypal, Inc. | Systems and methods for providing attention directing functions in an image capturing device |
US9854156B1 (en) | 2016-06-12 | 2017-12-26 | Apple Inc. | User interface for camera effects |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | USER INTERFACES FOR SIMULATED DEPTH EFFECTS |
US10674072B1 (en) | 2019-05-06 | 2020-06-02 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
CN114303365A (zh) * | 2019-10-16 | 2022-04-08 | 松下电器(美国)知识产权公司 | 机器人、控制处理方法以及控制处理程序 |
US11039074B1 (en) | 2020-06-01 | 2021-06-15 | Apple Inc. | User interfaces for managing media |
CN114079724A (zh) * | 2020-07-31 | 2022-02-22 | 北京小米移动软件有限公司 | 起跳抓拍方法、装置及存储介质 |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0293527A (ja) * | 1988-09-28 | 1990-04-04 | Takayuki Shioyama | 顔貌撮像用制御装置 |
JP2003219218A (ja) * | 2002-01-23 | 2003-07-31 | Fuji Photo Film Co Ltd | デジタルカメラ |
JP2006287285A (ja) * | 2005-03-31 | 2006-10-19 | Omron Entertainment Kk | 写真シール作成装置および方法、並びにプログラム |
JP2008058791A (ja) * | 2006-09-01 | 2008-03-13 | Sega Corp | 自動撮影装置 |
JP2009246832A (ja) * | 2008-03-31 | 2009-10-22 | Casio Comput Co Ltd | 撮像装置、撮像方法及びプログラム |
JP2011019051A (ja) * | 2009-07-08 | 2011-01-27 | Sony Ericsson Mobilecommunications Japan Inc | 撮影装置および撮影制御方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005010512A (ja) | 2003-06-19 | 2005-01-13 | Nikon Corp | 自律的撮影装置 |
JP4197019B2 (ja) * | 2006-08-02 | 2008-12-17 | ソニー株式会社 | 撮像装置および表情評価装置 |
KR101700357B1 (ko) * | 2009-11-30 | 2017-01-26 | 삼성전자주식회사 | 점프 영상 촬영 장치 및 방법 |
KR101634247B1 (ko) * | 2009-12-04 | 2016-07-08 | 삼성전자주식회사 | 피사체 인식을 알리는 디지털 촬영 장치, 상기 디지털 촬영 장치의 제어 방법 |
KR101755598B1 (ko) * | 2010-10-27 | 2017-07-07 | 삼성전자주식회사 | 디지털 촬영 장치 및 이의 제어 방법 |
JP2012151541A (ja) * | 2011-01-17 | 2012-08-09 | Rohm Co Ltd | 撮像装置 |
KR102138516B1 (ko) * | 2013-10-11 | 2020-07-28 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
-
2015
- 2015-01-16 JP JP2015559868A patent/JP6481866B2/ja active Active
- 2015-01-16 US US15/112,444 patent/US10455148B2/en active Active
- 2015-01-16 WO PCT/JP2015/051018 patent/WO2015115203A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0293527A (ja) * | 1988-09-28 | 1990-04-04 | Takayuki Shioyama | 顔貌撮像用制御装置 |
JP2003219218A (ja) * | 2002-01-23 | 2003-07-31 | Fuji Photo Film Co Ltd | デジタルカメラ |
JP2006287285A (ja) * | 2005-03-31 | 2006-10-19 | Omron Entertainment Kk | 写真シール作成装置および方法、並びにプログラム |
JP2008058791A (ja) * | 2006-09-01 | 2008-03-13 | Sega Corp | 自動撮影装置 |
JP2009246832A (ja) * | 2008-03-31 | 2009-10-22 | Casio Comput Co Ltd | 撮像装置、撮像方法及びプログラム |
JP2011019051A (ja) * | 2009-07-08 | 2011-01-27 | Sony Ericsson Mobilecommunications Japan Inc | 撮影装置および撮影制御方法 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017051577A1 (ja) * | 2015-09-25 | 2017-03-30 | ソニー株式会社 | 感情誘導システム、および感情誘導方法 |
JPWO2017051577A1 (ja) * | 2015-09-25 | 2018-07-26 | ソニー株式会社 | 感情誘導システム、および感情誘導方法 |
CN106131393A (zh) * | 2016-06-15 | 2016-11-16 | 北京小米移动软件有限公司 | 拍照提示方法及装置 |
EP3258414A1 (en) * | 2016-06-15 | 2017-12-20 | Beijing Xiaomi Mobile Software Co., Ltd. | Prompting method and apparatus for photographing |
US10230891B2 (en) | 2016-06-15 | 2019-03-12 | Beijing Xiaomi Mobile Software Co., Ltd. | Method, device and medium of photography prompts |
JP2020092370A (ja) * | 2018-12-07 | 2020-06-11 | ルネサスエレクトロニクス株式会社 | 撮影制御装置、撮影システム及び撮影制御方法 |
JP7401968B2 (ja) | 2018-12-07 | 2023-12-20 | ルネサスエレクトロニクス株式会社 | 撮影制御装置、撮影システム及び撮影制御方法 |
Also Published As
Publication number | Publication date |
---|---|
US10455148B2 (en) | 2019-10-22 |
US20160337582A1 (en) | 2016-11-17 |
JPWO2015115203A1 (ja) | 2017-03-23 |
JP6481866B2 (ja) | 2019-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6481866B2 (ja) | 情報処理装置、撮像装置、情報処理方法、及び、プログラム | |
US11509817B2 (en) | Autonomous media capturing | |
JP5881873B2 (ja) | 撮像装置、撮像方法およびプログラム | |
JP4877762B2 (ja) | 表情誘導装置および表情誘導方法、表情誘導システム | |
US9934823B1 (en) | Direction indicators for panoramic images | |
US11785328B2 (en) | System and camera device for capturing images | |
JP6298563B1 (ja) | ヘッドマウントデバイスによって仮想空間を提供するためのプログラム、方法、および当該プログラムを実行するための情報処理装置 | |
JP2019118098A (ja) | 撮像装置及びその制御方法、プログラム、記憶媒体 | |
US9824723B1 (en) | Direction indicators for panoramic images | |
CN108141540A (zh) | 具有移动检测的全向相机 | |
JP2019012441A (ja) | 仮想空間を提供するためにコンピュータで実行されるプログラム、情報処理装置および仮想空間を提供するための方法 | |
KR20200132569A (ko) | 특정 순간에 관한 사진 또는 동영상을 자동으로 촬영하는 디바이스 및 그 동작 방법 | |
CN106210699B (zh) | 信息处理装置、信息处理装置的控制方法及图像处理系统 | |
US11173375B2 (en) | Information processing apparatus and information processing method | |
US11477433B2 (en) | Information processor, information processing method, and program | |
US20210383097A1 (en) | Object scanning for subsequent object detection | |
JP2024050757A (ja) | 撮影システム、撮影方法、撮影プログラム、及びぬいぐるみ | |
TW202328871A (zh) | 用於擴展現實系統的動態內容呈現 | |
WO2019044135A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP2011151482A (ja) | 撮像装置および撮像装置の制御方法 | |
JP2019012509A (ja) | ヘッドマウントデバイスによって仮想空間を提供するためのプログラム、方法、および当該プログラムを実行するための情報処理装置 | |
JP7199808B2 (ja) | 撮像装置およびその制御方法 | |
WO2022014273A1 (ja) | 撮像支援制御装置、撮像支援制御方法、撮像支援システム | |
KR20170018907A (ko) | 지원 템플릿을 포함하는 튜토리얼 모델 | |
WO2022086028A1 (ko) | 전자 장치 및 이의 제어 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15743074 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015559868 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15112444 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15743074 Country of ref document: EP Kind code of ref document: A1 |