JP2009049951A - Imaging apparatus and imaging method - Google Patents


Info

Publication number
JP2009049951A
Authority
JP
Japan
Prior art keywords
imaging
setting
user
image data
heart rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2007216687A
Other languages
Japanese (ja)
Inventor
Masamichi Asukai
Daiji Ito
Yasunori Kamata
Takayasu Kon
Yoichiro Sako
Akane Sano
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to JP2007216687A
Publication of JP2009049951A
Application status: Pending


Abstract

Automatic imaging is performed with operation settings appropriate to the user's psychological state.
For example, in automatic imaging for a life-log application, in which the imaging apparatus captures images automatically at fixed time intervals or the like, the imaging operation settings are made on the basis of the user's pulse wave information. From the pulse wave information, it can be determined whether a fluctuation in the user's heart rate is caused by psychology or by exercise, and the user's stress state and psychological state can also be estimated. By making appropriate imaging operation settings for the image size, compression rate, imaging time interval, and the like according to these determination results, automatic imaging is performed under imaging conditions suited to the user.
[Selected figure] FIG. 5

Description

  The present invention relates to an imaging apparatus and an imaging method, and more particularly to a technique suitable for an automatic imaging operation in which imaging is automatically performed regardless of a shutter operation of a user.

JP 2002-34030 A
JP 2007-66251 A
JP 2005-331716 A

  For example, a life-log camera has been proposed: a camera worn by the user that automatically takes images at regular intervals, thereby recording the scenes the user sees in daily life as image data. With such a life-log camera, the user's action history, memories, and so on can be preserved as image data.

When automatic imaging is performed periodically with a life-log camera, the various scenes that the user sees in daily life are captured, but when the user looks back on them later, some will be valuable and others not. A valuable scene is, for example, one in which the user was interested, or one in which the user's emotions changed greatly.
It is therefore considered preferable to capture such valuable scenes with different imaging operation settings.
Accordingly, an object of the present invention is to enable, in automatic imaging, an automatic imaging operation responsive to the user's psychological state.

An imaging apparatus according to the present invention includes imaging means for obtaining captured image data of a subject and storing the captured image data as an imaging operation; biological information detection means for detecting biological information of the user; and control means which, when performing imaging control as automatic imaging processing not based on the user's shutter operation, makes imaging operation settings based on the biological information obtained by the biological information detection means and causes the imaging means to execute an imaging operation based on those settings.
In particular, the biological information detection means detects pulse wave information as the biological information.

As the imaging operation settings, the control means performs image quality setting, image size setting, imaging interval time setting, frame rate setting, switching between still image capture and moving image capture, or shutter speed setting.
The control means also determines whether a heart rate fluctuation obtained from the pulse wave information detected by the biological information detection means is caused by psychology or by exercise, and makes the imaging operation settings accordingly.
Further, the control means determines the user's stress state based on the pulse wave information detected by the biological information detection means, and makes the imaging operation settings according to the determination result.
Likewise, the control means determines the user's psychological state based on the pulse wave information detected by the biological information detection means, and makes the imaging operation settings according to the determination result.

  The imaging method of the present invention is an imaging method for an imaging apparatus that obtains captured image data of a subject and stores the captured image data as an imaging operation. The method includes a biological information detection step of detecting biological information of the user, and an imaging step of, when performing imaging control as automatic imaging processing not based on the user's shutter operation, making imaging operation settings based on the biological information detected in the biological information detection step and executing an imaging operation based on those settings.

  In the present invention as described above, for example in automatic imaging for a life-log application, when the imaging apparatus automatically captures images at regular time intervals (regardless of any shutter operation by the user), the control means makes the imaging operation settings based on the user's biological information, in particular pulse wave information. From the pulse wave information, it can be determined whether a fluctuation in the user's heart rate is caused by psychology or by exercise, and the user's stress state and psychological state can also be estimated. By making appropriate imaging operation settings according to these determination results, automatic imaging can be performed with settings suited to the user.

According to the present invention, automatic imaging is performed successively while the imaging operation settings are adapted to the user's psychology. Specifically, when the user's psychological state differs from normal, settings such as higher image quality or a shorter imaging interval time can be applied, so that important scenes and ordinary scenes are captured (i.e., their captured image data is stored) under different conditions.
As a result, in automatic imaging for a life-log application, for example, suitable imaging is realized in which scenes impressive to the user are captured with high image quality.
Further, using the user's pulse wave information as the biological information makes detection comparatively easy, which is advantageous in terms of the apparatus configuration.

Embodiments of the present invention will be described below. The description will be made in the following order.
[1. Appearance example of imaging device]
[2. Configuration example of imaging apparatus]
[3. Imaging operation settings]
[4. Imaging operation setting processing example I]
[5. Imaging operation setting processing example II]
[6. Imaging operation setting processing example III]
[7. Effects of the embodiment]

[1. Appearance example of imaging device]

Various forms are assumed for the imaging apparatus 1 of the embodiment, and examples of their appearance are illustrated in FIGS. 1A and 1B.
FIG. 1A shows a neck-mounted type imaging apparatus 1. The imaging apparatus 1 has, for example, a part to which a strap is attached, and is worn by hanging the strap around the user's neck as shown in the figure. In the worn state, the imaging lens 3L provided in the imaging apparatus 1 faces forward, so that imaging is performed with the direction in front of the user as the subject direction.
Although not shown, a display unit used for imaging monitoring or for reproduction of captured images may be provided, for example on the back surface of the imaging apparatus 1.

FIG. 1B shows an imaging apparatus 1 configured as a glasses-type display camera. The imaging apparatus 1 has a mounting unit with a frame structure that wraps, for example, halfway around the head from both temple regions to the back of the head, and is worn by the user resting on both auricles as illustrated.
In this imaging apparatus 1, the imaging lens 3L is directed forward so that, while the user wears the apparatus, imaging is performed with the user's visual field direction as the subject direction.
In the worn state shown in the figure, a pair of display units 5, 5 for the left and right eyes are arranged immediately in front of the user's eyes, that is, at the positions where the lenses of ordinary glasses would be. A liquid crystal panel, for example, is used for the display units 5, and by controlling the transmittance, a through state as shown in the figure, i.e., a transparent or translucent state, can be obtained. Since the display units 5 can be set to the through state, there is no problem in daily life even if the user wears the apparatus constantly, like glasses.
Besides a pair of display units 5 corresponding to both eyes, a configuration with one display unit 5 for one eye is also conceivable, as is a configuration without any display unit 5.

In FIGS. 1A and 1B, neck-mounted and glasses-type imaging apparatuses 1 are illustrated, but various structures by which the user wears the imaging apparatus 1 are conceivable; for example, a headphone-type, neckband-type, or ear-hook-type mounting unit may be worn by the user. Furthermore, the apparatus may be attached to ordinary glasses, a visor, headphones, or the like with an attachment such as a clip, and it need not necessarily be worn on the user's head.
In the case of FIG. 1A, the imaging direction is the user's front, but the imaging apparatus 1 may instead be hung around the neck so as to capture the view behind the user when worn.
In the case of FIG. 1B, the imaging direction is the user's visual field direction, but the imaging lens 3L may instead be attached so as to capture the rear, side, upward, or foot direction of the user when worn. A configuration with a plurality of imaging systems having the same or different imaging directions is also conceivable.
Further, in FIGS. 1A and 1B, an imaging-direction variable mechanism that can change the subject direction manually or automatically may be provided for one or more imaging lenses 3L.

Needless to say, forms other than those shown in FIGS. 1A and 1B are conceivable for an imaging apparatus that captures moving images and still images. For example, an imaging apparatus 1 installed in a vehicle to image the interior or exterior can also be assumed as the present embodiment, such as an imaging apparatus attached to image the vehicle interior, or one attached to image the landscape in front of or behind the car.
Further, a device having an imaging function, such as a mobile phone, a PDA (Personal Digital Assistant), or a portable personal computer, can be assumed as the imaging apparatus 1 of the present embodiment.
In these various forms, a microphone that collects external sound may be provided, for example, so that an audio signal to be recorded together with the image data is obtained during imaging; a speaker unit or earphone unit for audio output may also be formed.
A light emitting unit that illuminates the subject direction, using an LED (Light Emitting Diode) for example, or a flash light emitting unit for still image capture, may also be provided in the vicinity of the imaging lens 3L.

[2. Configuration example of imaging apparatus]

Here, a configuration example of the imaging apparatus 1 according to the embodiment will be described.
FIG. 2 is a block diagram illustrating an internal configuration of the imaging apparatus 1.
As illustrated, the imaging apparatus 1 includes a system controller 2, an imaging unit 3, an imaging control unit 4, a display unit 5, a display control unit 6, an operation input unit 7, a storage unit 8, a communication unit 9, a pulse wave sensor 10, and a pulse wave database 11.

The system controller 2 is constituted by a microcomputer including, for example, a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), a nonvolatile memory unit, and an interface unit, and is the control unit that controls the entire imaging apparatus 1. Based on a program held in the internal ROM or the like, the system controller 2 performs various arithmetic processes and exchanges control signals with each unit via the bus 12, causing each unit to execute the required operations.
In particular, in this example, the system controller 2 also performs various imaging operation setting processes during automatic imaging, based on the pulse wave information supplied from the pulse wave sensor 10 and the information stored in the pulse wave database 11.

The imaging unit 3 includes an imaging optical system 3a, an imaging element unit 3b, and an imaging signal processing unit 3c.
The imaging optical system 3a in the imaging unit 3 includes a lens system comprising the imaging lens 3L shown in FIG. 1, a diaphragm, a zoom lens, a focus lens, and the like, together with a drive system for causing the lens system to perform focusing and zooming operations.
The imaging element unit 3b in the imaging unit 3 includes a solid-state imaging element array that detects the imaging light obtained by the imaging optical system 3a and generates an imaging signal by photoelectric conversion. The solid-state imaging element array is, for example, a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array.
The imaging signal processing unit 3c in the imaging unit 3 includes a sample-hold/AGC (Automatic Gain Control) circuit that performs gain adjustment and waveform shaping on the signal obtained by the solid-state imaging element, and a video A/D converter, and thereby obtains captured image data as digital data. It also performs white balance processing, luminance processing, color signal processing, and the like on the captured image data.

Imaging is performed by the imaging unit 3 including the imaging optical system 3a, the imaging element unit 3b, and the imaging signal processing unit 3c, and captured image data is obtained.
Image data obtained by the imaging operation of the imaging unit 3 is processed by the imaging control unit 4.
Under the control of the system controller 2, the imaging control unit 4 performs processing such as image compression at various compression rates, image size conversion, and image format conversion on the captured image data, and transfers the captured image data to the storage unit 8, the display control unit 6, the communication unit 9, and the like according to the operating situation.
Based on instructions from the system controller 2, the imaging control unit 4 also controls on/off of the imaging operation in the imaging unit 3, shutter processing, drive control of the zoom lens and focus lens of the imaging optical system 3a, shutter speed control and frame rate control of the imaging element unit 3b, and parameter control and execution settings for each process of the imaging signal processing unit 3c.

A display unit 5 and a display control unit 6 are provided as a configuration for displaying to the user in the imaging apparatus 1.
The display unit 5 includes a display panel unit such as a liquid crystal display and a display driving unit that drives the display panel unit. The display driving unit is constituted by a pixel driving circuit for displaying the image data supplied from the imaging control unit 4 on the display panel unit; it applies drive signals based on the video signal to each pixel arranged in a matrix in the display panel unit at predetermined horizontal/vertical drive timings, causing display to be performed.

The display control unit 6 drives the pixel driving circuit in the display unit 5 based on the control of the system controller 2 to execute a predetermined display. For example, monitor display of an image captured by the image capturing unit 3 and display of an image reproduced by the storage unit 8 are performed.
For these displays, brightness level adjustment, color correction, contrast adjustment, sharpness (outline emphasis) adjustment, and the like can be performed. Image effect processing is also possible, such as generating an enlarged image of part of the image data, generating a reduced image, soft focus, mosaic, luminance inversion, highlighting part of the image, and changing the overall color atmosphere.

The operation input unit 7 has operation elements such as keys, buttons, and dials; for example, elements for power on/off operation, operations relating to automatic imaging, and other required input operations are formed. When imaging in response to the user's shutter operation is also possible in addition to automatic imaging, elements used for the shutter operation, zoom operation, exposure setting, self-timer operation, and so on may be formed as well.
The operation input unit 7 supplies information obtained from such an operator to the system controller 2, and the system controller 2 performs necessary arithmetic processing and control corresponding to the information.

The storage unit 8 is used for storing captured image data and other various data.
The storage unit 8 may be configured by a solid-state memory such as a flash memory, or may be configured by an HDD (Hard Disk Drive), for example.
Further, instead of a built-in recording medium, a recording/reproducing drive for a portable recording medium may be used, for example a memory card incorporating solid-state memory, an optical disc, a magneto-optical disc, or a hologram memory.
Of course, both a built-in type memory such as a solid-state memory and an HDD, and a recording / reproducing drive for a portable recording medium may be mounted.
The storage unit 8 records / reproduces captured image data and other various data based on the control of the system controller 2.

The communication unit 9 is provided as a part that performs data communication with various external devices.
For example, data transmission and reception with a server apparatus (not shown) may be performed. In that case, network communication may be carried out via short-range wireless communication with a network access point, using a method such as wireless LAN or Bluetooth, or direct wireless communication may be performed with a server apparatus having a corresponding communication function.
In addition, the communication unit 9 may be connected to a device such as a personal computer using an interface such as a USB (Universal Serial Bus) system to transmit and receive data.
The communication unit 9 can transfer, for example, captured image data stored in the storage unit 8 to a personal computer or other external device. Accordingly, the large number of captured image data items recorded in the storage unit 8 by periodic imaging as a life log can be reproduced and displayed on the display unit 5 by the imaging apparatus 1 itself, or transferred to an external device such as a personal computer and reproduced there.

The pulse wave sensor 10 detects a user's pulse wave, and supplies this to the system controller 2 as pulse wave information.
The pulse wave database 11 is a database in which various pieces of information for determining a human psychological state from a pulse wave waveform pattern are stored.
The pulse wave database 11 may be stored in a storage area of the storage unit 8 or an internal memory of the system controller 2.

The pulse wave sensor 10 is brought into contact with a part of the user's body where a pulse wave can be detected, for example near the auricle, the wrist, or the chest. When a glasses-type imaging apparatus 1 as shown in FIG. 1B is considered, for example, the pulse wave sensor 10 may detect the pulse by contacting the vicinity of the base of the user's auricle; that is, it suffices to arrange the pulse wave sensor 10 on the inside of the portion corresponding to the temple of the glasses.
On the other hand, in the case of the type shown in FIG. 1A, the imaging apparatus 1 itself cannot contact a pulse detection site on the user's body, so a configuration in which the pulse wave sensor 10 is separate from the main body of the imaging apparatus 1 is conceivable.

FIG. 3 shows a configuration example in which the pulse wave sensor 10 is a separate component. As shown in the figure, the pulse wave sensor 10 is provided in a sensor unit 20 that is separate from the main body of the imaging apparatus 1.
In this case, the sensor unit 20 includes the pulse wave sensor 10 and the detection information transmission unit 14. A detection information receiving unit 15 is provided on the main body side of the imaging apparatus 1.

The detection information receiving unit 15 and the detection information transmitting unit 14 communicate by, for example, wireless or wired communication. For wireless communication, a short-range wireless communication method such as Bluetooth may be employed, or an optical communication method in which data is transmitted by optical pulse modulation using visible or invisible light; of course, a wireless communication method supporting longer distances may also be used.
In the case of FIG. 3, the pulse wave information detected by the pulse wave sensor 10 is transmitted from the detection information transmitting unit 14 and received by the detection information receiving unit 15. The detection information receiving unit 15 demodulates the received pulse wave information and supplies it to the system controller 2.
With such a configuration, the pulse wave sensor 10 can easily be attached to a pulse detection site on the user's body. For example, by forming the sensor unit 20 in the shape of a bracelet that the user attaches to the wrist, or by making the sensor unit 20 attachable to the auricle, the user's pulse wave can be detected appropriately.

The configuration examples of the imaging apparatus 1 shown in FIGS. 2 and 3 are only examples; the addition and deletion of various components is naturally conceivable depending on the actual operation and functions implemented.

[3. Imaging operation settings]

A characteristic operation of the imaging apparatus 1 having the above configuration will be described below.
When used for a life-log application, the imaging apparatus 1 of the embodiment basically performs imaging operations that are not based on the user's shutter operation. For example, automatic imaging is performed periodically, and the captured image data is stored in the storage unit 8, at predetermined intervals such as every 5, 10, or 30 seconds. Of course, instead of periodic imaging, automatic imaging may be performed at irregular timing in response to some trigger (other than the user's shutter operation). Furthermore, if the user performs a shutter operation during regular or irregular automatic imaging, the imaging process may also be performed normally at that point.
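The interval-based triggering described above, with a manual shutter press also firing a capture, can be sketched as follows. This is an illustrative model only; the class name, tick-based interface, and interval values are assumptions and not part of the disclosed apparatus.

```python
class AutoCaptureScheduler:
    """Decides, from elapsed time and optional shutter presses, when the
    imaging unit should capture and store a frame (illustrative sketch)."""

    def __init__(self, interval_s=10.0):
        self.interval_s = interval_s  # periodic imaging interval, e.g. 10 s
        self.elapsed = 0.0

    def tick(self, dt, shutter_pressed=False):
        """Advance the timer by dt seconds; return True when a capture
        should occur (interval elapsed, or manual shutter operation)."""
        self.elapsed += dt
        if shutter_pressed or self.elapsed >= self.interval_s:
            self.elapsed = 0.0  # restart the interval after each capture
            return True
        return False
```

A controller loop would call `tick()` once per polling cycle and trigger the imaging unit whenever it returns `True`.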

  When the user wears the imaging apparatus 1 in daily life and periodic imaging is performed automatically in this way, scenes corresponding to the user's activities are captured. The scenes the user sees in daily life vary, however, from everyday, unremarkable scenes to scenes in which the user feels joy, surprise, sadness, or other emotions. A scene in which a psychological change occurs in the user can be considered highly valuable to the user as a captured image.

  Therefore, in this example, the psychological state is estimated from the user's pulse wave, and automatic imaging is performed while switching various imaging operation settings, such as image size, compression rate, still image/moving image mode, and imaging time interval, according to that state. As a result, the imaging apparatus 1 estimates scenes in which the user is interested (scenes in which the user's psychological state differs from usual) and can automatically capture them under imaging conditions different from the usual ones.

Here, image quality setting, image size setting, imaging interval setting, and shutter speed setting can be considered as imaging operation settings.
The image quality setting concerns the quality of the captured image data and can be made, for example, by selecting a high or low compression rate. That is, the captured image data obtained by the imaging unit 3 is compressed by the imaging control unit 4 using a predetermined compression method, transferred to the storage unit 8, and recorded. If the compression rate in this process is high, the data size becomes small, but the quality of the stored captured image data is low; conversely, if the compression rate is low (or no compression is applied), the data size increases, but the quality of the stored captured image data is high.
Thus, by setting the compression rate, the quality of the stored image data can be changed and the storage capacity used effectively.
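The quality/capacity trade-off between a "normal" setting and an "important scene" setting can be illustrated with the following sketch. The resolutions, compression ratios, and the simple size estimate are placeholder assumptions chosen for illustration; the patent does not specify concrete values.

```python
from dataclasses import dataclass


@dataclass
class CaptureSettings:
    """One imaging operation setting: resolution plus compression rate."""
    width: int
    height: int
    compression_ratio: float  # stored bytes ~ raw bytes / compression_ratio

    def stored_bytes(self, bytes_per_pixel=3):
        """Rough stored size of one frame under these settings."""
        return int(self.width * self.height * bytes_per_pixel
                   / self.compression_ratio)


# Hypothetical presets: high compression for everyday scenes,
# low compression and larger size for psychologically significant scenes.
NORMAL = CaptureSettings(1280, 960, compression_ratio=20.0)
IMPORTANT = CaptureSettings(2560, 1920, compression_ratio=5.0)
```

The "important" preset stores a larger, higher-quality image at the cost of capacity, matching the trade-off described above.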

The image size setting can be regarded, for example, as the number of pixels of the captured image data. The image size is largest when it corresponds to the full number of pixels of the imaging element unit 3b, and becomes smaller as the pixel count is reduced, for example by pixel thinning processing.
Here too, the larger the image size, the higher the quality but the larger the data size; the smaller the image size, the lower the quality but the smaller the data size.

The imaging interval setting is a time setting as an imaging interval when periodic automatic imaging is performed as described above. If the imaging interval is shortened, a large number of captured image data can be obtained, which is suitable as a life log, but requires a large amount of recording capacity as a whole. On the other hand, if the imaging interval is lengthened, the number of imaging opportunities decreases, but the overall recording capacity is saved.
For example, a setting in which the imaging interval is switched between an interval of 10 seconds and an interval of 30 seconds can be considered.

  The shutter speed setting is an exposure time setting for the CCD sensor or the CMOS sensor in the image sensor unit 3b. Usually, it is appropriate to increase the shutter speed for a fast-moving subject.

With these various imaging operation settings, it is appropriate, for example, to make relatively advantageous settings for scenes impressive to the user. Settings that lower the compression rate for higher image quality, increase the image size, or shorten the imaging interval time are advantageous in terms of the quality and number of the captured image data items stored.
For this reason, in this example, the user's psychological state is estimated based on the pulse wave information from the pulse wave sensor 10, and such advantageous settings are applied to scenes that are impressive to the user.

The image quality (compression rate), image size, imaging interval, and shutter speed settings above have been described with still image capture in mind, but the imaging apparatus 1 can also capture moving images. Accordingly, switching between still image and moving image capture, and setting the frame rate during moving image capture, can also be considered as imaging operation settings.
For example, for an imaging apparatus that continuously captures moving images, the frame rate, image size, compression rate, shutter speed, and so on may be changed successively according to the user's psychological state. Capturing video at a high frame rate and low compression rate stores high-quality video, while capturing at a low frame rate and high compression rate stores relatively low-quality video that saves capacity.
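The moving-image variant of the same trade-off can be sketched as a mapping from an estimated psychological state to frame rate and compression settings. The state names, preset values, and the crude bytes-per-second estimate are illustrative assumptions, not a codec model or part of the disclosure.

```python
# Hypothetical presets: a calm state records sparingly; a state of
# psychological arousal records at a high frame rate and low compression.
VIDEO_PRESETS = {
    "neutral": {"fps": 10, "compression_ratio": 30.0},
    "aroused": {"fps": 30, "compression_ratio": 8.0},
}


def stored_bytes_per_second(preset, width=1280, height=720,
                            bytes_per_pixel=3):
    """Rough stored bytes per second of video under a preset (sketch)."""
    p = VIDEO_PRESETS[preset]
    return int(width * height * bytes_per_pixel * p["fps"]
               / p["compression_ratio"])
```

As expected, the "aroused" preset consumes far more capacity per second in exchange for quality.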

Besides continuous capture, an operation method in which moving images are captured intermittently for fixed periods is also conceivable. In such a case as well, the quality and size of the stored moving image data can be changed through the frame rate, image size, compression rate, imaging interval, and shutter speed settings.
Furthermore, since moving image capture preserves more of the scene content than still image capture, switching from still image capture to moving image capture can itself be regarded as an advantageous setting.

[4. Imaging operation setting processing example I]

An imaging operation setting processing example I for performing imaging operation setting based on the user's pulse wave information will be described with reference to FIGS.
Normally, the heart rate rises when the user's emotions change, but it also rises due to exercise.
Here, an example is described in which, when the heart rate becomes high, it is determined whether the cause is psychological or exercise, and the imaging operation settings are made accordingly.

FIG. 4 is an estimation model showing temporal changes in heart rate due to psychological and exercise causes. Characteristic L1 indicates the change in heart rate per unit time due to psychology, and characteristic L2 the change per unit time due to exercise.
Although it depends on the exercise load, the change in heart rate due to exercise is more gradual than the change due to a psychological cause such as surprise.
From these characteristics, a threshold Dth (a slope value) for determining the cause can be set as shown in the figure; by detecting the change in heart rate per unit time and comparing it with Dth, it can be determined whether a rise in heart rate is psychological in origin.
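The slope comparison against the threshold Dth can be sketched as follows. The function name, sampling interface, and threshold value are illustrative assumptions; only the idea, that a steep heart-rate slope indicates a psychological cause and a gradual one indicates exercise, comes from the description of FIG. 4.

```python
def classify_hr_rise(hr_series, dt_s, dth_bpm_per_s):
    """Classify a heart-rate rise by its slope against threshold Dth.

    hr_series: heart-rate readings (bpm) taken dt_s seconds apart.
    A slope steeper than Dth is taken as psychological in origin;
    a gentler slope as exercise-induced.
    """
    total_time = dt_s * (len(hr_series) - 1)
    slope = (hr_series[-1] - hr_series[0]) / total_time  # bpm per second
    return "psychological" if slope > dth_bpm_per_s else "exercise"
```

For instance, a jump from 70 to 110 bpm within two seconds exceeds a placeholder Dth of 5 bpm/s and would be classed as psychological, while a slow climb would not.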

FIG. 5 shows a processing example in which this cause determination is performed and the imaging operation setting is changed; it is control processing executed by the system controller 2 based on a program stored in the internal ROM.
During execution of automatic imaging, the system controller 2 counts a predetermined imaging interval with an internal timer, for example, and executes the imaging operation each time a predetermined time such as 10 seconds elapses. That is, the imaging control unit 4 is caused to execute compression processing and the like on the captured image data obtained by the imaging unit 3 and to transfer the compressed image data to the storage unit 8, and the storage unit 8 is caused to save the transferred captured image data as a still image.

For example, when the system controller 2 instructs such an imaging operation at regular time intervals, automatic imaging is executed. While this automatic imaging is in progress, the system controller 2 continuously performs the processing of FIG. 5, changing the imaging operation settings sequentially.

With the start of automatic imaging control, the system controller 2 starts the process of FIG. 5. First, in step F101, initialization processing is performed: a variable n indicating the measurement currently being processed is set to n = 1.
Next, in step F102, the timer counter t is initialized to t = 0 and counting is started.

In step F103, the system controller 2 acquires pulse wave information obtained from the pulse wave sensor 10. In step F104, the heart rate is calculated from the pulse wave information.
In step F105, the passage of unit time is confirmed. That is, it is confirmed whether or not the count value (time elapsed) by the timer counter t has reached a predetermined unit time T.
Here, steps F103 and F104 are repeated until the elapsed time reaches unit time T. When it is determined in step F105 that the unit time T has elapsed, the heart rate in the unit time T is counted.

When the unit time T has elapsed, the system controller 2 proceeds to step F106 and holds the heart rate measured in step F104 during the unit time T as the current heart rate B(n).
In step F107, it is determined whether this is the first measurement (variable n = 1). If so, the variable n is incremented in step F112 and the process returns to step F102, so that the heart rate per unit time T is measured again through steps F102, F103, F104, and F105.

From the second measurement of the heart rate per unit time T onward, the system controller 2 advances the process from step F107 to F108.
In step F108, a change value d of the current heart rate measurement value B (n) from the previous heart rate measurement value B (n-1) is obtained. In this case, the change value d is obtained by dividing the difference between the current heart rate measurement value B (n) and the previous heart rate measurement value B (n-1) by the unit time T. As a result, the change value d indicates the slope of the change in heart rate per unit time.
After calculating the change value d, the system controller 2 compares it in step F109 with the threshold value Dth for cause determination shown in FIG. 4. As described with reference to FIG. 4, a sudden change in heart rate can be attributed to psychology, while a slow change can be attributed to exercise.
Therefore, the slope given by the change value d of the heart rate per unit time T is compared with the slope value given by the threshold value Dth: if the change value d > the threshold value Dth, the heart rate change is estimated to be caused by psychology; if the change value d ≦ the threshold value Dth, it is estimated to be caused by exercise.

When it is estimated that the heart rate change is caused by psychology, the system controller 2 proceeds to step F110 and performs an imaging operation setting corresponding to the heart rate change caused by psychology. In this case, operation settings such as a large image size setting, a small compression ratio setting, a short imaging interval setting, and a sports mode off are performed.
If it is estimated that the heart rate change is caused by exercise, the system controller 2 proceeds to step F111 and performs an imaging operation setting corresponding to the heart rate change caused by exercise. In this case, operation settings such as a small image size setting, a large compression ratio setting, an imaging interval long time setting, and a sports mode on are performed.

After the setting process in step F110 or F111, the system controller 2 increments the variable n in step F112, returns to step F102, and repeats the same process from step F102.
When the automatic imaging operation ends, this is detected in step F113 and the processing of FIG. 5 also ends.
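The core decision of FIG. 5 can be summarized in a short sketch. The following Python fragment is illustrative only, not code from the embodiment: the unit time, the threshold value, and the setting names are hypothetical placeholders.

```python
# Hypothetical sketch of the FIG. 5 cause determination (steps F106-F111).
# T, DTH, and the setting dictionaries are illustrative, not from the text.

T = 60.0    # unit time T in seconds (hypothetical)
DTH = 0.5   # threshold Dth as a slope value, in bpm per second (hypothetical)

PSYCH_SETTINGS = {    # step F110: heart rate change caused by psychology
    "image_size": "large", "compression": "low",
    "interval_s": 5, "sports_mode": False,
}
EXERCISE_SETTINGS = {  # step F111: heart rate change caused by exercise (or none)
    "image_size": "small", "compression": "high",
    "interval_s": 10, "sports_mode": True,
}

def choose_settings(prev_bpm, curr_bpm, unit_time=T, dth=DTH):
    """Steps F108-F109: slope d = (B(n) - B(n-1)) / T, compared with Dth."""
    d = (curr_bpm - prev_bpm) / unit_time
    if d > dth:
        return PSYCH_SETTINGS   # sudden rise -> psychological cause
    return EXERCISE_SETTINGS    # gradual change (or none) -> exercise / normal
```

With these hypothetical values, a jump from 70 to 110 bpm within one unit time gives d ≈ 0.67 > Dth and selects the psychology-oriented settings, while a drift to 75 bpm gives d ≈ 0.08 and selects the exercise-oriented ones.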

By performing the processing of FIG. 5, during automatic imaging, imaging operation settings are switched according to the cause of the user's heart rate change, and imaging according to the user's behavior and psychological situation is performed.
If a psychological heart rate change is recognized, the imaging control unit 4 increases the image size of the captured image data obtained by the imaging unit 3 based on the setting in step F110, and performs compression processing with a small compression rate. The captured image data is transferred to the storage unit 8. The storage unit 8 stores the transferred captured image data as a still image. In this case, high-quality captured image data is stored.
Further, the system controller 2 shortens the imaging interval. For example, when the imaging interval is switched between a 5-second interval and a 10-second interval, the imaging operation is controlled at an interval of 5 seconds after setting in Step F110.
Further, in accordance with the sports-mode-off setting, the imaging control unit 4 lengthens the shutter speed (exposure time) of the imaging element unit 3b in the imaging unit 3.
Through operation based on these settings, when it is estimated that the user's psychological state has changed significantly, for example with joy or surprise, a large number of captured image data are stored at high image quality.

On the other hand, when the heartbeat change is caused by exercise, the imaging control unit 4 reduces the image size of the captured image data obtained by the imaging unit 3 based on the setting of step F111 and performs compression processing with a large compression rate. And the captured image data is transferred to the storage unit 8. The storage unit 8 stores the transferred captured image data as a still image. In this case, captured image data with relatively low image quality is stored.
Further, the system controller 2 lengthens the imaging interval. For example, when the imaging interval is switched between the 5-second and 10-second intervals as described above, the imaging operation is executed at 10-second intervals after the setting of step F111.
In addition, in accordance with the sports-mode-on setting, the imaging control unit 4 shortens the shutter speed (exposure time) of the imaging element unit 3b in the imaging unit 3.
Through operation based on these settings, when the heartbeat change is estimated to be caused by exercise, a relatively small number of captured image data are stored at relatively low image quality. Moreover, the shortened shutter speed copes with the motion blur of the subject image that exercise is likely to cause.

When the user is in a calm, normal state with little heart rate change, the change value d ≦ the threshold value Dth also holds, so the setting of step F111 is applied in this case as well. Different imaging operation settings may instead be used for the case where almost no heart rate change is recognized and the case of a heart rate change due to exercise; these situations can be distinguished by setting a second threshold value that determines whether there is a heart rate change or almost none.
When there is almost no heartbeat change, the small image size, large compression ratio, and long imaging interval may be kept the same, while a setting that turns the sports mode off is conceivable.
Of course, the image size, the compression rate, and the imaging interval may be switched to three levels during normal times, at the time of a heart rate change caused by exercise, and at the time of a heart rate change caused by psychology.
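The three-level switching just described can be sketched with a second threshold. The following Python fragment is illustrative only; the threshold names and values are hypothetical.

```python
# Hypothetical three-level classification using the threshold Dth plus a
# second, lower threshold that separates "almost no change" from exercise.

DTH = 0.5    # cause-determination threshold Dth (hypothetical value)
DTH2 = 0.1   # second threshold: below this, almost no heart rate change

def classify_change(d, dth=DTH, dth2=DTH2):
    """Map the per-unit-time slope d of the heart rate to one of three states."""
    if d > dth:
        return "psychology"   # sudden change -> psychological cause
    if d > dth2:
        return "exercise"     # gradual but real change -> exercise cause
    return "normal"           # almost no change: sports mode may stay off here
```

Each of the three states can then carry its own image size, compression rate, and imaging interval, as suggested above.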

[5. Imaging operation setting processing example II]

Next, imaging operation setting processing example II in which imaging operation setting is performed based on the user's pulse wave information will be described with reference to FIGS. This is an example of estimating a psychological state as to whether or not a particular stress is felt due to fluctuations in the heartbeat interval.

FIG. 6 shows an example of estimating the psychological state from fluctuations in the heartbeat interval. FIG. 6A shows the heartbeat interval obtained when a human pulse wave is measured; it is known that this interval is not constant but fluctuates with a certain periodicity, which arises from the interaction of sympathetic and parasympathetic nerves in the autonomic nervous system.
It is also known that when frequency analysis of the heartbeat-interval fluctuation is performed, peaks appear at two locations as shown in the figure.
The cycle called HF (high frequency) occurring around 0.3 Hz is due to parasympathetic activity, and the cycle called LF (low frequency) occurring around 0.1 Hz is due to sympathetic and parasympathetic activity. If the level of the low frequency LF and the level of the high frequency HF are detected and the LF / HF ratio is obtained, it can be used as an index of sympathetic nerve activity.

Sympathetic nerves work to create conditions for human activity by increasing heart rate and increasing blood pressure. Parasympathetic nerves, on the other hand, work to create conditions for humans to rest by reducing heart rate and lowering blood pressure. It is generally known that sympathetic nerves become active when humans feel stress.
Then, when the LF / HF ratio is large, it can be determined that the user feels stress. Conversely, when the LF / HF ratio is small, it can be determined that the user does not feel much stress.
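The LF/HF computation can be sketched as follows. This Python fragment is illustrative, not the embodiment's implementation: for simplicity it treats the interval series as evenly sampled and measures band power with a single-bin DFT, whereas a real analysis would resample the intervals on a time axis; the sampling rate and stress threshold are hypothetical.

```python
import math

def band_power(samples, fs, freq):
    """Power of `samples` (sampled at fs Hz) at `freq` Hz via a single-bin DFT."""
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(samples))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(samples))
    return (re * re + im * im) / len(samples)

def lf_hf_ratio(intervals, fs=4.0):
    """LF (~0.1 Hz) power over HF (~0.3 Hz) power of a detrended interval series."""
    mean = sum(intervals) / len(intervals)
    detrended = [x - mean for x in intervals]
    lf = band_power(detrended, fs, 0.1)
    hf = band_power(detrended, fs, 0.3)
    return lf / hf if hf else float("inf")

def feels_stress(intervals, threshold=1.0):
    """Step F204-style decision: a large LF/HF ratio suggests stress."""
    return lf_hf_ratio(intervals) > threshold
```

An interval series modulated mainly around 0.1 Hz yields a large ratio (stress estimated), while one modulated around 0.3 Hz yields a small ratio.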

FIG. 7 shows a processing example of the system controller 2 that determines the degree of stress of the user based on fluctuations in the heartbeat interval and reflects this in the imaging operation setting.
As in imaging operation setting processing example I described above, when executing automatic imaging the system controller 2 counts a predetermined imaging interval with an internal timer and, in accordance with the count of a predetermined time such as every 10 seconds, executes the imaging operation (storage of captured image data) by the imaging unit 3, the imaging control unit 4, and the storage unit 8.
For example, the system controller 2 instructs the imaging operation at regular time intervals to execute automatic imaging, and while this automatic imaging is in progress it continuously performs the processing of FIG. 7, changing the imaging operation settings sequentially.

With the start of automatic imaging control, the system controller 2 starts the process of FIG. 7. First, in step F201, initialization processing is performed.
In step F202, the system controller 2 acquires pulse wave information obtained from the pulse wave sensor 10. In step F203, the LF/HF ratio is calculated from the pulse wave information: frequency analysis is performed on the acquired pulse wave information, the peak level of the low frequency LF and the peak level of the high frequency HF are obtained, and their ratio is taken as the value of the LF/HF ratio.

Next, in step F204, the system controller 2 compares the LF / HF ratio with a predetermined threshold value and determines whether the LF / HF ratio is large or small.
When the LF / HF ratio is larger than the threshold value, it is determined that the user feels stress, and the process proceeds to step F205. In this case, the system controller 2 performs imaging operation settings such as a large image size setting, a small compression rate setting, and a short time setting of an imaging interval.
From the time when the setting of step F205 is performed, the imaging control unit 4 increases the image size of the captured image data obtained by the imaging unit 3, executes a compression process with a small compression rate, and acquires the captured image data. Transfer to the storage unit 8. The storage unit 8 stores the transferred captured image data as a still image. In this case, high-quality captured image data is stored. Further, the system controller 2 shortens the imaging interval. For example, when switching between the 5-second interval and the 10-second interval as the imaging interval, the imaging operation is controlled at an interval of 5 seconds after the setting of Step F205.

If it is determined in step F204 that the LF / HF ratio is smaller than the threshold value, it is estimated that the user does not feel stress, and the system controller 2 proceeds to step F206 to set a small image size, a large compression ratio, An imaging operation setting such as a long imaging interval setting is performed.
From the time when the setting in step F206 is performed, the imaging control unit 4 performs compression processing with a small image size and a large compression rate on the captured image data obtained by the imaging unit 3, and transfers the captured image data to the storage unit 8. The storage unit 8 stores the transferred captured image data as a still image. In this case, captured image data of relatively low image quality is stored.
Further, the system controller 2 increases the imaging interval. For example, when the imaging interval is switched between the 5-second interval and the 10-second interval as described above, the imaging operation is executed at an interval of 10 seconds after the setting of Step F206.

After the setting process in step F205 or F206, the system controller 2 returns to step F202 via step F207 and repeats the same process as described above.
When the automatic imaging operation ends, this is detected in step F207 and the processing of FIG. 7 also ends.

By performing the processing of FIG. 7, during automatic imaging, imaging operation settings are switched according to the user's stress situation and imaging is performed.
For example, a situation in which the user feels startled or worried and is under stress can be regarded as an important scene for a life log. In that case, a large number of captured image data are stored at high image quality.
On the other hand, the situation in which the user does not feel stress is a normal life state. In this case, a relatively small number of captured image data is stored with a relatively low image quality. This also saves the recording capacity of the storage unit 8 at normal times.

In this example, the imaging operation is set on the assumption that scenes in which the user feels stress are the scenes important to the user (scenes in which the captured image data has high value). However, some users may not want to look back at such images later and would rather see images of calm situations. Therefore, contrary to the above, the setting of step F205 may be performed when little stress is felt and the setting of step F206 when stress is large. It may also be made possible, by user operation, to select in which case a large number of images are captured at high image quality.

Further, in the example of FIG. 7, a two-stage estimation of the presence or absence of stress leads to a two-stage imaging operation setting, but the stress level may instead be divided into three or more stages based on the LF/HF ratio value, and the image size, compression rate, imaging interval, and so on switched among three or more stages.

[6. Imaging operation setting processing example III]

Next, imaging operation setting processing example III in which imaging operation setting is performed based on the user's pulse wave information will be described with reference to FIGS. This is an example in which a psychological state is estimated from a heartbeat.

FIG. 8 shows an example of estimating the psychological state from the heartbeat. The vertical axis represents the heart rate and the horizontal axis the heart rate variability (fluctuation of the heartbeat interval); it is generally said that the heart rate represents the arousal level and the heart rate variability represents the valence. FIG. 8 maps a person's psychological states (anger, joy, sadness, relaxation) on a graph with arousal and valence as orthogonal axes.
As the figure shows, psychological states such as anger, joy, sadness, and relaxation can be estimated from the correlation between the levels of heart rate and heart rate variability.
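The quadrant mapping of FIG. 8 can be sketched as a simple classifier. The following Python fragment is illustrative; the threshold values are hypothetical, and only the high/low branching follows the text.

```python
# Hypothetical sketch of the FIG. 8 quadrant mapping: heart rate as arousal,
# heart rate variability as valence. Threshold values are illustrative.

HR_TH = 80.0    # heart rate threshold, bpm (hypothetical)
HRV_TH = 50.0   # heart rate variability threshold (hypothetical units)

def estimate_emotion(heart_rate, variability, hr_th=HR_TH, hrv_th=HRV_TH):
    """Branching of steps F304-F306: classify into the four psychological states."""
    if heart_rate > hr_th:
        return "joy" if variability > hrv_th else "anger"
    return "relaxed" if variability > hrv_th else "sadness"

# Per steps F307-F310, each state then selects its imaging operation settings:
SETTINGS = {
    "joy":     {"mode": "movie", "image_size": "large", "compression": "low"},
    "anger":   {"mode": "movie", "image_size": "small", "compression": "high"},
    "relaxed": {"mode": "still", "image_size": "small", "compression": "high",
                "interval_s": 10},
    "sadness": {"mode": "still", "image_size": "large", "compression": "low",
                "interval_s": 5},
}
```

For instance, a high heart rate combined with large variability selects the "joy" settings (moving image mode, large image size, low compression).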

FIG. 9 shows a processing example of the system controller 2 that determines the psychological state of the user based on the heart rate variability and the heart rate, and reflects this in the imaging operation setting.
As in imaging operation setting processing examples I and II described above, when automatic imaging is executed the system controller 2 counts a predetermined imaging interval with an internal timer and, in accordance with the count of a predetermined time such as every 10 seconds, executes the imaging operation (storage of captured image data) by the imaging unit 3, the imaging control unit 4, and the storage unit 8.
For example, the system controller 2 instructs the imaging operation at regular time intervals to execute automatic imaging, and while this automatic imaging is in progress it continuously performs the processing of FIG. 9, changing the imaging operation settings sequentially.

With the start of automatic imaging control, the system controller 2 starts the process of FIG. 9. First, in step F301, initialization processing is performed.
In step F302, the system controller 2 acquires pulse wave information obtained from the pulse wave sensor 10.
In step F303, the heart rate and the value of heart rate variability are calculated. For example, in this case, the heart rate may be obtained from pulse wave information per unit time, and the value of heart rate variability may be obtained from the previous heart rate per unit time and the current heart rate per unit time.

  When the heart rate and the heart rate variability value are obtained, the system controller 2 compares the heart rate and the heart rate variability value with predetermined threshold values, respectively, and determines whether the heart rate is currently high or low. At the same time, it is determined whether the heart rate variability is large or small. Then, the processing branches in steps F304, F305, and F306 according to the determination result of the heart rate and the heart rate change.

When it is determined that the heart rate is high and the heart rate variability is also large, the system controller 2 proceeds from step F304 to F307. This is the case in which the user's psychological state is determined to be "joy".
In step F307, the system controller 2 performs imaging operation settings such as moving image mode setting, large image size setting, and small compression rate setting.
From the time when the setting in step F307 is performed, the imaging control unit 4 operates in the moving image imaging mode. That is, the captured image data of each frame obtained by the imaging unit 3 is processed as moving image data. At this time, the image size of each frame is processed with the large size setting, and the compression processing with the small compression rate is further performed, and the captured image data as a moving image is transferred to the storage unit 8. The storage unit 8 stores the transferred captured image data as a moving image. In this case, high-quality moving image captured image data is stored.

When it is determined that the heart rate is high and the heart rate variability is small, the system controller 2 proceeds from step F305 to F308. This is the case in which the user's psychological state is determined to be "anger".
In step F308, the system controller 2 performs imaging operation settings such as moving image mode setting, small image size setting, and large compression rate setting.
From the time when the setting in step F308 is performed, the imaging control unit 4 operates in the moving image imaging mode. That is, the captured image data of each frame obtained by the imaging unit 3 is processed as moving image data. At this time, the image size of each frame is processed with the small size setting, and the compression processing with a large compression ratio is executed, and the captured image data as a moving image is transferred to the storage unit 8. The storage unit 8 stores the transferred captured image data as a moving image. In this case, moving image captured image data with relatively low image quality is stored.

If it is determined that the heart rate is low and the heart rate variability is large, the system controller 2 proceeds from step F306 to F309. This is the case in which the user's psychological state is determined to be "relaxed".
In step F309, the system controller 2 performs imaging operation settings such as still image mode setting, small image size setting, large compression rate setting, and long-time imaging interval setting.
From the time when the setting in step F309 is performed, the imaging control unit 4 operates in the still image imaging mode. That is, one frame of captured image data obtained by the imaging unit 3 is processed as still image data. At this time, the image size of the captured image data is reduced and a compression process with a large compression rate is executed, and the captured image data is transferred to the storage unit 8. The storage unit 8 stores the transferred captured image data as a still image. In this case, captured image data with relatively low image quality is stored.
Further, the system controller 2 increases the imaging interval. For example, when switching between the 5-second interval and the 10-second interval as the imaging interval, the still-image imaging operation is executed at an interval of 10 seconds after the setting of Step F309.

If it is determined that the heart rate is low and the heart rate variability is small, the system controller 2 proceeds to step F310. This is the case in which the user's psychological state is determined to be "sadness".
In step F310, the system controller 2 performs imaging operation settings such as still image mode setting, large image size setting, small compression rate setting, and short-time setting of the imaging interval.
From the time when the setting in step F310 is performed, the imaging control unit 4 operates in the still image imaging mode. That is, one frame of captured image data obtained by the imaging unit 3 is processed as still image data. At this time, the image size of the captured image data is increased and a compression process with a small compression rate is executed, and the captured image data is transferred to the storage unit 8. The storage unit 8 stores the transferred captured image data as a still image. In this case, captured image data with relatively high image quality is stored.
Further, the system controller 2 shortens the imaging interval. For example, when switching between the 5-second interval and the 10-second interval as the imaging interval as described above, the still-image imaging operation is executed at intervals of 5 seconds after the setting of Step F310.

After the setting process of step F307, F308, F309, or F310, the system controller 2 returns to step F302 via step F311 and repeats the same process as described above.
When the automatic imaging operation ends, this is detected in step F311 and the processing of FIG. 9 also ends.

By performing the processing of FIG. 9, during automatic imaging, imaging operation settings are switched according to determination of joy, anger, relaxation, and sadness as the user's psychological state, and imaging is performed.
For example, a situation where the user feels joy is the most important scene that the user wants to save as a life log, and the captured image data is stored as a high-quality moving image.
The situation in which the user feels angry is a relatively important scene as a life log, and the captured image data is stored as a relatively low-quality moving image.
In a situation where the user feels sad, the captured image data is stored as a large number of high-quality still images.
In a situation where the user is relaxed, the captured image data is stored as a small number of still images with relatively low image quality as a normal state.

The settings according to the user's psychological state are not limited to this example, and many variations are conceivable. For example, situations in which anger or sadness is felt may be treated as the most important scenes and given the more advantageous settings.
It is also conceivable to let the user arbitrarily select which setting is applied to which emotion.

[7. Effects of the embodiment]

According to the present embodiment described above, when automatic imaging is performed at regular intervals, for example, imaging proceeds while the imaging operation settings are adapted to the user's psychology and behavior. When the user's psychological state differs from normal, settings such as high-quality imaging, a shorter imaging interval, or switching from still image to moving image capturing can be applied, so that scenes the user would feel are important are stored more appropriately.
As a result, suitable automatic imaging for life log use is realized, for example imaging scenes impressive to the user at high image quality.
In addition, the user's pulse wave information is used as the biological information; since pulse wave detection is relatively easy, this is advantageous in terms of apparatus configuration.

Also, when a rise in heart rate is detected from the pulse wave, determining whether it is a heart rate variation caused by psychology or by exercise, and performing the imaging operation setting according to the determination result, makes it possible to determine a psychological change accurately and to apply an appropriate imaging operation setting to it.
Further, by performing frequency analysis of the heartbeat interval based on the pulse wave information and determining the user's stress situation, it is possible to set the imaging operation according to the user's stress situation.
Further, by obtaining the heart rate and heart rate fluctuation based on the pulse wave information and determining the psychological state of the user from these, it is possible to perform the imaging operation setting suitable for the emotion of the user.

Although the embodiments of the present invention have been described above, the present invention is not limited to the embodiments described so far, and various modifications can be considered.
The contents of the imaging operation settings, the items to be changed, or the combination thereof, the number of setting change steps that can be switched, etc., are considered to be very diverse.
Although the embodiment has been described as detecting pulse wave information, it is also conceivable to detect other biological information such as brain waves or the amount of perspiration, in addition to the pulse wave, to estimate the psychological state and change the imaging operation setting.

FIG. 1 is an explanatory diagram of an appearance example of the imaging apparatus of the invention. FIG. 2 is a block diagram of the imaging apparatus of an embodiment. FIG. 3 is a block diagram of another configuration example of the imaging apparatus of the embodiment. FIG. 4 is an explanatory diagram of the cause determination for heart rate fluctuation in the embodiment. FIG. 5 is a flowchart of imaging operation setting processing example I of the embodiment. FIG. 6 is an explanatory diagram of stress-state estimation from the heartbeat interval in the embodiment. FIG. 7 is a flowchart of imaging operation setting processing example II of the embodiment. FIG. 8 is an explanatory diagram of psychological estimation from the heart rate and heart rate variability in the embodiment. FIG. 9 is a flowchart of imaging operation setting processing example III of the embodiment.

Explanation of symbols

    DESCRIPTION OF SYMBOLS 1 Imaging device, 2 System controller, 3 Imaging part, 3a Imaging optical system, 3b Imaging element part, 3c Imaging signal processing part, 4 Imaging control part, 5 Display part, 6 Display control part, 7 Operation input part, 8 Storage part, 10 Pulse wave sensor, 11 Pulse wave database

Claims (7)

  1. As an imaging operation, an imaging unit that obtains captured image data of a subject and performs a storage process of the captured image data;
    Biological information detection means for detecting the biological information of the user;
    When performing imaging control as automatic imaging processing that is not based on a user's shutter operation, imaging operation setting is performed based on biological information obtained by the biological information detection unit, and imaging operation based on imaging operation setting is performed as described above. Control means to be executed by the imaging means;
    An imaging apparatus comprising:
  2.   The imaging apparatus according to claim 1, wherein the biological information detection unit detects pulse wave information as the biological information.
  3.   The imaging apparatus according to claim 1, wherein the control means performs, as the imaging operation setting, image quality setting, image size setting, imaging interval time setting, frame rate setting, still image imaging/moving image imaging switching setting, or shutter speed setting.
  4.   The imaging apparatus according to claim 2, wherein the control means determines whether the heart rate fluctuation obtained from the pulse wave information detected by the biological information detection means is a heart rate fluctuation caused by psychology or a heart rate fluctuation caused by exercise, and performs the imaging operation setting according to the determination result.
  5.   The imaging apparatus according to claim 2, wherein the control means determines a stress state of the user based on the pulse wave information detected by the biological information detection means, and performs the imaging operation setting according to the determination result.
  6.   The imaging apparatus according to claim 2, wherein the control means determines a psychological state of the user based on the pulse wave information detected by the biological information detection means, and performs the imaging operation setting according to the determination result.
  7. As an imaging operation, as an imaging method of an imaging apparatus that obtains captured image data of a subject and performs storage processing of the captured image data,
    A biological information detection step for detecting the biological information of the user;
    When performing imaging control as an automatic imaging process that is not based on a user's shutter operation, an imaging operation setting is performed based on the biological information detected in the biological information detection step, and an imaging operation based on the imaging operation setting is executed. Imaging step;
    An imaging method comprising:
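The control flow the claims describe can be illustrated with a minimal sketch. All names, thresholds, and concrete setting values below are hypothetical, and where claim 4 derives the psychology-versus-exercise distinction from the pulse wave itself, this sketch substitutes a simple accelerometer motion level as a stand-in for that analysis:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class ImagingSettings:
    """A subset of the imaging operation settings named in claim 3."""
    image_size: tuple[int, int]   # pixel dimensions of stored frames
    jpeg_quality: int             # compression quality (higher = larger files)
    interval_s: int               # seconds between automatic captures

# Hypothetical thresholds; a real device would calibrate these per user.
HR_ELEVATED_BPM = 90    # heart rate above this counts as elevated
MOTION_ACTIVE = 1.5     # accelerometer magnitude above this counts as exercise

def classify_state(heart_rate_bpm: float, motion_level: float) -> str:
    """Decide whether an elevated heart rate is exercise-induced or
    psychological (claim 4's distinction, approximated via motion level)."""
    if heart_rate_bpm < HR_ELEVATED_BPM:
        return "calm"
    return "exercise" if motion_level >= MOTION_ACTIVE else "aroused"

def choose_settings(state: str) -> ImagingSettings:
    """Map the estimated state to imaging operation settings."""
    if state == "aroused":
        # Psychologically notable moment: capture more often, at higher quality.
        return ImagingSettings((1920, 1080), 90, 10)
    if state == "exercise":
        # HR elevation is merely physical: smaller frames to save storage.
        return ImagingSettings((1280, 720), 75, 20)
    return ImagingSettings((640, 480), 60, 60)  # routine life-log cadence

# Elevated heart rate with little motion is treated as psychological arousal.
settings = choose_settings(classify_state(mean([95, 98, 96]), 0.3))
print(settings.interval_s)  # -> 10
```

The point of the sketch is only the decision structure: biological input, a state estimate, and a settings table keyed by that estimate, matching the abstract's example of adjusting image size, compression rate, and imaging interval.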
JP2007216687A 2007-08-23 2007-08-23 Imaging apparatus and imaging method Pending JP2009049951A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007216687A JP2009049951A (en) 2007-08-23 2007-08-23 Imaging apparatus and imaging method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007216687A JP2009049951A (en) 2007-08-23 2007-08-23 Imaging apparatus and imaging method

Publications (1)

Publication Number Publication Date
JP2009049951A true JP2009049951A (en) 2009-03-05

Family

ID=40501685

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007216687A Pending JP2009049951A (en) 2007-08-23 2007-08-23 Imaging apparatus and imaging method

Country Status (1)

Country Link
JP (1) JP2009049951A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001036800A (en) * 1999-07-23 2001-02-09 Minolta Co Ltd Mounting type camera
JP2005303734A (en) * 2004-04-13 2005-10-27 Ntt Docomo Inc Communication equipment and server system


Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011020583A1 (en) * 2009-08-18 2011-02-24 Horst Sonntag Pulse-measuring device
US20120154612A1 (en) * 2010-12-20 2012-06-21 Samsung Electronics Co., Ltd. Imaging Apparatus and Method of Setting In-Focus Condition
US8537265B2 (en) 2010-12-20 2013-09-17 Samsung Electronics Co., Ltd. Imaging apparatus and method of setting in-focus condition
JPWO2012098574A1 (en) * 2011-01-18 2014-06-09 三菱電機株式会社 Information processing system and information processing apparatus
JP2015033052A (en) * 2013-08-05 2015-02-16 カシオ計算機株式会社 Training support system, server, terminal, camera, method, and program
US9351100B2 (en) 2013-12-05 2016-05-24 Sony Corporation Device for control of data transfer in local area network
US9591682B2 (en) 2013-12-05 2017-03-07 Sony Corporation Automatic password handling
US9489511B2 (en) 2013-12-05 2016-11-08 Sony Corporation Wearable device and a method for storing credentials associated with an electronic device in said wearable device
US9942760B2 (en) 2013-12-05 2018-04-10 Sony Corporation Wearable device and a method for storing credentials associated with an electronic device in said wearable device
US9860928B2 (en) 2013-12-05 2018-01-02 Sony Corporation Pairing consumer electronic devices using a cross-body communications protocol
US9826561B2 (en) 2013-12-05 2017-11-21 Sony Corporation System and method for allowing access to electronic devices using a body area network
US9332377B2 (en) 2013-12-05 2016-05-03 Sony Corporation Device and method for control of data transfer in local area network
CN104735342B (en) * 2013-12-18 2018-08-28 卡西欧计算机株式会社 Moving image processing apparatus, dynamic image processing method and recording medium
JP2015116310A (en) * 2013-12-18 2015-06-25 オータックス株式会社 Monitoring device
US9536566B2 (en) 2013-12-18 2017-01-03 Casio Computer Co., Ltd. Video processing device, video processing method, and recording medium
CN104735342A (en) * 2013-12-18 2015-06-24 卡西欧计算机株式会社 Dynamic image processing device and dynamic image processing method
US9743364B2 (en) 2014-04-24 2017-08-22 Sony Corporation Adaptive transmit power adjustment for phone in hand detection using wearable device
US10194067B2 (en) 2014-06-03 2019-01-29 Sony Mobile Communications Inc. Lifelog camera and method of controlling in association with an intrapersonal area network
WO2015185962A1 (en) * 2014-06-03 2015-12-10 Sony Corporation Lifelog camera and method of controlling in association with an intrapersonal area network
CN106464796A (en) * 2014-06-03 2017-02-22 索尼公司 Lifelog camera and method of controlling in association with an intrapersonal area network
WO2015189713A1 (en) * 2014-06-13 2015-12-17 Sony Corporation Lifelog camera and method of controlling same according to transitions in activity
CN106464812A (en) * 2014-06-13 2017-02-22 索尼公司 Lifelog camera and method of controlling same according to transitions in activity
JP2016019210A (en) * 2014-07-10 2016-02-01 カシオ計算機株式会社 Imaging apparatus, image generating method and program
US9667353B2 (en) 2014-07-11 2017-05-30 Sony Corporation Methods of providing body area network communications when a user touches a button of a wireless electronic device, and related wireless electronic devices and wearable wireless electronic devices
US9848325B2 (en) 2014-07-14 2017-12-19 Sony Corporation Enabling secure application distribution on a (E)UICC using short distance communication techniques
US9674883B2 (en) 2014-07-23 2017-06-06 Sony Mobile Communications Inc. System, an object and a method for grouping of objects in a body area network
JP2016051263A (en) * 2014-08-29 2016-04-11 日本電信電話株式会社 Heartbeat feeling information generation apparatus, heartbeat feeling information generation method, distribution system and program
US10091572B2 (en) 2014-10-22 2018-10-02 Sony Corporation BT and BCC communication for wireless earbuds
US9794670B2 (en) 2014-10-22 2017-10-17 Sony Mobile Communications Inc. BT and BCC communication for wireless earbuds
US9462455B2 (en) 2014-11-11 2016-10-04 Sony Corporation Dynamic user recommendations for ban enabled media experiences
US10136314B2 (en) 2015-01-16 2018-11-20 Sony Corporation BCC enabled key management system
US9712256B2 (en) 2015-02-03 2017-07-18 Sony Corporation Method and system for capturing media by using BAN
US9532275B2 (en) 2015-02-03 2016-12-27 Sony Corporation Body contact communication optimization with link key exchange
US9830001B2 (en) 2015-02-03 2017-11-28 Sony Mobile Communications Inc. Method, device and system for collecting writing pattern using ban
US9842329B2 (en) 2015-02-13 2017-12-12 Sony Corporation Body area network for secure payment
JP2016158683A (en) * 2015-02-27 2016-09-05 Winフロンティア株式会社 Image data organizing system, image data organizing device, image data organizing server device program, image data organizing terminal device program, and image data organizing device program
US9794733B2 (en) 2015-03-25 2017-10-17 Sony Corporation System, method and device for transferring information via body coupled communication from a touch sensitive interface
US10133459B2 (en) 2015-05-15 2018-11-20 Sony Mobile Communications Inc. Usability using BCC enabled devices
JP2017059044A (en) * 2015-09-17 2017-03-23 トヨタ自動車株式会社 Life log recording system
US10133917B2 (en) 2015-09-17 2018-11-20 Toyota Jidosha Kabushiki Kaisha Lifelog recording system
WO2018123057A1 (en) * 2016-12-28 2018-07-05 本田技研工業株式会社 Information providing system

Similar Documents

Publication Publication Date Title
US9179057B2 (en) Imaging apparatus and imaging method that acquire environment information and information of a scene being recorded
US8957981B2 (en) Imaging device for capturing self-portrait images
CN101133438B (en) Electronic display medium and screen display control method used for electronic display medium
US6549231B1 (en) Image recording apparatus
JP4973299B2 (en) Optical communication apparatus and optical communication method
US6900778B1 (en) Display apparatus and method with detecting elements allocated on side facing face of user and on lower side of display windows
JP5553022B2 (en) Image display apparatus, control method, and computer program
US8885875B2 (en) Imaged image data processing apparatus, viewing information creating apparatus, viewing information creating system, imaged image data processing method and viewing information creating method
JP2008042319A (en) Imaging apparatus and method, and facial expression evaluating device and program
US9402033B2 (en) Image sensing apparatus and control method therefor
CN103399403B (en) Display device, display packing
JP2005252732A (en) Imaging device
JP2006509460A (en) Digital camera with removable display that can be used as a remote control
US6558050B1 (en) Human body-mounted camera
US20080080846A1 (en) Selecting autofocus area in an image
JP2008147864A (en) Image display system, display device and display method
US8115816B2 (en) Image capturing method, control method therefor, and program
KR20160105439A (en) Systems and methods for gaze-based media selection and editing
JP4961914B2 (en) Imaging display device and imaging display method
JP4025362B2 (en) Imaging apparatus and imaging method
US8107771B2 (en) Image processing apparatus and image processing method
US8687925B2 (en) Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program
JP2009111843A (en) Imaging apparatus and imaging method
JP5162928B2 (en) Image processing apparatus, image processing method, and image processing system
JP2001183735A (en) Method and device for image pickup

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100802

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110712

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20111115