WO2021152711A1 - Interval imaging device - Google Patents

Interval imaging device

Info

Publication number
WO2021152711A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
interval
imaging
image
shooting
Prior art date
Application number
PCT/JP2020/003053
Other languages
English (en)
Japanese (ja)
Inventor
奥山 宣隆
奥 万寿男
吉澤 和彦
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by マクセル株式会社 filed Critical マクセル株式会社
Priority to JP2021573677A priority Critical patent/JP7343621B2/ja
Priority to PCT/JP2020/003053 priority patent/WO2021152711A1/fr
Publication of WO2021152711A1 publication Critical patent/WO2021152711A1/fr
Priority to JP2023140669A priority patent/JP2023164903A/ja

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to an interval imaging device used as a life log camera or the like.
  • Patent Document 1 discloses an automatic imaging device that determines whether shooting is possible or not possible and controls the device not to execute shooting when it determines that shooting is not possible.
  • Patent Document 1 states that when the imaging device is in a dark place without sufficient light, or when the imaging lens is covered by a jacket or the like, automatic shooting is not performed, so that only images meaningful to the user are captured.
  • In conventional techniques such as Patent Document 1, care is taken not to capture images that are meaningless to the user, but the relationship between the user's motion state and the shooting conditions is not particularly considered.
  • The present invention has been made in view of the above, and its object is to provide an interval imaging device that performs interval shooting under imaging conditions suited to the user's state of movement.
  • To achieve this, the interval imaging device of the present invention includes: an imaging unit capable of shooting in a plurality of shooting modes; a motion sensor that detects the motion of the interval imaging device; a motion determination unit that determines the presence or absence of motion from the detection results of the motion sensor; a timing generation unit that generates, at predetermined intervals, a trigger signal for the imaging unit to shoot; and a shooting control unit that controls the imaging unit and the timing generation unit according to the determination result of the motion determination unit.
  • The shooting control unit switches the imaging unit between a shooting mode used when there is movement (moving state) and a shooting mode used when there is no movement (stationary state).
  • For example, the imaging unit provides a standard mode for shooting at a standard angle of view and a wide-angle mode for shooting at a wide angle of view, and the shooting control unit controls the imaging unit to shoot in the standard mode in the moving state and in the wide-angle mode in the stationary state.
  • Alternatively, the imaging unit can acquire captured images in a plurality of formats, and the shooting control unit controls the imaging unit to acquire a captured image in one of the formats in the moving state and captured images in all of the formats in the stationary state.
  • According to the present invention, it is possible to provide an interval imaging device that efficiently acquires images effective as history images in accordance with the user's state of movement.
  • FIG. 6 is a diagram showing the worn state of the interval imaging device (omnidirectional camera 3) according to the second embodiment; FIG. 7 is a block diagram of the omnidirectional camera 3; FIG. 8 is an external view of the interval imaging device (HMD 4) according to the third embodiment.
  • FIG. 10 is a diagram showing the worn state of the interval imaging device 6 according to the fourth embodiment; FIG. 11 is a block diagram of the interval imaging device 6; FIG. 12 is a time chart showing an example of interval shooting; FIG. 13 is a flowchart showing the interval shooting operation.
  • The interval imaging device of the present invention is worn by the user, intermittently photographs the outside, and includes an imaging unit (camera) capable of shooting in a plurality of shooting modes.
  • The configuration of the imaging unit is described specifically for each embodiment.
  • FIG. 1 is a diagram showing a state in which the user wears the interval imaging device 2 according to the first embodiment.
  • the interval imaging device 2 is attached to the head of the user 1 by the attachment belt 7.
  • the interval imaging device 2 is provided with a standard camera 11 and a wide-angle camera 12 for photographing the front of the user 1 as an imaging unit for externally photographing.
  • FIG. 2 is a block diagram of the interval imaging device 2.
  • the sensors include an acceleration sensor 13, a gyro sensor 14, a geomagnetic sensor 15, a position sensor 16, and an illuminance sensor 17.
  • the user setting unit 18, the communication unit 19, the CPU (or microprocessor) 20, the RAM 21, and the flash ROM 22 are provided, and each unit is connected by an internal bus 26.
  • the standard camera 11 and the wide-angle camera 12 are cameras that capture a standard angle-of-view image and a wide-angle image, respectively, and are used by switching according to the movement state of the user 1 as described later.
  • the image to be captured is for obtaining the user's action history, and may be a still image or a short moving image.
  • The captured image data is stored in the shooting data holding unit 25 or in external storage via the communication unit 19.
  • the acceleration sensor 13 and the gyro sensor 14 detect the movement and shaking of the interval image pickup device 2 (that is, the user 1 who wears the interval image pickup device 2).
  • the acceleration sensor 13 detects acceleration in the uniaxial, biaxial or triaxial directions
  • the gyro sensor 14 detects angular velocities in the uniaxial, biaxial or triaxial directions.
  • The geomagnetic sensor 15 acquires orientation information of the interval imaging device 2, and the position sensor (for example, a GPS receiver) 16 acquires position information; this information is attached to the image data and handled as metadata of the captured image.
  • the illuminance sensor 17 is used to detect the brightness of the surroundings and adjust the shooting conditions of the camera.
  • the user setting unit 18 is an operation input unit for the user to set the time interval (shooting interval) of interval shooting to a desired value.
  • The communication unit 19 includes all or some of communication functions such as mobile communication (for example, 4G), wireless LAN, and Bluetooth (registered trademark).
  • the communication unit 19 can also transmit the image data taken by the standard camera 11 or the wide-angle camera 12 to the external storage via the network.
  • The CPU 20 loads the program 23 stored in the flash ROM 22 into the RAM 21 and executes it, thereby controlling each component of the interval imaging device 2 and realizing various functions.
  • the flash ROM 22 includes a program 23 and a shooting data holding unit 25, and the program 23 further includes a motion determination process 24a, a timing generation process 24b, and a shooting control process 24c that constitute the interval shooting application 24.
  • When the interval shooting application 24 is executed, the corresponding functional blocks (motion determination unit 24a, timing generation unit 24b, and shooting control unit 24c) are configured.
  • The motion determination unit 24a compares the detection values of the acceleration sensor 13 and the gyro sensor 14 with separately set threshold values, and determines whether or not the interval imaging device 2 (that is, the user 1 wearing it) is moving. Even when the interval imaging device 2 (user 1) is moving at a constant speed, it is determined that there is movement (moving state); the current velocity can be calculated by integrating the acceleration over time.
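  • As a rough illustration only (the patent does not give an implementation), the motion determination described above might look like the following Python sketch; the threshold values, class name, and method names are assumptions made for this example.

      import math

      # Hypothetical thresholds; the patent only says the sensor readings are
      # compared with separately set threshold values.
      ACCEL_THRESHOLD = 0.5   # m/s^2
      GYRO_THRESHOLD = 0.2    # rad/s
      SPEED_THRESHOLD = 0.3   # m/s

      def norm(vec):
          """Euclidean magnitude of a 1-, 2-, or 3-axis sensor reading."""
          return math.sqrt(sum(v * v for v in vec))

      class MotionDeterminationUnit:
          """Rough sketch of the motion determination unit 24a."""

          def __init__(self):
              # Velocity estimate obtained by integrating acceleration over time,
              # so that movement at a constant speed is still judged as "moving".
              self.velocity = [0.0, 0.0, 0.0]

          def is_moving(self, accel, gyro, dt):
              """accel: linear acceleration (gravity removed, m/s^2); gyro: rad/s; dt: s."""
              self.velocity = [v + a * dt for v, a in zip(self.velocity, accel)]
              return (norm(accel) > ACCEL_THRESHOLD
                      or norm(gyro) > GYRO_THRESHOLD
                      or norm(self.velocity) > SPEED_THRESHOLD)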
  • the timing generation unit 24b generates a trigger signal indicating the shooting timing at the set shooting interval.
  • Based on the determination result of the motion determination unit 24a, the shooting control unit 24c switches the shooting mode (that is, switches between the standard camera 11 and the wide-angle camera 12) and controls the generation of trigger signals by the timing generation unit 24b.
  • the CPU 20, RAM 21, and flash ROM 22 may be mounted on one integrated circuit.
  • FIG. 3 is a diagram showing an example of a standard angle-of-view image and a wide-angle image.
  • a standard mode and a wide-angle mode are provided as shooting modes, and these modes are appropriately switched.
  • In the standard mode, the standard angle-of-view image 81 is captured by the standard camera 11, and in the wide-angle mode, the wide-angle image 82 is captured by the wide-angle camera 12.
  • The standard angle-of-view image 81 is, for example, an image centered on the person 83 in front and the area around them, whereas the wide-angle image 82 captures a wide range of the background and is therefore effective for grasping the surrounding situation in more detail.
  • FIGS. 4A and 4B are time charts showing an example of interval shooting. From the top, they show the time transitions of the motion determination result, the shooting timing, and the shooting mode.
  • The motion determination result is the determination, by the motion determination unit 24a, of the motion state of the interval imaging device 2.
  • Hereinafter, the state with movement and its period are referred to as the "moving state" and the "moving period", and the state without movement and its period as the "stationary state" and the "stationary period". In the drawings, the moving state is abbreviated "moving" and the stationary state "stationary".
  • the shooting timing signals T1 to T11 are generated by the timing generation unit 24b, and based on this, a trigger signal to the camera is generated.
  • the shooting mode has a standard mode and a wide-angle mode, and is switched by the shooting control unit 24c.
  • The shooting timing interval ΔT is constant except for the intervals [T2, T3] and [T8, T9], and can be set in units of seconds, minutes, or hours via the user setting unit 18 according to how finely the user's action history is to be obtained.
  • Here, the shooting interval ΔT is set to about several minutes.
  • At T3 and T9, where the state changes from the moving state to the stationary state, the shooting timing interval counted up to that point is reset and a new shooting timing is set. That is, T3 and T9 themselves become new shooting timings, and thereafter the interval ΔT is measured from these timings.
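  • A minimal sketch of this behaviour of the timing generation unit 24b (Python; the class and method names are placeholders, not from the patent): triggers are emitted every ΔT seconds, and a reset at a moving-to-stationary transition such as T3 or T9 treats the transition time itself as a new shooting timing.

      class TimingGenerationUnit:
          """Sketch of the timing generation unit 24b."""

          def __init__(self, interval_s):
              self.interval_s = interval_s     # shooting interval ΔT
              self.last_trigger = 0.0

          def reset(self, now):
              """Called at a moving-to-stationary transition: the transition time
              itself becomes a new shooting timing, and ΔT is measured from it."""
              self.last_trigger = now
              return True                      # trigger a shot immediately

          def poll(self, now):
              """Return True when ΔT has elapsed since the last trigger."""
              if now - self.last_trigger >= self.interval_s:
                  self.last_trigger = now
                  return True
              return False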
  • In the moving period, shooting is performed in the standard mode, using only the standard camera 11.
  • The standard camera 11 may have a smaller number of pixels or the like than the wide-angle camera 12.
  • In the stationary period, shooting is performed in the wide-angle mode, using the wide-angle camera 12.
  • At this time, shooting may also be performed with the standard camera 11 at the same time.
  • FIG. 4B shows a case where, compared with FIG. 4A, additional shooting timings are inserted during the stationary period.
  • In the period [T3, T4], shooting timings T3a and T3b are added, and in the period [T9, T10], shooting timings T9a and T9b are added; shooting at these timings is performed in the wide-angle mode.
  • The shooting interval ΔT' in the stationary period is set to about several tens of seconds. This makes it possible to observe the surroundings in more detail when there are moving or changing objects nearby.
  • FIG. 5 is a flowchart showing the interval shooting operation.
  • The shooting operation is carried out by the interval shooting application 24 (motion determination unit 24a, timing generation unit 24b, and shooting control unit 24c).
  • As parameters, the timer value t, the movement flag F, the shooting interval ΔT during the moving period (first shooting interval), and the shooting interval ΔT' during the stationary period (second shooting interval) are used.
  • S103: The outputs of the acceleration sensor 13 and the gyro sensor 14 are acquired.
  • S104: The motion determination unit 24a determines the motion state.
  • S105: The process proceeds to S106 in the moving state and to S109 in the stationary state.
  • S107: The timer value t is compared with the first shooting interval ΔT. If t ≥ ΔT, the process proceeds to S108; otherwise, it proceeds to S115.
  • S108: Shooting is performed in the standard mode, that is, with the standard camera 11. As a result, in the moving state, shooting in the standard mode is performed at the first shooting interval ΔT.
  • S113: The captured image is saved in the shooting data holding unit 25, or transmitted from the communication unit 19 to external storage and saved.
  • S114: The timer value t is reset (t = 0).
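  • Putting the steps above together, the interval shooting operation might be sketched as follows (Python). The sensor, camera, and storage objects are assumed interfaces invented for this sketch, the interval values are placeholders, and the handling of the moving-to-stationary transition (the immediate shot at T3/T9) is inferred from FIGS. 4A and 4B rather than quoted from the flowchart.

      import time

      DELTA_T = 5 * 60    # first shooting interval ΔT for the moving state (placeholder: a few minutes)
      DELTA_T2 = 30       # second shooting interval ΔT' for the stationary state (placeholder: tens of seconds)

      def interval_shooting_loop(motion_unit, sensors, standard_camera, wide_camera, storage):
          """Sketch of the interval shooting operation of FIG. 5 under the assumptions above."""
          t = 0.0              # timer value t
          moving_flag = False  # movement flag F: True while the previous state was "moving"
          prev = time.monotonic()

          while True:
              now = time.monotonic()
              dt = now - prev
              prev = now
              t += dt

              accel, gyro = sensors.read()                       # S103: read the motion sensors
              moving = motion_unit.is_moving(accel, gyro, dt)    # S104: motion determination

              if moving:                                         # S105: moving state
                  moving_flag = True
                  if t >= DELTA_T:                               # S107: compare timer with ΔT
                      storage.save(standard_camera.shoot())      # S108, S113: standard mode
                      t = 0.0                                    # S114: reset the timer
              else:                                              # S105: stationary state
                  # Shoot immediately right after a moving-to-stationary transition
                  # (the reset at T3/T9 in FIG. 4A), otherwise every ΔT'.
                  if moving_flag or t >= DELTA_T2:
                      storage.save(wide_camera.shoot())          # wide-angle mode
                      t = 0.0
                  moving_flag = False

              time.sleep(1.0)                                    # sampling/wait step (assumed)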
  • In this way, when stationary, an image effective for observing the background can be acquired with the wide-angle camera 12, and richer image information can be obtained by shortening the shooting interval.
  • In addition, since shooting is performed immediately after the transition from the moving state to the stationary state, a shot in the stationary state is not missed even if the stationary period is short.
  • Meanwhile, the capacity of the shooting data holding unit 25 and the power consumption of the device can be saved, so an efficient interval imaging device can be realized.
  • In this embodiment, switching the shooting mode means switching between shooting with the standard camera 11 and shooting with the wide-angle camera 12 according to the motion state; however, shooting may instead be performed with both cameras 11 and 12 each time, and only the captured image corresponding to the motion state may then be selected and saved in the shooting data holding unit 25. The same applies to the following embodiments.
  • FIG. 6 is a diagram showing a state in which the user wears the interval imaging device (omnidirectional camera 3) according to the second embodiment.
  • the user 1 attaches the omnidirectional camera 3 to the head using a mounting jig 8 such as a helmet.
  • The omnidirectional camera 3 has a first hemispherical optical system 31 for photographing the hemisphere in front of the user 1 and a second hemispherical optical system 32 for photographing the hemisphere behind the user 1.
  • FIG. 7 shows a block diagram of the omnidirectional camera 3.
  • the same components as those of the interval imaging device 2 of the first embodiment (FIG. 2) are designated by the same reference numerals, and redundant description will be omitted.
  • The first hemispherical optical system 31 is composed of a fisheye lens or the like and captures the front hemisphere of the user 1.
  • The second hemispherical optical system 32 is likewise composed of a fisheye lens or the like and captures the rear hemisphere of the user 1.
  • The images captured through the first and second hemispherical optical systems are projected onto the image sensor (imaging element) 33 and read out as image data. At that time, by selecting the readout region of the image sensor, the front-hemisphere image and the rear-hemisphere image can be acquired separately.
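  • One way to picture this region selection is that the two fisheye projections land side by side on the single image sensor 33, so reading out one hemisphere amounts to cropping half of the frame. The side-by-side layout and the NumPy-based sketch below are assumptions for illustration only, not taken from the patent.

      import numpy as np

      def split_hemispheres(frame):
          """Split a dual-fisheye frame into front- and rear-hemisphere images.

          Assumes (hypothetically) that the projections of the first and second
          hemispherical optical systems occupy the left and right halves of the
          sensor; selecting a readout region then reduces to slicing the array.
          """
          h, w = frame.shape[:2]
          front = frame[:, : w // 2]   # image through the first hemispherical optics 31
          rear = frame[:, w // 2 :]    # image through the second hemispherical optics 32
          return front, rear

      # Example with a dummy 960 x 1920 RGB frame.
      front, rear = split_hemispheres(np.zeros((960, 1920, 3), dtype=np.uint8))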
  • Interval shooting with the omnidirectional camera 3 follows the same shooting timings and shooting modes as shown in FIGS. 4A and 4B of the first embodiment, and the shooting operation follows the same flowchart as FIG. 5. However, regarding the shooting modes, in the standard mode a front-hemisphere image is acquired using the first hemispherical optical system 31, and in the wide-angle mode an omnidirectional (full-sphere) image is acquired using both the first hemispherical optical system 31 and the second hemispherical optical system 32.
  • According to the second embodiment, an omnidirectional image effective for observing the background can be acquired when stationary, and a history image can still be obtained at the predetermined shooting interval while moving.
  • Example 3 describes a case where a head-mounted display (HMD) is used as the interval imaging device.
  • FIG. 8 is an external view of the interval imaging device (head-mounted display (HMD) 4) according to the third embodiment.
  • The HMD 4 includes a left line-of-sight camera 41, a right line-of-sight camera 42, left and right projection optical systems 44a and 44b, a display optical system 45 such as a lens or a screen, a speaker 46, a microphone 47, frame housings 48a to 48c, and a nose pad 49. It also has a controller 50 and the like.
  • The controller 50, the left line-of-sight camera 41, the right line-of-sight camera 42, the speaker 46, and the microphone 47 are arranged in the frame housings 48a to 48c; their arrangement does not have to be as shown in FIG. 8.
  • a group of sensors such as an acceleration sensor, a gyro sensor, and a position sensor are built in.
  • The left line-of-sight camera 41 and the right line-of-sight camera 42 form a three-dimensional (stereo) camera that photographs the scene ahead along the user's line of sight and measures the distance to an object in the real space captured in the image, using the difference between the left and right viewpoints. The measured distance serves as depth information for the captured image and is useful for interpreting it.
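  • Depth from such a stereo pair is conventionally recovered from the disparity between the left and right views. The sketch below uses the standard pinhole relation Z = f·B/d; the focal length and baseline values are placeholders roughly matching a glasses-type device and are not taken from the patent.

      def depth_from_disparity(disparity_px, focal_length_px=1400.0, baseline_m=0.06):
          """Depth (in metres) of a point from its left/right pixel disparity.

          Uses the standard stereo relation Z = f * B / d. The default focal
          length and baseline are illustrative placeholders only.
          """
          if disparity_px <= 0:
              return float("inf")   # no measurable disparity: effectively at infinity
          return focal_length_px * baseline_m / disparity_px

      # A feature seen 20 px apart between the left and right cameras:
      print(depth_from_disparity(20.0))   # about 4.2 m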
  • The controller 50 takes the real-space images captured by the left line-of-sight camera 41 and the right line-of-sight camera 42, the distance data to objects in the real space, and the like into its internal memory and CPU. The controller 50 also creates, by computer graphics or the like, the images projected by the projection optical systems 44a and 44b, as well as the audio output from the speaker 46.
  • The projection optical systems 44a and 44b and the display optical system 45 serve as the display unit of the HMD 4.
  • the projection optical systems 44a and 44b divide the image of the virtual object created by the controller 50 into a left-eye image and a right-eye image, and project and display the image on the display optical system 45.
  • The user 1 sees the landscape or real objects in front through the transmissive display optical system 45, and can visually recognize, superimposed on them, the image of a virtual object projected onto the display optical system 45 from the projection optical systems 44a and 44b.
  • FIG. 9 shows a block diagram of the head-mounted display (HMD) 4.
  • the same components as those in FIG. 8 are designated by the same reference numerals, and the other elements include a sensor group 43, a distance calculation unit 51, a communication unit 52, a CPU 53, a RAM 54, an image RAM 55, and a flash ROM 56. Each part is connected by an internal bus 60.
  • the projection optical system 44 corresponds to the projection optical systems 44a and 44b of FIG. 8, and the left-eye image and the right-eye image are independently projected onto the display optical system 45.
  • Alternatively, a method may be used in which one projector projects the left-eye and right-eye images interleaved in time, and a shutter optical system delivers each image to the corresponding eye.
  • an optical system using a holographic lens may be used.
  • The communication unit 52 has a plurality of communication functions such as mobile communication, wireless LAN, and Bluetooth (registered trademark), and connects the HMD 4 to external storage or the like via a network.
  • the CPU 53 expands the program stored in the flash ROM 56 into the RAM 54 and executes the program to control the operation of each component of the HMD 4.
  • the image RAM 55 stores video data to be transmitted to the projection optical system 44.
  • the flash ROM 56 includes a basic operation program 57, a processing program of the interval shooting application 58, and a shooting data holding unit 59.
  • the basic operation program 57 performs the projection operation process as the HMD 4, and the interval shooting application 58 performs the motion determination process 24a, the timing generation process 24b, and the shooting control process 24c described in the first embodiment (FIG. 2). Further, the shooting data holding unit 59 stores the image data shot by the camera.
  • Interval shooting with the HMD 4 in this embodiment follows the same shooting timings and shooting modes as shown in FIGS. 4A and 4B of the first embodiment, and the shooting operation follows the same flowchart as FIG. 5. However, regarding the shooting modes, in the standard mode either the left line-of-sight camera 41 or the right line-of-sight camera 42 is used for shooting, and in the wide-angle mode three-dimensional (stereo) shooting is performed using both the left line-of-sight camera 41 and the right line-of-sight camera 42.
  • According to the third embodiment, a three-dimensional image effective for observing the background can be acquired when stationary, and a history image can still be obtained at the predetermined shooting interval while moving. In addition, the amount of captured data and the power consumption for history images during movement can be reduced.
  • Moreover, since the optical axes of the lenses of the left line-of-sight camera 41 and the right line-of-sight camera 42 can be substantially aligned with the user's line of sight, the scene the user actually sees can be saved as a three-dimensional photograph. Furthermore, by operating the projection optical system 44 and the display optical system 45, the captured images can be viewed retrospectively without interrupting the interval shooting.
  • In Example 4, the motion determination of Example 1 is performed at a plurality of levels. Specifically, the movement of the user's head and the movement of the user's torso are detected, and shooting is controlled by comparing the two.
  • FIG. 10 is a diagram showing a state in which the user wears the interval imaging device 6 according to the fourth embodiment.
  • the main body portion 6a of the interval imaging device 6 (hereinafter referred to as the device main body portion) is attached to the head of the user 1 by the attachment belt 7.
  • the apparatus main body 6a is provided with a standard camera 11 and a wide-angle camera 12.
  • the sensor terminal 6b is attached to the body (for example, the chest) of the user 1.
  • the sensor terminal 6b detects the movement of the body of the user 1.
  • That is, motion determination is performed using both the motion data of the user's head and the motion data of the user's torso; hereinafter, the movement of the user's head is referred to as the "first motion data" and the movement of the user's torso as the "second motion data".
  • FIG. 11 shows a block diagram of the interval imaging device 6.
  • the configuration of the device main body 6a is the same as that of the first embodiment (FIG. 2), and the acceleration sensor 13 and the gyro sensor 14 are “first motion sensors” that detect the movement of the user's head.
  • the interval shooting application 24 includes a motion determination process 24a, a timing generation process 24b, and a shooting control process 24c. In this figure, some components of the device main body 6a are omitted.
  • the configuration of the sensor terminal 6b includes an acceleration sensor 61, a gyro sensor 62, a communication unit 63, a CPU (or microcomputer) 64, a RAM 65, and a flash ROM 66.
  • The acceleration sensor 61 and the gyro sensor 62 are "second motion sensors" that detect the movement of the user's torso.
  • the communication unit 63 supports a low power communication protocol such as Bluetooth (registered trademark), and performs one-to-one communication with the communication unit 19 of the device main body 6a.
  • the operation program 67 stored in the flash ROM 66 is for performing a process of acquiring the motion data of the user's body by the second motion sensors 61 and 62 and transmitting the motion data to the apparatus main body 6a.
  • the sensor terminal 6b acquires the movement data of the user's body in conjunction with the shooting operation of the device main body 6a. Specifically, when a motion data request is received from the device body 6a, the detected values of the acceleration sensor 61 and the gyro sensor 62 are transmitted to the device body 6a as second motion data.
  • The motion determination unit 24a determines the motion state at a plurality of levels; in this determination, the correlation between the two sets of motion data is examined together with the presence or absence of each movement.
  • When both the first motion data and the second motion data indicate "motion" and have the same direction and magnitude (correlated), that is, when the head and torso move in synchronization, the state is determined to be "moving state 1", and shooting is performed in the standard mode (standard camera 11 only).
  • When the first and second motion data indicate motion but differ in direction and magnitude (uncorrelated), that is, when the head and torso move randomly and out of synchronization, the state is determined to be "moving state 2", and camera shooting is restricted (stopped).
  • When both the first and second motion data indicate "no motion", the state is determined to be the "stationary state", and shooting is performed in the wide-angle mode (wide-angle camera 12).
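  • A rough sketch of this three-level determination (Python): the "correlation" between head and torso data is approximated here with a cosine similarity of the acceleration vectors, and the threshold values are placeholders; neither choice is specified in the patent. Cases where only one of the two sensors reports motion are not described in the patent and are treated here as "moving state 2".

      import math

      MOTION_THRESHOLD = 0.5        # "with motion" above this magnitude (placeholder)
      CORRELATION_THRESHOLD = 0.7   # cosine similarity treated as "correlated" (placeholder)

      def norm(v):
          return math.sqrt(sum(x * x for x in v))

      def cosine(a, b):
          na, nb = norm(a), norm(b)
          return sum(x * y for x, y in zip(a, b)) / (na * nb) if na and nb else 0.0

      def determine_state(head_accel, torso_accel):
          """Three-level determination from the first (head) and second (torso) motion data."""
          head_moving = norm(head_accel) > MOTION_THRESHOLD
          torso_moving = norm(torso_accel) > MOTION_THRESHOLD
          if not head_moving and not torso_moving:
              return "stationary"   # shoot in the wide-angle mode
          if head_moving and torso_moving and cosine(head_accel, torso_accel) > CORRELATION_THRESHOLD:
              return "moving 1"     # head and torso move in synchronization: standard mode
          return "moving 2"         # uncorrelated or one-sided movement: restrict shooting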
  • FIG. 12 is a time chart showing an example of interval shooting.
  • the notation in the figure is the same as in the first embodiment (FIG. 4A).
  • The motion determination result classifies the state of motion into three levels: "moving 1", "moving 2", and "stationary".
  • For the shooting mode, in addition to the standard mode and the wide-angle mode, a mode in which shooting is restricted is added.
  • The shooting timing interval ΔT is set in the same manner as in FIG. 4A, and at T23 and T30, where the motion determination transitions from the moving state to the stationary state, the shooting timing interval counted up to that point is reset and a new shooting timing is set.
  • When the motion determination result is "moving 1", shooting is performed in the standard mode (for example, at T21 and T22).
  • When the motion determination result is "moving 2", camera shooting is restricted (for example, at T27 and T28).
  • When the motion determination result is "stationary", shooting is performed in the wide-angle mode (for example, at T23 and T24).
  • Here, the shooting timing interval ΔT is set to be the same in the standard mode and the wide-angle mode, but as in FIG. 4B, a different interval ΔT' may be used in the wide-angle mode.
  • FIG. 13 is a flowchart showing the interval shooting operation.
  • the steps having the same contents as the flowchart shown in the first embodiment (FIG. 5) are given the same numbers.
  • The shooting timing interval is ΔT in the standard mode and ΔT' in the wide-angle mode.
  • the steps different from the flowchart of FIG. 5 are as follows.
  • S103a: The output of the first motion sensor (acceleration sensor 13, gyro sensor 14) of the device main body 6a is acquired.
  • S103b: The output of the second motion sensor (acceleration sensor 61, gyro sensor 62) of the sensor terminal 6b is acquired.
  • The motion determination unit 24a compares the first motion data with the second motion data and performs a three-level motion determination. If the first and second motion data both indicate motion and are correlated, the state is regarded as "moving state 1"; if they both indicate motion but are not correlated, as "moving state 2"; and if both indicate no motion, as the "stationary state".
  • As described above, in this embodiment motion is determined using motion sensors attached to the user's head and torso, and camera shooting is restricted when there is no correlation between the two sets of motion data even though both indicate motion.
  • Alternatively, the motion state may be determined at a plurality of levels according to the magnitude of the motion data, using only the first motion sensors 13 and 14 of the device main body 6a. That is, it is determined whether the movement is small or large, and for a large movement exceeding a predetermined value, camera shooting is restricted to reduce unnecessary shooting; the same effect is obtained.
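  • A minimal sketch of this single-sensor variant (the two threshold values are placeholders, not taken from the patent):

      SMALL_MOTION = 0.5   # below this magnitude: treated as stationary (placeholder)
      LARGE_MOTION = 3.0   # above this magnitude: shooting is restricted (placeholder)

      def single_sensor_level(accel_magnitude):
          """Multi-level determination using only the head-mounted (first) motion sensor."""
          if accel_magnitude < SMALL_MOTION:
              return "stationary"       # shoot in the wide-angle mode
          if accel_magnitude < LARGE_MOTION:
              return "moving (small)"   # shoot in the standard mode
          return "moving (large)"       # restrict shooting to avoid unnecessary frames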
  • The present invention is not limited to the embodiments described above: part of the configuration of one embodiment may be replaced with that of another, and the configuration of another embodiment may be added to the configuration of one embodiment. All such variations fall within the scope of the present invention. The numerical values appearing in the text and drawings are merely examples, and the effect of the present invention is not impaired if different values are used.
  • In the above description, the interval imaging device of the present invention is attached to a user to acquire the user's history images (life log), but the present invention is not limited to this; the interval imaging device can also be attached to a moving body such as an animal or a vehicle to acquire images of its action history.
  • The functions of the invention may be implemented partly or entirely in hardware, for example by designing them as an integrated circuit, or in software, by having a microprocessor unit, CPU, or the like interpret and execute an operation program. The range implemented in software is not limited, and hardware and software may be used together.
  • 4 HMD (head-mounted display)
  • 6a Device body
  • 6b Sensor terminal
  • 11 Standard camera
  • 12 Wide angle camera
  • 13 Acceleration sensor
  • 14, 62 Gyro sensor
  • 19, 63 Communication unit
  • 24, 58 Interval shooting application
  • 24a Motion determination processing
  • 24b Timing generation processing
  • 24c Shooting control processing
  • 25, 59 Shooting data holding unit
  • 31 First hemispherical optical system
  • 32 Second hemispherical optical system
  • 33 Image sensor (imaging element)
  • 42 Right line-of-sight camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Cameras In General (AREA)

Abstract

The present invention relates to an interval imaging device that captures images under image-capture conditions suited to the user's state of motion. The interval imaging device 2 comprises: imaging units 11, 12 capable of capturing images in a plurality of shooting modes; motion sensors 13, 14 for detecting motion of the interval imaging device; a motion determination unit 24a for determining the presence of motion from the detection results of the motion sensors; a timing generation unit 24b for generating, at prescribed intervals, a trigger signal for the imaging units to capture an image; and a shooting control unit 24c for controlling the imaging units and the timing generation unit according to the determination result of the motion determination unit. The imaging units are provided with, for example, a standard mode for capturing an image at a standard angle of view and a wide-angle mode for capturing an image at a wide angle of view, and the shooting control unit controls the imaging units so as to capture an image in the standard mode when there is motion (moving state) and in the wide-angle mode when there is no motion (stationary state).
PCT/JP2020/003053 2020-01-28 2020-01-28 Dispositif d'imagerie d'intervalle WO2021152711A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2021573677A JP7343621B2 (ja) 2020-01-28 2020-01-28 インターバル撮像装置
PCT/JP2020/003053 WO2021152711A1 (fr) 2020-01-28 2020-01-28 Dispositif d'imagerie d'intervalle
JP2023140669A JP2023164903A (ja) 2020-01-28 2023-08-31 インターバル撮像装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/003053 WO2021152711A1 (fr) 2020-01-28 2020-01-28 Dispositif d'imagerie d'intervalle

Publications (1)

Publication Number Publication Date
WO2021152711A1 true WO2021152711A1 (fr) 2021-08-05

Family

ID=77078046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/003053 WO2021152711A1 (fr) 2020-01-28 2020-01-28 Dispositif d'imagerie d'intervalle

Country Status (2)

Country Link
JP (2) JP7343621B2 (fr)
WO (1) WO2021152711A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009118135A (ja) * 2007-11-06 2009-05-28 Sony Corp 撮像装置、撮像方法
JP2009267792A (ja) * 2008-04-25 2009-11-12 Panasonic Corp 撮像装置
JP2010011343A (ja) * 2008-06-30 2010-01-14 Toyota Motor Corp 画像取得装置、画像取得方法、及び移動体
JP2013153329A (ja) * 2012-01-25 2013-08-08 Nikon Corp 電子機器
JP2015005809A (ja) * 2013-06-19 2015-01-08 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP2019121855A (ja) * 2017-12-28 2019-07-22 キヤノン株式会社 撮像装置およびその制御方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4725595B2 (ja) 2008-04-24 2011-07-13 ソニー株式会社 映像処理装置、映像処理方法、プログラム及び記録媒体

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009118135A (ja) * 2007-11-06 2009-05-28 Sony Corp 撮像装置、撮像方法
JP2009267792A (ja) * 2008-04-25 2009-11-12 Panasonic Corp 撮像装置
JP2010011343A (ja) * 2008-06-30 2010-01-14 Toyota Motor Corp 画像取得装置、画像取得方法、及び移動体
JP2013153329A (ja) * 2012-01-25 2013-08-08 Nikon Corp 電子機器
JP2015005809A (ja) * 2013-06-19 2015-01-08 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP2019121855A (ja) * 2017-12-28 2019-07-22 キヤノン株式会社 撮像装置およびその制御方法

Also Published As

Publication number Publication date
JPWO2021152711A1 (fr) 2021-08-05
JP2023164903A (ja) 2023-11-14
JP7343621B2 (ja) 2023-09-12

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20916849

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021573677

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20916849

Country of ref document: EP

Kind code of ref document: A1