CN117917089A - Imaging device and imaging method - Google Patents

Imaging device and imaging method

Info

Publication number
CN117917089A
Authority
CN
China
Prior art keywords
target area
control unit
focus position
focal position
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280059544.8A
Other languages
Chinese (zh)
Inventor
中岛伸基
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
JVCKenwood Corp
Original Assignee
JVCKenwood Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by JVCKenwood Corp filed Critical JVCKenwood Corp
Priority claimed from PCT/JP2022/029298 (published as WO2023047802A1)
Publication of CN117917089A


Landscapes

  • Studio Devices (AREA)
  • Focusing (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The focal position is aligned appropriately. An imaging device (100) includes: an imaging element; an object information acquisition unit that acquires position information of an object present in an imaging region (AR0) of the imaging element; and a focal position control unit that controls the focal position of the imaging device (100). The focal position control unit aligns the focal position with an object in a target area (AR) lying between a first position (AX1), which is at a first distance (L1) from the imaging device (100), and a second position (AX2), which is at a second distance (L2) from the imaging device (100), the second distance (L2) being smaller than the first distance (L1). The focal position is kept aligned with the object while the object is present in the target area (AR), and is moved away from the object after the object moves outside the target area (AR).

Description

Imaging device and imaging method
Technical Field
The present invention relates to an imaging apparatus and an imaging method.
Background
Imaging devices with an autofocus function, in which the focal position is set automatically, are known. For example, Patent Document 1 describes a method of focusing on a predetermined position designated by a user.
Prior art literature
Patent literature
Patent document 1: international publication No. 2017/141746.
Disclosure of Invention
An imaging device with an autofocus function is required to align the focal position appropriately.
In view of the above problem, an object of the present embodiment is to provide an imaging device, an imaging method, and a program capable of aligning the focal position appropriately.
An imaging device according to an aspect of the present embodiment is an imaging device capable of imaging an object, including: an imaging element; an object information acquisition unit that acquires position information of an object present in an imaging region of the imaging element; and a focal position control unit that controls a focal position of the imaging device, wherein the focal position control unit aligns the focal position with an object in a target area lying between a first position and a second position, the first position being at a first distance from the imaging device, the second position being at a second distance from the imaging device, the second distance being smaller than the first distance, and the focal position control unit keeps the focal position aligned with the object while the object is present in the target area and moves the focal position away from the object after the object moves outside the target area.
An imaging method according to an aspect of the present embodiment is an imaging method for imaging an object, including the steps of: acquiring position information of an object present in an imaging region; and controlling a focal position of an imaging device, wherein, in the step of controlling the focal position, the focal position is aligned with an object in a target area lying between a first position and a second position, the first position being at a first distance from the imaging device, the second position being at a second distance from the imaging device, the second distance being smaller than the first distance, the focal position is kept aligned with the object while the object is present in the target area, and the focal position is moved away from the object after the object moves outside the target area.
A program according to an aspect of the present embodiment causes a computer to execute an imaging method for imaging an object, the program causing the computer to execute the steps of: acquiring position information of an object present in an imaging region; and controlling a focal position of an imaging device, wherein, in the step of controlling the focal position, the focal position is aligned with an object in a target area lying between a first position and a second position, the first position being at a first distance from the imaging device, the second position being at a second distance from the imaging device, the second distance being smaller than the first distance, the focal position is kept aligned with the object while the object is present in the target area, and the focal position is moved away from the object after the object moves outside the target area.
According to the present embodiment, focusing can be performed appropriately.
Drawings
Fig. 1 is a schematic block diagram of an imaging device according to a first embodiment.
Fig. 2 is a schematic diagram for explaining an example of the target area.
Fig. 3 is a schematic diagram for explaining an example of the target area.
Fig. 4 is a schematic diagram showing another example of the target area.
Fig. 5 is a schematic diagram showing another example of the target area.
Fig. 6 is a flowchart illustrating the flow of processing for setting the focal position.
Fig. 7 is a schematic diagram illustrating an example of a case where the movement of the object is used as a predetermined condition.
Fig. 8 is a flowchart illustrating the flow of processing for setting the focal position in the second embodiment.
Fig. 9 is a schematic diagram illustrating an example of a case where a plurality of objects exist in the target area.
Fig. 10 is a flowchart illustrating the flow of processing for setting the focal position according to the third embodiment.
Fig. 11 is a schematic diagram for explaining a method of setting the focal position in the fourth embodiment.
Fig. 12 is a flowchart illustrating the flow of processing for setting the focal position according to the fourth embodiment.
Fig. 13 is a schematic block diagram of an imaging device according to a fifth embodiment.
Fig. 14 is a schematic diagram for explaining an example of the target area.
Fig. 15 is a schematic diagram for explaining an example of the target area.
Fig. 16 is a schematic diagram showing another example of the target area.
Fig. 17 is a schematic diagram showing another example of the target area.
Fig. 18 is a flowchart illustrating the flow of processing for setting the focal position.
Fig. 19 is a schematic diagram for explaining the setting of the focal position in the sixth embodiment.
Fig. 20 is a flowchart illustrating the flow of processing for setting the focal position in the sixth embodiment.
Fig. 21 is a schematic diagram illustrating an example of a case where the movement of the object is used as a predetermined condition.
Fig. 22 is a schematic block diagram of an imaging device according to an eighth embodiment.
Fig. 23 is a schematic diagram for explaining an example of the target area.
Fig. 24 is a schematic diagram for explaining an example of the target area.
Fig. 25 is a schematic diagram showing another example of the target area.
Fig. 26 is a schematic diagram showing another example of the target area.
Fig. 27 is a flowchart illustrating the flow of processing for setting the focal position.
Fig. 28 is a schematic diagram illustrating an example of a case where the movement of the object is used as a predetermined condition.
Detailed Description
The present embodiment will be described in detail below with reference to the drawings. The present invention is not limited to the embodiments described below.
(First embodiment)
(Structure of imaging device)
Fig. 1 is a schematic block diagram of an imaging device according to the first embodiment. The imaging device 100 according to the first embodiment images an object within its imaging range. The imaging device 100 is an autofocus camera capable of setting the focal position automatically. The imaging device 100 may be a video camera that captures a moving image by imaging at every predetermined frame, or may be a still-image camera. The imaging device 100 can be used for any purpose; for example, it can be used as a surveillance camera installed at a predetermined position indoors or outdoors.
As shown in fig. 1, the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measuring unit 14, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
The optical element 10 is, for example, an element of an optical system, such as a lens. There may be a single optical element 10 or a plurality of optical elements 10.
The imaging element 12 is an element that converts light incident through the optical element 10 into an image signal, which is an electrical signal. The imaging element 12 is, for example, a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
The image processing circuit 13 generates image data for each frame from the image signal generated by the imaging element 12. The image data may be, for example, data containing luminance and color information for each pixel in one frame, or data on the gradation assigned to each pixel.
The object position measuring unit 14 is a sensor that measures the position of an object to be measured relative to the imaging device 100 (the relative position of the object). The object may be anything, living or inanimate; the same applies hereinafter. The object here is typically a movable body, but is not limited thereto and may also be a body that does not move.
In the present embodiment, the object position measuring unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object. The object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, for example a Time of Flight (TOF) sensor. When the object position measuring unit 14 is a TOF sensor, it includes, for example, a light-emitting element (for example, an LED (Light Emitting Diode)) that emits light and a light-receiving unit that receives light, and the distance to the object is measured from the time of flight of the light emitted from the light-emitting element to the object and returned to the light-receiving unit. The object position measuring unit 14 may measure, in addition to the distance from the imaging device 100 to the object, the direction in which the object lies with respect to the imaging device 100. In other words, the object position measuring unit 14 may measure, as the relative position of the object, the position (coordinates) of the object in a coordinate system whose origin is the imaging device 100.
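As an illustrative aside (not part of the original disclosure), the time-of-flight principle above reduces to halving the round-trip travel time of light. A minimal sketch, assuming a hypothetical round_trip_seconds value reported by the light-receiving unit:

```python
# Minimal sketch of the TOF distance calculation described above.
# round_trip_seconds is a hypothetical value from the light-receiving unit.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_seconds: float) -> float:
    """The emitted light travels to the object and back, so halve the path."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# Example: a round trip of about 66.7 ns corresponds to an object roughly 10 m away.
print(tof_distance_m(66.7e-9))  # ~10.0
```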
The input unit 16 is a mechanism for receiving an input (operation) from a user, and may be, for example, a button, a keyboard, a touch panel, or the like.
The display unit 18 is a display panel that displays images. In addition to the images captured by the imaging device 100, the display unit 18 can display a screen with which the user sets the target area AR described later.
The communication unit 20 is a communication module that communicates with external devices, and may be, for example, an antenna or a Wi-Fi (registered trademark) module. In the present embodiment the imaging device 100 communicates with external devices wirelessly, but wired communication may also be used; the communication method is arbitrary.
The storage unit 22 is a memory that stores captured image data and various information such as the operation contents of the control unit 24 and programs, and includes at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive). The program for the control unit 24 stored in the storage unit 22 may instead be stored in a recording medium readable by the imaging device 100.
The control unit 24 is an arithmetic device including an arithmetic circuit such as a CPU (Central Processing Unit). The control unit 24 includes a target area acquisition unit 30, an object information acquisition unit 32, a focal position control unit 34, an imaging control unit 36, and an image acquisition unit 38. The control unit 24 reads a program (software) from the storage unit 22 and executes it, thereby realizing the target area acquisition unit 30, the object information acquisition unit 32, the focal position control unit 34, the imaging control unit 36, and the image acquisition unit 38 and executing their processing. The control unit 24 may execute these processes on a single CPU, or may include a plurality of CPUs and execute the processes on those CPUs. At least part of the processing of the target area acquisition unit 30, the object information acquisition unit 32, the focal position control unit 34, the imaging control unit 36, and the image acquisition unit 38 may also be realized by hardware circuits.
(Target area acquisition unit)
The target area acquisition unit 30 acquires information on the target area AR set within the imaging region of the imaging device 100. The target area AR is an area set for automatically aligning the focal position. The information on the target area AR indicates the position of the target area AR, that is, it is position information of the target area AR. The target area AR is described below.
Fig. 2 and Fig. 3 are schematic diagrams for explaining an example of the target area. Fig. 2 shows the imaging device 100 and the target area AR viewed from above along the vertical direction, and Fig. 3 shows them viewed from the horizontal direction. Hereinafter, the direction Z is the vertical direction, the direction X is one of the horizontal directions orthogonal to the direction Z, and the direction Y is the direction (horizontal direction) orthogonal to the direction Z and the direction X. As shown in Fig. 2 and Fig. 3, the range over which the imaging device 100 can capture an image is defined as the imaging region AR0. The imaging region AR0 is the region (space) within the angle of view of the imaging element 12, in other words, the range of real space that appears in the image. The target area AR is an area (space) set within the imaging region AR0.
More specifically, the target area AR is the area within the imaging region AR0 that lies between the first position AX1 and the second position AX2. The first position AX1 is at a first distance L1 from the imaging device 100, and the second position AX2 is at a second distance L2, shorter than the first distance L1, from the imaging device 100. As shown in Fig. 2 and Fig. 3, in the present embodiment the first position AX1 can be regarded as a virtual plane made up of positions (coordinates) in the imaging region AR0 at the first distance L1 from the imaging device 100. Similarly, the second position AX2 is a virtual plane made up of positions (coordinates) in the imaging region AR0 at the second distance L2 from the imaging device 100. That is, the target area AR can be regarded as the space in the imaging region AR0 enclosed between the virtual plane at the second distance L2 from the imaging device 100 and the virtual plane at the first distance L1 from the imaging device 100. The first position AX1 is not limited to a virtual plane all of whose positions (coordinates) are at the first distance L1 from the imaging device 100; it may be a virtual plane at least some of whose positions (coordinates) are at the first distance L1 from the imaging device 100. Similarly, the second position AX2 may be a virtual plane at least some of whose positions (coordinates) are at the second distance L2 from the imaging device 100.
Fig. 4 and Fig. 5 are schematic diagrams showing another example of the target area. In the description of Fig. 2 and Fig. 3, the target area AR is delimited by the first position AX1 and the second position AX2 in the optical-axis direction of the imaging device 100 (the depth direction of the image), but the imaging region AR0 is not delimited in the radial direction relative to the optical axis (the direction in which the angle of view spreads). In other words, the end face of the target area AR in the direction in which the angle of view spreads coincides with the corresponding end face of the imaging region AR0. However, the target area AR is not limited to this, and may also be delimited within the imaging region AR0 in the direction in which the angle of view spreads. That is, for example, as shown in Fig. 4 and Fig. 5, the imaging region AR0 may be delimited by a third position AX3 in the direction in which the angle of view spreads. In this example, the third position AX3 is a virtual surface (here, a closed surface shaped like the side of a cylinder) made up of positions (coordinates) a predetermined distance outward in the radial direction from the optical axis LX of the imaging device 100. In this case, the target area AR is the area (space) enclosed by the first position AX1, the second position AX2, and the third position AX3. The third position AX3 is not limited to a virtual surface all of whose positions (coordinates) are at the third distance L3 from the optical axis LX; it may be a virtual surface at least some of whose positions (coordinates) are at the third distance L3 from the optical axis LX. For example, the third position AX3 may be a virtual surface that widens outward in the radial direction (horizontally and vertically) at a predetermined angle with increasing distance from the imaging device 100 along the optical-axis direction.
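To make the geometry above concrete, the following sketch tests whether a measured position lies in a target area AR bounded by the first distance L1, the second distance L2, and, as in Fig. 4 and Fig. 5, a third distance L3 from the optical axis LX. This is one possible reading under stated assumptions: coordinates are in a camera-origin frame with the x axis along the optical axis, the cylindrical form of AX3 is used, and none of the names come from the disclosure.

```python
import math

def in_target_area(pos, l1, l2, l3=None):
    """pos = (x, y, z) in a camera-origin frame, x along the optical axis LX.

    Returns True when the object lies between the first position AX1
    (distance L1) and the second position AX2 (distance L2), and, when
    L3 is given, within distance L3 of the optical axis (third position AX3).
    """
    x, y, z = pos
    dist = math.sqrt(x * x + y * y + z * z)   # distance from the imaging device
    if not (l2 <= dist <= l1):                # between AX2 and AX1
        return False
    if l3 is not None:
        radial = math.hypot(y, z)             # distance from the optical axis LX
        if radial > l3:
            return False
    return True

# Example: L1 = 10 m, L2 = 3 m, L3 = 2 m.
print(in_target_area((5.0, 0.5, -0.3), 10.0, 3.0, 2.0))  # True
```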
The size and shape of the target area AR are not limited to the above and may be set arbitrarily. In the above description the target area AR is set within the imaging region AR0, but this is not a limitation. For example, if the range that the object position measuring unit 14 can measure is taken as a ranging region (ranging space), the target area AR may be set within that ranging region. In that case, the imaging region AR0 in Fig. 2 to Fig. 5 may be read as the ranging region.
The target area acquisition unit 30 may acquire the information on the target area AR by any method. For example, the position of the target area AR may be set in advance. In that case, the target area acquisition unit 30 may read position information of the preset target area AR from the storage unit 22, or may acquire position information of the target area AR from another device via the communication unit 20. For example, when the position of the target area AR is not preset, the target area acquisition unit 30 may set the position of the target area AR automatically. Alternatively, the user may set the position of the target area AR. In that case, for example, the user may input information specifying the position of the target area AR (for example, values of the first distance L1, the second distance L2, and the third distance L3) to the input unit 16, and the target area acquisition unit 30 may set the target area AR based on the position information specified by the user. The target area AR may also be set by specifying coordinates. That is, for example, in the example of Fig. 2, the coordinates P1, P2, P3, and P4 of the vertices of the target area AR may be specified, and the area enclosed by the coordinates P1 to P4 may be set as the target area AR.
(Object information acquisition unit)
The object information acquisition unit 32 acquires position information of an object present in the imaging region AR0. The object information acquisition unit 32 controls the object position measuring unit 14 so that the object position measuring unit 14 measures the relative position of the object with respect to the imaging device 100, and acquires the measurement result as the position information of the object. By acquiring the position information of the object at predetermined intervals, the object information acquisition unit 32 acquires the position information sequentially. The object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, its 3D shape) based on the position information of the object; for example, it can acquire the 3D shape of the object by accumulating a plurality of pieces of position information such as TOF image information.
(Focal position control unit)
The focal position control unit 34 sets the focal position of the imaging device 100. The focal position control unit 34 controls the focal position by controlling, that is, by moving, the position of the optical element 10.
The focal position control unit 34 aligns the focal position with an object present in the target area AR. In other words, the focal position control unit 34 sets the focal position at the position of an object determined to be present in the target area AR. In the present embodiment, the focal position control unit 34 determines whether an object is present in the target area AR based on the position information of the object acquired by the object information acquisition unit 32. When the position of the object acquired by the object information acquisition unit 32 overlaps the target area AR, the focal position control unit 34 determines that the object is present in the target area AR and aligns the focal position with the position of the object acquired by the object information acquisition unit 32. That is, for example, when the distance from the imaging device 100 to the object is equal to or less than the first distance L1 and equal to or greater than the second distance L2, the focal position control unit 34 determines that the object is present in the target area AR and aligns the focal position with the object. On the other hand, the focal position control unit 34 does not align the focal position with an object that is not present in the target area AR. That is, for example, when the distance from the imaging device 100 to the object is longer than the first distance L1 or shorter than the second distance L2, the focal position control unit 34 determines that the object is not present in the target area AR and does not align the focal position with the object.
The focal position control unit 34 keeps the focal position aligned with the object while the object remains in the target area AR. That is, the focal position control unit 34 determines whether the object is still present in the target area AR based on the position information of the object acquired by the object information acquisition unit 32 at predetermined intervals, and keeps the focal position on the object for as long as the object remains in the target area AR. On the other hand, when the object with which the focal position is aligned moves outside the target area AR, that is, when the object is no longer present in the target area AR, the focal position control unit 34 moves the focal position away from the object and aligns it with a position other than the object.
The focal position control unit 34 may also be configured not to align the focal position with an object that is already present in the target area AR when the imaging device 100 starts operating (at the timing when it becomes ready to capture images). That is, the focal position control unit 34 may align the focal position only with objects that enter the target area AR after operation starts. In other words, for an object that is present in the target area AR at a certain timing but was not present there at an earlier timing, the focal position control unit 34 may align the focal position from the timing at which the object comes to be present in the target area AR. Put differently, an object that moves from outside the target area AR into the target area AR can be recognized as an object on which the focal position control unit 34 focuses; that is, the focal position control unit 34 may align the focal position with an object that has moved from outside the target area AR into the target area AR.
When no object is present in the target area AR, the focal position control unit 34 may align the focal position with a preset set position. The set position may be chosen arbitrarily, but is preferably within the target area AR, for example at the center of the target area AR.
An example of the setting of the focal position described above will be explained with reference to Fig. 2. Fig. 2 shows a case where the object A moves toward the imaging device 100 from the position A0 to the position A3 via the position A1 and the position A2. The distance from the position A0 to the imaging device 100 is longer than the first distance L1, so the position A0 is outside the target area AR. The distances from the position A1 and the position A2 to the imaging device 100 are equal to or less than the first distance L1 and equal to or greater than the second distance L2, so these positions are within the target area AR. The distance from the position A3 to the imaging device 100 is shorter than the second distance L2, so the position A3 is outside the target area AR. In this case, at the timing when the object A is at the position A0, the focal position control unit 34 does not align the focal position with the object A; it aligns the focal position with, for example, the set position. Then, at the timing when the object A is at the position A1, that is, at the timing when the object A enters the target area AR, the focal position control unit 34 aligns the focal position with the object A. The focal position control unit 34 keeps the focal position on the object A while the object A is at the position A2, and at the timing when the object A moves to the position A3, that is, at the timing when the object A leaves the target area AR, it moves the focal position away from the object A and returns the focal position to the set position. In short, the focal position control unit 34 aligns the focal position with the object A from the timing when the object A enters the target area AR, moves the focal position so as to follow the object A while the object A moves within the target area AR, and moves the focal position away from the object A at the timing when the object A moves outside the target area AR.
The focal position may also be set by the user. In this case, for example, it is possible to switch between an automatic mode in which the focal position is set automatically and a manual mode in which the focal position is set by the user. In the automatic mode, the focal position is set by the focal position control unit 34 as described above. In the manual mode, the user inputs an operation for setting the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user's operation.
(Imaging control unit)
The imaging control unit 36 controls the imaging performed by the imaging device 100. The imaging control unit 36 controls the imaging element 12 so that the imaging element 12 acquires an image signal. For example, the imaging control unit 36 may cause the imaging element 12 to acquire the image signal automatically, or in response to a user operation.
(Image acquisition unit)
The image acquisition unit 38 acquires the image data captured by the imaging element 12. For example, the image acquisition unit 38 controls the image processing circuit 13 so that the image processing circuit 13 generates image data from the image signal generated by the imaging element 12, and acquires that image data. The image acquisition unit 38 stores the image data in the storage unit 22.
(Flow of setting focal position)
Next, the flow of the processing for setting the focal position described above will be explained. Fig. 6 is a flowchart illustrating the flow of the processing for setting the focal position. As shown in Fig. 6, the control unit 24 acquires the information on the target area AR with the target area acquisition unit 30 (step S10) and acquires the position information of the object with the object information acquisition unit 32 (step S12). Steps S10 and S12 may be executed in any order. The control unit 24 uses the focal position control unit 34 to determine whether the object is located in the target area AR based on the position information of the object (step S14). If the object is not located in the target area AR (No in step S14), the flow returns to step S12 and the acquisition of the position information of the object continues. On the other hand, if the object is located in the target area AR (Yes in step S14), the focal position control unit 34 aligns the focal position with the object (step S16). Thereafter, the acquisition of the position information of the object continues, and it is determined whether the object has moved outside the target area AR (step S18). If the object has not moved outside the target area AR (No in step S18), that is, if the object is still present in the target area AR, the flow returns to step S16 and the focal position remains aligned with the object. If the object has moved outside the target area AR (Yes in step S18), the focal position control unit 34 moves the focal position away from the object (step S20). Thereafter, if the processing is not to be ended (No in step S22), the flow returns to step S12; if it is to be ended (Yes in step S22), the processing ends.
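Read as pseudocode, the flowchart of Fig. 6 is a simple polling loop. The sketch below mirrors steps S10 to S22; the camera, target_area, and focus helpers are assumed stand-ins, not the disclosed implementation.

```python
def run_focus_loop(camera, target_area, focus):
    """Polling loop mirroring Fig. 6 (steps S10-S22); helper objects are assumed."""
    while not camera.finished():                        # step S22
        pos = camera.get_object_position()              # step S12
        if pos is None or not target_area.contains(pos):
            continue                                    # No in step S14: back to S12
        while pos is not None and target_area.contains(pos):
            focus.align_to(pos)                         # step S16
            pos = camera.get_object_position()          # re-measure for step S18
        focus.release()                                 # step S20: move focus away
```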
(Effect)
As described above, the imaging device 100 according to the present embodiment includes: the imaging element 12; the object information acquisition unit 32, which acquires position information of an object present in the imaging region AR0 of the imaging element 12; and the focal position control unit 34, which controls the focal position of the imaging device 100. The focal position control unit 34 aligns the focal position with an object present in the target area AR, keeps the focal position on the object while it remains in the target area AR, and moves the focal position away from the object after the object moves outside the target area AR. The target area AR is the area between a first position AX1 at a first distance L1 from the imaging device 100 and a second position AX2 at a second distance L2, smaller than the first distance L1, from the imaging device 100.
In an autofocus imaging device, the focal position needs to be aligned appropriately. The imaging device 100 according to the present embodiment aligns the focal position with an object present in the target area AR, keeps the focal position on the object while the object remains in the target area AR, and moves the focal position away when the object leaves the target area AR. Therefore, for example, focus can be maintained on an object in the target area, that is, the area to be watched in surveillance and similar applications. In addition, when an object leaves the area to be watched, for example by moving farther away than the first distance L1 or closer than the second distance L2, the focal position is moved away from the object, which prevents the focus from straying from the area to be watched. According to the present embodiment, therefore, the focal position can be aligned appropriately.
The focal position control unit 34 controls the focal position by moving the position of the optical element 10 provided in the imaging device 100. This also allows the focal position to be aligned appropriately.
(Second embodiment)
Next, a second embodiment will be described. The second embodiment differs from the first embodiment in that the focal position is aligned with an object that is present in the target area AR and satisfies a predetermined condition. In the second embodiment, descriptions of parts common to the first embodiment are omitted.
In the second embodiment, the focal position control unit 34 aligns the focal position with an object that is present in the target area AR and satisfies a predetermined condition. It does not align the focal position with an object that fails to satisfy either being present in the target area AR or the predetermined condition. The focal position control unit 34 keeps the focal position on the object for as long as the object satisfies the predetermined condition and remains in the target area AR. On the other hand, once the object is no longer present in the target area AR or no longer satisfies the predetermined condition, the focal position control unit 34 moves the focal position away from the object. That is, for example, the focal position control unit 34 moves the focal position away from the object when the object still satisfies the predetermined condition but has moved outside the target area AR, or when the object is still present in the target area AR but no longer satisfies the predetermined condition.
The focal position control unit 34 may determine whether the predetermined condition is satisfied by any method, for example based on at least one of the position information of the object and an image of the object. The position information of the object may be the measurement result of the object position measuring unit 14, and the image of the object may be image data of the object captured by the imaging element 12.
The predetermined condition here may be any condition imposed in addition to the object being present in the target area AR. For example, the predetermined condition may be at least one of the object performing a predetermined movement, the object having a predetermined shape, and the object facing a predetermined direction. Any two of these, or all of them, may be used as the predetermined condition. When a plurality of predetermined conditions are set, the focal position control unit 34 determines that the predetermined condition is satisfied only if all of them are satisfied.
The case where the movement of the object is used as the predetermined condition will now be described. In this case, the focal position control unit 34 determines whether the object is performing a predetermined movement based on position information of the object acquired continuously in time series. The focal position control unit 34 aligns the focal position with an object that is present in the target area AR and is performing the predetermined movement, and does not align the focal position with an object that fails either condition. It keeps the focal position on the object for as long as the object remains in the target area AR and continues the predetermined movement, and moves the focal position away from the object once either condition fails. The movement of the object here refers to the manner in which the object moves, for example its moving direction and moving speed. For example, if the predetermined movement is moving downward in the vertical direction at a speed of 10 m/h or more, the focal position control unit 34 aligns the focal position with an object in the target area AR that is moving downward in the vertical direction at a speed of 10 m/h or more. The movement of the object is not limited to a combination of moving direction and moving speed and may refer to any manner of movement; for example, it may refer to at least one of the moving direction and the moving speed of the object.
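As one hedged illustration of how the movement condition could be evaluated from the time-series position information, the velocity can be estimated by finite differences between successive measurements. The sketch below checks the example condition in the text (moving vertically downward at 10 m/h or more); the tuple format and helper name are assumptions:

```python
def satisfies_movement_condition(prev_pos, cur_pos, dt_s,
                                 min_speed_m_per_h=10.0):
    """Check the example condition in the text: the object moves vertically
    downward (negative Z) at a speed of 10 m/h or more. Positions are
    (x, y, z) tuples in metres; dt_s is the sampling interval in seconds."""
    dz = cur_pos[2] - prev_pos[2]
    if dz >= 0:                       # not moving downward in the vertical direction
        return False
    speed_m_per_h = abs(dz) / dt_s * 3600.0
    return speed_m_per_h >= min_speed_m_per_h

# Example: 1 mm of downward movement in 0.1 s is 36 m/h, above the threshold.
print(satisfies_movement_condition((0, 0, 2.000), (0, 0, 1.999), 0.1))  # True
```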
Fig. 7 is a schematic diagram illustrating an example of a case where the movement of the object is used as the predetermined condition. In the example of Fig. 7, the object moving downward in the vertical direction (the direction opposite to the Z direction), that is, the moving direction of the object, is used as the predetermined condition. Fig. 7 shows an example in which the object A moves downward in the vertical direction from the position A0a through the position A1a and the position A2a to the position A3a, and stops at the position A3a. The position A0a is outside the target area AR, and the positions A1a, A2a, and A3a are inside the target area AR. In this case, at the timing when the object A is at the position A0a, the object A is outside the target area AR, so the focal position control unit 34 does not align the focal position with the object A; it aligns the focal position with, for example, the set position. Then, at the timing when the object A is at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction, the focal position control unit 34 aligns the focal position with the object A. The focal position control unit 34 keeps the focal position on the object A at the timing when the object A is at the position A2a, and at the timing when the object A has moved to the position A3a and stopped, it moves the focal position away from the object A and returns the focal position to the set position.
Next, the case where the shape of the object is used as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether the object has a predetermined shape based on the image data of the object, and aligns the focal position with an object that is present in the target area AR and has the predetermined shape. It does not align the focal position with an object that fails either condition, keeps the focal position on the object for as long as the object has the predetermined shape and remains in the target area AR, and moves the focal position away from the object once either condition fails. The shape of the object here may be, for example, at least one of the size of the object and the form of the object. For example, if the predetermined shape is a size equal to or larger than a predetermined size, the focal position control unit 34 aligns the focal position with an object in the target area AR that is at least the predetermined size. The shape information of the object may be obtained using the 3D shape information acquired by the object information acquisition unit 32.
The case where the orientation of the object is used as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether the object faces a predetermined direction based on the image data of the object, and aligns the focal position with an object that is present in the target area AR and faces the predetermined direction. It does not align the focal position with an object that fails either condition, keeps the focal position on the object for as long as the object faces the predetermined direction and remains in the target area AR, and moves the focal position away from the object once either condition fails. The orientation information of the object may be obtained using the 3D shape information acquired by the object information acquisition unit 32.
The predetermined condition may be set by any method; for example, it may be set in advance. In that case, the focal position control unit 34 may read information indicating the preset predetermined condition (for example, a moving direction and a moving speed) from the storage unit 22, or may acquire the predetermined condition from another device via the communication unit 20. For example, if the predetermined condition is not set in advance, the focal position control unit 34 may set it automatically. Alternatively, the user may set the predetermined condition; for example, the user may input information specifying the predetermined condition (for example, a moving direction and a moving speed) to the input unit 16, and the focal position control unit 34 may set the predetermined condition based on the information specified by the user.
Next, the flow of setting the focal position in the second embodiment will be described. Fig. 8 is a flowchart illustrating the flow of the processing for setting the focal position in the second embodiment. As shown in Fig. 8, the control unit 24 acquires the information on the target area AR with the target area acquisition unit 30 (step S30), acquires the predetermined condition with the focal position control unit 34 (step S32), and acquires the position information of the object and the information related to the condition with the object information acquisition unit 32 (step S34). The information related to the condition is the information used to determine whether the object satisfies the predetermined condition, for example the position information of the object and captured image data of the object. Steps S30, S32, and S34 may be executed in any order. The control unit 24 uses the focal position control unit 34 to determine whether the object satisfies the predetermined condition and is located in the target area AR (step S36). If the object fails either condition (No in step S36), the flow returns to step S34 and the acquisition of the position information of the object continues. On the other hand, if the object satisfies the predetermined condition and is located in the target area AR (Yes in step S36), the focal position control unit 34 aligns the focal position with the object (step S38). Thereafter, the acquisition of the position information of the object and the information related to the condition continues, and it is determined whether the object no longer satisfies the predetermined condition or has moved outside the target area AR (step S40). If the object still satisfies the predetermined condition and is located in the target area AR (No in step S40), the flow returns to step S38 and the focal position remains aligned with the object. If the object fails either condition (Yes in step S40), the focal position control unit 34 moves the focal position away from the object (step S42). Thereafter, if the processing is not to be ended (No in step S44), the flow returns to step S34; if it is to be ended (Yes in step S44), the processing ends.
As described above, in the second embodiment the focal position control unit 34 may align the focal position with an object that is present in the target area AR and is performing a predetermined movement. It keeps the focal position on the object while the object performs the predetermined movement, and moves the focal position away once the object stops performing it. By requiring the predetermined movement in addition to presence in the target area AR as the condition for aligning the focal position, an object performing a specific action can be tracked and the focal position aligned appropriately; for example, falling objects within the target area AR can be detected.
In the second embodiment, the focal position control unit 34 may align the focal position with an object of a predetermined shape present in the target area AR. By requiring the predetermined shape in addition to presence in the target area AR as the condition for aligning the focal position, an object of a specific shape can be tracked and the focal position aligned appropriately.
In the second embodiment, the focal position control unit 34 may align the focal position with an object that is present in the target area AR and faces a predetermined direction. By requiring the predetermined orientation in addition to presence in the target area AR as the condition for aligning the focal position, an object with a specific orientation can be tracked and the focal position aligned appropriately.
(Third embodiment)
Next, a third embodiment will be described. The third embodiment differs from the first embodiment in that it defines how the focal position is aligned when a plurality of objects are present in the target area AR. In the third embodiment, descriptions of parts common to the first embodiment are omitted. The third embodiment can also be applied to the second embodiment.
The object information acquisition unit 32 causes the object position measuring unit 14 to measure the relative position of each object, and acquires the position information of every object present in the imaging region AR0. The focal position control unit 34 determines whether each object is located within the target area AR based on its position information. When it determines that a plurality of objects are located in the target area AR at the same time, the focal position control unit 34 switches the focal position so that each object present in the target area AR takes its turn in focus. That is, the focal position control unit 34 keeps the focal position on one object present in the target area AR for a predetermined time, then switches to another object present in the target area AR and keeps the focal position on that object for a predetermined time. The focal position control unit 34 repeats this until the focal position has been aligned with every object present in the target area AR, then realigns the focal position with the object that was focused first, and continues in the same way. During this processing the focal position control unit 34 also keeps determining whether each object is located in the target area AR, so that, for example, it does not align the focal position with an object that is no longer in the target area AR.
In the present embodiment, the focal position control unit 34 sets the order in which the focal position is switched, that is, which object the focal position is aligned with next, based on the position information of each object present in the target area AR. For example, the focal position control unit 34 may set the order of switching so as to minimize the time required to switch the focal position. The amount by which the optical element 10 must move to switch the focal position corresponds to the distance between the switching-source position and the switching-destination position (the object distance), so the time required to switch the focal position is determined by the object distance, and the shorter the object distance, the shorter the switching time. Therefore, the focal position control unit 34 sets the order of switching the focal position so as to minimize the object distance, in other words, to minimize the time required to switch the focal position, based on the position information of each object present in the target area AR.
Fig. 9 is a schematic diagram illustrating an example in which a plurality of objects are present in the target area. An example of setting the order of switching the focal position will be described with reference to Fig. 9. Fig. 9 illustrates a case where the objects Aa, Ab, Ac, and Ad remain located within the target area AR. Suppose that, among the objects Aa, Ab, Ac, and Ad, the object with the shortest distance from the current focal position (for example, the object distance from the set position) is the object Aa. In that case, the focal position control unit 34 switches the focal position to the object Aa, which has the shortest object distance from the current focal position, thereby minimizing the switching time. Next, among the objects Ab, Ac, and Ad, which have not yet been aligned with the focal position, the object with the smallest object distance from the object Aa is the object Ab. Therefore, the focal position control unit 34 selects the object Ab as the next object and switches the focal position from the object Aa to the object Ab, again minimizing the switching time. Next, among the objects Ac and Ad, which have not yet been aligned with the focal position, the object with the smallest object distance from the object Ab is the object Ad, so the focal position control unit 34 selects the object Ad as the next object and switches the focal position from the object Ab to the object Ad. Then, since the only object not yet aligned with the focal position is the object Ac, the focal position control unit 34 selects the object Ac as the next object and switches the focal position from the object Ad to the object Ac. After that, since no object remains that has not yet been in focus, the focal position is switched back to the object Aa, which was focused first, and the same processing continues.
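The switching rule described above is, in effect, a greedy nearest-neighbor ordering over the objects not yet focused. The sketch below reproduces the Aa → Ab → Ad → Ac sequence of Fig. 9 using illustrative one-dimensional stand-in distances; the function and distance metric are assumptions, not the disclosed implementation:

```python
def focus_order(current_focus, objects, distance):
    """Greedy order: always switch to the unvisited object whose object
    distance from the current focal position is smallest (minimising the
    time needed to switch the focal position)."""
    remaining = list(objects)
    order = []
    pos = current_focus
    while remaining:
        nearest = min(remaining, key=lambda o: distance(pos, o))
        order.append(nearest)
        remaining.remove(nearest)
        pos = nearest
    return order

# 1-D stand-in distances chosen so the result matches Fig. 9: Aa, Ab, Ad, Ac.
coords = {"Aa": 1.0, "Ab": 2.0, "Ad": 3.0, "Ac": 5.0}
dist = lambda a, b: abs((coords[a] if isinstance(a, str) else a) - coords[b])
print(focus_order(0.0, ["Aa", "Ab", "Ac", "Ad"], dist))  # ['Aa', 'Ab', 'Ad', 'Ac']
```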
In this way, the focal position control unit 34 switches the focal position to the object with the smallest object distance among the objects that have not yet been aligned with the focal position. If no such object remains, in other words, if the focal position has been aligned with all the objects, it aligns the focal position with the object, among those currently located in the target area AR, whose last alignment lies furthest in the past. However, the method of setting the order of switching the focal position is not limited to the above and may be arbitrary. For example, the focal position control unit 34 may switch the focal position to the object with the largest object distance, or may switch the focal position in order of distance from the imaging device 100.
The predetermined time, that is, the time for which the focal position is kept on one object, may be set arbitrarily. For example, the predetermined time may be the same length for all objects, or it may differ from object to object. In the latter case, for example, the focal position control unit 34 may assign a degree of importance to each object based on at least one of the position information of the object and the image data of the object, and lengthen the predetermined time as the importance increases. The importance may be assigned by any method; for example, the focal position control unit 34 may assign higher importance to objects closer to the imaging device 100, or to objects approaching the imaging device 100 faster.
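A minimal sketch of the importance-weighted dwell time mentioned above, scaling a base time by a score derived from distance and approach speed; all names, weights, and limits here are assumptions chosen for illustration:

```python
def dwell_time_s(distance_m, approach_speed_m_s,
                 base_s=1.0, max_s=5.0):
    """Hypothetical dwell time: nearer and faster-approaching objects get
    higher importance and therefore a longer time in focus."""
    importance = 1.0 / max(distance_m, 0.1)          # nearer -> more important
    importance += max(approach_speed_m_s, 0.0)       # approaching -> more important
    return min(base_s * (1.0 + importance), max_s)

print(dwell_time_s(2.0, 1.5))   # 3.0 s: near, approaching quickly
print(dwell_time_s(20.0, 0.0))  # 1.05 s: far, not approaching
```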
While a plurality of objects are present in the target area AR, the focal position control unit 34 may also stop switching the focal position to other objects and keep the focal position on one object. For example, when the user inputs to the input unit 16 a command to stop switching the focal position (a command to fix the focal position on the object) at the timing when the focal position is aligned with a certain object, the focal position control unit 34 stops switching the focal position to other objects and keeps the focal position aligned with that object.
Next, the flow of the focal position setting processing described above will be explained. Fig. 10 is a flowchart illustrating the flow of the processing for setting the focal position according to the third embodiment. As shown in Fig. 10, the control unit 24 acquires the information on the target area AR with the target area acquisition unit 30 (step S50) and acquires the position information of the objects with the object information acquisition unit 32 (step S52). Steps S50 and S52 may be executed in any order. The control unit 24 uses the focal position control unit 34 to determine whether an object is located in the target area AR based on the position information of the objects (step S54). If no object is located in the target area AR (No in step S54), the flow returns to step S52 and the acquisition of the position information continues. On the other hand, if an object is located in the target area AR (Yes in step S54), the focal position control unit 34 determines whether a plurality of objects are present in the target area AR (step S56). If a plurality of objects are present in the target area AR (Yes in step S56), the focal position control unit 34 aligns the focal position with the object with the smallest object distance (step S58). Thereafter, the acquisition of the position information continues, and it is determined whether any object remains in the target area AR (step S60). If an object remains in the target area AR (No in step S60), the flow returns to step S56 and the setting of the focal position continues. If no object remains in the target area AR (Yes in step S60), the focal position control unit 34 moves the focal position away from the object (step S62); if the processing is not to be ended (No in step S64), the flow returns to step S52, and if it is to be ended (Yes in step S64), the processing ends.
On the other hand, when there are no plurality of objects in the target area AR (no in step S56), that is, when there is one object in the target area AR, the focus position is aligned with the object (step S66). Thereafter, the acquisition of the position information of the object is continued, and it is determined whether or not there is no object in the target area AR (step S60). If an object exists in the target area AR (no in step S60), the flow returns to step S56, and the setting of the focal position is continued. When there is no object in the target area AR (yes in step S60), the focal position control unit 34 separates the focal position from the object (step S62), returns to step S52 when the process is not completed (no in step S64), and ends the present process when the process is completed (yes in step S64).
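As an illustrative sketch of the flow of Fig. 10 (steps S50 to S66), the loop below mirrors the decisions just described; every callable passed in is an assumed interface, not the device's actual API:

```python
import time

def run_focus_loop(get_target_area, get_objects_in, focus_on, defocus,
                   finished, dwell_s=1.0):
    """Illustrative loop mirroring steps S50-S66 of Fig. 10; every
    callable is an assumed interface, not the device's actual API."""
    area = get_target_area()                     # step S50
    while not finished():                        # step S64
        objects = get_objects_in(area)           # steps S52, S54
        if not objects:
            continue                             # no object in the area yet
        while objects:                           # object(s) inside the area
            if len(objects) > 1:                 # step S56: plural objects
                target = min(objects, key=lambda o: o.distance_m)  # step S58
            else:                                # step S66: a single object
                target = objects[0]
            focus_on(target)                     # align the focal position
            time.sleep(dwell_s)                  # hold for the predetermined time
            objects = get_objects_in(area)       # step S60: still occupied?
        defocus()                                # step S62: separate the focus
```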
As described above, in the third embodiment, when a plurality of objects exist in the target area AR, the focal position control unit 34 changes the focal positions so that the focal positions are sequentially directed to the respective objects. Therefore, even in the case where a plurality of objects exist in the target area AR, the focus position can be properly aligned.
The focal position control unit 34 sets the order of changing the focal positions according to the positions of the respective objects. Therefore, even in the case where a plurality of objects exist in the target area AR, the focus position can be properly aligned.
The focal position control unit 34 sets the order of changing the focal positions so as to minimize the time required for changing the focal positions. Therefore, even in the case where a plurality of objects exist in the target area AR, the focus position can be properly aligned.
(Fourth embodiment)
Next, a fourth embodiment will be described. The fourth embodiment differs from the first embodiment in that the focal position is not aligned with an object while the object exists in the target area AR, but is aligned with the object when the object moves outside the target area AR. In the fourth embodiment, descriptions of the parts common to the first embodiment are omitted. The fourth embodiment can also be applied to the second and third embodiments.
The method of setting the focal position in the fourth embodiment can be described by interchanging "inside the target area AR" and "outside the target area AR" in the description of the first embodiment. A specific description follows.
In the fourth embodiment, the focus position control unit 34 aligns the focal position with an object existing within the imaging area AR0 and outside the target area AR, and does not align the focal position with an object existing within the target area AR. The focus position control unit 34 keeps the focal position aligned with the object while the object remains within the imaging area AR0 and outside the target area AR. On the other hand, when the object is no longer within the imaging area AR0 and outside the target area AR, that is, when the object has moved out of the imaging area AR0 or into the target area AR, the focus position control unit 34 separates the focal position from the object and aligns the focal position with a position other than the object.
Further, it is preferable that the focus position control unit 34 aligns the focal position with an object that has moved from within the target area AR to a position within the imaging area AR0 and outside the target area AR. In this way, the focal position can be aligned with, for example, an object coming out of a specific region.
Fig. 11 is a schematic diagram for explaining a method of setting a focus position in the fourth embodiment. An example of setting the focal position described above will be described with reference to fig. 11. Fig. 11 shows an example of a case where the object a moves from the position A0b to the position A2b through the position A1 b. The position A0b is located within the target area AR, the position A1b is located within the imaging area AR0 and outside the target area AR, and the position A2b is located outside the imaging area AR 0. In this case, the focus position control unit 34 does not align the focus position with the object a, for example, aligns the focus position with the set position at the timing when the object a exists at the position A0 b. Then, the focus position control unit 34 aligns the focus position with the object a at the timing when the object a exists at the position A1b, that is, at the timing when the object a moves from within the target area AR to within the imaging area AR0 and outside the target area AR. The focus position control unit 34 separates the focus position from the object a and returns the focus position to the set position at the timing when the object a moves to the position A2b, that is, at the timing when the object a comes out of the imaging area AR 0.
Next, the flow of the processing for setting the focal position described above will be described. Fig. 12 is a flowchart illustrating the flow of the processing for setting the focal position according to the fourth embodiment. As shown in fig. 12, the control unit 24 acquires information of the target area AR by the target area acquisition unit 30 (step S70), and acquires position information of the object by the object information acquisition unit 32 (step S72). The order of execution of steps S70 and S72 may be arbitrary. The control unit 24 determines, by the focus position control unit 34, whether or not an object existing in the target area AR has moved to a position within the imaging area AR0 and outside the target area AR, based on the position information of the object (step S74). If the object has not moved out of the target area AR (no in step S74), that is, if the object continues to exist in the target area AR, the process returns to step S72, and the acquisition of the position information of the object is continued. On the other hand, when the object has moved to a position within the imaging area AR0 and outside the target area AR (yes in step S74), the focal position control unit 34 aligns the focal position with the object (step S76). Thereafter, the acquisition of the position information of the object is continued, and if the object has not left the region within the imaging area AR0 and outside the target area AR (no in step S78), that is, if the object continues to be located within the imaging area AR0 and outside the target area AR, the process returns to step S76 to keep the focal position aligned with the object. On the other hand, when the object has left that region (yes in step S78), that is, when the object has moved out of the imaging area AR0 or into the target area AR, the focal position control unit 34 separates the focal position from the object (step S80). If the process is not completed (no in step S82), the flow returns to step S72; if the process is completed (yes in step S82), the present process ends.
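A sketch of the flow of Fig. 12 (steps S70 to S82) follows; the contains() interface and the callables are assumptions introduced only for illustration:

```python
def in_focus_zone(obj_pos, imaging_area, target_area) -> bool:
    """Fourth embodiment: focus applies inside the imaging area AR0 but
    outside the target area AR (contains() is an assumed interface)."""
    return imaging_area.contains(obj_pos) and not target_area.contains(obj_pos)

def run_fourth_embodiment_loop(imaging_area, target_area, get_object,
                               focus_on, defocus, finished):
    """Illustrative loop mirroring steps S70-S82 of Fig. 12."""
    while not finished():                                      # step S82
        obj = get_object()                                     # step S72
        # Step S74: act once the object has left the target area into
        # the surrounding part of the imaging area.
        if obj is None or not in_focus_zone(obj, imaging_area, target_area):
            continue
        while obj is not None and in_focus_zone(obj, imaging_area,
                                                target_area):  # step S78
            focus_on(obj)                                      # step S76
            obj = get_object()
        defocus()                                              # step S80
```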
As described above, the imaging device 100 according to the fourth embodiment includes: an imaging element 12; an object information acquisition unit 32 that acquires position information of an object existing in the imaging region AR0 of the imaging element 12; and a focus position control unit 34 that controls the focus position of the imaging device 100. The focal position control unit 34 aligns the focal position with an object existing outside the target area AR and within the imaging area AR0, keeps the focal position aligned with the object while the object remains outside the target area AR and within the imaging area AR0, and separates the focal position from the object when the object moves into the target area AR. The target area AR is an area between a first position AX1 at a first distance L1 from the imaging device 100 and a second position AX2 at a second distance L2, smaller than the first distance L1, from the imaging device 100.
Here, in an imaging device of the auto-focus system, the focal position is required to be aligned properly. In contrast, the imaging device 100 according to the present embodiment aligns the focal position with an object existing outside the target area AR, and keeps the focal position aligned with the object as long as the object continues to exist outside the target area AR. Therefore, for example, focusing can be continued on an object that has come out of the target area, that is, on an object emerging from a specific region. Therefore, according to the present embodiment, the focal position can be aligned properly.
(Fifth embodiment)
(Structure of photographing device)
Fig. 13 is a schematic block diagram of an imaging device according to a fifth embodiment. The imaging device 100 according to the fifth embodiment is an imaging device that images an object within an imaging range. The imaging device 100 is an auto-focus camera capable of automatically setting the focal position. The imaging device 100 may be a video camera that captures a moving image by imaging every predetermined frame, or may be a still-image camera. The imaging device 100 can be used for any purpose; for example, it can be used as a monitoring camera installed at a predetermined position in a facility or outdoors.
As shown in fig. 13, the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measuring unit 14, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
The optical element 10 is, for example, an element of an optical system such as a lens. The optical element 10 may be one or a plurality of optical elements.
The imaging element 12 is an element that converts light incident through the optical element 10 into an image signal that is an electrical signal. The imaging element 12 is, for example, a CCD (Charge Coupled Device: charge coupled device) sensor or a CMOS (Complementary Metal Oxide Semiconductor: complementary metal oxide semiconductor) sensor or the like.
The image processing circuit 13 generates image data for each frame from the image signal generated by the imaging element 12. The image data may be, for example, data including information on the brightness and color of each pixel in one frame, or data on the gradation assigned to each pixel.
The object position measuring unit 14 is a sensor that measures the position of an object to be measured (the relative position of the object) with respect to the imaging device 100. The object may be any object, and may be living or inanimate, and the same applies hereinafter. The object herein may be a movable object, but is not limited thereto, and may be an object that does not move.
The object position measuring unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object. The object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, and may be, for example, a Time of Flight (TOF) sensor. In the case where the object position measuring unit 14 is a TOF sensor, for example, it includes a light emitting element (for example, an LED (Light Emitting Diode)) that emits light and a light receiving unit that receives light, and measures the distance to the object from the time of flight of the light emitted from the light emitting element to the object and returned to the light receiving unit. The object position measuring unit 14 may measure, as the relative position of the object, the direction in which the object exists with respect to the imaging device 100, in addition to the distance from the imaging device 100 to the object. In other words, the object position measuring unit 14 may measure, as the relative position of the object, the position (coordinates) of the object in a coordinate system having the imaging device 100 as the origin.
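The time-of-flight principle itself reduces to a one-line computation; the sketch below shows the standard relation (the function name is an assumption, not the sensor's API):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance from a time-of-flight measurement: the emitted light
    travels to the object and back, so the one-way distance is half
    the round trip."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a round trip of 20 ns corresponds to roughly 3 m.
print(tof_distance_m(20e-9))  # ~2.998
```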
The input unit 16 is a mechanism for receiving an input (operation) from a user, and may be, for example, a button, a keyboard, a touch panel, or the like.
The display unit 18 is a display panel that displays an image. The display unit 18 can display an image for setting a target area AR described later by a user, in addition to the image captured by the imaging device 100.
The communication unit 20 is a communication module that communicates with an external device, and may be, for example, an antenna, a Wi-Fi (registered trademark) module, or the like. The imaging device 100 communicates with an external device by wireless communication, but may be wired communication, and the communication method may be arbitrary.
The storage unit 22 is a memory that stores various information such as captured image data, the operation contents of the control unit 24, and programs, and includes at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive). The program for the control unit 24 stored in the storage unit 22 may be stored in a recording medium readable by the imaging device 100.
The control unit 24 is an arithmetic device including an arithmetic circuit such as a CPU (Central Processing Unit). The control section 24 includes a target area acquisition section 30, an object information acquisition section 32, a focus position control section 34, a shooting control section 36, and an image acquisition section 38. The control unit 24 reads out a program (software) from the storage unit 22 and executes it, thereby realizing the target region acquiring unit 30, the object information acquiring unit 32, the focal position control unit 34, the imaging control unit 36, and the image acquiring unit 38 and executing their processes. The control unit 24 may execute these processes with a single CPU, or may include a plurality of CPUs and execute the processes with them. At least part of the processing of the target region acquiring unit 30, the object information acquiring unit 32, the focal position control unit 34, the imaging control unit 36, and the image acquiring unit 38 may be realized by a hardware circuit.
(Target region acquiring section)
The target area acquisition unit 30 acquires information of a plurality of target areas AR set in the imaging area of the imaging device 100. A target area AR is an area set for automatically setting the focal position. The information of a target area AR is information indicating the position of the target area AR, that is, the position information of the target area AR. The respective target areas AR are preferably set so as to be located at different positions and not to overlap one another.
Fig. 14 and 15 are schematic diagrams for explaining an example of the target region. Fig. 14 shows the imaging device 100 and the target area AR viewed from above along the vertical direction, and fig. 15 shows the imaging device 100 and the target area AR viewed from the horizontal direction. Hereinafter, the direction Z is the vertical direction, the direction X is one of the horizontal directions orthogonal to the direction Z, and the direction Y is the direction (horizontal direction) orthogonal to the direction Z and the direction X. As shown in fig. 14 and 15, the range in which an image can be captured by the imaging device 100 is defined as the imaging area AR0. The imaging region AR0 is the region (space) within the angle of view of the imaging element 12, in other words, the range of real space that appears in the image. The target area AR is an area (space) set within the range of the imaging area AR0.
More specifically, each of the object regions AR is a region located within the imaging region AR0 and between the first position AX1 and the second position AX 2. The first position AX1 is a position at a first distance L1 from the imaging device 100, and the second position AX2 is a position at a second distance L2 shorter than the first distance L1 from the imaging device 100. As shown in fig. 14 and 15, in the present embodiment, the first position AX1 is a virtual plane including positions (coordinates) having a first distance L1 from the imaging device 100 in the imaging area AR 0. Similarly, the second position AX2 is a virtual plane including each position (coordinate) having a second distance L2 from the imaging device 100 in the imaging area AR 0. That is, each target area AR can be said to be a space within a range surrounded by a virtual plane having a distance of the second distance L2 from the imaging device 100 and a virtual plane having a distance of the first distance L1 from the imaging device 100 in the imaging area AR 0. The first position AX1 is not limited to a virtual plane having the first distance L1 from the imaging device 100 in all the positions (coordinates) included in the first position AX1, and may be a virtual plane having the first distance L1 from the imaging device 100 in at least a part of the positions (coordinates) included in the first position AX 1. Similarly, the second position AX2 may be a virtual plane in which a position (coordinate) of at least a part included in the second position AX2 is a second distance L2 from the imaging device 100.
In fig. 14 and 15, a case where the first target area AR1 and the second target area AR2 are set as the target areas AR is exemplified. In the example of fig. 14 and 15, the first object area AR1 and the second object area AR2 are located between the first position AX1 and the second position AX2 within the imaging area AR 0. In the example of fig. 14 and 15, the region within the photographing region AR0 and between the first position AX1 and the second position AX2 is divided into the first object region AR1 and the second object region AR2, in other words, it can be said here that the entire region between the first position AX1 and the second position AX2 is divided into a plurality of object regions. In the example of fig. 14 and 15, the second target area AR2 is located at a position surrounded by the first target area AR 1.
Fig. 16 and 17 are schematic views showing another example of the target area. In the description of fig. 14 and 15, the first target area AR1 is formed by dividing the imaging area AR0 by the first position AX1 and the second position AX2 in the optical axis direction of the imaging device 100 (the depth direction of the image), but the imaging area AR0 is not divided in the radial direction with respect to the optical axis (the direction in which the angle of view expands). In other words, the end face of the first target area AR1 in the expansion direction of the angle of view coincides with the end face of the imaging area AR0 in the expansion direction of the angle of view. However, the first target area AR1 is not limited to this, and the imaging area AR0 may also be divided in the expansion direction of the angle of view. That is, for example, as shown in fig. 16 and 17, the first target area AR1 may be demarcated within the imaging area AR0 by a third position AX3 in the expansion direction of the angle of view. In this example, the third position AX3 is a virtual plane (here, a closed curved surface shaped like the side surface of a cylinder) including positions (coordinates) separated by a predetermined distance radially outward from the optical axis LX of the imaging device 100. In this case, the first target area AR1 is an area (space) surrounded by the first position AX1, the second position AX2, and the third position AX3. The third position AX3 is not limited to a virtual plane in which all the positions (coordinates) included in the third position AX3 are at a third distance L3 from the optical axis LX, and may be a virtual plane in which at least a part of the positions (coordinates) included in the third position AX3 are at the third distance L3 from the optical axis LX. For example, the third position AX3 may be a virtual plane that widens radially outward (in the horizontal direction and the elevation direction) at a predetermined angle with increasing distance from the imaging device 100 along the optical axis direction.
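A membership test for such a region can be sketched as follows; the coordinate convention (camera-centred frame, optical axis along +y) and the function signature are assumptions for illustration:

```python
import math

def in_target_area(obj_xyz, first_distance_m, second_distance_m,
                   third_distance_m=None):
    """Membership test for a target area bounded by the first position AX1
    (far limit), the second position AX2 (near limit), and optionally the
    third position AX3 (radial bound around the optical axis). A
    camera-centred frame with the optical axis along +y is assumed."""
    x, y, z = obj_xyz
    dist = math.sqrt(x * x + y * y + z * z)    # distance from the device
    if not (second_distance_m <= dist <= first_distance_m):
        return False                           # outside the L2..L1 band
    if third_distance_m is not None:
        radial = math.hypot(x, z)              # distance from the optical axis
        if radial > third_distance_m:
            return False                       # outside the AX3 bound
    return True
```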
The position of each target area AR is not limited to the above description, and each target area AR may be arranged at any position between the first position AX1 and the second position AX2 within the imaging area AR0. For example, the second target area AR2 need not be arranged so as to be surrounded by the first target area AR1. The size and shape of each target area AR are not limited to the above description and may be arbitrary. The number of target areas AR is not limited to two, and any number, such as three or more, may be set. In the above description, the target area AR is an area set in the imaging area AR0, but this is not limiting. For example, if the range that can be measured by the object position measuring unit 14 is defined as a ranging area (ranging space), the target area AR may be an area set in the ranging area. In this case, the imaging area AR0 in fig. 14 to 17 may be treated as the ranging area.
The target area acquisition unit 30 may acquire information of the target area AR by any method. For example, the positions of the respective target areas AR may be set in advance. In this case, the target area acquisition unit 30 may read out the position information of each target area AR set in advance from the storage unit 22, or may acquire the position information of each target area AR from another device via the communication unit 20. For example, when the position of the target area AR is not preset, the target area acquisition unit 30 may automatically set the position of each target area AR. In addition, for example, the user may set the position of each target area AR. In this case, for example, the user may input information (for example, values of the first distance L1, the second distance L2, and the third distance L3) specifying the positions of the respective target areas AR to the input unit 16, and the target area acquisition unit 30 may set the respective target areas AR based on the position information of the target areas AR specified by the user. For example, the target area AR may be set by specifying coordinates. That is, for example, in the example of fig. 14, coordinates P1, P2, P3, and P4 that are the vertex positions of the first target area AR1 and coordinates P5, P6, P7, and P8 that are the vertex positions of the second target area AR2 may be designated, the area surrounded by the coordinates P5 to P8 may be set as the second target area AR2, and the area outside the second target area AR2 and surrounded by the coordinates P1 to P4 may be set as the first target area AR1.
(Object information acquisition section)
The object information acquiring unit 32 acquires position information of an object existing in the imaging area AR 0. The object information acquisition unit 32 controls the object position measurement unit 14 such that the object position measurement unit 14 measures the relative position of the object with respect to the imaging device 100. The object information acquisition unit 32 acquires, as object position information, a result of measurement of the relative position of the object with respect to the imaging device 100 by the object position measurement unit 14. The object information acquisition unit 32 acquires position information of an object at predetermined intervals, thereby sequentially acquiring the position information of the object. The object information acquiring unit 32 can acquire information indicating the shape of the object (for example, the 3D shape of the object) based on the position information of the object. For example, the object information acquiring unit 32 can acquire a 3D shape of an object by accumulating a plurality of pieces of position information such as TOF image information.
(Focal position control section)
The focal position control unit 34 sets the focal position of the imaging device 100. The focal position control unit 34 controls the focal position by controlling the position of the optical element 10, that is, by moving the position of the optical element 10.
For each target area AR, the focus position control unit 34 aligns the focal position with an object present in that target area AR. In other words, the focal position control unit 34 sets the focal position at the position of an object determined to exist in the target area AR. In the present embodiment, the focus position control section 34 determines whether or not an object exists in the target area AR based on the position information of the object acquired by the object information acquisition section 32. When the position of the object acquired by the object information acquisition unit 32 overlaps the position of the target area AR, the focus position control unit 34 determines that the object exists in the target area AR, and aligns the focal position with the position of the object acquired by the object information acquisition unit 32. In the example of fig. 14 and 15, since the first target area AR1 and the second target area AR2 are set as the target areas AR, the focal position control unit 34 aligns the focal position with an object in the first target area AR1 when the object is present in the first target area AR1, and aligns the focal position with an object in the second target area AR2 when the object is present in the second target area AR2.
The focus position control unit 34 keeps the focal position aligned with the object while the object exists in the target area AR. That is, the focus position control unit 34 determines, based on the position information of the object acquired by the object information acquisition unit 32 at every predetermined time, whether or not the object continues to exist in the target area AR, and keeps the focal position aligned with the object while the object continues to exist in the target area AR. On the other hand, when the object moves out of the target area AR, that is, when the object is no longer present in the target area AR, the focal position control unit 34 separates the focal position from the object and aligns the focal position with a position other than the object. Taking as an example the case where the focal position is aligned with an object in the first target area AR1, the focal position control unit 34 keeps the focal position aligned with the object while the object continues to exist in the first target area AR1, and separates the focal position from the object when the object moves out of the first target area AR1. However, in the case where the object moves directly from the first target area AR1 into the second target area AR2 (that is, where the object moves from the first target area AR1 to the second target area AR2 without leaving the combined range of the first target area AR1 and the second target area AR2), the focal position may be kept aligned with the object that has moved into the second target area AR2.
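The keep-or-release decision, including the direct transfer between the two areas, can be sketched as follows (contains() is an assumed interface):

```python
def update_focused_area(obj_pos, ar1, ar2):
    """Decide whether the focal position stays on a tracked object.

    Returns the target area the object is currently in, or None when it
    has left both areas (focus released). A direct move AR1 <-> AR2 keeps
    the focus because the object never leaves the combined range.
    contains() is an assumed interface."""
    if ar1.contains(obj_pos):
        return ar1
    if ar2.contains(obj_pos):
        return ar2
    return None   # outside both areas: separate the focal position
```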
The focus position control unit 34 may also be configured not to align the focal position with an object that already exists in the target area AR at the start of the operation of the imaging device 100 (at the timing when the imaging device becomes capable of imaging). That is, the focus position control unit 34 may align the focal position only with an object that enters the target area AR after the start of the operation. In other words, for an object that exists in the target area AR at a certain timing but did not exist in the target area AR at an earlier timing, the focus position control unit 34 may align the focal position from the timing at which the object comes to exist in the target area AR. That is, an object that has moved from outside the target area AR into the target area AR can be recognized as the object on which the focus position control section 34 focuses, and the focus position control section 34 may align the focal position with such an object.
In addition, when no object is present in the target area AR, the focal position control unit 34 may align the focal position with a preset set position. The set position may be set arbitrarily, but is preferably set within the target area AR, for example at the center position of the target area AR.
In the present embodiment, for some of the target areas AR, when an object exists in the target area AR, the imaging device 100 is caused to execute a predetermined process while the focal position is aligned with the object. On the other hand, for the other target areas AR, when an object exists in the target area AR, the focal position is aligned with the object, but the imaging device 100 is not caused to execute the predetermined process. The predetermined process in the present embodiment is a process of capturing an image by the imaging control unit 36. That is, in the present embodiment, for some of the target areas AR, when an object exists in the target area AR, the focus position control section 34 aligns the focal position with the object, and the imaging control section 36 captures an image; in this case, the imaging control unit 36 captures the image in a state where the focal position is aligned with the object. On the other hand, for the other target areas AR, when an object exists in the target area AR, the focus position control unit 34 aligns the focal position with the object, but the imaging control unit 36 does not capture an image.
In the case where the display unit 18 displays the image of the imaging area in real time, the imaging device 100 can be said to be always capturing images. However, if an image is not recorded, it is temporarily stored in a buffer or the like and then automatically deleted without being stored in the storage unit 22. The term "capturing an image" in the present embodiment does not refer to capturing such an image that is automatically deleted without being stored in the storage unit 22, but refers to capturing an image that is stored in the storage unit 22, in other words, capturing an image for recording and storing it in the storage unit 22.
An example of setting the focal position described above will be described with reference to fig. 14. In fig. 14, a case is exemplified in which the predetermined process is not performed when the object is located in the first target area AR1, and the predetermined process is performed when the object is located in the second target area AR 2. Further, fig. 14 shows an example of a case where the object a moves from the position A0 to the position A4 toward the photographing device 100 through the position A1, the position A2, and the position A3. The position A0 is farther from the imaging device 100 than the first distance L1, and is out of the range of the first target area AR1 and the second target area AR 2. The position A1 is within the range of the first object area AR1, the position A2 is within the range of the second object area AR2, and the position A3 is within the range of the first object area AR 1. The distance between the position A4 and the imaging device 100 is shorter than the second distance L2, and is outside the range of the first target area AR1 and the second target area AR 2.
At the timing when the object a exists at the position A0, the focus position control section 34 does not aim the focus position at the object a, for example, aims the focus position at the set position. In addition, at this timing, the imaging control section 36 does not take an image. Then, the focal position control unit 34 aligns the focal position with the object a at the timing when the object a exists at the position A1, that is, at the timing when the object a enters the first target area AR 1. The focal position control unit 34 continuously aligns the focal position with the object a while the object a is continuously present in the first target area AR 1. The imaging control unit 36 does not take an image while the object a is continuously present in the first target area AR 1. Then, the focal position control unit 34 continuously aligns the focal position with the object a at the timing when the object a exists at the position A2, that is, at the timing when the object a moves directly from the first target area AR1 to the second target area AR 2. The focal position control unit 34 continuously aligns the focal position with the object a while the object a is continuously present in the second target area AR 2. On the other hand, the imaging control unit 36 starts imaging of the image at the timing when the object a enters the second target area AR2, and continues imaging while the object a continues to exist in the second target area AR 2.
Thereafter, the focal position control unit 34 keeps the focal position aligned with the object a at the timing when the object a exists at the position A3, that is, at the timing when the object a moves directly from the second target area AR2 into the first target area AR1. The focal position control unit 34 keeps the focal position aligned with the object a while the object a continues to exist in the first target area AR1. On the other hand, the imaging control unit 36 stops capturing the image at the timing when the object a moves from the second target area AR2 into the first target area AR1, and keeps imaging stopped while the object a continues to exist in the first target area AR1. Then, at the timing when the object a exists at the position A4, that is, at the timing when the object a moves from the first target area AR1 to outside the ranges of the first target area AR1 and the second target area AR2, the focal position control unit 34 separates the focal position from the object a and returns the focal position to the set position. That is, the focus position control unit 34 aligns the focal position with the object a from the timing when the object a enters the target area AR, moves the focal position together with the object a while the object a moves within the target area AR, and separates the focal position from the object a at the timing when the object a moves outside the target area AR. The imaging control unit 36 continuously captures images while the object a is located in the second target area AR2, and does not capture images when the object a is located outside the second target area AR2.
In the above description, the image capturing is set to a predetermined process (a process when an object exists in the second target area AR 2), but the predetermined process is not limited to the image capturing. The predetermined process may be any process other than the process of aligning the focal position to the object, and may be at least one of a process of capturing an image, a process of irradiating light toward the object (for example, a process of irradiating illumination light), and a process of outputting information indicating that the object is present in the second target area AR2 (for example, a process of outputting sound). In this case, for example, when an object exists in the second target area AR2, all of the predetermined processes may be executed. In addition, when three or more target areas AR are set, different predetermined processes may be allocated to each target area AR. That is, for example, when the first target area, the second target area, and the third target area are set, the predetermined process may not be performed when the object exists in the first target area, the first predetermined process may be performed when the object exists in the second target area, and the second predetermined process different from the first predetermined process may be performed when the object exists in the third target area.
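One way to express this assignment of processes to areas is a simple mapping; the area names and process labels below are hypothetical placeholders:

```python
# Hypothetical assignment of predetermined processes to target areas;
# None means "align the focal position only, execute no process".
PROCESS_BY_AREA = {
    "first_target_area": None,
    "second_target_area": ("record_image",),
    "third_target_area": ("irradiate_light", "output_notification"),
}

def processes_for(area_name: str) -> tuple:
    """Predetermined process(es) assigned to the given target area."""
    return PROCESS_BY_AREA.get(area_name) or ()
```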
In addition, the focal position may be set by the user. In this case, for example, an automatic mode in which the focus position is automatically set and a manual mode in which the focus position is set by the user can be switched. In the case of the automatic mode, the focal position is set by the focal position control unit 34 as described above. On the other hand, in the manual mode, the user inputs an operation to set the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user operation.
In addition, the image may be captured by the user. In this case, for example, an automatic mode of automatically photographing an image and a manual mode of photographing an image by an operation of a user can be switched. In the case of the automatic mode, as described above, the image is captured by the imaging control unit 36 while the object a is present in the second target area AR 2. On the other hand, in the manual mode, the user inputs an operation to the input unit 16 to capture an image, and the capture control unit 36 captures an image in accordance with the user operation.
(Imaging control section)
The imaging control unit 36 controls the imaging of the imaging device 100 to take an image as described above. The imaging control unit 36 controls the imaging element 12, for example, so that the imaging element 12 acquires an image signal. For example, the imaging control unit 36 may cause the imaging element 12 to automatically acquire an image signal, or may acquire an image signal according to a user operation.
(Image acquisition section)
The image acquisition unit 38 acquires image data acquired by the imaging element 12. The image acquisition unit 38 controls the image processing circuit 13, for example, to cause the image processing circuit 13 to generate image data from the image signal generated by the imaging element 12, and to acquire the image data. The image acquisition unit 38 causes the storage unit 22 to store image data.
(Flow of setting focal position)
Next, the flow of the processing for setting the focal position described above will be described. Fig. 18 is a flowchart illustrating a flow of processing for setting the focal position. As shown in fig. 18, the control unit 24 acquires information of the target area AR by the target area acquisition unit 30 (step S10), and acquires the content of the predetermined process (step S12). The content of the predetermined process is information indicating the object area AR in which the predetermined process is performed (i.e., information indicating which object area AR is the object area in which the predetermined process is performed) and the content of the predetermined process (for example, capturing an image or the like) in the plurality of object areas AR. The content of the predetermined process may be set in advance or may be specified by the user. Here, the second target area AR2 is assumed to be the target area AR in which predetermined processing, which is processing of capturing an image, is performed, and the description will be continued.
The control unit 24 acquires the position information of the object through the object information acquisition unit 32 (step S14). The order of execution of steps S10, S12, and S14 may be arbitrary. The control unit 24 determines, by the focus position control unit 34, whether or not the object is located in the first target area AR1 based on the position information of the object (step S16). When the object is located in the first target area AR1 (yes in step S16), the focal position control unit 34 aligns the focal position with the object (step S18), but the predetermined process (here, capturing an image) is not executed. Thereafter, when the process is to be ended (yes in step S20), the present process ends; when the process is not to be ended (no in step S20), the process returns to step S14 and continues. On the other hand, when the object is located neither in the first target area AR1 (no in step S16) nor in the second target area AR2 (no in step S22), the focal position is not aligned with the object and the predetermined process is not executed, and the flow advances to step S20. When the object is located in the second target area AR2 (yes in step S22), the focus position control unit 34 aligns the focal position with the object, and the imaging control unit 36 executes the predetermined process (here, capturing an image) (step S24). Thereafter, the process advances to step S20.
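A sketch of one pass of the decision in Fig. 18 (steps S16 to S24) follows; contains() and the callbacks are assumed interfaces introduced for illustration:

```python
def fifth_embodiment_step(obj_pos, ar1, ar2, focus_on, defocus, run_process):
    """One pass of the decision in Fig. 18 (steps S16-S24); contains()
    and the callbacks are assumed interfaces."""
    if ar1.contains(obj_pos):            # step S16
        focus_on(obj_pos)                # step S18: focus, no process
    elif ar2.contains(obj_pos):          # step S22
        focus_on(obj_pos)                # step S24: focus and execute
        run_process("record_image")      # the predetermined process
    else:
        defocus()                        # in neither area: release focus
```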
The process of performing the predetermined process on the partial target area AR (here, the second target area AR 2) as described above is not essential. The imaging device 100 may set a plurality of target areas AR, and control the focus position of each target area AR to be aligned with an object existing in the target area AR.
(Effect)
As described above, the imaging device 100 according to the present embodiment includes: a photographing element 12; an object information acquisition unit 32 that acquires position information of an object existing in the imaging region AR0 of the imaging element 12; an object region acquiring unit 30 that acquires position information of a plurality of object regions AR; and a focus position control unit 34 that controls the focus position of the imaging device 100. The plurality of object regions AR are located between a first position AX1 at a first distance L1 from the imaging device 100 and a second position AX2 at a second distance L2 smaller than the first distance L1 from the imaging device 100. Regarding each target area AR, in a case where an object exists within the target area AR, the focus position control section 34 aligns the focus position with the object.
Here, in the imaging device of the auto-focus system, it is required to properly align the focus position. In contrast, the imaging device 100 according to the present embodiment is provided with a plurality of object areas AR, and when an object exists in any one of these object areas AR, the focus position is aligned with the object. Therefore, for example, even when there are a plurality of areas to be focused on in monitoring or the like, it is possible to focus on an object existing in each area. Therefore, according to the present embodiment, the focal position can be properly aligned.
In the present embodiment, at least the first target area AR1 and the second target area AR2 are set as the plurality of target areas AR. When an object exists in the second target area AR2, the control unit 24 aligns the focus position with the object, and causes the imaging device 100 to execute a predetermined process. On the other hand, in the case where an object exists in the first target area AR1, the control section 24 aligns the focus position with the object, and does not execute a prescribed process. According to the present embodiment, since the region in which the predetermined process is performed while the focus position is aligned and the region in which the predetermined process is not performed while the focus position is aligned are divided, photographing and the like can be appropriately performed.
In the present embodiment, the predetermined process is at least one of a process of photographing the object, a process of irradiating light toward the object, and a process of outputting information indicating that the object is present in the second target area AR 2. By performing these processes in the case where an object exists in the second target area AR2, shooting or the like can be appropriately performed.
(Sixth embodiment)
Next, a sixth embodiment will be described. The sixth embodiment differs from the fifth embodiment in the handling of the case where objects exist in different target areas AR at the same timing. In the sixth embodiment, descriptions of the parts common to the fifth embodiment are omitted.
The object information acquiring unit 32 causes the object position measuring unit 14 to measure the relative position of each object, and acquires the position information of each object existing in the imaging area AR0. The focal position control unit 34 determines whether or not each object is located within a target area AR based on the position information of each object. When it is determined that objects are located in different target areas AR at the same timing, the focus position control unit 34 sets the focal position based on priority information. The priority information is information indicating which target area AR is to be prioritized among the target areas AR, and is acquired by the target area acquisition unit 30. The priority information may be, for example, information indicating the priority order of the respective target areas AR.
The target area acquisition unit 30 may acquire the priority information by any method. For example, the priority information may be set in advance. In this case, the target area acquisition unit 30 may read out the preset priority information from the storage unit 22, or may acquire the priority information from another device via the communication unit 20. For example, when the priority information is not set in advance, the target area acquisition unit 30 may set the priority information automatically. In addition, for example, the user may set the priority information. In this case, for example, the user inputs information specifying the priority information (for example, the priority order of each target area AR) to the input unit 16, and the target area acquisition unit 30 acquires the priority information input by the user.
The focus position control unit 34 aligns the focal position based on the priority information so that an object located in the prioritized target area AR is given precedence over an object located in a target area AR other than the prioritized one. In other words, the focus position control unit 34 aligns the focal position so that an object located in a target area AR with a higher priority order takes precedence over an object located in a target area AR with a lower priority order. Specific examples of the method of aligning the focal position based on the priority information will be described below.
Fig. 19 is a schematic diagram for explaining the setting of the focal position in the sixth embodiment. For example, the focus position control unit 34 may align the focal position with an object located in the prioritized target area AR and not align the focal position with an object outside the prioritized target area AR. In other words, the focus position control unit 34 may align the focal position with an object located in the target area AR having the highest priority instead of with an object located in another target area AR. In fig. 19, a case is described as an example where the second target area AR2 has a higher priority than the first target area AR1, and the object Aa is located in the first target area AR1 while the object Ab is located in the second target area AR2 at the same timing. In this case, the focus position control unit 34 keeps the focal position aligned with the object Ab and does not align the focal position with the object Aa.
For example, the focal position control unit 34 may switch the focal position so that the focal position comes to overlap the object in each target area AR in turn. In this case, for example, the focal position control unit 34 may keep the focal position aligned with the object located in the prioritized target area AR longer than with the object located in a target area AR other than the prioritized one. In other words, the focal position control unit 34 may make the period during which the focal position is continuously aligned with an object located in a target area AR having a higher priority longer than the period during which the focal position is continuously aligned with an object located in a target area AR having a lower priority. That is, the higher the priority order, the longer the period for which the focal position is continuously aligned can be made. Taking fig. 19 as an example, the focal position control unit 34 sets the focal position so that it alternates between a position overlapping the object Aa and a position overlapping the object Ab. At this time, the focal position control unit 34 keeps the focal position aligned with the object Ab in the second target area AR2, which has the higher priority, longer than with the object Aa in the first target area AR1, which has the lower priority.
For example, the focus position control unit 34 may align the focal position with the object located in the prioritized target area AR before aligning it with an object located in a target area AR other than the prioritized one. In other words, the focus position control unit 34 may align the focal position with an object located in a target area AR having a higher priority order at an earlier timing than with an object located in a target area AR having a lower priority order. That is, the higher the priority order, the earlier the timing at which the focal position is aligned can be made. Taking fig. 19 as an example, the focal position control unit 34 sets the focal position so that it alternates between a position overlapping the object Aa and a position overlapping the object Ab. At this time, the focal position control unit 34 first aligns the focal position with the object Ab located in the second target area AR2, and then aligns the focal position with the object Aa located in the first target area AR1, which has the lower priority.
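Combining the two policies above (earlier and longer for higher priority), a schedule could be built as in the following sketch; the data shapes and the linear dwell rule are illustrative assumptions:

```python
def focus_schedule(objects_by_area, priority_order, base_dwell_s=1.0):
    """Build a (target, dwell) schedule from the priority information.

    objects_by_area maps an area name to the object inside it, and
    priority_order lists area names from highest to lowest priority.
    Higher priority here means focused earlier and held longer; this is
    one illustrative policy among those the text permits."""
    schedule = []
    n = len(priority_order)
    for rank, area in enumerate(priority_order):
        if area in objects_by_area:
            dwell = base_dwell_s * (n - rank)   # higher priority => longer
            schedule.append((objects_by_area[area], dwell))
    return schedule

# Example: AR2 outranks AR1, so object Ab is focused first and held longer.
print(focus_schedule({"AR1": "Aa", "AR2": "Ab"}, ["AR2", "AR1"]))
# [('Ab', 2.0), ('Aa', 1.0)]
```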
Next, the flow of the processing for setting the focal position described above will be described. Fig. 20 is a flowchart illustrating a flow of processing for setting the focal position in the sixth embodiment. As shown in fig. 20, the control unit 24 acquires information of the target area AR by the target area acquisition unit 30 (step S30), and acquires priority information (step S32).
The control unit 24 acquires the position information of the object through the object information acquisition unit 32 (step S34). The order of execution of steps S30, S32, and S34 may be arbitrary. The control unit 24 determines, by the focus position control unit 34, whether or not an object is located in at least one of the first target area AR1 and the second target area AR2 based on the position information of the object (step S36). When objects are located in at least one of the first target area AR1 and the second target area AR2 (yes in step S36) and in both of them (yes in step S38), the focal position control unit 34 aligns the focal position based on the priority information (step S40). Thereafter, when the process is to be ended (yes in step S42), the present process ends; when the process is not to be ended (no in step S42), the process returns to step S34 and continues. On the other hand, when objects are not located in both the first target area AR1 and the second target area AR2 (no in step S38), that is, when an object is located in only one of the first target area AR1 and the second target area AR2, the focal position control unit 34 aligns the focal position with that object (step S44), and the flow advances to step S42. If no object is located in at least one of the first target area AR1 and the second target area AR2 (no in step S36), that is, if no object is located in either the first target area AR1 or the second target area AR2, the control of the focal position is not performed, and the flow advances to step S42.
As described above, in the sixth embodiment, when objects exist in a plurality of target areas AR at the same timing, the focal position control unit 34 sets the focal position based on the priority information indicating which target area AR is to be prioritized. According to the sixth embodiment, even when objects enter a plurality of target areas AR at the same time, the focal position can be set appropriately based on the priority information.
The focus position control unit 34 may align the focal position with an object located in the target area AR set as prioritized in the priority information, and not align the focal position with an object located outside that target area AR. Therefore, even when objects enter a plurality of target areas AR at the same time, the focal position can be properly aligned with the prioritized object.
The focal position control unit 34 may change the focal position so that the focal position overlaps with the object in each target area AR. In this case, the focal position control unit 34 makes the period of time during which the focal position is aligned with the object located in the preferential target area AR longer than the period of time during which the focal position is aligned with the object located in the target area AR other than the preferential target area AR. Therefore, the focal position can be aligned with the priority object longer.
(Seventh embodiment)
Next, a seventh embodiment will be described. The seventh embodiment is different from the fifth embodiment in that the focus position is aligned with an object that exists in the target area AR and satisfies a predetermined condition. In the seventh embodiment, the parts common to the fifth embodiment are not described. The seventh embodiment can also be applied to the sixth embodiment.
In the seventh embodiment, the focal position control unit 34 aligns the focal position with an object that exists in the target area AR and satisfies a predetermined condition. The focal position control unit 34 does not align the focal position with respect to an object that does not satisfy at least one of the presence in the target area AR and the satisfaction of the predetermined condition. The focus position control unit 34 continuously aligns the focus position with the object while the object aligned with the focus position satisfies a predetermined condition and continuously exists in the target area AR. On the other hand, the focal position control unit 34 separates the focal position from the object when the object does not satisfy at least one of the presence in the target area AR and the satisfaction of the predetermined condition. That is, for example, the focal position control unit 34 separates the focal position from the object when the object satisfies a predetermined condition but moves outside the target area AR, or when the object exists in the target area AR but does not satisfy the predetermined condition.
The focal position control unit 34 may determine whether or not a predetermined condition is satisfied by any method, and may determine whether or not the predetermined condition is satisfied based on at least one of position information of an object and an image of the object, for example. The positional information of the object may be a measurement result of the object position measurement unit 14, and the image of the object may be image data of the object captured by the imaging element 12.
The predetermined condition here may be any condition other than the presence of the object in the target area AR. For example, the predetermined condition may be at least one of the object performing a predetermined movement, the object having a predetermined shape, and the object being oriented in a predetermined direction. Any two or all of these may also be combined as the predetermined condition. When a plurality of predetermined conditions are set, the focal position control unit 34 determines that the predetermined condition is satisfied when all of them are satisfied.
A case where the movement of the object is set as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether or not the object is performing a predetermined movement based on the position information of the object continuously acquired in time series. The focal position control unit 34 aligns the focal position with an object existing in the target area AR and performing the predetermined movement. The focal position control unit 34 does not align the focal position with an object that fails to satisfy at least one of existing in the target area AR and performing the predetermined movement. The focal position control unit 34 keeps the focal position aligned with the object while the object exists in the target area AR and continues to perform the predetermined movement. On the other hand, the focal position control unit 34 separates the focal position from the object when the object no longer satisfies at least one of existing in the target area AR and performing the predetermined movement. The movement of the object here refers to the manner of movement of the object, for example its moving direction and moving speed. For example, when the predetermined movement is moving vertically downward at a speed of 10 m/h or more, the focal position control unit 34 aligns the focal position with an object in the target area AR moving vertically downward at a speed of 10 m/h or more. The movement of the object is not limited to the combination of moving direction and moving speed and may refer to any manner of movement; for example, it may refer to at least one of the moving direction and the moving speed of the object.
Fig. 21 is a schematic diagram illustrating an example of the case where the movement of the object is set as the predetermined condition. In the example of fig. 21, the predetermined condition relates to the movement direction of the object, namely that the object moves downward in the vertical direction (the direction opposite to the Z direction). Fig. 21 shows an example in which the object A moves downward in the vertical direction from the position A0a through the positions A1a and A2a to the position A3a, and stops at the position A3a. The position A0a is outside the target area AR, and the positions A1a, A2a, and A3a are inside the target area AR. In this case, at the timing when the object A is at the position A0a, the object A is outside the target area AR, so the focus position control unit 34 does not align the focus position with the object A and, for example, keeps the focus position at the set position. Then, the focal position control unit 34 aligns the focal position with the object A at the timing when the object A is at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction. The focus position control unit 34 continues to align the focus position with the object A when the object A is at the position A2a, and moves the focus position away from the object A and returns it to the set position at the timing when the object A moves to the position A3a and stops.

Next, the case where the shape of the object is set as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether or not the object has a predetermined shape based on the image data of the object, and aligns the focal position with an object that exists in the target area AR and has the predetermined shape. The focal position control unit 34 does not align the focal position with an object that is not in the target area AR or does not have the predetermined shape. The focus position control unit 34 continuously aligns the focus position with the object while the object keeps the predetermined shape and continuously exists in the target area AR. On the other hand, the focal position control unit 34 separates the focal position from the object when the object leaves the target area AR or no longer has the predetermined shape. The shape of the object here may be, for example, at least one of the size of the object and the outline shape of the object. For example, when the predetermined shape is defined as being equal to or larger than a predetermined size, the focal position control unit 34 aligns the focal position with an object in the target area AR that is equal to or larger than the predetermined size. The shape information of the object may be obtained by using the 3D shape information acquired by the object information acquisition unit 32.

Next, the case where the orientation of the object is set as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether or not the object is oriented in a predetermined direction based on the image data of the object, and aligns the focal position with an object that exists in the target area AR and is oriented in the predetermined direction. The focal position control unit 34 does not align the focal position with an object that is not in the target area AR or is not oriented in the predetermined direction. The focus position control unit 34 continuously aligns the focus position with the object while the object remains oriented in the predetermined direction and continuously exists in the target area AR. On the other hand, the focal position control unit 34 separates the focal position from the object when the object leaves the target area AR or is no longer oriented in the predetermined direction. The orientation information of the object may be acquired by using the 3D shape information acquired by the object information acquisition unit 32.
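When two or three of these conditions are combined, the determination described above reduces to a conjunction of independent predicates. A minimal sketch, with placeholder predicates standing in for the movement, shape, and orientation checks (all names, field keys, and thresholds are illustrative assumptions):

```python
def satisfies_predetermined_conditions(obj, conditions):
    """True only when every configured condition holds for the object,
    matching the rule that all set conditions must be satisfied."""
    return all(condition(obj) for condition in conditions)

# Illustrative configuration; the predicates and values are assumptions.
conditions = [
    lambda o: o["moving_down"],              # predetermined movement
    lambda o: o["size_m"] >= 0.5,            # predetermined shape (size)
    lambda o: o["orientation"] == "front",   # predetermined orientation
]
```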
The predetermined condition may be set by any method, and may be set in advance, for example. In this case, the focal position control unit 34 may read information (for example, a moving direction and a moving speed) indicating predetermined conditions set in advance from the storage unit 22, or may acquire the predetermined conditions from other devices via the communication unit 20. For example, if the predetermined condition is not set in advance, the focal position control unit 34 may automatically set the predetermined condition. Further, for example, the user may set a predetermined condition. In this case, for example, the user may input information (for example, a moving direction and a moving speed) specifying a predetermined condition to the input section 16, and the focus position control section 34 may set the predetermined condition based on the information specified by the user.
As described above, in the seventh embodiment, the focal position control unit 34 may align the focal position with an object that exists in the target area AR and is performing a predetermined movement. The focus position control unit 34 keeps the focus position on the object while the object is performing the predetermined movement, and separates the focus position from the object when the object is no longer performing the predetermined movement. In this way, by making the predetermined movement a condition for aligning the focal position in addition to being located in the target area AR, an object performing a specific movement can be tracked and the focal position can be properly aligned.
In the seventh embodiment, the focal position control unit 34 may align the focal position with an object having a predetermined shape existing in the target area AR. In this way, in addition to being located in the target area AR, a predetermined shape is used as a condition for aligning the focal position, whereby an object of a specific shape can be tracked and the focal position can be properly aligned.
In the seventh embodiment, the focal position control unit 34 may align the focal position with an object existing in the target area AR and facing in a predetermined direction. In this way, in addition to being located in the target area AR, the object having a specific orientation can be tracked by setting the orientation in a predetermined direction as a condition for aligning the focal position, and the focal position can be properly aligned.
(Eighth embodiment)
(Structure of photographing device)
Fig. 22 is a schematic block diagram of an imaging device according to an eighth embodiment. The imaging device 100 according to the eighth embodiment is an imaging device that images an object within an imaging range. The imaging device 100 is an auto-focus camera capable of automatically setting a focus position. The imaging device 100 may be a video camera that captures a moving image by imaging every predetermined frame, or may be a still image camera. The imaging device 100 can be used for any purpose, and can be used, for example, as a monitoring camera installed at a predetermined position in a facility or outdoors.
As shown in fig. 22, the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measuring unit 14, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
The optical element 10 is, for example, an element of an optical system such as a lens. The optical element 10 may be one or a plurality of optical elements.
The imaging element 12 is an element that converts light incident through the optical element 10 into an image signal, which is an electrical signal. The imaging element 12 is, for example, a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.

The image processing circuit 13 generates image data for each frame from the image signal generated by the imaging element 12. The image data may be, for example, data including information on the brightness and color of each pixel in one frame, or data on the gradation assigned to each pixel.
The object position measuring unit 14 is a sensor that measures the position of an object to be measured (the relative position of the object) with respect to the imaging device 100. The object may be any object, and may be living or inanimate, and the same applies hereinafter. The object herein may be a movable object, but is not limited thereto, and may be an object that does not move.
In the present embodiment, the object position measuring unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object. The object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, and may be, for example, a Time of Flight (TOF) sensor. In the case where the object position measuring unit 14 is a TOF sensor, it includes, for example, a light emitting element (for example, an LED (Light Emitting Diode)) that emits light and a light receiving unit that receives light, and measures the distance to the object from the time of flight of the light emitted from the light emitting element, reflected by the object, and returned to the light receiving unit. The object position measuring unit 14 may measure the relative position of the object, for example, the direction in which the object exists with respect to the imaging device 100, in addition to the distance from the imaging device 100 to the object. In other words, the object position measuring unit 14 may measure the position (coordinates) of the object in a coordinate system having the imaging device 100 as the origin as the relative position of the object.
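For reference, TOF ranging follows directly from the measured flight time by halving the round trip of the light; a one-line sketch (the names are illustrative, not part of this description):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_distance_m(flight_time_s: float) -> float:
    """Distance to the object: the light covers the path twice (out and back)."""
    return SPEED_OF_LIGHT_M_PER_S * flight_time_s / 2.0
```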
The input unit 16 is a mechanism for receiving an input (operation) from a user, and may be, for example, a button, a keyboard, a touch panel, or the like.
The display unit 18 is a display panel that displays an image. The display unit 18 can display an image for setting a target area AR described later by a user, in addition to the image captured by the imaging device 100.
The communication unit 20 is a communication module that communicates with an external device, and may be, for example, an antenna, a Wi-Fi (registered trademark) module, or the like. The imaging device 100 communicates with an external device by wireless communication, but may be wired communication, and the communication method may be arbitrary.
The storage unit 22 is a memory for storing captured image data, various information such as the operation contents of the control unit 24, and programs, and includes at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory), and an external storage device such as an HDD (Hard Disk Drive). The program for the control unit 24 stored in the storage unit 22 may be stored in a recording medium readable by the imaging device 100.

The control unit 24 is an arithmetic device, and includes an arithmetic circuit such as a CPU (Central Processing Unit). The control section 24 includes a target area acquisition section 30, an object information acquisition section 32, a focus position control section 34, a photographing control section 36, an image acquisition section 38, and an object recognition section 40. The control unit 24 reads out a program (software) from the storage unit 22 and executes the program, thereby realizing the target region acquiring unit 30, the object information acquiring unit 32, the focal position control unit 34, the imaging control unit 36, the image acquiring unit 38, and the object recognizing unit 40, and executing these processes. The control unit 24 may execute these processes by one CPU, or may include a plurality of CPUs to execute the processes by the plurality of CPUs. At least part of the processing of the target region acquiring unit 30, the object information acquiring unit 32, the focal position control unit 34, the imaging control unit 36, the image acquiring unit 38, and the object recognizing unit 40 may be realized by a hardware circuit.
(Target region acquiring section)
The target area acquisition unit 30 acquires information of the target area AR set in the imaging area of the imaging device 100. The target area AR is an area set for automatic focus position. The information of the target area AR is information indicating the position of the target area AR, that is, the position information of the target area AR. The target area AR is described below.
Fig. 23 and 24 are schematic diagrams for explaining an example of the target region. Fig. 23 is a view of the imaging device 100 and the target area AR from above in the vertical direction, and fig. 24 is a view of the imaging device 100 and the target area AR from above in the horizontal direction. Hereinafter, the direction Z is referred to as the vertical direction, the direction X is referred to as one of the horizontal directions orthogonal to the direction Z, and the direction Y is referred to as the direction (horizontal direction) orthogonal to the direction Z and the direction X. As shown in fig. 23 and 24, the range in which an image can be captured by the imaging device 100 is defined as an imaging area AR0. The imaging region AR0 is a region (space) within the angle of view of the imaging element 12, in other words, a range that is displayed as an image in real space. The target area AR is an area (space) set within the range of the imaging area AR0.
More specifically, the target area AR is an area within the imaging area AR0 and between the first position AX1 and the second position AX 2. The first position AX1 is a position at a first distance L1 from the imaging device 100, and the second position AX2 is a position at a second distance L2 shorter than the first distance L1 from the imaging device 100. As shown in fig. 23 and 24, in the present embodiment, the first position AX1 is a virtual plane including positions (coordinates) having a first distance L1 from the imaging device 100 in the imaging area AR 0. Similarly, the second position AX2 is a virtual plane including each position (coordinate) having a second distance L2 from the imaging device 100 in the imaging area AR 0. That is, the target area AR can be said to be a space surrounded by a virtual plane having a distance of the second distance L2 from the imaging device 100 and a virtual plane having a distance of the first distance L1 from the imaging device 100 in the imaging area AR 0. The first position AX1 is not limited to a virtual plane having the first distance L1 from the imaging device 100 in all the positions (coordinates) included in the first position AX1, and may be a virtual plane having the first distance L1 from the imaging device 100 in at least a part of the positions (coordinates) included in the first position AX 1. Similarly, the second position AX2 may be a virtual plane in which a position (coordinate) of at least a part included in the second position AX2 is a second distance L2 from the imaging device 100.
Fig. 25 and 26 are schematic views showing another example of the target area. In the description of fig. 23 and 24, the target area AR is divided from the imaging area AR0 by the first position AX1 and the second position AX2 in the optical axis direction of the imaging device 100 (the depth direction of the image), but the imaging area AR0 is not divided in the radial direction with respect to the optical axis (the direction in which the angle of view spreads). In other words, the end face of the target area AR in the spreading direction of the angle of view coincides with the end face of the imaging area AR0 in the spreading direction of the angle of view. However, the target area AR is not limited to this, and the imaging area AR0 may also be divided in the spreading direction of the angle of view. That is, for example, as shown in fig. 25 and 26, the imaging area AR0 may be divided by a third position AX3 in the spreading direction of the angle of view. In this example, the third position AX3 is a virtual plane (here, a closed curved surface shaped like the side surface of a cylinder) including positions (coordinates) separated by a predetermined distance outward in the radial direction from the optical axis LX of the imaging device 100. In this case, the target area AR is an area (space) surrounded by the first position AX1, the second position AX2, and the third position AX3. The third position AX3 is not limited to a virtual plane in which all the positions (coordinates) included in the third position AX3 are at the third distance L3 from the optical axis LX, and may be a virtual plane in which at least a part of the positions (coordinates) included in the third position AX3 are at the third distance L3 from the optical axis LX. For example, the third position AX3 may be a virtual plane that spreads outward in the radial direction (the horizontal direction and the elevation direction) at a predetermined angle with increasing distance from the imaging device 100 along the optical axis direction.
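The geometry above amounts to a simple membership test. The following sketch assumes a coordinate system with the imaging device 100 at the origin and the optical axis LX along the +x axis; the function name and the cylindrical form of the third position are illustrative readings of figs. 23 to 26, not definitions from this description:

```python
import math

def in_target_area(pos, l1, l2, l3=None):
    """pos: (x, y, z) of the object, with the imaging device at the origin
    and the optical axis LX along +x. l2 < l1 bound the distance from the
    device (second position AX2 and first position AX1); l3, if given,
    bounds the radial distance from the optical axis (third position AX3)."""
    x, y, z = pos
    distance = math.sqrt(x * x + y * y + z * z)
    if not (l2 <= distance <= l1):
        return False          # outside the region between AX2 and AX1
    if l3 is not None and math.hypot(y, z) > l3:
        return False          # outside the cylindrical bound at AX3
    return True
```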
The size and shape of the target area AR are not limited to the above description and may be arbitrary. The position of the target area AR is not limited to the position between the first position AX1 and the second position AX2, and may be any position. The target area AR is an area set in the imaging area AR0, but is not limited thereto. For example, if the range that can be measured by the object position measuring unit 14 is defined as a distance measurement area (distance measurement space), the target area AR may be an area set in the distance measurement area. In this case, the imaging area AR0 in figs. 23 to 26 may be treated as the distance measurement area.
The target area acquisition unit 30 may acquire information of the target area AR by any method. For example, the position of the target area AR may be set in advance. In this case, the target area acquisition unit 30 may read out the position information of the target area AR set in advance from the storage unit 22, or may acquire the position information of the target area AR from another device via the communication unit 20. For example, when the position of the target area AR is not preset, the target area acquisition unit 30 may automatically set the position of the target area AR. In addition, for example, the user may set the position of the target area AR. In this case, for example, the user may input information (for example, values of the first distance L1, the second distance L2, and the third distance L3) specifying the position of the target area AR to the input unit 16, and the target area acquisition unit 30 may set the target area AR based on the position information of the target area AR specified by the user. For example, the target area AR may be set by specifying coordinates. That is, for example, in the example of fig. 23, coordinates P1, P2, P3, and P4 of the vertex position of the target area AR may be specified, or an area surrounded by the coordinates P1 to P4 may be set as the target area AR.
(Object information acquisition section)
The object information acquiring unit 32 acquires position information of an object existing in the imaging area AR 0. The object information acquisition unit 32 controls the object position measurement unit 14 such that the object position measurement unit 14 measures the relative position of the object with respect to the imaging device 100. The object information acquisition unit 32 acquires, as object position information, a result of measurement of the relative position of the object with respect to the imaging device 100 by the object position measurement unit 14. The object information acquisition unit 32 acquires position information of an object at predetermined intervals, thereby sequentially acquiring the position information of the object. The object information acquiring unit 32 can acquire information indicating the shape of the object (for example, the 3D shape of the object) based on the position information of the object. For example, the object information acquiring unit 32 can acquire a 3D shape of an object by accumulating a plurality of pieces of position information such as TOF image information.
(Focal position control section)
The focal position control unit 34 sets the focal position of the imaging device 100. The focal position control unit 34 controls the focal position by controlling the position of the optical element 10, that is, by moving the position of the optical element 10.
The focal position control unit 34 aligns the focal position with the object existing in the target area AR. In other words, the focal position control unit 34 sets the focal position at the position of the object determined to exist in the target area AR. In the present embodiment, the focus position control section 34 determines whether or not the object exists in the target area AR based on the position information of the object acquired by the object information acquisition section 32. When the position of the object acquired by the object information acquisition unit 32 overlaps with the position of the target area AR, the focus position control unit 34 determines that the object exists in the target area AR, and aligns the focus position with the position of the object acquired by the object information acquisition unit 32. That is, for example, when the distance from the imaging device 100 to the object is equal to or less than the first distance L1 and equal to or more than the second distance L2, the focal position control unit 34 determines that the object is present in the target area AR and aligns the focal position with the object. On the other hand, the focal position control unit 34 does not align the focal position with an object that does not exist in the target area AR. That is, for example, when the distance from the imaging device 100 to the object is longer than the first distance L1 or shorter than the second distance L2, the focal position control unit 34 determines that the object is not present in the target area AR and does not align the focal position with the object.
The focus position control section 34 continuously aligns the focus position with the object whose focus position is aligned while the object exists in the target area AR. That is, the focus position control unit 34 determines whether or not the object is continuously present in the target area AR based on the position information of the object obtained by the object information obtaining unit 32 at every predetermined time, and continuously aligns the focus position with the object while the object is continuously present in the target area AR. On the other hand, when the object whose focal position is aligned moves out of the target area AR, that is, when the object is not present in the target area AR, the focal position control unit 34 separates the focal position from the object and aligns the focal position at a position other than the object.
The focus position control unit 34 may also be configured not to align the focus position with an object that already exists in the target area AR at the start of operation of the imaging device 100 (the timing at which the imaging device becomes capable of imaging). That is, the focus position control unit 34 may align the focus position only with an object that enters the target area AR after the start of operation. In other words, for an object that exists in the target area AR at a certain timing but did not exist in the target area AR at an earlier timing, the focal position control unit 34 may align the focal position from the timing at which the object comes to exist in the target area AR. That is, when an object moves from outside the target area AR into the target area AR, the object is recognized as an object with which the focus position control section 34 aligns the focus position. In short, the focus position control section 34 may align the focus position with an object that has moved from outside the target area AR into the target area AR.
In addition, in the case where the object is not present in the target area AR, the focal position control unit 34 may align the focal position with a set position set in advance. The set position may be set arbitrarily, but is preferably set within the target area AR, for example at the center position of the target area AR.
In addition, the focal position may be set by the user. In this case, for example, an automatic mode in which the focus position is automatically set and a manual mode in which the focus position is set by the user can be switched. In the case of the automatic mode, the focal position is set by the focal position control unit 34 as described above. On the other hand, in the manual mode, the user inputs an operation to set the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user operation.
(Imaging control section)
The imaging control unit 36 controls imaging by the imaging device 100 to take an image. The imaging control unit 36 controls the imaging element 12, for example, so that the imaging element 12 acquires an image signal. For example, the imaging control unit 36 may cause the imaging element 12 to automatically acquire an image signal, or may acquire an image signal according to a user operation.
(Image acquisition section)
The image acquisition unit 38 acquires image data acquired by the imaging element 12. The image acquisition unit 38 controls the image processing circuit 13, for example, to cause the image processing circuit 13 to generate image data from the image signal generated by the imaging element 12, and to acquire the image data. The image acquisition unit 38 causes the storage unit 22 to store image data.
(Object identification part)
When an object exists in the target area AR, the object identifying unit 40 determines whether or not the object is identical to an object that has previously existed in the target area AR. That is, when it is determined that the object is located in the target area AR, the focus position control unit 34 aligns the focus position with the object, and the object recognition unit 40 determines whether the object is identical to an object that was previously present in the same target area AR.
In the present embodiment, the object recognition unit 40 determines whether or not the objects are identical based on the image data of the object determined to exist in the target area AR and the image data of the object determined to exist in the target area AR in the past. In this case, the imaging control unit 36 captures an image including the object at a timing when it is determined that the object exists in the target area AR, and acquires image data of the object by the image acquisition unit 38. The object identification unit 40 determines whether or not the objects are identical based on the image data of the object acquired this time and the image data of the object acquired in the past. The image data of the object is image data representing the outline of the object.
The method of determining whether or not the objects are identical based on the image data may be arbitrary. For example, the object recognition section 40 may extract feature amounts of the objects from the image data of the objects and determine whether or not the objects are identical based on the degree of coincidence of the feature amounts. That is, the object identifying unit 40 extracts the feature amount of the object from the image data of the object acquired in the past, extracts the feature amount of the object from the image data of the object acquired this time, and determines whether or not the objects are identical based on the degree of coincidence of the feature amounts. The object identification unit 40 determines that the objects are the same when the degree of coincidence of the feature amounts is equal to or greater than a predetermined threshold value, and determines that the objects are different when the degree of coincidence is less than the predetermined threshold value. The feature amount of the image data of the object acquired in the past may be extracted at any timing, for example, at the past timing at which the image data was acquired, and may be stored in the storage unit 22. In this case, the object recognition unit 40 reads out the feature amount extracted from the image data of the object acquired in the past from the storage unit 22. The method of extracting the feature amounts and calculating the degree of coincidence may be arbitrary, and may be performed, for example, by an AI (Artificial Intelligence) model. The same determination of the object may also be performed based on data other than the image data. For example, it may be performed using the 3D shape information acquired by the object information acquisition unit 32.
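As one possible rendering of this feature-based determination, the sketch below compares feature vectors with cosine similarity against a threshold. The feature extractor, the similarity measure, and the threshold value are assumptions for illustration; the description only requires comparing a degree of coincidence with a predetermined threshold:

```python
import numpy as np

def is_same_object(current_feat: np.ndarray, past_feat: np.ndarray,
                   threshold: float = 0.8) -> bool:
    """Same-object decision from the degree of coincidence of features:
    identical when the similarity reaches the predetermined threshold."""
    similarity = float(
        np.dot(current_feat, past_feat)
        / (np.linalg.norm(current_feat) * np.linalg.norm(past_feat))
    )
    return similarity >= threshold
```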
Here, "past" refers to a period going back a predetermined time from the present (the latest timing at which the object is determined to be located in the target area AR). That is, in the present embodiment, when an object existed in the target area AR at or after the timing a predetermined time earlier, the object identifying unit 40 determines whether or not that object is identical to the object currently determined to exist in the target area AR. However, the same determination is not limited to objects that existed in the target area AR within the predetermined time. That is, the object identification unit 40 may also determine whether or not the current object is identical to an object that was determined to exist in the target area AR earlier than the predetermined time.

When objects existed in the target area AR at a plurality of past timings, the object identifying unit 40 may determine whether or not the current object matches the object that was most recently present in the target area AR. However, the determination is not limited to the latest object; the object identification unit 40 may determine whether or not the present object matches each of the past objects, based on the image data of the present object and the image data of each of the past objects.

In the case where the display unit 18 displays the image of the imaging area AR0 in real time, the imaging device 100 can be said to be capturing images at all times. However, when recording is not performed, the image is temporarily stored in a buffer or the like and then automatically deleted without being stored in the storage unit 22. In the present embodiment, the image data used for the same determination may be such image data, that is, image data that is automatically deleted without being stored in the storage unit 22.
The method of the same determination with the past object is not limited to the use of image data. For example, the object identifying section 40 may perform the same determination with the past object based on the position information of the object acquired by the object information acquiring section 32. In this case, for example, based on the position information of objects continuously acquired in time series, the object identification unit 40 determines whether or not the object currently located in the target area AR is an object that previously existed in the target area AR, once moved outside the target area AR, and has now entered the target area AR again. When the object recognition unit 40 determines that this is the case, it determines that the current object is identical to the past object.
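A minimal sketch of this position-based variant: given the time series of whether a tracked object was inside the target area AR at each measurement, an object currently inside that was inside before and left in between is treated as the same object re-entering. The history representation is an assumption for illustration:

```python
def is_reentry(inside_history):
    """inside_history: list of booleans, oldest first, one per position
    measurement of a tracked object; the last element is the current
    state. Returns True if the object was in the target area before,
    then moved outside, and is inside again now."""
    if not inside_history or not inside_history[-1]:
        return False                     # not currently in the target area
    past = inside_history[:-1]
    if True not in past:
        return False                     # first entry, nothing to match
    first_in = past.index(True)
    return False in past[first_in:]      # went outside after being inside
```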
The object identification unit 40 stores the determination result of the same determination as the past object in the storage unit 22. That is, the object identification unit 40 causes the storage unit 22 to store the result of the determination of whether or not the object currently determined to be located in the target area AR is identical to the object previously existing in the same target area AR. In this case, for example, the object identification unit 40 may store the determination result in the storage unit 22 in association with the image data of the object, or may store the determination result in the storage unit 22 in association with the timing at which the image data is acquired.
An example of the process of aligning the focal position and the process of the same determination with the past object described above will be described with reference to fig. 23. Fig. 23 shows an example in which the object A moves in the order of the position A0 outside the target area AR, the position A1 inside the target area AR, the position A2 outside the target area AR, the position A3 inside the target area AR, and the position A4 outside the target area AR. In this case, the focus position control unit 34 does not align the focus position with the object A at the timing when the object A is at the position A0 and, for example, keeps the focus position at the set position. Then, the focal position control unit 34 aligns the focal position with the object A at the timing when the object A is at the position A1, that is, at the timing when the object A enters the target area AR. At this time, the object identifying section 40 determines whether or not the object A at the position A1 is identical to an object located in the target area AR before the timing at which the object A is at the position A1. In this example, no object was located in the target area AR before the timing at which the object A is at the position A1, so the object identifying unit 40 determines that no past object is identical to the object A and stores the determination result in the storage unit 22.

The focal position control unit 34 continuously aligns the focal position with the object A while the object A is located in the target area AR. Then, at the timing when the object A moves to the position A2, that is, at the timing when the object A leaves the target area AR, the focal position control unit 34 separates the focal position from the object A and returns the focal position to the set position. Then, the focal position control unit 34 aligns the focal position with the object A again at the timing when the object A is at the position A3, that is, at the timing when the object A reenters the target area AR. At this time, the object identifying section 40 determines whether or not the object A at the position A3 is identical to an object located in the target area AR before the timing at which the object A is at the position A3. In this example, the object A was located in the target area AR at the position A1, so the object identifying unit 40 determines that the object A at the position A3 is identical to the object A at the position A1 and stores the determination result in the storage unit 22.

Then, at the timing when the object A moves to the position A4, that is, at the timing when the object A leaves the target area AR, the focal position control unit 34 separates the focal position from the object A and returns the focal position to the set position.
(Flow of setting focal position)
Next, the flow of the processing for setting the focal position described above will be described. Fig. 27 is a flowchart illustrating the flow of the processing for setting the focal position. As shown in fig. 27, the control unit 24 acquires the information of the target area AR by the target area acquisition unit 30 (step S10), and acquires the position information of the object by the object information acquisition unit 32 (step S12). The order of execution of steps S10 and S12 may be arbitrary. The control unit 24 determines, by the focus position control unit 34, whether or not the object is located in the target area AR based on the position information of the object (step S14). If the object is not located in the target area AR (no in step S14), the flow returns to step S12, and the acquisition of the position information of the object is continued. On the other hand, when the object is located in the target area AR (yes in step S14), the focal position control unit 34 aligns the focal position with the object (step S16). Then, the object recognition unit 40 determines whether or not the object located in the target area AR is identical to an object located in the target area AR in the past (step S18). In the present embodiment, for example, the object identifying unit 40 determines whether or not an object was present in the target area AR at or after the timing a predetermined time earlier than the timing at which the current object is located in the target area AR. When such an object existed, the object identification unit 40 determines whether or not that object is identical to the object currently in the target area AR. The object recognition unit 40 causes the storage unit 22 to store the result of the same determination. Thereafter, the acquisition of the position information of the object is continued, and it is determined whether or not the object has moved outside the target area AR (step S20). If the object has not moved out of the target area AR (no in step S20), that is, if the object is continuously present in the target area AR, the flow returns to step S16, and the focus position continues to be aligned with the object. When the object has moved out of the target area AR (yes in step S20), the focal position control unit 34 separates the focal position from the object (step S22). Thereafter, if the processing is not to be ended (no in step S24), the flow returns to step S12, and if the processing is to be ended (yes in step S24), the present processing ends.
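The flow of fig. 27 can be summarized as the following loop. All callables are assumed interfaces introduced for illustration (the description defines the steps, not an API); the membership test could be the in_target_area sketch shown earlier:

```python
import time

def focus_setting_loop(get_target_area, measure_position, align_focus,
                       release_focus, check_same_as_past,
                       should_end, period_s=0.1):
    """Illustrative rendering of steps S10-S24 of fig. 27."""
    area = get_target_area()                   # S10: target area information
    while not should_end():                    # S24: end condition
        time.sleep(period_s)
        pos = measure_position()               # S12: object position information
        if pos is None or not area.contains(pos):
            continue                           # S14 "no": keep acquiring
        align_focus(pos)                       # S16: align focus with the object
        check_same_as_past(pos)                # S18: same-object determination
        while pos is not None and area.contains(pos):   # S20
            align_focus(pos)                   # back to S16 while inside
            time.sleep(period_s)
            pos = measure_position()
        release_focus()                        # S22: move the focus away
```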
(Effect)
As described above, the imaging device 100 according to the present embodiment includes the imaging element 12, the object information acquisition unit 32, the target area acquisition unit 30, the focal position control unit 34, and the object recognition unit 40. The object information acquisition unit 32 acquires position information of an object existing in the imaging region AR0 of the imaging element 12. The target area acquisition unit 30 acquires position information of the target area AR within the imaging area AR 0. In the case where an object exists in the target area AR, the focal position control section 34 controls the focal position of the photographing device 100 so that the focal position is aligned with the object. When an object exists in the target area AR, the object identifying unit 40 determines whether or not the object is identical to an object that has previously existed in the target area AR.
Here, an imaging device of the auto-focus system is required to align the focus position properly. In this respect, the imaging device 100 according to the present embodiment aligns the focus position with an object existing in the target area AR, and can therefore focus on an object existing in an area to be watched, for example in monitoring. Furthermore, in the present embodiment, since it is determined whether or not an object in the target area AR was located in the same target area AR in the past, it is possible to recognize whether objects appearing in the target area AR with a time difference are the same object.
The object recognition unit 40 determines whether or not the objects are identical based on image data obtained by the imaging element 12 capturing the object existing in the target area AR and image data obtained by the imaging element 12 capturing the object existing in the target area AR in the past. By making the same determination based on the image data, in the case where objects appear in the target area AR with a time difference, it is possible to appropriately recognize whether or not they are the same object.
When an object exists in the target area AR, the object identification unit 40 determines whether or not the object is identical to an object that existed in the target area AR at or after a timing a predetermined time earlier. Therefore, when objects appear in the target area AR with a time difference, it can be appropriately recognized whether or not they are the same object.

The target area AR is located between the first position AX1 at the first distance L1 from the imaging device 100 and the second position AX2 at the second distance L2, shorter than the first distance L1, from the imaging device 100. The imaging device 100 according to the present embodiment can appropriately align the focal position with an object existing at such a position.
(Ninth embodiment)
Next, a ninth embodiment will be described. The ninth embodiment differs from the eighth embodiment in that the focus position is aligned with an object that exists in the target area AR and satisfies a predetermined condition. In the ninth embodiment, description of the parts common to the eighth embodiment is omitted.
In the ninth embodiment, the focal position control unit 34 aligns the focal position with an object that exists in the target area AR and satisfies a predetermined condition. The focal position control unit 34 does not align the focal position with an object that fails to satisfy either of the two requirements, namely presence in the target area AR and satisfaction of the predetermined condition. The focal position control unit 34 continuously aligns the focal position with the object while the object satisfies the predetermined condition and continuously exists in the target area AR. On the other hand, the focal position control unit 34 separates the focal position from the object once either requirement is no longer satisfied. That is, for example, the focal position control unit 34 separates the focal position from the object when the object satisfies the predetermined condition but moves outside the target area AR, or when the object exists in the target area AR but no longer satisfies the predetermined condition.
The focal position control unit 34 may determine whether or not a predetermined condition is satisfied by any method, and may determine whether or not the predetermined condition is satisfied based on at least one of position information of an object and an image of the object, for example. The positional information of the object may be a measurement result of the object position measurement unit 14, and the image of the object may be image data of the object captured by the imaging element 12.
The predetermined condition here may be any condition set in addition to the presence of the object in the target area AR. For example, the predetermined condition may be at least one of the following: the object is performing a predetermined movement, the object has a predetermined shape, and the object is oriented in a predetermined direction. Any two or all of these may be used together as the predetermined condition. When a plurality of predetermined conditions are set, the focal position control unit 34 determines that the predetermined condition is satisfied only when all of the conditions are satisfied.
The case where the movement of the object is set as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether or not the object is performing a predetermined movement based on the position information of the object continuously acquired in time series. The focal position control unit 34 aligns the focal position with an object that exists in the target area AR and is performing the predetermined movement, and does not align the focal position with an object that is not in the target area AR or is not performing the predetermined movement. The focal position control unit 34 continuously aligns the focal position with the object while the object exists in the target area AR and continues the predetermined movement. On the other hand, the focal position control unit 34 separates the focal position from the object when the object leaves the target area AR or stops performing the predetermined movement. The movement of the object here refers to a movement pattern of the object, for example, the movement direction and movement speed of the object. For example, when the predetermined movement is defined as moving downward in the vertical direction at a speed of 10 m/h or more, the focal position control unit 34 aligns the focal position with an object in the target area AR that is moving downward in the vertical direction at a speed of 10 m/h or more. The movement of the object is not limited to the combination of the movement direction and the movement speed, and may refer to any manner of movement; for example, it may refer to at least one of the movement direction and the movement speed of the object.

Fig. 28 is a schematic diagram illustrating an example of the case where the movement of the object is set as the predetermined condition. In the example of fig. 28, the predetermined condition relates to the movement direction of the object, namely that the object moves downward in the vertical direction (the direction opposite to the Z direction). Fig. 28 shows an example in which the object A moves downward in the vertical direction from the position A0a through the positions A1a and A2a to the position A3a, and stops at the position A3a. The position A0a is outside the target area AR, and the positions A1a, A2a, and A3a are inside the target area AR. In this case, at the timing when the object A is at the position A0a, the object A is outside the target area AR, so the focus position control unit 34 does not align the focus position with the object A and, for example, keeps the focus position at the set position. Then, the focal position control unit 34 aligns the focal position with the object A at the timing when the object A is at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction. The focus position control unit 34 continues to align the focus position with the object A when the object A is at the position A2a, and moves the focus position away from the object A and returns it to the set position at the timing when the object A moves to the position A3a and stops.

Next, the case where the shape of the object is set as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether or not the object has a predetermined shape based on the image data of the object, and aligns the focal position with an object that exists in the target area AR and has the predetermined shape. The focal position control unit 34 does not align the focal position with an object that is not in the target area AR or does not have the predetermined shape. The focus position control unit 34 continuously aligns the focus position with the object while the object keeps the predetermined shape and continuously exists in the target area AR. On the other hand, the focal position control unit 34 separates the focal position from the object when the object leaves the target area AR or no longer has the predetermined shape. The shape of the object here may be, for example, at least one of the size of the object and the outline shape of the object. For example, when the predetermined shape is defined as being equal to or larger than a predetermined size, the focal position control unit 34 aligns the focal position with an object in the target area AR that is equal to or larger than the predetermined size. The shape information of the object may be obtained by using the 3D shape information acquired by the object information acquisition unit 32.

Next, the case where the orientation of the object is set as the predetermined condition will be described. In this case, the focal position control unit 34 determines whether or not the object is oriented in a predetermined direction based on the image data of the object, and aligns the focal position with an object that exists in the target area AR and is oriented in the predetermined direction. The focal position control unit 34 does not align the focal position with an object that is not in the target area AR or is not oriented in the predetermined direction. The focus position control unit 34 continuously aligns the focus position with the object while the object remains oriented in the predetermined direction and continuously exists in the target area AR. On the other hand, the focal position control unit 34 separates the focal position from the object when the object leaves the target area AR or is no longer oriented in the predetermined direction. The orientation information of the object may be acquired by using the 3D shape information acquired by the object information acquisition unit 32.
The predetermined condition may be set by any method, and may be set in advance, for example. In this case, the focal position control unit 34 may read information (for example, a moving direction and a moving speed) indicating predetermined conditions set in advance from the storage unit 22, or may acquire the predetermined conditions from other devices via the communication unit 20. For example, if the predetermined condition is not set in advance, the focal position control unit 34 may automatically set the predetermined condition. For example, the user may set a predetermined condition. In this case, for example, the user may input information (for example, a moving direction and a moving speed) for specifying a predetermined condition to the input section 16, and the focus position control section 34 may set the predetermined condition based on the information specified by the user.
As described above, in the ninth embodiment, the focal position control unit 34 may align the focal position with an object that exists in the target area AR and is performing a predetermined movement. The focus position control unit 34 keeps the focus position on the object while the object is performing the predetermined movement, and moves the focus position away from the object when the object is no longer performing the predetermined movement. In this way, by making the predetermined movement a condition for aligning the focal position in addition to being located in the target area AR, an object performing a specific movement can be tracked and the focal position can be properly aligned.

In the ninth embodiment, the focal position control unit 34 may align the focal position with an object that exists in the target area AR and has a predetermined shape. In this way, by making the predetermined shape a condition for aligning the focal position in addition to being located in the target area AR, an object of a specific shape can be tracked and the focal position can be properly aligned.

In the ninth embodiment, the focal position control unit 34 may align the focal position with an object that exists in the target area AR and faces a predetermined direction. In this way, by making the orientation in the predetermined direction a condition for aligning the focal position in addition to being located in the target area AR, an object with a specific orientation can be tracked and the focal position can be properly aligned.
The embodiments have been described above, but the present invention is not limited to the content of these embodiments. The above-described constituent elements include elements that can be easily conceived by those skilled in the art, substantially identical elements, and elements in a so-called equivalent range. The above-described constituent elements may be appropriately combined, and the configurations of the embodiments may also be combined. Various omissions, substitutions, and changes of the constituent elements may be made without departing from the spirit of the embodiments described above. In the embodiments, the operation of aligning the focal position was described as the characteristic feature, but the operation of aligning the focal position may be combined with other operations. For example, the operation of aligning the focal position may be combined with an enlargement/reduction operation by zooming. In the description of each embodiment, the operation of aligning the focal position may also be replaced with another operation. For example, in the description of each embodiment, the operation of aligning the focal position may be replaced with an enlargement/reduction operation by zooming. The control unit 24 of the imaging device according to each embodiment may notify a predetermined destination through the communication unit 20 when a set condition is satisfied, for example, when an object enters or exits the predetermined target area AR or when an object moves in a predetermined direction. The set condition here may be, for example, that the focus position is aligned with the object triggered by the movement of the object within the target area AR.
Industrial applicability
The imaging device, the imaging method, and the program according to the present embodiment can be used for imaging an image, for example.
Description of the reference numerals
10 Optical element
12 Imaging element
14 Object position measuring unit
30 Target area acquisition unit
32 Object information acquisition unit
34 Focus position control unit
AR target area
AR0 imaging area
AX1 first position
AX2 second position
L1 first distance
L2 second distance

Claims (20)

1. A photographing apparatus capable of photographing an object, comprising:
A photographing element;
An object information acquisition unit that acquires position information of an object existing in a shooting region of the shooting element; and
A focus position control unit for controlling a focus position of the imaging device,
The focus position control unit aligns the focus position with an object in a target area existing between a first position and a second position, the first position being a position at a first distance from the imaging device, the second position being a position at a second distance from the imaging device, the second distance being smaller than the first distance,
The focus position control unit continuously aligns the focus position with the object while the object is present in the target area, and moves the focus position away from the object after the object moves outside the target area.
2. The photographing device as claimed in claim 1, wherein,
When the object moves from outside the target area to inside the target area, the object is recognized as an object with which the focus position control unit aligns the focus position.
3. The photographing device according to claim 1 or 2, wherein,
The focal position control unit controls the focal position by moving a position of an optical element provided in the imaging device.
4. The photographing device according to any of claims 1 to 3, wherein,
The focus position control unit aligns the focus position with an object that is present in the target area and is performing a predetermined movement,
The focus position is continuously aligned with the object while the object is performing the predetermined movement, and the focus position is moved away from the object after the object is no longer performing the predetermined movement.
5. The photographing device according to any one of claims 1 to 4, wherein,
The focus position control unit aligns the focus position with an object having a predetermined shape existing in the target area.
6. The photographing device according to any one of claims 1 to 4, wherein,
The focus position control unit aligns the focus position with an object existing in the target area and facing a predetermined direction.
7. The photographing device as claimed in any one of claims 1 to 6, wherein,
When a plurality of objects exist in the target area, the focus position control unit changes the focus position so that the focus position is aligned with each of the objects in turn.
8. The photographing device as claimed in claim 7, wherein,
The focus position control unit sets the order in which the focus position is changed according to the positions of the respective objects.
9. The photographing device as claimed in claim 8, wherein,
The focus position control unit sets the order in which the focus position is changed so as to minimize the time required for changing the focus position.
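One plausible reading of claims 8 and 9, sketched below under the assumption (not stated in the patent) that the time to change focus grows with how far the focal position must travel along the optical axis: for positions on a single axis, visiting the object distances in sorted order, starting from the extreme nearer to the current focal position, minimizes total travel, since the total is then the distance to the nearer extreme plus the span between the extremes.

```python
# Sketch for claims 8 and 9: choose the order of focus changes so as to
# minimize total travel of the focal position. The assumption that change
# time is proportional to travel distance is illustrative, not from the patent.
def focus_order(current_focus: float, object_distances: list[float]) -> list[float]:
    if not object_distances:
        return []
    ordered = sorted(object_distances)
    # Sweeping from the extreme nearer to the current focal position through
    # to the other extreme minimizes total travel:
    # total = |current - nearer extreme| + (max - min).
    if abs(current_focus - ordered[0]) <= abs(current_focus - ordered[-1]):
        return ordered              # sweep near extreme -> far extreme
    return list(reversed(ordered))  # sweep far extreme -> near extreme
```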
10. A photographing apparatus capable of photographing an object, comprising:
A photographing element;
An object information acquisition unit that acquires position information of an object existing in a shooting region of the shooting element; and
A focus position control unit for controlling a focus position of the imaging device,
The focus position control unit aligns the focus position with an object that is located outside a target area between a first position and a second position, the first position being a position having a first distance from the imaging device, the second position being a position having a second distance from the imaging device, the second distance being smaller than the first distance,
The focus position control unit continuously aligns the focus position with the object while the object is present outside the target area and within the imaging area, and moves the focus position away from the object after the object is moved into the target area.
11. The photographing device of claim 10, wherein,
When the object moves from inside the target area to outside the target area, the object is recognized as an object with which the focus position control unit aligns the focus position.
12. The photographing device of claim 1, further comprising:
a target area acquisition unit that acquires position information of a plurality of target areas,
wherein, for each of the target areas, the focus position control unit aligns the focus position with an object when the object exists in that target area.
13. The photographing device of claim 12, wherein,
At least a first target area and a second target area are set as the plurality of target areas, and when an object is present in the second target area, the imaging device executes a predetermined process while aligning the focus position with the object, whereas when an object is present in the first target area, the imaging device aligns the focus position with the object without executing the predetermined process.
14. The photographing device of claim 13, wherein,
The predetermined process is at least one of a process of photographing the object, a process of irradiating light toward the object, and a process of outputting information indicating that the object is present in the second target area.
15. The photographing device of claim 13 or 14, wherein,
The second target area is surrounded by the first target area.
16. The photographing device of any one of claims 12 to 15, wherein,
When objects exist in a plurality of the target areas at the same timing, the focus position control unit sets the focus position based on priority information indicating which of the target areas has priority.
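The priority selection of claim 16 might look like the following sketch; the data shapes (area ids mapped to an object distance, and a priority-ordered list of area ids) are hypothetical:

```python
# Sketch for claim 16: when several target areas contain objects at the same
# timing, focus according to priority information. All structures hypothetical.
from typing import Optional

def select_focus_distance(areas_with_objects: dict[str, float],
                          priority_order: list[str]) -> Optional[float]:
    # priority_order lists target-area ids from highest to lowest priority;
    # areas_with_objects maps an area id to the distance of an object in it.
    for area_id in priority_order:
        if area_id in areas_with_objects:
            return areas_with_objects[area_id]
    return None
```

For example, with areas_with_objects = {"AR1": 4.0, "AR2": 2.5} and priority_order = ["AR2", "AR1"], the function returns 2.5, i.e. the focus position is set for the object in the higher-priority area.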
17. The photographing device of claim 1, further comprising:
an object recognition unit that, when an object exists in the target area, determines whether or not the object is identical to an object that existed in the target area in the past.
18. The photographing device of claim 17, wherein,
The object recognition unit determines whether or not the objects are identical based on image data obtained by the imaging device imaging the object currently existing in the target area and image data obtained by the imaging device imaging the object that existed in the target area in the past.
19. The photographing device of claim 17 or 18, wherein,
When an object exists in the target area, the object recognition unit determines whether or not the object is identical to an object that existed in the target area at or after a timing a predetermined time earlier.
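A minimal sketch of the identity check in claims 17 to 19, assuming (as one of many possible measures) normalized cross-correlation between equally sized image crops; the function name and the 0.8 threshold are illustrative choices, not from the patent:

```python
# Sketch for claims 17 to 19: judge whether an object now in the target area
# is the same one seen there earlier by comparing image crops.
import numpy as np

def same_object(current_crop: np.ndarray, past_crop: np.ndarray,
                threshold: float = 0.8) -> bool:
    a = current_crop.astype(np.float64).ravel()
    b = past_crop.astype(np.float64).ravel()
    if a.size != b.size:
        return False  # a real system would first resize/align the crops
    a -= a.mean()
    b -= b.mean()
    denom = float(np.linalg.norm(a) * np.linalg.norm(b))
    if denom == 0.0:
        return False  # flat images carry no usable correlation signal
    correlation = float(np.dot(a, b)) / denom  # normalized cross-correlation
    return correlation >= threshold
```

Claim 19 narrows the comparison to objects seen within a predetermined time window, so a real implementation would also store a timestamp with each past crop and skip entries older than the window.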
20. A photographing method for photographing an object, the method comprising the steps of:
Acquiring position information of an object existing in a shooting area; and
Controlling a focus position of the photographing device,
In the step of controlling the focus position,
Aligning the focus position with an object in a target area existing between a first position and a second position, the first position being a position at a first distance from the photographing device, the second position being a position at a second distance from the photographing device, the second distance being smaller than the first distance,
The focus position is continuously aligned with the object while the object is present in the target area, and the focus position is moved away from the object after the object has moved out of the target area.
CN202280059544.8A 2021-09-27 2022-07-29 Imaging device and imaging method Pending CN117917089A (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
JP2021157147A JP2023047943A (en) 2021-09-27 2021-09-27 Imaging device, imaging method, and program
JP2021-157146 2021-09-27
JP2021-156800 2021-09-27
JP2021-157147 2021-09-27
JP2021-157250 2021-09-27
JP2021-157148 2021-09-27
JP2021-156863 2021-09-27
JP2021-156799 2021-09-27
JP2021-157244 2021-09-27
PCT/JP2022/029298 WO2023047802A1 (en) 2021-09-27 2022-07-29 Imaging device and imaging method

Publications (1)

Publication Number Publication Date
CN117917089A (en) 2024-04-19

Family

ID=85779380

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280059544.8A Pending CN117917089A (en) 2021-09-27 2022-07-29 Imaging device and imaging method

Country Status (2)

Country Link
JP (1) JP2023047943A (en)
CN (1) CN117917089A (en)

Also Published As

Publication number Publication date
JP2023047943A (en) 2023-04-06

Similar Documents

Publication Publication Date Title
EP2571257B1 (en) Projector device and operation detecting method
US8224069B2 (en) Image processing apparatus, image matching method, and computer-readable recording medium
US11156823B2 (en) Digital microscope apparatus, method of searching for in-focus position thereof, and program
US10163222B2 (en) Image capturing apparatus, tracking device, control method, and storage medium
WO2014108976A1 (en) Object detecting device
JP2023175752A (en) Processing apparatus, electronic equipment, processing method, and program
JP6505234B2 (en) Ranging device, ranging method, and ranging program
US8023744B2 (en) Pattern matching system and targeted object pursuit system using light quantities in designated areas of images to be compared
JP2004361740A (en) Automatic focusing device and method
JP2020034484A (en) Image inspection device
JP2020034483A (en) Image measuring device
JP6534455B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
JP2009109682A (en) Automatic focus adjusting device and automatic focus adjusting method
CN117917089A (en) Imaging device and imaging method
WO2023047802A1 (en) Imaging device and imaging method
EP3163369B1 (en) Auto-focus control in a camera to prevent oscillation
JP6534456B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
US20130076868A1 (en) Stereoscopic imaging apparatus, face detection apparatus and methods of controlling operation of same
JP2008287648A (en) Mobile body detection method and device, and monitoring device
JP2023048013A (en) Imaging device, imaging method, and program
JP2023047944A (en) Imaging device, imaging method, and program
JP2023047942A (en) Imaging device, imaging method, and program
JP2023047715A (en) Imaging system, imaging method, and program
CN117917090A (en) Imaging device, imaging system, imaging method, and program
JP2023048019A (en) Imaging device, imaging method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination