WO2021168804A1 - Image processing method, image processing apparatus, and image processing program - Google Patents

Image processing method, image processing apparatus, and image processing program

Info

Publication number
WO2021168804A1
WO2021168804A1 · PCT/CN2020/077217 · CN2020077217W
Authority
WO
WIPO (PCT)
Prior art keywords
area
image
focal length
processed
viewing angle
Prior art date
Application number
PCT/CN2020/077217
Other languages
English (en)
Chinese (zh)
Inventor
邹文
彭亮
丁硕
夏斌强
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2020/077217 (WO2021168804A1)
Priority to CN202080004280.7A (CN112514366A)
Publication of WO2021168804A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N23/80: Camera processing pipelines; Components thereof

Definitions

  • the present disclosure relates to the field of image processing technology, and in particular to an image processing method, an image processing device, and an image processing system.
  • the embodiments of the present disclosure provide an image processing method, an image processing device, and an image processing system, which can determine, through dedicated application software, the shooting range of a combined lens with multiple focal lengths at each of those focal lengths, so that the user can understand the shooting range at a specified focal length, improving the user experience.
  • an embodiment of the present disclosure provides an image processing method. The method includes: obtaining a first maximum viewing angle area corresponding to a first focal length at a current attitude angle, and obtaining a second maximum viewing angle area corresponding to a second focal length; comparing the first maximum viewing angle area with the second maximum viewing angle area; and determining, according to the comparison result, the selectable range of an area to be processed in a first image, so as to obtain at least one second image of the area to be processed at the second focal length, the first image being acquired at the first focal length, wherein the second focal length is greater than the first focal length.
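The three steps above (obtain the two maximum viewing angle areas, compare them, derive the selectable range) can be sketched as follows. This is an illustrative model, not the patent's implementation: it reduces each viewing angle area to a one-dimensional pitch interval in degrees, while the full method also covers yaw.

```python
from dataclasses import dataclass

@dataclass
class AngleArea:
    low: float   # lower angular limit, degrees
    high: float  # upper angular limit, degrees

def selectable_range(first: AngleArea, second: AngleArea) -> AngleArea:
    """Intersect the first (wide-angle) area with the second (telephoto)
    area: only the overlap can be re-shot at the greater second focal length."""
    low = max(first.low, second.low)
    high = min(first.high, second.high)
    if low >= high:                      # no overlap: nothing is selectable
        return AngleArea(0.0, 0.0)
    return AngleArea(low, high)

wide = AngleArea(-60.0, 40.0)   # first focal length at the current attitude
tele = AngleArea(-45.0, 25.0)   # second (greater) focal length
overlap = selectable_range(wide, tele)   # AngleArea(low=-45.0, high=25.0)
```

Because the second focal length is greater, its maximum viewing angle area may not contain the first one; the intersection is exactly the part the user can select.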
  • the angle of view area corresponding to the first focal length differs from that corresponding to the second focal length. For example, when the posture of the camera is adjusted by a posture adjustment device, the camera's maximum posture adjustment range is constrained by the mechanical structure of that device and the like, so the second maximum viewing angle area corresponding to the second focal length may not completely include the first maximum viewing angle area corresponding to the first focal length at the current attitude angle. Moreover, the user may not know which parts of the first maximum viewing angle area overlap the second maximum viewing angle area, that is, which areas in the first image can be photographed at the second focal length to obtain a clearer partial image.
  • the image processing method can compare the first maximum angle of view area corresponding to the first focal length with the second maximum angle of view area corresponding to the second focal length at the current attitude angle, to determine the selectable range of the area to be processed in the first image corresponding to the first maximum viewing angle area, and thereby obtain at least one second image of the area to be processed at the second focal length.
  • determining the selectable range of the area to be processed in the first image not only lets the user identify which areas in the first image can be photographed at the second focal length to obtain a clearer partial image, but also limits user operations and the adjustment range of the posture adjustment device to that selectable range, reducing the possibility of damage to the posture adjustment device.
  • embodiments of the present disclosure provide an image processing device, which includes: one or more processors; and a computer-readable storage medium.
  • the computer-readable storage medium is used to store one or more computer programs.
  • when the computer program is executed by the processor, it implements: obtaining the first maximum viewing angle area corresponding to the first focal length at the current attitude angle, and obtaining the second maximum viewing angle area corresponding to the second focal length; comparing the first maximum viewing angle area with the second maximum viewing angle area; and determining, according to the comparison result, the selectable range of the area to be processed in the first image, so as to obtain at least one second image of the area to be processed at the second focal length, the first image being acquired at the first focal length, wherein the second focal length is greater than the first focal length.
  • the image processing device provided by the embodiment of the present disclosure can automatically determine the selectable range of the area to be processed in the first image, making it easy for the user to identify the areas in the first image that can be photographed at the second focal length to obtain a clearer partial image; this helps improve the user experience and reduces the possibility of damage to the posture adjustment device.
  • an embodiment of the present disclosure provides an image processing system.
  • the image processing system includes a photographing device and a control terminal communicatively connected to the photographing device.
  • the control terminal is used to obtain the area to be processed, and the photographing device is used to: obtain the first maximum angle of view area corresponding to the first focal length at the current attitude angle, and obtain the second maximum angle of view area corresponding to the second focal length; compare the first maximum angle of view area with the second maximum angle of view area; and determine, according to the comparison result, the selectable range of the area to be processed in the first image, so as to control the photographing device to acquire at least one second image of the area to be processed at the second focal length. The first image is acquired at the first focal length, and the second focal length is greater than the first focal length.
  • the photographing device can automatically determine the selectable range of the area to be processed in the first image, making it easy for the user to identify the photographable area in the first image that can be photographed at the second focal length, so that the user can control the photographing device to obtain a clearer partial image of that area at the second focal length; this helps improve the user experience and reduces the possibility of damage to the posture adjustment device.
  • embodiments of the present disclosure provide a computer-readable storage medium that stores executable instructions which, when executed by one or more processors, cause the one or more processors to execute the above method.
  • FIG. 1 is an application scenario of an image processing method, image processing device, and image processing system provided by embodiments of the disclosure
  • FIG. 2 is an application scenario of an image processing method, image processing device, and image processing system provided by another embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of an image processing method provided by an embodiment of the disclosure.
  • FIG. 4 is a schematic diagram of a vertical range of a second maximum viewing angle area corresponding to a second focal length provided by an embodiment of the present disclosure
  • 5 to 10 are schematic diagrams of the display interface of the control terminal provided by the embodiments of the disclosure.
  • FIG. 11 is a schematic diagram of the division of to-be-processed areas provided by an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram of an area to be photographed according to an embodiment of the disclosure.
  • FIG. 13 is a schematic diagram, provided by an embodiment of the disclosure, of the case where the number of shots is an odd number
  • FIG. 18 is a schematic structural diagram of an image processing device provided by an embodiment of the disclosure.
  • FIG. 19 is a schematic structural diagram of an image processing system provided by an embodiment of the disclosure.
  • FIG. 20 is a schematic structural diagram of a control terminal provided by an embodiment of the disclosure.
  • FIG. 21 is a schematic diagram of the structure of a photographing device provided by an embodiment of the disclosure.
  • Fig. 1 is an application scenario of an image processing method, an image processing device, and an image processing system provided by embodiments of the disclosure.
  • the UAV 10 can rotate around up to three orthogonal axes, such as a first pitch axis (X1), a first yaw axis (Y1), and a first roll axis (Z1).
  • rotation around the three axes can be referred to as pitch rotation, yaw rotation, and roll rotation.
  • the angular velocity of the UAV 10 around the first pitch axis, the first yaw axis, and the first roll axis can be expressed as ωX1, ωY1, and ωZ1, respectively.
  • the unmanned aerial vehicle 10 can perform movement in translation along a first pitch axis, a first yaw axis, and a first roll axis.
  • the linear velocity of the unmanned aerial vehicle 10 along the first pitch axis, the first yaw axis, and the first roll axis can be denoted as VX1, VY1, and VZ1, respectively.
  • the unmanned aerial vehicle 10 is only an exemplary description of the application scenario, and cannot be understood as a limitation on the application scenario of the present disclosure.
  • the present disclosure can also be applied to a variety of movable platforms.
  • the movable platform may also be a pan-tilt cart, a handheld gimbal, a robot, etc.; these movable platforms may be able to rotate around one axis or two axes, or may not rotate around any axis.
  • the imaging device 20 is mounted on the unmanned aerial vehicle 10 via a carrier.
  • the carrier can make the imaging device 20 move around and/or along up to three orthogonal axes (or, in other embodiments, non-orthogonal axes) relative to the UAV 10, such as the second pitch axis (X2), the second yaw axis (Y2), and the second roll axis (Z2).
  • the second pitch axis, the second yaw axis, and the second roll axis may be parallel to the first pitch axis, the first yaw axis, and the first roll axis, respectively.
  • the load is an imaging device including an optical module, and the second roll axis may be substantially parallel to the optical path and the optical axis of the optical module.
  • the optical module can be connected with an image sensor to capture images.
  • the carrier can be driven, according to a control signal, by an actuator connected to the carrier, such as a motor, so that the carrier can rotate around up to three orthogonal axes, such as the second pitch axis, the second yaw axis, and the second roll axis. Rotation around the three axes can be referred to as pitch rotation, yaw rotation, and roll rotation.
  • the angular velocity of the camera 20 around the second pitch axis, the second yaw axis, and the second roll axis can be expressed as ωX2, ωY2, and ωZ2, respectively.
  • the carrier can make the imaging device 20 move relative to the UAV 10 along the second pitch axis, the second yaw axis, and the second roll axis.
  • the linear velocity of the camera 20 along the second pitch axis, the second yaw axis, and the second roll axis can be denoted as VX2, VY2, VZ2, respectively.
  • the carrier may only allow the camera 20 to rotate relative to the UAV 10 about a subset of the three axes (the second pitch axis, the second yaw axis, and the second roll axis).
  • the carrier may only allow the camera 20 to rotate around the second pitch axis, the second yaw axis, and the second roll axis or a combination of any of them, but not allow the camera 20 to move in translation along any axis.
  • the carrier may allow the camera 20 to rotate around only one of the second pitch axis, the second yaw axis, and the second roll axis.
  • the carrier may allow the camera 20 to rotate around only two of the second pitch axis, the second yaw axis, and the second roll axis.
  • the carrier may allow the camera device 20 to rotate around the three axes of the second pitch axis, the second yaw axis, and the second roll axis.
  • the carrier may only allow the camera 20 to translate along the second pitch axis, the second yaw axis, and the second roll axis, or any combination of them, but not allow the camera 20 to rotate around any axis.
  • the carrier may allow the camera 20 to move along only one of the second pitch axis, the second yaw axis, and the second roll axis.
  • the carrier may allow the camera 20 to move along only two of the second pitch axis, the second yaw axis, and the second roll axis.
  • the carrier may allow the camera 20 to move along the three axes of the second pitch axis, the second yaw axis, and the second roll axis.
  • the carrier may allow the camera 20 to perform rotational and translational motions relative to the UAV 10.
  • the carrier may allow the camera 20 to move along and/or around one, two, or three of the second pitch axis, the second yaw axis, and the second roll axis.
  • the imaging device 20 may be directly mounted on the UAV 10 without a carrier, or the carrier does not allow the imaging device 20 to move relative to the UAV 10. In this case, the posture, position, and/or direction of the photographing device 20 are fixed relative to the unmanned aerial vehicle 10.
  • the adjustment of the posture, direction and/or position of the photographing device 20 may be achieved individually or collectively by appropriate adjustments to the UAV 10, the carrier and/or the photographing device 20.
  • the camera 20 can be rotated 80 degrees around a designated axis (such as the yaw axis) in any of the following ways: the UAV 10 alone rotates 80 degrees; the load rotates 80 degrees relative to the UAV 10 through actuation of the carrier; or the UAV 10 rotates 50 degrees while the load rotates 30 degrees relative to the UAV 10.
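The combined rotation above can be expressed as a simple allocation between the UAV body and the carrier. The splitting policy below (the UAV takes as much as its limit allows, the carrier takes the remainder) is only one illustrative choice; the disclosure merely requires that the two contributions sum to the designated rotation (e.g. 80 = 50 + 30).

```python
def split_rotation(total_deg: float, uav_max_deg: float) -> tuple[float, float]:
    """Split a desired rotation around one axis between the UAV body and the
    carrier (gimbal): the UAV contributes up to its limit, the carrier the rest."""
    uav_part = max(-uav_max_deg, min(uav_max_deg, total_deg))
    carrier_part = total_deg - uav_part
    return uav_part, carrier_part

# e.g. an 80-degree yaw with the UAV limited to 50 degrees:
# split_rotation(80.0, 50.0) -> (50.0, 30.0)
```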
  • similarly, translational movement of the load can be achieved by proper adjustment of the UAV 10 and the carrier.
  • the adjustment of the operating parameters of the load can also be completed by one or more of the UAV 10, the carrier, or the control terminal of the UAV 10.
  • the operating parameters of the load may include, for example, the zoom degree or the focal length of the imaging device.
  • the unmanned aerial vehicle 10 may include an unmanned aerial vehicle
  • the photographing device 20 may include an imaging device.
  • the control terminal 30 may also be included in the application scenario.
  • the movable platform can be an aircraft (such as an unmanned aerial vehicle), a pan-tilt cart, a handheld pan-tilt, a robot, and so on.
  • the control terminal 30 may be a device such as a mobile phone or a tablet computer, and may also have a remote control function to realize remote control of a movable platform such as the unmanned aerial vehicle 10.
  • the unmanned aerial vehicle 10 may include an attitude adjustment device (such as the carrier described above) such as a pan/tilt, and the imaging device 20 may be mounted on the pan/tilt.
  • the photographing device 20 may include a first lens and a second lens for performing different photographing tasks.
  • the photographing device 20 may also include two or more lenses, and the first lens and the second lens may be located in different camera devices.
  • the first lens and the second lens may be lenses corresponding to the conventional shooting function of the camera.
  • the first lens and the second lens may be a wide-angle lens and a zoom lens, respectively.
  • a wide-angle lens can be used to obtain a wider and complete picture
  • a zoom lens can be used to obtain high-definition details
  • a pan-tilt can be used to adjust the shooting angle of the camera 20
  • the UAV 10 can be used to ensure smooth movement and no drift.
  • the control terminal 30 can be used to control the movement of the unmanned aerial vehicle 10.
  • the control terminal 30 can also obtain the shooting screen returned by the shooting device 20 for the user to view.
  • control terminal 30 can also obtain a user's control instruction on the photographing device 20, and send the control instruction to the photographing device.
  • the control instruction may be, for example, an instruction to control the zoom factor of the zoom lens of the photographing device 20, or an instruction to control the photographing range of the zoom lens of the photographing device 20, or the like.
  • the unmanned aerial vehicle 10 may include various types of UAV (Unmanned Aerial Vehicle), such as a quadrotor UAV, a hexarotor UAV, and the like.
  • the gimbal included can be a three-axis gimbal, that is, the attitude of the gimbal can be controlled on the three axes of pitch, roll, and yaw, so that the photographing device 20 mounted on the gimbal can complete the corresponding photographing tasks.
  • the unmanned aerial vehicle 10 may establish a communication connection with the above-mentioned control terminal 30 through a wireless connection method (for example, a wireless connection method based on WiFi or radio frequency communication, etc.).
  • the control terminal 30 may be a controller with a joystick and a display screen, and the UAV 10 is controlled through the stick displacement.
  • the control terminal 30 can also be a smart device such as a smartphone or a tablet computer. It can control the unmanned aerial vehicle 10 to fly automatically by configuring the flight trajectory on the user interface (UI), by means of body sensing, or by recording the flight trajectory during a previous flight of the UAV 10 and then controlling the UAV 10 to fly automatically along the recorded trajectory.
  • the photographing device 20 may also establish a communication connection with the above-mentioned control terminal 30 through a wireless connection method (for example, a wireless connection method based on WiFi or radio frequency communication, etc.).
  • the photographing device 20 establishes a wireless communication connection with the aforementioned control terminal 30 through a wireless communication module of a movable platform.
  • the photographing device 20 may establish a communication connection with the aforementioned control terminal 30 by itself, wherein, in some embodiments, the photographing device 20 is an independent photographing device, such as a sports camera.
  • the control terminal 30 can install application software for controlling the shooting device 20; through the application software, the user can view the shooting screen returned by the shooting device on the display interface of the control terminal 30, and the software provides the user with an interface for interaction between the control terminal 30 and the shooting device 20.
  • the imaging device 20 may include a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, an N-type metal-oxide-semiconductor (NMOS, Live MOS) sensor, or other types of sensors.
  • the sensor may be a part of the imaging equipment (such as a camera) carried by the unmanned aerial vehicle.
  • the imaging equipment may be carried on the unmanned aerial vehicle through a pan/tilt, which can adjust the attitude angle of the photographing device 20.
  • the image data captured by the camera 20 can be stored in a data storage device.
  • the data storage device may be based on semiconductor, magnetic, optical or any suitable technology, and the data storage device may include flash memory, USB drive, memory card, solid state drive, hard disk drive, floppy disk, optical disk, magnetic disk, etc.
  • data storage devices include removable storage devices that can be detachably connected to the imaging device, such as memory cards of any suitable format: personal computer cards, micro flash memory cards, SM cards, Memory Stick, Memory Stick Duo, Memory Stick PRO Duo, mini memory cards, multimedia memory cards, micro multimedia memory cards, MMCmicro cards, PS2 cards, SD cards, SxS memory cards, UFS cards, miniSD cards, microSD cards, xD cards, iStick cards, serial flash memory modules, NT cards, XQD cards, etc.
  • the data storage device may also include an external hard disk drive, optical disk drive, magnetic disk drive, floppy disk drive, or any other suitable storage device connected to the imaging device.
  • the image captured by the camera 20 can be transmitted to the control terminal 30 through the wireless communication module.
  • the image data may be compressed or subjected to other processing before being transmitted by the wireless communication module. In other embodiments, the image data may not undergo compression or other processing before transmission.
  • the transmitted image data can be displayed on the control terminal 30 so that the user who operates the user terminal can see the image data and/or interact with the control terminal 30 based on the image data.
  • the photographing device 20 may preprocess the captured image data.
  • in the process of preprocessing the image data, hardware, software, or a combination of the two can be used.
  • the hardware may include a Field-Programmable Gate Array (FPGA) and so on. Specifically, after the image data is captured, the unprocessed image data can be pre-processed for noise removal, contrast enhancement, scale-space representation, and the like.
  • Fig. 2 is an application scenario of an image processing method, image processing device, and image processing system provided by another embodiment of the present disclosure.
  • this scene may include a photographing device 20, a control terminal 30, and a posture adjustment device 40.
  • the posture adjusting device 40 may be a device for adjusting the posture of the photographing device 20, such as a pan/tilt, etc., to adjust the posture of the photographing device 20.
  • the control terminal 30 may be an electronic device such as a mobile phone, a tablet computer, a notebook, a desktop computer, a remote control, etc., to realize remote control of the posture adjustment device 40 and the photographing device 20.
  • the imaging device 20 may be mounted on a posture adjustment device 40 which is a pan/tilt.
  • the photographing device 20 may include a zoom lens to perform photographing tasks.
  • the zoom lens may be a lens corresponding to the conventional shooting function of the camera.
  • a zoom lens can be used to obtain a more complete picture of the environment at a low focal length, and the high focal length of the zoom lens can be used to obtain high-definition details.
  • the posture adjustment device 40 can be used to ensure that the movement of the photographing device 20 is smooth and without drift.
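The trade-off between a low focal length (wider, more complete picture) and a high focal length (narrower, high-definition detail) follows from the usual thin-lens angle-of-view relation. The sketch below assumes a hypothetical 24 mm sensor height, which is not specified in the disclosure:

```python
import math

def vertical_fov_deg(focal_length_mm: float, sensor_height_mm: float = 24.0) -> float:
    """Vertical angle of view of a thin-lens camera: theta = 2 * atan(h / (2 * f))."""
    return math.degrees(2.0 * math.atan(sensor_height_mm / (2.0 * focal_length_mm)))

# A greater focal length sees a narrower vertical slice of the scene:
# vertical_fov_deg(12.0) is about 90 degrees, vertical_fov_deg(100.0) under 14.
```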
  • the control terminal 30 can be used to control the movement of the posture adjustment device 40.
  • the control terminal 30 can also obtain the shooting screen returned by the shooting device 20 for the user to view.
  • the control terminal 30 can also obtain a user's control instruction on the photographing device 20, and send the control instruction to the photographing device.
  • the control instruction may be, for example, an instruction to control the zoom factor of the zoom lens of the photographing device 20, or may be an instruction to control the photographing range of the zoom lens of the photographing device 20, or may be an instruction to view a partially clear photo, or the like.
  • the user may set a monitoring task on the control terminal 30, and the monitoring task may include acquiring images of a specified area in a specified time period using multiple focal lengths: for example, at 9 o'clock in the morning, acquiring an image of the scene shown in FIG. 2 (such as an image acquired at a low focal length) and a partial image of the area where a key monitored object is located (such as an image acquired at a high focal length). In this way, it is convenient for the user to perform state monitoring and the like based on the images acquired by the photographing device 20.
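A monitoring task of this kind could be represented, for illustration only, as a small record holding the scheduled time, the two focal lengths, and the key region inside the wide image; all field names here are hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MonitoringTask:
    """Hypothetical task record: capture the wide scene and a telephoto
    close-up of a key region at a scheduled time (illustrative fields)."""
    hour: int                 # e.g. 9 for 9 a.m.
    wide_focal_mm: float      # low focal length for the full scene
    tele_focal_mm: float      # high focal length for the key region
    region: tuple             # normalized (x, y, w, h) inside the wide image

task = MonitoringTask(hour=9, wide_focal_mm=24.0, tele_focal_mm=200.0,
                      region=(0.4, 0.3, 0.2, 0.2))
```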
  • the disclosed application is not limited to security scenes.
  • security scenes involve shooting within a fixed range; the disclosure can also be applied to shooting within a moving range, for example, shooting different scenes through a movable platform.
  • users can use multiple focal lengths to obtain images of a specific area at the current point in time, at any time. For example, when the user finds that there may be an abnormality in the scene shown in FIG. 2, he can promptly control the photographing device 20 to photograph the area where the abnormality may exist, for state tracking and the like.
  • the photographing device 20 may also establish a communication connection with the above-mentioned control terminal 30 through a wired connection or a wireless connection (for example, a wireless connection based on WiFi or radio frequency communication, etc.).
  • the control terminal 30 can install application software for controlling the shooting device 20; through the application software, the user can view the shooting screen returned by the shooting device on the display interface of the control terminal 30, and the software provides the user with an interface for interaction between the control terminal 30 and the shooting device 20.
  • the photographing device 20 includes an image sensor to capture image information.
  • FIG. 3 is a schematic flowchart of an image processing method provided by an embodiment of the disclosure.
  • the image processing method may include operations S301 to S305.
  • the above operations can be performed on a variety of electronic devices with computing capabilities, for example, can be performed on at least one of an unmanned aerial vehicle, a camera, a pan/tilt, a pan/tilt camera, and a control terminal.
  • the pan-tilt camera refers to equipment that includes both the gimbal and the camera, with the gimbal and the camera connected in a non-quick-release (non-detachable) manner.
  • operations S301 to S305 are performed on the pan-tilt camera.
  • operations S301 to S305 are performed on the control terminal.
  • at least part of operations S301 to S305 are performed on the pan-tilt camera, and the rest are performed on the control terminal.
  • different electronic devices can exchange with each other the data required for the operations, user operations, intermediate operation results, final operation results, and the like.
  • the required data and user operations can be collected by a specific sensor, such as image data collected by an image sensor, and user operations collected by a touch screen.
  • a first maximum viewing angle area corresponding to the first focal length under the current attitude angle is obtained, and a second maximum viewing angle area corresponding to the second focal length is obtained.
  • the second maximum viewing angle area corresponding to the second focal length is determined based on the maximum rotatable angle of the posture adjustment device (such as a pan-tilt).
  • the first maximum viewing angle area corresponding to the first focal length at the current attitude angle is related to the pose of the camera. Take a camera installed on a drone as an illustrative example: the camera is installed on the drone through a gimbal, and the attitude of the camera is adjusted through the gimbal.
  • the deflection range of the pitch angle of the gimbal in the normal downward installation includes the first range, such as (-90, 30)
  • the deflection range of the yaw angle includes the second range, such as (-300, 300).
  • the maximum adjustable range can be determined by the maximum rotation angle of the pan/tilt.
  • FIG. 4 is a schematic diagram of a vertical range of a second maximum viewing angle area corresponding to a second focal length provided by an embodiment of the disclosure. As shown in FIG. 4, if the deflection range of the pitch angle of the gimbal at the second focal length includes the first range (MIN, MAX), it can be determined that the vertical range of the second maximum viewing angle area corresponding to the second focal length is [MIN - θ2/2, MAX + θ2/2], where θ2 is the vertical size of the viewing angle corresponding to the second focal length.
  • if the deflection range of the pitch angle of the gimbal at the first focal length includes the second range (MIN, MAX) (because the photographing device corresponding to the first focal length and the photographing device corresponding to the second focal length can be controlled through the same gimbal, the first range and the second range may be the same), it can be determined that the vertical range of the first maximum viewing angle area corresponding to the first focal length is [MIN - θ1/2, MAX + θ1/2], where θ1 is the vertical size of the viewing angle corresponding to the first focal length.
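A minimal sketch of how the vertical extent of a maximum viewing angle area could be computed from the gimbal pitch limits and the vertical angle of view, assuming (as FIG. 4 suggests) that the pitch limits are extended by half the vertical field of view; the interval form is an assumption, not the patent's exact formula:

```python
def max_viewing_area(pitch_min: float, pitch_max: float,
                     vertical_fov: float) -> tuple[float, float]:
    """Vertical range of the maximum viewing angle area: at each reachable
    pitch the camera sees half the vertical field of view above and below
    the optical axis, so the pitch limits extend by vertical_fov / 2 (degrees)."""
    return (pitch_min - vertical_fov / 2.0, pitch_max + vertical_fov / 2.0)

# e.g. gimbal pitch range (-90, 30) and a 40-degree vertical angle of view:
# max_viewing_area(-90.0, 30.0, 40.0) -> (-110.0, 50.0)
```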
  • the first maximum viewing angle area corresponding to the first focal length can be expressed as [MIN - θ1/2, MAX + θ1/2].
  • the first maximum viewing angle area corresponding to the first focal length at the current attitude angle may be the first viewing angle area corresponding to the first focal length of the camera at the current attitude angle, and the camera may capture an image of the first viewing angle area at the first focal length .
  • the second focal length includes any of the following: a fixed focal length (such as the focal length of a fixed-focus camera), a preset focal length (such as the focal length specified in a preset shooting task), a focal length determined based on a focal length algorithm (for example, determined based on the user's historical focal length data), a focal length determined based on a sensor, and a focal length selected by the user (for example, a focal length selection component is displayed on the display interface, and the user selects the focal length by clicking a button, sliding a focal length selection bar, entering text, etc.).
  • the focal length determined based on the sensor includes: the focal length determined based on distance information, and the distance information is determined by a laser rangefinder.
  • a laser rangefinder can be set on a camera, pan/tilt or drone to determine the applicable focal length.
• The first maximum viewing angle area and the second maximum viewing angle area are compared. For example, compare whether the first maximum viewing angle area includes the second maximum viewing angle area, or whether the second maximum viewing angle area includes the first maximum viewing angle area. For another example, compare the upper limit of the first maximum viewing angle area with the upper limit of the second maximum viewing angle area, the lower limit of the first with the lower limit of the second, the left limit of the first with the left limit of the second, and the right limit of the first with the right limit of the second.
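As a hedged sketch of this comparison step (the function and return labels are illustrative, not from the disclosure), each axis of the two areas can be compared as a one-dimensional interval:

```python
def compare_ranges(first, second):
    """Classify the relation between two (low, high) angular intervals."""
    a_lo, a_hi = first
    b_lo, b_hi = second
    if b_lo <= a_lo and a_hi <= b_hi:
        return "first_in_second"   # second area fully contains the first
    if a_lo <= b_lo and b_hi <= a_hi:
        return "second_in_first"   # first area fully contains the second
    if a_hi < b_lo or b_hi < a_lo:
        return "disjoint"          # no common angular range
    return "partial_overlap"
```

Running the same comparison on the vertical and horizontal intervals yields the limit-by-limit comparison described above.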
• The selectable range of the area to be processed in the first image is determined according to the comparison result, so as to obtain at least one second image of the area to be processed at the second focal length, the first image being obtained at the first focal length.
  • the second focal length is greater than the first focal length.
  • the selectable range of the area to be processed in the first image corresponding to the first maximum viewing angle area may be determined based on the comparison result, and the area in the first image corresponding to the selectable range may be based on the second focal length Take a shot.
• If the first maximum viewing angle area corresponding to the first focal length is included in the second maximum viewing angle area corresponding to the second focal length, it is determined that the entire first maximum viewing angle area can be photographed at the second focal length. That is, the upper and lower boundaries of the selectable range are the upper and lower boundaries of the first maximum viewing angle area, and the normalized value of the selectable range is (0.0, 1.0) (that is, the selectable range spans from 0% to 100% of the first maximum viewing angle area between its upper and lower boundaries).
• Otherwise, the upper limit, lower limit, left limit, and right limit of the selectable range need to be calculated.
• In order for the user to intuitively see the relationship between the first maximum viewing angle area and the second maximum viewing angle area, and then select an appropriate area to be processed, when the method is applied to the control end of the shooting end, it may also include the following operation: displaying at least one of the multiple boundaries of the selectable range.
  • one or more boundaries of the selectable range can be displayed in the display interface.
• If a boundary of the area to be processed exceeds the selectable range, the exceeded boundary among the boundaries of the area to be processed may be displayed on the display interface.
  • the non-selectable range and the selectable range can be displayed differently, such as by transparency or color.
• FIGS. 5 to 10 are schematic diagrams of the display interface of the control terminal provided by embodiments of the present disclosure.
• Determining the selectable range of the area to be processed in the first image according to the comparison result may include the following operation: if it is determined according to the comparison result that the second maximum viewing angle area S2 includes the first maximum viewing angle area S1, determining that the selectable range of the area to be processed in the first image is any area in the first image.
• Determining the selectable range of the area to be processed in the first image according to the comparison result may include the following operation: if it is determined according to the comparison result that the second maximum viewing angle area S2 includes part of the first maximum viewing angle area S1, determining that the selectable range of the area to be processed in the first image is the overlap area of the first maximum viewing angle area and the second maximum viewing angle area.
  • the shaded area in FIG. 6 is the overlapping area of the first maximum viewing angle area and the second maximum viewing angle area.
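Under the assumption that each maximum viewing angle area is modelled as a (low, high) interval per axis, the overlap area that forms the selectable range could be computed as follows. This is a sketch with illustrative names, not the patented method itself:

```python
def selectable_interval(first_area, second_area):
    """Intersection of two (low, high) intervals along one axis.

    Returns the overlapping interval, or None if the two areas do not
    overlap along this axis.
    """
    low = max(first_area[0], second_area[0])
    high = min(first_area[1], second_area[1])
    return (low, high) if low <= high else None
```

Applying this per axis to the two maximum viewing angle areas gives the shaded overlap region of FIG. 6.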
  • the current attitude angle is the current pitch angle
• Determining the selectable range of the area to be processed in the first image according to the comparison result may include the following operation: if it is determined according to the comparison result that the first upper limit of the first maximum viewing angle area S1 exceeds the second upper limit of the second maximum viewing angle area S2, determining the selectable upper limit (upper boundary) of the selectable range in the first maximum viewing angle area based on the first upper limit, the second upper limit, and the vertical viewing angle corresponding to the first focal length.
  • the current attitude angle is the current pitch angle
• Determining the selectable range of the area to be processed in the first image according to the comparison result may include the following operation: if it is determined according to the comparison result that the second lower limit of the second maximum viewing angle area exceeds the first lower limit of the first maximum viewing angle area, determining the selectable lower limit (lower boundary) of the selectable range in the first maximum viewing angle area based on the second lower limit, the first lower limit, and the vertical viewing angle corresponding to the first focal length.
  • the current attitude angle is the current yaw angle
• Determining the selectable range of the area to be processed in the first image according to the comparison result may include the following operation: if it is determined according to the comparison result that the first left limit of the first maximum viewing angle area exceeds the second left limit of the second maximum viewing angle area, determining the selectable left limit of the selectable range in the first maximum viewing angle area based on the first left limit, the second left limit, and the horizontal viewing angle corresponding to the first focal length.
  • the algorithm for the left limit (optional left limit) of the selectable range can refer to the algorithm for calculating the lower limit of the selectable range.
• It suffices to replace the vertical size of the viewing angle area corresponding to the first focal length with the horizontal size of the viewing angle area corresponding to the first focal length, and to replace the vertical size of the viewing angle area corresponding to the second focal length with the horizontal size of the viewing angle area corresponding to the second focal length.
  • the current attitude angle is the current yaw angle
• Determining the selectable range of the area to be processed in the first image according to the comparison result may include the following operation: if it is determined according to the comparison result that the second right limit of the second maximum viewing angle area exceeds the first right limit of the first maximum viewing angle area, determining the selectable right limit of the selectable range in the first maximum viewing angle area based on the second right limit, the first right limit, and the horizontal viewing angle corresponding to the first focal length.
  • the algorithm for the right limit of the selectable range can refer to the above-mentioned algorithm for calculating the upper limit of the selectable range.
• It suffices to replace the vertical size of the viewing angle area corresponding to the first focal length with the horizontal size of the viewing angle area corresponding to the first focal length, and to replace the vertical size of the viewing angle area corresponding to the second focal length with the horizontal size of the viewing angle area corresponding to the second focal length.
  • the method may further include the following operations.
  • the second largest viewing angle area includes the area to be processed.
  • the to-be-processed area input by the user may be received at the control terminal, or the to-be-processed area may be determined based on the image recognition algorithm at the photographing device or the control terminal.
• The area to be processed should be included in the second maximum viewing angle area, so as to avoid a situation where the multiple second images obtained by photographing the area to be processed at the second focal length cannot cover the complete image of the area to be processed (for example, the stitched result would have notches or gaps).
  • the area to be processed is determined by way of user input as an example for description.
• The area to be processed may be determined at the control terminal or the camera device; for example, the control terminal sends the area to be processed input by the user to the camera device.
  • the area to be processed may be a frame-like area input by the user from the display interface of the control terminal, an area formed by multiple coordinates input by the user, or an area formed by multiple points input by the user.
  • the method may further include the following operations.
• The control terminal or camera device can obtain a candidate to-be-processed area (such as a person image area or scenic spot image area obtained through an algorithm, or a key monitoring area determined according to a preset task). Accordingly, determining the area to be processed includes: if it is determined that the candidate to-be-processed area does not exceed the selectable range, determining the candidate to-be-processed area to be the area to be processed.
  • a prompt message may be output on the control terminal.
  • the selectable range is displayed as the prompt information on the display interface of the control terminal.
  • the method may further include: if it is determined that the candidate to-be-processed area is beyond the selectable range, outputting prompt information at the control terminal. For example, when the candidate to-be-processed area input by the user exceeds the selectable range, prompt information is output on the control terminal.
• The prompt information includes but is not limited to: image information, text information, sound information, and somatosensory information. For example, when the candidate to-be-processed area exceeds the lower limit of the selectable range, the lower limit of the selectable range is displayed on the display interface of the control terminal.
• When the selectable range is displayed on the display interface of the control terminal, areas of the first image outside the selectable range cannot be selected; that is, the user is restricted to selecting within the selectable range. In this case, the obtained candidate to-be-processed area is the area to be processed, and there is no need to further determine whether the candidate to-be-processed area exceeds the selectable range.
  • the determination of the area to be processed based on the image recognition algorithm is taken as an example.
• The determination of the area to be processed based on the image recognition algorithm may include: taking the image area recognized by the image recognition algorithm as the area to be processed.
• Suitable image recognition technologies can be used in the above image recognition process, such as CAD-like object recognition methods (edge detection, primal sketch), appearance-based recognition methods (such as edge matching, divide-and-conquer search, grayscale matching, gradient matching, histograms of receptive field responses, or large model bases), feature-based recognition methods (such as interpretation trees, hypothesize-and-test, pose consistency, pose clustering, invariance, geometric hashing), scale-invariant feature transform (SIFT), speeded-up robust features (SURF), genetic algorithms, etc.
  • the area to be processed is determined based on a preset task and an image recognition algorithm as an example for description.
  • the user can set a preset task through the control terminal in advance, such as an image collection task based on a preset period.
• The preset task can include a variety of task parameters, such as the task period, shooting pose sequence information (which can be input by the user, or automatically determined by the control terminal according to the key monitoring area and focal length information input by the user), the number of photos taken and the pose information of each shot, focal length information, sensitivity information, shutter information, alarm-related information, and the like. Alarm-related information includes but is not limited to: alarm trigger conditions, alarm mode, etc.
• The recognition result of the image recognition algorithm is compared with the alarm trigger conditions to determine whether to alarm. For example, if it is determined through image recognition that there are flammable and explosive items at a monitored location, an alarm is required.
• Monitored objects such as meters and fasteners are usually fixed at designated locations. The status information of a monitored object is determined based on the preset task and the image recognition algorithm, and whether to alarm is then determined based on the status information of the monitored object.
  • the photographing related information is determined based on the second focal length and the to-be-processed area, so as to acquire at least one second image of the to-be-processed area at the second focal length based on the photographing-related information.
  • the photographing related information may include: photographing pose sequence information, camera photographing related information, camera type information, and the like.
  • the shooting pose sequence information may include posture adjustment device associated information.
  • the associated information of the attitude adjustment device includes but is not limited to at least one of the following: attitude angle sequence information, time sequence information, and the like.
• Camera shooting related information includes but is not limited to at least one of the following: exposure duration sequence information, sensitivity sequence information, denoising mode sequence information, output format sequence information, compression mode sequence information, etc. (It should be noted that each piece of camera shooting related sequence information may also be a single value; for example, the shooting parameters may be the same for every shooting pose in the shooting pose sequence information.)
  • the shooting pose sequence information is determined based on the shooting combination information, and the shooting combination information is determined based on the second focal length and the area to be processed.
  • the photographing combination information may include: the number of photographs and the area to be photographed for each photograph.
  • determining the photographing associated information based on the second focal length and the area to be processed includes: determining the number of photographs and the area to be photographed for each photograph based on the second focal length and the area to be processed.
  • each shooting pose in the shooting pose sequence information corresponds to a to-be-photographed area, so that under each shooting pose of the shooting pose sequence information, a second image corresponding to the to-be-photographed area is captured at the second focal length.
  • determining the number of photographs and the area to be photographed for each photograph may include the following operations. First, the viewing angle area corresponding to the second focal length is determined based on the size of the image sensor and the second focal length. Then, the number of photographs and the area to be photographed for each photograph are determined based on the area to be processed and the viewing angle area corresponding to the second focal length. Among them, the area formed by multiple areas to be photographed includes the area to be processed.
  • FIG. 11 is a schematic diagram of the division of a to-be-processed area provided by an embodiment of the disclosure.
• The size of the camera's image sensor is fixed. At the second focal length in a specific posture (the scene area covered by an image obtained at a long focal length is smaller than that covered at a short focal length), the region captured by the image sensor has, on the display interface, a width W1 and a length L1.
  • the size of the area to be processed on the display interface is: width W2, length L2.
• The area to be processed can be divided into ⌈W2/W1⌉ × ⌈L2/L1⌉ regions, where ⌈·⌉ indicates rounding up. In this way, the number of photographs that need to be taken can be determined, so that the multiple second images taken at the second focal length can completely cover the image of the area to be processed in the first image taken at the first focal length.
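Dividing an area of width W2 and length L2 into tiles of width W1 and length L1, rounding up in each direction, can be sketched as follows (an illustrative helper that ignores overlap; the function name is ours):

```python
import math

def photo_grid(w1, l1, w2, l2):
    """Number of shots needed horizontally and vertically to tile an area
    of width w2 and length l2 with per-shot coverage w1 x l1 (no overlap).

    Uses ceiling division so that partial tiles still get a shot.
    """
    return math.ceil(w2 / w1), math.ceil(l2 / l1)
```

For example, a 250 × 200 area covered by 100 × 80 shots needs a 3 × 3 grid of photographs.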
  • each divided area can be used as an area to be photographed.
  • a certain redundant area may be set for the area to be photographed. For example, determining the number of photos taken based on the area to be processed and the viewing angle area corresponding to the second focal length, and the area to be photographed for each photo may include: based on the area to be processed, the viewing angle area corresponding to the second focal length and the area to be photographed.
  • the image overlap ratio determines the number of photographs and the area to be photographed for each photograph. Wherein, the image overlap ratio may be a preset value, the image overlap ratio may be for one or more side lengths (for example, including at least one of the width overlap ratio and the length overlap ratio), or it may be for area (for example, Area overlap ratio).
  • FIG. 12 is a schematic diagram of an area to be photographed according to an embodiment of the disclosure.
• The illustrated area to be processed is divided into 9 areas to be photographed, and there is a specific image overlap ratio between adjacent areas to be photographed. Since the horizontal length and the vertical length of the area to be photographed can be different, a horizontal image overlap ratio (overlap_x) and a vertical image overlap ratio (overlap_y) can be set. Of course, the same image overlap ratio can also be set for the horizontal and vertical directions of the area to be photographed. In other embodiments, if the area to be photographed is a square, an image overlap ratio with the same horizontal and vertical values can be set.
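One plausible way to count shots along a single axis once the overlap ratio is included is sketched below. This assumes each extra shot advances by the non-overlapping fraction of the per-shot field of view; the function is illustrative, not the disclosure's exact formula:

```python
import math

def shots_along_axis(fov_area, fov_shot, overlap):
    """Shots needed along one axis when adjacent shots overlap by `overlap`
    (a fraction of the per-shot FOV).

    The first shot covers fov_shot; each additional shot adds only
    (1 - overlap) * fov_shot of new coverage.
    """
    if fov_area <= fov_shot:
        return 1
    return 1 + math.ceil((fov_area - fov_shot) / ((1 - overlap) * fov_shot))
```

For instance, a 30-degree span covered by 10-degree shots with 20% overlap needs 4 shots instead of 3.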
  • a camera with two lens groups is used as an example for description. If the first focal length is used for focusing by the first lens group, the second focal length is used for focusing by the second lens group.
• A baseline deviation always exists between the two lens groups and needs to be corrected: the optical axis of the second lens group is aligned with the optical axis of the original first lens group by adjusting the posture adjustment device, so as to capture an image of a designated area.
• Determining the photographing pose sequence information based on the area to be photographed may include: determining the photographing pose corresponding to each area to be photographed based on the current pose of the camera at the first focal length and the second-focal-length angular position deviation, and using the set of all determined shooting poses as the shooting pose sequence information.
• In other embodiments, determining the shooting pose sequence information based on the area to be photographed may include: determining the shooting pose corresponding to each area to be photographed based on the current pose of the camera at the first focal length, the second-focal-length angular position deviation, and the image overlap ratio between the viewing angle area corresponding to the second focal length and the area to be photographed, and using the set of all determined shooting poses as the shooting pose sequence information.
  • the deviation of the second focal length angle position may be determined based on the first deviation, the second deviation, and the third deviation. Specifically, the first deviation between the designated position of the viewing angle area corresponding to the second focal length and the designated position of the viewing angle area corresponding to the first focal length is determined, for example, the designated position may be the center point. The second deviation between the designated position of the area to be processed and the designated position of the viewing angle area corresponding to the first focal length is determined. Determine the third deviation between the designated position of the area to be photographed and the designated position of the area to be processed.
  • the second focal length angular position deviation between the area to be photographed and the designated position of the viewing angle area corresponding to the first focal length can be determined based on the first deviation, the second deviation, and the third deviation.
• By controlling the pan/tilt so that the camera deviates from its current attitude angle by the above second-focal-length angular position deviation, the area to be photographed can be captured.
• The horizontal viewing angle (fov_zx) and the vertical viewing angle (fov_zy) corresponding to the second focal length are calculated using formulas (1) and (2): fov_zx = 2·arctan(W / (2·focal_length)) formula (1); fov_zy = 2·arctan(H / (2·focal_length)) formula (2). Among them, W and H are the width and height of the image sensor, and focal_length is the second focal length.
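Formulas (1) and (2) are the standard pinhole field-of-view relation; a small sketch (the function name is ours):

```python
import math

def fov_deg(sensor_dim, focal_length):
    """Angle of view for one sensor dimension: 2 * arctan(d / (2 * f)).

    Both arguments must use the same length unit (e.g. millimetres);
    the result is in degrees.
    """
    return math.degrees(2 * math.atan(sensor_dim / (2 * focal_length)))
```

As a sanity check, a 36 mm-wide sensor behind an 18 mm lens yields a 90-degree horizontal angle of view.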
• Assuming the horizontal viewing angle at the first focal length corresponding to the area to be processed is (fov_wx) and the vertical viewing angle is (fov_wy), the conditions shown in formulas (3) and (4) should be satisfied: fov_zx + (m − 1)·(1 − overlap_x)·fov_zx ≥ fov_wx formula (3); fov_zy + (n − 1)·(1 − overlap_y)·fov_zy ≥ fov_wy formula (4).
• overlap_x is the image overlap ratio in the horizontal direction, and overlap_y is the image overlap ratio in the vertical direction, as shown in FIG. 12, that is, the length overlap ratio and the width overlap ratio.
  • m and n are the number of photos that need to be taken horizontally and vertically respectively.
  • overlap x can be specified by the user or calculated by an algorithm through special rules.
• The internal parameters, rotation matrix, and translation matrix between the first lens group and the second lens group can be calibrated for a specific target distance (generally assumed to be infinity). Then, the first deviation from the center of the viewing angle area corresponding to the second focal length to the center of the viewing angle area corresponding to the first focal length is calculated, as shown in formulas (5) to (7).
• The second deviation (offset_2_x, offset_2_y) of the center of the area to be processed relative to the center of the viewing angle area corresponding to the first focal length can then be calculated, as expressed by formulas (8) and (9).
• The third deviation (offset_3_x, offset_3_y) of the first area to be photographed, in the upper left corner, relative to the center of the area to be processed can be calculated.
  • FIG. 13 is a schematic diagram when the number of shots provided by an embodiment of the disclosure is an odd number
  • FIG. 14 is a schematic diagram when the number of shots provided by an embodiment of the disclosure is an even number.
• The third deviation (offset_3_x, offset_3_y) can be calculated using formulas (10) and (11).
• In formulas (10) and (11), fov_x − overlap_x is the horizontal viewing angle after removing the overlap, and fov_y − overlap_y is the vertical viewing angle after removing the overlap.
  • the current pose may refer to the current shooting pose angle of the camera.
  • the PTZ can be controlled to adjust the posture of the camera based on the angular position deviation, so that the camera can take a picture of the area to be photographed in the upper left corner of the m*n areas to be photographed under the second focal length.
  • the angular position deviation of the area to be photographed in the upper left corner with respect to the first lens group in the current pose can be determined by formulas (12) and (13).
• offset_x = offset_1_x + offset_2_x + offset_3_x formula (12); offset_y = offset_1_y + offset_2_y + offset_3_y formula (13)
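Formulas (12) and (13) simply sum the three per-axis deviations; as a sketch (the function name is ours):

```python
def total_offset(offset_1, offset_2, offset_3):
    """Total (x, y) angular deviation as the per-axis sum of the three
    deviations, following the pattern of formulas (12) and (13)."""
    return (offset_1[0] + offset_2[0] + offset_3[0],
            offset_1[1] + offset_2[1] + offset_3[1])
```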
  • the first focal length is realized by a wide-angle camera (including the first lens group), and the second focal length is realized by a telephoto camera (including the second lens group).
• The current wide-angle camera position is (yaw_current, pitch_current).
• The position corresponding to the area to be photographed in the upper left corner, relative to the wide-angle camera, is (yaw_target(1,1), pitch_target(1,1)), which can be calculated by formulas (14) and (15): yaw_target(1,1) = yaw_current + offset_x formula (14); pitch_target(1,1) = pitch_current + offset_y formula (15)
• yaw_target(m,n) = yaw_target(1,1) + (1 − overlap_x)·fov_zx·(m − 1); pitch_target(m,n) = pitch_target(1,1) + (1 − overlap_y)·fov_zy·(n − 1)
  • the angular position deviation of each area to be photographed relative to the current posture of the camera can be determined, and the photographing pose sequence information can be obtained.
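Putting the stepping pattern together, the shooting pose sequence for an m × n grid could be generated as below. This is an illustrative sketch following the (1 − overlap) · fov step used in the formulas above; the function and argument names are ours:

```python
def pose_sequence(yaw_11, pitch_11, fov_zx, fov_zy,
                  overlap_x, overlap_y, m, n):
    """Shooting poses for an m x n grid of areas to be photographed.

    Starts from the upper-left pose (yaw_11, pitch_11) and steps by the
    non-overlapping part of the per-shot field of view in each direction.
    """
    poses = []
    for row in range(n):
        for col in range(m):
            poses.append((yaw_11 + (1 - overlap_x) * fov_zx * col,
                          pitch_11 + (1 - overlap_y) * fov_zy * row))
    return poses
```

The pan/tilt can then be driven through this list, taking one second image per pose.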
  • the method may further include the following operations: controlling the shooting end to shoot at the second focal length under each shooting pose in the shooting pose sequence information.
• The shooting of the second image corresponding to the area to be photographed can be controlled by the control terminal or performed by the camera terminal.
  • the control terminal may send a shooting instruction to the shooting terminal.
  • the shooting end automatically performs shooting based on the shooting pose sequence information, which is not limited here.
• The shooting end includes a shooting device, and controlling the shooting end to shoot the second image corresponding to the area to be photographed at the second focal length in each shooting pose of the shooting pose sequence information may include the following operations: controlling, through the attitude adjustment device, the shooting device to assume each shooting pose in the shooting pose sequence information, and controlling the shooting device to shoot, at the second focal length in each shooting pose, the second image corresponding to the respective area to be photographed.
  • the attitude adjustment device is set on a movable platform, the attitude adjustment device can be a multi-axis pan/tilt, and the movable platform can be an unmanned aerial vehicle.
  • the first image and the second image are captured by a zoom camera at the first focal length and the second focal length, respectively.
  • the camera has a zoom lens
  • the first image and the second image are obtained by adjusting the focal length of the zoom lens and adjusting the posture of the camera.
  • the first image is taken by a first photographing device
  • the second image is taken by a second photographing device.
  • the focal length of the first photographing device is at least partially different from the focal length of the second photographing device.
  • the first photographing device and the second photographing device are two independent photographing devices, which can be set on the same pan-tilt, so that the first photographing device and the second photographing device can move synchronously.
• The distance between the first camera and the second camera is fixed; the first camera is set on a first pan/tilt, the second camera is set on a second pan/tilt, and the first pan/tilt and the second pan/tilt can move synchronously.
  • the first imaging device and the second imaging device are packaged in a housing and share an image sensor, and the optical axis of the lens group of the first imaging device and the optical axis of the lens group of the second imaging device are kept parallel.
  • the focal length of at least one of the first camera and the second camera is adjustable.
  • the first photographing device is a fixed focal length camera
  • the second photographing device is an adjustable focal length camera.
  • the first imaging device is an adjustable focal length camera
  • the second imaging device is a fixed focal length camera.
• In some embodiments, both the first imaging device and the second imaging device are cameras with adjustable focal lengths. Using two shooting devices can be advantageous compared to using a single shooting device with a larger focal length adjustment range (such as a high-power zoom lens).
  • the first photographing device is a wide-angle camera
  • the second photographing device is a telephoto camera.
  • the telephoto camera may be a camera with an adjustable focal length.
  • the adjustable focal length range of the telephoto camera can be determined according to the cost of the camera and so on.
  • the adjustable focus range of the telephoto camera may be smaller than the adjustable focus range of the above-mentioned high-power zoom lens.
  • the adjustable focal length range of a telephoto camera is located on the larger focal length side of the adjustable focal range of a high-power zoom lens.
  • first photographing device described above may include a first lens group
  • second photographing device may include a second lens group
• The method may also include the following operation: storing the first image and the second image in association.
  • the SD card stores the HTML file and the first image and the second image, so that the SD card is coupled to the personal computer (PC) side, and the first image and the second image can be displayed on the PC side based on the HTML file.
  • the HTML file may include the mapping relationship between the first image and the second image.
  • storing the first image and the second image in association may include the following operations. First, determine the first mapping relationship between the second image and the area to be processed, and determine the second mapping relationship between the area to be processed and the first image. Then, the first image, the area to be processed, the second image, the first mapping relationship, and the second mapping relationship are stored. Among them, the area to be processed may be expressed as range information, etc., such as a range determined based on multiple coordinates.
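The two mapping relationships could be stored as a simple record; this is a hypothetical data layout (the field and function names are ours, not the format used by the disclosure):

```python
def build_association(first_image_id, region, second_image_ids):
    """Record linking each second image to the to-be-processed region
    (first mapping) and the region to the first image (second mapping).

    `region` is range information, e.g. (x, y, width, height) within
    the first image.
    """
    return {
        "first_image": first_image_id,
        "region": region,
        # first mapping: second image -> area to be processed
        "second_to_region": {sid: region for sid in second_image_ids},
        # second mapping: area to be processed -> first image
        "region_to_first": {region: first_image_id},
    }
```

A record like this could be serialized (for example to the HTML file mentioned above) so a viewer can navigate from the first image to the matching second images.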
• When applied to the shooting terminal, the second image is an unprocessed captured image.
  • the second image is the original captured image.
  • the original captured image may undergo pre-processing such as noise reduction, object recognition, and recognition of object region marking.
  • the second image when applied to the control terminal of the shooting terminal, the second image is a processed captured image, and the resolution of the second image is smaller than the resolution of the unprocessed captured image.
  • image compression may be performed on the second image to reduce the storage occupied by the second image.
  • the network resources occupied by the transmission of the second image can be reduced.
  • the drone side transmits a thumbnail of the second image.
• when the control terminal needs to consult the specific second image, the image needs to be downloaded from the drone side.
• compressing the second image before transmission helps reduce network resource consumption, and helps improve the transmission rate and the response speed to user commands.
• the process of compressing the second image may occur after the second image is captured, after receiving a user instruction for viewing the second image, or after receiving a user instruction for compressing the second image; there is no restriction here.
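Resolution reduction is one simple way the compression described above can lower storage and transmission cost. A minimal sketch, assuming grayscale pixel lists; the actual codec and parameters are not specified by the text:

```python
def compress_by_downsampling(pixels, factor=2):
    """Reduce an image's resolution by averaging factor x factor blocks.

    `pixels` is a 2-D list of grayscale values. A real implementation would
    use JPEG or another codec; this only illustrates why the processed
    second image ends up with a lower resolution than the original.
    """
    h, w = len(pixels), len(pixels[0])
    out = []
    for by in range(0, h - h % factor, factor):
        row = []
        for bx in range(0, w - w % factor, factor):
            block = [pixels[by + dy][bx + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) // len(block))  # mean of the block
        out.append(row)
    return out

thumb = compress_by_downsampling([[10, 20, 30, 40],
                                  [10, 20, 30, 40]], factor=2)
# thumb is 1 x 2: [[15, 35]]
```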
• when the zoom lens obtains the shot picture according to the new zoom factor (for example, when the user changes the second focal length by adjusting the magnification adjustment component in the display interface of the control terminal), the camera device can send the picture acquired by the zoom lens to the control terminal in real time, so that the control terminal displays the picture obtained by the zoom lens, and the user can intuitively check whether the sharpness of the picture meets their needs.
• the above method may further include the following operation: if an operation instruction for viewing the second image is received, obtaining the unprocessed captured image corresponding to the second image from the photographing terminal; then, displaying the unprocessed captured image corresponding to the second image.
• the drone side may send the compressed image of the second image to the control terminal side for display. If the user wants to view the original captured image of the second image, the user can send an operation instruction for viewing the second image to the drone side (or the photographing device on it) through the control terminal side.
• in response to the operation instruction for viewing the second image, the drone side sends the original captured image of the second image to the control terminal side. In this way, the information interaction speed between the control terminal side and the drone side and the user's multiple needs for the second image can both be satisfied.
• the camera is controlled to shoot the area to be photographed (which may include multiple areas to be photographed) at the second focal length under each shooting pose of the shooting pose sequence.
  • the method may further include the following operations. First, a third image is acquired, which is synthesized based on the second image. Then, the first image and the third image are stored in association. Wherein, the third image may be synthesized using multiple second images, for example, based on image stitching technology. For the overlapping image parts between the multiple second images that are stitched, interpolation can be used to make the stitching more natural.
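The stitching with interpolation mentioned above can be sketched for the simplest case: two same-height strips whose edges overlap, blended linearly across the overlap so the seam looks natural. This is an illustrative sketch only; real stitching would also align the strips, and the function name and data layout are assumptions:

```python
def stitch_horizontal(left, right, overlap):
    """Stitch two same-height image strips whose last/first `overlap`
    columns depict the same scene, linearly interpolating in the overlap
    region so the transition is gradual."""
    out = []
    for lrow, rrow in zip(left, right):
        row = list(lrow[:-overlap])              # left part kept as-is
        for i in range(overlap):
            w = (i + 1) / (overlap + 1)          # weight ramps toward `right`
            row.append((1 - w) * lrow[len(lrow) - overlap + i] + w * rrow[i])
        row.extend(rrow[overlap:])               # right part kept as-is
        out.append(row)
    return out

# Two 1-row strips sharing one column; the shared column is averaged.
pano = stitch_horizontal([[0, 0, 100]], [[100, 50, 50]], overlap=1)
```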
• an example in which the first image is captured through the wide-angle lens and the second image is captured through the telephoto lens is described below, in combination with the display interface of the control terminal, to illustrate viewing of the photographing result.
  • Figures 15-17 are schematic diagrams of display images provided by embodiments of the disclosure.
  • the content displayed on the display interface 1501 of the control terminal includes: the first image captured at the first focal length under the current posture of the first camera.
  • the user selects the area to be processed 1502 (included in the second maximum viewing angle area corresponding to the second focal length) from the first image, and sets the second focal length.
• the photographing terminal or the control terminal divides the area to be processed 1502 based on the second focal length, determines four areas to be photographed, and determines photographing related information for the four areas to be photographed. In this way, the shooting end can photograph each of the four areas at the second focal length based on the photographing related information, so as to obtain four super-resolution second images.
  • the first image and the second image can be associated and stored, such as stored in an SD card.
  • the display interface 1501 further includes a focal length adjustment component 1503, focal length information, etc., so that the user can adjust the second focal length.
• the current second focal length (ZOOM) corresponds to a 10× zoom.
• the display interface 1501 may also include the number of shots and/or the shooting duration of the zoom lens; this shooting information changes as the shooting parameters of the zoom lens change.
  • a display interface 1601 of the terminal's display is shown.
  • annotation information 1602 may be displayed, such as the range mark corresponding to the area to be processed in FIG. 15, the text annotation information of the second image stored in association, and the like.
  • the image identification and format information (DJI00001.jpeg) of the first image can also be displayed.
  • the user can view the second image stored in association by clicking the annotation information 1602 or a specific functional component.
  • the terminal can determine the second image that the user wants to view based on the HTML file and the annotation information 1602, and display it.
  • the user can hide the annotation information 1602 through a specific operation to ensure the viewing effect.
  • the display interface 1601 of the terminal's display may also include an operation button 1603, so that the user can also perform operations such as editing, deleting, and sharing the first image.
• the annotation information may not be displayed.
  • the HTML file may not be sent.
• as shown in FIG. 17, the display interface 1601 of the display of the terminal can display the relationship between the currently displayed second image 1701 and the first image.
  • the second image 1701 shown in FIG. 17 is the first (1 of 4) of the four second images corresponding to the to-be-processed area of the first image.
  • the user can also edit, delete, and share the second image 1701.
  • FIG. 18 is a schematic structural diagram of an image processing device provided by an embodiment of the disclosure.
  • the image processing device 1800 includes:
  • One or more processors 1810 are configured to perform one or more tasks.
  • the computer-readable storage medium 1820 is configured to store one or more computer programs 1821.
• when the computer program 1821 is executed by the processor 1810, it implements:
  • the second focal length is greater than the first focal length.
  • the second maximum viewing angle area corresponding to the second focal length is determined based on the maximum rotatable angle of the posture adjustment device.
  • determining the selectable range of the area to be processed in the first image according to the comparison result includes:
  • the selectable range of the area to be processed in the first image is any area in the first image.
  • determining the selectable range of the area to be processed in the first image according to the comparison result includes:
  • the selectable range of the area to be processed in the first image is determined to be the overlapping area of the first maximum viewing angle area and the second maximum viewing angle area.
  • the current attitude angle is the current pitch angle
  • the selectable range of the region to be processed in the first image is determined according to the comparison result, including:
• if the first upper limit of the first maximum viewing angle area exceeds the second upper limit of the second maximum viewing angle area, the optional upper limit of the selectable range in the first maximum viewing angle area is determined based on the first upper limit, the second upper limit, and the vertical viewing angle area corresponding to the first focal length.
  • the current attitude angle is the current pitch angle
  • the selectable range of the region to be processed in the first image is determined according to the comparison result, including:
• if the second lower limit of the second maximum viewing angle area exceeds the first lower limit of the first maximum viewing angle area, the optional lower limit of the selectable range in the first maximum viewing angle area is determined based on the second lower limit, the first lower limit, and the vertical viewing angle area corresponding to the first focal length.
  • the current attitude angle is the current yaw angle
  • the selectable range of the region to be processed in the first image is determined according to the comparison result, including:
• if the first left limit of the first maximum viewing angle area exceeds the second left limit of the second maximum viewing angle area, the selectable left limit of the selectable range in the first maximum viewing angle area is determined based on the first left limit, the second left limit, and the horizontal viewing angle area corresponding to the first focal length.
  • the current attitude angle is the current yaw angle
  • the selectable range of the region to be processed in the first image is determined according to the comparison result, including:
• if the second right limit of the second maximum viewing angle area exceeds the first right limit of the first maximum viewing angle area, the selectable right limit of the selectable range in the first maximum viewing angle area is determined based on the second right limit, the first right limit, and the horizontal viewing angle area corresponding to the first focal length.
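One plausible arithmetic behind the limit determinations above: the angular excess of the first view's limit over the second view's limit, divided by the first focal length's angle of view, gives the fraction of the first image that falls outside the selectable range. This reading, and the function name, are assumptions for illustration:

```python
def selectable_upper_fraction(first_upper_deg, second_upper_deg, vertical_fov_deg):
    """When the first view's upper limit exceeds the second view's upper
    limit, the top of the first image cannot be reached by the telephoto
    view; the selectable range is trimmed by the excess angle expressed
    as a fraction of the first focal length's vertical angle of view."""
    excess = max(0.0, first_upper_deg - second_upper_deg)  # degrees out of reach
    return min(1.0, excess / vertical_fov_deg)             # fraction of height cut

cut = selectable_upper_fraction(40.0, 30.0, 50.0)
# the top 20% of the first image falls outside the selectable range
```

The lower, left, and right limits would be trimmed symmetrically using the vertical or horizontal angle of view as appropriate.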
• when applied to the control terminal of the shooting terminal, the computer program 1821 is also used to implement:
  • At least one of the multiple boundaries of the selectable range is displayed.
• after the selectable range of the area to be processed in the first image is determined according to the comparison result, when the computer program 1821 is executed by the processor 1810, it is further used to implement:
  • the photographing associated information is determined based on the second focal length and the area to be processed, so as to acquire at least one second image of the area to be processed at the second focal length based on the photographing associated information.
• before the area to be processed is determined, when the computer program 1821 is executed by the processor 1810, it is also used to implement:
• Determining the area to be processed includes:
  • the candidate to-be-processed area is determined to be the to-be-processed area.
  • determining the area to be processed includes:
  • determining the area to be processed based on the image recognition algorithm includes:
  • determining the area to be processed based on the image recognition algorithm includes:
  • the area to be processed is determined from the first image based on the preset task and the image recognition algorithm.
  • the photographing related information includes: photographing pose sequence information.
  • the shooting pose sequence information is determined based on the shooting combination information, and the shooting combination information is determined based on the second focal length and the area to be processed.
  • the photographing combination information includes: the number of photographs and the area to be photographed for each photograph;
  • Determining photographing related information based on the second focal length and the area to be processed includes:
  • each shooting pose in the shooting pose sequence information corresponds to a to-be-photographed area, so that under each shooting pose of the shooting pose sequence information, a second image corresponding to the to-be-photographed area is captured at the second focal length.
  • determining the number of photographs and the area to be photographed for each photograph includes:
  • the area formed by a plurality of areas to be photographed includes the area to be processed.
• determining, based on the area to be processed and the viewing angle area corresponding to the second focal length, the number of photographs and the area to be photographed for each photograph includes:
  • the number of photographing times and the area to be photographed for each photograph are determined based on the image overlap ratio between the area to be processed, the viewing angle area corresponding to the second focal length, and the area to be photographed.
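The shot count per axis can be derived from the region's angular extent, the per-shot angle of view at the second focal length, and the image overlap ratio. The formula below is one plausible sketch, not taken from the text; the names and numbers are illustrative:

```python
import math

def shots_per_axis(region_angle, shot_angle, overlap_ratio):
    """Number of photographs needed along one axis so that shots of angular
    extent `shot_angle`, each overlapping its neighbor by `overlap_ratio`,
    cover a region of angular extent `region_angle`."""
    if region_angle <= shot_angle:
        return 1
    step = shot_angle * (1 - overlap_ratio)  # net new angle covered per shot
    return 1 + math.ceil((region_angle - shot_angle) / step)

# A 30-degree-wide, 16-degree-tall region, 10-degree shots, 20% overlap:
n_h = shots_per_axis(30.0, 10.0, 0.2)  # horizontal count
n_v = shots_per_axis(16.0, 10.0, 0.2)  # vertical count
total = n_h * n_v                      # total number of photographs
```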
  • determining the shooting pose sequence information based on the area to be photographed includes:
  • the shooting pose corresponding to the area to be photographed each time is determined based on the current pose of the camera at the first focal length and the angular position deviation of the second focal length, and the set of all the determined shooting poses is used as the shooting pose sequence information.
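The pose-sequence construction above can be sketched as adding each shot's angular position deviation to the camera's current pose at the first focal length. The (yaw, pitch) representation and function name are assumptions for illustration:

```python
def build_pose_sequence(current_pose, deviations):
    """Given the camera's current (yaw, pitch) pose at the first focal
    length and the per-shot angular position deviations at the second
    focal length, produce the shooting pose sequence."""
    yaw0, pitch0 = current_pose
    return [(yaw0 + d_yaw, pitch0 + d_pitch) for d_yaw, d_pitch in deviations]

# Four shots arranged around the current pose (degrees, illustrative):
poses = build_pose_sequence((90.0, -10.0), [(-3.0, 2.0), (3.0, 2.0),
                                            (-3.0, -2.0), (3.0, -2.0)])
```

The set of all poses produced this way is then used as the shooting pose sequence information.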
  • the angular position deviation of the second focal length is determined in the following manner:
  • the second focal length angular position deviation between the to-be-photographed area and the designated position of the viewing angle area corresponding to the first focal length is determined.
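The deviation determination above combines three pieces (as elaborated later for the device embodiment): the offset between the two focal lengths' designated view positions, the offset of the area to be processed within the first view, and the offset of the area to be photographed within the area to be processed. Summing the three is one plausible composition, assumed here for illustration:

```python
def shot_angular_deviation(first_dev, second_dev, third_dev):
    """Combine the three (yaw, pitch) deviations, in degrees, into the
    angular position deviation between an area to be photographed and the
    designated position (e.g. center) of the first-focal-length view."""
    return (first_dev[0] + second_dev[0] + third_dev[0],
            first_dev[1] + second_dev[1] + third_dev[1])

dev = shot_angular_deviation((0.0, 0.0),   # second vs first view centers
                             (5.0, -2.0),  # area to process vs first view
                             (1.5, 1.0))   # area to photograph vs area to process
# dev == (6.5, -1.0)
```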
  • the computer program 1821 when being executed by the processor 1810, is also used to implement:
  • the shooting end is controlled to shoot the second image corresponding to the area to be photographed at the second focal length under each shooting pose of the shooting pose sequence information.
  • the shooting end includes a shooting device, and controlling the shooting end to shoot the second image corresponding to the area to be photographed at the second focal length under each shooting pose of the shooting pose sequence information includes:
  • the posture adjustment device controls the shooting device to sequentially be in each shooting pose in the shooting pose sequence information, and controls the shooting device to shoot the second images corresponding to the area to be photographed at the second focal length under each shooting pose.
  • the posture adjustment device is provided on a movable platform.
• after the shooting end is controlled to shoot the second image corresponding to the area to be photographed at the second focal length under each shooting pose of the shooting pose sequence information, when the computer program 1821 is executed by the processor 1810, it is further used to implement:
  • the first image and the second image are stored in association.
  • storing the first image and the second image in association includes:
  • the second image when applied to the shooting end, is an unprocessed shooting image.
• when applied to the control terminal, the second image is a processed captured image, and the resolution of the second image is lower than that of the unprocessed captured image.
• when the computer program 1821 is executed by the processor 1810, it is also used to implement:
  • the third image is synthesized based on the second image
  • the first image and the third image are stored in association.
  • the first image and the second image are captured by the zoom camera at the first focal length and the second focal length, respectively.
  • the first image is taken by a first photographing device
  • the second image is taken by a second photographing device; wherein the focal length of the first photographing device is at least partially different from the focal length of the second photographing device.
  • the focal length of at least one of the first camera and the second camera is adjustable.
  • the first camera is a wide-angle camera
  • the second camera is a telephoto camera
  • the second focal length includes any one of the following: a fixed focal length, a preset focal length, a focal length determined based on a focal length algorithm, a focal length determined based on a sensor, and a focal length selected by the user.
  • the focal length determined based on the sensor includes a focal length determined based on distance information, and the distance information is determined by a laser rangefinder.
  • FIG. 19 is a schematic structural diagram of an image processing system provided by an embodiment of the disclosure.
  • the image processing system 1900 may include:
  • the photographing device 1910 is used to obtain the first maximum viewing angle area corresponding to the first focal length under the current attitude angle of the photographing device 1910, and to obtain the second maximum viewing angle area corresponding to the second focal length;
• compare the first maximum viewing angle area with the second maximum viewing angle area; and determine the selectable range of the area to be processed in the first image according to the comparison result, so as to control the photographing device 1910 to obtain at least one second image of the area to be processed at the second focal length, where the first image is obtained at the first focal length and the second focal length is greater than the first focal length.
  • the second maximum viewing angle area corresponding to the second focal length is determined based on the maximum rotatable angle of the posture adjustment device.
• the photographing device 1910 is specifically configured to, if it is determined according to the comparison result that the second maximum viewing angle area includes the first maximum viewing angle area, determine that the selectable range of the area to be processed in the first image is any area in the first image.
• the photographing device 1910 is specifically configured to, if it is determined according to the comparison result that the second maximum viewing angle area includes part of the first maximum viewing angle area, determine that the selectable range of the area to be processed in the first image is the overlapping area of the first maximum viewing angle area and the second maximum viewing angle area.
• when the current attitude angle is the current pitch angle, the photographing device 1910 is specifically configured to, if it is determined according to the comparison result that the first upper limit of the first maximum viewing angle area exceeds the second upper limit of the second maximum viewing angle area, determine the optional upper limit of the selectable range in the first maximum viewing angle area based on the first upper limit, the second upper limit, and the vertical viewing angle area corresponding to the first focal length.
• when the current attitude angle is the current pitch angle, the photographing device 1910 is specifically configured to, if it is determined according to the comparison result that the second lower limit of the second maximum viewing angle area exceeds the first lower limit of the first maximum viewing angle area, determine the optional lower limit of the selectable range in the first maximum viewing angle area based on the second lower limit, the first lower limit, and the vertical viewing angle area corresponding to the first focal length.
• when the current attitude angle is the current yaw angle, the photographing device 1910 is specifically configured to, if it is determined according to the comparison result that the first left limit of the first maximum viewing angle area exceeds the second left limit of the second maximum viewing angle area, determine the selectable left limit of the selectable range in the first maximum viewing angle area based on the first left limit, the second left limit, and the horizontal viewing angle area corresponding to the first focal length.
• when the current attitude angle is the current yaw angle, the photographing device 1910 is specifically configured to, if it is determined according to the comparison result that the second right limit of the second maximum viewing angle area exceeds the first right limit of the first maximum viewing angle area, determine the selectable right limit of the selectable range in the first maximum viewing angle area based on the second right limit, the first right limit, and the horizontal viewing angle area corresponding to the first focal length.
  • control terminal 1920 includes a display interface, and the control terminal 1920 is further configured to display at least one of multiple boundaries of the selectable range through the display interface.
  • the photographing device 1910 is further configured to determine the area to be processed after determining the selectable range of the area to be processed in the first image according to the comparison result, where the second largest viewing angle area includes the area to be processed; And determining photographing related information based on the second focal length and the area to be processed, so as to control the photographing device 1910 to acquire at least one second image of the area to be processed at the second focal length based on the photographing associated information.
  • the photographing device 1910 is further configured to obtain candidate regions to be processed before determining the regions to be processed;
  • the candidate to-be-processed area is the to-be-processed area if it is determined that the candidate to-be-processed area does not exceed the selectable range.
  • the photographing device 1910 is further configured to output prompt information if it is determined that the candidate to-be-processed area is beyond the selectable range after acquiring the candidate to-be-processed area.
  • the photographing device 1910 is specifically configured to determine the area to be processed based on an image recognition algorithm.
  • the photographing device 1910 is specifically configured to perform image recognition on the first image, determine the object to be processed, and use the area where the object to be processed is located as the area to be processed.
  • the photographing device 1910 is specifically configured to determine the area to be processed from the first image based on a preset task and an image recognition algorithm.
  • the photographing related information includes: photographing pose sequence information.
  • the shooting pose sequence information is determined based on the shooting combination information, and the shooting combination information is determined based on the second focal length and the area to be processed.
  • the photographing combination information includes the number of photographs and the area to be photographed for each photograph
• the photographing device 1910 is specifically configured to determine the number of photographs and the area to be photographed for each photograph based on the second focal length and the area to be processed; and to determine the shooting pose sequence information based on the area to be photographed; wherein each shooting pose in the shooting pose sequence information corresponds to an area to be photographed, so that the photographing device 1910 shoots the second image corresponding to the area to be photographed at the second focal length under each shooting pose of the shooting pose sequence information.
• the photographing device 1910 is specifically configured to determine the viewing angle area corresponding to the second focal length based on the size of the image sensor and the second focal length; and to determine, based on the area to be processed and the viewing angle area corresponding to the second focal length, the number of photographs and the area to be photographed for each photograph; wherein the area formed by the multiple areas to be photographed includes the area to be processed.
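The angle of view along one axis follows from the sensor dimension and the focal length; the standard pinhole relation fov = 2·atan(d / 2f) is a reasonable assumption here, since the text only says the angle is determined from sensor size and the second focal length:

```python
import math

def view_angle_deg(sensor_dim_mm, focal_length_mm):
    """Angle of view along one axis, in degrees, from the image sensor
    dimension and the focal length (pinhole-camera relation)."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

# Full-frame width of 36 mm at an 18 mm focal length gives a 90-degree field;
# at a longer (second) focal length the field narrows accordingly.
fov_wide = view_angle_deg(36.0, 18.0)
fov_tele = view_angle_deg(36.0, 180.0)
```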
  • the photographing device 1910 is specifically configured to determine the number of photographs and the area to be photographed for each photograph based on the area to be processed, the viewing angle area corresponding to the second focal length, and the image overlap ratio between the area to be photographed.
  • the photographing device 1910 is specifically configured to determine the photographing pose corresponding to the area to be photographed for each photograph based on the current pose of the camera at the first focal length and the angular position deviation of the second focal length, and determine The collection of all shooting poses is used as the shooting pose sequence information.
• the photographing device 1910 is specifically configured to determine the first deviation between the designated position of the viewing angle area corresponding to the second focal length and the designated position of the viewing angle area corresponding to the first focal length; determine the second deviation between the designated position of the area to be processed and the designated position of the viewing angle area corresponding to the first focal length; determine the third deviation between the designated position of the area to be photographed and the designated position of the area to be processed; and determine, based on the first deviation, the second deviation, and the third deviation, the second-focal-length angular position deviation between the area to be photographed and the designated position of the viewing angle area corresponding to the first focal length.
  • the photographing device 1910 is specifically configured to photograph the second image corresponding to the area to be photographed at the second focal length under each photographing pose of the photographing pose sequence information.
  • the photographing device 1910 includes a posture adjusting device and a photographing device.
• the photographing device 1910 is specifically used to control, through the posture adjusting device, the photographing device to be in each shooting pose in the shooting pose sequence information in turn, and to control the photographing device to shoot, in each shooting pose, a second image corresponding to the area to be photographed at the second focal length.
  • the posture adjustment device is provided on a movable platform.
• the photographing device 1910 is further configured to store, in association, the first image and the second image corresponding to the area to be photographed that is captured at the second focal length under each shooting pose of the shooting pose sequence information.
• the photographing device 1910 is specifically configured to determine the first mapping relationship between the second image and the area to be processed, and determine the second mapping relationship between the area to be processed and the first image; and to store the first image, the area to be processed, the second image, the first mapping relationship, and the second mapping relationship.
  • the second image is an unprocessed captured image and stored in the photographing device.
  • the second image is a processed captured image and stored in the control terminal, and the resolution of the second image is smaller than the resolution of the unprocessed captured image.
  • control terminal further includes a display interface
• the control terminal 1920 is further configured to, if an operation instruction for viewing the second image is received, obtain the unprocessed captured image corresponding to the second image from the photographing device 1910, and display the unprocessed captured image corresponding to the second image through the display interface.
• the photographing device 1910 is also used to acquire a third image after controlling the shooting end to shoot the second image corresponding to the area to be photographed at the second focal length under each shooting pose of the shooting pose sequence.
  • the third image is synthesized based on the second image; and the first image and the third image are stored in association.
  • the photographing device 1910 includes a zoom photographing device, which is used to photograph the first image and the second image at the first focal length and the second focal length, respectively.
  • the photographing device 1910 includes a first photographing device and a second photographing device
  • the first photographing device is used to photograph a first image
  • the second photographing device is used to photograph a second image
  • the focal length of the first photographing device is at least partially different from the focal length of the second photographing device.
  • the focal length of at least one of the first camera and the second camera is adjustable.
  • the first camera is a wide-angle camera
  • the second camera is a telephoto camera
  • the second focal length includes any one of the following: a fixed focal length, a preset focal length, a focal length determined based on a focal length algorithm, a focal length determined based on a sensor, and a focal length selected by the user.
  • the photographing device 1910 includes a laser rangefinder, and the laser rangefinder is used to measure distance information to determine the second focal length based on the distance information.
  • the image processing system shown above is only exemplary and cannot be construed as a limitation of the present disclosure.
  • the operations implemented by the photographing device 1910 in the above-mentioned image processing system may be at least partially executed by the control terminal 1920.
  • the operations implemented by the photographing device 1910 in the above-mentioned image processing system can be at least partially performed by an unmanned aerial vehicle.
• the control terminal 1920 may obtain the first maximum angle of view area corresponding to the first focal length of the photographing device 1910 at the current attitude angle, and obtain the second maximum angle of view area corresponding to the second focal length; compare the first maximum angle of view area with the second maximum angle of view area; and determine the selectable range of the area to be processed in the first image according to the comparison result, so as to control the photographing device 1910 to obtain at least one second image of the area to be processed at the second focal length, where the first image is obtained at the first focal length and the second focal length is greater than the first focal length.
  • control terminal 1920 may be configured to determine that the selectable range of the area to be processed in the first image is any area in the first image if it is determined according to the comparison result that the second maximum angle of view area includes the first maximum angle of view area.
  • the photographing device 1910 described above may be any device with a photographing function, and may refer to a camera, a pan-tilt camera, a movable platform including a camera, and the like.
• each block in the flowchart or block diagram may represent a module, program segment, or part of the code, and the module, program segment, or part of the code contains one or more executable instructions for realizing the specified logic function. The functions marked in the blocks may also occur in an order different from the order marked in the drawings. For example, two blocks shown in succession may actually be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved.
• each block in the block diagram or flowchart, and the combination of blocks in the block diagram or flowchart, can be implemented by a dedicated hardware-based system that performs the specified function or operation, or by a combination of dedicated hardware and computer instructions.
  • FIG. 20 is a schematic structural diagram of a control terminal provided by an embodiment of the disclosure.
  • the control terminal 70 may include: at least one processor 701, such as a CPU, at least one network interface 704, a user interface 703, a memory 705, at least one communication bus 702, and a display screen 706.
  • the communication bus 702 is used to implement connection and communication between these components.
  • the user interface 703 may include a touch screen, a keyboard, a mouse, a joystick, and so on.
  • the network interface 704 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface), and a communication connection with the server and the camera 20 can be established through the network interface 704.
• the memory 705 may be a high-speed RAM or a non-volatile memory, such as at least one disk memory.
• the memory 705 may include a flash memory in the embodiment of the present invention.
• the memory 705 may also be at least one storage system located remotely from the foregoing processor 701.
  • the memory 705 as a computer storage medium may include an operating system, a network communication module, a user interface module, and program instructions.
• the network interface 704 can be connected to a receiver, a transmitter, or other communication modules.
  • Other communication modules can include but are not limited to a WiFi module, a Bluetooth module, etc.
• the control terminal 70 in the embodiment of the present application may also include a receiver, a transmitter, and other communication modules.
  • the processor 701 may be used to call program instructions stored in the memory 705 and perform the following operations:
  • the second focal length is greater than the first focal length.
• control terminal 70 in this embodiment may be implemented according to the method in the foregoing method embodiment, and details are not repeated here.
  • FIG. 21 is a schematic diagram of the structure of a photographing device provided by an embodiment of the disclosure.
  • the photographing device 80 may include: at least one processor 801, such as a CPU, at least one network interface 804, a zoom lens 803, a memory 805, a wide-angle lens 806, and at least one communication bus 802.
  • the communication bus 802 is used to implement connection and communication between these components.
• the network interface 804 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface), and a communication connection with the control terminal 70 can be established through the network interface 804.
• the memory 805 may be a high-speed RAM or a non-volatile memory, such as at least one disk memory.
• the memory 805 may include a flash memory in the embodiment of the present invention.
• the memory 805 may also be at least one storage system located remotely from the aforementioned processor 801.
  • the memory 805 as a computer storage medium may include an operating system, a network communication module, and program instructions.
• the network interface 804 can be connected to a receiver, a transmitter, or other communication modules.
  • Other communication modules can include but are not limited to a WiFi module, a Bluetooth module, etc.
• the photographing device 80 in the embodiment of the present invention may also include a receiver, a transmitter, and other communication modules.
  • the processor 801 may be used to call the program instructions stored in the memory 805 and perform the following operations:
  • the second focal length is greater than the first focal length.
• the embodiment of the present application also provides a computer-readable storage medium storing instructions; when the instructions run on a computer or a processor, the computer or the processor executes one or more steps of any of the above methods. If each component module of the above-mentioned signal processing device is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
• in the above embodiments, the methods may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
• when implemented by software, the methods may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
• the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
• the computer instructions can be sent from one website, computer, server, or data center to another website, computer, server, or data center via wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
• the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
• the aforementioned storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM), etc.
• the technical features in the above embodiments and implementations can be combined arbitrarily.

Abstract

Disclosed is an image processing method, comprising: obtaining a first maximum angle-of-view area corresponding to a first focal length at the current attitude angle, and obtaining a second maximum angle-of-view area corresponding to a second focal length; comparing the first maximum angle-of-view area with the second maximum angle-of-view area; and determining the selectable range of the area to be processed in a first image according to a comparison result, so as to obtain at least one second image of the area to be processed at the second focal length, the first image being obtained at the first focal length and the second focal length being greater than the first focal length. According to the present invention, the photographing range of a combined lens having multiple focal lengths can be determined at each of those focal lengths, so that a user can know the photographing range at a specified focal length, thereby improving the user experience.
PCT/CN2020/077217 2020-02-28 2020-02-28 Image processing method, image processing apparatus, and image processing program WO2021168804A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/077217 WO2021168804A1 (fr) 2020-02-28 2020-02-28 Image processing method, image processing apparatus, and image processing program
CN202080004280.7A CN112514366A (zh) 2020-02-28 2020-02-28 Image processing method, image processing apparatus, and image processing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/077217 WO2021168804A1 (fr) 2020-02-28 2020-02-28 Image processing method, image processing apparatus, and image processing program

Publications (1)

Publication Number Publication Date
WO2021168804A1 true WO2021168804A1 (fr) 2021-09-02

Family

ID=74953160

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077217 WO2021168804A1 (fr) 2020-02-28 2020-02-28 Image processing method, image processing apparatus, and image processing program

Country Status (2)

Country Link
CN (1) CN112514366A (fr)
WO (1) WO2021168804A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115499565A (zh) * 2022-08-23 2022-12-20 DDPai (Shenzhen) Technology Co., Ltd. Dual-lens-based image acquisition method, apparatus, medium, and dashboard camera

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113688824B (zh) * 2021-09-10 2024-02-27 Fujian Huichuan Internet of Things Technology Co., Ltd. Information acquisition method and apparatus for construction nodes, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106791376A (zh) * 2016-11-29 2017-05-31 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Imaging device, control method, control device and electronic device
CN107277371A (zh) * 2017-07-27 2017-10-20 Qingdao Hisense Mobile Communications Technology Co., Ltd. Method and device for magnifying a picture area on a mobile terminal
WO2019147024A1 (fr) * 2018-01-23 2019-08-01 Gwangju Institute of Science and Technology Object detection method using two cameras with different focal lengths, and apparatus therefor
CN110445978A (zh) * 2019-06-24 2019-11-12 Huawei Technologies Co., Ltd. Photographing method and device
CN110493526A (zh) * 2019-09-10 2019-11-22 Beijing Xiaomi Mobile Software Co., Ltd. Image processing method, apparatus, device and medium based on multiple camera modules

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4348261B2 (ja) * 2004-08-31 2009-10-21 Hoya Corporation Trimming imaging apparatus
US11927874B2 (en) * 2014-07-01 2024-03-12 Apple Inc. Mobile camera system
CN105809618A (zh) * 2014-12-31 2016-07-27 Huawei Device (Dongguan) Co., Ltd. Picture processing method and apparatus
CN106254780A (zh) * 2016-08-31 2016-12-21 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Dual-camera photographing control method, photographing control apparatus, and terminal
CN110099213A (zh) * 2019-04-26 2019-08-06 Vivo Mobile Communication (Hangzhou) Co., Ltd. Image display control method and terminal
CN110602401A (zh) * 2019-09-17 2019-12-20 Vivo Mobile Communication Co., Ltd. Photographing method and terminal
CN110781879B (zh) * 2019-10-31 2023-04-28 Guangdong Genius Technology Co., Ltd. Point-reading target recognition method, system, storage medium, and electronic device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115499565A (zh) * 2022-08-23 2022-12-20 DDPai (Shenzhen) Technology Co., Ltd. Dual-lens-based image acquisition method, apparatus, medium, and dashboard camera
CN115499565B (zh) * 2022-08-23 2024-02-20 DDPai (Shenzhen) Technology Co., Ltd. Dual-lens-based image acquisition method, apparatus, medium, and dashboard camera

Also Published As

Publication number Publication date
CN112514366A (zh) 2021-03-16

Similar Documents

Publication Publication Date Title
JP6532217B2 (ja) Image processing apparatus, image processing method, and image processing system
KR101530255B1 (ko) CCTV system equipped with an automatic object tracking apparatus
CN113329182A (zh) Image processing method, unmanned aerial vehicle, and system
WO2018205104A1 (fr) Unmanned aerial vehicle photographing control method, unmanned aerial vehicle photographing method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle
JP6716015B2 (ja) Imaging control apparatus, imaging system, and imaging control method
WO2021168804A1 (fr) Image processing method, image processing apparatus, and image processing program
JP2020053774A (ja) Imaging apparatus and image recording method
JP2016212784A (ja) Image processing apparatus and image processing method
JP7371076B2 (ja) Information processing apparatus, information processing system, information processing method, and program
TWI696147B (zh) Panorama forming method and system
JP2010092092A (ja) Image processing apparatus and image processing method
JP6677980B2 (ja) Panoramic video data processing apparatus, processing method, and processing program
JP6483661B2 (ja) Imaging control apparatus, imaging control method, and program
JP2016111561A (ja) Information processing apparatus, system, information processing method, and program
CN110930303A (zh) Panorama forming method and system
JP6700706B2 (ja) Information processing apparatus, information processing method, and program
WO2022041013A1 (fr) Control method, handheld gimbal, system, and computer-readable storage medium
US11790483B2 Method, apparatus, and device for identifying human body and computer readable storage medium
JP2018092507A (ja) Image processing apparatus, image processing method, and program
CN110647792A (zh) Information processing device, control method, and storage medium
WO2024018973A1 (fr) Information processing method, information processing device, and information processing program
JP2020009099A (ja) Image processing apparatus, image processing method, and program
KR102497593B1 (ko) Information processing apparatus, information processing method, and storage medium
JP6852878B2 (ja) Image processing apparatus, image processing program, and image processing method
JP2017010555A (ja) Image processing method and image processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20921454

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20921454

Country of ref document: EP

Kind code of ref document: A1