WO2022126375A1 - Zoom method, imaging device, gimbal and movable platform - Google Patents

Zoom method, imaging device, gimbal and movable platform

Info

Publication number
WO2022126375A1
WO2022126375A1 PCT/CN2020/136515 CN2020136515W WO2022126375A1 WO 2022126375 A1 WO2022126375 A1 WO 2022126375A1 CN 2020136515 W CN2020136515 W CN 2020136515W WO 2022126375 A1 WO2022126375 A1 WO 2022126375A1
Authority
WO
WIPO (PCT)
Prior art keywords
zoom
imaging
target
imaging devices
instruction
Prior art date
Application number
PCT/CN2020/136515
Other languages
English (en)
Chinese (zh)
Inventor
方恒彬
温亚停
司佩潞
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/136515
Publication of WO2022126375A1


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control

Definitions

  • the invention belongs to the field of imaging technology, and in particular relates to a zoom method, an imaging device, a pan-tilt and a movable platform.
  • Imaging devices are suitable for different application scenarios and can meet the needs of different scenarios.
  • different types of imaging devices are usually used in combination. For example, combining visible light cameras and infrared cameras, mounting them on drones, and applying them to fire protection, power inspection, photovoltaic inspection, security and other industries can greatly improve operational efficiency.
  • each imaging device is usually controlled independently, which makes it impossible to uniformly control multiple imaging devices.
  • the present invention provides a zoom method, an imaging device, a pan-tilt head and a movable platform, so as to solve the problem that the zoom cannot be controlled uniformly in the application scene of multiple imaging devices.
  • an embodiment of the present invention provides a zoom method, which is applied to an electronic device including a plurality of imaging devices, and the method includes:
  • acquiring a zoom instruction, wherein the zoom instruction includes zoom parameters for adjusting the focal lengths of the plurality of imaging devices;
  • adjusting the focal lengths of the plurality of imaging devices according to the zoom parameters in the zoom instruction.
  • the plurality of imaging devices perform imaging according to light in different wavelength ranges.
  • the plurality of imaging devices include at least two of the following: visible light cameras, infrared cameras, and ultraviolet cameras.
  • before the step of acquiring the zoom instruction, the method further includes:
  • receiving a first input to a zoom control, wherein the zoom control is used to adjust the focal lengths of at least two of the plurality of imaging devices;
  • generating the zoom instruction according to the first input.
  • generating the zoom instruction according to the first input includes:
  • obtaining the zoom parameter according to the first input;
  • generating the zoom instruction according to the zoom parameter.
  • obtaining the zoom parameter according to the first input includes:
  • acquiring the zoom parameter according to the numerical value currently indicated by the zoom control, wherein the currently indicated numerical value is the numerical value obtained through the first input.
  • the zoom controls include virtual controls or physical controls.
  • before the step of receiving the first input to the zoom control, the method further includes:
  • receiving a second input to a first target control, and switching the zoom mode of the electronic device from a first zoom mode to a second zoom mode according to the second input;
  • wherein, in the first zoom mode, the zoom control can adjust the focal length of one of the at least two imaging devices, and in the second zoom mode, the zoom control can adjust the focal lengths of the at least two imaging devices.
  • the first target control includes a virtual control or an entity control.
  • the method further includes:
  • receiving a third input to the first target control, and switching the zoom mode of the electronic device from the second zoom mode to the first zoom mode according to the third input.
  • the method further includes:
  • receiving a fourth input to a second target control, and switching the focus adjustment object of the zoom control between the at least two imaging devices according to the fourth input.
  • the second target control includes a virtual control or an entity control.
  • the method further includes:
  • in the process of switching the zoom mode of the electronic device from the first zoom mode to the second zoom mode, adjusting the focal lengths of the at least two imaging devices to the same focal length value.
  • before the step of acquiring the zoom instruction, the method further includes:
  • acquiring a tracking instruction, wherein the tracking instruction is used to make at least two imaging devices among the plurality of imaging devices track and focus on the same target object.
  • the method further includes:
  • in the process of the at least two imaging devices tracking and focusing on the same target object, acquiring the imaging ratio of the target object in the imaging pictures of the at least two imaging devices;
  • generating the zoom instruction according to the imaging ratio.
  • the acquiring the imaging ratio of the target object in the imaging pictures of the at least two imaging devices includes:
  • respectively acquiring the ratio value of the image of the target object in a first picture and the ratio value of the image of the target object in a second picture, wherein the at least two imaging devices include a first imaging device and a second imaging device, the first picture is a picture acquired by the first imaging device, and the second picture is a picture acquired by the second imaging device.
  • generating the zoom instruction according to the imaging ratio includes:
  • in the case where the ratio value of the image of the target object in a first target picture is not within a preset value range, obtaining the zoom parameter, wherein the focal length value in the zoom parameter is used to adjust the ratio value of the image of the target object in the first target picture to be within the preset value range, and the first target picture is the first picture and/or the second picture;
  • generating the zoom instruction according to the zoom parameter.
  • the method further includes:
  • in the process of the multiple imaging devices tracking and focusing on the same target object, acquiring a field of view parameter, where the field of view parameter is used to make the fields of view of the multiple imaging devices be in a preset ratio;
  • calculating the zoom parameter corresponding to the field of view parameter;
  • generating the zoom instruction according to the zoom parameter.
  • the angle of view of the plurality of imaging devices being in a preset ratio includes: the angle of view of at least two imaging devices of the plurality of imaging devices is the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same, including:
  • the imaging position of the target object on the imaging pictures of the at least two imaging devices is the exact center of the imaging pictures.
  • before the step of acquiring the tracking instruction, the method further includes:
  • acquiring the area of the image of the target object in a second target picture, wherein the second target picture is an imaging picture of a first imaging device among the at least two imaging devices;
  • determining the area of the image of the target object in each of the remaining target pictures according to the correlation between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture, wherein the remaining target pictures are imaging pictures of imaging devices other than the first imaging device among the at least two imaging devices;
  • generating a tracking instruction for tracking and focusing the target object in the second target picture and in each of the remaining target pictures.
  • the acquiring the area of the image of the target object in the second target picture includes:
  • a selection operation on a first area in the second target picture is received; wherein, the first area is an area including an image of the target object.
  • determining the area of the image of the target object in each remaining target picture according to the correlation between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture includes:
  • determining, in each of the remaining target pictures, a second area that includes the image of the target object according to the coordinate information of the first area in the second target picture and a preset coordinate mapping relationship between the second target picture and each of the remaining target pictures.
  • the acquiring the area of the image of the target object in the second target picture includes:
  • determining a target object that meets a preset condition and the area of its image in the second target picture, wherein meeting the preset condition means having the highest temperature and/or the lowest temperature;
  • optionally, meeting the preset condition means that the temperature is greater than a first temperature threshold and/or the temperature is less than a second temperature threshold, wherein the first temperature threshold is greater than the second temperature threshold.
  • the adjusting the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction includes:
  • the focal lengths of the plurality of imaging devices are changed according to a preset ratio.
  • changing the focal lengths of the multiple imaging devices according to a preset ratio includes:
  • determining the target value corresponding to each imaging device according to the zoom value indicated by the zoom parameter and the preset ratio corresponding to each imaging device among the plurality of imaging devices, or obtaining the target value directly indicated by the zoom parameter, wherein the ratio between the zoom value and the target value corresponding to an imaging device is equal to the preset ratio corresponding to that imaging device;
  • adjusting the focal lengths of the plurality of imaging devices to their respective corresponding target values.
  • the adjusting the focal lengths of the plurality of imaging devices includes:
  • the focal lengths of a plurality of imaging devices are changed in tandem.
  • the step of causing the focal lengths of the multiple imaging devices to change in a coordinated manner includes:
  • changing the focal lengths of the plurality of imaging devices simultaneously according to a preset ratio, or changing the focal lengths of the plurality of imaging devices sequentially according to the preset ratio.
  • an embodiment of the present invention provides an imaging device, which is applied to an electronic device including multiple imaging devices, where the imaging device includes a computer-readable storage medium and a processor; the processor is configured to perform the following operations:
  • the zoom instruction includes zoom parameters for adjusting the focal lengths of the plurality of imaging devices
  • the focal lengths of the plurality of imaging devices are adjusted according to the zoom parameters in the zoom instruction.
  • the plurality of imaging devices perform imaging according to light in different wavelength ranges.
  • the plurality of imaging devices include at least two of the following: visible light cameras, infrared cameras, and ultraviolet cameras.
  • the processor is further configured to perform the following operations:
  • receiving a first input to a zoom control, wherein the zoom control is used to adjust the focal lengths of at least two of the plurality of imaging devices;
  • generating the zoom instruction according to the first input.
  • generating the zoom instruction according to the first input includes:
  • obtaining the zoom parameter according to the first input;
  • generating the zoom instruction according to the zoom parameter.
  • obtaining the zoom parameter according to the first input includes:
  • acquiring the zoom parameter according to the numerical value currently indicated by the zoom control, wherein the currently indicated numerical value is the numerical value obtained through the first input.
  • the zoom controls include virtual controls or physical controls.
  • before the step of receiving the first input to the zoom control, the processor is further configured to perform the following operations:
  • receiving a second input to a first target control, and switching the zoom mode of the electronic device from a first zoom mode to a second zoom mode according to the second input;
  • wherein, in the first zoom mode, the zoom control can adjust the focal length of one of the at least two imaging devices, and in the second zoom mode, the zoom control can adjust the focal lengths of the at least two imaging devices.
  • the first target control includes a virtual control or an entity control.
  • the processor is further configured to perform the following operations:
  • receiving a third input to the first target control, and switching the zoom mode of the electronic device from the second zoom mode to the first zoom mode according to the third input.
  • the processor is further configured to perform the following operations:
  • receiving a fourth input to a second target control, and switching the focus adjustment object of the zoom control between the at least two imaging devices according to the fourth input.
  • the second target control includes a virtual control or an entity control.
  • the processor is further configured to perform the following operations:
  • in the process of switching the zoom mode of the electronic device from the first zoom mode to the second zoom mode, adjusting the focal lengths of the at least two imaging devices to the same focal length value.
  • before acquiring the zoom instruction, the processor is further configured to perform the following operations:
  • acquiring a tracking instruction, wherein the tracking instruction is used to make at least two imaging devices among the plurality of imaging devices track and focus on the same target object.
  • the processor is further configured to perform the following operations:
  • in the process of the at least two imaging devices tracking and focusing on the same target object, acquiring the imaging ratio of the target object in the imaging pictures of the at least two imaging devices;
  • generating the zoom instruction according to the imaging ratio.
  • the acquiring the imaging ratio of the target object in the imaging pictures of the at least two imaging devices includes:
  • respectively acquiring the ratio value of the image of the target object in a first picture and the ratio value of the image of the target object in a second picture, wherein the at least two imaging devices include a first imaging device and a second imaging device, the first picture is a picture acquired by the first imaging device, and the second picture is a picture acquired by the second imaging device.
  • generating the zoom instruction according to the imaging ratio includes:
  • in the case where the ratio value of the image of the target object in a first target picture is not within a preset value range, obtaining the zoom parameter, wherein the focal length value in the zoom parameter is used to adjust the ratio value of the image of the target object in the first target picture to be within the preset value range, and the first target picture is the first picture and/or the second picture;
  • generating the zoom instruction according to the zoom parameter.
  • the processor is further configured to perform the following operations:
  • in the process of the multiple imaging devices tracking and focusing on the same target object, acquiring a field of view parameter, where the field of view parameter is used to make the fields of view of the multiple imaging devices be in a preset ratio;
  • calculating the zoom parameter corresponding to the field of view parameter;
  • generating the zoom instruction according to the zoom parameter.
  • the angle of view of the plurality of imaging devices being in a preset ratio includes: the angle of view of at least two imaging devices of the plurality of imaging devices is the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same, including:
  • the imaging position of the target object on the imaging pictures of the at least two imaging devices is the exact center of the imaging pictures.
  • the processor is further configured to perform the following operations:
  • acquiring the area of the image of the target object in a second target picture, wherein the second target picture is an imaging picture of a first imaging device among the at least two imaging devices;
  • determining the area of the image of the target object in each of the remaining target pictures according to the correlation between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture, wherein the remaining target pictures are imaging pictures of imaging devices other than the first imaging device among the at least two imaging devices;
  • generating a tracking instruction for tracking and focusing the target object in the second target picture and in each of the remaining target pictures.
  • the acquiring the area of the image of the target object in the second target picture includes:
  • a selection operation on a first area in the second target picture is received; wherein, the first area is an area including an image of the target object.
  • determining the area of the image of the target object in each remaining target picture according to the correlation between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture includes:
  • determining, in each of the remaining target pictures, a second area that includes the image of the target object according to the coordinate information of the first area in the second target picture and a preset coordinate mapping relationship between the second target picture and each of the remaining target pictures.
  • the acquiring the area of the image of the target object in the second target picture includes:
  • determining a target object that meets a preset condition and the area of its image in the second target picture, wherein meeting the preset condition means having the highest temperature and/or the lowest temperature;
  • optionally, meeting the preset condition means that the temperature is greater than a first temperature threshold and/or the temperature is less than a second temperature threshold, wherein the first temperature threshold is greater than the second temperature threshold.
  • the adjusting the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction includes:
  • the focal lengths of the plurality of imaging devices are changed according to a preset ratio.
  • changing the focal lengths of the multiple imaging devices according to a preset ratio includes:
  • determining the target value corresponding to each imaging device according to the zoom value indicated by the zoom parameter and the preset ratio corresponding to each imaging device among the plurality of imaging devices, or obtaining the target value directly indicated by the zoom parameter, wherein the ratio between the zoom value and the target value corresponding to an imaging device is equal to the preset ratio corresponding to that imaging device;
  • adjusting the focal lengths of the plurality of imaging devices to their respective corresponding target values.
  • the adjusting the focal lengths of the plurality of imaging devices includes:
  • the focal lengths of a plurality of imaging devices are changed in tandem.
  • the step of causing the focal lengths of the multiple imaging devices to change in a coordinated manner includes:
  • changing the focal lengths of the plurality of imaging devices simultaneously according to a preset ratio, or changing the focal lengths of the plurality of imaging devices sequentially according to the preset ratio.
  • an embodiment of the present invention provides a pan/tilt head, including an axial movement device, and the imaging device according to the second aspect whose orientation is controlled by the axial movement device.
  • an embodiment of the present invention provides a movable platform, which is characterized by comprising a body, and the imaging device according to the second aspect mounted on the body.
  • an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the following operations are implemented:
  • the zoom instruction includes zoom parameters for adjusting the focal lengths of multiple imaging devices in the electronic device;
  • the focal lengths of the plurality of imaging devices are adjusted according to the zoom parameters in the zoom instruction.
  • the focal lengths of multiple imaging devices can be adjusted according to the zoom parameters in the zoom instruction, so as to realize unified zooming of the multiple imaging devices, thereby reducing the operational difficulty of zooming the multiple imaging devices, improving the response speed, and improving the work efficiency of subsequent steps such as multi-camera follow focus, multi-camera linked zoom, intelligent monitoring, and intelligent tracking.
  • FIG. 1 is a flowchart of steps of a zooming method provided by an embodiment of the present invention.
  • FIG. 2 is one of the schematic diagrams showing the display screen of the electronic device in the embodiment of the present invention.
  • FIG. 3 is the second schematic diagram showing the display screen of the electronic device in the embodiment of the present invention.
  • FIG. 4 is the third schematic diagram showing the display screen of the electronic device in the embodiment of the present invention.
  • FIG. 5 is a schematic diagram showing a zoom control for performing optical zooming in an embodiment of the present invention.
  • FIG. 6 is a schematic diagram showing a zoom control for digital zooming provided by an embodiment of the present invention.
  • FIG. 7 is the fourth schematic diagram showing the display screen of the electronic device in the embodiment of the present invention.
  • FIG. 8 is a block diagram of an imaging device provided by an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of a hardware structure of a device implementing various embodiments of the present invention.
  • FIG. 10 is a block diagram of a computing processing device provided by an embodiment of the present invention.
  • FIG. 11 is a block diagram of a portable or fixed storage unit according to an embodiment of the present invention.
  • FIG. 1 is a flowchart of the steps of a zoom method provided by an embodiment of the present invention. As shown in FIG. 1, the method is applied to an electronic device including multiple imaging devices and includes the following steps:
  • Step 101: Acquire a zoom instruction.
  • the zoom instruction includes zoom parameters for adjusting the focal lengths of the multiple imaging devices.
  • the zoom parameter may be a focal length value, that is, a corresponding focal length value after the imaging device is zoomed.
  • the zoom parameter may also be other parameters associated with the focal length value, the other parameters have a fixed relationship with the focal length value, and the corresponding focal length value can be calculated according to the other parameters through a preset calculation method.
  • Step 102: Adjust the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction.
  • the same zoom instruction can adjust the focal lengths of multiple imaging devices.
  • an imaging device is a device that performs imaging according to light, and imaging devices can be classified based on the wavelength range of the light on which the imaging is based: imaging devices of the same type image based on light in the same wavelength range, while imaging devices of different types image based on light in different wavelength ranges.
  • the multiple imaging devices in the electronic device may be the same type of imaging device, or may be different types of imaging devices.
  • the imaging device may be a camera, a mobile phone, or the like.
  • the focal lengths of at least two of the plurality of imaging devices can be adjusted according to the zoom parameter.
  • Adjusting the focal length of the imaging device means adjusting the focal length of the lens in the imaging device, which is often referred to as optical zoom.
  • adjusting the focal length of the imaging device can also be understood as adjusting the parameters associated with the digital zoom in the imaging device, so as to achieve a zoom effect.
  • in this way, the focal lengths of multiple imaging devices can be adjusted according to the zoom parameters in the zoom instruction, so as to realize unified zooming of the multiple imaging devices, thereby reducing the operational difficulty of zooming the multiple imaging devices, improving the response speed, and improving the work efficiency of subsequent steps such as multi-camera follow focus, multi-camera linked zoom, intelligent monitoring, and intelligent tracking.
  • a plurality of imaging devices perform imaging according to light of different wavelength ranges.
  • light can be classified into visible light, infrared light, ultraviolet light, etc. according to different wavelength ranges
  • imaging devices that perform imaging according to light in different wavelength ranges can be applied to different scenarios.
  • the multiple imaging devices in the embodiments of the present invention perform imaging according to light in different wavelength ranges.
  • the plurality of imaging devices include at least two of the following: visible light cameras, infrared cameras, and ultraviolet cameras.
  • the cooperation of visible light cameras and infrared cameras can greatly improve operational efficiency in fire protection, power inspection, photovoltaic inspection, security and other industries.
  • the visible light camera is used to obtain high-definition images while the ultraviolet camera is used to obtain images under ultraviolet light, so as to achieve the scene requirements that require both the visible light camera and the ultraviolet camera for operation.
  • the infrared camera is used for infrared imaging
  • the ultraviolet camera is used to obtain images under ultraviolet radiation, so as to meet the scene requirements that require both the infrared camera and the ultraviolet camera for operation.
  • multiple imaging devices can also be understood as a single device including multiple imaging modules. For example, the multiple imaging devices may include a visible light camera and an infrared camera, or may be a dual-light camera provided with two imaging modules, one imaging module using visible light imaging and the other imaging module using infrared light imaging.
  • the wavelength range on which a visible light camera images is usually a certain sub-range within the visible light wavelength range; that is to say, different visible light cameras may image based on different sub-ranges of the visible light wavelength range.
  • For example, the visible light wavelength range is 390 nanometers to 780 nanometers.
  • The first imaging device among the plurality of imaging devices may be a visible light camera that images based on visible light of 390 nanometers to 600 nanometers, and the second imaging device may be a visible light camera that images based on visible light of 500 nanometers to 700 nanometers.
  • multiple imaging devices perform imaging according to light in different wavelength ranges, so that they can be applied to more application scenarios.
  • the multiple imaging images collected by the multiple imaging devices may be images displayed on the display screen of the electronic device, and the imaging images of different imaging devices may be displayed on different display screens;
  • a display screen is divided into multiple display areas, and each display area displays an imaging image of an imaging device; the above method can ensure that the details of different imaging images are presented separately.
  • the same-screen mode can also be used to display multiple imaging pictures on the same display screen with at least part of the pictures overlapping, for example, with the imaging centers of the multiple imaging pictures placed at the same point on the display screen. This makes it easy to intuitively compare the differences between different imaging pictures, and, when the field angles of the different imaging pictures are at least partially aligned, to quickly locate the same point in different imaging pictures.
  • before the step of acquiring the zoom instruction, the method further includes:
  • a first input to the zoom control is received.
  • the zoom control is used to adjust the focal length of at least two imaging devices in the plurality of imaging devices. That is, zooming of at least two imaging devices on the electronic device can be achieved through the zoom control.
  • the zoom controls include virtual controls or physical controls. That is, the virtual control displayed in the display screen of the electronic device can be used as a zoom control, as shown in FIG.
  • the zoom control can also use physical controls on the electronic device, for example, a joystick, a key, a sliding key, etc., as the zoom control.
  • the first input here may be input such as click, slide, long press, toggle, and open.
  • the first input may also include a preset instruction, such as an input instruction that can be triggered when a specific condition is met, and the specific condition includes a specific time (segment), a specific location, and the data sensed by the sensor reaches a threshold.
  • a zoom instruction is generated.
  • a zoom instruction may be generated through the user's manipulation of the zoom control, so that unified zooming of at least two imaging devices among the plurality of imaging devices can be implemented according to the zoom instruction.
  • generating a zoom instruction according to the first input includes:
  • the zoom parameter is obtained.
  • zoom parameters need to be used.
  • the zoom parameter can be acquired through a first input to the zoom control, and the zoom parameter is associated with the first input.
  • a zoom command is generated.
  • the zoom instruction may be an instruction in a preset format, and the zoom instruction is generated according to the zoom parameter, that is, an instruction in a preset format carrying the zoom parameter is generated.
  • in the embodiment of the present invention, the zoom parameters required in the zooming process of the imaging devices are obtained first, and then the zoom instruction is generated according to the zoom parameters.
  • obtaining the zoom parameter according to the first input includes:
  • the currently indicated numerical value is the numerical value obtained by the first input.
  • the zoom control can correspond to multiple numerical values, and the current corresponding numerical value can be switched through the first input. Among them, the current corresponding value is the currently indicated value.
  • the zoom control can be a fixed-length virtual control with a scale, and is provided with a slidable slider. The user slides the slider through the first input, and the scale value at the position of the slider is the value currently indicated by the zoom control.
  • the zoom control can also be a physical button, the user will adjust the value currently indicated by the physical button by first inputting the physical button.
  • in this way, the zoom parameter can be determined conveniently and quickly according to the value currently indicated by the zoom control.
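  • As a rough illustration of the idea only, the sketch below maps a slider-style zoom control's currently indicated value to a zoom parameter and packages it into a zoom instruction; the names ZoomInstruction, zoom_parameter_from_control and build_zoom_instruction, the 0-1 slider range and the 4-48 mm focal range are all assumptions made for this sketch, not details taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class ZoomInstruction:
    """Hypothetical container for a zoom instruction carrying one zoom parameter."""
    zoom_parameter: float  # e.g. a target focal length value, or a value tied to it


def zoom_parameter_from_control(slider_value: float,
                                min_focal_mm: float = 4.0,
                                max_focal_mm: float = 48.0) -> float:
    """Map the value currently indicated by a slider-style zoom control (0.0-1.0)
    to a focal length value; the 4-48 mm range is only an assumed example."""
    slider_value = max(0.0, min(1.0, slider_value))
    return min_focal_mm + slider_value * (max_focal_mm - min_focal_mm)


def build_zoom_instruction(slider_value: float) -> ZoomInstruction:
    """Generate the zoom instruction from the first input to the zoom control."""
    return ZoomInstruction(zoom_parameter=zoom_parameter_from_control(slider_value))


print(build_zoom_instruction(0.5))  # ZoomInstruction(zoom_parameter=26.0)
```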
  • before the step of receiving the first input to the zoom control, the method further includes:
  • a second input to the first target control is received.
  • the first target control includes a virtual control or an entity control. That is, the virtual control displayed on the display screen of the electronic device can be used as the first target control.
  • a physical control on the electronic device may also be used as the first target control; for example, a joystick, a button, a sliding key, etc., may be used as the first target control.
  • the second input here can be input such as click, slide, long press, toggle, and open.
  • the second input may also include a preset instruction, for example, an input instruction that can be triggered when a specific condition is met, and the specific condition includes a specific time (segment), a specific location, the data sensed by the sensor reaches a threshold, and the like.
  • the zoom mode of the electronic device is switched from the first zoom mode to the second zoom mode.
  • the first target control can realize switching of the zoom mode of the electronic device.
  • the electronic device includes at least two zoom modes.
  • in the first zoom mode, the zoom control can adjust the focal length of one of the at least two imaging devices, that is, realize zooming of one imaging device; in the second zoom mode, the zoom control can adjust the focal lengths of the at least two imaging devices, that is, realize unified zooming of the at least two imaging devices.
  • prompt information for prompting the current zoom mode is displayed on the electronic device, and different zoom modes can be indicated by different patterns displayed on the first target control. For example, a lock pattern is displayed on the first target control:
  • in the first zoom mode, the lock pattern is shown in an unlocked state, and in the second zoom mode, the lock pattern is shown in a locked state, but this is not limited thereto.
  • Referring to FIG. 3, which is a schematic diagram of a display screen of an electronic device, the display screen includes a first display area 31 for displaying the imaging picture of a first imaging device among the at least two imaging devices, a second display area 32 for displaying the imaging picture of a second imaging device among the at least two imaging devices, a zoom control 33, and a first target control 34. The zoom mode of the electronic device is switched through an input to the first target control 34.
  • In FIG. 3, the first target control 34 displays a locked pattern, indicating that the current zoom mode is the second zoom mode.
  • the user can switch the zoom mode through the first target control, and switch the zoom mode of the electronic device to the second zoom mode that can uniformly zoom at least two imaging devices, so as to meet user needs.
  • the method further includes:
  • a third input to the first target control is received.
  • the third input here can be input such as click, slide, long press, toggle, and open.
  • the third input may also include a preset instruction, for example, an input instruction that can be triggered when a specific condition is met, and the specific condition includes a specific time (segment), a specific location, the data sensed by the sensor reaches a threshold, and the like.
  • the zoom mode of the electronic device is switched from the second zoom mode to the first zoom mode.
  • the zoom control in the first zoom mode, can adjust the focal length of one of the at least two imaging devices, that is, realize zooming of one imaging device.
  • the same number of zoom controls as the imaging devices are displayed, and each zoom control is used to zoom one imaging device, but is not limited thereto.
  • the user can switch the zoom mode through the first target control, and switch the zoom mode of the electronic device to the first zoom mode that can zoom an imaging device, so as to meet the user's needs.
  • the method further includes:
  • a fourth input to the second target control is received.
  • the second target control includes a virtual control or an entity control. That is, the virtual control displayed on the display screen of the electronic device can be used as the second target control.
  • a physical control on the electronic device may also be used as the second target control, for example, a joystick, a button, a sliding key, etc., may be used as the second target control.
  • the fourth input here can be input such as click, slide, long press, toggle, and open.
  • the fourth input may also include a preset instruction, for example, an input instruction that can be triggered when a specific condition is met, and the specific condition includes a specific time (segment), a specific location, the data sensed by the sensor reaches a threshold, and the like. According to the fourth input, the focus adjustment object of the zoom control is switched between the at least two imaging devices.
  • Referring to FIG. 4, which is a schematic diagram of a display screen of an electronic device, the display screen includes a first display area 41 for displaying the imaging picture of the first imaging device among the at least two imaging devices, a second display area 42 for displaying the imaging picture of the second imaging device among the at least two imaging devices, a zoom control 43, and a second target control 44. The focus adjustment object of the zoom control 43 is switched through an input to the second target control 44.
  • different logos may be displayed on the second target control 44 to indicate the current focus adjustment object of the zoom control 43 .
  • For example, IR/VIS is displayed on the second target control 44.
  • When IR is in a highlighted state, or when only IR is displayed on the second target control 44, the current focus adjustment object of the zoom control 43 is the infrared camera.
  • When VIS is in a highlighted state, or when only VIS is displayed on the second target control 44, the current focus adjustment object of the zoom control 43 is the visible light camera.
  • in this way, only one zoom control needs to be provided to zoom each imaging device, avoiding the clutter of many zoom controls. Especially on a small screen, too many controls increase the complexity of use; in addition, when operating outdoors in strong light or in emergency situations, a simple interface is even more necessary to reduce the possibility of misoperation.
  • the method further includes:
  • the focal lengths of the at least two imaging devices are adjusted to the same focal length value.
  • the same focal length value may be the current focal length value of any one of the at least two imaging devices.
  • it can also be a preset fixed focal length value.
  • when the zoom mode of the electronic device is switched to the second zoom mode, the focal lengths of the at least two imaging devices are first synchronized and then unified zooming is performed, which better ensures the consistency of imaging between the imaging devices, reduces the difficulty of using this zoom solution, and improves the work efficiency of subsequent steps such as multi-camera follow focus, multi-camera linked zoom, intelligent monitoring, and intelligent tracking.
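  • The sketch below illustrates this mode switch with focal-length synchronization followed by unified zooming; the class, method and device names are assumptions for illustration, not the patent's implementation.

```python
from typing import Dict, Optional


class LinkedZoomController:
    """Illustrative sketch of zoom-mode switching with focal-length
    synchronization; names are assumptions, not taken from the patent."""

    def __init__(self, focal_lengths_mm: Dict[str, float]):
        # current focal length of each imaging device, keyed by a device name
        self.focal_lengths = dict(focal_lengths_mm)
        self.linked_mode = False  # False: first zoom mode, True: second zoom mode

    def switch_to_linked_mode(self, reference_device: str) -> None:
        """Enter the second zoom mode: first synchronize every device to the
        same focal length (here, the current value of a chosen reference device)."""
        sync_value = self.focal_lengths[reference_device]
        for name in self.focal_lengths:
            self.focal_lengths[name] = sync_value
        self.linked_mode = True

    def apply_zoom(self, target_focal_mm: float,
                   device: Optional[str] = None) -> None:
        """In the second zoom mode one zoom parameter drives every device;
        in the first zoom mode only the selected device is adjusted."""
        if self.linked_mode:
            for name in self.focal_lengths:
                self.focal_lengths[name] = target_focal_mm
        elif device is not None:
            self.focal_lengths[device] = target_focal_mm


ctrl = LinkedZoomController({"ir": 10.0, "vis": 24.0})
ctrl.switch_to_linked_mode("vis")  # both devices are first set to 24.0 mm
ctrl.apply_zoom(30.0)              # unified zoom: both devices go to 30.0 mm
```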
  • in the case where the zoom control includes a virtual control displayed in the display screen of the electronic device, the method further includes:
  • when it is detected that the zoom control is not triggered within a preset time period, adjusting the transparency of the zoom control to a first target value.
  • the first target value is a preset fixed value, which can be set according to requirements.
  • the first target value is less than the transparency of all controls displayed in the display screen.
  • the preset time period can be set as required; preferably, it is a duration within one minute.
  • the first target value is smaller than the second target value.
  • the second target value may be equal to the transparency of other controls displayed in the display screen.
  • in this way, the zoom control is displayed with different transparency when it is in use and when it has not been used within the preset time period, so as to reduce the visual obstruction caused by the zoom control when it is not used for a long time.
  • the purpose is to better ensure the cleanliness of the zoom interface, and to ensure that the user's attention is focused on the imaging part.
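  • A minimal sketch of this idle-fade behavior is given below. The 10-second timeout and the two alpha values are assumed examples, and treating the stated "transparency" values as display alpha is an interpretation for illustration only.

```python
import time

IDLE_TIMEOUT_S = 10.0      # assumed "preset time period" (within one minute)
FIRST_TARGET_ALPHA = 0.3   # value used while idle (first target value)
SECOND_TARGET_ALPHA = 1.0  # value used once triggered (second target value)


class FadingZoomControl:
    """Illustrative sketch: the control fades after being idle for the preset
    duration and is restored to the second target value when triggered again."""

    def __init__(self):
        self.alpha = SECOND_TARGET_ALPHA
        self.last_triggered = time.monotonic()

    def on_trigger(self):
        # user touches the zoom control: restore it and reset the idle timer
        self.alpha = SECOND_TARGET_ALPHA
        self.last_triggered = time.monotonic()

    def on_ui_tick(self):
        # called periodically by the UI loop; fade the control when idle
        if time.monotonic() - self.last_triggered > IDLE_TIMEOUT_S:
            self.alpha = FIRST_TARGET_ALPHA
```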
  • the zoom control is used to optically zoom and/or digitally zoom the plurality of imaging devices.
  • the optical zoom can be understood as a non-integer zoom
  • the digital zoom is an integer zoom
  • the two different zoom methods can present different effects.
  • optical zooming or digital zooming can be used according to requirements.
  • Figure 5 is a schematic diagram showing a zoom control for optical zooming
  • Figure 6 is a schematic diagram showing a zoom control for digital zooming.
  • the electronic device can also be set to support both optical zoom and digital zoom.
  • a switch control is provided to switch the zoom mode used when the imaging device zooms.
  • optical zooming and/or digital zooming may be implemented.
  • before the step of acquiring the zoom instruction, the method further includes: acquiring a tracking instruction.
  • the tracking instruction is used to make at least two imaging devices in the plurality of imaging devices track and focus on the same target object. Thereby, automatic zooming is realized during the tracking and focusing process of the target object.
  • in this way, the at least two imaging devices can be automatically zoomed without user operation, especially in scenarios where the target object is far away or moves frequently, such as scientific investigation, search and rescue, and wildlife protection, which further improves the response speed of target tracking and prevents the target from being lost.
  • the method further includes:
  • in the process of the at least two imaging devices tracking and focusing on the same target object, acquiring the imaging ratio of the target object in the imaging pictures of the at least two imaging devices.
  • the imaging image is collected by the imaging device and displayed on the display screen of the electronic device.
  • the imaging pictures of different imaging devices can be displayed on different display screens, or a display screen can be divided into multiple display areas, with each display area displaying the imaging picture of one imaging device; of course, the same-screen mode can also be used to display multiple imaging pictures on the same display screen, wherein at least some of the pictures overlap.
  • Each imaging device corresponds to an imaging ratio, and for one imaging device, the imaging ratio of the target object is the ratio of the image of the target object occupying the imaging screen of the imaging device. It should be noted that, for the same imaging device, in the process of tracking and focusing on an object, the larger the focal length of the imaging device, the larger the imaging ratio of the object.
  • in the embodiment of the present invention, the zoom instruction generated according to the imaging ratio is used to indicate the imaging ratio of the target object in the imaging picture after the imaging device is zoomed.
  • In the process of the at least two imaging devices tracking and focusing on the target object, a zoom instruction may be generated according to the imaging ratio of the target object in the imaging pictures of the imaging devices, and the imaging ratio of the target object in the pictures can then be adjusted by zooming the imaging devices.
  • acquiring the imaging ratio of the target object in the imaging pictures of at least two imaging devices includes:
  • the ratio value of the image of the target object in the first frame and the ratio value of the image of the target object in the second frame are obtained respectively.
  • the at least two imaging devices include a first imaging device and a second imaging device
  • the first picture is a picture acquired by the first imaging device, that is, an imaging picture of the first imaging device
  • the second picture is a picture acquired by the second imaging device , that is, the imaging picture of the second imaging device.
  • only two imaging devices are taken as an example, and the number of imaging devices is not limited to two.
  • generating the zoom instruction according to the imaging ratio includes:
  • in the case where the ratio value of the image of the target object in a first target picture is not within a preset value range, acquiring the zoom parameter.
  • the focal length value in the zoom parameter is used to adjust the ratio value of the image of the target object in the first target picture to be within the preset value range; wherein, the first target picture is the first picture and/or the second picture. That is, after the imaging device corresponding to the first target picture is zoomed according to the zoom parameter, the scale value of the image of the target object in the first target picture will be within a preset numerical range.
  • the preset value range is a value range preset by the user according to requirements.
  • a zoom command is generated.
  • in this way, the ratio value of the image of the target object occupying the imaging picture is stabilized within the preset value range, so as to avoid the adverse impact of the ratio value being too large or too small.
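  • The sketch below shows one way such a correction could be computed. It assumes the image size of a distant target scales roughly linearly with focal length (so its area ratio scales with the square of the focal length); that relation, the 5%-20% range, and the function name are assumptions for this sketch, not a calculation given in the patent.

```python
import math
from typing import Optional


def keep_ratio_in_range(current_ratio: float,
                        current_focal_mm: float,
                        low: float = 0.05,
                        high: float = 0.20) -> Optional[float]:
    """If the target's area ratio in the picture is outside [low, high], return a
    focal length that would pull it back to the middle of that range; return None
    if no zoom is needed. Area ratio is assumed to scale with focal length squared."""
    if low <= current_ratio <= high:
        return None
    desired_ratio = (low + high) / 2.0
    return current_focal_mm * math.sqrt(desired_ratio / current_ratio)


# target occupies 2% of the frame at 20 mm -> zoom in to roughly 50 mm
print(keep_ratio_in_range(0.02, 20.0))
```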
  • the method further includes:
  • the FOV parameter is obtained.
  • the field of view parameter is used to make the field of view of the multiple imaging devices in a preset ratio.
  • when the field angles of the multiple imaging devices are in a preset ratio, there is a definite relationship between the imaging pictures of the different imaging devices.
  • a plurality of different field of view parameters may be preset, and in the case of multiple imaging devices tracking and focusing on the same target object, a field of view parameter is acquired according to the difference between the imaging pictures of different imaging devices.
  • a fixed field of view parameter can also be preset, and the fixed field of view parameter can be directly acquired when multiple imaging devices track and focus on the same target object.
  • the zoom parameter corresponding to the field of view angle parameter is obtained by calculation.
  • a zoom command is generated.
  • the zoom parameter can be obtained according to the field angle parameter, and then the imaging device can be zoomed according to the zoom parameter.
  • the angle of view of the plurality of imaging devices being in a preset ratio includes: the angle of view of at least two imaging devices of the plurality of imaging devices is the same.
  • in this way, at the same imaging time, the imaging pictures can cover exactly the same imaging scene, and can even present the same picture content, thereby facilitating the user's comparison of the imaging pictures.
  • the field of view angles of the at least two imaging devices can be adjusted to the same field of view angle.
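  • The link between a field of view parameter and a zoom (focal length) parameter can be illustrated with the standard rectilinear-lens relation FOV = 2·atan(sensor width / (2·f)); the sketch below derives, for each device, the focal length producing a requested horizontal field of view. The sensor widths and device names are assumed example values, and this general optics formula is used for illustration rather than being the calculation prescribed by the patent.

```python
import math


def focal_length_for_fov(target_fov_deg: float, sensor_width_mm: float) -> float:
    """Horizontal FOV of a rectilinear lens: FOV = 2 * atan(w / (2 f)),
    so f = w / (2 * tan(FOV / 2))."""
    return sensor_width_mm / (2.0 * math.tan(math.radians(target_fov_deg) / 2.0))


# drive two devices with different (assumed) sensor widths to the same field of view
target_fov = 30.0  # degrees, an assumed field of view parameter
for name, sensor_w in {"vis": 6.17, "ir": 7.68}.items():
    print(name, round(focal_length_for_fov(target_fov, sensor_w), 2), "mm")
```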
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same.
  • although the imaging position may be any position in the imaging picture, in order to display as much of the scene around the target object as possible, the imaging position may be the center of the imaging picture.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same, including: the imaging position of the target object in the imaging pictures of the at least two imaging devices is the exact center of the imaging pictures.
  • the imaging images of the at least two imaging devices can be adjusted to be approximately the same image, thereby facilitating the user to compare and locate.
  • before the step of acquiring the tracking instruction, the method further includes:
  • acquiring the area of the image of the target object in a second target picture, wherein the second target picture is the imaging picture of a first imaging device among the at least two imaging devices, that is, of any one of the at least two imaging devices.
  • the region of the image of the target object in each of the remaining target pictures is determined according to the correlation between the imaging pictures of the at least two imaging devices and the region of the image of the target object in the second target picture.
  • each imaging device has an imaging picture
  • the remaining target pictures are imaging pictures of imaging devices other than the first imaging device among the at least two imaging devices.
  • the regions of the image of the target object in each of the remaining target pictures are respectively determined.
  • a tracking instruction for tracking and focusing the target objects in the second target picture and each remaining target picture is generated.
  • the target object in the second target picture and each remaining target picture can be tracked and focused according to the tracking instruction.
  • the image of the target object in the screen is marked, for example, framed, but not limited to this.
  • the image of the target object in the imaging screen of any imaging device can be determined, and then the tracking instruction for tracking and focusing the target object is generated.
  • acquiring the area of the image of the target object in the second target picture includes:
  • a selection operation for the first area in the second target screen is received.
  • the first area is an area including an image of the target object.
  • the user selects the image of the target object in the second target screen through the selection operation, so as to realize the free selection of the target object.
  • the user can freely select the target object to be tracked and focused, which greatly satisfies the user's needs.
  • determining the area of the image of the target object in each remaining target picture according to the correlation between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture includes:
  • determining, in each remaining target picture, a second area that includes the image of the target object according to the coordinate information of the first area in the second target picture and a preset coordinate mapping relationship between the second target picture and each remaining target picture.
  • the preset coordinate mapping relationship is a coordinate mapping relationship established in advance according to the positions and postures between the imaging devices.
  • here, the preset coordinate mapping relationship between the respective imaging pictures of two imaging devices is taken as an example for description, wherein the two imaging devices include a first imaging device and a second imaging device, the imaging picture of the first imaging device is the first picture, and the imaging picture of the second imaging device is the second picture.
  • the positions and postures of the first imaging device and the second imaging device are pre-calibrated, so that the content of the first picture and the second picture are almost the same.
  • the coordinates in the first screen are the same as the coordinates mapped to the second screen, that is, the screen contents at the corresponding positions of the first screen and the second screen with the same coordinates are the same.
  • in this case, the region indicated by the coordinate information in the second picture is determined as the area of the image of the target object in the second picture. Referring to FIG. 7, the first picture is displayed in the first display area 71 of the display screen of the electronic device, and the image of the target object in the first picture is the first image 72.
  • The second picture is displayed in the second display area 73 of the display screen of the electronic device, and the image of the target object in the second picture is the second image 74.
  • Since the picture contents of the first picture and the second picture are almost the same, if the coordinates of the first image 72 in the first picture are (0,0), the coordinates of the second image 74 in the second picture are also (0,0).
  • the position and posture of the second imaging device can be adjusted so that the picture contents of the first picture and the second picture are almost the same.
  • the coordinates in the first screen are the same as the coordinates mapped to the second screen, that is, the screen contents at the corresponding positions of the first screen and the second screen with the same coordinates are the same.
  • when determining the area of the image of the target object in the second picture according to the coordinate information of the area of the image of the target object in the first picture and the preset coordinate mapping relationship, the region indicated by that coordinate information in the second picture is determined as the area of the image of the target object in the second picture.
  • a preset coordinate mapping relationship can also be determined according to the relationship between the two pictures, so that the coordinates of the same object in the first picture and its coordinates in the second picture conform to the preset coordinate mapping relationship.
  • the position of the target object in the imaging pictures of other imaging devices can be accurately determined.
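  • As a simplified illustration of such a mapping, the sketch below transfers a rectangular first area from one picture to another assuming the pictures are pre-calibrated so that normalized coordinates coincide and only the resolutions differ; the function name, the example resolutions, and the choice of this simple scaling (rather than, say, a calibrated affine transform or homography) are assumptions for this sketch.

```python
from typing import Tuple

Box = Tuple[float, float, float, float]  # (x, y, width, height) in pixels


def map_region(box: Box,
               src_size: Tuple[int, int],
               dst_size: Tuple[int, int]) -> Box:
    """Map a region from one imaging picture to another under the simplest
    preset coordinate mapping: identical normalized coordinates, different
    resolutions."""
    sx = dst_size[0] / src_size[0]
    sy = dst_size[1] / src_size[1]
    x, y, w, h = box
    return (x * sx, y * sy, w * sx, h * sy)


# first area selected in a 1920x1080 visible-light picture, mapped into a
# 640x512 infrared picture (resolutions are assumed example values)
print(map_region((960.0, 540.0, 200.0, 100.0), (1920, 1080), (640, 512)))
```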
  • acquiring the area of the image of the target object in the second target picture includes:
  • the target objects that meet the preset conditions are determined.
  • the filter conditions associated with the temperature can be preset to filter the target objects.
  • meeting the preset condition means having the highest temperature and/or the lowest temperature; that is, the object corresponding to the image with the highest temperature and/or the lowest temperature in the imaging picture of the infrared camera can be determined as the target object.
  • the number of target objects is not limited, and after the target objects are determined, all target objects can be tracked and focused.
  • different preset conditions can be set, wherein the preset conditions are met when the temperature is greater than the first temperature threshold and/or the temperature is less than the second temperature threshold; wherein the first temperature threshold is greater than the second temperature threshold.
  • the image recognition technology may be used to recognize the image of the target object, and then determine the area of the target object in the second target image.
  • conditions can be preset, and target objects that meet the conditions can be automatically screened for tracking and focusing, thereby reducing user operations.
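  • A minimal sketch of such temperature-based screening on an infrared frame is shown below; it picks the hottest pixel and returns a square region around it when the first temperature threshold is exceeded. The region size, threshold value and function name are assumptions, and a real system might instead segment connected hot regions or combine it with image recognition.

```python
import numpy as np


def hottest_region(temperature_map: np.ndarray,
                   threshold_c: float,
                   half_size: int = 10):
    """Pick the pixel with the highest temperature in an infrared frame and,
    if it exceeds the threshold, return a square region around it as the
    target area (None otherwise)."""
    idx = np.unravel_index(np.argmax(temperature_map), temperature_map.shape)
    if temperature_map[idx] < threshold_c:
        return None
    row, col = idx
    h, w = temperature_map.shape
    top, left = max(0, row - half_size), max(0, col - half_size)
    bottom, right = min(h, row + half_size), min(w, col + half_size)
    return (left, top, right - left, bottom - top)  # (x, y, width, height)


frame = 20.0 + np.random.rand(512, 640)  # synthetic ~20 C background
frame[100, 200] = 85.0                   # one hot spot
print(hottest_region(frame, threshold_c=60.0))
```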
  • adjusting the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction includes:
  • the focal lengths of the plurality of imaging devices are changed according to a preset ratio.
  • the preset ratio can be any preset ratio, wherein the ratio value of the preset ratio can be equal to 1, greater than 1 or less than 1.
  • for example, the preset ratio may be 1:2: if the zoom parameter magnifies the infrared camera by 1x, the visible light camera is simultaneously magnified by 2x; if the zoom parameter magnifies the infrared camera by 2x, the visible light camera is simultaneously magnified by 4x.
  • a proportional control can also be set, and the proportional value of the preset ratio can be adjusted by triggering the proportional control.
  • the focal lengths of multiple imaging devices can be changed in a certain proportion according to the zoom parameters, so as to meet more scene requirements.
  • changing the focal lengths of the multiple imaging devices according to a preset ratio includes:
  • the target value corresponding to each imaging device is determined according to the zoom value indicated by the zoom parameter and the preset ratio corresponding to each imaging device in the plurality of imaging devices.
  • the ratio between the zoom value indicated by the zoom parameter and the target value corresponding to the imaging device is equal to the preset ratio corresponding to the imaging device.
  • for example, the preset ratio corresponding to the infrared camera may be 1:2 and the preset ratio corresponding to the visible light camera may be 1:4; when the zoom value indicated by the zoom parameter is m, the target value corresponding to the infrared camera is 2m, and the target value corresponding to the visible light camera is 4m.
  • the focal lengths of the plurality of imaging devices are adjusted to respective corresponding target values.
  • the focal length of the imaging device can be adjusted according to the corresponding preset ratio.
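  • The sketch below carries out exactly this per-device target-value arithmetic; the device names and the dictionary-based interface are assumptions for illustration.

```python
# Assumed per-device preset ratios of the form 1:k, keyed by hypothetical names
PRESET_RATIOS = {"ir": 2.0, "vis": 4.0}  # ir 1:2, vis 1:4 as in the example


def target_values(zoom_value: float, ratios: dict) -> dict:
    """For each imaging device, the target value is the zoom value scaled by the
    device's preset ratio, so zoom_value : target == 1 : k for that device."""
    return {name: zoom_value * k for name, k in ratios.items()}


print(target_values(3.0, PRESET_RATIOS))  # {'ir': 6.0, 'vis': 12.0}
```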
  • adjusting the focal lengths of the plurality of imaging devices includes:
  • the focal lengths of a plurality of imaging devices are changed in tandem.
  • Adjusting the focal lengths of the multiple imaging devices may include adjusting the focal lengths of the multiple imaging devices simultaneously, or sequentially adjusting the focal lengths of the multiple imaging devices, or separately adjusting the focal lengths of the multiple imaging devices.
  • changing the focal lengths of the multiple imaging devices in a coordinated manner includes: simultaneously changing the focal lengths of the multiple imaging devices according to a preset ratio, or sequentially changing the focal lengths of the multiple imaging devices according to the preset ratio. That is to say, the focal lengths of the multiple imaging devices may be changed at the same time; or, taking one imaging device as a reference, after the focal length of that imaging device is changed, the focal lengths of the remaining imaging devices are changed in sequence.
  • the focal length adjustment of multiple imaging devices is implemented in a linked manner.
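  • purely to illustrate the two coordination strategies just described, the sketch below applies the per-device targets either concurrently or one device after another; the set_focal_length callable is a hypothetical stand-in for a real device driver, not an API from the disclosure.

```python
# Minimal sketch of linked zooming with a hypothetical set_focal_length(device, value) driver.
import threading

def apply_targets(targets: dict, set_focal_length, mode: str = "simultaneous") -> None:
    if mode == "simultaneous":
        # change the focal lengths of all imaging devices at the same time
        threads = [
            threading.Thread(target=set_focal_length, args=(device, value))
            for device, value in targets.items()
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    elif mode == "sequential":
        # change one imaging device first, then the remaining devices in turn
        for device, value in targets.items():
            set_focal_length(device, value)
    else:
        raise ValueError(f"unknown mode: {mode}")
```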
  • the apparatus 80 is applied to an electronic device including multiple imaging devices, and may include:
  • the instruction acquisition module 81 is configured to acquire a zoom instruction, wherein the zoom instruction includes zoom parameters for adjusting the focal lengths of multiple imaging devices.
  • the zoom module 82 is configured to adjust the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction.
  • a plurality of imaging devices perform imaging according to light of different wavelength ranges.
  • the plurality of imaging devices include at least two of the following: a visible light camera, an infrared camera, and an ultraviolet camera.
  • the apparatus further includes: a first receiving module configured to receive a first input to a zoom control, wherein the zoom control is used to adjust the focal length of at least two imaging devices in the plurality of imaging devices.
  • the instruction generation module is configured to generate a zoom instruction according to the first input.
  • the instruction generation module includes: a parameter acquisition unit for acquiring zoom parameters according to the first input; and an instruction generation unit for generating a zoom instruction according to the zoom parameters.
  • the parameter obtaining unit is specifically configured to obtain the zoom parameter according to the numerical value currently indicated by the zoom control; wherein the numerical value currently indicated is the numerical value obtained through the first input.
  • the zoom controls include virtual controls or physical controls.
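  • as a small illustrative sketch (the record and field names are assumptions), a zoom instruction can be modelled as a simple record built from the value currently indicated by the zoom control after the first input:

```python
# Minimal sketch: build a zoom instruction from the zoom control's current value.
from dataclasses import dataclass

@dataclass
class ZoomInstruction:
    zoom_value: float   # value currently indicated by the zoom control
    devices: tuple      # imaging devices whose focal lengths will be adjusted

def generate_zoom_instruction(control_value: float,
                              devices: tuple = ("infrared", "visible")) -> ZoomInstruction:
    return ZoomInstruction(zoom_value=control_value, devices=devices)
```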
  • the apparatus further includes: a second receiving module, configured to receive a second input to the first target control.
  • the first switching module is configured to switch the zoom mode of the electronic device from the first zoom mode to the second zoom mode according to the second input; wherein, in the first zoom mode, the zoom control can adjust the focal length of one of the at least two imaging devices, and in the second zoom mode, the zoom control can adjust the focal lengths of the at least two imaging devices.
  • the first target control includes a virtual control or an entity control.
  • the apparatus further includes: a third receiving module, configured to receive a third input to the first target control.
  • the second switching module is configured to switch the zoom mode of the electronic device from the second zoom mode to the first zoom mode according to the third input.
  • the apparatus further includes: a fourth receiving module, configured to receive a fourth input to the second target control.
  • the third switching module is configured to switch the focus adjustment object of the zoom control between at least two imaging devices according to the fourth input.
  • the second target control includes a virtual control or an entity control.
  • the apparatus further includes: a synchronization module configured to adjust the focal lengths of the at least two imaging devices to the same focal length value during the process of switching the zoom mode of the electronic device from the first zoom mode to the second zoom mode.
  • the zoom control includes a virtual control displayed in the display screen of the electronic device.
  • the apparatus further includes: a first transparency module configured to adjust the transparency of the zoom control to a first target value when it is detected that the zoom control displayed in the display screen has not been triggered within a preset time period.
  • the second transparency module is configured to adjust the transparency of the zoom control to a second target value when it is triggered when the transparency of the zoom control is the first target value, wherein the first target value is smaller than the second target value.
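  • the sketch below is one possible reading of the transparency behaviour described above (an idle control is set to the first target value, a touch restores the second target value); the timing helper and the concrete values are assumptions, not part of the disclosure.

```python
# Minimal sketch of the zoom-control transparency rule as stated above.
import time

class ZoomControlTransparency:
    def __init__(self, idle_timeout_s: float, first_target: float, second_target: float):
        assert first_target < second_target   # as specified in the description
        self.idle_timeout_s = idle_timeout_s
        self.first_target = first_target
        self.second_target = second_target
        self.transparency = second_target
        self._last_trigger = time.monotonic()

    def on_trigger(self) -> None:
        # triggered while at the first target value -> move to the second target value
        if self.transparency == self.first_target:
            self.transparency = self.second_target
        self._last_trigger = time.monotonic()

    def tick(self) -> None:
        # not triggered within the preset time period -> move to the first target value
        if time.monotonic() - self._last_trigger >= self.idle_timeout_s:
            self.transparency = self.first_target
```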
  • the apparatus further includes: a tracking module, configured to acquire a tracking instruction, where the tracking instruction is used to make at least two imaging devices in the plurality of imaging devices track and focus on the same target object.
  • the apparatus further includes: an imaging scale module, configured to acquire the imaging scale of the target object in the imaging pictures of the at least two imaging devices when the at least two imaging devices track and focus the same target object.
  • the scale adjustment module is used to generate a zoom instruction according to the imaging scale.
  • the imaging scale module is specifically configured to respectively obtain the scale value of the image of the target object in the first picture and the scale value of the image of the target object in the second picture; wherein the at least two imaging devices include a first imaging device and a second imaging device, the first picture is a picture acquired by the first imaging device, and the second picture is a picture acquired by the second imaging device.
  • the scale adjustment module is specifically configured to obtain the zoom parameter when the scale value of the image of the target object in the first target picture is not within the preset numerical range, wherein the focal length value in the zoom parameter is used to adjust the scale value of the image of the target object in the first target picture to be within the preset numerical range, and the first target picture is the first picture and/or the second picture; and to generate a zoom instruction according to the zoom parameter.
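  • as a hedged sketch of the scale-based rule above: if the target's scale value in a picture falls outside the preset numerical range, a focal-length value is chosen to bring it back inside; the assumption that image scale grows roughly linearly with focal length (for a distant target) and the names below are illustrative only.

```python
# Minimal sketch: derive a new focal length so the target's scale value returns
# to the preset numerical range (assumes scale is roughly proportional to focal length).
from typing import Optional, Tuple

def zoom_parameter_for_scale(current_scale: float,
                             current_focal: float,
                             scale_range: Tuple[float, float]) -> Optional[float]:
    """Return a new focal-length value, or None if the scale is already in range."""
    low, high = scale_range
    if low <= current_scale <= high:
        return None   # no zoom instruction needed
    desired = low if current_scale < low else high   # aim for the nearest bound
    return current_focal * (desired / current_scale)


if __name__ == "__main__":
    # target occupies 2% of the frame, preset range is 5%..15%, current focal length 25 mm
    print(zoom_parameter_for_scale(0.02, 25.0, (0.05, 0.15)))   # -> 62.5
```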
  • the device further includes: a field of view parameter module, configured to obtain a field of view parameter when multiple imaging devices track and focus on the same target object, wherein the field of view parameter is used to make the field angles of the multiple imaging devices be in a preset ratio.
  • the field angle calculation module is configured to calculate and obtain the zoom parameter corresponding to the field angle parameter according to the relationship between the focal length and the field angle in the plurality of imaging devices.
  • the instruction generation module is used for generating a zoom instruction according to the zoom parameter.
  • the angle of view of the plurality of imaging devices being in a preset ratio includes: the angle of view of at least two imaging devices of the plurality of imaging devices is the same.
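  • for intuition only, the sketch below uses the standard pinhole relation fov = 2·arctan(sensor_width / (2·f)) as the correlation between focal length and field angle; the sensor widths are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch: compute the focal length that gives one camera the same field
# of view as another, using the pinhole relation fov = 2 * atan(sensor / (2 * f)).
import math

def fov_deg(sensor_width_mm: float, focal_mm: float) -> float:
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

def focal_for_fov(sensor_width_mm: float, fov_degrees: float) -> float:
    return sensor_width_mm / (2.0 * math.tan(math.radians(fov_degrees) / 2.0))


if __name__ == "__main__":
    ir_fov = fov_deg(sensor_width_mm=10.88, focal_mm=19.0)               # infrared camera
    vis_focal = focal_for_fov(sensor_width_mm=6.4, fov_degrees=ir_fov)   # same field angle (1:1 ratio)
    print(round(ir_fov, 1), round(vis_focal, 1))
```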
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same, including: the imaging position of the target object in the imaging pictures of the at least two imaging devices is the exact center of the imaging pictures.
  • the apparatus further includes: an area acquisition module, configured to acquire an area of the image of the target object in a second target picture, where the second target picture is an imaging picture of a first imaging device among the at least two imaging devices .
  • a mapping module configured to determine the region of the image of the target object in each remaining target picture according to the relationship between the imaging pictures of the at least two imaging devices and the region of the image of the target object in the second target picture, wherein the remaining target pictures are imaging pictures of imaging devices other than the first imaging device among the at least two imaging devices.
  • the tracking instruction generation module is configured to generate a tracking instruction for tracking and focusing the target objects in the second target picture and each remaining target picture.
  • the area acquisition module is specifically configured to receive a selection operation on a first area in the second target image; wherein, the first area is an area including an image of the target object.
  • the mapping module is specifically configured to determine the second area including the image of the target object in each remaining target picture according to the coordinate information of the first area in the second target picture and the preset coordinate mapping relationship between the second target picture and each remaining target picture.
  • the area acquisition module is specifically configured to determine the target object that meets the preset condition according to the temperatures indicated by the images of different objects in the second target picture, and to acquire the area of the image of the target object that meets the preset condition in the second target picture.
  • meeting the preset condition means having the highest temperature and/or the lowest temperature.
  • meeting the preset condition is that the temperature is greater than the first temperature threshold and/or the temperature is less than the second temperature threshold; wherein the first temperature threshold is greater than the second temperature threshold.
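  • as a sketch of the temperature-based screening just described (highest/lowest temperature, or temperatures beyond the thresholds), assuming a hypothetical list of detected objects, each carrying a measured temperature and an image area:

```python
# Minimal sketch of screening target objects by temperature; the object records
# and thresholds are illustrative assumptions.

def screen_targets(objects, first_threshold=None, second_threshold=None,
                   pick_hottest=False, pick_coldest=False):
    """objects: iterable of dicts like {"area": (x, y, w, h), "temp": 78.5}."""
    objects = list(objects)
    selected = []
    if pick_hottest and objects:
        selected.append(max(objects, key=lambda o: o["temp"]))
    if pick_coldest and objects:
        selected.append(min(objects, key=lambda o: o["temp"]))
    for obj in objects:
        # first_threshold is the larger of the two thresholds, as stated above
        if first_threshold is not None and obj["temp"] > first_threshold:
            selected.append(obj)
        if second_threshold is not None and obj["temp"] < second_threshold:
            selected.append(obj)
    unique, seen = [], set()
    for obj in selected:                 # de-duplicate while preserving order
        if id(obj) not in seen:
            seen.add(id(obj))
            unique.append(obj)
    return unique
```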
  • the zoom module 82 is specifically configured to change the focal lengths of the multiple imaging devices according to a preset ratio according to the zoom parameters in the zoom instruction.
  • the zoom module 82 is specifically configured to determine the target value corresponding to each imaging device according to the zoom value indicated by the zoom parameter and the preset ratio corresponding to each imaging device in the plurality of imaging devices, where, for each imaging device, the ratio between the zoom value indicated by the zoom parameter and the target value corresponding to the imaging device is equal to the preset ratio corresponding to the imaging device; and to adjust the focal lengths of the multiple imaging devices to their corresponding target values.
  • the zoom module 82 is specifically configured to change the focal lengths of multiple imaging devices in linkage.
  • the zoom module 82 is specifically configured to make the focal lengths of the multiple imaging devices change simultaneously according to a preset ratio; or make the focal lengths of the multiple imaging devices change sequentially according to the preset ratio.
  • an embodiment of the present invention also provides an imaging device, which is applied to an electronic device including multiple imaging devices, where the imaging device includes a computer-readable storage medium and a processor; the processor is configured to perform the following operations:
  • acquire a zoom instruction, wherein the zoom instruction includes zoom parameters for adjusting the focal lengths of the multiple imaging devices;
  • adjust the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction.
  • a plurality of imaging devices perform imaging according to light of different wavelength ranges.
  • the plurality of imaging devices include at least two of the following: a visible light camera, an infrared camera, and an ultraviolet camera.
  • the processor is further configured to perform the following operations: receive a first input to a zoom control, wherein the zoom control is used to adjust the focal length of at least two imaging devices in the plurality of imaging devices; and generate a zoom instruction based on the first input.
  • generating a zoom instruction according to the first input includes: acquiring a zoom parameter according to the first input; and generating a zoom instruction according to the zoom parameter.
  • obtaining the zoom parameter according to the first input includes: obtaining the zoom parameter according to the value currently indicated by the zoom control; wherein the value currently indicated is the value obtained through the first input.
  • the zoom controls include virtual controls or physical controls.
  • before the step of receiving the first input to the zoom control, the processor is further configured to perform the following operations: receive a second input to the first target control; and switch the zoom mode of the electronic device from a first zoom mode to a second zoom mode according to the second input; wherein, in the first zoom mode, the zoom control can adjust the focal length of one of the at least two imaging devices, and in the second zoom mode, the zoom control can adjust the focal lengths of the at least two imaging devices.
  • the first target control includes a virtual control or an entity control.
  • the processor is further configured to perform the following operations: receive a third input to the first target control; and switch the zoom mode of the electronic device from the second zoom mode to the first zoom mode according to the third input.
  • the processor is further configured to perform the following operations: receive a fourth input to the second target control; and switch the focus adjustment object of the zoom control between the at least two imaging devices according to the fourth input.
  • the second target control includes a virtual control or an entity control.
  • during the process of switching the zoom mode of the electronic device from the first zoom mode to the second zoom mode, the processor is further configured to perform the following operation: adjust the focal lengths of the at least two imaging devices to the same focal length value.
  • the processor is further configured to perform the following operations: when it is detected that the zoom control displayed in the display screen has not been triggered within a preset time period, adjust the transparency of the zoom control to a first target value; and when the zoom control is triggered while its transparency is the first target value, adjust the transparency of the zoom control to a second target value, wherein the first target value is smaller than the second target value.
  • the processor before the step of acquiring the zoom instruction, is further configured to perform the following operations: acquiring a tracking instruction, where the tracking instruction is used to make at least two imaging devices in the plurality of imaging devices track and focus on the same target object.
  • the processor is further configured to perform the following operations: in the case that the at least two imaging devices track and focus on the same target object, acquire the imaging scale of the target object in the imaging pictures of the at least two imaging devices; and generate a zoom instruction according to the imaging scale.
  • acquiring the imaging proportions of the target object in the imaging pictures of the at least two imaging devices includes: respectively acquiring the proportion value of the image of the target object in the first picture and the proportion value of the image of the target object in the second picture.
  • the at least two imaging devices include a first imaging device and a second imaging device, the first picture is a picture obtained by the first imaging device, and the second picture is a picture obtained by the second imaging device.
  • generating a zoom instruction according to the imaging scale includes: in the case that the scale value of the image of the target object in the first target picture is not within a preset numerical range, obtaining a zoom parameter, wherein the focal length value in the zoom parameter is used to adjust the scale value of the image of the target object in the first target picture to be within the preset numerical range, and the first target picture is the first picture and/or the second picture; and generating a zoom instruction according to the zoom parameter.
  • the processor is further configured to perform the following operations: in the case where multiple imaging devices track and focus on the same target object, acquire a field of view parameter, where the field of view parameter is used to make the field of view angles of the multiple imaging devices be in a preset ratio; calculate the zoom parameter corresponding to the field of view parameter according to the correlation between the focal length and the field of view angle of the multiple imaging devices; and generate the zoom instruction according to the zoom parameter.
  • the angle of view of the plurality of imaging devices being in a preset ratio includes: the angle of view of at least two imaging devices of the plurality of imaging devices is the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same.
  • the imaging positions of the target object in the imaging pictures of the at least two imaging devices are the same, including: the imaging position of the target object in the imaging pictures of the at least two imaging devices is the exact center of the imaging pictures.
  • the processor is further configured to perform the following operations: acquire the area of the image of the target object in a second target picture, where the second target picture is the imaging picture of a first imaging device among the at least two imaging devices; determine the area of the image of the target object in each remaining target picture according to the relationship between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture, where the remaining target pictures are imaging pictures of imaging devices other than the first imaging device among the at least two imaging devices; and generate a tracking instruction for tracking and focusing the target object in the second target picture and each remaining target picture.
  • acquiring the area of the image of the target object in the second target image includes: receiving a selection operation on the first area in the second target image; wherein the first area is an area including the image of the target object.
  • determining the area of the image of the target object in each remaining target picture according to the correlation between the imaging pictures of the at least two imaging devices and the area of the image of the target object in the second target picture includes: determining the second area including the image of the target object in each remaining target picture according to the coordinate information of the first area in the second target picture and the preset coordinate mapping relationship between the second target picture and each remaining target picture.
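  • as a minimal sketch of mapping the first area from the second target picture into a remaining target picture via a preset coordinate mapping, assuming here a simple scale-plus-offset relationship between the two pictures (a real system might calibrate a more general homography); the mapping values are illustrative assumptions.

```python
# Minimal sketch: map a rectangular area (x, y, w, h) from one imaging picture
# into another using a preset scale-and-offset coordinate mapping.

def map_area(area, scale_x, scale_y, offset_x, offset_y):
    x, y, w, h = area
    return (x * scale_x + offset_x,
            y * scale_y + offset_y,
            w * scale_x,
            h * scale_y)


if __name__ == "__main__":
    # first area selected in the visible-light picture, mapped into the infrared picture
    first_area = (400, 300, 200, 150)
    print(map_area(first_area, scale_x=0.33, scale_y=0.33, offset_x=10, offset_y=-5))
```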
  • acquiring the area of the image of the target object in the second target picture includes: determining the target object that meets the preset condition according to the temperatures indicated by the images of different objects in the second target picture; and acquiring the area of the image of the target object that meets the preset condition in the second target picture.
  • adjusting the focal lengths of the multiple imaging devices according to the zoom parameters in the zoom instruction includes: changing the focal lengths of the multiple imaging devices according to a preset ratio according to the zoom parameters in the zoom instruction.
  • changing the focal lengths of the multiple imaging devices according to a preset ratio includes: determining the target value corresponding to each imaging device according to the zoom value indicated by the zoom parameter and the preset ratio corresponding to each imaging device in the multiple imaging devices, where, for each imaging device, the ratio between the zoom value indicated by the zoom parameter and the target value corresponding to the imaging device is equal to the preset ratio corresponding to the imaging device; and adjusting the focal lengths of the multiple imaging devices to their corresponding target values.
  • adjusting the focal lengths of the multiple imaging devices includes: changing the focal lengths of the multiple imaging devices in a coordinated manner.
  • changing the focal lengths of the multiple imaging devices in a coordinated manner includes: simultaneously changing the focal lengths of the multiple imaging devices according to a preset ratio; or changing the focal lengths of the multiple imaging devices sequentially according to the preset ratio.
  • an embodiment of the present invention further provides a pan/tilt head, including an axial movement device, and the above imaging device whose orientation is controlled by the axial movement device.
  • the imaging device is used to perform each step in the above zooming method, and can achieve the same technical effect. To avoid repetition, details are not described here.
  • an embodiment of the present invention also provides a mobile platform, including a body, and the above-mentioned imaging device installed on the body. The imaging device is used to perform each step in the above zooming method, and can achieve the same technical effect. To avoid repetition, details are not described here.
  • an embodiment of the present invention also provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, each step of the above zooming method is implemented and the same technical effect can be achieved; to avoid repetition, details are not described here.
  • FIG. 9 shows the hardware structure of a device 900 for implementing various embodiments of the present invention. The device 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, a power supply 911, and other components.
  • the device structure shown in FIG. 9 does not constitute a limitation on the device; the device may include more or fewer components than shown, combine some components, or have a different arrangement of components.
  • the devices include but are not limited to mobile phones, tablet computers, notebook computers, handheld computers, vehicle-mounted devices, wearable devices, and pedometers.
  • the radio frequency unit 901 can be used for receiving and sending signals in the process of sending and receiving information or during a call; specifically, it receives downlink data from the base station and forwards it to the processor 910 for processing, and sends uplink data to the base station.
  • the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 901 can also communicate with the network and other devices through a wireless communication system.
  • the device provides the user with wireless broadband Internet access through the network module 902, such as helping the user to send and receive emails, browse web pages, access streaming media, and so on.
  • the audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902, or stored in the memory 909, into audio signals and output them as sound. The audio output unit 903 may also provide audio output related to a specific function performed by the device 900 (e.g., a call signal reception sound, a message reception sound, etc.).
  • the audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 904 is used to receive audio or video signals.
  • the input unit 904 may include a graphics processor (Graphics Processing Unit, GPU) 9041 and a microphone 9042, and the graphics processor 9041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frames may be displayed on the display unit 906 .
  • the image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902 .
  • the microphone 9042 can receive sound and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901 and output.
  • Device 900 also includes at least one sensor 905, such as a light sensor, motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 9061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 9061 and/or the backlight when the device 900 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the device posture (such as horizontal and vertical screen switching and related games).
  • the sensor 905 can also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 906 is used to display information input by the user or information provided to the user.
  • the display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
  • the user input unit 907 may be used to receive input numerical or character information, and generate key signal input related to user settings and function control of the device.
  • the user input unit 907 includes a touch panel 9071 and other input devices 9072.
  • the touch panel 9071, also referred to as a touch screen, can collect the user's touch operations on or near it (such as operations performed on or near the touch panel 9071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 9071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 910, and receives and executes the commands sent by the processor 910.
  • the touch panel 9071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 907 may also include other input devices 9072 .
  • other input devices 9072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the touch panel 9071 can be overlaid on the display panel 9061. When the touch panel 9071 detects a touch operation on or near it, it transmits the operation to the processor 910 to determine the type of the touch event, and then the processor 910 provides a corresponding visual output on the display panel 9061 according to the type of the touch event.
  • the interface unit 908 is an interface for connecting an external device to the device 900 .
  • external devices may include wired or wireless headset ports, external power (or battery charger) ports, wired or wireless data ports, memory card ports, ports for connecting devices with identification modules, audio input/output (I/O) ports, video I/O ports, headphone ports, and more.
  • the interface unit 908 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the device 900, or may be used to transfer data between the device 900 and an external device.
  • the memory 909 may be used to store software programs as well as various data.
  • the memory 909 may mainly include a stored program area and a stored data area, wherein the stored program area may store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like; the stored data area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like.
  • the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 910 is the control center of the device; it uses various interfaces and lines to connect the various parts of the entire device, and performs various functions of the device and processes data by running or executing the software programs and/or modules stored in the memory 909 and calling the data stored in the memory 909, so as to monitor the device as a whole.
  • the processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor and a modem processor, wherein the application processor mainly processes the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication.
  • the device 900 may also include a power supply 911 (such as a battery) for supplying power to various components. The power supply 911 may be logically connected to the processor 910 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • in addition, the device 900 may include some functional modules that are not shown, which will not be repeated here.
  • the device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement it without creative effort.
  • Various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • FIG. 10 is a block diagram of a computing and processing device provided by an embodiment of the present invention.
  • the computing processing device traditionally includes a processor 1010 and a computer program product or computer readable medium in the form of a memory 1020 .
  • the memory 1020 may be electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the memory 1020 has storage space 1030 for program code for performing any of the method steps in the above-described methods.
  • the storage space 1030 for program codes may include various program codes for implementing various steps in the above methods, respectively. These program codes can be read from or written to one or more computer program products.
  • These computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks. Such computer program products are typically portable or fixed storage units as described with reference to FIG. 11 .
  • the storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 1020 in the computing processing device of FIG. 10 .
  • the program code may, for example, be compressed in a suitable form.
  • the storage unit comprises computer readable code, i.e., code readable by a processor such as the processor 1010, which, when executed by a computing processing device, causes the computing processing device to perform each step of the methods described above.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • the word “comprising” does not exclude the presence of elements or steps not listed in a claim.
  • the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention can be implemented by means of hardware comprising several different elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
  • the use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a zoom method, an imaging device, a pan-tilt head and a movable platform. The zoom method is applied to an electronic device and comprises: acquiring a zoom instruction, the zoom instruction comprising zoom parameters for adjusting the focal lengths of a plurality of imaging devices; and adjusting the focal lengths of the plurality of imaging devices according to the zoom parameters in the zoom instruction. The focal lengths of the plurality of imaging devices can be adjusted according to the zoom parameters in the zoom instruction, which reduces the operating difficulty and the zoom reaction time of the plurality of imaging devices and improves the working efficiency of subsequent steps.
PCT/CN2020/136515 2020-12-15 2020-12-15 Procédé de zoom, dispositif d'imagerie, cardan et plate-forme mobile WO2022126375A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/136515 WO2022126375A1 (fr) 2020-12-15 2020-12-15 Procédé de zoom, dispositif d'imagerie, cardan et plate-forme mobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/136515 WO2022126375A1 (fr) 2020-12-15 2020-12-15 Procédé de zoom, dispositif d'imagerie, cardan et plate-forme mobile

Publications (1)

Publication Number Publication Date
WO2022126375A1 true WO2022126375A1 (fr) 2022-06-23

Family

ID=82059832

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/136515 WO2022126375A1 (fr) 2020-12-15 2020-12-15 Procédé de zoom, dispositif d'imagerie, cardan et plate-forme mobile

Country Status (1)

Country Link
WO (1) WO2022126375A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003069865A (ja) * 2001-08-29 2003-03-07 Miyota Kk 撮像装置
CN1573505A (zh) * 2003-05-21 2005-02-02 富士写真光机株式会社 可见光和红外光摄影用透镜系统
CN104535997A (zh) * 2015-01-08 2015-04-22 西安费斯达自动化工程有限公司 图像/激光测距/低空脉冲雷达一体化系统
CN106385614A (zh) * 2016-09-22 2017-02-08 北京小米移动软件有限公司 画面合成方法及装置
CN108803668A (zh) * 2018-06-22 2018-11-13 航天图景(北京)科技有限公司 一种静态目标监测的智能巡检无人机吊舱系统
CN111063148A (zh) * 2019-12-30 2020-04-24 神思电子技术股份有限公司 一种远距离夜视目标视频检测方法


Similar Documents

Publication Publication Date Title
CN111541845B (zh) 图像处理方法、装置及电子设备
CN108513070B (zh) 一种图像处理方法、移动终端及计算机可读存储介质
CN109639970B (zh) 一种拍摄方法及终端设备
CN111010510B (zh) 一种拍摄控制方法、装置及电子设备
CN108495029B (zh) 一种拍照方法及移动终端
WO2021104197A1 (fr) Procédé de poursuite d'objet et dispositif électronique
CN108989672B (zh) 一种拍摄方法及移动终端
CN111031398A (zh) 一种视频控制方法及电子设备
WO2021104227A1 (fr) Procédé de photographie et dispositif électronique
JP7394879B2 (ja) 撮像方法及び端末
CN111010512A (zh) 显示控制方法及电子设备
WO2019184947A1 (fr) Procédé de visualisation d'image et terminal mobile
CN110198413B (zh) 一种视频拍摄方法、视频拍摄装置和电子设备
CN111031253B (zh) 一种拍摄方法及电子设备
CN109922294B (zh) 一种视频处理方法及移动终端
US20220272275A1 (en) Photographing method and electronic device
CN108924422B (zh) 一种全景拍照方法及移动终端
CN111464746B (zh) 拍照方法及电子设备
CN111083374B (zh) 滤镜添加方法及电子设备
CN112749590B (zh) 目标检测方法、装置、计算机设备和计算机可读存储介质
CN111385525B (zh) 视频监控方法、装置、终端及系统
JP7413546B2 (ja) 撮影方法及び電子機器
CN109104564B (zh) 一种拍摄提示方法及终端设备
CN110913133B (zh) 拍摄方法及电子设备
CN110958387B (zh) 一种内容更新方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20965393

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20965393

Country of ref document: EP

Kind code of ref document: A1