CN110602389A - Display method and electronic equipment - Google Patents

Display method and electronic equipment

Info

Publication number
CN110602389A
Authority
CN
China
Prior art keywords
target object
image data
image
preview image
camera
Prior art date
Legal status
Granted
Application number
CN201910812999.2A
Other languages
Chinese (zh)
Other versions
CN110602389B (en)
Inventor
胡吉祥 (Hu Jixiang)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201910812999.2A
Publication of CN110602389A
Priority to PCT/CN2020/104334
Application granted
Publication of CN110602389B
Legal status: Active
Anticipated expiration

Classifications

    • G06T7/70 Image analysis; determining position or orientation of objects or cameras
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a display method and an electronic device. The method includes: when a target object is not included in a first preview image acquired by a first device of the electronic device, acquiring first image data by a second device of the electronic device, where the first image data includes the target object; identifying the position of the target object in the first image data; and displaying direction prompt information according to the positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt the user to move the electronic device so that a preview image acquired by the first device after the movement includes the target object. In the embodiment of the invention, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time needed to bring the target object back into the angle of view of the lens.

Description

Display method and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a display method and an electronic device.
Background
Nowadays, electronic devices are generally provided with a camera that can be used for shooting images. When shooting an image that includes a target object, the size of the target object in the image may be adjusted by changing the focal length in order to obtain a better shooting effect. In actual use, however, the target object is easily lost after the focal length is adjusted or the electronic device is moved, and while the electronic device is being moved again so that the target object reappears in the angle of view of its lens, the device may be moved in the wrong direction, so that it takes a long time to bring the target object back into the angle of view of the lens.
Disclosure of Invention
The embodiment of the invention provides a display method and an electronic device, which aim to solve the problem that it takes a long time to bring a target object back into the angle of view of a lens.
To solve the above technical problem, the invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a display method, including:
under the condition that a target object is not included in a first preview image acquired by a first device of electronic equipment, acquiring first image data by a second device of the electronic equipment, wherein the target object is included in the first image data;
identifying a location of the target object in the first image data;
and displaying direction prompt information according to the position relation between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used for prompting a user to move the electronic equipment, so that a preview image acquired by the first device after movement comprises the target object.
In a second aspect, an embodiment of the present invention further provides an electronic device, including:
a first acquisition module, configured to acquire first image data through a second device of the electronic device when a target object is not included in a first preview image acquired through a first device of the electronic device, where the target object is included in the first image data;
an identification module for identifying a location of the target object in the first image data;
the first display module is configured to display direction prompt information according to a position relationship between a position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first device after movement includes the target object.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program that is stored in the memory and executable on the processor, where the processor implements the steps of the above display method when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the display method.
In the embodiment of the invention, when a target object is not included in a first preview image acquired by a first device of the electronic device, first image data that includes the target object is acquired by a second device of the electronic device; the position of the target object in the first image data is identified; and direction prompt information is displayed according to the positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt the user to move the electronic device so that a preview image acquired by the first device after the movement includes the target object. In this way, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time needed to bring the target object back into the angle of view of the lens.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flowchart of a display method according to an embodiment of the present invention;
FIG. 2 is a flowchart of another display method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of another electronic device according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of another electronic device according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of another electronic device according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of another electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a display method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
step 101, when a target object is not included in a first preview image acquired by a first device of an electronic device, acquiring first image data by a second device of the electronic device, wherein the target object is included in the first image data.
The scenarios in which the target object is not included in the first preview image acquired by the first device of the electronic device may include the following. First, the preview image previously acquired by the first device includes the target object, but after the focal length of the first device is adjusted, the first preview image acquired by the first device no longer includes the target object; the electronic device may adjust the focal length of the first device according to an instruction input by the user, and the particular type of instruction is not limited herein. For example, the instruction may be a voice instruction, or it may be a touch operation such as a single-finger or double-finger operation.
In addition, the scenarios in which the target object is not included in the first preview image acquired by the first device of the electronic device may further include the following: the preview image previously acquired by the first device includes the target object, but after the electronic device is moved, the first preview image acquired by the first device no longer includes the target object.
The viewing angle of the second device may be greater than or equal to the viewing angle of the first device. For example, the first device may be a first camera and the second device may be a second camera, and the first camera and the second camera may be located on the same plane, for example both on the back plate of the electronic device. Preferably, the first camera and the second camera may be located on the same straight line in that plane. Of course, the viewing angle of the second camera may be greater than the viewing angle of the first camera.
In addition, the specific type of the second device is not limited herein, for example: the second device may be a camera, a sensor, or the like.
Wherein the first image data and the first preview image may both be displayed on a display screen of the electronic device, for example: the first preview image may be displayed in a first area of a display screen of the electronic device, and the first image data may be displayed in a second area of the display screen of the electronic device. Of course, the first image data may not be displayed on the display screen of the electronic device, but may be recorded only in the electronic device.
The specific type of the target object is not limited herein, for example: the target object may be a human being, or may be other animals or plants. Such as: moon, sun, or street lights, etc.
Step 102, identifying a position of the target object in the first image data.
Wherein the position of the target object in the first image data may be identified by object tracking techniques, such as: the first image data may be in the form of a rectangular image, and the target object may be at an upper left corner, an upper right corner, a lower left corner, a lower right corner, or a middle position of the rectangular image.
Step 103, displaying direction prompt information according to the position relation between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used for prompting a user to move the electronic equipment, so that a preview image acquired by the first device after movement includes the target object.
The specific position of the preset position is not limited herein, for example: the preset position may be a middle position of the viewing angle of the second device, or a position shifted to the left or right from the middle position of the viewing angle of the second device. Of course, the preset position may also be an intermediate position of the image data acquired by the second device, such as: an intermediate position of the first image data.
The direction prompt information may include a prompt direction, and the prompt direction may be presented in the form of an arrow or text. For example: when the target object is prompted to need to move towards the upper right corner, an arrow pointing to the upper right corner or text of "move towards the upper right corner" may be displayed directly.
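As a concrete illustration of step 103, the following is a minimal sketch, in Python, of how a prompt direction could be derived from the offset between the target's position and the preset position; the function name, the dead-zone parameter, and the image-coordinate convention are assumptions made here for illustration and are not taken from the patent.

    def direction_hint(target_xy, preset_xy, dead_zone=20):
        """Return a textual hint such as 'move towards the upper right corner'.

        target_xy and preset_xy are (x, y) pixel coordinates in the second
        device's image data; dead_zone is the tolerance in pixels within
        which no hint is shown. Image y grows downward.
        """
        dx = target_xy[0] - preset_xy[0]   # > 0: target lies to the right of the preset position
        dy = target_xy[1] - preset_xy[1]   # > 0: target lies below the preset position

        horizontal = "right" if dx > dead_zone else "left" if dx < -dead_zone else ""
        vertical = "upper" if dy < -dead_zone else "lower" if dy > dead_zone else ""

        if not horizontal and not vertical:
            return None                     # target already near the preset position, no prompt needed
        if horizontal and vertical:
            return "move towards the {} {} corner".format(vertical, horizontal)
        return "move towards the {}".format(vertical or horizontal)

    # Example: target detected near the top right of the wide image, preset position at its centre.
    print(direction_hint((900, 120), (640, 360)))   # -> 'move towards the upper right corner'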
In the embodiment of the present invention, the electronic device may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
In the embodiment of the invention, when a target object is not included in a first preview image acquired by a first device of the electronic device, first image data that includes the target object is acquired by a second device of the electronic device; the position of the target object in the first image data is identified; and direction prompt information is displayed according to the positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt the user to move the electronic device so that a preview image acquired by the first device after the movement includes the target object. In this way, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time needed to bring the target object back into the angle of view of the lens.
Referring to fig. 2, fig. 2 is a flowchart of another display method provided by the embodiment of the invention. The main differences between this embodiment and the previous embodiment are: the target object needs to be selected before the first image data is acquired by the second device. As shown in fig. 2, the method comprises the following steps:
step 201, acquiring a second preview image through the first device, and displaying the second preview image on the electronic device.
The second preview image may include a target object, and certainly, the second preview image may further include other background objects.
Optionally, the electronic device includes a main camera and an auxiliary camera on the same surface, the first device being the main camera and the second device being the auxiliary camera; alternatively,
the electronic equipment comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
The main camera and the auxiliary camera may both be located on the same surface of the electronic device, for example on the back plate of the electronic device; preferably, the main camera and the auxiliary camera may also be arranged on the same straight line. In addition, it should be noted that the viewing angle of the auxiliary camera may be greater than or equal to the viewing angle of the main camera.
The camera assembly may comprise a first camera and an image sensor, and the viewing angle of the image sensor may be greater than or equal to that of the first camera. For example, as an optional implementation, the first image data acquired by the image sensor for a given object may include more content than the first preview image acquired by the first camera for that object: the first image data acquired by the image sensor may include the target object and a first background object, while the first preview image acquired by the first camera may include only part of the content of the first background object. It should be noted that the first camera may be a camera that does not support optical zooming.
In the embodiment of the present invention, the second device may be of different types, for example a sub-camera or an image sensor; this variety of second-device types increases the flexibility of acquiring the first image data through the second device.
Step 202, when an input operation for the second preview image is received, detecting whether the target object in the second preview image can be tracked according to an object tracking technique. If the target object cannot be tracked, perform step 203; if input information indicating that the wrong target object was tracked is received, return to step 202; if the target object can be tracked, perform step 204.
The object tracking (OT) technique may be understood as follows: during image shooting, a certain object is tracked and marked in real time with a preset frame in the image. A common object tracking technique is face tracking. For example, the user may tap a certain position of the image during shooting; the object tracking technique can identify the object corresponding to that position, expand outward to the edge of the object, and draw a corresponding preset frame (such as a rectangular frame) along the edge of the object. Face tracking can identify the position of a face in the image during shooting and draw a rectangular frame corresponding to the face according to the object tracking technique.
When an input operation for the second preview image is received, whether the target object corresponding to the position determined by the input operation can be tracked is detected in combination with the object tracking technique, according to that position in the second preview image. For example, the target position is determined through the input operation, and whether the contour of the target object can be tracked is detected by expanding outward from the target position as the center in combination with the object tracking technique. The contour of the target object may be marked with a rectangular box, a circular box, or the like.
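For illustration only, the following sketch shows how tracking could be initialized from the position of a user's tap using an OpenCV-style tracker; it assumes opencv-contrib-python is available (tracker availability depends on the OpenCV build), and the fixed-size region of interest around the tap point is a simplification of the edge-expansion described above. All names here are illustrative and not from the patent.

    import cv2

    def init_tracking_from_tap(frame, tap_xy, roi_half=60):
        """Try to start tracking the object under the user's tap.

        Returns (tracker, bbox) on success, or (None, None) if the object
        cannot be tracked, in which case the prompt of step 203 would be shown.
        """
        x, y = tap_xy
        h, w = frame.shape[:2]
        # Clamp a square region of interest around the tap point to the frame.
        x0, y0 = max(0, x - roi_half), max(0, y - roi_half)
        x1, y1 = min(w, x + roi_half), min(h, y + roi_half)
        bbox = (x0, y0, x1 - x0, y1 - y0)        # (x, y, width, height)

        tracker = cv2.TrackerCSRT_create()       # KCF or another tracker could be used instead
        tracker.init(frame, bbox)
        ok, bbox = tracker.update(frame)         # first update as a cheap sanity check
        if not ok:
            return None, None                    # "the target object cannot be tracked"
        return tracker, bbox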
When the determined target object is wrong, the process may return to step 202, so that the target object may be re-determined according to the input operation of the user.
Step 203, displaying prompt information for prompting the user that the target object cannot be tracked.
The content of the prompt message may be a text message, and the text message may be "the target object cannot be tracked"; of course, the content of the prompt message may also be image information, and the image information may include the content of "the target object cannot be tracked".
Note that the type of the prompt information is not limited herein. In addition, step 203 is an optional step.
In this way, when the prompt information is displayed, the user can learn the tracking state of the target object in time and react accordingly, which improves the user experience.
Step 204, acquiring a first preview image through the first device when the target object can be tracked; detecting whether the first preview image includes the target object; and if the first preview image does not include the target object, going to step 205.
The scenarios in which the target object is not included in the first preview image acquired by the first device of the electronic device may include the following. First, the preview image previously acquired by the first device includes the target object, but after the focal length of the first device is adjusted, the first preview image acquired by the first device no longer includes the target object; the electronic device may adjust the focal length of the first device according to an instruction input by the user, and the particular type of instruction is not limited herein. For example, the instruction may be a voice instruction, or it may be a touch operation such as a single-finger or double-finger operation.
In addition, the scenarios in which the target object is not included in the first preview image acquired by the first device of the electronic device may further include the following: the preview image previously acquired by the first device includes the target object, but after the electronic device is moved, the first preview image acquired by the first device no longer includes the target object.
Step 205, acquiring first image data through a second device of the electronic device, wherein the first image data includes the target object.
For specific expressions of the second device and the first image data, reference may be made to corresponding expressions in the previous embodiments, and details are not described herein again.
Step 206, identifying the position of the target object in the first image data.
Wherein the position of the target object in the first image data may be identified by object tracking techniques, such as: the first image data may be in the form of a rectangular image, and the target object may be at an upper left corner, an upper right corner, a lower left corner, a lower right corner, or a middle position of the rectangular image.
Step 207, displaying direction prompt information according to the position relation between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used for prompting a user to move the electronic device, so that a preview image acquired through the first device after movement includes the target object.
The specific representation of the preset position and the direction prompt information may refer to the corresponding representation in the previous embodiment, which is not described herein again.
Optionally, after displaying the direction prompt information according to the position relationship between the position of the target object in the first image data and the preset position, the method further includes:
acquiring second image data in real time through the second device in the moving process or after the electronic equipment moves;
taking an image by the first device if the position of the target object in the second image data at least partially coincides with the preset position.
Wherein the focal length (which may also be referred to as the viewing angle) of the second device is unchanged before and after the user moves the electronic device in the cued direction.
The first device can also acquire the preview image in real time, and when the user moves the electronic device according to the prompting direction and the target object appears in the preview image acquired by the first device, the prompting information does not need to be displayed any more.
If the user does not move the electronic device in the prompted direction and, as a result, the second image data obtained in real time through the second device no longer includes the target object, prompting text such as "the object has completely deviated; please readjust the direction of the electronic device" or "the object has completely deviated; please readjust the lens direction of the electronic device" may be displayed on the electronic device. If the user then moves the electronic device again and the second image data obtained in real time through the second device includes the target object, the prompted direction may be displayed again.
As for the case where the position of the target object in the second image data at least partially coincides with the preset position: the preset position may, for example, take the shape of a rectangular frame or a circular frame, and when the preset position at least partially overlaps the position of the target object in the second image data, it can be determined that the target object has moved to the preset position. Because the imaging effect of the target object at the preset position is generally good, the electronic device may shoot an image through the first device at this point.
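The capture condition described above can be illustrated with a minimal sketch in which the preset position is treated as a rectangle and an image is taken once the tracked bounding box overlaps it; the function names and the reading of "at least partially coincides" as any non-empty overlap are assumptions made here for illustration.

    def boxes_overlap(a, b):
        """Axis-aligned overlap test; a and b are (x, y, width, height)."""
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    def should_capture(target_bbox, preset_bbox):
        # "At least partially coincides" is read here as any non-empty overlap;
        # a stricter condition (for example a minimum IoU) could be substituted.
        return boxes_overlap(target_bbox, preset_bbox)

    # Example: a 200x200 preset box at the centre of a 1280x720 second-device frame.
    preset = (1280 // 2 - 100, 720 // 2 - 100, 200, 200)
    print(should_capture((600, 300, 120, 150), preset))   # True -> shoot the image with the first device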
It should be noted that when the first device acquires the first preview image again after the image has been shot by the first device, the contour and position of the target object can continue to be tracked according to the object tracking technique, so that the electronic device can subsequently continue shooting the target object conveniently.
In the embodiment of the invention, the electronic device can be moved according to the direction prompt information, which improves the efficiency of moving the electronic device; and when the target object has moved to the preset position, the image can be shot through the first device, so that the shot image has a better effect, thereby improving the imaging effect of the shot image.
Optionally, the preset position is a central position of the first image data or the second image data.
In an optional implementation manner, if the first device is a main camera and the second device is a sub-camera, when the first device and the second device acquire image data for the same object, the content included in the middle position of the image data acquired by the second device for the object may be the same as the content included in the image data acquired by the first device for the object.
In another optional implementation, if the first device is a first camera and the second device is an image sensor, then after the focal length of the first device is adjusted, when image data is acquired for the same object by the first device and the second device, the content included in the image acquired by the first device may be the same as the content included in the middle position of the image data acquired by the second device; in other words, the image acquired by the first device may be the content at the middle position of the image data acquired by the second device, magnified by interpolation.
In the embodiment of the invention, the preset position is the central position of the first image data or the second image data, so that the target object moves to the central position to finish image shooting through the first device, and the imaging effect of the image is further improved.
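As a hedged illustration of how such a central preset region might be obtained, the sketch below approximates the portion of the wide image (second device) that corresponds to the narrower field of view of the first device from the ratio of their equivalent focal lengths, under a simple pinhole assumption; the function and parameter names and the example values are assumptions for illustration, not from the patent.

    def central_preset_region(wide_size, wide_focal_mm, narrow_focal_mm):
        """Return (x, y, width, height) of the central region of the wide image
        that roughly matches the first device's field of view."""
        w, h = wide_size
        scale = wide_focal_mm / narrow_focal_mm     # < 1 when the first device is zoomed in further
        rw, rh = int(w * scale), int(h * scale)
        return ((w - rw) // 2, (h - rh) // 2, rw, rh)

    # Example: a 4000x3000 wide frame, 16 mm-equivalent wide lens, 48 mm-equivalent zoom.
    print(central_preset_region((4000, 3000), 16, 48))    # -> (1333, 1000, 1333, 1000)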
In the embodiment of the invention, the target object can be determined firstly through the steps 201 to 207, so that the position of the target object can be identified more conveniently and quickly in the first image data subsequently, and the determination speed of the prompt direction is further correspondingly improved.
Referring to fig. 3, fig. 3 is a structural diagram of an electronic device according to an embodiment of the present invention, which can implement details of a display method in the foregoing embodiment and achieve the same effects. As shown in fig. 3, the electronic device 300 includes:
a first obtaining module 301, configured to obtain first image data through a second device of an electronic device when a target object is not included in a first preview image obtained through a first device of the electronic device, where the target object is included in the first image data;
an identification module 302 for identifying a location of the target object in the first image data;
a first display module 303, configured to display direction prompt information according to a position relationship between a position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image obtained through the first device after the movement includes the target object.
Optionally, the electronic device 300 includes a main camera and a sub-camera on the same surface, where the first device is the main camera and the second device is the sub-camera; alternatively,
the electronic device 300 comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
Optionally, referring to fig. 4, the electronic device 300 further includes:
a second obtaining module 304, configured to obtain, in real time, second image data through the second device during or after the movement of the electronic device;
a capturing module 305, configured to capture an image by the first device in a case where a position of the target object in the second image data at least partially coincides with the preset position.
Optionally, referring to fig. 5, the electronic device 300 further includes:
a third obtaining module 306, configured to obtain a second preview image through the first device, and display the second preview image on the electronic device;
a detecting module 307, configured to detect whether the target object in the second preview image can be tracked according to an object tracking technology when an input operation for the second preview image is received;
a fourth obtaining module 308, configured to obtain, by the first device, a first preview image if the target object can be tracked.
Optionally, referring to fig. 6, the electronic device 300 further includes:
a second display module 309, configured to display, in a case that the target object cannot be tracked, prompt information for prompting the user that the target object cannot be tracked.
The electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the method embodiments of fig. 1 to fig. 2, and is not described herein again to avoid repetition. In the embodiment of the invention, the direction prompt information is displayed on the electronic equipment, so that the electronic equipment can move according to the direction prompt information, thereby reducing the time consumed by controlling the lens again to align the target object and improving the efficiency of aligning the target object.
Fig. 7 is a schematic diagram of a hardware structure of another electronic device for implementing various embodiments of the present invention.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 7 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 710 is configured to:
under the condition that a target object is not included in a first preview image acquired by a first device of electronic equipment, acquiring first image data by a second device of the electronic equipment, wherein the target object is included in the first image data;
identifying a location of the target object in the first image data;
and displaying direction prompt information according to the position relation between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used for prompting a user to move the electronic equipment, so that a preview image acquired by the first device after movement comprises the target object.
Optionally, the electronic device includes a main camera and an auxiliary camera on the same surface, the first device being the main camera and the second device being the auxiliary camera; alternatively,
the electronic equipment comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
Optionally, the processor 710 is further configured to:
acquiring second image data in real time through the second device in the moving process or after the electronic equipment moves;
taking an image by the first device if the position of the target object in the second image data at least partially coincides with the preset position.
Optionally, the processor 710 is further configured to:
acquiring a second preview image through the first device, and displaying the second preview image on the electronic equipment;
detecting whether the target object in the second preview image can be tracked according to an object tracking technology when an input operation for the second preview image is received;
acquiring a first preview image by the first device if the target object can be tracked.
Optionally, the processor 710 is further configured to: and displaying prompt information for prompting the user that the target object cannot be tracked under the condition that the target object cannot be tracked.
In the embodiment of the invention, the direction prompt information is displayed on the electronic equipment, so that the electronic equipment can move according to the direction prompt information, and the time consumed for re-controlling the target object to appear in the visual angle of the lens can be reduced.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used for receiving and sending signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards the received downlink data to the processor 710 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 702, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the electronic apparatus 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input Unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042, and the Graphics processor 7041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphic processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio unit 701 or the network module 702. The microphone 7042 may receive sounds and may be capable of processing such sounds into audio data. The processed audio data may be converted into a format output transmittable to a mobile communication base station via the radio frequency unit 701 in case of a phone call mode.
The electronic device 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the electronic device 700 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 708 is an interface for connecting an external device to the electronic apparatus 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 700 or may be used to transmit data between the electronic apparatus 700 and the external device.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 709 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 710 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby monitoring the whole electronic device. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The electronic device 700 may also include a power supply 711 (e.g., a battery) for providing power to the various components, and preferably, the power supply 711 may be logically coupled to the processor 710 via a power management system, such that functions of managing charging, discharging, and power consumption may be performed via the power management system.
In addition, the electronic device 700 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710, where the computer program, when executed by the processor 710, implements each process of the above display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. A display method, comprising:
under the condition that a target object is not included in a first preview image acquired by a first device of electronic equipment, acquiring first image data by a second device of the electronic equipment, wherein the target object is included in the first image data;
identifying a location of the target object in the first image data;
and displaying direction prompt information according to the position relation between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used for prompting a user to move the electronic equipment, so that a preview image acquired by the first device after movement comprises the target object.
2. The method of claim 1, wherein the electronic device comprises a primary camera and a secondary camera on the same surface, wherein the first device is the primary camera and the second device is the secondary camera; or,
the electronic equipment comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
3. The method according to claim 1 or 2, wherein after displaying the direction hint information according to the positional relationship between the position of the target object in the first image data and a preset position, the method further comprises:
acquiring second image data in real time through the second device in the moving process or after the electronic equipment moves;
taking an image by the first device if the position of the target object in the second image data at least partially coincides with the preset position.
4. The method of claim 1, wherein in the event that the target object is not included in the first preview image acquired by the first device of the electronic device, prior to acquiring the first image data by the second device of the electronic device, the method further comprises:
acquiring a second preview image through the first device, and displaying the second preview image on the electronic equipment;
detecting whether the target object in the second preview image can be tracked according to an object tracking technology when an input operation for the second preview image is received;
acquiring a first preview image by the first device if the target object can be tracked.
5. The method of claim 4, wherein after detecting whether the target object in the second preview image can be tracked according to an object tracking technique, the method further comprises:
and displaying prompt information for prompting the user that the target object cannot be tracked under the condition that the target object cannot be tracked.
6. An electronic device, comprising:
a first acquisition module, configured to acquire first image data through a second device of the electronic device when a target object is not included in a first preview image acquired through a first device of the electronic device, wherein the target object is included in the first image data;
an identification module for identifying a location of the target object in the first image data;
the first display module is configured to display direction prompt information according to a position relationship between a position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first device after movement includes the target object.
7. The electronic device of claim 6, wherein the electronic device comprises a primary camera and a secondary camera on the same surface, wherein the first device is the primary camera and the second device is the secondary camera; or,
the electronic equipment comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
8. The electronic device of claim 6 or 7, further comprising:
the second acquisition module is used for acquiring second image data in real time through the second device in the moving process or after the electronic equipment moves;
and the shooting module is used for shooting an image through the first device under the condition that the position of the target object in the second image data is at least partially overlapped with the preset position.
9. The electronic device of claim 6, further comprising:
the third acquisition module is used for acquiring a second preview image through the first device and displaying the second preview image on the electronic equipment;
a detection module, configured to detect whether the target object in the second preview image can be tracked according to an object tracking technique when an input operation for the second preview image is received;
and the fourth acquisition module is used for acquiring the first preview image through the first device under the condition that the target object can be tracked.
10. The electronic device of claim 9, further comprising:
and the second display module is used for displaying prompt information for prompting the user that the target object cannot be tracked, in the case that the target object cannot be tracked.
11. An electronic device, comprising: memory, processor and computer program stored on the memory and executable on the processor, the processor implementing the steps in the display method according to any one of claims 1-5 when executing the computer program.
12. A computer-readable storage medium, having stored thereon a computer program which, when being executed by a processor, carries out the steps in the display method according to any one of claims 1 to 5.
CN201910812999.2A 2019-08-30 2019-08-30 Display method and electronic equipment Active CN110602389B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910812999.2A CN110602389B (en) 2019-08-30 2019-08-30 Display method and electronic equipment
PCT/CN2020/104334 WO2021036623A1 (en) 2019-08-30 2020-07-24 Display method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910812999.2A CN110602389B (en) 2019-08-30 2019-08-30 Display method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110602389A (published 2019-12-20)
CN110602389B (published 2021-11-02)

Family

ID=68856771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910812999.2A Active CN110602389B (en) 2019-08-30 2019-08-30 Display method and electronic equipment

Country Status (2)

Country Link
CN (1) CN110602389B (en)
WO (1) WO2021036623A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111479055A (en) * 2020-04-10 2020-07-31 Oppo广东移动通信有限公司 Shooting method, shooting device, electronic equipment and storage medium
CN111770277A (en) * 2020-07-31 2020-10-13 RealMe重庆移动通信有限公司 Auxiliary shooting method, terminal and storage medium
WO2021036623A1 (en) * 2019-08-30 2021-03-04 维沃移动通信有限公司 Display method and electronic device
CN112954220A (en) * 2021-03-03 2021-06-11 北京蜂巢世纪科技有限公司 Image preview method and device, electronic equipment and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114286009A (en) * 2021-12-29 2022-04-05 维沃移动通信有限公司 Inverted image shooting method and device, electronic equipment and storage medium


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8115801B2 (en) * 2008-05-15 2012-02-14 Arcsoft, Inc. Method of automatic photographs stitching
CN110602389B (en) * 2019-08-30 2021-11-02 维沃移动通信有限公司 Display method and electronic equipment

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110304749A1 (en) * 2010-06-10 2011-12-15 Canon Kabushiki Kaisha Image pickup apparatus and method for controlling image pickup apparatus
CN103377471A (en) * 2012-04-16 2013-10-30 株式会社理光 Method and device for object positioning, and method and device for determining optimal camera pair
CN103108164A (en) * 2013-02-01 2013-05-15 南京迈得特光学有限公司 Compound eye type panorama continuous tracking and monitoring system
CN105960796A (en) * 2014-02-18 2016-09-21 富士胶片株式会社 Automatic tracking image-capture apparatus
US20170094189A1 (en) * 2015-09-28 2017-03-30 Kyocera Corporation Electronic apparatus, imaging method, and non-transitory computer readable recording medium
CN105759839A (en) * 2016-03-01 2016-07-13 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) visual tracking method, apparatus, and UAV
TW201812424A (en) * 2016-09-02 2018-04-01 聚晶半導體股份有限公司 Image capturing apparatus and image zooming method thereof
CN106454132A (en) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN108366220A * 2018-04-23 2018-08-03 维沃移动通信有限公司 Video call processing method and mobile terminal
CN108712602A (en) * 2018-04-24 2018-10-26 Oppo广东移动通信有限公司 Camera control method, device, mobile terminal and storage medium
CN108429881A * 2018-05-08 2018-08-21 山东超景深信息科技有限公司 Application method of a fixed-focus pan-tilt camera system that avoids repeated zoom framing
CN108833768A * 2018-05-10 2018-11-16 信利光电股份有限公司 Multi-camera image capturing method, camera terminal and readable storage medium
CN109788208A * 2019-01-30 2019-05-21 华通科技有限公司 Target identification method and system based on multiple groups of focal-length image sources

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
明翠: "Research on Real-Time Monitoring Technology and Communication Issues of Mobile Phone Dual Cameras", 《中国新通信》 (China New Telecommunications) *
林忠等: "Motorized Zoom Tracking Applied to Large-Zoom-Ratio Surveillance Cameras", 《应用光学》 (Journal of Applied Optics) *


Also Published As

Publication number Publication date
CN110602389B (en) 2021-11-02
WO2021036623A1 (en) 2021-03-04

Similar Documents

Publication Publication Date Title
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
CN108668083B (en) Photographing method and terminal
CN109361869B (en) Shooting method and terminal
CN110602389B (en) Display method and electronic equipment
CN108471498B (en) Shooting preview method and terminal
CN110557575B (en) Method for eliminating glare and electronic equipment
CN108495029B (en) Photographing method and mobile terminal
CN109461117B (en) Image processing method and mobile terminal
CN108989672B (en) Shooting method and mobile terminal
CN107730460B (en) Image processing method and mobile terminal
CN107749046B (en) Image processing method and mobile terminal
CN111031234B (en) Image processing method and electronic equipment
CN111401463B (en) Method for outputting detection result, electronic equipment and medium
CN111031253B (en) Shooting method and electronic equipment
CN108881721B (en) Display method and terminal
CN110830713A (en) Zooming method and electronic equipment
CN108924422B (en) Panoramic photographing method and mobile terminal
CN108174110B (en) Photographing method and flexible screen terminal
CN108174109B (en) Photographing method and mobile terminal
CN111405181B (en) Focusing method and electronic equipment
CN110908517B (en) Image editing method, image editing device, electronic equipment and medium
CN109005337B (en) Photographing method and terminal
CN108924413B (en) Shooting method and mobile terminal
CN111131706B (en) Video picture processing method and electronic equipment
CN109660750B (en) Video call method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant