WO2021036623A1 - Display method and electronic device

Display method and electronic device (显示方法及电子设备)

Info

Publication number: WO2021036623A1
Application number: PCT/CN2020/104334
Authority: WIPO (PCT)
Prior art keywords: target object, electronic device, image data, image, preview image
Other languages: English (en), French (fr)
Inventor: 胡吉祥 (Hu Jixiang)
Applicant: 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera

Definitions

  • the present disclosure relates to the field of image processing technology, and in particular to a display method and electronic equipment.
  • The electronic equipment in the related art is generally equipped with a camera, which can be used to capture images.
  • If the target object moves out of the angle of view of the lens, the user may move the electronic device in the opposite direction; as a result, it takes a long time to re-control the target object to appear in the angle of view of the lens.
  • the embodiments of the present disclosure provide a display method and an electronic device to solve the problem that it takes a long time to re-control the target object to appear in the angle of view of the lens.
  • The embodiments of the present disclosure provide a display method, including:
  • in the case that a first preview image acquired through a first device of the electronic device does not include a target object, acquiring first image data through a second device of the electronic device, wherein the first image data includes the target object;
  • identifying the position of the target object in the first image data; and
  • displaying direction prompt information according to the positional relationship between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used to prompt the user to move the electronic device so that the preview image acquired by the first device after the movement includes the target object.
  • the embodiments of the present disclosure also provide an electronic device, including:
  • the first acquiring module is configured to acquire the first image data through the second device of the electronic device in the case that the first preview image acquired through the first device of the electronic device does not include the target object, wherein the first image data includes the target object;
  • an identification module, configured to identify the position of the target object in the first image data; and
  • the first display module is configured to display direction prompt information according to the position relationship between the position of the target object in the first image data and the preset position, wherein the direction prompt information is used to prompt the user to move the The electronic device, so that the preview image acquired by the first device after the movement includes the target object.
  • The embodiments of the present disclosure also provide an electronic device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the steps of the above display method are implemented when the processor executes the computer program.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned display method are implemented.
  • In the embodiments of the present disclosure, in the case that the first preview image acquired through the first device of the electronic device does not include the target object, the first image data is acquired through the second device of the electronic device, wherein the first image data includes the target object; the position of the target object in the first image data is identified; and direction prompt information is displayed according to the positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt the user to move the electronic device so that the preview image obtained by the first device after the movement includes the target object.
  • Since the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, thereby reducing the time it takes to re-control the target object to appear in the angle of view of the lens.
  • FIG. 1 is a flowchart of a display method provided by an embodiment of the present disclosure
  • FIG. 2 is a flowchart of another display method provided by an embodiment of the present disclosure.
  • FIG. 3 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • FIG. 4 is a schematic structural diagram of another electronic device provided by an embodiment of the present disclosure.
  • FIG. 5 is a schematic structural diagram of another electronic device provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of another electronic device provided by an embodiment of the present disclosure.
  • FIG. 7 is a schematic structural diagram of another electronic device provided by an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of a display method provided by an embodiment of the present disclosure. As shown in FIG. 1, it includes the following steps:
  • Step 101: In the case that the first preview image acquired by the first device of the electronic device does not include the target object, acquire the first image data through the second device of the electronic device, wherein the first image data includes the target object.
  • There are multiple scenarios in which the first preview image obtained by the first device of the electronic device does not include the target object.
  • One scenario is: the first preview image obtained by the first device originally includes the target object, but after the focal length of the first device is adjusted, the first preview image obtained by the first device no longer includes the target object. The electronic device may adjust the focal length of the first device according to an instruction input by the user.
  • The specific type of the instruction is not limited here.
  • For example, the instruction may be a voice instruction; of course, the instruction may also be a touch operation, such as a single-finger operation or a two-finger operation.
  • Another scenario is: the first preview image obtained by the first device originally includes the target object, but after the electronic device is moved, the first preview image obtained by the first device no longer includes the target object.
  • the viewing angle of the second device may be greater than or equal to the viewing angle of the first device.
  • the first device may be a first camera
  • the second device may be a second camera
  • the first camera and the second camera may be located on the same plane, for example, both are located on the backplane of the electronic device.
  • the first camera and the second camera may be located on the same straight line on the same plane.
  • the viewing angle of the second camera may be greater than the viewing angle of the first camera.
  • the specific type of the second device is not limited here.
  • the second device may be a camera or a sensor.
  • The first image data and the first preview image may both be displayed on the display screen of the electronic device.
  • For example, the first preview image may be displayed in a first area of the display screen of the electronic device, and the first image data may be displayed in a second area of the display screen.
  • Alternatively, the first image data may not be displayed on the display screen of the electronic device, but only recorded in the electronic device.
  • the specific type of the target object is not limited here.
  • The target object may be a human, another animal, or a plant, or another object such as the moon, the sun, or a street light.
  • Step 102 Identify the position of the target object in the first image data.
  • the object tracking technology can be used to identify the position of the target object in the first image data.
  • The first image data may be a rectangular image.
  • The target object may be located in the upper left corner, upper right corner, lower left corner, lower right corner, or the middle of the rectangular image.
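The coarse positions listed above can be computed directly from the identified coordinates. A minimal sketch in Python (the one-third grid split and the region names are illustrative assumptions, not specified by the disclosure):

```python
def classify_region(x, y, width, height):
    """Map a target-object position (x, y) inside a width x height
    rectangular image to one of nine coarse regions, e.g. 'upper left'.
    Image coordinates: x grows rightward, y grows downward."""
    col = "left" if x < width / 3 else ("right" if x > 2 * width / 3 else "center")
    row = "upper" if y < height / 3 else ("lower" if y > 2 * height / 3 else "middle")
    if row == "middle" and col == "center":
        return "middle"
    return f"{row} {col}"

# Example: a target near the top-right of a 1920x1080 frame
print(classify_region(1800, 100, 1920, 1080))  # upper right
```

The equal one-third split is only one possible partition; an implementation could just as well use the preset frame's own bounds.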
  • Step 103: Display direction prompt information according to the positional relationship between the position of the target object in the first image data and the preset position, where the direction prompt information is used to prompt the user to move the electronic device so that the preview image acquired by the first device after the movement includes the target object.
  • the preset position may be the middle position of the viewing angle of the second device, or may be a position left or right of the middle position of the viewing angle of the second device.
  • the preset position may also be the middle position of the image data collected by the second device, for example: the middle position of the first image data.
  • the direction prompt information may include a prompt direction
  • the prompt direction may be presented in the form of an arrow or text. For example: when the target object needs to be moved toward the upper right corner, an arrow pointing to the upper right corner can be displayed or the text "move to the upper right corner" can be displayed directly.
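The prompt text in the example above can be derived from the offset between the identified target position and the preset position. A minimal sketch, assuming image coordinates with x growing rightward and y growing downward, and a small dead zone so no prompt is produced when the target is already near the preset position (both conventions are assumptions for illustration):

```python
def direction_prompt(target, preset, deadzone=10):
    """Build the direction prompt text from the offset between the
    target-object position and the preset position.  The prompt names
    the side of the preset position on which the target currently sits
    (which side the user should steer toward is a design choice)."""
    dx = target[0] - preset[0]
    dy = target[1] - preset[1]
    horiz = "right" if dx > deadzone else ("left" if dx < -deadzone else "")
    vert = "upper" if dy < -deadzone else ("lower" if dy > deadzone else "")
    if not horiz and not vert:
        return None  # target already at the preset position
    corner = f"{vert} {horiz}".strip()
    return f"move to the {corner} corner" if vert and horiz else f"move to the {corner}"

# Target above and to the right of the preset position:
print(direction_prompt((1500, 200), (960, 540)))  # move to the upper right corner
```

The same offset could equally drive an arrow glyph instead of text, matching the arrow form of the prompt direction described above.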
  • The above-mentioned electronic device may be a mobile phone, a tablet computer (Tablet Personal Computer), a laptop computer (Laptop Computer), a personal digital assistant (Personal Digital Assistant, PDA), a mobile Internet device (Mobile Internet Device, MID), a wearable device (Wearable Device), and so on.
  • In the embodiments of the present disclosure, in the case that the first preview image acquired through the first device of the electronic device does not include the target object, the first image data is acquired through the second device of the electronic device, wherein the first image data includes the target object; the position of the target object in the first image data is identified; and direction prompt information is displayed according to the positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt the user to move the electronic device so that the preview image obtained by the first device after the movement includes the target object.
  • Since the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, thereby reducing the time it takes to re-control the target object to appear in the angle of view of the lens.
  • FIG. 2 is a flowchart of another display method provided by an embodiment of the present disclosure.
  • the main difference between this embodiment and the previous embodiment is that the target object needs to be selected before the first image data is acquired through the second device. As shown in Figure 2, it includes the following steps:
  • Step 201 Obtain a second preview image through the first device, and display the second preview image on the electronic device.
  • the second preview image may include the target object, and of course, the second preview image may also include other background objects.
  • the electronic device includes a main camera and a sub camera located on the same surface, the first device is the main camera, and the second device is the sub camera; or,
  • the electronic device includes a camera assembly, the camera assembly includes a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
  • the main camera and the secondary camera may both be located on the same surface of the electronic device, for example, the back plate of the electronic device.
  • the main camera and the secondary camera may also be arranged on the same straight line.
  • the viewing angle of the secondary camera may be greater than or equal to the viewing angle of the main camera.
  • the camera assembly may include a first camera and an image sensor, and the viewing angle of the image sensor may also be greater than or equal to the viewing angle of the first camera.
  • The content included in the first image data acquired by the image sensor for a certain object may be greater than the content included in the first preview image acquired by the first camera for the same object. As an optional implementation manner, the first image data acquired by the image sensor may include the target object and a first background object, while the first preview image obtained by the first camera may include only part of the first background object.
  • the above-mentioned first camera may be a camera that does not support optical zoom.
  • the type of the second device may be different, for example, it may be a secondary camera or an image sensor. In this way, due to the different type of the second device, the flexibility of acquiring the first image data through the second device is increased.
  • Step 202: In the case of receiving an input operation on the second preview image, detect whether the target object in the second preview image can be tracked according to an object tracking technology. If the target object cannot be tracked, go to step 203; if input information indicating a target-object tracking error is received, return to step 202; if the target object can be tracked, go to step 204.
  • The Object Tracking (OT) technology can be: in the process of image shooting, tracking an object and marking it in the image in real time through a preset frame.
  • object tracking techniques include face tracking.
  • For example, the user can click on a certain position of the image during the image shooting process; the object tracking technology can identify the object corresponding to that position according to where the user clicks on the image, expand to the edge of the object, and draw the corresponding preset frame (such as a rectangular frame) according to the edge of the object.
  • Face tracking can identify the position of the face in the image when the image is taken, and draw the corresponding rectangular frame of the face according to the object tracking technology.
  • Specifically, the position in the second preview image is determined according to the input operation, and the object tracking technology is then used to detect whether a target object can be tracked at that position.
  • For example, the input operation determines a target position; combined with the object tracking technology, the detection expands outward with the target position as the center and checks whether the contour of the target object can be tracked.
  • The contour of the target object may be marked with a rectangular frame or a circular frame.
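The disclosure leaves the tracking algorithm itself open. As a toy stand-in only (region growing over similar grayscale values; the tolerance, the minimum-size threshold, and the plain 2-D list representation are all assumptions for illustration), the expand-outward-from-the-clicked-position idea can be sketched as:

```python
from collections import deque

def track_at(image, seed, tol=25, min_pixels=4):
    """Grow a region outward from the clicked position over pixels whose
    intensity is within `tol` of the seed pixel, and return its bounding
    rectangle (x0, y0, x1, y1), or None when the region is too small to
    count as a trackable object.  `image` is a 2-D list of grayscale
    values indexed as image[y][x]."""
    h, w = len(image), len(image[0])
    sx, sy = seed
    base = image[sy][sx]
    seen = {(sx, sy)}
    queue = deque([(sx, sy)])
    while queue:  # breadth-first expansion over 4-connected neighbours
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in seen \
                    and abs(image[ny][nx] - base) <= tol:
                seen.add((nx, ny))
                queue.append((nx, ny))
    if len(seen) < min_pixels:
        return None  # step 203: prompt that the target cannot be tracked
    xs = [p[0] for p in seen]
    ys = [p[1] for p in seen]
    return (min(xs), min(ys), max(xs), max(ys))
```

A real implementation would use a proper tracker rather than intensity similarity; the sketch only mirrors the control flow of steps 202 through 204 (a rectangle is found, or the failure prompt is shown).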
  • step 202 can be returned, so that the target object can be re-determined according to the user's input operation.
  • Step 203 Display prompt information for prompting the user that the target object cannot be tracked.
  • The content of the prompt information may be text information, such as "The target object cannot be tracked"; of course, the content of the prompt information may also be image information, and the image information may include the content "The target object cannot be tracked".
  • step 203 is an optional step.
  • the user can obtain the tracking status of the target object in time, and can make a corresponding response according to the prompt information, which improves the user experience.
  • Step 204 Under the condition that the target object can be tracked, obtain a first preview image through the first device. And it is detected whether the target object is included in the first preview image; if the target object is not included in the first preview image, step 205 is executed.
  • There are multiple scenarios in which the first preview image obtained by the first device of the electronic device does not include the target object.
  • One scenario is: the first preview image obtained by the first device originally includes the target object, but after the focal length of the first device is adjusted, the first preview image obtained by the first device no longer includes the target object. The electronic device may adjust the focal length of the first device according to an instruction input by the user.
  • The specific type of the instruction is not limited here.
  • For example, the instruction may be a voice instruction; of course, the instruction may also be a touch operation, such as a single-finger operation or a two-finger operation.
  • Another scenario is: the first preview image obtained by the first device originally includes the target object, but after the electronic device is moved, the first preview image obtained by the first device no longer includes the target object.
  • Step 205 Acquire first image data through a second device of the electronic device, wherein the first image data includes the target object.
  • the specific expressions of the second device and the first image data can refer to the corresponding expressions in the previous embodiment, which will not be repeated here.
  • Step 206 Identify the position of the target object in the first image data.
  • the object tracking technology can be used to identify the position of the target object in the first image data.
  • the first image data can be a rectangular image
  • the target object can be in the upper left corner, upper right corner, lower left corner, and lower right corner of the rectangular image. Or in the middle position.
  • Step 207: Display direction prompt information according to the positional relationship between the position of the target object in the first image data and the preset position, wherein the direction prompt information is used to prompt the user to move the electronic device so that the preview image acquired by the first device after the movement includes the target object.
  • the specific expressions of the preset position and the direction prompt information can refer to the corresponding expressions in the previous embodiment, which will not be repeated here.
  • Optionally, after the direction prompt information is displayed, the method further includes:
  • acquiring second image data in real time through the second device during or after the movement of the electronic device; and taking an image through the first device when the position of the target object in the second image data at least partially overlaps with the preset position.
  • the focal length of the second device (which can also be referred to as the viewing angle) has not changed.
  • the first device may also obtain the preview image in real time.
  • If the target object completely deviates from the viewing angle of the second device, the direction prompt information may no longer be displayed.
  • In this case, the electronic device may display prompt text such as "The object has completely deviated, please readjust the direction of the electronic device" or "The object has completely deviated, please readjust the lens direction of the electronic device". After the user moves the electronic device again, when the second image data obtained in real time through the second device includes the target object, the direction prompt can be displayed again.
  • the preset position may be a rectangular frame or a circular frame, etc.
  • the electronic device can take an image through the first device.
  • The contour and position of the target object can be tracked according to the object tracking technology, which is convenient for the electronic device to subsequently complete continuous shooting of the target object.
  • In this way, the movement efficiency of the electronic device is improved; and when the target object moves to the preset position, the image can be taken through the first device. The image obtained at this time has a better effect, which improves the imaging effect of the captured image.
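The capture condition above, that the target object's tracked frame at least partially overlaps the preset frame, reduces to a rectangle-intersection test when both frames are rectangular. A minimal sketch (the (x0, y0, x1, y1) tuple representation is an assumption for illustration):

```python
def rects_overlap(a, b):
    """Return True when two rectangles (x0, y0, x1, y1) at least
    partially overlap; each rectangle separates the pair only when it
    lies entirely to one side of the other."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def should_capture(target_rect, preset_rect):
    """Trigger the shot through the first device once the tracked
    target-object rectangle at least partially overlaps the preset frame."""
    return rects_overlap(target_rect, preset_rect)
```

For a circular preset frame the same decision would compare the distance from the circle's center against its radius plus the rectangle's extent; the rectangular case shown here is the simpler of the two forms mentioned above.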
  • the preset position is a center position of the first image data or the second image data.
  • the second device acquires image data for the object.
  • the content included in the middle position of the image data may be the same as the content included in the image data obtained by the first device for the object.
  • For example, in the case that the first device is a first camera and the second device is an image sensor, the first device and the second device aim at the same object.
  • The content included in the image acquired through the first device may be the same as the content included in the middle position of the image data acquired through the second device; it can also be understood that the image acquired through the first device may be the middle portion of the image data acquired by the second device after being enlarged by an interpolation method.
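The relationship between the two views can be illustrated with a center-crop computation: the first camera's image approximately corresponds to the middle window of the sensor frame enlarged by interpolation. A sketch under the simplifying assumption that the two fields of view compare by a single linear scale factor (the disclosure does not specify the mapping):

```python
def center_crop_window(sensor_w, sensor_h, scale):
    """Return the (x0, y0, x1, y1) window at the middle of the sensor
    frame that covers 1/scale of each dimension; enlarging this window
    by `scale` (e.g. via interpolation) approximates the first camera's
    narrower view.  `scale` >= 1 is the assumed linear ratio between
    the two viewing angles."""
    crop_w = sensor_w / scale
    crop_h = sensor_h / scale
    x0 = (sensor_w - crop_w) / 2
    y0 = (sensor_h - crop_h) / 2
    return (x0, y0, x0 + crop_w, y0 + crop_h)

# A 4000x3000 sensor frame with a first camera whose view is 2x narrower:
print(center_crop_window(4000, 3000, 2))  # (1000.0, 750.0, 3000.0, 2250.0)
```

This also shows why the middle of the second device's image data is a natural choice for the preset position: a target sitting there is already inside the first camera's view.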
  • In this way, the image shooting through the first device is completed after the target object moves to the center position, thereby further improving the imaging effect of the image.
  • Since the target object is determined first, the position of the target object can be recognized more conveniently and quickly in the first image data, and the speed of determining the prompt direction is increased accordingly.
  • FIG. 3 is a structural diagram of an electronic device provided by an embodiment of the present disclosure, which can realize the details of the display method in the foregoing embodiment and achieve the same effect.
  • the electronic device 300 includes:
  • the first acquisition module 301 is configured to acquire the first image data through the second device of the electronic device when the first preview image acquired through the first device of the electronic device does not include the target object, wherein the The first image data includes the target object;
  • the recognition module 302 is configured to recognize the position of the target object in the first image data; and
  • the first display module 303 is configured to display direction prompt information according to the positional relationship between the position of the target object in the first image data and the preset position, wherein the direction prompt information is used to prompt the user to move the electronic device so that the preview image acquired by the first device after the movement includes the target object.
  • the electronic device 300 includes a main camera and a sub camera located on the same surface, the first device is the main camera, and the second device is the sub camera; or,
  • the electronic device 300 includes a camera assembly, the camera assembly includes a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
  • the electronic device 300 further includes:
  • the second acquisition module 304 is configured to acquire second image data in real time through the second device during or after the movement of the electronic device;
  • the photographing module 305 is configured to photograph an image through the first device when the position of the target object in the second image data at least partially overlaps with the preset position.
  • the electronic device 300 further includes:
  • the third obtaining module 306 is configured to obtain a second preview image through the first device, and display the second preview image on the electronic device;
  • the detection module 307 is configured to detect whether the target object in the second preview image can be tracked according to the object tracking technology in the case of receiving an input operation for the second preview image;
  • the fourth acquisition module 308 is configured to acquire a first preview image through the first device when the target object can be tracked.
  • the electronic device 300 further includes:
  • the second display module 309 is configured to display prompt information for prompting the user that the target object cannot be tracked when the target object cannot be tracked.
  • the electronic device provided by the embodiment of the present disclosure can implement each process implemented by the electronic device in the method embodiments of FIG. 1 to FIG. 2, and to avoid repetition, details are not described herein again.
  • the electronic device since the electronic device displays the direction prompt information, the electronic device can move according to the direction prompt information, thereby reducing the time spent re-controlling the lens to aim at the target object and improving the efficiency of aiming at the target object.
  • FIG. 7 is a schematic diagram of the hardware structure of another electronic device that implements various embodiments of the present disclosure.
  • The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and other components.
  • The electronic device may include more or fewer components than those shown in the figure, or combine certain components, or have a different component layout.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, in-vehicle terminals, wearable devices, and pedometers.
  • The processor 710 is configured to: in the case that the first preview image acquired through the first device of the electronic device does not include the target object, acquire the first image data through the second device of the electronic device, wherein the first image data includes the target object; identify the position of the target object in the first image data; and display direction prompt information according to the positional relationship between the position of the target object in the first image data and the preset position, wherein the direction prompt information is used to prompt the user to move the electronic device so that the preview image acquired by the first device after the movement includes the target object.
  • the electronic device includes a main camera and a sub camera located on the same surface, the first device is the main camera, and the second device is the sub camera; or,
  • the electronic device includes a camera assembly, the camera assembly includes a first camera and an image sensor, the first device is the first camera, and the second device is the image sensor.
  • The processor 710 is further configured to: acquire second image data in real time through the second device during or after the movement of the electronic device; and take an image through the first device when the position of the target object in the second image data at least partially overlaps with the preset position.
  • The processor 710 is further configured to: obtain a second preview image through the first device and display the second preview image on the electronic device; in the case of receiving an input operation on the second preview image, detect whether the target object in the second preview image can be tracked according to the object tracking technology; and acquire the first preview image through the first device when the target object can be tracked.
  • the processor 710 is further configured to: when the target object cannot be tracked, display prompt information for prompting the user that the target object cannot be tracked.
  • the electronic device can move according to the direction prompt information, thereby reducing the time it takes to re-control the target object to appear in the angle of view of the lens.
  • The radio frequency unit 701 can be used for receiving and sending signals in the process of sending and receiving information or talking. Specifically, downlink data from a base station is received and then handed to the processor 710 for processing; in addition, uplink data is sent to the base station.
  • the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 701 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 702, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 703 may convert the audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output it as sound. Moreover, the audio output unit 703 may also provide audio output related to a specific function performed by the electronic device 700 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 704 is used to receive audio or video signals.
  • the input unit 704 may include a graphics processing unit (GPU) 7041 and a microphone 7042.
  • The graphics processor 7041 processes the image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 706.
  • the image frame processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or sent via the radio frequency unit 701 or the network module 702.
  • the microphone 7042 can receive sound, and can process such sound into audio data.
  • in telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 701 for output.
  • the electronic device 700 further includes at least one sensor 705, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 7061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 7061 and/or the backlight when the electronic device 700 is moved close to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer and tapping); the sensor 705 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
  • the display unit 706 is used to display information input by the user or information provided to the user.
  • the display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 707 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 707 includes a touch panel 7071 and other input devices 7072.
  • the touch panel 7071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 7071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 710, and receives and executes commands sent by the processor 710.
  • the touch panel 7071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 707 may also include other input devices 7072.
  • other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 7071 can be overlaid on the display panel 7061.
  • when the touch panel 7071 detects a touch operation on or near it, the operation is sent to the processor 710 to determine the type of the touch event, and the processor 710 then provides corresponding visual output on the display panel 7061 according to the type of the touch event.
  • although the touch panel 7071 and the display panel 7061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 7071 and the display panel 7061 can be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
  • the interface unit 708 is an interface for connecting an external device and the electronic device 700.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 708 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 700 or can be used to connect the electronic device 700 to an external device. Transfer data between devices.
  • the memory 709 can be used to store software programs and various data.
  • the memory 709 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 709 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 710 is the control center of the electronic device. It connects the various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, so as to monitor the electronic device as a whole.
  • the processor 710 may include one or more processing units; optionally, the processor 710 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 710.
  • the electronic device 700 may also include a power source 711 (such as a battery) for supplying power to various components.
  • the power source 711 may be logically connected to the processor 710 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
  • the electronic device 700 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides an electronic device, including a processor 710, a memory 709, and a computer program stored in the memory 709 and executable on the processor 710; when the computer program is executed by the processor 710, the processes of the foregoing display method embodiments are implemented.
  • the embodiment of the present disclosure also provides a computer-readable storage medium, and a computer program is stored on the computer-readable storage medium.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk, or optical disk, etc.
  • the technical solutions of the present disclosure essentially, or the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions for instructing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.

Abstract

The present disclosure provides a display method and an electronic device. The method includes: when a target object is not included in a first preview image acquired by a first component of an electronic device, acquiring first image data by a second component of the electronic device, where the first image data includes the target object; identifying a position of the target object in the first image data; and displaying direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.

Description

Display method and electronic device
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201910812999.2, filed in China on August 30, 2019, which is incorporated herein by reference in its entirety.
Technical field
The present disclosure relates to the field of image processing technologies, and in particular, to a display method and an electronic device.
Background
Electronic devices in the related art are generally equipped with a camera that can be used to capture images. When capturing an image that includes a target object, the focal length can be changed to adjust the size of the target object in the image so as to achieve a better shooting effect. In practical use, however, the target object is often lost after the focal length is adjusted or the electronic device is moved. In the process of moving the electronic device again so that the target object appears within the angle of view of the lens of the electronic device, the electronic device may be moved in the opposite direction, so it takes a long time to bring the target object back into the angle of view of the lens.
Summary
Embodiments of the present disclosure provide a display method and an electronic device, to solve the problem that it takes a long time to bring a target object back into the angle of view of a lens.
To solve the above technical problem, the present disclosure is implemented as follows:
According to a first aspect, an embodiment of the present disclosure provides a display method, including:
when a target object is not included in a first preview image acquired by a first component of an electronic device, acquiring first image data by a second component of the electronic device, where the first image data includes the target object;
identifying a position of the target object in the first image data; and
displaying direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
According to a second aspect, an embodiment of the present disclosure further provides an electronic device, including:
a first acquisition module, configured to: when a target object is not included in a first preview image acquired by a first component of the electronic device, acquire first image data by a second component of the electronic device, where the first image data includes the target object;
an identification module, configured to identify a position of the target object in the first image data; and
a first display module, configured to display direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
According to a third aspect, an embodiment of the present disclosure further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the foregoing display method when executing the computer program.
According to a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program, where the computer program implements the steps of the foregoing display method when executed by a processor.
In the embodiments of the present disclosure, when a target object is not included in a first preview image acquired by a first component of an electronic device, first image data is acquired by a second component of the electronic device, where the first image data includes the target object; a position of the target object in the first image data is identified; and direction prompt information is displayed according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object. In this way, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time taken to bring the target object back into the angle of view of the lens.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present disclosure more clearly, the following briefly describes the accompanying drawings required in the description of the embodiments of the present disclosure. Apparently, the accompanying drawings in the following description show merely some embodiments of the present disclosure, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a flowchart of a display method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of another display method according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure.
Detailed description
The following clearly and completely describes the technical solutions in the embodiments of the present disclosure with reference to the accompanying drawings in the embodiments of the present disclosure. Apparently, the described embodiments are some but not all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
Referring to FIG. 1, FIG. 1 is a flowchart of a display method according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes the following steps:
Step 101: When a target object is not included in a first preview image acquired by a first component of an electronic device, acquire first image data by a second component of the electronic device, where the first image data includes the target object.
A specific scenario in which the target object is not included in the first preview image acquired by the first component of the electronic device may be the following: the preview image initially acquired by the first component includes the target object, but after the focal length of the first component is adjusted, the first preview image acquired by the first component does not include the target object. The electronic device may specifically adjust the focal length of the first component according to an instruction input by the user, and the specific type of the instruction is not limited here. For example, the instruction may be a voice instruction, or may be a touch operation, such as a one-finger or two-finger operation.
In addition, a specific scenario in which the target object is not included in the first preview image acquired by the first component of the electronic device may also be the following: the preview image initially acquired by the first component includes the target object, but after the electronic device is moved, the first preview image acquired by the first component does not include the target object.
The angle of view of the second component may be greater than or equal to that of the first component. For example, the first component may be a first camera and the second component may be a second camera, and the first camera and the second camera may be located on a same plane, for example, both on the back panel of the electronic device. Optionally, the first camera and the second camera may be located on a same straight line on the same plane. Certainly, the angle of view of the second camera may be greater than that of the first camera.
In addition, the specific type of the second component is not limited here. For example, the second component may be a camera, or may be a sensor or the like.
The first image data and the first preview image may both be displayed on the display screen of the electronic device. For example, the first preview image may be displayed in a first area of the display screen of the electronic device, and the first image data may be displayed in a second area of the display screen of the electronic device. Certainly, the first image data may also not be displayed on the display screen of the electronic device and only be recorded in the electronic device.
The specific type of the target object is not limited here. For example, the target object may be a person, or may be another animal or a plant, such as the moon, the sun, or a street lamp.
Step 102: Identify a position of the target object in the first image data.
The position of the target object in the first image data may be identified by using an object tracking technology. For example, the first image data may be a rectangular image, and the target object may be located at the upper left corner, the upper right corner, the lower left corner, the lower right corner, or the middle of the rectangular image.
Step 103: Display direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
The specific preset position is not limited here. For example, the preset position may be the middle of the angle of view of the second component, or may be a position slightly to the left or right of the middle of the angle of view of the second component. Certainly, the preset position may also be the middle of the image data captured by the second component, for example, the middle of the first image data.
The direction prompt information may include a prompt direction, and the prompt direction may be presented in a form such as an arrow or text. For example, when prompting that the target object needs to move toward the upper right corner, an arrow pointing to the upper right corner may be displayed, or the text "move toward the upper right corner" may be displayed directly.
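As a rough illustration of how such a prompt direction could be derived from the positional relationship described above (this sketch and all names in it are our own assumptions, not part of the disclosure), the offset between the object's position and the preset position can be mapped to a textual hint:

```python
def direction_hint(object_center, preset_center, tolerance=0.05):
    """Return a textual hint for which way to move toward the preset position.

    object_center and preset_center are (x, y) in normalized image
    coordinates (0..1, y growing downward). All names here are
    illustrative; the disclosure only specifies arrow or text prompts.
    """
    dx = object_center[0] - preset_center[0]
    dy = object_center[1] - preset_center[1]
    # Dead zone: within `tolerance` of the preset position, no hint is needed.
    horiz = "right" if dx > tolerance else "left" if dx < -tolerance else ""
    vert = "up" if dy < -tolerance else "down" if dy > tolerance else ""
    if not horiz and not vert:
        return "on target"
    return "move toward the " + "-".join(p for p in (vert, horiz) if p)
```

For example, an object in the upper-right region relative to a centered preset position yields an "up-right" hint, which could equally be rendered as an arrow.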
In this embodiment of the present disclosure, the foregoing electronic device may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
In the embodiments of the present disclosure, when a target object is not included in a first preview image acquired by a first component of an electronic device, first image data is acquired by a second component of the electronic device, where the first image data includes the target object; a position of the target object in the first image data is identified; and direction prompt information is displayed according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object. In this way, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time taken to bring the target object back into the angle of view of the lens.
Referring to FIG. 2, FIG. 2 is a flowchart of another display method according to an embodiment of the present disclosure. The main difference between this embodiment and the previous embodiment is that a target object needs to be selected before the first image data is acquired by the second component. As shown in FIG. 2, the method includes the following steps:
Step 201: Acquire a second preview image by the first component, and display the second preview image on the electronic device.
The second preview image may include the target object, and certainly may also include other background objects.
Optionally, the electronic device includes a main camera and a secondary camera located on a same surface, the first component is the main camera, and the second component is the secondary camera; or,
the electronic device includes a camera assembly, the camera assembly includes a first camera and an image sensor, the first component is the first camera, and the second component is the image sensor.
The main camera and the secondary camera may both be located on a same surface of the electronic device, for example, on the back panel of the electronic device. Optionally, the main camera and the secondary camera may also be arranged on a same straight line. In addition, it should be noted that the angle of view of the secondary camera may be greater than or equal to that of the main camera.
The camera assembly may include a first camera and an image sensor, and the angle of view of the image sensor may also be greater than or equal to that of the first camera. For example, the content included in the first image data acquired by the image sensor for an object may be greater than the content included in the first preview image acquired by the first camera for the same object. As an optional implementation, the first image data acquired by the image sensor may include the target object and a first background object, while the first preview image acquired by the first camera may include only part of the first background object. It should be noted that the first camera may be a camera that does not support optical zoom.
In the embodiments of the present disclosure, the second component may be of different types, for example, a secondary camera or an image sensor. The different possible types of the second component increase the flexibility of acquiring the first image data by the second component.
Step 202: When an input operation for the second preview image is received, detect, according to an object tracking technology, whether the target object in the second preview image can be tracked. If the target object cannot be tracked, perform step 203; if input information indicating that the target object is tracked incorrectly is received, return to step 202; if the target object can be tracked, perform step 204.
The object tracking (OT) technology may be: during image shooting, tracking an object and marking it in the image in real time by using a preset frame. A common object tracking technology includes face tracking. For example, a user may tap a position in an image during image shooting, and the object tracking technology may identify, according to the tapped position, the object corresponding to that position, expand to the edge of the object, and draw a corresponding preset frame (such as a rectangular frame) according to the edge of the object. Face tracking can identify the position of a face in an image during image shooting and draw a corresponding rectangular frame for the face according to the object tracking technology.
When an input operation for the second preview image is received, a position in the second preview image is determined according to the input operation, and the object tracking technology is then used to detect whether the target object corresponding to the position determined by the input operation can be tracked. For example, the input operation determines a target position, and with the object tracking technology, expansion is performed outward from the target position as the center to detect whether the contour of the target object can be tracked. The contour of the target object may be a rectangular frame, a circular frame, or the like.
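A minimal sketch of the hit-test implied here, deciding whether the tapped position lands on a trackable object, might look as follows, assuming candidate object boxes are already available from some detector (the function name and box format are illustrative assumptions, not from the disclosure):

```python
def find_trackable(click, boxes):
    """Given a tap position (x, y) and candidate object bounding boxes
    (x, y, w, h), return the first box the tap falls inside, or None
    if no object at that position can be tracked."""
    cx, cy = click
    for (x, y, w, h) in boxes:
        if x <= cx <= x + w and y <= cy <= y + h:
            return (x, y, w, h)
    return None  # corresponds to the "cannot track" branch (step 203)
```

Returning None would correspond to displaying the "cannot track the target object" prompt; a returned box would seed the tracker for step 204.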
If the determined target object is wrong, the process may return to step 202, so that the target object can be determined again according to an input operation of the user.
Step 203: Display prompt information for prompting the user that the target object cannot be tracked.
The content of the prompt information may be text information, for example, "the target object cannot be tracked"; certainly, the content of the prompt message may also be image information, which may include the content "the target object cannot be tracked".
It should be noted that the type of the prompt information is not limited here. In addition, step 203 is an optional step.
In this way, when the foregoing prompt information is displayed, the user can learn the tracking status of the target object in time and respond accordingly according to the prompt information, which improves user experience.
Step 204: If the target object can be tracked, acquire a first preview image by the first component, and detect whether the first preview image includes the target object; if the first preview image does not include the target object, perform step 205.
A specific scenario in which the target object is not included in the first preview image acquired by the first component of the electronic device may be the following: the preview image initially acquired by the first component includes the target object, but after the focal length of the first component is adjusted, the first preview image acquired by the first component does not include the target object. The electronic device may specifically adjust the focal length of the first component according to an instruction input by the user, and the specific type of the instruction is not limited here. For example, the instruction may be a voice instruction, or may be a touch operation, such as a one-finger or two-finger operation.
In addition, a specific scenario in which the target object is not included in the first preview image acquired by the first component of the electronic device may also be the following: the preview image initially acquired by the first component includes the target object, but after the electronic device is moved, the first preview image acquired by the first component does not include the target object.
Step 205: Acquire first image data by the second component of the electronic device, where the first image data includes the target object.
For specific descriptions of the second component, the first image data, and the like, refer to the corresponding descriptions in the previous embodiment. Details are not repeated here.
Step 206: Identify a position of the target object in the first image data.
The position of the target object in the first image data may be identified by using an object tracking technology. For example, the first image data may be a rectangular image, and the target object may be located at the upper left corner, the upper right corner, the lower left corner, the lower right corner, or the middle of the rectangular image.
Step 207: Display direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt the user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
For specific descriptions of the preset position and the direction prompt information, refer to the corresponding descriptions in the previous embodiment. Details are not repeated here.
Optionally, after the displaying direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, the method further includes:
acquiring second image data in real time by the second component during or after the movement of the electronic device; and
capturing an image by the first component when the position of the target object in the second image data at least partly coincides with the preset position.
Before and after the user moves the electronic device in the prompted direction, the focal length (which may also be referred to as the angle of view) of the second component remains unchanged.
The first component may also acquire preview images in real time. When the user moves the electronic device in the prompted direction and the target object appears in the preview image acquired by the first component, the prompt information may no longer be displayed.
If the user does not move the electronic device in the prompted direction, so that the second image data acquired in real time by the second component does not include the target object either, the electronic device may display prompt text such as "the object has completely deviated, please readjust the direction of the electronic device" or "the object has completely deviated, please readjust the lens direction of the electronic device". If, after the user moves the electronic device again, the second image data acquired in real time by the second component includes the target object, the foregoing prompt direction may be displayed again.
In addition, the position of the target object in the second image data may at least partly coincide with the preset position. For example, the preset position may be in a shape such as a rectangular frame or a circular frame; when the preset position at least partly coincides with the position of the target object in the second image data, it can be determined that the target object has moved to the preset position. Because the imaging effect of the target object at the preset position is generally good, the electronic device can capture an image by the first component at this time.
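The "at least partly coincides" condition above can be read as an axis-aligned box overlap test between the tracked object's frame and the preset region. A minimal sketch (function names and the (x, y, w, h) box format are our own assumptions):

```python
def overlaps(a, b):
    """True if axis-aligned boxes (x, y, w, h) at least partly coincide."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def should_capture(object_box, preset_box):
    # Trigger the first component's image capture once the tracked object
    # box at least partly coincides with the preset region.
    return overlaps(object_box, preset_box)
```

In a real pipeline this check would run on each frame of second image data, firing the capture as soon as it becomes true.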
It should be noted that, after the image capture by the first component is completed and a first preview image is acquired by the first component again, the contour and position of the target object can continue to be tracked according to the object tracking technology, which facilitates subsequent continuous shooting of the target object by the electronic device.
In the embodiments of the present disclosure, because the electronic device can be moved according to the direction prompt information, the movement efficiency of the electronic device is improved; and when the target object moves to the preset position, an image can be captured by the first component. The image captured at this moment has a good imaging effect, thereby improving the imaging effect of the captured image.
Optionally, the preset position is the center position of the first image data or the second image data.
In an optional implementation, if the first component is a main camera and the second component is a secondary camera, when the first component and the second component acquire image data of a same object, the content included in the middle of the image data acquired by the second component for the object may be the same as the content included in the image data acquired by the first component for the object.
In another optional implementation, if the first component is a first camera and the second component is an image sensor, after the focal length of the first component is adjusted, when image data of a same object is acquired by the first component and the second component, the content included in the image acquired by the first component may be the same as the content included in the middle of the image data acquired by the second component. This may also be understood as follows: the content included in the image acquired by the first component may be the content included in the middle of the image data acquired by the second component after enlargement by interpolation.
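The interpolation-based enlargement mentioned here is essentially digital zoom: crop the middle of the wider image and upscale it back to full size. A toy nearest-neighbor sketch on a 2D grid (purely illustrative, not the disclosed implementation, which does not specify an interpolation method):

```python
def center_crop_zoom(img, zoom):
    """Nearest-neighbor digital zoom: crop the center 1/zoom region of a
    2D grid (list of rows) and upscale it back to the original size by
    interpolation. A toy stand-in for enlarging the middle of the wider
    sensor's image so it matches the narrower camera's view."""
    h, w = len(img), len(img[0])
    ch, cw = max(1, round(h / zoom)), max(1, round(w / zoom))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = [row[x0:x0 + cw] for row in img[y0:y0 + ch]]
    # Map each output pixel back to its nearest source pixel in the crop.
    return [[crop[int(r * ch / h)][int(c * cw / w)] for c in range(w)]
            for r in range(h)]
```

A production implementation would use bilinear or bicubic interpolation on real pixel data; the structure (center crop, then resample to the original resolution) is the same.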
In the embodiments of the present disclosure, because the preset position is the center position of the first image data or the second image data, the image is captured by the first component only after the target object moves to the center position, which further improves the imaging effect of the image.
In the embodiments of the present disclosure, through steps 201 to 207, the target object can be determined first. In this way, the position of the target object can subsequently be identified in the first image data more conveniently and quickly, which correspondingly increases the speed of determining the prompt direction.
Referring to FIG. 3, FIG. 3 is a structural diagram of an electronic device according to an embodiment of the present disclosure, which can implement the details of the display method in the foregoing embodiments and achieve the same effects. As shown in FIG. 3, the electronic device 300 includes:
a first acquisition module 301, configured to: when a target object is not included in a first preview image acquired by a first component of the electronic device, acquire first image data by a second component of the electronic device, where the first image data includes the target object;
an identification module 302, configured to identify a position of the target object in the first image data; and
a first display module 303, configured to display direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
Optionally, the electronic device 300 includes a main camera and a secondary camera located on a same surface, the first component is the main camera, and the second component is the secondary camera; or,
the electronic device 300 includes a camera assembly, the camera assembly includes a first camera and an image sensor, the first component is the first camera, and the second component is the image sensor.
Optionally, referring to FIG. 4, the electronic device 300 further includes:
a second acquisition module 304, configured to acquire second image data in real time by the second component during or after the movement of the electronic device; and
a capture module 305, configured to capture an image by the first component when the position of the target object in the second image data at least partly coincides with the preset position.
Optionally, referring to FIG. 5, the electronic device 300 further includes:
a third acquisition module 306, configured to acquire a second preview image by the first component and display the second preview image on the electronic device;
a detection module 307, configured to: when an input operation for the second preview image is received, detect, according to an object tracking technology, whether the target object in the second preview image can be tracked; and
a fourth acquisition module 308, configured to acquire a first preview image by the first component when the target object can be tracked.
Optionally, referring to FIG. 6, the electronic device 300 further includes:
a second display module 309, configured to display, when the target object cannot be tracked, prompt information for prompting the user that the target object cannot be tracked.
The electronic device provided in this embodiment of the present disclosure can implement the processes implemented by the electronic device in the method embodiments of FIG. 1 and FIG. 2. To avoid repetition, details are not described here again. In this embodiment of the present disclosure, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time taken to re-aim the lens at the target object and improves the efficiency of aiming at the target object.
FIG. 7 is a schematic diagram of a hardware structure of another electronic device for implementing the embodiments of the present disclosure.
The electronic device 700 includes but is not limited to components such as a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, and a power supply 711. A person skilled in the art can understand that the structure of the electronic device shown in FIG. 7 does not constitute a limitation on the electronic device, and the electronic device may include more or fewer components than shown in the figure, or combine some components, or have a different component arrangement. In this embodiment of the present disclosure, the electronic device includes but is not limited to a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to:
when a target object is not included in a first preview image acquired by a first component of the electronic device, acquire first image data by a second component of the electronic device, where the first image data includes the target object;
identify a position of the target object in the first image data; and
display direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, where the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
Optionally, the electronic device includes a main camera and a secondary camera located on a same surface, the first component is the main camera, and the second component is the secondary camera; or,
the electronic device includes a camera assembly, the camera assembly includes a first camera and an image sensor, the first component is the first camera, and the second component is the image sensor.
Optionally, the processor 710 is further configured to:
acquire second image data in real time by the second component during or after the movement of the electronic device; and
capture an image by the first component when the position of the target object in the second image data at least partly coincides with the preset position.
Optionally, the processor 710 is further configured to:
acquire a second preview image by the first component, and display the second preview image on the electronic device;
when an input operation for the second preview image is received, detect, according to an object tracking technology, whether the target object in the second preview image can be tracked; and
when the target object can be tracked, acquire a first preview image by the first component.
Optionally, the processor 710 is further configured to: when the target object cannot be tracked, display prompt information for prompting the user that the target object cannot be tracked.
In this embodiment of the present disclosure, because the direction prompt information is displayed on the electronic device, the electronic device can be moved according to the direction prompt information, which reduces the time taken to bring the target object back into the angle of view of the lens.
It should be understood that, in this embodiment of the present disclosure, the radio frequency unit 701 may be used for receiving and sending signals during information transmission/reception or a call. Specifically, after downlink data from a base station is received, it is processed by the processor 710; in addition, uplink data is sent to the base station. Generally, the radio frequency unit 701 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The electronic device provides users with wireless broadband Internet access through the network module 702, for example, helping users send and receive emails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output it as sound. Moreover, the audio output unit 703 may also provide audio output related to a specific function performed by the electronic device 700 (for example, a call signal reception sound or a message reception sound). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is configured to receive audio or video signals. The input unit 704 may include a graphics processing unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or sent via the radio frequency unit 701 or the network module 702. The microphone 7042 can receive sound and can process such sound into audio data. In telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 701 for output.
The electronic device 700 further includes at least one sensor 705, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 7061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 7061 and/or the backlight when the electronic device 700 is moved close to the ear. As a type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer and tapping); the sensor 705 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
The display unit 706 is configured to display information input by the user or information provided to the user. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 707 may be configured to receive input numeric or character information and generate key signal input related to user settings and function control of the electronic device. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 7071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 710, and receives and executes commands sent by the processor 710. In addition, the touch panel 7071 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 7071, the user input unit 707 may also include other input devices 7072. Specifically, the other input devices 7072 may include but are not limited to a physical keyboard, function keys (such as volume control buttons and switch buttons), a trackball, a mouse, and a joystick, which are not described here again.
Further, the touch panel 7071 can be overlaid on the display panel 7061. When the touch panel 7071 detects a touch operation on or near it, the operation is sent to the processor 710 to determine the type of the touch event, and the processor 710 then provides corresponding visual output on the display panel 7061 according to the type of the touch event. Although in FIG. 7 the touch panel 7071 and the display panel 7061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 7071 and the display panel 7061 can be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
The interface unit 708 is an interface for connecting an external device to the electronic device 700. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 can be used to receive input (for example, data information and power) from an external device and transmit the received input to one or more elements in the electronic device 700, or can be used to transfer data between the electronic device 700 and an external device.
The memory 709 can be used to store software programs and various data. The memory 709 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 709 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is the control center of the electronic device. It connects the various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, so as to monitor the electronic device as a whole. The processor 710 may include one or more processing units; optionally, the processor 710 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 710.
The electronic device 700 may also include a power supply 711 (such as a battery) for supplying power to the various components. Optionally, the power supply 711 may be logically connected to the processor 710 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the electronic device 700 includes some functional modules not shown, and details are not described here again.
Optionally, an embodiment of the present disclosure further provides an electronic device, including a processor 710, a memory 709, and a computer program stored in the memory 709 and executable on the processor 710. When the computer program is executed by the processor 710, the processes of the foregoing display method embodiments are implemented, and the same technical effects can be achieved. To avoid repetition, details are not described here again.
An embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the processes of the foregoing display method embodiments are implemented, and the same technical effects can be achieved. To avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this specification, the terms "include", "comprise", or any of their variants are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "includes a ..." does not preclude the existence of additional identical elements in the process, method, article, or apparatus that includes the element.
Based on the foregoing descriptions of the implementations, a person skilled in the art can clearly understand that the methods of the foregoing embodiments may be implemented by software plus a necessary general-purpose hardware platform, and certainly may also be implemented by hardware, but in many cases the former is a better implementation. Based on such an understanding, the technical solutions of the present disclosure essentially, or the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions for instructing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present disclosure.
The embodiments of the present disclosure are described above with reference to the accompanying drawings, but the present disclosure is not limited to the foregoing specific implementations. The foregoing specific implementations are merely illustrative rather than restrictive. Inspired by the present disclosure, a person of ordinary skill in the art may make many other forms without departing from the purpose of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present disclosure.

Claims (12)

  1. A display method, comprising:
    when a target object is not included in a first preview image acquired by a first component of an electronic device, acquiring first image data by a second component of the electronic device, wherein the first image data includes the target object;
    identifying a position of the target object in the first image data; and
    displaying direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
  2. The method according to claim 1, wherein the electronic device comprises a main camera and a secondary camera located on a same surface, the first component is the main camera, and the second component is the secondary camera; or,
    the electronic device comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first component is the first camera, and the second component is the image sensor.
  3. The method according to claim 1 or 2, wherein after the displaying direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, the method further comprises:
    acquiring second image data in real time by the second component during or after the movement of the electronic device; and
    capturing an image by the first component when the position of the target object in the second image data at least partly coincides with the preset position.
  4. The method according to claim 1, wherein before the acquiring, when a target object is not included in a first preview image acquired by a first component of an electronic device, first image data by a second component of the electronic device, the method further comprises:
    acquiring a second preview image by the first component, and displaying the second preview image on the electronic device;
    when an input operation for the second preview image is received, detecting, according to an object tracking technology, whether the target object in the second preview image can be tracked; and
    when the target object can be tracked, acquiring a first preview image by the first component.
  5. The method according to claim 4, wherein after the detecting, according to an object tracking technology, whether the target object in the second preview image can be tracked, the method further comprises:
    when the target object cannot be tracked, displaying prompt information for prompting the user that the target object cannot be tracked.
  6. An electronic device, comprising:
    a first acquisition module, configured to: when a target object is not included in a first preview image acquired by a first component of the electronic device, acquire first image data by a second component of the electronic device, wherein the first image data includes the target object;
    an identification module, configured to identify a position of the target object in the first image data; and
    a first display module, configured to display direction prompt information according to a positional relationship between the position of the target object in the first image data and a preset position, wherein the direction prompt information is used to prompt a user to move the electronic device, so that a preview image acquired by the first component after the movement includes the target object.
  7. The electronic device according to claim 6, further comprising a main camera and a secondary camera located on a same surface, wherein the first component is the main camera and the second component is the secondary camera; or,
    the electronic device comprises a camera assembly, the camera assembly comprises a first camera and an image sensor, the first component is the first camera, and the second component is the image sensor.
  8. The electronic device according to claim 6 or 7, further comprising:
    a second acquisition module, configured to acquire second image data in real time by the second component during or after the movement of the electronic device; and
    a capture module, configured to capture an image by the first component when the position of the target object in the second image data at least partly coincides with the preset position.
  9. The electronic device according to claim 6, further comprising:
    a third acquisition module, configured to acquire a second preview image by the first component and display the second preview image on the electronic device;
    a detection module, configured to: when an input operation for the second preview image is received, detect, according to an object tracking technology, whether the target object in the second preview image can be tracked; and
    a fourth acquisition module, configured to acquire a first preview image by the first component when the target object can be tracked.
  10. The electronic device according to claim 9, further comprising:
    a second display module, configured to display, when the target object cannot be tracked, prompt information for prompting the user that the target object cannot be tracked.
  11. An electronic device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the display method according to any one of claims 1 to 5 when executing the computer program.
  12. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the display method according to any one of claims 1 to 5 are implemented.
PCT/CN2020/104334 2019-08-30 2020-07-24 Display method and electronic device WO2021036623A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910812999.2 2019-08-30
CN201910812999.2A CN110602389B (zh) 2019-08-30 2019-08-30 Display method and electronic device

Publications (1)

Publication Number Publication Date
WO2021036623A1 true WO2021036623A1 (zh) 2021-03-04

Family

ID=68856771

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/104334 WO2021036623A1 (zh) 2019-08-30 2020-07-24 显示方法及电子设备

Country Status (2)

Country Link
CN (1) CN110602389B (zh)
WO (1) WO2021036623A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114286009A (zh) * 2021-12-29 2022-04-05 维沃移动通信有限公司 Reflection image photographing method and apparatus, electronic device, and storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110602389B (zh) * 2019-08-30 2021-11-02 维沃移动通信有限公司 Display method and electronic device
CN111479055B (zh) * 2020-04-10 2022-05-20 Oppo广东移动通信有限公司 Photographing method and apparatus, electronic device, and storage medium
CN111770277A (zh) * 2020-07-31 2020-10-13 RealMe重庆移动通信有限公司 Auxiliary photographing method, terminal, and storage medium
CN112954220A (zh) * 2021-03-03 2021-06-11 北京蜂巢世纪科技有限公司 Image preview method and apparatus, electronic device, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090284582A1 (en) * 2008-05-15 2009-11-19 Arcsoft, Inc. Method of automatic photographs stitching
CN108366220A (zh) * 2018-04-23 2018-08-03 维沃移动通信有限公司 Video call processing method and mobile terminal
CN109788208A (zh) * 2019-01-30 2019-05-21 华通科技有限公司 Target recognition method and system based on multiple groups of focal-length image sources
CN110602389A (zh) * 2019-08-30 2019-12-20 维沃移动通信有限公司 Display method and electronic device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5539045B2 (ja) * 2010-06-10 2014-07-02 キヤノン株式会社 Imaging apparatus, control method thereof, and storage medium
CN103377471B (zh) * 2012-04-16 2016-08-03 株式会社理光 Object positioning method and apparatus, and optimal camera pair determination method and apparatus
CN103108164A (zh) * 2013-02-01 2013-05-15 南京迈得特光学有限公司 Compound-eye panoramic continuous tracking and monitoring system
JP6027560B2 (ja) * 2014-02-18 2016-11-16 富士フイルム株式会社 Automatic tracking imaging apparatus
JP2017069618A (ja) * 2015-09-28 2017-04-06 京セラ株式会社 Electronic device and imaging method
CN105759839B (zh) * 2016-03-01 2018-02-16 深圳市大疆创新科技有限公司 UAV visual tracking method and apparatus, and UAV
CN107800953B (zh) * 2016-09-02 2020-07-31 聚晶半导体股份有限公司 Image acquisition device and method for zooming images thereof
CN106454132A (zh) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 Control method, control apparatus, and electronic apparatus
CN108712602A (zh) * 2018-04-24 2018-10-26 Oppo广东移动通信有限公司 Camera control method and apparatus, mobile terminal, and storage medium
CN108429881A (zh) * 2018-05-08 2018-08-21 山东超景深信息科技有限公司 Application method of a telephoto gimbal camera system without repeated zoom framing
CN108833768A (zh) * 2018-05-10 2018-11-16 信利光电股份有限公司 Multi-camera shooting method, shooting terminal, and readable storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090284582A1 (en) * 2008-05-15 2009-11-19 Arcsoft, Inc. Method of automatic photographs stitching
CN108366220A (zh) * 2018-04-23 2018-08-03 维沃移动通信有限公司 Video call processing method and mobile terminal
CN109788208A (zh) * 2019-01-30 2019-05-21 华通科技有限公司 Target recognition method and system based on multiple groups of focal-length image sources
CN110602389A (zh) * 2019-08-30 2019-12-20 维沃移动通信有限公司 Display method and electronic device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114286009A (zh) * 2021-12-29 2022-04-05 维沃移动通信有限公司 Reflection image photographing method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
CN110602389B (zh) 2021-11-02
CN110602389A (zh) 2019-12-20

Similar Documents

Publication Publication Date Title
WO2021098678A1 (zh) 投屏控制方法及电子设备
CN108513070B (zh) 一种图像处理方法、移动终端及计算机可读存储介质
US11689649B2 (en) Shooting method and terminal
CN111541845B (zh) 图像处理方法、装置及电子设备
WO2021036623A1 (zh) 显示方法及电子设备
US20220279116A1 (en) Object tracking method and electronic device
JP7203859B2 (ja) 画像処理方法及びフレキシブルスクリーン端末
WO2021051995A1 (zh) 拍照方法及终端
WO2019174628A1 (zh) 拍照方法及移动终端
US20200257433A1 (en) Display method and mobile terminal
WO2020020134A1 (zh) 拍摄方法及移动终端
WO2021104227A1 (zh) 拍照方法及电子设备
WO2020238497A1 (zh) 图标移动方法及终端设备
WO2021013009A1 (zh) 拍照方法和终端设备
US11778304B2 (en) Shooting method and terminal
CN111031253B (zh) 一种拍摄方法及电子设备
US11863901B2 (en) Photographing method and terminal
WO2020199986A1 (zh) 视频通话方法及终端设备
WO2021082744A1 (zh) 视频查看方法及电子设备
WO2021104357A1 (zh) 电子设备和拍摄方法
WO2020238562A1 (zh) 显示方法及终端
CN108881721B (zh) 一种显示方法及终端
WO2021017730A1 (zh) 截图方法及终端设备
WO2021104226A1 (zh) 拍照方法及电子设备
WO2021147911A1 (zh) 移动终端、拍摄模式的检测方法及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20856493

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20856493

Country of ref document: EP

Kind code of ref document: A1