WO2020216181A1 - Terminal device and control method thereof - Google Patents

Terminal device and control method thereof

Info

Publication number
WO2020216181A1
WO2020216181A1 (PCT/CN2020/085648)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
camera
camera modules
hole
present disclosure
Prior art date
Application number
PCT/CN2020/085648
Other languages
English (en)
French (fr)
Inventor
陈海新
王友飞
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2020216181A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • the present disclosure relates to the field of terminal technology, and in particular to a terminal device and a control method thereof.
  • the optical path of the traditional camera module is consistent with the thickness direction of the terminal device. Therefore, the height of the camera module directly determines the minimum thickness of the terminal device, and the current height of the camera module has become a bottleneck restricting the thinning of the terminal device.
  • a periscope camera module has been proposed in the related art: a reflective device reflects ambient light and transmits it to the camera assembly, so that the optical path of the camera module is aligned with the length direction of the terminal device.
  • since the optical path of the camera module runs along the length direction of the terminal device, the camera height no longer limits the thickness of the terminal device.
  • because the thickness of a periscope camera module still extends along the thickness direction of the terminal device, the area of the camera's light sensor in an ultra-thin terminal device equipped with a periscope camera is small; a small sensor area results in either a small light-receiving area per pixel or a small pixel count, which in turn prevents an ultra-thin terminal device equipped with a periscope camera from capturing large-sensor, high-pixel-count images.
  • the embodiments of the present disclosure provide a terminal device and a control method thereof, which are used to solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • an embodiment of the present disclosure provides a terminal device including at least two camera modules
  • each of the camera modules includes: a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device reflects ambient light entering from the through hole to the camera assembly.
  • embodiments of the present disclosure provide a method for controlling the terminal device described in the first aspect, the method including: using at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
  • stitching the images captured by each of the camera modules to generate a target image.
  • embodiments of the present disclosure provide a terminal device as described in the first aspect, including:
  • a photographing module configured to use the at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
  • a stitching module configured to stitch the images captured by each of the camera modules to generate a target image.
  • an embodiment of the present disclosure provides a terminal device including: a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the terminal device control method described in the second aspect.
  • the embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the terminal device control method described in the second aspect are implemented.
  • the terminal device provided by the embodiments of the present disclosure includes at least two camera modules, where each camera module includes a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device can reflect ambient light entering from the through hole to the camera assembly. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene.
  • that is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image; compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device. The embodiments of the present disclosure can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • FIG. 1 is a structural diagram of an Android operating system provided by an embodiment of the disclosure
  • FIG. 2 is one of schematic structural diagrams of a terminal device provided by an embodiment of the disclosure.
  • FIG. 3 is a second schematic structural diagram of a terminal device provided by an embodiment of the disclosure.
  • FIG. 4 is a first schematic diagram of the arrangement of the through holes of the camera modules provided by an embodiment of the disclosure.
  • FIG. 5 is a schematic diagram of images collected by the camera modules shown in FIG. 4 provided by an embodiment of the disclosure.
  • FIG. 6 is the second schematic diagram of the arrangement of the through holes of the camera module provided by the embodiment of the disclosure.
  • FIG. 7 is a schematic diagram of an image collected by the camera module shown in FIG. 6 provided by an embodiment of the disclosure.
  • FIG. 8 is the third schematic diagram of the arrangement of the through holes of the camera module provided by the embodiments of the disclosure.
  • FIG. 9 is a schematic diagram of an image collected by the camera module shown in FIG. 8 provided by an embodiment of the present disclosure.
  • FIG. 10 is a flowchart of steps of a method for controlling a terminal device according to an embodiment of the present disclosure
  • FIG. 11 is the third schematic structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the disclosure.
  • words such as “first” and “second” are used to distinguish the same or similar items with substantially the same function or effect.
  • the skilled person can understand that the words “first” and “second” do not limit the number and execution order.
  • words such as “exemplary” or “for example” are used as examples, illustrations, or explanations. Any embodiment or design described as “exemplary” or “for example” in the embodiments of the present disclosure should not be construed as more preferable or advantageous than other embodiments or designs. Rather, words such as “exemplary” or “for example” are intended to present related concepts in a specific manner. In the embodiments of the present disclosure, unless otherwise specified, "plurality" means two or more.
  • because the thickness of a periscope camera module extends along the thickness direction of the terminal device, the area of the camera's light sensor in an ultra-thin terminal device equipped with a periscope camera is small; a small sensor area results in either a small light-receiving area per pixel or a small pixel count, which in turn prevents an ultra-thin terminal device equipped with a periscope camera from capturing large-sensor, high-pixel-count images.
  • the embodiments of the present disclosure provide a terminal device including at least two camera modules, where each camera module includes a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device can reflect ambient light entering from the through hole to the camera assembly. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene.
  • that is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image; compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device. The embodiments of the present disclosure can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • the method for controlling a terminal device may be applied to a terminal device, and the terminal device may be a terminal device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which is not limited in the embodiments of the present disclosure.
  • the following uses the Android operating system as an example to introduce the software environment applied by the terminal device control method provided in the embodiments of the present disclosure.
  • FIG. 1 it is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications in the Android operating system (including system applications and third-party applications).
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software level.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • developers can develop, based on the system architecture of the Android operating system shown in FIG. 1, software programs that implement the terminal device control method provided by the embodiments of the present disclosure, so that
  • the control method of the terminal device can be run based on the Android operating system as shown in FIG. 1. That is, the processor or the terminal device can implement the method for controlling the terminal device provided in the embodiments of the present disclosure by running the software program in the Android operating system.
  • the terminal devices provided by the embodiments of the present disclosure may be mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), smart watches, smart bracelets, and other terminal devices, or the terminal device may also be another type of terminal device, which is not limited in the embodiments of the present disclosure.
  • the embodiment of the present disclosure provides a terminal device.
  • the terminal device provided by the embodiment of the present disclosure includes at least two camera modules 20 (FIG. 2 shows an example in which the terminal device includes two camera modules).
  • each camera module 20 includes: a camera assembly 21, a through hole 22 provided in the housing of the terminal device, and a reflective device 23 disposed opposite the through hole 22; the reflective device 23 is configured to reflect ambient light entering from the through hole 22 (shown as dashed arrows in FIG. 2) to the camera assembly 21, and the camera assembly 21 is configured to form an image from the ambient light reflected by the reflective device 23.
  • it should be noted that FIG. 2 shows the terminal device with two camera modules as an example, but the embodiment of the present disclosure is not limited thereto.
  • on the basis of the above embodiment, the terminal device provided by the embodiments of the present disclosure may also include another number of camera modules.
  • the terminal device may include four camera modules.
  • the following describes the working principle of the terminal device provided in the above embodiment, taking image collection of a target scene by the terminal device as an example.
  • when the terminal device collects an image of a target scene, the at least two camera modules each form an image from the ambient light reflected by their respective reflective devices, obtaining images of partial regions of the target scene, and an image synthesis apparatus then combines the images generated by the at least two camera modules to generate a complete image of the target scene.
  • the image synthesis apparatus may be a processor of a terminal device.
  • the terminal device provided by the embodiments of the present disclosure includes at least two camera modules, where each camera module includes a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device can reflect ambient light entering from the through hole to the camera assembly. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene.
  • that is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image; compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device. The embodiments of the present disclosure can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • the light reflecting device 23 in the foregoing embodiment may be a light reflecting prism.
  • the light reflecting device in the above embodiment only needs to reflect the ambient light passing through the through hole to the camera assembly. Therefore, the light reflecting prism in the embodiment of the present disclosure can be any triangular prism, and does not need to be a special isosceles triangular prism.
  • the light-reflecting device in the embodiment of the present disclosure may be other devices, such as a plane mirror.
  • the embodiments of the present disclosure do not limit the reflective device 23, as long as it can reflect the ambient light passing through the through hole to the camera assembly.
  • the angle between the light path in the through hole 22 and the light path in the camera assembly 21 is 90°.
  • the height direction of the camera assembly 21 is consistent with the length direction of the terminal device.
  • the terminal device further includes: a driving device 30.
  • the through hole includes: a first through hole 211 provided on the front of the terminal device and a second through hole 212 provided on the back of the terminal device.
  • the driving device 30 is configured to adjust the reflective device 23 so as to reflect ambient light entering from the first through hole 211 to the camera assembly 21, or to reflect ambient light entering from the second through hole 212 to the camera assembly 21.
  • since the driving device 30 can adjust the reflective device 23 so that ambient light entering from the first through hole 211 on the front of the terminal device is reflected to the camera assembly 21, the terminal device can collect images of the scene in front of the terminal device, and the camera assemblies of the at least two camera modules can be used as the front camera of the terminal device; and since the driving device 30 can also adjust the reflective device 23 so that ambient light entering from the second through hole 212 on the back of the terminal device is reflected to the camera assembly 21, the terminal device can likewise collect images of the scene behind the terminal device, and the camera assemblies of the at least two camera modules can be used as the rear camera of the terminal device.
  • in other words, the camera assemblies of the at least two camera modules of the terminal device provided by the embodiments of the present disclosure can serve both as the front camera and as the rear camera of the terminal device, so the camera assemblies are reused; the terminal device provided by the embodiments of the present disclosure can therefore also reduce the manufacturing cost of the terminal device and save its internal space.
  • further, when the terminal device includes four camera modules, the arrangement of the through holes of the four camera modules may be any of the following:
  • the through holes 22 of the four camera modules are arranged in a column along the length direction of the terminal device.
  • in this case, the images collected by the four camera modules from top to bottom are 5a, 5b, 5c, and 5d, respectively.
  • the widths of 5a, 5b, 5c, and 5d are all equal to the width of the image of the target scene, and the lengths of 5a, 5b, 5c, and 5d are all greater than 1/4 of the length h of the image of the target scene.
  • for example, the lengths of 5a and 5d can be 5/16h and the lengths of 5b and 5c can be 6/16h,
  • and when stitching the images collected by the four camera modules, adjacent images are overlapped by 1/16 and stitched.
  • the through holes 22 of the four camera modules are arranged in a row along the width direction of the terminal device.
  • in this case, the images collected by the four camera modules from left to right are 7a, 7b, 7c, and 7d, respectively.
  • the lengths of 7a, 7b, 7c, and 7d are all equal to the length of the image of the target scene, and the widths of 7a, 7b, 7c, and 7d are all greater than 1/4 of the width L of the image of the target scene.
  • for example, the widths of 7a and 7d can be 5/16L and the widths of 7b and 7c can be 6/16L, and when stitching the images collected by the four camera modules, adjacent images are overlapped by 1/16 and stitched.
  • the through holes 22 of the four camera modules are arranged in a 2*2 through hole matrix.
  • the image acquired by the camera module whose through hole is at the upper left is 9a, the image acquired by the camera module whose through hole is at the upper right is 9b, the image acquired by the camera module whose through hole is at the lower left is 9c, and the image acquired by the camera module whose through hole is at the lower right is 9d.
  • the widths of 9a, 9b, 9c, and 9d are all greater than 1/2 of the width of the image of the target scene
  • the lengths of 9a, 9b, 9c, and 9d are all greater than 1/2 of the length of the image of the target scene.
  • for example, the ratio of the width of 9a, 9b, 9c, and 9d to the width of the image of the target scene is 2.1/2, and the ratio of the length of 9a, 9b, 9c, and 9d to the length of the image of the target scene is 1.6/1.5; when stitching the images collected by the four camera modules, adjacent images are overlapped by 1/20 of the width and 1/15 of the height and stitched.
  • the ratio of the length to the width of the light sensor of the camera assembly is 2.1:1.6.
  • Another embodiment of the present disclosure provides a method for controlling a terminal device.
  • the method for controlling a terminal device is used to control the terminal device provided in any of the foregoing embodiments. Specifically, referring to FIG. 10, the method includes the following steps 101 and 102.
  • Step 101 Use at least two camera modules to perform imaging according to the ambient light reflected by the reflective devices contained in each of them, to obtain images of partial regions of the target scene respectively.
  • the images respectively acquired by the at least two camera modules may be as shown in FIG. 6, or FIG. 8 or FIG. 10, which will not be repeated here.
  • Step 102 Splicing the images captured by each of the camera modules to generate a target image.
  • the embodiments of the present disclosure do not limit the algorithm used to stitch the images captured by the at least two camera modules; any algorithm capable of stitching the images captured by the at least two camera modules into an image of the target scene may be used.
  • in the terminal device control method provided by the embodiments of the present disclosure, at least two camera modules first form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene, and the images captured by each of the camera modules are then stitched to generate a target image.
  • since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene.
  • in this way, each camera module only needs to acquire a part of the image, so the embodiments of the present disclosure can reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device.
  • the disclosed embodiments can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • each function module may be divided corresponding to each function, or two or more functions may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. It should be noted that the division of modules in the embodiments of the present disclosure is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 11 shows a schematic diagram of a possible structure of the terminal device 1100 involved in the foregoing embodiment, and the terminal device 1100 includes:
  • the photographing module 1101 is configured to use the at least two camera modules to perform imaging according to the ambient light reflected by the reflective devices contained in each of the at least two camera modules to obtain images of partial regions of the target scene respectively;
  • the stitching module 1102 is used to stitch the images captured by each of the camera modules to generate a target image.
  • the terminal device provided by the embodiments of the present disclosure includes a photographing module and a stitching module. The photographing module can use at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene, and the stitching module can stitch the images captured by each of the camera modules to generate a target image.
  • since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image, so compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can reduce the area of the light sensor of each camera module without degrading the photographing quality, thereby reducing the thickness of the terminal device and solving the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • FIG. 12 is a schematic diagram of the hardware structure of a terminal device that implements the terminal device control method of an embodiment of the present disclosure.
  • the terminal device 100 of an embodiment of the present disclosure includes: at least two camera modules; any of the camera modules includes: A camera assembly, a through hole provided on the housing of the terminal device, and a light reflecting device disposed opposite to the through hole, the light reflecting device reflecting ambient light entering from the through hole to the camera assembly; and
  • the terminal device 100 also includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and at least one component such as the thermoelectric generator provided in any of the above embodiments. Those skilled in the art can understand that the terminal device structure shown in FIG. 12 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or use a different component arrangement.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the processor 110 is configured to use at least two camera modules to perform imaging according to the ambient light reflected by the reflective devices contained in each of them, to obtain images of partial regions of the target scene respectively;
  • the input unit 104 is used to splice the images captured by each of the camera modules to generate a target image.
  • since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene.
  • in this way, each camera module only needs to acquire a part of the image when an image is captured, so compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device; this solves the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
  • the radio frequency unit 101 can be used to receive and send signals during information transmission and reception or during a call; specifically, after downlink data from the base station is received, it is passed to the processor 110 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sounds. Moreover, the audio output unit 103 may also provide audio output associated with a specific function performed by the terminal device (for example, call signal receiving sound, message receiving sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 is configured to process image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output in the case of a telephone call mode.
  • the terminal device also includes at least one sensor 105, such as a Hall shift sensor, a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in multiple directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping); the sensor 105 can also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be described here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 107 can be used to receive inputted number or character information and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal produced by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be realized by various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although the touch panel 1071 and the display panel 1061 are shown as two independent components that realize the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device and a terminal device.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the terminal device, or can be used to transfer data between the terminal device and the external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 110 is the control center of the terminal device. It uses various interfaces and lines to connect the parts of the entire terminal device, and, by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
  • the terminal device may also include a power source 111 (such as a battery) for supplying power to multiple components.
  • the power source 111 may be logically connected to the processor 110 through a power management system, so as to manage charging, discharging, and power consumption management through the power management system.
  • the terminal device includes some functional modules that are not shown, which will not be repeated here.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, each process of the above terminal device control method embodiments is implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again.
  • the computer readable storage medium such as read-only memory (Read-Only Memory, ROM for short), random access memory (Random Access Memory, RAM for short), magnetic disk or optical disk, etc.
  • the terminal device and the computer storage medium provided in the embodiments of the present disclosure are both used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
  • the technical solutions of the present disclosure, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network-side device, or the like) to execute the methods described in the embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present disclosure provide a terminal device and a control method thereof. The terminal device provided by the embodiments of the present disclosure includes at least two camera modules, where each camera module includes: a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device reflects ambient light entering from the through hole to the camera assembly.

Description

Terminal device and control method thereof
Cross-Reference to Related Application
This application claims priority to Chinese patent application No. 201910340748.9, filed with the China National Intellectual Property Administration on April 25, 2019 and entitled "Terminal device and control method thereof", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of terminal technology, and in particular to a terminal device and a control method thereof.
Background
With the continuous development of terminal technology, users not only place higher demands on the performance of terminal devices but also have increasingly high expectations for their appearance, and ultra-thin terminal devices have become one of the development trends of terminal devices.
The optical path of a traditional camera module runs along the thickness direction of the terminal device, so the height of the camera module directly determines the minimum thickness of the terminal device, and the height of the camera module has become a bottleneck restricting the thinning of terminal devices. To reduce the thickness of the terminal device, a periscope camera module has been proposed in the related art: a reflective device reflects ambient light and transmits it to the camera assembly, so that the optical path of the camera module is aligned with the length direction of the terminal device. Since the optical path of the camera module runs along the length direction of the terminal device, the camera height no longer limits the thickness of the terminal device. However, because the thickness of a periscope camera module still extends along the thickness direction of the terminal device, the area of the camera's light sensor in an ultra-thin terminal device equipped with a periscope camera is small; a small sensor area results in either a small light-receiving area per pixel or a small pixel count, so an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
Summary
Embodiments of the present disclosure provide a terminal device and a control method thereof, which are used to solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
To solve the above technical problem, the embodiments of the present disclosure are implemented as follows:
In a first aspect, an embodiment of the present disclosure provides a terminal device including at least two camera modules;
where each of the camera modules includes: a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole, the reflective device reflecting ambient light entering from the through hole to the camera assembly.
In a second aspect, an embodiment of the present disclosure provides a method for controlling the terminal device described in the first aspect, the method including:
using at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
stitching the images captured by each of the camera modules to generate a target image.
In a third aspect, an embodiment of the present disclosure provides the terminal device described in the first aspect, including:
a photographing module configured to use the at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
a stitching module configured to stitch the images captured by each of the camera modules to generate a target image.
In a fourth aspect, an embodiment of the present disclosure provides a terminal device including: a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the terminal device control method described in the second aspect.
In a fifth aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the terminal device control method described in the second aspect.
The terminal device provided by the embodiments of the present disclosure includes at least two camera modules, where each camera module includes a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device can reflect ambient light entering from the through hole to the camera assembly. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image. Compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device, and can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
Brief Description of the Drawings
FIG. 1 is an architecture diagram of an Android operating system provided by an embodiment of the present disclosure;
FIG. 2 is a first schematic structural diagram of a terminal device provided by an embodiment of the present disclosure;
FIG. 3 is a second schematic structural diagram of a terminal device provided by an embodiment of the present disclosure;
FIG. 4 is a first schematic diagram of the arrangement of the through holes of the camera modules provided by an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of images collected by the camera modules shown in FIG. 4 provided by an embodiment of the present disclosure;
FIG. 6 is a second schematic diagram of the arrangement of the through holes of the camera modules provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of images collected by the camera modules shown in FIG. 6 provided by an embodiment of the present disclosure;
FIG. 8 is a third schematic diagram of the arrangement of the through holes of the camera modules provided by an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of images collected by the camera modules shown in FIG. 8 provided by an embodiment of the present disclosure;
FIG. 10 is a flowchart of the steps of a terminal device control method provided by an embodiment of the present disclosure;
FIG. 11 is a third schematic structural diagram of a terminal device provided by an embodiment of the present disclosure;
FIG. 12 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only some rather than all of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
The term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A exists alone, both A and B exist, and B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects; in a formula, the character "/" indicates a "division" relationship between the associated objects. Unless otherwise specified, "a plurality of" herein means two or more.
To describe the technical solutions of the embodiments of the present disclosure clearly, words such as "first" and "second" are used in the embodiments of the present disclosure to distinguish identical or similar items with substantially the same function or effect. Those skilled in the art can understand that the words "first" and "second" do not limit the number or execution order.
In the embodiments of the present disclosure, words such as "exemplary" or "for example" are used as examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present disclosure should not be construed as more preferable or advantageous than other embodiments or designs. Rather, words such as "exemplary" or "for example" are intended to present related concepts in a specific manner. In the embodiments of the present disclosure, unless otherwise specified, "a plurality of" means two or more.
Because the thickness of a periscope camera module extends along the thickness direction of the terminal device, the area of the camera's light sensor in an ultra-thin terminal device equipped with a periscope camera is small; a small sensor area results in either a small light-receiving area per pixel or a small pixel count, so an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
To solve the above problem, the embodiments of the present disclosure provide a terminal device including at least two camera modules, where each camera module includes a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device can reflect ambient light entering from the through hole to the camera assembly. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image. Compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device, and can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
The terminal device control method provided by the embodiments of the present disclosure may be applied to a terminal device, and the terminal device may be a terminal device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not limited in the embodiments of the present disclosure.
The following uses the Android operating system as an example to introduce the software environment to which the terminal device control method provided by the embodiments of the present disclosure is applied.
FIG. 1 is a schematic architecture diagram of a possible Android operating system provided by an embodiment of the present disclosure. In FIG. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the applications in the Android operating system (including system applications and third-party applications).
The application framework layer is the framework of the applications; developers can develop applications based on the application framework layer while complying with the development principles of the application framework.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the various resources needed by the Android operating system. The Android operating system runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and is the lowest layer of the Android operating system software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present disclosure, developers can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the terminal device control method provided by the embodiments of the present disclosure, so that the terminal device control method can run on the Android operating system shown in FIG. 1. That is, a processor or a terminal device can implement the terminal device control method provided by the embodiments of the present disclosure by running this software program in the Android operating system.
The terminal device provided by the embodiments of the present disclosure may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a smart watch, a smart bracelet, or another terminal device, or the terminal device may be another type of terminal device, which is not limited in the embodiments of the present disclosure.
An embodiment of the present disclosure provides a terminal device. Specifically, referring to FIG. 2, the terminal device provided by the embodiment of the present disclosure includes at least two camera modules 20 (FIG. 2 shows an example in which the terminal device includes two camera modules).
Each camera module 20 includes: a camera assembly 21, a through hole 22 provided in the housing of the terminal device, and a reflective device 23 disposed opposite the through hole 22. The reflective device 23 is configured to reflect ambient light entering from the through hole 22 (the ambient light is shown as dashed arrows in FIG. 2) to the camera assembly 21, and the camera assembly 21 is configured to form an image from the ambient light reflected by the reflective device 23.
It should be noted that FIG. 2 shows the terminal device with two camera modules as an example, but the embodiments of the present disclosure are not limited thereto; on the basis of the above embodiment, the terminal device provided by the embodiments of the present disclosure may also include another number of camera modules, for example, four camera modules.
The following describes the working principle of the terminal device provided in the above embodiment, taking image collection of a target scene by the terminal device as an example.
When the terminal device collects an image of a target scene, the at least two camera modules each form an image from the ambient light reflected by their respective reflective devices, obtaining images of partial regions of the target scene; an image synthesis apparatus then combines the images generated by the at least two camera modules to generate a complete image of the target scene.
For example, the image synthesis apparatus may be a processor of the terminal device.
The terminal device provided by the embodiments of the present disclosure includes at least two camera modules, where each camera module includes a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole; the reflective device can reflect ambient light entering from the through hole to the camera assembly. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image. Compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device, and can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
Optionally, the reflective device 23 in the above embodiment may be a reflective prism.
Since the reflective device in the above embodiment only needs to reflect the ambient light passing through the through hole to the camera assembly, the reflective prism in the embodiments of the present disclosure can be any triangular prism and does not need to be a special isosceles triangular prism.
Of course, on the basis of the above embodiments, the reflective device in the embodiments of the present disclosure may be another device, for example a plane mirror. The embodiments of the present disclosure do not limit the reflective device 23, as long as it can reflect the ambient light passing through the through hole to the camera assembly.
Further, the angle between the optical path in the through hole 22 and the optical path in the camera assembly 21 is 90°.
That is, when the through hole is located on the front or the back of the terminal device, the height direction of the camera assembly 21 is aligned with the length direction of the terminal device.
Optionally, referring to FIG. 3, the terminal device further includes a driving device 30.
The through hole includes: a first through hole 211 provided on the front of the terminal device and a second through hole 212 provided on the back of the terminal device.
The driving device 30 is configured to adjust the reflective device 23 so as to reflect the ambient light entering from the first through hole 211 to the camera assembly 21, or to reflect the ambient light entering from the second through hole 212 to the camera assembly 21.
Since the driving device 30 can adjust the reflective device 23 so that the ambient light entering from the first through hole 211 on the front of the terminal device is reflected to the camera assembly 21, the terminal device can collect images of the scene in front of the terminal device, and the camera assemblies of the at least two camera modules can be used as the front camera of the terminal device; and since the driving device 30 can also adjust the reflective device 23 so that the ambient light entering from the second through hole 212 on the back of the terminal device is reflected to the camera assembly 21, the terminal device can likewise collect images of the scene behind the terminal device, and the camera assemblies of the at least two camera modules can be used as the rear camera of the terminal device. That is, the camera assemblies of the at least two camera modules of the terminal device provided by the embodiments of the present disclosure can be used both as the front camera and as the rear camera of the terminal device, so the camera assemblies are reused; the terminal device provided by the embodiments of the present disclosure can therefore also reduce the manufacturing cost of the terminal device and save its internal space.
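Functionally, the driving device turns one set of camera assemblies into both the front camera and the rear camera by changing which through hole the reflective device faces. The sketch below illustrates only that selection logic; the angle values, the `set_reflector_angle` callback, and the enum names are hypothetical, since the disclosure describes the mechanism but not a software interface.

```python
from enum import Enum

class Facing(Enum):
    FRONT = "front"   # reflect light entering the first through hole (front of the housing)
    REAR = "rear"     # reflect light entering the second through hole (back of the housing)

# Hypothetical reflector orientations for one module; real values depend on the mechanics.
REFLECTOR_ANGLE_DEG = {Facing.FRONT: 45.0, Facing.REAR: 135.0}

def select_facing(modules, facing, set_reflector_angle):
    """Point every module's reflective device at the requested through hole.

    set_reflector_angle(module, angle) stands in for the driving-device actuator,
    which the patent describes only as 'adjusting the reflective device'.
    """
    for module in modules:
        set_reflector_angle(module, REFLECTOR_ANGLE_DEG[facing])

# Example: reuse the same camera assemblies as the rear camera.
select_facing(modules=["module_1", "module_2"], facing=Facing.REAR,
              set_reflector_angle=lambda m, a: print(f"{m}: reflector -> {a} deg"))
```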
Further, when the terminal device includes four camera modules, the through holes of the four camera modules may be arranged in any of the following ways:
Arrangement 1
Referring to FIG. 4, the through holes 22 of the four camera modules are arranged in a column along the length direction of the terminal device.
Further, referring to FIG. 5, when the through holes 22 of the four camera modules are arranged in a column along the length direction of the terminal device, the images collected by the four camera modules from top to bottom are 5a, 5b, 5c, and 5d, respectively. The widths of 5a, 5b, 5c, and 5d are all equal to the width of the image of the target scene, and the lengths of 5a, 5b, 5c, and 5d are all greater than 1/4 of the length h of the image of the target scene. For example, the lengths of 5a and 5d may be 5/16h and the lengths of 5b and 5c may be 6/16h, and when the images collected by the four camera modules are stitched, adjacent images are overlapped by 1/16 and stitched.
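The 5/16h and 6/16h strip lengths sum to more than h, and that excess is what the seam overlaps absorb. A minimal bookkeeping sketch is shown below; it assumes the "1/16" overlap means that each of the two adjacent strips contributes 1/16 of the target height to every seam (so each seam region spans 2/16 of h), which is one reading under which the four strips exactly tile the target height. The fractions and the four-strip layout come from the embodiment above; everything else is illustrative.

```python
from fractions import Fraction as F

# Strip heights as fractions of the target image height h (from the embodiment above).
strips = [F(5, 16), F(6, 16), F(6, 16), F(5, 16)]   # 5a, 5b, 5c, 5d

# Assumption: each of the 3 seams overlaps by 1/16 h contributed by *each* adjacent strip.
seam_overlap = 2 * F(1, 16)
seams = len(strips) - 1

covered = sum(strips) - seams * seam_overlap
print(covered)            # 1 -> the four strips exactly tile the target height h
assert covered == 1
```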
Arrangement 2
Referring to FIG. 6, the through holes 22 of the four camera modules are arranged in a row along the width direction of the terminal device.
Further, referring to FIG. 7, when the through holes 22 of the four camera modules are arranged in a row along the width direction of the terminal device, the images collected by the four camera modules from left to right are 7a, 7b, 7c, and 7d, respectively. The lengths of 7a, 7b, 7c, and 7d are all equal to the length of the image of the target scene, and the widths of 7a, 7b, 7c, and 7d are all greater than 1/4 of the width L of the image of the target scene. For example, the widths of 7a and 7d may be 5/16L and the widths of 7b and 7c may be 6/16L, and when the images collected by the four camera modules are stitched, adjacent images are overlapped by 1/16 and stitched.
Arrangement 3
Referring to FIG. 8, the through holes 22 of the four camera modules are arranged in a 2*2 through-hole matrix.
Further, referring to FIG. 9, when the through holes 22 of the four camera modules are arranged in a 2*2 through-hole matrix, the image acquired by the camera module whose through hole is at the upper left is 9a, the image acquired by the camera module whose through hole is at the upper right is 9b, the image acquired by the camera module whose through hole is at the lower left is 9c, and the image acquired by the camera module whose through hole is at the lower right is 9d. The widths of 9a, 9b, 9c, and 9d are all greater than 1/2 of the width of the image of the target scene, and the lengths of 9a, 9b, 9c, and 9d are all greater than 1/2 of the length of the image of the target scene.
For example, the ratio of the width of 9a, 9b, 9c, and 9d to the width of the image of the target scene is 2.1/2, and the ratio of the length of 9a, 9b, 9c, and 9d to the length of the image of the target scene is 1.6/1.5; when the images collected by the four camera modules are stitched, adjacent images are overlapped by 1/20 of the width and 1/15 of the height and stitched.
Further optionally, the ratio of the length to the width of the light sensor of the camera assembly is 2.1:1.6.
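One self-consistent reading of the ratios above is that each sub-image measures 2.1/2 of half the target width and 1.6/1.5 of half the target length, so that trimming 1/20 of the width and 1/15 of the height at the seams recovers the full frame. The sketch below checks that arithmetic; the reading itself, and the 4:3 target aspect used to relate the result back to the 2.1:1.6 sensor ratio, are assumptions rather than statements from the embodiment.

```python
from fractions import Fraction as F

# Each sub-image relative to half of the target frame (ratios from the embodiment above).
sub_w = F(21, 10) / 2 * F(1, 2)           # 2.1/2 of W/2  -> 0.525 W
sub_h = F(16, 10) / F(15, 10) * F(1, 2)   # 1.6/1.5 of H/2 -> 8/15 H

# Two sub-images across, minus one seam overlap of W/20 (resp. H/15), give the full frame.
assert 2 * sub_w - F(1, 20) == 1
assert 2 * sub_h - F(1, 15) == 1

# If the target frame is assumed to be 4:3, each sub-image has the 2.1:1.6 sensor aspect ratio.
W, H = 4, 3
assert (sub_w * W) / (sub_h * H) == F(21, 16)   # 2.1 : 1.6
```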
A further embodiment of the present disclosure provides a method for controlling a terminal device, which is used to control the terminal device provided by any of the above embodiments. Specifically, referring to FIG. 10, the method includes the following steps 101 and 102.
Step 101: use at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene.
For example, the images respectively acquired by the at least two camera modules may be as shown in FIG. 6, FIG. 8, or FIG. 10, which will not be repeated here.
Step 102: stitch the images captured by each of the camera modules to generate a target image.
It should be noted that the embodiments of the present disclosure do not limit the algorithm used to stitch the images captured by the at least two camera modules; any algorithm capable of stitching the images captured by the at least two camera modules into an image of the target scene may be used.
In the terminal device control method provided by the embodiments of the present disclosure, at least two camera modules first form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene, and the images captured by each of the camera modules are then stitched to generate a target image. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image. Compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device, and can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
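As a concrete illustration of steps 101 and 102, the sketch below simulates four modules that each capture an overlapping horizontal strip of a scene and then stitches the strips back together by cropping the seam overlap. It is a minimal numpy mock-up: the strip bounds and the simple crop-and-concatenate strategy are illustrative assumptions, since the disclosure deliberately leaves the stitching algorithm open.

```python
import numpy as np

H, W = 160, 120                       # target image size (arbitrary for this mock-up)
scene = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)   # stand-in for the target scene

# Step 101: each module images only a partial region of the scene. Here: four horizontal
# strips of 5/16, 6/16, 6/16 and 5/16 of the height, adjacent strips sharing a 2/16-high seam.
h = H // 16
bounds = [(0, 5 * h), (3 * h, 9 * h), (7 * h, 13 * h), (11 * h, 16 * h)]
partials = [scene[top:bottom] for top, bottom in bounds]

# Step 102: stitch the partial images into the target image. A trivial strategy: keep the
# first strip whole and crop the already-covered seam rows off the top of every later strip.
pieces = [partials[0]]
for (_, prev_bottom), (top, _), strip in zip(bounds, bounds[1:], partials[1:]):
    pieces.append(strip[prev_bottom - top:])      # drop the rows the previous strip covered
target = np.concatenate(pieces, axis=0)

assert np.array_equal(target, scene)              # the stitched result reproduces the scene
```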
In the embodiments of the present disclosure, the terminal device may be divided into functional modules according to the above method examples. For example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present disclosure is illustrative and is only a logical function division; there may be other division methods in actual implementation.
In the case of using integrated units, FIG. 11 shows a possible schematic structural diagram of the terminal device 1100 involved in the above embodiments. The terminal device 1100 includes:
a photographing module 1101 configured to use the at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
a stitching module 1102 configured to stitch the images captured by each of the camera modules to generate a target image.
The terminal device provided by the embodiments of the present disclosure includes a photographing module and a stitching module. The photographing module can use at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene, and the stitching module can stitch the images captured by each of the camera modules to generate a target image. Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image. Compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device, and can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
FIG. 12 is a schematic diagram of the hardware structure of a terminal device that implements the terminal device control method of the embodiments of the present disclosure. The terminal device 100 of an embodiment of the present disclosure includes at least two camera modules, where each camera module includes: a camera assembly, a through hole provided in the housing of the terminal device, and a reflective device disposed opposite the through hole, the reflective device reflecting ambient light entering from the through hole to the camera assembly. The terminal device 100 further includes, but is not limited to, components such as a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and at least one thermoelectric generator provided in any of the above embodiments. Those skilled in the art can understand that the terminal device structure shown in FIG. 12 does not constitute a limitation on the terminal device, and the terminal device may include more or fewer components than shown, combine certain components, or use a different component arrangement. In the embodiments of the present disclosure, terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
The processor 110 is configured to use at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene.
The input unit 104 is configured to stitch the images captured by each of the camera modules to generate a target image.
Since the camera assembly forms an image from the ambient light that enters through the through hole and is reflected by the reflective device, the at least two camera modules of the terminal device provided by the embodiments of the present disclosure are all periscope camera modules; and since the reflective device can reflect the ambient light entering from the through hole to the camera assembly, the at least two camera modules can each acquire a partial image of the scene being photographed, and these partial images can then be combined into a complete image of that scene. That is, when an image is captured by the terminal device, each camera module only needs to acquire a part of the image. Compared with acquiring a complete image through a single camera module, the embodiments of the present disclosure can therefore reduce the area of the light sensor of each camera module without degrading the photographing quality of the terminal device, thereby reducing the thickness of the terminal device, and can thus solve the problem that an ultra-thin terminal device equipped with a periscope camera cannot capture large-sensor, high-pixel-count images.
It should be understood that, in the embodiments of the present disclosure, the radio frequency unit 101 can be used to receive and send signals during information transmission and reception or during a call; specifically, after receiving downlink data from a base station, it passes the data to the processor 110 for processing, and it also sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 102, for example helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the terminal device (for example, a call signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames can be displayed on the display unit 106. The image frames processed by the graphics processor 1041 can be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data; in telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device also includes at least one sensor 105, such as a Hall displacement sensor, a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device is moved to the ear. As a type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in multiple directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be described here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal input related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal produced by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which will not be described here.
Further, the touch panel 1071 can be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in FIG. 12 the touch panel 1071 and the display panel 1061 are shown as two independent components that realize the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the terminal device, which is not specifically limited here.
The interface unit 108 is an interface for connecting an external device to the terminal device. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the terminal device, or can be used to transfer data between the terminal device and the external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It uses various interfaces and lines to connect the parts of the entire terminal device, and, by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The terminal device may also include a power supply 111 (such as a battery) that supplies power to the components; preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the terminal device includes some functional modules that are not shown, which will not be described here.
The embodiments of the present disclosure further provide a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, each process of the above terminal device control method embodiments is implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The terminal device and the computer storage medium provided by the embodiments of the present disclosure are both used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
It should be noted that, herein, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or also includes elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present disclosure, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network-side device, or the like) to execute the methods described in the embodiments of the present disclosure.
The embodiments of the present disclosure have been described above with reference to the accompanying drawings, but the present disclosure is not limited to the above specific implementations. The above specific implementations are merely illustrative and not restrictive; under the teaching of the present disclosure, those of ordinary skill in the art can devise many other forms without departing from the spirit of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present disclosure.

Claims (12)

  1. A terminal device, comprising at least two camera modules;
    wherein each of the camera modules comprises: a camera assembly, a through hole provided in a housing of the terminal device, and a reflective device disposed opposite the through hole, the reflective device reflecting ambient light entering from the through hole to the camera assembly.
  2. The terminal device according to claim 1, wherein the terminal device further comprises a driving device;
    the through hole comprises: a first through hole provided on a front of the terminal device and a second through hole provided on a back of the terminal device; and
    the driving device adjusts the reflective device to reflect ambient light entering from the first through hole to the camera assembly, or to reflect ambient light entering from the second through hole to the camera assembly.
  3. The terminal device according to claim 1, wherein the reflective device is a reflective prism.
  4. The terminal device according to claim 1, wherein an angle between an optical path in the through hole and an optical path in the camera assembly is 90°.
  5. The terminal device according to claim 1, wherein the terminal device comprises four camera modules.
  6. The terminal device according to claim 5, wherein the through holes of the four camera modules are arranged in a column along a length direction of the terminal device; or the through holes of the four camera modules are arranged along a width direction of the terminal device.
  7. The terminal device according to claim 5, wherein the through holes of the four camera modules are arranged in a 2*2 through-hole matrix.
  8. The terminal device according to claim 7, wherein a ratio of a length to a width of a light sensor of the camera assembly is 2.1:1.6.
  9. A method for controlling a terminal device, used to control the terminal device according to any one of claims 1 to 8, the method comprising:
    using at least two camera modules to form images from ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
    stitching the images captured by each of the camera modules to generate a target image.
  10. The terminal device according to any one of claims 1 to 8, comprising:
    a photographing module configured to use the at least two camera modules to form images from the ambient light reflected by their respective reflective devices, to obtain images of partial regions of a target scene; and
    a stitching module configured to stitch the images captured by each of the camera modules to generate a target image.
  11. A terminal device, comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the terminal device control method according to claim 9.
  12. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the terminal device control method according to claim 9.
PCT/CN2020/085648 2019-04-25 2020-04-20 Terminal device and control method thereof WO2020216181A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910340748.9A CN110233914A (zh) 2019-04-25 2019-04-25 一种终端设备及其控制方法
CN201910340748.9 2019-04-25

Publications (1)

Publication Number Publication Date
WO2020216181A1 true WO2020216181A1 (zh) 2020-10-29

Family

ID=67860263

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085648 WO2020216181A1 (zh) 2019-04-25 2020-04-20 终端设备及其控制方法

Country Status (2)

Country Link
CN (1) CN110233914A (zh)
WO (1) WO2020216181A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110233914A (zh) * 2019-04-25 2019-09-13 维沃移动通信有限公司 一种终端设备及其控制方法
CN111629107B (zh) * 2020-06-10 2022-01-25 北京字节跳动网络技术有限公司 终端的控制方法、装置、终端和存储介质
CN112822361B (zh) * 2020-12-30 2022-11-18 维沃移动通信有限公司 电子设备


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4259461B2 (ja) * 2004-12-09 2009-04-30 三菱電機株式会社 Target detection device
CN207835595U (zh) * 2017-12-14 2018-09-07 信利光电股份有限公司 Dual camera module and terminal
CN109040596B (zh) * 2018-08-27 2020-08-28 Oppo广东移动通信有限公司 Camera adjustment method, mobile terminal, and storage medium
CN109151327A (zh) * 2018-10-30 2019-01-04 维沃移动通信(杭州)有限公司 Image processing method and terminal device
CN109379522A (zh) * 2018-12-06 2019-02-22 Oppo广东移动通信有限公司 Imaging method, imaging device, electronic device, and medium
CN109600551A (zh) * 2018-12-29 2019-04-09 Oppo广东移动通信有限公司 Imaging method, imaging device, electronic device, and medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100738803B1 (ko) * 2006-05-04 2007-07-12 엘지이노텍 주식회사 Camera module realizing a dual camera function with a single image sensor, and portable terminal
WO2015195296A2 (en) * 2014-06-20 2015-12-23 Qualcomm Incorporated Multi-camera system using folded optics free from parallax artifacts
CN105049687A (zh) * 2015-06-30 2015-11-11 广东欧珀移动通信有限公司 Dual-camera mobile terminal
CN107659758A (zh) * 2017-09-26 2018-02-02 努比亚技术有限公司 Periscope photographing device and mobile terminal
CN208386734U (zh) * 2018-06-14 2019-01-15 Oppo广东移动通信有限公司 Camera module and electronic device
CN110233914A (zh) * 2019-04-25 2019-09-13 维沃移动通信有限公司 A terminal device and control method thereof

Also Published As

Publication number Publication date
CN110233914A (zh) 2019-09-13

Similar Documents

Publication Publication Date Title
WO2021098678A1 (zh) 投屏控制方法及电子设备
US11373054B2 (en) Object recognition method and mobile terminal
WO2021104195A1 (zh) 图像显示方法及电子设备
WO2020156466A1 (zh) 拍摄方法及终端设备
US12003860B2 (en) Image processing method and electronic device
WO2020216181A1 (zh) 终端设备及其控制方法
WO2020186945A1 (zh) 界面显示方法及终端设备
WO2021083087A1 (zh) 截屏方法及终端设备
WO2021098697A1 (zh) 屏幕显示的控制方法及电子设备
WO2020186964A1 (zh) 音频信号的输出方法及终端设备
WO2021036623A1 (zh) 显示方法及电子设备
WO2021057290A1 (zh) 信息控制方法及电子设备
WO2021082744A1 (zh) 视频查看方法及电子设备
WO2020192324A1 (zh) 界面显示方法及终端设备
WO2021121398A1 (zh) 一种视频录制方法及电子设备
WO2020192282A1 (zh) 通知消息显示方法及终端设备
WO2020238459A1 (zh) 显示方法和终端设备
WO2020238562A1 (zh) 显示方法及终端
WO2020220893A1 (zh) 截图方法及移动终端
US11863901B2 (en) Photographing method and terminal
WO2021129776A1 (zh) 成像处理方法和电子设备
WO2021017705A1 (zh) 界面显示方法及终端设备
WO2021175143A1 (zh) 图片获取方法及电子设备
WO2021208890A1 (zh) 截屏方法及电子设备
WO2021109960A1 (zh) 图像处理方法、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20795210

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20795210

Country of ref document: EP

Kind code of ref document: A1