WO2023071497A1 - Photographing parameter adjustment method, electronic device, and storage medium - Google Patents

Photographing parameter adjustment method, electronic device, and storage medium

Info

Publication number
WO2023071497A1
WO2023071497A1 (PCT/CN2022/115819, CN2022115819W)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
screen
parameter
user
electronic device
Prior art date
Application number
PCT/CN2022/115819
Other languages
English (en)
Chinese (zh)
Inventor
戴天瑶
郑江震
李�瑞
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023071497A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • the embodiments of the present application relate to the field of smart terminals, and in particular, to a shooting parameter adjustment method, electronic equipment, and a storage medium.
  • the embodiment of the present application provides a shooting parameter adjustment method, electronic equipment, and a storage medium, so as to provide a convenient way to adjust the shooting parameters of the rear camera, thereby improving the user's operating experience.
  • an embodiment of the present application provides a shooting parameter adjustment method, which is applied to an electronic device, where the electronic device includes a first screen and a second screen, the first screen and the second screen being respectively located on both sides of the electronic device, including:
  • the electronic device may be a smart device with a rear camera, for example, a mobile phone, a tablet, and the like.
  • the first screen may be a small screen on the back of the electronic device on the rear camera side.
  • the second screen may be the main screen on the front of the electronic device.
  • if it is recognized that the first gesture matches a preset parameter adjustment gesture, the value of the shooting parameter corresponding to the matched parameter adjustment gesture is adjusted based on the first gesture; wherein the preset parameter adjustment gesture is preset by the user on the second screen.
  • the user can adjust the shooting parameters by preset parameter adjustment gestures on the main screen and gestures on the small screen, which can facilitate the adjustment of the shooting parameters of the rear camera, thereby improving the user's operating experience.
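  • The matching step described above can be sketched in Python as follows; this is a minimal illustration, and all gesture names, parameter names, and the dictionary-based binding are assumptions for exposition, not taken from the patent:

```python
# Gestures the user preset on the second (main) screen, each bound to
# one shooting parameter (names are illustrative).
preset_gestures = {
    "slide_up_down": "exposure",    # up/down slide adjusts exposure
    "slide_left_right": "zoom",     # left/right slide adjusts zoom
}

shooting_params = {"exposure": 0.0, "zoom": 1.0}

def adjust_on_first_screen(gesture: str, delta: float) -> bool:
    """Adjust the parameter bound to `gesture`, if one was preset."""
    param = preset_gestures.get(gesture)
    if param is None:
        return False                # gesture matches no preset
    shooting_params[param] += delta
    return True

adjust_on_first_screen("slide_up_down", 0.5)
print(shooting_params["exposure"])  # 0.5
```

The point of the lookup is that the binding between gesture and parameter lives in user-editable state set on the main screen, while the adjustment itself happens from the small screen.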
  • the preset parameter adjustment gestures include up and down sliding gestures and/or left and right sliding gestures.
  • recognizing the first gesture includes:
  • the validity of the first gesture is determined by the moving speed of the first gesture; wherein the moving speed of the first gesture may be the moving speed of the touch point of the user's finger on the first screen.
  • the moving speed may be represented by the number of rows or columns of pixels moving along a certain direction within a unit time (for example, 1 microsecond).
  • if the first gesture is valid, the first gesture is recognized.
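  • A possible reading of this validity check, sketched in Python: the moving speed is the number of pixel rows (or columns) the touch point crosses per unit time, and the gesture counts as valid only when that speed lies in a plausible range. The threshold values below are illustrative assumptions:

```python
MIN_SPEED = 1    # pixels per unit time; below this: accidental touch
MAX_SPEED = 500  # above this: likely noise or palm contact

def moving_speed(start_px: int, end_px: int, dt: float) -> float:
    """Pixel rows/columns traversed per unit time."""
    return abs(end_px - start_px) / dt

def is_valid_gesture(start_px: int, end_px: int, dt: float) -> bool:
    speed = moving_speed(start_px, end_px, dt)
    return MIN_SPEED <= speed <= MAX_SPEED

print(is_valid_gesture(100, 220, 2.0))  # True: 60 px per unit time
print(is_valid_gesture(100, 101, 5.0))  # False: too slow
```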
  • the display interface of the first screen displays a scale and a cursor, wherein the cursor corresponds to the touch point of the first gesture on the first screen, and the value on the scale corresponding to the cursor is used to represent the value of the shooting parameter.
  • the scale may include scales, and each scale may correspond to a numerical value, and the numerical value may correspond to the numerical value of the shooting parameter.
  • the cursor may be generated according to the touch point of the user's finger on the first screen. As the user's finger moves, the touch point moves accordingly, and the cursor also moves accordingly.
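  • The cursor-to-scale correspondence can be sketched as a linear mapping from the touch point's position to a value on the scale; the screen width and the parameter range below are illustrative assumptions, not values from the application:

```python
SCREEN_WIDTH = 300                # pixels spanned by the scale
PARAM_MIN, PARAM_MAX = -4.0, 4.0  # e.g. an exposure-compensation range

def cursor_value(touch_x: int) -> float:
    """Scale value under the cursor for a touch at column `touch_x`."""
    # Clamp to the scale, then interpolate linearly over the range.
    fraction = max(0, min(touch_x, SCREEN_WIDTH)) / SCREEN_WIDTH
    return PARAM_MIN + fraction * (PARAM_MAX - PARAM_MIN)

print(cursor_value(0))    # -4.0 (left edge of the scale)
print(cursor_value(150))  # 0.0  (midpoint)
print(cursor_value(300))  # 4.0  (right edge)
```

As the finger moves, the touch point (and thus the cursor) moves, and the value under the cursor becomes the new shooting-parameter value.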
  • One of the possible implementations also includes:
  • in response to the detected second gesture of the user on the first screen, the second gesture is recognized; wherein the second gesture can be used to switch the shooting parameter corresponding to the current parameter adjustment gesture.
  • the number of parameter adjustment gestures may be one or more.
  • the shooting parameters corresponding to one or more parameter adjusting gestures are switched based on the second gesture.
  • the shooting parameters corresponding to the parameter adjustment gestures are switched through the user's second gesture on the small screen, thereby preventing the user from switching the shooting parameters on the main screen, facilitating the user's operation, and improving the user's operation experience.
  • One of the possible implementations also includes:
  • a first icon and a second icon are displayed on the display interface of the first screen; wherein the first icon may be a lock icon, and the second icon may be a cancel icon.
  • the third gesture may be a stop gesture, and the above-mentioned first icon and second icon may be called out through the user's stop gesture.
  • the cancel icon and the lock icon are called out through the stop gesture, and through the operation of the cancel icon or the lock icon, the shooting parameters can be locked or the adjustment canceled, improving the user's operating experience.
  • the display interface of the first screen further includes a scale, and the first icon and the second icon are respectively located on the left and right sides or on the top and bottom sides of the scale.
  • the first icon and the second icon are respectively located on the left and right sides or the upper and lower sides of the scale, so as to avoid user misoperation.
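  • A minimal sketch of the lock/cancel behaviour described above: tapping the lock icon keeps the adjusted value and freezes further adjustment, while tapping the cancel icon restores the value from before the adjustment began. The state machine below is an assumption about how this could be wired up, not the patent's implementation:

```python
class ParamSession:
    """Tracks one shooting parameter during an adjustment session."""

    def __init__(self, initial: float):
        self.initial = initial     # value before adjustment began
        self.value = initial
        self.locked = False

    def adjust(self, delta: float):
        if not self.locked:        # locked parameters ignore gestures
            self.value += delta

    def tap_lock(self):
        self.locked = True         # keep the adjusted value

    def tap_cancel(self):
        self.value = self.initial  # discard the adjustment
        self.locked = False

s = ParamSession(1.0)
s.adjust(0.4)
s.tap_cancel()
print(s.value)  # 1.0: adjustment discarded
```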
  • the electronic device further includes a shooting mode
  • the shooting mode includes a normal shooting mode and a wide-angle shooting mode
  • the above method further includes:
  • the parameter corresponding to the wide-angle shooting mode is switched based on the fourth gesture.
  • the fourth gesture can be used to switch parameters in the shooting mode, for example, 1X and 0.5X in the wide-angle shooting mode.
  • the parameter switching in the shooting mode can be realized through the user's fourth gesture, thereby improving the efficiency of parameter switching and further improving the user's experience.
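  • The fourth-gesture behaviour can be sketched as a simple toggle between the two wide-angle zoom factors named in the text (1X and 0.5X); the function name and dictionary are illustrative:

```python
# Each fourth gesture toggles the wide-angle zoom factor.
zoom_factors = {"1X": "0.5X", "0.5X": "1X"}

def on_fourth_gesture(current: str) -> str:
    """Toggle the wide-angle zoom factor on a fourth gesture."""
    return zoom_factors[current]

print(on_fourth_gesture("1X"))    # 0.5X
print(on_fourth_gesture("0.5X"))  # 1X
```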
  • the embodiment of the present application provides a shooting parameter adjustment device, which is applied to electronic equipment, and the electronic equipment includes a first screen and a second screen, and the first screen and the second screen are respectively located on both sides of the electronic equipment, including:
  • a recognition module configured to recognize the first gesture in response to the detected first gesture of the user on the first screen
  • the adjustment module is configured to, if it is recognized that the first gesture matches a preset parameter adjustment gesture, adjust the value of the shooting parameter corresponding to the matched parameter adjustment gesture based on the first gesture; wherein the preset parameter adjustment gestures are preset by the user on the second screen.
  • the preset parameter adjustment gesture includes a gesture of sliding up and down and/or a gesture of sliding left and right.
  • the above recognition module is also used for: identifying the validity of the first gesture, wherein the validity of the first gesture is determined by the moving speed of the first gesture; and, if the first gesture is valid, recognizing the first gesture.
  • the above shooting parameter adjustment device further includes:
  • the display module is used to display a scale and a cursor on the display interface of the first screen, wherein the cursor corresponds to the touch point of the first gesture on the first screen, and the value on the scale corresponding to the cursor is used to represent the value of the shooting parameter.
  • the above shooting parameter adjustment device further includes:
  • the first switching module is configured to identify the second gesture in response to the detected second gesture of the user on the first screen; if it is recognized that the second gesture matches the preset parameter change gesture, then based on The second gesture switches the shooting parameters corresponding to one or more parameter adjustment gestures.
  • the above shooting parameter adjustment device further includes:
  • a suspension module configured to display the first icon and the second icon on the display interface of the first screen in response to the detected third gesture of the user on the first screen;
  • the display interface of the first screen further includes a scale, and the first icon and the second icon are respectively located on the left and right sides or the upper and lower sides of the scale.
  • the electronic device further includes a shooting mode
  • the shooting mode includes a normal shooting mode and a wide-angle shooting mode
  • the above-mentioned shooting parameter adjustment device further includes:
  • the second switching module is configured to switch the parameter corresponding to the wide-angle shooting mode based on the fourth gesture in response to the detected fourth gesture of the user on the first screen.
  • the embodiment of the present application provides an electronic device, including:
  • the memory is used to store computer program codes
  • the computer program codes include instructions
  • the electronic device includes a first screen and a second screen, the first screen and the second screen being located on both sides of the electronic device; when the electronic device reads the above instructions from the memory, the electronic device is caused to perform the following steps:
  • if it is recognized that the first gesture matches a preset parameter adjustment gesture, the value of the shooting parameter corresponding to the matched parameter adjustment gesture is adjusted based on the first gesture; wherein the preset parameter adjustment gesture is preset by the user on the second screen.
  • the preset parameter adjustment gesture includes a gesture of sliding up and down and/or a gesture of sliding left and right.
  • the step of causing the above electronic device to recognize the first gesture includes: identifying the validity of the first gesture, wherein the validity of the first gesture is determined by the moving speed of the first gesture; and, if the first gesture is valid, recognizing the first gesture.
  • the display interface of the first screen displays a scale and a cursor, wherein the cursor corresponds to the touch point of the first gesture on the first screen, and the value on the scale corresponding to the cursor is used to represent the value of the shooting parameter.
  • when the instructions are executed by the above-mentioned electronic device, the above-mentioned electronic device also performs the following steps:
  • the shooting parameters corresponding to one or more parameter adjusting gestures are switched based on the second gesture.
  • when the instructions are executed by the above-mentioned electronic device, the above-mentioned electronic device also performs the following steps:
  • the display interface of the first screen further includes a scale, and the first icon and the second icon are respectively located on the left and right sides or on the top and bottom sides of the scale.
  • the electronic device further includes a shooting mode, and the shooting mode includes a normal shooting mode and a wide-angle shooting mode.
  • the above-mentioned electronic device also performs the following steps:
  • the parameter corresponding to the wide-angle shooting mode is switched based on the fourth gesture.
  • an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when it is run on a computer, the computer executes the method described in the first aspect.
  • an embodiment of the present application provides a computer program, which is used to execute the method described in the first aspect when the above computer program is executed by a computer.
  • all or part of the program in the fifth aspect may be stored on a storage medium packaged with the processor, or part or all may be stored on a memory not packaged with the processor.
  • FIG. 1 is a schematic diagram of a hardware structure of an electronic device provided in an embodiment of the present application
  • FIG. 2 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 3 is a schematic flow diagram of an embodiment of a shooting parameter adjustment method provided by the present application.
  • FIG. 4 is a schematic diagram of an embodiment of a parameter adjustment gesture setting interface provided by the present application.
  • FIG. 5 is a schematic diagram of another embodiment of the parameter adjustment gesture setting interface provided by the present application.
  • FIG. 6 is a schematic diagram of an embodiment of a parameter modification display interface provided by the present application.
  • FIG. 7 is a schematic diagram of another embodiment of the parameter modification display interface provided by the present application.
  • FIG. 8 is a schematic diagram of another embodiment of the parameter modification display interface provided by the present application.
  • FIG. 9 is a schematic diagram of another embodiment of the parameter modification display interface provided by the present application.
  • FIG. 10 is a schematic diagram of another embodiment of the parameter modification display interface provided by the present application.
  • FIG. 11 is a schematic diagram of another embodiment of the parameter modification display interface provided by the present application.
  • FIG. 12 is a schematic flowchart of another embodiment of the shooting parameter adjustment method provided by the present application.
  • Fig. 13a is a schematic diagram of a display interface of locking parameters provided by an embodiment of the present application.
  • Fig. 13b is a schematic diagram of a display interface for canceling parameter adjustment provided by the embodiment of the present application.
  • Fig. 14 is a schematic structural diagram of the device provided by the embodiment of the present application.
  • "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, "plurality" means two or more.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the processor 110 may receive the image code stream of the rear camera and output a preview image; when the touch sensor 180K detects a specific gesture, adjust the rendering parameters of the preview image or adjust the hardware parameters of the rear camera; and, according to the user's gesture recognized by the gesture recognition unit, change the shooting parameter adjusted by the gesture on the small screen.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100 , including wireless local area networks (wireless local area networks, WLAN) (such as a wireless fidelity (Wi-Fi) network), bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, and the like.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS) and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active matrix organic light emitting diode or an active matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), flexible light-emitting diode (flex light-emitting diode, FLED), Miniled, MicroLed, Micro-oLed, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 may include a main screen and a small screen, the main screen may be located on the front of the electronic device 100 , and the small screen may be located on the back of the electronic device 100 .
  • the main screen can be used to display the window of the camera application and can be used to display a preview image.
  • the small screen may be a small screen integrated on the back of the electronic device 100, and when the user uses the rear camera to take a selfie, a preview image is displayed. In a normal state (for example, a standby state), the small screen can display a clock.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, where the light signal is converted into an electrical signal; the photosensitive element of the camera then transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • the speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • the microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In some other embodiments, the electronic device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D may be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • there are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may include at least two parallel plates made of conductive material.
  • the electronic device 100 determines the intensity of pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view short messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
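The pressure-threshold dispatch described above can be sketched as follows. The normalized 0.0-1.0 pressure scale, the threshold value, and the function and instruction names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: dispatch different operation instructions for touches
# at the same position depending on touch pressure.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized pressure threshold

def handle_touch_on_sms_icon(pressure: float) -> str:
    """Map a touch on the short-message icon to an operation instruction."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_messages"       # light press: open the message list
    return "create_new_short_message"      # firm press: compose a new message
```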
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • when the electronic device 100 is a clamshell device, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D.
  • features such as automatic unlocking upon flipping open can then be set according to the detected opening and closing state.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
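The proximity decision above is a simple sufficiency test on the reflected infrared level. A minimal sketch, in which the sensor units and the threshold value are assumptions:

```python
# Hypothetical sketch: infer a nearby object when enough reflected IR light
# reaches the photodiode.
REFLECTION_THRESHOLD = 10.0  # assumed sensor units for "sufficient" light

def object_nearby(reflected_ir: float) -> bool:
    """Return True when sufficient reflected IR is detected."""
    return reflected_ir >= REFLECTION_THRESHOLD
```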
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the electronic device 100 may reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in some other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from shutting down abnormally due to the low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
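The temperature-treatment strategy sketched in the preceding bullets can be expressed as a simple policy function. All threshold values and action names below are assumptions for illustration; the patent does not specify them.

```python
# Hypothetical sketch of the temperature-treatment strategy: throttle the
# processor above a high threshold; heat the battery or boost its output
# voltage below low thresholds.
HIGH_TEMP_C = 45.0  # assumed thermal-protection threshold
LOW_TEMP_C = 0.0    # assumed low-temperature threshold

def temperature_policy(temp_c: float) -> str:
    if temp_c > HIGH_TEMP_C:
        return "reduce_processor_performance"    # lower power, thermal protection
    if temp_c < LOW_TEMP_C:
        return "heat_battery_and_boost_voltage"  # prevent low-temp shutdown
    return "normal_operation"
```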
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the touch sensor 180K can monitor the touch points on the small screen in real time, and perform gesture recognition according to the touch points to determine the touch gesture.
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • The shooting parameter adjustment method provided by the embodiments of the present application will now be described with reference to FIG. 2-FIG. 12, FIG. 13a and FIG. 13b.
  • FIG. 2 is a schematic diagram of an application scenario of the shooting parameter adjustment method of the present application.
  • the above application scenario includes a user 200 and an electronic device 100 .
  • the user 200 can adjust shooting parameters on the small screen on the back of the electronic device 100, and can use the rear camera to take a selfie.
  • FIG. 3 it is a schematic flowchart of an embodiment of the shooting parameter adjustment method provided by the embodiment of the present application, including:
  • Step 301: set parameter adjustment gestures on the main screen.
  • the user can open the camera application program of the electronic device 100 and enter the setting menu of the camera application program to set the parameter adjustment gestures for the small screen. Since the area of the small screen is usually small, it can only support a limited number of parameter adjustment gestures, and it is impossible to make each shooting parameter correspond to its own parameter adjustment gesture. In addition, if each shooting parameter corresponded to a separate parameter adjustment gesture, the user would misremember the too-numerous gestures, making it difficult to match each parameter adjustment gesture to its shooting parameter and preventing the user from operating fluently.
  • two parameter adjustment gestures may be set, for example, one is a parameter adjustment gesture of sliding in a vertical direction, and the other is a parameter adjustment gesture of sliding in a left and right direction. It can be understood that three or more parameter adjustment gestures may also be set, and the embodiment of the present application does not specifically limit the number of the above parameter adjustment gestures.
  • the parameter adjustment gestures can be set in advance to determine the shooting parameters controlled by each parameter adjustment gesture. Exemplarily, in the camera application setting interface, there may be a "rear self-timer" menu bar, and in this menu bar the user can select whether to enable the parameter adjustment gesture and which shooting parameters the parameter adjustment gesture controls.
  • the interface 400 is a setting interface of the camera application program.
  • the interface 400 may include a rear self-timer menu bar 410, which may be used to enable parameter adjustment gestures and select the shooting parameters controlled by the parameter adjustment gestures.
  • the above shooting parameters may include ISO, aperture, shutter speed, exposure compensation, filter strength, white balance, beauty, zoom, sharpness, and the like.
  • the user can click on the rear self-timer menu bar 410, thereby opening the shooting parameter selection box 411, which can include options such as off, ISO, aperture, shutter speed, exposure compensation, filter strength, white balance, beauty, zoom, and sharpness, used to enable or disable the above parameter adjustment gestures and to select shooting parameters.
  • the user can select the off option to disable the above parameter adjustment gestures, or select any one of the above shooting parameters, such as ISO, aperture, shutter speed, exposure compensation, filter strength, white balance, beauty, zoom, and sharpness, to enable the parameter adjustment gestures and use them to control the selected shooting parameter.
  • the interface 500 is a setting interface of the camera application program.
  • the interface 500 may include a rear self-timer menu bar 510, which may be used to enable parameter adjustment gestures and select shooting parameters controlled by the parameter adjustment gestures.
  • the parameter adjustment gesture may include a gesture of sliding up and down and a gesture of sliding left and right. Up and down sliding gestures and left and right sliding gestures can be used to control corresponding shooting parameters. It can be understood that the above-mentioned up and down sliding gestures and left and right sliding gestures can control the same shooting parameters, and can also control different shooting parameters. Next, it will be described by taking the above-mentioned up and down sliding gestures and left and right sliding gestures controlling different shooting parameters as an example.
  • the left and right sliding gestures can be used to adjust hardware shooting parameters.
  • the hardware shooting parameters can be parameters for controlling physical hardware such as CMOS sensors, and may include, for example, parameters such as ISO, aperture, shutter speed, exposure compensation, and white balance.
  • Swipe up and down gestures can be used to adjust software shooting parameters.
  • the software shooting parameters can be parameters for image processing when the algorithm produces images, for example, it can include parameters such as filter strength, beauty, zoom and sharpness.
  • the rear selfie menu bar 510 may include an up and down slide gesture menu 511 and a left and right slide gesture menu 512. The user may click on the up and down slide gesture menu 511, thereby opening a first shooting parameter selection box 5111; the first shooting parameter selection box 5111 can be used to adjust software shooting parameters.
  • the first shooting parameter selection box 5111 may include options such as off, filter strength, beauty, zoom, and sharpness.
  • the user can click on the left and right sliding gesture menu 512, thereby obtaining a second shooting parameter selection box 5121, which can be used to adjust hardware shooting parameters.
  • the second shooting parameter selection box 5121 may include options such as off, ISO, aperture, shutter speed, exposure compensation, and white balance.
  • Step 302: acquire images using the rear camera.
  • the rear camera of the electronic device 100 may be used to collect images, for example, for taking selfies.
  • the electronic device 100 may use the rear camera to acquire an image, for example, may acquire the user's face image.
  • Step 303: detect the user's touch operation on the small screen and perform gesture recognition on the touch operation.
  • a preview image may be displayed on the small screen.
  • users can perform touch operations on the small screen.
  • the electronic device 100 acquires the user's gesture and can perform gesture recognition on it to determine the type of the gesture, for example, whether the gesture is an up and down sliding gesture or a left and right sliding gesture.
  • the user touches and moves the finger on the small screen, and the touch sensor 180K on the small screen can perform gesture recognition by tracking the movement track of the touch point.
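Gesture recognition from the tracked touch points can be sketched by comparing the dominant displacement axis of the recorded track. The point format (x, y) in screen pixels and the function name are assumptions for illustration:

```python
# Hypothetical sketch: classify a touch track recorded by the touch sensor
# as an up/down or left/right sliding gesture based on the larger
# displacement component between the first and last touch points.
def classify_gesture(track: list) -> str:
    """track: list of (x, y) touch points in pixel coordinates."""
    if len(track) < 2:
        return "unknown"
    dx = track[-1][0] - track[0][0]  # horizontal displacement
    dy = track[-1][1] - track[0][1]  # vertical displacement
    return "up_down_slide" if abs(dy) > abs(dx) else "left_right_slide"
```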
  • Step 304: adjust the value of the corresponding shooting parameter according to the recognized parameter adjustment gesture.
  • if the up and down sliding gesture and the left and right sliding gesture are both preset in the camera application program, for example, as in the embodiment shown in Figure 5, it can be determined whether the recognized parameter adjustment gesture is the preset up and down sliding gesture or the preset left and right sliding gesture.
  • if the moving speed of the user's gesture (touch point) is less than a preset value, the gesture can be considered a valid gesture; if the moving speed of the user's gesture (touch point) is greater than or equal to the preset value, the gesture can be considered an invalid gesture.
  • for the up and down sliding gesture, if the movement speed of the gesture (touch point) is too high (for example, greater than or equal to the preset value), the adjustment of the shooting parameters will not be triggered even if the gesture is recognized as an up and down slide; if the movement speed of the gesture (touch point) is less than the preset value, the gesture can be recognized as an up and down sliding gesture for adjusting shooting parameters.
  • because the operating habit of the human hand is to swipe quickly left and right, left and right gestures can also be distinguished by the moving speed of the gesture.
  • for the left and right sliding gesture, if the movement speed of the gesture is greater than or equal to the preset value, the gesture can be recognized as a swipe gesture for starting or switching the camera; if the movement speed of the gesture is less than the preset value, the gesture can be recognized as a left and right sliding gesture for adjusting shooting parameters.
  • the movement speed of the gesture can be defined as the number of rows/columns of pixels crossed along a certain direction within a unit time (for example, 1 microsecond). For example, for the up and down sliding gesture, it is the number of rows of pixels that the touch point crosses in the vertical direction within 1 microsecond.
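The speed definition above, and the validity test it feeds, can be sketched as follows. The threshold value and function names are assumptions for illustration:

```python
# Hypothetical sketch: movement speed = pixel rows (or columns) crossed per
# microsecond, compared against a preset threshold to decide whether the
# gesture should adjust shooting parameters.
SPEED_THRESHOLD = 5.0  # assumed rows per microsecond

def movement_speed(pixel_lines_crossed: int, elapsed_us: float) -> float:
    """Rows/columns of pixels crossed per microsecond of gesture motion."""
    return pixel_lines_crossed / elapsed_us

def is_parameter_adjust_gesture(speed: float) -> bool:
    """Slow gestures adjust parameters; fast ones are treated as invalid
    (or, for left/right slides, as camera start/switch swipes)."""
    return speed < SPEED_THRESHOLD
```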
  • the value of the shooting parameter corresponding to the parameter adjustment gesture may be adjusted.
  • the value of the filter strength can be adjusted according to the moving distance of the finger on the screen, or according to the number of rows of pixels crossed by the touch point in the vertical direction, and the preview image on the small screen can be rendered and presented according to the adjusted value of the filter strength.
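Mapping vertical finger movement to a filter-strength value might look like the sketch below. The step size and the 0-10 strength range are assumptions; the patent's ruler in FIG. 6 only implies a bounded integer scale.

```python
# Hypothetical sketch: convert rows of vertical movement into filter-strength
# steps, clamped to an assumed 0-10 range.
ROWS_PER_STEP = 20          # assumed pixel rows of movement per strength step
MIN_STRENGTH, MAX_STRENGTH = 0, 10  # assumed strength range

def adjust_filter_strength(current: int, rows_moved: int) -> int:
    """Positive rows_moved = upward slide (increase strength),
    negative rows_moved = downward slide (decrease strength)."""
    new_value = current + rows_moved // ROWS_PER_STEP
    return max(MIN_STRENGTH, min(MAX_STRENGTH, new_value))
```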
  • the adjustment process of the above-mentioned shooting parameters may be presented in a visualized manner.
  • a ruler is generated near the user's gesture, and the value of the current shooting parameter is displayed in a large font.
  • the cursor is displayed according to the user's touch point, and the adjustment process of the shooting parameters is presented visually through the movement of the cursor on the scale.
  • the electronic device 100 may include a display interface 600 with a small screen.
  • The ruler 610 can be displayed on the display interface 600, and a cursor 620 can be generated at the current touch point of the finger.
  • the cursor 620 can move on the scale 610 with the movement of the finger.
  • The graduations on the ruler 610 correspond to the intensity values of the shooting parameter (e.g., filter intensity).
  • For example, when the cursor 620 points to graduation 3, a filter intensity value of 3 is used for rendering the filter.
  • If the user's finger slides downward, the intensity value of the filter is decreased; if the user's finger slides upward, the intensity value of the filter is increased.
  • The graduation range of the ruler 610 shown on the small screen is not fixed, and can be dynamically adjusted according to the current value of the shooting parameter and the user's touch position.
  • For example, if the filter intensity value before adjustment is 3, then when the ruler 610 is displayed, the position touched by the user corresponds to graduation 3 on the ruler 610 (as shown in FIG. 6).
  • If the filter intensity value before adjustment is 7, then when the ruler 610 is displayed, graduation 7 corresponds to the position of the cursor 620.
  • The graduations of the ruler 610 displayed on the small screen may also change correspondingly.
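The dynamic ruler described above can be sketched as follows. This is an assumption-laden illustration, not the device's code: the scale is anchored so the initial touch position corresponds to the current parameter value, and the constants (`PIXELS_PER_STEP`, the 0-10 value range) are hypothetical placeholders.

```python
# Sketch: a ruler whose displayed graduations are anchored to the touch point.
# The graduation under the finger at touch-down equals the current parameter
# value; sliding the finger up increases the value, sliding down decreases it.

PIXELS_PER_STEP = 30  # hypothetical: pixel rows per one graduation

class Ruler:
    def __init__(self, current_value: int, touch_y: int,
                 min_value: int = 0, max_value: int = 10):
        self.anchor_value = current_value  # graduation under the finger at touch-down
        self.anchor_y = touch_y            # screen y-coordinate of the touch-down
        self.min_value = min_value
        self.max_value = max_value

    def value_at(self, finger_y: int) -> int:
        """Return the parameter value for the current finger position."""
        # Smaller y means the finger moved up, which increases the value.
        steps = (self.anchor_y - finger_y) // PIXELS_PER_STEP
        value = self.anchor_value + steps
        return max(self.min_value, min(self.max_value, value))
```

For example, a ruler created with `Ruler(current_value=3, touch_y=300)` places graduation 3 under the finger, matching the FIG. 6 scenario; a touch-down when the value is 7 would instead anchor graduation 7 there.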
  • the shooting parameters are adjusted through the gesture interaction of the small screen.
  • The shooting parameters can be adjusted through simple interaction with the small screen, with no need to adjust them by operating the main screen; this is more convenient to use, gives a better selfie experience, and enriches the camera functions.
  • the operation is simple and convenient. You can hold the mobile phone with one hand, and interact with the small screen with the other hand. The adjustment effect can be previewed in real time through the small screen to easily complete the adjustment of shooting parameters and output satisfactory images.
  • the above-mentioned gesture adjustment can support multiple gestures, and the same or different parameters can be adjusted through multiple gestures, which enriches the operation scenarios.
  • The above-mentioned embodiment shown in FIG. 3 only changes the shooting parameters in the current shooting mode (for example, normal shooting), and does not constitute a limitation to the embodiments of the present application. In some embodiments, shooting parameters may also be changed in a shooting-mode-switching scenario.
  • the user can switch the current normal shooting mode to the wide-angle shooting mode.
  • the user can also perform a gesture to change shooting mode parameters on the small screen (for example, double-tap on the small screen).
  • In step 304, if the electronic device 100 detects the user's gesture for changing shooting mode parameters on the small screen (for example, a double-tap on the small screen), it can carry out the change of the current shooting mode's parameters.
  • For example, assume the current shooting mode is the wide-angle shooting mode.
  • The parameters of the wide-angle shooting mode can include main camera 1X and wide-angle 0.5X. Therefore, the double-tap gesture can be used to switch between main camera 1X and wide-angle 0.5X, thereby realizing the change of shooting mode parameters.
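The double-tap switch between the two mode parameters can be sketched as a simple toggle. This is an illustrative sketch under the assumptions in the text (two parameters, 1X and 0.5X); the function name is hypothetical.

```python
# Sketch: a double-tap on the small screen toggles between the wide-angle
# shooting mode's two parameters, main camera 1X and wide-angle 0.5X.

WIDE_ANGLE_PARAMS = ["1X", "0.5X"]  # main camera 1X, wide-angle 0.5X

def on_double_tap(current: str) -> str:
    """Switch to the other zoom parameter of the wide-angle shooting mode."""
    idx = WIDE_ANGLE_PARAMS.index(current)
    return WIDE_ANGLE_PARAMS[(idx + 1) % len(WIDE_ANGLE_PARAMS)]
```

Because the list has two entries, repeated double-taps simply alternate between the main camera and the wide-angle camera.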
  • Figures 3-6 introduced the scenario of adjusting shooting parameters through a parameter adjustment gesture on the small screen; next, the scenario of switching parameter settings is described in detail through Figures 7-11. The number of parameter adjustment gestures is limited (for example, two are usually set: an up-down sliding gesture and a left-right sliding gesture), so the shooting parameters that can be adjusted by parameter adjustment gestures are also limited. If the shooting parameter the user wants to adjust is not one of the preset shooting parameters, the user still has to go back to the main screen to change the settings and then adjust the specific shooting parameter through gestures, or complete the setting of the shooting parameter directly on the main screen; this increases the complexity of parameter adjustment.
  • To address this, a parameter changing gesture can be added; the parameter changing gesture is used to change which shooting parameter a parameter adjustment gesture adjusts.
  • the parameter changing gesture can be a user gesture acting on the small screen, or a gesture acting on the main screen.
  • After detecting a user gesture, the electronic device 100 can recognize it. If the electronic device 100 recognizes a parameter changing gesture, the gesture can be used to change the currently adjustable shooting parameter. Exemplarily, the user can draw an arc exceeding a semicircle on the small screen; when the electronic device 100 detects this arc gesture exceeding a semicircle, it can be treated as a parameter changing gesture, and the currently adjustable shooting parameter is changed. Assuming the currently adjustable shooting parameter is the filter strength, it can then be changed to the next shooting parameter (for example, beauty face) in the shooting parameter list (for example, the first shooting parameter selection box 5111).
  • Alternatively, an edge gesture can be used: the user slides a finger along the edge of the small screen, and the entire edge of the small screen corresponds to the parameter list from top to bottom. That is to say, as the user's finger slides along the edge of the small screen, the shooting parameters in the parameter list can be switched from top to bottom, or from bottom to top.
  • The top-to-bottom switching direction can correspond to the user's clockwise sliding gesture, and the bottom-to-top switching direction can correspond to the user's counterclockwise sliding gesture. That is to say, the clockwise gesture can slide down the parameter list to change the adjustable shooting parameter, and the counterclockwise gesture can slide up the parameter list to change the adjustable shooting parameter.
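The clockwise/counterclockwise cycling through the parameter list can be sketched as below. This is an illustrative sketch only; the parameter names are examples drawn from the text, and the list contents and function name are assumptions.

```python
# Sketch: cycle through the shooting parameter list with an edge gesture.
# A clockwise slide moves down the list; a counterclockwise slide moves up.

PARAMETER_LIST = ["filter strength", "beauty", "white balance", "zoom"]

def change_parameter(current: str, direction: str) -> str:
    """Return the next adjustable parameter in the given slide direction."""
    idx = PARAMETER_LIST.index(current)
    step = 1 if direction == "clockwise" else -1  # down vs. up the list
    return PARAMETER_LIST[(idx + step) % len(PARAMETER_LIST)]
```

The modulo wrap-around means the list is treated as circular, so continuing to slide past the last entry returns to the first, which matches the intuition of a finger circling the screen edge.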
  • The above-mentioned gestures in Figure 7 and Figure 8 are only illustrative and do not constitute a limitation to the embodiments of the present application; in some embodiments, other types of gestures can also be used to change the adjustable shooting parameters. In addition, the embodiments shown in FIG. 7 and FIG. 8 cover only scenarios where one parameter adjustment gesture is preset, that is to say, one parameter changing gesture changes one adjustable shooting parameter at a time. When multiple (for example, two) parameter adjustment gestures are preset, one parameter changing gesture can change multiple adjustable shooting parameters at the same time.
  • Taking the edge gesture (for example, a finger sliding along the edge of the small screen) as an example, the edge can be divided into left and right halves: when the user's finger slides along the edge of the left half of the small screen, the shooting parameter corresponding to one parameter adjustment gesture can be changed, and when the user's finger slides along the edge of the right half of the small screen, the shooting parameter corresponding to another parameter adjustment gesture can be changed.
  • Alternatively, a current parameter adjustment gesture can be locked, and the adjustable shooting parameter corresponding to the locked parameter adjustment gesture can then be changed.
  • That is, the current parameter adjustment gesture can first be locked as the gesture whose adjustable parameter is to be changed.
  • the scenario of the above-mentioned locking parameter adjustment gesture will now be described with reference to FIG. 11 .
  • For example, the current up-down sliding gesture can be locked as the parameter adjustment gesture to be changed, where the shooting parameter corresponding to the up-down sliding gesture is beauty and the shooting parameter corresponding to the left-right sliding gesture is white balance.
  • The user can then use the parameter changing gesture to change only the shooting parameter corresponding to the up-down sliding gesture (for example, from beauty to zoom), obtaining the changed shooting parameter without changing the shooting parameter corresponding to the left-right sliding gesture (for example, white balance).
  • In addition, the back of the mobile phone (that is, the main screen) is easy to operate when taking a rear selfie, so changing the shooting parameters adjusted by a parameter adjustment gesture can also be realized through a tapping gesture on the back.
  • The shooting parameters of multiple parameter adjustment gestures can be changed at the same time: for example, one tap on the back (that is, the main screen) can change the shooting parameters corresponding to two gestures simultaneously, or change only the shooting parameter of one parameter adjustment gesture. In a specific implementation, the user can first select and lock one of the parameter adjustment gestures by double-tapping the back of the phone; a horizontal ruler and a vertical ruler can be displayed on the small screen to indicate the locked gesture, together with a text prompt that the parameter adjustment gesture in that direction is locked; further taps then change the parameter type adjusted by the gesture.
  • It is also possible to combine the first and second methods to change the shooting parameters.
  • Taking the electronic device 100 as a mobile phone as an example, the user can double-tap the back of the mobile phone to enter the selection mode, and the rulers of the left-right sliding gesture and the up-down sliding gesture are displayed on the small screen; the user can select the parameter adjustment gesture to change by selecting a ruler.
  • For example, the user can select a ruler on the small screen and lock it; or tap the back of the phone to switch between the left-right sliding gesture and the up-down sliding gesture, and then double-tap the back of the phone to confirm the selected ruler, thereby selecting the parameter adjustment gesture to be changed. For example, after the up-down sliding gesture is selected, the ruler corresponding to the up-down sliding gesture is highlighted on the small screen.
  • Then, the user can change the shooting parameter of the currently locked parameter adjustment gesture (for example, the up-down sliding gesture) through a parameter changing gesture.
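The lock-then-rebind flow above can be sketched as follows. This is an illustrative sketch, not the device's code: the gesture keys, the initial bindings (beauty, white balance), and the class layout are assumptions following the examples in the text.

```python
# Sketch: lock one parameter adjustment gesture, then let a parameter
# changing gesture rebind only that gesture's shooting parameter, leaving
# the other gesture's parameter untouched.

class GestureBindings:
    def __init__(self):
        self.bindings = {"slide_up_down": "beauty",
                         "slide_left_right": "white balance"}
        self.locked = None  # gesture currently selected for change

    def lock(self, gesture: str) -> None:
        """Select (lock) the gesture whose shooting parameter will be changed."""
        self.locked = gesture

    def change_parameter(self, new_parameter: str) -> None:
        """A parameter changing gesture rebinds only the locked gesture."""
        if self.locked is None:
            raise ValueError("no parameter adjustment gesture is locked")
        self.bindings[self.locked] = new_parameter
```

After `lock("slide_up_down")` and a change to zoom, the left-right gesture still adjusts white balance, which is exactly the behavior the example describes.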
  • the shooting parameters adjusted by gestures are switched through simple user gestures.
  • the shooting parameters can be switched through the preset switching gesture, so that the adjustable shooting parameters can be increased without switching to the main screen for resetting. In turn, user experience can be improved.
  • Figures 7-11 introduce the scene of switching parameter settings, and then, the following describes the scene of canceling parameter adjustment or locking parameters in detail through Figure 12, Figure 13a and Figure 13b.
  • The scene of canceling the parameter adjustment is applicable when the user is not satisfied with the current parameter adjustment effect; at this time, the user may cancel the current shooting parameter adjustment.
  • The scene of locking the parameters is applicable when the user is satisfied with the current parameter adjustment effect; at this time, the user can lock the current shooting parameters and use them for shooting.
  • FIG. 12 is a schematic flowchart of another embodiment of the shooting parameter adjustment method provided by the embodiment of the present application, including:
  • Step 1201: when a parameter adjustment gesture is detected, monitor the stop gesture in real time.
  • Specifically, after detecting a user's parameter adjustment gesture (for example, an up-down sliding gesture or a left-right sliding gesture), the electronic device 100 may monitor the stop gesture in real time.
  • The stop gesture can be used to trigger the display of a cancel icon and a lock icon.
  • the cancel icon can be used to cancel the adjustment of the current shooting parameters
  • the lock icon can be used to lock the adjustment of the current shooting parameters.
  • Step 1202: displaying a cancel icon and a lock icon on the small screen in response to the detected stop gesture of the user.
  • Step 1203: in response to the user's first operation on the cancel icon, terminating the adjustment of the shooting parameters; or, in response to the user's second operation on the lock icon, locking the adjusted value of the current shooting parameter.
  • During the process of adjusting shooting parameters, the user may sometimes want to cancel the adjustment. For example, when the user has repeatedly adjusted the shooting parameters and the image preview effect is still not as good as before the adjustment, the user will want to restore the previous shooting parameter values. Or, the user may think the current value of the shooting parameter is very suitable and want to lock it to prevent accidental adjustment.
  • For this, the user may perform a stop gesture; the stop gesture may be, for example, a long press on the screen for a preset time, such as 3 seconds.
  • That is, the user can long-press the screen to display the cancel icon and the lock icon on the small screen. Then, the user can slide to the position of the cancel icon or the lock icon through a gesture, thereby selecting the cancel icon to cancel this adjustment of the shooting parameters, or selecting the lock icon to lock the value of the currently adjusted shooting parameter.
  • the cancel icon and the lock icon may be distributed on both sides of the scale to avoid false touches.
  • the user can perform a stop gesture (eg, long press) operation on the display interface 1300 of the small screen, thereby calling out a cancel icon and a lock icon.
  • the display interface 1300 may display a cancel icon 1301 and a lock icon 1302 .
  • the cancel icon 1301 and the lock icon 1302 are respectively located on the left and right sides of the scale 1303 .
  • As shown in FIG. 13a and FIG. 13b, the user can slide the finger to the lock icon 1302 in the display interface 1300, thereby locking the value of the currently adjusted shooting parameter; or slide the finger to the cancel icon 1301 in the display interface 1300, thereby canceling this adjustment of the shooting parameters, that is, restoring the values of the shooting parameters before adjustment.
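The stop-gesture flow of Figures 12-13b can be sketched as a small state machine. This is an illustrative sketch under assumptions: the class, its method names, and the session bookkeeping are hypothetical; only the 3-second long-press threshold and the cancel/lock behaviors come from the text.

```python
# Sketch: a long press (>= 3 s) shows the cancel and lock icons; sliding to
# the cancel icon restores the pre-adjustment value, while sliding to the
# lock icon freezes the adjusted value against further (accidental) changes.

LONG_PRESS_SECONDS = 3  # example preset duration from the text

class ParameterSession:
    def __init__(self, value_before: int):
        self.value_before = value_before  # value to restore on cancel
        self.value = value_before
        self.icons_shown = False
        self.locked = False

    def adjust(self, new_value: int) -> None:
        """Apply a parameter adjustment gesture, unless the value is locked."""
        if not self.locked:
            self.value = new_value

    def long_press(self, seconds: float) -> None:
        """A sufficiently long press calls up the cancel and lock icons."""
        if seconds >= LONG_PRESS_SECONDS:
            self.icons_shown = True

    def slide_to_cancel(self) -> None:
        """Cancel this adjustment: restore the pre-adjustment value."""
        if self.icons_shown:
            self.value = self.value_before

    def slide_to_lock(self) -> None:
        """Lock the currently adjusted value."""
        if self.icons_shown:
            self.locked = True
```

Guarding both icon actions on `icons_shown` mirrors the text's ordering: the icons only become selectable after the stop gesture has been performed.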
  • Fig. 13a and Fig. 13b are only illustrative illustrations with up and down sliding gestures, and do not constitute a limitation to the embodiment of the present application.
  • When the above-mentioned parameter adjustment gesture is a left-right sliding gesture, the above-mentioned cancel icon and lock icon can be respectively located on the upper and lower sides of the ruler, which is then placed horizontally. The user can then slide the finger up or down to the cancel icon or the lock icon to perform the corresponding operation.
  • For details, refer to FIG. 13a and FIG. 13b; they will not be repeated here.
  • FIG. 14 is a schematic structural diagram of an embodiment of the shooting parameter adjustment device of the present application. As shown in FIG. 14, the shooting parameter adjustment device 1400 is applied to an electronic device whose first screen and second screen are respectively located on two sides of the electronic device, and may include: an identification module 1410 and an adjustment module 1420; wherein,
  • the identification module 1410 is configured to identify the first gesture in response to the detected first gesture of the user on the first screen;
  • the adjustment module 1420 is configured to, if it is recognized that the first gesture matches a preset parameter adjustment gesture, adjust, based on the first gesture, the value of the shooting parameter of the parameter adjustment gesture matching the first gesture; wherein the preset parameter adjustment gesture is preset by the user on the second screen.
  • the preset parameter adjustment gesture includes a gesture of sliding up and down and/or a gesture of sliding left and right.
  • In a possible implementation, the identification module 1410 is also used to: identify the validity of the first gesture, where the validity of the first gesture is determined by the moving speed of the first gesture; and, if the first gesture is valid, recognize the first gesture.
  • the above shooting parameter adjustment device 1400 further includes:
  • the display module 1430 is used to display a scale and a cursor on the display interface of the first screen, wherein the cursor corresponds to the touch point of the first gesture on the first screen, and the value on the scale corresponding to the cursor is used to represent the value of the shooting parameter.
  • the above shooting parameter adjustment device 1400 further includes:
  • the first switching module 1440 is configured to identify the second gesture in response to the detected second gesture of the user on the first screen; if it is recognized that the second gesture matches the preset parameter changing gesture, then The shooting parameters corresponding to the one or more parameter adjustment gestures are switched based on the second gesture.
  • the above shooting parameter adjustment device 1400 further includes:
  • a suspension module 1450 configured to display the first icon and the second icon on the display interface of the first screen in response to the detected third gesture of the user on the first screen;
  • the display interface of the first screen further includes a scale, and the first icon and the second icon are respectively located on left and right sides or up and down sides of the scale.
  • the electronic device further includes a shooting mode
  • the shooting mode includes a normal shooting mode and a wide-angle shooting mode
  • the above-mentioned shooting parameter adjustment device 1400 also includes:
  • the second switching module 1460 is configured to, in response to a fourth gesture of the user detected on the first screen, switch the parameter corresponding to the wide-angle shooting mode based on the fourth gesture.
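The module composition of device 1400 described above can be sketched as follows. This is an assumption, not the actual device code: the modules are represented as plain callables and only the identification/adjustment dispatch path is shown; the class and attribute names are hypothetical.

```python
# Sketch: device 1400 composed of its modules. The identification module
# (1410) classifies an incoming gesture; the adjustment module (1420) then
# handles parameter adjustment gestures. The remaining modules are optional.

class ShootingParameterAdjustmentDevice:
    """Device 1400: dispatches user gestures on the first (small) screen."""
    def __init__(self, identification, adjustment, display=None,
                 first_switching=None, suspension=None, second_switching=None):
        self.identification = identification      # module 1410
        self.adjustment = adjustment              # module 1420
        self.display = display                    # module 1430 (ruler + cursor)
        self.first_switching = first_switching    # module 1440 (parameter change)
        self.suspension = suspension              # module 1450 (cancel/lock icons)
        self.second_switching = second_switching  # module 1460 (mode parameter)

    def on_gesture(self, gesture):
        """Identify the gesture and dispatch parameter adjustment gestures."""
        kind = self.identification(gesture)
        if kind == "parameter_adjustment":
            return self.adjustment(gesture)
        return None  # other gesture kinds would route to the other modules
```

Splitting identification from adjustment matches the two-step description above (recognize first, then adjust only on a match), and leaving the other modules optional reflects the "further includes" wording of the embodiments.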
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the above-mentioned electronic device 100 and the like include hardware structures and/or software modules corresponding to each function.
  • the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the example units and algorithm steps described in the embodiments disclosed herein. Whether a certain function is executed by hardware or computer software drives hardware depends on the specific application and design constraints of the technical solution. Professionals and technicians may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the embodiments of the present application.
  • the embodiment of the present application can divide the above-mentioned electronic device 100 into functional modules according to the above-mentioned method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • Each functional unit in each embodiment of the embodiment of the present application may be integrated into one processing unit, or each unit may physically exist separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • the integrated unit is realized in the form of a software function unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, server, or network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk, and other various media capable of storing program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application relate to the field of intelligent terminals, and provide a shooting parameter adjustment method, an electronic device, and a storage medium. The method comprises: in response to a detected first gesture of a user on a first screen, recognizing the first gesture; and, if it is recognized that the first gesture matches a preset parameter adjustment gesture, adjusting, on the basis of the first gesture, the value of a shooting parameter corresponding to the parameter adjustment gesture matched with the first gesture, wherein the preset parameter adjustment gesture is preset by the user on a second screen. By means of the method described in the embodiments of the present application, the shooting parameters of a rear camera can be conveniently adjusted, so that the user's operating experience can be improved.
PCT/CN2022/115819 2021-11-01 2022-08-30 Procédé de réglage de paramètre de photographie, dispositif électronique et support de stockage WO2023071497A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111282872.8A CN116069156A (zh) 2021-11-01 2021-11-01 拍摄参数调节方法、电子设备及存储介质
CN202111282872.8 2021-11-01

Publications (1)

Publication Number Publication Date
WO2023071497A1 true WO2023071497A1 (fr) 2023-05-04

Family

ID=86160197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/115819 WO2023071497A1 (fr) 2021-11-01 2022-08-30 Procédé de réglage de paramètre de photographie, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN116069156A (fr)
WO (1) WO2023071497A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337709B1 (en) * 1995-02-13 2002-01-08 Hitachi, Ltd. Image display device
CN105141852A (zh) * 2015-10-10 2015-12-09 李彦辰 双屏手机拍摄模式下的控制方法及控制装置
CN106933620A (zh) * 2017-02-14 2017-07-07 珠海市魅族科技有限公司 一种拍摄控制方法和系统
CN108769506A (zh) * 2018-04-16 2018-11-06 Oppo广东移动通信有限公司 图像采集方法、装置、移动终端及计算机可读介质


Also Published As

Publication number Publication date
CN116069156A (zh) 2023-05-05

Similar Documents

Publication Publication Date Title
CN112333380B (zh) 一种拍摄方法及设备
EP3800876B1 (fr) Procédé de commutation de caméras par un terminal, et terminal
CN110244893B (zh) 一种分屏显示的操作方法及电子设备
WO2022100610A1 (fr) Procédé et appareil de projection d'écran, ainsi que dispositif électronique et support de stockage lisible par ordinateur
US11272116B2 (en) Photographing method and electronic device
US20220342516A1 (en) Display Method for Electronic Device, Electronic Device, and Computer-Readable Storage Medium
EP3879401A1 (fr) Procédé de division d'écran automatique, interface utilisateur graphique et dispositif électronique
CN112751954A (zh) 一种操作提示的方法和电子设备
CN113934330A (zh) 一种截屏方法及电子设备
WO2023241209A1 (fr) Procédé et appareil de configuration de papier peint de bureau, dispositif électronique et support de stockage lisible
CN114115770A (zh) 显示控制的方法及相关装置
WO2020221062A1 (fr) Procédé d'opération de navigation et dispositif électronique
CN114500901A (zh) 双景录像方法、装置和电子设备
CN114095602B (zh) 索引显示方法、电子设备及计算机可读存储介质
CN113467735A (zh) 图像调整方法、电子设备及存储介质
WO2022068505A1 (fr) Procédé de prise de photographies et dispositif électronique
CN114302063B (zh) 一种拍摄方法及设备
WO2023071497A1 (fr) Procédé de réglage de paramètre de photographie, dispositif électronique et support de stockage
CN113867520A (zh) 设备控制方法、电子设备和计算机可读存储介质
CN114089902A (zh) 手势交互方法、装置及终端设备
CN113391735A (zh) 显示形态的调整方法、装置、电子设备及存储介质
WO2023020420A1 (fr) Procédé d'affichage de volume, dispositif électronique et support de stockage
WO2022105670A1 (fr) Procédé d'affichage et terminal
WO2022252786A1 (fr) Procédé d'affichage d'écran divisé en fenêtres et dispositif électronique
WO2023124178A1 (fr) Procédé d'affichage d'image de prévisualisation, appareil et support de stockage lisible

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22885389

Country of ref document: EP

Kind code of ref document: A1