CN117979156A - Picture shooting method, picture processing method, electronic device and storage medium

Info

Publication number: CN117979156A
Authority: CN (China)
Prior art keywords: target, application, picture, POI, information
Legal status: Pending
Application number: CN202311835171.1A
Other languages: Chinese (zh)
Inventors: Zhou Wuyin (周无垠), Zhang Liang (张亮), Huang Tao (黄涛), Zhu Binglin (朱炳林)
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd; priority to CN202311835171.1A; publication of CN117979156A

Abstract

The application provides a picture shooting method, a picture processing method, an electronic device and a storage medium. The picture processing method includes: in response to a first operation of a user on a target picture, displaying related information, where the target picture is associated with at least target POI information, the related information includes a first recommended application, and the first recommended application includes at least a recommended application related to the address type. With the provided method, a related application can be opened according to the information carried in a picture while the user is previewing the picture, without exiting the preview and then opening the related application, which helps improve picture processing efficiency and the user experience.

Description

Picture shooting method, picture processing method, electronic device and storage medium
Technical Field
The present application relates to the field of information technologies, and in particular, to a picture capturing method, a picture processing method, an electronic device, and a storage medium.
Background
The portability of terminal devices means that people rely more and more on the gallery in the terminal device to organize the pictures they shoot: users now keep more and more pictures in the gallery, use the gallery more and more frequently, and browse more pictures every day. As usage grows, so do the demands on functionality; for example, in some scenarios, a user wants to process a picture through a related application while browsing pictures in the gallery.
However, in the related art, if a user browsing a picture in the gallery wants to process the picture through some application, the user has to exit the gallery and then open the related application. This makes the user's operations cumbersome and reduces picture processing efficiency.
Disclosure of Invention
The present application provides a picture shooting method, a picture processing method, an electronic device and a storage medium, which can open a related application according to the information carried in a picture while the user is previewing the picture, without exiting the preview and then opening the related application, thereby helping to improve picture processing efficiency and the user experience.
In a first aspect, the present application provides a picture shooting method, including: in response to a shooting operation of a user, acquiring one or more pieces of POI information, where the POI information includes at least a POI address; determining target POI information among the one or more pieces of POI information; and associating the shot target picture with the target POI information and storing them.
In one possible implementation, after the target POI information is determined among the one or more pieces of POI information, the method further includes: if the POI address in the target POI information is incomplete, invoking a target application, and completing the incomplete POI address in the target POI information based on the target application.
In one possible implementation, the acquiring one or more pieces of point of interest (POI) information includes: acquiring the one or more pieces of POI information if the currently shot object includes at least an address-type object.
In one possible implementation, the associating the shot target picture with the target POI information and storing them includes: acquiring target business hours of the target POI, and storing the shot target picture after associating it with the target POI information and the target business hours.
In one possible implementation, the POI information includes a POI name, and the determining target POI information among the one or more pieces of POI information includes: identifying key information in the target picture; and matching the key information with the POI names in the one or more pieces of POI information to determine the target POI information.
In a second aspect, the present application provides a picture processing method, including: in response to a first operation of a user on a target picture, displaying related information, where the target picture is associated with at least target POI information, the related information includes a first recommended application, and the first recommended application includes at least a recommended application related to the address type.
In one possible implementation, the displaying related information in response to the first operation of the user on the target picture includes: in response to the first operation of the user on the target picture, determining the type of a target object in the target picture, where the type of the target object includes a biological type and/or an address type; and displaying the related information if the target object includes at least an address-type object.
In one possible implementation, the related information further includes a second recommended application, where the second recommended application is determined according to personal usage habits and/or public usage habits.
In one possible implementation, the target picture is further associated with target business hours, and the priority of each recommended application in the first recommended application is determined by the target business hours.
In one possible implementation, the method further includes: in response to a second operation of the user on a target application, invoking the target application, where the target application includes a recommended application related to the address type in the first recommended application; and displaying a target page in the target application, where the target POI information associated with the target picture is automatically entered in the target page.
In a third aspect, the present application provides a picture taking device, comprising one or more functional modules, where the one or more functional modules are configured to implement the picture taking method according to the first aspect.
In a fourth aspect, the present application provides a picture processing apparatus, comprising one or more functional modules, where the one or more functional modules are configured to implement the picture processing method according to the second aspect.
In a fifth aspect, the present application provides an electronic device, including: a processor and a memory for storing a program; the processor is configured to run the program to implement the picture shooting method according to the first aspect or the picture processing method according to the second aspect.
In a sixth aspect, the present application provides a readable storage medium having a program stored therein, which, when run on an electronic device, causes the electronic device to implement the picture shooting method according to the first aspect or the picture processing method according to the second aspect.
In a seventh aspect, the present application provides a program which, when run on a processor of an electronic device, causes the electronic device to perform the picture shooting method according to the first aspect or the picture processing method according to the second aspect.
In one possible design, the program in the seventh aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a flowchart of an embodiment of a picture taking method according to the present application;
fig. 3 is a schematic diagram of Wi-Fi signal detection provided by an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating detection of key information according to an embodiment of the present application;
Fig. 5 is a flowchart illustrating another embodiment of a picture taking method according to the present application;
fig. 6 is a flowchart illustrating another embodiment of a picture taking method according to the present application;
fig. 7 is a flowchart illustrating another embodiment of a picture taking method according to the present application;
FIG. 8 is a schematic diagram of acquiring business hours according to an embodiment of the present application;
FIG. 9 is a flowchart illustrating an embodiment of a method for processing pictures according to the present application;
FIGS. 10A-10C are schematic views illustrating a first recommended application according to an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating a second recommendation application according to an embodiment of the present application;
FIG. 12 is a schematic diagram of one-touch navigation according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of a picture capturing device according to an embodiment of the present application;
Fig. 14 is a schematic structural diagram of a picture processing device according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, unless otherwise specified, the character "/" indicates an "or" relationship between the associated objects. For example, A/B may represent A or B. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may represent: A exists alone, both A and B exist, and B exists alone.
It should be noted that the terms "first", "second", and the like in the embodiments of the present application are used only to distinguish between descriptions, and are not intended to indicate or imply relative importance, nor an order or a number of the indicated features.
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. Furthermore, "at least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or plural items. For example, at least one of A, B or C may represent: A; B; C; A and B; A and C; B and C; or A, B and C. Each of A, B and C may itself be an element or a set of one or more elements.
In the embodiments of the present application, "exemplary", "in some embodiments", "in another embodiment", and the like are used to indicate an example, an instance, or an illustration. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of these terms is intended to present concepts in a concrete fashion.
"Of", "corresponding (corresponding, relevant)" and "corresponding (corresponding)" in the embodiments of the present application may be sometimes mixed, and it should be noted that the meanings to be expressed are consistent when the distinction is not emphasized. In the embodiments of the present application, communications and transmissions may sometimes be mixed, and it should be noted that, when the distinction is not emphasized, the meaning expressed is consistent. For example, a transmission may include sending and/or receiving, either nouns or verbs.
The equal to that related in the embodiment of the application can be used together with the greater than the adopted technical scheme, can also be used together with the lesser than the adopted technical scheme. It should be noted that when the number is equal to or greater than the sum, the number cannot be smaller than the sum; when the value is equal to or smaller than that used together, the value is not larger than that used together.
The portability of terminal devices means that people rely more and more on the gallery in the terminal device to organize the pictures they shoot: users now keep more and more pictures in the gallery, use the gallery more and more frequently, and browse more pictures every day. As usage grows, so do the demands on functionality; for example, in some scenarios, a user wants to process a picture through a related application while browsing pictures in the gallery.
However, in the related art, if a user browsing a picture in the gallery wants to process the picture through some application, the user has to exit the gallery and then open the related application. This makes the user's operations cumbersome and reduces picture processing efficiency.
Based on the above problems, the embodiments of the present application provide a picture taking method and a picture processing method, which are applied to an electronic device, where the electronic device may be a device having a display screen and a camera.
In some alternative embodiments, the electronic device may be a mobile terminal, which may also be referred to as user equipment (UE), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user apparatus. The mobile terminal may be a station (ST) in a WLAN, a cellular telephone, a cordless telephone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication capabilities, a computing device or another processing device connected to a wireless modem, an in-vehicle device, an Internet-of-Vehicles terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite radio, a wireless modem card, a television set top box (STB), customer premises equipment (CPE), and/or another device for communication over a wireless system or a next-generation communication system, for example, a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (PLMN), etc.
In some alternative embodiments, the electronic device may be a fixed terminal. By way of example, the stationary terminal may include, but is not limited to, a personal computer (Personal Computer, PC), a notebook, a large screen, a smart screen, and the like.
In some alternative embodiments, the electronic device may also be a wearable device. The wearable device can also be called as a wearable intelligent device, and is a generic name for intelligently designing daily wearing and developing wearable devices by applying wearable technology, such as a smart watch, a smart bracelet and the like.
Fig. 1 exemplarily shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an ear-headphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing functions of the electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc., applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques can include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite-based augmentation system (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, and may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
Next, an exemplary picture photographing method according to an embodiment of the present application is described with reference to fig. 2 to 8.
Fig. 2 is a schematic flowchart of an embodiment of a picture shooting method provided by the present application. In the embodiment shown in Fig. 2, the electronic device may be a mobile terminal having a camera. The method specifically includes the following steps:
In step 201, in response to a shooting operation of a user, the electronic device acquires information of one or more points of interest, where the information of points of interest is information of points of interest near a shooting environment.
Specifically, when the user is at any geographical location, the user can take a photograph using the portable electronic device. It will be appreciated that the electronic device has a camera, and the user may take a photograph using the camera on the electronic device.
In response to a user's photographing operation, the electronic device may obtain information of one or more points of interest (Point of Interest, POIs).
It is understood that the POI information may be information of POIs in the vicinity of the shooting environment in which the user is currently located.
For example, assuming the photograph taken is a photograph of a store, the POI information may include the POI information of the store and the POI information of POIs near the store.
In some alternative embodiments, the information of a POI may include at least information such as a POI name and a POI address.
It should be noted that the foregoing stores are merely exemplary to illustrate the shooting objects, and are not meant to limit the embodiments of the present application, and in some embodiments, the current shooting environment may include other shooting objects.
In some alternative embodiments, the POI information may be obtained via wireless fidelity (Wi-Fi) signals.
For example, the electronic device may search for one or more Wi-Fi signals in the vicinity through the Wi-Fi module, and may determine POI information corresponding thereto according to the one or more Wi-Fi signals.
Fig. 3 illustrates a detection schematic of Wi-Fi signals. Referring to fig. 3, the WLAN search page of the electronic device may show the searched nearby Wi-Fi signals, for example, the searched Wi-Fi signals may include 3 Wi-Fi signals of aa, bb, cc, etc., i.e., the names of the 3 Wi-Fi signals are aa, bb, and cc.
Then, the electronic device can query in a preset relational database according to the searched Wi-Fi signal name to determine POI information corresponding to the Wi-Fi signal name.
The preset relation database stores the mapping relation between the name of Wi-Fi signals and POI information.
The mapping relationship shown in table 1 can be obtained through the above-mentioned query in the preset relationship database.
TABLE 1

Name of Wi-Fi signal | POI name    | POI address
aa                   | Store No. 1 | 20-13 Xuegang North Road, Longgang District, Shenzhen
bb                   | Store No. 2 | 20-12 Xuegang North Road, Longgang District, Shenzhen
cc                   | Store No. 3 | 20-14 Xuegang North Road, Longgang District, Shenzhen
Referring to Table 1, the Wi-Fi signal named aa corresponds to the POI name "Store No. 1", and the corresponding POI address is "20-13 Xuegang North Road, Longgang District, Shenzhen"; the Wi-Fi signal named bb corresponds to the POI name "Store No. 2", and the corresponding POI address is "20-12 Xuegang North Road, Longgang District, Shenzhen"; the Wi-Fi signal named cc corresponds to the POI name "Store No. 3", and the corresponding POI address is "20-14 Xuegang North Road, Longgang District, Shenzhen".
It may be appreciated that the POI name may be the same as the name of the corresponding Wi-Fi signal, or the POI name may be different from the name of the corresponding Wi-Fi signal, which is not particularly limited in the embodiment of the present application.
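The following sketch is not part of the patent; the table data and helper names are illustrative assumptions. It shows, in Python, the lookup described above: the scanned Wi-Fi SSIDs are matched against a local mapping table standing in for the preset relational database, producing candidate POI records.

```python
# Simplified stand-in for the preset relational database (illustrative data only).
POI_TABLE = {
    "aa": {"poi_name": "Store No. 1",
           "poi_address": "20-13 Xuegang North Road, Longgang District, Shenzhen"},
    "bb": {"poi_name": "Store No. 2",
           "poi_address": "20-12 Xuegang North Road, Longgang District, Shenzhen"},
    "cc": {"poi_name": "Store No. 3",
           "poi_address": "20-14 Xuegang North Road, Longgang District, Shenzhen"},
}

def lookup_poi_candidates(scanned_ssids: list[str]) -> list[dict]:
    """Return the POI record of every scanned Wi-Fi SSID found in the preset table."""
    return [POI_TABLE[ssid] for ssid in scanned_ssids if ssid in POI_TABLE]

print(lookup_poi_candidates(["aa", "bb", "cc"]))
```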
In step 202, the electronic device associates the shot target picture with the searched one or more POI information.
Specifically, after the electronic device shoots, a target picture can be obtained, and the target picture can be associated with the searched one or more pieces of POI information.
In step 203, the electronic device determines target POI information in the one or more POI information based on the key information in the target picture.
Specifically, although the target picture has been associated with the one or more pieces of searched POI information, it has not yet been determined whether POI information corresponding to the target picture exists among them, nor which piece of POI information corresponds to the target picture. Therefore, the target POI information also needs to be determined among the searched one or more pieces of POI information.
The target POI information is the POI information corresponding to the target object in the target picture. Taking a store as the shooting object as an example, assuming that a picture of Store No. 1 is taken, that is, the target object in the shot target picture is Store No. 1, the target POI information may be the name and address information of Store No. 1.
In some alternative embodiments, the manner of determining the target POI information may be based on key information in the target picture.
The key information in the target picture may include, but is not limited to, information such as a name, an address, etc. of the target object.
For example, first, the content in the target picture may be identified to obtain the key information in the target picture.
Fig. 4 schematically illustrates key information in a target picture.
Referring to fig. 4, a target picture 400 taken by a user contains a store that has a store name and a store address (for example, a house number). Key information such as the store name and the store address can be acquired by recognizing the target picture; for example, the store name is displayed in a first area 401 and the store address is displayed in a second area 402.
It should be understood that the first area 401 and the second area 402 are merely exemplary, and are not meant to limit embodiments of the present application, and the display areas of the key information may be different in different target pictures, that is, the number of display areas of the key information and the positions in the pictures may be different in different target pictures.
Next, the target POI information is determined by matching the key information in the target picture with the one or more pieces of POI information associated with the target picture. For example, the store name may be matched against the POI names in the one or more pieces of POI information, and the POI information whose POI name matches the store name may be taken as the target POI information; and/or the store address may be matched against the POI addresses in the one or more pieces of POI information, and the POI information whose POI address matches the store address may be taken as the target POI information.
Taking name matching as an example, assuming that the store name recognized in the target picture is "Store No. 1", the store name is matched against the POI names in the one or more pieces of POI information; the POI name "Store No. 1" matches the store name "Store No. 1", so the POI information corresponding to the POI name "Store No. 1" can be determined as the target POI information.
Or, taking address matching as an example, assuming that the store address recognized in the target picture is "20-13 Xuegang North Road", the store address is matched against the POI addresses in the one or more pieces of POI information; the POI address "20-13 Xuegang North Road, Longgang District, Shenzhen" matches the store address "20-13 Xuegang North Road", so the POI information corresponding to that POI address can be determined as the target POI information.
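A minimal Python sketch of this matching step, assuming exact substring matching (the patent does not prescribe a specific matching algorithm) and illustrative candidate data:

```python
from typing import Optional

def match_target_poi(key_name: str, key_address: str,
                     candidates: list[dict]) -> Optional[dict]:
    """Pick the candidate POI whose name or address matches the key information
    recognized in the picture (substring matching is an assumed strategy)."""
    for poi in candidates:
        if key_name and key_name in poi["poi_name"]:
            return poi
        if key_address and key_address in poi["poi_address"]:
            return poi
    return None  # no match: the target POI cannot be determined from the candidates

candidates = [
    {"poi_name": "Store No. 1",
     "poi_address": "20-13 Xuegang North Road, Longgang District, Shenzhen"},
    {"poi_name": "Store No. 2",
     "poi_address": "20-12 Xuegang North Road, Longgang District, Shenzhen"},
]
target_poi = match_target_poi("Store No. 1", "", candidates)
```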
In step 204, the electronic device associates the target picture with the target POI information and stores the same.
Specifically, after the electronic device obtains the target POI information, the target picture and the target POI information can be stored after being associated, that is, the target picture carries the target POI information, so that when a user processes the target picture later, the target picture can be processed based on the target POI information in the target picture, and the processing efficiency of the picture is improved.
In some alternative embodiments, the manner of associating the target picture with the target POI information and then storing the target picture may include: writing the POI information into Exif information of the picture file.
In the embodiment of the present application, the POI information is carried in the picture taken by the user, so that related information is already available when the user later processes the picture, which improves the user's picture processing efficiency.
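As an illustration of the Exif-based storage mentioned above, the following sketch uses the third-party piexif package, which is an assumption of this example; the patent only states that the POI information is written into the Exif information of the picture file.

```python
# Sketch only: piexif and the UserComment field are assumed choices, not
# prescribed by the patent, which only requires writing POI data into Exif.
import json

import piexif
import piexif.helper

def store_poi_in_exif(jpeg_path: str, target_poi: dict) -> None:
    """Serialize the target POI information and write it into the JPEG's Exif
    UserComment field, so the picture carries the POI data with it."""
    exif_dict = piexif.load(jpeg_path)
    comment = piexif.helper.UserComment.dump(
        json.dumps(target_poi, ensure_ascii=False), encoding="unicode")
    exif_dict["Exif"][piexif.ExifIFD.UserComment] = comment
    piexif.insert(piexif.dump(exif_dict), jpeg_path)

store_poi_in_exif("target_picture.jpg",
                  {"poi_name": "Store No. 1",
                   "poi_address": "20-13 Xuegang North Road, Longgang District, Shenzhen"})
```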
In some alternative embodiments, the POI address stored in the preset relational database may be incomplete, which may result in incomplete POI address in the target POI information, and may reduce the processing efficiency of the user on the picture.
The electronic device may query the preset relationship database according to the searched Wi-Fi signal name, so as to obtain the mapping relationship between the Wi-Fi signal name and POI information as shown in table 2.
TABLE 2

Name of Wi-Fi signal | POI name    | POI address
aa                   | Store No. 1 | Xuegang North Road, Longgang District, Shenzhen
bb                   | Store No. 2 | Xuegang North Road, Longgang District, Shenzhen
cc                   | Store No. 3 | Xuegang North Road, Longgang District, Shenzhen
Referring to Table 2, the POI addresses of "Store No. 1", "Store No. 2" and "Store No. 3" are incomplete; for example, only the road name is present and the house number is missing. Of course, an incomplete address may also take other forms, which are not enumerated here.
Therefore, the electronic device can complement the POI address under the condition that the POI address is detected to be incomplete.
Fig. 5 is a flowchart of another embodiment of a picture taking method according to an embodiment of the present application. Through the embodiment shown in Fig. 5, an incomplete POI address can be completed. After step 203, the method may further include the following steps:
in step 501, the electronic device determines whether the POI address in the target POI information is complete.
If the electronic device determines that the POI address in the target POI information is complete, step 204 may be performed; or alternatively
If the electronic device determines that the POI address in the target POI information is incomplete, step 502 may be performed.
Step 502, the electronic device invokes a target application, and completes the incomplete POI address in the target POI information based on the target application.
In particular, the target application may be a related application for querying an address.
By way of example, the target applications may include, but are not limited to, a map-type application, a browser-type application, or a search engine-type application, among others.
When the electronic device determines that the POI address in the target POI information is incomplete, the target application can be invoked, and the incomplete POI address in the target POI information can be completed based on the target application. After the electronic device completes the incomplete POI address in the target POI information, step 204 may be performed.
In some optional embodiments, after the electronic device invokes the target application, the POI name in the target POI information may be input into the target application, so that the target application may query for a specific address based on the POI name, and thus, the electronic device may complete the POI address in the target POI information based on the specific address.
By way of example, taking the information in Table 2, assuming that the target POI information is the POI information corresponding to "Store No. 1", the POI name (i.e., "Store No. 1") may be input into the target application. By querying "Store No. 1", the target application can obtain that its specific address is "20-13 Xuegang North Road, Longgang District, Shenzhen", and the specific address can then replace the POI address in the target POI information, thereby completing the incomplete POI address in the target POI information.
In some optional embodiments, in order to improve accuracy of information completion, after the electronic device invokes the target application, the POI name and POI address in the target POI information may be input into the target application, so that the target application may query to obtain a specific address based on the POI name and POI address.
According to the embodiment of the application, the incomplete POI address in the target POI information is completed, so that the target POI information remains complete, which avoids the problem that the user cannot invoke a related application to process the target picture because the target POI information is incomplete; for example, if the POI address in the target POI information is incomplete, the user cannot invoke a navigation application to navigate to the position indicated by the POI address.
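A hedged sketch of this completion flow: the incompleteness check (no house number) and the query interface are assumptions of the example, since the patent leaves both to the target application.

```python
def complete_poi_address(target_poi: dict, query_address) -> dict:
    """Complete an incomplete POI address by querying a target application.
    'query_address' stands in for the map/browser/search application interface,
    and the 'no digits in the address' check is an assumed heuristic."""
    def looks_incomplete(address: str) -> bool:
        return not any(ch.isdigit() for ch in address)

    if looks_incomplete(target_poi["poi_address"]):
        full_address = query_address(target_poi["poi_name"])
        if full_address:
            target_poi["poi_address"] = full_address
    return target_poi

poi = {"poi_name": "Store No. 1",
       "poi_address": "Xuegang North Road, Longgang District, Shenzhen"}
completed = complete_poi_address(
    poi, lambda name: "20-13 Xuegang North Road, Longgang District, Shenzhen")
```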
In some alternative embodiments, photographed objects of some types, for example a person, may not need to be associated with target POI information, while photographed objects of other types, for example a building, may need to be. Therefore, the type of the currently photographed object may be determined before the target POI information is associated with the picture taken by the user.
Fig. 6 is a flowchart of another embodiment of a picture shooting method according to an embodiment of the present application, through the embodiment shown in fig. 6, it may be determined whether a shot picture is associated with target POI information based on a type of a currently shot object, and before the electronic device obtains the point of interest information in step 201, the method may further include the following steps:
in step 601, in response to a shooting operation of a user, the electronic device determines a type of an object currently shot.
Specifically, the type of the photographed object may include, but is not limited to, a biological type, an address type, a biological+address type.
The biological type may be used to characterize photographed objects such as people, animals, and plants; of course, other types of living things may also be included, which are not enumerated here.
The address type may be used to characterize photographed objects including shops, scenic spots, malls, transportation hubs, cells, movie theatres, etc., and of course, other types of addresses may be included, and embodiments of the present application are not limited to the examples described herein.
The bio + address type may be used to characterize photographed objects including bio-type objects and address type objects.
If the electronic device determines that the type of the currently photographed object is an address type or a bio+address type, step 602 may be performed. Or alternatively
If the electronic device determines that the type of the object currently photographed is a biological type, step 603 may be performed.
In step 602, the electronic device obtains point of interest information.
Specifically, after step 602 is performed, step 202 may be performed.
In step 603, the electronic device takes a picture of the object containing the biometric type.
According to this embodiment of the application, whether to associate the target POI information with the picture is determined by judging the type of the shot object, which avoids redundant operations such as associating address information with a picture of a person, thereby improving picture generation efficiency.
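The branching in Fig. 6 can be summarized by the following sketch; the object-type classifier itself is treated as a black box (for example, a preset image recognition model), and the type names are illustrative.

```python
from enum import Enum, auto

class ObjectType(Enum):
    BIOLOGICAL = auto()              # people, animals, plants, ...
    ADDRESS = auto()                 # stores, scenic spots, malls, ...
    BIOLOGICAL_AND_ADDRESS = auto()  # both kinds of objects in the same shot

def should_acquire_poi(object_type: ObjectType) -> bool:
    """POI information is acquired only when an address-type object is in the shot."""
    return object_type in (ObjectType.ADDRESS, ObjectType.BIOLOGICAL_AND_ADDRESS)

assert should_acquire_poi(ObjectType.ADDRESS)
assert not should_acquire_poi(ObjectType.BIOLOGICAL)
```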
In some optional embodiments, for a picture of the address type or the biological+address type, the shot object (for example, a store) may further have business hours. Therefore, after the target POI information is associated with the target picture, the business hours may also be associated with the target picture, to improve the efficiency of the user's subsequent processing of the target picture.
Fig. 7 is a flowchart of another embodiment of a picture taking method according to an embodiment of the present application, where, by using the embodiment shown in fig. 7, business hours can be associated with a taken target picture, and after step 204, the method may further include the following steps:
In step 701, the electronic device invokes a related application, and obtains a target business hour based on the related application.
In particular, the related application may be an address-related application, which may include, for example, but not limited to, a map application, a navigation application, a browser application, and the like.
The electronic device may invoke the related application and may input the POI name and/or the POI address in the target POI information into the related application, so that the related application may query based on the POI name and/or the POI address to obtain the corresponding target business hours.
Taking store No. 1 in table 1 as an example, fig. 8 illustrates a method for acquiring business hours. Referring to fig. 8, after the POI information is input to the related application, the page 800 shown in fig. 8 may be displayed, where the page 800 includes the business hours, for example, 10:30-20:00.
In step 702, the electronic device associates the target picture with the target business hours.
Specifically, the method for associating the target picture with the target business hours by the electronic device may refer to the method for associating the target picture with the target POI information, which is not described herein.
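Steps 701 and 702 can be pictured with the rough sketch below. For the sake of the sketch, the related application is reduced to a simple query function and the picture metadata is modeled as a dictionary; both are our assumptions, not part of the embodiment.

```python
def query_business_hours(poi_name: str, poi_address: str) -> str:
    # Stand-in for the related (map/navigation/browser) application queried in step 701.
    return "10:30-20:00"   # e.g. the hours shown on page 800 for store No. 1

def associate_business_hours(picture_meta: dict, target_poi: dict) -> dict:
    hours = query_business_hours(target_poi["name"], target_poi["address"])  # step 701
    picture_meta["poi"] = target_poi            # target POI already associated in step 204
    picture_meta["business_hours"] = hours      # step 702: associate hours with the picture
    return picture_meta

meta = associate_business_hours({}, {"name": "store No. 1", "address": "example address"})
```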
The picture shooting method has been described above by way of example with reference to figs. 2 to 8; the picture processing method is described next by way of example with reference to figs. 9, 10A to 10C, 11 and 12. When a user processes a shot picture, for example, when the user wants to go to the address corresponding to the shot object in the picture, a related application recommended for the picture, such as a navigation application or a taxi taking application, can be opened, and navigation to the address can be performed based on the target POI information carried by the picture.
Fig. 9 is a flow chart of a picture processing method according to an embodiment of the present application, which specifically includes the following steps:
In step 901, in response to a first operation of a user on a target picture, the electronic device determines a type of a target object in the target picture.
Specifically, the target object in the target picture may be an object when the user takes the picture. As previously described, the types of target objects may include, but are not limited to, a biological type, an address type, a biological + address type.
The first operation may include, but is not limited to, sliding up on the picture, sliding down on the picture, double-clicking the picture, pinching the picture, and the like, which is not particularly limited in the embodiment of the present application.
In response to a first operation of the user on the target picture, the electronic device may determine a type of the target object in the target picture.
The target object in the target picture may be identified through a preset image identification model, which is not particularly limited in the present application.
In step 902, the electronic device displays related information based on the type of the target object, wherein the related information includes a first recommended application.
Specifically, after determining the type of the target object, the electronic device may display related information based on the type of the target object, where the related information may include the first recommendation application.
The method for displaying the first recommended application by the electronic device based on the type of the target object may include the following steps: the electronic device calculates the text similarity between the type of the target object and the labels of the applications installed in the electronic device, determines the applications with high text similarity as the first recommended application, and displays the first recommended application.
For example, taking the type of the target object as an address type as an example, by calculating the text similarity between the address type and the tag of the application installed in the electronic device, the electronic device may determine the tag of the related application with the highest text similarity with the address type, for example, the tag of the related application may include, but is not limited to, a tag of a map application, a navigation application, a taxi taking application, a take-out application, a ticket booking application, and the like, and may determine the first recommended application based on the tag of the related application, and display the first recommended application.
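A minimal sketch of this label-matching step is given below. The similarity measure (difflib's SequenceMatcher) and the application names and labels are assumptions made only for illustration; the embodiment does not specify how the text similarity is computed.

```python
from difflib import SequenceMatcher

# Hypothetical installed applications and their text labels.
INSTALLED_APP_LABELS = {
    "navigation_app": "navigation address map",
    "taxi_app": "taxi address travel",
    "takeout_app": "take-out address food",
    "beauty_app": "beauty portrait biological",
}

def first_recommended_apps(object_type: str, top_k: int = 3) -> list[str]:
    def similarity(label: str) -> float:
        return SequenceMatcher(None, object_type, label).ratio()
    ranked = sorted(INSTALLED_APP_LABELS,
                    key=lambda app: similarity(INSTALLED_APP_LABELS[app]),
                    reverse=True)
    return ranked[:top_k]

# For an address-type object, the address-related labels score higher than beauty_app.
print(first_recommended_apps("address"))
```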
FIG. 10A is a diagram illustrating a display effect of one embodiment of a first recommended application. Referring to fig. 10A, since the type of the target object is an address type, for example, the target object may be a store, and the first recommended application may be determined and displayed according to the type of the target object, for example, the first recommended application may include applications such as navigation, taxi taking, take-out, ticket booking, and the like.
For another example, taking the type of the target object as a biological type, by calculating the text similarity between the biological type and the tag of the application installed in the electronic device, the electronic device may determine the tag of the related application having the highest text similarity with the biological type, for example, the tag of the related application may include, but is not limited to, a beauty application, a photo retouching application, and the like, and may determine the first recommended application based on the tag of the related application, and display the first recommended application.
FIG. 10B is a schematic diagram illustrating a display effect of another embodiment of the first recommended application. Referring to fig. 10B, since the type of the target object is a biological type, for example, the target object may be a person, the first recommended application may be determined according to the type of the target object, and may include, for example, applications such as beauty and photo retouching.
For another example, taking the type of the target object as the biological+address type, by calculating the text similarity between the biological+address type and the tag of the application installed in the electronic device, the electronic device may determine the tag of the related application having the highest text similarity with the biological+address type, for example, the tag of the related application may include, but is not limited to, a tag of a map application, a navigation application, a taxi taking application, a take-out application, a ticket booking application, a beauty application, a photo retouching application, and the like, and may determine the first recommended application based on the tag of the related application and display the first recommended application.
FIG. 10C is a schematic diagram illustrating a display effect of another embodiment of the first recommended application. Referring to fig. 10C, since the type of the target object is the biological+address type, for example, the target object may include a person and a place, the first recommended application may be determined according to the type of the target object; for example, the first recommended application may include person-related applications such as beauty and photo retouching, and place-related applications such as navigation, taxi taking, take-out, and ticket booking.
In some optional embodiments, after determining the first recommended application, the electronic device may rank the recommended applications based on the priority of the recommended applications in the first recommended application and display the ranked recommended applications.
The priority of a recommended application may be determined according to personal usage habits or preferences and public usage habits or preferences. Personal usage habits or preferences characterize an individual user's habit of or preference for using applications, and may be obtained by counting how frequently the user uses the recommended application, for example, by counting the frequency with which the user used the recommended application over a historical period, or may be obtained in other manners of calculation, which is not particularly limited in the embodiment of the present application. Public usage habits or preferences characterize the public's habit of or preference for using applications, and may be obtained from download or installation statistics of applications in an application platform, for example, by counting downloads or installations of the recommended application by the public over a historical period, or may be obtained in other manners of calculation, which is not particularly limited in the embodiment of the present application.
The display manner of the first recommended application may be determined according to the visual habit of the user. For example, people usually read from left to right, so the recommended applications may be displayed from left to right in descending order of priority, that is, the recommended application with the highest priority is displayed at the leftmost side and the recommended application with the lowest priority at the rightmost side.
It should be understood that the display modes according to the visual habits of the user are merely exemplary, and are not limiting to the embodiments of the present application, and other display modes are also possible in some embodiments.
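One possible reading of this priority-based ordering, with an assumed linear weighting between personal and public usage statistics (the weighting itself is not specified above and is only an illustrative assumption), is sketched below:

```python
def rank_recommended_apps(apps, personal_freq, public_installs, w_personal=0.7):
    """Order recommended apps by a weighted mix of personal and public usage statistics."""
    def priority(app):
        return (w_personal * personal_freq.get(app, 0)
                + (1 - w_personal) * public_installs.get(app, 0))
    # Highest priority first, i.e. the app displayed leftmost on the screen.
    return sorted(apps, key=priority, reverse=True)

order = rank_recommended_apps(
    ["navigation", "taxi", "take-out", "ticket booking"],
    personal_freq={"taxi": 0.9, "navigation": 0.4},
    public_installs={"navigation": 0.8, "take-out": 0.6},
)
# order[0] is the recommended application shown at the leftmost position.
```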
In some alternative embodiments, for the target picture whose type of the target object is the address type, the electronic device may further sort and display the first recommended application based on business hours of the target object.
For example, if the moment at which the user browses the picture falls in non-business hours, the electronic device may decrease the priority of recommended applications related to business hours and increase the priority of recommended applications not related to business hours. Alternatively, if the moment at which the user browses the picture falls in business hours, the electronic device may increase the priority of recommended applications related to business hours and decrease the priority of recommended applications not related to business hours.
The method for displaying the first recommended application may specifically refer to the related description in the foregoing embodiment, which is not described herein again.
For example, taking a transportation hub as the target object, assuming that the moment at which the user browses the picture falls in the non-business hours of the transportation hub, the user is unlikely to need applications related to business hours, so the priority of such applications, for example, a map application, a navigation application, or a taxi taking application, may be decreased; meanwhile, the user is more likely to need applications not related to business hours, so the priority of such applications, for example, a ticket booking application, may be increased.
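A sketch of this business-hours adjustment is given below; the fixed +/-1 boost and the grouping of applications into an "hours-sensitive" set are illustrative assumptions that only show the direction of the priority change.

```python
from datetime import datetime, time

HOURS_SENSITIVE = {"map", "navigation", "taxi"}   # hypothetical grouping

def adjust_for_business_hours(priorities: dict, open_t: time, close_t: time,
                              now: datetime) -> dict:
    """Raise hours-sensitive apps during business hours, lower them outside."""
    in_hours = open_t <= now.time() <= close_t
    boost = 1.0 if in_hours else -1.0
    return {app: (p + boost if app in HOURS_SENSITIVE else p - boost)
            for app, p in priorities.items()}

# Browsing at 22:00, after a 10:30-20:00 business day: ticket booking outranks navigation.
adjusted = adjust_for_business_hours({"navigation": 2.0, "ticket booking": 1.0},
                                     time(10, 30), time(20, 0),
                                     datetime(2023, 12, 27, 22, 0))
```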
In some alternative embodiments, the relevant information may further include a second recommended application.
The second recommended application may be determined according to personal usage habits or preferences and/or public usage habits or preferences. It can be appreciated that the second recommended application may be regarded as a personalized recommended application; for example, the second recommended application may include, but is not limited to, a life application, a social application, a travel application, and the like.
It can be understood that the first recommended application can be used to complete the task required by the user by one key. Taking the navigation application as an example of the first recommended application, after the user clicks the navigation application, the user does not need to enter the home page of the navigation application, but can directly enter the target navigation page according to the POI address information carried in the target picture, and navigation can be completed by one key on the target page; that is, the user does not need to input the address information, so the task completion efficiency can be improved.
The second recommended application is an application personalized for the current user. After the user opens the second recommended application, the user enters the home page of the second recommended application and can complete the corresponding task according to the user's needs. For example, taking the social application as an example of the second recommended application, after clicking the social application, the user may enter the home page of the social application and may perform operations such as picture sharing and picture editing according to the user's needs.
It will be appreciated that the electronic device may also display a second recommended application in addition to the first recommended application displayed in figs. 10A-10C. Taking fig. 10A as an example, fig. 11 exemplarily shows a display effect diagram of an embodiment of the second recommended application. Referring to fig. 11, in addition to the first recommended application, a second recommended application may be displayed, which may include, but is not limited to, a life application, a social application, a travel application, and the like. The manner and effect of displaying the second recommended application in figs. 10B and 10C may refer to the manner of displaying the second recommended application in fig. 10A, and are not described herein again.
According to the embodiment of the application, by displaying the personalized second recommended application, the range of choices presented to the user can be expanded and the user experience can be improved.
In some alternative embodiments, step 903 may also be included after step 902.
In step 903, in response to a second operation performed by the user on the first recommended application related to the address type displayed in the related information of the target picture, the electronic device invokes the corresponding first recommended application.
Specifically, when the first recommended application related to the address type is included in the first recommended application, the user may perform the second operation on the first recommended application related to the address type.
Wherein the second operation may include, but is not limited to, a click, a double click, etc. operation.
In response to the second operation performed by the user on the first recommended application related to the address type displayed in the related information of the target picture, the electronic device invokes the corresponding first recommended application.
In step 904, the electronic device displays the target page, obtains the POI information in the target picture, and automatically inputs the POI information into the target page.
Specifically, in the process that the electronic device invokes the first recommended application related to the address type, POI information in the target picture can be obtained.
It can be appreciated that the electronic device can input the POI name in the POI information to the target page of the first recommendation application related to the address type; or the electronic equipment can input the POI address in the POI information to a target page of a first recommendation application related to the address type; or the electronic device may input the POI name and the POI address in the POI information to the target page of the first recommended application related to the address type, which is not limited in particular by the embodiment of the present application.
The target page of the first recommended application related to the address type may not be the home page of the first recommended application; rather, the target page may be a page on which the user can complete the required task by one key.
For example, taking the navigation application as the first recommended application related to the address type, after the user clicks the first recommended application related to the address type displayed in the related information of the target picture, the electronic device may invoke the navigation application, enter the target page of the navigation application, and input the POI information on the target page, so that the user does not need to first enter the home page of the navigation application and then manually input information such as the address, thereby enabling the user to complete navigation by one key and improving the task processing efficiency.
Next, taking a first recommended application related to the address type as an example of a navigation application, a target page is exemplarily described with reference to fig. 12. Referring to fig. 12, in the target page 1200, the electronic device may input related address information in the information input box 1201 based on POI information in the target picture, so that an operation of inputting related information by the user may be omitted.
According to the embodiment of the application, the electronic equipment automatically inputs the related information such as the address when calling the first recommended application, so that a user can finish a required task by one key, and the efficiency of completing the task by the user is improved.
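Steps 903 and 904 can be pictured as building a deep link that lands directly on a pre-filled target page. In the sketch below, the navi://route URI scheme and its parameter names are hypothetical; they merely illustrate skipping the application's home page and auto-filling the POI information carried by the picture.

```python
from urllib.parse import urlencode

def build_target_page_uri(target_poi: dict) -> str:
    """Build a (hypothetical) deep link to the target page, pre-filled with POI info."""
    params = {"destination_name": target_poi.get("name", ""),
              "destination_address": target_poi.get("address", "")}
    return "navi://route?" + urlencode(params)

def on_recommended_app_tapped(target_poi: dict, launch_uri) -> None:
    # Step 903: invoke the address-related app; step 904: land on the pre-filled target page.
    launch_uri(build_target_page_uri(target_poi))

on_recommended_app_tapped({"name": "store No. 1", "address": "example road"}, print)
```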
In some optional embodiments, the current user may share the target picture with other users. The sharing may be performed through a recommended application displayed in the related information of the target picture, or the user may exit the browsing interface of the target picture and then open the related application, which is not particularly limited in the embodiments of the present application.
It will be appreciated that after the current user shares the target picture with other users, the other users who receive the target picture may invoke the related application through step 903 and step 904, and may complete the required task by one key.
Fig. 13 is a schematic structural diagram of an embodiment of a picture taking device according to the present application, and as shown in fig. 13, the picture taking device 1300 may include: an acquisition module 1310, a determination module 1320, and an association module 1330; wherein,
An obtaining module 1310, configured to obtain, in response to a shooting operation of a user, POI information of one or more points of interest, where the POI information at least includes a POI address;
a determining module 1320, configured to determine target POI information from the one or more POI information;
And the association module 1330 is configured to associate the target picture obtained by shooting with the target POI information and store the target POI information.
In one possible implementation manner, the image capturing apparatus 1300 further includes:
and the complementing module is used for calling a target application if judging that the POI address in the target POI information is incomplete, and complementing the incomplete POI address in the target POI information based on the target application.
In one possible implementation manner, the obtaining module 1310 is configured to obtain the one or more POI information if it is determined that the currently shot object includes at least an object of an address type.
In one possible implementation manner, the association module 1330 is configured to obtain a target business hour of the target POI, and store a target picture obtained by shooting after associating the target picture with the target POI information and the target business hour.
In one possible implementation manner, the determining module 1320 is configured to identify key information in the target picture;
and matching the key information with the POI names in the one or more interest point information to determine target POI information.
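A minimal sketch of this matching step in the determining module 1320 is shown below; exact substring matching between the identified key information and the POI names is an assumption made for illustration, and any other text-matching strategy could be substituted.

```python
def determine_target_poi(key_info: list[str], poi_candidates: list[dict]):
    """Return the first candidate whose POI name contains a recognized keyword."""
    for poi in poi_candidates:
        if any(keyword and keyword in poi["name"] for keyword in key_info):
            return poi
    return None

target = determine_target_poi(["store No. 1"],
                              [{"name": "store No. 1", "address": "example road 1"},
                               {"name": "store No. 2", "address": "example road 2"}])
```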
The embodiment shown in fig. 13 provides a picture taking device 1300 that can be used to implement the technical solution of the method embodiment of the present application, and the implementation principle and technical effects can be further described with reference to the related description of the method embodiment.
Fig. 14 is a schematic structural diagram of an embodiment of a picture processing apparatus according to the present application, as shown in fig. 14, the picture processing apparatus 1400 may include: a display module 1410; wherein,
The display module 1410 is configured to display related information in response to a first operation of a user on a target picture, where the target picture is associated with at least target POI information, the related information includes a first recommended application, and the first recommended application includes at least a recommended application related to an address type.
In one possible implementation manner, the display module 1410 is configured to determine, in response to a first operation of a user on a target picture, a type of a target object in the target picture, where the type of the target object includes a biological type and/or an address type;
and if the target object at least comprises an object of an address type, displaying related information.
In one possible implementation manner, the related information further includes a second recommendation application, where the second recommendation application is determined according to a personal usage habit and/or a public usage habit.
In one possible implementation manner, the image processing apparatus 1400 further includes:
The sorting module is used for sorting and displaying the recommended applications in the first recommended application based on the priority of the recommended applications in the first recommended application.
In one possible implementation manner, the priority of each recommended application in the first recommended application is determined by the personal usage habit and/or the public usage habit.
In one possible implementation manner, the target picture is further associated with a target business hour, and the priority of each recommended application in the first recommended application is determined by the target business hour.
In one possible implementation manner, the image processing apparatus 1400 further includes:
The calling module is used for responding to a second operation of a user on a target application, and calling the target application, wherein the target application comprises a recommended application related to an address type in the first recommended application;
And displaying a target page in the target application, and automatically inputting target POI information associated with the target picture in the target page.
It should be understood that the above division of the respective modules of the picture taking device 1300 shown in fig. 13 and the picture processing apparatus 1400 shown in fig. 14 is merely a division of logic functions; in actual implementation, the modules may be fully or partially integrated into one physical entity or may be physically separated. These modules may all be implemented in the form of software called by a processing element, may all be implemented in hardware, or some of the modules may be implemented in the form of software called by a processing element while the others are implemented in hardware. For example, any of the above modules may be a separately established processing element, or may be integrated in a certain chip of the electronic device. The implementation of the other modules is similar. In addition, all or some of the modules may be integrated together or may be implemented independently. In implementation, each step of the above method or each of the above modules may be implemented by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the modules above may be one or more integrated circuits configured to implement the methods above, such as one or more application specific integrated circuits (ASIC), one or more microprocessors (digital signal processors, DSP), or one or more field programmable gate arrays (FPGA), etc. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
In the above embodiments, the processor may include, for example, a CPU, a DSP, a microcontroller, or a digital signal processor, and may further include a GPU, an embedded neural-network processing unit (NPU), and an image signal processor (ISP). The processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the program of the technical solution of the present application. Further, the processor may have the function of running one or more software programs, and the software programs may be stored in a storage medium.
The embodiment of the application also provides a readable storage medium, where the readable storage medium stores a program, and when the program runs on the electronic device, the electronic device is caused to execute the method provided by the embodiments of the present application.
The embodiments of the present application also provide a program product comprising a program which, when run on an electronic device, causes the electronic device to perform the method provided by the illustrated embodiments of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate that A exists alone, A and B exist together, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after are in an "or" relationship. "At least one of the following" and similar expressions mean any combination of these items, including any combination of single items or plural items. For example, at least one of a, b, and c may represent: a, b, c, a and b, a and c, b and c, or a and b and c, where a, b, and c may each be single or multiple.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, any of the functions, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on this understanding, the technical solution of the present application essentially, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely exemplary embodiments of the present application, and any person skilled in the art may easily conceive of changes or substitutions within the technical scope of the present application, which should be covered by the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A picture taking method, the method comprising:
Responding to shooting operation of a user, acquiring one or more POI information, wherein the POI information at least comprises POI addresses;
Determining target POI information in the one or more POI information;
And associating the shot target picture with the target POI information and storing the target POI information.
2. The method of claim 1, wherein after determining the target POI information in the one or more POI information, the method further comprises:
And if the POI address in the target POI information is incomplete, calling a target application, and complementing the incomplete POI address in the target POI information based on the target application.
3. The method according to claim 1 or 2, wherein the obtaining one or more point of interest POI information comprises:
And if the current shot object at least comprises an address type object, acquiring one or more POI information.
4. A method according to any one of claims 1-3, wherein storing the captured target picture in association with the target POI information comprises:
and acquiring target business hours of the target POI, and storing the target picture acquired by shooting after being associated with the target POI information and the target business hours.
5. The method of any of claims 1-4, wherein the POI information comprises POI names, and wherein determining target POI information in the one or more POI information comprises:
identifying key information in the target picture;
and matching the key information with the POI names in the one or more interest point information to determine target POI information.
6. A picture processing method, the method comprising:
And displaying related information in response to a first operation of a user on the target picture, wherein the target picture is at least related to target POI information, the related information comprises a first recommended application, and the first recommended application at least comprises a recommended application related to the address type.
7. The method of claim 6, wherein the displaying the related information in response to the first operation of the target picture by the user comprises:
Determining a type of a target object in a target picture in response to a first operation of a user on the target picture, wherein the type of the target object comprises a biological type and/or an address type;
and if the target object at least comprises an object of an address type, displaying related information.
8. The method according to claim 6 or 7, wherein the related information further comprises a second recommendation application, the second recommendation application being determined according to the personal usage habits and/or the public usage habits.
9. The method according to any one of claims 6-8, further comprising:
And sequencing and displaying each recommended application in the first recommended application based on the priority of each recommended application in the first recommended application.
10. The method of claim 9, wherein the priority of each of the first recommended applications is determined by personal usage habits and/or public usage habits.
11. The method of claim 9, wherein the target picture is further associated with a target business hour, and wherein the priority of each of the first recommended applications is determined by the target business hour.
12. The method according to any one of claims 6-11, further comprising:
Responding to a second operation of a user on a target application, and calling the target application, wherein the target application comprises a recommended application related to an address type in the first recommended application;
And displaying a target page in the target application, and automatically inputting target POI information associated with the target picture in the target page.
13. An electronic device, comprising: a processor and a memory for storing a program; the processor is configured to run the program, implement the picture taking method according to any one of claims 1 to 5, or implement the picture processing method according to any one of claims 6 to 12.
14. A readable storage medium, characterized in that the readable storage medium stores a program which, when run on an electronic device, implements the picture taking method according to any one of claims 1-5 or implements the picture processing method according to any one of claims 6-12.
CN202311835171.1A 2023-12-27 2023-12-27 Picture shooting method, picture processing method, electronic device and storage medium Pending CN117979156A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311835171.1A CN117979156A (en) 2023-12-27 2023-12-27 Picture shooting method, picture processing method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN117979156A true CN117979156A (en) 2024-05-03

Family

ID=90857113


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination