CN117093735A - Object full life cycle searching method and electronic equipment - Google Patents

Object full life cycle searching method and electronic equipment

Info

Publication number
CN117093735A
CN117093735A (application CN202210513016.7A)
Authority
CN
China
Prior art keywords
image
life cycle
information
images
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210513016.7A
Other languages
Chinese (zh)
Inventor
唐伟
刘瑞涛
汪芳山
刘晓波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210513016.7A priority Critical patent/CN117093735A/en
Priority to PCT/CN2023/093133 priority patent/WO2023217159A1/en
Publication of CN117093735A publication Critical patent/CN117093735A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
            • G06F16/50 Information retrieval of still image data
              • G06F16/53 Querying
                • G06F16/532 Query formulation, e.g. graphical querying
                • G06F16/538 Presentation of query results
              • G06F16/55 Clustering; Classification
              • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
                • G06F16/583 Retrieval using metadata automatically derived from the content
            • G06F16/90 Details of database functions independent of the retrieved data types
              • G06F16/95 Retrieval from the web
                • G06F16/953 Querying, e.g. by the use of web search engines
                  • G06F16/9532 Query formulation
                  • G06F16/9538 Presentation of query results

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Library & Information Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are an object full life cycle searching method and electronic equipment. The method comprises the following steps: acquiring first information, wherein the first information comprises information of a first object; determining that the first object has a life cycle, the life cycle comprising different growth stages of the first object; and outputting an image sequence, wherein the image sequence comprises N first images, N being a positive integer, and the N first images show the morphology of the first object at different growth stages. In this way, the electronic device can present images of the first object's morphology at each of its growth stages, making it easy for the user to learn the object's life pattern.

Description

Object full life cycle searching method and electronic equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a full life cycle searching method for an object and an electronic device.
Background
As terminal devices gain more functions, users can use them to search for all kinds of information. For example, a terminal device has a browser application (APP) in which a user can enter information to search. A user entering the word "swan" in a browser application may get the search results shown in FIG. 1, which include pictures of a wide variety of swans. The images returned by such a search are unordered: they are essentially a jumble of assorted swan pictures, with no discernible pattern.
Disclosure of Invention
The application aims to provide an object full life cycle searching method and an electronic device, so that the images of an object searched by a user are displayed according to the object's life pattern.
In a first aspect, a full life cycle search method for an object is provided, including: acquiring first information, wherein the first information comprises information of a first object; determining that the first object has a lifecycle, the lifecycle comprising different growth phases of the first object; outputting an image sequence, wherein the image sequence comprises N first images, and N is a positive integer; the N first images are used for displaying the morphology of the first object in different growth stages.
In the embodiment of the application, the electronic device may output an image sequence covering the full life cycle of the first object, where the image sequence includes N first images reflecting the morphology of the first object at different growth stages. Because the images of the searched object (namely the first object) are displayed according to the object's life cycle, the user can quickly grasp the life pattern of the searched object, which makes for natural and intuitive science popularization.
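The following is a minimal sketch of this first-aspect flow, assuming text input and a hard-coded stage table. All names here (LIFECYCLE_STAGES, full_lifecycle_search, and the returned file names) are hypothetical illustrations, not anything the patent prescribes.

```python
# Hypothetical stage table; a real system would query models or a database.
LIFECYCLE_STAGES = {
    "apple": ["seed", "germination", "adult", "flowering", "fruiting"],
    "swan": ["egg", "juvenile", "adult"],
}

def full_lifecycle_search(first_info: str) -> list[str]:
    first_object = first_info.strip().lower()      # extract the first object
    stages = LIFECYCLE_STAGES.get(first_object)    # lifecycle check
    if stages is None:                             # no lifecycle attribute
        return [f"{first_object}: ordinary search results"]
    # N first images, one per growth stage, ordered earliest to latest
    return [f"{first_object}_{stage}.jpg" for stage in stages]

print(full_lifecycle_search("apple"))
```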
In one possible design, the first information is a second image, and the first object satisfies at least one of the following conditions:
the object located at the center of the second image; or
the object occupying the largest area on the second image; or
the object appearing the most times on the second image; or
an object specified by the user on the second image.
It should be noted that the first information may be an image or text.
Taking the case where the first information is the second image as an example: after the electronic device acquires the second image, it may determine the first object on the second image, for example, the object at the center of the image, the object occupying the largest area, the object appearing the most times, or an object specified by the user. Then, the electronic device may output a full life cycle image sequence of the first object, namely the N first images reflecting the forms of the first object at different growth stages, so that the user can readily understand the life pattern of the searched object (namely the first object).
Taking the case where the first information is text as an example: after the electronic device acquires the text, a keyword may be extracted from it as the first object. For example, if the text entered by the user is "what a swan looks like", the electronic device can extract the keyword "swan" from it as the first object, and then search for and output the image sequence of the swan's full life cycle.
The electronic device may acquire the first information in various manners. For example, the user may input the first information by voice, or the electronic device may provide an input box in which the user enters the first information.
In one possible design, determining that the first object has a lifecycle includes: identifying a type of the first object; and judging whether the first object has a life cycle or not according to the type of the first object. For example, when the first object is an animal or plant, it is determined that the first object has a life cycle. Therefore, whether the first object has a life cycle can be determined through the type of the first object, and if so, an image sequence of the full life cycle of the first object can be output, so that a user can conveniently know the life rule of the search object (namely the first object).
In one possible design, before outputting the sequence of images, further comprising: respectively inputting the first information into M life cycle models to obtain M pieces of characteristic information, wherein M is a positive integer, and each piece of characteristic information in the M pieces of characteristic information is used for describing a growth stage characteristic of the first object; and obtaining the image sequence according to the M pieces of characteristic information.
In one possible design, the M lifecycle models are related to the first object or, alternatively, to a type of the first object.
In one case, the M lifecycle models are related to the first object, i.e. the first object is different, and the corresponding M lifecycle models are different. For example, when the first object is object a, the M lifecycle models are models for searching for the lifecycle of object a; when the first object is object B, the M lifecycle models are models for searching for the lifecycle of object B.
In another case, the M lifecycle models are related to the type of the first object, i.e. the type of the first object is different, and the corresponding M lifecycle models are different. For example, when a first object is of a first type, the M lifecycle models are models for searching for the lifecycle of the first type of object; when the first object is of the second type, the M lifecycle models are models for searching for the lifecycle of the object of the second type.
In one possible design, the N first images include an i-th first image and a j-th first image, where 1 ≤ i < j ≤ N; the i-th first image shows the morphology of the first object in a first growth stage, and the j-th first image shows the morphology of the first object in a second growth stage, which follows the first growth stage. That is, the first images are displayed in order of the first object's growth stages, from earliest to latest, so that the user can view them conveniently.
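A small sketch of this ordering constraint, assuming each image carries a stage_index annotation (an assumption; the patent does not define how stage order is stored):

```python
images = [
    {"file": "apple_flowering.jpg", "stage_index": 3},
    {"file": "apple_seed.jpg", "stage_index": 0},
    {"file": "apple_adult.jpg", "stage_index": 2},
]
# earlier stages first: the i-th image precedes the j-th whenever i < j
image_sequence = sorted(images, key=lambda img: img["stage_index"])
```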
In one possible design, the method further comprises: receiving a first operation, where the first operation instructs opening a first application; and opening the first application and displaying a first interface, where the first interface includes an input box. The acquiring of the first information includes: acquiring first information entered in the input box. Before the outputting of the image sequence, the method further comprises: receiving a second operation, where the second operation instructs searching on the first information entered in the input box.
That is, the electronic device has a first application, such as a browser application. When the first application is opened, an input box is displayed, and the user can enter the first information in it. When a second operation is received, instructing a search on the first information, the electronic device outputs the full life cycle image sequence of the first object identified from the first information.
In one possible design, the method further comprises: and outputting prompt information, wherein the prompt information is used for prompting the current growth stage of the first object.
That is, the electronic device may not only output the image sequence of the full life cycle of the first object but also prompt the user as to which life stage the first object is currently in, which helps the user learn about the object.
In one possible design, before outputting the image sequence, the method may further include: dividing the N first images in the image sequence into K groups, where K is a positive integer and each group corresponds to one life stage of the first object. For example, the electronic device displays 3 groups of images: group 1 corresponds to the first life stage, group 2 to the second life stage, and group 3 to the third life stage, where the first life stage precedes the second, and the second precedes the third. That is, multiple images may be shown for each life stage of the first object, giving the user a more comprehensive view of that stage.
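A sketch of this grouping step, again assuming a per-image stage_index label (an assumption introduced for illustration):

```python
from itertools import groupby

def group_by_stage(images: list[dict]) -> list[list[dict]]:
    """Divide the N first images into K groups, one per life stage."""
    ordered = sorted(images, key=lambda img: img["stage_index"])
    return [list(grp) for _, grp in
            groupby(ordered, key=lambda img: img["stage_index"])]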
In a second aspect, there is also provided an electronic device, comprising: a processor, a memory, and one or more programs; wherein the one or more programs are stored in the memory, the one or more programs comprising instructions, which when executed by the processor, cause the electronic device to perform the method steps of any of the first aspect.
In a third aspect, there is also provided a computer readable storage medium storing a computer program which, when run on a computer, causes the computer to perform the method of any of the first aspects described above.
In a fourth aspect, there is also provided a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method as described in the first aspect above.
In a fifth aspect, an embodiment of the present application further provides a chip. The chip is coupled to a memory in an electronic device, and is configured to invoke a computer program stored in the memory and execute the technical solution of the first aspect of the embodiments of the present application. In the embodiments of the present application, "coupled" means that two components are combined with each other directly or indirectly.
For the advantageous effects of the second to fifth aspects, refer to those of the first aspect; the description is not repeated here.
Drawings
FIG. 1 is a diagram of a general search result provided by an embodiment of the present application;
fig. 2A is a schematic diagram of a communication system according to an embodiment of the present application;
fig. 2B is a schematic hardware structure of an electronic device according to an embodiment of the application;
fig. 2C is a schematic software structure of an electronic device according to an embodiment of the application;
FIG. 3 is a diagram illustrating a full life cycle search process according to an embodiment of the present application;
FIG. 4 is a diagram illustrating a full life cycle search process according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a full life cycle search process according to an embodiment of the present application;
FIG. 6 is a diagram illustrating a full life cycle search process for an object according to an embodiment of the present application;
FIGS. 7A-7D are schematic diagrams of a full life cycle model set according to an embodiment of the present application;
FIG. 8 is a diagram of full life cycle search results provided by an embodiment of the present application;
FIG. 9 is another diagram of full life cycle search results provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a current life cycle identification process according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a current life cycle identification process according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following, some terms in the embodiments of the present application are explained for easy understanding by those skilled in the art.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. In addition, it should be understood that in the description of the present application, words such as "first" and "second" are used merely to distinguish between the objects described, and do not indicate or imply relative importance or order. For example, "first device" and "second device" serve only to distinguish two devices; they do not indicate the importance or ordering of the two.
In the embodiment of the present application, "and/or" is an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
The object full life cycle searching method provided by the embodiments of the present application can be applied to a communication system (simply referred to as a system below). The system includes two or more devices. For ease of understanding, the system is described as including a first device and a second device.
The first device may be a terminal device having a search function. For example, the first device includes an application with a search function, such as a browser application. By way of example, the first device may be a portable electronic device such as a mobile phone, a tablet computer, a portable computer, or a wearable device with wireless communication capability (e.g., a smart watch, smart glasses, a smart band, or a smart helmet); alternatively, the first device may be an in-vehicle device; or a desktop computer, an all-in-one computer, or the like; or a smart home device such as a television (e.g., a smart screen) or a speaker. The type of the first device is not limited by this application. In other embodiments, the first device may have an image capture function, for example a camera, and may capture images for "search by image" (described below).
The second device may be a server, for example, a server supporting message storage and distribution, multi-user access management, large-scale data storage, large-scale data processing, data redundancy backup, and the like. For example, it may be a Linux server, a Windows server, or another type of server; or a server capable of serving multiple devices simultaneously. The second device may be a single server or a server cluster formed by a plurality of servers, for example a cluster spanning multiple regions, machines, and servers. It is understood that the server may also be referred to as a cloud device, the cloud, or the like.
For example, fig. 2A is a schematic diagram of a communication system according to an embodiment of the present application. As shown in fig. 2A, the system includes a terminal device and a server. In this example the terminal device is a mobile phone: a user may search for an object on the phone, and the phone retrieves a full life cycle image sequence of the object (described later) through the server.
When the object full life cycle searching method provided by the embodiments of the present application is applied to a communication system, the terminal device and the server each perform part of the work. For example, the terminal device may feed the information input by the user into the full life cycle model group to obtain a full life cycle feature vector group, and then send that feature vector group to the server to obtain the final full life cycle image sequence, as described later with respect to fig. 6. Alternatively, after receiving the input information, the terminal device sends it to the server; the server feeds the input into the full life cycle model group to obtain the full life cycle feature vector group, derives the final image sequence from the feature vector group, and returns the sequence to the terminal device. In short, the division of work between the terminal device and the server can be flexibly arranged, and the embodiments of the present application place no limitation on it.
In other embodiments, the object full life cycle searching method provided in the embodiments of the present application may also be performed entirely by a single electronic device, which may be, for example, the terminal device in the communication system shown in fig. 2A. In that case, one electronic device performs the functions of both the terminal device and the server in the communication system. By way of example, the electronic device may be a portable electronic device such as a mobile phone, a tablet computer, a portable computer, or a wearable device with wireless communication capability (e.g., a smart watch, smart glasses, a smart band, or a smart helmet); or an in-vehicle device; or a desktop computer, an all-in-one computer, or the like; or a smart home device such as a television (e.g., a smart screen) or a speaker. The type of the electronic device is not limited by this application.
For ease of understanding, the following description takes as an example the case where the object full life cycle searching method provided in the embodiments of the present application is applied to a communication system.
The structure of the first device in the communication system is described below.
Fig. 2B shows a schematic structural diagram of the electronic device. The electronic device may be a first device in a communication system. The electronic device may be a cell phone, tablet computer, or the like. As shown in fig. 2B, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a user identification module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller can be a neural center and a command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the processor may fuse images acquired by multiple cameras on the electronic device to obtain a panoramic image. The processor for performing image fusion may be a CPU or GPU.
USB interface 130 is an interface that conforms to the USB standard specification, including, but not limited to, a Mini USB interface, a Micro USB interface, a USB Type C interface, etc. The USB interface 130 may be used to connect a charger to charge an electronic device, or may be used to transfer data between the electronic device and a peripheral device. The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G/6G, etc. applied on an electronic device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. for application on an electronic device. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 150 of the electronic device are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the electronic device can communicate with the network and other devices through wireless communication technology.
The display 194 is used to display a display interface of an application or the like. The electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.
In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1. Taking an example that the electronic device comprises N cameras, wherein the N cameras comprise at least one wide-angle camera, the wide-angle camera is used for shooting panoramic images (called a main body image), the other cameras are used for shooting auxiliary images, and the auxiliary images and the main body image are fused to obtain global images with clearer details. The specific implementation principle will be described later.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, software code of at least one application program, and the like. The storage data area may store data (e.g., images, video, etc.) generated during use of the electronic device, and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a general-purpose flash memory device, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions.
The electronic device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. The gyro sensor 180B may be used to determine a motion gesture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180B.
The gyro sensor 180B may be used for photographing anti-shake. The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronics calculate altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device in various directions (typically three axes). A distance sensor 180F for measuring a distance. The electronic device may measure the distance by infrared or laser. In some embodiments, the scene is photographed and the electronic device can range using the distance sensor 180F to achieve quick focus. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The electronic device may detect that the user holds the electronic device near the ear to talk using the proximity light sensor 180G, so as to automatically extinguish the screen for power saving purposes.
The ambient light sensor 180L is used to sense ambient light level. The electronic device can adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect if the electronic device is in a pocket to prevent false touches. The fingerprint sensor 180H is used to collect a fingerprint. The electronic equipment can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access the application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device performs a temperature processing strategy using the temperature detected by temperature sensor 180J.
The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, bone conduction sensor 180M may acquire a vibration signal of a human vocal tract vibrating bone pieces. The bone conduction sensor 180M may also contact the pulse of the human body to receive the blood pressure pulsation signal.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device.
It will be appreciated that the components shown in fig. 2B do not constitute a particular limitation of the electronic device. The electronic device in embodiments of the invention may include more or fewer components than in fig. 2B. In addition, the combination/connection relationship between the components in fig. 2B is also adjustable and modifiable.
Fig. 2C is a schematic diagram of a software structure of a communication system according to an embodiment of the present application.
As shown in fig. 2C, the upper half is the software structure diagram of the server, and the lower half is that of the terminal device. The server comprises one or more nodes; each node may be a module or a sub-server, and multiple nodes can load-balance the traffic of multi-device access. Taking node 1 as an example, node 1 comprises a storage module, a central processing module, and a communication module. The communication module communicates with the terminal device or other nodes; for example, it may receive the full life cycle feature vector group reported by the terminal device, or deliver the full life cycle search result to the terminal device. The central processing module handles the related services, for example, determining the full life cycle search result according to the full life cycle feature vector group reported by the terminal device. The storage module stores a large number of images; for example, the central processing module searches the images stored in the storage module for matches against the full life cycle feature vector group to obtain the full life cycle search result.
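A sketch of the central processing module's matching step, assuming stored images carry precomputed embeddings and that cosine similarity is the matching metric (the patent does not specify a metric; both are assumptions for illustration):

```python
import numpy as np

def match_stage(feature_vec, stored):
    """stored: list of (image_path, embedding) pairs for one life stage."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    return max(stored, key=lambda item: cos(feature_vec, item[1]))[0]

def search_full_lifecycle(feature_group, stored_per_stage):
    # one matched image per growth-stage feature vector, in stage order
    return [match_stage(vec, stored)
            for vec, stored in zip(feature_group, stored_per_stage)]
```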
Continuing to refer to fig. 2C, the lower half is a schematic structural diagram of a terminal device, where the terminal device includes a communication module, an input module, an output module, a central processing module, a camera module, and a storage module. The communication module is used for communicating with the server. The input module is used for inputting information, which can be text or images, for example, in an input box of a browser application. The image pickup module is used for shooting images, and the images can be used as input images. The output module is used for outputting the full life cycle search result. The storage module is used for storing the full life cycle model group, which will be described in detail later. The central processing module is used for inputting the information obtained by the input module into the full life cycle model group in the storage module to obtain the full life cycle feature vector group, and the full life cycle feature vector group is sent to the server through the communication module.
The application scenario of the object full life cycle searching method provided by the embodiment of the application is described below.
For example, referring to fig. 3 (a), the mobile phone displays a main interface 201 containing icons of various applications, including an icon 202 of a browser. When the mobile phone detects an operation on the icon 202, the interface 203 shown in fig. 3 (b) is displayed. The interface 203 includes an input box, as well as a camera marker 204 and a search marker 205. When the phone detects an operation on the camera marker 204, the interface 206 shown in fig. 3 (c) is displayed. A selection box 207 is included in the interface 206, containing a "shoot" option 208 and a "select from cell phone album" option 209. When the phone detects the user selecting the "select from cell phone album" option 209, the interface 210 shown in fig. 3 (d) is displayed, containing thumbnails of the images in the phone's gallery application. When the phone detects the user selecting the thumbnail 211 (whose photographic subject is an "apple"), the interface 212 shown in fig. 3 (e) is displayed, which includes a selection box 213. If the user has selected the wrong image, the selection box 213 may be closed and an image selected again via the camera marker 204. Continuing with fig. 3 (e), when the phone detects an operation on the search marker 205, the interface 214 shown in fig. 3 (f) is displayed. The interface 214 contains the search results for the thumbnail 211: an image sequence of the full life cycle of the object (e.g., the "apple") in the thumbnail 211. For example, the full life cycle of an apple includes a seed period, a germination period, an adult period, a flowering period, and a fruiting period. The image sequence in fig. 3 (f) accordingly includes an image 215 corresponding to the apple's seed period, an image 216 for the germination period, an image 217 for the adult period, an image 218 for the flowering period, and an image 219 for the fruiting period.
In the embodiment of fig. 3, after the user opens the browser application and enters an image in its input box, a full life cycle image sequence of the object in the image (e.g., an apple) is searched for; this is also referred to as "search by image". In other embodiments, after the user opens the browser application and enters text in the input box, an image sequence of the full life cycle of the object mentioned in the text can likewise be searched for.
For example, referring to fig. 4 (a), when the mobile phone detects an operation on the icon 202 of the browser application in the main interface 201, the interface 203 shown in fig. 4 (b) is displayed, which includes an input box. After detecting that the user has entered the word "apple" in the input box and then detecting an operation on the search marker 205, the mobile phone displays the interface shown in fig. 4 (c), which includes an image sequence of the full life cycle of the apple.
In fig. 4, after the user enters the word "apple" in the input box and clicks the search button, the full life cycle image sequence of the apple is searched for directly. One possible scenario is that the browser is dedicated to full life cycle searching, so that whenever the user enters text and clicks the search button, the full life cycle image sequence corresponding to that text appears directly. In other embodiments, the browser may support both full life cycle search and ordinary search. For example, referring to fig. 5 (a), when an operation of the user clicking the icon 202 of the browser application in the main interface 201 is detected, the interface 203 shown in fig. 5 (b) is displayed. A key 220 is included in the interface 203. When the mobile phone detects an operation on the key 220, the interface shown in fig. 5 (c) is displayed, which includes a selection box 221 containing an ordinary search key 222 and a full life cycle search key 223. With "apple" entered in the input box, when the mobile phone detects that the user selects the full life cycle search key 223, the interface shown in fig. 5 (d) is displayed, showing the full life cycle image sequence of the apple. Continuing with fig. 5 (c), with "apple" entered in the input box, when the mobile phone detects that the user selects the ordinary search key 222, the interface shown in fig. 5 (e) is displayed, which includes general information about apples, such as encyclopedia entries and apple varieties. Alternatively, still with "apple" entered in the input box, when the mobile phone detects that the user selects the ordinary search key 222, the interface shown in fig. 5 (f) may be displayed, which includes assorted apple pictures in no particular order.
That is, in the embodiment shown in fig. 5, the browser supports both ordinary search and full life cycle search, and the user may choose between them.
In other embodiments, the function of performing a full life cycle search on an object (for example, an apple) may be integrated into an application other than a browser application (for example, an instant messaging application). The description here mainly takes a browser application as an example, but the object full life cycle search provided by the embodiments of the present application may also be used in other applications, which the embodiments of the present application do not limit.
The principle of the full life cycle object search is described below, mainly taking a browser application as an example.
Fig. 6 is a flowchart of a method for identifying the full life cycle of an object according to an embodiment of the application. The method may be applied to the communication system shown in fig. 2A. As shown in fig. 6, the process includes:
s1, the terminal acquires an input image.
Illustratively, taking fig. 3 as an example, the input image may be the thumbnail 211 selected in the interface 210 shown in fig. 3 (d). Alternatively, it may be an image captured on the spot: in fig. 3 (c), if the user selects the "shoot" option 208, the camera application is opened to capture an image, which serves as the input image. Besides an image, the terminal may also receive text information, such as "apple" or "full life cycle of apple"; after obtaining the text entered by the user, the terminal may determine the target object from the text, as described later.
S2, the terminal initiates full life cycle searching.
Illustratively, taking fig. 3 (e) as an example, when the terminal detects that the user clicks the search marker 205, a full life cycle search is initiated.
S3, the terminal determines a target object on the image.
In some embodiments, the target object may be determined in a variety of ways, including, for example, at least one of the following modes A to D.
In mode A, the target object is the object located at the center of the input image. A single image may contain multiple objects, and the object at the center of the image is usually the one the user cares about, so the full life cycle search is performed with that object as the target object.
In mode B, the target object is the object occupying the largest area on the input image. The area of an object may be taken as the area enclosed by its edge contour; in general, the object occupying a larger area is the one the user cares about, so it is taken as the target object for the full life cycle search.
In mode C, the target object is the object that appears the most times on the input image. A single image may contain multiple objects, some appearing more than once; the object appearing the most times is generally the one the user cares about, so it is taken as the target object for the full life cycle search.
In mode D, the target object is an object specified by the user on the input image. For example, the electronic device may display the image so that the user can outline an object on it (e.g., a circling gesture to select it) as the target object. In this way, the user can choose which object on the image to search, giving a better interactive experience. A selection sketch covering modes A to C follows.
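The sketch below assumes the detection model (see fig. 7A) returns (label, box) pairs with box = (x1, y1, x2, y2); the function name and mode flags are hypothetical. Mode D would bypass this logic, since the user selects the object directly.

```python
from collections import Counter

def pick_target(detections, img_w, img_h, mode="B"):
    if mode == "A":  # object closest to the image center
        cx, cy = img_w / 2, img_h / 2
        def dist(d):
            x1, y1, x2, y2 = d[1]
            return ((x1 + x2) / 2 - cx) ** 2 + ((y1 + y2) / 2 - cy) ** 2
        return min(detections, key=dist)[0]
    if mode == "B":  # object occupying the largest area
        return max(detections,
                   key=lambda d: (d[1][2] - d[1][0]) * (d[1][3] - d[1][1]))[0]
    if mode == "C":  # label appearing the most times
        return Counter(label for label, _ in detections).most_common(1)[0][0]
    raise ValueError(f"unsupported mode: {mode}")
```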
It should be noted that fig. 6 takes "search by image" as an example. If the user inputs text information, the target object may instead be determined from the text, for example by extracting keywords from it. For instance, if the text entered by the user includes "apple growth", the keyword "apple" is extracted from it, i.e., the target object is determined to be an apple.
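A naive sketch of this keyword extraction, assuming a vocabulary lookup; a real system would likely use a word segmenter or a named-entity model, which the patent does not detail.

```python
KNOWN_OBJECTS = {"apple", "swan", "butterfly", "dragonfly"}  # illustrative

def extract_target(text):
    for word in text.lower().split():
        if word in KNOWN_OBJECTS:
            return word  # e.g. "apple growth" -> "apple"
    return None          # no recognizable target object
```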
In some embodiments, before S3, the method may further include: identifying the target object on the image using a detection model. The detection model may be a deep subject-detection and classification model common in computer vision, such as a pre-trained RCNN or YOLO, or a traditional statistical detection and classification model such as an SVM or Boosting. For example, fig. 7A is a schematic diagram of a detection model: the target detection model includes an input module, a backbone module, an intermediate module, a target segmentation module, and a prediction module. When an input image (containing an apple) is fed to the input module, the input module passes the image to the backbone module (for example, a backbone network); the backbone output then enters the intermediate module, for example a Neck, which bridges the upstream and downstream stages of the target detection model. The output of the intermediate module enters the target segmentation module, which segments the target object from the image. On top of this first-level detection, the target detection model may also include a second-level detection module, namely the prediction module, which predicts the type of the target object. In the embodiment of the present application, the target detection model may contain only the first-level detection module, or may also contain the second-level detection module.
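As a concrete stand-in for this backbone/Neck/head pipeline, the sketch below uses a pre-trained Faster R-CNN from torchvision. The patent does not mandate this library or model; it is one plausible choice, and the 0.5 score threshold is an assumption.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

# weights="DEFAULT" requires torchvision >= 0.13
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def detect(path):
    img = to_tensor(Image.open(path).convert("RGB"))
    with torch.no_grad():
        out = model([img])[0]          # dict: boxes, labels, scores
    keep = out["scores"] > 0.5         # drop low-confidence detections
    return list(zip(out["labels"][keep].tolist(),
                    out["boxes"][keep].tolist()))
```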
S4, the terminal judges whether the target object has life cycle attribute. If the target object does not have a lifecycle attribute, S5 is performed, and if the target object has a lifecycle attribute, S7 is performed.
It is understood that the life cycle attribute indicates whether an object has a life cycle. Some objects, such as stones and electronic devices, have no life cycle; others, such as apples, plants, and flowers, do. Thus, in some embodiments, the terminal may first identify the type of the target object and judge from the type whether the target object has a life cycle attribute. For example, when the type of the target object is a preset type, the target object is determined to have a life cycle attribute; otherwise, it is determined not to have one. The preset types include, for example: plants, animals, microorganisms, and the like.
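A minimal sketch of this check; the preset-type set follows the examples in the text, while the exact taxonomy is an assumption.

```python
PRESET_LIFECYCLE_TYPES = {"plant", "animal", "microorganism"}

def has_lifecycle_attribute(object_type: str) -> bool:
    # S4: decide the lifecycle attribute from the predicted type
    return object_type.lower() in PRESET_LIFECYCLE_TYPES
```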
S5, if the target object does not have the life cycle attribute, the terminal sends the target object to the server.
It may be understood that the target object sent by the terminal to the server may be an image block where the target object is located on the input image, or may also be text information for describing the target object, etc., which is not limited by the embodiment of the present application.
S6, the server returns a common search result to the terminal.
In some embodiments, the general search results may be, for example, the search results shown in (e) of fig. 5, or the search results shown in (f) of fig. 5.
S7, if the target object has life cycle attributes, the terminal calls a full life cycle model group and identifies a full life cycle feature group of the target object.
The full life cycle model set is described below using the example where the target object is a plant.
In some embodiments, the full life cycle model group includes N models, N being an integer greater than or equal to 2. For example, referring to fig. 7B, model 1 identifies the target object to obtain feature vector 1, which describes feature 1 of the target object; model 2 identifies the target object to obtain feature vector 2, which describes feature 2; and so on through model 5, which yields feature vector 5 describing feature 5. Thus, a full life cycle feature vector group (simply, the full life cycle feature group), comprising feature vectors 1 to 5, is obtained from the full life cycle model group (i.e., the N models).
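A sketch of this step (S7): run the target object through each model of the group and collect one feature vector per model. The models are stubbed with callables here; the patent's models would be trained networks, and the 128-dimensional vectors are an assumption.

```python
import numpy as np

def extract_feature_group(target, model_group):
    # model_group: ordered list [model 1, ..., model N], one per life stage
    return [model(target) for model in model_group]

# usage with dummy stand-in models (real ones would be neural networks)
dummy_group = [lambda t: np.random.rand(128) for _ in range(5)]
feature_group = extract_feature_group("apple", dummy_group)
assert len(feature_group) == 5  # feature vectors 1 to 5
```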
In fig. 7B, each of models 1 to 5 corresponds to one life cycle.
For example, taking a plant as the target object, the full life cycle may comprise 5 periods (also referred to as stages), i.e., N=5: a seed period, a germination period, an adult period, a flowering period, and a fruiting period, with each period corresponding to one model. For example, the seed period corresponds to model 1, the germination period to model 2, the adult period to model 3, the flowering period to model 4, and the fruiting period to model 5. Instead of "seed period, germination period, adult period, flowering period, fruiting period", the stages of a plant's full life cycle may be given other names, such as seed stage, seedling stage, tree stage, flower stage, and fruit stage. In addition, the number of periods/stages in the full life cycle may be set flexibly, for example to 3 or 5; the number may be set by the user or by default, and the embodiment of the present application does not limit it.
For example, with reference to fig. 7B and fig. 7C: model 1 identifies the target object to obtain feature vector 1, which describes the seed features of the target object, so model 1 is also called the seed model. (The seed features of the target object may be determined by identifying, for example, its category and morphology.) Model 2 yields feature vector 2, describing the germination features, so it is also called the germination model. Model 3 yields feature vector 3, describing the adult features, so it is also called the adult model. Model 4 yields feature vector 4, describing the flowering features, so it is also called the flowering model. Model 5 yields feature vector 5, describing the fruiting features, so it is also called the fruiting model.
It should be noted that although in the above embodiment the full life cycle model set includes 5 models corresponding to 5 life cycle periods/stages, in practice more or fewer models may be included, corresponding to more or fewer periods/stages. It will be appreciated that the greater the number of models, the more life cycle periods/stages are covered, and the more comprehensively the life cycle of the target object is described.
Furthermore, the above embodiment takes the target object being a plant (e.g., an apple) as an example, whose full life cycle includes a seed period, a germination period, an adult period, a flowering period, a fruiting period, and so on. It will be appreciated that if the target object is another type of object (e.g., an animal), its full life cycle may include an egg period, a shaping period, a larva period, an adult period, a death period, and so on. That is, different types of target objects correspond to different full life cycles. For example, taking the target object being an animal as an example, as understood in conjunction with fig. 7B and fig. 7D, model 1 is used to identify the target object to obtain feature vector 1, and feature vector 1 is used to describe the egg characteristics of the target object, so model 1 is also referred to as an egg model. Model 2 is used to identify the target object to obtain feature vector 2, and feature vector 2 is used to describe the shaping characteristics of the target object, so model 2 is also referred to as a shaping model. Model 3 is used to identify the target object to obtain feature vector 3, and feature vector 3 is used to describe the larval characteristics of the target object, so model 3 is also referred to as a larva model. Model 4 is used to identify the target object to obtain feature vector 4, and feature vector 4 is used to describe the adult characteristics of the target object, so model 4 is also referred to as an adult model. Model 5 is used to identify the target object to obtain feature vector 5, and feature vector 5 is used to describe the death characteristics of the target object, so model 5 is also referred to as a death model.
Therefore, in the embodiment of the present application, in S7 of fig. 6, the terminal may determine the corresponding full life cycle model set according to the type of the target object and identify the full life cycle feature set of the target object. For example, when the type of the target object is a plant, the full life cycle model set of fig. 7C may be used; when the type of the target object is an animal, the full life cycle model set of fig. 7D may be used.
In still other embodiments, even for target objects of the same type, different target objects may have different full life cycle model sets. For example, suppose two target objects are both animals, one a butterfly and one a dragonfly; the full life cycle model set corresponding to the butterfly and that corresponding to the dragonfly may be different. For example, the full life cycle model set corresponding to the butterfly may include an egg model, a larva model, a pupa model, and an adult model, while the full life cycle model set corresponding to the dragonfly may include an egg model, a larva model, and an adult model. That is, in S7 of fig. 6, the terminal may further determine the corresponding full life cycle model set according to the specific object that the target object is, so as to identify the full life cycle feature set of that object. A sketch of such a selection step follows.
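By way of illustration only, the selection could be organized as a registry keyed first by the specific object and then, as a fallback, by the object's type; the registry contents below merely echo the examples in the text, and all names are illustrative assumptions.

```python
# A minimal sketch of selecting the full life cycle model set (S7 of fig. 6)
# by specific object first, then by object type. Contents are illustrative.
LIFECYCLE_MODEL_GROUPS = {
    # specific objects
    "butterfly": ["egg", "larva", "pupa", "adult"],
    "dragonfly": ["egg", "larva", "adult"],
    # object types (fallbacks)
    "plant": ["seed", "germination", "adult", "flowering", "fruiting"],
    "animal": ["egg", "shaping", "larva", "adult", "death"],
}

def select_model_group(object_name: str, object_type: str) -> list:
    # Prefer a per-object model group; otherwise fall back to the type group.
    group = LIFECYCLE_MODEL_GROUPS.get(object_name)
    if group is None:
        group = LIFECYCLE_MODEL_GROUPS[object_type]
    return group

print(select_model_group("butterfly", "animal"))  # per-object group, 4 models
print(select_model_group("swan", "animal"))       # falls back to the animal group
```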
S8, the terminal sends the full life cycle feature group to the server.
Illustratively, taking fig. 7B as an example, the full life cycle feature set includes feature vectors 1 through 5.
In fig. 6, the terminal executes S7 (identifying the full life cycle feature set) and then executes S8 (sending the full life cycle feature set to the server). This saves transmission resources of the terminal and the server, since transmitting the feature set requires fewer transmission resources than transmitting an image. Of course, S7 may instead be performed by the server: for example, after the terminal determines that the target object has the life cycle attribute, it sends the image of the target object to the server, and the server performs S7. In that case the terminal must transmit an image (the image of the target object) to the server, and image transmission consumes more transmission resources. A rough size comparison is sketched below.
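To make the resource argument concrete, here is back-of-the-envelope arithmetic assuming, purely for illustration, five 512-dimensional float32 feature vectors versus one uncompressed 12-megapixel RGB image; neither size is specified in this application.

```python
# Rough, illustrative arithmetic only: compares sending five assumed
# 512-dimensional float32 feature vectors (S8) with sending one assumed
# uncompressed 12-megapixel RGB image. Neither size comes from this patent.
feature_bytes = 5 * 512 * 4          # 5 vectors x 512 dims x 4 bytes each
image_bytes = 4000 * 3000 * 3        # width x height x 3 RGB bytes per pixel
print(feature_bytes)                 # 10240 bytes, roughly 10 KB
print(image_bytes // feature_bytes)  # the image is about 3500x larger
```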
S9, the terminal receives the full life cycle search result sent by the server.
Illustratively, taking fig. 3 as an example, the full life cycle search result includes an image sequence, such as images 215 through 219 in (f) of fig. 3. The user can determine the full life cycle of the target object from this image sequence.
Taking the full life cycle feature set shown in fig. 7C as an example, after the server acquires the full life cycle feature set from the terminal, it determines image 1 corresponding to the seed period according to feature vector 1, where image 1 includes the target object (for example, an apple) in the seed period; determines image 2 corresponding to the germination period according to feature vector 2, where image 2 includes the target object in the germination period; determines image 3 corresponding to the adult period according to feature vector 3, where image 3 includes the target object in the adult period; determines image 4 corresponding to the flowering period according to feature vector 4, where image 4 includes the target object in the flowering period; and determines image 5 corresponding to the fruiting period according to feature vector 5, where image 5 includes the target object in the fruiting period.
In some embodiments, taking image 1 in fig. 8 as an example, the server stores a plurality of images corresponding to the seed period of the target object (for example, an apple), and image 1 may be determined among the plurality of images according to feature vector 1; that is, a different feature vector 1 yields a different image 1. As described above, feature vector 1 describes the seed characteristics of the target object, and different kinds of target objects have different seed characteristics, so the obtained image 1 differs accordingly.
In other embodiments, while in fig. 8 feature vector 1 corresponds to one image (image 1), feature vector 1 may also correspond to multiple images, which are different images of the seed period of the target object. Similarly, while in fig. 8 feature vector 2 corresponds to one image (image 2), feature vector 2 may correspond to multiple images, which are different images of the germination period of the target object. That is, multiple images may be obtained for each life cycle period, all reflecting that same period. A sketch of this matching step is given below.
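As one possible realization of this matching, the server might keep, for each growth period, a store of candidate images with precomputed feature vectors and return the nearest candidate(s) to each received feature vector. The sketch below assumes cosine similarity and per-stage stores; the function names, the similarity measure, and the data layout are all illustrative assumptions, not details of this application.

```python
# A minimal sketch of the server-side matching of fig. 8: one store of
# (image_id, feature_vector) candidates per growth stage, matched by
# cosine similarity. All names and sizes here are illustrative.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_stage_images(query_vec, candidates, top_k=1):
    """candidates: list of (image_id, feature_vector) for one stage."""
    scored = sorted(candidates,
                    key=lambda c: cosine_similarity(query_vec, c[1]),
                    reverse=True)
    return [image_id for image_id, _ in scored[:top_k]]

def build_image_sequence(feature_set, stage_stores, top_k=1):
    # feature_set: feature vectors 1..N; stage_stores: per-stage candidates.
    # top_k > 1 returns multiple images per stage, as described above.
    return [match_stage_images(vec, store, top_k)
            for vec, store in zip(feature_set, stage_stores)]

# Usage: 5 stages, 3 candidate images per stage, 64-dimensional features.
rng = np.random.default_rng(1)
stores = [[(f"stage{s}_img{i}", rng.standard_normal(64)) for i in range(3)]
          for s in range(5)]
query = [rng.standard_normal(64) for _ in range(5)]
print(build_image_sequence(query, stores))  # one best image id per stage
```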
For another example, taking the target object being a swan as an example, the full life cycle search result includes an image sequence as shown in fig. 9, including images 1 through 5. Image 1 shows the egg period of the swan, image 2 the hatching period, image 3 the hatchling period, and image 4 the adult period.
In other embodiments, the terminal may also output prompt information, where the prompt information is used to indicate the current period/stage of the target object. Taking fig. 3 as an example, assuming the apple in the image input by the user is currently in the fruiting period, the output prompt information indicates that the current period is the period shown in image 219.
It can be understood that before the terminal outputs the prompt information, the current period/stage of the target object needs to be determined. In one implementation, as understood in conjunction with fig. 10, the electronic device determines the period/stage in which the target object is currently located through the following steps:
Step 1: acquiring an image sequence covering the full life cycle of the object by using a temporal prior of the object's growth cycle;
Step 2: adaptively segmenting each image in the image sequence into sub-images;
Step 3: inputting each segmented sub-image into a convolutional neural network to obtain the image feature of each sub-image; for example, in fig. 10, T1-T8 represent the image features corresponding to 8 sub-images;
Step 4: inputting the image features of the sub-images into a hyper-parametric neural network to obtain the image feature weight of each image in the image sequence; that is, the image features of the sub-images are voted on by the hyper-parametric neural network, and the fused voting result for each image is its image feature weight;
Step 5: constructing an image feature weight sequence over the object's growth cycle by using the temporal prior of the growth cycle; and
Step 6: inputting the image feature weight sequence into a long short-term memory network to obtain the growth cycle identification result of the object. A minimal sketch of steps 3-6 is given below.
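The following is a minimal, hedged sketch of steps 3-6, assuming PyTorch; the layer sizes, the number of sub-images (8), the feature dimension, and the number of growth stages (5) are illustrative assumptions rather than parameters disclosed in this application, and the small CNN and voting network merely stand in for the convolutional and hyper-parametric networks of fig. 10.

```python
# A minimal sketch of the fig. 10 growth-cycle recognition pipeline.
# Assumptions: PyTorch; 8 sub-images per image; 5 growth stages.
import torch
import torch.nn as nn

class GrowthCycleRecognizer(nn.Module):
    def __init__(self, num_patches=8, feat_dim=128, num_stages=5):
        super().__init__()
        # Step 3: a small CNN extracts one feature vector per sub-image.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Step 4: a small "voting" network maps the sub-image features of
        # one image to a single image feature weight (fused voting result).
        self.weight_net = nn.Sequential(
            nn.Linear(num_patches * feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
        # Step 6: an LSTM consumes the weight sequence, ordered by the
        # temporal prior (step 5), and predicts the growth stage.
        self.lstm = nn.LSTM(input_size=1, hidden_size=32, batch_first=True)
        self.classifier = nn.Linear(32, num_stages)

    def forward(self, image_sequence):
        # image_sequence: (seq_len, num_patches, 3, H, W); each image has
        # already been adaptively segmented into sub-images (step 2).
        weights = []
        for patches in image_sequence:
            feats = self.cnn(patches)                    # (num_patches, feat_dim)
            weights.append(self.weight_net(feats.flatten()[None]))  # (1, 1)
        seq = torch.stack(weights).view(1, -1, 1)        # step 5: weight sequence
        out, _ = self.lstm(seq)
        return self.classifier(out[:, -1])               # stage logits

# Usage: 6 images, each split into 8 sub-images of 64x64 pixels.
model = GrowthCycleRecognizer()
logits = model(torch.randn(6, 8, 3, 64, 64))
print(logits.argmax(dim=-1))  # predicted growth stage index
```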
As another implementation, as understood in conjunction with fig. 11, the electronic device determines the period/stage in which the target object is currently located through the following steps:
Step 1: inputting the image into a first preset block of a residual neural network to obtain a first image feature corresponding to the image. The residual neural network may include one or more residual blocks; for example, with 5 residual blocks, the first preset block may be the first four of the 5 residual blocks.
Step 2: inputting the first image feature into a second preset block of the residual neural network to obtain a second image feature. Continuing the example in which the residual neural network includes 5 residual blocks, the second preset block may be the 5th residual block.
Step 3: inputting the second image feature into an attention model to obtain the locally important features automatically focused on by the attention mechanism. For example, the attention model may include a position attention module and/or a channel attention module, where the position attention module may have the structure shown in the upper part of fig. 11, and the channel attention module may have the structure shown in the lower part of fig. 11.
Step 4: inputting the locally important features into a first convolution layer and a pooling layer to obtain first label information corresponding to the image. The first label information is, for example, the life cycle period/stage in which the target object in the image is currently located. A minimal sketch of these four steps is given below.
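Purely as an illustration of these four steps, the sketch below uses a torchvision resnet18 as the residual neural network, treating its stem plus first three residual stages as the first preset block and its fourth stage as the second preset block, and substitutes a simple squeeze-and-excitation-style channel attention for the attention model of fig. 11; these substitutions and the number of stages (5) are assumptions for illustration, not the networks of this application.

```python
# A minimal sketch of the fig. 11 pipeline. Assumptions: PyTorch/torchvision;
# resnet18 as the residual network; a simple channel-attention stand-in.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class StageClassifier(nn.Module):
    def __init__(self, num_stages=5):
        super().__init__()
        backbone = resnet18(weights=None)
        # Step 1: "first preset block" = stem + first three residual stages.
        self.first_blocks = nn.Sequential(
            backbone.conv1, backbone.bn1, backbone.relu, backbone.maxpool,
            backbone.layer1, backbone.layer2, backbone.layer3,
        )
        # Step 2: "second preset block" = last residual stage (512 channels).
        self.second_block = backbone.layer4
        # Step 3: a squeeze-and-excitation-style channel attention module
        # standing in for the position/channel attention of fig. 11.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(512, 64), nn.ReLU(),
            nn.Linear(64, 512), nn.Sigmoid(),
        )
        # Step 4: first convolution layer + pooling -> first label info.
        self.head_conv = nn.Conv2d(512, num_stages, kernel_size=1)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, image):
        f1 = self.first_blocks(image)           # first image feature
        f2 = self.second_block(f1)              # second image feature
        w = self.attn(f2).view(-1, 512, 1, 1)   # per-channel attention weights
        attended = f2 * w                       # locally important features
        return self.pool(self.head_conv(attended)).flatten(1)  # stage logits

# Usage: classify the current growth stage of one 224x224 image.
model = StageClassifier()
print(model(torch.randn(1, 3, 224, 224)).argmax(dim=-1))
```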
In the embodiment shown in fig. 6, the terminal and the server cooperate to complete the technical solution of the present application. It should be noted that in practice a server may not be required; that is, all steps may be performed by the terminal. Alternatively, steps originally performed by the terminal in fig. 6 may be performed by the server; for example, the terminal may send the image to the server, and the server may then perform S3 and S4, reducing the terminal's workload. Likewise, steps originally performed by the server in fig. 6 may be performed by the terminal.
Fig. 12 is a schematic structural diagram of an electronic device 1200 according to an embodiment of the present application. The electronic device 1200 may be the terminal (e.g., a mobile phone) or the server in the foregoing. As shown in fig. 12, the electronic device 1200 may include: one or more processors 1201; one or more memories 1202; a communication interface 1203; and one or more computer programs 1204. The above components may be connected via one or more communication buses 1205. The one or more computer programs 1204 are stored in the memory 1202 and configured to be executed by the one or more processors 1201, and the one or more computer programs 1204 comprise instructions. For example, when the electronic device 1200 is the terminal described above, the instructions may be used to perform the terminal-related steps in the corresponding embodiments above. When the electronic device 1200 is the server described above, the instructions may be used to perform the server-related steps in the corresponding embodiments above. The communication interface 1203 is used to enable communication between the electronic device and other devices; for example, the communication interface may be a transceiver.
The embodiment of the application also provides a communication system. The communication system comprises a first device and a second device. The structures of the first device (e.g., the terminal) and the second device (e.g., the server) may be as shown in fig. 12. For example, when the electronic device 1200 shown in fig. 12 is the first device, the instructions of the one or more computer programs 1204, when executed by the processor, cause the first device to perform the steps of the first device (i.e., the terminal) in fig. 6 above. When the electronic device 1200 shown in fig. 12 is the second device, the instructions of the one or more computer programs 1204, when executed by the processor, cause the second device to perform the steps of the second device (i.e., the server) in fig. 6 above.
In the embodiments of the present application described above, the method provided by the embodiments of the present application is described with an electronic device (e.g., a mobile phone or a server) as the execution subject. To implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, the functions being implemented in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a given function is performed by a hardware structure, a software module, or a combination of the two depends on the specific application and design constraints of the technical solution.
As used in the above embodiments, the term "when" may be interpreted to mean "if", "after", "in response to determining", or "in response to detecting", depending on the context. Similarly, the phrase "when it is determined" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)", depending on the context. In addition, in the above embodiments, relational terms such as "first" and "second" are used to distinguish one entity from another, without limiting any actual relationship or order between the entities.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), and so on. The schemes of the above embodiments may be used in combination without conflict.
It is noted that a portion of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of the patent document or the patent disclosure as it appears in the patent office's files or records, but otherwise reserves all copyright rights whatsoever.

Claims (12)

1. A method for searching a full life cycle of an object, comprising:
acquiring first information, wherein the first information comprises information of a first object;
determining that the first object has a life cycle, the life cycle comprising different growth stages of the first object;
outputting an image sequence, wherein the image sequence comprises N first images, and N is a positive integer; the N first images are used for displaying the morphology of the first object in different growth stages.
2. The method of claim 1, wherein the first information is a second image and the first object satisfies at least one of the following conditions:
an object located at the center of the second image; or,
an object occupying the largest area in the second image; or,
an object appearing the largest number of times in the second image; or,
an object specified by a user in the second image.
3. The method of claim 1 or 2, wherein determining that the first object has a lifecycle comprises:
identifying a type of the first object;
and judging whether the first object has a life cycle or not according to the type of the first object.
4. The method according to any one of claims 1-3, characterized in that, before outputting the image sequence, the method further comprises:
respectively inputting the first information into M life cycle models to obtain M pieces of characteristic information, wherein M is a positive integer, and each piece of characteristic information in the M pieces of characteristic information is used for describing a growth stage characteristic of the first object;
and obtaining the image sequence according to the M pieces of characteristic information.
5. The method of claim 4, wherein the M lifecycle models are related to the first object or are related to a type of the first object.
6. The method of any one of claims 1-5, wherein the N first images include an i-th first image and a j-th first image, wherein i < j < N;
the i-th first image shows the morphology of the first object in a first growth stage; the j-th first image shows the morphology of the first object in a second growth stage, and the second growth stage follows the first growth stage.
7. The method according to any one of claims 1-6, further comprising:
receiving a first operation, wherein the first operation is used for indicating to open a first application;
opening the first application, and displaying a first interface, wherein the first interface comprises an input box;
the acquiring the first information includes: acquiring first information input in the input box;
before outputting the image sequence, the method further comprises: receiving a second operation, wherein the second operation is used for instructing a search based on the first information input in the input box.
8. The method according to any one of claims 1-7, further comprising:
and outputting prompt information, wherein the prompt information is used for prompting the current growth stage of the first object.
9. The method according to any one of claims 1-8, wherein prior to outputting the sequence of images, the method further comprises:
dividing the N first images included in the image sequence into K groups of images, wherein K is a positive integer, and each group of images corresponds to one growth stage of the first object.
10. An electronic device, comprising:
a processor, a memory, and one or more programs;
wherein the one or more programs are stored in the memory, the one or more programs comprising instructions which, when executed by the processor, cause the electronic device to perform the method steps of any of claims 1-9.
11. A computer readable storage medium, characterized in that a computer program is stored which, when run on a computer, causes the computer to perform the method according to any one of claims 1 to 9.
12. A computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method of any of the preceding claims 1-9.
CN202210513016.7A 2022-05-11 2022-05-11 Object full life cycle searching method and electronic equipment Pending CN117093735A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210513016.7A CN117093735A (en) 2022-05-11 2022-05-11 Object full life cycle searching method and electronic equipment
PCT/CN2023/093133 WO2023217159A1 (en) 2022-05-11 2023-05-10 Object full-life-cycle search method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210513016.7A CN117093735A (en) 2022-05-11 2022-05-11 Object full life cycle searching method and electronic equipment

Publications (1)

Publication Number Publication Date
CN117093735A true CN117093735A (en) 2023-11-21

Family

ID=88729715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210513016.7A Pending CN117093735A (en) 2022-05-11 2022-05-11 Object full life cycle searching method and electronic equipment

Country Status (2)

Country Link
CN (1) CN117093735A (en)
WO (1) WO2023217159A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001350828A (en) * 2000-04-05 2001-12-21 Taikaen:Kk Picture supplying device
CN104951459A (en) * 2014-03-26 2015-09-30 腾讯科技(深圳)有限公司 Display method and device for photo gallery
JP6570840B2 (en) * 2015-01-29 2019-09-04 Dynabook株式会社 Electronic apparatus and method
JP6590690B2 (en) * 2015-12-25 2019-10-16 富士フイルム株式会社 Cell image retrieval apparatus and method, and program
CN113868456A (en) * 2020-06-30 2021-12-31 北京安云世纪科技有限公司 Method and system for screening character images

Also Published As

Publication number Publication date
WO2023217159A1 (en) 2023-11-16

Similar Documents

Publication Publication Date Title
CN110225244A (en) A kind of image capturing method and electronic equipment
WO2020029306A1 (en) Image capture method and electronic device
CN110471606B (en) Input method and electronic equipment
CN112580400B (en) Image optimization method and electronic equipment
CN115734071A (en) Image processing method and device
CN111510626A (en) Image synthesis method and related device
WO2022143921A1 (en) Image reconstruction method, and related apparatus and system
CN115115679A (en) Image registration method and related equipment
CN111249728B (en) Image processing method, device and storage medium
CN115437601B (en) Image ordering method, electronic device, program product and medium
CN112416984A (en) Data processing method and device
CN113497835B (en) Multi-screen interaction method, electronic equipment and computer readable storage medium
CN117093735A (en) Object full life cycle searching method and electronic equipment
CN113572798B (en) Device control method, system, device, and storage medium
CN114971107A (en) Privacy risk feedback method and device and first terminal equipment
CN114527903A (en) Key mapping method, electronic equipment and system
CN111339513A (en) Data sharing method and device
WO2022179271A1 (en) Search result feedback method and device, and storage medium
CN113473057B (en) Video recording method and electronic equipment
CN115640414B (en) Image display method and electronic device
CN114466100B (en) Method, device and system for adapting accessory theme
CN114095600B (en) Earphone theme changing method, smart phone and storage medium
CN114115770B (en) Display control method and related device
CN116033063A (en) Method for checking message and electronic equipment
CN115857964A (en) Application program installation method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination