CN114667505A - Application program identification device and electronic device - Google Patents


Info

Publication number
CN114667505A
Authority
CN
China
Prior art keywords: image, application program, user, application, processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080007252.0A
Other languages
Chinese (zh)
Inventor
唐頔朏
冯太锐
张树本
李令言
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN114667505A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones

Abstract

This application discloses an application program identification device and an electronic device. A main processor receives a first instruction triggered by a user, the first instruction indicating that a first image is associated with a first application program. A coprocessor recognizes the first image to obtain a first image feature and sends first information indicating the first image feature to the main processor. When the main processor determines that the first information is not included in a database, it associates the first image feature with the first application program and stores the first information and the first application program in the database; the database comprises information indicating at least one image feature and the application program associated with each image feature. The electronic device can thus identify the application program to be activated based on image features, simplifying the user's operation flow and improving processing efficiency.

Description

Application program identification device and electronic device
Technical Field
The present application relates to the field of electronic devices, and in particular, to an application program identification device and an electronic device.
Background
With the development of intelligent electronic device technology, the number and functions of applications installed in intelligent electronic devices keep increasing, and the intelligent mobile terminal has become integrated into every aspect of daily life, meeting various needs of users. For example, a user may unlock a shared bicycle through the mobile terminal, make payments through the mobile terminal, ride public transportation with the mobile terminal, and so on.
However, with the development of mobile terminal technology and application development technology, users also place higher demands on the convenience of using applications in mobile terminals. Currently, to use an application program in a mobile terminal, a user generally needs to find the icon of the desired application program in the main interface or an application interface of the mobile terminal and click it; when many application programs are installed in the mobile terminal, finding the icon may take a long time. To simplify the user operation, a virtual key can be set so that the user can quickly start the application program by touching the virtual key, but this clutters the display interface of the mobile terminal. Alternatively, a corresponding touch trajectory can be set for the application program in advance: as shown in fig. 1, a memory in the mobile terminal stores a preset correspondence between touch trajectories and application programs; after the user inputs a touch trajectory, it is matched in turn against the touch trajectories in the correspondence, and if the matching succeeds the corresponding application program is activated. However, the user must remember the touch trajectory corresponding to each application program, which increases the user's memory burden.
In addition, fingerprint identification can be added to either of the two modes above, so that the same operation performed with different fingers corresponds to different application programs. However, in some scenarios the user's fingers may not conveniently operate the mobile terminal: for example, in cold weather a user wearing gloves must first remove them, and fingers that are too damp, too dry, or covered with other material may cause the fingerprint to go unrecognized. Although the prior art proposes a scheme for entering an application program through image recognition technology, the types of images that can be recognized are limited, so the application programs that can be entered or opened are limited, reducing the user experience.
Disclosure of Invention
This application provides an application program identification device and an electronic device that support user-defined application programs activatable through image recognition technology, simplifying the user's operation flow and improving the user experience.
In a first aspect, an embodiment of the present application provides an identification device for an application program, including: a main processor, configured to receive a first instruction triggered by a user, the first instruction indicating that a first image is associated with a first application program; and a coprocessor, configured to recognize the first image to obtain a first image feature and to send first information indicating the first image feature to the main processor. The main processor is further configured to, when it determines that the database does not include the first information, associate the first image feature with the first application program and store the first information and the first application program in the database; the database comprises information indicating at least one image feature and the application program associated with each image feature. This scheme supports user customization, establishing an association between an image feature and an application program. In subsequent image recognition, if an image feature associated with an application program is recognized, the identification device can automatically start that application program, simplifying the user's operation flow and expanding the types of application programs that can be activated through image recognition technology.
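The first-aspect registration flow — user instruction, coprocessor feature extraction, main-processor database check and store — can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; all class and method names (`Coprocessor`, `MainProcessor`, `extract_feature`, `handle_first_instruction`) are hypothetical, and the toy feature extractor stands in for the real recognition algorithm.

```python
# Illustrative sketch of the first-aspect registration flow; all names are
# hypothetical -- the patent does not specify an implementation.

class Coprocessor:
    def extract_feature(self, image):
        """Recognize the image and return the 'first information':
        an (identifier, feature) pair indicating the image feature."""
        feature = tuple(sorted(image))            # toy stand-in for real recognition
        identifier = str(abs(hash(feature)) % 100)
        return identifier, feature

class MainProcessor:
    def __init__(self):
        self.database = {}                        # first information -> application

    def handle_first_instruction(self, coprocessor, first_image, first_app):
        """First instruction from the user: associate first_image with first_app."""
        first_info = coprocessor.extract_feature(first_image)
        if first_info not in self.database:       # store only if not already present
            self.database[first_info] = first_app
        return self.database[first_info]
```

For example, `MainProcessor().handle_first_instruction(Coprocessor(), ["black", "guitar"], "tuner APP")` would store and return the "tuner APP" association; a second registration of the same image would not overwrite it, matching the "does not include the first information" condition above.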
In one possible implementation, the first information includes the first image feature and/or an identification of the first image feature.
In a possible implementation, the identification device further comprises a signal processing device configured to process first data from an image sensor to obtain the first image.
In one possible implementation, the first image is an image designated by a user in at least one preset image.
In a possible implementation manner, the signal processing device is further configured to process second data from the image sensor to obtain a second image; the coprocessor is further configured to identify the second image to obtain a second image feature, and send second information to the main processor when it is determined that the database includes the second information indicating the second image feature. The main processor is further used for determining the application program associated with the second image feature in the database as a target application program to be activated.
In a possible implementation manner, the coprocessor is further configured to perform semantic analysis on the second image or the second image feature to obtain a semantic tag when the database does not include second information indicating the second image feature; wherein the semantic tag comprises text information for characterizing the second image or the second image feature; the main processor is further configured to receive the semantic tag from the coprocessor, indicate at least one application program associated with the semantic tag to a user, receive a second instruction triggered by the user, where the second instruction is used to indicate that a second application program of the at least one application program is activated, and determine that the second application program is the target application program to be activated.
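The two lookup paths above — a known feature resolves directly to its associated application, while an unknown feature falls back to a semantic tag and a user choice among candidate applications — can be sketched like this. The tag-to-application table and all names are illustrative assumptions, and `user_choice` stands in for the second instruction.

```python
# Sketch of lookup with semantic-tag fallback: a recognized feature that is in
# the database resolves directly; an unknown feature is reduced to a text tag
# and candidate applications for that tag are offered to the user.
# The tag table and application names are illustrative assumptions.

TAG_TO_APPS = {
    "bicycle": ["ofo shared bicycle", "bus query APP"],
    "guitar": ["tuner APP"],
}

def resolve_target(database, feature, semantic_tag, user_choice=0):
    if feature in database:
        # Known feature: its associated application is the target directly.
        return database[feature]
    # Unknown feature: indicate the applications associated with the semantic
    # tag and let the user pick one (user_choice models the second instruction).
    candidates = TAG_TO_APPS.get(semantic_tag, [])
    return candidates[user_choice] if candidates else None
```

The design point is that the semantic tag lets the device still offer a relevant target even when no user-defined association exists for the exact image feature.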
In one possible implementation, after receiving the identification of the target application program from the coprocessor, the main processor is further configured to: determine whether the target application program is installed in the electronic device in which the identification device is deployed; when the target application program is installed in the electronic device, activate the target application program; when the target application program is not installed in the electronic device but a third application program is installed, activate the target application program through a target application program interface built into the third application program; or when the target application program is not installed in the electronic device, prompt the user to download the target application program.
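The three-way activation decision described above can be expressed as a small decision function. This is a sketch: the installed-app check and the third application's built-in interface are modeled abstractly, and all application names are invented for illustration.

```python
# Sketch of the three-way activation decision for a recognized target
# application program: activate directly, activate via a third application's
# built-in interface, or prompt the user to download.

def decide_activation(target, installed_apps, host_interfaces):
    """host_interfaces maps an installed third application program to the set
    of target application programs reachable through its built-in interface."""
    if target in installed_apps:
        return ("activate", target)
    for host, reachable in host_interfaces.items():
        if host in installed_apps and target in reachable:
            return ("activate_via", host)
    return ("prompt_download", target)
```

The ordering matters: direct activation is preferred, the third-application path is only tried when the target itself is absent, and the download prompt is the last resort.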
In a possible implementation, the signal processing device is specifically configured to periodically or in real time process the second data from the image sensor to obtain the second image.
In one possible implementation, the signal processing device comprises at least one of an image signal processor ISP or a sensor processor.
In a possible implementation, the coprocessor is specifically configured to recognize the first image or the second image using a small-sample (few-shot) image recognition algorithm.
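One common few-shot (small-sample) recognition approach that a coprocessor could use here is nearest-prototype matching: embed the image, average the handful of support embeddings available per class into a prototype, and assign the query to the closest prototype. The sketch below assumes embedding vectors are already available from a feature extractor; it is not the patent's specified algorithm.

```python
import math

# Nearest-prototype few-shot classification sketch (prototypical-network style).
# Embeddings are plain vectors here; a real system would obtain them from a
# neural feature extractor running on the coprocessor.

def prototype(vectors):
    """Mean of the few support embeddings available for one class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(query, support):
    """support: class name -> list of a few embedding vectors."""
    protos = {cls: prototype(vs) for cls, vs in support.items()}
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(protos, key=lambda cls: dist(query, protos[cls]))
```

The appeal for this use case is that a user-defined class (e.g. a few photos of one object taken from different angles) needs only those few samples, with no retraining of the feature extractor.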
In one possible implementation, the co-processor is at least one of a neural network processor, a digital signal processor, an image processing unit, or a hardware accelerator.
In a second aspect, an embodiment of the present application provides an identification device for an application program, including: the main processing module is used for receiving a first instruction triggered by a user, wherein the first instruction is used for indicating that a first image is associated with a first application program; the co-processing module is used for identifying the first image to obtain a first image characteristic and sending first information used for indicating the first image characteristic to the main processing module; the main processing module is further configured to associate the first image feature with the first application program and store the first information and the first application program in the database when it is determined that the database does not include the first information; wherein the database comprises information indicative of at least one image feature and an application associated with each image feature. The scheme supports user customization and establishes the association relationship between the image characteristics and the application program. In subsequent image recognition, if an image feature associated with an application program is recognized, the recognition device can automatically start the application program associated with the image feature, so that the operation process of a user is simplified, and the types of the application programs which can be activated based on the image recognition technology are expanded.
In a third aspect, an embodiment of the present application provides an electronic device, which includes an image sensor and the device described in any one of the possible implementation manners of the first aspect and the first aspect. Wherein the image sensor is used for periodically or real-time data acquisition.
In a fourth aspect, an embodiment of the present application further provides a computer storage medium, where a software program is stored in the storage medium, and at least a first portion of the software program, when read and executed by a coprocessor, may implement the function performed by the coprocessor in the first aspect or any one of the designs of the first aspect; alternatively, at least a second part of the software program, when read and executed by the host processor, may implement the functions performed by the host processor in the first aspect or any of the implementations of the first aspect described above.
In a fifth aspect, an embodiment of the present application further provides a computer program product which, when at least a first part thereof is run on a coprocessor, causes the coprocessor to perform the functions performed by the coprocessor in the first aspect or any one of the designs of the first aspect; or, when at least a second part thereof is run on the main processor, causes the main processor to perform the functions performed by the main processor in the first aspect or any one of the implementations of the first aspect.
Drawings
FIG. 1 is a schematic flow chart of an activation application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an apparatus for an application according to an embodiment of the present application;
FIG. 4 is a schematic view of an interactive interface provided in an embodiment of the present application;
FIG. 5 is a second schematic view of an interactive interface provided in the present embodiment;
FIG. 6a is a third schematic view of an interactive interface provided in the present application;
FIG. 6b is a fourth schematic view of an interactive interface provided in the present application;
FIG. 7a is a fifth schematic view of an interactive interface provided in the present application;
FIG. 7b is a sixth schematic view of an interactive interface provided in the present application;
fig. 8 is a schematic flowchart of an image registration process according to an embodiment of the present disclosure;
fig. 9 is a schematic flowchart of an AI image recognition technique according to an embodiment of the present application;
FIG. 10 is a flowchart illustrating a small sample image recognition technique according to an embodiment of the present application;
FIG. 11a is one of the captured images provided by the embodiments of the present application;
FIG. 11b is a schematic diagram of an application activated in accordance with the embodiment of the present application and corresponding to FIG. 11 a;
FIG. 12a is a second example of a captured image provided in an embodiment of the present application;
FIG. 12b is a schematic diagram of an application activated in accordance with the embodiment of the present application and corresponding to FIG. 12 a;
FIG. 12c is a schematic diagram of an application page activated in accordance with the embodiment of the present application and corresponding to FIG. 12 a;
FIG. 13a is a third example of an image taken according to the present disclosure;
FIG. 13b is a schematic diagram of an application associated with FIG. 13a according to an embodiment of the present application;
FIG. 14a is a fourth example of a collected image according to an embodiment of the present application;
FIG. 14b is a schematic diagram of an application page activated in accordance with the embodiment of the present application and corresponding to FIG. 14 a;
FIG. 15a is a fifth example of a captured image provided in an embodiment of the present application;
FIG. 15b is a schematic diagram of an application page activated in accordance with the embodiment of the present application and corresponding to FIG. 15 a;
fig. 16 is a schematic structural diagram of another device for application programs according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in further detail below with reference to the accompanying drawings. Various application programs installed in the electronic equipment can provide different aspects of convenience for users and solve different requirements of the users in different scenes. For example, a single-car class application is shared so that a user can conveniently use a public single car; the payment application program can enable a user to pay through the electronic equipment without carrying cash or a bank card and the like; the bus inquiry application program can facilitate the inquiry of bus routes and running conditions for users.
Because the process of opening an application program in the prior art is complicated, conventional methods for activating an application program have limitations. To simplify user operations and improve user experience and processing efficiency, embodiments of the present application provide an application program identification device and an electronic device. The identification device may be a circuit board, circuitry, chip, or chipset in an electronic device that runs the necessary software, including but not limited to driver software, operating system software, and application software, and is used to perform operations related to application activation. The electronic device is a mobile electronic device, such as a mobile terminal like a mobile phone or a tablet, and may or may not include communication capability; the communication capability may be mobile communication or short-range communication. When the electronic device does not have communication capability, it is equivalent to a movable local device. The electronic device needs to have shooting capability, such as a built-in camera, i.e., an image sensor and lens used for taking pictures. The electronic device includes, but is not limited to, a cell phone, a tablet computer, a personal digital assistant (PDA), a laptop computer, or a wearable device.
Taking the example where the electronic device is a mobile phone, fig. 2 shows a part of a possible structure of a mobile phone 200. As shown, the handset 200 includes: radio Frequency (RF) circuitry 210, memory 220, other input devices 230, touch screen display 240, sensors 250, audio circuitry 260, I/O subsystem 270, processor 280, and power supply 290. Those skilled in the art will appreciate that the handset configuration shown in fig. 2 is not intended to be limiting and may include more or fewer components than those shown, or may combine certain components, or split certain components, or arranged in different components. The following describes the components of the mobile phone 200 in detail with reference to fig. 2:
The RF circuit 210 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, it receives downlink information from a base station and delivers it to the processor 280 for processing, and transmits uplink data to the base station. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 210 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), etc.
The memory 220 may be used to store software programs and modules, and the processor 280 executes various functional applications and data processing of the mobile phone 200 by running the software programs and modules stored in the memory 220. The memory 220 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone 200 (such as audio data or a phonebook). Further, the memory 220 may include volatile memory or non-volatile memory. The memory 220 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Other input devices 230 may be used to receive entered numeric or character information and generate key signal inputs relating to user settings and function controls of cell phone 200. In particular, the other input devices 230 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), levers, and the like. In some embodiments of the present invention, other input devices may also include an image sensor (camera) for capturing images. The other input device 230 is connected to the other input device controller 271 of the I/O subsystem 270 and is in signal communication with the processor 280 under the control of the other input device controller 271.
The touch display screen 240 may be used to display information input by or provided to the user and various menus of the cellular phone 200, and may also accept user input. A particular touch screen display 240 may include a display panel 241, and a touch panel 242. The display panel 241 may be configured by a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), or the like. The touch panel 242, also referred to as a touch screen, a touch sensitive screen, etc., may collect contact or non-contact operations (e.g., operations performed by a user on or near the touch panel 242 using any suitable object or accessory such as a finger, a stylus, etc., and may also include body sensing operations; the operations include single-point control operations, multi-point control operations, etc.) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 242 may include two parts, i.e., a touch detection device and a touch controller. The touch detection device detects the touch direction and gesture of a user, detects signals brought by touch operation and transmits the signals to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into information that can be processed by the processor, sends the information to the processor 280, and receives and executes commands sent from the processor 280. In addition, the touch panel 242 may be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave, and the touch panel 242 may be implemented by any technology developed in the future. 
Further, the touch panel 242 may cover the display panel 241. A user may operate on or near the touch panel 242 covering the display panel 241 according to the content displayed on the display panel 241 (the displayed content includes, but is not limited to, a soft keyboard, a virtual mouse, virtual keys, icons, etc.); after detecting an operation on or near it, the touch panel 242 transmits the operation to the processor 280 through the I/O subsystem 270 to determine the user input, and the processor 280 then provides a corresponding visual output on the display panel 241 through the I/O subsystem 270 according to the user input. Although in fig. 2 the touch panel 242 and the display panel 241 are two independent components implementing the input and output functions of the mobile phone 200, in some embodiments the touch panel 242 and the display panel 241 may be integrated to implement the input and output functions of the mobile phone 200.
The handset 200 may also include at least one sensor 250, such as an acceleration sensor 251, a light sensor, and other sensors. Specifically, the acceleration sensor 251 can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the gesture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; the light sensors may include an ambient light sensor that adjusts the brightness of the touch screen 240 based on the intensity of ambient light, and a proximity sensor that turns off the screen and/or backlight when the cell phone 200 is moved to the ear. As for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone 200, further description is omitted here.
The audio circuit 260, speaker 261, and microphone 262 may provide an audio interface between a user and the handset 200. The audio circuit 260 may convert received audio data into an electrical signal and transmit it to the speaker 261, which converts it into a sound signal for output; on the other hand, the microphone 262 converts collected sound signals into electrical signals, which the audio circuit 260 receives and converts into audio data, and the audio data is then output to the RF circuit 210 for transmission to, for example, another cell phone, or to the memory 220 for further processing.
The I/O subsystem 270 controls external input and output devices, and may include other input device controllers 271, a sensor controller 272, and a display controller 273. Optionally, one or more other input device controllers 271 receive signals from and/or transmit signals to other input devices 230, which may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, cameras, etc.; the other input device controllers 271 may be connected to any one or more of the above devices. The display controller 273 in the I/O subsystem 270 receives signals from the touch display screen 240 and/or transmits signals to it. After the touch display screen 240 detects user input, the display controller 273 converts the detected user input into an interaction with the user interface object displayed on the touch display screen 240, i.e., realizes human-computer interaction. The sensor controller 272 may receive signals from one or more sensors 250 and/or send signals to one or more sensors 250.
The processor 280 is used as a control center of the mobile phone 200, connects various parts of the whole mobile phone by using various interfaces and lines, and performs various functions and processes of the mobile phone 200 by operating or executing software programs and/or modules stored in the memory 220 and calling data stored in the memory 220, thereby performing overall monitoring of the mobile phone. Alternatively, processor 280 may include one or more processing units, or one or more processors; preferably, the processor 280 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 280. The processor 280 may further include processing functional units mentioned in the subsequent embodiments.
The handset 200 also includes a power supply 290 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 280 via a power management system that manages charging, discharging, and power consumption. Although not shown, the mobile phone 200 may further include other modules such as a bluetooth module, which will not be described herein.
A schematic structural diagram of an application program identification device 300 according to an embodiment of the present application may be as shown in fig. 3, and includes a main processor 301 and a coprocessor 302.
The main processor 301 is configured to receive a first instruction triggered by a user, where the first instruction is used to instruct that a first image is associated with a first application.
In an optional embodiment, the apparatus 300 further includes a signal processing apparatus 303; the signal processing device 303 is configured to process the first data from the image sensor to obtain a first image. Alternatively, the user may trigger an image sensor in the electronic device to acquire the first data. In specific implementation, before receiving a first instruction triggered by a user, the main processor 301 is further configured to receive a shooting instruction triggered by the user, where the shooting instruction is used to instruct to shoot a first image, control an image sensor in the electronic device to acquire first data and transmit the first data to the signal processing device 303, and the signal processing device 303 processes the first data to obtain the first image. In another alternative embodiment, the first image is an image designated (selected) by a user in at least one preset image. Specifically, the first image may be image data pre-stored by the user in the electronic device in which the device 300 is located, for example, an image selected in a gallery on the electronic device. Optionally, the first image may be one or more images, for example, the user may trigger to take multiple images from different angles of the same object, or one or more images may be selected from a gallery on the electronic device, which is not limited herein.
The coprocessor 302 is configured to recognize the first image to obtain a first image feature and to send first information indicating the first image feature to the main processor 301. The first information comprises the first image feature and/or an identification of the first image feature. The first image feature includes a target object in the first image and related parameters of the target object such as color, size, and shape. Optionally, when the coprocessor 302 processes the first image, it may recognize the first image using an artificial intelligence (AI) image recognition algorithm to obtain the corresponding first image feature; alternatively, the first image may be recognized using a small-sample (few-shot learning) image recognition algorithm.
The main processor 301 is further configured to associate the first image feature with the first application program when the database does not include the first information, and store the first information and the first application program in the database; wherein the database comprises information indicative of at least one image feature and an application associated with each image feature. Associating the first image feature with the first application also associates the first image with the first application.
For example, suppose that the information indicating any image feature in the database includes an identifier of the image feature together with the image feature itself, and that the database stores the identifier "21", the image feature "yellow shared bicycle", and the associated application "ofo". If the first information that the main processor 301 receives from the coprocessor 302 includes the identifier "11" and the image feature "black guitar", and the main processor determines, based on the first instruction triggered by the user, that the first application selected by the user is "tuner APP", then "11", "black guitar" and "tuner APP" may be associated and saved in the database.
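The association step described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the dictionary layout and the function name `register_association` are assumptions made for the example.

```python
# Toy "database": identifier -> (image feature, associated application),
# seeded with the preset entry from the text.
database = {
    "21": ("yellow shared bicycle", "ofo"),
}

def register_association(db, identifier, image_feature, application):
    """Associate an image feature with an application (the role of the
    main processor 301), only if the identifier is not yet registered."""
    if identifier in db:
        return False  # already registered; the user would be prompted instead
    db[identifier] = (image_feature, application)
    return True

# The guitar example from the text: "11" / "black guitar" / "tuner APP".
register_association(database, "11", "black guitar", "tuner APP")
```

Applying the same call with the existing identifier "21" returns False, which models the "feature already registered" branch of the flow.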
Optionally, the content of the database is divided into a preset part and a user-defined part. That is, some information indicating image features, together with the applications associated with it, is pre-configured in the database, while other such information is customized by the user and stored in the database, for example through the procedure described above for storing the first information and the first application. For differentiation, an application preset in the database and associated with a given image feature may be referred to as a default application, and the preset part and the user-defined part may be stored in the database as two tables.
In this embodiment of the application, user customization is supported: the user can establish the association relationship between image features and applications, which expands the range of applications that can be activated based on image recognition. During subsequent image recognition, if an image feature associated with an application is recognized, the identification device can automatically start that application, which simplifies user operation, improves user experience, and increases processing efficiency.
During specific implementation, an interactive interface through which the user triggers the first instruction may be provided on the electronic device. Illustratively, referring to the interactive interfaces on the mobile phone shown in fig. 4 to fig. 7b, the user performs related operations on the mobile phone to trigger the first instruction, so as to associate the first image with the first application.
As shown in fig. 4, the function "register image" may be deployed in the settings of the mobile phone; alternatively, the function may be deployed on the mobile phone as independent software. After the user taps "register image", the phone jumps to an interface prompting the user to enter an image, as shown in fig. 5. If the user selects "take a picture", the phone jumps to the shooting interface shown in fig. 6a for the user to take one or more pictures; if the user selects "open gallery", the phone jumps to the gallery interface shown in fig. 6b for the user to select images stored in the gallery (fig. 6b specifically shows the user selecting 5 images). After the user finishes entering the image, a coprocessor in the mobile phone identifies the image feature of the entered image. If the image feature is already associated with an application, the coprocessor determines that the image feature is registered and prompts the user accordingly; as shown in fig. 7a, the phone pops up the prompt "An object in the image is registered with the XX application; select a new image or delete the association between the image and the XX application". Otherwise, if the image feature is not associated with any application, the phone prompts the user that the image can be registered and pops up an application list, as shown in fig. 7b; if the user selects application 2, the association between the image feature of the entered image and application 2 is saved.
Referring to fig. 8, an embodiment of the present application provides a flowchart of registering an image, corresponding to the interactive interfaces shown in fig. 4 to fig. 7b, which includes the following steps. S1, in response to a user operation, open the image registration function pre-deployed in the electronic device; for example, as shown in fig. 4, the user taps "register image". S2, prompt the user to enter an image, and acquire the image the user triggers the device to shoot or the stored image the user selects. S3, determine whether the image feature of the acquired image is registered, that is, whether the image feature of the acquired image is associated with an application. If so, execute S4; if not, execute S5.
S4, prompt the user that the image feature is already registered, so that the user can re-enter an image or delete the association between the image feature and its associated application. S5, prompt the user that the image can be registered. S6, associate the image feature of the acquired image with the application designated by the user, completing registration.
Alternatively, determining the user-specified application may be implemented in either of two ways. Method one: after S2 and before S3, present the user with a list of applications so that the user can select in advance the application to be associated with the entered image. Method two: after S5 and before S6, present the user with a list of applications so that the user can select the application to be associated with the entered image.
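Steps S2 to S6 of the registration flow can be sketched as below. This is a hedged illustration only: the function name `register_image` and the callback-based interface (`choose_app` for the application list, `prompt` for on-screen messages) are assumptions, not the patent's design.

```python
def register_image(db, image_feature, choose_app, prompt):
    """Sketch of steps S2-S6: check whether the feature is registered (S3);
    if so, prompt the user (S4); otherwise let the user pick an application
    from a list (S5-S6) and save the association."""
    if image_feature in db:                                        # S3
        prompt(f"'{image_feature}' is already registered to {db[image_feature]}")
        return None                                                # S4: re-enter or delete
    prompt("The image can be registered")                          # S5
    app = choose_app()                   # user selects from the application list
    db[image_feature] = app                                        # S6
    return app
```

For instance, registering a yoga mat against the fitness application "Keep" succeeds on the first call and falls into the S4 branch on a second attempt with the same feature.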
Optionally, when the image shot or selected by the user does not contain a significant feature, for example when the image contains no target object, or mostly contains background such as a wall, floor or ceiling and contains few features of a target object, the user is prompted to shoot or select an image with a significant feature, and that image is used as the first image. Requiring the image shot or selected by the user to have significant features helps avoid erroneous identification being triggered later.
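One crude way to approximate this significance check is an intensity-variance heuristic: a frame dominated by a flat background (wall, ceiling, floor) has low pixel variance. The patent does not specify the check, so the function below and its threshold are purely illustrative assumptions.

```python
def has_salient_feature(pixels, variance_threshold=100.0):
    """Illustrative stand-in for the saliency check: reject images whose
    grayscale intensity variance falls below a threshold, as happens when
    the frame is mostly uniform background."""
    n = len(pixels)
    mean = sum(pixels) / n
    variance = sum((p - mean) ** 2 for p in pixels) / n
    return variance >= variance_threshold
```

A real implementation would more likely rely on the detector itself (e.g. "no target object found" or "object region too small"), but the threshold idea is the same.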
Further, the signal processing device 303 is further configured to process second data from the image sensor to obtain a second image. The coprocessor 302 is further configured to identify the second image to obtain a second image feature, and send second information to the host processor when it is determined that the database includes the second information indicating the second image feature. The main processor 301 is further configured to determine an application program associated with the second image feature in the database as a target application program to be activated.
Optionally, when processing the second image, the coprocessor 302 may identify it using an artificial intelligence (AI) image recognition algorithm to obtain the corresponding second image feature; alternatively, the second image may be identified using a few-shot learning image recognition algorithm.
The flow of the AI image recognition technique may be as shown in fig. 9 and includes a training process and a recognition process. In the training process, a large number of images to be trained are obtained as training samples; the sample images are then preprocessed and subjected to feature extraction to obtain an image training model, and the model is processed by a deep learning engine, for example engine software of the coprocessor 302, to obtain a trained classifier. For example, if the feature to be trained is a yellow shared bicycle, images of a large number of yellow shared bicycles are first acquired, possibly photographed from different angles and in different scenes. Each image is then preprocessed, for example by interpolation, denoising, or sharpening, and the preprocessed images undergo feature extraction, that is, the features of the yellow shared bicycle in each image are extracted. Training is then performed on these extracted features to obtain a training model of the yellow shared bicycle, and a classifier for the yellow shared bicycle is trained from the resulting model. The training models and classifiers may exist as software models and be stored in the memory of the electronic device for subsequent invocation by the coprocessor 302.
In the recognition process, an image to be recognized (such as the second image) is first obtained, preprocessed, and subjected to feature extraction. Here, preprocessing includes discarding images that contain no significant features: for example, if background such as a ceiling, wall, or floor occupies a large proportion of the image while features related to a target object occupy a small proportion or are absent, the image is discarded, which reduces the amount of computation the coprocessor 302 spends on feature identification and further improves processing efficiency. Dimension reduction is then performed on the extracted features, which are matched against the training model and classifier obtained in the training process to produce a recognition result. The training process may be completed offline by a server, after which the trained model and classifier are sent to the electronic device and stored in its memory; the coprocessor 302 in the electronic device then identifies the acquired second image to obtain the second image feature.
The few-shot image recognition technique may be as shown in fig. 10 and includes an offline training process, an online training process, and a recognition process. In the offline training process, a large number of images to be trained are obtained as training samples; the sample images are preprocessed and subjected to feature extraction to obtain an image training model with meta-knowledge, and the model is processed by a deep learning engine, such as engine software of the coprocessor 302, to obtain a trained classifier. In the online training process, a small number of user-personalized images are obtained; these are preprocessed and subjected to feature extraction to obtain a personalized image training model, which is processed by the deep learning engine to obtain a classifier. In the recognition process, an image to be recognized (such as the second image) is first obtained, preprocessed, and subjected to feature extraction; as above, preprocessing includes discarding images without significant features, for example images dominated by background such as a ceiling, wall, or floor, which reduces the computation the coprocessor 302 spends on feature identification and further improves processing efficiency. The extracted features are then matched against the image recognition model and classifier obtained in the online training process to produce a recognition result.
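A common way to realize the matching step of such a few-shot pipeline is nearest-prototype classification: the offline model supplies an embedding function, the online step averages the few personalized samples per class into prototypes, and a query is assigned to the most similar prototype. The patent does not name a specific algorithm, so the sketch below (function name, cosine-similarity metric, threshold) is an illustrative assumption.

```python
def classify_few_shot(embed, prototypes, image, threshold=0.8):
    """Match a query image against per-class prototypes by cosine
    similarity; return the best class, or None if nothing is close enough
    (the 'feature not in database' case)."""
    query = embed(image)
    best_label, best_score = None, -1.0
    for label, proto in prototypes.items():
        dot = sum(q * p for q, p in zip(query, proto))
        nq = sum(q * q for q in query) ** 0.5
        np_ = sum(p * p for p in proto) ** 0.5
        score = dot / (nq * np_)          # cosine similarity
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None
```

With an identity embedding and two toy prototypes, a query near one prototype is classified, while an ambiguous query falls below the threshold and is rejected.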
If the matching succeeds, the coprocessor has identified the image feature corresponding to the obtained image; for example, the coprocessor identifies the second image to obtain the second image feature.
An alternative embodiment of the above is illustrated below. Table 1 shows the association relationships, customized by the user, between image features and applications; table 2 shows default applications corresponding to certain image features stored in the database, that is, associations not customized by the user that are present in the database before the user makes any selection.
TABLE 1

| Identifier | Image feature | Application |
| --- | --- | --- |
| 11 | Black guitar | Tuner APP |
| 12 | Silver drum set | Metronome APP |
| 13 | Air conditioner | Remote controller APP |
| 14 | Television | Remote controller APP |
| 15 | Car steering wheel | Navigation APP |
| 16 | Yoga mat | Keep |
| 17 | Textbook/teaching material | Learning APP |
| 18 | Orange shared bicycle | Mobike |
| ... | ... | ... |
TABLE 2

| Identifier | Image feature | Default application |
| --- | --- | --- |
| 21 | Yellow shared bicycle | ofo |
| 22 | Blue shared bicycle | Small Blue Bicycle |
| 23 | Subway gate | Easy Pass |
| 24 | Payment QR code | Alipay (or WeChat) |
| 25 | Code-scanning gun | Alipay (or WeChat) |
| 26 | Bus stop board | Bus Coming |
| ... | ... | ... |
For example, the user-defined "Mobike" application has an association relationship with the orange shared bicycle. If the second image acquired by the coprocessor 302 is as shown in fig. 11a, the coprocessor 302 recognizes that an orange shared bicycle is present in the second image, that is, it determines that the second image feature is an orange shared bicycle, and determines from table 1 that the application associated with this feature is "Mobike". The coprocessor 302 may then notify the main processor 301 of the identifier "18"; the main processor 301 determines that the application corresponding to "18" is "Mobike" and further determines "Mobike" as the target application to be activated. If the device 300 is in an electronic device on which the "Mobike" application is installed, the main processor 301 activates the "Mobike" application, and the content the electronic device displays to the user is as shown in fig. 11b. If the "Mobike" application is not installed in the electronic device, the main processor 301 may prompt the user through the display screen to ask whether to download it.
For another example, in a scenario where a code-scanning gun is present, the user likely needs to make a payment, so the "Alipay" application, which can make payments, is set as the default application corresponding to the code-scanning gun. If the image acquired by the coprocessor 302 is as shown in fig. 12a, the coprocessor 302 recognizes that the image feature of the acquired second image is a code-scanning gun, determines from table 1 that the user has not customized an application for this feature, and determines from table 2 that the corresponding default application is "Alipay". The coprocessor 302 may then notify the main processor 301 of the identifier "25"; the main processor 301 determines that the application corresponding to "25" is "Alipay" and further determines "Alipay" as the target application to be activated. If the device 300 is in an electronic device on which "Alipay" is installed, the main processor 301 activates the "Alipay" application, and the content the electronic device displays to the user is as shown in fig. 12b. If "Alipay" is not installed in the electronic device, the main processor may prompt the user through the display screen to ask whether to download it.
It should be understood that the above tables 1 and 2 are only examples; the user-customized associations between image features and applications may include, but are not limited to, the contents shown in table 1, and the preset image features and corresponding default applications in the database may include, but are not limited to, those shown in table 2. In addition, the "identifier" column is optional. If it is present, communication between the coprocessor 302 and the main processor 301 can be simplified, since the coprocessor 302 only needs to notify the main processor of the determined identifier. If it is absent, the coprocessor 302 may notify the main processor of the application to be activated in other ways, for example by sending the application's name or the image feature information to the main processor 301.
In addition, the same image feature characterized in table 2 may also correspond to multiple default applications in the database. For example, if the image feature identified by the coprocessor 302 is "payment QR code" or "code-scanning gun", the user likely needs to pay, and both the "Alipay" application and the "WeChat" application can make payments. In this case, in an alternative embodiment, both applications may serve as candidate default applications for the same image feature, with different priorities set for them. If "Alipay" has the higher priority, the main processor 301 preferentially activates the "Alipay" application when it is installed in the electronic device, and activates the "WeChat" application when "Alipay" is not installed. The priority may be set when the correspondence between the image feature and the default applications is determined, or set by the user after the electronic device obtains that correspondence. In another alternative embodiment, the main processor 301 may prompt the user, through the display screen of the electronic device, to select one of "Alipay" and "WeChat", and record the user's selection to determine the priority from it. For example, if the user selects "Alipay" more often, the "Alipay" application is activated preferentially, and the "WeChat" application is activated if "Alipay" is not installed.
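The priority scheme above reduces to walking an ordered candidate list and launching the first installed entry. The helper below is an illustrative sketch of that rule; the function name and the app names in the test are examples only.

```python
def pick_default_app(candidates, installed):
    """Choose among multiple default applications for one image feature:
    walk the candidates in priority order (highest first) and return the
    first one installed on the device; None means 'prompt a download'."""
    for app in candidates:
        if app in installed:
            return app
    return None
```

Recording the user's past choices, as the text suggests, amounts to reordering `candidates` before this function is called.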
Further, in an optional implementation, the coprocessor 302 is further configured to perform semantic analysis on the second image or the second image feature to obtain a semantic tag when the database does not include second information indicating the second image feature, where the semantic tag comprises text information characterizing the second image or the second image feature. The main processor 301 is further configured to receive the semantic tag from the coprocessor and determine at least one application associated with the semantic tag; then indicate the at least one associated application to the user, receive a second instruction triggered by the user, where the second instruction instructs activation of a second application among the at least one application, and determine the second application as the target application to be activated.
Specifically, the semantic tag may be the type of the target object in the second image. For example, fig. 13a illustrates an obtained second image containing a piano; the semantic tag obtained through semantic analysis may be "musical instrument". The main processor 301 may search for at least one application related to musical instruments according to program description information of the applications in the electronic device, such as function introductions, development descriptions, and software classifications. For example, considering that the user may want to record sound while playing the piano, the instrument-related applications may include a recorder with a recording function; the user may want to search for a song, so they may include a music player that can search for and play songs; and the user may want to learn to play the piano online, so they may include learning software for playing instruments. A candidate list of the at least one application is then displayed to the user on the display screen of the electronic device, from which the user selects the application to be activated. Illustratively, as shown in fig. 13b, the applications related to "musical instrument" presented to the user include a recorder, a music player, learning software, and the like, and the user selects the music player as the application to be activated. Optionally, the user's selections may also be recorded, and the priorities of the applications related to the semantic tag determined from those selections; the at least one application in the candidate list may then be presented in the determined priority order.
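The fallback lookup from a semantic tag to candidate applications can be sketched as a substring search over the applications' description information. This is a minimal sketch under stated assumptions: the function name and the description strings are hypothetical, and a real system would use richer matching than substring containment.

```python
def apps_for_semantic_tag(tag, app_descriptions):
    """When no registered image feature matches, return every application
    whose description text mentions the semantic tag."""
    return [app for app, desc in app_descriptions.items() if tag in desc]

# Hypothetical description strings for the piano example in the text.
descriptions = {
    "recorder": "record sound while playing a musical instrument",
    "music player": "search and play songs; musical instrument covers",
    "learning software": "learn to play a musical instrument online",
    "navigation APP": "driving directions and route planning",
}
```

For the tag "musical instrument", this yields the recorder, music player, and learning software of fig. 13b while excluding unrelated applications.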
Further, after determining the target application, the main processor 301 is further configured to determine whether the target application is installed in the electronic device in which the device 300 is located, and to activate the target application if it is installed. Specifically, if the target application is installed and the main processor 301 determines that it is not currently running, the main processor 301 may run the target application in the foreground; or, when the main processor 301 determines that the target application is running in the background, it may switch the target application from the background to the foreground, that is, display the target application's user interface on the display screen of the electronic device so that the user can operate it. If the target application is not installed in the electronic device, the main processor 301 may ask, through the display screen, whether to download it. Further, if the target application is not installed but a third application that has a built-in interface of the target application is installed, the main processor 301 may also activate the target application through that built-in interface. In this device, the coprocessor acquires an image from the signal processing device, identifies the image, and determines the target application corresponding to it; the main processor then activates the target application or prompts the user to download it.
Therefore, the device can judge more accurately, from the image provided by the image sensor, which application the user is likely to want, and activate it without the user having to tap an application icon, which simplifies user operation. In addition, the coprocessor extracts the image features and determines the application to be activated without requiring the main processor to perform these operations, which yields high processing efficiency and saves power.
For example, installation-free applications, also called applets, are gradually gaining favor with users thanks to their advantages of running without installation (tap and play) and freeing storage space after closing. Specifically, an interface of application B is built into application A, and the user can tap the icon of application B inside application A, causing the electronic device to load and run application B's program package. Therefore, if the main processor 301 determines to activate application B, it may activate application B through application A even when application B is not installed in the electronic device, avoiding an increased storage burden on the device. Optionally, if the electronic device has both the target application and a third application with the target application's built-in interface installed, priorities may also be set for the different activation modes. For example, considering that the package-size limits on applets may make an applet less complete in function than a normally installed application, direct activation of the target application may be given high priority and activation through the third application low priority.
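The activation decision described over the last two paragraphs — direct launch first, applet host second, download prompt last — can be condensed into one dispatch function. This is an illustrative sketch; the function name and the `applet_hosts` mapping (target application to host application with its built-in interface) are assumptions for the example.

```python
def activate(target, installed, applet_hosts):
    """Decide how to activate the target application: directly if it is
    installed (high priority); via a host application that embeds the
    target's interface (applet, low priority); otherwise prompt the user
    to download it."""
    if target in installed:
        return ("direct", target)
    host = applet_hosts.get(target)
    if host in installed:
        return ("applet", host)
    return ("prompt_download", target)
```

Using the text's example: if application B is not installed but application A (which embeds B's interface) is, activation goes through A.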
In one possible implementation, the device 300 may further include a transceiver 304 configured to receive part or all of the contents of the database from a server. As mentioned above, the model training process may be completed by the server, which then transmits the trained model to the electronic device; the electronic device may receive, through the transceiver 304 in the device 300, the aforementioned at least one image feature and the default application corresponding to each image feature as transmitted by the server. In particular, the transceiver 304 may be a wired transceiver, receiving the contents of the database over a wired link, or a wireless transceiver, receiving them over a wireless link, such as the RF circuitry in the handset 200.
In another possible implementation, the training process may be completed by the provider of each application, with the trained model carried in the application's installation package. When the electronic device downloads and installs the application, the coprocessor 302 obtains the image feature corresponding to that application, so that whenever the coprocessor 302 identifies from an image a feature present in the database, the corresponding application is already installed on the electronic device.
To further improve the accuracy of the activated application, the correspondence between image features and applications stored in the database may further include location information. In one embodiment, the stored image features and the application corresponding to each image feature may be as shown in table 3.
TABLE 3

| Identifier | Image feature | Location | Application |
| --- | --- | --- | --- |
| 21 | Yellow shared bicycle | - | ofo |
| 22 | Blue shared bicycle | - | Small Blue Bicycle |
| 23 | Subway gate | Beijing | Easy Pass |
| 23 | Subway gate | Shanghai | Metro Metropolis |
| 24 | Payment QR code | - | Alipay (or WeChat) |
| 25 | Code-scanning gun | - | Alipay (or WeChat) |
| 26 | Bus stop board | Shanghai | Shanghai Bus Inquiry |
| 26 | Bus stop board | Beijing | Beijing Real-time Bus |
For example, when users ride the subway using a QR code, the "Easy Pass" application is currently used in Beijing while the "Metro Metropolis" application is currently used in Shanghai. Therefore, if the user wants to swipe a QR code to enter a subway station in Beijing but the electronic device opens the "Metro Metropolis" application, no service can be provided to the user.
For another example, many current bus applications can provide the user with a bus route's running information and the bus's current position, making it convenient to catch a bus or estimate waiting time. However, each bus application has limitations: the "Bus Coming" application can query the bus routes of essentially every city but cannot query the position of buses in every city, while in Beijing the "Beijing Real-time Bus" application can query the position of buses on a given route but cannot query bus information for other cities. Therefore, matching against the electronic device's location information improves the accuracy of activating the application. The location information may be obtained from a positioning device in the electronic device, such as a Global Positioning System (GPS) or BeiDou receiver.
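The location-aware lookup over rows like those of Table 3 can be sketched as follows, under the assumption (illustrative, not stated in the patent) that a "-" location means the entry matches anywhere.

```python
def lookup_app(db, image_feature, city):
    """Return the application registered for an image feature, preferring
    a row whose location matches the device's city; a '-' row matches any
    location. None means no applicable entry."""
    for feature, location, app in db:
        if feature == image_feature and location in ("-", city):
            return app
    return None

# Rows echoing part of Table 3.
table3 = [
    ("subway gate", "Beijing", "Easy Pass"),
    ("subway gate", "Shanghai", "Metro Metropolis"),
    ("bus stop board", "Beijing", "Beijing Real-time Bus"),
    ("yellow shared bicycle", "-", "ofo"),
]
```

So a subway gate recognized in Shanghai selects "Metro Metropolis" rather than the Beijing-only "Easy Pass", matching the scenario in the text.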
To further enhance the user experience, after the main processor 301 activates the determined target application, it may further open the page the user requires. In one embodiment, after completing registration of the user-defined association between an image feature and an application, the user may further customize the application's activation page. Accordingly, the stored image features and the application corresponding to each image feature may further include activation-page information, as shown in table 4 below.
TABLE 4

| Identifier | Image feature | Application | Page |
| --- | --- | --- | --- |
| 21 | Yellow shared bicycle | ofo | Code-scanning unlock page |
| 22 | Blue shared bicycle | Small Blue Bicycle | Code-scanning unlock page |
| 23 | Subway gate | Easy Pass | Scan page |
| 24 | Payment QR code | Alipay (or WeChat) | Scan page |
| 25 | Code-scanning gun | Alipay (or WeChat) | Payment QR code page |
| 26 | Bus stop board | Bus Coming | Route page |
The following description takes the case where the image features of the acquired image have not been associated with an application by user customization, but a corresponding default application can be found in the database. For example, since a scene containing a "barcode scanner" or a "payment QR code" is usually one in which the user needs to pay, the application corresponding to the "barcode scanner" or the "payment QR code" is "Alipay", which can be used for payment; that is, when the coprocessor 302 recognizes the image feature of the acquired image (such as the aforementioned second image) as a "barcode scanner" or a "payment QR code", the main processor 301 activates the "Alipay" application. If activating the "Alipay" application switches it from background to foreground operation, the page currently displayed to the user is the page the user used last time, which may not be the payment page the user needs; if activating the "Alipay" application starts an application that was not running, the page currently displayed to the user is the application's home page. However, the user needs the "Alipay" application to make a payment, that is, the user needs a payment page within the "Alipay" application; for the user's convenience, this embodiment may be further improved as follows. If the image acquired by the coprocessor 302 is as shown in fig. 12a and the recognized image feature is "barcode scanner", the user probably needs the "payment QR code" page, so the main processor 301 may further open a "payment QR code" page for the user after activating the "Alipay" application; at this time, the page displayed by the electronic device is as shown in fig. 12c. If the image acquired by the coprocessor 302 is as shown in fig. 14a and the recognized image feature is "payment QR code", the main processor 301 may further open a "scan" page for the user after activating the "Alipay" application, as shown in fig. 14b. This embodiment further simplifies user operation and improves the user experience.
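The page routing described above — an image feature selecting not just an application but a specific page inside it — can be sketched as a simple lookup table. All names and the table schema here are illustrative assumptions, not the patent's actual data structures:

```python
# Sketch of the feature -> (application, in-app page) routing described above.
# Entries and names are illustrative; the patent does not specify a schema.
DEFAULT_APP_TABLE = {
    # image feature       (target application, page to open after activation)
    "barcode_scanner":    ("Alipay", "payment_qr_code_page"),
    "payment_qr_code":    ("Alipay", "scan_page"),
    "bus_stop_sign":      ("Beijing Real-time Bus", "route_page"),
}

def route(image_feature):
    """Return (app, page) for a recognized feature, or None on a database miss."""
    return DEFAULT_APP_TABLE.get(image_feature)

def activate(image_feature):
    """Activate the default application and open the page the user likely needs,
    rather than leaving the user on the app's home or last-used page."""
    entry = route(image_feature)
    if entry is None:
        return None  # no default application; other handling applies
    app, page = entry
    return f"open {app} at {page}"
```

A database miss (`route` returning `None`) leaves the decision to the other mechanisms the embodiments describe, such as user customization.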
In some scenarios, the user's requirement may be to use a certain application for an information query, so the main processor 301 may further perform a search based on information extracted from the image after activating the corresponding application. For example, as shown in fig. 15a, the image acquired by the coprocessor 302 has the recognized image feature "bus stop sign", and the number "646" is extracted from the image; the main processor 301 receives, from the coprocessor 302, the identifier of the target application corresponding to "bus stop sign" together with the extracted feature information, that is, the identifier of the "Beijing Real-time Bus" application and the extracted number "646". The main processor 301 activates the "Beijing Real-time Bus" application and searches for "646" within it, so that information on the current 646 bus is displayed to the user, as shown in fig. 15b.
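The activate-then-search flow just described (an application identifier plus extracted text such as "646") might be sketched as below; the function and parameter names are illustrative assumptions, not the patent's interfaces:

```python
def activate_and_search(feature, extracted_text, app_table):
    """Sketch: the main processor receives (target app id, extracted info)
    derived from the coprocessor's recognition, activates the app, and runs
    an in-app search with the extracted text as the query."""
    app = app_table.get(feature)
    if app is None:
        return None  # no application associated with this feature
    # A real implementation would launch the app and invoke its search;
    # here we just return what would be executed.
    return {"app": app, "query": extracted_text}
```

For the bus-stop example, `activate_and_search("bus_stop_sign", "646", table)` would activate the bus application and search for route 646.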
As previously mentioned, the processor 280 in the handset 200 may comprise one or more processors; therefore, when the device 300 is used in the handset 200 as described above, the main processor 301 and the coprocessor 302 correspond to different processors within the processor 280. The signal processing device 303 is coupled to the image sensor and to the other-input-device controller 271 in the handset 200, respectively, so that data can be retrieved from the image sensor and the coprocessor 302 can obtain images. The signal processing device 303 may also be one of the processors in the processor 280. Alternatively, the database may be stored in a memory of the electronic device. In addition, the memory may further store the software required for the main processor 301 and the coprocessor 302 to run, such as driver software, operating system software, or application software.
In one possible design, the device 300 may be one or more chips; for example, the main processor 301, the coprocessor 302 and the signal processing device 303 are integrated into one chip, and the chip is disposed in an electronic device. The electronic device provided with the chip is also provided with the above-described image sensor, although the image sensor need not be integrated on the same chip as the device 300.
The main processor 301 may be, for example, a Central Processing Unit (CPU), an application processor, or a microprocessor of the electronic device, and is located in the processor 280. The power consumption of the main processor 301 in normal operation is greater than that of the coprocessor 302 in normal operation; therefore, the coprocessor 302 processes the acquired image and notifies the main processor to execute the operation of activating an application only when it determines that an application needs to be activated.
Alternatively, the main processor 301 may be in a low power consumption state before receiving information indicating image features (such as the first information or second information) or a semantic tag from the coprocessor 302; the power consumption of the main processor 301 in the low power consumption state is lower than the power consumption of the coprocessor 302 when running the processing program. For example, the low power consumption state may be an idle state, a sleep state, or a hibernation state. The main processor 301 may be awakened by the coprocessor 302 through receiving the information indicating the image feature or the semantic tag, or the main processor 301 may be awakened by the coprocessor 302 in another manner, such as an interrupt notification, and then receive the information or semantic tag; this embodiment does not limit this.
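The wake-on-notify behaviour above — the main processor waiting in a low-power state until the coprocessor delivers recognition results — can be modelled in user-space code with an event object. This is only an analogy for the hardware interrupt/wake-up mechanism; class and method names are invented for illustration:

```python
import threading

class MainProcessor:
    """Sketch of wake-on-notify: the main processor blocks (no busy polling,
    analogous to a low-power wait state) until the coprocessor delivers
    information indicating an image feature or a semantic tag."""

    def __init__(self):
        self._wake = threading.Event()
        self.inbox = []  # messages delivered by the coprocessor

    def sleep_until_notified(self, timeout=None):
        # Returns True once notify() has been called (i.e., once "awakened").
        return self._wake.wait(timeout)

    def notify(self, info):
        # Called by the coprocessor; behaves like an interrupt/wake-up.
        self.inbox.append(info)
        self._wake.set()
```

In the patent's division of labour, only `notify` carrying a recognized feature (or a semantic tag) ends the main processor's low-power wait.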
The coprocessor 302 may be a neural network processor (e.g., an NPU), a digital signal processor, an image processing unit, or a hardware accelerator, each of which can operate with lower power consumption than the main processor 301. The coprocessor 302 may be internal or external to the processor 280; this is not limited here. Since the power consumption of the coprocessor 302 is low, setting it as an always-on processor does not greatly affect the power consumption of the electronic device. To further improve the user experience, the above embodiments of the present application may be applied while the electronic device's screen is off or locked. Normally, when the electronic device is in the screen-off state, the main processor 301 is in the sleep state to reduce power consumption; the coprocessor, owing to its lower power consumption, can remain in a working state even when the electronic device is locked or its screen is off, without consuming much additional power.
The signal processing device 303 may be an ISP configured to process data acquired by the image sensor, for example performing functions such as Automatic Exposure Control (AEC), Automatic Gain Control (AGC), Automatic White Balance (AWB), color correction, lens shading correction, gamma correction, and dead-pixel removal. Alternatively, the signal processing device 303 may be a sensor-signal processor with functions similar to those of an ISP, processing the data collected by the image sensor. The ISP may be located inside or outside the processor 280. Optionally, a shared memory may be deployed in the device 300 to store the data collected by the image sensor. The signal processing device 303 may also be a smart sensor hub (Sensor hub) configured to acquire the image sensor's data from the shared memory and transmit it to the coprocessor 302, so that the coprocessor 302 performs image recognition and then sends the recognition result to the main processor 301.
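Of the ISP steps named above, gamma correction is easy to illustrate in isolation. The following is a minimal sketch of per-pixel gamma correction on an 8-bit value, assuming a simple power-law curve; a real ISP performs this (together with AEC/AGC/AWB, lens-shading correction, and dead-pixel removal) in dedicated hardware:

```python
def gamma_correct(pixel, gamma=2.2, max_val=255):
    """Apply a power-law gamma curve to one 8-bit pixel value.
    gamma=2.2 is a common display assumption, not a value from the patent."""
    normalized = pixel / max_val          # 0..1
    corrected = normalized ** (1.0 / gamma)
    return round(corrected * max_val)     # back to 0..255
```

Black and white points are preserved, while mid-tones are brightened, which is the characteristic effect of encoding with gamma less than 1 in the exponent.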
In one possible design, the image sensor may be actively triggered by the user to acquire data. In another possible design, the image sensor may be always on to collect data in real time; alternatively, the image sensor may collect data according to a preset period, that is, data collection is performed automatically at intervals. The camera or lens carrying the image sensor can thus be considered to be in an always-on state, so that the surrounding environment can be recognized in real time and the application activated in time, improving the user experience. Optionally, the camera or lens carrying the image sensor may be a low-power camera. The process of acquiring data by the image sensor then does not need to be triggered by the user each time, but is carried out automatically by the device. The preset period may be a fixed period, or multiple periods may be set for the image sensor, with different periods adopted for different environments. For example, if the ambient light is strong, a user requirement may arise at any time, and a preset period of 0.1 s may be adopted; if the ambient light is dark, a user requirement is less likely, and a preset period of 1 s may be adopted. In one possible design, the signal processing device 303 may also process the acquired data in real time or periodically and pass the processed data to the coprocessor 302 for further recognition. If the image sensor and the signal processing device 303 are each configured to acquire or process data according to a preset period, they may be configured with the same period or with different periods.
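The adaptive schedule above — sample often in bright scenes where a user need may arise at any time, rarely in dark ones — reduces to a small policy function. The 0.1 s and 1 s values come from the text; the lux threshold is an invented assumption:

```python
def capture_period(ambient_lux, bright_threshold=50.0):
    """Choose the image sensor's sampling period from ambient brightness.
    0.1 s / 1 s follow the example in the text; the 50-lux threshold is
    an illustrative assumption, not a value given in the patent."""
    return 0.1 if ambient_lux >= bright_threshold else 1.0
```

An implementation could also expose more than two tiers, since the text allows multiple preset periods for different environments.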
Further, an embodiment of the present application also provides a computer storage medium storing a software program which, when read and executed by a coprocessor, implements the functions performed by the coprocessor in any of the above embodiments, or which, when read and executed by a main processor, implements the functions performed by the main processor in any of the above embodiments.
Embodiments of the present application further provide a computer program product which, when run on a coprocessor, causes the coprocessor to perform the functions performed by the coprocessor in any of the above embodiments, or, when run on a main processor, causes the main processor to perform the functions performed by the main processor in any of the above embodiments. The computer storage medium includes the memory 220 mentioned in the previous embodiment and may store the software data of the corresponding database in fig. 3, which is not described again here. Thus, the coprocessor and the software it runs may be provided as a separate product implementing the image recognition function mentioned in the previous embodiments to find the corresponding default application, so that the identifier of the relevant default application is sent by the coprocessor to the main processor. The software executed by the coprocessor may therefore include a plurality of software modules, including but not limited to a recognition module for recognizing an image and a search module for looking up the corresponding default application, which is not limited in this embodiment.
Based on the same concept, an embodiment of the present application provides another identification device 1600 for an application program, as shown in fig. 16. The device 1600 includes: a main processing module 1601 configured to receive a first instruction triggered by a user, where the first instruction is used to instruct that a first image be associated with a first application; and a co-processing module 1602 configured to identify the first image to obtain a first image feature and send first information indicating the first image feature to the main processing module. The main processing module is further configured to associate the first image feature with the first application and store the first information and the first application in the database when it determines that the database does not include the first information, where the database includes information indicating at least one image feature and an application associated with each image feature. This scheme supports user customization by establishing associations between image features and applications. In subsequent image recognition, if an image feature associated with an application is recognized, the identification device can automatically start that application, simplifying the user's operation and expanding the types of applications that can be activated based on image recognition technology.
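The customization flow of device 1600 — bind a first image to a first application, storing the pair only when the database does not already contain that feature — might be sketched like this. Class, method, and example names are all illustrative assumptions:

```python
class RecognitionDevice:
    """Sketch of device 1600's customization flow: the co-processing module
    recognizes the image feature; the main processing module stores the
    feature/application pair only if the feature is not already bound."""

    def __init__(self, identify):
        self.identify = identify   # stands in for the co-processing module
        self.database = {}         # image feature -> associated application

    def handle_first_instruction(self, first_image, first_app):
        feature = self.identify(first_image)   # co-processing module's job
        if feature not in self.database:       # main processing module's check
            self.database[feature] = first_app
            return True                        # association stored
        return False                           # feature already bound
```

A duplicate binding attempt leaves the earlier association intact, matching the "when it determines that the database does not include the first information" condition.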
In an alternative embodiment, the apparatus 1600 further comprises a signal processing module 1603 for processing the first data from the image sensor to obtain the first image.
In an optional implementation manner, the signal processing module 1603 is further configured to process second data from the image sensor to obtain a second image; the co-processing module 1602 is further configured to identify the second image to obtain a second image feature, and send second information to the main processing module 1601 when it is determined that the database includes the second information indicating the second image feature. The main processing module 1601 is further configured to determine an application program associated with the second image feature in the database as a target application program to be activated.
In an optional implementation manner, the co-processing module 1602 is further configured to perform semantic analysis on the second image or the second image feature to obtain a semantic tag when the database does not include second information used for indicating the second image feature; wherein the semantic tag comprises text information for characterizing the second image or the second image feature; the main processing module 1601 is further configured to receive the semantic tag from the co-processing module 1602, indicate at least one application program associated with the semantic tag to a user, receive a second instruction triggered by the user, where the second instruction is used to indicate that a second application program of the at least one application program is activated, and determine that the second application program is the target application program to be activated.
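The two-path selection described in the last two paragraphs — a database hit yields the associated application directly, while a miss falls back to a semantic tag and a user choice — can be condensed into one function. The callables stand in for the co-processing module's semantic analysis and for the user interaction; their names are illustrative:

```python
def find_target_app(feature, database, semantic_analyze, ask_user):
    """Sketch of target-app selection: database hit -> associated app;
    database miss -> derive a semantic tag, then let the user pick among
    the applications associated with that tag."""
    if feature in database:
        return database[feature]          # hit: target application
    tag = semantic_analyze(feature)       # miss: text label for the image
    return ask_user(tag)                  # user's second instruction decides
```

The fallback path is what lets the device handle images whose features were never associated with any application, at the cost of one user interaction.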
In an alternative implementation, after receiving the identifier of the target application from the co-processing module 1602, the main processing module 1601 is further configured to: determine whether the target application is installed in the electronic device in which the device is deployed; activate the target application when it is installed in the electronic device; or, when the target application is not installed in the electronic device but a third application is installed, activate the target application through a target-application interface built into the third application; or, when the target application is not installed in the electronic device, prompt the user to download the target application.
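The three-way decision above — activate directly, activate through an interface built into a third application, or prompt the user to download — is straightforward to sketch. All application names and the `embedded_interfaces` structure are invented for illustration:

```python
def resolve_activation(target, installed, embedded_interfaces):
    """Sketch of the activation decision: direct activation if installed;
    activation via a host app's built-in interface for the target (e.g. a
    lightweight applet) otherwise; finally, a download prompt."""
    if target in installed:
        return ("activate", target)
    for host, guests in embedded_interfaces.items():
        if host in installed and target in guests:
            return ("activate_via", host)
    return ("prompt_download", target)
```

The middle branch models the case where a third application exposes the target application's functionality without the target itself being installed.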
In an alternative implementation, the signal processing module 1603 is specifically configured to process the second data from the image sensor periodically or in real time to obtain the second image.
In an alternative implementation, the co-processing module 1602 is specifically configured to identify the first image or the second image using a few-shot (small-sample) image recognition algorithm.
Any one or more of the above modules may be implemented by software, hardware or a combination of the two, and the functions of the modules implemented by hardware may refer to the description of the corresponding components in fig. 3. The module implemented in software may include computer program instructions and be executed by the corresponding components in fig. 3, which is not limited in this embodiment.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks, which may be described with reference to fig. 16.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

  1. An identification device for an application program, comprising:
    the main processor is used for receiving a first instruction triggered by a user, and the first instruction is used for indicating that a first image is associated with a first application program;
    the coprocessor is used for identifying the first image to obtain first image characteristics and sending first information used for indicating the first image characteristics to the main processor;
    the main processor is further configured to associate the first image feature with the first application program and store the first information and the first application program in the database when it is determined that the database does not include the first information; wherein the database comprises information indicative of at least one image feature and an application associated with each image feature.
  2. The apparatus of claim 1, further comprising a signal processing device for processing first data from an image sensor to obtain the first image.
  3. The apparatus of claim 2,
    the signal processing device is further used for processing second data from the image sensor to obtain a second image;
    the coprocessor is further configured to identify the second image to obtain a second image feature, and send second information to the main processor when the second information indicating the second image feature is determined to be included in the database;
    the main processor is further used for determining the application program associated with the second image feature in the database as a target application program to be activated.
  4. The apparatus of claim 3,
    the coprocessor is further configured to perform semantic analysis on the second image or the second image feature to obtain a semantic tag when the database does not include second information indicating the second image feature; wherein the semantic label comprises text information for characterizing the second image or the second image feature;
    the main processor is further configured to receive the semantic tag from the coprocessor, indicate at least one application program associated with the semantic tag to a user, receive a second instruction triggered by the user, where the second instruction is used to indicate that a second application program of the at least one application program is activated, and determine that the second application program is the target application program to be activated.
  5. The device of claim 3 or 4, wherein the host processor is further to:
    judging whether the target application program is installed in the electronic equipment for deploying the equipment;
    activating the target application program when the target application program is installed in the electronic equipment; or,
    when the target application program is not installed in the electronic equipment but a third application program is installed, activating the target application program through a target application program interface built in the third application program; or,
    and when the target application program is not installed in the terminal, prompting a user to download the target application program.
  6. The device according to any of claims 3 to 5, wherein the signal processing device is in particular adapted to process the second data from the image sensor periodically or in real time to obtain the second image.
  7. The apparatus of any of claims 2-6, wherein the signal processing apparatus comprises at least one of an image signal processor, ISP, or a sensor processor.
  8. The device of any of claims 1-7, wherein the co-processor is specifically configured to identify the first image or the second image using a small sample image identification algorithm.
  9. The device of any of claims 1-8, wherein the co-processor comprises at least one of a neural network processor, a digital signal processor, an image processing unit, or a hardware accelerator.
  10. An electronic device comprising the device of any of claims 1-9 and the image sensor, the image sensor being configured to acquire data periodically or in real time.
CN202080007252.0A 2020-08-28 2020-08-28 Application program identification device and electronic device Pending CN114667505A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/112290 WO2022041155A1 (en) 2020-08-28 2020-08-28 Application program identification device and electronic device

Publications (1)

Publication Number Publication Date
CN114667505A true CN114667505A (en) 2022-06-24

Family

ID=80352484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080007252.0A Pending CN114667505A (en) 2020-08-28 2020-08-28 Application program identification device and electronic device

Country Status (2)

Country Link
CN (1) CN114667505A (en)
WO (1) WO2022041155A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105144197A (en) * 2013-03-14 2015-12-09 高通股份有限公司 Image-based application launcher
CN108509231A (en) * 2018-03-27 2018-09-07 平安科技(深圳)有限公司 Application program deployment method, electronic device, equipment and storage medium based on VR
CN109828668A (en) * 2019-01-30 2019-05-31 维沃移动通信有限公司 A kind of display control method and electronic equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006011677A (en) * 2004-06-24 2006-01-12 Fuji Photo Film Co Ltd Startup system and startup program
CN102298533B (en) * 2011-09-20 2015-05-13 宇龙计算机通信科技(深圳)有限公司 Method for activating application program and terminal equipment
CN103268151B (en) * 2013-05-29 2016-09-07 深圳市东方拓宇科技有限公司 A kind of data processing equipment and the method starting specific function thereof
CN104834382B (en) * 2015-05-21 2018-03-02 上海斐讯数据通信技术有限公司 Application program for mobile terminal response system and method
CN105677025A (en) * 2015-12-31 2016-06-15 宇龙计算机通信科技(深圳)有限公司 Terminal application starting method and device, and terminal
CN107783715A (en) * 2017-11-20 2018-03-09 北京小米移动软件有限公司 Using startup method and device
CN109697349A (en) * 2018-12-18 2019-04-30 深圳壹账通智能科技有限公司 Terminal unlock method, device, computer equipment and storage medium


Also Published As

Publication number Publication date
WO2022041155A1 (en) 2022-03-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination