CN112486394A - Information processing method and device, electronic equipment and readable storage medium - Google Patents

Information processing method and device, electronic equipment and readable storage medium

Info

Publication number
CN112486394A
CN112486394A
Authority
CN
China
Prior art keywords
image
sub
target
display
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011504205.5A
Other languages
Chinese (zh)
Inventor
梁远刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Weiwo Software Technology Co ltd
Original Assignee
Nanjing Weiwo Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Weiwo Software Technology Co ltd filed Critical Nanjing Weiwo Software Technology Co ltd
Priority to CN202011504205.5A
Publication of CN112486394A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J5/0025Living bodies
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • G06V40/113Recognition of static hand signs
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077Imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses an information processing method and apparatus, an electronic device, and a readable storage medium. The method comprises the following steps: acquiring a first image shot by a thermal infrared imager; extracting position information of a target object in the first image in the case that the target object is identified to be included in the first image; mapping the position information to a target display area of a display interface of the electronic device according to a preset mapping relation; and responding to a touch operation on the target display area. The embodiment of the invention can solve the problem that a user cannot smoothly interact with an electronic device when it is difficult for the user to touch the device.

Description

Information processing method and device, electronic equipment and readable storage medium
Technical Field
Embodiments of the present invention relate to the field of information processing, and in particular, to an information processing method and apparatus, an electronic device, and a readable storage medium.
Background
With the continuous development of touch-enabled electronic devices, great convenience has been brought to users' lives. Currently, a user mainly interacts with an electronic device by touching its display screen.
In the process of implementing the present application, the inventor found that the prior art has at least the following problem: the user must touch the screen of the electronic device to perform touch input, and when it is difficult for the user to touch the electronic device, the user cannot interact with it smoothly.
Disclosure of Invention
Embodiments of the present invention provide an information processing method and apparatus, an electronic device, and a readable storage medium, which can solve the problem that a user cannot smoothly interact with an electronic device when it is difficult for the user to touch the device.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an information processing method, where the method may include:
acquiring a first image shot by a thermal infrared imager; extracting position information of the target object in the first image under the condition that the target object is identified to be included in the first image; mapping the position information to a target display area of a display interface of the electronic equipment according to a preset mapping relation; and responding to the touch operation of the target display area.
In a second aspect, an embodiment of the present invention provides an information processing apparatus, which may include:
the acquisition module is used for acquiring a first image shot by the thermal infrared imager; the extraction module is used for extracting the position information of the target object in the first image under the condition that the target object is identified to be included in the first image; the mapping module is used for mapping the position information to a target display area of a display interface of the electronic equipment according to a preset mapping relation; and the response module is used for responding the touch operation of the target display area.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present invention provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the invention, when it is difficult for a user to touch the electronic device, a first image shot by the thermal infrared imager is acquired; in the case that the first image includes the target object, the extracted position information of the target object in the first image is mapped to a target display area of the display interface of the electronic device according to a preset mapping relation, and the first input to the target display area is then responded to. In this way, contact-free interaction between the user and the electronic device can be realized smoothly without the user touching the display screen of the electronic device.
Drawings
The present invention will be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which like or similar reference characters designate like or similar features.
Fig. 1 is a schematic view of an application scenario of an information processing method according to an embodiment of the present invention;
fig. 2 is a flowchart of an information processing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram for implementing an information processing method according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the objects before and after it are in an "or" relationship.
The information processing method provided by the embodiment of the invention can be applied to the following application scenarios, which are explained below.
For an electronic device with a touch screen, the main human-computer interaction mode is to interact with the device by touching the touch screen.
As shown in FIG. 1, the actual distance between the electronic device 10 and the user 20 is greater than the distance at which the user can touch the electronic device 10; in this case, the user has difficulty touching the display screen of the electronic device 10 and cannot interact with it.
Embodiments of the present invention provide an information processing method, an information processing apparatus, an electronic device, and a storage medium, which are used to solve the problem in the related art that a user cannot smoothly interact with an electronic device when it is difficult for the user to touch the device.
In addition to the above application scenario, the method provided by the embodiment of the invention can be applied to any scenario in which it is difficult for the user to touch the electronic device.
According to the method provided by the embodiment of the invention, when it is difficult for a user to touch the electronic device, a first image shot by the thermal infrared imager is acquired; in the case that the first image includes the target object, the extracted position information of the target object in the first image is mapped to a target display area of the display interface of the electronic device according to a preset mapping relation, and the first input to the target display area is then responded to. In this way, contact-free interaction between the user and the electronic device can be realized smoothly without the user touching the display screen of the electronic device.
Based on the application scenario, the following describes in detail an information processing method provided by an embodiment of the present invention.
Fig. 2 is a flowchart of an information processing method according to an embodiment of the present invention.
As shown in fig. 2, the information processing method may include steps 210 to 240. The method is applied to an information processing apparatus and is specifically as follows:
Step 210: acquiring a first image shot by the thermal infrared imager.
Step 220: in the case that the target object is identified to be included in the first image, extracting the position information of the target object in the first image.
Step 230: mapping the position information to a target display area of a display interface of the electronic device according to a preset mapping relation.
Step 240: responding to the touch operation on the target display area.
According to the information processing method provided by the embodiment of the invention, when it is difficult for a user to touch the electronic device, a first image shot by the thermal infrared imager is acquired; in the case that the first image includes the target object, the extracted position information of the target object in the first image is mapped to a target display area of the display interface of the electronic device according to a preset mapping relation, and the first input to the target display area is then responded to. In this way, contact-free interaction between the user and the electronic device can be realized smoothly without the user touching the display screen of the electronic device.
The contents of steps 210-240 are described below:
first, step 210 is involved.
A thermal infrared imager uses an infrared detector and an optical imaging objective to receive the infrared radiation energy distribution of a measured target and project it onto the photosensitive element of the infrared detector, thereby obtaining an infrared thermal image that corresponds to the heat distribution field on the surface of the object. In other words, the thermal infrared imager converts the invisible infrared energy emitted by an object into a visible thermal image, in which different colors represent different temperatures of the measured object.
The thermal infrared imager can be used in low-light and dark environments and, because it relies on temperature detection, has good noise resistance in complex environments. Since the temperature of a human body differs from that of surrounding objects, the thermal infrared imager can effectively distinguish the human body from objects and more easily recognize human operations. Compared with an ordinary camera, it can also recognize human operations in low-light, dark or complex-background environments, so it is applicable to a wide range of scenarios.
In addition, the thermal infrared imager has a suitable focal range that matches a preset user operation space and is insensitive to images outside this range, which reduces interference from non-operation areas.
Therefore, by acquiring the first image shot by the thermal infrared imager, an image with high accuracy and good noise immunity can be obtained.
Next, step 220 is involved.
It is identified whether a target object (for example, a finger) is included in the first image; in the case that the target object is identified to be included in the first image, the position information of the target object in the first image is extracted. The position information may be the coordinate information of the target object in the first image.
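Purely as a non-limiting illustration (not part of the original disclosure), the following Python sketch shows one simple way such recognition and position extraction could be performed, assuming the first image is available as a per-pixel temperature array and the target object is simply the region warmer than a chosen threshold. The function name, the threshold value and the centroid-based position are illustrative assumptions only.

```python
import numpy as np

def extract_target_position(thermal_frame, temp_threshold=30.0):
    """Return the (x, y) pixel coordinates of the centroid of the warm region,
    or None if no pixel exceeds the threshold.

    thermal_frame: 2-D numpy array of per-pixel temperatures (degrees Celsius).
    temp_threshold: pixels warmer than this are treated as the target object
                    (e.g. a finger), since skin is warmer than the background.
    """
    mask = thermal_frame > temp_threshold          # candidate "warm" pixels
    if not mask.any():
        return None                                # no target object recognized
    ys, xs = np.nonzero(mask)
    # The centroid of the warm region serves as the target object's position.
    return float(xs.mean()), float(ys.mean())

# Example: a synthetic 8x8 frame at 22 degrees with a warm 2x2 "fingertip".
frame = np.full((8, 8), 22.0)
frame[2:4, 5:7] = 34.0
print(extract_target_position(frame))              # -> approximately (5.5, 2.5)
```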
Then, step 230 is involved.
In some embodiments of the present application, the first image includes M first sub-display regions, the display interface includes N second sub-display regions, the first sub-display regions correspond to the second sub-display regions one to one, and M and N are positive integers, where the above step 230 may specifically include the following steps:
determining a first target sub-display area where the target object is located in the M first sub-display areas according to the position information; and determining a target display area having a preset mapping relation with the first target sub-display area in the N second sub-display areas.
Specifically, as shown in fig. 3, the first image (i.e., the image captured by the thermal infrared imager) is first equally divided into m x n first sub-display regions, namely R11, ..., Rmn, and the display interface of the electronic device is likewise divided into m x n second sub-display regions, namely D11, ..., Dmn. The first sub-display regions Rxx then correspond one-to-one to the second sub-display regions Dxx, where m, n and x are positive integers.
Then, when a target object (for example, a finger) enters the shooting area of the thermal infrared imager, the imager captures an image of the finger by using the temperature difference between the human body and surrounding objects, since the human body is warmer than the surrounding environment, thereby obtaining a first image that includes the target object.
The first target sub-display area (Rxx) in which the target object is located is determined among the M first sub-display areas according to the position information, and the target display area (Dxx) having the preset mapping relation with the first target sub-display area is determined among the N second sub-display areas. That is, the first target sub-display area (Rxx) of the target object is mapped to the second sub-display area (Dxx) of the display interface of the electronic device, and Dxx is determined as the target display area.
In this way, the first target sub-display area where the target object is located is determined among the M first sub-display areas according to the position information, and the target display area having the preset mapping relation with the first target sub-display area is determined among the N second sub-display areas, so that the target display area the user intends to manipulate can be determined quickly and accurately.
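As an illustrative sketch only (the disclosure does not provide code), the mapping from the first target sub-display area Rxx to the target display area Dxx could be implemented roughly as follows. The function name, parameter layout and the example image, display and grid sizes are assumptions.

```python
def locate_target_display_area(position, image_size, display_size, m, n):
    """Map a target position in the first image to the corresponding
    target display area of the electronic device's display interface.

    position:     (x, y) coordinates of the target object in the first image.
    image_size:   (width, height) of the first image in pixels.
    display_size: (width, height) of the display interface in pixels.
    m, n:         number of grid columns and rows (so M = N = m * n sub-areas).

    Returns ((row, col), (x0, y0, x1, y1)): the grid index shared by both
    grids, and the bounding box of the target display area on the screen.
    """
    x, y = position
    img_w, img_h = image_size
    disp_w, disp_h = display_size

    # First target sub-display area Rxx: which grid cell the position falls in.
    col = min(int(x * m / img_w), m - 1)
    row = min(int(y * n / img_h), n - 1)

    # Target display area Dxx: the cell with the same index on the display.
    cell_w, cell_h = disp_w / m, disp_h / n
    box = (col * cell_w, row * cell_h, (col + 1) * cell_w, (row + 1) * cell_h)
    return (row, col), box

# A 640x480 thermal image mapped onto a 1920x1080 display with a 4x3 grid.
print(locate_target_display_area((5.5, 2.5), (640, 480), (1920, 1080), 4, 3))
```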
In some embodiments of the present application, position information of a target object in each first image and a shooting time of each first image are acquired; sequentially mapping the position information of the target object in each first image to a display interface according to the shooting time to obtain the moving track of the target object; and responding to the touch operation corresponding to the moving track.
Specifically, the shooting times of the first images are T1, T2, T3, ..., Tn, and the position information of the target object in the corresponding first images is (X1, Y1), (X2, Y2), ..., (Xn, Yn), where T1 corresponds to (X1, Y1), ..., and Tn corresponds to (Xn, Yn).
According to the shooting times (T1, T2, T3, ..., Tn), the position information of the target object in each first image is mapped onto the display interface in sequence to obtain the movement track of the target object.
Accordingly, responding to the touch operation corresponding to the movement track may specifically be responding to a drag operation, a slide operation, a scroll operation, or the like, corresponding to the movement track.
By mapping the position information of the target object in each first image onto the display interface in sequence according to the shooting time to obtain the movement track of the target object, and responding to the touch operation corresponding to the movement track, interactive functions other than clicking a target control can be provided for the user, which enriches the interactive experience between the user and the electronic device.
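Purely as an illustration of this embodiment, the sketch below builds a movement track from timestamped positions and applies the preset mapping relation as a simple scaling; the helper names, the scaling-based mapping and the crude slide/tap classification are assumptions, not part of the disclosure.

```python
def build_movement_track(samples, image_size, display_size):
    """Build the target object's movement track on the display interface.

    samples:      list of (shooting_time, (x, y)) pairs, one per first image.
    image_size:   (width, height) of the first images.
    display_size: (width, height) of the display interface.

    Returns the positions mapped to display coordinates, ordered by
    shooting time, i.e. [(T1, (X1', Y1')), ..., (Tn, (Xn', Yn'))].
    """
    img_w, img_h = image_size
    disp_w, disp_h = display_size
    sx, sy = disp_w / img_w, disp_h / img_h        # assumed mapping relation
    track = []
    for t, (x, y) in sorted(samples, key=lambda s: s[0]):
        track.append((t, (x * sx, y * sy)))
    return track

def classify_track(track, slide_threshold=50.0):
    """Very rough classification of the track into a touch operation."""
    (_, (x0, y0)), (_, (x1, y1)) = track[0], track[-1]
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return "slide" if distance >= slide_threshold else "tap"

samples = [(0.1, (100, 100)), (0.2, (160, 100)), (0.3, (220, 100))]
track = build_movement_track(samples, (640, 480), (1920, 1080))
print(classify_track(track))                       # -> "slide"
```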
In some embodiments of the present application, a user gesture in a first image is identified based on a pre-trained gesture recognition model; and responding to the touch operation corresponding to the user gesture.
Specifically, the user gesture in the first image is recognized by a pre-trained gesture recognition model, and the touch operation corresponding to the user gesture is then determined according to a preset correspondence between gestures and touch operations. Illustratively, when the user gesture in the first image is recognized as "OK", it represents confirmation, and when the user gesture is recognized as "hands crossed", it represents rejection.
In addition, a user action in the first image can be recognized based on a pre-trained user action recognition model, and the touch operation corresponding to the user action is then responded to. The user action may include a body movement of the user, such as nodding or shaking the head.
Here, by recognizing a user gesture in the first image based on a pre-trained gesture recognition model and responding to a touch operation corresponding to the user gesture, the interactive experience between the user and the electronic device can be enriched.
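The gesture recognition model itself is not specified in the disclosure; the sketch below only illustrates how a recognized gesture label might be mapped to a touch operation through a preset correspondence. The model interface (a predict method) and the gesture labels are assumed placeholders.

```python
# Preset correspondence between recognized gestures and touch operations,
# following the examples in the text ("OK" -> confirm, crossed hands -> reject).
GESTURE_TO_OPERATION = {
    "ok": "confirm",
    "hands_crossed": "reject",
    "point": "click",
}

def respond_to_gesture(first_image, gesture_model):
    """Recognize the user gesture in the first image with a pre-trained model
    and return the corresponding touch operation, or None if unrecognized."""
    label = gesture_model.predict(first_image)
    return GESTURE_TO_OPERATION.get(label)

class DummyGestureModel:
    """Stand-in for the pre-trained gesture recognition model."""
    def predict(self, image):
        return "ok"

print(respond_to_gesture(object(), DummyGestureModel()))   # -> "confirm"
```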
Finally, step 240 is involved.
The electronic device includes a wearable device and a television.
A wearable device is a general term for devices developed by applying wearable technology to the intelligent design of everyday wear such as glasses, gloves, watches, clothing and shoes, for example smart watches or smart glasses.
Smart glasses are glasses that, like a smartphone, have an independent operating system on which the user can install programs such as software and games provided by software service providers. Through voice or motion control, smart glasses can add schedule items, perform map navigation, interact with friends, take photos and videos, and make video calls with friends, and they can access a wireless network through a mobile communication network.
For a wearable device, it is inconvenient to realize interaction between the user and the device through a traditional touch screen, and the above steps enable convenient interaction with the device. For a television, the above steps enable remote interaction without a remote controller, which is more convenient and improves the user experience.
In summary, in the embodiment of the invention, when it is difficult for the user to touch the electronic device, the first image captured by the thermal infrared imager is acquired; in the case that the first image includes the target object, the extracted position information of the target object in the first image is mapped to the target display area of the display interface of the electronic device according to the preset mapping relation, and the first input to the target display area is then responded to. In this way, contact-free interaction between the user and the electronic device can be realized smoothly without the user touching the display screen of the electronic device.
It should be noted that, in the information processing method provided in the embodiment of the present application, the execution subject may be an information processing apparatus, or a control module in the information processing apparatus for executing the information processing method. In the embodiment of the present application, the information processing method provided herein is described by taking an information processing apparatus executing the method as an example.
In addition, based on the information processing method, an embodiment of the present invention further provides an information processing apparatus, which is specifically described in detail with reference to fig. 4.
Fig. 4 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention.
As shown in fig. 4, the information processing apparatus 400 may include:
the acquiring module 410 is configured to acquire a first image captured by a thermal infrared imager.
And an extracting module 420, configured to, in a case that it is recognized that the target object is included in the first image, extract position information of the target object in the first image.
The mapping module 430 is configured to map the position information to a target display area of a display interface of the electronic device according to a preset mapping relationship.
The response module 440 is configured to respond to a touch operation of the target display area.
In some embodiments of the present application, the first image includes M first sub-display regions, the display interface includes N second sub-display regions, the first sub-display regions correspond to the second sub-display regions one to one, M and N are positive integers, and the mapping module 430 includes:
and the determining module is used for determining the first target sub-display area of the target object in the M first sub-display areas according to the position information.
And the determining module is further used for determining the target display area with a preset mapping relation with the first target sub-display area in the N second sub-display areas.
In some embodiments of the present application, the obtaining module 410 is further configured to obtain position information of the target object in each first image and a capturing time of each first image.
The mapping module 430 is further configured to sequentially map the position information of the target object in each first image to the display interface according to the shooting time, so as to obtain a moving track of the target object.
The response module 440 is further configured to respond to a touch operation corresponding to the movement trajectory.
In some embodiments of the present application, the information processing apparatus 400 further includes:
the recognition module is used for recognizing the user gesture in the first image based on a pre-trained gesture recognition model.
And the response module is also used for responding to the touch operation corresponding to the user gesture.
The electronic device includes a wearable device and a television.
The information processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The information processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The information processing apparatus provided in the embodiment of the present application can implement each process implemented by the information processing apparatus in the method embodiments of fig. 2 to fig. 3, and is not described herein again to avoid repetition.
To sum up, the information processing apparatus provided by the embodiment of the invention acquires, when it is difficult for the user to touch the electronic device, the first image captured by the thermal infrared imager; in the case that the target object is identified to be included in the first image, it maps the extracted position information of the target object in the first image to the target display area of the display interface of the electronic device according to the preset mapping relation, and then responds to the first input to the target display area. In this way, contact-free interaction between the user and the electronic device can be realized smoothly without the user touching the display screen of the electronic device.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and the like. Among other things, input unit 504 may include a graphics processor 5041 and a microphone 5042; the display unit 506 may include a display panel 5061; the user input unit 507 may include a touch panel 5071 and other input devices 5072; the memory 509 may include an application program and an operating system.
Those skilled in the art will appreciate that the electronic device 500 may further include a power supply (e.g., a battery) for supplying power to various components, and the power supply may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 5 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The network module 502 is configured to acquire a first image captured by the thermal infrared imager.
The processor 510 is further configured to extract location information of the target object in the first image if the target object is identified to be included in the first image.
The processor 510 is further configured to map the position information to a target display area of a display interface of the electronic device according to a preset mapping relationship.
The processor 510 is further configured to respond to a touch operation of the target display area.
Optionally, the processor 510 is further configured to determine, according to the position information, a first target sub-display area in which the target object is located among the M first sub-display areas.
The processor 510 is further configured to determine, in the N second sub-display regions, a target display region having a preset mapping relationship with the first target sub-display region.
Optionally, the network module 502 is further configured to obtain position information of the target object in each first image and a shooting time of each first image.
And the processor 510 is further configured to sequentially map the position information of the target object in each first image onto the display interface according to the shooting time, so as to obtain a moving track of the target object.
Optionally, the processor 510 is further configured to recognize a user gesture in the first image based on a pre-trained gesture recognition model.
In the embodiment of the invention, when it is difficult for a user to touch the electronic device, the first image shot by the thermal infrared imager is acquired; in the case that the first image includes the target object, the extracted position information of the target object in the first image is mapped to the target display area of the display interface of the electronic device according to the preset mapping relation, and the first input to the target display area is then responded to. In this way, contact-free interaction between the user and the electronic device can be realized smoothly without the user touching the display screen of the electronic device.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned information processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the information processing method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An information processing method is applied to electronic equipment, and is characterized in that the electronic equipment comprises a thermal infrared imager, and the method comprises the following steps:
acquiring a first image shot by the thermal infrared imager;
extracting position information of a target object in the first image if the target object is identified to be included in the first image;
mapping the position information to a target display area of a display interface of the electronic equipment according to a preset mapping relation;
and responding to the touch operation of the target display area.
2. The method according to claim 1, wherein the first image includes M first sub-display regions, the display interface includes N second sub-display regions, the first sub-display regions correspond to the second sub-display regions one to one, M and N are positive integers, and the mapping the position information to a target display region of the display interface of the electronic device according to a preset mapping relationship includes:
determining a first target sub-display area in which the target object is located in the M first sub-display areas according to the position information;
and in the N second sub-display areas, determining the target display area having the preset mapping relation with the first target sub-display area.
3. The method of claim 1, further comprising:
acquiring position information of the target object in each first image and shooting time of each first image;
sequentially mapping the position information of the target object in each first image to the display interface according to the shooting time to obtain the moving track of the target object;
and responding to the touch operation corresponding to the moving track.
4. The method of claim 1, further comprising:
recognizing a user gesture in the first image based on a pre-trained gesture recognition model;
and responding to the touch operation corresponding to the user gesture.
5. An information processing apparatus applied to an electronic device, wherein the electronic device comprises a thermal infrared imager, the apparatus comprising:
the acquisition module is used for acquiring a first image shot by the thermal infrared imager;
the extraction module is used for extracting the position information of the target object in the first image under the condition that the target object is identified to be included in the first image;
the mapping module is used for mapping the position information to a target display area of a display interface of the electronic equipment according to a preset mapping relation;
and the response module is used for responding the touch operation of the target display area.
6. The apparatus of claim 5, wherein the first image comprises M first sub-display regions, the display interface comprises N second sub-display regions, the first sub-display regions correspond to the second sub-display regions one to one, M and N are positive integers, and the mapping module comprises:
a determining module, configured to determine, according to the position information, a first target sub-display area where the target object is located in the M first sub-display areas;
the determining module is further configured to determine, in the N second sub-display areas, the target display area having the preset mapping relationship with the first target sub-display area.
7. The apparatus according to claim 5, wherein the acquiring module is further configured to acquire position information of the target object in each of the first images and a capturing time of each of the first images;
the mapping module is further configured to sequentially map the position information of the target object in each first image to the display interface according to the shooting time to obtain a moving track of the target object;
the response module is further used for responding to the touch operation corresponding to the moving track.
8. The apparatus of claim 5, further comprising:
the recognition module is used for recognizing the user gesture in the first image based on a pre-trained gesture recognition model;
the response module is further used for responding to the touch operation corresponding to the user gesture.
9. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the information processing method according to any one of claims 1 to 4.
10. A readable storage medium, characterized in that a program or instructions are stored thereon, which when executed by a processor, implement the steps of the information processing method according to any one of claims 1 to 4.
CN202011504205.5A 2020-12-17 2020-12-17 Information processing method and device, electronic equipment and readable storage medium Pending CN112486394A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011504205.5A CN112486394A (en) 2020-12-17 2020-12-17 Information processing method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011504205.5A CN112486394A (en) 2020-12-17 2020-12-17 Information processing method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN112486394A (en) 2021-03-12

Family

ID=74914231

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011504205.5A Pending CN112486394A (en) 2020-12-17 2020-12-17 Information processing method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112486394A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102929547A (en) * 2012-10-22 2013-02-13 四川长虹电器股份有限公司 Intelligent terminal contactless interaction method
CN105278667A (en) * 2014-12-16 2016-01-27 维沃移动通信有限公司 Data interaction method, data interaction system and mobile terminal
CN205304923U (en) * 2015-12-23 2016-06-08 武汉哒呤科技有限公司 Mobile phone realizing interaction through gesture operations
CN107589832A (en) * 2017-08-01 2018-01-16 深圳市汇春科技股份有限公司 Air gesture recognition method based on photoelectric sensing and control device thereof
CN110276251A (en) * 2019-05-13 2019-09-24 联想(上海)信息技术有限公司 A kind of image-recognizing method, device, equipment and storage medium
CN110213407A (en) * 2019-05-28 2019-09-06 Oppo(重庆)智能科技有限公司 A kind of operating method of electronic device, electronic device and computer storage medium
CN111885406A (en) * 2020-07-30 2020-11-03 深圳创维-Rgb电子有限公司 Smart television control method and device, rotatable television and readable storage medium
CN112068698A (en) * 2020-08-31 2020-12-11 北京市商汤科技开发有限公司 Interaction method and device, electronic equipment and computer storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113190116A (en) * 2021-04-28 2021-07-30 北京市商汤科技开发有限公司 Schedule reminding method and device, electronic equipment and storage medium
CN114741151A (en) * 2022-04-25 2022-07-12 维沃软件技术有限公司 Split screen display method and device, electronic equipment and readable storage medium
CN114741151B (en) * 2022-04-25 2024-05-24 维沃软件技术有限公司 Split screen display method and device, electronic equipment and readable storage medium
CN115877989A (en) * 2022-11-14 2023-03-31 北京深光科技有限公司 Touch identification method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110210571B (en) Image recognition method and device, computer equipment and computer readable storage medium
CN112486394A (en) Information processing method and device, electronic equipment and readable storage medium
US20170083741A1 (en) Method and device for generating instruction
US20090284469A1 (en) Video based apparatus and method for controlling the cursor
CN105229582A (en) Based on the gestures detection of Proximity Sensor and imageing sensor
US9538086B2 (en) Method of performing previewing and electronic device for implementing the same
CN112738402B (en) Shooting method, shooting device, electronic equipment and medium
CN104081307A (en) Image processing apparatus, image processing method, and program
CN114138121B (en) User gesture recognition method, device and system, storage medium and computing equipment
CN112068698A (en) Interaction method and device, electronic equipment and computer storage medium
CN112540696A (en) Screen touch control management method, intelligent terminal, device and readable storage medium
CN113253908A (en) Key function execution method, device, equipment and storage medium
CN109669710B (en) Note processing method and terminal
CN111291638A (en) Object comparison method, system, equipment and medium
CN112788244B (en) Shooting method, shooting device and electronic equipment
CN111770268A (en) Photographing method and device and electronic equipment
CN105094500A (en) Icon placing method and device
CN112995506B (en) Display control method, display control device, electronic device, and medium
CN113794831B (en) Video shooting method, device, electronic equipment and medium
JP2023179345A (en) Information input method, information input device, electronic equipment, and storage medium
CN113271379B (en) Image processing method and device and electronic equipment
CN116737290A (en) Finger joint knocking event identification method and electronic equipment
CN114125226A (en) Image shooting method and device, electronic equipment and readable storage medium
CN112150486B (en) Image processing method and device
CN109725722B (en) Gesture control method and device for screen equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210312